Visualisation of RAMSES data by the SDvision software: Simulation of the formation of cosmological structures in the Universe, showing the gravitational concentration of dark matter and baryonic matter to form galactic clusters.
Physics experiments are making increasing use of electronic data processing. Information technology and numerical techniques have become essential, be it to operate increasingly complex instruments, to perform simulations for data analysis and the interpretation of physical phenomena, or to share the acquired knowledge. Through its expertise and technological innovations, DAPNIA actively contributes to the development of information technologies for physics applications.
Distributed applications for high-energy physics experiments
Distributed applications (i.e. applications running on networked computers) are playing an increasing role in DAPNIA's scientific activities. The choice of a distributed application may be motivated by several needs: running data analyses from different sites, sharing results within the scientific community, easing software maintenance on centralised servers, and managing massive data flows from a large number of detectors.
The techniques used at DAPNIA to develop such applications rely on 'open source' products based on the Java language or on middleware (see below) developed by the high-energy physics or astrophysics community. These applications have been used successfully in a large number of experiments.
Examples include the light curve and astrophysics image servers for the EROS and XMM experiments, the simulation data access servers for the ODALISC and HORIZON radiation-matter interaction programmes, and the supernova analysis server for the SNLS experiment. Other, more recent applications have been developed within the framework of the CERN ATLAS experiment, such as the optical line monitoring server for the muon spectrometer chamber alignment system, and the magnetic field mapping and geometric correction servers for the tracking systems. Distributed applications free scientists from software engineering concerns, letting them concentrate on the algorithmic aspects of their analysis programmes.
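The servers named above are Java-based and their code is not reproduced here. Purely as an illustration of the pattern they share — a central server holding the data, queried by lightweight clients over the network — here is a minimal Python sketch; the star names and magnitude values are invented stand-ins for a real archive such as the EROS light-curve catalogue:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical light-curve catalogue standing in for a real archive.
CATALOGUE = {
    "star-001": [12.1, 12.0, 11.8, 12.2],
    "star-002": [9.5, 9.6, 9.4, 9.5],
}

class LightCurveHandler(BaseHTTPRequestHandler):
    """Serve light curves over HTTP so remote clients need no local copy."""
    def do_GET(self):
        key = self.path.lstrip("/")
        if key in CATALOGUE:
            body = json.dumps(CATALOGUE[key]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the demo quiet

def fetch_light_curve(port, star):
    """Client side: fetch one light curve from the central server."""
    with urlopen(f"http://127.0.0.1:{port}/{star}") as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Bind to port 0 so the OS picks a free port, then serve in a thread.
    server = HTTPServer(("127.0.0.1", 0), LightCurveHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(fetch_light_curve(port, "star-001"))  # [12.1, 12.0, 11.8, 12.2]
    server.shutdown()
```

The point of the pattern is the one stressed in the text: the analysis code and the data live on the server, so maintenance happens in one place and clients need nothing installed beyond the ability to issue a request.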
Software development and Web technology
Expertise in software development is essential to physics research in large laboratories such as DAPNIA. The purpose of this activity is to develop software tools for producing and managing high added-value applications used to operate instruments and analyse scientific data. These tools include specific middleware (e.g. real-time processing simulation for the CMS experiment), software development frameworks, and tools to exchange and distribute scientific data. A very successful example is PHOCEA, the web portal configuration environment developed at DAPNIA and adopted by numerous CEA departments and external laboratories.
Plot of the monthly evolution of computing time provided by the GRID in France. Early in 2005 the capacity was around 3,000 hours per month; by the beginning of 2007, about 1,500,000 hours were being provided monthly, two thirds of them used by LHC experiments, with the D0 experiment as the main user of the remainder. Overall, capacity in France has grown by a factor of 500; a further factor of 100 is needed to be fully ready for the LHC.
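The figures in the caption can be checked directly; every number below is taken from the caption itself, nothing else is assumed:

```python
# Monthly grid computing time in France, as given in the caption.
early_2005_hours = 3_000
early_2007_hours = 1_500_000

# Overall growth between early 2005 and early 2007.
growth = early_2007_hours / early_2005_hours
print(growth)  # 500.0

# Two thirds of the 2007 capacity goes to LHC experiments.
lhc_hours = early_2007_hours * 2 // 3
print(lhc_hours)  # 1000000

# A further factor of 100 is quoted for full LHC readiness.
target_hours = early_2007_hours * 100
print(target_hours)  # 150000000
```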
Computational grids for LHC experiments and other fields
DAPNIA participates in the LCG and EGEE international computational grid programmes, which are based on the sharing of local, regional, national and international computational resources. These grids are used to process scientific data from LHC experiments (LCG project), as well as data from other fields of interest, including biomedical data (EGEE project). DAPNIA is actively involved in the GRIF regional research grid project, whose purpose is to federate the activities of research laboratories in the Paris area.
DAPNIA participates in project steering, administration and implementation tasks, and is also actively involved in grid management, middleware deployment and user support activities. In addition, the valuable experience acquired by DAPNIA's computational grid team in the course of the EGEE project is being made available to the controlled fusion community through a collaboration with the CEA's Department of Research on Controlled Fusion (DRFC) in Cadarache, France.
The LCG project should be operational before the launch of the LHC experimental programme. France, however, is lagging behind other European countries and the USA, particularly in terms of analysis resources. The first objective of the GRIF project is therefore to become a distributed Tier-2 centre. Given its involvement in the LHC programme, DAPNIA is set to play a leading role in the GRIF project.
Astrophysical image processing
The 'Multiresolution' joint research programme (DAPNIA/SEDI-SAP) develops methods and algorithms that make the best possible use of existing knowledge of image formation mechanisms. The wavelet transform separates image components corresponding to different spatial scales (hence the term 'multiresolution'). Possible applications include filtering, deconvolution, shape detection and data compression. These techniques are used in various research areas, including the detection of dark matter via the gravitational lensing or Sunyaev-Zel'dovich effects, the characterisation of the internal structure and dynamics of stars (asteroseismology), and the characterisation of the cosmic microwave background (CMB).
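As an illustration of the multiresolution idea, the sketch below implements a one-dimensional à trous ("with holes") wavelet decomposition with the B3-spline kernel classically used in astronomical image processing. It is a generic sketch, not the programme's actual code: the signal is split into detail planes at successive spatial scales plus a smooth residual, and summing all planes restores the signal exactly, which is what makes the transform usable for filtering and deconvolution:

```python
import numpy as np

def atrous_decompose(signal, n_scales=3):
    """À trous (starlet) decomposition: detail planes at successive
    spatial scales plus a smooth residual, using the B3-spline kernel."""
    kernel = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    c = signal.astype(float)
    planes = []
    for j in range(n_scales):
        # Dilate the kernel by inserting 2**j - 1 zeros between taps
        # (the "holes" that give the algorithm its name).
        step = 2 ** j
        dilated = np.zeros((len(kernel) - 1) * step + 1)
        dilated[::step] = kernel
        smoothed = np.convolve(c, dilated, mode="same")
        planes.append(c - smoothed)   # detail plane at scale j
        c = smoothed
    planes.append(c)                  # smooth (coarsest) residual
    return planes

# The decomposition telescopes: summing all planes restores the input.
x = np.sin(np.linspace(0, 4 * np.pi, 256)) + 0.1 * np.random.randn(256)
planes = atrous_decompose(x, n_scales=4)
recon = sum(planes)
print(np.allclose(recon, x))  # True
```

Filtering then amounts to thresholding the detail planes before summing them back, which suppresses noise scale by scale instead of globally.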
Visualisation of HERACLES data by the SDvision software: Simulation of turbulence in the interstellar medium.
Parallel computation, visualisation and software development for the numerical simulation of astrophysical plasmas
DAPNIA's Computational Astrophysics programme (COAST) was launched in 2005 to develop, optimise, parallelise and manage software tools for numerical simulation in astrophysics. The results achieved so far include the development of the SDvision visualisation software (to analyse hydrodynamic simulation data), the creation of the ODALISC opacity database (to model laser and astrophysical plasmas), and the implementation of common tools for software version management. Its expertise in various areas (cosmology, interstellar medium, protoplanetary discs, stellar physics, hot laser plasma physics) and its significant experience in numerical analysis and software development have led DAPNIA to play a leading role in several national collaborations such as the HORIZON, ODALISC, MAGNET and SINERGHY projects. These projects will require significant efforts on DAPNIA's part to develop and generalise software tools and methods intended for the international astrophysics, geophysics and plasma physics communities.
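A visualisation tool such as SDvision typically renders projected quantities from three-dimensional simulation cubes. The sketch below shows that basic projection step on a synthetic cube; the cube, its size, and the dense "clump" are invented stand-ins for real RAMSES or HERACLES output, not the software's actual data model:

```python
import numpy as np

def column_density(cube, axis=2):
    """Project a 3D density cube along one axis - the basic operation
    behind a projected-density map in a visualisation tool."""
    return cube.sum(axis=axis)

# Hypothetical 64^3 cube: a dense clump in a diffuse noisy background.
rng = np.random.default_rng(0)
cube = rng.random((64, 64, 64)) * 0.01
cube[24:40, 24:40, 24:40] += 1.0

image = column_density(cube)
print(image.shape)  # (64, 64)

# The brightest pixel of the projection falls inside the clump footprint.
row, col = divmod(int(image.argmax()), image.shape[1])
print(24 <= row < 40 and 24 <= col < 40)  # True
```

The resulting 2-D array is what would be handed to a colour-mapped display; slicing, isosurfaces and volume rendering build on the same kind of reduction over the cube.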
Last update: 20-02-2007