Behind every great result lies great computing

At the ATLAS experiment, a masterful computing infrastructure transforms raw data from the detector into reconstructed particles for analysis, each with a measured direction, energy and type.

13 November 2015 | By

ATLAS Computing Infographic (Image: Nicola Quadri / ATLAS Collaboration)

"Computing is a vital link in ATLAS physics," says Alessandro Di Girolamo, member of the CERN IT department and responsible for ATLAS Computing Operation. "Data from the detector is calibrated, reconstructed and automatically distributed all around the world by the ATLAS Data Management system. ATLAS Production System then filters through these events and selects the ones needed for a particular type of analysis, a process known as 'derivation'. This brings the data set down to a manageable size for someone doing an analysis on their laptop."

When conference season rolls around, this turnaround (transforming ATLAS data from bits into data sets ready for analysis) needs to be as quick as possible. The ATLAS computing team may be given a few days' notice to have an entire dataset ready for a particular talk, and any delay may mean the result will not be ready in time. During the 2015 summer conference season, this entire turnaround was done in a couple of days.

The successes aren't just a matter of speed, but of scale. With record levels of data flowing out of the detector, the ATLAS computing team needs to be on top of their game. "In computing, everything is simple until you go up in scale," says Di Girolamo. "Over the past months we've handled, on average, five petabytes of data per week, and we've run over 200,000 jobs simultaneously on the Grid, with a peak of 250,000. We were using almost 150% of the computing resources that the sites pledge to ATLAS, exploiting everything available!"
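To put the quoted figure in perspective, a back-of-the-envelope conversion turns five petabytes per week into an average sustained data rate (using decimal units, 1 PB = 10^15 bytes; the calculation is illustrative only):

```python
# Average sustained rate implied by "five petabytes of data per week".
petabytes_per_week = 5
bytes_per_week = petabytes_per_week * 10**15
seconds_per_week = 7 * 24 * 3600  # 604,800 s

rate_gb_per_s = bytes_per_week / seconds_per_week / 10**9
print(f"{rate_gb_per_s:.1f} GB/s")  # roughly 8.3 GB/s, around the clock
```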


"In computing, everything is simple until you go up in scale," says Di Girolamo.


ATLAS computing infrastructure and software are constantly evolving – with the help of members of the Collaboration. "After the 'first pass' of the data through the software, physicists will examine the output and check if any improvements can be made," says Di Girolamo. "Their corrections are plugged back into the software, improving the data set for the next analysis." During the summer conferences, this feedback loop was sped up by 200% to make sure the best possible results were getting to the groups on time.

As ATLAS software is so closely linked to a detailed understanding of the detector and its physics, it is developed by computing experts and physicists from within the Collaboration. "We have over 130 computing centres worldwide, located on every inhabited continent, nursed around the clock by members of the Collaboration," says Di Girolamo. "So while the Grid provides the underlying infrastructure, it is the 200-strong ATLAS computing team that keeps the wheels turning."