The ATLAS Open Data Analysis Service, developed by the Elementary Particle Physics Working Group at the University of Göttingen, provides access to LHC–ATLAS Open Data, analysis tools, and computing resources. It enables high-school and university students, educators, and physics enthusiasts to analyze real ATLAS data and explore advanced topics in High Energy and Particle Physics.
Beyond education, the service supports new teaching approaches and promotes Citizen Science. Tutorials use 13 TeV proton-proton collision data and simulated samples for signal and background modeling. More details on the data structure and content can be found in the ATLAS Open Data documentation.
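To give a flavor of what such a tutorial analysis computes, the sketch below reconstructs a two-particle invariant mass, the standard quantity used to search for resonances such as the Z boson in dimuon events. The function name and the example kinematics are illustrative, not taken from the service's tutorials.

```python
import numpy as np

def invariant_mass(e1, p1, e2, p2):
    """Invariant mass (GeV) of a two-particle system from the
    particles' energies and 3-momentum vectors (all in GeV)."""
    e = e1 + e2                    # total energy
    p = np.add(p1, p2)             # total 3-momentum
    m2 = e**2 - np.dot(p, p)       # m^2 = E^2 - |p|^2
    return float(np.sqrt(max(m2, 0.0)))

# Two back-to-back muons with E = 45.6 GeV each (muon mass neglected,
# so E = |p|): the pair reconstructs to ~91.2 GeV, the Z boson mass.
mass = invariant_mass(45.6, [0.0, 0.0, 45.6],
                      45.6, [0.0, 0.0, -45.6])
print(mass)  # → 91.2
```

Real tutorial code loops this calculation over millions of recorded events and histograms the result, which is exactly the workload that benefits from a computing cluster rather than a laptop.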
While ATLAS Open Data is public, two barriers have limited wider use outside CERN:
EXPLORE removes these obstacles by providing:
EXPLORE is part of PUNCH4NFDI, promoting FAIR data practices (Findable, Accessible, Interoperable, Reusable) across physics. By repurposing the Tier-2 WLCG GoeGrid at the University of Göttingen as Open Analysis Resources, EXPLORE enables LHC–CERN Open Data analysis without institutional affiliation.
The diagram illustrates the advantage of using the GoeGrid computing cluster for ATLAS Open Data analysis over running it locally on a laptop with limited computing power.
Note: To request access to GoeGrid, use the Register Now form.
EXPLORE turns GoeGrid into a scalable platform for FAIR Open Data analysis using batch systems, containers, and real-time monitoring.
Execute nodes run the HTCondor StartD daemon and register as worker nodes, forming a seamless execution pool. Once your account is active, log in to the submit point:

ssh -i ~/.ssh/id_rsa <username>@punchlogin.goegrid.gwdg.de
After registering and obtaining a user account for GoeGrid, take a moment to familiarize yourself with HTCondor, the software framework enabling the execution of your analysis tasks.
HTCondor, developed by the Center for High Throughput Computing at the University of Wisconsin–Madison, schedules and runs computing tasks across multiple computers. When a user submits tasks to the HTCondor queue at the Submit/Entry Point, the system schedules and runs them on available Execute Nodes, managing tasks on the user's behalf.
To use HTCondor for executing your analysis, you must define the computing task, known as a “job.” A job consists of three main components:
These components are defined in a Job Description Language (JDL) file, which is submitted at the Submit Point. The JDL file also specifies output, error, and log files to capture job information. You will find more details about creating and submitting a JDL file in the section below.
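As an illustration, a minimal submit-description (JDL) file might look like the sketch below. The executable name, argument, and filenames are hypothetical placeholders; the keywords themselves (executable, arguments, output, error, log, queue) are standard HTCondor submit commands.

```
# analysis.jdl -- hypothetical submit-description file
executable   = run_analysis.sh      # your analysis script (placeholder name)
arguments    = --input sample.root  # illustrative argument
output       = analysis.out         # captures the job's stdout
error        = analysis.err         # captures the job's stderr
log          = analysis.log         # HTCondor's own record of the job
request_cpus = 1
queue                               # submit one instance of this job
```

The file would then be submitted from the Submit Point with condor_submit analysis.jdl, and the job's progress can be followed with condor_q.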
Analyzing with GoeGrid Resources

Learn more about HTCondor and how it manages your computing tasks by visiting the official HTCondor website.