LHC and Open Access

The CMS experiment at the LHC has released a portion of its data to the public for use in education and outreach. Explore this page to find out more about the data and how to analyse it yourself.

LHC data are exotic: they are complicated, and they are big. At peak performance, about one billion proton collisions take place every second inside the CMS detector at the LHC. CMS has collected around 64 petabytes (or over 64,000 terabytes) of analysable data from these collisions so far.
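
To give a feel for what a first look at these data can be like, here is a minimal Python sketch in the spirit of the educational material on the CERN Open Data portal. The file name dimuon_events.csv and the column name M are illustrative assumptions, not part of any official release; adapt them to whichever CMS open dataset you actually download.

# Minimal sketch: plot a dimuon invariant-mass spectrum from a CSV file.
# Assumes a hypothetical file "dimuon_events.csv" with a column "M"
# holding the invariant mass of each muon pair in GeV.
import pandas as pd
import matplotlib.pyplot as plt

events = pd.read_csv("dimuon_events.csv")

plt.hist(events["M"], bins=200, range=(0, 120), histtype="step")
plt.xlabel("Dimuon invariant mass [GeV]")
plt.ylabel("Events per bin")
plt.yscale("log")  # known resonances (e.g. the J/psi and the Z) appear as peaks
plt.title("Dimuon invariant-mass spectrum (CMS open data)")
plt.savefig("dimuon_mass.png")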

Along with the many published papers, these data constitute the scientific legacy of the CMS Collaboration, and preserving the data for future generations is of paramount importance. “We want to be able to re-analyse our data, even decades from now,” says Kati Lassila-Perini, head of the CMS Data Preservation and Open Access project at the Helsinki Institute of Physics. “We must make sure that we preserve not only the data but also the information on how to use them. To achieve this, we intend to make available through open access our data that are no longer under active analysis. This helps record the basic ingredients needed to guarantee that these data remain usable even when we are no longer working on them.” See: LHC data to be made public via open access initiative
