Explore the word cloud of the LHCBIGDATA project. It gives a very rough idea of what the "LHCBIGDATA" project is about.
The following table provides information about the project.
| Field | Value |
|---|---|
| Coordinator | ISTITUTO NAZIONALE DI FISICA NUCLEARE |
| Organization address / contact info | |
| Coordinator country | Italy [IT] |
| Total cost | 180,277 € |
| EC max contribution | 180,277 € (100%) |
| Programme | H2020-EU.1.3.2. (Nurturing excellence by means of cross-border and cross-sector mobility) |
| Call code | H2020-MSCA-IF-2017 |
| Funding scheme | MSCA-IF-EF-ST |
| Starting year | 2018 |
| Duration (year-month-day) | from 2018-07-02 to 2020-07-01 |
Take a look at the project's partnership.
| # | Participant | Country (city) | Role | EC contribution |
|---|---|---|---|---|
| 1 | ISTITUTO NAZIONALE DI FISICA NUCLEARE | IT (FRASCATI) | coordinator | 180,277.00 € |
Large international scientific collaborations will face unprecedented computing and data challenges in the near future. The analysis of multi-petabyte datasets at CMS, ATLAS, LHCb and ALICE, the four experiments at the Large Hadron Collider (LHC), requires a global federated infrastructure of distributed computing resources. The HL-LHC, the High Luminosity upgrade of the LHC, is expected to deliver 100 times more data than the LHC, with a corresponding increase in event size, volume and complexity. Modern techniques for big data analytics and machine learning (ML) are needed to cope with such an unprecedented data stream. Critical areas that will strongly benefit from ML are data analysis, detector operation including calibration and monitoring, and computing operations. The aim of this project is to provide the LHC community with the necessary tools to deploy ML solutions through the use of open cloud technologies such as the INDIGO-DataCloud services. Heterogeneous technologies (systems based on multi-core CPUs, GPUs, ...) and opportunistic resources will be integrated. The developed tools will be experiment-independent to promote the exchange of common solutions among the various LHC experiments. The benefits of this approach will be demonstrated in a real-world use case: the optimization of computing operations for the CMS experiment. In addition, once available, the tools to deploy ML as a service can easily be transferred to other scientific domains that need to handle large data streams.
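To make the idea of "ML as a service" more concrete, the minimal sketch below shows a trained model exposed behind an HTTP prediction endpoint, so that users can request predictions without managing the underlying model or resources. This is purely an illustrative assumption, not the project's actual toolchain (which builds on INDIGO-DataCloud services); the choice of Flask, scikit-learn, the `/predict` route and the toy iris model are all hypothetical.

```python
# Illustrative sketch only: a trained model served over HTTP ("ML as a service").
# Flask, scikit-learn and the toy iris model are assumptions for this example;
# the LHCBIGDATA project itself relies on INDIGO-DataCloud services.
from flask import Flask, request, jsonify
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

app = Flask(__name__)

# Stand-in training data; a real deployment would load an experiment-specific model.
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [[5.1, 3.5, 1.4, 0.2]]}
    features = request.get_json()["features"]
    predictions = model.predict(features).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

A client would then simply POST feature vectors to the `/predict` endpoint (e.g. with curl or Python's `requests`) and receive predictions back as JSON, independently of where and on what hardware the model actually runs.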
Are you the coordinator (or a participant) of this project? Please send me more information about the "LHCBIGDATA" project.
For instance: the website URL (it has not been provided by the EU open data portal yet), the logo, a more detailed description of the project (in plain text, as an RTF file or a Word file), some pictures (as picture files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.
Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.
Thanks. And then please add a link to this page on your project's website.
The information about "LHCBIGDATA" is provided by the European Open Data Portal: CORDIS open data.