ScaleML (SIGNED)

Elastic Coordination for Scalable Machine Learning


Project "ScaleML" data sheet

The following table provides information about the project.

Coordinator
INSTITUTE OF SCIENCE AND TECHNOLOGY AUSTRIA 

Organization address
address: Am Campus 1
city: KLOSTERNEUBURG
postcode: 3400
website: www.ist.ac.at

contact info
title: n.a.
name: n.a.
surname: n.a.
function: n.a.
email: n.a.
telephone: n.a.
fax: n.a.

Coordinator country: Austria [AT]
Total cost: 1,494,121 €
EC max contribution: 1,494,121 € (100%)
Programme: 1. H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
Call code: ERC-2018-STG
Funding scheme: ERC-STG
Starting year: 2019
Duration (year-month-day): from 2019-03-01 to 2024-02-29

 Partnership

Take a look at the project's partnership.

#  participant  country  role  EC contrib. [€]
1  INSTITUTE OF SCIENCE AND TECHNOLOGY AUSTRIA  AT (KLOSTERNEUBURG)  coordinator  1,494,121.00


 Project objective

Machine learning and data science are areas of tremendous progress over the last decade, leading to exciting research developments, and significant practical impact. Broadly, progress in this area has been enabled by the rapidly increasing availability of data, by better algorithms, and by large-scale platforms enabling efficient computation on immense datasets. While it is reasonable to expect that the first two trends will continue for the foreseeable future, the same cannot be said of the third trend, of continually increasing computational performance. Increasing computational demands place immense pressure on algorithms and systems to scale, while the performance limits of traditional computing paradigms are becoming increasingly apparent. Thus, the question of building algorithms and systems for scalable machine learning is extremely pressing.

The project will take a decisive step to answer this challenge, developing new abstractions, algorithms and system support for scalable machine learning. In a nutshell, the line of approach is elastic coordination: allowing machine learning algorithms to approximate and/or randomize their synchronization and communication semantics, in a structured, controlled fashion, to achieve scalability. The project exploits the insight that many such algorithms are inherently stochastic, and hence robust to inconsistencies. My thesis is that elastic coordination can lead to significant, consistent performance improvements across a wide range of applications, while guaranteeing provably correct answers.

ScaleML will apply elastic coordination to two specific relevant scenarios: scalability inside a single multi-threaded machine, and scalability across networks of machines. Conceptually, the project’s impact is in providing a set of new design principles and algorithms for scalable computation. It will develop these insights into a set of tools and working examples for scalable distributed machine learning.
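To make the elastic-coordination idea more concrete, the sketch below is a minimal, self-contained illustration (not the project's actual code or results): data-parallel SGD in which simulated workers exchange stochastically quantized gradients instead of exact ones. The communication is lossy but unbiased, so the stochastic algorithm still converges on a toy least-squares task. The toy problem, function names, and parameters are all assumptions made for this example.

```python
# Illustrative sketch only: relaxed ("elastic") communication via unbiased
# stochastic gradient quantization in simulated data-parallel SGD.
import numpy as np

rng = np.random.default_rng(0)

def stochastic_quantize(v, levels=4):
    """Randomized quantization with E[quantized] == v, so SGD stays unbiased."""
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return v
    scaled = np.abs(v) / norm * levels          # map magnitudes to [0, levels]
    lower = np.floor(scaled)
    prob_up = scaled - lower                    # round up with this probability
    q = lower + (rng.random(v.shape) < prob_up)
    return np.sign(v) * q * norm / levels

# Toy problem: each of 4 workers holds a shard of a linear-regression dataset.
n_workers, d = 4, 10
true_w = rng.normal(size=d)
shards = []
for _ in range(n_workers):
    X = rng.normal(size=(200, d))
    y = X @ true_w + 0.01 * rng.normal(size=200)
    shards.append((X, y))

w = np.zeros(d)
lr = 0.05
for step in range(300):
    # Each worker computes a local mini-batch gradient, then "sends" only a
    # quantized, lossy version of it -- the approximate-communication idea.
    grads = []
    for X, y in shards:
        idx = rng.choice(len(y), size=32, replace=False)
        g = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
        grads.append(stochastic_quantize(g))
    w -= lr * np.mean(grads, axis=0)            # averaged, approximate update

print("distance to optimum:", np.linalg.norm(w - true_w))
```

The point of the sketch is the trade-off it exposes: each worker transmits a coarse, randomized version of its gradient, yet because the quantization is unbiased the averaged iterates still approach the optimum, which is the kind of controlled relaxation of communication semantics the abstract describes.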

Are you the coordinator (or a participant) of this project? Please send me more information about the "SCALEML" project.

For instance: the website URL (not yet provided by EU-opendata), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word document), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will add them to your project's page as soon as possible.

Thanks. And then please add a link to this page on your project's website.

The information about "SCALEML" is provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.1.)

MOCHA (2019)

Understanding and leveraging ‘moments of change’ for pro-environmental behaviour shifts

Mu-MASS (2019)

Muonium Laser Spectroscopy

MOBETA (2020)

Motor cortical beta bursts for movement planning and evaluation: Mechanisms, functional roles, and development
