The following table provides key information about the "THUNDEEP" project.
Field | Value
---|---
Coordinator | WEIZMANN INSTITUTE OF SCIENCE
Organization address | |
Coordinator country | Israel [IL]
Total cost | 1,442,360 €
EC max contribution | 1,442,360 € (100%)
Programme | H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
Call code | ERC-2017-STG
Funding scheme | ERC-STG
Starting year | 2018
Duration | from 2018-09-01 to 2023-08-31
Take a look at the project's partnership.
# | Participant | Country (City) | Role | EC contribution (€)
---|---|---|---|---
1 | WEIZMANN INSTITUTE OF SCIENCE | IL (REHOVOT) | coordinator | 1,442,360.00
The rise of deep learning, in the form of artificial neural networks, has been the most dramatic and important development in machine learning over the past decade. Much more than a merely academic topic, deep learning is currently being widely adopted in industry, placed inside commercial products, and is expected to play a key role in anticipated technological leaps such as autonomous driving and general-purpose artificial intelligence. However, our scientific understanding of deep learning is woefully incomplete. Most methods to design and train these systems are based on rules-of-thumb and heuristics, and there is a drastic theory-practice gap in our understanding of why these systems work in practice. We believe this poses a significant risk to the long-term health of the field, as well as an obstacle to widening the applicability of deep learning beyond what has been achieved with current methods.
Our goal is to tackle head-on this important problem, and develop principled tools for understanding, designing, and training deep learning systems, based on rigorous theoretical results.
Our approach is to focus on three inter-related sources of performance loss in neural network learning: their optimization error (that is, how to train a given network in a computationally efficient manner); their estimation error (how to ensure that training a network on a finite training set will yield good performance on future examples); and their approximation error (how architectural choices affect the type of functions the networks can compute). For each of these problems, we show how recent advances allow us to approach them effectively, and describe concrete preliminary results and ideas, which will serve as starting points and indicate the feasibility of this challenging project.
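The three error sources named above correspond to the classical additive decomposition of a learner's excess risk. A minimal sketch with made-up numbers (none of these values come from the project; they are purely illustrative):

```python
# Illustrative excess-risk decomposition (hypothetical values, not project data).
bayes_risk = 0.05           # irreducible risk of the best possible predictor
best_in_class_risk = 0.08   # best risk achievable by the chosen architecture
erm_risk = 0.10             # best risk achievable given the finite training set
trained_risk = 0.12         # risk of the network actually found by training

approximation_error = best_in_class_risk - bayes_risk  # architectural choices
estimation_error = erm_risk - best_in_class_risk       # finite training data
optimization_error = trained_risk - erm_risk           # training procedure

# The three terms sum (telescopically) to the total excess risk.
excess_risk = trained_risk - bayes_risk
assert abs(excess_risk
           - (approximation_error + estimation_error + optimization_error)) < 1e-9
```

The point of the sketch is only that the three terms telescope: improving any one of them, with the others held fixed, lowers the total excess risk.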
Are you the coordinator (or a participant) of this project? Please send me more information about the "THUNDEEP" project.
For instance: the website URL (it has not been provided by the EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.
Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible. Thanks. And please add a link to this page on your project's website.
The information about "THUNDEEP" is provided by the European Open Data Portal: CORDIS open data.