TheoryDL (SIGNED)

Practically Relevant Theory of Deep Learning

Project "TheoryDL" data sheet

The following table provides information about the project.

Coordinator
THE HEBREW UNIVERSITY OF JERUSALEM 

Organization address
address: EDMOND J SAFRA CAMPUS GIVAT RAM
city: JERUSALEM
postcode: 91904
website: www.huji.ac.il

contact info: not available

Coordinator country: Israel [IL]
Total cost: 1,342,500 €
EC max contribution: 1,342,500 € (100%)
Programme: H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
Call code: ERC-2015-STG
Funding scheme: ERC-STG
Starting year: 2016
Duration: from 2016-02-01 to 2021-01-31

 Partnership

Take a look at the project's partnership.

#  Participant                          Country          Role         EC contrib. [€]
1  THE HEBREW UNIVERSITY OF JERUSALEM   IL (JERUSALEM)   coordinator  1,342,500.00

 Project objective

One of the most significant recent developments in applied machine learning has been the resurgence of "deep learning", usually in the form of artificial neural networks. The empirical success of deep learning is stunning, and deep-learning-based systems have already led to breakthroughs in computer vision and speech recognition. In contrast, from the theoretical point of view, by and large we do not understand why deep learning is possible at all, since most state-of-the-art theoretical results show that deep learning is computationally hard.

Bridging this gap is a great challenge since it involves proficiency in several theoretical fields (algorithms, complexity, and statistics) and at the same time requires a good understanding of real-world practical problems and the ability to conduct applied research. We believe that a good theory must lead to better practical algorithms. It should also broaden the applicability of learning in general, and deep learning in particular, to new domains. Such a practically relevant theory may also lead to a fundamental paradigm shift in the way we currently analyze the complexity of algorithms.

Previous works by the PI and his colleagues and students have provided novel ways to analyze the computational complexity of learning algorithms and to understand the tradeoffs between data and computational time. In this proposal, in order to bridge the gap between theory and practice, I suggest a departure from worst-case analyses and the development of a more optimistic, data-dependent theory with "grey" components. Success will lead to a breakthrough in our understanding of learning at large, with significant potential for impact on the field of machine learning and its applications.
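
One of the project's publications listed below, "SGD Learns Over-parameterized Networks that Provably Generalize on Linearly Separable Data" (ICLR 2018), gives a concrete flavour of such a data-dependent analysis. The following is a minimal illustrative sketch, not the project's code and not the paper's exact model or algorithm: plain SGD with a hinge loss trains a small over-parameterized one-hidden-layer ReLU network on synthetic linearly separable data. All variable names and hyperparameters are arbitrary choices made for the illustration.

```python
# Illustrative sketch only (assumptions: hinge loss, fixed output weights,
# synthetic data); it is not taken from the project or the cited paper.
import numpy as np

rng = np.random.default_rng(0)

# Linearly separable data: labels are the sign of a fixed linear rule.
n, d, hidden = 200, 10, 100            # hidden >> d: over-parameterized
X = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
y = np.sign(X @ w_star)

# One-hidden-layer ReLU network; only the hidden weights W are trained.
W = rng.normal(scale=0.1, size=(hidden, d))
v = rng.choice([-1.0, 1.0], size=hidden) / hidden

def forward(x, W):
    return v @ np.maximum(W @ x, 0.0)

# Plain SGD on the hinge loss, one example at a time.
lr = 0.1
for epoch in range(20):
    for i in rng.permutation(n):
        x, label = X[i], y[i]
        if label * forward(x, W) < 1.0:          # hinge loss is active
            active = (W @ x > 0).astype(float)   # ReLU gate indicators
            W -= lr * (-label) * (v * active)[:, None] * x[None, :]

acc = np.mean(np.sign([forward(x, W) for x in X]) == y)
print(f"training accuracy on the separable sample: {acc:.2f}")
```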

 Publications

List of publications (year, authors, title, venue, last update):

2016. Alon Gonen, Francesco Orabona, Shai Shalev-Shwartz. Solving Ridge Regression using Sketched Preconditioned SVRG. ICML. Last update: 2019-07-08.
2016. Elad Hazan, Kfir Y. Levy, Shai Shalev-Shwartz. On Graduated Optimization for Stochastic Non-Convex Problems. ICML. Last update: 2019-07-08.
2017. Shai Shalev-Shwartz, Ohad Shamir, Shaked Shammah. Failures of Gradient-Based Deep Learning. ICML. Last update: 2019-07-08.
2018. Eran Malach, Shai Shalev-Shwartz. A Provably Correct Algorithm for Deep Learning that Actually Works. Last update: 2019-07-08.
2017. Shai Shalev-Shwartz, Ohad Shamir, Shaked Shammah. Weight Sharing is Crucial to Succesful Optimization. Last update: 2019-07-08.
2018. Or Sharir, Amnon Shashua. Sum-Product-Quotient Networks. AISTATS. Last update: 2019-07-08.
2016. Shai Shalev-Shwartz. SDCA without Duality, Regularization, and Individual Convexity. ICML. Last update: 2019-07-08.
2018. Alon Brutzkus, Amir Globerson, Eran Malach, Shai Shalev-Shwartz. SGD Learns Over-parameterized Networks that Provably Generalize on Linearly Separable Data. International Conference on Learning Representations (ICLR). Last update: 2019-07-08.
2018. Alon Gonen, Shai Shalev-Shwartz. Average Stability is Invariant to Data Preconditioning. Implications to Exp-concave Empirical Risk Minimization. JMLR (ISSN 1532-4435). Last update: 2019-07-08.
2018. Yoav Levine, Or Sharir, Alon Ziv, Amnon Shashua. On the Long-Term Memory of Deep Recurrent Networks. ICLR. Last update: 2019-07-08.
2018. Or Sharir, Amnon Shashua. On the Expressive Power of Overlapping Architectures of Deep Learning. ICLR. Last update: 2019-07-08.
2016. Amit Daniely, Shai Shalev-Shwartz. Complexity theoretic limitations on learning DNF's. COLT. Last update: 2019-07-08.
2017. Alon Gonen, Shai Shalev-Shwartz. Fast Rates for Empirical Risk Minimization of Strict Saddle Problems. COLT. Last update: 2019-07-08.
2017. Eran Malach, Shai Shalev-Shwartz. Decoupling "when to update" from "how to update". NIPS. Last update: 2019-07-08.
2018. Jonathan Fiat, Shai Shalev-Shwartz. AproxiPong: Understanding the Merits and Pitfalls of Reinforcement Learning Algorithms when combined with Deep Learning. Last update: 2019-07-08.

The information about "THEORYDL" is provided by the European Open Data Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.1.):

E-DIRECT (2020): Evolution of Direct Reciprocity in Complex Environments

HYDROGEN (2019): HighlY performing proton exchange membrane water electrolysers with reinforceD membRanes fOr efficient hydrogen GENeration

REPLAY_DMN (2019): A theory of global memory systems