Coordinator | TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Coordinator nationality | Israel [IL]
Total cost | 1 453 802 €
EC contribution | 1 453 802 €
Programme | FP7-IDEAS-ERC
Specific programme: "Ideas" implementing the Seventh Framework Programme of the European Community for research, technological development and demonstration activities (2007 to 2013)
Call code | ERC-2013-StG
Funding scheme | ERC-SG
Start year | 2013
Period (year-month-day) | 2013-10-01 to 2018-09-30
# | Participant | Country (city) | Role | EC contribution (€)
---|---|---|---|---
1 | TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY (address: TECHNION CITY - SENATE BUILDING) | IL (HAIFA) | hostInstitution | 1 453 802.00
'The statistical and computational theory of learning is one of the prime achievements of computer science and engineering. This is evident both in the mathematical elegance with which it captures intuitive notions rigorously and in its practical applicability: machine learning has effectively reshaped the way we use information. In this proposal we tackle the very basic notions of learning. Learning theory traditionally focuses on statistics and computation. We propose to add information to the characterization of learning; namely, the research question we address is: how much information is necessary to learn a certain concept efficiently? The crucial difference from classical learning theory is that statistical complexity has traditionally been measured in terms of the number of examples needed to learn a concept. Our question is more finely grained: what if we are allowed to inspect only parts of a given example? Can we reduce the amount of information necessary to successfully learn important concepts? This question is fundamental to understanding learning in general and to designing efficient learning algorithms in particular. We show how recent advances in convex optimization for machine learning yield positive answers to some of the above questions: there exist cases in which much more efficient algorithms are available for learning practically important concepts. Our goal is to characterize learning from the viewpoint of the amount of information necessary to learn, and to design new algorithms that access less information than the current state of the art and are consequently significantly more efficient. New answers to these fundamental questions would be a breakthrough in our understanding of learning at large, with significant potential for impact on the field of machine learning and its applications.'
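The abstract's central idea, learning a concept while inspecting only part of each example, can be illustrated with a small sketch. The code below is not the project's algorithm; it is a minimal illustration, under stated assumptions, of one standard approach to this setting: stochastic gradient descent for a linear least-squares predictor in which each update reads only k of the d attributes of an example, with importance weighting and two independent attribute samples keeping the gradient estimate unbiased. The function and parameter names (partial_info_sgd, k, lr, epochs) are illustrative and not taken from the project.

```python
import numpy as np

def partial_info_sgd(X, y, k=3, lr=0.05, epochs=50, seed=0):
    """Least-squares SGD that inspects only 2*k of the d attributes per update."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            # Two independent samples of k attribute indices; rescaling by d/k
            # makes each sparse vector an unbiased estimate of the full example.
            idx1 = rng.choice(d, size=k, replace=False)
            idx2 = rng.choice(d, size=k, replace=False)
            x1 = np.zeros(d)
            x1[idx1] = X[i, idx1] * (d / k)
            x2 = np.zeros(d)
            x2[idx2] = X[i, idx2] * (d / k)
            # Independence of the two samples gives an unbiased gradient of the
            # squared loss: E[(w @ x1 - y_i) * x2] = (w @ X[i] - y_i) * X[i].
            grad = (w @ x1 - y[i]) * x2
            w -= (lr / np.sqrt(t)) * grad  # decaying step size for stability
    return w

if __name__ == "__main__":
    # Synthetic demo data (illustrative only).
    rng = np.random.default_rng(1)
    d, n = 10, 2000
    w_true = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    w_hat = partial_info_sgd(X, y, k=3)
    print("relative recovery error:",
          np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true))
```

With k equal to d the estimator reduces to ordinary SGD on fully observed examples; the point of the sketch is only that a useful predictor can still be learned when far fewer attributes are read per example, which is the kind of information-limited learning the abstract describes.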