Explore the word cloud of the ACHIEVE project. It gives a rough idea of what the "ACHIEVE" project is about.
The following table provides information about the project.
Coordinator | AGENCIA ESTATAL CONSEJO SUPERIOR DE INVESTIGACIONES CIENTIFICAS |
Coordinator country | Spain [ES] |
Total cost | 2˙266˙907 € |
EC max contribution | 2˙266˙907 € (100%) |
Programme | H2020-EU.1.3.1. (Fostering new skills by means of excellent initial training of researchers) |
Call code | H2020-MSCA-ITN-2017 |
Funding scheme | MSCA-ITN-ETN |
Starting year | 2017 |
Duration | from 2017-10-01 to 2021-09-30 |
Take a look at the project's partnership.
# | Organization | Country (City) | Role | EC contribution (€) |
---|---|---|---|---|
1 | AGENCIA ESTATAL CONSEJO SUPERIOR DE INVESTIGACIONES CIENTIFICAS | ES (MADRID) | coordinator | 495˙745.00 |
2 | UNIVERSITEIT GENT | BE (GENT) | participant | 501˙120.00 |
3 | COMMUNAUTE D' UNIVERSITES ET ETABLISSEMENTS UNIVERSITE BOURGOGNE - FRANCHE - COMTE | FR (BESANCON) | participant | 262˙875.00 |
4 | UNIVERSITE CLERMONT AUVERGNE | FR (CLERMONT-FERRAND) | participant | 262˙875.00 |
5 | UNIVERSITA DEGLI STUDI DI UDINE | IT (UDINE) | participant | 258˙061.00 |
6 | IMASENIC ADVANCED IMAGING SL | ES (BARCELONA) | participant | 247˙872.00 |
7 | UNIVERSIDADE DE COIMBRA | PT (COIMBRA) | participant | 238˙356.00 |
8 | FLIR Systems Trading Belgium FSTB BVBA | BE (Kortrijk) | partner | 0.00 |
9 | Kovilta Oy | FI (Piispanristi) | partner | 0.00 |
10 | NVIDIA Ltd | UK (London) | partner | 0.00 |
11 | Prefixa Inc. | US (Rochester Hills) | partner | 0.00 |
ACHIEVE-ETN aims to train a new generation of scientists through a research programme on highly integrated hardware-software components for the implementation of ultra-efficient embedded vision systems as the basis for innovative distributed vision applications. The researchers will develop core skills in multiple disciplines, from image sensor design to distributed vision algorithms, while sharing the multidisciplinary background necessary to understand complex problems in information-intensive vision-enabled applications. Concurrently, they will develop a set of transferable skills to promote their ability to cast their research results into new products and services, as well as to boost their career prospects overall. Altogether, ACHIEVE-ETN will prepare highly skilled early-stage researchers able to create innovative solutions for emerging technology markets in Europe and worldwide, and also to drive new businesses by engaging in related entrepreneurial activities. The consortium is composed of 6 academic and 1 industrial beneficiaries and 4 industrial partners. The training of the 9 ESRs will be achieved by the proper combination of excellent research, secondments with industry, specific courses on core and transferable skills, and academic-industrial workshops and networking events, all in compliance with the call's objectives of international, intersectoral and interdisciplinary mobility.
Deliverable | Type | Last update |
---|---|---|
Gender balance code | Documents, reports | 2020-03-11 14:39:37 |
Ethical guidelines | Documents, reports | 2020-03-11 14:39:22 |
Main recruitment call | Other | 2020-03-11 14:39:34 |
Data management plan | Open Research Data Pilot | 2020-03-11 14:39:20 |
List of selected fellows | Other | 2020-03-11 14:39:41 |
Supervisory Board | Other | 2019-07-05 14:48:06 |
Programme website and communication tools | Websites, patent fillings, videos etc. | 2019-07-05 14:48:07 |
Take a look at the detailed list of ACHIEVE deliverables above. The following table lists the project's publications.
year | authors and title | journal | last update |
---|---|---|---|
2020 | Ch. Lyu, P. Heyer Wollenberg, L. Platiša, B. Goossens, P. Veelaert and W. Philips. "Clip-level Feature Aggregation: A Key Factor for Video-based Person Re-Identification", pp. 179-191 | Advanced Concepts for Intelligent Vision Systems - ACIVS 2020 | 2020-03-11 |
2018 | Delia Velasco-Montero, Jorge Fernandez-Berni, Ricardo Carmona-Galan, Angel Rodriguez-Vazquez. "Optimum Selection of DNN Model and Framework for Edge Inference", pp. 51680-51692, ISSN: 2169-3536, DOI: 10.1109/access.2018.2869929 | IEEE Access 6 | 2019-11-08 |
2018 | Delia Velasco-Montero, Jorge Fernandez-Berni, Ricardo Carmona-Galan, Angel Rodriguez-Vazquez. "Optimum Selection of DNN Model and Framework for Edge Inference", pp. 1-1, ISSN: 2169-3536, DOI: 10.1109/ACCESS.2018.2869929 | IEEE Access | 2019-07-05 |
2018 | Ion Vornicu, Ricardo Carmona-Galán, Ángel Rodríguez-Vázquez. "Demo: CMOS-SPAD Camera Prototype for Single-Sensor 2D/3D Imaging" | Proceedings of International Conference on Distributed Smart Cameras (ICDSC'18) | 2019-07-05 |
2018 | Ricardo Carmona-Galán, Jorge Fernández-Berni, Ángel Rodríguez-Vázquez, Paula López, Victor Manuel Brea, Diego Cabello, Ginés Domenech-Asensi, Ramón Ruiz-Merino, Juan Zapata-Pérez. "Demo: Results of iCaveats, a Project on the Integration of Architectures and Components for Embedded Vision". Eindhoven (Netherlands), Sept. 2018 | Proceedings of International Conference on Distributed Smart Cameras (ICDSC 2018) | 2019-07-05 |
2018 | Delia Velasco-Montero, Jorge Fernández-Berni, Ricardo Carmona-Galán, Ángel Rodríguez-Vázquez. "Demo: Deployment of DNNs on Heterogeneous Hardware in a Low-Cost Smart Camera" | Proceedings of International Conference on Distributed Smart Cameras (ICDSC 2018) | 2019-07-05 |
Are you the coordinator (or a participant) of this project? Please send me more information about the "ACHIEVE" project.
For instance: the website URL (not yet provided by EU open data), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.
Send me an email (fabio@fabiodisconzi.com) and I will add them to your project's page as soon as possible.
Thanks. And then please add a link to this page on your project's website.
The information about "ACHIEVE" is provided by the European Open Data Portal: CORDIS open data.