BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.98.100//NONSGML kigkonsult.se iCalcreator 2.20.4//
BEGIN:VEVENT
UID:20251007T023606EDT-4881v5L1kk@132.216.98.100
DTSTAMP:20251007T063606Z
DESCRIPTION:Progress in theoretical understanding of deep learning.\n\nDeep learning arose around 2006 as a renewal of neural network research\, allowing such models to have more layers. Theoretical investigations have shown that functions obtained as deep compositions of simpler functions (which includes both deep and recurrent nets) can express highly varying functions (with many ups and downs\, and many different input regions that can be distinguished) much more efficiently (with fewer parameters) than otherwise\, under a prior which seems to work well for artificial intelligence tasks. Empirical work in a variety of applications has demonstrated that\, when well trained\, such deep architectures can be highly successful\, remarkably breaking through the previous state of the art in many areas\, including speech recognition\, object recognition\, language models\, machine translation and transfer learning. Although neural networks have long been considered lacking in theory\, and much remains to be done\, theoretical advances have been made and will be discussed\, in support of distributed representations\, depth of representation\, the non-convexity of the training objective\, and the probabilistic interpretation of learning algorithms (especially of the auto-encoder type\, which previously lacked one). The talk will focus on the intuitions behind these theoretical results.\n
DTSTART:20161118T203000Z
DTEND:20161118T213000Z
LOCATION:room 1205\, Burnside Hall\, 805 rue Sherbrooke Ouest\, Montreal\, QC\, H3A 0B9\, CA
SUMMARY:Yoshua Bengio\, Université de Montréal
URL:/mathstat/channels/event/yoshua-bengio-universite-de-montreal-264235
END:VEVENT
END:VCALENDAR