REMINDER: IJS COLLOQUIUM, Wednesday, 8 May 2019, at 13:00, prof. dr. Matteo Marsili

Natasa Gosevac Natasa.Gosevac at ijs.si
Tue May 7 10:06:06 CEST 2019


We invite you to the 17th lecture in the "Kolokviji na IJS" series in the
academic year 2018/19, which will take place on Wednesday, 8 May 2019, at
13:00 in the Main Lecture Hall of the Jožef Stefan Institute, Jamova cesta
39, Ljubljana. The lecture announcement is also available at
http://www.ijs.si/ijsw/Koledar_prireditev, and recordings of past lectures
at http://videolectures.net/kolokviji_ijs.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

prof. dr. Matteo Marsili

International Centre for Theoretical Physics, Trieste, Italy

 

Theory of Optimal Learning Machines

 

Living systems must build efficient representations of the environment in
which they live. This task is similar to the one solved by machine
learning in artificial intelligence with algorithms such as deep neural
networks. The representations built in this way can be seen as a
compressed generative model of the states of the environment. Such models
obey a principle of maximal relevance, and their thermodynamics is
therefore characterised by an exponential density of states. Maximally
informative representations of this kind generally exhibit statistical
criticality (i.e. a power-law frequency distribution) and Zipf's law at
the optimal compression ratio. This conclusion, which is supported by many
examples from natural systems and artificial intelligence, opens the way
to building efficient representations and selecting the most relevant
variables in high-dimensional data.

 

The lecture will be in English.

You are cordially invited!

 
***********

 

We invite you to the 17th Institute colloquium in the academic year
2018/19. The colloquium will be held on Wednesday, May 8, 2019, at 13:00
in the main Institute lecture hall, Jamova 39, Ljubljana. To read the
abstract, visit http://www.ijs.si/ijsw/Koledar_prireditev. Past colloquia
are posted on http://videolectures.net/kolokviji_ijs.

********************************************

prof. dr. Matteo Marsili

International Centre for Theoretical Physics, Trieste, Italy

 

Theory of Optimal Learning Machines

 

Living systems need to generate efficient representations of the
environments they live in. This problem is similar to the one solved by
learning machines in artificial intelligence, such as deep neural
networks. These representations can be seen as a compressed generative
model of the states of the environment. These models obey a principle of
maximal relevance and, as a result, their thermodynamics is characterised
by an exponential density of states. A consequence of this is that
maximally informative representations of this kind exhibit statistical
criticality (i.e. a power-law frequency distribution) in general, and
Zipf's law at the optimal compression ratio. This conclusion is supported
by extensive evidence in natural systems as well as in artificial
intelligence, and it opens the way to identifying efficient
representations and selecting the relevant variables in high-dimensional
data.
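
As a rough illustration of the statistical criticality described above (a
sketch for the reader, not part of the colloquium material; the toy data
and the simple log-log fit are assumptions made for demonstration), the
following Python snippet estimates the empirical rank-frequency curve of a
sample of discrete representation states and fits its slope on a log-log
scale; Zipf's law corresponds to a slope near -1.

from collections import Counter

import numpy as np


def rank_frequency(samples):
    # Empirical frequency of each observed state, sorted from most to
    # least frequent, together with the corresponding 1-based ranks.
    counts = np.array(sorted(Counter(samples).values(), reverse=True),
                      dtype=float)
    freqs = counts / counts.sum()
    ranks = np.arange(1, len(freqs) + 1)
    return ranks, freqs


def loglog_slope(ranks, freqs):
    # Least-squares slope of log(frequency) vs. log(rank); a slope close
    # to -1 corresponds to Zipf's law.
    slope, _intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return slope


if __name__ == "__main__":
    # Toy data: 50,000 samples from a heavy-tailed (Zipfian) distribution,
    # standing in for the internal states of a learned representation.
    rng = np.random.default_rng(0)
    samples = rng.zipf(a=2.0, size=50_000)
    ranks, freqs = rank_frequency(samples)
    print(f"fitted log-log slope: {loglog_slope(ranks, freqs):.2f}")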

 

You are cordially invited!

 

 

 


