Ido Kanter

Ido Kanter (born 21 November 1959) is an Israeli professor of physics and the head of the Lab for Reverberating Modes in Neural Networks at the Gonda Brain Research Center at Bar-Ilan University. He specializes in models of disordered magnetic systems, physical random number generators, the theory of neural networks, deep learning, and synchronization among neurons and lasers.

Early life and education
Kanter was born and raised in Rehovot, Israel, and served in the Israel Defense Forces from 1978 to 1981.

He attended Bar-Ilan University and graduated with a bachelor's degree in physics and computer science in 1983. In 1987, he received his Ph.D. from Bar-Ilan University. His thesis, Theory of Spin Glasses and its Applications to Complex Problems in Mathematics and Biology, was supervised by Professor Haim Sompolinsky.

He was a visiting research fellow at Princeton University from 1988 to 1989, working with Philip W. Anderson. He was also a visiting research fellow at AT&T Bell Labs, working with Yann LeCun, before joining the physics department at Bar-Ilan University in 1989.

Research
Ido Kanter specializes in models of disordered magnetic systems, ultrafast physical random number generators, the theory of neural networks, neural cryptography, deep learning, synchronization among neurons and lasers, and experimental and theoretical neuroscience, documented in more than 220 publications.

Main contributions
Using a combination of theoretical and experimental methods, Kanter has made contributions to fields ranging from statistical physics and communication to neural cryptography and neuroscience. These include work on a field of statistical physics known as the inverse problem, bridging Shannon theory and the second law of thermodynamics, presenting a cryptographic key-exchange protocol based on neural networks, and creating an ultrafast non-deterministic random bit generator (RBG).
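The neural key-exchange idea can be illustrated with two tree parity machines that synchronize by mutual learning: both parties feed the same public random inputs to their private networks and update weights only when their outputs agree, until the weights become identical and can serve as a shared key. The sketch below is a minimal illustrative Python implementation; the parameter values (K hidden units, N inputs per unit, weight bound L) and helper names are chosen for illustration and are not taken from the original protocol description.

```python
import numpy as np

# Illustrative tree-parity-machine parameters (assumed, not from the original work)
K, N, L = 3, 10, 3  # hidden units, inputs per hidden unit, integer weight bound
rng = np.random.default_rng(0)

def tpm_output(w, x):
    # Each hidden unit outputs the sign of its local field (zero mapped to +1);
    # the machine's output is the product of the hidden-unit signs.
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = 1
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    # Update only the hidden units that agree with the shared output,
    # clipping weights back into [-L, L].
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

# Each party starts from private random integer weights.
wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))  # public random input, seen by both
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:  # learn only when the public outputs agree
        hebbian_update(wA, x, sA, tA)
        hebbian_update(wB, x, sB, tB)
    steps += 1

# After synchronization, the identical weight matrices act as the shared secret.
```

Bidirectional mutual learning of this kind converges far faster than a passive attacker who can only eavesdrop on the inputs and outputs, which is the asymmetry the protocol exploits.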

Kanter currently focuses on experimental and theoretical neuroscience, studying topics including the new neuron, dendritic learning, neural interfaces, and machine learning.