embedded-analog-intelligence

robotics is more than AI: robotics needs AI embedded in the system, fed from real-time sensor data and planning and adjusting its actions in real time.

Basic theory for NNs

02 Feb 2019 » theory

Notes from Quanta Magazine’s article on a general theory of neural networks.

  • “In 1989, it was proved (paper, or wiki summary) that if a neural network has only a single computational layer, but you allow that one layer to have an unlimited number of neurons, with unlimited connections between them, the network will be capable of performing any task you might ask of it.” This is the universal approximation theorem; a one-wide-layer sketch follows this list.

  • In a paper completed last year, Rolnick and Max Tegmark of the Massachusetts Institute of Technology proved that by increasing depth and decreasing width, you can perform the same functions with exponentially fewer neurons. They showed that if the situation you’re modeling has 100 input variables, you can get the same reliability using either 2^100 neurons in one layer or just 2^10 neurons spread over two layers. They found that there is power in taking small pieces and combining them at greater levels of abstraction instead of attempting to capture all levels of abstraction at once (see the depth-vs-width sketch after this list).

  • Another paper proved that at a certain point, no amount of depth can compensate for a lack of width: if every layer is too narrow (roughly, no wider than the number of inputs), there are functions the network cannot approximate no matter how many layers you stack.
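
To make the 1989 one-wide-layer result concrete, here is a minimal sketch (my illustration, not the original proof technique): a single hidden layer of tanh units with random input weights, where only the output weights are fitted by least squares, approximates a smooth 1-D function better and better as the layer widens.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: an arbitrary smooth function on [-3, 3].
x = np.linspace(-3, 3, 500)[:, None]
y = np.sin(2 * x) + 0.5 * np.cos(5 * x)

def one_layer_fit(width):
    """One hidden tanh layer: random input weights and biases,
    output weights solved by linear least squares."""
    W = rng.normal(scale=2.0, size=(1, width))
    b = rng.uniform(-3, 3, size=width)
    H = np.tanh(x @ W + b)                    # hidden activations, shape (500, width)
    a, *_ = np.linalg.lstsq(H, y, rcond=None)  # fit output weights
    return H @ a                              # network output

for width in (5, 50, 500):
    err = np.max(np.abs(one_layer_fit(width) - y))
    print(f"width={width:4d}  max |error| = {err:.4f}")
# The error shrinks as the single layer gets wider -- the flavor of the
# universal approximation theorem (enough neurons in one layer suffice).
```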
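
And a rough empirical sketch of the depth-vs-width tradeoff (again my illustration, not the Rolnick–Tegmark experiment; the model sizes and target function are arbitrary choices): fit a compositional target with scikit-learn’s MLPRegressor, once with one wide layer and once with two far narrower layers, and compare held-out accuracy.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# The article's counts, for scale: 2**100 neurons in one layer
# vs. 2**10 neurons spread over two layers.
print(f"2**100 = {2**100:.2e}   vs   2**10 = {2**10}")

# A compositional target on 8 inputs: a function of a function,
# the kind of structure depth is claimed to exploit.
X = rng.uniform(-1, 1, size=(4000, 8))
y = np.sin(np.pi * X.sum(axis=1)) ** 2

def held_out_r2(hidden_layer_sizes):
    model = MLPRegressor(hidden_layer_sizes=hidden_layer_sizes,
                         max_iter=2000, random_state=0)
    model.fit(X[:3000], y[:3000])
    return model.score(X[3000:], y[3000:])   # R^2 on unseen data

print("one wide layer, 256 neurons:    R^2 =", held_out_r2((256,)))
print("two narrow layers, 16+16 = 32:  R^2 =", held_out_r2((16, 16)))
# If the 32-neuron deep net scores close to the 256-neuron shallow one,
# that is the (toy-scale) flavor of trading width for depth.
```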