Scientists at the Massachusetts Institute of Technology (MIT) have established a new field of research within artificial intelligence (AI) called Analog Deep Learning (ADL). They are developing artificial analog synapses that promise significantly higher computing power at lower energy consumption. The researchers have already built devices that switch roughly a million times faster than synapses in the human brain. The latest version of these artificial synapses, made from an inorganic material called phosphosilicate glass (PSG), also dramatically outperforms the team’s earlier design.

The artificial synapses are programmable resistors that play a role comparable to that of transistors in digital processors. Many of these resistors are combined to form an analog processor optimized for ADL: a network of analog artificial “neurons” and “synapses” that performs calculations much like a digital neural network. Such a network can then be trained for complex AI tasks such as image recognition and natural language processing. In the human brain, learning occurs primarily through the strengthening and weakening of the synapses that connect neurons; a deep neural network replicates this mechanism by adjusting its weights during training.
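To make the analogy concrete, here is a minimal sketch of how a grid of programmable conductances can compute a neural-network layer. This is an illustrative model, not MIT’s actual device physics: by Ohm’s and Kirchhoff’s laws, driving input voltages across a crossbar of conductances sums currents on each output line, which is exactly the matrix-vector product at the heart of a neural layer.

```python
import numpy as np

# Conceptual sketch (not the actual MIT device model): each "synapse"
# is a programmable conductance G[i, j]. Applying input voltages V
# along the rows yields output currents I = G @ V (Ohm's law plus
# Kirchhoff's current law), i.e. a matrix-vector product.

rng = np.random.default_rng(0)

n_inputs, n_outputs = 4, 3
G = rng.uniform(0.0, 1.0, size=(n_outputs, n_inputs))  # conductances (arbitrary units)
V = rng.uniform(-1.0, 1.0, size=n_inputs)              # input voltages

I = G @ V                # summed current on each output line
activation = np.tanh(I)  # a "neuron" nonlinearity applied to each output
print(activation)
```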

The analog processor learns by increasing and decreasing the electrical conductance of its resistors, the hardware counterpart of strengthening and weakening synapses. The researchers are currently working on fabricating the analog synapses with conventional silicon manufacturing techniques, which would significantly reduce the cost of analog processors and open the door to industrial production. If that succeeds, these artificial synapses could give AI a far more energy-efficient and powerful computing platform.
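The training step can be pictured the same way. The sketch below is hypothetical (the pulse size G_STEP, the conductance range, and the program() helper are all assumptions for illustration): each weight update is quantized into discrete conductance pulses that nudge a resistor up or down, the analog counterpart of a gradient step on a digital weight.

```python
import numpy as np

# Hypothetical training sketch: learning happens by nudging each
# synapse's conductance up or down in small, discrete pulses.

rng = np.random.default_rng(1)

G_STEP = 0.01            # assumed smallest conductance change per pulse
G_MIN, G_MAX = 0.0, 1.0  # assumed physical conductance range

def program(G, gradient, lr=0.1):
    """Quantize the desired update into whole conductance pulses."""
    pulses = np.round(-lr * gradient / G_STEP)
    return np.clip(G + pulses * G_STEP, G_MIN, G_MAX)

# Toy task: fit one row of analog "synapses" to a linear target y = w . x.
w_true = np.array([0.7, 0.2, 0.5])
G = rng.uniform(G_MIN, G_MAX, size=3)

for _ in range(200):
    x = rng.uniform(-1.0, 1.0, size=3)
    error = G @ x - w_true @ x  # prediction error for this sample
    G = program(G, error * x)   # push conductances toward the target

print(G)  # settles near w_true, within the pulse resolution
```

Over many samples the conductances settle near the target weights, limited only by the pulse resolution; this is why fast, finely adjustable resistors such as the PSG devices matter for analog training.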
