The network, called a reservoir computing system, could predict words before they are said during conversation, and help predict future outcomes based on the present.
Reservoir computing systems, which improve on a typical neural network's capacity and reduce the required training time, have been created in the past with larger optical components.
Researchers from the University of Michigan in the US created their system using memristors, which require less space and can be integrated more easily into existing silicon-based electronics.
Memristors can both perform logic and store data in the same device. This contrasts with typical computer systems, where processors perform logic separately from memory modules.
For the study, published in the journal Nature Communications, the researchers used a special memristor that remembers only events in its recent history.
Inspired by brains, neural networks are composed of neurons, or nodes, and synapses, the connections between nodes.
To train a neural network for a task, the network takes in a large set of questions along with the answers to those questions.
In this process, called supervised learning, the connections between nodes are weighted more heavily or lightly to minimise the error in arriving at the correct answer.
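As a rough illustration of this weight-adjustment process, here is a minimal sketch of supervised learning on a toy problem (the data, network size, and learning rate are illustrative assumptions, not details from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised-learning setup: "questions" are 2-D points,
# "answers" are 1 if the point lies above the line y = x, else 0.
X = rng.uniform(-1, 1, size=(200, 2))          # inputs
y = (X[:, 1] > X[:, 0]).astype(float)          # target answers

# A single layer of weighted connections plus a bias term.
w = rng.normal(size=2)
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(500):
    pred = sigmoid(X @ w + b)                  # network's current answers
    err = pred - y                             # how far off each answer is
    # Adjust the connection weights more heavily or lightly to reduce
    # the error (gradient descent on the loss).
    w -= lr * X.T @ err / len(X)
    b -= lr * err.mean()

accuracy = ((sigmoid(X @ w + b) > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Real networks repeat this kind of update across millions of weights and examples, which is why training can take days or months.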
"A lot of times, it takes days or months to train a network. It is very expensive," said Wei Lu, professor at University of Michigan.
Image recognition is a relatively simple problem, as it does not require any information apart from a static image.
More complex tasks, such as speech recognition, can depend highly on context and require neural networks to have knowledge of what has just occurred or what has just been said.
This requires a recurrent neural network, which incorporates loops that give the network a memory effect. However, training these recurrent networks is especially expensive, Lu said.
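A small sketch of why such a loop gives a network memory: the hidden state is fed back in at every step, so the output depends on the history of inputs, not just the latest one (the weights and sequences below are hypothetical, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# A minimal recurrent cell: the hidden state h is fed back into the
# network at every step, so it carries information about past inputs.
n_in, n_hidden = 3, 5
W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))       # input weights
W_rec = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # the "loop"

def run_rnn(sequence):
    h = np.zeros(n_hidden)                 # memory starts empty
    for x in sequence:
        h = np.tanh(W_in @ x + W_rec @ h)  # new state mixes input with memory
    return h

# Two sequences that end with the same input but have different histories
# produce different final states -- the loop remembers the context.
seq_a = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])]
seq_b = [np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
print(run_rnn(seq_a))
print(run_rnn(seq_b))
```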
Reservoir computing systems built with memristors, by contrast, can skip most of the expensive training process and still give the network the capability to remember.
This is because the most critical component of the system - the reservoir - does not require training.
When a set of data is fed into the reservoir, the reservoir identifies important time-related features of the data and hands them off, in a simpler format, to a second network.
"The beauty of reservoir computing is that while we design it, we don't have to train it," Lu said.
The team proved the reservoir computing concept using a test of handwriting recognition, a common benchmark among neural networks.
Using only 88 memristors, compared to a conventional network that would require thousands for the task, the reservoir achieved 91 per cent accuracy.