Every company even tangentially involved in technology is currently obsessed with generative AI, which has led to some fascinating and occasionally useful tools. However, the cost to access some of these services is surprisingly high. Training and running AI models are hugely expensive, but new research from the University of Sydney and the University of California points to a potentially cheaper way. The researchers created a rat’s nest of silver nanowires that could prove to be a much more efficient way to run neural networks.
Today’s prominent AI systems like ChatGPT run on massive banks of AI accelerators, essentially GPUs with a ton of VRAM and no video output. The hardware is hugely expensive (you’d be lucky to find a single Nvidia H100 for $30,000), and the power to keep them running is a significant ongoing cost. The nanotechnology approach devised by the researchers has the potential to be much more efficient.
Using advanced nanotech fabrication, the team created networks of silver nanowires, each strand about one-thousandth the width of a human hair. The wires are arranged randomly, forming a network where the strands cross and interact like the synapses in a brain. This is a type of neuromorphic computing, which means the wires behave like a physical neural network—it’s a hardware network instead of a software one.
According to the study published in Nature Communications, the nanowires display brain-like behavior when electrical signals pass through the network. The thousands of intersections between wires undergo signal changes in response to electrical impulses, and that response happens in real time. Thus, the team says nanowire networks are ideal for online machine learning.
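To get an intuition for how a fixed, randomly wired network can respond to streaming signals in real time, here is a minimal software sketch in the style of reservoir computing, a common mathematical analogy for physical networks like this one. This is an illustration under assumed parameters (node count, coupling strengths, input signal), not the study's actual hardware or model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative software analogy (not the study's hardware): a fixed,
# randomly coupled network whose nodes stand in for nanowire junctions.
n = 200                                   # assumed number of junction-like nodes
W = rng.normal(size=(n, n)) / np.sqrt(n)  # random internal couplings
W *= 0.9 / max(abs(np.linalg.eigvals(W))) # rescale so dynamics stay stable
w_in = rng.normal(size=n)                 # how the input signal drives each node

state = np.zeros(n)
signal = np.sin(np.linspace(0, 8 * np.pi, 400))  # example input stream

states = []
for u in signal:
    # Each node's new value depends on the incoming signal and on its
    # neighbors, updated step by step as the stream arrives -- the
    # "real time" response described above.
    state = np.tanh(w_in * u + W @ state)
    states.append(state.copy())

print(len(states), states[-1].shape)
```

Because the network's response at each step depends on its history, a simple readout trained on these states can learn from the stream as it arrives, which is the property that makes such networks attractive for online learning.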
With online learning, you don’t need to bundle data into large batches, which is the reason AI accelerators carry tens of gigabytes of RAM in the first place. Feeding data as a continuous stream is simply more efficient. Even at this early stage, this approach works for some core machine-learning tasks. The team converted the MNIST handwritten digit data set into electrical signals and fed it into their hardware network, which learned to identify the written numbers. They also tested memory-like tasks, such as recalling numbers, which the network handled as well.
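The contrast with batch training can be sketched in software: in online learning, the model is updated after every single sample, so nothing needs to be held in memory beyond the current example. The sketch below uses synthetic two-class data as a stand-in for the streamed MNIST digits, and a simple logistic model rather than the nanowire hardware; all of that is assumed for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for streamed digit data: random 64-dimensional
# "images" from two synthetic classes. The real study streamed MNIST
# as electrical signals into the physical nanowire network.
def make_sample():
    label = int(rng.integers(0, 2))
    x = rng.normal(loc=label * 0.5, scale=1.0, size=64)
    return x, label

# Online learning: the weights are updated one sample at a time,
# so no large batch (and no large memory buffer) is ever needed.
w = np.zeros(64)
b = 0.0
lr = 0.05

for step in range(5000):
    x, y = make_sample()
    z = np.clip(w @ x + b, -30.0, 30.0)   # keep the logit numerically safe
    p = 1.0 / (1.0 + np.exp(-z))          # logistic prediction
    grad = p - y                          # per-sample gradient
    w -= lr * grad * x
    b -= lr * grad

# Evaluate on a fresh stream of samples.
correct = 0
for _ in range(1000):
    x, y = make_sample()
    z = np.clip(w @ x + b, -30.0, 30.0)
    p = 1.0 / (1.0 + np.exp(-z))
    correct += int((p > 0.5) == y)
print(correct / 1000)
```

The key design point is that the loop never stores more than one sample at a time, which is exactly the property that makes streamed, hardware-based learning appealing compared with batch-hungry accelerators.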
It will take a while before nanowire networks can compete with high-power AI accelerators, but there could be applications that don’t need all that power. The team believes there’s still much to learn about neuromorphic networks.
© 2001-2023 Ziff Davis, LLC., a Ziff Davis company. All Rights Reserved.