Algorithm Excels at Virtual Wine Tasting with New Hardware Upgrade
As successful as AI has been at merging with the modern world and improving so much of it, it is still not enough for some researchers. And while it may seem that AI is close to being considered sentient, it still lags behind the human brain on many levels: AI is slower, and it uses far more energy to do the same task as a brain. The National Institute of Standards and Technology (NIST) put out an article explaining a new way to save on power usage, and by happy coincidence the research includes a wine-tasting algorithm that is teaching the researchers how this might work.
Scientists with NIST’s Hardware for AI program and their University of Maryland colleagues fabricated and programmed a very simple neural network from magnetic tunnel junctions (MTJs) provided by their collaborators at Western Digital’s Research Center in San Jose, California.
Just like any wine connoisseur, the AI system needed to train its virtual palate. The team trained the network on 148 wines from a dataset of 178 made from three types of grapes. Each virtual wine had 13 characteristics, such as alcohol level, color, flavonoids, ash, alkalinity, and magnesium. Each characteristic was assigned a value between 0 and 1 for the network to consider when distinguishing one wine from the others.
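The dataset described here matches the classic UCI wine dataset (178 wines, 13 measured characteristics, three grape cultivars), which happens to ship with scikit-learn. A minimal sketch of the 0-to-1 scaling step, assuming that dataset and scikit-learn's `MinMaxScaler` as a stand-in for the team's actual preprocessing:

```python
from sklearn.datasets import load_wine
from sklearn.preprocessing import MinMaxScaler

# 178 wines, 13 characteristics (alcohol, flavonoids, ash, magnesium, ...)
data = load_wine()

# Rescale every characteristic into the [0, 1] range the network consumes
X = MinMaxScaler().fit_transform(data.data)

print(X.shape)        # (178, 13)
print(X.min(), X.max())  # 0.0 1.0
```

Normalizing each feature to the same range keeps characteristics measured in very different units (proline in the hundreds, hue near 1) from dominating the network's inputs.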
“It’s a virtual wine tasting, but the tasting is done by analytical equipment that is more efficient but less fun than tasting it yourself,” said NIST physicist Brian Hoskins.
The network was then given a virtual wine-tasting test on the full dataset, which included 30 wines it hadn’t seen before. The system passed with a 95.3% success rate, misidentifying only two of the 30 wines it hadn’t trained on. The researchers considered this a good sign.
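The train-on-148, test-on-30 setup can be reproduced in software with a small conventional classifier. This is only a sketch: it assumes the UCI wine dataset and scikit-learn's `MLPClassifier` in place of the MTJ hardware the researchers actually built, and the exact accuracy will vary with the split and random seed:

```python
from sklearn.datasets import load_wine
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

data = load_wine()
X = MinMaxScaler().fit_transform(data.data)  # 13 features scaled to [0, 1]
y = data.target                              # three grape cultivars

# Hold out 30 wines the network never sees during training (148 remain)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=30, random_state=0, stratify=y
)

# A very simple network: one small hidden layer of "synapses"
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

acc = clf.score(X_test, y_test)
print(f"held-out accuracy: {acc:.1%}")
```

On this well-separated dataset a tiny network typically scores in the same high-90s ballpark the researchers report, though the figure here is an illustration, not their measurement.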
“Getting 95.3% tells us that this is working,” said NIST physicist Jabez McClelland.
Not all algorithms have such a refined taste for the grapes.
Energy Savings
Now, in a study published in the journal Physical Review Applied, scientists at the National Institute of Standards and Technology (NIST) and their collaborators have developed a new type of hardware for AI that could use several times less energy and operate more quickly. The researchers have begun using nanomagnets to act as “synapses” in the hardware that mimics the brain.
A less energy-intensive approach would be to use other kinds of hardware to create AI’s neural networks, and research teams are searching for alternatives. One device that shows promise is the magnetic tunnel junction (MTJ), which is good at the kinds of math a neural network uses and needs only comparatively few sips of energy. Other novel devices based on MTJs have been shown to use several times less energy than their traditional hardware counterparts. MTJs can also operate more quickly because they store data in the same place they do their computation, unlike conventional chips that store data elsewhere. Perhaps best of all, MTJs are already important commercially: they have served as the read-write heads of hard disk drives for years and are being used as novel computer memories today.
The recently rumored sentient AI gets closer to being a reality every day. And it will have a list of wines to sample.
read more at nist.gov