Analog AI? It sounds crazy, but it might be the future

Forget digital. The future of AI is… analog? At least, that’s the claim of Mythic, an AI chip company that, in its own words, is making “a leap in performance” by going back in time. Kind of.

Before ENIAC, the world’s first programmable, electronic, general-purpose digital computer, came to life in 1945, all computers were analog – and had been for as long as computers had existed.

Analog computers are a bit like stereo amplifiers, using a variable range to represent the desired values. In an analog computer, numbers are represented by currents or voltages, rather than by the zeros and ones of a digital computer. While ENIAC marked the beginning of the end for analog computing, analog machines persisted in one form or another into the 1950s and 1960s, when digital transistors took over.

“Digital replaced analog computing,” Tim Vehling, senior vice president of product and business development at Mythic, told Digital Trends. “It was cheaper, faster, more powerful, and so on. [As a result,] analog went away for a while.”

To modify a famous quote often attributed to Mark Twain, though, the reports of the death of analog computing may have been greatly exaggerated. If the triumph of the digital transistor marked the beginning of the end for analog computers, it may only have been the beginning of the end of the beginning.

Building the next great AI processor

Mythic AI logo on a graphics chip. Image: Mythic

Mythic isn’t building retro technology for its own sake, though. This is not a steampunk startup operating out of a vintage clock-tower headquarters filled with Tesla coils; it is a well-funded technology company, based in Redwood City, California, and Austin, Texas, that builds Mythic Analog Matrix Processors (Mythic AMP) promising advances in power, performance, and cost through a unique analog compute architecture that departs significantly from mainstream digital architectures.

Devices like the heralded M1076 single-chip analog computing device claim to usher in an era of compute-intensive processing at impressively low power.

“There’s definitely a lot of interest in making the next great AI processor,” says Vehling. “There’s definitely a lot of investment and venture capital going into this space. There is no doubt about that.”

The analog approach isn’t just a marketing gimmick, either. Mythic sees trouble ahead for Moore’s Law, Intel co-founder Gordon Moore’s famous 1965 observation that the number of transistors that can be squeezed onto an integrated circuit doubles roughly every 18 months. That observation has helped usher in a period of sustained exponential improvement in computers over the past 60 years, and has underpinned the amazing advances AI research has made over the same period.

But Moore’s Law is running into physical limits: progress has slowed as it becomes ever harder to keep shrinking components. Approaches such as optical and quantum computing offer one possible way around this. Meanwhile, Mythic’s analog approach builds compute-in-memory elements that function as tunable resistors, taking inputs as voltages and collecting outputs as currents. The idea is that the company’s chips can handle the matrix multiplication needed to make artificial neural networks work in innovative new ways.

As the company explains: “We use analog computing for our core neural network matrix operations, where we multiply an input vector by a weight matrix. Analog computing offers several important advantages. First, it is amazingly efficient; it eliminates memory movement for the neural network weights, since they are held in place as resistors. Second, it is high performance; there are hundreds of thousands of multiply-accumulate operations occurring in parallel when we perform one of these vector operations.”
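
To make the math concrete, here is a minimal numerical sketch – an idealized model of a resistive crossbar, not Mythic’s actual design – showing how weights held in place as conductances turn a matrix-vector multiply into a single parallel analog step. The weight-to-conductance scale factor is an arbitrary assumption.

```python
import numpy as np

# Idealized crossbar: neural network weights are stored in place as
# conductances (siemens), inputs arrive as voltages (volts), and outputs
# are read off as currents (amps). Ohm's law gives each cell's product
# I = G * V, and Kirchhoff's current law sums the contributions on each
# output line; the "accumulate" step happens on the wire itself.

rng = np.random.default_rng(seed=0)

weights = rng.uniform(-1.0, 1.0, size=(4, 8))  # trained weight matrix
scale = 1e-6                                   # hypothetical weight-to-conductance scale
conductances = weights * scale                 # programmed once, never moved

voltages = rng.uniform(0.0, 1.0, size=8)       # input activations encoded as voltages

# The entire matrix-vector product happens in one parallel analog step:
currents = conductances @ voltages

# An ADC would then digitize the output currents back into activations.
activations = currents / scale
assert np.allclose(activations, weights @ voltages)
print(activations)
```

One obvious idealization: a resistor cannot have a negative conductance, so real designs typically encode each signed weight as the difference between a pair of cells; the sketch also ignores analog realities such as noise and ADC precision.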

“There are many ways to tackle the problem of AI computation,” Vehling said, referring to the different approaches being explored by different hardware companies. “There is no wrong way. But we are fundamentally convinced that the keep-throwing-more-transistors-at-it, keep-shrinking-the-process-nodes approach – basically, Moore’s Law – is no longer viable. It’s already starting to show. So whether they use analog computing or not, companies will have to find a different approach to create next-generation products with high processing power, low power, [et cetera].”

The future of AI

Brain with scrolling computer text representing artificial intelligence.
Chris DeGraw/Digital Trends, Getty Images

If not addressed, this issue will have a major impact on the further advancement of AI, especially AI that runs locally on devices. Right now, some of the AI we rely on every day combines on-device and cloud processing. Think of it as an employee who can make decisions up to a certain level, but then has to call their boss for advice.

This is the model used, for example, by smart speakers, which perform tasks like keyword spotting (“OK, Google”) locally, but then outsource the actual spoken-word queries to the cloud, allowing home devices to harness the power of supercomputers housed in huge data centers thousands of miles away.
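
In code, that division of labor might be sketched like this – a hypothetical illustration of the pattern, with stub functions standing in for the models; none of this reflects any real smart-speaker API:

```python
# Hypothetical sketch of the hybrid edge/cloud pattern described above.
# detect_wake_word() and cloud_transcribe() are illustrative stubs.

def detect_wake_word(frame: bytes) -> bool:
    # Stand-in for a tiny keyword-spotting model running on-device:
    # cheap enough to listen continuously.
    return frame.startswith(b"OK Google")

def cloud_transcribe(audio: bytes) -> str:
    # Stand-in for a large speech model in a remote data center;
    # in reality this is a network round trip with real latency.
    return audio.decode(errors="replace")

def handle_audio(wake_frame: bytes, query_audio: bytes) -> str | None:
    # The device decides locally ("the employee"); only on a wake word
    # does it escalate the heavy lifting to the cloud ("the boss").
    if detect_wake_word(wake_frame):
        return cloud_transcribe(query_audio)
    return None

print(handle_audio(b"OK Google", b"what's the weather?"))
```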

That’s all well and good, but some tasks require immediate responses. And as AI gets smarter, we will expect more and more of it. “We see a lot of what we call Edge AI, which is not cloud dependent, when it comes to industrial applications, machine vision applications, drones, and video surveillance,” Vehling said. “[For example], you may want a camera to try to identify someone and take immediate action. There are many applications where a result is needed immediately.”

AI chips also need to keep pace with breakthroughs in other hardware. Cameras, for example, are getting better and better. Image resolution has increased dramatically over the past few decades, meaning deep AI models for image recognition must parse ever-growing amounts of data to perform their analysis.
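
A rough back-of-the-envelope calculation shows how quickly that adds up. The layer shape below (a single 3×3 convolution with 64 output channels) is an illustrative assumption, not a figure from the article:

```python
# How input resolution inflates the multiply-accumulate (MAC) work of a
# single 3x3 convolution layer. Channel counts are illustrative guesses.

def conv_macs(width: int, height: int, in_ch: int = 3,
              out_ch: int = 64, kernel: int = 3) -> int:
    # Each output pixel needs kernel*kernel*in_ch multiplies for every
    # one of the out_ch output channels.
    return width * height * in_ch * out_ch * kernel * kernel

for name, (w, h) in {"480p": (640, 480),
                     "1080p": (1920, 1080),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {conv_macs(w, h) / 1e9:.1f} billion MACs per layer")
```

Going from 480p to 4K multiplies the per-layer work by roughly 27 times, and a modern vision network stacks dozens of such layers.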

Add to that the growing expectations of what people think they can extract from an image – whether that’s mapping objects in real time, identifying multiple objects at once, or figuring out the three-dimensional context of a scene – and you realize the enormous challenge AI systems face.

Whether it’s delivering more processing power while keeping devices small, or meeting privacy requirements that call for local processing rather than outsourcing, Mythic believes its compact chips have plenty to offer.

The rollout


“We’re [currently] in the early stages of commercialization,” says Vehling. “We have announced a number of products. So far, we have some customers evaluating [our technology] for use in their own products… Hopefully by the end of this year, or early next year, we will see companies using our technology in their products.”

Initially, he said, this will likely be in business and industrial applications, such as video surveillance, high-end drone manufacturing, automation, and more. However, don’t expect consumer applications to lag too far behind.

“After 2022 – [2023] going into ’24 – we’re going to see consumer tech companies [adopt our technology] too,” he said.

If analog computing turns out to be the innovation powering the augmented and virtual reality necessary for the metaverse to function… well, isn’t that the most perfect meeting point of steampunk and cyberpunk you could hope for?

Hopefully, Mythic’s chips will turn out to be far less mythical than the company’s chosen name might suggest.
