‘We will definitely see quantum-centric supercomputers’

In an interview during his recent visit to India, Gil said he also met some government officials, including Rajeev Chandrasekhar, minister of state (MoS) for electronics and information technology, with whom he discussed how IBM can help build a national quantum plan for India. Gil also explained how businesses and governments can take advantage of technologies such as hybrid cloud, edge computing and quantum computing, and shared his thoughts on Web3. Edited excerpts:

IBM outlined its commitment nearly five years ago to grow a quantum-ready workforce and build an ecosystem to nurture the community in India. What is the progress?

We have made tremendous progress, and in fact it was one of the core aspects of the discussion I had with the minister (Rajeev Chandrasekhar). They want to ensure that India becomes a powerhouse in the world of quantum skills and quantum technologies. Access to technology is crucial in this regard. That's why we are committed to open source – Qiskit is the most widely used quantum software environment in the world. We are seeing tremendous adoption in terms of advocates and quantum ambassadors here in India, and we are also having many conversations with various Indian Institutes of Technology (IITs) and leading training centers to develop curriculum and certification. The Qiskit textbook (for learning quantum computation) is now also available in Tamil, Bengali and Hindi. We are going to run many workshops and programs around it. I think there is a huge opportunity, and part of our commitment is to find a way to grow broad skills and talent programs for quantum in India.

What is the progress on quantum computers and how do they currently compare to supercomputers?

Most computation will continue to run on classical computers, be it central processing units (CPUs) or accelerators such as graphics processing units (GPUs) and AI chips, but there are several classes of problems well suited to quantum computing. One is simulating and modeling our natural world.

It turns out there are also mathematical problems of great importance that map well to quantum computers, such as cryptography and factoring. Blockchain, crypto and other similar technologies will have to adapt and change due to the advancement of quantum.

We have more than 180 institutions that are part of the IBM Quantum Network. They include some of the largest financial companies in the world, such as Goldman Sachs, JPMorgan Chase, Wells Fargo and Mizuho Bank; manufacturers such as Daimler; major energy companies in the oil and gas sector; and some materials companies. There is also great interest among universities, students and research laboratories that participate in the network.

But when will the world see a stable quantum computer, one that overcomes current limitations such as noise (which leads to higher error rates), interference, and so on?

We already have quantum computers, but they have limitations, as you rightly pointed out. We are still not over the threshold of quantum advantage (quantum advantage, or quantum supremacy, is the point at which a quantum system performs computations that today's classical computers cannot), but they are quantum computers nonetheless. We have built over 30 of them in the last 4-5 years, more than 20 of which are up and running today and accessible through the IBM Cloud. Every day we run three and a half billion quantum circuits on real quantum hardware.

The roadmap we shared is that last year we built a quantum computer with more than 100 qubits, this year a 433-qubit machine, and next year a machine with more than 1,000 qubits (a quantum computer is built from quantum bits, or qubits, which can encode a zero and a one simultaneously. This property allows them to process far more information than traditional computers, at unimaginable speeds).
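The parenthetical above can be made concrete with a toy sketch in plain Python (a stand-in illustration, not IBM's actual software stack): a single qubit is a two-entry vector of amplitudes over the states |0⟩ and |1⟩, and a Hadamard gate puts a qubit initialized to |0⟩ into an equal superposition of zero and one.

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state [amp_0, amp_1]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

zero = [1.0, 0.0]            # qubit initialized to |0>
superposed = hadamard(zero)  # equal superposition of |0> and |1>

# Measurement probabilities are the squared amplitude magnitudes:
# here each outcome is equally likely (~0.5 each).
probs = [abs(amp) ** 2 for amp in superposed]
print(probs)
```

Simulating n qubits this way needs a vector of 2ⁿ amplitudes, which is why classical simulation becomes intractable and real quantum hardware becomes interesting.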

The error rate of the qubits is also improving enormously (we can reach error rates of 10⁻⁴, that is, one error in 10,000 operations). And the algorithms and software – the techniques we use for error mitigation and error correction – are getting better too. If you combine all of this, then even on a conservative view we will see quantum advantage this decade.

What is the roadmap for quantum computing?

We have seen AI-centric and GPU-centric supercomputers, and we will definitely see quantum-centric supercomputers. Here is how that can come about. Imagine a quantum computer with hundreds or thousands of qubits in a single cryostat (heat causes errors in qubits, so they must be cooled to near absolute zero in a device called a cryostat that contains liquid helium), and now imagine a quantum data center with multiple cryostats.

You could build a data center with thousands or tens of thousands of qubits, but in the first generation the connection between these different cryostats is classical. If you are smart enough to take a problem and partition it so that you can run parallel workloads on the quantum machines and then stitch the results back together classically, you will still incur exponential costs in the classical piece, but you can still get a good answer.
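The partition-and-stitch idea can be caricatured in a few lines of plain Python (every name here is a hypothetical stand-in – real circuit cutting operates on quantum circuits, not numbers): split a workload into fragments, run each fragment on its own machine in parallel, then combine the partial answers classically.

```python
from concurrent.futures import ThreadPoolExecutor

def run_on_qpu(fragment):
    # Stand-in for executing one sub-problem on a quantum machine;
    # here it just squares a number.
    return fragment ** 2

def solve(fragments):
    # One worker per "cryostat": run the fragments in parallel.
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(run_on_qpu, fragments))
    # The classical "stitching" step. For real circuit cutting, the cost
    # of this recombination grows exponentially with the number of cuts.
    return sum(partials)

print(solve([1, 2, 3, 4]))  # -> 30
```

The exponential classical cost is the catch Gil mentions: partitioning buys you bigger effective machines today, at the price of ever more classical post-processing.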

The next step is to combine the field of quantum communication and quantum computing. It’s a roadmap for the next 10-20 years, but we’re going to see quantum supercomputers and they’re going to interact with the current supercomputers.

I would now like to talk about how hybrid cloud adoption has increased in enterprises, and its evolution, from both a market and research perspective.

From a market lens, if you look at a medium or large enterprise, the hybrid-cloud reality is already there. Simply put, the question is how to make the hybrid cloud strategy work and further modernize the infrastructure so that workloads and processes run optimally across it. That explains why the open source component and the Red Hat acquisition were so crucial: an operating system based on Linux and a container architecture based on Kubernetes. This is a more than $1 trillion annual market opportunity for us to provide the middleware, infrastructure and the right skills through IBM Consulting to help our clients operate and succeed in that environment.

From a computer science lens, we have seen the tremendous importance of edge computing, and if you look further out, you also see the heterogeneous nature of architectures: microprocessor-centric architectures alongside AI-accelerator-centric architectures, and quantum-centric architectures in the future. So it is critical to build a highly heterogeneous, highly distributed computing environment and make sure it is designed and working properly.

Speaking of AI, while big data is important, a lot of effort is being made to do a lot more with less data.

Yes, it's true. One extreme remains the story of how to learn from large amounts of data – we are talking about taking advantage of advances in self-supervision to train large foundation models, and a prime example is natural language processing (NLP). But the challenge our customers have had with AI is that the data science part of it – the data labeling and training pipeline – consumes 90% of the resources and a lot of time. So anything we can do to reduce this is hugely important. Then there is another vector: how do you learn from far fewer examples, with few-shot learning and so on? This is an area in which we invest a lot.

Semiconductors are another critical part of IBM Research. In May 2021, IBM announced that its second-generation nanosheet technology had paved the way to the 2nm node. Explain the significance of this development.

The topic of semiconductors has become a national and international priority today. I meet government leaders all over the world, and politicians and citizens now realize the importance of semiconductors because they are literally in everything: cars, refrigerators, telephones and computers. The semiconductor industry is a half-trillion-dollar industry. By all accounts, it will double in size in the next ten years. To enable that growth, innovation and production capacity must go hand in hand.

IBM plays a pivotal role on the innovation side in creating the new technology that will enable manufacturers to bring that capability to the world at scale. For example, last year's announcement of the 2-nanometer technology is incredibly exciting because there is almost nothing more impactful than a next-generation transistor (which allows a chip to fit up to 50 billion transistors in a space the size of a fingernail). We also recently (in December 2021) announced the Vertical Transport Field Effect Transistor (VTFET) – a design intended to enable smaller, more powerful and more energy-efficient devices. Of course, we also use our expertise in semiconductor technology to build quantum computers.

What is your role as a member of the National Science Board?

The National Science Board is the governing body of the National Science Foundation (NSF) of the United States, which funds a very important part of all basic scientific work. The hallmark of that funding is that it is curiosity-driven, not application-driven. It is about pushing the boundaries of mathematics, physics, chemistry and biology, and it is extremely important that we, as societies, defend and support the need for those kinds of discoveries. Without that investment, those discoveries would take many decades.

Before we wrap up, I'd like your thoughts on Web 3.0 and the metaverse – two buzzwords currently taking the industry by storm.

I like looking at the foundations of these areas. On Web 3.0, it's back to the computer science story: it is about how we build the next generation of truly distributed computing environments. We touched on this through the hybrid-cloud lens, but Web 3.0 is complementary: how do you build a web architecture and a network architecture that is inherently distributed by design? That requires thinking about a lot of fundamental things – from the security dimension to the semantic nature of the relationships around it. The previous version of the web was all about interactivity. Now it is also about how we bring sensors together, and the fact that we have computers everywhere and people interacting with them. So it is this next-generation architecture that I think is fundamental.

As for the metaverse, maybe I'm not the most qualified person to talk about it, but it is obviously going to be a hugely important way to expand how we entertain and collaborate. That said, I would really like technology to focus on solving a broader set of problems as well.
