Booz Allen: Quantum Computing Insights

Booz Allen CTO Bill Vass Talks Quantum

Demystifying Einstein’s “spooky action at a distance”

As Booz Allen Chief Technology Officer (CTO) Bill Vass explains, quantum computing harnesses the principles of quantum mechanics to solve complex problems beyond the capabilities of classical computers, promising breakthroughs and disruptions in cryptography, optimization, scientific research, and more. Bill is a recognized expert in quantum computing and emerging technologies with decades of experience leading high-performing engineering teams. His insights illuminate the potential of quantum computing to revolutionize industries and drive future advancements.

A Brief (5-Minute) History of Computing

Humans began computing by counting on their fingers and using tally marks for simple calculations. Over millennia, counting/computing technologies advanced to encompass tools like the abacus, slide rule, mechanical calculator, electronic computer, and modern transistor-based computer. Today, computing has evolved into robust, interconnected systems, driving advancements in artificial intelligence (AI), data analytics, and beyond.

      Video transcript:

      When you start to understand quantum computing, it's a good idea to understand the history of computing. So I'm going to do that in about five minutes. How did mankind start computing? We started with these. That's why everything is base ten. And I still go “One, two, three, four, five...” and I'm sure most of you do. And we still add things up on our fingers. Certainly when I'm doing my timesheet it's like “1 o’clock, 2 o’clock, 3...” So I mean, those are really common things. And so this kind of computing hasn't gone away just because we have a digital computer in our pocket these days. That's important to point out. But this had limitations. When you started doing tens and tens and tens, you couldn't remember how many tens you'd done. So we started using sticks and stones, literally. So you've probably done “One, two, three, four, five...” right - that’s sticks. And then two bundles of sticks make a stone, and so on and so forth. And this served us quite well. And again, we still do “One, two, three, four, five...” even though we've got digital computers in our pocket. Then there was a major advancement, and this major advancement was with symbology and the ability to write. Then we could write things down - clay tablets, papyrus, all those kinds of things. That was a huge step forward in our compute capability and our ability to manage information. And then, of course, there was this huge advancement out of Asia called the abacus. If you've ever seen anyone operate an abacus, you can go very quickly and do a tremendous amount of computing on an abacus. That was a giant step forward. The next big advancement was gears and slides. So I am unfortunately old enough to have done computing on a slide rule like the one you see there. And you've probably seen those manual adding machines where you pull a crank and it moves paper out. That was gear-based. And in fact, we broke Enigma with a gear-based machine during World War II.
So those have been supplanted very much by electronics in most cases. But there are still people who can do things very quickly on slide rules. We learned that we could do computing on the ENIAC and other things like that with tube capabilities. Tubes provided switches and AND/OR gates and those kinds of things. And these were really our first electronic computing systems. We don't really use tubes very much anymore - unless you're some weird audiophile and you think tubes are better. Our next advancement was transistors. And you probably remember transistor radios. Transistors really made a huge step forward in our ability to compute. And then we needed it to be much smaller for the space program, so we invented the ability to do lithography and integrated circuits on silicon chips, which is where we're at today, when there are literally hundreds of millions, potentially, of transistors on a single silicon wafer. And it's really amazing what those can do. In this case, you're using electrons and moving them through there. But it's important to point out that I still use my fingers to count things, right? So a lot of the old computing isn't necessarily supplanted by this. It's just augmented.


          Demystifying Quantum Mechanics

          Quantum computers harness the principles of quantum mechanics to perform complex calculations beyond the reach of classical computers. Instead of bits representing 0s or 1s, they use “qubits.” Qubits leverage superposition to represent multiple states simultaneously (both 0 and 1) and entanglement to correlate qubits. By manipulating these quantum phenomena through precisely controlled operations (quantum gates), they can find solutions to specific problem sets that are intractable for even the most powerful supercomputers available today.
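The two phenomena described above can be illustrated with a toy state-vector simulation. This is a sketch in plain NumPy (not any particular quantum SDK), using the standard textbook gate matrices: a Hadamard gate puts one qubit into superposition, and a CNOT gate entangles it with a second, producing a Bell state whose two measurement outcomes are perfectly correlated.

```python
import numpy as np

# Single-qubit basis state and standard gate matrices
zero = np.array([1, 0], dtype=complex)                      # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Superposition: H|0> = (|0> + |1>) / sqrt(2)
plus = H @ zero

# Entanglement: CNOT applied to (H|0>) x |0> yields the Bell state
# (|00> + |11>) / sqrt(2)
bell = CNOT @ np.kron(plus, zero)

# Measurement probabilities for outcomes |00>, |01>, |10>, |11>:
# the qubits are never found in |01> or |10> - perfectly correlated.
probs = np.abs(bell) ** 2
print(probs.round(3))  # → [0.5 0.  0.  0.5]
```

Half the time both qubits read 0, half the time both read 1; the mixed outcomes never occur, which is the correlation entanglement provides.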

          Video transcript:

          So now we're moving to sort of the next phase of computing, which is using atomic particles. It's a natural progression, right? We're getting smaller and smaller and smaller in a lot of cases. And with each phase of computing, there were new features that were brought in - memory, for instance: the ability with writing to have memory, and with sticks and stones to have memory, instead of just doing it on your fingers. With quantum computing, there are really two differentiating features with these particles: superposition, which we'll talk a lot about today, and entanglement. And those are two things that are very hard to conceive, even though they do exist in our physical world today. They're very micro in our macro world in how they affect things. The particles we primarily use are electrons - and that's actually a photograph of a real electron - neutral atoms, charged ions, and photons. Those are the primary four different types of quantum computers. And they all have advantages and disadvantages. And I guarantee you, if you talk to the quantum scientists in each of these areas, it's like talking to different religions - like Catholics, and Baptists, and Lutherans. They all have a commonality in that they're using superposition and entanglement, and they all believe in that. They all believe in the future of quantum computing, and they understand what a quantum computer can do. But they're very religious that each of their technologies is going to be the best. And when you talk to them, it's fascinating to hear the reasons why all the others are wrong and they're right. And it's hard to know which will actually win at this time. I have some ideas. But let's talk a little bit about superposition. So superposition is fascinating. A regular bit on a regular computer, you pretty much just have a one and a zero while you're operating on it.
With superposition - and you're measuring primarily spin and vibration of an atomic particle - a qubit, while you're operating on it, is in every position at once until you measure it, and then it's in your final result. So it gives you a tremendous amount of variability to operate on. And I know that's hard to conceive. And hopefully later, as JD and others talk more and more about it, it will become clearer. But remember, superposition is one of the immense powers of a quantum computer; it's what enables that quantum capability. The next thing is even harder for most people to understand. Einstein called it “spooky action at a distance.” This is the ability to entangle two atomic particles together so that they communicate with each other, and changing one affects the other instantaneously. That's pretty hard to understand. In both cases here, we can drive these things. And by that I mean, even though I say we don't fully understand them, we can drive them. I'll give you an analogy. I'm a car guy, and for every car I know how many cylinders it has and how the valves work and the tappets and, you know, all of the components of a car, and the ECU and everything else. Or an electric motor, and how many kilowatts the battery is and all those kinds of things. I can drive a car pretty well. My wife can also drive a car pretty well. To her it's a skinny pedal, a fat pedal, and a steering wheel. Right? If you asked her how many cylinders were in her car - of course we drive electric cars now - she wouldn't know. But she can drive it just as well. So we can drive entanglement and superposition extremely well without understanding the fundamental components of how they work. And that really bothers me, as one of the reasons I got into computer science is the old saying that “anything with the word ‘science’ in it isn’t,” right? “Political science,” “computer science” - it’s because all of that is invented by man.
And so you can send an email to the person that developed the RS flip-flop on a chip as part of a divide algorithm on an Intel chip, and you get an answer from that person. You can understand how to entangle things and how to operate superposition, but nobody actually understands entanglement or how superposition works physically, right? It's an interesting problem to deal with, but it's something we can drive very well on a quantum computer and leverage. As I said before, on a quantum computer, you have a superposition you're operating on. On a digital computer, you have a one and a zero. And remember, the ones and the zeros are arbitrary, right? Like if I count, I can pick this as one, two, three, four, five. Or I could pick this as one, two, three, four, five. What I pick as “one” is arbitrary. Just like with a digital computer, the lack of a charge or the existence of a charge can be a 1 or a 0 - that's arbitrary, as long as you are consistent. And the same thing is true with quantum computing. You're picking, say, a specific position, phase, or vibration and saying that's a one or this is a zero in the superposition in your qubit. And the advantage, again, of a quantum computer is that you can be in every position between 1 and 0 at once while you're operating on it.

          Building Quantum Computers

          Today’s quantum computing prototypes reflect different design choices with unique strengths and characteristics. Superconducting (or electron-based) quantum computers use superconducting wires operating at near absolute zero temperatures. Trapped-ion systems use precisely controlled lasers to manipulate individual ions, promising superior qubit coherence. Neutral atom systems trap atoms in optical lattices with laser-based manipulation. Photon-based quantum computers use photons, or particles of light, as qubits, enabling integration with quantum communication systems. Each architecture represents a different tradeoff in terms of scalability, coherence time, and error rates, among other attributes.

              Video transcript:

              So what do they look like, really? These were the machines I was building when I was at Amazon. These are electron-based. They're often called electromagnetic cryogenic machines, which is a long way of saying they have to operate at almost absolute zero. So that fridge you see there that says Bluefors is one that's running. And as far as we know, when that's running, that's the coldest place in the universe, right there. That is microkelvin. You can never actually get to absolute zero, but it's as close as you can possibly get. So they're very cryogenic machines. The chip actually goes down here on the bottom, and these are all microwave transmissions. So in an electromagnetic cryogenic machine, you control the qubits, set the qubits, and manage them with microwaves. And one of the big advantages to that - and I'll talk about lasers as well - is that communications has tremendously advanced microwaves and lasers for all of our networks and communication systems, and quantum computing is able to take advantage of that. It's been a huge leverage for quantum computing. The next machine you see here is called a neutral atom machine. In this case the compute is happening right there in a vacuum chamber. These are all called laser tweezers. You actually grab an atom and you hold on to it in a vacuum, and you grab another atom and you hold on to it in a vacuum. You put them next to each other and use another laser to entangle them. And then you can operate on them, moving them to operating areas while you're holding them with a laser tweezer. Imagine two opposite sine waves that look like a saw, right? Grabbing an atom and holding it - and they'll hold hundreds of them inside this machine. So the compute is happening in here. It looks pretty sci-fi, doesn't it? That looks very cyberpunk to me. This is an ion trap machine. This is actually an IonQ machine.
The chip is right down in here, and you'll see pictures of it later. It suspends ions in a magnetic field, holds onto them, and then uses lasers to set them, measure them, and entangle them. When you see it in operation, you can actually, with your naked eye, see these glowing blue ions. It's pretty cool. And they grab them like the carriage of a typewriter and move them back and forth between the lasers, using the magnetic field to operate on them. And then, last but not least, is a photonics machine. Right there are the silicon wafers that do the compute, and the silicon wafer fits in here. These are the photonics going in and out here, and there are photonic detectors in there.


                  Becoming Fault-Tolerant

                  Current quantum computers are extremely error-prone due to the fragile nature of qubits. Becoming fault-tolerant requires quantum error correction (QEC), which detects and fixes computational errors as they occur. QEC involves creating "logical qubits" by distributing information across multiple physical qubits. This redundancy allows the system to detect the presence of errors so they can be addressed before they corrupt calculations. Importantly, this error correction must happen continuously and rapidly during computation. While creating stable logical qubits that can perform reliable operations remains a significant challenge—with current systems demonstrating only a few logical qubits—progress in this field continues to accelerate at remarkable speeds.
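The idea of spreading one logical unit of information across several physical ones can be sketched with a classical analogy. Real quantum error correction uses stabilizer measurements on quantum states, which this toy does not model; the simplified Python sketch below uses a 3-bit repetition code with majority-vote decoding to show how redundancy suppresses the error rate (roughly from p down to ~3p² when flips are independent).

```python
import random

def encode(bit):
    # A "logical" bit stored as three physical copies
    # (classical analogy for a logical qubit built from physical qubits)
    return [bit, bit, bit]

def noisy_channel(phys, p_flip):
    # Each physical bit flips independently with probability p_flip
    return [b ^ 1 if random.random() < p_flip else b for b in phys]

def decode(phys):
    # Majority vote recovers the logical bit if at most one copy flipped
    return 1 if sum(phys) >= 2 else 0

random.seed(0)
p = 0.05
trials = 100_000

# Error rate without encoding: a lone bit flips with probability p
raw_errors = sum(random.random() < p for _ in range(trials))

# Error rate with encoding: decoding fails only if 2+ copies flip (~3p^2)
logical_errors = sum(decode(noisy_channel(encode(0), p)) != 0
                     for _ in range(trials))

print(raw_errors / trials, logical_errors / trials)
```

The logical error rate comes out roughly an order of magnitude below the physical one at p = 0.05; the cost is the redundancy overhead, which in real quantum hardware is far larger, hence the "football field" machines described below.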

                  Video transcript:

                  It's important to note these machines have a lot of error in them. And that's really one of the big challenges with quantum computing. Just to put it in perspective, your iPhone today has alpha particles flipping the bits in its memory as we speak. And the way we correct that is an error correction code - basically a hash that does error correction continuously. Otherwise it would crash all the time and you'd lose memory and all those other kinds of things. That's true of all classical computers. But you need very little error correction code in proportion to the compute. With a quantum computer, you have a tremendous amount of error correction in proportion to the compute. So it's the opposite. What that means - and this is probably hard to conceive - is that the first error-corrected quantum computers, in the next probably five-to-seven-year timeframe, will be the size of a football field. Because they take an immense amount of compute to error-correct, to get the value of the compute out of them. That will get smaller over time, and there will be better and better ways to do it. But that's an important thing to understand. The other important thing to understand is that, just like I still count with my fingers, classical computers - which is what we call these now - will not go away. You're not going to run a website on an analog quantum computer. You could do certain things on it, but it'd be a tremendous waste of it. So these are going to work together. Think of the quantum computer as a math coprocessor for the classical one. And they'll probably be primarily in cloud and hyperscale environments first, where you'll be able to call them. As a matter of fact, one of the services that my team built at Amazon is called Braket, which is named after Dirac's bra-ket notation for quantum states.
And you can go out for about $3, create a circuit on a quantum computer, run a simulation of it, and then run it on a real quantum computer today, already. You can do that right now and see it. And you can actually see the amount of error in different machines as well, because of the noise that's occurring. And you can imagine, since you're dealing with atomic particles, everything causes an error: errors in the fabrication process, cell phone signals going through and hitting it, the Earth's magnetic field. Heat is an error. All those things cause errors. So removing the error is a big deal in quantum computing, and it's what's going to advance quantum computing as we figure out how to remove the error - both reducing the error and removing the error through error correction. Those are two really important things. The big innovation that's happened over the last five years with quantum computers is that we've actually figured out how to remove the errors. It's just hard. And it will require a lot of hardware to do it.

                  The Problems Quantum Computers May Solve

                  Quantum computers are expected to excel at solving complex problems across many different fields. Quantum simulation can model small-scale molecular and material interactions, potentially accelerating future drug discovery and novel materials development. For example, new methods of creating ammonia could free up one to three percent of global energy consumption; new materials for batteries could enable more efficient electric vehicles. Optimization problems—including logistics, supply chain management, and resource allocation during crises—could in time be approached more efficiently. At the same time, these quantum computers will be able to break much of today’s encryption, driving the need for post-quantum cryptography implementations across cybersecurity infrastructures. 
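The cryptographic threat mentioned above comes largely from Shor's algorithm, which reduces factoring an integer N to finding the period r of aˣ mod N; the quantum speedup is entirely in the period-finding step. The small classical sketch below brute-forces that period (the exact step a quantum computer would accelerate exponentially) to show how knowing r yields the factors of a toy N.

```python
from math import gcd

def order(a, N):
    # Smallest r > 0 with a**r ≡ 1 (mod N), found by brute force.
    # This is the step Shor's algorithm performs on quantum hardware.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

# Toy example: factor N = 15 with base a = 7 (requires gcd(a, N) == 1)
N, a = 15, 7
r = order(a, N)                    # period of 7**x mod 15 is 4
p = gcd(pow(a, r // 2) - 1, N)     # gcd(48, 15) = 3
q = gcd(pow(a, r // 2) + 1, N)     # gcd(50, 15) = 5
print(r, p, q)  # → 4 3 5: the prime factors of 15
```

RSA's security rests on this period-finding being infeasible classically for 2048-bit N; a large fault-tolerant quantum computer removes that barrier, which is what drives the move to post-quantum cryptography.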

                      Video transcript:

                      The first things you're going to see with a quantum computer that's going to impact our lives are around physics and chemistry. That'll take around 200 error-corrected qubits, in that range. Just to put that in perspective, some of the machines we're talking about here, to get to error-corrected qubits, will be the size of a football field. They will be very big machines, very expensive machines to build. But they can be transformational. The next, as you get to around a thousand qubits, and starting with a hundred qubits, will be materials sciences, and that will transform our world significantly. And the reason these will be the first things a quantum computer will do - not break cryptography and not do logistics - is because a quantum computer works like a molecule. Basically, when you build a circuit in a quantum computer, you can think of it as building a molecular structure in the machine's memory to operate on, using the entanglement and the qubits, the atomic particles that you're operating with. What does that mean for transforming materials science? Today when we try to find a superconductor, we discover it accidentally. Almost all materials science is accidental combinations. The sticky notes you use were an accident at 3M, where they were trying to build something else. These are accidents that occur that we find randomly in a lab. With a quantum computer, you can reverse engineer a molecular system. It's called a Hamiltonian simulation. So I could say, for example, I want a high-temperature superconductor, and it will tell me through simulation what chemical formula would give me that high-temperature superconductor. That will transform our world. Just a high-temperature superconductor will transform our world. Just ammonia transformations - being able to do a Hamiltonian on ammonia is worth $1 trillion in manufacturing. Annually.
Ammonia is the most produced chemical out there. If we were to try to do a Hamiltonian on ammonia today with classical computers, if we took all the iPhones and all the laptops and all the high-performance computers on Earth today, they would run for longer than the history of the universe to do that simulation. A quantum computer with a thousand error-corrected qubits would do that in about three minutes.
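The scale gap described above follows from exponential state growth: simulating n qubits classically requires storing 2ⁿ complex amplitudes. A quick back-of-envelope calculation (assuming 16-byte complex128 amplitudes, a common choice) shows why even modest qubit counts outrun all classical memory.

```python
def statevector_bytes(n_qubits):
    # A full n-qubit state vector holds 2**n complex amplitudes,
    # each 16 bytes in complex128 precision.
    return (2 ** n_qubits) * 16

for n in (30, 50, 100):
    gb = statevector_bytes(n) / 1e9
    print(f"{n} qubits -> {gb:.3e} GB")
# 30 qubits fit on a large workstation (~17 GB); 50 qubits need
# ~18 million GB; 100 qubits exceed all storage ever manufactured.
```

Each added qubit doubles the memory requirement, so brute-force classical simulation hits a wall around 50 qubits even on the largest supercomputers, while the quantum hardware holds that state natively.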


                          Explore Further with Bill

                          Booz Allen CTO Bill Vass goes further on quantum computing, addressing the timeline for fault-tolerant quantum computers, common myths, and suggested steps for the U.S. government to maintain leadership in this field.

                          Video transcript:

                          Hi, I'm Bill Vass. I'm the chief technology officer here at Booz Allen. A lot of people are scared by quantum computing. And I think it's just important to point out that it's just another phase of computing. Now we're just advancing to use the properties of quantum mechanics with different types of quantum particles, leveraging the two magical parts of a quantum computer, which are the physics around superposition and entanglement. It doesn't mean classical computers go away, any more than me counting on my fingers went away. The two are going to work together and are going to reinforce each other. The biggest thing to remember about quantum computing is that it's just another step forward. It'll range probably from three years to seven years. And the reason it's coming in more closely like that is because all of the different machines that are being developed today have learned how to do error correction. It's just a lot of work to do it. I think that's going to be transformational as they get error correction down. Remember, a quantum computer is so affected by the environment it operates in that it takes a tremendous amount of error correction code in proportion to the compute you put out there. So some of these early machines are going to be the size of data centers. Quantum computing will have its biggest near-term effect in materials sciences. And I think that's important - physics, chemistry, and materials sciences. And the reason for that is a quantum computer works like a molecule, if you like. You're basically building, through superposition and entanglement, quantum circuits in the memory of the machine. And so unlike the digital computers we use today, a quantum computer works analog, like a molecule. That's one way to think of it. So doing Hamiltonian simulations of materials sciences, as we get a small number of error-corrected qubits, is going to be transformational.
Things like high-energy storage systems, long-duration storage systems, high-temperature superconductors, new drugs, new materials that are lighter weight and handle more stress, new materials that potentially enable stealth or other technologies are going to be able to, for the first time, be reverse-engineered with a quantum computer as more qubits become available. That's when quantum computers start to help in things like logistics, the traveling salesman problem, all those other things, and ultimately put cryptography at risk. As quantum computers evolve to the point where they can break our current cryptography, the most important place to focus is wherever someone could be recording data today to break it tomorrow, because secrets last a long time. We still have classified data from World War II, for example. What you want to do is focus on the areas where data is transported in encrypted form. So for example, in communications, our TLS systems. TLS, our transport layer security, will in the future be able to be broken by a quantum computer. So what you want to do is start implementing post-quantum encryption on that, in case someone is recording network traffic - traffic we don't worry so much about today because it's encrypted in a way we know can't currently be broken. In the future they could record it, keep it for ten years, and then break it later. We're working to make the government ready for quantum computing and quantum technologies in a number of ways. First is developing and implementing post-quantum cryptography and the ability to deploy that at scale. Then enabling quantum sensing and leveraging that at scale. And then, last but not least, investing in companies like Seeqc and others that build the components for quantum computers, and having a good understanding of how to build quantum circuits and manage and develop algorithms on quantum computers.
So we're ready as more and more machines that are error-corrected and stable become available. I think the biggest misconception I see out there about quantum computing is that it will replace classical computing, because it won't. It augments classical computing. And some of the other hype you have to look at today is that a lot of people talk about quantum AI. Once we have quantum memory storage, and that's being evolved, a quantum computer could potentially tremendously accelerate linear algebra, which is used for model training. So it has the potential to help AI in a very big way; it's just a ways off from doing that today. I don't think you'll see a tremendous amount of AI in quantum computing yet. But it has tremendous potential there. For us to maintain our leadership in quantum computing, there are a number of areas we have to invest in. The most basic one is, of course, continuing to advance quantum sensors for things like GPS-denied environments, and having quantum-based gyroscopes and other things like that, which are really, really important for IMUs. Another area, of course, is to invest in and start deploying post-quantum cryptography today. That's really important. And then, in quantum computing, to continue to invest in education, but also to understand the supply chain for quantum computing - to make sure the fab capabilities are there and the supply chain for the materials needed for quantum computing is there - along with continuing to advance the skill sets in quantum computing. Quantum computing is a different way to compute, and it's incredibly important that the government invest in the training and the understanding to keep the government and the United States a leader in quantum computing.

                          Watch Bill’s full presentation, recorded live at The Helix, Booz Allen’s Center for Innovation, on demand.

                          Working at the Forefront of Quantum Technologies

                          Booz Allen works at the forefront of quantum sciences and technologies. By combining deep technical expertise with our dedication to mission impact, we advance innovation and deploy solutions that accelerate outcomes across critical national missions. This leadership includes designing novel quantum algorithms, developing and testing the latest quantum hardware, deploying new post-quantum cryptography across integral cyber-infrastructure, and harnessing the evolution of quantum sensing to better understand various mission-critical environments.

                          Learn more about the future of quantum.
