At the Forefront of the Quantum Revolution

How QIST can disrupt and revolutionize existing operations:

Quantum Computing

Quantum algorithms excel in previously intractable problem spaces, potentially accelerating new and emerging analytics pipelines, enabling the design of new materials, and eliminating bottlenecks in critical government and business processes. Quantum computing includes the development of hardware as well as software and algorithms. 

Quantum-Safe Communications

Quantum communications encompasses new ways of transmitting and securing information. It includes research on hardening traditional forms of communications against cyberattacks by future quantum computers. Chief among this work is post-quantum cryptography (PQC), the development of classical algorithms designed to resist attackers with access to a classical or quantum computer.


Quantum Sensing

Quantum mechanics can transform sensor resolution and range beyond what is traditionally possible. This technology enables satellite-free navigation, deep space exploration, geological discovery, and more. Quantum sensors are also leveraged for use in quantum computing and communications.

Our report—Understanding the Capabilities of Modern Quantum Computers—provides a plain-language summary of our research into where and how quantum solutions will be truly advantageous. 

Full Transcript of Video

There are some types of computations, such as certain optimization and simulation problems, that remain out of reach for even the most powerful supercomputers. In order to tackle problems like discovering new treatments for cancer patients, enabling more accurate weather forecasting, developing new materials for hypersonic vehicles, and more, fundamentally new approaches like quantum computing are necessary. Whereas classical physics describes the physics of everyday phenomena, quantum physics explains the behavior of matter and light on atomic and subatomic scales. This new approach to computing has the potential to solve many problems that are out of reach for current systems. Traditional computers encode information in bits that obey the laws of classical physics, whereas quantum computers use quantum bits, or qubits, that obey the laws of quantum physics. Qubits offer new computational possibilities by using three special features of quantum mechanics: superposition, quantum measurement, and entanglement. Classical bits encode information in ones or zeros. With quantum computing, superposition enables qubits to be in a state that is a combination of zero and one. Imagine a qubit as a spinning coin. A spinning coin is neither heads nor tails until it stops spinning and lands on one side or the other. This process of stopping that spinning coin to see if it is heads or tails is analogous to taking a measurement in quantum mechanics. Quantum entanglement is the notion that two qubits can be connected, such that actions on one of them can influence the other, even if they are physically separated. While this can have many far-reaching implications in the realm of quantum communications, this property is also integral to many quantum algorithms. The powerful correlations from quantum entanglement enable us to manipulate information across multiple qubits at the same time.
We can use this feature to implement massively parallel operations that in some cases require exponentially fewer resources. As the pace of advancement in quantum technology rapidly accelerates, quantum computing's relevance to mission-critical problems will only continue to grow. As this quantum-enabled future draws nearer, Booz Allen is working rapidly at the forefront of this technology to bring it to our clients’ missions and to drive next-gen outcomes. 
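The transcript's three ideas - superposition, measurement, and entanglement - can be sketched numerically. Below is a minimal, dependency-free state-vector simulation (the function names are illustrative, not any real SDK's API): a Hadamard gate puts one qubit into superposition, a CNOT entangles a second qubit with it, and repeated measurements then show the perfectly correlated outcomes the transcript's spinning-coin analogy describes.

```python
import random

# A 2-qubit state is a list of 4 complex amplitudes for the basis states
# |00>, |01>, |10>, |11>, with qubit q stored in bit q of the index.

def hadamard(state, q):
    """Put qubit q into superposition: |0> -> (|0>+|1>)/sqrt2, |1> -> (|0>-|1>)/sqrt2."""
    s = 2 ** -0.5
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        if amp == 0:
            continue
        sign = -1 if (i >> q) & 1 else 1
        new[i & ~(1 << q)] += s * amp        # component with qubit q = 0
        new[i | (1 << q)] += s * amp * sign  # component with qubit q = 1
    return new

def cnot(state, control, target):
    """Flip the target qubit wherever the control qubit is 1 (this entangles them)."""
    new = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

def measure(state, rng):
    """Collapse: sample one basis state with probability |amplitude|^2."""
    r, acc = rng.random(), 0.0
    for i, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return i
    return len(state) - 1

rng = random.Random(7)
state = [1 + 0j, 0j, 0j, 0j]   # start in |00>
state = hadamard(state, 0)     # superposition: (|00> + |01>)/sqrt2
state = cnot(state, 0, 1)      # entanglement: (|00> + |11>)/sqrt2
outcomes = {measure(state, rng) for _ in range(1000)}
# Like the spinning coins: each shot is random, but the qubits always agree,
# so only |00> (index 0) and |11> (index 3) ever appear.
assert outcomes == {0, 3}
```

Each individual measurement is unpredictable, yet the two qubits never disagree - the "powerful correlations" the transcript attributes to entanglement.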

Full Transcript of Video

So we're going to attempt to, in 20 minutes or less, explain quantum computing. So that should be easy. Not much to do there. Let's go ahead and key up the slides. So when you see this picture here - if you learn nothing today, that's not a quantum computer. That is the cooling and communication system for a quantum computer. Primarily one that operates on electrons, and they call it electromagnetic cryogenic. So we'll talk more about this and I'll give you some examples of what's going on there. So when you start to understand quantum computing, it's a good idea to kind of understand the history of computing. So I'm going to do that in about five minutes. So how did mankind start computing? We started with these. That's why everything is base ten. And I still go “One, two, three, four, five...” and I'm sure most of you do. And we still add things up on our fingers. Certainly when I'm doing my timesheet it's like “1 o’clock, 2 o’clock, 3...” So I mean, those are really common things. And so this kind of computing hasn't gone away just because we have a digital computer in our pocket these days. That's important to point out. But this had limitations. When you started doing tens and tens and tens, you couldn't remember how many tens you'd done. So we started doing sticks and stones, literally. So you've probably done “One, two, three, four, five...” right - that’s sticks. And then two bundles of sticks make a stone, and so on and so forth. And this served us quite well. And again, we still do “One, two, three, four, five...” even though we've got digital computers in our pocket. Then there was a major advancement, and this major advancement was symbology and the ability to write. And then we could write things down - clay tablets, papyrus, all those kinds of things. That was a huge step forward in computing, our compute capability, and our ability to manage information.
And then, of course, there was this huge advancement out of Asia called an abacus. If you've ever seen anyone operate an abacus, you can go very quickly and do a tremendous amount of computing on an abacus. That was a giant step forward. The next big advancement was gears and slides. So I am unfortunately old enough to have done computing on a slide rule like the one you see there. And you've probably seen those manual adding machines where you pull a crank and it moves paper out. That was gear-based. And in fact, we broke Enigma with a gear-based machine during World War II. So those have been supplanted very much by electronics in most cases. But there are still people who can do things very quickly on slide rules. We learned that we could do computing on ENIAC and other things like that with tube capabilities. And tubes provided switches and AND/OR gates and those kinds of things. And these were really the first electronic computing systems that we built. We don't really use tubes very much anymore - unless you're some weird audiophile and you think tubes are better - but tubes are not primarily used anymore. And our next advancement was transistors. And you probably remember transistor radios. Transistors really made a huge step forward in our ability to compute. And then we needed it to be much smaller for the space program, so we invented the ability to do lithography and integrated circuits on silicon chips, which is where we're at today, when there are literally hundreds of millions of transistors, potentially, on a single silicon wafer. And that's really amazing what those can do. And in this case, you're using electrons and moving them through there. But it's important to point out I still use my fingers to count things, right? So a lot of the old computing isn't necessarily supplanted by this. It's just augmented. So now we're moving to sort of the next phase of computing, which is using atomic particles.
It's a natural progression, right? We're getting smaller and smaller and smaller in a lot of cases. And with each phase of computing, there were new features that were brought in - memory, for example: the ability, with writing and with sticks and stones, to have memory instead of just doing it on your fingers. With quantum computing, there are really two differentiating features of these particles. And it's primarily superposition, which we'll talk a lot about today, and entanglement. And those are two things that are very hard to conceive, even though they do exist in our physical world today. They're very micro in our macro world in how they affect things. So primarily the particles we use are electrons - and that's actually a photograph of a real electron - neutral atoms, charged ions, and photons. Those are the primary four different types of quantum computers. And they all have advantages and disadvantages. And I guarantee you, if you talk to the quantum scientists in each of these areas, it's like talking to different religions. This is like, you know, Catholics, and Baptists, and Lutherans. They all have a commonality in that they're using superposition and entanglement, and they all believe in that. They all believe in the future of quantum computing, and they understand what a quantum computer can do. But they're very religious that each of their technologies is going to be the best. And when you talk to them, it's fascinating to hear the reasons why all the others are wrong and they're right. And it's hard to know which will actually win at this time. I have some ideas. But let's talk a little bit about superposition. So superposition is fascinating. With a regular bit on a regular computer, you pretty much just have a one and a zero while you're operating on it.
With superposition - and you're measuring primarily the spin and vibration of an atomic particle - a qubit, while you're operating on it, is in every position at once until you measure it, and then it's in your final result. So it gives you a tremendous amount of variability to operate on. And I know that's hard to conceive. And hopefully later, as JD and others talk more and more about it, it will become clearer. But remember, superposition is one of the immense powers of a quantum computer - it's what enables that quantum capability. The next thing is even harder for most people to understand. Einstein called it “spooky action at a distance.” This is the ability to entangle two atomic particles together so that they communicate with each other, and changing one affects the other instantaneously. That's pretty hard to understand. In both cases here, we can drive these things. And by that I mean, even though I say we don't fully understand them, we can drive them. I'll give you an analogy. I'm a car guy, and for every car I know how many cylinders it has and how the valves work and the tappets and, you know, all of the components of a car, and the ECU and everything else. Or an electric motor, and how many kilowatts the battery is and all those kinds of things. I can drive a car pretty well. My wife can also drive a car pretty well. To her it's a skinny pedal, a fat pedal, and a steering wheel. Right? She doesn't really know - if you asked her how many cylinders were in her car - of course we drive electric cars now - she wouldn't know. But she can drive it just as well. So we can drive entanglement and superposition extremely well without understanding the fundamental components of how they work. And that really bothers me, as, you know, one of the reasons I got into computer science is the old saying that “anything with the word ‘science’ in it isn’t,” right? “Political science,” “computer science” - it’s because all of that is invented by man.
And so you can send an email to the person that developed the RS flip-flop on a chip as part of a divide algorithm on an Intel chip, and you'll get an answer from that person. You can understand how to entangle things and how to operate superposition, but nobody actually understands entanglement or how superposition works physically, right? It's an interesting problem to deal with, but it's something we can drive very well on a quantum computer and leverage. As I said before, in a quantum computer, you have a superposition you're operating on. In a digital computer, you have a one and a zero. And remember, the ones and the zeros are arbitrary, right? Like if I count, I can pick this as one, two, three, four, five. Or I could pick this as one, two, three, four, five. What I pick as “one” is arbitrary. Just like with a digital computer, the lack of a charge or the existence of a charge can be a one or a zero - that's arbitrary, as long as you are consistent. And the same thing is true with quantum computing. You're picking a specific, say, position, phase, or vibration and saying that's a one or that's a zero in your superposition, in your qubit. And the advantage, again, of a quantum computer is you can be every position between one and zero at once while you're operating on it. So what do they look like, really? These were the machines I was building when I was at Amazon. These are electron-based. They're often called electromagnetic cryogenic machines, which is a long way of saying they have to operate at almost absolute zero. So that fridge you see there that says Bluefors is one that's running. And as far as we know, when that's running, that's the coldest place in the universe, right there. That is microkelvin. You can never actually get to absolute zero, but that's as close to absolute zero as you can possibly get. So they're very cryogenic machines.
The chip actually goes down here at the bottom, and these are all microwave transmissions. So in an electromagnetic cryogenic machine, you control the qubits, set the qubits, and manage them with microwaves. And one of the big advantages of that - and I'll talk about lasers as well - is that communications has tremendously advanced microwaves and lasers, and all of our networks and communication systems, and quantum computing is able to take advantage of that. It's been a huge leverage for quantum computing. The next machine you see here is called a neutral atom machine. In this case the compute is happening right there in a vacuum chamber. These are all called laser tweezers. You actually grab an atom and you hold on to it in a vacuum, and you grab another atom and you hold on to it in a vacuum. You put them next to each other and use another laser to entangle them. And then you can operate on them, moving them to operating areas while you're holding them with the laser tweezers. Imagine two opposite sine waves that look like a saw, right? Grabbing an atom and holding it - and they'll hold hundreds of them inside this machine. So the compute is happening in here. It looks pretty sci-fi, doesn't it? That looks very cyberpunk to me. This is an ion trap machine - this one is actually an IonQ machine. The chip is right down in here, and you'll see pictures of it later. So it suspends ions in a magnetic field, holds onto them, and then uses lasers to set them and measure them and entangle them. When you see it in operation, you can actually see these glowing blue ions with your naked eye. It's pretty cool. And they grab them like the carriage of a typewriter and move them back and forth between the lasers, using the magnetic field to operate on them. And then last but not least is a photonics machine. So right there are the silicon wafers that do the compute, and the silicon wafer fits in here.
And these are photons going in and out here, and there are photonic detectors in there. Now, it's important to note these machines have a lot of error in them. And that's really one of the big challenges with quantum computing. Just to put it in perspective, your iPhone today has alpha particles flipping bits in its memory as we speak. And the way we correct that is an error correction code - basically a hash that does error correction on it continuously. Otherwise it would crash all the time and you'd lose memory and all those other kinds of things. That's true of all classical computers. But you need very little error correction code in proportion to the compute. With a quantum computer, you need a tremendous amount of error correction in proportion to the compute. So it's the opposite. What that means - and this is probably hard to conceive - is that the first error-corrected quantum computers, in the next probably five-to-seven-year timeframe, will be the size of a football field. Because they take an immense amount of compute to error-correct, to get the value of the compute out of them. That will get smaller over time, and there will be better and better ways to do it. But that's an important thing to understand. The other important thing to understand is, just like I still count with my fingers, the classical computers - which is what we call these now - will not go away. You're not going to run a website on an analog quantum computer. It would be a waste. I mean, you could do certain things on it, but it'd be a tremendous waste of it. So these are going to work together. Think of the quantum computer as a math coprocessor for the classical one. And they'll probably be primarily in cloud and hyperscale environments first, where you'll be able to call them. As a matter of fact, today, one of the services that my team built at Amazon is called Braket, named for Dirac's bra-ket notation for quantum states.
And you can go out for about $3, create a circuit on a quantum computer, run a simulation of it, and then run it on a real quantum computer - today, already. You can do that right now and see it. And you can actually see the amount of error in different machines as well, because of the noise that's occurring. And you can imagine, you're dealing with atomic particles, so everything causes an error: errors in the fabrication process, cell phone signals going through and hitting it, the Earth's magnetic field. Heat is an error. All those things cause errors. And so removing the error is a big deal in quantum computing, and it's what's going to advance quantum computing as we figure out how to remove the error - both reducing the error and removing the error through error correction. And so those are two really important things. The big innovation that's happened over the last five years with quantum computers is we've actually figured out how to remove the errors. It's just hard, and it will require a lot of hardware to do it. The first things you're going to see a quantum computer do that will impact our lives are around physics and chemistry. That'll take around 200 error-corrected qubits, in that range. Just to put that in perspective, some of the machines we're talking about here, to get to 200 error-corrected qubits, will be the size of a football field. They will be very big machines, very expensive machines to build. But they can be transformational. The next, starting at around a hundred qubits and getting to around a thousand qubits, will be materials science, and that will transform our world significantly. And the reason these will be the first things a quantum computer will do - not breaking cryptography and not doing logistics - is because a quantum computer works like a molecule.
Basically, you can think of it this way: when you build a circuit in a quantum computer, you're kind of building a molecular structure in the machine's memory to operate on, using the entanglement and the qubits - the atomic particles that you're operating with. What does that mean for transforming materials science? What does it mean? Today, when we try to find a superconductor, we discover it accidentally. Almost all materials science is accidental combinations. Even the sticky notes that you use were an accident - they were trying to build something else at 3M. These are accidents that occur, that we find randomly in a lab. With a quantum computer, you can reverse engineer a molecular system. It's called a Hamiltonian simulation. So I could say, for example, I want a high-temperature superconductor. And it will tell me, through simulation, what chemical formula would give me that high-temperature superconductor. That will transform our world. Just a high-temperature superconductor will transform our world. Just ammonia transformations - being able to do a Hamiltonian simulation on ammonia - is worth $1 trillion in manufacturing, annually. Ammonia is the most produced chemical out there. If we were to try to do a Hamiltonian simulation on ammonia today with classical computers - if we took all the iPhones and all the laptops and all the high-performance computers on Earth today - they would run for longer than the history of the universe to do that simulation. A quantum computer with a thousand error-corrected qubits would do that in about three minutes.
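The error-correction idea from the talk - spend redundant physical bits so random flips can be voted away - can be illustrated with the simplest classical code: three-way repetition with majority vote. This is only a sketch of the principle; real quantum error correction (surface codes and the like) must fix errors without directly reading the data qubits and needs vastly more overhead, which is the speaker's football-field point.

```python
import random

COPIES = 3  # each logical bit is stored three times

def encode(bits):
    """Repeat every logical bit COPIES times."""
    return [b for bit in bits for b in [bit] * COPIES]

def add_noise(bits, flip_prob, rng):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority-vote each block of COPIES physical bits back to one logical bit."""
    return [1 if sum(bits[i:i + COPIES]) > COPIES // 2 else 0
            for i in range(0, len(bits), COPIES)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(1000)]
noisy = add_noise(encode(message), flip_prob=0.05, rng=rng)
recovered = decode(noisy)
logical_errors = sum(m != r for m, r in zip(message, recovered))
# A logical bit fails only when 2 of its 3 copies flip, so the 5% physical
# flip rate drops by roughly an order of magnitude at the logical level.
assert logical_errors < 25
```

Driving the error rate lower means more copies and more voting machinery per logical bit - a hint of why early error-corrected quantum machines trade enormous hardware for a modest number of usable qubits.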

          Why Is Quantum Strategically Important?

          Quantum technologies are expected to propel dramatic leaps in computational capability, important advances in sensor design, and new strategies for accurately communicating quantum information. These increased measurement and processing capabilities would catalyze revolutionary advancements in nearly every industry and discipline—national security, communications, healthcare, materials engineering, manufacturing, and finance sectors all stand to gain.

Through legislation and executive action, the federal government is working to ensure U.S. leadership in QIST, and several commercial sectors are responding to the call. Organizations across industry are rallying together to test hardware, design software, and support applied research into how quantum will affect their fields. Booz Allen is committed to supporting this work. We help our clients understand how quantum technologies will impact their missions and businesses and how early investment can secure future advantages. It is pivotal to begin this planning today to ensure that organizations can leverage the technology to its full potential with a first-mover advantage. We can provide customized quantum support to your organization, federal agency, or national laboratory.

          Advancements in QIST are challenging our understanding of what future computing, communications, and sensing technologies will look like and have compelling potential implications across the government and private sectors. Continued dedication to furthering QIST, as well as proactive moves toward information security in a quantum-enabled world, is key to building the partnerships and knowledge the United States needs to embrace these exciting technologies as they mature.

          Explore our dedicated capabilities for Quantum Tech for Positioning, Navigation, and Timing.

          Hope for the Best, Plan for the Worst: Preparing for the Quantum Cyber Threat

          Despite their significant promise, quantum computers also threaten how enterprises secure critical data given their ability to solve the difficult math problems that are the basis for some modern encryption standards. Luckily, we do not have to wait for a full-scale quantum computer to start protecting critical data from the threat these computers present. PQC uses classical algorithms rooted in math problems that a quantum computer cannot efficiently solve to secure data. Government entities, led by the National Institute of Standards and Technology (NIST), have selected PQC algorithms for future cybersecurity and continue to manage the PQC standardization process.
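To make the threat concrete, here is a toy sketch (not any NIST-selected algorithm, and the numbers are absurdly small on purpose): RSA-style encryption is secure only because factoring the public modulus n = p * q is infeasible classically at real key sizes. Shor's algorithm on a large quantum computer removes that barrier; for a toy modulus, even naive trial division does the job.

```python
def trial_factor(n):
    """Find a nontrivial factor of n by brute force - hopeless for 2048-bit
    moduli, which is exactly what Shor's algorithm would change."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n  # n is prime

# Toy RSA-style keypair (illustrative numbers only)
p, q = 101, 113
n, e = p * q, 11
ciphertext = pow(42, e, n)  # "encrypt" the message 42 with the public key (n, e)

# The attack: factor n, rebuild the private exponent, decrypt.
p_found = trial_factor(n)
q_found = n // p_found
d = pow(e, -1, (p_found - 1) * (q_found - 1))  # modular inverse (Python 3.8+)
assert pow(ciphertext, d, n) == 42
```

PQC algorithms swap the factoring (or discrete-log) problem for problems, such as those on lattices, that are believed hard even for quantum computers, which is why data can be protected before large quantum machines exist.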

At Booz Allen, we understand that transitioning to new cybersecurity standards is no easy task, so we provide a range of cryptographic services and expertise to support the transition:

          Data Discovery

          Identify data sources and business sensitivity levels used across your organization to prioritize transition strategy.

          Cryptographic Discovery

          Assess the lifecycle of cryptography used, availability of algorithms, and policy governing the cryptographic lifecycle. Create a cryptographic inventory to inform and prioritize PQC adoption.

          Application Prototyping

          Test PQC candidate algorithms on mission-relevant use cases to assess network and performance impacts and to prepare infrastructure.

          PQC Adoption

          Ensure all legacy systems and vendors use quantum-safe algorithms and that governance around cryptographic agility is enforced in future deployments.
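As a sketch of what the cryptographic discovery step can look like in practice (a hypothetical minimal scanner, not Booz Allen tooling - the pattern list and file glob are assumptions you would tailor): walk a source tree, flag references to quantum-vulnerable public-key primitives, and record file and line locations to seed a cryptographic inventory.

```python
import re
from pathlib import Path

# Public-key primitives broken by Shor's algorithm (illustrative, not exhaustive).
QUANTUM_VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|x25519)\b", re.IGNORECASE)

def cryptographic_inventory(root, glob="**/*.py"):
    """Map each flagged primitive to the (file, line) locations that mention it."""
    hits = {}
    for path in Path(root).glob(glob):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            for match in QUANTUM_VULNERABLE.finditer(line):
                hits.setdefault(match.group(1).upper(), []).append((str(path), lineno))
    return hits
```

Run over a repository, this yields a map like {"RSA": [("app/keys.py", 12), ...]} - a starting point for prioritizing which systems to migrate first, to be refined with runtime and network-level discovery.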

          Contact Us

Contact us to learn more about preparing for your migration and identifying the best practices to use. It’s important to consider strategy, cryptosystem inventories, testing, and other key areas to streamline this essential transformation.
