Key Terms — Primary Source Definitions
Precise language from primary sources: Preskill, Nielsen & Chuang, IBM Quantum, Google Quantum AI, and NIST. Each term cited to its origin.
Classical computers use bits — each is 0 or 1. Quantum computers use qubits, which can exist in a superposition of both at once. This changes everything.
A spinning coin is neither heads nor tails — it's both simultaneously. A qubit works the same way. Until you look (measure it), it exists in multiple states at once.
Every quantum computing concept, explained visually with real-world analogies.
A qubit can exist in a combination of |0⟩ and |1⟩ at the same time — not one or the other, but literally both, with different probabilities. Measuring it forces it to "choose."
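To make the amplitudes concrete, here is a minimal NumPy sketch; the specific amplitudes are illustrative, not drawn from any cited source.

```python
import numpy as np

# A qubit state is a length-2 complex vector: psi = alpha|0> + beta|1>.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)  # illustrative amplitudes
psi = np.array([alpha, beta])

# Born rule: measurement probabilities are the squared magnitudes.
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")     # P(0) = 0.50, P(1) = 0.50

# The state is "both" only until measured; probabilities must sum to 1.
assert np.isclose(p0 + p1, 1.0)
```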
Two qubits become "entangled" — measuring one instantly determines the outcome you'd find on the other, regardless of the distance between them. Einstein called it "spooky action at a distance."
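Those correlations can be sampled directly from the Bell state (|00⟩ + |11⟩)/√2. A minimal NumPy sketch, with basis ordering |00⟩, |01⟩, |10⟩, |11⟩ assumed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state |Phi+> = (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2   # [0.5, 0, 0, 0.5]

# Sample joint measurements: the outcome is always 00 or 11, never 01 or 10,
# so learning one qubit's result fixes the other's.
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)
```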
Quantum states can add together (constructive interference) or cancel out (destructive interference), just like waves. Algorithms use this to amplify correct answers and suppress wrong ones.
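The textbook demonstration is applying a Hadamard gate twice: the two paths into |1⟩ carry opposite-sign amplitudes and cancel, returning the qubit to |0⟩. A short NumPy sketch:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
zero = np.array([1.0, 0.0])

once = H @ zero    # equal superposition: amplitudes [0.707, 0.707]
twice = H @ once   # the |1> amplitudes interfere destructively
print(once.round(3))   # [0.707 0.707]
print(twice.round(3))  # [1. 0.] -- back to |0>
```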
Like classical logic gates (AND, OR, NOT), quantum gates manipulate qubit states. The Hadamard gate creates superposition; CNOT creates entanglement. They're the building blocks of quantum circuits.
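Composing exactly those two gates gives the standard Bell circuit; the matrices below are the textbook definitions:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # creates superposition
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # creates entanglement
                 [0, 1, 0, 0],                 # (control = qubit 0)
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0, 0, 0])   # start in |00>
state = np.kron(H, I) @ state      # H on qubit 0
state = CNOT @ state               # entangle the pair
print(state.round(3))              # [0.707 0 0 0.707] = (|00>+|11>)/sqrt(2)
```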
A qubit's state can be visualized as a point on a sphere. The north pole is |0⟩, south is |1⟩, and any point on the surface is a valid superposition. Quantum gates rotate this point.
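The point's coordinates are the expectation values of the Pauli operators X, Y, Z. A small NumPy check of the poles and the equator:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def bloch(psi):
    """Bloch-sphere coordinates: (<X>, <Y>, <Z>) of a pure state."""
    return np.array([np.real(np.conj(psi) @ P @ psi) for P in (X, Y, Z)])

print(bloch(np.array([1, 0])))               # [0. 0. 1.]  north pole, |0>
print(bloch(np.array([0, 1])))               # [0. 0. -1.] south pole, |1>
print(bloch(np.array([1, 1]) / np.sqrt(2)))  # [1. 0. 0.]  equator, |+>
```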
For certain problems — factoring, search, molecular simulation — quantum computers need dramatically fewer operations: exponentially fewer for factoring and simulation, quadratically fewer for search. Shor's algorithm factors large numbers efficiently; no known classical algorithm can match it.
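In Shor's algorithm the quantum speedup lives entirely in the period-finding step; the rest is classical number theory. A sketch of that classical reduction, with the period found by brute force here (exactly the part a quantum computer replaces):

```python
from math import gcd

def factor_via_period(N, a):
    """Shor-style reduction: recover factors of N from the period of a^x mod N.
    The loop below is brute-force period finding -- the quantum computer's job."""
    r = 1
    while pow(a, r, N) != 1:   # smallest r with a^r = 1 (mod N)
        r += 1
    if r % 2:
        return None            # odd period: retry with a different a
    x = pow(a, r // 2, N)
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    return (p, q) if 1 < p < N else None

print(factor_via_period(15, 7))   # (3, 5): the period of 7^x mod 15 is 4
```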
Put a qubit into superposition, then measure it. Watch the wavefunction collapse.
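A simulated version of that demo, assuming Born-rule sampling in NumPy; the "collapse" here is classical bookkeeping that mirrors what measurement does on hardware:

```python
import numpy as np

rng = np.random.default_rng()

# Hadamard puts |0> into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = H @ np.array([1.0, 0.0])

# Measure: sample an outcome with the Born rule, then collapse the state.
outcome = rng.choice([0, 1], p=np.abs(psi) ** 2)
psi = np.eye(2)[outcome]   # post-measurement state is |0> or |1>
print(f"outcome = {outcome}, collapsed state = {psi}")
```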
Every format, carefully curated. Start with whatever suits your learning style.
A structured path from "what is a qubit?" to implementing real algorithms.
Key terms, notation, and formulas — scannable and visual.
A structured progression from foundational theory to advanced techniques and real-world lab applications.
Every quantum computation follows these six steps — from setting up qubits to extracting a usable classical result.
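The six steps aren't enumerated in this blurb; one common breakdown (initialize, superpose, entangle, apply the algorithm's gates, measure, post-process) is sketched below. The step names are an assumption, not a quote from any source cited here.

```python
import numpy as np

rng = np.random.default_rng(1)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])

state = np.array([1.0, 0, 0, 0])             # 1. initialize qubits to |00>
state = np.kron(H, I) @ state                # 2. create superposition
state = CNOT @ state                         # 3. entangle
                                             # 4. algorithm gates go here
probs = np.abs(state) ** 2
shots = rng.choice(4, size=1000, p=probs)    # 5. measure repeatedly
counts = np.bincount(shots, minlength=4)     # 6. classical post-processing
print(dict(zip(["00", "01", "10", "11"], counts.tolist())))
```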
Different approaches to harnessing quantum effects — each suited to different problem types and hardware constraints.
The most straightforward model and the foundation for all others. Uses quantum gates arranged into circuits to perform computation — analogous to classical logic gates but operating on qubits in superposition.
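A toy illustration of the model: a circuit is just an ordered list of gates applied to a state vector. This is a sketch, not any particular framework's API; apply_1q is a helper invented for illustration:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])

def apply_1q(state, gate, target, n):
    """Apply a single-qubit gate to qubit `target` of an n-qubit state."""
    full = np.array([[1.0]])
    for q in range(n):
        full = np.kron(full, gate if q == target else np.eye(2))
    return full @ state

# A two-gate "circuit" on 2 qubits: H on qubit 0, then X on qubit 1.
circuit = [(H, 0), (X, 1)]
state = np.array([1.0, 0, 0, 0])   # |00>
for gate, target in circuit:
    state = apply_1q(state, gate, target, n=2)
print(state.round(3))              # (|01> + |11>)/sqrt(2)
```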
Uses quantum tunneling to pass through energy barriers rather than climb over them, seeking low-energy (often near-optimal) solutions in enormous solution spaces. Highly effective for combinatorial optimization, where the number of possible configurations is vast.
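Annealers typically minimize a QUBO: an energy x^T Q x over binary vectors x. A brute-force NumPy sketch of such a landscape (the Q matrix is made up for illustration; a real annealer explores the landscape physically rather than exhaustively):

```python
import numpy as np
from itertools import product

# Illustrative QUBO: minimize energy(x) = x^T Q x over x in {0,1}^3.
Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -1.0,  2.0],
              [ 0.0,  0.0, -1.0]])

def energy(x):
    x = np.array(x)
    return x @ Q @ x

best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))   # (1, 0, 1) -2.0: the lowest-energy configuration
```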
Gradually transitions a quantum system from a simple, well-understood initial state to a complex final state that encodes the solution to an optimization problem. If the evolution is slow enough, the adiabatic theorem keeps the system in its ground state throughout.
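The standard formalization interpolates between a simple Hamiltonian H0 and a problem Hamiltonian H1; "slow enough" is set by the spectral gap between the ground state and the first excited state. A single-qubit NumPy sketch with illustrative choices of H0 and H1:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H0 = -X   # easy ground state |+>, simple to prepare
H1 = -Z   # ground state |0>, standing in for the "solution"

# H(s) = (1-s) H0 + s H1; the evolution must be slow where the gap is small.
for s in np.linspace(0, 1, 5):
    evals = np.linalg.eigvalsh((1 - s) * H0 + s * H1)  # ascending eigenvalues
    print(f"s = {s:.2f}  gap = {evals[1] - evals[0]:.3f}")
```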
Pre-entangles qubits into large cluster states, then performs computation through adaptive measurements on that cluster. Once the cluster is prepared, no further entangling gates are needed — a useful trade-off for hardware where measurements are easier to perform than gates.
Performs computation entirely through a sequence of single-qubit measurements on a pre-prepared entangled resource state. The choice of measurement basis at each step determines the computation — no gates during computation itself.
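The smallest worked example uses a two-qubit cluster: an X-basis measurement on one qubit enacts a Hadamard on the other (up to a known correction for the other outcome). A NumPy sketch:

```python
import numpy as np

# Two-qubit cluster state: entangle two |+> qubits with CZ.
plus = np.array([1, 1]) / np.sqrt(2)
CZ = np.diag([1, 1, 1, -1])
cluster = (CZ @ np.kron(plus, plus)).reshape(2, 2)  # rows index qubit 0

# Measure qubit 0 in the X basis, outcome "+": contract with <+|, renormalize.
qubit1 = plus @ cluster
qubit1 = qubit1 / np.linalg.norm(qubit1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(qubit1.round(3))      # [1. 0.] = |0>
print((H @ plus).round(3))  # also |0>: the measurement applied a Hadamard
```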
Three real-world scenarios — apply the right quantum approach to each business problem.
Precise definitions covering the essential vocabulary of quantum computing — hardware components, core concepts, advanced techniques, and applied methods.
8 questions covering quantum computing fundamentals, hardware, algorithms, and case studies. Click an answer to see if you're right.
A structured path from fundamentals to real-world quantum applications.
The papers, algorithms, and hardware achievements that defined the field — from founders to frontier processors.
Each platform takes a different approach to building scalable quantum hardware.
8 questions drawn from primary sources — Preskill, Shor, Grover, Nielsen & Chuang, Google, and IBM. Click an answer to reveal the explanation and citation.