This lesson brings together the key theoretical concepts to address a fundamental question: What are the limits of what computers can do? Understanding these limits is essential for A-Level Computer Science and for appreciating why some problems remain unsolved despite enormous computational power.
| Limit | Nature | Example |
|---|---|---|
| Theoretical limits | Problems that no algorithm can solve | The Halting Problem |
| Practical limits | Problems solvable in theory but not in practice due to resource constraints | TSP for 1,000 cities |
| Physical limits | Constraints imposed by physics on computation speed and miniaturisation | Speed of light, thermodynamics |
As established by Turing, some problems are undecidable: no algorithm exists, or can ever exist, to solve them. The classic example is the Halting Problem: no general algorithm can decide, for every possible program and input, whether that program will eventually halt.
These limits are absolute. No future technology — quantum computing, AI, or anything else — can overcome them. They are mathematical impossibilities, not engineering challenges.
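The proof is beyond this preview, but the practical consequence can be sketched in code: the best any program can do is run another program for a bounded number of steps and report "halted" or "don't know". The names below (`run_with_budget` and the toy generator programs) are illustrative, not a standard API.

```python
def run_with_budget(program, max_steps):
    """Run a toy 'program' (a generator) for at most max_steps steps.
    Returns True if it halted, or None if the budget ran out: with any
    finite budget we can never prove a program will NEVER halt."""
    it = program()
    for _ in range(max_steps):
        try:
            next(it)
        except StopIteration:
            return True   # observed to halt
    return None           # unknown -- it might halt on step max_steps + 1

def halts_quickly():      # halts after 3 steps
    yield 1; yield 2; yield 3

def loops_forever():      # never halts
    while True:
        yield

# run_with_budget(halts_quickly, 10) -> True
# run_with_budget(loops_forever, 10) -> None (not False!)
```

Halting is therefore *semi-decidable*: we can confirm that a program halts by watching it do so, but no amount of watching confirms that it never will.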
Even among computable problems, some are intractable — solvable in theory, but the time required grows so fast that solving them for realistic inputs is impractical.
Modern encryption (e.g., AES-256) uses 256-bit keys, so a brute-force attack would need to try up to 2²⁵⁶ ≈ 1.2 × 10⁷⁷ keys. If every atom in a gram of matter (~6 × 10²³ atoms) were a computer checking a trillion keys per second, exhausting the key space would still take on the order of 10³⁴ years. The age of the universe is only ~1.4 × 10¹⁰ years.
This is not an absolute limit — the problem IS computable — but the time required makes it practically impossible.
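A quick sanity check of that scale, assuming a made-up aggregate rate of 10¹⁸ keys tested per second (already far beyond any real machine):

```python
def brute_force_years(key_bits, keys_per_second):
    """Worst-case years to exhaust a key space by brute force."""
    seconds = 2 ** key_bits / keys_per_second
    return seconds / (60 * 60 * 24 * 365)

print(f"{brute_force_years(256, 1e18):.1e} years")  # ~3.7e+51 years
print(f"{brute_force_years(56, 1e18):.1e} years")   # old 56-bit DES: well under a second
```

The contrast with 56-bit DES shows why key length matters: each extra bit doubles the attacker's work, so 200 extra bits multiply it by 2²⁰⁰.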
Many intractable problems suffer from combinatorial explosion — the number of possibilities grows exponentially or factorially:
| Problem | Input | Possibilities |
|---|---|---|
| TSP (20 cities) | 20 | 20! ≈ 2.4 × 10¹⁸ |
| Boolean SAT (50 variables) | 50 | 2⁵⁰ ≈ 10¹⁵ |
| Protein folding | Amino acid chain | Exponential in chain length |
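The figures in the table can be reproduced directly; note how adding a single city multiplies the number of TSP orderings by the new city count:

```python
import math

print(f"{math.factorial(20):.1e}")  # 2.4e+18 orderings for 20 cities
print(f"{2 ** 50:.1e}")             # 1.1e+15 assignments for 50 variables

# Each extra city multiplies the count by the new number of cities:
assert math.factorial(21) == 21 * math.factorial(20)
```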
Even if we had perfect algorithms, physics imposes hard constraints:
Erasing one bit of information dissipates a minimum amount of energy: kT ln 2, where k is Boltzmann's constant and T is temperature. At room temperature, this is ~3 × 10⁻²¹ joules. This sets a theoretical minimum energy for computation.
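Plugging in the constants confirms the figure (a direct evaluation of kT ln 2 at T = 300 K):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300            # room temperature, K

landauer = k * T * math.log(2)
print(f"{landauer:.2e} J per bit erased")  # ~2.87e-21 J
```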
Information cannot travel faster than the speed of light (3 × 10⁸ m/s). This limits how fast components can communicate and sets a minimum time for signal propagation across a chip.
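To see why this matters at chip scale, assume an illustrative 3 cm die (the size is a made-up round number, not a real product spec):

```python
c = 3e8        # speed of light, m/s
die = 0.03     # assumed die width: 3 cm

t = die / c    # one-way signal time: 1e-10 s = 0.1 ns
clock_5ghz = 1 / 5e9           # 0.2 ns per cycle
print(t / clock_5ghz)          # 0.5: half a clock cycle just to cross the die
```

In practice signals travel slower than c through wires, so the real constraint is even tighter; this is one reason chips are divided into local clock domains.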
The maximum rate of computation for a self-contained system is approximately 2 × 10⁴⁷ bits per second per gram. Even a computer the mass of the Earth operating for the age of the universe could perform at most ~10⁹³ operations: enough, in principle, to enumerate 2²⁵⁶ ≈ 10⁷⁷ keys, but still far fewer than the 70! ≈ 1.2 × 10¹⁰⁰ routes of a 70-city TSP.
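The back-of-envelope arithmetic behind that claim, using the rate quoted above plus two standard approximations (Earth mass ~6 × 10²⁷ g, age of the universe ~1.4 × 10¹⁰ years):

```python
import math

rate = 2e47                     # bits/s per gram (limit quoted above)
earth_grams = 6e27              # approximate mass of the Earth, grams
age_seconds = 1.4e10 * 3.15e7   # age of the universe, seconds

total_ops = rate * earth_grams * age_seconds
print(f"{total_ops:.0e}")       # ~5e+92, i.e. on the order of 10^93

# Still not enough for exhaustive search of a 70-city tour:
assert total_ops < math.factorial(70)   # 70! is about 1.2e100
```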