Is it possible to invent a computer that computes anything in a flash? Or could some problems stump even the most powerful of computers? How complex is too complex for computation?
The question of how hard a problem is to solve lies at the heart of an important field of computer science called computational complexity.
Computational complexity theorists want to know which problems are practically solvable using clever algorithms and which problems are truly difficult, maybe even virtually impossible, for computers to crack.
This hardness is central to what’s called the P versus NP problem, one of the most difficult and important questions in all of math and science.
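(An aside, not from the video: the gap the P vs NP question asks about is easy to feel in code. In the subset-sum problem, a cousin of the knapsack problem covered later, verifying a proposed answer takes one quick pass, while the obvious way to find an answer tries every one of the 2^n subsets. The Python below is a minimal sketch of that contrast; the function names are ours.)

from collections import Counter
from itertools import combinations

def verify(numbers, target, candidate):
    # Verifying a proposed answer is fast: check membership and sum once.
    available, used = Counter(numbers), Counter(candidate)
    return sum(candidate) == target and all(
        available[x] >= n for x, n in used.items()
    )

def brute_force(numbers, target):
    # The obvious search tries every one of the 2^n subsets.
    for size in range(len(numbers) + 1):
        for subset in combinations(numbers, size):
            if sum(subset) == target:
                return subset
    return None

nums = [3, 34, 4, 12, 5, 2]
print(brute_force(nums, 9))     # (4, 5)
print(verify(nums, 9, (4, 5)))  # True -- quick to confirm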
This video covers a wide range of topics, including the history of computer science, how transistor-based electronic computers solve problems using Boolean logical operations and algorithms, what a Turing machine is, the different classes of computational problems, circuit complexity, and the emerging field of meta-complexity, where researchers study the self-referential nature of complexity questions.
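(Another aside, not from the video: the reason transistor circuits can carry out any Boolean computation is that every Boolean operation can be built from a single primitive gate such as NAND. A minimal Python sketch, with helper names of our own choosing:)

def nand(a: bool, b: bool) -> bool:
    # The one primitive gate: true unless both inputs are true.
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

# Quick truth-table check against Python's built-in operators.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)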
00:00 | Introduction to the P vs NP problem
02:16 | Intro to Computational Complexity
02:30 | How do computers solve problems?
03:02 | Alan Turing and Turing Machines
04:05 | George Boole and Boolean Algebra
05:21 | Claude Shannon, logic circuits, and the invention of the transistor
06:22 | John von Neumann and the invention of the universal electronic computer
07:05 | Algorithms and their limits
08:22 | Discovery of different classes of computational problems
08:56 | P (polynomial-time) problems explained
09:56 | NP (nondeterministic polynomial-time) problems explained
11:36 | Implications if P = NP
12:48 | Discovery of NP Complete problems
13:45 | Knapsack Problem and Traveling Salesman Problem
14:24 | Boolean Satisfiability Problem (SAT) defined (brute-force sketch below the chapter list)
15:32 | Circuit Complexity Theory
16:55 | Natural Proofs Barrier
17:36 | Meta-complexity
18:12 | Minimum Circuit Size Problem (MCSP)
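(If you want to experiment with the SAT problem defined at 14:24, here is a minimal brute-force satisfiability checker; it is our own sketch, not code from the video. It uses the common DIMACS-style convention that a clause is a list of nonzero integers, where k stands for variable k and -k for its negation, and it simply tries all 2^n assignments.)

from itertools import product

def is_satisfiable(clauses, num_vars):
    # Try all 2^n assignments; return the first one satisfying every clause.
    for assignment in product([False, True], repeat=num_vars):
        if all(
            any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
            for clause in clauses
        ):
            return assignment
    return None  # unsatisfiable

# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
print(is_satisfiable([[1, -2], [2, 3], [-1, -3]], 3))  # (False, False, True)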