Lecture 10: Foundations of Uncertainty & Probability

From Absolute Logic to Probabilistic Reasoning

Understanding how AI reasons under uncertainty and incomplete knowledge

Lecture Overview

This lecture introduces a fundamental paradigm shift in AI: moving from formal logic's absolute certainty to probabilistic reasoning under uncertainty. We explore why real-world AI systems need probability theory, covering Bayesian reasoning, conditional probability, and the philosophical contrast between two views of intelligence.

Key Concepts: Probability Theory, Bayes' Rule, Conditional Independence, Bayesian Inference
Applications: Medical Diagnosis, Robotics, NLP, Computer Vision, Decision Making

Two Perspectives on Intelligence

Historically, humanity has viewed intelligence from two fundamentally different perspectives:

Perspective 1: Absolute Knowledge (Logic-Based)
  • Intelligence as deductive reasoning from certain facts
  • Propositional and first-order logic
  • Rule-based expert systems
  • Assumption: Complete and certain knowledge
Perspective 2: Partial Knowledge (Probabilistic)
  • Intelligence as reasoning under uncertainty
  • Probability as degrees of belief
  • Learning from incomplete, noisy data
  • Assumption: Uncertainty is fundamental to intelligence

"From proving truths to estimating truths" - This lecture explores why modern AI embraces uncertainty not as a weakness, but as a more realistic model of human intelligence and the real world.

📚 Probability Cheat Sheet Available!

Quick reference guide covering probability axioms, Bayes' rule, distributions, and key formulas. Perfect for studying and quick lookups!

📊 Lecture 10 Presentation Slides

Complete lecture presentation (32 slides) covering reasoning under uncertainty, probability theory, Bayes' Rule, and Bayesian inference. Perfect for reviewing lecture content!

Main Topics

Interactive Demonstrations

Visualize and interact with probability concepts and Bayesian inference in real-time.

Practical Exercises

Work through these exercises to master probability and Bayesian reasoning.

Basic Probability Calculations

Practice computing probabilities, applying axioms, and working with sample spaces.
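For uniform sample spaces, probabilities reduce to counting favorable outcomes. A minimal sketch, using a hypothetical two-dice example (not from the lecture) to illustrate the counting approach:

```python
from fractions import Fraction

# Hypothetical sample space: all outcomes of rolling two fair six-sided dice.
omega = [(a, b) for a in range(1, 7) for b in range(1, 7)]

def prob(event):
    """P(event) = |favorable outcomes| / |sample space| for a uniform space."""
    return Fraction(sum(1 for outcome in omega if event(outcome)), len(omega))

p_sum_7 = prob(lambda o: o[0] + o[1] == 7)   # 6 favorable outcomes out of 36 -> 1/6
p_double = prob(lambda o: o[0] == o[1])      # (1,1)...(6,6) -> also 1/6
```

Using `Fraction` keeps the results exact, which makes it easy to check answers against hand calculations.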

Bayes' Rule Applications

Apply Bayes' theorem to medical diagnosis, spam detection, and other real scenarios.
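The medical-diagnosis scenario can be sketched directly from Bayes' rule. The numbers below (1% prevalence, 99% sensitivity, 5% false-positive rate) are illustrative assumptions, not figures from the lecture:

```python
# Assumed illustrative numbers for a hypothetical diagnostic test.
p_disease = 0.01              # prior: P(disease)
p_pos_given_disease = 0.99    # sensitivity: P(positive | disease)
p_pos_given_healthy = 0.05    # false-positive rate: P(positive | no disease)

# Total probability of a positive test (law of total probability).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' rule: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
# ~0.167: despite a 99%-sensitive test, the low prior keeps the posterior modest.
```

This is the classic base-rate effect: the posterior is dominated by the prior when the disease is rare.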

Independence & Conditional Independence

Identify independent variables and apply conditional independence to simplify models.

Bayesian Inference Problems

Complete step-by-step Bayesian inference with prior beliefs and evidence updates.
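The prior-then-evidence loop can be sketched as repeated applications of Bayes' rule, where each posterior becomes the next prior. The coin-bias hypothesis and flip sequence below are assumed for illustration:

```python
def bayes_update(prior, likelihood_h, likelihood_alt):
    """One Bayesian update for a binary hypothesis H vs its alternative."""
    numer = likelihood_h * prior
    return numer / (numer + likelihood_alt * (1 - prior))

# Hypothetical example: H = "coin is biased toward heads, P(heads)=0.8"
# vs the alternative "coin is fair, P(heads)=0.5". Assumed prior: 0.5.
belief = 0.5
for flip in ["H", "H", "T", "H"]:
    lh = 0.8 if flip == "H" else 0.2   # likelihood of this flip under H
    lf = 0.5                           # likelihood under the fair-coin alternative
    belief = bayes_update(belief, lh, lf)
# After HHTH the belief in the biased hypothesis rises to about 0.62.
```

Note that the final belief depends only on the counts of heads and tails, not their order, because the likelihood ratios multiply.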

Key Concepts Summary

Probability Fundamentals:
  • P(A) ∈ [0, 1] for all events A
  • P(true) = 1, P(false) = 0
  • P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
Bayes' Rule:
  • P(A|B) = P(B|A) × P(A) / P(B)
  • Posterior ∝ Likelihood × Prior
  • Update beliefs with evidence
Independence:
  • P(A ∧ B) = P(A) × P(B) if independent
  • P(A|B,C) = P(A|C) if A ⊥ B | C
  • Simplifies joint distributions
Key Insights:
  • Uncertainty is not a weakness but a feature
  • Probabilistic reasoning is more robust
  • Foundation for machine learning
SE444: Artificial Intelligence | Lecture 10: Foundations of Uncertainty & Probability