
Limits of Formal Logic in Real Worlds

When Perfect Logic Meets Imperfect Reality

"Logic is perfect. The world is not."

Formal logic assumes complete, certain, and consistent knowledge. But the real world offers us incomplete, noisy, and contradictory information. This is where logic breaks down, and why AI needs probability.

Logic's Assumptions vs. Real World Reality
| Aspect | Logic Assumes... | Reality Provides... |
| --- | --- | --- |
| Knowledge | Complete: all facts are known | Incomplete: many facts missing or unknown |
| Data Quality | Perfect: exact, error-free measurements | Noisy: sensor errors, measurement uncertainty |
| Consistency | Consistent: no contradictions | Contradictory: conflicting evidence common |
| Truth Values | Binary: true or false (1 or 0) | Continuous: degrees of belief (0.0 to 1.0) |
The Core Problem

Logic-based AI systems work beautifully in closed, perfect worlds (chess, mathematics, controlled environments). But they struggle or fail entirely in open, messy, real worlds (robotics, medicine, autonomous vehicles, natural language).

What You'll Learn

This topic walks through five concrete, interactive examples of logic failing, then shows why probability solves them:

  1. How noisy sensors break deterministic logic (Robot navigation)
  2. How incomplete information makes logic fail (Medical diagnosis)
  3. How contradictory evidence crashes logic systems (Self-driving car)
  4. How ambiguous language confuses logic (Spam email detection)
  5. How belief revision is complex in logic (Detective reasoning)
  6. Why probability is the solution to all these problems

Problem 1: Noisy Sensor Data

Scenario: A warehouse robot uses distance sensors to avoid obstacles

The Scenario

Task: Robot must decide: "Is it safe to move forward?"
Safety rule: Must maintain at least 3 meters from obstacles
Problem: Distance sensor has ±0.5m measurement error!

Interactive: Noisy Sensor Simulator
Logic-Based Decision (FAILS)
Rule:
IF distance < 3m
THEN stop
ELSE proceed
Probability-Based Decision (WORKS)
Rule:
P(distance < 3m | readings)
IF P(unsafe) > 0.5
THEN stop ELSE proceed
Key Insight

Logic demands certainty. Sensors provide noise. Probability bridges the gap.
With multiple noisy readings, probability computes confidence in the true distance and makes robust decisions even with imperfect sensors.
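The idea above can be sketched in a few lines. Assuming Gaussian sensor noise with the ±0.5 m spec as its standard deviation and a flat prior over the true distance, pooling n readings shrinks the uncertainty by a factor of √n, and P(unsafe) follows from the Gaussian CDF. The threshold and noise level come from the scenario; everything else (function names, the 0.5 decision cutoff applied to P(unsafe)) is illustrative.

```python
import math
import random

SAFETY_M = 3.0   # safety threshold from the scenario
SIGMA = 0.5      # sensor noise std dev (the +/-0.5 m spec)

def p_unsafe(readings, sigma=SIGMA):
    """P(true distance < SAFETY_M | readings), flat prior, Gaussian noise."""
    n = len(readings)
    mean = sum(readings) / n
    se = sigma / math.sqrt(n)              # posterior std dev of the true distance
    z = (SAFETY_M - mean) / se
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # Gaussian CDF at the threshold

def decide(readings):
    # Logic: acts on the single latest reading; one noisy value flips the decision.
    logic = "stop" if readings[-1] < SAFETY_M else "proceed"
    # Probability: pools all readings and acts on P(unsafe).
    prob = "stop" if p_unsafe(readings) > 0.5 else "proceed"
    return logic, prob

random.seed(0)
true_dist = 3.2                            # actual distance (hidden from the robot)
readings = [random.gauss(true_dist, SIGMA) for _ in range(5)]
print(decide(readings))
```

Note how the logic rule can flip on any single reading that happens to dip below 3 m, while the probabilistic decision stabilizes as readings accumulate.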

Problem 2: Incomplete Information

Scenario: Doctor diagnosing a patient with limited information

The Scenario

Patient symptoms: Fever, headache, fatigue
Possible diseases: Flu, COVID-19, Migraine, Meningitis
Problem: Patient didn't report all symptoms (some unknown/forgotten!)

Interactive: Medical Diagnosis System
Logic-Based Diagnosis (FAILS)
Hard Rules:
IF fever AND headache AND cough
  THEN flu

IF fever AND cough THEN covid
IF headache THEN migraine
Probability-Based Diagnosis (WORKS)
Bayesian Inference:
P(disease | symptoms) =
P(symptoms | disease) × P(disease) / P(symptoms)
Key Insight

Logic requires all facts. Medicine rarely has complete information. Probability makes optimal decisions with partial data.
The probabilistic approach computes probability for each disease even with missing symptoms, while logic either fails to match any rule or picks the wrong diagnosis.
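A minimal naive-Bayes version of this: score each disease by prior × likelihood of the observed symptoms, then normalize. Unreported symptoms simply drop out of the product, so the system still ranks diseases with partial data. All priors and likelihoods here are made-up numbers for illustration, not medical estimates.

```python
# Hypothetical priors and symptom likelihoods, for illustration only.
priors = {"flu": 0.10, "covid": 0.05, "migraine": 0.15, "meningitis": 0.001}
likelihood = {   # P(symptom present | disease)
    "flu":        {"fever": 0.90, "headache": 0.60, "fatigue": 0.80, "cough": 0.70},
    "covid":      {"fever": 0.80, "headache": 0.50, "fatigue": 0.70, "cough": 0.80},
    "migraine":   {"fever": 0.10, "headache": 0.95, "fatigue": 0.50, "cough": 0.05},
    "meningitis": {"fever": 0.95, "headache": 0.90, "fatigue": 0.70, "cough": 0.10},
}

def diagnose(observed):
    """P(disease | observed symptoms); unreported symptoms are simply left out."""
    scores = {}
    for disease, prior in priors.items():
        s = prior
        for symptom in observed:
            s *= likelihood[disease][symptom]
        scores[disease] = s
    total = sum(scores.values())           # Bayes' rule denominator P(symptoms)
    return {d: s / total for d, s in scores.items()}

posterior = diagnose({"fever", "headache", "fatigue"})  # cough unknown, not assumed absent
best = max(posterior, key=posterior.get)
print(best, posterior)
```

A hard rule like "IF fever AND headache AND cough THEN flu" cannot fire here because cough was never reported; the probabilistic version still produces a ranked diagnosis.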

Problem 3: Contradictory Evidence

Scenario: Self-driving car with conflicting sensor readings

The Scenario

Situation: Autonomous vehicle approaching intersection
Camera says: "Traffic light is GREEN" (95% reliable)
Radar says: "Object ahead - car still in intersection" (98% reliable)
GPS says: "You have right of way" (99% reliable)
Problem: Conflicting signals! What should the car do?

Interactive: Multi-Sensor Decision Making
Camera: GREEN LIGHT → suggests PROCEED (reliability 95%)
Radar: OBJECT AHEAD → suggests STOP (reliability 98%)
GPS: RIGHT OF WAY → suggests PROCEED (reliability 99%)
Logic-Based Decision
Probability-Based Decision (ROBUST)
Key Insight

Logic breaks under contradiction. Probability weighs conflicting evidence and makes rational decisions.
By considering sensor reliability, probability fusion computes P(safe) and makes the most rational decision given contradictory inputs. The more reliable sensor (radar) has more influence.
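One simple way to fuse these signals, sketched below under an assumed decomposition: proceeding is safe only if the light is green AND the intersection is clear AND the car has right of way, with each sensor's reliability read as the probability its claim is true. The independence assumption and the 0.5 cutoff are modeling choices for this sketch, not the only way to do fusion.

```python
def p_safe(p_green, p_clear, p_right_of_way):
    """Proceeding is safe only if all three conditions hold (assumed independent)."""
    return p_green * p_clear * p_right_of_way

# Reliability = P(the sensor's claim is true | its report).
p_green = 0.95        # camera reports GREEN
p_clear = 1 - 0.98    # radar reports OBJECT AHEAD, so P(intersection clear) = 1 - 0.98
p_row   = 0.99        # gps reports right of way

prob = p_safe(p_green, p_clear, p_row)   # small: the radar report dominates
decision = "proceed" if prob > 0.5 else "stop"
print(decision, round(prob, 3))
```

Because safety requires all three conditions jointly, the highly reliable radar report of an object drives P(safe) near zero, and the car stops, exactly the rational behavior under contradiction.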

Problem 4: Ambiguous Language

Scenario: Email spam detection with ambiguous words

The Scenario

Email: "Congratulations! You've been selected for a free prize!"
Question: Is "free" a legitimate offer or spam trigger?
Problem: Context matters! Same word means different things in different emails.

Interactive: Spam Email Detector
Logic-Based Detection (TOO RIGID)
Keyword Rules:
IF contains("free") AND contains("prize")
  THEN spam
Probability-Based Detection (FLEXIBLE)
Naive Bayes:
P(spam | words) ∝
P(spam) × ∏ P(word | spam)
Key Insight

Natural language is inherently ambiguous. Logic demands precision. Probability embraces context.
Probabilistic models learn from thousands of examples, understanding that "free" has different meanings in spam vs legitimate emails. Logic's rigid keyword matching causes many false positives.
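A toy naive-Bayes classifier makes this concrete. The per-word probabilities below are hypothetical, standing in for estimates from a labelled corpus; the point is that "free" next to "prize" and "winner" scores very differently from "free" next to "meeting" and "attached".

```python
import math

# Hypothetical per-word probabilities, as if estimated from a labelled corpus.
p_word_given_spam = {"free": 0.30, "prize": 0.25, "winner": 0.20,
                     "meeting": 0.01, "attached": 0.02}
p_word_given_ham  = {"free": 0.05, "prize": 0.01, "winner": 0.005,
                     "meeting": 0.10, "attached": 0.12}
P_SPAM = 0.4  # assumed prior

def p_spam(words):
    """Naive Bayes: P(spam | words) from log-domain products of word likelihoods."""
    log_s = math.log(P_SPAM)
    log_h = math.log(1 - P_SPAM)
    for w in words:
        if w in p_word_given_spam:          # skip words with no estimate
            log_s += math.log(p_word_given_spam[w])
            log_h += math.log(p_word_given_ham[w])
    return 1 / (1 + math.exp(log_h - log_s))  # normalize the two hypotheses

spammy = p_spam("congratulations you are a free prize winner".split())
hammy  = p_spam("free agenda for the meeting is attached".split())
print(round(spammy, 3), round(hammy, 3))
```

Both emails contain "free", yet the surrounding words push the posterior in opposite directions; a rigid keyword rule on "free" would flag both.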

Problem 5: Belief Revision

Scenario: Detective solving a crime with evolving evidence

The Scenario

Initial evidence: Fingerprints match Suspect A → Guilty
New evidence: Video proof shows Suspect A at different location (alibi)
Problem: How to revise belief? What else needs to change?

Interactive: Detective Case Solver
Logic-Based Reasoning (NONMONOTONIC)
Rule:
IF fingerprints_match
  THEN guilty = TRUE
Probability-Based Reasoning (SMOOTH UPDATE)
Bayesian Update:
P(guilty | evidence) ∝ P(evidence | guilty) × P(guilty)
Updated with each new piece of evidence
Key Insight

Logic's nonmonotonic reasoning is complex. Bayesian updating is smooth and principled.
When new evidence arrives, probability gracefully updates beliefs using Bayes' rule. Logic must retract conclusions and figure out cascading changes, which is computationally complex.
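Sequential updating is easiest to see in odds form: posterior odds = prior odds × likelihood ratio, applied once per piece of evidence. The prior and the likelihood ratios below are invented for illustration; what matters is that the alibi simply multiplies the odds down again, with no retraction machinery.

```python
def update(p, lr):
    """One Bayesian update in odds form: posterior odds = prior odds x likelihood ratio."""
    odds = p / (1 - p) * lr
    return odds / (1 + odds)

# Hypothetical likelihood ratios P(evidence | guilty) / P(evidence | innocent).
p = 0.10                    # prior belief that Suspect A is guilty
p = update(p, lr=50.0)      # fingerprints match: strong evidence for guilt
after_prints = p            # belief jumps well above 0.8
p = update(p, lr=0.01)      # video alibi: strong evidence against guilt
after_alibi = p             # belief falls back below 0.1
print(round(after_prints, 3), round(after_alibi, 3))
```

Each update is the same one-line rule; the alibi does not "retract" the fingerprint conclusion, it just outweighs it.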

Comprehensive Comparison: When Logic Fails & Probability Succeeds

| Problem Type | Logic Fails Because... | Probability Succeeds Because... | Real Examples |
| --- | --- | --- | --- |
| Noisy Sensors | BRITTLE: demands exact values; one noisy reading causes a wrong decision | ROBUST: maintains probability distributions; multiple readings improve confidence | Robot navigation, autonomous driving, sensor fusion |
| Incomplete Info | RIGID: requires all facts; missing data means "cannot decide" or a wrong conclusion | FLEXIBLE: works with partial data; computes the best estimate from available evidence | Medical diagnosis, customer profiling, recommender systems |
| Contradictions | CRASHES: cannot resolve conflicts; system becomes inconsistent or halts | WEIGHS: considers reliability and evidence strength; makes the optimal decision | Autonomous vehicles, multi-sensor systems, evidence integration |
| Ambiguous Language | LITERAL: keyword matching misses context; many false positives/negatives | CONTEXTUAL: learns word probabilities from data; context-aware classification | NLP, sentiment analysis, text classification |
| Belief Revision | NONMONOTONIC: must retract conclusions; cascading changes; computationally complex | SMOOTH UPDATE: Bayesian updating gracefully revises beliefs; no retraction needed | Sequential evidence, online learning, investigation |
The Pattern is Clear

In every real-world scenario, logic's assumptions are violated. Probability theory provides a principled framework for handling uncertainty, making it the foundation of modern AI.

Key Takeaways

Why Logic Fails in Real Worlds
  • Assumes perfect information - Reality is incomplete
  • Demands exact values - Sensors are noisy
  • Cannot handle contradictions - Conflicts are common
  • Binary truth values - World is shades of gray
  • Brittle under uncertainty - Small errors cascade
Why Probability Succeeds
  • Works with partial information - Uses what's available
  • Handles noise naturally - Probability distributions
  • Weighs conflicting evidence - By reliability and strength
  • Continuous confidence - Degrees of belief [0,1]
  • Robust and graceful - Degrades smoothly under uncertainty
The Fundamental Lesson

"The shift from logic to probability isn't a retreat from rigor โ€” it's a recognition of reality. Probability theory provides the mathematical foundation for reasoning under uncertainty, making it essential for any AI system that must operate in the real world."

Next: Now that you understand WHY we need probability, let's explore how uncertainty becomes a new paradigm for building intelligent systems! Continue to Topic 3 →