![[Pasted image 20260215140536.png]]
- material culture/object history/functions/shapes used to be distinct
- converged in the 70s, allowed by submicron transistors/silicon
the death of material culture
- Babbage: difference engine, first machine to embody mathematical rule in mechanism, digital automation
- Lady Byron commented: A thinking machine
- but algorithm is executed by human
- Context: industrialization of thought
- a digital attempt 1820:
- ![[Pasted image 20260215144257.png]]
- discrete, incremental fixed jumps ![[Pasted image 20260215144507.png]] ![[Pasted image 20260215144651.png]]
- Analytical engine in 1833 marked the transition from calculation to computation, generalization
- calculation: rules cannot be altered
- punch card: algorithm is transferred to operation card, programmable ![[Pasted image 20260215151328.png]] ![[Pasted image 20260215151422.png]]
- specific design features
- separation of memory and processor
	- microprogramming: small internal routines combined to carry out larger operations
- if or
- Jacquard loom and punch card
- exquisite subtlety is not the exclusive province of a human agent
- to improve acceptance of industrial objects
- specificity is in the card, not the machine, software is separated from hardware
- From mathematical machine to general purpose computer
- the realization that a number can represent entities other than quantity ![[Pasted image 20260215152605.png]]
- power of computing comes from the representational power of numbers
- we can manipulate symbols to produce knowledge of the world
![[Pasted image 20260215153004.png]]
Reference
Dr Doron Swade, lecture at Gresham College
More History
1. Purpose of the Series
- Explains computer science as a field, not programming tutorials.
- Covers the journey from:
    - Bits & transistors
    - Logic gates & CPUs
    - Operating systems
    - Internet, VR, robots
- Focus: how computing works and its societal impact.
2. Importance of Computers in Modern Life
- Computers run critical infrastructure:
    - Power grids
    - Transportation
    - Finance
    - Manufacturing
    - Communication
- Compared to the Industrial Revolution in terms of societal transformation.
- Current era described as the Electronic Age.
3. Layers of Abstraction
- Computers are simple machines at the lowest level but appear complex because of many abstraction layers.
- Users and engineers typically interact with higher layers, not raw hardware.
- Goal: understand computing from 1s and 0s → full systems.
4. Origins of Computing
Abacus (~2500 BCE, Mesopotamia)
- First widely recognized computing device.
- Manual calculator for addition/subtraction.
- Stored intermediate results, like modern memory.
- Used place-value rows (units, tens, hundreds, etc.).
5. Other Early Computing Tools
Over ~4,000 years humans built devices to reduce mental effort:
- Astrolabe – navigation and latitude at sea.
- Slide rule – multiplication/division.
- Mechanical clocks – time and astronomical calculations.
Theme: tools amplify human intelligence and reduce labor.
6. “Computer” Originally Meant a Person
- First recorded use (1613) referred to humans who performed calculations.
- Term shifted to machines in the late 1800s.
7. Mechanical Calculators
Step Reckoner (1694)
- Built by Gottfried Wilhelm Leibniz.
- Gear-based machine performing:
    - Addition
    - Subtraction
    - Multiplication
    - Division
- Influenced calculator design for ~300 years.
8. Pre-Computed Tables
- Before modern computers, people used large books of:
    - Square roots
    - Log tables
    - Artillery range tables
- Saved time but were slow to update and error-prone.
9. Military Need for Computation
- Artillery firing required accounting for:
    - Distance
    - Wind
    - Temperature
    - Air pressure
- Range tables were effective but inflexible.
10. Charles Babbage and Early Computer Concepts
Charles Babbage
Often called the “Father of Computing.”
Difference Engine (1820s)
- Mechanical machine to automatically compute polynomial tables.
- Never completed in his lifetime.
- Successfully built from his plans in 1991.
Analytical Engine
![[Pasted image 20260308174143.png]] (from cs.stanford.edu)
- Concept of a general-purpose computer.
- Included:
    - Memory
    - Sequential operations
    - Input/output
    - Printer
- Not built, but introduced the idea of programmability.
11. First Programmer
Ada Lovelace
- Wrote theoretical programs for the Analytical Engine.
- Recognized computing as a new symbolic language.
- Considered the first computer programmer.
12. Census and Punch Cards
Herman Hollerith
- Built an electro-mechanical tabulating machine for the 1890 U.S. Census.
    - A punch card was placed over pools of mercury; spring-loaded metal pins could pass through a hole in the card to touch the mercury, closing an electrical contact.
    - Each contact drove an electromechanical solenoid that incremented a mechanical counter.
    - (Aside: a flip-dot display works on the same principle: solenoid/electromagnetism + magnetic dipole interaction + mechanical rotation.)
- Used punch cards to encode data.
- Achieved ~10× speed improvement.
- Reduced census processing from a projected ~13 years to ~2.5 years.
13. Birth of IBM
IBM
- Originated from Hollerith’s Tabulating Machine Company (merged into CTR in 1911, renamed IBM in 1924).
- Became a dominant force in business computing.
14. Transition to Digital Computers
- Electro-mechanical machines transformed government and commerce.
- Rapid population growth and globalization demanded faster, more flexible systems.
- Set the stage for fully electronic digital computers.
1. Growing Complexity in the Early 20th Century
- Rapid increases in:
    - World population
    - Global trade
    - Scientific and engineering projects
    - Military mobilization (World Wars I & II)
- Result: huge demand for faster, automated computation.
- Electro-mechanical machines became:
    - Room-sized
    - Expensive
    - Error-prone
    - Slow
2. Electro-Mechanical Computers
IBM Harvard Mark I (1944)
- One of the largest electro-mechanical computers.
- ~765,000 components, 500 miles of wire.
- Used for WWII simulations (e.g., the Manhattan Project).
- Relied on relays as switches.
- Performance:
    - 3 additions/subtractions per second
    - Multiplication: ~6 seconds
    - Division: ~15 seconds
- Problems:
    - Slow switching speed
    - Mechanical wear and frequent failures
    - Required constant maintenance
3. Relays and “Bugs”
- Relay = electrically controlled mechanical switch.
- Limited speed (~50 switches per second).
- Mechanical parts wore out easily.
- In 1947, Grace Hopper’s team documented a moth stuck in a relay of the Harvard Mark II → popularized the term “computer bug.”
4. Vacuum Tubes – First Electronic Switches
John Ambrose Fleming (1904)
- Invented the vacuum tube diode.
- Allowed one-way current flow.
Lee de Forest (1906)
- Added a control electrode → triode vacuum tube.
- Acted like an electronic switch.
- Advantages over relays:
    - No moving parts
    - Thousands of switches per second
- Disadvantages:
    - Fragile
    - Generated heat
    - Burned out like light bulbs
    - Expensive initially
5. First Electronic Computers
Bletchley Park
Colossus (1943)
- Designed by Tommy Flowers.
- First large-scale programmable electronic computer.
- Used ~1,600 vacuum tubes.
- Purpose: decrypt Nazi communications.
- Programming done via plugboards (manual wiring).
ENIAC (1946)
- Built by John Mauchly and J. Presper Eckert.
- First general-purpose programmable electronic computer.
- Speed: ~5,000 additions per second.
- Limitations:
    - Frequent vacuum tube failures.
    - Operated only ~12 hours between breakdowns.
6. The Transistor Revolution (1947)
Inventors at Bell Labs
- John Bardeen
- Walter Brattain
- William Shockley
Key advantages:
- Solid-state (no fragile glass)
- Much smaller
- Faster switching (10,000+ times per second initially)
- Lower power consumption
- More reliable
- Cheaper over time
7. Early Transistor Computers
IBM 608 (1957)
- First fully transistorized commercial computer.
- ~3,000 transistors.
- Thousands of operations per second.
- Marked the transition from government-only machines to business and, eventually, home computing.
8. Silicon Valley and Semiconductors
- Region between San Francisco and San Jose became Silicon Valley.
- Named after silicon, the key semiconductor material.
- Company lineage:
    - Shockley Semiconductor
    - Fairchild Semiconductor
    - Intel
9. Scale and Speed Today
- Modern transistors:
    - < 50 nanometers in size.
    - Switch millions to billions of times per second.
    - Can run reliably for decades.
- Enabled the miniaturization of computers into phones and laptops.
Core Takeaway
Computing hardware evolved through three major switching technologies:
Relays → Vacuum Tubes → Transistors
Each step dramatically improved speed, reliability, size, and cost, transforming computers from fragile room-sized machines into compact, powerful electronic devices and laying the foundation for modern digital computing.
1. Ladder of Abstraction
- Computing is built in layers of abstraction.
- Lower levels: physical switches, gears, transistors.
- Higher levels: logic gates → circuits → processors → software.
- Engineers and programmers usually work at higher layers, not at the raw hardware level.
2. Binary Representation
- Computers use binary = two states.
- Electrical states:
    - On (current flowing) = True = 1
    - Off (no current) = False = 0
- Two states are preferred because they are:
    - More reliable and resistant to electrical noise.
    - Easier to distinguish than 3+ states.
    - Supported by existing mathematics (Boolean algebra).
3. Boolean Algebra
- Mathematical system dealing with True/False values.
- Created by George Boole in the 1800s.
- Unlike regular algebra (numbers), Boolean algebra uses logical operations.
Three Fundamental Operations
- NOT – flips the value
    - True → False
    - False → True
- AND – both inputs must be true.
- OR – at least one input must be true.
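The three operations map directly onto Python's built-in boolean operators; a quick sketch of their truth tables:

```python
# Truth tables for the three fundamental Boolean operations,
# using Python's built-in boolean operators.
for a in (False, True):
    print(f"NOT {a} = {not a}")

for a in (False, True):
    for b in (False, True):
        print(f"{a} AND {b} = {a and b}    {a} OR {b} = {a or b}")
```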
4. Transistors as Switches
- A transistor acts like an electrically controlled switch.
- It has:
    - A control wire (input)
    - Two electrodes (the current path)
- Turning the control on/off allows or blocks current.
- Logical behavior can be built physically from transistors.
5. Logic Gates
Logic gates are physical circuits that implement Boolean operations.
NOT Gate
- One input, one output.
- Output is the opposite of the input.
AND Gate
- Two inputs.
- Output is true only if both inputs are true.
- Built by placing transistors in series.
OR Gate
- Two inputs.
- Output is true if either input is true.
- Built by placing transistors in parallel.
6. XOR (Exclusive OR)
- Special logic gate.
- Output is true only when the inputs are different.
- False when both are true or both are false.
- Built using combinations of NOT, AND, and OR gates.
- Very important for arithmetic and processor design.
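One common way to build XOR from the three basic gates (there are several equivalent constructions) can be sketched as:

```python
def not_gate(a): return not a
def and_gate(a, b): return a and b
def or_gate(a, b): return a or b

def xor_gate(a, b):
    # One common construction: (A OR B) AND NOT (A AND B)
    # i.e. "at least one is true, but not both."
    return and_gate(or_gate(a, b), not_gate(and_gate(a, b)))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", xor_gate(a, b))
```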
7. Abstraction in Engineering
- Engineers rarely design at the transistor level.
- They use:
    - Logic gates
    - Larger components made from gates
    - Eventually CPUs and software
- Each layer hides the complexity below it.
8. Computation from Logic
- Even simple gates can evaluate complex logical statements.
- True/False signals become the first form of data representation.
- From these basics, full computers are built.
1. Everything Starts with Binary
- Computers use binary (base-2) instead of decimal (base-10).
- Binary has only two digits: 0 and 1.
- These correspond to electrical states:
    - 1 = on / true
    - 0 = off / false
- Larger numbers are represented by adding more binary digits, just like adding more decimal digits.
2. Positional Number Systems
Decimal (Base-10)
- Each column is a power of 10: ones, tens, hundreds, etc.
Binary (Base-2)
- Each column is a power of 2: 1, 2, 4, 8, 16, 32, 64, 128…
- Example: binary 101 = 1×4 + 0×2 + 1×1 = 5 (decimal)
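The column-by-column expansion above can be written out directly:

```python
def binary_to_decimal(bits: str) -> int:
    """Expand a binary string column by column: each position is a power of 2."""
    total = 0
    for i, digit in enumerate(reversed(bits)):
        total += int(digit) * 2 ** i
    return total

print(binary_to_decimal("101"))   # 1*4 + 0*2 + 1*1 = 5
print(int("101", 2))              # Python's built-in conversion does the same
```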
3. Bits and Bytes
- Bit = one binary digit (0 or 1).
- Byte = 8 bits.
- 8 bits can represent 256 values (2⁸) → range 0–255.
- Early “8-bit” systems were therefore limited in:
    - Colors
    - Sound variation
    - Numerical range
4. Data Size Units
- Kilobyte (KB) ≈ 1,000 bytes (or 1,024 in the binary convention)
- Megabyte (MB) ≈ 1 million bytes
- Gigabyte (GB) ≈ 1 billion bytes
- Terabyte (TB) ≈ 1 trillion bytes
(Binary definitions use powers of 2; decimal ones use powers of 10.)
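The two conventions diverge more the larger the unit; a quick comparison (using the KiB/MiB/GiB/TiB names for the power-of-2 variants):

```python
# Decimal (SI) units use powers of 10; binary units (KiB, MiB, GiB, TiB) use powers of 2.
KB, MB, GB, TB = 10**3, 10**6, 10**9, 10**12
KiB, MiB, GiB, TiB = 2**10, 2**20, 2**30, 2**40

print(KB, KiB)   # 1000 vs 1024 (a ~2.4% gap)
print(TB, TiB)   # the gap grows to ~10% at the terabyte scale
```

This gap is why a "1 TB" drive shows up as roughly 931 "GB" in tools that count in powers of 2.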
5. Word Size: 32-bit vs 64-bit
“Bitness” = how many bits a computer processes at once.
32-bit
- Max unsigned value ≈ 4.3 billion.
- Often uses 1 bit for the sign → roughly ±2 billion range.
64-bit
- Max unsigned value ≈ 18.4 quintillion (signed max ≈ 9.2 quintillion).
- Necessary for:
    - Large numbers
    - Huge memory addressing (GBs/TBs of RAM)
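The ranges above follow directly from the bit counts (assuming two's-complement signed integers, the usual convention):

```python
# Ranges implied by word size: n bits give 2**n patterns.
for bits in (8, 32, 64):
    unsigned_max = 2**bits - 1
    signed_min, signed_max = -(2**(bits - 1)), 2**(bits - 1) - 1
    print(f"{bits:>2}-bit: unsigned 0..{unsigned_max}, signed {signed_min}..{signed_max}")
```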
6. Representing Negative Numbers
- Simple approach (sign–magnitude): the first bit is the sign.
    - 0 = positive
    - 1 = negative
- The remaining bits store the magnitude.
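A sketch of this sign-bit scheme (note: most real CPUs actually use two's complement for integers, but the sign–magnitude idea described above is easy to show directly):

```python
def encode_sign_magnitude(value: int, bits: int = 8) -> str:
    """Sign-magnitude layout: first bit is the sign, the rest hold |value|."""
    assert abs(value) < 2 ** (bits - 1), "magnitude doesn't fit"
    sign = '1' if value < 0 else '0'
    return sign + format(abs(value), f'0{bits - 1}b')

print(encode_sign_magnitude(5))    # 00000101
print(encode_sign_magnitude(-5))   # 10000101 (only the sign bit differs)
```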
7. Floating Point Numbers (Decimals)
Used for non-whole numbers like 3.14 or 12.7.
IEEE 754 Standard
Stores numbers in a form similar to scientific notation:
- Sign bit – positive or negative
- Exponent – the scale (a power of 2 in IEEE 754; the example below uses 10 for familiarity)
- Significand (mantissa) – the actual digits
Example concept:
625.9 → 0.6259 × 10³
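The three fields can be inspected directly by reinterpreting a float's bytes as an integer (a sketch using Python's standard `struct` module; single precision is 1 sign bit, 8 exponent bits, 23 significand bits):

```python
import struct

def float32_bits(x: float) -> str:
    """Show the 32 bits of an IEEE 754 single-precision float:
    sign | exponent (8 bits) | significand (23 bits)."""
    (raw,) = struct.unpack('>I', struct.pack('>f', x))  # reinterpret bytes as uint32
    b = format(raw, '032b')
    return f"{b[0]} | {b[1:9]} | {b[9:]}"

print(float32_bits(3.14))
print(float32_bits(-3.14))  # same bits except the sign bit flips
```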
8. Representing Text with Numbers
ASCII (1963)
- 7-bit code → 128 characters.
- Includes:
    - A–Z, a–z
    - Digits 0–9
    - Punctuation
    - Control characters (like newline)
- Enabled interoperability between different computers.
- Limitation: mostly English-only.
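The character-to-number mapping is visible through Python's `ord()`/`chr()`:

```python
# Each ASCII character maps to a 7-bit number.
for ch in "Hi!":
    print(ch, ord(ch), format(ord(ch), '07b'))

print(chr(72))          # code 72 -> 'H'
print(chr(10) == "\n")  # code 10 is the newline control character
```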
9. Character Encoding Problems
- Countries extended ASCII differently.
- Opening foreign text often produced garbled symbols.
- In Japan this problem was called “mojibake” (scrambled text).
10. Unicode – Universal Text Encoding (1992)
- Designed to replace all national encodings.
- Supports 100,000+ characters from many languages.
- Includes:
    - Global scripts
    - Math symbols
    - Emoji
- Common implementations use 16 bits or more, allowing over a million possible code points.
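One code point, several byte-level encodings; decoding with the wrong one is exactly how mojibake arises:

```python
s = "é"
print(ord(s), hex(ord(s)))      # one Unicode code point: U+00E9
print(s.encode('utf-8'))        # two bytes in UTF-8
print(s.encode('utf-16-le'))    # one 16-bit unit in UTF-16

# Mojibake: interpret UTF-8 bytes as if they were Latin-1.
print(s.encode('utf-8').decode('latin-1'))  # garbled: 'Ã©'
```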
11. Beyond Numbers and Text
Other file formats also use binary:
- Images – pixel colors
- Audio (e.g., MP3) – sound waves
- Video – sequences of images + sound
- Operating systems & programs – all binary internally
Big Idea
Computers don’t just store numbers — their real power is computation: manipulating numbers in structured ways (add, subtract, compare, etc.).
The component that performs this work is the ALU — Arithmetic Logic Unit, often called the mathematical brain of the computer.
What the ALU Does
The ALU has two halves:
1. Arithmetic Unit
Handles number math:
- Addition
- Subtraction
- Incrementing (add 1)
- Sometimes multiplication/division (in advanced CPUs)
2. Logic Unit
Handles logical operations:
- AND
- OR
- NOT
- XOR
- Tests like “is this number zero?” or “is it negative?”
Building Addition from Logic Gates
Half Adder (1-bit addition)
Adds two single bits, A and B.
Outputs:
- Sum → produced by XOR
- Carry → produced by AND
Why a carry? Because 1 + 1 = 10 in binary: the 1 moves to the next column.
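Using Python's bitwise operators as stand-ins for the gates, the half adder is two lines:

```python
def half_adder(a: int, b: int) -> tuple:
    """Add two single bits; returns (sum, carry). Sum = XOR, carry = AND."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} -> sum={a ^ b}, carry={a & b}")
# 1 + 1 gives sum=0, carry=1: the binary '10' described above.
```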
Full Adder (adds three bits)
Adds:
- Bit A
- Bit B
- The carry-in from the previous column
Outputs:
- Sum
- Carry-out
A full adder is made from:
- Two half adders
- One OR gate
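The two-half-adders-plus-OR construction, written out (the half adder is repeated here so the sketch is self-contained):

```python
def half_adder(a, b):
    return a ^ b, a & b          # (sum, carry)

def full_adder(a, b, carry_in):
    """Two half adders plus an OR gate, as described above."""
    s1, c1 = half_adder(a, b)          # add A and B
    s2, c2 = half_adder(s1, carry_in)  # add the carry-in to that sum
    return s2, c1 | c2                 # either stage may produce the carry-out

print(full_adder(1, 1, 1))   # (1, 1): 1+1+1 = 11 in binary
```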
Multi-Bit Addition: Ripple Carry Adder
To add 8-bit numbers:
- The first column uses a half adder.
- All other columns use full adders.
- Carry bits “ripple” forward from right to left.
Overflow
If the final carry exceeds the available bits, the result is too large to store → overflow error.
Classic example: the original Pac‑Man stored the level number in 8 bits, so the game broke at level 256 (the famous kill screen).
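The ripple and the overflow flag can both be seen in a bit-by-bit sketch built from the full adder:

```python
def half_adder(a, b):
    return a ^ b, a & b

def full_adder(a, b, cin):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2

def ripple_carry_add(x: int, y: int, bits: int = 8):
    """Add two unsigned integers column by column, rippling the carry left."""
    result, carry = 0, 0
    for i in range(bits):                     # i = 0 is the rightmost column
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, bool(carry)                # (sum, overflow flag)

print(ripple_carry_add(200, 55))   # (255, False) – just fits in 8 bits
print(ripple_carry_add(200, 56))   # (0, True)    – overflow wraps around
```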
Speed Consideration
Ripple carry is simple but slightly slow because each carry waits for the previous one.
Modern CPUs often use Carry-Lookahead Adders to speed this up.
Multiplication & Division
- Simple processors have no dedicated multiply/divide circuits.
- They perform repeated addition/subtraction instead.
- More advanced CPUs include dedicated hardware for speed.
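The repeated-addition approach, which is why multiplication was slow on simple processors:

```python
def multiply_by_repeated_addition(a: int, b: int) -> int:
    """Multiply without a multiply circuit: add a to itself b times."""
    total = 0
    for _ in range(b):
        total += a
    return total

print(multiply_by_repeated_addition(6, 7))   # 42, after 7 additions
```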
Logic Unit Examples
- Checking whether a result equals 0 (zero test)
- Checking whether a result is negative
- Bitwise AND/OR/XOR operations
Operation Codes (Opcodes)
The ALU needs instructions telling it what to do.
Example (4-bit control signals):
- 1000 → Add
- 1100 → Subtract
These are called operation codes.
Flags (Status Outputs)
After a calculation, the ALU outputs tiny 1-bit signals called flags:
- Zero flag → result is 0
- Negative flag → result is negative
- Overflow flag → result too large to fit
These flags help the CPU make decisions (like comparisons and branching).
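A toy model tying opcodes and flags together (the 1000/1100 opcodes follow the example above; the other opcodes and the 8-bit width are illustrative assumptions, not from the notes):

```python
def alu(opcode: str, a: int, b: int, bits: int = 8):
    """Toy ALU: dispatch on a 4-bit opcode, return the result plus status flags."""
    ops = {
        '1000': a + b,   # Add (from the example above)
        '1100': a - b,   # Subtract (from the example above)
        '0001': a & b,   # illustrative: bitwise AND
        '0010': a | b,   # illustrative: bitwise OR
    }
    result = ops[opcode]
    flags = {
        'zero': result == 0,
        'negative': result < 0,
        'overflow': not (0 <= result < 2**bits),
    }
    return result, flags

result, flags = alu('1100', 5, 5)   # 5 - 5 sets the zero flag,
print(result, flags)                # which a CPU would use for "a == b" tests
```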
Historical Reference
A famous early ALU chip was the 74181, a 4-bit ALU from the 7400 TTL series (introduced by Texas Instruments):
- 4-bit only
- ~70 logic gates
- A major milestone in miniaturization
Core Insight
An ALU is just a very clever arrangement of logic gates.
From simple TRUE/FALSE switches, computers build:
- Arithmetic
- Logic
- Decision-making
- Ultimately, all computation.