Alvy Ray Smith (2021) ・A Biography of the Pixel

Rina Chen’s living notebook on digital craft and design.


Smith, Alvy Ray. A Biography of the Pixel. The MIT Press, 2021.

Waves, computations, and pixels underlie all the apparent complexity of Digital Light.

you can pass back and forth between waves and bits—between the analog and digital worlds. The idea dates back only to 1933 when the Russian Vladimir Kotelnikov established it as we know it today.

The Sampling Theorem, articulated by Kotelnikov in 1933, states that a bandlimited analog signal can be represented exactly by discrete samples, provided the sampling rate exceeds twice the signal’s highest frequency. Digital data can capture smooth, continuous information without retaining all the points in between.
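
To see the theorem in action, here is a minimal sketch (my code, not the book’s): sample a signal bandlimited to B Hz at a rate above 2B, then rebuild the continuous curve from the samples alone with Whittaker-Shannon (sinc) interpolation.

```python
import numpy as np

# Shannon/Kotelnikov reconstruction sketch: a signal bandlimited to
# B Hz can be rebuilt exactly from samples taken at rate fs > 2B.
B, fs = 3.0, 8.0                         # highest frequency; sample rate
T = 1.0 / fs

def signal(t):
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * B * t)

n = np.arange(-64, 65)                   # sample indices (finite window)
samples = signal(n * T)

def reconstruct(t):
    # Whittaker-Shannon interpolation: a sum of shifted sinc functions.
    return np.sum(samples * np.sinc((t - n * T) / T))

t_test = np.linspace(-1, 1, 201)
err = max(abs(reconstruct(t) - signal(t)) for t in t_test)
print(f"max reconstruction error on [-1, 1]: {err:.2e}")
# Tiny but not zero: the sum is truncated to 129 samples
# instead of the infinitely many the theorem assumes.
```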

In short: mathematical idea → computational model → digital encoding → physical display. Every digital image is the end result of a long chain of mathematical concepts becoming physical reality.

I’m questioning it! The sample is not the original. When you record a bird’s song and play the recording back, is that a good reason to get rid of the bird, or to move somewhere with no birds? The “life” of a digital bird is cut off from the analog world. The digital bird never came back.

How can we make the two states coexist, or oscillate back and forth freely?

A pixel’s true nature is that of a discrete sample, fundamental to how digital media represent the world, not the little physical square of common perception.
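
A tiny numerical way to feel the difference (my own sketch): treat a scanline of pixel values as point samples, then reconstruct between them two ways. Nearest-neighbor reproduces the familiar “little squares” picture; linear interpolation stands in for the smoother reconstruction filters the book has in mind (the ideal one, per the sampling theorem, is the sinc).

```python
import numpy as np

# Eight pixel values along one scanline, treated as point samples.
pixels = np.array([0.0, 0.2, 0.9, 1.0, 0.6, 0.3, 0.1, 0.0])
x = np.arange(len(pixels))                 # sample positions
xf = np.linspace(0, len(pixels) - 1, 29)   # finer positions to reconstruct

nearest = pixels[np.round(xf).astype(int)]  # the "pixels are squares" view
linear = np.interp(xf, x, pixels)           # a smoother reconstruction

for a, b, c in zip(xf, nearest, linear):
    print(f"x={a:4.2f}  square={b:.2f}  filtered={c:.2f}")
```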

it appears that we can throw away an astounding amount of information—an infinite amount, in fact—without losing anything

What a computer does at each step is trivial. A typical step has this form: take some bits from here, twiddle them a little, and put the resulting bits there. A twiddle might be nothing more than a swap of each 0 for a 1 and vice versa, or a move of each 0 or 1 a single position to the right.
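
Those two twiddles are easy to write out in Python; a quick sketch:

```python
# The two "twiddles" from the passage, on an 8-bit value.
bits = 0b10110100

flipped = bits ^ 0b11111111      # swap every 0 for a 1 and vice versa
shifted = bits >> 1              # move each bit one position to the right

print(f"{bits:08b} -> flip  -> {flipped:08b}")
print(f"{bits:08b} -> shift -> {shifted:08b}")
```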

A computer, like a piano, can implement countless sequences of its simple basic steps. It can therefore implement countless complex and meaningful processes with them—its music, so to speak—using only 2 bits rather than 88 piano keys.

Although each step that a computer takes is completely predictable, we can’t always know how a sequence of those steps will unfold. Determined in the small doesn’t mean predetermined in the large. A computation must be determined—there must be some answer about what a sequence of completely deterministic steps will do—but we can’t always know that answer.
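
A classic illustration (mine, not the book’s) is the Collatz process: one trivial, fully deterministic step, yet no one can predict in general how long a trajectory will run before it reaches 1, or even prove that it always does.

```python
# "Determined in the small doesn't mean predetermined in the large":
# each step below is completely deterministic, but trajectory lengths
# are wildly irregular and unproven to be finite for every start.

def collatz_steps(n: int) -> int:
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1   # the entire "machine"
        steps += 1
    return steps

for n in (6, 7, 27):
    print(n, "->", collatz_steps(n), "steps")   # 6->8, 7->16, 27->111
```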

What happens when the number of steps in an algorithm gets large, the number of loops multiplies, their level of nesting deepens, and the conditional branches ramify vastly? By asking questions about systematic processes at the turn of the twentieth century, mathematicians started to feel their way toward the world of computation, but they didn’t know it yet. They hadn’t yet glimpsed the twin glories of Malleability and Amplification, or the mysteries associated with each.

Derivations are sequences of such steps, with operations as simple as the *or* rule applied at every step. What we’ve described so far is a systematic way to derive one statement after another in such a way that truth, or falsity, is maintained at each step. If you start with true statements, a derivation will always yield a true statement.

The true statements they start with are called axioms. For example, it’s an axiom that a number is equal to itself. That’s obviously always true. The awesome glory of math is that such derivations can lead to totally unexpected results, even though every step is simple and obvious.
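
Here is a toy sketch of such a derivation (my own example, not the book’s): start from the axiom that a number equals itself, then repeatedly apply one truth-preserving rule, or-introduction, which says that if A is true then “A or B” is true for any B, even a false one.

```python
# A toy formal system: one axiom, one truth-preserving inference rule.
axioms = {"1 = 1"}                    # "a number is equal to itself"
adjuncts = ["2 = 3", "pigs fly"]      # arbitrary statements, may be false

def or_introduce(statement, other):
    """From a true `statement`, derive the true "(statement) or (other)"."""
    return f"({statement}) or ({other})"

derived = set(axioms)
frontier = list(axioms)
for _ in range(2):                    # two rounds of derivation
    frontier = [or_introduce(s, b) for s in frontier for b in adjuncts]
    derived.update(frontier)

for s in sorted(derived, key=len):    # every statement printed here is true
    print(s)
```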

Turing found that simple logic is undecidable. It was such an unexpected and disturbing result that even the great Hilbert didn’t believe it at first.

But it didn’t capture their tedium. The tedium comes—as you’ve discovered if you followed the instructions above—from the unceasing repetition of the tiny steps, the constant worry of error, and the effort of keeping track of where you were—during a tea break, say. Nor did it capture their boredom.

Turing defined each of his machines to have only four things: a one-dimensional tape divided into squares, a finite set of symbols for it, a tape scanner with a finite number of states, and an “instruction table” that tells you what to do with each combination of scanner state and tape symbol. In our running example, the tape scanner is the business card with a hole in it, the six symbols are the digits 1 through 5 and a blank, and the four states are the four orientations of the card. The four sets of rules form the instruction table. And one other thing. The tape is as long as necessary in either direction. There’s always another square if you need it. There’s always more scratch paper. Perhaps you can understand now why Newman couldn’t believe at first that such a simple device—a toy it seems—led to a profound mathematical result. But from this simple “machine” concept came all of computing.
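
The whole concept fits in a few lines of code. Below is a minimal generic Turing machine simulator (my sketch, not the book’s business-card machine): a tape, a symbol set, a head with finitely many states, and an instruction table. The sample table flips every bit on the tape, then halts.

```python
from collections import defaultdict

def run(table, tape, state="start", blank="_", max_steps=10_000):
    """table: (state, symbol) -> (write, move, next_state); move is -1, 0, +1."""
    cells = defaultdict(lambda: blank, enumerate(tape))  # tape grows as needed
    head, steps = 0, 0
    while state != "halt" and steps < max_steps:
        write, move, state = table[(state, cells[head])]
        cells[head] = write
        head += move
        steps += 1
    lo, hi = min(cells), max(cells)
    return "".join(cells[i] for i in range(lo, hi + 1)).strip(blank)

# Instruction table: scan right, swapping 0s and 1s; halt on the first blank.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run(flip, "10110"))  # -> 01001
```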

- Formal systems: https://www.youtube.com/watch?v=PLVCscCY4xI
  ![[Pasted image 20260405141706.png]]
  ![[Pasted image 20260405142650.png]]
- The complexity of algorithms: https://www.youtube.com/watch?v=7TycxwFmdB0
  ![[Pasted image 20260405142919.png]]
  ![[Pasted image 20260405143630.png]]
- The halting problem: https://www.youtube.com/watch?v=t37GQgUPa6k
  ![[Pasted image 20260405144236.png]]
  ![[Pasted image 20260405144353.png]]