Friction

The unit of knowledge is not the answer but the question you can't yet answer. Understanding arrives through repeated failure.

Topology
Struggle: Why is a coffee cup topologically equivalent to a donut?

Dec 23, 2025 (Failed)

Something about holes? They both have one hole, so they're the same. But that can't be right, because a straw has two holes...

Dec 25, 2025 (Partial)

It's about continuous deformation without cutting or gluing. The handle of the mug corresponds to the hole in the donut. But I still can't explain why genus matters more than shape...
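A LaTeX sketch of the invariant behind that "genus" remark, assuming closed orientable surfaces (the Euler-characteristic framing is my addition, not part of the note above):

    \chi = 2 - 2g \quad \text{for a closed orientable surface of genus } g
    \chi_{\text{mug}} = \chi_{\text{torus}} = 0 \;\Rightarrow\; g = 1 \text{ for both}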

Music Theory
Struggle: What makes a tritone feel unstable?

Dec 20, 2025 (Failed)

It's called the devil's interval because it sounds bad? That's not an explanation. It's six semitones exactly...
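A quick way to see what "six semitones exactly" comes to in equal temperament, sketched in Python (the comparison with simple consonant ratios is my own illustration):

    # In 12-tone equal temperament, each semitone scales frequency by 2**(1/12).
    tritone = 2 ** (6 / 12)       # six semitones up from the root
    print(round(tritone, 4))      # 1.4142 -- the square root of 2, an irrational ratio

    # Nearby consonant intervals correspond to simple whole-number ratios.
    perfect_fourth = 4 / 3        # ~1.3333 (five semitones)
    perfect_fifth = 3 / 2         # 1.5 (seven semitones)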

Physics
Struggle: Why is it E=mc² and not E=mc³ or some other exponent?

Dec 18, 2025 (Partial)

Dimensional analysis: energy is kg⋅m²/s², and c² gives m²/s². So mc² works dimensionally. But why is c the relevant velocity? Because it's invariant under Lorentz transformations...
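The dimensional check from that note, written out as a small LaTeX sketch:

    [E] = \mathrm{kg \cdot m^2/s^2}
    [mc^2] = \mathrm{kg} \cdot (\mathrm{m/s})^2 = \mathrm{kg \cdot m^2/s^2}
    [mc^3] = \mathrm{kg \cdot m^3/s^3} \neq [E]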

Good questions expose gaps. If you can answer one easily, it's not worth tracking.

Dissolve

You understand something when you can make a child understand it. Knowledge dissolves into language.

7 Dissolving, 12 Dissolved, 3 Crystallized

Fourier Transform (35% dissolved)
Why planes fly (70% dissolved)
The Monty Hall Problem (90% dissolved)
Recursion (55% dissolved)

Each concept carries the same two prompts: "Explain this to a curious 12-year-old" and "← Can you make it simpler?"

+ Add concept — what do you think you understand?

Tension

Understanding is not in nodes but in edges. The relationship between two ideas is the knowledge—not the ideas themselves.

Nodes: Entropy, Information, Probability, Compression, Randomness

Your Articulated Connections

Entropy ↔ Information

"Shannon entropy measures uncertainty, which is equivalent to information content—the less predictable, the more information conveyed"

Probability ↔ Compression

"Optimal compression assigns shorter codes to more probable events—compression IS probability modeling"

Randomness ↔ Entropy

"Maximum entropy means maximum randomness—true randomness is incompressible"

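A small Python sketch of those three edges together, using only the standard library (the example strings are mine, chosen for illustration):

    import math
    from collections import Counter

    def entropy(text):
        """Shannon entropy in bits per symbol: expected surprisal -log2 p(x)."""
        counts = Counter(text)
        total = len(text)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    predictable = "aaaaaaaab"        # one symbol dominates: low uncertainty
    balanced = "abcdefgh"            # all symbols equally likely: maximum uncertainty

    # Entropy <-> Information: the less predictable string carries more bits per symbol.
    print(round(entropy(predictable), 2))   # ~0.50 bits/symbol
    print(round(entropy(balanced), 2))      # 3.0 bits/symbol

    # Probability <-> Compression: an optimal code gives each symbol about -log2 p(x)
    # bits, so frequent symbols get short codes and the string compresses to roughly
    # entropy * length bits.
    # Randomness <-> Entropy: a uniform (maximally random) distribution over 8 symbols
    # hits the ceiling of log2(8) = 3 bits/symbol and cannot be compressed further.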

Articulate a new connection