This article is from the CD-Recordable FAQ, by Andy McFadden (firstname.lastname@example.org) with numerous contributions by others.
Computers store things in "bits", which can be either 0 or 1. To store
something in a computer, it must be converted to a series of bits. The
process is called "digitizing".
You've probably seen an egg slicer. If you haven't, picture a device
that looks like a book resting flat on a table. Instead of pages it has
an egg-shaped depression, and instead of a front cover it has a frame
with thin wires stretched across it vertically at regular intervals.
You raise the lid, insert the egg, and when you press the lid down the
wires cut the egg into thin, round slices.
It usually helps to hard-boil the egg first.
Suppose we want to digitize an egg so we can make a nifty 3D model and
display it on a computer. Our slicer has 9 wires, so we could end up
with as many as 10 pieces. We place the egg into the device and slice it.
Now we measure the height of each piece (assume the pieces are perfectly
round), using calipers to find the diameter and rounding to the nearest
centimeter. Each slice could measure anywhere from 0cm (the egg was
short, so there was no slice) to 5cm (the width of our slicer).
When we're done, we write down the ten rounded heights as a list of
numbers. Your eggs may vary. Storing a number from 0 to 5 requires 3 digital bits,
so if we know that measurements are always in centimeters, we can store
the height of each slice in 3 bits. We have ten numbers to store, so we
can hold our egg in a mere 30 bits!
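The bit arithmetic above can be sketched in a few lines of Python (the
helper name `bits_needed` is our own invention, not from the article):

```python
import math

# Bits needed to store one value in the range 0..max_value.
def bits_needed(max_value):
    return math.ceil(math.log2(max_value + 1))

slices = 10                  # a 9-wire slicer yields up to 10 pieces
per_slice = bits_needed(5)   # heights run 0..5 cm, so 3 bits each
total = slices * per_slice
print(per_slice, total)      # 3 bits per slice, 30 bits for the egg
```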
When we try to display our digitized egg on a computer screen, however, we
discover a problem. The image doesn't look like a smooth egg. Instead,
it looks like a bunch of stair steps in a vaguely egg-shaped pattern.
The sizes aren't right either: our original egg was actually 3.4cm at its
widest point, but we had to round it down to 3cm.
Suppose we improve our measurements down to the nearest millimeter. Now,
when we have to round off the measurements, the round-off error is much
smaller. The results look much better, but holding a value from 0 to
50 requires 6 digital bits instead of 3, so we've doubled our storage
requirements to 60 bits. What's more, the image still looks stair-steppy.
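The round-off behavior at both resolutions can be checked with a small
Python sketch (`quantize` is a hypothetical helper; 3.4cm is the egg's
widest point from the example above):

```python
# Round a true height (in cm) to the nearest step: 1.0 for whole
# centimeters, 0.1 for millimeters.
def quantize(value, step):
    return round(value / step) * step

true_height = 3.4                    # the egg's widest point, in cm
cm = quantize(true_height, 1.0)      # 3.0 -- off by 0.4 cm
mm = quantize(true_height, 0.1)      # ~3.4 -- error shrinks tenfold
print(abs(true_height - cm), abs(true_height - mm))
```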
The stairs happen because each slice has a single height value. When we go
from slice #7 to slice #8, we abruptly jump from 3cm to 2cm. Our
recreated egg doesn't look smooth because we didn't really capture the
original, in which each slice varied in height from one edge to the other.
Our digitization could only capture the average height of each slice.
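A toy Python illustration of that averaging, using a made-up height
profile across one slice:

```python
# Within one slice the true height varies from edge to edge, but the
# digitized version keeps only a single number. Profile values invented.
profile = [3.3, 3.1, 2.8, 2.5, 2.2]   # made-up heights across one slice
average = sum(profile) / len(profile)
print(round(average, 2))              # one number replaces the whole curve
```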
There are a couple of ways to improve this. The first is to guess at
the shape of the original egg, and draw smooth curves based on the data
we have. This is called "interpolation". The other approach is to buy a
new egg slicer with wires that are closer together, so we have more slices,
reducing the size of the jump from one slice to the next. This is called
"increasing the sampling rate". If you double the number of slices,
you double the number of bits required to hold the digital version.
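The interpolation approach can be sketched in Python; the slice heights
below are invented sample values, not the article's actual egg, and
simple linear interpolation is just one way to "guess at the shape":

```python
# Linear interpolation: insert (factor - 1) evenly spaced guesses
# between each pair of adjacent slice heights.
def interpolate(samples, factor):
    out = []
    for a, b in zip(samples, samples[1:]):
        for i in range(factor):
            out.append(a + (b - a) * i / factor)
    out.append(samples[-1])
    return out

heights = [0, 1, 2, 3, 3, 4, 3, 2, 1, 0]   # invented slice heights, in cm
smooth = interpolate(heights, 4)
print(len(smooth))                         # 9 gaps * 4 + 1 = 37 points
```

Raising the sampling rate instead would mean collecting more real
measurements rather than guessing the in-between values.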
If you slice the egg finely and measure it accurately, you can get a
nearly perfect representation of the original. For example, if we create
slices that are one molecule apart, and measure the height to the nearest
molecule, we will have an extremely accurate picture, not to mention a
seriously huge digital representation. The tricky part about digitizing
something is to choose the height and thickness of the slices such that
the likeness is very good but the digital size is small.
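That tradeoff can be made concrete with a rough Python sketch, under
invented assumptions (a 5cm egg, nearest-step rounding, and a naive
bits-per-slice count):

```python
import math

# For a 5 cm egg: finer steps shrink the worst-case round-off error
# but raise the bit count. Both formulas are naive approximations.
def cost_and_error(num_slices, step):
    levels = round(5 / step)          # distinct height values per slice
    bits = num_slices * math.ceil(math.log2(levels + 1))
    return bits, step / 2             # worst error of nearest-step rounding

print(cost_and_error(10, 1.0))    # (30, 0.5)   -- tiny but blocky
print(cost_and_error(100, 0.1))   # (600, 0.05) -- smoother, 20x the bits
```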