
Infinite bits and the rule of product

Before I set foot on American soil, I grappled with one subject for an entire year at my home university - Discrete Mathematics. The material wasn't exceptionally challenging, but the professor's approach to teaching led most of the class to fail. As I recall, only a handful of students - perhaps three - passed the subject through the regular exams.

In this state of uncertainty, with scarce resources to aid our understanding, my classmates and I prepared for an exam in June that covered the entire subject. That might not sound like a big deal, but it was scheduled right after our second-semester exams. Amid all the stress, I managed to pass, owing a lot to Rosen's book, 'Discrete Mathematics and Its Applications', and Berkeley's online course.

This challenging journey taught me a lot, not just about combinatorics, number theory, and advanced graph theory, but also some philosophical concepts that continue to intrigue me. One of these was the principle of counting. The rule of sum, the rule of product, and the pigeonhole principle apply to numerous problems. Even though they're simple concepts, applying them to infinite sets allows us to demonstrate an array of ideas. I want to delve into how we perceive the future and our place in it, even though we're aware that something is already awaiting us there. Sounds intriguing, doesn't it?

Let's illustrate this with a basic example. I hope you'll excuse any minor inaccuracies; my main aim is to paint a broader picture.

Consider computer screens, which are essentially composed of millions of tiny dots, each filled with a color chosen from a vast array of hues. This is what allows a screen to display any conceivable image.

These tiny dots, or pixels, don't color themselves. The computer assigns a value to each one. Computers typically use the RGB color model for this: red, green, and blue channels each use 8 bits, giving integer values from 0 to 255. That yields $256 \times 256 \times 256 = \mathbf{16{,}777{,}216}$ possible colors. Each pixel on an LCD monitor displays color this way, through a combination of red, green, and blue subpixels. When the red channel is set to 0, the red subpixel is dark; when it's set to 255, the subpixel is at full brightness.
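The color count above is the rule of product in miniature: three independent choices of 256 values each. A one-line check (all numbers are from the text, nothing assumed):

```python
# Each RGB channel is an 8-bit value: 256 possible intensities.
# Three independent channels => multiply the options (rule of product).
values_per_channel = 2 ** 8   # 256
channels = 3                  # red, green, blue
total_colors = values_per_channel ** channels
print(total_colors)  # 16777216
```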

So far, so good. But what if we think about it from another angle? If we can control the device that picks one of those 16,777,216 values for each pixel (and we can), we could try assigning every single pixel on the screen every single value. This would produce every possible combination of colored dots on the screen, i.e., every image that can be displayed, whether from the past or the future. A picture of your grandparents hugging in the middle of Kakariko Village from Zelda OOT - it's in there. A picture of what you saw at age 5, walking to school for the first time - that's in there too, at least as closely as 16.7 million colors can approximate it, since the human eye perceives a much wider range of colors. Now think of a larger screen with smaller dots. See where I'm going with this?
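The enumeration idea can be sketched concretely, shrunk down to a toy screen. Here I assume a hypothetical 2-pixel "screen" with only 4 colors; the real thing is identical in spirit, just astronomically larger:

```python
from itertools import product

# Toy version of "every possible image": a 2-pixel screen where each
# pixel takes one of 4 colors. The real screen has 2,073,600 pixels
# and 16,777,216 colors per pixel -- same enumeration, hopeless size.
colors = range(4)
pixels = 2
all_images = list(product(colors, repeat=pixels))
print(len(all_images))   # 16, i.e., 4 ** 2
print(all_images[:3])    # [(0, 0), (0, 1), (0, 2)]
```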

Let's extend this concept to different scenarios. Imagine constructing an infinite binary file. Even if it's not infinite, it will suffice for our example. If we had a 64MB binary file and we could write all combinations of 0s and 1s, then we could create any program that could fit within that size. If size is not a constraint (which it typically isn't), anything you can imagine could be represented there. Firefox 20000? There it is. Windows LongHorn? There it is. Ubuntu Sacred Unicorn? Absolutely.
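The same trick works for bits. A minimal sketch, assuming a hypothetical 4-bit "file" instead of 64 MB (a 64 MB file would need $64 \times 1024 \times 1024 \times 8$ bits, far beyond any loop):

```python
from itertools import product

# Every possible "binary file" of n bits is just every n-tuple of 0s and 1s.
n = 4
files = [''.join(map(str, bits)) for bits in product((0, 1), repeat=n)]
print(len(files))  # 16, i.e., 2 ** 4
print(files[:3])   # ['0000', '0001', '0010']
```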

If the binary file example doesn't resonate with you, think about sets of characters distributed across a vast amount of space in a source file. This would result in all possible programs in all possible languages. Quite fascinating, right?

So, why aren't we channeling all our efforts in this direction? To understand this, let's first grasp the rule of product.

Consider ordering a pizza. Initially, you choose the type of crust: thin or deep dish (2 options). Next, you decide on the topping: cheese, pepperoni, or sausage (3 options). Applying the rule of product, we realize there are $2 \times 3 = 6$ possible pizza combinations.
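The pizza count, spelled out with Python's `itertools.product` (the options are the ones from the text):

```python
from itertools import product

# Rule of product: 2 crusts x 3 toppings = 6 pizzas.
crusts = ['thin', 'deep dish']
toppings = ['cheese', 'pepperoni', 'sausage']
pizzas = list(product(crusts, toppings))
print(len(pizzas))  # 6
print(pizzas[0])    # ('thin', 'cheese')
```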

Discrete math illuminated the vast potentialities we're dealing with. Let's illustrate how the previous examples would work according to the rule of product.

The screen

Take a 1080p resolution screen, the latest trend, which has 1080 rows of pixels with 1920 pixels in each row, totaling 2,073,600 pixels. Each of these pixels can display any one of 16,777,216 colors. I like to liken this problem to placing colored golf balls into an array of cups.

When we decide to fill these pixels, we have 16,777,216 options at each step, since colors may repeat. This results in a staggering total of $16{,}777{,}216^{2{,}073{,}600}$ possible images - a number with roughly 15 million decimal digits. For a sense of scale, the observable universe is estimated to contain only around $10^{80}$ atoms.
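We can gauge the size of that number without ever materializing it, using logarithms: the digit count of $b^e$ is $\lfloor e \log_{10} b \rfloor + 1$. The screen dimensions are the ones from the text:

```python
import math

# Decimal digits of 16,777,216 ** 2,073,600, computed without building
# the full number: digits = floor(exponent * log10(base)) + 1.
pixels = 1920 * 1080        # 2,073,600
colors = 2 ** 24            # 16,777,216
digits = math.floor(pixels * math.log10(colors)) + 1
print(digits)  # about 15 million digits
```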

The binary file

Now, consider an old 1GB USB stick, which contains 8,589,934,592 bits. Here, we only have two options to fill the spaces: 1 or 0. Despite the limited options, the complexity is immense. We can repeat options, resulting in $2^{8{,}589{,}934{,}592}$ combinations. To put this in perspective, the internet reportedly contained around 45 billion pages as of December 2011.
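The same logarithm trick sizes up the USB stick (1 GiB, as in the text):

```python
import math

# Decimal digits of 2 ** 8,589,934,592 without computing it.
bits = 8 * 1024 ** 3        # 8,589,934,592 bits in 1 GiB
digits = math.floor(bits * math.log10(2)) + 1
print(digits)  # about 2.6 billion digits
```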

$2^{9000}$ alone is a number with 2,710 decimal digits, so you get the picture.
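As a quick sanity check, Python's arbitrary-precision integers handle $2^{9000}$ directly:

```python
# 2 ** 9000 fits comfortably in a Python int; count its decimal digits.
n = 2 ** 9000
print(len(str(n)))  # 2710
```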

The primary challenges lie in generating this massive amount of data, storing it, and distinguishing useful combinations from the irrelevant ones. While I'm not an expert in this area, I'm sure some of you can shed light on these issues.

To delve deeper into these operations, you might want to explore infinitary combinatorics, a field that studies these kinds of problems.