I was sitting at home in my underwear eating Cheetos® and watching Cartoon Network this past Saturday, when something dawned on me. I recalled a conversation I had with my best friend about 8 years ago about the Mac’s 256, Thousands, and Millions of colors versus Windows’ 8, 16, and 24-bit color. He was saying that he was excited because he’d pushed his Windows 95 computer to display 16.7 million colors.
It was Saturday when the obvious struck me. Mac OS’s “Millions” of colors is equal to 16,777,216 colors, which is exactly Windows’ 24-bit color, because 24 bits give 2²⁴ = 16,777,216 possible values — the largest of them, 111111111111111111111111 in binary, is 16,777,215, and zero counts as a color too. Duh!
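That arithmetic is easy to sanity-check. Here’s a quick sketch in Python (not something from the original post — just the same math spelled out):

```python
# 24 bits of color: count the values and find the largest one.
bits = 24
num_values = 2 ** bits           # every distinct 24-bit pattern, zero included
max_value = int("1" * bits, 2)   # binary 111111111111111111111111

print(num_values)   # 16777216 -- Mac OS "Millions" of colors
print(max_value)    # 16777215 -- the largest single 24-bit value
```

The off-by-one is the whole point: 24 ones in binary is 16,777,215, but because 0 is also a valid value, a 24-bit display shows 16,777,216 colors.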
For the next few hours, my brain was stuck in binary. I licked the cheese powder off my fingers, turned off the TV, put some pants on, and sat down at my computer. A few months ago, I wrote a Number Conversion Object that can convert between decimal (base 10) and binary (base 2). I decided to put it to use.
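I won’t reproduce the Number Conversion Object here, but the idea behind it is simple. A minimal sketch of the same two conversions in Python might look like this (function names are mine, not from the original object):

```python
def decimal_to_binary(n: int) -> str:
    """Convert a non-negative decimal integer to a binary string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))   # remainder is the next bit, low to high
        n //= 2
    return "".join(reversed(digits))

def binary_to_decimal(bits: str) -> int:
    """Convert a binary string back to a decimal integer."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)  # shift left one place, add the new bit
    return value

print(decimal_to_binary(16777216))                    # 1 followed by 24 zeroes
print(binary_to_decimal("111111111111111111111111"))  # 16777215
```

Repeated division by 2 gives the bits from least to most significant, which is why the list gets reversed at the end.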
Current video game systems are 128-bit. That translates to 2¹²⁸ ≈ 3.402823669209385 × 10³⁸ possible values. That’s a freakin’ big number. Then I began thinking about encryption. Most browsers have 128-bit encryption. Some security applications have 256-bit encryption. I haven’t heard of 512-bit encryption, but I’m sure it’s out there.
- 256-bit: 2²⁵⁶ ≈ 1.157920892373162 × 10⁷⁷
- 512-bit: 2⁵¹² ≈ 1.3407807929942597 × 10¹⁵⁴
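Python’s integers are arbitrary-precision, so you can reproduce those figures directly rather than trusting a conversion object (this snippet is my illustration, not the original code):

```python
# How many distinct values can an n-bit key take?
for bits in (128, 256, 512):
    count = 2 ** bits  # number of distinct n-bit values
    print(f"{bits}-bit: {count:.15e}  ({len(str(count))} digits)")
```

The 512-bit count comes out as a 155-digit number, which matches 1.34… × 10¹⁵⁴.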
WOW! Those numbers are huge! The largest 512-bit number is written in binary as 512 “1’s” right next to each other. And 10¹⁵⁴ is a 1 with 154 zeroes behind it. A billion only has 9 zeroes behind it.
I feel like I’ve discovered the end of the universe or something. I’m having a tough time comprehending how big that number actually is.
As I said earlier, it’s not very useful, but it is kinda cool.