January 31, 2007
What could we do with unlimited CPU, networking and/or storage? With infinite networking we could finally integrate local and Internet services into a seamless whole. We could stop thinking of computers as personal, isolated machines periodically communicating with others. Instead, your local machine provides a window onto a larger, massive multi-user machine. Amazon's services (EC2 and S3) are an example of near-infinite resources available to anyone. Verizon will eventually stick FiOS (fiber networking, 50Mbps download at consumer prices) everywhere, which means I can get access to Amazon-like services quickly. Also, multi-core processors, advanced graphics chips and the Cell processor as an add-on board give home users unbelievable amounts of power. It's time to rethink the old ways of doing things from those resource-constrained days in light of our modern period of abundant computing power.
For example, given infinite computing resources our concept of programming should change considerably. Instead of a static edit-compile-debug cycle, we should dynamically update a running image. Every time we modify code, the computer should run extensive analysis immediately to ensure correctness, including a battery of unit tests. Snapshots of the image should be saved for every change, including the flow of data through the system. That way you can compare previous versions with the newest changes in the debugger (i.e., x=5 in v2, x=2 in v1… so some recent change affected this variable). Seriously, debuggers haven't changed in 40+ years. Someone needs to make a mainstream version of OCaml's backwards debugger.
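To make the idea concrete, here's a minimal sketch of that per-change snapshot comparison. Everything here (`take_snapshot`, `diff_snapshots`, the watched variables) is hypothetical, just illustrating how a debugger could diff two saved images to flag which values drifted between versions:

```python
def take_snapshot(version, variables):
    """Capture a copy of the watched variables for one code version."""
    return {"version": version, "vars": dict(variables)}

def diff_snapshots(old, new):
    """Report variables whose values changed between two snapshots."""
    changes = {}
    for name in new["vars"]:
        before = old["vars"].get(name)
        after = new["vars"][name]
        if before != after:
            changes[name] = (before, after)
    return changes

v1 = take_snapshot("v1", {"x": 2, "y": 10})
v2 = take_snapshot("v2", {"x": 5, "y": 10})
print(diff_snapshots(v1, v2))  # {'x': (2, 5)}
```

With snapshots cheap to store, the debugger could run this diff automatically after every edit and point straight at the variables a change touched.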
Another possibility is photo retouching. Rather than ask users how they want to retouch a photo using obscure filter parameters, let the computer generate 4 pictures with different filter settings and ask the user which he prefers. Then the program can generate 4 more within the user's preferred range, and so on until the user settles on one. This requires a lot of CPU, but makes photo editing much easier for novice users.
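The loop is just a binary-search-like refinement over filter settings. A hypothetical sketch (a single numeric filter parameter stands in for real image filters, and `choose` stands in for the user clicking a preferred picture):

```python
import random

def generate_candidates(center, spread, n=4):
    """Produce n filter settings scattered around the current preference."""
    return [center + random.uniform(-spread, spread) for _ in range(n)]

def refine(choose, center=0.5, spread=0.5, rounds=3):
    """Narrow in on a preferred setting by repeatedly offering 4 choices."""
    for _ in range(rounds):
        candidates = generate_candidates(center, spread)
        center = choose(candidates)  # the user picks one of the 4 pictures
        spread /= 2                  # search a narrower range next round
    return center
```

Each round halves the search range, so a handful of clicks converges on a setting the user could never have found by tweaking raw filter parameters.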
With infinite disk space, I should be able to store my entire audio CD collection without lossy compression. CDs store audio at 1411 kbps compared to at most 320 kbps for MP3s. The only reason to compress audio is that we didn't have enough disk space to store everything, especially on small devices. No problem: store the full CD on giant hard drives so I have full audio quality on my stereo, but compress it on-the-fly so it fits on my iPod (which doesn't require full audio quality because it's just 2 small speakers). I think this will be important as more people plug their stereos into their media centers. I can't believe people pay $1 for lossy compressed audio from iTunes.
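The bitrate gap is easy to put in concrete terms. A quick back-of-the-envelope calculation for a 74-minute disc (1411.2 kbps is the exact CD audio rate: 44.1 kHz × 16 bits × 2 channels):

```python
def size_mb(bitrate_kbps, minutes):
    """Raw audio size in megabytes for a given bitrate and duration."""
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1_000_000

cd = size_mb(1411.2, 74)   # uncompressed CD audio, 74-minute disc
mp3 = size_mb(320, 74)     # highest-bitrate MP3 of the same disc
print(f"CD: {cd:.0f} MB, MP3: {mp3:.0f} MB")  # CD: 783 MB, MP3: 178 MB
```

So lossless storage costs roughly 4.4× the space of even the best MP3, which was painful on 2007 hard drives but is exactly the kind of cost that vanishes with near-infinite disk.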
With fast networking, online backups will become ubiquitous, and online digital libraries will be available for everything (for a fee) – audio, video, books. Online databases of information will be open and available for cross-database queries and analysis. Online games will be richer and more dynamic. More importantly, distributed operating systems will be more popular: when you run a program, it could run on any machine on your network, and processes can move around dynamically to minimize IPC traffic. Better yet, you can view all the memory on all the machines in your grid as a single memory system: imagine a grid GC.
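The core of such a distributed OS is a placement decision. A toy sketch of the idea (node names, load numbers, and the greedy least-loaded policy are all illustrative assumptions, not any real system's API):

```python
def pick_node(loads):
    """Return the machine in the grid with the lowest current load."""
    return min(loads, key=loads.get)

def schedule(tasks, loads):
    """Greedily place each task on the least-loaded node, updating loads."""
    placement = {}
    for task, cost in tasks.items():
        node = pick_node(loads)
        placement[task] = node
        loads[node] += cost
    return placement

loads = {"alpha": 0.2, "beta": 0.5}
print(schedule({"render": 0.4, "index": 0.1}, loads))
# {'render': 'alpha', 'index': 'beta'}
```

A real system would also migrate running processes as loads shift and weigh IPC traffic between tasks, not just CPU load, but the principle is the same: the user launches a program and the grid, not the local machine, decides where it lives.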
The point is that engineers should start assuming (nearly) infinite resources when they design systems, because doing so can radically change our assumptions about how things ought to be built.