Monday, April 21, 2008

One good reason webapps are better

When the iPhone came out last year, people whined and moaned because there was no third-party native application support on the iPhone. Now that there is, I realize why Apple chose to push developers down the webapp route instead of toward native applications.

At first you might believe it was about application security and keeping the device virus-free, since the iPhone runs an almost full version of Safari.

But it's actually about connectivity! Because webapps are inherently connected to the Internet, they make it easier for developers to design information-rich applications that always present a current view of things. File storage, processing-intensive database queries, and formatting can all happen on the server side, which leaves the phone's processor free to handle the presentation of that information. And what a beautiful presentation it is.
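To make the division of labor concrete, here's a minimal sketch of a thin, webapp-style client in Python. The endpoint and the response shape are made up purely for illustration; the point is that the server has already done the expensive query and formatting, so the client just fetches and presents.

```python
import json
import urllib.request

# Hypothetical endpoint: the server runs the heavy database query and
# returns results that are already formatted for display.
SEARCH_URL = "http://example.com/api/search?q=shanghai"

def render_search():
    # The expensive work (storage, queries, formatting) happened server-side.
    with urllib.request.urlopen(SEARCH_URL) as response:
        results = json.load(response)

    # The client's only job is presentation, which keeps the local
    # processor free for drawing the interface.
    for item in results:
        print(f"{item['title']}: {item['snippet']}")

if __name__ == "__main__":
    render_search()
```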

Multi Core Madness

So AMD just popped the seal on its 12-core Shanghai chip. I'll believe it when I see it. While we wait, here's some interesting speculation about what the hell you can actually do with 12 cores.

First of all, what good are 12 cores if you can't use them? Most software is not designed with multiple cores in mind. My assertion is that there should be a layer between the OS and the processor that divvies up the processing chores among all available cores, either automatically or, if the developer so chooses, programmatically.
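To show what I mean, here's a minimal sketch of that divvying layer in Python, using the standard multiprocessing module. This is my own illustration of the idea, not anything AMD or an OS vendor actually ships: the pool asks the OS how many cores exist and splits one job across all of them, so the same code scales from two cores to twelve without modification.

```python
import os
from multiprocessing import Pool

def sum_of_squares(chunk):
    # The work done on one core; any CPU-heavy task could go here.
    lo, hi = chunk
    return sum(n * n for n in range(lo, hi))

if __name__ == "__main__":
    cores = os.cpu_count() or 1  # 2 today, 12 tomorrow -- the code doesn't care

    # Divvy one big job into one chunk per available core.
    step = 1_000_000 // cores
    chunks = [(i * step, (i + 1) * step) for i in range(cores)]

    with Pool(processes=cores) as pool:
        total = sum(pool.map(sum_of_squares, chunks))
    print(f"{total} computed across {cores} cores")
```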

Secondly, why stop at twelve and say "tada!" when there's a possibility that next month your competitor is going to one-up you? If you showed me a chip with 128 cores, then I'd be really impressed. Hell, even 64 cores would be impressive, but I find twelve kinda weak, considering that AMD has just built a multi-chip bus that can process instructions not only across multiple cores on a single die, but across multiple dies on a single chip. Maybe they are finding that too many cores cause a bottleneck in some other component, like the RAM. In that case I'd be right behind them in supporting the 12-core standard, but then I want to see them do something about that bottleneck so they can open it up, like pouring some R&D cash into better memory.
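Here's a back-of-envelope way to see that bottleneck. The numbers below are invented purely for illustration, not the specs of any real chip: if every core streams data at full tilt, the memory bus saturates long before you run out of cores.

```python
# Illustrative, invented numbers -- not measurements of any real chip.
PER_CORE_DEMAND_GBPS = 2.0   # bandwidth one busy core wants
MEMORY_BUS_GBPS = 12.0       # total bandwidth the memory bus can deliver

cores_until_saturation = MEMORY_BUS_GBPS / PER_CORE_DEMAND_GBPS
print(f"The bus saturates at about {cores_until_saturation:.0f} busy cores;")
print("every core beyond that just waits on RAM.")
```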

With twelve cores (assuming each core on its own were capable of running an OS with several applications), you could designate each core to handle a different part of the computing workload. Hell, you could even give each application its own core; I seldom have more than ten apps open at a time.
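On Linux you can already approximate the one-app-one-core idea from userspace. Here's a minimal sketch using Python's os.sched_setaffinity (Linux-only); the PIDs and the app-to-core mapping are hypothetical.

```python
import os

# Hypothetical mapping of running applications (by PID) to dedicated cores.
app_to_core = {
    4242: 0,   # e.g., the browser gets core 0
    4243: 1,   # e.g., the mail client gets core 1
}

for pid, core in app_to_core.items():
    # Restrict the process so the scheduler only runs it on its own core.
    os.sched_setaffinity(pid, {core})
    print(f"pinned pid {pid} to core {core}")
```

A real OS-level scheme would also have to handle apps that spawn threads across cores, but the basic pinning mechanism already exists today.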

I feel that operating systems as we know them should change fundamentally to accept this new paradigm of multiple cores. Yes, I'm saying we should risk breaking backward compatibility for the sake of an impressive leap forward. Build a compatibility layer, or virtualize for backward compatibility if you have to, but by all means build new software based on new technology and don't look back! Carry over your old data but leave your applications behind. We all know there's no way to move boldly forward if we are clutching our legacy applications.

Rather than real progress, people would rather have slow progress as long as nothing stops working. That is the old way of thinking, and it will die along with those who think that way. You can build a new system and test it while you are still supporting the old one. You don't have to immediately drop what you're doing and cause downtime because certain critical apps won't work. If people are smart about it, they can roll out new systems in parallel with old ones so there's no downtime if one system fails.

First! 1st! ONE! 1!

Anyone who regularly visits tech gadget blogs will know that there's a constant presence of ignorant trolls waiting in the wings to get their first-comment fix. I'd like to treat them like the dirty flies they are: lure them all onto a giant sheet of poisonous flypaper and watch them squirm their way to certain death.