fast computers are good for something, maybe

whilst on the subject of image analysis…

I get to use a reasonably fast computer. It runs somewhere around 3GHz and carries about a gig of RAM, which is fine, I’m more than happy with that. I’ve noticed, though, that half of the time it doesn’t matter how frickin fast your box is, windows operations are still tiresome. Opening windows, bringing up menus, creating new folders, it stutters, it staggers, it acts like it was just trying to remember something and you interrupted it…

There are exceptions though. I ran an image classification process today that I had spent most of the week setting up. The computer had to take each pixel in a satellite image and decide mathematically which of the landcover spectral signatures I had created the pixel most likely belonged to. It uses a fairly fat maximum likelihood algorithm that goes beyond simply measuring the minimum distance between the pixel and the signature mean in spectral x/y space, and works on competing probability wells and such impressive stuff. I’ve seen diagrams.
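If you wanted to sketch the general idea in code, it would go something like this. This is not the actual software’s routine, just the usual shape of a maximum likelihood classifier, with made-up names, assuming each signature provides a per-band mean vector and covariance matrix estimated from its training pixels:

import numpy as np

def ml_classify(pixels, means, covs):
    """Assign each pixel to the signature with the highest Gaussian log-likelihood.

    pixels: (n_pixels, n_bands) array of spectral values
    means:  list of (n_bands,) signature mean vectors
    covs:   list of (n_bands, n_bands) signature covariance matrices
    """
    scores = []
    for mu, cov in zip(means, covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        diff = pixels - mu
        # Mahalanobis term: how far each pixel sits from this signature,
        # weighted by how spread out the signature is in each band
        maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
        scores.append(-0.5 * (logdet + maha))  # constant term dropped
    return np.argmax(np.stack(scores, axis=1), axis=1)

The covariance inverse and log-determinant in there are what make it so much heavier than a plain minimum distance check, which only needs the means.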

Used to be, I’m told, people didn’t much use maximum likelihood because you were leaving the damn thing running over the weekend even if you just used a minimum distance algorithm. So I set it up, pressed the start button, and gathered my stuff to go to lunch early. By the time I had my things in my hands and was ready to walk away, the damn thing was at 20%. Damn it! I’m doing something important! I want a satisfyingly long crunch time! I want a long lunch!

Now I’m refining my signatures. I check the sub-signature generated by each of the individual areas of cover type I have eyeball-identified, and ask the computer to guess how it would classify the full image if that sub-signature were the whole thing. It uses a parallelepiped limits algorithm for this, basically drawing a square box of arbitrary dimensions around the sub-signature mean in spectral x/y space and letting any pixel that ends up inside the box into the club. In other words, it’s another cheater mechanism meant to lower the math processing demands, so you can do it in real time instead of waiting each time.
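Again as a sketch with invented names, rather than whatever the program actually does internally, the box test is about this simple, assuming the box is defined by a per-band half-width around the sub-signature mean:

import numpy as np

def parallelepiped_mask(pixels, mean, half_width):
    """Return True for pixels falling inside the box on every band.

    pixels:     (n_pixels, n_bands) array of spectral values
    mean:       (n_bands,) sub-signature mean
    half_width: (n_bands,) half the box size in each band
    """
    lower = mean - half_width
    upper = mean + half_width
    # a pixel gets into the club only if it sits inside the limits on all bands
    return np.all((pixels >= lower) & (pixels <= upper), axis=1)

No matrix inversion, no probabilities, just a couple of comparisons per band, which is why it can keep up with you in real time.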

Well, I wish it just used the full-blown maximum likelihood, because even though my CPU is getting almost completely used up by iTunes as it rips some MP3s, it does the classification guesses without blinking. I think this cutting-edge program has some three-year-old assumptions about computing power built in. Some aspects of computing haven’t effectively sped up all that much in the last decade, but when you come across one of those rare moments where raw mathematical computational power makes a difference, man, things sure are different these days.
