Roughly a year ago, I made some noises on this blog about wanting to learn R. Not surprisingly, I didn’t do it.

A year later I’m a government scientist with some statistics to do, and I’m once again thinking of learning me some R. In the interim, I’ve received an email assuring me “you could get up and running with it within a day, I think. Master it in a week or two”. So I download the package — it’s free! — install it, and boot it up. I’m looking at a command line console labelled the “GUI” (ha ha), with the following help text:

“Type ‘demo()’ for some demos”

Demos! Perfect! Let’s see some concrete examples of how to do statistics in R-land! So I type demo() into the “GUI” prompt, and receive the following output:

Demos in package ‘base’:

is.things: Explore some properties of R objects and is.FOO() functions. Not for newbies!
recursion: Using recursion for adaptive integration
scoping: An illustration of lexical scoping.

Demos in package ‘graphics’:

Hershey: Tables of the characters in the Hershey vector fonts
Japanese: Tables of the Japanese characters in the Hershey vector fonts
graphics: A show of some of R’s graphics capabilities
image: The image-like graphics builtins of R
persp: Extended persp() examples
plotmath: Examples of the use of mathematics annotation

Demos in package ‘stats’:

glm.vr: Some glm() examples from V&R with several predictors
lm.glm: Some linear and generalized linear modelling examples from ‘An Introduction to Statistical Modelling’ by Annette Dobson
nlm: Nonlinear least-squares using nlm()
smooth: ‘Visualize’ steps in Tukey’s smoothers

Use ‘demo(package = .packages(all.available = TRUE))’ to list the demos in all *available* packages.

Tables of the characters in the Hershey vector fonts? demo(package = .packages(all.available = TRUE))? Some of the ‘stats’ demos sounded like they might make sense, so I tried to run them, but I couldn’t figure out how. I love the idea of open source bare-knuckle computing. I wish I loved it in practice.


Forgive me for saying so, but I know a thing or two about enhancing photographs. I’ve put some time in as a satellite and aerial imagery analyst, and as a hobby photographer I make no apologies about Photoshop. I grok histogram response curves, level shifting, global and local contrast, interpolation, headroom, falloff, edge detection, hue isolation and saturation expansion. I know you almost always zoom out (!) to see a pattern, but if you want to get into pixel-peeping, I know a little about decomposing a pixel into constituent spectral signatures, k-means clustering and machine-learning classification, and all the lovely supervised and unsupervised pixel binning techniques. If I give myself an hour to study up, I can even keep the Minimum Noise Transformation straight in my head for 15 minutes. And the N-Dimensional Visualizer speaks for itself.

There is an enormous amount you can do to make a shape or pattern or shade of interest stand out in an image, by tweaking the colour or contrast response, or by exploiting extra parts of the light spectrum to help the computer find hidden colours. You can fuzz together noisy patterns to see the shapes behind them, or bin together multiple pixels to lighten up the darkness. Just about the only thing you can’t do is create detail where there wasn’t any to begin with.
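The contrast-response tweaking is the workhorse of the bunch, and it’s simple enough to sketch. A minimal example in numpy (the percentile cutoffs and the synthetic “image” are invented for illustration — this is a generic percentile stretch, not any particular tool’s algorithm):

```python
import numpy as np

def stretch_contrast(img, low_pct=2, high_pct=98):
    """Percentile-based contrast stretch: remap the middle of the
    histogram to the full 0-255 range, clipping the extreme tails.
    Faint shapes pop out, but no new detail is created."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = (img.astype(float) - lo) / (hi - lo)
    return (np.clip(stretched, 0, 1) * 255).astype(np.uint8)

# A flat, murky synthetic image: every pixel squeezed into 100-139,
# so to the eye it is a nearly uniform grey.
rng = np.random.default_rng(0)
flat = rng.integers(100, 140, size=(64, 64)).astype(np.uint8)

out = stretch_contrast(flat)  # same pixels, spread over the full range
```

Everything that was faintly different before is now strongly different; everything that was identical stays identical. That’s the whole game.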

So I get grumpy every time I watch a movie with an image analysis scene, and the one and only thing they always always do is the one damn thing you can’t.

dunk3d made a montage:

Two they left out:

Bladerunner (the original?)

and of course Super Troopers

…(although it’s true that imagery analysts wear state trooper uniforms to operate their computer terminals.)

Google Massively Automates Tropical Deforestation Detection

Landcover change analysis has been an active area of research in the remote sensing community for many years. The idea is to make computational protocols and algorithms that take a couple of digital images collected by satellites or airplanes, turn them into landcover maps, layer them on top of each other, and pick out the places where the landcover type has changed. The best protocols are the most precise, the fastest, and the ones that can chew on multiple images recorded under different conditions. One of the favourite applications of landcover change analysis has been deforestation detection. A particularly popular target for deforestation analysis is the tropical rainforests, which are being chainsawed down at rates which are almost as difficult to comprehend as it is to judge exactly how bad the effects of their removal will be on biological diversity, planetary ecosystem functioning and climate stability.
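The hard science is in turning raw imagery into reliable landcover maps; the “layer and compare” step at the end is conceptually simple. A toy sketch in numpy (the class codes and the two tiny “classified maps” here are invented for illustration — real protocols work on billions of pixels and many more classes):

```python
import numpy as np

# Two toy landcover maps of the same area, classified from images taken
# years apart. Codes (invented for this sketch): 1 = forest, 2 = cleared.
map_2005 = np.array([[1, 1, 1],
                     [1, 1, 1],
                     [1, 2, 2]])
map_2010 = np.array([[1, 1, 1],
                     [2, 2, 1],
                     [2, 2, 2]])

# Layer the maps and flag every pixel whose landcover class changed.
changed = map_2005 != map_2010

# Deforestation specifically: pixels that went forest -> cleared.
deforested = (map_2005 == 1) & (map_2010 == 2)

n_changed = int(changed.sum())
n_deforested = int(deforested.sum())
```

Multiply by the ground area each pixel covers and you have a deforestation rate — which is exactly the kind of number that makes this politically interesting.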

Google has now gotten itself into the environmental remote sensing game, but in a Google-esque way: massively, ubiquitously, computationally intensively, plausibly benignly, and with probable long-term financial benefits. They are now running a program to vacuum up satellite imagery and apply landcover change detection optimized for spotting deforestation, for the time being targeted at the Amazon basin. The public doesn’t currently get access to the results, but presumably that access will be rolled out once Google et al are confident in the system. I have to hand it to Google: they are technically careful, but politically aggressive. Amazon deforestation is (or should still be) a very political topic.

The particular landcover change algorithms they are using are apparently the direct product of Greg Asner’s group at the Carnegie Institution for Science and Carlos Souza at Imazon. To signal my belief in the importance of this project I’m not going to make a joke about Dr. Asner, as would normally be required by my background in the Ustin Mafia. (AsnerLAB!)

From the Google Blog:

“We decided to find out, by working with Greg and Carlos to re-implement their software online, on top of a prototype platform we’ve built that gives them easy access to terabytes of satellite imagery and thousands of computers in our data centers.”

That’s an interesting comment in its own right. Landcover/landuse change analysis algorithms presumably require a reasonably general-purpose computing environment for implementation. The fact that they could be run “on top of a prototype platform … that gives them easy access to … computers in our data centers” suggests that Google has created some kind of more-or-less general-purpose abstraction layer that can invoke their unprecedented computing and data resources.

They back that comment up in the bullet points:

“Ease of use and lower costs: An online platform that offers easy access to data, scientific algorithms and computation horsepower from any web browser can dramatically lower the cost and complexity for tropical nations to monitor their forests.”

Is Google signaling their development of a commercial supercomputing cloud, à la Amazon S3? Based on the further marketing-speak in the bullets that follow that claim, I would say absolutely yes. This is a test project and a demo for that business. You heard it here first, folks.

Mongabay points out that it’s not just tropical forests that are quietly disappearing, and that Canada and some other developed countries don’t do any kind of good job of aggregating or publicly mapping their own enormous deforestation. I wonder: when will Google point its detection program at British Columbia’s endlessly expanding network of just-out-of-sight-of-the-highway clearcuts? And what facts and figures will become readily accessible when it does?


Mongabay also suggests that LIDAR might be involved in this particular process of detecting landcover change, but that wouldn’t be the case. Light Detection and Ranging is commonly used in characterizing forest canopy, but it’s still a plane-based imaging technique, and as such not appropriate for Google’s world-scale ambitions. We still don’t have a credible hyperspectral satellite, and we’re nowhere close to having a LIDAR satellite that can shoot reflecting lasers at all places on the surface of the earth. Although if we did have a satellite that shot reflecting lasers at all places on the surface of the earth, I somehow wouldn’t be surprised if Google was responsible.

Which leads me to the point in the Google-related post where I confess my nervousness around GOOG taking on yet another service — environmental change mapping — that should probably be handled by a democratically directed, publicly accountable organization rather than a publicly traded for-profit corporation. And this is the point in the post where I admit that they are taking on that function first and/or well.

Retro Computing Jam Session

Have you ever had that feeling that somewhere out there, people are jamming on a Vic-20, a PET and a Commodore 64, possibly in some kind of classroom setting?

The middle computer would be Petsynth’s first (I assume) public performance.

Artificial Intelligence in Flash

This guy appears to be doing some work on network-based artificial intelligence… in Flash. I wouldn’t have thought Flash would be a first choice of programming language if you’re into experimental computation. But you gotta admit, it sure does look pretty.

Maybe Netlogo should hire a designer to gussy up their applets. Right after they get around to going open source.

“The finest in 1-bit sound on the Commodore PET”

And now the Petsynth project has a website: petsynth.org

You have to record the program to an audio cassette tape to load it onto your Pet. But this home copying is fully legal: Petsynth has gone open source.



Broken Happiness Machines Are Go

A couple of weeks ago I mentioned Petsynth, Chiron Bramberger’s novel synthesizer software for the Commodore Pet. But Chiron doesn’t just write music on the Pet, he also blasts 8-bit rhythmic weirdness from a pipe-organ arrangement of Amigas and Ataris and god knows what else.

And last week, Chiron pressed play on the Broken Happiness Machines website, from which he will distribute some of those grooves. (And hopefully that software!)

photo by Chiron Bramberger

An Adware Programmer on the Resilient Goodness of Humans

From the end of this extraordinary interview with a retired writer of programs to screw up your computer irretrievably:

S: Do you think that in our society we delude our­selves into thinking we have more privacy than we really do?

M: Oh, absolutely. If you think about it, when I use a credit card, the security model is the same as that of handing you my wallet and saying, “Take out whatever money you think you want, and then give it back.”

S: …and yet it seems to be working.

M: Most things don’t have to be perfect. In particular, things involving human interactions don’t have to be perfect, because groups of humans have all these self-regulations built in. If you and I have an agreement and you screwed me over badly, you’ve always got in the back of your mind the nagging worry that I’m going to show up on your doorstep with a club and kill you. Because of that, people don’t tend to screw each other too much, right? At least, they try not to. One danger, perhaps, of moving towards an algorithmically driven society is that the algorithms aren’t scared of us showing up and beating them up. The algorithms will do whatever it is that they are designed to do. But mostly I’m not too worried about that.

Your Computer Screen May Need To Be Colour Calibrated

Your computer screen may need to be colour calibrated. Mine sure did. I bought a new laptop which seems to have a nice enough screen, but I could tell by looking that it suffered from a blue cast. It’s specifically a Dell XPS 13 (yes, a Dell, forgive me), but blue-ishness seems to be a common characteristic of laptop screens. I’ve noticed it on several, and I recall photography spaz Ken Rockwell had the same problem. It’s not an issue if you aren’t doing photography or graphic design, but if you are it is. Not knowing the actual colour of the image you’re making is a real pain if you’re planning on printing it or showing it on somebody else’s screen.

Calibrating a screen has a hard part and an easy part and a hard part. The first hard part is swallowing the idea of paying non-trivial sums of money for an obscure hardware thing that will sit briefly on your monitor before being forgotten in a desk for months or years. The easy part is actually running the procedure once you’ve installed the associated software and have plugged the device into your computer. The second hard part is knowing what to do with the calibration profile file that procedure will produce. The calibrator I use installs a little program that loads up with Windows and automatically applies the profile file when the system boots, which seems easy. But I notice that program sometimes fails, so I have to override it and use the built-in Windows color management controls anyway. It’s less of a pain in Vista than it was in XP, but it’s still a pain. Mac and Linux may have better systems, I’m not sure. In any case it means digging through control panels and pondering what will happen when you plug multiple different monitors into your computer, if you’re into that sort of thing.

A little Q&A:

What is a colour calibrator? What is a colour profile? A calibrator is a little USB puck that has a basic spectrometer built into it. The associated software produces an image on your screen that cycles through some colours. The spectrometer puck precisely records what colour your screen is actually generating. The software then compares those output values to what it was feeding into the screen and makes a little file with numbers the operating system can use to correct the difference. With that correction applied, the result should be a neutral colour response.
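In spirit, that correction amounts to inverting the screen’s measured response curve. A sketch in numpy (the measured numbers are invented, and a real profile is an ICC file with per-channel curves, not this toy lookup):

```python
import numpy as np

# What the software asked the screen to display versus what the puck
# actually measured, for one channel. Values are made up: this screen
# pushes blue harder than requested through the midtones.
requested = np.array([0.0, 64.0, 128.0, 192.0, 255.0])
measured  = np.array([0.0, 80.0, 150.0, 210.0, 255.0])

# Build a correction lookup: for each level we *want* to see, find the
# input level that actually produces it, by inverting the measured curve.
levels = np.arange(256)
correction = np.interp(levels, measured, requested)

# Sanity check: feed the corrected value back through the screen's
# (simulated) response; it should land on the level we wanted.
corrected_input = correction[128]
displayed = np.interp(corrected_input, requested, measured)
```

The operating system applies that lookup to everything sent to the screen, which is why the profile has to be loaded at boot rather than living inside any one application.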

Why don’t screen manufacturers test their screens and give out profiles? I don’t know. My old external monitor actually did come with a factory-made profile, but I only discovered it by accident while digging around the utilities CD. Maybe because every screen coming off the production line will have slightly different colour response and they can’t be bothered testing each screen individually. Maybe because they figure it’s too much to expect owners to know how to apply the profile file within their particular operating system. Maybe because they figure nobody cares. Maybe nobody does.

Is there any such thing as truly neutral colour response? Yes. But if you start to think too much about that it gets complicated.

If everybody’s screen is blue-​​ish, shouldn’t you keep yours blue-​​ish so you’ll know what your photos will look like on theirs? That’s depressing, next question.

Shouldn’t you do this for your printer too? Yes. But that gets complicated. And expensive.

Will the printer at the photo store you send your prints to have a neutral colour response? Maybe, but they will hopefully have it calibrated if it doesn’t. But that gets complicated. For now, just worry about your screen.

PetSynth: A Superior Synthesizer for the Commodore Pet

Today I got to try PetSynth v0.006, by Chiron Bramberger. Chiron owns several Commodore Pet personal computers, and was disappointed by the quality of the music-making software available for them, so he wrote his own. He has plans to release it to the Commodore community, but currently it’s stored on a 5.25″ disk lodged in his dual disk drive.

In addition to producing lovely beeps, Chiron figured out how to drive distortion in the Commodore’s built-in music hardware, to produce vibrato, and to generate something very much like a drum tone. Hooked up to a pair of pot knobs for bending the signal, the system can produce some mean 8-bit grooves.

Chiron also builds guitar effects pedals from the recycled innards of modems, grafted into lovely hand-painted acrylic boxes with the shells of hard drives for backs, but that’s a separate project.
