Graffiti for the Skies

NASA’s Jet Propulsion Lab routinely uses Cuprite, Nevada as a testing and demonstration site for their plane-mounted hyperspectral sensor AVIRIS.

I’ve been working my way through the back-catalog of AVIRIS imagery, looking for a study site. I was kinda surprised when I saw this:

cuprite graffiti

I guess somebody noticed all those planes and decided to give NASA something to look at.

Full-sized image (a black-and-white preview of the actual data set) is here.

A New Hope for NASA’s Soul

Recently my extended family was chatting about “nasty corporate entities you have worked for”. I have quite a list. Logging companies, oil companies, militaries. It ain’t pretty. My mom pitched out NASA as one of mine.

Now I don’t quite agree with that. It’s far too true that the vast majority of NASA’s resources are committed to cube-esque self-perpetuating boondoggles on the one hand, and boyish triumphalist pre-obsolete politicized megafollies on the other. And that’s bad. Some people are advocating a new Apollo program to save our planet from epic climatic perturbation, and instead what we have is a more literal Apollo program to stand around on Mars a bit. Kind of a waste of our treasure, and a bit off the point as a new spiritual uniter of our civilization. Yes, OK. But.

NASA also does some of the best work on the planet on gathering data about the planet. Planetary and regional and landscape-scale data that is sine qua non for understanding and responding to planetary and regional and landscape-scale problems, which increasingly exist. The fact that we can think globally and quantitatively at the same time, at all, is largely thanks to the boffins and administrators at the National Aeronautics and Space Administration and their fancy ballcaps.

The biggest and best and most mature of NASA’s earth-observing programs is the Landsat series of satellites and their accompanying infrastructure and data processing and distribution programs. Anybody who works in satellite monitoring of the earth’s functions has used Landsat data, and probably mostly Landsat data. Around the time that NASA announced its Bush-mandated refocus on putting people on Mars, they also quietly announced that the ailing Landsat program was going to be left to die in orbit, expiring as the satellites failed one by one.

Well, as of this month, they’ve backed off that position. They are once again going to allocate some small fraction of their budget to useful programs. According to this report, a resolve exists to make the “moderate-resolution imaging” program a permanent and stable thing. This is good news. So yay NASA! And yay me for taking their money (sort of, occasionally)! And yay Landsat! May you ride the horizon a few decades longer.

An Agent-Based Modeling Textbook, Free in Alpha

José M. Vidal is writing a textbook called “Fundamentals of MultiAgent Systems”, and he’s posted an alpha version on his site, with a call for comments. It’s here:

Fundamentals of Multiagent Systems Textbook

The link to the .pdf seems a bit flaky, but if you try a couple of times it should come through.

Apparently the book is based on his experiences running a grad course in agent based systems. Cool.

He also runs this user-blog on multi-agent systems:

www.multiagent.com

which works on a simple mechanism: if you tag a weblink in del.icio.us with a certain tag (for:jmvidal), that link and your accompanying text will show up on the blog. Neat.

Proposal: A Remote-Sensor/Ecological-System Auto-Matcher

There are a lot of remote-sensing sensors. Static lists of sensor characteristics are traded, posted, debated, and hoarded. The number of satellite-based and plane-based sensors is likely in the hundreds. The existence of purpose-built (i.e. homebrew) sensors is difficult to measure, but in my own experience there are a number of such platforms extant or in active development. Matching systems to these many sensors is typically done on an ad hoc and historically contingent basis. What other people in your lab have studied, and how they studied it, probably has more to do with what you will study and how you will study it than any optimal match between sensor and system. This process could be facilitated with some automation.

I propose that I, or preferably somebody else, create an online, updatable, dynamic sensor-scale exploration tool. It would function something like an electronic organism-identification key.

The tool would present a list of scales and a list of corresponding sensors. By narrowing down a given scale (say, selecting 10 to 30 m as the spatial resolution), the user could narrow down the list of corresponding sensors. Ideally, the tool would allow alteration of any scale choice without voiding choices made subsequently, allowing the user to iteratively and dynamically examine how their options change as they give themselves more or less flexibility around their requirements. As an ecologist, I would adjust the sliders to match what I thought were the dynamic scales of the system. How big are the individuals? If I want to do individual-based ecology, that’s my maximum resolution. How big is the system they compose (e.g., the forest)? That’s the image swath I need.
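At its core, such a tool is just a filter over a table of sensor characteristics. A minimal sketch in Python follows; the sensor names are real platforms, but the specific numbers and field names here are illustrative stand-ins, not an authoritative characteristics list:

```python
# Hypothetical sensor records. A real tool would load these from a
# maintained, community-updatable list rather than hard-coding them.
SENSORS = [
    {"name": "Landsat 7 ETM+", "resolution_m": 30, "swath_km": 185},
    {"name": "ASTER VNIR", "resolution_m": 15, "swath_km": 60},
    {"name": "MODIS", "resolution_m": 250, "swath_km": 2330},
    {"name": "AVIRIS (high altitude)", "resolution_m": 20, "swath_km": 11},
    {"name": "IKONOS pan", "resolution_m": 1, "swath_km": 11},
]

def match_sensors(sensors, max_resolution_m=None, min_swath_km=None):
    """Return the names of sensors whose scales fit the system's scales.

    max_resolution_m: a pixel must be no coarser than the individuals studied.
    min_swath_km: one image should cover the whole system (e.g. the forest).
    """
    hits = []
    for s in sensors:
        if max_resolution_m is not None and s["resolution_m"] > max_resolution_m:
            continue
        if min_swath_km is not None and s["swath_km"] < min_swath_km:
            continue
        hits.append(s["name"])
    return hits

# 10-30 m pixels, and a swath wide enough for a ~100 km landscape:
print(match_sensors(SENSORS, max_resolution_m=30, min_swath_km=100))
```

The point of the interactive version is that each criterion is an independent filter, so loosening one (dragging a slider) re-runs the whole match rather than locking in earlier choices.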

Although the tool would reproduce a process commonly undertaken manually, automation would likely widen the scope of the selection process beyond the ‘bounded rationality’ enforced by the limits of the human brain’s ability to hold or read multiple sensor and system scales at the same time.

The tool could be produced as a client-side AJAX application or a conventional Java applet and hosted online for accessibility. For a related example, see Alex’s Remote Sensing Imagery Summary Table, which is at least ostensibly an updated, accessible, GNU-licensed (though non-user-configurable) list of sensor characteristics that has gained some currency in RS circles.

open source viable in GIS?

A friend posted to a remote sensing mailing list to say that he was trying to “create a standalone program for manipulating shapefiles” and did anyone know of any existing code libraries for such things? Several of the responses he got pointed him in the direction of the existing open source GIS community. For example, opensourcegis.org.
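For a sense of what “manipulating shapefiles” involves at the byte level: the format’s 100-byte main file header is simple enough to parse with nothing but a standard library. A sketch in Python, with a synthetic header invented purely for demonstration; real work should lean on the existing libraries catalogued at places like opensourcegis.org rather than hand-rolling a parser:

```python
import struct

# A few shape-type codes from the ESRI shapefile spec
SHAPE_TYPES = {0: "Null", 1: "Point", 3: "PolyLine", 5: "Polygon"}

def read_shp_header(data: bytes) -> dict:
    """Parse the 100-byte main file header of an ESRI shapefile (.shp)."""
    if len(data) < 100:
        raise ValueError("shapefile header is 100 bytes")
    file_code, = struct.unpack(">i", data[0:4])        # big-endian magic
    if file_code != 9994:
        raise ValueError("not a shapefile (bad magic number)")
    file_length, = struct.unpack(">i", data[24:28])     # length in 16-bit words
    version, shape_type = struct.unpack("<ii", data[28:36])  # little-endian
    xmin, ymin, xmax, ymax = struct.unpack("<4d", data[36:68])
    return {
        "version": version,
        "shape_type": SHAPE_TYPES.get(shape_type, str(shape_type)),
        "bbox": (xmin, ymin, xmax, ymax),
        "length_bytes": file_length * 2,
    }

# Build a synthetic header for a Point shapefile to show the parser working.
# The bounding box is an arbitrary illustrative rectangle.
header = struct.pack(">i", 9994) + b"\x00" * 20 + struct.pack(">i", 50)
header += struct.pack("<ii", 1000, 1)                  # version, Point type
header += struct.pack("<4d", -123.1, 49.2, -122.9, 49.3)
header += struct.pack("<4d", 0, 0, 0, 0)               # Z and M ranges

info = read_shp_header(header)
print(info["shape_type"], info["bbox"])
```

The records after the header (and the companion .shx index and .dbf attribute table) are where the real complexity lives, which is exactly why the mailing-list answers pointed at existing libraries.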

GIS is Geographic Information Systems. It’s computer mapping software, basically, but it usually isn’t very basic. The existing commercial tools are complex, tricky, legacy-ridden, and quite powerful. There is one world leader in the production of the software, ESRI with their ArcGIS family, and it is deeply entrenched throughout industry and research.

Open source is software collaboratively developed by dispersed individuals who contribute their code more or less freely. In return they take advantage of the work of others, who collectively build software that is cheap or free, and open to transformation and growth.

The idea of an actual, functioning open source GIS community is a painful sort of hope. The cost of a single-seat license for ESRI ArcGIS is USD $1,500. That’s a tall barrier to anyone outside of either a) a big institution or b) the western world. GIS, mapping, may sound banal but it’s not. Using spatial data is bottom-line-key to policy and planning in a profound portfolio of infrastructure areas: environmental management, conservation, social and urban development, health. The possibility that every NGO and developing-country school and government organization could use powerful GIS tools is, pardon my geek, a thrilling one.

The reason the hope is painful is that the open-source community has some very high walls to climb if they want to succeed, so high I’m doubtful. The barriers to any new GIS software ecology gaining use are, I think, these:

  • interoperability

    It’s getting easier to open files from one program in another, but even when you can open them, working with them glitch-free across programs is a major issue. Often the process of importing and translating without losing functionality can be time-consuming and frustrating.

  • transmission of knowledge

    While the underlying theories of cartography and GIS are mostly the same for all different flavours of software, much of the knowledge of a skilled GIS operator is tied up in the working details of specific software. People learn those details from courses or colleagues. Such training is precious and hard to come by, and usually people take what they can get, rather than choosing their software platform and seeking out training for that platform. Once you’ve learned a platform, it’s more than most people are willing to do to retrain themselves.

  • cost

    Nobody wants to buy software to do the same thing more than once.

Open source software can generally dodge the cost issue, since it’s typically free and isn’t vulnerable to most of the “total cost of ownership” questions that arguably affect open-source operating systems, but the other two barriers remain substantial for any new GIS software, definitely including open source options.

Hanging over all this is the question of quality. Even if all of the “unfair” barriers to entry are overcome, new software will still have to face the “fair” one: is it good enough to use? For all the complaints I have about ESRI software (and I assure you, I have many) I recognize that creating a program that does so many different complex operations for so many different types and skill levels of users is not a simple thing. The amount of code in the ArcGIS suite must be staggering, and the magnitude of man-hours of interface development is beyond guessing. If an open-source alternative is to compete on features, which ultimately it must, it will require the development of hundreds of analysis and manipulation processes. It seems to me that this is potentially a greater programming challenge than any open source project I am aware of, with the single exception of a full operating system. OpenOffice appears straightforward in comparison (and was jump-started by the Sun Office code, for which there is no parallel option in GIS world), Mozilla/Netscape ditto.

I deeply hope there’s some group of crazy GIS programmers out there with the technical capacity and the heart to take on this challenge. In my short career with GIS/remote sensing I have time and again come across situations where I wished there was a license-free GIS package that small groups, environmental groups, developing-world groups, could use. Information is power in law, in politics, in science. There is a lot more free data out there than there was: GLCF with its back-catalog satellite imagery data, SRTM with all that topography, old, uncopyrighted Soviet maps waiting to become useful again, free fresh satellite data for the taking from NASA, and dozens of small and medium labs turning out their intermediary products. But turning raw and intermediary data into final product needs that software tool. I think it’s safe to say that the overwhelming majority of potential users of GIS simply don’t have the technical knowledge and the computing access to make those products when they need them. Technical know-how is a whole other quagmire, but if someone could make an open source GIS package, the benefits would be substantial and long-lasting. I wish I were more optimistic about the chances of that happening.

Google Earth: un-upping north

I’ve been playing with Google Earth, the free 3D earth visualizer that you can download from Google. Three things strike me. First, I’m surprised that more people aren’t excited by this program. Using it is such a striking experience that I would have expected a meme-ish propagation of interest in it. Given that there is some capacity for user upload of points of interest and commentary into the “Keyhole community” space, I also would have expected more interest in user-repurposing. Perhaps it isn’t an open enough platform to encourage data-mashing on the scale of Google Maps, which seems to spawn off a new user side project daily. Regardless, just as a beautiful toy, I’m surprised more folk aren’t obsessing about it.

Second, I’m intrigued that some versions of the program allow importation of some standard GIS data formats for overlay: ERDAS Imagine and TIFF image files for raster, and shapefiles for vector. The program has no analysis tools of course, so it has no pretensions of being a real GIS platform, but as a data visualizer it could be hard to beat. How many times have I seen wary biologists converted to a belief in spatial computing by seeing their study site spun about in 3D? Well, okay, three that I can remember offhand, but that’s a lot. Google Earth’s visualizations are wildly compelling in their intuitiveness and scalability: you can see scenes as small as a barnyard placed concretely in the context of the relief of a valley or the expanse of a continent in a few smooth shifts of a mouse. Plug some data into this thing and it could make a major difference doing the hard sell for a project proposal. Everybody likes colourful maps, and these are some colourful maps.
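Beyond those GIS formats, the simplest way to get your own points into Google Earth is to write a KML file, the XML format the program reads natively. A minimal sketch; the placemark name, coordinates, and namespace URL here are illustrative, so check them against the current KML documentation before relying on them:

```python
# Write a minimal KML file marking a single (hypothetical) study site.
# KML coordinates are longitude,latitude[,altitude] -- note the order.
kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://earth.google.com/kml/2.0">
  <Placemark>
    <name>Study site</name>
    <description>Plot 12: mixed conifer stand</description>
    <Point>
      <coordinates>-123.05,49.25,0</coordinates>
    </Point>
  </Placemark>
</kml>
"""

with open("study_site.kml", "w") as f:
    f.write(kml)

print("wrote study_site.kml")
```

Double-click the resulting file (or open it from within Google Earth) and the globe spins to your site, which is exactly the hard-sell moment described above.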

And what if Google did decide to go into GIS analysis? They are the information people. Could be interesting.

Finally, it takes me at least 10 minutes of playing with the globe before I feel comfortable with north not being anywhere close to up. After a while I get used to the idea of north not having a specific direction, but I really have to overcome a mental barrier to do it. Once that barrier is past, it opens up some striking new vistas and ways of looking at the earth and its forms. But it definitely doesn’t come naturally.

pictures from the surface of another planet

If this doesn’t amaze you in some way, you suck. The Spirit Martian lander/rover is sending back the highest-resolution pictures ever taken of the surface of another planet. They’re in colour, folks.

NASA seems to be publishing them in a few different ways, but the best seems to be this chronologically arranged list of everything that they get.

Lookit that. It’s another damned planet.

McMurdo Mars panorama

eyes over Baghdad

A week ago NASA’s Advanced Spaceborne Thermal Emission and Reflection Radiometer satellite instrument was flying over Baghdad (I don’t know offhand if ASTER is frequently over the Mideast or if it has been tasked by the military). It took this image of Baghdad:

Click here for a larger (400k) JPEG version or follow this link to the full sized (2.5MB) JPEG.

From JPL’s ASTER website:

“The plumes, which originate along major roads and canals, are believed to be burning pools of oil from pipelines. The plumes, which blanket large sections of the city of approximately 5 million, are creating an environmental health hazard for residents of the city and surrounding regions.”

This seems like a good example of the many deadly outcomes of war that can’t easily be reported, because the deaths don’t abide by the Who-What-When-Where journalistic format. How many will die because of these clouds? How long will it take to happen? Will it be possible, when the deaths do occur, to tease out to what extent those deaths were the result of inhalation of burning petroleum during Gulf War 2 vs. some other environmental source? Who knows, but people will die from it. You and I will never know who they are.

I hope Saddam and Bush share a first-class compartment on the train to hell. Except, of course, that Saddam probably shares with Bush the characteristic of being only the most visible man of a whole awful system, and there really aren’t enough first-class compartments for all of the people who would have to fit on that train.
