Michal Migurski's notebook, listening post, and soapbox.

Dec 30, 2010 6:16pm

blog all oft-played tracks II

Like last year, these are some of the tracks I added to iTunes in 2010 and listened to the most, edited for clarity and minor historical whitewashing.

(All of the MP3s below, as an .m3u playlist)

1. Spoon: Got Nuffin

“Got nuffin’ to lose but darkness and shadows.” Most-played track of the year, from the only normal rock band I consistently spend any time with.

2. Jay-Z: Kingdom Come

I didn’t bump into this until August or so, reading back into Ta-Nehisi Coates’s archives, but Kingdom Come might be one of my favorite songs this year.

Ricardo Gutierrez:

It didn’t take long going down the list before I remembered THE best use of Rick James’ “Superfreak”. Nope, not Hammer. The prize goes to my studio neighbor at Stadium Red, Just Blaze. Really, he flipped the fuck out of that sample for Jay Z’s “Kingdom Come”. I had to hear the song a few times in a row just to figure out, as best I could, the arrangement of it all. … Something he nails really well is giving the song a sort of staccato energy that doesn’t exist at all on the original. I love how the bass line goes from being the focus on the original, to being the fill.

I love this idea of taking pieces of music and dressing them up as something entirely different. It’s old hat to talk about “sampling” like it’s novel, but really Kingdom Come just underscores how hard it is to do well, and how frustrating more obviously sample-driven musicians like Girl Talk can be. You listen to All Day, and you might like it, but in the end what you’re left with is a dumb bag of samples dressed in their factory uniforms jostling for your attention, strung one after the other. Ten years on, Z-Trip is still a wedding DJ dusting off the Eurythmics for your amusement, while Kingdom Come is closer friends to Orbital’s recombination of Tool or Skinny Puppy’s song-length misappropriation of Roman Polanski vampire movies (mp3).

Anyway, the other thing I enjoy about Kingdom Come is the use of the studio as a social instrument. Watch Jay-Z and Kanye West working in the studio together, or read about Kanye’s “Hawaii rap-nerd nirvana” for a sense of how art is produced socially both among its producers and between the record and the listener.

3. Venetian Snares: Szamár Madár

Found via David O’Reilly’s Vimeo page and his gorgeous, spooky video for this track.

4. La Roux: Bulletproof

Another song found via the video, this time watched rather than heard on a phone in a restaurant. I love the clarity of La Roux’s voice, and the clinky/jangly pop production on the track.

5. Jamie Woon: Wayfaring Stranger

Spooky, quiet rendition of the old folk song, found via an installment of Electronic Explorations.

6. Netsky: Young And Foolish

7. Hot Chip: I Feel Better

Hot Chip were one of the few live shows I saw this year, at Oakland’s amazing Fox Theater. The video was directed by Peter Serafinowicz (of Look Around You), who says:

I like the idea of taking something we’re all used to seeing—like a boy band music video—and totally destroying it. So I wrote this proposal and included reference images I found on Google—to illustrate the bald guy in the video played by Ross Lee, I used pictures of Mr. Burns from the “X-Files” episode of “The Simpsons”.
I learned there is a boy band tradition—or possibly an actual physical manual—that says you have to have the tough guy, the cute one, the suave one, the one who takes his top off.

8. Vapour Space: Gravitational Arch Of 10

37.752467, -122.418699, 1996-11-17 07:00:00 is the point in spacetime when and where I first heard Gravitational Arch Of 10. The sun was already up, and it was one of those things where you think the music is over and then it all rushes back.

9. Miike Snow: Black And Blue (Savage Skulls Remix)

10. Star Eyes: Ruffage (Side A)

This isn’t really a track so much as a 45-minute long mix of drum n’ bass. The MP3 version finally found me this year (thanks Jeremy) after I burned through or lost three separate copies of the cassette over the past fourteen years.

Dec 21, 2010 11:09pm

winter sabbatical 2010: days twelve through sixteen

These posts have drawn somewhat farther apart; this past week has definitely thrown a wrench in my gears. Eric, Shawn and I do a quarterly full-day Stamen partner meeting, and this time we thought it’d be fun to do it down in Palm Springs to celebrate winter. So, down we went on Tuesday and Wednesday last week. Palm Springs is a strange place that rightfully shouldn’t exist, but we still ate two of our meals at the old folks’ country club tennis resort near our hotel. We also managed to squeeze in a visit to the converted 7-11 carniceria/taqueria in Banning that Gem, George, Michele and I found last year on our trip to Joshua Tree. Off the chain, those tacos are.

As you can imagine this hasn’t been a banner week for productive code, research, or design, though I have managed to push a few things out into the world.

(TL;DR: Skeletron, Oh Yeah, Paper!, Pinboard username mapper)

The first is that my straight skeleton code from last week has a new public home on GitHub and a new name, Skeletron. Although it’s quite minimal for now, I’ve taken care to ensure that a bare-bones HTTP interface is built right in. You can try it at skeletron.teczno.com:8206. The mnemonic for the port number is the number of bones in the human body. More on all that in a separate post.

The second is a minimal new blog I’ve started, Oh Yeah, Paper, a link-and-picture dump for interesting paper-related things. My research interest in paper is mostly around communication technologies like Mayo Nissen’s City Tickets and easy print-on-demand services used for custom book printing, but there’s plenty of room for silliness as well. I’ve been posting to this site for a bit over a week now, and didn’t want to say anything until after I’d proved to myself that I could keep up a simple daily posting schedule for a little while. We’ll see how that goes.

The big news this week is Yahoo!’s incomprehensible shitcanning of venerable bookmark-sharing service Del.icio.us. This is one of those cases where it’s hard to distinguish malice from stupidity, so I’m grateful to Maciej Ceglowski for having started Pinboard last year. I’ve had an account there since the beginning, and a few weeks ago I started noticing a number of people in my network moving house - James, Aaron, Paul, among many others. Henrik Nyh was kind enough to create a username mapper to speed the transition for folks.

One important outcome of this move has been the sudden interest in baking your own. Jeremy Keith suggests a home-grown Delicious, the “self-hosting-with-syndication way of doing things”. This is largely what I’ve been doing with Delicious for a few years now. I keep the primary copy of my bookmarks in Reblog, and when they’re published I push them to Delicious, Twitter, and my own linkblog. Some of those channels get the tags, some get the full description text, and one gets just whatever short link and title fragment happens to fit in 140 characters.

The elephant in this room is that a primary value of Delicious has always been its network feature, which I use as a primary research tool and a general way of finding things I should be paying attention to. Marshall Kirkpatrick wrote a bit about how ReadWriteWeb used the network feature to do research, which meshes pretty well with my own approach. Clearly the social bits were important, something that Jeremy talks about in his post. Stephen Hood (former Delicious product manager) makes the excellent suggestion that the Delicious data corpus be donated, perhaps to the Library of Congress or the Smithsonian. This supports the idea that the data there has value as a historical artifact from 2004-2010, and pairs especially well with Twitter’s corresponding donation of public tweets to the LOC. Twenty years down the line, looking at those two datasets in concert is going to lead to some fascinating research.
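That last 140-character channel is really just a character-budget problem; here’s a rough sketch of the kind of truncation involved, with a hypothetical helper name rather than the actual Reblog plumbing:

```python
def compose_tweet(title, url, limit=140):
    """Fit a title fragment plus a short link into a fixed character
    budget. (Hypothetical helper, not the actual Reblog plumbing.)"""
    budget = limit - len(url) - 1  # one space between title and link
    if budget <= 0:
        return url[:limit]
    if len(title) > budget:
        # trim the title and leave room for a single ellipsis character
        title = title[:budget - 1].rstrip() + '…'
    return title + ' ' + url
```

Tags and full descriptions go to the channels that can carry them; this one channel gets whatever survives the budget.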

There have also been a bunch of responses to the problematic idea that it’s possible to bang out your own alternative service, maybe best expressed by Andre Torrez in How To Clone Delicious in 48 Hours:

You may have heard that Delicious is shutting down (or not?). Someone on Twitter suggested that a group of engineers should get together on a weekend and build a Delicious clone. In anticipation of this mystery group of people sitting down and doing this, I thought I'd make a quick todo list for them.

Andre continues with a laundry list of social features like account systems, imports, tags, etc. Jeff Atwood has a similar post on the difficulties of sending email from code. While both of these guys are clearly accomplished developers, I think they’re expressing a little too much stop energy for my comfort. Andre lists a number of “critical” features that Pinboard is cheerfully lacking right now, and Jeff ignores the presence of services like AuthSMTP and Spam Arrest, which offer completely brainless SMTP for not very much money. While you wouldn’t build something huge on top of these services, it’s worth remembering that Delicious started life as a text file and grew into its present form over time. Things start tiny and develop direction as user-constituents suggest ideas or push and pull the service in new directions. There is still a lot of good you can do online with plain old flat files, and even Twitter was able to get a useful, fun service built on quicksand.

Any product that includes a community component is a vector rather than a point: there’s where you started, and the possible gulf between where you think you’re going and where your constituents think you’re going.


Dec 14, 2010 4:52am

winter sabbatical 2010: days eight, nine, ten, and eleven

I spent days eight and nine in Chicago. Walking Papers is up in all its crinkly glory at the Art Institute of Chicago, in the Hyperlinks exhibit thanks to curator Zoë Ryan. Eric, Geraldine, and intern Martha Pettit arranged a selection of submitted scans over a twelve-foot wall. It was bitterly cold, and I thought I’d do a much better job of getting out and about than I did, not being quite prepared for the reality of the weather. Adrian, Paul, and the gang at Everyblock kindly offered a bit of couch space at their northside office for me to lounge on during day nine, and when I left I saw a tiny rabbit cross my path in the snow. I’m told that’s normal - “lots of friendly squirrels and bunnies in Chicago.”

Day ten was travel, and movies, and general laziness. I launched the Atlas feature of Walking Papers, which you can see if you compose a new scan on the front page. You get back a multi-page PDF; nothing special, but the possibility of using these to distribute assignments to a group is something I’ve been planning for. Since I last mentioned it, the Random Hacks Of Kindness TaskMeUp project popped onto my radar - it seems to be built around some of the same ideas, aiming for the world of humanitarian and crisis response.

The thing that really ate my head for the past few days has been a computational geometry technique called the straight skeleton. The short story is that it’s a way to simplify polygons down to lines with special applications for cartography. The long story is that I have a personal history of letting what Aaron calls “mathshapes” consume me for days on end, and this is the latest example. Some people have knitting, crossword puzzles, gambling, or crack. I have geometry, and my biggest personal challenge during a fairly open-ended sabbatical like this one has been to ensure that even when heading down blind alleys, there’s a plan for sharing the results.

Anyway, this is a simplified illustration (from my State Of The Map U.S. conference slides) showing how buffers, polygons, and the straight skeleton can be used to convert OpenStreetMap road data into improved, more easily-labeled lines:

As I mentioned in my NACIS keynote last year, OpenStreetMap has a fairly specific range of scales where it’s designed to work best, and if you need to create a lower-zoom map you will find that details like dual carriageways (large roads and freeways split into parallel strips) and interruptions for bridges or tunnels make it incredibly hard to generate good-looking labels, not to mention the issue of publishing raw data in a useful or compressed form. While deriving the skeleton of a shape is perfect for this problem, there’s not a lot out there in the way of accessible implementations of the algorithm. OpenStreetMap has no current answer to the idea of derived, lower-resolution datasets outside the very low-resolution Natural Earth collection. OSM is historically biased toward manual production, and Steve Coast has advised me that setting up lower-resolution OSM servers for manual tracing might be a more sensible way to handle this unmet need. I do agree, though computed geometries offer an excellent leg up on the cold-start problem.
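The buffer-and-merge half of that pipeline is simple to sketch with Shapely (which I’m assuming here; the skeleton half is the hard part):

```python
from shapely.geometry import LineString
from shapely.ops import unary_union

# Two parallel strips of an imaginary dual carriageway, 10 units apart.
north = LineString([(0, 10), (100, 10)])
south = LineString([(0, 0), (100, 0)])

# Buffer each strip by more than half the separation so the buffers
# overlap, then merge them into one polygon. That merged blob is the
# shape the straight-skeleton step collapses back down the middle.
merged = unary_union([north.buffer(8), south.buffer(8)])

print(merged.geom_type)  # Polygon
```

One blob, not two; the skeleton of that polygon is the single centerline you want to label.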

So, I blew through two days reading through implementation notes and doing a bit of coding, starting from Tom Kelly’s Java implementation, Campskeleton. I’m not a Java programmer, but Kelly’s straight skeleton index page offers plenty of notes on how to actually get the thing built. Most important are a few late-90s academic papers: Raising Roofs, Crashing Cycles, and Playing Pool by David Eppstein and Jeff Erickson and especially Straight Skeleton Implementation by Petr Felkel and Štěpán Obdržálek, along with the cartography-specific approach from Using the Straight Skeleton for Generalisation in a Multiple Representation Environment.

Anyway, here’s where I’ve gotten to:

The easiest way to think of the straight skeleton is as a peaked roof on a building. Starting with the gutters of an arbitrarily-shaped building, you build up a sloping roof until the bits all meet in the middle. The ridge of the roof all the way around is the skeleton, and it gives a pretty good idea of the building’s axis, or a path it follows.
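For the simplest possible case, an axis-aligned rectangle, the ridge can be written down by hand, which makes the roof metaphor concrete (an illustrative special case, not a general solver):

```python
def rectangle_ridge(w, h):
    """Ridge segment of the straight skeleton of a w-by-h rectangle
    with one corner at the origin. Roof planes rise from all four
    gutters at equal slope and meet in a ridge parallel to the longer
    side, inset by half the shorter side. (The full skeleton also has
    a diagonal segment from each corner to the nearest ridge end.)"""
    if w < h:
        x = w / 2.0  # ridge runs vertically
        return [(x, x), (x, h - x)]
    y = h / 2.0  # ridge runs horizontally
    return [(y, y), (w - y, y)]
```

A 10-by-4 footprint gets a ridge from (2, 2) to (8, 2) - exactly the axis line you’d want to label a long thin shape with.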

That angular blob might not look like a map, but imagine it as a curving road. Here’s what I’m aiming for, more generally:

The idea is similar in spirit to Paul Ramsey’s simplified Vancouver Island. There are a bunch of pitfalls in the process. Here’s one from before I figured out how to correctly order the priority list of roof line intersections and check that points were actually inside the containing polygon:
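The inside-the-polygon check that bit me there is the standard even-odd ray cast; a minimal version for reference:

```python
def point_in_polygon(x, y, poly):
    """Even-odd ray casting: count crossings of a ray going right
    from (x, y) against each polygon edge. An odd count means inside.
    poly is a list of (x, y) vertex tuples, in either winding order."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses that line
            cross_x = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if cross_x > x:
                inside = not inside
    return inside
```

Candidate intersection points that fail this test belong to a roof over some other part of the shape and should be discarded.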

Here’s an example showing an imaginary dual carriageway, buffered out to merge the shapes into a single polygon and then skeletonized down the middle:

The algorithm is incredibly sensitive to initial conditions, such as this example where a few extra parallel lines result in a double-peaked roof. It’s a plausible overhead view of a building, and a correct skeleton, but not quite the thing for cartography:

It’s also worth noting that I’m not doing anything to detect collisions just yet:

I’m getting close to done with this alley. The particular needs I’m aiming at are largely connected to the book-based bicycle map I described last time. It’s possible now to render excellent maps from OSM, but the medium of print makes small errors much more glaring, and I’m interested in fixing some of the loose ends of cartographic representation in OpenStreetMap.

So, math shapes.

Dec 8, 2010 11:07pm

winter 2010 sabbatical: days four, five, six, and seven

It’s been almost a full week since my previous sabbatical update. Where was I? Oh right, maps. I’ve been making progress on two fronts: one is a series of updates to Walking Papers, the other is a print project that currently lacks a name. The Wikileaks affair has also been completely engrossing; more on that below.

As I described last week, Walking Papers will have a multi-page atlas feature Real Soon Now, probably within a few days. I’ve been using the opportunity to improve the overall quality of the print and scan production process and also moved the whole site from its previous home on my overburdened Pair.com shared hosting account to a fresh new Linode instance. I’m also doing a fair bit of behind-the-scenes work on the scan decoding process, which I’m embarrassed to say has been riddled with problems from day one. The SIFT-driven corner-finding process has never been a major issue, but a lot of scans seem to fail because the QR Code reading library (Google’s Zxing) often can’t find a message in an image that’s nothing but crisp, isolated, beautiful code. I don’t know enough about the internals of Zxing to fix the problem there, but I can insert a manual step to allow people to simply type in the address contained in the code in the event of a read failure. Boring, but necessary. The other bit of sub-surface work I’m starting on is the ability to use a normal digital camera to read back the scans, hopefully with U.C. Berkeley’s Sarah Van Wart and prompted by Ushahidi’s Patrick Meier. Sarah has used the project to great effect with Richmond High School students and Patrick knows from firsthand experience how much of a pain it can be to find a scanner while everyone’s got cameraphones in their pockets.
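That manual step amounts to a try-the-decoder-first, ask-a-human-second flow; here’s a minimal sketch with hypothetical callables, not the actual Walking Papers internals:

```python
def scan_message(image, decode, ask_user):
    """Try the automatic QR decoder first; on failure, fall back to
    asking the person who uploaded the scan to type in the address
    contained in the code. `decode` and `ask_user` are hypothetical
    callables standing in for the decoder and the web UI."""
    try:
        message = decode(image)
    except Exception:
        message = None
    if message:
        return message, 'decoded'
    return ask_user(image), 'manual'
```

Boring, but it means a scan with a perfectly legible code never gets stuck just because the decoding library shrugs.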

The other print project is something I’m collaborating on with my friends Adam Schwarcz and Craig Mod, bicycle-enthusiast and book-maker respectively. It’s a long-time-in-the-works bicycle map of San Francisco and Oakland based on government data and OpenStreetMap, with a barcode-driven print/digital connection that we’re still brainstorming about. This was only my second exploration in the area of data-driven automatic print design when Adam first suggested the idea last year, but I’ve had enough experience with talking paper in the intervening time that I’m using my sabbatical to jump back into the project. Look for more on this in the next few weeks.

Alon Salant over at Carbonfive was gracious enough to offer a desk for a few days, and while working on these various projects I’ve also had a privileged view on what an agile workplace looks like. It’s been fascinating to observe from inside a development process that seems built around conversation, and while camped out at a loaner desk in their 2nd St. office I heard the beating heart of design and technical arguments that I’m more accustomed to experiencing via text.

At the moment I’m in Chicago, in what the bellhop (bellhop!) tells me is Al Capone’s old dentist’s office. Need to buy gloves, eat, and then head to the Art Institute, where Walking Papers is part of the Hyperlinks exhibit.

Meanwhile, Uptown...

This week, I am helplessly transfixed by the Wikileaks story. There’s so much meat to this unfolding event, well-covered so far by Andy Baio’s Cablegate roundup. What’s really caught my interest has been the reactions of businesses like Amazon, Paypal, Mastercard, and Visa - all supposedly independent businesses suddenly acting in concert to isolate and marginalize a strange new actor. The timing of the response suggests that it wasn’t triggered by the release of the diplomatic cables at all, but rather by Julian Assange’s promise to release future documents related to the activity of an unnamed major bank. It’s like Naked Lunch, a moment when time grinds to a halt and you can see what’s sitting on everyone’s fork as they raise it from their plate. Hypothetical arguments about the likelihood of cloud data providers like Amazon or payment processors like Paypal cutting connections evaporate instantly: here is a clear-cut example of an inconvenient release of information scaring the shit out of someone enough to apply the screws.

Assange’s past writings about conspiracies and invisible governments painted a picture that was too large and too diffuse to elicit a local reaction. A conspiracy as subtle and far-reaching as that described by Assange might not be worth reacting to, because how can it be distinguished from simple alignment of interests? How can a private individual successfully act in response? Here, though, we see pieces of the whole suddenly illuminated as by a camera flash: milquetoast diplomatic chatter causes Amazon to suddenly decide that it cannot abide the hosting of unauthorized material. A strangely-timed sexual assault charge leads to an unprecedented Interpol warrant for Assange’s arrest. Senators and congressmen opine that if pursuing Wikileaks is difficult under the law we have, then perhaps we might look into changing the law?

The decision to leak a stream of diplomatic cables (as opposed to any one particular cable) is a sharp departure from typical journalistic approaches, which is really what I’m finding so engrossing in this story. Yesterday on the radio, a former general counsel of the C.I.A. complained that the released data lacked a “patina of journalism” or an editorial function (and therefore did not qualify for constitutional protection), which suggests to me that the government is now actually quite comfortable with the occasional fallout of the shocking revelations of journalism-as-usual. Even sustained evidence of state-run torture and of course that fucking war hasn’t led to the breadth and depth of reaction we’re seeing now: “All hands, fire as they bear.” Why now? The amusing dullness of the leaked cables shows that Wikileaks has decided to hunt upstream, taking aim at the metabolic processes of communication and secrecy. If there is in fact a conspiracy, and this is how it talks to itself, maybe we play a few games with its own internal communications to see what happens?

What’s currently happening is that all sorts of actors are responding in unexpected ways. Like a meadow crossed with underground gopher tunnels, Amazon’s reaction suggests that at least a few widely-separated individuals are in fact quite deeply connected and spooked enough to show themselves above ground in surprising places. Maybe a simple response is that I take our (four-figure per month) Amazon Web Services business to a competing cloud provider? Maybe another is that I route around Paypal in the future? On the other hand, how could I conceivably avoid Mastercard and Visa? Hopefully, Julian Assange is what he says he is: a spokesman and intentional fall-guy for a much larger group that can act without him.

Meanwhile, I’ve got my bowl of popcorn sitting here and it’s looking like a fascinating ride.

Dec 2, 2010 8:55am

winter 2010 sabbatical: days two and three

Two days have elapsed, all is well.

Yesterday I spent most of my morning dragging the six-month-old “Atlas” feature of Walking Papers into a releasable state. It’s not quite there yet, but it’s significantly closer than the one I bashed together in a few days at a Camp Roberts exercise back in April. The general idea behind this feature is that the single-sheet bias of Walking Papers is a hindrance to people covering large areas, and more significantly it doesn’t help people who are delegating work. I continue to be surprised at the outcomes of this project - Eric is more focused on the tactile and aesthetic qualities of the prints themselves, while I’m interested in some of the social and organizational implications. It was always designed to be personal and utilitarian, but we’re finding that in a lot of ways it’s those two things separately, not necessarily at the same time. Socially, the idea of multiple-page outputs opens the dynamic of tasking or assignment, which turns the sheet of paper into a communicating object. Why should it necessarily be the same person choosing the area, printing everything, handing out maps, noting down features, running them back and doing the scanning? Properly organized, all those activities can be parceled out and done more effectively for large areas.

An important thing that happened was the release of Bing imagery for tracing into OpenStreetMap, one of the most visible outcomes of Steve Coast’s new job over there. Also, the OSM Flash-based editor Potlatch 2 was finally released into the wild. Something really significant is happening with OpenStreetMap right now - it’s hitting all these critical communities and corporations and sparks and ripples are shooting out.

Today I spent most of the day catching up with Stamen alumni Ben Cerveny and Tom Carden, and working on a little thing we’re developing with Adam Greenfield and Nurri Kim over at Do Projects. It’s hiding in plain sight, I’ll talk more about it later when I’m more comfortable that we’re close to release.

Also, mostly thanks to Aaron, I pinned a bunch of new cartography porn:

Nov 30, 2010 4:00am

winter 2010 sabbatical: day one

Today was the first full day of my planned winter sabbatical, and I'm mostly just getting used to the idea of having six weeks of open time in front of me. My plans so far are amorphous, but I know these things:

  • I have a pile of small-to-medium-sized experimental, research, and development work I've been anxious to think about. Most of it is in some way connected to geography or cartography; right now I'm arranging pieces to figure out how they might fit together.
  • I'll be visiting people. In January, Matt Ericson at the New York Times has graciously offered the use of a desk for a week. Until then, I've got a few multiple-day visits I'm arranging with other folks I know. Partially I'm looking to vary my surroundings, but I'm also interested in being around people whose work I like, to get a sense for how they get it done. I'm thinking of simple things here: furniture layout, how close people are to one another, do they move around a lot or use headphones, is it chatty and friendly?
  • My daily schedule will be broken up into chunks for making/doing (mornings, evenings) and chunks for talking/moving (afternoons).
  • I haven't gone to the gym or ridden my bike in a very long time.

Today, I fixed the way that Reblog was talking to Delicious and Twitter, so that my public Twitter feed, Delicious account and links feed could all be synchronized again. I used to publish a regular stream of links, but stopped around the time that Twitter upgraded to OAuth and everything broke. It made me feel mute and disconnected, not having access to my normal means of publishing tiny things. Pinterest helped somewhat. I also pushed a new version of TileStache, focused on the vector tiling needs of Polymaps. See this GIS Stack Exchange thread for a bit of context. Then I had a steaming bowl of amazing Pho from Ba Le in downtown Oakland, walked the dog, and caught up on some reading courtesy of Aaron.

Also, I found these 3" x 4" post-it labels that fit perfectly on my wrist rest for notes:

Nov 3, 2010 6:03am

election report

This election, I decided to try something different and volunteered as a poll worker. I was given the role of “standby judge”, and late yesterday the Registrar’s office called to say that my help was needed as an Inspector at a downtown Oakland polling location whose regular Inspector was sick. Having had a three-hour class that taught me all about the voting process this year and otherwise no relevant experience, all I knew was that I had been put in charge of a polling place where I was to spend a 15-hour Tuesday supervising the primary point of democratic feedback.

Things I learned today:

  1. You can seriously never have enough pens. Our critical path today was pen availability: the R.O.V. gives each precinct a certain number, and then a bunch of people show up and that number runs out. We tried to buy more in the middle of the day, but there wasn’t a nearby store that sold ballpoints in bulk. Next time I’m bringing a bunch of bics and keeping them secret until the evening voter rush, at which point I will quietly disperse them into the penstream.
  2. People find it surprising when you greet them at the polls by saying “I know you from Twitter” (hello, Mitch).
  3. Election technology, at least in Alameda County, is basically Peak Papernet. Every piece of the system works together and the County seems to run their elections by Slashdot standards: there are voter-verifiable paper audit trails for the touchscreen, most voting happens by filling in a little broken arrow icon with pen and paper, every sheet and envelope includes a unique identifier with a tear-offable receipt, and every box and bin comes with tamper-evident stickers. Every piece of the process, including the end-of-day shutdown procedure, has built-in safeguards that use simple counting and sorting procedures to ensure that each ballot is accounted for in some numbered ziploc bag: counted, provisional, unused, spoiled, etc.
  4. One effective way to ingratiate yourself with a group of people you’ve only just met is to spend $25 at Whole Foods on a bag of gift snacks. Cookies and fruit seemed effective, including the vegan ones that come in a paper bag and everyone goes “ew, vegan” and then they try them and holy shit.
  5. I am basically happy to cheerfully yell the same thing all day long, to an unending stream of new voters. I’ve always enjoyed the security lines at airports where someone in a sharp suit yells at everyone to keep moving, so I tried to do the same thing here. People in groups faced with bureaucratic procedure become cattle, and need to be helped along not because they don’t know what to do but because everyone in line needs to know what everyone else knows to do. “Keep moving”, “wait here for five minutes while the floor clears up”, “give this envelope to the man in the hat”, and “Hello what’s your last name?” are most of the words that came out of my mouth today.
  6. A system must include provisions for constant forgiveness.

I was sad to see Prop 19 go south and I still haven’t had the heart to check whether I was successful at kicking the BART Board to the curb, but otherwise it was one of the most exciting and exhilarating ways I can imagine to spend a weekday and make $100 in the process. I would totally do this again, and I recommend the experience to pretty much anyone.

Oct 26, 2010 7:28am

this tract

This is a placeholder blog post, for this:

Oct 20, 2010 6:00am


I'm relieved that it's winter. I've just returned from Australia, where it's spring and beautiful and pointed at the southern stars. I was the second day's keynote speaker at the ever-excellent Web Directions (South) conference, and I visited Mitchell Whitelaw at the University of Canberra, and before that I was in Denver for Planningness, and before that Mark Hansen invited me to do the first statistics seminar lecture of the year at UCLA. We've just dodged a genuinely-interesting but still bullet-shaped project at Stamen and things are looking to settle into a welcome groove. I'm researching simulated annealing for map label placement and thinking about some upcoming projects and just generally glad to have organized a sort of loose sabbatical for the period starting with Thanksgiving and ending a little ways into 2011.
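The annealing idea itself is compact: propose a random label move, always accept improvements, and accept regressions with a probability that shrinks as the temperature cools. A toy sketch, where `candidates` (per-feature lists of label boxes) and `overlaps` are assumed inputs rather than anything from a real renderer:

```python
import math
import random

def anneal_labels(candidates, overlaps, steps=5000, t0=1.0, cool=0.999):
    """Toy simulated annealing for label placement. candidates[i] is a
    list of possible label boxes for feature i; overlaps(a, b) says
    whether two boxes collide. (Schematic sketch, not the actual code
    under research here.)"""
    random.seed(0)  # deterministic for the sketch
    state = [0] * len(candidates)  # start at each label's first candidate

    def energy(s):
        # count overlapping pairs of placed label boxes
        boxes = [candidates[i][p] for i, p in enumerate(s)]
        return sum(overlaps(a, b)
                   for i, a in enumerate(boxes)
                   for b in boxes[i + 1:])

    e, t = energy(state), t0
    for _ in range(steps):
        i = random.randrange(len(state))
        old = state[i]
        state[i] = random.randrange(len(candidates[i]))
        e2 = energy(state)
        # accept improvements always, regressions with probability exp(-delta/t)
        if e2 <= e or random.random() < math.exp((e - e2) / t):
            e = e2
        else:
            state[i] = old
        t *= cool
    return state, e
```

Recomputing the full energy each step is wasteful; a real implementation would score only the moved label against its neighbors, but the accept/reject shape stays the same.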

Sep 27, 2010 5:30am

map sprint

This weekend marked the first global Mapnik Code Sprint. Most attendees converged on Cloudmade's London office, while Nino Walker and I held down the fort in San Francisco and Robert Coup dialed in from New Zealand.

Having just spent about two days thinking deeply about relative path resolution in Mapnik side-car Cascadenik, I'm fairly happy with where the code base has arrived. My goal with all of this stuff - introducing web cartography to CSS, building on Schuyler and Chris's work in tile rendering, and trying to make it easier to run your own map server - has always been to take an activity that's already fun and interesting, and make it easy, fun and interesting. Maps on the internet should not be difficult to publish or overly dependent on single providers like Google, and it's the details in all the bits of glue that make this possible.

For my part, I agreed with Dane to focus on correctness in Cascadenik's output. I put the code away for a while and when I came back a little while ago, it was obvious that a lot of old, bad decisions I had made were in need of some fixing. It's important for good, small tools to behave in predictable ways and I've been taking a lot of cues from what I consider to be prescient, amazing design decisions in HTML and CSS and doing my best to apply them to portable and easy web cartographic stylesheets.

The changes we made fall into three broadish areas:

  • Cascadenik now knows more about paths, so if you ask it to create a stylesheet for Mapnik and put it in some directory, it'll try to be just a little bit smarter about how it names things based on where they are and where they came from.
  • Cascadenik also has far fewer options when compiling, which I think is a good thing. Dane had to update many of my early, now-wrong assumptions with a bunch of patches that added new optional behavior flags, and I used the weekend to change those underlying assumptions so there didn't need to be so many flags.
  • Nino introduced an incredibly cool new way to manage data sources which I think is going to make working with data a lot more palatable.

This isn't quite the place for all the technical details, but I think it's worth repeating that the reason for all this work and effort is to open a certain kind of activity to new groups of people. Dane became an honest-to-god C++ programmer through his exposure to Mapnik and I think that makes him a saint or a hero or both, but it shouldn't be necessary for ordinary users to make this same transition. Rather, it should be normal that people can approach maps and cartography and do interesting things with them, things different from the usual "pizza places in city X" use case offered by the Googles of the world. Like, I've got this wallet that Gem made for me (out of indestructible tyvek!) and these bad-ass shoes that I designed on Zazzle.com:

This is a synthetic preview image that Zazzle showed me when I was posting my renders of the Oakland Assessor's Parcels shapefile:

Pretty close, right?

Anyway, most of what I've been doing for the past few years in these occasional experiments and releases has been an attempt to shrink problems, so that activities which might otherwise require substantial effort, time, money, or people can require less of all those things - raising the RPEs, as Kellan might put it.

I'll see what I can do about making the shoes publicly buyable.

Also, here's that awesome thing Github does where they show you everybody's changes in a long horizontal graph:

Aug 21, 2010 6:42pm

release often

A few code-like things I've been working on lately.

Polymaps
Polymaps is a result of our summerlong collaboration with SimpleGeo. We've been working on it for some time, but yesterday we announced it for realsies and saw an amazing response from all over the internet. This one's been doubly rewarding for us, since it's also a collaboration with Mike Bostock of Protovis fame. Mike's been on our radar since he showed off Protovis when Tom and I visited Stanford a while back. It was a week old at the time and already full of promise. This Javascript thing, I think it will go far.

Census Tools

Census Tools is a small thing I put together last week to extract data from the 2000 U.S. Census by subject and geography. I've just added the ludicrously detailed Summary File 3 with loads of information on housing, commutes, and other topics. Also, Shawn added a second script, text2geojson.py, which converts the textual output of census2text.py into neatly-formed GeoJSON. This makes it trivially compatible with Polymaps!

Walking Papers

Walking Papers gained two new translations recently. Maxim Dubinin provided a complete version of the site in Russian, while Frank Eriksson has been working on it in Swedish. This will bring our total number of languages to ten, and it's a fascinating case study in the power of using Git (and Github) for open source projects. Generally speaking, most of the translators haven't had to ask for permission or even informed me of their work until they were basically done. A staging site and a few git pulls later and we've got a new translation!

Maxim, who did the Russian version, was the first person to translate the title of the project in addition to the text content. He said the Russian has a pretty good pun going as well:

Well, as I recall, "walking papers" are docs you're getting when you're fired, right? Turns out "Обходной лист" is exactly the same thing in russian - a piece of paper ("лист"), that you're getting when you're fired and use to "walk around people" ("обходить") in your org collecting signatures that you returned what you had too etc, so it is means almost exactly the same. Funny that that you can literally translate some idioms and they will still make sense. In the context of OSM I guess it translates as well you walk around, now geographically, not people wise and it is a piece of paper :)

So good.

Aug 17, 2010 5:21pm

presenting tilestache

Named in the spirit of the pun-driven life, TileStache is a response to a few years of working with tiled maps, geographic data, and cartography, and an answer to certain limitations I've encountered in MetaCarta's venerable TileCache.

The edges I've bumped into might be esoteric, but I think they're also indicative of our many experiments in tile-based web mapping since 2007. The core functional needs of a tiling system are well handled by existing software: imagery from bitmap sources of aerial and scanned imagery, Mapnik renderings of OpenStreetMap data, and caches of remote WMS tiles. None of this is really the core point of TileStache, though it's all certainly table stakes.

The place where I've found a need for a new project is in the intersection of synthetic imagery, composites of existing imagery, and delivery of raw vector data to browsers. More and more we're dealing with the expressive possibilities of new web cartography in projects like Pretty Maps, and TileStache is a possible approach to data publishing that borrows a lot of the simplicity of TileCache while adding a dose of designed-in extensibility for creating new kinds of maps.


After developing Travel Time Maps with MySociety in 2008, we adapted our bitmap data imagery technique to tiled delivery. The follow-on Mapumental project hypothetically covers the entirety of the UK with dynamic, temporal data.

Here's a screenshot from one of the early demos, showing travel times around a city, lit up over the coastline:

It's not animated (check the Channel4 site for a video of Mapumental), but this is one of the constituent map tiles underlying the image:

Each pixel in this tile is a 24-bit value encoded in the red, green, and blue channels, expressing a time and speedily decoded by the Flash application in the browser. This part of the project is driven by a custom Layer class in TileCache that pulls pre-computed time points (e.g. transit stations) from a database and renders cones around them. Some of the code might be findable in MySociety's source repository.
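The encoding trick can be sketched in a few lines. The specific channel layout here (high byte in red, low byte in blue) is an assumption for illustration; the actual Mapumental scheme may pack its bits differently:

```python
def encode_time(seconds):
    """Pack a travel time in seconds into 24 bits, spread across (r, g, b)."""
    assert 0 <= seconds < 2 ** 24
    return (seconds >> 16) & 0xFF, (seconds >> 8) & 0xFF, seconds & 0xFF

def decode_time(r, g, b):
    """Recover the travel time from one pixel's color channels."""
    return (r << 16) | (g << 8) | b
```

A renderer writes each pixel with encode_time(), and the client reads times straight back out of the image data with decode_time() - the tile doubles as a compact data file.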

What's interesting here is the idea of completely synthetic providers, i.e. those not directly based on GDAL sources, Mapnik renderings, or WMS servers. It's something I'm demonstrating in the TileStache Grid provider, an implementation of the UTM grid (U.S. National Grid and Military Grid Reference System) for overlay onto other spherical mercator maps.


Lars Ahlzen's TopOSM is a longtime rendering project based on OpenStreetMap data and cartography built from constituent pieces of Mapnik. TopOSM combines renderings of streets, hills, and labels to create a beautiful, dimensional result:

Lars builds the final map up from a stack of images, many of which might themselves be expressed as tile layers:

In attempting to build a new Layer class for TileCache that expresses this idea, I found that it seemed to be impossible to access the full configuration of the system from within a given layer. There was no way to create a derived map sandwich, and I knew that Lars's own method was a homebrew of ImageMagick and similar tools. I'm interested in something a bit more systematic that implements something like Photoshop layers for cartography. The current Composite provider in TileStache provides layers, alpha channels, color fills and masks, and I'd like to implement transfer modes (e.g. Photoshop's hard light) if this sample proves to be interesting.
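The transfer modes in question are simple per-channel arithmetic. A minimal sketch of hard light plus an alpha composite, operating on channel values normalized to [0, 1] (function names are mine, not TileStache's):

```python
def hard_light(bottom, top):
    """Photoshop-style 'hard light' blend for a single channel in [0, 1]:
    darkens where the top layer is dark, lightens where it is light."""
    if top <= 0.5:
        return 2.0 * top * bottom
    return 1.0 - 2.0 * (1.0 - top) * (1.0 - bottom)

def composite(bottom, top, alpha):
    """Blend one channel, then alpha-composite the result over the bottom layer."""
    return alpha * hard_light(bottom, top) + (1.0 - alpha) * bottom
```

Applied per-channel across a whole tile, this is enough to express a TopOSM-style sandwich of hillshading under streets and labels.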

We've delivered this sort of composite cartography to clients in the past, but always through a combination of spit and chewing gum.

GeoJSON Data

Most recently, we've been developing Polymaps, an SVG-based map engine that can show regular image tiles in combination with vector overlays driven by GeoJSON data. Tiles turn out to be just as helpful for publishing and requesting vector data as they are for pixel-based images. We've modified TileCache to support this use in the past, but there are simply too many places where the code assumes pixel-based images for the exercise to be anything but frustrating. TileStache is designed to accommodate data-only tiles, including an example PostGeoJSON provider that converts PostGIS data to GeoJSON.
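The conversion at the heart of a provider like that is straightforward: rows out of the database, a FeatureCollection into the tile. A hedged sketch, with a made-up row shape standing in for whatever a real PostGIS query returns:

```python
import json

def rows_to_geojson(rows):
    """Convert (id, lon, lat, properties) rows - say, from a PostGIS query -
    into the GeoJSON FeatureCollection shape a Polymaps layer can consume."""
    features = [{
        'type': 'Feature',
        'id': row_id,
        'geometry': {'type': 'Point', 'coordinates': [lon, lat]},
        'properties': properties,
    } for row_id, lon, lat, properties in rows]
    return {'type': 'FeatureCollection', 'features': features}

# One tiny tile's worth of data, serialized for delivery:
tile_body = json.dumps(rows_to_geojson([(1, -122.27, 37.8, {'name': 'Oakland'})]))
```

Each tile is just such a document, scoped to the tile's bounding box.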

As the ability of browsers to interpret and display a wider variety of imagery improves, we're going to see this data tile concept become increasingly useful. Why stop at image tiles, when you might want to render roads that can be rolled-over or clicked directly? Why assume dynamic data services, when TMS-style tile URLs (e.g. */12/656/1582.png) can be hosted from simple storage services or plain filesystems?
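The addressing math behind those URLs is compact enough to sketch. This is the common spherical-mercator formula in the XYZ variant, where row 0 is at the top; TMS proper counts rows from the bottom instead:

```python
from math import cos, log, pi, tan

def tile_coordinates(lon, lat, zoom):
    """Map a lon/lat in degrees to a spherical-mercator tile column and row
    (XYZ scheme: row 0 at the top of the world)."""
    n = 2 ** zoom
    col = int((lon + 180.0) / 360.0 * n)
    lat_rad = lat * pi / 180.0
    row = int((1.0 - log(tan(lat_rad) + 1.0 / cos(lat_rad)) / pi) / 2.0 * n)
    return col, row
```

Because the path is a pure function of location and zoom, a directory full of pre-rendered files on any static host can serve as a complete tile "service."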


It's early days, but we're finding that the limitations around in-browser display of layers and data increasingly come down to SVG or Canvas rendering rather than any particular slowness in Javascript itself. So we're thinking that our still-fairly-intensive experimental demos, now getting a few kind words from friends like Nathan, Alyssa, and Jen, will calmly scroll into the window of normalcy within the next year or so. We also know that other developers are thinking about some of the same concerns the motivating examples above address. For example, Dane tells me that the current bleeding edge of the map-rendering library Mapnik includes basic image compositing, masking, and GeoJSON output right there in the core.

Really what we're looking at is a future filled with work like Brett Camper's amazing 8-Bit Cities, "an attempt to make the city feel foreign yet familiar ... to evoke the same urge for exploration, abstract sense of scale, and perhaps most importantly unbounded excitement."

What are the tools that help make this possible?

Get TileStache.

Aug 12, 2010 5:30am


The U.S. Census publishes an astonishing volume of data, notably with the most recent 2000 count. The demographic data contained in each of the summary files is precise, detailed, and distributed in a difficult-to-understand text format. The documentation for summary file #1 alone (race, age, sex) is a 637 page PDF file, and the actual data is stored in a maze of zip files all alike.

I've poked at these before, but I recently got a bee in my bonnet about making them available in a more useful form so they could be mapped. I talked to Josh Livni (of Land Summary) quite a while back about his plans for a demographic summary site that would store everything in a database in the cloud. Then Amazon made it available as a public dataset. Still I was not satisfied - both approaches to handling the data seemed a bit ocean-boiling in retrospect.

I've been experimenting with something I'm tentatively calling census-tools that seeks to make this data a bit more accessible. I'm motivated by the idea that predictably-structured zip files stored on a web server and accessed with Python's excellent stream-handling libraries might actually be considered quite a good API, so the first tool in the repository proceeds from there. It does a very simple thing: given an optional U.S. state, a geographic summary level (e.g. census tract or county), and a type of data, it unzips those remote files into memory and converts them to a tab-separated values file.

Here's an example:

python census2text.py --verbose --wide --state=Hawaii --geography=county --table=P18 --output=hawaii-households.txt

It outputs a chatty text file of household data for every county in Hawaii into a file called hawaii-households.txt. It takes about a minute to churn through a 2.8MB zip file and output the results. Omitting the state name gets you every county in the U.S. in about 20 minutes:

python census2text.py --verbose --wide --geography=county --table=P18 --output=national-households.txt
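The "remote zip files as API" idea rests on Python's stream handling: fetch an archive over HTTP, open it entirely in memory, and read lines out of its member files without touching disk. A sketch, with a locally-built archive and made-up member names standing in for the real Census server:

```python
import io
import zipfile
# from urllib.request import urlopen  # the real tool fetches archives over HTTP

def stream_zip_member(zip_bytes, member):
    """Open a zip archive held entirely in memory and yield lines
    from one member file, never touching the local disk."""
    archive = zipfile.ZipFile(io.BytesIO(zip_bytes))
    with archive.open(member) as member_file:
        for line in io.TextIOWrapper(member_file, encoding='ascii'):
            yield line.rstrip('\n')

# A stand-in archive; for real data, zip_bytes would come from something
# like urlopen('http://www2.census.gov/...').read() (URL illustrative).
archive_buffer = io.BytesIO()
with zipfile.ZipFile(archive_buffer, 'w') as archive:
    archive.writestr('higeo.uf1', 'uSF1  HI04000000  0000001\n')
zip_bytes = archive_buffer.getvalue()

lines = list(stream_zip_member(zip_bytes, 'higeo.uf1'))
```

Because the Census files are predictably named and structured, this is nearly all the plumbing the tool needs before the actual column-parsing work begins.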

I tested with Hawaii because it's small, and immediately discovered the strangely underpopulated Kalawao County:

The county is coextensive with the Kalaupapa National Historical Park, and encompasses the Kalaupapa Settlement where the Kingdom of Hawai'i, the territory, and the state once exiled persons suffering from leprosy (Hansen's disease) beginning in the 1860s. The quarantine policy was lifted in 1969, after the disease became treatable on an outpatient basis and could be rendered non-contagious. However, many of the resident patients chose to remain, and the state has promised they can stay there for the rest of their lives. No new patients, or other permanent residents, are admitted. Visitors are only permitted as part of officially sanctioned tours. State law prohibits anyone under the age of 16 from visiting or living there.


Anyway, this small amount of information can be quite hard to get to. Between the impenetrable formatting of the geographic record files, the bewildering array of different kinds of geographic entities, and the depth of geographic minutiae, it can take quite a bit of head-scratching to extract even the first bits of information from the U.S. Census.

I hope this first tool makes it a little bit less of a hassle. I'd welcome whatever patches people choose to offer: support for summary files beyond SF1, additional geographic summary levels, general fixes, and more.

Aug 5, 2010 7:05am

stress conditions

I'm at Camp Roberts again for a few days, working on Walking Papers with friends from STAR-TIDES, FortiusOne, Google, and Gonzo Earth on open source, geographic crisis response technology. Being in a military environment working on responses to high-speed disaster has me thinking about stress and preparedness. Two excellent magazine articles on the subject crossed my path recently, both forming a cohesive view on the privilege of living without stress. Privilege is driving a smooth road and not even knowing it, and access to that road is contested. Some are born on it, some never reach it, some resent its existence, and some can't shake the memory of the ditch alongside.

Packing Heat, "conditions of readiness," and the gun lobby:

Contempt for Condition White unifies the gun-carrying community almost as much as does fealty to the Second Amendment. "When you're in Condition White you're a sheep," one of my Boulder instructors told us. "You're a victim." The American Tactical Shooting Association says the only time to be in Condition White is "when in your own home, with the doors locked, the alarm system on, and your dog at your feet ... the instant you leave your home, you escalate one level, to Condition Yellow." A citizen in Condition White is as useless as an unarmed citizen, not only a political cipher but a moral dud. ... Having carried a gun full-time for several months now, I can attest that there's no way to lapse into Condition White when armed. ... Condition White may make us sheep, but it's also where art happens. It's where we daydream, reminisce, and hear music in our heads. (Dan Baum in Harpers, sorry for the paywall)

Under Pressure, chemistry, and health/stress feedback loops:

The deadliest diseases of the 21st century are those in which damage accumulates steadily over time. (Sapolsky refers to this as the "luxury of slowly falling apart.") Unfortunately, this is precisely the sort of damage that’s exacerbated by emotional stress. ... One of the most tragic aspects of the stress response is the way it gets hardwired at a young age - an early setback can permanently alter the way we deal with future stressors. The biological logic of this system is impeccable: If the world is a rough and scary place, then the brain assumes it should invest more in our stress machinery, which will make us extremely wary and alert. There's also a positive feedback loop at work, so that chronic stress actually makes us more sensitive to the effects of stress. (Jonah Lehrer in Wired)

Jul 31, 2010 11:16pm

state of the map U.S.

The first domestic edition of the annual OpenStreetMap State Of The Map conference is in a few weeks, and I'll be there in Atlanta along with all the funky geography enthusiasts.

Join us!

Jun 17, 2010 7:02am

blog all kindle-clipped locations: the big short

I picked up Michael Lewis's new book, The Big Short, pretty much as soon as it hit the Kindle a few weeks ago. It's a post-catastrophe account of the subprime mortgage crisis, told through the eyes of a small group of traders who shorted the supposedly unshortable mortgage-backed securities that made everyone rich five years ago.

It's partially a financial story, but to me it's also a story about assumptions. I've been thinking a bit about the effects of unspoken, day-to-day dependencies that we all rely on in our lives. Can we live without them? Are we light enough on our feet to adjust when they shift? Do we even know what they are, and can we explain how they fit together? In a small company like mine, these questions can lead to some fairly serious existential crises. A few years ago, our client base seemed disproportionately tied to the galloping Web 2.0 economy. More recently, the brash announcement by Apple that Flash would be unsupported on the iPad confirmed a long-held suspicion that the platform was on rocky ground. In the trenches of my day-to-day as technology director, I've become excessively sensitive to the problems of cross-dependencies among projects, code bases, and servers. This is not so much an issue of identifying single points of failure as it is a matter of understanding which doorknobs you've tied your teeth to and subsequently forgotten.

Three questions:

  1. Do you depend on anything outside your control? What is it?
  2. Can you repeat past successes with those same externalities?
  3. Could you quarantine, isolate, or replace them, if you had to?

The Big Short is the story of one particular set of external dependencies that turned out to be hopelessly intertwined. Specifically, it's about the revelation that an entire class of financial products based on the performance of mortgage payments was more deeply interdependent and market-distorting than anyone had imagined. It's the moment near the end of a Stephen King novel where all the townspeople are revealed to have first names that start with "K" and they're sitting silently in their cars waiting for you up the road. NPR's Planet Money does a better job of explaining the details ("we bought the toxic asset..."), but the underpinning of the story shows how difficult it is to reject a lie when your livelihood depends on believing it.


Loc. 476-82, an opening anecdote showing the matter-of-fact cultural role of Wall Street greed:

When a Wall Street firm helped him to get into a trade that seemed perfect in every way, he asked the salesman, "I appreciate this, but I just want to know one thing: How are you going to fuck me?" Heh-heh-heh, c'mon, we'd never do that, the trader started to say, but Danny, though perfectly polite, was insistent. We both know that unadulterated good things like this trade don't just happen between little hedge funds and big Wall Street firms. I'll do it, but only after you explain to me how you are going to fuck me. And the salesman explained how he was going to fuck him. And Danny did the trade.

Loc. 483-87, Steven Eisman is one of the main characters, a brusque gadfly with odd listening habits:

Working for Eisman, you never felt you were working for Eisman. He'd teach you but he wouldn't supervise you. Eisman also put a fine point on the absurdity they saw everywhere around them. "Steve's fun to take to any Wall Street meeting," said Vinny. "Because he'll say 'explain that to me' thirty different times. Or 'could you explain that more, in English?' Because once you do that, there's a few things you learn. For a start, you figure out if they even know what they're talking about. And a lot of times they don't!"

Loc. 985-89, on the undesirability of defending an idea:

Inadvertently, he'd opened up a debate with his own investors, which he counted among his least favorite activities. "I hated discussing ideas with investors," he said, "because I then become a Defender of the Idea, and that influences your thought process." Once you became an idea's defender you had a harder time changing your mind about it. He had no choice: Among the people who gave him money there was pretty obviously a built-in skepticism of so-called macro thinking.

Loc. 1788-96, on the role of research that seemingly no one else wants to do. This is actually one of the most interesting aspects of The Big Short for me, the relative rarity of legwork compared to the ease of sticking to first appearances:

It wasn't a question two thirty-something would-be professional investors in Berkeley, California, with $110,000 in a Schwab account should feel it was their business to answer. But they did. They went hunting for people who had gone to college with Capital One's CEO, Richard Fairbank, and collected character references. Jamie paged through the Capital One 10-K filing in search of someone inside the company he might plausibly ask to meet. "If we had asked to meet with the CEO, we wouldn't have gotten to see him," explained Charlie. Finally they came upon a lower-ranking guy named Peter Schnall, who happened to be the vice-president in charge of the subprime portfolio. "I got the impression they were like, 'Who calls and asks for Peter Schnall?'" said Charlie. "Because when we asked to talk to him they were like, 'Why not?'" They introduced themselves gravely as Cornwall Capital Management but refrained from mentioning what, exactly, Cornwall Capital Management was. "It's funny," says Jamie. "People don't feel comfortable asking how much money you have, and so you don't have to tell them."

Loc. 1830-34, on arguing convincingly:

Both had trouble generating conviction of their own but no trouble at all reacting to what they viewed as the false conviction of others. Each time they came upon a tantalizing long shot, one of them set to work on making the case for it, in an elaborate presentation, complete with PowerPoint slides. They didn't actually have anyone to whom they might give a presentation. They created them only to hear how plausible they sounded when pitched to each other. They entered markets only because they thought something dramatic might be about to happen in them, on which they could make a small bet with long odds that might pay off in a big way.

Loc. 2206-11, more on Eisman's listening habits:

Eisman had a curious way of listening; he didn't so much listen to what you were saying as subcontract to some remote region of his brain the task of deciding whether whatever you were saying was worth listening to, while his mind went off to play on its own. As a result, he never actually heard what you said to him the first time you said it. If his mental subcontractor detected a level of interest in what you had just said, it radioed a signal to the mother ship, which then wheeled around with the most intense focus. "Say that again," he'd say. And you would! Because now Eisman was so obviously listening to you, and, as he listened so selectively, you felt flattered.

Loc. 3260-64, on $1.2 billion:

In early July, Morgan Stanley received its first wake-up call. It came from Greg Lippmann and his bosses at Deutsche Bank, who, in a conference call, told Howie Hubler and his bosses that the $4 billion in credit default swaps Hubler had sold Deutsche Bank's CDO desk six months earlier had moved in Deutsche Bank's favor. Could Morgan Stanley please wire $1.2 billion to Deutsche Bank by the end of the day? Or, as Lippmann actually put it - according to someone who heard the exchange - Dude, you owe us one point two billion.

Loc. 3413-22, on eight days of chlorine for all of Chicago:

His wife's extended English family of course wondered where he had been, and he tried to explain. He thought what was happening was critically important. The banking system was insolvent, he assumed, and that implied some grave upheaval. When banking stops, credit stops, and when credit stops, trade stops, and when trade stops - well, the city of Chicago had only eight days of chlorine on hand for its water supply. Hospitals ran out of medicine. The entire modern world was premised on the ability to buy now and pay later. "I'd come home at midnight and try to talk to my brother-in-law about our children's future," said Ben. "I asked everyone in the house to make sure their accounts at HSBC were insured. I told them to keep some cash on hand, as we might face some disruptions. But it was hard to explain." How do you explain to an innocent citizen of the free world the importance of a credit default swap on a double-A tranche of a subprime-backed collateralized debt obligation? He tried, but his English in-laws just looked at him strangely. They understood that someone else had just lost a great deal of money and Ben had just made a great deal of money, but never got much past that. "I can't really talk to them about it," he says. "They're English."

Loc. 3747-52, on being dumb and looking for grownups:

The big Wall Street firms, seemingly so shrewd and self-interested, had somehow become the dumb money. The people who ran them did not understand their own businesses, and their regulators obviously knew even less. Charlie and Jamie had always sort of assumed that there was some grown-up in charge of the financial system whom they had never met; now, they saw there was not. "We were never inside the belly of the beast," said Charlie. "We saw the bodies being carried out. But we were never inside." A Bloomberg News headline that caught Jamie's eye, and stuck in his mind: "Senate Majority Leader on Crisis: No One Knows What to Do."

Loc. 3880-82, a last word on dependencies:

The changes were camouflage. They helped to distract outsiders from the truly profane event: the growing misalignment of interests between the people who trafficked in financial risk and the wider culture. The surface rippled, but down below, in the depths, the bonus pool remained undisturbed.

Jun 16, 2010 7:25am

clipper futures

On June 16th, 2010, the Bay Area Metropolitan Transportation Commission released Clipper, a rebranding of the original Translink payment card for public transportation. In the ensuing five years, Clipper has overtaken other forms of fare and payment to become the only remaining acceptable method of beeping a ride on Bay Area public transit. It started with San Francisco's Muni and Alameda County Transit, later expanding to BART, Santa Clara Valley, San Mateo's SamTrans, Golden Gate Transit, Yellow Cab, Veterans Cab Co., and most recently the full complement of toll bridges, including the Golden Gate, Bay Bridge, Richmond, and San Mateo bridges, thanks to Caltrans. Clipper is everywhere around the Bay.

It's difficult to remember now that just half a decade ago, local transit was a confusing jumble of mismatched schedules and fares. Riders no longer restrict themselves to monthly passes from a single agency, choosing instead to hop from one mode to another as their needs emerge. Unified payment makes most of this activity possible: the Clipper card has grown in importance along with the government ID and the credit card. Initially envisioned purely as a multi-agency payment card, Clipper has long since erased the functional distinctions among agencies thanks to the smooth thoughtlessness of synchronized payments on all sides of the Bay.

Similar to the historical unification of competing streetcar companies under the umbrella of city-operated transportation authorities, we expect that late next year or possibly early 2017, the MTC Commissioners will retire the independent identities of SF Muni, AC Transit, Santa Clara VTA and SamTrans to be replaced by a unified "Bay Area Clipper" name. Most local observers view this as a pure formality, though it's expected that in keeping with its historic reluctance to participate in inter-agency plans, the BART Board will delay participation in the new name until 2018 at the earliest. They'll come around eventually.

Clipper has also managed to tap into a full range of data streams connected to urban transit, making them more interesting and valuable along the way. The fluidity and ease of motion we enjoy now is also made possible by the nearly ubiquitous availability of up-to-the-minute locations for participating vehicles, published by the agencies themselves as a public stream of real-time updates accessible to programmers and normal users alike. Buses and trains are almost universally equipped with location-tracking devices based on GPS and wireless signals, and a suite of applications from commercial, open-source, and philanthropic developers builds on the predictive, route-finding data services published for every vehicle in the system. This has helped ease the pain of occasional budget cuts and service disruptions, giving users a way to minimize the time they waste waiting around for their next ride.

Interestingly, the commercial providers of wi-fi, cell tower, and other "RF beacon" geolocation services have transitioned to something more like a regulated utility model. Originally set up to provide location lookup services for smartphones near the end of the '00s, the largest of these providers recently folded their billing for Bay Area users into the Clipper system. Lookups of your physical location, or of an arrival time for a bus, are simply charged to your card. Many users never even see these charges, since a large number of employers pick up the tab for employee accounts on a pre-tax basis.

More recently, concerns have surfaced around the privacy of data tracked through the Clipper servers. People are understandably jittery, after the numerous social networking data breach debacles of three years ago that seemingly turned a generation off of oversharing. MTC have gone to great pains to assure users of the system that their data is safe from "getting zucked", and they've begun to provide free personal monitoring services to users of Clipper. It's now possible to access a complete, up-to-the-minute stream of your own card usage (including the geographic location of each beep) along with a record of access requests to that same data by parents, friends, mobile apps, credit reporting firms, or government agencies monitoring transportation use for oil-credit tax breaks. If someone's peeking at your transit history, you're the first one to know.

As Clipper begins its fifth year, we're seeing movement toward expanding the card into other uses around the Bay Area. Loose legal definitions of "transportation spending" have bike and rollerblade shops lobbying to allow repairs and maintenance to be charged to a Clipper account. The financial district congestion zone has announced a feasibility study for ditching its proprietary payment structure for Clipper. Even hardware companies have started to retool their smartchip-based door locks to optionally work with the system, bringing the card right into private homes. It seems too obvious to mention and too pervasive to notice, but seemingly everywhere you turn the one-time bus payment card has turned into a key to complete mobility and total access.

Let's try and not screw this one up.

Jun 10, 2010 7:05am

swishy curves, where you want them

There's a short series of posts in here, but I'm out of practice with blogging so I'll start with just this.

I've exhumed some of last year's thinking on heat maps, and re-encountered Zach and Andy's excellent series of posts on geographic isolines. Zach Johnson posted some ideas on a quick-and-dirty method for generating isolines from a field of point measurements, using the Delaunay triangulation (which I've mentioned before). Andy Woodruff followed up with ideas about curves, specifically the problem of passing smooth curves through a set of points.

The general idea is to go from this:

...to this:

...using a process somewhat like this:

("S" from Raph Levien)

Anyway. Splines are pretty much a solved problem, in the sense that your typical graphics library is going to support at least cubic splines - Flash, SVG, etc. all have native methods for making smooth curves between two endpoints with two additional control points. If you've used Adobe Illustrator at any time in the past 23 years, you'll know how this works:

Andy writes about the generation of curves that are smooth, yet guaranteed to pass through a full set of points without appearing discontinuous. What makes this difficult is that there are an infinite number of solutions, generally differentiated by the amount of tension or distance along connections between pairs of points. It's also difficult to express these solutions using control points; where do you place them? Often you need to make assumptions about the correct slope through a given point, and often those assumptions lead to some weird-looking results.

With a bit of basic algebra it's possible to deconstruct these smooth curves into a set of parametric equations. The trick is to introduce a new independent variable, t, which represents progress along the curve through a third dimension. T is for time, because it makes sense to think about x and y in the image plane varying through time. You can find a smooth curve through any set of points - a straight line through two points, a quadratic curve through three points, a cubic curve through four points, etc. I'm stopping at four points and the cubic curve; it looks good and is easy to calculate.

Here's one such 2D cubic curve:

It might look familiar if you've ever used a TI-81. The twisty math bit here is that given a set of any four arbitrary points, it's possible to generate one such curve that passes through them all. The second twisty math bit is that you can do this for the x and y components of the curve separately, creating two separate functions over t that move through space and describe a curve. Wikipedia has an excellent article on systems of linear equations, though I avoided most of the tedious algebra by using Sympy, a symbolic math library for Python that does the work. This is the resulting function, and here is an animation showing eight separate cubic functions over a set of eight arbitrary points:
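The fitting step above can be sketched without Sympy's symbolic machinery. This is a minimal numeric version, assuming evenly-spaced t values of 0, 1/3, 2/3, 1 for the four points (the spacing is my assumption; chord-length spacing would also work) - the same linear system is solved once for the x components and once for y:

```python
import numpy as np

# Four arbitrary 2D points, each assigned a position t along the curve.
points = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.5), (4.0, 0.5)]
ts = np.linspace(0.0, 1.0, 4)  # t = 0, 1/3, 2/3, 1

# Cubic: f(t) = a*t^3 + b*t^2 + c*t + d. Four points give four linear
# equations in four unknowns - a Vandermonde system.
A = np.vander(ts, 4)  # columns are t^3, t^2, t, 1
xs, ys = zip(*points)
x_coeffs = np.linalg.solve(A, np.array(xs))
y_coeffs = np.linalg.solve(A, np.array(ys))

def curve(t):
    """Evaluate the fitted parametric curve at t in [0, 1]."""
    return float(np.polyval(x_coeffs, t)), float(np.polyval(y_coeffs, t))
```

Evaluating `curve` at t = 0, 1/3, 2/3, 1 returns the four input points, and everything in between is the smooth cubic passing through them.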

Here's the same image showing all the functions overlaid on top of one another, with the "closest" central one highlighted in turn:

This is the third twisty math bit. To create a single, smooth curve out of all these pieces, you adjust the relative influence of each one over its length using a basis function. They're described in excruciating detail at ibiblio.org, where you can scroll down that page to play with some interactive Java applets. I've cheated, and used sine waves that still look something like this over the length of the curve:
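A rough sketch of that blending step: each cubic piece gets a sine-bump weight over its stretch of the curve, and at any t the weights are normalized so they sum to one, cross-fading between overlapping pieces. The window layout here is my own assumption, not the original code's:

```python
import math

def blend_weights(t, n):
    """Normalized sine-bump weights for n overlapping curve pieces at
    global parameter t in [0, 1]. Piece i covers the overlapping window
    (i/(n+1), (i+2)/(n+1)), so adjacent pieces cross-fade."""
    raw = []
    for i in range(n):
        lo, hi = i / (n + 1.0), (i + 2) / (n + 1.0)
        if lo < t < hi:
            # Sine bump: zero at the window edges, peaking in the middle.
            raw.append(math.sin(math.pi * (t - lo) / (hi - lo)))
        else:
            raw.append(0.0)
    total = sum(raw)
    return [w / total for w in raw] if total else raw
```

The normalization is what keeps the combined curve from bulging or pinching where pieces overlap: whatever the raw bump values are, their shares always add up to one.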

Putting it all together, you get this transition from simple cubic curves to a complete, smooth system that passes through a full set of points:

The code for all this is a bit of a hairball (also syntax-highlighted), but hopefully useful.

One reason I find this method interesting is that the end product is actually not a curve at all, but a series of points joined by straight line segments. You provide the illusion of a curve by making lots of them, and very short. This means that the resulting "curves" are trivially compatible with geographic software like PostGIS or Mapnik, and therefore possible to simplify using tools like Mapshaper, distribute in formats like Shapefile, and render with plain old TileCache.  At the cost of additional file size, you are freed from Flash and dumped wide-eyed and blinking in the world of actual geography.
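The polyline output can be sketched like so - sampling a parametric curve at many t values and writing the result as a WKT LINESTRING, which PostGIS and friends accept directly. The two sets of cubic coefficients here are arbitrary stand-ins for fitted x(t) and y(t) components, not the post's actual data:

```python
import numpy as np

# Example cubic coefficients (highest power first) for x(t) and y(t).
x_coeffs = [1.0, -2.0, 3.0, 0.0]
y_coeffs = [-1.5, 1.0, 2.0, 0.0]

steps = 64  # more steps means shorter segments: a better illusion of
            # smoothness, at the cost of a larger file
ts = np.linspace(0.0, 1.0, steps + 1)
pairs = ('%.4f %.4f' % (np.polyval(x_coeffs, t), np.polyval(y_coeffs, t))
         for t in ts)
wkt = 'LINESTRING(%s)' % ', '.join(pairs)
```

From here the geometry is just ordinary GIS data: it can be loaded into PostGIS, simplified, or rendered like any other linestring.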

Apr 16, 2010 5:52am

blog all kindle-clipped locations part III

...continued from part II

Wild Bill Bunge

Axis Maps cartographer Zachary Forest Johnson wrote this loving essay-length biography of William Bunge, radical geographer. I loved this excerpt on frozen moments in time, and the necessity of choosing an instant when mapping:

Much of Bunge's cartographic theory is contained in the foreword to the book. Speaking of a historical farm map created for the book (portion above): Maps attempt to integrate over time, that is, maps assume an average span of time. This means that nothing that moves is mapped, and therefore property is inherently preferred over humans. In order to restore truth to the map it is necessary to achieve a fiction of accuracy through an assumption, namely that the map is drawn at an exact instant of time. In this case, the time is June 20, 1915 at 2 p.m. on a sunny day. This fiction freezes the men and horses on the roads, the strawberry pickers in the fields, as well as the crops in rotation and the animals in pasture. This restores life to the dead map of property.

And this, on the relationship between communication technique (old-school graphic design equipment!), choice of study area, and communication efficacy:

Learning how to make a clean line, lay a rip-a-tone pattern, or design a map with the right combination of point, area, and line symbols did not seem to be critical knowledge to members of a survival culture. But the school decentralization study made sense. The next three weeks both saved and came to define the potential of the Expedition. The decentralization report - rich in graphs and maps created by Bunge and the Expedition's students - was adopted by a community group and forced the Board of Education to respond to charges that its school districting plans were illegal.

The Art Of Loitering

Chris Heathcote blogged a lengthy passage of A.B. Austin's 1931 The Art Of Loitering. I especially enjoyed this pair of sentences about the then-new practice of working class pleasure-driving on weekends, and the new ownership of the roads by cars:

I had really no business to be meandering along their road. My creeping progress might spoil someone's new-found pleasure. For it was their road. It had been built, or rather adapted, for them. Without its glossy blue-black surface, its faultless camber, its generous width, its gentle curves, they could no more pursue their hobby, seize their thrill, than the railway train could run without its track.

Code Is Not Inevitable

Mark Rickerby writes about literacy in coding, and suggests that good programmers are good editors: "I started noticing a single quality shared by all the coders who were producing the most destructive output: they seemed to have a compulsive fear of changing code after it was written."

I have come to believe that the vocabulary of technology is not sufficient to understand situations like these. Primarily, spaghetti code is a literary failing. Through my observations of the developers responsible for these wrecks - they often turned out to be poor prose writers and some were very arrogant about their coding abilities. I believe the core skill that these cowboys lack is that of editing - an instinctive drive towards pruning and tweaking that all good writers know is one of the most important components of literary creation.

On several distinct forms of literacy:

In his further discussion of computer literacy, Kay outlines three core aspects derived from an understanding of English literacy: Access literacy (reading), Creation literacy (writing), and Genre literacy (shaping context of style and form).

The Obama Constituency

This was dense.

There is another constituency - self-employed men and women (often barely afloat) - who identify with the "haves," their present economic status notwithstanding. What they have is not so much current wealth, but a history of, or aspiration towards, status, authority, and autonomy. They are not willing to relinquish their past beliefs or their goals for the future. They conceive of themselves as self-reliant and as integral to what was once an undisputed notion of "American Exceptionalism." The number of the self-employed is expanding at a much faster pace than the population as a whole - to some extent out of necessity, as firms impose major cutbacks, forcing employees to go out on their own.

The Conquest Of Cool

Thomas Frank's The Conquest Of Cool is about the rise of "hip consumerism", specifically as it's connected to advertising and menswear. There's quite a bit of Mad Men in here, and I'm especially interested in the idea that the culture and counterculture weren't quite so separate at the time, and that business culture was going through its own set of tumultuous changes mirroring those of the youth movement. Anyway I clipped a lot of passages here; maybe it means I need to buy the book.

First things first:

Conflicting though they may seem, the two stories of sixties culture agree on a number of basic points. Both assume quite naturally that the counterculture was what it said it was; that is, a fundamental opponent of the capitalist order. Both foes and partisans assume, further, that the counterculture is the appropriate symbol - if not the actual historical cause - for the big cultural shifts that transformed the United States and that permanently rearranged Americans' cultural priorities. They also agree that these changes constituted a radical break or rupture with existing American mores, that they were just as transgressive and as menacing and as revolutionary as countercultural participants believed them to be. More crucial for our purposes here, all sixties narratives place the stories of the groups that are believed to have been so transgressive and revolutionary at their center; American business culture is thought to have been peripheral, if it's mentioned at all. Other than the occasional purveyor of stereotype and conspiracy theory, virtually nobody has shown much interest in telling the story of the executives or suburbanites who awoke one day to find their authority challenged and paradigms problematized. And whether the narrators of the sixties story are conservatives or radicals, they tend to assume that business represented a static, unchanging body of faiths, goals, and practices, a background of muted, uniform gray against which the counterculture went through its colorful chapters.


The 1960s was the era of Vietnam, but it was also the high watermark of American prosperity and a time of fantastic ferment in managerial thought and corporate practice. Postwar American capitalism was hardly the unchanging and soulless machine imagined by countercultural leaders; it was as dynamic a force in its own way as the revolutionary youth movements of the period, undertaking dramatic transformations of both the way it operated and the way it imagined itself.

On the study of selling out:

It is more than a little odd that, in this age of nuance and negotiated readings, we lack a serious history of co-optation, one that understands corporate thought as something other than a cartoon. Co-optation remains something we vilify almost automatically; the historical particulars which permit or discourage co-optation - or even the obvious fact that some things are co-opted while others are not - are simply not addressed.

On Wired, pretty much:

The revolutions in menswear and advertising - as well as the larger revolution in corporate thought - ran out of steam when the great postwar prosperity collapsed in the early 1970s. In a larger sense, though, the corporate revolution of the 1960s never ended. In the early 1990s, while the nation was awakening to the realities of the hyperaccelerated global information economy, the language of the business revolution of the sixties (and even some of the individuals who led it) made a triumphant return.

On permanent revolution:

The counterculture has long since outlived the enthusiasm of its original participants and become a more or less permanent part of the American scene, a symbolic and musical language for the endless cycles of rebellion and transgression that make up so much of our mass culture.

Back to part I...

Apr 16, 2010 5:52am

blog all kindle-clipped locations part II

...continued from part I.

Is Geography The New History?

I've felt for some time that the discipline of Geography is being shifted to the foreground:

Whatever aspect of geography it is that you start with threatens to segue into a discussion on the most polarising topic there is: climate change. Miss Prism would be quick to notice that geography is no longer a polite subject for meal time. Something similar has happened to atlases. They were once placid, unhurried publications with additional information on the colours of national flags. Now atlases are freighted with maps showing cities that are likely to be submerged if the Arctic melts, or projected population growth, or the relative size of countries in terms of CO2 emissions, or areas where water scarcity will be most intense and resource wars most likely to break out. An atlas is beginning to look like a long-term forecast - history before it happens.

The Deflationist: How Paul Krugman found politics

Larissa MacFarquhar's New Yorker article on Paul Krugman's journey into lefty politics. There's some good stuff in here about the technical aspects of academic economics, its relationship to justice, and the progression of knowledge in a discipline:

"Keynesian economics, which was coming out of the model-based tradition, even if it was pretty loose-jointed by modern standards, basically said, 'Push this button.' " Push this button - print more money, spend more money - and the button-pushing worked. Push-button economics was not only satisfying to someone of Krugman's intellectual temperament; it was also, he realized later, politically important. Thinking about economic situations as infinitely complex, with any number of causes going back into the distant past, tended to induce a kind of fatalism: if the origins of a crisis were deeply entangled in a country's culture, then maybe the crisis was inevitable, perhaps insoluble - even deserved.

On the necessity of models:

Again, as in his trade theory, it was not so much his idea that was significant as the translation of the idea into mathematical language. "I explained this basic idea" - of economic geography - "to a non-economist friend," Krugman wrote, "who replied in some dismay, 'Isn't that pretty obvious?' And of course it is." Yet, because it had not been well modelled, the idea had been disregarded by economists for years. Krugman began to realize that in the previous few decades economic knowledge that had not been translated into models had been effectively lost, because economists didn't know what to do with it.

On the loss of knowledge, similar to the much longer and completely-worth-reading Scott And Scurvy by Maciej Ceglowski:

Sixteenth-century maps of Africa were misleading in all kinds of ways, but they contained quite a bit of information about the continent's interior - the River Niger, Timbuktu. Two centuries later, mapmaking had become much more accurate, but the interior of Africa had become a blank. As standards for what counted as a mappable fact rose, knowledge that didn't meet those standards - secondhand travellers' reports, guesses hazarded without compasses or sextants - was discarded and lost. Eventually, the higher standards paid off - by the nineteenth century the maps were filled in again - but for a while the sharpening of technique caused loss as well as gain.

Taking on Afghanistan's 'Human Terrain'

Short but sweet and on a line with Thomas P.M. Barnett's idea of a SysAdmin force ("The 'second half' blended force that wages the peace after the Leviathan force has successfully waged war"):

Deploying small groups of soldiers into remote areas, Colonel Schweitzer's paratroopers organized jirgas, or local councils, to resolve tribal disputes that have simmered for decades. Officers shrugged off questions about whether the military was comfortable with what David Kilcullen, an Australian anthropologist and an architect of the new strategy, calls "armed social work." "Who else is going to do it?" asked Lt. Col. David Woods, commander of the Fourth Squadron, 73rd Cavalry. "You have to evolve. Otherwise you're useless."

Open Geospatial Tools Expand Their Niche

First they ignore you, etc.:

Open source just isn't a dirty word anymore. Go back to 2000, and there were a surprising number of managers that would literally shy away… there was still that "dirty hippy" aura around open source. But at this point they do surveys of Fortune 500 CEOs about whether they're using open source or have an open source strategy, and the responses have gone from 20 percent positive to 80 percent positive. The snide remark often made in the survey reviews is that the remaining 20 percent are using open source but their staff just hasn't told them. The thing that changes a conservative decision maker's mind isn't a great sales presentation, it's knowing that other conservative decision makers have already made the decision. Once that wave starts rolling, it's very difficult to stop.

The Red Carpet Campaign and News Without the Narrative Needed to Make Sense of the News

Two unrelated articles, except that they are both about the importance of a story arc to the understanding of competition and controversy.

On the Oscars:

A good Oscar narrative makes voters feel that, by writing a name on a ballot, they're completing a satisfying plotline. Only a few of these stories are effective, and every campaign season, movies scramble to own them. The best are reused year after year: for example, The Little Movie That Could, the tale of a low-budget indie, a David among studio Goliaths, that often appeals to voters who hate Hollywood's bigger-is-better aesthetic.

On the news, Jay Rosen:

I was grateful, because up to that moment I had absorbed many hundreds of reports about the "subprime lenders in trouble" but had not understood a single one of them. It wasn't that these reports were uninformative. Rather I was not informable because I lacked the necessary background knowledge to grasp what was being sent to me as news.

Mad Men: A Foucaultian take

Will Davies's reading of Mad Men, mostly interesting because I love the show and I'm pining for the next season:

Then there is the subtle questioning of liberation. The historical constant in Mad Men is libido, which empowers and dominates in equal measure. The shift from one epoch (of sexism, domesticity, formality) to a new one (of equality, self-fulfillment and informality) is not represented as progress in any way whatsoever, but simply what Foucault might call a reconfiguring of the economy of desire. In this respect Mad Men - and this is the genius - is a satire of both conservative and liberal America, showing the choice between the two as arbitrary. 

But Today We Collect Gizmos

Fred Scharmen (sevensixfive) is a friend and Baltimore architect. He's interested in the way that disciplines relate to one another, and this post is the first time I've seen gizmo defined as "a temporary, easily available, means of organizing an undifferentiated continuum ... to bring many models to bear on the problems we are presented with."

To be honest I have only a glancing understanding of the broader point here but there are a few moments that made this worth noticing:

The landscape is informational, the desert is networked. If it is all constructed, or at least made from parts of constructs, the ground can be mined for patterns. Even the navigational gizmos themselves are little else but temporary constellations within social, material, and informational networks. There is the persistent rumor that the skins of the Powerbook G4 and the Guggenheim Bilbao were only feasible to produce during a global dip in titanium prices, after Russia flooded the market in the late 90s. Tablet computers are nothing if not devices to sort through the tangle of text and publishing outlets available, and bring reading back under some kind of manageable control.

And, on the transferability of technique:

Techniques, when named, abstracted to their simplest form, and packaged up (Sears catalogue style), seem to want to travel. What can we learn about sustainability from the closed-loop space colony ecosystem diagrams of the the 1970s? How can we talk to civil engineers about the emerging trend of micropractices in stormwater management? A collection of gizmo metaheuristics enables a more fluid code-switching, and a more useful exchange of knowledge within and between disciplines.

Continued in part III...

Apr 16, 2010 5:51am

blog all kindle-clipped locations part I

Matt Jones took my old "blog all dog-eared pages" habit and adapted it for the Amazon Kindle, resulting in the less-than-satisfying name "blog all Kindle-clipped locations" after the Kindle's internal position marker. Even this is a problematic name, since much of my reading on the device is mediated through Instapaper, whose delivery mechanism augments a single document collection with new reading material; position is discarded as new material arrives.

Still, I continue to be happy with the Kindle's place in my life, in a way that the iPad seemingly hasn't captured. Amazon's device is calm, thin, and light where Apple's is bright, fat, and heavy. It actually surprised me what a slug it was, even though I still remember seeing a 16 pound "Macintosh Portable" from the pre-Powerbook days of high school. I like its passive role as a simple reading tablet, and the way that not having a touch screen means not having a touched screen. Although Instapaper is probably available for iPad, I like that it's not a proper application for the Kindle, but rather just a way of shooting bookmarked articles to myself when I occasionally switch on the network for more articles to read.

The thing is that there have been a lot of articles - the clippings here are selections from almost four months of reading. That's too much to collect without archiving somehow, so what follows is a bit of a slag heap. I'm breaking it up into three separate blog posts (parts II and III).

The Sources Of Soviet Conduct

First up are a few excerpts from George Kennan's 1947 The Sources Of Soviet Conduct, an adaptation of his famous Long Telegram. The essay was a seminal, influential work of the early Cold War.

On aggressive intransigence:

It is an undeniable privilege of every man to prove himself right in the thesis that the world is his enemy; for if he reiterates it frequently enough and makes it the background of his conduct he is bound eventually to be right.

On the core concept of antagonism and mistrust in Soviet ideology:

The first of these concepts is that of the innate antagonism between capitalism and Socialism. We have seen how deeply that concept has become imbedded in foundations of Soviet power. It has profound implications for Russia's conduct as a member of international society. It means that there can never be on Moscow's side any sincere assumption of a community of aims between the Soviet Union and powers which are regarded as capitalist. It must inevitably be assumed in Moscow that the aims of the capitalist world are antagonistic to the Soviet regime, and therefore to the interests of the peoples it controls. If the Soviet government occasionally sets its signature to documents which would indicate the contrary, this is to be regarded as a tactical maneuver permissible in dealing with the enemy (who is without honor) and should be taken in the spirit of caveat emptor. Basically, the antagonism remains. It is postulated. And from it flow many of the phenomena which we find disturbing in the Kremlin's conduct of foreign policy: the secretiveness, the lack of frankness, the duplicity, the wary suspiciousness, and the basic unfriendliness of purpose. These phenomena are there to stay, for the foreseeable future.

On the unending patience in Soviet tactics:

Its political action is a fluid stream which moves constantly, wherever it is permitted to move, toward a given goal. Its main concern is to make sure that it has filled every nook and cranny available to it in the basin of world power. But if it finds unassailable barriers in its path, it accepts these philosophically and accommodates itself to them. The main thing is that there should always be pressure, unceasing constant pressure, toward the desired goal. There is no trace of any feeling in Soviet psychology that that goal must be reached at any given time.

Questioning Capitalist Realism

I've linked to the writings of K-Punk, a.k.a. Mark Fisher, many times in the past, mostly his writings on music and the "hardcore continuum". Mark is where I first heard of Zomby, who made it to my 2009 oft-played tracks (I'm not sure what it says that I get my cutting edge music from an academic). In an interview with Fisher about his book, Matthew Fuller asks about the division of responsibility between the state and the individual. I like Mark's idea of the "privatization of stress", which seems doubly relevant in the aftermath of a nationwide healthcare debate:

The privatization of stress is central to capitalist realism. If they are "stressed", workers in overloaded institutions are encouraged, not to complain about their workload, but to engage in the kind of performance auditing activities which contributed to their distress in the first place. The question is no longer, "how did work cause you to be unwell?", but "what about you made you unable to do your job properly?" An individual-therapeutic model of stress deflects any structural account of how the stress arose.

Multicultural Critical Theory. At B-School?

Just this sentence, really:

Mr. Saloner says Stanford wants its business students to develop "a lens that brings some kind of principled set of scales to the problem." In other words, he says, students need to learn to ask themselves, "In whose interest am I making the decision?"

The Art of War

Eyal Weizman's 2006 article about IDF urban warfare tactics turned on my full range of Greenfield/Slavin receptors. Mostly, though, it made me incredibly angry. On the one hand, the application of critical theory to warfare is superficially interesting. On the other, it's repulsive in its excuse-making for the forcible takedown of the public/private boundary, and insulting in its implication that an understanding of deconstruction is necessary to hammer through walls. A lot of this is just basic reaction to facts-on-the-ground and convenient forgetting of the Geneva Conventions.

To begin with, soldiers assemble behind the wall and then, using explosives, drills or hammers, they break a hole large enough to pass through. Stun grenades are then sometimes thrown, or a few random shots fired into what is usually a private living-room occupied by unsuspecting civilians. When the soldiers have passed through the wall, the occupants are locked inside one of the rooms, where they are made to remain - sometimes for several days - until the operation is concluded, often without water, toilet, food or medicine. Civilians in Palestine, as in Iraq, have experienced the unexpected penetration of war into the private domain of the home as the most profound form of trauma and humiliation.

I still struggle a bit with this article. I'm fascinated by the idea that different professions see reality as a different set of affordances, but at some point this just devolves into a game of dressing up destruction and abuse.

I then asked him, why not Derrida and Deconstruction? He answered, "Derrida may be a little too opaque for our crowd. We share more with architects; we combine theory and practice. We can read, but we know as well how to build and destroy, and sometimes kill."

The conscription of Gordon Matta-Clark here is a bridge too far. "Un-walling", really?

Future military attacks on urban terrain will increasingly be dedicated to the use of technologies developed for the purpose of "un-walling the wall", to borrow a term from Gordon Matta-Clark. This is the new soldier/architect's response to the logic of "smart bombs". The latter have paradoxically resulted in higher numbers of civilian casualties simply because the illusion of precision gives the military-political complex the necessary justification to use explosives in civilian environments.

A sort of justification:

When the military talks theory to itself, it seems to be about changing its organizational structure and hierarchies. When it invokes theory in communications with the public - in lectures, broadcasts and publications - it seems to be about projecting an image of a civilized and sophisticated military. And when the military "talks" (as every military does) to the enemy, theory could be understood as a particularly intimidating weapon of "shock and awe", the message being: "You will never even understand that which kills you."

Continued in part II...

Mar 18, 2010 8:31pm

back from austin

I returned from Austin two nights ago. SXSW was very large. There's too much to say, so here are the few notes I scribbled during the very few panels I attended (Design Fiction was a particular highlight, as was collaboration with Newspaper Club).

Mar 11, 2010 5:16am

sxsw 2010-bound

You picked us for this year's SXSW, so this is the post where I say yes I'm going and won't you please say hello? Our panel will be towards the end of the thing, 3:30 PM Tuesday. I look something like this and it's entirely likely that we have lots to talk about.

Mar 8, 2010 7:59am

walking papers plug

I've gotten a bunch of renewed interest in Walking Papers lately (previously: technology, launch, presentation, hacking). A few people have mentioned that it's difficult to install, which is true. The project originally depended on a few "HTTP ponies", single-purpose web servers with jobs like reading QR codes or rendering static maps. I've made some changes there, written up a proper installation guide, and added an explicit GNU public license to clarify the terms under which I've released the software.

It's been a lot of fun getting the project running on this guy, though:

That's a SheevaPlug, a 1.2GHz Linux server with 512MB of RAM that uses about 5W of power. It comes with Ubuntu 9.04 preinstalled, and when you plug it into a network it basically Just Works. The one I purchased from Ionics last week looks a bit different from the one above: instead of an integrated SD card slot and mini USB serial console, it has sort of a strange printed circuit hernia sticking out one side and a detachable plastic sidecar with the SD slot and mini USB. Less nice, but no matter - it does the right thing.

It's kind of an amazing, exciting piece of technology for a few interesting reasons. Mainly, I think it represents a sensible move for some of the disaster response uses we've seen Walking Papers applied to, specifically situations where remote internet is unavailable, power is hard to come by, and a highly portable little wall-wart could unobtrusively provide map printing and scanning services. It's definitely not a fast computer. My initial paper scanning, SIFTing and QR decoding tests showed it to be almost an order of magnitude slower than my Macbook at the difficult, CPU-intensive mathy inner loops of the decoder process. For the web and data service it's a champ, though.

In the near future, I'd be interested in deploying a complete OpenStreetMap server installation on one of these guys, which probably means getting a bootable volume onto the SD card instead of the severely limited half-GB of flash storage that comes inside the unit.

Anyway, what follows is the installation guide that you can find in the Paperwalking project.

Installing Paperwalking on Ubuntu 9


I've tested this guide on two platforms: Amazon's EC2 cloud-based computing service, and the SheevaPlug, a miniaturized ARM-based plug computing platform. SheevaPlug comes with Ubuntu 9.04, while EC2 can run Ubuntu 9.10 machine images. Machine images from Alestic.com seem to work well, and the guide below was produced using ami-19a34270 (alestic/ubuntu-9.10-karmic-base-20090623).

When setting up an EC2 instance, make sure it's in a security group that can accept both SSH and HTTP connections!

Start by logging into the server via SSH, as the root user.


There are a few packages that you will need to install: some base material, packages used by the offline image decoder, and packages to help run the public-facing website. During the last step below, you'll be asked to create a root MySQL password a few times; it's fine to leave this blank.

% apt-get update
% apt-get install curl vim screen tcsh sudo build-essential git-core
% apt-get install python-imaging python-numpy openjdk-6-jre-headless
% apt-get install libapache2-mod-php5 php5-gd php5-mysql mysql-server-5.1 php-pear

Paperwalking uses server packages from PHP's PEAR collection. They can be installed via the pear utility. Some of the packages below will throw deprecation warnings; don't worry about those.

% pear install Crypt_HMAC HTTP_Request DB
% pear install Crypt_HMAC2 MDB2 MDB2#mysql

Apache's default configuration will need to be edited slightly. Edit the line with "DocumentRoot" to say: DocumentRoot /var/www/paperwalking/site/www, then restart Apache.

% pico /etc/apache2/sites-enabled/000-default
% apache2ctl restart

Try http://example.host (substitute your own server's hostname) in a browser.

Now, to install Paperwalking itself.

% cd /var/www
% git clone http://github.com/migurski/paperwalking.git paperwalking
% cd paperwalking
% git submodule init && git submodule update
% mysql -u root
	> create database paperwalking character set='utf8';
	> grant select, insert, update, delete, lock tables on paperwalking.* to paperwalking@localhost identified by 'w4lks';
	> flush privileges;
	> quit;

% mysql -u root paperwalking < site/doc/create.mysql
% cd site && make

The main site configuration information is kept in lib/init.php. Paperwalking comes with a blank template that you have to copy and edit with your favorite text editor.

% cp lib/init.php.txt lib/init.php
% pico lib/init.php

Change these two lines (choose your own password):

  1. define('DB_DSN', 'mysql://paperwalking:w4lks@localhost/paperwalking');
  2. define('API_PASSWORD', 'swordfish');

Now try http://example.host in a browser to see it work. Make a new print; it might take a little while (we'll get back to this later). Print it, scan it, or just convert it to a JPEG, and post the image back to your instance of Paperwalking. Note that it's just sitting there, "queued for processing". Keep this browser window open, because we need to build SIFT.

% cd ../decoder
% make

Some output will scroll by and you may see compiler warnings; ignore these unless actual errors show up. Run the decoder once with the password chosen above.

% python poll.py -p swordfish -b http://localhost once

Watch the scan page in your browser update as it progresses through the image.

If it works, then re-run poll.py without the "once" argument at the end.

We don't have a proper daemon wrapper for this yet, so I've just been running it in screen as a cheap way to get a long-running process going. It sounds ghetto, but I've had processes like this one stay up for months at a time this way.

Use of screen to maintain long-running shell sessions is described here.
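For the curious, the heart of a poller like this is a simple fetch-process-sleep loop. This is only a hypothetical sketch of the structure, not the actual poll.py:

```python
import time

def poll_loop(fetch_job, process, delay=10.0, once=False):
    """Repeatedly ask the website for queued scans and decode them.

    fetch_job: callable returning the next queued scan, or None.
    process:   callable that decodes one scan and posts results back.
    (Illustrative structure only; the real poll.py differs in detail.)
    """
    while True:
        job = fetch_job()
        if job is not None:
            process(job)
        if once:
            break
        if job is None:
            time.sleep(delay)  # nothing queued; wait before asking again
```

With the "once" argument, a single pass; without it, the loop runs until killed, which is why it wants to live inside screen.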

Now for a few optional niceties.

First, we can speed up the creation of print maps by using Modest Maps ws-compose.

% cd ~
% git clone http://github.com/migurski/modestmaps-py.git modestmaps

And in a screen as above:

% cd modestmaps
% python ws-compose.py

Back in the site configuration, configure ws-compose.

% pico /var/www/paperwalking/site/lib/init.php 

Finally, we can get application IDs for three external information services, for specific improvements to the site.

  • GeoPlanet is used in conjunction with the "Find" button on the map composition page.
  • Flickr does our reverse-geocoding, so that maps of arbitrary places can be given names.
  • Cloudmade provides a few extra map styles.

Feb 23, 2010 12:05am


Is it creepy, in a stalkerish way, to repost someone else's conversation from Twitter? Eric found these two using the usual "stamen" vanity search; they're hilarious.

Feb 22, 2010 6:04am

user research friday

Nate Bolt of Bolt|Peters invited me to speak at User Research Friday last week. My obvious first reaction was "user research? We don't do any." Nate was interested in the perspective of a design firm that has no formal process for user testing or research, so I did my best to coalesce a bit of our most recent thoughts on astonishment and novelty for the packed house at their SOMA offices.

These are my talk notes and slides. Some of the things here actually came out of my mouth!

I'm here because Stamen Design doesn't do user research, and I hope I can explain why we've made that choice in our work and continue to stand by it.

I'm going to show a few of our projects that I think benefitted from a lack of user research, but I'll start by talking a bit about astonishment. I've become interested recently in thinking about how people respond to novelty. Anthropologist Grant McCracken talks about astonishment as a "jamming" of your powers of action, an inability to function and respond to external events, a subversion of your own sense of reality.

User research, to me, is an attempt to mitigate and control astonishment by determining what an audience believes or expects, and where possible delivering on that belief and expectation. User research promises stability and predictable outcomes, and I think that we're at a curve in the road where the idea of stability is just not all that interesting. I'm not arguing for chaos, but a kind of targeted novelty that probes out the edges of our expectations and helps us keep our beliefs and behaviors continually fresh.

It's worth keeping in mind that targeted novelty can rapidly become chaos. Who here has heard of Google Buzz? Do you remember the announcement, accompanied by breathless claims of game-changing excitement? The rush to verb every noun and crown Google victor in the social wars of the late aughts? The launch of Buzz was a fascinating experiment in modifying expectations around email, the internet's most boring communication medium.

Unfortunately, it was also a total, unmitigated cluster fuck.

I think we have to assume that Google performed basic, ass-covering user research on this project, right? What do you think the odds are that in their user-testing labs they might have come across a user like Fugitivus, a blogger hiding from her abusive ex and using email to communicate with close family members? By involuntarily launching Buzz at GMail's 175 million members, Google introduced the pressure of public performance into a private communication medium, outing human rights activists, confidential sources, secret friends and battered wives.

There's an alternative to the Buzz model of foisting novelty on people, and I think that Flickr is a classic example of a better way to operate on the web. Flickr is famous inside of Yahoo and out for having a basically cavalier attitude toward commonly-accepted wisdom about testing, assurance, predictability, and all-around maturity. What that means is that sometimes, Flickr screws up. This is the Flickr coloring contest, hastily assembled during an unexpected, multi-hour engineering emergency in 2006. Flickr had and has a world-class community management approach that can respond to crisis with humor. The coloring contest kept anxious community members busy while terrifying technological feats were performed to unclog the tubes.

Aaron Cope, formerly of Flickr but presently with us at Stamen, believes that Flickr's ability to respond to a crisis with this kind of deft flexibility is a result of a caring, trusting relationship between the site and its users. This relationship seems to extend to all areas of the site, from emergency management to regular day-to-day feature development that prizes speedy development, deployment, and public response over plans and assumptions. I believe that user research played very little role in the creation of experiences like Flickr, because the site is not something that could have been predicted or tested in a lab: it's a whole-body novelty situation.

Sites like Flickr start small, and grow through organic additions to the user base. This short graph of Twitter's early history from Blaine Cook tells a similar story of humble beginnings, sudden floods of interest, rapid response to community desire, and ultimately the triumph of joy over pain through building and testing in public, with real people.

I'm going to show some of our work, but the point I'm making is: get the jetski out of the driveway and into the water.

Digg Labs

Labs was a dominant thread in Stamen's work from about 2006 through 2008. The project was blessed from the start: Kevin Rose contacted us personally, and was intimately involved in the sketching and development process of the first Labs pieces. Whatever you think about the Digg community - it's nerdy, and male, and everything else - Kevin was deeply in tune with the cultural tastes of his userbase. The video game metaphors and API development were all of a piece, and this was the kind of project that was going to see success just because the strong, visible leader of the community identified it as worthy of attention.

The site now looks like a completed artifact, but it's worth noting that at the time it was a wide open playing field. No one really knew what the "labs" terminology was going to mean, and no one was doing this kind of live community visualization on quite the same scale. Our own development testing put us in sync with the Digg community, and we were working with live data from day one. We'd pretty much have to wrap up our work by about 6:00pm PDT, because that's when the Digg community was in bed in Europe, at dinner on the East Coast, and commuting on the West Coast.

Many of the Labs pieces saw multiple public iterations that were a direct result of vocal, frequently bitchy feedback from the Digg community. All of it was valuable, and we did our research in the open, letting people see multiple iterations of the work. The hallmark of this project set was visibility and basic openness, which bought us a great deal of goodwill.

Walking Papers

Who here is familiar with OpenStreetMap?

OSM is a freely-licensed world map created by volunteers in the style of Wikipedia. The project has always valued praxis over correctness. Founder Steve Coast says he started the project out of frustration with existing copyright regimes in the UK, refusing to bother with complicated specification documents showing how he ought to proceed.

Walking Papers is an experimental, paper-based editing tool for OpenStreetMap. It's an alternative to cumbersome GPS technology, based on the use of the QR code, an application of basic computer vision for cell phones widely deployed in Japan but only just now becoming known in the US.

Here's a current ad on BART with Lawrence Hall of Science Geek Night event details stored in a QR code. I think this would have been unimaginable just a few years ago, but now you all have magic touch computers with cameras in your pockets, so here we are.

When you pair the QR code with OpenStreetMap, you get a printable piece of paper that can encode basic information about its geography in a machine readable form.
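Conceptually, the code only has to carry enough information to recover the print's geography later. This sketch shows one hypothetical payload format (a print id, bounding box, and zoom packed into a short string); it is not the format Walking Papers actually uses:

```python
def encode_payload(print_id, north, west, south, east, zoom):
    """Pack a print's identity and bounding box into a short string
    suitable for a QR code. (Illustrative format, invented here.)"""
    return 'wp:%s:%0.4f,%0.4f,%0.4f,%0.4f:z%d' % (
        print_id, north, west, south, east, zoom)

def decode_payload(text):
    """Recover the print id, bounding box, and zoom from the string."""
    _, print_id, bbox, zoom = text.split(':')
    north, west, south, east = map(float, bbox.split(','))
    return print_id, (north, west, south, east), int(zoom.lstrip('z'))
```

Scanning the code and decoding the payload tells the server exactly which map, and which patch of the earth, a given piece of paper represents.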

Users of Walking Papers mark up these maps with information they'd like to add to OpenStreetMap, scan those maps in, and then trace their additions in existing OSM tools normally designed for tracing GPS trails or aerial imagery.

The first thing that surprised me when launching the project was that US users were a minority of the base. Here are a user's notes from Mennecy, France.

This is me making notes about businesses along Telegraph Avenue, in Oakland's Temescal District. Lanesplitter Pizza is in the center.

This is a rain or sweat smudged map of individual address points from the UK.

We didn't know it at the time, but a lot of geographic data collection happens using simple paper tools that talk to one another.

The second thing that surprised me was that I had unwittingly stumbled into a sort of holy grail of the disaster response community. This photo is of an adaptation I made to Walking Papers at an event in Camp Roberts, CA, which combined up-to-date satellite imagery of Afghanistan with photography from test flights of unmanned aerial vehicles. All the conversations I've since had about Walking Papers have focused less on the ease of carrying pieces of paper around San Francisco, and more on the need to use paper in remote environments.

It turns out that members of the OSM foundation had already been considering the universe of possibilities around data input and output. In this napkin sketch from Mikel Maron, we see paper in the corner with a double-headed arrow that he says was a joke at the time. The rest of the picture shows possible connections to other parts of the web, other kinds of hardware. None of this was within our view when we first started thinking about the paper issue in isolation.

Twitter Visualizations

The O'Reilly Emerging Technology conference had a long-time IRC-based backchannel where participants could chat about sessions and speakers as they were happening. It relied on a 20 year old technology comfortable and natural for Unix hackers but not so much for journalists, designers, and other normals. Still, we used the opportunity to eavesdrop on this live communication channel in an automated fashion to create these live, self-updating diagrams of who was talking to whom while the event progressed. I think we've always been interested in ways of tapping into ongoing conversation streams and showing them back to their participants, but in the past this has meant cajoling people to make a special effort to come to a shared medium like IRC. The #etech channel was live for the duration of the event, and effectively dead afterward.

Fast forward to now, and suddenly all these dreams we had about visualizing ongoing conversations can be made real because of the explosive growth of Twitter. What makes Twitter interesting is that it's a performative medium, a place where you know you're being listened to and write as though you are in public. It's broadcast, and explicitly so. It's interesting because very little of the Twitter we know now could have been predicted or tested for beforehand. It could only be used in the wild and reacted to. Like Google Buzz earlier, you simply have to try it *with everyone else* to know whether it makes sense or not.

Metafilter's Matt Haughey used it last year to spread the news about his diagnosis of a brain tumor, and thought a bit about use of the service in emergency situations. What can you predict about a service when what you want is for it to become a utility?

It's so incredibly interesting, we've been doing a bunch of work with Twitter since last year.

This is our live tracking application for the 2009 MTV Video Music Awards. On these screens, you can see each artist or performer represented as a jiggling chit that shrinks and grows based on the live conversation volume about that person. Here we can see a bit of conversation about Pink and Shakira.
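One common way to drive that shrink-and-grow behavior is to make each chit's area, rather than its radius, track conversation volume, so a fourfold spike in mentions reads as four times the ink. The function and its numbers below are invented for illustration, not taken from the VMA project:

```python
import math

def chit_radius(mentions, r_min=8.0, r_max=60.0, peak=500):
    """Map a conversation-volume count to an on-screen radius.
    Using sqrt makes area, not radius, proportional to volume;
    the constants here are made up for illustration."""
    frac = min(mentions, peak) / float(peak)
    return r_min + (r_max - r_min) * math.sqrt(frac)
```

Clamping at a peak value keeps one runaway celebrity from shoving everyone else off the screen.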

You might remember the moment when Kanye West interrupted Taylor Swift's acceptance speech. Here it is replayed as a bubble graph, with the two stars on the screen at the same time, followed by instantaneous crowd disapproval of Kanye West.

So, this kind of crowd call and response is something that we think is going to get pretty important over the next few years. It's going to be especially interesting because it's going to become normal and expected over that time.

Every live event will come with a live analysis of what the audience is doing, like a country-scale version of those America's Funniest Home Videos audience clicker boxes. The amazing graphics department at the New York Times thought it was interesting to map live Twitter traffic during last year's Superbowl, as entertainment and information.

Right now, we have an ongoing project with NBC Olympics exploring these same themes in the form of a dynamic, self-updating tree map.

At this point in my talk prep, I stopped writing everything down and ad-libbed it all. In and amongst all these bits of NBC Olympics stuff, I called out this amazing NPR blog post about the "weird poetry" of the twitter tracker.

Summing Up

The negative way of phrasing my argument is that it's hard to test everything, and doubly hard to test new things. Some stuff you just have to push out into the world and see what happens.

The positive way of phrasing my argument is that for the astonishing and the novel, you're better off pushing your ideas into the real world early, and testing with the reactions of real people who aren't self-consciously test subjects. Start small, listen carefully to your users, and grow in the direction where they want to take you. Give yourself room to fail, and understand that the trust of your fellow travelers is an important part of the equation.

The doubly-positive way of phrasing my argument is Just Effing Do It.

Feb 8, 2010 8:34am

zoom transitions

There's something really nice going on in this series of zooms around San Jose Ave and 280. I like the introduction of layering as the details move into focus:

Feb 5, 2010 12:51am

commute cinema

Two short videos from my morning commute.

  1. I started recording this one right at the moment I swiped my credit card to pay for my TransLink pass (it's like a Bay Area Oyster card). From swipe to "transaction completed" takes about 45 seconds on these machines, with a disturbing amount of detail in between. "Dialing host" alone took most of that time. I thought credit card transactions were a solved problem?

  2. This guy was riding up Market St at 10am on his tall bike. When he came to red lights, he'd have to circle a bit to avoid a tortuous dismount/remount cycle. He looked pretty badass up there.

Feb 2, 2010 6:18am

the hose drawer

Last week, Stamen was involved in the MTV Hope For Haiti Now telethon. It was apparently the largest such fundraising effort in history, pulling down over $60 million in what the New York Times called "a study in carefully muted star power".

So it was, you know, a pretty intense event to be a part of. With about a week or so of lead time, we put together an interactive map showing moderated Twitter messages connected to the live event and relief for Haiti generally. It was also part of the post-show on television.

It looks like this:

Sha and I traveled to Los Angeles to work out last minute kinks and watch over the project. Aaron was up here, carefully seeing to the smooth operation of the engine driving the Twitter collection process for the duration. Much of the office pulled together to crank out this project on pretty much no notice, and it was an inspiring and energetic effort.

I want to talk about that engine, though - it's occupied most of my headspace since we all got back from the holidays, headspace ordinarily full of geography and cartography. After dabbling in the consumption and development of real-time API's for a few years, we've started working with the high-volume Twitter stream this year. Back in 2007, Kellan Elliot-McCrea and Blaine Cook opened our eyes to the possibilities of streaming APIs with their series of Social Software For Robots talks. We poked at XMPP and other persistent technologies a few times, but what really whetted the appetite was a project we did with MTV for the 2009 Video Music Awards, our first large-scale public project connected to the Twitter streaming API. The VMA project was a collaboration with social media monitoring company Radian6, who handled the backend bits. Given Stamen's development and design focus it was only going to be a matter of time before we started pawing at that backend technology ourselves.

The consumption and moderation system we have developed was christened "Hose Drawer" by Aaron, a name that neatly encapsulates two strands in our recent interests:

  • "Hose" draws on Blaine's joking references to "drinking from a fire hose" about live streams of Twitter's real-time database, and now graces the official name of the service itself. The complete feed is called the fire hose, while a limited feed for casual users is the garden hose. I've heard that the names persist internally, with names like "hose bird cluster", referring to Twitter's avian mascot, among others.
  • "Drawer" is a nod to last year's Tile Drawer, an EC2 virtual machine image for rendering OpenStreetMap cartography. We're thinking about a future where this kind of functionality is as much a piece of furniture as a PC, albeit one that can be created out of thin air and destroyed just as quickly. With a growing selection of infrastructure products, Amazon's Web Services are making it possible for us to develop services that act like products, materialized by small python scripts that bootstrap themselves into multi-machine clusters.

This is the naming convention you get from living the pun-driven life*.

The firehose offers a continuous flow of data, yet requires us to break that flow down into discrete chunks. I've been thinking a bit about this process in our work since Tiles Then, Flows Now, my 2008 Design Engaged talk about map tiles and continuity, so much so that I see the process of breaking and reforming continuity everywhere around me. I flew several times last month, and each time I passed through security I thought about the check-in process as an elaborate map/reduce implementation, atomizing a stream of passengers into packets of shoes, laptops, jackets, bags and bodies. Numerous fascinating patents cover the splitting up of a steady flow, from t-shirts cut from unending tubular knit fabrics, to continuously-cast steel to simulated egg yolks sliced from unbroken cylinders.
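The airport-security version of this idea fits in a few lines: a generator that atomizes a continuous iterable into discrete, fixed-size batches, the way a firehose consumer has to before it can store or moderate anything. A minimal sketch:

```python
def atomize(stream, size):
    """Break a continuous iterable into fixed-size batches.
    The final batch may be short, like the last passengers
    through the checkpoint before the gate closes."""
    batch = []
    for item in stream:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch
```

Everything downstream (databases, moderation queues, television graphics) works on these packets, not on the flow itself.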

Scott Bilas, whose Continuous World Of Dungeon Siege paper I drew on heavily for DE 2008, describes the challenges of writing a streaming system in the world of video game design:

The core problem that we had so much trouble with is that, with our smoothly continuous world, there are no fixed places in the world to periodically destroy everything in memory and start fresh. There are no standard entry/exit points from which to hang scripted events to initialize or shut down various logical systems for plot advancement, flag checking, etc. There is no single load screen to fill memory with all the objects required for the entire map, or save off the previous map's state for reload on a later visit to that area. In short, not only is the world continuous, but the logic must be continuous as well!

The pattern we see here is to keep crises small and frequent, as Ed Catmull of Pixar says in an excellent recent talk. When describing the difficulty Pixar's artists had with reviews ("it's not ready for you to look at"), he realized that the only way to break through resistance to reviews was to increase the frequency until no one could reasonably expect to be finished in time for theirs. The point was to gauge work in motion, not work at rest. "So often that you won't even notice it," said Elwood Blues.

I'm interested to see where we can take this product-that-isn't-a-product. We're going to be using it for an upcoming high-profile sports broadcasting client (you'll hear more about this in another two weeks), and the stress tests administered by the Haiti telethon have shown exactly where we need to do more work. Oddly, the overall performance was great, but we found ourselves occasionally needing to go tweet diving through the database, looking for specific messages that were good for television. This ability to reach in and meddle with the guts, place yourself on a calm island in the middle of the stream, rewind the tape and alter the flow, is the next type of control we're experimenting with.
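In miniature, "tweet diving" is just a query against the stored stream. The schema and messages below are invented for illustration, but the move is the same: reach into the archive and pull out the handful of messages fit for broadcast:

```python
import sqlite3

# Toy version of diving through the collected stream. The schema and
# rows are hypothetical; only the shape of the query matters.
db = sqlite3.connect(':memory:')
db.execute('create table tweets '
           '(id integer, author text, body text, approved integer)')
db.executemany('insert into tweets values (?, ?, ?, ?)', [
    (1, 'alice', 'Donated to Haiti relief tonight', 1),
    (2, 'bob',   'what channel is this on??', 0),
    (3, 'carol', 'The Haiti telethon is incredible', 1)])

# Only moderator-approved, on-topic messages make it to television.
good = db.execute("select body from tweets"
                  " where approved = 1 and body like '%Haiti%'"
                  " order by id").fetchall()
```

At real volumes you'd want proper full-text indexing, but the principle of a calm, queryable island beside the stream is the same.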

Jan 18, 2010 8:33am

last week: new york

I had kind of a bonkers trip to New York this past week. My proximate reason for being there was an invitation to Microsoft Research's annual Social Computing Symposium (SCS), an invitational held this year at NYU's ITP program. Another reason was to visit the U.S. Geological Survey's Volunteered Geographic Information workshop. A third was to pay visits to a bunch of east coast superheroes. The thing I love most about these whirlwind adventures is that you have an excuse to compress a full schedule of visiting into a very short span of time. I had a chance to meet and reconnect with a few people I like and respect quite a bit. I felt warmly embraced.

The SCS theme this year was "City As Platform", and the structure of the event was brief talks punctuated by puppet shows. About half the people gave presentations, though I wasn't one of them because I was thinking more about the USGS thing for later in the week (more on that below). Steven Johnson gave an opening talk that made me think I had accidentally bumbled into a TED or Davos, though the situation did rapidly improve.

Kevin Slavin gave my favorite kind of talk, a barnstormer that I found fascinating and that provided a lot of hooks for debate. Liz Goodman took better notes than I did, and the basic gist was like this:

  1. Concealment technologies, like the radar-defeating B2 Bomber, work by breaking up a shape rather than hiding it entirely.
  2. High frequency, algorithmic trades work the same way, tearing larger financial movements down into microscopic dust.
  3. This modern computerized finance stands in contrast to the people-driven, proximity-reliant trading of the past that gave rise to today's financial district.
  4. The physical proximity required for algorithmic trading is no longer a matter of human distance but of microsecond ping times necessary to exploit fast, small differences in price.
  5. Why can't the whole thing just float off and go away if it doesn't need people anymore?

Kevin connected to the city theme with 60 Hudson as a pivot. It's a massive telecommunications node, and physical distances along the network from this building translate to communication delays, which subsequently translate to money lost or money gained. We used to think about time in terms of distance ("The furlong was the distance a team of oxen could plough without resting"), which we've mostly forgotten since communications speeds got so fast, but the high rents directly around this building show that on some scales, velocity of communication can still be measured.
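The furlong comparison can be made concrete with a back-of-the-envelope calculation: light in optical fiber moves at roughly two thirds of c, so each microsecond of one-way latency buys on the order of 200 meters of glass, which is why a trading desk a block farther from 60 Hudson is measurably "later" than its neighbor. The numbers below are rough assumptions, ignoring switching and serialization delay:

```python
# Rough translation of ping time into distance along a fiber route.
C = 299792458.0        # speed of light in vacuum, m/s
FIBER = 0.67 * C       # approximate propagation speed in optical fiber

def max_one_way_meters(latency_seconds):
    """Upper bound on fiber distance for a given one-way delay."""
    return FIBER * latency_seconds

meters_per_microsecond = max_one_way_meters(1e-6)  # roughly 200 m
```

Communication time as distance: the oxen's furlong, reborn at microsecond scale.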

Now I don't quite agree with Kevin's conclusion. Where he sees an inhuman system that's begging to be pushed off into space where it will cease to bother its onetime creators, I'm looking at a natural consequence of technological speed in the service of technology. Someone's always going to want to exploit the seams with science. What we need is not to set the thing adrift off-world but to tame it with a new Glass-Steagall. Still, the core humanism of where he goes with this is touching, beautiful and deserving of some attention. Most "city as platform" talk comes in two varieties: information technology and space syntax. There was plenty of both on offer at ITP, lightly brushing the testicular, amoral war-metaphoring that views urban fabric and shared space as just scenery to play out control scenarios. I don't want to call out anything in particular, so I'll just link to an unfortunate recent BLDGBLOG post about Die Hard and the IDF walking through walls after reading Deleuze and Guattari. One of the commenters, Jim Meredith, has this response:

"However, as you note early in you piece by quoting those who maintain, live in, and trust the concept of private space, the Nakatomi/Nablus/DieHard concept is, in fact, a shocking and gross violation of a core concept of civil society."

(apologies to Marc Ngui)

Not everyone views the city as a battle suit or a maze of twisty passages all alike, and it's worth thinking about provided functions (sewage, macadamizing) in terms of what they do, who they do it for, and for what price. Anil Dash in particular described his Expert Labs concept, a project to rethink the motivations connecting geek work and the federal government. Did you know that you can't, technically, "give" things to the U.S. government? It's true, and Anil's trying to change that. Which is an interesting segue to the USGS event in DC, a workshop intended to explore successful volunteer geographic data programs (like OpenStreetMap) to see how that can be applied to the forthcoming update to the National Map.

Thanks to a door malfunction on my NYC/DC flight, I arrived late to the party and unwittingly delivered a closing keynote. The full content of the talk and my slides immediately follow these other exciting things that happened:

  • I visited the New York Public Library map people twice and was allowed to flip through a 17th century Dutch naval atlas.
  • I got to meet the amazing New York Times graphics department, including Matt Ericson, Matthew Bloch, Amanda Cox, Shan Carter, and others. These are the people turning our ideas about interaction and mapping online utterly upside-down.
  • Grand Central Oyster Bar.
  • Veselka for 2:00am pierogi.
  • Mark Hansen's genius installation at the NYT lobby.
  • Poking my head into the Saturday Night Live studio during filming of Laser Cats.
  • Played the new(ish) Mario Brothers Wii game and discovered that my muscle memory of the old game's standard controls has remained fully intact for 20 years despite no external reinforcement.

Now, my slides for the USGS thing.


For many years, OpenStreetMap was understood to be a project about the future, meant for productive use someday but clearly unready. Potential users still had to be convinced, and we passed over OSM as a data source for projects in favor of commercial data on a number of occasions. This phase lasted from 2004 to about 2008 for us at Stamen.

In 2008, it started to look more done than not-done.

The social structure and set of motivations that OSM thrives under can be seen as part of the broader trend towards shared development practices and better communication enabled by the open internet. We're seeing a trend toward a larger number of smaller operations unified by an ethos of participation and local scale, something that Brian Marick has jokingly summarized as "artisanal retro-futurism" and "team-scale anarcho-syndicalism". I love this description.

In our work, we're generally on the receiving end of map data, and I've got some examples showing cartography I particularly enjoy that's benefited from OSM as a resource.

Stamen's "Fresh" map style for Cloudmade, showing legible web cartography for inner London, was immediately applicable to the rest of the world upon launch. It's used by OffMaps, Noticings, and others as a default, web-native basic road map without some of the distracting recognizeability of Google Maps.

UK designer Matt Jones of BERG used Cloudmade's style editor product to apply Kevin Lynch's ideas about the city as a network of paths, edges, districts, nodes, and landmarks to maps for heavy-traveler social network Dopplr. Particular ideas about the role of information on the map can be rapidly experimented with and published to a broad audience.

Stamen also did the Cloudmade Midnight Commander map style, answering a creative brief suggesting a design appropriate to Jason Bourne's in-car GPS. Again, something that starts as a small idea or even a joke can very quickly go into full production when simple, well-understood data is available.

I've also been exploring the adaptation of paper maps like this 1996 Rand McNally SF street atlas popular among bike messengers. The result has been OSM-derived, print-ready cartography that additionally incorporates civic parcel and contour data.

Finally, Yahoo photo-sharing web service Flickr has used specific portions of OpenStreetMap data in places where Yahoo Maps has poor or non-existent coverage. The transient desert city of Burning Man is one such example, surveyed and laid out fresh each year. OSM Foundation board member Mikel Maron contributes his mapping expertise to the Burning Man Earth project to provide visual context for the thousands of photographs that come from the art festival each year.

When anti-government protests broke out in Iran last year, Flickr was able to rapidly pull in high-quality maps of Tehran to provide immediate geographic context to the sudden flood of news and imagery. I'm told that high quality road data for cities like Tehran is traditionally quite hard to come by via normal channels, since there's very little road navigation market for western companies there.

Services such as Google's and the iPhone are bringing more different kinds of people into close contact with cartographic design through their daily lives than ever before. We know that use and familiarity breed discerning taste, and cartography has become popular, decorative, desirable, and functional.

What specific technical characteristics of OpenStreetMap motivate the creation of this broadly useful data set? I think that a few critical decisions made in the early days of the project have endowed it with generative properties, the "capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences" as Jonathan Zittrain has said.

Three specific features of OpenStreetMap have this effect.

First, each object in the system has a unique and durable identifier. Objects basically stay put, and are composed of very simple primitives. There are no curvy lines in OSM. The identity of each object in the OpenStreetMap database is exposed to the outside world, so that Flickr can let you refer to specific OSM features in your photography. Here we have a photograph of the Queen's Chambers in Manchester that's explicitly connected to the building footprint in OSM because the two entities can be tagged together. The system is useful even when not explicitly mapped.

Second, the tagging structure is free-form. You can apply your own arbitrary descriptions to features, generally conforming to the basic expectations of the community with tags like "highway", but often departing from them entirely. Andy Allan created a rendering for his OpenCycleMap project that specifically called out features with tags relevant to bicyclists, like these portions of the bike network in Oakland. New tag conventions are decided through use rather than committee. Most theoretical arguments about the appropriateness of one approach over another are moot until actual use by a number of people over time can be shown.
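To make the free-form tag model concrete, here's a minimal sketch of what an element looks like in the XML that OSM exchanges: a node is just an ID, a position, and a bag of arbitrary key/value tags. The node ID, coordinates, and tag values below are invented for illustration.

```python
import xml.etree.ElementTree as ET

# A hand-written fragment in the shape of OSM's XML; the ID and values are made up.
osm_xml = """<osm version="0.6">
  <node id="123456789" lat="37.8044" lon="-122.2712" version="3">
    <tag k="highway" v="traffic_signals"/>
    <tag k="bicycle" v="yes"/>
    <tag k="note" v="anyone can add any key at all"/>
  </node>
</osm>"""

# Tags are an open-ended dictionary, not a fixed schema -- this is what lets
# projects like OpenCycleMap pick out the keys they care about and ignore the rest.
node = ET.fromstring(osm_xml).find("node")
tags = {t.get("k"): t.get("v") for t in node.findall("tag")}
print(node.get("id"), tags)
```

Nothing enforces the "highway" convention except community practice, which is exactly the point.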

Finally, there is an open API. OSM's core feature is a complete, well-understood way to move data into and out of the service. This has made it possible to create numerous ways of recording and editing OSM data. JOSM is the editor for the precise, obsessive, and German. Potlatch is the web-based editor that lives under the default "edit" tab on the site, and includes simple tools for pulling different kinds of points of interest into the aerial map.

Walking Papers is my own research project, and provides a way for paper data collection to help OSM. Each of these applications takes advantage of OSM's well-documented and simple protocol to move information through the system, with varying amounts of complexity available to different user populations.
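The "well-understood protocol" really is just plain REST over HTTP: every node, way, and relation is readable at a predictable URL keyed by its durable ID. A minimal sketch, assuming the current API 0.6 endpoint (reads need no credentials; writes go through changesets and authentication, omitted here):

```python
from urllib.request import Request

API = "https://api.openstreetmap.org/api/0.6"

def read_url(kind, osm_id):
    """URL for fetching the current version of a single OSM element."""
    assert kind in ("node", "way", "relation")
    return f"{API}/{kind}/{osm_id}"

# Building the request without sending it; the ID here is invented.
req = Request(read_url("node", 123456789))
print(req.full_url)
```

Because the scheme is this simple, editors as different as JOSM, Potlatch, and Walking Papers can all speak it.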

I want to close with this napkin sketch by Mikel Maron. Many years ago, Mikel showed a possible "common operating procedure" around the project, incorporating numerous technologies not normally associated with desktop geography. Much of the methodology drawn here did not yet exist at the time, but Mikel could be sure that with healthy usage patterns it would be possible to draw double heads on each of the arrows with some reasonable expectation of future possibility.

Jan 5, 2010 6:54am

next week: new york

Next week, from the 10th to the 15th, I'll be in New York, with a very quick side trip to DC. Anything interesting I should be doing with a smallish amount of free time later in the week?

Jan 3, 2010 10:23am

blog all oft-played tracks

These are a few of the tracks I added to iTunes in 2009 and listened to the most. Not all of the music is new, but it's all got a date-added between 2009-01-01 and 2009-12-31 and a fairly high number of plays, so that's good enough for me. This post would be a one-line perl script if I used last.fm.
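The "one-line perl script" remark translates to a simple filter over the iTunes library file, which is just a plist: keep tracks with a Date Added inside 2009, sort by Play Count. A rough sketch, with invented track names and counts standing in for a real "iTunes Music Library.xml":

```python
import plistlib
from datetime import datetime

# A tiny stand-in for the "Tracks" section of an iTunes library plist;
# names, dates, and play counts are invented.
library = plistlib.loads(plistlib.dumps({
    "Tracks": {
        "1001": {"Name": "Love Etc.", "Date Added": datetime(2009, 4, 2), "Play Count": 48},
        "1002": {"Name": "Old Favourite", "Date Added": datetime(2005, 6, 1), "Play Count": 200},
        "1003": {"Name": "Wrong", "Date Added": datetime(2009, 5, 11), "Play Count": 31},
    }
}))

# Keep tracks added in 2009, most-played first.
added_2009 = [t for t in library["Tracks"].values()
              if datetime(2009, 1, 1) <= t["Date Added"] <= datetime(2009, 12, 31)]
top = sorted(added_2009, key=lambda t: t["Play Count"], reverse=True)
print([t["Name"] for t in top])  # → ['Love Etc.', 'Wrong']
```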

(Here are the MP3s below as an .m3u playlist)

1. Pet Shop Boys: Love Etc.

Shawn makes no apologies for hating the Pet Shop Boys, but this is the new track I listened to the most this year, so there you go. They've been a favorite band of mine for about ten years, mostly because I like pop music and I like their lyrics. This track is also a stand-in for Lady GaGa's Poker Face which I don't actually own, but find completely fascinating. Pop music!

2. Zomby: Where Were U In 92?

This track is an amazing signpost to the hardcore rave music of the early 1990's, but it's entirely modern and produced by someone too young to have had firsthand experience of that scene. I found it via k-punk, who has this to say:

That's why, whatever its intentions, whatever its official status as a side-project, Zomby's Where Were U In 92 amounts to a refreshingly honest libidinal confession, an admission that British dance music is still haunted by the hardcore continuum. Think about why it's impossible to have imagined a Jungle producer in 92 do Where Were U in 76: it isn't only that 92 had so broken with the whole frame of reference of sixteen years before, it's that the headlong rush into the future precluded such retrospection.

3. Moderat: A New Error

This is one of those tracks that Shawn brings to the office that makes me have to put down whatever it is I'm doing for about ten minutes. The bass here, heard over a good pair of speakers, is deep like the best of Sleeparchive.

4. Laurent Garnier: Wake Up

I include this because it's part of the two-disc Logic Trance 2 compilation, something from 1994 that I finally reacquired this year. Most of the collection is loaded with floating, detached trance music that hadn't in all cases found a dancefloor yet. It's all deeply nostalgic for me, found through my roommate Paul who helped form a lot of my musical tastes as I was hitting escape velocity from industrial music. The whole compilation is a classic, this track from Laurent Garnier is a taste.

5. Depeche Mode: Wrong

The video for Wrong is like a terrifying bad dream. The song is among the darkest things I've heard come out of Depeche Mode, incredibly abrasive and confrontational.

6. Venetian Snares: Sajtban

I saw Venetian Snares perform in Oakland with Otto Von Schirach, and loved the performance. Again, abrasive and confrontational, but also fast and squiggly.

7. Slayer: Angel Of Death

Another one of those memories dredged up from my youth, this one from Brian, the slightly white-trashy headbanger dude who lived up the street from me in San Jose when I was 14. Other bands I got from Brian include Obituary, Death, Godflesh, and Entombed. He also owned all the Iron Maiden EPs that were just Nicko McBrain funny-talking apropos of nothing. I've started pulling out some of this old metal lately, fixating on some of the more experimental or weird bits that fit cleanly with my later electronic tastes. Angel Of Death pretty much just rocks the fuck out.

8. Sam Cooke: Chain Gang

On a completely different trajectory, Sam Cooke is amazing. The ooh-hah beat of Chain Gang is astonishing.

9. Birdy Nam Nam: The Parachute Ending

I don't recall where I found the video for this, but it's a lovely piece of very French-seeming animation that instantly reminded me of Spartakus and the Sun Beneath the Sea. It didn't even register that the "Birdy Nam Nam" in the video title was the name of a band or that this was a music video, and I wondered why the music wasn't included in the credits. Dur. Excellent techno track though.

10. Blaqstarr: Hustress (snippet) feat. Rye-Rye

I'm not sure why this is here. I got interested in Blaqstarr tangentially through bingeing on The Wire for a few months, and this short slice of ominous hip-hop sound collage is incredibly interesting.

Jan 3, 2010 7:25am

here comes 2010

The odometer has clicked over, we're in a new decade. I'm trying to make a bit of sense of the previous year, so it's helpful to start by looking over a full year of blog posts like last time I did this.

These are some things that happened.


Our dog, Sodapop, died in August. The house feels much emptier, especially when I'm making pasta and expecting her to demand her perquisite of three to four noodles and a bit of bacon.

Blogging Fewer Dog-Eared Pages

A few years ago, I got into the habit of writing down the best stuff from the nonfiction books I read, and posting it all here as entries titled Blog All Dog-Eared Pages. A few friends like Chris and Russell and George have picked up on this, which made me very happy. Problem is I'm bad at habits, so my last such entry was all the way back in January, when I excerpted whole slices of the remarkable Process Of Government by Arthur F. Bentley.

I purchased a Kindle in late spring, and I think this has much to do with how this activity has petered out for me. Specifically, the Kindle and its good friend Instapaper have largely eaten my nonfiction reading, which means that there are no longer any pages to dog-ear. The counterintuitive part is that Kindle actually has an incredibly easy way to mark and save passages, with everything you highlight using the little joystick being dumped to a plaintext file called "My Clippings". In theory this should make the activity much easier, but since the medium is the message and all that blah, I'm now reading entirely different stuff than I used to. I read fewer non-fiction books and more non-fiction long-form online writings, the kind of stuff that fits into Instapaper. I'm not unhappy with this change in my intake, but I do like to be a little more demonstrative with the things I'm interested in, so I'm unhappy with the change in my output. If there was a way to make the Kindle pump the clippings file back out on some schedule, that would be good. Having to plug it into a computer does not cut it.
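The "My Clippings" file is at least easy to pick apart once you do plug the thing in: entries are separated by a line of equals signs, with a title line, a metadata line, and the highlighted text. A rough sketch that only relies on the separator, since the exact header format varies by device; the sample entry below is invented.

```python
# An invented sample in the shape of Kindle's "My Clippings.txt".
SAMPLE = """The Process Of Government (Arthur F. Bentley)
- Highlight Loc. 312-14 | Added on Monday, June 1, 2009
The raw material of government is what you find in actually performed activities.
==========
"""

def parse_clippings(text):
    """Split a clippings file into {title, meta, text} entries."""
    entries = []
    for chunk in text.split("=========="):
        lines = [l for l in chunk.strip().splitlines() if l.strip()]
        if len(lines) >= 3:
            title, meta, body = lines[0], lines[1], " ".join(lines[2:])
            entries.append({"title": title, "meta": meta, "text": body})
    return entries

for e in parse_clippings(SAMPLE):
    print(e["title"], "->", e["text"][:40])
```

From there a "blog all clipped passages" post is a formatting exercise.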

I tried to do a "blog all clipped passages" once about film projection speed, but I don't yet have the hang of this.


I spoke in front of large groups of people much, much more this year, mostly about maps and cartography. This has been incredibly fun.

(Photo by Kris Krug)

In March, Andrew Turner invited myself, David Heyman from Axis Maps and Elizabeth Windram (formerly) from Google Maps to a SXSW panel on Neocartography: all the new stuff happening in maps online. I talked a bunch about Stamen's work of course, and it was great fun to attend my first SXSW after hearing so much about it over the years since 2003 or so. I'll be back this year, for Chris Heathcote's Maps, Books, Spimes, Paper panel. Excitement.

Shawn and I did two presentations of a workshop we're calling Maps From Scratch, once at ETech 2009 and again to a shockingly-packed room at Where 2.0. It must have gone pretty well, because Brady has invited us back to do it all over again in an expanded, two-session form. This was an interesting workshop to run, because we opted for a heavy hands-on approach with a genre of server-side GIS software not known for its ease of installation. We created an Amazon EC2 AMI for participants to use as a workstation, and it went pretty well - that's all still available at mapsfromscratch.com.

My solo presentation at Where 2.0 was Flea Market Mapping, building on some of the work I've been doing georectifying historical maps and explaining why that's a fun and easy and useful activity. I expect to do more work in that area this upcoming year, if all goes well.

I also went to Amsterdam in July for State Of The Map, where I described my work on Walking Papers (more on that below). This was great fun. Aside from the excitement of biking around Amsterdam for four days with Aaron and being hosted by A'dam expert traveler Ben Cerveny, the conference itself was my first mass exposure to the rabid OpenStreetMap community. My talk started out on mapping on paper generally before shifting to the Walking Papers project specifically.

Amsterdam made me switch bicycles, from mostly riding my Univega fixed-gear to a more relaxed single-speed with wide handlebars and an early 1980's Trek mountain bike frame. I'm very happy with it, can't find a photo though.

In October, the North American Cartographic Information Society invited me to deliver the keynote at their annual meeting, which was an incredible thrill and honor. I've not done this kind of central focus talk before, so I thought I'd use the opportunity to talk about online community and sharing to an audience of academic and professional cartographers, explaining some of the trends that have led to a strong and vibrant OpenStreetMap project. Slides and notes here.

Other events where I waved my hands in front of crowds included Interesting 2009, Web 2.0 Expo, dConstruct 2009 in Brighton, and a variety of visits to Stanford and UC Berkeley. All of it was by turns gut-wrenching and awesome.

Finally, in December Eric Meyer and Jeffrey Zeldman asked me to participate in An Event Apart here in San Francisco. The audience was much more of a mainstream web design crowd, and I was surprisingly engrossed by the other presenters, especially Luke Wroblewski's talk on HTML forms and Jared Spool's always-entertaining stories from his work for Amazon.com. I posted my slides as a giant, 28MB PDF file.

Being A Fan

I decided early this year that it was important and healthy to be more of a fan, so I've made a special effort to point out things that are awesome and worthy of attention. Early in the year, that meant moving pictures of the sky. More recently, that meant moving pictures of hands and drawings. Along the way, that's meant everyone I know who is doing awesome shit, with all the design and technology and music and video work that my friends have produced. Awesome awesome awesome.

Oakland Crimespotting

We added a PIE OF TIME to long-time research project Oakland Crimespotting, have you seen it?

And, we launched San Francisco Crimespotting with the enthusiastic participation of Jay and Kelly and the City of SF's technology department and some dude named Gavin. Shawn's responsible for the port from Oakland and keeping it running.

O'Reilly published my write-up of the project's technical and social structure in Beautiful Data, a new entry in their Beautiful series.

Walking Papers

Aaron linked to a computer-vision algorithm called SIFT way back in February or so, which caused bells to go off in my head. The result is Walking Papers, a project that connects paper-based mapping and annotation with the crowdsourced OpenStreetMap project through the medium of printing and scanning.

It started as a joke and a feasibility test, but I quickly saw that using technologies like SIFT and FPDF and QR Codes was totally going to work, so a few initial tests searching for pictures of gargoyles on scanned pieces of paper turned into a proper website and service actually used by people around the world. We've even gotten a few international volunteers to translate the site to German, Dutch, French, Spanish, Japanese, and Italian! Thanks Jonas, Milo, Jonathan, Manuel, Hiroshi, and Emanuel!

Walking Papers opened some incredibly interesting doors to the international disaster response community, after Mikel suggested I attend a week-long hack session at Camp Roberts. Turns out, the use of such intermittently-connected technologies as paper and local networks is a super hot topic in these circles, and Walking Papers was one of a number of projects that's pushing some serious buttons at the DOD right now, for the kinds of people who need to make networking stuff work with a tent and a car battery and a USB key.


At some point, I basically became overwhelmed with novelty, and decided to realign some priorities in favor of doing and showing instead of talking and stuff.

Fortunately, the holiday break has been a relatively easy one to curl up in, with a trip up to Washington to see some of Gem's family and a few old friends, followed by a week or so of being a total hermit here in Oakland. We spent New Years at the Chawazek house drinking champagne and eating dessert and talking about obsolete systems of measurement. I thought back to all the New Years Eves I've had over the years from the time I was making my own plans to just the other night. I feel like I've come somewhat full circle, with a few scattered trips to Poland and a whole string of raves thrown in the middle:

  • 1993: Home in San Jose, probably with grandparents in town.
  • 1994: Party. A friend's small house party in South San Jose. Sadly we didn't keep in touch after high school.
  • 1995: Home. I was supposed to go out with friends, but my ride flaked and I didn't have a car so I just spent the evening angry and at home. This was my last New Years Eve as a high school kid.
  • 1996: Party. My freshman year at UC Berkeley, I spent the winter break back in San Jose catching up with high school friends. This was fun, everyone was talking about where they ended up for college.
  • 1997: Poland. I visited my mom for a few weeks, spent most of my time stressing about grades that turned out to be fine (A- in CS61A), and went to a small house party for NYE. People there take their New Years Eve much more seriously, and dress nicely for the occasion. This was weird.
  • 1998: Rave. Cloudfactory NYE at a rock climbing gym in Salinas.
  • 1999: Rave. Cloudfactory NYE at a former planetarium dome storage building on the former Fort Ord near Monterey. Terry and I did live visuals, and the party was shut down at about 5:00am when the building's superintendent finally got antsy about the noise and the drugs.
  • 2000: Rave. Cloudfactory NYE at that same Fort Ord building, but this time it went all night. A curtailed party seems to be more fun than a complete one, possibly because of the element of randomness that a bust or sudden cancellation introduces?
  • 2001: Rave. Cloudfactory + Infinite Beat at a warehouse in Millbrae near SFO. This was one of the last times I did live visuals at an event, but we pulled out all the stops with seven overlapping screens. I spent much of the night unhappy because I had recently had a crappy breakup and she was there, with the lighting guys in the other main party room.
  • 2002: Poland. In a cabin in Zakopane with cousins on my dad's side. Vodka in little tiny shot glasses.
  • 2003: Raves. Basically wasted this evening running around between a few different parties.
  • 2004: Rave. Gem and I were living at Otherworld, so the big party was just outside our bedroom.
  • 2005: Rave. Otherworld again, but this time we had our own apartment, where we escaped in the morning to spend the next day contentedly watching old episodes of Kids In The Hall.
  • 2006: Poland. My grandfather passed away, I went there for the funeral and to Krakow with some back-in-the-day family friends for New Years.
  • 2007: Friends. Up at Darren and Bonnie's for poker and drinking. This was my winter of terrifying back pain, but the injury stayed mostly quiet to allow me a nice evening out.
  • 2008: Friends. Rented vacation house in Sonoma, near Charles Schulz Airport, stuffing ourselves with Aaron's cassoulet for four days. Contentment.
  • 2009: Friends. Homemade pizza and board games in the Mission. Contentment.
  • 2010: Friends. Goodtime fooddrinkery up near Buena Vista Park, including an extended conversation about old time weights and measures, e.g. the furlong and the dry vs. wet gallon. The presence of an iPhone puts a new twist on this kind of talk. Contentment.


  1. Health.
  2. Mindfulness.
  3. More fandom.
  4. Preparedness.
  5. Small pieces, quickly published.