Archive for the ‘Uncategorized’ category

Using the Web in Earthquake Recovery

This is a copy of a guest post I wrote for the Government Web Toolkit Blog. It describes some of the experiences gained and lessons learned in using the web to deliver interactive maps to the public following the 2011 Canterbury earthquake, based on my time at the Canterbury Earthquake Recovery Authority (CERA) from April 2011 to September 2012.

A picture can paint a thousand words. In an earthquake recovery, interactive maps are worth a million.

February 22nd 2011, 12:51pm: I’m working from home, lying on my bed, reading email on my iPhone. Thirty seconds later, my city, my life, and my future had changed irrevocably. Anything not bolted down was on the floor, and half of it was smashed. Computer monitors, TVs, bookshelves, food from the fridge. The power went off, then stayed off for five days.

Mobile calls worked for a few minutes, then failed. Texts became patchy after an hour. The only thing that was semi-reliable was Twitter over 3G. My 10-year-old son was at school in the central city and didn’t get home until 9pm that night; it took until then for me to know he was OK.

Within three hours of the quake a group of volunteers had set up http://eq.org.nz, a crowdsourced effort using an open source disaster response platform to provide maps and information. Within a day, http://canterburyearthquake.org.nz/ was set up (on WordPress.com) by the official Civil Defence response team, Environment Canterbury, and Christchurch City Council staff. The technologies that worked were web-, cloud- and mobile-based. These teams delivered in a rapidly responsive and agile way to get information to people on the ground.

Disaster responses and recoveries are about people and things at places. They’re inherently geospatial in nature. Maps make a huge difference. Interactive, dynamically updated maps even more so.

Serving the Public

Fast forward to 23rd June 2011. Less than three months after being established, the Canterbury Earthquake Recovery Authority (CERA) had completed the initial process of land zoning based on detailed geotechnical investigations. Like everything in the recovery, time frames were tight. People want government decisions to be based on sound scientific and economic evidence. They also want to know where they stand (and can live), as soon as possible.

CERA needed a way to let people see exactly which zone their house was in. That required an interactive website, capable of taking a massive initial load, implemented in a very short time frame.

In stepped Trade Me.

Working with Tonkin & Taylor, the engineering firm that had done the geotechnical investigations and created the zoning maps, Trade Me built Landcheck in four days. They served the site from their server farms in Auckland and Wellington. There were 5 million page views, and 3.3 million individual property searches in the first 24 hours of the site being live. Trade Me did this for free, for the people of greater Christchurch.

By September 2011, CERA had taken over the hosting of Landcheck, following three further land zoning announcements that generated about 10% of the initial load. The functionality was migrated to the My Property tab on the CERA website (in Drupal), hosted by Egressive (now Catalyst IT). Interactive mapping was added by NorthSouthGIS, serving map layers from their Revera-hosted ArcGIS server. A land announcements timeline map was also quickly built on the CERA website using Google Maps.

But the story didn’t end there. The Department of Building and Housing (now the Building & Housing Group within the Ministry of Business, Innovation, and Employment) had been working hard on developing the ‘Technical Categories’, describing expected land performance and damage in further earthquakes, and the house foundation systems likely to be required to withstand future quakes. In October 2011 CERA needed to announce the new technical categories, and knew they’d generate a similar level of interest to the initial Landcheck announcement. There wasn’t time to build out physical server infrastructure similar to what Trade Me had deployed, and it would have been very expensive. Within 48 hours of knowing the announcement was needed, Egressive had prototyped a scalable solution using the Amazon cloud platform as a front-end cache and application server. The back end was locally hosted Drupal servers, with a Drupal module that automatically managed the use of Amazon as a content delivery network (serving images, PDFs and static files for the whole CERA website).
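
The heart of that hybrid pattern is a simple routing rule: static assets are served from the CDN, dynamic pages from the locally hosted Drupal servers. Here’s a minimal sketch of that rule; the host names and extension list are illustrative assumptions, not CERA’s actual configuration.

```python
# A minimal sketch of the hybrid hosting pattern: static files are served
# from a CDN domain, dynamic pages from the local Drupal origin.
# The hosts and extensions below are assumptions for illustration only.
import os.path

STATIC_EXTENSIONS = {".png", ".jpg", ".gif", ".pdf", ".css", ".js"}
CDN_HOST = "https://cdn.example-cera.net"     # hypothetical Amazon-backed CDN
ORIGIN_HOST = "https://www.example-cera.net"  # hypothetical local Drupal origin

def rewrite_url(path):
    """Route a site-relative path to the CDN if it is a static asset."""
    _, ext = os.path.splitext(path)
    host = CDN_HOST if ext.lower() in STATIC_EXTENSIONS else ORIGIN_HOST
    return host + path

print(rewrite_url("/sites/default/files/land-zones.pdf"))  # served by the CDN
print(rewrite_url("/my-property"))                         # served by Drupal
```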

Within two weeks the solution was in place and tested, the data loaded, and the announcement made. Map layers were again served by NorthSouthGIS through Revera, who made sure there was enough bandwidth to the map servers for the announcement. People could look up their technical category through the My Property section of the CERA website, and view interactive maps of technical categories and land zones.

In the following year a similar approach was taken to deal with the major peak load of the Christchurch Central Recovery Plan launch by the Christchurch Central Development Unit (CCDU). More sophisticated map layers and viewers were also launched at maps.cera.govt.nz, including mobile map viewers on iPhones/iPads, Androids, and Windows Phones using the free ESRI ArcGIS mobile app.

What Was Learned?

These experiences taught us several things:

  • Drupal combined with Amazon cloud services is a robust and cost-effective way to implement a hybrid solution, locally hosted yet massively scalable, that can be dialled up and back for announcements that will generate large peak loads.
  • As Dave Snowden says, the necessary preconditions for innovation are starvation, pressure, and perspective shift. If you put your IT vendors under significant time pressure, with fewer than normal resources, in a situation that’s really important to get right, they’re much more likely to come up with innovative solutions than they are when everything is comfortable.
  • When you’re under pressure, without certainty about what you’re building or how the functionality will develop over time, agile and iterative approaches work. You get to deploy quickly, get feedback from real users, and improve the solution. We showed that it is possible for government agencies to be agile.
  • People love maps. The large majority of people think visually and spatially. They want to see information about where they live, and how things around them are affected by government policy. Interactive maps tell these stories really well.
  • Making map layers openly available for download and through live feeds as ‘open data’ makes work much more efficient for the rest of government, and for the private sector. Not only are government agencies required to release data for reuse, doing so creates real benefits. Christchurch City Council, Waimakariri District Council, Selwyn District Council, Environment Canterbury and many central government agencies received live updates of the land zone layers and used them in their systems. Private sector infrastructure contractors and utility companies did the same. Even Wises, a paper map book provider, used the spatial layers in new editions of their printed maps. (A sketch of consuming such a feed follows this list.)
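
By way of illustration, here’s a hedged sketch of what consuming one of those live feeds might look like. The service URL, layer index and output format are hypothetical; real values would come from the publishing agency’s ArcGIS service directory.

```python
# Hypothetical example of pulling a land-zone layer from an ArcGIS Server
# REST endpoint as GeoJSON. The URL and layer index are placeholders.
import requests

LAYER = "https://maps.example.govt.nz/arcgis/rest/services/LandZones/MapServer/0"

params = {
    "where": "1=1",     # no filter: fetch every zone polygon
    "outFields": "*",   # all attribute fields
    "f": "geojson",     # GeoJSON output (supported by recent ArcGIS Servers)
}

zones = requests.get(LAYER + "/query", params=params, timeout=30).json()
print("Fetched", len(zones["features"]), "land-zone features")
```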

What’s Next?

Right from the start CERA was supported by Land Information New Zealand (LINZ), and the NZ Geospatial Office (NZGO). Their knowledge of implementing mapping solutions, spatial data standards, and the way to share spatial data across agencies was invaluable. The government is now looking to extend the work already done, leverage the use of location-based information to accelerate recovery and rebuild efforts in Canterbury, and ensure other regions benefit from this innovation. This programme of work is called the Canterbury Spatial Data Infrastructure (SDI) Programme and will be led by LINZ/NZGO in collaboration with the Canterbury agencies.

So, you can expect to see more geospatial innovation from Canterbury in the months to come.

Sharing across the ditch

The great thing about open government is that it’s just that: open. The opposite of closed and proprietary. We’re not just sharing the raw data, we’re sharing what we’ve learned about opening data: the guidance, toolkits, governance models and other supporting material.

In that light I spoke last week in Sydney to a group of staff involved in the New South Wales ICT Strategy. In NZ we’ve made some great strides over the last year in the implementation of our Declaration on Open and Transparent Government, and in particular the Toolkit and guidance provided by the Data and Information Reuse Secretariat.

So, in the interests of sharing, here are the links I mentioned in my talk:

And some examples of data reuse

and even more here.

Interestingly, while NZ is perhaps more advanced in opening government data, Australia has made substantial progress on research data management, through their new National Research Investment Plan and the work by the great people at the Australian National Data Service. I think there’s much in that space that we can learn from them.

Loving the language of spam

I’ve always been fascinated by the evolution of spam. I think this is because of an interest in language and communication through text. Spam represents an attempt to inveigle, to persuade using the written word. To do this it not only has to convince, it also has to get past the filters that are designed to sift it out before it gets to unsuspecting recipients.

This leads to a kind of sophisticated tightrope walking, between getting the message through (the filters), and getting the message across (to the readers).

A lot of this is to do with the gulf between what the human brain can recognise and make sense of, and what the human brain can program computers to recognise and make sense of. This gulf between embodied sense-making ability and codifiable sense-making ability is where the spammers live. As that gulf slowly narrows it’ll be interesting to see what happens.

The technique I find most interesting at the moment is the use of images to deliver the message (e.g. an ad for Viagra), with some pseudo-intelligible text to fool the filters. I have no idea how they generate this text, but it is intriguing to read:

“Men lived in families, tribes, and races, at feud super with one another, command frowning yawn plundering, outraging, and killing. All the evil seems to exist through farm some cause independent of school sweep the funny conscience of men. Those men brain scary who accept a slip new truth when it enjoy has gained a certain degree of acceptance, always pass over

The antagonism between amuse life and the bring conscience may be removed cushion in two ways: by do a change of life or by It cannot be. uptight orange What is the mow use of the clergy, who don’t believe picture in what they preach? love Those who do evil through ignorance of the stretch truth quit provoke sympathy with their crash victims and repugnance”
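
I can only guess at the generator behind text like that, but one plausible technique, sketched below purely as speculation, is to splice random words into out-of-copyright prose: the result keeps enough natural-language statistics to slip past a naive Bayesian filter while reading as nonsense.

```python
# A speculative sketch of how such filler text might be produced: splice
# random words into real prose. This is a guess, not the spammers' method.
import random

FILLER = ["super", "frowning", "yawn", "farm", "sweep", "funny",
          "brain", "scary", "cushion", "uptight", "orange", "mow"]

def salt_text(text, rate=0.2, seed=None):
    """Insert a random filler word after roughly `rate` of the words."""
    rng = random.Random(seed)
    out = []
    for word in text.split():
        out.append(word)
        if rng.random() < rate:
            out.append(rng.choice(FILLER))
    return " ".join(out)

print(salt_text("Men lived in families, tribes, and races, at feud with one another."))
```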

Did you know you are on a biofuel bus?

I’ve been getting the bus the last couple of weeks as I’m waiting for new glasses so I can drive again. One of the things I noticed was that some of the buses in Christchurch have small biofuel notices on them, and there’s one with a full paint job promoting biofuels.

I had a look on the Environment Canterbury website, and sure enough, they’re running a trial. It’s only a 5% blend, and only on a few buses, but it may increase to 20%, and it’s a start.

Getting on the bus last night, there was a woman with a clipboard asking each passenger “do you know you’re on a biofuel bus?”. It made me wonder about the purpose of asking the question. Was it to see how many people were aware of the pilot? Was it to inform people of the fact that biofuels were being used? It’s one of those problems in social research. You can’t ask the question without changing the knowledge of those you are asking. I could almost see the more astute of the passengers thinking “well, yes, because you’re asking me that question, and the question implies that I am about to get on a biofuel bus”.

But, whatever the rationale behind their questioning methods, I think it’s a fantastic initiative and I’m going to get the bus more often, even when I do get my glasses.

Environment 2.0 … The World is Us/ing Us

A lot of the work I’ve done recently is in the Environment sector, on whole-of-sector data, information and knowledge issues. One of the fascinating things that’s starting to happen is the national and international federation of biodiversity data. I really like the intro video on http://www.eol.org/home.html and the promise this sort of thing has for better managing the world we live in.

It’s stylistically quite reminiscent of the original Web 2.0 … The Machine is Us/ing Us video created by Michael Wesch, Assistant Professor of Cultural Anthropology at Kansas State University. So it’s also interesting to see these kinds of communication methods being quickly adopted and repurposed into different contexts.

For me these are all examples of the emergence of Pierre Teilhard de Chardin’s Noosphere. The thinking mind of the living breathing earth.

Museum 2.0

A two-day conference with 200 librarians, museum staff, and archivists. It could have been musty, and death by droning PowerPoint, but it wasn’t. I’ve just been at the National Digital Forum, and it was fantastic to see Web 2.0, folksonomies, podcasts and mashups being embraced by this community. One of the speakers had already made a del.icio.us account with bookmarks for the sites he referred to, common enough among the digerati, but fairly radical for museum people.

There were some magnificent speakers, including Toby Travis from the Victoria and Albert Museum, and Susan Chun from the Metropolitan Museum of Art, New York. These and other institutions are enabling users to tag art works and gallery objects online to improve discoverability, and are using blogs and podcasts to reach a wider audience. Some were using Flickr mashups to let the public contribute images to exhibitions, setting up famous dead photographers as Flickr users, creating gallery profiles on MySpace, and even building a gallery in Second Life.

One of my favourite examples was a ‘Design your own Arts&Crafts Tile‘ flash application. People could create tiles, and rate each other’s. They had a very limited number of patterns and colours to work with, and only a title and brief description field for metadata. Given this very limited palette (in fact, probably because of it), it was amazing to see what people did, and the kind of dialogue by picture and metadata that occurred as a result.

Another interesting aspect was the thought going into the interaction between folksonomies and the rigorous academic taxonomies used by archivists and curators. Folksonomies were being used to inform enhancements to taxonomies and to the more formal descriptive content around art works. Discussions were starting about faceted folksonomies: the notion of hierarchies and clusters of tags. I’m wondering whether this will get us closer to the emergence of the semantic web. Certainly having the ‘memory institutions’ involved in this process is going to add something; whether that’s needed academic rigour and inspiration, or restrictive pedantry, remains to be seen. They’re increasingly excited and engaged though, and that’s a great start.
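
As a toy illustration of where such clusters might come from, the sketch below counts how often pairs of tags co-occur on the same object; frequently co-occurring pairs hint at facets or hierarchy links a curator could formalise. The sample data is invented.

```python
# Toy sketch: tag clusters emerging from co-occurrence in a flat folksonomy.
# The tagged objects are invented; a real system would query its database.
from collections import Counter
from itertools import combinations

tagged_objects = {
    "tile-001": {"arts-and-crafts", "blue", "floral"},
    "tile-002": {"arts-and-crafts", "blue", "geometric"},
    "print-017": {"impressionism", "floral"},
}

cooccurrence = Counter()
for tags in tagged_objects.values():
    for pair in combinations(sorted(tags), 2):
        cooccurrence[pair] += 1

# Pairs that co-occur often suggest candidate facets or hierarchy links.
for (a, b), n in cooccurrence.most_common(3):
    print(a, "+", b, "->", n)
```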

Human Traffic

A number of European cities are trialling a traffic management approach involving a massive reduction in the number of signs and traffic signals.

The mantra is “Unsafe is safe” and the rationale is that the more you try to control people, the less personal responsibility they take. Where there are fewer rules, people take more care, and negotiate via gestures, nods and eye contact. It’s been tried in some towns in Germany, and accidents have reduced dramatically.

This thinking seems to me to resonate well with reactions against rules based approaches to managing organisational performance. The more you trust people (within an appropriate minimal set of boundaries) the more you get emergence of functional, adaptive behaviour.

OnlineGroups.Net

It’s a big day, OnlineGroups.Net have released their ‘start a site’ and ‘start a group’ service using a paid subscription model.

I’ve been using the technology for three years now, right from a very early alpha stage. This was mostly because I shared an office with the creators of the software. I’ve seen it evolve from being pretty rough and ready to being extremely functional. There are still a few things I don’t like, but overall it’s fantastic.

I’m a member of ten active groups, and am the administrator for a site that comprises six groups, and is likely to have many more. The things I like the best about the system are:

  • the user management – where users can manage their account details in one place, while being members of many groups
  • the centralised file storage (this makes a huge difference for committees and groups that aren’t on the same IT infrastructure)
  • the focus on good online group facilitation processes

Enterprisey

I recently spoke at the Brightstar 6th Annual Strategic Intranets and Portals conference. I was going to blog some notes at the conference, but I stopped as soon as I saw that Michael was doing a much more thorough job than I could have done.

One thing that Michael’s presentation raised for me was the relationship between Web 2.0 and enterprise collaboration/KM technologies. Which is driving which? I’ve said for a couple of years now that corporate users’ expectations have been raised by Google. How is it that you can normally find what you’re after on the web, where there are 8 billion pages, yet you can’t find anything in your corporate document repositories, where there are only a few hundred thousand documents?

Michael asked whether perhaps Web 2.0 was just the bringing of ‘enterprisey’ collaboration functionality to the public web. To a certain extent I think that’s true, especially for those who have been using Lotus Notes for years. I also think that Web 2.0 is driving some innovations from the public web into enterprises though. Blogs, wikis, and faceted classification (tags) are, to me, clear examples of this. Lightweight, ‘paper thin’ portals like Netvibes are also examples of the kinds of customisation that corporate users may start to expect.

Google PageRank-style search power is another thing that should come into enterprises. The challenge is that there just aren’t that many links between corporate documents. The reason Google works is that a lot of relevance ranking can be drawn from the number and type of links between web pages. I think this offers a lot more promise than automated context extraction technologies like Autonomy (as fantastic as they are). It was therefore interesting to hear BEA talking about incorporating contextual links between people, documents, and groups to improve search within the enterprise portal space.
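
For the curious, here’s how little machinery the core idea needs: a minimal power-iteration sketch of PageRank over a toy link graph. The documents and links are invented, and an enterprise version would build its graph from the links between people, documents, and groups that BEA described.

```python
# Minimal PageRank by power iteration over an invented document link graph.
# A document's score derives from the scores of the documents linking to it.
DAMPING = 0.85

links = {  # document -> documents it links to
    "policy.doc": ["minutes.doc", "report.doc"],
    "minutes.doc": ["report.doc"],
    "report.doc": ["policy.doc"],
}

docs = list(links)
rank = {d: 1.0 / len(docs) for d in docs}

for _ in range(50):  # iterate until the scores settle
    new = {d: (1 - DAMPING) / len(docs) for d in docs}
    for d, outs in links.items():
        share = DAMPING * rank[d] / len(outs)
        for target in outs:
            new[target] += share
    rank = new

for d, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{d}: {score:.3f}")
```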

I await these developments with interest.

No One Cares

This is my new favourite t-shirt design. I think it’s both amusing and enlightening. Amusing because perhaps some people do get all excited about their blogs and want everyone to read them, as evidenced by these “read my daddy’s blog” suits for babies.

Enlightening because it’s got me thinking further about a podcast I listened to recently, which talked about the way people are actually using blogs. Anil Dash from Six Apart talked about blogs and LiveJournal, and the way that the anthropologically derived numbers of 15 and 150 seem to play out here. So while a very small percentage of bloggers have very large audiences, most people’s blogs are read regularly by fewer than 15 people. They’ve become another medium for communicating with our close friends, and for forming loose relationships with people on the periphery of our social circles.
