Category Archives: Computer magic

Political Tech vs. Civic Tech

The campaign post-mortems are pouring in, unveiling the computer magic behind the Obama campaign. We should probably be thankful that the conversation has evolved from 2004-2008’s obsession with social media into a newfound lay interest in data aggregation and empirically valid testing. I’m learning a lot reading these articles.

This stuff is cool, but the Holy Grail Subject Line is not why we got into this work in the first place. Tom Steinberg has responded with an inspiring post encouraging the talent behind campaign tech to consider building civic technologies. Tom outlines some really important (and thus far unfulfilled) goals for civic tech to aspire to, including:

  1. These tools, apps, platforms, whatever, need to scale way beyond even the most popular instances we have today to be considered worthy of actually reshaping society and impacting the millions of people who have never heard the phrase “Gov 2.0”.
  2. We should attempt to produce tech that creates transformative change at the seismic level of so many internet-driven disintermediators, from TripAdvisor to eBay. This means creating new possibilities and patterns of behavior so profound, we can’t even return to the Old Way of doing things.

I agree with Tom that it’d be good for the world if some of the campaign ninjas become improved-democracy ninjas (notwithstanding the obvious counterargument that elections have very real results). There’s also the argument that these ninjas achieve similar ends when they open source tools built with the “vast amounts of cold hard cash” generated by US presidential elections. These tools, like improved election day reporting on the Ushahidi platform, could be and probably will be redeployed for general civic use.

I’m confused, though, when Tom’s argument focuses on the simple fact that, in a campaign, you have opposition working against you and your team. Many of the voter outreach tools the Obama campaign built this cycle will live on in various incarnations, including some version of trickle down or transfer to the opposition’s side. The rival campaigns’ technological contributions worked to dilute one another’s chances of winning the election, but I don’t see a strong argument that their tech cancels out the other team’s contributions or general progress.

TripAdvisor and its ilk made the customer stronger in their relationship with hotels, but the battle is hardly over. The many services and products we rate and review are learning to game these systems, whether by creating bots, paying people on Mechanical Turk, or just providing better customer service at key moments (which is clearly a win of some sort for the customer). There’s far more money involved in hospitality industry booking than there was in the entire US Presidential race, and it, too, is a technological arms race.

Partisan tech might drive competition in the election season, but in general, I feel that the left-right political divide weakens the political tech industry by cutting the potential market in half. I’ve always had the impression that one factor causing innovation in political technology to lag significantly behind the commercial technology industry is potential market value. Political campaign tech is a subcategory of the much larger tech industry and was, until recent years, one of its sleepy backwaters. The fact that the market for campaign technologies and its suppliers is, in many places, split in half by the non-economic force of partisanship only contributed to the lack of real investment that drives commercial tech. This split also, in my opinion, allowed some mediocre consultants and products to maintain market dominance where, in a more competitive market, they would have been unseated. Experienced political technologists on the left and right can likely point to tools or people on the other side they’d love to have access to for their campaign.

There are probably many other reasons that campaign tech traditionally lags behind commercial tech, like a political campaign culture that for far too long saw developers as IT managers here to fix your printer, and boom-and-bust election cycles. And Obama’s tech team, in 2008 and 2012, has clearly made huge strides in changing this reality. But otherwise, I think partisanship’s division of market value is at least a factor. Why else would social change companies like NationBuilder risk their roots as generally progressive firms, and frankly, certain large-but-partisan clients, to pursue a much larger, richer global market far outside of US politics? And isn’t this how they’ll scale to the levels Tom challenges them to?

I’ll now await pushback from much more experienced political technologists.

Talking Fast II: More CrisisMapper Ignite Sessions

Luis Capelo (@luiscape) of Digital Humanitarian Network loves volunteers. DH exists to stimulate more interaction between humanitarian volunteers and large humanitarian institutions.

There’s information overload in humanitarian responses. How do we collect and make sense of all this information? Luis credits humanitarian orgs with doing the hard work of adapting, but it’s a rough sea to navigate. Volunteer & Technical Communities thrive in this environment. They’re nimble, lightweight, and technically advanced. Luis thinks it’s time to stop questioning whether VT&Cs can help, and begin to dive into how these groups can collaborate.

DH aims to create a consortium of groups that facilitates between the two worlds and reduces the cost of collaboration. They have a simplified activation process: activate volunteers, triage the volume, and forward requests to VT&Cs. They’ve produced a guide to manage the activation of VT&Cs.

July and August of this year saw the first two activations. OCHA and ACAPS came to the DH network for help. OCHA wanted to build a pre-crisis profile of every country. ACAPS wanted to include VT&Cs in the formal assessment process.

Join at

Cat Graham (@Peaceful_intent) of Humanity Road works in multinational crisismapping. They specialize in the first hours of an event (12, 24, 48 hour windows). Self-directed work teams with training and a mission come online.

Their forthcoming QuickNets microtasking platform is open source, free, and will stay that way. Each row of data has an ‘anonymize’ button. It’s been tested at RIMPAC and Pacific Endeavor exercises. 20 volunteers from 8 nations stepped up to model communications for the tabletop exercise.

Ka-Ping Yee (@zestyping) is an engineer at Google’s Crisis Response team. He uses Stratomap, the open-source tool behind Google Crisis Map. It’s on Google Code, and there’s a hosted version available. There are important datasets for crisis response, but they’re all hosted on different websites, so it’s difficult to get them in front of the right decisionmakers. Some of these maps have crappy UI, or don’t allow the databases to be combined. Data publishers have tools to publish, but map curators could offer even greater value if they were able to mashup maps and databases between various providers. We gain great insight when we can synthesize various pieces of information.

The Google Crisis Map provides a range of useful layers, from traffic to weather to user-submitted YouTube videos. Your map mashup can point to live feeds around the web, and it will be updated in realtime as those data sources are updated.

Users can share a customized view of the map, with their chosen layers enabled.

Brian Root (@brian_root) of Human Rights Watch shows us a US map depicting ICE’s deportation patterns. The group produced a report on the human rights implications of the US Immigration department’s detainment and deportation policies. They needed data to show ICE’s movements, but only ICE had the data they wanted. Through Freedom of Information requests, they were able to procure some data.

After cleaning the data, they were able to show the number of facilities involved, the facilities sending the most cases, and identify problem cases, where a detainee has been transferred numerous times across the country. They were able to visualize findings about the costs, human and financial, of transferring detainees.

But maps and data do not effective advocacy make. Drilling down to the state level was more useful for getting the attention of local media and local politicians. In January of this year, ICE issued a directive to limit the number of transfers, in large part due to HRW’s report.

Brian asks the audience to consider the human rights research that could be done with the mapping experience sitting in this room.

Clarence Wardell (@cwardell) led a research team at University of Arkansas following the Social Media and Emergency Management conference. One of the main concerns highlighted in their summary report was high-level resistance to using data because of verification concerns. They took the strategy of conceding the “is it perfect?” argument up front, arguing instead that the data was nevertheless still useful. There is room between horseshoes and hand grenades.

They mapped the verified and unverified data points together, creating a Traveling Salesman problem for disaster responders: which relief tour is the optimal use of time and ground covered? They tested multiple approaches.
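The talk doesn’t specify which routing approaches the team tested; as an illustrative sketch only, a nearest-neighbor heuristic (a common baseline for Traveling Salesman-style relief routing, shown here with hypothetical coordinates) might look like:

```python
import math

def dist_km(a, b):
    """Approximate ground distance between two (lat, lon) points in km,
    using an equirectangular projection (fine over small areas)."""
    dx = (a[1] - b[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    dy = a[0] - b[0]
    return math.hypot(dx, dy) * 111.0  # roughly 111 km per degree

def nearest_neighbor_tour(depot, sites):
    """Greedy relief tour: always drive to the closest unvisited site."""
    tour, remaining, current = [depot], list(sites), depot
    while remaining:
        nxt = min(remaining, key=lambda s: dist_km(current, s))
        remaining.remove(nxt)
        tour.append(nxt)
        current = nxt
    return tour
```

Nearest-neighbor is fast but can land well off the optimal tour; a 2-opt pass or an exact solver over the same distance matrix would be the natural next step.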

Munish Puri (@RecordedFuture) time travels by looking at data over time. Hindsight plus insight leads to foresight. Text has predictive power when it is loaded with temporal references.

They looked at temper, time, and tone in the Georgian Conflict. Volume and velocity didn’t equal veracity. He invokes Licklider’s Intelligence Amplification.

“Discovery consists of seeing what everybody has seen and thinking what nobody has thought.” – Albert Szent-Györgyi

Clionadh Raleigh (@acledinfo) is Director of the Armed Conflict Location and Event Dataset (ACLED). They report on all violent events across Africa. Dates, locations, actor types, event types, and territory exchanges are collected to produce trend reports for others.

What’s happening in Africa?
We can use SpatialKey to detect and investigate patterns. The agents of violence have changed drastically in the last few years. Civil wars are declining, but political militias increasingly threaten civilians. We can follow the movements and attacks of groups like the LRA and Boko Haram. Boko Haram, for instance, never attacks troops.

We’re seeing more trans-national threats from Islamist groups. We see Ethiopia’s increasing violence over the last 15 years.

Analysis sees increased urbanization and other factors driving today’s violence. The data informs policy, academic, and public research. They offer special reports on topics like Islamist violence.

Steven Livingston (@ICTlivingston) introduces Mapping the Maps and Crowdglobe, an Internews platform to visualize geospatial data. Different map themes emerge in different regions. Western Europe sees crowdmaps used for entertainment, leisure, and media reports.

They also surveyed crowdmappers, 80% of whom were men at an average age of 40 years. Many of the dead maps are simply a result of users experimenting with no intention of creating a map to begin with.

Only 6% of respondents promoted their maps using traditional media.

Jonne Catshoek is a conflict analyst working in the Republic of Georgia. In 2008, Russia and Georgia battled over South Ossetia. Conflict continues despite the international community’s involvement. Jonne blames security strategies that are not responsive enough to the needs of local communities. The elva platform (code here) they developed allows community representatives to SMS in community needs, where the information is then put online for the wide range of non-community actors.

Jonne estimates they spend 20% of their time developing software, and 80% of their time understanding local community need. This local trust allows better information sharing, and more reliable information.

They use SMS because smartphone penetration remains low. Trained monitors code a lot of information into a single SMS.

They’re expanding beyond violence into weather and agricultural information feeds for communities. The community is also heavily reporting security incidents, helping security providers respond appropriately.

Patrick Vinck (@developmentdata) is at the Harvard Humanitarian Initiative. They conduct surveys on countries’ peace and conflict, the way we have regular reports on health and other human factors. KoBo Toolbox is a data collection instrument designed to assist in this effort. It helps surveyors create their forms and export the questions. It even has recommendation engines, to suggest questions based on the topic being queried.

The same tool can collect voice, written text, images, and videos in one place. It works offline.

KoBo Map links to a spreadsheet (Google, CSV, etc.) and visualizes the data it contains. It’s lightweight for slow connections.
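KoBo Map’s internals aren’t detailed in the talk; a minimal sketch of the pattern it describes (parse a spreadsheet export into plottable points; the column names here are assumptions, not KoBo’s actual schema) could be:

```python
import csv
import io

def load_points(csv_text, lat_col="lat", lon_col="lon", label_col="name"):
    """Parse a CSV export into (lat, lon, label) tuples, skipping rows
    with missing or malformed coordinates rather than failing outright."""
    points = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            points.append((float(row[lat_col]), float(row[lon_col]), row[label_col]))
        except (KeyError, TypeError, ValueError):
            continue  # leave bad rows out of the map layer
    return points
```

Keeping the parser tolerant of bad rows matters on slow connections and messy field data, where a single malformed line shouldn’t break the whole visualization.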

Taylor Owen (@taylor_owen) did a PhD thesis mapping the historical US bombing of Cambodia. He quotes Nixon demanding that Kissinger “crack the hell out of them” with an unlimited budget. The result was over 200,000 sorties over an 8-year period. The 115,000 records detailing the planes, bombs, and tonnage remained secret until President Clinton opened the records to Vietnam for the purposes of de-mining the land.

Taylor used the data to produce timelines and fact-check the official timeline. The data can change our understanding of history. The bombings started sooner than officially acknowledged, continued after the peace treaties, and hit civilian areas where Kissinger said we wouldn’t bomb. We can see Watergate’s effect on bombs dropped in Cambodia.

We can see how the US bombing pushed the Khmer Rouge east, into Vietcong territory, where they developed from an agrarian socialist revolution into an anti-imperialist movement.

It doesn’t take much to radicalize a population, Taylor says. In one instance, a single bomb in a village drove 70 radicalized recruits.

Henry Kissinger’s claims on this issue, including his Second Rule of Engagement (“We won’t bomb within a mile of a village”), turn out to be wildly incorrect once we map the bombing patterns. The US bombed heavily populated areas near Phnom Penh, for example.

Government data is often deleted, or at least classified for long periods of time. We need to work with the data when it does get released, so we can understand its historical implications.

Patrick Florance is in town from Tufts University to talk about the Open Geoportal (OGP). It’s an open source project at Tufts to rapidly discover and preview geospatial data. It’s a collaborative effort to take on big geospatial datasets.

You can shop around for datasets and drop them into your virtual shopping cart. You can incorporate third party web services and share the data all over the web. It works with common web mapping tools and will have over 20,000 data layers available by December.

Josh Campbell (@disruptivegeo) is a geographer at the Department of State. He’s working to link the US government’s purchasing power for commercially available satellite imagery with VTCs’ need for that imagery.

In a few weeks, Haiti went from being barely mapped at all to being mapped in such detail that the maps could support on-the-ground action. State attributes this incredible development to the existing OSM community, empowered by satellite imagery delivered as web services.

The US government buys a LOT of satellite imagery, and is contractually required to share it. In the Horn of Africa crisis, they experimented with letting volunteer mappers map refugee camps. 29 volunteers produced 50,000 nodes of data on a previously blank refugee camp map. The volunteers provided not just roads and streets, but also footways, paths, hydrologic features, and other rich data.

The project showed that people will map where there is imagery. But they’re interested in mapping the human elements as well. The Red Cross hosted a mapping party, then went to Uganda to train locals and fire responders to use OSM. Locals annotated the data with place names and restaurants. The local community received a map of the density of grass huts (a fire hazard).

John Crowley (@jcrowley) works at Camp Roberts to connect the top-down and bottom-up aid groups. He sees four challenges:

1. Law – agencies are having trouble navigating the policies that govern their use of crowd data
2. Trust – in data, and in the processes of VTCs
3. Security – the Arab Spring and Anonymous have shown that we can’t secure all the voices in a system
4. Voice – We could have a bigger collective capacity than we’ve ever had in human history. But what happens when that moves faster than our governments?

Agencies ask, how can we control this? Bad news is, you can’t. Good news is, we can begin to see how we can coordinate in this space. But we need to step outside of bureaucracies that only allow information to flow down.

We need space to fail, where it won’t disrupt actual operations. That is the purpose of Camp Roberts. Crashing is allowed. They bring together the players in the space to bridge capability gaps.

How do we repeat Haiti?
A range of actors not traditionally in the same space were brought together. An 18-month exercise brought them up against many legal and policy walls, but they were able to show the process worked.

FEMA came and asked about doing the same with tornadoes. Civil Air Patrol and many VTCs came together and designed a new workflow, which was used in Hurricane Isaac two weeks after it was created.

Can we scale this innovation process to all the other agencies around the world? It’s an Open Humanitarian Initiative.

This process of bringing people together into safe spaces requires combining the wisdom of the old with the innovation of the new. How can we bring together the human race to learn to heal itself?

Talking fast at CrisisMappers: the Ignite Talks


Dr. Jen Ziemke (co-founder of CrisisMappers) welcomes a room packed with a wide variety of professionals and volunteers. CrisisMappers started in 2009 as a network designed to keep its members in touch. The group has grown to 5,000 members, organized on a Google Group and Ning network.

For the newbies in the room, what is CrisisMapping?
Jen breaks it down into the data coming in, the visualization of the data, and the response: how does it affect decisions on the ground?

Changing technology is clearly a primary driver of crisis mapping. Mobile technology and the ability to crowdsource shared experiences and visualize them on a map, or elsewhere, have enabled crisis mapping. Beyond mapping, this community quickly becomes a broader group of digital humanitarians, using technology to help communities affected by crisis.

Patrick Meier (the other co-founder of CrisisMappers) hops on stage. He’s currently at the Qatar Computing Research Institute, where they’re researching how to gather information from social media. In 2009, the field of crisis mapping was just beginning to take shape, and it was easy to know every project and get to know all of the people behind the projects, often over drinks. The field has since exploded and matured, and the problems it faces have grown more difficult.

The community met last year in Geneva to begin tackling some of the challenges. The computer security behind emerging humanitarian technologies was an obvious area of concern. John Crowley spearheaded this effort at the Camp Roberts event. Phoebe Wynn-Pope has taken on issues of data privacy and security, working with professionals in that space.

Wendy Harman and her team at the American Red Cross have launched the Digital Operations Center, driving home the point that social media for disaster response is here to stay.

Andrej Verity has launched the Digital Humanitarians Network to facilitate the space between volunteer technologists and large humanitarian organizations.

Geeks Without Bounds and Random Hacks of Kindness are also driving forward innovation in this space.

Disaster-affected communities are increasingly the source of big data. When Japan was struck by earthquake, tsunami, and nuclear disaster, a TON of data was shared online. Patrick argues that we need hybrid methodologies that combine crowdsourcing and the speed and scalability of advanced machine learning algorithms. We need to use multiple channels to listen to communities.

Verification remains an important challenge for aid organizations looking to make use of social media. Media organizations are actually leading the way in this department. The BBC has had a User Generated Content hub in London since 2007.

Monitoring and Evaluation are also important. Our perception of digital humanitarian technologies is high, but the real evidence supporting them is pretty thin. We need strong, independent evaluations of these technologies’ impact.

CrisisMappers 2013 will be held in Nairobi, Kenya (applause). This will be the fifth annual conference, and Jen and Patrick have decided to step down as organizers, and adopt OpenStreetMap’s model, where members of the network pitch to host and organize the annual conference.

Patrick is also inspired by TEDx, and hopes to see the CrisisMappers brand, logo, and website repurposed to support far more local events in this space.

Ignite talks!

Lin Wells (@STAR_TIDES) gives an overview of the STAR-TIDES network and its 1500 nodes. It is public-private and trans-national. They work in post-disaster, post-war, and impoverished settings, over both the short and long term (disasters vs. refugee situations), in domestic or foreign contexts, whether the military is involved or not. Their focus areas include shelter, water, power, cooking, lighting, sanitation, and ICT.

Lin says technology alone is never enough. Building social networks and developing trust, as well as understanding how policy is adapted in the field, all matter.

TIDES hosts annual technology demos at Fort McNair. They’re also present at Camp Roberts. The real-world events they’ve supported include floods, wildfires, election monitoring, and more.

Jakob Rogstadius (@JakobRogstadius) introduces Crisis Tracker, a service to crowdsource the curation of information like tweets during a crisis. Twitter poses a challenge: 140 characters is too short for computers to understand reliably, while the rate of incoming tweets is too voluminous for humans to cope with.

The Crisis Tracker platform helps people sift through the noise and identify the novel information pieces in the stream, reduce the inflow rate, and enable volunteers to act on the actionable content. Similar messages are clustered, junk is filtered out. 30,000 tweets become 2,000 stories, and then 7 unique pieces of information per hour. The system looks at metadata like the timestamp, and then the crowd annotates the information with stories.
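Crisis Tracker’s actual clustering pipeline isn’t described in detail here; as a rough sketch of the idea (greedy grouping of near-duplicate messages by word overlap; the 0.5 threshold is an arbitrary choice for illustration), one could write:

```python
def jaccard(a, b):
    """Word-set overlap between two messages, from 0.0 to 1.0."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def cluster_messages(messages, threshold=0.5):
    """Greedily group similar messages into 'stories': each new message
    joins the first story whose seed message it sufficiently resembles,
    otherwise it starts a new story."""
    stories = []
    for msg in messages:
        for story in stories:
            if jaccard(msg, story[0]) >= threshold:
                story.append(msg)
                break
        else:
            stories.append([msg])
    return stories
```

Collapsing 30,000 tweets into a few thousand stories this way is what makes the remaining volume small enough for volunteers to annotate and rank.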

Why human involvement?
Humans can process text and images in ways computers cannot. Humans are also adaptable to rapid changes.

The system is up and running for the Syrian civil war. You can drill down into individual stories and see who shared it, links to multimedia content, and similar stories.

Volunteers have appreciated the platform’s ability to aggregate and filter. The system picks up stories and ranks and auto-sorts and filters the top stories. A 6-8 volunteer team can pull out the top items. With 40-60 volunteers, you can have a detailed log of the event at a very local level.

The project is free and open source.

Mona Chalabi (@MonaChalabi) brings us back to the problems posed by both slow- and sudden-onset crises. Crisis maps were unable to prevent delays in aid distribution in Haiti. Have we reached a plateau in the use of GPS? FedEx uses RFID to track packages. It’s a barcode on steroids: a chip stores coded data, an antenna transfers the signal, and a computer accepts it for processing. Many subway cards use this technology.

In Haiti, hundreds of containers arrived each day, and those that were recorded were tracked using manual processes like Excel spreadsheets, leaving plenty of room for error. RFID would be much more efficient at managing the supply chain. The chips can be re-used.
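As an illustrative sketch (not any specific agency’s system), the step up from manual spreadsheets can be as simple as an append-only scan log keyed by tag ID, written automatically at each checkpoint reader:

```python
def record_scan(log, tag_id, checkpoint, timestamp):
    """Append an RFID read as (timestamp, checkpoint) under the tag's ID."""
    log.setdefault(tag_id, []).append((timestamp, checkpoint))

def last_known_location(log, tag_id):
    """Checkpoint of the most recent scan for this container, or None."""
    scans = log.get(tag_id)
    return max(scans)[1] if scans else None
```

Because every read is machine-generated and timestamped, there is no re-keying step where transcription errors (or tampering) can creep in.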

The private sector has used RFIDs to great effect. The effective distribution of resources in a crisis can be the difference between people being fed and people being tear-gassed by UN troops afraid of a large crowd. RFID also combats information asymmetry in the space: the information is automated rather than input by one group and verified by another, and anyone can use it. Simple supply chain logistics remain a major challenge for aid agencies and governments in crisis, and automated tracking makes it more difficult for corrupt officials to interfere with the chain of supplies.

Nate Smith (@nas_smith) is with Development Seed and Mapbox. There are many challenges in communicating the complicated factors that go into a crisis situation.

Context is important, like the conditions in the Horn of Africa prior to the droughts.

The Sahel Food Crisis project was much more than mapping, Nate says. Getting access to the raw data and communicating it in an appropriate way was part of the challenge.

Workflow is a problem. How we access data, and do things with it, can slow things down. FEWS NET maintains its data in PDFs, which slows down developers looking to do anything with it.

What are the right colors and pieces of information in a map to express information?

We must design for shareability: tools must be equipped for people to reuse their outputs. If you’re publishing a lot of maps, your website should expose the map endpoint, so others can take and use your maps.

MapBox is also looking at building large data browsers that will support applications. The Sahel Food Crisis site is open for collaboration on Github.

Richard Stronkman (@rstronkman) is the founder of Twitcident, another service to listen to the voice of the community during incidents. So much information is produced: 400 million tweets per day. It’s a lot of noise, but when there’s a crisis, the information rates go up.

Twitcident provides early warning, identifying increased risks and potential incidents. When incidents do occur, the team supports crisis management.

In the Netherlands, they’ve worked with police forces, an event security company, and the Dutch railway operator. On Queen’s Day, the Utrecht police force asked them to produce a map of incidents in a real-time dashboard. The team was able to identify threats towards the Royal family, leading to police visits.

At the summer carnival, the team worked to counter false rumor propagation. They found rumors at an early stage and helped the police publicly disprove them before the rumors took off.

The team was able to identify a lack of drinking water at a large scale water fight early in the event, and help organizers react.

The group publishes their findings as academic research and in the technical press.

Shadrock Roberts (@Shadrocker) works at USAID’s Geo Center. The USAID Development Credit Authority uses credit guarantees to encourage local banks to lend to underserved communities. They wanted to map their work to see how they could do it better. They had a beautiful dataset of 100,000 records over the program’s 12-year history, but the location information was all over the place.

The Geo Center team cleaned up the dataset with a hybrid human-computer method. An automatic process handled the easy fixes, but they raised a lot of eyebrows when they suggested crowdsourcing the remainder. They built an application for the task. The technical infrastructure was a hurdle, but so were the legal policies restraining how the agency could use the data. They cozied up to the lawyers and received help from VTCs to clarify what volunteers would do with the data.
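The exact parsing rules aren’t given in the talk; a minimal sketch of the hybrid triage it describes (auto-parse records that already look like coordinates, queue everything else for crowd volunteers; the regex is a hypothetical rule) might be:

```python
import re

# hypothetical rule: rows that are already "lat, lon" are easy fixes
COORD = re.compile(r"^\s*(-?\d+(?:\.\d+)?)\s*[,;]\s*(-?\d+(?:\.\d+)?)\s*$")

def triage(records):
    """Split location strings into machine-parsed coordinates and a
    queue of ambiguous records for human (crowd) geocoding."""
    parsed, for_crowd = [], []
    for rec in records:
        m = COORD.match(rec)
        if m:
            parsed.append((float(m.group(1)), float(m.group(2))))
        else:
            for_crowd.append(rec)
    return parsed, for_crowd
```

The payoff of the split is that volunteers only see the genuinely ambiguous records, which is how 300 people could clear 10,000 of them in 16 hours.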

Involving volunteers went beyond crowdsourced tasks. Volunteers began a much broader discussion about development and the Development Credit Authority. Social media mentions of the organization went up significantly. 300 volunteers took on 10,000 records in 16 hours with 85% accuracy.

The goal wasn’t just a map, but to make the data open for others. You can find it by Googling “USAID Crowdsourcing Transparency” or tweeting at @USAID_credit.

Jerri Husch is here to talk Action Intelligence. How do we make sense of the massive amounts of data? Like electrical outlets, we have competing standards and cultural influences in different places. Jerri argues that we need an adaptable standard. If we look at societies from afar, we might make the mistake of assuming everything’s objective. But the world doesn’t work that way. A cricket is a pest in one place, and a delicacy in another.

Action Intelligence allows us to manage and analyze multiple dimensions and link data. We need to know who’s doing what, where, when. We need the data immediately in a crisis, and we need it around for the long-term as legacy data.

They can link actors to one another, or link actors to actions.

The process goes like this:
1. Collect data, often with university teams, in standardized ways
2. Classify and code that data, by place and time, at micro or macro levels
3. Visualize the data. They use free, open source tools, which allow for an adaptable standard that works anywhere in the world.

Andrew Turner (@ajturner), formerly of GeoIQ, has joined ESRI as CTO of their Research & Development Center.

“Big Data” means different things to different people, but we know it’s huge. In a crisis, with very short, life-threatening situations, data can help. In Haiti, crowdsourced information was really good, but still needed someone on the ground to act on the information.

GIS analysis lets us count and examine information geospatially. We need to evolve and build learning algorithms, because our current techniques are pretty easily fooled. @Racerboy8 provided useful data from his house during the Colorado wildfires, but his house wasn’t actually in danger. He was just being helpful.

In the NYC Marathon last year, FEMA looked at sensors throughout the crowd to visualize the crowd’s movement over time and space.

We can model situations before they occur, and detect communities in advance of a crisis.

Anahi Ayala (@anahi_ayala) works for Internews, which supports local media across the globe to empower communities. Anahi’s at the Center for Innovation and Learning, looking at how to incorporate new technologies.

Merging crowdsourced data with official information has proven difficult because of the difficulty of verification. In the Ukrainian elections, they’re collecting information not only from social media, but also from trained electoral monitors and journalists. Users can see verified reports vs. untrusted sources on a map.

The team dissects all of the incoming information in an attempt to verify it. They look at the context, the content, and the source of the information. There are digital traces everywhere online. Who are you already friends with, followed by, directly engaging with?

The content itself can be verified. We can crowdsource, triangulate, follow up with the source, and look at the weather in the video you submitted.

Every event occurs in a context in a country. Reports can be verified based on knowledge of the existing situation in a place and time.

Everyone’s adopting their own verification methods. Yes, falsification of information is always possible. But so is verification. Machine learning makes verification faster and cheaper for organizations. The question today is not whether you can verify information, but how to make verification high-quality and timely enough for the information to be acted upon.

Kuo-Yu Slayer Chuang (@darkensiva) goes by Slayer. He brings us back to the Titanic, which sank in a time when SOS technology wasn’t standardized. Open GeoSMS is a standard that combines SMS and location. Smartphones can embed all sorts of geo information in messages. They are designing a user-centric application to make collaboration easier.
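
The core idea of Open GeoSMS — embedding a machine-readable location in an ordinary text message — can be sketched roughly like this. The URL-based encoding below is an approximation of the spirit of the standard, not the normative OGC syntax:

```python
def make_geosms(lat: float, lon: float, text: str) -> str:
    """Embed a location in an SMS body, Open GeoSMS-style.

    First line carries the position as a tagged maps URL so receivers
    can parse it automatically; free text follows on later lines.
    (Approximate format for illustration, not the exact OGC spec.)
    """
    return f"http://maps.google.com/maps?q={lat:.5f},{lon:.5f}&GeoSMS\n{text}"

def parse_geosms(body: str):
    """Recover (lat, lon, text) from a message built by make_geosms."""
    first, _, rest = body.partition("\n")
    coords = first.split("q=")[1].split("&")[0]
    lat, lon = (float(v) for v in coords.split(","))
    return lat, lon, rest

msg = make_geosms(25.03330, 121.56540, "Bridge flooded, need boats")
print(parse_geosms(msg))
```

Because the payload is just text, it works over plain SMS on feature phones — no smartphone or data connection required, which matters in a disaster.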

Lars Peter Nissen (@ACAPSproject)
How do we make sure the data we collect actually becomes useful for decisionmakers?
Mistakes happen in large-scale, multi-agency responses. Potential impact is stymied.
In Haiti, we only used 1/3 of the data gathered. That’s a waste of the effort exerted collecting that data.

Disasters are never what we expect. Decisions are made when they have to be made, not at some ideal point in time. And we’ll always have massive information gaps, with plenty of known unknowns and unknown unknowns.

Three principles:
1. Know what you need to know. It sounds obvious, but do you?
2. Make sense, not data. Don’t collect data if you don’t know what you’ll use it for.
3. Don’t be precisely right, be approximately right.

With Internews, they designed the GEO, Global Emergency Overview. A snapshot gives you a quick overview of what’s happening, globally. Short summaries provide a basic understanding of specific crises. Then, you can drill down into 20-page analyses that help you discriminate between different types of needs in the field.

Sara Farmer (@bodaceacat) is a core team member at Standby Task Force. It’s 2 years old with 1,000 volunteers. They’re generally known for turning social media feeds into maps. But they’re not just about information; they’re also about knowledge and analysis. They support HXL and other standards.

Their Disaster Needs Analysis (DNA) reports provide information about locales before disasters occur. Teams investigated and mapped available data for countries. They created baseline indicator sets, and set up a workflow to collect, store, and distribute the data. A fleet of scrapers converted online data into machine-readable tables. The data was cleaned into standard formats for country names, dates, and geo references. Gaps are filled in with estimates and proxies. Experts (via Hunchworks) also help fill in the data.

Phil Harris (@geofeedia) sees every demographic using social media more and more. It’s not just Facebook and Twitter – there are image-rich services that didn’t exist two years ago. Smartphones drive more user generated content.

They set geofences on London and monitored the Olympics, aggregating 175,000 posts from YouTube, Twitter, Flickr, Picasa, and Instagram. Instagram’s a surprisingly large source of posts (36%). Only 31% of the posts contain keywords like London or Olympics. The majority of these posts wouldn’t have been tracked by traditional keyword search methods. Geofeedia sells a service to monitor social media.
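
The mechanics of a geofence are simple to sketch: keep only posts whose coordinates fall inside a region of interest, regardless of what the text says. The bounding box and sample posts below are made-up illustration values, not Geofeedia’s actual implementation:

```python
# Rough bounding box around Greater London (illustrative values).
LONDON_BOX = {"south": 51.28, "north": 51.69, "west": -0.51, "east": 0.33}

def in_geofence(lat, lon, box=LONDON_BOX):
    """True if the point falls inside the rectangular geofence."""
    return box["south"] <= lat <= box["north"] and box["west"] <= lon <= box["east"]

posts = [
    {"text": "Opening ceremony!", "lat": 51.54, "lon": -0.02},  # near Olympic Park
    {"text": "Hello from Paris",  "lat": 48.86, "lon": 2.35},
]
inside = [p for p in posts if in_geofence(p["lat"], p["lon"])]
print(len(inside))  # 1
```

Note the first post would pass the fence even though it never mentions “London” or “Olympics” — which is exactly why location-based capture beats keyword search for this kind of monitoring.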

Colleen McCue (@geoeye) works in geospatial predictive analytics. Again, we can learn a lot from private sector marketers. Product positioning in the supermarket is critical. Location also matters when you’re talking about bad actors in the humanitarian world. The Lord’s Resistance Army has struck over the borders of several African nations. We can target them based on past behavior. And we can segment the population by crime types, like marketers. Looting, abduction, incidental homicides, and murders produce different geospatial patterns. Individual factors can influence violent crimes. Roads and porters are critical to a successful abduction. IDP camps and other population clusters are attractive to the LRA, just as banks are attractive to thieves.

“Distance from murders” turns out to be a major factor in instances of isolated murders, suggesting a key behavioral difference from incidental homicides that occur over the course of another crime. The group is compiling signature profiles of different behaviors, and using advanced analytics to produce actionable recommendations prior to events occurring.
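
A feature like “distance from past incidents” is straightforward to compute; the sketch below uses the haversine great-circle formula with hypothetical coordinates, as one input a predictive model might consume:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_incident_km(point, incidents):
    """Distance to the closest past incident: one geospatial feature."""
    return min(haversine_km(point, i) for i in incidents)

# Hypothetical past-incident coordinates for illustration only.
past = [(3.5, 28.9), (3.7, 29.1)]
print(round(nearest_incident_km((3.6, 29.0), past), 1))
```

In practice the feature would sit alongside others mentioned in the talk — proximity to roads, camps, and population clusters — in whatever model produces the recommendations.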

Kalev Leetaru brings us back to 38,000 years ago, where we find the first written records. 550 years ago, we get the printing press. Today, we’re producing incredible amounts of information and written records as a species.

The history of conflict can teach us about the present. We can visualize NGO reports and the global media tone towards a nation like Egypt or a leader like Mubarak to see when they’ve lost global credibility. We can track geographic affinity for a leader like Osama bin Laden. We can map conflicts within a nation by various factors.

Dave Warner is “dangerously over-educated and works in a memo-free environment.” He wants to make smart people smarter and explode the dots on the map into more complicated pins that contain significantly more information. His maps look much more like something out of a strategy video game than GIS software.

Dave mapped an audience by their Wikipedia entries, and academics by geography across the country.

Smarter Cities, Better Use of Resources?

If you’ve read a magazine or traveled through an airport in the last couple of years, you’ve probably seen ads for IBM’s Smarter Cities initiative. Today in our Post-Oil Shanghai course, we got to learn about some of the projects behind the very public campaign. Dr. Lisa Amini is the first director of IBM Research Ireland, based in Dublin. They focus on creating urban-scale analytics, optimizations, and systems for sustainable energy and transportation.

Lisa’s group focuses on transforming cities with:

  1. Sensor data assimilation: how do we ensure data accuracy, and account for the volume of data that comes in from sensors deployed at a metropolitan scale?
  2. Modelling human demand: how do we design a model robust enough to reliably infer demand and people’s use of city infrastructure?
  3. Factoring in uncertainty: we’re talking about humans, here.

Sensor Data
We have a massive amount of diverse, noisy data, but our ability to use it productively is quite poor.

One reason Lisa’s team is based in Ireland is that Dublin shared its municipal data on energy, water, and other core services. IBM wanted to focus on making use of available data, not laying down new sensors. Lisa shows us a map generated by bus data. The data is much more granular at the city center than in the suburban outskirts. And even downtown, the GPS isn’t terribly accurate, and sometimes locates buses smack dab in the middle of the River Liffey. This complicates efforts to infer and improve situational awareness.

Bus bunching is a major problem in cities. Buses that begin ten minutes apart get slowed down, and end up clustered in bunches, with long waits in between. One goal is to dynamically adjust schedules and routes to compensate for predictable bunching conditions.
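
Detecting bunching in sensor data is conceptually simple: flag it whenever the headway between consecutive buses at a stop falls well below the scheduled spacing. A minimal sketch, using invented arrival times rather than Dublin data:

```python
SCHEDULED_HEADWAY = 10  # minutes between buses, per the timetable

def bunched_pairs(arrivals, threshold=0.5):
    """Return consecutive-arrival pairs closer than threshold * scheduled headway.

    `arrivals` are minutes past the hour at a single stop.
    """
    arrivals = sorted(arrivals)
    limit = threshold * SCHEDULED_HEADWAY
    return [(a, b) for a, b in zip(arrivals, arrivals[1:]) if b - a < limit]

# Buses due every 10 minutes; the 11:00 and 13:00 arrivals have bunched.
print(bunched_pairs([0, 11, 13, 27]))  # [(11, 13)]
```

A realtime system would run this continuously per route and feed the flags into the dynamic schedule adjustments Lisa describes.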

Another project looks to improve traffic signalling, not just for public transportation, but all traffic. Scientists are finding links between vehicle emissions and health, and certain urban corridors and certain times see a dangerous buildup of pollutants. Smarter planning could help us at least prevent this buildup from taking place around schools and hospitals.

Lisa talks about Big Data, but also Fast Data. It’s continuously coming at you, and if you can leverage it in realtime, it can be much more useful than a study conducted well after the fact. Her team is working on technology to make use of data as it comes in, and construct realtime models and optimizations. They can see bus lane speed distribution across an entire city of routes and a fleet of a thousand buses, accounting for anomalies past and present.

Sensor data’s great, but what do you do when a segment of the bus route is flashing red? Often, a disruption requires a person going to the scene to find out what’s wrong. The IBM team is experimenting with Natural Language Processing to determine if the cause of traffic is a Madonna performance at the O2 Centre. They can analyze blogs, event feeds, and telco data. Twitter isn’t useful for this yet because of the low percentage of Twitter users with geocoding enabled on their tweets.

Congestion is a significant contributor to CO2 emissions, so proactive traffic control is becoming an important tool. Europeans are more and more concerned about the livability of their cities, and even when they’re not, the EU Commission is happy to regulate. Cities are excited to avoid paying heavy fines, and invest in technologies that help avoid such costs.

At the individual level, congestion charging only works if there’s a reasonable alternative to driving. Even then, Praveen notes, it creates equity issues, where the wealthy can afford to drive into the city, and the poor cannot.

Lisa’s team is also modeling coastal water quality and circulation patterns. One of the big problems is the treatment of water in waste plants. These plants treat the water to a static chemical index, and then release the water back into the world. Marine life dies, and we have toxins in the water, because large rainfall creates road runoff into the water system, and tidal conditions can push water back upstream and hold the toxins in place. People are deploying sensors across the water systems, which is a huge improvement on annual testing conducted by a diver. But sensors don’t work incredibly well underwater – they’re limited by range and “fouling” of data.

New technology uses light sensors to understand the movement of water, which, combined with other sensors and de-noising models, can produce a cleaner picture of what’s happening across the bodies of water.
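
One basic de-noising technique that copes well with the isolated spikes fouling produces is a sliding median filter. This toy version, with invented readings, shows the idea; IBM’s actual models are surely more sophisticated:

```python
def median_filter(readings, window=3):
    """Replace each reading with the median of its neighborhood.

    A median (unlike a mean) simply discards a lone spike, which is
    what a fouled sensor sample tends to look like.
    """
    half = window // 2
    out = []
    for i in range(len(readings)):
        lo, hi = max(0, i - half), min(len(readings), i + half + 1)
        out.append(sorted(readings[lo:hi])[(hi - lo) // 2])
    return out

# The 42.0 spike (a fouled sample) disappears from the smoothed series.
print(median_filter([7.1, 7.2, 42.0, 7.3, 7.2]))
```

Combining several such cleaned streams is what lets the team build the “cleaner picture” across bodies of water.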

Modelling Human Demand
How people move, interact, and how they prefer to consume resources.
They can improve city services by taking advantage of telco data, smart car data, and other private and public information. The findings show a surprising spatial cohesiveness of regions. Geography still plays a huge role in how we look for services, communicate, and travel. Cellphone path data can illustrate points of origin that can better inform the planning of transportation paths. Political lines are a particularly ineffective way to organize services.

In the energy space, two trends have converged: First, we have more and more renewable energies, but they’re only available when the wind blows and the sun shines (efforts to store this energy notwithstanding). Our fossil fuel power plants require careful management, and must be ramped up and down gradually. Ireland could actually use more wind power than it currently does, but it would have adverse effects on the traditional plants.

The second trend is smart meters, which provide much more information on how energy is used. This allows for demand shaping, dynamic pricing, and smart appliances that act based on this information. But the energy companies are structured around predicting national energy demands, and follow very conservative policies that optimize for fulfilling peak demand. Energy companies are learning to forecast energy demands for pockets, rather than huge regions, and to take advantage of renewable energy sources with pilot projects. They foresee running hundreds of thousands of dynamic energy models, rather than their current one-model-that-rules-them-all.

A project with Électricité de France simulates massive amounts of realistic smart meter demand data to test future scenarios. They’re building additive models based on human events like holidays and residential vs. commercial energy usage. The IBM Research team has built complicated flowcharts to identify compelling datastreams.
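
The shape of an additive demand model is easy to sketch: predicted load is a base profile per segment plus adjustments for events like holidays. All numbers below are invented for illustration, not EDF data:

```python
# Hypothetical per-meter baseline load (kW) and holiday adjustments.
BASE_KW = {"residential": 1.2, "commercial": 4.5}
HOLIDAY_EFFECT = {"residential": +0.3, "commercial": -2.0}  # people home, shops shut

def predict_load(segment, n_meters, holiday=False):
    """Additive model: (base + event adjustment) scaled by meter count."""
    per_meter = BASE_KW[segment] + (HOLIDAY_EFFECT[segment] if holiday else 0.0)
    return per_meter * n_meters

print(predict_load("commercial", 1000))                 # 4500.0
print(predict_load("commercial", 1000, holiday=True))   # 2500.0
```

Running one such model per neighborhood “pocket,” rather than one national model, is what the hundreds-of-thousands-of-models vision amounts to.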

Utility leaders are forced to make decisions that are fraught with risk and uncertainty. It’s not just optimization, but social welfare and balancing competing costs. Lisa would like to incorporate the notion of risk into the technological systems. When your phone tells you there’s a 10% chance of rain today, it’s not very actionable information. Medical tests and treatment plans can be equally infuriating in that they fall short of complete predictability. How do you communicate information that carries risk with it so leaders can make decisions?

Water systems are interconnected, with water treatment plants, households, and geographical features demanding different priorities. Water utilities spend enormous amounts of energy moving water from one place to another, losing between 20 and 70% of the water along the way. We need to begin considering these systems as integrated, and acknowledge the risk and uncertainty inherent within them. When you start working on any one aspect of city services, you quickly involve other departments.

There have been many studies on providing energy and water information to homeowners to encourage conservation. The biggest change you can make? It’s not laundry or watering your lawn. Fix your leaky faucet.

Culture matters, too. Europeans expect more of their government, and citizens get up in arms when a resource like water becomes metered.

Rather than produce a perfect formula and answer to a question like municipal water demand levels, they have built models that allow for imperfections in data and can optimize for cost or service delivery.

An area of hope is to target non-experts with well-communicated information and visualizations of existing data.

What would you do if you had a city’s worth of data?
Lisa’s team is working to convince the city of Dublin to release more municipal data for others to make use of, following in the footsteps of Washington, DC, and San Francisco.

The Social City project seeks to better understand the social context of people in a city to better understand why certain groups of people aren’t getting the resources they need. The conditions in which these people live could be major drivers of why they go underserved.


Sandra: Have you thought about incentivizing people, rather than just providing information?
Lisa: Incentives don’t need to be financial. One study found that knowing how other people like you behave has the ability to change individual behavior.

Praveen: Is it feasible to offset the cost of installing smart meters with the energy savings it provides?
Lisa: Right now, it’s still a net loss, because you still don’t have systems on the energy utility side to take advantage of smart meters. But utilities know their time to adjust is limited, and governments are helping utilities to see that their time is limited.

People like to see immediate changes when they alter their behavior. Anything you can do to show people a change early in the feedback loop can be powerful (anecdotally).

Q: Can we do a better job of choosing sites for our buildings?

Lisa: There’s great data for this, but there are a lot of different influences. The telco data and the social context projects, for example, show just how many factors are at play. People may take advantage of a welfare system, but in the data, we often see them pop up once, and then disappear. They may register under different names. Cities know their service centers aren’t meeting peoples’ needs, often because of inconvenience of location.

Zack: There are a lot of policy implications in your work, and new technologies at play. You’re also in a position to educate policymakers and advocate for specific policies. What kind of barriers do you run into talking to those folks?

Lisa: Where it works is when you find some city leaders who are incredibly passionate about trying to do better and fix their city or some aspect of their city. Predominantly, people take these jobs because they do care about the city and services and infrastructure and making that better. The challenge is that there’s a lot of policy, politics, and regulation at larger levels that an individual leader can’t work around. Bus drivers’ union leaders were initially upset about the city sharing the Dublin buses’ GPS data. Are you going to spy on my lunchbreak?? Cities have histories and personalities and election cycles. Some people are afraid that the data will paint a negative picture of their work. Lisa compares it to the tide: sometimes you just can’t command it due to its scale. Leaders can’t yet prove a return on investment on a project because there’s so much uncertainty.

How to Surface the Valuable Resources All Around Us

I started my morning with an exciting (to me) development: Verizon had finally approved the Jelly Bean update to my Android phone. One of its features, Google Now, works like iOS 6’s Passbook. Both systems pull from the vast number of things your phone and connected services know about you to surface relevant information when and where you need it. The day’s weather, the score of your favorite sports team, and traffic on your commute home are pushed to you so that you don’t even have to search. It was a fitting way to start the morning as I headed to the Ph.D. thesis defense of Polychronis Ypodimatopoulos (or Pol, or @ypodim, if you’re a fan of brevity).

The question that has guided Pol’s research is how we can enable people to easily tune in to what’s going on around them. When a curious mind walks into the Media Lab, how do they find out all of the amazing things happening inside this building? The Lab has Glass Infrastructure screens, but most other buildings do not.

The pain point here is the “If only I had known…” feeling. We work on projects similar to others, without ever connecting. We could better exchange products and services, engage in joint activity, or pool resources, like finding a roommate.

Cities offer many resources and opportunities, but navigation remains a daunting challenge. High rise buildings and crowds make us feel intimidated, not empowered. How can technology help us see our neighborhoods as the rich hives of potential they really are?

Pol’s answer is the decentralized social network. His first application was a mesh network on a mobile device that showed you what was around. Writing apps on multiple platforms and limitations of battery life were issues, though. So he moved the network to the cloud with Ego. Ego sought to put the user at the center of activity, with apps circling us, rather than our current model of competing platforms, where users revolve around sites like Facebook and Amazon.

Pol added indoor location-sharing to the mix with Bluetooth devices. The Twitter-like interface allows you to literally follow your friends. He charted the aggregate location results of two competing sessions at the same event, and could visualize the depressing effect poor venue selection has on the size of your audience.

The aforementioned Glass Infrastructure is a place-based social information system comprising 40 touch screens placed around the building. It maps out the Media Lab’s many people, groups, and projects for visitors. By exposing this information in a public place for the first time, students rushed to update their projects and headshots (this novelty effect has worn off, however).

The GI architecture consists of a large touch screen in vertical orientation, which can read RFID tags on the nametags of passersby. This allows the infrastructure to provide different applications for different user classes (Media Labbers, sponsors, visitors).

The Bird’s Eye View application for the Glass Infrastructure provides a collage of faces. When you walk by the screen, your photo is pinned for the next 5 minutes so that others can see your recent presence and potentially reach out to you.

These two projects introduced a decentralized social network and a place-based social information system, but Pol sought to create a discovery mechanism that works across different contexts. Siri and Google Glass are interesting ways to present the information around us, but Pol thinks there’s room for improvement in creating the actual content of what’s interesting around us.

One such application is discovering the experts around us. This is usually done by starting with a corpus of information, such as emails or a forum, and then identifying and suggesting expertise. But when you’re in a new space, you don’t have such a corpus. FourSquare doesn’t tell us much about the people around us. Twitter has hashtags, but we have to advertise that hashtag’s existence before people know to use it. Starbird, et al. proposed hashtagging everything, which gets messy quickly. And combining Facebook + Highlight limits you to your existing social network’s reach.

Pol pulls up his own Facebook Profile. There’s a lot of information here, but it’s not enough to capture him, especially when you throw variables like location into the mix. His answer is a discovery service that centralizes our skills across users. It’s designed so that multiple discovery services could compete using the same format. You, the subject, list objects (topics) paired with a predicate (talk to me about these topics).
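
The subject–predicate–object model reduces discovery to a tag intersection. A minimal sketch of the idea, with hypothetical users and tags (not Pol’s actual data format):

```python
# Each user (subject) lists topics (objects) under a predicate.
profiles = {
    "ana":  {"talk to me about": {"python", "mesh networks"}},
    "ben":  {"talk to me about": {"espresso", "python"}},
    "chen": {"talk to me about": {"sailing"}},
}

def discover(topic, predicate="talk to me about"):
    """Return users who listed the topic under the given predicate."""
    return sorted(u for u, p in profiles.items() if topic in p.get(predicate, set()))

print(discover("python"))  # ['ana', 'ben']
```

Because the format is just triples, competing discovery services could index the same profiles, which is exactly the interoperability the design aims for.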

Pol simulated the discovery service with thousands of users at the scale of a city block. A coffee shop owner would be able to determine how many people walking by match the profile of a person likely to buy coffee. Pol sees this as a way to make markets more efficient, and let consumers group themselves to achieve better prices.

This being MIT, most of the skills listed in the demo are software development skills. Your personal taxonomy consists of tags for your skills, gender, and languages spoken. The average user contributed 15 values about themselves, which is actually the same number of datapoints Pol found in Facebook Profiles.

Pol tested at O’Reilly Ignite Boston 9 to help attendees find people they might want to talk to. TED has actually produced a similar iPhone app, TEDConnect, with the same “Talk to me about” field.

This list of information doesn’t scale, though. It quickly becomes unwieldy. So Pol looked at exposing information selectively based on the user’s current context. The lists become applications based on specific geographically-defined areas depending on whether you’re looking for dinner or a study partner.


Pol also mined a useful Media Lab listserv discussion to extract the knowledge within and import it into his system. He mapped the tips and displayed them. He asked users which format they preferred. People like the email thread for the personal touch and the contextual information provided in each email. But it takes a long time to digest many emails in a thread. The structured version extracted the valuable data, but not the personal stories behind it.


Me: Are we forever doomed to choose between inefficient but meaningful personal narratives and rich, if soulless, databases?

Pol has attempted to design a sweet spot between the two, and points to databases that link back to the personal context as a solution. His system provides a map with database entries of rich, concrete information, but each entry also shows the faces of the people who made the recommendation and a link to the email where they tell the story behind that great shot of espresso. You can have the best of both worlds.

Eyal asks if Pol has considered the emergency applications of such a skills database. Could we quickly determine if there’s a doctor in the vicinity?

Pol points out that verification of one’s medical degree would be important in such a scenario. Pol has played out a scenario where your train station is closed, and you are able to hitch a ride with someone headed the same direction. But realtime location sharing apps will probably leapfrog Pol’s tag-based attributes in this specific application.

Catherine Havasi notices a number of Google Maps pins in the Charles River and asks about moderation and verification. It turns out this was intentional, for a flashmob-by-sailboat app. But in general, Pol relies on user flagging for crowd moderation.

Privacy is managed by users themselves, who can set how many degrees of Facebook friend can view their location and attribute information.


Wiring Informal Economies with Square

Jack Dorsey (@jack) is a double-timing executive. He splits his week between Twitter and Square, two fundamentally game-changing companies he has founded. You’ve probably heard of Twitter. And you’ve likely heard of Square by now, as everyone who uses it inevitably becomes an advocate for the service. Every time you see a cab driver or food truck faced with the decision to either not accept credit cards or pay high vendor setup costs, you feel compelled to share the gospel of Square’s free setup and bare minimum processing fee (2.75%).

Jack starts with the story of two nice guys who founded a pizza company together. To keep the company intact, they promised not to date the waitstaff. Until they hired Jack’s mother. His dad broke the interoffice dating rule and gave up the pizza company. The moral being that neither Jack’s father nor Jack himself sought to be entrepreneurs. They wanted to do work they loved. Jack lists sailor, tailor, and surrealist artist as his original career paths. His goal was to teach the world to see in a different way.

Entrepreneurship is an attitude, not a declared self-identity. Steve McQueen said, “When I believe in something, I’m going to fight like hell to get it.” That’s the attitude you need to hold to start a company. Entrepreneurship is finding the intersections of life and being there before anyone else.

William Gibson said, “The future has already arrived. It’s just not evenly distributed yet.” This quote excites Jack: just think about the future that lives inside the heads of students at MIT, or inside our laptops.

The Founding Fathers’ best idea was embodied in the phrase “a more perfect union.” They understood that the work wasn’t done yet, that others would have to carry the banner and flesh the rest of it out.

We tend to emphasize these founding moments, but Jack points to every new employee and every new user’s ability to change the course of the company. Such an idea can come from anywhere. Successful organizations are the result of multiple founding moments.

Jack’s not a huge fan of the word ‘disruption’ in a startup context. He shows us a photo of what appears to be post-hurricane damage as an example of disruption. Disruption is confusing, has no purpose, and has no values. We want leadership, we want direction. The world has enough confusion. What we really seek to build is a Revolution. Jack isn’t shy about pointing to the French Revolution or Gandhi to illustrate his corporate principles.

Square is the latest version of a fundamental human behavior: commerce. Commerce, put simply, is the activity between buyer and seller. Square seeks to help ease all of the friction and frustration that lives between buyer and seller in our modern economy. They shrunk the complexity of the credit card industry down to a device the size of a quarter that you just plug into your phone.

The onerous fees, setup costs, and credit checks prevented many people from accepting money. Traditionally, only 10% of applicants are approved to receive credit card transactions. Square accepts 95% of applicants because they found more intelligent ways to verify identity and prevent fraud.

We’ve stopped carrying checkbooks. Many of us have stopped carrying cash. How long will it be before we stop carrying credit cards?

The Square team gave themselves one month to build the system, and were successful. Jack had fun demoing the app to friends and family by swiping their credit cards. The simplicity resonated with everyone they talked to. Many of the informal merchants that make up our economy, from golf trainers to food trucks, were free to accept credit cards for the first time.

Accepting credit cards grows your business. You get more customers, and they spend more. Prior to Square, merchants had to rely on POS (Point of Sale…or Piece of Sh&t) terminals. They’re big, clunky, and silly expensive. You get a receipt for a donut. It doesn’t actually track what you sold.

The analytics Square offers are exciting, too. Small businesses can learn what sells when in an intuitive interface. The Square register app runs on an iPad and replaces the DSL line, the cash register, and all of the other credit card processing paraphernalia.


Jack didn’t have to do much business development for Square. Starbucks came to them (a nice perk of being the Twitter founder). It turned out that Starbucks had many of the same problems of small vendors. The cost savings add up. Merely running an iPad rather than a PC and receipt printer saves on electricity, especially when you consider the scale of Starbucks’ stores.

As a company, Square is focused on building a product. They don’t want to get in the way of their users and merchants. They’ve engineered the company to stay out of the way.

Jack tells us the story of the Golden Gate Bridge. The “Golden Gate” refers to the strait in the bay. The water’s deep, and there are earthquakes. But the engineers had the audacity to do it, and the project came in two years early and under budget. Jack attributes this success to the pairing of design and engineering. “Design” here doesn’t mean the visual aesthetics, but also marrying the function and the form. Engineers are concerned with efficiencies and readable code. When you write code, you’re not just telling the computer what to do. You’re communicating with other coders.

The Golden Gate Bridge’s #1 feature is its 100% uptime. But it also takes your breath away. The designers had the audacity to build something functional and beautiful. They built something they could be proud of. Square takes this approach to heart.


Photo by Joe Azure

Money has been with us for 5,000 years and touches every single person on this planet. Everyone on this planet feels bad about money at some point in their life.

Square’s electronic receipts turned out to be a compelling medium in their own right. You can communicate with receipts.

A team of four people built the idyllic purchasing experience at Square. You can now walk into a store, buy a cappuccino, and walk out wondering if you paid for your coffee. They accomplished this with geofencing and communication between your smartphone and the Square-powered iPad register. Your face and name show up on the register, you give your name, and choose how much to tip. Tips have gone up 22% (interface design and default options can have serious impact here).

Square has seen crazy growth, crazy competition, and yet, Jack promises, a relaxed work environment.


Every Square piece is a real-world ad impression. The device is synonymous with the company name (an improvement upon the previous working name ‘Squirrel’), and makes enough of an impression that people Google it and find them. Jack is proud that the company now beats Wikipedia’s geometry page for such searches, vindicating his 14-year-old self.

Jack’s passionate about maintaining a transparent corporate culture. Twitter holds weekly town halls, which live on at Square as Town Squares. Notes are kept at every major meeting, and shared with the entire company. The final decision is posted at the top of the document, and the conversation that led to it is recorded below. Jack’s also a fan of standing meetings, where long, drawn-out agendas expire with his colleagues’ leg muscles.

Even with major employee growth, Square only employs 5 people with backgrounds in the finance industry. They are engineers, above all else.

When Jack pitched Square to JP Morgan Chase, they pointed out just how much money he was leaving on the table by eliminating all of the traditional fees and hardware costs. But this pro-customer behavior earned them unquantifiable (IMHO) word of mouth for their product. A taxi driver in Cincinnati was paying 15% transaction fees on credit cards, with a significant time delay before the money made it into his bank account. Square’s radically improved economics led this taxi driver to convince the rest of his union to adopt the device. The company has directly benefitted from these offline social networks simply by being great. It would be interesting to see a business case study on the financial value of leaving money on the table in exchange for your customers’ love.

Square’s analytics feature could also prove controversial. Traditional POS systems have shied away from this feature on the grounds of privacy. Square will collect huge amounts of small business data, which is valuable in aggregate.

Jack points out that 90% of the people in large organizations are paid to say “No” to new ideas. They are hired to protect existing business and eliminate risk. They need to be pushed for innovation to occur.

Square makes its money on the 2.75% fee, but must pay the interchange, so there are transactions on which it loses money. Credit card pricing hasn’t been rethought since Diners’ Club started charging merchants 7% to accept its cards 62 years ago. Square introduced Simple Pricing to let merchants pay a flat monthly fee rather than a per-swipe fee. Businesses processing between $10,000 and $250,000 do well with this pricing model. The more compelling case may be around data: merchants and consumers alike could benefit from insights drawn from Square’s data, but Jack doesn’t offer specifics yet.
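The per-swipe vs. flat-fee trade-off above is simple arithmetic, sketched below. The 2.75% rate comes from the talk; the $275/month flat price is a hypothetical placeholder I've chosen for illustration, not a quoted figure.

```python
# Sketch: break-even card volume between per-swipe and flat monthly pricing.
# PER_SWIPE_RATE is from the talk; FLAT_MONTHLY_FEE is a hypothetical number.

PER_SWIPE_RATE = 0.0275
FLAT_MONTHLY_FEE = 275.00  # hypothetical placeholder

def monthly_cost_per_swipe(monthly_volume: float) -> float:
    """Merchant's cost under per-swipe pricing."""
    return monthly_volume * PER_SWIPE_RATE

def break_even_volume(flat_fee: float = FLAT_MONTHLY_FEE) -> float:
    """Monthly card volume above which the flat fee is the better deal."""
    return flat_fee / PER_SWIPE_RATE

print(round(break_even_volume()))  # 10000 -> flat pricing wins above $10k/month
```

Under these assumed numbers, a merchant processing more than $10,000 a month comes out ahead on the flat fee, which is consistent with the talk's claim that the model suits businesses in a bounded volume range.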

On NFC: Jack doesn’t see a fundamental connection between NFC technology and payments. With NFC, you don’t get the customer’s identity until after they’ve paid. Merchants don’t have much reason to be excited about NFC at the moment.

Smart Customization vs. Mass Production

Liveblog of Ryan C.C. Chin’s PhD thesis defense at MIT Media Lab

Ryan came to MIT in 1997, earned a Master’s in Architecture and then another at the Media Lab, before entering the Lab’s Ph.D. program. He took 18 months’ leave to work on the CityCar project.

Ryan’s thesis examines smart customization, and the scientific differences between mass customization and traditional mass production. Is one better than the other? Is one more sustainable?

The CityCar is customizable on a number of levels: its base design, its adaptability to its environment (city), and its individual parts’ modularity.

Ryan hasn’t only worked on cars; he’s also studied customization of dress shirts. He chose shirts because of their low cost, frequency of use, and relatively easy traceability (see SourceMap).

Ryan started with an online customer survey of nearly 1,000 people. There are three types of dress shirt with regard to fit: standard, made-to-measure from your measurements, and custom-tailored, designed specifically for you. The average male owns 14.2 dress shirts for work, but we don’t wear them all. Very few of us own only custom shirts, whereas 76% of respondents owned only standard shirts.

He then studied how people actually acquire mass-customized vs. mass-produced products. 94% of respondents drove to buy their shirts. 63% of us clean our shirts in the washing machine, but mainly because they’re wrinkled, not because they’re dirty.

The main reason we return shirts is that they don’t fit properly. Online, mass-produced shirt retailers see a 40% return rate. That drops to 20% return rate in offline mass-produced shirt stores. Mass customized retailers see only a 5-10% return rate.

Whether they’re sold online or offline, mass-produced shirts are made pretty much the same way. But when you order online, having a truck deliver the shirt to your home produces huge CO2 savings over driving to the store yourself.

With made-to-measure dress shirts, nothing gets produced until your order comes in, at which point the order goes to a QA center in China, where an electric scooter brings it to the factory. The carbon costs add up as your shirt is flown DHL to the US.

When you get a shirt custom-tailored, the tailor comes to your office to fit you and your coworkers, then sends the order to Hong Kong. The shirts are made and flown back to the tailor’s studio, which delivers the shirt and makes any additional alterations. This back-and-forth adds carbon costs.

The vast majority of the CO2 involved in delivering your shirt comes in the last few miles, when you drive to a store. The mass-produced shirt ordered online has the lowest carbon count, followed by made-to-measure shirts ordered online.

Ryan also conducted a post-transaction customer use study using two washable RFID chips inserted into the collar stays on dress shirts. What happens after you acquire the new clothing? They built an RFID tracking system and embedded it into the office environment. Subjects would see a green confirmation light when the shirt they were wearing registered with the RFID readers.

The team cataloged a selection of sample shirts and sold them to employees at Fidelity and MIT Tech Review. They collected thousands of RFID reads over the course of the summer and color-coded a grid (or calendar, really) of how often each shirt was worn in the office.

Patterns emerge

People wear their favorite shirts on consecutive days, often in the same order. Ryan calculated an ideal shirt utilization rate: the number of shirts you own divided by the number of days you need to wear a shirt. But we favor certain shirts and shun others. Some of us achieve equal distribution, though, working through our wardrobes systematically (“first in, first out”). One man reported that he gets dressed each morning by literally going right-to-left through his closet. Another saved his custom-tailored shirt for a big board meeting, like a power tie, and felt the desired effect. A third wears only his cheap shirts, knowing that he or his children are likely to stain them, while his nicer shirts never get worn at all. Others save their nice custom-tailored shirts for out-of-office occasions where Ryan’s RFID readers couldn’t scan them, like weddings and dinners.
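Ryan's utilization metric can be sketched in a few lines: compare each shirt's observed share of wear-days (from the RFID reads) against the equal share it would get if the wardrobe were worn uniformly. The wear log below is invented sample data, not from the study.

```python
# Sketch of the shirt-utilization idea: observed wear share per shirt
# divided by the equal-share ideal (1 / wardrobe size).
# The sample wear log is invented for illustration.
from collections import Counter

def utilization(wear_log: list[str], wardrobe_size: int) -> dict[str, float]:
    """Each shirt's share of wear-days relative to the equal-share ideal.

    1.0 means worn exactly its fair share; 2.0 means worn twice as often.
    Shirts absent from the log (never worn) simply don't appear.
    """
    counts = Counter(wear_log)
    ideal_share = 1 / wardrobe_size
    return {shirt: (counts[shirt] / len(wear_log)) / ideal_share
            for shirt in counts}

# A 4-shirt wardrobe over 8 office days: two favorites, one shirt unworn.
log = ["blue", "white", "blue", "white", "blue", "check", "white", "blue"]
print(utilization(log, wardrobe_size=4))
# "blue" is worn at twice its equal share; the fourth shirt never appears.
```

This mirrors the patterns above: favorites score well above 1.0, shunned shirts score below it, and the roughly 20% of shirts that are never worn drop out of the log entirely.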

On average, we don’t wear about 20% of our shirts at all. The mass-produced shirts got worn a lot, and were generally considered favorites, even over custom-tailored shirts. Ryan attributes this puzzle to better craftsmanship in mass-produced shirts, and fewer opportunities to wear custom-tailored ones.

Lessons Learned:

  • We should move goods, not people, as much as we can. 16-ton UPS trucks are 24 times more efficient than a personal automobile for delivering goods.
  • Pull-based marketing dramatically reduces inventory. The textile industry loses $300 billion in revenue to excess stock, transportation of goods, and heavy discounting. Build-to-order automobiles are only 6% of the US market, while they represent 50% of the European market.
  • Persuasive interfaces help people make the right choices. Showing the environmental effects of fast shipping vs. slow shipping works on us.
  • We need to miniaturize retail environments. Apple has begun deploying urban boutiques, where the highlight is experiencing the product, not stacking boxes.
  • Customizable Clones: Take the top 5 shirts you wear, the ones you love and the ones that fit, and make the rest of your shirts like those. These shirts are the iterative product of the trial and error represented by the rest of your wardrobe.
  • Local production is controversial. The labor cost is still about 2.5 times higher, even when you account for transportation costs.
  • Smart materials, like Ministry of Supply’s Apollo fabric, reduce the energy textiles require after they’re produced. The fabric claims to be anti-microbial and wrinkle-free, meaning fewer trips to the dry cleaners and higher shirt utilization.
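The first lesson above, move goods rather than people, is really an amortization argument: a delivery truck's trip is shared across many packages, while a drive to the store serves one person. A rough sketch, where only the ~24x efficiency claim comes from the talk and the per-mile emission figures are placeholders:

```python
# Sketch of the "move goods, not people" arithmetic. The per-mile
# CO2 figures below are rough placeholders, not numbers from the talk;
# the point is that the truck's emissions are split across its route.

CAR_G_CO2_PER_MILE = 400      # placeholder: personal automobile
TRUCK_G_CO2_PER_MILE = 1600   # placeholder: 16-ton delivery truck

def grams_per_package(route_miles: float, packages_on_route: int) -> float:
    """CO2 attributed to one package on a shared delivery route."""
    return TRUCK_G_CO2_PER_MILE * route_miles / packages_on_route

def grams_per_trip(round_trip_miles: float) -> float:
    """CO2 for one person driving to the store and back."""
    return CAR_G_CO2_PER_MILE * round_trip_miles

# A 100-package, 20-mile route beats a solo 10-mile round trip by >10x.
print(grams_per_package(20, 100))   # 320.0 g per package
print(grams_per_trip(10.0))         # 4000.0 g per shopper
```

Even with the truck emitting several times more per mile than a car, the per-package cost collapses once the route is dense, which is why the last-mile drive dominates the shirt's delivery footprint.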

Responsible consumerism would let us assemble the ideal wardrobe, at the intersection of our own desires and environmental benefit. Ryan suggests a carbon label, like the US FDA’s nutrition labels, showing the consumer the amount of carbon involved in a garment’s production, lifetime use, cleaning, and recycling.

How can customization improve the utilization rates of all the things we produce and own? And how do we scale this customization to the scale of a city?

Ryan attributes his inspiration to the late William J. Mitchell, and the huge number of people that worked on the CityCar and other projects.

Is the era of mass customization over?
Ryan points to Joseph Pine’s continuum of mass production, customization, lean production, and craft. All are necessary.
The number of customized things is going to increase, but what’s ideal? Standard, mass-produced goods work for many purposes (like Ryan’s current outfit). But there are huge cost and environmental savings in customization. Whether everyone feels those costs and benefits will depend on actual environmental policy. Everyone would love a custom shirt, but the average mass-produced shirt is $20, while custom-tailored shirts can easily cost $80. Custom needs to become more economical.

Ryan recommends that we receive a copy of the data generated by the full body scans the TSA requires of us. We could use that data for custom clothing, health, and other purposes.

Ryan foresees an “apparel genome,” where all of our clothing is tagged and machine readable, leading to insights about how we choose our outfits, what additional outfit configurations we could create from our existing clothing, and so on. I’ve begun using SuperCook, where I catalog the food in my pantry, and the app informs me what recipes will utilize my CSA-delivered eggplants. It’s not a big stretch of the imagination to consider doing the same for our clothing.

Customized goods fit into the broader trends of rent-rather-than-own, where an increasingly urban population favors access over ownership and proximity over storage space.