Photo: Code for America

Bringing California open data to life

Okay, I admit it: Even as a champion of open data, I find that it’s often mundane to view data on a portal. Simple lists of datasets — and even the maps and charts you can create — don’t truly show the intrinsic value of data that’s been freed to benefit communities.

Photo: Esther Vargas

Driving smart city innovation with open sensor data (part 5)

While there is much technology that can be sifted into must-have, nice-to-have and maybe-someday categories without a negative impact on smart city advancement, there are a few basic pieces of technology cities will need in order to extract value from the real-time data that has already begun to flow through smart cities.

Photo: Jordi Martorell

Driving smart city innovation with open sensor data (part 4)

While it is commonly acknowledged that cities today produce massive amounts of data, it is less often noted that much of the data referenced is not actually produced directly by city systems, but rather by cities’ ecosystems of partners in domains such as transportation, waste and water management and energy services.

Railway station of Dehradun, Uttarakhand, India. Photo: Lennon Rodgers (CC BY-SA 3.0)

Driving smart city innovation with open sensor data (part 3)

An odd thing happened in Dehradun, the capital city of the northern state of Uttarakhand, when the city learned that it would receive funding as one of 100 cities chosen to participate in India’s $15 billion Smart Cities Mission. Rather than celebrating its place on the coveted list, the city found itself embroiled in a dispute that saw local activists take to the woods to hug trees in protest against Dehradun’s smart city proposal.

Place de la Nation

Driving smart city innovation with open sensor data (part 2)

You can accomplish many smart city goals in a timely and inexpensive manner by exploring options for leveraging an existing infrastructure of low-tech, collaborative information and communication technologies like mobile phones, social media, online platforms and low-cost sensor kits, before making hefty new technology investments.

City at night

Driving smart city innovation with open sensor data

For many years, open access to data has been viewed as an important means of improving government transparency and accountability and deepening citizen engagement, and today hundreds of local and national governments worldwide are using open data portals to publish data and documents that they produce over the course of their operations.

civictech

Defining civic tech

Over the past few days, I’ve been thinking about Omidyar Network’s recent report, “Engines of Change,” and the need to better label and define the movement happening around civics and government with respect to technology.

"No ugly, old IT."

‘No ugly, old IT.’

“No ugly, old IT” jumped out at me when I first reviewed DataSF’s strategic plan, “Data in San Francisco: Meeting supply, spurring demand,” and it still sticks with me – partly because someone inside government was bold enough to make this a priority and communicate it openly, and partly because it should be a mantra for everyone building civic technology.

Source: data.nucivic.com/dashboard

Participation and the cult of catalogs

“Anonymous access to the data must be allowed for public data, including access through anonymous proxies. Data should not be hidden behind ‘walled gardens.’”
8 Principles of Open Government Data

In the world of open data, there are few things that carry more weight than the original 8 principles of open data.

Drafted by a group of influential open data leaders who came together in Sebastopol, Calif., in 2007, this set of guidelines is the de facto standard for evaluating the quality of data released by governments, and activists regularly use it to prod public organizations to become more open.

With this in mind, it was intriguing to hear a well-known champion of open data at the Sunlight Foundation’s recent TransparencyCamp in Washington, D.C., raise questions about one of these principles – one typically considered sacrosanct in the open data community.

Andrew Nicklin (formerly at the helm of open data efforts for both the City and State of New York, and now Open Data Director for the Center for Government Excellence at Johns Hopkins University) asked TransparencyCamp attendees to consider some of the implications of the sixth principle on open data – which calls for non-discriminatory access to data. This principle is generally taken to mean that users of open data should be able to access it anonymously and that governments should not require users to identify who they are or what they plan to do with the data as a condition of accessing it.

While there is obvious merit to this principle, Andrew observed that when governments know who is using their data and how they are using it, there are enormous opportunities to enhance the data and make it more useful for data consumers. If governments don’t understand what users want, providing useful data that meets their needs is difficult – strictly enforcing anonymous access to data may end up being an impediment to better understanding what data users actually need.

Without being directly critical of the principle or the original intentions behind it, Andrew made a thoughtful suggestion for open data advocates at TransparencyCamp to consider. To me, these comments highlight an important issue facing the civic technology community and governments themselves – one that almost no one is talking about.

When it comes to building the infrastructure of open data – putting in place the technology that people will use to find and work with government data – very little thought seems to be given to what those data consumers actually want or need.

The idea of “build with, not for” has become a central tenet of how civic technology solutions are designed and implemented. Yet this idea seldom applies to the platforms that governments use to make open data available, which form the foundation of many civic technology solutions.

Costs and benefits

“Funding is the most cited barrier to implementing or expanding open data initiatives.”
Empowering the Public Through Open Data

A recent collaborative effort between the University of Southern California’s Annenberg Center on Communication Leadership & Policy and the USC Price School of Public Policy produced a hugely valuable report on the current state of open data in the 88 incorporated cities that make up Los Angeles County.

Based on surveys and interviews with city officials on their open data efforts, this report provides unique insights into the ways that government leaders view open data. Among the findings – government officials surveyed for the report consider funding to be the most significant barrier to expanding work on open data. This isn’t a surprise, and this sentiment is likely not unique to the Los Angeles County area.

But when taken together with other findings, it can seem counterintuitive. Along with citing funding as a constraint, government officials expressed a preference for commercial open data catalogs over open source (or free) alternatives. These commercial solutions – some of which impose non-trivial costs on local governments – appear to meet a perceived need on the part of government officials in that they are viewed as making it “easier to publish [data] and put it in the hands of the citizens.”

Commercial software generally tends to fare better in the government procurement process than open source software, so this outcome isn’t all that shocking. But the contradiction in the USC report’s findings is worth noting: cost constraints are limiting further progress on open data, yet officials prefer (sometimes pricey) commercial open data catalogs.

Cost aside, there are a few reasons why upfront investment in a commercial open data catalog may not be the best way to start a new open data effort.

Architecting participation 

The web … took the idea of participation to a new level, because it opened participation not just to software developers but to all users of the system.
– Tim O’Reilly, The Architecture of Participation

First, and somewhat ironically, public information on the cost of commercial open data portals can be hard to come by. Another report on municipal open data efforts in southern California found a wide disparity in what different governments – some just a few miles apart, and almost identical in population – pay for commercial open data catalogs. This can make it difficult for governments to know if they are getting good value for the price being paid.

In addition, commercial open data catalogs often come with visualization, mapping and charting tools out of the box. This can make it easier for governments to augment open data offerings by showing what can be done with the data. Though these tools may come at an additional price, some may view them as a way to help make the case for open data to internal skeptics – a picture (or a graph, or a chart) is worth a thousand words, as the saying goes.

From a user needs perspective, this approach feels very unidirectional – this is government telling the data community what it believes is important, not the other way around. There are a host of examples of sophisticated visualizations and applications being built with government data by outside data users. And while this approach requires outreach and engagement, there is an ever-increasing abundance of tools the data community can use to create maps, visualizations and new applications.
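
As a rough illustration of how low that barrier has become, here is a minimal sketch of an outside data user pulling a published dataset and charting it with common off-the-shelf tools; the portal URL and column names are hypothetical placeholders, not a real city endpoint:

```python
# Minimal sketch: an outside data user charting a city's published dataset.
# The URL and column names are hypothetical placeholders, not a real endpoint.
import pandas as pd
import matplotlib.pyplot as plt

DATA_URL = "https://data.example-city.gov/building-permits.csv"  # hypothetical export

# Load the dataset and parse the issue date column
permits = pd.read_csv(DATA_URL, parse_dates=["issued_date"])

# Count permits issued per month and plot the trend
monthly = permits.set_index("issued_date").resample("M").size()
monthly.plot(title="Building permits issued per month")
plt.tight_layout()
plt.savefig("permits_per_month.png")
```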

These two approaches – out of the box vs. community built – are not mutually exclusive. We can see a number of examples of governments using commercial open data catalogs to engage with external data users who produce useful, valuable visualizations and apps – New York City, the City of Los Angeles, Chicago and San Francisco are all great examples of this dual approach.

However, open data efforts in all of those cities have benefited from robust technology and startup communities and often visionary leadership. Almost all of these cities have a long tradition of civic hacking. For cities that don’t have these assets (or have them in smaller quantities), outreach and engagement to nurture and build a data community will be a crucial factor in the long-term success of an open data program. These cities – many of them smaller and with more limited resources – may also feel the cost constraints of implementing an open data effort more acutely than larger cities.

It’s fair to say that the next wave of cities that adopt open data programs may face a very different set of challenges than the cities that have come before them.

Putting Users First

“The procurement model of government digital services generally leads to services that satisfy policy needs, not user needs.”
Government Technology Procurement Playbook, Code for America

The time feels right to rethink how cities put in place the basic infrastructure of open data.

At last year’s Code for America Summit, I gave a talk on how open data was being adopted in small to mid-sized cities in the U.S. In researching my talk, I found that while larger cities have almost all implemented some form of open data program, less than 20% of the 256 incorporated places in the U.S. with populations between 100,000 and 500,000 have an open data program.

Open data in this country is still – almost exclusively – a big city phenomenon.

Efforts to address this imbalance are underway – the What Works Cities initiative (of which the Center for Government Excellence at Johns Hopkins is a key part) is now working to bring open data and data-driven decision making to 100 mid-sized cities. More and more, small and mid-sized cities are starting to look at open data as a key driver of government innovation.

We are now at a juncture where we can not only help a new cohort of cities adopt open data, but also help ensure that these efforts embrace the principle of “build with, not for” from the ground up. If we’re going to be successful, it’s important that we question long-held beliefs – like the original 8 principles of open data – to ensure our efforts are most efficiently aligned with the outcomes we desire.

It’s worth considering whether commercial open data catalogs are the best option for helping the next wave of cities that embrace open data succeed and build a healthy data culture, both inside and outside of government.

But whatever foundation we choose to lay for the next phase of open data, we’ll need to make sure we’re putting users’ needs first.

(Note – the term “cult of catalogs” is not my own. I first heard it used by Friedrich Lindenberg, though others may have used it as well.)

Source: data.nucivic.com/dashboard

What should governments require for their open data portals?

My fundamental suggestion is that government-run open data platforms be fully open source. There are a number of technical and procurement reasons for this, which I will address in the future, but I believe strongly that if the platform you’re hosting data on doesn’t adhere to the same licensing standards you hold for your data, you’re only doing open data half right.

SideEffect.io

The future of government technology procurement

The General Services Administration and 18F recently held an open request for quotation related to a new blanket purchase agreement for a federal marketplace for agile delivery services. The transparency throughout the entire process was refreshing and provides a window into the future of procurement as well as what FedBizOpps could and should be.

U.S. Chief Data Scientist DJ Patil (Photo: O'Reilly Conferences)

It’s time for a national chief data officers council

As momentum around appointing public sector chief data officers grows, it’s time for the federal government to get ahead of the curve and create a formal chief data officers council similar to, but more inclusive, proactive and public than the already-established U.S. Chief Information Officers Council.

Photo: Code for America

Chief data officer as business developer

I occasionally get asked for my thoughts on how to increase open data consumption, and I think about this more and more, especially as it becomes an issue for those seeking validation and a return on investment.

Photo (from left to right): Andrew Hoppin (CEO, NuCivic), Scott Burns (CEO, GovDelivery), Sheldon Rampton (CTO, NuCivic)

An investment in the future of government technology

For the past 15 years, I’ve spent much of my professional life working with and in startups. It’s an environment I love. You have complete control over your destiny, and you win by blending the perfect amalgam of people, design, technology, strategy and execution all into one mission.

U.S. Chief Technology Officer Todd Park with President Obama (Photo: White House)

A new way to write to the White House

The White House has officially released the write version of the “We the People” application programming interface that now allows developers to feed data back into the petition platform via third-party applications.
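
To make the write API idea more concrete, here is a minimal sketch of how a third-party application might post a signature it has collected back to the petition platform; the endpoint, the api_key parameter and the payload fields are assumptions for illustration, not the documented We the People interface:

```python
# Minimal sketch of a third-party app feeding a signature back to a petition
# platform over a write API. The endpoint, api_key parameter and payload
# fields are hypothetical placeholders, not the documented We the People API.
import requests

API_BASE = "https://api.example.gov/v1"  # placeholder base URL
API_KEY = "YOUR_API_KEY"                 # key issued to the third-party app


def sign_petition(petition_id, first_name, last_name, email):
    """POST a signature collected in a third-party app back to the platform."""
    payload = {
        "petition_id": petition_id,
        "first_name": first_name,
        "last_name": last_name,
        "email": email,
    }
    response = requests.post(
        f"{API_BASE}/signatures",
        params={"api_key": API_KEY},
        json=payload,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```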

Photo: USDA

Help get USDA to lead with APIs when it comes to America’s parks

As part of this work, I’m always on the lookout for valuable public assets across city, state and federal government, and I try to help make sure the conversations around these assets always include application programming interfaces, so that we aren’t just building web and mobile applications in silos and limiting the potential for public access by individuals and small businesses.