Bay Area Rapid Transit Web Services Manager Timothy Moore discusses the recent upgrade of its flagship website, BART.gov, including a Drupal migration, embracing agile development, encouraging third-party developers to build off its open data and APIs, and plans for the future.
I was asked to provide some thoughts on what is next for the U.S. government's application programming interface strategy. I've put a lot of thought into it during my work and travels over the couple of months since I left Washington, D.C., and I keep coming back to one thought: strengthen what we have.
I wish I had some new technology or platform for the next wave of government APIs that would ensure success with APIs in Washington, but in reality we need to do what we've been doing, but do it at scale, and get organized and collaborative about how we do it.
Release more data sets
There are thousands of data sets available via data.gov currently, across 176 agencies and numerous categories. We need more. When any content or data is published via a government website, that data also needs to be made available via the agency's data repository and data.gov. Agencies need to understand that releasing open data sets is not something you do every once in a while to meet a mandate or deadline; it is something to do always, forever.
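Most of the catalog behind data.gov is exposed through a CKAN-style search API, which makes published data sets programmatically discoverable. As a rough sketch, assuming the CKAN `package_search` endpoint (verify the endpoint and parameters against the live catalog documentation), a query for agency data sets can be built like this:

```python
from urllib.parse import urlencode

# Assumed CKAN search endpoint for the data.gov catalog -- check the
# current catalog documentation before relying on it.
CATALOG_SEARCH = "https://catalog.data.gov/api/3/action/package_search"

def build_catalog_query(keywords, org=None, rows=20):
    """Build a catalog search URL for data sets matching the keywords."""
    params = {"q": keywords, "rows": rows}
    if org:
        # Restrict results to a single publishing agency
        # (the organization slug here is illustrative).
        params["fq"] = 'organization:"%s"' % org
    return CATALOG_SEARCH + "?" + urlencode(params)

url = build_catalog_query("transit", org="gsa-gov")
# The JSON response would then be fetched with urllib.request.urlopen(url).
```

The same pattern works against any agency catalog that runs CKAN, which is part of why a consistent publishing platform matters.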
Refine existing data sets
There is a lot of data available currently. However, much of it is in varying formats and inconsistent data models, and isn't always immediately available for use in spreadsheets, applications and analysis. There is a great deal of work to be done cleaning, normalizing and refining existing data, as well as deploying APIs around open data, which would increase adoption and the chances it will be put to use.
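The refining work involved is mundane but mechanical. A minimal sketch of the kind of normalization pass a data set often needs — the column names and date formats below are invented for illustration:

```python
import csv
import io
from datetime import datetime

# A raw export with inconsistent headers and a U.S.-style date column
# (both invented for illustration).
RAW = """Agency Name,Report Date,Amount ($)
GSA,03/15/2013,1200
DOT,04/01/2013,850
"""

def normalize(reader):
    """Yield rows with snake_case keys and ISO 8601 dates."""
    for row in reader:
        clean = {}
        for key, value in row.items():
            # "Report Date" -> "report_date", "Amount ($)" -> "amount"
            name = key.lower().replace(" ", "_")
            name = "".join(c for c in name if c.isalnum() or c == "_")
            clean[name.rstrip("_")] = value
        # Reshape the assumed MM/DD/YYYY format into ISO 8601.
        clean["report_date"] = (
            datetime.strptime(clean["report_date"], "%m/%d/%Y")
            .date().isoformat()
        )
        yield clean

rows = list(normalize(csv.DictReader(io.StringIO(RAW))))
```

Multiply this by thousands of data sets and the scale of the cleanup effort becomes clear.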
Refine existing APIs
Like open data, there are many existing APIs across the federal government, and these APIs could use a lot of work to make them more usable by developers. With a little elbow grease, existing APIs could be standardized by generating common API definitions like Swagger, API Blueprint and RAML, which would not only help quantify all APIs, but also generate interactive documentation and code samples, and provide valuable discovery tools for understanding where interfaces are and what they offer. The mission up until now for agencies has been to deploy APIs, and while this remains true, the need to evolve and refine APIs will go a long way towards building the valuable case studies needed to get to the next level.
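Even a hand-written definition for one existing endpoint gives tooling something machine readable to work with. A sketch of a minimal Swagger 2.0 description — the host, path and parameters below are hypothetical, not taken from any real agency API:

```python
import json

# A minimal Swagger 2.0 description of a hypothetical agency endpoint.
# The host, path and parameters are invented for illustration.
definition = {
    "swagger": "2.0",
    "info": {"title": "Example Agency API", "version": "1.0.0"},
    "host": "api.example-agency.gov",
    "paths": {
        "/facilities": {
            "get": {
                "summary": "List agency facilities",
                "parameters": [
                    {"name": "state", "in": "query", "type": "string"}
                ],
                "responses": {"200": {"description": "A list of facilities"}},
            }
        }
    },
}

# Serialized, this one file can drive interactive documentation,
# SDK generation and discovery tools.
spec_json = json.dumps(definition, indent=2)
```

API Blueprint and RAML carry equivalent information in different notations; the point is that the description exists in a form machines can consume.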
Robust /developer areas
There are over 50 agencies that have successfully launched a /developer area to support open data and API efforts. Much like the open data and the APIs themselves, they represent a mishmash of approaches, and provide varied amounts of resources and necessary support materials. HowTo.gov already provides some great information on how to evolve an agency's developer area; we just need some serious attention spent on helping each agency make it so. It doesn't matter how valuable the open data or APIs are: if they are published without proper documentation, support and communication resources, they won't be successful. Robust developer areas are essential to federal agencies finding success in their API initiatives.
Hire evangelists

Every successful API initiative in the private sector, from Amazon to Twilio, has employed evangelists to spread the word and engage developers, helping them find success in putting API resources to use. Each federal agency needs its own evangelist to work with internal stakeholders, making sure open data is published regularly, new APIs are deployed and existing resources are kept operational and up to date. Evangelists should have counterparts at the OSTP / OMB / GSA level providing feedback and guidance, as well as regular engagement with evangelists in the private sector. Evangelism is the glue that will hold things together across each agency, as well as provide the critical outreach to the private sector to increase adoption of government open data and APIs.
Build public-private sector partnerships
Opening up data and APIs by the federal government is about sharing the load with the private sector and the public at large. Open data and APIs represent the resources the private sector will need to build applications and sites, and to fuel industry growth and job creation. A new type of public-private sector partnership needs to be defined, allowing companies and non-profit groups to access and use government services and resources in a self-service, scalable way. Companies should be able to build businesses around government Internet services, much like the ecosystem that has grown from the IRS e-File system, with applications like TurboTax that reach millions of U.S. citizens and allow corporations to help government share the load while also generating necessary revenue.
Establish meaningful case studies
When it comes to open data and APIs, nothing gets people on board more than solid examples of the impact that open data, APIs and the applications built on them have made in government. Open government proponents use weather data and GPS as solid examples of open data and technology impacting not just government, but also the private sector. We need to fold the IRS e-File ecosystem into this lineage, but also work towards establishing numerous other case studies we can showcase and tell stories about why open data and APIs are important, in ways that truly matter to everyone, not just tech folks.
Educate and tell stories within government
In order to take open data and APIs to the next level in government, there needs to be an organized and massive effort to educate people within government about the opportunities around open data and APIs, as well as the pitfalls.
Regular efforts to educate people within government about the technology, business and politics of APIs need to be scaled, weaving in relevant stories and case studies as they emerge around open data and APIs. Without regular, consistent education efforts and sharing of success stories across agencies, open data and APIs will never become part of our federal government DNA.
Inspire and tell stories outside government
As within government, the stories around government open data and APIs need to be told outside the Washington echo chamber, educating citizens and companies about the availability of open data and APIs, and inspiring them to take action by sharing success stories of how others have used open data and APIs in the development of applications.
Awareness of popular platforms like Amazon and Twitter spread through word of mouth amongst developer and power-user communities; the same path needs to be taken with government data and API resources.
The next 2-3 years of the API strategy for the U.S. government will be about good old-fashioned hard work, collaboration and storytelling. We have blueprints for what agencies should be doing when it comes to opening up data, deploying APIs and enticing the private sector to innovate around government data; we just need to repeat and scale until we reach the next level.
How do we know when we've reached the next level? When the potential of APIs is understood across all agencies, and the public instinctively knows that they can go to any government /developer domain and find the resources they need, whether they are an individual or a company looking to develop a business around government services.
The only way we will get there is by building a solid group of strong case studies of success in making change in government using open data and APIs. Think of the IRS e-File system, how many citizens this ecosystem reaches, and the value generated through commercial partnerships with tax professionals. We need 10-25 similar stories of how APIs have impacted people's lives, strengthened the economy and made government more efficient before we can consider ourselves at the next level.
Even with the housekeeping we have, what should be next for Data.gov?
Data.gov has continued to evolve, adding data sets, agencies and features. With recent, high-profile stumbles like Healthcare.gov, it can be easy to fall prey to the historical stereotype that government can't deliver tech very well. While this may apply in some cases, I think we can get behind the movement that is occurring at Data.gov, with 176 agencies working to add 54,723 data sets in the last 12 months.
I feel pretty strongly that before we look towards the future of what the roadmap looks like for Data.gov, we need to spend a great deal of time refining and strengthening what we currently have available at Data.gov and across the numerous government agency developer areas. Even with these beliefs, I can’t help but think about what is needed for the next couple years of Data.gov.
Maybe I'm biased, but I think the next step for Data.gov is to set its sights purely on the API. How do we continue evolving Data.gov, and prepare to not just participate, but lead in the growing API economy?
Management tools for agencies
We need to invest in management tools for agencies, whether commercial offerings or an umbrella platform. Agencies need to be able to focus on the best quality data sets and API designs, and not have to worry about the more mundane necessities of API management like access, security, documentation and portal management. Agencies should have consistent analytics, helping them understand how their resources are being accessed and put to use. If OSTP, OMB, GSA and the public expect consistent results when it comes to open data and APIs from agencies, we need to make sure they have the right management tools.
Endpoint design tools for data sets
Agencies should be able to go from data set to API without much additional work. Tools should be made available for easily mounting published datasets, then allowing non-developers to design API endpoints for easily querying, filtering, accessing and transforming them. While data download will still be the preferred path for many developers, making high value datasets available via APIs will increase the number of people who access them, including those who may not have the time to deal with the overhead of downloads.
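The core of such a tool is unglamorous: parameter-driven filtering over a published data set. A rough sketch of the query layer such endpoint-design tooling would generate, using an invented parks data set:

```python
# A published data set loaded into memory (rows invented for illustration).
PARKS = [
    {"name": "Mission Dolores Park", "district": "8", "acres": 15.9},
    {"name": "Golden Gate Park", "district": "1", "acres": 1017.0},
    {"name": "Alamo Square", "district": "5", "acres": 12.7},
]

def query(rows, filters=None, fields=None, limit=None):
    """Apply endpoint-style query parameters to a data set:
    filters -- exact-match column filters, e.g. {"district": "8"}
    fields  -- columns to include in the response
    limit   -- maximum number of rows returned
    """
    out = [r for r in rows
           if all(r.get(k) == v for k, v in (filters or {}).items())]
    if fields:
        out = [{k: r[k] for k in fields} for r in out]
    return out[:limit] if limit else out

# Equivalent of GET /parks?district=8&fields=name
result = query(PARKS, filters={"district": "8"}, fields=["name"])
```

A non-developer would configure the filters and fields through a form rather than code, but the generated endpoint reduces to exactly this kind of function.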
Common portal building blocks
When you look through each of the 50+ federal agency /developer areas, you see 50+ different approaches to delivering the portals that developers will depend on to learn about, and integrate with, each agency's APIs. A common set of building blocks is needed to help agencies standardize how they deliver their developer portals. Each agency's approach might make sense internally, but as a consumer, when you try to work across agencies it can be a confusing maze of API interactions.
Standard developer registration
As a developer, I currently need to establish a separate relationship with each federal agency. This quickly becomes a barrier to entry, one that will run off even the most seasoned developers. We want to incentivize developers to use as many federal APIs as possible, and providing them with a single point of registration and a common credential that works across agencies will stimulate integration and adoption.
Standard application keys
To accompany standard developer registration, a standard approach to user and application keys is needed across federal agencies. As a developer, I should be able to create a single application definition and receive API keys that will work across agencies. The work required to manage multiple API keys for a single application will prevent developers from adopting multiple federal agency APIs. Single registration and application keys will reduce the barriers to entry for the average developer looking to build on top of federal API resources.
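From the consumer side, the goal is that one credential travels with every request, whichever agency it targets. A sketch of what that looks like — the hostnames are invented, and the `X-Api-Key` header is an assumption to check against whatever shared-key program an agency participates in:

```python
from urllib.request import Request

# One key, issued once, reused across agencies behind a shared scheme.
# Hostnames and the header name below are illustrative assumptions.
API_KEY = "DEMO_KEY"

def agency_request(host, path, key=API_KEY):
    """Build a request carrying the shared credential."""
    return Request(
        "https://%s%s" % (host, path),
        headers={"X-Api-Key": key},
    )

reqs = [
    agency_request("api.energy.example.gov", "/v1/outages"),
    agency_request("api.labor.example.gov", "/v1/statistics"),
]
# Each request would then be sent with urllib.request.urlopen(req).
```

Contrast this with today's reality, where each agency issues its own key with its own registration flow and its own way of attaching it to requests.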
Developer tooling and analytics
When integrating with private sector APIs, developers enjoy a wide range of tools and analytics that assist them in discovering, integrating, managing and monitoring their applications integration with APIs. This is something that is very rare in integration with federal government APIs. Standard tooling and analytics for developers needs to become part of the standard operating procedures for federal agency /developer initiatives, helping developers be successful in all aspects of their usage of government open data and APIs.
Swagger, API Blueprint, RAML
All APIs produced in government should be described using one of the common API definition formats that have emerged, like Swagger, API Blueprint and RAML. These all provide a machine-readable description of an API and its interface that can be used for discovery, interactive documentation, code libraries, SDKs and many other purposes. Many private sector companies are doing this, and the federal government should follow their lead.
Discoverability, portable interfaces and machine readable by default
As with open data, APIs need to be discoverable, portable and machine readable by default. Describing APIs in Swagger, API Blueprint or RAML will do this, allowing APIs to be presented, distributed and reused in new ways. Each agency can publish its own APIs, while aggregators take the machine-readable definitions from each and publish them in a single location. This approach allows for meaningful interactions: for example, a single budget API site could provide access to every federal agency's budget without visitors having to go to each /developer area. There are many more examples like this that would increase API usage and extend the value of government APIs.
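Once definitions are machine readable, aggregation is a merge operation. A toy sketch of building the single budget API index described above, assuming each agency publishes a Swagger-style file (the agency names, hosts and paths are invented):

```python
# Machine-readable definitions published by two agencies
# (names, hosts and paths invented for illustration).
AGENCY_SPECS = {
    "commerce": {"host": "api.commerce.example.gov",
                 "paths": {"/budget": {"get": {"summary": "Agency budget"}}}},
    "interior": {"host": "api.interior.example.gov",
                 "paths": {"/budget": {"get": {"summary": "Agency budget"}}}},
}

def aggregate(specs, path):
    """Build a cross-agency catalog for a single resource type --
    e.g. one budget API index spanning every /developer area."""
    catalog = {}
    for agency, spec in specs.items():
        if path in spec.get("paths", {}):
            catalog[agency] = "https://%s%s" % (spec["host"], path)
    return catalog

budget_apis = aggregate(AGENCY_SPECS, "/budget")
```

The aggregator never needs to screen-scrape a /developer area; the definitions do the work.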
Mashape, API Hub
A new breed of API directories has emerged. API portals like Mashape and API Hub don't just provide a directory of APIs; they simplify API management for providers and integration for consumers. Federal agencies need to make their APIs friendly to these hubs, maintaining active profiles on each platform, keeping each API listed within the directories and actively engaging consumers via the platforms. Federal agencies shouldn't depend on developers coming to their /developer areas to engage with their APIs; agencies need to reach out to where developers are already actively using APIs.
Consistent API interface definition models
Within the federal government, each API is born within its own agency's silo. Very few interface designs and data models are shared across agencies, resulting in APIs that may do the same thing, but potentially do it in radically different ways. Common APIs such as budget or facility directories should use a common interface design and underlying data model. Agencies need to share interface designs, and work together to make sure the best patterns across the federal government are used.
Webhooks

In the federal government, APIs are often a one-way street that allows developers to come and pull information. To increase the value of data and other API-driven resources, and to help reduce the load on agency servers, APIs need to push data out to consumers, reducing polling and making API integration much more real-time. Technologies such as webhooks, which allow an API consumer to provide a web URL to which agencies can push newly published data, changes and other real-time events, are being widely used to make APIs much more of a two-way street.
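On the agency side, the mechanics are simple: store the callback URLs consumers register, then deliver each new event to them. A minimal sketch, with the actual HTTP delivery left as a comment and the URLs invented for illustration:

```python
import json

class WebhookRegistry:
    """Track consumer callback URLs and prepare event deliveries."""

    def __init__(self):
        self.subscribers = []

    def register(self, callback_url):
        # A consumer provides a URL to receive pushed events.
        self.subscribers.append(callback_url)

    def prepare_deliveries(self, event):
        """Build one (url, payload) delivery per subscriber; in production
        each payload would be POSTed, e.g. with urllib.request."""
        body = json.dumps(event)
        return [(url, body) for url in self.subscribers]

registry = WebhookRegistry()
registry.register("https://example-consumer.com/hooks/data-updated")
deliveries = registry.prepare_deliveries(
    {"event": "dataset.updated", "dataset": "spending-2013"}
)
```

Every event pushed this way is a poll a consumer no longer has to make against the agency's servers.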
Hypermedia

As the world of web APIs evolves, new approaches are emerging for delivering the next generation of APIs, and hypermedia is one of these trends. Hypermedia brings a wealth of value, but most importantly it provides a common framework for APIs to communicate, and provides essential business logic and direction for developers, helping them use APIs in line with API providers' goals. Hypermedia has the potential to not just make government assets and resources available, but to ensure they are used in the best interest of each agency. Hypermedia is still gaining traction in the private sector, but we are also seeing a handful of government groups take notice. Hypermedia holds a lot of potential for federal agencies, and the groundwork and education around this trend needs to begin.
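In a hypermedia API, each response carries links telling the client what it can do next, which is how the provider steers usage. A sketch using HAL-style `_links` (the resource shapes and paths are invented):

```python
# A HAL-style response in which the resource embeds its own next steps
# (resource shape and paths invented for illustration).
FACILITY = {
    "name": "Field Office 12",
    "_links": {
        "self": {"href": "/facilities/12"},
        "inspections": {"href": "/facilities/12/inspections"},
    },
}

def follow(resource, rel):
    """Navigate by link relation instead of a hard-coded URL, letting
    the provider change or withhold transitions without breaking clients."""
    link = resource["_links"].get(rel)
    if link is None:
        raise KeyError("no '%s' transition offered by the provider" % rel)
    return link["href"]

next_url = follow(FACILITY, "inspections")
```

Because the client asks for a relation rather than a URL, the agency keeps control over which transitions are offered, and to whom.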
Evangelism

The first thing you notice when you engage with a government API is that nobody is home. There is nobody to help you understand how it works or overcome obstacles when integrating. There is no face to the blog posts, the tweets or the forum replies. Federal government APIs have no personality. Evangelists are desperately needed to bring this human element to federal government APIs. All successful private sector APIs have an evangelist, or an army of evangelists, spreading the word, supporting developers and making things work. We need open data and API evangelists at every federal agency, letting us know someone is home.
Read / Write
I know this is a scary one for government, but as I said above in the webhooks section, APIs need to be a two-way street. There are proven ways to make APIs writeable without jeopardizing the integrity of API data. Allow for trusted access; let developers prove themselves. There is a lot of expertise, "citizen cycles," and value available in the developer ecosystem. When a private sector company uses federal government data and improves on it, the government and the rest of the ecosystem should be able to benefit as well. The federal government needs to allow for both read and write on APIs; this will be the critical next step that makes government APIs a success.
These are just 14 of my recommendations for the next steps in the API strategy for the federal government. As I said earlier, none of this should be done without first strengthening what we all have already done in the federal government around open data and APIs. However, even though we need to play catch up on what’s already there, we can’t stop looking towards the future and understand what needs to be next.
None of these recommendations are bleeding edge or technology just for technology sake. This is about standardizing how APIs are designed, deployed and managed across the federal government, emulating what is already being proven to work in the private sector. If the federal government wants to add to the OpenData500, and establish those meaningful stories needed to deliver on the promise of open data and APIs, this is what’s needed.
With the right leadership, education and evangelism, open data and APIs can become part of the DNA of our federal government. We have to put aside our purely techno-solutionist view and realize this is seriously hard work, with many pitfalls and challenges, and that in reality it won't happen overnight.
However, if we dedicate the resources needed, we can not just change how government works, making it machine readable by default; we can forever alter how the private and public sectors work together.
Publishing government information is about much more than simply throwing 0’s and 1’s over the firewall. It’s about building ecosystems and communities. It’s about solving shared challenges. It’s about consumption — after all, that’s the American way.
Go to any agency website, and chances are you’ll find at least one dataset sitting idly by because the barrier to consume it is too damn high. It doesn’t matter how hard stakeholders had to fight to get the data out the door or how valuable the dataset is, it’s never going to become the next GPS or weather data.
We can build cruft to get around the hurdles, but so long as the workflow is optimized for the publisher, not the consumer, we’re always going to be setting ourselves up to give the naysayers even more fodder to point at as an example for why we shouldn’t waste agency resources on open government initiatives.
An interesting micro-site – it seems very much more aimed at people who know BLS data already, and much less aimed at coders. The FAQ answers the question “What is an API?”, but doesn’t even have a list of the IDs required in order to get any data whatsoever out of that API.
This struck me for two reasons. First, it shows that the data publishers still believe that key stakeholders within government do not yet understand the concept of, or value in, producing an API — a chilling thought considering the critical role open government data can, should, and is playing in shaping our economy and reimagining how citizens interact with their government.
Second, the fact that the documentation is geared not towards developers, but towards internal stakeholders, shows a lack of consideration for the data consumer's user experience. In this case, listing IDs is just scratching the surface of what a developer might expect from the private sector. Is the API responsive? Does it return data in a usable format? Are related queries properly linked? The list goes on.
All experiences — government experience especially — should be optimized for the consumer, not for the publisher, regardless of format or internal politics.
The General Services Administration, much to their credit, has begun a “First Fridays” usability testing program, which is the tangible result of a much larger and much needed effort to educate publishers of government information as to the importance of and necessity for a purposeful commitment to user experience.
But as we increasingly push for government information to be published in machine-readable formats, what becomes of that focus on the consumer’s user experience? Is the user experience format agnostic? Here, the immediate consumer of the information is a civic hacker or other developer, and the interface is an API rather than an agency website, but the intention and importance is just the same.
The next time you publish a data source, whether in government or out, just as you should with any website, ask yourself one simple question: How can you optimize the experience for the consumer?
Today, the San Francisco Board of Supervisors will take its final vote to approve my update to our city’s groundbreaking open data law. My open data ordinance, in its simplest terms, standardizes and sets timelines for the release of appropriate city government data.
I know that my update to our open data ordinance will help lead to further innovation and technologically driven services, solutions, platforms, and applications for civic issues and problems. Technology is not going to be the cure-all for every problem government faces, but it can certainly help improve our residents' quality of life in certain instances, while continuing to boost our local economy at the same time.
All across the nation, cities, counties, states, and even the federal government have and continue to take steps towards making appropriate government data available because open data has proven to spark innovation, drive greater efficiency and cost-savings in government, and fuel further economic development – as evidenced in the recent and steady growth in the civic startup sector.
My law modifies and standardizes the city's open data standards, ensuring data is released in machine-readable formats; sets timelines for city departments for the release of appropriate city data sets; creates a mechanism for city staff and agencies to interact with the public and entrepreneur community on prioritizing the release of city data sets; and makes San Francisco the first city in the nation to be tasked with developing a strategy to give residents access to their own government-held data.
There are examples here in San Francisco and nationally that show open data in practice, and how it can help accomplish all of the positive benefits mentioned above: Yelp's recent partnership with the city to post public health scores for city restaurants on its website, helping residents make healthier choices, or the acclaimed San Francisco Recreation and Parks App, which helps residents and visitors find park and recreation locations, make picnic table reservations, and purchase tickets for concerts, art exhibits, and other events straight from a mobile device.
The standardization of the city's technical open data standards, which ensures that data will be available in machine-readable formats that are non-proprietary, is key to unlocking the true potential and value of the appropriate data sets that the City holds. A recent report from McKinsey & Company states that open data can help unlock $3 trillion to $5 trillion in economic value worldwide annually across seven distinct sectors. A new economy with this much potential is something that should not be ignored.
My ordinance also creates tighter deadlines for city departments to follow in the release and update of appropriate government data. Tighter deadlines regarding the release of open data sets creates certainty that will be extremely beneficial to the public and entrepreneur community. With more certainty, entrepreneurs and the public will be able to better plan around their individual ideas and implementations of our city’s open data sets that will be the base of the next product, service, or application that helps to benefit all San Franciscans.
The inclusion of timelines regarding the release of appropriate government data sets was not an arbitrary decision. It was a decision based in practice and from testimony we heard from the public and entrepreneur community. Yo Yoshida, CEO and Co-founder of Appallicious, was even quoted as saying “We look forward to putting some teeth into the open-data movement through this legislation. We do have some snafus with some departments not being able to release it quick enough to give the developers the ability to create products from this and create industry and jobs and move the movement forward.”
My ordinance also creates a better mechanism for the public and entrepreneur community to interact with the city staff and departments who will be responsible for cataloging, updating, and uploading appropriate government data sets. By mandating that each data set carry the contact information (phone and email) of the staff member who uploaded it, interactions between the two parties will be sure to spark creativity and discussion regarding potential high-value data sets, so that the next amazing products and services will be just on the horizon.
Lastly, my ordinance would make San Francisco the first city in the nation to develop a strategy for giving residents access to their own government-held data. The addition of this requirement reflects a growing national movement that is calling on all levels of government to give residents access to their own data for their own use. If it is yours, we should give it back to you — simple as that.
San Francisco Magazine called my ordinance the “Super-Boring City Law That Could Be Huge.” I guess open data may be boring to some, but I would tell you it depends on who you ask. It is definitely not boring when the transformative potential of open data is known to increase government efficiency and accountability, fuel further economic development, and create an atmosphere that encourages innovation, discovery and growth.
The White House will soon open a limited beta test to developers on a new We the People Write API that allows third-party applications to submit information to official petitions.
“One of the things we’ve heard from the beginning is a strong desire from our users to be able to submit signatures and petitions from other sites — and still receive an official response. Up to this point, we haven’t had a way to accept signatures submitted from other sites, but that is about to change,” writes White House Associate Director of Online Engagement for the Office of Digital Strategy Ezra Mechaber.
According to the White House, more than 10 million users have signed nearly 300,000 petitions.
We the People was built in Drupal and the source code is available on GitHub.
The Read API was opened earlier this year (sample projects here).
While We the People is fairly intuitive and easy to use, there’s huge potential for great designers and developers to essentially build a truly innovative and engaging platform.
Finally, a bike-sharing program is coming to San Francisco! What Europeans figured out years ago will be a reality in the Bay Area by this August. The plan is to put 700 bikes at 70 different stations in the City and throughout the Bay Area, where residents can quickly hop on a bicycle at one station and drop it off at another. Appallicious is very excited about this new program, not only because we're looking forward to hopping on these new bikes ourselves, but also because the utilization of open data will be key to the program's success. That's why I'll be joining sf.citi and the San Francisco Bike Coalition at Yammer on Wednesday for a conversation about the launch of the new program and how open data and the tech community at large fit in.
Once the bike share program starts, it's going to be extremely important to know where the heaviest demand for bikes is at certain times of day and on certain days of the week. It's safe to assume that on a Monday morning, you're going to need more bikes in residential areas and fewer in the Financial District, since commuters will be biking to work. But with any program like this, unexpected variables are bound to come up, and this is where open data will come in.
The bikes and bike stations will most certainly have a GPS component, so the city will be able to track bikes in use and the number that have been checked in or out at each station. Companies like Appallicious will then be able to synthesize this data and not only help the City of San Francisco figure out where and when the heaviest demand for bikes is, but also inform citizens through mobile applications of how many bikes are available at a specific station at any given time. Just like the features on the SF Rec and Park App we developed allow you to find parks, playgrounds, dog parks, picnic tables, and more, we could also bring bicycle availability right into the app! It would be just like checking the availability of a ZipCar at a nearby parking garage.
Once this raw data is available to Appallicious, there are quite a few steps before it can be packaged and presented to bike riders in a way that will help them figure out bike availability, or to city leaders who need to know which stations need more bikes and which need fewer. The idea of the public sector providing the private sector with information like this is nothing new. In 1983, President Ronald Reagan issued a directive guaranteeing that GPS signals would be available at no charge to the world once such a system became operational, in the wake of a Korean Airlines flight that was shot down after accidentally flying into Russian airspace.
The Obama administration has continued to promote the idea of "sustainable innovation" that President Reagan helped start. The GPS directive from Reagan has created a $250 billion a year navigation industry. Think about GPS companies like Garmin or applications like Google Maps that rely on GPS: without open GPS, these companies would never have been created, and we'd still have stacks of paper maps from AAA stuffed in our glove compartments!
With this renewed push for open data, through President Obama’s Open Government Initiative, there is a chance for the United States to build a new, thriving and successful industry through information released to the public by city governments. As more and more information is released by cities all over the country and the world, companies are going to be able to step up and provide new technology that allow citizens to access and benefit from this information.
In San Francisco, open data advocates like Mayor Ed Lee and Supervisor David Chiu have just passed new open data legislation that will allow companies like Appallicious to create apps and change the way in which cities and governments are able to operate for years to come.
The possibilities are endless, and I am extremely excited to see how innovators and entrepreneurs find revolutionary ways of using this data to make bike sharing easier in San Francisco. Wouldn’t it be cool to integrate the bike-sharing program into the SF Rec and Park App? You could reserve a bike with your app and then take it for a tour of Golden Gate Park or see all the incredible art available throughout the city using the app. The open data movement has the potential to create a thriving, sustainable industry that can create millions of jobs, and a symbiotic relationship between the private and public sectors that could make both more effective, efficient, and profitable.
Today, open data and its power to transform a city and a nation by engaging tech savvy citizens will be on display at San Francisco City Hall. And just as importantly, companies that have been successful because of forward thinking open data policies will testify to our elected leaders about its importance. As a founder of one of these sustainable companies, Appallicious, I am proud to be speaking on behalf of the open data movement.
After hearing testimony from myself and others in the open data industry, San Francisco’s Board of Supervisors will review and vote on new legislation that will strengthen the city’s open data initiatives and allow San Francisco to appoint a Chief Data Officer (CDO) to manage the City’s open data efforts.
More than three years ago the City of San Francisco launched DataSF.org, the city’s one-stop shop for government data. San Francisco was the first city to follow the federal government’s open government effort, Data.gov, when it launched DataSF.org. Since then, more than 70 apps have been developed for city residents by civic innovators and companies, and countless other cities and towns have been inspired to follow San Francisco’s lead and enact similar policies, giving residents greater access to government data.
San Francisco’s open data efforts have helped spur the creation of apps that make it easier for residents to receive government services and actively participate in city policy, and that have saved the city a substantial amount of money. Behind these open data apps are new, civically minded companies, and a new industry that is starting to emerge in the land of Twitter, Facebook and YouTube. Companies like Appallicious, 100Plus, Routesy and Zonability, which would not have been possible just a couple of years ago, are popping up in cities all over the country, supported by amazing organizations like Code for America.
Back in October 2012, I was proud to join San Francisco Mayor Ed Lee, Supervisor David Chiu and San Francisco Rec & Park GM Phil Ginsburg as they introduced the revised open data legislation. These Gov 2.0 leaders used the event to highlight companies like Appallicious that are using open data to create apps and re-imagine our city. They launched the San Francisco Rec & Park app that Appallicious created using over 1,000 datasets for parks, playgrounds, and dog parks, along with transportation datasets so residents can get directions to all of the City’s attractions. All of these datasets are available on DataSF.org.
The SF Rec & Park app makes it easy for anybody to find city parks, playgrounds, museums, picnic tables, gardens, restrooms, news and events and more in the palm of your hand. Information is displayed with descriptions and pictures on a GPS enabled mobile map.
The SF Rec & Park app, which was recently named by Mashable as one of 7 open data apps every city should have, will also soon make it easier for residents to reserve a soccer field or picnic table, or apply for a permit when they need to host an event in a public park. All of this will be available through a mobile device or on the web, saving taxpayers and government workers time and money. No longer will you have to wait on hold or send multiple emails to confirm a picnic table reservation for a birthday party.
Open data apps like this are only the beginning of something much bigger that is being made possible by open data policies and government leaders who get their importance.
On his first day as President, Obama signed the memorandum on Transparency and Open Government to spur private-sector innovation at the federal level. This move inspired progressive cities like San Francisco, Chicago, New York and Philadelphia to create their own open data legislation at the local level. This has led to an emerging new industry, unparalleled innovation, job creation, revenue, and collaboration between government and the private sector not seen since President Reagan’s decision to open up the Global Positioning System in the 1980s.
Organizations like Code for America and Citizenville, as well as private companies like Appallicious and the SF Rec & Park app are living, breathing examples of the new industry first created by President Reagan in the 1980s and rejuvenated by President Obama.
Stay tuned: a whole new industry is starting to take form, powered by open data at the local level, creating jobs, revenue, and never-before-seen collaboration between citizens and government.
If your city is new to the open data movement, please ask your elected leaders to take the Citizenville Challenge and bring open data policies and innovation to your community. And take a second to support the open data movement by applauding Appallicious’ submission to the Knight Foundation News Challenge and others that are transforming the way government and citizens engage and communicate.
Corrections: “Open Government Act” was changed to “memorandum on Transparency and Open Government.” Reference to “Open GPS” was changed to “Global Positioning System.”
For those of you interested in starting or joining the civic technology movement where you live, watch Code for America Brigade program director Kevin Curry discuss how designers and developers are doing just this everywhere across the United States.
Every day, tech-minded citizens across the country are doing good by their communities, literally geeking out about how they can help re-define the relationship government has with its citizens, using technology as a democratic tool to collaboratively empower both.
So much is happening in the civic technology community – website redesigns, new websites, open data initiatives, apps, camps, developer contests, hackathons and more – it’s hard to get a perspective on or truly appreciate the collective work of these dot-do-gooders both inside and outside government.
That’s why we created the 2011 GovFresh Awards.
It’s time to recognize and honor all that’s been accomplished this year.
It’s time to say thank you.
Here are the categories. Start entering and start voting.
- City of the Year
- Public Servant of the Year
- Citizen of the Year
- App of the Year
- Best Government/Citizen Collaboration
- Best Use of Open Source
- Best Open and Participatory Budgeting Initiative
- Best Open Government Policy
- Best Open Data Platform
- Best Civic Hackathon
- Best Civic Start-up
- Best Use of Social Media
- Best Use of Social Media for Emergency Management
- Best Transit App
- Best 311 App
- Best Emergency Management App
- Best Social Services App
In a new blog post, Gartner’s Andrea Di Maio asks if it’s time to pull the plug on government Websites. Di Maio cites one Japanese city’s decision to migrate its online presence to Facebook as an example of an outside-the-box approach to government Web operations.
One comment from ‘Carolyn’ makes a strong case why the Facebook approach is short-sighted:
Believe it or not, some people trust Facebook even less than they trust government. Why make civic participation dependent on surrendering portions of your privacy to a corporation that will monetize it? I don’t want a crowdsourced opinion on when my garbage will be collected. I don’t want to have to sift through the mass of information out there on the web to find the proper permit application, or tax form for my business. And I don’t want corporate interests controlling my access to my government.
Related to this, one of my favorite quotes about Facebook comes from blogger Jason Kottke (2007):
As it happens, we already have a platform on which anyone can communicate and collaborate with anyone else, individuals and companies can develop applications which can interoperate with one another through open and freely available tools, protocols, and interfaces. It’s called the internet and it’s more compelling than AOL was in 1994 and Facebook in 2007. Eventually, someone will come along and turn Facebook inside-out, so that instead of custom applications running on a platform in a walled garden, applications run on the internet, out in the open, and people can tie their social network into it if they want, with privacy controls, access levels, and alter-egos galore.
Di Maio’s general point is that when government builds Websites they “almost inevitably fail to model access the way people do expect or need it.” But just because this has been the case to date, doesn’t mean public sector IT should transition its entire online operations to the trendiest social network.
It’s time for government to radically reconsider its online service offering to citizens with a more sustainable approach.
Centralizing government Websites into one portal is something I’ve advocated for years (see here and here). In fact, the White House is exploring this and other options around improving the .gov ecosystem (they addressed my question specifically on this subject at a White House ‘Open for Questions’ live chat here).
If government really wants to focus on IT efficiency and cost-savings, CIOs and CTOs need to construct a more focused, organic strategy that includes the following:
- Centralize your Web ecosystem into a single CMS and uniform brand/theme.
- Develop using open source software.
- Create an open data portal.
- Leverage APIs.
- Migrate as much to the cloud as possible.
- Create topic-based content and ensure distribution via RSS, email and all social media means available.
- Develop a mobile strategy based on accessing the data above and empowering external, entrepreneurial ventures to compete in a free market to provide the best services (i.e., build fewer apps in-house).
The above list is by no means comprehensive, and perhaps one day I’ll have more time to elaborate. It is, however, a general, sustainable strategy for addressing public sector budgeting constraints given the current economic conditions. Some or all of this could be done in-house or outsourced. If the latter, it needs to be highly extensible and portable.
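To make the “open data portal” and “leverage APIs” items concrete, here is a minimal sketch of how an outside developer might consume a city dataset published as JSON. The endpoint shape, field names and facility data below are hypothetical; real portals (DataSF, for instance, is built on a platform that exposes similar JSON views of its datasets) will differ in the details.

```python
# Sketch: a third-party developer filtering a city's open dataset.
# The payload below stands in for what a portal's JSON endpoint might
# return for a hypothetical "park facilities" dataset.

import json

sample_response = json.dumps([
    {"facility": "Dolores Park", "type": "park"},
    {"facility": "Koret Playground", "type": "playground"},
    {"facility": "Golden Gate Park", "type": "park"},
])

def facilities_by_type(raw_json, kind):
    """Filter a facilities dataset down to one facility type."""
    return [row["facility"] for row in json.loads(raw_json)
            if row["type"] == kind]

print(facilities_by_type(sample_response, "park"))
# prints ['Dolores Park', 'Golden Gate Park']
```

The same filtered list could feed an RSS feed, a map layer or a mobile app, which is the division of labor the list above argues for: government publishes the data once, and the market competes to build services on top of it.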
I’m all for radical re-working and thinking different, but don’t let fiscal uncertainty or short-term instability drive irrational IT decision-making, especially when it comes to public services and citizen privacy.