Author: Kin Lane

Help get USDA to lead with APIs when it comes to America’s parks

Photo: USDA

I need your help with something.

I’m in the business of helping everyone from startups up to our federal government identify valuable assets and make them more accessible for use in websites and mobile applications. As part of this work I’m always on the lookout for valuable public assets across city, state, and federal government, and I help make sure the conversations around these assets always include application programming interfaces (APIs), so that we aren’t just building web and mobile applications in silos and limiting the potential for public access by individuals and small businesses.

Over the last couple of years, I have cultivated a small group of API evangelists in various sectors who help keep me informed of important government projects. One of my partners in crime, Alyssa Ravasio (@alyraz), brought to my attention a pre-solicitation for a request for proposal (RFP) from the U.S. Department of Agriculture for “Recreation One Stop Support Services.” You can see this service in action at Recreation.gov. In short, this is an ongoing contract that our federal government engages in with a private sector contractor to provide access to the campgrounds and campsites at our federal parks.

I would put our national parks and campgrounds at the top of the treasured assets this country possesses, and they are something we need to make sure continues to be as accessible to individuals and small businesses as possible. I want to take the time to respond to the pre-solicitation from USDA, in hopes of helping them craft a better RFP before this goes out to bid.

I also want to encourage others who are passionate about camping and our national parks to submit their own feedback to USDA’s solicitation number AG3187S151000–Recreation One Stop Support Services.

The current incarnation of the one stop service for recreation project has an API. Well, it has web services as part of the existing Recreation Information Database (RIDB), but you will also find mentions of APIs in the proposed RFP, under section 5.3.9, Mapping and Geospatial Capability:

A segment of the user community will not only be accessing and viewing the mapping data through the web interface at Recreation.gov, but will also have a need to search, retrieve, and download the geospatial data via the RIDB data sharing interface(s), or APIs. The Contractor shall demonstrate utilization of programming languages, formats, data types/conventions/sources, etc. which maximize the portability and usability of maps and map related data by third-parties. This minimizes the need for third-party data consumers to procure expensive and/or proprietary software to utilize, edit or manipulate the mapping data. The contractor shall ensure that all available geospatial data within R1S is incorporated into the user interface to provide users with the most complete mapping solutions available.
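The portability this section asks for is exactly what an open format like GeoJSON delivers. As a sketch (the campground name and coordinates below are made up, and the record shape is hypothetical, not the actual RIDB schema), a geospatial endpoint could return campsite locations as a standard FeatureCollection that any mapping library can consume without proprietary software:

```python
import json

def campsites_to_geojson(campsites):
    """Convert a list of campsite records into a GeoJSON FeatureCollection,
    an open plain-text format consumable by virtually any mapping tool."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                # GeoJSON orders coordinates as [longitude, latitude]
                "geometry": {"type": "Point",
                             "coordinates": [c["lon"], c["lat"]]},
                "properties": {"name": c["name"],
                               "reservable": c["reservable"]},
            }
            for c in campsites
        ],
    }

# Hypothetical record; a real implementation would pull these from the RIDB.
sites = [
    {"name": "Upper Pines 031", "lat": 37.7386, "lon": -119.565,
     "reservable": True},
]
print(json.dumps(campsites_to_geojson(sites), indent=2))
```

Because the output is plain JSON, the same payload serves the public website's map, a native mobile app, and bulk downloads alike.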

There is another mention of APIs under “5.3.6 Provide Travel Planning Tools” with:

Non Federal reservable inventory may include state and local reservable and non-reservable recreation inventory which may be incorporated into the system or accessed via various application programming interfaces (APIs).

Then, towards the end you see the following needs:

  • Data Sharing and Data Management – Information originating from within RIDB and imported shall be presented in a manner such that the external origin of the data is transparent to users and consistent with presentation of information originating from within.
  • Information Sharing – The Contractor shall deliver automated and manual services for the sharing of the consolidated recreation information described herein. The sharing service shall utilize prevailing industry best practices and technologies to make the consolidated recreation information publicly available to data consumers.
  • Information Consolidation – The Contractor shall deliver automated and manual services for collecting of a wide variety of recreation information from a wide range of sources including Federal Government Agencies and Federal Partner organizations such as Historic Hotels of America.

To me, all of this implies that whoever wins the contract will need to at least maintain what is available via the RIDB, which is nice, but it is just a web service, not a modern web API, and it only reflects providing access to a small portion of the valuable resources that will need to be made available to partners and the public at large. We can do better.

An API needs to be a first-class citizen in this RFP, not just an afterthought and a connection to the database that needs to be maintained; an API will be central to delivering the resources needed for a “one stop service for recreation.”

Camping resources

When you look through the requirements for this project, you see all the building blocks for allowing individuals to engage with campsites across our nation: inventory, ticketing, lotteries, permitting, day use and group facilities, special events and use, equipment rentals, and recreational and pass sales. All of this keeps our national parks accessible to campers.

Mapping and geospatial resources

Camping resources are meaningful to us because of where they are located. Mapping and geospatial resources will be vital to enabling users to discover the parks and campsites where they wish to stay on their camping trips. The ability to search and navigate campsites via modern mapping and providing geospatial resources is central to the one stop service for recreation.

Financial resources

A third cornerstone of the recreation one stop support services platform is financial resources. The ability to accept and process credit cards, cancellations, modifications, and refunds will play a role in every engagement on the platform. Access to all aspects of commerce via the platform will make or break operations at every level of the one stop service for recreation.

User resources

The fourth corner of the one stop service for recreation platform is the user. Providing secure, seamless access to user profiles and history throughout the experience will be critical to it all working. User resources are central to any modern web or mobile application in 2014, and they depend on camping, geo, and financial resources to bring it all together.

All of these resources need to be made available at Recreation.gov, in the form of a fully functioning web application providing a content management system (CMS) for managing the site, complete with a reservation system that allows users to discover camping resources via mapping and geospatial resources, and pay for their reservations via financial resources, all through a secure, easy to access user profile.

Travel planning tools

Beyond the core public website and its reservation system, the requirements call for travel planning tools, including point-to-point itinerary planning, field sales and reservation sales, third party sales, and reservable inventory management. While these tools are likely to be primarily web-based, the requirements call for remote sales and management of inventory, which will require a central API solution.

Robust reporting solutions

Reporting is prominent throughout the requirements, which call for web analytics, canned reports, customizable reports, monthly performance reporting, and enterprise reporting systems. In short, reporting will be essential to all aspects of the one stop service for recreation, for both the contract winner and operator, as well as all of the agencies involved. I would add that end users will also need some views into their own usage, adding another layer to the conversation.

Data accessibility and portability

Another element present throughout the requirements is data access and portability for the camping, reservation, customer (user), and resulting financial data. Pretty much everything in the system should be accessible, right? This definitely goes beyond just what is available via the RIDB, and it requires all aspects of the one stop service for recreation to be accessible for use in any other system, or as a simple download.

Minimum required applications and platforms

As part of the requirements, there is a minimum set of browsers and mobile platforms that the one stop service for recreation should operate on:

  • Applications:
    • Microsoft Internet Explorer
    • Mozilla Firefox
    • Google Chrome
    • Apple Safari
  • Platforms:
    • Android Mobile Devices
    • Apple iOS Mobile Devices
    • Apple Desktop OS
    • Microsoft Windows Mobile Devices
    • Microsoft Windows PC

For me, this portion of the requirements is extremely vague. I agree that Recreation.gov and the travel planning tools should work in all of these browsers and be responsive on mobile browsing platforms, but these requirements seem to allude to more without actually going there. In 2014-2016, you can’t tell me that native mobile experiences won’t be something end users of Recreation.gov and the overall system will be demanding. Mobile will be the gateway to the platform’s mapping and geospatial resources, and APIs are how you deliver the resources needed for developing mobile applications on all of the platforms listed.

A one-stop service for recreation requires a multi-channel approach

The one-stop service for recreation requires a multi-channel approach to deliver the camping, mapping and geo, financial, and user resources to the main website, supporting reservation system, travel planning tools, reporting solutions, and potentially any other web or mobile applications, rounded off with the data accessibility and portability demanded of a modern federal government platform.

Transform the RIDB web service into a modern API and make it the center

The only way to engage users across the web and mobile channels they will demand, as well as deliver the reservation, travel planning, and reporting solutions while making sure everything is portable, is to bring the RIDB database and supporting web services up to date and transform them into a modern web API. The RFP_-_Attachment_10_-_Notional_Future_RIDB_Data_Flow_-_Sep._22_2014.pptx even reflects this vision, but the actual RFP falls severely short of describing the role an API will play in the platform.
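To make the contrast with a legacy web service concrete, here is a minimal, hypothetical sketch of the resource-oriented, JSON-centric design a modernized RIDB API could take. The paths, IDs, and fields are invented for illustration, not the actual RIDB schema:

```python
# In-memory stand-in for the RIDB; a real API would query the database.
CAMPGROUNDS = {
    "232447": {"name": "Upper Pines", "state": "CA", "sites": 238},
}

def get(path):
    """Dispatch a GET request path to a (status, JSON-ready body) pair,
    REST-style: each campground is a resource addressable by a clean URL
    like /campgrounds/{id}, rather than a SOAP-style method call."""
    parts = path.strip("/").split("/")
    if parts[0] == "campgrounds" and len(parts) == 2:
        record = CAMPGROUNDS.get(parts[1])
        return (200, record) if record else (404, {"error": "not found"})
    return (404, {"error": "unknown resource"})

status, body = get("/campgrounds/232447")
```

The same resource URLs then serve the website, native mobile apps, partners, and bulk data consumers from one consistent surface.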

Making security a priority across all aspects of operations

With a single API layer acting as the access layer for all web, mobile, reporting, and data accessibility channels, security takes on a whole new meaning. APIs provide a single access point for all platform resources, allowing security and auditing to occur in real time using modern approaches to API management. APIs use existing web technology and don’t require any special technology, but they do provide a single layer to enable, monitor, and deny access to all resources across the one-stop service for recreation.
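As a sketch of the idea (the keys, statuses, and resources here are invented), a single gateway function can check every request against a key registry and audit every attempt in real time, so revoking a consumer takes effect immediately across all channels:

```python
import time

# Hypothetical consumer key registry; "revoked" keys are denied instantly.
API_KEYS = {"mobile-app": "active", "old-partner": "revoked"}
audit_log = []

def authorize(api_key, resource):
    """Return True if the key may access the resource; audit every attempt,
    allowed or not, so security review can happen in real time."""
    allowed = API_KEYS.get(api_key) == "active"
    audit_log.append({"key": api_key, "resource": resource,
                      "allowed": allowed, "ts": time.time()})
    return allowed
```

Commercial and open source API management tools implement this same pattern with rate limits, quotas, and dashboards layered on top.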

A centralized metrics, analytics and reporting layer

Married with security, an API-centric approach allows for a platform-wide analytics layer that can drive security, but also provide the data points required for the overall platform reporting system. With an API analytics layer, this data can be delivered to platform, government, developer, and end-user reporting solutions. A comprehensive metrics, analytics, and reporting strategy is essential to a modern one stop service for recreation platform.
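As a toy illustration (the call log records below are made up), an API analytics layer can start as nothing more than rolling a shared call log up into per-consumer and per-resource counts, the kind of aggregates that feed platform, agency, and end-user reports:

```python
from collections import Counter

def summarize(calls):
    """Roll a raw API call log up into per-consumer and per-resource
    counts, the aggregates a platform reporting system is built on."""
    return {
        "by_consumer": dict(Counter(c["key"] for c in calls)),
        "by_resource": dict(Counter(c["resource"] for c in calls)),
    }

# Hypothetical call log, as an API gateway's audit trail might record it.
calls = [
    {"key": "mobile-app", "resource": "/campgrounds"},
    {"key": "mobile-app", "resource": "/reservations"},
    {"key": "partner-site", "resource": "/campgrounds"},
]
report = summarize(calls)
```

Because every channel flows through the same API layer, one log like this covers the website, mobile apps, and partner integrations at once.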

APIs let in the sunlight, enforcing transparency and accountability

One critical aspect of a modern, API-centric approach is the transparency and accountability it provides. All database connections and web or mobile applications are brokered via the API layer, requiring all applications to rely on the same pipes for reading, writing, modifying, and removing data from the system. This approach provides a secure window into all aspects of platform operations, satisfying the accountability requirements set forth in the proposed RFP.

Some constructive criticism for language in the proposed RFP

There are some points in the proposed RFP that, in my opinion, move the conversation backwards rather than looking to the future and fulfilling a vision of what the “one stop services for recreation platform” could be.

One example is “Third Party Sales”:

The Government anticipates that during the potential 10-year period of performance after go-live there may be an interest by commercial travel and recreation planning companies (such as Travelocity, Expedia, etc.) to utilize R1S inventory and availability data to facilitate R1S transactions. The Government may be open to such Third Party Sales arrangements, provided the business, financial and legal terms of the arrangement are in the Government’s best interests. At the Government’s request, the Contractor shall lead exploratory meetings, including Government and applicable industry representatives, to determine commercial interest in such a service and also explore the technical and fiscal feasibility of such Third Party Sales arrangements. Upon the conclusion of all necessary exploratory meetings, and should the Government decide to implement Third Party Sales, the Government will issue a formal modification to the R1S Support Services contract to incorporate the service. There is no requirement for the contractor to provide Third Party Sales services until a post-award modification is issued requiring them.

“There may be an interest by commercial travel and recreation planning companies (such as Travelocity, Expedia, etc.)?” In the current online environment, I am pretty sure this shouldn’t be “may,” it should be “will,” and the pace of business today moves quite a bit faster than this section describes. APIs enable a new approach to business development, coined “Biz Dev 2.0” nearly a decade ago by a co-founder of Flickr, the popular photo sharing platform.

I understand that you will have to certify developers, but this should not get in the way of small business developers experimenting and innovating with the resources available via the platform. Business development via an API has been going on in the private sector for 10 years and isn’t something USDA should be putting off until the next procurement cycle for the platform.

The proposed RFP states:

The current contract is a Variable Quantity contract with Fixed Price per-transaction pricing such that the Contractor earns a fixed fee for each transaction processed via the website, the telephone customer support line, or in-person at participating Federal recreation locations.  The fixed fees under the current contract vary depending on the type of inventory being reserved, the sales channel, etc.

In an era where we have over 10,000 API resources to use in applications, and where cloud-based, utility-style pricing is successfully being applied and you pay for what you use across numerous industries, you can’t have a business model like this for a public-private sector partnership without putting APIs to use and enabling trusted partners to build business models around the commerce opportunities available via the platform. Right now, this is exclusive to the current contract holder, Active Network, d.b.a. ReserveAmerica, and in the next round it should not be exclusive to the contract winner. It should be open to the thousands of businesses that serve the parks and recreation space.

Another item I have an issue with is ownership over the source code of the platform:

The Government owns all of the inventory, transaction and customer data associated with Recreation.Gov; however the reservation system remains the sole proprietary property of the current service provider.

I know this is the current situation, but I strongly urge that this not be the case in the next relationship. The government should own all of the inventory, transaction, and customer data, and the reservation system should be open source. Period. The chances of innovation via the platform are slim if it remains a proprietary solution, and requiring that the core reservation system and supporting API platform be open source will stimulate innovation around Recreation.gov. Any business, including the winner of the contract, can still build proprietary solutions that use the API to deliver specific applications, end-user solutions, and premium features, but the core software should be open this time.

Some praise for what is in the proposed RFP

I am not one to just criticize without providing at least some praise, acknowledging where USDA is looking towards the future. One area I have to note is their inclusion of the U.S. Digital Services Playbook, where #3 on the checklist for deploying in a flexible hosting environment is that “resources are provisioned through an API,” and #2 on the checklist for defaulting to open says that you should provide data sets to the public, in their entirety, through bulk downloads and APIs (application programming interfaces). I think the U.S. Digital Services Playbook should be included with ALL federal government RFPs, and it makes me happy to see it here.

I also understand that the agencies involved in the project include the USDA Forest Service, the U.S. Army Corps of Engineers, the U.S. Department of the Interior (including at least the National Park Service, Bureau of Land Management, Bureau of Reclamation, and Fish and Wildlife Service), the National Archives and Records Administration, and other potential federal partners to be determined at a later date.

Agencies involved in the trip planning component include the agencies named above, as well as the Department of Transportation’s Federal Highway Administration, the Department of Commerce’s National Oceanic and Atmospheric Administration, the Tennessee Valley Authority, and the Smithsonian Institution. This is a huge undertaking, and I commend them on the effort, but I can’t help throwing in that this is all the more reason the one stop service for recreation has to have an API. ;-)

In closing

I think I’m finished with my response to the Department of Agriculture’s solicitation number AG3187S151000–Recreation One Stop Support Services. I would like to request that they extend the deadline for responses another 30 days, to allow me to drum up more feedback, as well as put some more thought into the RFP myself. Following USDA’s own lead of including the U.S. Digital Services Playbook, I would also recommend bringing in:

  • 18F – Building on the U.S. Digital Services Playbook, 18F has an amazing set of resources to help USDA move forward with this RFP in a way that is in line with other leading platform efforts in the federal government—please make sure 18F is involved.
  • Round 3 PIFs – This is probably already in the works, but make sure someone from the recent round of Presidential Innovation Fellows is on board, helping make sure this project is headed in the right direction.

Ultimately, I think all the right parts and pieces are present for this project, but when it comes to finding the right language and vision for the RFP, it needs a lot of help, and I’m confident that 18F and the PIFs can help bring it home. The most important element for me is that the web service from the RIDB needs to be transformed into a modern web API and brought front and center to fuel every aspect of the platform. I’m confident that if the RFP speaks the right language, the winning contractor will be able to translate it into the platform it needs to be, serving the expectations of the modern consumers who will be using it in the coming years.

If this RFP goes out the door without an API vision, planning of the family camping trip will be done in a silo, at Recreation.gov, and not on the mobile phones that are ubiquitous in our everyday lives. Users do not need a web experience that is translated to function in mobile browsers; they need a web experience when it makes sense, a native mobile experience, and even offline access when necessary.

Planning the family vacation will not happen from just Recreation.gov. It needs to be part of a larger vacation planning experience, occurring via the online platforms we already use in our daily lives. If USDA focuses on an API-first vision for delivering the one stop service for recreation, it will live up to its name, truly being a one-stop service that can be used for planning your recreation from any platform or device.

Call to action

Visit USDA’s solicitation number AG3187S151000–Recreation One Stop Support Services and ask them to extend the comment period, and let them know how important it is that the platform be API-centric. Feel free to send me a link to your feedback, and I’ll add a list of relevant stories to the bottom of this post as I find them.

Models for API-driven startups built around public data

I had a conversation with a venture capitalist recently who was looking for information on startups who had APIs and had built their company around public data.

The two companies referenced in the original contact email were Eligible and Clever: two similar, yet very different, approaches to aggregating public data into a viable startup (Clever used to aggregate data from school districts, but now just provides login services).

During our conversation, we talked about the world of APIs and open data, both in government and across the private sector. I spent 30 minutes helping them understand the landscape, and told them that when I was done I would generate a list of APIs I thought were interesting and would categorize as being in a similar space to Eligible and Clever, something that was much more difficult to quantify than I expected. Nonetheless, I learned a lot and, as I do with all my research, I wanted to share the story of my experience.

I started with the companies that, off the top of my head, had built interesting businesses around publicly available facts and data, a definition that would expand as I continued.

I started with a couple of APIs I know provide some common data sources.

Next, I wanted to look at a couple of the business data APIs I depend on daily, and while I was searching I found a third. What I think is interesting about these business data providers is their very different business models and approaches to gathering and making the data available.

Immediately after looking through Crunchbase and OpenCorporates, I queried my brain for other leading APIs that are pulling content or data from public sources and developing a business around them. It makes sense to look at the social data and content realm, but this is where I stop. I don’t want to venture too far down the social media rabbit hole, but I think these two providers are important to consider.

While Twitter data isn’t the usual set of business, industry, product, and other common facts, it has grown to be a new kind of data, defining not just Twitter’s business but a whole ecosystem of aggregators and other services built on consuming, aggregating, and often republishing raw or enriched public social data.

I also wanted to step back and look at Clever again, and think about their pivot from aggregating school data to being a login service. There was another API I was tracking that offered a similar service to Clever, aggregating school data, which I think is important to list alongside Clever.

As far as I know, both Clever and LearnSprout are adjusting to find the sweet spot in what they do, but I keep them on the list because of what they did when I was originally introduced to their services. I think we can safely say there will be no shortage of startups to come, following in Clever and LearnSprout’s footsteps, unaware of their predecessors and the challenges they face when aggregating data across school districts.

Healthcare data

After taking another look at Clever, I also took another walk through the Eligible API, and spent time looking for similar data-driven APIs in the healthcare space. I think Eligible is a shining example of what this particular VC was looking for, and a good model for startups looking not just to build a company and API around public data, but to do it in a way that makes a significant impact on an industry.

I know there are more healthcare data platforms out there, but these are a handful of the ones with APIs that I track. Healthcare is one of those heavily regulated industries where there is a huge opportunity to aggregate data from multiple public and private sector sources and build an API-driven business around it.

Energy data

After healthcare, my mind immediately moved into the world of energy data, because there is a task on my list to study open data licensing as part of a conversation I’m having with Genability. I think the work this latest wave of energy API providers is doing with the data of individual customers, as well as wider power company, state, and federal data, is very interesting.

When I was in Maryland this last May, moderating a panel with folks from the Department of Energy, the conversation came up around the value of Department of Energy data to the private sector. I’d say the Department of Energy is among the top five agencies when it comes to the viability of its data for use in the private sector and its potential to make a significant economic impact.


Library data

Pushing the boundaries of this definition again, I stumbled onto the concept of launching APIs for libraries, built around public or private collections. While not an exact match to the other APIs in this story, I think what the DPLA (Digital Public Library of America) is doing reflects what we are talking about: building a platform around public and private datasets (collections, in this case).

Just like government agencies, public and private institutions possess an amazing amount of data, content, and media that is not readily available online, which presents a pretty significant opportunity to build API-driven startups and organizations around these collections.

Scientific data

There is a wealth of valuable scientific data being made available via public APIs from various organizations and institutions. I’m not sure where these groups are gathering their data from, but I’m sure there is a lot of public funding and public sourcing behind some of the APIs I track.

These are just two of the numerous scientific data APIs I keep an eye on, and I agree that this is a little outside of exactly what we are looking for; however, I think the opportunity for designing, deploying, and managing high-performing, high-value APIs from publicly and privately owned scientific data is pretty huge.

Government data

As I look at these energy and scientific APIs across my monitoring system, I’m presented with other government APIs that are consumer focused and often have the look and feel of a private sector startup, while also having a significant impact on private sector industries.

While all of these APIs are clearly .gov initiatives, they provide clear value to consumers, and I think there is an opportunity for startups to offer complementary, or even competing, services around this government-generated, industry-focused open data, going further than these platforms do.

Quasi-government data

Alongside those very consumer- and industry-oriented government efforts, I can’t help but look at the quasi-government APIs I’m familiar with that provide similar data-driven APIs to the government ones above.

While these may not be models for startups, I think they provide potential ideas that private sector non-profit groups can take action on. Providing mortgage, energy, environmental, or even healthcare services, developed around public and private sector data, will continue to grow as a viable business model for startups and organizations in coming years.

Watchers of government data

Adding another layer to government data, I have to include the organizations that keep an eye on government, a segment of organizations that have evolved around building operational models for aggregating, generating meaning from, then republishing data that helps us understand how government is working (or not working).

These are all nonprofit organizations doing this work, but when it comes to journalism, politics, and other areas, there are some viable services that can be offered surrounding, and on top of the valuable open data being liberated, generated and published by the watchers of our government.

School data again

One interesting model for building a business around government data is GreatSchools. There are some high-value datasets available at the Department of Education, as well as the Census Bureau, and using these sources has become a common model for building a company around public data.

I’m not exactly a fan of GreatSchools, but I think it is worth noting. I’ve talked with them, and they don’t really have as open a business model and platform as I would like to see. I feel it is important to “pay it forward” when building a for-profit company around public data. I don’t have a problem with building businesses and generating revenue around public data, but if you don’t contribute to making it more accessible than you found it, I have a problem.


News data

After spending time looking through the APIs I monitor, I remembered the use of public data by leading news sources. These papers are using data from federal, state, and city sources, and serving it via APIs, right alongside the news.

These news sources don’t make money off the APIs themselves; like software-as-a-service providers, the APIs provide value-add to their core business. Census surveys, congressional voting, economic numbers, and other public data are extremely relevant to the everyday news that impacts us.

Been doing this for a while

When we talk about building businesses around publicly available data, there are some companies that have been doing this for a while. The concept really isn’t that new, so I think it is important to bring these legacy providers into the conversation.

Most of these data providers have been doing it for over a decade. They all get the API game and offer a wide range of API services for developers, providing access to data taken directly from, derived from, or enhanced from public sources. When it comes to building a business around public data, I don’t think these four have the exact model I’m looking for, but there are many lessons in how to do it right, and wrong.

Weather is an age-old model

When you think about it, one of the original areas where we built services around government data is weather. Weather data is a common reference you will hear when any official talks about the potential of government data. There are numerous weather API services doing very well when it comes to digesting public data and making it relevant to developers.

Weather is the most relevant API resource I know of in the space. Weather impacts everyone, making it a resource all web and mobile applications will need. With growing concern around climate change, this model for using public data and generating valuable APIs will only become more important.

Time zone data

Right there behind weather, I would say that time and date information is something that impacts everyone. Time shapes our world, and government sets the tone of the conversation when it comes to date and time data, something that is driving many API-driven business models.

What I like about time and date APIs is that they provide an essential ingredient in all application development. They are an example of how government can generate and guide data sources while allowing the private sector to help manage vital services around this data that developers will depend on for building apps.

Currency conversion

Along with time zone data, currency conversion is a simple, valuable, API-driven service that is needed across our economy. You have to know what time it is in different time zones, and what the conversion rate is between currencies, to do business in the global API economy.

In our increasingly global, online world, currency conversion is only going to grow more important. Workforces will be spread across the globe, and paying employees, buying goods and services will increasingly span the globe, requiring seamless currency conversion in all applications.
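
As a sketch of what such a service does under the hood, here is a minimal conversion function that pivots through a base currency. The rates are illustrative placeholders, not real market data, and a real API would fetch them live.

```python
# Minimal sketch of the kind of conversion an API-driven service would expose.
# The rates below are invented placeholders, not real market data.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.10, "GBP": 1.25, "JPY": 0.0070}

def convert(amount: float, from_currency: str, to_currency: str) -> float:
    """Convert between currencies by pivoting through USD."""
    usd = amount * RATES_TO_USD[from_currency]
    return round(usd / RATES_TO_USD[to_currency], 2)

print(convert(100, "EUR", "GBP"))  # 100 EUR -> 110 USD -> 88.0 GBP
```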

Transit data

Another important area of APIs that is increasingly impacting our everyday lives is transit: APIs providing real-time bus, train, subway and other public transit data to developers.

Transit data will always be a tug of war between the public and private sector. Some data will be generated in each sphere, with some projects incentivized by the government, where the private sector is unwilling to go. Establishing clear models for public and private sector partnerships around transit data will be critical to society functioning.

Real estate

While I won’t be covering every example of building a business around public data in this story, I would be remiss if I didn’t talk about the real estate industry, one of the oldest businesses built on public data and facts.

I’m not a big fan of the real estate industry. One of my startups in the past was built around aggregating MLS data, and I can safely say that the real estate industry is one of the shadiest industries I know of that is built on top of public data. I don’t think this industry is a model we should be following, but again, there are huge lessons to be learned from the space as we move forward building business models around public data.

That is as far as I’m going to go in exploring API-driven businesses built on public data. My goal wasn’t to be comprehensive; I was just looking to answer some questions for myself about who else is playing in the space.

This list of businesses came out of my API monitoring system, so it is somewhat limited in its focus, requiring each company building on top of public data to also have an API, which creates quite a blind spot for this research. However, it is a blind spot I’m willing to live with, because I think my view represents the good in the space, and where we should be headed.

Open Data 500

For the next edition of this story, I’d like to look through the 500 companies listed in the Open Data 500 project. I like the focus of the project from GovLab out of New York University.

Their description from the site sums up the project:

The Open Data 500 is the first comprehensive study of U.S. companies that use open government data to generate new business and develop new products and services. Open Data is free, public data that can be used to launch commercial and nonprofit ventures, do research, make data-driven decisions, and solve complex problems.

I see a few of the companies I’ve listed above in the Open Data 500. I’m stoked that they provide both JSON and CSV versions of the Open Data 500, making it much easier to process and make sense of the companies listed. I’d like to make a JavaScript slideshow from the JSON file, and browse through the list of companies, adding my own set of tags—helping me better understand the good from the bad examples, as well as where the trends and opportunities are around developing APIs around public data.
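
While my slideshow would be JavaScript, the tagging idea can be sketched in a few lines of Python. The field names and companies below are invented stand-ins for whatever the real Open Data 500 JSON actually contains.

```python
import json

# A tiny, made-up sample in the shape I'd expect from the Open Data 500 JSON
# export -- the real field names and records may differ.
sample = json.loads("""[
  {"company_name": "Foo Weather Co", "company_category": "Data/Technology"},
  {"company_name": "Bar Transit Inc", "company_category": "Transportation"}
]""")

# My own tagging pass, layered on top of the published data.
my_tags = {"Foo Weather Co": ["weather", "good-example"],
           "Bar Transit Inc": ["transit"]}

for company in sample:
    company["tags"] = my_tags.get(company["company_name"], [])

print([c["tags"] for c in sample])
```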

I’m pretty convinced that we have a lot of work to do in making government machine-readable data at the federal, state, county and city level more available before we can fully realize the potential of the API economy.

Without high-quality, real-time, valuable public data, we won’t be able to satisfy the needs of the next wave of web, single-page, mobile and Internet of Things application developers. I’m also hoping we can work to establish some healthy blueprints for developing private sector businesses and organizations around public data, by reviewing some of the existing startups finding success with this model, and build on, or complement, this existing work, rather than re-invent the wheel.

An open data blueprint for the U.S. Department of Commerce

U.S. Secretary of Commerce Penny Pritzker announcing the agency's new chief data officer position. (Photo: U.S. Secretary of Commerce)

Re-published from API Evangelist

U.S. Secretary of Commerce Penny Pritzker recently announced the Department of Commerce will hire its first-ever chief data officer. I wanted to make sure that when this new and extremely important individual assumes their role, they have my latest thoughts on how to make the Department of Commerce developer portal the best it can possibly be, because it will be a driving force behind the rapidly expanding API-driven economy.

Secretary Pritzker does a pretty good job of summing up the scope of resources that are available at Commerce:

Secretary Pritzker described how the Department of Commerce’s data collection – which literally reaches from the depths of the ocean to the surface of the sun – not only informs trillions of dollars of private and public investments each year and plants the seeds of economic growth, but also saves lives.

I think she also does a fine job of describing the urgency behind making sure Commerce resources are available:

Because of Commerce Department data, Secretary Pritzker explained, communities vulnerable to tornadoes have seen warning times triple and tornado warning accuracy double over the past 25 years, giving residents greater time to search for shelter in the event of an emergency.

To understand the importance of content, data and other resources that are coming out of the Department of Commerce, you just have to look at the list of agencies under its purview that already have API initiatives:

Then take a look at the other half, who have not launched APIs:

The data and other resources available through these agencies reflect the heart of not just the U.S. economy, but the global economy, which is rapidly being driven by APIs powering stock markets, finance, payment providers, cloud computing and many other cornerstones of our increasingly online economy.

Look through those 13 agencies. The resources they manage are vital to all aspects of the economy: telecommunications, patents, weather, oceans, census, and other areas that have a direct influence on how markets work (or don’t).

I’m all behind Commerce hiring a CDO, but my first question is, “what will this person do?”

This leader, Secretary Pritzker explained, will oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic.

Yes! I can get behind this. In my opinion, in order for the new CDO to do this, they will have to quickly bring all of the agencies’ /developer programs up to a modern level of operation. There is a lot of work to be done, so let’s get to work exploring what needs to happen.

A central Commerce developer portal to rule them all

Right now, the Commerce developer portal is just a landing page, an afterthought to help you find some APIs–not a portal.

The new CDO needs to establish this real estate as the one true portal, which provides the resources other agencies will need for success, while also providing a modern, leading location for developers of web, mobile and Internet of Things applications, and for data journalists and analysts, to find the data they need.

If you need a reference point, look at the developer areas for Amazon Web Services, SalesForce, eBay or Google—this is the type of activity you should see from Commerce.

Each agency must have its own kick-ass developer portal

Following patterns set forth by Commerce, each sub-agency needs to possess its own best-of-breed developer portal, providing the data, APIs, code and other resources that public and private sector consumers will need. I just finished looking through all the available developer portals for Commerce agencies, and there is no consistency between them in user experience, API design or resources available. The new CDO will have to immediately get to work taking existing patterns from the private sector, as well as what has been developed by 18F, and establish common patterns that other agencies can follow when designing, developing and managing their own agency’s developer portal.

High-quality, machine-readable open data by default

The new CDO needs to quickly build on the existing data inventory efforts that have been going on at Commerce, making sure any existing projects are producing machine-readable data by default, and making sure the full data inventory is available within each agency’s portal, as well as centrally. This will not be a one-time effort. The new CDO needs to make sure all program and project managers also get the data steward training they will need, ensuring that all future work at Commerce, associated agencies and private sector partners produces high-quality, machine-readable data by default.
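
“Machine-readable by default” can be as simple as publishing the same inventory as structured data instead of only as a spreadsheet or document. A minimal sketch, with invented columns and records:

```python
import csv
import io
import json

# Sketch: the same inventory published as CSV, made machine-readable as JSON.
# The datasets and columns here are invented for illustration.
raw = """dataset,agency,updated
Tide Predictions,NOAA,2014-06-01
Patent Grants,USPTO,2014-05-15"""

rows = list(csv.DictReader(io.StringIO(raw)))
print(json.dumps(rows, indent=2))
```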

Open source tooling to support the public and private sector

Within each of the Commerce and associated agency developer portals, there needs to be a wealth of open source code samples, libraries and SDKs for working with data and APIs. This open source philosophy also needs to be applied to any web or mobile applications, analysis or visualizations that are part of Commerce-funded projects and programs, whether they are from the public or private sector. All software developed around Commerce data that receives public funding should be open source by default, allowing the rest of the developer ecosystem, and ultimately the wider economy, to benefit from and build on top of existing work.

Machine-readable API definitions for all resources

This is an area that is a little bit leading edge, even for the private sector, but machine-readable API definitions are rapidly emerging to play a central role in how APIs are designed, deployed, managed, discovered, tested, monitored and ultimately integrated into other systems and applications. These definitions act as a sort of central truth, describing how an API works and what it does in a common, machine-readable format that any developer, and potentially any other system, can understand. Commerce needs to ensure that all existing, as well as future, APIs developed around Commerce data possess a machine-readable API definition, which will allow all data resources to be plug and play in the API economy.
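
As a sketch of what such a central truth looks like, here is a minimal Swagger 2.0-style definition for a hypothetical Commerce data API, expressed as plain data so any tool can read it back. The title, host and paths are all invented.

```python
import json

# A minimal Swagger 2.0-style definition for a hypothetical Commerce data API.
# The host and paths are invented; the point is the machine-readable shape.
definition = {
    "swagger": "2.0",
    "info": {"title": "Trade Data API", "version": "1.0"},
    "host": "api.example.gov",
    "paths": {
        "/exports": {
            "get": {
                "summary": "List export totals by country",
                "parameters": [
                    {"name": "year", "in": "query", "type": "integer"}
                ],
                "responses": {"200": {"description": "Export totals"}},
            }
        }
    },
}

# Because the definition is just data, any tool can read it and act on it:
# documentation generators, discovery engines, SDK builders, and so on.
spec = json.loads(json.dumps(definition))
print(list(spec["paths"]))  # ['/exports']
```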

Establish an assortment of blueprints for other agencies to follow

The new Commerce CDO will have to be extremely efficient at establishing successful patterns that other agencies, projects and programs can follow. This starts with developer portal blueprints they can follow when designing, deploying and managing their own developer programs, but it should not stop there: Commerce will need a wealth of blueprints for open source software, APIs, system connectors and much, much more. Establishing common blueprints, and sharing them widely across government, will be critical for consistency and interoperability–reducing the chances that agencies or private sector partners will be re-inventing the wheel, while also reducing development costs.

Establish trusted partner access for public and private sector

Open data and APIs do not always mean publicly available by default. Private sector API leaders have developed trusted partner layers for their open data and API developer ecosystems, allowing select, trusted partners greater access to resources. An existing model for this in the federal government is the IRS modernized e-file ecosystem, and the trusted relationships it maintains with private sector tax preparation partners like H&R Block or Jackson Hewitt. Trusted partners will be critical in Commerce operations, acting as private sector connectors to the API economy, enabling higher levels of access from the private sector, but in a secure and controlled way that protects the public interest.

Army of domain expert evangelists providing a human face

As the name says, Commerce spans all business sectors, and to properly “oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic,” the CDO will need another, human, layer to help increase awareness of Commerce data and APIs, while also supporting existing partners and integrators. An army of evangelists will be needed, possessing extremely important domain expertise across all the business sectors that Commerce data and resources touch. Evangelism is the essential human variable that makes the whole open data and API algorithm work. The new CDO needs to get to work writing a job description and hiring for this army—you will need an 18F, but one that is dedicated to Commerce.

Department of Commerce as the portal at the center of the API economy

The establishment of an official CDO at the Department of Commerce is very serious business, and is a role that will be central to how the global economy evolves in the coming years. The content, data, and digital resources that should, and will, be made available by Commerce and its associated agencies will be central to the health of the API-driven economy.

Think of what major seaports have done for the economy over the last 1,000 years, and what role Wall Street has played in the economy over the last century. This is the scope of the Commerce developer portal, which is ultimately the responsibility of this new role.

When the new CDO gets started, I hope they reach out to 18F, who will have much of what they need to get going. Then sit down and read this post, as well as my other one, An API strategy for the U.S. government. And once you get going, if you need any help, just let me know—as my readers know, I’m full of a lot of ideas on APIs.

An API strategy for the U.S. government

U.S. Chief Technology Officer Todd Park with President Obama (Photo: White House)

I was asked to provide some thoughts on what is next for the U.S. government’s application programming interface strategy. I’ve put a lot of thought into it during my work and travels over the couple of months since I left Washington, D.C., and I keep coming back to one thought: strengthen what we have.

I wish I had some new technology or platform for the next wave of government APIs that would ensure success with APIs in Washington, but in reality we need to do what we’ve been doing, only do it at scale, and get organized and collaborative about how we do it.

Release more data sets

There are thousands of data sets currently available via Data.gov, across 176 agencies and numerous categories. We need more. When any content or data is published via a government website, that data needs to also be made available via the agency’s data repository and Data.gov. Agencies need to understand that releasing open data sets is not something you do every once in a while to meet a mandate or deadline–it is something to do always, forever.

Refine existing data sets

There is a lot of data currently available. However, much of it is in various formats with inconsistent data models, and it isn’t always immediately usable in spreadsheets, in applications, or for analysis. There is a great deal of work to be done cleaning, normalizing and refining existing data, as well as deploying APIs around open data, which would increase adoption and the chances it will be put to use.

Refine existing APIs

Like open data, there are many existing APIs across the federal government, and these APIs could use a lot of work to make them more usable by developers. With a little elbow grease, existing APIs could be standardized by generating common API definitions like Swagger, API Blueprint and RAML, which would help quantify all APIs, while also generating interactive documentation and code samples, and providing valuable discovery tools for understanding where interfaces are and what they offer. The mission up until now for agencies has been to deploy APIs, and while this remains true, the need to evolve and refine APIs will go a long way towards building the valuable case studies needed to get to the next level.

Robust /developer areas

There are over 50 agencies who have successfully launched a /developer area to support open data and API efforts. Much like the open data and the APIs themselves, they represent a mishmash of approaches, and provide varied amounts of resources and necessary support materials. There is already some great information available on how to evolve an agency’s developer area; we just need some serious attention spent on helping each agency make it so. It doesn’t matter how valuable the open data or APIs are; if they are published without proper documentation, support and communication resources, they won’t be successful. Robust developer areas are essential to federal agencies finding success in their API initiatives.

Dedicated evangelist

Every successful API initiative in the private sector, from Amazon to Twilio, has employed evangelists to spread the word and engage developers, helping them find success in putting API resources to use. Each federal agency needs its own evangelist to help work with internal stakeholders, making sure open data is published regularly, new APIs are deployed and existing resources are kept operational and up to date. Evangelists should have counterparts at the OSTP / OMB / GSA level providing feedback and guidance, as well as regular engagement with evangelists in the private sector. Evangelism is the glue that will hold things together across an agency, as well as provide the critical outreach to the private sector that increases adoption of government open data and APIs.

Build public-private sector partnerships

Opening up data and APIs in the federal government is about sharing the load with the private sector and the public at large. Open data and APIs represent the resources the private sector will need to build applications and sites, and to fuel industry growth and job creation. A new type of public-private sector partnership needs to be defined, allowing companies and non-profit groups to access and use government services and resources in a self-service, scalable way. Companies should be able to build businesses around government Internet services, much like the ecosystem that has grown from the IRS e-File system, with applications like TurboTax that reach millions of U.S. citizens and allow corporations to help government share the load while also generating necessary revenue.

Establish meaningful case studies

When it comes to open data and APIs, nothing gets people on board more than solid examples of the difference open data, APIs and the applications built on them have made in government. Open government proponents use weather data and GPS as solid examples of open data and technology impacting not just government, but also the private sector. We need to fold the IRS e-file ecosystem into this lineage, but also work towards establishing numerous other case studies we can showcase and tell stories about why open data and APIs are important–in ways that truly matter to everyone, not just tech folks.

Educate and tell stories within government

In order to take open data and APIs to the next level in government, there needs to be an organized and massive effort to educate people within government about the opportunities around open data and APIs, as well as the pitfalls.

Regular efforts to educate people within government about the technology, business and politics of APIs needs to be scaled, weaving in relevant stories and case studies as they emerge around open data and APIs. Without regular, consistent education efforts and sharing of success stories across agencies, open data and APIs will never become part of our federal government DNA.

Inspire and tell stories outside government

As within government, the stories around government open data and APIs need to be told outside the Washington echo chamber, educating citizens and companies about the availability of open data and APIs, and inspiring them to take action by sharing successful stories of how others have used open data and APIs in the development of applications.

Popular platforms like Amazon and Twitter spread through word of mouth amongst developer and power-user communities; the same path needs to be taken with government data and API resources.

The next 2-3 years of the API strategy for the U.S. government will be about good old-fashioned hard work, collaboration and storytelling. We have blueprints for what agencies should be doing when it comes to opening up data, deploying APIs and enticing the private sector to innovate around government data; we just need to repeat and scale until we reach the next level.

How do we know when we’ve reached the next level? When the potential of APIs is understood across all agencies, and the public instinctively knows they can go to any government /developer domain and find the resources they need, whether they are an individual or a company looking to develop a business around government services.

The only way we will get there is by building a solid group of strong case studies of success in changing government using open data and APIs. Think of the IRS e-file system, how many citizens that ecosystem reaches, and the value generated through commercial partnerships with tax professionals. We need 10-25 similar stories of how APIs have impacted people’s lives, strengthened the economy and made government more efficient before we can consider ourselves at the next level.

Even with the housekeeping we have to do, Data.gov has continued to evolve, adding data sets, agencies and features. With recent, high-profile stumbles like Healthcare.gov, it can be easy to fall prey to historical stereotypes that government can’t deliver tech very well. While this may apply in some cases, I think we can get behind the movement that is occurring at Data.gov, with 176 agencies working to add 54,723 data sets in the last 12 months.

I feel pretty strongly that before we look towards the future of what the roadmap for Data.gov looks like, we need to spend a great deal of time refining and strengthening what we currently have available there and across the numerous government agency developer areas. Even with these beliefs, I can’t help but think about what is needed over the next couple years of Data.gov.

Maybe I’m biased, but I think the next step for Data.gov is to set its sights purely on the API. How do we continue evolving the platform, and prepare it to not just participate, but lead, in the growing API economy?

Management tools for agencies

We need to invest in management tools for agencies and commercial providers, as well as under the Data.gov umbrella. Agencies need to be able to focus on the best quality data sets and API designs, and not have to worry about the more mundane necessities of API management like access, security, documentation and portal management. Agencies should have consistent analytics, helping them understand how their resources are being accessed and put to use. If OSTP, OMB, GSA and the public expect consistent results when it comes to open data and APIs from agencies, we need to make sure agencies have the right management tools.

Endpoint design tools for data sets

Agencies should be able to go from data set to API without much additional work. Tools should be made available for easily mounting published datasets, then allowing non-developers to design API endpoints for easily querying, filtering, accessing and transforming those datasets. While data downloads will still be the preferred path for many developers, making high value datasets available via APIs will increase the number of people who access them, people who may not have the time to deal with the overhead of downloads.
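
The core of such a tool is small: once a dataset is mounted, an endpoint is little more than a generic filter over its rows. A minimal sketch, with invented campsite data standing in for a published dataset:

```python
# Sketch of the kind of endpoint a non-developer could "design" over a
# published dataset: pick a dataset, choose filterable fields, done.
# The campsite records below are invented for illustration.
DATASET = [
    {"park": "Yosemite", "site": "Upper Pines", "open": True},
    {"park": "Yosemite", "site": "Tamarack Flat", "open": False},
    {"park": "Zion", "site": "Watchman", "open": True},
]

def query(dataset, **filters):
    """A generic query endpoint: match rows on any field/value pairs."""
    return [row for row in dataset
            if all(row.get(k) == v for k, v in filters.items())]

print(query(DATASET, park="Yosemite", open=True))
```

Everything else a non-developer would configure, such as which fields are filterable and how results are paginated, layers on top of this one function.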

Common portal building blocks

When you look through each of the 50+ federal agency /developer areas you see 50+ different approaches to delivering the portals that developers will depend on to learn about, and integrate with, each agency’s APIs. A common set of building blocks is needed to help agencies standardize how they deliver their developer portals. Each approach might make sense within its agency, but as a consumer, trying to work across agencies can be a confusing maze of API interactions.

Standard developer registration

As a developer, I currently need to establish a separate relationship with each federal agency. This quickly becomes a barrier to entry, one that will run off even the most seasoned developers. We want to incentivize developers to use as many federal APIs as possible, and providing them with a single point of registration, and a common credential that works across agencies, will stimulate integration and adoption.

Standard application keys

To accompany standard developer registration, a standard approach to user and application keys is needed across federal agencies. As a user, I should be able to create a single application definition and receive API keys that work across agencies. The work required to develop an application while managing multiple sets of API keys will prevent developers from adopting multiple federal agency APIs. Single registration and application keys will reduce the barriers to entry for the average developer looking to build on top of federal API resources.
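
The mechanics are simple if agencies share one credential store instead of each maintaining its own. A minimal sketch, with invented agency names and an in-memory dictionary standing in for a real shared registration service:

```python
import secrets

# Sketch of a single registration issuing one credential honored by many
# agencies. The store, developer and app names are invented for illustration.
KEY_STORE = {}

def register(developer: str, app: str) -> str:
    """One registration, one key, good across every participating agency."""
    key = secrets.token_hex(16)
    KEY_STORE[key] = {"developer": developer, "app": app}
    return key

def validate(agency: str, key: str) -> bool:
    """Any agency checks the same shared store instead of its own silo."""
    return key in KEY_STORE

key = register("jane@example.com", "park-finder")
print(validate("NOAA", key), validate("USPTO", key))  # True True
```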

Developer tooling and analytics

When integrating with private sector APIs, developers enjoy a wide range of tools and analytics that assist them in discovering, integrating with, managing and monitoring their applications’ use of APIs. This is very rare when integrating with federal government APIs. Standard tooling and analytics for developers need to become part of the standard operating procedures for federal agency /developer initiatives, helping developers be successful in all aspects of their usage of government open data and APIs.

Swagger, API Blueprint, RAML

All APIs produced in government should be described using one of the common API definition formats that have emerged, like Swagger, API Blueprint and RAML. These all provide a machine-readable description of an API and its interface that can be used for discovery, interactive documentation, code libraries and SDKs, and many other purposes. Many private sector companies are doing this, and the federal government should follow their lead.

Discoverability, portable interfaces and machine readable by default

As with open data, APIs need to be discoverable, portable and machine readable by default. Describing APIs in Swagger, API Blueprint and RAML will do this, allowing APIs to be presented, distributed and reused in new ways. Each agency can publish its own APIs, while aggregators take the machine-readable definitions from each and publish them in a single location. This approach allows for meaningful interactions, such as with budget APIs: a single budget API site could exist, providing access to every federal agency’s budget without having to visit each /developer area. There are many more examples like this that would increase API usage and extend the value of government APIs.
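
A sketch of the budget example: if each agency publishes a machine-readable definition, an aggregator can fold them into one discoverable catalog with no coordination beyond the shared format. The agency names, hosts and paths below are invented.

```python
# Sketch: two agencies publish machine-readable definitions; an aggregator
# merges them into one budget catalog. Hosts and paths are invented examples.
agency_specs = {
    "commerce": {"host": "api.commerce.example.gov",
                 "paths": {"/budget": {"get": {"summary": "Commerce budget"}}}},
    "interior": {"host": "api.interior.example.gov",
                 "paths": {"/budget": {"get": {"summary": "Interior budget"}}}},
}

catalog = {}
for agency, spec in agency_specs.items():
    for path in spec["paths"]:
        # One entry per agency endpoint, addressable from a single site.
        catalog[f"{agency}{path}"] = spec["host"] + path

print(sorted(catalog))  # ['commerce/budget', 'interior/budget']
```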

Mashape, API Hub

A new breed of API directories has emerged. API portals like Mashape and API Hub don’t just provide a directory of APIs; they simplify API management for providers and integration for API consumers. Federal agencies need to make their APIs friendly to these API hubs, maintaining active profiles on the platforms, keeping each API listed within the directories, and actively engaging consumers there. Federal agencies shouldn’t depend on developers coming to their /developer areas to engage with their APIs; agencies need to reach out to where developers are already actively using APIs.

Consistent API interface definition models

Within the federal government, each API is born within its own agency’s silo. Very little sharing of interface designs and data models happens across agencies, resulting in APIs that may do the same thing, but potentially do it in radically different ways. Common APIs, such as budget or facility directories, should use a common API interface design and underlying data model. Agencies need to share interface designs, and work together to make sure the best patterns across the federal government are used.

Webhooks
In the federal government, APIs are often a one-way street that allows developers to come and pull information. To increase the value of data and other API-driven resources, and to help reduce the load on agency servers, APIs need to push data out to consumers, reducing polling and making API integration much more real-time. Technologies such as webhooks, which allow an API consumer to provide a web URL to which agencies can push newly published data, changes and other real-time events, are being widely used to make APIs much more of a two-way street.
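
The webhook flow is easy to sketch. Here delivery is simulated with an in-memory list standing in for the HTTP POST an agency would actually make to each registered URL; the callback URL and event payload are invented for illustration.

```python
# In-memory sketch of the webhook flow: consumers register a URL, the agency
# pushes events instead of waiting to be polled.
subscribers = []   # registered callback URLs
deliveries = []    # what each URL would receive

def subscribe(callback_url: str):
    """A consumer tells the agency where to push new events."""
    subscribers.append(callback_url)

def publish(event: dict):
    """On new data, push to every subscriber rather than waiting to be polled."""
    for url in subscribers:
        # Stand-in for an HTTP POST such as requests.post(url, json=event).
        deliveries.append((url, event))

subscribe("https://example.com/hooks/new-data")
publish({"dataset": "campsites", "action": "updated"})
print(deliveries)
```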

Hypermedia
As the world of web APIs evolves, new approaches are emerging for delivering the next generation of APIs, and hypermedia is one of these trends. Hypermedia brings a wealth of value, but most importantly it provides a common framework for APIs to communicate, providing essential business logic and direction for developers, helping them use APIs in line with API provider goals. Hypermedia has the potential to not just make government assets and resources available, but to ensure they are used in the best interest of each agency. Hypermedia is still getting traction in the private sector, but we are also seeing a handful of government groups take notice. Hypermedia holds a lot of potential for federal agencies, and the groundwork and education around this trend needs to begin.
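
A sketch of what a hypermedia-style response looks like in practice, using HAL-style `_links` with invented relations and URLs: the response carries not just the data, but what the developer can do next, on the provider's terms.

```python
# Sketch of a hypermedia (HAL-style) response: alongside the data, the API
# tells the developer what actions are available. Links are invented examples.
response = {
    "id": "noaa-tides-2014",
    "title": "Tide Predictions",
    "_links": {
        "self":     {"href": "/datasets/noaa-tides-2014"},
        "download": {"href": "/datasets/noaa-tides-2014.csv"},
        "license":  {"href": "/datasets/noaa-tides-2014/license"},
    },
}

# A client follows links by relation name, not hard-coded URLs, so the
# provider can evolve its URL structure without breaking consumers.
print(response["_links"]["download"]["href"])
```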

Evangelists present

The first thing you notice when you engage with a government API is that nobody is home. There is nobody to help you understand how it works, or to help you overcome obstacles when integrating. There is no face to the blog posts, the tweets or the forum replies. Federal government APIs have no personality. Evangelists are desperately needed to bring this human element to federal government APIs. All successful private sector APIs have an evangelist, or an army of evangelists, spreading the word, supporting developers, and making things work. We need open data and API evangelists at every federal agency, letting us know someone is home.

Read / Write

I know this is a scary one for government, but as I said above in the webhooks section—APIs need to be a two-way street. There are proven ways to make APIs writeable without jeopardizing the integrity of API data. Allow for trusted access, and let developers prove themselves. There is a lot of expertise, “citizen cycles,” and value available in the developer ecosystem. When a private sector company uses federal government data and improves on it, the government and the rest of the ecosystem should be able to benefit as well. The federal government needs to allow for both read and write on APIs—this will be the critical next step that makes government APIs a success.

These are just 14 of my recommendations for the next steps in the API strategy for the federal government. As I said earlier, none of this should be done without first strengthening what has already been done in the federal government around open data and APIs. However, even though we need to play catch-up on what’s already there, we can’t stop looking towards the future and understanding what needs to come next.

None of these recommendations are bleeding edge, or technology just for technology’s sake. This is about standardizing how APIs are designed, deployed and managed across the federal government, emulating what is already proven to work in the private sector. If the federal government wants to add to the Open Data 500, and establish the meaningful stories needed to deliver on the promise of open data and APIs, this is what’s needed.

With the right leadership, education and evangelism, open data and APIs can become part of the DNA of our federal government. We have to put aside a purely techno-solutionist view and realize this is seriously hard work, with many pitfalls and challenges, and that in reality it won’t happen overnight.

However, if we dedicate the resources needed, we can not just change how government works by making it machine readable by default; we can forever alter how the private and public sectors work together.