‘Startup.civ’ is a regular GovFresh feature highlighting the startups powering the civic movement.
Give us the 140-character elevator pitch.
StreetCred helps law enforcement agencies locate fugitives, get them out of the community, and bring the officers home safely each day.
Although millions of websites around the world are powered by low- and no-cost open source content management systems, nearly all small city governments remain trapped in the 90s.
It’s not that they don’t want great websites to serve their citizens. They just don’t have the technical prowess to understand what their options are and how to deploy and manage them.
Since last October the U.S. media, in full orgasmic throng, has been barking madly over the fate of the Healthcare.gov rollout. There has been overwhelming and obdurate polarization around positions on issues that would, in other arenas, be viewed through the objective lens of what most agree are facts.
Tomorrow, we’ll announce the winners of the 2013 GovFresh Awards, and I’m really excited, but the process didn’t come without some hiccups. I want to share them with the community, along with what I learned and what I did along the way, and I’d like your ideas on how we can make it better.
My objective behind the GovFresh Awards was to open the process up globally, create wider awareness among journalists interested in covering government innovation, and inspire other government leaders and entrepreneurs.
For those of us inside the bubble, we see the same people, places and technologies over and over again. An open awards process allows the community to see what else is happening beyond just the noise.
My favorite example of this is the City of Piqua, who is doing some amazing work around citizen engagement and is a model for other small (and big) cities.
Here’s what I’ve learned:
Doing awards the right way is hard. Most government awards (all?) follow a very closed process. Nominations are submitted privately and “judged” by a panel of unnamed, unknown (or no) judges, with agendas that focus on creating visibility for certain public sector employees or vendors who have paid to increase their standing or credibility (or the perception of it) within the community. “Winners” usually fit into a nice, strategic package that conveniently brings both together in a way that makes me very uncomfortable.
I don’t ever want the GovFresh Awards to be that. I don’t ever want GovFresh to be that.
Communicate, communicate. I was slow to communicate updates and the reasons things were delayed. That’s all on me, and much of it will be resolved next year with these lessons learned and feedback from the community.
Start earlier. I made the mistake of rolling out the awards right before the holidays. In the back of my head I knew better, but I just wanted to get something launched, because I felt like it was important to just get something out the door. The initial participation was low, and I decided to extend the deadline. I should have done that earlier and explained why so that there wasn’t any confusion.
Refined categories. I’m not convinced the categories were perfect, and I think they could be better refined. For example, when it comes to publicly nominating people (especially public servants and mayors), there are political, PR and humility issues that cause most to not want to get involved in a public submissions process.
In the case of ‘City of the Year,’ I added a new category (‘Small city of the year’), because I felt it was important to separate the two and give visibility to the work of smaller cities that often get overshadowed by what’s happening in major metro areas. Big thanks to Piqua city employees for working with me on refining that.
Longer submission time. I gave one week for people to submit nominations. In reality, it should be more like three weeks. My reasoning for keeping the process short was to limit what is typically a hype cycle for awards programs and the organizations holding them. The reality is people need more time than one week.
No popularity contest. Having judges select from a list of top vote-getters potentially eliminates entrants that may not have the bandwidth or network to garner enough votes to secure a nomination. While I think encouraging collaboration, participation and support is important, it should by no means be a gauge for determining who judges can select from.
More judges. The judges who took their time to review submissions were amazing. Most of them I’d never met or talked with before, but they still graciously gave their time and were honest and proactive about conflicts during the voting process. While I’m not a fan of big committees, next time around we’ll have more judges, because as diverse as the panel was, it could still represent a wider demographic.
Longer judging time. As with nominations, judges need more time. There also needs to be more time between the judging deadline and announcement to give someone like me time to review and prep an announcement.
This is an amazing community. Some in the community pinged me about the status and were understanding of the situation once I told them about the hiccups. For those unfamiliar with the intricacies of how GovFresh operates, it is run solely by me in my spare time. When you run into personal issues like a long jury duty selection process or a death in the family, administering awards becomes a secondary priority. Many thanks to those of you who understood this as well as the importance of doing an awards program right as opposed to just doing it for the sake of doing it.
Thanks everyone for participating in the 2013 GovFresh Awards. Let’s make 2014 even better.
‘Startup.civ’ is a regular GovFresh feature highlighting the startups powering the civic movement.
Give us the 140-character elevator pitch.
SmartProcure is a government purchasing database that helps agencies improve purchasing decisions and vendors win more government business.
What problem does SmartProcure solve for government?
There are more than 89,000 government agencies across the United States, and virtually all of them lack access to the data they need to make the best purchasing decisions. The vast majority of them have their own individualized systems to store purchasing data and are, due to data silos, disconnected from other government agencies. The end result of these data silos is that a majority of government purchases are made at a higher price than the best available rate.
The yearly amount spent by 89,000+ U.S. governmental agencies is as high as $7 trillion. Combined, U.S. government municipalities represent the largest purchaser in the world. But despite this large amount spent on procurement, the information shared among agencies is minimal. Without transparent data and an information platform to connect government agencies, these purchasing data silos will endure, along with a significant waste of time, money, and resources for every purchase. Government purchasing agents lack the national data to find the best value and must go with the best known options.
Furthermore, contractors who sell to the government lack the same data as the government. There is no national database that can tell them which government agencies have bought what they sell; they must rely on RFP and bid services to find out about opportunities. However, more than 80% of all government purchasing happens without a bid or RFP.
SmartProcure solves this disconnect with a database of government purchasing history. Now, empowered by a searchable database of purchasing information from across the nation, government agencies are able to consistently get the best value. They can use the information to instantly see all data for every purchase of any product, identify who sells that product, and find the best pricing. Government contractors can use SmartProcure’s database to locate all government agencies that buy what they sell, and to see what their competitors are doing.
What’s the story behind starting SmartProcure?
SmartProcure was founded by Jeff Rubenstein. Jeff has been active in government procurement and public safety for more than 20 years. Two years ago, he had an epiphany when he noticed that two agencies in the same city bought the same item at vastly different prices. He knew that this problem could be solved with a proper database, but no such database existed. In 2011, SmartProcure was launched, and since then thousands of government agencies and contractors have joined us.
What are its key features?
SmartProcure’s data system is built upon a growing dataset of more than 64 million purchase orders at the local, state and federal levels, a figure that is current as of this writing and grows each month. Each purchase order is fully indexed, including vendor data, agency contacts, line item descriptions, quantities and pricing.
Users can search by product, service, line item description, quantities, vendor, agency, location and dozens of other purchase order variables. An agency can search for the lowest priced vendors, locate contract piggy-backing opportunities and connect with other agencies that have particular experience with a product, service, or vendor. Contractors can search for every government agency that buys their specific products & services, as well as look into the sales activity (and pricing strategies) of competitors.
In short, you can think of SmartProcure as the Google of government purchasing data.
What are the costs, pricing plans?
SmartProcure is free to any government agency that shares their purchasing data. Government contractors pay an annual subscription for access to the data.
How can those interested connect with you?
- Website: www.smartprocure.us
- Twitter: @smartprocureus
- Email: firstname.lastname@example.org
- Phone: (954) 420-9900
Have a civic startup? Here’s how you get featured.
I was asked to provide some thoughts on what is next for the U.S. government’s application programming interface strategy. I’ve put a lot of thought into it during my work and travels in the couple months since I left Washington, D.C., and I keep coming back to one thought: strengthen what we have.
I wish I had some new technology or platform for the next wave of government APIs that would ensure success with APIs in Washington, but in reality we need to do what we’ve been doing, do it at scale, and get organized and collaborative about how we do it.
Release more data sets
There are thousands of data sets available via data.gov currently, across 176 agencies and numerous categories. We need more. When any content or data is published via a government website, that data needs to also be made available via agencies’ data repositories and data.gov. Agencies need to understand that releasing open data sets is not something you do every once in a while to meet a mandate or deadline–it is something to do always, forever.
Refine existing data sets
There is a lot of data available currently. However, much of it is in various formats, inconsistent data models and isn’t always immediately available for use in spreadsheets, applications and for analysis. There is a great deal of work to be done in cleaning, normalizing and refining of existing data, as well as deploying APIs around open data that would increase adoption and the chances it will be put to use.
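The cleaning and normalizing described above can be sketched in a few lines. This is a minimal, hypothetical example: the column names, date format and `normalize_rows` helper are all invented for illustration, not drawn from any actual agency data set.

```python
import csv
import io
from datetime import datetime

def normalize_rows(raw_csv, date_fields=("award_date",)):
    """Normalize a CSV export: snake_case headers, ISO 8601 dates,
    and stripped whitespace, so rows load cleanly into spreadsheets,
    applications and APIs."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    rows = []
    for row in reader:
        clean = {}
        for key, value in row.items():
            name = key.strip().lower().replace(" ", "_")
            value = (value or "").strip()
            if name in date_fields and value:
                # Convert a U.S.-style date (MM/DD/YYYY) to ISO 8601.
                value = datetime.strptime(value, "%m/%d/%Y").date().isoformat()
            clean[name] = value
        rows.append(clean)
    return rows

raw = "Agency Name,Award Date\n GSA ,01/15/2013\n"
print(normalize_rows(raw))  # [{'agency_name': 'GSA', 'award_date': '2013-01-15'}]
```

Even this small amount of consistency (predictable field names, one date format) is what makes a data set usable in analysis tools without per-file fixes.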
Refine existing APIs
Like open data, there are many existing APIs across the federal government, and these APIs could use a lot of work to make them more usable by developers. With a little elbow grease, existing APIs could be standardized by generating common API definitions like Swagger, API Blueprint and RAML, which would not only help quantify all APIs, but also generate interactive documentation and code samples, and provide valuable discovery tools for understanding where interfaces are and what they offer. The mission for agencies up until now has been to deploy APIs, and while that remains true, evolving and refining those APIs will go a long way towards building the case studies needed to get to the next level.
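To make the idea of a machine-readable API definition concrete, here is a sketch of a minimal Swagger-style description, built as a Python dictionary and serialized to JSON. The agency, endpoint and parameters are all hypothetical; the structure loosely follows the Swagger format named above.

```python
import json

# A minimal, hypothetical Swagger-style definition for an imagined
# agency "facilities" API. The agency and endpoint are invented.
definition = {
    "swagger": "2.0",
    "info": {"title": "Example Agency Facilities API", "version": "1.0"},
    "basePath": "/api/v1",
    "paths": {
        "/facilities": {
            "get": {
                "summary": "List agency facilities",
                "parameters": [
                    {"name": "state", "in": "query", "type": "string"}
                ],
                "responses": {"200": {"description": "A list of facilities"}},
            }
        }
    },
}

# A machine-readable definition like this can drive interactive docs,
# generated code samples and discovery tools.
print(json.dumps(definition, indent=2))
```

Because the definition is plain data, the same file can feed documentation generators, client SDK builders and API directories without any custom work per consumer.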
Robust /developer areas
There are over 50 agencies that have successfully launched a /developer area to support open data and API efforts. Much like the open data and the APIs themselves, they represent a mishmash of approaches, and provide varied amounts of resources and necessary support materials. HowTo.gov already provides some great information on how to evolve an agency’s developer area; we just need some serious attention spent on helping each agency make it so. It doesn’t matter how valuable the open data or APIs are: if they are published without proper documentation, support and communication resources, they won’t be successful. Robust developer areas are essential to federal agencies finding success in their API initiatives.
Hire evangelists

Every successful API initiative in the private sector, from Amazon to Twilio, has employed evangelists to spread the word and engage developers, helping them find success in putting API resources to use. Each federal agency needs its own evangelist to work with internal stakeholders, making sure open data is published regularly, new APIs are deployed and existing resources are kept operational and up to date. Evangelists should have counterparts at the OSTP, OMB and GSA levels providing feedback and guidance, as well as regular engagement with evangelists in the private sector. Evangelism is the glue that will hold things together across an agency, as well as provide the critical outreach to the private sector that increases adoption of government open data and APIs.
Build public-private sector partnerships
Opening up data and APIs by the federal government is about sharing the load with the private sector and the public at large. Open data and APIs represent the resources the private sector will need to build applications and sites, and to fuel industry growth and job creation. A new type of public-private sector partnership needs to be defined, allowing companies and non-profit groups to access and use government services and resources in a self-service, scalable way. Companies should be able to build businesses around government Internet services, much like the ecosystem that has grown from the IRS e-File system, with applications like TurboTax that reach millions of U.S. citizens and allow companies to help government share the load while also generating necessary revenue.
Establish meaningful case studies
When it comes to open data and APIs, nothing gets people on board more than solid examples of the impact that open data, APIs and the applications built on them have made in government. Open government proponents use weather data and GPS as solid examples of open data and technology impacting not just government, but also the private sector. We need to fold the IRS e-file ecosystem into this lineage, but also work towards establishing numerous other case studies we can showcase and tell stories about why open data and APIs are important–in ways that truly matter to everyone, not just tech folks.
Educate and tell stories within government
In order to take open data and APIs to the next level in government, there needs to be an organized and massive effort to educate people within government about the opportunities around open data and APIs, as well as the pitfalls.
Regular efforts to educate people within government about the technology, business and politics of APIs needs to be scaled, weaving in relevant stories and case studies as they emerge around open data and APIs. Without regular, consistent education efforts and sharing of success stories across agencies, open data and APIs will never become part of our federal government DNA.
Inspire and tell stories outside government
As within government, the stories around government open data and APIs need to be told outside the Washington echo chamber, educating citizens and companies about the availability of open data and APIs, and inspiring them to take action by sharing successful examples of open data and APIs put to use in applications.
Just as the potential of popular platforms like Amazon and Twitter spread through word of mouth among developer and power user communities, the same path needs to be taken with government data and API resources.
The next 2-3 years of the API strategy for the U.S. government will be about good old-fashioned hard work, collaboration and storytelling. We have blueprints for what agencies should be doing when it comes to opening up data, deploying APIs and enticing the private sector to innovate around government data; we just need to repeat and scale until we reach the next level.
How do we know when we’ve reached the next level? When the potential of APIs is understood across all agencies, and the public instinctively knows that they can go to any government /developer domain and find the resources they need, whether they are an individual or a company looking to develop a business around government services.
The only way we will get there is by building a solid group of strong case studies of making change in government with open data and APIs. Think of the IRS e-file system, how many citizens that ecosystem reaches, and the value generated through commercial partnerships with tax professionals. We need 10-25 similar stories of how APIs have impacted people’s lives, strengthened the economy and made government more efficient before we can consider ourselves at the next level.
Even with all this housekeeping to do, what should be next for Data.gov?
Data.gov has continued to evolve, adding data sets, agencies and features. With recent, high profile stumbles like Healthcare.gov, it can be easy to fall prey to historical stereotypes that government can’t deliver tech very well. While this may apply in some cases, I think we can get behind the movement that is occurring at Data.gov, where 176 agencies have worked to add 54,723 data sets in the last 12 months.
I feel pretty strongly that before we look towards the future of what the roadmap looks like for Data.gov, we need to spend a great deal of time refining and strengthening what we currently have available at Data.gov and across the numerous government agency developer areas. Even with these beliefs, I can’t help but think about what is needed for the next couple years of Data.gov.
Maybe I’m biased, but I think the next step for Data.gov is to set its sights purely on the API. How do we continue evolving Data.gov, and prepare to not just participate, but lead in the growing API economy?
Management tools for agencies
We need to invest in API management tools for agencies, both from commercial providers and as umbrella, government-wide solutions. Agencies need to be able to focus on the best quality data sets and API designs, and not have to worry about the more mundane necessities of API management like access, security, documentation and portal management. Agencies should have consistent analytics, helping them understand how their resources are being accessed and put to use. If OSTP, OMB, GSA and the public expect consistent results from agencies when it comes to open data and APIs, we need to make sure agencies have the right management tools.
Endpoint design tools for data sets
Agencies should be able to go from data set to API without much additional work. Tools should be made available for easily mounting published datasets, then allowing non-developers to design API endpoints for easily querying, filtering, accessing and transforming them. While downloading the data will still be the preferred path for many developers, making high value datasets available via APIs will increase the number of people who can access them, including those who may not have the time to deal with the overhead of downloads.
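The kind of query endpoint such tools could generate can be sketched with a simple filter over a data set. The rows, field names and `query_endpoint` function below are invented for illustration; a real tool would serve this over HTTP.

```python
from urllib.parse import parse_qs

# A tiny, hypothetical published data set.
DATASET = [
    {"agency": "GSA", "state": "DC", "amount": 1200},
    {"agency": "DOI", "state": "CO", "amount": 800},
]

def query_endpoint(query_string):
    """Return data set rows matching every field=value pair in the
    query string, emulating a generated filter endpoint like
    /api/spending?state=DC."""
    filters = {k: v[0] for k, v in parse_qs(query_string).items()}
    return [
        row for row in DATASET
        if all(str(row.get(field)) == value for field, value in filters.items())
    ]

print(query_endpoint("state=DC"))  # [{'agency': 'GSA', 'state': 'DC', 'amount': 1200}]
```

The point is that once a data set is mounted, query, filter and transform endpoints can be configured rather than hand-coded, which is what puts them within reach of non-developers.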
Common portal building blocks
When you look through each of the 50+ federal agency /developer areas, you see 50+ different approaches to delivering the portals that developers will depend on to learn about and integrate with each agency’s APIs. A common set of building blocks is needed to help agencies standardize how they deliver their developer portals. Each approach might make sense within its agency, but for a consumer trying to work across agencies, it can be a confusing maze of API interactions.
Standard developer registration
As a developer, I currently need to establish a separate relationship with each federal agency. This quickly becomes a barrier to entry, one that will run off even the most seasoned developers. We want to incentivize developers to use as many federal APIs as possible; providing a single point of registration and a common credential that works across agencies will stimulate integration and adoption.
Standard application keys
To accompany standard developer registration, a standard approach to user and application keys is needed across federal agencies. As a user, I should be able to create a single application definition and receive API keys that work across agencies. The work required to manage a separate key per agency will otherwise prevent developers from adopting multiple federal agency APIs. Single registration and application keys will reduce the barriers to entry for the average developer looking to build on top of federal API resources.
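One way such a cross-agency key could work is sketched below: a central registry signs an application id, and any agency can verify the resulting key locally. This is purely illustrative; the scheme, names and secret are assumptions, not an existing federal system.

```python
import hashlib
import hmac

# Hypothetical government-wide signing secret, shared with agencies.
SHARED_SECRET = b"rotate-me-regularly"

def issue_key(app_id):
    """Central registry issues one API key usable at every agency."""
    signature = hmac.new(SHARED_SECRET, app_id.encode(), hashlib.sha256).hexdigest()
    return f"{app_id}.{signature}"

def verify_key(api_key):
    """Any agency can check a key without contacting the registry."""
    app_id, _, signature = api_key.partition(".")
    expected = hmac.new(SHARED_SECRET, app_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

key = issue_key("civic-app-42")
print(verify_key(key))          # a valid key passes at every agency
print(verify_key("bad.key"))    # a forged key fails
```

A design like this keeps verification decentralized: agencies don’t need to share a user database, only the means to check that a credential was issued by the common registry.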
Developer tooling and analytics
When integrating with private sector APIs, developers enjoy a wide range of tools and analytics that assist them in discovering, integrating, managing and monitoring their applications’ integration with APIs. This is something that is very rare in integration with federal government APIs. Standard tooling and analytics for developers need to become part of the standard operating procedures for federal agency /developer initiatives, helping developers be successful in all aspects of their usage of government open data and APIs.
Swagger, API Blueprint, RAML
All APIs produced in government should be described using one of the common API definition formats that have emerged, like Swagger, API Blueprint and RAML. These all provide a machine-readable description of an API and its interface that can be used for discovery, interactive documentation, code libraries and SDKs, and many other purposes. Many private sector companies are doing this, and the federal government should follow their lead.
Discoverability, portable interfaces and machine readable by default
As with open data, APIs need to be discoverable, portable and machine readable by default. Describing APIs in Swagger, API Blueprint and RAML accomplishes this, allowing APIs to be presented, distributed and reused in new ways. Each agency can publish its own APIs, while aggregators take the machine-readable definitions from each and publish them in a single location. This approach allows for meaningful interactions: a single budget API site could provide access to every federal agency’s budget without the need to visit each /developer area, and there are many more examples like this that would increase API usage and extend the value of government APIs.
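The budget aggregation idea can be sketched directly: given machine-readable definitions published by several agencies, a catalog can collect every agency exposing a given endpoint. The agency domains, definitions and `aggregate` helper here are invented for illustration.

```python
# Hypothetical machine-readable definitions published by two agencies.
agency_definitions = {
    "treasury.gov": {"paths": {"/budget": {}, "/debt": {}}},
    "energy.gov": {"paths": {"/budget": {}, "/labs": {}}},
}

def aggregate(definitions, path="/budget"):
    """Build a single listing of every agency that exposes a given
    endpoint, the way a one-stop budget API site could."""
    return [
        f"https://{agency}/api{path}"
        for agency, definition in sorted(definitions.items())
        if path in definition["paths"]
    ]

print(aggregate(agency_definitions))
# ['https://energy.gov/api/budget', 'https://treasury.gov/api/budget']
```

Because the definitions are machine readable, the aggregator needs no custom integration per agency; it just reads each published definition and merges the results.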
Mashape, API Hub
A new breed of API directories has emerged. API portals like Mashape and API Hub don’t just provide a directory of APIs; they simplify API management for providers and integration for API consumers. Federal agencies need to make their APIs friendly to these API hubs, maintaining active profiles on all platforms, keeping each API listed within the directories and actively engaging consumers via the platforms. Federal agencies shouldn’t depend on developers coming to their /developer areas to engage with their APIs; agencies need to reach out to where developers are already actively using APIs.
Consistent API interface definition models
Within the federal government, each API is born within its own agency’s silo. Interface designs and data models are rarely shared across agencies, resulting in APIs that may do the same thing in radically different ways. Common APIs, such as budget or facility directories, should use a common interface design and underlying data model. Agencies need to share interface designs and work together to make sure the best patterns across the federal government are reused.
Webhooks

In the federal government, APIs are often a one-way street that lets developers come and pull information. To increase the value of data and other API-driven resources, and to help reduce the load on agency servers, APIs need to push data out to consumers, reducing polling and making integration much more real-time. Technologies such as webhooks, which let an API consumer provide a web URL to which agencies can push newly published data, changes and other real-time events, are widely used to make APIs much more of a two-way street.
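The push pattern described above can be sketched in a few lines. The subscriber URL and event payload are invented, and the transport is injected so the sketch runs without a network; in practice the sender would POST JSON to each registered URL.

```python
# A minimal webhook registry, with hypothetical subscriber URLs.
subscribers = []

def register_webhook(url):
    """An API consumer provides a URL to receive pushed updates."""
    subscribers.append(url)

def publish(event, send=lambda url, payload: print(f"POST {url}: {payload}")):
    """Push a newly published record to every subscriber, instead of
    waiting for each of them to poll the API."""
    for url in subscribers:
        send(url, event)

register_webhook("https://example.org/hooks/new-data")
publish({"dataset": "spending-2013", "action": "published"})
```

The payoff is on both sides: consumers get changes as they happen, and the agency serves one push per subscriber instead of absorbing constant polling traffic.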
Hypermedia

As the world of web APIs evolves, new approaches are emerging for delivering the next generation of APIs, and hypermedia is one of these trends. Hypermedia brings a wealth of value, but most importantly it provides a common framework for APIs to communicate, supplying essential business logic and direction for developers and helping them use APIs in line with API providers’ goals. Hypermedia has the potential not just to make government assets and resources available, but to ensure they are used in the best interest of each agency. Hypermedia is still gaining traction in the private sector, but a handful of government groups are also taking notice. Hypermedia holds a lot of potential for federal agencies, and the groundwork and education around this trend in APIs needs to begin.
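A hypermedia response embeds links alongside the data that tell the client what it can do next, so the provider’s intended uses travel with the payload. The sketch below follows the HAL-style `_links` convention; the resource, agency and URLs are invented.

```python
import json

def facility_resource(facility_id):
    """A hypothetical hypermedia (HAL-style) API response: the data
    plus links describing the actions the provider allows next."""
    return {
        "id": facility_id,
        "name": "Example Federal Building",
        "_links": {
            "self": {"href": f"/facilities/{facility_id}"},
            "agency": {"href": "/agencies/gsa"},
            # A link like this would only be advertised when the action
            # is permitted, steering clients toward intended uses.
            "schedule-tour": {"href": f"/facilities/{facility_id}/tours"},
        },
    }

print(json.dumps(facility_resource("f-100"), indent=2))
```

Because clients follow the links rather than hard-coding URLs, the agency can change or withdraw actions without breaking every integration, which is the business-logic control the paragraph above describes.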
Evangelists

The first thing you notice when you engage with a government API is that nobody is home. There is nobody to help you understand how it works or overcome obstacles when integrating. There is no face behind the blog posts, the tweets or the forum replies. Federal government APIs have no personality. Evangelists are desperately needed to bring this human element to federal government APIs. All successful private sector APIs have an evangelist, or an army of evangelists, spreading the word, supporting developers and making things work. We need open data and API evangelists at every federal agency, letting us know someone is home.
Read / Write
I know this is a scary one for government, but as I said in the webhooks section, APIs need to be a two-way street. There are proven ways to make APIs writeable without jeopardizing the integrity of API data. Allow for trusted access, and let developers prove themselves. There is a lot of expertise, “citizen cycles,” and value available in the developer ecosystem. When a private sector company uses federal government data and improves on it, the government and the rest of the ecosystem should be able to benefit as well. The federal government needs to allow both read and write on its APIs; this will be the critical next step that makes government APIs a success.
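Trusted write access can be sketched as a simple tier check with an audit trail: every key can read, but writes are only accepted from developers who have earned a trusted tier, and each write is logged for review. The tiers, keys and `handle_request` helper are hypothetical.

```python
# Hypothetical key tiers: all keys read, only proven developers write.
TIERS = {"public-key": "read", "proven-partner-key": "trusted"}
audit_log = []

def handle_request(api_key, method, record=None):
    """Accept reads from any known key; accept writes only from
    trusted keys, and log every write so it can be reviewed."""
    tier = TIERS.get(api_key, "none")
    if method == "GET":
        return "ok" if tier != "none" else "denied"
    if method == "POST":
        if tier != "trusted":
            return "denied"
        audit_log.append((api_key, record))  # every write is reviewable
        return "accepted"
    return "denied"

print(handle_request("public-key", "POST", {"fix": "corrected address"}))        # denied
print(handle_request("proven-partner-key", "POST", {"fix": "corrected address"}))  # accepted
```

Gating writes by earned trust, plus a reviewable log, is what lets an agency accept improvements from the ecosystem without putting the integrity of its data at risk.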
These are just 14 of my recommendations for the next steps in the API strategy for the federal government. As I said earlier, none of this should be done without first strengthening what has already been done in the federal government around open data and APIs. However, even as we play catch-up on what’s already there, we can’t stop looking towards the future and trying to understand what needs to come next.
None of these recommendations are bleeding edge or technology just for technology’s sake. This is about standardizing how APIs are designed, deployed and managed across the federal government, emulating what is already proven to work in the private sector. If the federal government wants to add to the OpenData500, and establish the meaningful stories needed to deliver on the promise of open data and APIs, this is what’s needed.
With the right leadership, education and evangelism, open data and APIs can become part of the DNA of our federal government. We have to put aside a purely techno-solutionist view and realize this is seriously hard work, with many pitfalls and challenges, and that in reality it won’t happen overnight.
However, if we dedicate the resources needed, we can not just change how government works, making it machine readable by default; we can forever alter how the private and public sectors work together.