Month: August 2014

U.S. Energy Department seeks chief data officer

The U.S. Department of Energy is looking for a chief data officer.

Salary is $120,749 to $181,500.

From the job description:

The Chief Data Officer is located within the Office of the Associate CIO for Technology and Innovation. This position is responsible for:

  • Leading the Department’s Open Data effort to manage data as an asset and make it available, discoverable, and usable to strengthen the Department’s effectiveness in carrying out its mission.
  • Orchestrating agency-wide policy proposals and adherence to a formal data governance framework and data life cycle practices to unlock the potential of government data.
  • Representing DOE at working group forums established by the Deputy Director for Management at OMB, the Federal Chief Information Officer, and the Federal Chief Technology Officer, focused on data transparency, accountability, policy and directive participation, and collaboration.
  • Ensuring agency-wide data efforts adhere to the Privacy Act, including full analysis of privacy, confidentiality, and security issues.
  • Ensuring DOE meets all OMB federal policy and directive Open Data/Open Gov deadlines.
  • Evolving agency behaviors and new cultural norms toward collaboration to meet policy needs.
  • Performing other duties as assigned.

Application deadline is September 12. Apply here.

Got natural disasters? There’s an open source emergency preparedness toolkit for that

City72

Source: toolkit.sf72.org

If you live in the San Francisco Bay Area and woke up to a 6.1 earthquake at 3:30 a.m. this morning, you already know why preparedness matters. Now would be a good time for citizens and local governments everywhere to take a look at the City72 Toolkit.

The San Francisco Department of Emergency Management recently partnered with design firm IDEO to create the City72 Toolkit, an open source “emergency preparedness platform that promotes community resilience and connection.”

The toolkit is now freely available for cities everywhere to re-purpose and customize to provide information to their own residents.

SFDEM’s Kristin Hogan Schildwachter shares the inspiration for City72 and how other cities can easily create their own.

What is City72 Toolkit?

City72 is an open source emergency preparedness platform that promotes community resilience and connection. This toolkit is designed specifically for emergency preparedness organizations and provides the information and resources to create a customized City72 site for any city or region.

It includes:

  • how to create localized content;
  • access to the code to build and install your City72 website; and
  • tips for how to manage and promote your site.

How did it come about?

Until 2009, in San Francisco we were following the prescriptive “Make a Plan. Get a Kit. Be Informed.” emergency preparedness messages, which we modeled after FEMA’s national preparedness campaign “Ready.” And what we found was that we were only reaching a small percentage of the general public: the already prepared.

So, in 2008 our deputy director of emergency services, Rob Dudgeon, kicked off an initiative to redefine how we messaged and packaged emergency preparedness with the mantra “If we keep promoting emergency preparedness this way, we’re only going to get who we’ve already gotten prepared.”

We leveraged a lot of research based on social science data, as well as findings from a major project assessing the state of Bay Area preparedness (the Bay Area Urban Area Security Initiative Community Preparedness Project), to develop a communications strategy that redefined how we messaged emergency preparedness.

This strategy, the DEM Preparedness Movement Communications Strategy, became the basis upon which we communicated about emergency preparedness and informed our in-person and social media communications, but it was not reflected in our emergency preparedness website (at the time): www.72hours.org. We knew we needed to rebrand this website to align with our communications strategy, so we secured some grant funding and issued a request for proposal to redesign www.72hours.org.

IDEO bid on the RFP and, through a competitive process, was selected as the vendor. From there, IDEO and its human-centered design approach helped us manifest our vision, resulting in www.sf72.org.

Meanwhile, we wanted to share our findings, experience and redefinition of preparedness messaging with the emergency management community at large. So, we wrote into the terms of our final deliverable that the website be open source, so any other city could have access to SF72.org’s design and content.

To make this a more tangible possibility, we worked with IDEO to create the City72 toolkit.

How can others use it?

The City72 Toolkit provides cities ready to create their own version of City72.org with step-by-step instructions for setting up their site. It’s recommended to have some technical support from a web developer, whether an internal city resource or a contractor.

Who’s using it and how?

Right now, Johnson County, Kansas is in the process of creating its own version of City72 (to be called JoCo72.org). We have had conversations with other city offices of emergency management about City72, and we are hoping they may be the next generation of City72 sites.

How can others connect with you to learn more?

We would be thrilled to talk with anyone interested in City72.org. We can be reached via email at sf72@sfgov.org and/or Twitter at @sf72org.

Video

City72 Tour from SFDeptEmrgcyMgmt on Vimeo.

Join us at the 2014 Code for America Summit

The 2014 Code for America Summit is set for September 23-24 and registration is now open.

There’s an excellent schedule of events and an incredible line-up of speakers, but the best part about Summit is you get to meet and network with the who’s who of civic innovation and technology.

Use the “GovFresh” promo code when registering and get a 5 percent discount.

Register here.

How you can help build a more agile government

Agile for Gov

Source: agileforgov.org

Earlier this year, I began doing research work with CivicActions on agile development in government — who was doing it, how, and what was needed to deploy it successfully.

After the Healthcare.gov launch mishaps, calls for agile practices as the panacea for all of government’s IT woes reached a high. While agile as the ultimate solution oversimplifies the issue, both professions (software development and public service) have evolved to the point where an iterative approach to operations is clearly the way of the future.

My own formal introduction to agile began with my work at CivicActions, so the research coincided with an introductory immersion into how government is using it. Having been involved with startups for the past 15 years, I’ve found iterative development to be the norm; the added layer of project management processes, however, has forced me to be a better professional overall.

What I’ve found through many discussions and interviews is that you can’t just snap your fingers and execute agile within the framework of government bureaucracy. There are a number of issues — from procurement to project management training to executive-level commitment to organization-wide culture change — that hinder its adoption. For IT, launching a new website or app is the easy part. Changing IT operational processes and culture is often overlooked or avoided, especially by a short-term executive, because those changes reach into the granular organizational challenges most people don’t want to bother with.

After talking with a number of agile practitioners in government and the private sector, it was clear there was enthusiasm around how agile could be applied to fundamentally change the way government works. Beyond execution by professional project managers, everyone I spoke with talked about how deploying agile gives them a stronger sense of public service.

What came from these discussions is the desire to have a stronger community of practitioners and those interested in deploying it to better support one another.

To meet that need, a group of federal, state, local government and private sector professionals have formed Agile Government Leadership, a “community-powered network of agile government professionals.”

Its mission:

By bringing applied agile practices to government, we want to redefine the culture of local, state and federal public sector service delivery across all aspects of government. We will work with agile professionals and organizations to support their work in getting agile infused into government processes. We will foster a spirit of openness and mentor those new to agile so that they have the necessary practical advice, resources, tools and community support for successful deployment. Through Agile Government Leadership, we will create a responsive, engaged government that more efficiently and effectively serves its citizens.

The group has done a ton of behind-the-scenes work and has go-forward plans in place, but also wants your feedback.

To get involved with Agile Government Leadership, join the LinkedIn, Facebook and Google groups, follow on Twitter and visit the website at agilegovleaders.org.

Models for API-driven startups built around public data

I had a conversation with a venture capitalist recently who was looking for information on startups that offer APIs and have built their companies around public data.

The two companies referenced in the original contact email were Eligible and Clever: two similar, yet very different, approaches to aggregating public data into a viable startup (Clever used to aggregate data from school districts, but now just provides login services).

During our conversation, we talked about the world of APIs and open data, both in government and across the private sector. I spent 30 minutes helping them understand the landscape, and told them that when I was done I would generate a list of APIs I thought were interesting and would categorize into a similar space as Eligible and Clever, something that proved much more difficult to quantify than I expected. Nonetheless, I learned a lot and, as I do with all my research, I wanted to share the story of my experience.

I started with the companies that, off the top of my head, had built interesting businesses around publicly available facts and data, a definition that would expand as I continued.

I started with a couple of APIs I know provide some common data sources:

Next, I wanted to look at a couple of the business data APIs I depend on daily, and while I was searching I found a third. What I think is interesting about these business data providers is their very different business models and approaches to gathering and making the data available.

Immediately after looking through Crunchbase and OpenCorporates, I queried my brain for other leading APIs that pull content or data from public sources and develop a business around them. It makes sense to look at the social data and content realm, but this is where I stop. I don’t want to venture too far down the social media rabbit hole, but I think these two providers are important to consider.

While Twitter data isn’t the usual set of business, industry and product facts, it has grown to be the new data defining not just Twitter’s business, but a whole ecosystem of aggregators and other services built on consuming, aggregating and often republishing public raw or enriched social data.

I also wanted to step back and look at Clever again, and think about its pivot from aggregating school data to being a login service. There was another API I was tracking that offered a similar service to Clever’s, aggregating school data, which I think is important to list alongside Clever.

As far as I know, both Clever and LearnSprout are adjusting to find the sweet spot in what they do, but I keep them on the list because of what they did when I was originally introduced to their services. I think we can safely say there will be no shortage of startups to come, following in Clever and LearnSprout’s footsteps, unaware of their predecessors and the challenges they face when aggregating data across school districts.

Healthcare data

After taking another look at Clever, I also took another walk through the Eligible API, and spent time looking for similar data-driven APIs in the healthcare space. I think Eligible is a shining example of what this particular VC was looking for, and a good model for startups looking not just to build a company and API around public data, but to do it in a way that makes a significant impact on an industry.

I know there are more healthcare data platforms out there, but these are a handful with APIs that I track. Healthcare is one of those heavily regulated industries where there is a huge opportunity to aggregate data from multiple public and private sector sources and build an API-driven business on top of it.

Energy data

After healthcare, my mind immediately moved to the world of energy data, because there is a task on my list to study open data licensing as part of a conversation I’m having with Genability. I think the work this latest wave of energy API providers is doing — not just with individual customers’ data, but also with data from power companies and from state and federal sources — is very interesting.

When I was in Maryland this past May, moderating a panel with folks from the Department of Energy, the conversation turned to the value of the Department of Energy’s data to the private sector. I’d put the Department of Energy in the top five agencies when it comes to data that is viable for private sector use and capable of making a significant economic impact.

Libraries

Pushing the boundaries of this definition again, I stumbled onto the concept of launching APIs for libraries, built around public or private collections. While not an exact match to the other APIs in this story, I think what DPLA is doing reflects what we are talking about: building a platform around public and private datasets (collections, in this case).

Just like government agencies, public and private institutions possess an amazing amount of data, content and media that is not readily available online, providing a pretty significant opportunity to build API-driven startups and organizations around these collections.

Scientific data

There is a wealth of valuable scientific data being made available via public APIs from various organizations and institutions. I’m not sure where these groups gather their data, but I’m sure a lot of public funding and public sources are behind some of the APIs I track.

These are just two of the numerous scientific data APIs I keep an eye on, and I agree this is a little outside of exactly what we are looking for; however, I think the opportunity for designing, deploying and managing high-performing, high-value APIs from publicly and privately owned scientific data is pretty huge.

Government data

As I look at these energy and scientific APIs across my monitoring system, I’m presented with other government APIs that are consumer-focused and often have the look and feel of a private sector startup, while also having a significant impact on private sector industries.

While all of these APIs are clearly .gov initiatives, they provide clear value to consumers, and I think there is opportunity for startups to offer complementary, or even competing, services around this government-generated, industry-focused open data, going further than these platforms do.

Quasi-government data

Alongside those very consumer- and industry-oriented government efforts, I can’t help but look at the quasi-government APIs I’m familiar with that provide similar data-driven services to the government ones above.

While these may not be models for startups, I think they provide potential ideas that private sector non-profit groups can take action on. Providing mortgage, energy, environmental, or even healthcare services, developed around public and private sector data, will continue to grow as a viable business model for startups and organizations in coming years.

Watchers of government data

Adding another layer to government data, I have to include the organizations that keep an eye on government: a segment that has evolved operational models for aggregating, generating meaning from, and then republishing data that helps us understand how government is working (or not working).

These are all nonprofit organizations, but when it comes to journalism, politics and other areas, there are viable services that can be offered around, and on top of, the valuable open data being liberated, generated and published by the watchers of our government.

School data again

One interesting model for building a business around government data is GreatSchools. There are some high-value datasets available at the Department of Education as well as the Census Bureau, and using these sources has become a common model for building a company around public data.

I’m not exactly a fan of GreatSchools, but I think it is worth noting. I’ve talked with them, and they don’t really have as open a business model and platform as I would like to see. I feel it is important to “pay it forward” when building a for-profit company around public data. I don’t have a problem with building businesses and generating revenue around public data, but if you don’t contribute to making it more accessible than you found it, I have a problem.

News

After spending time looking through the APIs I monitor, I remembered the use of public data by leading news sources. These papers use data from federal, state and city sources, serving it via APIs right alongside the news.

These news sources don’t make money off the APIs themselves; like software-as-a-service providers, they treat them as a value-add to their core business. Census surveys, congressional voting, economic numbers and other public data are extremely relevant to the everyday news that impacts us.

Been doing this for a while

When we talk about building businesses around publicly available data, there are some companies that have been doing this for a while. The concept really isn’t that new, so I think it is important to bring these legacy providers into the conversation.

Most of these data providers have been doing it for over a decade. They all get the API game and offer a wide range of API services for developers, providing access to data taken directly from, derived from, or enhanced with public sources. When it comes to building a business around public data, I don’t think these four have the exact model I’m looking for, but they offer many lessons in how to do it right, and wrong.

Weather is an age-old model

When you think about it, weather is one of the original areas where services were built around government data. Weather data is a common reference point whenever you hear officials talk about the potential of government data. There are numerous weather API services that are doing very well when it comes to digesting public data and making it relevant to developers.

Weather is the most relevant API resource I know of in the space. Weather impacts everyone, making it a resource all web and mobile applications will need. With growing concern around climate change, this model for using public data to generate valuable APIs will only grow more important.

Time zone data

Right behind weather, I would say that time and date information is something that impacts everyone. Time shapes our world, and government sets the tone of the conversation when it comes to date and time data, something that is driving many API-driven business models.

What I like about time and date APIs is that they provide an essential ingredient in all application development. They are an example of how government can generate and guide data sources, while allowing the private sector to help manage the vital services around this data that developers depend on for building apps.
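As a small illustration of why this data matters to developers: time zone rules ultimately come from government decisions about zone boundaries and daylight saving, collected in the IANA time zone database, and that database is what makes conversions like the following work. A minimal sketch using Python’s standard zoneinfo module; the meeting time and cities are made up for illustration:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, backed by the IANA tz database

# A hypothetical meeting scheduled for 9:30 a.m. Eastern time.
eastern = datetime(2014, 8, 25, 9, 30, tzinfo=ZoneInfo("America/New_York"))

# The same instant expressed for participants in other zones.
pacific = eastern.astimezone(ZoneInfo("America/Los_Angeles"))
london = eastern.astimezone(ZoneInfo("Europe/London"))

print(pacific.hour, london.hour)  # 6 14 (both zones are on summer time in August)
```

The daylight saving rules baked into that lookup are exactly the kind of government-set data that private time and date API providers package and keep current for developers.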

Currency conversion

Along with time zone data, currency conversion is a simple, valuable, API-driven service needed across our economy. You have to know what time it is in different time zones, and what the conversion rate is between different currencies, to do business in the global API economy.

In our increasingly global, online world, currency conversion is only going to grow more important. Workforces will be spread across the globe, and paying employees and buying goods and services will increasingly span borders, requiring seamless currency conversion in all applications.
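The mechanics of conversion are simple enough to sketch in a few lines; the value of these services is in keeping the rates fresh and trustworthy. A minimal sketch in Python, with hard-coded illustrative rates standing in for a live feed (the numbers and the `convert` helper are hypothetical, not any real provider’s API):

```python
# Illustrative rates to USD; a real service would pull these from a live,
# timestamped feed rather than hard-coding them.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.32, "GBP": 1.66, "JPY": 0.0096}

def convert(amount, from_currency, to_currency):
    """Convert by routing through USD as the base currency."""
    usd = amount * RATES_TO_USD[from_currency]
    return usd / RATES_TO_USD[to_currency]

print(round(convert(100, "EUR", "GBP"), 2))  # 79.52
```

Routing through a single base currency keeps the rate table small (one rate per currency instead of one per currency pair), which is a common design choice for services like this.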

Transit data

Another important area of APIs, one increasingly impacting our everyday lives, is transit: APIs providing real-time bus, train, subway and other public transit data to developers.

Transit data will always be a tug of war between the public and private sector. Some data will be generated in each sphere, with some projects incentivized by the government, where the private sector is unwilling to go. Establishing clear models for public and private sector partnerships around transit data will be critical to society functioning.
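Much of that public transit data is published in the GTFS format: a zip archive of plain CSV files (stops.txt, routes.txt, stop_times.txt and so on) that agencies release and developers consume. A sketch of reading stop data with Python’s csv module; the two stops below are made up, not taken from any real feed:

```python
import csv
import io

# A tiny, made-up GTFS stops.txt; real feeds ship these columns (and more)
# inside a zip archive published by the transit agency.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
S1,Market St & 5th St,37.7837,-122.4089
S2,Embarcadero Station,37.7929,-122.3971
"""

# Index stops by their GTFS stop_id for quick lookup.
stops = {row["stop_id"]: row for row in csv.DictReader(io.StringIO(stops_txt))}

print(stops["S2"]["stop_name"])  # Embarcadero Station
```

Because the format is just CSV with agreed-on column names, the same few lines work against any agency’s feed, which is a big part of why an ecosystem of transit apps grew up around it.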

Real estate

While I won’t be covering every example of building a business around public data in this story, I would be remiss if I didn’t talk about the real estate industry, one of the oldest businesses built on public data and facts.

I’m not a big fan of the real estate industry. One of my past startups was built around aggregating MLS data, and I can safely say that real estate is one of the shadiest industries I know of that is built on top of public data. I don’t think this industry is a model we should follow, but again, there are huge lessons to be learned from the space as we move forward building business models around public data.

That is as far as I’m going to go in exploring API-driven businesses built on public data. My goal wasn’t to be comprehensive; I was just looking to answer some questions for myself about who else is playing in the space.

This list of businesses came out of my API monitoring system, so it is somewhat limited in focus: it requires that a company building on top of public data also have an API, which creates quite a blind spot for this research. However, it is a blind spot I’m willing to live with, because I think my view represents the good in the space, and where we should be headed.

Open Data 500

For the next edition of this story, I’d like to look through the 500 companies listed in the Open Data 500 project. I like the focus of the project from GovLab out of New York University.

Their description from the site sums up the project:

The Open Data 500 is the first comprehensive study of U.S. companies that use open government data to generate new business and develop new products and services. Open Data is free, public data that can be used to launch commercial and nonprofit ventures, do research, make data-driven decisions, and solve complex problems.

I see a few of the companies I’ve listed above in the Open Data 500. I’m stoked that they provide both JSON and CSV versions of the Open Data 500, making it much easier to process and make sense of the companies listed. I’d like to make a JavaScript slideshow from the JSON file and browse through the list of companies, adding my own set of tags, helping me separate the good examples from the bad, as well as see where the trends and opportunities are around developing APIs from public data.
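That tagging pass is easy to prototype once you have the JSON file. A sketch of the idea in Python, using a two-company stand-in for the real download (the field names here are illustrative, not the Open Data 500’s actual schema):

```python
import json

# Stand-in for the downloaded Open Data 500 JSON; the real file's schema differs.
companies_json = """[
  {"name": "ExampleHealthCo", "category": "Healthcare"},
  {"name": "ExampleTransitCo", "category": "Transportation"}
]"""

companies = json.loads(companies_json)

# My own tags, keyed off each company's listed category.
my_tags = {"Healthcare": ["eligible-like"], "Transportation": ["transit-data"]}
for company in companies:
    company["tags"] = my_tags.get(company["category"], ["untagged"])

# Filter to the companies carrying a given tag.
print([c["name"] for c in companies if "transit-data" in c["tags"]])  # ['ExampleTransitCo']
```

The tagged list could then be written back out with `json.dump` and fed to a slideshow or any other front end.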

I’m pretty convinced that we have a lot of work to do in making government machine-readable data at the federal, state, county and city level more available before we can fully realize the potential of the API economy.

Without high-quality, real-time, valuable public data, we won’t be able to satisfy the needs of the next wave of web, single-page, mobile and Internet of Things application developers. I’m also hoping we can establish some healthy blueprints for developing private sector businesses and organizations around public data, by reviewing some of the existing startups finding success with this model, and build on or complement this existing work, rather than re-invent the wheel.

Feds didn’t say agile development contributed to Healthcare.gov failure

A recent VentureBeat headline misleadingly suggests agile development practices were the cause of Healthcare.gov’s “failure.”

The article is in reference to a new U.S. Government Accountability Office report released Thursday: “Healthcare.Gov: Ineffective Planning and Oversight Practices Underscore the Need for Improved Contract Management.”

Because there’s been more and more attention given to IT project management methodologies, particularly agile, in the aftermath of the Healthcare.gov launch, and most people don’t read beyond the headlines, it’s important to shed some light on how agile was actually referenced in the report.

The story, “Feds say agile development contributed to Healthcare.gov failure,” makes a two-sentence reference to agile, neither of which suggests agile methodologies were the issue. The issue, as indicated in the article, was how the Centers for Medicare & Medicaid Services and its contractors implemented agile.

From the post:

The CMS allowed the contractors to use an “agile” approach to developing its data hub and website, when the CMS had already admitted that it had little experience developing in that way, the report says.

Here are excerpts of the report’s agile references:

To help manage compressed time frames for FFM and data hub development, CMS program officials adopted an iterative IT development approach called Agile that was new to CMS. Agile development is a modular and iterative approach that calls for producing usable software in small increments, sometimes referred to as sprints, rather than producing a complete product in longer sequential phases.

However, we found that the quality assurance surveillance plans were not used to inform oversight. For example, contracting and program officials, including the COR and contracting officer, were not sure if the quality assurance surveillance plan had been provided as required by the FFM and data hub task orders. Although a copy was found by CMS staff in June 2014, officials said they were not aware that the document had been used to review the quality of the contractor’s work. Instead, CMS program officials said they relied on their personal judgment and experience to determine quality.

The Office of Management and Budget issued guidance in 2010 that advocated the use of shorter delivery time frames for federal IT projects, an approach consistent with Agile. However, CMS program officials acknowledged that when FFM and data hub development began in September 2011, they had limited experience applying an Agile approach to CMS IT projects. In 2011, CMS developed updated guidance to incorporate the Agile IT development approach with its IT governance model, but that model still included sequential reviews and approvals and required deliverables at pre-determined points in the project.

In our July 2012 report, we found a number of challenges associated with introducing Agile in the federal environment. Specifically, we found that it was difficult to ensure that iterative projects could follow a standard, sequential approach and that deviating from traditional procedural guidance to follow Agile methods was a challenge. We also reported that new tools and training may be required, as well as updates to procurement strategies. Therefore, the new approach that CMS selected in order to speed work also carried its own implementation risks.

Despite the revised FFM schedule, it is not clear that CMS held all of the governance reviews for the FFM and data hub or received the approvals required by the life cycle framework. The framework was developed to accommodate multiple development approaches, including Agile. A senior CMS program official said that although the framework was used as a foundation for their work, it was not always followed throughout the development process because it did not align with the modified Agile approach CMS had adopted.

So, GAO didn’t say agile was the cause of Healthcare.gov’s failures. The report merely outlines the bureaucratic challenges of implementing it effectively.

In fact, GAO has been supportive of an agile approach in the past. A 2012 report, “Software Development: Effective Practices and Federal Challenges in Applying Agile Methods,” recommended the CIO Council continue to champion agile and modular practices throughout the federal government.