18F

How to build a lean startup inside government

Leaders from 18F and the White House Presidential Innovation Fellowship program presented at the 2015 Lean Startup Conference on “Lean Methodologies When the Organization is the Product,” and this is the best video you’ll watch for a holistic approach to building a lean startup inside government.

The video features 18F Co-director Hillary Hartley, 18F Talent Director Jennifer Tress, 18F Infrastructure Director Noah Kunin, 18F Designer Nick Brethauer and PIF Director Garren Givens.

Government and the ‘empowered product owner’

The 18F Delivery team released a “Partnership Playbook” that aims to help federal agencies understand what to expect when working with 18F, and the gem within is play number two, “We work with an empowered product owner.”

The product owner will soon evolve into one of the most important roles in government technology, so it’s critical for those leading development teams to understand its application.

Key excerpt:

We work best with an empowered product owner who can make decisions about the project we’re partnering on. In agile development, a product owner is responsible for project scoping and prioritizing. Our delivery team will rely on the product owner for direction as the project develops. This product owner must be empowered to make decisions about the product. The product owner should be experienced at getting buy-in from other organizational leaders; support should be lined up before our engagement.

We look for a product owner who has already lined up internal stakeholder support. Any project will impact a number of internal agency groups and systems, so it will need buy-in and technical integration support from those people. The product owner garners this buy-in and support. Before the engagement starts, the owner should have had conversations with and identified champions in relevant internal groups. Beginning these conversations in the middle of development can grind everything to a halt; they should be well underway by the time a digital service team is brought on board to deliver. We recommend that you map out the relevant stakeholders before embarking on a project.

Having an empowered product owner is crucial to decisions getting made and having a solid product vision. Play number two is required reading for everyone building government digital services.

Why Cloud.gov is a big deal

Source: Cloud.gov

Enabling internal government tech shops to quickly stand up applications in a secure testing environment is fundamental to quick prototyping, and 18F’s new Cloud.gov is a major step in realizing ultimate IT flexibility.

I reached out to GovReady founder Greg Elin who is working on “making FISMA a platform instead of paperwork,” and he replied with the following comments that are better than anything I could say on the subject:

18F’s Cloud.gov is a tectonic shift in government IT because it replaces policy with platform. Cloud.gov components accelerate the much needed replacement of PDF-based guidance with running code. It’s the difference between a book about JavaScript and just using jQuery.

For most of the past 20 years, the CIO Council, NIST, and most agency IT shops have focused on policies and procedures to provide contractual requirements for vendors doing the work. That’s not criticizing anyone, it’s how the system was set up. The CIO Council’s authority is to provide recommendations–not write code. NIST’s mission is to advance measurement science and standards development–not build platforms.

Take the CIO Council’s enterprise architecture efforts or NIST’s Risk Management Framework as examples. They provide incredibly rich, comprehensive expert guidance distributed in documents. Unfortunately, contracts, contractors and projects implement the guidance differently enough that interoperability and reusability rarely occurs between bureaus or across agencies. In contrast, over the past decade in the private sector and on the Internet, knowledge has become immediately actionable via open source, APIs and GitHub repos. It’s a golden era of shared solutions powered by StackOverflows and code snippets, package managers and Docker containers.

If 18F’s Cloud.gov succeeds at encompassing official policies and regulations into loosely coupled running code, then contracts are easier to write, vendors aren’t constantly reinventing things, and projects happen faster.

Learn more about Cloud.gov.

Feds publish guide to setting up an open source project

GitHub

18F has published a guide that helps federal government workers standardize GitHub use and better leverage the social coding platform when setting up open source projects.

Tips include how best to name and describe projects, create readable READMEs, and write user story-focused issues, along with wiki best practices and a GitHub repo checklist.

Additional thoughts that would make the guide more helpful:

  • Add a section on collaborators and permissions.
  • Encourage including a link for the ‘Website for this repository’ next to the description whenever possible.
  • Next to the ‘Edit this page’ link, add a ‘Submit feedback’ link to the guide’s issues section so it’s easier to give feedback. In general, if you’re going to have either option, it’s best to have both, especially the latter.
  • One bug: The images on https://pages.18f.gov/open-source-guide/using-the-wiki aren’t responsive.


The future of government technology procurement

SideEffect.io

The General Services Administration and 18F recently held an open request for quotation related to a new blanket purchase agreement for a federal marketplace for agile delivery services. The transparency throughout the entire process was refreshing and provides a window into the future of procurement, as well as into what FedBizOpps could and should be.

The RFQ asked companies to provide a working prototype with code submitted in a public GitHub repository that could be viewed, watched, forked or downloaded at any time. Timestamps built into GitHub’s commit timeline publicly exposed when a company began working and when and whether it “submitted” its final version within the allocated timeframe.
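The mechanics of that transparency are simple enough to sketch. Assuming a list of commit timestamps in the ISO 8601 form GitHub reports them (the timestamps and cutoff below are hypothetical, not from the actual RFQ), a few lines of Python show how anyone could verify an on-time submission from a public commit history:

```python
from datetime import datetime, timezone

# Hypothetical commit timestamps, in the ISO 8601 form GitHub's
# commit history exposes publicly.
commit_times = [
    "2015-04-13T14:02:11Z",
    "2015-04-16T21:45:09Z",
    "2015-04-17T16:59:30Z",
]

def submitted_on_time(times, deadline):
    """Return True if the final commit predates the deadline."""
    parsed = [
        datetime.strptime(t, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
        for t in times
    ]
    return max(parsed) <= deadline

# Hypothetical cutoff for final submissions.
deadline = datetime(2015, 4, 17, 17, 0, tzinfo=timezone.utc)
print(submitted_on_time(commit_times, deadline))  # True: last commit at 16:59:30
```

The point is that the audit trail requires no special tooling from the government; the timestamps are already public, signed into the history of the repo itself.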

The objective of the BPA, according to 18F, was “to shift the software procurement paradigm” away from a waterfall-based development model, with its long, tedious approach to acquisition that typically favors large, established inside-the-Beltway vendors, toward one that encourages small business participation and requires all companies to work in the open, using GitHub to expose not just the code, but how the teams worked together and documented their efforts.

CivicActions (full disclosure: I work for them) participated in the process, and I played a role in developing parts of the front-end and productizing the end result, which was SideEffect.io, an adverse effect comparison tool that leveraged open data from the Food and Drug Administration’s OpenFDA initiative (GitHub repo here).

Having played a minor role on the team, and having an odd appreciation for how government IT leaders are working to modernize technology procurement, I found the process fascinating to watch, both in how GSA and 18F rolled it out and managed it, and from an inside perspective on how one company responded and worked together (FCW’s Zach Noble has a great write-up on how the CivicActions team worked, the tools it used and its general philosophy going in).

My general takeaway is that this is the future of the request for information/quote/proposal process. Much like what I prototyped at OpenFBO, each procurement request will eventually have repo-like tools that fully expose public input and questions, let internal and external stakeholders easily “watch” for updates, and attach bids or quotes with an opportunity for feedback, all of which would eventually become the repo for developing the end product.

As GSA and 18F, and hopefully other federal, state and local agencies, continue to refine this process, whether it’s via GitHub or a Git-like platform, you can be sure this is the future of how government will procure custom-built software and services.

Let’s give 18F some space

18F

The questions and criticisms posed in MeriTalk CEO Steve O’Keeffe’s “WT18F?” blog post perfectly highlight the staid sentiments of yesterday’s approach to government technology — one that is comfortable with the status quo, unwilling to embrace change and quick to critique a much-needed experiment before it can properly get off the ground.

It also represents an unwillingness to judge and measure the status quo on the same standards it’s asking of 18F.

Part of this is based on fear — fear of being exposed as not having effectively adapted to modern technology practices and not quite knowing what to do. While it’s important that those with today’s skills mentor those with yesterday’s, that’s not 18F’s direct focus (see “chasm” below). As with anything in life, it’s important for those with yesterday’s skills to be self-aware and proactive in learning and embracing today’s, especially when it pertains to technology.

Part of it is also driven by money and the comfortable place the traditional government technology community has held for decades. Seeing the reality shift in real time is probably difficult. As O’Keeffe notes, “Industry has real questions too. Companies feel 18F’s competing with the private sector – leveraging an unfair advantage to shill for work inside the government.”

Let’s be clear, however, that not all of industry is concerned about this. Those in the private sector who are not entrenched in the past, and who have baked in modern technology practices such as open source and agile, are perfectly comfortable and excited that 18F is driving not just a new approach to IT, but also a shift in its culture.

To better understand 18F’s methodology and its ideal client collaborator, one must understand Geoffrey Moore’s “Crossing the Chasm” technology product adoption curve. 18F is best served in fulfilling its mission at a faster pace (and adding true, long-term value to citizens) by focusing on the “innovators” and “early adopters.” Any successful entrepreneur or product manager will vouch for this approach.

It’s important that critics of this new kind of experiment in government technology innovation (see “fear” and “money” above) realize they are at the end of the chasm (“late majority,” “laggards”).

Let’s also keep in mind that 18F is learning as they go in a very public way, unlike any other agency that’s been created before (and especially unlike any private sector vendor). Those who have worked in startup environments know this all too well. Ask any company (Apple, eBay, Twitter, Google) if they got it right the first time (or every time). Part of the beauty is that they didn’t.

Let’s also keep in mind, change doesn’t happen overnight, especially in government.

Especially in federal government.

Especially in federal government technology.

Here’s what former Department of Homeland Security Chief Information Officer Richard Spires has said about this:

“My experience in government has shown that the implementation of significant change takes two years, and the benefits of that change really being felt in year three and beyond.”

Trying to change decades of antiquated practices will take longer than the 12 months 18F has been in existence.

O’Keeffe suggests that part of 18F’s problem is a public relations one, but 18F doesn’t need a PR machine. 18F needs some space.

The question to ask isn’t “WT18F?” but rather: “Is 18F trying to change the culture of federal government technology so that it aligns with modern practices, and does it need space and time to do this, as well as more leaders in the government IT community to step forward and become innovators and early adopters?”

The answer to that question: “Yes.”

18F starts building pattern library for federal government websites

Government Wide Pattern Library

18F has started building a much-needed federal government-wide pattern library.

For those unfamiliar with the concept of a pattern library, it’s a standardized, front-end design style guide for all the components of a website, such as fonts, colors, layout and forms (example: Code for America’s pattern library).

This is an important project in that it (hopefully) begins to set a standard common look and feel for all federal government websites and moves the focus from design to user experience.

If agencies get on board with the standardization, millions of dollars in savings could be realized, not to mention an expedited development process, because front-end code can be easily re-purposed and deployed.


GSA takes a big step towards baking agile into federal procurement

The U.S. General Services Administration is working to make it easier for agencies to procure agile development services via a government-wide blanket purchase agreement, which could be finalized as early as the end of this year.

GSA initiated the effort with a request for information and an Agile Delivery Services Industry Day tentatively scheduled for Tuesday, January 27, from 1:00 to 3:00 p.m. eastern time.

According to GSA, the industry day aims to discuss “establishing a new, governmentwide Blanket Purchase Agreement (BPA), which will feature vendors specializing in agile delivery services (e.g., user-centered design, agile software development, and DevOps).”

To get a better understanding of each vendor’s qualifications and grasp of the agile process, the RFI asks each to explain in 500 words how it would improve the federal government business portal, FedBizOpps. Responses are due by January 23.

From the RFI:

To ascertain your agile delivery capabilities, the government is requesting that you describe how you would approach creating a new and improved version of an existing government digital service called FedBizOpps (FBO).

FBO, which you can view at http://www.fbo.gov, is used by government buyers to share information on federal business opportunities with the public. The system is intended to serve as the central portal for federal agencies to solicit products and services from commercial vendors in support of their missions. Using FBO, vendors can search, monitor, and retrieve opportunities solicited by the entire federal contracting community.

Based on this brief description of FBO, how would you go about designing, developing, testing, deploying and/or operating a new and improved system that produces such outcomes as user needs being met, risk of overall project failure (in terms of cost, schedule, quality) being mitigated, the architecture being adaptive to change, and taxpayer dollars being spent efficiently and effectively? Please be sure to include a listing of all the labor categories your company would use in this effort.

GSA will start an alpha test phase within “2-3 months” that will include vendors currently on Schedule 70 and apply only to GSA procurement, particularly to help “18F’s burgeoning delivery services team.” Afterwards, within 6-8 months, a beta phase will work to establish a government-wide BPA for procuring agile services.

“To keep pace, software acquisitions need to move at the speed of agile development cycles,” write 18F’s Chris Cairns and Greg Godbout in a blog post announcing the effort. “Ideally, this means less than 4 weeks from solicitation to contract kickoff, and from there no more than 3 months to deliver a minimum viable product (MVP).”

Bonus: RFI tips

Here are a few ideas you can use for your RFI submission:

  • Start with “API first.” FedBizOpps desperately needs a more useful way to access the information available, especially newly-posted RFIs and requests for proposals.
  • Push all the front-end code to GitHub, where you’ll publicly address interface issues.
  • For design inspiration, start with FBOpen. Emphasize you’ll make searching easier and less convoluted. If you’re not familiar with this project or its predecessor RFP-EZ, start here.
  • Put all support/FAQs into Zendesk, much like the Federal Communications Commission has done with its new consumer complaint website.
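To make the “API first” point concrete, here is a minimal sketch of the kind of filtering an API-enabled FedBizOpps would make trivial. The record shape, IDs and field names below are invented for illustration, not an actual FBO schema:

```python
from datetime import date

# Hypothetical records in the shape an "API first" FedBizOpps might return.
opportunities = [
    {"id": "A-101", "type": "RFI", "posted": "2015-01-12", "title": "Agile delivery services"},
    {"id": "A-102", "type": "RFP", "posted": "2015-01-10", "title": "Data center consolidation"},
    {"id": "A-103", "type": "RFI", "posted": "2015-01-05", "title": "Open data tooling"},
]

def recent_opportunities(records, kinds, since):
    """Filter opportunities by notice type and posting date."""
    cutoff = date.fromisoformat(since)
    return [
        r for r in records
        if r["type"] in kinds and date.fromisoformat(r["posted"]) >= cutoff
    ]

for opp in recent_opportunities(opportunities, {"RFI", "RFP"}, "2015-01-10"):
    print(opp["id"], opp["title"])
```

Today this kind of query means scraping or manual searching; with structured JSON behind an endpoint, it becomes a one-liner for any vendor or watchdog.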

An open data blueprint for the U.S. Department of Commerce

U.S. Secretary of Commerce Penny Pritzker announcing the agency's new chief data officer position. (Photo: U.S. Secretary of Commerce)

Re-published from API Evangelist

U.S. Secretary of Commerce Penny Pritzker recently announced the Department of Commerce will hire its first-ever chief data officer. I wanted to make sure that when this new and extremely important individual assumes the role, they have my latest thoughts on how to make the Department of Commerce developer portal the best it possibly can be, because this will be the driving force behind the rapidly expanding API-driven economy.

Secretary Pritzker does a pretty good job of summing up the scope of resources that are available at Commerce:

Secretary Pritzker described how the Department of Commerce’s data collection – which literally reaches from the depths of the ocean to the surface of the sun – not only informs trillions of dollars of private and public investments each year and plants the seeds of economic growth, but also saves lives.

I think she also does a fine job of describing the urgency behind making sure Commerce resources are available:

Because of Commerce Department data, Secretary Pritzker explained, communities vulnerable to tornadoes have seen warning times triple and tornado warning accuracy double over the past 25 years, giving residents greater time to search for shelter in the event of an emergency.

To understand the importance of content, data and other resources that are coming out the Department of Commerce, you just have to look at the list of agencies under its purview that already have API initiatives:

Then take a look at the other half, who have not launched APIs:

The data and other resources available through these agencies reflect the heart of not just the U.S. economy, but the global economy, which is rapidly being driven by APIs powering stock markets, finance, payment providers, cloud computing and many other cornerstones of our increasingly online economy.

Look through those 13 agencies. The resources they manage are vital to all aspects of the economy, from telecommunications, patents, weather, oceans and the census to other areas that have a direct influence on how markets work (or don’t).

I’m all behind Commerce hiring a CDO, but my first question is, “What will this person do?”

This leader, Secretary Pritzker explained, will oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic.

Yes! I can get behind this. In my opinion, in order for the new CDO to do this, they will have to quickly bring all of the agencies /developer programs up to a modern level of operation. There is a lot of work to be done, so let’s get to work exploring what needs to happen.

A central Commerce developer portal to rule them all

Right now, the Commerce developer portal, commerce.gov/developer, is just a landing page: an afterthought to help you find some APIs, not a portal.

The new CDO needs to establish this real estate as the one true portal, which provides the resources other agencies will need for success, while also providing a modern, leading location for developers of web, mobile, Internet of things applications and data journalists or analysts to find the data they need.

If you need a reference point, look at the developer areas of Amazon Web Services, Salesforce, eBay or Google—you should see this type of activity at commerce.gov/developer.

Each agency must have its own kick-ass developer portal

Following patterns set forth by Commerce, each sub-agency needs its own best-of-breed developer portal, providing the data, APIs, code and other resources that public and private sector consumers will need. I just finished looking through all the available developer portals for Commerce agencies, and there is no consistency between them in user experience, API design or resources available. The new CDO will have to immediately get to work taking existing patterns from the private sector, as well as what has been developed by 18F, and establish common patterns that agencies can follow when designing, developing and managing their own developer portals.

High-quality, machine-readable open data by default

The new CDO needs to quickly build on the data inventory efforts already under way at Commerce, making sure existing projects produce machine-readable data by default and that the full data inventory is available within each agency’s portal, as well as at data.gov. This will not be a one-time effort. The new CDO also needs to make sure all program and project managers get the data steward training they will need to ensure that all future work at Commerce, associated agencies and private sector partners produces high-quality, machine-readable data by default.
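“Machine-readable by default” is less exotic than it sounds. As a minimal sketch (the dataset and column names below are made up for illustration), here is the entire distance between a CSV file agencies already publish and the JSON a developer can consume directly:

```python
import csv
import io
import json

# A small sample dataset in the CSV form agencies often publish today.
# The indicator names and figures are invented for illustration.
raw = """indicator,year,value
exports,2013,2280
exports,2014,2345
"""

# Parse each CSV row into a dictionary keyed by column name...
rows = list(csv.DictReader(io.StringIO(raw)))

# ...and serialize to JSON, the lingua franca of web APIs.
machine_readable = json.dumps(rows, indent=2)
print(machine_readable)
```

The hard part is not the conversion; it is making this the default output of every program and project, which is exactly what the data steward training is for.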

Open source tooling to support the public and private sector

Within each of the Commerce and associated agency developer portals, there needs to be a wealth of open source code samples, libraries and SDKs for working with data and APIs. This open source philosophy also needs to be applied to any web or mobile applications, analyses or visualizations that are part of Commerce-funded projects and programs, whether they come from the public or private sector. All software developed around Commerce data that receives public funding should be open source by default, allowing the rest of the developer ecosystem, and ultimately the wider economy, to benefit from and build on top of existing work.

Machine-readable API definitions for all resources

This is an area that is a little leading edge, even for the private sector, but it is rapidly emerging to play a central role in how APIs are designed, deployed, managed, discovered, tested, monitored and ultimately integrated into other systems and applications. Machine-readable API definitions are being used as a sort of central truth, defining how and what an API does in a machine-readable, common format that any developer, and potentially any other system, can understand. Commerce needs to ensure that all existing and future APIs developed around Commerce data possess a machine-readable API definition, which will allow all data resources to be plug and play in the API economy.
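Swagger (now OpenAPI) is one widely used format for such definitions. The pared-down sketch below shows the idea: the endpoint, title and fields are invented for illustration, not a real Commerce API, but the point stands that because the definition is structured data, other programs can read and act on it:

```python
# A pared-down API definition in the spirit of Swagger 2.0.
# The "Trade Data API" endpoint and fields are hypothetical.
definition = {
    "swagger": "2.0",
    "info": {"title": "Trade Data API", "version": "1.0"},
    "paths": {
        "/exports": {
            "get": {
                "summary": "Monthly export totals",
                "parameters": [{"name": "year", "in": "query", "type": "integer"}],
                "responses": {"200": {"description": "A list of export records"}},
            }
        }
    },
}

def list_operations(spec):
    """Enumerate (method, path, summary) tuples from a definition --
    the kind of machine reading that makes specs 'plug and play'."""
    return [
        (method.upper(), path, op.get("summary", ""))
        for path, ops in spec["paths"].items()
        for method, op in ops.items()
    ]

print(list_operations(definition))
```

Documentation generators, client SDK builders, test harnesses and API directories all consume the same definition this way, which is what makes it a “central truth.”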

Establish an assortment of blueprints for other agencies to follow

The new Commerce CDO will have to be extremely efficient at establishing successful patterns that other agencies, projects and programs can follow. This starts with developer portal blueprints they can use when designing, deploying and managing their own developer programs, but it should not stop there: Commerce will need a wealth of blueprints for open source software, APIs, system connectors and much, much more. Establishing common blueprints and sharing them widely across government will be critical for consistency and interoperability, reducing the chances that agencies or private sector partners will reinvent the wheel, while also reducing development costs.

Establish trusted partner access for public and private sector

Open data and APIs do not always mean publicly available by default. Private sector API leaders have developed trusted partner layers for their open data and API developer ecosystems, allowing select, trusted partners greater access to resources. An existing model for this in the federal government is the IRS modernized e-file ecosystem and the trusted relationships it maintains with private sector tax preparation partners like H&R Block or Jackson Hewitt. Trusted partners will be critical in Commerce operations, acting as private sector connectors to the API economy and enabling higher levels of access from the private sector in a secure and controlled way that protects the public interest.

Army of domain expert evangelists providing a human face

As the name says, Commerce spans all business sectors, and to properly “oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic,” the CDO will need another, human layer to help increase awareness of Commerce data and APIs, while also supporting existing partners and integrators. An army of evangelists will be needed, possessing extremely important domain expertise across all the business sectors that Commerce data and resources will touch. Evangelism is the essential human variable that makes the whole open data and API algorithm work. The new CDO needs to get to work writing a job description and hiring for this army—Commerce will need its own 18F, one that is dedicated to its mission.

Department of Commerce as the portal at the center of the API economy

The establishment of an official CDO at the Department of Commerce is very serious business, and is a role that will be central to how the global economy evolves in the coming years. The content, data, and digital resources that should, and will be made available at commerce.gov/developer and associated agencies, will be central to the health of the API driven economy.

Think of what major seaports have done for the economy over the last 1,000 years, and what role Wall Street has played in the economy over the last century. This is the scope of the commerce.gov/developer portal, which is ultimately the responsibility of this new role.

When the new CDO gets started, I hope they reach out to 18F, which will have much of what they need to get going. Then they should sit down and read this post, as well as my other one, “An API strategy for the U.S. government.” And once they’re under way, if they need any help, just let me know—as my readers know, I’m full of ideas on APIs.

GitHub and the C-suite social

GitHub

In the early days of Twitter, it was easy and common to dismiss the infant social network as a simplistic tool that served a whimsical and nerdy niche.

Today, Twitter has gone from the technorati tweeting hipster conference minutiae to a platform driving the new world digital order. This didn’t happen overnight. But when the flock of civic technologists took flight, the social government migration happened quickly and collectively.

Much like we pooh-poohed Twitter in those early days, GitHub, in its early crawl, is today dismissed simply as a tool for the diehard developer. However, as with any tool with great potential, innovators find new ways to leverage emerging technology to communicate, and government chief information and technology officers can effectively do this with GitHub.

There’s the obvious use case, such as contributing code and commenting on projects, much like Veterans Affairs Chief Technology Officer Marina Martin does via her GitHub account. But while it’s probably asking a lot for the C-suite to dive deep into code on a daily basis, there are other, more conversational ways GitHub can be leveraged.

Case in point, a few weeks ago, Federal Communications Commission Chief Information Officer David Bray and I had a Twitter exchange about the utility of GitHub. Immediately, I created a repository (think “folder”) on my personal account, and set up a new “What questions do you have for FCC CIO David Bray?” issue (think “discussion”).

To Bray’s credit, and perhaps surprise of his public affairs office, he humored me by immediately joining GitHub, posting replies to a number of questions about FCC open data, open source, cloud hosting and web operations. Over the course of an hour, there was a genuine, real-time conversation between a federal CIO and the community at large.
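One underappreciated feature of running a Q&A this way is that the whole conversation is itself open data: GitHub’s REST API exposes any public issue thread as plain JSON. As a sketch (the owner, repo and issue number below are hypothetical stand-ins for the repo described above), the comment thread lives at a predictable URL:

```python
# GitHub's REST API serves public issue threads as JSON via
# GET /repos/{owner}/{repo}/issues/{number}/comments.
API_ROOT = "https://api.github.com"

def issue_comments_url(owner, repo, number):
    """Build the GitHub API URL for an issue's comment thread."""
    return f"{API_ROOT}/repos/{owner}/{repo}/issues/{number}/comments"

# Hypothetical repo and issue number for an "ask the CIO" thread.
print(issue_comments_url("example-user", "ask-the-cio", 1))
```

Anyone can fetch that URL and archive, analyze or republish the exchange, which is more than can be said for a town hall held over the phone.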

Despite wide adoption of social tools by public sector innovators, most of the C-suite remains decidedly analog when it comes to engaging and sharing relevant information about the inner workings of our public sector institutions. A cursory survey of government chief information and technology officers shows they either abstain altogether or, when they do post, generally offer random personal updates or staid posts with a heavily sanitized public affairs filter.

The emergence of GitHub may change this for the government technologist, especially those willing to engage fellow coders and citizens on projects in an open, fluid environment.

Former Presidential Innovation Fellow and current GitHub government lead Ben Balter has since followed suit and created a government-focused “Ask Me (Almost) Anything” repo featuring Q&As with Philadelphia Chief Data Officer Mark Headd and staff from the newly-minted 18F.

GitHub’s repo and issues features are natural communication tools for C-level technologists who fancy themselves innovators leveraging emerging tech in new, creative ways.

For the IT C-suite, the GitChat is the new Twitter Townhall, a way to instantly and directly connect with peers and the general public and be asked anything.

Well, almost anything.