Month: September 2016

Ekistic Ventures launches $15M fund to ‘solve critical urban problems’

Adding to the increased interest in investment opportunities around civic and government technology, a new venture fund, Ekistic Ventures, launched with the intent of “building a portfolio of companies that will solve critical urban problems.”

According to Crain’s, the fund is $15 million.

The Ekistic team includes former Chicago chief data officer Brett Goldstein, former Rahm Emanuel advisor David Spielfogel, former Philadelphia mayor Michael Nutter, O’Reilly Media founder Tim O’Reilly, Anne Milgram, Michael Sacks and Will Colegrove.

From the launch announcement:

We’ve seen a lot of pitches over the years, and we know how good ideas never see the light of day because the entrepreneur or start-up doesn’t understand the market landscape or the realities of succeeding in the urban environment.

Second, in addition to writing a check we also commit our time and networks to build meaningful companies for the long haul. That’s why we only work with a small handful of companies each year, and why we so closely tie our success to the success of our portfolio companies.

Learn more about Ekistic’s mission and submit your pitch.

Driving smart city innovation with open sensor data (part 5)

This is part five of a five-part series that looks at successful strategies we at OpenDataSoft have seen our clients and others use to foster innovation and align their smart city and open data goals. The full series is available as a free PDF download.

Attend to the tech must-haves

The concept of what constitutes a “Smart City” has evolved quite a bit over the past 10 years. From early visions of sweeping citywide digital overhauls and the global automation of everything from trash pickup to transportation, cities are now focusing on smaller-scale projects; they are testing ideas with pilot programs, and attending to low-tech and even no-tech options for meeting their goals of safe, healthy, sustainable, and vibrant communities.

But while there is much technology that can be sifted into must-have, nice-to-have and maybe-someday categories without a negative impact on smart city advancement, there are a few basic pieces of technology cities will need in order to extract value from the real-time data that has already begun to flow through smart cities.

One is an open data platform that can provide data access to citizens, researchers, developers, city staff and city ecosystem partners (who should also provide access to their data to these same communities).

While there are many options for hosting such data, the rise in real-time data, whether from pollution meters on lampposts, GPS locators on mobile phones, usage data from water meters, or video feeds from security cameras, requires application programming interfaces.

Application programming interfaces

APIs are software interfaces that allow applications to exchange data and services. In the context of smart cities, they enable a secure, reliable connection to continuously updated data: for developers who want to build web or mobile applications; for researchers or analysts who want to plug city data into existing applications such as business intelligence software; and for IT staff at other government agencies or ecosystem partners who want to integrate a city’s data with their own (see the helpful article “Open Data & APIs: Collecting and Consuming What Cities Produce”).

As developing and maintaining custom APIs can be complex and time-consuming, the wisest course for cities is to choose an open data portal natively designed to automate the generation and maintenance of standards-based APIs. To deliver maximum data value and make processes as efficient as possible for data consumers, it is also very helpful if the APIs generated can support queries, range settings and manipulations like mathematical calculations so users can extract only the data required, in the form needed.
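
As a concrete sketch of what such an API query can look like, the example below asks a hypothetical portal for recent high pollution readings, letting the server filter and sort so only the needed records come back. The endpoint, dataset name, fields and query syntax are all invented for illustration; a real portal’s API documentation defines its own conventions.

```python
import requests

# Hypothetical open data portal endpoint (illustrative only).
BASE_URL = "https://data.example-city.gov/api/records/1.0/search/"

params = {
    "dataset": "air-quality-sensors",  # hypothetical dataset identifier
    "q": "pm25 > 35",                  # server-side filter: only high readings
    "rows": 100,                       # return only the records needed
    "sort": "-timestamp",              # newest readings first
}

response = requests.get(BASE_URL, params=params, timeout=10)
response.raise_for_status()

# Print the station, time and reading for each matching record.
for record in response.json().get("records", []):
    fields = record.get("fields", {})
    print(fields.get("station_id"), fields.get("timestamp"), fields.get("pm25"))
```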

Unfortunately, most open data portal solutions were designed to handle static, infrequently changing content like spreadsheets and reports, not large, real-time streaming sensor data. Conversely, most platforms specifically designed for Internet of Things (IoT) data and Machine-to-Machine (M2M) data were not designed for use as open data portals. Some governments and open data portals have tried to bridge this divide by coupling standalone IoT platforms and open data portals, or by developing new add-on systems for existing open data portals. At present, these efforts introduce complexity and performance costs that hamper their use. This should change over time, however, as the demand for easy, cost-effective open access to smart city sensor data increases.

Data visualization

Another must-have is a set of easy data visualization and dashboard-building tools. Visualization in the form of charts, graphs and maps is very useful for helping human beings make sense of all kinds of data, and it is absolutely essential for the big data collections produced by real-time sensors and other connected devices.

The value of visualization in making data meaningful and accessible is well understood by the Town of Cary. During her keynote speech at Triangle Open Data Day, Cary Town Council Member Lori Bush was very clear about a primary goal of the town’s Open Data project: storytelling. “We started talking about Open Data a long time ago. We were constantly asked, ‘What’s the value of an Open Data program?’” said Bush. They knew easy data visualization was key, and Cary Chief Information Officer Nicole Raimundo was very pleased to have found in their open data portal “a tool that really allowed us to realize that storytelling aspect. We can embed visualizations on the homepage, which is critical because that’s where most of our citizens are going to go.”

Cary’s data storytelling is showcased through a dedicated section on their open data portal’s homepage. Each entry in the Data Stories section pairs a data visualization with accompanying text and a link to associated datasets, giving richer context and a clearer story to what the city wants to communicate. In addition, portal visitors can easily create and share their own data visualizations, putting them in the driver’s seat as they seek the meaning behind the facts and figures.
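
As a small illustration of the kind of chart a data story might feature, the sketch below plots a day of invented air-quality readings against the EPA’s 24-hour PM2.5 threshold. In practice, a portal’s built-in visualization tools produce charts like this without any code.

```python
import matplotlib.pyplot as plt

# Hypothetical hourly PM2.5 readings from a single air-quality sensor
# (invented data; a real dashboard would pull these from the portal's API).
hours = list(range(24))
pm25 = [12, 11, 10, 9, 9, 14, 22, 35, 40, 33, 28, 25,
        24, 23, 26, 30, 38, 45, 41, 33, 27, 20, 16, 13]

plt.figure(figsize=(8, 3))
plt.plot(hours, pm25, marker="o")
plt.axhline(35, linestyle="--", label="EPA 24-hour threshold")  # 35 µg/m³
plt.xlabel("Hour of day")
plt.ylabel("PM2.5 (µg/m³)")
plt.title("Air quality, downtown sensor (illustrative)")
plt.legend()
plt.tight_layout()
plt.show()
```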

From a technology-centered to a human-centered view of the Smart City

This focus on making data accessible and meaningful for humans is fully aligned with the evolving nature of ‘smart cities.’ The transition underway from a technology-centered to a human-centered view of the smart city is casting a new spotlight on the promise of open data, from transparency and trust to citizen engagement and open innovation.

Accordingly, it’s only natural that cities are increasingly seeking to align their Open Data and Smart City strategies, and they are exploring solutions that can help them ensure that citizens and application developers have ultra-simple access to all the useful data a city produces. This includes the sensor data upon which many of the most engaging and transformative web and mobile-based applications will be built.

There is no doubt that high-tech digital transformation can have enormous impact in helping cities meet the environmental, social and economic challenges of population growth in a world of increasingly strained natural resources and a changing climate. However, even with the most technologically sophisticated solutions, success depends on making residents true partners in defining what ‘smart’ means for their community, and enabling their participation in shaping their city to fulfill that vision. And that means a smart city is, first and foremost, an open city.

Read all five strategies on the GovFresh website, or download the complete five-part series as a free PDF download.

The government technology pitch

President Barack Obama joins a toast with tech business leaders at a dinner in Woodside, California, Feb. 17, 2011. (Official White House Photo by Pete Souza)

Originally published at TechCrunch

Crisis has a history of dictating government technology disruption. We’ve seen this with the anticipation of Soviet Union aerospace and military dominance that sparked the emergence of DARPA, as well as with the response to 9/11 and the subsequent establishment of the U.S. Department of Homeland Security.

And, of course, there’s the ongoing, seemingly invisible crisis around security that’s expediting an infusion of public sector funding, particularly in the wake of the U.S. Office of Personnel Management breach that exposed the personal records of millions of federal employees and government contractors.

The Healthcare.gov launch debacle is the most recent and referenced example of crisis spawning government technology progress. The federal government woke to the issues surrounding outdated digital practices — from procurement to technical — and quickly launched two startups of its own: 18F and the U.S. Digital Service (USDS).

The failings of Healthcare.gov and the subsequent creation of 18F and USDS have inspired others — such as the state of California, large cities and local governments — to fund a surge in attention to digital — from web to data to security — and address the outdated technologies powering the infrastructure that runs our governments.

But innovators don’t wait for crises.

They imagine a different path, whether it’s a new approach to solving an old problem or a moonshot that leapfrogs business as usual. They observe the world, realize potential and fund and build engines of change — and forward-thinking, optimistic entrepreneurs and investors are starting to do this with government technology.

“Think right now,” 1776 co-founder Evan Burfield said in an a16z podcast. “Who is the most iconic entrepreneur in Silicon Valley, the one all the kids these days are aspiring to? It’s Elon Musk. Every one of his businesses is based on a regulatory hack.”

With the need to do more with less, an aging workforce, and mounting pressure to recruit and retain the next generation of public sector leaders, government is being forced to adapt more quickly than ever before. We’re at a pivotal moment with government technology infrastructure: much of it is built on older technologies, with little mobile functionality or proper security protocol, all compounding the need for innovation.

And customers are starting to demand it. According to an Accenture 2015 digital government report: 

  • 86 percent of those surveyed “want to maintain or increase their digital interaction with government”

  • 73 percent are “neutral” or “not satisfied” with digital government services

  • Online transactions are in demand for licenses/permits (66 percent), taxes (45 percent), fines/tickets (39 percent) and reporting non-emergency issues (38 percent)

  • Mobile: 32 percent want to use tablets to access digital government services; 38 percent a mobile phone (51 percent for ages 18-44)

This is the opportunity for innovators.

While the government digital services trend has taken hold, it doesn’t adequately address the needs of the 20,000 cities across the United States. Service costs quickly add up and don’t scale, but software-as-a-service does, and this is where private sector entrepreneurs are re-imagining how government works.

As Marc Andreessen said, “software is eating the world.” Its appetite for government is growing, piquing the interest of accelerators and venture funds alike.

Govtech Fund, specifically focused on government enterprise technology, aims to “harness the power of transformers, technology, and capital to help government become more efficient, responsive, and better able to serve society.” To date, Govtech Fund founder Ron Bouganim has raised $23.5 million to help make this happen.

“VCs historically ran for the hills whenever they heard the word ‘government’ and for good reason: software sales cycles were measured in years, governments often required a ton of product customization and a byzantine structure of prime and sub-contractors made it impossible to actually deploy solutions even after a deal was won,” writes Bouganim.

“However, in the past couple of years, a number of trends including government adoption of the cloud, budget constraints, a massive government personnel retirement cycle and an open data movement have coalesced to create an openness on the part of government agencies to embrace new technologies and a dramatically shortened sales cycle — our portfolio companies’ average is just 86 days.” (Bouganim says this has since shortened to 73 days).

But it’s not just vertical investors and enthusiasts exploring government technology opportunities. Established venture capital firms like Andreessen Horowitz, and incubators and accelerators like Y Combinator and 500 Startups are also taking note.

Y Combinator is beginning to explore entrepreneurial opportunities around cities, establishing a research effort on this front, and has graduated a number of government-focused startups. To date, OpenGov has raised nearly $50 million, including investments from Andreessen Horowitz, to bring financial benchmarking and transparency tools to government.

Even Google has created its own city-focused venture with Sidewalk Labs.

Other startups — NextRequest for public records management, Romulus for constituent relationship management, Patronus for 911, SeeClickFix for 311, Mark43 for law enforcement, mySidewalk for city analytics, ProudCity (my company) for city websites, and others — are beginning to replace legacy systems as the next-generation government SaaS stack.

Nick Sinai, former deputy chief technology officer in the White House Office of Science and Technology Policy and now a partner at Insight Venture Partners, recently wrote about his firm’s government technology investments:

“Why are these markets attractive areas for Insight, given conventional wisdom that government can be challenging to sell to, and expensive to serve? … First, government is a large market. … Second, government desperately needs better software. … Third, government requirements, while challenging, can sometimes advance product development for our companies.”

In “B2G: The Excitement Of An Old-Line Industry,” OpenGov founders Zac Bookman and Joe Lonsdale highlight four points on why investment in government disruption is a ripe opportunity:

1) Old technology provides opportunities for order-of-magnitude improvements;

2) Big institutions signal huge markets;

3) Industry pressures demand new efficiencies; and

4) Challenging sales cycles increase barriers to entry and foster customer retention.

“The right path for a startup company in an old-line industry is arduous and immensely rewarding,” write Bookman and Lonsdale. “Conventional wisdom says that it’s too hard to build a business in government (or other major industries), and this has kept many from trying. Grand outcomes await for those top young companies bold enough to venture and win.”

According to Government Technology, state governments will spend $47 billion on IT in 2016 and local governments $52 billion — a 3.25 percent and 2.5 percent increase, respectively, over 2015.

Couple this with a long tail of 19,000 cities, 89,000 agencies, 3,000 counties, 98,000 schools and 119,000 libraries, and the opportunity is there to enjoy entrepreneurial success, disrupt the seemingly impossible and, as Tim O’Reilly says, work on stuff that matters.

For the bold, grand outcomes await.

Bay Area cities team with startups to solve civic problems, scale government innovation

STIR 2016

Bay Area cities San Francisco, Oakland, West Sacramento and San Leandro teamed up with startups this year as part of the Startup in Residence program to “explore ways to use technology to make government more accountable, efficient and responsive.”

The 2016 cohort included 14 companies that worked with the cities over 16 weeks, and the teams made their presentations Friday (see #STIR2016).

All of the projects were fantastic, but Binti really stood out and opened my eyes to the impact modernized technology can have on truly changing lives.

The STIR program started in 2014 and serves as a model for other geographic regions that want to create momentum around civic technology and scaling government innovation.

A big shout-out to Jeremy Goldberg, Krista Canellakis, Jay Nath, the SF Office of Civic Innovation, an incredible team of ambassadors and mentors, Monique Woodard from 500 Startups and, of course, Lawrence Grodeska and the CivicMakers team. It’s inspiring to see public sector leaders working proactively with startups to break through the procurement and technology mold and bring better digital services to those they serve.

For those interested in participating in the 2017 cohort, see the participation requirements and apply.

Ash Carter wants to keep DOD weird

Defense Secretary Ash Carter speaks during a visit to Capital Factory, Austin, Tex., September 14, 2016. (DoD photo by U.S. Army Sgt. Amber I. Smith)

U.S. Defense Secretary Ash Carter announced the DOD will open its third technology innovation “outpost” in Austin, expanding the reach of the Defense Innovation Unit Experimental, which serves as a “bridge between those in the U.S. military executing on some of our nation’s toughest security challenges and companies operating at the cutting edge of technology.”

DIUx, launched in 2015, already has locations in Silicon Valley and Boston.

From startups to venture capital, Carter is proactively reaching out to the technology industry to close the innovation gap between the Beltway and geographic regions with high-density digital ecosystems.

“I created DIUx last year because one of my core goals as secretary of defense has been to build, and in some cases to re-build, bridges between our national security endeavor at the Pentagon and America’s wonderfully innovative and open technology community,” said Carter. “That’s important, because we’ve had a long history of partnership working together to develop and advance technologies like the Internet, GPS … satellite communications and the jet engine. What we’ve done together has not only benefited both our security and our society but, it’s fair to say, the entire world.”

Watch Carter’s TechCrunch Disrupt interview:

Driving smart city innovation with open sensor data (part 4)

This is part four of a five-part series that looks at successful strategies we at OpenDataSoft have seen our clients and others use to foster innovation and align their smart city and open data goals. The complete series “Driving Smart City Innovation with Open Sensor Data: 5 Lessons Learned” is available as a free PDF download.

Strategy 4: Treat your sensor data like a valuable asset

While it is commonly acknowledged that cities today produce massive amounts of data, it is less often noted that much of the data referenced is not actually produced directly by city systems, but rather by cities’ ecosystems of partners in domains such as transportation, waste and water management and energy services. When all goes as it should, these partners join their city clients in providing open access to the data they generate. But this does not always happen. Sometimes cities have to take action to ensure they have access to the data generated through such partnerships for their own internal use, and for making data available to the public as open data.

At the April 2016 Socitm conference, Peter Wheeler, an account manager at the software firm Red Hat, reported hearing many city officials complain that they face “a huge battle” in getting data they required from software vendors responsible for some smart-city initiatives. His advice, echoed by many others, is that cities must mandate openness from the outset by insisting on access to all data in new procurement contracts.

Wheeler cautions, however, against exercising that access right unless there is a clear use case for the data in order to avoid overburdening city systems. This makes sense if the city and the vendor are the only consumers of the data, but with the right technology, making that data open does not have to be a financial burden on the city. In fact, it can be a boon, and opening it can allow businesses and civic technology developers to find innovative uses for it that the city might not otherwise discover. This fulfills the promise of open innovation.

One scenario in which open sensor data can be a financial gain rather than a burden is in the monetization of data streams. Traditionally, as the Open Data Institute articulates it, open data has been defined as data “that anyone can access, use and share,” and “it must be published in an accessible format, with a license that permits anyone to access, use and share it.”

While in this classic definition open data must be free, accessible, and licensed for open re-use, as smart city data goes online, cities and their ecosystem partners have begun to consider shades of open, such as freemium models that offer basic access for free but charge fees for high-volume usage.

Consider different shades of open

The first reason for this is cost. Streaming sensor data is ‘big data,’ and handling big data carries an infrastructure and access cost – a cost which public agencies have to be able to cover to provide access. This is the position taken, for instance, by the Paris regional transit service (RATP) and the French national rail service (SNCF). Both provide open access now to a range of transportation data (data.ratp.fr and data.sncf.com), including, in the case of SNCF, access to real-time departure and arrival data, though at present with a usage limit. And both plan to offer high-volume access to such real-time data using a freemium model.

They argue that providing reliable, high-volume access to the streaming data carries a significant cost that can be borne only through premium access, and it’s a cost the large corporations who are most interested in that data can afford to pay. For them, distributing these costs to such heavy data users helps ensure free access can be maintained for civic technologists and start-ups who develop citizen-centric applications that support local economies, improve citizens’ lives, and help everyone make more efficient use of public services.

The second rationale for freemium models is the growing realization in all sectors that data is an asset with real economic value. For cities, that value can be tapped as a new revenue stream to help support smart city initiatives or meet basic budgetary needs. This is the rationale posited by the Buckinghamshire County Council in the UK. They are exploring options for monetizing their flows of smart city data as a way to make up for shortfalls in the face of “ever-increasing budget cuts.” David Aimson, Project Manager at Buckinghamshire County Council, believes other cities will follow suit: “By the very nature of being publicly funded we are not historically commercial organisations.” However, he adds, “over the next decade you are going to see councils turning more into businesses.”

While data monetization is at too early a stage to determine how significant that stream may be, it is nonetheless an area of growing interest to cash-strapped cities; one that needs to be approached with careful consideration of technical, financial and governance issues, including very important issues of public trust and protection of citizen privacy.

However, regardless of whether a city decides to offer access on a freemium or fully free basis, and whether it wants to open data from existing systems, or from newly deployed Internet of Things sensor networks, or from crowdsourced mobile phone data, access has to be made available in a way that supports application development. That means it needs to be available through standardized, efficient Application Programming Interfaces (APIs), and that cities must incorporate access rights into vendor contracts to give themselves maximum flexibility in transforming data into value for their communities.
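
To make the developer’s side of this concrete, here is a minimal sketch of a client polling a hypothetical real-time departures feed on a freemium plan, backing off when the free tier’s rate limit is hit. The endpoint, parameters and headers are assumptions for illustration, not the actual RATP or SNCF interfaces.

```python
import time
import requests

# Hypothetical real-time departures endpoint with freemium access
# (illustrative only; not an actual RATP or SNCF API).
FEED_URL = "https://api.example-transit.gov/v1/departures"
API_KEY = "your-api-key"  # free tier caps daily calls; a premium key lifts the cap

def poll_departures(station_id: str) -> list:
    """Fetch the latest departures, backing off if the free tier's limit is hit."""
    while True:
        resp = requests.get(
            FEED_URL,
            params={"station": station_id},
            headers={"Authorization": f"Bearer {API_KEY}"},
            timeout=10,
        )
        if resp.status_code == 429:  # rate limited: wait as instructed, then retry
            time.sleep(int(resp.headers.get("Retry-After", "60")))
            continue
        resp.raise_for_status()
        return resp.json().get("departures", [])

print(poll_departures("central-station"))
```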

Check back in next week for Strategy 5, which discusses two must-have technologies for succeeding with open sensor data. You can also download the complete five-part series.

‘Delivering on Digital’

I finished Bill Eggers’ latest book, “Delivering on Digital: The Innovators and Technologies That Are Transforming Government,” and highly recommend it to public sector technology practitioners, especially those in governments that don’t have the resources to contract with a high-end consulting firm and must build out a holistic strategy on their own.

“Delivering on Digital” emphasizes concepts such as open source technologies, agile methodologies, open data, universal user identification/login and security (making the latter very accessible and required reading). There are a number of anecdotes that perhaps are most applicable to larger cities, states and national governments, but still helpful in providing context on how all of these have been effectively implemented.

The aspects of “Delivering on Digital” I’m not convinced are effective are its approaches to engagement around crowdsourcing, contests and prizes. I’m more bullish on open source communities, as advocated by Red Hat CEO Jim Whitehurst in “The Open Organization.” Unfortunately, we’ve yet to see government effectively create community or build accessible collaborative environments, which is why I think it defaults to a push-style approach to engagement.

I also think we’ve run the gamut on using Code for America, 18F, the U.S. Digital Service and the U.K.’s Government Digital Service as anecdotes and examples of success, especially since they’re very difficult to replicate at scale. Something the government technology community has yet to confront is where things haven’t worked so well; sharing and learning from those failures would be invaluable. Unfortunately, the nature of the industry doesn’t make an open discussion of this easy, and that is most likely compounded by the book being part of a (brilliant) content marketing strategy for Deloitte.

Having said this, Eggers and his colleagues are adding tremendous value by publishing a resource like “Delivering on Digital.” Even more brilliant, and a welcome break with traditional publishing rules, would be issuing it under a Creative Commons license, much like O’Reilly Media did with “Open Government.”

Accompanying “Delivering on Digital” is a compilation of digital government playbooks (currently published as images; it would be great to see these converted into an open format similar to 18F’s guides).

Eggers, recently appointed as the executive director of Deloitte Center for Government Insights, has also authored “The Solution Revolution,” “If We Can Put a Man on the Moon,” “Governing by Network,” “The Public Innovator’s Playbook” and “Government 2.0.”

Buy “Delivering on Digital” on Amazon.

DISA kicks off overhaul of federal background checks

Photo: U.S. Navy

The Defense Information Systems Agency has released a series of videos and a request for information for the National Background Investigation System, created in the wake of security incidents that led to data breaches affecting millions of federal government employees and contractors.

According to the RFI, NBIS “is a new entity that changes how the Federal Government performs background investigations for military, civilian, and government contractors.” DISA will “design, develop, secure, and operate” NBIS, which supports the National Background Investigation Bureau, formerly the Federal Investigative Services, managed by the Office of Personnel Management.

The overview and video references read straight out of agile and open source playbooks, so it will be interesting to see how far this goes on those fronts:

NBIS PMO must establish an enterprise IT enclave that enables business process reengineering, including modular system development to accommodate changes in data requirements, advanced security protections to safeguard data, enables broad shared services to maximize investments, and not only meets the needs of the end users, but also connects those users to the process.

Intro video:

GAO to lean more on analytics for government accountability

The U.S. Government Accountability Office announced it will create a Center for Advanced Analytics to bring a more data-driven approach to its work.

GAO says the new center’s primary goals are to:

  • enhance access to data sources
  • assess, customize, and help deploy new technologies
  • promote novel analytic approaches
  • strengthen analytical skills

“In future years, we are hoping to improve our capabilities so that we can analyze even larger data sets,” says GAO on its blog. “We are also increasing the use of text analysis. This would allow us to look for key phrases, matches, and similarities in unstructured data—such as large text documents—to help identify patterns, such as those that might indicate potential fraud or improper payments.”
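
GAO has not published its tooling in this kind of detail, but as a rough sketch of the text-analysis approach it describes, the following scans a set of documents for indicator phrases and flags those with matches. The phrases and document structure are invented for illustration.

```python
import re
from collections import Counter

# Hypothetical indicator phrases an auditor might scan for (illustrative only).
INDICATORS = [
    r"duplicate payment",
    r"no supporting documentation",
    r"split purchase",
]

def flag_documents(documents: dict) -> dict:
    """Count indicator-phrase matches per document (case-insensitive)."""
    flagged = {}
    for doc_id, text in documents.items():
        counts = Counter()
        for pattern in INDICATORS:
            counts[pattern] = len(re.findall(pattern, text, flags=re.IGNORECASE))
        if any(counts.values()):  # keep only documents with at least one match
            flagged[doc_id] = dict(counts)
    return flagged

docs = {
    "invoice-001": "Vendor issued a duplicate payment with no supporting documentation.",
    "invoice-002": "Routine purchase, fully documented.",
}
print(flag_documents(docs))
```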

The seeds of a federal government software-as-a-service digital platform?

Photo: White House

With the release of a new identity management platform, 18F is steadily assembling the requisite pieces for an easy-to-deploy, cloud-based federal government web management platform.

These include:

  • cloud.gov (“A platform by government developers, for government developers.”)
  • login.gov (“Improving access to government services through a shared authentication platform”)
  • Federalist (“Federalist is a unified interface for publishing static government websites.”)

Couple these components with the U.S. Web Design Standards (a common look-and-feel front-end theme and templates), the Digital Analytics Program for metrics, and 18F’s work on security and HTTPS, and the General Services Administration (via 18F) is on its way to becoming a full-scale software-as-a-service platform that makes it easy for agencies to launch and maintain web services in-house.
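
As a small illustration of the HTTPS piece, the sketch below checks whether a domain redirects plain HTTP to HTTPS and sends an HSTS header, two signals emphasized in the government-wide HTTPS push. It is a minimal check for illustration, not 18F’s actual tooling.

```python
import requests

def check_https(domain: str) -> dict:
    """Follow redirects from plain HTTP and report HTTPS enforcement signals."""
    resp = requests.get(f"http://{domain}", timeout=10, allow_redirects=True)
    return {
        "final_url": resp.url,
        "redirects_to_https": resp.url.startswith("https://"),
        "hsts": "Strict-Transport-Security" in resp.headers,
    }

print(check_https("18f.gsa.gov"))
```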