Linux

Lockheed goes open source. Blankenhorn hates it.

I was really pleased to read the announcement that Lockheed Martin's social networking platform, Eureka Streams, was released as an open source project today. Lockheed is a very conservative company, and while they're happy to use open source internally and on projects for their customers, this is their first experiment with actually running a project themselves. I think it's a big deal, not just for Lockheed Martin, but for large corporations that are considering a more open, more innovative approach to software development. And yet, Dana Blankenhorn hates it:

I don’t see anything in Eureka Streams I can’t do in Drupal, or a number of other high-quality open source projects that have existed for years. Lockheed has reinvented the wheel — why?

So here's the nice thing about the open source community: competition. If I think I've come up with a better way to solve a problem, it can easily compete with the incumbents. Low barrier to entry, we say. Let the best ideas win. Unless, apparently, the best ideas come from a company I don't like.

Then things start going sideways:

The author of Eureka Streams, who goes by the name Sterlecki at Github, has left no previous tracks there. Linkedin lists the same picture as belonging to Steve Terlecki, a Lockheed software developer.

The stuff’s legit, so we’re left again with the question of motive. Is the military-industrial complex reaching out to open source, is this just proof of press reports showing our spy efforts have more bloat in them than a Macy’s Thanksgiving float, are we being co-opted, or am I just too suspicious?

Wait, what? Open source advocates have, for years, been trying to encourage more code to come out from behind corporate skirts. Where companies can build business models around governing and supporting open source projects, we want them to take the plunge. If more code is open, that makes everyone smarter. And that, my friends, is exactly what Lockheed Martin did today. Someone who has probably never contributed code publicly in their life just gave the community a project they've been working on for months, or even years. I think that's amazing. In return, this brave developer gets painted as a nefarious secret agent out to steal our thoughts and bug our laptops. Or whatever.

So here's the great thing about open source: we can prove Blankenhorn wrong. They use the Apache license, and it's on Github. We can go through the code and find backdoors, secret plans, and mind-control rays. This reminds me very much of the reaction to the release of SELinux. Conspiracy theories everywhere, but code is auditable, and now SELinux is in the mainline Linux kernel. Do we really want to throw out these contributions, when code doesn't lie? When it's so easy to ensure there's nothing nefarious inside?

You can feel however you like about Lockheed Martin or the US Department of Defense. You can choose to contribute to the project, or not. You can choose to use the software, or not. But is it in the community's interest to summarily dismiss contributions based on those preferences? Lockheed's thousands of developers are sending up a trial balloon. If they fail, we lose access to those developers forever.

I think this kind of fearmongering is exactly what prevents large corporations and government agencies from releasing their code. These knee-jerk reactions harm the open source community at large. We pride ourselves on our meritocracy. A 14-year-old in his mom's basement is the same as a 30-year-old Lockheed developer is the same as a UNIX graybeard. You are just as good as your contributions. We need to welcome Lockheed's contributions, not throw them back in their face. Whether the project is useful or not, they've enriched the open source community. Let them succeed or fail on their own merits. If they do fail, we hope that they'll do better next time. Maybe this is a Drupal-killer. Who knows? Let's give it a try.

Software isn’t a skyscraper

Michael Daconta at Government Computer News has posted a brief call to arms for the software industry. Here’s the gist:

Although I am a believer in free markets and the benefits of competition, industry has a responsibility to work together on the foundational layers to build security, quality and reliability from the ground up to advance the professionalism of the field. In essence, the information technology industry must emulate other engineering disciplines, or technological disasters and cybersecurity holes will worsen.

Daconta is uneasy with the number of platforms and methods available to software developers, and sees ever more options and disruptions in the near future; IPv6 and 64-bit computing seem to trouble him particularly. We’re already balkanized and disorganized; how can we possibly expect to produce reliable and useful software with all this messy innovation happening?

The answer, of course, is control. Lots of it. Specifically, three proposals:

  • Licenses for software developers
  • A new, reliable, layered software platform developed by the NSF and DARPA
  • Treating software like engineering, not art

Gracious. I barely know where to start. Let’s try to imagine the software development world in five years, with these proposals in place.

Software development is now a licensed activity. Like an architect or a mechanical engineer, you have to pass an exam and perhaps post a bond to practice the discipline. There’s probably a professional association, like the American Bar Association or the American Medical Association, to administer the credentials.

This licensing regime is actually a pretty good idea, because all software has to be developed according to some very specific methods, with plenty of testing and documentation to back it up. So rather than letting any fool with a compiler write software, they’ll have to spend a year or two learning the right way to write code. The process is cumbersome, but every piece of software that gets compiled is perfect. At least, as perfect as we know how.

The licensing and formal methods are only possible, of course, because we have a government-directed platform that we must build on. Anyone who wants to run software in the government must do so on this stack.

It sounds as though Daconta would like a broad mandate for NASA-style code development. You can probably see where I’m going with this. Another way to tell the story might be:

There is now a government-owned platform that every government program is mandated to use, from clouds to mobile phones. Any software built on that platform must submit to rigorous, independent testing before it is deployed. Imagine ISO 9000 and Common Criteria having a baby with teeth. Anyone writing software must be licensed to do so in the United States. As a result, the pool of available programming talent is decimated. The costs of developing software for government rise, naturally.

The pool is further diminished because developers who want to work with the latest hardware or software no longer work for agencies or contractors — they’ve wandered back to the private sector, where they can enjoy the fruits of the free market. The government software platform quickly begins to show its age, since the only developers on the platform are those who are paid to use it. Platforms that people truly want to use are in the open market, innovating at their leisure, out of the reach of government agencies.

As the government is no longer able to consume most commercially available software, it is now back in the business of writing software itself. More accurately, it is back in the business of hiring system integrators to write that software on its behalf. It’s the 1970s all over again. Budgets explode, and innovation grinds to a halt. It’s all the agencies can do just to tread water. System integrators, of course, are delighted. They’re now charging outrageous rates for the few programmers trained and willing to work on this mandated government platform.

Let’s hope there’s a better way.

This doesn’t diminish the problem that Daconta is hinting at, of course. Software reliability is certainly something to worry about. But there is no single solution or set of policy prescriptions that will solve the problem. I don’t think that imposing additional controls on the development of software makes sense, certainly not for all cases. There are already robust certification regimes and methods for software that does very important work: flying an airplane, controlling a nuclear reactor, and so forth. We don’t need that kind of scrutiny on our game consoles or desktops.

It’s important to note that these robust certifications only evaluate the software itself, not the people who make it. This is what’s great about software: we can examine the final product before it’s distributed. I don’t really mind if my software is written by a clever 7-year-old. If it’s doing the job it’s supposed to, that’s fine with me.

This focus on ends, rather than the means, is something you can’t do with a building or an airplane. With software, we can change our minds with far fewer consequences. We can thoroughly test and scrutinize before it’s in a customer’s hands. When we do find a flaw, it’s easier to patch software than it is a 777 or a skyscraper. We should take advantage of that fact, not try to make software rigid and inflexible just because we know how to manage rigid and inflexible things. Because software has these unique properties, we have the freedom to bring more or less scrutiny to bear, depending on the circumstances. Which is what we’re doing already, through programs like the federally-funded Software Engineering Institute at Carnegie Mellon.

So where Daconta sees mayhem and chaos, I see creativity and innovation. There are many opportunities to improve the reliability of our software, but none of them have to do with the process by which we arrive at a particular piece of code. Projects like David Wheeler’s OpenProofs, for example, can provide the tools we need to be mathematically sure software is doing what we intended. The Linux Test Project does this for Linux, and it was inspired in no small part by governments’ Common Criteria mandates.
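
To make “mathematically sure” a bit more concrete, here is a minimal, hypothetical sketch in Lean (my own toy example, not drawn from OpenProofs or the Linux Test Project). We define a trivial function, and the proof assistant mechanically checks a claim about it; real proofs of real software are far larger, but the principle is the same: the claim is verified by a machine, not by trust in the author.

    -- A toy definition: double a natural number by adding it to itself.
    def double (n : Nat) : Nat := n + n

    -- A machine-checked claim about the definition: double n is always 2 * n.
    -- If the proof were wrong, the checker would reject it.
    theorem double_eq_two_mul (n : Nat) : double n = 2 * n := by
      unfold double
      omega

    -- A concrete instance, verified by direct evaluation.
    example : double 21 = 42 := rfl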

This is, in fact, how the process should work. The government should set the requirements for reliability and assurance, and allow the private sector to innovate its way toward those requirements. If we only create software that we can understand perfectly, we lose the ability to be creative, to innovate, and to take advantage of the collective intelligence and cleverness of millions of software developers. We will never eliminate risk in software, but we can manage that risk, not through more stringent controls, but by encouraging as many smart people as possible to address the problem.