I recently ran across an opinion piece with an attention-grabbing headline: "Did Amazon Just Kill Open Source?"
The author’s primary argument is that the industry is moving too fast for good architecture to take shape around open standards like inter-application protocols and common APIs, which take time and collaboration. He makes his case by pointing out that well-thought-out integration layers were a hallmark of early open source projects like Linux.
The article also layers in the idea that a glut of overlapping open source projects without clean integration layers has benefited Amazon’s “full stack” cloud approach, which is easier for developers than sifting through many similar options and tackling integration on their own.
His points are valid in some contexts, but they say little about open source as a concept or as a market model. Open source is here to stay. Even if managed cloud services replace some of what people do today with open source code, there will always be other aspects of any project that inevitably come from the community. The concept is simply too big and too powerful to die, even at the hands of the mighty Amazon! 😉
The other two questions he raises are far more interesting to me:
For smaller developers with relatively simple needs, there’s no question Amazon is knocking it out of the park – serving startups and smaller fast-moving projects has made AWS the 800-pound gorilla of the cloud space. But from large enterprise customers facing these questions, we are overwhelmingly hearing a different story, with three general priorities emerging:
Attention-grabbing headlines do get clicks, and there are still too many people who conflate “open source” and “open standard” (not the same thing, folks!), but every indication we see says larger enterprises are treading carefully into the cloud, determined not to repeat the mistakes they made in the datacenter.