A couple of years ago, I likely would not have understood the significance of an open source group devoting itself to an NFV reference architecture, and how fundamentally different that is from an open standards process. But having seen the impact of OpenStack in the cloud world and OpenDaylight in software-defined networking, it's more apparent that today's announcement of the Open Platform for NFV Project by the Linux Foundation is a major step forward in accelerating network functions virtualization as a commercial process. (See Open NFV Group Uncloaks Its Platform Plan and Linux Foundation Announces Open NFV Group.)
Open source is fundamental to The New IP -- the more rigid telecom standards processes aren't dead but they are more rooted in the old IP.
Both the telecom standards process and open source projects involve buy-in from a broad swath of industry players in a public process to which companies make contributions. But open source contributions are made in code, which is tested and vetted, and changes to that code made by contributing companies are brought back to the community to maintain the integrity of the open source codebase (at least that's how it's supposed to work).
As Prodip Sen, CTO of the NFV Business Unit at HP Inc. (NYSE: HPQ) and a former Verizon executive, expressed it in his public statement on the new group, OPNFV is "a new approach to networking standardization" that uses "an iterative model, with an open source framework serving as the standardization mechanism" in place of an extended standardization model.
Developers from the participating companies -- and you can see the list here -- will meet for the first time this week in Santa Clara and hash out significant early details, including how and where their emerging reference architecture for an NFV infrastructure (NFV-I) and virtualized infrastructure management (VIM) will be tested. Who will be contributing the earliest code is also an issue, says Jim Zemlin, executive director of the Linux Foundation and a veteran of this process.
"What I expect is that you'll have development work that goes on at the Linux Foundation which is the pure development, and there may be some test and dev going on as part of a continuous integration process," he tells The New IP in an interview. "But certainly what usually happens is you then have testing labs and proof of concepts set up within various operator environments that will provide additional input and likely code back into the mainstream project. Again, pretty typically."
As Zemlin notes, open source and standards processes will likely coexist for the foreseeable future, and each has its role.
"I think there is an historical change with how reference implementations and standards interact -- it's kind of a push/pull," he says. "Standards heavily influence an open source implementation like this and vice-versa, and I think that is going to be an ongoing reality. There will be cases where developing a spec is useful, but there are cases where it's too slow or hard in the abstract, so you need an actual implementation to do it."
If you browse the individual statements of OPNFV participants, which you can see here, one of the most frequently mentioned words is "acceleration," and that is the key for many of these companies: to speed up NFV's arrival as a commercialized technology. That's where the open source process has its greatest advantage.
Participating companies realize that there is so much code to be written "for proper NFV to happen," Zemlin notes, that no single industry player is going to accomplish it on its own, and that realization brings all the players together in a common process for the greater good.
Doing this process under the leadership of the Linux Foundation also has certain benefits. "One of our jobs at the Linux Foundation will be educating these folks on how to share what they want to share and keep what they want to keep -- understanding what is non-differentiated plumbing from differentiated code," he says.
The Foundation's knowledge comes from having worked the open source process in other industries, such as the auto industry. Right now, competitors such as Toyota and Jaguar are sharing in the development of basic code, including kernels and middleware, knowing each will ultimately differentiate its products to consumers with the 20% of code it doesn't share.
In the case of the automakers, that will let them compete more effectively with the wide range of connected devices consumers will bring into their cars -- "with an iPad and a strip of Velcro," as Zemlin puts it. For telecom network operators, the competition is already out there delivering apps and content and threatening to both consume network resources in vast quantities and suck up the value-add dollars that would fund those networks and generate profits.
OPNFV's progress is something we will be watching closely here at The New IP, and we're interested to hear how you view this new effort.
— Carol Wilson, Editor-at-Large, Light Reading