There is an interesting debate taking place on InfoQ: What role will the JCP play in Java's future?
"Alex Blewitt described the Java Community Process (JCP) as dead, likening it to a headless chicken which "doesn't realise it yet and it's still running around, but it's already dead". This touched off a debate over the usefulness of the JCP and how much it will play a role in Java's future"
The traditional role of standards has been to define a spec that everyone complies with so that, consequently, an open market around the standard can emerge. This, presumably, enables customers to choose the best implementation and avoid vendor lock-in.
A quick look at current trends reveals that technologies like PHP and the Spring Framework, neither of which is a formal standard, are being adopted more rapidly than many existing formal standards. These frameworks can respond quickly to market needs, while traditional formal standards lag behind.
Perhaps more importantly, even projects that use a formal standard, such as Java EE, typically make heavy use of non-standard, proprietary components that come with the products implementing the standard. So even in these cases, the fact that only a part -- sometimes a small part -- of the project complies with the standard significantly reduces the standard's value. This is especially true in the Java world.
Alex Blewitt focuses on the reasons why the JCP isn't successful, but his arguments hold for other software standards bodies as well. IMO there is a more fundamental problem that goes beyond the JCP or any other process. At the heart of it is the fact that the primary focus of such standards bodies is on agreeing on a specific API or protocol. That focus is too low-level and is often tied to specific semantics, leaving too little room for innovation and optimization, and therefore failing to encourage healthy competition among implementations of the standard. This makes it very hard to find common ground for agreement, and it often leads to ugly compromises and tediously protracted processes. Those compromises and delays breed user dissatisfaction, and consequently a lack of adoption -- and the vicious cycle continues.
Right now we are experiencing a Catch-22: we want an open market and freedom from vendor lock-in, but the process for achieving that openness via standards is itself somewhat closed -- and basically broken. It is this realization that led me to ask: who needs standards, anyway?
Keeping the market open is an important goal that we should continue to strive for. What we need to do is change our thinking and break away from the traditional approach to standards as we know them today.
Here are some ideas to help address the issues:
Leverage open source communities and product adoption as a way of defining de facto standards
We don't need a regulatory process to determine which projects qualify as standards. Anyone who thinks they have a good idea can spawn their own project and do pretty much whatever they like with it. Adoption, or lack thereof, will be the measure of the project's relevance. In other words, if you're doing the right things and solving real problems, that will most likely be reflected in higher adoption. Adoption is an excellent tool for measuring success. To make this tool even more effective, you could also apply a ranking of adoption, similar to Google PageRank. The rank could be based on the number of downloads, the number of references in the blogosphere, the quality of those references, and so on. One of the implicit benefits of this model is that it puts power back in the hands of developers. It also creates a highly competitive environment that encourages innovation and alternative thinking.
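To make the ranking idea concrete, here is a minimal sketch of such an adoption score in Java. Everything in it is hypothetical -- the metrics, the weights and the names are illustrative, not any existing ranking system:

// Hypothetical adoption score; weights and metrics are illustrative only.
public class AdoptionRank {

    // Arbitrary weights; a real ranking would tune them against observed adoption.
    private static final double DOWNLOAD_WEIGHT  = 0.5;
    private static final double REFERENCE_WEIGHT = 0.5;

    /**
     * Combines raw download counts with blogosphere references,
     * weighting each reference by an estimated quality score (0.0 to 1.0).
     */
    public static double score(long downloads, double[] referenceQuality) {
        double referenceScore = 0.0;
        for (double quality : referenceQuality) {
            referenceScore += quality;
        }
        // Log-scale downloads so a handful of huge projects don't drown out the rest.
        return DOWNLOAD_WEIGHT * Math.log1p(downloads)
             + REFERENCE_WEIGHT * referenceScore;
    }

    public static void main(String[] args) {
        // Two hypothetical projects: many downloads vs. fewer but well-regarded references.
        System.out.println(score(1_000_000, new double[] {0.2, 0.3}));
        System.out.println(score(50_000, new double[] {0.9, 0.8, 0.95}));
    }
}

The point is not the particular formula but that the inputs are all public, measurable signals, so the "standards body" becomes the community itself.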
Keep It Loose
Leverage new methodologies, such as the declarative abstraction model (dependency injection), annotations, and Aspect-Oriented Programming (AOP), to create a loose plug-in model. With this approach, we can plug in different implementations that don't necessarily comply with the same API -- or even the same technology. Mule is a very good example of this approach.
With Mule you can plug in different protocol implementations, as well as different event sources -- such as JMS, Web Services, JavaSpaces and others -- and inject events into your business logic from any of those sources in the same way. The key is that you can achieve this without changing your business logic every time you add a new data source, and without forcing the use of a specific event-handling API, such as JMS. Spring Remoting does pretty much the same thing: we can write our service as a POJO and use the remoting abstraction to invoke it through a local proxy or a remote proxy, including dynamically created proxies, without binding the implementation to a specific protocol (IIOP, JRMP) or model.
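Here is a minimal dependency-injection sketch of that plug-in model in Java. The names (EventSource, OrderProcessor) are hypothetical and are not Mule's or Spring's actual API:

// Minimal sketch of the loose plug-in model; names are hypothetical.

// The business logic depends only on this narrow abstraction...
interface EventSource {
    String nextEvent();
}

// ...so transports can be swapped without touching it.
class OrderProcessor {
    private final EventSource source; // injected: JMS, a web service, JavaSpaces, etc.

    OrderProcessor(EventSource source) {
        this.source = source;
    }

    void processNext() {
        String event = source.nextEvent();
        System.out.println("processing: " + event);
    }
}

public class PluginDemo {
    public static void main(String[] args) {
        // Swapping implementations is a one-line wiring change, not a code change.
        OrderProcessor processor = new OrderProcessor(() -> "order-42 from a stub source");
        processor.processNext();
    }
}

In a container such as Spring, that constructor wiring would live in configuration, so adding a new event source becomes a deployment decision rather than a code change.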
Commercial vendors can be involved in this process by contributing back to these open source projects, and through that influence their evolution. A good example of this is MySQL, which is backed by big companies such as SAP, and to which companies like Google and eBay have contributed large portions of code when they found that it didn't meet their needs. They are not making these contributions for philanthropic reasons. Their interest is quite straightforward: if it's part of an open source project, it's going to be maintained and tested by an entire community of users, and it stands to become a de facto standard.
Now don't get me wrong: this model is far from perfect. One of the biggest challenges is that we will end up with too many options to choose from and too many frameworks that don't work together.
However, if we consider the LAMP stack, and how it evolved without a big brother controlling the process, there is reason to be optimistic that this model can work -- and that it is the best one we have.