Thursday, June 30, 2005

The current push by the big players in the software world to create and/or control the emerging SOA or WebServices market via MetaStandard standardisation seems to be a tactic learned from the producers of consumer electronics. The consumer electronics markets are managed: technology is presented in an orderly, even evolutionary manner. Revolutionary innovation is kept from the masses until the existing markets are saturated; the innovation is then rolled out across the board. (OK, it is not quite that bad; antitrust guidelines prevent it getting too much like collusion. See Carl Shapiro's standards paper for a nice discussion of the issues.)
It makes good sense: the market is huge, the pie is big enough for all the players, and the customers like the illusion of choice. Sure, there are exceptions once in a while, but in the main there is order.
I imagine it works like this: the major players share much of their R+D. They decide and standardise, in advance, the technologies that they will support and license from each other. They publish the standards (to ensure the network effect), market, manufacture and release the products, continue R+D and repeat the process.
In consumer electronics, this model works because the capital costs required to develop, manufacture and distribute a new product are real barriers to entry. The R+D embodied in the standards is very hard for someone outside the club to reproduce.
In software, much of this is turned on its head. The value of software is in the design. Good design takes experience, domain knowledge and skill. Once the design is embodied in code, the R+D job is mostly complete. The costs associated with manufacturing and distribution are close to zero. Marketing, too, can be very cost effective, with the web amplifying the word-of-mouth effect.
The emerging specifications for WebServices are very close to design; this is part of the nature of software and part of the nature of an interoperability specification. Through standardisation, the "men in black" are creating a market, but they are also designing a solution, one that is very easy to replicate.
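To make that point concrete, here is a minimal sketch, assuming a hypothetical endpoint and operation name, of how little implementation stands between a published WebServices specification and working code; everything in the envelope and headers below is dictated by the SOAP 1.1 specification itself:

```python
# Minimal SOAP 1.1 call built from the public specification alone,
# using only the Python standard library. The endpoint, namespace and
# operation (GetQuote) are hypothetical placeholders.
import urllib.request

ENDPOINT = "http://example.com/stockquote"  # hypothetical service

SOAP_REQUEST = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.com/ns">
      <Symbol>ACME</Symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=SOAP_REQUEST.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://example.com/ns/GetQuote",  # hypothetical
    },
)

# The wire format, headers and envelope structure above are all fixed by
# the specification; the "implementation" is little more than transcription.
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))
```

Anyone outside the club can write this from the published documents alone; that is the sense in which the standard gives the design away.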
Why, then, are they taking control of the standardisation process?
Is it simply because they realise that the end game is domain knowledge and customisation, or are they just ensuring that the market gains momentum fast, a rising tide to lift all boats (and hence their super tankers)?
Wednesday, June 29, 2005
Using Standards to create the future
Thinking about the place of open standards (how they are created and evolve; how they sometimes lead, sometimes evolve with, and sometimes follow a market) led me to consortium.org. This is a great resource for fact and opinion on standards bodies and consortia.
The OMG consortium, one of the first, had a vision, produced a specification and worked through the evolution of that specification. In the beginning the specifications led the market in a new direction; then the specifications followed the market, taking innovation (used as a differentiator between competing standard products) back into the standards. What followed was a period of 'evolution with the market': fixing problems, clarifying issues and needs, etc. The next phase was extending the OMG model into new ground, up the stack, towards the applications. This required taking the lead on the market again, but this second phase was not as successful. The same 'clear problem focus' was missing; the original vision was being diluted. In addition, the process (and vendors' embedded interests) were getting in the way. The result is captured in the view expressed in a survey on participating in standards development organisations (SDOs):
"Firstly, we (Sun) give very little consideration to SDOs, in large part because the rules are so arcane that we find that we get specifications with "maybe bits", rather than on/off bits. (possibly due to the fact that too many SDOs believe that a compromise where everyone is disenfranchised is a legitimate way to achieve "politically acceptable technical standardization"The lesson may be, that a clear problem focus is key to any standards effort and that efforts to build past this initial problem, to fully capitalize on the first success, are best left until the next clear problem arises. There appears to be a time to "let go" that comes once a solution to the original problem has evolved. Hold on past this evolution stage and you smother the opportunity to build on the original success.
The latest generation of standards bodies (the MetaStandards around Web Services and SOA) appears to be taking a very different approach. Rather than being focused on a common problem, they are focused on a common market. They are using the standards to give credibility and cohesion to the market and to build momentum and awareness about the technology. They are building the implementation and developing the standards at the same time. This jumps straight to the evolution stage of the standard, but without a clear statement of the problem; the "use cases" are being generated on the fly, in reaction to the market's response to the marketing messages. Maybe this is the perfect iterative design metaphor: produce a working trio of implementation, standards and marketing, present it to the market for review, evaluate the response and try again. What is clear is that the so-called "Men In Black" (Microsoft, IBM and BEA Systems) are really taking Alan Kay's quote to heart:
"The best way to predict the future is to invent it."The capability and power of the "Men In Black" is not in question, if any group can create the future then this combination can. It will be interesting to see how far the solution goes past the interoperability play, how much control and scope do these standards want to have. Will they learn from the OMG case and stop once a clear problem and solution have evolved or is this new model of creating the future simply better and can accomplish more?
The fruits of the Open Source community may provide an alternative. Open source is firmly based on solving real problems. It may be in the interest of the market to take some control back, to support initiatives that have the problem solution, rather than the market opportunity, at their core. Our needs will be better met by a problem solved than by the opportunity for more problems being created.
Is this just a classic tale of technology churn, albeit a very well-executed one?
New Shape of Knowledge
More goodness from one of the co-authors of the cluetrain, on how the web is changing the way we think:
"That's because we've thought of our minds as containers. But the Web is made of links - pages pointing outside of themselves to other pages - each a little act of generosity."It is an embodiment of how we learn by talking with others, exploring ideas and concepts, the notion of knowledge as a conversation that is now captured in the web in an open manner is very compelling, this blog entry and blog is worth a read
"Conversation is a paradox because it iterates difference on a common ground. That a paradox happens every day is a miracle."The insight about the way the web changes the politics of knowledge by severing the links between knowledge, organisation of knowledge and ownership is useful. That we can still make sense of all this knowledge is the real miracle.
Monday, June 27, 2005
Grady Booch on Technology Churn
From an MSDN chat with Grady Booch, some straightforward good advice:
"Technology churn is always a challenge... I would therefore offer the general guidance to those teams to focus on some of the best development practices: architecture first, the use of modeling, controlling your change management; these are fundamentals that will prepare you to absorb technological change, which you'll have to do over time."

Then a word on adoption in the context of emerging Web Services:
"I expect we will see organizations struggle to build their own kind of Web Services because they would initially choose to not build upon public services out there, unless they are provided by a platform member such as Microsoft. It simply has to do with trust, availability, security, and scalability. And so they will probably dabble by building their own services, and later build systems upon public services."Just trust, availability, security, and scalability; these are so important, so real and the reason why software engineering as a profession has so far to go. Sure we need to innovate, but as consumers we need to take some control, it is fair enough to have to dabble, to experiment, but not all the time. To develop a full understanding of a system takes time, stability of purpose and context, in the constant churning world of Software we have neither. So ensure we have arcitecture and models and ensure we know what our system does and how we validate that it does it. That is, write real tests for it. Only then are we in a position to evaluate the next wave that churns our way and empirically decide whether it is of benefit.
Utility solutions and the IT Hierarchy
Again, a lively debate follows from a publication by Carr implying that there will be no need for low-level corporate IT; IT functions will be available as a utility service. While ASPs, outsourcing, shared data centers etc. are all a reality today, can all IT eventually be served in this way?
I think the hierarchy approach is a useful construct to help answer this question. It causes us to look at the computing needs of an organization in an ordered fashion. Recent details of IT spending discussed in The End of Corporate IT? Not Quite - Computerworld allude to the need for some separation in Carr's argument, a move away from the 'one size fits all' approach:
"In many ways, basic IT infrastructure has indeed become a commodity that should be treated as a utility where cost reductions reign. However, lumping all IT investments into the commodity category is the critical oversight in Carr's argument."

The un-lumping model is then described:
Much like Maslow's Hierarchy of Needs in human development, the IT Hierarchy of Needs segments IT spending into four progressive levels.
The first level is basic IT infrastructure: the core foundation for corporate computing, including servers, networking, storage, desktops, mobility and telecommunications.
The second level includes the tools to automate manual tasks and processes, streamline transactions and foster creativity and collaboration.
The third level includes all applications to support the collection, visualization and application of information to measure the business and drive improved performance.
The fourth and highest level is how a company uses its information to change the playing field by creating different relationships with suppliers, partners and customers, as well as applying competitive insight.
Looking through this IT Hierarchy lens, I think the limits of utility computing will be tied to the lower layers of the hierarchy, coupled with the evolution of standardised, application-specific solutions. The space occupied by ASPs today (web hosting, payroll, CRM) will grow to encompass the tools and services that are of utility along and across industry sectors, as best practice becomes apparent.
However, the middle and higher order functions will remain elusive because of their specificity and their value.
Looking for cost reductions in the middle order functions may require a pooling of resources, an openness to sharing best practice and a resistance to technology churn. The use of information is the key: how it is obtained, maintained and shared can become common practice, a standard or an open solution for that particular community. The open innovation model, coupled with an open source approach at the solution level, could provide a framework here.
The highest order functions will remain out of the realm of community and utility because they will be the mainstay of competitive advantage and are too valuable to an individual organisation to share.
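A minimal sketch of this argument as a data structure (the sourcing hints are my paraphrase of the reasoning above, not from the Computerworld article):

```python
# The four-level IT Hierarchy of Needs, with a toy rule of thumb from
# the argument above: the lower the level, the better the fit for
# utility/commodity provision.
IT_HIERARCHY = {
    1: "basic infrastructure (servers, networking, storage, desktops)",
    2: "tools to automate tasks and foster collaboration",
    3: "applications that collect and visualise information",
    4: "information used to change the playing field",
}

def sourcing_hint(level: int) -> str:
    if level == 1:
        return "utility candidate: buy as a commodity service"
    if level == 2:
        return "emerging utility: ASP-style offerings as best practice settles"
    if level == 3:
        return "community candidate: pool resources, share practice, resist churn"
    return "keep in-house: too valuable and too specific to share"

for level, description in IT_HIERARCHY.items():
    print(f"Level {level}: {description}\n  -> {sourcing_hint(level)}")
```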
Thursday, June 23, 2005
We know more than we can tell.
Jon's Radio references a telling phrase: "The Tacit Dimension of Tech Support" refers to The Tacit Dimension, a 1967 book by the scientist/philosopher Michael Polanyi. One of his touchstone phrases was: "We know more than we can tell."
This is one reason why the agile 'workwith' is so important. Seeing, sharing and doing with another person is the best way to learn. It takes time to assimilate in this way, but also some stability, in both context and purpose: elements common to a traditional apprenticeship.
Tuesday, June 21, 2005
The open source meritocracy model - move it to the systems/solutions level
Dave Thomas and Andy Hunt do a wonderful job of describing the open source ecosystem, identifying the key motivators and practices that make it work. The mix of capitalism and communism, through meritocracy and community, provides an interesting balance.
Adopting some of these practices in commercial product development makes good sense and has been successfully demonstrated by the agile community. Going further, I would like to see the same open practices applied higher up the food chain, in systems or solutions development. The move up the food chain is already occurring with open source in the web and application development arena; projects like Ruby on Rails and Spring are taking on more and more of the infrastructure tasks of the developer. More solution-oriented projects are also gaining traction in the CRM space.
An open, solution-oriented approach need not be limited to open source. Building open solutions with commodity components is also a possibility. Once the key capabilities of the commodity are identified and well understood, opening up the ways in which these capabilities can be used and reused in different configurations and contexts could provide a valuable community resource. Taking from the cluetrain mantra:
"#39 The community of discourse is the market."

The community can determine the net value of a commodity, the appropriate level of functionality and the appropriate life span. Systems can be built from software that is proven, by the market, to be fit for a purpose. The meritocracy would reveal itself as best practice; the community would facilitate the diffusion of knowledge, allowing the replication and ongoing maintenance of open working systems. It could be a powerful force in reducing the technology churn that plagues aspects of the application portfolio.
Thursday, June 09, 2005
When piracy works ....
Using formal economic modeling, professors Pankaj Ghemawat and Ramon Casadesus-Masanell consider the competitive dynamics of the software wars between Microsoft and open source. An interesting side effect of piracy is observed:
"We also look at the effect of piracy and ask whether piracy can ever be beneficial to Microsoft. This extension was motivated by analyzing data on a cross-section of countries on Linux penetration and piracy rates. We found that in countries where piracy is highest, Linux has the lowest penetration rate. The model shows that Microsoft can use piracy as an effective tool to price discriminate, and that piracy may even result in higher profits to Microsoft!"
Piracy effectively reduces the cost to zero but increases the network effect. Understandable, but surprising.
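As a toy illustration of the mechanism (emphatically not the authors' model; the segment sizes, prices and linear network-value function below are all invented for this sketch):

```python
# Toy two-segment model: businesses pay for licences, consumers will
# pirate if allowed. Every user, paying or not, adds network value.
BUSINESSES = 100      # paying segment (licence compliance enforced)
CONSUMERS = 1000      # segment that pirates when it can
BASE_VALUE = 50.0     # standalone value of the software to a business
NETWORK_COEFF = 0.1   # extra value per user in the installed base

def business_willingness_to_pay(installed_base: int) -> float:
    # More users mean more documents, skills and complements to share.
    return BASE_VALUE + NETWORK_COEFF * installed_base

# Piracy blocked: consumers defect to an alternative platform.
profit_blocked = BUSINESSES * business_willingness_to_pay(BUSINESSES)

# Piracy tolerated: consumers join the installed base for free.
profit_tolerated = BUSINESSES * business_willingness_to_pay(BUSINESSES + CONSUMERS)

print(f"piracy blocked:   profit = {profit_blocked:,.0f}")    # 6,000
print(f"piracy tolerated: profit = {profit_tolerated:,.0f}")  # 16,000
```

The pirating segment pays nothing, yet the vendor can charge the paying segment more; piracy acts as a crude price-discrimination tool, which is the effect the paper describes.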