Tuesday, October 11, 2005
This story contains the essence of Einstein's discovery
What exactly does 'E = mc²' mean? Sure, I had an idea, but Brian Greene tells a story that captures the essence of Einstein's discovery. If that whole physics thing floats your boat, you will like this article.
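As a back-of-the-envelope illustration (my numbers, not Greene's) of just how much energy sits in a little mass:

$E = mc^2$
$E = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^2 = 9 \times 10^{13}\,\mathrm{J}$

That is roughly 21 kilotons of TNT equivalent, from a single gram of mass.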
Friday, September 23, 2005
The dichotomy of hard versus valuable and an interest in the business side of things
On I, Cringely's NerdTV, Max Levchin makes a nice observation about engineers when asked about the need to understand the business side of things:
".... I think there's a huge benefit to knowing what it is you're working for. And there's definitely - one of the typical pitfalls in the world of engineering - and I don't mean to slight my fellow engineers, but - is this dichotomy of hard versus valuable. And lots of engineers mistake hard for valuable."
There is no point spending hours figuring out a hard problem if no one cares about the solution; valuable problems are the ones tied to markets, so a bit of business savvy is important!
Wednesday, September 14, 2005
del.icio.us/tag/xml+opensource+java
A perfect example of internet-enabled, spontaneous collaboration. del.icio.us infoware provides a useful service, keeping your favorites together and accessible from any internet cafe, but the process of using it creates even more value.
Everyone gets to benefit from the sensible tagging of an individual. It is like an expert-filtered Google, in that all the references have already been specially chosen for their content.
It is like a small bit of sense multiplied by everyone!
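A rough sketch of the mechanism behind a tag-intersection query like del.icio.us/tag/xml+opensource+java (the bookmark data is invented, and this is not del.icio.us's actual implementation): each person's small act of tagging builds a shared index that anyone can filter.

bookmarks = [
    {"url": "http://example.org/jaxb-intro", "tags": {"xml", "java"}},
    {"url": "http://example.org/xerces", "tags": {"xml", "opensource", "java"}},
    {"url": "http://example.org/rails", "tags": {"ruby", "opensource"}},
]

def by_tags(items, *wanted):
    # A bookmark matches when it carries every requested tag (set intersection).
    return [b for b in items if set(wanted) <= b["tags"]]

for hit in by_tags(bookmarks, "xml", "opensource", "java"):
    print(hit["url"])  # -> http://example.org/xerces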
Wednesday, July 27, 2005
Open source solutions will precipitate proprietary opportunity
Open source has changed the game from a software-cost point of view. The traditional model, with revenue split roughly 70/30 between support and licensing, is changing to a full-service model: the 30% has disappeared. This "cost free" element of Free and Open Source Software has launched a bunch of new commodities; LAMP, for example, is a serious contender to .NET or J2EE, and the fact that it is freely available has been hugely influential here.
But the software bits are not the whole picture; it is the solutions built on them that matter. I think we will find that the open source commodities will precipitate the commoditisation of traditional closed source or proprietary components through complete integrated solutions. The proprietary modules that are really good at what they do and that play well with open source will be successful: that is, they are easily integrated with existing open source components and platforms, well supported, reasonably priced and reasonably open (from a standards and extensibility point of view).
The proprietary components that work well with open source will become commodities on the back of the open source wave; the customers/market will decide what works well and what does not. Essentially, the overall integrated solution will be open, and the proprietary bits that integrate easily will form an integral part of that solution.
Other users with the same itch will try the same solution; to replicate the proven complete solution, they will pay for a suitable proprietary component.
One problem with this model is that a real open source alternative commodity will eventually get created by the community, unless of course there are large barriers to entry, like huge complexity or the need for a mainframe to test on. It may simply mean that traditional proprietary software has a very short life span: build it, charge for it, then give it away once a 'critical mass' of use can provide a support revenue stream.
Hybrid open solutions that combine mostly FOSS with proprietary modules are inevitable; I think a software company can find a niche there with a traditional business model applied in 'internet time'.
Services companies will find a niche in customization, and the commonality therein will be the genesis of a traditional product.
Monday, July 04, 2005
Branded Open Source
Falling for a brand, Lajos Moczar describes how it can happen:
"The most important point I can make here is this: if you cannot articulate your business needs and identify IT products that can help you fulfill those needs, you end up thinking in terms of concepts. When you think in terms of concepts, whether feature buzzwords or more general buzzwords like 'mission critical' or 'enterprise interoperability', your thinking is divorced from your needs. And when that happens, the marketing departments have got you. No matter how objective you think you are, you will inevitably find yourself choosing a brand name."
This comment is in the context of Commercial Open Source (COS) organisations that are using branding to make money out of the open source movement. He talks of new monopolies controlling the innovation. I think there is a lot of value in the alternative openstructure approach, where the customer/market determines the best fit for a problem based on the context and the prevailing community view.
COS may lead to unnecessary duplication and bias. The duplication comes from competing with other branded open source; we will see this with JBoss and Gluecode now that IBM has taken an interest in Gluecode. The bias will come from following a brand rather than a technology or implementation choice. The meritocracy of open source will be lost to the power of marketing.
But maybe the saviour is the brand itself: with IBM using the open source stick to beat JBoss, it will be a brand war between IBM and JBoss, and one that will be hard for JBoss to win.
The trump card for open source is the community: while customers may flock to the brand, the community should target the most interesting and appropriate technology. The community should be (and I think is) brand agnostic; the code is the reference.
The only problem is that, at the moment, the community does not have the buying power; there are no community solutions, so brand still dominates the procurement process. Open solutions are the way forward.
Friday, July 01, 2005
Google's view of the world is just opinion
The Google-opoly of PageRank is just their opinion, nothing more; they can change it at any time. It came up in court and looks likely to remain that way. From Slate:
"A terrific analysis of the case from James Grimmelmann on LawMeme suggests that Google is more than just sorting Internet content: It's a "gatekeeper" that effectively bars access to anything ranked lower than 200 by its ranking system. Massa seems to commit legal hara-kiri on his Web site by conceding that Google's opinions are protected by the First Amendment."
If your page rank changes overnight and the gatekeeper decides to close the gate, for whatever reason, there is nothing you can do. In Google we trust, but I guess they know that.
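For context on what that "opinion" is: a score computed by an algorithm whose parameters Google chooses. A toy sketch of PageRank-style power iteration (the three-page graph and iteration count are illustrative; Google's production ranking is far more involved):

# Toy PageRank by power iteration; d = 0.85 is the damping factor from the
# original PageRank paper, everything else here is made up.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
d = 0.85

for _ in range(50):
    new = {p: (1 - d) / len(pages) for p in pages}
    for p, outs in links.items():
        for q in outs:
            new[q] += d * rank[p] / len(outs)
    rank = new

print(rank)  # the "opinion": change d, or one edge, and the ranking changes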
The Open Source Interest Horizon
Clay Shirky writes about the open source interest horizon (one of its limiting factors) and the future of open source, where it provides the building blocks from which customised systems are built:
"This is the future of Open Source. As the edges of the interest horizon become clear, the market for products will shrink and the market for customization will grow: Expect the Computer Associates of the world to begin training their employees in the use and customization of Open Source products. With Open Source software putting core set of functions in place, software companies will become more like consulting firms and consulting firms will become more like software companies, not duplicating the same basic functions over and over, but concentrating their efforts on the work that lies over the interest horizon."I agree, using the analogy of the architect from civil engineering. Consider building a new bridge, three architects will produce three different designs, but each will be proven to satisfy the requirements because the underlying core components are well understood. For civil engineering this comes from standards, professional indemnification, laws of physics etc. The chosen design could well be the most aesthetically pleasing design or the one considered the best fit with the environment. The point being that the decision can be based on intangibles, qualitative variables, because the core functionality is a given.
With dependable open source components, or building blocks, the software architects of the future will be able to produce alternative designs that meet the same underlying need or set of requirements. They can concentrate on the customization, on demonstrating that they best understand the customer's wants and needs. The customers get the real benefit, because they will be able to choose the customized system that they really want from a valid set of competing and mostly equal options.
In the absence of hard and fast rules and standards for software components, open source commodity components can provide de facto standards.
The crucial element is that these components grow out of a need to solve a real problem in context; the problem is at the heart of the solution.
Clay cites the interest horizon as one of the limiting factors of open source. I am not sure I agree: interest is one of the key motivators, but as the value of the model becomes apparent to more stakeholders, interest will grow in all sorts of small groups and niche markets. I think open source will become a core activity of most professional software developers; it will be their day job rather than a hobby, and their interest will be maintained by a salary at the end of the week.
Like the architect, the software engineer's real skill will be in integration and customisation: understanding the real requirements of the system, providing domain knowledge and, finally, bridging the chasm between customer needs and service deliverables. The fact that new code artifacts are added to the public domain as a result will just be a nice side effect.
The future with open source building blocks is bright.
Thursday, June 30, 2005
Web services market - wide open for open source
The current push by the big players in the software world to create and/or control the emerging SOA or Web Services market via MetaStandard standardisation seems to be a tactic learned from the producers of consumer electronics. The consumer electronics markets are managed: technology is presented in an orderly, even evolutionary, manner. Revolutionary innovation is kept from the masses until the existing markets are saturated; the innovation is then rolled out across the board. (OK, it is not quite that bad; antitrust guidelines prevent it getting too much like collusion. See Carl Shapiro's standards paper for a nice discussion of the issues.)
It makes good sense: the market is huge, the pie is big enough for all the players, and the customers like the illusion of choice. Sure, there are exceptions once in a while, but in the main there is order.
I imagine it works like this: the major players share much of their R&D. They decide and standardize in advance the technologies that they will support and license from each other. They publish the standards (to ensure the network effect), market, manufacture and release the products, continue R&D, and repeat the process.
In consumer electronics, this model works because the capital costs required to develop, manufacture and distribute a new product are real barriers to entry. The R&D embodied in the standards is very hard to reproduce by someone outside the club.
In software, much of this is turned on its head. The value of software is in the design. Good design takes experience, domain knowledge and skill. Once the design is embodied in code, the R&D job is mostly complete. The costs associated with manufacturing and distribution are close to zero. Marketing, too, can be very cost effective, with the web amplifying the word-of-mouth effect.
The emerging specifications for Web Services are very close to design; this is partly the nature of software and partly the nature of an interoperability specification. Through standardisation, the "men in black" are creating a market, but they are also designing a solution, and one that is very easy to replicate.
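To make the "very close to design" point concrete, the wire format these specifications describe is trivial to reproduce. A minimal SOAP 1.1 envelope built with Python's standard library (the GetQuote operation and its namespace are invented for illustration):

import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope namespace
ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element("{%s}Envelope" % SOAP_NS)
body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
op = ET.SubElement(body, "{http://example.org/quotes}GetQuote")  # invented operation
ET.SubElement(op, "symbol").text = "IBM"

print(ET.tostring(envelope, encoding="unicode"))

Anyone with the specification in hand can emit and parse this; the barrier to entry is reading the document, not owning a factory.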
Why, then, are they taking control of the standardisation process?
Is it simply because they realize that the end game is domain knowledge and customisation, or are they just ensuring that the market gains momentum fast: a rising tide to lift all boats (and hence their super tankers)?
Wednesday, June 29, 2005
Using Standards to create the future
Thinking about the place of open standards, how they are created and evolve, how they sometimes lead, sometimes evolve with, and sometimes follow a market, led me to consortium.org, a great resource for fact and opinion on standards bodies and consortia.
The OMG consortium, one of the first, had a vision, produced a specification and worked through the evolution of the specification. In the beginning, the specifications led the market in a new direction; then the specifications followed the market, taking innovation (used as a differentiator between competing standard products) back into the standards. What followed was a period of 'evolution with the market': fixing problems, clarifying issues and needs, and so on. The next phase was extending the OMG model into new ground, up the stack, towards the applications. This required taking the lead on the market again, but this second phase was not as successful. The same "clear problem focus" was missing, and the original vision was being diluted. In addition, the process (and vendors' embedded interests) were getting in the way; the result is captured in the view expressed in a survey on participating in standards development organisations (SDOs):
"Firstly, we (Sun) give very little consideration to SDOs, in large part because the rules are so arcane that we find that we get specifications with "maybe bits", rather than on/off bits. (possibly due to the fact that too many SDOs believe that a compromise where everyone is disenfranchised is a legitimate way to achieve "politically acceptable technical standardization"The lesson may be, that a clear problem focus is key to any standards effort and that efforts to build past this initial problem, to fully capitalize on the first success, are best left until the next clear problem arises. There appears to be a time to "let go" that comes once a solution to the original problem has evolved. Hold on past this evolution stage and you smother the opportunity to build on the original success.
The latest generation of standards bodies (the MetaStandards around Web Services and SOA) appears to be taking a very different approach. Rather than being focused on a common problem, they are focused on a common market. They are using the standards to give credibility and cohesion to the market and to build momentum and awareness around the technology. They are building the implementations and developing the standards at the same time: jumping straight to the evolution stage of the standard, but without a clear statement of the problem; the "use cases" are being generated on the fly, in reaction to the market's response to the marketing messages. Maybe this is the perfect iterative design metaphor: produce a working trio of implementation, standards and marketing, present it to the market for review, evaluate the response and try again. What is clear is that the so-called "Men In Black" (Microsoft, IBM and BEA Systems) are really taking Alan Kay's quote to heart:
"The best way to predict the future is to invent it."The capability and power of the "Men In Black" is not in question, if any group can create the future then this combination can. It will be interesting to see how far the solution goes past the interoperability play, how much control and scope do these standards want to have. Will they learn from the OMG case and stop once a clear problem and solution have evolved or is this new model of creating the future simply better and can accomplish more?
The fruits of the Open Source community may provide an alternative. Open source is firmly based on solving real problems. It may be in the interest of the market to take some control back, to support initiatives that have the problem solution, rather than the market opportunity, at their core. Our needs will be better met by a problem solved than by the opportunity for more problems being created.
Is this just a classic tale of technology churn, albeit a very well executed one?
New Shape of Knowledge
More goodness from one of the co-authors of the Cluetrain Manifesto, on how the web is changing how we think:
"That's because we've thought of our minds as containers. But the Web is made of links - pages pointing outside of themselves to other pages - each a little act of generosity."It is an embodiment of how we learn by talking with others, exploring ideas and concepts, the notion of knowledge as a conversation that is now captured in the web in an open manner is very compelling, this blog entry and blog is worth a read
"Conversation is a paradox because it iterates difference on a common ground. That a paradox happens every day is a miracle."The insight about the way the web changes the politics of knowledge by severing the links between knowledge, organisation of knowledge and ownership is useful. That we can still make sense of all this knowledge is the real miracle.
Monday, June 27, 2005
Grady Booch on Technology Churn
From an MSDN chat with Grady Booch, some straightforward, good advice:
"Technology churn is always a challenge.... ...I would therefore offer the general guidance to those teams to focus on some of the best development practices, architecture first, the use of modeling, controlling your change management; these are fundamentals that will prepare you to absorb technological change, which you'll have to do over time."
Then a word on adoption in the context of emerging Web Services:
"I expect we will see organizations struggle to build their own kind of Web Services because they would initially choose to not build upon public services out there, unless they are provided by a platform member such as Microsoft. It simply has to do with trust, availability, security, and scalability. And so they will probably dabble by building their own services, and later build systems upon public services."Just trust, availability, security, and scalability; these are so important, so real and the reason why software engineering as a profession has so far to go. Sure we need to innovate, but as consumers we need to take some control, it is fair enough to have to dabble, to experiment, but not all the time. To develop a full understanding of a system takes time, stability of purpose and context, in the constant churning world of Software we have neither. So ensure we have arcitecture and models and ensure we know what our system does and how we validate that it does it. That is, write real tests for it. Only then are we in a position to evaluate the next wave that churns our way and empirically decide whether it is of benefit.
Utility solutions and the IT Hierarchy
Again, a lively debate follows from a publication by Carr implying that there will be no need for low-level corporate IT; IT functions will be available as a utility service. While ASPs, outsourcing, shared data centers and the like are all a reality today, can all IT eventually be served in this way?
I think the hierarchy approach is a useful construct to help answer this question. It causes us to look at the computing needs of an organization in an ordered fashion. Recent details of IT spending discussed in The End of Corporate IT? Not Quite - Computerworld allude to the need for some separation in Carr's argument; a move from the 'one size fits all' approach:
"In many ways, basic IT infrastructure has indeed become a commodity that should be treated as a utility where cost reductions reign. However, lumping all IT investments into the commodity category is the critical oversight in Carr's argument."
The un-lumping model is then described:
Much like Maslow's Hierarchy of Needs in human development, the IT Hierarchy of Needs segments IT spending into four progressive levels.
The first level is basic IT infrastructure: the core foundation for corporate computing including servers, networking, storage, desktops, mobility and telecommunications.
The second level includes the tools to automate manual tasks and processes, streamline transactions and foster creativity and collaboration.
The third level includes all applications to support the collection, visualization and application of information to measure the business and drive improved performance.
The fourth and highest level is how a company uses its information to change the playing field by creating different relationships with suppliers, partners and customers, as well as applying competitive insight.
Looking through this IT Hierarchy lens, I think the limits of utility computing will be tied to the lower layers of the hierarchy, coupled with the evolution of standardised, application-specific solutions. The space occupied by ASPs today (web hosting, payroll, CRM) will grow to encompass the tools and services that are of utility along and across industry sectors as best practice becomes apparent.
However, the middle and higher-order functions will remain elusive because of their specificity and their value.
Looking for cost reductions in the middle-order functions may require a pooling of resources, an openness to sharing best practice and a resistance to technology churn. The use of information is the key: how it is obtained, maintained and shared can become common practice, a standard or an open solution for that particular community. The open innovation model, coupled with an open source approach at the solution level, could provide a framework here.
The highest-order functions will remain out of the realm of community and utility because they are the mainstay of competitive advantage and too valuable to an individual organisation to share.
Thursday, June 23, 2005
We know more than we can tell.
Jon's Radio references a telling phrase: "The Tacit Dimension of Tech Support" refers to The Tacit Dimension, a 1967 book by the scientist/philosopher Michael Polanyi. One of his touchstone phrases was: "We know more than we can tell."
This is one reason why the agile 'workwith' is so important. Seeing, sharing and doing with another person is the best way to learn. It takes time to assimilate in this way, but also some stability in both context and purpose; elements common to a traditional apprenticeship.
Tuesday, June 21, 2005
The open source meritocracy model - move it to the systems/solutions level
Dave Thomas and Andy Hunt do a wonderful job of describing the open source ecosystem, identifying the key motivators and practices that make it work. The mix of capitalism and communism, through meritocracy and community, provides an interesting balance.
Adopting some of these practices in commercial product development makes good sense and has been successfully demonstrated by the agile community. Going further, I would like to see the same open practices applied higher up the food chain, in systems or solutions development. The move up the food chain is already occurring with open source in the web and application development arena; projects like Ruby on Rails and Spring are taking on more and more of the developer's infrastructure tasks, and more solution-oriented projects are gaining traction in the CRM space.
An open, solution-oriented approach need not be limited to open source. Building open solutions with commodity components is also a possibility. Once the key capabilities of the commodity are identified and well understood, opening up the ways in which those capabilities can be used and reused in different configurations and contexts could provide a valuable community resource. Taking from the Cluetrain mantra:
"#39 The community of discourse is the market."
The community can determine the net value of a commodity, the appropriate level of functionality and the appropriate life span. Systems can be built from software that is proven, by the market, to be fit for a purpose. The meritocracy would reveal itself as best practice; the community would facilitate the diffusion of knowledge to allow the replication and ongoing maintenance of open working systems. It could be a powerful force in reducing the technology churn that plagues aspects of the application portfolio.
Thursday, June 09, 2005
When piracy works ....
Using formal economic modeling, professors Pankaj Ghemawat and Ramon Casadesus-Masanell consider the competitive dynamics of the software wars between Microsoft and open source; an interesting side effect of piracy is observed:
"We also look at the effect of piracy and ask whether piracy can ever be beneficial to Microsoft. This extension was motivated by analyzing data on a cross-section of countries on Linux penetration and piracy rates. We found that in countries where piracy is highest, Linux has the lowest penetration rate. The model shows that Microsoft can use piracy as an effective tool to price discriminate, and that piracy may even result in higher profits to Microsoft!"
It effectively reduces the cost to zero but increases the network effect. Understandable but surprising.
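A toy illustration of the intuition, with my own numbers rather than the paper's formal model: if a buyer's willingness to pay rises with the installed base, then pirated copies, which pay nothing, can still lift revenue by enlarging that base.

# Toy network-effect arithmetic, not the authors' model: willingness to pay
# = intrinsic value + network coefficient * installed base, and pirated
# copies count towards the base.
def revenue(paying, pirating, intrinsic=10.0, network=0.05):
    installed_base = paying + pirating
    willingness_to_pay = intrinsic + network * installed_base
    return paying * willingness_to_pay

print(revenue(1000, 0))     # 60000.0  -> 1000 buyers paying 10 + 0.05 * 1000
print(revenue(1000, 2000))  # 160000.0 -> same buyers, bigger base, higher price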
Thursday, May 26, 2005
The need for proximity in building collective intuition
Yet another interesting and relevant working paper, this one looking at the factors, in particular 'proximity', that affect the building of collective intuition in an organisation:
The purpose of this paper is to examine the conditions under which intuitive forms of reasoning emerge to accelerate complex problem solving in product innovation teams. The paper originates from an ethnographic field study of four product teams in two companies: a hardware and software project from both a U.S. and a Japanese computer firm. This inductive, theory-building study was designed to use multiple cases to examine closely the learning and problem-solving behavior of product innovation teams. A problem-solving lens offers an insightful way in which to view product innovation, offering rich descriptions of interpersonal communication, project coordination, and the associated context for project team member interactions (Brown & Eisenhardt, 1995). Nonetheless, the empirical research on product innovation as problem solving has sometimes neglected to consider the challenges faced on the human side of problem solving in product teams (e.g., motivating people, creating the conditions for cross-functional teams to work together) (Brown & Eisenhardt, 1995). This study provides a response to this gap by focusing on the work practices associated with product innovation teams (Brown & Duguid, 1991; Engestrom & Middleton, 1998).
Communities of practice, global idea exchanges, open innovation: together these all help to deal with the proximity issue using internet technologies, exchanges and forums. For something as broad as innovation, however, face-to-face meetings may be a prerequisite; proximity cannot be modeled or synthesised. On the other hand, for something more concrete and specific, like the replication of an IS/IT system in another context, real proximity may not be needed. The internet tools may be rich enough because they are closer to the medium of IS/IT.
Wednesday, May 25, 2005
Open innovation and Idea Exchange - how it works
Mohan Babu writes about the reality of open innovation in the Indian context, with some concrete examples. The concept is simple:
"The model behind such global exchanges is simple: Companies or R&D groups post their problems on online Idea Exchanges and invite solutions from a pre-qualified global pool of candidates or companies. In return for a verifiable solution, the solver gets a substantial monetary reward."
The opportunities for distributed participation are huge, but the reuse of this model in the broader IS community, based around solutions to well-known problems, provides another angle on innovation: innovation through the replication or evolution of best practice. It provides a means for tech laggards to benefit from each other by sharing their experiences of technology adoption; those experiences then mould the predominant usage patterns for the laggards. Some input from the early adopters would be of value, and may even be worth paying for, but the real value would come in the shared solution: buying power, known issues, known limitations, complementary vertical processes and systems.
Tuesday, May 24, 2005
The role of the knowledge broker...
From The Theory and Practice of Knowledge Brokering in Canada:
One of the most consistent messages from the national consultations was that people whose job description actually says 'knowledge broker' are rare and that the situation is not likely to change. That's why the foundation was told to shift its emphasis from the idea of the individual knowledge broker to the activity of brokering.
It is very much about doing rather than being. It is a very active role; somewhat of a viral catalyst.
A broker's main task is to bring people together; they are catalysts who, through diligent network-building and solid background, can create a mix of people and even organizations that will stimulate knowledge exchange, the development of new research and the interpretation and application of solutions.
Brokers search out knowledge, synthesize research and scan for best practices, useful experiences, and examples from outside their own organization. They may also act as advocates for the use of research-based evidence in decision-making and have a role in supporting and evaluating changes they have helped to put in place - although the literature only mentions a generic 'follow-up' role.
Making it happen is very much part of the role; it requires a finisher rather than a starter. Collaboration needs to be nurtured and supported, and it must be followed up with metrics and rewards, taking into account the tacit nature of the mutual benefits.
Legal protection, possibly the true value of the ASF for new projects
In an interview with TheServerSide, Greg Stein, Chairman of the Apache Software Foundation (ASF), talks about one of its key value propositions:
The PMC "oversight" process ensures that there is open IP disclosure and no hidden legal rat holes. Of course there is the Apache brand, but that means more to the users of open source than to the producers. The incubator and community ensures that the projects have value in them selves, the ASF ensures that the license can be trusted. For corporate IT, this sort of trust is vital.
"..., but what they wanted was sort of the legal oversight, the legal protection that the Apache software foundation provides to its committers. So, at the ASF all the committers are shielded from like lawsuits and other types of claims against them by the ASF. The ASF will be the party in any potential lawsuit. "
The PMC "oversight" process ensures that there is open IP disclosure and no hidden legal rat holes. Of course there is the Apache brand, but that means more to the users of open source than to the producers. The incubator and community ensures that the projects have value in them selves, the ASF ensures that the license can be trusted. For corporate IT, this sort of trust is vital.
Monday, May 16, 2005
The Six Degrees World of Inventors
Sara Grant, HBS Working Knowledge, writes about Fleming's research: "'Our work and more recent work on knowledge diffusion demonstrates that knowledge flows along these collaborative relationships, even years after they were formed,' says Fleming. At the same time, the world of inventors 'is getting smaller,' he says, 'inventors are more connected to their colleagues in outside firms, and that knowledge is diffusing in both directions.'"
People, and men in particular, often touch base around work: what is going on now, how things were in the past, and so on. It provides a common link, a fabric into which conversation can evolve. Technology, the internet, PCs and so on are also part of that fabric; they provide another common touch point. The opportunities for cross-pollination are immense: as each new industry sector gets to grips with technology, new boundaries are explored and common ground uncovered. The "world of inventors" may be getting smaller, but the scope of invention is broadening. I think the personal fabrication model has applicability for industry, as sectors fabricate solutions for themselves in ways that were previously unheard of.
Innovation will be about seeing what should be out of what can be, and making it happen with what is. This will be a role not for a smaller group but for the community at large, as the language of pervasive technology becomes better understood.
IS/IT practitioners are struggling with the concept of a common language, but progress is being made. With consumer electronics we are a lot closer: plugging and playing, sharing and manipulating, downloading and feeding. The current and next generations of children who take things like this for granted (without needing to know how it works :-)) will have a freedom to explore and innovate like never before.