Friday, November 25, 2005

What does it mean to be an Agile software developer?

The principles of agile software development are well documented and extensive. On the ground, day to day, what does it mean for a group of engineers?
I think there are two simple attributes that really matter:
  • Clear short-term focus on adding value
  • Willingness and capability to change focus

The clear short-term focus on adding value is well understood. It follows from a good user (customer) story that breaks down into tasks. The focus is collectively and iteratively on design, implementation, testing and documentation to make the customer requirement a valuable reality, adapting to requirements changes as necessary. An appreciation of value as perceived by the end user rather than the engineer is also important.

The willingness and capability to change focus is a little more subtle. Willingness is important; without it there is no hope of agility. But more important is the capability to change, and that capability must be learned because it is embedded in the context of the team and workplace. Consistency, I think, is key: consistency in engineering practices across teams, around things like build systems, code style, and development and design tools. Consistency and simplicity allow individual developers to change focus because the underlying infrastructure does not change (too much). With consistency, moving from one task to another or from one team to another is not only possible, it is fun.
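To make that concrete, here is a sketch of the kind of build consistency I have in mind: a hypothetical Ant build file convention (all names invented, not any real project's file) where every team's module answers to the same compile/test/dist targets, so a developer moving between teams never has to relearn the build.

    <!-- Hypothetical convention: every module exposes the same three
         targets, whatever it actually builds. -->
    <project name="any-team-module" default="test">
      <target name="compile">
        <mkdir dir="build/classes"/>
        <javac srcdir="src" destdir="build/classes"/>
      </target>
      <target name="test" depends="compile">
        <junit haltonfailure="true">
          <classpath path="build/classes"/>
          <batchtest>
            <fileset dir="src" includes="**/*Test.java"/>
          </batchtest>
        </junit>
      </target>
      <target name="dist" depends="test">
        <mkdir dir="dist"/>
        <jar destfile="dist/module.jar" basedir="build/classes"/>
      </target>
    </project>

With a convention like this in place, 'ant test' means the same thing in every corner of the organisation.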
I understand that rigid restrictions are often considered the antithesis of creativity, but they have their place and can be beneficial. Consider:
  • Constraints often provide the motivation for innovation.

  • Discipline begets habit, and habits allow our conscious mind to focus more freely on the task at hand.

Thursday, November 17, 2005

Helping a FOSS project tip over the technology adoption curve

When does a new FOSS project take off, become really popular and move into the mainstream?
What are the necessary pre-conditions or attributes of the project that allow it to tip over the adoption curve?
I will hazard a guess at a few relevant attributes:

  • Useful: the project must do something useful. Either it is a better or novel solution to a known problem or it is a new solution to a new problem. Whatever the case, it must help solve some problem. This is a necessary, if not obvious, pre-condition.


  • Honest: honesty is important because it builds trust. Trust is important because it is the basis for an on-going relationship and for a technology adoption decision. Honesty is easy to achieve: it is simply about ensuring that the project 'does what it says on the tin', that it is fit for the purpose for which it is intended. If the project is a first step in the direction of a solution, with a bunch of explicitly identified known limitations, that is fine too. The key point is that there is no world-domination marketing blurb, no spin, no claims about untested or unimplemented features, no ifs or buts.
    Honest revelation is important for two reasons. First, because it is the truth, and it forces identification and analysis of the current reality. Second, because you can be found out: the source is open and the truth is in the code. The type of people an early FOSS project needs to attract can and will read the source code.
    If a FOSS project is honest, the user's first impressions will be a valid reflection of the project. There can be no disappointment. If the intent is perceived as useful, the user can decide to adopt, track changes or get involved. In essence, the project stands solely on its 'usefulness' merits.


  • Extensible: The architecture of participation is important. Successful projects get this right from the beginning. Outsiders are presented with clear opportunities to contribute to the project.

Tuesday, October 11, 2005

This story contains the essence of Einstein's discovery

What exactly does 'E = mc²' mean? Sure, I had an idea, but Brian Greene tells a story that captures the essence of Einstein's discovery. If that whole physics thing floats your boat, you will like this article.
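To get a feel for the scale hiding in that little equation, run the numbers for a single gram of matter (with c = 3 × 10⁸ m/s):

    E = mc² = (0.001 kg) × (3 × 10⁸ m/s)² = 9 × 10¹³ joules

That is roughly twenty kilotons of TNT from a paper clip's worth of mass; the c² factor is the whole story.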

Friday, September 23, 2005

The dichotomy of hard versus valuable and an interest in the business side of things

On I, Cringely's NerdTV, Max Levchin makes a nice observation about engineers when asked about the need to understand the business side of things:
".... I think there's a huge benefit to knowing what it is you're working for. And there's definitely - one of the typical pitfalls in the world of engineering - and I don't mean to slight my fellow engineers, but - is this dichotomy of hard versus valuable. And lots of engineers mistake hard for valuable."


There is no point spending hours figuring out a hard problem if no one cares about the solution. Valuable problems are ones that are tied to markets, so a bit of business savvy is important!

Wednesday, September 14, 2005

del.icio.us/tag/xml+opensource+java

A perfect example of internet-enabled spontaneous collaboration. del.icio.us infoware provides a useful service, keeping your favorites together and accessible from any internet cafe, but the process of using it creates even more value.
Everyone gets to benefit from the sensible tagging of an individual. It is like an expert-filtered Google, in that all the references have already been specially chosen for their content.
It is like a small bit of sense multiplied by everyone!
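As a toy illustration of how accessible that multiplied sense is, here is a small Java sketch that pulls the RSS feed for a combined tag and prints each bookmarked link. The feed URL scheme is my assumption about del.icio.us's convention, and the parsing is deliberately crude:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class TagFeed {
        public static void main(String[] args) throws Exception {
            // Assumed feed location for the combined tag; adjust to taste.
            URL feed = new URL("http://del.icio.us/rss/tag/xml+opensource+java");
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(feed.openStream(), "UTF-8"));
            String line;
            while ((line = in.readLine()) != null) {
                // Crude filter: print each item's link element.
                if (line.indexOf("<link>") >= 0) {
                    System.out.println(line.trim());
                }
            }
            in.close();
        }
    }

A few lines of code, and the expert filtering of thousands of strangers is on your desktop.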

Wednesday, July 27, 2005

Open source solutions will precipitate proprietary opportunity

Open source has changed the game from a cost-of-software point of view. The traditional model, with a 70/30 split of support over cost/license, is changing to a full-service model; the 30% has disappeared. This "cost free" element of Free Open Source has launched a bunch of new commodities; for example, LAMP is a serious contender to .NET or J2EE. The fact that it is freely available has been hugely influential here.

But the software bits are not the whole picture; it is the solutions that are built on them that are important. I think we will find that the open source commodities will precipitate the commoditisation of traditional closed source or proprietary components through complete integrated solutions. The proprietary modules that are really good at what they do and that play well with open source will be successful. That is, they are easily integrated with existing open source components and platforms, are well supported, reasonably priced and reasonably open (from a standards and extensibility point of view).
The proprietary components that work well with open source will become commodities on the back of the open source wave; the customers/market will decide what works well and what does not. Essentially, the overall integrated solution will be open, and the proprietary bits that integrate easily will form an integral part of that solution.
Other users with the same itch will try the same solution; to replicate the proven complete solution, they will pay for a suitable proprietary component.

One problem with this model is that a real open source alternative commodity will eventually get created by the community, unless of course there are large barriers to entry, like huge complexity or the need for a mainframe to test it. It may simply mean that traditional proprietary source has a very short life span: build it, charge for it, then give it away once a 'critical mass' of use can provide a support revenue stream.

Hybrid open solutions that combine mostly FOSS with proprietary modules are inevitable. I think a software company can find a niche with a traditional business model applied in 'internet time'.

Services companies will find a niche in customization, and the commonality therein will be the genesis for a traditional product.

Monday, July 04, 2005

Branded Open Source

Falling for a brand, Lajos Moczar describes how it can happen:
"The most important point I can make here is this: if you cannot articulate your business needs and identify IT products that can help you fulfill those needs, you end up thinking in terms of concepts. When you think in terms of concepts, whether feature buzzwords or more general buzzwords like 'mission critical' or 'enterprise interoperability', your thinking is divorced from your needs. And when that happens, the marketing departments have got you. No matter how objective you think you are, you will inevitably find yourself choosing a brand name."

This comment is in the context of Commercial Open Source (COS) organisations that are using branding to make money out of the open source movement. He talks of new monopolies controlling the innovation. I think there is lots of value in the alternative openstructure approach, where the customer/market determines the best fit for a problem based on the context and prevailing community view.
COS may lead to unnecessary duplication and bias. The duplication comes from competing with other branded open source; we will see this with JBoss and Gluecode now that IBM has taken an interest in Gluecode. The bias will come from following a brand rather than following a technology or implementation choice. The meritocracy of open source will be lost to the power of marketing.

But maybe the savior is the brand. With IBM using the open source stick to beat JBoss, it will be a brand war between IBM and JBoss, and a hard one for JBoss to win.
The trump card for open source is the community: while customers may flock to the brand, the community should target the most interesting and appropriate technology. The community should be (and I think is) brand agnostic; the code is the reference.

The only problem is that at the moment the community does not have the buying power; there are no community solutions, so brand still dominates the procurement process. Open solutions are the way forward.

Friday, July 01, 2005

Google's view of the world is just opinion

The Google-opoly of PageRank is just their opinion, nothing more; they can change it at any time. It came up in court and looks likely to remain that way. From Slate:
A terrific analysis of the case from James Grimmelmann on LawMeme suggests that Google is more than just sorting Internet content: It's a "gatekeeper" that effectively bars access to anything ranked lower than 200 by its ranking system. Massa seems to commit legal hara-kiri on his Web site by conceding that Google's opinions are protected by the First Amendment.
If your page rank changes overnight and the gatekeeper decides to close the gate, for whatever reason, there is nothing you can do. In Google we trust, but I guess they know that.

The Open Source Interest Horizon

Clay Shirky writes about the open source interest horizon (one of the limiting factors) and the future of open source: providing the building blocks from which customised systems are built.
"This is the future of Open Source. As the edges of the interest horizon become clear, the market for products will shrink and the market for customization will grow: Expect the Computer Associates of the world to begin training their employees in the use and customization of Open Source products. With Open Source software putting core set of functions in place, software companies will become more like consulting firms and consulting firms will become more like software companies, not duplicating the same basic functions over and over, but concentrating their efforts on the work that lies over the interest horizon."
I agree, using the analogy of the architect from civil engineering. Consider building a new bridge: three architects will produce three different designs, but each will be proven to satisfy the requirements because the underlying core components are well understood. For civil engineering this comes from standards, professional indemnification, the laws of physics, etc. The chosen design could well be the most aesthetically pleasing design, or the one considered the best fit with the environment. The point being that the decision can be based on intangibles, on qualitative variables, because the core functionality is a given.
With dependable open source components or building blocks, the software architects of the future will be able to produce alternative designs that meet the same underlying need or set of requirements. They can concentrate on the customization, on demonstrating that they best understand the customer's wants and needs. The customers get the real benefit because they will be able to choose the customized system that they really want from a valid set of competing and mostly equal options.
In the absence of hard and fast rules and standards for software components, open source commodity components can provide de facto standards.
The crucial element is that these components grow out of a need to solve a real problem in context; the problem is at the heart of the solution.

Clay cites the interest horizon as one of the limiting factors of open source. I am not sure I agree: interest is one of the key motivators, but as the value of the model becomes apparent to more stakeholders, interest will grow in all sorts of small groups and niche markets. I think open source will become a core activity of most professional software developers; it will be their day job rather than a hobby. Their interest will be maintained by a salary at the end of the week.
Like the architect, the software engineer's real skill will be in integration and customisation: understanding the real requirements of the system, providing domain knowledge, and finally bridging the chasm between customer needs and service deliverables. The fact that new code artifacts are added to the public domain as a result will just be a nice side effect.
The future with open source building blocks is bright.

Thursday, June 30, 2005

Web services market - wide open for open source

The current push by big players in the software world to create and/or control the emerging SOA or WebServices market via MetaStandard standardisation seems to be a tactic learned from the producers of consumer electronics. The consumer electronics markets are managed: technology is presented in an orderly, even evolutionary manner. Revolutionary innovation is kept from the masses until the existing markets are saturated; the innovation is then rolled out across the board. (OK, it is not that bad, antitrust guidelines prevent it getting too much like collusion; see Carl Shapiro's standards paper for a nice discussion of the issues.)
It makes good sense: the market is huge, the pie is big enough for all the players, and the customers like the illusion of choice. Sure, there are exceptions once in a while, but in the main there is order.
I imagine it works like this: the major players share much of their R+D. They decide and standardize in advance the technologies that they will support and license from each other. They publish the standards (to ensure the network effect), market, manufacture and release the products, continue R+D and repeat the process.
In consumer electronics, this model works because the capital costs required to develop, manufacture and distribute a new product are real barriers to entry. The R+D embodied in the standards is very hard to reproduce for someone outside the club.

In software, much of this is turned on its head. The value of software is in the design. Good design takes experience, domain knowledge and skill. Once the design is embodied in code, the R+D job is mostly complete. The costs associated with manufacturing and distribution are close to zero. Marketing, too, can be very cost effective, with the web amplifying the word-of-mouth effect.

The emerging specifications for WebServices are very close to design; this is part of the nature of software and part of the nature of an interoperability specification. Through standardisation, the "men in black" are creating a market, but they are also designing a solution, one that is very easy to replicate.

Why, then, are they taking control of the standardisation process?
Is it simply because they realize that the end game is domain knowledge and customisation, or are they just ensuring that the market gains momentum fast, a rising tide to lift all boats (and hence their super tankers)?

Wednesday, June 29, 2005

Using Standards to create the future

Thinking about the place of open standards, how they are created and evolve, how they sometimes lead, sometimes evolve with and sometimes follow a market, led me to consortium.org. This is a great resource for fact and opinion on standards bodies and consortia.
The OMG consortium, one of the first, had a vision, produced a specification and worked through the evolution of the specification. In the beginning the specifications led the market in a new direction; then the specifications followed the market, taking innovation (used as a differentiator between competing standard products) back into the standards. What followed was a period of 'evolution with the market': fixing problems, clarifying issues and needs, etc. The next phase was extending the OMG model into new ground, up the stack, towards the applications. This required taking the lead on the market again, but this second phase was not as successful. The same "clear problem focus" was missing; the original vision was being diluted. In addition, the process (and vendors' embedded interests) was getting in the way. The result is captured in the view expressed in a survey on participating in standards development organisations (SDOs):
"Firstly, we (Sun) give very little consideration to SDOs, in large part because the rules are so arcane that we find that we get specifications with "maybe bits", rather than on/off bits. (possibly due to the fact that too many SDOs believe that a compromise where everyone is disenfranchised is a legitimate way to achieve "politically acceptable technical standardization"
The lesson may be that a clear problem focus is key to any standards effort, and that efforts to build past this initial problem, to fully capitalize on the first success, are best left until the next clear problem arises. There appears to be a time to "let go" that comes once a solution to the original problem has evolved. Hold on past this evolution stage and you smother the opportunity to build on the original success.

The latest generation of standards bodies (MetaStandards around Web Services and SOA) appear to be taking a very different approach. Rather than being focused on a common problem, they are focused on a common market. They are using the standards to give credibility and cohesion to the market and to build momentum and awareness around the technology. They are building the implementations and developing the standards at the same time. It is jumping straight to the evolution stage of the standard, but without a clear statement of the problem; the "use cases" are being generated on the fly, in reaction to the market's response to the marketing messages. Maybe this is the perfect iterative design metaphor: produce a working trio of implementation, standards and marketing, present it to the market for review, evaluate the response and try again. What is clear is that the so-called "Men In Black" (Microsoft, IBM and BEA Systems) are really taking Alan Kay's quote to heart:
"The best way to predict the future is to invent it."
The capability and power of the "Men In Black" is not in question; if any group can create the future then this combination can. It will be interesting to see how far the solution goes past the interoperability play, and how much control and scope these standards aim to have. Will they learn from the OMG case and stop once a clear problem and solution have evolved, or is this new model of creating the future simply better, able to accomplish more?

The fruits of the Open Source community may provide an alternative. Open source is firmly based on solving real problems. It may be in the interest of the market to take some control back, to support initiatives that have the problem solution rather than the market opportunity at the core. Our needs will be better met by a problem solved rather than the opportunity for more problems being created.
Is this just a classic tale of technology churn, albeit a very well executed one?

New Shape of Knowledge

More goodness from one of the co-authors of the Cluetrain; the web is changing how we think:
"That's because we've thought of our minds as containers. But the Web is made of links - pages pointing outside of themselves to other pages - each a little act of generosity."
It is an embodiment of how we learn: by talking with others, exploring ideas and concepts. The notion of knowledge as a conversation, now captured in the web in an open manner, is very compelling. This blog entry, and the blog itself, is worth a read:
"Conversation is a paradox because it iterates difference on a common ground. That a paradox happens every day is a miracle."
The insight about the way the web changes the politics of knowledge by severing the links between knowledge, organisation of knowledge and ownership is useful. That we can still make sense of all this knowledge is the real miracle.

Monday, June 27, 2005

Grady Booch on Technology Churn

From an MSDN chat with Grady Booch, some straightforward good advice:
"Technology churn is always a challenge... I would therefore offer the general guidance to those teams to focus on some of the best development practices, architecture first, the use of modeling, controlling your change management; these are fundamentals that will prepare you to absorb technological change, which you'll have to do over time."
Then a word on adoption in the context of emerging Web Services:
"I expect we will see organizations struggle to build their own kind of Web Services because they would initially choose to not build upon public services out there, unless they are provided by a platform member such as Microsoft. It simply has to do with trust, availability, security, and scalability. And so they will probably dabble by building their own services, and later build systems upon public services."
Just trust, availability, security and scalability: these are so important, so real, and the reason why software engineering as a profession has so far to go. Sure we need to innovate, but as consumers we need to take some control; it is fair enough to have to dabble, to experiment, but not all the time. To develop a full understanding of a system takes time, stability of purpose and context; in the constantly churning world of software we have neither. So ensure we have architecture and models, and ensure we know what our system does and how we validate that it does it. That is, write real tests for it. Only then are we in a position to evaluate the next wave that churns our way and empirically decide whether it is of benefit.
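In that spirit, a minimal sketch of what "write real tests for it" means in practice, JUnit-style; the discount rule here is a hypothetical stand-in for whatever your system actually does:

    import junit.framework.TestCase;

    public class DiscountPolicyTest extends TestCase {

        // Hypothetical system-under-test: 10% off for orders of 100+ units.
        static double total(int units, double unitPrice) {
            double gross = units * unitPrice;
            return units >= 100 ? gross * 0.9 : gross;
        }

        public void testBulkOrdersGetTenPercentOff() {
            assertEquals(900.0, total(100, 10.0), 0.001);
        }

        public void testSmallOrdersPayFullPrice() {
            assertEquals(990.0, total(99, 10.0), 0.001);
        }
    }

Tests like these are the empirical baseline: when the next wave churns our way, run them against it and let the results decide.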

Utility solutions and the IT Hierarchy

Again a lively debate follows from a publication by Carr, implying that there will be no need for low-level corporate IT; IT functions will be available as a utility service. While ASPs, outsourcing, shared data centers, etc., are all a reality today, can all IT eventually be served in this way?
I think the hierarchy approach is a useful construct to help answer this question. It causes us to look at the computing needs of an organization in an ordered fashion. Recent details of IT spending discussed in The End of Corporate IT? Not Quite - Computerworld allude to the need for some separation in Carr's argument, a move away from the 'one size fits all' approach:
In many ways, basic IT infrastructure has indeed become a commodity that should be treated as a utility where cost reductions reign. However, lumping all IT investments into the commodity category is the critical oversight in Carr's argument.
The un-lumping model is then described:
Much like Maslow's Hierarchy of Needs in human development, the IT Hierarchy of Needs segments IT spending into four progressive levels.

The first level is basic IT infrastructure—the core foundation for corporate computing including servers, networking, storage, desktops, mobility and telecommunications.

The second level includes the tools to automate manual tasks and processes, streamline transactions and foster creativity and collaboration.

The third level includes all applications to support the collection, visualization and application of information to measure the business and drive improved performance.

The fourth and highest level is how a company uses its information to change the playing field by creating different relationships with suppliers, partners and customers, as well as applying competitive insight.


Looking through this IT Hierarchy lens, I think the limits of utility computing will be tied to the lower layers of the hierarchy, coupled with the evolution of standardised application-specific solutions. The space occupied by ASPs today (web hosting, payroll, CRM) will grow to encompass the tools/services that are of utility along and across industry sectors as best practice becomes apparent.
However, the middle and higher order functions will remain elusive because of their specificity and their value.

Looking for cost reductions for the middle order functions may require a pooling of resources, an openness to sharing best practice and a resistance to technology churn. The use of information is the key: how it is obtained, maintained and shared can become common practice, a standard or an open solution for that particular community. The open innovation model, coupled with an open source approach at the solution level, could provide a framework here.

The highest order functions will remain out of the realm of community and utility because they will be the mainstay of competitive advantage and are too valuable to an individual organisation to share.

Thursday, June 23, 2005

We know more than we can tell.

Jon's Radio references a telling phrase: "The Tacit Dimension of Tech Support, refers to The Tacit Dimension, a 1967 book by the scientist/philosopher Michael Polanyi. One of his touchstone phrases was: 'We know more than we can tell.'"


This is one reason why the agile 'workwith' is so important. Seeing, sharing and doing with another person is the best way to learn. It takes time to assimilate in this way, but also some stability in both context and purpose: elements common to a traditional apprenticeship.

Tuesday, June 21, 2005

The open source meritocracy model - move it to the systems/solutions level

Dave Thomas and Andy Hunt do a wonderful job of describing the open source ecosystem, identifying key motivators and practices that make it work. The mix of capitalism and communism, through meritocracy and community, provides an interesting balance.
Adopting some of these practices in commercial product development makes good sense and has been successfully demonstrated by the agile community. Going further, I would like to see the same open practices applied higher up the food chain, in systems or solutions development. The move up the food chain is already occurring with open source in the web and application development arena: projects like Ruby on Rails and Spring are taking on more and more of the infrastructure tasks of the developer. More solution-oriented projects are also gaining traction in the CRM space.

An open, solution-oriented approach need not be limited to open source. Building open solutions with commodity components is also a possibility. Once the key capabilities of the commodity are identified and well understood, opening up the ways in which these capabilities can be used and reused in different configurations and contexts could provide a valuable community resource. Taking from the Cluetrain mantra:
#39 The community of discourse is the market
the community can determine the net value of a commodity, the appropriate level of functionality and the appropriate life span. Systems can be built from software that is proven, by the market, to be fit for a purpose. The meritocracy would reveal itself as best practice; the community would facilitate the diffusion of knowledge to allow the replication and ongoing maintenance of open working systems. It could be a powerful force in the reduction of the technology churn that plagues aspects of the application portfolio.

Thursday, June 09, 2005

When piracy works ....

Using formal economic modeling, professors Pankaj Ghemawat and Ramon Casadesus-Masanell consider the competitive dynamics of the software wars between Microsoft and open source; an interesting side effect of piracy is observed:
"We also look at the effect of piracy and ask whether piracy can ever be beneficial to Microsoft. This extension was motivated by analyzing data on a cross-section of countries on Linux penetration and piracy rates. We found that in countries where piracy is highest, Linux has the lowest penetration rate. The model shows that Microsoft can use piracy as an effective tool to price discriminate, and that piracy may even result in higher profits to Microsoft!"

It effectively reduces the cost to zero but increases the network effect. Understandable but surprising.

Thursday, May 26, 2005

The need for proximity in building collective intuition

Yet another interesting and relevant working paper that looks at the factors, in particular 'proximity', that affect the building of collective intuition in an organisation.
The purpose of this paper is to examine the conditions under which intuitive forms of reasoning emerge to accelerate complex problem solving in product innovation teams. The paper originates from an ethnographic field study of four product teams in two companies: a hardware and software project from both a U.S. and a Japanese computer firm. This inductive, theory-building study was designed to use multiple cases to examine closely the learning and problem-solving behavior of product innovation teams. A problem-solving lens offers an insightful way in which to view product innovation, offering rich descriptions of interpersonal communication, project coordination, and the associated context for project team member interactions (Brown & Eisenhardt, 1995). Nonetheless, the empirical research on product innovation as problem solving has sometimes neglected to consider the challenges faced on the human side of problem solving in product teams (e.g., motivating people, creating the conditions for cross-functional teams to work together) (Brown & Eisenhardt, 1995). This study provides a response to this gap by focusing on the work practices associated with product innovation teams (Brown & Duguid, 1991; Engestrom & Middleton, 1998).


Communities of practice, global idea exchanges, open innovation: together these all help to deal with the proximity issue using internet technologies, exchanges and forums. For something as broad as innovation, however, face-to-face meetings may be a prerequisite; proximity cannot be modeled or synthesised. On the other hand, for something more concrete and specific, like the replication of an IS/IT system in another context, real proximity may not be needed. The internet tools may be rich enough because they are closer to the medium of IS/IT.

Wednesday, May 25, 2005

Open innovation and Idea Exchange - how it works

Mohan Babu writes about the reality of open innovation in the Indian context, with some concrete examples. The concept is simple:
The model behind such global exchanges is simple: Companies or R&D groups post their problems on online Idea Exchanges and invite solutions from a pre-qualified global pool of candidates or companies. In return for a verifiable solution, the solver gets a substantial monetary reward.

The opportunities for distributed participation are huge, but the reuse of this model in the broader IS community, based around solutions to well-known problems, provides another angle on innovation: innovation through replication of best practice, or evolution of best practice. It provides a means for tech laggards to benefit from each other by sharing their experiences of technology adoption. The experiences then mould the predominant usage patterns for the laggards. Some input from the early adopters would be of value, it may even be worth paying for, but the real value would come in the shared solution: buying power, known issues, known limitations, complementary vertical processes and systems.

Tuesday, May 24, 2005

The role of the knowledge broker...

From The Theory and Practice of Knowledge Brokering in Canada
One of the most consistent messages from the national consultations was that people whose job description actually says 'knowledge broker' are rare and that the situation is not likely to change. That's why the foundation was told to shift its emphasis from the idea of the individual knowledge broker to the activity of brokering.


It is very much about doing rather than being. It is a very active role, somewhat of a viral catalyst.

A broker's main task is to bring people together; they are catalysts who, through diligent network-building and solid background, can create a mix of people and even organizations that will stimulate knowledge exchange, the development of new research and the interpretation and application of solutions.
Brokers search out knowledge, synthesize research and scan for best practices, useful experiences, and examples from outside their own organization. They may also act as advocates for the use of research-based evidence in decision-making and have a role in supporting and evaluating changes they have helped to put in place - although the literature only mentions a generic 'follow-up' role.


Making it happen is very much part of the role; it requires a finisher rather than a starter. Collaboration needs to be nurtured and supported; it must be followed up with metrics and rewards, taking into account the tacit nature of the mutual benefits.

Legal protection, possibly the true value of the ASF for new projects

In an interview with TheServerSide, Greg Stein, chairman of the Apache Software Foundation (ASF), talks about one of the key value propositions:
"..., but what they wanted was sort of the legal oversight, the legal protection that the Apache software foundation provides to its committers. So, at the ASF all the committers are shielded from like lawsuits and other types of claims against them by the ASF. The ASF will be the party in any potential lawsuit. "


The PMC "oversight" process ensures that there is open IP disclosure and no hidden legal rat holes. Of course there is the Apache brand, but that means more to the users of open source than to the producers. The incubator and community ensure that the projects have value in themselves; the ASF ensures that the license can be trusted. For corporate IT, this sort of trust is vital.

Monday, May 16, 2005

The Six Degrees World of Inventors

Sara Grant, HBS Working Knowledge, writes about her research : "'Our work and more recent work on knowledge diffusion demonstrates that knowledge flows along these collaborative relationships, even years after they were formed,' says Fleming. At the same time, the world of inventors 'is getting smaller,' he says, 'inventors are more connected to their colleagues in outside firms, and that knowledge is diffusing in both directions.'"

People, and men in particular, often touch base around work: what is going on now, how things were in the past, and so on. It provides a common link, a fabric into which conversation can evolve. Technology, the internet, PCs, etc. are also part of the fabric; they provide another common touch point. The opportunities for cross-pollination are immense: as each new industry sector gets to grips with technology, new boundaries are explored and common ground uncovered. The "world of inventors" may be getting smaller, but the scope of invention is broadening. I think the personal fabrication model has applicability for industry, as sectors fabricate solutions for themselves in ways that were previously unheard of.

Innovation will be about seeing what should be out of what can be, and making it happen with what is. This will be a role not for a small group but for the community at large, as the language of pervasive technology becomes more widely understood.

IS/IT practitioners are struggling with the concept of a common language, but progress is being made. With consumer electronics we are a lot closer: plugging and playing, sharing and manipulating, downloading and feeding. The current or next generation of children who take things like this for granted (without needing to know how it works :-)) will have a freedom to explore and innovate like never before.

Tuesday, May 10, 2005

Ward and Peppard on the IS capability

Here is a nice presentation of the IS capability theory. A good reference point. The definition of the capability:
Essentially represents an organisation's ability to "connect ... technology to its business performance" (Marchand et al, 2000)
provides a pithy synopsis.

While the fact that the capability must "transcend organisational boundaries" is clear, going forward I think it must also transcend institutional boundaries: looking outward for inspiration, example and know-how, then building new partnership or community structures to share in, evolve and profit from the understanding.

Self-Actualization Trends in the Digital World

Self-Actualization Trends in the Digital World: " Assuming that most of us can rest assured that there's a roof over our heads and food on the table, the democratization of technology means that more and more people can have opportunities to author and create aspects of their lives in entirely new ways."

There is a new freedom once the basics are taken care of. This begs the question of how best to take care of the basics, but the answer is, I think, in the community. Where we have community source for software, we should have community tacit knowledge about how to use software. Often technology finds its feet in ways that were not originally envisioned. Only through the trial and error process of engineering, use and reuse does the real capability emerge. A community infrastructure is the best way to quickly identify, nurture and evolve the capability of a new technology. If the technology satisfies one of the basic needs, then the motivation is with us all to share in the community; we can then get back to the more important self-actualization.

For the business of course this means bringing more innovation to the market :-)

Monday, May 09, 2005

Maybe WS provides a common language for open innovation at the IS/IT level

One of the daddies of documenting what Open Source is wrote this back in 2003:
Hacking and Refactoring: "we can imagine a hacker culture speaking a common tongue other than Unix and C (in the far past its common tongue was Lisp), and we can imagine an explicit ideology of open source developing within a cultural and technical context other than Unix (as indeed nearly happened several different times)."


Possibly today that time is coming; the common language and context may be here for business heads to use. The future of hacking may be with the business bods, working with shared business concepts and using the tools of IS/IT to evolve open, shared, trusted working systems. The solution to the personal problem of one business head could easily emulate successful open source projects if the problem is widespread, the technology easily accessible and the context easily adaptable.

The key enabler is a shared language: BPEL, ebXML, Web services, WSDL and the WS-* stack can provide the shared framework. Open Innovation can provide the strategic initiative and Open Source can provide the historical background from which to learn.
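To make the "shared language" point concrete, here is a hypothetical WSDL fragment (every name invented) describing a mundane business operation. The value is that any WS-* tool, and with practice any business head, can read it:

    <definitions name="OrderStatus"
        xmlns="http://schemas.xmlsoap.org/wsdl/"
        xmlns:tns="urn:example:orders"
        targetNamespace="urn:example:orders">
      <message name="StatusRequest"/>
      <message name="StatusResponse"/>
      <portType name="OrderStatusPort">
        <!-- The shared vocabulary: a business operation, its input and
             its output, with no implementation detail in sight. -->
        <operation name="getOrderStatus">
          <input message="tns:StatusRequest"/>
          <output message="tns:StatusResponse"/>
        </operation>
      </portType>
    </definitions>

The same fragment can be published, compared and replicated across organisations, which is exactly what a shared language is for.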

While hackery is still treated as a craft, and will remain so in many domains, the new craft is business agility, way up the software stack and closer to the fully evolved practitioner. Where the business process is a mundane, day-to-day or regulatory task, it makes sense to share the burden of keeping the process up to date using the open source, community paradigm.
A shared solution will induce better understanding, foster alliances and increase buying power. Technology has its place as a contributor to strategic advantage, but it also has its place as a commodity tool that serves a business function in a stable, evolving and proven fashion. Open initiatives are the path to facilitate this distinction and implement the latter.

Friday, May 06, 2005

Some nice ideas about innovation...

This presentation, though a bit bright :-), has some good content. The rate of technology adoption slide is handy (slide 17), but the idea that innovation is networking is cool.

From technology package delivery
to
innovation package delivery


You can take on the whole process, not just the technology: first see the technology at work in context, and then make it work for you through partnership, collaboration or brokering. The demise of the industrial research lab (IRL) is also interesting.

Real-life practitioners of open innovation

This Industry Week article is proof that the open model (finding existing proven innovation and bringing it to market through collaboration) can work! P&G have developed an innovation strategy around what they call the Connect + Develop (C+D) initiative: looking outwards to see relevant innovation, then connecting to collaborate on developing and profiting from the idea.
To accommodate P&G's accelerated innovation process, Cloyd emphasizes a fast cycle learning methodology to help P&G identify winners earlier and develop them at lower cost. 'Remember,' he adds, 'that the challenge is that most innovations fail!'

If the innovation is already partly proven, evolving it through incremental improvement, or adapting it to a more appropriate context, can make it a real success. Only a new pair of eyes can see these possibilities!

The case for openness is stated as follows:

  • Not all the smart people work for you. There is a need to work with smart people both inside and outside the company.

  • External ideas can help create value, but it takes internal R&D to claim a portion of that value for you.

  • It is better to build a better business model than to get to market first.
  • If you make the best use of internal and external ideas, you will win.
  • Not only should you profit from others' use of your intellectual property, you should also buy others' IP whenever it advances your own business model.
  • You should expand R&D's role to include not only knowledge generation, but knowledge brokering as well.

The last two points may be hard to quantify and implement: how much to pay for an idea? How to make the brokering of knowledge work?

The big lesson seems to be that the key to success is in the process employed to make the initiative work.

For those motivated to emulate P&G's lead in open innovation, Brez offers recommendations. "Start with a formal strategy that includes implementation and tactical plans. Comprehensive planning produces the greatest value. Put all of the elements together -- with senior management committed to the strategy, the implementation, the organization and the funding."

Thursday, May 05, 2005

"fab labs" (either fabulous, or fabrication, as you wish).

Edge: PERSONAL FABRICATION: A TALK WITH NEIL GERSHENFELD: "In one of these labs in rural India they're working on technology for agriculture. Their livelihood depends on diesel engines, but they don't have a way to set the timing. The instrument used in your corner garage to do that costs too much, there is no supply chain to bring it to rural India, and it wouldn't work in the field anyway. So, they're working on a little microcontroller sensor device that can watch the flywheel going by and figure out when fuel is coming in. Another project aimed a $50 Webcam at a diffraction grating to do chemical spectroscopy in order to figure out when milk's going bad, when it's been diluted, and how the farmers should be fairly paid. Another fab lab is in the northeast of India, where one of the few jobs that women can do is Chikan embroidery. The patterns are limited by the need to stamp them with wooden blocks that are hard to make and modify; they're now using the lab to make 3D scans of old blocks and 3D machine new ones."


This is fantastic stuff: simple, practical and open. The notion of personal fabrication is set for huge growth. Chatting recently with a colleague who predates punch cards, he referred to his children, who have no fear of technology and just get on with it, making things work together rather than figuring out why they work. As engineers we often need to fully understand how something 'hangs together' before we will trust it, but our inability to command or grasp the variety of fields that technology now touches becomes a limitation. We simply can't understand everything! But like our children we need to just get on with it, make things work, and get to grips with the new literacy that is cheap, open, commodity hardware and software.

The work of the fab labs will have fabulous effects in developing countries, whose people simply see possibilities and don't need to fully understand the whole history or background design, just the principles. Instead of asking "How did that work?", they will ask "How can I make it work for me?".

Wednesday, May 04, 2005

Strategic Advantage through Capability building

We build on the insights of the dynamic-capabilities school of business strategy by extending it especially across enterprise boundaries. The relative pace of capability building matters most. Companies that embrace their edges will develop their own capabilities much faster than those that simply defend and extend their core operations and core markets.

Looking at the edges, at the competition, at the way things are changing around you, can help focus the innovation effort. While your core competencies are crucial, evolving these competencies in the context of your changing market and with the help of partners is key to sustainability.

One of the biggest challenges that executives face is to know when and how to leap in capability innovation and when to move rapidly along a more incremental path.

This can be very relevant to technology; in some cases it may make sense to replicate a system that is proven to give value at a fixed cost. Replicate the entire process if it works for someone else; partner to share the maintenance and increase buying power.

It builds on the insight widely attributed to Bill Joy, one of the founders of Sun Microsystems, that "there are always more smart people outside your company than within it". When these smart people have a working solution that they are willing to share, it makes sense to put it to work for you!

Tuesday, April 19, 2005

Why B2B exchanges failed...

Putting the Horse First - NET GAINS - CIO Magazine, May 15, 2002: If you don't have a critical mass of buyers, how do you attract suppliers? And if you don't have most of the suppliers, why would buyers participate? Most B2B exchanges failed because they could not get past that first hurdle. Suppliers resisted joining the exchanges because they feared direct comparison with competitors would erode their margins. And buyers as well as sellers were loath to pay transaction fees for what they felt was a simple matchmaking function.

But the struggle for liquidity is merely a symptom of the real problem, which is that the creation of exchanges to match buyers and sellers preceded the creation of software and services that would make exchanges truly useful.


This article goes on to identify the root cause, which presents another chicken-and-egg scenario: unless you have users (buyers/sellers), it is very hard to invent the useful services that they want. It is only through the evolution of the service that we can best know how it can add value.

The solution is to build e-services out of existing inter-enterprise business processes, first automating and then transforming them to bring real efficiencies.
The lesson of failed B2B makes good sense for any product development strategy: make one customer happy, then the next, then the next.


The future of B2B e-commerce lies not in exchanges but in software and solutions that bring real efficiencies to specific business processes. The business of trading exchanges populated by anonymous buyers and sellers is best left to financial exchanges and commodities traders because only pure commodities can be bought and sold in marketplaces. As the founder of Dean Witter used to say, "We build success one investor at a time." Similarly, B2B companies will build their business one customer at a time, instead of building marketplaces with no customers.



In the context of an exchange that facilitates the transfer of working systems intellectual property (WSIP), this begs the question: could WSIP be a commodity?

Probably not until there is tremendous settling in the market, or regulation; possibly SBO would provide the impetus!

Community of Creation - Mohanbir Sawhney

There are some interesting research questions in this presentation. A related reference site, 'Business Strategy & Innovation Thought Leadership from ManyWorlds.com', provides a quick overview and more context.

The research questions are on slide 10.

Monday, April 18, 2005

Geoffrey Moore: The Role of Open Source Computing

These comments indicate that it must have been an interesting talk, placing open source in context and mapping Maslow's hierarchy to current corporate culture. To make collaboration work there must be shared context, but there may be a space for competitive collaboration. Think of the laggards looking for a proven solution to problem X: they will pay to collaborate with the best-match solution. Being a year or so behind the curve, the market has already solved the problem on many occasions; the solutions have moved from being contextual to being core. For the laggards, finding a good match to their context is the real challenge. If such a match can be found, the laggard can make a quantum leap, not only replicating the solution but also replicating some of the context!
This is a big win for the laggard; the compensation for the innovators comes in payment for their IPR, and also in the benefits of commoditisation of their core solution. If their solution can become dominant, they can benefit from increased buying power for support and service from the original hardware and software vendors.

When Crossing the Chasm - Act Your Age

Act Your Age provides a nice simplification of Geoffrey Moore's book and some sensible advice on putting it into practice. I am interested in making use of the chasm from a laggard's or conservative's viewpoint. Can I get information on the good work the pragmatists have done in taming technology? They use stuff that works; I would like to leverage their experience rather than depending on consultants to retrofit their views to my problem.
The problem is that the innovators and pragmatists are too busy working on the next best thing to bother with sharing. Can technology help? Can IPR protect their assets and allow laggards to reuse them?

If your pockets are deep enough, any technology can be replicated

Memoirs From the Browser Wars gives a nice insight into the reality that is innovative technology development. In the end, the big pockets have most of the clout, provided they identify the threat before it is too late and before it is protected by intellectual property rights. The 'throw money at it' philosophy works when you have a huge install base and can give stuff away for free. I saw it first hand with the demise of X.400 in the mainstream, after MS consumed the emerging X.400 market by bundling a (buggy) X.400 capability with Exchange.
I hated fixing our browser to make it bug-compatible with Netscape even though we had already coded it to 'the standard'. Life's not fair sometimes. :-)
I know what you mean!

Wednesday, April 13, 2005

The business model is a.....

The business model is a cognitive device to convert technical aspects of a product or service into economic value. Any successful innovation (vs. invention) needs a business model, and that model must do two things: first, it must create value in its ecosystem, and second, it must capture a portion of that value for the innovator, so that additional advancements will be forthcoming
Henry William Chesbrough provides a very concise view of a business model that nails value creation. Every project undertaken should have a value-based model. Sure, it doesn't have to be economic; it could be social or personal. But there is a reward, in some form, that results, so that we do it again!

Chat about 'Open Innovation'

This entry presents a nice interview with Mr. Henry Chesbrough and some follow-up comments on the relevance of blogs for Open Innovation. I am using a blog simply to keep track of some of my thoughts and literature search as I work through to a specific, manageable and interesting research question. I am finding the IdeaFlow blog provides a wonderful source of information. Thanks :-)

Tuesday, April 12, 2005

Some evolving work on the effect of ICT on Innovation collaboration

This work in progress deals with the issue of trust and reputation in the diffusion of knowledge. I await the official publication.

A nice collection of current articles around innovation

http://www.innovation-enterprise.com/5.1/5.1.60.html

Structure and Design around Innovation

Structure and Design provides a useful list of resources that tackle some of the major issues.

Of particular interest is the theory by Andrew B. Hargadon, "Firms and Knowledge Brokers: Lessons in Pursuing Continuous Innovation," California Management Review, Spring 1998. How can this apply to managing parts of the IS/IT application portfolio? Maybe that is the question :-)

Open Innovation

Open Innovation could provide a model for replicating IP from an early adopter to a laggard in the technology adoption life cycle or product diffusion curve. For what types of applications would this work? Or is there any point in trying to nail it down? Maybe it is best to deal with some of the potential problems: ownership, maintenance?

Copy cat

Intel do "Copy Exactly": make each new fab an exact copy of the R+D system, then replicate incremental improvements across the copies. For Intel, the context in each of the fabs is replicated exactly.

How can the replication approach become valuable to the outside world? Can replication become a procurement strategy for innovation?
Rather than work in isolation, look at the early adopters and copy their innovations; pay them for their intellectual capital and collaborate to add incremental improvement.
Replicate the support services; replicate the business processes, because technology and people go hand in hand. How could this strategy map to the IT application portfolio?

Some stuff to read:
New Approaches to Innovation Policy: Some Norwegian Examples

The Non-Technological Side Of Technological Innovation:
State-Of-The-Art And Further Empirical Research

Monday, April 11, 2005

MIT SMR Article, "The Innovation Subsidy" - Spring 2004 Michael Schrage. Reprint 45305

This is worth a read!

To be successful in uncharted waters, the ability to learn from experience is paramount

MIT SMR Article, "Strategic Innovation and the Science of Learning" - Winter 2004 Vijay Govindarajan and Chris Trimble. Reprint 45212:
To be successful in uncharted waters, the ability to learn from experience is paramount.


This is why replication is an option. Take someone's existing working system and replicate or copy it exactly; the risks are known up front, from the earlier experience. Granted, the exact copy may not meet the exact need, but why not adapt the need to meet existing best practice, build a community around the solution and treat the system as a tool, something that does a job or provides a service? Unless the system in question is the source of sustained competitive advantage, why not use something that already exists and is proven to work!

The community can come into its own as the system evolves: the changes may be shared with the community to increase buying power and share support. Evolving the best practice of the tool in this way means we can again learn from experience.

Preamble: 100% and 80% solutions

Olin Shivers makes some good points about doing the right thing and doing a complete job first time:
So I sat down to do a careful, 100% job -- I wanted to cover everything in section 2 of the Unix man pages, in a manner that was harmonious with the deep structures of the Scheme language. As a design task, it was a tremendous amount of work, taking several years, and multiple revisions. But now it's done. Scsh's socket code, for instance, *completely* implements the socket API


I guess the 100% job goes hand in hand with many 80% solutions, because one of the benefits of Open Source is that people can see other people's work; the community can see the efforts and evolution of requirements and solutions. A proper 100% job can really only be done in hindsight; otherwise there are a lot of complete solutions but also a lot of wasted time and effort. However, the benefit of what is learned in doing such a 100% job, at whatever stage in the solution life cycle, will never be lost!

'open' systems

IBM VC calls for 'open' hardware
and it makes some sense, once you realize that the shared capital costs can be recouped by value-add services. There is lots of scope for differentiation at the services level; the benefits of trusted shared hardware, basically a risk-free set of core functionality, free our minds to concentrate on the more intangible and valuable business benefit issues.

Surely this will evolve into complete systems, where the core functions are well-proven, shared collaborations, and where the extremities or business interfaces provide the 'localization' and real differentiated value.

In the same way the hardware IP needs to be validated, so will core functional systems: the sort of stuff that the early technology adopters have worked hard to develop. The market followers can adopt the experience (IP) of the early adopters once there is some reasonable method to validate the benefit of an existing proven solution.

Tuesday, March 29, 2005

Why is open source used?

One obvious answer is so we can fix bugs ourselves: we have the source, we are techies, so we can fix it ourselves. But while it may be true that we can, we very rarely do!

I think ubiquity and 'fit for purpose' are the main reasons why open source is valuable.

All successful open source projects have grown from a need: the alternatives are no good, too expensive or too restrictive, so some individual(s) takes it upon themselves to fix the situation. The evolving new solution, being a good fit for the problem at hand, finds a niche; when the itch is common to many, the way to scratch it becomes popular. In the successful cases, the more popular a solution becomes, the more refined it becomes. Ubiquity follows and the solution grows, either to embrace another problem or to become the groundwork for future work.
As a user of a successful open source project, the main motivation is to make use of that ubiquity in one's own work: if 50,000 developers are using Ant to build their Java projects, then why don't I?
But will I fix a bug in Ant? Sure, if it bugs me and I can't find a workaround in the community. But will I submit the fix back to the community? Probably not: 1) because it takes time; 2) because in doing so I undertake a substantial responsibility. I must make sure that all the tests pass, that no backward-compatibility issues are introduced; lots of stuff that a novice at fixing an open source project is not familiar with. I can make a suggestion, and the powers that be, those with commit access, will decide, but only if the underlying cause is simple and obvious; taking on the responsibility to diagnose, propose and implement a complex fix that is fit for the community is too onerous. Of course, doing all of the above is no more than good engineering practice, but it takes time and familiarity, both of which I don't have and cannot easily achieve.
What often happens, in my experience, is that a fix is used locally; it is made to work in a limited environment. However, the investment and motivation to filter the fix back to the community are rare.

My point being that having the ability to fix the code is not that important; the fact that we can get a free working solution to a real problem is the key feature. The free bit in itself is not even that important, we would happily pay, but being free means it is easy to access via a download; there is no lengthy procurement or licensing process. Well, I tell a lie: every organisation worth its salt must have a licensing policy around the use of open source. However, for evaluation, the free bit only really makes a difference to ease of access.
That, coupled with ubiquity (the knowledge that it has worked well for others in a similar environment), is what makes the real difference.
Of course viewing the source does help to understand a solution, but those sufficiently interested to benefit are a rare breed :-)