Friday, October 31, 2008

The Collaborative Reformation

Collaboration has become the buzzword of the times. It holds out the promise of making all work productive all of the time, an attractive carrot in times of economic crisis. While collaboration tools will improve the effectiveness of existing processes and businesses, their true impact and most dramatic effect lies in how they can bring about a reformation of business.

Improving the efficiency of existing processes works up to a point, and indeed most, if not all, collaboration tools are predicated on somehow improving the efficiency of existing processes. The real promise of collaboration tools is how they can help users reform the fundamental processes of a business. I’m not simply talking about getting rid of layers of approval but the complete overhaul of how new work is brought in, how it is created, how it is charged for and how it is produced.

The impact comes from allowing businesses and users to focus on the work that adds value and to streamline or eliminate the work that does not. Elimination may involve out-sourcing to another business for which that particular process or work is their value-add. Think of designing cover art for a new book: it is not a value-add proposition for a publisher, but it is for a designer.

The reformation extends beyond simply managing documents and information to completing the core tasks of the business, whether that business is a plumber, a development agency or a manufacturer. Collaboration services need to be given access to the physical world that plays such an important part in many businesses, whether that means tasking plumbers and ordering plumbing supplies for delivery or controlling a CNC machine.

In effect, collaboration services need to evolve into a framework within which business operates; a framework which supports agile business processes, modules for specialist features (think CAM control) and management of information within the business. This is the path of development for collaboration services such as Huddle.

The ultimate goal is supporting the ideal of the networked business. A “Business” becomes a network of smaller businesses using a common collaboration platform with specialist modules from various providers. Business becomes a network of networks in which the collaboration service coordinates activities. This is the root of the dramatic and sustainable change that collaboration services can bring to business.


Tuesday, October 21, 2008

Can Hubdub Survive?

I’ve been playing with Hubdub recently and all I can say is it has a looong way to go. In fact, I think that unless some major changes are made Hubdub won’t survive 2009. Unfortunately, Hubdub faces some major hurdles that will make building a decent revenue stream a struggle (remember, cash is now king... angel funding will only get them so far).

Hubdub, for those that have not come across it, is a prediction market based around news. The idea is to combine news aggregation of some sort with prediction markets.

I’ve put down bets and created questions. My consequent experience has been less than heartwarming. To illustrate my concerns, let’s look at the questions I created.

My first question was “Will Digg buy Hubdub in 2009?” Speculative yes, but Hubdub is a predictive market – questions are by nature speculative. Background to the question: Kevin Rose discussed Digg’s international expansion plans in his talk at FOWA London. This talk was widely reported in the media. Some other facts:

  • Digg has just closed a funding round of $29m in September
  • Hubdub has only raised angel funding and has 4 employees
  • In this economic climate cash is king. Getting revenue positive is the holy grail
  • Hubdub has not articulated a sustainable source of revenue
From these data points (all easily available with a quick search on the web) one concludes that Hubdub is a good target for acquisition. It has a decent (although it requires some work) prediction platform but other than that it has nothing special. Kevin Rose wants to expand internationally and the prediction market technology would work well with the Digg platform. It would certainly give Digg a greater number of potential revenue streams. I will be the first to admit that this is speculative, but it is based on facts. Yet all it took was one person to object and the question was voided.

My second question was “How far will UK house prices fall by December 2008?” This question was voided because it didn’t have an option for the fall being below 15%. Let’s look at the logic of this. As of September 2008 both Halifax and Nationwide had reported house price falls of 12.4%. There are three more months to go before the end of December and these things don’t turn around on a dime. With credit still tight and the UK entering recession, there is no sound possibility that house prices will fall by less than 15%. Now let’s look at this from the angle of the question creator. I may not want to provide an option, so why should I? Why do questions have to be modified to provide gamblers with an option they want?

The issues with the questions are merely symptoms of what I see as major flaws of the Hubdub system. They are:
  1. The rules for creating questions are vague and easily open to interpretation.
  2. The very act of creating questions is daunting and annoying.
  3. The site is rapidly becoming dominated by power users.
Hubdub is seeing Clay Shirky’s maxim “A Group Is Its Own Worst Enemy” play out. Power users and early adopters band together to void questions and otherwise hassle newbies merely because they don’t like a question (rather than because of its predictive quality) or because newbies fall foul of capricious unwritten rules. There is no penalty for power users who behave this way. This is over and above the effort needed to create questions in the first place. It is extremely disconcerting to put the effort into questions only to have them voided on relatively spurious grounds.

The single most worrying aspect of these flaws is that everyone has been talking about the development of communities and how they interact for the last 11 years. Let’s recap – Usenet went through the same problem, Slashdot went through the same problem, Digg went through the same problem. See a pattern here?

Clay Shirky has been shouting from the rooftops about it for years. Hugh McLeod and Tara Hunt have both discussed it. At what point do people pay attention? Angel funding or not, for a service that is built on community and users to lack the tools needed to manage the community’s interaction with the platform is simply, well, scary. It speaks to a company with a fundamental lack of understanding of community-based services.

Is it important? Very. Hubdub needs a diverse, large and vibrant community, not only of speculators but also of question creators. The way Hubdub is going to make money is by selling businesses premium access to data and audience. However, companies will only pay if the Hubdub community is diverse and large, because only then do the data and audience have value to them. As it is, the current community is doing very well at driving new members away. Hardly a good method of growth.

Hubdub can possibly turn this around. The first step is to build the tools and features necessary to manage community interaction. This will piss off the power users and early adopters as it blunts their power. That is the price to pay for improving the experience and engagement of a broader and more diverse set of people. Actions must have consequences. When actions have no consequences, poor behaviour soon dominates. Slashdot found this out the hard way, as did Digg.

The other part of the engagement issue is question creation. Relying on people to read FAQs about question creation is very, well, RTFM. Most people don’t RTFM, nor should they have to. If there are rules about question creation they need to be clear and objective, with no room to abuse them to void questions that someone doesn’t like. Of course, if the rules are clear and objective, the system should not allow questions that fail them to be created in the first place. People should not have to RTFM.
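To make that concrete, here is a minimal sketch of what machine-checkable creation rules might look like. The specific rules, field names and thresholds are my own assumptions for illustration, not Hubdub’s actual criteria.

```python
from datetime import date, timedelta

def validate_question(question, options, closes_on):
    """Hypothetical, objective creation rules; returns a list of problems."""
    errors = []
    if not question.strip().endswith("?"):
        errors.append("Question must be phrased as a question.")
    if len(options) < 2:
        errors.append("At least two options are required.")
    if len(set(o.lower() for o in options)) != len(options):
        errors.append("Options must not repeat.")
    if closes_on <= date.today():
        errors.append("Closing date must be in the future.")
    if closes_on > date.today() + timedelta(days=365):
        errors.append("Closing date must be within one year.")
    return errors

# Creation is refused up front, so nothing ever needs to be voided later.
problems = validate_question(
    "Will Digg buy Hubdub in 2009",              # missing the question mark
    ["Yes", "No"],
    date.today() + timedelta(days=90),
)
print(problems)  # ["Question must be phrased as a question."]
```

If a question passes rules like these, it stands; nobody gets to void it afterwards because they dislike it.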

In fact, I wonder why voiding is necessary at all – isn’t the very act of betting on a question a vote on the question’s quality? Why not just use the activity on questions as a way to surface or bury them? Activity is a much better signal than voiding or voting. Using activity blunts the prejudices and power of any single person or group of people.
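As a rough sketch of what I mean – assuming nothing more than a timestamped list of bets per question, which is my assumption about the data rather than how Hubdub actually stores it – surfacing could be as simple as a time-decayed count of betting activity:

```python
import math
import time

HALF_LIFE_HOURS = 24.0  # assumption: recent bets count for more, old bets fade away

def activity_score(bet_timestamps, now=None):
    """Time-decayed count of bets on a question; higher means surface it."""
    now = now or time.time()
    decay = math.log(2) / (HALF_LIFE_HOURS * 3600)
    return sum(math.exp(-decay * (now - t)) for t in bet_timestamps)

def rank_questions(questions):
    """questions maps question text to a list of bet timestamps."""
    return sorted(questions, key=lambda q: activity_score(questions[q]), reverse=True)

now = time.time()
questions = {
    "Will Digg buy Hubdub in 2009?": [now - 3600, now - 7200],      # two recent bets
    "How far will UK house prices fall?": [now - 90 * 24 * 3600],   # one stale bet
}
print(rank_questions(questions))  # the actively traded question comes first
```

Questions nobody bets on simply sink; no committee of power users required.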

The idea threaded through Hubdub’s emails and site that it is not a speculative market still stumps me. It is a prediction market; prediction markets are, by definition, speculative. It’s like being a fish and trying not to drink the water. Hubdub is a speculative market – if the problem is with questions that have no news angle or that resolve a long time in the future, have a special section for those types of questions. Don’t ban them.

I did hope that Hubdub would be good. But I am sorely disappointed and I now doubt the company’s survival. I certainly would not invest any money in the company without some major changes to the platform.

Monday, October 20, 2008

Chaos, Finance and Non-linear Behaviour

With all the noise around causes of and remedies for the financial woes we are currently experiencing, something has been niggling at the back of my mind. It was only after watching the BBC documentary "High Anxieties: The Mathematics of Chaos" and reading some posts on nodes that it twigged.

The idea that the financial system is fundamentally chaotic (in mathematical terms) has been around for a while, so that isn’t new. A system being chaotic is not a problem in itself; it just is. The problem lies in the transition from linear to non-linear response in a chaotic system. Here we had a response that seemed out of all proportion to reasonable rules of thumb for how the system behaves. The size of the sub-prime losses shouldn’t have been enough to trigger the meltdown.

Unless, of course, the system was optimised and being driven hard. The past 20 to 30 years have seen the financial system optimised for making money. Without getting into the whys, wherefores and who did it, the optimisation process pushed the financial system to the edge of instability. Optimisation moves a system closer and closer to instability, which is how you get “optimised performance.” This works fine while a system is in a steady state without unexpected shocks. The downside is that it takes very little non-steady-state change to force the system into instability.

In a chaotic system, instability creates a non-linear response that is unpredictable. That is what we are facing. A relatively small shock has sent the system into non-linear response. The system was pushed to the edge of non-linearity by two forces: growth in the connections between nodes and the hard driving of the system.
The interconnection of nodes grew exponentially via the creation of various 2nd and higher order derivatives. The intent was to “spread risk”. Instead it amplified risk across the system, which in turn amplified the driving forces. The second force was cheap credit. This effectively increased the energy sloshing around the system, driving it hard.
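The logistic map is the textbook toy for this kind of behaviour. It is emphatically not a model of the financial system, just an illustration of how a system driven towards the edge of chaos responds to a tiny shock in a wildly non-linear way:

```python
# Toy illustration only: the logistic map, driven gently (r=2.8) versus
# driven hard towards the chaotic regime (r=3.95).
def trajectory(r, x0, steps=40):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

shock = 1e-6  # a "small shock" to the starting state

for r in (2.8, 3.95):
    a = trajectory(r, 0.5)
    b = trajectory(r, 0.5 + shock)
    print(f"r={r}: divergence after 40 steps = {abs(a[-1] - b[-1]):.6f}")

# Gently driven, the two runs end up essentially identical; driven hard,
# the same tiny shock produces a completely different outcome.
```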

To stop future financial crises we need to de-optimise the system. We need to make the system robust. Regulatory changes such as tying capital ratios to the incentive systems of executives, whether a good idea or not, will do little in an optimised chaotic system, because even a small shock can have massive consequences. Instead we need regulation that limits the interconnectedness of the various nodes in the system and works to dampen the movement of a shock through the system.

Put another way, we want to shift the system into a region with the broadest possible linear response to shocks. Some possible ideas are banning all 2nd and higher order derivatives, or counter-cyclical capital ratios. There will be great resistance to making the financial system fundamentally robust as it will limit the money-making ability of financial institutions.


Wednesday, October 15, 2008

Apple's "New Technology" not so new

It was interesting to watch the video of the Apple Notebook Presentation. The new notebooks are indeed items of engineering and design beauty.

But...

What struck me as very wrong was the claim around the new chassis for the notebooks. The claim that it is invented or new is simply and horribly wrong. The unibody is simply a variation of the monocoque technique, which has been around in manufacturing for a long time. Aircraft have used the technique since the 1930s.

It might be new to computers, but Apple’s claim is highly suspicious. My real question is: what took so long? Why are advanced (and not so advanced) manufacturing and design techniques only now coming to computers?

I suspect it is because manufacturing has always been a tertiary or lower concern. Now is the time for computer companies to grab a few manufacturing and mechanical engineers, lock them in a room and tell them to pioneer new chassis designs that take advantage of the latest manufacturing techniques, equipment and design.

One other thing. The emphasis on the new chassis indicates that computer manufacturers have a lot of room to reduce costs and material usage in computers. It is going to be interesting to see how companies take advantage of the possibilities offered by advanced manufacturing.




Google Chrome's benefit for Mozilla Firefox

Mozilla released Firefox 3.1 Beta 1 today. Reading through the release notes and the blog posts about the new beta, it is clear that Google Chrome’s biggest effect on Mozilla Firefox has been to encourage the Firefox developers to step up innovation and development a notch.

Mozilla had been coasting for a while. IE had ceased to be a challenger. Google Chrome has taken over that role (at least for now). Even if Google Chrome never gets more than 1% of the market, I would still call it a success for giving the Firefox developers the necessary kick in the pants.



Friday, October 10, 2008

The Crisis Makes the Leader

Brad Feld wrote an excellent post about leadership for entrepreneurs. Fred Wilson re-posted a quote from it on his blog. The core message is that the time for leadership is now.

Until now leadership has been easy. It always is in good times. The current crisis will test the leadership skills of entrepreneurs. To paraphrase – the crisis makes the leader. This test is not going to be easy and I am sure a lot of entrepreneurs will be found wanting. I hope that VCs and angel investors will step up to the crease to back-stop the entrepreneurs.

On the upside, it will forge some great leaders who will be of huge benefit to the industry and society once we get out the other side. Something we sorely lack today.

For now, it is time for entrepreneurs to step out in front of their people, swallow their fear and charge forward. Now is the time to lead from the front.

Thursday, October 09, 2008

Spiral to Disaster and Financial Engineering

There is a lot of blame going around for the cause of the current financial crisis. It is a laundry list and often seems to reflect the prejudices of the pundits rather than a rational consideration of what happened.

The striking thing about this mess for me is how closely the crisis resembles a spiral to disaster. The term arose (from memory) out of the fire on the Piper Alpha oil rig in the North Sea. The inquiry into the incident found that while a condensate leak initiated the fire, it was the failure or absence of various fail-safes that ended in the loss of so much life.

While the financial crisis was kicked off by the sub-prime problem in the US, the reason it has gotten so bad is the lack or failure of fail-safes. There has been nothing to stop the spiral downwards into an ever-increasing financial disaster.

The aftermath of the Piper Alpha fire was 100 recommendations to improve safety on oil rigs, which went on to be accepted industry-wide. We can only hope that the aftermath of the financial disaster will result in sensible measures that act as fail-safes to avoid systemic failure and stop the spiral to disaster.

Saturday, October 04, 2008

Widgets, Communities and the Edge

The web is making it easier and easier for groups and communities to form. Groups foster social cohesion by having members demonstrate affiliation and by the use of objects to create community identity. Think Star Trek fans wearing Star Trek uniforms at conventions or fans of Metallica wearing Metallica branded tee-shirts.

Unfortunately, web-based methods of indicating affiliation don’t really translate to the real world. This matters because groups are increasingly rooted in the real world; indeed, the traditional line between cyberspace and the real world is becoming increasingly blurry.

Personalisation services offer the ability to create physical objects that indicate affiliation and community identity. These services are centralised, and therein lies the problem. By being centralised they impose a coordination cost on groups.

Widgets offer services like MOO.com and Ninjazoo the opportunity to deliver personalised and communitised products directly into the community without getting in the way. Widgets remove the coordination cost on groups by meshing the service within the normal activities and sites of the group.

It is taking the mountain to Muhammad rather than taking Muhammad to the mountain.

It is in the distribution of core functionality that the true value of widgets lies: not the distribution of content, but allowing web services to adjust to an Edge Economy.

Tags: MOO.com, Ninjazoo, Edge Economy, Web Services, Web 3.0

Friday, October 03, 2008

Data Half-life: Time Dependent Relevancy

Data half-life is not an indication of the importance of a particular piece of information. It is a measure of how long a piece of information remains relevant. Relevance is not a substitute for importance; it depends on context and on the information itself. So a low data half-life means that the piece of information will quickly lose its relevance, while a high data half-life means the relevance will drop slowly.

Consider the story that Clay Shirky related in his keynote at Web 2.0 Expo in New York, in which someone changed their relationship status from engaged to single. This information is highly relevant to some people and not very relevant to most others. Given that data half-life reflects the broader relevance of the information to a person’s network, it has a low data half-life: it is generally not relevant to most of the people in the network.

Now, they may want to know, or feel the need to know, but that does not mean it is relevant to them. It is easy to mistake the desire to know, or the need to know, for relevance. The desire to know has no bearing on the information’s data half-life.

By having a low data half-life, the relationship status will only travel so far through the person’s network, thereby avoiding the result in Clay Shirky’s story. Data half-life represents how time-dependent the information is: the more time-dependent some data is, the lower the half-life; the less time-dependent, the higher the half-life.
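A minimal sketch of the idea, treating relevance like radioactive decay; the function and the numbers are my own, purely for illustration:

```python
# Sketch: relevance(t) = initial * 0.5 ** (t / half_life)
def relevance(initial, half_life_hours, hours_elapsed):
    """Relevance remaining after hours_elapsed, given a data half-life."""
    return initial * 0.5 ** (hours_elapsed / half_life_hours)

# A relationship-status change: low half-life, mostly irrelevant within a day.
print(relevance(1.0, half_life_hours=6, hours_elapsed=24))        # 0.0625

# A change of home address: high half-life, still relevant a day later.
print(relevance(1.0, half_life_hours=24 * 30, hours_elapsed=24))  # ~0.98
```

A filter could then simply stop propagating an item through the network once its relevance falls below some threshold.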

Tags: Filters

Thursday, October 02, 2008

Privacy Filters and Facebook

In my previous post I used privacy in Facebook as an example of how data filters could work. One point I glossed over was how Facebook, and indeed all social sites, currently fail at social distance. Unfortunately, social distance is necessary for privacy filters to work satisfactorily.

Facebook has one major flaw: once a person is a friend in Facebook they are treated the same as every other contact, whether the connection comes from bumping into someone at a pub or from growing up together. It collapses the privacy, or social, distance between two people. Social distance can be thought of as how strong the connection between two people is. It provides a measure of both the strong and weak ties articulated by Mark Granovetter.

Without some measure of social distance or strength of connection, any privacy filter is going to fail. The social graph fails to represent the real-world connections between people properly.

Facebook attempts to use groupings of friends to approximate social distance, but this is cumbersome. The manual work of setting up groups and categorising everyone into them is a major barrier to use. People are lazy.

What is needed is an automated method for calculating social distance. Social distance is calculated (and this is how Mark Granovetter categorised connections) from the frequency of communication. Measuring frequency of communication is difficult for Facebook. While Facebook can measure wall posts, internal emails, poking etc., so much of our communication occurs outside of Facebook – through email, IMs, phone calls, SMS, Twitter, parties attended etc. – that the frequency of communication within the wall is not a reasonable approximation of the wider frequency of communication.

The key measure of social distance – communication – is hard to quantify as it is dispersed across many different channels. Trying to capture the frequency of communication by porting the data in is one way of dealing with the issue. The other, probably more realistic, method is to start with some rules and use what can easily be quantified to refine the measure of connection strength over time.

The rules would look at what is known generically about social connections. Some of the rules are:

  1. Married is a strong connection
  2. The same surname is a strong connection
  3. If someone has strong connections to friends with whom you also have strong connections, then you probably have a strong connection to them
Some of these rules dictate a very strong connection (the first rule), while others dictate varying strengths depending on factors such as prior connections with other friends (the third rule). All connections start as very weak and are refined, first by applying the rules and then over time by measures of communication frequency.
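Here is a rough sketch of how the rules and communication frequency might combine into a single connection-strength score. The weights, thresholds and data fields are all my own assumptions for illustration; nothing here reflects how Facebook actually works:

```python
import math

def rule_score(a, b, mutual_strong_ties):
    """Rule-based prior for connection strength between profiles a and b."""
    score = 0.05  # every connection starts out very weak
    if a.get("spouse") == b.get("id"):
        score = max(score, 0.9)   # rule 1: married is a strong connection
    if a.get("surname") == b.get("surname"):
        score = max(score, 0.6)   # rule 2: same surname suggests a strong connection
    score = max(score, min(0.8, 0.2 * mutual_strong_ties))  # rule 3: shared strong ties
    return score

def connection_strength(a, b, mutual_strong_ties, messages_last_90_days):
    """Refine the rule-based prior with observed communication frequency."""
    activity = 1 - math.exp(-messages_last_90_days / 20)
    return max(rule_score(a, b, mutual_strong_ties), activity)

alice = {"id": "alice", "surname": "Smith", "spouse": "bob"}
bob = {"id": "bob", "surname": "Smith"}
carol = {"id": "carol", "surname": "Jones"}

print(connection_strength(alice, bob, mutual_strong_ties=5, messages_last_90_days=60))   # strong
print(connection_strength(alice, carol, mutual_strong_ties=0, messages_last_90_days=1))  # weak
```

A privacy filter could then let an update travel only to contacts whose connection strength sits above some threshold.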

Privacy filters all start with knowing the distance between two end points, whether physical distance in centuries past or social distance today. Until Facebook and other social sites have a measure of social distance, privacy filters are going to be mediocre at best and more often prone to failure.

Tags: Privacy, Facebook, Filters