Wednesday, May 30, 2012

Why Has The Internet Failed Us Here?


Back in the 90s, we were promised that the Internet would change everything. It did. Many articles have been written about what our children will never encounter because of the Internet: CD players, copiers, modems, diskettes, plane tickets, encyclopedias, classified ads, and so on. Yet there are things that should have been made obsolete by now and are not. Here is where the Internet has failed us:

Checks
In North America, checks (or cheques) are still used everywhere: to pay bills, to pay in a store, or to send money to a friend. While most banks allow online payments, there are still plenty of utility companies, county tax collectors, and newspapers that only accept checks. And while we all think that European banking is about to go back to the Neanderthal era any day now, most Europeans have not written a check in a decade, as online payments and even mobile payments are quite common.

Back in my German days in the 90s, when you needed to send me money, I’d give you my account number and the bank routing number, and you’d just do it online via a free instant transfer. That was possible long before the Web by using services such as the French Minitel or the German BTX. Don’t believe that the expensive wire transfers that require a two-page form and a $35 fee are the same thing. They might be when the Buffetts transfer money to the Gates, but most American bank customers have never used a wire transfer. They write checks. Frankly, I am shocked that the banks have left the door wide open for services such as Paypal and Square.

Signatures
“We need the signed original mailed back to us...” - how many more times will I hear this request? Signatures are still required for many transactions, including many credit card purchases. As if the “wet” signature made the transaction somehow magically secure.

While a faxed signature is generally regarded as legally binding, sending the same signed document via email is usually not acceptable. Why a fax is more legal than an email attachment, I don’t understand. Why a wet signature is considered more secure than any form of electronic authentication is completely beyond me. Any transaction conducted online should be faster, more secure, and more convenient, and yet the Internet has failed to eradicate the paper-based, wet-signature-dependent transactions still used today.

Car dealers
I get why we need a showroom where I can test-drive a car I want to check out. I also know why we - unfortunately - still need a car service center (to replace the taillight bulbs once every four years, right?). But why do I need to negotiate with a car salesman who knows that I know the exact invoice price of the car? And please, don’t try that “I have to talk to my manager” routine on me. I want to select, configure, and buy my car online and have it delivered to my house. Delivering a washing machine costs $50, which covers the removal of the old one. The $500 you are charging for vehicle delivery should easily cover parking the car in my garage and registering it on the way.

Cable TV
Cable and satellite-based TV entertainment with preset programming is a completely obsolete model in the era of on-demand entertainment. Yet it continues to exist, making viewers pay twice - once through a monthly subscription fee and again through endlessly annoying commercials. This has to change, and I am confident it will. Entertainment delivered on-demand over the Internet is becoming a more and more viable alternative, and the ranks of cord-cutters are on the rise. Still, I would have expected cable to have been dead for at least five years by now.

Realtors
Buying a house is a ridiculous experience. Pages and pages of meaningless reports, statements, and disclosures, complemented by mysterious fees for pro forma inspections and opaque services like ‘title insurance’. The realtors get up to 6% for... what, exactly? Driving me around to a bunch of lame houses only to make sure I have no choice but to put an offer in on the most expensive one? In the age of Google Earth, Zillow, and Craigslist, who needs a realtor? Both the seller and the buyer would be better off closing the transaction directly, with the title company still acting as the secure clearinghouse. Yes, the Internet has failed us here.

Borders
The Internet was expected to obliterate country borders. After all, I can access any web site in the world from my iPhone, right? Well, sort of. Web sites providing entertainment content, from NBC to Netflix and from Pandora to Amazon, are restricting access to their content from abroad. Even when I pay for a monthly subscription, I am cut off from the content as soon as I cross the border. These are artificial frontiers set up to enforce country-based content distribution rights - a rather obsolete concept in the age of the Internet. As a frequent traveler, I feel the Internet has failed me big time on this one.

Smart appliances
The smart refrigerator that would automatically re-order milk when I’m running low might indeed not be that useful. But smart appliances would make a lot of sense, and yet they are nowhere to be found. Part of the reason Apple killed off the entire industry of home entertainment devices is that all those amplifiers, decks, equalizers, and tuners were pretty dumb - the manufacturers never considered connecting them to the Internet. How about remotely setting my thermostat to warm up the place before I arrive home? Makes sense, doesn’t it? I cannot fathom why Honeywell and GE have not thought of that. Today, there are rumors about the perhaps-soon-to-be-announced Apple TV. If it comes, you can bet that it will be a smart, Internet-connected appliance that will change the way we watch TV.
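Just to show how little magic that thermostat scenario would require, here is a purely hypothetical sketch in Python - the endpoint, the token, and the payload are all invented for illustration, since no such API exists on my thermostat today:

    # Purely hypothetical: the URL, token, and payload are made up for illustration.
    import requests

    API_URL = "https://home.example.com/api/thermostat"   # invented home-automation endpoint
    TOKEN = "my-secret-token"                              # invented credential

    response = requests.post(
        API_URL,
        headers={"Authorization": "Bearer " + TOKEN},
        json={"target_temperature_f": 72, "mode": "heat"},  # warm the place up before I arrive
        timeout=10,
    )
    response.raise_for_status()
    print("Thermostat acknowledged:", response.json())

That is roughly all it would take on the consumer’s side - the hard part is the manufacturers actually putting a connected controller into the appliance in the first place.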

These are a few examples of everyday items and activities that I would have expected to be completely changed by the Internet by now. Yet it hasn’t happened so far - the Internet has failed us here. There are probably many more examples you can think of - please do share them in the comments below.


Sunday, May 20, 2012

Does Windows Phone Stand A Chance?

I must say, I am pretty impressed by the Windows Phone operating system. Unlike Google’s Android, which is, let’s face it, a poor copy of Apple’s iOS, Microsoft created a truly original user experience. The notion of a single OS spanning the desktop, laptop, tablet, and smartphone is very compelling. Even Apple hasn’t quite made that happen, as their Mac OS is still distinct from iOS.

I have also been impressed by some of the strategic moves Microsoft has been making with Windows Phone. The partnership with Nokia is a strong endorsement - no matter how long and painful Nokia’s recovery might be. Microsoft has also recruited other vendors, including Samsung, LG, HTC, and Dell.

Since Google acquired Motorola, there is more and more doubt about Google’s neutrality when it comes to the openness of the Android operating system. That fact alone makes the Windows Phone operating system a rather compelling alternative for the hardware vendors. While none of them have publicly switched their allegiance yet, I’m sure they are all talking to Microsoft.

Also, the recently announced deal with Barnes & Noble and their Nook ebook reader could be a major coup for Microsoft. Sure, the Nook runs on Android today, but I would bet many chips that the next release will run the Windows Phone OS. Not a bad move for Microsoft, if you ask me!

With all of these strategic moves going in Microsoft’s favor, it is surprising that the Windows Phone market share continues to slide. According to the latest ComScore market share report, Windows Phone holds a mere 3.9% of the US market - compared with 4.7% three months ago. The trend has been heading south over the last 18 months, and frankly, any player stuck in the single digits is questionable...

So what should Microsoft do to reverse this trend? Well, releasing a truly differentiated and innovative operating system is a good start. Windows 8, with all its bells and whistles and particularly the Metro-style user interface, is very promising. The Metro-style apps that Microsoft is going to release next look awesome. But one thing is still missing...

A mobile operating system is supposed to be a platform, and a platform requires applications. This is where Microsoft is not doing so well. When I try to find my favorite iPhone apps on Windows Phone, I strike out more often than not. Yet Microsoft understands how to cater to developers. Their tools and developer programs are second to none in the software industry. But there is more to it than that.

Microsoft should be actively recruiting developers to port their apps to Windows Phone. That is not a slam dunk for most developers, as Windows Phone is currently their number 4 platform of choice, after Apple, Google, and RIM. Yeah, sure, RIM has its problems, but the BlackBerry OS still has more than double the market share of Windows Phone. Today, building an app for Windows Phone is a major gamble with an uncertain payoff.

This is where Microsoft could and should use its market power. Instead of spending on billboards, Microsoft could be paying a few thousand dollars to the developers of key apps to port them to Windows Phone. Microsoft should go aggressively after these developers and entice them to make it happen.

Particularly in the enterprise market, Microsoft should aggressively solicit the vendors’ support. Right now, Windows Phone OS is the number 4 priority on the list for any enterprise software vendor - after Apple, Google, and RIM. Number 4 rarely makes it into a released product. Everybody’s budget is tight, and the money is barely enough for priorities 1 and 2...

Developing apps for a platform that has a 3.9% market share and a declining trend is a tough business case to make. This is the top problem Microsoft has to focus on to make Windows Phone a success, and without addressing it, the platform will likely fail.

Sunday, May 6, 2012

Content Analytics - Crossing the Chasm?

The overabundance of information is one of the greatest challenges today. Things have sure changed since the days of the 90s, when Microsoft used to promise us a PC on every desk and information at our fingertips. Today, we have plenty of information at our fingertips. In fact, we have so much information coming at us from all sides that making sense of it has become one of the key information management challenges.

That’s where information analytics comes in, with applications such as semantic analysis and auto-classification. Simply put, these technologies analyze the actual content of information to derive insight, meaning, and understanding. They do it by applying powerful algorithms that perform complex tasks such as concept and entity extraction, similarity detection, trend identification, and sentiment analysis.
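To give a concrete flavor of what auto-classification looks like in practice, here is a minimal sketch using the open-source scikit-learn library - the training documents and categories below are made up for illustration, and real systems train on vastly larger corpora:

    # A toy auto-classification sketch (the documents and categories are invented).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    train_docs = [
        "quarterly revenue and profit forecast",
        "invoice payment terms and billing dispute",
        "server outage and network latency report",
        "database migration and backup schedule",
    ]
    train_labels = ["finance", "finance", "it", "it"]

    # Turn raw text into numeric features, then fit a simple classifier.
    vectorizer = TfidfVectorizer()
    classifier = LogisticRegression()
    classifier.fit(vectorizer.fit_transform(train_docs), train_labels)

    # Classify a new, previously unseen document.
    new_doc = ["network outage affecting the backup database"]
    print(classifier.predict(vectorizer.transform(new_doc)))

The mechanics are well understood; the challenge is running this kind of analysis over millions of documents rather than a handful.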

Therein lies the problem with the technology. The greater the volume of information, the more useful content analytics becomes - if the volume is small, humans can do the job themselves. But to apply analytics to a large volume of information, one needs significant computing power.

That’s why the actual use of analytics has been confined to a very few scenarios where money was not an issue. The US intelligence agencies used supercomputers with analytics to sift through millions of intercepted messages from suspected terrorists. Similarly, IBM’s Watson, the Jeopardy-winning machine, was a supercomputer. It was fed with encyclopedic knowledge and optimized for a single task: winning Jeopardy. Even though content analytics has been around for well over a decade, most organizations simply could not afford the computing power required.

That may be changing now, thanks to a couple of market shifts. First, Moore’s Law is helping - computing power is becoming more and more affordable, and the algorithms are becoming more powerful and efficient.

The other improvement comes from our understanding of what the technology is expected to accomplish. The early requirements for analytics and classification demanded ultra-accuracy. One of the key objections used to be the perceived unreliability of automatic classification - if it isn’t 100% accurate, it is no good. But today, we understand that the alternative is not perfect by any stretch. The alternative is to rely on humans, who are notoriously poor and inconsistent at analyzing and classifying content. Getting to 60% accuracy is a typical result of human classification.

That means that a technology that can get us to 80% or even 90% accuracy is actually far more accurate than any human. To put it in perspective: on a pile of 10,000 documents, 60% human accuracy leaves roughly 4,000 items misclassified, while a 90%-accurate engine misclassifies about 1,000. And this kind of approach doesn’t require nearly as much computing power as an attempt to reach 99.9% accuracy.

For example, one of the common applications for analytics is legal discovery - the need to quickly produce any electronic evidence requested by a court subpoena. Here, the goal today is to produce all the relevant documents and emails with a defensible level of accuracy. Of course, we don’t want to pay expensive lawyers for a manual review of thousands of documents. They are paid by the hour - and paid rather well. But we also don’t want to stand accused of failing to produce an important piece of evidence. Until recently, the fear was that unless we could prove that we had electronically discovered all the pertinent documents, the approach would not be defensible in a court of law. And only humans (ahem, lawyers) could guarantee such accuracy - for a hefty fee...

Today, that has changed. The courts increasingly understand the futility of aiming for 100% accuracy and instead accept statistical sampling as a way to confirm accuracy at a reasonable level. After all, both opposing parties - the plaintiff and the defendant - are in the same boat when it comes down to reviewing a mountain of electronic evidence. That means that the lawyers no longer have to review every document. Instead, statistical evidence of accuracy is considered defensible.
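For the statistically inclined, here is a rough sketch of how such sampling works - the sample size and the number of correct calls below are invented numbers: pull a random sample of the machine-classified documents, have a human review just that sample, and compute a confidence interval around the observed accuracy.

    # A rough sketch of confirming classification accuracy by sampling.
    # The sample size and hit count are invented for illustration.
    import math

    sample_size = 400   # documents drawn at random for manual review
    correct = 356       # how many of those the machine classified correctly
    p = correct / sample_size

    # 95% confidence interval using the normal approximation.
    z = 1.96
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    print(f"Observed accuracy: {p:.1%} (+/- {margin:.1%} at 95% confidence)")

If that interval sits comfortably above the accuracy level the parties have agreed on, the production can be defended without a lawyer reading every single document.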

As a result, auto-classification no longer has to aim for 100% accuracy. Instead, a more reasonable level of accuracy backed by statistical sampling has become acceptable - because it is still much better than what humans could ever do manually. And cheaper, of course. That makes analytics much more effective and affordable. With that, analytics are no longer confined to the world of supercomputers and are finding real-world use cases. Analytics may indeed have crossed the chasm.