Monday, December 13, 2010

Database.com

Last week, Salesforce.com held its annual user conference, Dreamforce.  You can read details on the event through the Salesforce.com website or other press sources (ZDNet.com has some of the best coverage), but Salesforce.com's platform service releases deserve some attention, as they have implications for the information industry as a whole.

If you missed the headliner, Salesforce.com launched something called 'database.com', which is in effect an application database in the cloud.  Readers who are more technically or development oriented might take my view as simplistic (or even wrong), but to a strident business/product guy, this is how I see the offering.  In effect, it is an enterprise-ready solution competing with cloud storage providers such as Amazon, IBM and others (as well as with internal database resources).  Further, the database is neutral to the application, meaning I can build an app that runs on a desktop, a mobile device, or another cloud infrastructure stack (public or private) and use the database.com solution as my source.  Why is this significant?
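
To make the application-neutrality point concrete, here is a minimal sketch of what any client (desktop, mobile, or another cloud stack) might look like when it pulls records from a cloud application database over HTTP.  The endpoint, token, and query syntax below are hypothetical stand-ins, not the actual database.com API.

```python
import requests

# Hypothetical endpoint and token -- stand-ins, not the real database.com API.
BASE_URL = "https://api.example-clouddb.com/v1"
TOKEN = "YOUR_ACCESS_TOKEN"

def fetch_companies(min_revenue):
    """Query the cloud database the same way from any client: one HTTP call."""
    resp = requests.get(
        f"{BASE_URL}/query",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"q": f"SELECT name, revenue FROM Company WHERE revenue > {min_revenue}"},
    )
    resp.raise_for_status()
    return resp.json()["records"]

# A desktop app, a mobile backend, or another cloud service would all call this identically.
print(fetch_companies(1_000_000))
```

Whatever the real interface looks like, the point is that the client never cares where the database lives or what sits behind it.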

Well, as I noted in a prior post, one thing many information providers struggle with is legacy infrastructure that is not even Web 1.0 (let alone Web 3.0) ready.  Firms like Boomi and other integrators have made a living deploying web service layers that sit on top of these legacy systems to free them up.  Now, instead of building another layer onto my complex stack, I can 'copy' or 'replicate' the data to the database.com environment and free it up to any application built anywhere.  This is something I explored while at Thomson Reuters.  Moving non-time-sensitive data such as company performance numbers, people biographies, end-of-day pricing and other information to a cloud-based application database allowed Thomson (and its partners) to leverage the data more easily, eliminate high integration costs, reduce support and development costs, and shorten time-to-market.  You could still keep a master copy on premise, but the application database would be freed from the firm's infrastructure and exposed to application innovation by customers, partners and even other groups within the firm, unlike today.
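
A minimal sketch of the 'copy and replicate' pattern described above: a nightly job reads non-time-sensitive rows (here, end-of-day prices) from an on-premise master and pushes them to a cloud application database.  The table name, cloud endpoint, and bulk API are assumptions for illustration only.

```python
import sqlite3
import requests

CLOUD_URL = "https://api.example-clouddb.com/v1/eod_prices"  # hypothetical
TOKEN = "YOUR_ACCESS_TOKEN"

def replicate_eod_prices(master_db="master.db"):
    """Copy today's end-of-day prices from the on-premise master to the cloud copy."""
    conn = sqlite3.connect(master_db)
    rows = conn.execute(
        "SELECT symbol, close_price, trade_date FROM eod_prices "
        "WHERE trade_date = date('now')"
    ).fetchall()
    conn.close()

    batch = [{"symbol": s, "close": c, "date": d} for s, c, d in rows]

    # One bulk upsert call: the master stays on premise, the cloud copy faces the apps.
    resp = requests.post(
        CLOUD_URL,
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"records": batch},
    )
    resp.raise_for_status()

replicate_eod_prices()
```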

One other thing this type of architecture change would bring is a shift in commercial models to usage-based pricing rather than simple subscription, to account for the wide range of usage patterns within a customer base.  Another issue facing information companies is that, for the most part, they have simple commercial models - pay me x dollars for a product, period.  In today's consumer-driven market of freemium and premium models as well as utility commercial models, a flat-rate subscription has little appeal.  By 'outsourcing' the application database (and potentially the application layer entirely), usage costs per transaction, per user, become apparent, and greater flexibility on commercial terms becomes possible.
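
To illustrate how metered access makes usage-based pricing tractable, here is a toy calculation: once every query is counted per user, a tiered per-transaction rate card can replace the flat subscription.  The tiers and rates below are invented for the example.

```python
# Hypothetical tiered rate card: (transactions up to this ceiling, price per transaction)
RATE_TIERS = [(10_000, 0.010), (100_000, 0.005), (float("inf"), 0.002)]

def monthly_bill(transactions: int) -> float:
    """Price a user's metered transactions for the month against the tiered rates."""
    bill, floor = 0.0, 0
    for ceiling, rate in RATE_TIERS:
        in_tier = min(transactions, ceiling) - floor
        if in_tier <= 0:
            break
        bill += in_tier * rate
        floor = ceiling
    return bill

# Usage is visible per transaction, per user -- invisible under a flat fee.
for user, tx in {"heavy_desk": 250_000, "casual_user": 1_200}.items():
    print(user, f"${monthly_bill(tx):,.2f}")
```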

Finally, database.com allows for greater control over how applications access the data stored within it.  Through row-level administration, users (and by extension applications) can be differentiated.  Thus I can keep some data away from one user and expose the super-set to another.  Again, this is a complex problem for information providers.  Often the data is sourced from third parties or carries specific access requirements.  Database.com could take over administration and access controls, lifting administrative and compliance burdens from information providers.  Further, with a singular application database, user profiles could be created across applications from multiple ISVs that use the same data more easily than today - a huge benefit in reducing the operational risk of data leakage.
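
A minimal sketch of the row-level control described above: each row carries an entitlement tag, each user profile carries a set of grants, and a query returns only the intersection.  The tags and profiles are invented, and database.com's actual sharing model is surely richer than this.

```python
# Each row is tagged with the entitlement required to see it (invented tags).
ROWS = [
    {"company": "Acme Corp", "revenue": 50, "entitlement": "public"},
    {"company": "Globex",    "revenue": 75, "entitlement": "premium"},
    {"company": "Initech",   "revenue": 20, "entitlement": "third_party_feed"},
]

# User profiles map to the entitlements they (and their apps) are granted.
PROFILES = {
    "basic_user":   {"public"},
    "premium_user": {"public", "premium", "third_party_feed"},
}

def query(user: str):
    """Return only the rows the user's profile is entitled to see."""
    grants = PROFILES[user]
    return [row for row in ROWS if row["entitlement"] in grants]

print(query("basic_user"))    # sees only the public subset
print(query("premium_user"))  # sees the super-set, including third-party data
```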

Coupled with the database.com release is the maturation of the Force.com application platform, represented by Force.com 2.  Salesforce.com has matured the platform, offering greater flexibility for building apps natively on it, including incorporation of both Ruby on Rails and Java in addition to its proprietary Apex language.  Again, I'll leave the more technology-focused to assess the value here, but from my perspective, I can now build robust apps faster, deploy to customers more easily, track customer usage and issues more accurately, and build a website for my product to market and promote it more quickly.  Integrate the application and the portal with my Salesforce.com CRM and I have a complete, enterprise-class ISV solution out of the box.  As a competitive 'offering' against what Microsoft can offer me with Azure and its mix of on-premise and cloud-based offerings, it's not even close.

It must be noted that the value of this stack has not been proven (to me anyway) for real-time data or for applications with the high transaction rates that frequent the trading room.  The value is in off-trading-floor applications.  Further, the analytics layer needed in many segments is not there either; however, that doesn't mean it can't be, or that a powerful third-party analytics engine can't be integrated into the Salesforce.com stack.  It does, though, open more possibilities for ISVs with huge data centers and legacy infrastructures to simplify their stacks and offer their customers innovation in ways they cannot today.

Someone will take advantage of this stack, and soon, I suspect.  When that happens, it will change the competitive landscape quickly.  It will take some leadership to make the move, but the first-mover advantage will be significant.

Friday, December 3, 2010

Data-as-a-Service

Dun & Bradstreet has fired the opening salvo in the pending Data-as-a-Service war with its D&B 360 product.  D&B 360 is an open API which allows customers to access the full breadth of D&B data to power their applications.  Integrated with Salesforce.com, D&B 360 enables a customer to pull in data on over 160 million companies stored in the D&B databases for sales and marketing.
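
The post does not spell out the D&B 360 API itself, so purely as a hypothetical sketch, a customer-side lookup against an open company-data API of this kind might look like the following; the endpoint, parameters, and field names are all assumptions.

```python
import requests

API_URL = "https://api.example-dnb360.com/v1/companies"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"

def find_companies(name: str, country: str = "US"):
    """Search the company database and return matches for sales/marketing use."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={"name": name, "country": country},
    )
    resp.raise_for_status()
    return resp.json()["results"]

# e.g., enrich a CRM lead with firmographic data before a sales call
# (field names below are assumed for illustration)
for company in find_companies("Acme"):
    print(company["duns_number"], company["name"], company["annual_revenue"])
```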

This is a big step, as it suggests that D&B is drawing some clear lines for customers, partners and potentially competitors as to where it is investing and what it considers its core value to customers.  As I noted in my previous post (The Next Great Information Company), information providers need to make choices about where they will specialize and what they consider their core business.  For D&B, leadership in financial risk management applications as well as sales and marketing solutions potentially marks the extent of its application business.  For its data and information, however, there is an opportunity to grow revenues significantly as other ISVs learn to leverage D&B data for other business purposes.

At Dreamforce next week, Salesforce.com's signature event, D&B plans to announce additional details and explore with prospective customers and ISVs how D&B 360 can help them.  It will be interesting to see where they take this innovative solution.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

On another front, I hear there is a ground-swell of start-ups looking to deliver a 'data cloud' much like the solution D&B offers.  This is a good thing.  Much like Salesforce.com's push into SaaS CRM solutions and the subsequent embrace of the cloud by Microsoft and others, having a healthy, vibrant stable of data-as-a-service providers offers customers choice and keeps costs down.

In a conversation with a couple of old friends the other night, we discussed whether the data cloud is truly viable.  The issues they raised centered on what traditional providers have long stated as their core value - we link all the data bits together and 'normalize' the data.  True, but is it still a valid point?

A series of very interesting articles from the A-Team Insight group suggests to me that the answer is no.

Just this week there were three separate pieces demonstrating that customers are actively looking to work around vendor data structures, thus paving the way to a seamless data cloud.  This week's A-Team Insight (A-TeamGroup.com) carried articles on Deutsche Bank, Northern Trust and Goldman Sachs, indicating that each is: a) breaking down internal data silos; b) taking a more enterprise-wide approach to data management; and c) (in the case of Goldman) establishing its own centralized symbology service.  Although this in itself isn't news (larger firms have always had individual identifiers), what is news is that there seems to be more open talk of ending vendor identifiers - or at least their complexity.
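
To illustrate what a centralized symbology service like Goldman's might do at its simplest, here is a sketch of an internal master that cross-references vendor identifiers to one firm-wide ID.  The identifiers and mappings shown are illustrative only.

```python
# Internal master: one firm-wide ID cross-referenced to each vendor's identifier.
# The mappings below are illustrative, not real reference data.
SYMBOLOGY = {
    "FIRM000001": {"ric": "IBM.N",  "cusip": "459200101", "sedol": "2005973"},
    "FIRM000002": {"ric": "MSFT.O", "cusip": "594918104", "sedol": "2588173"},
}

# Reverse index so any vendor symbol resolves to the firm-wide ID.
REVERSE = {
    (scheme, symbol): firm_id
    for firm_id, symbols in SYMBOLOGY.items()
    for scheme, symbol in symbols.items()
}

def resolve(scheme: str, symbol: str) -> str:
    """Map a vendor identifier to the internal ID, hiding vendor complexity."""
    return REVERSE[(scheme, symbol)]

# Internal apps and silos all speak the firm ID; vendor identifiers stay at the edge.
print(resolve("ric", "IBM.N"))        # -> FIRM000001
print(resolve("cusip", "594918104"))  # -> FIRM000002
```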

This was drawn most clearly in the same A-Team report, where David Berry, member of the Information Providers User Group and head of market data sourcing and strategy at UBS, states that customers are, to paraphrase Howard Beale in Network, 'mad as hell and not going to take it anymore'.  As Berry is quoted in the piece: "Vendors introduce complexity into the market by attempting to differentiate themselves on the basis of instrument identification and this should be eliminated".  A clearer line in the sand has never been drawn.

To me, the first cracks in the large-vendor monopoly are starting to appear.  It will be interesting to see where we are in 12 months' time...