Tableau’s IPO on May 17, 2013 instantly created the most valuable (by market capitalization) data visualization vendor. Since then Tableau has posted the best YoY growth in the “BI industry”, with skyrocketing sales almost reaching the level of QLIK’s. During the summer of 2014, shares of DATA (the stock symbol for Tableau) were relatively low, and as one visitor to my blog cynically put it, TCC14 (the Tableau Customer Conference for 2014) can easily be justified, since it raised the DATA stock and (indirectly) added more than $1B to Tableau’s market cap, to a level above $5B. Below is the entire history of DATA prices (click on image to enlarge):

DataQlikTibxMstrSinceIPO


For me, more indicative than stock prices, market capitalization, and the number of participants at customer conferences are the numbers of job openings at competing vendors: Tableau has 270+ of them (more than 20% of its current number of employees), QLIK has 120+ (about 7% of its number of employees), and TIBCO has only about two dozen openings related to Spotfire (unless I misread some other openings).

As a background to Tableau’s growing sales (and tremendous YoY growth) you can see the slow growth of QLIK sales (QLIK also delayed the release of a new product for almost 3 years: we will not see QlikView 12, we are still waiting for the release of QLIK.NEXT, and the only recent release is Qlik Sense, which does not make much sense to me) and almost no change in Spotfire sales. I am guessing that Tableau is taking all those sales away from the competition…

Keynotes and sessions at TCC14 were packed (you cannot find an available seat in the images below) and full of interesting info, and even entertainment, for new users and customers.

tableau-keynote-2014

These 2 fresh multimillionaires (see below; not sure why Christian’s face looks unhappy – I guess it is just me) opened TCC14, as usual, with an exciting keynote.

2MultiMillionairs

You can find their keynote either on the TCC14 website (link below) or on YouTube (below the TCC14 link). The keynote contains 3+ parts: two speeches from the co-founders (this year Christian chose the theme of “Data Art” – I am not sure if it helps sales, but it is nevertheless an entertaining and very speculative topic) and the rest of the keynote about new features in upcoming releases of Tableau (8.3 and 9.0?).

http://tcc14.tableauconference.com/keynote

As you can see from the slide below, Tableau is positioning the new features in 7 groups, and I will try to mention them.

tcc-keynote-features

Let’s start with the most interesting one to me: a potential performance gain of 2x or even 4x, mostly thanks to better use of multithreading and 64-bit code, and I quote here: “Vice President of Product Development Andrew Beers takes his turn next, speaking about Performance. He shows breakthroughs in the Viz Engine, flying through a visualized painting, seamlessly panning, zooming, and selecting. Switching into data more likely to be representative, he shows a live connection to a database of 173 million taxi rides in New York City, and dives in showing results easily four times faster than the same calculations run on the same machine running Tableau 8.2, leveraging a change in the Data Engine to use multiple CPU cores in parallel. Database queries will likewise be parallelized, with cited examples reducing 15 second queries to three, and more complex ones reduced from nearly a minute to as little as seven seconds.”

tab_conf_pan

Among other features, Chris Stolte introduced “Lasso & Radial Selections”: these selections allow users to select points in shapes other than just a rectangle. In his keynote, Stolte used a map as an example: he wanted to lasso only points in a city running from the northwest to the southeast, without selecting others along the way. The selection shape ended up looking like a figure eight. This was impressive.

Vice-President of Product Marketing Ellie Fields talked about forthcoming developments in cloud computing with Tableau, featuring Tableau Online as a platform for connecting cloud data warehouses and applications in conjunction with on-premise data, which can be presented in web browsers, on mobile devices, or even encapsulated in embedded applications.

Of course, the star of TCC14 was Prof. Hans Rosling – as a keynoter as well as part of the excited crowd.

HansKeynotingAtTCC14

Hans was a star even in the cafeteria (I could not resist including his picture, seated at a table with right hand raised).

HansAtTCC14

Another memorable event was the introduction of the “Zen Masters of 2014” – living proof of a huge and very capable Tableau community:

ZenMastersTCC14

Tableau provided a lot of classes and training sessions during TCC14 – almost all of them were well prepared and packed. Expect many of them to become available online – many for free.

TCC14_TrainingSession

I included below two video interviews, showing an insider’s take on Tableau as a sales organization

and also on Tableau’s approach to product management, with these priorities (I am curious if they are always followed in real life): Quality – Schedule – Features.

 

 

While on Cape Cod this summer, when away from the beach, I enjoyed some work-unrelated fun with Tableau. My beach reading included this article: http://www.theinformationlab.co.uk/2014/03/27/radar-charts-tableau-part-3/ by Andrew Ball, and I decided to create my own radar chart. When I showed it to coworkers later, they suggested I publish it (at least the fun with Polygons, Paths and Radars) on my blog. I may reuse this radar chart for our internal web analytics.

CloudsOverAtlantic

Natural Order of Points and Segments in a Line.

Many visualization tools draw a line chart, its datapoints, and the connecting line segments between datapoints in natural progressive order – painting them from left to right (horizontal ordering by Axis X), from bottom to top (vertical ordering by Axis Y), or vice versa.

Path as the method to break the Natural Order.

Some demanding visualizations and users wish to break this natural painting and drawing order, and Tableau allows that by using a Path as the method to order the datapoints and line segments of Lines and Polygons. A collection of increasing ordering numbers (Pathpoints), one per datapoint of a Line, defines the Path for drawing and connecting the datapoints and segments of that Line (or Polygon). Each Pathpoint can be predefined or calculated, depending on implementation and business logic.
Changing the natural order can create “artificial” and unusual situations, where two or more datapoints occupy the same pixels on the drawing surface but have very different Pathpoints (an example is a Polygon, where the Line ends in the same point where it starts), or where two or more line segments intersect in the same pixel on screen (an example is the center of the letter X).
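The Path mechanics above can be sketched outside Tableau in a few lines of plain Python (the coordinates and Pathpoint numbers below are invented for illustration, not taken from any workbook):

```python
def order_by_path(datapoints):
    """datapoints: list of (pathpoint, x, y) tuples.
    Returns the (x, y) coordinates sorted by pathpoint, i.e. in the
    order the line segments would be drawn."""
    return [(x, y) for _, x, y in sorted(datapoints)]

# Natural X-order would visit (0,0), (0,1), (1,0), (1,1) and never cross
# itself; these explicit Pathpoints force the two segments to intersect
# at (0.5, 0.5), drawing the letter "X" mentioned above.
pts = [(1, 0.0, 0.0), (2, 1.0, 1.0), (3, 1.0, 0.0), (4, 0.0, 1.0)]
print(order_by_path(pts))  # [(0.0, 0.0), (1.0, 1.0), (1.0, 0.0), (0.0, 1.0)]
```

In Tableau itself the same effect is achieved by putting the Pathpoint field on the Path shelf of a Line or Polygon mark.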

Radar.

A Radar Chart has 2 parts: the Radar Grid (background) and Radar Polygons (showing repetitive data patterns, when a linear timeline can be collapsed into a circular “timeline”). The Radar Grid has Radials (with a common Center) and Concentric Rings. Polygons can optionally be filled with (transparent) color. For further discussion, let’s use RMax as the maximal possible distance between the Center of the Radar Grid (or the Center of a Radar Polygon) and the most remote datapoint shown in that Grid or Polygon, respectively. We will use “normalized” statistics of visits to a typical website to visualize the hourly and daily (by day of the week) patterns of web visitation. By normalization we mean the removal of insignificant deviations from “normal” hourly and daily amounts of web visits. For complete obfuscation we will assume, for demo purposes, that RMax = 144.

Radar Radial Grid.

The Radial Grid contains a few radiuses (equidistant from each other), and we will draw each radius as a 3-point line whose starting and ending points are identical to each other and collocated with the Center of the Radar. For the demo Web Visitation Radar we will use a Radial Grid with 8 radiuses, corresponding to the following hours of the complete 24-hour day: 0, 3, 6, 9, 12, 15, 18, 21:
radials
For example, see the radius corresponding to HOUR = 3 (below in light brown; other radiuses are greyed out in that image):
Radiuses3
And for that radius we use (redundantly) the following 3 datapoints:
Radius3Data
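A hypothetical Python sketch of how such radius rows could be generated (my own function, not anything from Tableau; it assumes hour 0 points straight up and angles advance clockwise at 15° per hour, with RMax = 144 as in the demo):

```python
import math

RMAX = 144  # maximal radius used throughout the demo

def radius_line(hour, rmax=RMAX):
    """Rows (pathpoint, x, y) for one grid radius: starting and ending
    points collocated at the Center, with the tip as the middle pathpoint."""
    angle = math.radians(90 - hour * 15)   # hour 0 straight up, clockwise
    x, y = rmax * math.cos(angle), rmax * math.sin(angle)
    return [(1, 0.0, 0.0), (2, x, y), (3, 0.0, 0.0)]

# The 8 radiuses of the demo grid, for hours 0, 3, ..., 21
grid = {h: radius_line(h) for h in range(0, 24, 3)}
```

The 3-point path (Center, tip, Center) is what lets a single Line mark draw a spoke and return, exactly as in the HOUR = 3 datapoints shown above.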

Concentric Rings for Radar Grid.

For the demo Radar we will use 4 concentric rings, corresponding to the 25%, 50%, 75% and 100% levels of maximum visitation per hour:
Rings
Each ring is a line with 25 datapoints, where the starting and ending points are collocated/equal. For example, the dataset for the external ring (the red line above) looks like this:
Ring1Data
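Under the same assumptions as the radial-grid sketch (hour 0 straight up, 15° per hour, RMax = 144; the function name is mine), the 25-point rings could be generated like this:

```python
import math

RMAX = 144

def ring(level, rmax=RMAX):
    """25 rows (pathpoint, x, y) for one concentric ring at `level` (0..1)
    of rmax; the 25th point repeats the 1st so the line closes."""
    r = level * rmax
    rows = []
    for p in range(25):                      # 24 hourly points + closing point
        angle = math.radians(90 - (p % 24) * 15)
        rows.append((p + 1, r * math.cos(angle), r * math.sin(angle)))
    return rows

# The 4 demo rings at 25%, 50%, 75% and 100% of RMax
rings = [ring(lv) for lv in (0.25, 0.50, 0.75, 1.00)]
```

Each ring is really a 24-gon, which at this resolution is visually indistinguishable from a circle.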
When the Radials and Concentric Rings are collocated and overlaid, they represent the Radar Grid, ready to be the background for a Radar Chart:
Background

Radar Polygons.

For demo purposes we use only 2 Polygons: one (the largest) representing average hourly visits during a weekday, and a 2nd Polygon representing average hourly visits during a weekend day. For the websites I observed, the minimum number of visits happened around 1 AM, so you will see that both Polygons are slightly rotated clockwise and slightly shifted up from the Center of the Radar Grid, reflecting the fact that the minimum number of visitors (even around 1 AM) is slightly more than 0. Each Radar Polygon (in our demo) has 25 datapoints, with starting and ending points collocated at 1 AM. Here is the Weekday Polygon, overlaid with the Radar Grid:
weekday
Here are the data for Weekday Polygon: 

PolygonForWeekdayData
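Continuing the same sketch, a polygon like the one above could be built from 24 hourly values; note that the hourly numbers below are invented for illustration (the blog’s actual demo data lives in the published workbook), and the `center` parameter is my own addition to model the slight shift away from the grid’s Center:

```python
import math

def radar_polygon(hourly, center=(0.0, 0.0), start_hour=1):
    """25 rows (pathpoint, x, y) for a radar polygon built from 24 hourly
    values; the path starts and ends at `start_hour` (1 AM in the demo),
    and `center` lets the polygon sit off the grid's Center."""
    cx, cy = center
    rows = []
    for i in range(25):                       # 24 hours + closing point
        hour = (start_hour + i) % 24
        angle = math.radians(90 - hour * 15)
        r = hourly[hour]
        rows.append((i + 1, cx + r * math.cos(angle), cy + r * math.sin(angle)))
    return rows

# Invented "normalized" weekday visits per hour (NOT the blog's actual data):
# minimum near 1 AM, midday peak at RMax = 144.
weekday = [20, 8, 10, 14, 20, 30, 48, 75, 105, 125, 138, 144,
           141, 137, 131, 124, 116, 108, 98, 86, 70, 54, 40, 28]
weekday_polygon = radar_polygon(weekday)
```

Starting the path at the quietest hour (1 AM) is what produces the slight clockwise rotation visible in the demo images.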

Here is a Polygon for Weekend day, overlaid with Radar Grid:

weekend

Radar Chart.

When the Radar Grid and Radar Polygons are overlaid (Polygons transparent but on top of the Grid), we get the Radar Chart. Please note that the Centers of the Radar Grid and the Radar Polygons can have different locations:

RadarChart

 

I published a Tableau workbook with this demo Radar Chart and its Radar data here:

https://public.tableausoftware.com/profile/andrei5435#!/vizhome/radar/Radar

Visitors to this blog keep asking me to estimate Tableau Software prices (including those for Tableau Online), even though Tableau publishes all non-server prices on its website here: https://tableau.secure.force.com/webstore . However, this does not include discounts (especially for enterprise-volume purchases), pricing for servers of any kind (at least 2 kinds of server licenses exist), or pricing for consulting and training.

Thanks to the website of Tableau partner “Triad Technology Partners” we have a good estimate of all Tableau prices (they are always subject to negotiation) in the form of a so-called GSA Schedule (General Services Administration, Federal Acquisition Service, Special Items: No. 132-33 Perpetual Software Licenses, No. 132-34 Maintenance of Software as a Service, No. 132-50 Training Courses) for Tableau Software products and services, see it here:

http://www.triadtechpartners.com/vendors/tableau-software/
TRIAD’s full list of GSA contracts (which includes prices for IBM Cognos and others, for example) is here:
http://www.triadtechpartners.com/contracts/
and the specific Tableau prices are here:
http://www.triadtechpartners.com/wp-content/uploads/Tableau-GSA-Price-List-April-2013.pdf

I grouped Tableau’s prices (please keep in mind that TRIAD published its GSA Schedule in April 2013, so these are year-old prices, but they are good enough for estimating purposes) into 5 groups below: Desktop, Server licensed for Named Users (makes sense if you have fewer than a hundred “registered” users), Core Licenses for Tableau Server (recommended when you have more than 150 “registered” users), Consulting, and Training prices:

The Google Sheet with this price spreadsheet is here:

https://docs.google.com/spreadsheets/d/1oCyXRR3B6dqXcw-8cE05ApwsRcxckgA6QdvF9aF6_80/edit?usp=sharing
and an image of it – for those whose browsers misbehave – is below:
TableauPrices2013

Again, please keep in mind that the above is just a price estimate (except for Tableau Online) based on the 2013 GSA Schedule, and a good negotiator can always get a good discount (I got one each time I tried). You may also wish to review a more general article from Boris Evelson here:

http://blogs.forrester.com/boris_evelson/14-04-22-a_common_denominator_for_pricing_and_negotiating_business_intelligence_bi_and_analytics_software#comment-27689

A note about the choice between a Core License and a Server License with Named Users: I know organizations that chose to keep Named User licensing instead of switching to a Core License even with more than 300 registered users, because it allows them to use much more capable hardware (with many more CPU cores).

Observing and comparing multiple (similar) multidimensional objects over time and visually discovering multiple interconnected trends is the ultimate data visualization task, regardless of the specific research area – it can be chemistry, biology, economics, sociology, publicly traded companies, or even so-called “Data Science”.

For the purposes of this article I like the dataset published by the World Bank: 1000+ measures (they call them World Development Indicators) for 250+ countries over 50+ years – theoretically more than 10 million datapoints:

http://data.worldbank.org/data-catalog/world-development-indicators?cid=GPD_WDI

Of course some datapoints are missing, so I restricted myself to 20 countries, 20 years and 25 measures (a more reasonable dataset with about 10,000 datapoints). This gave me 500 time series for 20 objects (countries), and I tried to imitate how analysts and scientists would use visualizations to “discover” trends and other data patterns in such a situation and, if possible, extrapolate this approach to more massive datasets in practical projects. My visualization of this dataset can be found here:

http://public.tableausoftware.com/views/wdi12/Trends?amp;:showVizHome=no
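The dataset sizing described above is simple arithmetic, worth making explicit (all numbers come from the text):

```python
# 20 countries x 25 measures -> one time series per (country, measure) pair;
# each series has 20 yearly values.
countries, years, measures = 20, 20, 25
time_series = countries * measures   # 500 series for 20 objects
datapoints = time_series * years     # ~10,000 datapoints in the subset
print(time_series, datapoints)  # 500 10000
```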

In addition to the Trends line chart (please choose an Indicator in the filter at the bottom of the chart), I added (in my Tableau visualization above) a Motion Chart for any chosen indicator(s) and a Motion Map Chart for the GDP indicator. A similar visualization of this dataset was done by Google here: http://goo.gl/g2z1b6 .

As you can see below from samples of just 6 indicators (out of the 1000+ published by the World Bank), the behavior of the monitored objects (countries) varies vastly.

GDP trends: the clear leader is the USA, with China the fastest growing among economic leaders and Japan almost stagnant for the last 20 years (please note that I reuse each country’s color from the GDP chart for all other 1000+ indicators and line charts):

GDPTrends

Life expectancy: Switzerland and Japan provide the longest life to their citizens, while Indian and Russian citizens are expected to live less than 70 years. Australia is probably improving life expectancy faster than the other countries in this subset.

LifExpectancy

Health expenditures per capita: a group of 4 – Switzerland, Norway (fastest growing?), Luxembourg and the USA – spend about $9,000 per person per year on health, while India, Indonesia and China spend less than $500:

HealthExpenditurePerCapita

Consumer Price Index: prices in Russia, India and Turkey are growing faster than elsewhere, while prices in Japan and Switzerland have remained almost unchanged over the last 20 years:

CPI

Mobile phones per 100 persons: Russia has 182 mobile phones per 100 people (the fastest growth in the last 10 years), while India has fewer than 70 cellular phones per 100 people.

CellPhonesPer100

Military expenses as a percentage of budget (a lot of missing data when it comes to military expenses!): the USA, India and Russia spend more than others – guess why that is:

MilitaryExpensesPercentageOfBudget

 

You can find many examples of visual monitoring of multiple objects over time. One sample is https://www.tradingview.com/ , where over 7000 objects (publicly traded companies) are monitored across hundreds of indicators (like share prices, market capitalization, EBITDA, income, debt, assets, etc.). An example (I did it for a previous blog post): https://www.tradingview.com/e/xRWRQS5A/

Data Visualization Readings, Q1 2014, selected from Google+ extensions of this blog:
http://tinyurl.com/VisibleData and
http://tinyurl.com/VisualizationWithTableau

dvi032914

Data Visualization Index (using DATA+QLIK+TIBX+MSTR; click on the image above to enlarge):
From 11/1/13 until 3/15/14, DATA stock grew 50%, QLIK 11%, and MSTR 6%, while TIBX lost 1%.
Current market capitalization: Tableau – $5.5B, QLIK – $2.6B, TIBCO – $3.5B, Microstrategy – $1.4B.
Number of job openings today: Tableau – 231, QLIK – 135, Spotfire (estimate) – 30, Microstrategy – 214.
However, during the last 2 weeks of March 2014, DATA shares lost 24%, QLIK lost 14%, and TIBX and MSTR both lost about 10%.

Why use R? Five reasons.
http://www.econometricsbysimulation.com/2014/03/why-use-r-five-reasons.html

Studying Tableau Performance Characteristics on AWS EC2
http://tableaulove.tumblr.com/post/80571148718/studying-tableau-performance-characteristics-on-aws-ec2

Head-to-head comparison of Datawatch and Tableau
http://datawatch.com/datawatch-vs-tableau

Diving into TIBCO Spotfire Professional 6.0
http://www.jenunderwood.com/2014/03/25/diving-into-tibco-spotfire-professional-6-0/

TIBCO beats Q1 2014 estimates but Spotfire falters
http://diginomica.com/2014/03/20/tibco-beats-estimates-spotfire-falters/

Qlik Doesn’t Fear Tableau, Oracle In Data Analytics
http://news.investors.com/031314-693154-qlik-focuses-on-easy-to-use-data-analytics.htm?p=full

Best of the visualisation web… February 2014
http://www.visualisingdata.com/index.php/2014/04/best-of-the-visualisation-web-february-2014/

Datawatch: ‘Twenty Feet From Stardom’
http://seekingalpha.com/article/2101513-datawatch-twenty-feet-from-stardom

Tableau plans to raise $345M — more than its IPO — with new stock offering
http://venturebeat.com/2014/03/16/tableau-plans-to-raise-345m-more-than-its-ipo-with-new-stock-offering/

TIBCO Spotfire Expands Connectivity to Key Big Data Sources
http://www.marketwatch.com/story/tibco-expands-connectivity-to-key-big-data-sources-2014-03-11

Tableau and Splunk Announce Strategic Technology Alliance
http://www.splunk.com/view/SP-CAAAKH5?awesm=splk.it_hQ

The End of The Data Scientist!?
http://alpinenow.com/blog/the-end-of-the-data-scientist/

bigData

Data Science Is Dead
http://slashdot.org/topic/bi/data-science-is-dead/

Periodic Table of Elements in TIBCO Spotfire
http://insideinformatics.cambridgesoft.com/InteractiveDemos/LaunchDemo/?InteractiveDemoID=1

Best of the visualisation web… January 2014
http://www.visualisingdata.com/index.php/2014/03/best-of-the-visualisation-web-january-2014/

Workbook Tools for Tableau
http://powertoolsfortableau.com/tableau-workbooks/workbook-tools/

Tapestry Data Storytelling Conference
http://www.tapestryconference.com/attendees
http://www.visualisingdata.com/index.php/2014/03/a-short-reflection-about-tapestry-conference/

ReadingLogo

URL Parameters in Tableau
http://interworks.co.uk/business-intelligence/url-parameters-tableau/

Magic Quadrant 2014 for Business Intelligence and Analytics Platforms
http://www.gartner.com/technology/reprints.do?id=1-1QLGACN&ct=140210&st=sb

What’s Next in Big Data: Visualization That Works the Way the Eyes and Mind Work
http://insights.wired.com/profiles/blogs/what-s-next-in-big-data-visualization-that-works-the-way-the-eyes#axzz2wPWAYEuY

What animated movies can teach you about data analysis
http://www.cio.com.au/article/539220/whatanimatedmoviescanteachaboutdata_analysis/

Tableau for Mac is coming, finally
http://www.geekwire.com/2014/tableau-mac-coming-finally/

Authenticating an External Tableau Server using SAML & AD FS
http://www.theinformationlab.co.uk/2014/02/04/authenticating-external-tableau-server-using-internal-ad/

Visualize this: Tableau nearly doubled its revenue in 2013
http://gigaom.com/2014/02/04/visualize-this-tableau-nearly-doubled-its-revenue-in-2013/

Qlik Announces Fourth Quarter and Full Year 2013 Financial Results
http://investor.qlik.com/releasedetail.cfm?ReleaseID=827231

InTheMiddleOfWinter2

Tableau Mapping – Earthquakes, 300,000,000 marks using Tableau 8.1 64-bit
http://theywalkedtogether.blogspot.com/2014/01/tableaumapping-earthquakes-300000000.html

Data Science: What’s in a Name?
http://www.linkedin.com/today/post/article/20130215205002-50510-the-data-scientific-method

Gapminder World Offline
http://www.gapminder.org/world-offline/

Advanced Map Visualisation in Tableau using Alteryx
http://www.theinformationlab.co.uk/2014/01/15/DrawingArrowsinTableau

Motion Map Chart
http://apandre.wordpress.com/2014/01/12/motion-map-chart/

One of Bill Gates’s favorite graphs redesigned
http://www.perceptualedge.com/blog/?p=1829

Authentication and Authorization in Qlikview Server
http://community.qlik.com/blogs/qlikviewdesignblog/2014/01/07/authentication-and-authorization

SlopeGraph for QlikView (D3SlopeGraph QlikView Extension)
http://www.qlikblog.at/3093/slopegraph-for-qlikview-d3slopegraph-qlikview-extension/

Revenue Model Comparison: SaaS v. One-Time-Sales
http://www.wovenware.com/blog/2013/12/revenue-model-comparison-saas-v-one-time-sales#.UyimffmwIUo

Scientific Data Has Become So Complex, We Have to Invent New Math to Deal With It
http://www.wired.com/wiredscience/2013/10/topology-data-sets/all/

Posting data to the web services from QlikView
http://community.qlik.com/docs/DOC-5530

It’s your round at the bar
http://interworks.co.uk/tableau/radial-bar-chart/

Lexical Distance Among the Languages of Europe
http://elms.wordpress.com/2008/03/04/lexical-distance-among-languages-of-europe/

SnowInsteadOfRainJan2014-SNOW

For this weekend I got 2 guest bloggers (one yesterday and the other today) sharing their thoughts about cloud services for BI and DV. I myself recently published a few articles on this topic, for example here: http://apandre.wordpress.com/2013/08/28/visualization-as-a-service/ and here:

http://apandre.wordpress.com/2013/12/14/spotfire-cloud-pricing/ . My opinions can differ from those of my guest bloggers. You can find many providers of DV and BI cloud services, including Spotfire Cloud, Tableau Online, GoodData, Microstrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB, etc.

Let me introduce my 2nd guest blogger for this weekend: Ugur Kadakal, the CEO and founder of Pagos, Inc., located in Cambridge, MA. Pagos is the developer of SpreadsheetWEB, which transforms Excel spreadsheets into web-based Business Intelligence (BI) applications without any programming. SpreadsheetWEB can also convert PowerPivot files into web-based dashboards, and it provides advanced Data Visualization (DV) for SQL Analysis Services (Tabular) cubes without SharePoint. Mr. Kadakal has published a few articles on this blog before, with great feedback, so he is a serial guest blogger.

SaaSCost

Before (or after) you read Mr. Kadakal’s article, I suggest reviewing the article comparing 5+ scenarios of cloud-service revenue vs. the traditional one-time sale of software; see it here: http://www.wovenware.com/blog/2013/12/revenue-model-comparison-saas-v-one-time-sales#.UyikEfmwIUp . The illustration above is from that article.

Traditional BI versus Cloud BI

Over the past several years, we have been witnessing numerous transformations in the software industry, from the traditional on-premise deployment model to the cloud. There are some application types for which the cloud makes a lot of sense, while for others it doesn’t. BI is somewhere in between.

Before I express my opinion on the subject of Traditional BI versus Cloud BI, I would like to clarify my definitions. I define traditional BI as large enterprise implementations which connect with many data sources in real-time.  These projects have many phases and require large teams to implement. These projects could take years and cost millions of dollars to implement.

Many people define cloud BI as deployments on a proprietary, third-party, multi-tenant environment managed by a vendor. My definition is somewhat different and broader. Cloud BI is more about ease of deployment, use and management. While Cloud BI can be hosted and managed by a vendor, it can also be deployed on a private Cloud infrastructure like Amazon or Microsoft Azure. With the advancement of cloud infrastructure technologies like OpenStack, deploying and managing private cloud infrastructure is becoming easier for many enterprises. As a result, whether Cloud BI is deployed on a multi/single-tenant environment on vendor infrastructure, a third party cloud infrastructure like Amazon, Azure, etc. or on internal private cloud, it becomes more of a business decision rather than a technical limitation.

DataCloud

One main distinction between Traditional BI and Cloud BI is data management. Traditional BI implementations can have real-time data as they can connect to the original data sources directly. I don’t believe that Cloud BI should deal with real-time data, even if implemented on internal private cloud infrastructure. Supporting real-time data is a requirement that makes any BI project complicated and costly. Hence Cloud BI solutions should include simple utilities i.e. ETL, residing on local computers to push internal data into Cloud BI’s data model periodically. Since Cloud BI should not deal with real-time data scenarios, this data synchronization can be configured by the business user accordingly.

Another distinction is the ease of implementation. Regardless of where it is deployed, Cloud BI solutions should take no more than a few hours to implement and configure. Some BI vendors already support images on Amazon cloud to simplify this process.

The traditional BI model typically requires significant upfront investments. Part of this investment is internal, while the rest is BI licensing and implementation fees. But the very nature of Cloud BI requires agility, from deployment to data management and dashboard creation. A Cloud BI project can be deployed easily, and it can also be modified and shut down with equal ease. Hence the traditional business model of large upfront investments doesn’t make sense here. The Cloud BI business model should be subscription based, regardless of whether it is implemented on a vendor infrastructure or on an on-premise private cloud infrastructure. Customers should be able to pay for what they use and for how long they use it. Such simplicity will also eliminate the vendor lock-in risks that most enterprises have to mitigate.

DVinCloud2

In summary, there are many BI projects that will require traditional BI implementation. These projects typically require real-time data and connectivity to many different data sources. Cloud BI should not attempt to handle these types of projects. But there are many other BI projects that require neither real-time data nor the data which comes from different systems that should be connected. Cloud BI can handle these projects quickly and cost effectively, by empowering business users to manage the whole process without IT or external support. From discovery to data synchronization to dashboard creation and management, every activity can be handled by business users.

For this weekend I got 2 guest bloggers (one today and the second tomorrow) sharing their thoughts about cloud services for BI and DV. I myself recently published a few articles on this topic, for example here: http://apandre.wordpress.com/2013/08/28/visualization-as-a-service/ and here:

http://apandre.wordpress.com/2013/12/14/spotfire-cloud-pricing/ . My opinions can differ from those of my guest bloggers (see my comment below this article). You can find many providers of DV and BI cloud services, including Spotfire Cloud, Tableau Online, GoodData, Microstrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB, etc.

Let me introduce my 1st guest blogger for this weekend: Mark Flaherty, Chief Marketing Officer at InetSoft Technology, a BI (Business Intelligence) software provider founded in 1996, headquartered in Piscataway, New Jersey, with over 150 employees worldwide. InetSoft’s flagship BI application, Style Intelligence, enables self-service BI spanning dashboarding, reporting and visual analysis for enterprises and technology providers. The server-based application includes a data mashup engine for combining data from almost any data source, and browser-based design tools that power users and developers can use to quickly create interactive DV (Data Visualizations).

DVinCloud

Are public BI cloud services really going to overtake the traditional on-premise deployment of BI tools?

(Author: Mark Flaherty. Text below contains Mark’s opinions and they can be different from opinions expressed on this blog).

It’s been six years since public BI cloud services came to be. Originally termed SaaS BI, public BI cloud services refers to commercial service providers who host a BI application in the public cloud that accesses corporate data housed in the corporate private cloud and/or other application providers’ networks. As recently as last month, an industry report from TechNavio said, “the traditional on-premise deployment of BI tools is slowly being taken over by single and multi-tenant hosted SaaS.” I have a feeling this is another one of those projections that copies a historical growth rate forward for the next five years. If you do that with any new offering that starts from zero, you will always project it to dominate a marketplace, right?

I thought it would be interesting to discuss why I think this won’t happen.

DVinCloud3

In general, there is one legitimate driving force for why companies look to cloud solutions that helps drive the demand for cloud BI services specifically: outsourcing of IT. The types of companies for whom this makes the most sense are small businesses. They have little or no IT staff to set up and support enterprise software, and they also have limited cap-ex budgets so software rentals fit their cash flow structure better. While this is where most of the success for cloud BI has happened, this is only a market segment opportunity. By no means do small companies dominate the IT marketplace.

Another factor for turning to public cloud solutions is expediency. Even at large companies where there is budget for software purchases, the Business sometimes becomes frustrated with the responsiveness of internal IT, and they look outside for a faster solution. This makes sense for domain-specific cases where there is a somewhat narrow scope of need, and the application and the data are self-contained.  Salesforce.com is the poster child for this case, where it can quickly be set up as a CRM for a sales team. Indeed the fast success of salesforce.com is a big reason why people think cloud solutions will take off in every domain.

But business intelligence is different. A BI tool is meant to span multiple information areas, from finance to sales to support and more. This is where it gets complicated for mid-sized and global enterprises. The expediency factor is nullified because the data that business users want to access with their cloud BI tool is controlled by IT, so they need to be involved. Depending on the organization’s policies and politics, this can either slow down such a move or kill it.

The very valid reason why enterprise IT would kill the idea of a public cloud BI solution is ultimately why I think public BI cloud services have such a limited opportunity in the overall market. One of IT’s responsibilities is ensuring data security, and they will rightly point out the security risks of opening access to sensitive corporate data to a 3rd party. It’s one thing to trust a vendor with one set of data like website visitor traffic, but trusting them with all of a company’s financial and customer data is where almost all companies will draw the line. This is a concern I don’t see ever going away.

What are some pieces of evidence that public BI cloud services have a limited market opportunity? When BI cloud services first came onto the scene, all of the big BI vendors dabbled in it. Now many no longer champion these hosted offerings, or they have shuttered or demoted them. IBM’s Cognos Express is now only an on-premise option. SAP BusinessObjects BI OnDemand can’t be found from SAP’s main site, but has its own micro site. Tibco’s Spotfire Cloud and Tableau Software’s Tableau Online are two exceptions among the better known BI providers that are still prominently marketed. However, Tibco positions this option for small businesses and workgroups and omits certain functionality.

Our company, too, experimented with a public BI cloud offering years ago. It was first targeted at salesforce.com customers who would want to mash up their CRM data with other enterprise-housed data. We found mostly small, budget-challenged companies in their customer base, and the few large enterprises that we found balked at the idea, asking instead for our software to be installed on-premise, where they would connect to any cloud-hosted data on their own. Today the only remaining cloud offering of ours is a free visualization service called Visualize Free, which is similar to Tableau Public or IBM’s Many Eyes.

Another observation: while there have been a handful of pure-play cloud BI vendors, one of them, “LucidEra,” came and went quite quickly. Birst is one that seems to have found a successful formula.

In summary, yes, there is a place for public BI cloud services in the small business market, but no, it’s not going to overtake traditional on-premise BI.

GoogleDataCenterInGeorgiaWithCloudsAboveIt2
