Visitors to this blog keep asking me to estimate Tableau Software prices (including for Tableau Online), even though Tableau publishes all non-server prices on its website here: https://tableau.secure.force.com/webstore . However, that page does not cover discounts (especially for enterprise-volume purchases), pricing for servers of any kind (at least 2 kinds of server licenses exist), or pricing for consulting and training.

Thanks to the website of Tableau partner Triad Technology Partners, we have a good estimate of all Tableau prices (they are always subject to negotiation) in the form of the so-called GSA Schedule (General Services Administration, Federal Acquisition Service, Special Items: No. 132-33 Perpetual Software Licenses, No. 132-34 Maintenance of Software as a Service, No. 132-50 Training Courses) for Tableau Software products and services; see it here:

http://www.triadtechpartners.com/vendors/tableau-software/
Triad's other GSA contracts (which, for example, include prices for IBM Cognos and others) are listed here:
http://www.triadtechpartners.com/contracts/
and the specific Tableau price list is here:
http://www.triadtechpartners.com/wp-content/uploads/Tableau-GSA-Price-List-April-2013.pdf

I grouped Tableau's prices (keep in mind that Triad published this GSA schedule in April 2013, so the prices are a year old, but they are good enough for estimating purposes) into 5 groups below: Desktop, Server licensed per Named User (makes sense if you have fewer than a hundred “registered” users), Core Licenses for Tableau Server (recommended when you have more than 150 “registered” users), Consulting, and Training:

The Google Sheet for the spreadsheet above is here:
https://docs.google.com/spreadsheets/d/1oCyXRR3B6dqXcw-8cE05ApwsRcxckgA6QdvF9aF6_80/edit?usp=sharing
and an image of it, for those whose browsers misbehave, is below:
TableauPrices2013

Again, please keep in mind that the above is just a price estimate (except for Tableau Online), based on the 2013 GSA Schedule, and a good negotiator can always get a good discount (I got one each time I tried).

A note about the choice between a Core License and a Server License with Named Users: I know organizations that choose to keep Named User licensing instead of switching to a Core License even with more than 300 registered users, because it allows them to run Tableau Server on much more capable hardware (with many more CPU cores).
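The break-even arithmetic between the two models is simple; here is a minimal sketch with hypothetical placeholder prices (the real numbers are in the GSA price list above and are always negotiable):

```python
# Hypothetical break-even estimate: Named User vs. Core licensing for Tableau Server.
# All prices below are placeholders, NOT actual Tableau list prices -
# substitute the figures from the GSA price list linked above.

PRICE_PER_NAMED_USER = 1_000   # hypothetical license cost per registered user
PRICE_PER_CORE = 20_000        # hypothetical license cost per CPU core
CORES_NEEDED = 8               # planned core count for the server

def named_user_cost(users: int) -> int:
    """Total license cost when every registered user needs a Named User license."""
    return users * PRICE_PER_NAMED_USER

def core_license_cost(cores: int = CORES_NEEDED) -> int:
    """Total license cost when the server is licensed per core, unlimited users."""
    return cores * PRICE_PER_CORE

# Smallest user count at which the core license becomes strictly cheaper:
break_even = core_license_cost() // PRICE_PER_NAMED_USER + 1
print(f"With these placeholder prices, core licensing wins above "
      f"~{break_even} registered users on an {CORES_NEEDED}-core server.")
```

The same arithmetic also shows the trade-off mentioned above: adding cores raises the core-license price but not the named-user price, which is why some large organizations keep Named User licensing on big hardware.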

Observing and comparing multiple (similar) multidimensional objects over time and visually discovering multiple interconnected trends is the ultimate Data Visualization task, regardless of the specific research area – it can be chemistry, biology, economics, sociology, publicly traded companies or even so-called “Data Science”.

For the purposes of this article I like the dataset published by the World Bank: 1000+ measures (they call them World Development Indicators) for 250+ countries over 50+ years – theoretically more than 10 million DataPoints:

http://data.worldbank.org/data-catalog/world-development-indicators?cid=GPD_WDI

Of course some DataPoints are missing, so I restricted myself to 20 countries, 20 years and 25 measures (a more reasonable dataset with about 10,000 DataPoints). That gave me 500 time series for 20 objects (countries), and I tried to imitate how analysts and scientists would use visualizations to “discover” trends and other data patterns in such a situation and, if possible, extrapolate this approach to more massive datasets in practical projects. My visualization of this dataset can be found here:

http://public.tableausoftware.com/views/wdi12/Trends?amp;:showVizHome=no
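For readers who want to reproduce a similar subset outside of Tableau, here is a minimal sketch in Python (it assumes the WDI bulk CSV with columns like “Country Name” and “Indicator Name” plus one column per year; the file name and the short country/indicator lists are placeholders to be extended to 20 and 25 respectively):

```python
import pandas as pd

# Minimal sketch: subset the World Bank WDI bulk download to
# ~20 countries x ~25 indicators x 20 years (about 10,000 DataPoints).
# The file name, country list and indicator list are placeholders.
wdi = pd.read_csv("WDI_Data.csv")   # one row per (country, indicator), one column per year

countries = ["United States", "China", "Japan", "Germany", "India"]   # ...extend to 20
indicators = ["GDP (current US$)",
              "Life expectancy at birth, total (years)"]              # ...extend to 25
years = [str(y) for y in range(1993, 2013)]                           # 20 years

subset = wdi[wdi["Country Name"].isin(countries)
             & wdi["Indicator Name"].isin(indicators)]

# Reshape to long format: one row per (country, indicator, year),
# i.e. one of the 500 time series per (country, indicator) pair.
tidy = subset.melt(id_vars=["Country Name", "Indicator Name"],
                   value_vars=years, var_name="Year", value_name="Value")
print(tidy.dropna().shape)   # roughly 10,000 DataPoints, minus whatever is missing
```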

In addition to the Trends Line Chart (please choose an Indicator in the filter at the bottom of the chart), I added to my Tableau visualization above a Motion Chart for any chosen Indicator(s) and a Motion Map Chart for the GDP Indicator. A similar visualization of this dataset, done by Google, is here: http://goo.gl/g2z1b6 .

As you can see below with samples of just 6 indicators (out of the 1000+ published by the World Bank), the behavior of the monitored objects (countries) varies vastly.

GDP trends: the clear leader is the USA, with China the fastest-growing among the economic leaders and Japan almost stagnant for the last 20 years (please note that I reuse each country’s “GDP colors” for all the other 1000+ indicators and line charts):

GDPTrends

Life Expectancy: Switzerland and Japan provide the longest lives to their citizens, while Indian and Russian citizens are expected to live less than 70 years. Australia is probably improving life expectancy faster than the other countries in this subset of 20.

LifExpectancy

Health Expenditures Per Capita: a group of 4 – Switzerland, Norway (fastest growing?), Luxembourg and the USA – spend about $9000 per person per year on health, while India, Indonesia and China spend less than $500:

HealthExpenditurePerCapita

Consumer Price Index: prices in Russia, India and Turkey are growing faster than elsewhere, while prices in Japan and Switzerland have stayed almost unchanged over the last 20 years:

CPI

Mobile Phones Per 100 Persons: Russia has 182 mobile phones per 100 people (the fastest growth in the last 10 years), while India has fewer than 70 cellular phones per 100 people.

CellPhonesPer100

Military Expenses as a Percentage of Budget (a lot of data is missing when it comes to military expenses!): the USA, India and Russia spend more than the others – guess why that is:

MilitaryExpensesPercentageOfBudget

 

You can find many examples of visual monitoring of multiple objects over time. One example is https://www.tradingview.com/ , where over 7000 objects (publicly traded companies) are monitored across hundreds of indicators (share price, market capitalization, EBITDA, income, debt, assets, etc.). Here is an example I made for a previous blog post: https://www.tradingview.com/e/xRWRQS5A/

Data Visualization Readings, Q1 2014, selected from Google+ extensions of this blog:
http://tinyurl.com/VisibleData and
http://tinyurl.com/VisualizationWithTableau

dvi032914

Data Visualization Index (using DATA+QLIK+TIBX+MSTR; click on the image above to enlarge):
From 11/1/13 to 3/15/14, DATA stock grew 50%, QLIK 11% and MSTR 6%, while TIBX lost 1%.
Current market capitalization: Tableau – $5.5B, QLIK – $2.6B, TIBCO – $3.5B, Microstrategy – $1.4B.
Number of job openings today: Tableau – 231, QLIK – 135, Spotfire (estimate) – 30, Microstrategy – 214.
However, during the last 2 weeks of March 2014, DATA shares lost 24%, QLIK lost 14%, and TIBX and MSTR both lost about 10%.
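The arithmetic behind such an index is simple; here is a minimal sketch that normalizes each ticker and averages the four equally (the closing prices below are placeholders and the equal weighting is my assumption – the chart above may be built differently):

```python
# Minimal sketch of an equal-weighted "Data Visualization Index" of DATA, QLIK, TIBX, MSTR.
# Closing prices are placeholders; equal weighting is an assumption.
prices_start = {"DATA": 100.0, "QLIK": 100.0, "TIBX": 100.0, "MSTR": 100.0}  # 11/1/13 (placeholder)
prices_end   = {"DATA": 150.0, "QLIK": 111.0, "TIBX":  99.0, "MSTR": 106.0}  # 3/15/14 (placeholder)

def pct_change(start: float, end: float) -> float:
    """Percentage change from start to end."""
    return (end - start) / start * 100.0

changes = {t: pct_change(prices_start[t], prices_end[t]) for t in prices_start}
index_change = sum(changes.values()) / len(changes)   # equal-weighted average

print(changes)                          # per-ticker growth in %
print(f"Index: {index_change:+.1f}%")   # the composite "DV Index" move
```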

Why use R? Five reasons.
http://www.econometricsbysimulation.com/2014/03/why-use-r-five-reasons.html

Studying Tableau Performance Characteristics on AWS EC2
http://tableaulove.tumblr.com/post/80571148718/studying-tableau-performance-characteristics-on-aws-ec2

Head-to-head comparison of Datawatch and Tableau
http://datawatch.com/datawatch-vs-tableau

Diving into TIBCO Spotfire Professional 6.0
http://www.jenunderwood.com/2014/03/25/diving-into-tibco-spotfire-professional-6-0/

TIBCO beats Q1 2014 estimates but Spotfire falters
http://diginomica.com/2014/03/20/tibco-beats-estimates-spotfire-falters/

Qlik Doesn’t Fear Tableau, Oracle In Data Analytics
http://news.investors.com/031314-693154-qlik-focuses-on-easy-to-use-data-analytics.htm?p=full

Best of the visualisation web… February 2014
http://www.visualisingdata.com/index.php/2014/04/best-of-the-visualisation-web-february-2014/

Datawatch: ‘Twenty Feet From Stardom’
http://seekingalpha.com/article/2101513-datawatch-twenty-feet-from-stardom

Tableau plans to raise $345M — more than its IPO — with new stock offering
http://venturebeat.com/2014/03/16/tableau-plans-to-raise-345m-more-than-its-ipo-with-new-stock-offering/

TIBCO Spotfire Expands Connectivity to Key Big Data Sources
http://www.marketwatch.com/story/tibco-expands-connectivity-to-key-big-data-sources-2014-03-11

Tableau and Splunk Announce Strategic Technology Alliance
http://www.splunk.com/view/SP-CAAAKH5?awesm=splk.it_hQ

The End of The Data Scientist!?
http://alpinenow.com/blog/the-end-of-the-data-scientist/

bigData

Data Science Is Dead
http://slashdot.org/topic/bi/data-science-is-dead/

Periodic Table of Elements in TIBCO Spotfire
http://insideinformatics.cambridgesoft.com/InteractiveDemos/LaunchDemo/?InteractiveDemoID=1

Best of the visualisation web… January 2014
http://www.visualisingdata.com/index.php/2014/03/best-of-the-visualisation-web-january-2014/

Workbook Tools for Tableau
http://powertoolsfortableau.com/tableau-workbooks/workbook-tools/

Tapestry Data Storytelling Conference
http://www.tapestryconference.com/attendees
http://www.visualisingdata.com/index.php/2014/03/a-short-reflection-about-tapestry-conference/

URL Parameters in Tableau
http://interworks.co.uk/business-intelligence/url-parameters-tableau/

Magic Quadrant 2014 for Business Intelligence and Analytics Platforms
http://www.gartner.com/technology/reprints.do?id=1-1QLGACN&ct=140210&st=sb

What’s Next in Big Data: Visualization That Works the Way the Eyes and Mind Work
http://insights.wired.com/profiles/blogs/what-s-next-in-big-data-visualization-that-works-the-way-the-eyes#axzz2wPWAYEuY

What animated movies can teach you about data analysis
http://www.cio.com.au/article/539220/whatanimatedmoviescanteachaboutdata_analysis/

Tableau for Mac is coming, finally
http://www.geekwire.com/2014/tableau-mac-coming-finally/

Authenticating an External Tableau Server using SAML & AD FS
http://www.theinformationlab.co.uk/2014/02/04/authenticating-external-tableau-server-using-internal-ad/

Visualize this: Tableau nearly doubled its revenue in 2013
http://gigaom.com/2014/02/04/visualize-this-tableau-nearly-doubled-its-revenue-in-2013/

Qlik Announces Fourth Quarter and Full Year 2013 Financial Results
http://investor.qlik.com/releasedetail.cfm?ReleaseID=827231

InTheMiddleOfWinter2

Tableau Mapping – Earthquakes, 300,000,000 marks using Tableau 8.1 64-bit
http://theywalkedtogether.blogspot.com/2014/01/tableaumapping-earthquakes-300000000.html

Data Science: What’s in a Name?
http://www.linkedin.com/today/post/article/20130215205002-50510-the-data-scientific-method

Gapminder World Offline
http://www.gapminder.org/world-offline/

Advanced Map Visualisation in Tableau using Alteryx
http://www.theinformationlab.co.uk/2014/01/15/DrawingArrowsinTableau

Motion Map Chart
http://apandre.wordpress.com/2014/01/12/motion-map-chart/

One of Bill Gates’s favorite graphs redesigned
http://www.perceptualedge.com/blog/?p=1829

Authentication and Authorization in Qlikview Server
http://community.qlik.com/blogs/qlikviewdesignblog/2014/01/07/authentication-and-authorization

SlopeGraph for QlikView (D3SlopeGraph QlikView Extension)
http://www.qlikblog.at/3093/slopegraph-for-qlikview-d3slopegraph-qlikview-extension/

Revenue Model Comparison: SaaS v. One-Time-Sales
http://www.wovenware.com/blog/2013/12/revenue-model-comparison-saas-v-one-time-sales#.UyimffmwIUo

Scientific Data Has Become So Complex, We Have to Invent New Math to Deal With It
http://www.wired.com/wiredscience/2013/10/topology-data-sets/all/

Posting data to the web services from QlikView
http://community.qlik.com/docs/DOC-5530

It’s your round at the bar
http://interworks.co.uk/tableau/radial-bar-chart/

Lexical Distance Among the Languages of Europe
http://elms.wordpress.com/2008/03/04/lexical-distance-among-languages-of-europe/

SnowInsteadOfRainJan2014-SNOW

For this weekend I have 2 guest bloggers (one yesterday and the other today) sharing their thoughts about Cloud Services for BI and DV. I myself recently published a few articles on this topic, for example here: http://apandre.wordpress.com/2013/08/28/visualization-as-a-service/ and here:

http://apandre.wordpress.com/2013/12/14/spotfire-cloud-pricing/ . My opinions may differ from those of my guest bloggers. You can find many providers of DV and BI Cloud Services, including Spotfire Cloud, Tableau Online, GoodData, Microstrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB, etc.

Let me introduce my 2nd guest blogger for this weekend: Ugur Kadakal, the CEO and founder of Pagos, Inc., located in Cambridge, MA. Pagos is the developer of SpreadsheetWEB, which transforms Excel spreadsheets into web-based Business Intelligence (BI) applications without any programming. SpreadsheetWEB can also convert PowerPivot files into web-based dashboards, and it provides advanced Data Visualization (DV) on top of SQL Server Analysis Services (Tabular) cubes without SharePoint. Mr. Kadakal has published a few articles on this blog before, with great feedback, so he is a serial guest blogger.

SaaSCost

Before (or after) you read Mr. Kadakal’s article, I suggest reviewing the article comparing 5+ revenue scenarios for a Cloud Service vs. a traditional one-time sale of software; see it here: http://www.wovenware.com/blog/2013/12/revenue-model-comparison-saas-v-one-time-sales#.UyikEfmwIUp . The illustration above is from that article.
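To make that comparison concrete, here is a minimal sketch with made-up prices (not the figures from the linked article): cumulative vendor revenue from one customer under a perpetual-license model with annual maintenance versus a monthly subscription:

```python
# Minimal sketch with made-up prices (not the figures from the linked article):
# cumulative vendor revenue from a single customer under two models.

ONE_TIME_LICENSE = 2_000                       # hypothetical perpetual license price
ANNUAL_MAINTENANCE = 0.20 * ONE_TIME_LICENSE   # assumed ~20% yearly maintenance
MONTHLY_SUBSCRIPTION = 100                     # hypothetical SaaS price per month

def perpetual_revenue(years: int) -> float:
    """One-time sale up front, plus maintenance every year."""
    return ONE_TIME_LICENSE + ANNUAL_MAINTENANCE * years

def saas_revenue(years: int) -> float:
    """Subscription only: revenue accrues month by month."""
    return MONTHLY_SUBSCRIPTION * 12 * years

for years in range(1, 6):
    print(f"Year {years}: perpetual ${perpetual_revenue(years):,.0f} "
          f"vs. SaaS ${saas_revenue(years):,.0f}")
# With these assumptions the subscription overtakes the one-time sale
# between years 2 and 3, which is the crossover shape the article discusses.
```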

Traditional BI versus Cloud BI

Over the past several years, we have been witnessing numerous transformations in the software industry, from the traditional on-premise deployment model to the Cloud. There are some application types for which the cloud makes a lot of sense, while for others it doesn’t. BI is somewhere in between.

Before I express my opinion on the subject of Traditional BI versus Cloud BI, I would like to clarify my definitions. I define traditional BI as large enterprise implementations which connect to many data sources in real time. These projects have many phases, require large teams, and can take years and cost millions of dollars to implement.

Many people define cloud BI as deployments on a proprietary, third-party, multi-tenant environment managed by a vendor. My definition is somewhat different and broader. Cloud BI is more about ease of deployment, use and management. While Cloud BI can be hosted and managed by a vendor, it can also be deployed on a private cloud infrastructure like Amazon or Microsoft Azure. With the advancement of cloud infrastructure technologies like OpenStack, deploying and managing a private cloud infrastructure is becoming easier for many enterprises. As a result, whether Cloud BI is deployed on a multi- or single-tenant environment on vendor infrastructure, on a third-party cloud infrastructure like Amazon, Azure, etc., or on an internal private cloud becomes more of a business decision than a technical limitation.

DataCloud

One main distinction between Traditional BI and Cloud BI is data management. Traditional BI implementations can have real-time data, since they can connect to the original data sources directly. I don’t believe that Cloud BI should deal with real-time data, even if implemented on internal private cloud infrastructure. Supporting real-time data is a requirement that makes any BI project complicated and costly. Hence Cloud BI solutions should include simple utilities (e.g., ETL) residing on local computers that periodically push internal data into the Cloud BI data model. Since Cloud BI should not deal with real-time data scenarios, this data synchronization can be configured by the business user accordingly.
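A minimal sketch of such a local push utility is below; the upload URL, token and file name are hypothetical (a real Cloud BI product would supply its own upload API), and the refresh interval is the kind of setting a business user would configure:

```python
import time
import requests  # assumes the 'requests' package is installed

# Hypothetical endpoint and credentials of a Cloud BI upload API (placeholders).
UPLOAD_URL = "https://cloud-bi.example.com/api/datasets/sales/upload"
API_TOKEN = "replace-with-real-token"
LOCAL_EXTRACT = "sales_extract.csv"      # produced by a simple local ETL step
SYNC_INTERVAL_SECONDS = 6 * 60 * 60      # periodic, business-user-configurable cadence

def push_extract() -> None:
    """Upload the latest local extract into the Cloud BI data model."""
    with open(LOCAL_EXTRACT, "rb") as f:
        response = requests.post(
            UPLOAD_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            files={"file": f},
            timeout=60,
        )
    response.raise_for_status()

if __name__ == "__main__":
    while True:              # deliberately periodic, not real-time
        push_extract()
        time.sleep(SYNC_INTERVAL_SECONDS)
```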

Another distinction is ease of implementation. Regardless of where it is deployed, a Cloud BI solution should take no more than a few hours to implement and configure. Some BI vendors already provide machine images on the Amazon cloud to simplify this process.

The traditional BI model typically requires significant upfront investment. Part of this investment is internal, while the rest is BI licensing and implementation fees. But the very nature of Cloud BI requires agility, from deployment to data management and dashboard creation. A Cloud BI project can be deployed easily, and it can also be modified and shut down with equal ease. Hence the traditional business model of large upfront investments doesn’t make sense here. The Cloud BI business model should be subscription based, regardless of whether it is implemented on vendor infrastructure or on an on-premise private cloud infrastructure. Customers should be able to pay for what they use and for how long they use it. Such simplicity will also eliminate the vendor lock-in risks that most enterprises have to mitigate.

DVinCloud2

In summary, there are many BI projects that will require a traditional BI implementation. These projects typically require real-time data and connectivity to many different data sources. Cloud BI should not attempt to handle these types of projects. But there are many other BI projects that require neither real-time data nor data spread across many different systems that must be connected. Cloud BI can handle these projects quickly and cost-effectively, by empowering business users to manage the whole process without IT or external support. From discovery to data synchronization to dashboard creation and management, every activity can be handled by business users.

For this weekend I have 2 guest bloggers (one today and the second tomorrow) sharing their thoughts about Cloud Services for BI and DV. I myself recently published a few articles on this topic, for example here: http://apandre.wordpress.com/2013/08/28/visualization-as-a-service/ and here:

http://apandre.wordpress.com/2013/12/14/spotfire-cloud-pricing/ . My opinions may differ from those of my guest bloggers (see my comment below this article). You can find many providers of DV and BI Cloud Services, including Spotfire Cloud, Tableau Online, GoodData, Microstrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB, etc.

Let me introduce my 1st guest blogger for this weekend: Mark Flaherty, Chief Marketing Officer at InetSoft Technology, a BI (Business Intelligence) software provider founded in 1996 and headquartered in Piscataway, New Jersey, with over 150 employees worldwide. InetSoft’s flagship BI application, Style Intelligence, enables self-service BI spanning dashboarding, reporting and visual analysis for enterprises and technology providers. The server-based application includes a data mashup engine for combining data from almost any data source, and browser-based design tools that power users and developers can use to quickly create interactive DV (Data Visualizations).

DVinCloud

Are public BI cloud services really going to overtake the traditional on-premise deployment of BI tools?

(Author: Mark Flaherty. The text below contains Mark’s opinions, which may differ from opinions expressed on this blog.)

It’s been six years since public BI cloud services came to be. Originally termed SaaS BI, public BI cloud services refers to commercial service providers who host a BI application in the public cloud that accesses corporate data housed in the corporate private cloud and/or other application providers’ networks. As recently as last month, an industry report from TechNavio said, “the traditional on-premise deployment of BI tools is slowly being taken over by single and multi-tenant hosted SaaS.” I have a feeling this is another one of those projections that copies a historical growth rate forward for the next five years. If you do that with any new offering that starts from zero, you will always project it to dominate a marketplace, right?

I thought it would be interesting to discuss why I think this won’t happen.

DVinCloud3

In general, there is one legitimate driving force for why companies look to cloud solutions that helps drive the demand for cloud BI services specifically: outsourcing of IT. The types of companies for whom this makes the most sense are small businesses. They have little or no IT staff to set up and support enterprise software, and they also have limited cap-ex budgets so software rentals fit their cash flow structure better. While this is where most of the success for cloud BI has happened, this is only a market segment opportunity. By no means do small companies dominate the IT marketplace.

Another factor for turning to public cloud solutions is expediency. Even at large companies where there is budget for software purchases, the Business sometimes becomes frustrated with the responsiveness of internal IT, and they look outside for a faster solution. This makes sense for domain-specific cases where there is a somewhat narrow scope of need, and the application and the data are self-contained.  Salesforce.com is the poster child for this case, where it can quickly be set up as a CRM for a sales team. Indeed the fast success of salesforce.com is a big reason why people think cloud solutions will take off in every domain.

But business intelligence is different. A BI tool is meant to span multiple information areas, from finance to sales to support and more. This is where it gets complicated for mid-sized and global enterprises. The expediency factor is nullified because the data that business users want to access with their cloud BI tool is controlled by IT, so they need to be involved. Depending on the organization’s policies and politics, this can either slow down such a move or kill it.

The very valid reason why enterprise IT would kill the idea of a public cloud BI solution is ultimately why I think public BI cloud services have such a limited opportunity in the overall market. One of IT’s responsibilities is ensuring data security, and they will rightly point out the security risks of opening access to sensitive corporate data to a 3rd party. It’s one thing to trust a vendor with one set of data, like website visitor traffic, but trusting them with all of a company’s financial and customer data is where almost all companies will draw the line. This is a concern I don’t see ever going away.

What are some pieces of evidence that public BI cloud services have a limited market opportunity? When BI cloud services first came onto the scene, all of the big BI vendors dabbled in them. Now many no longer champion these hosted offerings, or they have shuttered or demoted them. IBM’s Cognos Express is now only an on-premise option. SAP BusinessObjects BI OnDemand can’t be found on SAP’s main site, but has its own micro-site. Tibco’s Spotfire Cloud and Tableau Software’s Tableau Online are two exceptions among the better-known BI providers that are still prominently marketed. However, Tibco positions this option for small businesses and workgroups and omits certain functionality.

Our company, too, experimented with a public BI cloud offering years ago. It was first targeted at salesforce.com customers who would want to mash up their CRM data with other enterprise-housed data. We found mostly small, budget-challenged companies in that customer base, and the few large enterprises we found balked at the idea, asking instead for our software to be installed on-premise, where they would connect to any cloud-hosted data on their own. Today the only remaining cloud offering of ours is a free visualization service called Visualize Free, which is similar to Tableau Public or IBM’s Many Eyes.

Another observation: while there have been a handful of pure-play cloud BI vendors, one of them, named “LucidEra,” came and went quite quickly. Birst is one that seems to have found a successful formula.

In summary, yes, there is a place for public BI cloud services in the small business market, but no, it’s not going to overtake traditional on-premise BI.

GoogleDataCenterInGeorgiaWithCloudsAboveIt2

For the last 6 years, each and every February my inbox has been bombarded with messages from colleagues, friends and visitors to this blog, containing references, quotes and PDFs of Gartner’s Magic Quadrant (MQ) for Business Intelligence (BI) and Analytics Platforms; the latest can be found here: http://www.gartner.com/technology/reprints.do?id=1-1QLGACN&ct=140210&st=sb .

Last year I was able to ignore this noise (funnily enough, I was busy migrating thousands of users from Business Objects and Microstrategy to Tableau-based visual reports for a very large company), but in February 2014 I got so many questions about it that I am basically forced to share my opinion.

  • 1st of all, as I have said on this blog many times, BI is dead and has been replaced by Data Visualization and Visual Analytics. That was finally acknowledged by Gartner itself, which placed Tableau, QLIK and Spotfire in the “Leaders” quadrant of the MQ for the 2nd year in a row.

  • 2ndly, the last 6 MQs (2009-2014) are suspicious to me because in all of them Gartner (with complete disregard for reality) placed all 6 “misleading” vendors of wasteful BI platforms (IBM, SAP, Oracle, SAS, Microstrategy and Microsoft) in the Leaders quadrant! Over the last 6 years those 6 vendors convinced customers to buy over $60B of their BI software, and much more than that was spent on maintenance, support, development, consulting, upgrades and other IT expenses.

There is nothing magic about these MQs: they are the result of Gartner’s 2-dimensional understanding of BI, Analytics and Data Visualization (DV) platforms, features and usage. The 1st measure (X axis), according to Gartner, is “Completeness of Vision” and the 2nd measure (Y axis) is “Ability to Execute”, which allows DV and BI vendors to be distributed among 4 quadrants: top-right – “Leaders”, top-left – “Challengers”, bottom-right – “Visionaries” and bottom-left – “Niche Players” (or, you could say, leftovers).
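The quadrant assignment itself is trivial once you accept the two scores; here is a minimal sketch of that logic (the vision/execution scores below are invented placeholders, not Gartner’s ratings):

```python
# Minimal sketch of the Magic Quadrant placement logic.
# The (vision, execution) scores are invented placeholders, not Gartner's numbers.
vendors = {
    "Tableau":  (0.9, 0.9),
    "QLIK":     (0.8, 0.8),
    "Spotfire": (0.7, 0.7),
    "Vendor X": (0.3, 0.8),   # hypothetical: strong execution, weak vision
    "Vendor Y": (0.8, 0.3),   # hypothetical: strong vision, weak execution
}

def quadrant(vision: float, execution: float, threshold: float = 0.5) -> str:
    """Map Completeness of Vision (X) and Ability to Execute (Y) to a quadrant."""
    if vision >= threshold and execution >= threshold:
        return "Leader"          # top-right
    if execution >= threshold:
        return "Challenger"      # top-left
    if vision >= threshold:
        return "Visionary"       # bottom-right
    return "Niche Player"        # bottom-left

for name, (v, e) in vendors.items():
    print(f"{name:10s} -> {quadrant(v, e)}")
```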

mq2014

I decided to compare my opinions (expressed on this blog many times) with Gartner’s (they wrote 78 pages about it!) by taking the TOP 3 Leaders from Gartner, then the TOP 3 Visionaries (projecting onto the X axis all vendors except the TOP 3 Leaders), then the TOP 3 Challengers (projecting onto the Y axis all vendors except the TOP 3 Leaders and TOP 3 Visionaries), then the TOP 3 “Niche Players” from the rest of Gartner’s list, and then making “similar” choices myself (my list is wider than Gartner’s, because Gartner missed DV vendors that are important to me, like Visokio, while vendors like Datawatch and Advizor Solutions are not included in the MQ in order to please Gartner’s favorites); see the comparison of opinions below:

12DVendors

As you may have noticed, in order to compare my opinion I had to use Gartner’s terms like Leader, Challenger, etc., which is not exactly how I see it. Basically my opinion overlaps with Gartner’s in only 25% of cases in 2014, which is slightly higher than in previous years – I guess the success of Tableau and QLIK is a reason for that.

The BI market reached $14B in 2013, and at least $1B of it was spent on Data Visualization tools. Here is a short summary of the state of each vendor mentioned above in the “DV Blog” column:

  1. Tableau: $232M in sales, $6B MarketCap, 82% YoY growth (fastest in the DV market), the leader in DV mindshare; its declared goals are “Data to the People” and ease of use.

  2. QLIK: $470M in sales, $2.5B MarketCap, the leader in DV marketshare; attempts to improve BI, but will remove Qlikview Desktop from Qlik.Next.

  3. Spotfire: sales under $200M; has the most mature platform for Visual Analytics and the best DV cloud services. Spotfire is limited by its corporate parent (TIBCO).

  4. Visokio: a private DV vendor with limited marketing and sales but some of the richest and most mature DV functionality.

  5. SAS: has the most advanced analytics functionality (not easy to learn and use); targets Data Scientists and power users who can afford it instead of free R.

  6. Revolution Analytics: as the provider of a commercial version of, and commercial support for, the R library, it is a “cheap” alternative to SAS.

  7. Microsoft: has the most advanced BI and DV technology stack for software developers, but has no real DV product and no plan to have one in the future.

  8. Datawatch: $33M in sales, $281M MarketCap; has mature DV, BI and real-time visualization functionality, plus an experienced management and sales force.

  9. Microstrategy: $576M in sales, $1.4B MarketCap; a BI veteran with complete BI functionality; recently realized that the BI market is not growing and made a desperate attempt to get into the DV market.

  10. Panorama: a BI veteran with an excellent, easy-to-use front-end to the Microsoft BI stack; has good DV functionality and social and collaborative BI features.

  11. Advizor Solutions: a private DV veteran with an almost complete set of DV features and the ability to do Predictive Analytics interactively, visually and without coding.

  12. RapidMiner: a commercial provider of an open-source-based, easy-to-use advanced analytics platform, integrated with R.

A similar MQ for “Advanced Analytics Platforms” can be found here: http://www.gartner.com/technology/reprints.do?id=1-1QXWEQQ&ct=140219&st=sg – have fun:

mq2014aap

In addition to the differences mentioned in the table above, I have to say that I do not think Big Data is defined well enough to be mentioned 30 times in a review of “BI and Analytics Platforms”, and I do not see that the vendors mentioned by Gartner are ready for it – but maybe that is a topic for a different blogpost…

Update: 

We were told (5+ months ago) what to expect from Tableau 8.2 (originally, at TCC13, they said the release could come before the end of winter 2014; however, in the latest earnings call here: http://seekingalpha.com/article/1994131-tableau-softwares-ceo-discusses-q4-2013-results-earnings-call-transcript the CEO acknowledged the delay: 8.2 in Q2 of 2014, and v.9 in the “first half of 2015”, many months later than the original plan), including:

  • Tableau for Mac (very timely, at a time when QLIK is about to abandon the Qlikview Desktop in favor of an HTML5 client),
  • Story Points (a new type of worksheet/dashboard with mini-slides as story points, so bye-bye to PowerPoint),
  • seamless access to data via a data connection interface for visually building a data schema, including inner/left/right/outer joins,
  • the ability to beautify column names.


I am sure Tableau already has a roadmap for Tableau 9 and beyond, but I have accumulated a list of wishes for it (maybe it is not too late to include some of them in the roadmap?). This wishlist is more about the backend than about front-end eye candy (the nature of the large enterprise dictates that). Here it is:

  • Visual ETL functionality and data quality validation/cleaning;
  • (thanks to Larry Keller): an enterprise repository for pre-validated, sharable, regularly refreshed Data Extracts, Data Connections and Data Sources;
  • the ability to collect data automatically (say, machine-generated and/or transactional data) and visually (say, from humans filling in data-entry forms), both tied to predefined and/or modifiable Data Extracts;
  • Visual Data Modeling;
  • a free Tableau Reader for Mac (since we are getting Tableau Desktop for Mac in Tableau 8.2 anyway), iOS, Android and Linux;
  • real-time visualization and support for Complex Event Processing (CEP), visual alerts and alarms (Spotfire and Datawatch have it!);
  • scripting for visual predictive modeling and visual data mining, with the ability to do it in a visual IDE with minimal coding;
  • better integration with R (the current integration is limited to 4 functions passing parameters to an R server), with a visual IDE and minimal or no coding;
  • enterprise-wide source control and change management;
  • please allow sharing Data Visualizations (read-only) from Tableau Online for free (learn from Spotfire Cloud – it is called a Public Folder!); otherwise people will lean too heavily on the free Tableau Reader. Currently, in order to access workbooks published on Tableau Online, Tableau by default requires an extra subscription, which is wrong from my point of view, because one could simply publish them to a Public Folder on such a site (similar to what Spotfire Cloud does). By default Tableau Online does not allow the use of a Public Folder, which contradicts the spirit of Tableau Reader and creates unnecessary negative feelings toward Tableau;
  • enterprise-wide reuse of workbooks, visual designs, etc.

preTableau

Since Tableau is going into the enterprise at full speed (money talks?), it needs to justify its pricing for Tableau Server, especially if Tableau wishes to stay there for long. Feel free to add to this list (use comments or email for that). The first additions arrived a few hours after I posted the wishlist above, from Mr. Damien Lesage; see his 3 additions below and his entire comment under this blogpost:

  • Tableau Server for Linux (I have actually advocated this for a while, since Microsoft changed its Client Access Licensing for Windows Server 2012, making CALs more expensive – it now looks to me like unwarranted taxation). For comparison, Spotfire Server for Linux and Solaris has existed for years: http://support.spotfire.com/sr_spotfireserver60.asp , and it is one of the reasons why large enterprises may choose Spotfire over Tableau or Qlikview;
  • extra visualization capabilities: hierarchical, network and graph representations of data (do we need Stephen Few’s approval for that?);
  • the ability for the extract engine to distribute extracts between different servers, so they load more quickly and support bigger datasets (I suggest the additional ability to do it on workstations too, especially those with Tableau Desktop installed, which means they have the TABLEAU.COM executable installed anyway).

A suggestion from Mike Borner (see his comment below):

  • the ability to report on metadata/calculated fields.

Now I can extend my best wishes to you into 2015, due to the delay of Tableau 9!
