Tableau


Visitors to this blog keep asking me to estimate Tableau Software prices (including for Tableau Online), even though Tableau published all non-server prices on its website here: https://tableau.secure.force.com/webstore . However, this does not include discounts (especially for enterprise-volume purchases), pricing for servers of any kind (at least 2 kinds of server licenses exist), or pricing for consulting and training.

Thanks to the website of Tableau Partner “Triad Technology Partners” we have a good estimate of all Tableau prices (they are always subject to negotiation) in the form of the so-called GSA Schedule (General Services Administration, Federal Acquisition Service, Special Items: No. 132-33 Perpetual Software Licenses, No. 132-34 Maintenance of Software as a Service, No. 132-50 Training Courses) for Tableau Software Products and Services, see it here:

http://www.triadtechpartners.com/vendors/tableau-software/ , TRIAD’s other GSA contracts here (for example, they include prices for IBM Cognos and others):
http://www.triadtechpartners.com/contracts/ and the specific Tableau prices here:
http://www.triadtechpartners.com/wp-content/uploads/Tableau-GSA-Price-List-April-2013.pdf

I grouped Tableau’s prices (please keep in mind that TRIAD published this GSA Schedule in April 2013, so these are 1-year-old prices, but they are good enough for estimating purposes) into the 5 groups below: Desktop, Server with licensing for Named Users (makes sense if you have fewer than a hundred “registered” users), Core Licenses for Tableau Server (recommended when you have more than 150 “registered” users), Consulting, and Training prices:

The Google Sheet for the spreadsheet above is here:
https://docs.google.com/spreadsheets/d/1oCyXRR3B6dqXcw-8cE05ApwsRcxckgA6QdvF9aF6_80/edit?usp=sharing
and an image of it – for those with misbehaving browsers – is below:
TableauPrices2013

Again, please keep in mind that the above is just an estimate of prices (except for Tableau Online), based on the 2013 GSA Schedule, and a good negotiator can always get a good discount (I got one each time I tried).

A note about the choice between a Core License and a Server License with Named Users: I know organizations that chose to keep Named User licensing instead of switching to a Core License even with more than 300 registered users, because it allows them to use much more capable hardware (with many more CPU Cores).

We were told (5+ months ago) what to expect from Tableau 8.2 (originally @TCC13 they said the release could come before the end of the winter of 2014; however, in the latest Earnings Call here: http://seekingalpha.com/article/1994131-tableau-softwares-ceo-discusses-q4-2013-results-earnings-call-transcript the CEO acknowledged the delay: 8.2 in Q2 of 2014, and v.9 in the “first half of 2015″, many months later than the original plan), including:

  • Tableau for Mac (very timely at a time when QLIK is about to abandon the Qlikview Desktop in favor of an HTML5 Client),
  • Story Points (a new type of worksheet/dashboard with mini-slides as story points, so bye-bye to PowerPoint),
  • seamless access to data via a data connection interface to visually build a data schema, including inner/left/right/outer joins,
  • the ability to beautify column names.

306151016_640

I am sure Tableau already has a Roadmap for Tableau 9 and beyond, but I have accumulated a list of wishes for it (maybe it is not too late to include some of them in the Roadmap?). This Wishlist is more about the backend than about front-end Eye Candy (the nature of the Large Enterprise dictates that). Here it is:

  • Visual ETL functionality and Data Quality Validation/Cleaning;
  • (thanks to Larry Keller): Enterprise Repository for pre-Validated Sharable Regularly Refreshed Data Extracts, Data Connections and Data Sources;
  • Ability to collect Data automatically (say Machine-generated and/or transactional Data) and Visually (say from Humans filling in Data-Entry Forms), both tied to already predefined and/or modifiable Data Extracts;
  • Visual Data Modeling;
  • Free Tableau Reader for Mac (since we are going to have Tableau Desktop for Mac in Tableau 8.2 anyway), iOS, Android and Linux;
  • Real-Time Visualization, support (Spotfire and Datawatch have it!) for Complex Event Processing (CEP), Visual Alerts and Alarms;
  • Scripting for Visual Predictive Modeling and Visual Data Mining, with the ability to do it in a Visual IDE with minimal Coding;
  • Better integration with R (the current integration is limited to 4 functions passing parameters to an R Server), with a Visual IDE and minimal or NO Coding.
  • Enterprise-wide source control and change management.
  • Please allow sharing of Data Visualizations (read-only) from Tableau Online for free (learn from Spotfire Cloud – it is called a Public Folder!); otherwise the free Tableau Reader will end up carrying too much of that load. Currently, in order to access workbooks published on Tableau Online, Tableau by default requires an extra subscription, which is wrong from my point of view, because such workbooks could simply be published to a Public Folder on the site (similar to what Spotfire Cloud does). By default Tableau Online does not allow the usage of a Public Folder, which contradicts the spirit of Tableau Reader and creates unnecessary negative feelings toward Tableau.
  • Enterprise-wide reuse of workbooks and visual designs etc.

preTableau

Since Tableau is going into the enterprise at full speed (money talks?), it needs to justify its pricing for Tableau Server, especially if Tableau wishes to stay there for long. Feel free to add to this list (use comments or email for it). The first addition came a few hours after I posted the Wishlist above, from Mr. Damien Lesage; see his 3 additions below and his entire comment at the bottom of this blogpost:

  • Tableau Server for Linux (I have actually advocated this for a while, since Microsoft changed its Client Access Licensing for Windows Server 2012 and made CALs more expensive – it now looks to me like unwarranted taxation). For comparison, Spotfire Server for Linux and Solaris has existed for years: http://support.spotfire.com/sr_spotfireserver60.asp , and it is one of the reasons why large enterprises may choose Spotfire over Tableau or Qlikview;
  • Extra visualization capabilities: hierarchical, network and graph representations of data (do we need the approval of Stephen Few for that?);
  • Ability for the extract engine to distribute extracts between different servers, to load them more quickly and support bigger datasets (I suggest an additional ability to do it on workstations too, especially those with Tableau Desktop installed, since they already have the Tableau executable anyway)

A suggestion from Mike Borner (see his comment below):

  • ability to report metadata/calculated fields

Now I have to extend my best wishes for you into 2015, due to the delay of Tableau 9!


Tableau Software (symbol DATA) did something that nobody, or almost nobody, in the BI and/or Data Visualization (DV) field has done before at this size of Revenue or larger. In the last Quarter of its 2013 Fiscal Year (reported last week), Tableau increased its Year-over-Year growth for both the Quarter (95%) and the full Year (82%, way above all DV and BI competitors) while dramatically increasing its Revenue to $232M per Year, see it here: http://investors.tableausoftware.com/investor-news/investor-news-details/2014/Tableau-Announces-Fourth-Quarter-and-Full-Year-2013-Financial-Results/default.aspx.

In the diagram below you can compare the growth of 3 competitors over the last 6 years (2008-2013; Spotfire sales are unavailable since TIBCO (symbol TIBX) bought it): BI veteran Microstrategy (bluish line, slowing down over the last 2+ years), the largest DV vendor Qliktech (symbol QLIK, red line, decreasing Year-over-Year growth) and the fastest-growing DV Vendor Tableau (yellow line, with record Year-over-Year growth):

DVMomentum2008_2013a

Tableau stock has been overpriced since its IPO and still is (e.g. today EPS is -0.19 and the P/E ratio is very high, see it here: http://ycharts.com/companies/DATA/pe_ratio). If you follow Warren Buffett (Buy Low, Sell High), today is a good day to sell DATA stock, unless you intend to hold it for a long time or forever. However, many people ignore Warren, and the buying volume for the last few days was above average (780K for DATA) and above 1 million shares per day (e.g. on 2/5/14 it was 4.4M shares). On OpenInsider you can find at least 2 people who agreed with Warren and sold 700,000 Tableau shares during the last few days for a total of $62M+ (guess who they can be? Chris and Christian – part of the 1% since the 5/17/13 IPO…):

http://openinsider.com/screener?fd=0&td=365&s=DATA&o=&sicMin=&sicMax=&t=s&minprice=&maxprice=&v=0&sortcol=0&maxresults=500

As a result, $DATA (Tableau’s symbol) jumped $10+ from an already overvalued share price to $97+ after 2/14/14; today it added another $5 to the share price (click on the image below to enlarge it) and keeps going up:

DATAvsQLIKvsTIBXvsDWCH_110413to021414

By the end of 2/14/14 Tableau’s Market Capitalization went over $5.96B, more than twice Qliktech’s MarketCap (which is almost the same as a year ago) and $2B more than TIBCO’s MarketCap (which is also almost the same as a year ago)! Basically, Tableau’s MarketCap as of the end of today's trading is almost the same as the combined MarketCap of QLIK and TIBX.

For me the more important indicator of a company’s growth is its “HRI” (Hiring Rate Indicator – the ratio of the number of open positions to the number of Full-Time employees of the company). As of today, Tableau has 216 job openings (the current estimate is about 1100 employees), Qliktech has 101 openings (while employing 1700 people) and Spotfire has about 34 open positions (the current number of Spotfire employees is difficult to estimate because Spotfire sits completely inside TIBCO, but it is probably still below 500). That means Tableau’s HRI is 19.6%, Qliktech’s HRI is 5.9% and Spotfire’s HRI is below 6.8%.
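A quick sketch of the HRI arithmetic (using the openings and headcount estimates quoted above):

```python
# Hiring Rate Indicator (HRI) = open positions / full-time employees
companies = {
    "Tableau":  {"openings": 216, "employees": 1100},
    "Qliktech": {"openings": 101, "employees": 1700},
    "Spotfire": {"openings": 34,  "employees": 500},  # 500 is the author's rough estimate
}

for name, c in companies.items():
    hri = c["openings"] / c["employees"]
    print(f"{name}: HRI = {hri:.1%}")
# Tableau ~19.6%, Qliktech ~5.9%, Spotfire ~6.8% with the 500-employee estimate
```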

This is a repost from the Data Visualization Consulting page.

Visitors to this blog have generated a lot of requests for my Data Visualization “Advice” (small projects of a few hours or days, no NDA [Non-Disclosure Agreement] involved), for Data Visualization Consulting projects (a few weeks or months; I tend to avoid NDAs as they can interfere with my blogging activities) and even for Full-time work (for example, I got my latest full-time job because my employer often visited and read my blog; an NDA was needed).

Additionally, I sometimes do free-of-charge work, if the projects involved are short, extremely interesting to me and beneficial for my Data Visualization Blog, like this project:

https://apandre.wordpress.com/2014/01/12/motion-map-chart/

Obviously all these projects can be done only when I have spare time left over from full-time work and/or other projects, duties and activities.

I also cannot relocate or travel, so I do this mostly from my home office – telecommuting (RDP, Skype, phone, WebEx, GoToMeeting etc.) – or, if the client is local to Massachusetts, I can sometimes visit the Client’s site. See below the Map of my Local “Service Area” – the part of Middlesex County between Routes 495, 3 and 20 – where I can commute to a Client’s Location (please click on the map below to enlarge the image):

DVServiceArea

If I do have time for short-term advisory projects (from 2 hours to 2 weeks), clients usually pay the highest rate, similar to what Qliktech, Spotfire, Tableau or IBM charge for their Consulting Services (I consider my consulting a better service than theirs…). If you go to this thread on the Tableau Community:

http://community.tableausoftware.com/thread/127338 you will find these Indicative Rates for Tableau Consulting Work (Qlikview and Spotfire Rates are very similar):

Low $125,  Max $300,  Average around $175 per hour.

Here are the most popular requests for my Advisory work:

  • Visual Design and Architectural Advice for Monitoring or Operational Dashboard(s);
  • Review of Data Visualization Work done by my Clients;
  • Prototyping of Data Visualizations (most requested by my visitors);
  • My opinion on the Strengths and Weaknesses of a Data Visualization Vendor/Product, requested by traders, portfolio or hedge fund manager(s);
  • Advice about what Hardware to buy (say, to get the most from the Tableau License the client has);
  • Advice on what Charts and Filters to use for a given Dataset and Business Logic;
  • Technical Due Diligence on Data Visualization Startup for Venture Capitalists investing into that Start-up.
  • Etc…

3Paths4Options

For mid-size projects (from 2 weeks to 6 months) clients get a “Progressive” discount – the longer the project, the larger the discount. Here are the most popular requests for my Consulting Data Visualization Work:

  • Comparing one Data Visualization Product vs. another Visualization Product for the specific Client’s needs and projects;
  • Comparing the Client’s Visualization Product vs. Competitor(s)’ Visualization Product(s) (most requested);
  • Benchmarking one or more Visualization Product(s) vs. specific data and application logic.
  • Managing a Client's migration of their Reporting and Analytical IT Infrastructure from obsolete BI Platforms like Business Objects, Cognos and Microstrategy to modern Data Visualization Environments like Tableau, Qlikview and Spotfire.
  • Etc.

Solution

Full-time work (engagements of 1 year or more) is not exactly Consulting but a Full-time job, when clients ask me to join their company. These jobs are similar to what I had in the past: Director of Visual Analytics, Data Visualization Director, VP of Data Visualization, Principal Data Visualization Consultant, Tableau Architect etc. Here are samples of full-time projects:

  • Created, Maintained and Managed the Data Visualization Consulting Practice for my company/employer;
  • Led the growth of a Data Visualization Community (the latest example – a 4000-strong Tableau Community) with its own Blog, Portal and User Group behind the corporate firewall; created Dozens of near-real-time Monitoring Dashboards for Analytical and Data Visualization Communities;
  • Personally Designed and Implemented hundreds of Practical Data Visualizations and Visual Reports, which led to the discovery of trends, outliers, clusters and other Data Patterns, Insights and Actions;
  • Created hundreds of Demos, Prototypes and Presentations for Business Users;
  • Designed Data Visualization Architecture and Best Practices for Dozens of Analytical Projects;
  • Significantly improved the Mindshare of and increased the Web Traffic to my company's website; Created and Maintained the Data Visualization blog for it.

You can find more observations about the relationship between a Full-Time salary and an Hourly Rate for consulting in my previous post (from 6 months ago) here: https://apandre.wordpress.com/2013/07/11/contractors-rate/

8 years ago Hans Rosling demoed the Motion Chart on TED, using Gapminder’s Trendalyzer. 7 years ago Google bought Trendalyzer and incorporated it into Google Charts.

A while ago, for my own education and for demo purposes, I implemented various Motion Charts using:

To implement a Motion Chart in Tableau, you can use the Pages Shelf and place there either a timing dimension (I used the Dimension “Year” in the Tableau example above) or even Measure Names (Average Monthly Home Value per ZIP Code), as in my implementation of the Motion Map Chart below.

AverageHomeValuePerZipCode

Tableau’s ability to move through pages (automatically when Tableau Desktop or Tableau Reader is in use, and manually when the Data Visualization is hosted by Tableau Server and accessed through a Web Browser) enables us to create all kinds of Motion Charts, as long as the Visualization Author puts onto Pages a Time, Date or Timestamp variable describing a Timeline. For me the most interesting exercise was to turn a Filled Map (a Chart Type supported by Tableau, similar to a Choropleth Map Chart) into a Motion Map Chart; see the result below.

As we all know, 80% of any Data Visualization is Data, and I found an appropriate Dataset @Zillow Real Estate Research here: http://www.zillow.com/blog/research/data/ . The Dataset contains Monthly Sales Data for All Homes (SFR, Condo/Co-op) for the entire US from 1997 until the Current Month (so far for 12604 ZIP Codes, which is only about 25% of all USA ZIP codes) – an average for each ZIP Code area.

This Dataset covers 197 Months and contains about 2.5 million DataPoints. All 5 Dimensions in the Dataset are very “Geographical”: State, County, Metro Area, City and ZIP code (to define the “Region” and enable Tableau to generate Longitude and Latitude), and each record has 197 Measures – the Average Monthly Home Price for the given Region (ZIP Code Area) for each available Month since 1997.
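As a side note, if you prefer to reshape such a wide file (one column per month) into a long format before visualizing it, a minimal pandas sketch could look like the one below; the file name and column names here are assumptions for illustration, not Zillow's exact headers:

```python
import pandas as pd

# Hypothetical column names; the real Zillow CSV headers may differ.
dims = ["State", "County", "Metro", "City", "RegionName"]  # RegionName = ZIP code

wide = pd.read_csv("Zip_Zhvi_AllHomes.csv")  # file name is an assumption
long_format = wide.melt(
    id_vars=dims,
    var_name="Month",            # e.g. "1997-01", "1997-02", ...
    value_name="AvgHomeValue",   # average monthly home value per ZIP code
)
long_format.to_csv("zillow_long.csv", index=False)  # ~2.5M rows: 12,604 ZIPs x 197 months
```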

In order to create the Motion Filled Map Chart, I put Longitude on Columns and Latitude on Rows, Measure Values on Color, Measure Names (except Number of Records) on Pages, State and Measure Names on Filters, State and ZIP code on Detail, and finally the Attribute Values of County, Metro Area and City on Tooltips. I published the result on Tableau Public here:

http://public.tableausoftware.com/views/zhv/ZillowHomeValueByZIP_1997-2013#1 ,

so you can review it online AND you can download it and use it within Tableau Reader or Tableau Desktop as the automated Motion Map Chart.

For Presentation and Demo purposes I created Slides and a Movie (while playing it, don’t forget to set the Video Quality to HD resolution) with a Filled Map Chart colored by Home Values for the entire USA in 2013 as the starting point and with 22 follow-up steps/slides: Zoom to a Northeast Map colored by 2013 Values, Zoom to Southeastern New England 2013, start the Motion from Southeastern New England colored by 1997 Home Values per ZIP Code and then automatic Motion through all years from 1997 to 2014, then Zoom to Eastern Massachusetts and finally Zoom to Middlesex County in Massachusetts; see the movie below:

Here is the content of this video as a presentation with 24 Slides:

Now I think it is appropriate to express my New Year Wish (I have been repeating it for a few years in a row) that Tableau Software Inc. will port the ability to create AUTOMATED Motion Charts from Tableau Desktop and Tableau Reader to Tableau Server. Please!

Selected Tableau Readings after TCC13 (since September 18, 2013)

sometimes reading is better than doing or writing…

0. Top 10 sessions from TCC13:
http://www.tableausoftware.com/about/blog/2013/12/top-10-sessions-tcc13-27292

1. Dual Color Axis:
https://www.interworks.com/blogs/wjones/2013/09/18/create-dual-color-axis-tableau

2. Evaluate models with fresh data using Tableau heat maps:
http://cooldata.wordpress.com/2012/07/12/evaluate-models-with-fresh-data-using-tableau-heat-maps/

3. Tableau Throws a Brick at Traditional BI:
http://www.datanami.com/datanami/2013-09-11/tableau_throws_a_brick_at_traditional_bi.html

4. Easy Empty Local Extracts:
http://www.tableausoftware.com/about/blog/2013/9/easy-empty-local-extracts-25152

5. Tableau 8.1: Sophisticated Analytics for Sophisticated People:
http://www.tableausoftware.com/about/blog/2013/9/tableau-81-sophisticated-analytics-sophisticated-people-25177

6. Tableau 8.1 and R (can be interesting for at least 5% of Tableau users):
http://www.tableausoftware.com/about/blog/2013/10/tableau-81-and-r-25327
also see:
https://www.interworks.com/blogs/trobeson/2013/11/27/using-r-tableau-81-getting-started
and here:
http://www.tableausoftware.com/about/blog/r-integration

7. Tableau, When Are You Going to Fix This?
http://www.datarevelations.com/tableau-when-are-you-going-to-fix-this.html

8. Automated PDF Email Distribution of Tableau Views Using PowerShell and Tabcmd:
http://www.interworks.com/blogs/tladd/2013/08/22/automated-pdf-email-distribution-tableau-views-using-powershell-and-tabcmd

9. Geocoding Addresses Directly in Tableau 8.1 Using Integration with R:
http://www.dataplusscience.com/Geocoding%20in%20Tableau%20using%20R.html

10. Best Practices for Designing Efficient Workbooks (and white Paper about it):
http://www.tableausoftware.com/about/blog/2013/10/best-practices-designing-efficient-workbooks-25391

11. Tableau Mapping Architecture:
http://urbanmapping.com/tableau/mapping-architecture.html

12. Story Points in Tableau 8.2 presentation mode:
http://eagereyes.org/blog/2013/story-points

13. Truly Global: Filtering Across Multiple Tableau Workbooks with the JavaScript API:
https://www.interworks.com/blogs/tladd/2013/10/24/truly-global-filtering-across-multiple-tableau-workbooks-javascript-api

14. Tableau 8.1 Worksheet, Dashboard menus improved, still room for more:
http://tableaufriction.blogspot.com/2013/10/tv81-beta-3-worksheet-dashboard-menus.html

15. Lollipops Charts in Tableau:
http://drawingwithnumbers.artisart.org/lollipops-for-quality-improvement/

16. Was Stephen Few Right?
http://www.datarevelations.com/was-stephen-few-right-my-problems-with-a-companys-iron-viz-competition.html

17. Precision Inputs Required In Addition To Analog Controls:
http://tableaufriction.blogspot.com/2013/11/precision-inputs-required-in-addition.html

18. Google Spreadsheets to Tableau connector – a working driver:
http://community.tableausoftware.com/thread/135281

19. Leveraging Color to Improve Your Data Visualization:
http://www.tableausoftware.com/public/blog/2013/10/leveraging-color-improve-your-data-visualization-2174

20. A Workbook that acts as a container for multiple Tableau-based Charts – 114 Samples and Visualization Types:
http://www.alansmitheepresents.org/2013/07/team-geiger-rides-again.html

21. The New Box-and-Whisker Plot:
http://www.tableausoftware.com/public/blog/2013/11/box-and-whisker-plots-2231

22. The Tableau Workbook Library:
http://www.tableausoftware.com/about/blog/2013/11/tableau-workbook-library-27004

23. Customizing Tableau Server Experience (Parts 1, 1.5, 2):
http://ugamarkj.blogspot.com/2013/11/customizing-tableau-server-experience.html
http://ugamarkj.blogspot.com/2013/12/customizing-tableau-server-experience.html
http://ugamarkj.blogspot.com/2013/12/customizing-tableau-server-experience_15.html

24. SAML Integration in Tableau 8.1:
https://www.interworks.com/blogs/daustin/2013/11/27/saml-integration-tableau-81

25. Tableau file types and extensions:
http://www.theinformationlab.co.uk/2013/12/02/tableau-file-types-and-extensions/

26. Tableau Server XML Information Files: The Master Class:
http://tableaulove.tumblr.com/post/69383091006/tableau-server-xml-information-files-the-master-class

27. Is it Transparency? Is it Opacity? Labeled one, works like the other:
http://tableaufriction.blogspot.com/2013/12/is-it-transparency-is-it-opacity.html

28. Viz Hall of Fame:
http://www.tableausoftware.com/about/blog/2013/12/viz-hall-fame-27270

29. Tableau Weekly Archive:
http://us7.campaign-archive1.com/home/?u=f3dd94f15b41de877be6b0d4b&id=d23712a896

30. 2013 Winners:
http://www.tableausoftware.com/public/blog/2013/12/2013-award-winners-2272
Happy New Year!
2014Cubes

2 months ago TIBCO (Symbol TIBX on NASDAQ) announced Spotfire 6 at its TUCON 2013 user conference. This, as well as the follow-up release (around 12/7/13) of Spotfire Cloud, was supposed to be good for the TIBX price. Instead, since then TIBX has lost more than 8%, while NASDAQ as a whole grew more than 5%:

TIBXvsNasdaqFrom1014To121313

For example, at TUCON 2013 TIBCO’s CEO re-declared the “5 primary forces for the 21st century” (IMHO all 5 “drivers” sound to me like obsolete IBM-ish Sales pitches) – I guess to underscore the relevance of TIBCO’s strategy and products to the 21st century:

  1. Explosion of data (sounds like “the Sun rises in the East”);

  2. Rise of mobility (any kid with a smartphone will say the same);

  3. Emergence of Platforms (not sure if this is a good pitch; at least it was not clear from TIBCO’s presentation);

  4. Emergence of Asian Economies (what else do you expect? This is the side effect of greedy offshoring for more than a decade);

  5. Math trumping Science (Mr. Ranadive and various other TUCON speakers kept repeating this mantra, showing that they think statistics and “math” are the same thing and that they do not know how valuable science can be. I personally think that recycling this pitch is dangerous for TIBCO sales and I suggest replacing this statement with something more appealing and more mature).

Somehow the TUCON 2013 propaganda and the introduction of the new and more capable version 6 of Spotfire and Spotfire Cloud did not help TIBCO’s stock. For example, in trading on Thursday 12/12/13 the shares of TIBCO Software, Inc. (NASD: TIBX) crossed below their 200-day moving average of $22.86, changing hands as low as $22.39 per share, while Market Capitalization was oscillating around $3.9B – basically the same as the capitalization of its competitor Tableau Software, which is 3 times smaller in terms of employees.

As I said above, just a few days before this low TIBX price, on 12/7/13, as promised at TUCON 2013, TIBCO launched Spotfire Cloud and published licensing and pricing for it.

The most disappointing news is that in reality TIBCO withdrew itself from the competition for mindshare with Tableau Public (more than 100 million users, more than 40,000 active publishers and Visualization Authors with a Tableau Public Profile), because TIBCO no longer offers free annual evaluations. In addition, the new Spotfire Cloud Personal service ($300/year, 100GB storage, 1 business author seat) became less useful under the new license, since its Desktop Client has limited connectivity to local data and can upload only local DXP files.

The 2nd Cloud option, called Spotfire Cloud Work Group ($2000/year, 250GB storage, 1 business author / 1 analyst / 5 consumer seats), gives one author an almost complete TIBCO Spotfire Analyst with the ability to read 17 different types of local files (dxp, stdf, sbdf, sfs, xls, xlsx, xlsm, xlsb, csv, txt, mdb, mde, accdb, accde, sas7bdat, udl, log, shp), connectivity to standard Data Sources (ODBC, OleDb, Oracle, Microsoft SQL Server Compact Data Provider 4.0, .NET Data Provider for Teradata, ADS Composite Information Server Connection, Microsoft SQL Server (including Analysis Services), Teradata) and TIBCO Spotfire Maps. It also enables the author to do predictive analytics, forecasting and local R language scripting.

This 2nd Spotfire Cloud option does not reduce Spotfire’s chances to compete with Tableau Online, even though Tableau Online costs 4 times less ($500/year). However (thanks to 2 Blog Visitors – both named Steve – for the help), you cannot use Tableau Online without a licensed copy of Tableau Desktop ($1999 perpetual, non-expiring desktop license with the 1st year of maintenance included, and then 20% maintenance – $400 per year – for each following year) plus an Online License (an additional $500/year for access to the same site, but extra storage will not be added to that site!) for each consumer. Let’s compare the Spotfire Work Group and Tableau Online cumulative cost for 1, 2, 3 and 4 years for 1 developer/analyst and 5 consumer seats:

 

Cumulative cost for 1, 2, 3 and 4 years of usage/subscription, 1 developer/analyst and 5 consumer seats:

Year | Spotfire Cloud Work Group, 250GB storage | Tableau Online (with Desktop), 100GB storage | Cost Difference (negative if Spotfire is cheaper)
1    | $2,000 | $4,999  | -$2,999
2    | $4,000 | $8,399  | -$4,399
3    | $6,000 | $11,799 | -$5,799
4    | $8,000 | $15,199 | -$7,199
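For completeness, here is a small sketch of the arithmetic behind those cumulative numbers, using the list prices quoted above (1 Tableau Desktop at $1,999 with $400/year maintenance from year 2, plus $500/year Tableau Online per seat for 6 seats, vs. a flat $2,000/year for the Spotfire Work Group bundle):

```python
# Cumulative cost sketch based on the list prices quoted in this post.
SEATS = 6                  # 1 developer/analyst + 5 consumers
TABLEAU_DESKTOP = 1999     # perpetual license, 1st-year maintenance included
TABLEAU_MAINT = 400        # ~20% maintenance per year, from year 2 on
TABLEAU_ONLINE = 500       # per seat, per year
SPOTFIRE_WORKGROUP = 2000  # per year: 1 author / 1 analyst / 5 consumers

def tableau_cumulative(years: int) -> int:
    return TABLEAU_DESKTOP + TABLEAU_MAINT * (years - 1) + TABLEAU_ONLINE * SEATS * years

def spotfire_cumulative(years: int) -> int:
    return SPOTFIRE_WORKGROUP * years

for y in range(1, 5):
    t, s = tableau_cumulative(y), spotfire_cumulative(y)
    print(f"Year {y}: Spotfire ${s:,}, Tableau ${t:,}, difference ${s - t:,}")
# Year 1: Spotfire $2,000, Tableau $4,999, difference $-2,999 ... matching the table above
```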

UPDATE: You may need to consider some other properties, like available storage and the number of users who can consume/review visualizations published in the cloud. In the sample above:

  • Spotfire gives the Work Group a total of 250 GB of storage, while Tableau gives a total of 100 GB to the site. 2 or more subscriptions can be associated with the same site, but that will not increase the storage for the site beyond 100 GB (e.g. to 200 GB for 2 subscribers).
  • Spotfire costs less than Tableau Online for a similar configuration (almost twice less!)

Overall, Spotfire gives more for your $$$ and as such can be a front-runner in the Cloud Data Visualization race, considering that Qlikview does not have any comparable cloud option (yet) and Qliktech is relying on its partners (I doubt that can be competitive) to offer Qlikview-based services in the cloud. Here is the same table as above, but as an image (to make sure all web browsers can see it):

SFvsTBCloudPrice

It is important to consider another advantage of Spotfire Cloud: the ability to share visualizations with everybody on the internet by publishing them into Public Folder(s). By contrast, Tableau has limited licensing for this: in order to access workbooks published on a Tableau Online site, Tableau by default requires an extra subscription, which is wrong from my point of view, because such workbooks could simply be published to a Public Folder on that site (if such an option were allowed). By default (and without additional negotiations) Tableau Online does not allow the usage of a Public Folder.

The 3rd Spotfire Cloud option is called Spotfire Cloud Enterprise; it has customizable seating options and storage, more advanced visualization, security and scalability, and connects to 40+ additional data sources. It requires annoying negotiations with TIBCO sales, which may result in even higher pricing. The existence of the 3rd Spotfire Cloud option decreases the value of the 2nd Cloud option, because it tells the customer that Spotfire Cloud Work Group is not the best and does not include many features. The opposite of that is Tableau’s Cloud approach: with Tableau Online you get everything (with one exception: Multidimensional (cube) data sources are not supported by Tableau Online), and it is the only option.

Update 12/20/13: TIBCO announced results for its last quarter, ending 11/30/13, with quarterly revenue of $315.5M (only 6.4% growth compared with the same quarter of 2012) and $1070M revenue for the 12 months ended 11/30/13 (only 4.4% growth compared with the same period of 2012). Wall Street did not like it today, and TIBX lost 10% of its value, with the share price ending at $22 and Market Capitalization going down to less than $3.6B. At the same time Tableau’s share price went up $1 to $66 and the Market Capitalization of Tableau Software (symbol DATA) went above $3.9B. As always, I think it is relevant to compare the number of job openings today: Spotfire – 28, Tableau – 176, Qliktech – 71.

My previous blogpost, comparing the footprints of the DV Leaders (Tableau 8.1, Qlikview 11.2, Spotfire 6) on disk (in terms of the size of the application file with an embedded dataset of 1 million rows) and in Memory (calculated as the RAM difference between the freshly-loaded application without data and the same application after it loads the appropriate application file (XLSX or DXP or QVW or TWBX)), got a lot of feedback from DV Blog visitors. It even got a mention/reference/quote in Tableau Weekly #9 here:

http://us7.campaign-archive1.com/?u=f3dd94f15b41de877be6b0d4b&id=26fd537d2d&e=5943cb836b and the full list of Tableau Weekly issues is here: http://us7.campaign-archive1.com/home/?u=f3dd94f15b41de877be6b0d4b&id=d23712a896

The majority of the feedback asked me to do a similar Benchmark – the footprint comparison for a larger dataset, say with 10 million rows. I did that, but it required more time and work, because the footprint in memory for all 3 DV Leaders depends on the number of visualized Datapoints (Spotfire has for years used the term Marks for Visible Datapoints, and Tableau adopted this terminology too, so I use it from time to time as well, but I think the correct term here is “Visible Datapoints”).

3Footprints

Basically I used the same dataset as in the previous blogpost, with the main difference that I took a subset with 10 million rows as opposed to the 1 million rows in the previous Benchmarks. The diversity of the Dataset with 10 million rows is shown here (each row has 15 fields as in the previous benchmark):

I removed from the 10-million-row benchmarks the usage of Excel 2013 (Excel cannot handle more than 1,048,576 rows per worksheet) and PowerPivot 2013 (it is less relevant for the given Benchmark). Here are the DV Footprints on disk and in Memory for the Dataset with 10 million rows and different numbers of Datapoints (or Marks: <16, 1000, around 10000, around 100000, around 800000):

The main observations and notes from benchmarking the footprints with 10 million rows are as follows:

  • Tableau 8.1 requires less (almost twice less) disk space for its application file (.TWBX) than Qlikview 11.2 for its application file (.QVW) or/and Spotfire 6 for its application file (.DXP).

  • Tableau 8.1 is much smarter than Qlikview 11.2 and Spotfire 6 in how it uses RAM, because it takes advantage of the number of Marks. For example, for 10000 Visible Datapoints Tableau uses 13 times less RAM than Qlikview and Spotfire, and for 100000 Visible Datapoints Tableau uses 8 times less RAM than Qlikview and Spotfire!

  • The usage of more than, say, 5000 Visible Datapoints (or even more than a few hundred Marks) in a particular Chart or Dashboard is often a sign of bad design or poor understanding of the task at hand; the human eye (of the end user) cannot comprehend too many Marks anyway, so what Tableau does (reducing the footprint in Memory when fewer Marks are used) is good design.

  • For Tableau in the results above I reported the total RAM used by the 2 Tableau processes in memory: TABLEAU.EXE itself and the supplemental process TDSERVER64.EXE (this 2nd 64-bit process almost always uses about 21MB of RAM). Note: Russell Christopher also suggested monitoring TABPROTOSRV.EXE, but I could not find its traces or its usage of RAM during the benchmarks.

  • Qlikview 11.2 and Spotfire 6 have similar footprints in Memory and on Disk.

More than 2 years ago I estimated the footprints for a sample dataset (428999 rows and 135 columns) when encapsulated in a text file, in compressed ZIP format, in Excel 2010, in PowerPivot 2010, Qlikview 10, Spotfire 3.3 and Tableau 6. Since then everything has been upgraded to the “latest versions” and everything is 64-bit now, including Tableau 8.1, Spotfire 5.5 (and 6), Qlikview 11.2, Excel 2013 and PowerPivot 2013.

I decided to use a new dataset with exactly 1,000,000 rows (1 million rows) and 15 columns, with the following diversity of values (Distinct Counts for every Column below):

Then I put this dataset into every application and format mentioned above – both on disk and in memory. All results are presented below for review by DV blog visitors:

Some comments about application specifics:

  • Excel and PowerPivot XLSX files are ZIP-compressed archives of bunch of XML files

  • Spotfire DXP is a ZIP archive of proprietary Spotfire text format

  • QVW  is Qlikview’s proprietary Datastore-RAM-optimized format

  • TWBX is a Tableau-specific ZIP archive containing its TDE (Tableau Data Extract) and a data-less TWB (XML format) workbook

  • The footprint in memory I calculated as the RAM difference between the freshly-loaded application (without data) and the same application after it loads the appropriate application file (XLSX or DXP or QVW or TWBX); a sketch of this kind of measurement is shown below.
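For readers who want to reproduce this kind of measurement on Windows, here is a minimal sketch (an assumption about tooling, not the exact method I used) that sums the resident memory of the relevant processes before and after loading a file, using the psutil library; the process names are the Tableau ones mentioned above and would differ for Qlikview or Spotfire:

```python
import psutil

def rss_mb(process_names):
    """Sum the resident memory (MB) of all processes whose name matches."""
    names = {n.lower() for n in process_names}
    total = 0
    for p in psutil.process_iter(["name", "memory_info"]):
        if (p.info["name"] or "").lower() in names and p.info["memory_info"]:
            total += p.info["memory_info"].rss
    return total / (1024 * 1024)

TABLEAU_PROCS = ["TABLEAU.EXE", "TDSERVER64.EXE"]  # processes reported in this benchmark

baseline = rss_mb(TABLEAU_PROCS)   # measure with the app freshly loaded, no data
input("Open the TWBX workbook, wait for it to render, then press Enter...")
loaded = rss_mb(TABLEAU_PROCS)     # measure again with the workbook loaded
print(f"Footprint in memory: {loaded - baseline:.1f} MB")
```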

Datawatch published today its fiscal 2013 (ending 9/30/13) yearly and quarterly results, and its YoY growth is an impressive 16% (2013 over 2012), which is better than TIBCO (less than 13%)! See the Earnings Call Transcript here: http://seekingalpha.com/article/1849661-datawatch-corporations-ceo-discusses-q4-2013-results-earnings-call-transcript?part=single and the webcast available here: http://www.investorcalendar.com/IC/CEPage.asp?ID=171788

Since Datawatch recently bought the well-known Swedish Data Visualization vendor Panopticon (which had 112% YoY growth in 2012!) for $31M in stock, Panopticon’s sales were added to Datawatch sales for the first time (at least $1.5M revenue per quarter); total Datawatch quarterly revenue (as expected) grew to almost $9M per quarter and to $30.3M for fiscal 2013 (ending 9/30/13).

You can compare the “moving” Datawatch YoY index for the last 6 quarters vs. the 2 Top DV Performers (Tableau above 70% YoY, Qlikview above 20%), vs. a similar (YoY-wise) DV Vendor (Spotfire, about 12%) and finally vs. 2 Traditional BI Vendors (Microstrategy and Actuate). The thickness of the lines reflects each Vendor’s ttm (the revenue for the Trailing Twelve Months) – click on the Image to Enlarge:

Year-over-Year Growth for Trailing Twelve Months (YoY4ttm)


Datawatch was founded in 1985(!) and has been public (traded on NASDAQ as DWCH) since 1992; it has 44,000+ customers (including 99 of the Fortune 100) and 500,000+ end users. The Datawatch management team is experienced in the BI space and includes veterans from IBM, Applix, Cognos etc. In the last 3 years (since 10/1/10) DWCH shares have increased in value more than 10 times:

ReturnIn3Years

2nd V for BigData: Data Variety.

The first version of the main Datawatch software, called Monarch Professional, was released in 1991 and developed by Math Strategies. Over time Datawatch added a lot of features to this ETL software, including support for the broadest variety of data types and data sources simultaneously – including traditional structured relational databases; semi-structured sources like reports, PDF files, EDI streams, print spools and documents stored in file systems or enterprise content management systems; and a new mix of unstructured data such as machine data and social media stored in Big Data solutions or streaming directly from a host of real-time applications.

Datawatch Desktop does ETL from all of the above Data Sources and then extracts those data into a Variety of Standard Formats: Excel spreadsheets, Access Databases, PDF reports, Panopticon Workbooks etc. A simple example of how Monarch 11 does it can be seen here:

and more professional, free video training can be found here: http://www.datawatch.com/information-optimization/item/196-guided-tour-monarch

The latest release of Monarch Professional is version 12 and it has a new name: Datawatch Modeler; it is also integrated and bundled together with Panopticon Desktop Designer under the new name Datawatch Desktop, and that bundle is available for $1895. As a result, Datawatch has created an excellent up-sell opportunity for itself: current customers on maintenance can trade up to Datawatch Desktop for $366 (which also includes the first year of maintenance) – 5 times cheaper than Tableau Desktop Professional. My understanding is that maintenance for Datawatch Desktop is 22% of its price per year, but you may get a better deal.

Monarch11

Datawatch Modeler v.12 has a new Core engine with 16 External Lookups (was 9 in version 11), 512 Columns in a Table (was 254), 100 Multi-Column Regions (was 40), a Data Preview optimized for modelling large inputs (works with the first 100 records), a new PDF Engine, a 10GB Internal Database size (was 2GB), and it utilizes 4 Cores for DB operations (was 2).

1st V for Big Data: Data Volume.

Math Strategies developed another tool for Datawatch – Monarch DataPump (recently renamed Datawatch Automator, or Datawatch Server – Automation Edition, currently in version 12). On 3/30/12 Datawatch acquired the intellectual property for its underlying Monarch Report Analytics platform from Raymond Huger, d/b/a Math Strategies (Greensboro, NC).

Datawatch has developed other editions of Datawatch Server:

  • The former Enterprise Server is now named Datawatch Server – Content Edition, version 12. Datawatch Server supports all Monarch functionality on the server side, integrates with web server(s) and related infrastructure, manages all users, their credentials, access rights, roles, privileges and user groups, and manages and aggregates all content, data, data extracts etc.

  • Datawatch Server – Automation Edition (Data Pump) automatically collects and refreshes all content, data and data extracts, both on-demand and on-schedule, manages all schedules etc.

  • Datawatch Server – Complete Edition includes the former Panopticon Server (which manages all Data Visualizations and their users, and converts Visualizations to web applications so they can be accessed through web browsers and HTML5 clients), Datawatch Enterprise Server and Data Pump.

V3

Theoretically Datawatch Server (with help from Datawatch Automator) can support up to 524 Petabytes (1 PB = 10^15 bytes) of Data, which I consider very Big Data for 2013.

3rd V for Big Data: High Velocity

The Datawatch/Panopticon in-memory data engine supports data visualization for real-time business dashboards, with low-latency display of analytics based on streaming data as it arrives. This enables Datawatch to handle the demanding continuous-intelligence applications where quick responses are required. This is a big differentiator. An in-memory, OLAP-based StreamCube is associated with each graphical display object. The system processes new data as it arrives, selects the subset of important data, recalculates the relevant sections of the model and refreshes the associated parts of the display immediately. The parts of the model and the display that are not affected by the new data are not touched. This is faster and more efficient than conventional data visualization tools that operate on batch-loaded snapshots of data, run less frequently, and recalculate the model and rebuild the display for each iteration.
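To illustrate the general idea (this is a toy sketch of incremental aggregation under my own assumptions, not Datawatch's actual StreamCube implementation), the difference from batch recomputation is that only the aggregates touched by an incoming record are updated and only the affected display cells are repainted:

```python
from collections import defaultdict

# Toy incremental aggregation: a running sum/count per (symbol, minute) cell.
cube = defaultdict(lambda: {"sum": 0.0, "count": 0})
dirty_cells = set()  # only these cells need to be repainted

def on_tick(symbol: str, minute: str, price: float) -> None:
    """Process one streaming record: update only the affected cell."""
    cell = cube[(symbol, minute)]
    cell["sum"] += price
    cell["count"] += 1
    dirty_cells.add((symbol, minute))

def repaint() -> None:
    """Refresh only the parts of the display touched since the last repaint."""
    for key in sorted(dirty_cells):
        cell = cube[key]
        print(f"{key}: avg = {cell['sum'] / cell['count']:.2f}")
    dirty_cells.clear()

on_tick("ACME", "09:30", 10.10)  # illustrative ticks, not real market data
on_tick("ACME", "09:30", 10.25)
repaint()  # repaints 1 cell, not the whole model
```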

Somebody I know was able to refresh and REPAINT 25,000+ datapoints per second in one Datawatch/Panopticon Chart, and this is much faster than any competitor.

The Datawatch platform is integrated with message-oriented middleware, including ActiveMQ, Qpid, Sonic MQ and Tibco EMS. It has connectors to Complex Event Processing (CEP) platforms, such as kx kdb+tick, OneTick CEP, Oracle CEP, StreamBase Systems’ Event Processing Platform and Sybase Event Stream Processor. Datawatch also has interfaces for retrieving data from time series databases, conventional relational and columnar databases, files, Open Data Protocol (OData) sources and in-memory DBMSs. It can be customized for proprietary data sources (a recent example is a Visualization Accelerator for Splunk) and even embedded within other applications. Like other leading data visualization tools, it supports a wide range of charts. It has a development studio (Desktop Designer) for designing and implementing dashboards, and HTML5-based clients/support for mobile applications.

4th and most desirable V: Data Visualization

Datawatch is trying to get into the Data Visualization (DV) field and it has the potential to become the 4th major Vendor here: it has a competitive DV Desktop, a competitive DV Server, an excellent HTML5 Client for it and a set of differentiators like a ready-to-use 3V triplet of features (see above) for Big Data and real-time DV. Datawatch Designer supports a rich set of Graphs, Charts, Plots, Maps, Marks and other types of Visualizations, for example:

  • TIME SERIES Graphs: Candlestick, Horizon, Line, Needle, OHLC, Spread, Stack Area, Stacked / Grouped Needle, Table with Micro Charts, Sub Totals & Grand Totals, Timeseries Combo Charts, Timeseries Scatter Plot.

  • STATIC & TIME SLICE Graphs: Bullet, Heat Map, Heat Matrix, Horizontal/Vertical Bar, Horizontal/Vertical Dot Plot, Multi-Level Pie Chart, Numeric Line, Numeric Needle, Numeric Stacked Needles, Scatter Plot, Shapes / Choropleth, Surface Plot, Surface Plot 3D, Table with Micro Charts, Sub Totals & Grand Totals, Treemap.

  • many Visualization Demos are still available here: http://www.panopticon.com/Advanced-Data-Visualization and here: http://www.panopticon.com/demos

3232161725_e648f09137_o

In my humble opinion, in order to compete with a leading DV vendor like Tableau, Datawatch needs a few gradual changes, some of which I have listed below:

  • Gradually, on an as-needed basis, add features which the other 3 DV Vendors have and Datawatch does not (it needs serious R&D)

  • Create a free Datawatch Public (cloud service) to let people learn it and compare it (similar to Tableau Public) and to win mindshare

  • Create Fee-based Datawatch Online cloud service (similar to Tableau Online and Spotfire Cloud services)

  • Add more DV-oriented Partners (similar to Qlikview Partner Program, which has now 1500+ partners)

  • Create fee-based Data Visualization Practice in order to help large clients to implement DV Projects with Datawatch Desktop and Server.

  • Add support for Visual Analytics and Data Science, including integration with the R Library (similar to Spotfire’s S-Plus and TERR, or at least integration with R like Tableau 8.1 has today)

  • Add support for Storytelling, similar to what next versions of Tableau and Qlikview will have (soon) and communication abilities (similar to what Spotfire 6 has with TIBBR)

  • I may expand this list later as I see fit, but Datawatch really has a unique opportunity here and a large potential market!

Feedback 11/22/13 from multiple visitors of this blog:

I quote an email from one of the frequent visitors to my blog: “The fastest growing sales are in the DV field (e.g. Panopticon revenue grew 112% YoY in 2012). For example in 2006, when Qliktech’s Sales were $44M, its YoY was 81%; 4 years later, in 2010, when Tableau had $40M revenue, YoY was 106%, see it here: http://www.prnewswire.com/news-releases/tableau-software-doubles-revenue-with-2010-landmark-year-114913924.html ; and 4 years later, in 2014, history can repeat itself again if Datawatch allows its DV Products to be unbundled and sold separately. Instead, Datawatch currently prevents its own salesforce from selling separately its own DV products like Panopticon Desktop Designer (you may now call it Datawatch Visualization Studio) and Panopticon Server (you can now call it Datawatch Visualization Server). That artificial limitation has to be removed!” the visitor said to me over email… All I can say is: it is not my call… Additional links:

Something dramatic happened during October 2013 in the Data Visualization (DV) Market and I can feel it everywhere. Share prices for QLIK went down 40% from $35 to $25, DATA went down 20% from $72 to below $60, MSTR went up 27% from $100 to $127 and DWCH went up 25% from $27.7 to $34.5. This blog got 30% more visitors than usual and reached 26,000 visitors in the month of October 2013!

dwchPlus3DVPricesOctober2013

So in this blog post I revisit who the actual DV Leaders and active players in the Data Visualization field are and what events and factors are important here; I will also form a DVIndex containing 4-6 DV Leaders and use it for future estimates of Marketshare and Mindshare in the DV market.

In terms of candidates for the DVIndex I need measurable players, so I will prefer public companies, but I will mention private corporations if they are relevant. I did some modeling and it turned out that the best indicator of a DV Leader is whether its YoY (Year-over-Year Revenue growth) is larger than 10% – it separates obsolete and traditional BI vendors and me-too attempts from real DV Leaders.

Let’s start with the traditional BI behemoths: SAP, IBM, Oracle and SAS. According to IDC, their BI revenue totals $5810M, but none of those vendors had YoY (2012 over 2011) of more than 6.7%! These 4 BI Vendors are literally desperate to get into the Data Visualization market (for example SAP Lumira; IBM is getting desperate too with Project Neo (which will be in beta in early 2014), the Rapidly Adaptive Visualization Engine (RAVE), SmartCloud Analytics-Predictive Insights, BLU Acceleration and InfoSphere Data Explorer; or SAS Visual Analytics), but so far they have not been competitive with the 3 known DV Leaders (those 3 are part of the DVIndex for sure): Qlikview, Tableau and Spotfire.

The 5th traditional BI Vendor – Microsoft – had BI revenue in 2012 of $1044M with 16% YoY, and has lately added a lot of relevant features to its Data Visualization toolbox: Power Pivot 2013, Power View, Power Query, Power Map, SSAS 2012 (and soon SQL Server 2014) etc. Unfortunately Microsoft does not have a Data Visualization Product and is pushing everything toward Office 365, SharePoint and Excel 2013, which cannot compete in the DV market…

The 6th Traditional BI vendor – Microstrategy – made a desperate attempt during October 2013 to get into the DV market by releasing 2 free Data Visualization products: Microstrategy Desktop and Microstrategy Express, which forces me to qualify Microstrategy for the status of DV Candidate, and I will include it (at least temporarily) in the DVIndex. Microstrategy's BI revenue for the TTM (Trailing 12 Months) was $574M, and its YoY is below 5%, so while I can include it in the DVIndex, I cannot say (yet?) that Microstrategy is a DV Leader.

Datawatch Corporation is public (DWCH) and recently bought an advanced Data Visualization vendor – Panopticon – for $31M. Panopticon's TTM Revenue is approximately $7M and its YoY was a phenomenal 112% in 2012! Combining it with the $27.5M TTM Revenue of Datawatch (45% YoY!) gives us approximately 55% YoY for the combined company, qualifying DWCH as a new member of the DVIndex!
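A quick back-of-the-envelope check of that combined figure (a sketch using the TTM revenues and growth rates quoted above):

```python
# Combined YoY: compare combined TTM revenue with the implied prior-year revenue.
panopticon_ttm, panopticon_yoy = 7.0, 1.12   # $M, 112% growth
datawatch_ttm, datawatch_yoy = 27.5, 0.45    # $M, 45% growth

prior_year = panopticon_ttm / (1 + panopticon_yoy) + datawatch_ttm / (1 + datawatch_yoy)
combined_ttm = panopticon_ttm + datawatch_ttm
print(f"Combined YoY ~ {combined_ttm / prior_year - 1:.0%}")  # roughly 55%
```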

Other potential candidates for the DVIndex could be Panorama (and their Necto 3.0 Product), Visokio (they have a very competitive DV Product, called Omniscope 2.8) and Advizor Solutions (with their mature Advizor Visual Discovery 6.0 Platform), but unfortunately all 3 companies choose to be private and I have no way to measure their performance, so they will stay DV Candidates only.

In order to monitor the progress of open source BI vendors toward the DV Market, I also decided to include in the DVIndex one potential DV Candidate (not a leader for sure) – Actuate, with their BIRT product. Actuate's TTM revenue is about $138M and its YoY about 3%. Here is the tabular MarketShare result with the 6 members of the DVIndex:

MarketShareIndex

Please keep in mind that I have no way to get exact numbers for Spotfire, but I feel comfortable estimating Spotfire at approximately 20% of TIBCO's numbers. Indirect confirmation of my estimate came from … TIBCO's CEO, and I quote: “In fact, Tibco’s Spotfire visualization product alone boasts higher sales than all of Tableau.” As a result I estimate Spotfire's YoY at 16%, which is higher than the 11% TIBCO has overall. The numbers in the table above are fluid and reflect the market situation at the end of October 2013. Also see my attempt to visualize the Market Share of the 6 companies above in a simple Bubble Chart (click on it to Enlarge), where: X-axis: Vendor's Revenue for the last TTM (12 trailing Months); Y-axis: Number of Full-Time Employees working for the given Vendor; Sized by Market Capitalization in $B (Billions of Dollars); and Colored by Year-Over-Year revenue Growth:

MarketShare
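If you want to reproduce this kind of bubble chart yourself, here is a minimal sketch (the CSV file name and column names are assumptions for illustration; fill it with the DVIndex numbers from the table above):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Expected columns: vendor, ttm_revenue_musd, employees, market_cap_busd, yoy_growth
df = pd.read_csv("dv_index.csv")  # hypothetical file holding the DVIndex table

fig, ax = plt.subplots(figsize=(8, 6))
bubbles = ax.scatter(
    df["ttm_revenue_musd"],         # X: Vendor's revenue for the trailing 12 months
    df["employees"],                # Y: number of full-time employees
    s=df["market_cap_busd"] * 100,  # size: market capitalization (scaled for visibility)
    c=df["yoy_growth"],             # color: year-over-year revenue growth
    cmap="RdYlGn",
    alpha=0.7,
)
for _, row in df.iterrows():
    ax.annotate(row["vendor"], (row["ttm_revenue_musd"], row["employees"]))
fig.colorbar(bubbles, ax=ax, label="YoY revenue growth")
ax.set_xlabel("TTM revenue ($M)")
ax.set_ylabel("Full-time employees")
plt.show()
```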

For that date I also have an estimate of the Mindshare of all 6 members of the DVIndex, using the mentions of those 6 companies by LinkedIn members, LinkedIn groups, job openings posted on LinkedIn and companies with a LinkedIn profile:

MindShareIndex

Again, please see below my attempt to represent the Mindshare of those 6 companies with a simple Bubble Chart (click on it to Enlarge; here the 6 DV vendors are positioned relative to their MINDSHARE on LinkedIn, where: X-axis: Number of LinkedIn members who mention the Vendor in their LinkedIn profile; Y-axis: Number of LinkedIn Job Postings requesting Vendor-related skills; Sized by the number of companies mentioning them on LinkedIn; and Colored by Year-Over-Year revenue Growth):

MindShare

Among other potential DV candidates I can mention some recent me-too attempts like Yellowfin, NeitrinoBI, Domo, BIME, RoamBI, Zoomdata and multiple similar companies (mostly private startups), and hardly-commercial but very interesting toolkits like D3. None of them has an impact on the DV Market yet.

Now, let's review some of the October events (I may add more October events later):

1. For the fourth quarter, Qliktech predicts earnings of 28 to 31 cents a share on revenue between $156 million and $161 million. The forecast came in significantly lower than analysts' expectations of 45 cents a share on $165.78 million in revenue. For the full year, the company projects revenue between $465 million and $470 million, and earnings between 23 and 26 cents a share. Analysts had expectations of 38 cents a share on $478.45 million. As far as I am concerned it is not a big deal, but traders/speculators on Wall Street drove QLIK prices down almost 40%.

2. Tableau Software filed a Registration Statement for a proposed Secondary Offering. Also, Tableau's revenue in the three months ended in September rose to $61 million, $10 million more than expected – revenue jumped 90%! Tableau CEO Christian Chabot said the results were boosted by one customer that increased its contract with the company. “Our third quarter results were bolstered by a large multimillion-dollar deal with a leading technology company,” he said. “Use of our products in this account started within one business unit and over the last two years have expanded to over 15 groups across the company. Recently, this customer set out to establish an enterprise standard for self-service business intelligence, which led to the multimillion-dollar transaction. This deal demonstrates the power and value of Tableau to the enterprise.” However, DATA prices went down anyway, in anticipation that a significant portion of the share-price premium will quickly evaporate when the stock-option lock-up expires in November 2013.

3. The TIBCO TUCON 2013 conference somehow did not help TIBCO stock, but in my mind it brought attention to Datawatch and to the meteoric rise of the DWCH stock (in the Chart below compare it with the QLIK and TIBX prices, which basically did not change during March-October 2013), which has more than tripled in a matter of just 8 months (Datawatch bought and integrated Panopticon during exactly that period):

DWCHvsQLIKvsTIBXMar_Oct2013

4. Datawatch now has a potentially better software stack than the 3 DV Leaders, because Datawatch Desktop is integrated with Panopticon Desktop Designer and Datawatch Server is integrated with the Panopticon Data Visualization Server; it means that in addition to “traditional” BI + ETL + Big Data 3V features (Volume, Velocity, Variety), Datawatch has a 4th V feature relevant to the DV Market: advanced Data Visualization. Most visualization tools are unable to cope with the “Three V's of Big Data” – volume, velocity and variety. However, Datawatch's technology handles:

  • Data sources of any size (this has to be tested and compared with Qlikview, Spotfire and Tableau)

  • Data that is changing in real time (Spotfire has something similar, but Qlikview and Tableau do not have it yet)

  • Data stored in multiple types of systems and formats

We have to wait and see how it will play out, but competition from Datawatch will make the Data Visualization market more interesting in 2014… I feel I now need to review the Datawatch products in my next blog post…

Yesterday I was invited by Qliktech to their semi-annual New England QlikView Boston User Group meeting. There were so many participants that Qliktech was forced to hold the Keynote (of course, the presentation and the demo of Qlikview.Next) and 4 cool presentations by Customers and Partners (Ocean State Job Lot, Analog Devices, Cybex and Attivio) outside of its own office, but in the same building on the 1st floor @Riverside Offices in Newton, MA @Rebecca's Cafe.

There were plenty of very excited people in a very large room, and a very promising demo and presentation of Qlikview.Next, which actually will not be generally available until 2014. The entire presentation was done using the new and capable HTML5 client, based on functionality Qliktech got when it bought NComVA 6 months ago.

I was alarmed when the presenter never mentioned my beloved Qlikview Desktop, and when I asked directly about it, the answer shocked and surprised me. One of the most useful pieces of software I have ever used will not be part of Qlikview.Next anymore. As part of Qlikview 11.2, it will be supported for 3 years and then it will be out of the picture! I did not believe it and asked one more time during the demo and 2 more times after the presentation, in person, during the Networking and Cocktail Hour inside the Qliktech offices. While the food and drink were excellent, the answer to my question was the same – NO!

LeafsAndNeedlesOnGrass

I have the utmost respect for the very smart software developers, architects and product managers of Qlikview, but in this particular case I have to invoke 20+ years of my own advanced and very extensive experience as a Software Architect, Coder and Software Director, and nothing in my past can support such a decision. I do not see why Qlikview.Next cannot have both the Qlikview Desktop Client and the Qlikview HTML5 client (and we as Qlikview users need and love both).

I personally urge Qliktech (and I am sure the majority of the 100,000+ strong (according to Qliktech) Qlikview community will agree with me) to keep the Qlikview Desktop client as long as Qlikview exists. And not just keep it, but, 1st, keep it as the best Data Visualization Desktop Client on the market and, 2nd, keep it in sync with (or better, ahead of) the HTML5 client.

If Qlikview Desktop disappears from Qlikview.Next, it will be a huge gift to Tableau and Datawatch (Spotfire Cloud Personal will no longer have access to the Spotfire Analyst desktop product, and therefore Spotfire Cloud Personal is making a similar (partial) mistake to Qlikview.Next).


tableau_cmyk

Tableau recently invested heavily into the progress of all variations of Tableau Desktop (Professional, Personal, Public, Online, the free Reader), including (finally) the migration to 64-bit and even porting the Desktop to the Mac, so it will instantly get a huge advantage over Qlikview in desktop, workstation, development, design, debugging, testing, QA and offline environments.

DATAWATCH CORPORATION LOGO

It will also almost immediately propel the Datawatch as a very attractive contender in Data Visualization market, because Datawatch got (when they bought Panopticon this year) the extremely capable Panopticon Desktop Designer

Panopticon_Data_Visualization_Software_logo_file,_800x155,_Real-Time_Visual_Data_Analysisin addition to its own very relevant line of products.

Again, I hope I misunderstood answer I got 4 times during 4-hour meeting and during follow-up networking/cocktail hour or if understood it correctly, Qliktech will reconsider, but I will respect their decision if they don’t…

So I have to disagree with Cindi Howson (as usual): even if “QlikTech Aims To Disrupt BI, Again“, it actually will disrupt itself first, unless it will listen me begging them to keep Qlikview Desktop alive, well and ahead of competition.

SunsetOnCapeCod102413

You can find in Ted Cuzzillo's article here: http://datadoodle.com/2013/10/09/next-for-qlik/ the actual quote from Qliktech's CEO Lars Björk: "We can disrupt the industry again". My problem with this quote is that Qliktech considers itself an insider and reinventor of the dead and slow BI industry, while Tableau, with its new motto "DATA to the people," is actually trying to stay out of this grave and be inside its own new and fast-growing Data Visualization space/field/market; see also the blog post from Tony Cosentino, VP of Ventana Research, here: http://tonycosentino.ventanaresearch.com/2013/09/21/tableau-continues-its-visual-analytics-revolution/#!

You can see below an interview with Tim Beyers, who has his own doubts about Qlikview.Next from an investor's point of view:

Basically, Qlikview.Next is 2 years late, it will not have Qlikview Desktop (a big mistake), it still does not promise any Qlikview cloud services similar to Tableau Online and Tableau Public, and it still does not have server-less distribution of visualizations because it does not have a free Qlikview Desktop Viewer/Reader similar to the free Tableau Reader. So far it looks to me like QLIK may have trouble in the future…

Last month Tableau and Qliktech both declared that traditional BI is too slow for development (I have been saying this for many years) and that their new Data Visualization (DV) software is going to replace it. A quote from Tableau's CEO Christian Chabot: "Traditional BI software is obsolete and dying and this is very direct challenge and threat to BI vendors: your (BI that is) time is over and now it is time for Tableau." A similar quote from Anthony Deighton, Qliktech's CTO & Senior VP, Products: "More and more customers are looking at QlikView not just to supplement traditional BI, but to replace it".

One of my clients – a large corporation (I obviously cannot name it due to NDA) – asked me to advise on what to choose between traditional BI tools with a long development cycle (like Cognos, Business Objects or Microstrategy), modern BI tools (like JavaScript and the D3 toolkit), which are an attempt to modernize traditional BI but still carry sizable development time, and leading Data Visualization tools with minimal development time (like Tableau, Qlikview or Spotfire).

The main criteria for the client were to:

  • minimize the IT personnel involved and increase their productivity;

  • minimize off-shoring and outsourcing, as they limit interactions with end users;

  • increase end users' involvement, feedback and action discovery.

So I advised the client to take a typical visual report project from the most productive traditional BI platform (Microstrategy), use its prepared data, and clone it with D3 and Tableau (using experts for both). The results (in the form of development time in hours) are below; all three projects include the same time (16 hours) for Data Preparation & ETL, the same time for Deployment (2 hours) and the same number (8) of repeated development cycles (due to 8 consecutive rounds of feedback from end users):

DVvsD3vsBI

It is clear that traditional BI requires too much time and that D3 tools just try to prolong old/dead BI traditions by modernizing and beautifying the BI approach, so my client chose Tableau as a replacement for Microstrategy, Cognos, SAS and Business Objects, and as a better option than D3 (which requires smart developers and too much development). This movement to leading Data Visualization platforms is going on right now in most of corporate America, despite IT inertia and existing skill sets. Basically, it is an application of the simple principle that "Faster is better than Shorter," known in science as Fermat's Principle of Least Time.

These changes made me wonder (again) whether Gartner's recent market-share estimates and trends for Dead Horse sales (old traditional BI) will hold for long. Gartner estimates the size of the BI market at $13B, which is drastically different from TBR's estimate ($30B).

BIDeadHorseTheory

TBR predicts that it will keep growing at least until 2018 at a yearly rate of 4%, and that the BI software market will exceed $40 billion by 2018 (they estimate the BI market at $30B in 2012 and include the wider category of Business Analytics software as opposed to strictly BI tools). I added estimates for Microstrategy, Qliktech, Tableau and Spotfire to Gartner's market-share estimates for 2012 here:

9Vendors

However, when Forrester asked people what BI tools they used, its survey results were very different from Gartner's estimate of "market share":

BIToolsInUse

"Traditional BI is like a pencil with a brick attached to it," said Chris Stolte at the recent TCC13 conference, and Qliktech said something very similar in its recent announcement of Qlikview.Next. I expect TIBCO will say something similar about the upcoming new release of Spotfire (next week at the TUCON 2013 conference in Las Vegas?).

Tableau_brick2

These bold predictions by leading Data Visualization vendors are just a simple application of Fermat's Principle of Least Time: this principle states that the path taken between two points by a ray of light (or a development path, in our context) is the path that can be traversed in the least time.

Pierre_de_Fermat2

Fermat's principle can easily be applied to "path" estimates in multiple situations, like in the video below, where the path from the initial position of a lifeguard on the beach to a swimmer in distress (a path through sand, shoreline and water) is explained:
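For illustration, here is a minimal numerical sketch of that least-time calculation in Python. The speeds and distances are made-up assumptions, not taken from the video; the point is only that the optimal path bends at the shoreline, exactly like a refracted light ray.

```python
# Lifeguard least-time problem (illustrative numbers only):
# run fast on sand, swim slowly in water, pick the best entry point.
RUN_SPEED, SWIM_SPEED = 7.0, 1.5      # m/s on sand vs. in water (assumed)
GUARD = (0.0, 30.0)                   # lifeguard: 30 m up the beach
SWIMMER = (40.0, -20.0)               # swimmer: 40 m along the shore, 20 m out
SHORE_Y = 0.0                         # the shoreline is the line y = 0

def travel_time(x_cross: float) -> float:
    """Total time if the lifeguard enters the water at (x_cross, 0)."""
    run = ((x_cross - GUARD[0]) ** 2 + (SHORE_Y - GUARD[1]) ** 2) ** 0.5
    swim = ((SWIMMER[0] - x_cross) ** 2 + (SWIMMER[1] - SHORE_Y) ** 2) ** 0.5
    return run / RUN_SPEED + swim / SWIM_SPEED

# Brute-force search over entry points along the shoreline (0 .. 40 m)
best_x = min((x * 0.01 for x in range(4001)), key=travel_time)
print(f"optimal entry point: x = {best_x:.2f} m, time = {travel_time(best_x):.2f} s")
```

The least-time entry point lands much closer to the swimmer than the straight line would, which is the whole point of the analogy: the fastest path is not the shortest one.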

Even ants follow Fermat's Principle (as described in an article at the Public Library of Science here: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0059739 ), so my interpretation of this Law of Nature ("Faster is better than Shorter") is that traditional BI is a dying horse, and I advise everybody to obey the Laws of Nature.

AntsOn2Surfaces

If you would like to watch another video about Fermat's Principle of Least Time and the related Snell's Law, you can watch this:

After the announcement of Tableau 8.1 (and the completion of TCC13) this week, people asked me to refresh my comparison of leading Data Visualization tools, and I felt it was a good time to do it, because Tableau can finally claim it has a 64-bit platform and is now able to do more advanced analytics, thanks to integration with R (both new features need to be benchmarked and tested, but until my benchmarks are completed I tend to believe Tableau's claims). I actually feel that Tableau may have leapfrogged the competition, and now Qlikview and Spotfire have to do something about it (if they care, of course).

I enjoyed Tableau's pun/wordplay/slogan this week, "Data to the People." It reminds one, of course, of another slogan, "Power to the People," but it also indirectly refers to the NYSE symbol "DATA," which is the ticker of Tableau Software Inc., so it (indirectly) means "Tableau to the People":

DataToThePeople2

In fact, the "keynote propaganda" from Christian Chabot and Chris Stolte was so close to what I have been saying for years on this blog that I used their slogan FEBA4A ("Fast, Easy, Beautiful, Anywhere for Anyone") as the filter to include in or remove from the comparison any runner-up, traditional, me-too and losing tools and vendors.

For example, the huge recent progress Microsoft has made with its BI stack (updates in Office 2013, 365 and SQL 2012/14 to Power Pivot/View/Map/Query, SSAS, Data Explorer, Polybase, Azure Services, StreamInsight, in-memory OLTP, Columnstore indexing, etc.) did not prevent me from removing Microsoft's BI stack from the comparison (MSFT is still trying to sell Data Visualization as a set of add-ins to Excel and SQL Server as opposed to a separate product), because it is not FEBA4A.

For similar reasons I did not include runner-ups like Omniscope, Advizor, Panopticon (it is part of Datawatch now) and Panorama; traditional BI vendors like IBM, Oracle, SAP, SAS and Microstrategy; and many me-too vendors like Actuate, Pentaho, Information Builders, Jaspersoft, Jedox, Yellowfin, Bime and dozens of others. I was even finally able to rule out wonderful toolkits like D3 (because they are not for "anyone" and they require brilliant people like Mike Bostock to shine).

I was glad to see similar thinking from Tableau’s CEO in his yesterday’s interview here: http://news.investors.com/091213-670803-tableau-takes-on-big-rivals-oracle-sap-ibm-microsoft.htm?p=full and I quote:

“The current generation of technology that companies and governments use to try to see and understand the data they store in their databases and spreadsheets is without exception complicated, development-intensive, staff-intensive, inflexible, slow-moving and expensive. And every one of those adjectives is true for each of the market-share leaders in our industry.”

Here is my brief and extremely personal (yes, opinionated, but not biased) comparison of the 3 leading Data Visualization (DV) platforms (if you cannot see it in your browser, see the screenshot of the Google Doc below):

I did not add pricing to the comparison, because I cannot find enough public info about it. This is all I have:

  • https://tableau.secure.force.com/webstore

  • http://www.qlikview.com/us/explore/pricing

  • https://silverspotfire.tibco.com/us/get-spotfire/silver-spotfire-feature-matrix

  • additional pricing info for Tableau Server core licensing: "an 8-core server (enough to support 1,000 users, or 100 concurrent) for Tableau is $180K for the first year, about $34K every year after year 1 for maintenance". With 8-core licensing I have actually witnessed support for more than 1000 users: 1300+ active interactors, 250+ publishers, 3000+ viewers. I also witnessed (2+ years ago; the price has grown since then!) more than once that negotiation with Tableau Sales can get you down to $160K for an 8-core license with 20% every year after year 1 for maintenance (so in 2010–2011 the total price was about $192K including 1 year of maintenance).

  • Also, one of my visitors indicated to me that the current pricing for an 8-core Tableau 8.0 license is now $240K for the 1st year, plus (mandatory?) 20–25% maintenance for the 1st year… However, negotiations are very possible and can get you a "discount" of up to 20–25%. I am aware of recent cases where an 8-core license was sold (after discount) for around $195K, with 1st-year maintenance of about $45K, so the total sale was $240K including 1st-year maintenance (25% price growth over the last 3 years).

Below is a screenshot of the above comparison, because some browsers (e.g. Safari, or Firefox before version 24) cannot display either the Google Doc embedded into WordPress or the Google Doc itself:

DVComparisonSeptember2013

Please note that I did not quantify above which of the 3 tools is better; that is not possible until I repeat all benchmarks and tests (I did many of those in the past; if I have time in the future, I can do it again) once the actual Tableau 8.1 is released (see the latest here: https://licensing.tableausoftware.com/esdalt/ ). However, above I used green for good and red for bad (light-colored backgrounds in the 3 middle columns indicate good/bad). Also keep in mind that Qliktech and TIBCO may release something new soon enough (say, Qlikview 12 – now called Qlikview.Next – and Spotfire 6), so the leapfrogging game may continue.

Update 10/11/13: an interesting article about Tableau (in the context of Qlikview and Spotfire) by Akram Annous from SeekingAlpha: http://seekingalpha.com/article/1738252-tableau-a-perfect-short-with-a-catalyst-to-boot . Akram is a very active visitor to my blog, especially to the article above. His article is only 1 month old but already needs updates due to the recent pre-announcements about Qlikview.Next (Qlikview 12) and Spotfire 6, which, as I predicted, show that the leapfrogging game continues at full speed. Akram is brave enough to "target" the price of DATA shares at $55 in 30 days and $35 in 6 months. I am not convinced yet.

frogleap4

If you see an ad below, it is not mine – it is wordpress.com…

Today the Tableau Customer Conference 2013 started, with 3200+ attendees from 40+ countries and 100+ industries, 700 Tableau employees and 240 sessions. Tableau 8.1 was pre-announced today for release in fall 2013, with version 8.2 planned for winter 2014 and Tableau 9.0 for later in 2014.

Update 9/10/13: keynote now is available recorded and online:  http://www.tableausoftware.com/keynote
(Recorded Monday Sept 9, 2013 Christian Chabot, Chris Stolte and the developers LIVE)

New in 8.1: 64-bit, integration with R, support for SAML, IPv6 and external load balancers, copy/paste of dashboards and worksheets between workbooks, a new calendar control, your own visual style (including the ability to customize even filters), Tukey's box-and-whisker plot, prediction bands, ranking, and visual analytics for everyone and everywhere (now in the cloud).

Planned and new for 8.2: Tableau for Mac, Story Points (a new type of worksheet/dashboard with mini-slides as story points), seamless access to data via a data connection interface to visually build a data schema (including inner/left/right/outer visual joins), beautified column names, easier metadata, etc., and web authoring enhancements (some may get into 8.1: moving quick filters, improvements for tablets, color encoding), etc.

8.1: Francois Ajenstat announced 64-bit support, finally (I have asked for that for many years), for server processes and for the Desktop, plus support for SAML (single sign-on on Server and Desktop), IPv6 and external load balancers:

Francois

SAML

8.1: Dave Lion announced R integration with Tableau:

DaveLion

r

8.1: Mike Arvold announced "Visual Analytics for everyone," including an implementation of the famous Tukey box-and-whisker plot (Spotfire has had it for a while; see it here: http://stn.spotfire.com/stn/UserDoc.aspx?UserDoc=spotfire_client_help%2fbox%2fbox_what_is_a_box_plot.htm&Article=%2fstn%2fConfigure%2fVisualizationTypes.aspx ), better forecasting, prediction bands, ranking and better heatmaps:

MikeArvold

8.1: Melinda Minch announced "fast, easy, beautiful" – most importantly, copy/paste of dashboards and worksheets between workbooks, customizing everything (including quick filters), a new calendar control, your own visual style, folders in the Data window, etc…

MelindaMinch2

8.2: Jason King pre-announced seamless access to data via a data connection interface to visually build a data schema, including inner/left/right/outer "visual" joins, beautified column names, default formats, new functions like DATEPARSE, appending a data set with new tables, easier metadata, etc.

JasonKingSeamlessAccess2data2

8.2: Robert Kosara introduced Story Points (using a new type of worksheet/dashboard with mini-slides as story points) for the new storytelling functionality:

RobertKosara2

Here is an example of Story Points, done by Robert:

storypoints-4

8.2: Andrew Beers pre-announced Tableau 8.2 on Mac, and he got a very warm reception from the audience for that:

AndrewBeers3

Chris Stolte proudly mentioned his 275-strong development team, pre-announced the upcoming Tableau releases 8.1 (this fall), 8.2 (winter 2014) and 9.0 (later in 2014), and introduced the 7 "developers" (see above: Francois, Mike, Dave, Melinda, Jason, Robert and Andrew) who discussed the new features during this keynote (the feature list is definitely longer and wider than the recent "innovations" we saw from Qlikview 11.2 and even from Spotfire 5.5):

ChrisStolte2

Christian Chabot opened the keynote today… He said something important: current BI platforms are not fast, not easy, not beautiful, not for anyone, and they are definitely not "anywhere" but only in designated places with appropriate IT personnel (compare with Tableau Public, Tableau Online, the free Tableau Reader, etc.); such a platform is only capable of producing a bunch of change requests from one enterprise department to another, which will take a long time to implement with any SDLC framework.

CEO

Christian basically repeated what I have been saying on this blog for many years (check it here: https://apandre.wordpress.com/market/competitors/ ): traditional BI software (from SAP, IBM, Oracle, Microstrategy and even Microsoft, which cannot compete with Tableau, Qlikview and Spotfire) is obsolete and dying, and this is a very direct challenge and threat to BI vendors (I am not sure they understand that): your (BI, that is) time is over and now it is the time of Tableau (and also of Qlikview and Spotfire, but they are slightly behind now…).

Update on 11/21/13: Tableau 8.1 is available today, see it here: http://www.tableausoftware.com/new-features/8.1 and Tableau Public 8.1 is available as well, see it here: http://www.tableausoftware.com/public/blog/2013/11/tableau-public-81-launches-2226


Last week Tableau increased 10-fold the capacity of Data Visualizations published with Tableau Public, to a cool 1 million rows of data – basically the same number of rows that Excel 2007, 2010 and 2013 (often used as data sources for Tableau Public) can handle these days – and increased 20-fold the storage capacity of each free Tableau Public account (to 1GB of free storage); see it here:

http://www.tableausoftware.com/public/blog/2013/08/one-million-rows-2072

It means that a free Tableau Public account will have storage twice as large as Spotfire Silver's most expensive Analyst account (which will cost you $4500/year). Tableau said: "Consider it a gift from us to you." I have to admit that even kids in this country know there is nothing free here, so please kid me not – we are all witnessing some kind of investment here, a type of investment that has worked brilliantly in the past… And all users of Tableau Public are investing too, with their time and learning efforts.

And this is not all: “For customers of Tableau Public Premium, which allows users to save locally and disable download of their workbooks, the limits have been increased to 10 million rows of data at 10GB of storage space” see it here:

http://www.tableausoftware.com/about/press-releases/2013/tableau-software-extends-tableau-public-1-million-rows-data without changing the price of service (of course in Tableau Public Premium price is not fixed and depends on the number of impressions).

Out of 100+ million Tableau users, only 40,000 have qualified to be called Tableau Authors (see it here: http://www.tableausoftware.com/about/press-releases/2013/tableau-software-launches-tableau-public-author-profiles ), so they consume Tableau Public's storage more actively than others. As an example, you can see my Tableau Author Profile here: http://public.tableausoftware.com/profile/andrei5435#/ .

I will assume those Authors will consume 40,000GB of online storage, which will cost Tableau Software less than (my guess; I am open to corrections from blog visitors) $20K/year just for the storage part of the Tableau Public service.
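A quick sanity check of that guess, using only the numbers already quoted above (the per-GB rate below is derived from my guess, not from any published Tableau or cloud-provider price list):

```python
# Implied storage rate if 40,000 Authors used 1 GB each and the yearly
# storage bill stayed at the guessed $20K (all inputs are from the text above).
authors, gb_each = 40_000, 1
yearly_budget = 20_000                       # the $20K/year guess
total_gb = authors * gb_each                 # 40,000 GB, i.e. about 40 TB
implied_rate = yearly_budget / (total_gb * 12)
print(f"{total_gb:,} GB -> implied ~${implied_rate:.3f} per GB per month")
```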

During the last week another important announcement came from Tableau, on 8/8/13 – quarterly revenue: it reported Q2 revenue of $49.9 million, up 71% year-over-year: http://investors.tableausoftware.com/investor-news/investor-news-details/2013/Tableau-Announces-Second-Quarter-2013-Financial-Results/default.aspx .

Please note that 71% is extremely good YoY growth compared with the entire anemic "BI industry," but less than the 100% YoY at which Tableau grew in its private past.

All the announcements above happened simultaneously with a magical (I have no theory for why this happened; one weak theory is investor madness and over-excitement about the Q2 revenue of $49.9M announced on 8/8/13) and sudden increase in the nominal price of Tableau stock (under the DATA symbol on NYSE) from $56 (which is already high) on August 1st, 2013 (the announcement of 1 million rows / 1GB of storage for Tableau Public accounts) to $72+ today:

DATAstock812Area2

It means that the market capitalization of Tableau Software may be approaching $4B while its sales may be $200M/year. For comparison, Tableau's direct and more mature competitor Qliktech now has a capitalization below $3B while its sales are approaching almost $500M/year. From a market-capitalization point of view, in 3 months Tableau went from a private company to the largest publicly traded Data Visualization software company on the market!

Competition in the Data Visualization market is not only about features, market share and mindshare but also about pricing and licensing. For example, Qlikview licensing and pricing has been public for a while here: http://www.qlikview.com/us/explore/pricing and Spotfire Silver pricing has been public for a while too: https://silverspotfire.tibco.com/us/silver-spotfire-version-comparison .

Tableau Desktop has 3 editions: Public (free), Personal ($999) and Professional ($1999); see it here: http://www.tableausoftware.com/public/comparison . In addition, you can have a full Desktop (read-only) experience with the free Tableau Reader (neither Qlikview nor Spotfire has a free reader for server-less, unlimited distribution of visualizations, which makes Tableau a mindshare leader right away…).

The release of Tableau Server online hosting this month ( http://www.tableausoftware.com/about/press-releases/2013/tableau-unveils-cloud-business-intelligence-product-tableau-online ) heated up the licensing competition and may force large changes in the licensing landscape for Data Visualization vendors. Tableau Server has existed in the cloud for a while, with tremendous success, as Tableau Public (free) and Tableau Public Premium (the former Tableau Digital, with its weird pricing based on "impressions").

But Tableau Online is much more disruptive for the BI market: for $500/year you can get a complete Tableau Server site (administered by you!) in the cloud, with (initially) 25 users authenticated by you (this number can grow) and 100GB of cloud storage for your visualizations, which is 200 times more than you get with the $4500/year top-of-the-line Spotfire Silver "Analyst" account. This Tableau Server site will be managed in the cloud by Tableau Software's own experts and requires no IT personnel on your side! You may also compare it with http://www.rosslynanalytics.com/rapid-analytics-platform/applications/qlikview-ondemand .

A solution hosted by Tableau Software is particularly useful when sharing dashboards with customers and partners, because the solution is secure but sits outside a company's firewall. In the case of Tableau Online, users can publish interactive dashboards to the web and share them with clients or partners without granting behind-the-firewall access.

Since Tableau 8 has the new Data Extract API, you can do all data refreshes behind your own firewall and republish your TDE files to the cloud anytime you need (even automatically, on demand or on a schedule); a small scripting sketch of this idea follows below. Tableau Online has no minimum number of users and can scale as a company grows. At any point, a company can migrate to Tableau Server to manage it in-house. Here is an introductory video about Tableau Online: Get started with Tableau Online.
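For example, a refresh-and-republish job can be a small script that regenerates a .tde behind the firewall (with the Data Extract API or any other tool) and then pushes it to your Tableau Online site with tabcmd. This is only a hedged sketch: the server URL, site, project, credentials and file path are hypothetical placeholders, and the exact tabcmd flags should be verified against your version's tabcmd help.

```python
# Hedged sketch: republish a locally refreshed extract to a Tableau Online site.
# All names below (server, site, project, user, file) are hypothetical, and the
# tabcmd options shown should be checked against your tabcmd version.
import subprocess

SERVER  = "https://online.tableausoftware.com"   # hypothetical site URL
SITE    = "mysite"                               # hypothetical site name
PROJECT = "default"
EXTRACT = r"C:\extracts\sales.tde"               # .tde refreshed behind the firewall

def run(args):
    print(">", " ".join(args))
    subprocess.check_call(args)                  # raise if tabcmd reports an error

run(["tabcmd", "login", "-s", SERVER, "-t", SITE,
     "-u", "publisher@example.com", "-p", "secret"])
run(["tabcmd", "publish", EXTRACT, "--project", PROJECT, "--overwrite"])
run(["tabcmd", "logout"])
```

Scheduled with Windows Task Scheduler (or cron, on any machine that has tabcmd installed), this gives the on-demand / on-schedule republishing mentioned above without opening the firewall.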

Tableau Server in the cloud provides at least 3 ways to update your data (more details see here: http://www.tableausoftware.com/learn/whitepapers/tableau-online-understanding-data-updates )

TableauDesktopAsProxyForTableauServer

Here is another, more lengthy intro into Tableau BI in Cloud:

Tableau as a Service is a step in the right direction, but be cautious: in practice, the architecture of the hosted version could impact performance. Plus, the nature of the product means that Tableau isn't really able to offer features like pay-as-you-go that have made cloud-based software popular with workers. By their nature, data visualization products require access to data. Businesses that store their data internally must publish their data to Tableau's servers. That can be a problem for businesses that have large amounts of data or that are prevented from shifting their data off premises for legal or security reasons. It could also create a synchronization nightmare, as workers play with data hosted at Tableau that may not be as up-to-date as internally stored data. Depending on the location of the customer relative to Tableau's data center, data access could be slow.

And finally, the online version requires the desktop client, which costs $2,000. Tableau may implement Tableau Desktop's analytical features in a browser in the future while continuing to support the desktop and on-premise model, to meet the security requirements and regulations facing some customers.

Tableau_Online

Tableau Software has filed for an IPO on the New York Stock Exchange under the symbol "DATA." In sharp contrast to other business-software makers that have gone public in the past year, Tableau is profitable, despite hiring a huge number of new employees. For the years ended December 31, 2010, 2011 and 2012, Tableau's total revenues were $34.2 million, $62.4 million and $127.7 million, respectively. The number of full-time employees increased from 188 as of December 31, 2010 to 749 as of December 31, 2012.

Tableau's biggest shareholder is venture capital firm New Enterprise Associates, with a 38 percent stake. Founder Pat Hanrahan owns 18 percent, while co-founders Christopher Stolte and Christian Chabot, who is also chief executive officer, each own more than 15 percent. Meritech Capital Partners controls 6.4 percent. Tableau recognizes three categories of primary competitors:

  • large suppliers of traditional business intelligence products, like IBM, Microsoft, Oracle and SAP AG;

  • spreadsheet software providers, such as Microsoft Corporation

  • business analytics software companies: Qlik Technologies Inc. and TIBCO Spotfire.

TBvsQVvsSF

Update 4/29/13: This news may be related to the Tableau IPO. I understand that Microstrategy's growth cannot be compared with the growth of Tableau or even Qliktech. But to go below the average "BI market" growth? Or even a 6% or 24% decrease? What is going on here: "First quarter 2013 revenues were $130.2 million versus $138.3 million for the first quarter of 2012, a 6% decrease.  Product licenses revenues for the first quarter of 2013 were $28.4 million versus $37.5 million for the first quarter of 2012, a 24% decrease."

Update 5/6/13: Tableau Software Inc. will sell 5 million shares, while shareholders will sell 2.2 million shares, Tableau said in an amended filing with the U.S. Securities and Exchange Commission. The underwriters have the option to purchase up to an additional 1,080,000 shares. It means the total can be 8+ million shares for sale.

The company expects its initial public offering to raise up to $215.3 million at a price of $23 to $26 per share. If this happens, it will create a public company with a large capitalization, so Qliktech and Spotfire will have an additional problem to worry about. This is how QLIK (blue line), TIBX (red) and MSTR (orange line) stocks behaved during the last 6 weeks, after the release of Tableau 8 and the official Tableau IPO announcement:

QlikTibxMstr

Update 5/16/13: According to this article at Seeking Alpha (also see the S-1 form), Tableau Software Inc. (symbol "DATA") has scheduled a $176 million IPO with a market capitalization of $1.4 billion for Friday, May 17, 2013. Tableau's March-quarter sales were up 60% from the March 2012 quarter. Qliktech's sales were up only 23% on a similar comparative basis.

nyse

According to another article, Tableau raised its IPO price and may reach a capitalization of $2B by the end of Friday, 5/17/13. That is almost comparable with the capitalization of Qliktech…

Update 5/17/13: Tableau’s IPO offer price was $31 per share, but it started today

with price $47 and finished day with $50.75 (raising $400M in one day) with estimated Market Cap around $3B (or more?). It is hard to understand the market: Tableau Stock (symbol: DATA) finished its first day above $50 with Market Capitalization higher than QLIK, which today has Cap = $2.7B but Qliktech has almost 3 times more of sales then Tableau!

For comparison MSTR today has Cap = $1.08B and TIBX today has Cap = $3.59B. While I like Tableau, today proved that most investors are crazy, if you compare numbers in this simple table:

Symbol | Market Cap, $B (as of 5/17/13) | Revenue, $M (trailing 12 months, as of 3/31/13) | FTE (Full-Time Employees)
TIBX | 3.59 | 1040 | 3646
MSTR | 1.08 | 586 | 3172
QLIK | 2.67 | 406 | 1425
DATA | between $2B and $3B? | 143 | 834

See the interview with Tableau Software co-founder Christian Chabot – he discusses taking the company public with Emily Chang on Bloomberg Television's "Bloomberg West." However, it makes me sad when Tableau's CEO implies that Tableau is ready for big data, which is not true.

TableauCEOaboutIPO

Here are some pictures of the Tableau team at the NYSE: http://www.tableausoftware.com/ipo-photos and here is the announcement about "closing the IPO".

The initial public offering gave Tableau $254 million (preliminary estimate).

Today Tableau 8 was released with 90+ new features (actually it may be more than 130), after an exhausting 6+ months of alpha and beta testing with 3900+ customers as beta testers! I personally expected it 2 months ago, but I would rather have it with fewer bugs, and this is why I have no problem with the delay. During this "delay" Tableau Public achieved a phenomenal milestone: 100 million users…

Tableau 8 introduced:

  • web and mobile authoring,
  • added access to new data sources: Google Analytics, Salesforce.com, Cloudera Impala, DataStax Enterprise, Hadapt, Hortonworks Hadoop Hive, SAP HANA, and Amazon Redshift.
  • A new Data Extract API that allows programmers to load data from anywhere into Tableau; it also makes certain parts of Tableau licensing look ridiculous, because consuming licensed capacity (for example, core licensing) for background tasks should now be free.
  • A new JavaScript API enables integration with business (and other web) applications.
  • Local Rendering: leveraging the graphics hardware acceleration available on ordinary computers. Tableau 8 Server dynamically determines where rendering will complete faster – on the server or in the browser – and acts accordingly. Dashboards also now render views in parallel when possible.

Tableau Software plans to add some very interesting and competitive features in the next versions (after 8.0), such as:

  • Direct query of large databases, quick and easy ETL and data integration.
  • Tableau on a Mac and Tableau as a pure Cloud offering.
  • Make statistical & analytical techniques accessible (I wonder if it means integration with R?).
  • Tableau founder Pat Hanrahan recently talked about “Showing is Not Explaining”, so Tableau planned to add features that support storytelling by constructing visual narratives and effective communication of ideas.

I did not see on Tableau’s roadmap some very long overdue features like 64-bit implementation (currently even all Tableau Server processes, except one, are 32-bit!), Server implementation on Linux (we do not want to pay Windows 2012 Server CAL taxes to Bill Gates) and direct mentioning of integration with R like Spotfire does – I how those planning and strategic mistakes will not impact upcoming IPO.

I personally think that Tableau has to stop using its ridiculous practice when 1 core license used per each 1 Backgrounder server process and since Tableau Data Extract API is free so all Tableau Backgrounder Processes should be free and have to be able to run on any hardware and even any OS.

Tableau 8 managed to get the negative feedback from famous Stephen Few, who questioned Tableau’s ability to stay on course. His unusually long blog-post “Tableau Veers from the Path” attracted enormous amount of comments from all kind of Tableau experts. I will be cynical here and notice that there is no such thing as negative publicity and more publicity is better for upcoming Tableau IPO.

TBvsQVvsSF

The human eye cannot effectively process more than a few thousand datapoints per view.

LocalRenderingBlue

Additionally, in Data Visualization you have other restrictions:

  • the number of pixels on your screen (maybe 2–3 million maximum) available for your view (chart or dashboard);
  • the time to render millions of datapoints can be too long and may create a bad user experience (too much waiting);
  • the time to load your datapoints into your view: if you wish to have a good user experience, then 2–3 seconds is the maximum a user will wait, and if you have a live connection to a data source, then 2–3 seconds means a few thousand datapoints at most;
  • again, the more datapoints you put in your view, the more crowded it will be and the less useful and less understandable it will be for your users.

Recently, some vendors have started to give you a new reason to restrict how many datapoints you put into your data view: the use of client-side hardware (especially its graphics hardware) for so-called "Local Rendering."

Local Rendering means that the Data Visualization server sends datapoints instead of images to the client, and the rendering of the image happens on the client side, using the capability of modern web browsers (to use the client's hardware) and HTML5 Canvas technology.

5000MarksBlueGreenGrey

For example, the new feature in Tableau Server 8 will automatically switch to Local Rendering if the number of datapoints in your data view (a worksheet with your chart, or a dashboard) is less than 5000 datapoints ("marks" in Tableau-speak). In addition to faster rendering, it means fewer round-trips to the server (for example, hovering your mouse over a datapoint used to mean a round-trip to the server) and faster drill-down, selection and filtering operations.
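Conceptually, the decision rule is as simple as the sketch below (this is an illustration of the behavior described above, not Tableau's actual code; the 5000-mark threshold is the figure cited for Tableau Server 8):

```python
# Conceptual sketch of the local-vs-server rendering decision: below the mark
# threshold the server ships datapoints and the browser renders them on an
# HTML5 canvas; above it, the server ships a finished image.
MARK_THRESHOLD = 5000   # approximate default cited for Tableau Server 8

def choose_renderer(mark_count: int, threshold: int = MARK_THRESHOLD) -> str:
    """Return which side should render the view for a given number of marks."""
    return "client (local rendering)" if mark_count < threshold else "server (image)"

if __name__ == "__main__":
    for marks in (800, 5000, 120_000):
        print(f"{marks:>7} marks -> {choose_renderer(marks)}")
```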

Update 3/19/13: James Baker from Tableau Software explains why Tableau 8 Dashboards in Web Browser feel more responsive:

http://www.tableausoftware.com/about/blog/2013/3/quiet-revolution-rendering-21874

James explained that HTML5's canvas element is used as the drawing surface. He also underscored that for views with many marks it is much faster to send images rather than data, because image size does not scale up linearly with the number of marks. James included a short video showing incremental filtering in a browser, one of the features of Local Rendering.

LocalRenderingPink

Best of the Tableau Web… December 2012:

http://www.tableausoftware.com/about/blog/2013/1/best-tableau-web-december-2012-20758

Top 100 Q4 2012 from Tableau Public:

http://www.tableausoftware.com/public/blog/2013/01/top-100-q4-2012-1765

eBay’s usage of Tableau as the front-end for big data, Teradata and Hadoop with 52 petabytes of
data on everything from user behavior to online transactions to customer shipments and much more:

http://www.infoworld.com/d/big-data/big-data-visualization-big-deal-ebay-208589

Why The Information Lab recommends Tableau Software:

http://www.theinformationlab.co.uk/2013/01/04/recommend-tableau-software/

Fun with #Tableau Treemap Visualizations

http://tableaulove.tumblr.com/post/40257187402/fun-with-tableau-treemap-visualizations

Talk slides: Tableau, SeaVis meetup & Facebook, Andy Kirk’s Facebook Talk from Andy Kirk

http://www.visualisingdata.com/index.php/2013/01/talk-slides-tableau-seavis-meetup-facebook/

Usage of RAM, Disk and Data Extracts with Tableau Data Engine:

http://www.tableausoftware.com/about/blog/2013/1/what%E2%80%99s-better-big-data-analytics-memory-or-disk-20904

Migrating Tableau Server to a New Domain

https://www.interworks.com/blogs/bsullins/2013/01/11/migrating-tableau-server-new-domain

SAS/Tableau Integration

http://www.see-change.co/services/sastableau-integration/

IFNULL is not "IF NULL" – it is "IF NOT NULL":

http://tableaufriction.blogspot.com/2012/09/isnull-is-not-is-null-is-is-not-null.html

Worksheet and Dashboard Menu Improvements in Tableau 8:

http://tableaufriction.blogspot.com/2013/01/tv8-worksheet-and-dashboard-menu.html

Jittery Charts – Why They Dance and How to Stop Them:

http://tableaufriction.blogspot.com/2013/01/jittery-charts-and-how-to-fix-them.html

Tableau Forums Digest #8

http://shawnwallwork.wordpress.com/2013/01/06/67/

Tableau Forums Digest #9

http://shawnwallwork.wordpress.com/2013/01/14/tableau-forums-digest-9/

Tableau Forums Digest #10

http://shawnwallwork.wordpress.com/2013/01/19/tableau-forums-digest-10/

Tableau Forums Digest #11

http://shawnwallwork.wordpress.com/2013/01/26/tableau-forums-digest-11/

implementation of bandlines in Tableau by Jim Wahl (+ Workbook):

http://community.tableausoftware.com/message/198511

If you have visited my blog before, you know that my classification of Data Visualization and BI vendors differs from that of researchers like Gartner. Beyond the 3 DV Leaders – Qlikview, Tableau, Spotfire – I rarely have time to talk about other "me too" vendors.

However, sometimes products like Omniscope, Microstrategy's Visual Insight, the Microsoft BI stack (Power View, PowerPivot, Excel 2013, SQL Server 2012, SSAS, etc.), Advizor, SpreadsheetWEB, etc. deserve attention too. But it takes so much time that I am trying to find guest bloggers to cover topics like that. 7 months ago I invited volunteers to do some guest blogging about Advizor Visual Discovery products:

https://apandre.wordpress.com/2012/06/22/advizor-analyst-vs-tableau-or-qlikview/

So far nobody in the USA or Europe has committed to do so, but recently Mr. Srini Bezwada, a Certified Tableau Consultant and Advizor-trained expert from Australia, contacted me and submitted an article about it. He also provided me with info about how Advizor compares with Tableau, so I will cover it briefly, using his data and opinions. Mr. Bezwada can be reached at sbezwada@smartanalytics.com.au ; he is a director at http://www.smartanalytics.com.au/ .

Below is a quick comparison of Advizor with Tableau. The opinions below belong to Mr. Srini Bezwada. The next blog post will be a continuation of this article about Advizor Solutions products; see also Advizor's website here:

http://www.advizorsolutions.com/products/

Criteria | Tableau | ADVIZOR | Comment
Time to implement | Very fast | Fast; ADVIZOR can be implemented within days | Tableau leads
Scalability | Very good | Very good | Tableau: virtual RAM
Desktop License | $1,999 | $1,999 ($3,999 for AnalystX with predictive modeling) |
Server License/user | $1K/user, min 10 users; $299K for Enterprise Deployment | $8K license for up to 10 named users; $75K for 500 users | ADVIZOR is a lot cheaper for Enterprise Deployment
Support fees / year | 20% | 20% | 1st year included
SaaS Platform | Core or Digital | Offers managed hosting | ADVIZOR leads
Overall Cost | Above average | Competitive | ADVIZOR costs less
Enterprise Ready | Good for SMB | Cheaper cost model for SMB | Tableau is expensive for Enterprise Deployment
Long-term viability | Fastest growth | Private company since 2003 | Tableau is going IPO in 2013
Mindshare | Tableau Public | Growing fast | Tableau stands out
Big Data Support | Good | Good | Tableau is 32-bit
Partner Network | Good | Limited partnerships | Tableau leads
Data Interactivity | Excellent | Excellent |
Visual Drilldown | Very good | Very good |
Offline Viewer | Free Reader | None | Tableau stands out
Analyst's Desktop | Tableau Professional | Advizor has predictive modeling | ADVIZOR is value for money
Dashboard Support | Excellent | Very good | Tableau leads
Web Client | Very good | Good | Tableau leads
64-bit Desktop | None | Very good | Tableau is still a 32-bit app
Mobile Clients | Very good | Very good |
Visual Controls | Very good | Very good |
Data Integration | Excellent | Very good | Tableau leads
Development | Tableau Pro | ADVIZOR Analyst |
64-bit in-RAM DB | Good | Excellent | Advizor leads
Mapping support | Excellent | Average | Tableau stands out
Modeling, Analytics | Below average | Advanced predictive modelling | ADVIZOR stands out
Predictive Modeling | None | Advanced predictive modeling capability with built-in KXEN algorithms | ADVIZOR stands out
Flight Recorder | None | Flight Recorder lets you track, replay and save your analysis steps for reuse by yourself or others | ADVIZOR stands out
Visualization | 22 chart types | All common charts (bar charts, scatter plots, line charts, pie charts) are supported | Advizor has advanced visualizations like Parabox and Network Constellation
Third-party integration | Many data connectors (see Tableau's drivers page) | ADVIZOR integrates well with CRM software: Salesforce.com, Ellucian, Blackbaud and others | ADVIZOR leads in the CRM area
Training | Free online and paid classroom | Free online and paid via company trainers & partners | Tableau leads

My best wishes for 2013!

hny2013abp

2012 was extraordinary for the Data Visualization community, and I expect 2013 will be even more interesting. For Data Visualization vendors 2012 was an unusual year and it surprised many people.

We can start with Qliktech, which grew only about 18% in 2012 (versus 42% in 2011 and 44% in 2010), and the QLIK stock lost a lot… Spotfire, on the other hand, grew faster than that, and Tableau grew even faster than Spotfire. Tableau doubled its workforce, and its sales are now more than $100M per year. Together, the sales of Qlikview, Spotfire and Tableau totaled almost $600M in 2012, and I expect they may reach even $800M in 2013. All other vendors are becoming less and less visible on the market. While a breakthrough from companies like Microsoft, Microstrategy, Visokio and Pagos is still possible, it is highly unlikely.

If you search the web for wishes or wishlists for Qlikview, Tableau or Spotfire, you can find plenty of them, including very technical ones. I will partially repeat myself, because some of my best wishes are still wishes, and maybe some of them will never be implemented. I will restrict myself to 3 best wishes per vendor.

Let me start with Spotfire, as the most mature product. I will use an analogy: EMC spun off VMWare, and (today) the market capitalization of VMWare is close to $40B, about 75% (!) of the market capitalization of its parent company EMC! I wish that TIBCO would do the same with Spotfire as EMC did with VMWare. Compared with this wish, all other wishes look minimal, like making a free Spotfire Desktop Reader (similar to what Tableau has) and making part of Spotfire Silver completely public and free, similar to… Tableau Public.

For Qliktech, I really wish them to stop bleeding capitalization-wise (did they lose $1B of market cap during the last 9 months?) and sales-wise (growing only 18% in 2012 compared with 42% in 2011). Maybe 2013 is a good time for IBM to buy Qliktech? And yes, I wish for Qlikview Server on Linux (I do not like the new licensing terms of Windows 2012 Server), and I have wished for many years for a free Qlikview Desktop Viewer/Reader (similar… to Tableau Reader) in 2013, to enable server-less distribution of Qlikview-based Data Visualizations!

For Tableau, I wish a very successful IPO in 2013, and I wish them to grow in 2013 as fast as they did in 2012! I really wish for Tableau (and all its processes, like VizQL, Application Server, Backgrounder, etc.) to become 64-bit in 2013, and of course I wish for Tableau Server on Linux (see my wish for Qlikview above).

hny2013blue

Since I still have best wishes for Microsoft (I guess they will never listen to me anyway), I wish them to stop using a dead product (Silverlight) with Power View in 2013 (just complete the switch to HTML5 already), to make it completely separate from SharePoint and make it an equal part of Office (integrated with PowerPivot on the desktop), the same way Visio and Access are parts of Office; and, as a result, I wish for Microsoft to have a Power View (Data Visualization) Server (integrated with SQL Server 2012, of course) as well.

Also, here are the flags of the 21 countries from which this blog got the most visitors in 2012:
21CountriesFromWhereDVBlogGotMostVistors

In my previous post https://apandre.wordpress.com/2012/11/16/new-tableau-8-desktop-features/ (this post is a continuation of it), I said that Tableau 8 introduced 130+ new features, 3 times more than Tableau 7 did. Many of these new features are in Tableau 8 Server, and this post is about those new Server features (this is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/11/30/new-tableau-8-server-features/ ).

The Admin and Server pages have been redesigned to show more info more quickly. In list view the columns can be resized; in thumbnail view the grid dynamically resizes. You can hover over a thumbnail to see more info about the visualization. The content search is better too:

ThumbnailView

Web authoring (even on mobile) was introduced by Tableau 8 Server. Changing dimensions, measures and mark types, adding filters and using Show Me can all be done directly in a web browser and saved back to the server as a new workbook or, if individual permissions allow, to the original workbook:

webAuthoring

Subscribing to a workbook or worksheet will automatically send notifications about dashboard or view updates to your email inbox. Subscriptions deliver an image and a link.

The Tableau 8 Data Engine is more scalable now: it can be distributed between 2 nodes, and the 2nd instance can now be configured as active, synced and available for reading if the Tableau router decides to use it (in addition to the fail-over function as before).

server2s

Tableau 8 Server now supports Local Rendering, using the graphics capability of local devices, modern browsers and HTML5 – no round-trips to the server while rendering, using the latest versions of Chrome 23+, Firefox 17+, Safari and IE 9+. Tableau 8 (both Server and Desktop) computes each view in parallel. PDF files generated by Tableau 8 are up to 90% smaller and searchable. And the Performance Recorder works on both Server and Desktop.

Tableau 8 Server introduces shared sessions, which allow more concurrency and more caching. Tableau 7 uses 1 session per viewer; Tableau 8 uses one session for many viewers, as long as they do not change the state of filters or perform any other altering interaction. If an interaction happens, Tableau 8 clones the session for the appropriate interactor and applies his/her changes to the new session (a conceptual sketch of this copy-on-write idea follows below):

server3s

Finally, Tableau is getting an API; the 1st part of it I described in the previous blog post about the Desktop – the TDE API (C/C++, Python, Java, on both Windows AND Linux!).
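Here is the conceptual sketch promised above of that shared-session, copy-on-write idea. It is only an illustration of the behavior described in this paragraph, not Tableau's implementation; all class and field names are made up.

```python
# Toy model of shared sessions: viewers of the same view in its default state
# share one session; the first altering interaction (e.g. changing a filter)
# clones the shared session into a private one for that viewer.
from copy import deepcopy

class SessionPool:
    def __init__(self):
        self._shared = {}    # view_id -> shared session (default state)
        self._private = {}   # (view_id, viewer) -> cloned session

    def get(self, view_id, viewer):
        """Return the viewer's private session if it exists, else the shared one."""
        return self._private.get((view_id, viewer)) \
            or self._shared.setdefault(view_id, {"filters": {}})

    def interact(self, view_id, viewer, filter_change):
        """An altering interaction: clone the session for this viewer only."""
        clone = deepcopy(self.get(view_id, viewer))
        clone["filters"].update(filter_change)
        self._private[(view_id, viewer)] = clone
        return clone

pool = SessionPool()
pool.get("dashboard-1", "alice")                         # alice shares the session
pool.interact("dashboard-1", "bob", {"Region": "East"})  # bob gets a private clone
```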

For web development, Tableau now has a brand-new JavaScript API to customize selection and filtering, add triggers for events, build a custom toolbar, etc. Tableau 8 has its own JavaScript API workbench, which can be used right from your browser:

server4w

The TDE API allows you to build your own TDE on any machine with Python, C/C++ or Java (see 24:53 at http://www.tableausoftware.com/tcc12conf/videos/new-tableau-server-8 ). Additionally, the Server API (REST API) allows you to programmatically create/enable/suspend sites and add/remove users for sites.

In addition to faster uploads and publishing of data sources, users can publish filters as set and user filters. Data sources can be refreshed or appended instead of republished – all from local sources. Such refreshes can be scheduled using Windows Task Scheduler or other task-scheduling software on client devices – this is real TDE proliferation!

My wishlist for Tableau 8 Server: all Tableau Server processes need to be 64-bit (they are still 32-bit, see here: http://onlinehelp.tableausoftware.com/v7.0/server/en-us/processes.htm ; they are way overdue to be 64-bit); a Linux version of Tableau Server is needed (Microsoft recently changed, very unfavorably, the way they charge for each client access); I wish for integration with the R library (Spotfire has had it for years); I want Backgrounder processes (mostly doing data extracts on the server) to stop consuming core licenses; etc…

And yes, I found in San Diego even more individuals who found a better way to spend their time than attending the Tableau 2012 Customer Conference, and I am not here to judge:

SealsInLaJolla

I left the Tableau 2012 conference in San Diego (where Tableau 8 was announced) a while ago with enthusiasm, which you can feel from this real-life picture of 11 excellent announcers:

Tableau8IntroducedInSanDiego

The conference was attended by 2200+ people and 600+ Tableau Software employees (Tableau almost doubled its number of employees in a year), and it felt like a great effort toward the IPO (see also the article here: http://www.bloomberg.com/news/2012-12-12/tableau-software-plans-ipo-to-drive-sales-expansion.html ). See some video here: TCC12 Keynote. Tableau 8 introduces 130+ new features, 3 times more than Tableau 7 did. Almost half of these new features are in Tableau 8 Desktop, and this post is about those new Desktop features (it is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/11/16/new-tableau-8-desktop-features/ ). The new Tableau 8 Server features deserve a separate blog post, which I will publish a little later, after playing with Beta 1 and maybe Beta 2.

A few days after the conference, the Tableau 8 Beta program started with 2000+ participants. One of the most promising features is the new rendering engine, and I built a special Tableau 7 visualization (and its port to Tableau 8) with 42,000 datapoints ( http://public.tableausoftware.com/views/Zips_0/Intro?:embed=y ) to compare the speed of rendering between versions 7 and 8:

ZipColors

Among the new features are new (for Tableau) visualization types: heatmap, "packed" bubble chart and word cloud; I built a simple Tableau 8 dashboard to test them (all 3 visualize a 3-dimensional set where 1 dimension is used as a list of items, 1 measure is used for size and a 2nd measure is used for the color of items):

3NewTypesOfVisualizationsInTableau

The list of new features includes improved sets (comparing members vs. non-members, adding/removing members, combining sets: all-in-both, shared-by-both, left-except-right, right-except-left), custom SQL with parameters, freeform dashboards (I still prefer an MDI UI where each chart/view sheet has its own child window as opposed to a pane), the ability to add multiple fields to labels, optimized label placement, built-in statistical models for visual forecasting, visual grouping based on your data selection, and a redesigned Marks card (for the Color, Size, Label, Detail and Tooltip shelves).

New data features include data blending without a mandatory linked field in the view and with the ability to filter data in secondary data sources; refreshing server-based data extracts from local data sources; and data filters that (in addition to being either local or global) can now be shared among a selected set of worksheets and dashboards. A refresh of a data extract can be done from the command prompt for Tableau Desktop, for example:

>tableau.exe refreshremoteextract

Tableau 8 (finally) has an API (C/C++, Python, Java) to directly create a Tableau Data Extract (TDE) file; see an example here: http://ryrobes.com/python/building-tableau-data-extract-files-with-python-in-tableau-8-sample-usage/
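To give a flavor of it, here is a minimal Python sketch of creating a small TDE. The module and method names follow the publicly posted sample linked above; treat them as illustrative, since the exact names may differ between versions of the Data Extract API.

```python
# Minimal sketch of building a .tde with the Python Data Extract API.
# Names follow the sample linked above and may vary by API version.
import dataextract as tde

extract = tde.Extract('sales.tde')                 # create (or open) the .tde file

table_def = tde.TableDefinition()                  # define the single table schema
table_def.addColumn('Region', tde.Type.CHAR_STRING)
table_def.addColumn('Sales', tde.Type.DOUBLE)
table = extract.addTable('Extract', table_def)     # the table is conventionally named 'Extract'

for region, sales in [('East', 1200.0), ('West', 950.5)]:
    row = tde.Row(table_def)
    row.setCharString(0, region)                   # column indexes follow addColumn order
    row.setDouble(1, sales)
    table.insert(row)

extract.close()                                    # flush the extract to disk
```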

Tableau 8 (both Desktop and Server) can then connect to this extract file natively! Tableau also provides new native connections for Google Analytics and Salesforce.com. TDE files are now much smaller (especially with text values) – up to 40% smaller compared with Tableau 7.

Tableau 8 has performance enhancements, such as the new ability to use hardware acceleration (from modern graphics cards), computing views within a dashboard in parallel (in Tableau 7 these computations were consecutive) and a new performance recorder that lets you estimate and tune the workload of various activities and functions and optimize the behavior of a workbook.

I still have a wishlist of features which are not implemented in Tableau, and I hope some of them will be implemented later: all Tableau processes are 32-bit (except the 64-bit version of the data engine for a server running on a 64-bit OS) and they are way overdue to be 64-bit; many users demand a Mac version of Tableau Desktop and a Linux version of Tableau Server (Microsoft recently changed, very unfavorably, the way they charge for each client access); I wish for an MDI UI for dashboards where each view of each worksheet has its own window as opposed to its own pane (Qlikview has done it from the beginning of time); I wish for integration with the R library (Spotfire has had it for years) and for scripting languages and an IDE (preferably Visual Studio); I want Backgrounder processes (mostly doing data extracts on the server) to stop consuming core licenses; etc…

Despite the great success of the conference, I found somebody in San Diego who did not pay attention to it (outside was 88F, sunny and beautiful):

HummingbirdInLaJolla

I have used LinkedIn for years to measure how many people mention Data Visualization tools on their profiles, how many LinkedIn groups are dedicated to those DV tools and what their group membership is. Recently these statistics have shown dramatic changes in favor of Qlikview and Tableau as the undisputed leaders in people's opinions.

Here is how many people mentioned specific tools on their profiles (statistics were updated on 9/4/12 and the numbers change every day):

  • Tableau – 18584,
  • Qlikview  - 17471,
  • Spotfire – 3829,
  • SAS+JMP – 3443,
  • PowerPivot – 2335

Sample of “People” search URL: http://www.linkedin.com/search/fpsearch?type=people&keywords=Tableau or http://www.linkedin.com/search/fpsearch?type=people&keywords=SAS+JMP

Here is how many groups are dedicated to each tool [in brackets is a "pessimistic" estimate of total non-overlapping membership]:

  • Qlikview – 169 [13000+],
  • Tableau – 76 [6000+],
  • Spotfire – 29 [2000+],
  • SAS (+AND+) JMP - 23 [2000+],
  • PowerPivot – 16 [2000+]

Sample of “Group” search URL: http://www.linkedin.com/search-fe/group_search?pplSearchOrigin=GLHD&keywords=Qlikview

I have felt guilty for many months now: I literally do not have time for a project I have wanted to do for a while: to compare Advizor Analyst and other Visual Discovery products from Advizor Solutions, Inc. with leading Data Visualization products like Tableau or Qlikview. I am asking visitors of my blog to volunteer and be a guest blogger here; the only pre-condition is that a guest blogger must be an expert in Advizor Solutions products and equally so in one of these 3: Tableau, Qlikview or Spotfire.

ADVIZOR’s Visual Discovery™ software is built upon strong data visualization technology spun out of a research heritage at Bell Labs that spans nearly two decades and produced over 20 patents. Formed in 2003, ADVIZOR has succeeded in combining its world-leading data visualization and in-memory-data-management expertise with predictive analytics to produce an easy to use, point and click product suite for business analysis.

Advizor has many Samples, Demos and Videos on its site: http://www.advizorsolutions.com/gallery/ and some web Demos, like this one

http://webnav.advizorsolutions.net/adv/Projects/demo/MutualFunds.aspx but you will need the Silverlight plugin for your web browser installed.

If you think that Advizor can compete with the Data Visualization leaders and you have an interesting comparison, please send it to me as an MS Word article and I will publish it here as a guest blog post. Thank you in advance…

(this is a repost from my other blog: http://tableau7.wordpress.com/2012/06/09/tableau-and-big-data/ )

Big Data can be useless without multi-layer data aggregations and hierarchical or cube-like intermediary data structures, where ONLY a few dozen, hundred or thousand data points are exposed visually and dynamically at every single viewing moment to analytical eyes, for interactive drill-down-or-up hunting for business value(s) and actionable data. One of the best expressions of this concept (at least as I interpreted it) I heard from my new colleague, who flatly said:

"Move the function to the data!"

I recently got involved with multiple projects using large data sets for Tableau-based Data Visualizations (100+ million rows and even billions of records!). Some of the largest examples I used were 800+ million records in one case and 2+ billion rows in another.

So this blog post is to express my thoughts about such Big Data (on average, the examples above have about 1+ KB per CSV record, before compression and other advanced DB tricks, like the columnar database used by Tableau's Data Engine) as a back end for Tableau. But please keep in mind that, as a 32-bit tool, Tableau itself is not ready for Big Data. In addition, I think Big Data is mostly a buzzword and BS, and we are sometimes forced by marketing BS masters to use this stupid term.

Here are some factors involved in data delivery from the main, designated database (back ends like Teradata, DB2, SQL Server or Oracle) into "local" Tableau-based Big Data Visualizations (many people are still trying to use Tableau as a reporting tool as opposed to a (visual) analytical tool):

  • Queuing of thousands of queries to the Database Server. There is no guarantee your Tableau query will be executed immediately; in fact it WILL be delayed.

  • The speed of a Tableau query, once it finally starts executing, depends on sharing CPU cycles, RAM and other resources with other queries executed SIMULTANEOUSLY with yours.

  • Buffers, pools and other resources available for particular users and queries on your Database Server differ and depend on the privileges and settings given to you as a Database User.

  • Network speed: between some servers it can be 10 Gbit (or even more); in most cases it is 1 Gbit inside server rooms; outside of server rooms I observed in many old buildings (over wired Ethernet) at most 100 Mbit coming into a user's PC; if you are using Wi-Fi it can be even less (say 54 Mbit?). Over the internet it can be less still (I observed speeds of 1 Mbit or so in some remote offices over old T-1 lines); if you are using VPN it will max out at 4 Mbit or less (I observed that in my home office).

  • Utilization of the network. I use the Remote Desktop Protocol (RDP) from my workstation or notebook to a VM or VDI (a Virtual Machine sitting in the server room) connected to the servers at 1 Gbit network speed, but it still uses at most 3% of that speed (about 30 Mbit, which is about 3 Megabytes of data per second, i.e. probably a few thousand records per second).

That means the network may have a problem delivering 100 million records to a "local" report even overnight (say 10 hours: 10 million records per hour, about 3,000 records per second) – partially and probably because of factor 4 above.
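Here is the back-of-envelope arithmetic behind that estimate as a small Python sketch, using the rounded figures from the bullets above (about 3 MB/s of effective throughput over RDP/VM and roughly 1 KB per record) as assumptions:

# Back-of-envelope estimate of record delivery over a constrained network.
# Assumptions (taken from the observations above): ~3 MB/s effective throughput
# and ~1 KB per CSV record before compression.
bytes_per_second = 3_000_000
bytes_per_record = 1024

records_per_second = bytes_per_second / bytes_per_record
records_per_night = records_per_second * 10 * 3600   # a 10-hour overnight window

print("~%.0f records per second" % records_per_second)
print("~%.0f million records per 10-hour night" % (records_per_night / 1e6))

With those assumptions you get roughly 3,000 records per second and barely above 100 million records per night – before any server-side queuing or query delays from the other factors above.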

On top of those factors, please keep in mind that Tableau is a set of 32-bit applications (with the exception of one out of 7 processes on the Server side), each restricted to 2GB of RAM; if a data-set cannot fit into RAM, then the Tableau Data Engine will use the disk as virtual RAM, which is much, much slower – and for some users such disk space is actually not local to their workstation but mapped to some "remote" network file server.

Tableau Desktop in many cases uses 32-bit ODBC drivers, which may add even more delay to data delivery into the local "Visual Report". As we learned from Tableau support itself, even with the latest Tableau Server 7.0.X, the RAM allocated for one user session is restricted to 3GB anyway.

Unfortunate Update: Tableau 8.0 will be a 32-bit application again, but maybe a follow-up version 8.x or 9 (I hope) will be ported to 64 bits… It means that Spotfire, Qlikview and even PowerPivot will keep some advantages over Tableau for a while…

(this is a repost from my other Data Visualization blog: http://tableau7.wordpress.com/2012/05/31/tableau-as-container/ )

Often I use small Tableau (or Spotfire or Qlikview) workbooks instead of PowerPoint, which proves at least 2 concepts:

  • A good Data Visualization tool can be used as a Web or Desktop Container for Multiple Data Visualizations (it can be used to build hierarchical Container Structures with more than 3 levels; currently 3: Container-Workbooks-Views)

  • It can be used as a replacement for PowerPoint; in the example below I embedded into this Container 2 Tableau Workbooks, one Google-based Data Visualization, 3 image-based Slides and a Textual Slide: http://public.tableausoftware.com/views/TableauInsteadOfPowerPoint/1-Introduction

  • Tableau (or Spotfire or Qlikview) is better than PowerPoint for Presentations and Slides

  • Tableau (or Spotfire or Qlikview) is the Desktop and the Web Container for Web Pages, Slides, Images, Texts

  • A good Visualization Tool can be a Container for other Data Visualizations

  • Sample Tableau Presentation above contains the Introductory Textual Slide

  • Sample Tableau Presentation above contains a few Tableau Visualizations:

    1. The Drill-down Demo

    2. The Motion Chart Demo (6 dimensions: X, Y, Shape, Color, Size, Motion in Time)

  • This Tableau Presentation contains a Web Page with the Google-based Motion Chart Demo

  • This Tableau Presentation contains a few Image-based Slides:

    1. The Quick Description of Origins and Evolution of Software and Tools used for Data Visualizations during last 30+ years

    2. The Description of Multi-level Projection from Multidimensional Data Cloud to Datasets, Multidimensional Cubes and to Chart

    3. The Description of 6 stages of Software Development Life Cycle for Data Visualizations

Some people are pushing me to respond to Donald Farmer's recent comments on my previous post, but I need more time to think about it.

Meanwhile today Ted Cuzzillo published an interesting comparison of Qlikview vs. Tableau here:

http://datadoodle.com/2012/04/24/tableau-qlikview/

named "The future of BI in two words", which made me feel warm and fuzzy about both products but left me unclear about what Ted's judgement actually is.

Fortunately I had a more "digitized" comparison of these 2 Data Visualization Leaders, which I did a while ago for a different reason. So I modified it a little to bring it up to date, and you can see it for yourself below. The funny thing is that even though I used 30+ criteria to measure and compare those two brilliant products, the final score is almost identical for both of them, so it is still warm and fuzzy.

Basically the conclusion is simple: each product is better for certain customers and certain projects; there is no universal answer (yet?):

The short version of this post: as far as Data Visualization is concerned, the new Power View from Microsoft is a marketing disaster, an architectural mistake and a generous gift from Microsoft to Tableau, Qlikview, Spotfire and dozens of other vendors.

For the long version – keep reading.

Assume for a minute (OK, just for a second) that the new Power View Data Visualization tool from Microsoft SQL Server 2012 is almost as good as Tableau Desktop 7. Now let's compare the installation, configuration and hardware involved:

Tableau:

  1. Hardware:  almost any modern Windows PC/notebook (at least dual-core, 4GB RAM).
  2. Installation: a) one 65MB setup file, b) minimum or no skills
  3. Configuration: 5 minutes – follow instructions on screen during installation.
  4. Price – $2K.

Power View:

  1. Hardware: you need at least 2 server-level PCs (each at least quad-core, 16GB RAM recommended). I do not recommend using 1 production server to host both SQL Server and SharePoint; if you are desperate, at least use VM(s).
  2. Installation: a) Each Server  needs Windows 2008 R2 SP1 – 3GB DVD; b) 1st Server needs SQL Server 2012 Enterprise or BI Edition – 4GB DVD; c) 2nd Server needs SharePoint 2010 Enterprise Edition – 1GB DVD; d) A lot of skills and experience
  3. Configurations: Hours or days plus a lot of reading, previous knowledge etc.
  4. Price: $20K or if only for development it is about $5K (Visual Studio with MSDN subscription) plus cost of skilled labor.

As you can see, Power View simply cannot compete on the mass market with Tableau (and Qlikview and Spotfire), and the time for our assumption at the beginning of this post has expired. Instead, now is the time to remind everyone that Power View is 2 generations behind Tableau, Qlikview and Spotfire. And there is no Desktop version of Power View; it is only available as a web application through a web browser.

Power View is a Silverlight application packaged by Microsoft as a SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Server 2010 Enterprise Edition. Power View is an (ad-hoc) report designer providing users with an interactive data exploration, visualization and presentation web experience. Microsoft stopped developing Silverlight in favor of HTML5, but Silverlight survived (another mistake) within the SQL Server team.

Previous report designers (still available from Microsoft: BIDS, Report Builder 1.0, Report Builder 3.0, Visual Studio Report Designer) are capable of producing only static reports, but Power View enables users to visually interact with data and drill down through all charts and dashboards, similar to Tableau and Qlikview.

Power View is a Data Visualization tool, integrated with Microsoft ecosystem. Here is a Demo of how the famous Hans Rosling Data Visualization can be reimplemented with Power View:

Compared with previous report builders from Microsoft, Power View offers many new features, like Multiple Views in a Single Report, Gallery preview of Chart Images, export to PowerPoint, sorting within Charts by measures and categories, Multiple Measures in Charts, highlighting of selected data in reports and Charts, synchronization of Slicers (Cross-Filtering), Measure Filters, search in Filters (convenient for long lists of categories), dragging data fields onto the Canvas (to create a table) or onto Charts (to modify the visualization), converting measures to categories ("Do Not Summarize"), and many other features.

As with any 1st release from Microsoft, you can find some bugs in Power View. For example, KPIs are not supported in Power View in SQL Server 2012, see it here: http://cathydumas.com/2012/04/03/using-or-not-using-tabular-kpis/

Power View is not Microsoft's 1st attempt to become a full player in the Data Visualization and BI Market. Previous attempts failed and can be counted as Strikes.

Strike 1: The ProClarity acquisition in 2006 failed; there have been no new releases since v. 6.3. Remnants of ProClarity can be found embedded into SharePoint, but there is no Desktop Product anymore.

Strike 2: Performance Point Server was introduced in November 2007 and discontinued two years later. Remnants of Performance Point can be found embedded into SharePoint as Performance Point Services.

Both failed attempts were focused on the growing Data Visualization and BI space, specifically at fast-growing competitors such as Qliktech, Spotfire and Tableau. Their remnants in SharePoint are functionally far behind the Data Visualization leaders.

The path to Strike 3 started in 2010 with the release of PowerPivot (a very successful half-step, since it is just a back-end for visualization) and xVelocity (originally released under the name VertiPaq). Power View is a continuation of these efforts to add a front-end to the Microsoft BI stack. I do not expect Power View to gain as much popularity as Qlikview and Tableau, and in my mind Microsoft will be the subject of a 3rd strike in the Data Visualization space.

One reason I described at the very beginning of this post, and the 2nd reason is the absence of Power View on the desktop. It is a mystery to me why Microsoft did not implement Power View as a new part of Office (like Visio, which is a great success) – as a new desktop application, or as a new Excel Add-in (like PowerPivot), or as new functionality in PowerPivot or even in Excel itself, or as a new version of their Report Builder. None of these options prevents having a Web reincarnation of it, and such a reincarnation could be done as a part of (native SSRS) Reporting Services – why involve SharePoint (which is – and I have said it many times on this blog – basically a virus)?

I am wondering what Donald Farmer is thinking about Power View after being part of the Qliktech team for a while. From my point of view, Power View is a generous gift and a true relief to Data Visualization Vendors, because they do not need to compete with Microsoft for a few more years, or maybe forever. Now the IPO of Qliktech makes even more sense to me, and the upcoming IPO of Tableau makes much more sense to me too.

Yes, Power View means new business for consulting companies and Microsoft partners (because many client companies and their IT departments cannot handle it properly), and Power View has good functionality, but it will be counted in history as Strike 3.

(this is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/04/02/palettes-and-colors/ )

I was always intrigued by colors and their usage, ever since my mom told me that maybe (just maybe, there is no direct proof of it anyway) the Ancient Greeks did not know what the BLUE color is – that puzzled me.

Later in my life I realized that Colors and Palettes play a huge role in Data Visualization (DV), and it eventually led me to attempt to understand how they can be used and pre-configured in advanced DV tools to make Data more visible and to express Data Patterns better. For this post I used Tableau to produce some palettes, but a similar technique can be found in Qlikview, Spotfire etc.

Tableau published a good article on how to create customized palettes here: http://kb.tableausoftware.com/articles/knowledgebase/creating-custom-color-palettes and I followed it below. As this article recommends, I modified the default Preferences.tps file; see it below with images of the respective Palettes embedded.

For the first, regular Red-Yellow-Green-Blue Palette with known colors with well-established names, I even created a Visualization in order to compare their Red-Green-Blue components, and I tried to place the respective Bubbles on a 2-dimensional surface, even though the Dataset is clearly 3-dimensional (click on the image to see it in full size):

For the 2nd, Red-Yellow-Green-NoBlue Ordered Sequential Palette, I tried to implement an extended "Set of Traffic Lights without any trace of BLUE Color" (so Homer and Socrates would understand it the same way we do) while trying to use only web-safe colors. Please keep in mind that Tableau does not have a simple way to have more than 20 colors in one Palette, like Spotfire does.

The other 5 Palettes below are useful too, as ordered-diverging, almost "mono-chromatic" palettes (except Red-Green Diverging, since it can be used in Scorecards where Red is bad and Green is good). So see below the Preferences.tps file with my 7 custom palettes.

<?xml version='1.0'?> <workbook> <preferences>
<color-palette name="RegularRedYellowGreenBlue" type="regular">
<color>#FF0000</color> <color>#800000</color> <color>#B22222</color>
<color>#E25822</color> <color>#FFA07A</color> <color>#FFFF00</color>
<color>#FF7E00</color> <color>#FFA500</color> <color>#FFD700</color>
<color>#F0e68c</color> <color>#00FF00</color> <color>#008000</color>
<color>#00A877</color> <color>#99cc33</color> <color>#009933</color>
<color>#0000FF</color> <color>#00FFFF</color> <color>#008080</color>
<color>#FF00FF</color> <color>#800080</color>

</color-palette>

<color-palette name="RedYellowGreenNoBlueOrdered" type="ordered-sequential">
<color>#ff0000</color> <color>#cc6600</color> <color>#cccc00</color>
<color>#ffff00</color> <color>#99cc00</color> <color>#009900</color>

</color-palette>

<color-palette name="RedToGreen" type="ordered-diverging">
<color>#ff0000</color> <color>#009900</color> </color-palette>

<color-palette name="RedToWhite" type="ordered-diverging">
<color>#ff0000</color> <color>#ffffff</color> </color-palette>

<color-palette name="YellowToWhite" type="ordered-diverging">
<color>#ffff00</color> <color>#ffffff</color> </color-palette>

<color-palette name="GreenToWhite" type="ordered-diverging">
<color>#00ff00</color> <color>#ffffff</color> </color-palette>

<color-palette name="BlueToWhite" type="ordered-diverging">
<color>#0000ff</color> <color>#ffffff</color> </color-palette>
</preferences> </workbook>
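If you need longer ramps than the 2-color ordered-diverging palettes above, the <color> entries can be generated instead of typed by hand. Below is a minimal Python sketch (my own hypothetical helper, not a Tableau utility) that linearly interpolates between two end colors and prints a ready-to-paste <color-palette> element for Preferences.tps:

def hex_to_rgb(h):
    # '#ff0000' -> (255, 0, 0)
    h = h.lstrip('#')
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def interpolate(start_hex, end_hex, steps):
    # Linear interpolation in RGB space between the two end colors.
    s, e = hex_to_rgb(start_hex), hex_to_rgb(end_hex)
    for i in range(steps):
        t = i / (steps - 1)
        yield '#%02x%02x%02x' % tuple(round(s[c] + (e[c] - s[c]) * t) for c in range(3))

def palette_xml(name, start_hex, end_hex, steps=7, ptype='ordered-diverging'):
    colors = ' '.join('<color>%s</color>' % c for c in interpolate(start_hex, end_hex, steps))
    return '<color-palette name="%s" type="%s"> %s </color-palette>' % (name, ptype, colors)

# Example: a 7-step version of the "RedToWhite" palette above.
print(palette_xml('RedToWhite7', '#ff0000', '#ffffff'))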

In case you wish to use the colors you like, this site is very useful for exploring the properties of different colors: http://www.perbang.dk/rgb/

(this is a repost from http://tableau7.wordpress.com/2012/03/31/tableau-reader/ )

Tableau made a couple of brilliant decisions to completely outsmart its competitors and gained extreme popularity, while convincing millions of potential, future and current customers to invest their own time in learning Tableau. The 1st reason of course is Tableau Public (we discuss it in a separate blog post) and the other is the free Tableau Reader, which provides the full desktop user experience and interactive Data Visualization without any Tableau Server (or any other server) involved, and with better performance and UI than Server-based Visualizations.

While designing Data Visualizations is done with Tableau Desktop, most users get their Data Visualizations served by Tableau Server to their Web Browser. However, in large and small organizations alike, that usage pattern is not always the best fit. Below I discuss a few possible use cases where the usage of the free Tableau Reader can be appropriate; see it here: http://www.tableausoftware.com/products/reader .

1. Tableau Application Server serves Visualizations well, but not as well as Tableau Reader, because Tableau Reader delivers a truly desktop User Experience and UI. The best-known example is a Motion Chart: you can see automatic motion with Tableau Reader, but a Web Browser will force the user to manually emulate the motion. In cases like that, the user is advised to download the workbook, copy the .TWBX file to his/her workstation and open it with Tableau Reader.

Here is an example of a Motion Chart, done in Tableau, similar to Hans Rosling's famous presentation of Gapminder's Motion Chart (and you need the free Tableau Reader or a license for Tableau Desktop to see the automatic motion of the 6-dimensional dataset with all the colored bubbles resizing over time):
http://public.tableausoftware.com/views/MotionChart_0/Motion?:embed=y

Please note that the same Motion Chart using Google Spreadsheets will run in a browser just fine (I guess because Google "bought" Gapminder and kept its code intact):
https://docs.google.com/spreadsheet/ccc?key=0AuP4OpeAlZ3PdC14OXU1RGJsV05uaDlxRV9GLXlTZXc#gid=2

2. When you have hundreds or thousands of Tableau Server users and more than a couple of Admins (users with Administrative privileges), each of the Admins can override viewing privileges for any workbook, regardless of the Users and User Groups designated for that workbook. In such a situation there is a risk of violating the privacy and confidentiality of the data involved, for example in HR Analytics and HR Dashboards and other Visualizations where private, personal and confidential data is used.

Tableau Reader enables an additional, complementary method of delivering Data Visualizations through private channels like password-protected portals, file servers and FTP servers, and in certain cases even bypassing Tableau Server entirely.

3. Due to the popularity of Tableau and its ease of use, many groups and teams are considering Tableau as a vehicle for delivering hundreds and even thousands of Visual Reports to hundreds and maybe even thousands of users. That can slow down Tableau Server, degrade the user experience and create even more confidentiality problems, because it may expose confidential data to unintended users, like the report for one store to users from another store.

4. Many small (and not so small) organizations try to save on Tableau Server licenses (at least initially), and they still can distribute Tableau-based Data Visualizations: developer(s) will have Tableau Desktop (a relatively small investment) and users, clients and customers will use Tableau Reader, while all TWBX files can be distributed over FTP, portals, file servers or even by email. In my experience, when the Tableau-based business grows enough, it will pay by itself for Tableau Server licenses, so usage of Tableau Reader is in no way a threat to Tableau Software's bottom line!

Update (12/12/12) for even happier usage of Tableau Reader: in the upcoming Tableau 8, all Tableau Data Extracts – TDEs – can be created and used without any Tableau Server involved. Instead, a Developer can create/update a TDE either with Tableau in UI mode, or using the Tableau Command Line Interface to script TDEs in batch mode, or programmatically with the new TDE API (Python, C/C++, Java). It means that Tableau workbooks can be automatically refreshed with new data without any Tableau Server and re-delivered to Tableau Reader users over … FTP, portals, file servers or even by email.
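To give a flavor of what "programmatically with the new TDE API" can look like, here is a minimal Python sketch based on how I recall the 1.x "dataextract" package; the module and method names may differ in your SDK version, so treat it as a hedged illustration rather than production code:

import dataextract as tde   # Tableau Data Extract API (Python package name as I recall it)

# The Extract API expects the single table inside a TDE to be named 'Extract'.
extract = tde.Extract('sales.tde')
if extract.hasTable('Extract'):
    table = extract.openTable('Extract')
    schema = table.getTableDefinition()
else:
    schema = tde.TableDefinition()
    schema.addColumn('Region', tde.Type.UNICODE_STRING)
    schema.addColumn('Sales', tde.Type.DOUBLE)
    table = extract.addTable('Extract', schema)

# Append one row; a real batch job would loop over a CSV file or a database cursor here.
row = tde.Row(schema)
row.setString(0, 'East')
row.setDouble(1, 1234.56)
table.insert(row)

extract.close()   # sales.tde can now be re-delivered by FTP, portal, file server or email

Schedule a script like this with Windows Task Scheduler or cron and you get nightly refreshed extracts that can be re-packaged and re-delivered to Tableau Reader users without touching Tableau Server.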

Dan Primack, Senior Editor at Fortune, posted today at http://finance.fortune.cnn.com/2012/02/22/tableau-to-ipo-next-year/ a suggestion that Tableau could go public next year, and I quote:

"Scott Sandell, a partner with New Enterprise Associates (the venture capital firm that is Tableau's largest outside shareholder), told Dan that the "board-level discussions" are about taking the company public next year, even though it has the numbers to go out now if it so chose. Sandell added that the company has been very efficient with the $15 million or so it has raised in VC funding, and that it shouldn't need additional pre-IPO financing."

Mr. Primack also mentioned an unsolicited email from an outside spokesman: "Next week Tableau Software will announce its plans to go IPO"…

I do not have comments, but I will not be surprised if somebody buys Tableau before the IPO… Among potential buyers I can imagine:

  • Microsoft (Seattle, Multidimensional Cubes, integration with Excel),
  • Teradata (Aster Data is in, front-end for “big data” is needed),
  • IBM (if you cannot win against the innovator, how about buying it),
  • and even Oracle (everything moving is the target?)…

Qliktech made its price list public on its website. In a move that calls for “other enterprise software and business intelligence vendors to follow suit, QlikTech is taking the mystery out of purchasing software“.

I expanded this post with comments and comparison of pricing from Qlikview and Tableau.

I have to mention that Tableau has had pricing on its website for years. I wish Tableau would also publish on its website the pricing for the Core License (for Tableau Server) and more detail on Tableau Digital and Server pricing, but other than that, Tableau is a few years ahead of Qliktech in terms of "pricing transparency"… Also, talking with Qliktech sales people was until today more time-consuming than needed, and I hope that public pricing will make it easier.

One note about Qlikview pricing: Qliktech's very weird requirement to buy a Document License ($350 per named user, per 1 (ONE) document) for each document is a potential time-bomb for Qlikview. But they are very good at sales (total Q4 2011 revenue of $108.1 million, an increase of 33% compared to the fourth quarter of 2010, see http://investor.qlikview.com/secfiling.cfm?filingID=1193125-12-65355&CIK=1305294) and I am not, so I will be glad if Qliktech proves me wrong!

 Again, for now, just review this:

http://www.qlikview.com/us/explore/pricing

I tried to compare the cost of an average Deployment for Qlikview-based and Tableau-based Data Visualization Systems using the currently published prices of Qlikview and Tableau (I actually have an estimate for a Spotfire-based deployment too, but TIBCO has not published its pricing yet). See the prices in the table below, and the comparison of an average deployment after/below this table:

I took as average a deployment with 46 users (my estimate of an average Qlikview Deployment), 3 desktop clients, 10 documents/visualizations available to 10 (potentially different) named users each, 1 Application Server and maintenance for 3 years.


My estimate of the total cost for 3 years came out at about $118K for the Qlikview Deployment and $83K for the Tableau Deployment (both before discounts and taxes, and neither includes any development, training, consulting or IT cost).
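For those who want to reproduce or adjust such an estimate, here is a rough sketch of the cost model in Python. Only the Tableau unit prices below come from list prices quoted elsewhere on this blog (Desktop Professional at $1,999 and Server at $1,000 per named user with a 10-user minimum); the 20% maintenance rate and all Qlikview figures are placeholders that you should replace with the numbers from the price tables above:

# Rough 3-year deployment cost model; prices marked as placeholders must be
# replaced with the actual figures from the vendor price tables above.
YEARS = 3
MAINTENANCE_RATE = 0.20   # assumed 20% of license cost per year (placeholder)

def three_year_cost(items):
    """items: list of (description, unit_price, quantity) tuples."""
    licenses = sum(price * qty for _, price, qty in items)
    return licenses + licenses * MAINTENANCE_RATE * YEARS

tableau_items = [
    ('Desktop Professional', 1999, 3),            # list price per seat
    ('Server, named user (min 10)', 1000, 46),    # list price per named user
]

qlikview_items = [
    ('Desktop developer seat', 0, 3),             # placeholder - see price table above
    ('Named/Document User CALs', 0, 46),          # placeholder - see price table above
    ('Application Server', 0, 1),                 # placeholder - see price table above
]

print('Tableau, 3-year estimate:  $%.0f' % three_year_cost(tableau_items))
print('Qlikview, 3-year estimate: $%.0f' % three_year_cost(qlikview_items))

With those Tableau list prices and the assumed 20% maintenance, the model lands at roughly $83K for 3 years – the same ballpark as my estimate above; filling in the Qlikview placeholders should bring its total to around $118K.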

Note 3/8/12: you may wish to review this blog post too:

http://i3community.com/blogs/entry/qlikview-user-license-named-client-access-license-cal

Since Gartner keeps doing its “Magic Quadrant” (MQ; see MQ at the very bottom of this post) for Business Intelligence Platforms every year, it forces me to do my

“Yellow Square for DV, 2012″

for Data Visualization (DV) Platforms too. I did it last year, and I have to do it again because I disagreed with Gartner in 2011 and I disagree with it again in 2012. I have a few views that differ from Gartner's, but I will mention 3.

1. There is no such thing as Business Intelligence as a software platform. It is a marketing term, used as an umbrella for multiple technologies and market segments. Gartner released its MQ for BI at the same time it held its "BI Summit 2012" in London, at which it practically acknowledged that BI is not a correct term and suggested using the term "Business Analytics" instead; see for example this article: http://timoelliott.com/blog/2012/02/what-i-found-interesting-about-gartner-bi-summit-2012-london.html

2. I personally have been using – for many years – the term Data Visualization as a replacement for BI, as it is much more specific. Because of that, I removed from consideration a few vendors present in Gartner's MQ for BI and added a few important DV vendors.

3. I used for my assessment 3 groups of criteria, which I have already used on this blog before, for example here:

https://apandre.wordpress.com/2011/12/18/dv-comparison-2011/

and here:

https://apandre.wordpress.com/tools/comparison/

As a result, I got a placement of "Data Visualization Platforms and their vendors" that is very different from Gartner's:


For reference purposes, please see below the Magic Quadrant for BI, published by Gartner this month. As you can see, our lists of Vendors overlap by 11 companies, but in my opinion their relative positioning is very different:

This is a repost from my Tableau-dedicated blog: http://tableau7.wordpress.com/2012/01/17/tableau-7/

2011 was the Year of Tableau, with almost 100% (again!) Year-over-Year growth ($72M in sales in 2011; see the interview with Christian Chabot here: http://www.xconomy.com/seattle/2012/01/27/tableaus-10th-year/ ), with 163+ new employees (350 employees total as of the end of 2011) – below is the column chart I found on Tableau's website:

and with the tremendous popularity of Tableau Public and the free Tableau Desktop Reader. In January 2012 Tableau Software disclosed a new plan to hire 300 more people in 2012, basically doubling its size in 2012 – and all of this is great news!

Tableau 7.0 was released in January 2012 with 40+ new cool features. I like them, but I wish for 4+ more "features". Mostly I am puzzled what the wizards from Seattle were thinking when they released (in 2012!) their Professional Desktop Client only as a 32-bit program.

Most interesting for me is the doubling of the performance and scalability of Tableau Server for 100+ user deployments (while adding multi-tenancy, which is a sign of maturing toward large enterprise customers):

and the addition of "Data Server" features, like sharing data extracts (Tableau-optimized, DB-independent file containers for datasets) and metadata across visualizations (Tableau applications called workbooks), automatic (through proxies) live reconnection to datasources, support for new datasources like Hadoop (since 6.1.4) and Vectorwise, and the new "Connect to Data" Tab:

Tableau's target operating system is Windows 7 (both 64-bit and 32-bit, but for Data Visualization purposes 64-bit is the most important). Tableau rightfully claims to complement Excel 2010 and PowerPivot (64-bit again), Access 2010 (64-bit) and SQL Server 2012 (64-bit), and its competitors have supported 64-bit for a while (e.g. Qlikview Professional has had both 64-bit and 32-bit clients for years).

Even Tableau's own in-memory Data Engine (required for use with Tableau Professional) is a 64-bit executable (if running under 64-bit Windows). I am confused and hope that Tableau will have a 64-bit client as soon as possible (what is the big deal here? don't explain, don't justify, just do it! On the Tableau site you can find attempts to explain/justify, like this: "There is no benefit to Tableau supporting 64-bit for our processing. The amount of data that is useful to display is well within the reach of 32 bit systems" – but that was not my (Andrei's) experience with competitive tools). I also noticed that under 64-bit Windows 7 the Tableau Professional client uses at least 4 executables: the 32-bit tableau.exe (the main Tableau program), the 64-bit tdeserver64.exe (Tableau Data Engine) and two 32-bit instances of the Tableau Protocol Server (tabprotosrv.exe) – which looks strange (at least) to me…
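If you are curious which Tableau processes on your own 64-bit Windows machine are 32-bit and which are 64-bit, here is a small Python sketch (Windows-only; it assumes the third-party psutil package is installed) that lists them and checks each one via the WOW64 flag:

import ctypes
import psutil  # third-party package, assumed installed: pip install psutil

def is_32bit(pid):
    """On 64-bit Windows: True if the process runs under WOW64 (i.e. it is 32-bit)."""
    PROCESS_QUERY_LIMITED_INFORMATION = 0x1000
    handle = ctypes.windll.kernel32.OpenProcess(PROCESS_QUERY_LIMITED_INFORMATION, False, pid)
    if not handle:
        return None  # no access to this process
    try:
        wow64 = ctypes.c_int(0)
        ctypes.windll.kernel32.IsWow64Process(handle, ctypes.byref(wow64))
        return bool(wow64.value)
    finally:
        ctypes.windll.kernel32.CloseHandle(handle)

for proc in psutil.process_iter(['pid', 'name']):
    name = (proc.info['name'] or '').lower()
    if name.startswith(('tableau', 'tdeserver', 'tabprotosrv')):
        bits = is_32bit(proc.info['pid'])
        label = 'unknown' if bits is None else ('32-bit' if bits else '64-bit')
        print(proc.info['pid'], proc.info['name'], label)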

You can also find on Tableau's site users reporting that Tableau 6.X underuses multi-core processors: "Tableau isn't really exploiting the capabilities of a multi-core architecture, so speed was more determined by relative speeds of one core of a core 2 duo vs 1 core of an i7 – which weren't that different, plus any differences in disk and memory speed". Good news: I tested Tableau 7.0 and it uses multi-core CPUs much better than 6.X!

Of course, the most appealing and sexy new features in Tableau 7.0 are related to mapping. For example, I was able to quickly create a Filled Map showing the income differences between the states of the USA:

Other mapping features include wrapped maps, more synonyms and mixed mark types on maps (e.g. PIE instead of BUBBLE), the ability to edit locations and add new locations, as well as using Geography as Mark(s), like I did below:

etc.

Tableau 7.0 supports new types of Charts (e.g., finally, Area Charts) and has a new Main Menu, which actually changes a lot of the places where users find menu items; see it here: http://kb.tableausoftware.com/articles/knowledgebase/new-locations

Tableau added many analytical and convenience features for users, like parameter-based Reference Lines, Top N filtering and Bins, and Enhanced Summary Statistics (e.g. median, deviation, quartiles, kurtosis and skewness were added):

Trend models are greatly improved (added t-value, p-value, confidence bands, exponential trends, exporting of trends etc.). Tableau 7.0 now has 1-click and dynamic sorting and much better support for tooltips and colors.

I hope Tableau will implement my other 3+ wishes (in addition to my wish to have a 64-bit Tableau Professional "client"): release an API, support scripting (Python, JavaScript, VBScript, PowerShell, whatever) and integrate with the R Library as well.

One of the most popular posts on this blog was a comparison of Data Visualization Tools, originally posted more than a year ago, where I compared those best tools only qualitatively. However, since then I have received a lot of requests to compare those tools "quantitatively". The justification for such an update was the recent releases of Spotfire 4.0, Qlikview 11, Tableau 7.0 and Microsoft's Business Intelligence Stack (mostly SQL Server 2012 and PowerPivot V.2).


However, I quickly realized that such a "quantitative" comparison cannot be objective. So here it is – the updated and very subjective comparison of the best Data Visualization tools, as I see them at the end of 2011. I know that many people will disagree with my assessment, so if you do not like my personal opinion – please disregard it at "your own peril". I am not going to prove the "numbers" below – they are just my personal assessments of those 4 technologies – I love all 4 of them. Feel free to make your own comparison, and if you can share it with me – I will appreciate it very much.


Please keep in mind that I reserve the right to modify this comparison over time if/when I learn more about all those technologies, their vendors and their usage. The criteria used in the comparison below are listed in the 1st column and are grouped into 3 groups: business, visualization and technical. Columns 2-5 hold my assessments of the 4 technologies, the last column holds my subjective weight for each criterion, and the last row of this worksheet has the Total for each Data Visualization technology I evaluated.

Some visitors to this blog, after reading my recent post about $300K/employee/year as a KPI (Key Performance Indicator), suggested to me another indicator of the health of Data Visualization vendors: the number of job openings, and specifically the number and percentage of software development openings (I include software testers and software managers in this category), to be used also as a predictor of the future. Fortunately this is public data, and below is what I got today from the respective websites:

  • 56(!) positions at Tableau, 14 of them are developers;

  • 46 openings at Qliktech, 4 of them are developers;

  • 21 positions at Spotfire, 3 of them are developers;

  • 3 positions at Visokio, 2 of them are developers.

Considering that Tableau is 4 times smaller than Qliktech in terms of sales and 3-4 times smaller in terms of workforce, this is an amazing indicator. If Tableau can sustain this speed of growth, we may soon witness a change of the Data Visualization landscape, unless Qliktech can find a way to defend its dominant position (50% of the DV market).


For comparison, you can use Microstrategy's number of openings. While Microstrategy is not a Data Visualization vendor, it is close enough (as a BI vendor) for benchmarking purposes: it has 281 openings, 38 of them developers, and Microstrategy's current workforce is about 3,069 – basically 3 times more than Qliktech's workforce…


In light of the recent releases of Qlikview 11 and Spotfire 4.0, the (soon to be released) Tableau 7.0 becomes very interesting to compare… Stay tuned!

I expected Qlikview 11 to be released on 11/11/11, but it was released today to Qliktech partners and customers. Since Qliktech is a public company, it regularly releases a lot of information which is not available (for now) from other DV leaders like Tableau and Visokio, and which is fuzzier for Spotfire, because Spotfire is just a part of the larger successful public corporation TIBCO, which has many other products to worry about.

However, I guessed a little and estimated the 2011 sales and number of employees for the DV Leaders, and got an interesting observation which has held true for the last few years: sales per employee (for a leading DV vendor) are $300K/year or less. I included for comparison purposes similar numbers for Apple, Microsoft and Google, as well as for Microstrategy, which is a public company, an established (22+ years) player in the BI market, dedicated to BI and recently to Data Visualization (thanks to its Visual Insight product).

The table below includes 2 records related to Spotfire: one based on the 2010 annual report from TIBCO (for TIBCO as a whole; I know TIBCO sales for 2011 grew from $754M to $920M, but I do not know the exact number of TIBCO's employees for 2011) and the other is my estimate (of the number of employees and sales) for the Spotfire division of TIBCO. Update from 1/11/12: for Tableau's 2011 numbers I used John Cook's article here: http://www.geekwire.com/2012/tableau-software-doubles-sales-2011-hires-160-workers :

To me this is an interesting phenomenon, because Qliktech, thanks to its fast-growing sales and recent IPO, was able to double its sales in the last 2 years while … doubling its number of employees, so its sales still hover around $300K/employee/year, while the software giants Apple, Microsoft and Google are way above this barrier and Microstrategy is 50% below it. I will also guess that Qliktech will try to break this $300K barrier and get closer to Apple/Microsoft/Google in terms of sales per employee.
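The KPI itself is just revenue divided by headcount; here is a tiny Python sketch with the figures quoted on this blog (Tableau's roughly $72M 2011 sales and ~350 employees come from the interview cited elsewhere in these posts; the Qliktech headcount is left as a placeholder since I do not quote an exact 2011 number):

# Sales-per-employee KPI; revenue figures as quoted in the posts, headcount partly unknown.
vendors = {
    'Tableau (2011)':  (72_000_000, 350),    # ~$72M sales, ~350 employees
    'Qliktech (2011)': (315_000_000, None),  # ~$315M revenue estimate; headcount placeholder
}

for name, (revenue, employees) in vendors.items():
    if employees:
        print('%s: ~$%.0fK per employee' % (name, revenue / employees / 1000))
    else:
        print('%s: fill in the headcount to compute the KPI' % name)

For Tableau that works out to roughly $206K per employee – comfortably under the $300K/employee/year ceiling discussed above.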

Thanks to the public nature of Qliktech, we know the details of its annual Revenue growth and YoY (Year-over-Year) indicators:

and with an estimated 2011 Revenue of about $315M, YoY growth (2011 over 2010) will be around 39.4%, which is an excellent result, making it difficult (but still possible) for other DV competitors to catch up with Qliktech. The best chance for this belongs to Tableau Software, which will probably reach the same size of sales in 2011 as Spotfire (my estimate is around $70M-$75M for both), but for the last 2 years Tableau has had 100% (or more) YoY revenue growth… Qliktech also published interesting info about the major factors in its sales: Europe (56%), Existing Customers (58%), Licenses (61%), Partners (52%):

which means that increasing sales in the Americas, improving New sales (as opposed to sales to existing customers via the "Land and Expand" approach) and improving revenue from Services and Maintenance may help Qliktech to keep the pace. Qliktech has a tremendous advantage over its DV competitors because it has 1200+ partners, who contributed 52% of Qliktech's sales (about $136K per partner, and I can guess that Qliktech wishes to see at least a $200K/year contribution from each partner).

Observing the strengths of other DV competitors, I personally think that Qliktech would benefit from "imitating" some of their most popular and successful features in order to keep its dominance in the Data Visualization market, including:

  • a free public Qlikview service (with obvious limitations), like the free SaaS of Tableau Public and the free Spotfire Silver personal edition,

  • the ability to distribute Data Visualizations to desktops without a Server, by making available a free desktop Qlikview Reader (similar to the free desktop readers from Tableau and Omniscope/Visokio),

  • integration with the R library (as in Spotfire and, recently, Omniscope) to improve the analytical power of Qlikview users,

  • the ability to read multidimensional OLAP Cubes (currently only Tableau can do that), especially Cubes from Microsoft SQL Server 2012 Analysis Services, and

  • scalability toward Big Data (currently Spotfire's and Tableau's data engines can use disk space as Virtual Memory, but Qlikview is limited by the size of RAM)

This is not a never-ending "feature war" but rather a potential ability to say to customers: "why go to competitors, if we have all their features and much more"? Time will tell how the DV competition plays out. I expect a very interesting 2012 for the Data Visualization market and its users, and I hope that somebody will be able to break the $300K/employee/year barrier, unless a major M&A changes the composition of the DV market. I hope the DV revolution will continue in the new year…

7 months ago I published a poll on LinkedIn and got a lot of responses: 1,340 votes (on average 1 vote per hour) and many comments. People have asked me many times to repeat this poll from time to time. I guess it is time to re-poll. I added 2 more choices (LinkedIn allows a maximum of 5 choices in its polls, which is clearly not enough for this poll), based on the feedback I got: Omniscope and Visual Insight/Microstrategy. I also got some angry voters complaining that certain vendors are funding this poll. This is completely FALSE: I am unaffiliated with any of the vendors mentioned in this poll, and I work for a software company completely independent from those vendors; see the About page of this Blog.


Today Tableau 6.1 is released (along with the client for iPad and Tableau Public for iPad), which includes full support for incremental Data updates, whether they are scheduled or on demand:

New in Tableau 6.1

  • Incremental Data updates scheduled or on demand
  • Text parser is faster and can parse any text file as a data source (no 4GB limit)
  • Files larger than 2GB can now be published to Tableau Server (more “big data” support)
  • Impersonation for SQL Server and Teradata; 4 times faster Teradata reading
  • Tableau Server auto-enables touch, pinch, zoom, gesture UI for Data Views
  • Tableau iPad app is released, it browses and filters content on the Server
  • Any Tableau Client sees Server-Published View: web browser, mobile Safari, iPad
  • Server enforces the same (data and user) security on desktop, browser, iPad
  • Straight links from an image on a dashboard, Control of Legend Layout etc.

Here is a quick demo of how to create a Data Visualization with Tableau 6.1 Desktop, how easy it is to publish it on Tableau Server 6.1, and how it is instantly visible, accessible and touch-optimized on the iPad:

 

New since Tableau 6.0, more than 60 features, including:

  • Tableau now has in-memory Data Engine, which greatly improves I/O speed
  • Support for “big” data
  • Data blending from multiple sources
  • Unique support for local PowerPivot Multidimensional Cubes as Data Source
  • Support for Azure Datamarket and OData (Open Data Protocol) as Data Sources
  • Support for parameters in Calculations
  • Motion Charts and Traces (Mark History)
  • On average, 8 times faster rendering of Data Views (compared with the previous version)

Tableau Product Family

  • Desktop: Personal ($999), Professional ($1999), Digital, Public.
  • Server: Standard, Core Edition, Digital, Public Edition.
  • Free Client: Web Browser, Desktop/Offline Tableau Reader.
  • Free Tableau Reader enables Server-less distribution of Visualizations!
  • Free Tableau Public has served 20+ million visitors since inception

Tableau Server

  • Easy to install: 13 minutes + optional 10 minutes for firewall configuration
  • Tableau has useful command line tools for administration and remote management
  • Scalability: Tableau Server can run (while load balancing) on multiple machines
  • Straightforward licensing for Standard Server (min 10 users, $1000/user)
  • With Core Edition Server License: unlimited number of users, no need for User Login
  • Digital Server Licensing based on impressions/month, allows unlimited data, Tableau-hosted.
  • Public Server License: Free, limited (100000 rows from flat files) data, hosted by Tableau.

Widest (and Tableau optimized) Native Support for data sources

  • Microsoft SSAS and PowerPivot: Excel Add-in for PowerPivot, native SSAS support
  • Native support for Microsoft SQL Server, Access, Excel, Azure Marketplace DataMarket
  • Other Enterprise DBMSes: Oracle, IBM DB2, Oracle Essbase
  • Analytical DBMSes: Vertica, Sybase IQ, ParAccel, Teradata, Aster Data nCluster
  • Database appliances: EMC/GreenPlum, IBM/Netezza
  • Many Popular Data Sources: MySQL, PostgreSQL, Firebird, ODBC, OData, Text files etc.

Some old problems I still have with Tableau

  • No MDI support in Dashboards, all charts share the same window and paint area
  • Wrong User Interface (compared with the Qlikview UI) for Drilldown Functionality
  • Tableau's approach to Partners is from the stone ages
  • Tableau is 2 generations behind Spotfire in terms of API, Modeling and Analytics

The Comparison of DV Tools is the most popular page (and post) on this site, visited by many thousands of people. Some of them keep asking to extend this comparison with additional features; one of these is a comparison of the leading DV tools' requirements for file and memory footprint, and also for reading and saving time.

I took a mid-sized dataset (428,999 rows and 135 columns), exported it into CSV and compressed it to ZIP format, because all native DV formats (QVW by Qlikview, DXP by Spotfire, TWBX by Tableau and XLSX by Excel and PowerPivot) are compressed one way or another. My starting filesize (of the zipped dataset) was 56 MB. Here is what I got, see for yourself:

One comment: the numbers above are all relative to the configuration of the hardware used for the tests, and they also depend on the other software I ran during the tests, because that software also requires RAM, CPU cycles and disk I/O – and they even depend on the speed of repainting application windows on screen, especially for Excel. I will probably add more comments to this post/page, but my first impression from this comparison is that Tableau's new Data Engine (released in version 6.0 and soon to be updated in 6.1) has made Tableau more competitive. Please keep in mind that the comparison of in-memory footprint was much less significant in the test above, because Qlikview, Excel and PowerPivot put the entire dataset into RAM, while Tableau and Spotfire can leave some data (unneeded for the visualization) on disk, treating it as "virtual memory". Also, Tableau uses 2 executables (not just one EXE like the others): tableau.exe (or tabreader.exe) and tdserver64.exe.
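For anyone who wants to repeat the file-size and load-time part of this test with their own dataset, here is the kind of minimal Python harness I have in mind (the file name is hypothetical, and the in-memory footprint of each DV tool still has to be read from Task Manager by hand):

import os
import time
import zipfile

CSV_PATH = 'dataset.csv'   # hypothetical export of your mid-sized dataset

# 1. File footprint: compare the raw CSV with its ZIP-compressed size, since all
#    native DV formats (QVW, DXP, TWBX, XLSX) are compressed one way or another.
with zipfile.ZipFile('dataset.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.write(CSV_PATH)
print('CSV: %.1f MB, ZIP: %.1f MB' % (os.path.getsize(CSV_PATH) / 2**20,
                                      os.path.getsize('dataset.zip') / 2**20))

# 2. Load time: time a full pass over the file as a crude lower bound on what any
#    DV tool has to do before it can visualize the data.
start = time.time()
with open(CSV_PATH, 'rb') as f:
    rows = sum(1 for _ in f)
print('Read %d lines in %.1f seconds' % (rows, time.time() - start))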

Since Tableau is the only leading DV software capable of reading from SSAS Cubes and from PowerPivot (local SSAS) Cubes, I also took a large SSAS Cube and for testing purposes selected an SSAS Sub-Cube with 3 Dimensions, 2 Measures and 156,439 "rows". I measured the time and footprint needed for Tableau to read the Sub-Cube, refresh it in memory and save it to a local application file, and also measured its "cubical" footprint in memory and on disk, and then compared all the results with the same tests run with Excel 2010 alone and with Excel 2010 plus PowerPivot:

While Tableau's ability to read and visualize Cubes is cool, performance-wise Tableau is far behind Excel and PowerPivot, especially in the reading department and in memory footprint. In the saving department and file footprint, Tableau does nothing for SSAS Cubes, because it does not save the cube locally into its TWBX application file (it keeps the data in the SSAS cube, outside of Tableau), so Tableau's file footprint for SSAS Cubes is not an indicator; but for PowerPivot-based local Cubes Tableau does a better job (saving the data into a local application file) than both Excel and PowerPivot!

For many years, Gartner has kept annoying me every January by publishing the so-called "Magic Quadrant for Business Intelligence Platforms" (MQ4BI for short), and most vendors mentioned in it (this is funny; even Donald Farmer quotes MQ4BI) almost immediately republish it, either in the so-called reprint area of the Gartner website (e.g. here – for a few months) or on their own websites; some of them also make this "report" available to web visitors in exchange for contact info – for free. To channel my feelings toward Gartner into something constructive, I decided to produce my own "Quadrant" for Data Visualization Platforms (DV "Quadrant" or Q4DV for short) – it is below, it is a work in progress, and it will be modified and republished over time:

The 3 DV Leaders (green dots in the upper right corner of the Q4DV above) are compared with each other and with the Microsoft BI stack on this blog, and were also voted on in the DV Poll on LinkedIn. The MQ4BI report actually contains a lot of useful info, and it deserves to be used as one of the possible data sources for my new post, which has a more specific target – Data Visualization Platforms. As I said above, I will call it a Quadrant too: Q4DV. But before I do that, I have to comment on Gartner's annual MQ4BI.

The MQ4BI customer survey included vendor-provided references, as well as survey responses from BI users on Gartner's BI Summit and inquiry lists. There were 1,225 survey responses (funny enough, almost the same number of responses as in my DV Poll on LinkedIn), with 247 (20%) from non-vendor-supplied reference lists. Gartner promised to publish the Magic Quadrant Customer Survey's results in 1Q11. Gartner has somewhat reasonable "Inclusion and Exclusion Criteria" (for the Data Visualization Q4DV I excluded some vendors from Gartner's list and included a few too) and an almost tolerable but fuzzy BI Market Definition (based on 13 loosely pre-defined capabilities organized into 3 categories of functionality: integration, information delivery and analysis).

I also partially agree with the definition and usage of "Ability to Execute" as one (the Y axis) of the 2 dimensions for the bubble Chart above (named the same way as the entire report: "Magic Quadrant for Business Intelligence Platforms"). However, I disagree with Gartner's ordering of vendors by their ability to execute, and for DV purposes I had to completely change the order of DV Vendors on the X axis ("Completeness of Vision").

For Q4DV purposes I am reusing Gartner's MQ as a template. I also excluded almost all vendors classified by Gartner as niche players with lower ability to execute (the bottom-left quarter of MQ4BI), except Panorama Software (Gartner put Panorama in last place, which is unfair), and will add the following vendors: Panopticon, Visokio, Pagos and maybe some others after further testing.

I am going to update this DV "Quadrant" using the method suggested by Jon Peltier: http://peltiertech.com/WordPress/excel-chart-with-colored-quadrant-background/ – thank you, Jon! I hope I will have time for it before the end of 2011…

Permalink: https://apandre.wordpress.com/2011/02/13/q4dv/

On New Year's Eve I started the Poll "What tool is better for Data Visualization?" on LinkedIn, and 1,340 people voted there (an unusually high return for LinkedIn polls, most of which get fewer than 1,000 votes) – on average one vote per hour during 8 weeks, which is statistically significant as a reflection of the fact that the Data Visualization market has 3 clear leaders (probably at least a generation ahead of all other competitors): Spotfire, Tableau and Qlikview. Spotfire is the top vote getter: as of 2/27/11, 1pm EST, Spotfire got 450 votes (34%), Tableau 308 (23%), Qlikview 305 (23%; Qlikview's result improved during the last 3 weeks of this poll), PowerPivot 146 (11%, more votes than all "Other" DV Tools) and all Other DV tools together got just 131 votes (10%). The Poll got 88 comments (more than 6% of voters commented on the poll!), will be open for more unique voters until 2/27/11, 7pm, and its results have been consistent during the last 5 weeks, so statistically it represents the user preferences of the LinkedIn population:

The URL is http://linkd.in/f5SRw9 but you need to log in to LinkedIn.com to vote. Also see some demographic info (in a somewhat ugly visualization by … LinkedIn) about the poll voters below:

Interestingly, Tableau voters are younger than those for other DV tools, and more than 82% of the voters in the poll are men. A summary of some comments:

  • the poll's question is too generic – because the answer partially depends on what you are trying to visualize;
  • the poll is limited by LinkedIn restrictions, which allow no more than 5 possible/optional answers to the Poll's question;
  • the poll's results may correlate with the number of Qlikview/Tableau/Spotfire groups (and the size of their membership) on LinkedIn, and also with the ability of employees of the vendors of the respective tools to vote in favor of the tool produced by their company (I don't see this happening). LinkedIn has 85 groups related to Qlikview (with almost 5,000 members), 34 groups related to Tableau (with 2,000+ members total) and 7 groups related to Spotfire (with about 400 members total).
  • Randall Hand posted interesting comments about my poll here: http://www.vizworld.com/2011/01/tool-data-visualization/#more-19190 . I disagreed with some of Randall's assessments: that "Gartner is probably right" (in my opinion Gartner is usually wrong when it talks about BI; I have posted about it on this blog and Randall agreed with me) and that "IBM & Microsoft rule … markets". In fact IBM is very far behind (Qlikview, Spotfire and Tableau), and Microsoft, while it has excellent technologies (like PowerPivot and SSAS), is behind too, because Microsoft made a strategic mistake and does not have a visualization product, only technologies for one.
  • Spotfire fans got some "advice" from Facebook, here: http://www.facebook.com/TIBCOSpotfire (the post said "TIBCO Spotfire LinkedIn users: Spotfire needs your votes! Weigh in on this poll and make us the Data Visualization tool of choice…"; there is nothing I can do to prevent people from doing that, sorry). I think the poll is statistically significant anyway, and the voters from Facebook may have added just a couple of dozen votes for … their favorite tool.
  • Among the Other Data Visualization tools mentioned in the 88 comments so far were JMP, R, Panopticon, Omniscope (from Visokio), BO/SAP Explorer and Excelsius, IBM Cognos, SpreadsheetWEB, IBM's Elixir Enterprise Edition, iCharts, UC4 Insight, Birst, Digdash, Constellation Roamer, BIme, Bissantz DeltaMaster, RA.Pid, Corda Technologies, Advizor, LogiXml, TeleView etc.

Permalink: https://apandre.wordpress.com/dvpoll/

It looks like the honeymoon for Qlikview after Qliktech's IPO is over. In addition to Spotfire 3.2/Silver, we now have a 3rd great piece of software in the form of Tableau 6. Tableau 6.0 was released today (both 32-bit and 64-bit) with a new in-memory data engine (very fast: say, 67 million rows in 2 seconds) and quick data blending from multiple data sources while normalizing across them. The Data Visualization Software is available as a Server (with web browsers as free Clients) and as a Desktop (Pro for $1999, Personal for $999, Reader for free).

New Data Sources include local PowerPivot files(!) and Aster Data; new Data Connections include OData and the (recently released) Windows Azure Marketplace DataMarket; a Data Connection can be Direct/Live or go through the in-memory data engine. Tableau 6 does full or partial automatic data updates; supports parameters for calculations, what-if modeling, and selectable display of fields on a Chart's axis; combo charts of any pair of charts; has new project views; and supports Motion Charts


(a la Hans Rosling) etc. Also see Ventana Research and the comments by Tableau followers. This post may be expanded, since it is officially the 1st day of the release.

n009: http://wp.me/sCJUg-tableau6


Tableau added 1,500 new customers during the last year (5,500 total; it is also used by Oracle on an OEM basis as Oracle Hyperion Visual Explorer), had $20M in sales in 2009, and Q3 of 2010 showed 123% growth over the same period a year ago, with Tableau claiming to be the fastest growing software company in the BI market (faster than Qliktech), see http://www.tableausoftware.com/press_release/tableau-massive-growth-hiring-q3-2010

Tableau 6.0 will be released next month; they claim it is 100 times faster than the previous version (5.2), with an in-memory columnar DB, 64-bit support and optional data compression. They are so confident (due to increasing sales) that they posted 40 job openings last week (they had 99 employees in 2009, have 180 now and plan to have 200 by the end of 2010). Tableau is raising (!) prices for Tableau Desktop Professional from $1800 to $1999 in November 2010, while Personal will stay at $999. They aim directly at Qliktech, saying (through a loyal customer) this: "Competitive BI software like QlikView from QlikTech is difficult to use without a consultant or IT manager by your side, a less than optimal allocation of our team's time and energy. Tableau is a powerful tool that's easy to use, built to last, and continues to impress my customers."

In Tableau's new sales pitch they claim (among 60 other new features):

  • New super-fast data engine that can cross-tab 10 million rows in under 1 second
  • The ability to blend data from multiple sources in just a click
  • Create endless combination graphs such as bars with lines, circles with bars, etc.

n004: http://wp.me/pCJUg-3Z

Published a comparison of the 4 leading DV Products, see http://wp.me/PCJUg-1T

I did not include in the comparison the 5th leading product – Visokio's Omniscope – because it has very limited scalability due to the specifics of its implementation: Java does not allow it to visualize too much data. Among the factors to consider when comparing DV tools:

  • memory optimization [Qlikview is the leader in in-memory columnar database technology];
  • load time [I tested all the products above and PowerPivot is the fastest];
  • memory swapping [Spotfire is the only one that can use the disk as virtual memory, while Qlikview is limited by RAM only];
  • incremental updates [Qlikview is probably the best in this area];
  • thin clients [Spotfire has the best THIN/Web/ZFC (zero-footprint) client, especially with the recent release of Spotfire 3.2 and Spotfire Silver];
  • thick clients [Qlikview has the best THICK client];
  • access by 3rd party tools [PowerPivot's integration with Excel 2010, SQL Server 2008 R2 Analysis Services and SharePoint 2010 is a big attraction];
  • interface with SSAS cubes [PowerPivot has it, Tableau has it, Omniscope will have it very soon, Qlikview and Spotfire do not have it];
  • GUI [a 3-way tie; it heavily depends on personal preferences, but in my opinion Qlikview is easier to use than the others];
  • advanced analytics [Spotfire 3.2 is the leader here with its integration with S-PLUS and support for IronPython and other add-ons];
  • the productivity of developers working with the tools mentioned above [in my experience Qlikview is a much more productive tool in this regard].

p003: http://wp.me/pCJUg-3R
