Visitors to this blog keep asking me to estimate Tableau Software prices (including for Tableau Online), even though Tableau publishes all non-server prices on its website here: https://tableau.secure.force.com/webstore . However, that listing includes no discounts (especially for enterprise-volume purchases), no pricing for servers of any kind (at least 2 kinds of server licenses exist), and no pricing for consulting and training.

Thanks to the website of Tableau Partner “Triad Technology Partners” we have a good estimate of all Tableau prices (they are always subject to negotiation) in the form of the so-called GSA Schedule (General Services Administration, Federal Acquisition Service, Special Items: No. 132-33 Perpetual Software Licenses, No. 132-34 Maintenance of Software as a Service, No. 132-50 Training Courses) for Tableau Software products and services, see it here:

http://www.triadtechpartners.com/vendors/tableau-software/ , all of Triad’s GSA contracts here (they include prices for IBM Cognos and others, for example):
http://www.triadtechpartners.com/contracts/ , and the specific Tableau price list here:
http://www.triadtechpartners.com/wp-content/uploads/Tableau-GSA-Price-List-April-2013.pdf

I grouped Tableau’s prices (please keep in mind that Triad published this GSA Schedule in April 2013, so these are year-old prices, but they are good enough for estimating purposes) into 5 groups below: Desktop, Server licensed per Named User (makes sense if you have fewer than a hundred “registered” users), Core Licenses for Tableau Server (recommended when you have more than 150 “registered” users), Consulting, and Training:

The Google Sheet for the spreadsheet above is here:
https://docs.google.com/spreadsheets/d/1oCyXRR3B6dqXcw-8cE05ApwsRcxckgA6QdvF9aF6_80/edit?usp=sharing
and an image of it – for those who have misbehaving browsers – is below:
[Image: TableauPrices2013]

Again, please keep in mind that the above is just a price estimate (except for Tableau Online), based on the 2013 GSA Schedule, and a good negotiator can always get a good discount (I got one each time I tried).

Note about the choice between a Core License and a Server License with Named Users: I know organizations that chose to keep Named User licensing instead of switching to a Core License even with more than 300 registered users, because it allows them to use much more capable hardware (with many more CPU Cores).

Observing and comparing multiple (similar) multidimensional objects over time and visually discovering multiple interconnected trends is the ultimate Data Visualization task, regardless of the specific research area – it can be chemistry, biology, economics, sociology, publicly traded companies or even so-called “Data Science”.

For the purposes of this article I like the dataset published by the World Bank: 1000+ measures (they call it World Development Indicators) for 250+ countries over 50+ years – theoretically more than 10 million DataPoints:

http://data.worldbank.org/data-catalog/world-development-indicators?cid=GPD_WDI

Of course some DataPoints are missing, so I restricted myself to 20 countries, 20 years and 25 measures (a more reasonable Dataset with about 10000 DataPoints). That gave me 500 Time Series for 20 Objects (Countries), and I tried to imitate how Analysts and Scientists would use Visualizations to “discover” Trends and other Data Patterns in such a situation and extrapolate, if possible, this approach to more massive Datasets in practical projects. My visualization of this Dataset can be found here:

http://public.tableausoftware.com/views/wdi12/Trends?:showVizHome=no
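
To get a feel for this kind of subsetting outside of any DV tool, here is a minimal Python sketch (the file name and the country/indicator selections are illustrative assumptions; the column layout follows the World Bank bulk-download CSV, and the two indicator codes shown are real WDI codes):

```python
import pandas as pd

# Bulk WDI download: one row per (country, indicator), one column per year.
# The file name is an assumption; adjust to whatever the archive contains.
wdi = pd.read_csv("WDI_Data.csv")

countries = ["United States", "China", "Japan", "Germany", "Russian Federation"]  # ...20 total
indicators = ["NY.GDP.MKTP.CD", "SP.DYN.LE00.IN"]  # GDP, life expectancy; ...25 total
years = [str(y) for y in range(1994, 2014)]        # 20 years

subset = wdi[wdi["Country Name"].isin(countries) & wdi["Indicator Code"].isin(indicators)]

# Reshape to long form: one row per (country, indicator, year) DataPoint.
tidy = subset.melt(id_vars=["Country Name", "Indicator Code"],
                   value_vars=years, var_name="Year", value_name="Value")
tidy = tidy.dropna(subset=["Value"])  # many DataPoints are missing

# 20 countries x 25 indicators = 500 time series, roughly 10000 DataPoints.
print(tidy.shape)
```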

In addition to the Trends Line Chart (please choose an Indicator in the Filter at the bottom of the Chart), I added (in my Tableau Visualization above) a Motion Chart for any chosen Indicator(s) and a Motion Map Chart for the GDP Indicator. A similar Visualization of this Dataset was done by Google here: http://goo.gl/g2z1b6 .

As you can see below with samples of just 6 indicators (out of the 1000+ published by the World Bank), the behavior of the monitored objects (countries) is vastly different.

GDP trends: the clear leader is the USA, with China the fastest-growing among economic leaders and Japan almost stagnant for the last 20 years (please note that I reuse each country’s “GDP color” for all other 1000+ indicators and Line Charts):

[Image: GDPTrends]

Life Expectancy: Switzerland and Japan provide the longest lives to their citizens, while Indian and Russian citizens are expected to live less than 70 years. Australia is probably improving life expectancy faster than the other countries in this subset.

[Image: LifExpectancy]

Health Expenditures Per Capita: a group of 4 – Switzerland, Norway (fastest growing?), Luxembourg and the USA – spend about $9000 per person per year on health, while India, Indonesia and China spend less than $500:

[Image: HealthExpenditurePerCapita]

Consumer Price Index: prices in Russia, India and Turkey are growing faster than elsewhere, while prices in Japan and Switzerland have been almost unchanged over the last 20 years:

[Image: CPI]

Mobile Phones Per 100 Persons: Russia has 182 mobile phones per 100 people (the fastest growth over the last 10 years), while India has fewer than 70 cellular phones per 100 people.

[Image: CellPhonesPer100]

Military Expenses as a Percentage of Budget (a lot of missing data when it comes to military expenses!): the USA, India and Russia spend more than others – guess why that is:

[Image: MilitaryExpensesPercentageOfBudget]

 

You can find many examples of visual monitoring of multiple objects over time. One example is https://www.tradingview.com/ where over 7000 objects (publicly traded companies) are monitored across hundreds of indicators (like share price, Market Capitalization, EBITDA, Income, Debt, Assets etc.). Here is an example I did for a previous blog post: https://www.tradingview.com/e/xRWRQS5A/

Data Visualization Readings, Q1 2014, selected from Google+ extensions of this blog:
http://tinyurl.com/VisibleData and
http://tinyurl.com/VisualizationWithTableau

[Image: dvi032914]

Data Visualization Index (using DATA+QLIK+TIBX+MSTR; click on image above to enlarge):
From 11/1/13 to 3/15/14: DATA stock grew 50%, QLIK 11%, MSTR 6%, while TIBX lost 1%.
Current Market Capitalization: Tableau – $5.5B, QLIK – $2.6B, TIBCO – $3.5B, Microstrategy – $1.4B.
Number of Job Openings Today: Tableau – 231, QLIK – 135, Spotfire (estimate) – 30, Microstrategy – 214.
However, during the last 2 weeks of March 2014, DATA shares lost 24%, QLIK lost 14%, and TIBX and MSTR both lost about 10%.
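
If you want to reproduce such an index yourself, a minimal sketch is below (an equal-weighted, rebased-to-100 index is my assumption about the methodology; the two price rows are placeholders, and in practice you would load the full daily close history from your market-data source):

```python
import pandas as pd

# Placeholder daily closes for the four DV vendors (replace with real data).
closes = pd.DataFrame({
    "DATA": [65.0, 97.5],    # Tableau
    "QLIK": [26.0, 28.9],    # Qliktech
    "TIBX": [23.0, 22.8],    # TIBCO (Spotfire)
    "MSTR": [120.0, 127.2],  # Microstrategy
}, index=pd.to_datetime(["2013-11-01", "2014-03-15"]))

# Rebase each ticker to 100 at the start date, then average across tickers.
rebased = closes / closes.iloc[0] * 100
dv_index = rebased.mean(axis=1)

print(rebased.iloc[-1] - 100)  # per-ticker growth in % since 11/1/13
print(dv_index)                # the combined equal-weighted "DV Index"
```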

Why use R? Five reasons.
http://www.econometricsbysimulation.com/2014/03/why-use-r-five-reasons.html

Studying Tableau Performance Characteristics on AWS EC2
http://tableaulove.tumblr.com/post/80571148718/studying-tableau-performance-characteristics-on-aws-ec2

Head-to-head comparison of Datawatch and Tableau
http://datawatch.com/datawatch-vs-tableau

Diving into TIBCO Spotfire Professional 6.0
http://www.jenunderwood.com/2014/03/25/diving-into-tibco-spotfire-professional-6-0/

TIBCO beats Q1 2014 estimates but Spotfire falters
http://diginomica.com/2014/03/20/tibco-beats-estimates-spotfire-falters/

Qlik Doesn’t Fear Tableau, Oracle In Data Analytics
http://news.investors.com/031314-693154-qlik-focuses-on-easy-to-use-data-analytics.htm?p=full

Best of the visualisation web… February 2014
http://www.visualisingdata.com/index.php/2014/04/best-of-the-visualisation-web-february-2014/

Datawatch: ‘Twenty Feet From Stardom’
http://seekingalpha.com/article/2101513-datawatch-twenty-feet-from-stardom

Tableau plans to raise $345M — more than its IPO — with new stock offering
http://venturebeat.com/2014/03/16/tableau-plans-to-raise-345m-more-than-its-ipo-with-new-stock-offering/

TIBCO Spotfire Expands Connectivity to Key Big Data Sources
http://www.marketwatch.com/story/tibco-expands-connectivity-to-key-big-data-sources-2014-03-11

Tableau and Splunk Announce Strategic Technology Alliance
http://www.splunk.com/view/SP-CAAAKH5?awesm=splk.it_hQ

The End of The Data Scientist!?
http://alpinenow.com/blog/the-end-of-the-data-scientist/

[Image: bigData]

Data Science Is Dead
http://slashdot.org/topic/bi/data-science-is-dead/

Periodic Table of Elements in TIBCO Spotfire
http://insideinformatics.cambridgesoft.com/InteractiveDemos/LaunchDemo/?InteractiveDemoID=1

Best of the visualisation web… January 2014
http://www.visualisingdata.com/index.php/2014/03/best-of-the-visualisation-web-january-2014/

Workbook Tools for Tableau
http://powertoolsfortableau.com/tableau-workbooks/workbook-tools/

Tapestry Data Storytelling Conference
http://www.tapestryconference.com/attendees
http://www.visualisingdata.com/index.php/2014/03/a-short-reflection-about-tapestry-conference/

URL Parameters in Tableau
http://interworks.co.uk/business-intelligence/url-parameters-tableau/

Magic Quadrant 2014 for Business Intelligence and Analytics Platforms
http://www.gartner.com/technology/reprints.do?id=1-1QLGACN&ct=140210&st=sb

What’s Next in Big Data: Visualization That Works the Way the Eyes and Mind Work
http://insights.wired.com/profiles/blogs/what-s-next-in-big-data-visualization-that-works-the-way-the-eyes#axzz2wPWAYEuY

What animated movies can teach you about data analysis
http://www.cio.com.au/article/539220/whatanimatedmoviescanteachaboutdata_analysis/

Tableau for Mac is coming, finally
http://www.geekwire.com/2014/tableau-mac-coming-finally/

Authenticating an External Tableau Server using SAML & AD FS
http://www.theinformationlab.co.uk/2014/02/04/authenticating-external-tableau-server-using-internal-ad/

Visualize this: Tableau nearly doubled its revenue in 2013
http://gigaom.com/2014/02/04/visualize-this-tableau-nearly-doubled-its-revenue-in-2013/

Qlik Announces Fourth Quarter and Full Year 2013 Financial Results
http://investor.qlik.com/releasedetail.cfm?ReleaseID=827231

[Image: InTheMiddleOfWinter2]

Tableau Mapping – Earthquakes, 300,000,000 marks using Tableau 8.1 64-bit
http://theywalkedtogether.blogspot.com/2014/01/tableaumapping-earthquakes-300000000.html

Data Science: What’s in a Name?
http://www.linkedin.com/today/post/article/20130215205002-50510-the-data-scientific-method

Gapminder World Offline
http://www.gapminder.org/world-offline/

Advanced Map Visualisation in Tableau using Alteryx
http://www.theinformationlab.co.uk/2014/01/15/DrawingArrowsinTableau

Motion Map Chart
http://apandre.wordpress.com/2014/01/12/motion-map-chart/

One of Bill Gates’s favorite graphs redesigned
http://www.perceptualedge.com/blog/?p=1829

Authentication and Authorization in Qlikview Server
http://community.qlik.com/blogs/qlikviewdesignblog/2014/01/07/authentication-and-authorization

SlopeGraph for QlikView (D3SlopeGraph QlikView Extension)
http://www.qlikblog.at/3093/slopegraph-for-qlikview-d3slopegraph-qlikview-extension/

Revenue Model Comparison: SaaS v. One-Time-Sales
http://www.wovenware.com/blog/2013/12/revenue-model-comparison-saas-v-one-time-sales#.UyimffmwIUo

Scientific Data Has Become So Complex, We Have to Invent New Math to Deal With It
http://www.wired.com/wiredscience/2013/10/topology-data-sets/all/

Posting data to the web services from QlikView
http://community.qlik.com/docs/DOC-5530

It’s your round at the bar
http://interworks.co.uk/tableau/radial-bar-chart/

Lexical Distance Among the Languages of Europe
http://elms.wordpress.com/2008/03/04/lexical-distance-among-languages-of-europe/

[Image: SnowInsteadOfRainJan2014-SNOW]

For this weekend I have 2 guest bloggers (one yesterday and the other today) sharing their thoughts about Cloud Services for BI and DV. I myself recently published a few articles about this topic, for example here: http://apandre.wordpress.com/2013/08/28/visualization-as-a-service/ and here:

http://apandre.wordpress.com/2013/12/14/spotfire-cloud-pricing/ . My opinions may differ from those of my Guest Bloggers. You can find many providers of DV and BI Cloud Services, including Spotfire Cloud, Tableau Online, GoodData, Microstrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB etc.

Let me introduce my 2nd guest blogger for this weekend: Ugur Kadakal is the CEO and founder of Pagos, Inc., located in Cambridge, MA. Pagos is the developer of SpreadsheetWEB, which transforms Excel spreadsheets into web-based Business Intelligence (BI) applications without any programming. SpreadsheetWEB can also convert PowerPivot files into web-based dashboards, and it provides advanced Data Visualization (DV) for SQL Analysis Services (Tabular) cubes without SharePoint. Mr. Kadakal has published a few articles on this blog before, with great feedback, so he is a serial Guest Blogger.

[Image: SaaSCost]

Before (or after) you read Mr. Kadakal’s article, I suggest reviewing the article comparing 5+ revenue scenarios for a Cloud Service vs. a traditional one-time sale of software; see it here: http://www.wovenware.com/blog/2013/12/revenue-model-comparison-saas-v-one-time-sales#.UyikEfmwIUp . The illustration above is from that article.

Traditional BI versus Cloud BI

Over the past several years, we have been witnessing numerous transformations in the software industry, from a traditional on-premise deployment model to the Cloud. There are some application types for which the cloud makes a lot of sense, while for others it doesn’t. BI is somewhere in between.

Before I express my opinion on the subject of Traditional BI versus Cloud BI, I would like to clarify my definitions. I define traditional BI as large enterprise implementations which connect with many data sources in real-time.  These projects have many phases and require large teams to implement. These projects could take years and cost millions of dollars to implement.

Many people define cloud BI as deployments on a proprietary, third-party, multi-tenant environment managed by a vendor. My definition is somewhat different and broader. Cloud BI is more about ease of deployment, use and management. While Cloud BI can be hosted and managed by a vendor, it can also be deployed on a private cloud infrastructure like Amazon or Microsoft Azure. With the advancement of cloud infrastructure technologies like OpenStack, deploying and managing private cloud infrastructure is becoming easier for many enterprises. As a result, whether Cloud BI is deployed on a multi- or single-tenant environment on vendor infrastructure, on a third-party cloud infrastructure like Amazon, Azure, etc., or on an internal private cloud becomes more of a business decision than a technical limitation.

[Image: DataCloud]

One main distinction between Traditional BI and Cloud BI is data management. Traditional BI implementations can have real-time data, as they connect to the original data sources directly. I don’t believe that Cloud BI should deal with real-time data, even if implemented on internal private cloud infrastructure. Supporting real-time data is a requirement that makes any BI project complicated and costly. Hence Cloud BI solutions should include simple utilities (i.e. ETL agents residing on local computers) to push internal data into the Cloud BI data model periodically, as sketched below. Since Cloud BI should not deal with real-time data scenarios, this data synchronization can be configured by the business user accordingly.
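
A minimal sketch of such a local push utility is below (everything here – the endpoint URL, the token header, the table layout, and the hourly schedule – is a hypothetical illustration, not any particular vendor’s actual API):

```python
import time
import sqlite3
import requests  # assumes the Cloud BI service exposes an HTTPS upload endpoint

UPLOAD_URL = "https://bi.example.com/api/datasets/sales"  # hypothetical endpoint
API_TOKEN = "secret-token-issued-by-the-service"
SYNC_INTERVAL_SECONDS = 60 * 60  # configured by the business user, e.g. hourly

def extract_rows():
    # Extract: read internal data from a local database.
    with sqlite3.connect("internal.db") as conn:
        return conn.execute("SELECT region, product, amount FROM sales").fetchall()

def push(rows):
    # Load: push the extracted rows into the Cloud BI data model.
    payload = [{"region": r, "product": p, "amount": a} for r, p, a in rows]
    resp = requests.post(UPLOAD_URL, json=payload,
                         headers={"Authorization": f"Bearer {API_TOKEN}"})
    resp.raise_for_status()

if __name__ == "__main__":
    while True:  # periodic push, not real-time: sync on a schedule
        push(extract_rows())
        time.sleep(SYNC_INTERVAL_SECONDS)
```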

Another distinction is the ease of implementation. Regardless of where it is deployed, a Cloud BI solution should take no more than a few hours to implement and configure. Some BI vendors already provide machine images on the Amazon cloud to simplify this process.

The traditional BI model typically requires significant upfront investments. Part of this investment is internal, while the rest is BI licensing and implementation fees. But the very nature of Cloud BI requires agility, from deployment to data management and dashboard creation. A Cloud BI project can be deployed easily, and it can also be modified or shut down with equal ease. Hence the traditional business model of large upfront investments doesn’t make sense here. The Cloud BI business model should be subscription based, regardless of whether it is implemented on vendor infrastructure or on an on-premise private cloud infrastructure. Customers should be able to pay for what they use and for how long they use it. Such simplicity will also eliminate the vendor lock-in risks that most enterprises have to mitigate.

[Image: DVinCloud2]

In summary, there are many BI projects that will require a traditional BI implementation. These projects typically require real-time data and connectivity to many different data sources. Cloud BI should not attempt to handle these types of projects. But there are many other BI projects that require neither real-time data nor data from many different systems that must be connected. Cloud BI can handle these projects quickly and cost effectively, by empowering business users to manage the whole process without IT or external support. From discovery to data synchronization to dashboard creation and management, every activity can be handled by business users.

For this weekend I have 2 guest bloggers (one today and the second tomorrow) sharing their thoughts about Cloud Services for BI and DV. I myself recently published a few articles about this topic, for example here: http://apandre.wordpress.com/2013/08/28/visualization-as-a-service/ and here:

http://apandre.wordpress.com/2013/12/14/spotfire-cloud-pricing/ . My opinions may differ from those of my Guest Bloggers (see my comment below this article). You can find many providers of DV and BI Cloud Services, including Spotfire Cloud, Tableau Online, GoodData, Microstrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB etc.

Let me introduce my 1st guest blogger for this weekend: Mark Flaherty is Chief Marketing Officer at InetSoft Technology, a BI (Business Intelligence) software provider founded in 1996, headquartered in Piscataway, New Jersey, with over 150 employees worldwide. InetSoft’s flagship BI application, Style Intelligence, enables self-service BI spanning dashboarding, reporting and visual analysis for enterprises and technology providers. The server-based application includes a data mashup engine for combining data from almost any data source, and browser-based design tools that power users and developers can use to quickly create interactive DV (Data Visualizations).

[Image: DVinCloud]

Are public BI cloud services really going to overtake the traditional on-premise deployment of BI tools?

(Author: Mark Flaherty. Text below contains Mark’s opinions and they can be different from opinions expressed on this blog).

It’s been six years since public BI cloud services came to be. Originally termed SaaS BI, public BI cloud services refers to commercial service providers who host a BI application in the public cloud that accesses corporate data housed in the corporate private cloud and/or other application providers’ networks. As recently as last month, an industry report from TechNavio said, “the traditional on-premise deployment of BI tools is slowly being taken over by single and multi-tenant hosted SaaS.” I have a feeling this is another one of those projections that copies a historical growth rate forward for the next five years. If you do that with any new offering that starts from zero, you will always project it to dominate a marketplace, right?

I thought it would be interesting to discuss why I think this won’t happen.

[Image: DVinCloud3]

In general, there is one legitimate driving force behind why companies look to cloud solutions, and it drives the demand for cloud BI services specifically: the outsourcing of IT. The types of companies for whom this makes the most sense are small businesses. They have little or no IT staff to set up and support enterprise software, and they also have limited cap-ex budgets, so software rentals fit their cash flow structure better. While this is where most of the success for cloud BI has happened, this is only a market segment opportunity. By no means do small companies dominate the IT marketplace.

Another factor for turning to public cloud solutions is expediency. Even at large companies where there is budget for software purchases, the Business sometimes becomes frustrated with the responsiveness of internal IT, and they look outside for a faster solution. This makes sense for domain-specific cases where there is a somewhat narrow scope of need, and the application and the data are self-contained.  Salesforce.com is the poster child for this case, where it can quickly be set up as a CRM for a sales team. Indeed the fast success of salesforce.com is a big reason why people think cloud solutions will take off in every domain.

But business intelligence is different. A BI tool is meant to span multiple information areas, from finance to sales to support and more. This is where it gets complicated for mid-sized and global enterprises. The expediency factor is nullified because the data that business users want to access with their cloud BI tool is controlled by IT, so they need to be involved. Depending on the organization’s policies and politics, this can either slow down such a move or kill it.

The very valid reason why enterprise IT would kill the idea of a public cloud BI solution is ultimately why I think public BI cloud services have such a limited opportunity in the overall market. One of IT’s responsibilities is ensuring data security, and they will rightly point out the security risks of opening access to sensitive corporate data to a 3rd party. It’s one thing to trust a vendor with one set of data like website visitor traffic, but trusting them with all of a company’s financial and customer data is where almost all companies will draw the line. This is a concern I don’t see ever going away.

What are some pieces of evidence that public BI cloud services have a limited market opportunity? When BI cloud services first came onto the scene, all of the big BI vendors dabbled in it. Now many no longer champion these hosted offerings, or they have shuttered or demoted them. IBM’s Cognos Express is now only an on-premise option. SAP BusinessObjects BI OnDemand can’t be found from SAP’s main site, but has its own micro site. Tibco’s Spotfire Cloud and Tableau Software’s Tableau Online are two exceptions among the better known BI providers that are still prominently marketed. However, Tibco positions this option for small businesses and workgroups and omits certain functionality.

Our company, too, experimented with a public BI cloud offering years ago. It was first targeted at salesforce.com customers who would want to mash up their CRM data with other enterprise-housed data. We found mostly small, budget-challenged companies in their customer base, and the few large enterprises that we found balked at the idea, asking instead for our software to be installed on-premise, where they would connect to any cloud-hosted data on their own. Today the only remaining cloud offering of ours is a free visualization service called Visualize Free, which is similar to Tableau Public or IBM’s Many Eyes.

Another observation: while there have been a handful of pure-play cloud BI vendors, one of them, named “LucidEra,” came and went quite quickly. Birst is one that seems to have found a successful formula.

In summary, yes, there is a place for public BI cloud services in the small business market, but no, it’s not going to overtake traditional on-premise BI.

[Image: GoogleDataCenterInGeorgiaWithCloudsAboveIt2]

For the last 6 years, each and every February my inbox has been bombarded with messages from colleagues, friends and visitors to this blog, containing references, quotes and PDFs of Gartner’s Magic Quadrant (MQ) for Business Intelligence (BI) and Analytics Platforms; the latest can be found here: http://www.gartner.com/technology/reprints.do?id=1-1QLGACN&ct=140210&st=sb .

Last year I was able to ignore this noise (funnily enough, I was busy migrating thousands of users from Business Objects and Microstrategy to Tableau-based Visual Reports for a very large company), but in February 2014 I got so many questions about it that I am basically forced to share my opinion.

  • 1st of all, as I have said on this blog many times, BI is dead and has been replaced by Data Visualization and Visual Analytics. That was finally acknowledged by Gartner itself, which placed Tableau, QLIK and Spotfire in the “Leaders” quadrant of the MQ for the 2nd year in a row.

  • 2ndly, the last 6 MQs (2009-2014) are suspicious to me, because in all of them Gartner (with complete disregard for reality) placed all 6 “misleading” vendors of wasteful BI platforms (IBM, SAP, Oracle, SAS, Microstrategy and Microsoft) in the Leaders quadrant! Those 6 vendors convinced customers to buy (over the last 6 years) more than $60B of their BI software, and much more than that was spent on maintenance, support, development, consulting, upgrades and other IT expenses.

There is nothing magic about these MQs: they are the results of Gartner’s 2-dimensional understanding of BI, Analytics and Data Visualization (DV) platforms, features and usage. The 1st measure (X axis), according to Gartner, is “Completeness of Vision” and the 2nd measure (Y axis) is “Ability to Execute”, which allows Gartner to distribute DV and BI vendors among 4 quadrants: top-right “Leaders”, top-left “Challengers”, bottom-right “Visionaries” and bottom-left “Niche Players” (or, you can say, leftovers).
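
The mechanics of that 2-dimensional placement are trivial to reproduce (a toy sketch; the vendor scores below are invented placeholders on a 0..1 scale, not Gartner’s actual, unpublished numbers):

```python
# Toy reproduction of the MQ quadrant logic with made-up scores.
vendors = {
    "Tableau":  (0.80, 0.90),  # (completeness_of_vision, ability_to_execute)
    "QLIK":     (0.70, 0.85),
    "Spotfire": (0.65, 0.70),
    "Visokio":  (0.60, 0.30),
}

def quadrant(vision: float, execution: float) -> str:
    # X axis: Completeness of Vision; Y axis: Ability to Execute.
    if vision >= 0.5:
        return "Leaders" if execution >= 0.5 else "Visionaries"
    return "Challengers" if execution >= 0.5 else "Niche Players"

for name, (x, y) in vendors.items():
    print(f"{name}: {quadrant(x, y)}")
```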

[Image: mq2014]

I decided to compare my opinions (expressed on this blog many times) with Gartner’s (they wrote 78 pages about it!) by taking the TOP 3 Leaders from Gartner, then the TOP 3 Visionaries (projecting all vendors except the TOP 3 Leaders onto the X axis), then the TOP 3 Challengers (projecting all vendors except the TOP 3 Leaders and the TOP 3 Visionaries onto the Y axis), then the TOP 3 “Niche Players” from the rest of Gartner’s list, and making “similar” choices myself (my list is wider than Gartner’s, because Gartner missed DV vendors important to me, like Visokio, while vendors like Datawatch and Advizor Solutions are not included in the MQ, perhaps to please Gartner’s favorites); see the comparison of opinions below:

[Image: 12DVendors]

As you may have noticed, in order to compare my opinions I had to use Gartner’s terms like Leader, Challenger etc., which is not exactly how I see it. Basically my opinion overlaps with Gartner’s in only 25% of cases in 2014, which is slightly higher than in previous years – I guess the success of Tableau and QLIK is a reason for that.

The BI market in 2013 reached $14B, and at least $1B of it was spent on Data Visualization tools. Here is a short summary of the state of each vendor mentioned in the “DV Blog” column above:

  1. Tableau: $232M in Sales, $6B MarketCap, 82% YoY growth (fastest in the DV market), Leader in DV Mindshare; its declared goals are “Data to the People” and ease of use.

  2. QLIK: $470M in Sales, $2.5B MarketCap, Leader in DV Marketshare, attempts to improve BI, but will remove Qlikview Desktop from Qlik.Next.

  3. Spotfire: sales under $200M; has the most mature Platform for Visual Analytics and the best DV Cloud Services. Spotfire is limited by its corporate parent (TIBCO).

  4. Visokio: private DV Vendor with limited marketing and sales, but has one of the richest and most mature DV feature sets.

  5. SAS: has the most advanced Analytics functionality (not easy to learn and use), targets Data Scientists and Power Users who can afford it instead of free R.

  6. Revolution Analytics: as the provider of a commercial version of and commercial support for R, it is a “cheap” alternative to SAS.

  7. Microsoft: has the most advanced BI and DV technology stack for software developers, but has no real DV Product and no plans for one in the future.

  8. Datawatch: $33M in sales, $281M MarketCap, has mature DV, BI and real-time visualization functionality, experienced management and sales force.

  9. Microstrategy: $576M in sales, $1.4B MarketCap; BI veteran with complete BI functionality; recently realized that the BI market is not growing and made a desperate attempt to get into the DV market.

  10. Panorama: BI veteran with an excellent, easy-to-use front-end to the Microsoft BI stack; has good DV functionality plus social and collaborative BI features.

  11. Advizor Solutions: private DV veteran with an almost complete set of DV features and the ability to do Predictive Analytics interactively, visually and without coding.

  12. RapidMiner: commercial provider of an open-source-based, easy-to-use Advanced Analytics Platform, integrated with R.

A similar MQ for “Advanced Analytics Platforms” can be found here: http://www.gartner.com/technology/reprints.do?id=1-1QXWEQQ&ct=140219&st=sg – have fun:

[Image: mq2014aap]

In addition to the differences mentioned in the table above, I need to say that I do not think Big Data is defined well enough to be mentioned 30 times in a review of “BI and Analytics Platforms”, and I do not see that the vendors mentioned by Gartner are ready for it – but maybe that is a topic for a different blog post…

Update: 

Analytics extrapolates visible data into the future (“predicts”) and, with mathematical modeling, enables us to see more than 6-dimensional subsets of data. The ability to do this visually, interactively and without programming vastly expands the number of potential users for Visual Analytics. I am honored to present one of the most advanced experts in this area, Mr. Cogswell: he decided to share his thoughts and be a guest blogger here. So the guest blog post below is written by Mr. Douglas Cogswell, the Founder, President and CEO of ADVIZOR Solutions Inc.

Formed in 2003, ADVIZOR combines data visualization and in-memory data management expertise with usability knowledge and predictive analytics to produce an easy-to-use, point-and-click product suite for business analysis. ADVIZOR’s Visual Discovery™ software spun out of a distinguished research heritage at Bell Labs that spans nearly two decades and produced over 20 patents.

Mr. Cogswell is a well-known thought leader, and below he discusses the next step in Data Visualization technology, where the limitations of the human eye prevent users from comprehending multidimensional (say, more than 6 dimensions) data patterns or estimating/predicting future trends from past data. Such multidimensional “comprehension” and estimation of future trends requires mathematical modeling in the form of Predictive Analytics, as the natural extension of Data Visualization. This, in turn, requires the integration of Predictive Analytics and interactive Data Visualization. Such integration will be accepted much more easily by business users and analysts if it requires no coding.

Mr. Cogswell discusses the need for, and the possibility of, that in his article (Copyright ADVIZOR Solutions, 2014) below.

[Image: no-code2]

Integrating Predictive Analytics and Interactive Data Visualization WITHOUT any Coding!

It’s a new year, and many organizations are mulling over how and where they will make new investments. One area  getting a lot of attention these days is predictive analytics tools. The need  to better understand the present and predict what might happen in the future for competitive advantage is enticing many to look at what these tools can do. TechRadar spoke with James Fisher, who said 85 percent of the organizations that have adopted these tools believe they have positively impacted their business.

Fast Fact Based Decision Making is Critical.

“Businesses are collecting information on their customers’ mobile habits, buying habits, web-browsing habits… The list really does go on,” he said. “However, it is what businesses do with that data that counts. Analytics technology allows organizations to analyze their customer data and turn it into actionable insights, in a way that benefits business.”

Interest in predictive analytics by businesses is expected to continue to grow well beyond this year, with Gartner reporting in early 2013 that approximately 70 percent of the best performing enterprises will either manage or have a view of their processes with predictive analytics tools by 2016. By doing this, businesses will gain a better sense of what is happening within their own networks and corporate walls, which actions could have the best impact and give increased visibility across their industries. This will give situational awareness across the business, making operating much easier than it has been in past years.

Simplicity and Ease of Use are Key.

Analytics is something every business should be figuring out. There are more software options than ever, so executives will need to figure out which solution will work best for them and their teams. According to InformationWeek’s Doug Henschen, the “2014 InformationWeek Analytics, Business Intelligence, and Information Management Survey” found that business users and salespeople need easy-to-use, visual data analytics that is intuitive and easily accessible from anywhere, any time. These data visualization business intelligence tools can give a competitive edge to the companies adopting them.

“The demand for these more visual analytics tools leads to one of the biggest complaints about analytics,” he said. Ease-of-use challenges have crippled the utilization rate of this software.  But that is changing.  “Analytics and BI vendors know that IT groups are overwhelmed with requests for new data sources and new dimensions of data that require changes to reports and dashboards or, worse, changes to applications and data warehouses,” he wrote. “It’s no wonder that ‘self-service’ capabilities seem to be showing up in every BI software upgrade.”

A recent TDWI research report titled “Data Visualization and Discovery for Better Business Decisions” found that companies do have their future plans focused on these analytics and how they can use them. In fact, 60 percent said their organizations are currently using business visualization for snapshot reports, scorecards, or display. About one-third are using it for discovery and analysis and 26 percent for operational alerting. However, companies are looking to expand how they use the technology, as 45 percent are looking to adopt it for discovery and analysis, and 39 percent for alerts.

“Visualization is exciting, but organizations have to avoid the impulse to clutter users’ screens with nothing more than confusing ‘eye candy’,” Stodder wrote. “One important way to do this is to evaluate closely who needs what kind of visualizations. Not all users may need interactive, self-directed visual discovery and analysis; not all need real-time operational alerting.”

Data Visualization & Predictive Analytics Naturally Complement Each Other.

Effective data visualizations are designed to complement human perception and our innate ability to see and respond to patterns.  We are wired as humans to perceive meaningful patterns, structure, and outliers in what we see.  This is critical to making smarter decisions and improving productivity, and essential to the broader trend towards self-directed analysis and BI reporting, and tapping into new sources of data.

Visualization also encourages “storytelling” and new forms of collaboration.  It makes it really easy to not only “see” stories in data, but also to highlight what is actionable to colleagues. 

On the other hand, the human mind is limited in its ability to “see” very many correlations at once.  While visualization is great for seeing patterns across 2, or 4 or maybe 6 criteria at a time, it breaks down when there are many more variables than that.  Very few people are able to untangle correlations and patterns across, say, 15 or 25 or 75 or in some cases 300+ criteria that exist in many corporate datasets.

Predictive Analytics, on the other hand, is not capacity constrained!!  It uses mathematical tools and statistical algorithms to examine and determine patterns in one set of data . . . in order to predict behavior in another set of data.  It integrates well with in-memory-data and data visualization, and leads to faster and better decision making.

Making it Simple & Delivering Results.

The challenge is that most of the predictive analytics software tools on the market require the end-user to be able to program in SQL in order to prep data, and have some amount of statistics background to build models in R … or SPSS … or SAS.  At ADVIZOR Solutions our vision has been to empower business analysts and users to build predictive models without any code or statistics background.

[Image: NoCode]

The results have been extremely promising — inquisitive and curious-minded end-users with a sense for causality in their data can easily do this — and are turning around models in just a few hours.  The result is they are using data in new and powerful ways to make better business decisions.

Three Key Enablers to a Simple End-User Process.

The three keys to making this happen are:  (1) having all the relevant data offloaded from the database or datamart into RAM, (2) allowing the business user to explore it visually, and (3) providing a really simple modeling interface.

Putting the data in RAM is key to making it easy to condition so that the business user can create modeling factors (such as time lags, factors from data in multiple tables, etc.) without having to go back and condition data in the underlying databases — which is usually a time consuming process that involves coordinating with IT and/or DBAs. 

Allowing the business user to explore it visually is key to hypothesis generation and vetting about what really matters, before building and running models.

Providing really simple interfaces that automate the actual statistics part of the process lets the business user focus on their data, not the statistics of the model. That simple modeling process includes the following steps (a rough code sketch follows the list):

  • Select the Target & Base Populations
    • The “target” is the group you want to study (e.g., people who responded to your campaign)
    • The “base” is the group you want to compare the target to (e.g., everybody who received the campaign)
  • Visually Explore the data and develop Hypotheses
    • This helps set up which explanatory fields to include …
    • … and which additional ones may need to be added
  • Select list of Explanatory Fields
    • The “explanatory fields” are the factors in your data that might explain what makes the target different from other entities in your data
  • Build Model
  • Iterate
  • Understand and Communicate what the model is telling you
  • Predict / Score Base Population
  • Get lists of Scored potential targets
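
For readers who do write code, the same target-vs-base workflow can be sketched in a few lines (a rough illustration using pandas and scikit-learn, which are my substitutions; ADVIZOR’s actual no-code modeling engine is proprietary, and the file and column names below are hypothetical):

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical campaign data: one row per person who received the campaign
# (the "base" population); "responded" == 1 marks the "target" group.
base = pd.read_csv("campaign.csv")

# Explanatory fields: factors that might make the target different.
explanatory = ["age", "prior_purchases", "emails_opened", "tenure_months"]

# Build the model on the base population.
model = LogisticRegression(max_iter=1000)
model.fit(base[explanatory], base["responded"])

# Understand what the model is telling you: which factors matter most.
print(sorted(zip(explanatory, model.coef_[0]), key=lambda kv: -abs(kv[1])))

# Predict / score the base population and get a ranked list of targets.
base["score"] = model.predict_proba(base[explanatory])[:, 1]
print(base.sort_values("score", ascending=False).head(10))
```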

Check out how you can do this with no code in this 8 min YouTube video.

Best Done In-house with Your Team.

In our experience this type of work is best done in-house with your team.  That’s because it’s not a “black box”, it’s a process.  And since your team knows the data and its context better than anybody else, they are the ones best suited to discuss, interpret, and apply the results.  In our experience, over and over again it has been proven that knowing the data and context is the key factor  … and that you don’t need a statistics degree to do this.

[Image: IncrementalSalesTrends]

Quick Example: Consumer Packaged Goods Sales.

In recent client work a well known consumer packaged goods company was trying to untangle what was driving sales.  They had several key questions they were attempting to answer:

  • What factors drive sales?
  • How do peaks in incremental sales relate to the Social Media spikes?
    • For all brands
    • By each brand
  • How does it vary by media provider?  By type of post?
  • Can we use this data to forecast incremental sales? Which factors have the biggest impact?

They had lots of data, which included sales by brand by week, plus a variety of potential influences: their own promotions, call center stats, social media posts, and mined sentiment from those social media posts (e.g., was the post “positive”, “neutral”, or “negative”). The key step in creating the right explanatory fields was developing time lags for each of these potential influences, since the impact on sales was not necessarily immediate – for example, positive Twitter posts this week may have some impact on sales, but more likely the impact will be on sales +1 week, or maybe +2 weeks, or +4 weeks, etc.
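
In code form, creating such lagged explanatory fields is a one-liner per lag (a minimal sketch with hypothetical file and column names; the actual client work was done in ADVIZOR’s point-and-click interface):

```python
import pandas as pd

# Hypothetical weekly data: one row per (brand, week).
df = pd.read_csv("brand_weekly.csv", parse_dates=["week"]).sort_values(["brand", "week"])

# For each potential influence, create lagged copies (+1, +2, +4 weeks)
# so a model can pick up delayed effects on sales.
influences = ["positive_posts", "negative_posts", "promotions", "call_center_calls"]
for col in influences:
    for lag in (1, 2, 4):
        df[f"{col}_lag{lag}"] = df.groupby("brand")[col].shift(lag)

# Early weeks have no lagged history; drop them before modeling
# sales against the lagged influences.
df = df.dropna()
```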

Powerful Results.

What we learned was that there were multiple influences and their intensity varied by brand. Seasonality was no longer the major driver.  New influences — including social media posts and online promotions — were now in the top spot.  We also learned that the key influences can and should be managed.  This was critical — there are lags between the impact of, for example, a negative Twitter post and when it hits sales. As a result, a quick positive response to a negative post can heavily offset that negative post.

In Summary.

An easy-to-use data discovery and analysis tool that integrates predictive analytics with interactive data visualization, placed in the hands of business analysts and end-users, can make huge differences in how data is analyzed, how fast that can happen, and how it is then communicated to and accepted by the decision makers in an organization.

And, stay tuned.  We’ll next be talking about the people side of predictive analytics — if there is now technology that lets you create and use models without writing any code, then what are the people skills and processes required to do this well?

This is a repost from my Data Visualization Consulting page.

Visitors to this blog generate a lot of requests for my Data Visualization “Advice” (small projects of a few hours or days; no NDA [Non-Disclosure Agreement] involved), for Data Visualization Consulting projects (a few weeks or months; I tend to avoid NDAs, as they can interfere with my blogging activities) and even for full-time work (for example, I got my latest full-time job because my employer often visited and read my blog; an NDA was needed).

Additionally, I sometimes do free-of-charge work if the projects involved are short, extremely interesting to me and beneficial for my Data Visualization blog, like this project:

http://apandre.wordpress.com/2014/01/12/motion-map-chart/

Obviously, all these projects can be done only when I have time to spare from full-time work and/or other projects, duties and activities.

I also cannot relocate or travel, so I do this work mostly from my home office – telecommuting (RDP, Skype, phone, WebEx, GoToMeeting etc.) – or, if the client is local to Massachusetts, I can sometimes visit the client’s site; see below the map of my local “Service Area” – the part of Middlesex County between Routes 495, 3 and 20 – where I can commute to a client’s location (please click on the map below to enlarge the image):

[Image: DVServiceArea]

If I do have time for short-term advisory projects (from 2 hours to 2 weeks), clients usually pay the highest rate, similar to what Qliktech, Spotfire, Tableau or IBM charge for their Consulting Services (I consider my consulting a better service than theirs…). If you go to this thread on the Tableau Community:

http://community.tableausoftware.com/thread/127338 you will find these indicative rates for consulting Tableau work (Qlikview and Spotfire rates are very similar):

Low $125, Max $300, Average around $175 per hour.

Here are the most popular requests for my Advisory work:

  • Visual Design and Architectural Advice for Monitoring or Operational Dashboard(s);
  • Review of Data Visualization work done by my Clients;
  • Prototyping of Data Visualizations (most requested by my visitors);
  • My opinion on the Strengths and Weaknesses of a Data Visualization Vendor/Product, requested by traders, portfolio or hedge fund managers;
  • Advice about what Hardware to buy (say, to get the most from the Tableau License the client has);
  • Advice on what Charts and Filters to use for a given Dataset and Business Logic;
  • Technical Due Diligence on a Data Visualization Startup for the Venture Capitalists investing in that Start-up;
  • Etc…

[Image: 3Paths4Options]

For mid-size projects (from 2 weeks to 6 months) clients get a “progressive” discount – the longer the project, the larger the discount. Here are the most popular requests for my Data Visualization consulting work:

  • Comparing one Data Visualization Product vs. another for a specific Client’s needs and projects;
  • Comparing a Client’s Visualization Product vs. Competitors’ Visualization Products (most requested);
  • Benchmarking one or more Visualization Product(s) against specific data and application logic;
  • Managing a Client’s migration of their Reporting and Analytical IT Infrastructure from obsolete BI Platforms like Business Objects, Cognos and Microstrategy to modern Data Visualization Environments like Tableau, Qlikview and Spotfire;
  • Etc.

[Image: Solution]

Full-time work (engagements of 1 year or more) is not exactly consulting but a full-time job, when clients ask me to join their company. These jobs are similar to ones I had in the past: Director of Visual Analytics, Data Visualization Director, VP of Data Visualization, Principal Data Visualization Consultant, Tableau Architect etc. Here are samples of full-time projects:

  • Created, Maintained and Managed the Data Visualization Consulting Practice for my company/employer;
  • Led the growth of a Data Visualization Community (the latest example – a 4000-strong Tableau Community) with its own Blog, Portal and User Group behind the corporate firewall; created dozens of near-real-time Monitoring Dashboards for Analytical and Data Visualization Communities;
  • Personally Designed and Implemented hundreds of Practical Data Visualizations and Visual Reports, which led to the discovery of trends, outliers, clusters and other Data Patterns, Insights and Actions;
  • Created hundreds of Demos, Prototypes and Presentations for Business Users;
  • Designed Data Visualization Architecture and Best Practices for Dozens of Analytical Projects;
  • Significantly improved the Mindshare of and increased the Web Traffic to my company’s website; Created and Maintained the Data Visualization blog for it.

You can find more observations about the relationship between a full-time salary and an hourly consulting rate in my previous post (from 6 months ago) here: http://apandre.wordpress.com/2013/07/11/contractors-rate/

Data Visualization readings – last 4 months of 2013.

(time to read is shrinking…)

0. The Once and Future Prototyping Tool of Choice
http://tableaufriction.blogspot.com/2013/07/the-once-and-future-prototyping-tool-of.html

1. Block by Block, Brooklyn’s Past and Present
http://bklynr.com/block-by-block-brooklyns-past-and-present/

2. Data Visualization and the Blind
http://www.perceptualedge.com/blog/?p=1756

3. Why Abraham Lincoln Loved Infographics
http://www.newyorker.com/online/blogs/elements/2013/10/why-abraham-lincoln-loved-infographics.html#

4. Old Charts

5. Back To Basics
http://www.quickintelligence.co.uk/back-to-basics/

6. In-Memory Data Grid Key to TIBCO’s Strategy
http://www.datanami.com/datanami/2013-10-21/in-memory_data_grid_key_to_tibco_s_strategy.html

7. Submarine Cable Map
http://visual.ly/submarine-cable-map?view=true

8. Interview with Nate Silver:
http://blogs.hbr.org/2013/09/nate-silver-on-finding-a-mentor-teaching-yourself-statistics-and-not-settling-in-your-career/

9. Qlikview.Next will be available in 2014
http://apandre.wordpress.com/2013/09/25/qlikview-next/

10. Importance of color?
http://www.qualia.hr/why-is-color-so-important-in-data-visualization/#

11. Qlikview.Next has a gift for Tableau and Datawatch
http://apandre.wordpress.com/2013/10/24/qlik-next-has-gift/

12. (October 2013) Tableau posts 90% revenue gain and tops 1,000 staffers, files for $450 million secondary offering
http://www.geekwire.com/2013/tableau-software/#

13. The Science Of A Great Subway Map
http://www.fastcodesign.com/3020708/evidence/the-science-of-a-great-subway-map

14. SEO Data Visualization with Tableau
http://www.blastam.com/blog/index.php/2013/10/how-to-create-awesome-seo-data-visualization-with-tableau/

15. John Tukey “Badmandments”
http://www.kdnuggets.com/2013/11/john-tukey-badmandments-lessons-from-great-statistician.html#
[Image: Tukey]

Supplementary BADMANDMENTS:

  • 91. NEVER plan any analysis before seeing the DATA.
  • 92. DON’T consult with a statistician until after collecting your data.
  • 94. LARGE enough samples always tell the truth

16. Thinking about proper uses of data visualization.
http://data-visualization-software.com/finally-some-clear-thinking-about-proper-uses-of-data-visualization/

17. Big BI is Stuck: Illustrated by SAP BusinessObjects Explorer
http://www.perceptualedge.com/blog/?p=727

18. IBM (trying to catch up?) bets on big data visualization
http://www.zdnet.com/ibm-bets-on-big-data-visualization-7000022741/

19. Site features draft designs and full views of the Treemap Art project (By Ben Shneiderman)
http://treemapart.wordpress.com/
http://www.cs.umd.edu/hcil/treemap-history/
http://www.cs.umd.edu/hcil/treemap/
http://treemapart.wordpress.com/full-views/
http://treemapart.wordpress.com/category/draft-designs/

20. A Guide to the Quality of Different Visualization Venues
http://eagereyes.org/blog/2013/a-guide-to-the-quality-of-different-visualization-venues

21. Short History of (Nothing) Data Science
http://www.forbes.com/sites/gilpress/2013/05/28/a-very-short-history-of-data-science/

22. Storytelling: Hans Rosling at Global Health – beyond 2015

23. DataWatch Quarterly Review: Rapid Growth Finally Materializing
http://seekingalpha.com/article/1872591-datawatch-quarterly-review-rapid-growth-finally-materializing

24. QlikView Extension – D3 Animated Scatter Chart
http://www.qlikblog.at/2574/qlikview-extension-animated-scatter-chart/

[Image: AnimatedScatterChart]

25. SlopeGraph for QlikView (D3SlopeGraph QlikView Extension)
http://www.qlikblog.at/3093/slopegraph-for-qlikview-d3slopegraph-qlikview-extension/

26. Recipe for a Pareto Analysis
http://community.qlikview.com/blogs/qlikviewdesignblog/2013/12/09/pareto-analysis

27. Color has meaning
http://www.juiceanalytics.com/design-principles/color-has-meaning/#
[Image: Meaning-in-color]

28. TIBCO’s Return To License Growth Frustratingly Inconsistent
http://seekingalpha.com/article/1909571-tibcos-return-to-license-growth-frustratingly-inconsistent

29. Automated Semantics and BI
http://www.forbes.com/sites/danwoods/2013/12/30/why-automated-semantics-will-solve-the-bi-dashboard-crisis/

30. What is wrong with definition of Data Science?
http://www.kdnuggets.com/2013/12/what-is-wrong-with-definition-data-science.html
[Image: mout-stats-cs-database]

31. Scientific data became so complex, we have to Invent new Math to deal with it
http://www.wired.com/wiredscience/2013/10/topology-data-sets/

32. Samples

Selected Tableau Readings after TCC13 (since September 18, 2013)

sometimes reading is better than doing or writing…

0. Top 10 sessions from TCC13:
http://www.tableausoftware.com/about/blog/2013/12/top-10-sessions-tcc13-27292

1. Dual Color Axis:
https://www.interworks.com/blogs/wjones/2013/09/18/create-dual-color-axis-tableau

2. Evaluate models with fresh data using Tableau heat maps:
http://cooldata.wordpress.com/2012/07/12/evaluate-models-with-fresh-data-using-tableau-heat-maps/

3. Tableau Throws a Brick at Traditional BI:
http://www.datanami.com/datanami/2013-09-11/tableau_throws_a_brick_at_traditional_bi.html

4. Easy Empty Local Extracts:
http://www.tableausoftware.com/about/blog/2013/9/easy-empty-local-extracts-25152

5. Tableau 8.1: Sophisticated Analytics for Sophisticated People:
http://www.tableausoftware.com/about/blog/2013/9/tableau-81-sophisticated-analytics-sophisticated-people-25177

6. Tableau 8.1 and R (can be interesting for at least 5% of Tableau users):
http://www.tableausoftware.com/about/blog/2013/10/tableau-81-and-r-25327
also see:
https://www.interworks.com/blogs/trobeson/2013/11/27/using-r-tableau-81-getting-started
and here:
http://www.tableausoftware.com/about/blog/r-integration

7. Tableau, When Are You Going to Fix This?
http://www.datarevelations.com/tableau-when-are-you-going-to-fix-this.html

8. Automated PDF Email Distribution of Tableau Views Using PowerShell and Tabcmd:
http://www.interworks.com/blogs/tladd/2013/08/22/automated-pdf-email-distribution-tableau-views-using-powershell-and-tabcmd

9. Geocoding Addresses Directly in Tableau 8.1 Using Integration with R:
http://www.dataplusscience.com/Geocoding%20in%20Tableau%20using%20R.html

10. Best Practices for Designing Efficient Workbooks (and white Paper about it):
http://www.tableausoftware.com/about/blog/2013/10/best-practices-designing-efficient-workbooks-25391

11. Tableau Mapping Architecture:
http://urbanmapping.com/tableau/mapping-architecture.html

12. Story Points in Tableau 8.2 presentation mode:
http://eagereyes.org/blog/2013/story-points

13. Truly Global: Filtering Across Multiple Tableau Workbooks with the JavaScript API:
https://www.interworks.com/blogs/tladd/2013/10/24/truly-global-filtering-across-multiple-tableau-workbooks-javascript-api

14. Tableau 8.1 Worksheet, Dashboard menus improved, still room for more:
http://tableaufriction.blogspot.com/2013/10/tv81-beta-3-worksheet-dashboard-menus.html

15. Lollipops Charts in Tableau:
http://drawingwithnumbers.artisart.org/lollipops-for-quality-improvement/

16. Was Stephen Few Right?
http://www.datarevelations.com/was-stephen-few-right-my-problems-with-a-companys-iron-viz-competition.html

17. Precision Inputs Required In Addition To Analog Controls:
http://tableaufriction.blogspot.com/2013/11/precision-inputs-required-in-addition.html

18. Google Spreadsheets to Tableau connector – a working driver:
http://community.tableausoftware.com/thread/135281

19. Leveraging Color to Improve Your Data Visualization:
http://www.tableausoftware.com/public/blog/2013/10/leveraging-color-improve-your-data-visualization-2174

20. Workbook acts as a container for multiple Tableau-based Charts – 114 Samples and Visualization Types:
http://www.alansmitheepresents.org/2013/07/team-geiger-rides-again.html

21. The New Box-and-Whisker Plot:
http://www.tableausoftware.com/public/blog/2013/11/box-and-whisker-plots-2231

22. The Tableau Workbook Library:
http://www.tableausoftware.com/about/blog/2013/11/tableau-workbook-library-27004

23. Customizing Tableau Server Experience (Parts 1, 1.5, 2):
http://ugamarkj.blogspot.com/2013/11/customizing-tableau-server-experience.html
http://ugamarkj.blogspot.com/2013/12/customizing-tableau-server-experience.html
http://ugamarkj.blogspot.com/2013/12/customizing-tableau-server-experience_15.html

24. SAML Integration in Tableau 8.1:
https://www.interworks.com/blogs/daustin/2013/11/27/saml-integration-tableau-81

25. Tableau file types and extensions:
http://www.theinformationlab.co.uk/2013/12/02/tableau-file-types-and-extensions/

26. Tableau Server XML Information Files: The Master Class:
http://tableaulove.tumblr.com/post/69383091006/tableau-server-xml-information-files-the-master-class

27. Is it Transparency? Is it Opacity? Labeled one, works like the other:
http://tableaufriction.blogspot.com/2013/12/is-it-transparency-is-it-opacity.html

28. Viz Hall of Fame:
http://www.tableausoftware.com/about/blog/2013/12/viz-hall-fame-27270

29. Tableau Weekly Archive:
http://us7.campaign-archive1.com/home/?u=f3dd94f15b41de877be6b0d4b&id=d23712a896

30. 2013 Winners:
http://www.tableausoftware.com/public/blog/2013/12/2013-award-winners-2272
Happy New Year!
[Image: 2014Cubes]

My Best Wishes for 2014 to all visitors of this Blog!

[Image: New2014]

2013 was a very successful year for the Data Visualization (DV) community, for Data Visualization vendors and for this Data Visualization blog (the number of visitors grew from an average of 16000 to 25000+ per month).

From a certain point of view, 2013 was the year of Tableau – it went public, it now has the largest Market Capitalization among DV vendors (more than $4B as of today), its strategy (Data to the People!) became the most popular among DV users, and it (again) had the largest YoY revenue growth (almost 75%!) among DV vendors. Tableau already employs more than 1100 people and still has 169+ job openings as of today. I wish for Tableau to stay the leader of our community and to keep its YoY growth above 50% – this will not be easy.

Qliktech is the largest DV vendor; in 2014 it will pass the half-billion-dollar benchmark in revenue (probably getting closer to $600M by the end of 2014) and will employ almost 2000 people. Qlikview is one of the best DV products on the market. I wish that in 2014 Qliktech will create Cloud Services similar to Tableau Online and Tableau Public, and I wish that Qlikview.Next will keep Qlikview Desktop Professional (in addition to the HTML5 client).

I wish TIBCO would stop trying to improve BI or make it better – you cannot reanimate a dead horse; instead, I wish Spotfire would embrace the “Data to the People” approach and act accordingly. For Spotfire my biggest wish is that TIBCO will spin it off the same way EMC did with VMWare. And yes, I wish Spotfire Cloud Personal will be free and able to read at least local flat files and local DBs like Access.

2014 (or may be 2015?) can witness new, 4th DV player coming to competition: Datawatch bought recently Panopticon and if it will complete integration of all products correctly and add features which other DV vendors above already have (like Cloud Services), it can be very competitive player. I wish them luck!

TibxDataQlikQwchFrom051713To122413

Microsoft released in 2013 a lot of advanced and useful DV-related functionality and I wish (I recycling this wish for many years now) that Microsoft finally will package the most its Data Visualization Functionality in one DV product and add it to Office 20XX (like they did with Visio) and Office 365 instead of bunch of plug-ins to Excel and SharePoint.

It is a mystery for me why Panorama, Visokio and Advizor Solutions still relatively small players, despite all 3 of them having an excellent DV features and products. Based on 2013 IPO experience with Tableau may be the best way for them to go public and get new blood? I wish to them to learn from Tableau and Qlikview success and try this path in 2014-15…

For Microstrategy my wish is very simple – they are only traditional BI player who realised that BI is dead and they started in 2013 (actually before then 2013) a transition into DV market and I wish them all success they can handle!

I also think that a few thousands of Tableau, Qlikview and Spotfire customers (say 5% of customer base) will need (in 2014 and beyond) more deep Analytics and they will try to complement their Data Visualizations with Advanced Visualization technologies they can get from vendors like http://www.avs.com/

My best wishes to everyone! Happy New Year!

y16_84590563

2 months ago TIBCO (symbol TIBX on NASDAQ) announced Spotfire 6 at its TUCON 2013 user conference. This, as well as the follow-up release (around 12/7/13) of Spotfire Cloud, was supposed to be good for the TIBX price. Instead, since then TIBX has lost more than 8%, while NASDAQ as a whole grew more than 5%:

TIBXvsNasdaqFrom1014To121313

For example, at TUCON 2013 TIBCO's CEO re-declared the "5 primary forces for the 21st century" (IMHO all 5 "drivers" sound to me like obsolete IBM-ish sales pitches) – I guess to underscore the relevance of TIBCO's strategy and products to the 21st century:

  1. Explosion of data (sounds like "the Sun rises in the East");

  2. Rise of mobility (any kid with a smartphone will say the same);

  3. Emergence of platforms (not sure this is a good pitch; at least it was not clear from TIBCO's presentation);

  4. Emergence of Asian economies (what else do you expect? This is the side effect of more than a decade of greedy offshoring);

  5. Math trumping science (Mr. Ranadive and various other TUCON speakers kept repeating this mantra, showing that they think statistics and "math" are the same thing and that they do not know how valuable science can be. I personally think that recycling this pitch is dangerous for TIBCO sales, and I suggest replacing this statement with something more appealing and more mature).

Somehow the TUCON 2013 propaganda and the introduction of the new and more capable version 6 of Spotfire and of Spotfire Cloud did not help TIBCO's stock. In trading on Thursday, 12/12/13, the shares of TIBCO Software, Inc. (NASDAQ: TIBX) crossed below their 200-day moving average of $22.86, changing hands as low as $22.39 per share, while Market Capitalization oscillated around $3.9B – basically the same as the capitalization of Tableau Software, a competitor 3 times smaller in terms of employees.

As I said above, just a few days before this low TIBX price, on 12/7/13, as promised at TUCON 2013, TIBCO launched Spotfire Cloud and published its licensing and pricing.

The most disappointing news is that TIBCO in effect withdrew itself from the competition for mindshare with Tableau Public (more than 100 million users, more than 40000 active publishers and visualization authors with Tableau Public profiles), because TIBCO no longer offers free annual evaluations. In addition, the new Spotfire Cloud Personal service ($300/year, 100GB storage, 1 business author seat) became less useful under the new license, since its desktop client has limited connectivity to local data and can upload only local DXP files.

The 2nd cloud option is called Spotfire Cloud Work Group ($2000/year, 250GB storage, 1 business author/1 analyst/5 consumer seats); it gives one author an almost complete TIBCO Spotfire Analyst, with the ability to read 17 different types of local files (dxp, stdf, sbdf, sfs, xls, xlsx, xlsm, xlsb, csv, txt, mdb, mde, accdb, accde, sas7bdat, udl, log, shp), connectivity to standard data sources (ODBC, OleDb, Oracle, Microsoft SQL Server Compact Data Provider 4.0, .NET Data Provider for Teradata, ADS Composite Information Server Connection, Microsoft SQL Server (including Analysis Services), Teradata) and TIBCO Spotfire Maps. It also enables the author to do predictive analytics, forecasting, and local R language scripting.

At first glance this 2nd Spotfire cloud option looks uncompetitive against Tableau Online, which costs 4 times less ($500/year). However (thanks to 2 blog visitors – both named Steve – for the help), you cannot use Tableau Online without a licensed copy of Tableau Desktop ($1999 perpetual, non-expiring desktop license with the 1st year of maintenance included and $400-per-year maintenance in each following year) and an Online license for each consumer (an additional $500/year for access to the same site – but extra storage will not be added to that site!). Let's compare the cumulative cost of Spotfire Cloud Work Group and Tableau Online for 1, 2, 3 and 4 years, for 1 developer/analyst and 5 consumer seats:

 

Cumulative cost for 1, 2, 3 and 4 years of usage/subscription, 1 developer/analyst and 5 consumer seats:

| Year | Spotfire Cloud Work Group, 250GB storage | Tableau Online (with Desktop), 100GB storage | Cost difference (negative if Spotfire is cheaper) |
|------|------------------------------------------|----------------------------------------------|---------------------------------------------------|
| 1    | $2000                                    | $4999                                        | -$2999                                            |
| 2    | $4000                                    | $8399                                        | -$4399                                            |
| 3    | $6000                                    | $11799                                       | -$5799                                            |
| 4    | $8000                                    | $15199                                       | -$7199                                            |
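To make the arithmetic behind this table reproducible (and easy to re-run with your own negotiated prices), here is a minimal Python sketch of the cost model; the price constants are the list prices quoted above, and the seat count assumes 1 developer/analyst plus 5 consumers:

```python
# Cumulative subscription cost model for the table above.
# Prices are the list prices quoted in this post (subject to negotiation).

SPOTFIRE_PER_YEAR = 2000        # Spotfire Cloud Work Group, per year
TABLEAU_DESKTOP = 1999          # perpetual license, 1st-year maintenance included
TABLEAU_MAINTENANCE = 400       # per year, starting with year 2
TABLEAU_ONLINE_SEAT = 500       # per user per year
SEATS = 6                       # 1 developer/analyst + 5 consumers

def spotfire_cumulative(years):
    return SPOTFIRE_PER_YEAR * years

def tableau_cumulative(years):
    return (TABLEAU_DESKTOP
            + TABLEAU_MAINTENANCE * (years - 1)
            + TABLEAU_ONLINE_SEAT * SEATS * years)

for y in range(1, 5):
    s, t = spotfire_cumulative(y), tableau_cumulative(y)
    print(f"Year {y}: Spotfire ${s:,} vs Tableau ${t:,} (difference ${s - t:,})")
# Year 1 prints: Spotfire $2,000 vs Tableau $4,999 (difference $-2,999),
# matching the table above.
```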

UPDATE: You may need to consider some other properties, like the available storage and the number of users who can consume/review the visualizations published in the cloud. In the sample above:

  • Spotfire gives the Work Group a total of 250 GB of storage, while Tableau gives a total of 100 GB to the site. 2 or more subscriptions can be associated with the same site, but that will not increase the storage of the site beyond 100 GB (e.g. to 200 GB for 2 subscribers).
  • Spotfire costs less than Tableau Online for a similar configuration (almost half the cost!)

Overall, Spotfire gives more for your $$$ and as such can be a front-runner in the cloud Data Visualization race, considering that Qlikview does not have any comparable cloud options (yet) and Qliktech is relying on its partners (I doubt they can be competitive) to offer Qlikview-based services in the cloud. Here is the same table as above, but as an image (to make sure all web browsers can see it):

SFvsTBCloudPrice

It is important to consider another advantage of Spotfire Cloud: the ability to share visualizations with everybody on the internet by publishing them into Public Folder(s). By contrast, Tableau has limited licensing for this: in order to access published workbooks on a Tableau Online site, Tableau Software by default requires an extra subscription, which is wrong from my point of view, because you could just publish them in a Public Folder of such a site (if that option were allowed). By default (and without additional negotiations) Tableau Online does not allow the usage of a Public Folder.

The 3rd Spotfire cloud option is called Spotfire Cloud Enterprise; it has customizable seating options and storage, more advanced visualization, security and scalability, and it connects to 40+ additional data sources. It requires annoying negotiations with TIBCO sales, which may result in even higher pricing. The existence of the 3rd Spotfire cloud option decreases the value of the 2nd one, because it tells the customer that Spotfire Cloud Work Group is not the best and does not include many features. Opposite to that is Tableau's cloud approach: you get everything with Tableau Online, which is the only option (with one exception: multidimensional (cube) data sources are not supported by Tableau Online).

Update 12/20/13: TIBCO announced results for the last quarter, ended 11/30/13, with quarterly revenue of $315.5M (only 6.4% growth compared with the same quarter of 2012) and $1070M revenue for the 12 months ended 11/30/13 (only 4.4% growth compared with the same period of 2012). Wall Street people did not like it, and TIBX lost 10% of its value today, with the share price ending at $22 and Market Capitalization going down to less than $3.6B. At the same time Tableau's share price went up $1 to $66, and the Market Capitalization of Tableau Software (symbol DATA) went above $3.9B. As always, I think it is relevant to compare the number of job openings today: Spotfire – 28, Tableau – 176, Qliktech – 71.

Since we are approaching Thanksgiving Day 2013 (in the USA, that is) and shopping is not a sin for a few days, multiple blog visitors asked me what hardware advice I can share for their Data Science and Visualization Lab(s). First of all, I wish you a good turkey for Thanksgiving (below is what I got last year):

Turkey2012

I cannot answer DV Lab questions individually – everybody has their own needs, specifics and budget – but I can share my shopping thoughts about the needs of a Data Visualization Lab (DV Lab). I think a DV Lab needs many different types of devices: smartphones, tablets, a projector (at least 1), maybe a couple of large touchscreen monitors (or LED TVs connectable to PCs), multiple mobile workstations (depending on the size of the DV Lab team), at least one or two super-workstations/servers residing within the DV Lab, etc.

Smartphones and Tablets

I use a Samsung Galaxy S4 as of now, but for DV Lab needs I would consider either the Sony Xperia Z Ultra or the Nokia 1520, with the hope that the Samsung Galaxy S5 will be released soon (and maybe it will be the most appropriate for the DV Lab):

sonyVSnokia

My preference for a tablet would be the upcoming Google Nexus 10 (2013 or 2014 edition – it is not clear, because Google is very secretive about it) and in certain cases the Google Nexus 7 (2013 edition). Until the next-generation Nexus 10 is released, I guess the two leading choices are the ASUS Transformer Pad TF701T

t701

and the Samsung Galaxy Note 10.1 2014 edition (below is a relative comparison of the sizes of these 2 excellent tablets):

AsusVsNote10

Projectors, Monitors and maybe Cameras.

The next piece of hardware on my mind is a projector with support for full HD resolution and large screens. I think there are many good choices here, but my preference is the BENQ W1080ST for $920 (please advise if you have a better projector in mind in the same price range):

benq_W1080ST

So far you cannot find many touchscreen monitors at a reasonable price, so maybe these two 27″ touchscreen monitors (DELL P2714T for $620 or Acer T272HL bmidz for $560) are good choices for now:

dell-p2714t-overview1

I also think that a good digital camera can help a Data Visualization Lab, and I am considering something like this (it can be bought for $300) for myself: the Panasonic Lumix DMC-FZ72, with 60X optical zoom and the ability to record motion pictures as HD video at 1,920 x 1,080 pixels:

panasonic_lumix_dmc_fz72_08

Mobile and Stationary Workstations and Servers.

If you need to choose a CPU, I suggest starting with Intel's Processor Feature Filter here: http://ark.intel.com/search/advanced . In terms of mobile workstations, you can get a quad-core notebook (like the Dell Precision M4700 for $2400, the Dell Precision M4800, or the HP ZBook 15 for $3500) with 32 GB RAM and a decent configuration with multiple ports; see a sample here:

m4700

If you are OK with 16GB of RAM for your workstation, you may prefer the Dell M3800, with an excellent touchscreen monitor (3200×1800 resolution) and only 2 kg of weight. For a stationary workstation (or rather a server), good choices are the Dell Precision T7600 or T7610 or the HP Z820 workstation. Either of these workstations (it will cost you!) can support up to 256GB RAM, up to 16 or even 24 cores (in the case of the HP Z820), multiple high-capacity hard disks and SSDs, excellent video controllers and multiple monitors (4 or even 6!). Here is an example of the backplane of the HP Z820 workstation:

HP-z820

I wish the visitors of this blog Happy Holidays and good luck with their DV Lab shopping!

A famous traditional BI vendor got sick and tired of being out of the Data Visualization market and decided to insert itself into it by force, releasing today 2 free (for all users) Data Visualization products:

  • MicroStrategy Analytics Desktop™ (Free self-service visual analytics tool)

  • MicroStrategy Analytics Express™ (Free Cloud-based self-service visual analytics)

That looks to me like a huge disruption of the Data Visualization market: for example, the similar desktop product from Tableau costs $1999, and the cloud product called Tableau Online costs $500/year/user. It puts Tableau, Qlikview and Spotfire in a very tough position price-wise. However, only the Tableau stock went down today – almost $3 (more than 4%) – while MSTR, TIBX and QLIK basically did not react to the Microstrategy announcement:

DataMstrQlikTibx

And don't think that only Microstrategy is trying to get into the DV market. For example, SAP did something similar (in a less dramatic and non-disruptive fashion) a few months ago with SAP Lumira (the Personal Edition is free; SAP Cloud and Standard editions are available too), see it here http://www.saplumira.com/index.php and here http://store.businessobjects.com/store/bobjamer/en_US/Content/pbPage.sap-lumira . When SAP senior vice president and platform head Steve Lucas was asked 10 weeks ago if SAP would consider buying Tableau, Lucas went in the opposite direction. "We aren't going to buy Tableau," Lucas said with a smile on his face. "There's no need to buy an overvalued software company." Rather, SAP wants to crush companies like Tableau (I doubt it is possible, but SAP is free to try) and build its own Data Visualization product line out of Lumira; read more at

http://venturebeat.com/2013/07/30/sap-platform-head-tableau-overvalued/#yFzUpzOh6ivMYvqP.99

If I were Tableau, Qlikview or Spotfire, I would not worry about the Microstrategy competition yet, because it is unclear how the future R&D for the free Analytics Desktop and Express will be funded – out of the MicroStrategy Analytics Enterprise™ R&D budget? That can be tricky, considering that right now Tableau is hiring hard (163 open job positions as of yesterday!), Qliktech is very active too (about 93 openings as of yesterday), and even TIBCO has 36 open positions just for Spotfire alone.

But I may start to worry about another DV vendor – Datawatch, which recently completed the acquisition of Panopticon. Datawatch grew 45% YoY (2012 over 2011), has only 124 employees but $27.5M in sales, very experienced leadership, 40000+ customers worldwide and a mature product line. Maybe more evidence of it is here:

http://online.wsj.com/article/PR-CO-20131023-907942.html

The three MicroStrategy Analytics Platform products also share a common user experience – making it easy to start small with self-service analytics and grow into the production-grade features of Enterprise. Desktop and Express from Microstrategy can be naturally extended (for a fee) to a new enterprise-grade BI & DV suite, also released today and called MicroStrategy Analytics Enterprise™ (also known as Microstrategy Suite 9.4).

The new MicroStrategy Analytics Enterprise 9.4 includes data blending, which allows users to combine data from more than one source; the software stores the data in working memory, without the need for a separate data integration product. 9.4 can connect to the MongoDB NoSQL data store as well as Hadoop distributions from Hortonworks, Intel and Pivotal. It comes with R and adds better ESRI integration. The application can now fit 10 times as much data in memory as the previous version could, and self-service querying now runs up to 40 percent faster.

The MicroStrategy Analytics Enterprise™ Suite is also available starting today for free for developers and non-production use: 10 named user licenses of MicroStrategy Intelligence Server, MicroStrategy Web Reporter and Analyst, MicroStrategy Mobile, MicroStrategy Report Services, MicroStrategy Transaction Services, MicroStrategy OLAP Services, MicroStrategy Distribution Services, and MultiSource Option; 1 named user license of the development software: MicroStrategy Web Professional, MicroStrategy Developer, and MicroStrategy Architect (the server components have a 1-CPU limit).

Quote from Wayne Eckerson, President of  BI Leader Consulting: “The new MicroStrategy Analytics Desktop makes MicroStrategy a top-tier competitor in the red-hot visual discovery market. The company was one of the first traditional enterprise BI vendors to ship a visual discovery tool, so its offering is mature compared to others in its peer group, but it was locked away inside its existing platform. By offering a stand-alone desktop visual discovery tool and making it freely available, MicroStrategy places itself among” Data Visualization Leaders.

You can also read today's article from a very frequent visitor to my blog (his name is Akram), who is a portfolio and hedge manager, daily trader and an excellent investigator of all Data Visualization stocks, the DV market and DV vendors. His article "Tableau: The DV Market Just Got More Crowded" can be found here (I cannot resist quoting: "Microstrategy is priced like it has nothing to do with this space, and Tableau is priced like it will own the whole thing."):

http://seekingalpha.com/article/1760432-tableau-the-dv-market-just-got-more-crowded?source=yahoo

Heatmap generated by Microstrategy Analytics Desktop

MicroStrategy Analytics Desktop.

It's free visual analytics: free Visual Insight, 100MB per file, 1GB total storage, 1 user, free e-mail support for 30 days, and free access to online training, forum, and knowledge base.
Data sources: xls, csv, RDBMSes, multidimensional cubes, MapReduce, columnar DBs; access with a web browser; export to Excel, PDF, Flash and images; email distribution. The product is freely available to all and can be downloaded instantly at: http://www.microstrategy.com/free/desktop .

Trellis of Bar Charts generated by Microstrategy Analytics Desktop

Kevin Spurway, MicroStrategy's vice president of industry and mobile marketing, said: "The new desktop software was designed to compete with other increasingly popular self-serve, data-discovery desktop visualization tools offered by Tableau and others." To work with larger data sets, a user should have 2GB or more of working memory on the computer, Spurway said. See more here:

http://www.microstrategy.com/Strategy/media/downloads/free/analytics-desktop_quick-start-guide.pdf

MicroStrategy Analytics Express.

MicroStrategy Analytics Express is a software-as-a-service (SaaS)-based application that delivers all the rapid-fire self-service analytical capabilities of Desktop, plus reports and dashboards, native mobile applications, and secure team-based collaboration – all instantly accessible in the Cloud. Today, the Express community includes over 32,000 users across the globe.

In this release, Express inherits all the major functional upgrades of the MicroStrategy Analytics Platform, including new data blending features, improved performance, new map analytics, and much more. For a limited time, MicroStrategy is also making Express available to all users free for a year. With this valuable offer, users will be able to establish an account, invite tens, hundreds, or even thousands of colleagues to connect, analyze and share their data and insight, and do it all at no charge. For some organizations, the potential value of this offer can be $1 million or more. Users can sign up, access the service, and take advantage of this offer instantly at

www.microstrategy.com/free/express

MicroStrategy Analytics Express includes free Visual Insight, free web browser and iPad access, free SaaS for one year, 1GB upload per file, an unlimited number of users, free e-mail support for 30 days, and free access to online training, forum, and knowledge base. Data sources: xls, csv, RDBMSes, columnar DBs, Dropbox, a Google Drive connector, Visual Insight, a lot of security and a lot more; see http://www.microstrategy.com/Strategy/media/downloads/free/analytics-express_user-guide.pdf

All tools from the Microstrategy Analytics Platform (Desktop, Express and Enterprise Suite) support a standard list of chart styles and types: Bar (Vertical/Horizontal, Clustered/Stacked/100% Stacked), Line (Vertical/Horizontal, Absolute/Stacked/100% Stacked), Combo Chart (of Bar and Area), Area (Vertical/Horizontal, Absolute/Stacked/100% Stacked),

Area Chart Generated by Microstrategy Analytics Express

Dual Axis (Bar/Line/Area, Vertical/Horizontal), HeatMap, Scatter, Scatter Grid, Bubble, Bubble Grid, Grid,

Data Grid generated by Microstrategy Analytics Express

Pie, Ring, ESRI Maps,

Microstrategy Analytics Desktop and Express integrate and generate ESRI Map Visualizations

Network of Nodes, with lines representing links/connections/relationships,

Network Graph Generated by Microstrategy Analytics Express

Microcharts and Sparklines,

e4Microcharts

Data and Word Clouds,

DataCloud

and of course any kind of interactive Dashboards as combination of all of the above Charts, Graphs, and Marks:

Interactive Dashboard Generated by Microstrategy Analytics Express

Yesterday TIBCO announced Spotfire 6, with features competitive with Tableau 8.1 and Qlikview.Next (a.k.a. Qlikview 12). Some new features will be showcased at TUCON® 2013, TIBCO's annual user conference, October 14-17, 2013 (2100 attendees). The livestream video is here: http://tucon.tibco.com/video/index.html ; tune in October 15th and 16th from 11:30am – 3:30pm EST.

More details will be shown in webcasts and webinars (I personally prefer detailed articles, blogposts, slides, PDFs and demos, but TIBCO's corporate culture has ignored my preferences for years) on 10/30/13 by Steve Farr.

Spotfire 6.0 will be available in mid-November, presumably at the same time as Tableau 8.1 and before Qlikview.Next, so TIBCO is not a loser in the leap-frogging game, for sure…

TIBCO bought Extended Results and will presumably show the integration with the PushBI product, see it here: http://www.pushbi.com/ ; TIBCO describes it as the delivery of personal KPIs and metrics on any mobile phone, tablet or laptop, online or offline (the new name for it will be TIBCO Spotfire® Consumer):

ipadiphone

Another TIBCO purchase is MAPORAMA, and TIBCO calls the integration with it (very appropriately) Location Analytics, with the promise to

  • Visualize, explore and analyze data in the context of location

  • Expand situational understanding with multi-layered geo-analytics

  • Mashup new data sources to provide precise geo-coding across the enterprise

la1

Spotfire Location Services is an agnostic platform and supports (I guess this needs to be verified, because it sounds too good to be true) any map service, including TIBCO's own, ESRI (Spotfire integrated with ESRI previously) and Google:

GeoCodingSP6

TIBCO has event processing capabilities (e.g. BusinessEvents (5.1.2 now), ActiveSpaces (currently v. 2.2) and real-time streaming of "Big Data" with StreamBase (7.3.7), which they bought a few months ago, see it here: http://www.streambase.com/news-and-events/press-releases/pr-2013/tibco-software-acquires-streambase-systems/#axzz2hiEjnr9X ), and it will be interesting to see the new Spotfire Events Analytics product (to spot event patterns; see also: http://www.streambase.com/products/streambasecep ) integrated with Spotfire 6:

  • Identify new trends and outliers with continuous process monitoring

  • Automate the delivery of analytics applications based on trends

  • Operationalize analytics to support continuous process improvement:

ea1

One more capability of Spotfire 6, mentioned (this claim needs to be verified) in a recent TIBCO blogpost http://www.tibco.com/blog/2013/10/11/connecting-the-loops-the-next-step-in-decision-management/ , is the ability to overlap 2 related but (in real life) separated processes: the process of analysis (discovery of insights in data) and the process of execution (deciding and acting) could be separated by days, but with Spotfire 6.0 the entire decision process can happen in real time:

spotfireloops

For business users Spotfire 6 has new web-based authoring (Spotfire has a few "clients", one called Web Player and another called Enterprise Player; both are not free, unlike the free Tableau Reader or Tableau Public). Bridging the gap between simple dashboards and advanced analytic applications, Spotfire 6.0 provides a new client "tailored to meet the needs of the everyday business user, who typically has struggled to manipulate pivot tables and charts to address their data discovery needs".

With this new web application, known as TIBCO Spotfire® Business Author, business users can visually explore and interact with data, whether it resides in a simple spreadsheet or dashboard, a database, or a predefined analytic application. It will definitely compete with the web authoring in Tableau 8.1 and in the incoming Qlikview.Next.

For me personally the most interesting new feature is the new Spotfire Cloud Services (supposedly the continuation of Spotfire Silver, which I like, but it is overpriced and non-competitive storage-wise vs. the Tableau Public and Tableau Online cloud services). Here is a quote from yesterday's press release: "TIBCO Spotfire® Cloud is a new set of cloud services for enterprises, work groups, and personal use" (see some preliminary info here: https://marketplace.cloud.tibco.com/marketplace/marketplace/apps#/sc ):

  • Personal: Web-based product; upload Excel, .csv and .txt data; 12 visualization types; 100 GB of data storage. However, Spotfire is making a big mistake by denying access to the Spotfire Analyst desktop product, offering it not for free but only as a "free trial for 30 days", after which you have to pay a fee. That will benefit Tableau for sure, and maybe even Datawatch. As of 11/13/13, Spotfire still has not posted prices and fees for Spotfire Cloud Personal etc. and suggested contacting them over email, which I did, but they never replied…
  • Workgroup: Web-based and desktop product; connect and integrate multiple data sources; all visualization types; 250 GB of data storage.
  • Enterprise: Web-based and desktop product; connect to 40+ data sources; all visualization types; advanced statistics services; 500 GB of data storage.

"TIBCO Spotfire® Cloud Enterprise provides a secure, full-featured version of Spotfire in the cloud to analyze and collaborate on business insights, whether or not the data is hosted. For project teams seeking data discovery as a service, TIBCO Spotfire® Cloud Work Group provides a wealth of application-building tools so distributed teams can visually explore data quickly and easily and deploy analytic applications at a very low cost. For individuals looking for a single step to actionable insight, TIBCO Spotfire® Personal is a cost-effective web-based client for quick data discovery needs."

Please don't forget that Spotfire 6 has TIBBR (v.5 as of now: https://tibco.tibbr.com/tibbr/web/login ), a social computing platform built for the workplace and integrated with Spotfire. Ram Menon, President of Social Computing at tibbr, says, "We now have 6.5 million users for tibbr as of October [2013]", accessed from 7,000 cities and 2,100 different device models. "A typical tibbr post is now seen by 100 users in the span of 24 hours, in 7 countries and over 50 mobile devices." This fulfills TIBCO's mission of getting the right information to the right people, at the right time. Related: the integration between TIBBR and HUDDLE: http://www.huddle.com/blog/huddle-and-tibbr-unite-to-bring-powerful-collaboration-to-enterprise-social-networking/

And finally – an enterprise-class, R-compatible statistical engine: TIBCO Enterprise Runtime for R (TERR), which is part of the excellent TIBCO Spotfire Statistics Services (TSSS). TSSS allows the integration of R (including TERR), Spotfire's own S+ (S-Plus is Spotfire's commercial version of R), SAS® and MATLAB® into Spotfire and custom applications. TERR (see http://spotfire.tibco.com/en/discover-spotfire/what-does-spotfire-do/predictive-analytics/tibco-enterprise-runtime-for-r-terr.aspx ) supports:

  • Parallelized R-language scripts in TERR

  • Call-outs to open source R from TERR

  • Using RStudio – the most popular IDE in the R community – to develop your TERR scripts

  • Over a thousand TERR-compatible CRAN packages

Among other news is support for new data sources: http://spotfire.tibco.com/en/resources/support/spotfire-data-sources.aspx including SAP NetWeaver Business Warehouse v.7.0.1 (requires the TIBCO Connector Link).

General notes:

  1. I maintain my opinion that the best way for TIBCO to capitalize on the tremendous hidden market value of Spotfire is to spin it off, as EMC did with VMWare.

  2. My other concern is that too many offices are involved with Spotfire: (parental) TIBCO's HQ in California, the Swedish HQ (mostly R&D) office in Sweden, and the large marketing, sales, support and consulting office in Newton, Massachusetts. My advice is to have only one main office, in MA, which is compatible with the spin-off idea. Tableau has an advantage here, concentrating its main office in Seattle.

  3. Update 11/13/13: TIBCO's Spotfire propaganda so far did not help TIBCO stock shares at all, but it seems to me that it helped the Datawatch stock price a lot (Datawatch recently bought a technically very capable DV vendor, Panopticon, and integrated its own software with Panopticon's; Datawatch has 40000+ customers with 500000+ end users).

Last month Tableau and Qliktech both declared that traditional BI is too slow (I have been saying this for many years) for development and that their new Data Visualization (DV) software is going to replace it. A quote from Tableau's CEO Christian Chabot: "Traditional BI software is obsolete and dying" – a very direct challenge and threat to BI vendors: your (BI) time is over and now it is Tableau's time. A similar quote from Anthony Deighton, Qliktech's CTO & Senior VP, Products: "More and more customers are looking at QlikView not just to supplement traditional BI, but to replace it".

One of my clients – a large corporation (obviously I cannot name it due to an NDA) – asked me to advise on what to choose between traditional BI tools with a long development cycle (like Cognos, Business Objects or Microstrategy), modern BI tools (like JavaScript and the D3 toolkit), which attempt to modernize traditional BI but still have sizable development time, and leading Data Visualization tools with minimal development time (like Tableau, Qlikview or Spotfire).

The main criteria for the client were:

  • minimize the IT personnel involved and increase their productivity;

  • minimize off-shoring and outsourcing, as they limit interactions with end users;

  • increase end users' involvement, feedback and action discovery.

So I advised the client to take a typical visual report project from the most productive traditional BI platform (Microstrategy), use its prepared data, and clone it with D3 and Tableau (using experts for both). The results (development time in hours) are below; all three projects include the same time (16 hours) for data preparation & ETL, the same time for deployment (2 hours) and the same number (8) of repeated development cycles (due to 8 consecutive rounds of feedback from end users):

DVvsD3vsBI
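To show how these estimates were composed, here is a minimal Python sketch; the fixed hours are the ones stated above (16h data preparation & ETL, 2h deployment, 8 development cycles), while the hours-per-cycle numbers are purely illustrative placeholders, not the measured results shown in the chart:

```python
# Development-time model: fixed costs plus feedback-driven cycles.
# The per-cycle numbers below are hypothetical, for illustration only.

DATA_PREP_HOURS = 16    # same for all three projects
DEPLOYMENT_HOURS = 2    # same for all three projects
CYCLES = 8              # consecutive rounds of end-user feedback

hours_per_cycle = {
    "Microstrategy (traditional BI)": 40,   # illustrative
    "D3 (modernized BI)": 16,               # illustrative
    "Tableau (Data Visualization)": 2,      # illustrative
}

for tool, per_cycle in hours_per_cycle.items():
    total = DATA_PREP_HOURS + DEPLOYMENT_HOURS + CYCLES * per_cycle
    print(f"{tool}: {total} hours total")
```

The point of the model is that the fixed parts are identical, so the total is dominated by how fast each platform can turn one round of end-user feedback into a new version.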

It is clear that traditional BI requires too much time, and that D3 tools just try to prolong the old/dead BI tradition by modernizing and beautifying the BI approach; so my client chose Tableau as a replacement for Microstrategy, Cognos, SAS and Business Objects and as a better option than D3 (which requires smart developers and too much development). This movement to leading Data Visualization platforms is going on right now in most of corporate America, despite IT inertia and existing skillsets. Basically it is an application of the simple, known principle that "Faster is better than Shorter", known in science as Fermat's Principle of Least Time.

These changes made me wonder (again) whether Gartner's recent market share estimates and trends for dead-horse sales (old traditional BI) will hold for long. Gartner estimates the size of the BI market at $13B, which is drastically different from the TBR estimate ($30B).

BIDeadHorseTheory

TBR predicts that it will keep growing at least until 2018, at a yearly rate of 4%, and that the BI software market will exceed $40 billion by 2018 (they estimate the BI market at $30B in 2012 and include the wider category of Business Analytics software, as opposed to strictly BI tools). I added estimates for Microstrategy, Qliktech, Tableau and Spotfire to Gartner's market share estimates for 2012 here:

9Vendors

However, when Forrester asked people what BI tools they use, its survey results were very different from Gartner's estimate of "market share":

BIToolsInUse

"Traditional BI is like a pencil with a brick attached to it," said Chris Stolte at the recent TCC13 conference, and Qliktech said something very similar in its recent announcement of Qlikview.Next. I expect TIBCO to say something similar about the upcoming new release of Spotfire (next week at the TUCON 2013 conference in Las Vegas?)

Tableau_brick2

These bold predictions by the leading Data Visualization vendors are just a simple application of Fermat's Principle of Least Time: this principle states that the path taken between two points by a ray of light (or a development path, in our context) is the path that can be traversed in the least time.

Pierre_de_Fermat2

Fermat's principle can easily be applied to "path" estimates in multiple situations, like in the video below, where the path from the initial position of the lifeguard on the beach to the swimmer in distress (a path through sand, shoreline and water) is explained:
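For readers who prefer formulas to videos, here is the standard least-time calculation behind the lifeguard example (a sketch: v1 is the running speed on sand, v2 the swimming speed, a and b the distances from the shoreline, d the horizontal offset, and x the point where the lifeguard enters the water):

```latex
% Total time as a function of the crossing point x:
T(x) = \frac{\sqrt{a^2 + x^2}}{v_1} + \frac{\sqrt{b^2 + (d - x)^2}}{v_2}

% Setting dT/dx = 0 gives the least-time condition:
\frac{x}{v_1\,\sqrt{a^2 + x^2}} = \frac{d - x}{v_2\,\sqrt{b^2 + (d - x)^2}}

% which is exactly Snell's law:
\frac{\sin\theta_1}{v_1} = \frac{\sin\theta_2}{v_2}
```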

Even ants follow Fermat's principle (as described in an article at the Public Library of Science here: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0059739 ), so my interpretation of this law of nature ("Faster is better than Shorter") is that traditional BI is a dying horse, and I advise everybody to obey the laws of nature.

AntsOn2Surfaces

If you would like to watch another video about Fermat's principle of least time and the related Snell's law, you can watch this:

Qlikview 10 was released around 10/10/10 and Qlikview 11 around 11/11/11, so I expected Qlikview 12 to be released on 12/12/12. Qliktech's press release said today that the next (after 11.2) version of Qlikview will be delivered under the new nickname Qlikview.Next in 2014, but "for early adopter customers in a production environment in 2013". I hope I can get my hands on it ASAP!

The new buzzword is Natural Analytics: "QlikView.Next's key value as an alternative BI platform is in its use of Natural Analytics". The new Qliktech motto that "Qlikview is a replacement of traditional BI" is similar to what we heard from Tableau leaders just 2 weeks ago at the Tableau Customer Conference in Washington, DC. Other themes I hear from Qliktech about Qlikview.Next sound familiar too: Gorgeous, Genius, Visually Beautiful, Associative Experience, Comparative Analysis, Anticipatory, Drag and Drop Analytics.

Qlikview.Next will introduce "Data Dialogs" – live discussions between multiple users about the data they see and explore collectively, enabling "Social BI". This reminds me of the integration between TIBBR (TIBCO's collaboration platform) and Spotfire, which has existed since Spotfire 4.0.

Details about the new features in Qlikview.Next will be released later, but at least we know now when Qlikview 12 (sorry, Qlikview.Next, that is) will be available. Some features were actually unveiled, in generic terms:

  • Unified, browser-based HTML5 client, which will automatically optimize itself for the user's device;

  • Automatic and Intelligent re-sizing of objects to fit user’s screen;

  • Server-side Analysis and Development, Web-based creation and delivery of content, Browser-based Development;

  • Data Storytelling, narrative and social with Data Dialogs;

  • Library and Repository for UI objects;

  • Multi-source Data Integration and new web-based scripting;

  • QlikView Expressor for advanced graphical Data Integration and Metadata Management;

  • Improved Data Discovery with associative experience across all the data, both in memory and on disks;

  • Open API: JSON, a .NET SDK and a JavaScript API;

  • All UI Objects can be treated as extension Objects, customizable with their source files available to developers;

  • New Management Console with a Qlikview-on-Qlikview monitor;

  • New visualization capabilities, based on the advanced data visualization suite from NComVA (bought by Qliktech a few months ago); potential samples here: http://www.ncomva.se/guide/?chapter=Visualizations

NComVAVisualizations11

In addition, Qliktech is launching the "Qlik Customer Success Framework", which includes:

  • Qonnect Partner Program: An extensive global network of 1500+ partners, including resellers, OEMs, technology companies, and system integrators.

  • Qlik Community: An online community with nearly 100,000 members, comprising customers, partners, developers and enthusiasts.

  • Qlik Market: An online showcase of applications, extensions and connectors.

  • Qoncierge: A single point of contact service offering for customers to help them access the resources they need.

  • Comprehensive Services: A wide range of consulting services, training and support.

QlikFramework

Also see Ted Cuzzillo's blogpost about it here: http://datadoodle.com/2013/10/09/next-for-qlik/# , Cindi Howson's old post here: http://biscorecard.typepad.com/biscorecard/2012/05/qliktech-shares-future-product-plans-for-qlikview.html and a new article here: http://www.informationweek.com/software/business-intelligence/qliktech-aims-to-disrupt-bi-again/240162403#!

Today the Tableau Customer Conference 2013 started, with 3200+ attendees from 40+ countries and 100+ industries, 700 Tableau employees and 240 sessions. Tableau 8.1 was pre-announced today for release in the fall of 2013; version 8.2 is planned for winter 2014 and Tableau 9.0 for later in 2014.

Update 9/10/13: the keynote is now available recorded and online: http://www.tableausoftware.com/keynote
(recorded Monday, Sept 9, 2013: Christian Chabot, Chris Stolte and the developers LIVE)

New in 8.1: 64-bit, integration with R, support for SAML, IPv6 and external load balancers, copy/paste of dashboards and worksheets between workbooks, a new calendar control, your own visual style (including customizing even filters), Tukey's box-and-whisker plot, prediction bands, ranking, and visual analytics for everyone and everywhere (in the cloud now).

Planned and new for 8.2: Tableau for Mac, Story Points (a new type of worksheet/dashboard with mini-slides as story points), seamless access to data via a data connection interface to visually build a data schema (including inner/left/right/outer visual joins, beautified column names, easier metadata, etc.), web authoring enhancements (some may get into 8.1: moving quick filters, improvements for tablets, color encoding), etc.

8.1: Francois Ajenstat announced: 64-bit, finally (I have asked for that for many years), for server processes and for Desktop; support for SAML (single sign-on for Server and Desktop), IPv6 and external load balancers:

Francois

SAML

8.1: Dave Lion announced the R integration with Tableau:

DaveLion

r

8.1: Mike Arvold announced "visual analytics for everyone", including the implementation of the famous Tukey box-and-whisker plot (Spotfire has had it for a while, see it here: http://stn.spotfire.com/stn/UserDoc.aspx?UserDoc=spotfire_client_help%2fbox%2fbox_what_is_a_box_plot.htm&Article=%2fstn%2fConfigure%2fVisualizationTypes.aspx ),

better forecasting, prediction bands, ranking, better heatmaps:

MikeArvold

8.1: Melinda Minch announced "fast, easy, beautiful" – most importantly, copy/paste of dashboards and worksheets between workbooks, customizing everything (including quick filters), a new calendar control, your own visual style, folders in the Data window, etc…

MelindaMinch2

8.2: Jason King pre-announced seamless access to data via a data connection interface to visually build a data schema, including inner/left/right/outer "visual" joins, beautified column names, default formats, new functions like DATEPARSE, appending a data set with new tables, easier metadata, etc.

JasonKingSeamlessAccess2data2

8.2: Robert Kosara introduced Story Points (using a new type of worksheet/dashboard with mini-slides as story points) for the new storytelling functionality:

RobertKosara2

Here is an example of Story Points, done by Robert:

storypoints-4

8.2: Andrew Beers pre-announced Tableau 8.2 on Mac, and he got a very warm reception from the audience for that:

AndrewBeers3

Chris Stolte proudly mentioned his 275-strong development team, pre-announced the upcoming Tableau releases 8.1 (this fall), 8.2 (winter 2014) and 9.0 (later in 2014), and introduced the 7 "developers" (see above: Francois, Mike, Dave, Melinda, Jason, Robert and Andrew) who discussed the new features during this keynote (the feature list is definitely longer and wider than the recent "innovations" we saw from Qlikview 11.2 and even from Spotfire 5.5):

ChrisStolte2

Christian Chabot opened the keynote today… He said something important: current BI platforms are not fast, not easy, not beautiful and not for anyone, and they are definitely not "anywhere" but only in designated places with appropriate IT personnel (compare with Tableau Public, Tableau Online, the free Tableau Reader, etc.); all they are capable of is producing a bunch of change requests from one enterprise department to another, which will take a long time to implement with any SDLC framework.

CEO

Christian basically repeated what I have been saying on this blog for many years (check it here: http://apandre.wordpress.com/market/competitors/ ): traditional BI software (from SAP, IBM, Oracle, Microstrategy and even Microsoft; none of them can compete with Tableau, Qlikview and Spotfire) is obsolete and dying, and this is a very direct challenge and threat to BI vendors (I am not sure they understand that): your (BI) time is over and now it is the time of Tableau (also of Qlikview and Spotfire, but they are slightly behind now…).

Update on 11/21/13: Tableau 8.1 is available today, see it here: http://www.tableausoftware.com/new-features/8.1 and Tableau Public 8.1 is available as well, see it here: http://www.tableausoftware.com/public/blog/2013/11/tableau-public-81-launches-2226

While this blog preserves my observations and thoughts, it prevents me from spending enough time reading what other people think and say, so almost 2 years ago I created an extension of this blog in the form of 2 Google+ pages, http://tinyurl.com/VisibleData and http://tinyurl.com/VisualizationWithTableau , where I accumulate reading pointers for myself and gradually read those materials when I have time.

Those 2 pages magically became extremely popular (an unintended result), with a total of more than 5000 Google+ followers as of today. For example, here is a chart showing the monthly growth of the number of followers of the main extension of this blog, http://tinyurl.com/VisibleData :

GPFollowersMonthly

So please see below some samples of the reading pointers accumulated over the last 3 months of summer by my Google+ pages:

One author tries to simplify the definition of BigData as follows: "BigData Simplified: Too much data to fit into a single server": http://yottascale.com/entry/the-colorful-secrets-of-bigdata-platforms

Recent talk from Donald Farmer: http://www.wired.com/insights/2013/06/touch-the-next-frontier-of-business-intelligence/

Dmitry points to the implementation disaster of Direct Discovery in Qlikview 11.2: http://bi-review.blogspot.com/2013/04/first-look-at-qlikview-direct-discovery.html

Specs for Tableau in Cloud: https://www.tableausoftware.com/products/online/specs

The DB-Engines Monthly Ranking ranks database management systems according to their popularity. It turns out that only 3 DBMSes are popular: Oracle, SQL Server and MySQL:

According to Dr. Andrew Jennings, chief analytics officer at FICO and head of FICO Labs, the three main skills of a data scientist are the same 3 skills I tried to find when hiring programmers for my teams 5, 10, 20 and more years ago: 1. Problem-solving skills. 2. Communication skills. 3. Open-mindedness. This makes all my hires over the last 20+ years data scientists, right? See it here: http://www.informationweek.com/big-data/news/big-data-analytics/3-key-skills-of-successful-data-scientis/240159803

A study finds the odds of rising to another income level are notably low in certain cities, like Atlanta and Charlotte, and much higher in New York and Boston: http://www.nytimes.com/2013/07/22/business/in-climbing-income-ladder-location-matters.html

Tableau is a prototyping tool: http://tableaufriction.blogspot.com/2013/07/the-once-and-future-prototyping-tool-of.html

Why More Data and Simple Algorithms Beat Complex Analytics Models: http://data-informed.com/why-more-data-and-simple-algorithms-beat-complex-analytics-models/

New Census Bureau Interactive Map Shows Languages Spoken in America: http://www.census.gov/newsroom/releases/archives/education/cb13-143.html

Google quietly open-sourced a tool called word2vec, prepackaged deep-learning software designed to understand the relationships between words with no human guidance. It is actually similar to methods known for a decade, called PLSI and PLSA:

“Money is not the only reward of education, yet it is surely the primary selling point used to market data science programs, and the primary motivator for students. But there’s no clear definition of data science and no clear understanding of what knowledge employers are willing to pay for, or how much they will pay, now or in the future. Already I know many competent, diligent data analysts who are unemployed or underemployed. So, I am highly skeptical that the students who will invest their time and money in data science programs will reap the rewards they have been led to expect.”: http://www.forbes.com/sites/gilpress/2013/08/19/data-science-whats-the-half-life-of-a-buzzword/

Some good blog-posts from InterWorks:

Technique for using Tableau data blending to create a dynamic, data-driven “parameter”: http://drawingwithnumbers.artisart.org/creating-a-dynamic-parameter-with-a-tableau-data-blend/

More about Colors:

Russian Postcodes are collected and partially visualized:

http://acuitybusiness.com/blog/bid/175066/Three-Reasons-Why-Companies-Should-Outlaw-Excel

EXASolution claims to be up to 1000 times faster than traditional databases and the fastest database in the world – based on in-memory computing.

http://www.exasol.com/en/exasolution/technical-details.html

Web interest in Tableau and Qlikview:

http://www.google.com/trends/explore?q=qlikview%2C+tableau%2C+spotfire%2C+microstrategy#q=tableau%2C%20microstrategy%2C%20qlikview%2C%20spotfire&geo=US&date=9%2F2008%2061m&cmpt=q

With the releases of Spotfire Silver (soon to be Spotfire Cloud) and Tableau Online, and the attempts of a few Qlikview partners (but not Qliktech itself yet) to move to the cloud and provide their Data Visualization platforms and software as a service, the attributes, parameters and concerns of such VaaS or DVaaS (Visualization as a Service) offerings are important to understand. Below is an attempt to review those "cloud" details, at least at a high level (with the natural limitations of space and time applied to a review).

But before that, let's underscore that clouds are not in the skies but rather in huge weird buildings with special physical and infrastructure security, like this data center in Georgia:

GoogleDataCenterInGeorgiaWithCloudsAboveIt2

You can see some real old-fashioned clouds above the building, but they are not what we are talking about. Inside the data center you can see a lot of racks, each with 20+ servers, which, together with all the secure network and application infrastructure, contain these modern "clouds":

GoogleDataCenterInGeorgiaInside2

Attributes and Parameters of mature SaaS (and VaaS as well) include:

  • Multitenant and Scalable Architecture (this topic is too big and needs own blogpost or article). You can review Tableau’s whitepaper about Tableau Server scalability here: http://www.tableausoftware.com/learn/whitepapers/tableau-server-scalability-explained
  • SLA – service level agreement with up-time, performance, security-related and disaster recovery metrics and certifications like SSAE16.
  • UI and Management tools for User Privileges, Credentials and Policies.
  • System-wide Security: SLA-enforced and monitored Physical, Network, Application, OS and Data Security.
  • Protection or/and Encryption of all or at least sensitive (like SSN) fields/columns.
  • Application Performance: Transaction processing speed, Network Latency, Transaction Volume, Webpage delivery times, Query response times
  • 24/7 high availability: facilities with reliable and backup power and cooling, certified network infrastructure, N+1 redundancy, 99.9% (or 99.99%, or whatever your SLA with clients promises) up-time; see the downtime-budget sketch after this list.
  • Detailed historical availability, performance and planned maintenance data with Monitoring and Operational Dashboards, Alerts and Root Cause Analysis
  • Disaster recovery plan with multiple backup copies of customers' data in near real time at the disk level, and a multilevel backup strategy that includes disk-to-disk-to-tape data backup, where tape backups serve as a secondary level of backup, not as the primary disaster recovery data source.

  • Fail-over that cascades from server to server and from data center to data center in the event of a regional disaster, such as a hurricane or flood.
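As a quick illustration of what those availability promises mean in practice, here is a small Python sketch converting an SLA percentage into a yearly downtime budget:

```python
# Yearly downtime budget implied by an availability SLA.
HOURS_PER_YEAR = 365 * 24  # 8760

for sla in (99.9, 99.99, 99.999):
    downtime_hours = HOURS_PER_YEAR * (1 - sla / 100)
    print(f"{sla}% up-time allows ~{downtime_hours:.2f} hours "
          f"(~{downtime_hours * 60:.0f} minutes) of downtime per year")
# 99.9%  -> ~8.76 hours/year; 99.99% -> ~53 minutes/year;
# 99.999% -> ~5 minutes/year
```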

While security, privacy, latency and hidden costs are usually the biggest concerns when considering SaaS/VaaS, other cloud concerns are surveyed and visualized below. A recent survey and diagram were published by Charlie Burns this month:

CloudConcerns2013

Another survey and diagram were published by Shane Schick in October 2011 and by KPMG in February 2013. Here are the concerns captured by the KPMG survey:

CloudConcernsKPMG

As you can see above, a rack in a data center can contain multiple servers and other devices (like routers and switches), often redundant (at least 2, or sometimes N+1). Recently I designed a hosting VaaS data center for Data Visualization and Business Intelligence cloud services, and here is a simplified version of it, just for one rack, as a sample.

You can see the redundant network, redundant firewalls, redundant switches for the DMZ (the so-called "demilitarized zone", where users from outside the firewall can access servers like web or FTP servers), redundant main switches and redundant load balancers, redundant Tableau servers, redundant Teradata servers, redundant Hadoop servers, redundant NAS servers, etc. (not all devices are shown on the diagram of this rack):

RackDiagram

20 months ago I checked how many job openings the leading DV vendors had. On 12/5/11 Tableau had 56, Qliktech had 46 and Spotfire had 21 openings. This morning I checked their career sites again and noticed that both Tableau and Qliktech have almost doubled their thirst for new talent, while Spotfire is basically staying at the same level of hiring needs:

  • Tableau has 102(!) openings, 43 of them engineering positions (I counted their R&D positions and openings in the Operations department too) – that is huge! Update: as of 9/18/13 Tableau has exactly 1000 employees; the 1000th employee can be found in this picture:

Tableau1000Employees091813

  • Qliktech has 87 openings, 29 of them are engineering positions (I included R&D, IT, Tech Support and Consulting).

  • TIBCO/Spotfire has 24 openings, 16 of them are engineering positions (R&D, IT, Tech.Support).

BostonSkylineFromWindow

All 3 companies are public now, so I decided to include their Market Capitalization as well. Since Spotfire is hidden inside its corporate parent TIBCO, I used my estimate that Spotfire's capitalization is about 20% of TIBCO's (which is $3.81B as of 8/23/13, see https://www.google.com/finance?q=TIBX ). As a result I have these Market Capitalization numbers for 8/23/13 as the closing day:

Those 3 DV vendors above together have almost $8B of market capitalization as of the evening of 8/23/13!

Market Capitalization update as of 8/31/13: Tableau $4.3B, Qliktech $2.9B, Spotfire (as 20% of TIBCO) $0.72B.

Market Capitalization update as of 9/4/13, 11pm: Tableau $4.39B, Qliktech $3B, Spotfire (as 20% of TIBCO) $0.75B. Also, as of today Qliktech employs 1500+ people (approx. $300K revenue per year per employee), Tableau about 1000 (approx. $200K revenue per year per employee) and Spotfire about 500+ (a very rough estimate; also approx. $350K revenue per year per employee).


Last week Tableau increased 10-fold the capacity of data visualizations published with Tableau Public – to a cool 1 million rows of data, basically the same number of rows which Excel 2007, 2010 and 2013 (often used as data sources for Tableau Public) can handle these days – and increased 20-fold the storage capacity of each free Tableau Public account (to 1GB of free storage), see it here:

http://www.tableausoftware.com/public/blog/2013/08/one-million-rows-2072

It means that a free Tableau Public account has storage twice as large as Spotfire Silver's most expensive Analyst account (that one will cost you $4500/year). Tableau said: "Consider it a gift from us to you." I have to admit that even kids in this country know that there is nothing free here, so please kid me not – we are all witnessing some kind of investment here, and this type of investment worked brilliantly in the past… And all users of Tableau Public are investing too – with their time and learning efforts.

And this is not all: "For customers of Tableau Public Premium, which allows users to save locally and disable download of their workbooks, the limits have been increased to 10 million rows of data at 10GB of storage space", see it here:

http://www.tableausoftware.com/about/press-releases/2013/tableau-software-extends-tableau-public-1-million-rows-data – without changing the price of the service (of course, the Tableau Public Premium price is not fixed and depends on the number of impressions).

Out of 100+ million Tableau Public users, only 40000 qualify to be called Tableau authors, see it here: http://www.tableausoftware.com/about/press-releases/2013/tableau-software-launches-tableau-public-author-profiles – so they consume Tableau Public's storage more actively than others. As an example, you can see my Tableau author profile here: http://public.tableausoftware.com/profile/andrei5435#/ .

I will assume those authors will consume 40000GB of online storage, which will cost Tableau Software less than (my guess; I am open to corrections from blog visitors) $20K/year just for the storage part of the Tableau Public service.
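Here is the back-of-envelope arithmetic behind that guess; the per-GB price below is my hypothetical assumption for bulk cloud storage, not a figure from Tableau:

```python
# Rough storage-cost estimate for Tableau Public authors.
AUTHORS = 40000
GB_PER_AUTHOR = 1               # the new free quota
PRICE_PER_GB_MONTH = 0.04       # hypothetical bulk price in USD, my assumption

annual_cost = AUTHORS * GB_PER_AUTHOR * PRICE_PER_GB_MONTH * 12
print(f"~${annual_cost:,.0f}/year")  # ~$19,200/year at this assumed price
```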

During the same week the other important announcement – quarterly revenue – came from Tableau on 8/8/13: it reported Q2 revenue of $49.9 million, up 71% year-over-year: http://investors.tableausoftware.com/investor-news/investor-news-details/2013/Tableau-Announces-Second-Quarter-2013-Financial-Results/default.aspx .

Please note that 71% is an extremely good YoY growth compared with the entire anemic "BI industry", but less than the 100% YoY at which Tableau grew in its private past.

All these announcements happened simultaneously with a magical (I have no theory as to why this happened; one weak theory is investor madness and over-excitement about the Q2 revenue of $49.9M announced on 8/8/13) and sudden increase in the nominal price of the Tableau stock (under the DATA symbol on NYSE), from $56 (which is already high) on August 1st, 2013 (the announcement of 1 million rows/1GB of storage for Tableau Public accounts) to $72+ today:

DATAstock812Area2

It means that the Market Capitalization of Tableau Software may be approaching $4B while sales may be $200M/year. For comparison, Tableau's direct and more mature competitor Qliktech now has a capitalization below $3B while its sales approach almost $500M/year. From the Market Capitalization point of view, in 3 months Tableau went from a private company to the largest publicly traded Data Visualization software company on the market!

Competition in the Data Visualization market is not only about features, market share and mindshare but also about pricing and licensing. For example, Qlikview licensing and pricing has been public for a while here: http://www.qlikview.com/us/explore/pricing and Spotfire Silver pricing has been public for a while too: https://silverspotfire.tibco.com/us/silver-spotfire-version-comparison .

Tableau Desktop has 3 editions: Public (free), Personal ($999) and Professional ($1999), see it here: http://www.tableausoftware.com/public/comparison ; in addition, you can have the full Desktop (read-only) experience with the free Tableau Reader (neither Qlikview nor Spotfire has a free reader for server-less, unlimited distribution of visualizations, which makes Tableau a mindshare leader right away...)

The release of Tableau Server online hosting this month: http://www.tableausoftware.com/about/press-releases/2013/tableau-unveils-cloud-business-intelligence-product-tableau-online heated up the licensing competition and may force large changes in the licensing landscape for Data Visualization vendors. Tableau Server has existed in the cloud for a while with tremendous success as Tableau Public (free) and Tableau Public Premium (former Tableau Digital, with its weird pricing based on "impressions").

But Tableau Online is much more disruptive for the BI market: for $500/year you get a complete Tableau Server site (administered by you!) in the cloud with (initially) 25 users authenticated by you (this number can grow) and 100GB of cloud storage for your visualizations – 200 times more storage than you get with the $4,500/year top-of-the-line Spotfire Silver "Analyst" account. This Tableau Server site will be managed in the cloud by Tableau Software's own experts and requires no IT personnel on your side! You may also compare it with http://www.rosslynanalytics.com/rapid-analytics-platform/applications/qlikview-ondemand .

A solution hosted by Tableau Software is particularly useful when sharing dashboards with customers and partners, because it is secure yet sits outside a company's firewall. In the case of Tableau Online, users can publish interactive dashboards to the web and share them with clients or partners without granting behind-the-firewall access.

Since Tableau 8 has the new Data Extract API, you can do all data refreshes behind your own firewall and republish your TDE files to the cloud anytime you need (even automatically, on demand or on schedule). Tableau Online has no minimum number of users and can scale as a company grows. At any point, a company can migrate to Tableau Server to manage it in-house. Here is an introductory video about Tableau Online: Get started with Tableau Online.
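Such a scheduled republish can, for example, be scripted around tabcmd (the command-line utility shipped with Tableau Server 8). The sketch below is only a minimal illustration, not an official recipe: the site, credentials, extract and project names are all hypothetical, and the flags are the standard tabcmd login/publish options as I recall them:

# Minimal sketch: republish a locally refreshed extract to Tableau Online
# (assumes tabcmd is installed and on the PATH; all names are hypothetical)
import subprocess

SERVER = "https://online.tableausoftware.com"

# Authenticate against your Tableau Online site
subprocess.check_call(["tabcmd", "login", "-s", SERVER,
                       "-t", "MySite", "-u", "admin@example.com", "-p", "secret"])

# Republish the refreshed extract, overwriting the previous copy
subprocess.check_call(["tabcmd", "publish", "Sales.tde",
                       "-n", "Sales", "-r", "Finance", "-o"])

Schedule a script like this with Windows Task Scheduler (or cron) to get the "automatically, on schedule" part.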

Tableau Server in the cloud provides at least 3 ways to update your data (for more details see here: http://www.tableausoftware.com/learn/whitepapers/tableau-online-understanding-data-updates ):

[diagram: Tableau Desktop as a proxy for Tableau Server data updates]

Here is another, lengthier intro into Tableau BI in the cloud:

Tableau as a Service is a step in the right direction, but be cautious: in practice, the architecture of the hosted version could impact performance. Plus, the nature of the product means that Tableau isn't really able to offer features like pay-as-you-go that have made cloud-based software popular with workers. By their nature, data visualization products require access to data. Businesses that store their data internally must publish that data to Tableau's servers. That can be a problem for businesses that have large amounts of data or that are prevented from shifting their data off premises for legal or security reasons. It could also create a synchronization nightmare, as workers play with data hosted at Tableau that may not be as up-to-date as internally stored data. And depending on the location of the customer relative to Tableau's data center, data access could be slow.

And finally, the online version requires the desktop client, which costs $2,000. Tableau may implement Tableau Desktop's analytical features in a browser in the future, while continuing to support the desktop and on-premise model to meet the security and regulatory requirements facing some customers.


I got many questions from this Data Visualization blog's visitors about the differences between compensation for full-time employees and contractors. It turned out that many visitors are actually contractors, hired because of their Tableau or Qlikview or Spotfire skills, and some visitors are considering converting to consulting or vice versa: from consulting to full-timers. I am not an expert in all these compensation and especially benefits-related questions, but I promised myself that my blog will be driven by visitors' requests, so I googled a little about contractor vs. full-time compensation, and below is a brief description of what I got:

The Federal Insurance Contributions Act mandates a payroll tax split between employer and employee – each side paying 6.2% Social Security (capped at $7,049.40 for 2013) and 1.45% Medicare on all income – with the 2013 total at 15.3% of gross compensation.

[chart: historical payroll tax rates]

In addition, you have to take into account the employer's contribution to the employee's medical benefits (about $1,000/month for a family), unemployment taxes, the employer's contribution to 401(k), STD and LTD (short- and long-term disability insurance), pension plans etc.

I also added into my estimate of the contractor rate the "protection" for at least a 1-month gap between contracts, and 1 month of salary as a bonus for full-time employees.


Basically, the result of my minimal estimate is as follows: as a contractor you need a rate at least 50% higher than the base hourly rate of a full-time employee. I calculate this base hourly rate as the employee's base salary divided by 1,872 hours, where 1,872 = 52 weeks × 40 hours (= 2,080 hours) minus 208 hours (3 weeks of vacation + 5 sick days + 6 holidays – the minimum for a reasonable PTO, Personal Time Off) = 1,872 working hours per year.

I did not take into account any variations related to the use of W2 or 1099 forms or Corp-to-Corp arrangements, or many other fine details (like relocation requirements and the overhead associated with middlemen like headhunters and recruiters), or other differences between the compensation of a full-time employee and a consultant working on contract – this is just my rough estimate (a small arithmetic sketch follows the bullet below) – please consult with experts and do not ask me any questions related to MY estimate, which is this:

  • Contractor Rate should be 150% of the base rate of a FullTimer
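To make the arithmetic concrete, here is a minimal sketch in Python; the $97,000 base salary is only an illustration (it is the average implied by the $155,200 figure further below), so substitute your own numbers:

# Minimal sketch of the contractor-rate estimate above;
# the base salary is illustrative, not a quoted market figure.
base_salary = 97000.0
working_hours = 52 * 40 - 208                # 2080 - 208 = 1872 hours/year
base_hourly = base_salary / working_hours    # ~ $51.8/hour
contractor_rate = 1.5 * base_hourly          # minimal estimate: ~ $77.7/hour
print(round(base_hourly, 2), round(contractor_rate, 2))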

In general, using contractors (especially for business analytics) instead of full-timers is basically the same mistake as outsourcing and off-shoring: companies doing that do not understand that their main assets are their full-time people. Contractors are usually not engaged, and they are not in the business of preserving the intellectual property of the company.

For reference, see the results of the Dr. Dobb's 2013 Salary Survey for software developers, which are very comparable with the salaries of Qlikview, Tableau and Spotfire developers and consultants (except that, in my experience, salaries of Data Visualization consultants are 10-15% higher than salaries of software developers):

[chart: Dr. Dobb's 2013 salary survey, salary by title]

This means that for 2013 the average rate for Qlikview, Tableau and Spotfire developers and consultants should be around 160% of the base rate of an average full-timer, which translates into an effective equivalent pay to a contractor, for 1,872 hours per year, of $155,200 – and this is only for an average consultant... If you take less, then somebody tricked you; but if you read the above, you already know that.

2,400 years ago the concept of Data Visualization was less known, but even then Plato said "Those who tell stories rule society".

[image: Plato the storyteller]

I witnessed multiple times how storytelling triggered Venture Capitalists (VCs) to invest. Usually my CEO (the biggest BS master on our team) would start with a 60-second Story (VCs call it an "Elevator Pitch"), then (if interested) the VCs would do a long due-diligence review of the Data (and specs, docs and code) presented by our team, and after that they would spend comparable time analyzing Data Visualizations (charts, diagrams, slides etc.) of our Data, trying to prove or disprove the original Story.

Some of the conclusions from all this startup storytelling activity were:

  • Data: without Data nothing can be proved or disproved (Action needs Data!)

  • View: best way to analyze Data and trust it is to Visualize it (Seeing is Believing!)

  • Discovery of Patterns: visually discoverable trends, outliers, clusters etc. which form the basis of the Story and follow-up actions

  • Story: the Story (based on that Data) is the Trigger for the Actions (Story shows the Value!),

  • Action(s): start with a drilldown to the needle in the haystack; embed Data Visualization into the business – it is not eye candy but a practical way to improve the business

  • Data Visualization has 5 parts: Data (main), View (enabler), Discovery (visually discoverable Patterns), Story (trigger for Actions) and finally the 5th Element – Action!

  • Life is not fair: Storytellers were the people who benefited the most in the end... (no Story, no Glory!)

[diagram: the 5 elements of Data Visualization]

And yes, Plato was correct – at least partially, and for his time. The diagram above uses an analogy with the 5 classical Greek elements. Plato wrote about four classical elements (earth, air, water and fire) almost 2,400 years ago (citing an even more ancient philosopher), and his student Aristotle added a fifth element, aithêr (aether in Latin, "ether" in English) – both men are at the center of the 1st picture above.

Back to our time: storytelling is a hot topic; enthusiasts are saying that "Data is easy, good storytelling is the challenge" http://www.resource-media.org/data-is-easy/#.URVT-aVi4aE or even that "Data Science is Storytelling": http://blogs.hbr.org/cs/2013/03/a_data_scientists_real_job_sto.html . Nothing can be further from the truth: my observation is that most storytellers (with a few known exceptions like Hans Rosling or Tableau founder Pat Hanrahan) ARE NOT GOOD at visualizing, but they still wish to participate in our hot Data Visualization party. All I can say is "Welcome to the party!"

It may be a challenge for me and you, but not for the people who held a conference about storytelling this winter, on 2/27/13 in Nashville, TN: http://www.tapestryconference.com/ :

Some more reasonable people refer to storytelling as data journalism and narrative visualization: http://www.icharts.net/blogs/2013/pioneering-data-journalism-simon-rogers-storytelling-numbers

Tableau founder Pat Hanrahan recently talked about “Showing is Not Explaining”. In parallel, Tableau is planning (after version 8.0) to add features that support storytelling by constructing visual narratives and effective communication of ideas, see it here:

Collection of resources on storytelling topic can be found here: http://www.juiceanalytics.com/writing/the-ultimate-collection-of-data-storytelling-resources/

You may also to check what Stephen Few thinks about it here: http://www.perceptualedge.com/blog/?p=1632

Storytelling, as an important part of Data Visualization (in the Greek analogy, the 4th classical element (Air), after Data (Earth), View (Water) and Discovery (Fire), and before Action (Aether)), has a practical effect on visualization itself, for example:

  • if Data View is not needed for Story or for further Actions, then it can be hidden or removed,

  • if the number of Data Views in a Dashboard is diluting the impact of the (preferably short) Data Story, then the number of Views should be reduced (usually to 2 or 3 per dashboard),

  • if the number of DataPoints per View is too large and affects the triggering power of the Story, then it can be reduced too (in conversations with Tableau, they even recommend 5,000 datapoints per View as the threshold between local and server-based rendering).

 


Below you can find samples of Guidelines and Good Practices for Data Visualization (mostly with Tableau), which I used recently.

Some of these samples are Tableau-specific, but others (maybe with modifications) can be reused for other Data Visualization platforms and tools. I will appreciate feedback, comments and suggestions.

Naming Convention for Tableau Objects

  • Use CamelCase Identifiers: Capitalize the 1st letter of each concatenated word

  • Use a suffix, with a preceding underscore, to indicate the identifier's type (example: _tw for workbooks).
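For example, under these two rules a workbook tracking regional sales would be named RegionalSales_tw (the name itself is hypothetical; any further suffixes for other object types, say worksheets or data sources, are yours to define in the same spirit).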

Workbook Sizing Guidelines

  • Use less than 5 charts per Dashboard; minimize the number of visible tabs/worksheets.

  • Move Calculations and Functions from Workbook to the Data.

  • Use less than 5000 Data-points per Chart/Dashboard to enable Client-side rendering.

  • To enable Shared Sessions, don't use filters and interactivity if they are not needed.

Guidelines for Colors, Fonts, Sizes

  • To express desirable/undesirable points, use green for good, red for bad, yellow for warning.

  • When you are not describing a "good-bad" situation (thanks to feedback from the visitor under alias "SF"), try to use pastel, neutral and colorblind-friendly colors, e.g. similar to the "Color Blind 10" palette from Tableau.

  • Use “web-safe” fonts, to approximate what users can see from Tableau Server.

  • Use either auto-resize or standard (target smaller screen) sizes for Dashboards

Data and Data Connections used with Tableau

  • Try to avoid pulling more than 15000 rows for Live Data Connections.

  • For Data Extract-based connections 10M rows is the recommended maximum.

  • For widely distributed workbooks, use Application IDs instead of personal credentials.

  • Job failure due to expired credentials leads to suspension from the Schedule, so try to keep embedded credentials up to date.


Tableau Data Extracts (TDE)

  • If a refresh of a TDE takes more than 2 hours, consider redesigning it.

  • Reuse and share TDEs and Data Sources as much as possible.

  • Use Incremental Data Refresh instead of Full Refresh when possible.

  • Designate a unique ID for each row when Incremental Data Refresh is used.

  • Try to use the free Tableau Data Extract API instead of licensed Tableau Server to create Data Extracts.

Scheduling of Background Tasks with Tableau

  • Serial Schedules are recommended; avoid the usage of hourly Schedules.

  • Avoid scheduling during peak hours (8am-6pm); consider weekly instead of daily schedules.

  • Optimize Schedule size: group tasks related to the same project into one Schedule; if total task execution exceeds 8 hours, split the Schedule into a few with similar names but preferably different starting times.

  • Maximize the usage of monthly and weekly Schedules (as opposed to daily Schedules) and the usage of weekends and nights.

Guidelines for using Charts

  • Use Bars to compare across categories, use Colors with Stacked or Side-by-Side Bars for deeper Analysis

  • Use Line for Viewing Trends over time, consider Area Charts for Multi-lines

  • Minimize the usage of Pie Charts; when appropriate, use them for showing proportions. It is recommended to limit pie wedges to six.

  • Use Maps to show geocoded data; consider using maps as interactive filters.

  • Use Scatter to analyze outliers, clusters and construct regressions


You can find more about Guidelines and Good Practices for Data Visualization here: http://www.tableausoftware.com/public/community/best-practices

Tableau Software filed for an IPO on the New York Stock Exchange under the symbol "DATA". In sharp contrast to other business-software makers that have gone public in the past year, Tableau is profitable, despite hiring a huge number of new employees. For the years ended December 31, 2010, 2011 and 2012, Tableau's total revenues were $34.2 million, $62.4 million and $127.7 million respectively. The number of full-time employees increased from 188 as of December 31, 2010 to 749 as of December 31, 2012.

Tableau’s biggest shareholder is venture capital firm New Enterprise Associates, with a 38 percent stake. Founder Pat Hanrahan owns 18 percent, while co-founders Christopher Stolte and Christian Chabot, who is also chief executive officer, each own more than 15 percent. Meritech Capital Partners controls 6.4 percent. Tableau recognized three categories of Primary Competitors:

  • large suppliers of traditional business intelligence products, like IBM, Microsoft, Oracle and SAP AG;

  • spreadsheet software providers, such as Microsoft Corporation;

  • business analytics software companies: Qlik Technologies Inc. and TIBCO Spotfire.


Update 4/29/13: This news may be related to the Tableau IPO. I understand that Microstrategy's growth cannot be compared with the growth of Tableau or even Qliktech. But to go below the average "BI market" growth? Or even a 6% or 24% decrease? What is going on here(?): "First quarter 2013 revenues were $130.2 million versus $138.3 million for the first quarter of 2012, a 6% decrease. Product licenses revenues for the first quarter of 2013 were $28.4 million versus $37.5 million for the first quarter of 2012, a 24% decrease."

Update 5/6/13: Tableau Software Inc. will sell 5 million shares, while shareholders will sell 2.2 million shares, Tableau said in an amended filing with the U.S. Securities and Exchange Commission. The underwriters have the option to purchase up to an additional 1,080,000 shares. That means a total of 8+ million shares can be for sale.

The company expects its initial public offering to raise up to $215.3 million at a price of $23 to $26 per share. If this happens, it will create a public company with large capitalization, so Qliktech and Spotfire will have an additional problem to worry about. This is how QLIK (blue line), TIBX (red) and MSTR (orange line) stock behaved during the last 6 weeks, after the release of Tableau 8 and the official Tableau IPO announcement:

[chart: QLIK, TIBX and MSTR stock over the last 6 weeks]

Update 5/16/13: According to this article at Seeking Alpha (also see the S-1 Form), Tableau Software Inc. (symbol "DATA") has scheduled a $176 million IPO, with a market capitalization of $1.4 billion, for Friday, May 17, 2013. Tableau's March-quarter sales were up 60% from the March 2012 quarter; Qliktech's sales were up only 23% on a similar comparative basis.


According to another article, Tableau raised its IPO price, and it may reach a capitalization of $2B by the end of Friday, 5/17/13. That is almost comparable with the capitalization of Qliktech...

Update 5/17/13: Tableau's IPO offer price was $31 per share, but it started trading today at $47 and finished the day at $50.75 (raising $400M in one day), with an estimated Market Cap around $3B (or more?). It is hard to understand the market: Tableau stock (symbol: DATA) finished its first day above $50 with a Market Capitalization higher than QLIK, which today has a Cap of $2.7B – but Qliktech has almost 3 times more sales than Tableau!

For comparison, MSTR today has a Cap of $1.08B and TIBX a Cap of $3.59B. While I like Tableau, today proved that most investors are crazy – just compare the numbers in this simple table:

Symbol | Market Cap ($B, as of 5/17/13) | Revenue ($M, trailing 12 months as of 3/31/13) | FTE (Full-Time Employees)
TIBX | 3.59 | 1040 | 3646
MSTR | 1.08 | 586 | 3172
QLIK | 2.67 | 406 | 1425
DATA | between 2 and 3? | 143 | 834

See the interview with Tableau Software co-founder Christian Chabot – he discusses the company's IPO with Emily Chang on Bloomberg Television's "Bloomberg West". However, it makes me sad when Tableau's CEO implies that Tableau is ready for big data, which is not true.

Here are some pictures of the Tableau team at the NYSE: http://www.tableausoftware.com/ipo-photos and here is the announcement about "closing the IPO".

The initial public offering gave Tableau $254 million (preliminary estimate).

Today Tableau 8 was released with 90+ new features (actually it may be more than 130) after an exhausting 6+ months of alpha and beta testing with 3900+ customers as beta testers! I personally expected it 2 months ago, but I would rather have it with fewer bugs, and this is why I have no problem with the delay. During this "delay" Tableau Public achieved a phenomenal milestone: 100 million users...

Tableau 8 introduced:

  • web and mobile authoring;
  • access to new data sources: Google Analytics, Salesforce.com, Cloudera Impala, DataStax Enterprise, Hadapt, Hortonworks Hadoop Hive, SAP HANA, and Amazon Redshift;
  • a new Data Extract API that allows programmers to load data from anywhere into Tableau – and makes certain parts of Tableau licensing look ridiculous, because the licensing consumed by background tasks (for example core licensing) should be set free now;
  • a new JavaScript API that enables integration with business (and other web-) applications;
  • Local Rendering, leveraging the graphics hardware acceleration available on ordinary computers. Tableau 8 Server dynamically determines where rendering will complete faster – on the server or in the browser – and acts accordingly. Also, Dashboards now render views in parallel when possible.

Tableau Software plans to add in next versions (after 8.0) some very interesting and competitive features, like:

  • Direct query of large databases, quick and easy ETL and data integration.
  • Tableau on a Mac and Tableau as a pure Cloud offering.
  • Make statistical & analytical techniques accessible (I wonder if it means integration with R?).
  • Tableau founder Pat Hanrahan recently talked about “Showing is Not Explaining”, so Tableau planned to add features that support storytelling by constructing visual narratives and effective communication of ideas.

I did not see on Tableau’s roadmap some very long overdue features like 64-bit implementation (currently even all Tableau Server processes, except one, are 32-bit!), Server implementation on Linux (we do not want to pay Windows 2012 Server CAL taxes to Bill Gates) and direct mentioning of integration with R like Spotfire does – I how those planning and strategic mistakes will not impact upcoming IPO.

I personally think that Tableau has to stop its ridiculous practice of consuming 1 core license per Backgrounder server process; since the Tableau Data Extract API is free, all Tableau Backgrounder processes should be free too, and able to run on any hardware and even any OS.

Tableau 8 managed to get negative feedback from the famous Stephen Few, who questioned Tableau's ability to stay on course. His unusually long blog post "Tableau Veers from the Path" attracted an enormous number of comments from all kinds of Tableau experts. I will be cynical here and note that there is no such thing as negative publicity, and more publicity is better for the upcoming Tableau IPO.


The most popular (among business users) approach to visualization is to use a Data Visualization (DV) tool like Tableau (or Qlikview or Spotfire), where a lot of features are already implemented for you. Recent proof of this amazing popularity: at least 100 million people (as of February 2013) used Tableau Public as their Data Visualization tool of choice, see

http://www.tableausoftware.com/about/blog/2013/2/crossing-100-million-milestone-21304

However, to make your documents and stories (and not just your data visualization applications) driven by your data, you may need another approach – to code the visualization of your data into your story – and visualization libraries like the popular D3 toolkit can help you. D3 stands for "Data-Driven Documents". The author of D3, Mr. Mike Bostock, designs interactive graphics for the New York Times – one of the latest samples is here:

http://www.nytimes.com/interactive/2013/02/20/movies/among-the-oscar-contenders-a-host-of-connections.html

and the NYT allows him to do a lot of Open Source work, which he demonstrates at his website here:

https://github.com/mbostock/d3/wiki/Gallery .


Mike was a "visualization scientist" and a computer science PhD student at Stanford University, and a member of a famous group of people now called the "Stanford Visualization Group":

http://vis.stanford.edu/people/

This Visualization Group was the birthplace of Tableau's prototype – sometimes they called it "a visual interface for exploring data"; its other name is Polaris:

http://www.graphics.stanford.edu/projects/polaris/

and we know that the creators of Polaris started Tableau Software. Another of the Group's popular "products" was a graphical toolkit for visualization (mostly in JavaScript, as opposed to Polaris, which was written in C++), called Protovis:

http://mbostock.github.com/protovis/

– and Mike Bostock was one of Protovis's main co-authors. Less than 2 years ago the Visualization Group suddenly stopped developing Protovis and recommended that everybody switch to the D3 library

https://github.com/mbostock,

authored by Mike. This library is Open Source (only 100KB in ZIP format) and can be downloaded from here:

http://d3js.org/d3.v3.zip


In order to use D3, you need to be comfortable with HTML, CSS, SVG, JavaScript programming and the DOM (and other web standards); understanding the jQuery paradigm will be useful too. Basically, if you want to be at least partially as good as Mike Bostock, you need to have the mindset of a programmer (in addition, I guess, to a business-user mindset), like this D3 expert:

http://www.jasondavies.com/

Most successful early D3 adopters combine 3+ mindsets: programmer, business analyst, data artist and sometimes even data storyteller. For your programmer's mindset, you may be interested to know that D3 has a large set of plugins, see:

https://github.com/d3/d3-plugins

and a rich API, see https://github.com/mbostock/d3/wiki/API-Reference

You can find hundreds of D3 demos, samples, examples, tools, products and even a few companies using D3 here: https://github.com/mbostock/d3/wiki/Gallery


Best of the Tableau Web… December 2012:

http://www.tableausoftware.com/about/blog/2013/1/best-tableau-web-december-2012-20758

Top 100 Q4 2012 from Tableau Public:

http://www.tableausoftware.com/public/blog/2013/01/top-100-q4-2012-1765

eBay’s usage of Tableau as the front-end for big data, Teradata and Hadoop with 52 petabytes of
data on everything from user behavior to online transactions to customer shipments and much more:

http://www.infoworld.com/d/big-data/big-data-visualization-big-deal-ebay-208589

Why The Information Lab recommends Tableau Software:

http://www.theinformationlab.co.uk/2013/01/04/recommend-tableau-software/

Fun with #Tableau Treemap Visualizations

http://tableaulove.tumblr.com/post/40257187402/fun-with-tableau-treemap-visualizations

Talk slides: Tableau, SeaVis meetup & Facebook, Andy Kirk’s Facebook Talk from Andy Kirk

http://www.visualisingdata.com/index.php/2013/01/talk-slides-tableau-seavis-meetup-facebook/

Usage of RAM, Disk and Data Extracts with Tableau Data Engine:

http://www.tableausoftware.com/about/blog/2013/1/what%E2%80%99s-better-big-data-analytics-memory-or-disk-20904

Migrating Tableau Server to a New Domain

https://www.interworks.com/blogs/bsullins/2013/01/11/migrating-tableau-server-new-domain

SAS/Tableau Integration

http://www.see-change.co/services/sastableau-integration/

IFNULL is not "IF NULL", it is "IF NOT NULL":

http://tableaufriction.blogspot.com/2012/09/isnull-is-not-is-null-is-is-not-null.html

Worksheet and Dashboard Menu Improvements in Tableau 8:

http://tableaufriction.blogspot.com/2013/01/tv8-worksheet-and-dashboard-menu.html

Jittery Charts – Why They Dance and How to Stop Them:

http://tableaufriction.blogspot.com/2013/01/jittery-charts-and-how-to-fix-them.html

Tableau Forums Digest #8

http://shawnwallwork.wordpress.com/2013/01/06/67/

Tableau Forums Digest #9

http://shawnwallwork.wordpress.com/2013/01/14/tableau-forums-digest-9/

Tableau Forums Digest #10

http://shawnwallwork.wordpress.com/2013/01/19/tableau-forums-digest-10/

Tableau Forums Digest #11

http://shawnwallwork.wordpress.com/2013/01/26/tableau-forums-digest-11/

Implementation of bandlines in Tableau by Jim Wahl (+ workbook):

http://community.tableausoftware.com/message/198511

This is Part 2 of the guest blog post: a review of Visual Discovery products from Advizor Solutions, Inc., written by my guest blogger Mr. Srini Bezwada (his profile is here: http://www.linkedin.com/profile/view?id=15840828 ), who is the Director of Smart Analytics, a Sydney-based professional BI consulting firm that specializes in Data Visualization solutions. The opinions below belong to Mr. Srini Bezwada.

ADVIZOR Technology

ADVIZOR's Visual Discovery™ software is built upon strong data visualization technology spun out of a distinguished research heritage at Bell Labs that spans nearly two decades and produced over 20 patents. Formed in 2003, ADVIZOR has succeeded in combining its world-leading data visualization and in-memory data management expertise with extensive usability knowledge and cutting-edge predictive analytics to produce an easy-to-use, point-and-click product suite for business analysis.

ADVIZOR readily adapts to business needs without programming and without implementing a new BI platform, leverages existing databases and warehouses, and does not force customers to build a difficult, time consuming, and resource intensive custom application. Time to deployment is fast, and value is high.

With ADVIZOR, data is loaded into a "Data Pool" in main memory on a desktop or laptop computer, or a server. This enables sub-second response time on any query against any attribute in any table, and instantaneous updates of all visualizations. Multiple tables of data are easily imported from a variety of sources.

With ADVIZOR, there is no need to pre-configure data. ADVIZOR accesses data “as is” from various data sources, and links and joins the necessary tables within the software application itself. In addition, ADVIZOR includes an Expression Builder that can perform a variety of numeric, string, and logical calculations as well as parse dates and roll-up tables – all in-memory. In essence, ADVIZOR acts like a data warehouse, without the complexity, time, or expense required to implement a data warehouse! If a data warehouse already exists, ADVIZOR will provide the front-end interface to leverage the investment and turn data into insight.
Data in the memory pool can be refreshed from the core databases / data sources “on demand”, or at specific time intervals, or by an event trigger. In most production deployments data is refreshed daily from the source systems.

Data Visualization

ADVIZOR’s Visual Discovery™ is a full visual query and analysis system that combines the excitement of presentation graphics – used to see patterns and trends and identify anomalies in order to understand “what” is happening – with the ability to probe, drill-down, filter, and manipulate the displayed data in order to answer the “why” questions. Conventional BI approaches (pre-dating the era of interactive Data Visualization) to making sense of data have involved manipulating text displays such as cross tabs, running complex statistical packages, and assembling the results into reports.

ADVIZOR's Visual Discovery™ makes the text and graphics interactive. Not only can the user gain insight from the visual representation of the data, but additional insight can now be obtained by interacting with the data in any of ADVIZOR's fifteen (15) interactive charts, using color, selection, filtering, focus, viewpoint (panning, zooming), labeling, highlighting, drill-down, re-ordering, and aggregation.

[image: ADVIZOR's interactive chart types]
Visual Discovery empowers the user to leverage his or her own knowledge and intuition to search for patterns, identify outliers, pose questions and find answers, all at the click of a mouse.

Flight Recorder – Track, Save, Replay your Analysis Steps

The Flight Recorder tracks each step in a selection and analysis process. It provides a record of those steps and can be used to repeat previous actions. This is critical for providing context about what an end-user has done and where they are in their data. Flight records also allow setting bookmarks, and can be saved and shared with other ADVIZOR users.
The Flight Recorder is unique to ADVIZOR. It provides:
• A record of what a user has done. Actions taken and selections from charts are listed. Small images of charts that have been used for selection show the selections that were made.
• A place to collect observations by adding notes and capturing images of other charts that illustrate observations.
• A tool that can repeat previous actions, in the same session on the same data or in a later session with updated data.
• The ability to save and name bookmarks, and share them with other users.

Predictive Analytics Capability

The ADVIZOR Analyst/X is a predictive analytic solution based on a robust multivariate regression algorithm developed by KXEN – a leading-edge advanced data mining tool that models data easily and rapidly while maintaining relevant and readily interpretable results.
Visualization empowers the analyst to discover patterns and anomalies in data by noticing unexpected relationships or by actively searching. Predictive analytics (sometimes called “data mining”) provides a powerful adjunct to this: algorithms are used to find relationships in data, and these relationships can be used with new data to “score” or “predict” results.

[image: ADVIZOR predictive model]

Predictive analytics software from ADVIZOR doesn't require enterprises to purchase platforms. And, since all the data is in-memory, the business analyst can quickly and easily condition data and flag fields across multiple tables without having to go back to IT or a DBA to prep database tables. The interface is entirely point-and-click; there are no scripts to write. The biggest benefit of the multi-dimensional visual solution is how quickly it delivers analysis – solving critical business questions, facilitating intelligence-driven decision making, and providing instant answers to "what if?" questions.

Advantages over Competitors:

• The only product in the market offering a combination of Predictive Analytics + Data Visualisation + In memory data management within one Application.
• The cost of entry is lower than the market leading data visualization vendors for desktop and server deployments.
• Advanced Visualizations like Parabox, Network Constellation in addition to normal bar charts, scatter plots, line charts, Pie charts…
• Integration with leading CRM vendors like Salesforce.com, Blackbaud, Ellucian, Information Builder
• Ability to provide sub-second response time on query against any attribute in any table, and instantaneously update all visualizations.
• Flight recorder that lets you track, replay, and save your analysis steps for reuse by yourself or others.

Update on 5/1/13 (by Andrei): Advizor 6.0 is available now with substantial enhancements: http://www.advizorsolutions.com/Bnews/tabid/56/EntryId/215/ADVIZOR-60-Now-Available-Data-Discovery-and-Analysis-Software-Keeps-Getting-Better-and-Better.aspx

If you have visited my blog before, you know that my classification of Data Visualization and BI vendors is different from that of researchers like Gartner. Beyond the 3 DV leaders – Qlikview, Tableau, Spotfire – I rarely have time to talk about the other "me too" vendors.

However, sometimes products like Omniscope, Microstrategy's Visual Insight, the Microsoft BI stack (Power View, PowerPivot, Excel 2013, SQL Server 2012, SSAS etc.), Advizor, SpreadsheetWEB etc. deserve attention too. Covering them takes a lot of time, so I am trying to find guest bloggers for topics like that. 7 months ago I invited volunteers to do some guest blogging about Advizor Visual Discovery products:

http://apandre.wordpress.com/2012/06/22/advizor-analyst-vs-tableau-or-qlikview/

So far nobody in the USA or Europe has committed to doing so, but recently Mr. Srini Bezwada, a certified Tableau consultant and Advizor-trained expert from Australia, contacted me and submitted an article about it. He also provided me with info about how Advizor compares with Tableau, so I will cover that briefly, using his data and opinions. Mr. Bezwada can be reached at

sbezwada@smartanalytics.com.au , where he is a director at

http://www.smartanalytics.com.au/

Below is a quick comparison of Advizor with Tableau. The opinions below belong to Mr. Srini Bezwada. The next blog post will be a continuation of this article about Advizor Solutions products; see also Advizor's website here:

http://www.advizorsolutions.com/products/

Criteria | Tableau | ADVIZOR | Comment
Time to implement | Very fast | Fast, can be implemented within days | Tableau leads
Scalability | Very good | Very good | Tableau: virtual RAM
Desktop license | $1,999 | $1,999 | ADVIZOR AnalystX with predictive modeling is $3,999
Server license | $1K/user, min 10 users; $299K for Enterprise Deployment | $8K license for up to 10 named users | ADVIZOR is a lot cheaper for Enterprise Deployment: $75K for 500 users
Support fees / year | 20% | 20% | 1st year included
SaaS platform | Core or Digital | Offers managed hosting | ADVIZOR leads
Overall cost | Above average | Competitive | ADVIZOR costs less
Enterprise ready | Good for SMB | Cheaper cost model for SMB | Tableau is expensive for Enterprise Deployment
Long-term viability | Fastest growth | Private company since 2003 | Tableau is going IPO in 2013
Mindshare | Tableau Public | Growing fast | Tableau stands out
Big Data support | Good | Good | Tableau is 32-bit
Partner network | Good | Limited partnerships | Tableau leads
Data interactivity | Excellent | Excellent |
Visual drilldown | Very good | Very good |
Offline viewer | Free Reader | None | Tableau stands out
Analyst's desktop | Tableau Professional | Advizor has predictive modeling | ADVIZOR is value for money
Dashboard support | Excellent | Very good | Tableau leads
Web client | Very good | Good | Tableau leads
64-bit desktop | None | Very good | Tableau is still a 32-bit app
Mobile clients | Very good | Very good |
Visual controls | Very good | Very good |
Data integration | Excellent | Very good | Tableau leads
Development | Tableau Pro | ADVIZOR Analyst |
64-bit in-RAM DB | Good | Excellent | ADVIZOR leads
Mapping support | Excellent | Average | Tableau stands out
Modeling, analytics | Below average | Advanced predictive modeling | ADVIZOR stands out
Predictive modeling | None | Advanced capability with built-in KXEN algorithms | ADVIZOR stands out
Flight Recorder | None | Lets you track, replay and save your analysis steps for reuse by yourself or others | ADVIZOR stands out
Visualization | 22 chart types | All common charts (bar, scatter, line, pie) supported | Advizor adds advanced visualizations like Parabox and Network Constellation
Third-party integration | Many data connectors, see Tableau's drivers page | Integrates well with CRM software: Salesforce.com, Ellucian, Blackbaud and others | ADVIZOR leads in the CRM area
Training | Free online and paid classroom | Free online and paid via company trainers & partners | Tableau leads

In my previous post http://apandre.wordpress.com/2012/11/16/new-tableau-8-desktop-features/ (this post is a continuation of it), I said that Tableau 8 introduced 130+ new features, 3 times more than Tableau 7 did. Many of these new features are in Tableau 8 Server, and this post is about those new Server features (this is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/11/30/new-tableau-8-server-features/ ).

The Admin and Server pages have been redesigned to show more info quicker. In list view the columns can be resized; in thumbnail view the grid dynamically resizes. You can hover over a thumbnail to see more info about a visualization. The content search is better too:

[screenshot: Tableau 8 Server thumbnail view]

Web authoring (even mobile) is introduced by Tableau 8 Server. Changing dimensions, measures and mark types, adding filters, and using Show Me all happen directly in a web browser, and the result can be saved back to the server as a new workbook or, if individual permissions allow, to the original workbook:

[screenshot: Tableau 8 web authoring]

Subscribing to a workbook or worksheet will automatically deliver notifications about dashboard or view updates to your email inbox. Subscriptions deliver an image and a link.

The Tableau 8 Data Engine is more scalable now: it can be distributed between 2 nodes, and the 2nd instance can be configured as Active, Synced and Available for reading if the Tableau Router decides to use it (in addition to the fail-over function, as before). Tableau 8 Server now supports Local Rendering, using the graphic abilities of local devices, modern browsers and HTML5 – no round-trip to the server while rendering with the latest browsers (Chrome 23+, Firefox 17+, Safari, IE 9+). Tableau 8 (both Server and Desktop) computes each view in parallel. PDF files generated by Tableau 8 are up to 90% smaller and searchable. And the Performance Recorder works on both Server and Desktop.

Tableau 8 Server introduces Shared Sessions, which allow more concurrency and more caching. Tableau 7 uses 1 session per viewer; Tableau 8 uses one session for many viewers, as long as they do not change the state of filters or perform other altering interactions. If an interaction happens, Tableau 8 will clone the session for the appropriate interactor and apply his/her changes to the new session. Finally, Tableau is getting an API – the 1st part of it, the TDE API (C/C++, Python, Java on both Windows AND Linux!), I described in the previous blog post about the Desktop.

For web development Tableau now has a brand-new JavaScript API to customize selection, filtering, triggers for events, custom toolbars, etc. Tableau 8 has its own JavaScript API workbench, which can be used right from your browser.

The TDE API allows you to build your own TDE on any machine with Python, C/C++ or Java (see 24:53 at http://www.tableausoftware.com/tcc12conf/videos/new-tableau-server-8 ). Additionally, the Server API (REST API) allows you to programmatically create/enable/suspend sites and add/remove users to sites.

In addition to faster uploads and publishing of Data Sources, users can publish filters as Set and User Filters. Data Sources can be refreshed or appended instead of republished – all from local sources. Such refreshes can be scheduled using Windows Task Scheduler or other task-scheduling software on client devices – this is a real TDE proliferation!

My wishlist for Tableau 8 Server: all Tableau Server processes need to be 64-bit (they are still 32-bit, see here: http://onlinehelp.tableausoftware.com/v7.0/server/en-us/processes.htm – way overdue); a Linux version of Tableau Server is needed (Microsoft recently changed, very unfavorably, the way it charges for each Client Access); I wish for integration with the R library (Spotfire has had it for years); and I want Backgrounder processes (mostly doing data extracts on the server) to stop consuming core licenses, etc...

And yes, I found in San Diego even more individuals who found a better way to spend their time than attending the Tableau 2012 Customer Conference, and I am not here to judge:

[photo: seals in La Jolla]

I left the Tableau 2012 conference in San Diego (where Tableau 8 was announced) a while ago with an enthusiasm you can feel from this real-life picture of 11 excellent announcers:

[photo: Tableau 8 introduced in San Diego]

The conference was attended by 2200+ people and 600+ Tableau Software employees (Tableau almost doubled its number of employees in a year), and it felt like a great effort toward an IPO (see also the article here: http://www.bloomberg.com/news/2012-12-12/tableau-software-plans-ipo-to-drive-sales-expansion.html ). See some video here: TCC12 Keynote. Tableau 8 introduces 130+ new features, 3 times more than Tableau 7 did. Almost half of these new features are in Tableau 8 Desktop, and this post is about those new Desktop features (this is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/11/16/new-tableau-8-desktop-features/). The new Tableau 8 Server features deserve a separate blog post, which I will publish a little later, after playing with Beta 1 and maybe Beta 2.

A few days after the conference, the Tableau 8 Beta Program started with 2000+ participants. One of the most promising features is the new rendering engine, and I built a special Tableau 7 visualization (and its port to Tableau 8) with 42,000 datapoints: http://public.tableausoftware.com/views/Zips_0/Intro?:embed=y to compare the speed of rendering between versions 7 and 8:

[screenshot: 42,000-datapoint rendering test]

Among the new features are new (for Tableau) visualization types: Heatmap, "Packed" Bubble Chart and Word Cloud, and I built a simple Tableau 8 Dashboard to test them (all 3 visualize a 3-dimensional set, where 1 dimension is used as the list of items, 1 measure for the size and a 2nd measure for the color of items):

[image: heatmap, packed bubbles and word cloud in Tableau 8]

The list of new features includes improved Sets (comparing members vs. non-members, adding/removing members, combining Sets: all-in-both, shared-by-both, left-except-right, right-except-left), Custom SQL with parameters, freeform Dashboards (I still prefer an MDI UI, where each Chart/View sheet has its own child window as opposed to a pane), the ability to add multiple fields to Labels, optimized label placement, built-in statistical models for visual Forecasting, visual Grouping based on your data selection, and a redesigned Marks Card (for the Color, Size, Label, Detail and Tooltip shelves).

New data features include data blending without a mandatory linked field in the view and with the ability to filter data in secondary data sources; refreshing server-based Data Extracts can be done from local data sources; and Data Filters (in addition to being either local or global) can now be shared among a selected set of worksheets and dashboards. A refresh of a Data Extract can be done from the command prompt for Tableau Desktop, for example:

>tableau.exe refreshextract

Tableau 8 (finally) has an API (C/C++, Python, Java) to directly create a Tableau Data Extract (TDE) file, see an example here: http://ryrobes.com/python/building-tableau-data-extract-files-with-python-in-tableau-8-sample-usage/
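Here is a minimal sketch using the Python flavor of that API, with module and call names as in the Tableau 8 "dataextract" package as best I recall them; the file and column names are made up:

# Minimal sketch: build a TDE with the Tableau 8 Data Extract API
# (Python module "dataextract"; file and column names are illustrative)
import dataextract as tde

extract = tde.Extract('Sales.tde')
tableDef = tde.TableDefinition()
tableDef.addColumn('Region', tde.Type.CHAR_STRING)
tableDef.addColumn('Amount', tde.Type.DOUBLE)
table = extract.addTable('Extract', tableDef)  # the table must be named 'Extract'

row = tde.Row(tableDef)
row.setCharString(0, 'East')   # column 0 = Region
row.setDouble(1, 1234.56)      # column 1 = Amount
table.insert(row)
extract.close()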

Tableau 8 (both Desktop and Server) can then connect to this extract file natively! Tableau also provides new native connections for Google Analytics and Salesforce.com. TDE files are now much smaller (especially with text values) – up to 40% smaller compared with Tableau 7.

Tableau 8 has performance enhancements, such as the new ability to use hardware accelerators (of modern graphics cards), computing views within a dashboard in parallel (in Tableau 7 computations were consecutive), and a new performance recorder that allows you to estimate and tune the workload of various activities and functions and optimize the behavior of a workbook.

I still have a wishlist of features not yet implemented in Tableau, and I hope some of them will come later: all Tableau processes are 32-bit (except the 64-bit version of the data engine for servers running a 64-bit OS) and are way overdue to be 64-bit; many users demand a Mac version of Tableau Desktop and a Linux version of Tableau Server (Microsoft recently changed, very unfavorably, the way it charges users for each Client Access); I wish for an MDI UI for Dashboards, where each view of each worksheet has its own window as opposed to its own pane (Qlikview has done this from the beginning of time); I wish for integration with the R library (Spotfire has had it for years), scripting languages and an IDE (preferably Visual Studio); and I want Backgrounder processes (mostly doing data extracts on the server) to stop consuming core licenses, etc...

Despite the great success of the conference, I found somebody in San Diego who did not pay attention to it (outside it was 88F, sunny and beautiful):

[photo: hummingbird in La Jolla]

On May 3rd, 2012 the Google+ extension http://tinyurl.com/VisibleData of this Data Visualization blog reached 500+ followers; on July 9 it got to 1000+ users; on October 11 it already had 2000+ users; and on 11/27/12 my G+ Data Visualization page had 2190+ followers, still growing every day (updated as of 12/01/12: 2500+ followers).

One of the reasons, of course, is simply the popularity of Data Visualization related topics; another reason is covered in an interesting article here:

http://www.computerworld.com/s/article/9232329/Why_I_blog_on_Google_And_how_ .

In any case, it helped me to create a reading list for myself and other people, based on the feedback I got. According to CircleCount, as of the 11/13/12 update, my Data Visualization Google+ page ranked as the #178 most popular page in the USA. Thank you G+! Updates:

5/25/13: the G+ extension of this blog now has 3873+ followers;
as of 7/15/13 it has 4277+ followers;
and as of 11/11/13 it has 5013+ followers:

[screenshot: G+ followers as of 11/11/13]

I also have a 2nd G+ extension of this blog, see it here: http://tinyurl.com/VisualizationWithTableau with 375 followers as of 11/11/13:

[screenshot: "Visualization with Tableau" G+ followers as of 11/11/13]

 

Qlikview 10 was released around 10/10/10 and Qlikview 11 around 11/11/11, so I expected Qlikview 12 to be released on 12/12/12; "instead" we are getting Qlikview 11.2 with Direct Discovery in December 2012, which supposedly provides a "hybrid approach so business users can get the QlikView associative experience even with data that is not stored in memory".

This feature has been demanded by users (me included) for a long time, but I think the noise around so-called Big Data and the competition forced Qliktech to do it. Spotfire has had it for a long time (as well as a 64-bit implementation) and Tableau has had something like it for a while (unfortunately Tableau is still 32-bit). You can test the beta, if you have time: http://community.qlikview.com/blogs/technicalbulletin/2012/10/22/qlikview-direct-discovery-beta-registration-is-open

Just 8 months ago Qliktech estimated its sales for 2012 at $410M, and suddenly 3 months ago it lowered its estimate to $381M – just 19% over 2011 – which is in huge contrast with Qliktech's previous speed of growth, way behind the current speed of growth of Tableau, and even less than the current speed of growth of Spotfire. During the last 2 years the QLIK stock has been unable to grow significantly:

All of the above is forcing Qliktech to do something beyond gradual improvements – new and exciting functionality is needed, and Direct Discovery may help!

QlikView Direct Discovery enables users to perform visual analysis against "any amount of data, regardless of size". With the introduction of this unique hybrid approach, users can associate data stored within big data sources directly alongside additional data sources stored within the QlikView in-memory model. QlikView can "seamlessly connect to multiple data sources together within the same interface", e.g. Teradata to SAP to Facebook, allowing the business user to associate data across the data silos. Data outside of RAM can be joined with the in-memory data on common field names, which allows the user to associatively navigate both the Direct Discovery and the in-memory data sets.

A QlikView developer should set up the Direct Discovery table in the QlikView application load script to allow business users to query the desired big data source. Within the script editor a new syntax is introduced to connect to data in Direct Discovery form. Traditionally, the following syntax is required to load data from a database table:

To invoke the Direct Discovery method, the keyword "SQL" is replaced with "DIRECT".
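The original script examples did not survive in this post, so here is a minimal reconstruction assuming only what the surrounding text states (the OrderFact table and its columns are the ones mentioned below):

// Traditional load: every listed column is pulled into QlikView's in-memory model
OrderFact:
SQL SELECT CarrierTrackingNumber, ProductID, OrderQty, Price
FROM OrderFact;

// Direct Discovery: only the listed columns are loaded in-memory; the remaining
// columns (e.g. OrderQty, Price) stay in the database as IMPLICIT fields
OrderFact:
DIRECT SELECT CarrierTrackingNumber, ProductID
FROM OrderFact;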

In the example above, only the columns CarrierTrackingNumber and ProductID are loaded into QlikView in the traditional manner; other columns, including OrderQty and Price, exist only in the data table within the database. Those fields are referred to as "IMPLICIT" fields. An implicit field is a field that QlikView is aware of on a "meta level": the actual data of an implicit field resides only in the database, but the field may be used in QlikView expressions. Looking at the table view and data model, the Direct Discovery columns are not within the model (on the OrderFact table).

Once the Direct Discovery structure is established, the Direct Discovery data can be joined with the in-memory data on common field names. In this example, the "ProductDescription" table is loaded in-memory and joined to the Direct Discovery data on the ProductID field. This allows the user to associatively navigate both the "Direct Discovery" and the in-memory data sets.

Direct Discovery will be much slower than in-memory processing, and this is expected, but it takes away Qlikview's usual claim that it is faster than competitors. QlikView Direct Discovery can only be used against SQL-compliant data sources. The following data sources are supported:

• ODBC/OLEDB data sources – all ODBC/OLEDB sources are supported, including SQL Server, Teradata and Oracle.
• Custom connectors which support SQL – Salesforce.com, SAP SQL Connector, custom QVX connectors for SQL-compliant data stores.

Due to the interactive and SQL-syntax-specific nature of the Direct Discovery approach, a number of limitations exist. The following chart types are not supported:
• Pivot tables
• Mini charts
And the following QlikView features are not supported:
• Advanced aggregation
• Calculated dimensions
• Comparative Analysis (Alternate State) on QlikView objects that use Direct Discovery fields
• Direct Discovery fields are not supported on Global Search
• Binary load from a QlikView application with Direct Discovery table

Here is some preliminary video about Direct Discovery, published by Qliktech:

It was interesting to me that just 2 days after Qliktech pre-announced Direct Discovery, it also partnered with Teradata. Tableau has partnered with Teradata for a while and Spotfire did it a month ago, so I guess Qliktech is trying to catch up in this regard as well. I mention it only to underscore the point of this blog post: Qliktech realized that it is behind its competitors in some areas and has to follow ASAP.

Today TIBCO announced Spotfire 5, which will be released in November 2012. Two biggest news are the access to SQL Server Analysis Services cubes and the integration with Teradata “by pushing all aggregations, filtering and complex calculations used for interactive visualization into the (Teradata) database”.

Spotfire team “rewrote” its in-memory engine for v. 5.0 to take advantage of high-capacity, multi-core servers. “Spotfire 5 is capable of handling in-memory data volumes orders of magnitude greater than the previous version of the Spotfire analytics platform” said Lars Bauerle, vice president of product strategy at TIBCO Spotfire.

Another addition is "in-database analysis", which applies analytics within the database platforms (such as Oracle, Microsoft SQL Server and Teradata) without extracting and moving data, while handling analyses on the Spotfire server and returning result sets back to the database platform.

Spotfire added the new TIBCO Enterprise Runtime for R, which embeds an R runtime engine into the Spotfire statistical server. TIBCO claims that Spotfire 5.0 scales to tens of thousands of users! Spotfire 5 is designed to leverage the full family of TIBCO business optimization and big data solutions, including TIBCO LogLogic®, TIBCO Silver Fabric, TIBCO Silver® Mobile, TIBCO BusinessEvents®, tibbr® and TIBCO ActiveSpaces®.

The Mass Technology Leadership Council (MassTLC) organized today the Data Visualization Panel in their series of “Big Data Seminars”:

http://www.masstlc.org/events/event_details.asp?id=243502

and they invited me to be a Speaker and Panelist together with Irene Greif (Fellow @IBM) and Martin Leach (CIO @Broad Institute). Most interesting about this event was that it was sold out and about 150 people came to participate, even it was most productive time of the day (from 8:30am until 10:30am). Compare with what I observed just a few years ago, I sensed the huge interest to Data Visualization, base on multiple, very interesting and relevant questions I got from event participants.

I doubt that Microsoft is paying attention to my blog, but recently they declared that Power View now has 2 versions: one for SharePoint (thanks, but no thanks) and one for Excel 2013. In other words, Microsoft decided to have its own desktop visualization tool. In combination with PowerPivot and SQL Server 2012 it can be attractive for some Microsoft-oriented users, but I doubt it can compete with the Data Visualization leaders – too late.

Most interesting is the note about Power View 2013 on Microsoft's site: "Power View reports in SharePoint are RDLX files. In Excel, Power View sheets are part of an Excel XLSX workbook. You can't open a Power View RDLX file in Excel, and vice versa. You also can't copy charts or other visualizations from the RDLX file into the Excel workbook."

But most amazing is that Microsoft decided to use the dead Silverlight for Power View: "Both versions of Power View need Silverlight installed on the machine." And we know that Microsoft switched from Silverlight to HTML5 and no new development is planned for Silverlight! Good luck with that...

And yes, you can now add maps (Bing, of course); see it here:

(this is a repost from my other blog: http://tableau7.wordpress.com/2012/06/09/tableau-and-big-data/ )

Big Data can be useless without multi-layer data aggregations and hierarchical or cube-like intermediary Data Structures, where ONLY a few dozen, hundred or thousand data points are exposed visually and dynamically at every single viewing moment to analytical eyes, for interactive drill-down-or-up hunting for business value(s) and actionable datum (or "datums", if a plural of data is needed). One of the best expressions of this concept (at least as I interpreted it) I heard from my new colleague, who flatly said:

“Move the function to the data!”
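To make this concept concrete, here is a minimal Python sketch (the "sales" table and the SQLite connection are hypothetical stand-ins for a real back-end like Teradata or Oracle) of the difference between moving the data to the function and moving the function to the data:

# Sketch: aggregate inside the database and return only the handful
# of data points a chart can actually show to analytical eyes.
import sqlite3  # stand-in for a Teradata/Oracle/SQL Server driver

conn = sqlite3.connect("warehouse.db")  # hypothetical warehouse

# Moving the data to the function: millions of raw rows cross the wire.
# rows = conn.execute("SELECT region, amount FROM sales").fetchall()

# Moving the function to the data: the database scans the millions of
# rows and ships back only one aggregated row per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total, COUNT(*) AS n "
    "FROM sales GROUP BY region"
).fetchall()

for region, total, n in rows:
    print(f"{region}: {total:,.0f} total across {n:,} records")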

I recently got involved with multiple projects using large datasets for Tableau-based Data Visualizations (100+ million rows and even billions of records!). Some of the largest examples I used were 800+ million records in one case and 2+ billion rows in another.

So this blog post expresses my thoughts about such Big Data (on average, the examples above have about 1+ KB per CSV record before compression and other advanced DB tricks, like the columnar storage used by Tableau's Data Engine) as a back-end for Tableau. But please keep in mind that, as a 32-bit tool, Tableau itself is not ready for Big Data. In addition, I think Big Data is mostly a buzzword and BS, and we are sometimes forced by marketing BS masters to use this silly term.

Here are some factors involved in data delivery from the main, designated Database (back-ends like Teradata, DB2, SQL Server or Oracle) into "local" Tableau-based Big Data Visualizations (many people still try to use Tableau as a Reporting tool, as opposed to a (Visual) Analytical tool):

  • Queuing of thousands of queries to the Database Server. There is no guarantee your Tableau query will be executed immediately; in fact, it WILL be delayed.

  • The speed of a Tableau query, once it starts executing, depends on sharing CPU cycles, RAM and other resources with other queries executed SIMULTANEOUSLY with yours.

  • Buffers, pools and other resources available for particular user(s) and queries on your Database Server differ and depend on the privileges and settings given to you as a Database User.

  • Network speed: between some servers it can be 10Gbit (or even more); in most cases it is 1Gbit inside server rooms. Outside of server rooms, in many old buildings I observed a maximum of 100Mbit (over wired Ethernet) coming into the user's PC; if you are using Wi-Fi it can be even less (say, 54Mbit?). Over the internet it can be less still (I observed speeds of 1Mbit or so over old T-1 lines in some remote offices); if you are using VPN it will max out at 4Mbit or less (I observed that in my home office).

  • Utilization of the network. I use the Remote Desktop Protocol (RDP) to a VM or VDI (Virtual Machine sitting in a server room, reached from my workstation or notebook) connected to servers at a network speed of 1Gbit, but it still uses a maximum of 3% of that speed (about 30Mbit, which is about 3 megabytes of data per second, which is probably a few thousand records per second).

That means the network may have a problem delivering 100 million records to a "local" report even overnight (say, 10 hours: 10 million records per hour, or roughly 3000 records per second) – partially and probably because of network factors 4 and 5 above.
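A quick back-of-envelope check of these numbers (a sketch; the ~1 KB record size and the 3% utilization of a 1Gbit link are the estimates from above):

# Back-of-envelope throughput estimate for pulling records over a network.
RECORD_BYTES = 1024              # ~1 KB per CSV record before compression
LINK_BPS = 1_000_000_000         # nominal 1Gbit/s link
EFFECTIVE_SHARE = 0.03           # ~3% observed utilization over RDP

bytes_per_sec = LINK_BPS * EFFECTIVE_SHARE / 8      # ~3.75 MB/s
records_per_sec = bytes_per_sec / RECORD_BYTES      # ~3,700 records/s

hours = 10                                          # "overnight"
print(f"{records_per_sec:,.0f} records/s -> "
      f"{records_per_sec * 3600 * hours:,.0f} records in {hours} hours")
# Roughly 3,700 records/s, i.e. ~130 million records in 10 hours at best,
# which is why 100 million records overnight is already near the ceiling.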

On top of those factors, please keep in mind that Tableau is a set of 32-bit applications (with the exception of one out of 7 processes on the Server side), each restricted to 2GB of RAM; if a dataset cannot fit into RAM, then the Tableau Data Engine will use the disk as virtual RAM, which is much, much slower, and for some users such disk space is actually not local to their workstation but mapped to some "remote" network file server.

In many cases Tableau Desktop uses 32-bit ODBC drivers, which may add even more delay to data delivery into the local "Visual Report". As we learned from Tableau support itself, even with the latest Tableau Server 7.0.X, the RAM allocated for one user session is restricted to 3GB anyway.

Unfortunate Update: Tableau 8.0 will be a 32-bit application again, but maybe a follow-up version 8.x or 9 (I hope) will be ported to 64 bits… It means that Spotfire, Qlikview and even PowerPivot will keep some advantages over Tableau for a while…

(this is a repost from my other Data Visualization blog: http://tableau7.wordpress.com/2012/05/31/tableau-as-container/ )

I often use small Tableau (or Spotfire or Qlikview) workbooks instead of PowerPoint, which proves at least 2 concepts:

  • A good Data Visualization tool can be used as a Web or Desktop Container for multiple Data Visualizations (it can be used to build hierarchical Container structures with more than 3 levels; currently 3: Container-Workbooks-Views)

  • It can be used as a replacement for PowerPoint; in the example below I embedded into this Container 2 Tableau Workbooks, one Google-based Data Visualization, 3 image-based Slides and a Textual Slide: http://public.tableausoftware.com/views/TableauInsteadOfPowerPoint/1-Introduction

  • Tableau (or Spotfire or Qlikview) is better than PowerPoint for Presentations and Slides

  • Tableau (or Spotfire or Qlikview) is a Desktop and Web Container for Web Pages, Slides, Images and Texts

  • Good Visualization Tool can be a Container for other Data Visualizations

  • Sample Tableau Presentation above contains the Introductory Textual Slide

  • Sample Tableau Presentation above contains a Web Page with the Google-based Motion Chart Demo, and a few Tableau Visualizations:

    1. The Drill-down Demo

    2. The Motion Chart Demo (6 dimensions: X, Y, Shape, Color, Size, Motion in Time)

  • This Tableau Presentation contains a few Image-based Slides:

    1. The Quick Description of Origins and Evolution of Software and Tools used for Data Visualizations during the last 30+ years

    2. The Description of Multi-level Projection from Multidimensional Data Cloud to Datasets, Multidimensional Cubes and to Chart

    3. The Description of 6 stages of Software Development Life Cycle for Data Visualizations

TIBCO said Spotfire 4.5 will be available later this month (May 2012).

Among the news and additions to Spotfire: it will include an ADS connector to Hadoop, integration with SAS, MathWorks and Attivio engines, and a new deployment kit for iPad.

The short version of this post: as far as Data Visualization is concerned, the new Power View from Microsoft is a marketing disaster, an architectural mistake and a generous gift from Microsoft to Tableau, Qlikview, Spotfire and dozens of other vendors.

For the long version – keep reading.

Assume for a minute (OK, just for a second) that the new Power View Data Visualization tool from Microsoft SQL Server 2012 is almost as good as Tableau Desktop 7. Now let's compare the installation, configuration and hardware involved:

Tableau:

  1. Hardware:  almost any modern Windows PC/notebook (at least dual-core, 4GB RAM).
  2. Installation: a) one 65MB setup file, b) minimum or no skills
  3. Configuration: 5 minutes – follow instructions on screen during installation.
  4. Price – $2K.

Power View:

  1. Hardware: you need at least 2 server-level PCs (each at least quad-core, 16GB RAM recommended). I would not recommend using 1 production server to host both SQL Server and SharePoint; if you are desperate, at least use VM(s).
  2. Installation: a) each Server needs Windows 2008 R2 SP1 – a 3GB DVD; b) the 1st Server needs SQL Server 2012 Enterprise or BI Edition – a 4GB DVD; c) the 2nd Server needs SharePoint 2010 Enterprise Edition – a 1GB DVD; d) a lot of skills and experience.
  3. Configuration: hours or days, plus a lot of reading, prior knowledge, etc.
  4. Price: $20K, or about $5K if only for development (Visual Studio with MSDN subscription), plus the cost of skilled labor.

As you can see, Power View simply cannot compete on the mass market with Tableau (or Qlikview or Spotfire), and the time for the assumption at the beginning of this post has expired. Instead, now is the time to note that Power View is 2 generations behind Tableau, Qlikview and Spotfire. And there is no Desktop version of Power View; it is only available as a web application through a web browser.

Power View is a Silverlight application packaged by Microsoft as a SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Server 2010 Enterprise Edition. Power View is an (ad-hoc) report designer providing users with an interactive data exploration, visualization, and presentation web experience. Microsoft stopped developing Silverlight in favor of HTML5, but Silverlight survived (another mistake) within the SQL Server team.

Previous report designers (still available from Microsoft: BIDS, Report Builder 1.0, Report Builder 3.0, Visual Studio Report Designer) are capable of producing only static reports, but Power View enables users to visually interact with data and drill down through all charts and dashboards, similar to Tableau and Qlikview.

Power View is a Data Visualization tool integrated with the Microsoft ecosystem. Here is a demo of how the famous Hans Rosling Data Visualization can be reimplemented with Power View:

Compared with previous report builders from Microsoft, Power View adds many new features, like Multiple Views in a Single Report, Gallery preview of Chart Images, export to PowerPoint, Sorting within Charts by Measures and Categories, Multiple Measures in Charts, Highlighting of selected data in reports and Charts, Synchronization of Slicers (Cross-Filtering), Measure Filters, Search in Filters (convenient for long lists of categories), dragging data fields onto the Canvas (to create a table) or onto Charts (to modify a visualization), converting measures to categories ("Do Not Summarize"), and many other features.

As with any 1st release from Microsoft, you can find some bugs in Power View. For example, KPIs are not supported in Power View in SQL Server 2012; see it here: http://cathydumas.com/2012/04/03/using-or-not-using-tabular-kpis/

Power View is not Microsoft's 1st attempt to be a full player in the Data Visualization and BI market. Previous attempts failed and can be counted as strikes.

Strike 1: The ProClarity acquisition in 2006 failed; there have been no new releases since v. 6.3. Remnants of ProClarity can be found embedded in SharePoint, but there is no Desktop product anymore.

Strike 2: Performance Point Server was introduced in November 2007 and discontinued two years later. Remnants of Performance Point can be found embedded in SharePoint as Performance Point Services.

Both failed attempts were aimed at the growing Data Visualization and BI space, specifically at fast-growing competitors such as Qliktech, Spotfire and Tableau. Their remnants in SharePoint are functionally far behind the Data Visualization leaders.

The path to Strike 3 started in 2010 with the release of PowerPivot (a very successful half-step, since it is just a back-end for Visualization) and xVelocity (originally released under the name VertiPaq). Power View is a continuation of these efforts to add a front-end to the Microsoft BI stack. I do not expect Power View to gain as much popularity as Qlikview and Tableau, and in my mind Microsoft will be the subject of a 3rd strike in the Data Visualization space.

One reason I described at the very beginning of this post, and the 2nd reason is the absence of Power View on the desktop. It is a mystery to me why Microsoft did not implement Power View as a new part of Office (like Visio, which is a great success) – as a new desktop application, as a new Excel Add-In (like PowerPivot), as new functionality in PowerPivot or even in Excel itself, or as a new version of their Report Builder. None of these options prevents a Web reincarnation of it, and such a reincarnation could be done as part of (native SSRS) Reporting Services – why involve SharePoint (which is, and I have said it many times on this blog, basically a virus)?

I wonder what Donald Farmer thinks about Power View after being part of the Qliktech team for a while. From my point of view, Power View is a generous gift and a true relief to Data Visualization vendors, because they do not need to compete with Microsoft for a few more years, or maybe forever. Now the IPO of Qliktech makes even more sense to me, and the upcoming IPO of Tableau makes much more sense too.

Yes, Power View means new business for consulting companies and Microsoft partners (because many client companies and their IT departments cannot handle it properly), and Power View has good functionality, but it will be counted in history as Strike 3.

(this is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/04/02/palettes-and-colors/ )

I was always intrigued by colors and their usage, ever since my mom told me that maybe (just maybe, there is no direct proof of it anyway) the Ancient Greeks did not know what the BLUE color is – that puzzled me.

Later in my life, I realized that Colors and Palettes play a huge role in Data Visualization (DV), which eventually led me to attempt to understand how they can be used and pre-configured in advanced DV tools to make Data more visible and to express Data Patterns better. For this post I used Tableau to produce some palettes, but a similar technique can be found in Qlikview, Spotfire, etc.

Tableau published a good article on how to create customized palettes here: http://kb.tableausoftware.com/articles/knowledgebase/creating-custom-color-palettes and I followed it below. As this article recommends, I modified the default Preferences.tps file; see it below with images of the respective palettes embedded.

For the first, regular Red-Yellow-Green-Blue Palette of known colors with well-established names, I even created a Visualization to compare their Red-Green-Blue components, and I tried to place the respective Bubbles on a 2-dimensional surface, even though it is clearly a 3-dimensional Dataset (click on the image to see it in full size):

For the 2nd, Red-Yellow-Green-NoBlue Ordered Sequential Palette, I tried to implement an extended "Set of Traffic Lights without any trace of BLUE Color" (so Homer and Socrates would understand it the same way we do), while trying to use only web-safe colors. Please keep in mind that Tableau does not have a simple way to have more than 20 colors in one Palette, like Spotfire does.

The other 5 palettes below are useful too, as ordered-diverging, almost "mono-chromatic" palettes (except Red-Green Diverging, which can be used in Scorecards where Red is bad and Green is good). So see below the Preferences.tps file with my 7 custom palettes.

<?xml version='1.0'?> <workbook> <preferences>
<color-palette name="RegularRedYellowGreenBlue" type="regular">
<color>#FF0000</color> <color>#800000</color> <color>#B22222</color>
<color>#E25822</color> <color>#FFA07A</color> <color>#FFFF00</color>
<color>#FF7E00</color> <color>#FFA500</color> <color>#FFD700</color>
<color>#F0E68C</color> <color>#00FF00</color> <color>#008000</color>
<color>#00A877</color> <color>#99CC33</color> <color>#009933</color>
<color>#0000FF</color> <color>#00FFFF</color> <color>#008080</color>
<color>#FF00FF</color> <color>#800080</color>
</color-palette>

<color-palette name="RedYellowGreenNoBlueOrdered" type="ordered-sequential">
<color>#FF0000</color> <color>#CC6600</color> <color>#CCCC00</color>
<color>#FFFF00</color> <color>#99CC00</color> <color>#009900</color>
</color-palette>

<color-palette name="RedToGreen" type="ordered-diverging">
<color>#FF0000</color> <color>#009900</color> </color-palette>

<color-palette name="RedToWhite" type="ordered-diverging">
<color>#FF0000</color> <color>#FFFFFF</color> </color-palette>

<color-palette name="YellowToWhite" type="ordered-diverging">
<color>#FFFF00</color> <color>#FFFFFF</color> </color-palette>

<color-palette name="GreenToWhite" type="ordered-diverging">
<color>#00FF00</color> <color>#FFFFFF</color> </color-palette>

<color-palette name="BlueToWhite" type="ordered-diverging">
<color>#0000FF</color> <color>#FFFFFF</color> </color-palette>
</preferences> </workbook>

If you wish to use colors you like, this site is very useful for exploring the properties of different colors: http://www.perbang.dk/rgb/
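And here is a small Python sketch of the same Red-Green-Blue decomposition I used in the bubble Visualization above, applied to a few colors from the first palette (a minimal sketch; any '#RRGGBB' color works):

# Split '#RRGGBB' hex colors into their Red-Green-Blue components,
# the same 3 dimensions compared in the bubble Visualization above.
palette = ["#FF0000", "#800000", "#B22222", "#E25822", "#FFA07A",
           "#FFFF00", "#FF7E00", "#FFA500", "#FFD700", "#F0E68C"]

def hex_to_rgb(hex_color):
    """Convert '#RRGGBB' to an (R, G, B) tuple of 0-255 integers."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))

for c in palette:
    r, g, b = hex_to_rgb(c)
    print(f"{c}: R={r:3d} G={g:3d} B={b:3d}")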

(this is a repost from http://tableau7.wordpress.com/2012/03/31/tableau-reader/ )

Tableau made a couple of brilliant decisions to completely outsmart its competitors and gain extreme popularity, while convincing millions of potential, future and current customers to invest their own time in learning Tableau. The 1st reason, of course, is Tableau Public (we discuss it in a separate blog post) and the other is the free Tableau Reader, which provides a full desktop user experience and interactive Data Visualization without any Tableau Server (or any other server) involved, and with better performance and UI than Server-based Visualizations.

While designing Data Visualizations is done with Tableau Desktop, most users get their Data Visualizations served by Tableau Server to their Web Browser. However, in both large and small organizations that usage pattern is not always the best fit. Below I discuss a few possible use cases where the use of the free Tableau Reader can be appropriate; see it here: http://www.tableausoftware.com/products/reader .

1. Tableau Application Server serves Visualizations well, but not as well as Tableau Reader, because Tableau Reader delivers a truly desktop User Experience and UI. The best-known example is a Motion Chart: you can see automatic motion with Tableau Reader, but a Web Browser will force the user to emulate the motion manually. In cases like that, the user is advised to download the workbook, copy the .TWBX file to his/her workstation and open it with Tableau Reader.

Here is an example of a Motion Chart done in Tableau, similar to Hans Rosling's famous presentation of Gapminder's Motion Chart (and you need the free Tableau Reader or a license for Tableau Desktop to see the automatic motion of the 6-dimensional dataset, with all the colored bubbles resizing over time):
http://public.tableausoftware.com/views/MotionChart_0/Motion?:embed=y

Please note that the same Motion Chart using Google Spreadsheets will run in a browser just fine (I guess because Google "bought" Gapminder and kept its code intact):
https://docs.google.com/spreadsheet/ccc?key=0AuP4OpeAlZ3PdC14OXU1RGJsV05uaDlxRV9GLXlTZXc#gid=2

2. When you have hundreds or thousands of Tableau Server users and more than a couple of Admins (users with Administrative privileges), each Admin can override viewing privileges for any workbook, regardless of the Users and User Groups designated for that workbook. In such a situation there is a risk of violating the privacy and confidentiality of the data involved, for example in HR Analytics, HR Dashboards and other Visualizations where private, personal and confidential data are used.

Tableau Reader enables an additional, complementary method of delivering Data Visualizations through private channels like password-protected portals, file servers and FTP servers, in certain cases even bypassing Tableau Server entirely.

3. Due to the popularity of Tableau and its ease of use, many groups and teams are considering Tableau as a vehicle for delivering hundreds or even thousands of Visual Reports to hundreds and maybe even thousands of users. That can slow down Tableau Server, degrade the user experience and create even more confidentiality problems, because it may expose confidential data to unintended users – like the report for one store to users from another store.

4. Many small (and not so small) organizations try to save on Tableau Server licenses (at least initially), and they can still distribute Tableau-based Data Visualizations: developer(s) will have Tableau Desktop (a relatively small investment), while users, clients and customers will use Tableau Reader, and all TWBX files can be distributed over FTP, portals, file servers or even by email. In my experience, when a Tableau-based business grows enough, it will pay by itself for Tableau Server licenses, so the usage of Tableau Reader is in no way a threat to Tableau Software's bottom line!

Update (12/12/12) for even happier usage of Tableau Reader: in the upcoming Tableau 8, all Tableau Data Extracts (TDEs) can be created and used without any Tableau Server involved. Instead, a developer can create/update a TDE either with Tableau in UI mode, script TDEs in batch mode using the Tableau Command Line Interface, or do it programmatically with the new TDE API (Python, C/C++, Java), as sketched below. It means that Tableau workbooks can be automatically refreshed with new data without any Tableau Server and re-delivered to Tableau Reader users over… FTP, portals, file servers or even by email.
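Here is a minimal sketch of what that could look like in the Python flavor of the new TDE API (the module and method names follow the pre-release "dataextract" package as I understand it – treat them as assumptions until Tableau 8 actually ships):

# Assumed API: build a small .tde extract, ready to re-deliver to
# Tableau Reader users over FTP, a portal or email.
import dataextract as tde  # assumed module name from the TDE API preview

extract = tde.Extract("sales.tde")            # create/open the extract file

table_def = tde.TableDefinition()             # define the schema
table_def.addColumn("Store", tde.Type.UNICODE_STRING)
table_def.addColumn("Sales", tde.Type.DOUBLE)

# the single table in a TDE is expected to be named 'Extract'
table = extract.addTable("Extract", table_def)

for store, sales in [("Boston", 125000.0), ("Chicago", 98000.0)]:
    row = tde.Row(table_def)                  # one row at a time
    row.setString(0, store)                   # column 0: Store
    row.setDouble(1, sales)                   # column 1: Sales
    table.insert(row)

extract.close()                               # sales.tde is ready to ship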

In an unusual and interesting (what does it mean? is it promising or what?) move, two Data Visualization leaders (Panopticon and Qliktech) partnered today; see

http://panopticon.com/Panopticon-Software-Partners-with-QlikTech-to-Provide-Real-Time-Visual-Data-Monitoring-and-Analysis-Dashboards

“to offer enhanced, real-time visualization capabilities for the QlikView Business Discovery platform”.

Panopticon's press release looks overly submissive to me:

“As a member of QlikTech’s Qonnect Partner Program for Technology Partners, Panopticon supports QlikView desktop, web, and mobile interactive dashboards and allows users to filter and interact directly with real-time data. By integrating Panopticon into their systems, QlikView users can:

The combined Panopticon-QlikView platform is now available for immediate installation.”

Panopticon's integration into QlikView dashboards utilizes QlikView UI extension objects within the web browser. The extension object calls Panopticon "web parts" and creates a Panopticon extension object with a number of pre-defined properties. The defined context/data is passed into the Panopticon extension object. The Panopticon "web parts" call a Panopticon EX Java applet and render the requested Panopticon visualization workbook within the context defined by the QlikView user. The Panopticon component executes parameterized URL calls and parameterized JavaScript to update the parent QlikView display.

Qliktech is trying to be politically correct; its Michael Saliter, Senior Director of Global Market Development – Financial Services at QlikTech, said: "Our partnership with Panopticon allows us to incorporate leading real-time visualization capabilities into our QlikView implementations. We recognize the importance of providing our clients with truly up-to-date information, and this new approach supports that initiative. Our teams share a common philosophy about proper data visualization design. This made it easy to develop a unified approach to the presentation of real-time, time series, and static data in ways that people can understand in seconds."

While I like it when competitors cooperate (it benefits users and hopefully improves sales for both vendors), I still have a question: Qliktech got a lot of money from its IPO, had a lot of sales and hired a lot of people lately; why were they (Qlikview Developers) not able to develop real-time functionality themselves?

Hugh Heinsohn, VP of Panopticon, said to me: “we (Panopticon) don’t see ourselves as competitors – and neither do they (Qliktech). When you get into the details, we do different things and we’re working together closely now”

Another indirect sign of the relationship between Panopticon and Qliktech is the recent inclusion of Måns Hultman, former CEO of QlikTech, in the list of advisors to Panopticon's Board of Directors.

Other questions arise too: if Qliktech is suddenly open to integration with Panopticon, why not integrate with Quantrix and the R Library (I proposed integration with R a while ago)? Similar questions apply to Tableau Software…

I was silent for a while for a reason: I owe it to myself to read a big pile of books, articles and blog posts by many authors – I have to read them before I can write something myself. The list is huge and it goes many weeks back! I will sample a sublist here with some relatively fresh reading materials, in no particular order:

1. Excellent “Clearly and Simply” blog by Robert Mundigl, here are just 2 samples:

2. Interesting site dedicated to The Traveling Salesman Problem:

3. Excellent QV Design blog by Matthew Crowther, here are a few examples:

4. Good article by James Cheshire here:

5. Interesting blog by Josh Tapley: http://data-ink.com/

6. A must-read blog by Stephen Wolfram; just take a look at his last 2 posts:

7. Nice post by my friend John Callan: http://community.qlikview.com/blogs/theqlikviewblog/2012/03/09/why-discovery-really-matters

8. I am trying to follow David Raab as much as I can:

9. As always, interesting articles from Timo Elliott:

10. A huge set of articles from a variety of sources about the newly released or about-to-be-released xVelocity, PowerPivot 2, SQL Server 2012, SSDT (SQL Server Data Tools), VS11, etc.

11. Here is a sample of an article with which I disagree (I think OBIEE is TWO generations behind Qlikview, Tableau and Spotfire), but I still need to read it:

http://www.projectedconsulting.com/index.php/component/wordpress/2012/03/qlikview-versus-bi-applications-and-obiee

This list goes on and on and on, so the answer to my own question is: to read!

Below is proof (unrelated to Data Visualization, but I cannot resist publishing it – I made the spreadsheet below myself), rather for myself, that reading can help to avoid mistakes (sounds funny, I know). For example, if you listen to last week's iPropaganda from the iChurch, you will think that the new iPad 2012 is the best tablet on the market. But if you read the specifications of the new iPad 2012 carefully and compare them with the specifications of the new Asus Transformer Pad Infinity, you will make a different choice:

Dan Primack, Senior Editor at Fortune, posted today at http://finance.fortune.cnn.com/2012/02/22/tableau-to-ipo-next-year/ a suggestion that Tableau may go public next year, and I quote:

"Scott Sandell, a partner with New Enterprise Associates (the venture capital firm that is Tableau's largest outside shareholder), told Dan that the "board-level discussions" are about taking the company public next year, even though it has the numbers to go out now if it so chose. Sandell added that the company has been very efficient with the $15 million or so it has raised in VC funding, and that it shouldn't need additional pre-IPO financing."

Mr. Primack also mentioned an unsolicited email from an outside spokesman: "Next week Tableau Software will announce its plans to go IPO"…

I have no comments, but I will not be surprised if somebody buys Tableau before its IPO… Among potential buyers I can imagine:

  • Microsoft (Seattle, Multidimensional Cubes, integration with Excel),
  • Teradata (Aster Data is in, front-end for “big data” is needed),
  • IBM (if you cannot win against the innovator, how about buying it),
  • and even Oracle (everything moving is the target?)…

I recently started a new Data Visualization Google+ page as an extension of this blog, here:

https://plus.google.com/111053008130113715119/posts


The Internet has a lot of articles, pages, blogs, data, demos, vendors, sites, dashboards, charts, tools and other materials related to Data Visualization, and this Google+ page will try to point to the most relevant items and sometimes to comment on the most interesting of them.


What was unexpected is the fast success of this Google+ page: in a very short time it got 200+ followers, and that number keeps growing!
