November 26, 2013
October 22, 2013
A famous traditional BI vendor got sick and tired of being left out of the Data Visualization market and decided to force its way into it by releasing today 2 free (for all users) Data Visualization products:
MicroStrategy Analytics Desktop™ (Free self-service visual analytics tool)
MicroStrategy Analytics Express™ (Free Cloud-based self-service visual analytics)
That looks to me like a huge disruption of the Data Visualization market: for example, the similar desktop product from Tableau costs $1999 and the cloud product called Tableau Online costs $500/year/user. It puts Tableau, Qlikview and Spotfire in a very tough position price-wise. However, only Tableau's stock went down, by almost $3 (more than 4%) today, while MSTR, TIBX and QLIK basically did not react to the MicroStrategy announcement:
And don't think that only MicroStrategy is trying to get into the DV market. For example, SAP did something similar (in a less dramatic and non-disruptive fashion) a few months ago with SAP Lumira (the Personal Edition is free); SAP Cloud and Standard editions are available too, see it here http://www.saplumira.com/index.php and here http://store.businessobjects.com/store/bobjamer/en_US/Content/pbPage.sap-lumira . When SAP senior vice president and platform head Steve Lucas was asked 10 weeks ago if SAP would consider buying Tableau, Lucas went in the opposite direction. "We aren't going to buy Tableau," Lucas said with a smile on his face. "There's no need to buy an overvalued software company." Rather, SAP wants to crush companies like Tableau (I doubt it is possible, but SAP is free to try) and build its own Data Visualization product line out of Lumira, read more at
If I were Tableau, Qlikview or Spotfire, I would not worry about MicroStrategy's competition yet, because it is unclear how future R&D for the free Analytics Desktop and Express will be funded – out of the MicroStrategy Analytics Enterprise™ R&D budget? That can be tricky, considering that right now Tableau is hiring hard (163 open job positions as of yesterday!), Qliktech is very active too (about 93 openings as of yesterday), and even TIBCO has 36 open positions for Spotfire alone.
But I may start to worry about another DV vendor – Datawatch, which recently completed the acquisition of Panopticon. Datawatch grew 45% YoY (2012 over 2011), has only 124 employees but $27.5M in sales, very experienced leadership, 40,000+ customers worldwide and a mature product line. Maybe more evidence of that is here:
The three MicroStrategy Analytics Platform products also share a common user experience – making it easy to start small with self-service analytics and grow into the production-grade features of Enterprise. Desktop and Express from MicroStrategy can be naturally extended (for a fee) to a new enterprise-grade BI&DV suite, also released today and called MicroStrategy Analytics Enterprise™ (also known as MicroStrategy Suite 9.4).
The new MicroStrategy Analytics Enterprise 9.4 includes data blending, which allows users to combine data from more than one source; the software stores the data in working memory without the need for a separate data integration product. 9.4 can connect to the MongoDB NoSQL data store as well as Hadoop distributions from Hortonworks, Intel and Pivotal. It comes with R integration and adds better ESRI integration. The application can now fit 10 times as much data in memory as the previous version could, and self-service querying now runs up to 40 percent faster.
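To make the "data blending" idea concrete: it is essentially an in-memory join of rows coming from independent sources, with no separate ETL product in the pipeline. Here is a minimal Python sketch of the concept (the data and the `blend` helper are invented for illustration; this is not MicroStrategy's actual engine or API):

```python
# "Data blending" in miniature: an in-memory inner join of rows from
# two independent sources, without a separate data-integration product.
# (Toy data and helper -- not MicroStrategy's actual engine or API.)

def blend(left, right, key):
    """Inner-join two lists of dicts on a shared key, entirely in memory."""
    index = {row[key]: row for row in right}
    blended = []
    for row in left:
        match = index.get(row[key])
        if match is not None:
            merged = dict(row)   # copy so the sources stay untouched
            merged.update(match)
            blended.append(merged)
    return blended

# Source 1: revenue figures, e.g. from a relational database
sales = [{"region": "East", "revenue": 120},
         {"region": "West", "revenue": 95}]
# Source 2: targets, e.g. from a spreadsheet
targets = [{"region": "East", "target": 100},
           {"region": "West", "target": 110}]

for row in blend(sales, targets, "region"):
    print(row["region"], row["revenue"] - row["target"])
```

A real blending engine does the same matching on common dimensions (region, date, product) across heterogeneous sources, just at much larger scale and with smarter storage.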
The MicroStrategy Analytics Enterprise™ Suite is also available starting today for free for developers and non-production use: 10 named user licenses of MicroStrategy Intelligence Server, MicroStrategy Web Reporter and Analyst, MicroStrategy Mobile, MicroStrategy Report Services, MicroStrategy Transaction Services, MicroStrategy OLAP Services, MicroStrategy Distribution Services, and the MultiSource Option; 1 named user license each of the development software: MicroStrategy Web Professional, MicroStrategy Developer, and MicroStrategy Architect (the server components have a 1-CPU limit).
Quote from Wayne Eckerson, President of BI Leader Consulting: “The new MicroStrategy Analytics Desktop makes MicroStrategy a top-tier competitor in the red-hot visual discovery market. The company was one of the first traditional enterprise BI vendors to ship a visual discovery tool, so its offering is mature compared to others in its peer group, but it was locked away inside its existing platform. By offering a stand-alone desktop visual discovery tool and making it freely available, MicroStrategy places itself among” Data Visualization Leaders.
You can also read today's article by a very frequent visitor to my blog (his name is Akram), who is a portfolio and hedge fund manager, day trader and excellent investigator of all Data Visualization stocks, the DV market and DV vendors. His article "Tableau: The DV Market Just Got More Crowded" can be found here (I cannot resist quoting: "Microstrategy is priced like it has nothing to do with this space, and Tableau is priced like it will own the whole thing."):
MicroStrategy Analytics Desktop.
It's free visual analytics: free Visual Insight, 100MB per file, 1GB of total storage, 1 user, free e-mail support for 30 days, and free access to online training, forum, and knowledge base.
Data sources: xls, csv, RDBMSes, multidimensional cubes, MapReduce, columnar DBs; access with a web browser; export to Excel, PDF, Flash and images; email distribution. The product is freely available to all and can be downloaded instantly at: http://www.microstrategy.com/free/desktop .
Kevin Spurway, MicroStrategy's vice president of industry and mobile marketing, said: "The new desktop software was designed to compete with other increasingly popular self-serve, data-discovery desktop visualization tools offered by Tableau and others". To work with larger data sets, a user should have 2GB or more of working memory on the computer, Spurway said. See more here:
MicroStrategy Analytics Express.
MicroStrategy Analytics Express is a software-as-a-service (SaaS)-based application that delivers all the rapid-fire self-service analytical capabilities of Desktop, plus reports and dashboards, native mobile applications, and secure team-based collaboration – all instantly accessible in the Cloud. Today, the Express community includes over 32,000 users across the globe.
In this release, Express inherits all the major functional upgrades of the MicroStrategy Analytics Platform, including new data blending features, improved performance, new map analytics, and much more. For a limited time, MicroStrategy is also making Express available to all users free for a year. With this valuable offer, users will be able to establish an account, invite tens, hundreds, or even thousands of colleagues to connect, analyze and share their data and insight, and do it all at no charge. For some organizations, the potential value of this offer can be $1 million or more. Users can sign up, access the service, and take advantage of this offer instantly at
MicroStrategy Analytics Express includes free Visual Insight, free web browser and iPad access, free SaaS for one year, 1GB upload per file, an unlimited number of users, free e-mail support for 30 days, and free access to online training, forum, and knowledge base. Data sources: xls, csv, RDBMSes, columnar DBs, Dropbox, a Google Drive connector, Visual Insight, a lot of security and a lot more, see http://www.microstrategy.com/Strategy/media/downloads/free/analytics-express_user-guide.pdf
All tools from the MicroStrategy Analytics Platform (Desktop, Express and Enterprise Suite) support a standard list of chart styles and types: Bar (Vertical/Horizontal Clustered/Stacked/100% Stacked), Line (Vertical/Horizontal Absolute/Stacked/100% Stacked), Combo Chart (of Bar and Area), Area (Vertical/Horizontal Absolute/Stacked/100% Stacked),
Dual Axis ( Bar/Line/Area Vertical/Horizontal), HeatMap, Scatter, Scatter Grid, Bubble, Bubble Grid, Grid,
Pie, Ring, ESRI Maps,
Network of Nodes, with lines representing links/connections/relationship,
Microcharts and Sparklines,
Data and Word Clouds,
and of course any kind of interactive Dashboards as combination of all of the above Charts, Graphs, and Marks:
October 11, 2013
Yesterday TIBCO announced Spotfire 6, with features competitive with Tableau 8.1 and Qlikview.Next (a.k.a. Qlikview 12). Some new features will be showcased at TUCON® 2013, TIBCO's annual user conference, October 14-17, 2013 (2100 attendees). The livestream video is here: http://tucon.tibco.com/video/index.html , tune in October 15th and 16th from 11:30am – 3:30pm EST.
More details will be shown in webcasts and webinars (I personally prefer detailed articles, blogposts, slides, PDFs and demos, but TIBCO's corporate culture has ignored my preferences for years) on 10/30/13 by Steve Farr
- here: http://lp.spotfire.tibco.com/Global_Webcast_2013Spotfire6OctoberAM.html and
- here: http://lp.spotfire.tibco.com/Global_Webcast_2013Spotfire6OctoberPM.html
Spotfire 6.0 will be available in mid-November, presumably at the same time as Tableau 8.1 and before Qlikview.Next, so TIBCO is not a loser in this leap-frogging game, for sure…
TIBCO bought Extended Results and will presumably show integration with the PushBI product, see it here: http://www.pushbi.com/ ; TIBCO describes it as delivery of personal KPIs and metrics on any mobile phone, tablet or laptop, online or offline (its new name will be TIBCO Spotfire® Consumer):
Another TIBCO purchase is MAPORAMA; TIBCO calls the integration with it (very appropriately) Location Analytics, with the promise to:
Visualize, explore and analyze data in the context of location
Expand situational understanding with multi-layered geo-analytics
Mashup new data sources to provide precise geo-coding across the enterprise
Spotfire Location Services is an agnostic platform and supports (I guess this needs to be verified, because it sounds too good to be true) any map service, including TIBCO's own, ESRI (Spotfire integrated with ESRI previously) and Google:
TIBCO has event-processing capabilities – e.g. BusinessEvents (now at 5.1.2), ActiveSpaces (currently v2.2) and real-time streaming of "Big Data" with StreamBase (7.3.7), which TIBCO bought a few months ago, see it here: http://www.streambase.com/news-and-events/press-releases/pr-2013/tibco-software-acquires-streambase-systems/#axzz2hiEjnr9X – and it will be interesting to see the new Spotfire Events Analytics product (to spot event patterns; see also: http://www.streambase.com/products/streambasecep ) integrated with Spotfire 6:
Identify new trends and outliers with continuous process monitoring
Automate the delivery of analytics applications based on trends
Operationalize analytics to support continuous process improvement:
One more capability mentioned (this claim needs to be verified) in a recent TIBCO blogpost http://www.tibco.com/blog/2013/10/11/connecting-the-loops-the-next-step-in-decision-management/ is the ability to overlap 2 related but, in real life, separate processes: analysis (discovery of insights in data) and execution (deciding and acting) can be separated by days, but with Spotfire 6.0 the entire decision process can happen in real time:
For the business user, Spotfire 6 has new web-based authoring (Spotfire has a few "clients", one called Web Player and another called Enterprise Player, neither of which is free, unlike the free Tableau Reader or Tableau Public). Bridging the gap between simple dashboards and advanced analytic applications, Spotfire 6.0 provides a new client "tailored to meet the needs of the everyday business user, who typically has struggled to manipulate pivot tables and charts to address their data discovery needs".
With this new web application, known as TIBCO Spotfire® Business Author, business users can visually explore and interact with data, whether residing in a simple spreadsheet or dashboard, a database, or a predefined analytic application. It will definitely compete with Web Authoring in Tableau 8.1 and incoming Qlikview.Next.
For me personally, the most interesting new feature is the new Spotfire Cloud Services (supposedly the continuation of Spotfire Silver, which I like, but which is overpriced and non-competitive storage-wise vs. the Tableau Public and Tableau Online cloud services). Here is a quote from yesterday's press release: "TIBCO Spotfire® Cloud is a new set of cloud services for enterprises, work groups, and personal use (see some preliminary info here: https://marketplace.cloud.tibco.com/marketplace/marketplace/apps#/sc ):
- Personal: Web-based product, upload Excel, .csv and .txt data, 12 visualization types, 100 GB of data storage. However, Spotfire is making a big mistake by denying access to the Spotfire Analyst desktop product and offering it not for free but only as a "free trial for 30 days", after which you have to pay a fee. That will benefit Tableau for sure, and maybe even Datawatch. As of 11/13/13, Spotfire still had not posted prices and fees for Spotfire Cloud Personal etc. and suggested contacting them over email, which I did, but they never replied…
- Workgroup: Web-based and desktop product, Connect and integrate multiple data sources, All visualization types, 250 GB of data storage.
- Enterprise: Web-based and desktop product, Connect to 40+ data sources, All visualization types, Advanced statistics services, 500 GB of data storage
TIBCO Spotfire® Cloud Enterprise provides a secure full-featured version of Spotfire in the cloud to analyze and collaborate on business insights, whether or not the data is hosted. For project teams seeking data discovery as a service, TIBCO Spotfire® Cloud Work Group provides a wealth of application-building tools so distributed teams can visually explore data quickly and easily and deploy analytic applications at a very low cost. For individuals looking for a single step to actionable insight, TIBCO Spotfire® Personal is a cost-effective web-based client for quick data discovery needs."
Please don't forget that Spotfire 6 has TIBBR (v5 as of now: https://tibco.tibbr.com/tibbr/web/login ), a social computing platform built for the workplace and integrated with Spotfire. Ram Menon, President of Social Computing at tibbr, says: "We now have 6.5 million users for tibbr as of October", accessed from 7,000 cities and 2,100 different device models. "A typical tibbr post is now seen by 100 users in the span of 24 hours, in 7 countries and over 50 mobile devices." This fulfills TIBCO's mission of getting the right information to the right people at the right time. Related: the integration between TIBBR and HUDDLE: http://www.huddle.com/blog/huddle-and-tibbr-unite-to-bring-powerful-collaboration-to-enterprise-social-networking/
And finally – an enterprise-class, R-compatible statistical engine: TIBCO Enterprise Runtime for R (TERR), which is part of the excellent TIBCO Spotfire Statistics Services (TSSS). TSSS allows integration of R (including TERR), Spotfire's own S+ (S-PLUS is Spotfire's commercial implementation of the S language), SAS® and MATLAB® into Spotfire and custom applications. TERR (see http://spotfire.tibco.com/en/discover-spotfire/what-does-spotfire-do/predictive-analytics/tibco-enterprise-runtime-for-r-terr.aspx ) supports:
Parallelized R-language scripts in TERR
Call-outs to open-source R from TERR
Use of RStudio – the most popular IDE in the R community – to develop your TERR scripts
Over a thousand TERR-compatible CRAN packages
Among other news is support for new data sources: http://spotfire.tibco.com/en/resources/support/spotfire-data-sources.aspx including SAP NetWeaver Business Warehouse v7.0.1 (requires the TIBCO Connector Link).
I maintain my opinion that the best way for TIBCO to capitalize on the tremendous hidden market value of Spotfire is to spin it off, as EMC did with VMWare.
My other concern is that too many offices are involved with Spotfire: (parent) TIBCO's HQ in California, the Swedish HQ (mostly R&D) in Sweden, and a large marketing, sales, support and consulting office in Newton, Massachusetts. My advice is to have only one main office, in MA, which is compatible with the spin-off idea. Tableau has an advantage here, having concentrated its main office in Seattle.
Update 11/13/13: TIBCO's Spotfire propaganda so far has not helped TIBCO's stock at all, but it seems to me that it has helped Datawatch's stock price a lot (Datawatch recently bought a very technically capable DV vendor, Panopticon, and integrated its own software with Panopticon's; Datawatch has 40,000+ customers with 500,000+ end users)
October 6, 2013
Last month Tableau and Qliktech both declared that traditional BI is too slow (I have been saying this for many years) for development and that their new Data Visualization (DV) software is going to replace it. Quote from Tableau's CEO Christian Chabot: "Traditional BI software is obsolete and dying and this is very direct challenge and threat to BI vendors: your (BI that is) time is over and now it is time for Tableau." A similar quote from Anthony Deighton, Qliktech's CTO & Senior VP, Products: "More and more customers are looking at QlikView not just to supplement traditional BI, but to replace it".
Since the main criteria for the client were to:
minimize the IT personnel involved and increase their productivity;
minimize off-shoring and outsourcing, as they limit interactions with end users;
increase end users' involvement, feedback and action discovery.
So I advised the client to take a typical visual report project from the most productive traditional BI platform (MicroStrategy), use its prepared data, and clone it with D3 and Tableau (using experts for both). The results (development time in hours) are below; all three projects include the same time (16 hours) for data preparation & ETL, the same time for deployment (2 hours) and the same number (8) of repeated development cycles (due to 8 consecutive rounds of feedback from end users):
It is clear that traditional BI requires too much time, and that D3 tooling just tries to prolong old/dead BI traditions by modernizing and beautifying the BI approach, so my client chose Tableau as a replacement for MicroStrategy, Cognos, SAS and Business Objects, and as a better option than D3 (which requires smart developers and too much development). This movement to leading Data Visualization platforms is going on right now in most of corporate America, despite IT inertia and existing skillsets. Basically it is an application of the simple principle that "Faster is better than Shorter", known in science as Fermat's principle of least time.
These changes made me wonder (again) whether Gartner's recent market-share estimates and trends for dead-horse sales (old traditional BI) will hold for long. Gartner estimates the size of the BI market at $13B, which is drastically different from TBR's estimate ($30B).
TBR predicts that the market will keep growing at least until 2018 at a yearly rate of 4%, with the BI software market exceeding $40 billion by 2018 (they estimated the BI market at $30B in 2012 and include the much wider category of business analytics software, as opposed to strictly BI tools). I added estimates for MicroStrategy, Qliktech, Tableau and Spotfire to Gartner's market-share estimates for 2012 here:
"Traditional BI is like a pencil with a brick attached to it," said Chris Stolte at the recent TCC13 conference, and Qliktech said something very similar in its recent announcement of Qlikview.Next. I expect TIBCO will say much the same about the upcoming new release of Spotfire (next week at the TUCON 2013 conference in Las Vegas?)
These bold predictions by leading Data Visualization vendors are just a simple application of Fermat's principle of least time: this principle states that the path taken between two points by a ray of light (or a development path, in our context) is the path that can be traversed in the least time.
Fermat's principle can easily be applied to "path" estimates in multiple situations, as in the video below, where the path from the initial position of a lifeguard on the beach to a swimmer in distress (a path through sand, shoreline and water) is explained:
Even ants follow Fermat's principle (as described in an article in the Public Library of Science here: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0059739 ), so my interpretation of this law of nature ("Faster is better than Shorter") is that traditional BI is a dying horse, and I advise everybody to obey the laws of nature.
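The lifeguard example is easy to verify numerically: pick the entry point into the water that minimizes total travel time and compare it with the straight-line path. Here is a small Python sketch of that computation (all numbers – positions and speeds – are invented for the demo):

```python
import math

# Least-time path for the lifeguard problem (Fermat's principle):
# run on sand at v_sand, swim at v_water, and choose where to enter
# the water so the TOTAL time is minimal. All numbers are made up.

def total_time(x, guard=(0.0, 0.0), swimmer=(50.0, 40.0),
               shoreline_y=20.0, v_sand=7.0, v_water=2.0):
    """Time to run from guard to (x, shoreline_y), then swim to swimmer."""
    run = math.hypot(x - guard[0], shoreline_y - guard[1])
    swim = math.hypot(swimmer[0] - x, swimmer[1] - shoreline_y)
    return run / v_sand + swim / v_water

# Scan candidate entry points along the shoreline in small steps.
best_x = min((i / 100.0 for i in range(0, 5001)), key=total_time)

# The straight guard->swimmer line crosses the shoreline at x = 25,
# but the least-TIME path enters the water much later (more running,
# less swimming) -- it bends at the shoreline like a refracted ray.
straight_x = 25.0
print(round(best_x, 2), round(total_time(best_x), 2))
```

The optimal entry point satisfies Snell's law, sin θ₁ / v_sand = sin θ₂ / v_water, which is exactly the light-refraction analogy the video makes.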
If you would like to watch another video about Fermat's principle of least time and the related Snell's law, you can watch this:
September 25, 2013
Qlikview 10 was released around 10/10/10, Qlikview 11 around 11/11/11, so I expected Qlikview 12 to be released on 12/12/12. Qliktech's press release said today that the next (after 11.2) version of Qlikview will be delivered under the new nickname Qlikview.Next in 2014, but "for early adopter customers in a production environment in 2013". I hope I can get my hands on it ASAP!
The new buzzword is Natural Analytics: "QlikView.Next's key value as an alternative BI platform is in its use of Natural Analytics". The new Qliktech motto that "Qlikview is a Replacement of Traditional BI" is similar to what we heard from Tableau leaders just 2 weeks ago at the Tableau Customer Conference in Washington, DC. Other themes I hear from Qliktech about Qlikview.Next sound familiar too: Gorgeous, Genius, Visually Beautiful, Associative Experience, Comparative Analysis, Anticipatory, Drag and Drop Analytics.
Qlikview.Next will introduce "Data Dialogs" – live discussions between multiple users about the data they see and explore collectively, enabling "Social BI". This reminds me of the integration between TIBBR (TIBCO's collaboration platform) and Spotfire, which has existed since Spotfire 4.0.
Details about new features in Qlikview.Next will be released later, but at least we know now when Qlikview 12 (sorry, Qlikview.Next) will be available. Some features were actually unveiled, in generic terms:
Unified, browser-based HTML5 client, which will automatically optimize itself for the user's device;
Automatic and Intelligent re-sizing of objects to fit user’s screen;
Server-side Analysis and Development, Web-based creation and delivery of content, Browser-based Development;
Data Storytelling, narrative and social with Data Dialogs;
Library and Repository for UI objects;
Multi-source Data Integration and new web-based scripting;
QlikView Expressor for advanced graphical Data Integration and Metadata Management;
Improved Data Discovery with associative experience across all the data, both in memory and on disks;
All UI Objects can be treated as extension Objects, customizable with their source files available to developers;
New Management Console with a Qlikview-on-Qlikview monitor;
New visualization capabilities, based on the advanced data visualization suite from NComVA (bought by Qliktech a few months ago); for potential samples see here: http://www.ncomva.se/guide/?chapter=Visualizations
In addition Qliktech is launching the “Qlik Customer Success Framework” , which includes:
Qonnect Partner Program: An extensive global network of 1500+ partners, including resellers, OEMs, technology companies, and system integrators.
Qlik Community: An online community with nearly 100,000 members comprised of customers, partners, developers and enthusiasts.
Qlik Market: An online showcase of applications, extensions and connectors.
Qoncierge: A single point of contact service offering for customers to help them access the resources they need.
Comprehensive Services: A wide range of consulting services, training and support.
Also see Ted Cuzzillo blogpost about it here: http://datadoodle.com/2013/10/09/next-for-qlik/# and Cindi Howson’s old post here: http://biscorecard.typepad.com/biscorecard/2012/05/qliktech-shares-future-product-plans-for-qlikview.html and new article here: http://www.informationweek.com/software/business-intelligence/qliktech-aims-to-disrupt-bi-again/240162403#!
September 9, 2013
Today the Tableau Customer Conference 2013 started, with 3200+ attendees from 40+ countries and 100+ industries, 700 Tableau employees, and 240 sessions. Tableau 8.1 was pre-announced today for release in fall 2013; version 8.2 is planned for winter 2014, and Tableau 9.0 for later in 2014.
Update 9/10/13: the keynote is now available recorded and online: http://www.tableausoftware.com/keynote
(Recorded Monday Sept 9, 2013 Christian Chabot, Chris Stolte and the developers LIVE)
New in 8.1: 64-bit, Integration with R, support for SAML, IPV6 and External Load Balancers, Copy/Paste Dashboards and worksheets between workbooks, new Calendar Control, own visual style, including customizing even filters, Tukey’s Box-and-Whisker Box-plot, prediction bands, ranking, visual analytics for everyone and everywhere (in the cloud now)
Planned and new for 8.2: Tableau for Mac, Story Points (a new type of worksheet/dashboard with mini-slides as story points), seamless access to data via a data connection interface to visually build a data schema, including inner/left/right/outer visual joins, beautifying column names, easier metadata, etc., and web authoring enhancements (these may get into 8.1: moving quick filters, improvements for tablets, color encoding), etc.
8.1: Francois Ajenstat announced: 64-bit finally (I have asked for that for many years) for server processes and for Desktop, support for SAML (single sign-on on Server and Desktop), IPV6, and external load balancers:
8.1: Mike Arvold announced "Visual Analytics for everyone", including the implementation of Tukey's famous box-and-whisker plot (Spotfire has had it for a while, see it here: http://stn.spotfire.com/stn/UserDoc.aspx?UserDoc=spotfire_client_help%2fbox%2fbox_what_is_a_box_plot.htm&Article=%2fstn%2fConfigure%2fVisualizationTypes.aspx ),
better forecasting, prediction bands, ranking, better heatmaps:
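For readers unfamiliar with Tukey's plot, the statistics behind it are simple: the quartiles, plus the 1.5×IQR "fence" rule for flagging outliers. Here is a quick Python sketch of the computation (toy data; not tied to Tableau's or Spotfire's implementation):

```python
import statistics

# Tukey's box-and-whisker summary: quartiles plus the 1.5*IQR fence
# rule for flagging outliers. Toy data, vendor-independent sketch.

def tukey_summary(values):
    q1, q2, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    inside = [v for v in values if lo <= v <= hi]
    return {
        "q1": q1, "median": q2, "q3": q3,
        "whisker_low": min(inside),   # whiskers end at the most
        "whisker_high": max(inside),  # extreme non-outlier points
        "outliers": [v for v in values if v < lo or v > hi],
    }

data = [12, 14, 14, 15, 16, 17, 18, 19, 21, 45]  # 45 is an outlier
s = tukey_summary(data)
print(s["median"], s["whisker_high"], s["outliers"])
```

The box is drawn from q1 to q3 with a line at the median; anything beyond the fences is plotted as an individual outlier point.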
8.1: Melinda Minch announced “fast, easy, beautiful”, most importantly copy/paste dashboards and worksheets between workbooks, customizing everything, including quick filters, new calendar control, own visual style, folders in Data Window etc…
8.2: Jason King pre-announced seamless access to data via a data connection interface to visually build a data schema, including inner/left/right/outer "visual" joins, beautifying column names, default formats, new functions like DATEPARSE, appending a data set with new tables, easier metadata, etc.
8.2: Robert Kosara introduced Story Points (using new type of worksheet/dashboard with mini-slides as story-points) for new Storytelling functionality:
Here is an example of Story Points, done by Robert:
8.2: Andrew Beers pre-announced Tableau 8.2 on MAC and he got a very warm reception from audience for that:
Chris Stolte proudly mentioned his 275-strong development team, pre-announced the upcoming Tableau releases 8.1 (this fall), 8.2 (winter 2014) and 9.0 (later in 2014), and introduced 7 "developers" (see above: Francois, Mike, Dave, Melinda, Jason, Robert and Andrew) who discussed new features during this keynote (the feature list is definitely longer and wider than the recent "innovations" we saw from Qlikview 11.2 and even from Spotfire 5.5):
Christian Chabot opened the keynote today… He said something important: current BI platforms are not fast, not easy, not beautiful and not for anyone, and they are definitely not "anywhere" but only in designated places with the appropriate IT personnel (compare with Tableau Public, Tableau Online, the free Tableau Reader etc.); they are only capable of producing a bunch of change requests from one enterprise department to another, which take a long time to implement with any SDLC framework.
Christian basically repeated what I have been saying on this blog for many years (check it here: http://apandre.wordpress.com/market/competitors/ ): traditional BI software (from SAP, IBM, Oracle, Microstrategy and even Microsoft – none of which can compete with Tableau, Qlikview and Spotfire) is obsolete and dying, and this is a very direct challenge and threat to BI vendors (I am not sure they understand that): your (BI, that is) time is over and now it is the time of Tableau (and also Qlikview and Spotfire, but they are slightly behind now…).
Update on 11/21/13: Tableau 8.1 is available today, see it here: http://www.tableausoftware.com/new-features/8.1 and Tableau Public 8.1 is available as well, see it here: http://www.tableausoftware.com/public/blog/2013/11/tableau-public-81-launches-2226
September 8, 2013
While this blog preserves my observations and thoughts, it prevents me from spending enough time reading what other people are thinking and saying, so almost 2 years ago I created an extension of this blog in the form of 2 Google+ pages, http://tinyurl.com/VisibleData and http://tinyurl.com/VisualizationWithTableau , where I accumulate reading pointers for myself and gradually read those materials when I have time.
Those 2 pages magically became extremely popular (an unintended result), with more than 5,000 Google+ followers in total as of today. For example, here is a chart showing the monthly growth of the number of followers for the main extension of this blog, http://tinyurl.com/VisibleData :
So please see below some samples of the reading pointers accumulated over the last 3 months of summer by my Google+ pages:
An author trying to simplify the BigData definition as follows: "BigData simplified: too much data to fit into a single server": http://yottascale.com/entry/the-colorful-secrets-of-bigdata-platforms
Recent talk from Donald Farmer: http://www.wired.com/insights/2013/06/touch-the-next-frontier-of-business-intelligence/
Dmitry pointing to the implementation disaster of Direct Discovery in Qlikview 11.2: http://bi-review.blogspot.com/2013/04/first-look-at-qlikview-direct-discovery.html
Specs for Tableau in Cloud: https://www.tableausoftware.com/products/online/specs
The DB-Engines Monthly Ranking ranks database management systems according to their popularity. It turns out that only 3 DBMSes are truly popular: Oracle, SQL Server and MySQL:
According to Dr. Andrew Jennings, chief analytics officer at FICO and head of FICO Labs, the three main skills of a data scientist are the same 3 skills I tried to find when hiring programmers for my teams 5, 10, 20 and more years ago: 1. Problem-solving skills. 2. Communication skills. 3. Open-mindedness. This makes all my hires over the last 20+ years data scientists, right? See it here: http://www.informationweek.com/big-data/news/big-data-analytics/3-key-skills-of-successful-data-scientis/240159803
A study finds the odds of rising to another income level are notably low in certain cities, like Atlanta and Charlotte, and much higher in New York and Boston: http://www.nytimes.com/2013/07/22/business/in-climbing-income-ladder-location-matters.html
Tableau is a prototyping tool: http://tableaufriction.blogspot.com/2013/07/the-once-and-future-prototyping-tool-of.html
Why More Data and Simple Algorithms Beat Complex Analytics Models: http://data-informed.com/why-more-data-and-simple-algorithms-beat-complex-analytics-models/
New Census Bureau Interactive Map Shows Languages Spoken in America: http://www.census.gov/newsroom/releases/archives/education/cb13-143.html
Google silently open-sourced a tool called word2vec, prepackaged deep-learning software designed to understand the relationships between words with no human guidance. It is actually similar to methods known for a decade, called PLSI and PLSA:
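The "relationships between words" claim is usually illustrated with vector arithmetic: in word2vec's vector space, vec("king") − vec("man") + vec("woman") lands near vec("queen"). Here is a toy Python sketch of that idea (the 3-dimensional vectors below are hand-made for the demo, not real word2vec output):

```python
import math

# Toy illustration of word-vector arithmetic: relationships become
# directions in vector space. Hand-made 3-d vectors, NOT real
# word2vec output (real embeddings have hundreds of dimensions).

vectors = {
    "king":  [0.9, 0.8, 0.1],
    "man":   [0.8, 0.1, 0.1],
    "woman": [0.1, 0.1, 0.8],
    "queen": [0.2, 0.8, 0.8],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# king - man + woman should point toward queen
target = [k - m + w for k, m, w in
          zip(vectors["king"], vectors["man"], vectors["woman"])]
nearest = max((w for w in vectors if w != "king"),
              key=lambda w: cosine(target, vectors[w]))
print(nearest)
```

The same nearest-neighbor-by-cosine search is how PLSI/PLSA-style latent representations are queried, which is why the comparison in the pointer above is apt.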
“Money is not the only reward of education, yet it is surely the primary selling point used to market data science programs, and the primary motivator for students. But there’s no clear definition of data science and no clear understanding of what knowledge employers are willing to pay for, or how much they will pay, now or in the future. Already I know many competent, diligent data analysts who are unemployed or underemployed. So, I am highly skeptical that the students who will invest their time and money in data science programs will reap the rewards they have been led to expect.”: http://www.forbes.com/sites/gilpress/2013/08/19/data-science-whats-the-half-life-of-a-buzzword/
Some good blog-posts from InterWorks:
Technique for using Tableau data blending to create a dynamic, data-driven “parameter”: http://drawingwithnumbers.artisart.org/creating-a-dynamic-parameter-with-a-tableau-data-blend/
More about Colors:
Russian Postcodes are collected and partially visualized:
EXASolution claims to be up to 1000 times faster than traditional databases and the fastest database in the world, based on in-memory computing.
Web interest in Tableau and Qlikview:
August 28, 2013
With the releases of Spotfire Silver and Tableau Online, and with a few Qlikview partners (but not Qliktech itself yet) moving to the Cloud to provide their Data Visualization platforms and software as a service, the attributes, parameters and concerns of such VaaS or DVaaS (Visualization as a Service) are important to understand. Below is an attempt to review those "Cloud" details at least at a high level (with the natural limitations of space and time applied to any review).
But before that, let's underscore that Clouds are not in the skies but rather in huge, weird buildings with special physical and infrastructure security, like this Data Center in Georgia:
You can see some real, old-fashioned clouds above the building, but they are not what we are talking about. Inside the Data Center you can see a lot of racks, each with 20+ servers, which together with all the secure network and application infrastructure contain these modern "Clouds":
Attributes and Parameters of mature SaaS (and VaaS as well) include:
- Multitenant and Scalable Architecture (this topic is too big and needs its own blogpost or article). You can review Tableau's whitepaper about Tableau Server scalability here: http://www.tableausoftware.com/learn/whitepapers/tableau-server-scalability-explained
- SLA – service level agreement with up-time, performance, security-related and disaster recovery metrics and certifications like SSAE16.
- UI and Management tools for User Privileges, Credentials and Policies.
- System-wide Security: SLA-enforced and monitored Physical, Network, Application, OS and Data Security.
- Protection and/or encryption of all fields/columns, or at least the sensitive ones (like SSN).
- Application Performance: Transaction processing speed, Network Latency, Transaction Volume, Webpage delivery times, Query response times
- 24/7 high availability: Facilities with reliable and backup power and cooling, Certified Network Infrastructure, N+1 Redundancy, 99.9% (or 99.99% or whatever your SLA with clients promised) up-time
- Detailed historical availability, performance and planned maintenance data with Monitoring and Operational Dashboards, Alerts and Root Cause Analysis
- Disaster recovery plan with multiple backup copies of customers' data in near real time at the disk level, and a multilevel backup strategy that includes disk-to-disk-to-tape data backup, where tape backups serve as a secondary level of backup, not as the primary disaster recovery data source.
- Fail-over that cascades from server to server and from data center to data center in the event of a regional disaster, such as a hurricane or flood.
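The up-time percentages in an SLA translate directly into allowed downtime per year; a quick stdlib-only Python sketch (assuming a 365-day, 8760-hour year) shows the big difference between "three nines" and "four nines":

```python
HOURS_PER_YEAR = 365 * 24  # 8760 hours

def allowed_downtime_hours(uptime_percent):
    """Hours of downtime per year permitted by a given up-time SLA."""
    return HOURS_PER_YEAR * (1 - uptime_percent / 100.0)

print(round(allowed_downtime_hours(99.9), 2))   # 99.9%  -> 8.76 hours/year
print(round(allowed_downtime_hours(99.99), 3))  # 99.99% -> 0.876 hours/year (~53 minutes)
```

So whatever your SLA with clients promised, make sure Engineering knows how few hours that promise actually leaves for outages and maintenance.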
While Security, Privacy, Latency and Hidden Cost are usually the biggest concerns when considering SaaS/VaaS, other Cloud concerns are surveyed and visualized below. The recent survey and diagram were published by Charlie Burns this month:
As you see above, a rack in a Data Center can contain multiple servers and other devices (like routers and switches), often redundant (at least 2, sometimes N+1). Recently I designed a hosting VaaS Data Center for Data Visualization and Business Intelligence Cloud Services, and here is a simplified version of it, just one rack as a sample.
You can see a redundant network, redundant Firewalls, redundant Switches for the DMZ (the so-called "Demilitarized Zone", where users from outside the firewall can access servers like WEB or FTP), redundant main Switches, redundant Load Balancers, redundant Tableau Servers, redundant Teradata Servers, redundant Hadoop Servers, redundant NAS servers, etc. (not all devices are shown on the diagram of this rack):
August 26, 2013
Leave a Comment
20 months ago I checked how many job openings the leading DV vendors have. On 12/5/11 Tableau had 56, Qliktech had 46 and Spotfire had 21 openings. This morning I checked their career sites again and noticed that both Tableau and Qliktech have almost doubled their thirst for new talent, while Spotfire is basically staying at the same level of hiring needs:
Qliktech has 87 openings, 29 of them are engineering positions (I included R&D, IT, Tech Support and Consulting).
TIBCO/Spotfire has 24 openings, 16 of them are engineering positions (R&D, IT, Tech.Support).
All 3 companies are public now, so I decided to include their Market Capitalization as well. Since Spotfire is hidden inside its corporate parent TIBCO, I used my estimate that Spotfire's capitalization is about 20% of TIBCO's (which is $3.81B as of 8/23/13, see https://www.google.com/finance?q=TIBX ). As a result I have these Market Capitalization numbers for 8/23/13 as the closing day:
Tableau – $4B (see https://www.google.com/finance?q=NYSE%3ADATA&sq=DATA )
Qliktech – $3.05B (see https://www.google.com/finance?q=QLIK )
Spotfire – $0.76B (my rough estimate as 20% of TIBCO's capitalization; I wish it were more for such a mature and excellent product and team, but TIBCO does not want to capitalize on the Data Visualization market. I think the best way for TIBCO to take advantage of Spotfire is to spin it off, the same way EMC ($55B Market Cap) did with VMWare (currently $37.7B)). Update on 9/10/13: Akram Annous validated my 3+ years old idea of a Spotfire spinoff in this article @SeekingAlpha: http://seekingalpha.com/article/1684032-is-tibco-dan-loebs-next-pet-project
Those 3 DV vendors together have almost $8B of market capitalization as of the evening of 8/23/13!
Market Capitalization update as of 8/31/13: Tableau: $4.3B, Qliktech $2.9B, Spotfire (as 20% of TIBCO) – $0.72B
Market Capitalization update as of 9/4/13 11pm: Tableau: $4.39B, Qliktech $3B, Spotfire (as 20% of TIBCO) – $0.75B . Also as of today Qliktech employed 1500+ (approx. $300K revenue per year per employee), Tableau about 1000 (approx. $200K revenue per year per employee) and Spotfire about 500+ (very rough estimate, also approx. $350K revenue per year per employee)
August 12, 2013
Last week Tableau increased 10-fold the capacity of Data Visualizations published with Tableau Public, to a cool 1 million rows of data (basically the same number of rows that Excel 2007, 2010 and 2013, often used as data sources for Tableau Public, can handle these days), and increased 20-fold the storage capacity of each free Tableau Public account (to 1GB of free storage); see it here:
It means that a free Tableau Public account will have twice the storage of Spotfire Silver's most expensive Analyst account (which will cost you $4500/year). Tableau said: "Consider it a gift from us to you." I have to admit that even kids in this country know that there is nothing free here, so please kid me not: we are all witnessing some kind of investment here, and this type of investment worked brilliantly in the past… And all users of Tableau Public are investing too, with their time and learning efforts.
And this is not all: “For customers of Tableau Public Premium, which allows users to save locally and disable download of their workbooks, the limits have been increased to 10 million rows of data at 10GB of storage space” see it here:
http://www.tableausoftware.com/about/press-releases/2013/tableau-software-extends-tableau-public-1-million-rows-data without changing the price of the service (of course, the Tableau Public Premium price is not fixed and depends on the number of impressions).
Out of 100+ million Tableau Public users, only 40,000 qualified to be called Tableau Authors, see it here http://www.tableausoftware.com/about/press-releases/2013/tableau-software-launches-tableau-public-author-profiles so they consume Tableau Public's storage more actively than others. As an example, you can see my Tableau Author Profile here: http://public.tableausoftware.com/profile/andrei5435#/ .
I will assume those Authors will consume 40,000GB of online storage, which will cost Tableau Software less than (my guess; I am open to correction from blog visitors) $20K/year just for the storage part of the Tableau Public service.
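Working backward from my own guess above (the $20K/year is my assumption, not a Tableau number), the implied storage price per GB-month is easy to check:

```python
# Back-of-the-envelope check of the storage-cost guess above.
# Both numbers are assumptions from the text, not published Tableau figures.
authors_storage_gb = 40000   # 40,000 Authors x 1GB each
yearly_budget_usd = 20000    # my $20K/year guess

implied_price_per_gb_month = yearly_budget_usd / (authors_storage_gb * 12)
print(round(implied_price_per_gb_month, 4))  # ~$0.0417 per GB-month
```

That is in the ballpark of bulk cloud-storage pricing, so the guess is at least plausible.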
During the last week another important announcement, on 8/8/13 – Quarterly Revenue – came from Tableau: it reported Q2 revenue of $49.9 million, up 71% year-over-year: http://investors.tableausoftware.com/investor-news/investor-news-details/2013/Tableau-Announces-Second-Quarter-2013-Financial-Results/default.aspx .
Please note that 71% is extremely good YoY growth compared with the entire anemic "BI industry", but less than the 100% YoY at which Tableau grew in its private past.
All these announcements happened simultaneously with a magical (I have no theory why this happened; one weak theory is investor madness and over-excitement about the Q2 revenue of $49.9M announced on 8/8/13) and sudden increase of the nominal price of Tableau stock (under the DATA symbol on NYSE) from $56 (which is already high) on August 1st, 2013 (the announcement of 1 million rows / 1GB storage for Tableau Public accounts) to $72+ today:
It means that the Market Capitalization of Tableau Software may be approaching $4B while sales may be $200M/year. For comparison, Tableau's direct and more mature competitor Qliktech now has a capitalization below $3B while its sales approach almost $500M/year. From the Market Capitalization point of view, in 3 months Tableau went from a private company to the largest publicly-traded Data Visualization software company on the market!
July 24, 2013
Competition in the Data Visualization market is not only about features, market share and mindshare but also about pricing and licensing. For example, Qlikview licensing and pricing have been public for a while here: http://www.qlikview.com/us/explore/pricing and Spotfire Silver pricing has been public for a while too: https://silverspotfire.tibco.com/us/silver-spotfire-version-comparison .
Tableau Desktop has 3 editions: Public (free), Personal ($999) and Professional ($1999), see it here: http://www.tableausoftware.com/public/comparison ; in addition, you can have the full Desktop (read-only) experience with the free Tableau Reader (neither Qlikview nor Spotfire has a free reader for server-less, unlimited distribution of visualizations, which makes Tableau a mindshare leader right away…)
The release of Tableau Server online hosting this month: http://www.tableausoftware.com/about/press-releases/2013/tableau-unveils-cloud-business-intelligence-product-tableau-online heated up the licensing competition and may force large changes in the licensing landscape for Data Visualization vendors. Tableau Server has existed in the cloud for a while with tremendous success as Tableau Public (free) and Tableau Public Premium (formerly Tableau Digital, with its weird pricing based on "impressions").
But Tableau Online is much more disruptive for the BI market: for $500/year/user you can get a complete Tableau Server site (administered by you!) in the cloud with (initially) 25 users authenticated by you (it can grow) and 100GB of cloud storage for your visualizations, which is 200 times more than you can get with the $4500/year top-of-the-line Spotfire Silver "Analyst" account. This Tableau Server site will be managed in the cloud by Tableau Software's own experts and requires no IT personnel on your side! You may also compare it with http://www.rosslynanalytics.com/rapid-analytics-platform/applications/qlikview-ondemand .
A solution hosted by Tableau Software is particularly useful when sharing dashboards with customers and partners, because the solution is secure but outside a company's firewall. In the case of Tableau Online, users can publish interactive dashboards to the web and share them with clients or partners without granting behind-the-firewall access.
Since Tableau 8 has the new Data Extract API, you can do all data refreshes behind your own firewall and republish your TDE files to the cloud anytime you need (even automatically, on demand or on schedule). Tableau Online has no minimum number of users and can scale as a company grows. At any point, a company can migrate to Tableau Server to manage it in-house. Here is an introductory video about Tableau Online: Get started with Tableau Online.
Tableau Server in the cloud provides at least 3 ways to update your data (for more details see here: http://www.tableausoftware.com/learn/whitepapers/tableau-online-understanding-data-updates )
- republish Data Source or Workbook;
- Use Command-Line Tools to Schedule Batch Updates Locally and automatically, see more details here: http://onlinehelp.tableausoftware.com/current/pro/online/en-us/help.html#extracting_TDE.html;
- refresh the Tableau Server Data Source using your Tableau Desktop as a proxy between your on-premise Data Source and Tableau Online:
Here is another, more lengthy intro into Tableau BI in the Cloud:
Tableau as a Service is a step in the right direction, but be cautious: in practice, the architecture of the hosted version could impact performance. Plus, the nature of the product means that Tableau isn't really able to offer features like pay-as-you-go that have made cloud-based software popular with workers. By their nature, data visualization products require access to data. Businesses that store their data internally must publish their data to Tableau's servers. That can be a problem for businesses that have large amounts of data or that are prevented from shifting their data off premises for legal or security reasons. It could also create a synchronization nightmare, as workers play with data hosted at Tableau that may not be as up-to-date as internally stored data. Depending on the location of the customer relative to Tableau's data center, data access could be slow.
And finally, the online version requires the desktop client, which costs $2,000. Tableau may implement Tableau Desktop's analytical features in a browser in the future, while continuing to support the desktop and on-premise model to meet the security and regulatory requirements facing some customers.
July 11, 2013
Leave a Comment
I got many questions from this Data Visualization blog's visitors about the differences between compensation for full-time employees and contractors. It turned out that many visitors are actually contractors, hired because of their Tableau, Qlikview or Spotfire skills, and some visitors are considering converting to consulting, or vice versa: from consulting to full-timers. I am not an expert in all these compensation and especially benefits-related questions, but I promised myself that my blog will be driven by visitors' requests, so I googled a little about contractor vs. full-time compensation, and below is a brief description of what I got:
The Federal Insurance Contributions Act mandates a payroll tax split between employer and employee (each paying 6.2% Social Security, capped at $7,049.40, plus 1.45% Medicare on all income), with the total (2013) at 15.3% of gross compensation.
In addition, you have to take into account the employer's contribution to the employee's medical benefits (for a family it is about $1000 per month), unemployment taxes, the employer's contribution to a 401(k), STD and LTD (short- and long-term disability insurance), pension plans, etc.
I also added into my estimate of the contractor rate a "protection" for at least a 1-month gap between contracts, and 1 month of salary as a bonus for full-time employees.
Basically, the result of my minimal estimate is as follows: as a contractor you need to get a rate at least 50% higher than the base hourly rate of a full-time employee. This base hourly rate of a full-time employee I calculate as the employee's base salary divided by 1872 hours: 52 weeks × 40 hours = 2080 hours, minus 208 hours (3 weeks of vacation + 5 sick days + 6 holidays, a minimum for reasonable PTO, Personal Time Off) = 1872 working hours per year.
I did not take into account any variations related to the usage of W2 or 1099 forms or corp-to-corp arrangements, and many other fine details (like relocation requirements and the overhead associated with middlemen like headhunters and recruiters) that differentiate the compensation of a full-time employee from that of a consultant working on contract. This is just my rough estimate; please consult with experts and do not ask me any questions related to MY estimate, which is this:
- Contractor Rate should be 150% of the base rate of a FullTimer
In general, using contractors (especially for business analytics) instead of full-timers is basically the same mistake as outsourcing and off-shoring: companies doing that do not understand that their main assets are their full-time people. Contractors are usually not as engaged, and they are not in business to preserve the intellectual property of the company.
For reference, see the results of the Dr. Dobb's 2013 Salary Survey for Software Developers, which are very comparable with the salaries of Qlikview, Tableau and Spotfire developers and consultants (only in my experience, salaries of Data Visualization consultants are 10-15% higher than salaries of software developers):
This means that for 2013 the average rate for Qlikview, Tableau and Spotfire developers and consultants should be around 160% of the base rate of an average full-timer, which ESTIMATES the effective equivalent pay to a contractor for 1872 hours per year at $155,200, and this is only for an average consultant... If you take less, then somebody tricked you, but if you read the above, you already know that.
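The estimate above can be sketched in a few lines of Python. The $97,000 average base salary below is my assumption, chosen so that 160% of it reproduces the $155,200 figure in the text:

```python
# 52 weeks x 40 hours, minus 3 weeks vacation, 5 sick days, 6 holidays:
WORKING_HOURS = 52 * 40 - (3 * 40 + 5 * 8 + 6 * 8)  # 2080 - 208 = 1872

def base_hourly_rate(base_salary):
    """Full-timer's base hourly rate, per the 1872-hour year above."""
    return base_salary / WORKING_HOURS

def contractor_rate(base_salary, markup=1.5):
    """Minimal contractor rate: 150% of base (160% for DV specialists)."""
    return base_hourly_rate(base_salary) * markup

avg_salary = 97000  # assumed average full-timer salary (my assumption)
print(WORKING_HOURS)                                            # 1872
print(round(contractor_rate(avg_salary, 1.6) * WORKING_HOURS))  # 155200
```

So a contractor billing 160% of a $97K full-timer's base rate for a full 1872-hour year lands exactly on the $155,200 effective equivalent pay.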
June 7, 2013
Leave a Comment
2400 years ago the concept of Data Visualization was less known, but even then Plato said "Those who tell stories rule society".
I witnessed multiple times how storytelling triggered Venture Capitalists (VCs) to invest. Usually my CEO (the biggest BS master on our team) would start with a 60-second Story (VCs call them "Elevator Pitches"), and then (if interested) the VCs would do long due-diligence research of the Data (and specs, docs and code) presented by our team, and after that they would spend comparable time analyzing Data Visualizations (charts, diagrams, slides, etc.) of our Data, trying to prove or disprove the original Story.
Some of the conclusions from all this startup storytelling activity were:
Data: without Data nothing can be proved or disproved (Action needs Data!)
View: best way to analyze Data and trust it is to Visualize it (Seeing is Believing!)
Discovery of Patterns: visually discoverable trends, outliers, clusters etc. which form the basis of the Story and follow-up actions
Story: the Story (based on that Data) is the Trigger for the Actions (Story shows the Value!),
Action(s): start with drilldown to a needle in haystack, embed Data Visualization into business, it is not an Eye Candy but a practical way to improve the business
Data Visualization has 5 parts: Data (main), View (enabler), Discovery (visually discoverable Patterns), Story (trigger for Actions) and finally the 5th Element – Action!
Life is not fair: Storytellers were the people who benefited the most in the end… (no Story, no Glory!).
And yes, Plato was correct – at least partially and for his time. Diagram above uses analogy with 5 Classical Greek Elements. Plato wrote about four classical elements (earth, air, water, and fire) almost 2400 years ago (citing even more ancient philosopher) and his student Aristotle added a fifth element, aithêr (aether in Latin, “ether” in English) – both men are in the center of 1st picture above.
Back to our time: storytelling is a hot topic; enthusiasts say that "Data is easy, good storytelling is the challenge" http://www.resource-media.org/data-is-easy/#.URVT-aVi4aE or even that "Data Science is Storytelling": http://blogs.hbr.org/cs/2013/03/a_data_scientists_real_job_sto.html . Nothing can be further from the truth: my observation is that most Storytellers (with a few known exceptions like Hans Rosling or Tableau founder Pat Hanrahan) ARE NOT GOOD at visualizing, but they still wish to participate in our hot Data Visualization party. All I can say is "Welcome to the party!"
It may be a challenge for me and you, but not for the people who held a conference about storytelling this winter, on 2/27/13 in Nashville, TN: http://www.tapestryconference.com/ :
Some more reasonable people refer to storytelling as data journalism and narrative visualization: http://www.icharts.net/blogs/2013/pioneering-data-journalism-simon-rogers-storytelling-numbers
Tableau founder Pat Hanrahan recently talked about “Showing is Not Explaining”. In parallel, Tableau is planning (after version 8.0) to add features that support storytelling by constructing visual narratives and effective communication of ideas, see it here:
Collection of resources on storytelling topic can be found here: http://www.juiceanalytics.com/writing/the-ultimate-collection-of-data-storytelling-resources/
You may also check what Stephen Few thinks about it here: http://www.perceptualedge.com/blog/?p=1632
Storytelling as an important part (using Greek Analogy – 4th Classical Element (Air) after Data (Earth), View (Water) and Discovery (Fire) and before Action (Aether) ) of Data Visualization has a practical effect on Visualization itself, for example:
if Data View is not needed for Story or for further Actions, then it can be hidden or removed,
if the number of Data Views in a Dashboard is diluting the impact of the (preferably short) Data Story, then the number of Views should be reduced (usually to 2 or 3 per dashboard),
if the number of DataPoints per View is too large and weakens the triggering power of the Story, then it can be reduced too (in conversations with Tableau, they even recommend 5000 DataPoints per View as the threshold between local and server-based rendering).
May 31, 2013
Leave a Comment
Stephen Few asks you a tricky question: Are You a Data Scientist?: http://www.perceptualedge.com/blog/?p=1719 . It seems to me that Stephen agrees with me that the 3 terms Data Scientist, Data Science and Big Data are full of BS… or, as our VP would say, "a bunch of malarkey"…
3 very important articles by Russell Christopher about History Tables in Tableau 8:
A funny "Preview" by Stephen Few of Tableau 9: Gauges?!: http://www.perceptualedge.com/blog/?p=1706
Leverage PowerShell and Tableau to Extract Server Datasources: https://www.interworks.com/blogs/mroberts/2013/05/28/leverage-powershell-and-tableau-extract-server-datasources
Tableau 7 data dictionary: http://tableaulove.tumblr.com/post/50438516817/better-late-than-never-tableau-7-data-dictionary
Collection of Good Practices, provided by Tableau: http://www.tableausoftware.com/public/community/best-practices
Tableau Server has many built in features to promote data exploration, collaboration, and security. The Data Server is arguably the most powerful of these tools but is commonly overlooked and underutilized. Answer these questions to see how the Data Server can save you time and increase productivity. http://www.tableausoftware.com/about/blog/2013/4/unleash-tableau-data-server-23038
Using Tableau 8 to Leverage Your ‘Big-ish’ Data: https://www.interworks.com/blogs/mroberts/2013/04/08/using-tableau-8-leverage-your-big-ish-data
“New in 8: Word Clouds”: http://www.tableausoftware.com/public/blog/2013/02/new-8-word-clouds-1825
“What Makes a Chart Boring?”: http://www.perceptualedge.com/blog/?p=1612
Emotional reactions to Tableau 8: http://www.datarevelations.com/dylans-gone-electric-emotional-reactions-to-tableau-8.html
Leverage Multiple Tableau Data Extracts for Big Data in Tableau 8: : https://www.interworks.com/blogs/mroberts/2013/04/25/leverage-multiple-tableau-data-extracts-big-data-tableau-8
Leverage the Tableau 8 Tableau Data Extract Command-Line Utility Without Tabcmd : https://www.interworks.com/blogs/mroberts/2013/04/15/leverage-tableau-8-tableau-data-extract-command-line-utility-without-tabcm
A very popular article from Stephen Few about Tableau losing the clear vision of its youth collected 49 comments from known experts: http://www.perceptualedge.com/blog/?p=1532 and, most interestingly, the 50th comment came from Stephen himself: "I recently learned that when my review of Tableau 8 was published, Tableau employees were forbidden from responding publicly".
UK Boundary Polygon Data from Information Lab: http://www.theinformationlab.co.uk/2013/03/25/uk-area-polygon-mapping-in-tableau/
Sample of how to use DataExtract API with C#/.NET: http://community.tableausoftware.com/message/203714#203714
Tableau Data Blending, Sparse Data, Multiple Levels of Granularity, and Improvements in Version 8: http://drawingwithnumbers.artisart.org/tableau-data-blending-sparse-data-multiple-levels-of-granularity-and-improvements-in-version-8/
Tableau Custom SQL connections through the JET driver: https://www.interworks.com/blogs/jwright/2013/02/22/musings-tableau-custom-sql-connections-through-jet-driver
Software company XL Cubed has incorporated Bandlines into their product, but comments to the post show that Tableau has had it for a while too: http://www.perceptualedge.com/blog/?p=1485
Stephen Few introduced Bandlines as Sparklines Enriched with Information about Magnitude and Distribution: http://www.perceptualedge.com/articles/visual_business_intelligence/introducing_bandlines.pdf
May 24, 2013
Below you can find samples of guidelines and good practices for Data Visualization (mostly with Tableau) which I used recently.
Some of these samples are Tableau-specific, but others (maybe with modifications) can be reused for other Data Visualization platforms and tools. I will appreciate feedback, comments and suggestions.
Naming Convention for Tableau Objects
Use CamelCase Identifiers: Capitalize the 1st letter of each concatenated word
Use Suffix for Identifiers with preceding underscore to indicate the type (example: _tw for workbooks).
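The convention above can be checked mechanically; here is a minimal Python sketch. The `_tw` suffix for workbooks comes from the text; `_tds` for data sources is my hypothetical addition for illustration:

```python
import re

# CamelCase identifier followed by an underscore type suffix:
# "_tw" for workbooks (from the text), "_tds" is a hypothetical example.
NAME_RE = re.compile(r'^(?:[A-Z][a-z0-9]*)+_(tw|tds)$')

def is_valid_name(name):
    """True if the name follows the CamelCase + type-suffix convention."""
    return NAME_RE.match(name) is not None

assert is_valid_name("SalesDashboard_tw")
assert not is_valid_name("sales_dashboard_tw")  # not CamelCase
assert not is_valid_name("SalesDashboard")      # missing type suffix
```

A check like this can run as part of publishing automation, so badly named workbooks never reach the server.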
Workbook Sizing Guidelines
Use less than 5 charts per Dashboard; minimize the number of visible tabs/worksheets.
Move Calculations and Functions from Workbook to the Data.
Use less than 5000 Data-points per Chart/Dashboard to enable Client-side rendering.
To enable Shared Sessions, don’t use filters and interactivity if it is not needed.
Guidelines for Colors, Fonts, Sizes
To express desirable/undesirable points, use green for good, red for bad, yellow for warning.
When you are not describing a "good-bad situation" (thanks to feedback from a visitor under the alias "SF"), try to use pastel, neutral and colorblind-friendly colors, e.g. similar to the "Color Blind 10" palette from Tableau.
Use “web-safe” fonts, to approximate what users can see from Tableau Server.
Use either auto-resize or standard (target smaller screen) sizes for Dashboards
Data and Data Connections used with Tableau
Try to avoid pulling more than 15000 rows for Live Data Connections.
For Data Extract-based connections 10M rows is the recommended maximum.
For widely distributed Workbooks, use Application IDs instead of Personal Credentials.
Job failure due to expired credentials leads to suspension from the Schedule, so try to keep embedded credentials up to date.
Tableau Data Extracts (TDE)
If a Refresh of a TDE takes more than 2 hours, consider redesigning it.
Reuse and share TDEs and Data Sources as much as possible.
Use Incremental Data Refresh instead of Full Refresh when possible.
Designate Unique ID for each row when Incremental Data Refresh is used.
Try to use the free Tableau Data Extract API instead of a licensed Tableau Server to create Data Extracts.
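The reason for the Unique ID rule above can be illustrated with a few lines of plain Python (this is only the logic, not the actual Tableau refresh mechanism): the extract keeps a high-water-mark ID, and an incremental refresh appends only rows above it:

```python
# Incremental-refresh logic sketch; rows are plain dicts, the "id" field
# stands in for the unique, increasing row ID the guideline asks for.
def incremental_refresh(extract_rows, source_rows, id_field="id"):
    """Append only source rows whose ID is above the extract's high-water mark."""
    high_water = max((r[id_field] for r in extract_rows), default=0)
    new_rows = [r for r in source_rows if r[id_field] > high_water]
    return extract_rows + new_rows

extract = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
source = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}, {"id": 3, "v": "c"}]
print(len(incremental_refresh(extract, source)))  # 3 (only one new row appended)
```

Without a unique, monotonically increasing ID there is no high-water mark, and the refresh would either duplicate rows or miss them, which is why a full refresh becomes the only safe (and slow) option.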
Scheduling of Background Tasks with Tableau
Serial Schedules are recommended; avoid the usage of hourly Schedules.
Avoid scheduling during peak hours (8am-6pm), consider weekly instead of daily schedules.
Optimize Schedule size; group tasks related to the same project into one Schedule; if total task execution exceeds 8 hours, split the Schedule into a few with similar names but preferably with different starting times.
Maximize the usage of Monthly and Weekly Schedules (as opposed to Daily Schedules) and the usage of weekends and nights.
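The 8-hour splitting rule above can be sketched as a simple greedy partition; the task durations here are hypothetical hours, and task order is preserved:

```python
# Greedy split of a project's tasks into schedules of at most 8 hours each.
MAX_HOURS = 8

def split_schedule(task_hours):
    """Partition an ordered list of task durations into <=8-hour schedules."""
    schedules, current, total = [], [], 0
    for t in task_hours:
        if current and total + t > MAX_HOURS:
            schedules.append(current)  # close the current schedule
            current, total = [], 0
        current.append(t)
        total += t
    if current:
        schedules.append(current)
    return schedules

# 14 hours of tasks split into two schedules (e.g. "ProjectX-1", "ProjectX-2"):
print(split_schedule([3, 4, 2, 5]))  # [[3, 4], [2, 5]]
```

Each resulting sub-list would become one Schedule with a similar name and a different (non-overlapping) starting time, per the guideline.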
Guidelines for using Charts
Use Bars to compare across categories, use Colors with Stacked or Side-by-Side Bars for deeper Analysis
Use Line for Viewing Trends over time, consider Area Charts for Multi-lines
Minimize the usage of Pie Charts; when appropriate, use them for showing proportions. It is recommended to limit pie wedges to six.
Use Map to show geocoded data, consider use maps as interactive filters
Use Scatter to analyze outliers, clusters and construct regressions
You can find more about Guidelines and Good Practices for Data Visualization here: http://www.tableausoftware.com/public/community/best-practices
April 2, 2013
Leave a Comment
Tableau Software filed for an IPO on the New York Stock Exchange under the symbol "DATA". In sharp contrast to other business-software makers that have gone public in the past year, Tableau is profitable, despite hiring a huge number of new employees. For the years ended December 31, 2010, 2011 and 2012, Tableau's total revenues were $34.2 million, $62.4 million and $127.7 million, respectively. The number of full-time employees increased from 188 as of December 31, 2010 to 749 as of December 31, 2012.
Tableau’s biggest shareholder is venture capital firm New Enterprise Associates, with a 38 percent stake. Founder Pat Hanrahan owns 18 percent, while co-founders Christopher Stolte and Christian Chabot, who is also chief executive officer, each own more than 15 percent. Meritech Capital Partners controls 6.4 percent. Tableau recognized three categories of Primary Competitors:
large suppliers of traditional business intelligence products, like IBM, Microsoft, Oracle and SAP AG;
spreadsheet software providers, such as Microsoft Corporation;
business analytics software companies: Qlik Technologies Inc. and TIBCO Spotfire.
Update 4/29/13: This news may be related to the Tableau IPO: I understand that Microstrategy's growth cannot be compared with the growth of Tableau or even Qliktech. But to go below the average "BI market" growth? Or even a 6% or 24% decrease? What is going on here: "First quarter 2013 revenues were $130.2 million versus $138.3 million for the first quarter of 2012, a 6% decrease. Product licenses revenues for the first quarter of 2013 were $28.4 million versus $37.5 million for the first quarter of 2012, a 24% decrease."
Update 5/6/13: Tableau Software Inc. will sell 5 million shares, while shareholders will sell 2.2 million shares, Tableau said in an amended filing with the U.S. Securities and Exchange Commission. The underwriters have the option to purchase up to an additional 1,080,000 shares. It means the total can be 8+ million shares for sale.
The company expects its initial public offering to raise up to $215.3 million at a price of $23 to $26 per share. If this happens, it will create a public company with a large capitalization, so Qliktech and Spotfire will have an additional problem to worry about. This is how QLIK (blue line), TIBX (red) and MSTR (orange line) stock behaved during the last 6 weeks after the release of Tableau 8 and the official Tableau IPO announcement:
Update 5/16/13: According to this article at Seeking Alpha (also see the S-1 Form), Tableau Software Inc. (symbol "DATA") has scheduled a $176 million IPO with a market capitalization of $1.4 billion for Friday, May 17, 2013. Tableau's March-quarter sales were up 60% from the March '12 quarter. Qliktech's sales were up only 23% on a similar comparative basis.
According to another article, Tableau raised its IPO price and may reach a capitalization of $2B by the end of Friday, 5/17/13. That is almost comparable with the capitalization of Qliktech…
Update 5/17/13: Tableau's IPO offer price was $31 per share, but it started today at $47 and finished the day at $50.75 (raising $400M in one day), with an estimated Market Cap around $3B (or more?). It is hard to understand the market: Tableau stock (symbol: DATA) finished its first day above $50 with a Market Capitalization higher than QLIK, which today has a Cap of $2.7B, even though Qliktech has almost 3 times more sales than Tableau!
For comparison, MSTR today has a Cap of $1.08B and TIBX has a Cap of $3.59B. While I like Tableau, today proved that most investors are crazy, if you compare the numbers in this simple table:
| Symbol | Market Cap, $B, as of 5/17/13 | Revenue, $M, as of 3/31/13 (trailing 12 months) | FTE (Full Time Employees) |
|---|---|---|---|
| DATA | between $2B and $3B? | 143 | 834 |
See the interview with the Co-Founder of Tableau Software, Christian Chabot – he discusses taking the company public with Emily Chang on Bloomberg Television’s “Bloomberg West.” However, it makes me sad that Tableau’s CEO implies Tableau is ready for big data, which is not true.
Here are some pictures of the Tableau team at the NYSE: http://www.tableausoftware.com/ipo-photos and here is the announcement about “closing IPO”.
The initial public offering gave Tableau $254 million (preliminary estimate).
March 21, 2013
Today Tableau 8 was released with 90+ new features (actually it may be more than 130) after an exhausting 6+ months of Alpha and Beta testing with 3900+ customers as Beta testers! I personally expected it 2 months ago, but I would rather have it with fewer bugs, so I have no problem with the delay. During this “delay” Tableau Public achieved a phenomenal milestone: 100 million users…
Tableau 8 introduced:
- web and mobile authoring,
- added access to new data sources: Google Analytics, Salesforce.com, Cloudera Impala, DataStax Enterprise, Hadapt, Hortonworks Hadoop Hive, SAP HANA, and Amazon Redshift.
- New Data Extract API that allows programmers to load data from anywhere into Tableau – it also makes certain parts of Tableau licensing ridiculous, because consuming licenses (for example core licenses) for background tasks should now be set free.
- Local Rendering: leveraging the graphics hardware acceleration available on ordinary computers. Tableau 8 Server dynamically determines where rendering will complete faster – on the server or in the browser – and acts accordingly. Dashboards now render views in parallel when possible.
Tableau Software plans to add in next versions (after 8.0) some very interesting and competitive features, like:
- Direct query of large databases, quick and easy ETL and data integration.
- Tableau on a Mac and Tableau as a pure Cloud offering.
- Make statistical & analytical techniques accessible (I wonder if it means integration with R?).
- Tableau founder Pat Hanrahan recently talked about ”Showing is Not Explaining”, so Tableau plans to add features that support storytelling through constructing visual narratives and effective communication of ideas.
I did not see on Tableau’s roadmap some very long-overdue features like a 64-bit implementation (currently all Tableau Server processes, except one, are 32-bit!), a Server implementation on Linux (we do not want to pay Windows 2012 Server CAL taxes to Bill Gates) or a direct mention of integration with R like Spotfire has – I hope those planning and strategic mistakes will not impact the upcoming IPO.
I personally think that Tableau has to stop its ridiculous practice of consuming 1 core license per each Backgrounder server process; since the Tableau Data Extract API is free, all Tableau Backgrounder processes should be free too and should be able to run on any hardware and even any OS.
Tableau 8 managed to get negative feedback from the famous Stephen Few, who questioned Tableau’s ability to stay on course. His unusually long blog post “Tableau Veers from the Path” attracted an enormous amount of comments from all kinds of Tableau experts. I will be cynical here and notice that there is no such thing as negative publicity, and more publicity is better for the upcoming Tableau IPO.
February 23, 2013
Leave a Comment
The most popular (among business users) approach to visualization is to use a Data Visualization (DV) tool like Tableau (or Qlikview or Spotfire), where a lot of features are already implemented for you. Recent proof of this amazing popularity is that at least 100 million people (as of February 2013) used Tableau Public as their Data Visualization tool of choice, see
However, to make your documents and stories (and not just your data visualization applications) driven by your data, you may need the other approach – to code the visualization of your data into your story, and visualization libraries like the popular D3 toolkit can help you. D3 stands for “Data-Driven Documents”. The author of D3, Mr. Mike Bostock, designs interactive graphics for The New York Times – one of the latest samples is here:
and the NYT allows him to do a lot of Open Source work, which he demonstrates at his website here:
Mike was a “visualization scientist” and a computer science PhD student at Stanford University and a member of a famous group of people, now called the “Stanford Visualization Group”:
This Visualization Group was the birthplace of Tableau’s prototype – sometimes they called it ”a Visual Interface” for exploring data, and its other name is Polaris:
- and Mike Bostock was one of the main co-authors of Protovis, the group’s earlier visualization toolkit. Less than 2 years ago the Visualization Group suddenly stopped developing Protovis and recommended that everybody switch to the D3 library
authored by Mike. This library is Open Source (only 100KB in ZIP format) and can be downloaded from here:
Most successful early D3 adopters combine 3+ mindsets: programmer, business analyst, data artist and sometimes even data storyteller. For your programmer’s mindset, you may be interested to know that D3 has a large set of plugins, see:
and a rich API, see https://github.com/mbostock/d3/wiki/API-Reference
You can find hundreds of D3 demos, samples, examples, tools, products and even a few companies using D3 here: https://github.com/mbostock/d3/wiki/Gallery
January 31, 2013
Leave a Comment
Best of the Tableau Web… December 2012:
Top 100 Q4 2012 from Tableau Public:
eBay’s usage of Tableau as the front-end for big data, Teradata and Hadoop with 52 petabytes of
data on everything from user behavior to online transactions to customer shipments and much more:
Why The Information Lab recommends Tableau Software:
Fun with #Tableau Treemap Visualizations
Talk slides: Tableau, SeaVis meetup & Facebook, Andy Kirk’s Facebook Talk from Andy Kirk
Usage of RAM, Disk and Data Extracts with Tableau Data Engine:
Migrating Tableau Server to a New Domain
IFNULL – is not “IF NULL”, is “IF NOT NULL”
Worksheet and Dashboard Menu Improvements in Tableau 8:
Jittery Charts – Why They Dance and How to Stop Them:
Tableau Forums Digest #8
Tableau Forums Digest #9
Tableau Forums Digest #10
Tableau Forums Digest #11
implementation of bandlines in Tableau by Jim Wahl (+ Workbook):
January 21, 2013
This is the Part 2 of the guest blog post: the Review of Visual Discovery products from Advizor Solutions, Inc., written by my guest blogger Mr. Srini Bezwada (his profile is here: http://www.linkedin.com/profile/view?id=15840828 ), who is the Director of Smart Analytics, a Sydney based professional BI consulting firm that specializes in Data Visualization solutions. Opinions below belong to Mr. Srini Bezwada.
ADVIZOR’s Visual Discovery™ software is built upon strong data visualization technology spun out of a distinguished research heritage at Bell Labs that spans nearly two decades and produced over 20 patents. Formed in 2003, ADVIZOR has succeeded in combining its world-leading data visualization and in-memory data management expertise with extensive usability knowledge and cutting-edge predictive analytics to produce an easy-to-use, point-and-click product suite for business analysis.
ADVIZOR readily adapts to business needs without programming and without implementing a new BI platform, leverages existing databases and warehouses, and does not force customers to build a difficult, time consuming, and resource intensive custom application. Time to deployment is fast, and value is high.
With ADVIZOR, data is loaded into a “Data Pool” in main memory on a desktop, laptop, or server. This enables sub-second response times on any query against any attribute in any table, and instantaneous updates of all visualizations. Multiple tables of data are easily imported from a variety of sources.
With ADVIZOR, there is no need to pre-configure data. ADVIZOR accesses data “as is” from various data sources, and links and joins the necessary tables within the software application itself. In addition, ADVIZOR includes an Expression Builder that can perform a variety of numeric, string, and logical calculations as well as parse dates and roll-up tables – all in-memory. In essence, ADVIZOR acts like a data warehouse, without the complexity, time, or expense required to implement a data warehouse! If a data warehouse already exists, ADVIZOR will provide the front-end interface to leverage the investment and turn data into insight.
Data in the memory pool can be refreshed from the core databases / data sources “on demand”, or at specific time intervals, or by an event trigger. In most production deployments data is refreshed daily from the source systems.
ADVIZOR’s Visual Discovery™ is a full visual query and analysis system that combines the excitement of presentation graphics – used to see patterns and trends and identify anomalies in order to understand “what” is happening – with the ability to probe, drill-down, filter, and manipulate the displayed data in order to answer the “why” questions. Conventional BI approaches (pre-dating the era of interactive Data Visualization) to making sense of data have involved manipulating text displays such as cross tabs, running complex statistical packages, and assembling the results into reports.
ADVIZOR’s Visual Discovery™ makes the text and graphics interactive. Not only can the user gain insight from the visual representation of the data, but additional insight can be obtained by interacting with the data in any of ADVIZOR’s fifteen (15) interactive charts, using color, selection, filtering, focus, viewpoint (panning, zooming), labeling, highlighting, drill-down, re-ordering, and aggregation.
Flight Recorder – Track, Save, Replay your Analysis Steps
The Flight Recorder tracks each step in a selection and analysis process. It provides a record of those steps and can be used to repeat previous actions. This is critical for providing context about what an end-user has done and where they are in their data. The Flight Recorder also allows setting bookmarks, and recordings can be saved and shared with other ADVIZOR users.
The Flight Recorder is unique to ADVIZOR. It provides:
• A record of what a user has done. Actions taken and selections from charts are listed. Small images of charts that have been used for selection show the selections that were made.
• A place to collect observations by adding notes and capturing images of other charts that illustrate observations.
• A tool that can repeat previous actions, in the same session on the same data or in a later session with updated data.
• The ability to save and name bookmarks, and share them with other users.
Predictive Analytics Capability
The ADVIZOR Analyst/X is a predictive analytic solution based on a robust multivariate regression algorithm developed by KXEN – a leading-edge advanced data mining tool that models data easily and rapidly while maintaining relevant and readily interpretable results.
Visualization empowers the analyst to discover patterns and anomalies in data by noticing unexpected relationships or by actively searching. Predictive analytics (sometimes called “data mining”) provides a powerful adjunct to this: algorithms are used to find relationships in data, and these relationships can be used with new data to “score” or “predict” results.
Predictive analytics software from ADVIZOR doesn’t require enterprises to purchase new platforms. And, since all the data is in-memory, the Business Analyst can quickly and easily condition data and flag fields across multiple tables without having to go back to IT or a DBA to prep database tables. The interface is entirely point-and-click; there are no scripts to write. The biggest benefit of the multi-dimensional visual solution is how quickly it delivers analysis, solving critical business questions, facilitating intelligence-driven decision making, and providing instant answers to “what if?” questions.
Advantages over Competitors:
• The only product in the market offering a combination of Predictive Analytics + Data Visualisation + In memory data management within one Application.
• The cost of entry is lower than the market leading data visualization vendors for desktop and server deployments.
• Advanced Visualizations like Parabox, Network Constellation in addition to normal bar charts, scatter plots, line charts, Pie charts…
• Integration with leading CRM vendors like Salesforce.com, Blackbaud, Ellucian, Information Builder
• Ability to provide sub-second response times on any query against any attribute in any table, and instantaneously update all visualizations.
• Flight recorder that lets you track, replay, and save your analysis steps for reuse by yourself or others.
Update on 5/1/13 (by Andrei): ADVIZOR 6.0 is available now with substantial enhancements: http://www.advizorsolutions.com/Bnews/tabid/56/EntryId/215/ADVIZOR-60-Now-Available-Data-Discovery-and-Analysis-Software-Keeps-Getting-Better-and-Better.aspx
January 11, 2013
If you visited my blog before, you know that my classification of Data Visualization and BI vendors is different from that of researchers like Gartner. In addition to the 3 DV Leaders – Qlikview, Tableau, Spotfire – I rarely have time to talk about other “me too” vendors.
However, sometimes products like Omniscope, Microstrategy’s Visual Insight, the Microsoft BI stack (Power View, PowerPivot, Excel 2013, SQL Server 2012, SSAS etc.), Advizor, SpreadsheetWEB etc. deserve attention too. But covering them takes a lot of time, so I am trying to find guest bloggers for topics like that. 7 months ago I invited volunteers to do some guest blogging about Advizor Visual Discovery products:
So far nobody in the USA or Europe has committed to do so, but recently Mr. Srini Bezwada, a Certified Tableau Consultant and Advizor-trained expert from Australia, contacted me and submitted an article about it. He also provided me with info about how Advizor compares with Tableau, so I will cover that briefly, using his data and opinions. Mr. Bezwada can be reached at
firstname.lastname@example.org , where he is a director at
Below is quick comparison of Advizor with Tableau. Opinions below belong to Mr. Srini Bezwada. Next blog post will be a continuation of this article about Advizor Solutions Products, see also Advizor’s website here:
| Feature | Tableau | ADVIZOR | Comment |
|---|---|---|---|
| Time to implement | Very Fast | Fast, ADVIZOR can be implemented within Days | Tableau Leads |
| Scalability | Very Good | Very Good | Tableau: virtual RAM |
| Desktop License | $1,999 | $1,999 | $3,999 for AnalystX with Predictive modeling |
| Server License/user | $1K, min 10 users, $299K for Enterprise | Deployment license for up to 10 named users $8K | ADVIZOR is a lot cheaper for Enterprise Deployment: $75K for 500 Users |
| Support fees / year | 1st year included | | |
| SaaS Platform | Core or Digital | Offers Managed Hosting | ADVIZOR Leads |
| Overall Cost | Above Average | Competitive | ADVIZOR Costs Less |
| Enterprise Ready | Good for SMB | Cheaper cost model for SMB | Tableau is expensive for Enterprise Deployment |
| Long-term viability | Fastest growth | Private company since 2003 | Tableau is going IPO in 2013 |
| Mindshare | Tableau Public | Growing Fast | Tableau stands out |
| Big Data Support | Good | Good | Tableau is 32-bit |
| Partner Network | Good | Limited Partnerships | Tableau Leads |
| Visual Drilldown | Very Good | Very Good | |
| Offline Viewer | Free Reader | None | Tableau stands out |
| Analyst’s Desktop | Tableau Professional | Advizor has Predictive Modeling | ADVIZOR is Value for Money |
| Dashboard Support | Excellent | Very Good | Tableau Leads |
| Web Client | Very Good | Good | Tableau Leads |
| 64-bit Desktop | None | Very Good | Tableau is still a 32-bit app |
| Mobile Clients | Very Good | Very Good | |
| Visual Controls | Very Good | Very Good | |
| Data Integration | Excellent | Very Good | Tableau Leads |
| Development | Tableau Pro | ADVIZOR Analyst | |
| 64-bit in-RAM DB | Good | Excellent | Advizor Leads |
| Mapping support | Excellent | Average | Tableau stands out |
| Modeling, Analytics | Below Average | Advanced Predictive Modelling | ADVIZOR stands out |
| Predictive Modeling | None | Advanced Predictive Modeling Capability with built-in KXEN algorithms | ADVIZOR stands out |
| Flight Recorder | None | Flight recorder lets you track, replay, and save your analysis steps for reuse by yourself or others | ADVIZOR stands out |
| Visualization | 22 Chart types | All common charts like bar charts, scatter plots, line charts, pie charts are supported | Advizor has Advanced Visualizations like Parabox, Network Constellation |
| Third party integration | Many Data Connectors, see Tableau’s drivers page | ADVIZOR integrates well with CRM software: Salesforce.com, Ellucian, Blackbaud and others | ADVIZOR leads in CRM area |
| Training | Free Online and paid Classroom | Free Online and paid via company trainers & Partners | Tableau Leads |
November 30, 2012
In my previous post http://apandre.wordpress.com/2012/11/16/new-tableau-8-desktop-features/ (this post is a continuation of it), I said that Tableau 8 introduced 130+ new features, 3 times more than Tableau 7 did. Many of these new features are in Tableau 8 Server and this post is about those new Server features (this is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/11/30/new-tableau-8-server-features/ ).
The Admin and Server pages have been redesigned to show more info more quickly. In list view the columns can be resized; in thumbnail view the grid dynamically resizes. You can hover over a thumbnail to see more info about a visualization. The content search is better too:
Web authoring (even mobile) was introduced by Tableau 8 Server. Changing dimensions, measures and mark types, adding filters, and using Show Me can all be done directly in a web browser, and the result can be saved back to the server as a new workbook or, if individual permissions allow, to the original workbook:
Subscribing to a workbook or worksheet will automatically deliver dashboard or view updates to your email inbox. Subscriptions deliver an image and a link.
Tableau 8 Data Engine is more scalable now: it can be distributed between 2 nodes, and the 2nd instance can be configured as Active, Synced and Available for reading if the Tableau Router decides to use it (in addition to the fail-over function as before). Tableau 8 Server now supports Local Rendering, using the graphics ability of local devices, modern browsers and HTML5 – no round-trip to the server while rendering in the latest browsers (Chrome 23+, Firefox 17+, Safari, IE 9+). Tableau 8 (both Server and Desktop) computes each view in parallel. PDF files generated by Tableau 8 are up to 90% smaller and searchable. And the Performance Recorder works on both Server and Desktop.
Tableau 8 Server introduces shared sessions, which allow more concurrency and more caching. Tableau 7 uses 1 session per viewer; Tableau 8 uses one session for many viewers, as long as they do not change the state of filters or perform other altering interactions. If an interaction happens, Tableau 8 clones the session for the appropriate interactor and applies his/her changes to the new session. Finally, Tableau is getting an API – the 1st part of it I described in the previous blog post about Tableau Desktop: the TDE API (C/C++, Python, Java on both Windows AND Linux!).
The TDE API allows building your own TDE on any machine with Python, C/C++ or Java (see 24:53 at http://www.tableausoftware.com/tcc12conf/videos/new-tableau-server-8 ). Additionally, the Server API (REST API) allows you to programmatically create/enable/suspend sites and add/remove users to sites.
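As an illustration of what such a programmatic call could look like – the endpoint path, payload shape and auth header below are my assumptions for illustration, not a documented contract – a “create site” request could be prepared like this:

```python
# Hypothetical sketch of preparing a "create site" call against a
# Tableau Server admin REST API. The '/api/sites' path, the JSON body
# and the 'X-Tableau-Auth' header are illustrative assumptions; check
# the actual Server API documentation before use.
import json
import urllib.request

def make_create_site_request(server, site_name, auth_token):
    url = server.rstrip('/') + '/api/sites'  # assumed endpoint
    body = json.dumps({'site': {'name': site_name}}).encode('utf-8')
    req = urllib.request.Request(url, data=body, method='POST')
    req.add_header('Content-Type', 'application/json')
    req.add_header('X-Tableau-Auth', auth_token)  # assumed auth scheme
    return req  # a caller would pass this to urllib.request.urlopen()
```

A caller would send the prepared request with urllib.request.urlopen() against a real server; here only the request construction is shown.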
In addition to Faster Uploads and Publishing Data Sources, users can Publish Filters as Set and User Filters. Data Sources can be Refreshed or Appended instead of republished – all from Local Sources. Such Refreshes can be scheduled using Windows Task Scheduler or other task-scheduling software on client devices – this is a real TDE proliferation!
My wishlist for Tableau 8 Server: all Tableau Server processes need to be 64-bit (they are still 32-bit, see it here: http://onlinehelp.tableausoftware.com/v7.0/server/en-us/processes.htm – way overdue); a Linux version of Tableau Server is needed (Microsoft recently changed, very unfavorably, the way they charge users for each Client Access License); I wish for integration with the R library (Spotfire has had it for years); and I want Backgrounder processes (mostly doing data extracts on the server) to stop consuming core licenses, etc…
And yes, I found in San Diego even more individuals who found a better way to spend their time than attending the Tableau 2012 Customer Conference, and I am not here to judge:
November 16, 2012
I left Tableau 2012 conference in San Diego (where Tableau 8 was announced) a while ago with enthusiasm which you can feel from this real-life picture of 11 excellent announcers:
The conference was attended by 2200+ people and 600+ Tableau Software employees (Tableau almost doubled its number of employees in a year) and it felt like a great effort toward the IPO (see also the article here: http://www.bloomberg.com/news/2012-12-12/tableau-software-plans-ipo-to-drive-sales-expansion.html ). See some video here: TCC12 Keynote . Tableau 8 introduces 130+ new features, 3 times more than Tableau 7 did. Almost half of these new features are in Tableau 8 Desktop and this post is about those new Desktop features (this is a repost from my Tableau Blog: http://tableau7.wordpress.com/2012/11/16/new-tableau-8-desktop-features/). The new Tableau 8 Server features deserve a separate blog post, which I will publish a little later after playing with Beta 1 and maybe Beta 2.
A few days after the conference the Tableau 8 Beta Program started with 2000+ participants. One of the most promising features is the new rendering engine, and I built a special Tableau 7 visualization (and its port to Tableau 8) with 42000 datapoints: http://public.tableausoftware.com/views/Zips_0/Intro?:embed=y to compare the speed of rendering between versions 7 and 8:
Among the new features are new (for Tableau) visualization types: Heatmap, “Packed” Bubble Chart and Word Cloud, and I built a simple Tableau 8 dashboard to test them (all 3 visualize a 3-dimensional set where 1 dimension is used as the list of items, 1 measure for the size and a 2nd measure for the color of items):
The list of new features includes improved Sets (comparing members vs. non-members, adding/removing members, combining Sets: all-in-both, shared-by-both, left-except-right, right-except-left), Custom SQL with parameters, Freeform Dashboards (I still prefer an MDI UI where each Chart/View Sheet has its own child window as opposed to a pane), the ability to add multiple fields to Labels, optimized label placement, built-in statistical models for visual Forecasting, Visual Grouping based on your data selection, and a redesigned Mark Card (for the Color, Size, Label, Detail and Tooltip Shelves).
New Data features include data blending without a mandatory linked field in a view and with the ability to filter data in secondary data sources; refreshing server-based Data Extracts can be done from local data sources; Data Filters (in addition to being either local or global) can now be shared among a selected set of worksheets and dashboards. A refresh of a Data Extract can also be done from the command prompt of Tableau Desktop, for example with the “tableau refreshextract” command.
Tableau 8 has (finally) API (C/C++, Python, Java) to directly create a Tableau Data Extract (TDE) file, see example here: http://ryrobes.com/python/building-tableau-data-extract-files-with-python-in-tableau-8-sample-usage/
Tableau 8 (both Desktop and Server) can then connect to this extract file natively! Tableau provides new native connections for Google Analytics and Salesforce.com. TDE files are now much smaller (especially with text values) – up to 40% smaller compared with Tableau 7.
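The Python flavor of the Data Extract API can be sketched roughly like this; the “dataextract” module ships with the Tableau 8 SDK (not with Python itself), and the class and method names below follow the example linked above but should be double-checked against your SDK version:

```python
# Sketch: building a .tde file with the Tableau 8 "dataextract" module.
# The import is guarded because the module comes with the Tableau SDK;
# names follow the published API but treat them as assumptions.
try:
    import dataextract as tde
except ImportError:
    tde = None  # SDK not installed: this stays a sketch

def build_extract(path, rows):
    """Write (id, name) tuples into a Tableau Data Extract file."""
    if tde is None:
        return None
    extract = tde.Extract(path)
    schema = tde.TableDefinition()
    schema.addColumn('id', tde.Type.INTEGER)
    schema.addColumn('name', tde.Type.UNICODE_STRING)
    table = extract.addTable('Extract', schema)  # table must be named 'Extract'
    for rid, name in rows:
        row = tde.Row(schema)
        row.setInteger(0, rid)
        row.setString(1, name)
        table.insert(row)
    extract.close()
    return path
```

Once such a file is written, Tableau Desktop or Server opens it natively like any other extract.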
Tableau 8 has performance enhancements, such as the new ability to use hardware accelerators (of modern graphics cards), computing views within a dashboard in parallel (in Tableau 7 these computations were consecutive) and a new performance recorder that allows you to estimate and tune the workload of various activities and functions and optimize the behavior of a workbook.
I still have a wishlist of features which are not implemented in Tableau, and I hope some of them will be implemented later: all Tableau processes are 32-bit (except the 64-bit version of the data engine for servers running on a 64-bit OS) and are way overdue to become 64-bit; many users demand a Mac version of Tableau Desktop and a Linux version of Tableau Server (Microsoft recently changed, very unfavorably, the way they charge users for each Client Access License); I wish for an MDI UI for Dashboards where each view of each worksheet has its own window as opposed to its own pane (Qlikview has done this from the beginning of time); I wish for integration with the R library (Spotfire has had it for years) and for scripting languages and an IDE (preferably Visual Studio); and I want Backgrounder processes (mostly doing data extracts on the server) to stop consuming core licenses, etc…
Despite the great success of the conference, I found somebody in San Diego who did not pay attention to it (outside was 88F, sunny and beautiful):
October 27, 2012
Leave a Comment
On May 3rd of 2012 the Google+ extension http://tinyurl.com/VisibleData of this Data Visualization blog reached 500+ followers; on July 9 it got 1000+ users; on October 11 it had already 2000+ users; on 11/27/12 my G+ Data Visualization Page had 2190+ followers and it is still growing every day (updated as of 12/01/12: 2500+ followers).
One of reasons of course is just a popularity of Data Visualization related topics and other reason covered in interesting article here:
In any case, it helped me to create a reading list for myself and other people, based on the feedback I got. According to CircleCount, as of the 11/13/12 update, my Data Visualization Google+ Page ranked as the #178 most popular page in the USA. Thank you G+ ! Updates:
5/25/13: the G+ extension of this blog now has 3873+ followers,
as of 7/15/13 it has 4277+ followers,
and as of 11/11/13 it has 5013+ followers:
I also have 2nd G+ extension of this blog, see it here: http://tinyurl.com/VisualizationWithTableau with 375 followers as of 11/11/13:
October 18, 2012
Leave a Comment
Qlikview 10 was released around 10/10/10, Qlikview 11 – around 11/11/11, so I expected Qlikview 12 to be released on 12/12/12 but “instead” we are getting Qlikview 11.2 with Direct Discovery in December 2012, which supposedly provides a “hybrid approach so business users can get the QlikView associative experience even with data that is not stored in memory”
This feature has been demanded by users (me included) for a long time, but I think the noise around so-called Big Data and competition forced Qliktech to do it. Spotfire has had it for a long time (as well as a 64-bit implementation) and Tableau has had something like that for a while (unfortunately Tableau is still 32-bit). You can test the Beta of it, if you have time: http://community.qlikview.com/blogs/technicalbulletin/2012/10/22/qlikview-direct-discovery-beta-registration-is-open
Just 8 months ago Qliktech estimated its sales for 2012 as $410M, and suddenly 3 months ago it lowered its estimate to $381M, just 19% over 2011, which is in huge contrast with Qliktech’s previous speed of growth, way behind the current speed of growth of Tableau and even less than the current speed of growth of Spotfire. During the last 2 years the QLIK stock has been unable to grow significantly:
and all of the above is forcing Qliktech to do something beyond gradual improvements – new and exciting functionality is needed, and Direct Discovery may help!
QlikView Direct Discovery enables users to perform visual analysis against “any amount of data, regardless of size”. With the introduction of this unique hybrid approach, users can associate data stored within big data sources directly alongside additional data sources stored within the QlikView in-memory model. QlikView can “seamlessly connect to multiple data sources together within the same interface”, e.g. Teradata to SAP to Facebook, allowing the business user to associate data across the data silos. Data outside of RAM can be joined with the in-memory data via common field names. This allows the user to associatively navigate both the direct discovery and in-memory data sets.
A QlikView developer sets up the Direct Discovery table in the QlikView application load script to allow business users to query the desired big data source. Within the script editor a new syntax is introduced to connect to data in direct discovery form. Traditionally, the following syntax is required to load data from a database table:
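A representative load-script statement (reconstructed as a sketch from the description that follows; the OrderFact table and column names come from the text):

```
OrderFact:
SQL SELECT CarrierTrackingNumber, ProductID FROM OrderFact;
```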
To invoke the direct discovery method, the keyword “SQL” is replaced with “DIRECT”.
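For example (again a sketch built from the table described in the text – the listed fields are loaded, while columns such as OrderQty and Price stay in the database):

```
OrderFact:
DIRECT SELECT CarrierTrackingNumber, ProductID FROM OrderFact;
```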
In the example above only the columns CarrierTrackingNumber and ProductID are loaded into QlikView in the traditional manner; other columns, including OrderQty and Price, exist only in the data table within the database. OrderQty and Price are referred to as “IMPLICIT” fields. An implicit field is a field that QlikView is aware of on a “meta level”: the actual data of an implicit field resides only in the database, but the field may be used in QlikView expressions. Looking at the table view and data model, the direct discovery columns are not within the model (on the OrderFact table):
Once the direct discovery structure is established, the direct discovery data can be joined with the in-memory data via common field names (Figure 3). In this example, the “ProductDescription” table is loaded in-memory and joined to the direct discovery data on the ProductID field. This allows the user to associatively navigate both the “direct discovery” and in-memory data sets.
Direct Discovery will be much slower than in-memory processing, and this is expected, but it will take away Qlikview’s usual claim that it is faster than competitors. QlikView Direct Discovery can only be used against SQL-compliant data sources. The following data sources are supported:
• ODBC/OLEDB data sources – All ODBC/OLEDB sources are supported, including SQL Server, Teradata and Oracle.
• Custom connectors which support SQL – Salesforce.com, SAP SQL Connector, Custom QVX connectors for SQL compliant data stores.
Due to the interactive and SQL-syntax-specific nature of the Direct Discovery approach, a number of limitations exist. The following chart types are not supported:
• Pivot tables
• Mini charts
And the following QlikView features are not supported:
• Advanced aggregation
• Calculated dimensions
• Comparative Analysis (Alternate State) on the QlikView objects that use Direct Discovery fields
• Direct Discovery fields are not supported on Global Search
• Binary load from a QlikView application with Direct Discovery table
Here is some preliminary video about Direct Discovery, published by Qliktech:
It was interesting to me that just 2 days after Qliktech pre-announced Direct Discovery, it also partnered with Teradata. Tableau has partnered with Teradata for a while and Spotfire did it a month ago, so I guess Qliktech is trying to catch up in this regard as well. I mention it only to underscore the point of this blog post: Qliktech realized that it is behind its competitors in some areas and has to follow ASAP.
September 25, 2012
Today TIBCO announced Spotfire 5, which will be released in November 2012. Two biggest news are the access to SQL Server Analysis Services cubes and the integration with Teradata “by pushing all aggregations, filtering and complex calculations used for interactive visualization into the (Teradata) database”.
Spotfire team “rewrote” its in-memory engine for v. 5.0 to take advantage of high-capacity, multi-core servers. “Spotfire 5 is capable of handling in-memory data volumes orders of magnitude greater than the previous version of the Spotfire analytics platform” said Lars Bauerle, vice president of product strategy at TIBCO Spotfire.
Another addition is “in-database analysis”, which allows analysts to apply analytics within database platforms (such as Oracle, Microsoft SQL Server and Teradata) without extracting and moving data, while handling analyses on the Spotfire server and returning result sets back to the database platform.
Spotfire added the new TIBCO Enterprise Runtime for R, which embeds an R runtime engine into the Spotfire statistical server. TIBCO claims that Spotfire 5.0 scales to tens of thousands of users! Spotfire 5 is designed to leverage the full family of TIBCO business optimization and big data solutions, including TIBCO LogLogic®, TIBCO Silver Fabric, TIBCO Silver® Mobile, TIBCO BusinessEvents®, tibbr® and TIBCO ActiveSpaces®.
September 20, 2012
The Mass Technology Leadership Council (MassTLC) organized today the Data Visualization Panel in their series of “Big Data Seminars”:
and they invited me to be a Speaker and Panelist together with Irene Greif (Fellow @IBM) and Martin Leach (CIO @Broad Institute). Most interesting about this event was that it was sold out and about 150 people came to participate, even though it was held during the most productive time of the day (from 8:30am until 10:30am). Compared with what I observed just a few years ago, I sensed a huge interest in Data Visualization, based on the multiple very interesting and relevant questions I got from event participants.
August 12, 2012
I doubt that Microsoft is paying attention to my blog, but recently they declared that Power View now has 2 versions: one for SharePoint (thanks, but no thanks) and one for Excel 2013. In other words, Microsoft decided to have its own Desktop Visualization tool. In combination with PowerPivot and SQL Server 2012 it can be attractive for some Microsoft-oriented users, but I doubt it can compete with the Data Visualization Leaders – too late.
Most interesting is the note about Power View 2013 on Microsoft site: “Power View reports in SharePoint are RDLX files. In Excel, Power View sheets are part of an Excel XLSX workbook. You can’t open a Power View RDLX file in Excel, and vice versa. You also can’t copy charts or other visualizations from the RDLX file into the Excel workbook.“
But most amazing is that Microsoft decided to use the dead Silverlight for Power View: “Both versions of Power View need Silverlight installed on the machine.” And we know that Microsoft switched from Silverlight to HTML5 and no new development is planned for Silverlight! Good luck with that…
And yes, you can add now maps (Bing of course), see it here:
June 9, 2012
(this is a repost from my other blog: http://tableau7.wordpress.com/2012/06/09/tableau-and-big-data/ )
Big Data can be useless without multi-layer data aggregations and hierarchical or cube-like intermediary Data Structures, where ONLY a few dozens, hundreds or thousands of data points are exposed visually and dynamically at every single viewing moment to analytical eyes for interactive drill-down-or-up hunting for business value(s) and actionable datum (or “datums” – if plural means data). One of the best expressions of this concept (at least how I interpreted it) I heard from my new colleague, who flatly said:
“Move the function to the data!”
I recently got involved with multiple projects using large data-sets for Tableau-based Data Visualizations (100+ million rows and even billions of records!). Two of the largest examples I used were 800+ million records and 2+ billion rows.
So this blog post expresses my thoughts about such Big Data (on average, the examples above have about 1+ KB per CSV record before compression and other advanced DB tricks, like the columnar storage used by Tableau's Data Engine) as a back-end for Tableau. But please keep in mind that as a 32-bit tool, Tableau itself is not ready for Big Data. In addition, I think Big Data is mostly a buzzword and BS, and we are sometimes forced by marketing BS masters to use this stupid term.
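The “Move the function to the data!” idea can be sketched in a few lines of Python: instead of shipping millions of raw rows to the visualization tool, aggregate them close to the source and expose only the handful of data points a chart actually needs. This is only an illustration – the region names and the sale-amount field are made up:

```python
from collections import defaultdict
from random import choice, random, seed

seed(42)

# Simulate a million raw records (region, sale amount) --
# far too many points to send to a chart one-by-one.
regions = ["North", "South", "East", "West"]
raw_rows = ((choice(regions), random() * 100) for _ in range(1_000_000))

# "Move the function to the data": aggregate before visualizing,
# so the visualization layer receives only a few data points.
totals = defaultdict(float)
counts = defaultdict(int)
for region, amount in raw_rows:
    totals[region] += amount
    counts[region] += 1

# Only 4 aggregated points (one per region) are exposed to the chart,
# instead of 1,000,000 raw rows.
summary = {r: totals[r] / counts[r] for r in regions}
print(summary)
```

The same principle is what cube-like intermediary structures and in-database aggregation give you at scale: the heavy computation stays where the data lives.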
Here are some factors involved in Data Delivery from the main designated Database (back-ends like Teradata, DB2, SQL Server or Oracle) into “local” Tableau-based Big Data Visualizations (many people are still trying to use Tableau as a Reporting tool as opposed to a (Visual) Analytical tool):
1. Queuing of thousands of queries to the Database Server. There is no guarantee your Tableau query will be executed immediately; in fact it WILL be delayed.
2. The speed of a Tableau query, once it finally starts executing, depends on sharing CPU cycles, RAM and other resources with other queries executed SIMULTANEOUSLY with yours.
3. Buffers, pools and other resources available for particular user(s) and queries at your Database Server are different and depend on the privileges and settings given to you as a Database User.
4. Network speed: between some servers it can be 10Gbit (or even more); in most cases it is 1Gbit inside server rooms; outside of server rooms I observed in many old buildings (over wired Ethernet) at most 100Mbit coming into a user's PC; if you are using Wi-Fi it can be even less (say 54Mbit?). Over the internet it can be less still (I observed speeds of 1Mbit or so in some remote offices over old T-1 lines); if you are using VPN it will max out at 4Mbit or less (I observed that in my home office).
5. Utilization of the network. I use Remote Desktop Protocol (RDP) from my workstation or notebook to a VM (a VM or VDI Virtual Machine sitting in the server room) connected to servers at a network speed of 1Gbit, but it still uses at most 3% of that speed (about 30Mbit, which is about 3 Megabytes of data per second, which is probably a few thousand records per second).
That means the network may have a problem delivering 100 million records to a “local” report even overnight (say 10 hours: 10 million records per hour, or about 3000 records per second) – partially and probably because of factors 4 and 5 above.
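The back-of-envelope math above can be checked with a few lines of Python. The inputs are the assumptions stated in this post (roughly 1 KB per CSV record, a nominal 1Gbit link, and about 3% effective utilization observed over RDP):

```python
# Rough data-delivery estimate for the scenario described above.
record_size_bytes = 1024            # ~1 KB per CSV record (assumption from this post)
link_speed_bps = 1_000_000_000      # 1 Gbit/s nominal network speed
utilization = 0.03                  # ~3% effective utilization observed over RDP

# Effective throughput in bytes/second (divide by 8 to go from bits to bytes).
effective_bytes_per_sec = link_speed_bps * utilization / 8   # ~3.75 MB/s

records_per_sec = effective_bytes_per_sec / record_size_bytes
records = 100_000_000
hours_needed = records / records_per_sec / 3600

print(f"{records_per_sec:.0f} records/s, {hours_needed:.1f} hours for 100M records")
```

With these inputs the result is a few thousand records per second and roughly a full night to move 100 million records – which is exactly why delivering raw Big Data to a “local” report is rarely practical.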
On top of those factors, please keep in mind that Tableau is a set of 32-bit applications (with the exception of one out of 7 processes on the Server side), which are restricted to 2GB of RAM; if a data-set cannot fit into RAM, then the Tableau Data Engine will use the disk as virtual RAM, which is much, much slower – and for some users such disk space is actually not local to their workstation but mapped to some “remote” network file server.
Tableau Desktop in many cases uses 32-bit ODBC drivers, which may add even more delay to data delivery into the local “Visual Report”. As we learned from Tableau support itself, even with the latest Tableau Server 7.0.X, the RAM allocated for one user session is restricted to 3GB anyway.
Unfortunate Update: Tableau 8.0 will be a 32-bit application again, but maybe a follow-up version 8.x or 9 (I hope) will be ported to 64 bits… It means that Spotfire, Qlikview and even PowerPivot will keep some advantages over Tableau for a while…
May 31, 2012
(this is a repost from my other Data Visualization blog: http://tableau7.wordpress.com/2012/05/31/tableau-as-container/ )
Often I use small Tableau (or Spotfire or Qlikview) workbooks instead of PowerPoint, which proves at least 2 concepts:
A good Data Visualization tool can be used as a Web or Desktop Container for multiple Data Visualizations (it can be used to build hierarchical Container structures with more than 3 levels; currently 3: Container-Workbooks-Views).
It can be used as a replacement for PowerPoint; in the example below I embedded into this Container 2 Tableau Workbooks, one Google-based Data Visualization, 3 image-based Slides and a Textual Slide: http://public.tableausoftware.com/views/TableauInsteadOfPowerPoint/1-Introduction
Tableau (or Spotfire or Qlikview) is better than PowerPoint for Presentations and Slides
Tableau (or Spotfire or Qlikview) is the Desktop and the Web Container for Web Pages, Slides, Images, Texts
Good Visualization Tool can be a Container for other Data Visualizations
Sample Tableau Presentation above contains the Introductory Textual Slide
The Sample Tableau Presentation above contains a few Tableau Visualizations. This Tableau Presentation contains a Web Page with the Google-based Motion Chart Demo
This Tableau Presentation contains a few Image-based Slides:
The Quick Description of Origins and Evolution of Software and Tools used for Data Visualizations during last 30+ years
The Description of Multi-level Projection from Multidimensional Data Cloud to Datasets, Multidimensional Cubes and to Chart
The Description of 6 stages of Software Development Life Cycle for Data Visualizations
May 7, 2012
TIBCO said Spotfire 4.5 will be available later this month (May 2012).
Among the news and additions to Spotfire: it will include an ADS connector to Hadoop, integration with SAS, Mathworks and Attivio engines, and a new deployment kit for iPad.
April 14, 2012
The short version of this post: as far as Data Visualization is concerned, the new Power View from Microsoft is a marketing disaster, an architectural mistake and a generous gift from Microsoft to Tableau, Qlikview, Spotfire and dozens of other vendors.
For the long version – keep reading.
Assume for a minute (OK, just for a second) that the new Power View Data Visualization tool from Microsoft SQL Server 2012 is almost as good as Tableau Desktop 7. Now let's compare the installation, configuration and hardware involved.
Tableau Desktop 7:
- Hardware: almost any modern Windows PC/notebook (at least dual-core, 4GB RAM).
- Installation: a) one 65MB setup file, b) minimum or no skills
- Configuration: 5 minutes – follow instructions on screen during installation.
- Price: $2K.
Power View (SQL Server 2012 + SharePoint 2010):
- Hardware: you need at least 2 server-level PCs (each at least quad-core, 16GB RAM recommended). I would not recommend using 1 production server to host both SQL Server and SharePoint; if you are desperate, at least use VM(s).
- Installation: a) Each Server needs Windows 2008 R2 SP1 – 3GB DVD; b) 1st Server needs SQL Server 2012 Enterprise or BI Edition – 4GB DVD; c) 2nd Server needs SharePoint 2010 Enterprise Edition – 1GB DVD; d) A lot of skills and experience
- Configurations: Hours or days plus a lot of reading, previous knowledge etc.
- Price: $20K or if only for development it is about $5K (Visual Studio with MSDN subscription) plus cost of skilled labor.
As you can see, Power View simply cannot compete on the mass market with Tableau (and Qlikview and Spotfire), and the time for our assumption at the beginning of this post has expired. Instead, now is the time to remind you that Power View is 2 generations behind Tableau, Qlikview and Spotfire. And there is no Desktop version of Power View; it is only available as a web application through a web browser.
Power View is a Silverlight application packaged by Microsoft as a SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Server 2010 Enterprise Edition. Power View is an (ad-hoc) report designer providing users with an interactive data exploration, visualization, and presentation web experience. Microsoft stopped developing Silverlight in favor of HTML5, but Silverlight survived (another mistake) within the SQL Server team.
Previous report designers (still available from Microsoft: BIDS, Report Builder 1.0, Report Builder 3.0, Visual Studio Report Designer) are capable of producing only static reports, but Power View enables users to visually interact with data and drill down through all Charts and Dashboards, similar to Tableau and Qlikview.
Power View is a Data Visualization tool, integrated with Microsoft ecosystem. Here is a Demo of how the famous Hans Rosling Data Visualization can be reimplemented with Power View:
Compared with previous report builders from Microsoft, Power View offers many new features, like Multiple Views in a Single Report, Gallery preview of Chart Images, export to PowerPoint, Sorting within Charts by Measures and Categories, Multiple Measures in Charts, Highlighting of selected data in reports and Charts, Synchronization of Slicers (Cross-Filtering), Measure Filters, Search in Filters (convenient for long lists of categories), dragging data fields into the Canvas (create table) or Charts (modify visualization), converting measures to categories (“Do Not Summarize”), and many other features.
As with any 1st release from Microsoft, you can find some bugs in Power View. For example, KPIs are not supported in Power View in SQL Server 2012; see it here: http://cathydumas.com/2012/04/03/using-or-not-using-tabular-kpis/
Power View is not Microsoft's 1st attempt to be a full player in the Data Visualization and BI Market. Previous attempts failed and can be counted as Strikes.
Strike 1: The ProClarity acquisition in 2006 failed; there have been no new releases since v. 6.3. Remnants of ProClarity can be found embedded in SharePoint, but there is no Desktop Product anymore.
Strike 2: Performance Point Server was introduced in November, 2007, and discontinued two years later. Remnants of Performance Point can be found embedded into SharePoint as Performance Point Services.
Both failed attempts were focused on the growing Data Visualization and BI space, specifically at fast-growing competitors such as Qliktech, Spotfire and Tableau. Their remnants in SharePoint are functionally far behind the Data Visualization leaders.
The path to Strike 3 started in 2010 with the release of PowerPivot (a very successful half-step, since it is just a back-end for Visualization) and xVelocity (originally released under the name VertiPaq). Power View is a continuation of these efforts to add a front-end to the Microsoft BI stack. I do not expect that Power View will gain as much popularity as Qlikview and Tableau, and in my mind Microsoft will be the subject of a 3rd strike in the Data Visualization space.
One reason I described at the very beginning of this post; the 2nd reason is the absence of Power View on the desktop. It is a mystery to me why Microsoft did not implement Power View as a new part of Office (like Visio, which is a great success) – as a new desktop application, as a new Excel Add-In (like PowerPivot), as new functionality in PowerPivot or even in Excel itself, or as a new version of their Report Builder. None of these options would prevent a Web reincarnation of it, and such a reincarnation could be done as a part of (native SSRS) Reporting Services – why involve SharePoint (which is – and I have said it many times on this blog – basically a virus)?
I wonder what Donald Farmer is thinking about Power View after being part of the Qliktech team for a while. From my point of view, Power View is a generous gift and a true relief to Data Visualization Vendors, because they do not need to compete with Microsoft for a few more years or maybe forever. Now the IPO of Qliktech makes even more sense to me, and the upcoming IPO of Tableau makes much more sense to me too.
Yes, Power View means new business for consulting companies and Microsoft partners (because many client companies and their IT departments cannot handle it properly), and Power View has good functionality, but it will be counted in history as Strike 3.
April 2, 2012
(this is a repost from my Tableau blog: http://tableau7.wordpress.com/2012/04/02/palettes-and-colors/ )
I was always intrigued by colors and their usage, ever since my mom told me that maybe (just maybe, there is no direct proof of it anyway) the Ancient Greeks did not know what the BLUE color is – that puzzled me.
Later in my life, I realized that Colors and Palettes play a huge role in Data Visualization (DV), which eventually led me to attempt to understand how they can be used and pre-configured in advanced DV tools to make Data more visible and to express Data Patterns better. For this post I used Tableau to produce some palettes, but a similar technique can be found in Qlikview, Spotfire etc.
Tableau published a good article on how to create customized palettes here: http://kb.tableausoftware.com/articles/knowledgebase/creating-custom-color-palettes and I followed it below. As that article recommends, I modified the default Preferences.tps file; see it below with images of the respective Palettes embedded.
For the first, regular Red-Yellow-Green-Blue Palette with known colors with well-established names, I even created a Visualization in order to compare their Red-Green-Blue components, and I tried to place the respective Bubbles on a 2-dimensional surface, even though it is clearly a 3-dimensional Dataset originally (click on the image to see it in full size):
For the 2nd, Red-Yellow-Green-NoBlue Ordered Sequential Palette, I tried to implement an extended “Set of Traffic Lights without any trace of BLUE Color” (so Homer and Socrates would understand it the same way as we do) while trying to use only web-safe colors. Please keep in mind that Tableau does not have a simple way to have more than 20 colors in one Palette, like Spotfire does.
The other 5 Palettes below are useful too, as ordered-diverging, almost “mono-chromatic” palettes (except Red-Green Diverging, which can be used in Scorecards where Red is bad and Green is good). So see below the Preferences.tps file with my 7 custom palettes:
<?xml version='1.0'?> <workbook> <preferences>
<color-palette name="RegularRedYellowGreenBlue" type="regular">
<color>#FF0000</color> <color>#800000</color> <color>#B22222</color>
<color>#E25822</color> <color>#FFA07A</color> <color>#FFFF00</color>
<color>#FF7E00</color> <color>#FFA500</color> <color>#FFD700</color>
<color>#F0e68c</color> <color>#00FF00</color> <color>#008000</color>
<color>#00A877</color> <color>#99cc33</color> <color>#009933</color>
<color>#0000FF</color> <color>#00FFFF</color> <color>#008080</color>
</color-palette>
<color-palette name="RedYellowGreenNoBlueOrdered" type="ordered-sequential">
<color>#ff0000</color> <color>#cc6600</color> <color>#cccc00</color>
<color>#ffff00</color> <color>#99cc00</color> <color>#009900</color>
</color-palette>
</preferences> </workbook>
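A palette file like this can be checked programmatically before dropping it into Tableau's Repository. Here is a small Python sketch that parses an abbreviated version of the XML above and splits each hex color into the Red-Green-Blue components compared in the visualization earlier (the embedded snippet repeats only the NoBlue palette):

```python
import xml.etree.ElementTree as ET

TPS = """<?xml version='1.0'?>
<workbook><preferences>
  <color-palette name="RedYellowGreenNoBlueOrdered" type="ordered-sequential">
    <color>#ff0000</color> <color>#cc6600</color> <color>#cccc00</color>
    <color>#ffff00</color> <color>#99cc00</color> <color>#009900</color>
  </color-palette>
</preferences></workbook>"""

def hex_to_rgb(hex_color):
    """Split a #RRGGBB string into (R, G, B) integer components."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

root = ET.fromstring(TPS)
for palette in root.iter("color-palette"):
    print(palette.get("name"), palette.get("type"))
    for c in palette.iter("color"):
        r, g, b = hex_to_rgb(c.text.strip())
        # A "NoBlue" palette should keep the Blue component at zero.
        print(f"  {c.text.strip()} -> R={r:3d} G={g:3d} B={b:3d}")
```

Parsing the file this way also catches the “smart quotes” that blog editors like to substitute for straight quotes, which silently break Tableau's XML parsing.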
If you wish to use the colors you like, this site is very useful for exploring the properties of different colors: http://www.perbang.dk/rgb/
March 31, 2012
(this is a repost from http://tableau7.wordpress.com/2012/03/31/tableau-reader/ )
Tableau made a couple of brilliant decisions to completely outsmart its competitors and gain extreme popularity, while convincing millions of potential, future and current customers to invest their own time in learning Tableau. The 1st reason of course is Tableau Public (we discuss it in a separate blog post) and the other is the Free Tableau Reader, which provides a full desktop user experience and interactive Data Visualization without any Tableau Server (or any other server) involved, and with better performance and UI than Server-based Visualizations.
While designing Data Visualizations is done with Tableau Desktop, most users get their Data Visualizations served by Tableau Server to their Web Browser. However, in large and small organizations alike that usage pattern is not always the best fit. Below I discuss a few possible use cases where the usage of the Free Tableau Reader can be appropriate; see it here: http://www.tableausoftware.com/products/reader .
1. Tableau Application Server serves Visualizations well, but not as well as Tableau Reader, because Tableau Reader delivers a truly desktop User Experience and UI. The best-known example of this is a Motion Chart: you can see automatic motion with Tableau Reader, but a Web Browser will force the user to manually emulate the motion. In cases like that, the user is advised to download the workbook, copy the .TWBX file to his/her workstation and open it with Tableau Reader.
Here is an example of a Motion Chart done in Tableau, similar to Hans Rosling's famous presentation of Gapminder's Motion Chart (and you need the free Tableau Reader or a license for Tableau Desktop to see the automatic motion of the 6-dimensional dataset, with all colored bubbles resizing over time):
Please note that the same Motion Chart using Google Spreadsheets will run in a browser just fine (I guess because Google “bought” Gapminder and kept its code intact):
2. When you have hundreds or thousands of Tableau Server users and more than a couple of Admins (users with Administrative privileges), each Admin can override viewing privileges for any workbook, regardless of the Users and User Groups designated for that workbook. In such a situation there is a risk of violating the privacy and confidentiality of the data involved, for example in HR Analytics, HR Dashboards and other Visualizations where private, personal and confidential data is used.
Tableau Reader enables an additional, complementary method of delivering Data Visualizations through private channels like password-protected portals, file servers and FTP servers – and in certain cases even by-passing Tableau Server entirely.
3. Due to the popularity of Tableau and its ease of use, many groups and teams are considering Tableau as a vehicle for delivering hundreds and even thousands of Visual Reports to hundreds and maybe even thousands of users. That can slow down Tableau Server, degrade the user experience and create even more confidentiality problems, because it may expose confidential data to unintended users – like the report for one store to users from another store.
4. Many small (and not so small) organizations try to save on Tableau Server licenses (at least initially), and they still can distribute Tableau-based Data Visualizations: the developer(s) will have Tableau Desktop (a relatively small investment) and users, clients and customers will use Tableau Reader, while all TWBX files can be distributed over FTP, portals, file servers or even by email. In my experience, when a Tableau-based business grows enough, it will pay by itself for Tableau Server licenses, so usage of Tableau Reader is in no way a threat to Tableau Software's bottom line!
Update (12/12/12), for even happier usage of Tableau Reader: in the upcoming Tableau 8, all Tableau Data Extracts – TDEs – can be created and used without any Tableau Server involved. Instead, a Developer can create/update a TDE either with Tableau in UI mode or using the Tableau Command Line Interface, and script TDEs in batch mode or programmatically with the new TDE API (Python, C/C++, Java). It means that Tableau workbooks can be automatically refreshed with new data without any Tableau Server and re-delivered to Tableau Reader users over … FTP, portals or file servers or even by email.
March 20, 2012
In an unusual, interesting (what does it mean? is it promising or what?) move, the two Data Visualization leaders (Panopticon and Qliktech) partnered today, see
“to offer enhanced, real-time visualization capabilities for the QlikView Business Discovery platform”.
Panopticon’s press-release looks overly submissive to me:
“As a member of QlikTech’s Qonnect Partner Program for Technology Partners, Panopticon supports QlikView desktop, web, and mobile interactive dashboards and allows users to filter and interact directly with real-time data. By integrating Panopticon into their systems, QlikView users can:
Federate reference and real-time streaming data as well as conflated time series data sets;
Connect to virtually any relational or column-oriented database, including tick databases;
Connect to real-time message queues;
Connect to Complex Event Processing (CEP) engines; and
Make full use of Panopticon’s library of visualizations designed specifically to analyze financial data.
The combined Panopticon-QlikView platform is now available for immediate installation.”
Qliktech is trying to be politically correct and its Michael Saliter, Senior Director Global Market Development – Financial Services at QlikTech said, “Our partnership with Panopticon allows us to incorporate leading real-time visualization capabilities into our QlikView implementations. We recognize the importance of providing our clients with truly up-to-date information, and this new approach supports that initiative. Our teams share a common philosophy about proper data visualization design. This made it easy to develop a unified approach to the presentation of real-time, time series, and static data in ways that people can understand in seconds.”
While I like it when competitors cooperate (it benefits users and hopefully improves sales for both vendors), I still have a question: Qliktech got a lot of money from its IPO, had a lot of sales and hired a lot of people lately; why were they (Qlikview Developers) not able to develop real-time functionality themselves?
Hugh Heinsohn, VP of Panopticon, said to me: “we (Panopticon) don’t see ourselves as competitors – and neither do they (Qliktech). When you get into the details, we do different things and we’re working together closely now”
Another indirect sign of the relationship between Panopticon and Qliktech is the recent inclusion of Måns Hultman, former CEO of QlikTech, in the list of advisors to Panopticon's Board of Directors.
Other questions arise too: if Qliktech is suddenly open to integration with Panopticon, why not integrate with Quantrix and the R Library (I proposed integration with R a while ago)? Similar questions apply to Tableau Software…
March 11, 2012
I was silent for a while for a reason: I owe it to myself to read a big pile of books, articles and blog posts by many authors – I have to read it all before I can write something myself. The list is huge and it goes many weeks back! I will sample a sublist here with some relatively fresh reading materials, in no particular order:
1. Excellent “Clearly and Simply” blog by Robert Mundigl, here are just 2 samples:
2. Interesting site dedicated to The Traveling Salesman Problem:
3. Excellent QV Design blog by Matthew Crowther, here are a few examples:
4. Good article by James Cheshire here:
5. Interesting blog by Josh Tapley: http://data-ink.com/
6. A must read blog of Stephen Wolfram, just take a look on his 2 last posts:
7. Nice post by my friend John Callan: http://community.qlikview.com/blogs/theqlikviewblog/2012/03/09/why-discovery-really-matters
8. I am trying to follow David Raab as much as I can:
9. As always, interesting articles from Timo Elliott:
10. A huge set of articles from a variety of Sources about the newly released or about-to-be-released xVelocity, PowerPivot2, SQL Server 2012, SSDT (SQL Server Data Tools), VS11 etc.
11. Here is a sample of an article with which I disagree (I think OBIEE is TWO generations behind Qlikview, Tableau and Spotfire), but I still need to read it:
this list goes on and on and on, so the answer to my own question is: to read!
Below is a proof (unrelated to Data Visualization, but I cannot resist publishing it – I made the spreadsheet below myself) – rather for myself – that reading can help to avoid mistakes (sounds funny, I know). For example, if you listened to last week's iPropaganda from the iChurch, you would think the new iPad 2012 is the best tablet on the market. But if you carefully read the specification of the new iPad 2012 and compare it (after careful reading) with the specifications of the new Asus Transformer Pad Infinity, you will make a different choice:
February 22, 2012
Dan Primack, Senior Editor at Fortune, posted today at http://finance.fortune.cnn.com/2012/02/22/tableau-to-ipo-next-year/ a suggestion that Tableau could go public next year, and I quote:
Scott Sandell, a partner with New Enterprise Associates (the venture capital firm that is Tableau's largest outside shareholder), told Dan that the “board-level discussions” are about taking the company public next year, even though it has the numbers to go out now if it so chose. Sandell added that the company has been very efficient with the $15 million or so it has raised in VC funding, and that it shouldn't need additional pre-IPO financing.
Mr. Primack also mentioned an unsolicited email from an outside spokesman: “Next week Tableau Software will announce its plans to go IPO”…
I have no comments, but I will not be surprised if somebody buys Tableau before the IPO… Among potential buyers I can imagine:
- Microsoft (Seattle, Multidimensional Cubes, integration with Excel),
- Teradata (Aster Data is in, front-end for “big data” is needed),
- IBM (if you cannot win against the innovator, how about buying it),
- and even Oracle (everything moving is the target?)…
February 5, 2012
I recently started a new Data Visualization Google+ page as an extension of this blog here:
The Internet has a lot of articles, pages, blogs, data, demos, vendors, sites, dashboards, charts, tools and other materials related to Data Visualization, and this Google+ page will try to point to the most relevant items and sometimes comment on the most interesting of them.
What was unexpected is the fast success of this Google+ page – in a very short time it got 200+ followers and that number keeps growing!
January 31, 2012
A new version 3.3 of SpreadsheetWEB, with new features like Data Consolidation, User Groups, Advanced Analytics and Interactive Charts, was released this month by Cambridge, MA-based Pagos, Inc.
SpreadsheetWEB is known as the best SaaS platform with a unique ability to convert Excel spreadsheets into rich web applications with live database connections, integration with SQL Server, support for 336 Excel functions (see the full list here http://wiki.pagos.com/display/spreadsheetweb/Supported+Excel+Formulas ), multiple worksheets, Microsoft Drawing, integration with websites and the best Data Collection functionality among BI tools and platforms.
See the simple Video Tutorial about how to create a Web Dashboard with Interactive Charts by publishing your Excel Spreadsheet using SpreadsheetWEB 3.3 here:
SpreadsheetWEB has supported Mapping for a while; see the video showing how you can create a Map application in less than 4 minutes:
as well as PivotTables, Web Services, Batch Processing, and many other new features, see it here: http://spreadsheetweb.com/features.htm
In order to create a SpreadsheetWEB application, all you need is Excel and the free SpreadsheetWEB Add-in for Excel; see many impressive online Demos here: http://spreadsheetweb.com/demo.htm
January 18, 2012