Guest Posts

For this weekend I have 2 guest bloggers (one yesterday and one today) sharing their thoughts about Cloud Services for BI and DV. I myself recently published a few articles on this topic, for example here: and here: . My opinions may differ from those of my guest bloggers. You can find many providers of DV and BI Cloud Services, including Spotfire Cloud, Tableau Online, GoodData, MicroStrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB, etc.

Let me introduce my 2nd guest blogger for this weekend: Ugur Kadakal is the CEO and founder of Pagos, Inc., located in Cambridge, MA. Pagos is the developer of SpreadsheetWEB, which transforms Excel spreadsheets into web-based Business Intelligence (BI) applications without any programming. SpreadsheetWEB can also convert PowerPivot files into web-based dashboards, and it provides advanced Data Visualization (DV) for SQL Server Analysis Services (Tabular) cubes without SharePoint. Mr. Kadakal has published a few articles on this blog before, with great feedback, so he is a serial guest blogger.


Before (or after) you read Mr. Kadakal’s article, I suggest reviewing the article comparing 5+ scenarios of Cloud Service revenue vs. the traditional one-time sale of software; see it here: . The illustration above is from that article.

Traditional BI versus Cloud BI

Over the past several years, we have been witnessing numerous transformations in the software industry, from the traditional on-premise deployment model to the Cloud. There are some application types for which the cloud makes a lot of sense, while for others it doesn’t. BI is somewhere in between.

Before I express my opinion on the subject of Traditional BI versus Cloud BI, I would like to clarify my definitions. I define traditional BI as large enterprise implementations that connect to many data sources in real time. These projects have many phases, require large teams, and can take years and cost millions of dollars to implement.

Many people define cloud BI as deployments on a proprietary, third-party, multi-tenant environment managed by a vendor. My definition is somewhat different and broader. Cloud BI is more about ease of deployment, use and management. While Cloud BI can be hosted and managed by a vendor, it can also be deployed on a private Cloud infrastructure like Amazon or Microsoft Azure. With the advancement of cloud infrastructure technologies like OpenStack, deploying and managing private cloud infrastructure is becoming easier for many enterprises. As a result, whether Cloud BI is deployed on a multi/single-tenant environment on vendor infrastructure, a third party cloud infrastructure like Amazon, Azure, etc. or on internal private cloud, it becomes more of a business decision rather than a technical limitation.


One main distinction between Traditional BI and Cloud BI is data management. Traditional BI implementations can have real-time data, as they can connect to the original data sources directly. I don’t believe that Cloud BI should deal with real-time data, even if implemented on internal private cloud infrastructure. Supporting real-time data is a requirement that makes any BI project complicated and costly. Hence Cloud BI solutions should include simple utilities (i.e., ETL tools) residing on local computers that periodically push internal data into the Cloud BI data model. Since Cloud BI should not deal with real-time scenarios, this data synchronization can be configured by the business user accordingly.
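A minimal sketch of such a push utility in Python, assuming a hypothetical local `sales` table and an invented payload shape (in a real deployment a scheduler would run this periodically and POST the payload to the Cloud BI vendor’s ingest API):

```python
import json
import sqlite3

def extract_batch(conn, since_id):
    """Pull rows added since the last sync from a local source table."""
    return conn.execute(
        "SELECT id, region, revenue FROM sales WHERE id > ? ORDER BY id",
        (since_id,),
    ).fetchall()

def to_payload(rows):
    """Shape rows as the JSON document a Cloud BI push API might accept."""
    return json.dumps({
        "table": "sales",
        "rows": [{"id": r[0], "region": r[1], "revenue": r[2]} for r in rows],
    })

# Local demo source standing in for an operational database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [(1, "East", 100.0), (2, "West", 250.0), (3, "East", 75.0)])

batch = extract_batch(conn, since_id=1)   # rows 2 and 3 are new since last sync
payload = to_payload(batch)
# A scheduler (cron, Windows Task Scheduler) would now POST `payload`
# to the vendor's ingest endpoint; the endpoint itself is out of scope here.
```

The business user would only configure the sync interval and the source tables, which is exactly the simplicity argued for above.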

Another distinction is the ease of implementation. Regardless of where it is deployed, Cloud BI solutions should take no more than a few hours to implement and configure. Some BI vendors already support images on Amazon cloud to simplify this process.

The traditional BI model typically requires significant upfront investments. Part of this investment is internal, while the rest is BI licensing and implementation fees. But the very nature of Cloud BI requires agility, from deployment to data management and dashboard creation. A Cloud BI project can be deployed easily, and it can also be modified and shut down with equal ease. Hence the traditional business model of large upfront investments doesn’t make sense here. The Cloud BI business model should be subscription-based, regardless of whether it is implemented on a vendor infrastructure or on an on-premise private cloud infrastructure. Customers should be able to pay for what they use and for how long they use it. Such simplicity will also eliminate the vendor lock-in risks that most enterprises have to mitigate.


In summary, there are many BI projects that will require traditional BI implementation. These projects typically require real-time data and connectivity to many different data sources. Cloud BI should not attempt to handle these types of projects. But there are many other BI projects that require neither real-time data nor the data which comes from different systems that should be connected. Cloud BI can handle these projects quickly and cost effectively, by empowering business users to manage the whole process without IT or external support. From discovery to data synchronization to dashboard creation and management, every activity can be handled by business users.

For this weekend I have 2 guest bloggers (one today and a second tomorrow) sharing their thoughts about Cloud Services for BI and DV. I myself recently published a few articles on this topic, for example here: and here: . My opinions may differ from those of my guest bloggers (see my comment below this article). You can find many providers of DV and BI Cloud Services, including Spotfire Cloud, Tableau Online, GoodData, MicroStrategy Cloud, Bime, Yellowfin, BellaDati, SpreadsheetWEB, etc.

Let me introduce my 1st guest blogger for this weekend: Mark Flaherty is Chief Marketing Officer at InetSoft Technology, a BI (Business Intelligence) software provider founded in 1996, headquartered in Piscataway, New Jersey, with over 150 employees worldwide. InetSoft’s flagship BI application, Style Intelligence, enables self-service BI spanning dashboarding, reporting, and visual analysis for enterprises and technology providers. The server-based application includes a data mashup engine for combining data from almost any data source, and browser-based design tools that power users and developers can use to quickly create interactive DV (Data Visualizations).


Are public BI cloud services really going to overtake the traditional on-premise deployment of BI tools?

(Author: Mark Flaherty. Text below contains Mark’s opinions and they can be different from opinions expressed on this blog).

It’s been six years since public BI cloud services came to be. Originally termed SaaS BI, public BI cloud services refers to commercial service providers who host a BI application in the public cloud that accesses corporate data housed in the corporate private cloud and/or other application providers’ networks. As recently as last month, an industry report from TechNavio said, “the traditional on-premise deployment of BI tools is slowly being taken over by single and multi-tenant hosted SaaS.” I have a feeling this is another one of those projections that copies a historical growth rate forward for the next five years. If you do that with any new offering that starts from zero, you will always project it to dominate a marketplace, right?

I thought it would be interesting to discuss why I think this won’t happen.


In general, there is one legitimate driving force for why companies look to cloud solutions that helps drive the demand for cloud BI services specifically: outsourcing of IT. The types of companies for whom this makes the most sense are small businesses. They have little or no IT staff to set up and support enterprise software, and they also have limited cap-ex budgets so software rentals fit their cash flow structure better. While this is where most of the success for cloud BI has happened, this is only a market segment opportunity. By no means do small companies dominate the IT marketplace.

Another factor for turning to public cloud solutions is expediency. Even at large companies where there is budget for software purchases, the Business sometimes becomes frustrated with the responsiveness of internal IT, and they look outside for a faster solution. This makes sense for domain-specific cases where there is a somewhat narrow scope of need, and the application and the data are self-contained. is the poster child for this case, where it can quickly be set up as a CRM for a sales team. Indeed the fast success of is a big reason why people think cloud solutions will take off in every domain.

But business intelligence is different. A BI tool is meant to span multiple information areas, from finance to sales to support and more. This is where it gets complicated for mid-sized and global enterprises. The expediency factor is nullified because the data that business users want to access with their cloud BI tool is controlled by IT, so they need to be involved. Depending on the organization’s policies and politics, this can either slow down such a move or kill it.

The very valid reason why enterprise IT would kill the idea of a public cloud BI solution is ultimately why I think public BI cloud services have such a limited opportunity in the overall market. One of IT’s responsibilities is ensuring data security, and they will rightly point out the security risks of opening access to sensitive corporate data to a third party. It’s one thing to trust a vendor with one set of data, like website visitor traffic, but trusting them with all of a company’s financial and customer data is where almost all companies will draw the line. This is a concern I don’t see ever going away.

What are some pieces of evidence that public BI cloud services have a limited market opportunity? When BI cloud services first came onto the scene, all of the big BI vendors dabbled in it. Now many no longer champion these hosted offerings, or they have shuttered or demoted them. IBM’s Cognos Express is now only an on-premise option. SAP BusinessObjects BI OnDemand can’t be found from SAP’s main site, but has its own micro site. Tibco’s Spotfire Cloud and Tableau Software’s Tableau Online are two exceptions among the better known BI providers that are still prominently marketed. However, Tibco positions this option for small businesses and workgroups and omits certain functionality.

Our company, too, experimented with a public BI cloud offering years ago. It was first targeted at customers who would want to mash up their CRM data with other enterprise-housed data. We found mostly small, budget-challenged companies in its customer base, and the few large enterprises that we found balked at the idea, asking instead for our software to be installed on-premise, where they would connect to any cloud-hosted data on their own. Today the only remaining cloud offering of ours is a free visualization service called Visualize Free, which is similar to Tableau Public or IBM’s Many Eyes.

Another observation: while there have been a handful of pure-play cloud BI vendors, one named Lucidera came and went quite quickly. Birst is one that seems to have found a successful formula.

In summary, yes, there is a place for public BI cloud services in the small business market, but no, it’s not going to overtake traditional on-premise BI.


Analytics extrapolates visible data into the future (“predicts”) and, through mathematical modeling, enables us to see more than 6-dimensional subsets of data. The ability to do this visually, interactively, and without programming vastly expands the number of potential users for Visual Analytics. I am honored to present one of the most advanced experts in this area, Mr. Cogswell: he decided to share his thoughts and be the guest blogger here. So the guest blog post below is written by Mr. Douglas Cogswell, the Founder, President and CEO of ADVIZOR Solutions Inc.

Formed in 2003, ADVIZOR combines data visualization and in-memory-data-management expertise with usability knowledge and predictive analytics to produce an easy to use, point and click product suite for business analysis. ADVIZOR’s Visual Discovery™ software spun out of a distinguished research heritage at Bell Labs that spans nearly two decades and produced over 20 patents.

Mr. Cogswell is a well-known thought leader, and below he discusses the next step in Data Visualization technology: the point where the limitations of the human eye prevent users from comprehending multidimensional (say, more than 6 dimensions) data patterns or from estimating/predicting future trends based on data from the past. Such multidimensional “comprehension” and estimation of future trends requires mathematical modeling in the form of Predictive Analytics, the natural extension of Data Visualization. This, in turn, requires the integration of Predictive Analytics and interactive Data Visualization. Such integration will be accepted much more easily by business users and analysts if it requires no coding.

Mr. Cogswell discusses the need for, and the possibility of, all that in his article (Copyright ADVIZOR Solutions, 2014) below.


Integrating Predictive Analytics and Interactive Data Visualization WITHOUT any Coding!

It’s a new year, and many organizations are mulling over how and where they will make new investments. One area getting a lot of attention these days is predictive analytics tools. The need to better understand the present and predict what might happen in the future for competitive advantage is enticing many to look at what these tools can do. TechRadar spoke with James Fisher, who said 85 percent of the organizations that have adopted these tools believe they have positively impacted their business.

Fast Fact Based Decision Making is Critical.

“Businesses are collecting information on their customers’ mobile habits, buying habits, web-browsing habits… The list really does go on,” he said. “However, it is what businesses do with that data that counts. Analytics technology allows organizations to analyze their customer data and turn it into actionable insights, in a way that benefits business.”

Interest in predictive analytics by businesses is expected to continue to grow well beyond this year, with Gartner reporting in early 2013 that approximately 70 percent of the best performing enterprises will either manage or have a view of their processes with predictive analytics tools by 2016. By doing this, businesses will gain a better sense of what is happening within their own networks and corporate walls, which actions could have the best impact and give increased visibility across their industries. This will give situational awareness across the business, making operating much easier than it has been in past years.

Simplicity and Ease of Use are Key.

Analytics is something every business should be figuring out. There are more software options than ever, so executives will need to figure out which solution will work best for them and their teams. According to InformationWeek’s Doug Henschen, the “2014 InformationWeek Analytics, Business Intelligence, and Information Management Survey” found that business users and salespeople need easy-to-use, visual data analytics that are intuitive and easily accessible from anywhere, at any time. These data visualization business intelligence tools can give a competitive edge to the companies adopting them.

“The demand for these more visual analytics tools leads to one of the biggest complaints about analytics,” he said. Ease-of-use challenges have crippled the utilization rate of this software.  But that is changing.  “Analytics and BI vendors know that IT groups are overwhelmed with requests for new data sources and new dimensions of data that require changes to reports and dashboards or, worse, changes to applications and data warehouses,” he wrote. “It’s no wonder that ‘self-service’ capabilities seem to be showing up in every BI software upgrade.”

A recent TDWI research report titled “Data Visualization and Discovery for Better Business Decisions” found that companies do have their future plans focused on these analytics and how they can use them. In fact, 60 percent said their organizations are currently using business visualization for snapshot reports, scorecards, or display. About one-third are using it for discovery and analysis and 26 percent for operational alerting. However, companies are looking to expand how they use the technology, as 45 percent are looking to adopt it for discovery and analysis, and 39 percent for alerts.

“Visualization is exciting, but organizations have to avoid the impulse to clutter users’ screens with nothing more than confusing ‘eye candy’,” TDWI’s David Stodder wrote. “One important way to do this is to evaluate closely who needs what kind of visualizations. Not all users may need interactive, self-directed visual discovery and analysis; not all need real-time operational alerting.”

Data Visualization & Predictive Analytics Naturally Complement Each Other.

Effective data visualizations are designed to complement human perception and our innate ability to see and respond to patterns.  We are wired as humans to perceive meaningful patterns, structure, and outliers in what we see.  This is critical to making smarter decisions and improving productivity, and essential to the broader trend towards self-directed analysis and BI reporting, and tapping into new sources of data.

Visualization also encourages “storytelling” and new forms of collaboration.  It makes it really easy to not only “see” stories in data, but also to highlight what is actionable to colleagues. 

On the other hand, the human mind is limited in its ability to “see” very many correlations at once.  While visualization is great for seeing patterns across 2, or 4 or maybe 6 criteria at a time, it breaks down when there are many more variables than that.  Very few people are able to untangle correlations and patterns across, say, 15 or 25 or 75 or in some cases 300+ criteria that exist in many corporate datasets.

Predictive Analytics, on the other hand, is not capacity constrained! It uses mathematical tools and statistical algorithms to examine and determine patterns in one set of data in order to predict behavior in another set of data. It integrates well with in-memory data and data visualization, and leads to faster and better decision making.

Making it Simple & Delivering Results.

The challenge is that most of the predictive analytics software tools on the market require the end-user to be able to program in SQL in order to prep data, and have some amount of statistics background to build models in R … or SPSS … or SAS.  At ADVIZOR Solutions our vision has been to empower business analysts and users to build predictive models without any code or statistics background.


The results have been extremely promising — inquisitive and curious-minded end-users with a sense for causality in their data can easily do this — and are turning around models in just a few hours.  The result is they are using data in new and powerful ways to make better business decisions.

Three Key Enablers to a Simple End-User Process.

The three keys to making this happen are:  (1) having all the relevant data offloaded from the database or datamart into RAM, (2) allowing the business user to explore it visually, and (3) providing a really simple modeling interface.

Putting the data in RAM is key to making it easy to condition so that the business user can create modeling factors (such as time lags, factors from data in multiple tables, etc.) without having to go back and condition data in the underlying databases — which is usually a time consuming process that involves coordinating with IT and/or DBAs. 

Allowing the business user to explore it visually is key to hypothesis generation and vetting about what really matters, before building and running models.

Providing really simple interfaces that automate the actual statistics part of the process lets the business user focus on their data, not the statistics of the model.  That simple modeling process includes:

  • Select the Target & Base Populations
    • The “target” is the group you want to study (e.g., people who responded to your campaign)
    • The “base” is the group you want to compare the target to (e.g., everybody who received the campaign)
  • Visually Explore the data and develop Hypotheses
    • This helps set up which explanatory fields to include …
    • … and which additional ones may need to be added
  • Select list of Explanatory Fields
    • The “explanatory fields” are the factors in your data that might explain what makes the target different from other entities in your data
  • Build Model
  • Iterate
  • Understand and Communicate what the model is telling you
  • Predict / Score Base Population
  • Get lists of Scored potential targets
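As an illustration of that flow (not ADVIZOR’s actual KXEN-based engine), here is a deliberately naive Python sketch: the “model” weights each explanatory field by a crude difference of means between target and base rather than a real regression fit, and all field names and values are invented:

```python
import math

# Toy dataset: each record is a campaign recipient (the "base" population);
# responded=True marks the "target" group we want to study.
base = [
    {"visits": 8, "tenure": 2, "responded": True},
    {"visits": 7, "tenure": 1, "responded": True},
    {"visits": 2, "tenure": 5, "responded": False},
    {"visits": 1, "tenure": 6, "responded": False},
    {"visits": 3, "tenure": 4, "responded": False},
]
fields = ["visits", "tenure"]          # the chosen explanatory fields

# "Build model": weight each field by how much its mean differs between
# the target group and the whole base -- a crude stand-in for a real fit.
target = [r for r in base if r["responded"]]
weights = {}
for f in fields:
    t_mean = sum(r[f] for r in target) / len(target)
    b_mean = sum(r[f] for r in base) / len(base)
    weights[f] = t_mean - b_mean

def score(record):
    """Logistic squash of the weighted sum -> a 0..1 propensity score."""
    z = sum(weights[f] * record[f] for f in fields)
    return 1.0 / (1.0 + math.exp(-z))

# "Predict / Score" the base population and list the best prospects first.
ranked = sorted(base, key=score, reverse=True)
```

The point-and-click process described above hides exactly these steps behind the interface; the iterate step corresponds to changing `fields` and re-scoring.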

Check out how you can do this with no code in this 8 min YouTube video.

Best Done In-house with Your Team.

In our experience this type of work is best done in-house with your team.  That’s because it’s not a “black box”, it’s a process.  And since your team knows the data and its context better than anybody else, they are the ones best suited to discuss, interpret, and apply the results.  In our experience, over and over again it has been proven that knowing the data and context is the key factor  … and that you don’t need a statistics degree to do this.


Quick Example: Consumer Packaged Goods Sales.

In recent client work a well known consumer packaged goods company was trying to untangle what was driving sales.  They had several key questions they were attempting to answer:

  • What factors drive sales?
  • How do peaks in incremental sales relate to the Social Media spikes?
    • For all brands
    • By each brand
  • How does it vary by media provider?  By type of post?
  • Can we use this data to forecast incremental sales? Which factors have the biggest impact?

They had lots of data, which included sales by brand by week, and a variety of potential influences which included:  a variety of their own promotions, call center stats, social media posts, and mined sentiment from those social media posts (e.g., was the post “positive”, “neutral”, or “negative”).   The key step in creating the right explanatory fields was developing time lags for each of these potential influences since the impact on sales was not necessarily immediate — for example, positive Twitter posts this week may have some impact on sales, but more likely the impact will be on sales +1 week, or maybe +2 weeks, or +4 weeks, etc. 
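The lag construction itself is simple; a small Python sketch with invented weekly numbers shows how one influence series becomes several lagged explanatory fields:

```python
# Weekly series: incremental sales and counts of positive social media posts.
sales = [100, 110, 180, 170, 120, 200, 210, 160]
posts = [  5,  30,   4,   6,  25,   8,   3,   2]

def lagged(series, lag):
    """Shift a weekly series forward by `lag` weeks (None where no history)."""
    return [None] * lag + series[:len(series) - lag]

# Explanatory fields: the same influence at lags of 1, 2 and 4 weeks,
# so a model can pick up delayed impact on sales.
features = {f"posts_lag{k}": lagged(posts, k) for k in (1, 2, 4)}
```

Each lagged column is then offered to the model as a separate explanatory field, and the model decides which lag (if any) actually explains the sales peaks.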

Powerful Results.

What we learned was that there were multiple influences and their intensity varied by brand. Seasonality was no longer the major driver.  New influences — including social media posts and online promotions — were now in the top spot.  We also learned that the key influences can and should be managed.  This was critical — there are lags between the impact of, for example, a negative Twitter post and when it hits sales. As a result, a quick positive response to a negative post can heavily offset that negative post.

In Summary.

An easy-to-use data discovery and analysis tool that integrates predictive analytics with interactive data visualization, placed in the hands of business analysts and end-users, can make a huge difference in how data is analyzed, how fast that can happen, and how the results are then communicated to and accepted by the decision makers in an organization.

And, stay tuned.  We’ll next be talking about the people side of predictive analytics — if there is now technology that lets you create and use models without writing any code, then what are the people skills and processes required to do this well?

This is Part 2 of the guest blog post: a review of Visual Discovery products from ADVIZOR Solutions, Inc., written by my guest blogger Mr. Srini Bezwada (his profile is here: ), who is the Director of Smart Analytics, a Sydney-based professional BI consulting firm that specializes in Data Visualization solutions. The opinions below belong to Mr. Srini Bezwada.

ADVIZOR Technology

ADVIZOR’s Visual Discovery™ software is built upon strong data visualization tools technology spun out of a distinguished research heritage at Bell Labs that spans nearly two decades and produced over 20 patents. Formed in 2003, ADVIZOR has succeeded in combining its world-leading data visualization and in-memory-data-management expertise with extensive usability knowledge and cutting-edge predictive analytics to produce an easy to use, point and click product suite for business analysis.

ADVIZOR readily adapts to business needs without programming and without implementing a new BI platform, leverages existing databases and warehouses, and does not force customers to build a difficult, time consuming, and resource intensive custom application. Time to deployment is fast, and value is high.

With ADVIZOR, data is loaded into a “Data Pool” in main memory on a desktop, laptop, or server. This enables sub-second response time on any query against any attribute in any table, and instantaneous updates of all visualizations. Multiple tables of data are easily imported from a variety of sources.

With ADVIZOR, there is no need to pre-configure data. ADVIZOR accesses data “as is” from various data sources, and links and joins the necessary tables within the software application itself. In addition, ADVIZOR includes an Expression Builder that can perform a variety of numeric, string, and logical calculations as well as parse dates and roll-up tables – all in-memory. In essence, ADVIZOR acts like a data warehouse, without the complexity, time, or expense required to implement a data warehouse! If a data warehouse already exists, ADVIZOR will provide the front-end interface to leverage the investment and turn data into insight.
Data in the memory pool can be refreshed from the core databases / data sources “on demand”, or at specific time intervals, or by an event trigger. In most production deployments data is refreshed daily from the source systems.
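A toy Python sketch of the in-memory idea (invented tables and an invented tax formula): link two tables, add a calculated column standing in for an Expression Builder formula, and roll up by region, all without touching the source database:

```python
# Two source tables held entirely in memory, loaded "as is".
orders = [
    {"order_id": 1, "cust_id": "A", "amount": 120.0},
    {"order_id": 2, "cust_id": "B", "amount":  80.0},
    {"order_id": 3, "cust_id": "A", "amount":  50.0},
]
customers = {"A": {"region": "East"}, "B": {"region": "West"}}

# "Link" the tables inside the tool rather than in the source database,
# and add a calculated column (a stand-in for an Expression Builder formula).
linked = [
    {**o,
     "region": customers[o["cust_id"]]["region"],
     "amount_with_tax": round(o["amount"] * 1.10, 2)}
    for o in orders
]

# A roll-up by region, again computed entirely in memory.
rollup = {}
for row in linked:
    rollup[row["region"]] = rollup.get(row["region"], 0) + row["amount"]
```

Because everything stays in RAM, re-running the join or changing the calculated column is instant; only the periodic refresh goes back to the source systems.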

Data Visualization

ADVIZOR’s Visual Discovery™ is a full visual query and analysis system that combines the excitement of presentation graphics – used to see patterns and trends and identify anomalies in order to understand “what” is happening – with the ability to probe, drill-down, filter, and manipulate the displayed data in order to answer the “why” questions. Conventional BI approaches (pre-dating the era of interactive Data Visualization) to making sense of data have involved manipulating text displays such as cross tabs, running complex statistical packages, and assembling the results into reports.

ADVIZOR’s Visual Discovery™ makes the text and graphics interactive. Not only can the user gain insight from the visual representation of the data, but additional insight can be obtained by interacting with the data in any of ADVIZOR’s fifteen (15) interactive charts, using color, selection, filtering, focus, viewpoint (panning, zooming), labeling, highlighting, drill-down, re-ordering, and aggregation.

Visual Discovery empowers the user to leverage his or her own knowledge and intuition to search for patterns, identify outliers, pose questions and find answers, all at the click of a mouse.

Flight Recorder – Track, Save, Replay your Analysis Steps

The Flight Recorder tracks each step in a selection and analysis process. It provides a record of those steps and can be used to repeat previous actions. This is critical for providing context to what an end-user has done and where they are in their data. Flight records also allow setting bookmarks, and can be saved and shared with other ADVIZOR users.
The Flight Recorder is unique to ADVIZOR. It provides:
• A record of what a user has done. Actions taken and selections from charts are listed. Small images of charts that have been used for selection show the selections that were made.
• A place to collect observations by adding notes and capturing images of other charts that illustrate observations.
• A tool that can repeat previous actions, in the same session on the same data or in a later session with updated data.
• The ability to save and name bookmarks, and share them with other users.
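The pattern is essentially a replayable action log. A minimal Python sketch (the action names and data are invented, not ADVIZOR’s format): each analysis step is recorded as a tuple, so a saved sequence can be re-run later, possibly against refreshed data:

```python
# A minimal action log in the spirit of a "flight recorder".
recorder = []

def select_rows(data, field, value):
    """A recorded analysis step: filter rows and log the action taken."""
    recorder.append(("select", field, value))
    return [r for r in data if r[field] == value]

def replay(data, log):
    """Re-run a saved sequence of steps against (possibly updated) data."""
    out = data
    for action, field, value in log:
        if action == "select":
            out = [r for r in out if r[field] == value]
    return out

data = [{"region": "East", "won": True},
        {"region": "West", "won": False},
        {"region": "East", "won": False}]

step1 = select_rows(data, "region", "East")
saved = list(recorder)                      # a shareable "bookmark"

refreshed = data + [{"region": "East", "won": True}]
replayed = replay(refreshed, saved)         # same steps, updated data
```

Sharing the saved log with a colleague gives them both the result and the path that produced it, which is the collaboration benefit described above.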

Predictive Analytics Capability

ADVIZOR Analyst/X is a predictive analytics solution based on a robust multivariate regression algorithm from KXEN – a leading-edge advanced data mining tool that models data easily and rapidly while maintaining relevant and readily interpretable results.
Visualization empowers the analyst to discover patterns and anomalies in data by noticing unexpected relationships or by actively searching. Predictive analytics (sometimes called “data mining”) provides a powerful adjunct to this: algorithms are used to find relationships in data, and these relationships can be used with new data to “score” or “predict” results.


Predictive analytics software from ADVIZOR doesn’t require enterprises to purchase a separate analytics platform. And, since all the data is in memory, a business analyst can quickly and easily condition data and flag fields across multiple tables without having to go back to IT or a DBA to prep database tables. The interface is entirely point-and-click; there are no scripts to write. The biggest benefit of the multi-dimensional visual solution is how quickly it delivers analysis: solving critical business questions, facilitating intelligence-driven decision making, and providing instant answers to “what if?” questions.

Advantages over Competitors:

• The only product on the market offering a combination of Predictive Analytics + Data Visualisation + in-memory data management within one application.
• The cost of entry is lower than the market-leading data visualization vendors for desktop and server deployments.
• Advanced visualizations like Parabox and Network Constellation, in addition to the usual bar charts, scatter plots, line charts, pie charts…
• Integration with leading CRM vendors like Blackbaud, Ellucian, Information Builders.
• Ability to provide sub-second response time on any query against any attribute in any table, and to instantaneously update all visualizations.
• Flight Recorder that lets you track, replay, and save your analysis steps for reuse by yourself or others.

Update on 5/1/13 (by Andrei): ADVIZOR 6.0 is available now with substantial enhancements:

If you visited my blog before, you know that my classification of Data Visualization and BI vendors is different from that of researchers like Gartner. In addition to the 3 DV Leaders – Qlikview, Tableau, Spotfire – I rarely have time to talk about other “me too” vendors.

However, sometimes products like Omniscope, MicroStrategy’s Visual Insight, the Microsoft BI stack (Power View, PowerPivot, Excel 2013, SQL Server 2012, SSAS, etc.), ADVIZOR, SpreadsheetWEB, etc. deserve attention too. Covering them takes a lot of time, though, so I am trying to find guest bloggers for topics like that. 7 months ago I invited volunteers to do some guest blogging about ADVIZOR Visual Discovery products:

So far nobody in the USA or Europe committed to do so, but recently Mr. Srini Bezwada, a certified Tableau consultant and ADVIZOR-trained expert from Australia, contacted me and submitted an article about it. He also provided me with info on how ADVIZOR compares with Tableau, so I will do that briefly, using his data and opinions. Mr. Bezwada can be reached at , where he is a director at

Below is a quick comparison of ADVIZOR with Tableau. The opinions below belong to Mr. Srini Bezwada. The next blog post will continue this article about ADVIZOR Solutions products; see also ADVIZOR’s website here:

Criteria | Tableau | ADVIZOR | Comment
Time to implement | Very Fast | Fast; ADVIZOR can be implemented within days | Tableau Leads
Scalability | Very Good | Very Good | Tableau: virtual RAM
Desktop License | $1,999 | $1,999 | $3,999 for AnalystX with Predictive Modeling
Server License/user | $1K, min 10 users; $299K for Enterprise Deployment | $8K license for up to 10 named users | ADVIZOR is a lot cheaper for Enterprise Deployment: $75K for 500 users
Support fees / year |  |  | 1st year included
SaaS Platform | Core or Digital | Offers Managed Hosting | ADVIZOR Leads
Overall Cost | Above Average | Competitive | ADVIZOR costs less
Enterprise Ready | Good for SMB | Cheaper cost model for SMB | Tableau is expensive for Enterprise Deployment
Long-term viability | Fastest growth | Private company since 2003 | Tableau is going IPO in 2013
Mindshare | Tableau Public | Growing fast | Tableau stands out
Big Data Support | Good | Good | Tableau is 32-bit
Partner Network | Good | Limited partnerships | Tableau Leads
Data Interactivity | Excellent | Excellent |
Visual Drilldown | Very Good | Very Good |
Offline Viewer | Free Reader | None | Tableau stands out
Analyst’s Desktop | Tableau Professional | Advizor has Predictive Modeling | ADVIZOR is a value for money
Dashboard Support | Excellent | Very Good | Tableau Leads
Web Client | Very Good | Good | Tableau Leads
64-bit Desktop | None | Very Good | Tableau is still a 32-bit app
Mobile Clients | Very Good | Very Good |
Visual Controls | Very Good | Very Good |
Data Integration | Excellent | Very Good | Tableau Leads
Development | Tableau Pro | ADVIZOR Analyst |
64-bit in-RAM DB | Good | Excellent | Advizor Leads
Mapping support | Excellent | Average | Tableau stands out
Modeling, Analytics | Below Average | Advanced Predictive Modeling | ADVIZOR stands out
Predictive Modeling | None | Advanced Predictive Modeling capability with built-in KXEN algorithms | ADVIZOR stands out
Flight Recorder | None | Flight recorder lets you track, replay, and save your analysis steps for reuse by yourself or others | ADVIZOR stands out
Visualization | 22 chart types | All common charts (bar charts, scatter plots, line charts, pie charts) are supported | Advizor has advanced visualizations like Parabox, Network Constellation
Third party integration | Many data connectors; see Tableau’s drivers page | ADVIZOR integrates well with CRM software: Ellucian, Blackbaud and others | ADVIZOR leads in CRM area
Training | Free online and paid classroom | Free online and paid via company trainers & partners | Tableau Leads

This is a guest post from Marc Gedansky, a well-known sales and marketing consultant in the Business Intelligence space. Marc writes and speaks frequently on a variety of issues that influence technology providers and users, and is based in Cambridge, MA. I am fortunate to have known Marc as a Business Intelligence and Data Visualization expert and as my friend for many years.

Recently I noticed that the internet (thanks to the big data wave and to easy-to-use Data Visualization tools) is polluted with a lot of useless dashboards, and I spoke with Marc about this topic. It turned out he has a very good explanation for it, and he was kind enough to share his opinion on this blog as a guest blogger. Marc’s post reminded me of the old story:

An admirer asked Michelangelo how he sculpted the famous statue of David that now sits in the Accademia Gallery in Florence. How did he craft this masterpiece of form and beauty? Michelangelo offered this strikingly simple description: he first fixed his attention on the slab of raw marble. He studied it and then “chipped away all that wasn’t David.”


Dashboards – why are so many useless?

Marc Gedansky

“Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.” – Antoine de Saint-Exupery

Most dashboards are designed with no appreciation of the meaning or importance of this quote.

(BTW, even though this is a blog about data visualization, I (M.G.) won’t show any poorly designed dashboard examples, as they are ubiquitous. Trying to find them is about as difficult as trying to find leaves on the ground in New England during the Fall.)

I view dashboards every day; on software company sites, news sites, financial sites, and blogs.  Since dashboards can distill so much information and display it in such a small space, they hold the potential of quickly delivering valuable insights; of cutting through the “data clutter” to immediately reveal important trends or truths.

So why then, are most dashboards crammed with so many charts, dials, and graphs that they overwhelm you?  Just because you can fit a half-dozen on a screen, why is there a need to do it?  (This approach reminds me of my friend Geoff, who, upon hearing that Hellmann’s was coming out with mayonnaise that had half the calories remarked, “great, now I can eat twice as much”.)

I think there can only be two reasons.

1. The designer/developer wants to show off their expertise with Qlikview, or Spotfire, or Tableau, or X product.

2. The designer/developer does not care about the average person, and wants to build smart software for brilliant users.

That attitude reminds me of a meeting I attended at a software company a few years ago.  The head of development was upset because he was being asked to make his software “easy to use”.    He called it “dumbing down”, and complained that it would be less challenging for his development team to build “software for idiots”.  At this point, the President of the company interjected, “if our customers are smart enough to write us a check, then they are smart enough to use our software.  And the onus for them to be able to use our software is on us, not on them.”

For the continuation of this post, please see this blog’s page:

Below is Part 3 of the guest post by my guest blogger Dr. Kadakal (CEO of Pagos, Inc.). This article is about how to build dashboards and Data Visualizations with Excel. The topic is large: the first portion of the article (published on this blog 3 weeks ago) contains the general Introduction and Part 1, “Use of Excel as a BI Platform Today”. Part 2, “Dos and Don’ts of building dashboards in Excel”, was published 2 weeks ago, and Part 3, “Publishing Excel dashboards to the Internet”, starts below; its full text is here.

As I said many times, BI is just a marketing umbrella for multiple products and technologies, and Data Visualization recently became one of the most important among them. Data Visualization (DV) so far is a very focused technology, and the article below shows how to publish Excel Data Visualizations and Dashboards on the Web. A few vendors provide tools to publish Excel-based dashboards on the Web, including Microsoft, Google, Zoho, Pagos and 4+ other vendors:

I leave it to the reader to decide whether other vendors can compete in the business of publishing Excel-based dashboards on the Web, but the author of the article below provides three very good criteria for selecting the vendor, tool and technology for it (and when I applied them myself they left me with only 2 choices – the same as described in the article).

Author: Ugur Kadakal, Ph.D., CEO and founder of Pagos, Inc. 

Publishing of Excel Dashboards on the Internet


In the previous article (see “Excel as BI Platform” here) I discussed Excel’s use as a Business Intelligence platform and why it is exceedingly popular software among business users. In the 2nd article (“Dos & Don’ts of Building Successful Dashboards in Excel”) I talked about some of the principles to follow when building a dashboard or a report in Excel. Together these form a discussion of why Excel is the most powerful self-service BI platform.

However, one of the most important facets of any BI platform is web enablement and collaboration. It is important for business users to be able to create their own dashboards, but it is equally important for them to be able to distribute those dashboards securely over the web. In this article, I will discuss two technologies that enable business users to publish and distribute their Excel-based dashboards over the web.

Selection Criteria

The following criteria were selected in order to compare the products:

  1. Ability to convert a workbook with most Excel-supported features into a web based application with little to no programming.
  2. Dashboard management, security and access control capabilities that can be handled by business users.
  3. On-premise, server-based deployment options.

Criterion #3 eliminates online spreadsheet products such as Google Docs or Zoho. As much as I support cloud-based technologies, in order for a BI product to be successful it should have on-premise deployment options. Without on-premise deployment, you forgo the possibility of integrating with other data sources within an organization.
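To make criterion #1 concrete, here is a minimal, purely illustrative Python sketch (not from any of the products discussed; all names are hypothetical) of the smallest possible "spreadsheet to web" step: rendering exported sheet data as an HTML table. Real conversion products automate vastly more than this – formulas, charts, interactivity, security and access control – which is exactly why the criteria above matter.

```python
import csv
import html
import io

def table_to_html(csv_text: str) -> str:
    """Render tabular data (e.g. a sheet exported from Excel as CSV)
    as an HTML table.

    A toy sketch of the kind of conversion web-enablement products
    automate; it handles only static values, not formulas or formatting.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    # First row becomes the header; escape cells to keep the HTML safe.
    head = "".join(f"<th>{html.escape(c)}</th>" for c in rows[0])
    body = "".join(
        "<tr>" + "".join(f"<td>{html.escape(c)}</td>" for c in row) + "</tr>"
        for row in rows[1:]
    )
    return f"<table><tr>{head}</tr>{body}</table>"

print(table_to_html("Region,Sales\nEast,100\nWest,80"))
```

The gap between this sketch and a usable dashboard (live data connections, user management, on-premise deployment) is what the three criteria are meant to probe.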

There are other web-based Excel conversion products on the market, but none of them meet the criterion of supporting most Excel features relevant to BI; therefore, they were not included in this article.
