Analytics extrapolates visible data into the future ("predicts") and, through mathematical modeling, lets us see patterns in more than 6-dimensional subsets of data. The ability to do this visually, interactively, and without programming vastly expands the number of potential users for Visual Analytics. I am honored to present one of the most advanced experts in this area, Mr. Douglas Cogswell, who agreed to share his thoughts as a guest blogger here. The guest blog post below is written by Mr. Cogswell, the Founder, President and CEO of ADVIZOR Solutions, Inc.

Formed in 2003, ADVIZOR combines data visualization and in-memory-data-management expertise with usability knowledge and predictive analytics to produce an easy-to-use, point-and-click product suite for business analysis. ADVIZOR's Visual Discovery™ software spun out of a distinguished research heritage at Bell Labs that spans nearly two decades and produced over 20 patents.

Mr. Cogswell is a well-known thought leader, and below he discusses the next step in data visualization technology: the limitations of the human eye prevent users from comprehending multidimensional (say, more than six dimensions) data patterns, or from estimating and predicting future trends from past data. Such multidimensional "comprehension" and estimation of future trends requires mathematical modeling in the form of predictive analytics, the natural extension of data visualization. This, in turn, requires the integration of predictive analytics with interactive data visualization. Such integration will be accepted much more readily by business users and analysts if it requires no coding.

Mr. Cogswell discusses the need for, and the feasibility of, exactly that in his article (Copyright ADVIZOR Solutions, 2014) below.


Integrating Predictive Analytics and Interactive Data Visualization WITHOUT any Coding!

It’s a new year, and many organizations are mulling over how and where they will make new investments. One area  getting a lot of attention these days is predictive analytics tools. The need  to better understand the present and predict what might happen in the future for competitive advantage is enticing many to look at what these tools can do. TechRadar spoke with James Fisher, who said 85 percent of the organizations that have adopted these tools believe they have positively impacted their business.

Fast Fact Based Decision Making is Critical.

“Businesses are collecting information on their customers’ mobile habits, buying habits, web-browsing habits… The list really does go on,” he said. “However, it is what businesses do with that data that counts. Analytics technology allows organizations to analyze their customer data and turn it into actionable insights, in a way that benefits business.”

Interest in predictive analytics by businesses is expected to continue to grow well beyond this year, with Gartner reporting in early 2013 that approximately 70 percent of the best performing enterprises will either manage or have a view of their processes with predictive analytics tools by 2016. By doing this, businesses will gain a better sense of what is happening within their own networks and corporate walls, which actions could have the best impact and give increased visibility across their industries. This will give situational awareness across the business, making operating much easier than it has been in past years.

Simplicity and Ease of Use are Key.

Analytics is something every business should be figuring out.  There are more software options than ever, so executives will need to figure out which solution will work best for them and their teams. According to InformationWeek's Doug Henschen, the "2014 InformationWeek Analytics, Business Intelligence, and Information Management Survey" found that business users and salespeople need easy-to-use, visual data analytics that is intuitive and easily accessible from anywhere, any time. These data visualization business intelligence tools can give a competitive edge to the companies adopting them.

“The demand for these more visual analytics tools leads to one of the biggest complaints about analytics,” he said. Ease-of-use challenges have crippled the utilization rate of this software.  But that is changing.  “Analytics and BI vendors know that IT groups are overwhelmed with requests for new data sources and new dimensions of data that require changes to reports and dashboards or, worse, changes to applications and data warehouses,” he wrote. “It’s no wonder that ‘self-service’ capabilities seem to be showing up in every BI software upgrade.”

A recent TDWI research report by David Stodder, titled "Data Visualization and Discovery for Better Business Decisions," found that companies do have their future plans focused on these analytics and how they can use them. In fact, 60 percent said their organizations are currently using business visualization for snapshot reports, scorecards, or display. About one-third are using it for discovery and analysis and 26 percent for operational alerting. However, companies are looking to expand how they use the technology, as 45 percent are looking to adopt it for discovery and analysis, and 39 percent for alerts.

“Visualization is exciting, but organizations have to avoid the impulse to clutter users’ screens with nothing more than confusing ‘eye candy’,” Stodder wrote. “One important way to do this is to evaluate closely who needs what kind of visualizations. Not all users may need interactive, self-directed visual discovery and analysis; not all need real-time operational alerting.”

Data Visualization & Predictive Analytics Naturally Complement Each Other.

Effective data visualizations are designed to complement human perception and our innate ability to see and respond to patterns.  We are wired as humans to perceive meaningful patterns, structure, and outliers in what we see.  This is critical to making smarter decisions and improving productivity, and essential to the broader trend towards self-directed analysis and BI reporting, and tapping into new sources of data.

Visualization also encourages “storytelling” and new forms of collaboration.  It makes it really easy to not only “see” stories in data, but also to highlight what is actionable to colleagues. 

On the other hand, the human mind is limited in its ability to “see” very many correlations at once.  While visualization is great for seeing patterns across 2, or 4 or maybe 6 criteria at a time, it breaks down when there are many more variables than that.  Very few people are able to untangle correlations and patterns across, say, 15 or 25 or 75 or in some cases 300+ criteria that exist in many corporate datasets.

Predictive Analytics, on the other hand, is not capacity constrained!  It uses mathematical tools and statistical algorithms to examine and determine patterns in one set of data in order to predict behavior in another set of data.  It integrates well with in-memory data and data visualization, and leads to faster and better decision making.
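To make the contrast concrete, here is a minimal sketch (in Python with scikit-learn, using synthetic data — the column counts and names are hypothetical, not from any ADVIZOR dataset) of how a statistical model can rank influences across far more variables than the eye can track at once:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))   # 50 candidate criteria -- too many to "see"
# Only columns 3 and 17 actually drive the outcome in this synthetic example.
y = (X[:, 3] + 0.5 * X[:, 17] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
# Order all 50 columns by the magnitude of their influence on the outcome.
ranked = np.argsort(-np.abs(model.coef_[0]))
```

A person cannot eyeball 50 scatter plots at once, but the model surfaces the true drivers near the top of `ranked` in milliseconds.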

Making it Simple & Delivering Results.

The challenge is that most of the predictive analytics software tools on the market require the end-user to be able to program in SQL in order to prep data, and have some amount of statistics background to build models in R … or SPSS … or SAS.  At ADVIZOR Solutions our vision has been to empower business analysts and users to build predictive models without any code or statistics background.


The results have been extremely promising — inquisitive and curious-minded end-users with a sense for causality in their data can easily do this — and are turning around models in just a few hours.  The result is they are using data in new and powerful ways to make better business decisions.

Three Key Enablers to a Simple End-User Process.

The three keys to making this happen are:  (1) having all the relevant data offloaded from the database or datamart into RAM, (2) allowing the business user to explore it visually, and (3) providing a really simple modeling interface.

Putting the data in RAM is key to making it easy to condition so that the business user can create modeling factors (such as time lags, factors from data in multiple tables, etc.) without having to go back and condition data in the underlying databases — which is usually a time consuming process that involves coordinating with IT and/or DBAs. 
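As an illustration of that in-memory conditioning step, here is a hypothetical sketch in Python with pandas (the tables and column names are invented for the example): two tables are joined and a new modeling factor is derived entirely in RAM, without touching the underlying database.

```python
import pandas as pd

# Two in-memory tables: weekly sales and weekly campaign activity.
sales = pd.DataFrame({"week": [1, 2, 3], "units": [120, 150, 90]})
promos = pd.DataFrame({"week": [1, 2, 3], "emails_sent": [500, 0, 800]})

# Join across tables and derive a factor spanning both --
# no round trip to IT or the DBAs required.
frame = sales.merge(promos, on="week")
frame["promo_week"] = frame["emails_sent"] > 0
```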

Allowing the business user to explore it visually is key to hypothesis generation and vetting about what really matters, before building and running models.

Providing really simple interfaces that automate the actual statistics part of the process lets the business user focus on their data, not the statistics of the model.  That simple modeling process includes:

  • Select the Target & Base Populations
    • The “target” is the group you want to study (e.g., people who responded to your campaign)
    • The “base” is the group you want to compare the target to (e.g., everybody who received the campaign)
  • Visually Explore the data and develop Hypotheses
    • This helps set up which explanatory fields to include …
    • … and which additional ones may need to be added
  • Select list of Explanatory Fields
    • The “explanatory fields” are the factors in your data that might explain what makes the target different from other entities in your data
  • Build Model
  • Iterate
  • Understand and Communicate what the model is telling you
  • Predict / Score Base Population
  • Get lists of Scored potential targets
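The steps above can be sketched in code. This is only an illustration of the general shape of the process — ADVIZOR's actual engine is not shown here, and the dataset and column names (a campaign with donor-style fields) are hypothetical — using Python with scikit-learn:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Base population: everybody who received the campaign.
# Target: the subset who responded (flagged in the "responded" column).
base = pd.DataFrame({
    "gifts_last_year": [0, 2, 5, 1, 0, 3, 4, 0],
    "events_attended": [1, 3, 4, 0, 0, 2, 5, 1],
    "responded":       [0, 1, 1, 0, 0, 1, 1, 0],
})

# Explanatory fields: factors that might distinguish the target group.
explanatory = ["gifts_last_year", "events_attended"]
model = LogisticRegression().fit(base[explanatory], base["responded"])

# Score the base population: how closely each entity resembles the target.
base["score"] = model.predict_proba(base[explanatory])[:, 1]
scored_targets = base.sort_values("score", ascending=False)
```

In a no-code tool the analyst makes these same choices — target, base, explanatory fields — by pointing and clicking, then iterates on the field list as the visual exploration suggests new hypotheses.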

Check out how you can do this with no code in this 8 min YouTube video.

Best Done In-house with Your Team.

In our experience this type of work is best done in-house with your team.  That's because it's not a "black box", it's a process.  And since your team knows the data and its context better than anybody else, they are the ones best suited to discuss, interpret, and apply the results.  Over and over again it has been proven that knowing the data and context is the key factor, and that you don't need a statistics degree to do this.


Quick Example: Consumer Packaged Goods Sales.

In recent client work a well known consumer packaged goods company was trying to untangle what was driving sales.  They had several key questions they were attempting to answer:

  • What factors drive sales?
  • How do peaks in incremental sales relate to the Social Media spikes?
    • For all brands
    • By each brand
  • How does it vary by media provider?  By type of post?
  • Can we use this data to forecast incremental sales? Which factors have the biggest impact?

They had lots of data, which included sales by brand by week, and a variety of potential influences which included:  a variety of their own promotions, call center stats, social media posts, and mined sentiment from those social media posts (e.g., was the post “positive”, “neutral”, or “negative”).   The key step in creating the right explanatory fields was developing time lags for each of these potential influences since the impact on sales was not necessarily immediate — for example, positive Twitter posts this week may have some impact on sales, but more likely the impact will be on sales +1 week, or maybe +2 weeks, or +4 weeks, etc. 
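That lagging step is mechanical once the data is in memory. Here is a hypothetical sketch in Python with pandas (the figures are invented, not the client's data) of building lagged explanatory fields so each week's sales can be paired with posts from one, two, or four weeks earlier:

```python
import pandas as pd

weekly = pd.DataFrame({
    "week": range(1, 9),
    "positive_posts":    [10, 40, 12, 8, 50, 9, 11, 7],
    "incremental_sales": [100, 105, 130, 102, 98, 140, 101, 99],
})

# Shift the influence column so each row pairs this week's sales
# with posts from earlier weeks -- the candidate lagged effects.
for lag in (1, 2, 4):
    weekly[f"positive_posts_lag{lag}"] = weekly["positive_posts"].shift(lag)
```

The model can then tell you which lag, if any, actually explains the sales spikes.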

Powerful Results.

What we learned was that there were multiple influences and their intensity varied by brand. Seasonality was no longer the major driver.  New influences — including social media posts and online promotions — were now in the top spot.  We also learned that the key influences can and should be managed.  This was critical: there is a lag between, for example, a negative Twitter post and when its impact hits sales. As a result, a quick positive response to a negative post can heavily offset it.

In Summary.

An easy-to-use data discovery and analysis tool that integrates predictive analytics with interactive data visualization, placed in the hands of business analysts and end-users, can make a huge difference in how data is analyzed, how fast that can happen, and how the results are then communicated to and accepted by the decision makers in an organization.

And, stay tuned.  We’ll next be talking about the people side of predictive analytics — if there is now technology that lets you create and use models without writing any code, then what are the people skills and processes required to do this well?