More than two years ago I estimated the footprint of a sample dataset (428,999 rows and 135 columns) when encapsulated in a text file, in a compressed ZIP archive, in Excel 2010, PowerPivot 2010, Qlikview 10, Spotfire 3.3 and Tableau 6. Since then everything has been upgraded to the "latest versions", and everything is 64-bit now: Tableau 8.1, Spotfire 5.5 (and 6), Qlikview 11.2, Excel 2013 and PowerPivot 2013.

I decided to use a new dataset with exactly 1,000,000 rows (1 million rows) and 15 columns, with the following diversity of values (distinct counts for every column below):
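The per-column distinct counts can be computed directly from the source file before loading it into any of the tools; a minimal sketch using only the Python standard library (the file name passed in is whatever CSV holds the dataset):

```python
import csv

def distinct_counts(path):
    """Return {column name: number of distinct values} for a CSV file."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        seen = [set() for _ in header]          # one set of values per column
        for row in reader:
            for col_values, value in zip(seen, row):
                col_values.add(value)
    return dict(zip(header, (len(s) for s in seen)))
```

Columns with few distinct values compress very well (both in ZIP archives and in the columnar in-memory stores of Qlikview, PowerPivot and Tableau), so these counts matter for the footprint comparison below.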

Then I put this dataset into every application and format mentioned above and measured its footprint both on disk and in memory. All results are presented below for review by DV blog visitors:
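The on-disk part of the comparison is simply the size of each saved artifact; a sketch of how it can be collected (the file names here are hypothetical placeholders for the files each tool produces):

```python
import os

# Hypothetical artifact names, one per format compared in this post.
ARTIFACTS = ["data.csv", "data.zip", "data.xlsx",
             "data.dxp", "data.qvw", "data.twbx"]

def disk_footprints(paths):
    """Return {path: size in megabytes} for each file that exists."""
    return {p: round(os.path.getsize(p) / 2**20, 2)
            for p in paths if os.path.exists(p)}
```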

Some comments about application specifics:

  • Excel and PowerPivot XLSX files are ZIP-compressed archives of a bunch of XML files

  • Spotfire DXP is a ZIP archive of a proprietary Spotfire text format

  • QVW is Qlikview’s proprietary, RAM-optimized datastore format

  • TWBX is a Tableau-specific ZIP archive containing a TDE (Tableau Data Extract) and a data-less TWB workbook (an XML format)

  • I calculated the in-memory footprint as the RAM difference between the freshly launched application (with no data loaded) and the same application after it loads the corresponding file (XLSX, DXP, QVW or TWBX)
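Since XLSX, DXP and TWBX are all ZIP containers, their contents and compression ratios can be inspected with the standard `zipfile` module; a sketch (any of the ZIP-based files mentioned above could be passed in):

```python
import zipfile

def zip_summary(path):
    """List each member of a ZIP container with its compressed and
    uncompressed sizes, and return the overall compression ratio."""
    with zipfile.ZipFile(path) as z:
        members = [(i.filename, i.compress_size, i.file_size)
                   for i in z.infolist()]
    packed = sum(c for _, c, _ in members)
    raw = sum(u for _, _, u in members)
    ratio = packed / raw if raw else 1.0
    return members, ratio
```

Running this against an XLSX or TWBX file shows why the on-disk footprints are so much smaller than the text original: the XML (or extract) members inside deflate heavily.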