Can Tableau store huge amounts of data in its in-memory engine? How do you handle huge volumes of data in Tableau?
Yes, Tableau can store huge amounts of data, but there is no single answer to this question: how much you can store depends on the Tableau Server implementation and configuration (8 cores, 16 cores, etc.).
Whether Hyper is used, the amount of server memory, and other factors also influence the volume of data that can be stored.
One important point to remember when loading huge amounts of data into Tableau Server: the volume should not hurt the performance and response time of the dashboards, or the extract processing time. This is where Hyper improves Tableau performance.
The FACT:
If we design the dashboard and its referring views properly, we do not need to store huge amounts of data in Tableau. At the end of the day, users see a visual dashboard that contains a couple of GBs of data (the majority of the time, less than 5 GB).
That is because a dashboard presents consolidated information: averages, sums, calculation results, drill-downs, roll-ups, etc. If the dashboard does not present raw detailed data, there is no need for huge amounts of data in Tableau.
When the dashboard needs trend analysis based on detailed historical data, we may need more storage, but even this can be consolidated. For example, instead of second- or minute-level raw data, it can be rolled up into hourly, daily, or weekly data to reduce the volume and improve performance.
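As a minimal sketch of this roll-up idea, the snippet below consolidates hypothetical per-second readings into hourly totals before they would ever reach Tableau. The sample data, field names, and the `consolidate_hourly` helper are all illustrative assumptions, not part of any Tableau API; in practice the same aggregation is usually done in the source database or in Tableau Prep.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical per-second sensor readings: (timestamp, value).
# Two hours of data, one reading per second for the first minute of each hour.
raw = [(datetime(2024, 1, 1, 9, 0, s), 1.0) for s in range(60)] + \
      [(datetime(2024, 1, 1, 10, 0, s), 2.0) for s in range(60)]

def consolidate_hourly(rows):
    """Roll per-second rows up to hourly sums before loading into Tableau."""
    buckets = defaultdict(float)
    for ts, value in rows:
        # Truncate each timestamp to the top of its hour.
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[hour] += value
    return dict(sorted(buckets.items()))

hourly = consolidate_hourly(raw)
# 120 raw rows collapse into 2 hourly rows
```

The same pattern extends to daily or weekly buckets by truncating the timestamp further; the dashboard then reads only the consolidated rows, which is what keeps the extract small and fast.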
If for some reason users do need second- or minute-level data, it is still possible to deliver all of the detailed high-volume data by using the right design techniques.