Tableau best practices

Tableau seeks to change how people view, interact with, and understand data. It is at its best when used to create workbooks that are visual; interactive, delivered to users on their desktops, over the web, or on mobile devices; iterative, speeding the cycle from question to insight to question so that users can quickly develop a hypothesis, test it with available data, revise it, and test it again; and fast, simple, and ubiquitous, available on the desktop, on the web, on mobile devices, and embedded in other applications and documents. In the previous article we introduced Tableau; in this article we dig into some Tableau best practices.

The Lifecycle of Tableau Development

Business Use Cases- Define exhaustive use cases and the KPIs/metrics to be measured for each of them.

Wireframes- Create a mockup of the visuals that will be included in the dashboard and get a signoff from the respective stakeholders. This ensures that the dashboard matches the users’ needs and avoids rework that might come up in the future.

Data Requirements- Create comprehensive data requirements that capture the data needed at different levels and dimensions.

Data Engineering- Collate data from multiple sources and create a backend dataset that can subsequently be plugged into the Tableau dashboard.

Dashboard Development- Based on the final wireframe, build dashboards on Tableau.

Quality Check & Optimization- Check for data integrity and accuracy, and reduce the latency of the dashboard.

Best Practices of Tableau


A Data Strategy Whiteboarding Session is a great way to begin to crystallize the design and architecture of your future state data and analytics environment.

What to Expect:

-Simplify a complex technical infrastructure in your current state.

-Outline the path from your current state to your future state.

-Unify divergent internal opinions about your future technical architecture.

-Centralize and unify your data with a governance and adoption plan using a ‘build small and iterate’ approach.

Whiteboarding sessions are the first step on the path to data modernization, advanced analytics, machine learning, or any other data initiative your organization is considering.

You can format settings for fonts, alignment, shading, borders, lines, and tooltips at the worksheet level.


When you spend a lot of time building layouts, you naturally develop an approach that makes the next one easier to create. After a while, you realize there are always new ideas for improving your design process in general.

Some of them are listed below:

>Add appropriate logo and title of the dashboard on top.

>Do not exceed the boundaries of a single screen.

>Place filters in one section of the dashboard (top or side) and arrange them in the right order (product/region/user hierarchy).

>Have appropriate headers for the sub-section of the dashboard.


>Have consistent colours in graphs.

>Have appropriate axes values for each chart.

>Always have legends for each chart.

>Have borders for each chart.

>Have minimal or no decimal points for numbers in charts.

>Provide relevant data labels.

>Use conditional formatting to highlight very high/low values.

>Add tooltips to highlight appropriate additional information in charts.


>Provide the right headers for each table.

>Minimize the number of digits after the decimal point.

>Ensure the right data formats are provided.


Where latency is concerned, extracts are typically much faster to work with than a live data source, so use extracts wherever possible. Minimize the number of fields based on the analysis being performed: use the Hide All Unused Fields option to remove unused columns from a data source. Minimize the number of records by using extract filters to keep only the data you need, and remove unneeded dimensions from the Detail shelf.
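The same idea — fewer fields, fewer records — can also be applied before the data ever reaches Tableau. Below is a minimal sketch in plain Python (the column names and filter condition are hypothetical, not from any particular dashboard) of trimming a raw export down to only the columns and rows an analysis needs, mirroring what Hide All Unused Fields and an extract filter accomplish:

```python
import csv
import io

# Hypothetical raw export: more columns and rows than the dashboard needs.
raw = io.StringIO(
    "order_id,region,product,sales,internal_note\n"
    "1,East,Widget,100,audit-ok\n"
    "2,West,Gadget,250,audit-ok\n"
    "3,East,Widget,75,recheck\n"
)

keep_columns = ["region", "product", "sales"]  # fields the analysis actually uses
keep_region = "East"                           # extract-filter condition

reader = csv.DictReader(raw)
trimmed = [
    {col: row[col] for col in keep_columns}  # drop unused columns
    for row in reader
    if row["region"] == keep_region          # keep only the rows needed
]

print(trimmed)
```

The smaller the dataset handed to the extract, the less work Tableau has to do at query time.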

Reduce the number of filters in use; excessive filters in a view create a more complex query, which takes longer to return results. Use Boolean or numeric filters, as they are processed faster than strings. To reduce query load, use parameters and action filters. Perform calculations in the database wherever possible, and reduce the number of calculated fields and nested calculations. Minimize joined tables, since many joins take extra processing time, and remove custom SQL for faster rendering.
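To illustrate the Boolean-versus-string point, here is a small Python sketch (the field names are hypothetical): the string comparison is precomputed once into a boolean flag, so the filter itself only tests a boolean — analogous to building a TRUE/FALSE calculated field in Tableau and filtering on that instead of on a string dimension:

```python
# Hypothetical rows with a string status field.
rows = [
    {"id": 1, "status": "active"},
    {"id": 2, "status": "inactive"},
    {"id": 3, "status": "active"},
]

# Precompute the string comparison once, like a calculated field
# that returns TRUE/FALSE instead of comparing strings in the filter.
for row in rows:
    row["is_active"] = row["status"] == "active"

# The filter now tests a cheap boolean rather than a string.
active = [row for row in rows if row["is_active"]]
print([row["id"] for row in active])  # → [1, 3]
```

The filter logic stays readable, and the expensive comparison runs once per row instead of once per query.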

Tableau practices are part art and part science. Visualizations are about more than simply presenting the numbers in your reports; they are about the effective use of graphical elements to enable insights for your users. In our next part, we’ll be talking about Tableau vs Looker.

About the Author

Shivakumar is an Analytics Manager at Factspan who has a keen interest in retail industries and has worked for multiple fortune 500 retailers across domains of merchandising, marketing, customer strategy, and product management. He loves watching football and is an avid supporter of Arsenal FC. He also loves travelling and has been to 9 countries across 3 different continents so far.

