
Designing a Data Model for Analytics: What Must Be Decided Before Tableau Implementation

  • Feb 9
  • 8 min read

By Chiou Hao Chan, Chief Growth Officer at CRS Studio



Many Tableau projects fail or stall not because of visual design, but because the underlying Tableau data model and the surrounding decision design were never explicitly defined. Dashboards then become a mirror of fragmented source systems, rather than a coherent view of the business.


A core decision is to clarify your analytical “version of the truth” and how it fits into a broader single source of truth across systems, while recognising constraints from how operational systems store and expose data.  


This article focuses on the structural and governance decisions that shape your analytics data modelling, semantic layer, and KPI definitions before any workbook is built.


It does not provide step-by-step implementation guidance or recommend a specific technical pattern, because the right design depends heavily on your organisation’s systems, data quality, and decision-making culture.



Tableau as a Consumer of Your Data Model, Not the Centre of It


Tableau is often treated as the place where the data model is “fixed” through joins, blends, and calculated fields, even though its own guidance positions these features as ways to work with an existing analytical model rather than to define it from scratch.


That is a risky default. In many organisations, Tableau works best as a consumer of an analytical data model rather than the primary place where the model is defined; some semantic logic may still live in Tableau, depending on constraints.


For SMEs and nonprofits in Singapore, data typically comes from a mix of accounting software, CRM, fundraising platforms, spreadsheets, and sometimes custom databases, and the role of Tableau must be judged against what can already be achieved with existing CRM and operational reporting.


If each Tableau workbook connects directly to these sources and defines its own joins and calculations, you effectively create multiple, conflicting data models.


The key architectural decision is: where will your analytical data model live, and how much of it should be pushed into Tableau versus upstream systems?  


The synthesis here is that Tableau is typically more sustainable when it sits on top of a deliberately designed model, rather than being used as the primary way to compensate for the lack of one.



From Operational Data to an Analytics Data Model


Operational systems store data in ways optimised for transactions—prioritising speed, integrity, and process-specific workflows—whereas analytical models are deliberately structured to support cross-cutting analysis and decision-making.


Analytics needs a different structure: one that supports comparison over time, across segments, and across functions.


In practice, that means reshaping data into entities that reflect how the business is analysed, not how it operates.


For example, “donation transactions over time by campaign and donor segment” is an analytical view that may span CRM, payment gateways, and marketing platforms.


Before Tableau implementation, leaders need to decide:


  • What are the core analytical entities?

  (e.g., Customer / Donor, Account, Opportunity / Grant, Campaign, Transaction, Service Case)


  • How will time be represented?

  (e.g., order date vs. delivery date vs. invoice date; fiscal vs. calendar periods)


  • What level of granularity is required for decisions?

  (e.g., transaction-level vs. daily aggregates vs. monthly snapshots)


  • How will cross-system identifiers be reconciled? (sketched below)

  (e.g., donor IDs differing between CRM and payment systems)
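

As a minimal illustration of the last point, the sketch below shows one way to reconcile donor IDs across a CRM and a payment gateway through an explicit mapping table. The systems, column names, and values are hypothetical; the point is that the mapping is a maintained, owned artefact rather than an ad hoc join buried in a workbook.

```python
import pandas as pd

# Hypothetical extracts: donor IDs differ between the CRM and the payment gateway.
crm = pd.DataFrame({
    "crm_donor_id": ["C001", "C002", "C003"],
    "donor_name": ["Alice Tan", "Ben Lim", "Chloe Ng"],
    "segment": ["Regular", "Major", "Regular"],
})
payments = pd.DataFrame({
    "gateway_donor_id": ["P-9", "P-7", "P-8"],
    "amount_sgd": [50.0, 500.0, 75.0],
    "paid_at": pd.to_datetime(["2025-01-05", "2025-01-12", "2025-02-01"]),
})

# The explicit reconciliation decision is a maintained mapping table:
# someone owns it, and every analytical join goes through it.
id_map = pd.DataFrame({
    "gateway_donor_id": ["P-9", "P-7"],
    "crm_donor_id": ["C001", "C002"],
})

# Resolve each payment to the canonical CRM donor before any analysis.
donations = (
    payments
    .merge(id_map, on="gateway_donor_id", how="left")
    .merge(crm, on="crm_donor_id", how="left")
)

# Unmapped IDs surface as explicit gaps to fix upstream, not silent row drops.
unmapped = donations[donations["crm_donor_id"].isna()]
print(donations)
print(f"{len(unmapped)} payment(s) could not be reconciled")
```

Unreconciled IDs then become visible data-quality work, instead of silently inflating or deflating donor counts.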


For SMEs and nonprofits, the constraint is often limited engineering capacity. That makes it even more important to be selective about which entities and relationships are modelled centrally, and which remain local to a specific report.  


The synthesis: an analytics data model is a deliberate abstraction of your operational reality, tuned to decision-making rather than process execution.



Deciding the Role and Shape of Your Semantic Layer


The semantic layer is the bridge between raw data and business understanding. It defines business-friendly names, consistent calculations, and reusable logic that multiple dashboards can rely on.


In many Tableau deployments, the semantic layer quietly emerges inside individual workbooks as calculated fields and custom groups.


That creates hidden complexity and inconsistency. Instead, you need an explicit decision on where the semantic layer will live and how it will be governed.


Key decisions include:


  • Location of semantic logic

    1. Database / data warehouse views

    2. Tableau data sources (published data sources)

    3. A mix of both, with clear boundaries


  • Ownership and change control

    1. Who approves new KPIs or changes to existing ones?

    2. How are changes communicated to dashboard owners?


  • Abstraction level

    1. How much complexity is hidden from business users?

    2. Which fields are “safe” for self-service use?


For smaller organisations, a lightweight semantic layer can still be powerful: a curated set of published data sources with agreed definitions and minimal, well-documented calculations.  
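

Even without dedicated tooling, that curated set can be captured as a small, version-controlled catalogue. A minimal sketch, with hypothetical KPI names, fields, and owners:

```python
from dataclasses import dataclass

# A lightweight KPI catalogue kept in version control. Field names and
# entries here are illustrative assumptions, not a fixed standard.
@dataclass(frozen=True)
class KpiDefinition:
    name: str
    definition: str   # plain-language business definition
    grain: str        # the level at which the measure is valid
    owner: str        # who approves changes
    exclusions: str   # documented, not hidden in workbook filters

KPI_CATALOGUE = {
    "active_donor": KpiDefinition(
        name="Active Donor",
        definition="At least one completed donation in the rolling 12 months",
        grain="donor",
        owner="Head of Fundraising",
        exclusions="Refunded and test transactions",
    ),
    "program_utilisation_rate": KpiDefinition(
        name="Program Utilisation Rate",
        definition="Attended sessions / offered sessions, per programme per month",
        grain="programme-month",
        owner="Head of Programmes",
        exclusions="Cancelled sessions",
    ),
}

for kpi in KPI_CATALOGUE.values():
    print(f"{kpi.name}: {kpi.definition} (owner: {kpi.owner})")
```

Whether these definitions are then enforced in database views or in published Tableau data sources, the catalogue remains the single agreed record of what each KPI means.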


The synthesis: a semantic layer is not a tool feature; it is a governance decision about where meaning lives and how it is controlled.



KPI Definitions: Deciding What “Good” Looks Like Before Visuals


Conflicting KPI definitions are a common source of frustration in Tableau projects. Two teams see different numbers on different dashboards and spend meetings debating which is correct.


Before implementing Tableau dashboards, leaders need to resolve:


  • Primary definition per KPI (with explicitly documented variants where needed)

    1. For each critical KPI (e.g., “Active Donor”, “Qualified Lead”, “Program Utilisation Rate”), decide on one primary definition for analytics purposes.

    2. Document exceptions explicitly rather than embedding them in hidden filters.


  • Scope and filters

    1. Which records are included or excluded?

    2. How are cancellations, refunds, or reversals treated?

    3. How are internal vs. external transactions handled?


  • Time windows and currency

    1. Rolling 12 months vs. calendar year vs. fiscal year

    2. Local currency vs. multi-currency conversions and rates


  • Status logic (see the sketch after this list)

    1. When does a donor become “lapsed”?

    2. When does a grant move from “pipeline” to “committed”?
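

To make the shape of these decisions concrete, the sketch below computes donor status under one assumed definition: "active" means at least one completed donation in the rolling 12 months, with refunds excluded. Every name and threshold here is an assumption to be replaced by your own agreed definition.

```python
import pandas as pd

# Hypothetical donation records spanning scope and status decisions.
donations = pd.DataFrame({
    "donor_id": ["C001", "C001", "C002", "C003"],
    "amount_sgd": [50.0, 75.0, 500.0, 20.0],
    "status": ["completed", "completed", "completed", "refunded"],
    "donated_at": pd.to_datetime(
        ["2024-03-10", "2025-01-05", "2023-11-20", "2025-02-01"]
    ),
})

reporting_date = pd.Timestamp("2025-02-09")
window_start = reporting_date - pd.DateOffset(months=12)  # rolling 12 months

# Scope decision made explicit: refunds are excluded from the KPI.
in_scope = donations[donations["status"] == "completed"]

# Status logic: last in-scope gift inside the window => active, else lapsed.
# Donors with no in-scope gifts (e.g., only refunds) do not appear at all,
# which is itself a definitional choice worth documenting.
last_gift = in_scope.groupby("donor_id")["donated_at"].max()
donor_status = last_gift.ge(window_start).map({True: "active", False: "lapsed"})
print(donor_status)
```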


These decisions are not purely technical; they are political and operational. Different departments may have valid reasons for different definitions.


The analytics data model must make a conscious choice about which definition is “standard” and how alternatives are represented.  


The synthesis: KPI definitions are part of your data model, not just labels on a chart; they must be decided and governed before visualisation.



Choosing a Tableau Data Model Pattern: Wide Tables vs. Star Schemas


Tableau can work with different data structures, from single wide tables to more formal star schemas that separate facts and dimensions for analytic flexibility.


The choice has implications for performance, flexibility, and maintainability.


For SMEs and nonprofits, the temptation is to create one “master table” per dashboard: a wide, denormalised table with all the fields needed.


This can work for simple use cases but becomes fragile as complexity grows.


Key trade-offs to consider:


  • Single wide table (denormalised)

    1. Pros: simple to understand; fast for narrow use cases; fewer joins in Tableau

    2. Cons: duplication of logic; difficult to maintain; poor fit for multi-fact analysis (e.g., donations and events)


  • Star schema (facts and dimensions)

    1. Pros: reusable dimensions (e.g., Date, Donor, Campaign); supports multiple facts; clearer grain and relationships

    2. Cons: requires more upfront modelling; may need database or ETL skills not readily available


  • Hybrid approach

    1. A small number of well-designed fact tables (e.g., Donations, Grants, Services Delivered) with shared dimensions, plus a few purpose-built wide tables for highly specific use cases.


The decision is less about technical purity and more about how many analytical questions you expect to support from the same data over time.  
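

A small sketch of the star-schema option, using hypothetical tables in pandas: the fact table carries transactions at a declared grain, the dimensions are shared, and any "wide table" a dashboard needs is derived from them rather than maintained by hand.

```python
import pandas as pd

# Hypothetical star schema: one fact table at transaction grain,
# plus shared dimensions reused across dashboards.
dim_donor = pd.DataFrame({
    "donor_key": [1, 2],
    "donor_name": ["Alice Tan", "Ben Lim"],
    "segment": ["Regular", "Major"],
})
dim_campaign = pd.DataFrame({
    "campaign_key": [10, 11],
    "campaign_name": ["Year-End Appeal", "Flag Day"],
})
fact_donations = pd.DataFrame({
    "donor_key": [1, 1, 2],
    "campaign_key": [10, 11, 10],
    "donated_at": pd.to_datetime(["2025-01-05", "2025-02-01", "2025-01-12"]),
    "amount_sgd": [50.0, 75.0, 500.0],
})

# A "wide table" for one dashboard is a derived artefact, not the model
# itself: regenerate it from the star whenever it is needed.
wide_view = (
    fact_donations
    .merge(dim_donor, on="donor_key")
    .merge(dim_campaign, on="campaign_key")
)
print(wide_view)
```

The same derivation could live in a database view or a published Tableau data source; what matters is that the star, not the wide table, is the governed asset.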


The synthesis: choose a data model pattern based on expected analytical breadth and longevity, not just initial dashboard requirements.



Governance and Change: Who Controls the Tableau Data Model?


Even a well-designed Tableau data model can degrade over time without governance around changes. New fields, new systems, and organisational changes will introduce drift.


Before implementation, it is important to decide:


  • Data model ownership

    1. Is there a named role (or small group) responsible for the analytical data model and semantic layer?

    2. How are business stakeholders involved in approving changes?


  • Change management process

    1. How are new KPIs requested, evaluated, and introduced?

    2. How are breaking changes (e.g., redefining “Active Donor”) handled and communicated?


  • Documentation expectations

    1. What level of documentation is considered “minimum acceptable”?

    2. Where is it stored and how is it kept current?


  • Access and self-service boundaries

    1. Which users can create their own data sources and calculations?

    2. Under what conditions can those be promoted to shared, governed assets?


For smaller organisations, governance does not need to be heavy, but it does need to be explicit. A short, agreed set of rules often reduces confusion later.  


The synthesis: governance decisions determine whether your data model remains a shared asset or fragments into personal versions of the truth.



Context-Dependent Considerations for SMEs and Nonprofits in Singapore


While the general principles above apply broadly, SMEs and nonprofits in Singapore face some specific contextual constraints and choices. These factors influence how far you can go with your Tableau data model before hitting organisational limits.


Key context-dependent considerations:


  • Data engineering capacity

    1. Limited in-house expertise may push more logic into Tableau or into simpler database views.

    2. This increases the importance of keeping calculations minimal, well-structured, and documented.


  • Budget and tooling

    1. You may not have a full data warehouse or ETL platform.

    2. Pragmatic solutions (e.g., curated Excel/CSV extracts, basic SQL views, or lightweight cloud databases) can still support a coherent analytical model if the conceptual design is clear (see the sketch after this list).


  • Regulatory and data protection requirements

    1. Handling donor, beneficiary, or customer data must align with the PDPA and internal policies, as these set the baseline obligations for how personal data may be collected, used, and disclosed.

    2. Decisions about where data is stored, anonymised, or aggregated affect how the model is designed.


  • Organisational decision culture

    1. If decisions are decentralised, the model may need to support more flexible slicing and local KPIs.

    2. If decisions are centralised, a more prescriptive, tightly governed model may be appropriate.
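

As one illustration of the "pragmatic solutions" point above, the sketch below treats a curated CSV extract as a modelled asset by checking it against a declared grain and column set. The file contents, columns, and checks are assumptions for illustration.

```python
import io
import pandas as pd

# Simulated curated CSV extract; in practice this might be an export from
# accounting software or a fundraising platform.
csv_extract = io.StringIO(
    "donation_id,donor_id,donated_at,amount_sgd\n"
    "D1,C001,2025-01-05,50.0\n"
    "D2,C002,2025-01-12,500.0\n"
)
donations = pd.read_csv(csv_extract, parse_dates=["donated_at"])

# The conceptual design, written down as checks: required columns and a
# declared grain. Even without a warehouse, these keep extracts coherent.
REQUIRED_COLUMNS = {"donation_id", "donor_id", "donated_at", "amount_sgd"}
GRAIN = ["donation_id"]  # one row per donation

missing = REQUIRED_COLUMNS - set(donations.columns)
assert not missing, f"extract is missing columns: {missing}"
assert not donations.duplicated(subset=GRAIN).any(), "declared grain violated"
print("extract conforms to the declared model")
```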


This article does not attempt to prescribe a single “correct” architecture for all organisations. Instead, it highlights the levers you can adjust depending on your scale, skills, and regulatory environment.  


The synthesis: your optimal Tableau data model is constrained by people, skills, tools, and regulation, not just by what Tableau can technically support.



Common Failure Patterns and How Design Decisions Address Them


By this point, the main themes are clear: Tableau success depends on upstream decisions about structure, semantics, and governance. It is helpful to connect these decisions to the failure patterns many organisations already experience.


Typical failure patterns include:


  • Dashboard proliferation with inconsistent numbers

    1. Root cause: no shared semantic layer or KPI definitions.

    2. Design response: centralised KPI catalogue and governed published data sources.


  • Slow performance and brittle workbooks

    1. Root cause: complex joins and calculations done inside Tableau on top of poorly structured source data.

    2. Design response: move heavy joins and aggregations upstream; simplify the analytical grain (see the sketch after this list).


  • Dependency on a single “Tableau person”

    1. Root cause: implicit data model living in one individual’s workbooks and knowledge.

    2. Design response: explicit data model documentation, ownership, and change process.


  • Inability to answer new questions without rebuilding everything

    1. Root cause: one-off, dashboard-specific data extracts rather than reusable fact/dimension structures.

    2. Design response: design for shared entities (e.g., Donor, Campaign, Date) and multiple fact tables.
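

As a small illustration of the second design response, the sketch below pre-aggregates transaction-level data to a daily grain. In practice this would usually run upstream as a scheduled SQL view or job; pandas is used here only to make the reshaping concrete, and all names are hypothetical.

```python
import pandas as pd

# Transaction-level data, as a source system might expose it.
transactions = pd.DataFrame({
    "donor_id": ["C001", "C001", "C002", "C001"],
    "donated_at": pd.to_datetime([
        "2025-01-05 09:12", "2025-01-05 17:40",
        "2025-01-06 11:00", "2025-01-06 12:30",
    ]),
    "amount_sgd": [50.0, 25.0, 500.0, 10.0],
})

# Pre-aggregating to a daily grain upstream means Tableau reads a small,
# stable table instead of recomputing joins and sums for every view.
daily = (
    transactions
    .assign(donation_date=transactions["donated_at"].dt.date)
    .groupby(["donation_date", "donor_id"], as_index=False)
    .agg(total_sgd=("amount_sgd", "sum"), gifts=("amount_sgd", "count"))
)
print(daily)
```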


These patterns are symptoms of earlier design decisions not being made explicitly.  


The synthesis: many Tableau pain points are downstream consequences of unmade or implicit data model decisions, not purely tool issues.



Practical Decision Checklist (Without Implementation Steps)


To re-anchor the discussion, it is useful to summarise the key decisions that should be made before or alongside Tableau implementation. This is not a technical how-to, but a set of design questions for leaders and data owners.


General decision areas:


  • Analytical scope and entities

    1. Which business questions are in scope for the first phase?

    2. Which entities and relationships must be modelled to support them?


  • Grain and time

    1. At what level of detail will facts be stored and analysed?

    2. How will time be standardised across systems?


  • Semantic layer and KPI catalogue

    1. Where will common calculations and business terms be defined?

    2. Who approves and maintains KPI definitions?


  • Data model pattern

    1. Are you adopting wide tables, a star schema, or a hybrid?

    2. How will this pattern scale as new use cases emerge?


  • Governance and change control

    1. Who owns the analytical data model?

    2. How are changes evaluated, approved, and communicated?


Leaders can use these questions to structure internal discussions and to assess whether their current Tableau deployment rests on a stable analytical foundation.  


The synthesis: a small set of explicit design decisions, made early, provides clarity and stability for all subsequent Tableau work.



Optional External Support for Tableau Data Model Decisions


Some organisations find value in having an external party challenge assumptions, validate architectural choices, or facilitate agreement on KPI definitions and semantic layers.


This can help separate technical constraints from organisational preferences and make trade-offs more visible.


CRS Studio provides Tableau implementation services, including support on analytics design, data structure, and adoption.
