Data Modelling in Power BI: Tips and Best Practices
- everythingpowerbi

- Jul 3, 2024
- 3 min read
- Updated: Jan 25
Data modelling in Power BI is a crucial part of the development process, as it serves as the foundation for building reports and dashboards. A poorly designed data model can lead to performance issues, inaccurate reporting, and a poor user experience.
In this article we will go through the following best practices for constructing your data model:
Understand your reporting requirements
Choosing the correct Semantic Model mode
Model design considerations
Data transformation best practices
General tips
1) Understand your reporting requirements
Before any development starts, it is important to understand the reporting requirements and what users expect from the report in terms of functionality.
Below are some of the things you need to think about:
What is the purpose of the report, and what is the business use case?
How many people will be using the report, and who is the audience?
How often must the report be refreshed?
How much data will be brought into the report, and which data sources will be used?
Will the data transformations be done upstream (such as in SQL) or in Power BI?
Having a solid understanding of what the user expects from the report is the first step in the data modelling process.
2) Choosing the correct Semantic Model mode
Choosing the correct semantic model mode (formerly storage mode) for your report is an important decision, as the mode you use affects what you can and cannot do with your data once it has been retrieved from the source.
Power BI offers the following Semantic Model Modes:
Import
DirectQuery
Dual
Live Connection
Direct Lake (Microsoft Fabric)
| Mode | Pros | Cons |
| --- | --- | --- |
| Import | Fastest connection; full Power Query transformations | File size limitations; regular refreshes required |
| DirectQuery | Ideal for large-scale datasets; near real-time data | Limited Power Query transformations; query limit of 1,000,000 rows |
| Dual | Combines the benefits of Import and DirectQuery modes | Complex to design and implement; only available in Premium capacity workspaces |
| Live Connection | Unlimited refreshes; suited for real-time data needs | No Power Query transformations; limitations on the measures that can be created |
| Direct Lake | High performance without needing a scheduled refresh | Queries are not as fast as in Import mode |
For a detailed breakdown of Semantic Model modes please check out one of our previous articles here.
3) Model design considerations
The architecture of the data model and how the relationships between tables are defined is paramount to the performance of your report. A poorly constructed model will degrade report performance and lead to a poor user experience.
Below are some things to consider when designing the model:
Combining tables that have a one-to-one relationship (see the sketch after this list).
The performance impact of adding Custom (Power Query) vs Calculated (DAX) columns.
Handling many-to-many relationships (avoid if possible).
Choice of data types.
Star Schema vs Snowflake Schema design.
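To make the first point concrete, below is a minimal Power Query sketch that merges two one-to-one tables into a single table before they reach the model. The server, database, table, and column names (ServerName, AdventureWorks, dbo.Product, dbo.ProductDetail, ProductKey, Colour, Weight) are placeholders, so adapt them to your own source.

```m
let
    // Hypothetical SQL source; replace the server and database names with your own
    Source = Sql.Database("ServerName", "AdventureWorks"),
    Product = Source{[Schema = "dbo", Item = "Product"]}[Data],
    ProductDetail = Source{[Schema = "dbo", Item = "ProductDetail"]}[Data],

    // Merge the two tables on their shared key so the model only loads one table
    MergedProduct = Table.NestedJoin(
        Product, {"ProductKey"},
        ProductDetail, {"ProductKey"},
        "ProductDetail", JoinKind.Inner
    ),

    // Expand only the columns the report actually needs
    CombinedProduct = Table.ExpandTableColumn(
        MergedProduct, "ProductDetail", {"Colour", "Weight"}
    )
in
    CombinedProduct
```

Loading the combined table removes a relationship the engine would otherwise have to resolve on every query.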

4) Data transformation best practices
The data transformation requirements will vary from report to report and can be as simple as changing a "datetime" field to a "date", or as involved as merging two data sources and implementing complex business logic.
The complexity and number of the transformations will also depend on your data source and if transformations are being handled upstream (such as in SQL) or must be done in Power BI.
Here are some tips to help you along your way (a short Power Query sketch follows this list):
Use Power Query instead of DAX to do your transformations.
Label your transformation steps in Power Query appropriately so they are easy to read and debug.
Make use of Custom Columns (Power Query) instead of Calculated Columns (DAX).
Remove unnecessary columns from your data model.
Leverage query folding to push transformations back to the data source.
Use the correct data types.
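As a rough illustration of these tips, the sketch below keeps only the required columns, applies the business logic as a Power Query custom column, sets explicit data types, and uses clearly named steps throughout. It assumes a SQL Server source with a dbo.Sales table; the server, database, and column names are placeholders for your own source.

```m
let
    // Hypothetical SQL source; each step name below describes what the step does
    Source = Sql.Database("ServerName", "SalesDB"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],

    // Keep only the columns the report needs (this folds back to the source as a SELECT)
    SelectedColumns = Table.SelectColumns(
        Sales, {"OrderDateTime", "ProductKey", "Quantity", "UnitPrice"}
    ),

    // Apply business logic as a Power Query custom column rather than a DAX calculated column
    AddedLineTotal = Table.AddColumn(
        SelectedColumns, "LineTotal", each [Quantity] * [UnitPrice], Currency.Type
    ),

    // Convert the datetime field to a date and set the remaining data types explicitly
    ChangedTypes = Table.TransformColumnTypes(
        AddedLineTotal,
        {{"OrderDateTime", type date}, {"ProductKey", Int64.Type}, {"Quantity", Int64.Type}}
    )
in
    ChangedTypes
```

You can right-click the last applied step and choose "View Native Query" to confirm that the transformations are folding back to the source.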
5) General tips
Below are some general tips to take note of in the data modelling process:
Only import what you need into your model. This includes tables, columns and rows.
Group measures into display folders to keep the model tidy.
Break down complex transformations into smaller steps. This makes your model more manageable and easier to troubleshoot.
Use variables in your DAX code, as they improve readability and make it easier to debug (see the example after this list).
Ensure the data types are consistent across columns. Mismatched data types can lead to errors or unexpected results.
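To illustrate the point about DAX variables, here is a simple year-over-year measure written with VARs. The [Total Sales] measure and the 'Date' table are assumptions about your model, not something from the article itself.

```dax
Sales YoY % =
-- Each intermediate value is named once, which makes the logic easier to read and debug
VAR CurrentSales = [Total Sales]
VAR PriorSales =
    CALCULATE ( [Total Sales], DATEADD ( 'Date'[Date], -1, YEAR ) )
RETURN
    DIVIDE ( CurrentSales - PriorSales, PriorSales )
```

Each VAR is evaluated only once and can be inspected on its own while debugging, instead of repeating the same expression inside RETURN.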
In closing, remember that a well-developed data model is crucial for building effective Power BI reports. We hope these tips help you and offer some guidance along your development journey.
If you enjoyed this post, please leave a comment, share on Social Media and subscribe to get notified when we publish a new article.
Happy developing!