In this lab I learnt how to use Power BI Desktop to connect to multiple data sources, preview data, and assess data quality using Power Query. I also explored source data characteristics, identified data quality issues, and prepared tables for future transformation, modeling, and reporting.
Ensure imported data is accurate, complete, and ready for analysis.
Understand the structure of relational and flat-file data.
Detect inconsistencies and missing values that could impact reporting.
1️⃣ Connect to Data Sources
SQL Server: Imported tables from AdventureWorksDW2020 including DimEmployee, DimProduct, DimReseller, FactResellerSales, DimSalesTerritory, DimEmployeeSalesTerritory.
CSV Files: Imported ResellerSalesTargets (sales targets per salesperson) and ColorFormats (product color codes).
2️⃣ Data Profiling and Quality Checks
Reviewed column quality, distribution, and uniqueness for all tables.
Identified missing values in key columns (e.g., TotalProductCost, Position).
Spotted inconsistent labels in categorical data (e.g., BusinessType “Warehouse” vs “Ware House”).
Fact tables contain some missing numerical values requiring transformation.
Dimension tables have columns suitable for one-to-many relationships in modeling.
Understanding column uniqueness and distribution ensures reliable data joins and accurate analysis.
Power BI Desktop setup and navigation
Connecting to relational (SQL Server) and flat-file (CSV) sources
Power Query data profiling: Column Quality, Column Distribution, Column Profile
Identifying data quality issues and preparing tables for modeling
Understanding fact and dimension table structures
Clean inconsistent categorical labels and handle missing values.
Establish table relationships and hierarchies for regional, product, and reseller analysis.
Transform data for dashboards and reporting.
Extend profiling to additional datasets to ensure accuracy before analysis.
This lab demonstrates the ability to:
Connect multiple data sources in Power BI
Profile and assess data quality before analysis
In this lab I cleansed, transformed, and shaped data in Power BI using Power Query. As a result, I prepared a clean, analysis-ready data model by applying transformations, handling missing or inconsistent values, merging related tables, and loading the queries into the semantic model for reporting and visualisation.
Ensure data is accurate, consistent, and ready for analysis.
Transform raw tables from multiple sources (SQL Server and CSV) into a usable model.
Handle missing, inconsistent, or poorly formatted data before loading into reports.
Salesperson: Filtered only salespeople, merged first and last names, renamed columns for clarity.
SalespersonRegion: Simplified employee-territory mapping for analysis.
Product: Filtered finished goods, expanded subcategories and categories, merged with ColorFormats CSV for visual formatting.
Reseller: Corrected inconsistent labels (e.g., “Ware House” → “Warehouse”), expanded geography data, renamed columns.
Region: Filtered sales territories, kept relevant hierarchical attributes.
Sales: Created a custom column to fix missing TotalProductCost values, standardized numeric and financial data types.
Targets: Unpivoted monthly sales target columns, created a date column, and scaled values for analysis.
ColorFormats: Merged with Product query to include formatting; disabled loading to the model to reduce redundancy.
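The Sales and Targets steps above can be sketched in Power Query M. This is a hypothetical sketch, not the lab's exact script: the column names (OrderQuantity, ProductStandardCost, EmployeeID) are assumptions based on a typical AdventureWorksDW schema.

```m
// Hypothetical sketch: replace missing TotalProductCost values with
// quantity times standard cost (column names are assumed).
AddedTotalProductCost =
    Table.AddColumn(
        Sales,
        "TotalProductCostFixed",
        each if [TotalProductCost] = null
             then [OrderQuantity] * [ProductStandardCost]
             else [TotalProductCost],
        Currency.Type
    ),

// Hypothetical sketch: unpivot the monthly target columns of the
// Targets CSV into MonthYear/TargetAmount pairs.
UnpivotedTargets =
    Table.UnpivotOtherColumns(
        Targets, {"EmployeeID"}, "MonthYear", "TargetAmount"
    )
```

Unpivoting turns one wide row per salesperson into one narrow row per salesperson-month, which is the shape a fact-style Targets table needs for date-based analysis.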
Column filtering, renaming, and merging
Handling missing values with calculated/custom columns
Unpivoting and reshaping CSV data
Expanding related tables for enriched datasets
Applying correct data types for numeric and financial accuracy
Merging queries across multiple sources for a unified model
Fact tables now contain complete, calculated measures ready for analysis.
Dimension tables are clean, standardized, and structured for one-to-many relationships.
Corrected inconsistent categorical labels ensure reliable joins and hierarchies.
Prepared data supports accurate dashboards, reporting, and business insights.
Power BI Desktop navigation and setup
Power Query transformations: Filter, Merge, Expand, Unpivot
Data cleaning and handling missing values
Creating calculated/custom columns
Merging data from multiple sources (SQL Server + CSV)
Structuring fact and dimension tables for modeling
Load transformed tables into the semantic model for building dashboards.
Establish table relationships and hierarchies for reporting across regions, products, and resellers.
Apply additional transformations or aggregations as required for advanced analysis.
Extend this approach to other datasets to ensure consistency and accuracy.
This lab demonstrates the ability to:
Clean and transform raw data for analysis in Power BI
Merge multiple data sources into a single, consistent model
Handle missing and inconsistent data for accurate reporting
Prepare a structured model ready for dashboards and business insights
Identify inconsistencies and missing values to prepare accurate models
Lay the foundation for reporting, visualization, and business insights
In this lab I designed interactive Power BI reports across multiple pages and published them to the Power BI service. I learnt how to configure visuals, format properties, sync slicers across pages, and enable interactive data exploration for end-users. As a result, I created a professional, multi-page dashboard that allows insights at both summary and detailed levels.
Build visually appealing and informative reports for decision-making.
Enable dynamic filtering and interaction across multiple visuals.
Ensure consistent slicer behavior and cross-filtering to avoid misrepresentation of data.
Added company logo, Year and Region slicers.
Created Line & Stacked Column, Stacked Column, and Stacked Bar charts.
Configured visual properties, labels, and colors for clarity.
Enabled “Show items with no data” for accurate month representation.
Added Region slicer with “Select all” option.
Designed a Matrix visual displaying Sales, Cost, Profit, and Profit Margin by Fiscal hierarchy and Product dimensions.
Applied page-level and column filters to refine data.
Simulated row-level security by filtering for a specific salesperson.
Added Year slicer for FY2019.
Created Multi-row Card, Clustered Bar, and Clustered Column visuals to compare Sales, Targets, Variance, and Variance Margin.
Formatted visuals for clarity, contrast, and readability.
Synced Year and Region slicers across pages to maintain consistent filtering.
Enabled cross-filtering and tooltips on charts for deeper insights.
Configured focus mode for individual visuals to explore details.
Published reports to Power BI Service (My Workspace).
Explored report behavior, cross-filtering, and interactions in a cloud environment.
Verified synced slicers and visual-level filters for consistent user experience.
Multi-page reports provide high-level overview, detailed profit analysis, and salesperson-specific performance.
Slicers and cross-filtering allow dynamic exploration of trends by time, region, product, and individual salesperson.
Proper formatting, labels, and color-coding improve readability and communication of business insights.
Publishing to Power BI Service enables secure sharing, collaboration, and interactive analysis for stakeholders.
Designing multi-page Power BI reports with visuals: Line & Column, Stacked Column, Stacked Bar, Matrix, Multi-row Card
Configuring slicers, filters, and cross-filter interactions
Syncing slicers across pages for consistent reporting
Formatting visuals for readability and user experience
Publishing reports to Power BI Service and exploring interactive dashboards
Add drill-through pages for more detailed product or regional analysis.
Include KPI indicators and trend lines to highlight performance gaps.
Optimize report performance by reducing visual complexity where needed.
Explore embedding reports in organizational portals for broader access.
This lab demonstrates the ability to:
Design professional, interactive Power BI dashboards
Apply best practices in report formatting, slicers, and filters
Enable cross-page synchronization and interactivity for accurate insights
Publish and explore reports in Power BI Service for real-world business use
I built a Sales Monitoring dashboard in the Power BI service using an existing report. I learnt how to pin report visuals, use Q&A for dynamic tiles, and embed images to create an interactive dashboard that consolidates key business metrics for monitoring performance.
Provide a single view of sales and performance metrics for stakeholders.
Enable quick insights using interactive visuals and filters.
Leverage Q&A functionality to allow business users to ask questions directly on the dashboard.
Pinned key report visuals (Sales & Profit Margin by Month) from the report to the Sales Monitoring dashboard.
Set Year and Region filters to FY2020 and all regions, maintaining the appropriate filter context.
Used Power BI Q&A to create dynamic tiles based on natural language queries.
Created a Sales YTD tile filtered to FY2020, demonstrating Time Intelligence calculations in DAX.
Added company logo using a data URI to enhance branding.
Organised tiles for clear visibility and logical layout: logo top-left, Sales YTD beneath, Sales & Profit Margin to the right.
Configured tile details: added subtitles, displayed last refresh time.
Performed manual refresh of the semantic model in Power BI Desktop and republished to the service.
Verified updated sales data (including June 2020) reflected correctly in dashboard visuals.
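A Sales YTD measure of the kind behind the tile above could be written with DAX time intelligence. This is a sketch under assumed table and column names; the fiscal year-end argument reflects the AdventureWorks financial year ending 30 June.

```dax
-- Hypothetical sketch: year-to-date sales for a fiscal year ending 30 June.
-- Requires 'Date' to be marked as the model's Date table.
Sales YTD =
TOTALYTD ( SUM ( Sales[Sales] ), 'Date'[Date], "30-6" )
```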
Consolidates key performance metrics in a single view for quick monitoring.
Q&A enables self-service analytics, allowing non-technical users to explore metrics.
Proper layout, visual hierarchy, and filter context ensure actionable insights are immediately visible.
Embedding images and formatting improves usability and branding.
Pinning visuals from Power BI reports to dashboards
Creating dashboard tiles with Q&A natural language queries
Adding and formatting media (images) on dashboards
Managing filter context and tile refresh to ensure accurate data
Updating and publishing refreshed semantic models in Power BI Service
Explore scheduled refreshes for automated updates of sales metrics.
Add additional KPIs and interactive tiles for regional or product-level monitoring.
Incorporate alerts for key thresholds (e.g., low sales or high variance) to proactively notify stakeholders.
Expand Q&A capabilities for more complex business queries and dynamic reporting.
This lab demonstrates the ability to:
Transform Power BI reports into interactive dashboards for stakeholders
Use Q&A to enable self-service analytics and business-friendly interactions
Maintain real-time insights by refreshing the semantic model and ensuring accurate dashboard data
Combine visuals, media, and interactivity to communicate key business metrics effectively
In this lab I developed a semantic data model in Power BI Desktop. I learnt to create relationships between tables, configure table and column properties, build hierarchies, and define quick measures. I also created many-to-many relationships and learnt techniques for improving model usability and reporting accuracy.
Build a structured and user-friendly data model to support accurate reporting.
Ensure relationships propagate filters correctly for meaningful analysis.
Enhance model usability for report authors through hierarchies, display folders, and formatting.
Enable performance analysis of salespeople and sales regions.
Configured one-to-many relationships between core tables (Product, Sales, Reseller, Region, Salesperson).
Used drag-and-drop techniques and the Manage Relationships dialog to create and validate relationships.
Centralised the Sales table and positioned related tables for clarity in the model diagram.
Product Table: Created a Products hierarchy (Category → Subcategory → Product) and a Formatting display folder for colour fields.
Region Table: Created a Regions hierarchy (Group → Country → Region) and set data categories for accurate map visualisation.
Reseller Table: Created Resellers and Geography hierarchies; assigned appropriate data categories to address geographic reporting.
Sales Table: Added descriptions, formatted numeric columns (decimal places, thousands separator), and set default summarisation to average for Unit Price.
Bulk Updates: Hid columns used for relationships or calculations to reduce clutter and improve the report authoring experience.
Verified fields, hierarchies, and measures in Report view.
Disabled Auto Date/Time to align fiscal year with AdventureWorks financial year (starting 1 July).
Calculated Profit using subtraction (Sales – Cost).
Calculated Profit Margin using division (Profit ÷ Sales) and formatted as percentage.
Validated measures using a table visual to confirm correct values and formatting.
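The two quick measures described above are equivalent to short explicit DAX definitions. This is a sketch assuming Sales and Cost columns in the Sales table; DIVIDE is used so a zero denominator returns blank instead of an error.

```dax
-- Hypothetical DAX equivalents of the Profit and Profit Margin quick measures.
Profit = SUM ( Sales[Sales] ) - SUM ( Sales[Cost] )

Profit Margin = DIVIDE ( [Profit], SUM ( Sales[Sales] ) )
```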
Created relationships between Salesperson, SalespersonRegion, Region, and Sales tables to support performance analysis.
Configured bi-directional cross-filtering and managed active/inactive relationships to ensure correct filter propagation.
Renamed the Salesperson table to Salesperson (Performance) to indicate its purpose in analysis.
Connected Salesperson (Performance) to the Targets table using EmployeeID.
Visualised sales and targets for each salesperson while considering filter context and non-additive measures.
Relationships enable correct filter propagation, ensuring accurate aggregations in reports.
Hierarchies and display folders improve model usability for report authors.
Many-to-many relationship design is essential for accurately assessing salesperson performance across multiple regions.
Quick measures simplify calculations, improving speed and accuracy of metric creation.
Proper formatting and metadata descriptions enhance clarity and maintainability of the model.
Building and managing relationships (one-to-many, many-to-many)
Configuring table and column properties, hierarchies, and display folders
Formatting numeric columns and setting summarisation defaults
Creating quick measures (Profit and Profit Margin)
Managing filter propagation for performance analysis
Optimising the model for usability and reporting accuracy
Extend hierarchies and display folders for additional tables as the data model grows.
Implement calculated tables for more complex many-to-many analysis scenarios.
Review hidden columns periodically to maintain a clean authoring experience.
Use the semantic model as a foundation for interactive reports, dashboards, and performance analytics.
This lab demonstrates the ability to:
Build a robust semantic model that accurately propagates filters across tables
Improve model usability with hierarchies, display folders, and formatting
Calculate business metrics efficiently using quick measures
Handle complex many-to-many relationships to support performance management and reporting
In this lab, I created calculated tables, columns, and measures using Data Analysis Expressions (DAX) within Power BI Desktop.
I achieved the objective of extending the semantic model with custom logic that supports accurate business analysis, including fiscal time intelligence, pricing analytics, order metrics, and performance tracking against targets.
Enhance the semantic model with advanced calculations using DAX.
Enable fiscal year reporting aligned to organisational requirements.
Create reusable and controlled aggregation logic through measures.
Support performance analysis by comparing sales against targets.
Created a new Salesperson calculated table using DAX.
Established a direct relationship between Salesperson | EmployeeKey and Sales | EmployeeKey.
Removed inactive relationships to eliminate ambiguity in filter propagation.
Configured descriptions to clarify analytical purpose:
Salesperson: sales made by the individual.
Salesperson (Performance): sales generated within assigned regions.
This design enables two analytical perspectives:
Individual sales performance
Regional performance responsibility
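One simple way to obtain a second, independently related copy of a table is a calculated table that references the original. The following is a sketch of that pattern, not necessarily the exact expression used in the lab:

```dax
-- Hypothetical sketch: a calculated table duplicating the existing
-- Salesperson (Performance) table, so the copy can carry its own
-- active relationship to Sales via EmployeeKey.
Salesperson = 'Salesperson (Performance)'
```

Because each copy has its own relationship, filters from one answer "what did this person sell?" and filters from the other answer "what was sold in this person's regions?".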
Generated a Date table using:
Date = CALENDARAUTO(6)
Configured fiscal year logic (financial year ending in June).
Created calculated columns:
Year (Fiscal Year – FY logic)
Quarter
Month
MonthKey (for correct chronological sorting)
Built a Fiscal hierarchy (Year → Quarter → Month).
Marked the table as the official Date table to enable time intelligence functionality.
Disabled Auto Date/Time to ensure alignment with the organisation’s fiscal calendar.
This ensures accurate time-based reporting and proper chronological sorting.
Developed calculated columns within the Date table to support grouping and filtering by fiscal periods.
Implemented a numeric MonthKey column to override alphabetical month sorting and enforce chronological order.
This improves reporting accuracy and user experience in visuals such as matrices and charts.
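The fiscal Year and MonthKey calculated columns described above can be sketched as follows; the exact expressions are assumptions consistent with a financial year ending 30 June.

```dax
-- Hypothetical sketch: fiscal year label (July–June shifts into the next FY).
Year =
"FY" & YEAR ( 'Date'[Date] ) + IF ( MONTH ( 'Date'[Date] ) > 6, 1, 0 )

-- Hypothetical sketch: numeric key used as the Month column's
-- "sort by" column to enforce chronological order.
MonthKey =
( YEAR ( 'Date'[Date] ) * 100 ) + MONTH ( 'Date'[Date] )
```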
Replaced implicit aggregations with explicit DAX measures to control summarisation logic.
Created pricing measures:
Avg Price
Median Price
Min Price
Max Price
Created order measures:
Orders (DISTINCTCOUNT)
Order Lines (COUNTROWS)
Configured:
Decimal formatting (two decimal places for pricing)
Thousands separator for counts
Display folders (“Pricing” and “Counts”)
Hidden base columns (e.g., Unit Price) to prevent incorrect aggregation
This modelling approach prevents inappropriate reporting behaviour and enforces consistent calculations.
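The explicit measures described above might look like the following in DAX; the Sales column names (Unit Price, SalesOrderNumber) are assumptions.

```dax
-- Hypothetical sketches of the pricing and counting measures.
Avg Price    = AVERAGE ( Sales[Unit Price] )
Median Price = MEDIAN ( Sales[Unit Price] )
Min Price    = MIN ( Sales[Unit Price] )
Max Price    = MAX ( Sales[Unit Price] )

Orders      = DISTINCTCOUNT ( Sales[SalesOrderNumber] )  -- one per order
Order Lines = COUNTROWS ( Sales )                        -- one per line item
```

With the base Unit Price column hidden, report authors must use these governed measures rather than summing a price column, which would be meaningless.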
Renamed the Target column to TargetAmount and created structured measures:
Target (using HASONEVALUE to avoid incorrect totals)
Variance
Variance Margin
Applied correct formatting:
Zero decimal places for Target and Variance
Percentage (two decimal places) for Variance Margin
This ensures non-additive measures (such as targets) behave correctly in totals and aggregated views.
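The HASONEVALUE pattern described above could be sketched as follows; the filtering column is an assumption. The point is that a target only makes sense for a single salesperson, so the total row is deliberately left blank.

```dax
-- Hypothetical sketches of the non-additive target measures.
Target =
IF (
    HASONEVALUE ( 'Salesperson (Performance)'[Salesperson] ),
    SUM ( Targets[TargetAmount] )
)

Variance =
IF (
    HASONEVALUE ( 'Salesperson (Performance)'[Salesperson] ),
    SUM ( Sales[Sales] ) - [Target]
)

Variance Margin = DIVIDE ( [Variance], [Target] )
```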
Calculated tables provide alternative analytical pathways but increase model size.
A dedicated Date table is essential for reliable time intelligence.
Measures offer stronger governance than implicit aggregation.
Many-to-many and inactive relationship scenarios require deliberate modelling decisions.
Proper formatting and display folders significantly improve report authoring usability.
Writing DAX for calculated tables and columns
Creating fiscal year logic
Building hierarchies and sort-by columns
Designing explicit aggregation measures
Using DISTINCTCOUNT and COUNTROWS
Managing filter context with HASONEVALUE
Controlling model behaviour through hidden fields and display folders
Structuring performance metrics (Target, Variance, Margin)
Implement advanced time intelligence measures (YTD, QTD, rolling periods).
Optimise calculated tables to minimise memory footprint where possible.
Introduce calculation groups for scalable time-based analysis.
Develop performance dashboards leveraging the new DAX measures.
This lab demonstrates the ability to:
Transform a semantic model into a fully analytical solution using DAX
Implement fiscal-aware time intelligence logic
Create governed, reusable business metrics
Handle performance analysis scenarios with correct filter context
Apply professional data modelling practices suitable for enterprise reporting