Useful Documentation about Choosing the Right Datastore

It is important to understand the difference between relational and non-relational storage solutions, because different solutions fit different use cases and application types.

I have organized some resources below to help you understand the differences between the two. Hopefully, after reading them you can:

• Know the different datastore categories available

• Know the different use cases for data store workloads

• Know the Azure and OSS datastore offerings available

• Design solutions that pick the right datastores for the use case

Choosing the Right Datastore

Criteria for choosing a data store

Choosing a big data storage technology in Azure

Choosing a search data store in Azure

Choosing a stream processing technology in Azure

Choosing a batch processing technology in Azure

Choosing a real-time message ingestion technology in Azure

Thank you,

Your friend,


Power BI Adjacent Microsoft Services and Products

Power BI in itself has a lot of features and functionality which help you with your job or business. However, Power BI alone won't solve all your problems, so in some cases you may want to pair it with other services.

Power BI Premium Capacity

By default, most people will use the Power BI Pro license level. But when should you consider leveraging Power BI Premium capacity? There are a couple of things you may need it for: 1. storing larger datasets; 2. faster DirectQuery/Live Connection queries; 3. support for paginated reports; 4. unlimited content sharing for both internal and external viewers.

If you want to learn more about Power BI Premium, please refer to this Microsoft doc.

Azure Synapse Analytics

Azure Synapse Analytics (ASA) is a great alternative for your Power BI model development. It allows you to query petabytes of data performantly, at a reasonable price, by leveraging Power BI's DirectQuery mode. The ASA SQL pool is an MPP (massively parallel processing) solution with result-set caching and materialized views built in, so it can serve high volumes of concurrent queries. By building your data warehouse in ASA rather than in Power BI alone, you get an enterprise-grade data preparation and data warehousing solution that provides a single source of truth for Power BI and other applications, along with centralized security. In addition, the studio experience within ASA enables team collaboration among data scientists, data engineers, DBAs, and BI developers.

For more information about this topic, please refer to this whitepaper.

AI and Machine Learning Services

There are 3 ways you can extend your BI data to AI insights within Power BI.

1. AutoML in Power BI dataflows; Official Doc Tutorial

2. Calling existing Cognitive Services; Official Doc Tutorial

3. Calling machine learning models which are stored in Azure Machine Learning. Official Doc Tutorial

Azure Data Lake Storage Gen 2

By default, data used with Power BI is stored in internal storage provided by Power BI. With the integration of dataflows and Azure Data Lake Storage Gen2 (ADLS Gen2), you can store your dataflows in your organization’s Azure Data Lake Storage Gen2 account.

Simply put, a dataflow within Power BI is Power Query + the Common Data Model (CDM). With the Common Data Model (CDM), organizations can use a data format that provides semantic consistency across applications and deployments. And with Azure Data Lake Storage Gen2 (ADLS Gen2), fine-grained access and authorization control can be applied to data lakes in Azure.

For more information about this topic, please refer to this official doc.

Azure Analysis Services

The core data modeling engine of Power BI is actually Analysis Services. So with Azure Analysis Services (AAS), you can create data models independently of the Power BI service. Right now, the largest SKU of AAS lets you store up to 400 GB of data. You can allow different reporting applications (like Power BI, Tableau, and Excel) to connect to your AAS models. Within Microsoft, the Analysis Services team is a sub-team of Power BI. The product roadmap for Power BI is to minimize the gap in data modeling experience between Power BI Premium capacity and Azure Analysis Services. That's why we are seeing recent Power BI updates that include enterprise-grade data modeling capabilities such as increased dataset size limits, the XMLA endpoint, and deployment pipelines. However, most of those features are in preview and have yet to go GA. In short, if you are migrating your on-premises SSAS models to Azure, or you do not want to wait for the Power BI features to reach GA, then AAS is your easiest option. Otherwise, choosing Power BI Premium to host your reports and data models is strongly recommended.


Power Automate & Power App

When I think about Power BI with the other Power Platform tools, I think of it as an Insights + Action play. You can use Power BI to get insights from your data and then use Power Automate or Power Apps to take action on them.

The bi-directional connectors let the services connect to each other, which means you can embed a Power BI report in a Power App and embed a Power App in a Power BI report.




Azure Logic Apps and Azure Monitor Logs

• Synchronously refreshing a Power BI dataset using Azure Logic Apps.

Reference: A Step by Step Community Blog Post.

• Use Power BI to understand your Logic Apps consumption data

Reference: A Microsoft Blog

• With Azure Monitor Logs, you can understand your Power BI Embedded consumption. You can also use Power BI to create customized reports from your Azure Monitor Logs.

Reference: Microsoft Doc

Azure Stream Analytics

This is a solution for true real-time analytics in Power BI. It enables you to visualize your IoT or event data.

The typical architecture is: Device Data -> Event Hubs -> Azure Stream Analytics -> Power BI Streaming Dataset -> Power BI Dashboard.

Reference: Official Document
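If you want to test the Power BI end of this pipeline without standing up the whole chain, you can also push sample rows straight into a streaming dataset through its push URL. The sketch below is my own minimal Python example: the URL is a placeholder (copy the real one from the dataset's "API info" page), and the column names are assumptions.

```python
import json
import urllib.request

# Hypothetical push URL: copy the real one from the streaming dataset's "API info" page.
PUSH_URL = "https://api.powerbi.com/beta/<tenant>/datasets/<dataset-id>/rows?key=<key>"

def build_rows_payload(readings):
    """Serialize device readings into the JSON array of rows the push API expects."""
    return json.dumps([
        {
            "deviceId": r["deviceId"],
            "temperature": r["temperature"],
            "eventTime": r["eventTime"],
        }
        for r in readings
    ])

def push_rows(readings, url=PUSH_URL):
    """POST one batch of rows to the streaming dataset (network call, shown for shape only)."""
    req = urllib.request.Request(
        url,
        data=build_rows_payload(readings).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)
```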

My blog:

Microsoft Teams

You can add your Power BI report as an app to Teams.

Reference: Official Doc

Also, if you have admin credentials for Teams, you can analyze call quality data in Power BI.

Reference: Official Doc

Dynamics 365

• You can use Power BI to analyze the data in Dynamics 365 (OData or the Entity Store from LCS)

• You can pin your Power BI visuals to Dynamics 365

• You can use existing solutions from Lifecycle Services

• There is a boxed Power BI embedded experience within Dynamics


  1. Official Doc: Configure Dynamics 365 Power BI integration
  2. Official Doc: Service in Dynamics F&O
  3. Official Doc: Get Dynamics Data in Power BI using Business Central
  4. Official Doc: Customize Dynamics F&O analytical workspace

In summary, there are many other services, inside and outside the Microsoft product portfolio, that work well with Power BI. So think outside of the box and create different solutions for your business.

Your friend,


Load data from Cosmos DB to Event Hub using an Azure Function in the Azure Portal

One way to load data from Cosmos DB to Event Hub is to use an Azure Function. There are many code samples out there for creating such a function, but if you, like me, do not have much application development experience, reading those code samples is a bit challenging. Luckily, the Azure Portal makes it easy.

  • Navigate to your Cosmos DB account in the Azure Portal. On the left panel, under Settings, you will find an option called 'Add Azure Function'.

CosmosDB Azure Function 1

  • Fill in the form that pops up after you click 'Add Azure Function', choose C# as the function language, then click 'Save'.
    • CosmosDB Azure Function 2


  • The portal will automatically navigate to the Function Apps page, where you will find the newly created Azure Function. The next step is to add the data target, in this case Event Hub. To add the Azure Function output target:
    • Click on 'Integrate'
    • Click on 'New Output' in the center form
    • Then choose 'Azure Event Hub' from the output choices
    • Save
      CosmosDB Azure Function 3
  • After you hit Save, you will see the output form. Change the Event Parameter Name field to '$return', and make sure 'Use Function Return Value' is checked before clicking 'Save'.
    CosmosDB Azure Function 4
  • The next step is to update the Azure Function's run.csx code to create the data integration code.
    • Click on the function name itself and the run.csx editing page will show up.
    • Then change the current code into this code
    • And click on 'Save and Run'
    • Please note the part I highlighted in yellow: depending on which runtime version your Function App has, the reference can be either #r "Microsoft.Azure.Documents.Client" or #r "Microsoft.Azure.DocumentDB.Core". Swap those two if you run into an error.
      • CosmosDB Azure Function 5
  • Now your Azure Function is up and running, capturing the changes and sending the changed documents to Event Hub.



Your friend, Annie

Microsoft Bot 101

There are a lot of concepts in Microsoft's Conversational AI services. My teammate and I organized the following concepts and resources for you; hopefully they help you get started.

  • Types of Bots:
    • Knowledge

This type of bot answers from a knowledge base: a pre-built database that hosts question-and-answer pairs.

Source Link

  • Enhancement

This is a helper conversation component built as an enhancement to an existing working application.

  • Functional

The bot itself is a full application.

  • Virtual Assistant

It is an open-source Virtual Assistant solution that provides you with a set of core functional capabilities and full control over the end-user experience.

Source Link

  • Technologies
  • Q&A (aka a knowledge base, which stores question-and-answer pairs) Link
      • Score

When you ask the bot a question, the bot returns a score indicating what percentage match it found against existing questions in the knowledge base, using Microsoft's internal fuzzy-match logic.

  • Metadata

It allows us to separate one piece of information from the next. Metadata is data about data. By adding meta tags to your question-and-answer pairs in a QnA knowledge base, the bot can better filter answers for the end user.

Source Link

  • Single/Multiple Turn

The distinction between single-turn and multi-turn is that a multi-turn conversation has follow-up questions.

Source Link

  • LUIS (Language Understanding) is a cloud-based API service that applies custom machine-learning intelligence to natural language text to predict overall meaning and pull out relevant, detailed information. Link

  • Intents

The overall meaning of the phrase

  • Entities

The detailed pieces of information that LUIS extracts

  • Patterns

A pattern is a combination of expression matching and machine learning used to increase intent and entity prediction accuracy

Source Link

  • Utterances

Text in the user's own words

  • Natural language processing

The process that turns a user's unstructured text input into a JSON-formatted response with a top intent. LUIS supports prebuilt NLP models as well as custom models for domain-specific natural language.

  • Adaptive Cards

They are a way to present a self-contained UI to a user within a larger UI such as a chat client.

Source Link

  • OAuth

OAuth (Open Authorization) is an open standard for token-based authentication and authorization on the internet. You can add authentication capability to your bot. Source Link

  • Bot specific concepts
    • Channels

A channel is a connection between the bot and communication apps.

  • Turn

One request/response round trip: a request in, a response out

  • Events

Microsoft Teams sends notifications to your bot for events that happen in scopes where your bot is active.

From <Source Link>

  • Conversation

A set of request and response groups

  • Prompts

A prompt is used whenever the bot needs input from the user.

From <Source Link>

  • Dialogs

You can use dialogs to model a conversation and manage conversation flow.

From <Source Link>

  • State

State data can be used for many purposes, such as determining where the prior conversation left off or simply greeting a returning user by name.

From <Source Link>

  • Dispatch

If a bot uses multiple LUIS models and QnA Maker knowledge bases, you can use the Dispatch tool to determine which LUIS model or QnA Maker knowledge base best matches the user input.

From <Source Link>

Thank you for reading.

Your friend, Annie

Power BI Refresh Failed – Troubleshooting

Issue: A few Power BI datasets had been refreshing successfully but recently have had some inconsistent refresh failures.

Possible Problems and Solutions:

a. License level: Power BI Premium has different license levels, and Premium capacity is shared within a tenant and can be shared by multiple workspaces. The maximum number of models (datasets) that can be refreshed in parallel is based on the capacity node. For example, P1 = 6 models, P2 = 12 models, and P3 = 24 models. Beyond that, you must understand how memory is managed by the capacity. If memory is consumed by interactive processes (users interacting with reports), the refreshes will go into a queue or possibly fail, depending on resource utilization. The best recommendation I can provide is to read this whitepaper, which provides much more detail. In addition, I would recommend installing the Power BI Premium Capacity Metrics app and beginning to analyze the model refreshes. This should assist in identifying and building a model refresh pattern that minimizes or eliminates refresh failures.

  1. So to solve the issue, try limiting parallel refreshes, or
  2. Increase the Power BI Premium license level through the O365 admin portal. Or purchase Power BI Embedded capacity through the Azure Portal, then assign the dedicated capacity to workspaces.
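As a rough sketch of the parallelism limits above (a toy Python helper of my own; the function name and the run-vs-queue split are simplifications, and this ignores the memory pressure from interactive use described earlier):

```python
# Maximum parallel model refreshes per Premium capacity node (the P1/P2/P3 numbers quoted above).
PARALLEL_REFRESH_LIMIT = {"P1": 6, "P2": 12, "P3": 24}

def refresh_plan(sku, requested):
    """Return (runs_now, queued) for a batch of refresh requests on the given node."""
    limit = PARALLEL_REFRESH_LIMIT[sku]
    return min(requested, limit), max(0, requested - limit)
```

So on a P1 node, scheduling 10 dataset refreshes in the same window means 6 run and 4 queue, which is exactly the pattern the capacity metrics app helps you spot.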

b. Source database activities: check on the source database activities during the times the datasets are being refreshed. There might be database locks or database backups going on.

c. Network: network communication from the data source server to the Power BI service in the Azure cloud might be another cause. Please check this community link for troubleshooting. If the network is the issue, we can further explore other network-related solutions such as adding Azure ExpressRoute.


Your friend, Annie.

My first week at Microsoft

On Monday March 3rd 2019, I started my journey at my dream company Microsoft. I would love to share the experience I have acquired over the past few days.

How did I get in

You may be wondering why I’ve decided to join Microsoft and how I got here, so let me tell you my story leading up to today. After finishing my master’s degree in Information Systems from Indiana University, I joined Anthem (one of the largest health insurance companies in the industry).

My team back then relied heavily on Microsoft products and worked closely with its supporting team to build its world class data infrastructure to support Anthem’s customer operation business. From that experience, I was not only fascinated by its products and services but also its community, filled with passionate and supportive individuals who love to learn and share.

Feeding off of this passion, I began joining local PASS SQL groups, attended and presented at SQL Saturdays, got my Microsoft certification with SQL Server 2016, and started this very tech blog you are reading. After a few years in the work place, I started to sense that cloud, big data and AI were and are the future of our world, and that the space provided me with a unique set of challenges to solution problems and share my knowledge with others, both being things I love.

In order to broaden my cloud technology skills and get more involved with direct customer interactions so as to help them with my skills, I accepted an offer from MCAConnect, a Microsoft Gold Partner consulting firm, as a senior consultant. From that experience, I got the opportunity to leverage Microsoft cloud technologies to help customers of different sizes, working directly with executive or management level clients to build solutions for their business needs. I took great pride in being able to leverage my knowledge in Microsoft technologies to help impact, change, and improve the various business processes of varying clients.

Yet despite the opportunities I was given, I still casually found myself daydreaming of one day pacing around the Seattle office with a fresh, hot-off-the-printer Microsoft badge, swimming in one of the richest pools of knowledge in the world. It wasn't until the end of 2018 that my daydreaming became a reality. I had attended SQL Saturday in Washington DC, and after the event, the Microsoft One Commercial Partner (OCP) group reached out to me about a few openings. I had reached out to my Microsoft friends, my mentor James and the hiring manager Justin, to better understand the job and Microsoft's working culture. I found out that the Cloud Solution Architect for Big Data and AI role under Justin's team was the best fit for me and aligned well with my personal goals, and shortly after our discussion, I applied for the position.

After 1 general interview and 2 technical interviews, I got the offer.

To be honest, I felt like I had bombed my technical interviews as I wasn't able to answer a ton of the questions asked, but it wasn't until later, after reading Microsoft CEO Satya Nadella's book 'Hit Refresh', that I realized why it actually made sense that I got the offer. Microsoft doesn't hire know-it-alls but learn-it-alls, welcoming people with a growth mindset. The moment I got the Microsoft offer, I admit I was ecstatic, more so than I had ever been in my professional career, as it was a truly elusive dream-come-true moment. Of course, after waiting through a lengthy hiring process, done through an internal career system which tracks progress and collects the necessary documents, I found myself talking with my HR manager about onboarding, a complicated process due to the size of the company. Along the way, my manager and the HR manager worked diligently and patiently, communicating with me regularly throughout the one-month process, reducing my anxiety levels significantly.

Redmond, Seattle and Microsoft Campus

My New Employee Orientation was set to be at Microsoft's headquarters located in Redmond, WA. My flight, hotel, and car rental were pre-booked by the company based on my schedule. Last Sunday evening, I landed at Seattle Airport, and from the rental car office to the hotel, I got all kinds of Welcome to Microsoft signs and greetings from the service companies, an experience comparable to a homecoming. The hotel I stayed at was the Marriott in Redmond Town Center, a lovely shopping center with all kinds of restaurants, fitness studios, and shops.

The next morning, I drove to campus to attend New Employee Orientation (NEO). The building was called Building 92, and at first, I thought it must have been a random number as I could not imagine any company having more than 90 buildings on its headquarters campus alone. It was, however, true, and to my astonishment, the campus was still expanding. It was so large that transportation between buildings often relied on shuttle cars. It had multiple walking/running trails, football and basketball courts, and on-campus shopping centers, like those in Disneyland, and for every 3-4 buildings, there was a food court, adding up to over 30 food courts on campus providing all kinds of cuisines.

The campus was covered with trees, creating the feeling of working in a park. Later, my manager showed me and one of my teammates the on campus treehouses designed by someone famous, though I couldn’t remember the name, and those treehouses were accessible by all employees based on reservation. Then for the next few days, I visited multiple buildings. The offices were illuminated with bright natural light and were separated into different rooms, enhancing my mood as well as providing a feeling of privacy. It fit the culture of Microsoft, formal but fun.

Also, very important, provided for free on every floor and corner, was a variety of drinks and office supplies free for the taking.

New Employee Orientation (NEO)

NEO happened in building 92 which was the visitor’s center. It started with new employees (NE) gathering in the product showcase room where we had access to coffee, Xbox, VR, and networking opportunities. Some posed for a picture in front of a photo featuring the first team of Microsoft employees all of different ages, genders, and cultural backgrounds.

At around 8:00 a.m., we were led through different stations to process our I-9, get our photo taken for employee badges, and have breakfast before entering the conference room where the NEO meeting was hosted. In the room, we were randomly assigned to different tables, done on purpose for us to interact with individuals from different areas and gain exposure to the different groups in the company. The leading hosts of the event were senior leaders and employees, all of whom were volunteers. It started with simple greetings and storytelling exercises for each table. Then one of the hosts delved into the company culture and his experiences at Microsoft, followed by the HR manager introducing the benefits which Microsoft provided. During this session, nearly everyone around me, including myself, whispered amongst each other, doubting what we heard and checking our sanity. Are all of those really free and reimbursable? I have to say that although the general benefits, including the 401(k), health insurance, family support, Health Savings Accounts, and employee stock purchase plan, were amazing, the ones that caught my attention were the fitness benefit, tuition support, and donation matches, as my health and growth are the two stabilizing factors of my career.

Following the benefits introduction, there were more stories and program introductions provided by different speakers, all of which reinforced my excitement for finally being here. During lunch, hiring managers came to the conference room to greet their new hires before taking them to lunch. My manager Justin came to welcome me and my new teammate Alex, and proceeded to take us to the café building nearby to provide us with our new PC. After lunch we were sent back to the NEO building to finish up with our security settings and the benefit vendor tour.

NEO officially ended at around 2pm. Justin soon after gave us a brief introduction meeting and provided some tips on what to expect for the next couple of days and weeks. He also told us there was a TechFest on campus happening this week so in lieu of a short Q&A meeting with him each day, we were welcome to attend the event or leave earlier to check out the ins and arounds of Seattle.

TechFest and Seattle area

TechFests, in conjunction with other events, happen every now and then and offer various keynote sessions and presentations. Here, you can learn of the latest research happening in the tech world as well as inside different teams at Microsoft. This week's TechFest was about AI and machine learning. I was so impressed by all the new inventions and studies out there in our field, and felt so fortunate to be so close to the inventors and researchers, allowing me to talk directly to them about their ideas while asking questions. It was quite an experience to be surrounded by such creative and curious individuals, all so very accomplished and intelligent, all leaders of the industry.

After each day, I had a chance to visit the Seattle area, a place where I have never been before. It was such a beautiful city with lakes, ocean, and mountains, home to multiple large corporations such as Microsoft, Amazon and Starbucks. There were so many hiking, running, and biking trails, various outdoor activities, a paradise for the adventurous soul. It was also one of the more foodie cities I have seen, as all of those companies and universities embrace diversity, offering a vast choice of ethnic cuisines from around the world.

My experience during my first week at Microsoft truly reflects its mission statement, "Empower every person and every organization on the planet to achieve more", and its culture of valuing a growth mindset. Microsoft is a place that an aspiring technology professional like myself can use as a platform to amplify my skillsets and impact the world.

I am ready for this new chapter of my career.

Thanks for reading,

Your friend, Annie

You may need a table variable for IRR and NPV in DAX

I finished my fundamental finance accounting course last week. I learned 2 awesome functions which help decide which projects to undertake. The first is NPV (Net Present Value) and the second is IRR (Internal Rate of Return). I found these two so powerful for helping decide which investment to choose that I wanted to implement them in DAX. Fortunately, DAX has two functions, XNPV and XIRR, which I could use. I found that XNPV and XIRR also exist in Excel as replacements for NPV and IRR. They have slightly different calculation details, but the purpose is the same.

The Excel XNPV function is a financial function that calculates the net present value (NPV) of an investment using a discount rate and a series of cash flows that occur at irregular intervals: =XNPV(rate, values, dates)

The Excel XIRR (Internal Rate of Return) is the discount rate which sets the net present value (XNPV) of all future cash flows of an investment to zero. If the NPV of an investment is zero, it doesn't mean it's a good or bad investment; it just means you will earn the IRR (discount rate) as your rate of return: =XIRR(values, dates, guess)
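To make the two formulas concrete, here is a minimal Python sketch of my own (an assumption-laden illustration, not Excel's exact algorithm: it discounts by (1 + rate) raised to days-from-first-cash-flow / 365, as Excel's XNPV does, and finds XIRR by simple bisection rather than Excel's iterative solver):

```python
from datetime import date

def xnpv(rate, cashflows):
    """Net present value of irregularly spaced cash flows.

    cashflows: list of (date, amount) pairs; each amount is discounted by
    (1 + rate) ** (days since the earliest cash flow / 365).
    """
    t0 = min(d for d, _ in cashflows)
    return sum(amt / (1.0 + rate) ** ((d - t0).days / 365.0) for d, amt in cashflows)

def xirr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Rate that drives xnpv to zero, found by bisection.

    Assumes xnpv is positive at `lo` and negative at `hi`, which holds for the
    usual pattern of an initial outflow followed by later inflows.
    """
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if xnpv(mid, cashflows) > 0:
            lo = mid  # NPV still positive: the zero-crossing rate is higher
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0
```

For example, paying 1000 on 2020-01-01 and receiving 1100 a year later gives an XIRR of roughly 10%, and plugging that rate back into xnpv returns approximately zero.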

The challenge in DAX, I found, is that we may not have a date field. For example, we may just have an initial investment and an estimate assuming a payment amount every year or every month in the future. Something like this:

Assumption - IRR NPV

So in DAX, we can create a table variable in the measure to leverage XIRR and XNPV.

VAR cashflowtable =
    UNION (
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ), YEAR ( NOW () ) ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Initial Inventestment] )
                )
        ),
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ) + 1, YEAR ( NOW () ) + [Investment Years] ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Annual Cash flow] )
                )
        ),
        GENERATE (
            GENERATESERIES (
                YEAR ( NOW () ) + [Investment Years],
                YEAR ( NOW () ) + [Investment Years]
            ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Exit Value] )
                )
        )
    )
RETURN
    XIRR ( cashflowtable, [Values], [date] )

The NPV measure builds the exact same cashflowtable variable; only the final line changes:

RETURN
    XNPV ( cashflowtable, [Values], [date], [Interest Rate] )

Hopefully, you find this helpful.


your friend, Annie

Migrating a Local Power Query to a Power BI Dataflow and Linking It to an Azure Data Lake Storage Account

Within Microsoft's latest Power BI update, the one that excites me most is that we can link Power BI dataflows to an Azure Data Lake Storage account.

Benefits are:

  1. When you have a PBIX in import mode larger than 1 GB and you are not a Premium member, you cannot publish it to the service. However, with dataflows in Power BI you can store up to 10 GB of data.
  2. Your workspace members are able to reuse the same dataflow as the source for their Power BI files.
  3. When linked to an Azure Data Lake Storage Gen 2 account, you can use the data from Power BI dataflows as a data source across your Azure applications.

To try this, you need the following:

You have a Power BI Pro license or above.

You are an admin of your Azure portal. (Or you are a good friend of your admin ^_^)

Here are a few things I found out when I did my experiment that I think you might face:

  1. You can use a blank query to copy M code from your Power Queries.
  2. If you have a table function to transfer and you don't have a Premium account, you can still do so by selecting "do not load".
  3. To configure Azure Data Lake Gen 2 to connect to Power BI, follow this document from Microsoft.
  4. Even when you have configured everything correctly according to the above Microsoft document, you may still have issues like I did. Please follow this community thread to see the latest updates from the Power BI community discussion.

Latest update: the Microsoft team has fixed the above. I have now tested it and it works fine.


Your friend, Annie

An easy way to compare all tables’ DDL between two SQL environments

To continue my story about migrating a data warehouse from a UAT to a PROD environment:

A DDL difference between the same source loading table in the two environments will cause my data warehouse load to fail, so I needed an easy way to compare the two environments for all my source tables. There are a lot of good tools with free trials out there to choose from. However, my client has regulations against installing third-party tools, so I needed to write a query to do it instead.

Inspired by an answer from this post, which compares one table, I generated the following query, which can loop through all of my targeted source tables.

DECLARE @i int = 1;
DECLARE @Table sysname;
DECLARE @Server1 sysname = 'ServerName01', @DB1 sysname = 'DatabaseNameO1';
DECLARE @Server2 sysname = 'ServerName02', @DB2 sysname = 'DatabaseNameO2';

WHILE @i <= (SELECT MAX([RANKColumn]) FROM [dbo].[EvaluationTableList]) -- the list table which contains all the tables you want to validate
BEGIN
    SET @Table = (SELECT [TABLENAME] FROM [dbo].[EvaluationTableList] WHERE [RANKColumn] = @i);

    INSERT INTO [dbo].[EnviromentObjectDifference] -- this is the output table
    SELECT Table1.ServerName,
           Table1.DbName,
           Table1.SchemaName,
           Table1.TableName,
           Table1.ColumnName,
           Table1.DataType,
           Table1.Length,
           Table1.Precision,
           Table1.Scale,
           Table1.Is_Identity,
           Table1.Is_Nullable,
           Table2.ServerName AS T2ServerName,
           Table2.DbName AS T2DbName,
           Table2.SchemaName AS T2SchemaName,
           Table2.TableName AS T2TableName,
           Table2.ColumnName AS T2ColumnName,
           Table2.DataType AS T2DataType,
           Table2.Length AS T2Length,
           Table2.Precision AS T2Precision,
           Table2.Scale AS T2Scale,
           Table2.Is_Identity AS T2Is_Identity,
           Table2.Is_Nullable AS T2Is_Nullable
    FROM (SELECT @Server1 ServerName,
                 @DB1 DbName,
                 SCHEMA_NAME(t.schema_id) SchemaName,
                 t.Name TableName,
                 c.Name ColumnName,
                 st.Name DataType,
                 c.max_length Length,
                 c.precision Precision,
                 c.scale Scale,
                 c.is_identity Is_Identity,
                 c.is_nullable Is_Nullable
          FROM [ServerName01].[DatabaseNameO1].sys.tables t
          INNER JOIN [ServerName01].[DatabaseNameO1].sys.columns c ON t.Object_ID = c.Object_ID
          INNER JOIN [ServerName01].[DatabaseNameO1].sys.types st ON st.system_type_id = c.system_type_id AND st.user_type_id = c.user_type_id
          WHERE t.Name = @Table) Table1
    FULL OUTER JOIN
         (SELECT @Server2 ServerName,
                 @DB2 DbName,
                 SCHEMA_NAME(t.schema_id) SchemaName,
                 t.Name TableName,
                 c.Name ColumnName,
                 st.Name DataType,
                 c.max_length Length,
                 c.precision Precision,
                 c.scale Scale,
                 c.is_identity Is_Identity,
                 c.is_nullable Is_Nullable
          FROM [ServerName02].[DatabaseNameO2].sys.tables t
          INNER JOIN [ServerName02].[DatabaseNameO2].sys.columns c ON t.Object_ID = c.Object_ID
          INNER JOIN [ServerName02].[DatabaseNameO2].sys.types st ON st.system_type_id = c.system_type_id AND st.user_type_id = c.user_type_id
          WHERE t.Name = @Table) Table2
      ON Table1.ColumnName = Table2.ColumnName
    WHERE Table1.ColumnName IS NULL OR Table2.ColumnName IS NULL;

    SET @i = @i + 1;
END

Hopefully, you find this helpful.


Your friend, Annie

Create Dynamics CRM Online – User Activity PowerBI Report

A couple of days ago, a client of mine asked me to build a Dynamics CRM Online user activity dashboard. In the Dynamics CRM on-premises version, I know this information resides in the audit table. However, with Dynamics CRM Online, I am not able to access the backend database directly. Fortunately, there are a few Power BI content packs for Dynamics CRM Online out in the marketplace, so I was able to read through the PBIX files and find out how Power BI connects to the Dynamics CRM Online data.

Power BI uses OData as the data source to connect to the Dynamics CRM Online web services, which gives you almost all the data available. (There are still some important fields and tables which exist only in the on-premises version, making it challenging to build some reports effectively.) You just need to change the highlighted piece below to your organization's CRM web name.

CRM Online Audit Report snapshots


Odata Tables available
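For reference, the endpoint you point Power BI at follows a predictable shape. Here is a small Python helper of my own to show how the URL is assembled (the org name "contoso" and the API version are illustrative assumptions; swap in your organization's CRM web name):

```python
def crm_odata_url(org, entity_set, version="v9.0"):
    """Build the Dynamics 365 Online OData (Web API) URL for an entity set.

    `org` is your organization's CRM web name (the highlighted piece above);
    the entity set name, e.g. "audits", selects the table to query.
    """
    return f"https://{org}.crm.dynamics.com/api/data/{version}/{entity_set}"
```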

I did find the 'Audit' table available. However, there are a few challenges in interpreting the data. I am listing the challenges below along with the steps I took to handle them.

  1. Initially, the 'Audit' table is blank.
    • This is because in CRM Online you need to enable the Audit Tracking setting. You can follow this blog to enable it.
  2. Every field is in its code or ID format, and there is no mapping table in the OData connection to map between field IDs and field values.
    • I checked the Microsoft-provided content pack and found that those mapping tables are hard-coded. So I had to search Microsoft's CRM Audit entity documentation to get the hard-coded mappings for the following fields:
      • Operation – the action that causes the audit – it will be Create, Delete, or Update
      • Action – actions the user can perform that cause a change
      • Audit table fields
  3. You can expand the userID field to get the information (such as full name, login account, address, etc.) of the user who caused a change. This, however, only provides information for users who make changes like 'create', 'update', and 'delete'. For 'access' records, this userid only shows 'SYSTEM' as the user.
    • To get the name of the user who accessed CRM, we need to link the '_objectid_value' field from the Audit table to the 'UserId' field from the systemusers table.
  4. The 'ObjectTypeCode' field is missing. This key field is missing from the OData data pull, so we are not able to find which entities the changes were made on.
    • Unfortunately, I have not found a solution for this unless the CRM team exposes the field in the CRM Audit entity.
    • If this field becomes available in the future, we can use it in combination with the 'Attributemask' field to get the column names of the objects that have been changed, so that we can build further analysis reports.

For now, with the information we can get, we can start to build a dashboard of user login activities.

Final Result.PNG

