An easy way to compare all tables’ DDL between two SQL environments

This post continues my story about migrating a data warehouse from the UAT environment to the PROD environment.

A DDL difference between the same source loading table in the two environments will cause my data warehouse load to fail, so I needed an easy way to compare the two environments for all of my source tables. There are a lot of good tools with free trials out there to choose from. However, there are regulations on my client's side against installing third-party tools, so I needed to write a query to do it instead.

Inspired by an answer from this post, which compares one table, I wrote the following query, which loops through all of my targeted source tables.

DECLARE @Table VARCHAR(100);
DECLARE @i INT = 1;

-- Environment labels written into the output rows; point these at your two servers/databases
DECLARE @Server1 VARCHAR(100) = 'ServerName01';
DECLARE @DB1 VARCHAR(100) = 'DatabaseName01';
DECLARE @Server2 VARCHAR(100) = 'ServerName02';
DECLARE @DB2 VARCHAR(100) = 'DatabaseName02';

WHILE @i <= (SELECT MAX([RANKColumn]) FROM [dbo].[EvaluationTableList]) -- this is the list table which contains all the tables you want to validate
BEGIN
    SET @Table = (SELECT [TABLENAME] FROM [dbo].[EvaluationTableList] WHERE [RANKColumn] = @i);

    INSERT INTO [dbo].[EnvironmentObjectDifference] -- this is the output table
    SELECT Table1.ServerName,
           Table1.DBName,
           Table1.SchemaName,
           Table1.TableName,
           Table1.ColumnName,
           Table1.DataType,
           Table1.Length,
           Table1.[Precision],
           Table1.Scale,
           Table1.Is_Identity,
           Table1.Is_Nullable,
           Table2.ServerName  AS T2ServerName,
           Table2.DBName      AS T2DBName,
           Table2.SchemaName  AS T2SchemaName,
           Table2.TableName   AS T2TableName,
           Table2.ColumnName  AS T2ColumnName,
           Table2.DataType    AS T2DataType,
           Table2.Length      AS T2Length,
           Table2.[Precision] AS T2Precision,
           Table2.Scale       AS T2Scale,
           Table2.Is_Identity AS T2Is_Identity,
           Table2.Is_Nullable AS T2Is_Nullable
    FROM (SELECT @Server1 AS ServerName,
                 @DB1 AS DBName,
                 SCHEMA_NAME(t.schema_id) AS SchemaName,
                 t.name AS TableName,
                 c.name AS ColumnName,
                 st.name AS DataType,
                 c.max_length AS Length,
                 c.precision AS [Precision],
                 c.scale AS Scale,
                 c.is_identity AS Is_Identity,
                 c.is_nullable AS Is_Nullable
          FROM [ServerName01].[DatabaseName01].sys.tables t
          INNER JOIN [ServerName01].[DatabaseName01].sys.columns c
              ON t.object_id = c.object_id
          INNER JOIN sys.types st
              ON st.system_type_id = c.system_type_id
             AND st.user_type_id = c.user_type_id
          WHERE t.name = @Table) Table1
    FULL OUTER JOIN
         (SELECT @Server2 AS ServerName,
                 @DB2 AS DBName,
                 SCHEMA_NAME(t.schema_id) AS SchemaName,
                 t.name AS TableName,
                 c.name AS ColumnName,
                 st.name AS DataType,
                 c.max_length AS Length,
                 c.precision AS [Precision],
                 c.scale AS Scale,
                 c.is_identity AS Is_Identity,
                 c.is_nullable AS Is_Nullable
          FROM [ServerName02].[DatabaseName02].sys.tables t
          INNER JOIN [ServerName02].[DatabaseName02].sys.columns c
              ON t.object_id = c.object_id
          INNER JOIN sys.types st
              ON st.system_type_id = c.system_type_id
             AND st.user_type_id = c.user_type_id
          WHERE t.name = @Table) Table2
        ON Table1.ColumnName = Table2.ColumnName
    WHERE Table1.ColumnName IS NULL
       OR Table2.ColumnName IS NULL;

    SET @i = @i + 1;
END
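The heart of the query is the FULL OUTER JOIN plus the IS NULL filter, which surfaces columns that exist in only one environment. That comparison amounts to a symmetric difference of the two column sets; here is a minimal sketch of the idea in Python, with made-up column metadata:

```python
# Sketch: the FULL OUTER JOIN + "IS NULL" filter is equivalent to taking the
# symmetric difference of the two environments' column-name sets.
# The sample metadata below is made up for illustration.
uat_columns = {"CustomerID": "int", "Name": "varchar", "LoadDate": "datetime"}
prod_columns = {"CustomerID": "int", "Name": "varchar", "Region": "varchar"}

only_in_uat = sorted(set(uat_columns) - set(prod_columns))
only_in_prod = sorted(set(prod_columns) - set(uat_columns))

print(only_in_uat, only_in_prod)  # ['LoadDate'] ['Region']
```

Note that the IS NULL filter reports only columns that exist on one side but not the other; removing it would also let you compare data types and lengths side by side for matching columns.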

Hopefully, you find this helpful.

Thanks,

Your friend, Annie

Create Dynamics CRM Online – User Activity PowerBI Report

A couple of days ago, a client of mine asked me to build a Dynamics CRM Online user activity dashboard. In the Dynamics CRM On-premises version, I know this information resides in the audit table. However, with Dynamics CRM Online, I am not able to access the backend database directly. Fortunately, there are a few PowerBI content packs for Dynamics CRM Online out there in the marketplace, so I was able to read through the PBIX file and find out how PowerBI connects to the Dynamics CRM Online data.

PowerBI uses OData as the data source to connect to the Dynamics CRM Online web services, which gives you almost all of the data. (Some important fields and tables that exist in the On-premises version are still missing, which makes building reports effectively a challenge.) You just need to change the highlighted piece below to your organization's CRM web name.

CRM Online Audit Report snapshots

 

Odata Tables available
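The connection PowerBI reads from is just an OData service URL. As a sketch, it can be built like this; the organization name "myorg" is hypothetical, and you should verify the service path against your own CRM version:

```python
# Sketch: building the OData feed URL that PowerBI reads from Dynamics CRM Online.
# "myorg" is a hypothetical organization name -- replace it with your CRM web name,
# and confirm the OrganizationData.svc path applies to your CRM version.
def crm_odata_url(org, entity_set, select=None):
    """Build an OData query URL for a CRM Online organization data service."""
    url = f"https://{org}.crm.dynamics.com/XRMServices/2011/OrganizationData.svc/{entity_set}"
    if select:
        # $select trims the feed to the listed attributes
        url += "?$select=" + ",".join(select)
    return url

print(crm_odata_url("myorg", "AuditSet", ["AuditId", "Operation"]))
```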

I did find the ‘Audit’ table available. However, there are a few challenges in interpreting the data. I am listing the challenges below, along with the steps I took to handle them.

  1. Initially, the ‘Audit’ table is blank.
    • This is because, in CRM Online, you need to enable the Audit Tracking setting. You can follow this blog to enable it.
  2. Every field is in its code or ID format, and there is no mapping table in the OData connection to map between field IDs and field values.
    • I checked the Microsoft-provided content pack and found that those mapping tables are hard-coded. So I had to google Microsoft’s CRM Audit Entity documentation to get the hard-coded mappings for the following fields:
      • Operation – the action that causes the audit – it will be create, delete, or update
      • Action – actions the user can perform that cause a change
      • Audit table fields
  3. You can expand the UserId field to get the information (such as full name, login account, address, etc.) of the user who caused a change. However, this only provides information for users who make changes like ‘create’, ‘update’, and ‘delete’. For the ‘access’ action, this UserId only shows ‘SYSTEM’ as the user.
    • To get the name of the user who accessed CRM, we need to link the ‘_objectid_value’ field from the Audit table to the ‘UserId’ field from the systemusers table.
  4. Missing ‘ObjectTypeCode‘ field. This key field is missing from the OData data pull for this table, so we are not able to find out which entities the changes were made on.
    • Unfortunately, I have not found a solution for this unless the CRM team exposes this field on the CRM Audit Entity.
    • If this field becomes available in the future, we can use it in combination with the ‘Attributemask’ field to get the column names of the objects that have been changed, so that we can build further analysis reports.
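The two lookups from steps 2 and 3 can be sketched like this. The sample audit rows are made up, and the Operation codes shown (1 = Create, 2 = Update, 3 = Delete, 4 = Access) are my reading of the CRM Audit entity documentation; verify them against your own organization:

```python
# Sketch: hard-coded Operation mapping plus the _objectid_value -> systemusers join.
# Sample rows and the code-to-label mapping are assumptions to be verified.
OPERATION_LABELS = {1: "Create", 2: "Update", 3: "Delete", 4: "Access"}

audits = [
    {"auditid": "a1", "operation": 4, "_objectid_value": "u-42"},
]
# Keyed by the systemusers table's UserId
systemusers = {"u-42": {"fullname": "Annie Xu"}}

for row in audits:
    # Decode the numeric Operation field into a readable label
    row["operation_label"] = OPERATION_LABELS.get(row["operation"], "Unknown")
    # For 'Access' records, the acting user comes from _objectid_value, not UserId
    user = systemusers.get(row["_objectid_value"], {})
    row["fullname"] = user.get("fullname")

print(audits[0]["operation_label"], audits[0]["fullname"])  # Access Annie Xu
```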

For now, with the information we can get, we can start building a dashboard of user login activities.

Final Result

Thanks.

How to create customized color themes for your PowerBI visuals

I find that having a nice-looking dashboard or report is just as important as the data itself. In this blog, I want to share with you some good tricks, which I learned recently from PowerBI.com and the YouTube channel EnterpriseDNA, for creating customized color themes in PowerBI Desktop.

By default, the customized theme feature is not enabled in PowerBI desktop. So what you need to do is go to ‘File’ -> ‘Options and Settings’

PowerBI Option Change

then enable ‘Custom Report Themes’ in the ‘Preview Features’ section.

Custom Report Theme Enable

After restarting your PowerBI Desktop, you will see ‘Themes’ under the ‘Home’ tab.

Themes tab

By choosing ‘Switch Themes’, you can import your customized theme, saved as a JSON file.

A typical JSON theme file looks like this:

{
  "name": "St Patricks Day",
  "dataColors": ["#568410", "#3A6108", "#70A322", "#915203", "#D79A12", "#bb7711", "#114400", "#aacc66"],
  "background": "#FFFFFF",
  "foreground": "#3A6108",
  "tableAccent": "#568410"
}

From <https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-report-themes/>

But how can we get a color combination that looks good if I am not an artist?

Usually, I create the foundation colors from a company logo or image. Since Halloween is coming soon, let’s use a Halloween picture as an example.

There is a recommended site, http://palettefx.com/, which helps you find all the colors used in an image.

Color Pickers

After that, open Notepad, edit the JSON code I shared above, and save it as a .json file.

JSON code
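If you prefer, a small script can write the theme file instead of hand-editing it in Notepad. This is just a sketch; the file name and the hex colors (a hypothetical Halloween palette) are placeholders for whatever your image yields:

```python
# Sketch: writing a PowerBI report theme file programmatically.
# The palette below is a made-up Halloween example -- substitute the colors
# picked from your own image.
import json

theme = {
    "name": "Halloween",
    "dataColors": ["#ff6600", "#2b1d0e", "#8a00c2", "#3c8d0d"],
    "background": "#FFFFFF",
    "foreground": "#2b1d0e",
    "tableAccent": "#ff6600",
}

# PowerBI expects a plain .json file, which 'Switch Themes' can then import
with open("halloween-theme.json", "w") as f:
    json.dump(theme, f, indent=2)
```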

Now you can import the JSON file into PowerBI, using the ‘Switch Themes’ feature we enabled, as your customized theme.

JSON file

Now, you can use your new customized theme to build your PowerBI reports!

There is also a PowerBI theme gallery: https://community.powerbi.com/t5/Themes-Gallery/bd-p/ThemesGallery

Hopefully it helps.

Your friend, Annie

 

From SQL to DAX – ‘Lead’ and ‘Lag’ window functions

In SQL, we have two window functions called LEAD and LAG. With these two functions, you can get the previous and next values of a column, partitioned by and ordered by other columns in a table.

Let’s use the AdventureWorks Sales.SalesOrderHeader table as an example. The following code gives you the previous and next SalesOrderID for each SalesPersonID, ordered by OrderDate.

Lead and Lag SQL code
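The SQL itself is in the screenshot, but the lead/lag logic it implements can be sketched in Python on made-up rows: sort within each partition, then take the neighboring values by position.

```python
# Emulating SQL LEAD/LAG: partition by SalesPersonID, order by OrderDate,
# then take the neighboring SalesOrderID. The sample rows are made up.
from itertools import groupby
from operator import itemgetter

orders = [
    {"SalesOrderID": 43846, "SalesPersonID": 274, "OrderDate": "2015-08-01"},
    {"SalesOrderID": 44501, "SalesPersonID": 274, "OrderDate": "2015-09-01"},
    {"SalesOrderID": 45283, "SalesPersonID": 274, "OrderDate": "2015-10-01"},
]

orders.sort(key=itemgetter("SalesPersonID", "OrderDate"))
for _, group in groupby(orders, key=itemgetter("SalesPersonID")):
    rows = list(group)
    for i, row in enumerate(rows):
        # LAG: the previous row in the partition, if any
        row["PreviousOrderID"] = rows[i - 1]["SalesOrderID"] if i > 0 else None
        # LEAD: the next row in the partition, if any
        row["NextOrderID"] = rows[i + 1]["SalesOrderID"] if i < len(rows) - 1 else None

print(orders[1]["PreviousOrderID"], orders[1]["NextOrderID"])  # 43846 45283
```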

However, these are very expensive functions, because the SQL engine needs to scan the table for every row where the functions are called.

In this case, it is much faster to use DAX in an SSAS Tabular model, where column-store and VertiPaq compression technologies are embedded. To replace the LEAD and LAG functions in DAX, we will use a key DAX function called ‘EARLIER’.

Using the same example mentioned above,

You can write the PreviousOrderID calculated column like this:

DAX previous and next.PNG

As we know, the ‘CALCULATE’ function converts the current row context into a filter context for the calculation in its first argument. However, the filter contexts created inside ‘CALCULATE’ (its second and following arguments) block the external filter contexts when they refer to the same columns. In this case, the ‘FILTER’ function blocks all the filter contexts added externally on the ‘SalesOrderHeader’ table; in other words, the calculation in the first argument of ‘CALCULATE’, MAX(SalesOrderHeader[SalesOrderID]), doesn’t know which row of SalesOrderHeader it is on. Only the ‘EARLIER’ function brings the previous row context back, which lets MAX(SalesOrderHeader[SalesOrderID]) know which row it is on.

Using the second row of the above screenshot as an example, the DAX calculation of the PreviousOrderID column can be read as:

Find the max SalesOrderID where SalesPersonID equals the SalesPersonID of the current row (retrieved by the ‘EARLIER’ function), which is 274, and where the OrderDate is older than the OrderDate of the current row (retrieved by the ‘EARLIER’ function), which means all the records with an OrderDate before 9/1/2015. Thus, the result is 43846.
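To make the evaluation order concrete, the same max-filter logic can be sketched in Python (the sample rows are made up):

```python
# Sketch of the DAX pattern: for each row, filter the whole table down to rows
# with the same SalesPersonID and an older OrderDate (what EARLIER recovers
# from the outer row context), then take the MAX SalesOrderID.
def previous_order_id(rows, current):
    candidates = [
        r["SalesOrderID"]
        for r in rows
        if r["SalesPersonID"] == current["SalesPersonID"]
        and r["OrderDate"] < current["OrderDate"]
    ]
    return max(candidates) if candidates else None

rows = [
    {"SalesOrderID": 43846, "SalesPersonID": 274, "OrderDate": "2015-08-01"},
    {"SalesOrderID": 44501, "SalesPersonID": 274, "OrderDate": "2015-09-01"},
]
print(previous_order_id(rows, rows[1]))  # 43846
```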

 

Thanks.

Your friend Annie.

 

 

 

The one investment which provides the highest ROI

You can invest your money and time in many things, like a house, stock options, a retirement account, etc. But the one investment that will give you the highest overall ROI with the least amount of risk is yourself.

ROI stands for Return on Investment. The definition of returns shouldn’t be just money; it should be a combination of money, happiness, health, and personal satisfaction. Based on this definition, the one investment that will reliably provide those returns is yourself.

Personally, I have spent quite a lot of money and time on attending conferences, taking courses that help me grow professionally, a gym membership, a yoga studio, healthy supplements, home robots, books in different formats, facial products that make my skin look better (La Mer is my favorite brand), pretty clothes that help me feel more confident, and lots of not-free but very good apps that make my life more efficient… It turns out that all of those investments in myself bring me far more satisfaction in income, health, and self, which makes me happier in general.

Basically, try investing your money and time in things that do the following, and I guarantee you won’t regret it.

  1. Make you more valuable professionally
  2. Improve your health
  3. Make you more confident
  4. Save you time on unproductive work or things you don’t like, so that you can instead use that time to focus on the things you are good at and passionate about

Introduction

Hello friend,

My name is Annie Xu. I am an IT girl who loves fitness. I love to share and love to help people.

In 2013, I started my journey of creating Business Intelligence (BI) solutions and found myself very passionate about this area. Inside the circle of technologists, knowledge shared by others has been extremely helpful to me.

That’s why I started my very first blog site to share my knowledge and tips during my journey of  learning Business Intelligence.

Hopefully, you will find some useful information here. Feel free to send me your comments, thoughts, or questions at anniexu1990@gmail.com.

