Useful Documentation about Choosing the Right Datastore

It is important to understand the difference between relational and non-relational storage solutions because different solutions fit different use cases and application types.

I have organized some resources below to help you understand the differences between these two, and hopefully after reading them you can:

• Know the different datastore categories available

• Know the different use cases for datastore workloads

• Know the Azure and OSS datastore offerings available

• Design solutions that pick the right datastore for the use case

Choosing the Right Datastore

Criteria for choosing a data store

Choosing a big data storage technology in Azure

Choosing a search data store in Azure

Choosing a stream processing technology in Azure

Choosing a batch processing technology in Azure

Choosing a real-time message ingestion technology in Azure

Thank you,

Your friend,


Power BI Adjacent Microsoft Services and Products

Power BI in itself has a lot of features and functionalities which help you with your job or business. However, Power BI alone won’t solve all your problems, and in some cases you may want to pair it with other services.

Power BI Premium Capacity

By default, most people will use the Power BI Pro license level. But when should you consider leveraging Power BI Premium capacity? There are a couple of things you may need to consider: 1. storing larger datasets; 2. allowing faster DirectQuery/Live Connection queries; 3. supporting paginated reports; 4. allowing unlimited content sharing for both internal and external viewers.

If you want more information about Power BI Premium, please refer to this Microsoft doc.

Azure Synapse Analytics

Azure Synapse Analytics (ASA) is a great alternative for your Power BI model development. It allows you to query petabytes of data in a performant way at a reasonable price by leveraging the DirectQuery mode of Power BI. The ASA SQL pool is an MPP (massively parallel processing) solution with result set caching and materialized views built in, so it can handle high volumes of concurrent queries. By building your data warehouse in ASA, as compared to Power BI alone, you can provide an enterprise-grade data preparation and data warehousing solution that serves as a single source of truth for Power BI and other applications, with centralized security. In addition, the studio experience within ASA allows team collaboration amongst data scientists, data engineers, DBAs, and BI developers.

For more information about this topic, please refer to this whitepaper.

AI and Machine Learning Services

There are three ways you can enrich your BI data with AI insights within Power BI.

1. Auto ML in Power BI Dataflow; Official Doc Tutorial

2. Calling Existing Cognitive Services; Official Doc Tutorial

3. Calling Machine Learning Models Stored in Azure Machine Learning Services; Official Doc Tutorial

Azure Data Lake Storage Gen 2

By default, data used with Power BI is stored in internal storage provided by Power BI. With the integration of dataflows and Azure Data Lake Storage Gen2 (ADLS Gen2), you can store your dataflows in your organization’s Azure Data Lake Storage Gen2 account.

Simply put, a dataflow within Power BI is Power Query + the Common Data Model (CDM). With the Common Data Model (CDM), organizations can use a data format that provides semantic consistency across applications and deployments. And with Azure Data Lake Storage Gen2 (ADLS Gen2), fine-grained access and authorization control can be applied to data lakes in Azure.

For more information about this topic, please refer to this official doc.

Azure Analysis Services

The core data modeling engine of Power BI is actually Analysis Services. So using Azure Analysis Services (AAS), you can create data models independently from the Power BI service. Right now the largest SKU of AAS allows you to store up to 400 GB of data. You can allow different reporting applications (like Power BI, Tableau, and Excel) to connect to your AAS models. Within Microsoft, the Analysis Services team is a sub-team of Power BI. The product roadmap for Power BI is to minimize the gap in the data modeling experience between Power BI Premium capacity and Azure Analysis Services. That’s why we are seeing recent Power BI updates that include enterprise-grade data modeling capabilities such as increased dataset size limits, the XMLA endpoint, and deployment pipelines, etc. However, most of those features are in preview and are about to go GA. In short, if you are migrating your on-premises SSAS model to Azure or you do not want to wait for the Power BI features to go GA, then AAS is your easiest option. Otherwise, choosing Power BI Premium to host your reports and data models is strongly recommended.


Power Automate & Power Apps

When I think about Power BI with the other Power Platform tools, I think of it as an Insights + Action play. You can use Power BI to get insights from your data and then use Power Automate or Power Apps to take action on them.

The bi-directional connectors allow you to connect the services to each other, which means you can embed a Power BI report into a Power App and embed a Power App into a Power BI report.




Azure Logic Apps and Azure Monitor Log Analytics

• Synchronously Refreshing a Power BI Dataset using Azure Logic Apps.

Reference: A Step by Step Community Blog Post.

• Use Power BI to understand your Logic Apps consumption data.

Reference: A Microsoft Blog

• With Azure Monitor Logs, you can understand your Power BI Embedded consumption. You can also use Power BI to create customized reports from your Azure Monitor Logs.

Reference: Microsoft Doc

Azure Stream Analytics

This is a solution for true real-time analytics in Power BI. It enables you to visualize your IoT or event data.

A typical architecture is Device Data -> Event Hub -> Azure Stream Analytics -> Power BI Streaming Dataset -> Power BI Dashboards.

Reference: Official Document
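The last hop of that pipeline, pushing rows into the Power BI streaming dataset, can also be sketched directly against the dataset’s push endpoint. Below is a minimal Python sketch; the push URL is a placeholder you would copy from your own streaming dataset’s API Info page in the Power BI service, and the row fields are made-up examples.

```python
import json
from urllib import request

# Placeholder: copy the real push URL from your streaming dataset's
# "API Info" page in the Power BI service.
PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>"

def build_payload(rows):
    """Serialize a list of row dicts into the JSON array the push API expects."""
    return json.dumps(rows).encode("utf-8")

def push_rows(rows, url=PUSH_URL):
    """POST one or more rows to the streaming dataset's push endpoint."""
    req = request.Request(
        url,
        data=build_payload(rows),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status
```

In a real pipeline, Stream Analytics does this push for you; calling the endpoint yourself is handy for quick tests of a streaming dashboard.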

My blog:

Microsoft Teams

You can Add your Power BI report as an App to Teams.

Reference: Official Doc

Also if you have admin credentials to Teams, then you can Analyze Call Quality Data in Power BI.

Reference: Official Doc

Dynamics 365

• You can use Power BI to analyze the data in Dynamics 365 (OData or Entity Store from LCS)

• You can pin your Power BI visuals to Dynamics 365

• You can use existing solutions from Lifecycle Services

• There is a Power BI embedded experience within Dynamics (boxed)


  1. Official Doc Configure Dynamics 365 Power BI Integration
  2. Official Doc: Service in Dynamics F&O
  3. Official Doc: Get Dynamics Data in Power BI using Business Central
  4. Official Doc: Customize Dynamics F&O analytical workspace

In summary, there are many other services inside or outside the Microsoft product portfolio that work well with Power BI. So consider thinking outside the box and creating different solutions for your business.

Your friend,


Load data from Cosmos DB to Event Hub using an Azure Function in the Azure Portal

One way to load data from Cosmos DB to Event Hub is to use an Azure Function. Although there are many code samples out there for creating such an Azure Function, if you are like me and do not have much application development experience, reading those code samples is a bit challenging. Luckily, the Azure Portal makes it easy.

  • Navigate to your Cosmos DB in the Azure Portal. On the left panel, you can find an option under Settings called ‘Add Azure Function’

CosmosDB Azure Function 1

  • Fill in the form that pops up after you click ‘Add Azure Function’ and choose C# as the function language. Then click ‘Save’
    • CosmosDB Azure Function 2


  • The portal will automatically navigate to the Azure Function Apps page, and you will find the newly created Azure Function there. The next step is to add the data target; in this case, it is Event Hub. To add the Azure Function output target,
    • Click on ‘Integrate’
    • Click on ‘New Output’ in the center form
    • Then choose ‘Azure Event Hub’ from the output choices
    • Save
      CosmosDB Azure Function 3
  • After you hit Save, you will see the output form. Change the Event Parameter Name field to ‘$return’, and make sure ‘Use Function Return Value’ is checked before clicking ‘Save’
    CosmosDB Azure Function 4
  • The next step is to update the Azure Function run.csx code to implement the data integration.
    • Click on the FunctionName itself and the run.csx editing page will show up.
    • Then change the current code into this code
    • And click on ‘Save and Run’
    • Please note where I highlighted in yellow: depending on which runtime version of Azure Functions you have, the reference can be either #r "Microsoft.Azure.Documents.Client" or #r "Microsoft.Azure.DocumentDB.Core". So swap those two if you run into an error.
      • CosmosDB Azure Function 5
  • Now your Azure Function is up and running, capturing the changes and sending the changed documents to Event Hub.
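To make the shape of that integration concrete, here is a small Python sketch of the same logic the generated run.csx performs: the Cosmos DB change-feed trigger hands the function a batch of changed documents, and each one is serialized and handed to the Event Hub output binding. The `send` callable below stands in for that binding, so this is a sketch of the flow rather than deployable function code.

```python
import json

def handle_changes(docs, send):
    """Serialize each changed Cosmos DB document and forward it to Event Hub.
    `send` stands in for the Event Hub output binding ($return in the portal
    setup above); here it can be any callable, which keeps the logic testable."""
    events = [json.dumps(doc, separators=(",", ":")) for doc in docs]
    for event in events:
        send(event)
    return events

# Example with a list collecting events in place of the real binding:
captured = []
handle_changes([{"id": "1", "status": "updated"}], captured.append)
```

In the portal-generated C# function, the trigger delivers the changed documents and the `$return` value is what gets sent to Event Hub; the sketch mirrors that one-document-in, one-event-out mapping.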



Your friend, Annie

Microsoft Bot 101

There are a lot of concepts in Microsoft Conversational AI services. My teammate and I organized the following concepts and resources for you; hopefully they help you get started.

  • Types of Bots:
    • Knowledge

This type of bot gets its results from a knowledge base. A knowledge base is a pre-stored database which hosts question-and-answer pairs.

Source Link

  • Enhancement

This is a helping conversation piece built as an enhancement to an existing working application.

  • Functional

The bot itself is a full application

  • Virtual Assistant

It is an open-source Virtual Assistant solution that provides you with a set of core functional capabilities and full control over the end-user experience.

Source Link

  • Technologies
    • Q&A (Aka Knowledge base stores question and answer pairs) Link
      • Score

When you ask the bot a question, the bot returns a score showing what percentage it matches existing questions in the knowledge base, using Microsoft’s internal fuzzy-match logic.

  • Metadata

It allows us to separate one piece of information from the next. Metadata is data about data. By adding meta tags to your question-and-answer pairs in a QnA knowledge base, the bot can better filter answers for the end user.

Source Link

  • Single/Multiple Turn

The distinction between single-turn and multi-turn is that a multi-turn conversation has follow-up questions.

Source Link
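To see those match scores yourself, you can call the knowledge base’s generateAnswer REST endpoint. The Python sketch below only assembles the request pieces; the endpoint host, knowledge-base id, and endpoint key are placeholders you would take from your own QnA Maker resource.

```python
import json

def build_qna_request(endpoint, kb_id, endpoint_key, question, top=3):
    """Assemble the URL, headers, and JSON body for a QnA Maker
    generateAnswer call. Each answer in the response carries a `score`
    (0-100) from the fuzzy match described above."""
    url = f"{endpoint}/qnamaker/knowledgebases/{kb_id}/generateAnswer"
    headers = {
        "Authorization": f"EndpointKey {endpoint_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"question": question, "top": top}).encode("utf-8")
    return url, headers, body
```

You would POST `body` to `url` with those headers (for example via `urllib.request`) and read the `answers[].score` values in the JSON response.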

  • LUIS

Language Understanding (LUIS) is a cloud-based API service that applies custom machine-learning intelligence to natural language text to predict overall meaning and pull out relevant, detailed information. Link

  • Intents

The overall meaning of the phrase

  • Entities

The detailed pieces of information that LUIS extracts

  • Patterns

A pattern is a combination of expression matching and machine learning to increase intent and entity prediction

Source Link

  • Utterances

Text in the user’s own words

  • Natural language processing

The process that turns a user’s unstructured text input into a JSON-formatted response with a top intent. LUIS supports prebuilt NLP models as well as custom models for domain-specific natural language.

  • Adaptive Cards

They are a way to present a self-contained UI to a user within a larger UI such as a chat client.

Source Link
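As a concrete illustration, here is a minimal Adaptive Card built as a plain Python dict and wrapped in the Bot Framework attachment envelope. The `type`/`version`/`body` fields and the `application/vnd.microsoft.card.adaptive` content type follow the public schema; the text and action title are made-up example content.

```python
import json

# A minimal Adaptive Card expressed as a plain Python dict.
card = {
    "type": "AdaptiveCard",
    "version": "1.3",
    "body": [
        {"type": "TextBlock", "text": "Hello from the bot!", "weight": "Bolder"},
        {"type": "TextBlock", "text": "Pick an option below.", "wrap": True},
    ],
    "actions": [{"type": "Action.Submit", "title": "OK"}],
}

# The Bot Framework attaches the card to an activity with this content type:
attachment = {
    "contentType": "application/vnd.microsoft.card.adaptive",
    "content": card,
}

payload = json.dumps(attachment)  # what actually goes over the wire
```

The same card JSON renders in any channel that supports Adaptive Cards, which is what makes them "self-contained UI".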

  • OAuth

OAuth (Open Authorization) is an open standard for token-based authentication and authorization on the Internet. You can add authentication capability to your bot. Source Link

  • Bot specific concepts
    • Channels

A channel is a connection between the bot and communication apps.

  • Turn

One request-and-response exchange between the user and the bot

  • Events

Microsoft Teams sends notifications to your bot for events that happen in scopes where your bot is active.

From <Source Link>

  • Conversation

A set of request and response groups

  • Prompts

A prompt is used whenever a bot needs input from the user.

From <Source Link>

  • Dialogs

You can use dialogs to model a conversation and manage conversation flow.

From <Source Link>

  • State

State data can be used for many purposes, such as determining where the prior conversation left off or simply greeting a returning user by name.

From <Source Link>

  • Dispatch

If a bot uses multiple LUIS models and QnA Maker knowledge bases, you can use the Dispatch tool to determine which LUIS model or QnA Maker knowledge base best matches the user input.

From <Source Link>

Thank you for reading.

Your friend, Annie

Power BI Refresh Failed – Troubleshooting

Issue: A few Power BI datasets had been refreshing successfully but recently have had some inconsistent refresh failures.

Possible Problems and Solutions:

a. License level: Power BI Premium licenses have different levels, and Power BI Premium capacity is shared within a tenant and can be shared by multiple workspaces. The maximum number of models (datasets) that can be refreshed in parallel is based on the capacity node. For example, P1 = 6 models, P2 = 12 models, and P3 = 24 models. Beyond that, you must understand how memory is managed by the capacity. If memory is consumed by interactive processes (users interacting with reports), the refreshes will go into a queue or possibly fail depending on resource utilization. The best recommendation I can provide is to read this whitepaper, which provides much more detail. In addition, I would recommend installing the Power BI Premium Capacity app and beginning to analyze the model refreshes. This should assist in identifying and building a model refresh pattern that minimizes or eliminates refresh failures.

  1. So to solve the issue, try limiting parallel refreshes, or
  2. Increase the Power BI Premium license level through the O365 admin portal. Or purchase Power BI Embedded capacity through the Azure Portal and then assign the dedicated capacity to workspaces.

b. Source database activities: check the source database activities while the datasets are being refreshed. There might be database locks or database backups going on.

c. Network: Network communication from the data source server to the Power BI service in the Azure cloud might be another reason. Please check this community link for troubleshooting. If the network is the issue, we can further explore other network-related solutions such as adding Azure ExpressRoute.


Your friend Annie.

Simulated Streaming Data -> Power BI Embedded

This post is about something new I tried last week. The goal was to create a simulated streaming data source, feed it into Power BI as a streaming dataset, create a report from the streaming dataset, and then embed it into a web application. With proper directions provided by my teammates, I finished the implementation end to end within 1.5 hours. I was so impressed by how awesome it is and how easy it is to implement that I want to share those directions with you.

Main Step 1: Create simulated streaming data and push it to Power BI as a streaming dataset

  1. The basic steps include:
    1. Set up an Azure IoT Hub to capture the data
    2. Use an IoT simulator to simulate streaming data
    3. Create a Power BI App Workspace in
    4. Set up a Stream Analytics job to connect the data coming from IoT Hub to the Power BI streaming dataset

  1. There is a free tutorial course from edX that talks about the above steps in detail. You do not have to finish the whole tutorial; you can just go through the ones with green check marks, as shown below.

                 edx tutorial for iot

Main Step 2: Create a Power BI report using the streaming dataset

  1. Open Power BI Desktop
  2. Click on Get Data
  3. Choose ‘Power BI dataset’
  4. Choose the Power BI streaming dataset you just created from Main Step 1

powerbi datasets

  1. Once you are connected to the streaming dataset, you can use it to create your graphs and charts.
  2. Lastly, click ‘Publish’ to save your PBIX file back to

Main Step 3: Embed your Power BI report in your application

  1. If you need to embed your report into your application, please follow this tutorial

This tutorial introduces the following important topics, which we have not covered in the previous steps:

  1. How to register an application in Azure Active Directory (Azure AD) which allows your application to interact with Power BI.
  2. It provides a sample application package from GitHub. You can just update the web.config file of the sample package with the application ID you just created, the workspace ID where your report resides, the report ID of your report, and your Power BI credentials.

  1. Lastly, to set up automatic refresh of your streaming dataset in your application, please try the following blog


Thank you,

Your friend, Annie


My first week at Microsoft

On Monday March 3rd 2019, I started my journey at my dream company Microsoft. I would love to share the experience I have acquired over the past few days.

How did I get in

You may be wondering why I’ve decided to join Microsoft and how I got here, so let me tell you my story leading up to today. After finishing my master’s degree in Information Systems from Indiana University, I joined Anthem (one of the largest health insurance companies in the industry).

My team back then relied heavily on Microsoft products and worked closely with its supporting team to build its world class data infrastructure to support Anthem’s customer operation business. From that experience, I was not only fascinated by its products and services but also its community, filled with passionate and supportive individuals who love to learn and share.

Feeding off of this passion, I began joining local PASS SQL groups, attended and presented at SQL Saturdays, got my Microsoft certification with SQL Server 2016, and started this very tech blog you are reading. After a few years in the work place, I started to sense that cloud, big data and AI were and are the future of our world, and that the space provided me with a unique set of challenges to solution problems and share my knowledge with others, both being things I love.

In order to broaden my cloud technology skills and get more involved with direct customer interactions so as to help them with my skills, I accepted an offer from MCAConnect, a Microsoft Gold Partner consulting firm, as a senior consultant. From that experience, I got the opportunity to leverage Microsoft cloud technologies to help customers of different sizes, working directly with executive or management level clients to build solutions for their business needs. I took great pride in being able to leverage my knowledge in Microsoft technologies to help impact, change, and improve the various business processes of varying clients.

Yet despite the opportunities I was given, I still casually found myself daydreaming of one day pacing around the Seattle office with a fresh, hot-off-the-printer Microsoft badge, swimming in one of the richest pools of knowledge in the world. It wasn’t until the end of 2018 that my daydreaming became a reality. I had attended SQL Saturday in Washington DC, and after the event, the Microsoft One Commercial Partner (OCP) group reached out to me about a few openings. I reached out to my Microsoft friends, my mentor James, and the hiring manager Justin to better understand the job and Microsoft’s working culture. I found out that the Cloud Solution Architect for Big Data and AI role under Justin’s team was the best fit for me and aligned well with my personal goals, and shortly after our discussion, I applied for the position.

After 1 general interview and 2 technical interviews, I got the offer.

To be honest, I felt like I had bombed my technical interviews as I wasn’t able to answer a ton of the questions asked, but it wasn’t until later, after reading Microsoft CEO Satya Nadella’s book ‘Hit Refresh’, that I realized why it actually made sense that I got the offer. Microsoft doesn’t hire know-it-alls but learn-it-alls, welcoming people with a growth mindset. The moment I got the Microsoft offer, I admit I was ecstatic, more so than I had ever been in my professional career, as it was a truly elusive dream-come-true moment. Of course, after waiting through a lengthy hiring process, done through an internal career system which tracks progress and collects the necessary documents, I found myself talking with my HR manager about onboarding, a complicated process due to the size of the company. Along the way, my manager and the HR manager worked diligently and patiently, communicating with me regularly throughout the one-month process, reducing my anxiety levels significantly.

Redmond, Seattle and Microsoft Campus

My New Employee Orientation was set to be at Microsoft’s headquarters located in Redmond, WA. My flight, hotel, and car rental were pre-booked by the company based on my schedule. Last Sunday evening, I landed at Seattle Airport, and from the rental car office to the hotel, I got all kinds of Welcome to Microsoft signs and greetings from the service companies, an experience comparable to a homecoming. The hotel I stayed at was the Marriott in Redmond Town Center, a lovely shopping center with all kinds of restaurants, fitness studios, and shops.

Next morning, I drove to campus to attend New Employee Orientation (NEO). The building was called building 92, and at first, I thought it must have been a random number as I could not imagine any company having more than 90 buildings on headquarter campus alone. It was, however, true, and to my astonishment, the campus was still expanding. It was so large that transportation between buildings often relied on shuttle cars. It had multiple walking/running trails, football and basketball courts, on campus shopping centers, like those in Disneyland, and for every 3-4 buildings, there was a food court, summing up to over 30 food courts on campus providing all kinds of cuisines.

The campus was covered with trees, creating the feeling of working in a park. Later, my manager showed me and one of my teammates the on campus treehouses designed by someone famous, though I couldn’t remember the name, and those treehouses were accessible by all employees based on reservation. Then for the next few days, I visited multiple buildings. The offices were illuminated with bright natural light and were separated into different rooms, enhancing my mood as well as providing a feeling of privacy. It fit the culture of Microsoft, formal but fun.

Also, very important: on every floor and in every corner there was a variety of drinks and office supplies, free for the taking.

New Employee Orientation (NEO)

NEO happened in building 92 which was the visitor’s center. It started with new employees (NE) gathering in the product showcase room where we had access to coffee, Xbox, VR, and networking opportunities. Some posed for a picture in front of a photo featuring the first team of Microsoft employees all of different ages, genders, and cultural backgrounds.

At around 8:00 a.m., we were led through different stations to process our I-9, get our photo taken for employee badges, and have breakfast before entering the conference room where the NEO meeting was hosted. In the room, we were randomly assigned to different tables, done on purpose so that we could interact with individuals from different areas and gain exposure to the different groups in the company. The leading hosts of this event were senior leaders and employees, all of whom were volunteers. It started with simple greetings and storytelling exercises for each table. Then one of the hosts delved into the company culture and his experiences at Microsoft, followed by the HR manager introducing the benefits which Microsoft provided. During this session, nearly everyone around me, including myself, whispered amongst each other, doubting what we heard and checking our sanity. Are all of those really free and reimbursable? I have to say that although the general benefits including 401k, health insurance, family support, Health Savings Accounts, and the employee stock purchase plan were amazing, the ones that caught my attention were the fitness benefit, tuition support, and donation matching, as my health and growth are the two stabilizing factors of my career.

Following the benefits introduction, there were more stories and program introductions provided by different speakers, all of which reinforced my excitement for finally being here. During lunch, hiring managers came to the conference room to greet their new hires before taking them to lunch. My manager Justin came to welcome me and my new teammate Alex, and proceeded to take us to the café building nearby to provide us with our new PC. After lunch we were sent back to the NEO building to finish up with our security settings and the benefit vendor tour.

NEO officially ended at around 2pm. Justin soon after gave us a brief introduction meeting and provided some tips on what to expect for the next couple of days and weeks. He also told us there was a TechFest on campus happening this week so in lieu of a short Q&A meeting with him each day, we were welcome to attend the event or leave earlier to check out the ins and arounds of Seattle.

TechFest and Seattle area

TechFests, in conjunction with other events, happen every now and then and offer various keynote sessions and presentations. Here, you can learn of the latest research happening in the tech world as well as inside different teams at Microsoft. This week’s TechFest was about AI and machine learning. I was so impressed by all the new inventions and studies out there in our field, and felt so fortunate to be so close to the inventors and researchers, allowing me to talk directly to them about their ideas while asking questions. It was quite an experience to be surrounded by such creative and curious individuals, all so very accomplished and intelligent, all leaders of the industry.

After each day, I had a chance to visit the Seattle area, a place where I have never been before. It was such a beautiful city with lakes, ocean, and mountains, home to multiple large corporations such as Microsoft, Amazon and Starbucks. There were so many hiking, running, and biking trails, various outdoor activities, a paradise for the adventurous soul. It was also one of the more foodie cities I have seen, as all of those companies and universities embrace diversity, offering a vast choice of ethnic cuisines from around the world.

My experience during my first week at Microsoft truly reflects its mission statement, “Empower every person and every organization on the planet to achieve more”, and its culture of valuing a growth mindset. Microsoft is a place that an aspiring technology professional such as myself can use as a platform to amplify my skillsets and impact the world.

I am ready for this new chapter of my career.

Thanks for reading,

Your friend, Annie

You may need a table variable for IRR and NPV in DAX

I finished my Fundamental Finance Accounting course last week. I learned about two awesome functions which help decide which projects to undertake. The first is NPV (Net Present Value) and the second is IRR (Internal Rate of Return). I found these two so powerful for helping to decide which investment to choose that I wanted to implement them in DAX. Fortunately, DAX has two functions, XNPV and XIRR, which I can use. XNPV and XIRR also exist in Excel and can serve as replacements for NPV and IRR. They have slightly different calculation details, but the purpose is the same. You can find and for more details.

The Excel XNPV function is a financial function that calculates the net present value (NPV) of an investment using a discount rate and a series of cash flows that occur at irregular intervals. Syntax: =XNPV(rate, values, dates)

The Excel XIRR (internal rate of return) is the discount rate which sets the net present value (XNPV) of all future cash flows of an investment to zero. If the NPV of an investment is zero, it doesn’t mean it’s a good or bad investment; it just means you will earn the IRR (discount rate) as your rate of return. Syntax: =XIRR(values, dates, guess)
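Before moving to DAX, the two definitions above can be sketched in a few lines of Python. This assumes an actual/365 day count like Excel’s XNPV, and the bisection solver is a simple stand-in for Excel’s iterative XIRR search; the 1,000-in, 1,100-out cash flows are a made-up example.

```python
from datetime import date

def xnpv(rate, values, dates):
    """Net present value of cash flows on irregular dates (actual/365 day count)."""
    d0 = min(dates)
    return sum(
        v / (1.0 + rate) ** ((d - d0).days / 365.0)
        for v, d in zip(values, dates)
    )

def xirr(values, dates, lo=-0.99, hi=10.0, tol=1e-9):
    """The rate that makes xnpv() zero, found by bisection.
    Assumes the NPV is positive at `lo` and negative at `hi`."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if xnpv(mid, values, dates) > 0:
            lo = mid  # NPV still positive -> rate is too low
        else:
            hi = mid
    return (lo + hi) / 2.0

# Invest 1,000 today, receive 1,100 one year later -> IRR is 10%
values = [-1000.0, 1100.0]
dates = [date(2021, 1, 1), date(2022, 1, 1)]
```

Discounting the 1,100 back one year at 10% gives exactly the 1,000 invested, which is why the NPV at that rate is zero and 10% is the IRR.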

The challenge in DAX, I found, is that we may not have a date field. For example, we may just have an initial investment and an estimated payment amount every year or every month in the future. Something like this:

Assumption - IRR NPV

So in DAX, we can create a table variable in the measure to leverage XIRR and XNPV.

VAR cashflowtable =
    UNION (
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ), YEAR ( NOW () ) ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Initial Inventestment] )
                )
        ),
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ) + 1, YEAR ( NOW () ) + [Investment Years] ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Annual Cash flow] )
                )
        ),
        GENERATE (
            GENERATESERIES (
                YEAR ( NOW () ) + [Investment Years],
                YEAR ( NOW () ) + [Investment Years]
            ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Exit Value] )
                )
        )
    )
RETURN
    XIRR ( cashflowtable, [Values], [date] )
VAR cashflowtable =
    UNION (
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ), YEAR ( NOW () ) ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Initial Inventestment] )
                )
        ),
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ) + 1, YEAR ( NOW () ) + [Investment Years] ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Annual Cash flow] )
                )
        ),
        GENERATE (
            GENERATESERIES (
                YEAR ( NOW () ) + [Investment Years],
                YEAR ( NOW () ) + [Investment Years]
            ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Exit Value] )
                )
        )
    )
RETURN
    XNPV ( cashflowtable, [Values], [date], [Interest Rate] )

Hopefully, you find this helpful.


your friend, Annie

Migrating a Local Power Query to a Power BI Dataflow and Linking It to an Azure Data Lake Storage Account

Within Microsoft’s latest Power BI update, the feature that excites me most is that we can link Power BI dataflows to an Azure Data Lake Storage account.

Benefits are:

  1. When you have a PBIX file in import mode reaching more than 1 GB and you are not a Premium member, you cannot publish it to the Power BI service. However, with dataflows in Power BI you can save up to 10 GB of data.
  2. Your workspace members are able to reuse the same dataflow as the source for their Power BI files.
  3. When linked to Azure Data Lake Storage Gen 2, you can use data from Power BI dataflows as a data source across your Azure applications.

To try this, you need to have the following things

You have a Power BI Pro license or above.

You are an admin of your Azure portal. (Or you are a good friend of your admin ^_^)

There are a few things I found out when I did my experiment that I think you might also face:

  1. You can use a blank query to copy M code from your Power Queries
  2. If you have a table function to transfer and you don’t have a Premium account, you can still do so by selecting “do not load”
  3. To configure Azure Data Lake Gen 2 to connect to Power BI, follow this document from Microsoft.
  4. Even when you have configured everything right according to the above Microsoft document, you may still have issues like I did. Please follow this communication thread to see the latest updates from the Power BI community discussion.

Latest update: the Microsoft team has fixed the above. I have now tested it and it works fine.


Your friend, Annie

How to calculate PMT and CUMPRINC in DAX

PMT is an Excel function that calculates the periodic payment for a loan.

In Excel, the syntax is PMT(rate, nper, pv, [fv], [type])

  • Rate    Required. The interest rate for the loan.
  • Nper    Required. The total number of payments for the loan.
  • Pv    Required. The present value, or the total amount that a series of future payments is worth now; also known as the principal.

Yet in DAX, there is no built-in function for PMT. To calculate it, we need to reapply its mathematical logic.

So, you can use

DAXVersionPMT = ( ( Pv * Rate ) / ( 1 - ( 1 + Rate ) ^ ( -1 * Nper ) ) ) * ( -1 )
// Reminder: if you are calculating a monthly payment but Rate is an annual rate, convert it to a monthly rate by dividing by 12
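As a sanity check on that formula, here is the same payment calculation in Python. Signs follow Excel’s convention (the payment comes back negative), and the 30-year loan below is a made-up example assuming a 6% annual rate compounded monthly.

```python
def pmt(rate, nper, pv):
    """Periodic loan payment, equivalent to Excel's PMT with fv = 0 and
    type = 0: repay `pv` over `nper` periods at periodic rate `rate`.
    Negative result, matching Excel's sign convention."""
    return -(pv * rate) / (1 - (1 + rate) ** (-nper))

# 1,000 borrowed at 10% per year, repaid in one annual payment: about -1,100
one_year = pmt(0.10, 1, 1000)

# 100,000 loan, 6% annual rate / 12, 360 monthly payments: about -599.55
monthly = pmt(0.06 / 12, 360, 100000)
```

The one-payment case is easy to verify by hand: you owe the principal plus one year of interest, 1,000 × 1.10 = 1,100.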
CUMPRINC is the accumulated principal amount for a loan.

In excel, the syntax is CUMPRINC(rate, nper, pv, start_period, end_period, type)

The CUMPRINC function syntax has the following arguments:

  • Rate    Required. The interest rate.
  • Nper    Required. The total number of payment periods.
  • Pv    Required. The present value.
  • Start_period    Required. The first period in the calculation. Payment periods are numbered beginning with 1.
  • End_period    Required. The last period in the calculation.
  • Type    Required. The timing of the payment.

In DAX right now, there is no built-in function for CUMPRINC yet.
So, you can use the following DAX:
Accumulated Principle =
VAR p = [PV]
VAR i = [Rate]
VAR n = [Nper]
VAR b = [Start_period]
VAR e = [End_period]
VAR m = [DAXVersionPMT] * ( -1 )
VAR q = 12 // if Rate is an annual rate and your PMT is a monthly payment, convert to a monthly rate by dividing by 12; if Rate is already a monthly rate, use q = 1
RETURN
    ( ( p - m * q / i ) * ( 1 + ( i / q ) ) ^ e + m * q / i )
        - ( ( p - m * q / i ) * ( 1 + ( i / q ) ) ^ ( b - 1 ) + m * q / i )
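The same balance-difference logic can be written in Python to check the measure against a known case. The test loan (1,000 at a 12% annual rate over 12 monthly payments) is a made-up example; paying the full term should return the whole principal.

```python
def pmt(rate, nper, pv):
    """Periodic payment (Excel PMT with fv = 0, type = 0), as a positive number."""
    return (pv * rate) / (1 - (1 + rate) ** (-nper))

def cumprinc(annual_rate, nper, pv, start, end, q=12):
    """Cumulative principal paid from period `start` through `end`, mirroring
    the DAX measure above: the remaining balance after `end` payments minus
    the balance after `start - 1` payments. `q` is payments per year (use
    q = 1 if annual_rate is already the periodic rate). Negative result,
    matching Excel's CUMPRINC sign convention."""
    r = annual_rate / q                   # periodic rate (i / q in the DAX)
    m = pmt(r, nper, pv)                  # positive periodic payment
    def balance(k):                       # balance remaining after k payments
        return (pv - m * q / annual_rate) * (1 + r) ** k + m * q / annual_rate
    return balance(end) - balance(start - 1)
```

Note that `m * q / annual_rate` is just `m / r`, the same term the DAX expression carries, so the two implementations are line-for-line equivalent.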

Your friend, Annie
