Do you play with data often? Are you involved in ETL (Extract, Transform & Load) processes? Want to do the same with Power Platform in a simpler way?
Power Platform has Dataflows! A dataflow lets you collect data from a source, transform it & load it into Dataverse, either to a new table or an existing table. And from which sources? Excel, SharePoint lists & many more. I will share a picture of the sources below.
In this blog, I will share how to move data from a SharePoint list (source) to a Dataverse table (destination). Whether it's a data migration or just a one-time activity, you can make use of dataflows.
Log in to Power Apps & check out Dataflows under Data.
Choose Dataflows > New Dataflow. You can either start from blank or import a template. I am going with 'Start from blank'.
Give the Dataflow a name & create it. I am not looking for analytical entities, so I will leave that blank.
Tip – If you have multiple dataflows, follow a naming convention so you can identify each one easily.
These are all the sources that could be used to get your data from.
There are two SharePoint sources; I will select SharePoint list. Let's quickly look at the source.
This is my SharePoint list – List1 – that I am using, with some sample data to begin with. We will also play with the data a little (updates/enhancements) later in this blog.
Let's go back to the Dataflow to set up a connection. Select SharePoint list from the sources.
Note – If you enter the wrong URL, username, or password, you will get the error – Invalid credentials. The same error can also appear if you select the wrong source.
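Behind the scenes, the connection the designer builds is expressed in Power Query M. Here is a minimal sketch of what such a query can look like, assuming a hypothetical site URL and the List1 list from this walkthrough (the exact navigation step the designer generates may differ by connector version):

```powerquery
let
    // Hypothetical site URL – replace with your own SharePoint site
    Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/Demo", [ApiVersion = 15]),
    // Navigate to the list used in this walkthrough
    List1 = Source{[Title = "List1"]}[Items]
in
    List1
```

You normally never have to write this by hand; the designer generates it for you, but it is visible (and editable) in the advanced editor.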
After a successful connection, you will see all the SharePoint lists on the site; choose the one from which you want to import data. I will go ahead & select List1.
You will notice that it loads the columns & their data as soon as you select the list & move to the next step.
On this page, you can perform all the transformations that you would like. If you use Power BI, you will notice the designer is similar.
The bottom bar shows when the preview was loaded, the number of columns & the number of data rows (excluding the column headers).
I will go ahead & remove the columns that I don't need. There are two ways of doing so – either select the columns & right-click to remove them, or, from the ribbon, select 'Choose columns'.
So my updated column list looks far better now, with only the relevant columns.
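In M terms, the same column cleanup can be sketched with Table.SelectColumns. The column names below are illustrative, not the actual ones from my list, and List1 stands for the table loaded from the source step:

```powerquery
// Keep only the relevant columns; MissingField.Ignore avoids an error
// if a named column does not exist in the source
CleanedUp = Table.SelectColumns(
    List1,
    {"Title", "Status", "GUID"},
    MissingField.Ignore
)
```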
Click next & let's load the data. You can either create a new table or load it into an existing table. I am choosing to load to a new table.
Note that it will ask you to select the unique column for the table (like a primary key in SQL); you can either leave it auto-generated or select an existing column if you have one. I will select GUID in my case.
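If your source list has no natural unique column, one option (a sketch of my own, not from the designer) is to add a GUID column in the transformation step and then map it as the key. PreviousStep stands for the output of whatever transformation step came before:

```powerquery
// Add a text GUID per row that can serve as the unique/key column
WithKey = Table.AddColumn(PreviousStep, "RowGuid", each Text.NewGuid(), type text)
```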
Finally, it will look like this. I have kept the destination column type to be ‘Text’ for now.
Click next & update the refresh settings. If this is a one-time data migration, don't change anything in this section.
Or you can select a refresh interval.
Select Publish now and wait a few seconds.
The window will close & you will be back on the dataflows view, where a notification will be shown.
Also, your dataflow will update its status as it progresses.
I like the confirmation messages shown to end users to let us know which stage the dataflow is at.
Once it’s published, it will also update the next refresh field for your dataflow.
Since mine was a manual refresh, Next Refresh shows N/A in my case.
If you selected an automatic refresh interval in the previous window, the date & time will be updated here automatically.
You can also view the refresh history of the dataflow.
Once done, let's see the table & columns that were created automatically via the dataflow.
Update the Active <table name> view to show the desired columns. The active view now looks like below & we can see the imported data.
Let’s now update the SharePoint list values.
Let's run the refresh again to pick up the updated data.
And that's how easily Dataflows give us user-friendly ETL capabilities. I will blog about some of the limitations and issues I faced with dataflows in my next post. Stay tuned!
Keep reading, keep enjoying, keep exploring!