Tips, ideas, and updates from the experts

  • 5 New Updates in Power Query

    Take Power Query to the next level: Try Power BI for Free today!


    We are happy to announce the availability of the November 2014 Update for Power Query. This update is packed with new features that we hope you will enjoy (and, since it is Thanksgiving week, perhaps even be thankful for).

    Download the November Power Query Update

    Before talking about the new features in this update, we would like to invite you to complete a survey. It will take only a few minutes and will help us better understand what you like or don’t like about Power Query, how you currently use it, and what new features you would like to see in the future.

    Take the Power Query Survey

    What's new in this update?

    1. SQL Server Analysis Services connector.
    2. Salesforce Reports & Objects connector.
    3. Expand Columns – Ability to disable/customize column prefix.
    4. Improved File Menu options in the Query Editor.
    5. Entry point for “Advanced Editor” in the Queries pane inside the Query Editor.

    You can watch the following video or read below for more details about each feature.

    SQL Server Analysis Services connector

    With this update, you can now connect to Analysis Services cubes through Power Query. This connector has been long awaited and commonly requested, and we hope that you like it. It provides exploration, transformation and import capabilities on top of AS cubes. Under the hood, Power Query translates user actions into MDX queries against the source. While the connector will also work for tabular instances, it is currently optimized for the multidimensional experience. We hope to add first-class support for DAX in the future.

    You will find a new entry under the “From Database” menu that will let you connect to your AS server. After connecting, you will be able to browse your cubes and pick the desired dimensions, measures, hierarchies or KPIs from them, filter and transform the data before loading it to the worksheet or Data Model.
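    Each action in that flow is recorded as a Power Query (M) step behind the scenes. As a rough sketch of the connection step (the server name "myserver" is a placeholder, not from the post):

```
let
    // Connect to a (hypothetical) Analysis Services instance;
    // returns a navigation table of databases, cubes and measures
    Source = AnalysisServices.Databases("myserver")
in
    Source
```

    Subsequent picks in the Navigator are then translated into MDX when the data is actually retrieved.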

    Salesforce Reports & Objects connector

    This month we’re including the Salesforce connector as a mainstream Power Query feature. This connector has been in preview since early October. It allows customers to connect to their Salesforce accounts and import data into Excel. Two entry points are offered: Salesforce Objects and Salesforce Reports. For a detailed description of these two capabilities, please check out this blog post.

    Expand Columns – Ability to disable/customize column prefix

    A frequent piece of feedback from our users is that they would like control over the column name prefix for columns generated by an Expand Column operation. Before this update, new columns would contain the original column name as a prefix, and users not wanting this prefix had to manually remove it from each column afterwards (either via Column Rename or by modifying the generated Expand Column formula).

    With this update, we’re introducing a new checkbox in the Expand Column dropdown menu to let users opt out of this prefix.

    In addition to this checkbox, we’re also giving users the option to change the prefix via the “Expand Columns” dialog.
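    Both options ultimately map to the optional new-column-names argument of the Table.ExpandTableColumn function. A hedged sketch, with made-up table and column names:

```
// Expand the nested "Orders" column; the last argument names the output
// columns explicitly, so no "Orders." prefix is applied
Expanded = Table.ExpandTableColumn(Source, "Orders",
    {"OrderID", "Amount"},    // fields to expand
    {"OrderID", "Amount"})    // output column names (no prefix)
```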

    Improved File Menu options in the Query Editor

    We have added a few more options to the Query Editor File menu. Now you can access the Data Source Settings and Options dialogs without having to leave the Editor. Note that Workbook Settings are still not accessible from the File menu. This is something that we are trying to enable in next month’s update.

    Entry point for “Advanced Editor” in the Queries pane inside the Query Editor

    We’ve made it much easier to access the Advanced Editor dialog when working with multiple queries in the Query Editor. Simply right click the query that you wish to edit and select “Advanced Editor” in the context menu.

    That’s all for this month! We hope that you enjoy this update and find the new features valuable for you and your customers. Please send us your feedback or suggestions via Smile/Frown in Power Query.


    Download Power Query from our official download page.

    #update#power query#Power BI#Excel#Query Editor#Microsoft BI#Salesforce#expand columns#SQL Analysis Services

    Tue, 25 Nov 2014 17:00:00 GMT by Miguel.Llopis · 2 Comments

  • Top 10 Power Map tours

    We all know how beautiful and stunning Power Map videos are, but what's even more striking is how powerful tours are as a storytelling tool to convey a message with geospatial data.

    That is why we decided to put together a Top 10 list of Power Map tours and... drum roll... here they are!

    Visualizing Climate Change

    Lukas Steindl created this tour to visualize the intensity of tropical storms and the population affected in the West Pacific from 1945 to 2012. It also includes Twitter activity for typhoon "Haiyan", which hit the Philippines.


    World Cup History with Power Map

    Igor Peev, Senior Program Manager from the Power Map team created this tour that shows the history of the World Cup with data.


    Beer sales in Norway

    Alistair Brown created this tour to show the most popular beer brands in Norway.


    Thanksgiving Turkey Production and Consumption

    Last year for Thanksgiving we wanted to answer the question, "Where do turkeys come from and where do they go?". This was the result.


    Walmart stores US growth

    Jason Strate created this very simple but powerful tour to illustrate the amazing growth and coverage of Walmart stores in the US since 1962.


    NORAD's Santa route tracking

    For the 2013 holidays, we put together this tour with the official NORAD tracking of Santa's route so we wouldn't lose him on Christmas Eve.


    Windsurfing race GPS tracking

    Shish Shridhar shows us how it is possible to map different sailors' routes from a windsurfing race.


    Best places to Trick or Treat

    The Office 365 team grabbed a bunch of public data and visualized it with Power Map to help us choose the best place in the US to go trick or treating.


    Casino customer traffic simulation

    Matt Smith simulates customer foot traffic and waiting times in a casino, showing us how the new custom maps feature can be used.


    Healthcare analytics

    This tour explores geo-located health metrics, such as disease frequency and the density of gyms and fast food restaurants, to drive better local policies for the public and private sectors.


    If you have more top Power Map tours to share, please let us know @MSPowerBI and @MSExcel with the hashtag #TopPowerMap.

    #Power Map#Excel#tours#visualization#maps#geo location#custom maps#top 10#storytelling

    Thu, 20 Nov 2014 17:30:00 GMT by Power BI Team · 1 Comment

  • Machine Learning for Analysts: The Basics

    Machine learning is a hot trend in the tech and business press - you've probably heard about it, and you may be starting to explore it, or perhaps are already using it. OR you might have zero idea what to do with it, in which case, read on for some basics to get you started!

    So first of all, what is machine learning?  Simply put, machine learning is the practice of mining your historical data with computer systems to predict future activity - trends, behaviors, patterns, etc. From a benefits perspective, this predictive knowledge is huge. If you know where things are going, you can adapt your business and stay ahead.

    Historically, machine learning was an expensive endeavor requiring complex software, as well as deep expertise from data scientists. While many wanted to take advantage of machine learning, it was often too hard, time consuming, and costly to implement and operationalize.

    Enter the cloud.

    By putting machine learning capabilities in the cloud, we can eliminate many of these barriers and make these capabilities accessible to many more people.

    So why should you care? If you're a business analyst of any kind - financial, marketing, IT, operations, or any other department that's using data (so, basically all of them) - you can benefit from all that machine learning has to offer.

    Try it for free – no time limits, subscriptions, or credit cards needed. To help you get started, there are several pre-built sample models that can help you explore some common business scenarios for machine learning.

    Check it out, and let us know what you think.

    If you’re interested in stepping through more detailed machine learning tutorials, explore these videos.

    If you’re already exploring machine learning and perhaps know a bit of R, there’s a free webinar happening on Friday Nov. 14 at 9:30am PDT that might be of interest to you: Operationalizing R as a Web Service. Register now!

    #Machine Learning

    Tue, 11 Nov 2014 17:00:00 GMT by Power BI Team · 1 Comment

  • Getting Started with Power Pivot - Part II

    Welcome back to our Getting Started with Excel and Power BI Series.

    In Part I of Getting Started with Power Pivot we created our first Excel Data Model. Today we will create additional relationships and start using DAX to add business logic to our Model.

    To follow this tutorial you will need:

    1. Excel 2013



    After making the data show up using the right relationships in Part I, we want to improve the PivotTable so we can slice the data by year. To do this I'll add a slicer to my worksheet:


    Clicking the slicer doesn’t do anything yet because there is no relationship between the Invoice and DateTable tables. After inspecting the fields in both tables, I find that there is no field in the Invoice table that I can use to relate them. Luckily, we can create a field in the table that allows us to create the relationship.

    To work with the Data Model in Excel 2013 we use the Power Pivot add-in. This add-in is installed by default in many versions of Excel; you just need to enable it.

    Now that we have enabled the Power Pivot add-in, a new ribbon will show up in Excel:


    To work with this model I click the “Manage” button, which opens the Power Pivot dialog:


    The initial dialog that opens shows each table in the Model with all its rows of data; in this case it shows the DateTable. You can switch between tables by clicking the table tabs at the bottom:


    I now select the Invoice table, where I want to create a column that adds a Date value for each row of data. I right-click the CalendarYearMonth column and select “Insert Column”.


    This creates a calculated column in the Invoice table where I use a formula to populate the column with values. I combine values from the CalendarYearMonth column to create a new date value using the following formula:

         =DATE( LEFT([CalendarYearMonth],4)
              , RIGHT([CalendarYearMonth],2)
              , 1)

    As you might notice, the formula used here is very similar to the formulas you would write in Excel. This is on purpose: the DAX language used in Power Pivot is designed to be familiar to Excel users. It has some differences, mostly focused on capabilities that Excel does not support, like traversing relationships and special built-in functions for calculations.

    Adding the calculation to the data model will give us a new column that we can use in the relationship:


    I then rename the column to InvoiceDate. To create the relationship I can also move over to the diagram view in the Power Pivot add-in and drag and drop the fields that I want to create a relationship between. Dragging and dropping results in a relationship being created:



    If we now go back to our PivotTable, we see the results we expect:


    Now I want to compare the RevenueAmount for the selected year with the RevenueAmount for the previous year. In order for me to do this I can again use a DAX formula. In this case I want to create a calculated field in Excel to add to my PivotTable. 

    I go to the Power Pivot ribbon and click Calculated Fields, “New Calculated Field”:


    This opens a new dialog that allows me to add a new DAX formula. It leverages one of the special DAX time intelligence functions to calculate the “Sum of RevenueAmount“ for the previous year (assuming the date column in DateTable is named Date):

         [RevenueAmount PreviousYear] =
         CALCULATE(
             [Sum of RevenueAmount],
             SAMEPERIODLASTYEAR(DateTable[Date])
         )

    This will add a new field to the PivotTable that shows the RevenueAmount for the previous year compared to the Year selected in the slicer:


    Now I can add another calculation that will calculate the Year over Year Growth in percentage:

         [Sum of RevenueAmount YoY%] =
         DIVIDE(
               ([Sum of RevenueAmount]
                 - [RevenueAmount PreviousYear])
               ,[RevenueAmount PreviousYear]
         )


    In this calculation we divide the Sum of RevenueAmount minus the RevenueAmount PreviousYear by the RevenueAmount PreviousYear, which results in the percentage growth. Adding this calculated field to the PivotTable gives us the percentage growth for RevenueAmount:
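    A quick sanity check with made-up numbers:

```
// If [RevenueAmount PreviousYear] = 100,000
// and [Sum of RevenueAmount]      = 120,000, then
// YoY% = (120,000 - 100,000) / 100,000 = 0.20, i.e. 20% growth
// (format the field as a percentage in the PivotTable)
```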


    I hope these blog posts gave you a good overview of the capabilities of Power Pivot. Having said that, these posts are just the beginning of your journey to getting to know Power Pivot and unlocking all its wonders.


    #power query#Power Pivot#Excel#data#Getting Started#vlookup#model#pivot table

    Thu, 30 Oct 2014 16:00:00 GMT by Kasper de Jonge [MSFT] · 2 Comments

  • Getting Started with Power Query - Part II

    Welcome back to our Getting Started with Excel and Power BI series.

    Today we will continue with our first Excel feature, Power Query, which allows you to discover, shape and transform your data before importing it into Excel or your Power Pivot data model. In Part I we covered mostly how to connect to different data sources and some quick transformations for your newly found data.

    In this post we will cover:

    To start this tutorial you'll need:


    OK, let's get started. Open the Excel file provided at the end of this post. You can follow the getting started steps with the video or with the text in this post. Once you open the file, select Power Query from the ribbon. If you are on one of the imported tables, you will see the Workbook Queries pane on your right. If you don't, just click the Workbook Queries button on the Power Query ribbon.

    First let's do an online search to import and edit its query so we can learn how to create and modify formulas:

    1. Create and modify a simple formula

    1. Select the Power Query tab and perform the following online search: "largest cities south america"

    2. Hover over the public data result "Largest Cities..." and select Edit, which will take you to the Query Editor

    3. Click on the fx icon right next to the formula bar

    4. Type the following formula: = Table.AddColumn(Source, "Country Code", each Text.Start([Country], 3)) and click the check symbol to the left of the formula bar. This formula extracts the first 3 characters of Country and places them in a new column

    5. You can modify this formula directly in the formula bar, or by selecting the step you want to edit in Applied Steps and clicking the gear icon
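    Every step applied in the Query Editor is appended to a single M "let" expression for the query. After step 4, the query looks roughly like this (the step name InsertedCustom is illustrative; Source holds the table returned by the online search):

```
let
    // Source = the public data table returned by the online search
    Source = ...,
    // Step 4: first 3 characters of Country into a new "Country Code" column
    InsertedCustom = Table.AddColumn(Source, "Country Code",
        each Text.Start([Country], 3))
in
    InsertedCustom
```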

    2. Filter, sort and group data

    Filtering on the Query Editor is very similar to Excel. For any given column you can click on the down arrow and start filtering or sorting right away. In our specific case: 

    1. First, let's try automatic filtering. Select the Country column, click on the down arrow, deselect (Select All) and choose Brazil, Chile and Ecuador.

    2. When you click OK, you'll have the filtered list.

    You can also use the same menu to sort your data (ascending or descending). 

    Now, let's try grouping by country:

    3. Right click on the country column header and select "Group By"

    4. You will see the Group By menu. In this case, we'll just count how many records we have by country. You can also perform other grouping operations such as sum, average, min or max on a field (in this example we could do that with population):
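    The Group By dialog writes a Table.Group step for you; a hedged sketch of the count-by-country case (step and output column names are illustrative):

```
// Group rows by Country and count the records in each group
Grouped = Table.Group(Source, {"Country"},
    {{"Count", each Table.RowCount(_), type number}})
```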




    4. Work with columns

    You can perform almost any type of operation with columns on the Query Editor to shape and transform your data any way you like it. We did that on step 1 with a simple formula. Let's try other transformations available within Power Query.

    1. First let's remove columns that we don't want to use. In this case, holding down the Ctrl key select the Image and Key fields. Then right click on one of them and select Remove Columns on the menu:

    2. Now, let's create a Key with our country code and the city name. In Excel, this would be a text operation with the "&" symbol. In Power Query, it's the same. Click Add Custom Column on the column menu.

    3. On the add custom column menu, write the following expression: [Name]&"-"&[Country Code]. We are joining the city name and the country code with a dash in the middle. Rename the column to "Key"

    4. You will see the newly added column after you click OK, and the function being shown in the formula bar. The Power Query Formula Language provides a lot of flexibility to shape and transform your data. You can find more information and references here. In any case, you can use the Query Editor to apply these steps through the UI and Power Query will take care of the formulas for you.

    5. Finally, you will notice that Power Query is saving all of these steps to be replicated on your data refresh or modified in the Applied Steps panel

    5. Combine data

    There are several ways to combine data in Power Query. We will take a look at 2 of them: Append and Merge

    1. First, we will merge this query with another one that has the country population on it. The file provided already has another query (found with Online Search) with the population of countries called "Population by Country". Click on the Merge button on the Power Query menu.

    2. On the Merge menu, select each query and then click on the Country field for both. This will be the field that defines the merge, so we can get country population on our original table. Also select "Only include matching rows" so we get population just for the list of countries we have. Before clicking OK, Power Query already tells us that we have 40 matching records. Click OK.

    3. You will get a new Query Editor window with a new column. Click on the corner of the header for this new column and select the fields that you want to merge. In this case, we want population. Then click OK.

    4. Now, we have a new table with all the information we had before, plus the population of the country for each record. If we click on close and load, we'll import this data into Excel.

    5. Now, let's try to append data to our original table. There's another query in the file called "Largest Cities in Europe". This query already has the same structure as our original table after adding Country Code and Key, including column names. We achieved this using the Query Editor on a similar online search. You can review the steps editing this provided query. We'll append this data to our first table. Click on the Append button on the Power Query menu.

    6. As before, select the 2 queries that we want to Append ("Largest Cities in South America" and "Largest Cities in Europe") and then click OK.


    7. You will get a new Query Editor window with the appended queries.
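    Both combine operations generate ordinary M steps. Hedged sketches of what the generated formulas might look like (query names follow the post; step names and the merge column name are illustrative):

```
// Merge (steps 1-3): inner join with the "Population by Country" query
Merged = Table.NestedJoin(
    #"Largest Cities in South America", {"Country"},
    #"Population by Country", {"Country"},
    "NewColumn", JoinKind.Inner)

// Append (steps 5-7): stack two queries that share the same columns
Appended = Table.Combine(
    {#"Largest Cities in South America", #"Largest Cities in Europe"})
```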


    6. Share queries

    Finally, if you signed up for a Power BI trial, you have the option of sharing any of these queries with the rest of your organization. As long as the user has access to the underlying data (in this case public data) they will be able to search for the query with the Online Search feature within Excel and load the data from the query you shared. Let's share the first table we created: "Largest cities in South America"

    1. First of all, you need to be signed in to your Power BI tenant. On the Power Query ribbon, click on the Sign In button

    2. Enter your Power BI / Office 365 credentials and sign in

    3. Now, let's make sure the name and description of our query are accurate so other users are able to find it. On the Workbook Queries pane, right click the "Largest cities in South America" query and select Edit

    4. Once you are in the Query Editor, click All Properties in the Query Settings pane on the right. There you will see the query name and description

    5. Everything looks good, so let's share the query to the Data Catalog. Close the Query Editor and right click again on the query. Then select the Send to Data Catalog option

    6. You will get another opportunity to edit the name and description. You can also add a documentation URL in case people want more info or request access to the data. Click on the Sharing menu. Here you can decide who will have access to the query you're sharing. In this case we will go with "Everyone in the enterprise". Click Send.

    7. Now, your query is live on the Data Catalog and discoverable by anyone based on your description. Let's do an online search for it and filter by organizational data.

    8. Finally, if you want to edit your shared queries you can click on the My Data Catalog Queries button on the Power Query ribbon. From this menu you can delete, load, update and even view statistics for your query.


    #power query#Excel#data#Getting Started#files#data base#data sources#group#filter#share query#sort#column#shape data

    Wed, 29 Oct 2014 16:00:00 GMT by Miguel A. Martinez · 0 Comments

  • 6 New Updates in Power Query

    Take Power Query to the next level: share your queries and create a corporate data catalog. Try Power BI for Free today!

    October is coming to an end, but before we all put on our costumes and eat lots of candy for Halloween, we are very pleased to give you another treat: the October Update for Power Query is now available! This update includes several feature improvements that we hope you will like. Here is what is new:

    Download the Power Query Update

    If you are using the Preview of the Salesforce connector for Power Query, you will also need to install this newer version after installing the October Power Query Update:  32-bit, 64-bit

    What's new in this update?

    1. New Transformation: Use Headers as First Row.
    2. Column Name field in Merge Columns dialog.
    3. Quick Access Toolbar in Query Editor dialog.
    4. Warning for stale data previews in new queries.
    5. Function Invocation from Search.
    6. Preserve Excel Number formats in the worksheet on query refresh.


    You can watch the following video or continue reading the rest of this blog post for more details about each improvement.


    New Transformation: Use Headers As First Row

    Table.DemoteHeaders has been available for a long time as a library function. However, users had trouble discovering this function and asked very often about how to achieve this transformation. We’re addressing this discoverability issue by exposing “Use Headers as First Row” in the Query Editor ribbon. The existing “Use First Row as Header” entry points (Home tab and Transform tab) have been converted from a button to a split button that now exposes “Use Headers as First Row” as its second option.
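    The new ribbon entry simply emits the existing library call; assuming the previous step is named Source:

```
// Turn the current column headers back into the first row of data
Demoted = Table.DemoteHeaders(Source)
```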

    Column Name field in Merge Columns dialog

    “Merge Columns” now exposes a new field to specify the name of the resulting column upfront. Previously, the only ways of customizing this name were via the generated step formula or by adding a subsequent “Rename Column” step. Over the next few months, we aim to expose this option in other transformations that generate new columns as a result (such as Expand and Aggregate).
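    Behind the dialog, the new field is the last argument of a Table.CombineColumns step; a sketch with made-up column names:

```
// Merge "First Name" and "Last Name" into one "Full Name" column,
// naming the output directly instead of renaming it in a later step
MergedCols = Table.CombineColumns(Source,
    {"First Name", "Last Name"},
    Combiner.CombineTextByDelimiter(" "),
    "Full Name")
```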

    Warning for stale data in new queries

    When connecting to data sources that the user has connected to in the past, Power Query will try to leverage a local preview cache as much as possible. This has lots of advantages in terms of performance and UX responsiveness; however, it is also an area of potential confusion for end users. Quite a few customers have reported that the “preview is wrong” when, in fact, it was simply out of date. Although the Status Bar at the bottom of the Editor exposes the last updated date/time for a preview, this was not prominent enough for many of these users.

    In this release, we’ve added a warning the first time in each Editor session that a preview older than 24 hours is displayed to the user, including an approximation of how old the preview is (or at least, how long it has been stored by Power Query). Users can optionally click Refresh to update the preview. If the user decides that the cached preview is OK and starts working with it (i.e. adds a new step), the warning message will go away automatically.

    Quick Access Toolbar in Editor dialog

    The Query Editor dialog now exposes a Quick Access Toolbar. By default, this toolbar contains the Send Smile/Frown menu so that users don’t need to go back to the Home tab to reach out to us. In addition, users can pin their favorite actions from the ribbon to this toolbar, by right-clicking any of them.



    Function Invocation Experience from Search

    If a query in your Search results is a function, Power Query now lets you directly invoke it from the Online Search pane. Previously, users had to load the query to the workbook and then invoke it from the Workbook Queries task pane. This flow has now been simplified, enabling users to directly invoke the query from Search. In addition to invoking the function, users can also add the function query to their workbook so it can be invoked multiple times. This can be done via a context menu option (“Add Definition”) available on function queries.
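    For context, a function query is simply a query whose value is an M function rather than a table. A minimal, made-up example of the body of such a query:

```
// Takes a country name and returns a 3-letter code;
// invoking it from Search prompts for the parameter
(country as text) as text => Text.Start(country, 3)
```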


    Preserve Excel number formats on refresh

    Before this update, customizations to the Number format of Excel worksheet cells that are part of a Power Query table would be lost after refreshing your query.

    With this update, Number formats are not lost after a refresh. We look at the current and previous data type for each column and if the type didn’t change, we don’t remove the Number format of the output. This has been a very frequent customer complaint that is now resolved.

    Columns with Number format stay the same after refresh:


    That’s all for this month! We hope that you enjoy this update and find the new features valuable for you and your customers. Please send us your feedback or suggestions via Smile/Frown in Power Query.



    Download Power Query from our official download page.


    #update#power query#Power BI#Excel#Query Editor#Microsoft BI#number format#function invocation#first row headers#column name

    Tue, 28 Oct 2014 17:35:00 GMT by Miguel.Llopis · 5 Comments

  • Getting Started with Power Pivot - Part I

    Welcome back to our Getting Started with Excel and Power BI Series.

    In our previous blog posts we explored how to bring data into Excel through Power Query (Part I and Part II). In this blog post we will take that data and prepare it for analyzing and visualizing in Excel.

    To follow this tutorial you will need:

    1. Excel 2013



    In order for us to do this we need to load the data into the Excel Data Model. Loading data into the Excel Data Model allows you to use data in ways previously impossible in Excel: you can create analytics and visualizations using multiple tables without writing a single VLOOKUP, load data that has more than 1,000,000 rows, and add advanced business logic to your solution with Data Analysis Expressions (DAX) formulas.

    There are several ways to load data into the Excel Data Model. To load any table from Power Query into the Data Model, select “Load to Data Model” and this will bring the data into the model. Besides Power Query, there are other ways to add data to the Data Model: you can use Excel’s Get External Data features or even import directly in Power Pivot. All these methods end up with the same result: one or more tables in the Data Model. In many cases this is enough to get started with your first Excel PivotTable.

    In this blog post we will take a look at how to do analytics on top of some sales data that I loaded into the Data Model using Power Query. I imported invoice, date and geography tables. To create my first PivotTable on top of this data I can get started immediately, as Power Query already added my tables to the Data Model. Select Insert > PivotTable and choose the Data Model as the connection. This creates a PivotTable with three tables in the Field List:


    I now open the Invoice table and select the RevenueAmount field.


    This will automatically add this field to the values area of the PivotTable and aggregate the values in the column in the PivotTable by using a SUM:


    Just the Sum of RevenueAmount by itself is not very valuable; I want to see this value for each region, so I check the checkbox in front of the Region field in the PoliticalGeography table. This automatically adds the field to the PivotTable on Rows:


    Unfortunately, we see something is wrong here: the same value is repeated for each Region. Luckily, Excel knows what is wrong and warns us in the Field List:


    To relate fields from two different tables, I can create a relationship in the model. I click Create to create this relationship, and then select the two fields between the two tables that I can use to create the relationship:


    This now automatically allows me to combine data from two different tables into a single PivotTable without having to do anything else (like writing a VLOOKUP to bring all the data together in a single table):



    When relationships between tables are already defined in the underlying database, they will automatically be created in the Excel Data Model during import. And if you are not sure how to create relationships yourself, you can use Power Pivot to detect and create them automatically.

    In this blog post we created our first Excel Data Model. In Part II we will create additional relationships and start using DAX to add business logic to our Data Model.


    #power query#Power Pivot#Excel#data#Getting Started#vlookup#model#pivot table

    Wed, 22 Oct 2014 19:10:00 GMT by Kasper de Jonge [MSFT] · 0 Comments

  • Announcing Power Query support for Salesforce

    Today, at the Salesforce Dreamforce conference in San Francisco, we announced the preview of a new Salesforce connector for Microsoft Power Query for Excel. With this integration customers can now connect and work with data from their Salesforce accounts in Excel to build reports, visualize information and discover new insights. You can download this connector now (32-bit, 64-bit). Please be aware that for this preview to work you need to install the latest Power Query update.

    You can watch the following video or continue reading this blog post for more details about this connector preview.

    After installing the latest version of Power Query and the Salesforce extension, users see two new sources in the From Other Sources ribbon menu. Like other data sources, Power Query connects on behalf of the user and only has access to the user’s data. The Salesforce API is used in a read-only manner, letting users access the data without making changes. The connection uses the Salesforce OAuth flow, and the user provides their Salesforce permissions to Power Query in order to connect.


    Salesforce Objects connects directly to the underlying object tables. Selecting Salesforce Objects displays a full list of all objects to which the user has access. Users can choose one or more tables and import them into the workbook.


    Salesforce Reports connects using the Reports API and lets users import reports built in Salesforce into Excel. This option lets users access data in a form similar to what they see in Salesforce, without having to connect to the underlying objects and rebuild these views on top.


    Once in Excel, users can combine their Salesforce data with other relevant data using the wide range of connectors available in Power Query.

    Since this is a preview, the extension and feature set may change in the final released version. This extension is always tied to the latest version of Power Query, and may require users to update.

    Feedback is always appreciated, so please don’t hesitate to send us a smile/frown through the Power Query feedback options.

    #power query #Power BI #Power View #Power BI Training #Power BI for Office 365 #Excel #Office 365 #Getting Started #Microsoft BI #data sources #data visualization #hybrid #Salesforce

    Mon, 13 Oct 2014 21:00:00 GMT by Power BI Team | 14 Comments

    Read full post
  • 3 Examples of Data-Driven Sales Growth

    Last week we announced a new Sales Productivity solution that combines Microsoft Dynamics CRM Online Professional with Office 365 and Power BI. Many of our customers are already using this solution and seeing incredible results, so how can you benefit from it? The common thread across these examples is data. By combining raw data with the power of insights and collaboration, you can increase your business insights, productivity, and ultimately sales growth. Here are some real-world examples: 

    • Grant Thornton LLP uses insights into client demand to increase their sales pipeline. Using these new tools, they’ve increased their opportunities by 450 percent and increased their average win value by 36 percent. “The solution enables us to be much more focused,” says Rick Stow, head of Client Relationship Management at Grant Thornton. “We can use data to make data-driven decisions and communicate to our teams using actual data. Everyone here is very smart, but the additional insights — based on a single shared version of the truth — make us all smarter.”

    • Metro Bank fine-tunes its services to continue its focus on customer experience and reach its million-customer goal. By analyzing the data generated from customers and their interactions with the company’s services (stores, online, telephony, and mobile), Metro Bank can monitor customer interactions with new services that come online, and provide visual dashboards to share this insight across the company. 

    • CSX Transportation, a premier US railway company, identifies new business opportunities. After centralizing their sales data on Microsoft Dynamics CRM, they’re able to see trends across customer information, then relay that trend information to sales teams in order to further uncover these opportunities and pursue them.

    Interested in getting your hands on the Sales Productivity solution? Check it out now. You can find the press release announcing this new solution here.

    #Customer Story #Office 365 #Microsoft Dynamics CRM Online Professional

    Mon, 13 Oct 2014 16:00:00 GMT by Power BI Team | 0 Comments

    Read full post
  • 5 minute tutorial for creating custom maps with Excel and Power Map

    This blog post was authored by James Marshall, Developer on the Excel Power Map team

    In early September, we introduced you to Custom Maps, a new feature in Power Map for Excel. Power Map is part of several analytics and visualization features, such as Power View, that comes with Excel when you get an Office 365 subscription with Office Professional Plus. You can get enhanced versions of these features as part of Power BI for Office 365, our cloud-based solution for analyzing, visualizing and sharing data insights across your company or organization.

    In this post I'll show you the basic steps for creating a custom map with Excel Power Map. All you need is (1) an image containing your custom map and (2) data that you can map in an XY coordinate system. That's it! Let's get started.

    To illustrate the process from start to finish, we’ll walk through the creation of a Custom Map that visualizes train routes on the imaginary island of Lunda.


    1. First, you will need an image: any jpg, bmp, or png file. In this case, we have a map of the island with various cities and train paths plotted. Here's the one we used for this example:

    2. Next, you'll need some data that relates to the image. For this example, we have the number of passengers on the train routes at given intervals throughout the day.

    3. Power Map needs a way to plot your data on an X, Y coordinate system. One such mapping is the pixel position where you want your data to appear. In our case, we want to show the total number of passengers at each city, so we need the pixel position of each city. This is simple enough to do by opening Microsoft Paint, hovering over a city with the mouse, and recording the pixel position shown in the bottom-left corner of Paint into X and Y columns in Microsoft Excel. Repeat for each city.

    4. Now that the data mapping is complete, launch Power Map and select New Scene > New Custom Map. Select the “Geography” of the Custom Map, which will be the X and Y Coordinate System, in the Layer Pane on the right. In the Custom Map Options dialog, click the “Pixel Space” button, as we’ve already created a one-to-one mapping of the data to the image. If you’re not mapping your data by pixels, there are options to scale and offset your X and Y data in this dialog.

    5. Click Apply in the dialog and “Next” in the Layer Pane.  Now, you can use Power Map just as you would with any geographical data.  Add value and category fields, select colors, and play your tour when you’re finished!
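    The scale-and-offset option in step 4 amounts to a linear transform from data coordinates to pixel coordinates. A minimal Python sketch of the idea (the scale and offset values below are made up for illustration, not taken from the dialog):

```python
def to_pixel(value, scale, offset):
    """Map a data coordinate onto the image's pixel space, as the
    Custom Map Options dialog does when not using a 1:1 pixel mapping."""
    return value * scale + offset

# Hypothetical calibration: a data x-range of 0-100 spans pixels 40-840.
x_scale, x_offset = 8.0, 40.0
print(to_pixel(0, x_scale, x_offset))    # 40.0 (left edge of the map)
print(to_pixel(100, x_scale, x_offset))  # 840.0 (right edge of the map)
```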

    #Power Map #Getting Started #maps #geo location #tutorial #custom maps

    Tue, 07 Oct 2014 16:00:00 GMT by Power BI Team | 13 Comments

    Read full post
  • 4 new updates in Power Query

    Take Power Query to the next level: share your queries and create a corporate data catalog. Try Power BI for Free today!


    Another month is almost over and that means that a new Power Query update is due. We’re glad to announce the availability of the September 2014 Power Query update. Here is what is new:

    Download the Power Query Update


    Update Summary

    The following features have been added or improved in this update:

    1. Improved “Data Source Settings” dialog
    2. Improved “Insert Index Column” transformation
    3. Additional options in Query Navigator in the Editor (reorder, delete, group)
    4. Option to disable Power Query Update notifications


    You can watch the following video or continue reading this blog post for more details about each feature.

    Improved “Data Source Settings” dialog

    With this update, we’re significantly improving the usability of the Data Source Settings dialog and addressing many of the functional gaps that it previously had.

    This dialog now allows users to:

    • Sort data sources by name/path or data source type.
    • Search within the list of data sources.
    • Select multiple items and delete them, or edit the Privacy Level for all of them at once.

    In addition to improving the initial dialog, we have also made the Edit dialog for each data source much more capable so that users can perform the following tasks:

    • Edit or delete stored credentials.
    • Control whether Power Query should encrypt connections to this data source or not.
    • Modify (or even set for the first time) the Privacy Level for the source.
    • For database sources, know whether any Native Query has been approved or not, and revoke all approvals.

    Improved “Insert Index Column” transformation


    We have added UX support for customizing new Index Columns in your queries. Before this update, the “Insert Index Column” option in the “Add Column” tab would always create a new index starting from 0. With this update, we’re adding the option to create a new index column starting from 1 or even to customize the starting value and row increment for the index.
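    Conceptually, the customized index column is just start + row * increment. A small Python sketch of the three UI choices (this mirrors the behavior; it is not the M that the feature generates):

```python
def index_column(row_count, start=0, increment=1):
    # Mirrors the three options: index from 0 (default), from 1,
    # or a custom starting value and row increment.
    return [start + i * increment for i in range(row_count)]

print(index_column(4))                         # [0, 1, 2, 3]
print(index_column(4, start=1))                # [1, 2, 3, 4]
print(index_column(4, start=10, increment=5))  # [10, 15, 20, 25]
```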

    Additional options in Query Navigator inside the Editor


    In last month’s update, we added a Query Navigator inside the Query Editor to let users switch between queries without having to close and reopen the Query Editor. In this update, we’re extending the capabilities available within this Query Navigator to also let users reorder queries within the list, delete queries or create and manage their query groups.

    Option to disable Power Query Update notifications


    While we want everyone to be aware of and install new Power Query updates every month, we also understand that some folks may not want to be reminded with a system tray notification every time a new update is available. In this update, we have added an option to disable these notifications in the Options dialog. This option can also be specified as an installer command line argument: PowerQuery.msi DISABLE_UPDATE_NOTIFICATION=1


    That’s all for this update… We hope that you enjoy these features and continue sending us your valuable feedback and suggestions, so that we can continue improving Power Query with every new update.

    Download Power Query from our official download page.

    #update #power query #Power BI #Excel #Microsoft BI #data sources #share query #feedback #notifications #index column #query navigation

    Tue, 30 Sep 2014 16:00:00 GMT by Miguel.Llopis | 7 Comments

    Read full post
  • Retail Location Analytics using Power BI

    This blog post was authored by Shish Shridhar, Director of Business Development - Retail, Microsoft

    By combining demographic data such as median income, education levels, and median age with customer purchasing data such as preferences, past purchases, and online behavioral data, retailers gain a more in-depth understanding of customer needs and wants than with past purchase data alone. Power BI provides powerful capabilities for combining data from various sources and enabling visual correlation; you can also use the Data Analysis ToolPak in Excel and run a Correlation Coefficient on the combined data.

    To test this out, I started with the assumption that Seattle has the most Starbucks and that demographics affect the number of stores. To find the answer, I used Power BI and Excel. Here is what I did:

    I looked for OData sources relevant to retail and found one via Socrata:


    I ran a search for Starbucks to see if there was a list of Starbucks restaurants from around the world. I found an OData source with a listing of all the Starbucks restaurants around the world:


    Here is the OData source link to access the data:

    Using Power Query for Excel, I was able to access the data using the OData option. This returns 20,621 rows of data containing details of Starbucks restaurants around the world:


    To get better insights from the data, I used Power View for Excel to create visualizations. A quick drag & drop of the Brands against the count of the StoreIds showed me the brands represented by the data:


    I was curious about the countries with the most Starbucks, so I dragged in the Country information along with a count of the StoreIds. Here is the result:


    And interestingly, Seattle is not the city with the most Starbucks, as I’d assumed:


    Power Map for Excel enables visualizing this data on a Map as a layer of information:

    I was able to obtain US Census data from Neustar and imported it into Excel. This data included zip codes as well as detailed information about each zip code. I could potentially use this information to correlate things like median age, median income, and population around each Starbucks in the US. The data looks like this:


    When I overlay the Census Data on top of the Starbucks Store Locations, I get a visual correlation between demographics data and Starbucks locations:


    Here is a Power Map for Excel video of two layers: Starbucks store locations with Median Income by Zip code:

    There are several sources of interesting public data that you can use to analyze retailers: proximity analysis of retailers and their competition using data from Yelp and Foursquare; correlating retail Yelp ratings and Foursquare check-ins against demographic data; correlating weather data against store performance.

    Here's the actual live visualization I created with Power View:

    You can check out some more examples at my blog.

    #power query #Power BI #Power Map #Power View #Power BI Training #Excel #live demo #data sources #visualization #Starbucks #store #zip code #retail #location #geo location

    Wed, 24 Sep 2014 16:00:00 GMT by Power BI Team | 2 Comments

    Read full post
  • London's Metro Bank uses Microsoft’s Power BI to help improve customer service

    While we love talking about the latest out of Microsoft, the best part of talking about the latest and greatest is hearing how our customers are able to use our products to meet – and beat – goals, increase efficiency, or get a leg up on competitors. This last benefit – gaining a competitive edge – was of particular importance to Metro Bank, which in 2010 became the first new retail bank in Britain in more than a century.

    Metro Bank prides itself on being different. From round-the-clock call centers staffed by people (not machines) and “Magic Money Machines” to entertain kids, to the ability to open accounts and issue debit cards within minutes, Metro Bank’s priority is unrelenting customer service. This approach paid off, resulting in thousands of customers joining the ‘banking revolution’ over the last four years. As the bank continued to grow, it needed a business intelligence solution that could help it understand how customers used all of its services, from in-store to mobile and online. This information would help Metro Bank fine-tune its services and move toward its goal of 1 million customers by 2020. Instead of a third-party BI system, Metro Bank went with Power BI for Office 365.

    Over the last year, Metro Bank put Power BI through its paces, creating a variety of dashboards to track bank operations, including the launch of a mobile banking service. Features such as Power Q&A enable executives and colleagues alike – regardless of previous experience with business intelligence – to ask questions in natural language, accelerating adoption throughout the company.

    For more information on how Metro Bank is using Power BI for Office 365 to reach a million members, check out the brand new case study.

    If you’re interested in trying Power BI, do it for free today.

    If you’re an Excel power user or are simply interested in growing your analytics skills, check out the free “Faster Insights to Data with Power BI” training.

    #Power BI #Customer Story #Office 365 #Metro Bank

    Thu, 11 Sep 2014 17:00:00 GMT by Power BI Team | 0 Comments

    Read full post
  • Three Steps To Gleaning Actionable Insights Using Power BI

    By Pedro Ardila, Designer at Microsoft, Ironman competitor, Data Geek

    Triathlon is a changing sport. Over the past twenty years, there has been a tremendous amount of change to equipment, training methodology, and racing strategy. No detail is too small for those athletes whose main goal is to qualify for a world championship event. There is no bigger stage in triathlon than the Ironman World Championships in Kailua-Kona, Hawaii.

    What goes into qualifying for this event? In most cases, it is months—or even years—of training, tens of thousands of dollars in equipment and entry fees, and of course, lots of support from family and friends. One aspect that no athlete should leave out is strategy, and every athlete’s strategy starts with a couple of questions:

    1. What race should I enter in order to have the best chance to make it to the big dance in Kona?
    2. What times do I need in the swim, bike, and run portions to have a chance to qualify?

    I asked myself these questions, and decided to use my Microsoft Power BI knowledge to get some insights into what is needed to qualify. Here are the steps I followed:

    Step 1: Get Data

    I wanted to get some general demographic information to understand who enters these races. I also needed some recent race results to understand the characteristics of each race, and to get a better feel for the competition. I gathered the demographics data from USA Triathlon, followed by a few thousand rows of results for different races from the Ironman website, which is a lot of data. What made working with this data simple was using Excel and Power Query.

    Let’s focus on how I gathered results from the Ironman website. Ironman does a great job storing the results for all its races. However, the pagination system makes it hard to export results for further analysis. Here is where Power Query becomes essential. Without Power Query, I would have to manually copy each page to an Excel spreadsheet, which is extremely tedious and time consuming. I decided to write a couple of queries that would go through each page and collect all the results for a race.

    Query 1: GetData

    Collects the results for a page, given the page number. The second query, called KonaResults2013 (see below), will call GetData once for each page of results. We want to disable auto-loading to the worksheet for this query because KonaResults2013 will ultimately be responsible for getting the complete results into my model. Here is the code for the query:

    let
        GetData = (page) =>
            let
                /* Replace "" with the URL for your desired race. Make sure to trim the URL to the same spot: “http://...?p=” and leave out the other variables. */
                Source = Web.Page(Web.Contents("" & Text.From(page))),
                Data0 = Source{0}[Data],
                ChangedType = Table.TransformColumnTypes(Data0, {{"Name", type text}, {"Country", type text}, {"Div Rank", type number}, {"Gender Rank", type number}, {"Overall Rank", type number}, {"Swim", type time}, {"Bike", type time}, {"Run", type time}, {"Finish", type time}, {"Points", type number}})
            in
                ChangedType
    in
        GetData

    Query 2: KonaResults2013

    This query calls GetData n times, where n is the number of pages we need to go through. In this case, n = 107. The query also expands the results from GetData into a table and adds some formatting. I made sure to load the results straight to the Data Model, as I was planning to visualize them using Power View. Once I ran this query (and a few more for the other races I wanted to analyze), I was ready for some analysis. Check out the code:

    let
        /* Replace 107 with the last page on your desired race. */
        Source = {1..107},
        TableFromList = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
        InsertedCustom = Table.AddColumn(TableFromList, "Custom", each GetData([Column1])),
        #"Expand Custom" = Table.ExpandTableColumn(InsertedCustom, "Custom", {"Name", "Country", "Div Rank", "Gender Rank", "Overall Rank", "Swim", "Bike", "Run", "Finish", "Points"}, {"Custom.Name", "Custom.Country", "Custom.Div Rank", "Custom.Gender Rank", "Custom.Overall Rank", "Custom.Swim", "Custom.Bike", "Custom.Run", "Custom.Finish", "Custom.Points"}),
        RemovedColumns = Table.RemoveColumns(#"Expand Custom", {"Column1"})
    in
        RemovedColumns

    One great thing about the pattern above is that it can be used for any paginated table as long as the page number is passed through the URL. All of the transformations in the query (renaming columns, etc.) can be done through the Power Query UI, so the amount of code I ended up typing was minimal.
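    The same pattern carries over to any language: a per-page fetch function called once for each page number passed through the URL, with the pages appended into one table. A Python sketch (not M), with a stand-in get_page in place of the real web request:

```python
def get_page(page):
    # Stand-in for the real request, e.g. fetching "...?p=" + str(page)
    # and parsing the results table; here it just fabricates two rows.
    return [{"page": page, "rank": r} for r in (1, 2)]

def get_all_results(last_page):
    # Equivalent of KonaResults2013: call the per-page query for
    # pages 1..last_page and append the pages into one table.
    rows = []
    for page in range(1, last_page + 1):
        rows.extend(get_page(page))
    return rows

results = get_all_results(3)
print(len(results))  # 6 rows across 3 pages
```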

    Step 2: Visualize

    The next step was to visualize the data. I used Power View to create a series of charts, through which I could not only see the results but also begin to gather insights.

    Step 3: Gather Insights

    Here are some of the interesting things I found, as well as pictures of the different Power View reports.

    The first insights came from the demographics of triathlon in the United States:

    -          There is a heavy concentration of triathletes on the East Coast and in California

    -          Triathlon participation is higher in Michigan than in Colorado


    The next set of insights came from slicing and dicing finisher data for a few different Ironman races. First of all, let’s look at Kona:


    Here we can see the size of each age group participating, as well as the average completion time for each age group. To dig deeper into the data, I clicked on the ‘M30-34’ group in the horizontal “Average of OverallInHours by Division” chart, which cross-filters our data. This step shows us that the winning participant in the age group of males between 30 and 34 finished in an astonishing time of 8.62 hours (8 hours and 37 minutes, for those of us without a calculator).
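    The parenthetical conversion is simple arithmetic: the fractional hours times 60 gives the minutes. A one-liner to spare the calculator:

```python
def hours_to_hm(decimal_hours):
    # Split decimal hours into whole hours and rounded minutes.
    hours = int(decimal_hours)
    minutes = round((decimal_hours - hours) * 60)
    return f"{hours}:{minutes:02d}"

print(hours_to_hm(8.62))  # 8:37
```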

    Now, let’s look at some of the races I am considering signing up for, and compare their 2013 results against one another. We will also include Kona to get some perspective, even though it is a qualifying-only event.


    Here we start getting clues about the relative ‘toughness’ of each race. For instance, it is evident that Ironman Lake Tahoe favors strong climbers, and it is a tough race overall: 20% of participants DNF’d (did not finish) at Lake Tahoe. Here are some more insights:

    -          Lots of people sign up but don’t start at Ironman Cozumel. My theory is that lots of people pull out of this race due to its proximity to Thanksgiving.

    -          The average finishing times for Ironman Canada and Ironman Cozumel are pretty close despite having very different elevation numbers. This doesn’t mean that elevation is not a factor. Instead, it tells us that there may be other challenges at Cozumel not accounted for in our data. In this case, those challenges are the added winds and humidity of Cozumel.

    -          There is a huge gap between Kona and the other races as far as average finish times go. This is primarily due to the selective nature of a World Championship event.

    -          Ironman Canada offers a balanced ride and a challenging run. This course seems to suit me, given that I am relatively light—meaning I don’t need to exert lots of effort while climbing on the bike—and I can run well in challenging courses.

    With this information at hand, we can now look at some specifics for Ironman Canada, such as the time breakdowns for swimming, biking, and running, and the average finishing time for each age group. We will focus on my age group (25-29) and compare the average bike vs. average run times using a scatter plot.


    The scatter plot shows a clear trend: most people are able to balance their efforts on the bike and the run. There are, however, some outliers. These could be people who were slow on the bike (perhaps due to mechanical or nutrition issues) but pulled together a great run, or people who had a really strong bike but faded on the run. Some additional insights:

    -          Ironman Canada’s fastest age group was different from Kona’s (M35-39 in Canada vs. M30-34 in Kona)

    -          Age groups M35-39 and M30-34 were faster on average than the Female Pros.

    -          The first four people finished in under 10 hours, and were all in the top 100 overall. This means that to have a reasonable chance to qualify, I would have to finish in about 9 hours and 45 minutes.

    This last insight is extremely important, and it will help me set up objective goals for my next training season.

    Actionable Insights: How Should I Move Forward with my Training?

    Using Power BI, I was able to quickly gather big data, then slice it and dice it until I found the answer to my original question: what time do I need to qualify for Kona? The answer is 9:45 or better. Time to start training.

    Ready to uncover your own insights?

    #power query #Power BI #Excel #data visualization #Ironman #insights

    Thu, 11 Sep 2014 16:00:00 GMT by Power BI Team | 1 Comment

    Read full post
  • 3 Great Examples: Data Done Well

    It’s no secret that big data has big potential – here at Microsoft, we work with hundreds of customers using technology in new ways to get the most out of data in every shape and form. While each of these companies defines “success” differently, there are a few that are top-of-mind for us. The following are just a few examples of how real companies from a variety of industries across the world are seeing real benefits from working with data differently: increased efficiency, improved performance, and using resources more effectively, all with insights from their data.

    Grameen Foundation

    The Problem: The need to quantify the impact of the Ghana Mobile Technology for Community Health (MOTECH) initiative. With this information, the Grameen Foundation can eliminate ineffective programs and expand those making the biggest impact.

    The Solution: Accessible, customizable reports and data visualizations no matter the staff’s location or device. This enables employees to better understand program data, resulting in better justification of program expenses, data-driven program management and community engagement.

    The Benefits:

    -          Increased efficiency: report creation in minutes instead of hours
    -          Easy use by anyone in the company without formal training or added costs
    -          Increased awareness of MOTECH, and expansion of humanitarian efforts



    MediaCom

    The Problem: The need to easily measure and represent the health of an ad campaign despite a vast number of diverse data sets.

    The Solution:
    A “health check” that captures the various facets of a multi-platform media campaign in one single score, incorporating paid media effectiveness, earned media effectiveness, the ratio of earned to paid media, realized client value, and longitudinal performance. The “health check” is built on Power BI for Office 365, and provides a unified campaign dashboard, as well as a collaborative site where the account team can ask questions and instantly receive answers in chart and graph form to share across the team and with clients.

    The Benefits:

    -          Increased optimization checks for campaigns: from weekly to daily
    -          High adoption of BI across the company using the Excel tools that teams already know
    -          Increased campaign productivity


    Carnegie Mellon University

    The Challenge: Optimize energy and operational efficiency in buildings worldwide.

    The Solution: Carnegie Mellon worked with OSIsoft (also the Microsoft Business Intelligence Partner of the Year!) to install a PI system, which integrated all of the building automation systems, as well as lights, ventilation, air quality, weather, and security data sources. Then, they added Power BI for Office 365 to provide custom-reporting capabilities for all the real-time data generated by the various systems, equipping employees with maps, visualizations, and dashboards showing information such as building types, geographic locations, and energy consumption. This information allows employees to zero in on problems such as faulty equipment, and identify places to cut back on energy consumption.

    The Benefits:

    -          The ability to present relevant information to diverse users of any analytics skill level
    -          30% less energy consumption by using data to see where equipment is faulty


    How are you using data and analytics to change the way you do business?

    If you’re interested in trying Power BI, do it for free today.

    If you’re an Excel power user or simply interested in growing your analytics skills, check out the free ‘Faster Insights to Data with Power BI’ training.

    #Power BI #MediaCom #Carnegie Mellon University #Power BI for Office 365 #Grameen Foundation #Data Done Well

    Wed, 10 Sep 2014 16:00:00 GMT by Power BI Team | 0 Comments

    Read full post
  • 7 new updates in Power Query

    Take Power Query to the next level: share your queries and create a corporate data catalog. Try Power BI for Free today!



    We are very glad to announce the availability of a new Power Query update. You can get this update, as always, from our official download page.

    This update is packed with new features and usability improvements to existing features. Here is a quick summary of what has been added or improved in this update:

    1. Support for loading Binary-typed columns (such as images) to the Data Model
    2. Multi-query editing in the Query Editor
    3. Pivot Columns
    4. Automatic type coercion for Text operations over any scalar column type (number, date, etc.)
    5. Query Properties dialog for editing name & description
    6. Simplified “Share Query” experience
    7. “Enable Tracing” option in the Options dialog

    You can watch the following video or continue reading this blog post for more details about each feature.

    Support for loading Binary-typed columns (such as images) to the Data Model

    With this update, we’ve enabled the ability to load Binary-typed columns into the Data Model from Power Query. This lets users download blobs of data into the Data Model from data sources that support this data type (such as SQL Server), and also lets users leverage existing Power Query library functions (such as File.Contents and Web.Contents) to load and store contents from a specified location to the Data Model.


    Note that, after loading a table containing a Binary column from Power Query into the Data Model, you will need to go into the PowerPivot window and perform the following two steps:

    1. Open the Table Properties dialog from the Design tab in PowerPivot and enable the column containing Binary data.
    2. Customize the Table Behavior in the Advanced tab, to indicate that your Binary data column is the Default Image.

    After performing these two steps you will be able to visualize the images in a Power View report.


    Multi-query editing in the Query Editor

    We have added a Navigator pane to the left side of the Query Editor. This Navigator allows users to quickly switch between queries and perform changes to each of them without having to close the Query Editor and open a different query from the Workbook Queries task pane. The Navigator pane reflects the list of queries and query groups in the current workbook, just like the Workbook Queries task pane does.

     Pivot Columns

    Before this update, users were able to perform Pivot Columns operations by typing in the corresponding library function (Table.Pivot). However, this was not very discoverable, and it was also hard to use due to the number of parameters required by that function. With this update, we have added 1st class UX support for Pivot Columns. This operation can now be accessed via the Transform tab in the Query Editor.

    After selecting the desired column to pivot on and clicking Pivot Column, users are presented with options to customize the Values Column (the column that contains the cell values for the new columns) and the Aggregation function to apply to these values. There is also a link to a Help page with more information and examples about pivoting columns in Power Query.
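
    Under the hood, the new UX generates a call to Table.Pivot. A minimal sketch of the equivalent M, using made-up sample data:

    ```m
    let
        Source = Table.FromRecords({
            [Product = "Shirt", Attribute = "Color", Value = "Blue"],
            [Product = "Shirt", Attribute = "Size",  Value = "M"],
            [Product = "Hat",   Attribute = "Color", Value = "Red"]
        }),
        // The distinct values of the pivoted column ("Color", "Size") become the new column names,
        // populated from the Values Column ("Value")
        Pivoted = Table.Pivot(Source, List.Distinct(Source[Attribute]), "Attribute", "Value")
    in
        Pivoted
    ```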


    Automatic type coercion for Text operations over any scalar column type (number, date, etc.)

    With the goal of making it easier to apply column-level transformations, this update enables Text transformations (e.g. split/combine columns, uppercase/lowercase, etc.) on top of any scalar-typed column, such as numeric (Whole or Decimal Number) or date & time columns. When applying such operations to transform an existing column or to insert new columns based on existing ones, Power Query automatically handles the column type conversions so that the Text operations behave as expected. This reduces the need to explicitly change column types in preparation for additional column transformations.
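
    For example, splitting a numeric column by text position previously required an explicit conversion to text first; with this update the UI inserts the equivalent coercion for you. A sketch of what that coercion looks like in M, with made-up sample data:

    ```m
    let
        Source = Table.FromRecords({[Code = 20141125]}),
        // Explicit coercion to text — this is the step Power Query now adds automatically
        // when a Text operation is applied to a non-text scalar column
        AsText = Table.TransformColumnTypes(Source, {{"Code", type text}}),
        Split = Table.SplitColumn(AsText, "Code", Splitter.SplitTextByPositions({0, 4, 6}),
                                  {"Year", "Month", "Day"})
    in
        Split
    ```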


    Query Properties dialog for editing name & description

    One very common piece of feedback from our customers was that having to reopen the Query Editor in order to change the name or the description of an existing query was very inconvenient. In this update, we have added a new Properties dialog to control the name & description of each query and made it accessible from the Query ribbon tab and the Workbook Queries task pane.


    Note that the Properties dialog is also accessible from the Home tab inside the Query Editor and that the Description field has been removed from the Query Settings pane. This allows for additional space under the Applied Steps section.


    Simplified “Share Query” experience

    Another area of feedback has been that the “Share Query” dialog was a little overwhelming due to the number of fields presented to the user at once: name, description, data sources list, documentation URL, preview settings, etc. In addition, it wasn’t quite clear to the user what the relationship was between a shared query and a local copy of the query.

    With this update, we’re changing the terminology around “Share Query” to be more descriptive of the product behavior. Users can send a point-in-time copy of the query to the Power BI Data Catalog and, optionally, share it with a specific set of users and groups, or with their entire organization. We have also divided the input fields in the dialog between two tabs: Query (properties related to the query being sent to the catalog) and Sharing (options for who should be able to find the query in the Power BI Data Catalog).



    The entry point for managing your Power BI Data Catalog queries is now available under the Power BI group in the Power Query ribbon tab, right next to the Sign In/Out button.


    Note that the Documentation URL field is not available when sharing queries in this latest update. This is a temporary issue that will be addressed in next month’s update by bringing the field back into the “Send to Data Catalog” dialog. In the meantime, if you rely on this field, please use the previous Power Query bits (32-bit, 64-bit). We apologize in advance for any inconvenience this issue might cause.

    “Enable Tracing” option in the Options dialog

    If you have ever had to send us traces to help us troubleshoot an issue with Power Query, you already know how painful the process to enable tracing was… until now. It involved manually modifying a Windows Registry key, reproducing the scenario to generate traces and, finally, disabling tracing via the Windows Registry so that traces wouldn’t keep being generated and taking up a significant amount of local disk space.

    With this update we have added a new option under the Options dialog to easily turn on tracing. We will also take care of automatically disabling tracing after all in-flight Excel sessions are closed, to avoid taking up too much local disk space.


    That’s all for this update… We hope that you enjoy these features and continue sending us your valuable feedback and suggestions, so that we can continue improving Power Query with every new update.

    Download Power Query from our official download page.

    #update#power query#Power BI#Excel#Microsoft BI#share query#pivot columns#feedback#text operations#multi query

    Wed, 03 Sep 2014 16:00:00 GMT · by Miguel.Llopis · 7 Comments

  • Best practices for building hybrid business intelligence environments

    Last week, authors Joseph D'Antoni and Stacia Misner released a new white paper that discusses in detail the use of Power BI in a hybrid environment that takes advantage of the Data Management Gateway.

    In their words from the white paper summary: "Many organizations are taking advantage of the benefits of cloud computing such as lower capital expenditures and increased agility, while still maintaining data in on-premises data stores. This paper describes best practices for security, networking, and component architecture for building a hybrid business intelligence (BI) environment by using a combination of Microsoft Office 365, Microsoft Azure, and on-premises data sources".

    In this white paper you will find:

    • Instructions on how to set up Power BI in a hybrid environment
    • Best practices for security, networking, and component architecture for building a hybrid business intelligence (BI) environment by using a combination of Microsoft Office 365, Microsoft Azure, and on-premises data sources
    • Detailed options and best practices for each service that might be included on this type of hybrid solution
    • Guidance for integrating security and identity management in a hybrid environment
    • How to set up Power BI to refresh from the supported data sources

    Author Bios

    Joseph D'Antoni is a Senior Architect with over a decade of experience working in both Fortune 500 and smaller firms. He is currently Solutions Architect for SQL Server and Big Data for Anexinet in Blue Bell, PA. He is a frequent speaker at major tech events, a Microsoft SQL Server MVP, and blogs about all things technology. He is the co-president of the Philadelphia SQL Server Users Group. He holds a BS in Computer Information Systems from Louisiana Tech University and an MBA from North Carolina State University.

    Stacia Misner is a consultant, educator, mentor, author, and SQL Server MVP, specializing in business intelligence solutions since 1999. During that time, she has authored or coauthored multiple books about BI. Her latest books include Microsoft SQL Server Reporting Services 2012 (Microsoft Press, 2013) and Business Intelligence in Microsoft SharePoint 2013 (Microsoft Press, 2013). Stacia provides consulting and custom education services through her company, Data Inspirations; speaks frequently at conferences serving the SQL Server community; and serves as the chapter leader of her local PASS user group, SQL Server Society of Las Vegas. She is also a contributing editor for SQL Server Pro magazine. Stacia writes about her experiences with BI on her blog and tweets as @StaciaMisner.

    #Power BI#azure#hybrid#onprem#cloud#SQL server

    Tue, 26 Aug 2014 16:00:00 GMT · by Miguel A. Martinez · 1 Comment

  • Visualizing the Primetime Emmy History

    The 66th Primetime Emmy Awards ceremony will be held tonight at the Nokia Theatre in Downtown Los Angeles. Like every other TV fan around the world, I'll be watching. That is why I decided to honor the men and women that create these amazing shows for us with a visualization of Emmy Primetime and International winners since the first ceremony in 1949 using Excel and Power BI.

    In this visualization you will be able to explore the winners by show, network, actor/actress over the history of the awards. For international awards, you will be able to see a map and ranking with the total awards won by country.

    How did I build this? I used two data sources:

    1. List of Primetime Emmy Winners from Wikipedia
    2. List of International Emmy Winners from Wikipedia
    Tapping into these data sources was actually very easy using Power Query Online Search ("List of Primetime Emmy Winners" and "List of International Emmy Winners"). I just typed both searches and voila! The data was already in Excel. I had to work big time with the first one, since categories have changed over time and not all the fields were available (for example, decade, or category such as Drama and Comedy), but I was able to do it using all the shape and transform functions within Power Query.

    After that, it was just a matter of adding the tables to the Power Pivot data model and creating the dashboards with Power View using bar charts, slicers, maps and tables.
    It's important to note that I only used the main categories for this visualization (the ones available on the data sources):
    • Best Comedy, Drama and Variety TV series (no movies for TV)
    • Lead Actor and Actress (no supporting acting, writing or directing)
    • No reality show awards (Some people might think this is a good thing...)
    • For international Emmys, the awards are not Primetime but totals across all categories within this type of award
    Some interesting insights you can explore using these dashboards include:
    • All in the Family is the show with most Emmy awards in these categories since 1949 (11)
    • The Daily Show (one of my favorites) from Comedy Central is a close second with 10
    • CBS leads the scoreboard for networks with the amazing number of 146 awards
    • Before the 80s, CBS dominated the Emmys, but NBC took their lead until the 2000s. CBS re-gained the top spot starting in 2010
    • Cable networks such as HBO, AMC, Comedy Central and Showtime have become more important over the past 15 years
    • The list of shows winning awards in 2 different decades is an exclusive (and very high quality) club: shows such as Cheers, The Sopranos, Frasier and Taxi are members of this club
    • Only 2 actresses and 1 actor have won 5 Emmys since 1949: Candice Bergen, Carroll O'Connor and Mary Tyler Moore (famous for 2 shows: The Dick Van Dyke Show and The Mary Tyler Moore Show)
    • Last but not least, Chile, the country where I'm from, has won 1 international Emmy for Children and Young People: ¿Con Qué Sueñas? (What is your Dream?)
    That's it! Enjoy this visualization and tune in tonight to see which actors, actresses and series will take the Emmy home this year.
    #power query#Power BI#Power View#Excel#data visualization#dashboard#emmy#tv#primetime

    Mon, 25 Aug 2014 17:00:00 GMT · by Miguel A. Martinez · 0 Comments
