Tips, ideas, and updates from the experts

  • London's Metro Bank uses Microsoft’s Power BI to help improve customer service

    While we love talking about the latest out of Microsoft, the best part of talking about the latest and greatest is hearing how our customers are able to use our products to meet – and beat – goals, increase efficiency, or get a leg up on competitors. This last benefit – gaining a competitive edge – was of particular importance to Metro Bank, which in 2010 became the first new retail bank in Britain in more than a century.

    Metro Bank prides itself on being different. From round-the-clock call centers staffed by people (not machines) and “Magic Money Machines” to entertain kids, to the ability to open accounts and issue debit cards within minutes, Metro Bank’s priority is unrelenting customer service. This approach paid off, resulting in thousands of customers joining the ‘banking revolution’ over the last four years. As the bank continued to grow, it needed a business intelligence solution that could help it understand how customers used all of its services, from in-store to mobile and online. This information would help Metro Bank fine-tune its services and move toward its goal of 1 million customers by 2020. Instead of a third-party BI system, Metro Bank went with Power BI for Office 365.

    Over the last year, Metro Bank put Power BI through its paces, creating a variety of dashboards to track bank operations, including the launch of a mobile banking service. Features such as Power Q&A enable executives and colleagues alike – regardless of previous experience with business intelligence – to ask questions in natural language, accelerating adoption throughout the company.

    For more information on how Metro Bank is using Power BI for Office 365 to reach a million members, check out the brand new case study.

    If you’re interested in trying Power BI, do it for free today.

    If you’re an Excel power user or are simply interested in growing your analytics skills, check out the free “Faster Insights to Data with Power BI” training.

    #Power BI#Customer Story#Office 365#Metro Bank

    Thu, 11 Sep 2014 17:00:00 GMT | by Power BI Team | 0 Comments

    Read full post
  • Three Steps To Gleaning Actionable Insights Using Power BI

    By Pedro Ardila, Designer at Microsoft, Ironman competitor, Data Geek

    Triathlon is a changing sport. Over the past twenty years, there has been a tremendous amount of change to equipment, training methodology, and racing strategy. No detail is too small for athletes whose main goal is to qualify for a world championship event. And there is no bigger stage in triathlon than the Ironman World Championships in Kailua-Kona, Hawaii.

    What goes into qualifying for this event? In most cases, it is months, or even years, of training, tens of thousands of dollars in equipment and entry fees, and of course lots of support from family and friends. One aspect no athlete should leave out is strategy, and every athlete’s strategy starts with a couple of questions:

    1. What race should I enter in order to have the best chance to make it to the big dance in Kona?
    2. What times do I need in the swim, bike, and run portions to have a chance to qualify?

    I asked myself these questions, and decided to use my Microsoft Power BI knowledge to get some insights into what is needed to qualify. Here are the steps I followed:

    Step 1: Get Data

    I wanted some general demographic information to understand who enters these races. I also needed recent race results to understand the characteristics of each race and to get a better feel for the competition. I gathered the demographics data from USA Triathlon, followed by a few thousand rows’ worth of results for different races from ironman.com. That is a lot of data, but Excel and Power Query made working with it simple.

    Let’s focus on how I gathered results from the Ironman website. Ironman does a great job storing the results for all its races. However, the pagination system makes it hard to export results for further analysis. This is where Power Query becomes essential. Without it, I would have had to copy each page into an Excel spreadsheet manually, which is extremely tedious and time-consuming. Instead, I wrote a couple of queries that go through each page and collect all the results for a race.

    Query 1: GetData

    Collects the results for a page, given the page number. The second query, KonaResults2013 (see below), will call GetData once for each page of results. We want to disable auto-loading to the worksheet for this query, because KonaResults2013 will ultimately be responsible for getting the complete results into the model. Here is the code for the query:

    let
        #"a" = (page) =>
            let
                /* replace this with the url for your desired race. Make sure to trim the url
                   to the same spot: “http://...?p=” and leave out the other variables */
                Source = Web.Page(Web.Contents("http://www.ironman.com/triathlon/events/americas/ironman/world-championship/results.aspx?p=" & Text.From(page))),
                Data0 = Source{0}[Data],
                ChangedType = Table.TransformColumnTypes(Data0, {{"Name", type text}, {"Country", type text}, {"Div Rank", type number}, {"Gender Rank", type number}, {"Overall Rank", type number}, {"Swim", type time}, {"Bike", type time}, {"Run", type time}, {"Finish", type time}, {"Points", type number}})
            in
                ChangedType
    in
        #"a"

    Query 2: KonaResults2013

    This query calls GetData n times, where n is the number of pages we need to go through; in this case, n = 107. The query also expands the results from GetData into a table and adds some formatting. I made sure to load the results straight to the Data Model, as I was planning to visualize them with Power View. Once I ran this query (and a few more for the other races I wanted to analyze), I was ready for some analysis. Check out the code:

    let
        /* replace 107 with the last page on your desired race */
        Source = {1..107},
        TableFromList = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
        InsertedCustom = Table.AddColumn(TableFromList, "Custom", each GetData([Column1])),
        #"Expand Custom" = Table.ExpandTableColumn(InsertedCustom, "Custom", {"Name", "Country", "Div Rank", "Gender Rank", "Overall Rank", "Swim", "Bike", "Run", "Finish", "Points"}, {"Custom.Name", "Custom.Country", "Custom.Div Rank", "Custom.Gender Rank", "Custom.Overall Rank", "Custom.Swim", "Custom.Bike", "Custom.Run", "Custom.Finish", "Custom.Points"}),
        RemovedColumns = Table.RemoveColumns(#"Expand Custom", {"Column1"})
    in
        RemovedColumns

    One great thing about the pattern above is that it can be used for any paginated table as long as the page number is passed through the URL. All of the transformations in the query (renaming columns, etc.) can be done through the Power Query UI, so the amount of code I ended up typing was minimal.
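    As a rough illustration for readers outside Excel, the same loop-over-page-numbers pattern can be sketched in Python. The fetch function below is a stub; the actual URL building and HTML parsing that Power Query handles are left out:

```python
from typing import Callable, Dict, List

def collect_paginated(fetch_page: Callable[[int], List[Dict]], last_page: int) -> List[Dict]:
    """Collect rows from every page of a paginated table and concatenate them,
    mirroring the GetData / KonaResults2013 pattern above."""
    rows: List[Dict] = []
    for page in range(1, last_page + 1):
        rows.extend(fetch_page(page))
    return rows

# Stand-in for Web.Page(Web.Contents("...results.aspx?p=" & page)):
# a real implementation would request each page and parse its results table.
def fake_fetch(page: int) -> List[Dict]:
    return [{"page": page, "name": "athlete-%d" % page}]

results = collect_paginated(fake_fetch, 3)  # one row per page in this stub
```

    The key point, as in the M version, is that only the page number changes between requests; everything else is shared.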

    Step 2: Visualize

    The next step was to visualize the data. I used Power View to create a series of charts. With these charts, I could not only see the results but also begin to gather insights.

    Step 3: Gather Insights

    Here are some of the interesting things I found, along with pictures of the different Power View reports.

    The first insights came from the demographics of triathlon in the United States:

    - There is a heavy concentration of triathletes on the East Coast and in California

    - Triathlon participation is higher in Michigan than in Colorado

     

    The next set of insights came from slicing and dicing finisher data for a few different Ironman races. First of all, let’s look at Kona:

     

    Here we can see the size of each age group participating, as well as the average completion time for each age group. To dig deeper into the data, I clicked the ‘M30-34’ group in the horizontal “Average of OverallInHours by Division” chart, which cross-filters our data. This shows that the winning participant in the male 30-34 age group finished in an astonishing 8.62 hours (or 8 hours and 37 minutes for those of us without a calculator).
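    The decimal-hours arithmetic is easy to script; here is a small helper (not part of the original workbook) that converts a value like 8.62 hours to hours and minutes:

```python
def hours_to_hm(decimal_hours: float) -> str:
    """Convert decimal hours (e.g. 8.62) to an H:MM string."""
    hours = int(decimal_hours)
    minutes = round((decimal_hours - hours) * 60)
    if minutes == 60:  # guard against rounding up to a full hour
        hours, minutes = hours + 1, 0
    return "%d:%02d" % (hours, minutes)

print(hours_to_hm(8.62))  # → 8:37
```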

    Now, let’s look at some of the races I am considering signing up for and compare their 2013 results against one another. We will also include Kona to get some perspective, even though it is a qualifying-only event.

     

    Here we start getting clues about the relative ‘toughness’ of each race. For instance, it is evident that Ironman Lake Tahoe favors strong climbers, and it is a tough race overall: 20% of participants DNF’d (did not finish) at Lake Tahoe. Here are some more insights:

    - Lots of people sign up but don’t start at Ironman Cozumel. My theory is that many pull out of this race due to its proximity to Thanksgiving.

    - The average finishing times for Ironman Canada and Ironman Cozumel are pretty close despite very different elevation numbers. This doesn’t mean that elevation is not a factor. Instead, it tells us there may be other challenges at Cozumel not accounted for in our data; in this case, the added wind and humidity of Cozumel.

    - There is a huge gap between Kona and the other races as far as average finish times go. This is primarily due to the selective nature of a World Championship event.

    - Ironman Canada offers a balanced ride and a challenging run. This course seems to suit me, given that I am relatively light (meaning I don’t need to exert lots of effort while climbing on the bike) and I can run well on challenging courses.

    With this information at hand, we can now look at some specifics for Ironman Canada, such as the time breakdowns for swimming, biking, and running, and the average finishing time for each age group. We will focus on my age group (25-29) and compare average bike versus average run times using a scatter plot.

     

    The scatter plot shows a clear trend: most people balance their efforts on the bike and the run. There are, however, some outliers. These could be people who were slow on the bike (perhaps due to mechanical or nutrition issues) but pulled together a great run, or people who had a really strong bike but faded on the run. Some additional insights:

    - Ironman Canada’s fastest age group was different from Kona’s (M35-39 in Canada vs. M30-34 in Kona)

    - Age groups M35-39 and M30-34 were faster on average than the Female Pros.

    - The first four people finished in under 10 hours and were all in the top 100 overall. This means that to have a reasonable chance to qualify, I would have to finish in about 9 hours and 45 minutes.

    This last insight is extremely important, and it will help me set up objective goals for my next training season.
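    The scatter-plot outlier reasoning above can also be checked numerically. Here is a rough Python sketch on invented bike/run splits (the real values live in the workbook’s data model): flag athletes whose run-to-bike ratio sits far from the field’s average balance.

```python
from statistics import mean, stdev

# Invented (bike_hours, run_hours) pairs, for illustration only.
athletes = [(5.5, 3.6), (5.8, 3.9), (6.0, 4.1), (5.6, 3.7), (7.5, 3.5), (5.4, 5.2)]

ratios = [run / bike for bike, run in athletes]
mu, sigma = mean(ratios), stdev(ratios)

# Anyone more than one standard deviation from the average balance is flagged:
# a slow bike with a strong run, or a strong bike with a fading run.
outliers = [a for a, r in zip(athletes, ratios) if abs(r - mu) > sigma]
print(outliers)  # the two unbalanced athletes
```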

    Actionable Insights: How Should I Move Forward with my Training?

    Using Power BI, I was able to quickly gather and use big data, then slice it and dice it until I found the answer to my original question: what time do I need to qualify for Kona? The answer is 9:45 or better. Time to start training.

    Ready to uncover your own insights?

    #power query#Power BI#Excel#data visualization#Ironman#insights

    Thu, 11 Sep 2014 16:00:00 GMT | by Power BI Team | 0 Comments

    Read full post
  • 3 Great Examples: Data Done Well

    It’s no secret that big data has big potential – here at Microsoft, we work with hundreds of customers using technology in new ways to get the most out of data in every shape and form. While each of these companies defines “success” differently, a few are top-of-mind for us. The following are just a few examples of how real companies from a variety of industries across the world are seeing real benefits from working with data differently: increased efficiency, improved performance, and more effective use of resources, all with insights from their data.


    Grameen Foundation

    The Problem: The need to quantify the impact of the Ghana Mobile Technology for Community Health (MOTECH) initiative. With this information, the Grameen Foundation can eliminate ineffective programs and expand those making the biggest impact.

    The Solution: Accessible, customizable reports and data visualizations no matter the staff’s location or device. This enables employees to better understand program data, resulting in better justification of program expenses, data-driven program management and community engagement.

    The Benefits:

    - Increased efficiency: report creation in minutes instead of hours
    - Easy use by anyone in the company without formal training or added costs
    - Increased awareness of MOTECH, and expansion of humanitarian efforts

     

    MediaCom

    The Problem: The need to easily measure and represent the health of an ad campaign despite a vast number of diverse data sets.

    The Solution:
    A “health check” that captures the various facets of a multi-platform media campaign in one single score, incorporating paid media effectiveness, earned media effectiveness, the ratio of earned to paid media, realized client value, and longitudinal performance. The “health check” is built on Power BI for Office 365, and provides a unified campaign dashboard, as well as a collaborative site where the account team can ask questions and instantly receive answers in chart and graph form to share across the team and with clients.

    The Benefits:

    - Increased optimization checks for campaigns: from weekly to daily
    - High adoption of BI across the company using the Excel tools that teams already know
    - Increased campaign productivity

     

    Carnegie Mellon University

    The Challenge: Optimize energy and operational efficiency in buildings worldwide.

    The Solution: Carnegie Mellon worked with OSIsoft (also the Microsoft Business Intelligence Partner of the Year!) to install a PI System, which integrated all of the building automation systems, as well as lights, ventilation, air quality, weather, and security data sources. Then, they added Power BI for Office 365 to provide custom-reporting capabilities for all the real-time data generated by the various systems, equipping employees with maps, visualizations, and dashboards showing information such as building types, geographic locations, and energy consumption. This information allows employees to zero in on problems such as faulty equipment and identify places to cut back on energy consumption.

    The Benefits:

    - The ability to present relevant information to diverse users of any analytics skill level
    - 30% less energy consumption by using data to see where equipment is faulty

     

    How are you using data and analytics to change the way you do business?

    If you’re interested in trying Power BI, do it for free today.

    If you’re an Excel power user or simply interested in growing your analytics skills, check out the free ‘Faster Insights to Data with Power BI’ training.

    #Power BI#MediaCom#Carnegie Mellon University#Power BI for Office 365#Grameen Foundation#Data Done Well

    Wed, 10 Sep 2014 16:00:00 GMT | by Power BI Team | 0 Comments

    Read full post
  • 7 new updates in Power Query

    Take Power Query to the next level: share your queries and create a corporate data catalog. Try Power BI for Free today!

    In this Post

    Download the Power Query Update

    Support for loading Binary-typed columns (such as images) to the Data Model

    Multi-query editing in the Query Editor

    Pivot Columns

    Automatic type coercion for Text operations over any scalar column type (number, date, etc.)

    Query Properties dialog for editing name & description

    Simplified “Share Query” experience

    “Enable Tracing” option in the Options dialog

     

    We are very glad to announce the availability of a new Power Query update. You can get this update, as always, from our official download page.

    This update is packed with new features and usability improvements to existing features. Here is a quick summary of what has been added or improved in this update:

    1. Support for loading Binary-typed columns (such as images) to the Data Model
    2. Multi-query editing in the Query Editor
    3. Pivot Columns
    4. Automatic type coercion for Text operations over any scalar column type (number, date, etc.)
    5. Query Properties dialog for editing name & description
    6. Simplified “Share Query” experience
    7. “Enable Tracing” option in the Options dialog

    You can watch the following video or continue reading this post for more details about each feature.

    Support for loading Binary-typed columns (such as images) to the Data Model

    With this update, we’ve enabled the ability to load Binary-typed columns into the Data Model from Power Query. This lets users download blobs of data into the Data Model from data sources that support this data type (such as SQL Server), and also lets them leverage existing Power Query library functions (such as File.Contents and Web.Contents) to load and store contents from a specified location into the Data Model.

     

    Note that, after loading a table containing a Binary column from Power Query into the Data Model, you will need to go into the PowerPivot window and perform the following two steps:

    1. Open the Table Properties dialog from the Design tab in PowerPivot and enable the column containing Binary data.
    2. Customize the Table Behavior in the Advanced tab, to indicate that your Binary data column is the Default Image.

    After performing these two steps you will be able to visualize the images in a Power View report.

     

    Multi-query editing in the Query Editor

    We have added a Navigator pane to the left side of the Query Editor. This Navigator allows users to quickly switch between queries and perform changes to each of them without having to close the Query Editor and open a different query from the Workbook Queries task pane. The Navigator pane reflects the list of queries and query groups in the current workbook, just like the Workbook Queries task pane does.

     Pivot Columns

    Before this update, users could perform Pivot Columns operations only by typing in the corresponding library function (Table.Pivot). However, this was not very discoverable, and it was also hard to use due to the number of parameters required by that function. With this update, we have added first-class UX support for Pivot Columns. The operation can now be accessed via the Transform tab in the Query Editor.

    After selecting the desired column to pivot on and clicking Pivot Column, users are presented with a few options to customize the Values Column (the column that contains the cell values for the new columns) and the Aggregation function to apply to those values. There is also a link to a Help page with more information and examples about pivoting columns in Power Query.
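    Conceptually, pivoting a column takes its distinct values as new column names and aggregates the values column into their cells. Here is a minimal Python illustration of that shape of operation (the data and column names are invented; in Power Query the equivalent is Table.Pivot):

```python
from collections import defaultdict

# Turn distinct values of "Year" into columns, aggregating "Sales" with sum.
rows = [
    {"Product": "A", "Year": "2013", "Sales": 10},
    {"Product": "A", "Year": "2014", "Sales": 15},
    {"Product": "B", "Year": "2013", "Sales": 7},
    {"Product": "A", "Year": "2013", "Sales": 5},
]

pivoted = defaultdict(lambda: defaultdict(int))
for r in rows:
    pivoted[r["Product"]][r["Year"]] += r["Sales"]  # aggregation function: sum

print(dict(pivoted["A"]))  # → {'2013': 15, '2014': 15}
```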

      

    Automatic type coercion for Text operations over any scalar column type (number, date, etc.)

    With the goal of making it easier to apply column-level transformations, this update enables Text transformations (i.e. split/combine columns, uppercase/lowercase, etc.) on top of any scalar-type column, such as numeric (Whole or Decimal Number) or date & time columns. When applying such operations to transform an existing column or to insert new columns based on existing ones, Power Query automatically handles column type conversions so that the Text operations behave as expected. This reduces the need to explicitly change column types in preparation for additional column transformations.

       

    Query Properties dialog for editing name & description

    One very common piece of feedback from our customers was that having to reopen the Query Editor in order to change the name or the description of an existing query was very inconvenient. In this update, we have added a new Properties dialog to control the name & description of each query and made it accessible from the Query ribbon tab and the Workbook Queries task pane.

     

    Note that the Properties dialog is also accessible from the Home tab inside the Query Editor and that the Description field has been removed from the Query Settings pane. This allows for additional space under the Applied Steps section.

       

    Simplified “Share Query” experience

    Another area of feedback has been that the “Share Query” dialog was a little bit overwhelming due to the number of fields presented to the user at once: name, description, data sources list, documentation URL, preview settings, etc. In addition, it wasn’t quite clear to the user what the relationship was between a shared query and a local copy of the query.

    With this update, we’re changing the terminology around “Share Query” to be more descriptive of the product behavior. Users can send a point-in-time copy of the query to the Power BI data catalog and, optionally, share it with a specific set of users and groups, or with their entire organization. We have also divided the input fields in the dialog between two tabs: Query (properties related to the query being sent to the catalog) and Sharing (options for who should be able to find the query in the Power BI Data Catalog).

     

     

    The entry point for managing your Power BI Data Catalog queries is now available under the Power BI group in the Power Query ribbon tab, right next to the Sign In/Out button.

     

    Note that the Documentation URL field used when sharing queries is not available in this latest update. This is a temporary issue that will be addressed in next month’s update by bringing the field back into the “Send to Data Catalog” dialog. In the meantime, please use the previous Power Query bits (32-bit, 64-bit). We apologize for any inconvenience this issue might cause.

    “Enable Tracing” option in the Options dialog

    If you have ever had to send us some traces to help us troubleshoot an issue with Power Query, you already know how painful the process to enable tracing was… until now. It involved manually modifying a Windows Registry key, reproducing the scenario to generate traces and, finally, disabling tracing via the Windows Registry so that these traces wouldn’t continue being generated (taking up a significant amount of local disk space).

    With this update we have added a new option under the Options dialog to easily turn on tracing. We will also take care of automatically disabling tracing after all in-flight Excel sessions are closed, to avoid taking up too much local disk space.

     

    That’s all for this update… We hope that you enjoy these features and continue sending us your valuable feedback and suggestions, so that we can continue improving Power Query with every new update.

    Download Power Query from our official download page.

    #update#power query#Power BI#Excel#Microsoft BI#share query#pivot columns#feedback#text operations#multi query

    Wed, 03 Sep 2014 16:00:00 GMT | by Miguel.Llopis | 4 Comments

    Read full post
  • Best practices for building hybrid business intelligence environments

    Last week, authors Joseph D'Antoni and Stacia Misner released a new white paper that talks in detail about the use of Power BI in a hybrid environment taking advantage of the Data Management Gateway.

    In their words from the white paper summary: "Many organizations are taking advantage of the benefits of cloud computing such as lower capital expenditures and increased agility, while still maintaining data in on-premises data stores. This paper describes best practices for security, networking, and component architecture for building a hybrid business intelligence (BI) environment by using a combination of Microsoft Office 365, Microsoft Azure, and on-premises data sources".

    In this white paper you will find:

    • Instructions on how to set up Power BI in a hybrid environment
    • Best practices for security, networking, and component architecture for building a hybrid business intelligence (BI) environment by using a combination of Microsoft Office 365, Microsoft Azure, and on-premises data sources
    • Detailed options and best practices for each service that might be included on this type of hybrid solution
    • Guidance for integrating security and identity management in a hybrid environment
    • How to set up Power BI to refresh from the supported data sources

    Author Bios

    Joseph D'Antoni is a Senior Architect with over a decade of experience working in both Fortune 500 and smaller firms. He is currently Solutions Architect for SQL Server and Big Data for Anexinet in Blue Bell, PA. He is a frequent speaker at major tech events, a Microsoft SQL Server MVP, and blogs about all things technology. He is the co-president of the Philadelphia SQL Server Users Group. He holds a BS in Computer Information Systems from Louisiana Tech University and an MBA from North Carolina State University.

    Stacia Misner is a consultant, educator, mentor, author, and SQL Server MVP, specializing in business intelligence solutions since 1999. During that time, she has authored or coauthored multiple books about BI. Her latest books include Microsoft SQL Server Reporting Services 2012 (Microsoft Press, 2013) and Business Intelligence in Microsoft SharePoint 2013 (Microsoft Press, 2013). Stacia provides consulting and custom education services through her company, Data Inspirations; speaks frequently at conferences serving the SQL Server community; and serves as the chapter leader of her local PASS user group, SQL Server Society of Las Vegas. She is also a contributing editor for SQL Server Pro magazine. Stacia writes about her experiences with BI at blog.datainspirations.com and tweets at @StaciaMisner.

    #Power BI#azure#hybrid#onprem#cloud#SQL server

    Tue, 26 Aug 2014 16:00:00 GMT | by Miguel A. Martinez | 0 Comments

    Read full post
  • Visualizing the Primetime Emmy History

    The 66th Primetime Emmy Awards ceremony will be held tonight at the Nokia Theatre in Downtown Los Angeles. Like every other TV fan around the world, I'll be watching. That is why I decided to honor the men and women that create these amazing shows for us with a visualization of Emmy Primetime and International winners since the first ceremony in 1949 using Excel and Power BI.

    In this visualization you will be able to explore the winners by show, network, actor/actress over the history of the awards. For international awards, you will be able to see a map and ranking with the total awards won by country.

    How did I build this? I basically used 2 data sources to build this visualization:

    1. List of Primetime Emmy Winners from Wikipedia
    2. List of International Emmy Winners from Wikipedia
    Tapping into these data sources was actually very easy using Power Query Online Search ("List of Primetime Emmy Winners" and "List of International Emmy Winners"). I just typed both searches and voila! The data was already in Excel. I had to do a fair amount of work on the first one, since categories have changed over time and not all the fields were available (for example, decade, or categories such as Drama and Comedy), but I was able to do it using the shape and transform functions within Power Query.
    After that, it was just a matter of adding the tables to the Power Pivot data model and creating the dashboards in Power View using bar charts, slicers, maps and tables.
     
    It's important to note that I only used the main categories for this visualization (the ones available on the data sources):
    • Best Comedy, Drama and Variety TV series (no movies for TV)
    • Lead Actor and Actress (no supporting acting, writing or directing)
    • No reality show awards (Some people might think this is a good thing...)
    • For international Emmys, the awards are not Primetime but totals on all categories within this type of award
    Some interesting insights you can explore using these dashboards include:
    • All in the Family is the show with the most Emmy awards in these categories since 1949 (11)
    • The Daily Show (one of my favorites) from Comedy Central is a close second with 10
    • CBS leads the scoreboard for networks with the amazing number of 146 awards
    • Before the 80s, CBS dominated the Emmys, but NBC took their lead until the 2000s. CBS re-gained the top spot starting in 2010
    • Cable networks such as HBO, AMC, Comedy Central and Showtime have become more important over the past 15 years
    • The list of shows winning awards in 2 different decades is an exclusive (and very high quality) club: shows such as Cheers, The Sopranos, Frasier and Taxi are members of this club
    • Only 2 actresses and 1 actor have won 5 Emmys since 1949: Candice Bergen, Carroll O'Connor and Mary Tyler Moore (famous for 2 shows: The Dick Van Dyke Show and The Mary Tyler Moore Show)
    • Last but not least, Chile, the country where I'm from has won 1 international Emmy for Children and Young People: ¿Con Qué Sueñas? (What is your Dream?)
     
    That's it! Enjoy this visualization and tune in tonight to see which actors, actresses and series will take the Emmy home this year.
    #power query#Power BI#Power View#Excel#data visualization#dashboard#emmy#tv#primetime

    Mon, 25 Aug 2014 17:00:00 GMT | by Miguel A. Martinez | 0 Comments

    Read full post
  • Scheduled Data Refresh Update: New Data Sources

    In this Post

    Download the Data Management Gateway

    Extended data source support

    Step-by-step Power Query connection configuration

    Additional information

     

    We are excited to announce new data sources for the scheduled data refresh feature of Power BI. To support these enhancements, a new version of the Data Management Gateway is available. The installation package (version 1.2) can be downloaded from the Microsoft Download Center.

     

     

    Extended data source support

    Newly supported data source types (with their authentication types) from a Power Query connection string in Data Management Gateway version 1.2:

    - SQL Server: Windows, Basic
    - Oracle: Windows, Basic
    - File (CSV, XML, Text, Excel, Access): Windows
    - Folder: Windows
    - SharePoint List (Online): Anonymous
    - SharePoint List (On-prem): Anonymous, Windows
    - OData Feed: Anonymous, Windows, Basic, Key (Data Market)
    - Azure Marketplace: Key (Data Market)
    - Azure HDInsight: Key (Azure Account)
    - Azure Blob Storage: Key (Azure Account)
    - Azure Table Storage: Key (Azure Account)
    - PostgreSQL: Basic
    - Teradata: Windows, Basic
    - Sybase: Windows, Basic
    - DB2: Windows, Basic
    - MySQL: Windows, Basic
    - SQL Azure: Basic
    - Queries that do not access data sources

     

    [1]: Native queries (custom SQL statements) for relational databases are not supported yet.

    [2]: It is best to place file and folder data sources on shared folders, so that access to the data sources is more reliable. However, if the data sources are available on every machine with the gateway instance installed, the Power Query connection can still refresh successfully.

    [4]: Web API key and OAuth2 are not supported yet.

    Step-by-step Power Query connection configuration

    Here is a step-by-step guide showing the UI improvements for Power Query connection configuration introduced since our May update.

    1. On the data source page, click new data source > Power Query.

     

    In addition to SQL Server, Oracle, and Power Query connections, the July update to the Power BI Admin Center also supports SharePoint Online Document Library for data indexing. Please refer to the related online documentation for more information.

    2. On the connection info page, enter a valid Power Query connection string, and click next.


    We’ve added more information on this page to help you determine what makes a valid Power Query connection string, as well as a complete matrix of supported data sources.
    Please note that currently we only support Power Query connection strings copied from the DATA tab of the Excel workbook; please refer to the online documentation for details. Copying a Power Query connection string directly from Power Pivot is not supported. A valid Power Query connection string should contain:

      • Provider=Microsoft.Mashup.OleDb.1;
      • Data Source=$EmbeddedMashup(SomeGUID)$;
      • Location=SomePowerQueryName;
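    Assembled into a single string, the three parts above look like this (the GUID and query name are placeholders, exactly as in the list above, and must come from your own workbook):

    ```text
    Provider=Microsoft.Mashup.OleDb.1;Data Source=$EmbeddedMashup(SomeGUID)$;Location=SomePowerQueryName;
    ```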

    3. All data sources in the Power Query connection will be shown in the data source info page.


    To improve the user experience, we’ve added a DETAIL column that shows information about each data source, and the Details section in the right pane is now expanded by default.

    4. To configure an unconfigured data source, specify a Name and Description (optional) for the data source and select a Gateway. Then click the set credentials button to launch the data source settings dialog.
    The layout of the data source settings dialog depends on the credential type of the data source. For example, you can specify Database or Windows authentication for Teradata, and an Account key for Azure HDInsight.


     
    You can specify other data source settings in the dialog, including the privacy level and, for relational databases, whether to encrypt the connection.
    You must test the connection before clicking ok to save your credentials. And don’t forget to click save in the Admin Center to commit your modifications to the data source; the data source status will then change to configured.



    Please note that you will need the latest version (v1.2) of the Data Management Gateway for data sources other than SQL Server and Oracle. All data sources in the Power Query connection still need to be on the same gateway; however, you can add more instances to one gateway to solve any scalability issues, which is also a new feature in this July update.

    5. When all data sources within the Power Query connection are appropriately configured, test the Power Query connection before clicking next to make sure it works.

     

    6. In the users and groups page, specify users and groups that are allowed to access these data sources to refresh Power Query connections. If a data source already exists, the specified users and groups will be appended to the existing users and groups list of the data source.

     

    Now you can successfully refresh your Excel workbook with a data model using the Power Query connection on the Power BI site!

    Additional information

     

    • Please refer to data source prerequisites for required providers to access the corresponding data sources. Please note that #literals, Web, SAP BusinessObjects, Active Directory, HDFS, Facebook, Exchange and Current Excel workbook are not yet supported sources.
    • Power Query connection string from Power Pivot is still not supported. You can only get a valid Power Query connection string from a data table.
    • All data sources in the Power Query connection must still be hosted on the same gateway. You may consider adding more instances to the gateway to solve any scalability problems.
    • Data source details are now displayed in the DETAIL column to improve visibility when configuring a Power Query connection.
    • Since the privacy level is part of the data source settings for Power Query connections, you must specify a privacy level and confirm the credential before trying to update the privacy level.
      • Privacy level for existing SQL Server and Oracle data sources will by default be assigned as organizational. You can update the privacy level afterwards.
      • Privacy level for anonymous data sources is fixed to be public.
      • For data sources not used in Power Query connections, the value of this setting will not take effect and it is safe to choose an arbitrary value.

     

    #data refresh#Power BI app#data sources#schedule data refresh#data management gateway

    Tue, 19 Aug 2014 16:00:00 GMT | by Power BI Team | 6 Comments

  • Power BI is changing the way health services are provided

    This blog post was authored by Tom Lawry, Director Worldwide Health, Microsoft

    With healthcare representing one of the fastest growing business segments for Business Intelligence, Power BI is finding its way into healthcare organizations worldwide, where it’s being used to improve quality, performance, and productivity.

    Here are four examples from around the world:

    United States: A provider/payor organization in the Midwest wanted to evaluate how to better serve populations of patients with chronic health conditions. Using Power Query, they combined internal data from their Electronic Medical Record system with external data from credible open data sources. They then used Power Map to visualize the data and explore trends and patterns in their service area that might help them identify new ways of serving patients and consumers.

    Anonymized view of the data used to explore population health patterns.

    Norway: Helse Vest is a Norwegian health authority that operates 50 healthcare facilities. To meet requirements for a government patient safety program, the authority needed a way to more quickly create analytical reports based on surgery trends and other medical data. Helse Vest implemented Power BI for Office 365, which employees now use to gather and visualize combined data from multiple hospitals. With Power BI, the staff reduced the time to build reports from 14 days to less than 1 day, and Helse Vest can quickly comply with government requirements.

    The Intervention Centre is the research and development unit at Oslo University Hospital charged with identifying clinical best practices and applying new technologies to healthcare management.

    In one use case, radiology data was collected from department systems by Power Query and loaded into a cloud-based database hosted on a Power BI site, which researchers could access and use as a collaboration site for sharing queries. Queries, in turn, could be made in the form of natural-language searches, eliminating the need for programming skills.

    Hospital executives and researchers at the Intervention Centre say that they can use Power BI to speed analyses to deliver actionable information, collaborate better with researchers beyond the enterprise and avoid the capital and operating costs of a traditional BI solution.

    United Kingdom: The North Tees and Hartlepool National Health Services Foundation Trust provides hospital and community based services to a population of about 400,000.

    The Foundation Trust initially used Power BI to evaluate the cost and efficacy of drugs used in the treatment process. In one use case, Power BI was used to correlate weather data with the institution’s own data to identify the impact of inclement weather on the frequency of respiratory conditions, and to understand which drugs were being prescribed and how prescription patterns varied by locality. Use of this data is now being expanded through collaboration with other hospitals to compare data through Power BI, with the goal of identifying best practices in prescribing drugs.

     

    Tom Lawry

    Director Worldwide Health

    Microsoft

    #Power BI#Excel#hospital#healthcare#clinic#health#public data

    Thu, 14 Aug 2014 16:00:00 GMT | by Power BI Team | 4 Comments

  • Data Visualization for the 2014 World Cup results using Excel and Power BI

    This blog post was authored by Marc Reguera, Finance Director, Microsoft

    After the success and interest generated by my first post, Visualizing the World Cup history with Excel and Power BI, I decided to update the model with data from the 2014 Brazil World Cup and analyze how the history of the World Cup changed based on the results. The concept was exactly the same: use Power Query to get the latest data into Excel, update the Power Pivot model, and visualize with Excel dashboards using Power View to compare 2014 data with historic data from previous World Cups. I used the same visualizations to unlock insights on how team performance changed, how the attendance stacks up against previous tournaments, and other interesting facts around goals, goal average, and number of games. So let's dig in! Remember: all of these dashboards were created with Excel, tapping into online data with Power Query and updating the model that I created.

    Here are eleven brand new insights I learned after exploring the updated world cup data using Excel interactive dashboards and Power BI features:

    1. Brazil 2014 was the World Cup with the most goals scored, matching France 1998, breaking the trend since then of decreasing goals per game, and reaching 2.67 goals per game
    2. Attendance was great too, beaten only by USA 1994 (because of the massive average US stadium size) and almost hitting the 3.5 million mark across all 64 games
    3. Brazil is now one of the countries that has hosted the most World Cup games, with 85, surpassed only by Germany with 102
    4. The increase in average goals per game is explained by the percentage of games with 3 goals, which exploded from 22% in 2010 to 31% in 2014. The increase in games with 4 or more goals (9% in 2010 vs. 13% in 2014) also contributed
    5. Regarding historic team performance based on the model I created (+1 for a win, 0 for a tie, -1 for a loss, plus goal difference), I found the following trends:
      1. Germany got closer to Brazil, who before 2014 was the clear leader
      2. Since Argentina made it to the final, they are now very close to third place, catching up with Italy, which had an awful World Cup and was eliminated in the first round
      3. In the middle tier of teams (based on number of games played since the beginning of the World Cup), the biggest improvement is Chile's
      4. Finally, among the teams that don't make it to the World Cup as often, Costa Rica and Colombia improved dramatically based on their great performance in Brazil 2014
    6. Germany has had an exceptional performance in the last 4 World Cups, making the top 4 in all of them and winning the last one
    7. Germany-Argentina is now the most common final game in the history of the World Cup
    8. Netherlands is the team with the most top-4 finishes without winning a single World Cup.
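    For the curious, the scoring model from point 5 can be sketched in a few lines of code (Python purely for illustration; the actual workbook uses a Power Pivot model, and the records below are made-up numbers, not the real dataset):

    ```python
    # +1 per win, 0 per tie, -1 per loss, with goal difference as a tiebreaker.
    def rank_teams(results):
        """results: list of (team, wins, ties, losses, goals_for, goals_against)."""
        # Score each team, keeping goal difference for the tiebreak.
        scored = [(team, w - l, gf - ga) for team, w, t, l, gf, ga in results]
        return sorted(scored, key=lambda r: (r[1], r[2]), reverse=True)

    table = rank_teams([
        ("Brazil", 70, 17, 17, 221, 102),
        ("Germany", 66, 20, 20, 224, 121),
        ("Italy", 45, 21, 17, 128, 77),
    ])
    ```

    With these illustrative records, the ranking comes out Brazil, Germany, Italy, mirroring the leaderboard the post describes.
    
    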

    Just like last time, here's a live demo of the model for you to play around with, which includes all the data from the 2014 World Cup.

    Thanks for reading / watching and I can't wait to see you all again for the 2018 World Cup!

    Marc Reguera
    Finance Director
    Microsoft
    #power query#Power BI#Power Map#Power View#Excel#live demo#2014 world cup#World Cup

    Wed, 13 Aug 2014 16:00:00 GMT | by Power BI Team | 0 Comments

  • 6 new updates in Power Query - July 2014

    In this post

    Download Power Query Update

    More flexible Load Options

    Query Groups

    Improved Error Debugging Experience

    Additional Query Editor transformations

    Options dialog

    Update notifications

     

    The July 2014 Update for Power Query is now available. You can download it from this page.

    This update is packed with lots of new features, so please take a look at the following video and the rest of this blog post. We hope that you like them!

     

     

    Here is a summary of the new features included in this release: 

    • More flexible Load Options for your queries.
    • Query Groups.
    • Improved Error Debugging Experience.
    • Additional Query Editor transformations:
      • Replace Errors within a column.
      • UX for defining Math operations based on a single column.
    • Options dialog: Restore Defaults & Tips for Cache Management
    • Update notifications: 3 notifications per update (max.), once per day at most.

    More flexible Load Options

    One of the most common areas of feedback about Power Query in the past has been the desire for additional options to control how and where to load queries within your workbook. In this update, we're introducing a new Load Options dialog to customize how to load your queries. In addition to controlling whether a query should be loaded to the worksheet or Data Model, we now offer the option to load to an existing worksheet instead of always loading to a new one. It is also clearer now how to disable the load of a query (or "only create connection" instead of downloading the results), which until now was only possible by disabling both load to worksheet and load to Data Model.

    In addition to new options for how to load your queries, another area of feedback has been the need to have access to these options in all places from which users can load queries to their workbook. To address that, the following entry points have been added in this update…

    • From Search results:

    • From the Navigator pane:

    • From the Query Editor:

    • From the Workbook Queries pane and the contextual Query ribbon tab (to reconfigure the Load Options of an existing query without having to go back to the Query Editor):

    Query Groups

    Query Groups is a new concept introduced in this update that will help users better organize their queries within a given workbook, as well as perform bulk operations on all queries within a group (such as Refresh). Up until now, Power Query offered only a few capabilities in order to organize queries in the Workbook Queries pane, primarily moving queries up and down in the list.

    With this update, users can now select multiple queries using (CTRL + Click) and move them into their custom groups. Users can define as many groups as they want in the workbook, as well as groups within groups to create more advanced organization layers. This enables them to organize and classify the queries better within their workbook. In addition, users can leverage the context menu for each group entry to apply bulk operations to all queries within that group.

    Improved Error Debugging Experience

    In previous Power Query updates, we introduced transformation capabilities within the Query Editor to "Remove Rows With Errors" and "Keep Rows With Errors" in your queries. These features were helpful for discarding all error rows or narrowing down to just the rows with errors in the final result, but they didn't quite help users see these errors in context or understand which rows in the result introduced them. In addition, after loading queries into the workbook, the Workbook Queries pane indicated the total number of rows and the number of errors, but users weren't able to easily preview those errors from this pane.

    With this update, we've turned the "Number of rows with Errors" indicator into a hyperlink which brings up a preview of the rows with errors that users can explore and interact with. This preview also includes the row index to better understand where these errors appear.

     

    Replace Errors

    There are cases in which the way to resolve errors within your data is not to ignore the rows with errors but rather to replace the error values with a default value for the column. We're introducing the ability to do this: a new "Replace Errors" operation is available in the Transform tab inside the Query Editor. This option brings up a dialog asking users for the value to replace errors with for the selected columns.
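    Conceptually, the operation behaves like this sketch (plain Python standing in for Power Query M; representing errors as None or Exception values is an assumption made for illustration, not how Power Query stores them):

    ```python
    # Replace error values in the selected column with a user-supplied default.
    def replace_errors(rows, column, default):
        fixed = []
        for row in rows:
            row = dict(row)  # don't mutate the caller's rows
            # Treat None or an Exception instance as an "error" cell.
            if row[column] is None or isinstance(row[column], Exception):
                row[column] = default
            fixed.append(row)
        return fixed

    cleaned = replace_errors(
        [{"qty": 5}, {"qty": ValueError("conversion failed")}, {"qty": None}],
        column="qty",
        default=0,
    )
    ```

    Here the two bad rows get the default value 0 while the valid row passes through untouched, which is exactly the behavior the dialog configures.
    
    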

    UX for defining Math operations based on a single column

    Very frequently, users need to add new columns to their queries that reference a different column and apply a Math operation. In previous updates, we introduced the "Add Column" tab, which provides several operations that create new columns based on one or multiple existing columns. One limitation until now was that, for Math Standard operations (Add, Multiply, Subtract, Divide, etc.), users were only allowed to select two columns, representing the first and second operands.

    In this update, we've added the ability to select just one column and use these Math operations. Users will be asked to provide the second operand in a dialog, and the result will be a new column added to the query with the selected Math calculation. This is available for all operations under the Standard dropdown menu in the "From Number" group, part of the "Add Column" tab.

    Options Dialog: Restore Defaults & Tips for Cache Management options

    If you followed our last two updates, you may already be aware that we introduced new options for Custom Default Load Settings and Cache Management. This made the number of choices in the Options dialog grow significantly, but there wasn't an easy way to "reset" Power Query to its default behaviors. Well, now there is…  : )

    In addition to the new "Restore Defaults" button, we have added a few tooltips to help users better understand the Cache Management options introduced in our previous update.

    Update Notifications Improvements

    As you may be already aware, Power Query has an Update notification mechanism that tells users about our updates every month. This notification was displayed to the user in the system tray every time that they launched Excel and there was an update available. While this works out great for many users, we also heard from some of them that installing these updates wasn't directly possible for them and they needed to notify their system administrator to perform this update and wait for a few days or weeks. At that point, seeing the Update notification continuously displayed in the system tray every time they launched Excel would become annoying…

    With this update, we've limited the number of times that a user will see the Update notification to three times per update (i.e. three times each month). In addition, we have also limited these updates to only be displayed once per day. We believe that this will establish a good balance between making users aware of our updates while also not reminding them too many times. : )
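    The throttling rule described above can be sketched as follows (a hypothetical Python illustration of the stated policy, not Power Query's actual implementation, which is not documented here):

    ```python
    from datetime import date

    class UpdateNotifier:
        """Show a notification at most 3 times per update, at most once per day."""
        MAX_PER_UPDATE = 3

        def __init__(self):
            self.shown_count = {}  # update version -> times shown
            self.last_shown = {}   # update version -> date last shown

        def should_notify(self, version, today=None):
            today = today or date.today()
            if self.shown_count.get(version, 0) >= self.MAX_PER_UPDATE:
                return False  # already shown 3 times for this update
            if self.last_shown.get(version) == today:
                return False  # already shown today
            self.shown_count[version] = self.shown_count.get(version, 0) + 1
            self.last_shown[version] = today
            return True
    ```

    Launching Excel twice on the same day yields one notification; across four days, only the first three show.
    
    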

    That's it for this update... We hope that you enjoy these new Power Query features. Please don't hesitate to send us a smile/frown or post something in our forums with any questions or feedback about Power Query that you may have.

    Follow these links to access more resources about Power Query and Power BI:

    #update#power query#Power BI Training#Excel#Microsoft BI#data load options#query groups#update notifications#options dialog#transformations#error debugging

    Wed, 30 Jul 2014 16:00:00 GMT | by Miguel.Llopis | 6 Comments

  • Behind the Scenes: How Power BI Handled The World Cup

    In this post

    Importing the Data with Power Query

    Getting the right format with Power Query

    Power Pivot

    Power Q&A

    Crawl, then walk

    This year’s world cup was filled with unbelievable surprises; Miroslav Klose’s record breaking total goals, Germany’s thrashing of Brazil, Spain’s unfortunate crash-out, United States’ resilience through the Group of Death, and moments of Messi magic.  For the stat geek in all of us, Power BI Q&A made it easy to put the game in context and help you make better predictions about the outcome.  If you didn’t have a chance to ask Q&A about the world cup during the tournament, you can still give it a try here.

    We heard from many of you how amazing it was to see the breadth of questions Q&A could answer on the world cup.  If you’d like a glimpse into how we can teach Q&A a whole new sport in a matter of a day, read on!

    In this post we will discuss how we used the breadth of tools available in Excel and Power BI to bring this experience to you including:

    • Power Query to import and format Data
    • Power Pivot to build the massive model that supports the questions being asked
    • Power Q&A to actually answer the questions asked through visualizations

    Importing the Data with Power Query

    We partnered with Opta Sports, a leading provider of historical sports statistics, to give Q&A all of the facts you'd expect a sports statistician to know.  The historical data was delivered as a series of XML files with the statistics for each game organized in folders by year. Each file contained information on the teams, coaches, referee, players, goals, cards, substitutes, and game statistics. This isn't the easiest way to import data, but luckily Power Query has robust support for handling different data formats and folder structures.

    We used the From Folder option in Power Query to import all the data at once. Each row contained the path to an individual XML file. Power Query also has the ability to read in the content of an XML file: under content, clicking on "binary" lets you expand out all the data within an XML file. After deleting and renaming a few columns, you end up with data for each game (see image below). As you can see, the value "Table" appears in a lot of cells. That's because Opta's XML feed was highly nested, but it allowed for easy organization and navigation of the data.
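    For readers who think in code, here is a rough analogue of the From Folder import and expand steps (Python rather than Power Query M; the folder layout, tag names, and fields are hypothetical stand-ins, not Opta's actual schema):

    ```python
    import xml.etree.ElementTree as ET
    from pathlib import Path

    def load_games(folder):
        """Read every XML game file under a folder and flatten it into rows."""
        rows = []
        for path in sorted(Path(folder).rglob("*.xml")):
            root = ET.parse(path).getroot()
            # Each file nests teams/goals under the match element, much like
            # the nested "Table" values Power Query shows before you expand them.
            rows.append({
                "file": path.name,
                "year": path.parent.name,   # year comes from the folder name
                "home": root.findtext("home"),
                "away": root.findtext("away"),
                "goals": [g.get("player") for g in root.iter("goal")],
            })
        return rows
    ```

    The key idea is the same as in Power Query: enumerate files, read each one's binary content, then expand the nested records into flat columns.
    
    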

     

     

     

     

    We had to use a different approach for the 2014 data since it was updated in real time from Opta as the games progressed.  Using the From Web option in Power Query, it was easy to import the latest data from each game and refresh it whenever we wanted.   If you’d like to learn more about how to connect to almost any data source with Power Query, check out Getting Started with Power Query Part I.

      

     

     

    Getting the right format with Power Query

    Once we pulled in the data, we had to create a query for each table we wanted in our data model: games, goals, cards, referees, managers, stadiums, etc. The motivation behind creating a query for each table is that when we load each table to the data model, we can easily create the relational diagram required to make Q&A shine. Most queries involved expanding out nested tables and filtering columns to get one query per table. One piece of functionality we would like to point out in Power Query is the advanced editor, which gives you full control over the transformation of your data. A full treatment of the advanced editor is beyond the scope of this post, but we wanted to highlight the scenario of pivoting tables. In the Power Query editor, under the "View" tab, there is a button to launch the advanced editor (shown below):

     

     

    Chris Webb wrote a great blog post on pivoting tables.
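    To make the pivot idea concrete, here is a minimal sketch in plain Python (the post itself does this with Power Query's advanced editor; the column names and data below are made up):

    ```python
    # Turn long rows of (game, stat, value) into one row per game with a
    # column per stat -- the classic pivot shape.
    def pivot(rows, index, columns, values):
        out = {}
        for row in rows:
            out.setdefault(row[index], {})[row[columns]] = row[values]
        return out

    stats = [
        {"game": "FRA-BRA", "stat": "shots", "value": 17},
        {"game": "FRA-BRA", "stat": "corners", "value": 6},
        {"game": "GER-ARG", "stat": "shots", "value": 10},
    ]
    wide = pivot(stats, "game", "stat", "value")
    ```

    Each game becomes a single record with its stats as columns, which is the shape the data model wants.
    
    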

    After each query was created for the particular tables we wanted, we loaded them to the data model for Power Pivot to consume.

    Power Pivot

    When you load to the data model from Power Query, your tables are automatically added to Power Pivot. All we had to do in Power Pivot was match unique IDs to one another to create relationships, so we knew which players had goals and cards, and which referees, teams, players, and coaches were in a particular game. With the relationships created, we could take this to Power Q&A.
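    In code terms, those relationships amount to key lookups across tables, along these lines (the IDs, names, and column layout are illustrative, not the real schema):

    ```python
    # Match each goal's player_id against the players table, the way a
    # Power Pivot relationship resolves a goal row to a player row.
    players = {101: "Klose", 202: "Messi"}
    goals = [
        {"player_id": 101, "minute": 23},
        {"player_id": 202, "minute": 65},
    ]

    goals_with_names = [
        {"player": players[g["player_id"]], "minute": g["minute"]} for g in goals
    ]
    ```
    
    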

     

    Power Q&A

    Now that we have our relational model created, we optimized our data by going through the steps outlined in the Demystifying Power BI Q&A Series (Parts 1 | 2).

    The model is now ready to be used in Power Q&A. However, we optimized the model further using the new cloud modeling environment to enable a wider variety of questions. The cloud modeling environment is also an easy way to manage synonyms for your columns and tables. Since all the changes are saved to the cloud, everyone reaps the benefits.

    Documentation on how to get started with the cloud modeling environment can be found here. As an example of how to make a wider variety of questions work, we created the phrasing “aggressive team has large average cards per game”. This phrasing allowed us to ask more questions like:

    You can click the links above to see how one good phrasing allows Q&A to give really great answers!

    Crawl, then walk

    As you can see, there’s a lot more to teaching Q&A than just understanding natural language.  Finding the right data, choosing the right tables and relationships, and data modeling all contribute to a great experience in Q&A.  Luckily, the full suite of Power BI tools like Power Query and Power Pivot make it easy to achieve powerful results in a few minutes.

     

    #power query#Power BI#Q&A#Power Pivot#2014 world cup#Brazil#football#soccer#World Cup#futbol

    Tue, 29 Jul 2014 16:00:00 GMT | by Power BI Team | 0 Comments

  • Power Map July Update

    Last month in the June update we focused on quality improvements for Power Map, including enhanced interaction with maps on touch devices. The touch experience should feel smoother, more fluid, and more predictable. Using your feedback, we also improved stability. Thank you for sending in your error reports! 

    This month in the July update we’re adding a new feature called “Automatic Refresh.” This feature makes the Repeat Tour feature we added back in May even better. During tour playback, Automatic Refresh will pull in the latest updates to your data every 10 minutes. When used together, Repeat Tour and Automatic Refresh let you use Power Map to monitor events in near real time. We’re very excited about the new opportunities this opens up, and we hope you are too.

    To enable Automatic Refresh, make sure Repeat Tour is turned on:

    Then press the “…” button when you’re playing your tour to bring up the new tour options menu:

    From the menu click the Automatic Refresh option to toggle it on:

     

    If you find these features useful, please “Send a Smile” in the app and let us know how you’re using them. We hope you enjoy using Power Map. Big things are coming soon, so stay tuned for our monthly updates.

    #Power BI#Power Map#data refresh#Office 365

    Wed, 16 Jul 2014 16:00:00 GMT | by Power BI Team | 7 Comments

  • New in Power BI: Cloud Modeling for Q and A

    In May, the Q&A team released an update to Q&A that enabled a wider variety of natural language questions to be answered, as well as personal customization of certain types of questions. In the new cloud modeling environment, it is now possible to define new terms and resolve any ambiguity in your questions. For example, cloud modeling enabled these questions to bring back meaningful results:

    Which player scored the most unassisted goals per world cup

    Who are the diving medalists from China and what is their medal count

    Which player has received a yellow card and scored 2 goals in the same game

    How does Q&A know what an unassisted goal is? Or that athletes are from a particular country, not just that they compete in a host country? With cloud modeling, it is possible to define what an unassisted goal is and that athletes are from their home country. This update allows you to create new phrasings within your workbook’s language model.

    Before walking through an example, what is a language model? A language model is your workbook’s dictionary. It’s what Q&A uses to know how to understand each row and column and how they are related. Within a language model, phrasings are created to define what words are used to relate different columns in your model.

    Let’s walk through an example to show how creating phrasings through the cloud-modeling environment works.

     

    Getting Started with Cloud Modeling

    1) If you sign in to Power BI, you will be presented with the main dashboard:

     

    2) Make sure to add your workbook with the “Add” button, located next to Documents.

    3) After you add your workbook, click on the (…) located on your workbook’s tile (circled in red in the picture above).

    a) Click “Add to Q&A”

    4) After you add your workbook for Q&A, it is time to launch the language modeling experience. Click on the gear icon in the upper right (circled in green in the picture above) and click “Power BI site settings”.

    5) Click on “Q&A” and it’ll bring you to a screen like this:

     

    6) Click on the (…) and then click “Optimize for Q&A”.

    7) Now you are in the Language Modeling tool and ready to customize your workbook to your liking. It will look like the screenshot below.

     

    On the right-hand side of the screen, you will see a pane with your workbook name on it. The pane lets you create your language model in several ways:

      1. The overview pane gives you a synopsis of what your Language Model contains;
      2. Synonyms are the synonyms you added in Power Pivot (you can add more here if you forgot to add them in Power Pivot);
      3. Phrasings are where you define the entity relationships. This is where the bulk of fine tuning will take place as well as the remaining contents of this blog post; and
      4. Usage shows the utterances that have been typed into the Q&A text box. This is useful for seeing questions that weren’t recognized by Q&A (to know what phrasings to add), questions that didn’t give the most intuitive visualization, and the most popular utterances.

     

    Before making a phrasing, it is important to start from the question you are trying to ask in Q&A. In this example, the question “which player has the most unassisted goals” doesn’t work because Q&A doesn’t know what an unassisted goal is.

    To create the phrasing that will enable that question, we click on the “phrasings” tab.

     

    After clicking on “add phrasing”, you can start typing in what you want to define. Typing in the word “unassisted” prompts a suggestion box to help craft your phrasing. In the World Cup model, I have a column named AssistPlayerName. If the row value is “N/A”, it is an unassisted goal. Therefore, the proper phrasing needed would be the second one in the dropdown box. The full phrasing is shown below.

     

    You can immediately test your phrasing to see if it works.
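    The relationship this phrasing encodes can be pictured as a simple filter over the goals table (a plain Python sketch, not how Q&A evaluates phrasings internally; only the AssistPlayerName column and its "N/A" convention come from the post, the sample rows are made up):

    ```python
    from collections import Counter

    goals = [
        {"scorer": "Messi", "AssistPlayerName": "N/A"},
        {"scorer": "Messi", "AssistPlayerName": "Di Maria"},
        {"scorer": "Klose", "AssistPlayerName": "N/A"},
        {"scorer": "Klose", "AssistPlayerName": "N/A"},
    ]

    # "Unassisted goal" = a goal row whose AssistPlayerName is "N/A".
    unassisted = Counter(
        g["scorer"] for g in goals if g["AssistPlayerName"] == "N/A"
    )
    player, count = unassisted.most_common(1)[0]
    ```

    With the phrasing in place, "which player has the most unassisted goals" reduces to exactly this count-and-rank over the filtered rows.
    
    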

     

    Now that we have walked through how to get started with cloud modeling, we encourage you to read the documentation on the various phrasing types, what types of questions they enable, and how to create them.

     

    Importance of Cloud Modeling

    Cloud modeling allows you to make Q&A work for the kind of language you use in your company and industry.   It is the next step in model optimization, and it follows the optimizations done in Power Pivot (see Demystifying Power BI Q&A Part 1 | 2).

    Before you even create any phrasings of your own, Q&A already generates some phrasings for you so that basic queries will work. For example, if you have sales data with a Customers table that includes State and Customer Name columns, we would automatically generate the phrasing “Customer name is the name of Customer” so that you could ask, “Show Pennsylvania customers”.

    However, if you have another table, Orders, with columns named Product, Price, Date Bought, and Customer ID (relating back to the Customers table), and you want to ask, “What products did Pennsylvania customers buy?”, Q&A wouldn’t know what to do with that question. By adding the phrasing “customers buy products”, Q&A has the context it needs to relate the columns.

    Another example: what if you are the owner of a small bakery and want to decide which items to feature on your seasonal display showcase? You have a database of ingredients and their respective prices. If you want to ask, “show expensive ingredients”, you can create a phrasing to define what expensive means for you! For a small bakery, it could be that ingredients over $26/lb are expensive; for a bigger bakery, ingredients over $40/lb are considered expensive. Depending on the needs of your company, you can tailor Q&A to give you the most pertinent results.
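    The bakery example boils down to a configurable threshold (a Python sketch of the idea; the $26/lb cutoff comes from the text, while the ingredient list and prices are invented for illustration):

    ```python
    # "Expensive" is just a per-business threshold applied as a filter.
    EXPENSIVE_THRESHOLD = 26.0  # dollars per pound; tune this per bakery

    ingredients = [
        {"name": "flour", "price_per_lb": 0.50},
        {"name": "vanilla bean", "price_per_lb": 96.00},
        {"name": "saffron", "price_per_lb": 2500.00},
    ]

    expensive = [
        i["name"] for i in ingredients if i["price_per_lb"] > EXPENSIVE_THRESHOLD
    ]
    ```

    Changing the threshold to $40/lb models the bigger bakery; the phrasing mechanism lets each business bake its own definition into Q&A.
    
    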

    With cloud modeling, it is possible to answer a wide range of questions with Q&A. Q&A is still a ways from being as intelligent as Batman’s batcomputer, but because Power BI is a cloud service, we can rapidly and continually improve the product based on what you need and want from a self-service business intelligence product. Lastly, we hope this blog post has shown how Q&A can be tailored to fit your needs, and we encourage you to try making your own phrasings as explained here.

     
    #Power BI #QA #q and a #cloud modeling

    Tue, 15 Jul 2014 16:00:00 GMT by Power BI Team | 5 Comments

  • Getting Started with Excel and Power BI Series

    Over the next couple of weeks, we'll introduce you to the basic usage of all the latest and greatest features in Excel and Power BI. How do you know if this is the right series for you? Here are a few questions to help you decide:

    • Are you new to Power BI?
    • Did you sign up for the trial but don't know where to start?
    • Have you been using Excel for a long time, but you don't know where to start to use the new BI and visualization features?

    If you answered "yes" to any of them, then this is the right place for you to get started!

    Whether you're an Excel pro or you just downloaded the latest version of Excel, we decided to create this Power BI and Excel Getting Started series to get you up and running with our new set of features and services in no time.

    Our goal is to help you learn about and understand the features in Excel and Power BI by following the natural path of data analysis: getting some data in your hands (it doesn't matter where it comes from), shaping it, creating a model, visualizing your data, and finally sharing and consuming it anytime, anywhere by taking advantage of Power BI in the cloud.

    What should you expect to see in this series?

    The series is organized around that natural path of data analysis and will cover the following features/processes:

    Excel

    Discover and Combine Data

    Model and Analyze

    • Getting Started with Power Pivot

    Visualize

    • Getting Started with Power View
    • Getting Started with Power Map

    Power BI for Office 365

    Share and Collaborate

    • Getting Started with Power BI Sites
    • Getting Started with Power View Forecasting Capabilities
    • Getting Started with Sharing Queries and Data Stewardship
    • Getting Started with Connecting On-Prem data to Power BI

    Ask Questions

    • Getting Started with Q&A

    Access your data anywhere

    • Getting Started with the Power BI App
    • Getting Started with Power BI HTML5

     

    You can always come back to this post to find the latest getting started post or to access past posts of this series.

    Enjoy and please let us know what we can do to make this series better for you!

    #power query #Power BI #Power Pivot #Power View #Excel #Power BI app #Getting Started #power bi sites

    Tue, 08 Jul 2014 18:35:00 GMT by Power BI Team | 3 Comments

  • World Cup: comparing 20 years' worth of data with natural language

    June 26th marked the end of the round robin stage, and we saw some of the world’s best teams crash out early while other teams fought their way to the next round. There is no easy path to winning the World Cup, and we’re in for a treat over the next 16 games.

    Try Power BI Q&A on World Cup 1990-2010 Data

    But how did the 2014 group stage games compare with previous World Cups, and could we use them as an indicator for the games to come? Power BI Q&A investigates and highlights key statistics around the World Cup’s most popular facts: goals, cards, and game performance.

    2014 Performances

     

    Key insights: Of the top 5 winningest teams, Italy and Spain suffered an unfortunate fate. However, they have had similar performances before: Italy in 2002 and Spain in 1998. Their premature exit from the tournament is balanced by a bulk of the middle-tier teams doing really well (e.g. Belgium, Colombia, Mexico, Switzerland, Costa Rica).

     

    Key insight: Looking at the club teams that Spanish players play for, we see that all of them play for the best clubs in the world. These club teams play an average of 45-50 games a year, so by the time the World Cup starts, the players are mentally and physically drained. This graph also points out that there isn’t necessarily a correlation between playing for the same club team (or same league) for the entire year and World Cup performance. The Guardian did a great post on this as well, using a complete data set from FIFA to show national team performance based on club teams.

     

    Key Insight: Scoring a lot of goals helps lead to victory, but at the end of the day, you have to stay consistent and win. Belgium and Croatia both scored 4 goals, but Croatia scored all 4 in a single game. Although Belgium didn’t score many goals, they won every game. One question I have looking at this graph: can Colombia continue their perfect form?

    Goals, Cards, Attendance Statistics

    Key Insight: This year, compared to other World Cups, has been one of goals, not cards.

     

    Key Insight: Adding to the insight found from the above picture, this clearly has been a World Cup of goals, not cards.

     

    Key Insight: Longevity has helped Miroslav Klose tie the record for most goals scored (he has played in 4 World Cups as opposed to Ronaldo’s 3). Also, Messi may be the best player in the world, but he isn’t yet the best in Argentina: Gabriel Batistuta still has the most World Cup goals for Argentina, so Messi needs to continue his form this year. Thomas Muller, the previous World Cup’s top goal scorer, is proving to be a consistently lethal threat.

      

    Key Insight: Although the United States hosted the World Cup back in 1994, it remains the highest-attended World Cup to date. Given the existing infrastructure, the USA has the potential to host another World Cup. Maybe FIFA can bring 2026 to the US.

    Can I Play With This Data?

    When the World Cup is finished, we will make the 2014 data available to play with in Q&A. Currently, we have a live demo experience of the 1990-2010 data that you can try here:

    Try Power BI Q&A on World Cup 1990-2010 Data

    *Note: Data is provided by Opta Sports and covers 1990-2014 statistics.

    #Power BI #2014 world cup #Brazil #football #soccer #World Cup #QA

    Tue, 01 Jul 2014 16:00:00 GMT by Power BI Team | 0 Comments

  • 15 World Cup history facts with Power Map

    This blog post was authored by Igor Peev, a Senior Program Manager who enjoys working on challenging projects that deliver new, innovative ways to analyze data and tell good stories with it. He works on interactive tours to present and play with data in 3D environments.

     

     

    I am not a big soccer fan. I am not very fond of watching TV, but every four years I can’t help it and get into the World Cup. There is something different about this particular competition, something totally awesome about watching teams from every country try their best to win the Champion Title. Perhaps the universal appeal has to do with the fact that in all parts of the world kids play soccer.  I too played when I was really young. I grew up in a small town in Bulgaria and we played straight on the asphalt street, pausing for the occasional car to pass by.  All we needed was a ball and four cardboard boxes to make goals. I bet that millions of others who watch the World Cup can tell you the same story.

    For this tour, I searched for some basic data with Microsoft Power Query. I found a table from Wikipedia (Teams Reaching the Top Four – Results – FIFA World Cup), which I imported and had to normalize. This is what the table looked like initially:

     

    This is how I normalized the above table, granted, there is more than one way to normalize this data: 

     

    I had to do a few other things:

     

    • Convert the Year column contents to date format using the Date function in Excel, like this:   =DATE([@Year],1,1)
    • Convert countries that no longer exist, like Czechoslovakia and Yugoslavia, to countries that do, such as the Czech Republic and Serbia (I sincerely hope I am not going to offend anyone with that choice).
    • Build a story board on some sheets of paper about how I was going to make the tour.
    • As an intro I made a blank scene with the globe, no data, no text boxes, just the globe:
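    The Year conversion in the first bullet can be sketched outside Excel too. A rough Python equivalent of =DATE([@Year],1,1), using a few sample years for illustration:

```python
from datetime import date

# Mirror the Excel formula =DATE([@Year],1,1): each tournament year
# becomes January 1 of that year, a value Power Map's Time well can use.
years = [1930, 1950, 1994, 2014]
dates = [date(year, 1, 1) for year in years]
print(dates[0].isoformat())  # 1930-01-01
```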

     

    Then I copied that scene and put a text box in it. That way the video will start with a blank globe and then the title appears a couple of seconds later. 

     

    The following scenes tracked each logical part of the script, which told a story from the data above. The story followed the original table I imported using Power Query and the Wikipedia article on the FIFA World Cup. I wanted to show all countries that have been and are in the World Cup. I made a scene for that, using Category to shade the regions with a Region chart and a layer based on the Country column.

    After doing this, I made a scene about the first winners in 1930. That scene was shot from the bottom of South America looking north.

    Then I made a scene with all of the countries that have won the World Cup Championship Title. There are 8 of them, some in South America and some in Europe, so I positioned the camera over Argentina and set a ‘Fly over’ effect in the scene so that the motion is towards Europe. That way, I get a smooth transition to the next scene, which showed all countries that have won the top three spots in the last 10 World Cup tournaments. I knew from the data in the original table that a lot of these countries were in Europe, so I positioned the scene such that Europe is most visible, but you can see South America too.

     

     

    Later, I added another effect and set the Year in the Time well to play in accumulation.

    Adding more time to each scene (at least 10 seconds) and more time to each transition in Scene Options definitely makes for a better, smoother-moving tour.

    I made all other scenes about the European winners and added shaded regions for each country using the Region Chart type. I put scenes in the same geo locations together so that the camera does not have to travel back and forth too much. As a rule, I position and move the camera in each scene so that it always moves towards the beginning point of the next scene as much as possible. In that manner, the engine which applies the effects and arcs in transition works on both things at the same time and does better interpolation between adjacent scenes.

    I chose a theme with fewer colors so that the visuals stand out, and used monochrome colors for the shading of countries. The final scenes showed all countries that have ever scored a spot in the top 4, over time. I used the ‘First place’ and ‘Second place’ columns in the Height field, set to Count Not Blank. For the very last scene I removed the Year from the Time well and added a text box saying ‘Good luck to all teams’, plus a few other things.

    Enjoy!

    #power query #Power Map #World Cup #wikipedia

    Mon, 30 Jun 2014 16:00:00 GMT by Power BI Team | 0 Comments

  • 6 new updates in Power Query - June 2014

    In this post

    • Download Power Query Update
    • What's New in Power Query
    • Support for “Whole Number” column type
    • Additional transformations in Query Editor ribbon
    • Aggregate Column dialog
    • Additional Date/Time filters
    • Improved Navigator experience: Refresh Tree & Function Invocation
    • Cache Management options

    What’s New in Power Query

    We are very happy to announce the availability of a new Power Query update. You can download the update from this page.

    There are quite a few new or improved features in this release, as well as several quality fixes. You can watch the following video or read the rest of this blog post for more details.

    Support for “Whole Number” column type

    Data Type menus in the Home & Transform tabs in the ribbon, and the Change Type context menu on preview columns, now let users set a column type to Whole Number. The previously supported type (Number) has been renamed Decimal Number in the UX. This provides better alignment with Data Model numeric types, which was a recurring piece of feedback from our customers.

     

    Additional Transformations in Query Editor ribbon

    A few new transformations have been added to the Query Editor ribbon under the Home & Transform tabs. The motivation is to increase discoverability of existing capabilities that customers were having trouble finding without UX support.

    -          Home tab:

    • Keep Bottom Rows: Remove all rows from a table except the last N rows.
    • Keep Errors: Keep rows with errors in the selected column(s). Note that there will be another improvement in the next SU geared towards providing a better experience to identify rows with errors in the output of a query.

    -          Transform tab:

    • Count Rows: Return the number of rows of a table.
    • Unpivot Other Columns: Unpivot all columns in a table except the currently selected ones.
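    As a rough sketch of what “Unpivot Other Columns” does (plain Python over an invented wide table, not Power Query’s actual M implementation): the selected columns stay fixed, and every other column becomes an Attribute/Value pair on its own row:

```python
# Hypothetical wide table: one row per team, one column per year.
rows = [
    {"Team": "Brazil", "1994": 11, "1998": 14},
    {"Team": "Italy", "1994": 8, "1998": 7},
]

def unpivot_other_columns(table, keep):
    """Turn every column not in `keep` into an Attribute/Value row."""
    out = []
    for row in table:
        for col, val in row.items():
            if col in keep:
                continue
            record = {k: row[k] for k in keep}
            record["Attribute"] = col
            record["Value"] = val
            out.append(record)
    return out

result = unpivot_other_columns(rows, keep={"Team"})
print(result[0])  # {'Team': 'Brazil', 'Attribute': '1994', 'Value': 11}
```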

     Aggregate Column dialog

    Similar to the dialog we added for “Expand Column” in the previous release, this month we have added a dialog for “Aggregate Column”. The goal of this change is to improve discoverability of the operation by exposing it via the Editor ribbon, as well as to enable editing Aggregate steps later (via the Edit Settings button next to the step).

    Additional filters for Previous/Next “N” days/weeks/months/quarters/years

    Previously, Power Query only supported filtering by Last or Next Day/Week/etc., without letting users customize “N”. In this release, additional filters have been added to the Filter dropdown menu: users can now specify the value of “N” as well as the unit of time they want to use for their filter.
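    A “previous N days” filter is equivalent to a simple date-range check. A minimal sketch in plain Python (the dates are invented for illustration; in Power Query you do this through the Filter dropdown rather than code, and this sketch assumes “previous N days” excludes today):

```python
from datetime import date, timedelta

# Hypothetical order dates.
order_dates = [date(2014, 6, 1), date(2014, 6, 20), date(2014, 6, 25)]

def in_previous_n_days(dates, n, today):
    """Keep dates falling within the previous N days, exclusive of today."""
    start = today - timedelta(days=n)
    return [d for d in dates if start <= d < today]

# "Previous 7 days" as of 26 June 2014 keeps the 20th and the 25th.
print(in_previous_n_days(order_dates, 7, today=date(2014, 6, 26)))
```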

     

      

    Improved Navigator experience: Refresh Navigator items & Function Invocation experience

    The Navigator task pane now exposes “Invoke” for functions. This lets users directly invoke a function instead of having to load the function into the workbook and invoke it from there. This is available for functions in Azure Marketplace, Databases and OData feeds.

     

    Also, the Navigator task pane now has a Refresh button to refresh the contents of the tree as needed.

    Cache Management options

    The Power Query Options dialog now includes a new section about Cache Management. This section lets users check the amount of memory used by the PQ Cache, delete all cache items and configure the max size allowed for this cache.

     

    That’s all for this update! We hope that you enjoy these new Power Query features. Please don't hesitate to contact us via the Power Query Forum or send us a smile/frown with any questions or feedback that you may have.

    Follow these links to access more resources about Power Query and Power BI:


    #power query #Excel #Query Editor #column dialog #cache management #refresh #date time filters #column type

    Thu, 26 Jun 2014 16:00:00 GMT by Power BI Team | 8 Comments
