Feed aggregator

NVIDIA driver and Windows 10

MSDN Blogs - Tue, 11/24/2015 - 10:28

This is partly a reminder for me, and maybe something that can help others too.

My computer was recently upgraded to Windows 10, and when that happened it decided to use the Microsoft Basic Display Adapter instead of my legacy NVIDIA Radeon 4800 driver. The resolution was wrong and my second monitor was not detected.

When looking at my display adapter settings in Device Manager, the adapter did show up correctly, so that wasn't the issue. If your adapter is not showing, I've heard you can uninstall it to let the Plug and Play manager detect it again, or update the driver and choose your driver's .inf file. But that wasn't my problem, so here's what I did:

  1. Right-click the display adapter and choose Properties.
  2. Click Update Driver.
  3. Choose "Browse my computer for driver software".
  4. Choose "Let me pick from a list of device drivers on my computer".
  5. From the list presented, select the previous version of the driver.

That worked for me: the correct resolution was restored and my second monitor was detected.

Sign-in issues with Visual Studio Team Services - 11/24 - Investigating

MSDN Blogs - Tue, 11/24/2015 - 10:04

Initial Update: Tuesday, 24 November 2015 10:30 UTC

We are actively investigating issues with VS Team Services AAD-backed accounts. A subset of customers accessing AAD-backed accounts in VS Team Services may experience slow sign-in.


We are working to resolve this issue and apologize for any inconvenience.

Next Update: Before 20:00 UTC

VS Online Service Delivery Team

Power BI mobile apps update - November 2015

MSDN Blogs - Tue, 11/24/2015 - 09:30

We’re excited to announce the Power BI Mobile apps November update.
This month’s update includes new capabilities as well as stabilization and quality improvements.

New this month

New dashboard actions menu (iPhone)

The new dashboard actions menu puts all dashboard actions into one accessible place. Use this new menu to easily invite colleagues to share dashboards, edit favorite tiles on this dashboard, and even add a picture tile.

Enrich your dashboards with picture tiles directly from your iPhone

Imagine being able to take the whiteboard from a board review and pin it to your dashboard, or adding a picture of your factory machine gauges when visiting the production floor. Now you can.

We are happy to introduce new functionality on the iPhone: adding a picture tile. With this capability you can use your mobility to enrich your dashboards with pictures from your phone's gallery or camera. The picture you select is added to your dashboard as a new tile, and the new tile will be available to anyone viewing the dashboard on any platform.

To add a picture tile, open one of your dashboards in the Power BI iPhone app (make sure you have edit permissions for the dashboard).

Landscape (iPhone app)

Enjoy landscape mode with the Power BI iPhone app – see more details in this blog.

New welcome experience (all Power BI mobile apps)

New to the Power BI mobile app? Get to know Power BI using our new and improved welcome experience.

The new experience outlines Power BI key functionality followed by full in-app access to samples.

It’s well known that the best way to learn something is to experience it for yourself. Explore the app’s capabilities and understand its value to you even before signing in.


Improved chart data capacity (all Power BI mobile apps)

We have improved the data capacity of charts on mobile. Previously, charts with many data points on the X axis were truncated; not anymore.

Direct links for dashboard tiles (Power BI for Windows app)

Navigate directly from mobile tiles to a specific URL or to another dashboard from your Windows app. You add links to the tiles while in the Power BI service. Then you can follow them in the Windows app.

Improved internal app browsing (Android app)

We improved the internal app browser to display full URLs, and made activating the internal browser under settings a better experience.

Want to start your own gaming studio? First stop - Imagine Cup!

MSDN Blogs - Tue, 11/24/2015 - 09:00


John Brengman, leader of Team Radication, presenting their project Sanitarium at the Dundee Science Festival in Scotland.

Team Radication Games competed in the UK Imagine Cup 2015 with their unique game, “Sanitarium.” Though they were beaten out of the competition by another team (from their own university!), the Radication team has won a boatload of other awards – including third place at the Microsoft Azure Cloud Gaming Innovation Challenge 2015 and a gold medal at the International Serious Play Awards 2015. They were also nominated for Games with Purpose at the TIGA Game Industry Awards!

In “Sanitarium,” the player is a doctor trying to diagnose and treat tuberculosis (TB) patients. The user chooses from several hot spots for TB shown on a world map. There, the player works on mini-games to get the resources they need to treat patients. 


One of the mini-games a player must win to acquire resources for treating patients. Here, the player must examine a patient’s lungs by finding cavities. This particular mini-game is a favorite with eight-year-olds, John Brengman told us.

The team is also working on incorporating real-world money. For instance, a player might have to use real money to pay for certain treatments in the game, but Radication plans to donate that money to charity.

Sanitarium is one of several frontrunners in a new movement known as Games for Good. John Brengman, Radication’s gregarious producer and spokesperson, explained, “While some games out there are combining fun and education, we are doing it all - the charity aspect, the fun and the medical education.”

But the game didn’t start out with this lofty goal in mind.

Abertay University in Dundee, Scotland, was approached by both Microsoft and the University of St Andrews to build a game. Microsoft wanted a game developed for the Windows 8 mobile platform as well as a game using Azure cloud tools. St Andrews wanted a game that would show the treatment of tuberculosis from the point of diagnosis through treatment. They also wanted the team to use a cutting-edge mathematical model they had developed that mimics the symptoms and pathology of TB. The model was used in their most recent drug trial and proved a success.

Ten students from Abertay took the bait and formed a team to complete their third-year professional project. Thus, Team Radication Games was born.

Even though “Sanitarium” started as an assignment, John tells us that the team had no problem getting into it and really making it their own. Despite the tall order of making a game for such a broad audience (medical students and doctors, gamers and enthusiasts, philanthropists and researchers), the team had the first release completed in only three months.

The game also has the potential to disrupt the way drug trials and clinical research are conducted. It’s possible that other mathematical models could be plugged into Sanitarium. John said, “We could give the game’s data to the doctors for analytics. We could build that data into the game, and in that way, we can do a simulated drug trial. They could put them into hundreds of combinations!”

In an exciting twist of perfect timing, the school deadline coincided with the Imagine Cup deadline for submission.

One of the UK’s Microsoft technical evangelists, Lee Stott, saw a presentation the team delivered to the University of St Andrews Infection Group. He was so impressed that he suggested they enter Imagine Cup. They did and made it all the way to the UK National Finals.

John said, “Imagine Cup was so motivating that we decided to leverage more competitions to keep Project Sanitarium on the move. Losing doesn’t mean it’s over. We just saw it as a deadline and pushed ourselves to enter another and then another contest.”

No doubt one of the many reasons Sanitarium has been so successful is the team’s extensive use of cloud services. They knew they’d use Azure to host the leaderboards, but then they realized something.

John explained, “If 10,000 users play the game and any user can have up to 50 patients, that would wear out the hardware they are playing on! So we moved the model to Azure so as not to tax the hardware.”

In short, John added, “Using Azure means you’ll never have to turn away users.” 

Not content with where they ended up in their first Imagine Cup, Team Radication Games plans to enter the competition again.

John explained, “Not winning Imagine Cup didn’t feel good. We believe in our game and want to go for it again! Not just to win, though. Imagine Cup puts you right in front of people looking for the best game developers!”

In fact, because of their time at Imagine Cup, Radication Games attracted the attention of another Microsoft evangelist, Elena Branet. She was so impressed with how Radication used Azure that she suggested they enter the Azure Cloud Gaming Innovation Challenge, and they took home third place.

Imagine Cup is a great jumping-off point for starting a business, especially for gamers. Have you always dreamed of owning your own gaming studio? Microsoft provides the opportunities, exposure, people and tools you’ll need along the way. Enter the Imagine Cup Gaming Competition and start making your game today!

Also, if you are ready to tap into the power of the cloud like Radication Games, be sure to check out the Azure student offer.

Team Radication Games is:

Kirsty Fraser, Analytics Programmer (bottom left with glasses); Akos Demuth, Lead Designer (bottom middle with glasses); Maz Magzoub, Composer (middle right); Adam Harrison, Programmer (middle left with glasses); James Warburton, Lead Programmer (back left); Michael McLean, Graphics Design (back middle left); Do Jin Choi, Artist (back middle right); Chris Box, Lead Artist (back right); John Brengman, Producer (not pictured)


Creating an Azure IoT Device Explorer in node.js, express and jade

MSDN Blogs - Tue, 11/24/2015 - 08:25

After playing a bit with Azure IoT Hub and building a webcam system with a Raspberry Pi 2 running Linux in my previous article, I decided to keep developing in node.js and build a simple equivalent of the Device Explorer. I’m not a node.js expert, so there may be more efficient ways to write some of this code.

Code is available on GitHub; the code there includes more than what is explained in this article.

Setting up the dev environment

I use Windows as my development environment. Visual Studio 2015 supports node.js development and debugging, so first I needed to set up Visual Studio 2015:

  • You can get Visual Studio Community 2015 for free here.
  • Then you need the free Node.js Tools for Visual Studio; download them here.
  • And of course you need to install the node.js framework itself from here.

Once those three steps are done, you’re good to go!

Creating a node.js project in Visual Studio

This is quite straightforward: there is a new node.js project type. I created a “Start Node.js Express 3 Application” project. It comes with a simple MVC project containing a couple of example pages, which makes it easy to learn and understand how everything works.

Views use jade. This is the part I had the most difficulty with. I recommend reading the Language Reference as well as the examples. The most important thing to keep in mind is that indentation must be respected: indentation is what creates groups and drives the code logic. Once you get that, the rest is quite easy and really nice to use.


It will create a full web site. I recommend creating TypeScript files rather than JavaScript: this gives better tooling support and maintainability over time. The JavaScript files are generated at compile time, and you can choose which version of JavaScript to generate.


Listing devices

The idea is to have a page that lets you enter the connection key:

Once the key is entered, it lists the devices plus their keys and a couple of other properties (in reality, the keys are displayed instead of the blue boxes):

So to build this, we will need to:

  • Create a view, which means adding a jade file in the views directory
  • Add a function to handle requests for the page
  • Add a route so that traffic is correctly directed to the page

I will first explain the principle of how views and code work together.

Adding the view

Right-click on Views, then Add and New Item…

Select jade file and create one called devices.jade. Let’s start by replacing the generated content with this code:

extends layout

block content
  h2 #{title}
  h3 #{message}

  p please enter your connection string like HostName=XXX;SharedAccessKeyName=iothubowner;SharedAccessKey=XXX
  form(name="connection", action="/devices", method="post")
    input(type="text", name="constr")
    input(type="submit", value="Connect")

In jade, this creates a page where #{title} is replaced by the default text rendering of the title object passed to the page. It can be a string or any object, which lets you manipulate those objects; we will see later how to do that.

The second part creates a form which posts the value of the text box to the page named /devices.

Adding a function to handle the request

In the routes/index.ts file, add this code:

export function devices(req: express.Request, res: express.Response) {
    var cnxstr = req.body['constr'];
    res.render('devices', { title: 'Devices', message: 'this is the connection string: ' + cnxstr });
}

This simple code finds the value of 'constr' sent by the POST form and renders the page, passing the info back in the message object. We still don't have everything: the function has not yet been declared as a route for the /devices page.

Adding the route for a page

Find the file app.ts and add these lines after the line “app.get('/', routes.index);”:

app.get('/devices', routes.devices);
app.post('/devices', routes.devices);

Basically, the function we just wrote is now linked to GET and POST requests on the /devices page.

Putting it all together: press F5, go to the /devices page, and you'll be able to fill in the text box, send the data to the page and see what you posted. This is a very basic example, but it shows how express and jade work together.

Requesting Azure IoT Hub devices list in node.js

You’ll need to add the 'azure-iothub' module. Visual Studio makes this super easy: just right-click on “npm” in your project and select “Install New npm Packages…” (you can also run npm install azure-iothub from a command prompt).

This will open this window where you can search for the packages.


Here is the code to request all the devices present in the IoT hub. It is quite straightforward, as the node.js SDK is really nicely done:

var iothub = require('azure-iothub');
var cnxString = '';

// Note: 'year' is assumed to be defined elsewhere in routes/index.ts;
// it is passed to the views throughout this article.
export function devices(req: express.Request, res: express.Response) {
    var cnxstr = req.body['constr'];
    if ((cnxstr != undefined) || (cnxString)) {
        // reuse the stored connection string when none was posted
        if (cnxString) {
            if ((cnxstr == undefined) || (cnxstr == ''))
                cnxstr = cnxString;
        }
        var registry = iothub.Registry.fromConnectionString(cnxstr);
        registry.list(function (err, deviceList) {
            if (!err) {
                cnxString = cnxstr;
                res.render('devices', { title: 'Devices', year, message: 'Getting list of devices', devicelist: deviceList });
            } else
                res.render('devices', { title: 'Devices', year, message: 'Error getting list of devices', devicelist: null });
        });
    } else
        res.render('devices', { title: 'Devices', year, message: 'Please give a valid connection key', devicelist: null });
}

The first part of the code checks whether a connection string has already been provided; if so, it is reused. The two key lines are:

var registry = iothub.Registry.fromConnectionString(cnxstr);
registry.list(function (err, deviceList) {

This returns an array of devices in deviceList. If there is no error, it is passed to the devices view. Very straightforward.

Modifying the jade view to render devices list

Go back to the devices.jade file and add the following code:

  table
    tr
      th DeviceId
      th Prim Key
      th Sec key
      th Last upd
      th Status
      th Msg waiting
    if devicelist
      each device in devicelist
        tr
          td
            a(href='/devicedetail/' + device.deviceId) #{device.deviceId}
          td #{device.authentication.SymmetricKey.primaryKey}
          td #{device.authentication.SymmetricKey.secondaryKey}
          td #{device.lastActivityTime}
          td #{device.status}
          td #{device.cloudToDeviceMessageCount}
  a(href='/adddevice/') Add a new device

This part took me quite a lot of time, because of jade and the way indentation works. It is really important to respect it: the alignment drives the behavior of what gets generated. The good news is that you can add logic to the jade file, such as testing whether you have a devicelist object and looping over it with each. The code generates a simple table containing some of the device properties. I also created a page for device details, as well as a page to add a new device.

This code renders exactly as in the screen capture from the first part of this article. Now let's see how to generate the details page for a device, as well as how to create a new device.

Listing device properties

Similar to the previous part, add a devicedetail jade file. Here is the code, very similar to the previous page, generating the details in a table:

extends layout

block content
  h2 #{title}
  p #{message}

  table
    tr
      th Details
      th Values
    tr
      td Device Id
      td #{device.deviceId}
    tr
      td Primary Key
      td #{device.authentication.SymmetricKey.primaryKey}
    tr
      td Secondary Key
      td #{device.authentication.SymmetricKey.secondaryKey}
    tr
      td Last Activity
      td #{device.lastActivityTime}
    tr
      td Generation Id
      td #{device.generationId}
    tr
      td Messages waiting
      td #{device.cloudToDeviceMessageCount}
    tr
      td etag
      td #{device.etag}
    tr
      td Status
      td #{device.status}
    tr
      td Status Reason
      td #{device.statusReason}
    tr
      td Connection State
      td #{device.connectionState}
    tr
      td connectionStateUpdatedTime
      td #{device.connectionStateUpdatedTime}
    tr
      td statusUpdatedTime
      td #{device.statusUpdatedTime}

We’ll need to create a function that will handle the request and return the device object:

export function devicedetail(req: express.Request, res: express.Response) {
    var devId = req.params.deviceId;
    var strcnx = getHostName(cnxString);
    if (strcnx == '') {
        res.render('devicedetail', { title: 'Device detail', year, message: 'Error getting device details. Connection string was: ' + cnxString + ' and deviceId: ' + devId });
        return;
    }
    strcnx += ';DeviceId=' + devId;
    var registry = iothub.Registry.fromConnectionString(cnxString);
    registry.get(devId, function (err, device) {
        if (!err) {
            strcnx += ';SharedAccessKey=' + device.authentication.SymmetricKey.primaryKey;
            res.render('devicedetail', { title: 'Device detail', year, message: 'Those are the device details. Connection string: ' + strcnx, device: device });
        } else
            res.render('devicedetail', { title: 'Device detail', year, message: 'Error connecting' });
    });
}

function getHostName(str) {
    var txtchain = str.split(';');
    for (var strx in txtchain) {
        var txtbuck = txtchain[strx].split('=');
        if (txtbuck[0].toLowerCase() == 'hostname')
            return txtchain[strx];
    }
    return '';
}

As you'll see in the jade page, the link to the page is /devicedetail/name_of_a_device. In order to catch it, we need to declare it in the route, which makes the value available through req.params. I named the parameter deviceId. Add this line to the app.ts file:

app.get('/devicedetail/:deviceId', routes.devicedetail);

The first part of the code checks that we have a hub connection string and builds the device-specific one; then it gets the device and returns it. As you sometimes need the device connection string, that string is built and returned as well. As a result, you'll get a detailed page like:

These properties are available for every device. The status shows whether the device is allowed to connect; if not, a reason (128 characters max) is displayed in Status Reason. If you send messages to your device, you'll also see whether messages are waiting.

Adding a device to Azure IoT hub

Very similar to the previous part, we'll just add a jade file, adddevice. Here is the code:

extends layout

block content
  h2 #{title}
  p #{message}

  p please enter your device name
  form(name="adddevice", action="/adddevice", method="post")
    input(type="text", name="deviceId")
    input(type="submit", value="Add")

Simple code, very similar to the first example. The main function code is quite easy as well:

export function adddevice(req: express.Request, res: express.Response) {
    var devId = req.params.deviceId;
    if (devId == undefined) {
        devId = req.body['deviceId'];
        if (devId == undefined) {
            res.render('adddevice', { title: 'Add device', year, message: 'Error, no device ID' });
            return;
        }
    }
    if (cnxString == '')
        res.render('adddevice', { title: 'Add device', year, message: 'Error, no connection string' });
    else {
        var registry = iothub.Registry.fromConnectionString(cnxString);
        // create a new device
        var device = new iothub.Device(null);
        device.deviceId = devId;
        registry.create(device, function (err, deviceInfo, response) {
            if (err)
                res.render('adddevice', { title: 'Add device', year, message: 'Error creating device: ' + err.toString() });
            else if (deviceInfo)
                res.render('adddevice', { title: 'Add device', year, message: 'Device created ' + JSON.stringify(deviceInfo) });
            else
                res.render('adddevice', { title: 'Add device', year, message: 'Unknown error creating device ' + devId });
        });
    }
}

All up, the first part of the code just checks whether there is a device name, supplied either through GET as a URL parameter or through POST. The second part is about adding the device:

var registry = iothub.Registry.fromConnectionString(cnxString);
// create a new device
var device = new iothub.Device(null);
device.deviceId = devId;
registry.create(device, function (err, deviceInfo, response) {

Again, very simple and straightforward: we connect to the Azure IoT Hub registry, create an empty device, set the deviceId and ask for creation.

Don't forget to add the routes as well. The "?" in "/adddevice/:deviceId?" makes the deviceId parameter optional.

app.get('/adddevice/:deviceId?', routes.adddevice);
app.post('/adddevice', routes.adddevice);

Once created, if there is no error, the device is sent back. I just convert it to JSON and display it in the message:

You can write very similar code to delete a device. You can also send a message to the device and monitor the results. I have to say the Azure IoT SDK for node.js is really great and works perfectly. And please note that this website can be deployed on Azure, Windows, Linux or anything else that can run one of the latest node.js versions (see the restrictions in the Azure IoT SDKs on GitHub here).

More examples on my GitHub! Enjoy and feedback welcome.

How to proactively avoid parameter sniffing step-by-step

MSDN Blogs - Tue, 11/24/2015 - 08:24

In the following blog post, so-called “parameter sniffing” is explained:

The purpose of this blog post is to explain the fix implementation steps in a little more detail:


The kernel-only hotfixes mentioned for MS Dynamics AX in the blog post above are minimum prerequisites. Because all kernels are cumulative, I'd recommend installing the latest kernel, which you can always find here:

If you like, you can find more useful information about kernel updates in general here:

As with every kernel installation, please be aware of the following:

- The following steps should be tested in a test environment before being applied to the production environment, to make sure the updated kernel works as expected in your specific, customized environments.

- If you update the kernel, all AX clients and all AOS servers have to be updated consistently per environment.

1) Make sure you have downloaded the latest kernel.

2) Install the kernel by running axupdate.exe, which comes with the kernel hotfix itself. (Note: or take the latest axupdate.exe from LCS if you run AX 2012 R3 CU8 or higher.)

Please also take the following hints into consideration, both in general and depending on your MS Dynamics AX version:

3) After the kernel installation, please make sure that all of your AOS services are stopped.

4) Run the following T-SQL query against your MS Dynamics AX transaction database using SQL Server Management Studio, depending on your MS Dynamics AX version:

AX 2009 SP 1:

AX 2012 R1 (=RTM)

AX 2012 R2 and AX 2012 R3

5) Restart the corresponding SQL Server service.

6) Restart all your AOS services.

Now your AX clients are ready to log in again and to test as planned.

ReplTip - Publishing 1 Article into 2 Publications bloats Distribution DB

MSDN Blogs - Tue, 11/24/2015 - 08:19

Chris Skorlinski – Microsoft SQL Server Escalation Services

While visiting a customer site, we discussed the consequences of publishing common articles\tables into multiple Publications. For this customer, each Publication contains the same set of core or common tables used by all subscribers, but some subscribers had tables unique to just that subscriber. For example, all Publications contain the Customers table, while one Publication\Subscription adds Sales information and a second Publication\Subscription contains Product Feedback data. The Customer\Sales Publication goes to one subscriber while the Customer\Feedback Publication goes to a different subscriber.

Using 2 Publications to 2 different subscribers is a technique to scale out reporting workload, as each subscriber can have tables and non-clustered indexes unique to its specific workload. Reducing the number of non-clustered indexes at a subscriber can dramatically improve Distribution Agent performance by lowering the IO required for data updates, as fewer indexes are updated and maintained.

Okay, so what's the catch?

The catch is that the LogReader will create an entry in the Distribution database for the same table for each Publication to which it belongs. In the example below I published the AdventureWorks Customer table in 2 different Publications, then changed 1 row. As you can see, there are 2 rows in the Distribution database, one for each Publication.



Article 1: {CALL [sp_MSupd_SalesLTCustomer] (,,,N'Chris',,,,,,,,,,,,1,0x0800)}

Article 2: {CALL [sp_MSupd_SalesLTCustomer] (,,,N'Chris',,,,,,,,,,,,1,0x0800)}


Now take that Customer table and push through 100 million changes per day × 2 publications, and you have 200 million commands to move from the LogReader to the Distribution database and 200 million commands needing cleanup once delivered. Multiply this by other "common" tables and more publications, and you quickly have over 800 million commands moving through the Replication environment, bloating the Distribution database.


Alternative Designs – Common Tables Publication

There are 2 different solutions to reduce Distribution database bloat. First, publish tables only once by moving all the "common" tables into 1 publication for all subscribers, then create separate publications for the tables unique to each subscriber. In the example above we'd have 3 publications. At the individual subscribers, create the subscriber-specific non-clustered indexes once the initial snapshot has been delivered.

Sales subscriber         (CommonCustomer and Sales publications)
Feedback subscriber         (CommonCustomer and Feedback publications)

Alternative Designs – One Publication

Another solution is to create 1 publication that contains ALL tables and push it to ALL subscribers. Yes, the Sales subscriber will now contain the Feedback table and vice versa; however, if the extra tables used by the other subscriber are small and the number of data changes is minimal, the overhead of the other subscriber's tables is not as great as the distribution database bloat problem. This also has the added benefit of providing data redundancy at the subscribers, allowing either application to use either subscriber should the need arise.

Alternative Designs – One Publication – Selective Subscriptions

Sounds good, but say I still have a large "Sales" table not needed by the Feedback subscriber? As a variation on the one-publication approach, configure the subscriber to subscribe to a subset of articles within the common publication. By default the subscriber gets @article = 'all'; instead, via scripts, execute sp_addsubscription selecting which tables go to which subscribers.

exec sp_addsubscription ...@article = 'all'


-- Sales Subscriber

exec sp_addsubscription @publication = 'SalesAndFeedback', @subscriber = 'Sales',@article = 'Customers'

exec sp_addsubscription @publication = 'SalesAndFeedback', @subscriber = 'Sales',@article = 'Sales'


-- Feedback Subscriber

exec sp_addsubscription @publication = 'SalesAndFeedback', @subscriber = 'Feedback',@article = 'Customers'

exec sp_addsubscription @publication = 'SalesAndFeedback', @subscriber = 'Feedback',@article = 'Feedback'


If publishing using the default "concurrent snapshot", you'll get the error below when subscribing to selective articles.

Msg 14100, Level 16, State 1, Procedure sp_MSrepl_addsubscription, Line 572

Specify all articles when subscribing to a publication using concurrent snapshot processing.


This "selective subscriptions" capability does introduce special consideration when executing sp_addpublication that may not fit all situations. For example, @sync_method supports "database snapshot" or "native" and @immediate_sync feature must turned off.

exec sp_addpublication @sync_method = 'database snapshot' or 'native' ...@immediate_sync = N'false' ... @allow_anonymous = N'false'


I hope these options help you explore Transactional Replication design options to reduce Distribution database bloat.

[Channel9 Series] Office 365 APIs Webinars (Arabic)

MSDN Blogs - Tue, 11/24/2015 - 08:10

Are you an Arabic speaker? Interested in attending or watching a series of webinars on O365 development in your local language?

Here you go, Office 365 APIs Webinars!


"The most strategic developer surface area for us is Office 365" - Satya Nadella.

With 1 billion Office users around the globe and more than 400 petabytes of data hosted in the Office 365 service, now is the time to get started with O365 app development!

This series is a set of online webinars walking you through the O365 development journey, mainly targeting access to Office 365 data from web apps or native applications using the MS Graph API (also known as the Unified API).


0/ Overview of O365 Development

1/ Authentication

2/ Deep Dive into Azure AD and Office 365 APIs

3/ Light up Azure Mobile Apps with Office 365 APIs

Setting Expectations

• New to O365 Dev? No problem.
• Target audience: Developers
• Suggested prerequisites: Familiarity with REST, C#, Azure AD and Azure Mobile Apps

Webinars Schedule - In case you want to attend it live

0/ Overview of O365 Development

1/ Authentication

2/ Deep Dive into Azure AD and Office 365 APIs

3/ Light up Azure Mobile Apps with Office 365 APIs

New to Skype Meetings? See how to join

Channel9 Series - In case you want to watch the recordings

Series Link:





Slide Decks

You will find the slide decks of the 6 modules here: 


Kindly share with your community.

For post alerts and more updates, please feel free to follow me on Twitter @_SherifElMahdi

Pin a range from Excel to your dashboard!

MSDN Blogs - Tue, 11/24/2015 - 08:00

Excel has always been an honored member of Power BI, as one of the most popular ways to bring data into Power BI.

We are excited to announce that you can now use your existing Excel workbooks, with the features and formatting you have learned to use and love over the years, to create dashboard tiles from within an embedded Excel workbook in Power BI.

You can select ANY range from a workbook that you uploaded to Power BI, and pin it as a tile to a dashboard:


Figure 1: An Excel workbook viewed inside Power BI, with range B14:H39 (containing a chart) selected.

Click the “Pin” button on the top right of the menu bar to pin the selection to the dashboard.

You can select ranges containing charts, tables, PivotTables, PivotCharts and many other Excel parts, and pin them to your dashboard to create beautiful dashboards like this:


Figure 2: Dashboard with Excel tiles


The tiles are connected to the workbooks in OneDrive for Business and are refreshed automatically every few minutes.

In this article you will find more details and the list of supported features.

Furthermore, dashboards containing Excel tiles, as well as Excel workbooks themselves, can be added to organizational content packs for easy sharing with your colleagues. Read more about sharing workbooks here.

Go ahead and give it a try! Create dashboards using your Excel workbooks with your favorite look & feel!

Working with GitHub from Visual Studio 2015

MSDN Blogs - Tue, 11/24/2015 - 07:44

In addition to the well-known tools for Windows, Azure and web development, Visual Studio 2015 ships several tools and libraries for open source and cross-platform development. This includes the GitHub Extension, a Visual Studio add-in that integrates the IDE with GitHub to provide access to your Git repositories and common operations like clone, branch and pull requests from within Visual Studio.

The GitHub Extension installation package is integrated with the Visual Studio installer as an optional feature listed under the Common Tools group.

Using GitHub Extension

When installed, the GitHub Extension adds a section to the VS Team Explorer panel, giving you the option to choose which source control service to use. Right from the panel you can create a new GitHub account or connect to an existing one (both GitHub and GitHub Enterprise accounts are supported, with two-factor authentication), clone any available repository, or create a new repository.

All local repositories are listed in the same panel so you can quickly switch between them. For the selected repository, Team Explorer displays GitHub-specific information and actions, making it simple to see repository changes, branch source code or send a pull request.

At the same time, the GitHub web site was modified for this release to allow Windows developers to open GitHub repositories in Visual Studio directly from the site. The new “Open in Visual Studio” action on the GitHub site launches Visual Studio and populates Team Explorer with the information required to clone the selected repository.

What’s next

This is the first release of the GitHub extension for Visual Studio, and some actions may still take you to the GitHub web site instead of an embedded experience, but it is already a solid solution for developers who use GitHub for open source and private repositories.

We got around three

MSDN Blogs - Tue, 11/24/2015 - 07:00
In this story about Restoration Hardware's mail-order extravagance, there is a little statistic:

Industry surveys from groups like the Direct Marketing Association estimate that catalogues get average response rates of four to five per cent.

That reminded me of a story from back in the days when Microsoft sold Fortran PowerStation, the development suite for the scientific programming language. A developer from the Windows 95 team decided to try a change of pace and switched into sales and marketing, and he wound up working on Fortran PowerStation.

One of his attempts to drum up interest was to include a reply card in a periodic mailing to subscribers of some general computer programming magazine. This was back in the days when people subscribed to computer programming magazines, like, in print. Also back in the days when the publishers would send out little playing-card-deck-sized blocks of business reply cards for various products.

As he explained it to us, "We sent out around ten thousand cards. They say that a typical response rate for a direct mailing is four to five percent. We got three."

Okay, three percent is a bit low, but given that Fortran is not exactly an up-and-coming product, it's still quite respectable.

"No, not three percent. Three responses."

Try Microsoft.Xrm.Data.PowerShell.Samples!

MSDN Blogs - Tue, 11/24/2015 - 07:00
Hi everyone! Today I introduce PowerShell script samples.

How to download the samples:

  1. Go to the repository page.
  2. Click “Download ZIP” on the left side.
  3. Save and extract it.
  4. Go to the Microsoft.Xrm.Data.PowerShell.Samples folder in the extracted folder.
  5. Each folder is a sample.

We provide 6 samples now, but we will keep adding more. So how do you find detailed information for each sample? Sample ReadMe: each sample contains...(read more)

Embracing advanced technology for the digital parent – SharePoint at St Helen’s Northwood

MSDN Blogs - Tue, 11/24/2015 - 06:00

There are many ways in which schools, colleges and universities are making use of SharePoint within education, and today we're going to look at one of the perhaps less obvious applications of this collaborative space.

Parent engagement is a crucial aspect of any institution’s heartbeat; it helps ensure a complete learning experience for the students and keeps the school receptive to feedback from outside the gates. We pick up this particular story in north London…


As a leading independent girls’ day school and member of the Girls' Schools Association (GSA), St Helen’s provides an outstanding education for girls aged three to eighteen. Situated at the heart of Northwood in north London since its foundation in 1899, the School provides the opportunity for more than 1100 girls to excel academically, to fulfil their all-round potential and to acquire the skills, insight and confidence to become leaders in their fields, their professions and their communities.

St Helen’s School aims to ensure that its large parent community is engaged and plays a full and active part in School life. Microsoft SharePoint has been used in the School for over two years as a staff/student intranet and virtual learning environment, and recently this has been extended to include a Parent Portal.

St Helen’s decided to create their parental platform using Microsoft SharePoint 2013. The tools available in Microsoft’s most popular server application enabled the school to take advantage of templates such as announcements, document libraries and custom lists. The flexibility of the product allowed the School to work with Cloud Design Box and customise these lists with workflows, responsive display templates and web parts. Parental communication documents such as PDFs are now permissioned dynamically using workflows, so only the relevant groups of parents see the documents.

David Nanton (Head of IT Systems) said, “We liked the idea of targeting announcements and specific documents at different groups of parents across our many year groups accurately and rapidly. With SharePoint it was easy to do this with workflows. Every time we upload a new document, it’s easy to target a particular parental group. The workflow does all the hard work in the background to ensure that the correct permissions are applied.”

The new Parent Portal combines the latest responsive design technologies, including Bootstrap and CSS3, to render lists and libraries with a mobile, touch-friendly interface. It was important for the school to have a mobile-first mentality when it came to parents.

Dr Paul Arnold, Deputy Head (Development), explains: “At St Helen’s we are always striving to take advantage of the latest developments in communications technology – the mandate in this instance was to deliver a responsive, mobile-friendly digital communication platform to engage our busy parents who need access to School information while on-the-go. With the launch of the Parent Portal, they can now quickly click through to the latest communications from us on their smartphone, tablet or any other mobile device.”

Working with Cloud Design Box allowed St Helen’s to take full advantage of their experience developing SharePoint portals by adding workflows and responsive design. The impressive portal gives parents access to:

  • Announcements: each announcement can be tagged with a particular parent group which is automatically permissioned using SharePoint workflows.
  • Communication Documents: the visibility of these is tailored to each parent, so only the relevant documents are displayed, such as specific year-group letters or consent forms.
  • Attendance, published reports and class information via integration with Capita web parts from the SIMS MIS data. Data collection sheets are available for parents to update their own information.
  • Important links to the School’s connected websites.

David Nanton also added: “The experience we have had with Microsoft SharePoint so far has proved to be excellent and we can see so much potential to streamline collaboration and communication. We found Cloud Design Box to be a professional and knowledgeable partner. They were helpful in turning our designs and ideas into reality.”


St Helen’s have started to use Office 365 and aim to create a hybrid experience in the near future, combining the full functionality of Office 365 with the on-premises data. St Helen’s worked with Cloud Design Box to customise their SharePoint site using responsive designs, workflows and custom display templates.

Tony Phillips from Cloud Design Box added, “Responsive design is so important in modern web development. Website access is ever-increasing via mobile devices and tablets. We are really excited to see the new Parent Portal in action; it combines the functionality of SharePoint with our responsive display templates so that it looks great on all devices.”


Visual Awesomeness Unlocked - The Globe Map

MSDN Blogs - Tue, 11/24/2015 - 06:00

By Amir Netz, Technical Fellow and Manoj Patel, Software Engineer

All Map enthusiasts, we have a special treat for you this week. You are going to love it! Some of you might have seen this already in our demos but this week you can touch it and feel it with your own hands (I mean your own data :)

Today, digital maps have become an integral part of everyday life. Most organizations have data with geo attributes in some form or other, and the need to understand the geographical significance of data has been on the rise.

The combination of aggregating information by location and presenting it in a small space allows map visualizations to present the big picture out of a sea of data. They help you find the themes and outliers at a quick glance. While a 2D map can achieve this goal to some extent, when it comes to multiple attributes and widespread data it falls short.

A 3D map makes this experience more immersive and magical. It provides a sense of connection between the data and the physical world. This, combined with our spatial ability, brings a new perspective to the data when we present it as 3D objects.

Globe Map brings this magical map exploration experience to Power BI. To use it, simply import the Globe Map from the Visuals gallery into your Power BI report and use it with a location data type. The location could be an address, city, county, state/province or country/region. On this 3D map, you can project a measure as the height of a bar; the 3D bars reduce the clutter of overlapping bubbles and give you instant insight. Globe Map also allows you to rotate the globe and see it from different angles. But we didn’t stop there. The icing on the cake is the added benefit of a heat map in this visual: you can use a second measure for heat intensity and draw immediate attention to the right areas.

For many mapping solutions, some location names may be ambiguous when considered in the context of multiple countries or regions. You can increase the accuracy of geocoding such location names by concatenating the city, state and country. For example, instead of using just ‘Bellevue’ as the location, you can provide it as ‘Bellevue, WA, USA’.

As usual, we can’t wait to hear your thoughts and your ideas for improvements.


NOTE:  This Globe Map custom visual is not the same as Power Map for Excel (called 3D Maps in Excel 2016). Power Map is a 3D data visualization tool for Excel that lets you plot geographic and temporal data visually, analyze that data in 3D, and create cinematic tours to share with others in Excel. Globe Map is a custom visual for use within Power BI reports.

DevOps, Visual Studio Team Services and Azure Marketplace

MSDN Blogs - Tue, 11/24/2015 - 05:51


The way development and operations teams work is evolving toward an environment of greater communication and collaboration. The goal is simple: to improve the pace at which they deliver new services and features to their users and customers. These practices fall under the DevOps philosophy, and today we can see development and operations teams at small, medium and even large companies adopting it.

When we talk about DevOps, we usually talk about three things:

  1. People, also known as Culture. It is necessary to break with the traditional model and evolve toward a new working culture aligned with this philosophy. In this change, people are the main point. We can have the processes and products that favor this evolution; however, if people are not willing to embrace the change, it will be difficult to achieve a successful outcome.
  2. Processes. It is no longer viable to deploy applications by handing over a compressed file together with a 20-page manual. If we want to be agile, we need to implement new processes that let us automate. Concepts such as continuous integration, continuous deployment and configuration management are key.
  3. Products. Finally, the products or tools are what allow us to support the new processes that arise from this change in the way people work.


Microsoft has the best tools to help you along this path thanks to Visual Studio Team Services, a complete solution for managing your software projects from the initial planning phases through development and subsequent deployment to production. In broad terms:

  • Source control: host your Git or TFVC repositories privately, with no limit on the number of repositories you can create.
  • Kanban, Scrum and custom dashboards: adapt the environment to the way you work. Capture, prioritize and manage your work visually, with a direct connection to your code repository.
  • Continuous integration: build, validate and deploy your solutions continuously with every code change, so you can detect errors early in the cycle.
  • Support for different tools and languages: it doesn't matter whether you work with Visual Studio, Eclipse, Xcode or IntelliJ, or whether you develop in .NET, Java, PHP, Ruby or another language. It also integrates with systems such as GitHub, Trello, Slack or Jenkins through its REST and OAuth 2.0 APIs.

However, you may already be using a third-party solution to handle some of the above. No problem: in the Azure Marketplace you can find a large collection of solutions ready to deploy directly on Azure and take advantage of the cloud.

For example, in the continuous integration and deployment area there are solutions from CloudBees, XebiaLabs, SolanoCI and StackStorm. For deploying your applications, you may be interested in Docker containers or a DEIS cluster. Finally, for configuration management you may want to look at Puppet Labs, Chef or SaltStack.

These are just a few of the more than 3,500 solutions available today in the Azure Marketplace, ready to deploy and start using. Choose the one that best fits your needs.

Regards,

José Ángel Fernández

Technical Evangelist


Performance issues with Hosted Build on Visual Studio Online - 11/24 - Investigating

MSDN Blogs - Tue, 11/24/2015 - 05:28

Initial Update: Tuesday, 24 November 2015 10:25 AM UTC

We are actively investigating issues with Hosted Build in the West Europe region. Some customers may experience longer than usual build queue times.

• Next Update: as the issue warrants

We are working to resolve this issue and apologize for any inconvenience.

VSTS Service Delivery Team


On 12 December at 12:00, come to Community DevCamp in Moscow

MSDN Blogs - Tue, 11/24/2015 - 05:08

We invite you to Community DevCamp, an event for developers about developers. It takes place a month after Connect(); //2015 in New York and a week after the virtual Visual Studio Connect(); event in Russia. The main speakers will be recognized community experts, who will talk about how they see, use, or plan to use the latest news for .NET developers: .NET Native, Roslyn, cross-platform development with ASP.NET, Docker containers, Azure Service Fabric, F#, and much more.

The event is held with the support of the MVP community.

Come along, it will be interesting!


.NET Core and ASP.NET Bug Bounty Update

MSDN Blogs - Tue, 11/24/2015 - 04:49

As we've now released RC1 of .NET Core and ASP.NET, the restrictions on areas for investigation are now lifted. The entire cross-platform stack, including networking, is now in scope and eligible for bounty submissions.

The ASP.NET web site has instructions on how to install RC1 on Windows, Linux and OS X. Windows researchers can use Visual Studio 2015, including the free Visual Studio 2015 Community Edition, after installing RC1. The source for .NET Core can be found on GitHub, as can the source for ASP.NET v5.

As before, we encourage you to read the program terms and FAQs before beginning your research or reporting a vulnerability.

Guest post: Microsoft Azure at the Georg-August-Universität Göttingen

MSDN Blogs - Tue, 11/24/2015 - 03:57

Guest post by Dirk Hähnel, research associate at the Faculty of Physics – III. Institute of Physics – Biophysics at the Georg-August-Universität Göttingen.

Within Göttingen's research community, we are a biophysics institute belonging to the Faculty of Physics. Our department conducts scientific research at the interface of physics, biology and chemistry; more precisely, single-molecule spectroscopy and super-resolution microscopy, addressing a wide variety of questions within biophysics and the field of complex biosystems in general. Life science is heterogeneous, and so are our projects, so we need to be able to act flexibly, especially when it comes to supporting processes such as IT. In our view, IT must lead to better results in our core competence. In other words, we would rather occupy ourselves with biophysical research than with IT infrastructure.

A few years ago we started operating a Linux cluster to analyze the large volumes of data generated in our labs. The cluster consists of several server blades equipped with CPU and GPU cores. Operating this system turned out to be very maintenance-intensive, and the interfaces to the office computers were inadequate. We therefore migrated the system to Microsoft technology (MS High Performance Server 2008 R2) in 2012, mainly to reduce the operating effort, since several IT staff positions had been cut or were not refilled.

Before we started the migration to Microsoft, there were several pessimistic voices. The result, however, more than surprised us: we could not detect any noticeable loss of performance; quite the contrary, the time spent on administration and on integrating third-party components dropped to almost 10% of the original effort. Even running the office computers as compute nodes while their users were away could be set up stably within a few hours. Fortunately, our institute has gained many new staff in recent years, so this solution, too, quickly became insufficient. The process was accelerated by the fact that over the last two years our research methods have been generating more data, and the global big-data trend is making everything more data-heavy for us as well. Newer data-acquisition devices such as scientific cameras or spectrometers outdo each other in detection efficiency every year, so that today almost twice as much data can be generated in the same time span as in 2010. For us this means increased demands on storage and computing. After long deliberation, only the cloud came into question, so we set out to find a competent partner.

Faculty of Physics at the University of Göttingen

We took the next step forward at the beginning of 2014 by offloading the load peaks of the local cluster into the Microsoft Azure cloud. When needed, we could attach additional compute nodes in the cloud to the local cluster. This collaboration came about through the Microsoft Azure Research Grant, which we were awarded in 2014 and 2015. Configuring the compute nodes in the Microsoft Azure cloud was trivial, and the documentation in the Microsoft Azure management suite helped us with the problems that came up. The support provided by Microsoft Service was also very satisfactory: within a very short time, competent support staff put us in contact with an expert who was able to resolve the case. Having made our own experiences with Microsoft Azure, we are convinced that the Microsoft Azure platform is an attractive solution for scientists, whether physicists, biologists, chemists or physicians. Anyone who does not want to tie up more resources than necessary in IT infrastructure should rely on strong external service offerings. Whether you want to run Python, Matlab or R scripts, our experience with Microsoft Azure has always been excellent.

In the future we will use the Azure Machine Learning suite, for example to automatically analyze image data containing nerve cells. The large number of machine learning libraries already available, and the ability to integrate your own model, are advantageous. Standing still means falling behind, so we plan to port the existing algorithms to Hadoop-based systems, for which Microsoft HDInsight is a suitable tool. Our goal is to make algorithms for analyzing spectroscopy and super-resolution microscopy data in the >100 TB range available to the science community.

More information about Microsoft Azure in research is available here.

