MSDN Blogs

from ideas to solutions

Help us Shape the Azure Storage iOS Library

Tue, 03/31/2015 - 16:06

The Storage team is looking for feedback to help us focus our development for the upcoming Azure Storage iOS library. We’ve created a survey that should take 5-10 minutes to complete.

Once you complete the survey, you will also have the opportunity to learn more about an upcoming early preview program. In addition, the first 20 people to complete the full survey will receive some fun Azure merchandise for their offices!

Please click here to start the survey.

Thank you!

Michael Curd
Program Manager, Azure Storage

Mailbag: How can I revert shared files to older versions when uninstalling the new version of my MSI-based product?

Tue, 03/31/2015 - 14:55

Question:

I have 2 versions of my MSI, and each version installs some files to a shared location.  After installing version 1 of my MSI, then version 2 of my MSI, the shared files are updated to version 2.  When I uninstall version 2 of my MSI afterwards, the shared files are still version 2.  However, I need the shared files to be reverted back to version 1.  How can I implement that behavior in my MSI?

Answer:

Windows Installer does not have any built-in functionality to back up old versions of shared files and then restore them when the upgraded files are uninstalled.  The general philosophy being applied here is that new versions of shared files are expected to be backward compatible with the older versions of the products that depend on those files.

If backward compatibility is not possible for your products and you need to restore older versions of shared files when uninstalling the newer version of your MSI, you will have to write custom actions: one to copy the older versions of the files to a backup location during installation of the new MSI, and one to restore them during uninstallation of the new MSI.

The custom actions should behave as follows (an illustrative sketch follows the list):

  • The custom action that backs up older versions of the files needs to be sequenced before the InstallFiles standard action and should have an execution condition that includes (NOT Installed) so it will run during a first-time install of the MSI and not during repairs or uninstalls.
  • The custom action that backs up older versions of the files needs to handle the case where the shared files do not exist.  In this case, it should skip trying to back up the older versions of the files so that install will not fail if the user does not have an older version of your MSI installed.
  • The custom action that restores older versions of the files needs to be sequenced after the RemoveFiles standard action and have an execution condition that includes REMOVE="ALL" so it will only run during an uninstall of the MSI.
  • The custom action that restores older versions of the files needs to handle the case where the backup location does not exist.  In this case, it should skip trying to restore the older versions of the files so that uninstall will not fail if the user only has one version of your MSI installed.
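
Here is a minimal sketch of what such a pair of custom actions could look like, written against the WiX DTF managed custom action framework (Microsoft.Deployment.WindowsInstaller). The class name, method names, and folder paths are illustrative only; a production MSI would typically run these as deferred custom actions and pass the paths in via CustomActionData:

using System.IO;
using Microsoft.Deployment.WindowsInstaller;

public class SharedFileActions
{
    // Illustrative paths; real values would come from the MSI session.
    private const string SharedDir = @"C:\Program Files\Common Files\Contoso";
    private const string BackupDir = @"C:\ProgramData\Contoso\Backup";

    // Sequence before InstallFiles, condition: NOT Installed
    [CustomAction]
    public static ActionResult BackupSharedFiles(Session session)
    {
        // No older version installed; nothing to back up.
        if (!Directory.Exists(SharedDir))
            return ActionResult.Success;

        Directory.CreateDirectory(BackupDir);
        foreach (string file in Directory.GetFiles(SharedDir))
            File.Copy(file, Path.Combine(BackupDir, Path.GetFileName(file)), true);
        return ActionResult.Success;
    }

    // Sequence after RemoveFiles, condition: REMOVE="ALL"
    [CustomAction]
    public static ActionResult RestoreSharedFiles(Session session)
    {
        // Only one version was ever installed; nothing to restore.
        if (!Directory.Exists(BackupDir))
            return ActionResult.Success;

        Directory.CreateDirectory(SharedDir);
        foreach (string file in Directory.GetFiles(BackupDir))
            File.Copy(file, Path.Combine(SharedDir, Path.GetFileName(file)), true);
        return ActionResult.Success;
    }
}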

Note – this solution also assumes that your MSIs follow the Windows Installer component rules – you cannot change the composition of MSI components that are shared by multiple products.  In other words, the shared files must be authored in MSI components that have the same GUID and the same contents (files + install locations) in both versions of your MSI.

Azure App Services Mobile Apps and a new One Dev Minute Video for Azure Mobile Services

Tue, 03/31/2015 - 14:07

Things have been quiet recently on my blog, but things have been far from quiet in the world of Azure Mobile Services.

First off, in the new Azure Portal, Mobile Services is looking a bit different…

Introducing App Service Mobile Apps

Microsoft recently unveiled a new group of Azure services, bundled collectively as Azure App Service. You can read ScottGu’s announcement here. This new preview service, which is targeted at enhancing web and mobile capabilities in Azure, comprises the Web Apps, Mobile Apps, API Apps, and Logic Apps features. The cool thing is that these new features are highly composable: they let you easily build out your apps, reuse components and built-in connectors across both web and mobile, and construct integrated workflows.

With the introduction of the .NET backend and integration with Notification Hubs, Mobile Services was already headed in this direction. The major change in going from Mobile Services to Mobile Apps (besides the portal and the component reuse) is that the authentication functionality has been moved out to a shared gateway that supports all App Service features. Note that App Service is only available in the slick new Azure portal, which uses “blades” and looks like this:

This new service offering is currently in preview, and the feature positioning/messaging is still being refined. Yes, Mobile Services is still there and will be for a good while, especially since this Mobile Apps preview does not yet have parity with Mobile Services; in particular, it lacks support for the JavaScript (Node.js) backend and does not yet have quickstarts for all major client platforms (the preview has quickstarts for Xamarin, iOS, and Windows).

You can get started learning more about Mobile Apps with these new topics:

A New One Dev Minute Video for Azure Mobile Services

The Windows folks have a developer education team dedicated to producing the classy-looking One Dev Minute video series published on Channel 9. These videos focus on the cool stuff you can do with Windows apps, and I was lucky enough to help create a One Dev Minute video for Azure Mobile Services highlighting push notifications.

Here’s the new video Connecting your app with Azure:

This video demonstrates how easy it is to use Visual Studio to connect a Windows app project to Azure so you can send push notifications to app users. It also shows how you can then expand that same mobile service to support apps on other platforms, such as iOS and Android.

Thanks to Ross, Kelly, Harry and everyone else that helped to publish this first-rate video for Azure and Windows.

Cheers!

Glenn Gailey

Visual Studio 2015 and Graphics Tools for Windows 10

Tue, 03/31/2015 - 14:06
In Visual Studio 2012, we introduced Visual Studio Graphics Diagnostics for Direct3D. Since then, with every update and release of Visual Studio and every monthly preview release we have continued our commitment to improve and deliver great new features in these Direct3D graphics tools. I recently had the opportunity to demonstrate the graphics tools in Visual Studio 2015 and Windows 10 at Game Developers Conference 2015. I was floored by the great response from developers. The conference seemed...(read more)

GPIO in One Minute

Tue, 03/31/2015 - 14:00

This video explains GPIO in just one minute and shows how to use a Galileo board to read a button and control an LED from Windows. Very cool to see!

Enjoy!

 

When Azure ML helps treat patients…

Tue, 03/31/2015 - 13:43

As faithful readers of this blog, you hardly need reminding that data carries crucial information for every kind of organization. But the scope of machine learning certainly does not stop there. What if it could help us with our day-to-day decisions? As we saw in the previous post, Azure Machine Learning (Azure ML) can be used, for example, for fraud detection, sales forecasting, or text classification.

The goal of this post is to present another Azure ML use case: decision support in the medical field.

For those who could not attend the Microsoft TechDays 2015 session given by my colleague Benjamin Guinebertière (@benjguin) and by Reda Mattar of Exakis, I am pleased to relay the availability of their MVA (Microsoft Virtual Academy) course Aide à la décision avec Azure Machine Learning (decision support with Azure Machine Learning).

The course gives a complete presentation of Azure Machine Learning through a practical case: helping a practitioner predict, as the title of this post suggests, whether a patient suffers from a disease, based on the observed symptoms and other characteristics of the patient.

The course assumes no particular prerequisites. It lets anyone understand the value of machine learning in such a scenario, learn how to create their own workspace for experiments in Azure ML, build a clinical decision-support model, evaluate it, tune it to meet the business requirements, and finally deploy it as a web service.

The course is organized into 4 videos:

  1. Introduction to the context and to Azure Machine Learning
  2. Data preparation
  3. Building and evaluating a predictive model
  4. Publishing a service that uses the model

You will see that in this situation, evaluating the model is not just about correctly predicting whether a patient is healthy or sick. The emphasis is on never predicting that a sick person is healthy, hence the use of recall as the evaluation metric.
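
As a quick refresher: if TP is the number of sick patients correctly predicted as sick and FN the number of sick patients wrongly predicted as healthy, then

recall = TP / (TP + FN)

so maximizing recall directly minimizes the dangerous case of a sick patient being labeled healthy.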

Enjoy the videos!

Finally, I will take advantage of this post to also invite you to find Benjamin at various places on the Internet, and in particular here, where he can guide you ;-).

Want to detect fraud? Forecast your sales? Classify texts?

Tue, 03/31/2015 - 13:28

Azure Machine Learning (Azure ML) now offers 3 templates you can customize to your needs!

As we have already illustrated through numerous posts on this blog, Azure ML is a fully managed cloud service that lets data scientists and developers efficiently embed advanced (predictive) analytics into their applications.

To that end, the Azure ML Studio visual, collaborative development environment has from the start provided a powerful, easy-to-use canvas for quickly composing machine learning experiments: a convenient drag-and-drop model for rapidly building workflows and operationalizing them in a few clicks. A large number of data manipulation and machine learning modules are available for this purpose.

Azure ML Studio thus lets you benefit directly from powerful algorithms used inside Microsoft, by communities, and so on, combine them with your own R or Python code, and build a solution.

In a previous post we mentioned that you can now share the solution you have built with the community in a single click, via the new community-driven gallery.

That is exactly what the Azure ML product group has just done. It recently published three new templates:

  • the first for online fraud detection,
  • the second for retail sales forecasting,
  • and the third for text classification.

Unlike the classic sample experiments you have seen so far, an Azure ML template demonstrates best practices and uses general-purpose modules to build a machine learning solution for a particular domain. As shown in the diagram below, an Azure ML template starts with data preparation, then proceeds to data processing, feature analysis, and model training, and ends with deployment (as a web service). The goal is to let data scientists and developers quickly build and deploy customized machine learning solutions in the cloud, increasing their productivity while guaranteeing a high-quality result.

Since Azure ML Studio offers a collection of preconfigured modules, such as the Execute R Script module we looked at recently, which lets you call custom R scripts, such a template gives everyone the means to build an end-to-end solution.

Each Azure ML template includes:

  • A data schema applicable to a particular domain,
  • Data processing and feature analysis specific to that domain,
  • Model-training algorithms suited to that domain,
  • Where appropriate, a domain-specific metric for evaluating the model.

To reflect the iterative nature of the process and make each template easier to understand and experiment with, every template is broken down into several steps (yielding multiple experiments in Azure ML), as shown in the diagram below. Each step also has its own documentation and instructions explaining how to use it from the Azure ML gallery.

With the proposition clarified, let's look at each template in turn.

Fraud detection

Fraud detection is one of the earliest industrial applications of data mining and machine learning. It is generally treated as a binary classification problem (e.g., is this transaction fraudulent or not?) that splits the population into two classes of very unequal size, since fraud cases are usually quite rare relative to the overall volume of transactions. When an organization such as a financial institution detects fraudulent transactions, it generally takes steps to block transactions on the accounts involved in order to avoid further losses: it must therefore act as quickly as possible! Consequently, in addition to measures at the per-account transaction level, performance measures for the fraud detection model are often very important, the ROC curve being one example.

To increase the predictive power of the model, features are generated using counting methods (via the Learning with Counts modules).

The template walks through an end-to-end treatment using an online-purchase fraud detection scenario. The methodology used there can easily be extended to fraud detection in other domains. :-)

Below you will find a list of the fraud detection experiments. You can also find them in the Azure ML gallery by searching for "fraud detection" in quotation marks.

Let's move on to the second template.

Retail sales forecasting

The ability to produce good forecasts at the right time is (often) a source of success in the retail market. Forecasts play an essential role in planning product demand, inventory, pricing, promotions, and placement across a store's shelves (as was recently discussed during the panel of the day-3 keynote session "Vers une technologie invisible et une intelligence omniprésente ?" at Microsoft TechDays 2015).

A template is now available that lets you easily build and deploy a complete retail forecasting solution. Through several steps, as outlined above, this template automatically produces weekly sales forecasts for each store and each product of any given retailer. It demonstrates both the time-series approach and the regression approach.

Below you will find a list of the retail forecasting experiments. They can also be found in the ML gallery by searching for "retail forecasting" in quotation marks.

It is now time to turn to the third template, which covers text classification.

Text classification

The field of application for text classification is very broad, whether it is categorizing newspaper articles, splitting an article into several topics, organizing web pages into hierarchical categories, filtering spam, analyzing sentiment, predicting user intent from searches, or analyzing customer feedback, etc.

The goal of text classification is to assign pieces of text to their class or category. A piece of text can be a document, a newspaper article, a search query, an email, a tweet, a receipt, a comment left by a customer, and so on.

This template uses sentiment data collected on Twitter to show how to preprocess text, extract features from it, train a sentiment classification model, and finally publish that model as a web service.

Below you will find a list of the text classification experiments. They can also be found in the ML gallery by searching for "text classification" in quotation marks.

And that brings us to the end of this post. We hope it has made you want to try these templates.

More machine learning templates and models should be published soon. They will certainly be covered in future posts ;-) Stay tuned, as they say!

Azure Logic App with simple API App with inputs and outputs

Tue, 03/31/2015 - 13:18

Logic Apps is a new service offered by Microsoft Azure. It allows users to easily create and manage a flow of triggers and actions. A trigger starts when an event happens; it can be a periodic trigger or a trigger based on the arrival of an email, etc. When a trigger fires, the actions specified in the Logic App are invoked. All triggers and actions are API Apps. For more details about Logic Apps please visit this documentation, and for API Apps you can find the documentation here.

Here I will go through the creation of a simple API App that takes an input, does some processing, and produces an output. I chose my operation to be ToUpper: it takes a string and converts all its characters to upper case. I used Visual Studio to create and publish the API App, and then the Azure portal to create and manage the Logic App.

Let’s start by creating a new API App. I used Visual Studio 2013 and had to install the Azure SDK 2.5 for Visual Studio 2013 (I installed it using the Web Platform Installer; search for "Visual Studio Azure SDK"). Open Visual Studio and create a new project -> Visual C# -> Web -> ASP.NET Web Application; you will see these options:

Choose Azure API App and click OK.

This will create a new project and add two main components:

  1. Web API: Microsoft's framework for creating web services based on ASP.NET. For more info please visit http://www.asp.net/web-api
  2. Swashbuckle: a utility that helps create Swagger API definitions for your Web API. For more info please visit https://github.com/domaindrivendev/Swashbuckle

The new project will contain a ValuesController. I will rename this controller to ToUpperController; make sure you change both the class name and the file name to ToUpperController.cs. This is what my controller looks like:

public class ToUpperController : ApiController
{
    // GET api/ToUpper?toConvert=...
    public string Get(string toConvert)
    {
        // Guard against a missing query parameter before converting.
        return toConvert == null ? string.Empty : toConvert.ToUpper();
    }
}
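
With the default Web API routing, a request such as GET {localhost:port}/api/ToUpper?toConvert=hello should then return "HELLO".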

In the project, go to App_Start, open the SwaggerConfig class, and enable the UI by uncommenting this section:

/*
})
.EnableSwaggerUi(c =>
{
*/

Run your site and browse to {localhost:port}/swagger; you will see this UI:

You can pass in any string and "Try it out!"; you will see that the body contains the string you sent, in capital letters. If you browse to {localhost:port}/swagger/docs/v1, you will get the Swagger API definition file.

Now here is the trick that took me some time to figure out: Logic Apps only work with the default response, not with the 200 response defined in the Swagger JSON.

So, to add a default response to all your APIs, go to SwaggerConfig.cs and do the following:

  1. Uncomment
     c.OperationFilter<AddDefaultResponse>();
  2. Add the following class:

     public class AddDefaultResponse : IOperationFilter
     {
         public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
         {
             // If only a 200 response is documented, mirror it as the default response.
             if (!operation.responses.ContainsKey("default") && operation.responses.ContainsKey("200"))
             {
                 operation.responses.Add("default", operation.responses["200"]);
             }
         }
     }

And this is how the Swagger response will look:

Now we are ready to publish to a live API App: right-click on the project, then hit Publish. To do that you need to be signed in to Azure and you need a subscription. Documentation on how to publish an API App is here.

OK, now let's play with the new API App that we created. Go to the portal and create a new Twitter API App: just search the Azure Marketplace and install the "Twitter Connector" package.

Once everything is deployed, create a new Logic App and go to the designer by clicking "Triggers and actions".

In my scenario I will trigger this flow manually, so I will check the box "Run this Logic Manually". Now it will ask me to drop in an action; from the list on the right, drop in the Twitter connector. I will choose the "search tweets" action and search for Azure. Now drop in your API App and choose the only action you have, "ConverToUpper_Get". It will ask for an input called "toConvert"; click the "…" button to get a list of parameters available from the previous action. I will choose "Search Tweets Tweeted By" as the input. Notice that it only shows the variables that are of type string, because we specified that the input to our API is a string. This is how the design will look after you are done:

 

Now save your changes, close the designer, and go back to the Logic App blade. Click "Run Now" and a new run will show up in the operations part; if you click on it, you can get to the results of the run.


 

Now, if you open the logs for each action you will see two links, "INPUTS LINK" and "OUTPUTS LINK"; each opens a JSON file containing the corresponding data. If you go to your ToUpperApp inputs link you will see something similar to this:

In the outputs link you will see that the name DanWahlin was transformed to capital letters and sent back in the body.


Now you have created an API App from scratch that can take in parameters and send back outputs. That opens the gate for lots of innovative Logic Apps that you can develop easily and rapidly.

 

Hyper-V on Windows 10 Technical Preview Build 10049

Tue, 03/31/2015 - 13:10

Last night we released Windows 10 Technical Preview Build 10049 to Windows Insiders on the Fast ring.

This is an exciting release, as it contains some key new technology. Most notably, it is the first release with Project Spartan. You can read all about it here: http://blogs.windows.com/bloggingwindows/2015/03/30/windows-10-technical-preview-build-10049-now-available/

However, this build also includes a bug that affects Hyper-V. 

The bug prevents us from correctly registering one of our system drivers when you choose to install Hyper-V.  Note that if you already have Hyper-V enabled and then upgrade to 10049, everything works fine.  You will only hit this issue if you install build 10049 and then enable Hyper-V.

If you want to use Project Spartan and Hyper-V on the same system - what you need to do is:

  1. Install an earlier version of Windows 10
  2. Enable Hyper-V
  3. Upgrade to 10049

If you are already on 10049 and have not yet enabled Hyper-V, you can either follow the above steps, or hang tight while we work on the next flight!

Cheers,
Ben

Management Reporter Version/Feature/ERP Compatibility Summary Overview

Tue, 03/31/2015 - 13:08
The quarterly RU/CU releases of Management Reporter over the last two (2) years have introduced many new features and supported scenarios. To help you and your team, the MR Program Management and Customer Service teams have created a summary cheat sheet that covers which version contains a feature a customer is interested in and which Dynamics AX version(s) are compatible with those update releases. The cheat sheet's columns are Release, Release date, Version #, Features, and GL Versions. ...(read more)

Small Basic is Fun - The Flickr Object

Tue, 03/31/2015 - 12:51

With Small Basic 1.1, the Flickr object is fully functional. So I added this as a bullet to The Unique Features of the Small Basic Language.

To see all the features, check out:

The Unique Features of the Small Basic Language 

 

Have a Small and Basic day!

   - Ninja Ed

Rethinking sales compensation models for cloud services

Tue, 03/31/2015 - 12:05

In my previous blog posts I’ve talked about the importance of thinking differently about your approach to sales in terms of building a volume engine to drive packaged and targeted offerings and considering the different conversations your sales team need to have outside the IT organisation. This can mean hiring different skills into your organisation as you bring on board new sales team members. It’s also critical to ensure you are incentivising and compensating your sales team effectively to motivate them to drive your cloud offerings.

Recently, Jen Sieger, Senior Business Strategy Analyst in the Worldwide Partner Group at Microsoft, posted an article focused on 4 key considerations when bringing on board new sales team members: not only the structure of your team, particularly if you have an existing on-premises business and are expanding to include cloud offerings, but also the skills and even the location of your new hires.

Taking this another step, it’s also critical to ensure you are incentivising and compensating your sales team effectively. Again, this is particularly important to get right if you’re transitioning your organisation from selling and deploying on premises solutions and now moving toward subscription payments and building annuity streams through managed services offerings. Brent Combest, Worldwide Lead for Partner Profitability & Compete at Microsoft recently posted an article talking specifically about building these compensation models based on research his team have been doing around the financials and P&L impact for cloud Partners and best practices he is seeing globally. I highly recommend taking time to read Brent’s post which provides 4 key focus areas when considering compensation models for cloud.

Locally in Australia we are running a series of workshops to help our Partners make changes in their business to grow profitability, particularly with cloud services. These sessions span change management, sales, and the financial implications for your P&L. I have included an overview of each below. There is a charge of $75 per seat for each session.

Rethinking your approach to sales – we’ve teamed up with Luke Debono from Incredible Results to deliver this one day workshop focused on the different sales methodologies and strategies to consider as you grow your cloud offerings.

Leading change in your organisation – we’ve teamed up with Fiona Hathaway, APAC HR Director for Services at Microsoft to deliver this half-day workshop focused on arming you with skills to influence and lead change within your organisation as you build your cloud business. Register here.

  • Wednesday 29th April – Melbourne (half day morning session)
  • Friday 1st May – Sydney (half day morning session)

The impact of cloud services on your P&L – we’ve teamed up with Hui Cheng Tan, Financial Controller at Microsoft Australia, to bring you this half-day workshop covering the financial impact of moving to cloud services, including a walkthrough of modelling tools you can take back to your business. Register here.

  • Thursday 30th April – Melbourne (half day morning session)
  • Friday 1st May – Sydney (half day afternoon session)

You will need your Microsoft Partner Network login to access the details and sign up.  If you’re not yet an MPN member take a look at Jack Pilon’s blog post on MPN 101 which highlights the benefits and how to sign up.  If you need further assistance registering please email partnerau@microsoft.com.

List of Modules loaded

Tue, 03/31/2015 - 11:41

While working on the .NET Loader, and now on some features around module loading in Bing, I frequently need to list and filter the modules (DLL/EXE) loaded in a process or on the whole system. There are many ways to do that: use a GUI tool like Process Explorer (https://technet.microsoft.com/en-us/sysinternals/bb896653.aspx), or even attach a debugger and get the list of loaded modules. But those seem to me either cumbersome (GUI) or intrusive (debugger). So I have written a small command-line tool. It’s native and less than 100 KB in size. You can get the source on GitHub at https://github.com/bonggeek/Samples/tree/master/ListModule or the binary at http://1drv.ms/1NAzkvy.

The usage is simple. To see the modules loaded in all processes with "note" in their name, you just use the following:

F:\GitHub\Samples\ListModule>listmodule note
Searching for note in 150 processes

\Device\HarddiskVolume2\Program Files\Microsoft Office 15\root\office15\ONENOTEM.EXE (8896)
========================================================
(0x00DB0000) C:\Program Files\Microsoft Office 15\root\office15\ONENOTEM.EXE
(0xCBEF0000) C:\windows\SYSTEM32\ntdll.dll
(0x776D0000) C:\windows\SYSTEM32\wow64.dll
...

\Device\HarddiskVolume2\Program Files\Microsoft Office 15\root\office15\onenote.exe (12192)
========================================================
(0x01340000) C:\Program Files\Microsoft Office 15\root\office15\ONENOTE.EXE
(0xCBEF0000) C:\windows\SYSTEM32\ntdll.dll
...

\Device\HarddiskVolume2\Windows\System32\notepad.exe (19680)
========================================================
(0xF64A0000) C:\windows\system32\notepad.exe
(0xCBEF0000) C:\windows\SYSTEM32\ntdll.dll
(0xCB7D0000) C:\windows\system32\KERNEL32.DLL
...

The code uses Win32 APIs to get the info. This is a quick tool I wrote, so if you find any bugs, send them my way.
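
If a managed dependency is acceptable, a rough C# equivalent of the core listing logic takes only a few lines using System.Diagnostics (my illustration, not the actual tool's source; note that Process.Modules can throw for protected processes or across 32/64-bit boundaries, which a native tool can handle more gracefully):

using System;
using System.Diagnostics;

class ListModulesSketch
{
    static void Main(string[] args)
    {
        // Optional substring filter on the process name, e.g. "note".
        string filter = args.Length > 0 ? args[0] : "";

        foreach (var p in Process.GetProcesses())
        {
            if (p.ProcessName.IndexOf(filter, StringComparison.OrdinalIgnoreCase) < 0)
                continue;
            try
            {
                Console.WriteLine("{0} ({1})", p.MainModule.FileName, p.Id);
                foreach (ProcessModule m in p.Modules)
                    Console.WriteLine("  (0x{0:X8}) {1}", (long)m.BaseAddress, m.FileName);
            }
            catch (Exception)
            {
                // Access denied or bitness mismatch; skip this process.
            }
        }
    }
}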

Introducing the Bing Ads Python SDK

Tue, 03/31/2015 - 11:28

We are excited to announce the availability of the Bing Ads Python SDK Beta today. This SDK client library will improve your Bing Ads developer experience by providing high-level access to features such as:

This post is an overview of the main features available in the Bing Ads Python SDK. You can find more details about these and other features on our MSDN page. The Bing Ads Python SDK is distributed through PyPI, and the open source code can be downloaded from GitHub.

Bulk API

Since its first announcement, the Bulk API has offered an efficient way to manage your campaigns, keywords, and other entities, transferring large amounts of data up to ten times faster than the traditional SOAP APIs. However, as a text-based API, the Bulk API requires a lot of work on the client side to parse and process the files containing the Bing Ads entities. Traditional SOAP APIs, on the other hand, can be accessed easily through an automatically generated object model.


The Bing Ads SDK closes this gap by providing an infrastructure for accessing the Bulk API using the same object model as our SOAP APIs.

Our SDK now makes it very easy to migrate code that works with the SOAP API objects so that it takes advantage of the Bulk API. For example, one of the most popular requests from advertisers is the ability to quickly update keyword bids across an account based on performance. Here’s how it can be done using the Bing Ads SDK:

First we instantiate a BulkServiceManager object:

bulk_service=BulkServiceManager(
    authorization_data=authorization_data
)

Then download the keywords including their performance data during the last month:

performance_stats_date_range=bulk_service.factory.create('PerformanceStatsDateRange')
performance_stats_date_range.PredefinedTime='LastMonth'
download_parameters=DownloadParameters(
    data_scope=[ 'EntityData' , 'EntityPerformanceData' ],
    entities=[ 'Keywords' ],
    performance_stats_date_range=performance_stats_date_range
)

bulk_entities=bulk_service.download_entities(download_parameters)

And finally upload the keywords back, increasing the bid by 10 percent for each keyword that received more than 100 clicks:

keywords=[]
for entity in bulk_entities:
    if isinstance(entity, BulkKeyword) \
    and hasattr(entity.performance_data, 'clicks') \
    and entity.performance_data.clicks > 100:
        entity.keyword.Bid.Amount=entity.keyword.Bid.Amount * 1.10
        keywords.append(entity)

if len(keywords) > 0:
    entity_upload_parameters=EntityUploadParameters(
        entities=keywords
    )

    bulk_entities=bulk_service.upload_entities(entity_upload_parameters)

    # output the upload results
    for entity in bulk_entities:
        if isinstance(entity, BulkKeyword):
            output_bulk_keywords([entity])

The SDK also provides an easy way to write your objects to the bulk files or parse existing bulk files. Please check out the MSDN documentation for the BulkFileReader and BulkFileWriter.

OAuth Authorization

As you may already know, Bing Ads is now actively transitioning to the OAuth Authorization model, a better and more secure way to handle user authentication than the traditional username/password authentication. To make this transition easier for our customers, the Bing Ads SDK includes high-level implementations of standard OAuth 2.0 grant flows, including the authorization code grant flow for both web and desktop applications and the implicit grant flow for desktop applications.

For example, to start using the OAuth authorization code grant flow, you have to first instantiate the corresponding OAuth object. If you are building a web app:

oauth_web_auth_code_grant = OAuthWebAuthCodeGrant(
    client_id=CLIENT_ID,
    client_secret=CLIENT_SECRET,
    redirection_uri=REDIRECTION_URI
)

Or if you are working on a desktop app:

oauth_desktop_mobile_auth_code_grant = OAuthDesktopMobileAuthCodeGrant(
    client_id=CLIENT_ID
)

Then you can get the url to direct the user to the Microsoft Account authorization page:

oauth_web_auth_code_grant.get_authorization_endpoint()

And once you are called back by the Microsoft Account authorization server, you can request the OAuth tokens by using the received authorization response uri:

oauth_web_auth_code_grant.request_oauth_tokens_by_response_uri(RESPONSE_URI)
oauth_tokens = oauth_web_auth_code_grant.oauth_tokens
access_token = oauth_tokens.access_token

When using the OAuth objects with the ServiceClient class (described in the next section), your access and refresh tokens will be automatically refreshed upon access token expiration. Optionally, you can register a callback function to automatically save the refresh token anytime it is refreshed. For example, if you have defined a save_refresh_token() method that securely stores your refresh token, set the authentication's token_refreshed_callback property:

oauth_web_auth_code_grant.token_refreshed_callback=save_refresh_token

Please check our Getting Started Guide on MSDN for complete examples of using the OAuth classes in a web or desktop application.

SOAP APIs

One of our goals in the Bing Ads SDK is to make sure it’s easy to access the SOAP APIs without doing any additional work. We have included generated proxy classes for all of our SOAP APIs, so there is no need for API users to manually run “svcutil” or another tool of choice.

We have also included the ServiceClient class, which provides uniform access to all of our SOAP services. It also handles common request header fields for you, allowing you to specify values for fields such as CustomerId, AccountId, and DeveloperToken in the AuthorizationData object only once.

Here is an example of initializing the AuthorizationData object using the OAuthWebAuthCodeGrant instance defined above: 

authorization_data=AuthorizationData(
    developer_token=DEVELOPER_TOKEN,
    authentication=oauth_web_auth_code_grant,
    # IDs not required for Customer Management service
    account_id=AccountIdGoesHere,
    customer_id=CustomerIdGoesHere,
)

The Authentication property can hold one of the OAuth Authorization objects (for authorization code grant flow or implicit grant flow also described in the previous section) or the PasswordAuthentication object (if you want to use the user name/password authentication).


password_authentication=PasswordAuthentication(
    user_name='UserNameGoesHere',
    password='PasswordGoesHere'
)
authorization_data=AuthorizationData(
    developer_token=DEVELOPER_TOKEN,
    authentication= password_authentication
)

Now that we have the AuthorizationData object, we can create the ServiceClient and call API operations without passing the same information with every request:

customer_service=ServiceClient(
    'CustomerManagementService',
    authorization_data=authorization_data
)

# Set to an empty user identifier to get the current authenticated Bing Ads user
user_id=None
user=customer_service.GetUser(user_id).User

Note that we only set the UserId field on the GetUserRequest. All other fields will be taken from the AuthorizationData object that we provided earlier.

Whenever the current access token expires, ServiceClient and BulkServiceManager will request a new set of access and refresh tokens from the Microsoft Account authorization server on your behalf. With BulkServiceManager this happens automatically, without any interaction required on your side. With ServiceClient, the tokens are refreshed whenever you invoke a service operation on the corresponding instance. To ensure that you always have the latest refresh token, listen for new tokens by registering a callback function. For example, if you have a save_refresh_token function that stores a refresh token securely, register it with the token_refreshed_callback authentication property:

authorization_data.authentication.token_refreshed_callback=save_refresh_token

Please check our Getting Started Guide on MSDN for complete examples of using the ServiceClient with the AuthorizationData and OAuth classes in your web or desktop  application.

Additional Resources

Please check our MSDN pages for more information about working with the Bing Ads SDK:

Coming soon

Our Java SDK will be out of beta and released for general availability later this week.

Feedback

Please let us know what you think about the Bing Ads SDK by sending an email to bing_ads_sdk@microsoft.com or leaving a comment below. We want to know about your experience coding with the Bing Ads SDK and what we can do to help you minimize your coding time and maximize conversions. 

PowerShell, Executing Internet Explorer as Network Service

Tue, 03/31/2015 - 11:03

Sometimes, as part of troubleshooting, things will work when you try them as an administrator but fail when executed as Network Service. Here is a command you can run from PowerShell that will open Internet Explorer in the context of the Network Service account:

(Of course, you could use the same approach to run any other program as any other user.)

psexec -u "nt authority\network service" -i "c:\program files\internet explorer\iexplore.exe"

Sample chapter: T-SQL Querying: TOP and OFFSET-FETCH

Tue, 03/31/2015 - 11:00

This chapter from T-SQL Querying starts with the logical design aspects of the filters. It then uses a paging scenario to demonstrate their optimization. The chapter also covers the use of TOP with modification statements. Finally, the chapter demonstrates the use of TOP and OFFSET-FETCH in solving practical problems like top N per group and median.

Classic filters in SQL like ON, WHERE, and HAVING are based on predicates. TOP and OFFSET-FETCH are filters that are based on a different concept: you indicate order and how many rows to filter based on that order. Many filtering tasks are defined based on order and a required number of rows. It’s certainly good to have language support in T-SQL that allows you to phrase the request in a manner that is similar to the way you think about the task.

Read the complete chapter here: https://www.microsoftpressstore.com/articles/article.aspx?p=2314819.
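
As a quick taste of the syntax the chapter covers, here are the two filters side by side in a minimal sketch (the Sales.Orders table and its columns are illustrative):

-- TOP: the 10 most recent orders
SELECT TOP (10) orderid, orderdate
FROM Sales.Orders
ORDER BY orderdate DESC;

-- OFFSET-FETCH: skip the first 20 rows, return the next 10 (page 3 of size 10)
SELECT orderid, orderdate
FROM Sales.Orders
ORDER BY orderdate DESC
OFFSET 20 ROWS FETCH NEXT 10 ROWS ONLY;

Both filters rely on the ORDER BY clause to define which rows qualify, which is exactly the order-based concept described above.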

Performance Issues with Visual Studio Online 3/31 - Investigating

Tue, 03/31/2015 - 10:44

Initial update : Tuesday, March 31 2015 10:40 AM UTC

We are currently investigating performance issues with a small subset of our databases. Customers whose collections are hosted in these databases may experience slowness while using the service. Our DevOps teams are actively engaged and working to resolve the issue.

We apologize for the inconvenience this may cause.

Sincerely

VS Online Service Delivery Team

Woot! Visual Studio 2015 Product Line Announced

Tue, 03/31/2015 - 10:43

I’ll refer you to the official Visual Studio blog post for more details, but just a heads up that the lineup for Visual Studio 2015 was announced this morning!

Again, full details on the VS blog, but here are the highlights:

  • Visual Studio Premium and Ultimate are condensing into a single, awesome product.  Meet Visual Studio 2015 Enterprise.
    • If you have 2013 Premium or Ultimate today, you’ll automatically get upgraded to VS Enterprise when it’s released.
    • The full comparison of features for the VS 2015 lineup is now available here.
    • Go here for more information on the Visual Studio 2015 product line, including pricing (and yes, Enterprise will be much cheaper than Ultimate!)
  • Are you using VS Professional or Test Professional?  Take a look at the offer that will be coming out in May: upgrade from Visual Studio Professional with MSDN or Visual Studio Test Professional with MSDN to Visual Studio Premium with MSDN for 50% off the regular list price, and get a free upgrade to Visual Studio Enterprise with MSDN automatically when we release Visual Studio 2015.

    This exclusive offer is only available for 2 months!

 

All in all, Visual Studio 2015 is going to add tremendous value, not just to the product itself but to MSDN as well.  Look for more information about this upcoming release as it becomes available (I’ll try to post more as well!).

Write native to managed interop code: reg free COM

Tue, 03/31/2015 - 10:24
If you followed the steps of the last post ( Call C# code from your legacy C++ code ), then your C++ program simply started the CLR (Common Language Runtime), invoked a C# method that returned the length of the passed in string argument. Not very exciting. Let’s spice it up a little. We’d like to allow the native code to call the managed code in a well-defined way. Interfaces come to mind. COM (the Component Object Model) was designed to be a way to have software communicate in a language independent...(read more)

Issues in loading Cloud Load test results in excel – Work Around

Tue, 03/31/2015 - 10:00

We have identified a bug in our system that prevents customers from loading run results in Excel for runs initiated after 03-19-2015 11:00 UTC. The root cause has been identified and we are in the process of deploying a hotfix (ETA: 03-31-2015 23:00 UTC).

Workaround:

For runs that completed during the impact window, you will have to run the SQL script attached to this blog post. The script calculates the cumulative values for the performance counter samples on your end and updates the results DB accordingly.

  1. Download the attached SQL file.
  2. Connect to the Load Test Results DB in SQL Server Management Studio or Visual Studio.
  3. Run the downloaded SQL script.

Once the script has executed, you should be able to generate Excel reports for the load test runs.

We apologize for the inconvenience this may have caused.

Sincerely

VS Online Service Delivery Team
