
Feed aggregator

Build Bot with Voice (Skype)

MSDN Blogs - 4 hours 19 min ago

In the previous post, “BUILD BOT with Microsoft Bot Framework Rest Api”, I described the raw REST flow of a text-messaging bot. The SDK is built on this REST API, so you can understand the underlying technology by learning the REST API itself.
In this blog post, I show you the same for a calling (voice) bot rather than a text-messaging bot. I hope this helps you understand the essence of voice communication in Microsoft Bot Framework.

On Skype, you (developers) can provide a calling (voice) bot using Microsoft Bot Framework. (In Microsoft Bot Framework, calling bots are supported only on the Skype channel.)

Note: If you use an SDK (.NET, Node.js), you can refer to the official documentation for a step-by-step tutorial on calling bots.

Note: Currently, only the en-US culture is supported for calling bots. Therefore, if you want to handle ja-JP (Japanese), please use the play and record communication described below for now.


The prerequisite setup is the same as for a normal chatbot in Microsoft Bot Framework. (Please refer to the documentation.)
But in this case (when you create a calling bot), you need some additional settings for calling. First, open the bot settings page, go to the Skype channel configuration page, and enable the “Calls” feature in the Skype channel. (See the following screenshot.)
You must also set the call endpoint (which I explain later) in these settings.


If you handle the bot communication in a webhook, every request carries an Authorization header like the following, and you must verify this token each time. For details on verifying the token, please refer to the previous post “BUILD BOT with Microsoft Bot Framework Rest Api”.
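As a rough illustration of what that verification involves, here is a minimal stdlib-only Python sketch (function names are mine; it only inspects the token's claims, whereas a production bot must also validate the RS256 signature against the keys published in the Bot Framework's OpenID metadata, as covered in that previous post):

```python
import base64
import json

def parse_bearer_claims(auth_header):
    """Extract the JWT from an Authorization header and decode its claims.

    NOTE: this only decodes the payload for inspection -- a real bot must
    also verify the token's RS256 signature against the keys published in
    the Bot Framework's OpenID metadata document.
    """
    scheme, _, token = auth_header.partition(" ")
    if scheme != "Bearer" or not token:
        raise ValueError("missing Bearer token")
    try:
        payload_b64 = token.split(".")[1]       # JWT = header.payload.signature
    except IndexError:
        raise ValueError("not a JWT")
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def claims_look_valid(claims, my_app_id):
    # The token must be issued for *your* bot: audience == your app ID.
    return claims.get("aud") == my_app_id
```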

POST
Accept: application/json
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8
...

Call endpoint and Callback endpoint

Calling (voice) communication needs two endpoints on your bot side: one is the “call” endpoint and the other is the “callback” endpoint.

When the user starts to communicate with your bot, the first webhook arrives at your call endpoint as the following HTTP request.

POST
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "id": "13107307-7bd6-4c5e-9a1b-65b98464cee6",
  "participants": [
    {
      "identity": "29:12BgCTOvVtWCWb0LlRkes7g428GXh_A4Gl9qbfce7YteH4zcD5pqSlQB-OMF1MVRM",
      "languageId": "en",
      "originator": true
    },
    {
      "identity": "28:d82c7c25-ddb4-426a-8b59-76a8a034abb4",
      "originator": false
    }
  ],
  "isMultiparty": false,
  "presentedModalityTypes": [ "audio" ],
  "callState": "incoming"
}

When your bot accepts this call request, it replies with the callback endpoint in the response body.

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "links": {
    "callback": ""
  },
  ... (explained later)
}

After that, the Bot Framework (i.e., the Skype Bot Platform) calls this callback endpoint for subsequent communications. After acceptance, all webhook requests in the conversation go to this callback URL.

In the Skype channel configuration page which I explained previously, you must set the “call” endpoint in the “Calling Webhook” textbox.

Key concepts – Actions and Outcomes

Every event is notified as a callback webhook, and your bot replies in the corresponding HTTP response (the request-reply pattern).
When your bot wants to perform some operations, it must set these activities in the HTTP response as “actions”. When the user has responded to an activity (or some system event has occurred), the event is included as the “operation outcome” in the callback webhook (HTTP request body). That is, each outcome corresponds to some specific action.

The types of actions (and outcomes) are: answer (answerOutcome), playPrompt (playPromptOutcome), recognize (recognizeOutcome), record (recordOutcome), reject (rejectOutcome), and hangup (hangupOutcome).
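Since every callback carries exactly one operation outcome, a bot's callback endpoint is naturally structured as a dispatch table keyed on the outcome type. A hypothetical Python sketch (the handler behavior here is illustrative only; a real handler would return the next set of actions):

```python
def _ack(name):
    """Illustrative handler factory: a real bot would build the next actions here."""
    def handler(outcome, call_state):
        return {"handled": name, "success": outcome.get("outcome") == "success"}
    return handler

# One handler per outcome type listed above.
HANDLERS = {t: _ack(t) for t in (
    "answerOutcome", "playPromptOutcome", "recognizeOutcome",
    "recordOutcome", "rejectOutcome", "hangupOutcome")}

def dispatch_outcome(callback_body):
    """Route a callback webhook body to the handler for its outcome type."""
    outcome = callback_body.get("operationOutcome", {})
    handler = HANDLERS.get(outcome.get("type"), _ack("unknown"))
    return handler(outcome, callback_body.get("callState"))
```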

Why is this style of communication used?
When we created a chatbot using only text messages (see “BUILD BOT with Microsoft Bot Framework Rest Api”), the communication pattern was essentially one-way. Imagine you are creating an alarm bot: this sort of bot might be idle while waiting, and is only triggered by some occurrence.
But when you create a calling (voice) bot, the connection stays open until hang-up. That is why the calling bot’s communication pattern is request-reply with “actions” and “outcomes”, with both sides communicating continuously until hang-up.

Now, let’s see the communication flow!

For example, when your bot accepts the initial call request (described above), it sends the following “answer” action. (If your bot refuses, use the “reject” action instead.)

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "links": {
    "callback": ""
  },
  "actions": [
    {
      "operationId": "673048f9-4442-440b-93a3-faa7433c977a",
      "action": "answer",
      "acceptModalityTypes": [ "audio" ]
    }
  ],
  "notificationSubscriptions": [ "callStateChange" ]
}
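A response like the one above can also be assembled programmatically. A minimal Python sketch (the callback URL and helper name are placeholders of mine, not part of the API):

```python
import uuid

CALLBACK_URL = "https://example.com/api/callback"   # hypothetical; use your bot's callback endpoint

def handle_incoming_call(call_notification):
    """Build the HTTP 200 response body that accepts an incoming call.

    `call_notification` is the JSON body posted to the *call* endpoint; the
    returned dict is serialized as the response, pointing the platform at the
    *callback* endpoint for all subsequent webhooks.
    """
    if call_notification.get("callState") != "incoming":
        raise ValueError("not an incoming call")
    return {
        "links": {"callback": CALLBACK_URL},
        "actions": [
            {
                "operationId": str(uuid.uuid4()),   # your own id, echoed back in the outcome
                "action": "answer",
                "acceptModalityTypes": ["audio"],
            }
        ],
        "notificationSubscriptions": ["callStateChange"],
    }
```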

You can include multiple actions in one HTTP response. For example, if you accept the initial call request and also reply with a message to the user, your bot can send an HTTP response like the following.
When you use the “playPrompt” action with a text value, Microsoft Bot Framework (Skype Bot Platform) converts it to voice (speech) using the built-in text-to-speech engine.

HTTP/1.1 200 OK
Content-Length: 383
Content-Type: application/json; charset=utf-8

{
  "links": {
    "callback": ""
  },
  "actions": [
    {
      "operationId": "673048f9-4442-440b-93a3-faa7433c977a",
      "action": "answer",
      "acceptModalityTypes": [ "audio" ]
    },
    {
      "operationId": "030eeb97-8210-48fd-b497-d761154f0b5a",
      "action": "playPrompt",
      "prompts": [
        {
          "value": "Welcome to test bot",
          "voice": "male"
        }
      ]
    }
  ],
  "notificationSubscriptions": [ "callStateChange" ]
}

When the playPrompt action has been carried out, the Bot Framework calls the following callback (webhook) with the outcome. This outcome means that the prompt was successfully delivered to the user.

POST
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "id": "13107307-7bd6-4c5e-9a1b-65b98464cee6",
  "operationOutcome": {
    "type": "playPromptOutcome",
    "id": "030eeb97-8210-48fd-b497-d761154f0b5a",
    "outcome": "success"
  },
  "callState": "established"
}

If your bot wants to ask the user something, it uses the “recognize” action in the HTTP response.
For example, the following requests that the user choose a dial-pad digit.

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "links": {
    "callback": ""
  },
  "actions": [
    {
      "operationId": "32e1a166-f557-4b22-9fd7-64a742d5f040",
      "action": "recognize",
      "playPrompt": {
        "operationId": "bab59923-63d2-48a0-9d34-c0fbadb54435",
        "action": "playPrompt",
        "prompts": [
          {
            "value": "If you want to report technical issues, press 1. You want to ask about our products, press 2.",
            "voice": "male"
          }
        ]
      },
      "bargeInAllowed": true,
      "choices": [
        { "name": "1", "dtmfVariation": "1" },
        { "name": "2", "dtmfVariation": "2" }
      ]
    }
  ],
  "notificationSubscriptions": [ "callStateChange" ]
}

When your bot wants to offer the choice by voice (instead of dial-pad digits), it sends the following HTTP response. In this example, if the user says “yes” or “okay”, “Yes” is returned as the result of the choice.

Note that this isn’t the speech recognition feature itself (speech-to-text functionality), but choice by voice. If the user says other words, your bot cannot recognize them.
If your bot needs speech-to-text functionality itself, please use the recording capability which I explain later.

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "links": {
    "callback": ""
  },
  "actions": [
    {
      "operationId": "9a109320-7f30-46f4-a894-a47a4eb8b398",
      "action": "recognize",
      "playPrompt": {
        "operationId": "01105459-d549-4327-b85d-0b0c94b62e8e",
        "action": "playPrompt",
        "prompts": [
          {
            "value": "Please answer yes or no.",
            "voice": "male"
          }
        ]
      },
      "bargeInAllowed": true,
      "choices": [
        { "name": "Yes", "speechVariation": [ "Yes", "Okay" ] },
        { "name": "No", "speechVariation": [ "No", "None" ] }
      ]
    }
  ],
  "notificationSubscriptions": [ "callStateChange" ]
}

When the user has responded to the “recognize” action, the following callback (webhook) is invoked. This is an example of the dial-pad digit choice.

POST
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "id": "13107307-7bd6-4c5e-9a1b-65b98464cee6",
  "operationOutcome": {
    "type": "recognizeOutcome",
    "id": "09c78c7c-33fc-488c-808b-0db83de1b433",
    "choiceOutcome": {
      "completionReason": "dtmfOptionMatched",
      "choiceName": "1"
    },
    "outcome": "success"
  },
  "callState": "established"
}

If the user takes too long to respond, the following system event is returned as the recognize outcome.

POST
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "id": "13107307-7bd6-4c5e-9a1b-65b98464cee6",
  "operationOutcome": {
    "type": "recognizeOutcome",
    "id": "32e1a166-f557-4b22-9fd7-64a742d5f040",
    "choiceOutcome": {
      "completionReason": "initialSilenceTimeout"
    },
    "outcome": "failure",
    "failureReason": "InitialSilenceTimeout"
  },
  "callState": "established"
}
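Both kinds of recognize callbacks, the matched digit and the silence timeout, can be handled with one small helper. A Python sketch (the helper name is mine) that returns the chosen option, or None on failure:

```python
def chosen_option(callback_body):
    """Return the matched choice name from a recognizeOutcome callback,
    or None when recognition failed (e.g. initialSilenceTimeout)."""
    oc = callback_body.get("operationOutcome", {})
    if oc.get("type") != "recognizeOutcome" or oc.get("outcome") != "success":
        return None
    return oc.get("choiceOutcome", {}).get("choiceName")
```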

If your bot wants to hang up (disconnect), it sends the following HTTP response with the “hangup” action.

HTTP/1.1 200 OK
Cache-Control: no-cache
Content-Type: application/json; charset=utf-8

{
  "links": {
    "callback": ""
  },
  "actions": [
    {
      "operationId": "d2cb708e-f8ab-4aa1-bcf6-b9396afe4b70",
      "action": "hangup"
    }
  ],
  "notificationSubscriptions": [ "callStateChange" ]
}

Conversation with Play / Record

Using Microsoft Bot Framework, your bot can also play media or record audio as binary. These capabilities are heavily used in real scenarios: playing music on hold, recording messages for someone, etc.
In particular, you can build interactive conversations using these capabilities, as follows. When you need high-quality voice guidance, you can also use them.

  • Record the user’s request (speech) and get the binary
  • Call an external speech-to-text engine (like Bing Speech API) and get the text value
  • Select a speech binary for the response and play it

Let’s see these capabilities.
If your bot plays some existing audio file, it sends an HTTP response like the following. As you can see, your bot can use an audio file URI in the playPrompt action instead of a text value.

HTTP/1.1 200 OK Content-Type: application/json; charset=utf-8 { "links": { "callback": "" }, "actions": [ { "operationId": "d18e7f63-0400-48ff-964b-302cf4910dd3", "action": "playPrompt", "prompts": [ { "fileUri": "" } ] } ], "notificationSubscriptions": [ "callStateChange" ] }

If you want to record the user’s response, send the “record” action. (The recording starts!)

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "links": {
    "callback": ""
  },
  "actions": [
    {
      "operationId": "efe617d7-4de5-42e9-b4e1-90dfd5850e49",
      "action": "record",
      "playPrompt": {
        "operationId": "e2381379-20b0-4fae-b341-afcdd8187323",
        "action": "playPrompt",
        "prompts": [
          {
            "value": "What is your flight ?",
            "voice": "male"
          }
        ]
      },
      "maxDurationInSeconds": 10,
      "initialSilenceTimeoutInSeconds": 5,
      "maxSilenceTimeoutInSeconds": 2,
      "playBeep": true,
      "stopTones": [ "#" ]
    }
  ],
  "notificationSubscriptions": [ "callStateChange" ]
}

When the recording has completed, your bot receives the following “record” outcome in MIME multipart format.
Your bot can retrieve the audio binary from this raw data and proceed with subsequent operations.

For example, Microsoft Bot Framework does not have a speech recognition feature itself (speech-to-text functionality), but you can get the text value with an external speech recognition service (like Bing Speech API), and you might also do language understanding using LUIS (Language Understanding Intelligent Service).

POST
Authorization: Bearer eyJ0eXAiOi...
Content-Type: multipart/form-data; boundary="test-0123"

--test-0123
Content-Type: application/json; charset=utf-8
Content-Disposition: form-data; name=conversationResult

{
  "id": "13107307-7bd6-4c5e-9a1b-65b98464cee6",
  "operationOutcome": {
    "type": "recordOutcome",
    "id": "efe617d7-4de5-42e9-b4e1-90dfd5850e49",
    "completionReason": "completedSilenceDetected",
    "lengthOfRecordingInSecs": 5.0459999999999994,
    "format": "wma",
    "outcome": "success"
  },
  "callState": "established"
}
--test-0123
Content-Type: audio/x-ms-wma
Content-Disposition: form-data; name=recordedAudio

(This is the audio binary ...)
--test-0123--
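Splitting this multipart payload into the JSON result and the raw audio bytes can be done with the Python standard library alone. A sketch (the function name is mine; a real web framework would usually do this parsing for you):

```python
import json
from email.parser import BytesParser
from email.policy import default

def parse_record_outcome(content_type, body_bytes):
    """Split a multipart record-outcome webhook body into the JSON
    conversationResult and the recordedAudio bytes."""
    # Re-wrap the raw body so the stdlib MIME parser can consume it.
    msg = BytesParser(policy=default).parsebytes(
        b"Content-Type: " + content_type.encode() + b"\r\n\r\n" + body_bytes)
    result, audio = None, None
    for part in msg.iter_parts():
        name = part.get_param("name", header="content-disposition")
        if name == "conversationResult":
            result = json.loads(part.get_payload(decode=True))
        elif name == "recordedAudio":
            audio = part.get_payload(decode=True)
    return result, audio
```

The returned audio bytes can then be posted to an external speech-to-text service, as described above.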

【Xamarin】 Get the free “Creating Mobile Apps with Xamarin.Forms” e-book (PDF, ePub, Kindle) and learn by trying it out

MSDN Blogs - 7 hours 46 min ago


On Tuesday, October 18, Charles Petzold announced in the Xamarin Blog post “Xamarin.Forms Book Now Available in Easy to Digest Chapter Summaries” that summary versions of all 28 chapters of his e-book on mobile app development with Xamarin.Forms, Creating Mobile Apps with Xamarin.Forms, have been published.

・PDF (56MB)
・ePub (151MB)
・Kindle Edition (325MB)



Have a nice Xamarin♪

Heading down to Portland SQL Saturday with Adam Saxton. This year SQL Saturday has dedicated an entire track to Power BI.

MSDN Blogs - Fri, 10/21/2016 - 19:03

In addition to the Power BI sessions, CSG will also be offering a Dashboard in an Hour session!

If you are not familiar with this event, SQLSaturday is a free training event for SQL Server professionals and those wanting to learn about SQL Server.

This event will be held on Oct 22, 2016 at Washington State University Vancouver, 14204 NE Salmon Creek Ave, Vancouver, Washington, 98686, United States.

For more information Check out:


I will be doing the following:  

Calling REST APIs, working with JSON and integrating with your Web Development using Power BI

Charles Sterling

Charles Sterling shows how to use Power BI in your development efforts: specifically, how to call REST APIs with Power BI without writing any code; how parsing, modeling, and transforming the resulting JSON makes creating rich interactive reports a snap; and how to integrate this into your development efforts by embedding Power BI data visualizations into your web applications.

Take your cloud skills further with Cloud + Enterprise University Online

MSDN Blogs - Fri, 10/21/2016 - 18:52

Massive Open Online Courses (MOOC) training

Build knowledge, stay sharp, and prove your expertise on selling and supporting Microsoft cloud solutions through our virtual, instructor-led courses and webinars, giving you the flexibility to train at your own pace. Complete a course and walk away with best practices and earn a digital badge to share your demonstrated capabilities.

MOOC courses typically run 4 to 8 weeks and require 1 to 2 hours per week to complete assignments on your own schedule.

Orchestrated courses – Delivered entirely online through an orchestrated set of work including video, case studies, chat, applied assignments, and feedback opportunities.

Assessment – Work from a customer opportunity plan, get assessed on your applied knowledge, and have the opportunity to offer feedback on peers’ plans.

Digital Badge – Complete the course and earn a digital badge to use publicly to signify demonstrated capabilities in the course subject.

Visit Cloud + Enterprise University Online often as more trainings will be added throughout the year. Registration opens as courses become available—and some courses are subject to change.

Internet of Things (IoT) available courses:

  • Selling Azure IoT Suite (MOOC): course starts 14/11/16, ends 9/12/16; registration closes 4/11/16.


Data Platform available courses:

  • Mission Critical Application Platform (MOOC): course starts 7/11/16, ends 2/12/16; registration closes 28/10/16.
  • Application Platform Migration (MOOC): course starts 14/11/16, ends 9/12/16; registration closes 4/11/16.


Cloud Infrastructure available courses:

  • Windows Server 2016 App Plat and Nanoserver (Webcast): October 25, 2016. Speaker: Jeff Woolsey. Sessions at 3:00am and 12:00pm, Australian Eastern Daylight Savings Time.
  • LOB Apps in Azure Launches (Webcast): October 26, 2016. Speaker: Venkat Gattamneni. Sessions at 3:00am and 12:00pm, Australian Eastern Daylight Savings Time.
  • Operations Management and Security (MOOC): course starts 7/11/16, ends 9/12/16; registration closes 28/10/16.
  • Selling Hybrid Cloud Storage with StorSimple (MOOC): course starts 7/11/16, ends 2/12/16; registration closes 28/10/16.


Enterprise Mobility + Security available courses:

  • Identity Driven Security Overview (Webcast): October 25, 2016. Speakers: Nasos Kladakis, Pragya Pandey, Vladimir Petrosyan, Demi Albuz. Sessions at 2:00am and 11:00am, Australian Eastern Daylight Savings Time.



Cloud Application Development available courses:

  • How to Accelerate Business Success with the Microsoft DevOps Solution: November 10, 2016. Speaker: Michael Köster. Sessions at 2:00am and 11:00am, Australian Eastern Daylight Savings Time.
  • Selling the Microsoft Mobile DevOps Solution: November 11, 2016. Speaker: Michael Köster. Sessions at 2:00am and 11:00am, Australian Eastern Daylight Savings Time.


For questions, please contact the Cloud + Enterprise University Online Program Support at


Understanding Advanced Analytics to Build a Stronger Retail Customer Experience

MSDN Blogs - Fri, 10/21/2016 - 17:55

Thank you to those of you who joined my webinar yesterday: I loved the engagement and the questions that came my way. For those of you who missed the live session, we have an on-demand recording available. In addition, I have also posted my slides to SlideShare. We also shared an e-book that you can download with more information on Advanced Analytics and Retail.

I would love to hear your feedback and comments via @5h15h

CppRestSDK 2.9.0 is available on GitHub

MSDN Blogs - Fri, 10/21/2016 - 17:04

We are delighted to announce a new version of the C++ REST SDK (Casablanca), 2.9.0. This new version, available on GitHub, introduces new features and fixes issues reported against the 2.8.0 version. The C++ REST SDK is a Microsoft project for cloud-based client-server communication in native code using a modern asynchronous C++ API design. This project aims to help C++ developers connect to and interact with services.
We added

  • support for basic authentication on Linux
  • static library support for Windows XP
  • a project for compiling as a static lib on Windows
  • a websocket_client_config option for SSL verify mode
  • a host-based connection pool map on non-Windows http_clients

We fixed issues in the Linux, OSX and Android versions. Here is the set of changes going into this release:

  • Merged #70 & #65 which should fix building on CentOS/RedHat.
  • #143 Work around SSL compression methods memory leak in ASIO.
  • #82 Fixed ambiguous call to begin when using with boost library.
  • #117 Fix header reading on linux listener using HTTPS.
  • #97 Add support for basic authentication.
  • #206 remove warnings-errors for system-headers under linux; honour http_proxy env-variable.
  • #114 Removed redundant std::move() that was causing errors on Xcode 7.3 gcc.
  • #140 Fix returning std::move causing build failure on osx.
  • #137 Fix android build script for linux, remove libiconv dependency.
  • Use Nuget packages built with Clang 3.8 (VS 2015 Update3) and Android NDK 11rc. Update built scripts for the same.
  • #150 Add static library for windows xp.
  • #115 Added projects which target v140_xp to resolve Issue#113.
  • #71 Add a project for compiling as a static lib.
  • #102 Added websocket_client_config option for ssl verify mode.
  • #217 Fixed race condition in Casablanca WinRT Websocket client.
  • #131 Update to include access control allow origin.
  • #156 add host based connection pool map on non windows http_clients.
  • #161 Header parsing assumes whitespace after colon.
  • #146 Fix ambiguous reference to ‘credentials’
  • #149 Some perf improvements for uri related code.

  • #86 Fix obtaining raw string_t pointer from temporary.
  • #96 Fix typo hexidecimal/hexadecimal.
  • #116 Fixing latin1 to UTF-16 conversion.
  • #47 Fixing .then to work with movable-only types.

As always, we trust the community to inform our next steps, so let us know what you need and how to improve Casablanca by creating an issue or a pull request on GitHub.

10/27 Webinar: Managing data and applications just got easier with PowerApps by James Oleinik

MSDN Blogs - Fri, 10/21/2016 - 16:47

Back by popular demand: in next week’s webinar, James Oleinik will show how creating and managing PowerApps applications just got easier. He will introduce some exciting new enhancements that make your applications both easier to manage and more performant, drill into how the new features can simplify your lifecycle management, and cover the new PowerApps administration experience that will make managing your PowerApps development efforts a breeze.

When: October 27, 2016  10:00 AM – 11:00 AM


About the presenter:

James Oleinik

I’m a PM on the Microsoft PowerApps team and will be presenting. Check out the PowerApps preview today:

Introducing Microsoft Certification Badges

MSDN Blogs - Fri, 10/21/2016 - 13:37

Beginning October 21, 2016, Microsoft Certified Professionals (MCPs) who achieve certain certifications or pass select exams will be awarded Microsoft Certification badges. These web-enabled badges are trusted and verifiable representations of the Microsoft Certifications you’ve achieved. They allow you to easily share the details of your skills with employers or clients, and broadly across social networks. To find out which exams and certifications are included during the launch phase, and for detailed instructions on claiming your badge, please visit the Microsoft Certification Badge webpage.

Build new web and mobile solutions for your customers with Microsoft PowerApps and Microsoft Flow

MSDN Blogs - Fri, 10/21/2016 - 12:32

To complement the upcoming launch of Dynamics 365, including PowerApps and Flow, we are happy to announce a free Partner Roadshow with instructors from the PowerApps and Flow product team.  Please use the attached Partner Invitation to share with your Partner community!


Dates and Locations:

Chevy Chase, MD: December 5, 2016
Boston, MA: December 7, 2016
Irvine, CA: December 7, 2016
San Francisco, CA: December 8, 2016
Bellevue, WA: December 9, 2016
New York, NY: December 9, 2016
Minneapolis, MN: January 16, 2017
Dallas, TX: January 17, 2017
Chicago, IL: January 18, 2017




Training Topics:

  • Overview of Microsoft PowerApps & Microsoft Flow
  • SharePoint Online Integration
  • Common Data Model
  • Hands-on Exercises
  • Product Roadmap
  • Partner Solutions and Business Opportunities
  • Question and Answer Session



Who should attend: technical consultants, application consultants, presales and similar roles. Also open to marketing and sales roles.

What’s the Difference Between an Azure Security Center Alert and an MSRC Notification?

MSDN Blogs - Fri, 10/21/2016 - 12:24

This week someone wrote to me and asked about an email he received from Microsoft regarding a possible security incident.

Of course, since I always have Azure Security Center in mind, my first question was “did the email come from an Azure Security Center alert”?

The reason why I asked about it being an alert is that it’s possible to configure Azure Security Center to forward alerts to one or more email addresses.

You can see how to do that in the figure below. Just click Email notifications in the Policy components section. That will bring up the Email notifications pane, where you enter your Security contact emails. (BTW – there’s no practical limit on the number of emails, but don’t enter so many that it looks like SPAM – not that you would!) And although this pane is intended for email notification, we also provide you the opportunity to give us a phone number that we can use if we need to contact you about high security alerts.

Also note in the Send me emails section that you can turn off/on Send me emails about alerts (which is currently in Preview) and Send email also to subscription owners (which is also in Preview). Note that “me” can actually be many “me’s” (but not mini-me’s), because the alert will go to all email addresses listed in the Security contact emails text box.

When you do get an alert, you can check it out as seen in the figure below. Just for fun (since we’re here), I highlighted an interesting alert that was generated because of a modified system binary that was found by a crash dump analysis performed by Azure Security Center.


When we look at the details of the alert, we see that Azure Security Center detected an image mismatch on a module loaded in memory during the analysis of a crash dump. If the presence of this module is unexpected, it may indicate a system compromise. The process name was lync.exe – it isn’t good to have this kind of mismatch in what would otherwise be a trusted application. Notice that we provide some remediation steps too.

Anyhow, back to the story.

The email my friend received wasn’t generated by Azure Security Center – it came directly from the Microsoft Security Response Center (MSRC). These emails are different from those you get from Azure Security Center: the Security Center emails are automated, while emails coming from the MSRC are sent manually by the MSRC. The MSRC performs continuous security monitoring of the Azure fabric and receives threat intelligence feeds from multiple sources. When the MSRC determines that your data has been accessed by unauthorized entities (i.e., attackers), or that you’re doing something in Azure that you shouldn’t be doing, an incident response manager will contact you via email (or phone, or maybe even both).

So now you know – there’s a difference between emails you get from Azure Security Center and the MSRC – they’re both important – but if you get one from the MSRC, make sure you read it right away!

BTW – if you want to learn more about Azure Security Center alerts, check out Managing and Responding to Security Alerts in Azure Security Center.



Tom Shinder
Program Manager, Azure Security
@tshinder | Facebook | LinkedIn | Email | Web | Bing me! | GOOG me

Clippers basketball is back and battling in NWAC West

SPSCC Posts & Announcements - Fri, 10/21/2016 - 11:49

Clippers Athletics today announced season outlooks for its Men’s and Women’s Basketball teams for the 2016-17 year.  Last season, both teams achieved personal record bests in league standings and several student athletes were honored for their achievements on and off the court.  Resonating from coaches Landon (men’s team) and Moore (women’s team) was excitement, high expectations, and confidence for their balanced teams of Clippers rookies and returners.


C++ / VBA – How to send a COM object from VBA to a C++ DLL via PInvoke

MSDN Blogs - Fri, 10/21/2016 - 11:05

Today I would like to present a quite uncommon scenario, which involves requesting a COM object from VBA and forwarding it
through PInvoke to another C++ DLL.

The puzzling part is that if we work with managed COM DLLs, everything runs properly, but if we’re using C++ DLLs, Office will crash with an Access Violation!

Here’s some background info about the components involved …

COM object originates from a simple C++ COM DLL

This is the interface from which VBA gets its COM object. I created it based on an MSDN sample: In-process C++ COM server (CppDllComServer).

Besides the usual COM interfaces (IUnknown, IDispatch), it exposes an interface named ISimpleObject and a COM Class
that implements it.

/****************************** Module Header ******************************
Module Name:  SimpleObject.h
Project:      CppDllCOMServer
Copyright (c) Microsoft Corporation.

This source is subject to the Microsoft Public License.
All other rights reserved.

THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND,
EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.
***************************************************************************/

#pragma once

#include
#include "CppDllCOMServer_h.h"

class SimpleObject : public ISimpleObject
{
public:
    // IUnknown
    IFACEMETHODIMP QueryInterface(REFIID riid, void **ppv);
    IFACEMETHODIMP_(ULONG) AddRef();
    IFACEMETHODIMP_(ULONG) Release();

    // IDispatch
    IFACEMETHODIMP GetTypeInfoCount(UINT *pctinfo);
    IFACEMETHODIMP GetTypeInfo(UINT itinfo, LCID lcid, ITypeInfo **pptinfo);
    IFACEMETHODIMP GetIDsOfNames(REFIID riid, LPOLESTR *rgszNames, UINT cNames,
        LCID lcid, DISPID *rgdispid);
    IFACEMETHODIMP Invoke(DISPID dispidMember, REFIID riid, LCID lcid, WORD wFlags,
        DISPPARAMS *pdispParams, VARIANT *pvarResult, EXCEPINFO *pExcepInfo,
        UINT *puArgErr);

    // ISimpleObject
    IFACEMETHODIMP get_FloatProperty(FLOAT *pVal);
    IFACEMETHODIMP put_FloatProperty(FLOAT newVal);
    IFACEMETHODIMP HelloWorld(BSTR *pRet);
    IFACEMETHODIMP GetProcessThreadID(LONG *pdwProcessId, LONG *pdwThreadId);

    SimpleObject();



VBA accesses the COM DLL by adding it via Tools > References

Sub test()
    Dim comObj As CppDllCOMServerLib.SimpleObject
    Set comObj = New SimpleObject
    Debug.Print comObj.HelloWorld
End Sub

Finally, VBA forwards the COM Object to a different C++ DLL via PInvoke

It’s just a simple C++ DLL which exposes a function that takes an ISimpleObject COM pointer.


#include
#include

namespace SimplePInvokeDLL
{
    #import "C:\..\C++ COM DLLDebug\CppDllCOMServer.dll"
    using namespace CppDllCOMServerLib;

    extern "C"
    {
        __declspec(dllexport) int __stdcall API_DummyCOMCall(ISimpleObject* iso);

        // Returns random number
        __declspec(dllexport) long API_DummyCall();
    }
}


#include "stdafx.h"
#include "Pinvoke_CppDllCOMServer.h"
#include <stdlib.h>
#include <time.h>

using namespace std;

namespace SimplePInvokeDLL
{
    __declspec(dllexport) long SimplePInvokeDLL::API_DummyCall()
    {
        /* initialize random seed: */
        srand(GetTickCount());

        /* generate a random number */
        return rand();
    }

    __declspec(dllexport) int __stdcall API_DummyCOMCall(ISimpleObject* iso)
    {
        // _bstr_t class
        /* A _bstr_t object encapsulates the BSTR data type. The class manages
           resource allocation and deallocation through function calls to
           SysAllocString and SysFreeString and other BSTR APIs when appropriate.
           The _bstr_t class uses reference counting to avoid excessive overhead. */
        _bstr_t strComResult = iso->HelloWorld();
        _bstr_t strLocal = _bstr_t(L"HelloWorld");

        return strComResult == strLocal;
    }
}

As I wrote before, the goal of our exercise is to obtain a COM pointer from the first DLL and forward it to the second DLL, via VBA …

Declare Function API_DummyCOMCall Lib "C:...DebugPinvoke_CppDllCOMServer.dll" _
    (simpleObj As CppDllCOMServerLib.SimpleObject) As Integer

Sub test()
    Dim comObj As CppDllCOMServerLib.SimpleObject
    Set comObj = New SimpleObject
    Debug.Print comObj.HelloWorld
    Debug.Print API_DummyCOMCall(comObj)
End Sub


… the first part works, and we’re getting a working COM object from CppDllCOMServerLib:


Now, if we attempt to send this COM object via a Pinvoke call…


Let’s attach to the PInvoke DLL and see why we crash

First, we need to restart Excel and pause the code just before we make the PInvoke call.

Then we open the Visual Studio PInvoke C++ DLL project and attach to the running instance of Excel. You’ll also need to add a breakpoint on the PInvoke function which VBA uses to send the COM pointer.


Finally, we switch back to Excel and resume executing the code. We’ll soon hit the PInvoke C++ callback function, and if we take a closer look at the input parameter, we’ll see that it points at an IUnknown VTable.

The only problem here is that this VTable has a couple of NULL pointers inside. Now, I don’t know enough about COM to say for sure that those bad addresses are the cause, but if we step through the code, we’ll see that when trying to execute the COM call, we end up trying to execute code from address zero, which is not very nice.



My idea was to find a proper way of sending COM pointers from VBA, and to avoid NULL VTable pointers inside the PInvoke DLL callback function. After asking around, I was told that VBA has a special operator which returns the address of an object, documented in “How To Get the Address of Variables in Visual Basic”:



ObjPtr takes an object variable name as a parameter and obtains the address of the interface referenced by this object variable. One scenario of using this function is when you need to do a collection of objects. By indexing the object using its address as the key, you can get faster access to the object than walking the collection and using the Is operator. In many cases, the address of an object is the only reliable thing to use as a key.

Example:

objCollection.Add MyObj1, CStr(ObjPtr(MyObj1))
objCollection.Remove CStr(ObjPtr(MyObj1))



So, we will have to modify our PInvoke declaration and the way we send the COM object. Notice that since ObjPtr returns an address, we need to change the PInvoke method’s input parameter type from SimpleObject to LongPtr.

Declare Function API_DummyCOMCall Lib "C:...\Debug\Pinvoke_CppDllCOMServer.dll" _
    (ByVal simpleObj As LongPtr) As Integer

Sub test()
    Dim comObj As CppDllCOMServerLib.SimpleObject
    Set comObj = New SimpleObject

    Debug.Print comObj.HelloWorld
    Debug.Print API_DummyCOMCall(ObjPtr(comObj))
End Sub


What’s that? VBA complains about the calling convention … hmmm.


After some research I found that C++ DLL functions invoked via PInvoke which take parameters must use the “__stdcall” calling convention, so that cleaning up the stack is done by the callee. As a side effect, we lose our nicely formatted function names and get decorated names instead …


Let’s fix the VBA code to reflect that change in calling conventions.

Declare Function API_DummyCOMCall Lib "C:...\Debug\Pinvoke_CppDllCOMServer.dll" _
    Alias "_API_DummyCOMCall@4" (ByVal simpleObj As LongPtr) As Integer

Sub test()
    Dim comObj As CppDllCOMServerLib.SimpleObject
    Set comObj = New SimpleObject

    Debug.Print comObj.HelloWorld
    Debug.Print API_DummyCOMCall(ObjPtr(comObj))
End Sub
Success! We can now send COM objects from VBA to C++ DLLs and not crash Office in the process :).

Source code download link: Example – C++ COM DLL send COM pointer from VBA with PInvoke



Thank you for reading my article! If you have liked it, please use the rating button.

P.S. I can’t always manage to reply to your comments as fast as I’d like. Just drop me an email at cristib-at-microsoft-dot-com,
should you need help with understanding or getting something in my blog to work. 

DISCLAIMER: Please note that the code I have offered is just a proof of concept and should not be put into production without a thorough testing! Microsoft is not responsible if your users will lose data because of this programming solution. It’s your responsibility to test it before deployment in your organization. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that. You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of the Sample Code.

Integrating PolyBase with Cloudera using Active Directory Authentication

MSDN Blogs - Fri, 10/21/2016 - 09:46

This article outlines the steps to use PolyBase in SQL Server 2016 (including R-Services) with a Cloudera cluster, and to set up authentication using Active Directory in both SQL Server 2016 and Cloudera.

Prerequisites:

  1. Cloudera Cluster
  2. Active Directory with Domain Controller
  3. SQL Server 2016 with PolyBase and R-Services installed

NOTE: We have tested the configuration using the Cloudera Cluster 5.5 running on CentOS 6.6, SQL Server 2016 running on Windows Server 2012 R2 and Active Directory with Domain Controller running on Windows Server 2012 R2. Other Windows Server and CentOS operating systems might also work in this configuration.

All the prerequisites above must be in the same network and domain, say CORP.CONTOSO.COM. After the prerequisites are completed, we will follow the steps listed below, in order:

  1. Connecting SQL to AD
  2. Connecting Cloudera to AD
  3. Connecting PolyBase to Cloudera

Connecting SQL 2016 with AD

Since SQL 2016 and the DC are in the same domain (CORP.CONTOSO.COM), you should be able to create a new login in SQL Server from an existing user in CORP.CONTOSO.COM.

Connecting Cloudera with AD

For all usernames and principals, we will use suffixes like Cluster14 for name scalability.

  1. Active Directory setup:
     1. Install OpenLDAP utilities (openldap-clients on RHEL/CentOS) on the host of the Cloudera Manager server, and install the Kerberos client (krb5-workstation on RHEL/CentOS) on all hosts of the cluster. This step requires an internet connection on the Hadoop servers; if there is no internet connection, you can download the RPMs and install them manually.

        sudo yum -y install openldap-clients krb5-workstation
        sudo yum -y install krb5-workstation

     2. Apply the JCE Unlimited Strength Jurisdiction Policy Files. Download the Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy Files from Oracle, making sure to download the correct policy file updates for your version of Java 7 or Java 8. Uncompress and extract the downloaded file; it includes a Readme.txt and two .jar files with the same names as the existing policy files. Locate the two existing policy files, local_policy.jar and US_export_policy.jar, in JAVA_HOME/lib/security/ and replace them with the unlimited strength policy files you extracted.

We will use the wizard in Cloudera Manager to enable Active Directory Authentication. The 9 steps involved in the “Enable Kerberos” Wizard are provided through the following screenshots (use relevant values for your own cluster and AD):

You can view the credentials generated by Cloudera in Active Directory under OU=Hadoop, OU=CORP, DC=CONTOSO, DC=COM (we gave the prefix “cluster14” in step 2).

Once Kerberos is successfully enabled, let us use kinit to obtain a ticket in the cache and then list the directories in HDFS:

kinit hdfsCluster14@CORP.CONTOSO.COM
hadoop fs -ls /

If the above command is successful, then we have configured AD Authentication for Cloudera!

Create a folder in HDFS for PolyBase tables (say, cdh):

hadoop fs -mkdir /cdh

NOTE: Make sure the hadoop.rpc.protection setting in HDFS is set to Authentication:

Currently there is a known issue: setting this to “integrity” or “privacy” will result in failures to connect from PolyBase to HDFS. You will see an error message like the following:

EXTERNAL TABLE access failed due to internal error: 'Java exception raised on call to HdfsBridge_IsDirExist: Error [Failed on local exception: Couldn't setup connection] occurred while accessing external file.'

Connecting PolyBase to Cloudera

Run the following command to confirm that PolyBase has been successfully installed. If PolyBase is installed, the query returns 1; otherwise, 0.

SELECT SERVERPROPERTY ('IsPolybaseInstalled') AS IsPolybaseInstalled;

Run sp_configure (Transact-SQL) ‘hadoop connectivity’ and set an appropriate value. To find the value, see PolyBase Connectivity Configuration (Transact-SQL).

sp_configure 'hadoop connectivity', 6;
sp_configure 'allow polybase export', 1;
RECONFIGURE;

You must restart SQL Server using services.msc. Restarting SQL Server restarts these services:

  • SQL Server PolyBase Data Movement Service
  • SQL Server PolyBase Engine

In the following location, set the appropriate values in the configuration files from the Cloudera Cluster settings:

C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Binn\Polybase\Hadoop\conf

core-site.xml:

<property>
  <name>polybase.kerberos.realm</name>
  <value>CORP.CONTOSO.COM</value>
</property>
<property>
  <name>polybase.kerberos.kdchost</name>
  <value>ACTIVEDIRECTORY.CORP.CONTOSO.COM</value>
</property>
<property>
  <name>hadoop.security.authentication</name>
  <value>KERBEROS</value>
</property>

hdfs-site.xml:

<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hdfsCluster14/_HOST@CORP.CONTOSO.COM</value>
</property>

mapred-site.xml:

<property>
  <name>mapreduce.jobhistory.principal</name>
  <value>mapred/_HOST@CORP.CONTOSO.COM</value>
</property>
<property>
  <name>mapreduce.jobhistory.address</name>
  <value><HOSTNAME and port of YARN JobHistory Server></value>
</property>

yarn-site.xml:

<property>
  <name>yarn.application.classpath</name>
  <value>$HADOOP_CLIENT_CONF_DIR,$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_YARN_HOME/*,$HADOOP_YARN_HOME/lib/*</value>
</property>
<property>
  <name>yarn.resourcemanager.principal</name>
  <value>yarnCluster14/_HOST@CORP.CONTOSO.COM</value>
</property>

Now, we are ready to use PolyBase – let’s try creating an external table:

-- 1: Create a master key on the database.
--    This is required to encrypt the credential secret.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pas5w0rd_';

-- 2: Create a database scoped credential for Kerberos-secured Hadoop clusters.
-- IDENTITY: the Kerberos user name.
-- SECRET: the Kerberos password.
CREATE DATABASE SCOPED CREDENTIAL myCredObject
WITH IDENTITY = 'myHdfsUser', SECRET = 'P455w0rd!#';

-- 3: Create an external data source.
-- LOCATION (required): Hadoop Name Node IP address and port.
-- RESOURCE_MANAGER_LOCATION (optional): Hadoop Resource Manager location, to enable pushdown computation.
-- CREDENTIAL (optional): the database scoped credential created above.
CREATE EXTERNAL DATA SOURCE clouderaCluster14 WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://CLUSTER14.CORP.CONTOSO.COM:8020',
    RESOURCE_MANAGER_LOCATION = 'CLUSTER14.CORP.CONTOSO.COM:8032',
    CREDENTIAL = myCredObject
);

-- 4: Create an external file format.
CREATE EXTERNAL FILE FORMAT CsvFileFormat WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', USE_TYPE_DEFAULT = TRUE)
);

-- 5: Create an external table pointing to data stored in Hadoop.
-- LOCATION: path to the file or directory that contains the data (relative to HDFS root).
CREATE EXTERNAL TABLE [dbo].[CarSensor_Data] (
    [SensorKey] int NOT NULL,
    [CustomerKey] int NOT NULL,
    [GeographyKey] int NULL,
    [Speed] float NOT NULL,
    [YearMeasured] int NOT NULL
)
WITH (
    LOCATION = '/cdh/',
    DATA_SOURCE = clouderaCluster14,
    FILE_FORMAT = CsvFileFormat
);

-- 6: Insert some data into the external table and view the data.
INSERT INTO [dbo].[CarSensor_Data] VALUES (1,1,1,40,2011);
SELECT * FROM [dbo].[CarSensor_Data];

The above data will be stored in CSV format in HDFS; you can browse the demo folder in HDFS to find the contents.

Now you can work with the table [dbo].[CarSensor_Data] as a normal table in SQL, but the data storage will be in HDFS.

You can also import existing data in HDFS as a SQL table to leverage powerful SQL features like Columnstore Indexes and R-Services (In-Database).

Here is a simple example of using rxSummary on the external table [dbo].[CarSensor_Data]

exec sp_execute_external_script
    @language = N'R',
    @script = N'print(rxSummary(~., InputDataSet))',
    @input_data_1 = N'select * from [dbo].[CarSensor_Data]'

REFERENCES

Enabling Kerberos Authentication Using the Wizard

Create the HDFS Superuser

Direct Active Directory Integration for Kerberos Authentication

Quickly Configure Kerberos for Your Apache Hadoop Cluster

Integrating Cloudera cluster with Active Directory

PolyBase Guide

PolyBase Installation

Get Started with PolyBase

PolyBase Connectivity Configuration

PolyBase Configuration

PolyBase Setup Errors and Possible Solutions

Evergreen joins Jazz Band for Duke Ellington music Dec. 2

SPSCC Posts & Announcements - Fri, 10/21/2016 - 09:44

South Puget Sound Community College (SPSCC) today announced a joint performance by the SPSCC Jazz Band and The Evergreen Singers, featuring the music of Duke Ellington and His Orchestra.

The SPSCC Jazz Band is joined by The Evergreen Singers, led by Marla Elliott from The Evergreen State College.


ArcGIS for Server on Microsoft Azure Government

MSDN Blogs - Fri, 10/21/2016 - 09:00

Esri is excited to announce the availability of ArcGIS for Server on the Microsoft Azure Government Cloud. ArcGIS for Server will allow you to deploy leading-edge GIS technology on Microsoft Azure virtual machines.

As the world’s leading enterprise mapping and spatial analytics platform, ArcGIS for Server provides a complete Web GIS environment for mapping and spatial analytics with ready-to-use maps and apps that can be shared and used by everyone in the organization.  ArcGIS for Server easily dovetails with other enterprise systems, including Microsoft Azure SQL and supports Azure security and compliance standards.  Mapping, analysis and geodata products can be readily used in apps for office and field workers, and for engaging and crowdsourcing communities.

Deploying on Microsoft Azure means you don’t have to maintain hardware infrastructure, and you can create or remove sites based on demand. With this new offering, customers can deploy ArcGIS for Server on Microsoft Azure sites from images on Microsoft Azure Marketplace.

For more information, a few resources are listed below.

We welcome your comments and suggestions to help us continually improve your Azure Government experience. To stay up to date on all things Azure Government, be sure to subscribe to our RSS feed and to receive emails, click “Subscribe by Email!” on the Azure Government Blog. To experience the power of Azure Government for your organization, sign up for an Azure Government Trial.

Happy #Friday Five!

MSDN Blogs - Fri, 10/21/2016 - 07:00

Changing Domain Users’ “User Logon Names” and UPNs

Pete Long is a Technical Consultant working in the North East of England. Previously he’s worked in IT Project management, and as a Consultant for solution providers and channel partners. Pete is an IT Pro with 15 years of both infrastructure and networking experience. One week he may be fitting a firewall for a small SMB, and the following week doing major Domain and Exchange migrations for a multi thousand seat network.

Follow him on Twitter @PeteNetLive



Microsoft Advanced Threat Analytics (ATA) – Attack Simulation and Demo

Santhosh Sivarajan is a recognized expert in Microsoft technologies. He has extensive experience working on enterprise and cloud Security, identity and access management, and privileged access and account Management projects and solutions. He is the author of two books entitled Windows Server Security and Migration from Windows Server 2008 to Windows Server. He has also published hundreds of articles on various technology sites. Microsoft has recognized Santhosh with the MVP award multiple times for his exceptional contribution to the technical community.

Follow him on Twitter @Santhosh_Sivara


How to set up Angular 2 and Webpack in Visual Studio with ASP.NET Core

Fabian Gosebrink is a professional software engineer, Microsoft MVP, and Microsoft Technology Ambassador in Switzerland. He is also a Microsoft Certified Specialist in web application development and a regular speaker at Microsoft events in Switzerland. He helps companies and projects build web applications with AngularJS, Angular2, ASP.NET, ASP.NET Core, and all the build tools around them. Fabian is very into new technologies and helps to grow his community by leading the biggest German-speaking C# forum.

Follow him on Twitter @FabianGosebrink


How I use Azure Logic App, API App and Function App in my life

Frank Boucher is a trusted Microsoft Azure professional with over 15 years of experience in the IT industry. He’s leveraged his expertise in Microsoft Azure in a development role at Lixar IT, an Ottawa-based software company. At work, he leads a dedicated team of developers to advance technology in the mobile, air, and telecommunication industries. Outside of work, he is a sought-after speaker, author, and trusted collaborator on Microsoft Azure.

Follow him on Twitter @fboucheros




Channel 9 Video: .NET Conf UY v2016 Event Summary

    Fabian Fernandez is CEO & Co-Founder of Kaizen Softworks and Organizer & Co-Founder of .NET Conf UY. The 28-year-old is an Agile practitioner, and loves to stay up to date on tech news and startups. In his spare time, he plays guitar and is an extreme sports fanatic. Fabian’s been a Microsoft MVP since 2015. He is based in Uruguay. 


    Follow him on Twitter @kzfabi


    Is there anything better than GetThreadTimes for obtaining per-thread CPU usage information?

    MSDN Blogs - Fri, 10/21/2016 - 07:00

    A customer was using the GetThreadTimes function for high-resolution
    profiling of performance-sensitive code, but found that its accuracy is
    rather poor. They were hoping there would be something more along the
    lines of a QueryPerformanceCounter that reported only CPU time consumed
    by a particular thread, rather than by the system in general.

    Fortunately, there is.


    QueryThreadCycleTime gives you the CPU cycles consumed by a particular
    thread. This includes time spent both in user mode and in kernel mode.

    Note, however, that these values are reported directly from the CPU,
    using mechanisms like RDTSC or the performance monitor control register.
    This means that the actual results are at the mercy of whatever the CPU
    manufacturer decides the CPU cycle counter means. Maybe the cycles
    correspond to wall clock time; maybe they don’t.

    Free data sets for Azure Machine Learning

    MSDN Blogs - Fri, 10/21/2016 - 03:19


    One of the key things students need when learning how to use Microsoft Azure Machine Learning is access to sample data sets and experiments.

    At Microsoft we have made a number of  sample data sets available. These data sets are used by the sample models in the Azure Cortana Intelligence Gallery.

    Some of these data sets are available in Azure Blob storage, so they can be directly linked to Azure ML experiments, while others are available in CSV format.

    For these data sets the below list provides a direct link.

    You can use these data sets in your experiments by using the Import Data module.

    The rest of these sample data sets are listed under Saved Datasets in the module palette to the left of the experiment canvas when you open or create a new experiment in ML Studio. You can use any of these data sets in your own experiment by dragging it to your experiment canvas.
    Try Azure Machine Learning for free

    No credit card or Azure subscription needed. Get started now >

    Here are some of the FREE Data sets available to use

    Adult Census Income Binary Classification dataset
    A subset of the 1994 Census database, using working adults over the age of 16 with an adjusted income index of > 100.

    Usage: Classify people using demographics to predict whether a person earns over 50K a year.

    Related Research: Kohavi, R., Becker, B., (1996). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    Airport Codes Dataset
    U.S. airport codes.

    This dataset contains one row for each U.S. airport, providing the airport ID number and name along with the location city and state.

    Automobile price data (Raw)
    Information about automobiles by make and model, including the price, features such as the number of cylinders and MPG, as well as an insurance risk score.

    The risk score is initially associated with auto price and then adjusted for actual risk in a process known to actuaries as symboling. A value of +3 indicates that the auto is risky, and a value of -3 that it is probably pretty safe.

    Usage: Predict the risk score by features, using regression or multivariate classification.

    Related Research: Schlimmer, J.C. (1987). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    Bike Rental UCI dataset
    UCI Bike Rental dataset that is based on real data from Capital Bikeshare company that maintains a bike rental network in Washington DC.

    The dataset has one row for each hour of each day in 2011 and 2012, for a total of 17,379 rows. The range of hourly bike rentals is from 1 to 977.

    Bill Gates RGB Image
    Publicly-available image file converted to CSV data.

    The code for converting the image is provided in the Color quantization using K-Means clustering model detail page.

    Blood donation data
    A subset of data from the blood donor database of the Blood Transfusion Service Center of Hsin-Chu City, Taiwan.

    Donor data includes the months since the last donation, the frequency (total number of donations), the time since the first donation, and the amount of blood donated.

    Usage: The goal is to predict via classification whether the donor donated blood in March 2007, where 1 indicates a donor during the target period, and 0 a non-donor.

    Related Research: Yeh, I.C., (2008). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    Yeh, I-Cheng, Yang, King-Jang, and Ting, Tao-Ming, “Knowledge discovery on RFM model using Bernoulli sequence, “Expert Systems with Applications, 2008,

    Book Reviews from Amazon
    Reviews of books in Amazon, taken from the website by University of Pennsylvania researchers (sentiment). See the research paper, “Biographies, Bollywood, Boom-boxes and Blenders: Domain Adaptation for Sentiment Classification” by John Blitzer, Mark Dredze, and Fernando Pereira; Association of Computational Linguistics (ACL), 2007.

    The original dataset has 975K reviews with rankings 1, 2, 3, 4, or 5. The reviews were written in English and are from the time period 1997-2007. This dataset has been down-sampled to 10K reviews.

    Breast cancer data
    One of three cancer-related datasets provided by the Oncology Institute that appears frequently in machine learning literature. Combines diagnostic information with features from laboratory analysis of about 300 tissue samples.

    Usage: Classify the type of cancer, based on 9 attributes, some of which are linear and some are categorical.

    Related Research: Wohlberg, W.H., Street, W.N., & Mangasarian, O.L. (1995). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    Breast Cancer Features
    The dataset contains information for 102K suspicious regions (candidates) of X-ray images, each described by 117 features. The features are proprietary and their meaning is not revealed by the dataset creators (Siemens Healthcare).

    Breast Cancer Info
    The dataset contains additional information for each suspicious region of X-ray image. Each example provides information (e.g., label, patient ID, coordinates of patch relative to the whole image) about the corresponding row number in the Breast Cancer Features dataset. Each patient has a number of examples. For patients who have a cancer, some examples are positive and some are negative. For patients who don’t have a cancer, all examples are negative. The dataset has 102K examples. The dataset is biased, 0.6% of the points are positive, the rest are negative. The dataset was made available by Siemens Healthcare.

    CRM Appetency Labels Shared
    Labels from the KDD Cup 2009 customer relationship prediction challenge (orange_small_train_appetency.labels).

    CRM Churn Labels Shared
    Labels from the KDD Cup 2009 customer relationship prediction challenge (orange_small_train_churn.labels).

    CRM Dataset Shared
    This data comes from the KDD Cup 2009 customer relationship prediction challenge (

    The dataset contains 50K customers from the French Telecom company Orange. Each customer has 230 anonymized features, 190 of which are numeric and 40 are categorical. The features are very sparse.

    CRM Upselling Labels Shared
    Labels from the KDD Cup 2009 customer relationship prediction challenge (orange_large_train_upselling.labels).

    Energy Efficiency Regression data
    A collection of simulated energy profiles, based on 12 different building shapes. The buildings are differentiated by 8 features, such as glazing area, glazing area distribution, and orientation.

    Usage: Use either regression or classification to predict the energy efficiency rating as one of two real-valued responses. For multi-class classification, round the response variable to the nearest integer.

    Related Research: Xifara, A. & Tsanas, A. (2012). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    Flight Delays Data
    Passenger flight on-time performance data taken from the TranStats data collection of the U.S. Department of Transportation (On-Time).

    The dataset covers the time period April-October 2013. Before uploading to Azure ML Studio, the dataset was processed as follows:

    • The dataset was filtered to cover only the 70 busiest airports in the continental US
    • Cancelled flights were labeled as delayed by more than 15 minutes
    • Diverted flights were filtered out
    • The following columns were selected: Year, Month, DayofMonth, DayOfWeek, Carrier, OriginAirportID, DestAirportID, CRSDepTime, DepDelay, DepDel15, CRSArrTime, ArrDelay, ArrDel15, Cancelled

    Flight on-time performance (Raw)
    Records of airplane flight arrivals and departures within United States from October 2011.

    Usage: Predict flight delays.

    Related Research: From US Dept. of Transportation

    Forest fires data
    Contains weather data, such as temperature and humidity indices and wind speed, from an area of northeast Portugal, combined with records of forest fires.

    Usage: This is a difficult regression task, where the aim is to predict the burned area of forest fires.

    Related Research: Cortez, P., & Morais, A. (2008). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    [Cortez and Morais, 2007] P. Cortez and A. Morais. A Data Mining Approach to Predict Forest Fires using Meteorological Data. In J. Neves, M. F. Santos and J. Machado Eds., New Trends in Artificial Intelligence, Proceedings of the 13th EPIA 2007 – Portuguese Conference on Artificial Intelligence, December, Guimarães, Portugal, pp. 512-523, 2007. APPIA, ISBN-13 978-989-95618-0-9. Available at:

    German Credit Card UCI dataset
    The UCI Statlog (German Credit Card) dataset (Statlog+German+Credit+Data), using the file.

    The dataset classifies people, described by a set of attributes, as low or high credit risks. Each example represents a person. There are 20 features, both numerical and categorical, and a binary label (the credit risk value). High credit risk entries have label = 2, low credit risk entries have label = 1. The cost of misclassifying a low risk example as high is 1, whereas the cost of misclassifying a high risk example as low is 5.

    IMDB Movie Titles
    The dataset contains information about movies that were rated in Twitter tweets: IMDB movie ID, movie name and genre, production year. There are 17K movies in the dataset. The dataset was introduced in the paper “S. Dooms, T. De Pessemier and L. Martens. MovieTweetings: a Movie Rating Dataset Collected From Twitter. Workshop on Crowdsourcing and Human Computation for Recommender Systems, CrowdRec at RecSys 2013.”

    Iris two class data
    This is perhaps the best known database to be found in the pattern recognition literature. The data set is relatively small, containing 50 examples each of petal measurements from three iris varieties.

    Usage: Predict the iris type from the measurements.

    Related Research: Fisher, R.A. (1988). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    Movie Tweets
    The dataset is an extended version of the Movie Tweetings dataset. The dataset has 170K ratings for movies, extracted from well-structured tweets on Twitter. Each instance represents a tweet and is a tuple: user ID, IMDB movie ID, rating, timestamp, number of favorites for this tweet, and number of retweets of this tweet. The dataset was made available by A. Said, S. Dooms, B. Loni and D. Tikk for Recommender Systems Challenge 2014.

    MPG data for various automobiles
    This dataset is a slightly modified version of the dataset provided by the StatLib library of Carnegie Mellon University. The dataset was used in the 1983 American Statistical Association Exposition.

    The data lists fuel consumption for various automobiles in miles per gallon, along with information such the number of cylinders, engine displacement, horsepower, total weight, and acceleration.

    Usage: Predict fuel economy based on 3 multivalued discrete attributes and 5 continuous attributes.

    Related Research: StatLib, Carnegie Mellon University, (1993). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science

    Pima Indians Diabetes Binary Classification dataset
    A subset of data from the National Institute of Diabetes and Digestive and Kidney Diseases database. The dataset was filtered to focus on female patients of Pima Indian heritage. The data includes medical data such as glucose and insulin levels, as well as lifestyle factors.

    Usage: Predict whether the subject has diabetes (binary classification).

    Related Research: Sigillito, V. (1990). UCI Machine Learning Repository”. Irvine, CA: University of California, School of Information and Computer Science

    Restaurant customer data
    A set of metadata about customers, including demographics and preferences.

    Usage: Use this dataset, in combination with the other two restaurant data sets, to train and test a recommender system.

    Related Research: Bache, K. and Lichman, M. (2013). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science.

    Restaurant feature data
    A set of metadata about restaurants and their features, such as food type, dining style, and location.

    Usage: Use this dataset, in combination with the other two restaurant data sets, to train and test a recommender system.

    Related Research: Bache, K. and Lichman, M. (2013). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science.

    Restaurant ratings
    Contains ratings given by users to restaurants on a scale from 0 to 2.

    Usage: Use this dataset, in combination with the other two restaurant data sets, to train and test a recommender system.

    Related Research: Bache, K. and Lichman, M. (2013). UCI Machine Learning Repository Irvine, CA: University of California, School of Information and Computer Science.

    Steel Annealing multi-class dataset
    This dataset contains a series of records from steel annealing trials, with the physical attributes (width, thickness, type (coil, sheet, etc.)) of the resulting steel.

    Usage: Predict either of two numeric class attributes, hardness or strength. You might also analyze correlations among attributes.

    Steel grades follow a set standard, defined by SAE and other organizations. You are looking for a specific ‘grade’ (the class variable) and want to understand the values needed.

    Related Research: Sterling, D. & Buntine, W. (NA). UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science.

    A useful guide to steel grades can be found here:

    Telescope data
    Records of high energy gamma particle bursts along with background noise, both simulated using a Monte Carlo process.

    The intent of the simulation was to improve the accuracy of ground-based atmospheric Cherenkov gamma telescopes, using statistical methods to differentiate between the desired signal (Cherenkov radiation showers) and background noise (hadronic showers initiated by cosmic rays in the upper atmosphere).

    The data has been pre-processed to create an elongated cluster with the long axis oriented towards the camera center. The characteristics of this ellipse (often called Hillas parameters) are among the image parameters that can be used for discrimination.

    Usage: Predict whether the image of a shower represents signal or background noise.

    Notes: Simple classification accuracy is not meaningful for this data, since classifying a background event as signal is worse than classifying a signal event as background. To compare classifiers, the ROC graph should be used. The probability of accepting a background event as signal must be below one of the following thresholds: 0.01, 0.02, 0.05, 0.1, or 0.2.

    Also, note that the number of background events (h, for hadronic showers) is underestimated, whereas in real measurements, the h or noise class represents the majority of events.

    Related Research: Bock, R.K. (1995). UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science.

    Weather Dataset
    Hourly land-based weather observations from NOAA (merged data from 201304 to 201310).

    The weather data consists of observations made at airport weather stations over the period April-October 2013. Before uploading to Azure ML Studio, the dataset was processed as follows:

    • Weather station IDs were mapped to corresponding airport IDs
    • Weather stations not associated with the 70 busiest airports were filtered out
    • The Date column was split into separate Year, Month, and Day columns
    • The following columns were selected: AirportID, Year, Month, Day, Time, TimeZone, SkyCondition, Visibility, WeatherType, DryBulbFarenheit, DryBulbCelsius, WetBulbFarenheit, WetBulbCelsius, DewPointFarenheit, DewPointCelsius, RelativeHumidity, WindSpeed, WindDirection, ValueForWindCharacter, StationPressure, PressureTendency, PressureChange, SeaLevelPressure, RecordType, HourlyPrecip, Altimeter
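    Two of the preprocessing steps listed above (mapping station IDs to airports and splitting the date column) can be sketched as follows; the column names, date format, and station-to-airport mapping here are assumptions for illustration, not the actual NOAA schema:

    ```python
    import csv
    import io

    # Tiny synthetic sample standing in for the raw NOAA observations.
    raw = io.StringIO(
        "WBAN,Date,DryBulbCelsius\n"
        "14739,20130405,12.8\n"
        "99999,20130406,7.2\n"
    )

    # Hypothetical weather-station-to-airport mapping; stations not in
    # the mapping (i.e. not among the tracked airports) are filtered out.
    station_to_airport = {"14739": "BOS"}

    rows = []
    for rec in csv.DictReader(raw):
        airport = station_to_airport.get(rec["WBAN"])
        if airport is None:
            continue  # station not associated with a tracked airport
        d = rec["Date"]
        rows.append({
            "AirportID": airport,
            # Split the Date column into separate Year, Month, Day columns.
            "Year": d[:4], "Month": d[4:6], "Day": d[6:8],
            "DryBulbCelsius": rec["DryBulbCelsius"],
        })
    ```
    
    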

    Wikipedia SP 500 Dataset
    Data is derived from Wikipedia, based on articles of each S&P 500 company, stored as XML data.

    Before uploading to Azure ML Studio, the dataset was processed as follows:

    • Extract text content for each specific company
    • Remove wiki formatting
    • Remove non-alphanumeric characters
    • Convert all text to lowercase
    • Known company categories were added

    Note that for some companies an article could not be found, so the number of records is less than 500.
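    The last three cleanup steps above can be sketched as follows (removing the wiki markup itself is more involved and is not shown here):

    ```python
    import re

    def clean(text: str) -> str:
        """Strip non-alphanumeric characters and lowercase the text,
        mirroring the cleanup steps described in the article."""
        text = re.sub(r"[^A-Za-z0-9\s]", " ", text)  # drop non-alphanumerics
        text = re.sub(r"\s+", " ", text).strip()     # collapse whitespace
        return text.lower()

    clean("3M Company & [[conglomerate]]!")  # "3m company conglomerate"
    ```
    
    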

    Downloadable Data Sets in CSV Format

    The dataset contains customer data and indications about their response to a direct mailing campaign. Each row represents a customer. The dataset contains 9 features about user demographics and past behavior, and 3 label columns (visit, conversion, and spend). Visit is a binary column indicating whether a customer visited after the marketing campaign, conversion indicates whether the customer purchased something, and spend is the amount that was spent. The dataset was made available by Kevin Hillstrom for the MineThatData E-Mail Analytics And Data Mining Challenge.

    Features of test examples in the RCV1-V2 Reuters news dataset. The dataset has 781K news articles along with their IDs (first column of the dataset). Each article is tokenized, stopworded, and stemmed. The dataset was made available by David D. Lewis.

    Features of training examples in the RCV1-V2 Reuters news dataset. The dataset has 23K news articles along with their IDs (first column of the dataset). Each article is tokenized, stopworded, and stemmed. The dataset was made available by David D. Lewis.

    Dataset from the KDD Cup 1999 Knowledge Discovery and Data Mining Tools Competition (kddcup99.html).

    The dataset was downloaded and stored in Azure Blob storage (network_intrusion_detection.csv) and includes both training and testing datasets. The training dataset has approximately 126K rows and 43 columns, including the labels; 3 columns are part of the label information, and 40 columns, consisting of numeric and string/categorical features, are available for training the model. The test data has approximately 22.5K test examples with the same 43 columns as in the training data.

    Topic assignments for news articles in the RCV1-V2 Reuters news dataset. A news article can be assigned to several topics. The format of each row is " 1". The dataset contains 2.6M topic assignments. The dataset was made available by David D. Lewis.

    This data comes from the KDD Cup 2010 Student performance evaluation challenge (student performance evaluation). The data used is the Algebra_2008_2009 training set (Stamper, J., Niculescu-Mizil, A., Ritter, S., Gordon, G.J., & Koedinger, K.R. (2010). Algebra I 2008-2009. Challenge data set from KDD Cup 2010 Educational Data Mining Challenge. Find it at downloads.jsp or

    The dataset was downloaded and stored in Azure Blob storage (student_performance.txt) and contains log files from a student tutoring system. The supplied features include problem ID and its brief description, student ID, timestamp, and how many attempts the student made before solving the problem in the right way. The original dataset has 8.9M records; this dataset has been down-sampled to the first 100K rows. The dataset has 23 tab-separated columns of various types: numeric, categorical, and timestamp.

    Patterns & Practices 17 November

    MSDN Blogs - Fri, 10/21/2016 - 01:45

    We are repeating last year's success and have once again invited the Patterns and Practices team from the USA to Norway.

    Saga Kino, Oslo, on 17 November from 08:30 to 15:30.

    We now have the opportunity to spend the day with a couple of the brightest minds on our Patterns and Practices team. We have the pleasure of hosting them in Oslo, and on that occasion we invite architects and developers to a rare event.
    Registration and coffee start at 08:30 at Saga Kino, Sal 1.

    REGISTRATION – Patterns and Practices Architecture Summit in Oslo

    The Microsoft Patterns and Practices (P&P) team was established in 2000 to meet the need for guidance for architects and software developers. P&P is a set of patterns and recommendations, harvested from experience, for designing, developing, deploying, and operating sound programming practices on the Microsoft platform.

    Agenda for the day:
    • Modern cloud fundamentals
    • Resiliency guidance
    • Azure reference architecture (IaaS, PaaS, Hybrid Networking, Identity)
    • Microservices architecture

    Lunch is served at 12:00 and we finish at 15:30.
    Each session contains a great deal of deep technological and architectural insight.
    If you are responsible for software development, you should not miss this event.

    *The event is free and is organized in connection with the LEAP 2017 conference. P&P will also play an important part in LEAP 2017, so if this interests you, you definitely have a lot to look forward to.
    More information about LEAP,

    Useful information:

    Address: Saga Kino Sal 1, Stortingsgata 28, 0161 Oslo
    Inquiries to: Hanne Wulff by e-mail or phone +47 913 17 273

    REGISTRATION – Patterns and Practices Architecture Summit i Oslo

    Azure Resource Manager and Templates in Practice

    MSDN Blogs - Fri, 10/21/2016 - 01:37

    Azure Resource Manager (ARM) has for some time been the primary way to work with compute, data, and platform services in Azure, and it is gradually displacing the classic model. If you use the Azure web portal, you have certainly come across the choice between "Resource Manager" and "Classic", but apart from each presenting a slightly different form, at first glance they do not differ much. The power of ARM fully shows once you start using deployment templates.

    If the Resource Manager concept is new to you, first go through the official documentation to get an idea of how ARM works.

    A deployment template lets you define a complete Azure infrastructure in JSON format. What is that good for? With this blueprint you can automate deployment to different environments: development, testing, production, and so on. Deployment is repeatable and parameterizable at will, so every developer can create their own version of the entire backend for testing. The template source can be kept in a source-code repository and versioned like the rest of the application. And as the icing on the cake, you can plug the template into a Continuous Integration (CI) pipeline as one of the steps after building and testing the application.

    In this article we will look at templates in practice and show how they are created, what to watch out for, and what to take advantage of, based on experience from real projects.

    The size limit of a template is 1 MB after all variables and functions have been expanded; the simplest possible template, however, looks like this:

    {
      "$schema": "",
      "contentVersion": "",
      "parameters": { },
      "variables": { },
      "resources": [ ],
      "outputs": { }
    }

    It really is just a JSON file divided into key sections:

    • parameters holds the values that come into the template from outside: the desired number of web instances, the name of a Storage Account, or, say, the administrator password of a Linux server. Using parameters is not mandatory, but it is advisable. If you leave them out, resources will be deployed with the same names in the same locations every time, and conflicts are likely.
    • variables dynamically builds values for use later in the template. They are composed of parameters, functions, and constants. They are used, for example, to create a unique name from a parameter, or to add prefixes/suffixes to names.
    • resources is typically the largest section, because it defines all the resources to be deployed: web apps, virtual machines, network adapters, databases... everything goes here.
    • Finally, in outputs we say which values we want back from the template. These can be generated unique names, connection strings, or server addresses.

    Although the template is a text file in JSON format, it does not have to be static: using expressions enclosed in square brackets [ ] you can fill in values at deployment time, based on context. Expressions can appear anywhere, with the one condition that they return a valid JSON value.

    Expressions can call built-in functions. Frequently used ones include:

    • parameters() – example: [parameters('webTier')] – returns the value of the "webTier" parameter.
    • variables() – example: [variables('saltedPass')] – returns the value of the "saltedPass" variable.
    • concat() – example: [concat('ABCD', base64(parameters('password')))] – joins the string "ABCD" with the Base64-encoded value of the "password" parameter.
    • resourceGroup() – example: [resourceGroup().location] – returns the "location" value from the JSON object of the Resource Group being deployed into (e.g. "North Europe").
    • uniqueString() – example: [uniqueString(subscription().subscriptionId)] – returns a 13-character hash that is not globally unique; the argument determines the scope of uniqueness, in this case unique within the subscription. [uniqueString(resourceGroup().id)] returns a 13-character hash unique within the Resource Group.
    • reference() – example: [reference(concat('Microsoft.Storage/storageAccounts/', parameters('storageAccountName')), '2016-01-01').primaryEndpoints.blob] – returns the primary Blob Storage URI. reference() obtains its value at runtime, so it cannot be used in variables, but it can be used in outputs. It is not evaluated until the resource exists.

    Function and parameter names are not case-sensitive.
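    To illustrate how these template expressions behave, here is a small Python sketch that mimics concat() and the deterministic, scope-based behaviour of uniqueString(). The hash function used here is an assumption chosen for illustration; Azure's actual algorithm is different and undocumented, so only the behaviour (same input, same 13-character output) is being modelled:

    ```python
    import hashlib

    def unique_string(*scope_ids: str) -> str:
        """Illustrative stand-in for ARM's uniqueString(): a deterministic
        13-character hash derived from the given scope identifiers.
        (Azure's real algorithm differs; this only mimics the
        deterministic, scope-dependent behaviour.)"""
        digest = hashlib.sha256("|".join(scope_ids).encode()).hexdigest()
        return digest[:13]  # ARM returns 13 lowercase alphanumerics

    def concat(*parts: str) -> str:
        """Stand-in for ARM's concat()."""
        return "".join(parts)

    # A made-up subscription id; the same scope always yields the same hash.
    subscription_id = "00000000-0000-0000-0000-000000000000"
    name = concat("jmeno", "-ns", unique_string(subscription_id))
    ```
    
    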

    To get an idea of what a complete simple template looks like, you can browse GitHub and look at one of the examples. Two files are often used for the definition: azuredeploy.json and azuredeploy.parameters.json (these names are customary, but can be anything).

    • azuredeploy.json contains the template itself: parameter declarations, definitions of variables and resources, and so on. On its own it is enough for deployment, provided you supply the parameter values.
    • azuredeploy.parameters.json contains the parameter values that feed into the template, so you do not have to fill them in manually on every deployment. When deploying from the command line, it is passed as an argument.
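    For illustration, this is roughly the shape of an azuredeploy.parameters.json file and how a script might read values out of it. The nameRoot parameter matches the single naming parameter used later in this article; treat the exact schema URL as indicative:

    ```python
    import json

    # Typical shape of an azuredeploy.parameters.json file.
    parameters_file = """
    {
      "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {
        "nameRoot": { "value": "jmeno" }
      }
    }
    """

    doc = json.loads(parameters_file)
    # Each parameter is an object with a "value" key; flatten to name -> value.
    values = {name: p["value"] for name, p in doc["parameters"].items()}
    ```
    
    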

    Working with templates in Visual Studio is made easier by code completion and formatting; there is also an Azure Resource Group project template and a JSON Outline panel.


    You can start building templates in several ways:

    1. In Visual Studio, create a new project of type Azure Resource Group and gradually assemble everything you need through the UI.
    2. Find a ready-made template on GitHub that matches your needs, adapt it, and then add further resources to it.
    3. Build the entire infrastructure in Azure and then run the Automation Script command on the Resource Group, which generates a finished template reflecting exactly the resources you are currently running.

    In our case we chose the third option, because the environment evolved gradually during development. The resulting JSON looks like a complete template and is even parameterized to some extent. Don't be fooled, though: edits will be needed.

    Editing the Template

    Names and Parameters

    The first thing to focus on is names. At the start it is good to decide which names you will generate automatically and which will be chosen as template input. Azure typically generates an individual parameter for every component you deploy.

    We went the opposite way and reduced name entry to a single parameter: nameRoot. The rest is generated automatically. Because the resource names are dynamic, they live in the variables section:

    "variables": {
      "backendName": "[concat(parameters('nameRoot'), '-backend', uniqueString(subscription().subscriptionId))]",
      "hostingPlanName": "[concat(parameters('nameRoot'), '-plan')]",
      "documentDbName": "[concat(toLower(parameters('nameRoot')), uniqueString(subscription().subscriptionId))]",
      "storageAccountName": "[concat(toLower(parameters('nameRoot')), uniqueString(subscription().subscriptionId))]",
      "iotHubName": "[concat(parameters('nameRoot'), '-hub', uniqueString(subscription().subscriptionId))]",
      "serviceBusNamespaceName": "[concat(parameters('nameRoot'), '-ns', uniqueString(subscription().subscriptionId))]",
      "faceApiName": "[concat(parameters('nameRoot'), '-face')]",
      "emotionApiName": "[concat(parameters('nameRoot'), '-emotion')]"
    }

    The way they are composed is mostly the same: the concat() function joins the text given in the nameRoot parameter with a suffix based on the resource type, and in some cases we append, via the uniqueString() function, a hash that is unique to the given subscription (thanks to subscription().subscriptionId).

    We add the unique string to those services whose name will form a URI. For Service Bus, for example, the result for nameRoot = "jmeno" looks like this:


    The template is meant for developers, so we were not entirely strict about enforcing input restrictions. In any case, it is good to keep the following rules in mind, especially when debugging:

    • Storage Account – unique name – 3-24 characters – digits and lowercase letters only
    • Web App – unique name – 2-60 characters – digits, letters, hyphen
    • DocumentDB – unique name – 3-50 characters – digits, lowercase letters, hyphen
    • IoT Hub – unique name – 3-50 characters – digits, lowercase letters, hyphen
    • Service Bus – unique name – 6-50 characters – digits, letters, hyphen; must start with a letter and end with a letter or digit
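    These rules can be expressed as simple validators. The regular expressions below follow the table above; the official Azure limits may differ in detail, so treat this as a debugging aid rather than an authoritative check:

    ```python
    import re

    # Name rules from the table above, one regular expression per resource type.
    NAME_RULES = {
        "storageAccount": re.compile(r"^[a-z0-9]{3,24}$"),
        "webApp":         re.compile(r"^[A-Za-z0-9-]{2,60}$"),
        "documentDb":     re.compile(r"^[a-z0-9-]{3,50}$"),
        "iotHub":         re.compile(r"^[a-z0-9-]{3,50}$"),
        # Service Bus: 6-50 chars, starts with a letter, ends with a letter or digit.
        "serviceBus":     re.compile(r"^[A-Za-z][A-Za-z0-9-]{4,48}[A-Za-z0-9]$"),
    }

    def is_valid_name(resource_type: str, name: str) -> bool:
        """Check a candidate resource name against the rules table."""
        return bool(NAME_RULES[resource_type].fullmatch(name))
    ```
    
    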

    Some resources have names that must stay constant. For example, our application uses Service Bus with a queue named "Events". Because an automated job refers to it, we wanted to keep this name, so we removed the parameter and entered the name as plain text.

    Default Values

    Once the names are sorted out and the parameters reduced, it is time to clean up the definitions of the individual resources. In generated templates, Azure tends to state property values very explicitly, even though they often match the defaults or come from the runtime environment and are not essential to the definition. For example, for Service Bus:

    "properties": {
      "provisioningState": "Succeeded",
      "status": "Active",
      "createdAt": "2016-09-22T09:59:23.153Z",
      "serviceBusEndpoint": "[concat('https://', parameters('namespaces_sbus_name'),'')]",
      "enabled": true,
      "updatedAt": "2016-09-22T09:59:48.983Z"
    }

    All of these values can be removed from the template.


    Alongside cleaning the template, you can also review dependencies. Some resources depend on other resources existing first. For example:

    • A Service Bus Queue cannot be created until the Service Bus Namespace exists.
    • A Virtual Machine cannot be created until its network adapter and Storage Account exist.
    • A Web App cannot be created until its App Service Plan exists.

    We nested the queue as a resource under the Service Bus and additionally stated in the "dependsOn" section that it must not be created before the Service Bus:

      {
        "type": "Microsoft.ServiceBus/namespaces",
        "sku": { "name": "Basic", "tier": "Basic" },
        "kind": "Messaging",
        "name": "[variables('serviceBusNamespaceName')]",
        "apiVersion": "2015-08-01",
        "location": "[resourceGroup().location]",
        "tags": {},
        "resources": [
          {
            "type": "queues",
            "name": "Events",
            "apiVersion": "2015-08-01",
            "properties": {},
            "resources": [],
            "dependsOn": [
              "[resourceId('Microsoft.ServiceBus/namespaces', variables('serviceBusNamespaceName'))]"
            ]
          }
        ]
      }

      If you omit the proper dependency declarations, Azure will alert you with an error message.
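      The effect of dependsOn can be modelled as a topological sort: deployment proceeds in an order where each resource is created only after everything it depends on. A small Python sketch (the resource names here are hypothetical):

      ```python
      from graphlib import TopologicalSorter

      # A toy model of how ARM orders resource creation: each resource
      # maps to the list of resources it dependsOn.
      depends_on = {
          "serviceBusNamespace": [],
          "eventsQueue": ["serviceBusNamespace"],
          "storageAccount": [],
          "hostingPlan": [],
          "webApp": ["hostingPlan"],
      }

      # static_order() yields every resource after all of its dependencies.
      order = list(TopologicalSorter(depends_on).static_order())
      ```
      
      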


      The goal of the template was to prepare, unattended, a complete environment for deploying and running the application. The connection strings and keys for the individual services also matter, since they need to be added to the configuration before the code is deployed. We therefore prepared a set of outputs, so there is no need to go through the configuration and copy keys from there.

      Obtaining them is trivial in most cases and comes down to using the listKeys() function, although some values need to be composed and derived in a more involved way.
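      What the StorageConnectionString output below computes can be sketched locally. The account name and key here are made-up placeholders for what variables('storageAccountName') and listKeys() would supply at deployment time:

      ```python
      # Placeholder values standing in for deployment-time results.
      storage_account_name = "jmenoabcdef1234567"
      primary_key = "PLACEHOLDER_KEY=="  # listKeys(...).keys[0].value at runtime

      # The same string the template's concat() expression assembles.
      connection_string = (
          "DefaultEndpointsProtocol=https;"
          f"AccountName={storage_account_name};"
          f"AccountKey={primary_key}"
      )
      ```
      
      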

      • Storage Connection String is used in C# code to connect to Azure Storage. The value is composed of two pieces: the storageAccountName variable and the first key returned by the newly created Storage Account. We use the concat() and listKeys() functions:
      "StorageConnectionString": { "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName')), '2016-01-01').keys[0].value)]", "type": "string" },
      • Storage Keys lists both Azure Storage keys as a JSON object:
      "StorageKeys": { "value": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName')), '2016-01-01')]", "type": "object" },
      • Service Bus Keys is again a JSON object in which we find the connection string. We access it directly via the name RootManageSharedAccessKey, because it is constant and we defined it earlier in the template:
      "ServiceBusKeys": { "value": "[listKeys(resourceId('Microsoft.ServiceBus/namespaces/authorizationRules', variables('serviceBusNamespaceName'), 'RootManageSharedAccessKey'), '2015-08-01')]", "type": "object" },
      • IoT Hub Keys works the same way as Service Bus, again using the key name:
      "IotHubKeys": { "value": "[listKeys(resourceId('Microsoft.Devices/IotHubs/Iothubkeys', variables('iotHubName'), 'iothubowner'), '2016-02-03')]", "type": "object" },
      • DocumentDB Endpoint is produced simply by inserting the name of the DocumentDB account into the well-known URL:
      "DocumentDbEndpoint": { "value": "[concat('https://', variables('documentDbName'), '')]", "type": "string" },
      • The remaining keys are again obtained with the listKeys() function:
      "DocumentDbKeys": { "value": "[listKeys(resourceId('Microsoft.DocumentDB/databaseAccounts', variables('documentDbName')), '2015-04-08')]", "type": "object" }, "FaceApiKeys": { "value": "[listKeys(resourceId('Microsoft.CognitiveServices/accounts', variables('faceApiName')), '2016-02-01-preview')]", "type": "object" }, "EmotionApiKeys": { "value": "[listKeys(resourceId('Microsoft.CognitiveServices/accounts', variables('emotionApiName')), '2016-02-01-preview')]", "type": "object" }

      Deployment

      You can deploy a template using PowerShell, the command-line tools (CLI), Visual Studio Team Services, or the Microsoft Azure web portal. For development and incremental trial-and-error, the portal proved to be a good fit.

      1. On the main screen (Dashboard), click "New".

      2. Search for "Template deployment".

      3. Click "Edit".

      4. Copy your template from Visual Studio and replace the entire contents of the editor with it.

      5. Confirm with "Save".

      6. Create a new Resource Group or select an existing one, and check the parameters pre-filled from the template. Finally, agree to the terms and click "Purchase".

      7. Azure checks that the template is valid and alerts you to any problems. For small tweaks and fast iteration, it worked well to edit in the web interface and only then carry the changes back to Visual Studio. The deployment itself then starts.

      8. You can follow the deployment progress by clicking the notification that appears in the top right.

      9. You will find the outputs in the Resource Group, in the "Overview" section, under the "Last Deployment" link in the "Essentials" area.

      We will cover the other deployment options (the command line and VSTS) in another article.

      In Closing

      The finished template is safely stored in source control, versioned, and ready for repeated deployment. Thanks to the dynamic names there is no risk of conflicts, and any developer on the team can take the template and spin up their own version of the entire environment for development and testing. They will find the connection details for the individual services in the outputs, so there is no need to hunt for them in the portal.

