
Create an IoT Device Using the Gadgeteer with Azure Blob Storage

MSDN Blogs - 2 hours 59 sec ago

Please have a read of the article I wrote about an IoT device project I created.  It was published in this month’s MSDN Magazine.

Making changes to the applicationHost.config on Azure Websites

MSDN Blogs - 2 hours 16 min ago

There are a few other resources about this here and here, but as experiences and changes happen, documenting and sharing different perspectives help move projects and people forward.  Additionally, there is no better way to learn than by doing and no better way to remember than by documenting or writing it down.

As you likely know, not all IIS configuration elements can be modified on the Azure Website platform.  Here is the most current list of modifiable elements that I am aware of.

An example is the webLimits element, which controls behavior such as connectionTimeout, headerWaitTime, maxGlobalBandwidth, and minBytesPerSecond.  As shown in Figure 1, on a stand-alone instance of IIS, changing these values is achieved via the Configuration Editor in the IIS Management console.

Figure 1, changing applicationHost variables, Azure Websites

There is no interface for doing this to an Azure Website, nor will there likely ever be one, because access to system configuration settings would cause issues when scaling happens, whether automated or manual.  That is, system configurations are not replicated to other instances when the site is scaled.  To get around this, the Azure Website owner can create an XDT file that includes these system configurations and publish it to the website.  Then, when the website is scaled, the settings can be propagated as required, because these settings are part of the website configuration and not the system.

For this example, I want to decrease the connectionTimeout to 30 seconds.  To achieve this, I performed the following.

  1. Create an applicationHost.xdt file

  2. Create the content for the XDT file

  3. Publish the XDT file

  4. Confirm the change was applied

Create an applicationHost.xdt file

I created a file called applicationHost.xdt and placed it in my Azure Website project using Visual Studio 2013, similar to that shown in Figure 2.

Figure 2, adding an applicationHost.xdt file to my project

Create the content for the XDT file

Once created I added the following content, shown in Listing 1, to the applicationHost.xdt file.

Listing 1, reducing the connectionTimeout limit on Azure Website

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.applicationHost>
    <webLimits xdt:Transform="SetAttributes(connectionTimeout)" connectionTimeout="00:00:30" />
  </system.applicationHost>
</configuration>

Publish the XDT file

I then published the XDT file to the /site directory of my Azure Website using FileZilla, similar to that shown in Figure 3.

Figure 3, publishing the applicationHost.xdt file using FileZilla

Confirm the change was applied

Before the configuration is visible, a restart of the Azure Website is required.  Additionally, make an initial request to the website to make sure the worker process is running.  There are certainly numerous options for getting the Azure Website's configuration; I used the CMD Debug Console from KUDU, which I have discussed here, shown in Figure 4.  As you can see, after navigating to D:\home\site, I executed the following command:

type c:\dwasfiles\sites\#1standard\config\applicationHost.config > myAppHostConfig.config

 

Figure 4, how to get the applicationHost.config for an Azure Website

Once created, the file can be downloaded using an FTP tool or by clicking the download link, illustrated by Figure 5, within KUDU. 

 

 

Figure 5, download the applicationHost.config for your Azure Website

When I clicked on the download link, the configuration file was opened in my browser.  Searching for webLimits resulted in finding the desired element configuration, shown in Figure 6.

Figure 6, finding the XDT element in the extended Azure Website applicationHost.config file

NOTE:  Not all variables are extendable.  When I find a list of which ones are or are not, I will share.  For example, I do not think the sites\siteDefaults\limits element can be extended.  This was what I initially tried to get to work, without luck, so there is obviously some Azure Website logic preventing its change and likely protecting me from changing something that would cause problems with my website or for others.

20 new free courses from the Microsoft Virtual Academy, March 2015

MSDN Blogs - 2 hours 32 min ago
In this overview we will talk about the free courses from the Microsoft Virtual Academy (MVA), which will be useful both to professional software developers and IT pros and to beginners. Some courses offer free preparation for the official certification exams. Note that for a number of courses created in English, the video player on the site has an option to turn on Russian subtitles. A hit: Microsoft Azure Fundamentals. In this first course you will learn why...(read more)

Enterprise Mobility Suite (EMS) now available in Open Licensing

MSDN Blogs - 2 hours 44 min ago

I’m excited to announce Enterprise Mobility Suite (EMS) is now available in the Open License Program.  EMS is a combination of Microsoft Intune, Azure Active Directory Premium and Azure Rights Management Services (RMS), and is priced per seat and available in Open Commercial, Government and Education (as well as OV and OVS). Azure Active Directory Premium will also be available through Open as a standalone SKU. Azure RMS and Intune are already available in Open as standalone SKUs.

The availability of EMS on Open License provides the opportunity to offer cloud solutions with a low barrier to entry, and the opportunity to add more value as you engage with your customers.

In addition, starting in early March, Microsoft Action Pack subscribers, along with Silver and Gold competency partners, will get access to EMS and Azure AD Basic as part of their Internal Use Rights (IURs) benefits. This will help you run your internal business, demo EMS to your customers, and perform internal development and testing. We encourage you to leverage the ModernBiz Business Anywhere sales and marketing materials to help you engage with your customers.

EMS Opportunity with Office 365

EMS and O365 work together to create a secure and productive environment for the workforce. Learn how some of the basic functionality of O365 is enhanced when paired with the capabilities of EMS to support new productivity models. 

Next Steps

1. Access your internal use rights per the above

2. We encourage you to learn more about EMS by attending tomorrow’s ‘Introduction to EMS’ webinar; an on-demand recording will also be available after the webinar

3. Watch EMS in Action

4. Speak to your distributor today for pricing and more information


Application Request Routing (ARR) - self referencing itself 400 or 502.3

MSDN Blogs - 2 hours 47 min ago

In my opinion, the Application Request Routing server should be on a server which is not hosting the webFarm itself.  The only reason I can think of for wanting to do this is to save costs on resources, both hardware and support.  I looked at the documentation here and do not see any mention of this kind of configuration.  However, the beauty of platforms, especially IIS, is that users can be creative and come up with solutions that work like this, so here is an example of how I did it.  The goal is to make what is shown in Figure 1 work, where requests are handled by the Default Web Site on port 80 and then redirected to the APP3 and APP4 sites.

Figure 1, Routing requests to ARR back to itself

When I set this up initially, I got a 400.0 Bad Request – The request cannot be routed because it has reached the Max-Forwards limit, as shown in Figure 2, when I accessed either the APP3 or APP4 site directly from the server.

Figure 2, ARR The request cannot be routed because it has reached the Max-Forwards limit

And when I tried accessing the Default Web Site from the server or from another client, I received a 502.3 – The server returned an invalid or unrecognized response, which I have discussed here in more detail.

Figure 3, ARR The server returned an invalid or unrecognized response

Here are the steps I took to make this work in the way I wanted.

  1. Modify the HOST file and bind the webFarm site names to the IP of the ARR server

  2. Bind those webFarm site names to the websites

  3. Disable the Server Level URL Rewrite Rule, create a new URL Rewrite rule on the Default Web Site

  4. Set preserveHostHeader=”false”

Modify the HOST file

The HOST file is a way to run a pseudo DNS server on your machine; basically, it is where you can link an IP address to a domain name.  The HOST file is located in the c:\windows\system32\drivers\etc directory.  I have seen issues caused by this that took days to resolve, so make changes here with care and document them so others know you did this.  You need to add something similar to the following.

127.0.0.1 APP3.MSFT.TEST

127.0.0.1 APP4.MSFT.TEST

In my example I used the actual IP of the machine; however, using the loopback address (127.0.0.1) shown above should work too.
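If you prefer to script this step, the two entries can be appended from an elevated PowerShell prompt; a minimal sketch, using the same example host names as above:

```shell
# Append the webFarm host names to the HOST file (run from an elevated prompt)
$hostsFile = "$env:windir\System32\drivers\etc\hosts"
Add-Content -Path $hostsFile -Value "127.0.0.1 APP3.MSFT.TEST"
Add-Content -Path $hostsFile -Value "127.0.0.1 APP4.MSFT.TEST"
```

As noted above, treat this file with care; a scripted change is easy to repeat, but just as easy to forget to document.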

Bind the webFarm sites to the web sites

Ceteris paribus, requests will come into the server on port 80, which is the port the Default Web Site is bound to.  To support the ‘self-reference’ solution you need to bind the domain names you added in the HOST file to the websites you want in the webFarm.  As shown in Figure 4, APP3.MSFT.TEST is bound to the APP3 site on port 80.  You would need to add the binding for each of the domains in your webFarm.

Figure 4, Adding a hostname binding to a self-reference ARR solution
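The same binding can be added from the command line with appcmd; a sketch, assuming the site in IIS Manager is named APP3:

```shell
%windir%\system32\inetsrv\appcmd.exe set site /site.name:"APP3" /+"bindings.[protocol='http',bindingInformation='*:80:APP3.MSFT.TEST']"
```

Repeat the command for each site/host-name pair in your webFarm.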

Disable the default URL Rewrite rule and create a new one

If you try to navigate to any of these sites now, you will get either the 400 or the 502.3 shown in Figures 2 and 3.  This is because, by default, when you create an ARR server farm, it also creates a server-level URL Rewrite rule, as shown in Figure 5, that redirects requests to the server farm.  Disable that rule; once disabled, you should be able to access all 3 sites from the server.  Note that the domain names you added to the HOST file can only be accessed from the server you placed them on; you won’t be able to access them from another client.

Figure 5, the default URL Rewrite Rule for ARR

At the Default Web Site level, you need to create a new URL Rewrite Rule similar to this one, shown in Listing 1.

Listing 1, ARR self-redirection URL Rewrite Rule

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="ARR-SELF-REDIRECTION" patternSyntax="Wildcard" stopProcessing="true">
          <match url="*" />
          <action type="Rewrite" url="http://MSFT.TEST/{R:0}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Notice that the action type is Rewrite and the URL is the name of the Server Farm to which I want the requests redirected.

If you try accessing the Default Web Site now you will still get the errors shown previously in Figure 2 and Figure 3.  The last thing you need to do is disable the preservation of the Host Header.

Set preserveHostHeader="false"

Using the Configuration Editor at the server level shown in Figure 6, set the preserveHostHeader parameter to false.

Figure 6, disable preserveHostHeader to get ARR self-redirection to work
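If you prefer the command line over the Configuration Editor, the same change can be sketched with appcmd against the system.webServer/proxy section, assuming ARR has registered its server-level proxy settings there:

```shell
%windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/proxy /preserveHostHeader:"False" /commit:apphost
```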

Save the configuration change and access the ARR server and your request should now be routed to the IIS sites on the same server.
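As a quick check from the server itself, a request using one of the host names from the HOST file should now return content from the backing site instead of the 400/502.3 errors; for example, from PowerShell 3.0 or later:

```shell
Invoke-WebRequest -Uri http://APP3.MSFT.TEST/ -UseBasicParsing
```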

 

Warehouse Mobile Device Portal Architecture

MSDN Blogs - 3 hours 2 min ago

This is the second part of a blog series on the Warehouse mobile device portal (WMDP) introduced in Microsoft Dynamics AX 2012 R3. The first part can be found here. The mobile device portal is designed to provide warehouse workers with an interface into the advanced warehousing system of Microsoft Dynamics AX and is intended to be used on a wide variety of devices within the warehouse. This post is designed to help you understand the overall architecture of the mobile device portal and how the existing code operates to generate the mobile device pages.

WMDP – Architecture

To understand how the mobile portal ultimately renders the warehouse work on the mobile device, it is important to take a step back and see how the overall data flow operates and all of the major components in the solution. The following picture shows the high level objects that constitute the warehouse mobile portal architecture.

The far left represents the end user device, which displays the final HTML in a browser or emulator. As mentioned in the previous blog post, the mobile portal is designed to render across a wide variety of devices and form-factors, and we have tried to ensure the generated HTML is widely compatible across browsers and devices.

IIS, or Internet Information Services, executes on the server and acts as the Web server for the mobile device portal. Within IIS the warehouse mobile device portal executes as an ASP.NET web application. This application accepts the HTTP communications from end-user browsers and translates these into web service calls to the AOS endpoint – ultimately resulting in new webpages being generated for the end-user, based on their current workflow state and data entered and/or returned from the Application object server (AOS).

The AOS hosts the WHSMobileDevicesService endpoint as an Application Integration Framework (AIF) service endpoint. This is the entrypoint into AX for all advanced warehouse functionality exposed by the mobile portal. You can see the details of the WHSMobileDevicesService in the AOT under the Services node, and you can configure the exposed endpoint within AX by navigating to System administration -> Setup -> Services and Application Integration Framework -> Inbound ports. The configuration for this endpoint on my machine can be seen below:

 

WMDP – Data Flow

Let's walk through a typical data exchange from the client-side device to the AOS and back. This will illustrate how each of these components operate within the framework and what they contribute to the overall solution.

Step 1 – Client-side HTTP Form Post

All communication is initiated from the client-side via standard HTTP operations. The initial webpage is retrieved via an HTTP GET operation – providing the starting point for the mobile device. After this all interactions with the server-side portal are conducted with HTTP Form POST commands. Remember that one of the design goals for this portal was to ensure we could support low-power and low-capability devices. As such the portal is not designed as a fancy JavaScript single-page app – instead we rely on standards-based, tried-and-true HTML4/HTTP functionality that should be available to most devices with an Internet browser.

The following are two examples of this form data being posted from the client-side browser. Note that the username/password fields are not encrypted in transit – which is why we recommend the mobile portal be exposed only on a trusted private network or to ensure secure transport protocols are implemented between the client-side device and the server (i.e. HTTPS). You will also note the various data fields are passed to the server as form/url-encoded data as well as the session token used to determine the user authentication and authorization details.

ClickedButton=&IsMenu=False&SessionGuid=ZZZ&__RequestVerificationToken=XXX&UserId=42&Password=1

ClickedButton=&IsMenu=False&SessionGuid=ZZZ&__RequestVerificationToken=XXX&PONum=123&Qty=1&UOM=

Step 2 – WMDP Website Processing

Inside the WMDP website several validations occur on the incoming data, including determining if the mobile user is valid and currently still logged in. The incoming data is validated and the data is combined with an XML structure of the current workflow UI to create a Control XML structure. This XML is used to communicate with the WHSMobileDevicesService AIF endpoint. A sample of this, containing the controls constructed from a login submission request, would look like the following:

Step 3 – WHSMobileDevicesService Integration

The XML created by the mobile portal is now submitted to the endpoint exposed in the AOS. The address of this endpoint is defined in the web.config file of the mobile portal website; by default this is assumed to be on the same machine as the mobile portal – but this can be reconfigured for a multi-box deployment.

Step 4 – WHSMobileDevicesService Processing

The final step in the inbound chain, this is where the incoming request is processed by the AX Work framework and the next step in the workflow is generated and returned to the caller. This is the set of X++ classes we will dive into later in this post – and this is where it is possible for you to inject your own code into the system to represent custom workflows and operations you want to enable in your warehouse. It is also possible to customize the workflows shipped as part of AX R3/CU8 – as you will see later.

Step 5 – Control XML Returned

Ultimately all of the processing that occurs within the AOS will result in a package of XML being returned to the caller. This will contain the structure and content of the controls to ultimately display in the generated HTML – including any error or validation error messages that need to be displayed to the user.

The following XML is what might be returned from the AOS when the user is conducting a PO Receiving workflow but has entered an incorrect PO number. You can see the error label with the information to inform the user, and you can see all of the other controls that make up the structure of the user interface. You will also note that now that we have authenticated this user we have an additional node in the XML containing the session GUID to track the user authentication information.

Step 6 – WMDP Processing

The WMDP website uses the Control XML returned from the WHSMobileDevicesService endpoint to construct the final HTML that will be returned to the client device. This HTML is constructed based on the controls defined in the XML, as well as the mobile settings view page discussed in the previous blog post. The goal is ultimately to have the correct HTML generated for the client device/user for the specific workflow step they are currently executing.

Step 7 – HTML Returned

The final step in the process is returning the constructed HTML back to the client device. As discussed above, this HTML is designed to be widely compatible and standard-based back to HTML4, such that any device with a browser should be able to operate within the warehouse portal.

The basic format of the constructed HTML is created by the display settings view, which controls the layout and construction of the HTML structure. The default view creates an HTML table and populates the rows of the table with the returned controls.

 

WMDP – AOT Classes

The above walkthrough gives us a better idea of the overall data flow into and out of the mobile portal. If we go back and expand "Step 4" a bit – we will now cover what exactly happens within the AOS when a request comes in from a mobile device and how that is translated into a result to the user.

Recall that the endpoint the WMDP uses to communicate with the AOS is the WHSMobileDevicesService. This exposes five operations, but the primary one we will deal with in detail is getNextFormHandHeld. This is the operation invoked by the WMDP to determine the next step in the current workflow for the user.

The high-level flow of classes involved in constructing the response to the user can be seen below. We will walk through this tree and discuss what each of the various components are doing along the way. All of these classes can be found in a standard Microsoft Dynamics AX 2012 R3 deployment if you want to follow along.

At the highest level, the WHSMobileDevicesService service is responsible for exposing the operation endpoints to external processes. As we described above, the getNextFormHandHeld operation consumes and returns XML data. This service is backed by the WHSMobileDevicesServiceFacade class, which implements the operation methods. In turn, this class utilizes the WHSWorkExecuteDisplay::getNextFormHandHeld method to start the actual logic of generating the return XML.

I consider the WHSWorkExecuteDisplay::getNextFormHandHeld method to be the true entrypoint to the WMDP Work Framework. This is where I put a breakpoint when debugging and want to see exactly what is entering the system from the client and what we are returning from the various Work X++ classes. The method itself is very straightforward:

public static str getNextFormHandHeld(str _xml)
{
    container con;

    con = WHSWorkExecuteDisplay::readXML(_xml);

    return WHSWorkExecuteDisplay::getNextFormXML(con);
}

The logic is very simple – the incoming XML is first read and a container is constructed. This is then used when building the return XML.

The WHSWorkExecuteDisplay::readXML method converts the incoming Control XML into a semi-structured container format. This container is composed of 3+ containers – each of which are used throughout the processing and rendering process. I have tried to capture a high-level overview of the container below.

WHS container (con)

  Index 1 (#StateInfo): State Info container
    Index 1: WHSWorkExecuteMode enumeration
    Index 2: State Step

  Index 2 (#PassthroughInfo): PassthroughInfo container (a WHSRFPassthrough Map object)
    Index 1: Map identifier
    Index 2: Type of Keys stored in this map (String)
    Index 3: Type of Values stored in this map (String)
    Index 4: Number of values in the Map
    Index n: Key Name
    Index n+1: Value

  Index 3+ (#ControlsStart): Control container (one per control)
    Index 1: Control Type
    Index 2: Control Name
    Index 3: Control Label
    Index 4: Newline Indicator
    Index 5: Control Data
    Index 6: Control Data Type
    Index 7: Control Data Length
    Index 8: Error Indicator
    Index 9: Default Button Indicator
    Index 10: Selected Indicator
    Index 11: Control Color

 

In WHSWorkExecuteDisplay::getNextFormState we have code that consumes this container, primarily the StateInfo container, in order to figure out the correct class to load to process the user request.

You will note the mode and step variables are extracted out of the container structure using the various macros defined either in the base class WHSWorkExecuteDisplay or in the global #WHSRF macro definition.

After the validation of the user session information, the critical step in this code is the call to the WHSWorkExecuteDisplay::construct method. This is a factory method that accepts the current mode we are operating under and creates the appropriate WHSWorkExecuteDisplay derived class to handle the rest of user interface creation and workflow processing. Here is a snippet of this code:

If you were to define your own processing class you would need to hook some code into this method to ensure the new class is created when the framework requests that new WHSWorkExecuteMode class.

Once the correct WHSWorkExecuteDisplay* class has been constructed the framework will invoke the displayForm method on this class – passing in the container and the specific button clicked in the UI (if any). This method is the central location to define the state machine of the various workflows. The current state the user is executing is located in the step variable, and this can be updated to reflect movement in the state machine model. In addition, the WHSRFPassthrough Map object discussed above is used to pass data between the various classes/methods involved in creating the final output.

Here is an example of the displayForm method. It is a very common pattern to have the central switch statement on the step member variable – with each case representing a different state in the state machine this class is representing.

Next Time

I hope this was a useful overview of the Mobile Device Portal and how the various components work together to provide workflow solutions to warehouse users. There is a lot more to discuss – and we will have another blog post that will dive into building customized workflows within this framework shortly. Stay tuned!

Coffee Break | more piping with Dynamics NAV

MSDN Blogs - 3 hours 38 min ago

 Did you see our first coffee break about piping at Windows PowerShell and Piping? Let's dig a bit further.

Coffee Break 6 - return to piping

This time we will use more piping and other ways to look at a PowerShell object and format it in different ways. For the example here we will use Get-NAVServerInstance and the results from that cmdlet. But everything in this post would apply in the same way on other cmdlets, like

Get-NAVServerUser
Get-NAVWebService
Get-Process
Get-Service

 

Change how data is displayed

Import the NAV management module so we can use the NAV cmdlets

import-module 'C:\Program Files\Microsoft Dynamics NAV\80\Service\Microsoft.Dynamics.Nav.Management.dll'

Run the following commands and check how the result differs:
Get-NAVServerInstance
Get-NAVServerInstance | Format-Table
Get-NAVServerInstance | Format-Table -AutoSize

 

Select parameters:

Select which columns to return

Get-NAVServerInstance | Select-Object ServerInstance, State

But... How do you know which columns are available? Simply pipe the cmdlet into Get-Member:

Get-NavServerInstance | Get-Member

This shows you a list of members, including these properties

Default
DisplayName
ServerInstance
ServiceAccount
State
Version

Formatting Output

The most common formats are list and table. Confusingly to a Dynamics NAV person, Format-List is like a card display, and Format-Table is just like a list. Run these to see the difference:
Get-NAVServerInstance | Select-Object ServerInstance, State | Format-List
Get-NAVServerInstance | Select-Object ServerInstance, State | Format-Table

Some of the most useful other formats (to replace the last bit of the pipe above):

Group-Object State
Sort-Object State
ConvertTo-Html
Export-Csv -Path c:\x\servers.txt
Out-GridView
Clip

Clip is especially useful: it sends the result directly to your clipboard so you can paste it into Notepad or somewhere else.

Note that the formatting cmdlets replace the object with formatting data in order to display it, so always do the formatting as the last part of a pipe, except if you want to follow it with an Out- cmdlet.
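Putting these pieces together, a pipe that selects, sorts, and only then exports as its final step might look like the following sketch (the output path is just an example):

```shell
Get-NAVServerInstance |
    Select-Object ServerInstance, State |
    Sort-Object State |
    Export-Csv -Path C:\x\servers.txt -NoTypeInformation
```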

 

Jasminka Thunes, Escalation Engineer Dynamics NAV EMEA

Lars Lohndorf-Larsen, Escalation Engineer Dynamics NAV EMEA

Bas Graaf, Senior Software Engineer Dynamics NAV

The AMC Bank Data Conversion Service for Microsoft Dynamics NAV 2015 now supports 35 additional banks

MSDN Blogs - 3 hours 38 min ago

Based on requests from customers and partners, the AMC Bank Data Conversion Service for Microsoft Dynamics NAV 2015 has just released support for 35 new banks worldwide. To see the new banks being supported, please go to the service sign up page here where you can also find more information on the service.

If your bank isn’t supported today, please place your vote here and influence which banks the AMC Bank Data Conversion Service should add support for next.

Guest Post by CORE ECS – Technology in schools: Microsoft’s Showcase Classroom

MSDN Blogs - 4 hours 8 min ago

The following is a guest post from Core ECS - specialists in integrated SharePoint solutions for education. The Team at CORE ECS designs, develops and maintains teaching and learning environments including intranets, VLEs, collaborative learning workspaces and public websites. Their flexible solutions for schools, academies and colleges are SharePoint-based and can be hosted on your site, at a data centre or in the cloud.

---

Technology in schools: Microsoft’s Showcase Classroom

In February 2015, employees of Core ECS spent the day at Microsoft’s Showcase Classroom in the heart of London. Core ECS design, develop and maintain SharePoint-based teaching and learning environments for schools, academies and colleges. They organised for seventeen Year 10 students and members of the leadership team from The Nobel School in Stevenage to attend the immersive, interactive technology session.

The workshop held in the modern, brightly decorated classroom was designed to give students and teachers an insight into and a full appreciation of what the Windows 8.1 tablet offers the education sector. Led by Phil Burney, who has 10 years’ experience as an ICT teacher and has worked as a consultant in Australia and the UK, the session gave everyone the opportunity to explore technologies in a hands-on environment.

The 21st century lesson began with everyone being given their own tablet to log on to for the day. An icebreaker saw all the delegates introducing themselves and naming their favourite piece of technology. The list included an iPad, Windows phone, Sky Go, Go-Pro, lawnmower and even a coffee machine! Phil explained that over the course of the day, the students and their teachers would learn how six different Microsoft technologies – Windows 8.1, Yammer, Office 365, OneNote, OneDrive and Lync – can assist in engagement and attainment at their school.

A series of activities gave everyone the opportunity to experience how Windows 8.1 tablets can transform the classroom. The morning session began with an introduction to the devices, with the delegates receiving instructions and then completing tasks for themselves. They learnt how to customise the start screen, group apps, use smart search, understand how ‘gestures’ operate features, split the screen and write notes using a keyboard, stylus and their finger. The group was introduced to a number of apps which, along with the information they had received, had to be put to use to complete several activities, including translating instructions from Russian into English, taking a panoramic photo and finding a recipe for lasagne.

The attendees learnt how the 'cloud’ or the centralised, online data storage area supports anytime, anywhere learning. To illustrate how OneDrive can link people and their data wherever they are, Phil requested volunteers to act the parts of ‘student on the bus’, ‘X-box player at home’, ‘teacher in the staffroom’ and ‘student in the library’. He demonstrated how through the programmes on their devices they were able to communicate and share information about a homework task, no matter where they were. The session illustrated how Office 365 can be used to create and edit documents on the go, with SharePoint providing a collaborative space where they can be hosted and shared with a particular group of people.

“Office 365 is excellent and really delivers a suite of programs that could really support
teaching and learning, particularly on mobile devices.” - Barry Burningham, Deputy Head Teacher

The workshop aimed to show teachers how they can combine the use of tablets with their current teaching and learning strategies as well as offering guidance on exploring new ones. They were also able to secure immediate feedback from their students. With everyone having access to their own tablet, not only were they empowered with the knowledge and the confidence to use it, the activities were specifically designed to show how the tablets would work in a classroom setting, rather than just in a general technological sense.

We were also introduced to Yammer - the social networking service that can be used for private communication within schools. Its demonstration revealed how teachers can use a controlled environment as a means of communicating the importance of e-safety. Another activity revealed that students are much more confident at writing posts in answer to questions than they are voicing their thoughts out loud. The posts on Yammer provide teachers with a documented record of how students are working together in a safe environment where they can share and discuss ideas. The teachers also saw how Yammer can be used as a tool through which a group of parents can communicate with the school.

 

"The students were engaged in what they were doing and the presenter was very good. It was great to get an understanding of how the
technology could be used in the classroom.” - Christine Crawley, School Business Manager

Every practical exercise focussed on how staff and students can use Microsoft technology in school to support teaching and learning outcomes. The workshop looked at how schools with Skype and Microsoft Lync can use instant messaging, conferencing and video and audio calls to link up with students across the world and to contact experts who will help engage them in their studies. Students and teachers were able to try out OneNote, collating content in text, audio and video form in their digital notebooks, as well as using PowerPoint to devise quizzes and record and add video content to presentation slides.

Not since the emergence of the printed book has the education sector experienced such an opportunity to revolutionise learning. Whilst the introduction of mobile personal technology is a great opportunity, it comes with many challenges as its development leads to unique changes in the way we interact. It is often difficult for school leaders to see the scope of this innovation and teachers can be reluctant to incorporate new ICT ideas and utilise new equipment in their lessons. The Microsoft workshop with its step-by-step guide to some of the programmes out there, demonstrated how technology in the hands of teachers and students alike can create a force that will truly transform education across the UK.

“I chose to partner with Core ECS because they were able to demonstrate a strong pedigree. I have worked with ‘365’ in its earlier incarnations and I was impressed how Core ECS have developed easy integration services of Office 365
into our current infrastructure so that I can offer flexible and robust solutions internally.” - Simon Evans, School Network Manager

To find out more about how Core ECS creates and delivers advanced technology solutions in schools, please contact the team on 08452 606022 or email sales@coreecs.co.uk

Guest Post by CORE ECS – Technology in schools: Microsoft’s Showcase Classroom

MSDN Blogs - 4 hours 8 min ago

The following is a guest post from Core ECS - specialists in integrated SharePoint solutions for education. The Team at CORE ECS designs, develops and maintains teaching and learning environments including intranets, VLEs, collaborative learning workspaces and public websites. Their flexible solutions for schools, academies and colleges are SharePoint-based and can be hosted on your site, at a data centre or in the cloud.

---

Technology in schools: Microsoft’s Showcase Classroom

In February 2015, employees of Core ECS spent the day at Microsoft’s Showcase Classroom in the heart of London. Core ECS design, develop and maintain SharePoint-based teaching and learning environments for schools, academies and colleges. They organised for seventeen Year 10 students and members of the leadership team from The Nobel School in Stevenage to attend the immersive, interactive, technology session.

The workshop held in the modern, brightly decorated classroom was designed to give students and teachers an insight into and a full appreciation of what the Windows 8.1 tablet offers the education sector. Led by Phil Burney, who has 10 years’ experience as an ICT teacher and has worked as a consultant in Australia and the UK, the session gave everyone the opportunity to explore technologies in a hands-on environment.

The 21st century lesson began with everyone being given their own tablet to log on to for the day. An icebreaker saw all the delegates introducing themselves and naming their favourite piece of technology. The list included an iPad, Windows phone, Sky Go, Go-Pro, lawnmower and even a coffee machine! Phil explained that over the course of the day, the students and their teachers would learn how six different Microsoft technologies – Windows 8.1, Yammer, Office 365, OneNote, OneDrive and Lync – can assist in engagement and attainment at their school.

A series of activities gave everyone the opportunity to experience how Windows 8.1 tablets can transform the classroom. The morning session began with an introduction to the devices, with the delegates receiving instructions and then completing tasks for themselves. They learnt how to customise the start screen, group apps, use smart search, understand how ‘gestures’ operate features, split the screen and write notes using a keyboard, stylus and their finger. The group was introduced to a number of Apps which along with the information they had received, had to be put to use to complete several activities including translating instructions from Russian into English, taking a panoramic photo and finding a recipe for lasagne.

The attendees learnt how the 'cloud’ or the centralised, online data storage area supports anytime, anywhere learning. To illustrate how OneDrive can link people and their data wherever they are, Phil requested volunteers to act the parts of ‘student on the bus’, ‘X-box player at home’, ‘teacher in the staffroom’ and ‘student in the library’. He demonstrated how through the programmes on their devices they were able to communicate and share information about a homework task, no matter where they were. The session illustrated how Microsoft 365 can be used to create and edit documents on the go, with SharePoint providing a collaborative space where they can be hosted and shared with a particular group of people.

“Office 365 is excellent and really delivers a suite of programs that could really support
teaching and learning, particularly on mobile devices.” - Barry Burningham, Deputy Head Teacher

The workshop aimed to show teachers how they can combine the use of tablets with their current teaching and learning strategies as well as offering guidance on exploring new ones. They were also able to secure immediate feedback from their students. With everyone having access to their own tablet, not only were they empowered with the knowledge and the confidence to use it, the activities were specifically designed to show how the tablets would work in a classroom setting, rather than just in a general technological sense.

We were also introduced to Yammer - the social networking service that can be used for private communication within schools. Its demonstration revealed how teachers can use a controlled environment as a means of communicating the importance of e-safety. Another activity revealed that students are much more confident at writing posts in answer to questions than they are voicing their thoughts out loud. The posts on Yammer provide teachers with a documented record of how students are working together in a safe environment where they can share and discuss ideas. The teachers also saw how Yammer can be used as a tool through which a group of parents can communicate with the school.

"The students were engaged in what they were doing and the presenter was very good. It was great to get an understanding of how the
technology could be used in the classroom.” - Christine Crawley, School Business Manager

Every practical exercise focussed on how staff and students can use Microsoft technology in school to support teaching and learning outcomes. The workshop looked at how schools with Skype and Microsoft Lync can use instant messaging, conferencing and video and audio calls to link up with students across the world and to contact experts who will help engage them in their studies. Students and teachers were able to try out OneNote, collating content in text, audio and video form in their digital notebooks, as well as using PowerPoint to devise quizzes and record and add video content to presentation slides.

Not since the emergence of the printed book has the education sector experienced such an opportunity to revolutionise learning. Whilst the introduction of mobile personal technology is a great opportunity, it comes with many challenges as its development leads to unique changes in the way we interact. It is often difficult for school leaders to see the scope of this innovation and teachers can be reluctant to incorporate new ICT ideas and utilise new equipment in their lessons. The Microsoft workshop with its step-by-step guide to some of the programmes out there, demonstrated how technology in the hands of teachers and students alike can create a force that will truly transform education across the UK.

“I chose to partner with Core ECS because they were able to demonstrate a strong pedigree. I have worked with ‘365’ in its earlier incarnations and I was impressed how Core ECS have developed easy integration services of Office 365
into our current infrastructure so that I can offer flexible and robust solutions internally.” - Simon Evans, School Network Manager

To find out more about how Core ECS creates and delivers advanced technology solutions in schools, please contact the team on 08452 606022 or email sales@coreecs.co.uk

A Summary of Microsoft's Most Important Announcements at MWC

MSDN Blogs - 4 hours 29 min ago

Microsoft announced a varied collection of products and important news, ranging from new devices to news for developers and for those following Windows 10. Here is a quick summary of the most important announcements:

Two great Lumia phones: the Lumia 640 and 640 XL

Microsoft announced two distinctive mid-range phones joining the Lumia family. The Lumia 640 comes with a 5-inch screen, an 8-megapixel rear camera, and 4G support.

The bigger sibling of the Lumia 640 is the 640 XL, which comes with a 5.7-inch screen, an excellent 13-megapixel rear camera, 4G support, and a body only 9 mm thick.

Both phones come in several attractive colors, as is usual for the Lumia line, and feature high build quality and durability. Prices for our region will be announced later.

The foldable Bluetooth keyboard

Microsoft announced a very small Bluetooth keyboard that folds and works on every platform: Windows, Android, iOS, and Mac, so you can use it whatever kind of device you have. Its official name is the Universal Foldable Keyboard, and it will cost about $100.

A first look at the Universal App Platform

Yesterday Microsoft explained some important details on a topic that means a lot to developers. Universal apps in Windows 10 take a new direction, with the idea fully realized so that they are truly universal across every kind of device Windows 10 will run on: desktop and laptop PCs, smartphones, tablets, Xbox, and even HoloLens and Internet of Things (IoT) devices.

The highlight here is that the development platform becomes truly unified, from building an app all the way through publishing it: one code base, one UI API, one app package, and even one store for every kind of device. The image above illustrates, in simple terms, Microsoft's goal in building a unified app platform that lets developers reach billions of users by writing an app just once.

DevCon 2015 Conference: the Internet of Things in the World of Living Code

MSDN Blogs - Mon, 03/02/2015 - 23:48
Dear friends! As part of our preparations for DevCon 2015, we are ready to share with you what awaits attendees at the conference. In this post I will tell you how, for the first time in DevCon history, we will present the story of the Internet of Things (IoT). But first, here is a great animated video: "Welcome to the World of Living Code at DevCon 2015!". The Internet of Things has already become a hot topic, and interest in IoT technologies keeps growing. Below you will find the details of the DevCon 2015 program devoted to the Internet of Things...(read more)

Azure SQL Database V12 GA Now Available in Asia

MSDN Blogs - Mon, 03/02/2015 - 22:58

Thanks to Liu Chien-Chang of National Taipei University of Technology for helping translate the article published on 2015/2/24 by Microsoft data platform marketing lead Tiffany Wissner: Announcing latest version of Azure SQL Database now GA in Asia; improvements to disaster recovery objectives ( http://azure.microsoft.com/blog/2015/02/24/announcing-latest-version-of-azure-sql-database-now-ga-in-asia-improvements-to-disaster-recovery-objectives-2/ )

As announced in January 2015, the latest version of the Azure SQL Database service (V12) became generally available (GA) in Europe, the United States, and other regions. We are now pleased to announce that this service version is also coming to the Azure data centers in Asia.

In addition, with this latest version we have improved the disaster recovery objectives for all databases. Together, these updates make Azure SQL Database the first choice for business-critical workloads in the cloud and the most convenient path for migrating on-premises databases to the cloud.

The latest version of the service arrives in Asia

The Azure SQL Database V12 release now available in Asia provides near-complete compatibility with the Microsoft SQL Server engine, support for larger databases, a new S3 performance service tier, and a 25% increase in performance. These enhancements let customers save more on IT costs and simplify the migration of SQL Server to Azure.

All new subscriptions created in GA regions, and any databases created afterwards, will use the new service architecture and get all the features of the latest version. Existing databases in GA regions can choose whether to upgrade from Basic, Standard, or Premium to the new V12 service architecture to get the new features. (For how to upgrade an existing database to V12, see http://azure.microsoft.com/zh-tw/documentation/articles/sql-database-preview-upgrade/ )

Billing for the latest version takes effect on April 1, 2015.

Significantly improved disaster recovery objectives

With the worldwide rollout of service version V12, we have significantly improved the RPO (recovery point objective) and the ERT (estimated recovery time).

The RPO defines the maximum tolerable data loss when a disaster occurs. The ERT is the time for a database to recover full functionality after a failover/restart.

You can learn more about Azure SQL Database (V12) here.

We are committed to pushing cloud databases to their fullest, and we are delighted to bring this service to you.

Azure RemoteApp Service now with Chat

MSDN Blogs - Mon, 03/02/2015 - 21:50

We have enabled a chat feature on https://www.remoteapp.windowsazure.com/ to help quickly answer questions our customers may have when thinking about, deploying, or managing Azure RemoteApp. This is in addition to the forum and feedback site.

Stop by and ask your questions!

Thanks, Remote Desktop team.

PowerShell Lessons Learned from Building an Automated SQL Installation and Patch Management Implementation

MSDN Blogs - Mon, 03/02/2015 - 21:25
Attached are the PowerPoint slides from the recent presentation I gave for the SQL Server User’s Group meeting. I hope to update this blog post in the next few weeks with: Demo scripts A video recording of the presentation Thank you all for attending! Download the presentation here . <Extracted Slide Text> PowerShell Lessons Learned from Building an Automated SQL Installation and Patch Management Implementation Presented by: Fany Carolina Vargas, Microsoft Corp., Sr. PFE, SQL Dedicated Support...(read more)

Lync Online is changing its name to Skype for Business

MSDN Blogs - Mon, 03/02/2015 - 20:55

Hi All

I am very excited about the latest news from the Skype team.

Changes are coming to Lync Online in Office 365. Lync is joining the Skype family, so in the coming months, Lync will be changing to Skype for Business. Please check back because this page will be updated frequently to provide you with important information about the Skype for Business client and the Skype for Business Online service. We hope that by giving you this information, it will help you through this transition.

If you want to find out more check out the link here:

https://technet.microsoft.com/en-us/library/dn913785.aspx

Happy Lync'ing (or should I now say Skype'ing) :)

Steve

Cloud Champions II – Deep Dive into Azure – view on demand and access resources

MSDN Blogs - Mon, 03/02/2015 - 20:03

In today’s Cloud Champions II webinar we heard from Vikram Ghosh how to grow your business with Microsoft Azure. If you missed the webinar or want to review the content you can access it on demand.  All previous sessions can be found on demand on the Cloud Champions calendar.

THE BLOG BELOW IS POSTED ON BEHALF OF VIKRAM GHOSH – CHANNEL STRATEGY LEAD – MICROSOFT AZURE, MICROSOFT CORPORATION

In August of 2014 we introduced Azure into the Open Licensing Program. With Azure, partners are able to help customers embrace the speed, scale and economics of cloud through an open and flexible platform that provides all the building blocks to quickly build, deploy, and manage cloud-based solutions.

Prior to the launch of Azure in the Open Licensing program there were two ways in which customers could buy Azure: either directly on azure.microsoft.com or using their Enterprise Agreement (EA) contracts. Azure in the Open Licensing program enables you, our partners, to deepen customer relationships and grow your revenues.

We are constantly innovating on the Azure platform and offering a range of services that can be used by IT pros as well as developers to deliver solutions for end customers. As part of the Azure in Open launch we highlight four key scenarios that help our infrastructure partners deliver IT solutions to customers' business problems. These solutions (Azure Backup, a new disaster recovery solution called Azure Site Recovery, Azure Virtual Machines, and Azure Websites) help our partners offer cloud services both to customers who are still running on-premises hardware, with backup and disaster recovery delivered from the cloud, and to customers who no longer wish to run server hardware, by hosting their workloads in the cloud.

Starting March 1st we are offering the Enterprise Mobility Suite (EMS) in the Open Licensing program, which, in conjunction with Office 365, truly enables our customers to conduct their business anywhere, and at the same time helps our partners deepen their engagement with end customers by offering identity management in the cloud, mobile device management, and information protection.

Azure and EMS unlock a range of economic benefits for our partners, not only offering the margin opportunity associated with the sale of IT products, but also enabling you to deliver higher-value project-based services, as well as the opportunity to create a deeper, ongoing customer relationship with managed services. With Azure datacenters being the locations from which these services are delivered, you no longer have to carry the capital expenses associated with setting up your customers' IT systems, and at the same time you significantly reduce the associated operating expenses, leaving more room for higher-value project and managed services within the customers' IT budgets. You can find resources to help you with this journey, and stories of partners who are successfully using the cloud to transform their businesses, at Cloud Surestep. Helping your customers solve business problems, rather than positioning discrete point products, is a linchpin of this transformation; you can find this solution-based approach to customer engagement delivered via our ModernBiz campaign.

I look forward to hearing about your successes and how you are building strong businesses based on Azure & EMS and celebrating these successes at WPC 2015. If you have a story to share, I am super excited to hear it. Drop me a line at vikramg@microsoft.com

Regards

Vikram Ghosh

Breadth Channel Strategy Lead

Microsoft Cloud + Enterprise Marketing Group

 

Below are some useful resources following today’s webinar:

• Microsoft Azure resources on Cloud Surestep

• Build your Learning Path for the Cloud Platform Competency

• Microsoft Australia Azure Webinar Series

Microsoft Azure Sales Toolkit for selling into SMBs

Dynamics CRM 2015 / Online 2015 Update SDK: Working with Business Process Flows from Form Scripts, Part 2

MSDN Blogs - Mon, 03/02/2015 - 19:00

Hello, everyone.

Continuing from the previous post, this article introduces how to work with business process flows from form scripts, one of the new features in the Microsoft Dynamics CRM 2015 and Microsoft Dynamics CRM Online 2015 Update SDK.

Reference: Write scripts for business process flows
https://msdn.microsoft.com/ja-jp/library/dn817874.aspx

Last time we looked at how to retrieve each element of a business process flow; this time we look at what you can do with the elements you retrieve.

Switching the business process flow in use

You can switch processes by taking the list returned by Xrm.Page.data.process.getEnabledProcesses and passing the ID of the process you want to use to Xrm.Page.data.process.setActiveProcess.

For example, when a lead has role-specific processes as shown below, you might decide which process to use based on the security role of the currently signed-in user.

The following sample retrieves the user's roles and switches the business process flow accordingly.

function switchBPF() {
    // Get the user's security roles
    var userRoles = Xrm.Page.context.getUserRoles();

    // Do nothing if the user has more than one role
    if (userRoles.length > 1)
        return;

    // Get the list of enabled business process flows
    Xrm.Page.data.process.getEnabledProcesses(function (processes) {
        for (var processId in processes) {
            // If the security role is Salesperson
            if (userRoles[0] == "b215f8d4-32ae-e411-80e3-c4346badf6d8") {
                if (processes[processId] == "Business Process for Sales Representatives")
                {
                    // Switch the business process flow
                    Xrm.Page.data.process.setActiveProcess(processId, function () {
                        alert("Switching the business process flow.");
                    });
                }
            }
            // If the security role is Marketing Professional
            else if (userRoles[0] == "ac0df8d4-32ae-e411-80e3-c4346badf6d8")
            {
                if (processes[processId] == "Business Process for Marketing Professionals")
                {
                    // Switch the business process flow
                    Xrm.Page.data.process.setActiveProcess(processId, function () {
                        alert("Switching the business process flow.");
                    });
                }
            }
        }
    });
}

Moving between stages

The stage transitions that users normally perform manually can be automated with the following functions:

To move forward: Xrm.Page.data.process.moveNext
To move back: Xrm.Page.data.process.movePrevious
To move to a specific stage: Xrm.Page.data.process.setActiveStage
Note: when using setActiveStage, the target stage must belong to the same entity. For example, while a lead record is open you cannot move to an opportunity stage.

Before moving between stages, you need to check the current state. For that you can use the Xrm.Page.data.process.getActivePath function.

Its return value is a collection of stages, from which you can get each stage's status as well as the state of each step.

For details, try the following sample.
Sample: Xrm.Page.data.process.getActivePath
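As a minimal sketch of how such a check might look: canAdvance below is a hypothetical helper (not part of the SDK) that inspects the steps of the active stage, as you might gather them from getActivePath, and only allows a moveNext call when every required step is completed. Plain stand-in objects are used so the logic can be shown outside a CRM form; the field names are illustrative assumptions.

```javascript
// Hypothetical helper (not part of the SDK): decide whether it is safe to
// advance, given the steps of the active stage gathered via getActivePath.
function canAdvance(steps) {
    // Every required step must be completed before moving on.
    return steps.every(function (step) {
        return !step.required || step.completed;
    });
}

// Inside a form script this could gate the call to
// Xrm.Page.data.process.moveNext(); here we just exercise the check
// with stand-in data:
var steps = [
    { name: "Topic",    required: true,  completed: true },
    { name: "Budget",   required: true,  completed: false },
    { name: "Timeline", required: false, completed: false }
];
canAdvance(steps); // false until the Budget step is completed
```

This keeps the decision logic testable on its own, separate from the Xrm.Page calls.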

Controlling the display of the business process flow

If you want to keep the business process flow area collapsed in some situations, use the Xrm.Page.ui.process.setDisplayState function. Note that its argument is a string, not a Boolean.

To collapse:
Xrm.Page.ui.process.setDisplayState("collapsed");
To expand:
Xrm.Page.ui.process.setDisplayState("expanded");

To fully control whether the business process flow area is shown at all, use the Xrm.Page.ui.process.setVisible function. Its argument is a Boolean.

Summary

Business process flows are a powerful and convenient feature. Now that they can be manipulated from form scripts, automation and show/hide control have become flexible, making the feature even more useful.

Give it a try!

- Kenichiro Nakamura

The Case of Azure Websites and Continuous Deployment

MSDN Blogs - Mon, 03/02/2015 - 18:36
So this post has a misleading title… because I’m not going to explain how this works. Azure Websites continuous deployment isn’t anything really new, and it’s well documented here and here. Read more >>...(read more)

Getting Insights from Data in Real Time : Azure Stream Analytics

MSDN Blogs - Mon, 03/02/2015 - 17:47

Data analytics provides insights that help businesses take things to the next level. There are scenarios where you may want to analyze data in real time, even before it is saved into a database, and perform analytics on it. Making decisions quickly in certain scenarios gives you an edge over others and takes the experience of your products to the next level. Also, with the Internet of Things gaining momentum and billions of devices and sensors connected to the Internet, there is a need to process these events in real time and perform appropriate actions.

 

What is the difference between traditional big data analytics and real time analytics?

To understand the difference between traditional big data analysis and real-time analytics, let's explore the concepts of data at rest and data in motion.

This can be understood through an analogy with water: water in a lake represents static data, while water falling through a waterfall is like data in motion. So here we have to consider analytics with reference to time. Another interesting example: suppose you have to count the number of cars in a parking lot; you can count all the cars in the lot at once. Now suppose you need to count the number of cars passing a crossing; you have to analyze this data within a window of time, in real time. The main idea is that the analytics is carried out without first storing the data.

 

One of the biggest challenges in real-time data analytics is the time, effort and expertise needed to develop complex real-time analytics solutions. Azure Stream Analytics helps overcome this entry barrier and lets you provision a solution for processing millions of events, using familiar SQL queries, with a few clicks.

Azure Stream Analytics is a fully managed service on Microsoft Azure which allows you to provision a streaming solution for complex event processing and analytics within minutes, as we will see in the example in the last section of this article.

 

 

The architecture of Azure Stream Analytics is shown below:

  1. You can define input streams as data streams, which can currently come from two sources:
  • Event Hub
  • Azure Storage Blob
  2. Alternatively, input can come from a reference data source stored in a blob.
  3. You can write SQL-like queries, over the windows discussed below, to analyze this data.
  4. You can output this data to a SQL database, an event hub or a blob.
  5. From the SQL database you can build solutions for presentation using Power BI dashboards or for predictive analysis using Machine Learning.
  6. Through the event hub, you can perform actions with the sensors.

 

 

 Concept of Windows in Azure Stream Analytics Queries

To write queries for Stream Analytics, you need to understand the concept of windows. There are three different kinds of windows you can use in your SQL queries. Windows are simply the time intervals within which events are analyzed.

  •  Tumbling Window

            A fixed-length, non-overlapping time interval.

  •  Hopping Window

            A fixed-length window that advances (hops) by a fixed interval, so each window overlaps the previous one.

  • Sliding Window

          A window that slides forward continuously, producing output whenever an event enters or leaves the window.
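To make the tumbling case concrete, here is a toy illustration in plain JavaScript (not a Stream Analytics API): event timestamps are bucketed into fixed, non-overlapping 3-minute intervals and counted per window, mirroring what a TUMBLINGWINDOW(minute, 3) aggregation does over a stream. The function name and shape are my own, for illustration only.

```javascript
// Toy illustration of a tumbling window: each event timestamp (in ms) falls
// into exactly one fixed, non-overlapping window; we count events per window,
// keyed by the window's end time.
function tumblingCounts(timestampsMs, windowMs) {
    var counts = {};
    timestampsMs.forEach(function (t) {
        // End of the window this event belongs to.
        var windowEnd = (Math.floor(t / windowMs) + 1) * windowMs;
        counts[windowEnd] = (counts[windowEnd] || 0) + 1;
    });
    return counts;
}

// Three events: two land in the first 3-minute window, one in the second.
tumblingCounts([0, 60000, 200000], 180000); // { "180000": 2, "360000": 1 }
```

A hopping window would instead assign each event to every window whose interval covers it, which is why hopping results can count the same event more than once.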

 

Getting Started with Azure Stream Analytics

Azure Stream Analytics is currently in Public Preview. To be able to try it you must have an Azure subscription; in case you don't have one, you can sign up for a free one-month trial here.

 We will implement a sample toll booth, where cars continuously enter and exit the toll. We assume sensors have been installed at the entry and exit which continuously send data to event hubs, and that vehicle registration data is available for lookup. Using Stream Analytics we will calculate the number of cars that pass through this toll every five minutes, using the data from the input stream. Then we will calculate the average time taken by a car at the toll; this analysis can help increase the efficiency of the toll booth.

To get started you would need to provision the following:

  1. Event Hubs: You need to provision two event hubs, “Entry” and “Exit”.
  2. SQL Database: You need to provision a SQL database for storing the output results from the Stream Analytics job.
  3. Azure Storage: To store reference data about vehicle registration.
In case you are new to Azure, detailed steps for setting up the above can be found here. (http://1drv.ms/1ARJcxl)

Create Stream Analytics Job

  1. In the Azure portal, navigate to Stream Analytics and click the “New” button at the bottom to create a new analytics job. Currently the service is in preview and is available in limited regions.

 

 

2. Under “Quick Create”, select either “West Europe” or “Central US” as the region. For the regional monitoring storage account, create a new storage account; Azure Stream Analytics will use this account to store monitoring information for all your future jobs.

 

 3. Define Input sources.

3.1 We need to define input sources for Stream Analytics; we will use event hubs for input.

Steps to add Input Sources

  1. Click on the created Stream Analytics job, then click on the Input tab.
  2. Select “Data Stream” as the input job type.

  

3. Select “Event Hub” as the input source.


4. Add the input alias “EntryStream”. Choose the event hub you created for this demo from the dropdown.

5. Move to the next page and accept the default values for all serialization settings.

6. Repeat the above steps to create an “ExitStream”, this time choosing the “exit” event hub.

 3.2 Adding vehicle registration data as another (reference) input source

Steps to be followed

  1. Click on Add Input at the bottom.

 

2. Add reference data to your input job

 

3. Select the storage account you created while setting up the lab environment. The container name should be “tolldata” and the blob name should be “registration.csv”.

 

4. Choose the existing serialization settings and click OK.

 

      3.3 Output Data

  1. Go to “Output” tab and click “Add an output”

 

  2. Choose “SQL database”, then from the dropdown choose the SQL database you created while setting up the lab. Enter the username and password for this server.

The table name should be “TollDataRefJoin”.

 

 

 AZURE STREAM ANALYTICS QUERIES

 

In the Query tab you can write queries in familiar SQL syntax, which will perform the transformations over the incoming data streams.

 

Download and extract TollAppData.zip to your local machine. This contains the following files:

1. Entry.json

2. Exit.json

3. Registration.json

 

 Now we will attempt to answer several business questions related to Toll data and construct Stream Analytics queries that can be used in Azure Stream Analytics to provide a relevant answer.

QUESTION 1 - NUMBER OF VEHICLES ENTERING A TOLL BOOTH

For testing this query we upload sample data representing a data stream. You can find this sample JSON data file in the TollData zip folder located here.

Steps

  1. Open the Azure Management portal and navigate to your created Azure Stream Analytics job. Open the Query tab and copy and paste the query below.

 

SELECT TollId, System.Timestamp AS WindowEnd, COUNT(*) AS Count
FROM EntryStream TIMESTAMP BY EntryTime
GROUP BY TUMBLINGWINDOW(minute, 3), TollId

 

 To validate this query against sample data, click the Test button. In the dialog that opens, navigate to Entry.json (downloaded on your local system in Data folder) with sample data from the EntryTime event stream.

 

QUESTION 2 - REPORT TOTAL TIME FOR EACH CAR TO PASS THROUGH THE TOLL BOOTH

We want to find the average time required for a car to pass through the toll, to assess efficiency and customer experience.

 

SELECT EntryStream.TollId, EntryStream.EntryTime, ExitStream.ExitTime, EntryStream.LicensePlate,
       DATEDIFF(minute, EntryStream.EntryTime, ExitStream.ExitTime) AS DurationInMinutes
FROM EntryStream TIMESTAMP BY EntryTime
JOIN ExitStream TIMESTAMP BY ExitTime
  ON (EntryStream.TollId = ExitStream.TollId AND EntryStream.LicensePlate = ExitStream.LicensePlate)
  AND DATEDIFF(minute, EntryStream, ExitStream) BETWEEN 0 AND 15

 

Click test and specify sample input files for EntryTime and ExitTime.

Click the checkbox to test the query and view output:
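The temporal join in Question 2 can be pictured with a small stand-alone sketch in plain JavaScript (hypothetical field names, not the Stream Analytics engine): each entry event is paired with any exit event that has the same toll and license plate and occurs within the allowed time bound, and the duration is the difference between the two timestamps.

```javascript
// Sketch of the temporal join: pair entry and exit events on toll id and
// license plate when the exit occurs within maxMinutes of the entry.
function matchTollEvents(entries, exits, maxMinutes) {
    var results = [];
    entries.forEach(function (entry) {
        exits.forEach(function (exit) {
            var minutes = (exit.time - entry.time) / 60000;
            if (entry.tollId === exit.tollId &&
                entry.plate === exit.plate &&
                minutes >= 0 && minutes <= maxMinutes) {
                results.push({ plate: entry.plate, durationInMinutes: minutes });
            }
        });
    });
    return results;
}

// One car takes 5 minutes to pass toll 1:
matchTollEvents(
    [{ tollId: 1, plate: "ABC123", time: 0 }],
    [{ tollId: 1, plate: "ABC123", time: 300000 }],
    15
); // [{ plate: "ABC123", durationInMinutes: 5 }]
```

The BETWEEN 0 AND 15 bound in the real query plays the same role as maxMinutes here: it limits how far apart in time two events may be and still join.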

 

QUESTION 3 – REPORT ALL COMMERCIAL VEHICLES WITH EXPIRED REGISTRATION

Azure Stream Analytics can use static snapshots of data to join with temporal data streams. To demonstrate this capability we will use the following sample question.

 

If a commercial vehicle is registered with the toll company, it can pass through the toll booth without being stopped for inspection. We will use the commercial vehicle registration lookup table to identify all commercial vehicles with an expired registration.

Note that testing a query with Reference Data requires that an input source for the Reference Data is defined.

To test this query, paste the following query into the Query tab, click Test, and specify the two input sources:

SELECT EntryStream.EntryTime, EntryStream.LicensePlate, EntryStream.TollId, Registration.RegistrationId
FROM EntryStream TIMESTAMP BY EntryTime
JOIN Registration ON EntryStream.LicensePlate = Registration.LicensePlate
WHERE Registration.Expired = '1'

The results would then appear as shown below.

 

 START THE STREAM ANALYTICS JOB

Now that we have written our first Azure Stream Analytics query, it is time to finish the configuration and start the job.

 Save the query from Question 3, which will produce output that matches the schema of our output table TollDataRefJoin.

 Navigate to the job Dashboard and click Start.

 

 

 Starting the job can take a few minutes. You will be able to see the status on the top-level page for Stream Analytics.

 

 View the Table data in the SQL Database to view the results of the above query :)

 

References:

 http://azure.microsoft.com/en-in/documentation/services/stream-analytics/


 
