MSDN Blogs

from ideas to solutions

Sitecore Commerce 7.5 now available!

Tue, 10/28/2014 - 12:39
Sitecore Commerce 7.5 is now available from the Sitecore Developer Network. Building upon the strong foundation set with 7.2, Sitecore Commerce 7.5 adds:

  • Support for Sitecore Experience Platform 7.5, including leveraging the Experience Database for all analytics data gathering and personalization
  • Underlying support for Microsoft SQL Server 2014 and Microsoft BizTalk Server 2013 R2
  • A number of performance, debugging, stability, and maintainability improvements

This release includes all...(read more)

Q&A with the SharePoint MVP Experts Chat on Oct 29th @1pm EDT or 10am PDT

Tue, 10/28/2014 - 12:21

Hello everyone,

We are launching our SharePoint MVP Expert Chats again!  Do you or someone you know have questions about SharePoint 2010 or 2013?  Or SharePoint Online and Office 365?  Please join us October 29th at 1pm EDT or 10am PDT, where you can have your questions answered live by the Microsoft MVP Experts!  We will be using the Reddit Ask Me Anything format.  This format is new to us, but many Microsoft teams are using this medium now.  Please create a Reddit account beforehand so you can be ready to ask questions.  More information on the chat and room location will be available on Oct 29th in the SharePoint forum.  Hope you can join us!

https://www.reddit.com/r/sharepoint

 

MVP Experts Participating:

1. Andrew Connell

2. Cathy Dew

3. Doug Hemminger

4. Doug Ware

5. Eric Shupps

6. Gavin Barron

7. John Ross

8. Kris Wagner

9. Randy Drisgill

10. Sahil Malik

11. Sean McNeill

12. Shane Young

13. Spencer Harbar

14. Todd Bleeker

15. Trevor Seward

16. Wictor Wilen

HDInsight Storm Topology Submission Via VNet

Tue, 10/28/2014 - 12:12
1. Introduction

To submit a Storm topology to an HDInsight cluster, a user can RDP to the head node of the cluster and run the storm command. This is not always convenient. It is actually possible to submit a Storm topology from outside of an HDInsight cluster. The idea is to create an HDInsight Storm cluster with a configured Virtual Network (VNet), and submit the Storm topology from a machine that is connected to the VNet.

Because a VNet can connect Azure VMs, other Azure services, on-premises infrastructure, and developer machines, this approach lets us submit topologies from any of them.

To showcase the idea, I’ll show you how to use an Azure VM to submit a Storm topology via a VNet.

2. Step-by-step Instructions

1) Create a Cloud-Only VNet in the Azure portal. You can use the “QUICK CREATE” button.


2) Create a VM using the created VNet; this will be the VM from which we submit the Storm topology. You need to use the “FROM GALLERY” button, and on page 4 you need to choose the VNet that we just created.

 

3) Create a Storm cluster using the created VNet. Note that you need to use “CUSTOM CREATE” and specify the VNet name in the Region/Virtual Network section.

  

4) Find out the FQDN of the active head node of the HDInsight Storm cluster using the REST API.

Here is a PowerShell script to help you get the FQDN of the active head node:

function Get-ActiveFQDN(
    [String]
    [Parameter( Position=0, Mandatory=$true )]
    $ClusterDnsName,
    [String]
    [Parameter( Position=1, Mandatory=$true )]
    $Username,
    [String]
    [Parameter( Position=2, Mandatory=$true )]
    $Password)
{
    $DnsSuffix = ".azurehdinsight.net"
    $ClusterFQDN = $ClusterDnsName + $DnsSuffix
    # Call the cluster availability REST endpoint using the cluster admin credentials.
    $webclient = new-object System.Net.WebClient
    $webclient.Credentials = new-object System.Net.NetworkCredential($Username, $Password)
    $Url = "https://" + $ClusterFQDN + "/clusteravailability/status"
    $Response = $webclient.DownloadString($Url)
    # The JSON response contains the DNS name of the current leader (active) head node.
    $JsonObject = $Response | ConvertFrom-Json
    Write-Host $JsonObject.LeaderDnsName
}

  
This script will print out something like this:

headnode1.<clusterdnsname>.b1.internal.cloudapp.net
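For example, calling the function with a hypothetical cluster name and the cluster admin credentials looks like this:

PS> Get-ActiveFQDN -ClusterDnsName "mystormcluster" -Username "admin" -Password "<password>"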

5) RDP to the Azure VM we just created. Copy the Storm bits from the HDInsight head node (c:\apps\dist\storm-xxx, or %STORM_HOME%) to the Azure VM (say, into a c:\storm folder), and install the Java 1.7 runtime (either Oracle or OpenJDK is fine).

6) On the Azure VM, make sure the following settings (an environment variable and a Storm configuration) are correctly set:

Environment variable:

    JAVA_HOME = "<your java installation path>"

storm.yaml (c:\storm\conf\storm.yaml):

    nimbus.host: headnode1.<clusterdnsname>.b1.internal.cloudapp.net

7) On the Azure VM, submit a Storm topology using the storm.cmd command line, like this:

C:\storm\bin>storm jar ..\contrib\storm-starter\storm-starter-<version>-jar-with-dependencies.jar storm.starter.WordCountTopology wordcountSampleTopology
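If you want to confirm from the same command line that the topology was submitted, the Storm CLI copied from the head node should also include the standard list command:

C:\storm\bin>storm list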

Then, on the Azure VM, you can manage topology status using the Storm UI web page (start IE and enter an address like this):

http://headnode1.<clusterdnsname>.b1.internal.cloudapp.net:8772/

This is how easy it is. Enjoy storming!

Dutch water company Evides benefits from the OpenText and Microsoft Alliance

Tue, 10/28/2014 - 11:55

 

As a productivity and platform company, a key part of our strategy is working with and investing in our partners to deliver enterprise solutions that extend our core capabilities and deliver value to our customers. A great example of this in practice comes from Evides Waterbedrijf (Evides), a Dutch water company headquartered in Rotterdam, serving 2.5 million people and businesses in the Netherlands.

Evides has a Microsoft-first strategy, so they naturally looked at implementing Microsoft SharePoint as a repository, records and document management, and archiving solution. But to meet core requirements, SharePoint would have required additional third-party modules from several vendors, exposing Evides to future risk due to the greater number of vendors and the potential for future incompatibilities.

In addition to being a Microsoft shop, Evides has been a longtime user of OpenText solutions, and, to make a long story short, they selected OpenText’s Content Server to provide the core foundation of the solution along with SharePoint. As Gerry van Meijel, Head of the Document Management Department at Evides, says: “Top among the factors for selecting OpenText was the strong, long-standing, strategic partnership between OpenText and Microsoft. Their global relationship and track record gave us the confidence to proceed knowing their respective solutions will continue to work seamlessly together.”

It’s a great success story of how OpenText and Microsoft work together to provide business value and minimize customer risk. You can read about the OpenText – Microsoft Alliance here and the full story on how we jointly delivered business value to Evides Waterbedrijf here. Enjoy! – Jon C. Arnold

Manage the PowerShell DSC Extension in the Azure Preview Portal

Tue, 10/28/2014 - 10:52

 

If you haven’t yet started using the Azure Preview Portal and are a fan of PowerShell DSC and the Azure PowerShell DSC Extension (announced in August here), now just might be the time to start. We are pleased to announce that you can now manage the extension using the Preview Portal! Before today, you could use the Preview Portal to get information about the PowerShell DSC Extension (version, status, and status text – example below), but you couldn’t add the extension to an existing virtual machine or update the configuration if the extension was already present.


Preparing for the UI

Just like with the original example using Fourth Coffee, we need to have a few things ready in order to use the new UI, specifically the ZIP file (containing the Configuration and custom DSC Resources). For our example today, we’ve trimmed down the DSC Configuration for Fourth Coffee and made a slight change (file saved as C:\examples\FourthCoffee.ps1):

 

Configuration WebSite
{
    Import-DscResource -Module xWebAdministration   
      
        # Install the IIS role
        WindowsFeature IIS 
        { 
            Ensure          = 'Present' 
            Name            = 'Web-Server' 
        } 
 
        # Install the ASP .NET 4.5 role
        WindowsFeature AspNet45 
        { 
            Ensure          = 'Present' 
            Name            = 'Web-Asp-Net45' 
        } 
 
        # Stop the default website
        xWebsite DefaultSite 
        { 
            Ensure          = 'Present' 
            Name            = 'Default Web Site' 
            State           = 'Stopped' 
            PhysicalPath    = 'C:\inetpub\wwwroot' 
            DependsOn       = '[WindowsFeature]IIS' 
        } 
 
        # Copy the website content
        File WebContent 
        { 
            Ensure          = 'Present' 
            SourcePath      = 'C:\Program Files\WindowsPowerShell\Modules\BakeryWebsite'
            DestinationPath = 'C:\inetpub\FourthCoffee'
            Recurse         = $true 
            Type            = 'Directory' 
            DependsOn       = '[WindowsFeature]AspNet45' 
        } 

        # Create a new website
        xWebsite BakeryWebSite 
        { 
            Ensure          = 'Present' 
            Name            = 'FourthCoffee'
            State           = 'Started' 
            PhysicalPath    = 'C:\inetpub\FourthCoffee' 
            DependsOn       = '[File]WebContent' 
        }
}

You’ll note that we are using the xWebAdministration DSC Resource again. If you haven’t downloaded it and the rest of the most current wave of the DSC Resource Kit, grab the ‘All Modules’ package here.

With our Configuration file ready, we’ll use PowerShell to prepare the ZIP file. Note that we are using the same cmdlet as before (Publish-AzureVMDscConfiguration), but with a different switch (run from C:\examples; the command is split across two lines; note the backtick character at the end of the first line):

PS> Publish-AzureVMDscConfiguration .\FourthCoffee.ps1 `
>>> -ConfigurationArchivePath .\WebsitePackage.zip

This doesn’t actually upload anything to your Storage Account; it just prepares the ZIP file with the specified Configuration. Additionally, it parses the Configuration for custom DSC Resources and, as long as they are present, copies them from your Modules folder into the newly created ZIP.

For our example, we are going to manually copy the required web content (available at the bottom of this post) into the ZIP file. You’ll notice in the new FourthCoffee.ps1 Configuration that we declared [File]WebContent as getting the files from ‘C:\Program Files\WindowsPowerShell\Modules\BakeryWebsite’, so we’ll put it at the root of the ZIP, alongside the ‘xWebAdministration’ folder. Note that there are of course other ways with PowerShell DSC to get content; this is just for this example (using the Azure Files Preview is my new favorite way).

Using the UI

Now that our Configuration ZIP (with content) is ready, it’s time to look at the new UI and see what’s possible. In the Azure Preview Portal, browse to your Virtual Machine and, in the Configuration space, click on the Extensions box (just as if you were going to look at the status of the extensions). Now, in the Extensions pane, click the ‘Add’ button and note that our friend ‘PowerShell DSC’ is now present!

Just like all of the other existing Azure Extensions, when we click on the PowerShell DSC Extension, we get some descriptive information, a Publisher, and a link to resources:

Clicking the ‘Create’ button opens the window for entering the needed settings. If you already have a DSC Extension installed on the VM, you’ll get the second window, with the warning:

 

For ‘Configuration Modules or Script’, use the folder picker and browse to the WebsitePackage.zip that we previously created and added content to. Since we didn’t use any Configuration Data in this example, we’ll skip the ‘ConfigurationData PSD1 File’. That option allows you to specify a Configuration Data file, as the name implies, and is the equivalent of using –ConfigurationDataPath with the Set-AzureVMDscExtension cmdlet. For ‘Module-qualified Name of Configuration’, use the name of the DSC Configuration file included in the ZIP along with the name of the Configuration, in this format (note that the Configuration file may contain more than one DSC Configuration; this format ensures the Azure Extension picks up the correct one):

<ConfigurationFile>.ps1\<ConfigurationName>

For our example, it will look like this when all put together:
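FourthCoffee.ps1\WebSite

(Both names come straight from the example above: the file is FourthCoffee.ps1 and the Configuration it defines is named WebSite.)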

We didn’t have any arguments to pass to the Configuration in this example either, but the ‘Configuration Arguments’ field would be where you placed your –ConfigurationArgument details. Instead of using a hashtable like in the PowerShell cmdlets, use the format ‘name1=value1,name2=value2’.
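For comparison, here is a rough sketch of the equivalent cmdlet-based call (assuming the Set-AzureVMDscExtension cmdlet from the August release, and the sample VM and cloud service names used later in this post):

PS> Get-AzureVM -Name sample -ServiceName sample-i5oc0t5i |
>>> Set-AzureVMDscExtension -ConfigurationArchive WebsitePackage.zip `
>>>     -ConfigurationName WebSite |
>>> Update-AzureVM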

When you click the ‘Create’ button, the Add Extension pane will close. You can watch the work being done, and its status, by using the Notifications icon on the left bar:

And when completed:

Browsing back into our Virtual Machine’s Configuration and Extensions, we can see that the PowerShell DSC Extension is being installed:

Here’s a rapid set of screen grabs as it progresses and finishes (it took about 5 minutes on my VM):

Success! Now the only thing we have to do is check whether our content was put in place and the website is working. You can use the Azure Portal to configure an endpoint for this Virtual Machine, but I did it the quick way with PowerShell (note that the command is split across 3 lines):

PS> Get-AzureVM -Name sample -ServiceName sample-i5oc0t5i |
>>> Add-AzureEndpoint -Name HTTP -Protocol TCP -LocalPort 80 -PublicPort 80 |
>>> Update-AzureVM

Then I opened my favorite browser, went to see if it worked, and saw that everything on my new VM was exactly as I wanted it. All done!
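If you prefer to stay in PowerShell for that last check, something like this should work too (a sketch; the URL follows the standard cloudapp.net pattern for the cloud service name above):

PS> (Invoke-WebRequest -Uri "http://sample-i5oc0t5i.cloudapp.net/").StatusCode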

Additional Resources

Here are just a few other PowerShell DSC related resources:

· Secure Credentials in the Azure Powershell DSC Extension

· Troubleshooting the DSC Extension for Azure

· DSC Resource Kit (All Modules)

Happy configuring!

David Coulter - Microsoft, Senior Consultant

6 New Updates in Power Query

Tue, 10/28/2014 - 10:35

Take Power Query to the next level: share your queries and create a corporate data catalog. Try Power BI for Free today!

In this Post

October is drawing to a close, but before we all put on our costumes and eat lots of candy for Halloween, we are very pleased to give you another treat: the October update for Power Query is now available! This update includes several feature improvements that we hope you will like. Here is what is new:

Download the Power Query Update

What's New?

New Transformation: Use headers as first row

Column Name field in Merge Columns dialog

Quick Access Toolbar in Query Editor dialog

Warning for stale data previews in new queries

Function Invocation from Search

Preserve Excel Number formats in the worksheet on query refresh

 

If you are using the Preview of the Salesforce connector for Power Query, you will also need to install this newer version after installing the October Power Query Update:  32-bit, 64-bit


 

You can watch the following video or continue reading the rest of this blog post for more details about each improvement.

 

New Transformation: Use Headers As First Row

Table.DemoteHeaders has been available for a long time as a library function. However, users had trouble discovering this function and would very often ask how to achieve this transformation. We’re addressing this discoverability issue by exposing “Use Headers as First Row” in the Query Editor ribbon. The existing “Use First Row as Header” entry points (Home tab and Transform tab) have been converted from a button to a split button that now exposes “Use Headers as First Row” as its second option.

Column Name field in Merge Columns dialog

“Merge Columns” now exposes a new field for specifying the name of the resulting column upfront. Previously, the only ways of customizing this name were via the generated step formula or by adding a subsequent “Rename Column” step. Over the next few months, we aim to expose this option in other transformations that generate new columns (such as Expand and Aggregate).

Warning for stale data in new queries

When connecting to data sources that the user has connected to in the past, Power Query will try to leverage a local preview cache as much as possible. This has lots of advantages in terms of performance and UX responsiveness; however, it is also an area of potential confusion for end users. Quite a few customers have reported that the “preview is wrong” when, in fact, it was simply out of date. Although the Status Bar at the bottom of the Editor exposes the last updated date/time for a preview, this was not prominent enough for many of these users.

In this release, we’ve added a warning, shown the first time in each Editor session that a preview older than 24 hours is displayed to the user, including an approximation of how old the preview is (or, at least, how long it has been stored by Power Query). Users can optionally click Refresh to update the preview. If the user decides that the cached preview is OK and starts working with it (i.e., adds a new step), the warning message goes away automatically.

Quick Access Toolbar in Editor dialog

The Query Editor dialog now exposes a Quick Access Toolbar. By default, this toolbar contains the Send Smile/Frown menu so that users don’t need to go back to the Home tab to reach out to us. In addition, users can pin their favorite actions from the ribbon to this toolbar by right-clicking any of them.

 

 

Function Invocation Experience from Search

If a query you find in Search results is a function, Power Query now lets you invoke it directly from the Online Search pane. Previously, users had to load the query into the workbook and then invoke it from the Workbook Queries task pane. This flow has now been simplified, enabling users to invoke the query directly from Search. In addition to invoking the function, users can also add the function query to their workbook so it can be invoked multiple times. This can be done via a context menu option (“Add Definition”) available on function queries.

 

Preserve Excel number formats on refresh

Before this update, customizations to the Number format of Excel worksheet cells that are part of a Power Query table would be lost after refreshing your query.

With this update, Number formats are no longer lost after a refresh. We look at the current and previous data type for each column, and if the type didn’t change, we don’t remove the Number format of the output. This had been a very frequent customer complaint, and it is now resolved.

Columns with Number format stay the same after refresh:

 

That’s all for this month! We hope that you enjoy this update and find the new features valuable for you and your customers. Please send us your feedback or suggestions via Smile/Frown in Power Query.

 


Download Power Query from our official download page.

 

Epic Saga Chapter 2: Wherein I try to use an XMLHttpRequest in a Cordova app on my Windows Phone

Tue, 10/28/2014 - 10:26

This post is the second post in the series: Uploading Images from PhoneGap/Cordova to Azure Storage using Mobile Services

The previous chapter of this saga ended just as I realized that the super-convenient-and-easy-to-use FileTransfer plugin would not enable me to upload raw binary data to my Azure Blob storage service—multipart/form-data only. With great sadness at the prospect of having to say goodbye to such a helpful plugin, I turned to the web’s old friend XMLHttpRequest (XHR for short).

Whereas FileTransfer did the work of reading a file on my phone (based on the local URI) for me, to use XHR I would need to read the file myself into a format that XHR could send to the Blob storage service. This seemed easy enough, I figured. The Cordova File plugin has a FileReader object that would do the trick, so now my submit event handler looked like this:

// We need a FileReader to read the captured file into a byte array.
var reader = new FileReader();
…
// Handle insert
$('#add-item').submit(function (evt) {
    var textbox = $('#new-item-text'),
        itemText = textbox.val();
    if (itemText !== '') {
        var newItem = { text: itemText, complete: false };
        var capturedFile;
        // Do the capture before we do the insert.
        // Launch device camera application, allowing user to capture an image.
        navigator.device.capture.captureImage(function (mediaFiles) {
            if (mediaFiles) {
                // Set a reference to the captured file.
                capturedFile = mediaFiles[0];
                // Set the properties we need on the inserted item.
                newItem.containerName = "todoitemimages";
                newItem.resourceName = capturedFile.name;
                alert(JSON.stringify(newItem));
            }
            // Do the insert and get the SAS query string from Blob storage.
            todoItemTable.insert(newItem).then(function (item) {
                if (item.sasQueryString !== undefined) {
                    // Read the captured file into an array buffer.
                    reader.readAsArrayBuffer(capturedFile);
                }
            }, handleError).then(refreshTodoItems, handleError);
        });
    }
    textbox.val('').focus();
    evt.preventDefault();
});

This code uses the readAsArrayBuffer method to read the image into an array buffer, which is then uploaded by the XHR. The reader’s onload event handler (not shown) is where I created the XHR and tried to send the blob. The problem is that the onload event never got called, and readAsArrayBuffer returned an error every time. Digging into the Windows Phone source code for this plugin, I found this depressing bit of code:

public void readAsBinaryString(string options)
{
    string[] optStrings = getOptionStrings(options);
    string filePath = optStrings[0];
    int startPos = int.Parse(optStrings[1]);
    int endPos = int.Parse(optStrings[2]);
    string callbackId = optStrings[3];
    DispatchCommandResult(new PluginResult(PluginResult.Status.ERROR), callbackId);
}

First of all, what a useless error to return: ERROR. Then, what I clearly failed to notice (I’ll blame the wacky state of Cordova/PhoneGap documentation) is that readAsArrayBuffer is not supported on Windows Phone. It is on iOS and Android (of course). Come on, Cordova guys…please get this fixed.

OK, no biggie, I can just switch from Windows Phone to Android. After all, no one but me even has a Windows Phone, right?  I mean, I don’t have an Android device that takes pictures, but there’s an Android emulator that comes with the Android SDK, right?

Stay tuned for our next painful installment…Chapter 3: Wherein I Try to Use Eclipse and the Android Emulator to Test My Cordova App

Cheers!

Glenn Gailey

Announcing Microsoft’s new IoT blog

Tue, 10/28/2014 - 10:16

By Barb Edson

General Manager, Marketing, Cloud and Enterprise

 

Today, we’re pleased to announce the launch of Microsoft’s Internet of Things blog. Join us there to follow what’s new and what’s next in Microsoft IoT.

We’ve also launched a Twitter handle for all things IoT: @MicrosoftIoT, and our IoT videos will be found on the new IoT YouTube playlist. So update your bookmarks, and join us to see what comes next in Microsoft IoT.

Support for anonymous inbound email over IPv6 in Office 365

Tue, 10/28/2014 - 10:07

Office 365 now supports anonymous inbound email over IPv6. In this case, “anonymous” means:

  1. The sending IPv6 address originates outside the service and is not in any customer’s settings (that is, not in any customer-specified connector)
  2. The sending IPv6 address has not been previously allow-listed by the service
  3. The sending connection is not required to use TLS (it can be sent over TLS, but it is not required)

While Office 365 already permitted customers to create connectors upon which to relay inbound email (that is, email from the Internet to an on-premise mail server), it now also allows those messages to come into the service from the outside, from anyone.

In the above diagram, the part in red is new.

 

1. Customers must opt-in to receive email over IPv6

By default, Office 365 does not allow any inbound connection over IPv6 to the service.

First, customers must request to be opted in by opening a support ticket. The engineering team will then manually configure the service to permit receiving email over IPv6 on a per-domain basis. This means that if a customer has multiple domains, they can pick and choose which ones to enable. Turnaround time for enabling IPv6 is quick; it can be the same day.

Once a domain is enabled for IPv6, its MX-record will resolve to AAAA records. For example, for contoso.com:

contoso.com.          3599    IN      MX      5 contoso-com.mail.protection.outlook.com

contoso-com.mail.protection.outlook.com. 10 IN A 207.46.163.247
contoso-com.mail.protection.outlook.com. 10 IN A 207.46.163.215
contoso-com.mail.protection.outlook.com. 10 IN A 207.46.163.138
contoso-com.mail.protection.outlook.com. 10 IN AAAA 2a01:111:f400:7c10::10
contoso-com.mail.protection.outlook.com. 10 IN AAAA 2a01:111:f400:7c0c::11
contoso-com.mail.protection.outlook.com. 10 IN AAAA 2a01:111:f400:7c09::11

Domains that are not enabled for IPv6 do not resolve AAAA requests for their MX records.
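A quick way to check whether AAAA records are being returned is a DNS query against the MX host, for example with the Windows Resolve-DnsName cmdlet (a sketch using the sample host above):

PS> Resolve-DnsName contoso-com.mail.protection.outlook.com -Type AAAA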

Office 365 publishes IPv4 and IPv6 MX-records at the same MX priority, so it is up to the sender to decide whether to send email over IPv4 or IPv6. Sending mail servers are supposed to prefer IPv6 over IPv4, but some may choose to send all email over IPv4.

If a sender tries to manually connect to the service over IPv6 to a customer, for example contoso.com, but contoso.com has not opted in to receive email over IPv6, then the service will reject the message with the permanent reject error:

550 5.2.1 contoso.com does not accept email over IPv6

Office 365 already sends outbound email over IPv6 if the receiver publishes AAAA records in its MX-record.

In addition, Office 365 already supports customers whose on-premise email servers are IPv6 endpoints (that is, inbound email from the Internet over IPv4, with the customer's on-premise mail server on IPv6). This part has not changed.

2. Requirements for senders over IPv6

Second, Office 365 conforms to the Messaging, Malware and Mobile Anti-Abuse Working Group’s (M3AAWG) recommendations for receivers over IPv6:

http://www.m3aawg.org/sites/maawg/files/news/M3AAWG_Inbound_IPv6_Policy_Issues-2014-09.pdf.

Senders over IPv6 must pass two conditions:

  1. The sending IPv6 address must have a PTR record. If it does not, the service will reject the message with the permanent reject error:

    550 5.7.1 Service unavailable, sending IPv6 address [$SenderIPAddress] must have reverse DNS record.

  2. The sending email must pass SPF or DKIM verification. If it does not, the service will reject the message with the permanent reject error:

    554 5.7.1 Service unavailable, message sent over IPv6 must pass either SPF or DKIM validation.

Representatives from Microsoft worked with other participants in the M3AAWG working group to define and agree to these requirements.

Office 365 stamps the results of the SPF check into the Authentication-Results header, for example:

Authentication-Results: spf=pass (sender IP is 2207:eab0:3001:a01::123) smtp.mailfrom=user@example.com; contoso.com; dkim=pass (signature was verified) header.d=example.com;

Some customers may find these conditions overly stringent and may therefore choose not to enable IPv6 for their domains until they are confident that either the majority of their traffic over IPv6 passes SPF or DKIM and has a PTR record, or the senders who do not meet those requirements transmit exclusively over IPv4.

There is no way to override either of these requirements.

Sending mail servers may decide to retry over IPv4, at which point their email would not be subject to these requirements, but that is up to the sender.



3. Service wide throttling limits for IPv6

Third, Office 365 has implemented some service-wide throttling mechanisms. Because IPv6 makes minimal use of IP reputation lists, a large spam attack over IPv6 could cause service degradation because performing SPF and DKIM verification consumes more computational resources than a simple IP blocklist check. To protect against spam attacks, or simply misconfiguration of a sending IPv6 host, Office 365 implements throttling of IPv6 ranges.

  1. If a particular IPv6 sender has exceeded its sending limits, its connections will be rejected with the permanent reject error:

    550 5.3.2 Access Denied, [$SenderIPAddress] has exceeded permitted limits within $range range.

  2. If the service-wide network capacity allocated to IPv6 has been exceeded, any connection over IPv6 will be rejected with the temporary reject error:

    421 4.3.2 Local IPv6 capacity exceeded, please try again later.

As the network capacity returns to normal, IPv6 connections will be permitted.


4. Customer Settings

Currently, customers can create IP Allow and Block lists for IPv4, but the Exchange Admin Center prevents adding IPv6 addresses. This will still be the case even if you opt in to receive email over IPv6.

Instead, the preferred mechanism to allow messages over IPv6 is to create Exchange Transport Rules (ETRs) that allow a domain and simultaneously require an SPF pass or DKIM pass by looking for the corresponding result in the Authentication-Results header; or, alternately, to block a domain (in which case there is no need to look for Authentication-Results). A sketch of such a rule appears after the list below.

Preventing IPv6 Allow Lists and Block Lists, and requiring ETRs, has the following advantages:

  1. Because the sender domain has been authenticated with SPF or DKIM, there is a much lower risk of a spammer spoofing a faked message from a domain that has been allowed, and thereby having spam delivered to the inbox.

  2. It is easier to manage good and bad domains than IPv6 addresses, because IPv6 address blocks can be very large, while domains are much fewer in number.
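As a rough sketch (assuming Exchange Online PowerShell and a hypothetical domain), an allow rule of this kind might look like the following; it bypasses filtering only when the Authentication-Results header shows an SPF pass:

PS> New-TransportRule -Name "Allow fabrikam.com over IPv6" `
>>>     -SenderDomainIs fabrikam.com `
>>>     -HeaderContainsMessageHeader "Authentication-Results" `
>>>     -HeaderContainsWords "spf=pass" `
>>>     -SetSCL -1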

5. Conclusion

Permitting anonymous inbound email over IPv6 is a major step forward for Office 365. It is still very new even in the rest of the email industry, but the requirements that have been put in place should allow for the transition of email into a more trustworthy and reliable world.

AX for Retail 2012 R2 Cannot see Variants when selecting a Transfer Order in POS

Tue, 10/28/2014 - 10:04

Thank you to Brett Christiansen for helping with this blog

KB 2829678 fixes an issue with Variants not showing on Transfer orders.  AX 2012 CU6 includes this hotfix.

With the fix installed on my machine, I had an issue where an older item on a transfer order in the demo data would show the variants fine (Item 0140), but my new item would not display the variants.

What we found was that the string displayed on the form is built using the dimension information from the EcoResProductMasterDimensionValue table. The query is a complex join to get information from this table for each dimension.

Breaking the query down, it comes down to showing the value of the dimension description in EcoResProductMasterDimensionValue.Description.

Go to Product information management > Common > Released Products and select your item that does not show the variant in POS. 

Click Product dimensions

Notice that my variants do not have a Name\Description

I entered a value in the Name field for both of my variants.

I then ran the 1040 job to push the change to the item variant to POS.

Now, when I log into POS, select Picking and Receiving, and select the Transfer Order, I do see the variants.

 

Celebrating HTML5 Recommendation with the W3C

Tue, 10/28/2014 - 10:03

Today, while several Internet Explorer team members are at W3C TPAC 2014, the IE team is happy to join Microsoft Open Technologies, other browser vendors, and the web community at large in celebrating the HTML5 specification reaching W3C Recommendation.

This milestone represents many years of commitment from people and organizations around the world to produce and stabilize the next generation of the W3C Open Web Platform. The IE team believes that the standards process is vital to creating an interoperable Web and ensuring that the web just works for everyone.

We’d also like to congratulate the W3C on its 20th anniversary and look forward to continued collaboration on the future of the open web.

 

Some Customers Unable to Link Older VSO Accounts to Azure Subscription in the Azure Portal - 10/28

Tue, 10/28/2014 - 09:28

We have discovered an issue where VSO accounts that were previously created will fail when attempting to link them to an Azure Subscription in the Azure Portal.

A customer who attempts to LINK TO EXISTING Visual Studio Online account will see "Could not determine account region" in the REGION field when selecting an older account.  If the customer then hits LINK ACCOUNT, they will receive a "The location constraint is not valid" error message and the linking will fail.

Newly created VSO accounts will not exhibit this issue.

We are working with DEV to determine the best way to resolve this issue.

We apologize for the inconvenience this may have caused and appreciate your patience while we work on resolving the issue.

- Visual Studio Online Team


Sample chapter: SharePoint Development Practices and Techniques

Tue, 10/28/2014 - 09:00

SharePoint 2013 gives you more options, but it also requires you to make more choices, and it is important to make deliberate and well-informed choices to make sure that you end up with the best solution that you could possibly build for your specific situation and scenario. This chapter from Inside Microsoft SharePoint 2013 talks you through a lot of the choices and can help you make the right decisions.

Before you can start building a custom Microsoft SharePoint solution, you will have to make sure you set up your development environment correctly. Because the hardware requirements for SharePoint 2013 are again a lot more demanding than they were for SharePoint 2010, setting up a new development environment might well mean that you have to acquire new hardware. There might be quite a bit of time between the moment you order the hardware, whether from an external vendor or an internal department, and when you can actually start using it. This means that it’s important to start planning your SharePoint customizations early, so that waiting on the hardware will not interfere with your project planning.

When you have gotten the hardware, you will have to install your development environment. It is important to do this meticulously, to follow best practices, and to make sure you document the entire configuration. Documentation is important if you have to create a second environment, or if you have to recreate your development environment.

When your SharePoint environment has been set up properly, you will need proper specifications so that you can start designing your solution. You will have to decide what type of solution will best suit your skills, the environment into which the solution will have to be deployed, and the functionality that you have to create. SharePoint 2013 introduces a new development approach, which means that you can now create not only farm solutions and sandboxed solutions, but also SharePoint apps. SharePoint 2013 also introduces a third application programming interface (API) by making Representational State Transfer (REST) APIs available that allow you to use simple HTTP requests and responses to perform CRUD (create, read, update, delete) operations on SharePoint data.
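For instance, here is a hedged sketch of the REST style (hypothetical on-premises site URL, with Windows authentication assumed so default credentials work):

PS> Invoke-RestMethod -Uri "https://intranet.contoso.com/_api/web/lists" -UseDefaultCredentials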

All these additions give you more options, but they also require you to make more choices, and it is important to make deliberate and well-informed choices to make sure that you end up with the best solution that you could possibly build for your specific situation and scenario. This chapter talks you through a lot of the choices and can help you make the right decisions.

Find the rest of this chapter here: https://www.microsoftpressstore.com/articles/article.aspx?p=2222445.

Interested in more free sample chapters? Check out our growing collection here: https://www.microsoftpressstore.com/articles/.

Dynamics NAV Apps for Office 365 - NAPA

Tue, 10/28/2014 - 08:52

 

Introduction:

The new Dynamics NAV 2015 release gives us very tight integration with Office 365:

  • Access to all Dynamics NAV features and pages from the Office 365 SharePoint portal
  • Export of lists directly to Excel Online
  • Management of Dynamics NAV documents in OneDrive
  • OneNote Office 365 integration
  • Single sign-on (access to Dynamics NAV using an O365 account)

In this post, we will customize this integration by adding a Dynamics NAV feature (a product catalog, for example) to an Office 365 Outlook message.

We will use the “Napa” Office 365 Development Tools. These tools make it easy to develop apps for Office and SharePoint without leaving the browser or installing any software. Simply add the “Napa” Office 365 Development Tools app to your SharePoint Online site, launch it, and you are ready to create your first Dynamics NAV app for Office or SharePoint.

Sample applications:

Browsing the product catalog when opening an e-mail:

Entering a sales quote from an e-mail:

Prerequisites:
  • A Microsoft Dynamics NAV installation in Azure (tablet client)
  • An Office 365 account (with access to the development environment)
  • Single sign-on for Dynamics NAV & Office 365
  • IFRAMEs allowed in the Dynamics NAV server configuration

 

Install the NAPA development tools:

In a SharePoint O365 site, click “Site contents” and then “Add an app”.

In the list of apps, choose the “Napa” Office 365 Development Tools app:

Click Approve:

If the app does not appear in your site’s list of apps, install it from the SharePoint Store:

After launching the app, click “Add New Project”:

Choose the type of app you want to develop (SharePoint, Office content/task pane, or Mail):

In this post we will focus on the Mail app.

Select the “Mail app for Office” type, give your project a name, and then click Create. A project is created:

The project is made up of several sections: AppCompose (used when composing the e-mail), AppRead (used when reading the e-mail), …

Add the IFRAME for the Dynamics NAV page:

Replace the <body> tag with the following code:

<body>
    <!-- Page content -->
    <div id="content-header">
        <div class="padding">
            <h1>Microsoft Dynamics NAV</h1>
        </div>
    </div>
    <div id="content-main">
        <iframe width="100%" height="450" src="http://navxxx.cloudapp.net:8080/DynamicsNAV80/WebClient/tablet.aspx?company=CRONUS%20France%20S.A.&mode=View&page=31&i=58D&ni=2"></iframe>

    </div>
</body>

Configure and deploy your application:

Click the button in the left-hand menu:

Fill in your application’s settings, then click the button in the left-hand menu to deploy the solution.

To test the solution, go to Outlook and select an e-mail.

ADAL JavaScript and AngularJS – Deep Dive

Tue, 10/28/2014 - 08:39
New post: http://www.cloudidentity.com/blog/2014/10/28/adal-javascript-and-angularjs-deep-dive/...(read more)

How technology helps Cambridgeshire Constabulary adapt to tomorrow’s challenges

Tue, 10/28/2014 - 08:04

The time had come for Cambridgeshire Police to upgrade from its ageing BlackBerry devices. The office could easily have opted for a new standard device for the entire force. But rather than choose a device, they opted to focus on a platform: Windows Phone 8.1. Now Cambridgeshire is rolling out 8,000 Windows Phone devices across the force.

The critical difference is that because the office is focusing on a platform, different officers can have different devices, each tailored to meet their needs. No matter what Windows Phone 8.1 device an officer uses, the constabulary can rest easy, knowing it will be secure, powerful and completely compatible with the rest of the office’s devices.

The decision helps future-proof the department, because it doesn’t rest on a single device that will eventually become obsolete. Upgrades will be easy and can be done over time. By recognising that the future of mobility lies in platforms, not devices, the constabulary is able to transform the way its staff operates in a cost-effective, flexible manner.

But of course, Cambridgeshire Police isn’t stopping there. When the office decided to replace its outdated Lotus Notes environment with Microsoft SharePoint 2013, the need to control costs was at the forefront of its thoughts. The upgrade also helped employees discover information, share documents and organise data more effectively, leading to operational efficiency gains and lower spending in the long run.

Cambridgeshire Police is even getting proactive, using the Microsoft Social Listening tool within Dynamics CRM to monitor things like public sentiment, crowd movement, and local infrastructure conditions. Intelligence is the backbone of effective public safety, and today, more insight exists than ever before, as many citizens share information using social networks.

Cambridgeshire Police will be one of more than a dozen innovative companies presenting at Future Decoded on how they’re using technology to meet the challenges of tomorrow. Discover the full line-up of speakers and register to attend this unique, free event on 10th November at ExCel London. 

New Azure tooling for IntelliJ IDEA

Tue, 10/28/2014 - 08:03

 

We’ve had first-class support for Java in Azure for quite some time, and there are thousands of developers and ISVs already running their Java workloads in Azure. Over the last few years, IntelliJ IDEA has become a popular tool among Java developers, to the point that many of the enterprise developers and partners we work with require Azure tooling inside IntelliJ IDEA (similar to what we have for Eclipse).

Today, the MS Open Tech team released a new set of tools for IntelliJ IDEA that expands its support for Java developers. With the MS Open Tech Tools plugin you can debug and deploy Java applications from IntelliJ.

In this blog, I will walk you through deploying a Java application directly from IntelliJ IDEA to Azure.

Before we begin, you will need to acquire the MS Open Tech Tools plugin from the JetBrains plugin repository. Alternatively, you can download the binaries from GitHub (https://github.com/MSOpenTech/msopentech-tools-for-intellij).

Open IntelliJ IDEA, navigate to File --> Settings, look for Plugins, and then click on Browse Repositories.

Select the plugin by searching for the MS Open Tech Tools plugin, install the plugin, and restart the IDE. You are now set up to deploy your Java applications to Azure.

Now let’s look at how to build and publish a simple Java web application to Azure.

Navigate to File --> New Project --> Web Application --> JSF

Use the built-in Java EE Web Module template that comes with IntelliJ IDEA.

Expand the project, navigate to index.jsp, and write some HTML code.

Now that the application is ready, we will configure it to publish/deploy to Azure. With the Azure plugin installed, new tools are available to configure and deploy to Azure. Select ‘Publish to Azure’.

Follow the instructions on the wizard.

Select ‘Web Server’ (i.e. Tomcat) for the local emulator, and add the recently created Java application WAR file.

Run the application locally to check that it works.

All good? Publish the application to Azure. Right-click on the Azure project that was created and select ‘Publish to Azure’. If you don’t have a subscription, you can sign up for a free trial here (http://azure.microsoft.com/en-us/pricing/free-trial/).

Import your Azure PUBLISHSETTINGS file and point the wizard to it.

Select a role on which to deploy the application, as well as the staging or production environment.

Hit ‘publish’ and the application will be deployed to Azure.

Please note that this is an alpha release; our intention is to share this early work that we are doing for IntelliJ IDEA developers. Below is the list of scenarios that we support, as well as the limitations.

· Run in Azure Emulator locally on your PC

· Add additional Java applications to your deployment (as WAR files)

· Configure deployment components for more advanced deployment configurations

· Configure Azure storage accounts for your deployments to use

· Publish your project to Microsoft Azure

· Supported application servers - Tomcat, Jetty, GlassFish, JBoss

· Enable sticky sessions (session affinity)

· Supports both Community and Ultimate Editions

· Plugin Works only on Windows (No Mac, Linux support)

· Requires Java 7 at minimum

· “Alpha” preview state – work in progress, functionally incomplete compared to the Azure Toolkit for Eclipse

BI tools help Metro Bank create fans, not customers

Tue, 10/28/2014 - 07:54

Superior customer service sets Metro Bank apart, giving them the competitive advantage they need as the first high street bank to open in the UK in more than 100 years. That customer-first attitude shows in everything they do, such as being open seven days a week and staffing their round-the-clock call-centre service with people, not machines. They’re also using Big Data analytics, data visualisation tools and data-driven decision making with Power BI to ensure customer needs continue to drive everything Metro Bank does as the company grows.

“We set out to create fans, not customers,” says Bruce Rioch, Head of Business Information and Customer Systems at Metro Bank. “That’s our proposition. We want to surprise and delight. We want to be the bank that our customers tell their family and friends about—the bank that offers amazing customer service and has a simple, understandable proposition.”

Metro Bank now has more than 350,000 customer accounts. As the bank has continued to grow, it’s become increasingly important to have Big Data analytics to show how its customers interact with all its services across all channels. BI software will help Metro Bank fine-tune its services and reach its million-customer goal. The company needed a business intelligence tool that could quickly and accurately provide Big Data analytics.

Watch how Metro Bank uses customer data analysis to deliver superior service:

Metro Bank will be one of more than a dozen innovative organisations presenting at Future Decoded on how they’re using technology to meet the challenges of tomorrow. Discover the full line-up of speakers and register to attend this unique, free event on 10th November at ExCel London.

Recap of TechEd Europe, day one

Tue, 10/28/2014 - 07:33

Here is the first blog post from your reporter Jan Tielens, on-site at TechEd Europe in sunny Barcelona. This morning the event kicked off with a keynote diving into the mobile-first, cloud-first world. Jason Zander was the host, and he focused on the key parts that make this vision a reality: devices and cloud.

On devices
Jason explained the devices trend we are seeing today: an explosion of internet-connected devices, ranging from small sensors to phones, tablets, hybrids, laptops, desktops and even large-screen Perceptive Pixel devices. Then he clarified the investments Microsoft is making in Windows 10, all of them centered around:

  • One converged Windows platform, running on a wide range of devices
  • A product people will love to use
  • Protection against modern security threats
  • Managed for continuous innovation

On Windows 10
You might know you can already try out the Enterprise Preview of Windows 10 via the Insider Program, but you’ll notice the current build is really meant to give enterprises a very early heads-up on what’s coming. In early 2015, we’ll also disclose the consumer story of Windows 10, so stay tuned for much more on Windows 10 in that timeframe.
To wrap up the Windows 10 talk, Joe Belfiore came on stage to demo some of the UI improvements: we’ve already seen the comeback of the enhanced Start Menu and the advances of the Command Prompt. But Joe also demoed some new scenarios, for example two-factor authentication in Windows 10 using your phone, completely transparent and seamless for the end user.

On Azure
Lots of exciting Azure announcements were made today:

  • Huge virtual machine support: the G series of virtual machines, optimized for heavy data workloads and using the very latest Intel Xeon processors. They have up to 32 CPU cores of compute capability, 448 gigabytes of RAM and more than 6.5 terabytes of local SSD storage.
  • Premium Storage offering support up to 32 terabytes of storage per virtual machine. 
  • Azure Batch delivers job scheduling as a service, making it easy to run large-scale parallel and high performance computing (HPC) work in Azure. During the keynote an impressive demo of a ray tracing application was shown. Ray tracing is typically compute-intensive, but with the help of Azure Batch the work can be scaled out across many machines on demand.
  • In preview soon is Azure Operational Insights, an operations management and intelligence service that integrates across System Center and leverages the power of Azure and HDInsight to analyze machine data across environments, enabling actionable insights and better decisions.
  • Now generally available: Azure Automation which orchestrates time-consuming and frequently repeated tasks across Azure and third-party systems, decreasing expenses for cloud operations.
  • Now generally available: WebJobs. The WebJobs feature of Microsoft Azure Websites provides an easy way for you to run programs such as services or background tasks in a website.
  • Enhancing connectivity and reliability with ExpressRoute: we deliver enterprise-grade connectivity ‘out-of-the-box’. Not only do we provide dual redundant circuits, we also let enterprises connect to multiple Azure regions via an ExpressRoute location, or connect via a secure VPN over the Internet for added redundancy.
  • We also announced the launch of a new Azure Marketplace that brings together the strength of Azure’s partner ecosystem within a single, unified platform. Today, we are excited to announce new offers from IBM, SAP, DataStax, Veeam, McAfee, SoftNAS, Trend Micro, and NGINX in the Azure Marketplace.

On Office 365
From the management perspective, we announced:

  • Mobile device management for Office 365, set to roll out in the first quarter of 2015, will enable you to manage Office 365 data across a diverse range of phones and tablets, including iOS, Android and Windows Phone devices.
  • The expansion of Data Loss Protection capabilities so you can protect your content no matter where it is stored and shared within Office 365, whether in email, OneDrive for Business, SharePoint Online, or Windows File Server, or within the file itself.
  • Today we rolled out advanced encryption at rest for SharePoint Online and OneDrive for Business called per-file encryption.

But also for developers, we made some exciting announcements for Office 365:

  • General availability of new Office 365 APIs for mail, files, calendar and contacts
  • New mobile SDKs for native app development, targeting Windows, iOS and Android
  • Visibility for developers’ apps through the new Office 365 app launcher

So quite a lot of exciting announcements, don’t you think? And TechEd Europe is only just starting! Stay tuned for more updates from Barcelona on this blog. And if you would like to learn more about all of today’s news, these are highly recommended blog posts:

Building Value-Added Apps on Office 365 as a Development Platform

Tue, 10/28/2014 - 07:20

TechEd Europe 2014, which kicked off today, announced some Office 365 news, with several highlights:

Build applications with the Office 365 APIs

Starting today, the Office 365 APIs are generally available. They currently provide operations on mail, files (OneDrive for Business), calendar, and contacts data; with these APIs you can build applications that add value directly on top of Office 365. To learn how to develop with these APIs, see the official Office Dev Center.

SDKs for Windows, iOS, and Android

In addition to the Visual Studio SDK (for Windows), MS Open Tech has also developed iOS and Android SDKs, so developers can easily build Office 365 extensions on the iOS and Android platforms.

A new app launcher increases app visibility

Finished apps can be published to the Office Store for other Office 365 users, and this update also refreshes the Office 365 app launcher, letting users more easily customize the launcher layout and put their most frequently used apps in a prominent position. There are already more than 1,200 value-added apps in the Office Store, and you can build and publish one too!

What should you do next?

If you are interested in building value-added apps for Office 365, here are a few things you can start doing:
