You are here

Feed aggregator

Introducing File and Folder ACLs for Azure Data Lake Store

MSDN Blogs - Sat, 07/30/2016 - 21:31

We’re excited to announce today the availability of File and Folder ACLs for the Azure Data Lake Store. Many of you have been eagerly awaiting this feature because it is critical to securing your big data.

When we launched the preview of Data Lake Store in October 2015, filesystem security was controlled by a single ACL at the root of the store that applied to all files and folders underneath.

Starting today, ACLs can be set on any file or folder within the store, not just the root folder.

The POSIX ACL model used by Data Lake Store

We’ve emphasized that Azure Data Lake Store is compatible with WebHDFS. Now that ACLs are fully available, it’s important to understand the ACL model in WebHDFS/HDFS, because these are POSIX-style ACLs and not Windows-style ACLs. Before we dive deep into the details of the ACL model, here are the key points to remember.

  • POSIX-STYLE ACLs DO NOT ALLOW INHERITANCE. For those of you familiar with POSIX ACLs, this is not a surprise. For those coming from a Windows background, this is very important to keep in mind. For example, if Alice can read files in folder /foo, it does not mean that she can read files in /foo/bar. She must be granted explicit permission to /foo/bar. The POSIX ACL model is different in some other interesting ways, but this lack of inheritance is the most important thing to keep in mind.
  • ADDING A NEW USER TO DATA LAKE ANALYTICS REQUIRES A FEW NEW STEPS. Fortunately, a portal wizard automates the most difficult steps for you.

The ADLS Access Control model featuring POSIX-style ACLs is described here.

Adding a New Data Lake Analytics User

If you want a new user to run U-SQL jobs in ADLA, the overall steps are shown below:

  1. Assign the user to a role in the Azure Data Lake Analytics account (using Azure RBAC)
  2. *Optional* Assign the user to a role in the Azure Data Lake Store account (using Azure RBAC)
  3. Run the ADLA “Add User Wizard” for the user
  4. Give user R-X access on all folders and their subfolders recursively where data must be read by U-SQL jobs
  5. Give user RWX access on all folders and their subfolders recursively where data must be written by U-SQL jobs
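Because POSIX-style ACLs do not inherit, steps 4 and 5 amount to stamping an ACL entry on every folder and file in the tree. The sketch below models that idea with an in-memory path-to-ACL map; the function and entry format are illustrative, not the ADLS SDK:

```python
# Toy model: a filesystem as a map from path to a set of ACL entries.
# POSIX ACLs don't inherit, so granting access to a tree means touching
# every path in it.

def apply_acl_recursively(acls, root, entry):
    """Add `entry` (e.g. 'user:alice:r-x') to `root` and everything under it."""
    prefix = root.rstrip("/") + "/"
    for path in acls:
        if path == root or path.startswith(prefix):
            acls[path].add(entry)

acls = {
    "/foo": set(),
    "/foo/bar": set(),
    "/foo/bar/data.csv": set(),
    "/other": set(),
}
# Grant Alice read+execute on /foo and everything beneath it.
apply_acl_recursively(acls, "/foo", "user:alice:r-x")
```

Note that `/other` is untouched: nothing outside the tree you explicitly walk ever receives the grant.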

Detailed instructions can be found here.

ProTip: Leverage the power of Active Directory Security groups

Repeating manual steps is both irritating and prone to error. It’s easier if you use Active Directory security groups.

First give the needed permissions to the security group. Afterwards, adding new users is simple: just add them to the security group. This will dramatically simplify maintaining and securing your Data Lake.
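The payoff shows up even in a toy model: permissions reference the group once, and onboarding a user touches only the membership list. The structures below are illustrative, not an Azure AD API:

```python
# Toy model of group-based grants: ACL entries name the group,
# membership lives separately.
group_members = {"DataLakeReaders": {"alice"}}
folder_acl = {"/foo": {"group:DataLakeReaders:r-x"}}

def user_can_read(user, folder):
    """True if any group the user belongs to has a read grant on the folder."""
    for entry in folder_acl.get(folder, ()):
        kind, name, perms = entry.split(":")
        if kind == "group" and user in group_members.get(name, ()) and "r" in perms:
            return True
    return False

# Onboarding Bob is a single membership update; no ACLs are rewritten.
group_members["DataLakeReaders"].add("bob")
```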


Thanks to everyone for your patience through the service upgrade process!



Tiger Team at SQL Server Geeks Summit 2016

MSDN Blogs - Sat, 07/30/2016 - 20:52

Tiger Team (@mssqltiger), represented by Sudhakar Sannakkayala, Bob Ward, Sunil Agarwal and Ajay Jagannathan, will be out in full force, speaking at 14 sessions at the SQL Server Geeks Annual Summit (#SSGAS2016), Asia’s premier data & analytics conference, taking place 11-13 August in Bangalore, India. SQLServerGeeks Annual Summit 2016 is a full 3-day conference with more than 100 breakout sessions and deep-dive pre-con sessions on SQL Server, BI & Analytics, Cloud, Big Data, and related technologies.


This is a truly unique conference (see this video), comprising multiple tracks on Database Management, Database Development, Business Intelligence, Advanced Analytics, Cloud, and Big Data. The summit attracts SQL experts from around the globe. SSGAS 2016 is the only data/analytics event in Asia where product teams from Microsoft’s Data Group fly in from Redmond to deliver advanced sessions on the latest technologies.


Here is a list of the breakout sessions, chalk talks and open talks where you can meet and interact with the team.


Note: Chalk-Talks are 30-minute sessions focusing on conceptual and architectural understanding, with only a whiteboard and marker. No laptops, no PPTs, no demos, only whiteboard-ing! Open-Talks are 30-minute free-flowing discussions on a specific topic, with only discussion and Q&A.

Aug 11 2016, 10:45 – 12:00 | Breakout | Friction-Free Upgrade and Migration to SQL Server using DMA, SSMA and WAR
Speakers: Ajay Jagannathan / Sudhakar Sannakkayala | Level: Intermediate | Track: DBA
Are you excited about new capabilities in SQL Server 2016 and SQL Azure DB to unlock your business insights? Are you looking for a friction-free upgrade and migration experience? Do you want to understand the application compatibility and performance impact the upgrade/migration will have? Do you want low cost and high confidence from these complex activities? If you answered yes to any of these questions, then this demo-filled session is a must attend. We will showcase several new tools such as Data Migration Assistant, SQL Server Migration Assistant and Workload Analysis Reports that will help you plan, assess, execute and compare the results of your upgrade or migration projects.

Aug 11 2016, 13:45 – 15:00 | Breakout | Understanding SQL R Services: What Every SQL Professional Should Know
Speaker: Bob Ward | Level: Intermediate/Advanced | Track: DBA/DEV
SQL Server 2016 introduces a new platform for building intelligent, advanced analytic applications called SQL Server R Services. This session is for the SQL Server database professional to learn more about this technology and its impact on managing a SQL Server environment. We will cover the basics of this technology but also look at how it works, troubleshooting topics, and even usage scenarios. You don’t have to be a data scientist to understand SQL Server R Services, but you do need to know how it works, so come upgrade your career by learning more about SQL Server and advanced analytics.

Aug 11 2016, 14:30 – 15:00 | Chalk-Talk | Transaction Isolation Levels Including RCSI and Snapshot Isolation
Speaker: Sunil Agarwal

Aug 11 2016, 17:15 – 17:45 | Chalk-Talk | Background Threads – Silent Heroes of SQL Server Engine
Speaker: Ajay Jagannathan

Aug 11 2016, 17:15 – 18:30 | Breakout | Operational Analytics in SQL Server 2016 and Azure SQL Database
Speaker: Sunil Agarwal | Level: Intermediate | Track: DEV/DBA
SQL Server 2016 enables customers to run analytic queries on in-memory and disk-based OLTP tables with minimal impact on business-critical OLTP workloads, requiring no application changes. This session covers various configurations and best practices for achieving significant performance gains with Real-Time Operational Analytics.

Aug 11 2016, 18:00 – 18:30 | Open-Talk | Troubleshooting the Top 5 Issues Seen by Customers of SQL Server
Speaker: Bob Ward

Aug 12 2016, 11:45 – 13:00 | Breakout | SQL Server 2016: Customer Success Stories with Columnstore Index
Speaker: Sunil Agarwal | Level: Intermediate/Advanced | Track: DBA/DEV
In-memory analytics using columnstore indexes provides industry-leading performance for analytics workloads. This session will cover some key customer workloads that have been successfully deployed in production, both for in-memory analytics and real-time operational analytics with SQL Server 2016. For each workload, we will describe the scenario, learnings and the performance achieved.
As a result of attending this session, you will be better able to:
o Understand the scenarios where columnstore can deliver superior performance for the workload
o Learn how to choose between a clustered columnstore index and a nonclustered columnstore index
o Apply the best practices in using columnstore indexes to deliver superior performance

Aug 12 2016, 13:15 – 14:30 | Breakout | Inside the SQL Server Query Store
Speaker: Bob Ward | Level: Advanced | Track: DBA/DEV
In this session you will understand how the SQL Server Query Store can be used to take you to the next level of query performance monitoring, tuning, and troubleshooting. You will see how the Query Store works, what its capabilities are, and even how to troubleshoot when problems occur with its execution. There will be plenty of demos in this session covering both SQL Server 2016 and Azure SQL Database scenarios.

Aug 12 2016, 14:45 – 16:00 | Breakout | Enhancements That Will Make Your SQL Engine Roar
Speaker: Ajay Jagannathan | Level: Advanced | Track: DBA
Are you interested in knowing about some cool improvements in SQL Server to turbocharge its performance? If yes, then this session is for you!
This session will showcase several improvements in the Database Engine for SQL Server 2012 through 2016 that address some of the most common customer pain points involving tempdb, the new CE, memory management, partitioning and ALTER COLUMN, as well as diagnostics for troubleshooting query plans, memory grants, and backup/restore.
Come see this demo-filled session to understand these changes in the performance and scale of the database engine, and the new and improved diagnostics for faster troubleshooting and mitigation.
Learn how you can use these features to entice your customers to upgrade and run SQL Server workloads with screaming performance.
1. Learn about performance, scale and diagnostics enhancements in the SQL Server database engine.
2. Evangelize these enhancements to get out-of-box performance.
3. Understand how your experience with SQL Server will improve, and why you should install the latest and greatest Service Packs.

Aug 12 2016, 16:30 – 17:00 | Open-Talk | TempDB Configuration and Common Issues
Speakers: Ajay Jagannathan / Sunil Agarwal

Aug 12 2016, 17:15 – 17:45 | Chalk-Talk | Understanding SQLOS Scheduling
Speaker: Bob Ward

Aug 13 2016, 08:45 – 10:00 | Breakout | Become A Troubleshooting Ninja Using SQL Server Extended Events
Speaker: Ajay Jagannathan | Level: Advanced | Track: DBA
New to Extended Events? Want to become a ninja at troubleshooting common SQL issues using Extended Events? This is a demo-filled session that will show you how to troubleshoot several common issues in the SQL Server database engine and identify hotspots as well as failures using only Extended Events. We will showcase how Extended Events make scenario-based troubleshooting easier without having to collect disparate sets of diagnostic data, gather memory dumps, or compromise on performance. A plethora of Extended Events have been added to SQL Server recently based on customer feedback, removing the need to run Profiler in many commonly encountered production situations. This session covers the new enhancements and capabilities available for Extended Events.

Aug 13 2016, 10:30 – 11:45 | Breakout | SQL Server 2016: It Just Runs Faster
Speaker: Bob Ward | Level: Intermediate/Advanced | Track: DBA/DEV
Join me for a deep dive and a behind-the-scenes look at how SQL Server 2016 “just runs faster”, focused on scalability and performance enhancements. This talk will discuss the improvements, not only for awareness, but also expose design and internal change details. The beauty behind “It Just Runs Faster” is your ability to simply upgrade in place and take advantage of it without lengthy and costly application or infrastructure changes. If you are looking at why SQL Server 2016 makes sense for your business, you won’t want to miss this session.

Aug 13 2016, 13:30 – 14:45 | Breakout | In-Memory OLTP: Concepts and Improvements in SQL Server 2016
Speaker: Sunil Agarwal | Level: Advanced | Track: DBA/DEV
SQL Server 2014 introduced in-memory technology for optimizing OLTP workloads. This talk provides an overview of the in-memory technology using common use cases. We will examine how in-memory tables and indexes are managed. You will also learn how to provision and control memory usage, how durability and high availability are achieved, and how these constructs and operations are integrated with disk-based tables to provide you a seamless experience.


Follow  SQLServerGeeks and the #SSGAS2016 hashtag on Twitter for new and exciting updates about the conference. We hope to meet you at the conference.


Ajay Jagannathan (@ajaymsft)

Principal Program Manager

[REQUEST FOR FEEDBACK] We need your help – listing Wiki performance issues

MSDN Blogs - Sat, 07/30/2016 - 13:12
Hello all,

We need your help. We’re investigating the Wiki performance issues and need to provide empirical data on the issues that you experience, so the platform owners can correlate them with platform monitoring/events.

So here is the question…

When you experience an issue with TNWiki, reply to the post on the TNWiki forum (click here) and provide the following data:

  1. Link to the article updated/posted
  2. Date & time (+ time zone)
  3. Issue description
  4. Your city/region (where you were connected to the internet at the time of working on the article)

The issue description can be something like: timeout, error (with error details), save not working, article reverted to a previous edit, etc.

If you can provide a screenshot, that would be great!

Small Basic – Downloading the LitDev Extension

MSDN Blogs - Sat, 07/30/2016 - 11:41

So, I was downloading the LitDev extension for Small Basic onto a new computer, and I thought I’d blog about it while it was fresh in my mind! =^)

If you haven’t tried the LitDev extension, you’re depriving yourself of awesome!


Here’s the API Reference:


Here are some highlights of objects/classes in the extension you should try out:

  1. LD3DView – Bring the capabilities for a full 3D graphics engine into Small Basic! So cool!
  2. LDPhysics – Boom! Bring on the game physics!
  3. LDController – Now you can use videogame controllers (USB gamepads and joysticks) to play the games you make!
  4. LDDebug – Hack together some basic debugging tricks! Debug the small and basic way!
  5. LDWaveForm – Remote controlled objects can be programmed by your Small Basic programs! No way? Yes way!
  6. LDWebCam – Now you can see! With your webcam.

And there’s a lot more! Head there to try them all!

Got a favorite that should be mentioned here? Leave a comment!

All this in one extension!

So how do you install this beauty?

You’ll likely want the latest version on the top here, but you can install the beta version to try out whatever LitDev and others are currently combobulating.

After you download the files, you’ll want to unzip the files and move them:

As with all Small Basic extensions, copy the extension dll and xml (inside the dll zip) to a lib sub-folder of the Small Basic installation folder, usually in one of the following locations:

C:\Program Files\Microsoft\Small Basic\lib
C:\Program Files (x86)\Microsoft\Small Basic\lib

You may need to create the lib folder if it doesn’t exist.

The instructions then show you how to unblock the DLL file:

If the extension commands fail to appear in Small Basic (type LD and no options appear) then you may need to unblock the downloaded dll.  Right click LitDev.dll and select Properties and clear the block if present.  If it fails to unblock, then copy LitDev.dll to a folder where you have write access (e.g. Documents), unblock it there and then move back to the Small Basic lib folder.  Alternatively, unblock the downloaded zip file before unzipping the content.
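Under the hood, the “block” is an NTFS Zone.Identifier alternate data stream attached to the downloaded file, and deleting that stream is what the Unblock button in the Properties dialog does. A minimal sketch of doing the same programmatically (Windows/NTFS only; the `:Zone.Identifier` suffix is the stream syntax):

```python
import os

def unblock(path):
    """Delete the Zone.Identifier alternate data stream that marks a
    downloaded file as blocked. Returns False if the file wasn't blocked
    (or the stream couldn't be removed)."""
    try:
        os.remove(path + ":Zone.Identifier")
        return True
    except (FileNotFoundError, OSError):
        return False
```

For example, `unblock(r"C:\Program Files (x86)\Microsoft\Small Basic\lib\LitDev.dll")` would need write access to that folder, which is exactly why the instructions suggest moving the DLL to Documents first.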

I just had all those problems. So I’d follow the instructions/steps like this:

  1. Install the LitDev zip file from here.
  2. Save it somewhere. Unzip it.
  3. Copy/move the DLL file and the two XML files into the Small Basic extension folder. I think the “Lib” folder is automatically created in version 1.2 (because it houses the Kinect files). For me, the extension folder was: C:\Program Files (x86)\Microsoft\Small Basic\lib
  4. You can try the “Unblock” troubleshooting steps above, but I had the worst scenario for unblocking the DLL file, so I’d start with that… Cut/Move the LitDev.dll file into your Documents folder.
  5. Right-click the DLL file and click Properties.
  6. In the Properties window, click “Unblock” at the bottom in the Security section (see image above). Click OK to close the window. At this point, you can re-open the window to double-check. You know that it worked if the “Unblock” option is now gone (under the Attributes section).
  7. Cut/move the LitDev.dll file back into the Small Basic “Lib” folder.
  8. Run Small Basic with LitDev objects to test it. Import “ZPX036” and click Run. Or, as another test, type “LD” and you should see the LitDev objects in the IntelliSense wheel.

Once you’re up and running, check out the LitDev documentation to dig in: Extension Documentation

As you can imagine, LitDev and many supporting developers put a lot of thought and effort into this extension. So if this is your first time trying out the extension, please leave a comment here, and join me as we thank LitDev for all the goodness we can enjoy!


Small and Basically yours,

– Ninja Ed

Unable to start debugging on the web server. Operation not supported. Unknown error. 0x80004005

MSDN Blogs - Sat, 07/30/2016 - 00:31

I was trying to run my application hosted in IIS from Visual Studio. On pressing F5, it gave me the following error: “Unable to start debugging on the web server. Operation not supported. Unknown error. 0x80004005”



Go to the application pool in IIS under which your application is running. Right click -> Advanced Settings. Set “Enable 32-Bit Applications” to True.


Adding/Updating SharePoint O365 Calendar Event

MSDN Blogs - Sat, 07/30/2016 - 00:30

To add or update a calendar event in SharePoint O365:

First connect to the SharePoint site


Now, if the listItemCollection already has data, update it; otherwise, insert a new calendar event.



BizTalk 2013: Suspended Error Message – The message found multiple request response subscriptions. A message can only be routed to a single request response subscription

MSDN Blogs - Sat, 07/30/2016 - 00:13

BizTalk 2013: If you find the suspended messages below in your application:

  1. The message found multiple request response subscriptions. A message can only be routed to a single request response subscription.

  2. This service instance exists to help debug routing failures for instance “{84685AE1-3D71-49E0-BB16-87B6A3049AFD}”. The context of the message associated with this instance contains all the promoted properties at the time of the routing failure.

This can happen when multiple ports/orchestrations are trying to listen to the same request message. Check in your application whether a previous version of the orchestration is in the “Stopped” state or is still running; it should be in the unenlisted state, or the old versions should be removed. Also check the receive ports and send ports for ports that are listening to the same type of message.
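Conceptually, the error comes from publish-subscribe routing: a request message may be delivered to at most one request-response subscription, so a leftover enlisted orchestration version creates an ambiguous second match. A toy router illustrating the rule (illustrative code, not BizTalk’s API; the subscription names are hypothetical):

```python
class RoutingError(Exception):
    pass

def route_request(message_type, subscriptions):
    """Return the single request-response subscriber for a message type;
    raise if zero or more than one subscription matches."""
    matches = [name for name, mtype in subscriptions.items() if mtype == message_type]
    if len(matches) > 1:
        raise RoutingError("message found multiple request response subscriptions")
    if not matches:
        raise RoutingError("routing failure: no matching subscription")
    return matches[0]

# An old orchestration version still enlisted creates a second subscriber
# for the same request message type:
subs = {"OrderOrch_v1": "OrderRequest", "OrderOrch_v2": "OrderRequest"}
```

Unenlisting or removing `OrderOrch_v1` leaves a single match and routing succeeds again.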

Froggy goes to Seattle: the World Championship and preparing to head home

MSDN Blogs - Fri, 07/29/2016 - 23:01

On the final day of Imagine Cup 2016, the first-place winners of each category competed once more in the World Championship, which selects one overall winner, the World Champion. The three teams, appearing in order, were the Games winner from Thailand, the Innovation winner from Romania, and the World Citizenship winner from Greece.

Three judges were tasked with choosing the World Champion: Jennifer Tang (Imagine Cup 2014 World Champion), Kasey Champion (Software Engineer at Microsoft) and John Boyega (the actor who plays Finn in Star Wars: The Force Awakens). The judges chose ENTy from Romania as the World Champion!

With the World Champion chosen, the entire worldwide run of Imagine Cup 2016 has come to a close. We now await the official announcement of Imagine Cup 2017.

After the World Championship ended, the None Developers team took some time to walk around downtown Seattle and shop for souvenirs. Afterwards, the whole group returned to the dorms and attended the Imagine Cup 2016 Closing Party, together with all the World Finalists.



Tonight, everyone will start packing their suitcases: the main group departs on a 9 AM flight, which means leaving for the airport by bus at 6 AM. The second group will follow by bus at 10 AM for a 2 PM flight. Please wish us a safe journey!

Using SQL Server Stored Procedures to Dynamically Drop and Recreate Indexes

MSDN Blogs - Fri, 07/29/2016 - 21:36

Recently, I’ve been working on a project that of necessity involves periodically updating data in some reasonably large tables that exist in an Operational Data Store (ODS). This particular ODS is used for both reporting via SQL Server Reporting Services and staging data for use in a SQL Server Analysis Services database. Since a number of the tables in the ODS are used for reporting purposes, it’s not entirely surprising that the report designers have created a few indexes to help report performance. I don’t have a problem with indexes, but as any experienced DBA is well aware, the more and larger the indexes the greater the impact on the performance of inserts and updates.

The magnitude of the performance impact was really brought home when a simple update on a 12 million row table that normally completed in roughly three minutes had to be killed at the two hour mark. On further investigation, it was found that over 30 indexes had been added to the table in question. So to address the immediate problem and allow the update to complete in a reasonable time period, DROP INDEX and CREATE INDEX commands were scripted out and added to a stored procedure which would first drop the indexes then run the update statement and finally recreate the indexes. That worked well for a couple of days and then performance again began to degrade. When this episode of performance degradation was investigated, it was found that of the indexes that had been scripted out and added to the stored procedure only one remained and several additional indexes had been created.

Not wishing to revise a rather lengthy stored proc on an almost daily basis, after a brief bit of research, I found a blog posting by Percy Reyes entitled Script out all SQL Server Indexes in a Database using T-SQL. That was great, but only covered NONCLUSTERED indexes and since we were seeing both CLUSTERED and NONCLUSTERED indexes, it would need a bit of revision. Coincidentally, at about the same time there was some serious talk of adding COLUMNSTORE indexes on one or two of these tables, which would essentially cause any update statement to fail. The possibility of having to contend with COLUMNSTORE indexes in addition to CLUSTERED and NONCLUSTERED indexes would necessitate a reasonably significant revision to the T-SQL presented in Percy’s blog, especially since it would be necessary to dynamically drop and then recreate indexes. With those bits of information, it was time to formulate a plan, which would mean accomplishing the following:

  1. Capturing and storing the names of tables, their associated indexes and the definitions of those indexes
  2. Dropping the indexes after the definitions had been safely stored
  3. Recreating the indexes from stored definitions using correct syntax

A relatively simple three-step task, the first of which was to create a stored proc that would capture the names of the tables, the associated indexes and the definitions of those indexes. That led to the creation of the following SQL Server stored procedure:

CREATE PROCEDURE [dbo].[sp_GetIndexDefinitions]
AS
BEGIN

-- Recreate the work table so each run starts with fresh definitions
IF OBJECT_ID('WORK.IDXDEF') IS NOT NULL DROP TABLE WORK.IDXDEF
CREATE TABLE WORK.IDXDEF (SchemaName NVARCHAR(100), TableName NVARCHAR(256), IndexName NVARCHAR(256), IndexDef NVARCHAR(max))

DECLARE @SchemaName VARCHAR(100)
DECLARE @TableName VARCHAR(256)
DECLARE @IndexName VARCHAR(256)
DECLARE @ColumnName VARCHAR(100)
DECLARE @is_unique VARCHAR(100)
DECLARE @IndexTypeDesc VARCHAR(100)
DECLARE @FileGroupName VARCHAR(100)
DECLARE @is_disabled VARCHAR(100)
DECLARE @IndexOptions VARCHAR(max)
DECLARE @IndexColumnId INT
DECLARE @IsDescendingKey INT
DECLARE @IsIncludedColumn INT
DECLARE @TSQLScripCreationIndex VARCHAR(max)
DECLARE @TSQLScripDisableIndex VARCHAR(max)

DECLARE CursorIndex CURSOR FOR
SELECT schema_name(st.schema_id) [schema_name],,,
CASE WHEN si.is_unique = 1 THEN 'UNIQUE ' ELSE '' END,
si.type_desc,
CASE WHEN si.allow_row_locks = 1 THEN 'ALLOW_ROW_LOCKS = ON, ' ELSE 'ALLOW_ROW_LOCKS = OFF, ' END
+ CASE WHEN si.allow_page_locks = 1 THEN 'ALLOW_PAGE_LOCKS = ON' ELSE 'ALLOW_PAGE_LOCKS = OFF' END
+ CASE WHEN si.fill_factor > 0 THEN ', FILLFACTOR = ' + cast(si.fill_factor as VARCHAR(3)) ELSE '' END AS IndexOptions,
si.is_disabled,
FILEGROUP_NAME(si.data_space_id) FileGroupName
FROM sys.tables st
INNER JOIN sys.indexes si ON st.object_id = si.object_id
WHERE si.type > 0 AND si.is_primary_key = 0 AND si.is_unique_constraint = 0
AND st.is_ms_shipped = 0 AND <> 'sysdiagrams'
ORDER BY schema_name(st.schema_id),,

OPEN CursorIndex
FETCH NEXT FROM CursorIndex INTO @SchemaName, @TableName, @IndexName, @is_unique, @IndexTypeDesc, @IndexOptions, @is_disabled, @FileGroupName

WHILE (@@fetch_status = 0)
BEGIN
    DECLARE @IndexColumns VARCHAR(max)
    DECLARE @IncludedColumns VARCHAR(max)

    SET @IndexColumns = ''
    SET @IncludedColumns = ''

    DECLARE CursorIndexColumn CURSOR FOR
    SELECT, sic.is_descending_key, sic.is_included_column
    FROM sys.tables tb
    INNER JOIN sys.indexes si ON tb.object_id = si.object_id
    INNER JOIN sys.index_columns sic ON si.object_id = sic.object_id AND si.index_id = sic.index_id
    INNER JOIN sys.columns col ON sic.object_id = col.object_id AND sic.column_id = col.column_id
    WHERE si.type > 0 AND (si.is_primary_key = 0 OR si.is_unique_constraint = 0)
    AND schema_name(tb.schema_id) = @SchemaName AND = @TableName AND = @IndexName
    ORDER BY sic.index_column_id

    OPEN CursorIndexColumn
    FETCH NEXT FROM CursorIndexColumn INTO @ColumnName, @IsDescendingKey, @IsIncludedColumn

    WHILE (@@fetch_status = 0)
    BEGIN
        IF @IsIncludedColumn = 0
            SET @IndexColumns = @IndexColumns + @ColumnName + CASE WHEN @IsDescendingKey = 1 THEN ' DESC, ' ELSE ' ASC, ' END
        ELSE
            SET @IncludedColumns = @IncludedColumns + @ColumnName + ', '

        FETCH NEXT FROM CursorIndexColumn INTO @ColumnName, @IsDescendingKey, @IsIncludedColumn
    END

    CLOSE CursorIndexColumn
    DEALLOCATE CursorIndexColumn

    -- Trim the trailing ', ' from the column lists
    SET @IndexColumns = substring(@IndexColumns, 0, len(ltrim(rtrim(@IndexColumns))))
    SET @IncludedColumns = CASE WHEN len(@IncludedColumns) > 0 THEN substring(@IncludedColumns, 0, len(@IncludedColumns)) ELSE '' END

    SET @TSQLScripCreationIndex = ''
    SET @TSQLScripDisableIndex = ''
    SET @TSQLScripCreationIndex = 'CREATE ' + @is_unique + @IndexTypeDesc + ' INDEX ' + QUOTENAME(@IndexName) + ' ON ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName)
    + CASE WHEN @IndexTypeDesc = 'NONCLUSTERED COLUMNSTORE' THEN ' (' + @IncludedColumns + ') '
           ELSE ' (' + @IndexColumns + ') '
      END
    + CASE WHEN @IndexTypeDesc = 'NONCLUSTERED COLUMNSTORE' AND len(@IncludedColumns) > 0 THEN ''
           ELSE CASE WHEN LEN(@IncludedColumns) > 0 THEN CHAR(13) + 'INCLUDE (' + @IncludedColumns + ')' ELSE '' END
      END
    + CASE WHEN @IndexTypeDesc NOT LIKE ('%COLUMNSTORE%') THEN CHAR(13) + 'WITH (' + @IndexOptions + ') ' + ' ON ' + QUOTENAME(@FileGroupName) ELSE '' END + ';'

    INSERT INTO [WORK].[IDXDEF] (SchemaName, TableName, IndexName, IndexDef)
    VALUES (@SchemaName, @TableName, @IndexName, @TSQLScripCreationIndex)

    FETCH NEXT FROM CursorIndex INTO @SchemaName, @TableName, @IndexName, @is_unique, @IndexTypeDesc, @IndexOptions, @is_disabled, @FileGroupName
END

CLOSE CursorIndex
DEALLOCATE CursorIndex
END

When that tested out, it was time for the next step of the task: dynamically dropping the indexes. But I wanted to ensure that when the indexes were dropped, the index definitions would be safely stored (I’m like most other DBAs and sort of enjoy being employed). That resulted in the creation of the following stored proc:

CREATE PROCEDURE [dbo].[sp_DropIndexes]
AS
BEGIN

-- Capture the index definitions before dropping anything
EXEC sp_GetIndexDefinitions

DECLARE @SchemaName NVARCHAR(100)
DECLARE @TableName NVARCHAR(256)
DECLARE @IndexName NVARCHAR(256)
DECLARE @DropIndex NVARCHAR(max)

DECLARE CursorIDXDrop CURSOR FOR
SELECT DISTINCT AS schemaname, AS tblname, AS indexname
FROM sys.tables st
INNER JOIN sys.schemas ss ON st.schema_id = ss.schema_id
INNER JOIN sys.indexes si ON st.object_id = si.object_id
WHERE si.type <> 0 AND st.is_ms_shipped = 0 AND <> 'sysdiagrams'
AND (si.is_primary_key = 0 AND si.is_unique_constraint = 0)

OPEN CursorIDXDrop
FETCH NEXT FROM CursorIDXDrop INTO @SchemaName, @TableName, @IndexName
WHILE (@@fetch_status = 0)
BEGIN
    SET @DropIndex = 'DROP INDEX ' + QUOTENAME(@IndexName) + ' ON ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName)
    EXEC sp_executesql @DropIndex
    FETCH NEXT FROM CursorIDXDrop INTO @SchemaName, @TableName, @IndexName
END
CLOSE CursorIDXDrop
DEALLOCATE CursorIDXDrop
END

After that worked, with the index definitions safely stored so that I could manually re-create them if necessary, it was time to move on to the third step of dynamically re-creating the indexes. That resulted in creation of the following stored proc:

CREATE PROCEDURE [dbo].[sp_RebuildIndexes]
AS
BEGIN

DECLARE @Command nvarchar(max)
DECLARE @NumRows int

DECLARE CursorIndexes CURSOR FOR
SELECT DISTINCT CAST(IndexDef as nvarchar(max)) as IndexDef FROM work.IDXDEF

SET @NumRows = 0
OPEN CursorIndexes
FETCH NEXT FROM CursorIndexes INTO @Command
WHILE (@@fetch_status = 0)
BEGIN
    EXEC sp_executesql @Command
    SET @NumRows = @NumRows + 1
    FETCH NEXT FROM CursorIndexes INTO @Command
END
CLOSE CursorIndexes
DEALLOCATE CursorIndexes
-- PRINT LTRIM(RTRIM(CAST(@NumRows as varchar(10)))) + ' Indexes Recreated from stored definitions'
END

By having my update script call sp_DropIndexes before the update and sp_RebuildIndexes after it, it was very easy to drop the indexes, run the updates and then re-create the indexes in a reasonable time period, without having to continually revise code.
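The calling pattern of the update script is simple to sketch; here is its shape with a recording executor standing in for a real database connection (the table name and statements are illustrative):

```python
def run_update(execute):
    """Drop the indexes, run the heavy DML, then recreate the indexes
    from the definitions captured in WORK.IDXDEF."""
    execute("EXEC dbo.sp_DropIndexes")
    execute("UPDATE dbo.BigTable SET Col = NewValue")  # hypothetical expensive update
    execute("EXEC dbo.sp_RebuildIndexes")

# Record the statements instead of running them, to show the ordering.
calls = []
run_update(calls.append)
```

The point of the shape is the ordering guarantee: the definitions are persisted (inside sp_DropIndexes, via sp_GetIndexDefinitions) before any index is dropped, so a failure mid-update still leaves the definitions recoverable.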

The Imagine Cup 2016 World Finals Have Come to a Close!

MSDN Blogs - Fri, 07/29/2016 - 19:10

The Japanese team narrowly missed out on a prize, but Imagine Cup really drove home that it isn’t just about impressive technology: business elements matter just as much. How do you turn the technology into a business? What is the go-to-market strategy? What makes the technology better than existing alternatives? How large is its market potential, and how many people could it help?



After the awards ceremony, the Imagine Cup competitors and the students who attended the MSP Summit went to the Holographic Academy to try HoloLens. They came back thrilled, saying it was amazing!


At Friday’s World Championship, the winning teams of the Games, Innovation and World Citizenship categories each gave a three-minute pitch on stage. At the final showdown, held at Garfield High School in Seattle, Star Wars characters greeted the attendees, and the ceremony featured Microsoft Executive Vice President Judson Althoff; Corporate Vice President of Developer Platform & Evangelism and Chief Evangelist Steve Guggenheimer; 2014 Imagine Cup winner and Microsoft Computer Science Curriculum Developer Kasey Champion; and a Hollywood star. The winner was ENTy, the Romanian team from the Innovation category, who developed a simple medical device that monitors body balance and posture in real time. Their presentation also made a point of highlighting that the device is already in use by several doctors and hundreds of patients.

After the awards ceremony, the MSPs did a great job as mentors for local children at the Robo World Cup Hackathon!

To the Biomachine Industrial team from the University of Tsukuba, Japan’s Imagine Cup representatives, and to our two MSPs: thank you for all your hard work!!

Packaging issues with Visual Studio Team Services – 7/30 – Investigating

MSDN Blogs - Fri, 07/29/2016 - 19:07

Initial Update: Saturday, 30 July 2016 02:01 UTC

We are actively investigating issues with Visual Studio Team Services. Some customers may see a build failure with NuGet errors if both of the conditions below apply:

1) You have at least one VSTS NuGet package source.
2) You have more than one NuGet restore task in the build definition.

The symptom is that the second NuGet restore build task fails.

You will see an error message like the one below:
Unable to find version ‘1.8.1’ of package ‘Microsoft.Cosmos.Client’.
##[error]Error: C:BAagentWorkerToolsnuget.exe failed with return code: 1
##[error]Packages failed to install
##[debug]task result: Failed

Workaround: If you have the above repro, go to your agent pool in the web UI, right click, and choose “Update All Agents”.

Next Update: Before 30 July 2016 06:00 UTC

We are working to resolve this issue and apologize for any inconvenience.


.NET 4.6.2 and long paths on Windows 10

MSDN Blogs - Fri, 07/29/2016 - 18:21

The Windows 10 Anniversary update is almost out the door. .NET 4.6.2 is in the update (as we’ve looked at in the past few posts). I’ve talked a bit about what we’ve done in 4.6.2 around paths, and how that is targeted at both allowing access to previously inaccessible paths and opens up the door for long paths when the OS has support. Well, as people have discovered, Windows 10 now has started to open up support. In this post I’ll talk about how to enable that support.

Enabling Win32 Long Path Support

Long paths aren’t enabled by default yet. You need to set a policy to enable the support. To do this, search for “Edit group policy” in the Start search bar or run “gpedit.msc” from the Run command (Windows-R).

In the Local Group Policy Editor navigate to “Local Computer Policy: Computer Configuration: Administrative Templates: All Settings“. In this location you can find “Enable Win32 long paths“.

Enabling Win32 long paths in the policy editor.

After you’ve turned this on you can fire up a new instance of PowerShell and free yourself from the constraints of MAX_PATH! The key File and Directory Management APIs respect this and now allow you to skip the MAX_PATH check without having to resort to using the “\\?\” prefix (look back at my earlier posts on path formats to understand how this works). This works in PowerShell because PowerShell has opted into the new .NET path support (being that it is a .NET application).

If you look carefully at the description in the setting you’ll see “Enabling Win32 long paths will allow manifested win32 applications…“. That’s the second gate to getting support: your app must have a specific manifest setting. You can see what this is by opening C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe in Visual Studio or some other manifest viewer. Doing so you’ll see the following section in its manifest:

<application xmlns="urn:schemas-microsoft-com:asm.v3">
  <windowsSettings>
    <longPathAware xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">true</longPathAware>
  </windowsSettings>
</application>

These two gates will get you the native (Win32) support for long paths. In a managed app you’ll also need the new behavior in .NET. The next section covers this.

Configuring a Simple Long Path .NET Console App

This example uses a new C# Console Application in Visual Studio 2015.

The first thing to do after creating a new console app is edit the App.Config file and add the following after the <startup> end tag:

<runtime>
  <AppContextSwitchOverrides value="Switch.System.IO.UseLegacyPathHandling=false;Switch.System.IO.BlockLongPaths=false" />
</runtime>

Once the 4.6.2 Targeting Pack is released you can alternatively select 4.6.2 as your target framework in the project properties instead of using the app.config setting. The defaults for these two values are true if the target framework is 4.6.1 or earlier.

The second thing to do is add the Application Manifest File item to your project. After doing so add the windowsSettings block I shared above. In the default template there is already a commented-out section for windowsSettings, you can uncomment this and add this specific longPathAware setting.

Here is a sample block to add to your Main() method to test it out:

string reallyLongDirectory = @"C:\Test\abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ";
reallyLongDirectory = reallyLongDirectory + @"\abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ";
reallyLongDirectory = reallyLongDirectory + @"\abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ";
Console.WriteLine($"Creating a directory that is {reallyLongDirectory.Length} characters long");
Directory.CreateDirectory(reallyLongDirectory);

You can open up PowerShell and go look at the directory you created! Yayyyyyy! This is the start of what has been a very long journey to remove MAX_PATH constraints. There is still much to do, but now the door is finally open. The rest can and will come; keep your feedback coming in to keep us on track!

Note that in this initial release CMD doesn’t support long paths. The Shell doesn’t add support either, but previously had limited support utilizing 8.3 filename trickery. I’ll leave it to the Windows team for any further details.

SQL Updates Newsletter – July 2016

MSDN Blogs - Fri, 07/29/2016 - 18:03
Recent Releases and Announcements


Recent Whitepapers/E-books/Training/Tutorials


Monthly Script Tips


Windows Server 2016 – Get Started


Issue Alert


Fany Carolina Vargas | SQL Dedicated Premier Field Engineer | Microsoft Services




What’s in a PDB file? Use the Debug Interface Access SDK

MSDN Blogs - Fri, 07/29/2016 - 17:37

It’s easy to use C# code and MSDia140.dll from the Debug Interface Access SDK to examine what’s inside a PDB.

A PDB (Program Database) is generated when an executable such as an EXE or DLL is built. It includes a lot of information about the file that is very useful to a debugger, including the names and addresses of symbols.

Managed code PDB contents are somewhat different from native code: a lot of the managed code information can be obtained from other sources. For example, the Type of a symbol can be obtained from the Metadata of the binary.

Below is some sample code that uses the DIA SDK to read a PDB and display its contents.

See also

Write your own Linq query viewer

Use DataTemplates and WPF in code to create a general purpose LINQ Query results display

using Dia2Lib;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;

// File->New->Project->C# Windows WPF Application.
// Replace MainWindow.Xaml.cs with this content.
// Add a reference to c:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\Packages\Debugger\msdia140.dll
namespace WpfApplication1
{
    public partial class MainWindow : Window
    {
        class SymbolInfo
        {
            public int Level { get; set; } // recursion level
            public string SymbolName { get; set; }
            public uint LocationType { get; set; }
            public ulong Length { get; set; }
            public uint AddressOffset { get; set; }
            public uint RelativeAddress { get; set; }
            public string SourceFileName { get; set; }
            public uint SourceLineNo { get; set; }
            public SymTagEnum SymTag { get; set; }
            public string SymbolType { get; set; }
            public override string ToString()
            {
                return $"{SymbolName} {SourceFileName}({SourceLineNo}) {SymbolType}";
            }
        }

        public MainWindow()
        {
            InitializeComponent();
            this.Loaded += (ol, el) =>
            {
                try
                {
                    this.WindowState = WindowState.Maximized;
                    var pdbName = System.IO.Path.ChangeExtension(
                        Assembly.GetExecutingAssembly().Location, "pdb");
                    this.Title = pdbName;
                    var lstSymInfo = new List<SymbolInfo>();
                    using (var diaUtil = new DiaUtil(pdbName))
                    {
                        Action<IDiaEnumSymbols, int> lamEnum = null; // recursive lambda
                        lamEnum = (enumSym, lvl) =>
                        {
                            if (enumSym != null)
                            {
                                foreach (IDiaSymbol sym in enumSym)
                                {
                                    var symbolInfo = new SymbolInfo()
                                    {
                                        Level = lvl,
                                        SymbolName = sym.name,
                                        Length = sym.length,
                                        LocationType = sym.locationType,
                                        SymTag = (SymTagEnum)sym.symTag,
                                        AddressOffset = sym.addressOffset,
                                        RelativeAddress = sym.relativeVirtualAddress
                                    };
                                    var symType = sym.type;
                                    if (symType != null)
                                    {
                                        symbolInfo.SymbolType = symType.name;
                                    }
                                    lstSymInfo.Add(symbolInfo);
                                    if (sym.addressOffset > 0 && sym.addressSection > 0 && sym.length > 0)
                                    {
                                        try
                                        {
                                            IDiaEnumLineNumbers enumLineNums;
                                            diaUtil._IDiaSession.findLinesByAddr(
                                                sym.addressSection,
                                                sym.addressOffset,
                                                (uint)sym.length,
                                                out enumLineNums);
                                            if (enumLineNums != null)
                                            {
                                                foreach (IDiaLineNumber line in enumLineNums)
                                                {
                                                    symbolInfo.SourceFileName = line.sourceFile.fileName;
                                                    symbolInfo.SourceLineNo = line.lineNumber;
                                                    break;
                                                }
                                            }
                                        }
                                        catch (Exception)
                                        {
                                        }
                                    }
                                    switch (symbolInfo.SymTag)
                                    {
                                        case SymTagEnum.SymTagFunction:
                                        case SymTagEnum.SymTagBlock:
                                        case SymTagEnum.SymTagCompiland:
                                            IDiaEnumSymbols enumChildren;
                                            sym.findChildren(SymTagEnum.SymTagNull, name: null,
                                                compareFlags: 0, ppResult: out enumChildren);
                                            lamEnum.Invoke(enumChildren, lvl + 1);
                                            break;
                                    }
                                }
                            }
                        };
                        /* query by table of symbols
                        IDiaEnumTables enumTables;
                        diaUtil._IDiaSession.getEnumTables(out enumTables);
                        foreach (IDiaTable tabl in enumTables)
                        {
                            var tblName = tabl.name;
                            if (tblName == "Symbols")
                            {
                                IDiaEnumSymbols enumSyms = tabl as IDiaEnumSymbols;
                                lamEnum.Invoke(enumSyms, 0);
                            }
                        }
                        /*/
                        // query by global scope
                        var globalScope = diaUtil._IDiaSession.globalScope;
                        IDiaEnumSymbols enumSymGlobal;
                        globalScope.findChildrenEx(SymTagEnum.SymTagNull, name: null,
                            compareFlags: 0, ppResult: out enumSymGlobal);
                        lamEnum.Invoke(enumSymGlobal, 0);
                        //*/
                    }

                    var gridvw = new GridView();
                    foreach (var mem in typeof(SymbolInfo).GetMembers()
                        .Where(m => m.MemberType == MemberTypes.Property))
                    {
                        var gridCol = new GridViewColumn();
                        gridvw.Columns.Add(gridCol);
                        gridCol.Header = new GridViewColumnHeader() { Content = mem.Name };
                        var template = new DataTemplate(typeof(SymbolInfo));
                        var factTblk = new FrameworkElementFactory(typeof(TextBlock));
                        factTblk.SetBinding(TextBlock.TextProperty, new Binding(mem.Name));
                        // for wide columns let's set the tooltip too
                        factTblk.SetBinding(TextBlock.ToolTipProperty, new Binding(mem.Name));
                        factTblk.SetValue(TextBlock.MaxWidthProperty, 300.0);
                        var factSP = new FrameworkElementFactory(typeof(StackPanel));
                        factSP.SetValue(StackPanel.OrientationProperty, Orientation.Horizontal);
                        factSP.AppendChild(factTblk);
                        template.VisualTree = factSP;
                        gridCol.CellTemplate = template;
                    }
                    var lv = new ListView() { ItemsSource = lstSymInfo, View = gridvw };
                    lv.DataContext = lstSymInfo;
                    this.Content = lv;
                }
                catch (Exception ex)
                {
                    this.Content = ex.ToString();
                }
            };
        }
    }

    public class DiaUtil : IDisposable
    {
        public IDiaDataSource _IDiaDataSource;
        public IDiaSession _IDiaSession;
        public DiaUtil(string pdbName)
        {
            _IDiaDataSource = new DiaSource();
            _IDiaDataSource.loadDataFromPdb(pdbName);
            _IDiaDataSource.openSession(out _IDiaSession);
        }
        public void Dispose()
        {
            Marshal.ReleaseComObject(_IDiaSession);
            Marshal.ReleaseComObject(_IDiaDataSource);
        }
    }
}


Tips and Tricks on Doing “Lift and Shift” On-Prem Systems to Azure

MSDN Blogs - Fri, 07/29/2016 - 17:14

While Microsoft Azure offers an open and flexible platform for PaaS solutions, customers and partners usually take a “Lift and Shift” approach to moving their existing apps to Azure: they try to keep and run the systems as-is, or with minimal changes. The reason they take this approach is rather obvious, whether the goal is a proof of concept (POC), a pilot, or a full migration. Most of these on-prem systems have dependencies on other internal or external systems, and any change to the infrastructure configuration, not to mention source code changes, requires further testing, which takes time and people. With this approach, they can also evaluate the overall cost compared to on-prem hosting. Working with several ISV partners, I have discovered and learned 12 important lessons, most of which relate to manageability and security and apply to both hybrid and cloud-only migrations, and I would like to share them here.

Working with Azure Resource Group

A resource group is a container that holds related resources for an application, along with their role-based access controls. It’s up to you to determine how many resource groups you want as you create VMs, networks, etc. to support one or many apps on Azure. While it is not wrong to create multiple resource groups for your apps, with one VNET within each resource group, you will discover very quickly that doing so requires a fair amount of configuration if you have to enable communications between these VNETs.

It is common, however, that you create one dedicated resource group for Azure networking, that is, Azure VNET and subnets, and that you can grant read-only, contributor or custom role permissions to the group of people who are responsible for managing the networking at your organization.

Creating Windows Active Directory AD Domain

When you have multiple Windows Active Directory (AD) domains, you may be thinking about whether you should consolidate them to simplify AD management. On the other hand, you may be wondering whether such a change would break existing administrative boundaries between teams. The general rule of thumb is to make no or little change during the initial “lift and shift” phase unless the benefits of making changes outweigh the no-change option.

It is worth noting that the Windows AD database and logs must be deployed on Azure data disks (the same disk or separate disks), and that the host cache preference setting on those data disks must be set to NONE. Here is why. Active Directory Domain Services (AD DS) uses update sequence numbers (USNs) to keep track of replication of data between domain controllers. Failure to disable write caching may, under certain circumstances, introduce USN rollback resulting in lingering objects and other problems. For more info, read the AD documentation.
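As a sketch, disabling host caching when attaching the data disk for the domain controller can be done with the AzureRM PowerShell cmdlets. The resource group, VM, storage account, and disk names below are all hypothetical placeholders:

```powershell
# Hypothetical names; adjust to your own subscription and resource group.
$vm = Get-AzureRmVM -ResourceGroupName "rg-identity" -Name "dc01"

# Attach the data disk that will hold the AD DS database and logs,
# with host caching explicitly disabled (-Caching None).
$vm = Add-AzureRmVMDataDisk -VM $vm -Name "dc01-ntds" `
    -VhdUri "https://mystorageacct.blob.core.windows.net/vhds/dc01-ntds.vhd" `
    -Caching None -DiskSizeInGB 128 -Lun 0 -CreateOption Empty

Update-AzureRmVM -ResourceGroupName "rg-identity" -VM $vm
```

The important part is `-Caching None`; the other parameters are ordinary data-disk settings.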

To further protect your AD identity systems, you can implement the Tier model.

Considering Custom AD Domain with Azure DNS

You can use Azure DNS or your own DNS. If you use custom domains or subdomains, e.g., for public accessible URLs, you 

Configuring VNET

You can choose to have one or many VNETs. My colleague Igor has put together a nice blog post explaining how to configure communications between these VNETs. It is not uncommon to go with one VNET with multiple subnets, place it in a separate resource group, and then grant appropriate permissions to users from other resource groups.

Leveraging Network Security Group and User Defined Routes

To protect your resources in Azure, you can use NSG to set up access controls and UDR to route traffic flows.
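For illustration, an NSG rule and a user defined route can be created with the AzureRM PowerShell cmdlets. All names, address prefixes, and IP addresses below are hypothetical placeholders:

```powershell
# Hypothetical: allow RDP to the web subnet only from a management subnet.
$rdpRule = New-AzureRmNetworkSecurityRuleConfig -Name "allow-rdp-from-mgmt" `
    -Access Allow -Protocol Tcp -Direction Inbound -Priority 100 `
    -SourceAddressPrefix "" -SourcePortRange "*" `
    -DestinationAddressPrefix "*" -DestinationPortRange "3389"

New-AzureRmNetworkSecurityGroup -ResourceGroupName "rg-network" `
    -Location "westus" -Name "nsg-web" -SecurityRules $rdpRule

# Hypothetical: route outbound traffic through a virtual appliance.
$route = New-AzureRmRouteConfig -Name "to-appliance" -AddressPrefix "" `
    -NextHopType VirtualAppliance -NextHopIpAddress ""
New-AzureRmRouteTable -ResourceGroupName "rg-network" -Location "westus" `
    -Name "rt-web" -Route $route
```

The route table then gets associated with the subnets whose traffic should flow through the appliance.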

Adding Virtual Appliances to meet network requirements

Virtual appliances are typically Linux or FreeBSD-based VMs on Azure that perform specific network functions including security (Firewall, IDS , IPS), Router/VPN, application delivery controller and WAN optimization. They are available through partner solution on the Azure Marketplace and can be used to meet on-prem network requirements.

Setting up jump box for secure remote access

Despite different views on their benefits, as mentioned in this skyport blog post, jump boxes are used today to provide secure remote access for administrators. In conjunction with Azure NSGs, UDRs, and virtual appliances, jump boxes (two for high availability) can be configured behind the virtual appliances (also two for high availability) with no public IP. This way, only authorized administrators can get on the jump box through the virtual appliance and then RDP to internal resources.

Providing multi-factor authentication on remote access

You can easily enable MFA through Azure AD premium. In addition, you can add MFA to RDP servers. For more info on the latter, read the white paper “Secure RDP Connection to on premise servers using Azure MFA – Step by Step Guide“.

Storing keys and secrets in Key Vault

You can use Azure Key Vault to create and store keys and passwords using PowerShell or CLI. There is no portal UI at the moment, but one will be added. Also, there are currently no notifications/alerts for keys that are due to expire, but this is a known common feature request.
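A minimal PowerShell sketch of creating a vault and storing a secret; the vault, resource group, and secret names are hypothetical:

```powershell
# Hypothetical vault and secret names.
New-AzureRmKeyVault -VaultName "contoso-kv" -ResourceGroupName "rg-security" -Location "westus"

$secret = ConvertTo-SecureString -String "placeholder-password" -AsPlainText -Force
Set-AzureKeyVaultSecret -VaultName "contoso-kv" -Name "SqlAdminPassword" -SecretValue $secret

# Read the secret back later:
(Get-AzureKeyVaultSecret -VaultName "contoso-kv" -Name "SqlAdminPassword").SecretValueText
```

Applications can then retrieve the secret at runtime instead of keeping it in configuration files.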

Dealing with backup and DR issues

You can use Azure Backup service to back up files. For Bitlocker protected volume, the volume must be unlocked before the backup can occur. More info at Azure Backup service- FAQ

You can use Azure Site Recovery Service (ASR) to migrate an on-prem system to a secondary site or to Azure. However, site to site within Azure is not supported currently.

Working around the Linux cluster issue

Linux clustering requires shared access to a disk, which is not currently supported on Azure. There are some workarounds that you can find from the Linux community. For example, this blog post, “Step-By-Step: How to configure a Linux failover cluster in Microsoft Azure IaaS without shared storage #azure #sanless”, walks you through all steps required to configure a highly available, 2-node MySQL cluster (plus witness server) on Azure VMs.

Monitoring your Azure environments with OMS

Azure Operations Management Suite (OMS) is your best bet when it comes to monitoring the health of your systems on Azure. Keep in mind that services such as backup that are not available in the suite today are being added very rapidly.

Guest Blog: Paul Woods – Adopt & Embrace. Reflections on my first time at the Worldwide Partner Conference

MSDN Blogs - Fri, 07/29/2016 - 15:44

It has been almost two weeks since I landed back in Brisbane after a whirlwind week and a bit in North America for the Microsoft Worldwide Partner Conference – my, and Adopt & Embrace’s, first WPC! I think I am finally over the jet lag. But the work has only just begun!

In my final guest post on the Microsoft Australia Partner Blog about my “First Time Attendee” experience at WPC, I wanted to take some time to reflect on the experience and share with you my key takeaways, actions, progress against the goals I set etc.

As a general statement before I get into the detail – if someone asked me whether or not WPC was a worthwhile investment as an emerging Microsoft Partner… my answer would be a resounding YES.

There are a couple of different ways I can unpack that very quick answer.

WPC enabled me to work on the business, not in it

Like many smaller Microsoft partners, I wear many hats in the business. One of the interesting things about being away from the business for a week at WPC (well as much as I could be) is that it gave me a little bit of breathing, and more importantly thinking space. I could take most of those hats off! From when I stepped on the plane in Brisbane until when I landed back in Brisbane 9 days later… I was working “on my business, not in it”. WPC was a great catalyst to enable me to get out of the minutiae of customer meetings, proposals, billable work etc and really think about how we can improve our organisation and better align within the Microsoft eco-system. There are already a number of decisions we have made in the business that we would not have made (or had the chance to even think about) if we didn’t “pull the rip cord” and land at WPC for the week.

WPC enabled me to establish and/or reinforce key relationships across Microsoft

Being a Partner Seller based in Brisbane, we have reasonably good access to the local branch. However, Microsoft doesn’t start and finish at 400 George Street. There are plenty of Microsoft stakeholders that we have established ‘virtual’ relationships over time that we very rarely get face time with. Whether it was members of the Office Business Group, the Partner team, Sales Managers or the Executive team, WPC gave me the opportunity to have a meal, a social drink, or a formal meeting with people we don’t normally get to see all that often. To put it in perspective the conversations and catch ups that we managed to have in just one week in Toronto was the equivalent of 3 or 4 trips to Sydney to engage with the same people. But in a more relaxed, but also more focused environment.

Park Microsoft Australia for a second though… the real advantage of WPC was being able to establish or reinforce existing relationships with Microsoft stakeholders from APAC, or Redmond. Be it Ananth Lazurus, the APAC Partner Lead out of Singapore… Brian Kealey, the Country Manager for Sri Lanka and the Maldives… Steven Malme, the Senior Director for Corporate Accounts in the Asia time zone… Cyril Belikoff from the Microsoft Office team in Redmond who has responsibility for the O365 active usage number globally… to be honest it would be possible to drop even more names. Beyond name dropping, we have follow up actions and activities with most of those stakeholders, all focused on mutual “win/win” outcomes. Would that be possible without going to WPC – well yes. However, it would be difficult to achieve so much progress so quickly without everyone being at WPC at the same time.

WPC enabled me to connect with other partners from around the world and learn from their success

Calling WPC a “melting pot” of Microsoft Partners from around the world may sound a bit clichéd… but it really is. Formally, I was able to sit down with a number of partners who traditionally would be considered competitors – except that geography means that realistically we are not competitors, but organisations focused on serving our own markets better. For example, I had a great 1 hour meeting in the WPC Connect meeting zone with Richard and Kanwal from 2ToLead. We explored opportunities to deliver services on each other’s behalf for our respective (and growing) lists of customers with international operations.

Other partnership opportunities emerged between partners focused on serving different markets here in Australia as well. Whether it was exploring how to assist LSPs broaden their customer conversation to accelerate active usage, or help solution partners broaden their offering to existing customers, there were a number of very fruitful discussions.

RE: the Australia specific conversations – yes they could have taken place at the Australia Partner Conference, but the key difference with WPC is that because only 250-300 people from Australia attend (and not 2000) there is a very high likelihood you are having those conversations with a decision maker who can act (or delegate action) based on the outcomes of your meeting. Another big tick for WPC!

WPC enabled me to learn some new things

The opportunity to learn is everywhere! The content of the keynotes and the handful of breakout sessions I managed to attend was valuable. Even more valuable were the conversations, questions and answers you overhear throughout the conference. What are the questions or concerns that the delegates from New Signature – one of the more successful Office 365 partners in North America – are raising at the end of the session? Do they apply to my business? Is that answer relevant to my customers?

Then there are the war stories you get caught up in over a few beers at the networking events like the Australian trip to Niagara Falls. My biggest regret of my experience at WPC was that I didn’t get into “Sponge Mode” fast enough and didn’t realise the useful and actionable knowledge that was being shared right from the start.

WPC has enabled us to establish more authority with our customers

All this week the customers I have talked to have been very keen to learn more about what WPC was like, what the key takeaways were, and what it means for their business. A great ice breaker for an authentic conversation where we can deliver more value by interpreting and contextualising the key announcements and news from WPC for our customers.

So how did we go with regard to our WPC goals?

If you think back to my first guest post in June, there were three key goals we set for WPC

  1. Connect with key stakeholders within Microsoft Australia. Specifically, those that are goaled/targeted on Office 365 active usage / adoption / consumption. We will ensure they are familiar with what personal and professional value Adopt & Embrace can deliver to them and the customers in their territory. We will do this during informal conversations using customer evidence from engagements over the past 6 months.
  2. Similarly connect with key Microsoft Partners within Australia. Specifically, those that are considering augmenting their traditional business with high value advisory services, managed services or IP focused on user adoption. We will ensure they are familiar with our channel friendly approach that enables them to resell Adopt & Embrace’s capability to unlock additional value for their customers. We will do this using customer evidence from through partner engagements over the past 6 months
  3. Finally connect with forward thinking international partners. Specifically, those that have dipped their toe in the water of delivering services around Office 365 adoption / change management / value realisation. Beyond sharing war stories, we want ensure they are familiar with our “Lean User Adoption” methodology and discuss the potential for them to leverage our ‘secret sauce’. We will do this using customer evidence from Lean User Adoption based engagements over the past 6 months

So how did we go? Compared to many attendees (both partners and Microsoft), I had a relatively relaxed week. I didn’t have many meetings pre-scheduled apart from a handful that I had arranged via email or the meeting scheduling tool. This meant that I needed to actively seek out the people I wanted to engage with on site. Across the week there were around 15 meetings that would fall into one of the three buckets listed above. Some of those meetings were 5-minute catch-ups with emails exchanged and a list of actions. Some were an hour long and covered a lot of ground before agreeing on next steps.

If I were to have my time again, I think a little more time planning ahead of arriving on site, and organising a few more scheduled meetings would have helped us unlock more from WPC. That being said, having flexibility in the schedule also meant that we could quickly react and meet with people when the opportunity arose. That careful balance between a calendar jam packed with meetings, and enabling the serendipity of connections at the event.

What about the other side of WPC…

“What happens at WPC stays at WPC” 🙂 Just kidding. What I can say is that there is never nothing to do at the Worldwide Partner Conference. You can’t attend every party. You can’t attend every lunch or breakfast event. The key is to prioritise and think about what value you can extract from each event.

The partner celebration on the other hand was amazing. As per my last guest blog post – Icona Pop were great, Gwen Stefani was sensational. Nothing like a concert to really take your mind off work for a little while.

So what is next?

For me, I have a long list of actions from WPC. Be it follow up activity based on meetings had on site, or changes in our business / approaches / how we communicate about what we do with customers. I am slowly working through the list, I almost need another week off to kick start all the execution we need to do off the back of what we learned at the Microsoft Worldwide Partner Conference.

I think we will have all of that under control just in time for the Australian Partner Conference in September. Wow… only 40 days to go! I have registered, you should make sure you get your ticket today.

Entity Framework Core 1.1 Plans

MSDN Blogs - Fri, 07/29/2016 - 13:57

Now that Entity Framework Core (EF Core) 1.0 is released, our team is beginning work on bug fixes and new features for the 1.1 release. Keep in mind that it’s early days for this release; we’re sharing our plans in order to be open, but there is a high chance things will evolve as we go.

High level goals

Our goal with 1.1 release is to make progress on the items that are blocking folks from using EF Core.

  • Fix a lot of the bugs reported on the 1.0 release in order to improve stability
  • Tackle a number of critical O/RM features that are currently not implemented in EF Core

EF Core 1.1 is scheduled for Q4 2016 / Q1 2017. We’ll have a more exact date as we get closer to the release and decide where to draw the line on features to be included.

What features are we working on

Over the next couple of months we are beginning work on a number of features. Some we expect to include in 1.1, others we expect to deliver in a future release.

Features we expect to ship in 1.1

Following is the list of features that our team is expecting to include in the 1.1 release.

  • LINQ improvements
    • Improved translation to enable more queries to successfully execute, with more logic being evaluated in the database (rather than in-memory).
    • Queries for non-model types allows a raw SQL query to be used to populate types that are not part of the model (typically for denormalized view-model data).
  • DbSet.Find provides an easy way to fetch an entity based on its primary key value.
  • Explicit Loading allows you to trigger population of a navigation property on an entity that was previously loaded from the database.
  • Additional EntityEntry APIs from EF6.x such as Reload, GetModifiedProperties, GetDatabaseValues etc.
  • Connection resiliency automatically retries failed database commands. This is especially useful when connecting to SQL Azure, where transient failures are common.
  • Pluralization support for reverse engineering will result in singularized type names and pluralized DbSet property names, regardless of whether the table name is singular or plural.
  • Stable release of tools – whilst the runtime has reached RTM, the tooling for EF Core (plus .NET Core and ASP.NET Core) is still pre-release.
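To give a feel for some of the 1.1 features above, here is a sketch of how `DbSet.Find`, explicit loading, and the new `EntityEntry` APIs might look, modeled on their EF6.x counterparts. `BloggingContext`, `Blogs`, and `Posts` are hypothetical model types, and the exact API shapes may change before 1.1 ships:

```csharp
using (var db = new BloggingContext())
{
    // DbSet.Find: fetch an entity by its primary key value.
    var blog = db.Blogs.Find(1);

    // Explicit loading: populate a navigation property on an
    // already-loaded entity, on demand.
    db.Entry(blog).Collection(b => b.Posts).Load();

    // EntityEntry additions from EF6.x, such as Reload.
    db.Entry(blog).Reload();
}
```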
Other features we’re starting on

We’re also planning to start work on the following features, but do not expect them to be ready in time for inclusion in the 1.1 release. These features will require a longer time to implement, stabilize, and gather feedback from the community and therefore they will have a longer pre-release cycle.

  • Complex/value types are types that do not have a primary key and are used to represent a set of properties on an entity type.
  • Simple type conversions such as string => xml.
  • Visual Studio wizard for reverse engineering a model from an existing database.
Will EF Core replace EF6.x after the 1.1 release

The short answer is no. EF6.x is still the mature, stable data access stack and will continue to be the right choice for many applications when EF Core 1.1 is available. Along with EF Core 1.1, we are also starting work on the EF6.2 release – we’ll share our plans on that shortly. That said, EF Core will be a viable choice for more applications once 1.1 is released. Our documentation has guidance on choosing between EF6.x and EF Core, and this same guidance will apply when 1.1 is released.

Notable exclusions

The full backlog of features that we want to add to EF Core is too long to list here, but we wanted to call out a few features that we know are critical to a lot of applications and will not be worked on in the 1.1 timeframe. This is purely a matter of not being able to work on every feature at the same time, and the order in which things must be implemented. As an example, Lazy Loading will build on top of some of the feature work we are doing in the 1.1 timeframe (such as Explicit Loading).

While we realize this list will cause frustration for some folks, we want to be as transparent as possible so that you all have the information required to make an informed decision about when EF Core would be the right choice for you.

  • Many-to-many relationships without join entity. You can already model a many-to-many relationship with a join entity, see Relationships for details.
  • Alternate inheritance mapping patterns for relational databases, such as table per type (TPT) and table per concrete type (TPC). Table per hierarchy (TPH) is already supported.
  • Lazy loading enables navigation properties to be automatically populated from the database when they are accessed. Some of the features we implement in 1.1 may enable rolling your own lazy loading, but it will not be a first class feature in 1.1.
  • Simple command interception provides an easy way to read/write commands before/after they are sent to the database.
  • Stored procedure mapping allows EF to use stored procedures to persist changes to the database (FromSql already provides good support for using a stored procedure to query, see Raw SQL Queries for details).
  • Spatial data types such as SQL Server’s geography & geometry. The type conversion work we are starting may enable some spatial scenarios, but it will not be a complete solution.
  • Seed data allows you to easily specify a set of data to be present in the database. This is useful for populating lookup tables etc. and for inserting test data.
  • GROUP BY translation will move translation of the LINQ GroupBy operator to the database when used with an aggregate function (i.e. when all underlying rows do not need to be returned to the client).
  • Update model from database allows a model that was previously reverse engineered from the database to be refreshed with changes made to the schema.
  • Model visualization allows you to see a graphical representation of your model.
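To make the GROUP BY translation item above concrete, here is a minimal sketch of the difference between grouping on the client (where every underlying row crosses the wire) and letting the database evaluate the GROUP BY. Python with sqlite3 is used purely for illustration, since EF Core itself is a .NET library; the table and data are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 10.0), ("alice", 5.0), ("bob", 7.0)])

# Client-side grouping: every row is pulled from the database first,
# then aggregated in application code (what EF Core does today).
rows = conn.execute("SELECT customer, total FROM orders").fetchall()
client = {}
for customer, total in rows:
    client[customer] = client.get(customer, 0.0) + total

# Server-side translation: the aggregate runs in the database and only
# one row per group is returned (what GROUP BY translation will enable).
server = dict(conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer"))

print(client == server)  # True
```

Both paths produce the same result; the difference is how many rows travel from the database to the client.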

DSC Resource Kit Community Call August 3

MSDN Blogs - Fri, 07/29/2016 - 13:22

We will be hosting a community call for the DSC Resource Kit 1-2PM on Wednesday, August 3 (PDT).
Call in to ask questions or give feedback about the DSC Resource Kit!

How to Join Skype for Business

Join Skype Meeting


+14257063500 (USA – Redmond Campus) English (United States)
+18883203585 (USA – Redmond Campus) English (United States)

Conference ID: 88745041


The community call agenda is posted on GitHub here.

Backup Migrated Mobile Service (Node.js backend)

MSDN Blogs - Fri, 07/29/2016 - 12:48

You may have gotten this error when trying to back up a migrated Mobile Service built with a Node.js backend:

Database connection string not valid for database MS_TableConnectionString (SQLAzure). Keyword not supported: ‘driver’.

Backup uses ADO.NET internally. The Backup feature reads the application setting MS_TableConnectionString to configure the database backup, but this legacy setting is in the Node.js driver format, which ADO.NET cannot parse.
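As a rough illustration of why the error occurs: ADO.NET splits the connection string into key=value pairs and rejects any keyword it does not recognize, and `Driver` is an ODBC/Node.js-driver keyword, not an ADO.NET one. The sketch below (Python for illustration; the keyword list is a hand-picked subset, not the authoritative ADO.NET list, and the connection string values are made up) mimics that check:

```python
# Subset of keywords SqlConnectionStringBuilder-style parsing accepts
# (illustrative only, not the full ADO.NET list).
ADO_NET_KEYWORDS = {
    "server", "data source", "database", "initial catalog",
    "user id", "uid", "password", "pwd", "encrypt",
    "trustservercertificate", "connection timeout",
    "multipleactiveresultsets", "persist security info",
}

def unsupported_keywords(conn_str):
    """Return the keywords in conn_str that this parser would reject."""
    bad = []
    for part in conn_str.split(";"):
        part = part.strip()
        if not part:
            continue
        key = part.split("=", 1)[0].strip().lower()
        if key not in ADO_NET_KEYWORDS:
            bad.append(key)
    return bad

node_style = ("Driver={SQL Server Native Client 10.0};"
              "Server=myserver,1433;Database=mydb;Uid=user;Pwd=secret;")
print(unsupported_keywords(node_style))  # ['driver']
```

The first keyword the parser trips over is `driver`, which is exactly the keyword named in the error message.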

Don’t use MS_TableConnectionString.  Instead, create a new ADO.NET connection string for your database, configure the database backup settings to pick up this new string, and save your changes.
Walkthrough of the fix: Get the ADO.NET connection string you need

In your migrated Azure Mobile Service, find the name of your Mobile Service database in the MS_TableConnectionString entry under Application Settings.

Example: Driver={SQL Server Native Client 10.0};Server={,1433};Database=jsanderswestmobileservice_db;;Pwd=Ns36hxx8663zZ$$;


Go to the SQL Databases tab, find the database, and click on it.

When it opens, click on the Database Connection Strings.

Copy the ADO.NET connection string and save it

Example:,1433;Initial Catalog=jsanderswestmobileservice_db;Persist Security Info=False;User ID={your_username};Password={your_password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;


Replace the User ID and Password with the DB admin (or equivalent) login.

Important: the UID and PWD from your old connection string do NOT have sufficient privileges to perform the backup.

Example:,1433;Initial Catalog=jsanderswestmobileservice_db;Persist Security Info=False;User ID=dbadmin;Password=dbadminpwd;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
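If you script this credential swap, a tiny helper like the following can substitute the admin login and guard against saving a string that still contains placeholders. This is a hypothetical sketch: the `{your_username}`/`{your_password}` placeholder names match what the portal emits, but `fill_credentials` is not part of any Azure SDK, and the server and database names below are made up.

```python
def fill_credentials(template, user, password):
    """Substitute portal placeholders with real credentials."""
    filled = (template
              .replace("{your_username}", user)
              .replace("{your_password}", password))
    # Guard against saving a string with unresolved placeholders.
    if "{your_" in filled:
        raise ValueError("connection string still contains placeholders")
    return filled

template = ("Server=tcp:myserver.database.windows.net,1433;"
            "Initial Catalog=mydb;Persist Security Info=False;"
            "User ID={your_username};Password={your_password};"
            "MultipleActiveResultSets=False;Encrypt=True;")
print(fill_credentials(template, "dbadmin", "dbadminpwd"))
```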

Create a new BackupConnection with the new value

Go back to your migrated Mobile Service’s App Settings, view the connection strings, and create a new connection string named BackupConnection with the value you built from the ADO.NET connection string.  Ensure the type is SQL Database.

Tab out of the field and save your changes.

Reset the Database configuration in Backup

Go to the Backups setting and hit the Configure Icon

Click on Database Settings, toggle the check icon on MS_TableConnectionString to turn it off, select the new BackupConnection, then hit OK and Save (this forces the configuration to update).

Test your backup by kicking it off manually.


Enabling backups this way is not complicated, but it is a ‘gotcha’, so please let me know if this post helped you out!

BizTalk Boot Camp 2016: September 22 – September 23, 2016

MSDN Blogs - Fri, 07/29/2016 - 11:38

The BizTalk Boot Camp is a free open-to-the-public technical event that provides a deep-dive into our integration story. In this Boot Camp, our focus is on:

  • BizTalk Server 2016 and its new features, including the new Logic Apps adapter
  • Logic Apps
  • Microsoft Flow

The itinerary and hands-on sessions are being created. More specific details will be added as they are finalized.

  • In-person event; Skype is not offered
  • Laptop: bring one; many discussions include hands-on activities
  • Non-disclosure agreement (NDA): An NDA is required and is available to sign upon arrival. Many companies already have an NDA with Microsoft; when you register, we’ll confirm whether one already exists.

Microsoft Corporation
8055 Microsoft Way
Charlotte, NC 28273


Check-in is required. If you are driving to the event, car registration is also required (download and complete Vehicle and Customer Registration).


Registration will open the week of August 1, 2016. Attendance is limited. The event is targeted at a technical audience, including administrators and developers. Registration includes:

  • Attendance both days
  • Breakfast and lunch both days
  • Access to the Microsoft company store

Details coming soon.


Questions? Ask in this post or contact me: mandi dot ohlinger at microsoft dot com.

We hope to see you here!
Mandi Ohlinger

