Feed aggregator

Managing statistics in Azure SQL Data Warehouse

MSDN Blogs - Sun, 08/21/2016 - 18:00

Microsoft Japan Data Platform Tech Sales Team

高木 英朗

 

In a previous entry we introduced what statistics are and how to create them. This time we cover how to manage them.

As mentioned before, statistics are a key input the query optimizer uses to build execution plans; to get the best performance it is also important to keep them up to date.


When to update statistics

The best time to refresh statistics is right after data is loaded or updated, because that is when table sizes and value distributions are most likely to have changed.
If maintaining every statistic takes too long, it is a good idea to limit maintenance to columns such as date columns that receive new values every day, or columns used in JOIN, GROUP BY, ORDER BY, DISTINCT, and so on.
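For example, a minimal sketch of creating single-column statistics on such columns (the table and column names below are made up for illustration):

CREATE STATISTICS stats_FactSales_OrderDate ON dbo.FactSales (OrderDate);
CREATE STATISTICS stats_FactSales_CustomerId ON dbo.FactSales (CustomerId);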

How to tell whether statistics are current

To judge whether statistics are current, you can check when each statistic was last updated.
To do so, run the following query:

SELECT sm.[name] AS [schema_name],
       tb.[name] AS [table_name],
       co.[name] AS [stats_column_name],
       st.[name] AS [stats_name],
       STATS_DATE(st.[object_id], st.[stats_id]) AS [stats_last_updated_date]
FROM sys.objects ob
JOIN sys.stats st ON ob.[object_id] = st.[object_id]
JOIN sys.stats_columns sc ON st.[stats_id] = sc.[stats_id] AND st.[object_id] = sc.[object_id]
JOIN sys.columns co ON sc.[column_id] = co.[column_id] AND sc.[object_id] = co.[object_id]
JOIN sys.types ty ON co.[user_type_id] = ty.[user_type_id]
JOIN sys.tables tb ON co.[object_id] = tb.[object_id]
JOIN sys.schemas sm ON tb.[schema_id] = sm.[schema_id]
WHERE st.[user_created] = 1;


Execution results


If the table size or value distribution has changed since the last update (that is, data has been loaded or modified since then), the statistics need to be updated.
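As a sketch (again with hypothetical names), statistics can be refreshed with UPDATE STATISTICS, either one statistic at a time or for a whole table:

-- Refresh a single statistic after a daily load
UPDATE STATISTICS dbo.FactSales (stats_FactSales_OrderDate);

-- Refresh every statistic on the table (more thorough, but takes longer)
UPDATE STATISTICS dbo.FactSales;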


As mentioned in the earlier article, Azure SQL Data Warehouse does not currently create statistics automatically, so they must be created manually (this is planned to change in the future). Updating statistics whenever the size or distribution of your data changes is just as important, so please make it part of your routine.

Small Basic – Flipping Shapes

MSDN Blogs - Sun, 08/21/2016 - 17:00

Today, I’ll introduce a sample program that flips a picture drawn with the Shapes object. The program ID is MMF211.

This program uses the Shapes.Move(), Shapes.Zoom(), and Shapes.Rotate() operations to flip the picture. With this logic, the triangle must be symmetric (an isosceles triangle).

Be in to WIN a Surface Pro 4

MSDN Blogs - Sun, 08/21/2016 - 15:58

Heading to Microsoft Ignite NZ this year? When you convince an Ignite Newbie to join you at Microsoft Ignite NZ 2016, you both go into the draw to win a brand new Microsoft Surface Pro 4 each.

“How do I enter?”
Simple. Share the promo code ‘IGNITENEWBIE’ with someone who hasn’t been to the conference for at least 2 years – maybe never! If your powers of persuasion have them buying a ticket to this year’s conference using the promo code ‘IGNITENEWBIE’, then you are both in the draw to win!

Convince an Ignite Newbie to join you and you could both walk out of the conference clued-up, connected, and the proud owners of the tablet that can replace your laptop.

You and your Ignite Newbie must both have a full conference pass to be eligible to win. Only tickets purchased using the ‘IGNITENEWBIE’ promo code will be entered into the draw to win. Full Terms and Conditions.


WIIFM – “PowerShell available on Linux”

MSDN Blogs - Sun, 08/21/2016 - 13:59

When I first read this announcement – https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/ – I wondered what problem we were trying to solve, since Linux already has Bash, Python, and Perl for automation and scripting. I was proved wrong after watching this video – https://youtu.be/2WZwv7TxqZ0

Still, this is a very cool move by our PowerShell team: opening up the source code and extending support to Linux and macOS. DevOps teams will love it, and it comes in handy for managing and automating Azure resources across operating systems. There are plans to bring most of the modules over down the line, so execution should become seamless.

Key takeaway: if we have a PowerShell script written on Windows – say for Azure VM creation or management, Docker, AWS storage, VMware resource management, etc. – we can reuse the same script on Linux and macOS. There is no difference in syntax; a straight copy and paste should work.
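For instance, a quick sketch (not from the original post) of the kind of script that runs unchanged on Windows, Linux, and macOS:

# Five busiest processes by CPU time
Get-Process |
    Sort-Object CPU -Descending |
    Select-Object -First 5 Name, Id, CPU

# Files larger than 10 MB under the current directory
Get-ChildItem -Recurse -File |
    Where-Object { $_.Length -gt 10MB } |
    Select-Object FullName, Length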

Sweet – “We are partnering with third party companies – Chef, Amazon Web Services, VMware, and Google to name a few – to create a rich, seamless experience across the platforms you know and use.”

“Write once, run anywhere” philosophy – a heterogeneous management toolset.

Azure News on Friday (KW33/16)

MSDN Blogs - Sun, 08/21/2016 - 12:54

This week again brought plenty of news about the Microsoft Azure platform. Here are the details…

Latest news

  • 19.08. – Exporting your database to MySQL in-app database
    Exporting a database to an in-app MySQL database
  • 19.08. – Announcing MySQL in-app (Preview) for Web Apps
    New feature for Azure App Service Web Apps – in-app MySQL databases, MySQL as a service
  • 18.08. – Webcast: Citrix & Microsoft – Making Cloud Simpler, Business Faster
    Webcast on using Citrix technologies to deliver Windows clients via Azure
  • 18.08. – Microsoft announces PowerShell is open sourced, available on Linux
    Microsoft PowerShell is now open source and, in addition to Windows, available on Linux
  • 18.08. – Introducing Image Processing in U-SQL
    Image processing as part of big data analysis with U-SQL in Azure Data Lake Analytics
  • 15.08. – Announcing HTTP/2 support for all customers with Azure CDN from Akamai
    HTTP/2 support now available for Azure CDN from Akamai

New courses in the MVA

  • 18.08. – Introduction to Azure Security Center
    Overview of Azure Security Center

New videos

  • 19.08. – Diagnose and resolve issues with Azure Troubleshooting
    Diagnosing and resolving problems in Azure with the help of the Azure portal
  • 17.08. – Tuesdays with Corey: Azure Functions Part 2 – Expanded Demo
    Corey Sanders with a look at Azure Functions
  • 16.08. – Using Azure Notification Hubs for Push Notifications in Google Chrome
    Sending push notifications from Chrome apps via Azure Notification Hubs
  • 16.08. – Using Azure Notification Hubs in an Android Application
    Sending push notifications from Android apps via Azure Notification Hubs
  • 16.08. – Using Azure Notification Hubs from a Xamarin.Android Application
    Sending push notifications from Xamarin apps via Azure Notification Hubs
  • 16.08. – Using Temporal in SQL Server 2016 and Azure SQL Database
    Using temporal functionality in Azure SQL Database

PowerShell on Linux and Mac

MSDN Blogs - Sun, 08/21/2016 - 10:29

This week Microsoft announced that PowerShell is now open source and also works on Linux and Mac.

This is possible because PowerShell runs on .NET Core, which is Microsoft’s new open-source framework for developing cross-platform applications. For details, see the article: Creating a .NET Core Console application in just 5 minutes.

The PowerShell open source project is available at https://github.com/PowerShell/PowerShell, and downloadable alpha versions are available for the following operating systems:

After installation, simply type powershell at the prompt to run PowerShell. In the following picture, I ran the Get-Process cmdlet to list the running processes of an Ubuntu machine:
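A minimal sketch of that flow from an Ubuntu terminal (the prompt paths are illustrative and the output is omitted):

$ powershell
PS /home/user> Get-Process
PS /home/user> exit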

 

So far there are 345 commands available cross-platform: 210 cmdlets and 135 functions.
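If you want to reproduce a count like this on your own install, here is a quick sketch (exact numbers will vary by release):

# Break the available commands down by type
Get-Command | Group-Object CommandType | Select-Object Name, Count

# Total number of commands
(Get-Command).Count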

For more details, see the articles:

PowerShell on Linux and Open Source!

PowerShell Official Page.

 

 

PowerShell on Linux and Mac

MSDN Blogs - Sun, 08/21/2016 - 10:21

This week Microsoft announced that PowerShell is now open source and also works on Linux and Mac.

This is possible because PowerShell runs on .NET Core, which is Microsoft’s new open-source framework for developing cross-platform applications. For more details, see the article: How to create a .NET Core application in just 5 minutes.

The PowerShell open source project is available at https://github.com/PowerShell/PowerShell, and downloadable alpha versions are available for the following operating systems:

After installation, on Ubuntu for example, just type powershell at the prompt to run PowerShell. In the following image, I ran the Get-Process cmdlet to list the processes running on the Ubuntu machine:

So far there are 345 commands available cross-platform: 210 cmdlets and 135 functions.

 

For more details, see the articles:

PowerShell on Linux and Open Source!

PowerShell official page.

Sunday Surprise – Sharing my Forum’s participation experience

MSDN Blogs - Sun, 08/21/2016 - 10:05

Good day to all my readers!

In my first post on the Forum Ninjas blog, I want to take a walk down memory lane and recall how I became a forum contributor and a moderator on many forums…

It all started in the beginning of 2007, when I found the MSDN forums and started answering various questions right away. I have always been very active in different online communities, even though I am a very private person and have no public-speaking experience. But online you can be active to your heart’s content.

I started by answering questions about MS Visual FoxPro, as this is the programming language I was most familiar with. Eventually I also switched to answering in the SQL Server forums, in particular the Transact-SQL forum, as this is the area where I can provide the most help, and T-SQL challenges have always interested me.

I had been answering for a while (maybe two or more years) before I became a moderator of the Transact-SQL forum and other SQL Server related forums.

Today I am moderating about 20 different forums and answering questions when time permits.

I am interested in other people’s experiences with the forums – please leave me a comment about your own forum participation.

All the best, and see you soon in a new post. I plan to highlight interesting forum discussions as well as share my views on forum moderation…

 

Experiencing Power BI reports accessing issues – 08/21 – Investigating

MSDN Blogs - Sun, 08/21/2016 - 05:26
Initial Update: Sunday, 21 August 2016 12:20 UTC

We are aware of issues within Application Insights and are actively investigating. Customers will not be able to create new Power BI dashboards, nor access existing Power BI dashboards.
  • Workaround: None
  • Next Update: Before 08/21 16:30 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Varun

test post when working

MSDN Blogs - Sat, 08/20/2016 - 23:23

Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Donec odio. Quisque volutpat mattis eros. Nullam malesuada erat ut turpis. Suspendisse urna nibh, viverra non, semper suscipit, posuere a, pede.

Donec nec justo eget felis facilisis fermentum. Aliquam porttitor mauris sit amet orci. Aenean dignissim pellentesque felis.

test post

MSDN Blogs - Sat, 08/20/2016 - 19:41

test post mostly disabled plugins!

test draft

MSDN Blogs - Sat, 08/20/2016 - 19:28

test draft update 2

Bruno test

MSDN Blogs - Sat, 08/20/2016 - 13:09

/* GitHub stylesheet for MarkdownPad (http://markdownpad.com) */
/* Author: Nicolas Hery – http://nicolashery.com */
/* Version: b13fe65ca28d2e568c6ed5d7f06581183df8f2ff */
/* Source: https://github.com/nicolahery/markdownpad-github */

/* RESET
=============================================================================*/

html, body, div, span, applet, object, iframe, h1, h2, h3, h4, h5, h6, p, blockquote, pre, a, abbr, acronym, address, big, cite, code, del, dfn, em, img, ins, kbd, q, s, samp, small, strike, strong, sub, sup, tt, var, b, u, i, center, dl, dt, dd, ol, ul, li, fieldset, form, label, legend, table, caption, tbody, tfoot, thead, tr, th, td, article, aside, canvas, details, embed, figure, figcaption, footer, header, hgroup, menu, nav, output, ruby, section, summary, time, mark, audio, video {
margin: 0;
padding: 0;
border: 0;
}

/* BODY
=============================================================================*/

body {
font-family: Helvetica, arial, freesans, clean, sans-serif;
font-size: 14px;
line-height: 1.6;
color: #333;
background-color: #fff;
padding: 20px;
max-width: 960px;
margin: 0 auto;
}

body>*:first-child {
margin-top: 0 !important;
}

body>*:last-child {
margin-bottom: 0 !important;
}

/* BLOCKS
=============================================================================*/

p, blockquote, ul, ol, dl, table, pre {
margin: 15px 0;
}

/* HEADERS
=============================================================================*/

h1, h2, h3, h4, h5, h6 {
margin: 20px 0 10px;
padding: 0;
font-weight: bold;
-webkit-font-smoothing: antialiased;
}

h1 tt, h1 code, h2 tt, h2 code, h3 tt, h3 code, h4 tt, h4 code, h5 tt, h5 code, h6 tt, h6 code {
font-size: inherit;
}

h1 {
font-size: 28px;
color: #000;
}

h2 {
font-size: 24px;
border-bottom: 1px solid #ccc;
color: #000;
}

h3 {
font-size: 18px;
}

h4 {
font-size: 16px;
}

h5 {
font-size: 14px;
}

h6 {
color: #777;
font-size: 14px;
}

body>h2:first-child, body>h1:first-child, body>h1:first-child+h2, body>h3:first-child, body>h4:first-child, body>h5:first-child, body>h6:first-child {
margin-top: 0;
padding-top: 0;
}

a:first-child h1, a:first-child h2, a:first-child h3, a:first-child h4, a:first-child h5, a:first-child h6 {
margin-top: 0;
padding-top: 0;
}

h1+p, h2+p, h3+p, h4+p, h5+p, h6+p {
margin-top: 10px;
}

/* LINKS
=============================================================================*/

a {
color: #4183C4;
text-decoration: none;
}

a:hover {
text-decoration: underline;
}

/* LISTS
=============================================================================*/

ul, ol {
padding-left: 30px;
}

ul li > :first-child,
ol li > :first-child,
ul li ul:first-of-type,
ol li ol:first-of-type,
ul li ol:first-of-type,
ol li ul:first-of-type {
margin-top: 0px;
}

ul ul, ul ol, ol ol, ol ul {
margin-bottom: 0;
}

dl {
padding: 0;
}

dl dt {
font-size: 14px;
font-weight: bold;
font-style: italic;
padding: 0;
margin: 15px 0 5px;
}

dl dt:first-child {
padding: 0;
}

dl dt>:first-child {
margin-top: 0px;
}

dl dt>:last-child {
margin-bottom: 0px;
}

dl dd {
margin: 0 0 15px;
padding: 0 15px;
}

dl dd>:first-child {
margin-top: 0px;
}

dl dd>:last-child {
margin-bottom: 0px;
}

/* CODE
=============================================================================*/

pre, code, tt {
font-size: 12px;
font-family: Consolas, "Liberation Mono", Courier, monospace;
}

code, tt {
margin: 0 0px;
padding: 0px 0px;
white-space: nowrap;
border: 1px solid #eaeaea;
background-color: #f8f8f8;
border-radius: 3px;
}

pre>code {
margin: 0;
padding: 0;
white-space: pre;
border: none;
background: transparent;
}

pre {
background-color: #f8f8f8;
border: 1px solid #ccc;
font-size: 13px;
line-height: 19px;
overflow: auto;
padding: 6px 10px;
border-radius: 3px;
}

pre code, pre tt {
background-color: transparent;
border: none;
}

kbd {
-moz-border-bottom-colors: none;
-moz-border-left-colors: none;
-moz-border-right-colors: none;
-moz-border-top-colors: none;
background-color: #DDDDDD;
background-image: linear-gradient(#F1F1F1, #DDDDDD);
background-repeat: repeat-x;
border-color: #DDDDDD #CCCCCC #CCCCCC #DDDDDD;
border-image: none;
border-radius: 2px 2px 2px 2px;
border-style: solid;
border-width: 1px;
font-family: "Helvetica Neue", Helvetica, Arial, sans-serif;
line-height: 10px;
padding: 1px 4px;
}

/* QUOTES
=============================================================================*/

blockquote {
border-left: 4px solid #DDD;
padding: 0 15px;
color: #777;
}

blockquote>:first-child {
margin-top: 0px;
}

blockquote>:last-child {
margin-bottom: 0px;
}

/* HORIZONTAL RULES
=============================================================================*/

hr {
clear: both;
margin: 15px 0;
height: 0px;
overflow: hidden;
border: none;
background: transparent;
border-bottom: 4px solid #ddd;
padding: 0;
}

/* TABLES
=============================================================================*/

table th {
font-weight: bold;
}

table th, table td {
border: 1px solid #ccc;
padding: 6px 13px;
}

table tr {
border-top: 1px solid #ccc;
background-color: #fff;
}

table tr:nth-child(2n) {
background-color: #f8f8f8;
}

/* IMAGES
=============================================================================*/

img {
max-width: 100%
}

Test blog

This is a test

An outline:

  • Just testing that I can use github markup
  • Just testing that I can use github markup

test

MSDN Blogs - Sat, 08/20/2016 - 10:27


MPP, DWU and PolyBase in Azure SQL Data Warehouse

MSDN Blogs - Sat, 08/20/2016 - 10:23

Understanding some basic concepts in Azure SQL Data Warehouse helps you get a good grip on the functionality it offers. Some key terms are discussed below.

MPP: Unlike the previous incarnation of the on-premises SQL Server data warehouse, which uses SMP (Symmetric Multi-Processing), Azure SQL Data Warehouse is designed around MPP (Massively Parallel Processing). In essence, this design coordinates the processing of a single task across multiple logical processors, each with its own CPU and memory, which communicate with one another.

These processors can also be referred to as nodes, which can be classified into the following two types.

Control node: creates the parallel query plan and coordinates query execution, data aggregation, and so on.
Compute node: the nodes that do the actual computation.

MPP has two design models:

  1. Shared Nothing
  2. Shared Disk

In the Shared Nothing model, used by Azure SQL Data Warehouse, each node is independent and holds its own data – a subset of the rows of each table in the warehouse. This enables massive scalability. Speaking of scalability, Azure SQL Data Warehouse expresses it in terms of Data Warehouse Units (DWU).
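As a hedged illustration (the table and columns are hypothetical), how rows are spread across the nodes is decided when a table is created, for example by hash-distributing on a commonly joined key:

CREATE TABLE dbo.FactSales
(
    SaleId     INT            NOT NULL,
    CustomerId INT            NOT NULL,
    OrderDate  DATE           NOT NULL,
    Amount     DECIMAL(18, 2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(CustomerId),
    CLUSTERED COLUMNSTORE INDEX
);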

DWU: In essence, a DWU is a function of memory, CPU, and concurrency. The entry-level offering, DW100, provides up to 24 GB of RAM with lower concurrency.
One DWU is roughly 7.5 DTUs (Database Throughput Units, used to express the horsepower of an OLTP Azure SQL Database) in capacity, although the two are not exactly comparable.
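For illustration only (the database name is hypothetical), the DWU allocation of a warehouse can be changed by altering its service objective from the logical server’s master database:

ALTER DATABASE MyDataWarehouse
MODIFY (SERVICE_OBJECTIVE = 'DW200');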

 

Another important concept is the technology called PolyBase.

PolyBase: It provides a scalable, T-SQL-compatible query processing framework for combining data from an RDBMS with data in Hadoop (or Azure Blob Storage). It abstracts away much of the MapReduce/Hadoop machinery for SQL developers who are comfortable in the realm of T-SQL.

As its name implies, it enables querying from, or storing data at, multiple (poly) places (base). It is the engine that actually parallelizes the query. While it does a lot of optimization internally, developers can use DMVs (Dynamic Management Views) to monitor queries and inspect query plans.
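A minimal hedged sketch of the PolyBase building blocks in T-SQL (every name, the storage location, and the credential below are placeholders):

-- External data source pointing at an Azure Blob Storage container
CREATE EXTERNAL DATA SOURCE AzureBlobStore
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://mycontainer@mystorageaccount.blob.core.windows.net',
    CREDENTIAL = BlobStorageCredential
);

-- File format describing the delimited files in that container
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- External table that ordinary T-SQL queries can read
CREATE EXTERNAL TABLE dbo.ExternalSales
(
    SaleId INT,
    Amount DECIMAL(18, 2)
)
WITH (
    LOCATION = '/sales/',
    DATA_SOURCE = AzureBlobStore,
    FILE_FORMAT = CsvFormat
);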

Node “Console App” & Debugging TypeScript in VS Code

MSDN Blogs - Sat, 08/20/2016 - 09:23

One of the most common utility cases when working with SharePoint is to create a new console app, import the PnP CSOM components, and then perform the actions required. This is a way to test code, make quick one-off updates, or process many sites for a common operation. As we have developed the Patterns and Practices client library I’ve been looking forward to an opportunity to do something similar directly in node. Once we added node request support in 1.0.3 this became much closer to reality. The next step was setting up a project to run an arbitrary TypeScript program and enable debugging, a process outlined in this post.

Step 1 – Setup the Project

As simple as it sounds, the first step is setting up our TypeScript project. You can always add any libraries you like, but this will get us started for the example. We then install the typings we want, to give us better type checking and IntelliSense. Once these setup steps are done you can reuse the application by just updating the source code.

npm init
npm install gulp gulp-sourcemaps gulp-typescript node-fetch sp-pnp-js typescript typings --save-dev
typings install dt~es6-promise dt~microsoft.ajax dt~promises-a-plus dt~sharepoint dt~whatwg-fetch --global --save

Next create a tsconfig.json file in the root of the project and add the below content and save.

{ "compilerOptions": { "target": "es5", "module": "commonjs", "jsx": "react", "declaration": false, "sourceMap": true, "removeComments": true, "sortOutput": true } }

Finally we need a gulp file to build and run our program. Create a file named gulpfile.js in the root directory and add the below content.

var gulp = require("gulp"), tsc = require("gulp-typescript"), maps = require('gulp-sourcemaps'), exec = require("child_process").exec; gulp.task("build", function () { var src = ["src/**/*.ts", "typings/index.d.ts"]; var project = tsc.createProject("tsconfig.json"); return gulp.src(src) .pipe(maps.init()) .pipe(tsc(project)).js .pipe(maps.write(".", { sourceRoot: "../src" })) .pipe(gulp.dest("build")); }); gulp.task("run", ["build"], function (cb) { exec('node build/main.js', function (err, stdout, stderr) { console.log(stdout); console.log(stderr); cb(err); }); });

We first require the four libraries we need: gulp, the task runner; gulp-typescript, which pipes the TypeScript compiler into gulp; gulp-sourcemaps, which captures and writes the source maps we generate during build; and child_process.exec, which is how we start a new process in node. Then we define two gulp tasks, “build” and “run”. The build task transpiles our TypeScript files into JavaScript and writes out the source maps – these maps will be critical once we begin debugging. Also note that we include the typings index file so our global references are defined during build; missing it is a common source of unexpected build errors about references not being defined. Finally we build all the files in the “src” folder and write the results to the “build” folder – these are just names and you can change them to suit your preference.

The run task is what actually takes our built output and runs it. It does so by invoking exec on the main.js file found in the build folder (we’ll create a main.ts source file shortly). The exec operation starts a new program in node from the file specified in the path supplied as the first argument. Once we have our program ready we will be able to type gulp run and execute our code.

Step 2 – Write some Code!

Now that we have set up our project we can actually do something. Start by creating a “src” folder in your project and adding a main.ts file to that folder. Then add the below contents to main.ts:

console.log("Hello world!");

Now in your console type the command “gulp run” and you should see your project get built and “Hello world!” written out to the console. Awesome! Now we can get down to business. Update your main.ts to include the below:

import pnp from "sp-pnp-js"; pnp.setup({ nodeClientOptions: { clientId: "4d4d3329-1d3d-425c-93fa-e4ae393f8cbb", clientSecret: "F9iUC6B4LM7TClLWY5aixJEmGDGpvGsXD3lifX7ogts=", siteUrl: "https://{your tenant}.sharepoint.com/sites/dev" } }); pnp.sp.web.select("Title").get().then((r) => console.log(JSON.stringify(r)));

You will need to update the clientId, clientSecret, and siteUrl values once you register the add-in permissions in your site. If you have properly registered the permissions, you can once again use the “gulp run” command and you should see the title of your web written to the console. You can then begin using your app to perform any tasks you want in your site. Remember you need to ensure you’ve requested the appropriate permissions on /_layouts/appinv.aspx. One little note: batching is currently broken in node – this has been fixed and will be included in the 1.0.4 release coming up soon.
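For reference, the permission request XML pasted into /_layouts/appinv.aspx typically looks something like the snippet below – the scope and right shown here are only an example, so grant whatever your scenario actually needs:

<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/sitecollection" Right="FullControl" />
</AppPermissionRequests>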

Step 3 – Debugging

Once we can run code it would be fantastic if we could debug it to see what is going on. In Visual Studio Code this means adding a launch.json file. First create a folder named “.vscode” in the root of the project and add a file named “launch.json” with the contents:

{ "version": "0.2.0", "configurations": [ { "name": "Launch", "type": "node", "request": "launch", "program": "${workspaceRoot}/src/main.ts", "stopOnEntry": false, "args": [], "cwd": "${workspaceRoot}", "preLaunchTask": "build", "runtimeExecutable": null, "runtimeArgs": [ "--nolazy" ], "env": { "NODE_ENV": "development" }, "externalConsole": false, "sourceMaps": true, "outDir": "${workspaceRoot}/build" } ] }

This file will instruct Visual Studio Code how to launch when you hit F5. In your main.ts file set a break point and hit F5. You should see the breakpoint hit and you can examine locals and add watches on the debug tab accessed on the left hand navigation bar. If your breakpoint is not hit, try restarting Visual Studio Code.

Gotcha: If you continue to have issues hitting your break point or are getting the message “Breakpoint ignored because generated code not found (source map problem?)” – ensure you have correctly set the “sourceRoot” property of the source maps write call found in the gulpfile.js’s build task. This needs to be a relative path from your built output to your source files.

Now that you can debug your code and use the Patterns and Practices Core Library from a console app you can rapidly develop and test your code – or perform quick queries and updates to your SharePoint sites. Happy coding!

Download the Sample Project

You can download the starter project as well: NodeConsoleApp. You will need to run the following commands to load the dependencies:

npm install
typings install

What is the JS Core Component?

The Patterns and Practices JavaScript Core Library was created to help developers by simplifying common operations within SharePoint. This is aligned with helping folks transitioning into client side development in support of the upcoming SharePoint Framework. Currently it contains a fluent API for working with the full SharePoint REST API as well as utility and helper functions. This takes the guess work out of creating REST requests, letting developers focus on the what and less on the how.

“Sharing is Caring”

Console Output: My New Debugging and Testing Tool for Windows

MSDN Blogs - Sat, 08/20/2016 - 09:01

I have been working with a partner who wanted to get standard console output from a UWP app for debugging and testing purposes. Of course, the easiest way to get debugging output is by attaching a debugger like Visual Studio, but in some cases that isn’t possible or practical. I just published an app, Console Output, that enables other apps to send output text lines to it. There are some new capabilities in the Windows 10 Anniversary Update that make this possible:

  1. App Services that can run in the application’s process instead of in a background task 
  2. Remote Systems API that enables connecting apps across devices and platforms.

On its own, Console Output doesn’t do anything special when you run it, but if you launch it from another app by activating the consoleoutput: protocol handler (for example, consoleoutput:?title=My app name), you can then send text output to it with app services.

When you install and run Console Output, you will see a full code sample in C# showing how to launch Console Output with the Launcher.LaunchUriAsync() API and then open an AppServiceConnection communications channel for bi-directional communications with Console Output.  The sample code is at the bottom of this article.  Console Output is also enabled for remote activation so doing debugging and testing of devices nearby or through the cloud becomes possible as well.  I have also shared the source code for a Console Output Tester app that demonstrates how an app might use Console Output.

Sample C# Code

using System;
using Windows.ApplicationModel.AppService;
using Windows.Foundation.Collections;
using Windows.System;

public class ConsoleLoggingTester : IDisposable
{
    private AppServiceConnection _appServiceConnection;
   
    public ConsoleLoggingTester()
    {
        InitializeAppServiceConnection();
    }

    private async void InitializeAppServiceConnection()
    {
        // this is the unique package family name of the Console Output app
        const string consoleOutputPackageFamilyName = "49752MichaelS.Scherotter.ConsoleOutput_9eg5g21zq32qm";

        var options = new LauncherOptions
        {
            PreferredApplicationDisplayName = "Console Output",
            PreferredApplicationPackageFamilyName = consoleOutputPackageFamilyName,
            TargetApplicationPackageFamilyName = consoleOutputPackageFamilyName,
        };

        // launch the ConsoleOutput app
        var uri = new Uri("consoleoutput:?Title=Console Output Tester&input=true");

        if (!await Launcher.LaunchUriAsync(uri, options))
        {
            return;
        }

        var appServiceConnection = new AppServiceConnection
        {
            AppServiceName = "consoleoutput",
            PackageFamilyName = consoleOutputPackageFamilyName
        };

        var status = await appServiceConnection.OpenAsync();

        if (status == AppServiceConnectionStatus.Success)
        {
            _appServiceConnection = appServiceConnection;
   
            // because we want to get messages back from the console, we
            // launched the app with the input=true parameter
            _appServiceConnection.RequestReceived += OnRequestReceived;
        }
    }

    public async void LogMessage(string messageText)
    {
        if (_appServiceConnection == null)
        {
            return;
        }

        var message = new ValueSet
        {
            ["Message"] = messageText
        };

        await _appServiceConnection.SendMessageAsync(message);
    }

    public void Dispose()
    {
        if (_appServiceConnection != null)
        {
            _appServiceConnection.Dispose();
            _appServiceConnection = null;
        }
    }

    private void OnRequestReceived(
        AppServiceConnection sender,
        AppServiceRequestReceivedEventArgs args)
    {
        var message = args.Request.Message["Message"] as string;

        // handle message input from the Console Output app
    }
}

Links

Please tell me if you start using Console Output and if you find it useful!

NEXT UP — Get a leg up on your future with a Microsoft Office 365 Certification

MSDN Blogs - Fri, 08/19/2016 - 21:22

Earning any kind of specialist certification is a great way to stand out from the crowd, whether you’re looking for a new challenge, a new job, or a way to make yourself more valuable to your current employer. With the growing importance of the cloud, Microsoft Office 365 is a must-have certification for anyone looking to prove their skills.

You’re probably familiar with all the great things Microsoft Office 365 brings to a business – a software suite that meets a business’s needs anytime, on any device. You’ve worked with those solutions and have built up technical skills and knowledge.

Now it’s time to validate that experience.

Here’s what it takes to become a Certified Microsoft Office 365 Specialist: schedule, prepare for and pass one of these two exams:

Exam 70-346: Managing Office 365 Identities and Requirements — is for you if you take part in evaluating, planning, deploying, and operating the Office 365 services, including its dependencies, requirements, and supporting technologies.

To get certified by passing this exam, you’ll need to already be familiar with the features and capabilities of Office 365, and have strong skills in managing cloud identities with Microsoft PowerShell (see the sketch after the list below).

This exam tests your knowledge of how to:

  • Provision Office 365
  • Plan and implement networking and security in Office 365
  • Manage Cloud Identities
  • Implement and manage identities by using Azure AD Connect
  • Implement and manage federated identities for single sign on
  • Monitor and troubleshoot Office 365 availability and usage
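As a rough sketch of the PowerShell-based cloud identity work the exam expects (the MSOnline module must be installed, and the user principal name below is hypothetical):

# Connect to the Office 365 tenant
Connect-MsolService

# List licensed users
Get-MsolUser | Where-Object { $_.IsLicensed } | Select-Object UserPrincipalName, DisplayName

# Set a usage location before assigning a licence
Set-MsolUser -UserPrincipalName "alice@contoso.onmicrosoft.com" -UsageLocation "AU"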

Exam 70-347: Enabling Office 365 Services — is for people who take part in evaluating, planning, deploying, and operating the Office 365 services, including its dependencies, requirements, and supporting technologies.

To get certified by passing this exam, you should have experience with the Office 365 Admin Center and an understanding of Microsoft Exchange Online, Skype for Business Online, SharePoint Online, Office 365 ProPlus, and Microsoft Azure Active Directory.

In order to pass, you’ll need to know how to:

  • Manage clients and end-user devices
  • Provision SharePoint Online site collections
  • Configure Exchange Online and Skype for Business Online for end users
  • Plan for Exchange Online and Skype for Business Online

Next Up is our five-week, three-stage exam preparation camp created by Microsoft Certified Trainers to give you the edge you need and fast-track your way to becoming a Microsoft Office 365 Certified Specialist.

STAGE 1: Five weeks of guided self-study with the support of skilled Microsoft Certified Trainers via Yammer.

STAGE 2: A half-day in-person exam preparation session designed to get you in shape to sit the exam; you will have access to skilled experts to help you answer those tough questions.

STAGE 3: Following the in-person preparation, it is time to put all your hard work to the test – SIT THE EXAM and become a Microsoft Office 365 Certified Specialist.

The program kicks off in the final week of September. Register today and get ready for your five weeks of virtual study prior to your in-person class.

Exam 70-346: Managing Office 365 Identities and Requirements
  • Sydney (Microsoft Office) – Tuesday, November 1: in-person exam prep 1:00pm–5:00pm; sit the exam 5:00pm–7:00pm
  • Melbourne (Cliftons) – Monday, November 21: in-person exam prep 1:00pm–5:00pm; sit the exam 5:00pm–7:00pm
  • Brisbane (Cliftons) – Thursday, November 24: in-person exam prep 1:00pm–5:00pm; sit the exam 5:00pm–7:00pm
  • Adelaide (Cliftons) – Monday, November 28: in-person exam prep 1:00pm–5:00pm; sit the exam 5:00pm–7:00pm
  • Perth (Cliftons) – Thursday, December 1: in-person exam prep 1:00pm–5:00pm; sit the exam 5:00pm–7:00pm

Exam 70-347: Enabling Office 365 Services
  • Sydney (Microsoft Office) – Wednesday, November 2: in-person exam prep 8:30am–12:30pm; sit the exam 1:00pm–3:00pm
  • Melbourne (Cliftons) – Tuesday, November 22: in-person exam prep 8:30am–12:30pm; sit the exam 1:00pm–3:00pm
  • Brisbane (Cliftons) – Friday, November 25: in-person exam prep 8:30am–12:30pm; sit the exam 1:00pm–3:00pm
  • Adelaide (Cliftons) – Tuesday, November 29: in-person exam prep 8:30am–12:30pm; sit the exam 1:00pm–3:00pm
  • Perth (Cliftons) – Friday, December 2: in-person exam prep 8:30am–12:30pm; sit the exam 1:00pm–3:00pm

How to install PowerShell on Linux (Ubuntu 16.04)

MSDN Blogs - Fri, 08/19/2016 - 20:05

The other day, PowerShell was open-sourced and became available on Linux and OS X as well. This is not just about it becoming usable there; behind it is the clear "Microsoft loves Linux" vision and message (see the link for details). Together with the announcement of SQL Server on Linux (the product is still in private beta, so I can't comment on its contents as of today, but you can set your expectations from the feature enhancements in SQL Server 2016), the multi-platform, open-source .NET Core, and Bash coming to Windows 10 as WSL, Microsoft has been building out its platform for open-source software development. Extending this to Microsoft Operations Management Suite (OMS) is also on the horizon. Personally, I expect that as PowerShell Desired State Configuration (DSC) comes to cover even more ground, operating and managing infrastructure across platforms will become even more automated and efficient, further accelerating the build-and-tear-down infrastructure lifecycle typified by DevOps.

Now, since I had an Ubuntu 16.04 (64-bit) environment at hand, I tried installing the newly open-sourced PowerShell myself; here are the steps.

 

Installing PowerShell on Linux

Environment:

Welcome to Ubuntu 16.04.1 LTS (GNU/Linux 4.4.0-34-generic x86_64)

 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage

  Get cloud support with Ubuntu Advantage Cloud Guest:
    http://www.ubuntu.com/business/services/cloud

14 packages can be updated.
0 updates are security updates.

Last login: Sat Aug 20 01:43:00 2016 from 157.65.54.111

Create a working folder and move into it

miyamam@SQLinux:~$ pwd
/home/miyamam
miyamam@SQLinux:~$ mkdir temp
miyamam@SQLinux:~$ cd temp/

Download the installation binary with the wget command.

miyamam@SQLinux:~/temp$ wget https://github.com/PowerShell/PowerShell/releases/download/v6.0.0-alpha.9/powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb
--2016-08-20 01:44:41--  https://github.com/PowerShell/PowerShell/releases/download/v6.0.0-alpha.9/powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb
Resolving github.com (github.com)... 192.30.253.112
Connecting to github.com (github.com)|192.30.253.112|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://github-cloud.s3.amazonaws.com/releases/49609581/3ab34990-63bf-11e6-84b3-9c3f34c3318d.deb?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAISTNZFOVBIJMK3TQ%2F20160820%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20160820T014442Z&X-Amz-Expires=300&X-Amz-Signature=9eee30f12bd1767fe0200f43acac93214a70e7aaa7835e97abc4eb393487be6d&X-Amz-SignedHeaders=host&actor_id=0&response-content-disposition=attachment%3B%20filename%3Dpowershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb&response-content-type=application%2Foctet-stream [following]
--2016-08-20 01:44:42--  https://github-cloud.s3.amazonaws.com/releases/49609581/3ab34990-63bf-11e6-84b3-9c3f34c3318d.deb?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAISTNZFOVBIJMK3TQ%2F20160820%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20160820T014442Z&X-Amz-Expires=300&X-Amz-Signature=9eee30f12bd1767fe0200f43acac93214a70e7aaa7835e97abc4eb393487be6d&X-Amz-SignedHeaders=host&actor_id=0&response-content-disposition=attachment%3B%20filename%3Dpowershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb&response-content-type=application%2Foctet-stream
Resolving github-cloud.s3.amazonaws.com (github-cloud.s3.amazonaws.com)... 54.231.72.219
Connecting to github-cloud.s3.amazonaws.com (github-cloud.s3.amazonaws.com)|54.231.72.219|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 40928824 (39M) [application/octet-stream]
Saving to: ‘powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb’

powershell_6.0.0-alpha.9-1ubu 100%[=================================================>]  39.03M  12.1MB/s    in 4.3s

2016-08-20 01:44:48 (9.03 MB/s) - ‘powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb’ saved [40928824/40928824]

# The download link may change as new releases come out, so check GitHub and pick the package that matches your environment.

 

Installation

miyamam@SQLinux:~/temp$ sudo apt-get install libunwind8 libicu55
Reading package lists... Done
Building dependency tree
Reading state information... Done
libicu55 is already the newest version (55.1-7).
libunwind8 is already the newest version (1.1-4.1).
0 upgraded, 0 newly installed, 0 to remove and 18 not upgraded.
miyamam@SQLinux:~/temp$ sudo dpkg -i powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb
Selecting previously unselected package powershell.
(Reading database ... 99535 files and directories currently installed.)
Preparing to unpack powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb ...
Unpacking powershell (6.0.0-alpha.9-1) ...
Setting up powershell (6.0.0-alpha.9-1) ...
Processing triggers for man-db (2.7.5-1) ...

 

Verify that it starts, and Hello PowerShell

miyamam@SQLinux:~/temp$ powershell
PowerShell
Copyright (C) 2016 Microsoft Corporation. All rights reserved.

PS /home/miyamam/temp> $PSVersionTable

Name                           Value
----                           -----
PSVersion                      6.0.0-alpha
PSEdition                      Core
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
BuildVersion                   3.0.0.0
GitCommitId                    v6.0.0-alpha.9
CLRVersion
WSManStackVersion              3.0
PSRemotingProtocolVersion      2.3
SerializationVersion           1.1.0.1

 

Let's use the Write-Host cmdlet, a classic PowerShell sample. (The -ForegroundColor option sets the output to the specified color.)

 

PS /home/miyamam/temp> Write-Host "Hello PowerShell!!" -ForegroundColor Cyan
Hello PowerShell!!
PS /home/miyamam/temp> Write-Host "Hello PowerShell!!" -ForegroundColor Magenta
Hello PowerShell!!

 

As a check of the interactive shell, let's try ifconfig, which fails in a Windows PowerShell environment…

It works just fine.

To quit, use the exit command.
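A minimal sketch of that interaction (output trimmed; the prompt path follows the session above):

PS /home/miyamam/temp> ifconfig
PS /home/miyamam/temp> exit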

 

Reference links

PowerShell is open sourced and is available on Linux

https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/

 

Package installation instructions

https://github.com/PowerShell/PowerShell/blob/master/docs/installation/linux.md

 

 

The information here (including attachments and linked content) is current as of the date of writing and is subject to change without notice.
