Feed aggregator

Hybrid Windows Apps with the Web App Template – Part 2

MSDN Blogs - Wed, 10/29/2014 - 03:12

This blog post is part of my series on the Web App Template. Here are the links to the other posts:

Hybrid Windows Apps with the Web App Template – Part 1

 

The Web App Template (WAT) is an open-source project from Microsoft that enables the rapid creation of hybrid apps for Windows 8.1, Windows Phone 8.1, and Windows Phone 8. The Visual Studio 2013 template is available for download at http://wat.codeplex.com.

There are already several apps in the Windows Store based on the Web App Template – for example, River Island Clothing and Zoopla Property Search on Windows 8.1, and Zoopla Property Search on Windows Phone.

A website must meet the following criteria to serve as the basis for a WAT app:

  • Works well in Internet Explorer 11
  • Does not use Flash or Silverlight

If a website meets these criteria, it should technically work in WAT. However, the following guidelines are recommended to deliver a great user experience:

  • A mobile or responsive website that works equally well at a width of 300px and at 1920px
  • Designed for touch input
  • Well-organized markup that uses ID attributes and CSS classes
  • An RSS feed for the live tile (blogs or social networks such as Twitter or Pinterest work as well)

To build a basic WAT app, you only need to know a little about Windows app features and be able to edit a JSON file. Basic CSS knowledge also helps.

In most cases you need no JavaScript or C#; at most, you may have to adjust JSON and CSS. The WAT template ships with the most important functionality prebuilt, and you only adapt config.json to configure the app.

Under the hood, WAT apps are full native apps that wrap a wealth of prebuilt functionality around your web content. That means you can extend a WAT app like any other native app. Most developers start with a standard config-driven app and then add native features step by step. The most common native extensions are:

  • Dynamic navigation
  • Custom live tiles
  • Shopping cart
  • Use of hardware such as webcams

Here are the two most important resources for getting started with WAT apps:

  • http://wat.codeplex.com: download the source code and installer, join the community around WAT, submit suggestions, or get help.
  • http://wat-docs.azurewebsites.net: more detailed information about WAT and how to use it, including documentation of the config.json file, a thorough description of the features, and pointers to helpful tools for creating images and tiles for the app.

 

Hybrid Windows Apps with the Web App Template – Part 1

MSDN Blogs - Wed, 10/29/2014 - 03:11

When you start a project, you have to decide what kind of app you want to write. One option is to build a native app, developed specifically for a particular platform (iOS, Android, Windows) and taking advantage of that platform's individual strengths. Another option is to build a hybrid app that uses existing web content and adds important native functionality. A hybrid app is a mix of native and web code: web content is rendered inside a native shell, combining the best of both worlds.

Both native and hybrid apps have their strengths and weaknesses, and there is no right or wrong way to build an app. You just have to decide what is right for your project or your customers.

A growing number of companies are investing in their websites because they want to optimize for the dramatic rise of mobile devices. These investments embrace key design principles such as responsive or adaptive design, which let websites adapt their presentation optimally to the screen. Many companies see the web as the lowest common denominator that can be reached from almost any device. In many ways, the web is the ultimate cross-platform app.

The web in a browser has its own weaknesses – for example, the limited ability to interface directly with the operating system or hardware, the limited ability to work outside the browser, and missing offline support. This is where hybrid apps can help: they get the best out of the web platform and improve the experience through hybrid-app capabilities. Another important advantage of hybrid apps is cost. Developing and updating a hybrid app can be significantly cheaper than building a full native app, because most of the functionality already exists on the website.

The Web App Template (WAT) is an open-source project from Microsoft that enables the rapid creation of hybrid apps for Windows 8.1, Windows Phone 8.1, and Windows Phone 8. The Visual Studio 2013 template is available for download at http://wat.codeplex.com.

The template creates a native app (HTML/JS for Windows 8.1 and Windows Phone 8.1, C#/XAML for Windows Phone 8) with a set of prebuilt features, for example:

  • Hosting the website on the app's main canvas
  • Live tiles fed by an RSS feed
  • CSS and JavaScript injection to change how the website looks inside the app
  • A native nav bar and app bar for navigation
  • Integration with native search, sharing, and settings via the charms
  • Redirect rules that control which URLs stay inside the app and which open in the browser
  • Pinning secondary tiles

A JSON-based configuration file (config.json) drives the entire app, controlling which features are enabled and how they behave. A complete list of the JSON options is available on the WAT documentation site: http://wat-docs.azurewebsites.net/Json
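As a rough sketch of what such a configuration might look like (the property names below are illustrative assumptions, not the actual WAT schema – consult http://wat-docs.azurewebsites.net/Json for the real option names):

```json
{
  "homeURL": "http://www.example.com",
  "displayName": "My WAT App",
  "liveTile": {
    "enabled": true,
    "rssFeedURL": "http://www.example.com/feed"
  },
  "redirects": {
    "enabled": true,
    "rules": [
      { "pattern": "http://www.example.com/*", "action": "stayInApp" },
      { "pattern": "*", "action": "openInBrowser" }
    ]
  },
  "styles": {
    "cssFile": "injected-styles.css"
  }
}
```

The general shape – a start URL, feature sections that can be switched on or off, and per-feature settings – is what makes it possible to build a complete app without writing JavaScript or C#.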

 

This blog post is part of my series on the Web App Template. Here are the links to the other posts:

How to transform an automated business process into an intelligent business process using Dynamics and Azure ML

MSDN Blogs - Wed, 10/29/2014 - 02:33

Dynamics automates business processes – cheaply and simply, as you know. But what if business processes could be made intelligent instead of merely automated? Automation brings consistency and improves productivity, among other things, and you know that too; that is why you want it. But intelligence could do all of that at a different scale, in real time, and remain flexible as the business changes – perhaps at a fraction of the cost. By intelligence I simply mean that instead of a system making rule-based decisions, such decisions are made using big data technologies in the cloud in real time. This opens up the possibility of using various data sources, including social data, and of tapping the computing power of the cloud.

The variety, velocity, and volume of data are increasing at a tremendous pace. Technologies to work with this data are being offered in troves, and data is the new oil. We have all been hearing about such possibilities for some time now. But are companies really using and leveraging these technologies, and are they really useful? The jury is still out.

However, I would like to share with you our first success story in the Dynamics space.

This post describes a real example of a live customer that benefits from this approach – and will perhaps inspire you to think of scenarios where you can use it in your own business.

CASE STUDY

Let's call our customer Contoso. Contoso is the UK's leading foodservice delivery and collection provider, supplying a full range of foodstuffs across the UK. It is a leading supplier to restaurants, bars, pubs, cafés, and schools, delivering five thousand high-quality, affordable food products to over 45,000 different establishments.

CUSTOMER BACKGROUND

Food service in the UK is a highly competitive industry with relatively low barriers to entry. Most players source and distribute low-priced, low-margin food products and kitchen supplies from a similar set of manufacturers and sell them to similar types of customers in their own regions. A low-to-medium price-range foodservice business cannot be profitable by differentiating on product quality or by cost cutting alone.

Contoso therefore deploys advanced technology to provide highly differentiated customer service, and has consistently grown faster than the market for the past ten years. Because customers are fickle and face no real switching costs, Contoso strives to delight them – and thereby keep existing customers and grow the customer base – by offering new services and reinventing existing ones with laser-sharp attention to detail.

Contoso uses Dynamics AX 2012 as the central source of truth for all its other applications. Contoso has two sales channels; the online portal and the call center contribute to sales equally. The call center is the traditional channel through which all business was once routed, but Contoso introduced the online portal to stay ahead of the curve, reduce call-center costs, and let customers place an order anytime, anywhere. Call-center agents use Dynamics AX to record call history and create sales orders. They also make guided selling possible, which improves the customer service experience and can help with up-sells.

The online portal gives customers the flexibility to place their orders whenever they like. The product catalogue is pulled from Dynamics AX and shown on the portal; customers complete the order online, which creates sales orders in Dynamics AX for the warehousing and delivery processes.

Contoso's quest to improve customer experience led them to big data technologies. In the beginning they had only one key requirement: predict what a customer is likely to buy today, given all their past purchases in the current and past months. This prediction allows them to present only those products on the main page. You may ask what the benefit of this is – does it justify the cost of building a predictive model? The key benefit is that ordering time is reduced: the user does not have to scroll on the page, go to other pages, or search through the portal. The same applies in the call-center scenario. In a B2B setting where chefs or restaurant managers have to order the food ingredients and kitchen supplies they need almost daily, from many different sources, shaving off a few minutes and making their experience with Contoso stand out among the competition is a very significant differentiator. What impressed us was the attention to customer experience and how central it is to Contoso's investment decisions.

And if the predictive model could not only be built with simple and inexpensive tools but also be changed, tested, trained, deployed, and consumed without any up-front investment – what more could Contoso ask for?

Awareness that Microsoft already offers such technologies is currently low in the Dynamics world – and, more importantly, so is the awareness that the real business value of these technologies lies in lighting up the business processes that live inside Dynamics.

From that first key requirement, this project with Contoso has now grown to a stage where we provide a number of different services.

  1. We present the next purchase
  2. We offer recommendations
  3. We analyse customer churn
  4. We classify customers' value to the business

We build predictive models in Azure ML, leverage Azure ML apps available in the Azure Marketplace, and call Azure ML services in real time from within Dynamics.
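To make the integration concrete, here is a minimal Python sketch of how a client (for example, code called from the Dynamics side) might invoke a published Azure ML Studio web service in real time. The endpoint URL and API key are placeholders, and the request body below follows the classic Azure ML Studio request-response format as an assumption – check your own service's API help page for the exact shape:

```python
import json

# Placeholders - a real deployment supplies the service's scoring URL and key.
SERVICE_URL = "https://services.azureml.example/execute?api-version=2.0"
API_KEY = "<your-api-key>"


def build_request(column_names, rows):
    """Build the JSON body for a request-response scoring call.

    The "Inputs"/"GlobalParameters" envelope mirrors the classic Azure ML
    Studio web service format (an assumption; verify against your service).
    """
    return {
        "Inputs": {
            "input1": {
                "ColumnNames": column_names,
                "Values": rows,  # one inner list per row to score
            }
        },
        "GlobalParameters": {},
    }


def score(column_names, rows):
    """Send the scoring request (network call; needs a real URL and key)."""
    import urllib.request

    body = json.dumps(build_request(column_names, rows)).encode("utf-8")
    req = urllib.request.Request(
        SERVICE_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Build (but do not send) a payload for one hypothetical customer/day row.
    payload = build_request(["customer_id", "weekday"], [["C0042", "Tue"]])
    print(json.dumps(payload, indent=2))
```

The important design point is that the call is synchronous and fast enough to sit inside an interactive flow such as a portal page load or a call-center screen.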

Purchasing history and clickstream data are collected and ingested into these models automatically.

There are various predictive models; some are trained weekly, others are trained on the fly.

There is still some work remaining to make the data flow seamless and automatic and we are working on improving this.

Let's see each of the Azure ML services created in a bit more detail.

CUSTOMER CLASSIFICATION

Customers are classified into different levels by a mining model built in Azure ML Studio. This classification helps call-center agents in decision making. Going forward, we will base promotion offers on this dynamic customer classification.

CUSTOMER'S NEXT PURCHASE

Based on purchase history, the customer's next purchase is predicted, and the products presented on the portal and in the call center are ordered by the likelihood of being bought today. This prediction is based on a model built in Azure ML Studio. It is expected to delight the customer by presenting what she needs on a given day, and it also increases order-taking efficiency.

RECOMMENDATIONS AS A SERVICE

The customer has been live for a month, and we have consistently seen that 20% of all the recommendations provided are clicked by users – a very high number given that this is a B2B space where customers are busy people whose main focus is to finish the order quickly. Almost 5% of the items in the final shopping cart come from the recommendations provided. If this number continues to hold, it would lead to a sales uplift of about 5%, which is also very high; the customer's long-term expectation is a 1–2% lift, which for Contoso translates to £1–2m. Contoso consumes about half a million prediction events per month.

Three types of recommendations are offered in real time:

Market basket analysis produces a "frequently bought together" recommendation when the customer chooses to put an item into the cart. When the user clicks on a product, a real-time call is made to Azure ML, and the service returns the items that are frequently bought together with it. The Azure ML models do the math over billions of possible combinations to show the most relevant items. The main benefit is helping the customer place the order quickly while making sure they are not forgetting something most people buy together with that item.
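The co-occurrence idea behind "frequently bought together" can be illustrated with a toy example. The production system uses a trained Azure ML model over billions of combinations; this sketch only shows the underlying counting:

```python
from collections import Counter
from itertools import combinations


def pair_counts(baskets):
    """Count how often each unordered pair of items appears in the same basket."""
    counts = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            counts[(a, b)] += 1
    return counts


def bought_together(item, baskets, top_n=3):
    """Return the items most often bought in the same basket as `item`."""
    scores = Counter()
    for (a, b), n in pair_counts(baskets).items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(top_n)]


# Hypothetical order history for illustration.
baskets = [
    ["flour", "eggs", "butter"],
    ["flour", "eggs", "milk"],
    ["flour", "butter"],
    ["eggs", "milk"],
]
print(bought_together("flour", baskets))  # butter and eggs co-occur most with flour
```

A real deployment would additionally weight by support and confidence and prune rare pairs, but the recommendation surfaced to the user is the same in spirit: "people who bought this also bought …".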

Item-to-item recommendations are provided to the customer on the item landing page in the portal. When the user clicks an item to go to its landing page, a real-time call is again made to Azure ML, and the service returns other recommended items – predictions based on what other users buy when they buy this item. The Azure ML service not only recommends relevant items but makes sure there is some novelty and diversity in its recommendations, nudging and subtly urging the customer to click. This helps expose the darker parts of the catalogue that the user may not be aware of. It can lead not only to up-sells but also brings back the guided selling that Contoso partly lost when it moved half of its sales from the call center to the portal.

User-to-item recommendations are provided to the customer just before she checks out, both in the call center and on the portal. Just before check-out, another real-time Azure ML call is made; this time the user is recommended items based on the total basket of items she already has. Here the Azure ML service does what is referred to as "training on the fly" – real-time personalization, which is unique to Azure ML.

CUSTOMER CHURN ANALYSIS

Contoso analyses monthly data to predict which customers are likely to churn in the current month. The sales department uses this information to call at-risk customers and take the necessary action.

DISCUSSION

As you can see, the sales order business process has been surrounded by a number of machine learning services at critical decision points: What product is this customer likely missing in the cart? What product is this customer likely to buy today? What product is this customer likely to want if it is revealed to them? Should I offer this promotion to all my customers, to the ones likely to churn, or to the rock-solid ones? A customer wants a delivery slot that is reserved – should I offer that slot, given that they have been a diamond customer for three straight months? This type of decision-management capability is what transforms the sales order process from a merely automated process into an intelligent one. With the simplicity and affordability of Azure ML Studio and the services in the Azure Marketplace, Contoso can choose to stay engaged and continuously develop newer models to answer deeper questions about their business. For instance, if a particular product is recommended several times but rarely clicked, is it time to discontinue it and save all related costs? If certain recommended products consistently end up in the cart, is there some latent demand, and should more such products be included in the catalogue? Or work in a different area entirely – for instance, run text analytics on the case history logs stored in Dynamics AX to identify customers who have complained often, delivery drivers against whom most complaints have been registered, or products against which most pre-orders have been placed.

What has been achieved at Contoso with Azure ML and Dynamics working together could not have been achieved with Dynamics alone. For instance, the clickstream data analysed to monitor consumer behaviour would not fit in a SQL database, and the recommendations are predictions that need heavy computing capability, which would be too expensive to deploy on-premises. The business changes quickly – new products, new markets, new customer requirements, new competitor offerings all require Contoso to respond quickly. The flexibility and ease of modifying a model and redeploying a service with Azure ML Studio makes this possible in a few hours.

This is a case study of a successful Dynamics customer. We are closely monitoring Contoso while working simultaneously on a number of other projects. If you have a scenario you would like to discuss, please reach out to aksheyg@microsoft.com.

As I said initially, the jury is still out. Success depends on identifying the right scenarios, having the requisite skill sets, and so on – but the opportunity here is clearly enormous: surround your business processes with Azure ML services, perhaps at each decision point, and transform your automated business processes into intelligent ones. Manage your decisions consciously, not by accident.

You are welcome to join us at EMEA Convergence in Barcelona to see this in action – in the keynote on Tuesday, 4 November 2014, or in the afternoon session on Wednesday, 5 November 2014.

Contributors: Royi Ronen, Akshey Gupta

 

A straightforward Office 365 Training Guide made by teachers for teachers!

MSDN Blogs - Wed, 10/29/2014 - 02:00

One of the best sources of advice and tuition on a product is the end user, who can speak from a position of experience and impartiality. That's why we're always pleased to pass on the findings and opinions of our customers, as more often than not the use cases they talk about will be applicable to a much larger subset of our existing and future users.

We recently came across a fantastic resource from the United States, put together by the Grant Wood Area Education Agency. This particular guide was designed to help when it comes to schools using Office 365.

In their guide you’ll find concise overviews, along with everyday usage tips for the following applications found within Office 365 for Education:

· OneDrive

· Word

· PowerPoint

· Excel

· Excel Surveys

· OneNote

We hope this resource from Grant Wood AEA is of use to you, and please feel free to share it with others who might find it helpful.

Office Pro Plus Benefit for Students

It is also worth remembering that any institution worldwide that licenses Office for staff and faculty can provide access to Office 365 ProPlus for students at no additional cost. As a result, more than 35,000 institutions worldwide are automatically eligible to deliver the package to their students.

Office 365 ProPlus includes all the familiar and full Office applications, such as Word, Excel and PowerPoint, and offers the ability for these to be locally installed on up to five devices and available offline.

Furthermore, when a school combines the Office Pro Plus Benefit with our other cloud services – Exchange Online, SharePoint Online and Lync Online, all of which are available complimentary through Office 365 Education – students have access to the same set of productivity tools and services used by Fortune 500 companies all over the world.


Still worried about running out of cloud storage? Office 365 subscribers no longer have that problem!

MSDN Blogs - Wed, 10/29/2014 - 01:56

How much can 1 TB actually hold?

Gordon Bell of Microsoft's San Francisco lab once answered this question as follows – how could you use up 1 TB in a single year?

1) Assuming a color photo is a 300 KB JPEG, 1 TB holds roughly 3.6 million photos – the equivalent of saving about 9,800 photos every day for a year.

2) Assuming a text document is 1 MB, 1 TB holds roughly 1 million documents – about 2,900 documents every day for a year.

3) Assuming music is encoded at 256 Kbps, 1 TB holds roughly 9,300 hours of music – about 26 hours of music every day for a year.

4) Assuming video is encoded at 1.5 Mbps, 1 TB holds roughly 1,600 hours of video – about 4 hours of video every day for a year.
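These figures can be sanity-checked with a few lines of arithmetic (taking 1 TB as 2^40 bytes, which is the convention that best matches the numbers above):

```python
TB = 2**40  # 1 TB taken as 2**40 bytes, matching the figures above

photos = TB // (300 * 1024)                 # 300 KB JPEG photos
documents = TB // (1024 * 1024)             # 1 MB text documents
music_hours = TB / (256 * 1024 / 8) / 3600  # 256 Kbps audio -> bytes/s -> hours
video_hours = TB / (1_500_000 / 8) / 3600   # 1.5 Mbps video -> hours

print(round(photos / 1e6, 1), "million photos")  # roughly 3.6 million
print(round(music_hours), "hours of music")      # roughly 9,300 hours
print(round(video_hours), "hours of video")      # roughly 1,600 hours
```

Dividing each total by 365 reproduces the per-day figures (about 9,800 photos, 26 hours of music, and 4 hours of video per day).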

 

Impressive, isn't it?
Earlier this year, Office 365 already announced a 5,000% capacity increase, raising every subscriber's cloud storage to 1 TB. But this week brings even more exciting news: Microsoft has decided to give every Office 365 subscriber a free upgrade to unlimited OneDrive storage!

 

Starting today and rolling out over the coming months, this applies first to Office 365 Home, Personal, and Education. If you want to be among the first to get unlimited storage, sign up here!

For Office 365 business customers, unlimited storage will be listed on the Office 365 Roadmap in the coming days, with the first wave of updates beginning in 2015 (see the Office 365 release plan for details). In the meantime, you can get 1 TB of storage today (see OneDrive for Business for details).

Unlimited cloud storage is an important milestone for OneDrive. We believe the true value of cloud storage is realized when it is integrated with the tools people use to communicate, create, and collaborate, both personally and professionally. Providing unlimited cloud storage is therefore only a small part of our commitment: whether at work or in daily life, we want to deliver an ever better experience.

We are delighted to keep working to make OneDrive a leader in cloud storage and a key part of the best productivity service Office 365 can offer its customers. Look forward to more exciting announcements in the coming months!

Developers – want to learn more about hot technical news around Office 365?
See "Building value-added applications on Office 365 as a development platform".
In addition to the Visual Studio SDK (for Windows), MS Open Tech has also built SDKs for iOS and Android, making it easy for developers to build Office 365 extensions on those platforms.

Collecting diagnostic information from Azure management portal when an error occurs

MSDN Blogs - Wed, 10/29/2014 - 01:29

 

You often see errors on the Azure management portal while performing an operation, and most of the time you have to open a support incident to resolve the error or learn its root cause.

 

Here is a neat trick you can use to collect diagnostic information from the management portal page and provide it to the support professional, or attach it to the support incident. This will speed up troubleshooting on the support side.

 

When you see an error on the management portal, always make sure you click the DETAILS button in the error section, so that you see the full details rather than just "An operation failed ….". Copy the entire message and provide it to the support professional, or add it to the problem description of the support incident.

 

Press the Ctrl+Alt+A key combination on the management portal when you see the error; this brings up a dialog with the diagnostic information. Copy this information or take a screenshot, and provide it to the support professional or attach it to the support incident.

 

 

Hope this helps.

Event tip: Webinar "An Introduction to Microsoft Azure"

MSDN Blogs - Wed, 10/29/2014 - 01:29

For everyone who wants to get a first overview of Microsoft Azure, I have an event tip: tomorrow I will give an introduction to Microsoft Azure as part of a webinar. I will cover the absolute basics – it will be more of an overview than a deep dive into bits and bytes. The webinar is therefore aimed at everyone who is just getting started with Microsoft Azure or considering a trial.

The webinar takes place on Thursday, 30 October 2014, from 14:00 to 15:00.

Topics will include:

  • The history of Microsoft Azure
  • The infrastructure services of Microsoft Azure
  • The platform services of Microsoft Azure
  • Short demos showing how to find your way around Microsoft Azure
Further information

[SSRS] Building a Report Server in SharePoint Integrated Mode (Single Server) – SQL Server 2014 with SharePoint 2013

MSDN Blogs - Wed, 10/29/2014 - 01:14

 

SQL Server Developer Support

藤丸陽子

 

 

This post walks through the setup on a single machine running both SharePoint 2013 and SQL Server 2014.

The steps are essentially the same as in the previously published post, "[SSRS] Building a Report Server in SharePoint Integrated Mode (Single Server) – SQL Server 2012 with SharePoint 2013".

 

■■■ Configuration ■■■

A single-server configuration with SharePoint 2013 and SQL Server 2014 on the same machine.

■■■ Procedure Overview ■■■

The procedure is as follows:

1. Preparation (install .NET Framework 3.5 SP1)

2. Install SQL Server 2014

3. Install SharePoint 2013

4. Register and start the Reporting Services SharePoint service

5. Create a Reporting Services service application

6. Create a site collection

■■■ Detailed Steps ■■■

1. Preparation (install .NET Framework 3.5 SP1)

 

 

2. Install SQL Server 2014

1) Double-click Setup.exe in the root folder of the SQL Server 2014 media.

Click [Installation] – [New SQL Server stand-alone installation or add features to an existing installation].

2) Enter the product key.

3) Review the license terms and proceed.

4) Check the Microsoft Update settings and proceed.

5) When the install rules have finished, proceed.

6) On the [Setup Role] page, select "SQL Server Feature Installation" and proceed.

7) On the [Feature Selection] page, select the features as shown and proceed. The components circled in green are required for an integrated SharePoint 2013 + SQL Server 2014 Reporting Services environment on a single machine.

8) Confirm that the [Feature Rules] checks succeed and proceed.

9) On the [Instance Configuration] page, specify the instance name. To use the default instance, keep the defaults and proceed.

10) On [Server Configuration], set the service accounts and proceed (keep the defaults if there is nothing to change).

11) On the [Database Engine Configuration] page, add the SQL Server administrators.

To enable SQL authentication, select "Mixed Mode" and specify the password for the SQL Server system administrator (sa).

12) (Optional – if you are installing Analysis Services) On the [Analysis Services Configuration] page, choose the server mode and add the Analysis Services administrators.

13) On the [Reporting Services Configuration] page, confirm that "Reporting Services SharePoint Integrated Mode" – "Install only" is selected, then proceed.

14) On the [Ready to Install] page, review the features to be installed and click the [Install] button.

15) Confirm that setup completed, then restart the OS.

Reference

=====

Install SQL Server 2014 from the Installation Wizard (Setup)

<http://msdn.microsoft.com/ja-jp/library/ms143219.aspx>

3. Install SharePoint 2013

1) Install the software prerequisites.

Run the SharePoint 2013 media and select [Install] – [Install software prerequisites].

2) The Microsoft SharePoint 2013 Products Preparation Tool starts; proceed.

3) Review the license terms and proceed.

4) After the preparation tool has finished, click [Finish] and restart the OS.

5) After the restart, install SharePoint Server.

6) Enter the product key.

7) Review the license terms and proceed.

8) Confirm that [Server Type] is "Complete" and click [Install Now].

9) Wait for the installation to finish.

10) After SharePoint 2013 is installed, run the SharePoint 2013 Products Configuration Wizard.

11) The [SharePoint Products Configuration Wizard] starts; proceed.

12) A pop-up about starting or resetting services appears; click [Yes].

13) On the [Connect to a server farm] page, select "Create a new server farm" and proceed.

14) Configure the SharePoint server farm configuration database.

Specify the server running the SQL Server database engine and the Windows account used to connect to the configuration database.

(Since this is a single-server setup, specify the same server as SharePoint.)

15) On the [Specify Farm Security Settings] page, specify a passphrase of your choice.

This passphrase is required when adding servers to the farm.

16) On [Configure SharePoint Central Administration Web Application], specify the Central Administration port number and authentication provider.

Keep the defaults here and proceed.

17) Review the settings on the completion page of the SharePoint Products Configuration Wizard and proceed.

18) Confirm that the configuration wizard completed and click [Finish].

19) The initial farm configuration wizard starts.

20) Click [Start the Wizard].

21) Configure the service account and services.

22) Afterwards, confirm that the SharePoint Central Administration page is displayed.

4. Register and Start the Reporting Services SharePoint Service

1) Run the SharePoint 2013 Management Shell with [Run as administrator].

2) Run each of the following commands:

Install-SPRSService

Install-SPRSServiceProxy

get-spserviceinstance -all |where {$_.TypeName -like "SQL Server Reporting*"} | Start-SPServiceInstance

Reference

======

Install Reporting Services SharePoint Mode for SharePoint 2013

<http://msdn.microsoft.com/ja-jp/library/jj219068.aspx>

--> Step 2: Register and start the Reporting Services SharePoint service

5. Create a Reporting Services Service Application

1) In SharePoint Central Administration, click [Manage service applications] under [Application Management].

2) On the SharePoint ribbon, click the [New] button.

* If the [New] button is grayed out and cannot be selected, start the browser with [Run as administrator] and open SharePoint Central Administration again.

3) In the [New] menu, select [SQL Server Reporting Services Service Application].

4) On the [Create SQL Server Reporting Services Service Application] page, enter a name for the application.

Example: RSServiceApp1

5) Confirm that the Reporting Services service application is created successfully and click [OK].

6) Confirm that the Reporting Services service application now exists.

7) Select the Reporting Services service application you created (e.g. RSServiceApp1), click Properties, and set up the web application associations (check each web application that will use the Reporting Services service application).

6. Create a site collection

1) Select [Application Management] - [Create site collections].

2) Configure each item on the site collection creation page.

· [Title and Description]

· Enter the title and description of the site collection.

· [Web Site Address]

· Select a path to use in the URL (for example, a wildcard path such as /sites/, or the root directory (/)).

· If you select a wildcard path, you must also enter the site name to use in the site's URL.

· [Template Selection]

· In the list, select a template for the top-level site of the site collection, or click the [Custom] tab to create an empty site and apply a template later.

· A description of the selected template is displayed below the template list.

· [Experience Version Selection]

· In the list box, select the SharePoint experience version of the template to use.

· If you want the site collection to look and behave like a SharePoint Server 2010 site collection, select the 2010 experience version. A site collection that uses the 2010 experience version runs on SharePoint 2013, but its user interface and user experience are those of SharePoint Server 2010. For details on upgrading a site collection, see "Upgrade a site collection to SharePoint 2013".

· [Primary Site Collection Administrator]

· Enter the user name of the site collection administrator in DOMAIN\username format.

· [Secondary Site Collection Administrator]

· Enter the user name of the secondary site collection administrator.

· We recommend designating a secondary site collection administrator so that someone can manage the site collection when the primary administrator is unavailable.

3) After creating the site collection, upload a report to it and confirm that the report is displayed.

* If the report cannot be viewed, confirm that the web application containing the site collection is associated with the Reporting Services service application.

That completes the procedure.

[SSAS] How to Handle SQL Server Analysis Services Metadata Manager Errors

MSDN Blogs - Wed, 10/29/2014 - 01:09

SQL Server Developer Support Team

Yoko Fujimaru

This article describes the symptoms and remedies for the situation where a metadata manager error is logged while using SQL Server Analysis Services and operations against Analysis Services fail.

 

Symptoms

An error occurs when browsing, processing, or backing up/restoring an Analysis Services database, or when starting the Analysis Services service, and

the attempted operation fails.

The error may occur at service startup; or, after the service starts successfully, operations may fail only for specific databases or for all databases. The failing operation is not limited to one; several, or all, operations may fail.

 

Error messages    When a metadata manager error occurs, errors such as the following are raised.

----- excerpt -----

File system error: The following error occurred while opening the file '\\?\C:\Program Files\Microsoft SQL Server\MSAS<Version>.MSSQLSERVER\OLAP\Data\SSASDB1.0.db\<file name>

---------------

----- excerpt -----

Code: 0xC114001D

---

Errors in the metadata manager. An error occurred while loading <SSAS object name> from the file '\\?\C:\Program Files\Microsoft SQL Server\MSAS<Version>.MSSQLSERVER\OLAP\Data\SSASDB1.0.db\<file name> '.

--------------

----- excerpt -----

Message: An error occurred while loading the model. (Source: \\?\C:\xxx\MSAS<Version>.MSSQLSERVER\OLAP\Log\msmdsrv.log, Type: 3, Category: 289, Event ID: 0xC1210013)

--------------

 

        

Cause

This problem can occur when some or all of the files that make up an Analysis Services database become corrupted or inconsistent.

It can occur on a machine running Analysis Services under conditions such as the following:

・ File corruption or file write errors caused by a failure of the disk holding the Analysis Services database files

・ The Analysis Services service was restarted while a database or cube was being processed

・ The Analysis Services service was restarted during a database backup

・ Analysis Services was restarted while antivirus software was scanning some of the Analysis Services database files

・ The Analysis Services service was terminated improperly, for example by forcibly powering off the machine

Note

Unfortunately, it is not possible to determine from the logs which of these causes applies.

Remedy

When a metadata manager error occurs, the remedy is to delete the database containing the file in which the problem was detected and to redeploy it. Redeployment requires an Analysis Services database backup taken before the metadata manager error occurred. If no backup file from a healthy state exists, redeploy the Analysis Services database project and reprocess the database.

Important: Take regular backups of your Analysis Services databases so that you can deal with unexpected metadata manager errors.

After a metadata manager error has occurred, taking a backup of the Analysis Services database may itself fail, so it is important to take backups regularly while no errors are occurring.

 

The procedure for backing up an Analysis Services database is described in the following article.

Backup Database Dialog Box (Analysis Services - Multidimensional Data)

<http://msdn.microsoft.com/ja-jp/library/ms186830(v=SQL.105).aspx>

----- <excerpt> -----

Backup Database Dialog Box (Analysis Services - Multidimensional Data)

...

In Object Explorer in SQL Server Management Studio, right-click the [Databases] folder of an Analysis Services instance, or a database, and click [Back Up].

----- </excerpt> -----


 

Preliminary step

Copy everything under the Analysis Services data folder to another location as a safeguard.

Tips

By default, the Analysis Services data folder is located at the following path.

Example: \Program Files\Microsoft SQL Server\MSAS<Version>.MSSQLSERVER\OLAP\Data

To confirm which path is actually the Analysis Services data folder, connect to the SQL Server Analysis Services instance from SQL Server Management Studio,

right-click the instance and select [Properties], and check the current value of the

"DataDir" property on the [General] page of the "Analysis Server Properties" dialog.

 

Remedy (restore from backup)

1) Stop the SQL Server Analysis Services service. ([Administrative Tools] - [Services] - [SQL Server Analysis Services])

2) From the Analysis Services data folder, delete the .db folder of the database recorded in the error message.

Example: If the Analysis Services data folder is "C:\Program Files\Microsoft SQL Server\MSAS<Version>.MSSQLSERVER\OLAP\Data" and the database for which the metadata manager error is logged is SSASDB1:

Delete the SSASDB1.<number>.db folder under "C:\Program Files\Microsoft SQL Server\MSAS<Version>.MSSQLSERVER\OLAP\Data".

3) Start the SQL Server Analysis Services service. ([Administrative Tools] - [Services] - [SQL Server Analysis Services])

4) From SQL Server Management Studio, restore the Analysis Services database backup file (.abf)

(e.g. SSASDB1.abf) that was taken before the metadata manager error occurred.

If the error is not resolved by the steps above, check whether the following resolves it:

  • Delete the other databases one at a time and check whether the error is resolved. (Repeat steps 1) to 4) above.)
  • Change the location of the Analysis Services data folder and restore the previously taken backups one by one.

Note: when no backup file exists

If no database backup file from a healthy state exists, redeploy the Analysis Services database project and reprocess it.

Perform steps 1) to 3) above, deploy the Analysis Services database project to the Analysis Services instance, and run a full process of the database.

If the error is not resolved, check the following, as in the remedy above:

  • Delete the other databases one at a time and check whether the error is resolved.
  • Change the location of the Analysis Services data folder and restore the previously taken backups one by one.

Operational practices to minimize these errors

When operating Analysis Services, confirm that the following practices are in place:

  • Do not stop or restart the Analysis Services service or the OS while an Analysis Services database is being processed or while the database design is being changed online.
  • Configure your antivirus software to exclude Analysis Services-related files. *1
  • Run SQL Server with the latest service pack. *2

*1 If file exclusions are not configured, set them up with reference to the following article:

How to choose antivirus software to run on computers that are running SQL Server

<http://support.microsoft.com/kb/309422/ja>

*2 We also recommend running SQL Server with the latest service pack.

Latest SQL Server module information (summary page)

http://blogs.msdn.com/b/jpsql/archive/2010/08/01/sql-server.aspx

Introducing Azure Stream Analytics

MSDN Blogs - Wed, 10/29/2014 - 00:07

We are excited to announce Azure Stream Analytics, a new fully managed stream-processing service in the cloud. Stream Analytics provides low-latency, real-time processing of millions of events per second in a highly resilient service. This morning at TechEd Europe 2014, Microsoft officially announced the preview of Azure Stream Analytics.

Stream Analytics enables you to easily combine streams of data with historic records or reference data to derive business insights quickly and easily, by providing a range of operators from simple filters to complex correlations. Defining time based windowed operations such as windowed aggregates, or correlating multiple streams to detect patterns such as sequences, or even comparing current conditions to historical values and models, can be done in a matter of minutes using the simple set of SQL-like Stream Analytics Query Language operators. Specifying a streaming pipeline is as simple as configuring its inputs and outputs, and providing a SQL-like query describing the desired transformations. While this suffices for most simple cases, higher-scale and more complex scenarios will also benefit from Stream Analytics configurability and the ability to determine how much compute power to dedicate to each step of the pipeline to achieve the desired peak throughput.

The Stream Analytics service is optimized to give users a very low cost to get started with, and to maintain, real-time analytics solutions. The service is built so that you pay as you go based on usage. Usage is derived from the volume of events processed and the amount of compute power provisioned within the cluster to handle the respective streaming jobs.

To get started, check out the Stream Analytics service page and visit the Microsoft Azure Portal.

- The Azure Stream Analytics Team

Lync 2013 Client CU Released

MSDN Blogs - Tue, 10/28/2014 - 23:43

Good evening, this is Yoshino from the Lync support team.
The latest CU for Lync 2013 has been released.

http://support.microsoft.com/kb/2889929/ja

The list of fixes is here:
http://support.microsoft.com/kb/2889929/ja#issue

There are quite a few this time; a few highlights:
・Fix for a crash during application sharing
・Fix for a memory leak during high-resolution video calls
・Fix for an issue with the conversation history display

Some fairly important fixes are included.

Note that these fixes apply not only to [Lync 2013] but also to [Lync 2013 Basic] and the [Lync 2013 VDI plug-in].
Also, all previous fixes are included, so there is no need to apply earlier CUs.

Enjoy your Lync experience!

 

 

 

How to Transfer Files Between a Windows Store App and a Windows Phone App

MSDN Blogs - Tue, 10/28/2014 - 22:42

First things first: how it works

 

If you have apps that run on both Windows Phone and the Windows 8 Store, you may want some mechanism to synchronize data between the versions. However, since the two apps cannot directly access each other's data, there has to be some way to transfer files between them. Several approaches are possible; the most popular one uses a web server (placed somewhere as an intermediary). This approach has drawbacks, though: if the user's internet access is restricted by environmental factors, your app may not be able to interact with the user properly. Internet access is usually unrestricted, but the reality is not always that optimistic. In addition, this approach can incur maintenance costs that you have to pay out of your app's revenue.

 

Users typically have a network connection of their own, such as a LAN or an ad-hoc network, often without even realizing it. For example, if the user has a router or access point with several devices (wired or wireless) connected to it, those devices on the local network form a small "internet" of their own. Whether or not wide-area internet access is available, the internet-related network protocols work normally on that network.

 

So we can skip the intermediate server and let devices on the same network communicate directly over TCP.

 

TCP is the cornerstone of the internet. Almost every program that accesses the internet uses TCP to fetch data from a remote server (in most cases the request is routed through several servers, which we won't discuss here). TCP involves a server (the listener) and a client; these two parties exchange data through the stages of a "request". The request must be initiated by the client, which opens a connection to the server's IP address on a specified port; a server cannot simply connect out to a client. TCP's main benefit is that any packet sent by one side is guaranteed to be received by the other. If a packet is lost in transit, the sender retransmits it until the receiver acknowledges that the transfer is complete.

 

TCP essentially behaves like a data stream, which is why TCP sockets are also called "stream sockets". Sockets are the low-level objects that devices use to connect to one another. If the data queued for transmission is too large to fit in a single packet, the LAN driver splits it into multiple packets and sends them one by one until all are delivered. For the curious developer: this whole process is abstracted away.

 

TCP sockets can be opened over many kinds of connections: RFCOMM Bluetooth, devices sharing the same access point (this still works when one device is connected wirelessly and another over an Ethernet port), and even when the phone is connected to a tablet or a PC over USB. As long as you know the port and IP address, you can use TCP to connect to any computer on the planet (and perhaps beyond it).

 

In this article, the PC plays the server role and the phone acts as the client. "PC" here means any device that can run Windows Runtime apps, including ARM and x86 tablets as well as computers running Windows 8. Before Windows Phone 8 was released, the phone could only act as a client; now it can act as a server, too. Interestingly, the line between server and client is not clear-cut: either side can be the server or the client, but having one device play both roles at once can lead to instability.

 

Using the MSDN documentation, let's get familiar with the classes we are going to use: StreamSocket, StreamSocketListener.

The documentation is very thorough and covers every step needed to get things done.

So let's skip the docs and get to the point. We need to start a listener: create a Windows 8 app and build a reasonable-looking UI.

 

It's a good idea to create a new class to store everything in, for later use.

The file we want to send contains some random data about the application's state. Suppose we want to continue, in the Windows Phone app, the work we started in the Windows app; then we need to store the data required to rebuild that state in the file we are going to send.

We will keep this file in the Roaming folder:

 

string fileName = "MyFileName.txt";

 

You can create the file with the following code. (You may wonder why we don't await this async call: we don't need to, since the call runs on a background thread. Unless you want to use the file in the very next method, there is no need to await it.)

 

ApplicationData.Current.RoamingFolder.CreateFileAsync(fileName, CreationCollisionOption.ReplaceExisting);

 

You can get the file back with the following code:

var myfile = await ApplicationData.Current.RoamingFolder.GetFileAsync(fileName);

 

You can write something into the file using the methods of the FileIO class. Make sure you actually write something, so that the file size is greater than 0. If you want to observe how the stream socket behaves when a file does not fit into a single packet, put the file-writing code in a while or for loop to artificially grow the file.

Now that the file is ready, it's time to set up the listener. Since we cannot make async calls in the class constructor, we have to create a task and invoke it after constructing the class:

Here are the working variables:

public StreamSocketListener ServerListener = new StreamSocketListener();

StreamSocket socket;

public async Task InitStuff()
{
    var f = ServerListener.Control;

    f.QualityOfService = SocketQualityOfService.LowLatency;

    ServerListener.ConnectionReceived += ServerListener_ConnectionReceived;

    await ServerListener.BindEndpointAsync(null, "13001");
}

 

The variable "f" is used to set the QualityOfService property of the stream socket listener. We set it to LowLatency to make the transfer as fast as possible. We also have to make sure packets are processed as quickly as possible to avoid timeouts (which is in fact part of the work we do below).

After wiring up the event handler for received connections, we can start listening:

await ServerListener.BindEndpointAsync(null, "13001");

The line above does exactly that. It defines the host name and port used to listen for connections. Since our phone has no host name, we simply pass null to accept connections from any other host.

 

Here is the event handler for received connections:

async void ServerListener_ConnectionReceived(StreamSocketListener sender, StreamSocketListenerConnectionReceivedEventArgs args)

The args parameter contains the socket used in the request. We assign it to the stream socket we declared earlier:

socket = args.Socket;

A stream socket exposes two streams, an input stream and an output stream. The input stream carries the data sent in by the remote socket;

the output stream carries the data sent out to the remote socket. Since this is a fairly simple request/serve exchange, we don't care much about what the phone sends, and files are always transferred the same way, so here we only look at the output stream:

var outputstream = socket.OutputStream;
DataWriter writer = new DataWriter(outputstream);

                      

We will use the writer object to write data to the output stream.

Now we open the file, read its contents, and write the bytes to the output stream with the data writer:

var myfile = await ApplicationData.Current.RoamingFolder.GetFileAsync(fileName);
var streamdata = await myfile.OpenStreamForReadAsync();

 

The first stage of the transfer sends the file's length as bytes:

if (stage == 0)
{
    stage++;

    writer.WriteBytes(UTF8Encoding.UTF8.GetBytes(streamdata.Length.ToString()));
    await writer.StoreAsync();
}

The second stage looks like the code below. A variable keeps track of the stage index. If needed, you can also read the data the client sends you; take a look at the Windows 8.1 StreamSocket sample for a detailed example of how to read and write data.

if (stage == 1)
{
    byte[] bb = new byte[streamdata.Length];

    streamdata.Read(bb, 0, (int)streamdata.Length);

    writer.WriteBytes(bb);

    await writer.StoreAsync();

    stage = 0;
}

                            

Note the call to StoreAsync(). That call flushes the data writer's buffer and pushes all the data into the output stream. That is almost all we have to do; the system takes care of the rest. Another thing to note is how we send the data: first the length of the file, then the contents of the file itself. Keep that in mind; we will rely on it later.
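The framing convention above (first the length, then the bytes) is the whole protocol. As a language-agnostic sketch of that idea, here is a minimal Python illustration; the function names and the fixed 16-byte decimal length field are my own invention for this sketch, not part of the WinRT API. The sender writes the payload length first, and the receiver keeps reading until that many bytes have arrived, which is exactly what the client code later does by comparing the bytes received so far against the announced file length.

```python
import socket

def send_framed(sock, payload):
    # Stage 0: send the length as a decimal string in a fixed 16-byte field.
    sock.sendall(b"%16d" % len(payload))
    # Stage 1: send the payload itself.
    sock.sendall(payload)

def recv_exactly(sock, n):
    # TCP is a stream: one send may arrive as several packets, so keep
    # reading until all n bytes are here (the PositionInStream check).
    chunks, got = [], 0
    while got < n:
        chunk = sock.recv(n - got)
        if not chunk:
            raise ConnectionError("socket closed mid-transfer")
        chunks.append(chunk)
        got += len(chunk)
    return b"".join(chunks)

def recv_framed(sock):
    length = int(recv_exactly(sock, 16))   # int() tolerates the left padding
    return recv_exactly(sock, length)

if __name__ == "__main__":
    a, b = socket.socketpair()   # stand-in for a connected server/client pair
    data = b"x" * 20000          # larger than one ~1500-byte packet, but
    send_framed(a, data)         # small enough to fit the socket buffer
    print(recv_framed(b) == data)   # True
```

The fixed-width length field is just one way to frame the size; the C# code above sends it as a plain UTF-8 string in its own packet, which works as long as the client parses the first receive as the length.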

 

This code exposes one big problem: none of it is type-safe. We send raw bytes to the client, and the client sends raw bytes back to us. For a single file transfer between devices there is no real need to worry about type safety, but note that if you want to send complex data, such as message-based communication, you have to account for the fact that all you get are bytes, and build your own parsing layer.

Now let's look at the client code.

This time we will use the Silverlight runtime; this code works on both Windows Phone 7 and Windows Phone 8.

The socket classes live in the System.Net.Sockets namespace.

 

Create a new class and add the following fields:

MemoryStream ArrayOfDataTransfered;
string _serverName = string.Empty;
private int _port = 13001;
long FileLength = 0;
int PositionInStream = 0;
int Stage = 0;

 

The user needs to enter the server name and port number. Because the connection is made on a home or private network, the server's actual IP address can vary; using the PC's machine name directly solves this problem. Providing a settings page where the user configures the connection is also a good idea. The following lines should go in an initialization method, such as the class constructor. The flow is: connect > send > receive. If you want to transfer more than one file, you have to restart the whole process.

if (String.IsNullOrWhiteSpace(serverName))

{
throw new ArgumentNullException("serverName");
}

if (portNumber < 0 || portNumber > 65535)

{
throw new ArgumentNullException("portNumber");

}
_serverName = serverName;

_port = portNumber;

public void SendData(string data);

 

This method simply sends data to the remote server. In detail:

First we need a SocketAsyncEventArgs object, which we will use shortly. SocketAsyncEventArgs represents a single socket operation: a send, a receive, or a connect.

SocketAsyncEventArgs socketEventArg = new SocketAsyncEventArgs();

 

Next we need a server endpoint, built from the server name and port number:

DnsEndPoint hostEntry = new DnsEndPoint(_serverName, _port);

The next step is to create the socket, set its various properties, and connect to the remote server.

Socket sock = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);

socketEventArg.Completed += new EventHandler<SocketAsyncEventArgs>(SocketEventArg_Completed);

socketEventArg.RemoteEndPoint = hostEntry;

socketEventArg.UserToken = sock;

 

Then we connect:

sock.ConnectAsync(socketEventArg);

 

The completion event handler for the SocketAsyncEventArgs looks like this:

void SocketEventArg_Completed(object sender, SocketAsyncEventArgs e)
{
switch (e.LastOperation)

{
case SocketAsyncOperation.Connect:

ProcessConnect(e);

break;

case SocketAsyncOperation.Receive:

ProcessReceive(e);

break;

case SocketAsyncOperation.Send:

ProcessSend(e);

break;

default:

throw new Exception("Invalid operation completed");
}
}

When the completed operation is a connect request, we simply send data to the server using the SendData method defined above. The relevant code:

byte[] buffer = Encoding.UTF8.GetBytes(dataIn);

e.SetBuffer(buffer, 0, buffer.Length);

Socket sock = e.UserToken as Socket;

sock.SendAsync(e);

dataIn is an arbitrary string we want to send to the server. This is useful if you want to request multiple files from the server, or if the transfer completes in multiple stages. The SendAsync method sends the raw bytes of the data to the connected remote socket.

 

The ProcessSend method:

if (e.SocketError == SocketError.Success)

{
//Read data sent from the server

Socket sock = e.UserToken as Socket;

sock.ReceiveAsync(e);

}

else

{
ResponseReceivedEventArgs args = new ResponseReceivedEventArgs();

args.response = e.SocketError.ToString();

args.isError = true;

OnResponseReceived(args);

}

The only interesting thing here is the call to ReceiveAsync(e). That method is basically the heart of the file transfer.

We receive the data in two stages. The first stage is the file size; the size message itself is tiny (at most a few dozen bytes, next to nothing compared to the file). The second stage is the file itself.

var dataFromServer = Encoding.UTF8.GetString(e.Buffer, 0, e.BytesTransferred);

Stage++;

FileLength = long.Parse(dataFromServer);

The interesting part is the second stage; it is basically the most important piece of the whole flow.

First, we write the bytes from the buffer:

ArrayOfDataTransfered.Write(e.Buffer, 0, e.BytesTransferred);

 

The PositionInStream field simply counts the bytes we have received so far. If its value is less than the file length, the file has not fully arrived yet; otherwise, the transfer is complete.

if (PositionInStream < FileLength)

{
Socket socks = e.UserToken as Socket;

socks.ReceiveAsync(e);

}

else

{
Stage = 0;
//save the file
}

 

If the file is not fully here yet, it was too big to fit in a single packet. That means more packets are on the way, and we need to call ReceiveAsync(e) again to read the network adapter's buffer.

Now all you have to do is copy the memory stream into an isolated storage file stream, and the transfer is complete. Because we used a single memory stream, you may want to prevent users from transferring files larger than about 100 MB; otherwise an out-of-memory exception may be thrown. If you really need to transfer files larger than 100 MB, write the data directly to the isolated storage file stream instead. If you are sending sensitive data, you should also consider securing the socket connection.

P.S.: You can enable this functionality by declaring the ID_CAP_NETWORKING capability in the Windows Phone app and the Private Networks capability in the Windows Store app.

 

See also:

Another place to find a wealth of Windows Phone articles is the TechNet Wiki; the best entry point is Windows Phone
Resources on the TechNet Wiki
.

 

This article is a translation of: http://social.technet.microsoft.com/wiki/contents/articles/20495.how-to-transfer-files-between-a-windows-store-app-and-any-windows-phone-app.aspx

Collaborating with Office 365: OneDrive

MSDN Blogs - Tue, 10/28/2014 - 22:30

As more and more educational spaces get access to Office 365, we are all looking for guidelines and easy ideas for use in our classrooms. Over the next couple of weeks we will highlight the top tips for getting your Office 365 world humming along with ease.

...(read more)

Operational Insights Search How To: Part IV – Introducing the MEASURE command

MSDN Blogs - Tue, 10/28/2014 - 19:20

This is the fourth installment of a series (I don't know yet how many posts there will be in the end, but I have at least 5 in mind at this point) that walks through the concepts of the Microsoft Azure Operational Insights search syntax. While the full documentation and syntax reference is here, these posts are meant to guide your first steps with practical examples. I'll start very simple and build upon each example, so you can get an understanding of practical use cases for how to use the syntax to extract the insights you need from the data.

In my first post I introduced filtering, querying by keyword or by a field’s exact value match, and some Boolean operators.

In the second post I built upon the concepts of the first one, and introduced some more complex flavors of filters that are possible. Now you should know all you need to extract the data set you need.

In the third post I introduced the use of the pipeline symbol “|” and how to shape your results with search commands.

 

Today I will start talking (it will take more than one post) about our most versatile command so far: Measure.

Measure allows you to apply statistical functions to your data and aggregate results 'grouped by' a given field. Measure supports multiple statistical functions. It might all sound complicated at this point, but as we walk through some of them with examples I'm sure they'll become clearer.

Measure count()

The first statistical function we'll work with (and the simplest to understand) is the count() function.

Given a search query, i.e.

Type=Event

you should already know that the 'filters' (previously called 'facets') at the left of the screen show you a distribution of values by a given field for the results of the search you executed.

For example in the screenshot above I am looking at the ‘Computer’ field – it tells me that, within the almost 3 million ‘Events’ I got back as results, there are 20 unique/distinct values for the ‘Computer’ field in those records. The tile only shows the top 5 (the most common 5 values that are written in the ‘Computer’ fields), sorted by the number of documents that contain that specific value in that field. From the screenshot I can see that – among those almost 3 million events – 880 thousand come from the DMUSCETT-W2012 computer, 602 thousand from the DELLMUSCETT machine, and so forth…

What if I want to see all values, since the tile shows only the top 5?

That's what the measure command lets you do with the count() function. The function takes no parameters; you just specify the field to 'group by', in this case the 'Computer' field:

Type=Event | Measure count() by Computer

But 'Computer' is just a field IN each piece of data: there are no relational databases here, and there is no separate 'Computer' object anywhere. Only the values IN the data can tell us which entity generated them, along with a number of other characteristics and aspects of the data (hence the term 'facet'). You can just as well group by other fields. Since our 'original' results (the almost 3 million events we are piping into the 'measure' command) also have a field called EventID, we can apply the same technique to group by that field and get a count of events by EventID:

Type=Event | Measure count() by EventID

And if you are not interested in the actual ‘count’ of records that contained a specific value, but only want a list of the values themselves, try adding a ‘Select’ command at the end of it and just select the first column:

Type=Event | Measure count() by EventID | Select EventID

and you can even get fancy and pre-sort the results in the query (or you can just click the columns in the grid too)

Type=Event | Measure count() by EventID | Select EventID | Sort EventID asc

You should have gotten the idea with this couple of examples. It should be fairly straightforward. Try doing your own searches featuring the Measure count() now!
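To make the semantics concrete: Measure count() by Computer behaves like a plain group-by-and-count over the result set. Here is a small Python sketch over invented records (the field names mirror the examples above; the data itself is made up, and this is only an analogy for the aggregation, not the service's implementation):

```python
from collections import Counter

# Invented stand-ins for Type=Event search results.
results = [
    {"Type": "Event", "Computer": "DMUSCETT-W2012", "EventID": 1001},
    {"Type": "Event", "Computer": "DMUSCETT-W2012", "EventID": 7036},
    {"Type": "Event", "Computer": "DELLMUSCETT",    "EventID": 7036},
]

def measure_count_by(records, field):
    # 'Measure count() by <field>': group the records by the field's
    # value and count how many records carry each value.
    return Counter(r[field] for r in records)

counts = measure_count_by(results, "Computer")
print(counts["DMUSCETT-W2012"], counts["DELLMUSCETT"])      # 2 1
print(sorted(measure_count_by(results, "EventID")))         # [1001, 7036]
```

The second print mirrors the "| Select EventID" step above: once you have the groups, the keys alone are the list of distinct values.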

There are a couple important things and caveats to notice and/or emphasize:

  1. The ‘Results’ we are getting are NOT the original ‘raw’ results anymore – they are ‘Aggregated’ results – essentially ‘groups’ of results. Nothing to worry about, just need to understand you are interacting with a very different ‘shape’ of data (different than the original ‘raw’ shape) that gets created on the fly as a result of the aggregation/statistical function.
  2. Measure count today (at the time of this writing) only returns the TOP 100 distinct results. This limit does not apply to the other statistical functions we'll talk about later. We have a tracking item on the feedback forum that you might want to vote on if this limit is annoying to you. The workaround you have today is to apply a more granular filter first (looking for more specific things) before using the measure count() command: rather than asking for all computers that have reported events, you are probably interested only in the computers that have reported a SPECIFIC error EventID, and similar scenarios.

In the next installment we’ll look at other statistical functions such as AVG, MIN, MAX, SUM and more!

Till then, happy searching!

Remember that all of the search-related feature requests we have are visible and tracked on the Azure Operational Insights feedback forum in their own category. Come and vote on the ones that matter to you, and suggest your own ideas!

DirectX SDK Tools Catalog

MSDN Blogs - Tue, 10/28/2014 - 16:47

In the same vein as my post on where you can find many of the latest samples from the DirectX SDK, and where you can find all the various replacements for D3DX, this post is a catalog of where you can find the latest version of various tools that shipped with the legacy DirectX SDK. Lacking that, it at least provides a status or alternative for the tool.

DirectX Capabilities Viewer (DxCapsViewer.exe)
DirectX Control Panel (DxCpl.exe)
Game Definition File Editor (GDFMaker.exe)
HLSL Compiler (fxc.exe)
AdpcmEncode.exe

Windows 8.1 SDK / Visual Studio 2013

DirectX Error Lookup Tool (DXErr.exe)

The Visual Studio Error Lookup (under Tools) should help with many HRESULT values, particularly for development on Windows 8.x or later.

For notes on the DxErr library that the DXErr.exe tool used, see this post.

Game Definition File Validator (gdftrace.exe)

MSDN Code Gallery

Texture Conversion Tool (Texconv.exe)
Texconvex.exe

DirectXTex

Meshconvert.exe

DirectXMesh

See also the Samples Content Exporter.

DirectX Texture Editor (DxTex.exe)

The legacy DirectX SDK tool does not support DDS files with the 'DX10' header extension.

VS 2012 and VS 2013 can view all DDS files supported by DirectXTex.

TxView.DLL

Windows 8.1 WIC supports DDS format files for BC1-BC3/DXT1-5 so these show up as thumbnails and can be opened with Photo Viewer.

AudConsole3.exe
Xact3.exe
XactBld3.exe
XaudConsole3.exe

XACT is deprecated and is only available in the legacy DirectX SDK.

The xwbtool in DirectX Tool Kit can build XACT3-style .xwb wave banks for bulk loading of .wav files using DirectX Tool Kit for Audio.

xWMAEncode.exe

xWMA compression is supported by XAudio 2.7 and the Xbox One version of XAudio. Therefore this tool only ships in the legacy DirectX SDK and in the Xbox One XDK.

DXViewer.exe

Only available on the legacy DirectX SDK.

Microsoft XNA Test Case Tool

The latest version of this tool is on Microsoft Downloads.

Certification for Windows logo usage for Win32 desktop applications is managed through the Windows App Certification Kit (WACK).

PIX for Windows

PIX for Windows is not compatible with the DirectX 11.1 or later runtime on Windows 8.x or Windows 7 SP1 with KB2670838. See this post for details.

Visual Studio Graphics Diagnostics in VS 2012 Pro+, VS 2013 Express for Windows, or VS 2013 Pro+.

Graphics vendor tools from Intel (GPA), AMD (GPU PerfStudio), or NVidia (NSight).

Simple WSE Routing Sample for Classic SOAP Service

MSDN Blogs - Tue, 10/28/2014 - 16:22

 

A customer came to me recently asking about options to migrate a set of SOAP services hosted on Windows 2003 into two new modern datacenters running Windows 2012. The most critical requirement for the customer was that the clients remain unaware of the change; the client proxies could not be modified. We decided to insert the WSE routing layer on the SOAP server side. WSE routing is implemented through an IHttpHandler. Done this way, the handler can intercept the request for any of the client services and make routing decisions on behalf of the client.

So what's the rub, then? Don't we have routing samples in the WSE QuickStart? Yes, we do. However, they introduce some complexity which makes the most critical requirement, not modifying the client proxies, nearly impossible. The routing samples assume your clients were built using WSE and introduce a requirement for WS-Addressing to be present in the client request (which can be modified through a SoapFilter). However, we can rebuild the same behavior using only classic web service SOAP standards, which really cleans up the project and demonstrates how simple it is to utilize WSE 3.0 routing for your classic SOAP service needs!

Download WSE_Soap_Router!

Components of the sample:
  • ClassicSoapClient – console application with a web reference to DataCenter.asmx
  • WSE_Routing_Sample – Classic ASMX web project with two web services:
    • DataCenter.asmx – ClassSoapClient was built against this service
    • DataCenter1.asmx –  this is where the call will be routed instead
  • SoapRouter.cs – this is also contained in the WSE_Routing_Sample web project and contains the logic to process the incoming SOAP message and route where desired

Tricks you can play on yourself #789–Linq

MSDN Blogs - Tue, 10/28/2014 - 14:05

I was profiling some code this morning, and came across some interesting behavior.

Basically, we had some low level code that looked something like this:

IEnumerable<Guid> GetSpecialQuestionIds()
{
    return
      GetAllSpecialItems()
        .Select(specialItemXml => SpecialItem.CreateFromXml(specialItemXml))
        .SelectMany(specialItem => specialItem.Questions.Select(question => question.UniqueIdentifier))
        .Distinct();
}

So, it’s taking special item xml, deserializing each special item, and then grabbing all the unique question ids that are referenced by the special items. Perfectly reasonable code.

Elsewhere, we did the following (the original code was spread out over 3 different classes but I’ve simplified it for you):

var specialQuestionIds = GetSpecialQuestionIds();

foreach (Item item in items)
{
    var questions = item.Questions.Where(question => specialQuestionIds.Contains(question.UniqueIdentifier));
}

That also looks fine, but when I looked at the profile, I found that it was heavily dominated by the CreateFromXml() call. Well, actually, I did the profile first, and then looked at the code.

The problem is the call to Contains(). It will walk every entry in specialQuestionIds, which normally would be fine, but because it’s never been realized, it will deserialize all the special items… for every question in every item.

The fix is pretty simple – I changed GetSpecialQuestionIds() to call .ToList(), so that the deserialization only happened once, and the deserialization dropped from 65% down to 0.1% in the profile. And there was much rejoicing.

The lesson is that you should be careful whenever you return an IEnumerable<T> that isn’t super-cheap, because the person who gets it may enumerate it over and over.
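The same trap exists in any language with lazy sequences. Here is the pattern reduced to a Python sketch: a generator stands in for the un-realized IEnumerable, and the names, data, and counter are invented purely for illustration.

```python
deserialize_calls = 0

SPECIAL_ITEM_XML = ["<a/>", "<b/>", "<c/>"]   # stand-in for GetAllSpecialItems()

def special_question_ids():
    # Lazy: the expensive work runs every time the sequence is iterated.
    global deserialize_calls
    for xml in SPECIAL_ITEM_XML:
        deserialize_calls += 1                # stand-in for the costly CreateFromXml
        yield hash(xml)

# Simulate 'Contains' being called once per question: each membership
# test re-creates and re-walks the lazy pipeline.
questions = [hash("<a/>"), hash("<b/>"), hash("<z/>")]
hits = [q for q in questions if q in special_question_ids()]
assert deserialize_calls > len(SPECIAL_ITEM_XML)    # work was repeated

# The fix, as in the post: realize the sequence once (the ToList() call).
deserialize_calls = 0
realized = list(special_question_ids())
hits = [q for q in questions if q in realized]
assert deserialize_calls == len(SPECIAL_ITEM_XML)   # work done exactly once
```

The shape of the fix is identical to the C# one: materialize the expensive pipeline into a concrete collection at the boundary where it will be consumed repeatedly.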

Powershell DSC ResKit Wave 8: Now with 100+ Resources!

MSDN Blogs - Tue, 10/28/2014 - 13:41

The DSC Resource Kit Wave 8 has landed! You can find it right here.

This wave contains an astounding 48 resources, our largest release ever! We just went from 90 resources released via the Resource Kit to 138, a whopping 65% boost! We didn’t just creep across the 100 resources line, we blew right past it!

Special thanks to everyone who has contributed, downloaded, or looked at this blog post. We can’t wait to share even more resources with you in the future!

Questions, Comments?

If you're looking into using PowerShell DSC, but are blocked by a lack of resources, let us know in the comments or the TechNet QA Section.

What's in this Wave?

This wave has added almost too many modules to list, but we’ll try:

  • xExchange
  • xSCDPM
  • xSCOM
  • xSCSMA
  • xSCSR
  • xSCVMM
  • xCredSSP
  • xDismFeature
  • xBitlocker
  • xComputerManagement
  • xPendingReboot

All these resources are experimental. The “x” prefix in the names stands for experimental – which means these resources are provided AS IS and are not supported through any Microsoft support program or service. We will monitor the TechNet pages, take feedback, and may provide fixes moving forward. 

We also want to announce the recent release of DSC content and resources from the Visual Studio ALM Rangers.  They are a community group who provided a great write-up and some new resources here:

Also, don’t forget to check out the community versions of many resources on PowerShell.Org's GitHub

Details

After installing the modules, you can discover all of the resources available by using the Get-DSCResource cmdlet.  Here is a brief description of each resource (for more details on a resource, check out the TechNet pages). 

Module

Resource(s)

Description

cFileShare

cCreateFileShare

cSetSharePermissions

The cFileShare module is a part of the ALM Ranger DevOps solutions (VsarDevOps.codeplex.com), which consists of code as config guidance, quick reference posters and supporting resources. This module contains the cCreateFileShare and  cSetSharePermissions resources. These DSC Resources allow configuration of a node’s file share and share permission rules.

xBitLocker

xBLAutoBitlocker

xBLBitlocker

xBLTpm

This DSC Module allows you to configure Bitlocker on a single disk, configure a TPM chip, or automatically enable Bitlocker on multiple disks

xComputerManagement

xComputer

xDisk

xWaitForDisk

This DSC Resource allows you to rename a computer and add it to a domain or workgroup.

xCredSSP

xCredSSP

This module contains the xCredSSP resource, which enables or disables Credential Security Support Provider (CredSSP) authentication on a  client or on a server computer, and which server or servers the client  credentials can be delegated to.

xDismFeature

xDismFeature

The xDismFeature module enables or disables Windows optional features that specifically need to be handled by DISM.exe.

xExchange (resources: xExchActiveSyncVirtualDirectory, xExchAutodiscoverVirtualDirectory, xExchAutoMountPoint, xExchClientAccessServer, xExchDatabaseAvailabilityGroup, xExchDatabaseAvailabilityGroupMember, xExchDatabaseAvailabilityGroupNetwork, xExchEcpVirtualDirectory, xExchExchangeCertificate, xExchExchangeServer, xExchImapSettings, xExchMailboxDatabase, xExchMailboxDatabaseCopy, xExchMapiVirtualDirectory, xExchOabVirtualDirectory, xExchOutlookAnywhere, xExchOwaVirtualDirectory, xExchPopSettings, xExchPowershellVirtualDirectory, xExchReceiveConnector, xExchUMService, xExchWaitForDAG, xExchWaitForMailboxDatabase, xExchWebServicesVirtualDirectory)

This module allows you to configure many different properties of Exchange 2013 servers, including individual server properties, databases and mount points, and Database Availability Groups.

xPendingReboot (resource: xPendingReboot)

xPendingReboot examines three specific registry locations where a Windows Server might indicate that a reboot is pending, and allows DSC to predictably handle the condition.
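For example, xPendingReboot is typically placed after steps that may leave a reboot pending, with the Local Configuration Manager allowed to reboot the node (a sketch, assuming the xPendingReboot module is installed):

```powershell
# Sketch: let DSC reboot the node if a previous step left a reboot pending
Configuration HandlePendingReboot
{
    Import-DscResource -ModuleName xPendingReboot

    Node "localhost"
    {
        # Checks the registry locations that indicate a pending reboot
        xPendingReboot AfterChanges
        {
            Name = "AfterChanges"   # arbitrary key name for this check
        }

        # Allow the Local Configuration Manager to perform the reboot
        LocalConfigurationManager
        {
            RebootNodeIfNeeded = $true
        }
    }
}
```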

xSCDPM (resources: xSCDPMConsoleSetup, xSCDPMDatabaseServerSetup, xSCDPMServerSetup)

The xSCDPM module contains three resources: xSCDPMServerSetup for installation of the DPM server, xSCDPMDatabaseServerSetup for installation of DPM support files for SQL Server, and xSCDPMConsoleSetup for installation of the DPM console.

xSCOM (resources: xSCOMConsoleSetup, xSCOMManagementServerSetup, xSCOMReportingServerSetup, xSCOMWebConsoleServerSetup)

The xSCOM module contains resources for installation of System Center Operations Manager (OM).

xSCSMA (resources: xSCSMAPowerShellSetup, xSCSMARunbookWorkerServerSetup, xSCSMAWebServiceServerSetup)

The xSCSMA module contains resources for installation of System Center Service Management Automation (SMA).

xSCSR (resource: xSCSRServerSetup)

The xSCSR module contains resources for installation of System Center Service Reporting (SR).

xSCVMM (resources: xSCVMMConsoleSetup, xSCVMMManagementServerSetup)

The xSCVMM module contains resources for installation of System Center Virtual Machine Manager (VMM).

 

Renaming Guidelines

When making changes to these resources, we urge the following practice:

1. Update the following names by replacing "MSFT" with your company/community name and replacing the "x" prefix with "c" (short for "community") or another prefix of your choice:

   a. Module name (ex: xWebAdministration becomes cWebAdministration)

   b. Folder name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

   c. Resource name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

   d. Resource friendly name (ex: xWebsite becomes cWebsite)

   e. MOF class name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

   f. Filename for the <resource>.schema.mof (ex: MSFT_xWebsite.schema.mof becomes Contoso_cWebsite.schema.mof)

2. Update module and metadata information in the module manifest.

3. Update any configurations that use these resources.
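Step 3 in practice: a configuration written against an experimental resource only needs its module and resource names updated. Continuing the xWebAdministration/cWebAdministration example from step 1 (the site settings below are placeholders):

```powershell
# Sketch: the same website configuration after renaming the fork
Configuration SampleSite
{
    # Previously: Import-DscResource -ModuleName xWebAdministration
    Import-DscResource -ModuleName cWebAdministration

    Node "localhost"
    {
        # Previously: xWebsite DefaultSite { ... }
        cWebsite DefaultSite
        {
            Name         = "Default Web Site"
            Ensure       = "Present"
            State        = "Started"
            PhysicalPath = "C:\inetpub\wwwroot"
        }
    }
}
```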

 

We reserve resource and module names without prefixes ("x" or "c") for future use (e.g. "MSFT_WebAdministration" or "Website"). If the next version of Windows Server ships with a "Website" resource, we don't want to break any configurations that use community modifications, so please keep a prefix such as "c" on all community modifications. As specified in the license, you may copy or modify these resources as long as they are used on the Windows platform.

Requirements

Note:

The DSC Resource Kit requires at least Windows 8.1 or Windows Server 2012 R2 with update KB2883200 (aka the GA Update Rollup). You can check whether it is installed by running the following command:

PS C:\WINDOWS\system32> Get-HotFix -Id KB2883200

Source       Description     HotFixID     InstalledBy            InstalledOn
------       -----------     --------     -----------            -----------
MyMachine    Update          KB2883200    MyMachine\Admini...    9/30/2013 12:00:00 AM

 

For most modules, you can use them on supported down-level versions of Windows by installing WMF 4.0. Refer to these previous blog posts for more information on WMF 4.0 and issues with partial installation. A few modules will require the use of WMF 5.0. You can confirm the requirements for each module on the individual blog topics that provide the details for the module.
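A quick way to check which engine version a down-level node is running is the PowerShell version table (WMF 4.0 corresponds to PowerShell 4.0):

```powershell
# The engine version mirrors the installed WMF version
$PSVersionTable.PSVersion

# Warn if the node is below WMF 4.0
if ($PSVersionTable.PSVersion.Major -lt 4) {
    Write-Warning "WMF 4.0 or later is required for most DSC Resource Kit modules."
}
```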

 
