Feed aggregator

How to ensure your toast notifications continue to work when converting your win32 app into a Windows Store app using project Centennial

MSDN Blogs - 1 hour 37 min ago

As you may already know, developers can now bring their existing apps and games to the Windows Store with the Desktop Bridge, also known as project Centennial.

For developers who have been integrating with toast notification features in Windows 8 and Windows 10: when you turn your app into a Windows Store app using project Centennial, you may find that your toast notifications do not work as desired – an incorrect/unresolved string shows up in Action Center where the application title is supposed to appear. This post summarizes why this happens and how to fix it with a simple change to your code.

Why does it break?

In Windows 8/8.1, we enabled support for desktop Win32 apps to send local toast notifications by allowing them to do two things:

  • register with the notification platform, and
  • call WinRT APIs to send toasts.

What did we do to enable each of those?

  • A registration with the notification platform requires an app identity, but Win32 apps don’t really have modern app identities. This problem is solved by allowing Win32 apps to provide us with a “fake” app identity – through creating a desktop shortcut.
  • The app then needs to call the method overload that takes this manually created application ID, to indicate which identity is trying to send the notification, as shown:
ToastNotificationManager.CreateToastNotifier(appId).Show(toast);

Windows Store apps converted through project Centennial, on the other hand, already come with proper app identities just like any regular UWP app, so if the Win32 app is converted without changing any of its previous code, it creates a conflict:

  • The shortcut creation is skipped for the converted Store app during app deployment because the deployment path is different; however,
  • The app still calls the same overload mentioned above, passing in the fake app ID that the OS can no longer resolve (since the shortcut is no longer there), which results in the unresolved string.

How to fix it?

Any app in this situation simply needs to change one line of code to call

ToastNotificationManager.CreateToastNotifier().Show(toast);

without passing in the manually created appId, and just let the app identity be resolved by the OS as any other modern app does.

Why didn’t we fix this for you?

The same method overload can also be used by Windows Store apps to send toast notifications on behalf of other apps in the same package. Once a Win32 app is converted to a Windows Store app through project Centennial, Windows just sees it as a regular Store app, without knowing its true intention behind calling CreateToastNotifier(appId). To avoid getting your real intention wrong and breaking other unexpected scenarios, we ask developers to make this change themselves during the conversion process.

SQL Azure Tip! – Failures importing Bacpac

MSDN Blogs - 1 hour 52 min ago

Most of the time, importing bacpac files from SQL Azure to a local server works without an issue, but occasionally there will be a problem when a particular setting has been used that is not supported in the destination database version. There are many forum and Stack Overflow posts about these sneaky little issues, so here is a tip when working with bacpacs. They’re zips…

Recently I had a requirement to bring a SQL Azure database down to my local SQL Server and ran into a compatibility issue:

Error SQL72014: .Net SqlClient Data Provider: Msg 4631, Level 16, State 1, Line 1 The permission ‘ALTER ANY DATABASE EVENT SESSION’ is not supported in this version of SQL Server.

I was not able to alter the SQL Azure database in order to change or remove any security settings, so I was forced to take a different approach… in other words, I had to hack the bacpac.

  1. Make a copy of the bacpac.
  2. Rename the copy, changing the extension to .zip.
  3. Open the zip and open the model.xml file.
  4. In my situation, I found the element containing the ALTER ANY DATABASE EVENT SESSION command and deleted it.
  5. Rename the zip back to .bacpac.
  6. The bacpac contains a checksum that it uses to validate the package, and this needs to be updated. Fortunately, a copy of the dacchksum.exe utility is available on GitHub. Simply run it against the new bacpac to get the new checksum value.
  7. Rename the bacpac back to .zip.
  8. Inside the zip, open origin.xml and update the checksum.
  9. Rename the zip back to .bacpac.
  10. Try the import again!

Note: In four years I have only had to resort to this twice, so I do view it as a last resort. If you find yourself repeating it, the file juggling can be scripted, as sketched below.
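Here is a rough PowerShell sketch of the rename/extract/repack steps (the file names are hypothetical, the element you delete from model.xml will differ case by case, and PowerShell 5’s archive cmdlets are assumed):

# Work on a copy; never edit the original bacpac
Copy-Item .\mydb.bacpac .\mydb-edit.zip

# Unpack, then hand-edit model.xml to remove the unsupported element
Expand-Archive -Path .\mydb-edit.zip -DestinationPath .\mydb-edit

# Repack and rename back to .bacpac
Compress-Archive -Path .\mydb-edit\* -DestinationPath .\mydb-new.zip
Rename-Item .\mydb-new.zip mydb-new.bacpac

# Now run dacchksum.exe against mydb-new.bacpac to obtain the new checksum,
# then repeat the unpack/edit/repack cycle to update origin.xml (steps 7-9)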

SQL Azure Backup References

Cheers!

Restrictions when entering voucher transactions in Dynamics AX

MSDN Blogs - 3 hours 50 sec ago

Dynamics AX is designed on the assumption that one voucher number corresponds to one customer/vendor transaction. However, when entering transactions from General ledger > Journals > General journal, you can select various account types on the journal voucher lines. Because the current version of Dynamics AX lets you select and save multiple customers or vendors under a single voucher number, as shown below, problems such as unexpected behavior in tax calculation or in later settlement processing may occur.

 

Bad example 1)

-------------------------------------------------------------
Voucher   Account type  Account      Debit    Credit   Offset account
-------------------------------------------------------------
GL-0001   Customer      Customer_A            100.00   n/a
GL-0001   Customer      Customer_B   30.00             n/a
GL-0001   Customer      Customer_C   70.00             n/a

Recommended entry method)

-------------------------------------------------------------
Voucher   Account type  Account      Debit    Credit   Offset account
-------------------------------------------------------------
GL-0001   Customer      Customer_A            100.00   Interim account (*)
GL-0002   Customer      Customer_B   30.00             Interim account
GL-0003   Customer      Customer_C   70.00             Interim account

(*) The important point is that, by using an interim account, each customer account gets its own voucher number.

 

 

Bad example 2)

-------------------------------------------------------------
Voucher   Account type  Account      Debit    Credit   Offset account
-------------------------------------------------------------
GL-0010   Vendor        Vendor_A              100.00   n/a
GL-0010   Customer      Customer_A   100.00            n/a

Recommended entry method)

-------------------------------------------------------------
Voucher   Account type  Account      Debit    Credit   Offset account
-------------------------------------------------------------
GL-0011   Vendor        Vendor_A              100.00   Interim account
GL-0012   Customer      Customer_A   100.00            Interim account

 

 

For example, if you find that taxes are not calculated correctly or that settlement processing does not behave as expected, first check whether the vouchers were entered using the recommended entry method.

 

 

Creating and Registering SSL Certificates

MSDN Blogs - 3 hours 50 min ago

A few days back, I was working with one of our partners who needed to create an SSL self-signed certificate through the MMC console. Since this is a complex and tedious procedure, I tried developing a script to ease the task for us. I also found a lot of partners asking for assistance with a script-based approach to creating certificates.
I tried to find a way through various discussion forums, which yielded nothing but more queries asking how to build such a script. To address this requirement from the partner pool, this blog explains a script-based way of creating self-signed certificates and registering them so that they meet the prerequisites of SQL Server.
With the script-based approach, a single command gets the SSL self-signed certificate created and ready to be registered. Along with creating the certificate, this blog also explains the different ways of registering those certificates.

 

Scenario 1:

I will create an SSL self-signed certificate using the following three methods:

  • Using the Makecert utility from the SDK.
  • Using the certreq command and an .inf script.
  • Using a PowerShell command.

Steps to be followed:

  1. Using the Makecert utility:
  • The prerequisite for this method is to have the Windows SDK installed on the machine.
  • Navigate to the location of the makecert utility and run the following command from an elevated command prompt to create the certificate:

makecert -r -pe -n "CN=MININT-Q99PLQN.fareast.corp.microsoft.com" -b 10/16/2015 -e 12/01/2020 -eku 1.3.6.1.5.5.7.3.1 -ss my -sr localMachine -sky exchange -sp "Microsoft RSA SChannel Cryptographic Provider" -sy 12

  • The certificate will be created under the MMC console –> Certificates snap-in –> Local Computer –> Personal store.
  • As per the parameters specified, the certificate will be created with the following specifications:
    • The common name of the certificate will be “MININT-Q99PLQN.fareast.corp.microsoft.com”, which is the FQDN of the machine.
    • The private key will be exportable.
    • The certificate will be created in the Computer account -> Personal -> Certificates store.
    • The validity period will be 10-16-2015 to 12-01-2020.
    • Server authentication will be enabled [EKU = 1.3.6.1.5.5.7.3.1].
    • The Key Spec value will be set to 1 [AT_KEYEXCHANGE will be enabled].
    • The cryptographic provider used is the Microsoft RSA SChannel Cryptographic Provider.

2. Using the certreq command:

  • First, save the below script in a text document with an .inf extension (for example, request.inf).

[Version]
Signature = "$Windows NT$"
[NewRequest]
Subject = "CN = MININT-Q99PLQN.fareast.corp.microsoft.com"
FriendlyName = test1.contoso.com
MachineKeySet = true
RequestType=Cert
;SignatureAlgorithm = SHA256
KeyLength = 4096
KeySpec = 1
KeyUsage = 0xA0
MachineKeySet = True
Exportable = TRUE
Hashalgorithm = sha512
ValidityPeriod = Years
ValidityPeriodUnits = 10
[EnhancedKeyUsageExtension]
OID=1.3.6.1.5.5.7.3.1

  • Navigate to the location where you saved the request.inf file, then run the below command from an elevated command prompt:

Certreq -new -f request.inf request.cer

  • The certificate will be created under the MMC console –> Certificates snap-in –> Local Computer –> Personal store.
  • The advantages of this technique are that it does not require the Windows SDK to be installed and that the key length can be changed, whereas with makecert it is by default set to ‘2048’ for ‘RSA’ and ‘512’ for ‘DSS’.

3. Using a PowerShell command

  • Here is the approach to create an SSL certificate satisfying the prerequisites to load it for SQL Server, using a PowerShell command.
  • Run PowerShell as an administrator and enter the following command (where DnsName is the host name or FQDN of the machine):

New-SelfSignedCertificate -DnsName MININT-Q99PLQN.fareast.corp.microsoft.com -CertStoreLocation cert:\LocalMachine\My -FriendlyName test99 -KeySpec KeyExchange
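To confirm the certificate landed in the expected store, you can list it from the same elevated PowerShell session (a quick sanity check, using the test99 friendly name from the command above):

# Look up the new certificate in the local machine Personal store
Get-ChildItem -Path cert:\LocalMachine\My |
    Where-Object { $_.FriendlyName -eq 'test99' } |
    Format-List Subject, Thumbprint, NotBefore, NotAfter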

Scenario 2:

I will register the SSL self-signed certificate using the following two methods:

  • Through the SQL Server Configuration Manager
  • Through explicit registration

Steps to be followed:

  1. Through SQL Server Configuration Manager:
  • First, check the health of the certificate using the CheckSQLssl.exe tool.
  • Here are the pre-requisites for the SSL certificate to use it for SQL server:
    • Certificate must be present in the Local computer certificate store or the current user certificate store.
    • The current date must fall within the certificate’s validity period.
    • Certificate must be meant for server authentication. (EKU should specify Server Authentication [1.3.6.1.5.5.7.3.1])
    • Certificate must be created using the KEY_SPEC option of AT_KEYEXCHANGE (KEY_SPEC=1)
    • Common name of the certificate should be the host name or the FQDN of the server computer.
    • Running the tool from the command prompt generates a validation report.

  • Once all the validation checks of the certificate prerequisites come back ‘OK’, we can go ahead and register it.
  • In SSCM, expand SQL Server Network Configuration -> right-click ‘Protocols for <Instance name>’ -> Properties. Set ‘Force Encryption’ to Yes.

  • Click the ‘Certificate’ tab, where the certificates are listed; select the required certificate from the list and restart the service.

  • The SSL certificate will then be loaded by the selected SQL Server instance; verify this by checking the SQL error logs for the message below and matching it against the thumbprint of the certificate in MMC.

The certificate [Cert Hash(sha1) "BFB714872C7B2CD761ADEB1893BFC99581D3420B"] was successfully loaded for encryption.

  • To verify the thumbprint, double-click the loaded certificate in MMC, click the ‘Details’ tab and click ‘Thumbprint’ in the list.

2. Through explicit registration:

  • If the validation checks all come back OK from the CheckSQLssl tool but the certificate is still not listed in SSCM, follow this technique.
  • Run ‘regedit’, open HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQLServer\SuperSocketNetLib and enter the thumbprint of the certificate, without spaces, as the ‘Certificate’ value (scripted in the sketch after this list).
  • Note that in a clustered environment, on nodes whose FQDN does not match the certificate name, the certificate will not be listed in the configuration manager. In that case explicit registration is the only way to register the certificate.
  • Then, on restarting the SQL service, the SSL certificate will be loaded, which can again be verified in the SQL Server error logs.
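The registry edit can also be scripted. Below is a minimal sketch; it assumes a default instance whose registry key is MSSQL12.MSSQLSERVER and uses a placeholder thumbprint, so substitute your own values:

# Thumbprint copied from the MMC Details tab; strip the spaces it carries
$thumbprint = ('BF B7 14 87 2C 7B 2C D7 61 AD EB 18 93 BF C9 95 81 D3 42 0B' -replace ' ', '')
$regPath = 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQLServer\SuperSocketNetLib'
Set-ItemProperty -Path $regPath -Name Certificate -Value $thumbprint.ToLower()
# Restart SQL Server so the certificate is loaded (the service name differs for named instances)
Restart-Service -Name MSSQLSERVER -Force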
Written by:
Shreyas R, Support Engineer, SQL Server Support

Reviewed by:
Sravani Saluru, Support Escalation Engineer, SQL Server Support
Pradeep M M, Escalation Engineer, SQL Server Support

3 Steps on How to Register a Trademark in Australia

MSDN Blogs - 4 hours 46 min ago

Guest post by Anthony Lieu, Lawyer and Strategist at LegalVision

 

Whether you run an e-commerce marketplace or a pure tech startup, every business creates intellectual property and has a brand to protect. A key part of protecting a business’ brand and goodwill is registering a trademark. From determining the trademark’s class to applying to IP Australia, the trademark registration process can be convoluted. This article sets out what you can trademark and the process it entails in Australia.

A trademark applicant should take care to ensure they are making the correct application when lodging a trademark. Trademarks exist within their own area of intellectual property, as distinguished from designs, patents, copyright and plant breeder’s rights. Within this area, trademarks include letters, logos, words, phrases, numbers, shapes, packaging, sounds, smells, movements and pictures. Across all these different types, the goods or services should be uniquely identifiable with the trademark, and in such a way that they can be linked to the business that provides those goods or services.

  1. Classifying your Trademark

Once it has been determined that you have a trademark, it must then be classified into one of the 45 goods and services classes listed by IP Australia. Misidentification of a product class means the trademark will not extend to cover the product.

To streamline this process, IP Australia has introduced the picklist, which extensively outlines 60,000 sub-sections of goods and services with their corresponding class. This ensures that nearly any type of good or service will match, or be reasonably similar to, a picklist sub-category, so that it can be grouped into a particular class.

When trying to determine the class, it is also helpful to frame the search with some background questions around what your business activities are, the nature of the products or services you provide and where income is derived from. The answers to these questions help ensure you isolate the correct class and the items within it.

However, if no classification adequately covers a product or service, applicants can also submit a custom description detailing the unique features of a good or service not covered by any of the picklist entries. If necessary, registration can be completed for more than one class where there is suitable overlap. The caveat is that this reduces the chance of successful registration, since the criteria of multiple classes must be satisfied. It can also present issues if, for instance, the trademark falls into disuse and one of the class registrations lapses, leaving the product in limbo.

  2. Conducting a Trademark Search

The purpose of a trademark search is to eliminate the possibility that a near-identical or duplicate trademark to the one for which registration is being sought already exists. The official and most widely used way to carry out a trademark search is through the Australian Trademarks Online Search System (ATMOSS), which will reveal whether trademarks are:

  • Pending: still awaiting the decision of IP Australia
  • Registered: approved by IP Australia
  • Not-Registered: having not been registered before
  • Refused: rejected on various grounds, for example because it infringed another trademark.

Images, words and prefixes are all searchable. ATMOSS also enables applicants to follow the application process and to search for other trademark owners. Other than ATMOSS, database search options include Trademark Check, Classification Search and the Trademarks Image Viewer.

  3. Registration Application and Outcome

When registering their trademark, applicants are required to provide their contact details, a representation of the trademark, a description of the product/service being trademarked, the list of relevant classes the trademark falls under, and the filing fee.

Once the trademark application has officially been lodged, applicants await the outcome, which can take up to four months depending on whether any issues arise during trademark examination. If there are problems with the application, they will be presented in the examiner’s report, and the applicant will have the opportunity to rectify them and re-submit. If successful, the applicant will receive a notification of registration.

As trademark registration is a lengthy and complex process, making thorough checks at every step is crucial to prevent potentially expensive oversights. If you have any questions or are uncertain at any point along the way, it’s advisable to consult a specialist IP lawyer. LegalVision’s IP team can assist on 1300 544 755.

 

***
Anthony is a lawyer and Strategist at LegalVision. He has a keen interest in startup law, IT law and scaling fast-growing businesses. He has a strong understanding of how startups operate at all stages and of navigating the myriad legal issues surrounding online businesses. He has worked in the public and private legal sector, specialising in disputes and litigation, corporate advisory and tax controversy.

 

Featured Small Basic Wiki Article: Read and Write CSV Files

MSDN Blogs - 8 hours 43 min ago

Here is a classic article from Florian that examines how to use CSV files in Small Basic.

Read the full article here:

And here is an excerpt:

===========================

What is CSV?

Comma-separated values (CSV) is a file format used to exchange data between programs. It uses commas to separate values and requires the same number of values on each line.

Example

For a game we want to save the best players’ scores, their names and the dates on which the scores were achieved.
The format we therefore choose is: [player’s name],[score],[date]

Bob,12,2013-01-02

This means that Bob scored 12 points on 2013-01-02. We can easily store multiple scores by writing them one below another.

Alice,15,2013-03-04
Bob,12,2013-01-02
Charlie,9,2013-05-06

Observation: CSV files can hold one to many records, or lines with values.

 

=================
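As an aside that goes beyond the excerpt: the same three-column records are easy to inspect with PowerShell, which is a handy way to sanity-check the format (this assumes the lines above are saved as scores.csv with no header row):

# Parse the header-less CSV, supplying the column names ourselves
$scores = Import-Csv .\scores.csv -Header Name, Score, Date
# Show the best score first
$scores | Sort-Object { [int]$_.Score } -Descending | Format-Table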

Read the full article here:

 

TechNet Wiki articles about Small Basic are gifts that keep giving, years later!

Have a Small and Basic week,

– Ninja Ed

dotnet restore unable to resolve .NET Framework libraries

MSDN Blogs - 9 hours 37 min ago

I was at a hack at the start of the month where we were attempting to create a CI pipeline for the solution. The solution was made up of a .NET Core (1.0.0-preview2-003121) Web API project that references some .NET Framework 4.5.2 class libraries. When I compiled this in VSTS, it was unable to restore the .NET Framework libraries.

Searching around, there are similar issues reported on GitHub: https://github.com/dotnet/cli/issues/3199 and https://github.com/aspnet/Tooling/issues/741.

The problem turned out to be that the version of NuGet on the hosted build agents does not work with this mix of Core and Framework. To fix this:

  • Add a variable called NugetDownloadUrl with a value of https://dist.nuget.org/win-x86-commandline/v3.5.0-rc1/NuGet.exe
  • Add a simple PowerShell task to download the release-candidate NuGet (see the sketch after this list). The command should be: Invoke-WebRequest -OutFile "$(Build.ArtifactStagingDirectory)\NuGet.exe" $(NugetDownloadUrl)
  • Then add a standard NuGet restore task, but point it explicitly at the NuGet.exe file that was just downloaded: $(Build.ArtifactStagingDirectory)\NuGet.exe
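For reference, the body of that PowerShell task boils down to a one-line download. As a standalone inline script it would look roughly like this (build variables surface to PowerShell tasks as environment variables):

# Download the NuGet 3.5 RC into the artifact staging directory
$dest = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY 'NuGet.exe'
Invoke-WebRequest -Uri $env:NUGETDOWNLOADURL -OutFile $dest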

Now, when you compile your project, it will compile your .NET Framework projects and restore the .NET Core packages.

Application Insights tokenization for Visual Studio Team Services extensions

MSDN Blogs - 14 hours 3 min ago

To apply DevOps to a software development project we implement a continuous integration (CI) / continuous deployment (CD) pipeline. Next we add Application Insights telemetry so that we can monitor the extension, as outlined in Monitor Team Services web extensions with Visual Studio Application Insights and Application Insights and TypeScript. When we decide to open-source our solution, we realise that we would be sharing the instrumentation key publicly. It’s not a show-stopper, but something we recommend avoiding. By protecting your instrumentation key, you know where the telemetry data is coming from, making it trustworthy.

One solution is to replace the instrumentation key in your instrumentation code with a token, which is dynamically exchanged for your instrumentation key during the CI/CD pipeline. Let’s walk through these two common scenarios:

  • Application Insights instrumentation key that needs to be injected during the build. The instrumentation key is the same for all your environments.
  • Application Insights instrumentation key that needs to be injected for each release environment. The instrumentation key is different for some or all of your environments.

Replace sensitive information with a token

Our extensions contain a script file, TelemetryClient.ts, that contains our telemetry instrumentation code. See https://github.com/ALM-Rangers for examples.

  • In your telemetry script, for example TelemetryClient.ts, replace the value of the instrumentation key with __INSTRUMENTATIONKEY__.
  • Push the change to your public repo.
  • Optionally squash all history to clean things up and remove your instrumentation key from the history. We’ll cover this step in a future post.

Replacing a token

Scenario 1 – Hide the Application Insights instrumentation key in your OSS code and inject it during the build

  • Install Colin’s ALM Corner Build & Release Tools from the Visual Studio Marketplace.
  • Edit your build definition.
  • Click on Variables [1], add a new variable called INSTRUMENTATIONKEY [2], and set its value [3] to your Application Insights instrumentation key.
  • Add a new Replace Tokens task to your build definition, making sure to place it before the compilation step, which generates JavaScript from the TypeScript files.
  • Set the Source Path [1] to your scripts folder, and the Target File Pattern [2] to the TelemetryClient.ts file. The Token Regex parameter defaults to __(\w+)__ [3]. This matches tokens with a double-underscore __ prefix and suffix, for example __INSTRUMENTATIONKEY__.
     
  • Save the build.
  • Check in the updated TelemetryClient.ts, which will trigger a build if you have a CI/CD pipeline.
  • Verify that the Replace Tokens task ran successfully. It uses the regex to find our __INSTRUMENTATIONKEY__ token, looks up the corresponding INSTRUMENTATIONKEY variable, and then replaces the token with the variable’s value (a simplified sketch of this substitution follows below).

    Running the build with system.debug set to true will give you a more detailed trace.
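For intuition, here is a simplified PowerShell sketch of the substitution the task performs; it is not the task’s actual source, and the file path is hypothetical:

$pattern = '__(\w+)__'
$file    = '.\scripts\TelemetryClient.ts'
$content = Get-Content $file -Raw
# Replace each __NAME__ token with the value of the matching build variable,
# which the agent exposes as an environment variable
$content = [regex]::Replace($content, $pattern, {
    param($m)
    $value = [Environment]::GetEnvironmentVariable($m.Groups[1].Value)
    if ($value) { $value } else { $m.Value }  # leave unknown tokens untouched
})
Set-Content $file $content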

 

Scenario 2 – Hide the Application Insights instrumentation key in your OSS code and inject it for each release environment

For this scenario we unzip the extension VSIX package, replace the token in the TelemetryClient.js file, and re-zip the extension VSIX package for each environment defined in the release.

Let’s get started.

Create instrumentation key token variable
  • Click on the context [1] for an environment and click Configure Variable [2].
  • Add a new variable called INSTRUMENTATIONKEY [1], set its value [2] to your Application Insights instrumentation key, and optionally click Secret [3] to hide the value.
  • Use the same Application Insights instrumentation key for all environments to collect metrics for all environments in one bucket. Alternatively use different keys for the environments to collect metrics for specific environments.
Create VsixPath variable
  • Click on the release Variables [1], add a new variable called VsixPath [2], and set its value [3] to the extension VSIX package generated by your build.
Extract files from VSIX package
  • Add a new Extract files [1] task to your environment definition. Set the Archive file [2] to the $(VsixPath) variable we just created, and define a Destination folder [3].
Replace token
  • Add a new Replace Tokens task [1] to your environment definition, placing it after the Extract files task.
  • Set the Source Path [1] to the folder containing the extracted files, and the Target File Pattern [2] to the TelemetryClient.js file [3].
Re-create VSIX package
  • Add the Archive files task [1] to your environment definition, placing it after the Replace Tokens task.
  • Set the Root folder to the folder containing the extracted files, and the Archive file to create to the $(VsixPath) variable [3].
Group Tasks

To simplify the previous three steps, we can create a task group that encapsulates all three in one re-usable task.

  • Select the three tasks we just created [1]. Right click and select Create task group [2].
  • Define a unique Name [1], select the Utility category [2], and set the VsixPath to *.vsix [3].
  • Once validated, the three tasks are automatically replaced with the new Extension Tokenize task group.

That’s it.

We look forward to hearing from you

We would love your feedback. Here are some ways to connect with us:

Resource Owner Password Credentials Grant in Azure AD OAuth

MSDN Blogs - 17 hours 20 min ago

Azure AD supports various grant flows for different scenarios, such as the Authorization Code Grant for web server applications, the Implicit Grant for native applications, and the Client Credentials Grant for service applications. Furthermore, the Resource Owner Password Credentials Grant is also supported for cases where the resource owner trusts the target application, such as an in-house Windows service.

 

As the Resource Owner Password Credentials Grant is based entirely on HTTP requests, with no URL redirection, it can be applied not only to WPF and WinForms applications but also to C++ and MFC, whether or not there is user interaction. For the official description of this flow, refer to RFC 6749. This flow gives us a lot of flexibility to obtain a token easily; however, because it exposes the user name and password directly in an HTTP request, it also brings potential attack risk. Your credentials can easily be lost if the request is sent to an unexpected endpoint, and we should always avoid handling user credentials directly anyway. Furthermore, note that the resource owner password grant provides no consent experience and does not support MFA. So use the Authorization Code flow if possible, and do not abuse the resource owner password grant.

The following are the parameters needed in Azure AD OAuth for resource owner password grant.

Name – Description
grant_type – The OAuth 2 grant type: password
resource – The app to consume the token, such as Microsoft Graph, Azure AD Graph or your own RESTful service
client_id – The Client ID of a registered application in Azure AD
client_secret – The Client Secret of the app
username – The user account in Azure AD
password – The password of the user account
scope – Optional, such as openid to get an ID token

 

For now, the latest .NET ADAL doesn’t support this grant directly, but you can simply use the code snippet below to acquire an access token via a native HTTP request.

using (HttpClient client = new HttpClient())
{
    var tokenEndpoint = @"https://login.windows.net/<tenant-id>/oauth2/token";
    var accept = "application/json";
    client.DefaultRequestHeaders.Add("Accept", accept);
    string postBody = @"resource=https%3A%2F%2Fgraph.microsoft.com%2F&client_id=<client id>&grant_type=password&username=xxx@xxx.onmicrosoft.com&password=<password>&client_secret=<client secret>&scope=openid";

    using (var response = await client.PostAsync(tokenEndpoint, new StringContent(postBody, Encoding.UTF8, "application/x-www-form-urlencoded")))
    {
        if (response.IsSuccessStatusCode)
        {
            var jsonresult = JObject.Parse(await response.Content.ReadAsStringAsync());
            token = (string)jsonresult["access_token"];
        }
    }
}

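For a quick ad-hoc test, the same token request can be issued from PowerShell (a sketch using the same placeholders as the C# snippet above; Invoke-RestMethod form-encodes a hashtable body on POST):

$tokenEndpoint = 'https://login.windows.net/<tenant-id>/oauth2/token'
$body = @{
    resource      = 'https://graph.microsoft.com/'
    client_id     = '<client id>'
    client_secret = '<client secret>'
    grant_type    = 'password'
    username      = 'xxx@xxx.onmicrosoft.com'
    password      = '<password>'
    scope         = 'openid'
}
$result = Invoke-RestMethod -Method Post -Uri $tokenEndpoint -Body $body
$result.access_token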
 

The response also contains a refresh token and, thanks to the openid scope, an ID token.

 

For a .NET console app or Windows service, OWIN is not available to provide token validation in middleware; you can perform the validation via System.IdentityModel.Tokens.Jwt. This package provides JwtSecurityTokenHandler.ValidateToken to validate against a series of criteria such as issuer and signature, and you can retrieve that information from https://login.windows.net/<tenant-id>/.well-known/openid-configuration and https://login.windows.net/common/discovery/keys

 

 

OpsMgr: Displaying Agent Average Send Queue Size with a Blue Bar State Widget

MSDN Blogs - 20 hours 39 min ago

This post demonstrates how the Blue Bar State Widget Template (Download Here) can be used to create a custom dashboard in the OpsMgr 2012 Operations Console that displays the health state, patch level and average Send Queue Size of each managed agent, and lets the user visually identify the maximum and minimum Send Queue Size values in the column with the blue bar.

 


A high average Send Queue Size value for an agent over a long period of time, e.g. over 3 hours, could indicate that the agent is collecting a lot of monitoring data, such as event or performance data, from a monitored server. This could be due to misconfigured collection workflows that run too frequently, do not target the right class, or have loose (collect-all) filters in their filtering expressions, or even to problematic application issues like the case outlined in the previous post.
These configuration issues must be addressed to avoid flooding the OpsMgr databases with data and unnecessarily consuming resources on the monitored servers and database servers.

For an example of how the influx of collected events affected the Send Queue Size of an agent and the data set in a data warehouse database over time, please refer to the following blog post entitled:
An Inflating Event Data Set !? The Case of the Rogue Event Collection Workflow.

To create an instance of the dashboard shown above using the Blue Bar State Widget Template, first create a dashboard layout with 1 cell, click to add a widget to the cell, then select the Sample Blue Bar State Widget template located under the “All Templates/WeiOutThere Blue Bar Column Template” folder, go through the UI pages of the template and enter the required information.

 

Here is a sample PowerShell script that can be inserted into the “PowerShell Script” page of the widget template to create a Blue Bar State Widget that consists of a list of managed objects representing all Microsoft Monitoring Agents in the current management group (all instances of the Microsoft.SystemCenter.Agent class). The health state, patch level and average Send Queue Size over the last 3 hours (the blue bar column) for each agent are displayed in their respective columns as well.

$class = get-scomclass -Name Microsoft.SystemCenter.Agent
$Agents = Get-SCOMClassInstance -class $class
$avg_stat = @{}
$dataObjects = @()

#///////// Functions Section ///////////////////// START
function RecalculateMinMaxForAvgStatItem
{
    param($name, $value)
    $avg_stat[$name]["min"] = ($avg_stat[$name]["min"], $value | Measure -Min).Minimum
    $avg_stat[$name]["max"] = ($avg_stat[$name]["max"], $value | Measure -Max).Maximum
}

function CreateStatistics
{
    param($value)
    $stat = $ScriptContext.CreateInstance("xsd://Microsoft.SystemCenter.Visualization.Library!Microsoft.SystemCenter.Visualization.DataProvider/PerformanceDataStatistics")
    if ($value -ne $null)
    {
        $stat["AverageValue"] = [double]$value
        $stat["Value"] = [double]$value
    }
    $stat
}

# Initialize Stat Item:
function InitAvgStatItem
{
    param($name)
    if ($avg_stat[$name] -eq $null)
    {
        $avg_stat[$name] = @{}
        $avg_stat[$name]["min"] = 0
        $avg_stat[$name]["max"] = [Int32]::MinValue
    }
}

function AddColumnValue
{
    param($dataObject, $name, $value)
    $v = $value
    InitAvgStatItem $name
    if ($v -ne $null)
    {
        $dataObject[$name] = CreateStatistics($v)
        RecalculateMinMaxForAvgStatItem $name $v
    }
    else
    {
        $dataObject[$name] = $null
    }
}
#///////// Functions Section ///////////////////// END

#///////// Main Section ///////////////////// START
foreach ($Agent in $Agents)
{
    $dataObject = $ScriptContext.CreateFromObject($Agent, "Id=Id,State=HealthState,Name=Name", $null)
    $dataObject["Name"] = $Agent.Path
    $dataObject["Patch Level"] = $Agent.'[Microsoft.SystemCenter.HealthService].PatchList'.value
    $dataObject["Path"] = $Agent.Path
    if ($dataObject -ne $null)
    {
        # Last 3 hours, in UTC
        $aggregationInterval = 3
        $dt = New-TimeSpan -hour $aggregationInterval
        $nowlocal = Get-Date
        # Convert local time to UTC time
        $now = $nowlocal.ToUniversalTime()
        $from = $now.Subtract($dt)
        $perfRules = $Agent.GetMonitoringPerformanceData()
        foreach ($perfRule in $perfRules)
        {
            if ($perfRule.CounterName -eq "Send Queue Size")
            {
                $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
                AddColumnValue $dataObject $perfRule.CounterName $data.Average
            }
        }
        $dataObjects += $dataObject
    }
}

foreach ($dataObject in $dataObjects)
{
    foreach ($metric in $avg_stat.Keys)
    {
        $stat = $avg_stat[$metric]
        $dataObject[$metric]["MinimumValue"] = [double]$stat["min"]
        if ($stat["max"] -ne [Int32]::MinValue)
        {
            $dataObject[$metric]["MaximumValue"] = [double]$stat["max"]
        }
        else
        {
            $dataObject[$metric]["MaximumValue"] = [double]0
        }
    }
    $ScriptContext.ReturnCollection.Add($dataObject)
}


On the “Refresh Interval” page, enter a numerical value for the refresh interval of the widget (in seconds), then click the Finish button to create the custom Blue Bar State Widget.


Here is another example of a dashboard that consists of a Blue Bar State Widget with the managed agents, created using the sample PowerShell script above, and two other Contextual Performance Widgets created with the Sample Contextual Performance Widget Template.

   
The performance counter specified for the first Contextual Performance Widget for Agent Send Queue Size is:
Performance Object: Health Service Management Groups
Performance Counter: Send Queue Size
Performance Instance: %

The performance counter specified for the second Contextual Performance Widget for Agent Processor Utilization is:
Performance Object: Health Service
Performance Counter: agent processor utilization

 
Disclaimer:
All information on this blog is provided on an as-is basis with no warranties and for informational purposes only. Use at your own risk. The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of my employer.

Integrate Azure logs streamed to Event Hubs to SIEM

MSDN Blogs - Sat, 09/24/2016 - 23:13

Overview – Azure resource providers emit Azure diagnostic logs, which are intended to give our customers insight into what is happening with their resources and to be archived per their requirements. To enable this feature, please follow the steps provided at Azure diagnostics and Azure Audit logs.

Azure log integration can be used to integrate the logs into your SIEM (Security Information and Event Management) system. An event hub can be configured as the source of logs destined for your SIEM.

Prerequisite – Install the latest Azure PowerShell. Open PowerShell as administrator and run the following cmdlets:
  • login-azurermaccount
  • add-azureaccount
  • Select-AzureRmSubscription -SubscriptionName <SubscriptionName>
  • Select-AzureSubscription -SubscriptionName <SubscriptionName>
Create an AzureRMLogProfile
  1. Register the Resource Provider

Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights

  2. Create a resource group, storage account and event hub.

#Specify the name of your AzureRmLogProfile, location, storage account, resource group and Service Bus namespace
$name = 'azlog'
$location = 'West US'
$storagename = $name + 'str'
$rgname = $name + '-rg'
$sbnamespace = $name + '-ns'
#The command below creates a new resource group
$rg = New-AzureRmResourceGroup -Name $rgname -Location $location
#Create a new storage account
$storage = New-AzureRmStorageAccount -ResourceGroupName $rgname -Name $storagename -Location $location -SkuName Standard_LRS
#Create a new Service Bus namespace
$sb = New-AzureSBNamespace -Name $sbnamespace -Location $location -CreateACSNamespace $false -NamespaceType Messaging

$sbresource = Get-AzureRmResource | where {$_.ResourceName -eq $sbnamespace -and $_.ResourceType -eq 'Microsoft.ServiceBus/namespaces'}
$sbruleid = $sbresource.ResourceId + '/authorizationrules/RootManageSharedAccessKey'
$locationobjects = Get-AzureRmLocation
$locations = @('global') + $locationobjects.Location
Add-AzureRmLogProfile -Name $name -StorageAccountId $storage.Id -ServiceBusRuleId $sbruleid -Locations $locations

Set diagnostics for resources (the example below creates an Azure Key Vault and sets diagnostics on it)

 

$kv = New-AzureRmKeyVault -VaultName $name -ResourceGroupName $rgname -Location $location
Set-AzureRmDiagnosticSetting -ResourceId $kv.ResourceId -StorageAccountId $storage.Id -ServiceBusRuleId $sbruleid -Enabled $true

 

Generate logs on the newly created Key Vault

$sb = Get-AzureSBNamespace -Name $sbnamespace
$storage = Get-AzureRmStorageAccount -ResourceGroupName $rgname -Name $storagename
$kv = Get-AzureRmKeyVault -VaultName $name -ResourceGroupName $rgname
# Roll secondary key to generate Activity Log
Get-AzureRmStorageAccountKey -Name $storagename -ResourceGroupName $rgname  | ft -a
New-AzureRmStorageAccountKey -Name $storagename -ResourceGroupName $rgname -KeyName key2
Get-AzureRmStorageAccountKey -Name $storagename -ResourceGroupName $rgname  | ft -a
# Set and read a secret
Set-AzureKeyVaultSecret -VaultName $name -Name TestSecret -SecretValue (ConvertTo-SecureString -String 'Hi There!' -AsPlainText -Force)
(Get-AzureKeyVaultSecret -VaultName $name -Name TestSecret).SecretValueText

Azlog Integration

Uninstall the previous version of Azure log integration

If you have a previous version installed, you will need to uninstall it first. Uninstalling it will remove all sources that are registered.

  1. Open a command prompt as administrator, cd into c:\Program Files\Microsoft Azure Log Integration and run: Azlog removeazureid
  2. In Control Panel –> Add/Remove Programs, uninstall Microsoft Azure Log Integration

 

Install Azure log integration
  • Download AzureLogIntegration.msi and install it
  • Open PowerShell as an administrator and cd into c:\Program Files\Microsoft Azure Log Integration
  • Run .\LoadAzLogModule.ps1

 

Add the event hub as a source to Azlog

$sb = Get-AzureSBNamespace -Name $sbnamespace
$storage = Get-AzureRmStorageAccount -ResourceGroupName $rgname -Name $storagename
$kv = Get-AzureRmKeyVault -VaultName $name -ResourceGroupName $rgname

# Get Namespace Manager
$NamespaceManager = [Microsoft.ServiceBus.NamespaceManager]::CreateFromConnectionString($sb.ConnectionString);

# Show ConnectionString, EventHubs
$sb.ConnectionString
$EventHubs = $NamespaceManager.GetEventHubs()
$EventHubs.Path

#Get Storage Key
$storagekeys = Get-AzureRmStorageAccountKey -ResourceGroupName $rgname -Name $storagename
$storagekey = $storagekeys[0].Value

# Run AzLog command for each event hub
$EventHubs.Path | %{Add-AzLogEventSource -Name "$sub - $_" -StorageAccount $storage.StorageAccountName -StorageKey $storagekey -EventHubConnectionString $sb.ConnectionString -EventHubName $_}

Validate the integration
  • Verify that logs from the event hubs are written to c:\users\azlog\EventHubJson and to c:\users\azlog\EventHubJsonLD
  • Follow the Partner Configuration steps for your SIEM to complete the integration

Azure Log Integration SIEM configuration steps

 

 

Visual Studio Code Extension Download Count

MSDN Blogs - Sat, 09/24/2016 - 22:40

Usually, I want to see download-count data for my VS Code extensions, especially when there is a new release of an extension. However, we can only see the current download count in the VS Marketplace. Therefore, I implemented this site to support viewing the historical download-count data: Visual Studio Code Extension Download Count

So, how did I make it? I used the things below.

Go to https://vsce.github.io/ and have a look! This is just an initial release. I look forward to your feedback!

SQLCAT at Microsoft Ignite 2016

MSDN Blogs - Sat, 09/24/2016 - 11:36

Hi all, we are looking forward, as we are sure you are, to a great Microsoft Ignite 2016! Three members of the SQLCAT team will be in Atlanta (September 26th-30th) and of course we would love to see everyone and talk about SQL Server, Azure SQL DB, and SQL DW.

We have three sessions where we will share some great customer experiences and learnings with SQL Server 2016. Mike Weiner (@MikeW_SQLCAT) will co-present with early adopters from Attom Data Solutions and ChannelAdvisor:

BRK2231 Understand how ChannelAdvisor is using SQL Server 2016 to improve their business (Wednesday, September 28th from 2-2:45PM EST)

BRK3223 Understand how Attom Data Solutions is using SQL Server 2016 to accelerate their business (Thursday, September 29th from 9-9:45AM EST)

Then, be sure to keep your Friday open for Arvind Shyamsundar (@arvisam) and Denzil Ribeiro’s (@DenzilRibeiro) presentation:

BRK3094 Accelerate SQL Server 2016 to the max: lessons learned from customer engagements (Friday, September 30th from 12:30-1:45PM EST)

When we are not presenting, we’ll primarily be at the Expo in Hall B at the Microsoft Showcase – Data Platform and IoT, eager to talk to you! We look forward to seeing everyone there!

Arvind, Denzil and Mike

[Windows/Mac] Setting up a Xamarin environment (and how to check whether it is already installed)

MSDN Blogs - Sat, 09/24/2016 - 01:24

This post is about setting up (getting started with) a Xamarin environment.

You can develop native apps with Xamarin on both Windows and Mac.

Xamarin was originally a product of Xamarin Inc., and its license used to be expensive, about 250,000 yen per person per year;
it became free after Microsoft acquired the company in spring 2016.


↑ iOS/Android/Windows apps

This article explains how to set up a Xamarin environment on each platform.

                          Windows                                  Mac
IDE used                  Visual Studio 2015 and later             Xamarin Studio
Android app development   Yes                                      Yes
iOS app development       Yes, with a remotely connected Mac (*)   Yes
Windows app development   Yes (UWP/Win 8.1)                        No

(*) What “with a remotely connected Mac” means: only Xcode ships the iOS SDK, and Xcode is a Mac app. A “remotely connected Mac” means using a Mac as a “build host”. When you create an iOS project in VS, the Mac agent starts and tries to find a nearby Mac that permits remote connections.

Setting up the environment (Mac)

Starting on a Mac (Xamarin Studio).

On a Mac it is simple: if the Xamarin Studio IDE is installed, Xamarin is installed.
For iOS app development, Xcode (which contains the iOS SDK) must also be installed.

For how to install Xamarin Studio and more, see:
Trying Xamarin on a Mac! From installation to running, completely free [Getting Started Xamarin on Mac]

Setting up the environment (Windows)

Starting on Windows (Visual Studio).

  • If Visual Studio itself is not installed yet:
    • On xamarin.com, click “Download VS” under “Visual Studio Community”. (It’s free.)
  • If you already have Visual Studio:
    • Check whether Xamarin is already installed in that copy of VS. The steps follow below.
Checking whether Xamarin is already installed in your copy of VS

“File” → “New” → “Project”

Check whether the Xamarin items (“Blank App”) appear under “Templates” → “C#” → “Cross-Platform”.

If they are there, Xamarin is already installed.
If they are not there, Xamarin is not yet installed in your environment.

Below is how to install Xamarin into your copy of VS. As mentioned repeatedly, it is free.

Installing Xamarin into your copy of VS

“Add or Remove Programs”

“Apps & features” → “Visual Studio {edition (e.g. Community)} 2015 with Update 2/3” → “Modify”

Check Xamarin and install it.

When you check Xamarin, other items get checked automatically as well (such as a specific version of the UWP SDK); they are required, so proceed with the installation as-is.

Reference links on environment setup / Once your environment is set up

Here is a collection of recommended links.

Azure News on Friday (KW38/16)

MSDN Blogs - Sat, 09/24/2016 - 00:00

Once again this week there was plenty of news about the Microsoft Azure platform. Here is more information on it…

Recent news

23.09. Service Fabric SDK and Runtime for version 5.2 released
New Azure Service Fabric SDK version 5.2 available

22.09. Service Bus client 3.4.0 is now live
New Service Bus client library version 3.4.0 – important especially for users of Event Hubs

22.09. Microsoft Azure Storage samples – cross platform quick starts and more
Example code and samples for programming with Azure Storage

22.09. Umbraco uses Azure SQL Database Elastic Pools for thousands of CMS tenants in the cloud
Azure SQL Database Elastic Pools as the foundation for Umbraco as a Service on Microsoft Azure

21.09. Azure ML, as Part of the IoT Suite, Now Available in Azure Germany
Azure ML now also available in the Azure cloud in Germany

21.09. Azure Stream Analytics support for IoT Hub Operations Monitoring
Azure Stream Analytics now also enables analysis of IoT Hub operations (device telemetry, identity, etc.)

21.09. Microsoft Azure Germany now available via first-of-its-kind cloud for Europe
The Microsoft Cloud Germany is finally here! Two Azure datacenters in Germany under data trusteeship

20.09. Project Bletchley – Blockchain infrastructure made easy
Project Bletchley – provision blockchain infrastructure on Microsoft Azure with Azure Resource Manager

20.09. Announcing the release of Azure Mobile Apps Node SDK v3.0.0
Azure Mobile Apps Node SDK v3.0.0 available

New videos

23.09. Episode 214: Hockey App and Azure App Insights with Evgeny Ternovsky and Josh Weber
HockeyApp and Azure Application Insights presented – analytics and app telemetry for client and cloud apps

23.09. What is Microsoft Azure Stack?
A short portrait of Azure Stack

22.09. Azure Functions and the evolution of web jobs
Everything worth knowing about Azure Functions in just under 10 minutes – serverless computing on Microsoft Azure

22.09. PowerShell on Linux – Azure Demo
Azure PowerShell on Linux

22.09. Get Started with Azure Portal
First steps with the Azure portal

22.09. Create a Linux Virtual Machine
Setting up a Linux VM with Azure Virtual Machines – a four-minute overview

21.09. PowerShell Tools for Visual Studio 2015
Using Azure PowerShell with Visual Studio

21.09. Unified application model
The unified application model with Azure Resource Manager – deploy apps to the Azure cloud or Azure Stack

20.09. Tuesdays with Corey: More Azure Portal stuff with Vlad
Corey Sanders with news about the Azure portal

How to update Xamarin in Visual Studio

MSDN Blogs - Fri, 09/23/2016 - 23:49

This post describes how to update Xamarin for Visual Studio.

  1. Open Visual Studio.
  2. On the menu bar at the top, click “Tools”,
  3. then “Options”.

  4. Click “Xamarin” (near the bottom of the list),
  5. then “Other”,
  6. then “Check Now”,
  7. then “Download”.

This launches the Xamarin installer;
close Visual Studio and follow the installer’s instructions. (Basically, just accept and keep clicking OK.)

Integrate Azure logs to QRadar

MSDN Blogs - Fri, 09/23/2016 - 23:12

Please read Azure log integration first; it covers the high-level architecture of the integration.

This blog is for anyone who has Azure resources and wants their logs integrated into the QRadar SIEM (Security Information and Event Management) system.

By following the steps outlined here, you will be able to integrate the following logs into QRadar:

  1. Azure Activity logs
  2. Azure Security Center Alerts

At the time of this blog post, there are about 200 events from the Azure Activity logs that successfully map to categorized events in QRadar. The number of supported events will increase as we work closely with IBM to add more events to the DSM (Device Support Module).

Step 1 – Install the Azure DSM released by IBM (QRadar 7.2 download)

 

Step 2 – Uninstall the previous version of Azure log integration

If you have a previous version of Azure log integration installed, you will need to uninstall it first. Uninstalling it will remove all sources that are registered.

Steps to uninstall:

  1. Open the command prompt as administrator and cd into c:\Program Files\Microsoft Azure Log Integration.
  2. Run the command

            Azlog removeazureid

  3. In Control Panel –> Add/Remove Programs –> Microsoft Azure log integration –> Uninstall

Install Azure log integration
  1. Download Azure log integration and follow the install instructions
  2. Open a command prompt as administrator and cd into c:\Program Files\Microsoft Azure Log Integration
  3. Run “azlog.exe powershell”. This will open a PowerShell window
  4. In the PowerShell window, run the following:

              Add-AzLogEventDestination -Name QRadarConsole1 -SyslogServer 10.0.0.5 -SyslogFormat LEEF

             Name is a friendly name for the destination

             SyslogServer is the IP address of the QRadar console (you can specify a syslog port if necessary).

  5. Run the command

           .\azlog.exe createazureid

  6. Run the command

           .\azlog authorize <SubscriptionID>

  7. QRadar will auto-discover the source. You can verify this on the Log Activity tab of your QRadar console. Provide the SubscriptionID in the Quick Filter, or if you want to search across all subscriptions, provide ‘azure’ as the quick filter text. Note that only about 200 events are currently categorized.


[Sample Of Sept. 24] How to create a Hello World 3D holographic app with Unity

MSDN Blogs - Fri, 09/23/2016 - 18:27

Sample : https://code.msdn.microsoft.com/How-to-create-a-Hello-bae9df25

This sample demonstrates how to create a Hello World 3D holographic app with Unity.

You can find more code samples that demonstrate the most typical programming scenarios by using the Microsoft All-In-One Code Framework Sample Browser or the Sample Browser Visual Studio extension. They give you the flexibility to search samples, download samples on demand, manage the downloaded samples in a centralized place, and be automatically notified about sample updates. If this is the first time you have heard about the Microsoft All-In-One Code Framework, please watch the introduction video on Microsoft Showcase, or read the introduction on our homepage http://1code.codeplex.com/.

Angular 2 with @types declaration files

MSDN Blogs - Fri, 09/23/2016 - 14:11

– To install TypeScript declaration files, at the beginning you had to install DefinitelyTyped packages using NuGet
– Then came the node tool TSD, followed by Typings
– The next step in the evolution of type acquisition is @types

– @types only requires npm to be managed
– In enterprise environments, this means one less tool to manage and one less proxy setting to worry about (Typings doesn’t use node proxy settings)

– Currently, as of 2016-09-23, the official Angular 2 tutorials use a Typings file to install the declaration files

– To use @types instead:
1. Make sure TypeScript 2 is installed in Visual Studio (if you are using it)
2. Make sure you are using "typescript": "^2.0.3" (or newer) in the package.json file
3. Remove "typings" from "devDependencies" in the package.json file
4. Add the following to "devDependencies":
4a. "@types/core-js": "*"
4b. "@types/node": "*"
4c. If you want to run Jasmine tests, add: "@types/jasmine": "*"
5. Delete the “typings.json” file
6. Run npm install
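– Steps 3, 4 and 6 can also be done from the command line, since npm’s --save-dev flag updates package.json for you (deleting typings.json in step 5 remains a manual step):

npm uninstall --save-dev typings
npm install --save-dev @types/core-js @types/node @types/jasmine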

Machine Learning Workshop in Charlotte, NC

MSDN Blogs - Fri, 09/23/2016 - 14:08

Hello everyone,

Just two weeks ago in Charlotte, NC, the Microsoft Dynamics and Cortana Intelligence team held a two-day workshop for our industry partners at the Microsoft campus off Arrowood Road. In attendance was an array of partners who provide Dynamics AX and CRM customers with actionable business intelligence, forecasting and integration support. We had a busy two-day schedule in which the class learned about, and participated hands-on in, integrating Cortana Intelligence Cognitive APIs into Dynamics AX. We also built and operationalized machine learning models using Azure Machine Learning; we wired up data from AX through the entity store and Azure Data Factory to make a data pipeline capable of retraining that model (as purchases and catalogs change, so should the model); and we had a fun time playing with Event Hub, Stream Analytics and Power BI to show how streaming IoT data can be filtered and processed using Azure.

It was exciting to see the participation, the questions and the overall engagement from the class, and I want to thank everyone for making the trip from far-off places (or not so far) to attend.

Our Workshop Class in Charlotte, NC

Thanks to all who participated and made this a fun and engaging two days!

Materials from the course are publicly available here: http://ax-cis-ws.azurewebsites.net/

Non-public materials from the course (such as source code and sample applications) can be requested at axcispntr@microsoft.com.

 
