Feed aggregator

Visual Studio Toolbox: Developing Universal Windows Apps

MSDN Blogs - Wed, 05/27/2015 - 07:59

In this episode, I am joined by Navit Saxena, who demonstrates building Universal Windows Apps with Visual Studio 2015. Navit reviews how to install the Windows 10 tooling, create adaptive apps, build UI for different device families and use platform and 3rd-party extension SDKs.

Dubious security vulnerability: Luring somebody into your lair

MSDN Blogs - Wed, 05/27/2015 - 07:00

A security report was received that went something like this:

The XYZ application does not load its DLLs securely. Create a directory, say, C:\Vulnerable, and copy XYZ.EXE and a rogue copy of ABC.DLL in that directory. When C:\Vulnerable\XYZ.EXE is run, the XYZ program will load the rogue DLL instead of the official copy in the System32 directory. This is a security flaw in the XYZ program.

Recall that the application directory is the application bundle. The fact that the XYZ.EXE program loads ABC.DLL from the application directory rather than the System32 directory is not surprising, because ABC.DLL has been placed inside the XYZ.EXE program's trusted circle.

But what is the security flaw, exactly?

Let's identify the attacker, the victim, and the attack scenario.

The attacker is the person who created the directory with the copy of XYZ.EXE and the rogue ABC.DLL.

The victim is whatever poor sap runs the XYZ.EXE program from the custom directory instead of from its normal location.

The attack scenario is

  • Attacker creates a directory, say, C:\Vulnerable.
  • copy C:\Windows\System32\XYZ.EXE C:\Vulnerable\XYZ.EXE
  • copy rogue.dll C:\Vulnerable\ABC.DLL
  • Convince a victim to run C:\Vulnerable\XYZ.EXE.

When the victim runs C:\Vulnerable\XYZ.EXE, the rogue DLL gets loaded, and the victim is pwned.

But the victim was already pwned even before getting to that point! Because the victim ran C:\Vulnerable\XYZ.EXE.

A much simpler attack is to do this:

  • Attacker creates a directory, say, C:\Vulnerable.
  • copy pwned.exe C:\Vulnerable\XYZ.EXE
  • Convince a victim to run C:\Vulnerable\XYZ.EXE.

The rogue ABC.DLL is immaterial. All it does is crank up the degree of difficulty without changing the fundamental issue: If you can trick a user into running a program you control, then the user is pwned.

This is another case of if I can run an arbitrary program, then I can do arbitrary things, also known as MS07-052: Code execution results in code execution.

Note that the real copy of XYZ.EXE in the System32 directory is unaffected. The attack doesn't affect users who run the real copy. And since C:\Vulnerable isn't on the default PATH, the only way to get somebody to run the rogue copy is to trick them into running the wrong copy.

It's like saying that there's a security flaw in Anna Kournikova because people can create things that look like Anna Kournikova and trick victims into running them.

WCF: Delegation at Message Level security

MSDN Blogs - Wed, 05/27/2015 - 06:47

 

Basics:

Review this article to get familiar with basic settings needed for WCF delegation.

http://blogs.msdn.com/b/saurabs/archive/2012/08/28/wcf-learning-impersonation-and-delegation.aspx  

 

With basic message level security, delegation can be easy to set up, as indicated in the diagram below:

 

 

What about delegation between boxes with a load balancer in place?

 

 

Key points:

1. In a load balancer scenario, we have to be careful because WCF message security deals with "SecurityContextToken" (SCT) creation and usage.

 

Steps:

Request 1: http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue

Response: http://schemas.xmlsoap.org/ws/2005/02/trust/RSTR/Issue

Request 2: http://schemas.xmlsoap.org/ws/2005/02/trust/RST/SCT

Response: http://schemas.xmlsoap.org/ws/2005/02/trust/RSTR/SCT

Request 3: http://tempuri.org/IService1/GetData

Response: http://tempuri.org/IService1/GetDataResponse

 

Problem:

Because an SCT is created, we need to direct the actual call to the same server that issued the SCT to the client.

If the SCT issued by Server 1 is sent to Server 2 because of round-robin distribution of requests via the load balancer, we end up with a failure.

 

Approaches attempted and reason for failure:

1. To avoid SCT creation, we can set these two properties inside the message security tag. This gives us what we call a ONE SHOT PROXY:

a) EstablishSecurityContext

b) NegotiateServiceCredential

 

2. Message security with no negotiation requires setting NegotiateServiceCredential = false.
In the case of Windows credentials, setting this property to false causes authentication based on a KerberosToken. This requires that the client and service be part of a Kerberos domain. This mode is interoperable with SOAP stacks that implement the Kerberos token profile from OASIS. Setting this property to true causes a SOAP negotiation that tunnels the SPNego exchange over SOAP messages. This mode is not interoperable. Since we had set the value to true in the config file, we ended up validating the server identity and creating a unique message id.

When NegotiateServiceCredential is set to true and EstablishSecurityContext is false:

Step 1: http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue
Step 2: http://schemas.xmlsoap.org/ws/2005/02/trust/RSTR/Issue
Step 3: http://tempuri.org/IService1/GetData

When NegotiateServiceCredential is set to true along with EstablishSecurityContext set to true:

Step 1: http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue
Step 2: http://schemas.xmlsoap.org/ws/2005/02/trust/RSTR/Issue
Step 3: http://schemas.xmlsoap.org/ws/2005/02/trust/RST/SCT
Step 4: http://tempuri.org/IService1/GetData

Even when EstablishSecurityContext is false, we still end up getting a unique SCT token as part of the SSPI negotiation (with NegotiateServiceCredential set to true):

        <t:RequestedSecurityToken>
          <c:SecurityContextToken u:Id="uuid-570f65bc-7197-4d56-b388-61a445d317b4-1" xmlns:c="http://schemas.xmlsoap.org/ws/2005/02/sc">
            <c:Identifier>urn:uuid:20ac8eb5-73f3-46d9-ba43-8c7ea9e8c56b</c:Identifier>
          </c:SecurityContextToken>
        </t:RequestedSecurityToken>

The client ends up using the same context token it received as part of the SSPI negotiation. So the true solution is to set both EstablishSecurityContext and NegotiateServiceCredential to FALSE.

The limitation with this approach, however, is that it does not allow the client to set TokenImpersonationLevel to "Delegation", without which the client cannot get a delegatable token from the domain controller. The approach is fine if we only need to authenticate or impersonate the credentials to the next hop, with no requirement to do delegation.

Failure code:

public KerberosSecurityTokenProvider(string servicePrincipalName, TokenImpersonationLevel tokenImpersonationLevel, NetworkCredential networkCredential)
{
    if (servicePrincipalName == null)
        throw DiagnosticUtility.ExceptionUtility.ThrowHelperArgumentNull("servicePrincipalName");
    if (tokenImpersonationLevel != TokenImpersonationLevel.Identification && tokenImpersonationLevel != TokenImpersonationLevel.Impersonation)  // <-- Delegation is rejected here
    {
        throw DiagnosticUtility.ExceptionUtility.ThrowHelperError(new ArgumentOutOfRangeException("tokenImpersonationLevel",
            SR.GetString(SR.ImpersonationLevelNotSupported, tokenImpersonationLevel)));
    }

    this.servicePrincipalName = servicePrincipalName;
    this.tokenImpersonationLevel = tokenImpersonationLevel;
    this.networkCredential = networkCredential;
}

Stack trace:

Server stack trace:
   at System.IdentityModel.Selectors.KerberosSecurityTokenProvider..ctor(String servicePrincipalName, TokenImpersonationLevel tokenImpersonationLevel, NetworkCredential networkCredential)
   at System.ServiceModel.ClientCredentialsSecurityTokenManager.CreateSecurityTokenProvider(SecurityTokenRequirement tokenRequirement, Boolean disableInfoCard)
   at System.ServiceModel.ClientCredentialsSecurityTokenManager.CreateSecurityTokenProvider(SecurityTokenRequirement tokenRequirement)
   at System.ServiceModel.Security.SecurityProtocol.AddSupportingTokenProviders(SupportingTokenParameters supportingTokenParameters, Boolean isOptional, IList`1 providerSpecList)
   at System.ServiceModel.Security.SecurityProtocol.OnOpen(TimeSpan timeout)
   at System.ServiceModel.Security.WrapperSecurityCommunicationObject.OnOpen(TimeSpan timeout)
   at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
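To make the "both set to FALSE" configuration concrete, here is a minimal illustrative client-side sketch (not from the original post). It assumes the default IService1 contract with a GetData(int) operation, plus a placeholder address and SPN, and it notes where the Delegation limitation from the failure code above surfaces:

using System;
using System.Security.Principal;
using System.ServiceModel;

class OneShotProxyDemo
{
    static void Main()
    {
        // Message security over HTTP with Windows credentials,
        // no SSPI negotiation and no security context token (the "one shot proxy").
        var binding = new WSHttpBinding(SecurityMode.Message);
        binding.Security.Message.ClientCredentialType = MessageCredentialType.Windows;
        binding.Security.Message.NegotiateServiceCredential = false;
        binding.Security.Message.EstablishSecurityContext = false;

        var address = new EndpointAddress(
            new Uri("http://yourserver/Service1.svc"),              // placeholder endpoint
            EndpointIdentity.CreateSpnIdentity("host/yourserver")); // placeholder SPN

        var factory = new ChannelFactory<IService1>(binding, address);

        // With negotiation disabled, KerberosSecurityTokenProvider accepts only
        // Identification or Impersonation; Delegation throws at channel open
        // (see the failure code and stack trace above).
        factory.Credentials.Windows.AllowedImpersonationLevel =
            TokenImpersonationLevel.Impersonation;

        IService1 channel = factory.CreateChannel();
        Console.WriteLine(channel.GetData(42));
        ((IClientChannel)channel).Close();
    }
}

[ServiceContract]
public interface IService1
{
    [OperationContract]
    string GetData(int value);
}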

 

3. Another approach is to use the "KerberosOverTransport" security mode via a custom binding. But again, it does not allow the client to set the token impersonation level to Delegation.

4. If we move to the "TransportWithMessageCredential" security mode, an SSL handshake takes place and we still end up creating an SCT token.

Request 1: http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue

Response: http://schemas.xmlsoap.org/ws/2005/02/trust/RSTR/Issue

Request 2: http://tempuri.org/IService1/GetData

c:SecurityContextToken [ u:Id=uuid-fa4f216a-18cf-4204-b4a0-11a62fe765a4-1 xmlns:c=http://schemas.xmlsoap.org/ws/2005/02/sc ]

Response: http://tempuri.org/IService1/GetDataResponse

  

Solution:

1. Enable sticky sessions on the load balancer to make sure a given client is always routed to the same machine; then we can live with the default message security or even TransportWithMessageCredential.

2. Switch to pure transport security and Windows authentication.

Can I tell Narrator how I want a particular character to be pronounced?

MSDN Blogs - Wed, 05/27/2015 - 06:45

Someone recently raised the interesting point that they found the Narrator screen reader pronounces “3 + 1” as “three plus one” when interacting with their app, yet it pronounces “3 - 1” as “three to one”. They were curious as to whether they could tell Narrator to pronounce the ‘-’ as “minus”.

Today there is no way for the app developer to force Narrator to use some particular pronunciation for the text exposed by the app UI. If your app exposes the text as “3 - 1”, then Narrator will simply pass that text to the Text-To-Speech (TTS) engine being used by Narrator, and your customer will hear the text with whatever pronunciation the TTS engine wants to use.

So when the TTS engine encounters the ‘-’, maybe it’ll say “to”, “minus”, “dash” or nothing at all. And the TTS engine could say something quite different if the character is actually a ‘–’ instead of a ‘-’.

But depending on your situation, maybe there are a couple of things you could consider when pondering options for your own customers. I’ve described those options below. You may well feel that for your app, it’s appropriate to not take any special action, and go with the default Narrator experience.

Guy

 

Achieve the desired pronunciation by setting the accessible name on the element in your UI

If, say, you have a XAML element displaying the text “3 - 1”, it would often be quite straightforward to set an accessible name of “3 minus 1” on the element. Narrator would access that accessible name and pass that text to the TTS engine, which would speak it exactly as supplied. (Of course, “minus” would need to be localized in all the languages that your app ships in.)

In some cases this might be your best option, given that you have absolute control over what’s going to be spoken. And in fact when a button simply shows a symbolic text character, for example ‘?’, we’d always set an accessible name on that button in order to provide a helpful, consistent experience regardless of what TTS engine is currently being used.

But if a ‘-’ is part of some large text content being presented in your app, (say like the document content I’m writing here,) then you’re not going to be setting an explicit accessible name on all that content just to control how the ‘-’ is pronounced.

To learn more about setting accessible names on elements in your UI, see Giving your XAML element an accessible name.
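As an illustrative aside (not from the original post), the accessible name can also be set from code-behind via the AutomationProperties attached property. A minimal sketch, assuming a TextBlock named ExpressionTextBlock that visually shows “3 - 1”:

using Windows.UI.Xaml.Automation;
using Windows.UI.Xaml.Controls;

public sealed partial class MainPage : Page
{
    public MainPage()
    {
        this.InitializeComponent();

        // ExpressionTextBlock is assumed to be declared in MainPage.xaml with Text="3 - 1".
        // Narrator reads the automation name instead of the visible text, so the TTS engine
        // is handed "3 minus 1". (Remember to localize this string, as noted above.)
        AutomationProperties.SetName(ExpressionTextBlock, "3 minus 1");
    }
}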

 

Take advantage of common TTS behavior

If the TTS engine isn’t given instructions on how to pronounce some text, it’ll take its best guess. And how one TTS engine pronounces the text might be different to how another TTS engine pronounces the same text.

But there are some behaviors that are fairly common across TTS engines. For example, if a ‘-’ is preceded by a space, and followed immediately by a number then a space, the TTS engine will often pronounce the ‘-’ as “minus”. So if I type “3 -1 ” in Word 2013 here, and have Narrator speak that text, I hear “3 minus 1”.

So perhaps in some cases, this simple change could get the desired results for your customers who use Narrator. It does have the drawback that the visual text is also changing, and that might be unacceptable.

 

An example of how different TTS engines say different things

In some situations you might opt for using spaces or punctuation in your text to take advantage of common behaviors across TTS engines regarding pronunciation. But taking the approach of changing the text exposed by your UI because it triggers some response by the TTS engine that you happen to be using, can lead to unpredictable results for your customers who use different TTS engines. For example, while both the Microsoft David (US) and Microsoft Hazel (UK) TTS engines pronounce “mph” as “miles per hour”, David pronounces “st.” as “s t”, and Hazel pronounces it as ”street”.

And Hazel happily pronounces “HRH” as “his royal highness”. To get David to say that, you’d have to set the accessible name on the element containing the text. A straightforward way of setting such an accessible name would be as follows:

 

XAML:

   <TextBlock x:Uid="TTSDemoTextBlock" />

 

Localized string resource file:

    <data name="TTSDemoTextBlock.Text" xml:space="preserve">
        <value>HRH Bob</value>
        <comment>Demo string shown visually in the app on a TextBlock</comment>
    </data>
    <data name="TTSDemoTextBlock.[using:Windows.UI.Xaml.Automation]AutomationProperties.Name" xml:space="preserve">
        <value>His Royal Highness Bob</value>
        <comment>Localized accessible name for the demo TextBlock</comment>
    </data>

 

Now I hear Narrator say “His Royal Highness Bob” when reaching the TextBlock regardless of what TTS engine I’m using.

 

 

Figure 1: Narrator highlighting a TextBlock visually presenting the text “HRH Bob”.

 


Getting the most out of Office 365 with Lumia - King’s College London Case study

MSDN Blogs - Wed, 05/27/2015 - 05:30

The following case study comes from King’s College London, home to over 27,000 students, and one of the top 20 universities in the world. We recently visited King’s to let everyone know about free Office 365 for students and staff, and to help people get Office working on whatever device they had. This particular story focuses on Chief Information Officer Nick Leake’s quest to find a new mobile solution that would improve connectivity for academics and administrators, as well as being a natural fit with the existing Microsoft ecosystem at King’s.

Ultimately, they ended up going with the Lumia 635. Read on to find out why this was the right choice for the staff at King’s.

Getting the most out of Office 365 with Lumia - King’s College London Case study from Microsoft Education UK

Copying data from one SQL server table to another SQL Server table

MSDN Blogs - Wed, 05/27/2015 - 04:46

 

Hi Readers,

I had a requirement to copy data from a SQL Server located in the U.S. (Source Server) to a SQL Server located in India (Destination Server). The copy had to be triggered when data was inserted into the Source Server table.

I encountered various issues, ranging from linked server connection and collation issues to Distributed Transaction Coordinator issues.

Finally, I have collected all the steps that I followed in my project, along with the steps needed to handle these issues, and this blog post describes them. Feel free to drop any suggestions or comments for further enhancement of this blog.

  Scenario:-

Let us assume you have an ‘EmployeeDetails’ table present on both the Source (local SQL Server) and the Destination Server (‘MyDestinationSQLServer’). With every insert into the EmployeeDetails table on the local server, we want the records to be inserted into the destination server table as well.

 

Steps Followed:-

Broadly, the following five steps were followed:-

1. Create a local login on both the source and destination SQL Server (the same credentials must be used).

2. Create linked servers. Link your source server to the destination server.

3. Enable the Distributed Transaction Coordinator service and configure its properties.

4. Handle collation issues, if any.

5. Write the stored procedure and trigger for copying data from the source server to the destination server.

 

Detailed Steps:-

We will now look into each step in detail:-

 

1. Create a local login. This step should be followed on both source and destination machines.

a. Login to the SQL Server

b. Go to Server->Security->Logins->New Login (I have used ‘Admin’ and ‘P@ssw0rd’ as credentials.)

c. Give the user ‘sysadmin’ server role from Server Roles tab.

 

2. Create the linked server. You will need to link the source server to the destination server to copy data.

a. Run the script below from the source server (the local server in this blog) to create the linked server.

b. Make sure to update the destination server name and database name. (The names currently used are ‘MyDestinationSQLServer’ and ‘MyDatabase’.)

USE [master]
GO

EXEC master.dbo.sp_addlinkedserver @server = N'MyDestinationSQLServer', @srvproduct=N'', @provider=N'SQLNCLI', @datasrc=N'MyDestinationSQLServer', @catalog=N'MyDatabase'
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'MyDestinationSQLServer',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'MyDestinationSQLServer',@useself=N'False',@locallogin=N'admin',@rmtuser=N'admin',@rmtpassword='P@ssw0rd'
GO

EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'collation compatible', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'data access', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'dist', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'pub', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'rpc', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'rpc out', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'sub', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'connect timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'collation name', @optvalue=null
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'lazy schema validation', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'query timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'use remote collation', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'MyDestinationSQLServer', @optname=N'remote proc transaction promotion', @optvalue=N'true'
GO

c. After the linked server is created successfully, test the connection:

Go to Server -> Server Objects -> Linked Servers -> MyDestinationSQLServer -> Right Click -> Test Connection.

You will see the popup below when the linked server connection test is successful.

 

3. The Distributed Transaction Coordinator service should be running. This step should be followed on both source and destination server machines.

a. Go To Services.msc.

b. Start DTC Service

c. Set Startup Type to Automatic

 

4. Allow DTC through Windows Firewall. This step should be followed on both source and destination server machines.

a. Go to Control Panel -> Windows Firewall -> Allow an app or feature through Windows Firewall

b. Scroll down to DTC feature

c. Enable domain and private communication

 

5. Update local DTC security settings. This step should be followed in both source and destination server machines.

a. Go to Control Panel -> Administrative Tools -> Component Services

b. Expand Component Services -> Computers -> My Computer -> Distributed Transaction Coordinator -> Local DTC.

c. Right click -> Properties -> Security tab

d. Enable network DTC access, and allow inbound and outbound communication with no authentication required, as shown in the screenshot below:-

 

6. Resolve Collation issues.

In the actual production scenario, the columns of the source table and the destination table had different collations. The source and destination servers had different collations, as mentioned below:

Source Server Collation: SQL_Latin1_General_CP1_CI_AI

Destination Server Collation: Latin1_General_CI_AI

So while creating the stored procedure, it was failing on the WHERE clause's equality operator with the error below:-

Cannot resolve the collation conflict between "SQL_Latin1_General_CP1_CI_AS" and "Latin1_General_CI_AI" in the equal to operation.

To resolve the issue, ‘COLLATE DATABASE_DEFAULT’ has to be added to the specific columns being compared, as shown below:

INNER JOIN [Mydatabase].[dbo].EmployeeDetails SourceTable
    ON DestTable.EmpId COLLATE DATABASE_DEFAULT = SourceTable.EmpId COLLATE DATABASE_DEFAULT
WHERE SourceTable.EmpId = @SourceEmpId

7. Write the script to copy data from one server to another.

In this blog, we consider a simple example of an ‘EmployeeDetails’ table with 5 columns – Unique Id, Employee Id, Name, Address and Phone Number.

a. The script to create the table is given below:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[EmployeeDetails](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [EmpId] [nvarchar](50) NOT NULL,
    [Name] [nvarchar](50) NOT NULL,
    [Address] [nvarchar](50) NOT NULL,
    [Phone] [nvarchar](50) NOT NULL,
    CONSTRAINT [PK_EmployeeDetails] PRIMARY KEY CLUSTERED
    (
        [EmpId] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

b. Stored Procedure to copy data from Source Server table to Destination Server EmployeeDetails table.

The stored procedure below will give you a collation issue when run, because the collation of the EmpId column differed between the source and destination tables.

CREATE PROCEDURE [dbo].[InsertEmployeeDetails]
    @SourceEmpId nvarchar(10)
AS
    DECLARE @DestEmpExists int

    SELECT @DestEmpExists = COUNT(*)
    FROM [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails
    WHERE EmpId = @SourceEmpId

    IF @DestEmpExists = 0
    BEGIN
        PRINT 'Inserting Employee Information in Destination Employee Table'
        INSERT INTO [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails (EmpId, Name, Address, Phone)
        SELECT EmpId, Name, Address, Phone
        FROM [MyDatabase].[dbo].EmployeeDetails
        WHERE EmpId = @SourceEmpId
    END
    ELSE IF @DestEmpExists = 1
    BEGIN
        PRINT 'Updating Employee Information in Destination Employee Table'
        UPDATE [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails
        SET Name = SourceTable.Name, Address = SourceTable.Address, Phone = SourceTable.Phone
        FROM [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails DestTable
        INNER JOIN [MyDatabase].[dbo].EmployeeDetails SourceTable
            ON DestTable.EmpId = SourceTable.EmpId
        WHERE SourceTable.EmpId = @SourceEmpId
    END

Error received was

Msg 468, Level 16, State 9, Procedure InsertEmployeeDetails, Line 28

Cannot resolve the collation conflict between "SQL_Latin1_General_CP1_CI_AS" and "Latin1_General_CI_AI" in the equal to operation.

To resolve this issue, ‘COLLATE DATABASE_DEFAULT’ was added when comparing columns. Below is the updated stored procedure with collation issues fixed.

CREATE PROCEDURE [dbo].[InsertEmployeeDetails]
    @SourceEmpId nvarchar(10)
AS
    DECLARE @DestEmpExists int

    SELECT @DestEmpExists = COUNT(*)
    FROM [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails
    WHERE EmpId = @SourceEmpId

    IF @DestEmpExists = 0
    BEGIN
        PRINT 'Inserting Employee Information in Destination Employee Table'
        INSERT INTO [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails (EmpId, Name, Address, Phone)
        SELECT EmpId, Name, Address, Phone
        FROM [MyDatabase].[dbo].EmployeeDetails
        WHERE EmpId = @SourceEmpId
    END
    ELSE IF @DestEmpExists = 1
    BEGIN
        PRINT 'Updating Employee Information in Destination Employee Table'
        UPDATE [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails
        SET Name = SourceTable.Name, Address = SourceTable.Address, Phone = SourceTable.Phone
        FROM [MyDestinationSQLServer].[MyDatabase].[dbo].EmployeeDetails DestTable
        INNER JOIN [MyDatabase].[dbo].EmployeeDetails SourceTable
            ON DestTable.EmpId COLLATE DATABASE_DEFAULT = SourceTable.EmpId COLLATE DATABASE_DEFAULT
        WHERE SourceTable.EmpId = @SourceEmpId
    END

c. The trigger created on the EmployeeDetails table on the source server:

CREATE TRIGGER [dbo].[Trg_InsertEmployeeDetails]
ON [dbo].[EmployeeDetails]
AFTER INSERT, UPDATE
AS
BEGIN
    SET XACT_ABORT ON
    BEGIN DISTRIBUTED TRANSACTION

    --1. Declare variables
    DECLARE @InsertedEmpId nvarchar(50)

    --2. Fetch values from Table
    SELECT @InsertedEmpId = EmpId FROM inserted

    EXEC InsertEmployeeDetails @SourceEmpId = @InsertedEmpId

    COMMIT TRANSACTION
    SET XACT_ABORT OFF
END

Hope this will help you. Suggestions and feedback are welcome.

Happy Coding!

Surface Pro 3: Can Microsoft’s workhorse drag an Apple fan away? (Guest Post)

MSDN Blogs - Wed, 05/27/2015 - 04:30

Can a top-of-the-range Microsoft computer convert Apple fanboy San Sharma? Or will using a Surface Pro 3 prove to be a baptism of fire?

I queued for the first iPhone. I’ve buffered my way through almost every Apple keynote. I’m wearing an Apple Watch.  But when I saw Microsoft’s Surface Pro 3 in the cool, magnesium flesh, I was intrigued. How does this curious laptop-tablet hybrid compare to the MacBook Air?
Full disclaimer: Microsoft is a client of mine, and they gave me a Surface. Partial spoiler: This was about 6 months ago, and I’m still using the Surface. So… how have I got on?

The Wow Factor
When Steve Jobs unveiled the MacBook Air on stage in 2008, he pulled it out of a manila envelope. The Surface Pro 3 feels just as thin and light. With its detachable keyboard cover, kickstand and pen, there’s more than enough about it to catch the eye. In fact, I don’t think I’ve been in a meeting with it in the past 6 months without somebody commenting on it – and in a good way. “I really want one,” is what most people say.

The beauty of the Surface Pro 3, however, is more than skin deep. There are design touches here and there that suggest a real attention to detail – and care. The USB socket in the power brick so you can charge another device while your Surface is plugged in. The way the keyboard magnetically grips the screen for a more comfortable typing position. The kickstand that gives you a multitude of viewing angles.


Some limitations
The risk, however, when putting together a laptop-tablet hybrid, as Microsoft has done here, is that you end up with a device that’s neither a great laptop nor a great tablet. I don’t think that’s entirely true here, but there are some compromises. For example, as a laptop, the Surface Pro 3 doesn’t actually sit comfortably on your lap. The kickstand juts and the keyboard cover wobbles. As a tablet, it’s a little bit heavy to hold in one hand.

However, there have been moments in my 6 months of using the Surface Pro 3 that make up for these limitations – and, if you can see yourself in the following situations, the Surface may be the computer for you.


Magic moments
So, I was in a meeting, typing notes furiously, when I had an idea that I couldn’t quite express in words. In that moment, I pulled the keyboard off my Surface, grabbed my Surface Pen and, within seconds, I was sketching my idea by hand. It was magical.

More recently, I was sent a long Word document to read. Rather than hunch over my Surface at my desk, I again pulled off my keyboard, held the Surface in portrait mode and read the document on a couch like it was a magazine. It transformed that task into something that felt more natural – and, again, magical.

These moments speak to the Surface Pro 3’s real strengths – its ability to adapt to the way you work best. Of course, this is only really possible because of the software that powers the Surface Pro 3 – and, as a lifelong Apple customer, I can’t review a Windows computer without talking about Windows.

Looking through the Windows
As you can probably tell, I have no real hang-ups about the design of the Surface Pro 3. I like it. My only anxiety about accepting the Surface Pro 3 was about accepting Windows back into my life. I’d spent the past 10 years using Mac OS X as my main operating system. Could I relearn Windows?

The version of Windows that came with the Surface was Windows 8.1 – and, I must admit, it didn’t make my transition from Apple to Microsoft easy at all. It was too unfamiliar and geared too much towards tablet use, when I was mostly using the Surface as a laptop.

At the earliest opportunity, I joined the Windows Insider Programme (for free) and upgraded to the Windows 10 Technical Preview, which is basically a sneak peek at the future of Windows – and it is so much better. It’s in desktop mode by default, the Start menu is back, and it’s just so much better looking. If you’re switching from Mac to Windows, upgrade to Windows 10 at the earliest opportunity. You’ll feel far more at home.

For small businesses, the special sauce of Windows is really Office 365. Yes, you can get it on a Mac, but it’s at its best on Windows. Fully-featured Word, Excel and PowerPoint – plus, features like Skype for Business integration with Outlook – make even the smallest of businesses feel like the mightiest.

This is where Microsoft, versus Apple, really shines – and the Surface Pro 3 is the best embodiment of this idea: no-one cares more about your productivity than Microsoft. That unfortunately means, for me, the Surface isn’t quite as good as a tablet. But as a pure work horse? The Surface Pro 3 is hard to beat.

 

Find out more about the Surface Pro 3 here!

You can also learn about Office 365, and to get involved in the Windows Insider program and get your hands on Windows 10, sign up here.


San Sharma (@sansharma) is a writer and marketer, specialising in tech and business.

Visual Studio Enterprise 2015 is coming

MSDN Blogs - Wed, 05/27/2015 - 03:51

Two months ago we announced the editions of Visual Studio 2015 that will be available when we release the final product this summer. With our new on-premises / in-the-cloud ALM capabilities (with TFS and VSO), Visual Studio Community, and Visual Studio Code, it is clear that Visual Studio is the best suite of tools to support you when building and testing software solutions, no matter what platform you’re on, no matter what app you’re building.

As part of the product lineup, we are merging Visual Studio Premium and Visual Studio Ultimate into one single offering called Visual Studio Enterprise with MSDN. It includes all the high value features you’re already familiar with in Visual Studio Ultimate, along with new innovation that’s coming with the 2015 release.

Below is the best way to prepare for Visual Studio 2015 and leverage all the new capabilities for you and your team:

  • Download now Visual Studio 2015 RC
  • Learn more about Visual Studio 2015
  • Upgrade to Visual Studio Premium now
    • Upgrade from Visual Studio Professional with MSDN or Visual Studio Test Professional with MSDN to Visual Studio Premium with MSDN for 50% off the regular list price, and get a free upgrade to Visual Studio Enterprise with MSDN automatically when we release Visual Studio 2015. This exclusive offer is only available for 2 months, so talk to your reseller now to take advantage of this special offer before June 30th. *

 
Jihad
Visual Studio / ALM solution specialist
Microsoft Gulf
jihadda@microsoft.com
+971 52 908 7952

[de:code Newspaper #3 has been published]

MSDN Blogs - Wed, 05/27/2015 - 03:48
de:code 2015, held over two days, has come to a close. Thank you very much to everyone who attended the long sessions. We are delivering a session report on de:code 2015 as quickly as possible in the evening edition of de:code Newspaper #3. Please give it a read. We look forward to seeing you again next year. ...(read more)

Rename SSAS databases with Powershell

MSDN Blogs - Wed, 05/27/2015 - 02:42

We had a challenge to keep our SSAS cube always in a browsable state, as users are spread across the globe and we could hardly find a window to deploy our changes and process quickly. To make matters worse, the cube we have is very big, and on top of that it has to undergo enhancements at least once a week.

We evaluated a few options and settled on having a redundant cube database: deploy the changes to it, process it, and finally swap the SSAS databases. The approach seemed fine, and we were quickly able to host a second database and swap it with the original one using an XMLA script.

 

However, when we did the same thing a second time, we were stumped by errors because the DatabaseID property was no longer the same as the DatabaseName property. We tried to solve this problem using XMLA scripts, but it was tough to figure out the DatabaseID before swapping - XMLA accepts only the DatabaseID, not the DatabaseName. Below is the error message SSAS throws:

 

Executing the query ...

Errors in the metadata manager. The object ID cannot be changed from 'MyAnalyticsCube_2' to 'MyAnalyticsCube' for the 'MyAnalyticsCube_2' database. The object ID cannot be changed by an ALTER statement.

Execution complete

 

We finally moved to PowerShell, which solved this problem in no time.

Here is the PowerShell script (simplified version).

I have attached the PowerShell script for your convenience.
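The attached script itself is not reproduced in this post. As a rough illustration of the rename via AMO (the same object model the PowerShell script drives), here is a minimal C# sketch; the instance name is an assumption, and the database names match the error message above:

using Microsoft.AnalysisServices;  // AMO

class RenameSsasDatabase
{
    static void Main()
    {
        using (var server = new Server())
        {
            server.Connect("localhost");  // assumed SSAS instance name

            // Find the database by its Name (not its ID) and rename it.
            // The ID stays the same, which is exactly why an ALTER via XMLA fails.
            Database db = server.Databases.FindByName("MyAnalyticsCube_2");
            db.Name = "MyAnalyticsCube";
            db.Update();

            server.Disconnect();
        }
    }
}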

 

Office 365 Client APIs for Hybrid Mobile Apps

MSDN Blogs - Wed, 05/27/2015 - 02:07

[Original post] Office 365 Client APIs for Hybrid Mobile Apps

[Originally published] 2014-12-08 9:00 AM

The Visual Studio Tools for Apache Cordova let developers use standard web technologies to create mobile applications for the iOS, Android, and Windows operating systems. With the Office 365 API Tools for Visual Studio, developers can call the Office 365 APIs from their hybrid mobile apps to access a user's calendar, contacts, mail, and files from their Office 365 account, and easily create richer, more connected experiences.
Both the Tools for Apache Cordova and the Office 365 API Tools are extensions to Visual Studio 2013. Building a hybrid mobile app that queries data from a REST service is relatively difficult: you need to manage authentication, construct REST URIs dynamically, and handle errors and retries. This code is boilerplate, yet it is easy to get wrong. The Office 365 APIs handle this complexity for you, letting you focus on building your app. Beyond handling the complexity of REST calls, the client libraries also provide authentication and discovery APIs, access to My Files on OneDrive, and access to users and groups information.
For example, a hybrid mobile app can easily add calendar appointments directly to a user's Exchange calendar, send mail on the user's behalf, or directly access files the user has stored in OneDrive. With a few lines of code, users can authenticate and access their Office 365 account.

Below, you can see how to authenticate the user and create a client object that serves as the base for accessing all of the Exchange APIs. The getIdToken method prompts for a user name and password to authenticate against outlook.office365.com.

var authContext = new O365Auth.Context();

authContext.getIdToken('https://outlook.office365.com/')

.then(function(token) {

// Promise callback: Authentication succeeded

client = new Exchange.Client(

'https://outlook.office365.com/ews/odata',

token.getAccessTokenFn('https://outlook.office365.com')

);

});

Using the client object you constructed, you can access all of the messages in the Inbox.

// Use getFolder to access Inbox folder

client.me.folders.getFolder("Inbox").fetch()

.then(function(folder) {

// Retrieve all the messages

folder.messages.getMessages().fetch()

.then(function(mails) {

// mails.currentPage contains all the mails in Inbox

});

});

To get contacts, simply call getContacts:

client.me.contacts.getContacts().fetch()

.then(function(contacts) {

// contacts.currentPage contains the contacts information

});

Similarly, use the getEvents method to return calendar events. Here are some screenshots of the Woodgrove demo application, which uses the Office 365 APIs to authenticate, retrieve the user's calendar appointments, and display them in the app.

Before you start building hybrid mobile apps with the Office 365 APIs in Visual Studio 2013, you need to install the Tools for Apache Cordova preview and the Office 365 API Tools for Visual Studio 2013. You can find a short walkthrough on MSDN: Adding Office 365 services to your app. You can also follow the tutorial on how to use the SharePoint APIs to access OneDrive files.

Please try out these APIs and let us know your feedback. If you run into any issues or have questions, you can reach the product team directly through UserVoice, Twitter, Stack Overflow, or email.

Using Vorlon to debug Office Apps Javascript

MSDN Blogs - Wed, 05/27/2015 - 02:00

If you have been following Build 2015, you must have heard about the cool new JavaScript debugging tool released by Microsoft Open Source - VorlonJS. The steps to install Vorlon are really easy.

  1. Install Node.js on your system.
  2. Install Vorlon by running the following command in npm: "npm i -g vorlon"
  3. Run Vorlon with the command "vorlon".
Vorlon is especially useful in scenarios where an app supports devices of multiple sizes in different environments. Since Office apps (including SharePoint apps) can now run literally anywhere - Web, Windows, Android, iOS - it makes sense to have a tool which helps you quickly debug issues in all these environments without resorting to weird hacks. In this example, I am going to show you how easy it is to use Vorlon in a Word Task Pane app. 
  1. Include a reference to Vorlon in your webpage -  <script src="http://localhost:1337/vorlon.js"></script>
  2. Done!
Here are some screenshots (Click for original image size):      

Be Careful when using Table-Valued Parameter

MSDN Blogs - Wed, 05/27/2015 - 01:31

Recently we found two subtle traps when using Table-Valued Parameters, which can easily lead to unexpected behavior.

The first is that decimal data needs explicit precision/scale information; otherwise the decimal values are rounded as if they were integer (long) values. This is very easy to miss because no exception is thrown, and we can only find it by manually checking the data. In addition, because the ADO.NET DataTable class doesn't carry precision/scale information, you cannot directly pass the DataTable as the parameter. You may either use IEnumerable<SqlDataRecord> or use a DbDataReader created from the DataTable and reset its schema table. Refer to this article, which also discusses this issue.

The second is that the data is populated by position instead of by column name, even though you have column names defined in the DataTable/DbDataReader and column names defined in the table type. As there is no exception thrown, it is hard to realize that the data has been corrupted by an incorrect column order. Refer to this article, which also discusses this issue.
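To make both pitfalls concrete, here is a small illustrative C# sketch (not from the original post; the table type dbo.EmployeeTableType and the stored procedure dbo.InsertEmployees are hypothetical names):

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

static class TvpSample
{
    // Declare metadata explicitly: decimal precision/scale cannot come from a DataTable.
    // The column order here must match the column order of the table type exactly,
    // because TVP data is bound by position, not by name.
    static readonly SqlMetaData[] Schema =
    {
        new SqlMetaData("EmpId",  SqlDbType.Int),
        new SqlMetaData("Salary", SqlDbType.Decimal, 18, 4)   // precision 18, scale 4
    };

    static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<(int empId, decimal salary)> rows)
    {
        foreach (var row in rows)
        {
            var record = new SqlDataRecord(Schema);
            record.SetInt32(0, row.empId);
            record.SetDecimal(1, row.salary);    // keeps the fractional part
            yield return record;
        }
    }

    public static void Insert(SqlConnection conn, IEnumerable<(int, decimal)> rows)
    {
        using (var cmd = new SqlCommand("dbo.InsertEmployees", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            var p = cmd.Parameters.Add("@Employees", SqlDbType.Structured);
            p.TypeName = "dbo.EmployeeTableType";   // hypothetical table type
            p.Value = ToRecords(rows);
            cmd.ExecuteNonQuery();
        }
    }
}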

What to expect from the Microsoft Showcase Classroom Regional Roadshow [VIDEO]

MSDN Blogs - Wed, 05/27/2015 - 00:30

There are just five dates left on the Microsoft Showcase Classroom Regional Roadshow – the next being at our London HQ in Victoria – and to show you what you could be experiencing, here is a video illustrating what teachers and school leaders can get out of a session learning all about Windows 8 devices:

Microsoft Tour MOVIE v1 from Born on Vimeo.

Run by teachers through Tablet Academy, the sessions ensure that the technical information and advice is delivered with education in mind, by people who understand how to get the most out of these devices with the children in the classroom.

Individual venues for each stop on the Regional Roadshow were purposefully chosen so that they have a very strong educational input, and are the sorts of places where children would use a tablet while on a trip. Again, this allows the teachers to gain a real practical understanding of how the devices might be used in education, enabling them to make a much more informed choice when it comes to purchasing new technology for their schools.

"It's a big decision, so we wanted to get it right. And the more we can look at what opportunities there are and what capabilities there are, the better decision we can make." – Mark Gibbons, headteacher, Windmill Primary School

The remaining dates on the Tablet Academy Win 8 Education tour are as follows:

  • 8th June 2015 – Microsoft London HQ, London (in the Microsoft Showcase Classroom)
  • 15th June 2015 – The Eden Project, Cornwall
  • 22nd June 2015 – The Great Hospital, Norwich
  • 8th July 2015 – Blackpool Zoo, Blackpool

To register for any of the above dates (all dates have AM and PM sessions), please visit Tablet Academy.

Some venues will be offering tour delegates free entry or activities either side of the sessions. Further details will be provided on your tickets.

Price: £105.00 (+VAT) including 8" Windows 8 Pro Tablet.


Client Certificate Authentication

MSDN Blogs - Wed, 05/27/2015 - 00:04

SSL/TLS certificates are commonly used for both encryption and identification of the parties. In this blog post, I'll be describing Client Certificate Authentication in brief.

Client Certificate Authentication is a mutual certificate based authentication, where the client provides its Client Certificate to the Server to prove its identity. This happens as a part of the SSL Handshake (it is optional).

Before we proceed further, we need to understand

  • What is a client certificate?
  • What is authentication & why do we need it?

Client Certificates

Client Certificate is a digital certificate that conforms to the X.509 standard. It is used by client systems to prove their identity to the remote server. Here is a simple way to identify whether a certificate is a client certificate or not:

  • In the Details tab, the certificates intended purpose has the following text:
    "Proves your identity to a remote computer"
  • Verify that the Enhanced Key Usage field of the certificate has the OID set to (1.3.6.1.5.5.7.3.2).

Below is a screenshot of a sample Client Certificate:

For more details, refer to RFC 5246.
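As a small illustrative C# sketch (not part of the original post), the identification steps above can be checked programmatically: look for the Client Authentication EKU and, per the note later in this post, the presence of a private key.

using System.Security.Cryptography.X509Certificates;

static class ClientCertCheck
{
    // OID 1.3.6.1.5.5.7.3.2 = Client Authentication
    const string ClientAuthOid = "1.3.6.1.5.5.7.3.2";

    public static bool IsUsableClientCertificate(X509Certificate2 cert)
    {
        // A client certificate without a private key is ignored during the handshake.
        if (!cert.HasPrivateKey)
            return false;

        foreach (X509Extension ext in cert.Extensions)
        {
            if (ext is X509EnhancedKeyUsageExtension eku)
            {
                foreach (var oid in eku.EnhancedKeyUsages)
                {
                    if (oid.Value == ClientAuthOid)
                        return true;
                }
            }
        }
        return false;
    }
}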

In computer science, authentication is a mechanism used to prove the identity of the parties involved in a communication. It verifies that "you are who you say you are". It is not to be confused with authorization, which verifies that "you are permitted to do what you are trying to do".

There are several types of authentication. Here is a list of authentication widely used on IIS (in no specific order):

  • Anonymous Authentication (No Authentication)
  • Basic Authentication
  • Client Certificate Authentication
  • Digest Authentication
  • Forms Authentication
  • NTLM
  • Kerberos
  • Smart Card Authentication

NOTE: As the SSL handshake happens before any HTTP communication, Client Certificate Authentication takes precedence over any other type of authentication that takes place over the HTTP protocol.

Kerberos, Client Certificate Authentication and Smart Card Authentication are examples for mutual authentication mechanisms. Authentication is typically used for access control, where you want to restrict the access to known users. Authorization on the other hand is used to determine the access level/privileges granted to the users.

On Windows, a thread is the basic unit of execution. Any task performed by the user is executed by the thread under the context of a specific account/identity. Authentication is one of the ways used to determine the thread identity, whose privileges will be used by the thread for execution.

Client Certificate Authentication in SSL/TLS Handshake

I have already discussed SSL Handshake in one of my blog posts. Browse to:
http://blogs.msdn.com/b/kaushal/archive/2013/08/03/ssl-handshake-and-https-bindings-on-iis.aspx

Here is a screenshot describing the SSL/TLS Handshake:

  • Client sends CLIENT HELLO as described in the above image
  • Upon receiving the CLIENT HELLO, if the server is configured for Client Certificate Authentication, it will send a list of Distinguished CA names and a Client Certificate Request to the client as part of the SERVER HELLO, apart from the other details depicted above.
  • Upon receiving the Server Hello containing the Client Certificate request & list of Distinguished CA names, the client will perform the following steps:
    • The client uses the CA list available in the SERVER HELLO to determine the mutually trusted CA certificates.
    • The client will then determine the Client Certificates that have been issued by the mutually trusted Certification Authorities.
    • The client will then present the client certificate list to the user so that they can select a certificate to be sent to the server.

NOTE:

  • On the Client the Client Certificates must have a Private Key. If absent, then the certificate is ignored.
  • If the server doesn't provide the list of Distinguished CA Names in the SERVER HELLO, then the client will present the user with all the client certificates that it has access to.
  • Upon selection, the client responds with:
    • ClientKeyExchange message which contains the Pre-master secret
    • Certificate message which contains the Client certificate (Doesn't contain the private key).
    • CertificateVerify message, which is used to provide explicit verification of a client certificate. This message is sent only if the Client Certificate message was sent. The client is authenticated by using its private key to sign a hash of all the messages up to this point. The recipient verifies the signature using the public key of the signer, thus ensuring it was signed with the client's private key. Refer RFC 5246 for more details.
  • After this, the client and server use the random numbers and the pre-master secret to generate the symmetric (or master) keys, which will be used for encrypting and decrypting messages in further communication.
  • Both respond with ChangeCipherSpec indicating that they have finished the process.
  • The SSL handshake is now complete, and both parties own a copy of the master key, which can be used for encryption and decryption.
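As a rough illustration of the client-certificate request step at the .NET API level (this is generic SslStream usage for the sketch only, not how IIS itself is implemented), here is a minimal example; the port and certificate file are placeholders:

using System.Net.Security;
using System.Net.Sockets;
using System.Security.Authentication;
using System.Security.Cryptography.X509Certificates;

class MutualTlsServerSketch
{
    static void Main()
    {
        var listener = new TcpListener(System.Net.IPAddress.Any, 8443);   // placeholder port
        listener.Start();

        using (TcpClient client = listener.AcceptTcpClient())
        using (var ssl = new SslStream(client.GetStream(), false,
            // Validate the client certificate chain; reject on any policy error.
            (sender, certificate, chain, errors) => errors == SslPolicyErrors.None))
        {
            var serverCert = new X509Certificate2("server.pfx", "password");   // placeholder certificate

            // clientCertificateRequired: true makes the server send a Certificate Request,
            // which is what triggers the client-certificate part of the handshake.
            ssl.AuthenticateAsServer(serverCert,
                clientCertificateRequired: true,
                enabledSslProtocols: SslProtocols.Tls12,
                checkCertificateRevocation: true);

            // If the client supplied one, the negotiated client certificate is available here.
            X509Certificate clientCert = ssl.RemoteCertificate;
        }
    }
}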

Design Problems

We know that the server sends the list of Distinguished CA names as part of the SERVER HELLO. The RFC never mandates whether the list of Distinguished CA Names should contain Root CA or Intermediate CA certificates. Here is a snippet of this section as defined in RFC 5246:

certificate_authorities

A list of the distinguished names [X501] of acceptable
certificate_authorities, represented in DER-encoded format. These
distinguished names may specify a desired distinguished name for a
root CA or for a subordinate CA; thus, this message can be used to
describe known roots as well as a desired authorization space. If
the certificate_authorities list is empty, then the client MAY
send any certificate of the appropriate ClientCertificateType,
unless there is some external arrangement to the contrary

Refer the below blog post for information on Root & Intermediate CA certificates:
http://blogs.msdn.com/b/kaushal/archive/2013/01/10/self-signed-root-ca-and-intermediate-ca-certificates.aspx

This can lead to a problem where some systems expect Root CAs while others expect Intermediate CAs to be present in the list sent in the SERVER HELLO. This makes the communicating parties incompatible on certain occasions.

Both implementations are debatable. On one hand, the list sent by the server cannot exceed a certain limit (on Windows, the size is 12,228 bytes); if exceeded, the authentication will fail. The list of Intermediate CAs is usually 2-3 times larger than the list of Root CAs, or even more, which is one of the reasons why some systems send the Root CAs in the list of Distinguished CA Names. On the other hand, the Intermediate CA names are readily available in the client certificate provided by the user, which makes certificate chain validation easier, so some systems prefer that approach instead. Both have their own merits.

One example I have personally encountered is Apple's Safari browser communicating with a site hosted on IIS 7 or higher that requires a client certificate for authentication. Safari expects a list of Intermediate CAs in the SERVER HELLO. On the other hand, IIS sends only Root CAs in that list. As a result, the authentication fails because the client is unable to provide a client certificate to the server.

A solution to the above problem is to configure IIS to not send the CA list in the SERVER HELLO. To achieve this, follow Method 3 described in the support article below:
https://support.microsoft.com/en-us/kb/933430/

The above article requires you to add a registry key, SendTrustedIssuerList, which is set to 0.

As a result, the server doesn't send any list to the client but still requires it to present a client certificate. The client will display the complete list of client certificates to choose from, and the handshake will proceed as expected.
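As an illustrative aside (not from the original article), the same setting could also be applied programmatically. The SCHANNEL key path below is my assumption of the location described in KB 933430, so verify it against the article before using it; the change requires administrative rights and may need a restart to take effect.

using Microsoft.Win32;

class DisableTrustedIssuerList
{
    // Assumed location of the Schannel settings, per KB 933430 (Method 3).
    const string SchannelKey =
        @"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL";

    static void Main()
    {
        // 0 = do not send the list of trusted issuers (Distinguished CA names)
        // in the SERVER HELLO; the client then offers all of its client certificates.
        Registry.SetValue(SchannelKey, "SendTrustedIssuerList", 0, RegistryValueKind.DWord);
    }
}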

Dynamics CRM Online 2015 Update 1: Enhanced Yammer Integration

MSDN Blogs - Tue, 05/26/2015 - 23:30

Hello everyone.

By integrating Dynamics CRM with Yammer, information shared on Yammer can be
displayed inside Dynamics CRM. In Dynamics CRM Online 2015 Update 1,
the Yammer integration has been enhanced even further.

Specifying the default group

When you start the Yammer integration, there is an option to specify a default Yammer group.
Previously the candidates were shown in a drop-down list, which tended to become long;
this is now a text box, making it easier to work with.

Larger feed area

The area of the dashboard that listed the objects you follow has been removed,
so the feed area is now larger and easier to read.

Configuration steps

Let's integrate Dynamics CRM Online with an existing Yammer network.

1. Log in to Dynamics CRM Online.

2. Click Settings | Administration | Yammer Configuration.

3. Review the disclaimer and click Continue.

4. Authorize the connection to Yammer.

5. A screen asking you to confirm the connection is displayed; allow it.

6. Confirm that the name of your Yammer network is set under Yammer Network.

7. Specify the Yammer group that posts will be made to by default.

This completes the configuration.

Verifying the behavior

1. Log in to Dynamics CRM Online.

2. Click Sales | Dashboards | Sales Activity Social Dashboard.

3. The first time only, you need to sign in with your Yammer account, so click to sign in and allow the connection to Yammer.

4. Once the connection succeeds, you can post to Yammer from the dashboard.
Confirm that the default group used when posting is the one you configured.

Summary

Yammer is an effective tool for sharing information within a company. By integrating it with
Dynamics CRM Online, you can expect internal information sharing to become even stronger.
Please take advantage of the Yammer integration!

- Premier Field Engineering, 河野 高也

Problem in Cumulative Update 19 for Microsoft Dynamics NAV 2013 R2

MSDN Blogs - Tue, 05/26/2015 - 23:30

Problem

Cumulative Update 19 for Microsoft Dynamics NAV 2013 R2 has a problematic change that will affect your implementation of the cumulative update.

The problem is in table 114, Sales Cr. Memo Header, where the number of field 1300, Canceled, is changed to 1301. A typical partner development license does not allow field changes, so you will run into the problem when you try to import CU 19.

 

Resolution

There is no good resolution at the moment (we are working on it), so here are two different ways to work around the problem.

Workaround 1: Rename the old field and merge the new field. Write code to copy values from the old field to the new field. See the following code snippet as an example.

// For the posted sales credit memos where the renamed old field is TRUE,
// set the new Canceled field and then clear the old field.
SalesCrMemoHeader.SETRANGE("Canceled 2",TRUE);
SalesCrMemoHeader.MODIFYALL(Canceled,TRUE);
SalesCrMemoHeader.MODIFYALL("Canceled 2",FALSE);

Where "Canceled 2" is the renamed field and SalesCrMemoHeader is a record variable for table 114.

 

Note:

You will not be able to remove the old field.

In table 112, Sales Invoice Header, field 1301, Canceled, is a FlowField that uses field 1301 in table 114 in its CalcFormula. Because table 112 is not part of CU 19, your existing table 112 still points to field 1300 in table 114, and compiling table 112 will fail if that field no longer exists.

 

Workaround 2: Import table 114 as a .fob file. (The partner development license does not allow deletion of an existing field if you import the table as a .txt file.)

  1. Save the data in custom fields.
  2. Import table 114 as a .fob file. (Note: This will remove the custom fields.)
  3. Re-apply any customizations that you may have lost.
  4. Move the data back.

 

Next Step

There will be a hotfix available to resolve this issue soon. We will link to the hotfix from the current KB articles and blog posts, and we will create a new announcement when it is ready.

 

Sorry for the inconvenience.

The Dynamics NAV team

Intel® SCS Add-on 2.1 and SC2012 R2 ConfigMgr Integration (RCS Database mode) - Part 6

MSDN Blogs - Tue, 05/26/2015 - 22:08
Intel AMT Deployment

This part will cover the procedure to provision Intel AMT computers. As a prerequisite, we strongly recommend executing the following three tasks on each client computer. To obtain these modules, please contact the vendor of the client computers.

  • Update BIOS
  • Intel Management Engine Interface (Intel MEI) installation
  • Local Manageability Service (LMS.exe) installation

For more details about the prerequisites, please refer to section 2.2, Supported Intel AMT Versions, of the Intel(R)_SCS_User_Guide.pdf document included in the Intel SCS for Microsoft System Center Configuration Manager package.

Firewall configuration

1. On the client computer, open the [Windows Firewall with Advanced Security] console. Under [Inbound Rules], create a [New Rule].
2. Select [Port] and click [Next].
3. Select [TCP], select [Specific local ports] and type "16993" and "16995". Click [Next].


4. Select [Allow the connection] and click [Next].
5. Check [Domain], [Private] and [Public], and click [Next].
6. Type a name for the new rule and click [Finish].
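If you need to apply this rule to many client computers, the same inbound rule can be created from an elevated command prompt. This is a minimal sketch; the rule name is arbitrary:

netsh advfirewall firewall add rule name="Intel AMT (TCP 16993,16995)" dir=in action=allow protocol=TCP localport=16993,16995 profile=domain,private,public

This mirrors the settings chosen in the wizard above: an inbound rule allowing TCP ports 16993 and 16995 on the Domain, Private and Public profiles.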

Adding Enterprise Root CA certificate thumbprint into AMT computers

≪Warning≫
Please refer to section 2.2.1 for guidance on choosing which type of certificate to use for AMT provisioning. If you are using a public certificate, this section can be skipped.

1. Select the Enterprise Root CA certificate and open its properties.
2. On the [Details] tab, select [Thumbprint] and note the thumbprint value. (A command-line alternative for retrieving this value is sketched after this list.)


3. Turn on an Intel AMT computer and press <Ctrl+P> during boot to enter the Intel MEBx (Management Engine BIOS Extension) console. The default password is "admin".


※ The way to access the Intel MEBx console might differ on some computer hardware.
4. Select [Intel AMT Configuration]


5. Select [Remote Setup And Configuration]


6. Select [TLS PKI]


7. Select [Manage Hashes], then press the <Ins> key to add a new entry.


8. Enter the hash name "Contoso Root CA".


9. To the question [SHA1?], type “Y”.


10. Type the hash (thumbprint) noted in step 2.


11. Set the newly added hash to "Active" status and exit the Intel MEBx, saving the changes.
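As mentioned in step 2, the thumbprint can also be read from the command line instead of the certificate properties dialog. This is a sketch that assumes the enterprise root certificate has been published to the local machine's Trusted Root Certification Authorities store and that "Contoso Root CA" is (part of) its subject name:

certutil -store Root "Contoso Root CA"

The "Cert Hash(sha1)" line in the output is the SHA-1 thumbprint entered in step 10.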

Enable Intel SCS Platform Discovery task sequence

1. From [Software Library]-[Overview]-[Operating Systems]-[Task Sequence], right-click on [Intel SCS: Platform Discovery] task sequence and [Enable] it.


2. Click [OK] on the dialog box.
3. On the client computer, run [Start]-[All Programs]-[Microsoft System Center 2012 R2]-[Configuration Manager]-[Software Center].  Verify that [Intel SCS: Platform Discovery] task sequence has ended successfully.

4. From [Assets and Compliance]-[Overview]-[Device Collections], right-click on [Intel AMT: Exists] and click on [Update membership].


5. Click [OK] to the warning dialog box.
6. Verify that membership of [Intel AMT: Exists] collection has been updated.

Enable Intel AMT Discovery and Report task sequence

1. From [Software Library]-[Overview]-[Operating Systems]-[Task Sequence], right-click on [Intel AMT: Discovery and Report] task sequence and [Enable] it.


2. Click [OK] on the dialog box.
3. On the client computer, run [Start]-[All Programs]-[Microsoft System Center 2012 R2]-[Configuration Manager]-[Software Center].  Verify that [Intel AMT: Discovery and Report] task sequence has ended successfully.


4. From [Assets and Compliance]-[Overview]-[Device Collections], right-click on [Intel AMT: Not Configured] and click on [Update membership].


5. Click [OK] to the warning dialog box.
6. Verify that membership of [Intel AMT: Not Configured] collection has been updated.

Enable Intel AMT Remote Configuration task sequence

1. From [Software Library]-[Overview]-[Operating Systems]-[Task Sequence], right-click on [Intel AMT: Remote Configuration] task sequence and [Enable] it.


2. Click [OK] on the dialog box.
3. On the client computer, run [Start]-[All Programs]-[Microsoft System Center 2012 R2]-[Configuration Manager]-[Software Center].  Verify that [Intel AMT: Remote Configuration] task sequence has ended successfully.


4. From [Assets and Compliance]-[Overview]-[Device Collections], right-click on [Intel AMT: Configured] and click on [Update membership].


5. Click [OK] to the warning dialog box.
6. Verify that membership of [Intel AMT: Configured] collection has been updated.

Enable Intel AMT Remote Maintenance task sequence

1. From [Software Library]-[Overview]-[Operating Systems]-[Task Sequence], right-click on [Intel AMT: Remote Maintenance] task sequence and [Enable] it.


2. Click [OK] on the dialog box.

AMT Status discovery

1. From [Assets and Compliance]-[Overview]-[Device Collections], double-click on [Intel AMT: Configured] collection. From the computer list, right-click on a computer and click on [Manage Out of Band]-[Discover AMT Status].


2. Click [OK] to the dialog box
3. Select [Intel AMT: Configured] collection and add [AMT Status] and [AMT Version] columns by right-clicking on the column name bar.


4. Verify [AMT Status] and [AMT Version]

Intel AMT provisioning is then complete.
