
Feed aggregator

New requirement when updating Team Services extensions on the Marketplace

MSDN Blogs - 1 hour 9 min ago

If you develop extensions for Visual Studio Team Services or Team Foundation Server, there is a new requirement during publishing that you should be aware of: when updating an extension on the Marketplace, the updated extension’s version number must be greater than the published extension’s version number. To say it another way, you must increment the version of your extension every time you update it. As an example, if your extension is currently published to the Marketplace at version 0.9.0, attempting to upload version 0.8.0 will fail. In TFX this will be reflected with a message similar to this:

Version number must increase each time an extension is published. Extension: fabrikam.my-extension Current version: 0.9.0 Updated version: 0.8.0

This change only impacts Team Services extensions, not Visual Studio Code extensions.

Why the change

Up until now, the Marketplace did not enforce that an extension’s version number be incremented on update. Although this made life easier during development, it resulted in problems downstream. For example, different variations of an extension with the same version number make it difficult for developers to correlate problems back to the appropriate version of the source code. It also makes it difficult for Team Foundation Server 2015 users to definitively know what version of an extension they are actually running. This change is also necessary to support some upcoming features (stay tuned).

Tools to help

If you already increment your extension’s version number before you publish an update, good job – keep doing what you’re doing! If not, or if you don’t want to manage your extension’s version manually, there are a few tooling options to help.

TFX

The latest version of TFX (0.3.26 and later) introduces a new argument to increment the last segment of the extension’s version number:

tfx extension create --rev-version

This reads the current version from the manifest file, increments the value in the third segment of the version (the patch segment), writes the new version to the local file, and then creates the package.

As an example, assume the extension manifest file currently has a version of 0.4.10. After running TFX with the --rev-version argument, the version in the local manifest file and resulting .vsix file will be 0.4.11.
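For reference, here is a rough sketch of the flow, assuming the default vss-extension.json manifest file name (your project may use a different name):

# vss-extension.json currently contains:  "version": "0.4.10"
tfx extension create --rev-version
# vss-extension.json (and the generated .vsix) now contain:  "version": "0.4.11"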

For more information, see TFX extension commands.

Build and Release Tasks for Extensions

A few months ago, we (with help from the ALM Rangers) released a set of build and release tasks that help extension developers package, publish, and manage their Team Services and Team Foundation Server extensions. Of course, these tasks were packaged as an extension and made available on the Visual Studio Marketplace.

One of the tasks, Query Extension Version, queries the Marketplace to get the current version of a specified extension. The task then increments the specified segment (MAJOR.MINOR.PATCH) of the extension’s version. This new version is stored as a variable that can be used in a later Package or Publish task to override the version from the manifest file.

As an example, Team Calendar is a public extension maintained by us (Microsoft) under our DevLabs publisher. We use the build and release extension tasks in a Team Services-hosted CI build process to package the Team Calendar extension when updates are pushed to its public GitHub repository.

A “query extension version” step gets the current version of the published extension, increments the minor version segment, and saves the new value in a build variable called Extension.NewVersion:

A later “package extension” step uses the value from the Extension.NewVersion build variable when creating the .vsix package (no change is made to the source manifest file):

The end result is a compiled .vsix file, with a version that is greater than the version currently published.

We know that any change to a day to day process can be tough, but this is an important and necessary change. We welcome your feedback on this.

Will Smythe, Program Manager at Microsoft

Experiencing Data Access Issue in Azure Portal for Availability Data Type – 08/24 – Resolved

MSDN Blogs - 3 hours 5 min ago
Final Update: Wednesday, 24 August 2016 23:07 UTC

Between 08/24 5:55 PM UTC and 08/24 9:15 PM UTC, up to 15% of customers may have experienced errors accessing analytics, search and metrics data due to an issue in one of our back end components. We understand the root cause of the issue to be a misconfiguration that slipped past our validation steps.
  • Root Cause: The failure was due to an automation error during a planned change which resulted in a misconfiguration.
  • Lessons Learned: We are investigating additional checks and balances during our automated change process to maintain code velocity while improving availability and reliability.
  • Incident Timeline: 3 Hours & 20 minutes – 08/24 6:50 PM UTC through 08/24 9:10 PM UTC

We understand that customers rely on Application Insights as a critical service and apologize for any impact this incident caused.

-Sapna


Upcoming Changes to How You Log into Visual Studio Team Services

MSDN Blogs - 3 hours 33 min ago

In order to make it easier for you to sign into Visual Studio Team Services (VSTS), we will soon be updating the steps that you will take when you log into your account. After we’ve made these changes, you will see some new login screens when you connect to the service. If you’re using Azure Active Directory (AAD) or Office 365 (O365), these screens will already look familiar to you, since we’re aligning more closely with the AAD & O365 login processes as a part of this change.

For more information on this update, please follow the blog here: https://blogs.msdn.microsoft.com/visualstudioalm/2016/08/24/upcoming-changes-to-how-you-log-into-visual-studio-team-services/ 

Azure Import Export Service generally available in Azure Government

MSDN Blogs - 3 hours 50 min ago

We are excited to announce the general availability of the Azure Import/Export Service in Microsoft Azure Government. The Import/Export service allows migration of large amounts of data in and out of Azure blob storage by shipping hard disk drives directly to the datacenter. This service is suitable in situations where you want to transfer several TBs of data in or out of Azure Storage, but uploading or downloading over the network is not feasible due to limited bandwidth or high network costs. Some scenarios where this service can be used are data seeding, content distribution, recurring data update, offsite backup, disaster recovery.

Benefits of using Azure Import Export

  • Fast: We recommend using Azure Import/Export if loading data over the network would take 7 days or more. Shipping disks directly to the data center can save weeks or more off of network transfer time.
  • Secure: Data is secured by Bitlocker encryption. The keys are securely uploaded using SSL REST-API and do not travel along with the disk.
  • Reliable: The client tool has internal checksum logic to maintain data integrity. Verbose logging is available directly in customer storage accounts, making this process highly reliable.
  • Azure Backup Offline Seeding: Azure Import/Export Service for Azure Government will enable government customers to seed initial backups to Azure Backup service.

Note: While all import/export functionality is available, we currently only support the REST API interface for creation and management of import/export jobs in Azure Government. The Portal experience for Import/Export jobs will come to the new portal later this year. See below for details and samples for getting started with import jobs via the REST API.

Support

We are eager to help you achieve your goals. Feel free to contact us if you are facing any difficulty using the service, tool or REST API.

Additional Resources:

Generating SSH keys for Azure Linux VMs

MSDN Blogs - 4 hours 25 min ago

When creating new Azure Linux virtual machines, it is recommended you use SSH keys to connect to the VM rather than a username/password combination. Creating these keys is simple using Bash.

Bash is native on Linux and Mac OS X, but clients are available on Windows as well. Git for Windows comes with the Git Bash client. Otherwise, if you are running Windows 10 with the Anniversary Update, you can use the Windows Ubuntu Bash client. This demo was done using Git Bash, but the steps should be the same in other Bash clients.

From the Bash command prompt, run ssh-keygen with type ‘rsa’. You will be prompted for a file name and a passphrase which you can leave blank if you want. This will create the public and private RSA files.

ssh-keygen -t rsa
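The interactive run will look roughly like the following sketch (paths, prompts, and the key fingerprint will differ on your machine):

$ ssh-keygen -t rsa
Generating public/private rsa key pair.
Enter file in which to save the key (/home/user/.ssh/id_rsa): mynewkey_rsa
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in mynewkey_rsa.
Your public key has been saved in mynewkey_rsa.pub.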

In the example above, two files were created:

  • mynewkey_rsa – this is the private key you use when connecting to the VM
  • mynewkey_rsa.pub – this is the public key you will use when creating the VM

The file you want when creating the VM is the public key. To display the contents of this file, you can enter the following in the bash prompt:

cat mynewkey_rsa.pub

This will display the contents of the public key file, which you can copy and then paste into the SSH public key section when creating the VM.

Once the VM is created, you should be able to ssh to it in your Bash shell by using your private key and the username/IP address of the VM you created.

ssh -i mynewkey_rsa myadminuser@13.92.100.189
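If you connect to the VM often, an entry in ~/.ssh/config saves retyping the key path and address. A minimal sketch, reusing the example values from this post (the host alias is arbitrary):

Host myazurevm
    HostName 13.92.100.189
    User myadminuser
    IdentityFile ~/mynewkey_rsa

After that, ssh myazurevm is all you need.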

I hope this shows how easy it is to get up and running using SSH to connect to Azure Linux VMs.

Currency format in POS (LATAM)

MSDN Blogs - 4 hours 43 min ago

We were recently informed of a behavior where the currency format was inverted in the point of sale; that is, the decimal separator was a comma (,) and the digit grouping symbol was a period (.).

 

To resolve the issue, you need to change the Windows regional settings: Control Panel | Clock, Language, and Region | Region | click Additional Settings, and on the Numbers and Currency tabs configure the required format and apply the changes. (Image attached for LATAM.)
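If you prefer to script the change rather than use Control Panel, a sketch along these lines should work for the current user, assuming the standard sDecimal/sThousand values (and their currency counterparts) under HKCU:\Control Panel\International; sign out and back in, or restart the POS client, for the change to take effect:

# Set the decimal separator to a period and the digit grouping symbol to a comma
Set-ItemProperty -Path 'HKCU:\Control Panel\International' -Name sDecimal -Value '.'
Set-ItemProperty -Path 'HKCU:\Control Panel\International' -Name sThousand -Value ','
# Currency-specific separators, in case they also need to be adjusted
Set-ItemProperty -Path 'HKCU:\Control Panel\International' -Name sMonDecimalSep -Value '.'
Set-ItemProperty -Path 'HKCU:\Control Panel\International' -Name sMonThousandSep -Value ','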

I hope this is useful.

Regards.

Upcoming Changes to How You Log into Visual Studio Team Services

MSDN Blogs - 4 hours 52 min ago

In order to make it easier for you to sign into Visual Studio Team Services (VSTS), we will soon be updating the steps that you will take when you log into your account. After we’ve made these changes, you will see some new login screens when you connect to the service. If you’re using Azure Active Directory (AAD) or Office 365 (O365), these screens will already look familiar to you, since we’re aligning more closely with the AAD & O365 login processes as a part of this change.

Why are we making this change?

We know that many of our customers are using not just VSTS, but also AAD, O365, and other Microsoft cloud services. By updating our login processes to more closely align with the rest of the Microsoft cloud ecosystem, VSTS can bring you a more consistent user experience when accessing our service.

How will this change impact you?

When you sign in to VSTS today, you see the screen below, where you log in on the right-hand side if you’re using a Microsoft Account (MSA, formerly Windows Live ID), or on the left-hand side if you’re using a Work or School Account (WSA, your AAD or O365 login).

Starting soon, you’ll see a much simpler screen that simply asks you for your username, which might be your email address or your User Principal Name (UPN) if you’re an AAD or O365 customer:

If You’re Logging in Using your Microsoft Account (MSA)…

If you’re logging into VSTS using your Microsoft Account, you’ll be redirected to the following screen to enter your username & password – this is the same screen you would see if you went to www.outlook.com, if you want to compare:

Once you enter your MSA username & password, you’ll be redirected to VSTS and you can keep going as usual – you shouldn’t see any other changes inside of VSTS.

If You’re Logging in Using a Work or School Account (WSA)…

If you’re logging in using your Work or School Account, you’ll be redirected to the following screen to enter your password:

Once you enter your WSA password, you’ll be redirected to VSTS and you can keep going as usual – you shouldn’t see any other changes inside of VSTS.

If You Get Prompted to Choose…

In some cases, you may have a single email address, like joe@contoso.com, that corresponds to both a Microsoft Account and a Work or School Account. If that’s the case, you’ll see one additional screen after you enter your username. (We need to ask you which one you want to use, since these are two different users inside of VSTS that probably have different permissions and other security settings attached to them):

  • If you used a Work or School Account when you created this VSTS account or were invited to it, please select “Work or school account”. (Hint: if you used to sign into VSTS using the blue link on the left-hand side of the old login page, choose “Work or school account”.)
  • If you’re using a Microsoft Account to sign into VSTS, please select “Personal Account”. (Hint: if you used to sign into VSTS using the right-hand side of the old login page, choose “Personal account”.)
FAQ

Where should I see this new experience?

You’ll see this experience anywhere that you’re used to being prompted for your username and password today, including:

  • Accessing VSTS from the web
  • Creating an OAuth authorization that requires you to sign into VSTS
  • Accessing VSTS from the Visual Studio IDE
I signed into VSTS from visualstudio.com and I’m not seeing the list of accounts that I own. What gives?

The list of accounts that you see on visualstudio.com is associated with the identity that you used to sign into VSTS. If you were prompted with the “chooser” screen when you signed in, then you may have picked the incorrect option. Try signing out from VSTS completely (see the steps below), then sign in again and choose the other option on that screen.

I tried to connect to <account>.visualstudio.com and I’m getting this error.

This indicates that the identity you logged in with doesn’t have access to the resource that you’re trying to access. Here are some things you can try:

  • If this is the first time you’ve tried accessing this URL, check with the account owner to make sure that you’ve been granted access to it.
  • If you were prompted with the “chooser” screen when you signed in, then you may have picked the incorrect option. Try signing out from VSTS completely (see the steps below), then sign in again and choose the other option on that screen.
How can I sign out of VSTS completely if I made a mistake?

We’ve created the following tool to help you sign out of VSTS completely, since just closing your browser window doesn’t always completely sign you out of the service:

  1. Close any open browser tabs or windows, including ones that aren’t running VSTS.
  2. Open an InPrivate or Incognito browser tab and browse to the following URL: http://aka.ms/vssignout. You’ll see a message that says “Sign Out In Progress” and a few spinning progress buttons, and then you’ll be redirected to the main www.visualstudio.com home page.
  3. If the “Sign Out In Progress” screen spins for more than a minute, go ahead and close the browser window anyway and continue.
  4. Try signing in again, and select a different option on the “chooser” screen.
I’m trying to enter my username and I’m seeing the following error

This means that you don’t have an MSA or a Work or School Account associated with the email address that you entered. Here are some things you can try:

  • If you’ve accessed this account before successfully, be sure that you’ve typed the email address correctly.
  • If you’re signing up for VSTS for the first time and don’t already have an MSA, click the “Create a Microsoft account” link to get started. Don’t worry: once you create a new MSA, you’ll be taken back to getting started with VSTS.
  • If you’re signing up for VSTS for the first time and want to create a new Work or School account, use the following link that will help you create a work/school account in Azure Active Directory before signing up. In the case of creating a new Work or School account, you’ll need to start the VSTS sign-up process over again once you’ve created your WSA. We’re sorry for the inconvenience; we’re working on fixing this.
    I’ve tried all the steps in this article and I’m still having trouble, what do I do?

    If you’re still having trouble accessing VSTS, you can open a new support case and our Customer Support team will help. Be sure to have your username ready, as well as the URL of the resource that you’re trying to access.

    To open a new support case with us:

    1. Visit http://www.visualstudio.com/support-overview-vs
    2. Click the “Visual Studio Team Services technical support” tile:

    Where can I give feedback about the new experiences?

    If you have additional questions or feedback for us regarding VSTS login experiences, please feel free to leave us a suggestion on User Voice here.

AX – How to cancel the foreign currency revaluation calculation

MSDN Blogs - 5 hours 55 min ago

INTRODUCTION

Consider that fluctuations in the exchange rate cause the nominal value of open (unpaid) transactions to vary. To calculate this variation, the foreign currency revaluation process is used in the Accounts payable, Accounts receivable, and General ledger modules in AX.

Once this process has been run, it may be necessary to cancel it and run it again for various reasons, such as: the dates used were wrong, the exchange rate values were wrong, documents were missing from the calculation, or something else.

If that is the case, it is possible to cancel the calculation and run it again, either in General ledger, or in Accounts payable and Accounts receivable.

BUSINESS SCENARIO

With the above explained, consider the following business scenario:

The legal entity uses US dollars as its accounting currency.

The exchange rate is as follows:

1. On January 1, 2018 it is USD $1.50 per €1.00, and
2. On February 1, 2018 it is USD $2.00 per €1.00

Note: These values are assumed and rounded for ease of calculation.

GENERAL LEDGER

A vendor invoice for €1,000 is posted on January 1, 2018 (USD $1,500.00 at the exchange rate). The trial balance for the Vendors account (200110) therefore shows a credit of USD $1,500.00.


     

Later, the foreign currency revaluation process is run as of February 1, 2018. The trial balance for the Vendors account (200110) is now updated to USD $2,000.00, because the exchange rate for February 1 is USD $2.00 per €1.00.

On reviewing the result of the exchange rate adjustment, the company's accountant finds that the calculation was run with some omissions, and decides to cancel the process and run it again. To do so, the accountant performs the following steps:

1. Temporarily update the exchange rate back to the value in effect before the process was run (USD $1.50 per €1.00), using February 1, 2018 as the date.

2. Run the foreign currency revaluation process as of February 1, 2018.

Note: At this point, the trial balance for the Vendors ledger account (200110) is back to what it was before the foreign currency revaluation was run. It is now possible to run the process again as required, as of February 1, 2018, using the exchange rate originally in effect for that date.


The procedure shown above is an example of how to get the expected result under the stated business scenario. However, Microsoft always recommends testing in a test environment, until you see the expected results, before doing this in a production environment.

     

ACCOUNTS PAYABLE AND ACCOUNTS RECEIVABLE

Since the foreign currency revaluation process is the same in both modules, the exercise will be performed only from the Accounts receivable module.

For this example, a service invoice for €1,000 is posted on January 1, 2018 (USD $1,500.00 at the exchange rate). The trial balance for the Customers account (130100) therefore shows a debit of USD $1,500.00.


     

Later, the foreign currency revaluation process is run as of February 1, 2018. The trial balance for the Customers account (130100) is now updated to USD $2,000.00, because the exchange rate for February 1 is USD $2.00 per €1.00.

On reviewing the result of the exchange rate adjustment, the company's accountant finds that the calculation was run with some omissions, and decides to cancel the process and run it again. To do so, the foreign currency revaluation process is run again, but this time with the Method parameter set to Invoice date.

When this parameter is used, the effect of any previous revaluation is canceled, and the trial balance looks as shown below.

Note: At this point, the trial balance for the Customers ledger account (130100) is back to what it was before the foreign currency revaluation was run. It is now possible to run the process again as required, as of February 1, 2018, using the exchange rate originally in effect for that date.

     

Procedure reference for Accounts payable and Accounts receivable:

    Foreign currency revaluation for Accounts payable and Accounts receivable

    https://ax.help.dynamics.com/en/wiki/foreign-currency-revaluation-for-accounts-payable-and-accounts-receivable/

     

     

    Para E

    C++14/17 Features and STL Fixes in VS “15” Preview 4

    MSDN Blogs - 6 hours 58 min ago

    Visual Studio “15” Preview 4 is now available, with a new installer.  (VS “15” is an IDE version number, not a year.  It’s the next major version after VS 2015, which was IDE version 14.  VS has different numbers for its year branding, IDE version, and C++ compiler version.)

     

    All of the features and fixes in VS 2015 Update 3 (including optimizer improvements for std::abs(), std::min(), std::max(), and std::pow(), which we forgot to mention) are also available in VS “15” Preview 4.  Additionally, we have something new to announce.  Previously, new major IDE versions contained new major compiler versions and binary-incompatible STLs (which allowed us to overhaul our data structure representations for correctness and performance).  Now that we’ve been adding features to the compiler and STL since VS 2015 RTM in a highly compatible manner, we’re going to continue this into the new major IDE version.  Specifically, VS 2015 and VS “15” will have the same major compiler version (19) and their STLs will be binary-compatible, and this compatible compiler and STL will remain available throughout the lifecycle of VS “15”.  This implies that the STL’s DLL will continue to be named msvcp140.dll.  (At some point in the future, we expect to have a compiler version 20 and a binary-incompatible STL again.)

     

    Note that we’re guaranteeing binary compatibility, not source compatibility.  While the version switch /std:c++14 (which is the default) will typically preserve source compatibility, it is always possible for bugfixes or Issue resolutions to require source code changes.  While we’re trying to avoid unnecessary source breaking changes, when they’re necessary they’ll be documented on MSDN, as we’ve been doing for VS 2015 Updates.  And note that /std:c++latest will frequently experience source breaking changes, but we’ll try to document them too (especially in the STL, as we update our implementation to conform to the latest Working Paper, this can sometimes break source code in unexpected ways which we aren’t immediately aware of, hence the caveat).

     

    To be clear, this is a good thing for you, our programmer-users.  Although the major compiler version is remaining unchanged at 19, we’re still adding new compiler and STL features.  (_MSC_FULL_VER will increase, allowing the updated compiler to be detected.)  And the STL binary compatibility means that third-party libraries can be built once and used with both VS 2015 and VS “15”.  (However, it is still best for everything to be compiled consistently with the latest available version, as that will give you the most correctness and performance.)  Now, here’s what we’re adding:

     

    Compiler Features

     

    The C++14 feature NSDMIs for aggregates has been implemented unconditionally by Vinny Romano.
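For illustration (not from the original post), the following is valid C++14: a type with default member initializers is still an aggregate, so braced initialization can omit trailing members and let them fall back to their initializers.

#include <iostream>

struct Point {
    int x = 0;  // default member initializers no longer
    int y = 0;  // disqualify Point from being an aggregate
};

int main() {
    Point a{};      // x == 0, y == 0
    Point b{1};     // x == 1, y == 0 (y uses its NSDMI)
    Point c{1, 2};  // x == 1, y == 2
    std::cout << b.x << ' ' << b.y << '\n';
}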

     

    The C++17 feature [[fallthrough]] attribute has been implemented under /std:c++latest by Shuo Chang.
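A minimal illustration of the attribute (again, not from the original post; compile with /std:c++latest):

#include <iostream>

void classify(int n) {
    switch (n) {
    case 0:
        std::cout << "zero\n";
        [[fallthrough]];  // intentional fall-through; suppresses the warning
    case 1:
        std::cout << "zero or one\n";
        break;
    default:
        std::cout << "something else\n";
    }
}

int main() { classify(0); }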

     

    STL Features

     

    The C++17 feature <algorithm> sample() has been implemented under /std:c++latest.
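A minimal illustration of sample(), which copies n randomly chosen elements from a range to an output iterator while preserving their relative order (a sketch, not from the original post; compile with /std:c++latest):

#include <algorithm>
#include <iostream>
#include <iterator>
#include <random>
#include <vector>

int main() {
    std::vector<int> pool{1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    std::vector<int> picked;
    std::mt19937 urng{std::random_device{}()};

    // Choose 3 elements without replacement
    std::sample(pool.begin(), pool.end(), std::back_inserter(picked), 3, urng);

    for (int v : picked) std::cout << v << ' ';
    std::cout << '\n';
}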

     

    The C++17 feature is_callable has been implemented under /std:c++latest.  (is_nothrow_callable was blocked by a compiler bug in Preview 4, which has been fixed for the next build.)

     

    LWG Issues

     

    The following C++14 Library Issue resolutions have been implemented unconditionally:

     

    • LWG 2135 Unclear requirement for exceptions thrown in condition_variable::wait()
    • LWG 2203 scoped_allocator_adaptor uses wrong argument types for piecewise construction
    • LWG 2210 Missing allocator-extended constructor for allocator-aware containers

     

    The following C++17 Library Issue resolutions have been implemented unconditionally:

     

    • LWG 2063 Contradictory requirements for string move assignment
    • LWG 2219 INVOKE-ing a pointer to member with a reference_wrapper as the object expression
    • LWG 2439 unique_copy() sometimes can’t fall back to reading its output
    • LWG 2476 scoped_allocator_adaptor is not assignable
    • LWG 2566 Requirements on the first template parameter of container adaptors
    • LWG 2576 istream_iterator and ostream_iterator should use std::addressof
    • LWG 2577 {shared,unique}_lock should use std::addressof
    • LWG 2579 Inconsistency wrt Allocators in basic_string assignment vs. basic_string::assign
    • LWG 2583 There is no way to supply an allocator for basic_string(str, pos)
    • LWG 2586 Wrong value category used in scoped_allocator_adaptor::construct()
    • LWG 2684 priority_queue lacking comparator typedef
    • LWG 2716 Specification of shuffle and sample disallows lvalue URNGs

     

    STL Fixes

     

    Cleaned up _ITERATOR_DEBUG_LEVEL=2 assertions.  Now they always emit only one assertion dialog, and they cannot be ignored.  (Previously, some but not all assertions emitted two dialogs, and some but not all could be ignored.)

     

    Further improved support for fancy pointers.  Class types imitating pointers while wearing top hats and monocles are now accepted throughout more of the STL.  (Fancy pointers, powered by std::pointer_traits, are highly advanced and extremely rare.)

     

    Fixed a regression that was triggering compiler errors when calling uninitialized_copy() on a list/forward_list containing elements with non-trivial destructors (VSO#233820/Connect#2846868).

     

    Fixed all known scoped_allocator bugs: VSO#129349 “<scoped_allocator>: scoped_allocator is attempting to default construct allocators”, VSO#146338 “<scoped_allocator>: error C2512: no appropriate default constructor available”, and VSO#224478 “<scoped_allocator>: construction is using true placement new, not OUTERMOST_ALLOC_TRAITS::construct”.

     

    Fixed setlocale() memory corruption issues in <filesystem>.

     

    Fixed broken handling of match_prev_avail, match_not_bol, and match_not_eol in <regex> (VSO#225160/Connect#2745913, VSO#226914).

     

    Prevented the CRT from shutting down before std::async() threads have shut down (VSO#225699).

     

Billy Robert O’Neal III – @MalwareMinigun – bion@microsoft.com

Casey Carter – @CoderCasey – cacarter@microsoft.com

Stephan T. Lavavej – @StephanTLavavej – stl@microsoft.com

    Steve Wishnousky – stwish@microsoft.com

Search First Before Submitting Windows Feedback

MSDN Blogs - 7 hours 49 min ago

Thank you for your daily reports to the Windows 10 Feedback Hub.

Looking through them from time to time, there is one unfortunate pattern: many people reporting the same issue individually. Here is an example.

This is a bug report about the power button after the Anniversary Update, but everyone has filed the same issue separately. The result is a pile of reports that each look as though only one person is affected (as if each were a special case).

So before you submit a report, please check whether the same issue has already been reported. If you find the same or a similar report, vote for it instead; that is the key to getting the issue fixed. If your situation differs slightly from the existing report, you can describe the difference in a comment.

Your reports are valuable, and we want to pass them on to headquarters as strong feedback, so please search first.

    SqlCommandFilters–a tool for every toolbox

    MSDN Blogs - 8 hours 18 min ago

    SqlCommandFilters is a utility assembly that will automatically parse your SqlCommand.CommandText and parameterize it for you.

    Why would you want to do this?

     

    Picture this 

You want to leverage SQL Server Always Encrypted, but your queries are not currently parameterized.

    You have a web application that builds all SQL input from elements on the user page. SQL Injection anyone?

    You want better performance so you know you should parameterize your queries… but there are thousands of them.

     

    What if…

You could accomplish all of the above by adding one using statement and one line of code?

    You can with SqlCommandFilters. The source code is all posted on CodePlex.

     

    How did you do this?

    By using the Microsoft.SqlServer.TransactSql.ScriptDom namespace I was able to parse the SQL command text and automatically create and add parameters to the SqlParameters collection of the SqlCommand object. I used the excellent information provided by Arvind Shyamsundar  found here: https://blogs.msdn.microsoft.com/arvindsh/tag/scriptdom/ as my starting point.

     

    What constructs does it support?

    There is a test / driver program that will allow you to easily test with over 20 different T-SQL constructs. The tool supports non-parameterized, partially parameterized and fully parameterized queries.

     

    Is it hard to use? You be the judge. The important statement is line 18 – that is where all the magic happens.

 1: SqlConnectionStringBuilder sb = new SqlConnectionStringBuilder();
 2: sb.InitialCatalog = "AdventureWorks2014";
 3: sb.IntegratedSecurity = true;
 4: sb.DataSource = @"SQLBOBTSQL2k16CTP3";
 5: SqlConnection con = new SqlConnection(sb.ConnectionString);
 6: con.Open();
 7: SqlCommand cmd = new SqlCommand();
 8: cmd.Connection = con;
 9: // Pick one of the TestStatements and assign it to the CommandText property of the SqlCommand object
10: cmd.CommandText = TestStatements.existentialSubQueryStmt;
11:
12: // Parameterize supports a reparse parameter
13: // by default it is true. It will reparse and format the resultant SQL to ensure we have good code
14: // if you feel that the performance suffers you can turn it off by calling this instead
15: // Parameters.Parameterize(ref cmd, false);
16: // Parameterize will parse, parameterize, create the parameter collection, and modify the CommandText
17: // and Parameters collection appropriately
18: SqlCommandFilters.Parameters.Parameterize(ref cmd);

Improving Concurrency & Scalability of SQL Server workload by optimizing database containment check in SQL 2014/SQL 2016

    MSDN Blogs - 8 hours 18 min ago

Starting with SQL 2012, the database containment property was introduced in SQL Server databases to support contained databases. As described in the MSDN article here, database collation applies not only to data stored in user tables but also to variables, stored procedure parameters, and GOTO labels. In the case of contained databases, the collation of the metadata, variables, parameters, GOTO labels, and cursors is different from the database collation. Since the database containment property can be updated online while the instance is running, during each execution of a stored procedure it is important to check the database containment property, to see whether the property has been set or changed before the query plan is compiled, and to ensure that the stored procedure parameters, variables, and GOTO labels are using the right collation setting.

The database containment property is set or read from DBTABLE, which is an in-memory data structure maintained for every database in SQL Server. To read the containment property during every execution of a stored procedure, a spinlock is acquired on DBTABLE; a spinlock is a lightweight synchronization mechanism for acquiring fast exclusive locks. Spinlock wait bottlenecks (also referred to as spinlock collisions) occur when concurrent threads are waiting to acquire the same lock on a data structure that is already held by one of the threads for exclusive reading/writing of the in-memory data structure. Spinlocks are usually used for fast access, and hence the threads waiting for the lock spin on CPUs – rather than yielding immediately – until they acquire the lock. To ensure the threads waiting for a spinlock do not hog the CPU during long waits (spins), the threads are forced to yield periodically (referred to as spinlock backoffs). To ensure excessive spins do not cause additional CPU overhead, the sleep period between each backoff is increased exponentially.

On high-end systems with a large number of cores or processors (typically 32 processors or more), the concurrency of SQL Server may be high enough that, if all the threads are executing stored procedures from the same database, all the threads may end up waiting simultaneously for the spinlock on the DBTABLE structure in order to read the containment property of the database. Excessive spinlock collisions can lead to increased query response time, a drop in throughput, and increased CPU overhead, with worker threads spinning and burning CPU cycles while waiting for spinlocks. This limits the overall concurrency, throughput, and scalability of SQL Server on the latest high-end systems.
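If you suspect this bottleneck, one rough diagnostic (a sketch, not part of the KB article) is to query the spinlock statistics DMV and watch how the collision and backoff counters grow under load:

-- Spinlocks with the most collisions since the last restart
SELECT TOP (10)
       name,
       collisions,
       spins,
       spins_per_collision,
       backoffs
FROM sys.dm_os_spinlock_stats
ORDER BY collisions DESC;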

    Starting SQL 2014 SP1 CU8, SQL 2014 SP2 CU1 (not released yet) and SQL 2016 CU1, the spinlock to check the database containment property is replaced by the “load acquire and store release” lock semantics, which is a non-blocking lock-free synchronization mechanism between the concurrent threads. This avoids exclusive spinlocks and thereby avoids the spinlock collisions between the concurrent threads executing stored procedures from same database as described earlier. This change improves the overall concurrency and scalability of the system especially if all the worker threads are simultaneously executing a stored procedure from same database.

    The improvement is also documented in KB 3175883 and is further elaborated in this blog to explain the improvement in detail.

    Parikshit Savjani
    Senior Program Manager (@talktosavjani)

    ExpressRoute common customer questions answered

    MSDN Blogs - 9 hours 42 min ago

     

A higher education customer in Arizona asked me some questions about ExpressRoute for their Azure subscriptions, so I have posted the answers here:

     

     

ExpressRoute circuit diagram showing Azure Public peering, Azure Private peering, and Microsoft peering to other Microsoft cloud services like Intune, CRM Online, Office 365, etc.

     

    Can you have both ASM and ARM VNets connected to a single ER circuit?

The good news is that the answer is yes, provided you enable ‘AllowClassicOperations’ on the ARM ExpressRoute circuit. See the PowerShell steps here under the ‘Enable for both models’ section.

    Here is a PowerShell sample to enable for both ASM and ARM VNets over an ARM ExpressRoute circuit:

     

# Get details of the ExpressRoute circuit

$ckt = Get-AzureRmExpressRouteCircuit -Name "DemoCkt" -ResourceGroupName "DemoRG"

# Set "Allow Classic Operations" to TRUE

$ckt.AllowClassicOperations = $true

# Update circuit

Set-AzureRmExpressRouteCircuit -ExpressRouteCircuit $ckt

     

    Can I use a single ExpressRoute circuit with multiple subscriptions?

    Yes, you can leverage a single ER circuit with up to 10 subscriptions. See here for more information.

    Single ExpressRoute circuit example connecting to several Azure subscriptions
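The circuit owner typically hands out one authorization per additional subscription. A sketch of the owner-side steps, assuming the same AzureRM cmdlets used elsewhere in this post (the other subscription then redeems the authorization key when creating its connection):

# Circuit owner: create an authorization that another subscription can redeem
$ckt = Get-AzureRmExpressRouteCircuit -Name "DemoCkt" -ResourceGroupName "DemoRG"
Add-AzureRmExpressRouteCircuitAuthorization -ExpressRouteCircuit $ckt -Name "SubscriptionB"
Set-AzureRmExpressRouteCircuit -ExpressRouteCircuit $ckt

# Retrieve the authorization key to give to the owner of the other subscription
Get-AzureRmExpressRouteCircuitAuthorization -ExpressRouteCircuit $ckt -Name "SubscriptionB"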

     

    Can I use the new VNet peering with ExpressRoute?

     

Yes, you can leverage the new VNet peering option with ExpressRoute. This is a very useful feature since it can help bypass the default 10-VNet limit placed on a standard ExpressRoute circuit.

     

     

    Can I combine VNet to VNet BGP with ExpressRoute BGP routing?

    You cannot currently combine VNet to VNet BGP with ER BGP routing. An alternative solution, if the VNets are in the same region, may be to use VNet peering in conjunction with ER BGP routing.

     

    VNet to VNet BGP example

     

    ExpressRoute BGP example

     

    Can I use QoS with ExpressRoute?

    Yes, you can with specific Microsoft peering traffic over ExpressRoute. There are specific DSCP tags you can enable for things like Skype traffic, etc. See here for more details.

     

    Can I encrypt all ExpressRoute traffic natively?

    Currently this is not possible on an ExpressRoute circuit without an additional network VPN/encryption appliance on premises paired with a network virtual VPN/encryption appliance hosted in Azure.

    Can I have both ExpressRoute ARM and S2S VPN as a failover connection?

The answer is yes, this is possible, and it is a good best practice for Azure Private peering traffic. Azure Public peering traffic can fail over to a standard Internet link. See more on the configuration here.

     

    Are there Egress charges for ExpressRoute?

Yes, if you use the ‘metered plan’ there is an egress cost of about 2 cents per GB. If you use the ‘unlimited plan’ there is no charge for egress. The education Internet egress waiver does not apply to ExpressRoute circuits. See here for more pricing details.

     

    For more on ExpressRoute see here, a useful ER FAQ here, and detailed ER documentation see here.

‘Client unable to establish connection’ error when you configure ‘max server memory’ to a low value

    MSDN Blogs - 10 hours 8 min ago

     

    When you inadvertently set SQL Server’s ‘max server memory’ to a very low value, you may experience one or more of the following symptoms:

    • SQL Server cannot be started (from a tool like Configuration Manager) and a message similar to the following is logged in the SQL error log:
      “An internal query can’t be executed due to insufficient memory…”
    • SQL instance is started, but connections would fail with an error message that reads:
      Cannot connect to <Servername>
      Additional information:
      The client was unable to establish connection because of an error during connection initialization process before login. Possible causes include the following: the client tried to connect to an unsupported version of SQL Server, the server was too busy to accept new connections; or there was a resource limitation (insufficient memory or maximum allowed connections) on the server. (provider : Shared Memory Provider: Error :0 – No process is on the other end of the pipe.) (Microsoft SQL Server, Error:233)

      No process on the other end of the pipe.

     

As noted in this blog, we can try to work around this issue by starting SQL Server with minimal configuration by adding -f to the SQL startup parameters, but in most cases this causes SQL to start in single-user mode, and the connection is typically grabbed by a process other than a tool like SSMS that could be used to correct the memory setting. The solution in this scenario is to specify the app that will be allowed to connect to SQL Server by using the -m parameter as noted below:

NET START MSSQLSERVER /f /m"Microsoft SQL Server Management Studio - Query"

    OR

NET START MSSQLSERVER /f /m"SQLCMD"
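Once SSMS or SQLCMD is connected, you can set ‘max server memory’ back to a sensible value and then restart the service normally (without /f and /m). The 2048 MB below is only an illustration; size it for your server:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 2048;
RECONFIGURE;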

    Additional References:

     

    Contributed by: Idit Izak (Senior Support engineer in SQL team)

    OneNote Class and Staff Notebooks are now available in over 150 countries

    MS Access Blog - 10 hours 36 min ago

With back-to-school season in full swing or just about to get going in many countries, it’s a good time to reflect on the excitement around OneNote Class Notebooks and Staff Notebooks. The original OneNote Class Notebook app was in preview release during the summer of 2014 and went to general availability in October of 2014. Since then, we’ve had hundreds of thousands of teachers using the app in Office 365—resulting in millions of student notebooks created. We also just released some big updates for the 2016/2017 school year. In just one week, we’ve seen more class notebooks created than in the entire 2014/2015 school year! We’ve heard from all over the world about teachers’ and schools’ desire to use class and staff notebooks, so we have enabled the app in a set of new countries and regions.

    OneNote Class and Staff Notebook app—now available via Office 365 in 90 new countries and regions

    In early August, we enabled OneNote Class and Staff Notebooks for 90 new countries around the world—bringing the grand total to 151 countries where Office 365 Education customers can use the OneNote Class or Staff Notebook app.

    Here is the full list of languages and countries.

    We already had class and staff notebooks created in many of these new countries and, as you can see by the map, there are lots of countries (highlighted with purple for OneNote) using OneNote Class and Staff Notebooks!

    Map showing where class or staff notebooks are being used.

    To get started, any teacher can follow these three easy steps to create class and staff notebooks today:

1. Go to www.office.com/teachers and sign up for FREE.

2. To start a class notebook, visit onenote.com/classnotebook and invite students into the notebook; they get their own Office 365 account for free.
3. To start a staff notebook, visit onenote.com/staffnotebook and invite staff members or educators into the notebook; they get their own Office 365 account for free.

    —The OneNote team

    Full list of countries and regions supporting OneNote Class and Staff Notebooks in Office 365 Afghanistan, Albania, Algeria, Angola, Argentina, Armenia, Australia, Austria, Azerbaijan, Bahamas, Bahrain, Bangladesh, Barbados, Belarus, Belgium, Belize, Bermuda, Bolivia, Bosnia and Herzegovina, Botswana, Brazil, Brunei, Bulgaria, Cameroon, Canada, Cape Verde, Cayman Islands, Chile, China , Colombia, Costa Rica, Côte d’Ivoire, Croatia, Curaçao, Cyprus, Czech Republic, Denmark, Dominican Republic, Ecuador, Egypt, El Salvador, Estonia, Ethiopia, Faroe Islands, Fiji, Finland, France, Georgia, Germany, Ghana, Greece, Guatemala, Honduras, Hong Kong SAR, Hungary, Iceland, India, Indonesia, Iraq, Ireland, Israel, Italy, Jamaica, Japan, Jordan, Kazakhstan, Kenya, Korea, Kuwait, Kyrgyzstan, Latvia, Lebanon, Libya, Liechtenstein, Lithuania, Luxembourg, Macao SAR, Macedonia (FYRO), Malaysia, Malta, Mauritius, Mexico, Moldova, Monaco, Mongolia, Montenegro, Morocco, Namibia, Nepal, Netherlands, New Zealand, Nicaragua, Nigeria, Norway, Oman, Pakistan, Palestinian Authority, Panama, Paraguay, Peru, Philippines, Poland, Portugal, Puerto Rico, Qatar, Romania, Russia, Rwanda, Saint Kitts and Nevis, Saudi Arabia, Senegal, Serbia, Singapore, Slovakia, Slovenia, South Africa, Spain, Sri Lanka, Sweden, Switzerland, Taiwan, Tajikistan, Tanzania, Thailand, Trinidad and Tobago, Tunisia, Turkmenistan, Turkey, Uganda, Ukraine, United Arab Emirates, United Kingdom, United States, Uruguay, US Virgin Islands, Uzbekistan, Venezuela, Vietnam, Zambia and Zimbabwe

    The post OneNote Class and Staff Notebooks are now available in over 150 countries appeared first on Office Blogs.

    Meteor 1.4 App on Azure App Services

    MSDN Blogs - 11 hours 23 min ago

Meteor is a full-stack JavaScript platform for developing modern web and mobile applications. Meteor includes a key set of technologies for building connected-client reactive applications, a build tool, and a curated set of packages from the Node.js and general JavaScript community.

This blog will help you create a sample Meteor app in your local environment, and then show you how to move the sample app to Azure Web Apps.

    Creating Sample Meteor App:

Use the command below to install Meteor in your local environment:

    curl https://install.meteor.com/ | sh

Check your Meteor version; we highly recommend using version 1.4 or later:

    meteor --version

Use the command below to create a sample Meteor app:

    meteor create simple-todos

The command above creates a new folder with a few files for our sample app, as shown in the screenshot below.

Using Demeteorizer to convert the app into Node.js format

Use the command below to install Demeteorizer:

    npm install -g demeteorizer

Navigate to your Meteor app’s root folder and run:

demeteorizer

This creates a new .demeteorized folder.

Navigate to .demeteorized/bundle/programs/server using the command below:

    cd .demeteorized/bundle/programs/server

Run the command below to install all the required Node.js modules:

npm install

Running App on Local Env

Use the command below to run the demeteorized (converted) Node.js app in your local environment:

    PORT=8080 ROOT_URL=http://localhost:8080 npm start

    Moving App to Azure

Create a new web app on Azure, set up continuous deployment, and get the Git URL. The link below has details:
https://azure.microsoft.com/en-us/documentation/articles/web-sites-nodejs-develop-deploy-mac/

Add the following app setting to your web app in the Azure portal under App Settings:
Key: ROOT_URL
Value: your web app URL (e.g., http://Your_APP_Name.azurewebsites.net/)

Create a web.config file at .demeteorized/bundle/ and insert the content from the link below:
https://raw.githubusercontent.com/christopheranderson/azure-demeteorizer/master/resources/web.config

Navigate to the .demeteorized/bundle/ folder and commit your changes to WEB_APP_GIT_URL:

git init
git add .
git commit -m "initial commit"
git remote add samplemeteorapp WEB_APP_GIT_URL
git push samplemeteorapp master

    Easy tables, Easy APIs error message ‘Unsupported Service’

    MSDN Blogs - 12 hours 15 min ago
    Overview

    When you try to access or create Easy tables or Easy APIs in Azure App Services or Azure Mobile Apps you may get this error message in the portal: Unsupported service.

    Cause

Easy tables and Easy APIs are only available for an Azure Mobile App that was initially created as an Azure Mobile App with a Node.js backend in the quickstart, or for an Azure Mobile Apps Quickstart.

    Fix

Create a new Azure Mobile App, choose quickstart, and select Node.js when you are prompted for the backend, OR choose Mobile Apps Quickstart and create your Easy Tables.

    FAQ:

But I don’t want to create a new Azure Mobile App; what can I do? – You can grab the necessary Node.js code from a quickstart you have created and then edit the code to add tables. You can use https://NAMEOFYOURAPP.scm.azurewebsites.net/dev to see and edit the table and API code (see below on how this works). You can view your tables with your favorite database access tool.

How does the Easy Table/API functionality work if I don’t see them in the UI? – The Mobile Apps Node.js packages look for certain files in the tables and api directories (mytable.js and mytable.json, for example). They use Express middleware, and when these files are present they are dynamically added to the Express routes.
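For illustration, a minimal table definition along these lines is what the backend looks for, assuming the azure-mobile-apps Node.js server SDK used by the quickstart (the file name mytable.js is just an example):

// tables/mytable.js – a minimal Easy Tables-style table definition (sketch)
var azureMobileApps = require('azure-mobile-apps');

// Create a table definition; by default it is backed by a SQL table with the same name
var table = azureMobileApps.table();

// Optional: require authentication for all operations on this table
// table.access = 'authenticated';

module.exports = table;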

What if I need more help? – StackOverflow and the MSDN forums are your first option; creating a support case is another.

    More Info:

You can also skip Easy Tables/APIs and manually define the routes and add your code:

    https://shellmonger.com/30-days-of-azure-mobile-apps-the-table-of-contents/

    https://shellmonger.com/2016/04/15/30-days-of-zumo-v2-azure-mobile-apps-day-8-table-controller-basics/

    https://shellmonger.com/2016/05/13/30-days-of-zumo-v2-azure-mobile-apps-day-20-custom-api/ (Version 2: The Node.js Custom API)

    Millennial Manager

    MSDN Blogs - 12 hours 24 min ago

    Unlike a lot of technical blogs, Premier Developer covers a range of both technical and business focused topics.  Why?  Simply put, developers need to understand the business and the business needs to understand developers. 

    This post is provided by Dan Simmons, a Technical Delivery Manager in Microsoft’s Modern Application Domain and a millennial manager.  David S. Lipien, a Director in Microsoft’s Premier Services and a manager of millennials, helped contribute.

I was recently at the lake relaxing and talking to my neighbor, a Press Lawyer, and she was discussing her position and the challenges she was facing. I mentioned I was leading a team, to which she stated, “oh man, I can relate to people management, I have some millennials I’m managing and they want to work from anywhere, be recognized all the time, and think they should make so much!” I responded saying, “I’m actually a millennial manager.”

Which then made me think: is this situation any different from managing other employees?

    Continue reading Dan’s post on LinkedIn.

    Contact your Premier ADM or email us to learn how Premier Support for Developers can help you minimize technical debt and ensure your solutions stay supported with Microsoft by your side.

    New Web Property OneNoteVersionIntervalMinutes since August 2016 CU

    MSDN Blogs - 14 hours 32 min ago

    With the following change we introduced a new property some SharePoint administrators may find useful.

3115454 – August 9, 2016, update for SharePoint Foundation 2013 (KB3115454)

    http://support.microsoft.com/kb/3115454/EN-US

• You have versioning enabled on a document library. Previously, you were unable to control how frequently a new version of a OneNote notebook section file is created. This update introduces the SPWebService.OneNoteVersionIntervalMinutes property (default value 60*24 = 1440 minutes, i.e. one day) to enable administrators to control how frequently versions of OneNote notebook sections are created.

In general:
OneNote (e.g. 2013, 2016) notebooks have their own mechanism for creating versions of pages, including their own Recycle Bin. Therefore it does not usually make much sense to store those notebooks inside a document library with versioning enabled. In practice, though, circumstances vary, and you will find notebooks everywhere, including in versioned document libraries. In the past you had no control over how often a new version of a section was created; now you do.

How are OneNote notebooks stored in a document library?

    You may see:

The icon tells you that this is a OneNote notebook. Click it and you will see the content of the notebook in OWAS (Office Web Apps Server). You can also open the notebook with your OneNote client and sync it on many different devices.

    Technically this is a folder and the content of the notebook is one step down the folder structure.

    In OWAS it looks like:

     

    And in OneNote 2016 on the client:

     

    Now the content or files in the doclib:

How do you navigate to this view? It is a bit tricky, but not a secret. To make it easier, create a standard folder next to the notebook, click it to go one level deeper, and copy the URL into Notepad.

    http://contoso/sites/TeamSiteContoso/_layouts/15/start.aspx#/ContosoDocLib/Forms/AllItems.aspx?RootFolder=%2Fsites%2FTeamSiteContoso%2FContosoDocLib%2FJustFolder&FolderCTID=0x012000805BE7957100214AB39FF33236B5807C&View=%7B433440F4%2DE342%2D4F72%2D8766%2D6ACEAF2764A1%7D

    We named the folder: JustFolder.

    Compare the link with the next one:

    http://contoso/sites/TeamSiteContoso/_layouts/15/start.aspx#/ContosoDocLib/Forms/AllItems.aspx?RootFolder=%2Fsites%2FTeamSiteContoso%2FContosoDocLib%2FMyTeamContosoNoteBook

In short, replace JustFolder with the name of the notebook and remove the rest of the line. Navigate to that URL so that you can see the section files. Now you can check and manage the version history of each section file.

     

How do you set and read the new property?

Save the following script as OneNoteVersioning.ps1.

Call it to read the current value: OneNoteVersioning.ps1 -WebUrl http://contoso

Call it to read the current value and also set a new value, e.g. 2880 minutes: OneNoteVersioning.ps1 -WebUrl http://contoso -SetInMinutes 2880

################################################################################################
# THIS CODE-SAMPLE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR
# FITNESS FOR A PARTICULAR PURPOSE.
#
# This sample is not supported under any Microsoft standard support program or service.
# The script is provided AS IS without warranty of any kind. Microsoft further disclaims all
# implied warranties including, without limitation, any implied warranties of merchantability
# or of fitness for a particular purpose. The entire risk arising out of the use or performance
# of the sample and documentation remains with you. In no event shall Microsoft, its authors,
# or anyone else involved in the creation, production, or delivery of the script be liable for
# any damages whatsoever (including, without limitation, damages for loss of business profits,
# business interruption, loss of business information, or other pecuniary loss) arising out of
# the use of or inability to use the sample or documentation, even if Microsoft has been advised
# of the possibility of such damages.
################################################################################################

##
## Using new Web Property OneNoteVersionIntervalMinutes since August 2016 CU
##

Param(
  [Parameter(Mandatory=$True)]
  [string]$WebUrl,

  [Parameter(Mandatory=$False)]
  [int]$SetInMinutes
)

Write-Host -ForegroundColor White "This script reads and can set how often a new version of a OneNote notebook section will be created"
Write-Host -ForegroundColor Gray "This setting makes sense in case you need to store OneNote notebooks inside document libraries with versioning enabled"
Write-Host -ForegroundColor Yellow "For web application:" $WebUrl

# Use the URL and get the web object
$SPWeb = Get-SPWeb -Identity $WebUrl

# Read the value and convert it into hours and days
$InMinutes = $SPWeb.Site.WebApplication.WebService.OneNoteVersionIntervalMinutes
$InHours = $InMinutes / 60
$InDays = $InHours / 24

Write-Host -ForegroundColor Gray "OneNote versions are created every:" $InMinutes "minutes or"
Write-Host -ForegroundColor Gray "OneNote versions are created every:" $InHours "hours or"
Write-Host -ForegroundColor Gray "OneNote versions are created every:" $InDays "days"

# The new value will be written only when it is set with the parameter SetInMinutes
if ($SetInMinutes -gt 0)
{
    Write-Host -ForegroundColor Red "We will set a new value:" $SetInMinutes "minutes"
    $SPWeb.Site.WebApplication.WebService.OneNoteVersionIntervalMinutes = $SetInMinutes

    # Persist the change on the web service object that was modified
    $SPWeb.Site.WebApplication.WebService.Update()

    # Read the value again and write it out
    $InMinutes = $SPWeb.Site.WebApplication.WebService.OneNoteVersionIntervalMinutes
    $InHours = $InMinutes / 60
    $InDays = $InHours / 24

    Write-Host -ForegroundColor Gray "Read the value again:"
    Write-Host -ForegroundColor Green "The new value is now:" $SetInMinutes "minutes"
    Write-Host -ForegroundColor Gray "OneNote versions are created every:" $InMinutes "minutes or"
    Write-Host -ForegroundColor Gray "OneNote versions are created every:" $InHours "hours or"
    Write-Host -ForegroundColor Gray "OneNote versions are created every:" $InDays "days"
}

$SPWeb.Dispose()

     

    App Dev on Xbox One Live Event 30th August 2016

    MSDN Blogs - 15 hours 20 min ago

The Windows 10 Anniversary Update was released earlier this month, and with it the Windows 10 Anniversary Update SDK, build 14393, was made available to any developer who wishes to build UWP applications that, for the first time ever, also target Xbox One consoles.

    With so many great new features and APIs to talk about, we are excited to announce the “App Dev on Xbox” live online event where engineers will spend one day covering the new Anniversary Update SDK capabilities that enable you to build great app experiences for the TV and across other device form factors. The event will cover topics around:

    • the Anniversary Update SDK and getting started with app development on Xbox One
    • deep dive into developing apps using both XAML and Web technologies
    • guidance for designing and creating impressive TV experiences and
    • submitting your apps via the Dev Center to all UWP devices including the Xbox

    The event will take place on August 30th at 9:00am PST and it is open to everyone. Head on over to the event page to add it to your calendar now and join the conversation on Twitter using #XboxAppDev where live Q&A will take place during the event.

    Get started now!

In the meantime, you can get started today by downloading the free Visual Studio Community 2015 and the Anniversary Update SDK. Activate Developer Mode on your retail Xbox One or Xbox One S and check out the docs. And make sure to visit the event page for a lot more resources and regular updates up to the event.

    Read more at https://blogs.windows.com/buildingapps/2016/08/23/announcing-the-app-dev-on-xbox-live-event/#ui55WHdE5y4twRlL.99
