
Feed aggregator

What’s New in DynamicsPerf 2.0 Release Candidate 0

MSDN Blogs - Sun, 06/26/2016 - 15:20

First, I want to say “Thank You” to the community.  All of the feedback has been greatly appreciated.

 

Let’s start with SSRS data collection.

 

This is now working as designed.  Some schema changes were required to get this working properly, so the upgrade script will truncate the SSRS_HISTORY and SSRS_EXECUTIONLOG tables.  The SSRS data collection has been moved to its own SQL Agent job called DYNPERF_CAPTURE_SSRS.

There is no additional configuration beyond the setup of the SSRS_CONFIG table described in the install guide.

 

Additional Options for controlling the size of the DynamicsPerf database.

 

It appears that our 5-minute data collection is working really well, as we are collecting all variations of query plans.  The problem is that this consumes a lot of space.  So in Release Candidate 0 we have added some additional configuration parameters to better control how much data is collected.  These options are part of the script 4-ConfigureDBs to Collect.sql.  The parameters are as follows:

NO. OF QUERY PLANS TO KEEP PER QUERY_HASH – This parameter limits the number of query_plan_hash values we keep per query_hash.  It defaults to 20.  The extra plans are purged by the data purge job that runs once per day, so you can have more than the configured amount until that task runs.

Ignore queries below this time in (ms) – This will cause the collection to not gather query statistics on queries below this threshold.  Default is 0, collect all.

COLLECT TOP X PERCENT QUERIES BY TOTAL_ELAPSED_TIME – This applies a TOP X PERCENT to the collection of QUERY_STATS.  Default is 100, collect all.

COLLECT TOP X PERCENT QUERY PLANS BY TOTAL_ELAPSED_TIME – This applies a TOP X PERCENT to the collection of QUERY_PLANS.  Default is 100, collect all.

 

If you have a large SQL Server with 256 GB of memory or more and a large Dynamics deployment, then you may want to adjust these values.  The recommendation would be to adjust the number of plans to keep and the TOP X PERCENT query plans first, as these will yield the biggest space savings.

 

The third major improvement for Release Candidate 0 is performance.

 

A few of the scripts (data purge, parse query plans, and database statistics) were causing some blocking and slower performance.  Two fixes were implemented to resolve this issue. First, the procedures were rewritten to select into temp tables to avoid the table locks caused by the complexity of the rules in the queries.  Second, we have added a new SQL Agent job called DYNPERF_PROCESS_TASKS_LOW_PRIORITY.

 

All of the longer-running tasks are now part of this job.  This was done to make the project run more smoothly.  This change will ensure that the QUERY_HISTORY rollup and QUERY_ALERTS tasks run in a timely manner.  There is a new task type called PROCESS_LP that is used as part of the scheduling tables to get the appropriate tasks to run on this new job.

 

There is no additional configuration needed for these new jobs.

 

There were many other small changes made to improve the toolset.  Some of those changes were provided by the community, so “Thank You” again.

 

Please try this new version once posted.  I look forward to hearing your feedback.  I’m especially interested in your feedback on the sample queries and what changes could be used to improve the understanding of the data.  Reporting for DynamicsPerf 2.0 will come after release.  We know Dashboards are a big request and are investigating a solution.

 

Rod “Hotrod” Hansen

Roll-up your backlog board and display an aggregated view on your dashboards

MSDN Blogs - Sun, 06/26/2016 - 12:03

We are pleased to announce the Roll-up Board Widget, created by Mikael Krief, which allows you to display aggregated views of your backlog boards on your dashboards.

Getting started
  • Install the Roll-up Board extension.
  • Edit your dashboard.
  • Select the Roll-up Board Widget (1) and click Add (2).
  • Enter a Title (1), select a suitable Size (2), and select a Backlog (3). Click Save (4).
  • Add and arrange one or more of the widgets on your dashboard.
We look forward to hearing from you

We need your feedback. Here are some ways to connect with us:

Performance issue in VMWare 6

MSDN Blogs - Sun, 06/26/2016 - 07:44

We recently ran into a performance issue with a Dynamics customer that was running VMware 6.  The issue is caused by incorrect support in the VMware network drivers for a new feature in Windows Server 2012 R2.  You can read more about this issue here:

https://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=2129176

Update 2 corrects the issue; for customers who don’t have Update 2 installed, there is a workaround.

 

If you see erratic performance while pulling large amounts of data, such as reports or form loads, then you might have this issue.

Working with the “Missing server side dependencies” error

MSDN Blogs - Sun, 06/26/2016 - 05:56

In this post I describe my methodology for resolving the “Missing server side dependencies” problem.
This example uses SharePoint 2010, but the principle remains the same in newer versions (although, to be honest, I have not checked it on SharePoint 2016). It is important to note that this solution requires direct access to the database.

In the SharePoint Health Analyzer the error looks like this:

Let's think for a second about what this error actually is and where it comes from.

Let's say we have a server-side solution. The solution contains an artifact: a ghosted file, or perhaps a web part. The web part may have been installed into the web part gallery, or added to a page. Later, for various reasons, the web part was removed (usually somebody renames it). But the SharePoint database still contains a record of the old web part.

So what do we do?
First we need to find where on our site the reference to the “missing” component is.

I usually do this with SQL.
Let's take a script that walks through all the tables of the content database and finds the specified string (in our case, WEBPARTFILENAME.webpart).
(In my own practice I have created and lost this script several times, so it is about time to put it in the blog.)

The script is taken from here and slightly modified.

http://stackoverflow.com/questions/591853/search-for-a-string-in-all-tables-rows-and-columns-of-a-db

use WSS_Content -- could be your database name here

DECLARE
    @search_string VARCHAR(100),
    @table_name SYSNAME,
    @table_schema SYSNAME,
    @column_name SYSNAME,
    @sql_string VARCHAR(2000)

SET @search_string = 'yourwebpart.webpart'

DECLARE tables_cur CURSOR FOR SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'

OPEN tables_cur
FETCH NEXT FROM tables_cur INTO @table_schema, @table_name

WHILE (@@FETCH_STATUS = 0)
BEGIN
    DECLARE columns_cur CURSOR FOR SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = @table_schema AND TABLE_NAME = @table_name AND COLLATION_NAME IS NOT NULL -- Only strings have this and they always have it

    OPEN columns_cur
    FETCH NEXT FROM columns_cur INTO @column_name

    WHILE (@@FETCH_STATUS = 0)
    BEGIN
        -- SET @sql_string = 'IF EXISTS (SELECT * FROM ' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ' WHERE ' + QUOTENAME(@column_name) + ' LIKE ''%' + @search_string + '%'') PRINT ''' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ', ' + QUOTENAME(@column_name) + ''''

        SET @sql_string = 'IF EXISTS (SELECT * FROM ' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ' WHERE ' + QUOTENAME(@column_name) + ' LIKE ''%' + @search_string + '%'') SELECT * FROM ' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ' WHERE ' + QUOTENAME(@column_name) + ' LIKE ''%' + @search_string + '%'''

        EXECUTE(@sql_string)

        FETCH NEXT FROM columns_cur INTO @column_name
    END

    CLOSE columns_cur
    DEALLOCATE columns_cur

    FETCH NEXT FROM tables_cur INTO @table_schema, @table_name
END

CLOSE tables_cur
DEALLOCATE tables_cur

Here is what I got after running the script:

So we know that the site with ID 089D94D0-7BF4-4A51-9E29-2BACABC30DF4 contains a record of the missing web part in its gallery, and we can fix this problem with PowerShell or manually.

Here is an example with another web part. In this case the broken web part is in two places: on the main site and on the administrator's personal site.

Unfortunately, we will need one more script. The script above nicely handles a web part in the gallery or a missing file, but it does not handle a web part placed on a page.
Here is another script that finds all web parts on all pages. If you uncomment the last line, it will find the specific web part you need.

SELECT Webs.FullUrl, Webs.Title, AllDocs.DirName, AllDocs.LeafName, AllWebParts.tp_DisplayName, AllWebParts.tp_ID
FROM AllDocs, Sites, AllWebParts, Webs
WHERE Webs.Id = Sites.RootWebId AND AllDocs.Id = AllWebParts.tp_PageUrlID
AND Sites.Id = AllDocs.SiteId AND tp_WebPartTypeId IN (
    SELECT DISTINCT tp_WebPartTypeId FROM AllWebParts (NOLOCK)
    WHERE tp_WebPartTypeId IS NOT NULL AND tp_Assembly IS NULL AND tp_Class IS NULL)
-- if you add this line, the script will find your specific web part instead of giving you all web parts on all pages
--AND (AllWebParts.tp_DisplayName = 'yourwebpartname')

Here is what I got when I ran the script:

Pic4.

Now, depending on the number of problems, you can go through the listed pages manually or with a script and remove your web part.

That said, I once ran into a situation that I could not resolve for a long time.
Several users had added the web part to their personal copy of a page. Finding out exactly who had added it was impossible (there were several thousand users).

Fortunately, I remembered one more good trick: if you append ?contents=1 to the URL of any page, the page opens in web part maintenance mode.

This is how you can reset all “personal” copies of the page.

And this is how you can delete the “broken” web part.

How to deal with “Missing server side dependencies” error

MSDN Blogs - Sun, 06/26/2016 - 05:02

In this post I am going to show my methodology for dealing with the “Missing server side dependencies” error. I do not have a good way to deal with it online, but I do have a good way to deal with it on-premises.

For this example I am using SharePoint 2010, but the problem is the same in newer versions (though, honestly, I did not check it on SharePoint 2016).

This error appears in the SharePoint Health Analyzer and looks like this:

Let us think for a second – why does this error happen, and what is the underlying reason for it?

Let us say your server-side solution has some artifact, in this case a web part. Once upon a time the web part was part of the solution and was deployed somewhere. Then you decided you did not need the web part, or simply renamed it. However, the content database still contains information like “page xxx has web part yyy in this zone with these settings”.

And if the web part no longer exists, the database is inconsistent.

It is not necessarily a web part. You can add a ghosted module file and then decide to rename it. SharePoint will keep the information about the file in the database, but the file itself will not be there.

It could also be that the web part gallery (_catalogs/wp) contains information about your web part, while the web part was part of a farm solution and does not exist anymore.

So the first step is to find where the web part is.

The way I have always done it is via SQL. With SQL you can search through the content database and find the specific string.

I have written and lost this script a number of times; most recently I found a script here:

http://stackoverflow.com/questions/591853/search-for-a-string-in-all-tables-rows-and-columns-of-a-db

I have slightly modified it so that it not only shows the table where the string is stored, but also the specific rows.

use WSS_Content -- could be your database name here

DECLARE
    @search_string VARCHAR(100),
    @table_name SYSNAME,
    @table_schema SYSNAME,
    @column_name SYSNAME,
    @sql_string VARCHAR(2000)

SET @search_string = 'yourwebpart.webpart'

DECLARE tables_cur CURSOR FOR SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'

OPEN tables_cur
FETCH NEXT FROM tables_cur INTO @table_schema, @table_name

WHILE (@@FETCH_STATUS = 0)
BEGIN
    DECLARE columns_cur CURSOR FOR SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = @table_schema AND TABLE_NAME = @table_name AND COLLATION_NAME IS NOT NULL -- Only strings have this and they always have it

    OPEN columns_cur
    FETCH NEXT FROM columns_cur INTO @column_name

    WHILE (@@FETCH_STATUS = 0)
    BEGIN
        -- SET @sql_string = 'IF EXISTS (SELECT * FROM ' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ' WHERE ' + QUOTENAME(@column_name) + ' LIKE ''%' + @search_string + '%'') PRINT ''' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ', ' + QUOTENAME(@column_name) + ''''

        SET @sql_string = 'IF EXISTS (SELECT * FROM ' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ' WHERE ' + QUOTENAME(@column_name) + ' LIKE ''%' + @search_string + '%'') SELECT * FROM ' + QUOTENAME(@table_schema) + '.' + QUOTENAME(@table_name) + ' WHERE ' + QUOTENAME(@column_name) + ' LIKE ''%' + @search_string + '%'''

        EXECUTE(@sql_string)

        FETCH NEXT FROM columns_cur INTO @column_name
    END

    CLOSE columns_cur
    DEALLOCATE columns_cur

    FETCH NEXT FROM tables_cur INTO @table_schema, @table_name
END

CLOSE tables_cur
DEALLOCATE tables_cur

Once I ran the script, I saw this:

So now I know that the site with ID 089D94D0-7BF4-4A51-9E29-2BACABC30DF4 contains a missing web part in its gallery. I could just delete it: a simple Get-SPSite PowerShell command can find me the site and the gallery (something like the sketch below).
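As an illustration only (this is not code from the original post), here is a hedged PowerShell sketch of that idea; the site ID comes from the query result above, and the .webpart file name is an assumption you would replace with your own:

# Hypothetical cleanup sketch: remove a stale .webpart entry from the web part gallery.
# Run from the SharePoint 2010 Management Shell; verify the site ID and file name before deleting anything.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite -Identity "089D94D0-7BF4-4A51-9E29-2BACABC30DF4"   # site collection ID from the SQL result
$gallery = $site.RootWeb.GetFolder("_catalogs/wp")                    # the web part gallery folder

# Find the orphaned gallery entry (file name is an assumption) and delete it
$orphan = $gallery.Files | Where-Object { $_.Name -eq "yourwebpart.webpart" }
if ($orphan) { $orphan.Delete() }

$site.Dispose()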

Here is an example with another web part. This time we can see it happens on two different sites: the default site and the administrator's personal site.

However, running this script is not enough: it handles the gallery and missing files, but not a web part placed on a page. Here is another good script that will find your web part on all pages.

SELECT Webs.FullUrl, Webs.Title, AllDocs.DirName, AllDocs.LeafName, AllWebParts.tp_DisplayName, AllWebParts.tp_ID
FROM AllDocs, Sites, AllWebParts, Webs
WHERE Webs.Id = Sites.RootWebId AND AllDocs.Id = AllWebParts.tp_PageUrlID
AND Sites.Id = AllDocs.SiteId AND tp_WebPartTypeId IN (
    SELECT DISTINCT tp_WebPartTypeId FROM AllWebParts (NOLOCK)
    WHERE tp_WebPartTypeId IS NOT NULL AND tp_Assembly IS NULL AND tp_Class IS NULL)
-- if you add this line, the script will find your specific web part instead of giving you all web parts on all pages
--AND (AllWebParts.tp_DisplayName = 'yourwebpartname')

Results of the script when run with my webpart:

Pic4.

Now, depending on the number of pages and problems, you can either clean the pages manually or write a script that goes through the pages and deletes the web part from them.

One more hint to finish. Recently I met a problem I could not solve easily: some users had added the obsolete web part to their personal view of a page. A script that cleans the shared page could not delete these copies, or even find the affected users! The content database did not help in this case.

The simplest way to remove all personalized settings and the faulty/missing web part from the page is to add ?contents=1 to the page URL. This will bring the page into web part maintenance mode.

Here is how you remove all users' personal settings:

And here is how you delete the problematic web part from the page:

Brexit – Economic Impact Simulation

MSDN Blogs - Sat, 06/25/2016 - 13:17

The following simulation shows why Brexit is such a terrible move!

Assume you have several hundred projects across Europe and a few experts in each country to deliver them. Assigning these experts to the projects follows complex optimization rules. It is quite interesting to see that adding new political constraints to this model changes the cost function so that the optimizer converges on a new global optimum. In the case of the UK leaving the European Union, my worst-case simulation (a 20% cost increase for UK import and export, as a result of increased travel time and administrative and legal effort) shows the following:

Before Brexit: 

You can see that for projects in Europe close to the UK, it made sense to have them delivered by experts from the UK to avoid overcapacity, and experts from outside also traveled to London to help out in case of a resource bottleneck:

The green lines show collaboration in projects between experts from different countries.

 

After Brexit:

In the new optimum, UK experts are assigned to UK projects only. For some projects the optimizer can't even find the required experts anymore, due to a lack of skilled resources within the country. On the other hand, the increased cost will make working outside the UK unprofitable for British workers, ultimately leading to higher unemployment. You can see the island really is isolated from the rest of the world.

Another interesting aspect is that the majority of people will have to commute longer distances to get to their workplace.

 

While the impact on Europe is negligible, the United Kingdom is likely to pay a very high price for this decision.

In this worst-case simulation (not very likely, as the pound gets weaker) the UK lost 100% of its export/import involvement due to a 20% delivery cost increase.

 

Here is the overlay:

 

 

My hope is that this simulation will convince people that the european project is essential for peace and prosperity on this continent and should not be sacrificed to those who promise simple solutions to complex problems.

 

I will give a talk about this computational model in Vienna on Tuesday! Free registration: http://www.eventbrite.de/e/pass-austria-sql-server-community-meeting-juni-ii-registrierung-24865086142

Extensions experiencing – Hang – Does not load – HTTP 401s in browser console

MSDN Blogs - Sat, 06/25/2016 - 08:59
Symptom

You may be experiencing authentication issues with extensions running in an on-premises Team Foundation Server (TFS) 2015 Update 2.x environment. When you try to access an extension point, you may experience a “hanging” extension or an HTTP 401 error in the browser console window. The issue affects all users on the TFS instance.

Cause

You may be missing a server name in the OAuth trusted issuers configuration.

To confirm, run the following SQL query against your TFS Configuration database:

SELECT * FROM tbl_RegistryItems WHERE ParentPath like '%#ConfigurationOAuthTrustedIssuers%' AND PartitionId > 0

If you see more than one row, in particular the “tfs” row, you have a healthy configuration and this post is not applicable to you.

If you only see one row, as below, you may be experiencing the missing server name issue.

Resolution

WARNING

Serious problems might occur if you modify the configuration incorrectly by modifying the database or by using another method. These problems might require that you reinstall your Team Foundation Server. Microsoft cannot guarantee that these problems can be solved. Modify the configuration at your own risk.

Execute the following in SQL, after replacing <dnsname> with your server name:

declare @features dbo.typ_KeyValuePairStringTableNullable

insert into @features values('#ConfigurationOAuthTrustedIssuers<dnsname>', 'Microsoft.TeamFoundation.Framework.Server.OAuth.ClientAuthTokenValidator')

exec prc_UpdateRegistry @partitionId=1, @identityName = '00000000-0000-0000-0000-000000000000', @registryUpdates = @features

Re-run the following SQL query against your TFS Configuration database to confirm that the tfs entry has been added.

SELECT * FROM tbl_RegistryItems WHERE ParentPath like '%#ConfigurationOAuthTrustedIssuers%' AND PartitionId > 0

Reload your browser and verify that your extension is now functioning correctly.

For additional information on troubleshooting 401 errors, see the following article in the Microsoft Knowledge Base: 907273 Troubleshooting HTTP 401 errors in IIS

Applies to

Team Foundation Server 2015 Update 2

Team Foundation Server 2015 Update 2.1

Visual Studio Team Services Extensions are not supported on Team Foundation Server 2015 Update 1 or older

Office 365 E5 Nuggets of week 25

MSDN Blogs - Fri, 06/24/2016 - 23:35

 

“How do I get an app ATO?” Join our next DC Azure Gov Meetup – June 29

MSDN Blogs - Fri, 06/24/2016 - 14:38

If you’re in the DC area, please join us at our next DC Azure Government Meetup, “How do I get an app ATO? — Industry best practices,” starting at 5:30 p.m. on Wednesday, June 29 at the Microsoft Office in DC.

Our monthly Meetups provide you with networking opportunities and feature a variety of speakers who share and discuss best practices, challenges and solutions for government IT modernization.

Be sure to visit the DC Azure Gov Meetup page to join our group, RSVP and get the latest information on our upcoming events.

For an overview of our most recent Meetup event, DoD Azure Best Practices & National Day of Civic Hacking DC, check out Microsoft MVP and Applied Information Sciences CTO, Vishwas Lele’s blog.

Visual Studio 2015 Update 2 ALM VM available on TechNet Virtual Labs

MSDN Blogs - Fri, 06/24/2016 - 14:02

The ALM VM 2015 Update 2 and the corresponding labs are now available on TechNet Virtual Labs – you can simply launch the VM from a browser or a Windows client without having to worry about downloading the entire virtual machine.

Browse all the available labs by filtering Job Role by Developer from the left side navigation on the home page or check out this post for the links to the individual labs. Enjoy!!

BHM Repository Updates List

MSDN Blogs - Fri, 06/24/2016 - 13:33

In BizTalk Health Monitor (BHM) v3.2 we decoupled the Query/Rules and Maintenance repositories from the tool itself.  This was done so we could provide repository updates via Azure without having to release a new version of BHM every time.

BHM v3.2 shipped with the following repositories:

Query Repository
  • Purpose:  Analyzing the BizTalk environment and generating the report (MBV functionality)
  • File Name:  MYHCQueries_MBVQueries.dll
  • File Version:  13.46.0.0

Maintenance Repository
  • Purpose:  Executing maintenance tasks in the environment (Terminator functionality)
  • File Name:  MaintenanceRep.dll
  • File Version:  2.3.5829.33506

 

Below is a list of all subsequent repository updates.  This page will continue to be updated as new releases are made.

See here for more information on how to update BHM in your environment with the latest repositories.

 

Click on the version number for more details on that specific update.

Release Date     Repository Type     File Version
May 2, 2016      Query               13.47.0.0
May 2, 2016      Maintenance         2.4.5941.31641

 

 

Updates in Query Repository 13.47.0.0:

  • Check for BTS2013 CU4, BAP2013 CU3, HIS2013 CU3, Swift MP2015 CU1 & CU2, BTS2013R2 CU2, BTS2010 CU9
  • Check for .NET 4 and .NET 4.6
  • Change SQL2012 SP rule to check for SP3 instead of SP2
  • Change MLLP 64 bits support warning in BTS2013
  • List all BAM Activities with their definition XML file and Online Time Window
  • Show Critical Warning for BTS2010 being out of mainstream support
  • Get all installed programs on each BTS Server
  • Fix rule checking SQL 2008 R2 SP3 not raising warning
  • Other minor fixes and corrections

 

Updates in Maintenance Repository 2.4.5941.31641:

  • Added new task: View Count of Orphaned DTA Tracking Parts and Fragments
  • Added new task: DELETE Orphaned DTA Tracking Parts and Fragment
  • Added new task: View Count of MarkLog Table in All DBs
  • Added new task: Purge MarkLog Table in All DBs
  • Added new task: PURGE FailedTrackingData Table in the BAMPrimaryImport
  • Fixed incorrect counts with View Spool Message Count of Instances task
  • Other minor fixes and corrections

Office 365 news roundup

MS Access Blog - Fri, 06/24/2016 - 13:30

“Tell me and I forget. Teach me and I remember. Involve me and I learn.” Benjamin Franklin, one of America’s most famous inventors, statesmen and self-made men, clearly understood that people learn best by doing. That’s one reason why at Microsoft we make Office 365 intuitive and easy to use. We know that the more you use Office 365 to actually do things, the more quickly you will learn and derive value from its many features and applications.

Several announcements over the past couple of weeks show how we consistently work to provide opportunities for you and your colleagues to get involved with Office 365 in ways that will benefit you and your business.

For Office 365 administrators, the new version of the Office 365 admin app will make it easier and more efficient for you to manage your company’s service from your smartphone when you’re away from the office. And speaking of mobile technology, the new SharePoint mobile app lets you put your intranet in your pocket and take it along, wherever you go. The new app provides quick access to your team sites, company portals and resources, and helps you stay up to speed on collaborative projects. We also explained how Microsoft Power BI in Office 365 can help sales representatives gain insights, avoid missed opportunities and close more deals by making it easier for them to collect, unify and visualize all of their data in one place.

Learning by doing is of course powerful and valuable for educators, as well. We recently announced new Office 365 experiences and updates that will make it easier for teachers to manage their classrooms and collaborate with their peers. Teachers will soon be able to customize Office 365 Groups to create professional learning communities, where they can share expertise and collaborate on ways to improve their teaching skills and students’ academic performance. With Docs.com, they will be able to create attractive online portfolios for their Office documents, and Microsoft Forms will make it easier for teachers to assess student progress and provide real-time personalized feedback.

Take time to look a little more deeply into Office 365 and learn more about what it can do—and what you can do with it. After all, as Ben Franklin said on another occasion, “An investment in knowledge pays the best interest.”

Below is a roundup of some key news items from the last couple of weeks. Enjoy!

Minnesota school district starts the school year right with the help of Microsoft FastTrack—Learn why the Austin Public School District chose Office 365 when teachers asked for more flexible tools, easier network access and better communication with colleagues and students.

Telefónica uses Yammer to stay engaged, aligned and more competitive—Discover how Telefónica is using Yammer, the enterprise social network platform in Microsoft Office 365, to become a global community of engaged employees.

Empowering attorneys and staff with Office 365 helps Kelley Drye transform into a digital workplace—Find out how Office 365 is revolutionizing professional collaboration at this global law firm.

Microsoft adding more teacher-focused Office 365 Education updates—Learn about recent features and updates that Microsoft has added to Office 365 to help teachers.

20 amazing features in Office 365 that you probably don’t know about—Discover some of the valuable but lesser-known features in Office 365 and how they can make your life easier.

Microsoft: Better meetings require better technology—Find out how Microsoft is bringing video conferencing into sharper focus with Office 365.

The post Office 365 news roundup appeared first on Office Blogs.

IIS Tomcat

MSDN Blogs - Fri, 06/24/2016 - 12:41

IIS-Tomcat

Before getting started with IIS and Apache Tomcat, we may want to review some concepts:

 

TOMCAT:

Apache Tomcat (also called Jakarta Tomcat or simply Tomcat) is a servlet container developed under the Jakarta project at the Apache Software Foundation. Tomcat implements the Java Servlet and JavaServer Pages (JSP) specifications from Oracle Corporation (originally created by Sun Microsystems).

We need to have the JRE installed.

Concepts:

JRE: Java Runtime Environment. It is basically the Java Virtual Machine that your Java programs run on. It also includes browser plugins for applet execution.

JDK: the full-featured Software Development Kit for Java, which includes the JRE plus the compilers and tools (such as JavaDoc and the Java debugger) needed to create and compile programs.

Usually, when you only care about running Java programs in your browser or on your computer, you will only install the JRE. It's all you need. On the other hand, if you are planning to do some Java programming, you will also need the JDK.

 

Sometimes, even though you are not planning to do any Java development on a computer, you still need the JDK installed. For example, if you are deploying a web app with JSP, you are technically just running Java programs inside the application server. Why would you need the JDK then? Because the application server converts JSPs into servlets and uses the JDK to compile those servlets. I am sure there are more examples.

 

Remark: a servlet is, roughly, the server-side counterpart of an applet.

A Java servlet is a Java program that extends the capabilities of a server. Although servlets can respond to any type of request, they most commonly implement applications hosted on web servers.

 

 

 

Differences between Apache and Tomcat (Apache Tomcat):

Tomcat works as a servlet container, but today Tomcat can also function as an application server on its own. Apache alone is unable to execute the dynamic content of some pages, and that is where Tomcat comes into play: it handles the execution of these servlets and JSPs (the latter are translated into servlets by Tomcat).

 

Documentation to install Apache + Tomcat

http://www.debianhelp.co.uk/apachetomcat.htm http://www.netadmintools.com/art340.html http://www.debianhelp.co.uk/apachetomcat.htm http://wiki.apache.org/tomcat/HowTo

Apache-Tomcat connector: http://tomcat.apache.org/connectors-doc-archive/jk2/index.html http://tomcat.apache.org/connectors-doc/ http://tomcat.apache.org/tomcat-3.3-doc/mod_jk-howto.html

IIS-Tomcat connector:  https://tomcat.apache.org/download-connectors.cgi   http://tomcatiis.riaforge.org/

 

 

 

Now let's dig into IIS and Tomcat:

 

First you must have installed the following:

  • Apache Tomcat
  • JRE/JDK.
  • The right .NET Framework (since you have an IIS server, you will already have it installed)

 

First, install Tomcat 7.

Please check whether it is running on port 8080, as it should be.

 

Let's create a site in IIS that points to the same Tomcat webroot directory where the web content is executed. Name the site TomcatSite.

Add an Index.html to that directory and then test that it is served as expected.

If so, create an Index.jsp and add it to the default documents on the IIS server.

This will fail because we do not have the Tomcat IIS connector installed yet.

 

 

You can download the connector from http://tomcatiis.riaforge.org

 

Follow the wizard and then try the request again.

IIS will now recognize the .jsp extension.

 

All requests:

Take into account that the connector will only register specific extensions to be passed to Tomcat for processing, e.g. .jsp, .cfm, .cfc.

If you want to forward requests for all kinds of extensions, you can add a catch-all handler that sends everything that is not otherwise recognized to Tomcat over AJP.

This technique can be used for servlets and so on.

To do this, register a * handler in Handler Mappings on the IIS server, select the connector DLL as the executable, and give the handler a name (a sketch of one way to script this follows below).
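As an illustration only (not from the original article), here is a hedged PowerShell sketch of registering such a catch-all handler with the WebAdministration module, assuming the classic ISAPI redirector rather than a managed connector; the handler name, site name, and DLL path are assumptions you would replace with your own values:

# Hypothetical example: register a catch-all handler that hands every request to the Tomcat ISAPI connector.
# The handler name, site name, and DLL path below are placeholders; adjust them to your environment.
Import-Module WebAdministration

New-WebHandler -Name "tomcat-catch-all" -Path "*" -Verb "*" `
    -Modules IsapiModule `
    -ScriptProcessor "C:\tomcat-connector\isapi_redirect.dll" `
    -PSPath "IIS:\Sites\TomcatSite"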

 

 

If you browse now, you will see that all kinds of allowed content are served to the browser.

 

If you want to dig deeper into this matter please go here:

 

Understanding IIS HTTPPlatformHandler using Tomcat 8

Configuring an IIS 7 front-end for Apache Tomcat, using AppCmd.exe

 

Remark: some of this information was taken from Wikipedia for better understanding.

Azure News on Friday (KW25/16)

MSDN Blogs - Fri, 06/24/2016 - 12:13

This week once again brought plenty of news about the Microsoft Azure platform. Here are the details…

Latest news

  • 23.06. Two more things to keep your costs on track in DevTest Labs – New options for cost control in DevTest Labs, with an overview of accrued cost per resource
  • 23.06. Updated high availability and disaster recovery app design guidance – Updated guidance on high availability and disaster recovery when designing Azure-based applications
  • 23.06. Cross-Post: App Service Auth and Azure AD B2C – Easily add authentication via social providers to your own web apps using EasyAuth and Azure AD B2C
  • 22.06. Text Analytics API Now Available in Multiple Languages – The Text Analytics API, part of Cognitive Services, is now available in multiple languages
  • 21.06. Service Bus client 3.2.3 is now live – Service Bus client library 3.2.3 is available, with bug fixes and new features for Event Hubs and Service Bus Relay
  • 21.06. Azure Data Lake: signed up requests will be accepted within one hour – Interesting for all big data and analytics fans: access requests for Azure Data Lake are now approved within one hour
  • 21.06. Running Selective Parts of an Azure ML Experiment – Azure ML now offers the ability to run only selected steps of a machine learning workflow
  • 21.06. Microsoft brings container innovation to the enterprise at DockerCon 2016 – News for Docker users on Azure: Docker Datacenter in the Marketplace, SQL Server in a container, and more
  • 21.06. DNS security appliances in Azure – Best practices for defending your own Azure resources against DNS-protocol-based attacks with DNS security appliances
  • 21.06. General availability of Azure DevTest Labs – VSTS extension – VSTS extension for Azure DevTest Labs is available: create and delete virtual machines as part of builds
  • 20.06. EventProcessorHost for Java Released – EventProcessorHost is now also available in the Azure Event Hubs Java client, making it easy to receive messages from Event Hubs
  • 20.06. Azure Media Services adds support to manage resources through Azure Resources Manager (ARM) – Azure Media Services resources can now also be created and managed through Azure Resource Manager
  • 20.06. New! Get introduced to DAX with Guided Learning – An introduction to and overview of DAX (Data Analysis Expressions), the data analysis language in Power BI
  • 20.06. Application Map: Filter and pin to a dashboard – Application Maps: analyze dependencies between Azure resources and display them in the Azure portal

New courses in the Microsoft Virtual Academy (MVA)

  • 23.06. StorSimple – Everything about StorSimple: hybrid, secure, scalable data storage, both on-premises and in Azure
  • 23.06. Securing Your Data in Microsoft Azure SQL Database – Store and manage data securely in Azure SQL Database; this free MVA course shows how
  • 23.06. Introduction to Microservices – An introduction to microservices; a free online course in the Microsoft Virtual Academy
  • 23.06. Microservices Design and Patterns – MVA course on microservices-based architectures on Azure
  • 23.06. Building Distributed Applications and Microservices-Based Apps on Azure Container Service – Microservices-based architectures on Azure with the Azure Container Service

New videos

  • 24.06. Add Windows Containers behind Docker Swarm Cluster – Part 2 – Provisioning containers on Windows and Linux using Swarm and the Azure Container Service
  • 21.06. Tuesdays with Corey: DockerCon Announcements – Corey Sanders on Mark Russinovich's announcements and demos of Docker on Azure at DockerCon

AX for Retail 2012 R3: How to deploy customized Commerce Runtime services

MSDN Blogs - Fri, 06/24/2016 - 12:12

We get a number of questions on the exact steps needed to create and deploy customizations to the Commerce Runtime (CRT).  Here is a step-by-step process to get up and running with a very simple customization.  These steps should work with both 2012 R3 and AX7 versions of the Commerce Runtime

Method 1:  total replacement of the CRT services assembly
  • Start Visual Studio and open the Commerce Runtime solution.  By default, this can be found in this location:  C:\Users\Administrator\Documents\Retail SDK CU9\Commerce Run-time\Sdk.CommerceRuntime.sln
  • By default, all seven projects in the solution are configured to be signed with a strong name key which isn’t provided:  strongnamekey.snk.  For the purposes of this example, we will create unsigned assemblies and configure the CRT to use them.  As part of your development you will want to sign with your organization’s key and update the SDK source files with the UpdateAssemblyIdentities.ps1 script.  Details on that process can be found on this page.  For now, right-click on each project and select Properties.  On the Signing tab, clear the Sign the assembly checkbox:

  • After saving each project, make sure that the solution compiles before making any code changes.  Select Build > Rebuild solution and make sure you get no errors.
  • The Sdk.Services project points to the Microsoft-signed version of the Runtime.Services.PricingEngine assembly.  This will cause problems when deploying your customized CRT.  To fix this, remove the reference and point it to the PricingEngine project from the solution instead.
    • Expand the Sdk.Services project and then expand the References menu item.  Right-click on Microsoft.Dynamics.Commerce.Runtime.Services.PricingEngine and select Remove.
    • Right-click on References and select Add Reference.  On the Reference Manager form, expand Solution and then select Projects.  Select the Runtime.Services.PricingEngine project and select OK.  Make sure that you actually mark the project using the checkbox (it’s easy to miss).

  • Now make your code changes.  An easy one to test is customizing a receipt using the ReceiptService service.  Go to the Sdk.Services > Receipt > ReceiptService.cs file and find the GetReceiptForXZReport method.  Near the end of this method is a line of code that copies the text of the receipt into the receipt.Body property.  Add an extra line of text to the receipt after this line:

  • Re-build your solution to make sure things didn’t break.
  • Now go look for the resulting assemblies.  By default they are saved in the References subfolder of the Commerce Runtime folder.  Sort by date to make them easy to notice. 

  • For most customizations, you will be most concerned with the Services.dll and Services.Pricing.dll files.  Copy those (and their corresponding PDBs) to a temporary folder for deployment.
  • Copy the files to your CRT folder.  If you are using Retail Server, they will go in the Package\Bin folder.  If you are using the direct database or offline capability for MPOS, they will go in the Retail Modern POS\ClientBroker folder:

  • The final step is to tell the CRT to look for your assembly instead of the Microsoft-signed version.  In the same folder you copied your assemblies to, make a backup of the commerceRuntime.config file.  Open the file in a text editor and change the signature information for the Runtime.Services assembly (Version, Culture, PublicKeyToken, and processorArchitecture).  For this example, our assemblies are unsigned so you can remove the entire highlighted text:

  • Restart your Retail Server and MPOS and test your changes.  Remember to kill the DLLHost.exe process if using direct database mode.  If everything worked you should immediately see your change the next time you print a receipt:

If things don’t work, check your event viewer for any exceptions.  The most common problem that I’ve seen is that the Microsoft-signed version of the PricingEngine is still being loaded.  Dropping and re-creating the reference as noted above usually takes care of the issue.
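As a side note (this helper is not part of the original post), here is a hedged PowerShell sketch of staging the rebuilt assemblies and backing up the config; the source and destination paths and file names are assumptions based on the defaults mentioned above and must be adjusted to your environment:

# Hypothetical staging helper: copy the customized CRT assemblies and back up commerceRuntime.config.
# Both paths are assumptions; point them at your actual SDK References folder and CRT bin/ClientBroker folder.
$source = "C:\Users\Administrator\Documents\Retail SDK CU9\Commerce Run-time\References"
$target = "C:\Program Files (x86)\Microsoft Dynamics AX\60\Retail Modern POS\ClientBroker"

# Keep a backup of the config before editing it by hand
Copy-Item -Path (Join-Path $target "commerceRuntime.config") -Destination (Join-Path $target "commerceRuntime.config.bak") -Force

# Copy the rebuilt Services assemblies and their symbols (verify the names against your References folder)
$files = "Microsoft.Dynamics.Commerce.Runtime.Services.dll",
         "Microsoft.Dynamics.Commerce.Runtime.Services.pdb",
         "Microsoft.Dynamics.Commerce.Runtime.Services.Pricing.dll",
         "Microsoft.Dynamics.Commerce.Runtime.Services.Pricing.pdb"
foreach ($f in $files) {
    Copy-Item -Path (Join-Path $source $f) -Destination $target -Force
}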


Method 2:  Deploy a service in its own assembly


Replacing the entire Runtime.Services assembly is a relatively easy option and can be used in both development and production environments.  But the real power of the CRT architecture shows up when you completely segregate code changes to just the services you are modifying.  Deploying using this methodology is very similar to how you would deploy a service in Enterprise POS.  This also makes things much easier from a maintenance point of view:  if the jump between hotfix releases doesn’t change the code in your services, you don’t have to merge code and redeploy your assembly; you can just install the CRT hotfix and leave your DLL in place until it needs to be changed.

The initial setup steps are a bit more involved, but once that legwork has been completed, continuing development is pretty simple.

  • Launch a Windows Explorer instance and go to the root of your Commerce Run-time folder.  Right-click on the Services folder and select Copy.  Right-click on white space in the root folder and select Paste.  This will make a folder named Services – Copy.  Rename the folder something like ServicesPartner:

  • Go into your new folder and delete all subfolders except for the service you will be customizing.  For this example, we will only keep the Receipt subfolder.  While in the folder, rename the .csproj file as well:

  • Open the Sdk.CommerceRuntime.sln solution in Visual Studio.  Using the notes above, change the Signing on the seven projects so the solution will compile.
  • In Solution Explorer, right-click on the top node of your solution and select Add > Existing Project.  Navigate to your new folder and select the .csproj file.  Again, select all folders except Receipt and remove them from your project.

  • Expand the References node and remove the reference to the PricingEngine assembly since the Receipt service doesn’t use the pricing engine.  This will help avoid any assembly definition issues.
  • Right-click on your project and select Properties.  On the Application tab, change the Assembly name and Default namespace to match your organization:

  • Select the Signing tab and either clear the Sign the assembly checkbox or sign the assembly with your organization’s strong name key.
  • Make your code changes and rebuild your new project.
  • Navigate out to your References folder again and you should see your new assembly:

  • Copy the assembly and its PDB into either the bin folder of the Retail Server or the ClientBroker folder of MPOS.
  • Open the commerceRuntime.config file and add the following line above the line for the Runtime.Services assembly.  If you followed the steps in Method 1, add the signed assembly information back in first (restore from the backup you made).
  • Restart your Retail Server and MPOS and test your changes.   Again, if you’re using direct database or offline for MPOS, kill the DLLhost.exe process so your changes get picked up.

As you can see, the second method really gives you a lot more flexibility on how you develop and deploy customized CRT services and I highly recommend using that approach if possible.

Use the comments below to ask any questions you have on this topic.

Investigating delays with Release notification emails in Visual Studio Team Services – 6/24

MSDN Blogs - Fri, 06/24/2016 - 11:31

Initial Update: Friday, 24 June 2016 18:30 UTC

We are actively investigating issues with the Release Management Service in the South Central US region. Users might experience delays in receiving the Release notification email once their release is completed.

 

  • Next Update: Monday, 27 June 2016 17:00 UTC

We are working to resolve this issue and apologize for any inconvenience.

Sincerely,
Harish

A New Team Services build task to easily extract files

MSDN Blogs - Fri, 06/24/2016 - 10:34

Team Services sprint 101 introduces a new build task, Extract Files.  Use it to easily extract archives during your Team Foundation Server (TFS) build process.  The Extract Files task is cross-platform and uses native unzip, tar, and 7-Zip on Mac and Linux.  For Windows, we bundled 7-Zip with the task so you can extract anything supported by 7-Zip, which means you can extract tarballs on your Windows build machines.

 

The build task is open sourced, so feel free to use it to create your own build extensions. If you have any suggestions or issues, reach out to us on GitHub.

2016-06-24 Release Update

MSDN Blogs - Fri, 06/24/2016 - 09:59

This is a substantial release that includes a new default schema – 2016-04-01-preview.  By default all new logic apps created in Azure should have the new schema and capabilities that come with it.  In addition, existing logic apps have the option to upgrade to the new schema from the logic app resource blade.  More details on the new schema and how to upgrade can be found here:  http://aka.ms/logicapps-2016-04-01

The update is scheduled to roll out 2016/06/24 – it may be a few hours from this update publish time before you see changes in the portal.

Release Notes:

  • Can add Scopes from the designer and in code-view
  • Conditions and Loops visible in designer and are now types of actions
  • New filter action to filter an array
  • DependsOn replaced by the more flexible “runAfter” property to determine execution ordering based on previous action status
  • Rename option in the designer to rename actions
  • Support for the 'nullable' indicator '?', which will automatically be added for optional values.
  • @result() and @workflow() functions.
    • @result('scopeName') will return an array of action results, status, and inputs/outputs for evaluation and exception handling
    • @workflow() will return an object with details about the run – for example, @workflow()['run']['name'] to get the ID
  • Request trigger with no response action returns run name as a response header

Bug Fixes:

  • Using functions for fields like HTTP Header would sometimes show an invalid JSON error
  • The '?' operator would be removed when loading the designer

Podcast and Overview: Microsoft Dynamics CRM 2016 Update 1.0 (Service Pack 1)

MSDN Blogs - Fri, 06/24/2016 - 09:19

Contents:

We’re proud to announce that all packages for Microsoft Dynamics CRM 2016 Update 1.0 (Service Pack 1, codenamed Naos) were released May 23rd, 2016 to the Microsoft Download Center! These packages will appear on Microsoft Update shortly.

Note the naming convention change! Post-RTM Updates used to be called Update Rollups, now they’re just called Updates with the version number:

Was: Microsoft Dynamics CRM Update Rollup 1 or 2

Is now: Microsoft Dynamics CRM Update 0.1 or 0.2

For more details, see the Dynamics CRM Product Group blog “New naming conventions for Microsoft Dynamics CRM updates”.

Microsoft Dynamics CRM 2016 Update 1.0 Build number:

8.1.0.359

Microsoft Dynamics CRM 2016 Update 1.0 Microsoft Download Center page

Here’s the “Master” Microsoft Dynamics Knowledge Base article for Microsoft Dynamics CRM 2016 Update 1.0: (KB 3154952). Going forward, the plan is to continue publishing Master Knowledge Base articles for CRM Updates a bit in advance of release to aid planning.

Podcast

On Monday, June 27th 2016, Greg Nichols and Ryan Anderson from the Microsoft CRM Premier Field Engineering Team will provide information about:

  • The release of Microsoft Dynamics CRM 2016 Update 1.0
  • New fixes made available in Microsoft Dynamics CRM 2016 Update 1.0
  • New functionality made available in Microsoft Dynamics CRM 2016 Update 1.0
  • Deprecated functionality in Microsoft Dynamics CRM 2016 Update 1.0

during their Microsoft Dynamics CRM 2016 Update 1.0 podcast.

https://msdnshared.blob.core.windows.net/media/2016/03/CRM2016u01.mp3

Note regarding Podcasts: We’ve recently changed the location of where we are hosting and distributing our podcasts. See PFE Dynamics Podcast Update for more information. To download the podcast audio file, right-click here, and choose to save the link location or file locally.

Go to Top

The “CRM Update Rollup Collateral Page”

For pointers to download locations, release dates, build information, and CRM Premier Field Engineering blogs and podcasts for all supported Microsoft Dynamics CRM Updates, Update Rollups, and Service Packs, visit the “CRM Update Rollup and Service Pack Collateral Page”.

Go to Top

Important note:

An updated Unified Service Desk for Microsoft Dynamics CRM (Build 2.0.1.426) has been released. See the following Microsoft Download Center webpage for download details:

Unified Service Desk for Microsoft Dynamics CRM

General Upgrade Rollup and Service Pack Notes:

  • Testing CRM Update Rollups: Best Practices
    • Microsoft Dynamics CRM Premier Field Engineering recommends doing all the standard testing you generally do for all Updates, which could be the functional and performance testing that you would do with a new major release or a subset of that test plan
    • The “general rule of thumb” for test plans for Update Rollup installs are:
      • Test any changes in a pre-production environment BEFORE introducing into your production environment. Manage your risk!
      • Consider using the Performance Toolkit for Microsoft Dynamics CRM to simulate your production user load in your testing environment to shake out any performance-related issues early. The link points to a recently-released version of the Toolkit reworked to support CRM 2016! Talk to your TAM (Technical Account Manager) if you want Premier Field Engineering to help your team install and configure it!
      • Test using the permissions your most restrictive end-user roles have. Testing with CRM Administrator permissions, for example, does not give you the complete picture
      • Concentrate on your SDK customizations, JavaScript, ISV add-ons – basically anything that’s not OOB functionality or customizations done from within the UI

Go to Top

Microsoft Dynamics CRM 2016 Update 1.0 packages are available for download via:

  • The Microsoft Dynamics CRM 2016 Update 1.0 Microsoft Download Center page – released May 23rd, 2016
  • The Microsoft Update Catalog – to be released shortly
  • The Microsoft Update detection / installation process
    • Note: Microsoft Dynamics CRM 2016 Updates will be pushed via Microsoft Update as Important updates
    • Client packages installed manually by downloading the packages and running install will require local administrator privileges. If the client packages are installed via Microsoft Update or SCCM (System Center Configuration Manager), they will not require local administrator privileges
    • Consider using Windows Server Update Services (WSUS) or similar software distribution technologies to distribute Dynamics CRM Update Rollups internally. WSUS is a locally managed system that works with the public Microsoft Update website to give system administrators more control. By using Windows Server Update Services, administrators can manage the distribution of Microsoft hotfixes and updates released through Automatic Updates to computers in a corporate environment
    • For help with installation please see the Installation Information section of the Microsoft Dynamics CRM 2016 Update 1.0 “master” Microsoft Knowledge Base article
    • Please review Jon Strand’s blog posting “CRM 2011: Silently Installing Update Rollups” which provides details on installing CRM Outlook client update rollups “silently” in order to limit end-user interruption, which also applies to CRM 2015 Updates for these CRM components:

Microsoft Dynamics CRM Server 2016

Microsoft Dynamics CRM 2016 for Microsoft Office Outlook (Outlook Client)

Microsoft Dynamics CRM 2016 Email Router

Microsoft Dynamics CRM 2016 SSRS (SQL Server Reporting Services) Data Connector

The SSRS Data Connector is not available as an individual download. It is included in the Microsoft Dynamics CRM Server 2016 download. When you extract the Server package (CRM2015-Server-ENU-amd64.exe /extract:path extracts the content of the package to the path folder), you’ll find the Data Connector in the SrsDataConnector folder

Microsoft Dynamics CRM 2016 Language Packs (aka Multi-Language User Interface / MUI)

Microsoft Dynamics CRM 2016 Report Authoring Extension (with SQL Server Data Tools support)

Microsoft Dynamics CRM 2016 List Component for Microsoft SharePoint Server 2010 and Microsoft SharePoint Server 2013 (for multiple browsers)

Go to Top

Microsoft Dynamics CRM 2016 Update 1.0 Prerequisites:

  • Essentially the prerequisites listed in the Microsoft Dynamics CRM 2016 Implementation Guide download or Online TechNet for the various CRM components serviced

Go to Top

Issues resolved via Microsoft Dynamics CRM 2016 Update 1.0:

Microsoft Dynamics CRM 2016 Update 1.0 contains fixes for issues reported by customers or discovered via internal testing.

Fixes released via Microsoft Dynamics CRM 2016 Update 1.0:

  • Resolve incorrect navigation property names during upgrade from CRM 2016 RTM to CRM 2016 Update 1.0
  • Outgoing and incoming e-mail stops processing for all organizations
  • Using Internet Facing Deployment the OWA App is not loaded in Edge browser
  • Using Compose Mode in OWA is adding a known lead into the “To” Field and clicking retry throws an error
  • Incorrect numbers are displaying for Recent Cases and Opportunities on the Record form
  • Unable to create Opportunities if Business Process Flow exists
  • Not able to add members from one Marketing List to another
  • Update logic of Record Creation Rules automatically updates the Regarding Object Entity data
  • Activities, Contacts and Tasks are not synced when connecting CRM Online to Exchange On Premises in Hybrid mode
  • RetrieveInlineSearchResults doesn’t filter lookup types by Read/Append Privilege
  • Incorrectly modified on date displayed in Social Pane for Activity Records created after 6:30 PM
  • Large workflows are slow to execute in CRM Online
  • IOS and Android Dynamics CRM apps fail to configure if an uppercase value is in the organization URL
  • Fixed missing publication warning dialog when user performs any action in Activity Feeds configuration
  • Navigation after Related Record Grid operation is redirected to a Form instead of a View
  • The OptionSet control methods for StatusCode Field is not working
  • Bulk edit on entities causes the Status Reason Field to change back to the default value
  • Incorrect Next Page link for related entities
  • Quote Product, Order Product, and Invoice Product Forms are updated
  • Cloning a Product causes sharing of the image
  • Server Side Synchronization Performance Dashboard should have a name or description that indicates that it is for troubleshooting
  • Can create Navigation properties with the same Name on an Entity
  • Unable to add Contacts from one static Marketing List to another
  • The message “Web browser trying to close the window” appears when attempting to use the CRM app for Outlook in Internet Explorer or Microsoft Edge
  • Outgoing E-mail, and Incoming E-mail stop processing for the organization
  • Uninstalling a Managed Solution will cause Business Rules to be Deactivated
  • (Microsoft Dynamics CRM Online 2016 Update 1 Only) Slow Performance when opening Customize the System, and other associated Views
  • (Microsoft Dynamics CRM Online 2016 Update 1 Only) Getting error while installing sample data on Finnish (1035), Hungarian (1038), and Norwegian (1044) languages

Go to Top

Support for new technologies provided by CRM 2016 Update 1.0:

The Microsoft Dynamics CRM Engineering team consistently tests Microsoft Dynamics CRM and associated CRM Updates against pre-release and release versions of technology stack components that Microsoft Dynamics interoperates with. When appropriate, Microsoft releases enhancements via future Microsoft Dynamics CRM Updates or new major version releases to assure compatibility with future releases of these products. This compatibility matrix is updated via this Microsoft Knowledge Base article: Microsoft Dynamics 2016 CRM Compatibility List.

Microsoft Dynamics CRM 2016 Update 1.0 provides no new support for technologies, though CRM 2016 RTM does. Consult the Microsoft Dynamics 2016 CRM Compatibility List to identify newly-supported technologies.

Hotfixes and updates that you have to enable or configure manually

Occasionally, updates released via Microsoft Dynamics CRM Updates require manual configuration to enable them. Microsoft Dynamics CRM Updates are always cumulative; for example, Update 0.2 will contain all fixes previously released via Update 0.1 as well as fixes newly released via Update 0.2. So if you install Update 0.2 on a machine upon which you previously installed no Updates, you will need to manually enable any desired fixes for Update Rollups 0.1 – 0.2:

  • Microsoft Dynamics CRM 2016 Update 1.0: no updates requiring manual configuration

Go to Top

Greg Nichols
Dynamics CRM Senior Premier Field Engineer
Microsoft Corporation

Split a Large Row-Formatted Text File using PowerShell

MSDN Blogs - Fri, 06/24/2016 - 07:43

I move a lot of large, row-formatted text files into Azure Storage for work I do with HDInsight and other technologies.  I land these files as block blobs which means that individual files must stick below the 200 GB block blob size limit.  Azure Data Lake Store does not have this limitation and most of the technologies I work with support compression, but let’s assume these are not options for the specific work I want to perform.

The way I typically work around this limitation is to split my files into smaller files that are below the max allowed size. There are several freely available file splitters on the Internet for this but they typically split the files based on a number of bytes which means that the split will likely occur in the middle of a row.  As these splitters do not recognize rows, they also do not recognize that my file may have a header which I would like replicated in the output files. For these reasons, I wrote a custom script in PowerShell to split my files.

The script requires you to target an input file, define a max number of bytes per output file, and identify whether the file has a header row. It then reads the input file line by line and writes these to a series of output files, preserving the original order of the rows, and making sure each file stays below the specified max size. If a header row is present, it is written to each output file.  The encoding of the original file is detected and preserved in the output files.

There are two things I’d like to add to the script but will simply need to return to at a later date.  First, I’d like the script to detect the row-delimiter used by the original text file and then use this in the output files. As it currently stands, you must tell the script what row-delimiter to use in your output files.  Second, I’d like the script to assess the size of the original file and attempt to create output files that are more consistently sized.  This one is tricky if your rows are highly variable in size, so I decided to keep the script simple for now.

One other thing about the script: because I am using this with very large files, I made use of the .NET Framework to read and write lines of data. If you do a quick search, you’ll find a ton of posts showing the performance impact of this.  I found in some simple testing that the performance benefits of using the .NET Framework instead of the built-in PowerShell cmdlets, i.e. Get-Content, Set-Content and Add-Content, were tremendous.  .NET adds complexity, but it was totally worth it in this scenario. (A rough sketch of the overall approach appears below.)
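The original script itself is not reproduced in this post, so purely as an illustration of the approach described above, here is a minimal, hedged PowerShell sketch; the parameter names, output naming convention, and rollover details are assumptions rather than the author's code:

# Hypothetical sketch (not the author's script): split a row-formatted text file into pieces below a size limit,
# preserving row order and encoding, and repeating the header row in each output file.
param(
    [string]$InputFile,
    [long]$MaxBytesPerFile,
    [bool]$HasHeader = $false,
    [string]$RowDelimiter = "`r`n"
)

# Detect and preserve the input file's encoding via StreamReader
$reader = New-Object System.IO.StreamReader($InputFile, $true)
$null = $reader.Peek()
$encoding = $reader.CurrentEncoding

# Read the header once so it can be repeated in every output file
$header = if ($HasHeader) { $reader.ReadLine() } else { $null }

$fileIndex = 0
$writer = $null
while (($line = $reader.ReadLine()) -ne $null) {
    # Roll over to a new output file when none is open or the current one has reached the size limit
    # (the check is approximate because of writer buffering)
    if ($null -eq $writer -or $writer.BaseStream.Length -ge $MaxBytesPerFile) {
        if ($writer) { $writer.Flush(); $writer.Close() }
        $fileIndex++
        $outPath = "$InputFile.$fileIndex"      # output naming convention is an assumption
        $writer = New-Object System.IO.StreamWriter($outPath, $false, $encoding)
        $writer.NewLine = $RowDelimiter
        if ($null -ne $header) { $writer.WriteLine($header) }
    }
    $writer.WriteLine($line)
}
if ($writer) { $writer.Close() }
$reader.Close()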
