We are pleased to announce that the Building a Release Pipeline with Team Foundation Server 2012 hands-on labs now support both TFS 2012 and 2013.
There are currently no plans to further enhance the book, hands-on labs, or sample application. Our focus will shift to other DevOps initiatives, such as Config as Code. Please send candid feedback!
We can’t wait to hear from you, and learn more about your experience using the add-in. Here are some ways to connect with us:
Purchase from these online retailers:
Prepare for Microsoft Exam 70-414—and help demonstrate your real-world mastery of implementing an advanced server infrastructure in Windows Server 2012 R2. Designed for experienced IT professionals ready to advance their status, this Exam Ref focuses on the critical-thinking and decision-making acumen needed for success at the MCSE level.
Focus on the expertise measured by these objectives:
· Manage and maintain a server infrastructure
· Plan and implement a highly available enterprise infrastructure
· Plan and implement a server virtualization infrastructure
· Design and implement identity and access solutions
This Microsoft Exam Ref:
· Organizes its coverage by exam objectives
· Features strategic, what-if scenarios to challenge you
· Assumes you have experience planning, configuring, and managing Windows Server 2012 R2 services, such as identity and access, high availability, and server infrastructure
About the Author:
Steve Suehring is a technical architect and writer with extensive experience in system administration, networking, computer security, and software development.
The latest tool for debugging shaders, the Visual Studio Graphics Debugger, now ships as a feature of Microsoft Visual Studio.
This new tool is a replacement for the PIX for Windows tool. Visual Studio Graphics Debugger has greatly improved usability, support for Windows 8 and Direct3D 11.1, and integration with traditional Visual Studio features such as call stacks and debugging windows for HLSL debugging.
For more info about this new feature, see Debugging DirectX Graphics.
Pearson VUE has been added as a new exam delivery provider for Microsoft Certification exams. Starting September 4, 2014, candidates may choose between Prometric and Pearson VUE to complete their Microsoft Certified Professional (MCP) and Microsoft Technology Associate (MTA) exams. Microsoft Office Specialist (MOS) and academic MTA exams will continue to be delivered through Certiport.
As announced at the Worldwide Partner Conference, Microsoft is consolidating all exam delivery provider services with Pearson VUE and Certiport, a division of Pearson VUE, effective January 1, 2015. This will enable Microsoft to provide a more seamless customer experience through the Microsoft Certification exam testing process.
Some Frequently Asked Questions (FAQ)
When do the changes take effect?
Starting on September 4, 2014, you may choose between Prometric and Pearson VUE
On January 1, 2015, all exams will be provided by Pearson VUE only
Which exams are affected?
MCP and MTA exams
Why are you making the change?
In order to provide a more seamless customer experience, Microsoft is consolidating all exam delivery provider services with Pearson VUE and Certiport, a division of Pearson VUE, effective January 1, 2015.
Where can I find Pearson VUE online?
Had some social time with Tampa Bay WAVE-based startup guy EricN at a fine local establishment in Tampa the other night. We were discussing the MEAN stack he's been using on his project via (argh) AWS virtual machines.
MEAN led to a discussion of Windows Azure and its inherent but understated support for the MEAN stack across the board. But I really didn't have a quick set of resources available with bootstrapping, development, and scaling info. For posterity and SEO, below is an excerpt of my favorite learning, deployment, and scaling resources for MEAN on Windows Azure.
So here are my favorite links in each of the MEAN categories. Enjoy. Got extras talking about Azure + MEAN? Post them in the comments and let's build the article base.
(really part of Node.js - see node.js below)
This week, the University of Washington is sharing hosting responsibilities with Microsoft for this year's Imagine Cup World Finals. The event acts as a moment in time for communities and universities across the globe to celebrate student entrepreneurship and creativity. Most importantly, it's a ton of fun to watch.
The VP for Developer Experience and Evangelism in the U.S., Sanket Akerkar, shared a little of that Imagine Cup spirit earlier today with one word of advice, not only for #TeamGrantFellow, the team representing the U.S. on the Imagine Cup world stage this year, but for all competitors: Fun.
Fun, as in don't let the chaos of this amazing event scream by without taking a moment to really soak in its spirit: showcasing some of the most promising and talented entrepreneurs and developers from across the world.
What one word of advice would you offer up for teams at Imagine Cup? You can follow the U.S. team's journey this week and share your words of wisdom or support with the team out of U Penn using the #TeamGrantFellow hashtag on Twitter. Here are a couple of examples of folks who are already cheering on the team from the U.S.
Visual Studio Tools for Unity is Microsoft's free Visual Studio add-on that enables a rich programming and debugging experience for working with the Unity3D gaming tools and platform.
Here are the highlights in today’s 1.9 release:
This will be continually updated, so check out the changelog.
For everyone who read today's VS tip and got a taste for a shared-code template: I've created one. You can find it in the Visual Studio Gallery. I'll also release the source code tomorrow, so you can adapt the project template however you like.
You can find the template here:
Have fun with it! Please note that I can't offer support for the template; it's only meant to show you what's possible and perhaps inspire you to build a more polished one yourself.
Nearly 300 MVPs, technology enthusiasts and SQL fans joined forces in Germany to participate in a hackathon and SQL Saturday event featuring MVP presenters. The two-day event was organized by SQL Server MVPs Constantin Klein, Tillmann Eitelberg and Oliver Engels. We had the chance to catch up with Constantin “Kostja” Klein to get an inside look into the event.
What was the inspiration for creating such a unique community event?
"Since BIG DATA is a hype-topic, we decided to favor a Hackathon over a regular Pre-Con in order to allow attendees to really get a first, hands-on experience with reference to the existing technologies on the Microsoft platform, like HDInsight and PowerBI. With Azure MVP Sascha Dittmann, Scott Klein and Emil Siemens we also found the right people to introduce the tools to the attendees and help them on occurring problems."
What was the highlight of the event?
"The highlight was the presentation of the results at the end of the day. This was when all other teams had the chance to find out about different approaches to the same problem, different ways of visualization, etc.
By the way, the challenge we prepared for the day was to extract some interesting insights and visualizations from more than 250,000 tweets collected during [the] football World Cup with [the] hashtag #WorldCup. And we actually had some interesting findings; for example, the UK seems to have a disproportionately high number of people who both like football and use Twitter.
We wanted attendees to use the Microsoft cloud technology stack. Therefore we had Azure accounts prepared and helped people to get the environment (Cloud Storage, Azure SQL Database and HDInsight cluster) up and running. People helped each other and we started the day with building teams of up to five people who then worked together. In fact, most of the teams were not colleagues and had never worked together before."
What is the benefit of attending such a great community event?
"We believe that you get a much better kick-start for dealing with a brand new topic – which it was for almost all attendees – when you really have time to start a project and play with the technology. So during the wrap-up, many attendees confirmed that they now have real first-hand experience they can take home and elaborate on. This is totally different from just listening to a whole-day lecture. Obviously a lecture could cover many more details, but an attendee would not be able to immediately reproduce what he heard or had seen. Working in mixed groups is another interesting aspect, which helps people to deal with and adjust to new situations."
Congratulations to the MVP organizers, presenters and all the participants!
Today we released the Visual Studio Tools for Unity add-on (formerly known as UnityVS). It is now available for download on the Visual Studio Gallery at the following links:
VSTU is Microsoft’s free Visual Studio add-on that enables a rich programming and debugging experience for working with the Unity gaming tools and platform. This is our first release since the acquisition of SyntaxTree, and we’re excited to have the opportunity to reach out to the Unity community with Visual Studio.
Here are the highlights in today’s 1.9 release:
And many more new features and bug fixes, as you can see in our changelog. If you have any suggestions for VSTU, please post them on UserVoice, and if you encounter any issues, please report them through the Visual Studio Connect site.

Jb Evain, Senior SDE Lead, Visual Studio Platform Team
Jb runs the Visual Studio Tools for Unity experience for the Visual Studio Platform team. He recently joined Microsoft as part of the acquisition of SyntaxTree, a company he founded and where he led the development of UnityVS. He has a passion for developer tools and programming languages, and has been working in developer technologies for over a decade.
Check out the following video about Jared Poole, developer and creator of the "Pass a PFT" (Physical Fitness Test) app for the US Air Force.
Every developer has a story. Do you have one? We would love to hear about it.
View the original post at Channel 9: http://channel9.msdn.com/Blogs/Every-dev-has-a-story/Jared-Poole-Developer-and-Creator-of-Pass-a-PFT-for-Windows-Phone?linkId=8747239
AppFabric is a key part of any SharePoint 2013 deployment, as we all know by now. It’s used all over the product, and AppFabric in turn uses Active Directory to make sure only authorised processes get access to the cache data.
Thus when SharePoint uses AppFabric, it opens a new TCP connection and AppFabric has to authenticate that connection with Active Directory, just as when a user/browser connects to a Windows-authentication-protected website. As with the browser/IIS connection, there are two ways of authenticating to AppFabric – NTLM or Kerberos, with NTLM being picked if Kerberos isn’t set up. Kerberos is the more efficient protocol all round, so we want to use it over NTLM to reduce traffic to AD.
If you have Kerberos logging enabled on a web-front-end for example, something you’ll notice is these SPNs being requested and not being found:
Notice anything about the SPN? Yep, it’s the AppFabric server we’re trying to use – the client (SharePoint) is trying to connect via Kerberos and it’s not working, so we silently fall back to NTLM. That works, of course, as far as SharePoint’s concerned, but we have generated more traffic than we wanted due to this security-protocol fall-back.

Add AppFabric SPNs to Reduce Calls to Active Directory
As with HTTP, we want to use Kerberos for AppFabric authentication to reduce the extra logins to Active Directory. This is done simply by adding the right SPNs to the AppFabric service account, in the form of:
This you do with either ADSI Edit or setspn; in my example:
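A sketch of what the setspn step can look like follows. Note that the host, domain, and service-account names below are placeholders, and the "AppFabricCachingService" service class is my assumption; match it against the SPN your Kerberos log actually shows as missing before registering anything. The script only prints the commands rather than running them:

```shell
# Sketch only: print the setspn commands to register AppFabric SPNs for a
# service account. CACHE01, contoso.local and svc-appfabric are placeholder
# names, and "AppFabricCachingService" is an assumed service class -- verify
# it against the SPN reported as not found in your Kerberos log.
appfabric_spn_cmds() {
  host="$1"; domain="$2"; account="$3"
  # setspn -S duplicate-checks the SPN before adding it
  echo "setspn -S AppFabricCachingService/$host $domain\\$account"         # NetBIOS host name
  echo "setspn -S AppFabricCachingService/$host.$domain $domain\\$account" # FQDN host name
}

appfabric_spn_cmds CACHE01 contoso.local svc-appfabric
```

Registering both the NetBIOS and FQDN forms covers whichever name the client resolves when it opens the cache connection.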
That should take care of both SPNs, but you might want to try adding the NetBIOS one manually too – Win2012 will duplicate-check the SPN first anyway, so there’s no risk. Check the security log with Kerberos logging enabled to double-check if you’re unsure. Restart AppFabric (and, if I remember right, IIS may be necessary too) and you should be good to go – AppFabric will make significantly fewer calls to Active Directory as it’ll use a Kerberos token instead.
If this is of particular interest I can probably expand on this subject, but that’s it for now.
// Sam Betts
The “MS Removal Tool”, or MSRemovalTool, is malware. It is not a Microsoft product. This kind of malware is known as “rogue security software” because it imitates a real product; in this case, the Microsoft Malicious Software Removal Tool.
If you’re infected with this malware you might see a MS Removal Tool window when you start your computer and you might not be able to access your desktop. You might not be able to start Task Manager, and you might not be able to open Internet Explorer or any other programs.
The window might look like this:
The warning in your notification area might look like this:
Microsoft security software detects and removes this threat, but if you already have it you might need to boot your computer into Safe Mode in order to remove it.
This year’s World Cup was filled with unbelievable surprises: Miroslav Klose’s record-breaking goal total, Germany’s thrashing of Brazil, Spain’s unfortunate crash-out, the United States’ resilience through the Group of Death, and moments of Messi magic. For the stat geek in all of us, Power BI Q&A made it easy to put the game in context and helped you make better predictions about the outcome. If you didn’t have a chance to ask Q&A about the World Cup during the tournament, you can still give it a try here.
We heard from many of you how amazing it was to see the breadth of questions Q&A could answer on the World Cup. If you’d like a glimpse into how we can teach Q&A a whole new sport in a matter of a day, read on!
In this post we will discuss how we used the breadth of tools available in Excel and Power BI to bring this experience to you, including:
We partnered with Opta Sports, a leading provider of historical sports statistics, to give Q&A all of the facts you’d expect a sports statistician to know. The historical data was delivered as a series of XML files with the statistics for each game organized in folders by year. Each file contained information on the teams, coaches, referee, players, goals, cards, substitutes, and game statistics. This isn’t the easiest way to import data, but luckily Power Query has robust support for handling different data formats and folder structures.
We used the From Folder option in Power Query to import all the data at once; each row contained the path to an individual XML file. Power Query can also read in the content of an XML file: under Content, clicking on “Binary” lets you expand out all the data within an XML file. After deleting and renaming a few columns, you end up with data for each game (see image below). As you can see, the value “Table” appears in a lot of cells. That’s because Opta’s XML feed was highly nested, but that nesting allowed for easy organization and navigation of the data.
We had to use a different approach for the 2014 data since it was updated in real time from Opta as the games progressed. Using the From Web option in Power Query, it was easy to import the latest data from each game and refresh it whenever we wanted. If you’d like to learn more about how to connect to almost any data source with Power Query, check out Getting Started with Power Query Part I.
Getting the right format with Power Query
Once we pulled in the data, we had to create a query for each table we wanted in our data model: games, goals, cards, referees, managers, stadiums, etc. The motivation behind creating a query for each table is that when we load each table to the data model, we can easily create the relational diagram required to make Q&A shine. Most queries involved expanding out nested tables and filtering columns to get one query for one particular table. One piece of Power Query functionality we would like to point out is the advanced editor, which gives you full control over the transformation of your data. A full tour of the advanced editor is beyond the scope of this post, but we wanted to highlight the scenario of pivoting tables. In the Power Query editor, under the View tab, there is a button to launch the advanced editor (shown below):
Chris Webb wrote a great blog post on pivoting tables.
After each query was created for the particular tables we wanted, we loaded them to the data model for Power Pivot to consume.

Power Pivot
When you load to the data model from Power Query, your tables are automatically added to Power Pivot. All we had to do in Power Pivot was match unique IDs to one another to create relationships, so we know which players had goals and cards, and which referees, teams, players, and coaches were in a particular game. With the relationships created, we could take this to Power Q&A.

Power Q&A
The model is now ready to be used in Power Q&A. However, we optimized the model further using the new cloud modeling environment so we could ask a wider variety of questions. The cloud modeling environment is also an easy way to manage synonyms for your columns and tables. Since all the changes are saved to the cloud, everyone reaps the benefits.
Documentation on how to get started with the cloud modeling environment can be found here. As an example of how to make a wider variety of questions work, we created the phrasing “aggressive team has large average cards per game”. The inclusion of this phrasing allowed us to ask questions like:
You can click the links above to see how one good phrasing allows Q&A to give really great answers!

Crawl, then walk
As you can see, there’s a lot more to teaching Q&A than just understanding natural language. Finding the right data, choosing the right tables and relationships, and data modeling all contribute to a great experience in Q&A. Luckily, the full suite of Power BI tools like Power Query and Power Pivot make it easy to achieve powerful results in a few minutes.
Download the LitDev Extension :
It has loads of extra functionality with over 40 new objects and over 700 methods, properties and events; including 2D physics, 3D rendering, controls, dialogs, graphing, statistics, matrices, lists, sorting and searching (Regex), faster arrays, lists and graphics, USB and COM connection, SQLite or MySQL databases, webcam, multiple GraphicsWindows, scrollbars, image processing, date and time handling, debugger and lots more.
See http://litdev.hostoi.com for additional details, online documentation and source code.
As with all Small Basic extensions, copy the extension dll and xml (inside the download zip) to a lib sub-folder of the Small Basic installation folder, usually in one of the following locations:
C:\Program Files\Microsoft\Small Basic\lib
C:\Program Files (x86)\Microsoft\Small Basic\lib
You may need to create the lib folder if it doesn't exist.
Copy all other files (documentation and samples) elsewhere.
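The install steps above can be sketched as a small script. This is a sketch only: the default lib path is the typical location listed above, and it assumes the extension's dll and xml have already been extracted from the download zip; pass your own lib path if Small Basic is installed elsewhere.

```shell
# install_sb_extension: copy a Small Basic extension's dll and xml files
# into the Small Basic lib sub-folder, creating the folder if it doesn't
# exist. The default lib path below is the usual install location; pass a
# different path as the second argument if your install lives elsewhere.
install_sb_extension() {
  src_dir="$1"
  lib_dir="${2:-C:/Program Files (x86)/Microsoft/Small Basic/lib}"
  mkdir -p "$lib_dir"                               # create lib if missing
  cp "$src_dir"/*.dll "$src_dir"/*.xml "$lib_dir"/  # the extension dll + xml
}
```

For example, `install_sb_extension ./LitDevExtract` copies the extracted dll and xml into the default lib folder.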
Thanks to LitDev for providing this great Extension!
- Ninja Ed
Part 1 of this series provided a high level overview of the components that make up the chart of accounts in Microsoft Dynamics AX 2012. Part 2 focused in on the chart of accounts components. Part 3 talked about the financial dimensions. Part 4 focused on the new account structures that were introduced in AX 2012. Part 5 discussed the relationship of an organization hierarchy with an account structure. Part 6 discussed the advanced rules for an account structure. This final posting of the series will describe the balances functionality using dimension sets in AX 2012.
Prior to AX 2012, Dynamics AX calculated balances as part of posting for every possible segment combination of an account number. This had a major impact on the performance of posting into the general ledger. To improve posting performance, the balance calculation was removed from the posting process, and it has seen some major changes between the initial AX 2012 release and the most recent R3 release. Let me describe them to you.
Beginning with AX 2012, balances are only calculated based on how you have set up the financial dimension sets. A financial dimension set is an ordered set of financial dimensions. Out of the box, we ship one dimension set that includes only the main account segment. Users can set up additional dimension sets based on the level of balance you care about. So if you want to view the trial balance by BusinessUnit and MainAccount, you need to create a financial dimension set that includes those financial dimensions. Conversely, if no financial dimension set covers a segment, say Department on its own, balances will not be calculated for that segment.
You can create as many financial dimension sets as you need, but the set of selected financial dimensions must be unique across the financial dimension sets. For example, you can only have one dimension set that only contains MainAccount but you can set up another dimension set that contains MainAccount as well as another dimension.
When creating a new financial dimension set, the balances must be initialized. Once initialized, the button label will change to Rebuild balances. Rebuild balances does the same thing as initializing by clearing the balance and recalculating. You should only need to do this if you don't think the balances are correct for some reason.
The Update balances button will update the balances with anything posted to the general ledger since the last time the balances were updated. You can schedule the update as a batch process which may or may not be set up as recurring.
When we shipped AX 2012, balances needed to be manually updated or scheduled to be updated. There was an option to update the balances with posting and this option just triggered the update process to run after posting was completed. This process was not intuitive to users and improvements were made with AX 2012 CU4.
With AX 2012 CU4, we removed the option to update balances with posting and instead automatically update the balances when we need them; viewing the trial balance list page or generating the trial balance summary report are examples of when AX needs the balances. The financial dimension sets still need to be initialized when set up, and you can optionally schedule the balance update to run as a recurring batch process; if you don't view the trial balance very often, the on-demand update can impact the performance of opening the list page.
We have continued to add performance improvements to the balance calculation and other fixes through the release of Dynamics AX 2012 R3. With the latest release, rebuilding balances should be a rare occurrence. If you find that you are still having to rebuild the balances, please contact your Microsoft Dynamics partner or enter a support request so that we can determine the issue.
This concludes the Planning your chart of accounts blog series.
Commenter kinokijuf wonders whether the Windows 95 interface had a code name.
We called it "the new shell" while it was under preliminary development, and when it got enabled in the builds, we just called it "the shell."
(Explorer originally was named Cabinet, unrelated to the container file format of the same name. This original name lingers in the window class: CabinetWClass.)
We all know that food is one of the basic elements of survival. And yet the world is transitioning from an era of food abundance to one of scarcity, driven in part by population growth and rising affluence, as well as growing water shortages and the earth’s rising temperature. In response to this concern, Microsoft and the United States Department of Agriculture (USDA) have teamed up to address Food Resilience, one of the themes of the White House’s Climate Data Initiative.
Climate change has the potential to impact all aspects of the food system, from our ability to grow food, to the reliability of food transportation and food safety, to the dynamics of international trade in agricultural goods. The Food Resilience theme is organizing datasets and tools that tell the story of food system vulnerability to climate change both domestically and abroad, focusing on climate-vulnerable links in the chain.
At Microsoft, we believe that climate change is a serious issue that demands immediate, worldwide attention. We are dedicated to reducing our impact on the environment, and are doing so through organizational commitments like carbon neutrality, and through public-private partnerships like the Climate Data Initiative. Many of farming’s most essential questions, such as how to increase yields, how much input is required to produce a healthy crop, and how to adapt to weather conditions, can be answered via information technology. These investments can increase yield, increase input efficiency, and, most importantly for farmers, increase profitability.
As a result of the Climate Data Initiative, a huge amount of data, which lives on Data.gov’s Climate website, has already been amassed. Now we face the challenge of how best to analyze these enormous information sets and share meaningful insights. On July 29, the White House announced public and private partnerships and commitments to provide support in tackling this challenge. Microsoft’s partnership with the USDA will focus on organizing data sets and tools in the cloud to provide insight into vulnerabilities in the food system. The USDA has posted its data on Microsoft Azure to make it easy for agricultural researchers to explore, analyze, and share insights to address climate-change concerns and help young farmers and producers improve the food system. Those data and tools are provided by Microsoft Research in the Microsoft Azure Marketplace as part of the Microsoft Azure for Research program.
To find out more about how Microsoft is working with the USDA to support the Climate Data Initiative, check out the Inside Microsoft Research and Microsoft Research Connections blogs. And while you're at it, also check out the USDA data sets posted to the Microsoft Azure Marketplace (enter the search term USDA).