Enabling MAC address collection using Hardware Inventory
Explanation of July 18th outage
Sorry it took me a week and a half to get to this.
We had the most significant VS Online outage we’ve had in a while on Friday, July 18th. The entire service was unavailable for about 90 minutes. Fortunately it happened during non-peak hours, so the number of affected customers was smaller than it might have been, but I know that’s small consolation to those who were affected.
My main goal from any outage that we have is to learn from it. With that learning, I want to make our service better and also share it so, maybe, other people can avoid similar errors.
What happened?
The root cause was that a single database in SQL Azure became very slow. I actually don’t know why, so I guess it’s not really the root cause but, for my purposes, it’s close enough. I trust the SQL Azure team chased that part of the root cause – we certainly did loop them in on the incident. Databases will, from time to time, get slow, and SQL Azure has been pretty good about that over the past year or so.
The scenario was that Visual Studio (the IDE) was calling our “Shared Platform Services” (a common service instance managing things like identity, user profiles, licensing, etc.) to establish a connection to get notified about updates to roaming settings. The Shared Platform Services were calling Azure Service Bus, which in turn was calling the ailing SQL Azure database.
The slow Azure database caused calls to the Shared Platform Services (SPS) to pile up until all threads in the SPS thread pool were consumed, at which point all calls to TFS eventually got blocked due to dependencies on SPS. The ultimate result was VS Online being down until we manually disabled our connection to Azure Service Bus and the logjam cleared itself up.
There was a lot to learn from this. Some of it I already knew, some I hadn’t thought about but, regardless of which category it was in, it was a damn interesting/enlightening failure.
Don’t let a ‘nice to have’ feature take down your mission critical ones
I’d say the first and foremost lesson is “Don’t let a ‘nice to have’ feature take down your mission critical ones.” There’s a notion in services that all services should be loosely coupled and failure tolerant. One service going down should not cause a cascading failure in other services; rather, only the portion of functionality that absolutely depends on the failing component should be unavailable. Services like Google and Bing are great at this. They are composed of dozens or hundreds of services, and any single service might be down and you never even notice because most of the experience looks like it always does.
The crime in this particular case is that the feature experiencing the failure was Visual Studio settings roaming. If we had properly contained the failure, your roaming settings wouldn’t have synchronized for 90 minutes and everything else would have been fine. No big deal. Instead, the whole service went down.
In our case, all of our services were written to handle failures in other services but, because the failure ultimately resulted in thread pool exhaustion in a critical service, it reached the point where no service could make forward progress.
Smaller services are better
Part of the problem here was that a very critical service like our authentication service shared an exhaustible resource (the thread pool) with a very non-critical service (the roaming settings service). Another principle of services is that they should be factored into small atomic units of work if at all possible. Those units should be run with as few common failure points as possible and all interactions should honor “defensive programming” practices. If our authentication service goes down, then our service goes down. But the roaming settings service should never take the service down. We’ve been on a journey for the past 18 months or so of gradually refactoring VS Online into a set of loosely coupled services. In fact, only about a year ago, what is now SPS was factored out of TFS into a separate service. All told, we have about 15 or so independent services today. Clearly, we need more :)
How many times do you have to retry?
Another one of the long-standing rules in services is that transient failures are “normal”. Every service consuming another service has to be tolerant of dropped packets, transient delays, flow control backpressure, etc. The primary technique is to retry when a service you are calling fails. That’s all well and good. The interesting thing we ran into here was a set of cascading retries. Our situation was:
Visual Studio –> SPS –> Service Bus –> Azure DB
When Azure DB failed, Service Bus retried 3 times. When Service Bus failed, SPS retried 2 times. When SPS failed, VS retried 3 times. 3 * 2 * 3 = 18 times. So, every single Visual Studio client launched in that time period caused a total of 18 attempts on the SQL Azure database. Since the problem was that the database was running slow (resulting in a timeout after about 30 seconds), that’s 18 tries * 30 seconds = 9 minutes each.
Calls in this stack of services piled up and up and up until, eventually, the thread pool was full and no further requests could be processed.
As it turns out, SQL Azure is actually very good about communicating to its callers whether or not a retry is worth attempting. Service Bus doesn’t pay attention to that and doesn’t communicate it to its callers. And neither does SPS. So a new rule I learned is that it’s important that any service carefully determine, based on the error, whether or not retries are called for *and* communicate back to its callers whether or not retries are advisable. If this had been done, each connection would have been only 30 seconds rather than 9 minutes and likely the situation would have been MUCH better.
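To make the idea concrete, here is a minimal PowerShell sketch of a retry wrapper that honors a “retryable” hint from the callee. It is purely illustrative (the real services are not written this way), and the IsTransient convention is something I made up for the example.

function Invoke-WithRetry {
    param(
        [scriptblock]$Operation,
        [int]$MaxAttempts = 3
    )
    for ($attempt = 1; $attempt -le $MaxAttempts; $attempt++) {
        try {
            return & $Operation
        }
        catch {
            # Hypothetical convention: the callee marks transient failures in the exception's Data dictionary
            $retryable = $_.Exception.Data['IsTransient'] -eq $true
            if (-not $retryable -or $attempt -eq $MaxAttempts) {
                throw    # not retryable (or out of attempts): fail fast so upstream callers don't multiply retries
            }
            Start-Sleep -Seconds ([int][math]::Pow(2, $attempt))    # back off before the next attempt
        }
    }
}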
A traffic cop goes a long way
Imagine that SPS kept count of how many concurrent calls were in progress to Service Bus. Knowing that this was a “low priority” service, that calls were synchronous, and that the thread pool was limited, it could have decided that, once the number of concurrent calls exceeded some threshold (let’s say 30, for argument’s sake), it would start rejecting all subsequent calls until the traffic jam drained a bit. Some callers would very quickly get rejected and their settings wouldn’t be roamed, but we’d never have exhausted threads and the higher priority services would have continued to run just fine. Assuming the client is set to attempt a reconnect on some very infrequent interval, the system would eventually self-heal once the underlying database issue cleared up.
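A hedged sketch of that “traffic cop” in PowerShell (again illustrative only; the threshold of 30 and the error message are arbitrary):

$maxConcurrent = 30
$gate = [System.Threading.SemaphoreSlim]::new($maxConcurrent)

function Invoke-LowPriorityCall {
    param([scriptblock]$Call)
    # Reject immediately instead of queuing when the cap is already reached
    if (-not $gate.Wait(0)) {
        throw "Settings roaming is temporarily unavailable - please try again later"
    }
    try     { & $Call }
    finally { [void]$gate.Release() }
}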
Threads, threads and more threads
I’m sure I won’t get out of this without someone pointing out that one of the root causes here is that the inter-service calls were synchronous. They should have been asynchronous, therefore not consuming a thread and never exhausting the thread pool. It’s a fair point but not my highest priority takeaway here. You are almost always consuming some resource, even on async calls – usually memory. That resource may be large but it too is not inexhaustible. The techniques I’ve listed above are valuable regardless of sync or async and will also prevent other side effects, like pounding an already ailing database into the dirt with excessive retries.
So, it’s a good point, but I don’t think it’s a silver bullet.
So, onto our backlog goes another series of “infrastructure” improvements and practices that will help us provide an ever more reliable service. All software will fail eventually, somehow. The key thing is to examine each and every failure, trace the failure all the way to the root cause, generalize the lessons and build defenses for the future.
I’m sorry for the interruption we caused. I can’t promise it won’t happen again, *but* after a few more weeks (for us to implement some of these defenses), it won’t happen again for these reasons.
Thanks as always for joining us on this journey and being astonishingly understanding as we learn. And hopefully these lessons provide some value to you in your own development efforts.
Brian
The Windows Azure Pack VM Role – Part 2 Concept
This is part 2 from the blog series written by MVP Marc van Eijk (@_marcvaneijk).
After successfully deploying a gallery item, which was downloaded from the custom feed in the Web Platform Installer, the first thing you probably did was install the VM Role Authoring Tool to open that gallery item. At least, that is what I did. Just to look at some VM Role examples created by Microsoft and work my way from there. Now this is probably when things get a bit more complex. There are quite a few options in the VM Role Authoring Tool and it is not really clear what all these options mean, how they work and how they relate to one another.
To provide us with some help, Stephen Baron created a couple of videos to explain the workings of the VM Role and Charles Joy posted them in a VM Role Authoring Tool How To videos playlist. These recordings really helped. I was now able to make adjustments to the gallery item without breaking it. To be honest, I just edited some existing values in downloaded gallery items into something else. I still didn’t understand the meaning of all those options, fields and values. The more I worked with the VM Role, the more questions seemed to arise. To provide some answers to these questions we will start with the main concept.
Concept
The VM Role consists of two packages. The Resource Definition Package is imported in Windows Azure Pack. This package is required. The other package is called the Resource Extension Package and is imported in System Center Virtual Machine Manager. The Resource Extension Package is only required if you want to install applications on top of the Operating System. Each package is a compressed file that contains multiple files, just like a .zip file.
To get an idea of some of the possibilities with the VM Role, you might think of a PowerShell script. A PowerShell script can be very complex. The variables are placed at the top of the script so we only need to edit that section, without having to bother with the complexity of the rest of the script. The VM Role takes this to the next level, allowing you to capture those variables with easy-to-understand questions for a tenant in Windows Azure Pack.
The tenant submits these values through the VM Role wizard. The values are passed from the Resource Definition (Windows Azure Pack) to the Resource Extension (System Center Virtual Machine Manager), where they can be used as values for variables. For example in that PowerShell script.
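As a rough illustration of that flow (the parameter name and the Windows feature used here are made up for this example), a script in the resource extension simply declares a parameter, and the value the tenant typed into the VM Role wizard ends up in it:

param(
    # Value captured by the resource definition (the WAP wizard) and handed to the resource extension
    [string]$WindowsFeatureName = 'Web-Server'
)

Install-WindowsFeature -Name $WindowsFeatureName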
Resource definition
The resource definition allows you to configure the VM Role settings that are used in Windows Azure Pack. When a tenant selects a VM Role from the gallery in Windows Azure Pack, he is presented with a wizard. All the requirements, options, tabs, values and language variations displayed in the wizard are defined in the view definition, which is part of the resource definition. The main purpose of the wizard is to capture values into parameters. These values can be prepopulated, hard-coded or entered by the tenant. When the wizard is completed and the tenant submits the request, the captured parameters are passed from the resource definition to the resource extension.
Resource extension
After the request is submitted by the tenant in Windows Azure Pack, the resource extension comes into play. The resource extension leverages the service template engine in System Center Virtual Machine Manager. One or more virtual machine instances are installed with the specified Operating System and the configured application logic is executed. The VM Role provides many options to configure application logic. Before looking at these options it is essential to understand the installation procedure.
The resource extension contains one application profile. All application logic is configured within that application profile. Specified Windows Server roles and features will be installed first. The rest of the installation steps and the installation order are completely up to you.
It is possible to attach a script to the application profile. These scripts are called provisioning scripts and are executed before or after all individual applications are installed. They are useful for preparation or cleanup actions that apply to all applications. A script to format and label additional data disks in the virtual machine is an example of a pre-install provisioning script. Deleting the folder with the application logic at the end of the deployment process is an example of a post-install provisioning script. Provisioning scripts have a green color in the Application Profile image.
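For example, a minimal pre-install provisioning script along those lines might look like this (assuming every RAW disk attached to the VM is meant to become an NTFS data volume labelled “Data”):

# Initialize, partition and format every raw data disk attached to the VM
Get-Disk | Where-Object PartitionStyle -eq 'RAW' |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -AssignDriveLetter -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel 'Data' -Confirm:$false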
For the individual applications, there are three application types that you can add to the application profile. The applications are referenced in blue in the Application Profile image.
- Web Application
- SQL DAC
- Script Application
Besides the three application types you can also add a SQL Profile to complete a sysprepped SQL Server installation. The SQL Profile is also referenced in blue in the Application Profile image.
- SQL Profile
Based on the application type, depending actions can be attached. The depending actions are referenced in grey in the Application Profile image. The following depending actions are available.
- Script Command
- SQL Script
- SQL Deployment
Depending actions and Provisioning Scripts can be configured with a Script Type. A Script Type defines when a script is executed. The following Script Types are available.
- PreInstall
- PostInstall
- PreService
- PostService
- PreUninstall
- PostUninstall
- OnProvisionFirst
- OnProvisionRest
- OnDeleteRest
- OnDeleteLast
Each Script Type is triggered by a different action and can only be applied at a certain level (application profile level or application level).
All these options provide even more choices. I have performed a lot of test deployments to figure out which scripts and script types can be referenced where, and how many times. The upcoming blog on the Resource Extension will describe that in more detail. The following list shows the application types, the possible depending actions and the limits for each type.
- Web Application (one or more)
  - Script Command (one or more)
- SQL DAC (one or more)
  - SQL Script (one or more)
  - Script Command (one or more)
- SQL Profile (only one)
  - SQL Deployment (one or more)
- Script Application (only one)
  - Script Command (one or more)
VM Role Authoring Tool
The VM Role Authoring Tool can be downloaded from CodePlex. It is a stable tool that creates consistent valid packages for the VM Role. Please note that since the tool is from CodePlex there is no official support from Microsoft for this tool. After downloading the VM Role Authoring Tool you can extract the files and open the tool from the extracted files with VmroleAuthor.exe.
The VM Role Authoring Tool has the following areas.
Menu Bar
The Menu Bar provides multiple menu items with dropdowns. The selections in the menus are context sensitive. You are only able to choose actions valid to the selected item in the Navigation Pane. If the menu item is not valid, it is greyed out.
- File: Allows you to create new VM Role files and packages, open and save existing packages.
- Clipboard: Shortcuts to clipboard actions
- View: A package consists of multiple files. When a package is opened in the VM Role Authoring Tool a working directory is created that contains the extracted files from the package. The View Directory option opens the working directory folder directly.
- Save: Save the selected package or file.
- Validate: The VM Role Authoring Tool provides error checking. When you select this menu item all the packages and files that are open in the VM Role Authoring Tool are validated. Errors and warnings are displayed in the Notification Area.
- Add: Allows you to add items to a package or file.
- Remove: Remove an item from a file or package. It also allows you to remove a complete file or package from the tool.
- Deploy: This menu item was introduced in version 1.1 of the VM Role Authoring Tool and allows you to deploy a resource extension to Microsoft Azure.
- About: Version information.
Navigation Pane
The Navigation Pane in the VM Role Authoring Tool displays the content of one or more files and/or packages. You can create new files and packages from the Menu Bar, import existing items from the Menu Bar and even open existing items by dragging and dropping them directly in the Navigation Pane.
Main Window
The Main Window displays the details of the entry that is selected in the Navigation Pane. The Main Window has two tabs. The Editor tab displays an abstracted view of the JSON language that is used by the VM Role. This abstracted view provides fields with validation and related field names.
The JSON View tab shows the actual JSON language that reflects the values specified in the Editor tab.
I have had a couple of cases where the validation resulted in errors that I did not understand. A quick peek at the JSON View helped in these situations.
Notification Window
The VM Role packages consist of many values and dependencies between these values. The creators of the VM Role Authoring Tool provided a validation option that will verify the values and notify you when errors or warnings exist. The errors and warnings are presented in the Notification Window as hyperlinks.
These hyperlinks will take you to the related value in the Main Window.
Before you start
The VM Role Authoring Tool is an offline tool. It does not have an undo option. Before creating a new package in the VM Role Authoring Tool it is a good idea to create a folder structure for working with packages. A simple folder structure will save you from a lot of grief when an accidental removal or change can’t be undone. This folder structure can also contain folders with the application logic used within the resource extension.
In my experience with the VM Role Authoring Tool, the following folder structure works well.
- Import: Importing the packages into Windows Azure Pack or Virtual Machine Manager locks the files (also after the import). When a package is ready for import I copy it from the Custom folder to the Import folder. The Import folder contains the latest versions of all VM Roles that were imported.
- Templates: All the gallery items created by Microsoft and downloaded from the custom feed in the Web Platform Installer are placed in this folder. I use these items for reference.
- Custom: This folder contains a folder for each VM Role that I’m working on.
  - VM Role name: Think of a good naming convention for your VM Roles and create a folder according to this naming convention for each VM Role that you are working on.
    - vXX - Description: Before edits are made to a VM Role, a copy of this folder is placed on the same level with an incremented version number and a short description of the type of edits made. It is also recommended to create a naming convention for your VM Role files. For instance: Organization_OS_App_Version
- Script: Create a folder for each type of application logic, containing the scripts for that application logic. The best way to organize the folders is by defining a modular structure where the application logic within a folder can run separately or combined with application logic from another folder. For example, a folder with the scripts that apply to the Operating System and a folder with scripts that apply to a specific application.
The following image is a screenshot of the folder structure.
This is by no means the way your folders must be organized. You can create a structure that works for you.
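If you want to stamp out a starting point for such a structure, a quick PowerShell one-off like the following works (the root path and the example VM Role folder name are just placeholders; adjust them to your own conventions):

$root = 'C:\VMRoleAuthoring'
'Import', 'Templates', 'Custom', 'Script' | ForEach-Object {
    New-Item -Path (Join-Path $root $_) -ItemType Directory -Force | Out-Null
}
# Example working folder for one VM Role, using an Organization_OS_App_Version style name
New-Item -Path (Join-Path $root 'Custom\Contoso_W2012R2_IIS\v01 - Initial version') -ItemType Directory -Force | Out-Null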
In the next part of this blog series, we will create a new resource definition and discuss the details.
The Mobile Web should just work for everyone
Windows Phone 8.1 Update includes hundreds of Internet Explorer 11 enhancements that greatly increase compatibility with the mobile web.
Based on your feedback, we pursued a web experience for IE users consistent with what is available on iOS and Android devices – even where this meant we would be adding non-standard web platform features. We believe that this is a more pragmatic approach to running today's less-standardised mobile web.
We tested more than 500 of the top mobile web sites and found that the IE11 update improves the experience on more than 40% of them.
For example, if you visit www.twitter.com with IE11, you used to see:

Here is what you see in IE11 with the update, on Firefox OS and on an iPhone:



Similarly, if you visit www.baidu.com with IE11 and Firefox OS, you see:


Here is what you see in IE11 with the update and on an iPhone:


Analysing the most popular web sites
Unlike the mostly standards-based ‘desktop' web, many modern mobile web pages were designed and built for iOS and the iPhone. This results in users of other devices often receiving a degraded experience.
A few weeks ago we talked about our vision and priorities for the web. We believe that "The Web should just work for everyone – users, developers and businesses." We started researching what it would take to make the mobile web "just work" for our customers.
As we investigated the most popular mobile web sites from around the world we started to see common patterns causing problems. Often sites would use poorly written browser detection code that would result in the desktop site experience for Windows Phone users. Desktop web sites tend to be larger and slower to load, costing more of a user's data plan. These sites end up with tiny text, and you have to spend a lot of time zooming and panning around to read the content. They also expect you to be using a mouse, so menus and forms are hard to work with.
When Windows Phone 8.1 reached RTM, it included the same fast, standards-based, IE11 browser engine that powers the PC version of IE on the desktop. For the last several years we've talked about providing the same mark-up to all browsers using feature detection and graceful degradation. Although we still see broken desktop sites not following this guidance from time to time, the situation has improved on the desktop. We found a much different situation on the mobile web. Many sites use features via a legacy vendor specific prefix without supporting the un-prefixed standard version or only support vendor prefixes for certain devices. Other sites use non-standard proprietary APIs that only work with Safari or Chrome. Of course there were also bugs or missing features in IE that became particularly apparent on mobile sites designed specifically for our competitors' browsers.
Updating Internet Explorer in Windows Phone 8.1 Update
We gathered all of this compatibility data and then we began to plan what changes we should make to IE. The remainder of this blog post discusses some of the most important changes and the rationale for why we made them. The issues affecting mobile web sites fall primarily into five main categories:
- Faulty browser detection not recognising IE as a mobile browser and giving the desktop experience
- Using only old webkit-prefixed features that have been replaced by standards
- Using proprietary webkit-prefixed features for which there is no standard
- Using features that IE does not support with no graceful fall-back
- Running into interoperability bugs and implementation differences in IE
Changing the User Agent string
One of the most significant issues we saw was related to sites not detecting that IE on Windows Phone is a mobile browser and therefore providing desktop content. This often results in sites displayed with tiny text that you need to zoom in to read and then pan around. It also often means more data is transmitted over the phone's data connection because the content isn't mobile optimised. Images are large and many more ads are downloaded and shown.
There are many different ways that sites try to detect whether to deliver the mobile experience. Here is one such check we found on a real site:
window.mobileCheck = function() {
var check = false;
(function(a){if(/(android|bb\d+|meego).+mobile|avantgo|bada\/|blackberry|blazer|compal|elaine|fennec|hiptop|iemobile|ip(hone|od)|iris|kindle|lge |maemo|midp|mmp|mobile.+firefox|netfront|opera m(ob|in)i|palm( os)?|phone|p(ixi|re)\/|plucker|pocket|psp|series(4|6)0|symbian|treo|up\.(browser|link)|vodafone|wap|windows (ce|phone)|xda|xiino/i.test(a)||/1207|6310|6590|3gso|4thp|50[1-6]i|770s|802s|a wa|abac|ac(er|oo|s\-)|ai(ko|rn)|al(av|ca|co)|amoi|an(ex|ny|yw)|aptu|ar(ch|go)|as(te|us)|attw|au(di|\-m|r |s )|avan|be(ck|ll|nq)|bi(lb|rd)|bl(ac|az)|br(e|v)w|bumb|bw\-(n|u)|c55\/|capi|ccwa|cdm\-|cell|chtm|cldc|cmd\-|co(mp|nd)|craw|da(it|ll|ng)|dbte|dc\-s|devi|dica|dmob|do(c|p)o|ds(12|\-d)|el(49|ai)|em(l2|ul)|er(ic|k0)|esl8|ez([4-7]0|os|wa|ze)|fetc|fly(\-|_)|g1 u|g560|gene|gf\-5|g\-mo|go(\.w|od)|gr(ad|un)|haie|hcit|hd\-(m|p|t)|hei\-|hi(pt|ta)|hp( i|ip)|hs\-c|ht(c(\-| |_|a|g|p|s|t)|tp)|hu(aw|tc)|i\-(20|go|ma)|i230|iac( |\-|\/)|ibro|idea|ig01|ikom|im1k|inno|ipaq|iris|ja(t|v)a|jbro|jemu|jigs|kddi|keji|kgt( |\/)|klon|kpt |kwc\-|kyo(c|k)|le(no|xi)|lg( g|\/(k|l|u)|50|54|\-[a-w])|libw|lynx|m1\-w|m3ga|m50\/|ma(te|ui|xo)|mc(01|21|ca)|m\-cr|me(rc|ri)|mi(o8|oa|ts)|mmef|mo(01|02|bi|de|do|t(\-| |o|v)|zz)|mt(50|p1|v )|mwbp|mywa|n10[0-2]|n20[2-3]|n30(0|2)|n50(0|2|5)|n7(0(0|1)|10)|ne((c|m)\-|on|tf|wf|wg|wt)|nok(6|i)|nzph|o2im|op(ti|wv)|oran|owg1|p800|pan(a|d|t)|pdxg|pg(13|\-([1-8]|c))|phil|pire|pl(ay|uc)|pn\-2|po(ck|rt|se)|prox|psio|pt\-g|qa\-a|qc(07|12|21|32|60|\-[2-7]|i\-)|qtek|r380|r600|raks|rim9|ro(ve|zo)|s55\/|sa(ge|ma|mm|ms|ny|va)|sc(01|h\-|oo|p\-)|sdk\/|se(c(\-|0|1)|47|mc|nd|ri)|sgh\-|shar|sie(\-|m)|sk\-0|sl(45|id)|sm(al|ar|b3|it|t5)|so(ft|ny)|sp(01|h\-|v\-|v )|sy(01|mb)|t2(18|50)|t6(00|10|18)|ta(gt|lk)|tcl\-|tdg\-|tel(i|m)|tim\-|t\-mo|to(pl|sh)|ts(70|m\-|m3|m5)|tx\-9|up(\.b|g1|si)|utst|v400|v750|veri|vi(rg|te)|vk(40|5[0-3]|\-v)|vm40|voda|vulc|vx(52|53|60|61|70|80|81|83|85|98)|w3c(\-| )|webc|whit|wi(g |nc|nw)|wmlb|wonu|x700|yas\-|your|zeto|zte\-/i.test(a.substr(0,4)))check = true})(navigator.userAgent||navigator.vendor||window.opera);
return check;
}
We updated the User Agent string in IE on Windows Phone to increase the number of sites that would correctly deliver the best mobile content. This continues an unfortunate pattern that all browsers have had to deal with and most web developers have run into. For example, there is an interesting discussion from as early as 2006 in a WebKit bug entitled "Safari lies. Reports itself as Mozilla, Gecko and KHTML too." When we shipped IE11 on the desktop, we added the token "like Gecko" to the string because it significantly improved compatibility with desktop sites. Chrome and Opera claim to be like Gecko and Safari in order to be compatible with web content.
If you visit www.hawaiianairlines.com with IE11 and Firefox OS, you see the desktop experience:


Here is what you see in IE11 with the update and on an iPhone:


If you visit www.nytimes.com with IE11 and Firefox OS, you also see the desktop experience:


Here is what you see in IE11 with the update and on an iPhone:


In general, our advice is to develop a responsive site that can adapt to the capabilities of different devices. If you choose to build a mobile-specific experience then we recommend looking for the sub-string "mobile" in the user agent string to determine when to deliver mobile optimised content:
function isMobile() {
return navigator.userAgent.toLowerCase().indexOf("mobile")>=0;
}
Mapping legacy webkit-prefixed features to IE implementation
After changing the user agent string so that IE receives the same content as other phone browsers we could begin to analyse issues that were breaking mobile experiences. The first important problem we identified was that many mobile sites only send webkit-prefixed content for CSS gradients, flexbox, transitions, and animations. These are features that IE11's web standards-based engine already supports for sites with cross-browser mark-up. As Mozilla found, WebKitCSSMatrix is commonly used on mobile devices. IE supports MSCSSMatrix. Many sites also use window.orientation instead of the emerging standard screen.orientation. The second problem we found here is that sites also often use old syntax in their code. For example, using the old gradient syntax instead of the updated standards based approach.
In Windows Phone 8.1 Update, we added a mapping of popular webkit-prefixed APIs to the standards based support already part of IE11. This means that sites that only send WebKit code are translated into standards based code as the page loads. We are not planning to support all webkit-prefixed APIs. We have added mappings for the ones that are so prevalent in mobile sites that the web won't work without them.
If you visit www.macys.com with IE11, you see:

Here you can see the gradients drawn correctly in IE11 with the update and on an iPhone:


If you visit www.answers.com with IE11, you see:

Here you can see the site drawn correctly in IE11 with the update and on an iPhone:


Adding support for non-standard proprietary features
We found a small number of non-standard features popularised by Apple on the iPhone in widespread use. These features are not currently on a standards track but browsers that don't support them can't provide a good experience for top sites on the mobile web. One example is -webkit-appearance, which allows a page to modify the styling of an element to match native applications. As Mozilla points out, "not only is it non-standard, but its behavior changes from one browser to another." Unfortunately, without some level of support for these non-standard proprietary features, web sites are more difficult to use.
New features supported in IE
There are several standards-based features that IE11 didn't support that are used infrequently on desktop sites, but are in common use in the mobile web. Once we made IE11 receive more mobile content we determined that we would need to add these features. For example, window.locationbar is defined in HTML5 but rarely used on desktop sites. We prioritised implementing several new features based on the mobile sites they enabled.
One of the larger API-related issues affecting compatibility with mobile sites is support for touch. In IE10, we shipped support for Pointer Events, which is now a Candidate Recommendation at W3C, and we updated the implementation in IE11 to incorporate changes in the spec. Using Pointer Events provides many performance and functional advantages to sites that wish to use either mouse, touch, pen or other pointer inputs and we continue to recommend this as the best API for sites that work for users across all of their devices.
On the mobile web, most sites have been coded to use the older Touch Events model and users expect those sites to just work. In the IE11 update, we added support for touch events so that these sites would work correctly. Our research has shown that on the desktop web, enabling touch events on a device that also supports a mouse (like Windows tablets and 2 in 1 devices) causes more problems—for example, we found that mouse and trackpad support is broken on about 10% of top sites when Touch Events are enabled. Many sites don't expect to be able to receive touch events and mouse events and only support one or the other. We have joined other browser vendors in the W3C Touch Events Community Group to work through these issues for the web at large. We'll talk more about pointer and touch events in a future post.
Fixing the most impactful interop issues
As we continued to investigate the mark-up in sites that were not working correctly in Internet Explorer, we found some peculiar interoperability issues. For example,
Finally, we also identified several bugs in the Trident engine that particularly impacted top mobile sites and we included fixes for these issues in this update. For example, we fixed some navigation issues with location.hash
and some CSS layout problems that were affecting popular mobile sites.
What can you do?
Many of the changes we've made are specifically targeted at consuming legacy or vendor prefixed content published on these sites. It is not our goal to support all the -webkit- vendor prefixed APIs. While we will continue our outreach efforts to encourage these sites to adopt standards-based mark-up, the support we've added is necessary today for the mobile web to just work. You can help here too if you see sites using non-standard code: we are collaborating with Mozilla at webcompat.com to record broken sites. These sites often cause issues across multiple browsers including Firefox and IE and it is easy for you to report problematic sites.
If you are a web developer, run your site through the scanner tool on http://modern.ie. This tool will identify common coding problems including issues with vendor prefixes and help you fix your code.
When taken all together, the changes we have made to IE in Windows Phone 8.1 Update dramatically improve compatibility with the most popular mobile web sites. The update should start rolling out to those of you already on the Windows Phone 8.1 Preview for Developers next week and will roll out to consumers with devices running Windows Phone 8.1 in the coming months. Next week, we will also publish a comprehensive list of all the changes in the IE Developer Guide on MSDN.
If you have questions, you can connect with us on Twitter @IEDevChat. The next #AskIE tweet chat is today (July 31) from 10AM-Noon PDT. Make sure you include #AskIE in your questions.
Adrian Bateman
Program Manager, Internet Explorer
Frank Olivier
Program Manager, Internet Explorer
General Availability for Enhanced Mitigation Experience Toolkit (EMET) 5.0
PASS Summit 2014: Inside the World’s Largest Gathering of SQL Server and BI Professionals
PASS VP of Marketing Denise McInerney – a SQL Server MVP and Data Engineer at Intuit – began her career as a SQL Server DBA in 1998 and attended her first PASS Summit in 2002. The SQL Server Team caught up with her ahead of this year’s event, returning to Seattle, WA, Nov. 4-7, to see what she’s looking forward to at the world’s largest conference for SQL Server and BI professionals.
For those who’ve never attended or who’ve been away for a while, what is PASS Summit?
PASS Summit is the world’s largest gathering of Microsoft SQL Server and BI professionals. Organized by and for the community, PASS Summit delivers the most technical sessions, the largest number of attendees, the best networking, and the highest-rated sessions and speakers of any SQL Server event.
We like to think of PASS Summit as the annual reunion for the #sqlfamily. With over 200 technical sessions and 70+ hours of networking opportunities with MVPs, experts and peers, it’s 3 focused days of SQL Server. You can take hands-on workshops, attend Chalk Talks with the experts, and get the answers you need right away at the SQL Server Clinic, staffed by the Microsoft CSS and SQLCAT experts who build and support the features you use every day. Plus, you can join us early for 2 days of pre-conference sessions with top industry experts and explore the whole range of SQL Server solutions and services under one roof in the PASS Summit Exhibit Hall.
Nowhere else will you find over 5,000 passionate SQL Server and BI professionals from 50+ countries and 2,000 different companies connecting, sharing, and learning how to take their SQL Server skills to the next level.
What’s on tap this year as far as sessions?
We’ve announced a record 160+ incredible community sessions across 5 topic tracks: Application and Database Development; BI Information Delivery; BI Platform Architecture, Development and Administration; Enterprise Database Administration and Deployment; and Professional Development. And watch for over 60 sessions from Microsoft’s top experts to be added to the lineup in early September.
You can search by speaker, track, session skill level, or session type – from 10-minute Lightning Talks, to 75-minute General Sessions, to 3-hour Half-Day Sessions and our full-day pre-conference workshops.
And with this year’s new Learning Paths, we’ve made it even easier to find the sessions you’re most interested in. Just use our 9 Learning Path filters to slice and dice the lineup by everything from Beginner sessions to Big Data, Cloud, Hardware Virtualization, and Power BI sessions to SQL Server 2014, High Availability/Disaster Recovery, Performance, and Security sessions.
Networking is at the heart of PASS Summit – what opportunities do you have for attendees to connect with each other?
PASS Summit is all about meeting and talking with people, sharing issues and solutions, and gaining knowledge that will make you a better SQL Server professional. Breakfasts, lunches, and evening receptions are all included and are designed to offer dedicated networking opportunities. And don't underestimate the value of hallway chats and the ability to talk to speakers after their sessions, during lunches and breaks, and at the networking events.
We have special networking activities for first-time attendees, for people interested in the same technical topics at our Birds of a Feather luncheon, and at our popular annual Women in Technology luncheon, which connects 600+ attendees interested in advancing the role of women in STEM fields. Plus, our Community Zone is THE place to hang out with fellow attendees and community leaders and learn how to stay involved year-round.
You mentioned the networking events for first-time attendees. With everything going on at Summit, how can new attendees get the most out of their experience?
Our First-Timers Program takes the hard work out of conference prep and is designed specifically to help new attendees make the most of their time at Summit. We connect first-timers with conference alumni, take them inside the week with community webinars, help them sharpen their networking skills through fun onsite workshops, and share inside advice during our First Timers orientation meeting.
In addition, in our “Get to Know Your Community Sessions,” longtime PASS members share how to get involved with PASS and the worldwide #sqlfamily, including encouraging those new to PASS to connect with their local SQL Server communities through PASS Chapters and continue their learning through Virtual Chapters, SQLSaturdays, and other free channels.
How can you learn more about sessions and the overall PASS Summit experience?
A great way to get a taste of Summit is by watching PASS Summit 2013 sessions, interviews, and more on PASStv. You can also check out the best of last year’s Community blogs.
Plus, stay tuned for 24 Hours of PASS: Summit Preview Edition on September 9 to get a free sneak peek at some of the top sessions and speakers coming to PASS Summit this year. Make sure you follow us on Twitter at @PASS24HOP / #pass24hop for the latest updates on these 24 back-to-back webinars.
Where can you register for PASS Summit?
To register, just go to Register Now – and remember to take advantage of the $150 discount code from your local or Virtual PASS Chapter. We also have a great group discount for companies sending 5 or more employees. And don’t forget to purchase the session recordings for year-round learning on all aspects of SQL Server.
Once you get a taste for the learning and networking waiting for you at PASS Summit, we invite you to join the conversation by following us on Twitter (watch the #sqlpass and #summit14 hashtags) and joining our Facebook and LinkedIn groups. We’re looking forward to an amazing, record-breaking event, and can’t wait to see everyone there!
Please stay tuned for regular updates and highlights on Microsoft and PASS activities planned for this year’s conference.
Office for iPad: now with Presenter View, Pivot Table interaction, Export to PDF, and more top-requested features
Four months ago we released Office for iPad—the world’s best productivity experience reimagined for a mobile-first, cloud-first world. Since then millions of you have had the chance to use Office for iPad and provide feedback, and we’ve been hard at work delivering updates for the things you’ve asked for the most. Today we’re sharing the next wave of updates to Office for iPad. We’ve done a ton of work and we hope you like it.
PowerPoint: Presenter View, audio and video playback, and more
With this release, you can enable Presenter View when projecting to another screen, so that you can see your notes, what’s coming next, and more. You asked for this. Now you’ve got it.
Presenter View in PowerPoint for iPad
With this update you can also make your presentations richer and more interactive than ever by more easily including video and sound to help get your point across. Not only do all of your embedded videos and sound recordings now play right from PowerPoint for iPad, but you can insert video directly from your Camera Roll as well. This allows you to record, embed, and present interactive content more quickly than ever, and all from one device.
Playing video directly from PowerPoint for iPad
But that’s not all. There are also new eraser and pen settings to make annotations during presentations easier than ever. You can also edit hyperlinks right from within the app, making PowerPoint for iPad even easier to use.
Excel: interaction with PivotTables, easier grid navigation, and more
Excel is all about interacting with, consuming, and manipulating your data. When it comes to interacting with data, PivotTables are key to helping explore that data. With Excel for iPad you can now sort, filter, expand and collapse, show details, and even refresh PivotTables whose data is all contained within the workbook. You can even change how your PivotTables look and feel by changing both their visual style and layout.
Sorting a column in a PivotTable in Excel for iPad
We’ve also made it easier to consume workbooks and select data, including a large data range, with the introduction of a new flick gesture. Simply grab the selection handle, flick it in any direction, and Excel will automatically select from where you started to the next blank cell. Say you’re at the top of a column of data and want to select all the way to the bottom. Just flick down and the column is selected automatically.
When it comes to manipulating data, we know that many of you want to be able to do that with hardware keyboards. Excel has lots of behaviors we’ve built up over decades to make working with a keyboard as easy and efficient as possible. Some of these make it easy to navigate around within a cell you are editing using the arrow keys (called Edit mode), and some make it easy to enter formulas without using the mouse (called Point mode). There are also keyboard shortcuts (F2 in Windows, CTRL+U on Mac) that allow you to switch between modes.
All of this is usually completely transparent to you—Excel just works the way it’s supposed to! And now Excel for iPad does too. We’ve even added in a CTRL+2 shortcut key for advanced users to switch between the modes.
We also made some improvements to the print capability we announced in April that make using Excel in Office for iPad easier. We added more paper sizes to choose from, and we added in scaling options. These improvements give you more control over the layout of your workbooks when you choose to print them.
But wait, there’s more!
Last but not least, with this wave of updates to Office for iPad we added three top feature requests that affect all of Word, Excel, and PowerPoint.
First up is Export to PDF. Export to PDF is available in the Share menu in all of our apps, and it’s available for everyone to use, whether you have an Office 365 subscription or not.
Export to PDF in Word for iPad
We also introduced new tools to help you edit your pictures from within Office for iPad. When you tap a picture you now see two new options: Crop and Reset. Crop lets you do exactly what it sounds like: crop your picture. You can do this manually, using your finger to set exactly the range you’d like to crop to, or you can select from a menu of popular options. Reset allows you to quickly remove all styles and other changes from your pictures, in case you want to start over.
Cropping a photo in Word for iPad
Finally, Office for iPad now supports third-party fonts! If you use one of several apps to install fonts on your iPad (for example, AnyFont), Office for iPad will now recognize and allow you to use those fonts in any documents you create.
Download the updated Word, Excel, and PowerPoint for iPad today
As this wave of updates hopefully shows, we’re making good on our commitment to delivering continuous updates and improvements to the Office for iPad apps. Download the updated Word, Excel and PowerPoint for iPad in the App Store today. If you want to edit and create documents with your iPad, get started with an Office 365 subscription or a 30-day trial; sign up at www.Office365.com.
The post Office for iPad: now with Presenter View, Pivot Table interaction, Export to PDF, and more top-requested features appeared first on Office Blogs.
New limited edition Wireless Mobile Mouse 3500 features Master Chief from Halo
Are you a Halo fan? You’re going to want this. Today we’re announcing the Wireless Mobile Mouse 3500 Halo Limited Edition: The Master Chief. This mouse gloriously features a highly detailed Master Chief in his two-tone green MJOLNIR Powered Assault Armor and iconic gold hued visor.
With this mouse, you get Master Chief and all the awesomeness that comes standard with the popular Wireless Mobile Mouse 3500 such as its ambidextrous design, snap-in nano transceiver, 2.4 GHz wireless technology, and two-color battery light indicator. You also get BlueTrack Technology that allows you to use the mouse on virtually any surface including granite, marble, carpet, and wood.
The Wireless Mobile Mouse 3500 Halo Limited Edition: The Master Chief will be available for pre-order from GameStop starting today and generally available in October 2014 for the estimated retail price of $29.95 (U.S.) at the Microsoft Store and other retailers.
Support Tip: VMs deployed to Hyper-V networks experience delays acquiring an IP address after reboot
~ Prad Senniappan
When using System Center 2012 R2 Virtual Machine Manager (VMM 2012 R2), you may discover that some virtual machines that are deployed on Hyper-V Network Virtualization networks with dynamic IP address allocation may not get an IP address for a few minutes after a reboot of the VM. Eventually the VM gets the IP address and otherwise functions normally.
Cause
The behavior can occur if the host has an older version of the VMM DHCP server extension. In order to verify this, find the version of “Microsoft System Center Virtual Machine Manager DHCP Server (x64)” installed on the host by running the following PowerShell command:
Get-WmiObject -Class win32_product -Filter 'Name = "Microsoft System Center Virtual Machine Manager DHCP Server (x64)"'
If the version is less than 3.2.7649.0 then VMs may not get IP addresses for a few minutes after a reboot.
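A quick way to turn that check into a pass/fail test on each host is a small sketch like the following (it assumes the product name matches the filter above exactly):

$dhcp = Get-WmiObject -Class win32_product -Filter 'Name = "Microsoft System Center Virtual Machine Manager DHCP Server (x64)"'
if (-not $dhcp) {
    Write-Warning "The VMM DHCP Server extension is not installed on this host"
}
elseif ([version]$dhcp.Version -lt [version]'3.2.7649.0') {
    Write-Warning "DHCP Server extension $($dhcp.Version) is older than 3.2.7649.0 - install the updated DHCPExtn.msi"
}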
Resolution
To resolve this issue, you first need to upgrade the VMM 2012 R2 server to Update Rollup 3 (UR3) or later. For information on UR3 as well as a download link, please see the following:
2965414 - Update rollup 3 for System Center 2012 R2 Virtual Machine Manager (http://support.microsoft.com/kb/2965414)
Once UR3 is installed, you will find the DHCP extension installer on the VMM server in the following location:
\
NOTE The default path is C:\Program Files\Microsoft System Center 2012 R2\Virtual Machine Manager\SwExtn\DHCPExtn.msi
Copy DHCPExtn.msi to the Hyper-V hosts experiencing the problem and run it to install the updated DHCP server extension. Once this is done, the VMs should no longer experience a delay in acquiring an IP address after a reboot.
Prad Senniappan | Senior SDE | Microsoft
Get the latest System Center news on Facebook and Twitter:
System Center All Up: http://blogs.technet.com/b/systemcenter/
System Center – Configuration Manager Support Team blog: http://blogs.technet.com/configurationmgr/
System Center – Data Protection Manager Team blog: http://blogs.technet.com/dpm/
System Center – Orchestrator Support Team blog: http://blogs.technet.com/b/orchestrator/
System Center – Operations Manager Team blog: http://blogs.technet.com/momteam/
System Center – Service Manager Team blog: http://blogs.technet.com/b/servicemanager
System Center – Virtual Machine Manager Team blog: http://blogs.technet.com/scvmm
Windows Intune: http://blogs.technet.com/b/windowsintune/
WSUS Support Team blog: http://blogs.technet.com/sus/
The AD RMS blog: http://blogs.technet.com/b/rmssupp/
MED-V Team blog: http://blogs.technet.com/medv/
Server App-V Team blog: http://blogs.technet.com/b/serverappv
The Forefront Endpoint Protection blog : http://blogs.technet.com/b/clientsecurity/
The Forefront Identity Manager blog : http://blogs.msdn.com/b/ms-identity-support/
The Forefront TMG blog: http://blogs.technet.com/b/isablog/
The Forefront UAG blog: http://blogs.technet.com/b/edgeaccessblog/
Table of Contents: PASS Summit
Follow us on @SQLServer and Microsoft SQL Server on Facebook to stay up to date on the latest news and announcements about Microsoft’s participation at PASS Summit 2014
Register to attend PASS Summit 2014
C++ Runtime for Sideloaded Windows 8.1 apps
[Announcement] ODataLib 6.6.0 Release
We are happy to announce that ODL 6.6.0 has been released and is available on NuGet along with the source code on CodePlex (please read the git history for the v6.6.0 code info and all previous versions). Detailed release notes are listed below.
Bug Fix
- Fixed a bug where the client could not serialize/deserialize a collection property whose element type is an abstract complex type but whose element values are concrete
New Features
- Add Instance Annotation on ODataProperty and ODataComplexValue
- UrlParser supports some arithmetic operators and built-in functions
  - Support for the add and sub operators between DateTimeOffset and Duration
  - Support for the negation operator for Duration
  - Support for the following built-in functions
- ODL supports advertising functions/actions which are bound to a collection of entities in the payload by using name/value pairs of title and target
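For example, with the new arithmetic support the URL parser can handle a filter along these lines (the entity set and property names here are made up purely for illustration):

GET serviceRoot/Trips?$filter=StartsAt add duration'PT10H' gt 2014-08-01T00:00:00Z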
Call to Action
You and your team are very welcome to try out this new version if you are interested in the new features and fixes above. For any feature request, issue or idea please feel free to reach out to us at odatafeedback@microsoft.com.
GRIT 1.2 – Updated Tool to Import and Configure VM Role Gallery Items in WAP
Hello Readers!
Today I am happy to publish an updated version of the Gallery Resource Import Tool (GRIT).
This post summarizes the new features/enhancements in this version 1.2 and, of course, where to get it.
What is GRIT?
Let’s start with a short overview of the tool, understanding that full details and “how to use” instructions are available in the very first blog post about GRIT here.
Fully written in PowerShell, the “Gallery Resource Import Tool” (GRIT) aims at simplifying discovery and installation of VM Role Gallery Items in Windows Azure Pack (WAP), in addition to helping reduce manual errors when tagging virtual disks. Through this single tool, all the configuration and import steps can be achieved.
With GRIT, you can:
- Browse and download Gallery Resources available from Microsoft on the internet, or use a local copy from your disk
- Review the virtual disk prerequisites for that Gallery Resource, compare with your existing virtual hard disks, and optionally update these disks to match the requirements for the Gallery Resource
- Import the Resource Extension and/or Resource Definition
- List, update, remove Resource Extensions and Resource Definitions
Download location
What’s new and changed in version 1.2?
Note: New features added in version 1.1 are also detailed here.
GRIT now works with remote SPF servers
Depending on the actions chosen, GRIT leverages VMM and SPF cmdlets, and previous versions had issues when the tool was run on a machine other than the SPF server. With version 1.2, you can now run the tool on the SPF server or any other machine. You still need network access to the VMM and SPF servers, and the VMM cmdlets.
By default, you do not need to do anything different to benefit from this new feature: just like before, you specify the VMM and SPF servers in the tool’s parameters, directly in the script or via the command line, and GRIT will determine whether the SPF server is local or remote and run the SPF actions in the most appropriate way for the current configuration.
- When the SPF server is local, GRIT uses the same SPF calls as in previous versions
- When working with a remote SPF server, GRIT will execute remote PowerShell commands against the SPF server, through CredSSP authentication. When this happens, GRIT will ask you if you are OK with the CredSSP configuration being modified (understanding that it will be disabled again when the tool closes). If CredSSP changes should always be allowed and you do not want to see this popup every time you run the tool, you can set the $CredSSPChangesAlwaysAllowed parameter to $true.
Optional warning popup before CredSSP configuration when SPF server is remote
GRIT enabling and configuring CredSSP settings when launching and upon user approval (or if CredSSPChangesAlwaysAllowed is enabled)
GRIT configuring CredSSP settings when launching, and disabling CredSSP on exit
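Under the covers, the remote case is roughly equivalent to the following sketch. The server name and credentials are placeholders, and the SPF cmdlet shown is only an example of the kind of command that would run in the remote session; it is not GRIT’s actual code.

# Allow this machine to delegate fresh credentials to the SPF server
Enable-WSManCredSSP -Role Client -DelegateComputer 'spf01.contoso.com' -Force

# Run the SPF actions in a remote session on the SPF server, authenticated with CredSSP
Invoke-Command -ComputerName 'spf01.contoso.com' -Authentication Credssp -Credential (Get-Credential) -ScriptBlock {
    Import-Module SPFAdmin
    Get-SCSPFStamp
}

# Revert the CredSSP change afterwards, as GRIT does on exit
Disable-WSManCredSSP -Role Client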
Multiple selections are now enabled in the “Bonus Tools”
The “Bonus Tools” feature is the third tab in the tool and lets you list/update/remove Resource Extensions and Resource Definitions in an easy way in an existing environment. It is possible to just use this part of the tool without importing a Gallery Item (you still have to pick a gallery item when launching the tool, but can just go to the “Bonus Tools” tab afterwards).
In version 1.2, the drop-down lists now allow for multiple selections. So, for example, you can remove multiple Resource Extensions at the same time, or simultaneously set multiple Resource Definitions to “private”.
The list of Resource Extensions in the Bonus Tools tab now includes release and publisher information
Instead of just listing the Resource Extension name, release and publisher information are now also listed (just like for Resource Definitions). For example, this avoids confusion when there are multiple Resource Extensions with the same name but different releases. This is also illustrated in the previous screenshot.
Special thanks to Charles Joy on this feature!
GRIT now checks if it’s running in elevated mode, and re-launches itself as needed
Some of the PowerShell cmdlets used by GRIT require elevated mode (“Run As Administrator”), and this was a documented requirement in previous releases. With version 1.2, the tool checks whether it is indeed running as administrator and, if not, tries to re-launch itself in that mode. This simplifies execution of the tool, as there is no need to launch PowerShell ISE in administrator mode (or use any other method) before launching the tool. You can just right-click the script and execute it.
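The standard pattern for this kind of self-elevation looks something like the sketch below; it is not GRIT’s actual code, just an illustration of the technique.

$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    # Re-launch the current script elevated, then exit the non-elevated instance
    Start-Process -FilePath 'powershell.exe' -Verb RunAs -ArgumentList "-File `"$PSCommandPath`""
    exit
}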
Updated logging in the console window
For those of you familiar with other GUI-based tools available on our blog, GRIT now uses the same color-formatted, time-stamped logging as the SMART Documentation and Conversion Helper, which can be seen in some of the earlier screenshots.
Wrap up
I hope you will find these new/updated features useful, as you work with VM Role Gallery Items in the Windows Azure Pack. Thanks for reading, keep the feedback coming on the tool, and make sure to share back with the community any enhancements/updates you may make to this PowerShell-based tool!
Wrapping Up The Imagine Cup
It’s been a great week here at the Imagine Cup World Finals. We brought 125 of the top students from around the world, plus another 70+ Microsoft Student Partners here to Seattle, and these young developers really brought their A game.
These students have spent the past year (or more), developing impressive solutions on Microsoft platforms and technologies. I am particularly struck by the fact that six of the nine winning projects used Azure – web services, mobile services, SQL databases, etc. In a cloud-first, mobile-first world, where people need to access their work and home data and services on a variety of platforms and devices, Azure is front-and-center for developers in meeting those needs.
Looking back, Thursday night we celebrated some of our top Imagine Cup teams at Seattle’s Space Needle, with our judges, members of the press, and our friends representing Facebook and AppCampus. It was a great evening that honored the potential of young developers.
Today, I’m also pleased to share that we announced Imagine Cup is expanding to include middle and high school students worldwide. This commitment to younger developers is critical. Technology is being used by younger and younger people every day, and we want to ensure that we are reaching everyone we can who can benefit from Microsoft technology. We believe in the power of developers, and we also believe that it’s not enough to just consume technology – we recognize a need to enable the next generation to create their own games, their own apps, their own solutions, and express themselves and their goals through the power of the same technology they use every day.
We are a company founded by developers, built by developers, and we are committed to helping developers wherever and whoever they are. The makers, creators, doers, and dreamers of today will create the future waves of innovation that we will all benefit from. This is why we are expanding Imagine Cup to welcome a new generation of students.
But we also know that we can’t just open a door for developers – we are committed to helping those unaware of software development to see its possibilities. We will shine a light on this path, wherever possible. This aspiration is why we are marshaling the incredible talents of our seven thousand Microsoft Student Partners (MSPs) from around the world. MSPs are college students who are already leading the charge into the latest platforms and technologies, and we are inviting them to participate in teaching this December’s Hour of Code to younger students all over the planet.
Coding is the great equalizer – the future will be built on code, at all corners of the globe. At Microsoft, we have seen young people realize their great potential, to create bold new lives for themselves through the power of code again and again. This is what drives us to ensure that we are always reaching out to students in fresh, relevant ways.
As we close the Imagine Cup World Finals, I’m reminded why I am so enthusiastic to be the Chief Evangelist for this company. Where others talk, Microsoft acts. We engage with the world as it is and we support and empower those who would make it a better place for themselves, their families, and their fellow people. Students in our Imagine Cup and Microsoft Student Partner programs are transforming their skills, their lives and fulfilling their dreams, making something better and more powerful along the way.
Thanks to everyone that made the Imagine Cup such a great success!
Cheers,
Guggs
A particularly convincing nefarious ad
As a researcher with the Microsoft Malware Protection Center (MMPC), I see a lot of digital advertising. Recently I came across a nefarious ad that is so convincing I need to warn you about it.
Below is a mock-up of the ad I saw. I’ve changed the name of the company to Contoso, which is a fictitious company used by Microsoft in examples and documentation:
Figure 1: The nefarious ad
At first glance, the ad seems to follow all of the criteria Microsoft has for clean advertising as explained in our objective criteria.
- Attribution: The ad has attribution; in this example it is attributed to Contoso Ads.
- An uninstall entry: If I check the Uninstall or change a program menu in Windows, I can find Contoso listed there.
- A close button: The ad has a close button – the grey ‘X’ in the top right corner. This is not to be confused with the red circle next to it, which has no function and is just part of the rest of the ad.
This ad is usually displayed by adware in the bottom left hand corner of the browser. However, this ad could be displayed by any other means, for example, embedded in a webpage, as a standalone popup, or something else.
What makes this ad exceptionally nefarious is that when you move your mouse over it, another ad appears in a new browser tab. Until recently the ad did not even have the text at the bottom that mentions the “rollover” functionality.
Some examples of this second pop-up ad are shown in figure 2. I have seen a lot of these pop-up ads, and they all fail to do two things. First, they do not tell you that they are an ad. Second, they do not display which program has caused the ad to be shown. The user has no indication that the ad in the new tab would not be there if it were not for the program that displays the first ad, in this case Contoso.
Microsoft considers this behavior adware and in this case we would detect and remove Contoso. Some of the examples of the second ad that I have seen look like this:
Figure 2: Some examples of the second ad displayed by this adware
The second ads look like real warnings, but they are not. They are advertisements designed to entice you to download a program, which will then offer you more programs to download. I suggest closing the page.
If you do see this ad I suggest you keep your mouse well away from it and do not attempt to close it using the close button (the grey 'X'). Instead, run a scan with an antivirus product, such as Microsoft Security Essentials. You should also see if there is an uninstaller for the program displaying the ad. In the example above there should be an uninstall entry for Contoso in the Uninstall or change a program menu. You might also want to check if you can disable an add-on entry for it in Internet Explorer.
Michael Johnson
MMPC
SharePoint Online simplifies admin interface plus adds new controls over user experience
Mark Kashman (@mkashman) is a senior product manager on the SharePoint marketing team.
Narrow in on what is important—the task at hand. And when that changes, refocus. The journey to simplify the Office 365 admin experience continues, this time with a focus on simplifying the SharePoint Online admin center. For those admins who don’t specialize in deep SharePoint management, our goal is to allow you to manage the service more easily while maintaining access to the most common controls.
With the new capabilities in Office 365 and SharePoint Online, you can now:
- Choose between a Simple or Advanced SharePoint Online admin user experience.
- Control which Office 365 top navigation items your users see.
- Block SharePoint 2013 workflows from being used in your environment.
Let’s dive into the details of these new capabilities that help you manage your service more efficiently.
Choose between a Simple or Advanced SharePoint Online admin interface
A common customer scenario in cloud computing is a basic one: get started. And why not start more simply? If you choose a phased approach—adopting various SharePoint workloads in sequence—you’ll benefit from seeing only that which you need at that time. To support this and other scenarios, SharePoint Online now enables admins to choose between Use Simple and Use Advanced to best map the required admin interface to your current use.
The simple admin center experience displays only the essential options used in the most common scenarios: site collection management, user profile management, and the main settings for external sharing, Information Rights Management, and more. When in simple mode, you don’t see the other admin control tabs. When you select the advanced admin center experience, you have access to all SharePoint Online management capabilities: Business Connectivity Services (BCS), the Term Store, the Secure Store, and more, in addition to site collections, user profiles, and settings.
See a little, see a lot. Choose which is right for you and your company to simplify the overall admin experience.
In Advanced mode in the SharePoint Online admin center, you can select the Use Simple (selected in the image above) or Use Advanced admin experience.
When you’re in Simple mode in the SharePoint Online admin center, the left-hand navigation shows only site collections, user profiles, and settings.
Control which Office 365 top navigation items your users see
Let’s continue with the scenario of a phased approach. If you want to use this approach, there are additional ways you can simplify the admin experience. One way is to show or hide certain elements of the Office 365 top navigation bar. Now admins can choose to show or hide OneDrive for Business, Yammer/Newsfeed, and/or Sites in users’ top global navigation bar. That means you can choose to align what your users see with corporate policy, instead of this choice being dictated by user license type.
The two screenshots below illustrate a scenario where a company that has rolled out email and calendaring using Exchange Online is now ready to provide one place for all their users’ work files: OneDrive for Business. The SharePoint Online admin can go into the settings in the admin center and select to hide Yammer/Newsfeed and Sites. Once you click OK, these elements no longer show in a user’s global top navigation. That way you can keep your users’ focus on learning and adopting OneDrive for Business. And when you’re ready to move on to establishing enterprise social or team site collaboration and want your users to see those, it’s as easy as a click or two.
Admins can choose which elements to show and hide on users’ top navigation bar. In this image, the admin has selected to hide Yammer/Newsfeed and Sites to keep users focused on OneDrive for Business.
Once you choose to hide Yammer/Newsfeed and Sites in the admin center, they no longer appear in users’ global Office 365 top navigation, as you can see in this view of a user’s OneDrive for Business.
Learn more about how to customize the Office 365 navigation bar.
Block SharePoint 2013 workflows from being used in your environment
Some customers are bound to certain company governance policies that can ultimately dictate which services can or cannot be leveraged within the company. Office 365 is continuing to provide transparency as to when services run within Office 365 and when they connect to other services. Starting now, you can choose to block the use of SharePoint 2013 workflows, which by default are published to and run from Microsoft Azure. Previously, you could only enable or disable all Preview Features. This now adds more granularity for certain features.
If you manage SharePoint Online workflows from SharePoint Designer 2013 and you select Block 2013 workflows, your users will not be able to create a new workflow. Also, if you previously published SharePoint 2013 workflows, you will no longer see them within SharePoint Designer 2013 and, more importantly, they will no longer be available from the web interface. Note: this does not block SharePoint 2010 workflows from being published or executed, because by default they are published into SharePoint Online, not Microsoft Azure.
At the bottom of the settings tab in the SharePoint Online admin center, you can select Block SharePoint 2013 workflows.
Stay informed and ahead of the curve
Be in the know. We publish upcoming innovation to the new Office 365 roadmap. You, too, can get the latest and greatest as soon as it’s available. Opt in by going to your Office 365 admin center, clicking SERVICE SETTINGS > Updates, and turning on First Release. Along with proactive and reactive messages in your Message Center, you’ll get messages keeping you informed about what’s coming up, and next time your team gathers at the water cooler, you’ll be the one with the relevant insight.
Thanks,
–Mark Kashman
Frequently asked questions
Q: Which Office 365 plans will get these SharePoint Online admin center improvements?
A: Any plan that has a SharePoint Online admin center will receive these new admin controls: Office 365 Enterprise E1, E3, and E4; Office 365 Education A2, A3, and A4; Office 365 Government G1, G3, and G4; and Office 365 Midsize Business.
Q. Does this announcement apply to Office 365 dedicated subscribers?
A. No. The Office 365 dedicated plans are not receiving this same update because they are managed in a unique, isolated infrastructure, with different methods for customer-accessible management.
Pie in the Sky (April 1st, 2014)
Missed last week as I was out of town, so catching up a bit this week.
Cloud
Packer.io, OpenNebula, Docker, Kubernetes: Coming to Azure & Hyper-V
Responsive images as a service: From Telerik
Client/Mobile
A quick look at WinJS: Another thing MSOpen Tech has worked on open sourcing. Cross-platform UI stuff.
Polymer vs. X-Tag: Differences between the two.
Sharing styles across web components with Polymer and core-style: Reusing stuff FTW!
Node.js
V8 memory corruption and stack overflow: Fixed in Node.js v0.8.28 and v0.10.30.
Current detection with Johnny-Five: Hardware stuff meets Node.js
Generate ASCII boxes: It's the late 80's all over again.
Node.js tools for VS 1.0 beta 2: New bits for Node.js devs that use Visual Studio.
Misc.
Call for Kinect samples: MSOpen Tech is looking for some samples for the Kinect.
Open Source OneNote libraries for Android and iOS: Because OneNote rocks.
Introduction to WAI-ARIA: Because the web must be more accessible.
Blockies: For icons.
Octohost: Docker based mini-PaaS.
Enjoy!
- Larry
SQL Server Database Projects and Team Foundation Build
This post provides walkthroughs that discuss the use of SQL Server Database Projects (.sqlproj files) with Team Foundation Build in Team Foundation Server 2013. The scenarios addressed are:
- Configuring a Build
- Configuring a Build with Unit Tests
- Automatically Publishing a Database Project after a Successful Build
- Automatically Publishing a Database Project that includes Deployment Contributors
- Special Considerations for Visual Studio Online
Configuring a Build
This walkthrough covers the process of configuring a scheduled build using Team Foundation Build in TFS 2013 and Visual Studio 2013.
Prerequisites
- A build service should be installed and configured for your TFS instance. If a build service has not been installed or configured for your TFS instance, then install a build service by following the instructions at http://msdn.microsoft.com/en-us/library/ee259687.aspx.
- The version of Visual Studio that is used by your organization to develop the SQL Server Database Project should be installed on the build machine. If Visual Studio has not been installed, or if the installed version is different than the one used to develop the SQL Server Database Project, then install Visual Studio by following the instructions at http://msdn.microsoft.com/en-us/library/e2h7fzkw.aspx.
- The version of SQL Server Data Tools that is used by your organization to develop the SQL Server Database Project should be installed on the build machine. If you have not already done so, we recommend updating to the latest version of SQL Server Data Tools by following the instructions at http://msdn.microsoft.com/en-us/data/hh297027. This is because significant changes have been made to SQL Server Data Tools since the RTM version of Visual Studio 2013. Updating will ensure that you have all of the latest features and fixes. Note: Visual Studio 2013 includes SQL Server Data Tools. In Visual Studio 2013, you can check for updates to SQL Server Data Tools by using the Tools > Extensions and Updates dialog. You can find the installed version of SQL Server Data Tools by looking at the Help > About Microsoft Visual Studio dialog.
Creating a New Build Definition
- Follow the instructions found at http://msdn.microsoft.com/en-us/library/ms181716.aspx to create a new build definition. Note: When selecting the solution or project to be built, you can choose either the SQL Server Database Project (.sqlproj file) itself or a solution (.sln file) that contains the SQL Server Database Project.
- Your build definition should now appear in the Team Explorer – Builds page (Keyboard: Ctrl+0, B) under All Build Definitions.
- Right-click on your build and then click on Edit Build Definition... in the context-menu.
- Click on the Process tab and find the Build process parameters grid.
- Expand the Advanced node in the Build section to reveal the MSBuild arguments field.
- Insert /p:VisualStudioVersion=12.0 in the MSBuild arguments field.
- Click File and then click Save to save the build definition.
- To verify that the new build definition works correctly, queue a new build by following the instructions at http://msdn.microsoft.com/en-us/library/ms181722.aspx.
Configuring Unit Tests
Once Team Foundation Build is configured to build your database project, Team Foundation Build can also execute your database unit tests after every build. By default, Team Foundation Build will attempt to execute unit test projects it finds in the build output, but some additional configuration is necessary to get database unit tests working correctly. This walkthrough covers the process of configuring unit test execution using Team Foundation Build in TFS 2013 and Visual Studio 2013.
Prepare Login Credentials
- When executing database unit tests, Team Foundation Build will connect to an instance of SQL Server. Decide which SQL Server instance Team Foundation Build will use for unit testing. If you already have a working SQL Server database unit test project, then you can find the connection information used by that project in the SQL Server Test Configuration dialog:
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate your database unit test project and right click on it.
- In the context menu, click on SQL Server Test Configuration...
- Click Edit Connection… to view the connection information.
- Decide which credentials Team Foundation Build will use to connect to SQL Server. If you already have logins for unit testing that use SQL Server authentication and have sufficient permissions, then you may be able to re-use those credentials. Documentation on the permissions required for database unit tests can be found at http://msdn.microsoft.com/en-us/library/jj889462.aspx#DatabaseUnitTestingPermissions. Note that the permissions required depend on the options selected and features used in the database project and unit test. For example, a unit test that deploys a database project that contains a SQLCLR assembly requires a login that is a member of the sysadmin role.
- If you need to create a new login, choose whether to create a SQL Server authentication login or a Windows authentication login for the Windows account under which Team Foundation Build executes. To create a new login, follow the instructions at http://technet.microsoft.com/en-us/library/aa337562.aspx
Note: Certain considerations apply when choosing between SQL Server authentication and Windows authentication.
Windows Authentication – Team Foundation Build can use Windows Authentication for SQL Server login only when Team Foundation Build and SQL Server are running on the same machine, or when both the Team Foundation Build and SQL Server machines are joined to the same domain.
To find the Windows account under which Team Foundation Build executes, look in the Team Foundation Server Administration Console’s Build Configuration tab.
If the Team Foundation Build service runs as NT AUTHORITY\NETWORK SERVICE, a login in SQL Server can be created using the pattern DOMAIN\TFS_BUILD_SERVER_NAME$. For example, if the Team Foundation Build server is named BldSrvr1 and both it and the target SQL Server are joined to the domain CONTOSO, then a Windows authentication login in SQL Server for CONTOSO\BldSrvr1$ would allow the Team Foundation Build service to login. For production environments, Microsoft advises against configuring a SQL Server login for NT AUTHORITY\NETWORK SERVICE, and instead suggests that you create a domain account specifically for Team Foundation Build.
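As an illustration of that pattern (the domain and machine names below are the placeholders from the example above), such a Windows authentication login could be created with a T-SQL statement along these lines:
CREATE LOGIN [CONTOSO\BldSrvr1$] FROM WINDOWS;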
SQL Server Authentication – Provided that SQL Server is configured to allow SQL Server Authentication, Team Foundation Build can use a SQL Server Authentication login. Instructions on enabling SQL Server Authentication are available at http://msdn.microsoft.com/en-us/library/ms188670.aspx.
When using SQL Server Authentication, a common practice is to store the password as part of the connection string, but Microsoft recommends against storing the password to a production database in your code.
The Team Foundation Server Administration Console’s Build Configuration tab displays which Windows user account the Build Service executes under.
Configuring Test Execution
- If you do not already have a database unit test project, instructions on creating a unit test project are available at http://msdn.microsoft.com/en-us/library/jj851203.aspx. Follow the instructions at http://msdn.microsoft.com/en-us/library/ms181407.aspx to check in the new unit test project.
- Follow the instructions at http://msdn.microsoft.com/en-us/library/jj851221.aspx to configure SQL Server unit test execution. Additional documentation regarding connection strings is available at http://msdn.microsoft.com/en-us/library/jj851219.aspx.
- If your build definition contains multiple Configurations (for example, both x86|Release and Any CPU|Release), then it is recommended that you not attempt to have the database unit test deploy the database. Instead, follow the steps in Publish Option 4 documented below to create a customized build process template to deploy your database.
- If you configured your database unit test to deploy the database, you will need to modify the unit test to set an environment variable named VisualStudioVersion. This change is necessary because the deployment mechanism used by the database unit test relies on the VisualStudioVersion environment variable, and Team Foundation Build does not automatically set the variable. If your database unit test settings are not configured to deploy the database, modifying the unit test to set the environment variable is not necessary. Note: The value of VisualStudioVersion should be set to 12.0, which corresponds to Visual Studio 2013. Setting this environment variable in this way will limit the execution of database unit tests solely to Visual Studio 2013.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate the file SqlDatabaseSetup under your unit test project and open it.
- If your unit test project is written in VB, insert the following code in the class:
Shared Sub New()
    Environment.SetEnvironmentVariable("VisualStudioVersion", "12.0")
End Sub
- If, instead, your unit test project is written in C#, insert the following code in the class:
static SqlDatabaseSetup()
{
    Environment.SetEnvironmentVariable("VisualStudioVersion", "12.0");
}
- If you want your database unit test to deploy the database to LocalDB\ProjectsV12, you will need to add a pre-test script to your build to create the instance. Running this command multiple times is safe.
- Identify the path to SqlLocalDB.exe on the build server. By default, the latest version of SQL Server Data Tools installs SqlLocalDB.exe to:
%ProgramFiles%\Microsoft SQL Server\120\Tools\Binn\SqlLocalDB.exe
Earlier versions of SQL Server Data Tools installed SqlLocalDB.exe to:
%ProgramFiles%\Microsoft SQL Server\110\Tools\Binn\SqlLocalDB.exe
In the case of Visual Studio Online, the version installed at the time this blog post was published uses the latter path.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate your database unit test project and right click on it.
- In the context menu, click on Add New Item...
- In the Add New Item dialog, click on Visual C# Items and then click on Text File.
- Type a filename ending with .cmd into the Name field. For example, SetupLocalDB.cmd.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate the file you created in the previous step and double click on it to open the file.
- Enter the following text in the file, being sure to enter the path to SqlLocalDB.exe as found in step A.
"%ProgramFiles%\Microsoft SQL Server\120\Tools\Binn\SqlLocalDB.exe" create ProjectsV12 -s
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), right click on the file you just edited.
- In the context menu, click on Properties.
- In the Properties grid, change the Copy to Output Directory field to Copy always.
- Click on File and then Save All to save the changes to your solution.
- Check-in the changes to your solution. Instructions on checking-in changes are available at http://msdn.microsoft.com/en-us/library/ms181407.aspx.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition... in the context-menu.
- Click on the Process tab and find the Build process parameters grid.
- Expand the Advanced node in the Test section to reveal the Pre-test script path field.
- Enter the following text in the Pre-test script path field, being sure to replace the file name with the name you chose in a previous step.
$(TF_BUILD_BINARIESDIRECTORY)\SetupLocalDB.cmd
- If the Output location field in the Build section is set to PerProject, then insert the name of the project or solution file listed in the Projects field into the path. For example, if the Projects field contains YourApp\ContosoApp.sln then the resulting path should look similar to:
$(TF_BUILD_BINARIESDIRECTORY)\ContosoApp\SetupLocalDB.cmd
- Click File and then click Save to save the changes to your build definition.
- If you are using SQL Server Authentication, modify the app.config file to store the connection password. Note: Microsoft recommends that you do not store the password to a production database in your code.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate a file named app.config inside of your database unit test project and double-click on the file to open it.
- In the app.config file, locate the Execution Context and Privileged Context lines that look similar to this:
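(The attribute values below are illustrative placeholders; your server, database, and login names will differ.)
<ExecutionContext Provider="System.Data.SqlClient" ConnectionString="Data Source=YourSQLServer;Initial Catalog=YourTargetDatabase;User ID=YourLogin;Pooling=False" CommandTimeout="30" />
<PrivilegedContext Provider="System.Data.SqlClient" ConnectionString="Data Source=YourSQLServer;Initial Catalog=YourTargetDatabase;User ID=YourLogin;Pooling=False" CommandTimeout="30" />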
- Modify these lines by inserting the password into the connection string, such that the result looks similar to this:
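(Again, illustrative placeholders; only the Password key is new.)
<ExecutionContext Provider="System.Data.SqlClient" ConnectionString="Data Source=YourSQLServer;Initial Catalog=YourTargetDatabase;User ID=YourLogin;Password=YourPassword;Pooling=False" CommandTimeout="30" />
<PrivilegedContext Provider="System.Data.SqlClient" ConnectionString="Data Source=YourSQLServer;Initial Catalog=YourTargetDatabase;User ID=YourLogin;Password=YourPassword;Pooling=False" CommandTimeout="30" />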
- If you have configured the database unit test to deploy the database, then you need to modify the path to the database project stored in your app.config file. This is necessary because Team Foundation Build uses a different directory structure than Visual Studio when executing unit tests.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate a file named app.config inside of your database unit test project and double-click on the file to open it.
- In the app.config file, locate the Database Deployment line that looks similar to this:
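(Illustrative example; the project file name and configuration will match your own project.)
<DatabaseDeployment DatabaseProjectFileName="..\..\..\DBProject\DBProject.sqlproj" Configuration="Release" />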
- If your app.config file does not contain a Database Deployment line then proceed to step 6.
- If your Team Foundation Server Team Project uses Git as the source control provider, then identify the relative path to the database project from the Git repository path.
- In the Team Explorer – Connect page (Keyboard: Ctrl+0, C), find your Team Project in the list of Team Projects.
- Hover your mouse cursor over your Team Project. A tooltip will appear that shows (among other things) the Repository Path.
- The relative path from the Team Project repository to the database project is the path to the database project (the .sqlproj file) without the Repository Path. For example, if your database project resides on disk at C:\GitRepo\YourTeamProject\YourApp\DB\DBProject\DBProject.sqlproj and the Team Project repository path is C:\GitRepo\YourTeamProject, then the relative path is YourApp\DB\DBProject\DBProject.sqlproj
- Open Notepad (Keyboard: Win+R, notepad, Enter) and type or paste into the window the relative path you found in the previous step.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition... in the context-menu.
- Click on the Process tab and find the Output location setting in the Build section. If the Output location setting is Single Folder, then return to the Notepad window and insert ..\src\ in front of the relative path you entered previously. Otherwise, return to the Notepad window and insert ..\..\src\ in front of the relative path you entered previously. For example, if the relative path you found was YourApp\DB\DBProject\DBProject.sqlproj and your Output location setting is Per Project, then the final path in your Notepad window should be ..\..\src\YourApp\DB\DBProject\DBProject.sqlproj
- If, on the other hand, your Team Foundation Server Team Project uses Team Foundation Version Control, then identify the relative path from the mapped source location to the database project.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition... in the context-menu.
- Click on the Source Settings tab and examine the Working folders mappings. These settings describe the mapping from the Team Foundation Server source control folder to the build agent's hard drive. When the build occurs, the selected folders under TFS source control are downloaded to the build agent's hard drive according to the mapping.
- Locate the row whose Source Control Folder contains your database project (.sqlproj file) and determine the relative path to your project from the mapped Source Control Folder. For example, if your database project is in TFS source control at $/YourTeamProject/YourApp/DB/DBProject/DBProject.sqlproj and your Source Settings tab mapping row maps the Source Control Folder $/YourTeamProject/YourApp, then the relative path is DB\DBProject\DBProject.sqlproj.
- Open Notepad (Keyboard: Win+R, notepad, Enter) and type or paste into the window the relative path you found in the previous step.
- From the same row in the Source Settings tab find the Build Agent Folder. Copy the Build Agent Folder and paste it into the Notepad Window in front of the relative path you found previously, followed by a \ character. For example, if your Notepad window contains DB\DBProject\DBProject.sqlproj and the Build Agent Folder is ($SourceDir)\AppBuild, then the modified path in your Notepad window should be ($SourceDir)\AppBuild\DB\DBProject\DBProject.sqlproj
- If the path in your Notepad window starts with ($SourceDir) then return to the Visual Studio build definition window and click on the Process tab. On that tab, find the Output location setting in the Build section. If the Output location setting is Single Folder, then replace ($SourceDir) in your Notepad window with ..\src\. Otherwise, replace ($SourceDir) in your Notepad window with ..\..\src\. For example, if your Notepad window contains ($SourceDir)\AppBuild\DB\DBProject\DBProject.sqlproj and your Output location setting is Per Project, then the final path in your Notepad window should be ..\..\src\AppBuild\DB\DBProject\DBProject.sqlproj
- Return to the app.config file and modify the Database Deployment line by replacing the quoted value with the content of your Notepad file.
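For example, using the illustrative Git paths from the earlier steps, the Database Deployment line might look like this before editing:
<DatabaseDeployment DatabaseProjectFileName="..\..\..\YourApp\DB\DBProject\DBProject.sqlproj" Configuration="Release" />
And after editing it might look like this:
<DatabaseDeployment DatabaseProjectFileName="..\..\src\YourApp\DB\DBProject\DBProject.sqlproj" Configuration="Release" />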
- Save (Keyboard: Ctrl+S) your app.config file.
- [Optional] Database unit test projects store test execution settings in their app.config file. Consequently, every test execution uses the same settings. It may be the case, though, that Team Foundation Build requires different settings to execute your unit test than the settings required for developers to execute the unit test locally. In this scenario, you can create config files for individual users or machines. The steps below document how to do this by modifying the app.config file and adding a [machine].sqlunittest.config or a [user].sqlunittest.config file to the project.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate a file named app.config inside of your database unit test project and right-click on it.
- In the context menu, click on Copy.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate your database unit test project and right-click on it.
- In the context menu, click on Paste.
- A file named app - Copy.config should appear in the Solution Explorer tree.
- Right-click on the file app - Copy.config and click on Rename in the context menu.
- The name of your file determines where it will be used. If you want the settings to be applied when the database unit test is run by a specific user, then name the file [UserName].sqlunittest.config. Alternately, if you want the settings to be applied when the database unit test is run on a specific computer, then name the file [ComputerName].sqlunittest.config. For example, to have settings that are applied when the database unit test is run by a user whose user name is jdoe, name the file jdoe.sqlunittest.config.
- Double click on the renamed sqlunittest.config file to open it.
- In the sqlunittest.config file, delete everything except the section that starts with the element named SqlUnitTesting_VS2013. The resulting edited file should look similar to this:
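(A sketch of what remains; the exact elements, connection strings, and project path will reflect your own project.)
<SqlUnitTesting_VS2013>
  <DatabaseDeployment DatabaseProjectFileName="..\..\..\DBProject\DBProject.sqlproj" Configuration="Release" />
  <ExecutionContext Provider="System.Data.SqlClient" ConnectionString="Data Source=YourSQLServer;Initial Catalog=YourTargetDatabase;Integrated Security=True;Pooling=False" CommandTimeout="30" />
  <PrivilegedContext Provider="System.Data.SqlClient" ConnectionString="Data Source=YourSQLServer;Initial Catalog=YourTargetDatabase;Integrated Security=True;Pooling=False" CommandTimeout="30" />
</SqlUnitTesting_VS2013>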
- Edit the connection strings and database project path in the sqlunittest.config file as needed for the specified user or machine.
- In the Solution Explorer tree (Keyboard: Ctrl+Alt+L), locate the file named app.config inside of your database unit test project and double click on the file to open it.
- Locate the element named SqlUnitTesting_VS2013 and modify it by inserting an attribute named AllowConfigurationOverride with the value true. The resulting line should look like this:
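(Shown here with any other content of the element omitted.)
<SqlUnitTesting_VS2013 AllowConfigurationOverride="true">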
- Click on File and then Save All to save the changes to your solution.
- Check-in the changes to your solution. Instructions on checking-in changes are available at http://msdn.microsoft.com/en-us/library/ms181407.aspx.
Configuring Test Execution in the Build Definition
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition... in the context-menu.
- Click on the Process tab and find the Build process parameters grid.
- Expand the Advanced node in the Build section to reveal the MSBuild arguments field.
- Append /p:VisualStudioVersion=12.0 to the end of the MSBuild arguments field.
- Expand the Test Source node to reveal the Test sources spec field.
- Ensure that the name of your unit test project will be found by the Test sources spec pattern.
- Expand the Advanced node in the Test section to reveal the Disable tests field. Ensure that the Disable tests field is set to false.
- Click File and then click Save to save the build definition.
Automatically Publishing a Database Project after a Successful Build
Publishing a database project will update the schema of a target database to match the schema defined in the database project. In some application lifecycle management processes, it is desirable to automatically publish a database project as part of a daily or continuous integration build. Automatic publishing can be accomplished in multiple ways using Team Foundation Build. This document describes prerequisites applicable to all scenarios, and several common methods for achieving automatic publishing (in ascending order of difficulty).
Prerequisite – SQL Server Login Credentials
Team Foundation Build will need to connect to an instance of SQL Server in order to deploy your database project. Decide what credentials Team Foundation Build will use to connect to SQL Server. If you already have a login that uses SQL Server authentication and has sufficient permissions to deploy the database project, then you can re-use those credentials. Documentation on the minimum permissions required to create or deploy a database project can be found at http://msdn.microsoft.com/en-us/library/jj889462(v=vs.103).aspx#DatabaseCreationAndDeploymentPermissions. Note that the permissions required depend on the options selected and features used in the database project. For example, a database project that deploys a SQLCLR assembly to the target SQL Server requires a login that is a member of sysadmin role.
Otherwise, choose whether to create a SQL Server authentication login or a Windows authentication login for the Windows account under which Team Foundation Build executes, and create a login by following the instructions at http://technet.microsoft.com/en-us/library/aa337562.aspx
Notes:
- You can find the account under which Team Foundation Build executes in the Team Foundation Server Administration Console’s Build Configuration tab.
The Team Foundation Server Administration Console’s Build Configuration tab displays which Windows user account the Build Service executes under.
- It may be necessary to use a SQL Server authentication login if either the SQL Server instance or the Team Foundation Build server are not part of a domain.
- If the Team Foundation Build service runs as NT AUTHORITY\NETWORK SERVICE, a login in SQL Server can be created using the pattern DOMAIN\TFS_BUILD_SERVER_NAME$. For example, if the Team Foundation Build server is named BldSrvr1 and both it and the target SQL Server are joined to the domain CONTOSO, then a Windows authentication login in SQL Server for CONTOSO\BldSrvr1$ would allow the Team Foundation Build service to login. For production environments, Microsoft recommends against configuring a SQL Server login for NT AUTHORITY\NETWORK SERVICE, and instead suggests that you create a domain account specifically for Team Foundation Build.
Recommendation – Prepare a Publish Profile file
For any deployment option described in this section, you will need to specify the target database for deployment. You can either tell the system where to deploy the new database schema by using a Publish Profile (a publish.xml file) or, in some cases, you can use individual parameters to specify the connection information. A Publish Profile is recommended because it normally simplifies re-use. If you want to use a Publish Profile (a publish.xml file), follow these steps to create one:
- Right-click on the SQL Server Database Project tree node in the Solution Explorer tree and click on Publish… in the context menu.
- The Publish Database dialog allows you to configure a connection to a SQL Server database. Follow the instructions at http://msdn.microsoft.com/en-us/library/hh272687(v=vs.103).aspx to configure and test your connection.
- In the Publish Database dialog, click Save Profile As… to save the profile as a publish.xml file.
- Your Publish Profile file should now appear in Solution Explorer as a node under the SQL Server Database Project. Right-click on the Publish Profile file and click on Properties in the context menu.
- In the Properties grid, change the Copy to Output Directory value to Copy always. This will place the Publish Profile file into the output directory, making it easy to reference as an argument. Note: If you do not want the Publish Profile file included in the Output Directory, you can skip this step and leave the Copy to Output Directory value set to Do not copy. In this case, you can use the environment variable TF_BUILD_SOURCESDIRECTORY to specify the path to the Publish Profile file in the source code. Instructions regarding the use of this environment variable can be found at http://msdn.microsoft.com/en-us/library/hh850448.aspx.
- Follow the instructions at http://msdn.microsoft.com/en-us/library/ms181407.aspx to check in the new file and the modified SQL Server Database Project file.
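For reference, a saved Publish Profile is a small MSBuild file. A typical profile looks something like the following sketch (the target database name and connection string are placeholders, and the exact elements may vary):
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetDatabaseName>YourTargetDatabase</TargetDatabaseName>
    <TargetConnectionString>Data Source=YourSQLServer;Integrated Security=True;Pooling=False</TargetConnectionString>
    <ProfileVersionNumber>1</ProfileVersionNumber>
  </PropertyGroup>
</Project>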
Publish Option 1: Setting MSBuild arguments in the Build Definition
The default build processes in Team Foundation Build use a component named MSBuild to compile SQL Server Database Projects. MSBuild has an option that, when enabled, tells MSBuild to publish the SQL Server Database Project after a successful build.
The approach outlined below will configure Team Foundation Build to always pass arguments to MSBuild that tell it to publish the database project. This approach is straightforward, but it is not appropriate in some situations:
- The solution you are building contains more than one SQL Server Database Project, or you intend to add another SQL Server Database Project to the solution in the future. This will not work because you will need to specify target database information for each project individually, but the default build definition process uses the same set of arguments for each project.
- The solution you are building contains projects other than the SQL Server Database Project you want to publish, or you intend to add such a project to the solution in the future, and those other projects should not or cannot be published. This will not work because the default build definition process uses the same set of arguments for each project, and MSBuild should not attempt to publish those projects.
- You are unable or do not want to use a Publish Profile file. This will not work because the MSBuild publish task requires a Publish Profile file.
- You want to deploy the schema to more than one target database. This will not work because the MSBuild publish argument accepts only one Publish Profile file.
- You are using a non-default build process that does not use MSBuild or does not have a configuration setting for MSBuild arguments.
If this option is appropriate for your build, follow these steps to pass the appropriate arguments to MSBuild.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition… in the context-menu.
- Click on the Process tab and find the Build process parameters grid.
- Expand the Advanced node to reveal the MSBuild arguments field.
- In the MSBuild arguments field, add the following:
/t:Build;Publish /p:SqlPublishProfilePath=your_file_name.publish.xml
- Click File and then click Save to save the build definition.
Notes:
- If you would like to generate a deployment script without executing it on the target database, append the argument /p:UpdateDatabase=false to the MSBuild arguments field
- Additional MSBuild arguments for the publish task are available. See documentation here: http://msdn.microsoft.com/en-us/library/microsoft.data.tools.schema.tasks.sql.sqlpublishtask(v=vs.103).aspx
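For example, to build and generate (but not execute) the deployment script using the profile from the earlier example, the MSBuild arguments field might contain:
/t:Build;Publish /p:SqlPublishProfilePath=your_file_name.publish.xml /p:UpdateDatabase=false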
The Build process parameters grid in the Build Definition tab contains the MSBuild arguments field under Build > Advanced. In this case, a file called myserverprofile.publish.xml contains the connection information for the target database.
Publish Option 2: Calling SqlPackage.exe as a Post-Build Script
SQL Server Data Tools includes a command-line utility named SqlPackage.exe that can be used to publish a database, and TFS 2013’s default build process includes a Post-build script field that can be used to execute SqlPackage.exe.
The approach outlined below will configure Team Foundation Build to call SqlPackage.exe after building. This approach is straightforward, but it is not appropriate in some situations:
- Your solution contains multiple SQL Server Database Projects that you want to deploy. This will not work because you will need to call SqlPackage.exe for each database project individually, but the Post-build script field allows only one call to SqlPackage.exe.
- Your build definition contains multiple Configurations (for example, both x86|Release and Any CPU|Release). This will not work because the path needed in the Post-build script argument is different for each configuration.
- You want to deploy the schema to more than one target database. This will not work because you will need to call SqlPackage.exe for each target database individually, but the Post-build script field allows only one call to SqlPackage.exe.
- You already use the Post-build script path for a different purpose.
- You are using a build process other than the TFS 2013 default build process or your build process does not contain a Post-build script field.
If this option is appropriate for your build, follow these steps to call SqlPackage.exe after a build.
- Verify that SqlPackage.exe has been installed on the build server. SqlPackage.exe is a command-line utility that is included with SQL Server Data Tools. By default, on a 64-bit version of Windows the latest version of SQL Server Data Tools installs it in this location:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe
- Find the file name for the .dacpac file output by your SQL Server Database Project. You will use this name in later steps. By default, the output name matches the project name.
- In Solution Explorer, right-click on your SQL Server Database Project and click Properties in the context menu.
- In the Properties tab, click on the Build tab. The field Build output file name contains the string you will need.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition… in the context-menu.
- Click on the Process tab and find the Build process parameters grid.
- Expand the Advanced node to reveal the Post-build script path and Post-build script arguments fields.
- Put the path to SqlPackage.exe that you found in Step 1 into the Post-build script path field.
- If you are using a Publish Profile file, then in the Post-build script arguments field, enter:
/a:Publish /pr:$(TF_BUILD_BINARIESDIRECTORY)\your_file_name.publish.xml /sf:$(TF_BUILD_BINARIESDIRECTORY)\your_dacpac_name.dacpac
Note: If you have set the Output location field to PerProject, then you will need to insert the name of the solution or project being built into the paths to the Publish Profile and the dacpac file, like so:
/a:Publish /pr:$(TF_BUILD_BINARIESDIRECTORY)\YourProject\your_file_name.publish.xml /sf:$(TF_BUILD_BINARIESDIRECTORY)\YourProject\your_dacpac_name.dacpac
- If you are not using a Publish Profile file, then in the Post-build script arguments field, enter:
/a:Publish /tcs:"Data Source=YourSQLServer;Integrated Security=true;Initial Catalog=YourTargetDatabase;Pooling=false" /sf:$(TF_BUILD_BINARIESDIRECTORY)\your_dacpac_name.dacpac
Note: If you have set the Output location field to PerProject, then you will need to insert the name of the solution or project being built into the dacpac file path, like so:
/a:Publish /tcs:"Data Source=YourSQLServer;Integrated Security=true;Initial Catalog=YourTargetDatabase;Pooling=false" /sf:$(TF_BUILD_BINARIESDIRECTORY)\YourProject\your_dacpac_name.dacpac
- If you would like to generate a deployment script without executing it on the target database, use the argument /a:Script instead of /a:Publish and append the argument /OutputPath:$(TF_BUILD_DROPLOCATION)\DeploymentScript.sql (a complete example is shown below).
- Click File and then click Save to save the build definition.
Note: Additional SqlPackage.exe arguments are available. See documentation here: http://msdn.microsoft.com/en-us/library/hh550080(v=vs.103).aspx
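Putting those notes together, a script-only variant of the Publish Profile arguments shown earlier might look like:
/a:Script /pr:$(TF_BUILD_BINARIESDIRECTORY)\your_file_name.publish.xml /sf:$(TF_BUILD_BINARIESDIRECTORY)\your_dacpac_name.dacpac /OutputPath:$(TF_BUILD_DROPLOCATION)\DeploymentScript.sql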
The Build process parameters grid contains the Post-build script path and arguments fields under Build > Advanced. In this case, a file called myserverprofile.publish.xml contains the connection information for the target database.
Publish Option 3: Calling SqlPackage.exe from a Batch File
SQL Server Data Tools includes a command-line utility named SqlPackage.exe that can be used to publish a database, and TFS 2013’s default build process includes a Post-build script field that can be used to execute a batch file.
The approach outlined below will configure Team Foundation Build to call a batch file after building, and the batch file will in turn call SqlPackage.exe. This approach is not appropriate in some situations:
- Your build definition contains multiple Configurations (for example, both x86|Release and Any CPU|Release). This will not work because the path to the Post-build script is different for each configuration.
- You already use the Post-build script path for a different purpose.
- You are using a build process other than the TFS 2013 default build process or your build process does not contain a Post-build script field.
If this option is appropriate for your build, follow these steps:
- Verify that SqlPackage.exe has been installed on the build server. SqlPackage.exe is a command-line utility that is included with SQL Server Data Tools. By default, on a 64-bit version of Windows the latest version of SQL Server Data Tools installs it in this location:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe
- Find the file name(s) for the .dacpac file(s) output by your SQL Server Database Project(s). You will use the name(s) in later steps. By default, the output name matches the project name.
- In Solution Explorer, right-click on your SQL Server Database Project and click Properties in the context menu.
- In the Properties tab, click on the Build tab. The field Build output file name contains the string you will need.
- Open Notepad (Keyboard: Win+r, notepad, Enter)
- If you are using a Publish Profile (publish.xml) file, then enter the following text, all on one line:
"C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe" /a:Publish /pr:%TF_BUILD_BINARIESDIRECTORY%\your_file_name.publish.xml /sf:%TF_BUILD_BINARIESDIRECTORY%\your_dacpac_name.dacpac
Note: If you have set the Output location field to PerProject, then you will need to insert the name of the solution or project being built into the paths, like so:
"C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe" /a:Publish /pr:%TF_BUILD_BINARIESDIRECTORY%\YourProject\your_file_name.publish.xml /sf:%TF_BUILD_BINARIESDIRECTORY%\YourProject\your_dacpac_name.dacpac
- If you are not using a Publish Profile file, then enter the following text, all on one line:
"C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe" /a:Publish /tcs:"Data Source=YourSQLServer;Integrated Security=true;Initial Catalog=YourTargetDatabase;Pooling=false" /sf:%TF_BUILD_BINARIESDIRECTORY%\your_dacpac_name.dacpac
Note: If you have set the Output location field to PerProject, then you will need to insert the name of the solution or project being built into the path, like so:
"C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe" /a:Publish /tcs:"Data Source=YourSQLServer;Integrated Security=true;Initial Catalog=YourTargetDatabase;Pooling=false" /sf:%TF_BUILD_BINARIESDIRECTORY%\YourProject\your_dacpac_name.dacpac
- If you would like to generate a deployment script without executing it on the target database, use the argument /a:Script instead of /a:Publish and append the argument /OutputPath:%TF_BUILD_DROPLOCATION%\DeploymentScript.sql.
- Repeat step 4 or step 5 for each SQL Server Database Project and target database. Place each call to SqlPackage.exe on its own new line.
- Click File and then click Save. Name the save file your_name_here.cmd and make note of its location.
- In Solution Explorer, right-click on your SQL Server Database Project and click on Add and then Existing Item...
- In the Add Existing Item dialog, navigate to and select the batch file you created (your_name_here.cmd).
- Note: If you have multiple SQL Server Database Projects in a single solution, it is not necessary to add this batch file to each project – only one copy is necessary. Also, it is not required that the batch file be added to a SQL Server Database Project. It’s sufficient to simply add the batch file somewhere under source control, provided that it is in a folder that is part of the set of Working folders defined on the Source Settings tab of the Team Foundation Build definition.
- Your batch file should now appear in Solution Explorer as a node under the SQL Server Database Project. Right-click on the batch file and click on Properties in the context menu.
- In the Properties grid, change the Copy to Output Directory value to Copy always. This will place the batch file into the output directory, making it easy to reference as an argument. Note: If you do not want the batch file included in the Output Directory, you can skip this step and leave the Copy to Output Directory value set to Do not copy. In this case, in subsequent steps you will need to use the environment variable TF_BUILD_SOURCESDIRECTORY to specify the path to the batch file in the source code. Instructions regarding the use of this environment variable can be found at http://msdn.microsoft.com/en-us/library/hh850448.aspx.
- Follow the instructions at http://msdn.microsoft.com/en-us/library/ms181407.aspx to check in the new file and the modified SQL Server Database Project file.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition… in the context-menu.
- Click on the Process tab and find the Build process parameters grid.
- Expand the Advanced node to reveal the Post-build script path field.
- In the Post-build script path field, enter:
$(TF_BUILD_BINARIESDIRECTORY)\your_name_here.cmd
Note: If you have set the Output location field to PerProject, then you will need to insert the name of the solution or project being built into the path, like so:
$(TF_BUILD_BINARIESDIRECTORY)\YourProject\your_name_here.cmd
- Click File and then click Save to save the build definition.
Note: Additional SqlPackage.exe arguments are available. See documentation here: http://msdn.microsoft.com/en-us/library/hh550080(v=vs.103).aspx
The Build process parameters grid contains the Post-build script path under Build > Advanced. In this case, a file called your_name_here.cmd calls SqlPackage.exe in order to publish the SQL Server Database Project.
Publish Option 4: Modifying the Build Process Template
In Team Foundation Build, build processes are Windows Workflow (XAML) files. The Windows Workflow file used for your build can be customized to call SqlPackage.exe. This option provides the greatest flexibility at the cost of greater complexity.
- Follow the instructions at http://msdn.microsoft.com/en-us/library/dd647551.aspx to create a custom template and Workflow project for your build definition.
- Verify that SqlPackage.exe has been installed on the build server. SqlPackage.exe is a command-line utility that is included with SQL Server Data Tools. By default, on a 64-bit version of Windows the latest version of SQL Server Data Tools installs it in this location:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe
- Find the file name(s) for the .dacpac file(s) output by your SQL Server Database Project(s). You will use the name(s) in later steps. By default, the output name matches the project name.
- In Solution Explorer, right-click on your SQL Server Database Project and click Properties in the context menu.
- In the Properties tab, click on the Build tab. The field Build output file name contains the string you will need.
- Open your Templates solution that you created as part of Step 1 in Visual Studio.
- In Solution Explorer locate CustomTemplate.xaml, and open it by double-clicking on it.
- You will need to locate an appropriate place in the Workflow process after the build has been performed to execute SqlPackage.exe. If your custom template is derived from the default TFS 2013 build process template, then a reasonable place in the process is inside of Overall build process>Run on agent>Try>Compile, Test and Publish, immediately after the activity Run optional script after MSBuild.
- Open the Toolbox window (Keyboard: Ctrl+Alt+X) and find RunScript under Team Foundation Build Activities.
- Drag and drop RunScript from the Toolbox into the appropriate place in the Workflow.
- In the Workflow, right-click on the new RunScript action and click on Properties in the context menu.
- In the Properties grid, set the DisplayName field to Run SqlPackage.exe.
- In the Properties grid, set the FilePath field to:
"C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe"
- Find the path to your dacpac file within the Team Foundation Build working directory. This path will vary depending on your build process and build configuration. For example, if your build is based on the default process template, has an Output location field value of PerProject, and has multiple target configurations, then a resulting path might look similar to:
$(TF_BUILD_BINARIESDIRECTORY)\Release\YourProject\your_dacpac_name.dacpac
If you're uncertain of the correct path, you can consult the MSBuild log file from a successful build to find the path.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find a successful build in the My Builds list.
- Double click on the build to open it.
- Click on View Log.
- Scroll down through the log and click on MSBuild Log File.
- Search (Keyboard: Ctrl+F) the document for the name of your database unit test project assembly (for example, DBProjectUnitTests.dll). You should find a line similar to this:
Copying file from "obj\Release\DBProjectUnitTests.dll" to "C:\Builds\2\SomeTeamProject\SampleBuildDefinition\bin\Release\DBProject\DBProjectUnitTests.dll".
The first part of the destination path (C:\Builds\2\SomeTeamProject\SampleBuildDefinition\bin) should be replaced by the environment variable TF_BUILD_BINARIESDIRECTORY before use, as it will vary depending on the build agent. The remaining part (\Release\DBProject\) is not included in the environment variable, so you will need to insert it into the path used in the subsequent steps.
- If you are using a Publish Profile (publish.xml) file, then in the Properties grid, set the Arguments field to:
"/a:Publish /pr:$(TF_BUILD_BINARIESDIRECTORY)\your_file_name.publish.xml /sf:$(TF_BUILD_BINARIESDIRECTORY)\your_dacpac_name.dacpac"
Note: Edit the paths in the above example to match the path structure for your build process, as you found in the previous step.
- If you are not using a Publish Profile file, then in the Properties grid, set the Arguments field to:
"/a:Publish /tcs:""Data Source=YourSQLServer;Integrated Security=true;Initial Catalog=YourTargetDatabase;Pooling=false"" /sf:$(TF_BUILD_BINARIESDIRECTORY)\your_dacpac_name.dacpac"
Note: Edit the path in the above example to match the path structure for your build process, as you found in the previous step.
- If you would like to generate a deployment script without executing it on the target database, use the argument /a:Script instead of /a:Publish and append the argument /OutputPath:$(TF_BUILD_DROPLOCATION)\DeploymentScript.sql. (A worked example of the resulting property values appears after this list.)
- Repeat steps 8 through 15 for each SQL Server Database Project, configuration and target database.
- Follow the instructions at http://msdn.microsoft.com/en-us/library/ms181407.aspx to check in the custom build process template.
- If you have not already done so, follow the instructions at http://msdn.microsoft.com/en-us/library/dd647551.aspx to configure your build to use the customized build process.
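Putting the pieces together: assuming the default TFS 2013 template, an Output location of PerProject, and a Release configuration (the project and file names below are placeholders), the RunScript activity's properties for publishing with a Publish Profile might end up looking like this:

DisplayName: Run SqlPackage.exe
FilePath: "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe"
Arguments: "/a:Publish /pr:$(TF_BUILD_BINARIESDIRECTORY)\Release\YourProject\your_file_name.publish.xml /sf:$(TF_BUILD_BINARIESDIRECTORY)\Release\YourProject\your_dacpac_name.dacpac"

To generate a deployment script instead of publishing, the Arguments value would switch to /a:Script and add an output path:

Arguments: "/a:Script /pr:$(TF_BUILD_BINARIESDIRECTORY)\Release\YourProject\your_file_name.publish.xml /sf:$(TF_BUILD_BINARIESDIRECTORY)\Release\YourProject\your_dacpac_name.dacpac /OutputPath:$(TF_BUILD_DROPLOCATION)\DeploymentScript.sql"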
Automatically Publishing a Database Project that Includes Deployment Contributors
Deployment contributors are custom .NET classes that can modify or interact with the publishing of a database schema. The SQL Server Database Project deployment engine will search for deployment contributors in the same directory as the deployment code. This section details approaches to using deployment contributors with Team Foundation Build.
Manually Place the Deployment Contributor in the Program Files directory
Some deployment contributors almost never change after having been written. In this case, manually placing the deployment contributor assemblies into the Program Files directory on the build server is a good option.
- Verify that SqlPackage.exe has been installed on the build server. SqlPackage.exe is a command-line utility that is included with SQL Server Data Tools. By default, on a 64-bit version of Windows the latest version of SQL Server Data Tools installs it in this location:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe
- Place a copy of the deployment contributor assemblies into the same directory as SqlPackage.exe (an example command is shown after these steps).
- Follow the instructions in the previous section of this document to configure automatic publishing after a build.
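For example, assuming the contributor assembly is named MyDeploymentContributor.dll and is available on a file share (both the assembly name and the share path below are hypothetical placeholders), it could be copied into place with a command like the following. Note that writing to the Program Files directory requires administrative permissions on the build server:

copy /y "\\YourFileShare\Contributors\MyDeploymentContributor.dll" "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\"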
Execute SqlPackage.exe from a Temporary Directory
Suppose that you have a solution that contains both a SQL Server Database Project and a separate project that contains a deployment contributor, and you want Team Foundation Build to compile both, and then to publish the database project using the freshly-built deployment contributor. In this case it is probably not possible for Team Foundation Build to copy the deployment contributor into the directory where deployment contributors normally reside, because most organizations do not allow Team Foundation Build to execute under an account with Administrator permissions.
A workaround for this scenario is to copy both the deployment contributor and the SqlPackage.exe code into a temporary directory, and then have Team Foundation Build call that copy of SqlPackage.exe. This works because the code that performs SQL Server Database Project deployment loads extensions from its own directory – the directory in which it is executing.
The steps below will walk you through creating a batch file to execute SqlPackage.exe from a temporary directory.
- Verify that SqlPackage.exe has been installed on the build server. SqlPackage.exe is a command-line utility that is included with SQL Server Data Tools. By default, on a 64-bit version of Windows the latest version of SQL Server Data Tools installs it in this location:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\SqlPackage.exe
- Find the file name(s) for the .dacpac file(s) output by your SQL Server Database Project(s). You will use the name(s) in later steps. By default, the output name matches the project name.
- In Solution Explorer, right-click on your SQL Server Database Project and click Properties in the context menu.
- In the Properties tab, click on the Build tab. The field Build output file name contains the string you will need.
- Find the name(s) of the deployment contributor assemblies.
- Open Notepad (Keyboard: Win+r, notepad, Enter)
- Enter the following text:
md %TF_BUILD_BUILDDIRECTORY%\%TF_BUILD_BUILDNUMBER%\DEPLOY
xcopy /s /h /r /y /i %TF_BUILD_BINARIESDIRECTORY% %TF_BUILD_BUILDDIRECTORY%\%TF_BUILD_BUILDNUMBER%\DEPLOY
xcopy /s /h /r /y /i "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120" %TF_BUILD_BUILDDIRECTORY%\%TF_BUILD_BUILDNUMBER%\DEPLOY
%TF_BUILD_BUILDDIRECTORY%\%TF_BUILD_BUILDNUMBER%\DEPLOY\sqlpackage.exe /a:Script /pr:%TF_BUILD_BINARIESDIRECTORY%\your_file_name.publish.xml /sf:%TF_BUILD_BINARIESDIRECTORY%\your_dacpac_name.dacpac /OutputPath:%TF_BUILD_DROPLOCATION%\DeploymentScript.sql /p:AdditionalDeploymentContributors=your_contributor_id /p:AdditionalDeploymentContributorArguments=your_contributor_arg=value
%TF_BUILD_BUILDDIRECTORY%\%TF_BUILD_BUILDNUMBER%\DEPLOY\sqlpackage.exe /a:Publish /pr:%TF_BUILD_BINARIESDIRECTORY%\your_file_name.publish.xml /sf:%TF_BUILD_BINARIESDIRECTORY%\your_dacpac_name.dacpac /p:AdditionalDeploymentContributors=your_contributor_id /p:AdditionalDeploymentContributorArguments=your_contributor_arg=value
rmdir /S /Q %TF_BUILD_BUILDDIRECTORY%\%TF_BUILD_BUILDNUMBER%
- Edit the text, substituting the correct path to the directory containing SqlPackage.exe in line 3, and the name(s) of your Publish Profile and dacpac files in the subsequent lines. If you have multiple SQL Server Database Projects or target databases, duplicate the lines that execute SqlPackage.exe for each additional database project or target database.
- If you have set the Output location field to PerProject, then you will need to insert the name of the Solution file or project being built into the paths, like so:
/pr:%TF_BUILD_BINARIESDIRECTORY%\YourProject\your_file_name.publish.xml /sf:%TF_BUILD_BINARIESDIRECTORY%\YourProject\your_dacpac_name.dacpac
- Click File, then click Save. Name the file your_name_here.cmd and make note of its location.
- In Solution Explorer, right-click on your SQL Server Database Project and click on Add and then Existing Item....
- In the Add Existing Item dialog, navigate to and select the batch file you created (your_name_here.cmd).
- Note: If you have multiple SQL Server Database Projects in a single solution, it is not necessary to add this batch file to each project – only one copy is necessary. Also, it is not required that the batch file be added to a SQL Server Database Project. It’s sufficient to simply add the batch file somewhere under source control, provided that it is in a folder that is part of the set of Working folders defined on the Source Settings tab of the Team Foundation Build definition.
- Your batch file should now appear in Solution Explorer as a node under the SQL Server Database Project. Right-click on the batch file and click on Properties in the context menu.
- In the Properties grid, change the Copy to Output Directory value to Copy always. This will place the batch file into the output directory, making it easy to reference as an argument. Note: If you do not want the batch file included in the Output Directory, you can skip this step and leave the Copy to Output Directory value set to Do not copy. In this case, in subsequent steps you will need to use the environment variable TF_BUILD_SOURCESDIRECTORY to specify the path to the batch file in the source code (an example appears after this list). Instructions regarding the use of this environment variable can be found at http://msdn.microsoft.com/en-us/library/hh850448.aspx.
- Follow the instructions at http://msdn.microsoft.com/en-us/library/ms181407.aspx to check in the new file and the modified SQL Server Database Project file.
- In the Team Explorer – Builds page (Keyboard: Ctrl+0, B), find your build in the list of All Build Definitions.
- Right-click on your build and then click on Edit Build Definition… in the context-menu.
- Click on the Process tab and find the Build process parameters grid.
- Expand the Advanced node to reveal the Post-build script path field.
- In the Post-build script path field, enter:
$(TF_BUILD_BINARIESDIRECTORY)\your_name_here.cmd
Note: If you have set the Output location field to PerProject, then you will need to insert the name of the Solution file or project being built into the path, like so:
$(TF_BUILD_BINARIESDIRECTORY)\YourProject\your_name_here.cmd
- Click File and then click Save to save the build definition.
Note: Additional SqlPackage.exe arguments are available. See documentation here: http://msdn.microsoft.com/en-us/library/hh550080(v=vs.103).aspx
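If you chose to leave Copy to Output Directory set to Do not copy (see the note in the earlier step), the Post-build script path would reference the sources directory instead of the binaries directory. A hypothetical example, where YourSolutionFolder stands in for the source control folder that contains the batch file:

$(TF_BUILD_SOURCESDIRECTORY)\YourSolutionFolder\your_name_here.cmd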
Special Considerations for Visual Studio Online
Microsoft provides a hosted Team Foundation Build service as part of Visual Studio Online. In general, configuration in Visual Studio Online is the same as for an on-premises Team Foundation Server, but the following special considerations apply when using the Visual Studio Online build service:
- Visual Studio Online's hosted build controller has certain restrictions, for example restrictions on execution time and available software, that might make the hosted build controller unsuitable for your needs. See http://www.visualstudio.com/en-us/get-started/hosted-build-controller-vs.aspx for documentation on applicable restrictions.
- The default process template for new builds in Visual Studio Online is from Team Foundation Server 2012. Several scenarios in this walkthrough assume you will use the default template from Team Foundation Server 2013. In Visual Studio Online, you can select the process template TfvcTemplate.12.xaml (in the case of Team Foundation Version Control) or GitTemplate.12.xaml (in the case of Git), which correspond to the default templates from TFS 2013.
- The version of SQL Server Data Tools installed on the Visual Studio Online hosted build agent might not match the version used in your development environment. New releases of SSDT are not immediately installed on the Visual Studio Online hosted build agents, but are rather installed as part of the regular update cadence for Visual Studio Online. While new versions of SSDT do maintain backwards compatibility, you may be unable to use the new features of SSDT until the new version is installed on the hosted build agent.
- At the time of publishing this blog post in late July 2014, Visual Studio Online's hosted build agents do not have the latest version of SSDT, but availability of the latest version of SSDT is targeted for August 2014. The version of SSDT currently installed on Visual Studio Online's hosted build agents does not support SQL Server 2014 as a target platform. Also, until the latest version of SSDT is installed on the hosted build agents, the path to SqlPackage.exe on the Visual Studio Online hosted build agents will be different than the path specified in the other sections of this post (see the example after this list for the corresponding batch file adjustment). It is:
C:\Program Files\Microsoft SQL Server\110\DAC\bin
- Database deployment to LocalDB using Integrated Authentication is supported on Visual Studio Online. Database deployment to other SQL Server instances, whether performed as a post-build activity or as part of a database unit test, requires the use of SQL Authentication when running in Visual Studio Online. No built-in mechanism is available in Visual Studio Online by which you can provide the SQL Authentication credentials other than by storing them as part of the connection string. Microsoft recommends against storing the password to a production database in your code.
- Visual Studio Online allows you to connect an on-premises build controller to the hosted TFS instance, so that you can control the build environment. See http://msdn.microsoft.com/library/ee330987#hosted for information about using an on-premises build controller.
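As a sketch of the adjustment mentioned above: if you use the temporary-directory batch file from the earlier section on the hosted build agents, the xcopy line that copies the SqlPackage.exe directory would currently need to reference the hosted agent path instead, for example:

xcopy /s /h /r /y /i "C:\Program Files\Microsoft SQL Server\110\DAC\bin" %TF_BUILD_BUILDDIRECTORY%\%TF_BUILD_BUILDNUMBER%\DEPLOY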
What’s new: July 2014
July has been a busy month for mobile apps and admins. OneNote continues to receive updates to its Mac OS X, iPhone, iPad, and Android versions. Office for iPad was enhanced with many top customer-requested features. Admins can now have 10x the number of Public Folders and new communication tools to stay informed about their Office 365 service. Power BI, Power Map, and Power Query were also updated this month. Leave us a comment to let us know what your favorite new feature is. If you missed last month’s updates, see What’s New: June.
Office 365 Personal, Office 365 Home, and Office 365 University updates:
Clip the web with me@onenote.com – Send a link to me@onenote.com and get a screenshot of the site added to your notebook.
Mod and OneNote sync notes on paper to the cloud – Now you can sync the handwritten notes that you take in Mod with OneNote.
Let’s get cooking! Collecting recipes in OneNote just got better – When you use the OneNote web clipper or email clipper (me@onenote.com) to clip a recipe, we’ll simplify the page down to just the steps, ingredients, and key pieces of information you need most, giving your recipes a simpler, cleaner format in OneNote.
OneNote available for Kindle Fire and Fire phone – OneNote is available at the Amazon Appstore for Android, delivering on our promise to bring OneNote to every device that matters to you.
OneNote for Mac and iOS get dressed for business and school – This update added four major features across Mac OS X, iPhone, and iPad. Updates include: access to your work or school notebooks stored on OneDrive for Business or SharePoint Online on Office 365 from your Mac; the ability to open and insert files, including PDF files, into your notebook pages; the ability to view your password-protected sections; and improved organization, content capture, and sharing of notes.
Office for iPad: now with Presenter View, Pivot Table interaction, Export to PDF, and more top-requested features – This major update adds many customer-requested features to Office for iPad; read on for the full list.
Office 365 for business updates*:
Document collaboration made easy – Now from Outlook Web App you can open and edit email attachments using Office Online. A new side-by-side view keeps the email thread on the right so you can also craft a response to the conversation, make edits to the document, and send your edited version, all without leaving Outlook Web App.
10x increase in Public Folder limits – We are raising the folder count limit to 100,000 folders. This is a 10x increase over the prior limit as defined in the Exchange service description limits. This enables migration of larger existing public folder hierarchies and increased use of the feature in Office 365.
OneNote for Mac and iOS get dressed for business and school – This update added four major features across Mac OS X, iPhone, and iPad. Updates include: access to your work or school notebooks stored on OneDrive for Business or SharePoint Online on Office 365 from your Mac; the ability to open and insert files, including PDF files, into your notebook pages; the ability to view your password-protected sections; and improved organization, content capture, and sharing of notes.
New Office 365 admin tools – We’re providing Office 365 admins with more ways to stay informed about Office 365 service communications. You can now receive Office 365 service communications in Microsoft System Center by importing the newly available Office 365 Management Pack. And you can integrate Office 365 service communications into your monitoring systems and tools by interfacing with the Office 365 Service Communications API.
Office for iPad: now with Presenter View, Pivot Table interaction, Export to PDF, and more top-requested features – This major update adds many customer-requested features to Office for iPad; read on for the full list.
Important update available for Exchange Server 2013 hybrid deployments – An important update is now available to resolve issues customers are currently experiencing when using the Hybrid Configuration Wizard (HCW) to create a new or manage an existing hybrid deployment with Microsoft Exchange Server 2013.
Power Map July update – We’re adding a new feature called “Automatic Refresh.” This feature makes the Repeat Tour feature we added back in May even better. During tour playback, Automatic Refresh will pull in the latest updates to your data every 10 minutes. When used together, Repeat Tour and Automatic Refresh let you use Power Map to monitor events in near real time.
New in Power BI: Cloud modeling for Q&A – The new cloud modeling enables you to define new terms and resolve any ambiguity in your questions.
Six new updates in Power Query – Here is a summary of the new features included in this release: more flexible Load Options for your queries; Query Groups; an improved error debugging experience; additional Query Editor transformations, including the ability to replace errors within a column and a UX for defining math operations based on a single column; an Options dialog with Restore Defaults and tips for cache management; and update notifications (at most 3 notifications per update, and no more than once per day).
Office 365 developer updates:
Office 365 Development Patterns and Practices launched – The Office 365 Development Patterns & Practices team shipped the new open source repository to GitHub.com/OfficeDev.
Please note that some of the updates may take time to show up in your Office 365 account, because they’re being rolled out to customers worldwide.
– Andy O’Donald @andyodonald
——————————————————————————–
*Not all updates apply to every Office 365 plan; please check the individual post for specifics.