Channel: TechNet Technology News

#AzureAD Domain Services is now GA! Lift and shift to the cloud just got WAY easier!


Howdy folks,

The news today is just SOOOO cool. Azure AD Domain Services is now Generally Available!

If you follow the blog, you already know how this unique capability makes moving legacy applications into the cloud WAY easier. But you'll probably be surprised to learn that more than 5700 customers have already turned on Azure AD Domain Services in their tenant and are using it every day.

To give you a quick tour of the service and the improvements we've made during the public preview, I've asked Mahesh Unnikrishnan, the PM who leads this effort, to come back and do another guest blog post. You'll find it below.

I hope you'll find this service valuable, and as always, we would love to receive any feedback or suggestions you have!

Best Regards,

Alex Simons (Twitter: @Alex_A_Simons)

Director of Program Management

Microsoft Identity Division

————–

Hi there!

I'm Mahesh Unnikrishnan, a Program Manager in the Identity division at Microsoft.

Late last year, we announced the public preview of Azure AD Domain Services. Since then, we've been working closely with customers to make sure they can get up and running, and to learn from their feedback and suggestions. Then in May, we announced several exciting new features and improvements to the service, including secure LDAP support and support for configuring DNS and custom OUs.

Since May, we have continued to evolve the service and refine it based on your feedback.

So today, I'm thrilled to announce that Azure AD Domain Services is now Generally Available (GA)!

The Preview program was incredibly successful, with over 5700 Azure AD tenants testing the service and sharing their feedback. We'd like to thank all these customers for their time and for helping us evolve the service. Some of the features we've added this year based on your feedback include:

  • Secure LDAP access to your managed domain, including over the internet (even from Amazon Web Services!)
  • The ability for AAD DC Administrators to configure DNS on their managed domain.
  • The ability for AAD DC Administrators to create custom organizational units (OUs).

A quick review – What is Azure AD Domain Services?

Azure AD Domain Services provides managed domain services such as domain join, group policy, LDAP, and Kerberos/NTLM authentication that are fully compatible with Windows Server Active Directory. You can consume these domain services without needing to deploy, manage, and patch domain controllers in the cloud. Azure AD Domain Services integrates with your existing Azure AD tenant, so users can sign in using their corporate credentials. Additionally, you can use existing groups and user accounts to secure access to resources, making the ‘lift-and-shift’ of on-premises resources to Azure Infrastructure Services way easier than in the past.
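
As a quick illustration of the domain join scenario, here is a minimal sketch; the domain name and credentials are placeholders, and the VM must be on a virtual network where the managed domain is available:

# Join a Windows VM to the managed domain, exactly as you would against Windows Server AD.
# 'contoso100.com' is a hypothetical managed domain name; use the DNS name you selected.
$credential = Get-Credential -Message "Enter credentials of a user account in the managed domain"
Add-Computer -DomainName "contoso100.com" -Credential $credential -Restart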

Azure AD Domain Services functionality works seamlessly regardless of whether your Azure AD tenant is cloud-only or synced with your on-premises Active Directory. For synced tenants, you do not need to deploy any additional software apart from your deployment of Azure AD Connect.


Setting up Azure AD Domain Services is simple: toggle the service to enabled, pick a DNS domain name for your managed domain, and select the virtual network where you'd like the managed domain to be available.

To get started with Azure AD Domain Services, click here.

New since our initial preview

We have added quite a few enhancements and features since the service first went into preview late last year.

  • Support for secure LDAP: You can access your managed domain using LDAPS (secure LDAP), including over the internet (see the sketch after this list).
  • Custom OU support: Users in the AAD DC Administrators delegated group can create and administer a custom organizational unit on your managed domain.
  • Configure managed DNS for your domain: Users in the AAD DC Administrators delegated group can administer DNS on your managed domain using Windows Server DNS administration tools.
  • Domain join for Linux: We've worked with Red Hat to document how you can join a Red Hat Linux VM to your managed domain.
  • New and improved synchronization with your Azure AD tenant: We have re-designed the synchronization between your Azure AD tenant and your managed domain. For existing domains, this new, improved synchronization has been rolled out automatically in a phased manner.
  • The password-does-not-expire attribute is honored: Some accounts, such as service accounts, have the password-does-not-expire attribute set on them. Previously, the password policy was enforced for these accounts in managed domains, resulting in their passwords expiring. Passwords for such accounts will no longer expire.
  • Incorrect samAccountName for groups created in Azure AD: The samAccountName attribute for groups created in Azure AD was not being set correctly in the managed domain; it was being set to a GUID instead of a valid samAccountName. This has been fixed.
  • SID history sync: The on-premises primary user and group SIDs will now be synchronized to your managed domain and set as the SidHistory attribute on corresponding users and groups. This cool feature helps you lift-and-shift your workloads to Azure without having to worry about re-ACLing them.
  • Virtual network peering: The Azure networking team recently announced GA for virtual network peering. This awesome feature makes it easy to connect Domain Services to other virtual networks. You can connect a classic virtual network in which your managed domain is available to workloads deployed in resource manager virtual networks using network peering.
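
To illustrate the secure LDAP item above, here is a minimal sketch that binds to a managed domain over LDAPS using the .NET System.DirectoryServices.Protocols classes from PowerShell. The endpoint name, port, and credentials are placeholders, and secure LDAP must already be enabled and exposed on the managed domain:

Add-Type -AssemblyName System.DirectoryServices.Protocols

# Hypothetical external DNS name of the managed domain's secure LDAP endpoint, standard LDAPS port 636.
$identity = New-Object System.DirectoryServices.Protocols.LdapDirectoryIdentifier("ldaps.contoso100.com", 636)
$connection = New-Object System.DirectoryServices.Protocols.LdapConnection($identity)
$connection.SessionOptions.SecureSocketLayer = $true
$connection.SessionOptions.ProtocolVersion = 3
$connection.AuthType = [System.DirectoryServices.Protocols.AuthType]::Basic

# Bind with a user account from the managed domain (placeholder credentials).
$connection.Bind((New-Object System.Net.NetworkCredential("admin@contoso100.com", "placeholder-password")))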

Planning guide

To help you get started, we have created a planning guide to help you design and plan your deployment of Azure AD Domain Services.

What can I do with Azure AD Domain Services?

You can use Azure AD Domain Services to lift-and-shift many on-premises applications to Azure. For more information, see this list of canonical deployment scenarios and use-cases.

Get Started!

It's easy to get started with Azure AD Domain Services. Here are a few pointers to information that helps you kick the tires.

A sneak peek – what's on the way?

We are already working on enhancements to the service based on your feedback. Some of the upcoming features and updates include:

  • Support for Azure Resource Manager including the ability to enable the service in Resource Manager based virtual networks.
  • A new management UI experience in the modern Azure portal (portal.azure.com).

We’re thrilled about the opportunity to evolve Azure AD Domain Services based on your feedback. We’d love for you to try out the service, deploy your workloads in production and share your feedback with us.

Thanks,

Mahesh Unnikrishnan

Principal Program Manager

Microsoft Identity Division


Maven and Gradle build tasks support powerful code analysis tools


Over the last few months we have been steadily building up the capabilities of the Maven and Gradle build tasks to offer insights into code quality through popular code analysis tools. We are pleased to announce additional much-requested features that we are bringing to these tasks, which will make it easier to understand and control technical debt.

Maven Code Analysis fields

Continuous Integration builds: SonarQube integration feature parity with MSBuild

Back in July, our Managing Technical Debt planning update for 2016 Q3 announced a plan to support SonarQube analysis in Java to a level equivalent to our strong integration for MSBuild. This is well underway and nearing completion: both Maven and Gradle can now perform SonarQube analysis when you select a checkbox in the build definition. This will create a build summary of the issues that are detected.

We also added the option to break a build when SonarQube quality gates fail. This gives instant feedback and helps you stop the technical debt leak. Finally, there is a new build summary that provides detailed information from SonarQube on why the quality gate failed, so that it is easy to identify problems. You can then drill down and get even more data by navigating to the SonarQube server through the link provided.

SonarQube Build Breaker

Broader support for Java-based static analysis tools

We understand that in the past we lacked integration features for some standalone code analysis tools that are widely used. We have heard your feedback and have added support for three such tools: PMD, Checkstyle, and FindBugs. You can enable them simply and quickly through a checkbox in the “Code Analysis” section of your build configuration, and they will run on any agent, whether in the Hosted agent pool or on a dedicated agent of your choice (Windows, Linux, or Mac!).

Code Analysis Report

Towards Full Parity Java/MSBuild: Pull Request with Code Analysis for Java

For some time we have supported showing you code analysis issues directly on pull requests in Visual Studio Team Services for projects using MSBuild. We hope to support this for Maven and Gradle builds too in future.

 

Limitations, Feedback, and Troubleshooting

If you are working on-premises with TFS 2016, FindBugs support for Gradle will not ship at RTM but will be added in Update 1. For users on Visual Studio Team Services, most of these features are already live and waiting for you, with the rest due to roll out as part of Sprint 107 in the next few weeks.

As always, we would love to hear from you. Please raise issues and suggestions on the issues tab of the vsts-tasks repository in GitHub: https://github.com/microsoft/vsts-tasks/issues and add the label “Area: Analysis”.

 

Another big step in Hybrid Cloud – Windows Server 2016 general availability


This post was authored by Mark Jewett, Senior Director Product Marketing, Cloud Platform.

We are pleased to announce that today marks the general availability (GA) of Windows Server 2016 and System Center 2016. Customers can now broadly acquire the server operating system that accelerates innovation and security for both traditional and cloud-native applications.

As a cloud-ready OS, Windows Server 2016 inherently enables hybrid cloud. It was forged in our own Azure datacenters, learning from the rigorous requirements of a global, public cloud, and includes software-defined capabilities that are fundamental to fast-paced cloud innovation. Windows Server 2016 also offers seamless portability across datacenter, private and public cloud environments via virtual machine and new container formats that can be reliably deployed wherever the business requires. We've further enabled portability to Microsoft Azure with the Azure Hybrid Use Benefit that allows Windows Server VMs to be run in Azure at a discounted rate.

This release is just one of many reflecting our deep commitment to hybrid cloud. In our long-held view, hybrid cloud is the reality for all enterprise customers, even those with the most ambitious cloud plans. Some applications should and will move quickly to public cloud, while others face technological and regulatory obstacles. Regardless of where these applications run today or will run in the future, Windows Server 2016 provides a rich and secure platform.

Beyond Windows Server 2016, weve applied this hybrid cloud commitment across our Microsoft portfolio to build comprehensive capabilities across data, identity, management, application, and the infrastructure platform overall.

Hybrid Stack graphic

Figure 1: Hybrid capabilities built-in across Microsoft products and services

Recognizing hybrid cloud as the reality, other vendors from both traditional IT and public cloud are now also talking about hybrid; however, there is a fundamental difference in the way Microsoft approaches the enablement of hybrid cloud. Customers tell us it isn't enough to simply be connected across cloud environments. Yes, great network connectivity and common VM formats are foundational requirements, but consistency is the real key. Consistency means IT professional, developer, and end user experiences don't change based on the location of the resource, and it provides the ultimate flexibility to use the right cloud resources at the right time.

Consistent hybrid cloud enables uniform development: developers can build in a common way, and those applications and services can then be deployed in the right location based on business rules and technical requirements.

Consistent hybrid cloud enables unified dev-ops and management: operators can use a consistent interface to manage and update resources across the hybrid cloud, alleviating the need to learn new environments and letting them instead focus on deriving more insight and proactive action.

Consistent hybrid cloud enables common identity and security: users are protected and managed no matter the location of the application they are using.

Consistent hybrid cloud enables seamless extension of existing applications to the cloud: data can be maintained for longer periods at lower cost, without any application or infrastructure changes required.

We are committed to helping customers in their cloud journey, and the general availability of Windows Server 2016 marks another important milestone in delivering the most complete hybrid cloud solutions for customers of all sizes. Get started now by downloading the Windows Server 2016 Evaluation, and tune in to the Windows Server 2016 webcast on October 13.

Reduced Out of Memory Crashes in Visual Studio “15”


This is the third post in a five-part series covering performance improvements in Visual Studio “15” Preview 5. The previous two posts talked about faster startup and shorter solution load times in Visual Studio “15”.

Visual Studio is chock-full of features that millions of developers rely on to be productive at their work. Supporting these features, with the responsiveness that developers expect, consumes memory. However, in Visual Studio 2015, memory usage grew too large in certain scenarios. This led to adverse effects such as out-of-memory crashes and UI sluggishness. We received feedback from a lot of customers about these problems. In VS “15” we are tackling these issues while not sacrificing the rich functionality and performance of Visual Studio.

While we are optimizing a lot of feature areas in Visual Studio, this post presents the progress in three specific areas – JavaScript and TypeScript language services, symbol loading in the debugger, and Git support in VS. Throughout this post I will compare the following two metrics for each of the measured scenarios, to show the kind of progress we have made:

Peak Virtual Memory: Visual Studio is a 32-bit application, which means the virtual memory consumed can grow up to 4GB. Memory allocations that cause total virtual memory to cross that limit will cause Visual Studio to crash with an “Out of memory” (OOM) error. Peak virtual memory is a measure of how close the process is to the 4GB limit, or in other words, how close the process is to crashing.

Peak Private Working Set: A subset of the virtual memory, containing code that the process executes or data that the process touches, needs to be in physical memory. “Working set” is a metric that measures the size of this physical memory consumption. A portion of the working set, called the “private working set”, is memory that belongs to a given process and that process alone. Since such memory is not shared across processes, its cost on the system is relatively higher. Measurements in this post report the peak private working set of Visual Studio (devenv.exe) and relevant satellite processes.
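
If you want to watch these two metrics yourself while reproducing a scenario, a rough way to sample them from PowerShell is shown below. This is only a sketch: it samples current values rather than true peaks, and the 'devenv' counter instance name is an assumption that holds only when a single instance of Visual Studio is running.

# Current virtual memory size of Visual Studio (devenv.exe), in GB
$devenv = Get-Process -Name devenv
[Math]::Round($devenv.VirtualMemorySize64 / 1GB, 2)

# Current private working set, in MB, via the per-process performance counter
$counter = Get-Counter '\Process(devenv)\Working Set - Private'
[Math]::Round($counter.CounterSamples[0].CookedValue / 1MB, 0)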

JavaScript language service

Over a third of Visual Studio developers write JavaScript (JS) on a regular basis, making the JS language service a component that is loaded in a significant number of Visual Studio sessions. The JS language service provides features such as IntelliSense and code navigation that make JS editing a productive experience.

To support such productivity features and to ensure they are responsive, the language service consumes a non-trivial amount of memory. The memory usage depends on the shape of the solution, with project count, file count, and file sizes being key parameters. Moreover, the JS language service is often loaded in VS along with another language service such as C#, which adds to the memory pressure in the process. As such, improving the memory footprint of the JS language service is crucial to reducing the number of OOM crashes in VS.

In VS “15”, we wanted to ensure Visual Studio reliability is not adversely impacted by memory consumption regardless of the size and shape of the JS code. To achieve this goal without sacrificing the quality of the JavaScript editing experience, in VS “15” Preview 5 we have moved the entire JS language service to a satellite Node.js process that communicates back to Visual Studio. We have also merged the JavaScript and TypeScript language services, which means we achieve net memory reduction in sessions where both language services are loaded.

To measure the memory impact, we compared Visual Studio 2015 Update 3 with VS “15” Preview 5 in this scenario:

  • Open the WebSpaDurandal solution. This is an ASP.NET sample, which we found to represent the 95th percentile in terms of the JS code size we see opened in VS.
  • Create and enable auto syncing of _references.js
  • Open 10 JS files
  • Make edits, trigger completions, create/delete files, run the format tool

Here are the results:

Chart 1: Memory usage by the JavaScript language service

Peak virtual memory usage within Visual Studio is reduced by 33%, which will provide substantial relief to JS developers experiencing OOM crashes today. The overall peak private working set, which in Preview 5 represents the sum of the Visual Studio process and our satellite node process, is comparable to that of Visual Studio 2015.

Symbol loading in the debugger

Symbolic information is essential for productive debugging. Most modern Microsoft compilers for Windows store symbolic information in a PDB file. A PDB contains a lot of information about the code it represents, such as function names, their offsets within the executable binary, type information for classes and structs defined in the executable, source file names, etc. When the Visual Studio debugger displays a callstack, evaluates a variable or an expression, etc. it loads the corresponding PDB and reads relevant parts of it.

Prior to Visual Studio 2012, the performance of evaluating types with complex natvis views was poor. This was because a lot of type information would be fetched on demand from a PDB, which would result in random IOs to the PDB file on disk. On most rotational drives this would perform poorly.

In Visual Studio 2012, a feature was added to C++ debugging that would pre-fetch large amounts of symbol data from PDBs early in a debugging session. This provided significant performance improvements when evaluating types, by eliminating the random IOs.

Unfortunately, this optimization erred too much on the side of pre-fetching symbol data. In certain cases, it resulted in a lot more symbol data being read than was necessary. For instance, while displaying a callstack, symbol data from all modules on the stack would get pre-fetched, even though that data was not needed to evaluate the types in the Locals or Watch windows. In large projects having many modules with symbol data available, this caused significant amounts of memory to be used during every debug session.

In VS “15” Preview 5, we have taken a step towards reducing memory consumed by symbol information, while maintaining the performance benefit of pre-fetching. We now enable pre-fetching only on modules that are required for evaluating and displaying a variable or expression.

We measured the memory impact using this scenario:

  • Load the Unreal Engine solution, UE4.sln
  • Start Unreal Engine Editor
  • Attach VS debugger to Unreal Engine process
  • Put a breakpoint on E:\UEngine\Engine\Source\Runtime\Core\Public\Delegates\DelegateInstancesImpl_Variadics.inl, line 640
  • Wait until the breakpoint is hit

Here are the results:

Chart 2: Memory usage when VS Debugger is attached to Unreal Engine process

VS 2015 crashes due to OOM in this scenario. VS “15” Preview 5 consumes 3GB of virtual memory and 1.8GB of private working set. Clearly this is an improvement over the previous release, but these are not stellar memory numbers by any means. We will continue to drive down memory usage in native debugging scenarios during the rest of VS “15” development.

Git support in Visual Studio

When we introduced Git support in Visual Studio, we utilized a library called libgit2. For various operations, libgit2 maps the entire git index file into memory. The size of the index file is proportional to the size of the repo. This means that for large repos, Git operations can result in significant virtual memory spikes. If VS was already under virtual memory pressure, these spikes can cause OOM crashes.

In VS “15” Preview 5, we no longer use libgit2 and instead call git.exe, thus moving the virtual memory spike out of VS process. We moved to using git.exe not only to reduce memory usage within VS, but also because it allows us to increase functionality and build features more easily.
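
As a rough illustration of the approach (not the actual implementation inside Visual Studio), shelling out to git.exe keeps the index mapping inside a short-lived child process, and only the small text output lands in the calling process:

# Hypothetical sketch: enumerate pending changes by invoking git.exe rather than
# mapping the index into the host process through libgit2.
$repoPath = "C:\src\chromium"    # placeholder path to a local clone

$pendingChanges = & git -C $repoPath status --porcelain
"{0} pending change(s)" -f ($pendingChanges | Measure-Object).Count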

To measure the incremental memory impact of a Git operation, we compared Visual Studio 2015 Update 3, with VS “15” Preview 5 in this scenario:

  • Open Chromium repo in Team Explorer
  • Go to “Changes” panel to view pending changes
  • Hit F5 to refresh

Here are the results:

Chart 3: Incremental memory usage when “Changes” panel in Team Explorer is refreshed

In VS 2015, the virtual memory spikes by approximately 300MB for the duration of the refresh operation. In VS “15”, we see no measurable virtual memory increase. The incremental private working set increase in VS 2015 is 79MB, while in VS “15” it is 72MB and entirely from git.exe.

Conclusion

In VS “15” we are working hard at reducing memory usage in Visual Studio. In this post, I presented the progress made in three feature areas. We still have a lot of work ahead of us and are far from being done.

There are several ways you can help us on this journey:

  • First, we monitor telemetry from all our releases, including pre-releases. Please download and use VS “15” Preview 5. The more usage we see from external sources in day-to-day scenarios, the better the signal we get, and that will immensely help us.
  • Secondly, report high memory (or any other quality) issues to us using the Report-a-problem tool. The most actionable reports are ones that help us reproduce the issues at our end by providing sample or real solutions that demonstrate the issue. I realize that is not always an option, so the next best are reports that come with a recording of the issue attached (the Report-a-problem tool lets you do this easily) and describe the issue in as much detail as possible.
Ashok Kamath, Principal Software Engineering Manager, Visual Studio

Ashok leads the performance and reliability team at Visual Studio. He previously worked in the .NET Common Language Runtime team.

Nano Server in the Azure Gallery and VM Agent Support


As part of Windows Server 2016 General Availability, there are new Nano Server images in the Azure Gallery. In addition to the new image, there is also a helper script attached to this blog post that can be used to customize the image when deploying in Azure.

Nano Server now supports running a limited VM Agent when deploying in Azure, as well as running Extensions. Due to the small footprint of Nano Server, most of the Windows extensions do not run on Nano Server and are not required. For Nano Server, we provide two special-case extensions:

  1. VMAccess: Used to change the Administrator password through the Azure interface (see the sketch after this list)
  2. CustomScript: Used for running arbitrary customization scripts
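
As a hedged example of the VMAccess extension in the classic deployment model, the following sketch resets the password of a local account on an existing Nano Server VM (the VM name, service name, account name, and password are placeholders):

# Placeholder names; resets the password for the specified local account via the VMAccess extension
$vm = Get-AzureVM -Name "my-nano-vm" -ServiceName "my-service-name"
$vm = Set-AzureVMAccessExtension -VM $vm -UserName "Administrator" -Password "N3wP@ssw0rd!"
$vm | Update-AzureVM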

We are investigating adding additional extensions for Nano Server, so please use the comments section or the Nano Server UserVoice site at https://windowsserver.uservoice.com/forums/295068-nano-server to provide feedback on additional extensions you’d like to see.

The following examples show how to configure the CustomScript extension, depending on which portal you are using. Note that this blog post builds on our previous post describing how to create Nano Server VMs in Azure IaaS at https://blogs.technet.microsoft.com/nanoserver/2016/05/27/nano-server-tp5-iaas-image-in-azure-updated/.

 

Azure Service Manager

If the container already exists, you can skip its creation. The script file only needs to be uploaded once to Azure Storage.

# Upload the script to Azure Storage (a one-time step; skip container creation if it already exists)
New-AzureStorageContainer -Name "my-script-container"
Set-AzureStorageBlobContent -File .\RunMe.ps1 -Container "my-script-container"

# Attach the CustomScript extension to the VM and push the updated configuration
$vm = Get-AzureVM -Name "my-vm" -ServiceName "my-service-name"
$vm = Set-AzureVMCustomScriptExtension -VM $vm -ContainerName "my-script-container" -FileName "RunMe.ps1" -Run "RunMe.ps1" -Argument "MyArg1 MyArg2"
$vm | Update-AzureVM

 

Azure Resource Manager

The module NanoServerAzureHelper.psm1 in the attached zip file provides a helper function: Set-HelperAzureRmCustomScript. This helper function makes the configuration easier (similar to Set-AzureVMCustomScriptExtension) and also automates creation of storage accounts and containers and uploading of the scripts. Here are some examples of its use.

  • To run a plain Windows command, without any scripts to download into the extension:
Set-HelperAzureRmCustomScript -ResourceGroupName "testrg001" -VMName "test-vm001" -Location "West US" -NoDownload -FullCommand "echo with No Download" -Verbose
  • Set-HelperAzureRmCustomScript can create the storage account and container, if they didn’t already exist. To upload the PowerShell script “RunMe.ps1” and an input data file from C:\MyScripts and run the script:
Set-HelperAzureRmCustomScript -ResourceGroupName "testrg001" -VMName "test-vm001" -Location "West US" -Account "my-storage" -NewAccount -Container "my-script-container" -NewContainer -Script "RunMe.ps1", "input.txt" -UploadPath "C:\MyScripts" -Arguments "With new accounts < input.txt" -Verbose
  • To run a PowerShell script downloaded from a public resource:
Set-HelperAzureRmCustomScript -ResourceGroupName "testrg001" -VMName "test-vm001" -Location "West US" -ScriptUri "https://raw.githubusercontent.com/My/Dir/master/MyScript.ps1" -Arguments "With URI" -Verbose
  • To run a PowerShell script that has been already uploaded to Azure storage:
Set-HelperAzureRmCustomScript -ResourceGroupName "testrg001" -VMName "test-vm001" -Location "West US" -Account "my-storage" -Container "my-script-container" -Script "RunMe.ps1" -Arguments "Without upload" -Verbose

 

Nano Server Azure Helper

Here is the Nano Server Azure Helper script module:

NanoServerAzureHelper_20160927

 

Configuration Manager: a progress update on the current branch and a new servicing branch


One hundred and one years ago today, the Ford Motor Company manufactured its one millionth Model T automobile. Thanks to our customers, we also have a reason to celebrate today, as we continue to see incredible adoption of our own model, the current branch of Configuration Manager. Our current branch model was designed to provide our customers with ongoing product improvements, faster updates, and timely support for new Windows releases.

Since the release of the current branch in December of 2015, over 21,000 organizations managing more than 43 million devices have transformed client management for their organizations by upgrading to Configuration Manager 1511 or later, allowing them to keep their management tools up to date at an unprecedented rate and scale. With three current branch releases to date, the move to later versions is accelerating: more than half of these organizations have already updated to the latest version 1606. In the wake of this strong customer adoption, we are including the latest version of Configuration Manager in newly released System Center 2016 for server management, and at the same time we are introducing a new branch type.

In short, System Center Configuration Manager (version 1606) is now included with System Center 2016. Our customers can now upgrade Configuration Manager 2012/R2 directly to version 1606 of the current branch and start taking advantage of new management features, faster and easier updates, support for new Windows releases, and more. For the overwhelming majority of our customers, the current branch of Configuration Manager will be their preferred installation option, and we have seen this further validated by the upgrade momentum we noted above.

Today, we are also making available the Long-Term Servicing Branch (LTSB) of Configuration Manager. Up until this point, if Software Assurance or equivalent subscription rights (most commonly from Intune or EMS) expired, customers would, per the product terms, have to move back to the most recent release they owned perpetual rights to, e.g., System Center 2012 R2 Configuration Manager. The LTSB of Configuration Manager now delivers an alternative option that will be supported on a fixed 10-year lifecycle, although it is important to understand the limitations inherent in a long-term serviced management product versus the easily updatable current branch model our customers have been rapidly moving to.

While the LTSB is derived from the current branch of Configuration Manager (version 1606), it is scaled back and reduced in functionality to permit the extended support model. LTSB of Configuration Manager will not receive new functionality or support for new Windows 10 and Windows Server releases. It will continue to receive security updates only. By design, LTSB of Configuration Manager is intended to be fixed in functionality and very infrequently updated, so any features or components that require continuous updating or are tied to a cloud service have been removed. These removed features include:

  • Support for Windows 10 Current Branch (CB) and Current Branch for Business (CBB)
  • Support for the future releases of Windows 10 LTSB and Windows Server
  • Windows 10 Servicing Dashboard and Servicing Plans
  • The ability to add a Microsoft Intune Subscription, which prevents the use of Hybrid MDM and on-premises MDM
  • Asset Intelligence
  • Cloud-based Distribution Point
  • Support for Exchange Online as an Exchange Connector
  • Any pre-release features available in the current branch of Configuration Manager

For more details about the LTSB release and to get answers to your most common questions, please check out the following pages:

Based on the strong adoption of the current branch of Configuration Manager, positive feedback from our customers, and the future of Windows and the industry in general shifting to more frequent and smaller updates, we highly recommend our customers continue upgrading to the current branch of Configuration Manager. We expect that for the overwhelming majority of you this is the best model and approach of delivering an up to date management offering.

Configuration Manager (version 1606) can be downloaded from Volume Licensing Service Center (search for System Center Config). It can also be downloaded from Microsoft Evaluation Center and MSDN. The setup process of Configuration Manager (version 1606) allows you to choose to install either the current branch or LTSB.

For assistance with the upgrade process please post your questions in the Site and Client Deployment forum. To provide feedback or report any issues with the functionality included in this release, please use Connect. If there's a new feature or enhancement you want us to consider including in future updates, please use the Configuration Manager UserVoice site.

Additional resources:

Managing the software-defined datacenter with System Center 2016


This post was authored by Bala Rajagopalan, Principal Group Program Manager, System Center.

Today, we're excited to announce the general availability of System Center 2016. System Center 2016 makes it easy for you to deploy, configure, and manage your virtualized, software-defined datacenter and hybrid cloud infrastructure. The latest release of System Center offers an array of new capabilities that amplify your ability to meet the most demanding business requirements, providing support for everything from provisioning the physical and virtual infrastructure to IT process and service management.

System Center 2016 is designed to provide:

  • Faster time to value with simple installation, in-place upgrades, and automated workflows.
  • Efficient operations with improvements in performance and usability of all System Center components.
  • Greater heterogeneity and cloud management with broader support for LAMP stack and VMware, including monitoring resources and services in Azure and Amazon Web Services.

System Center 2016 summary highlights


 


Click to download the What's New in System Center 2016 white paper

New options in licensing

As announced in December 2015, System Center 2016 will move from processor-based licensing to core-based licensing for both Datacenter and Standard Editions and will continue to be available through all existing licensing channels. In addition, on October 1, 2016, we announced exciting new ways to purchase System Center in conjunction with Operations Management Suite, bringing together the power of System Center for management of the on-premises datacenter with the speed and flexibility of cloud-based management in Operations Management Suite. System Center and Operations Management Suite give you powerful hybrid cloud management solutions for your entire IT environment, both on-premises and in the cloud. Now you have the option to purchase both System Center and Operations Management Suite together, and take advantage of significant savings.

  • To get the full benefit of cloud management, and access all the Operations Management Suite services at a convenient price, the Operations Management Suite Add-on for System Center enables you to simply attach Operations Management Suite services to your existing System Center license.
  • Customers nearing the end of their renewal cycle can take advantage of a special price and convert their System Center licenses to an Operations Management Suite Subscription. Once converted, you do not need to pay for System Center separately. The Operations Management Suite subscription includes rights to System Center.

Learn more about the exciting new options and more, at the Operations Management Suite pricing page.

Check out what's new!

Click to watch What’s New with System Center 2016

For a full listing of the new capabilities in this latest release of our core management platform, you can visit the System Center 2016 site and read the What's New in System Center 2016 whitepaper. You can also download the System Center 2016 evaluation and see it in action for yourself. Check out the updates to System Center 2016 Configuration Manager, also released today.

We've built in a host of features, applying the learnings from System Center 2012 R2 and your feedback to make sure System Center continues to support you. The unique requirements of the on-premises datacenter are your business, every day, and we encourage you to explore how System Center 2016 can help you move faster and focus on delivering impact. Try out the evaluation today!

Ignite Recap: Remote Desktop Services


What an incredible Microsoft Ignite we had this year!

Thanks to the 800+ of you who came to our breakout sessions and learned about RDS Improvements in Windows Server 2016 and Citrix innovations on our platform, as well as How to Deploy RDS in the Cloud. A huge shout out to our Microsoft MVPs who put together an in-depth independent assessment session of RDS capabilities in Windows Server 2016, and a big vote of thanks to all of you who stopped by our booth to share your amazing experiences and feedback. We had more than 400 enriching conversations as a team at the booth alone.

As we continue to enhance Remote Desktop Services, we will go back to the many conversations we had with you at our sessions, booth, and in focused interactions. Your input will help fuel and inform our efforts to create the best platform for Windows desktop and application virtualization.

For those who could not join us in person, we suggest starting with our MS Mechanics video below to get an overview of our innovations in Windows Server 2016 before diving deep into the on-demand sessions, shared in-line above.


We invite you all to try out the best Windows Server release yet, and see how Remote Desktop Services can help you execute your virtual workspace strategy. Look out for upcoming innovations from our team in this space by following us on Twitter and remaining connected on this blog.

Thanks again and see you next Ignite.


ADMX Version History


Hi Everyone,

 

My name is Kai Ohnesorge and I work for Microsoft as a Premier Field Engineer (PFE) based in Germany. In my job I am confronted with a large number of GPO topics, one being changes in ADMX templates across the various versions of Windows. From time to time, we as PFEs are asked about changes in ADMX templates between different versions of the Windows operating system, but so far the only sources of information were the “Group Policy Settings Reference for Windows and Windows Server” spreadsheets. These contain all Group Policy settings available in the corresponding version of the Windows operating system, but unfortunately there has never been a comprehensive documentation of all changes. A spreadsheet containing all changes in ADMX files shipped with Windows Vista up to the Windows 10 Anniversary Update (not yet including Windows Server 2016) can now be downloaded here. Later on in this article you will find a description of the spreadsheet, but first let me explain why this information might be of importance to you.

 

Originally, my answer regarding changes to ADMX files was always “ADMX files shipped with newer versions of Windows contain additional settings, but no settings are removed and no fundamental changes are made”. Today, I know this is not the entire truth. In 2012 my good friend Mark Empson, a PFE based in the UK, discovered that ADMX files are not always growing between OS versions; many stay the same size, and some of them are actually shrinking! During further investigation he identified settings that were indeed removed from newer versions of ADMX files, which means that if a Central Store has been configured for the ADMX and ADML files in a domain, affected settings might not be manageable after updating the files in the Central Store to the newest version. So if a domain is configured with a Central Store containing the ADMX templates delivered with Windows Server 2008 R2, a Group Policy might contain the following settings:

 

After updating the ADMX files to the versions delivered with Windows Server 2012 R2, the same GPO might be displayed as:

 

As a result, all of these settings are no longer manageable and cannot be changed or removed within the Group Policy Management Console (GPMC). The same situation will occur when a GPO that was originally created with Windows Server 2008 R2 is edited with a GPMC installed on Windows Server 2012 R2 while using the local ADMX files.

To bypass this situation, several workarounds are available once you have identified whether your environment is affected at all. The number of Policy settings that have been removed is currently around 40 across all versions of Windows, including settings added to operating systems via patches but not available in the RTM version of the next release. For example, the setting “Do not reinitialize a pre-existing roamed user profile when it is loaded on a machine for the first time” was added to Windows Server 2012 and Windows Server 2012 R2 via an update (yes, updates do change ADMX files!), so it has been available in the latest version of the Windows Server 2012 ADMX files, but not in the RTM version of Windows Server 2012 R2.

So if your environment is affected, here are two possible workarounds (not in any particular order):

  1. Identify all related settings and remove them from the Group Policy before updating the Central Store, if one is currently configured
  2. Create a management system (server or client) per version of the Windows operating system present in your environment and configure the Group Policy objects for a specific version of Windows from the corresponding management system. If a Central Store is configured in the environment, configure the GPMC on these management systems to bypass the Central Store as described in this article:

An update is available to enable the use of Local ADMX files for Group Policy Editor

https://support.microsoft.com/en-us/kb/2917033
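
For reference, updating the ADMX files in a Central Store is typically just a copy of the newer PolicyDefinitions folder (including the language subfolders) into SYSVOL – which is exactly the point at which removed settings can become unmanageable. A minimal sketch of that step, with the domain name as a placeholder:

# Copy local ADMX/ADML files into the domain Central Store (placeholder domain name).
# Run this only after working through workaround 1 above, if your environment is affected.
Copy-Item -Path "C:\Windows\PolicyDefinitions\*" `
          -Destination "\\contoso.com\SYSVOL\contoso.com\Policies\PolicyDefinitions" `
          -Recurse -Force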

 

Furthermore, for a few Policy settings, essential information in the ADMX files, such as the registry key value or the enabled/disabled values, has changed. One example is the Group Policy setting “Turn off Fair Share CPU Scheduling”. Prior to Windows 8 / Windows Server 2012, the key values were:

 

Starting with the ADMX files shipped with Windows 8 / Windows Server 2012, the key values are:

 

 

It is important to mention that this does not affect GPOs present in your environment before updating the ADMX files, but it might affect your clients when you edit the GPOs after the update. If any affected Policy settings are configured in your environment, additional testing should be planned before changing the GPOs or applying them to newer versions of Windows.

 

The comparison file contains a number of spreadsheets:

  • Annotations: General comments about the document
  • ADMXOSAvailability: Displays all ADMX files and their availability in different versions of Windows
  • ADMXChangesFull: All changes in ADMX files, from Windows Vista up to Windows 10 build 1607, patch state August 2016
  • admx_IE11: All changes that occurred in the Internet Explorer 11 ADMX file
  • Removed_Items: A list of all ADMX files and Policy settings removed between Windows Vista and Windows 10 build 1607, patch state August 2016
  • New_Items: A list of all ADMX files and Policy settings added between Windows Vista and Windows 10 build 1607, patch state August 2016
  • Changed_Items: Changes made to Policy settings between Windows Vista and Windows 10 build 1607, patch state August 2016. Important changes, such as registry key value changes, are marked in red.

 

The file can be downloaded directly from the following location:

 

https://go.microsoft.com/fwlink/?linkid=829685

 

 

 

Kai Ohnesorge, Microsoft Identity PFE

VS Team Services Update – Oct 12


Before I get to talking about this update let me talk about a change in the way we are announcing updates…

It takes a while for an update to roll out across the entire service.  That is by design and it is part of our strategy to control the damage from any bugs we miss in the testing process.  Our deployment process is currently divided into 5 “rings”.  The first (we call ring 0) is our own Team Services instance – the one the Team Services team uses to build Team Services.  The second is a small public instance with external customers on it and the rings grow to more and more public instances.

When we deploy a sprint release to a ring, we wait for 24 hours to monitor it and see if any issues arise and fix them before rolling to the next ring.  So, assuming we have no issues that extend the 24 hour “observation time”, it takes us at least 5 days to do the deployment.  Sometimes we have issues and it takes 6, 7 or 8 days.

A sprint ends every 3rd Friday.  The first production deployment (ring 0) generally happens by Wednesday or Thursday of the week following the sprint end.  Because we don’t like to deploy to ring 1 on a Friday and risk not being here for issues over the weekend, we usually wait until Monday of the second week to roll out ring 1.  So then, ring 2 – Tuesday, …, ring 4 – Thursday, and the deployment is finished and everyone has everything by Friday of the second week.  And then we go straight into finishing up the work for the following sprint on the 3rd week and start all over again – it’s never ending 🙂

So when do we notify customers that we’re making an update?  My philosophy has generally been that I don’t want people seeing new features roll out without being able to find release notes/docs describing them.  But I also have resisted rolling out release notes for changes that no one can see – it just creates anxiety about why I can’t have it now.

So, our policy has been to publish the release notes when the deployment to ring 1 (the first public ring) is complete.  Of course, as we’ve added more rings and the deployment has stretched out, an increasing number of customers do end up seeing release notes before the features go live in their accounts and it hasn’t created a huge problem.

Over the past few months, we’ve been getting a bunch of feedback, particularly from our larger customers with hundreds or thousands of Team Services users, that they would like to know what’s coming sooner.  They don’t like being surprised when stuff just shows up and they need a little time to investigate what the changes mean for them and whether or not they need to send additional communication to their teams.

To honor that request, we are experimenting with changing our publishing process.  We have started publishing the release notes as soon as they are ready – which generally means the middle of the 1st week after a sprint end, around the time ring 0 is deployed, but before *any* external customer can actually see the changes.

That is why our Sprint 107 release notes were published yesterday afternoon and I am blogging about it today, despite the fact that none of you have access to any of it.  We hope this, combined with the more coarse-grained roadmap that we publish, will meet the needs of people looking for more forewarning of changes.  We also hope that people can wrap their heads around the update announcements well before availability (on average, about a week before).

Some people have asked for even more forewarning and, for now, I don’t have any solution for that.  Our roadmap gives a longer term picture (6 months) of the big things we are working on, and our release notes now give a one-week preview of imminent changes.  Given our backlog-based development methodology, anything between those two granularities is hard to do and likely to have a lot of errors.

As always, feedback is welcome.

So, on to Sprint 107 updates…

Sprint 107 is delivering quite a few updates and some of them are pretty darned nice.  The biggest visual change is that we are flipping the new navigation structure on by default.  If users aren’t ready for the change yet, they can still turn it off but we’re going to remove that ability before too long and everyone will be on the new nav experience.

Probably the most helpful set of changes is the version control one – lots of very nice UX improvements and new features.  The cherry-pick and revert additions are very nice.  The new file/folder quick search is small but a really nice, snappy experience.

The Azure continuous delivery enhancements are just one step out of many in the journey we are on over the next few months to create a truly impressive and simple Azure CI/CD capability.  Stay tuned for more every sprint.

There are lots of other nice improvements that I don’t mean to downplay.  Check out the release notes for full details.

Brian

C++/WinRT Available on GitHub


C++/WinRT is now available on GitHub. This is the future of the Modern C++ project and the first public preview coming officially from Microsoft.

https://github.com/microsoft/cppwinrt

C++/WinRT is a standard C++ language projection for the Windows Runtime implemented solely in header files. It allows you to both author and consume Windows Runtime APIs using any standards-compliant C++ compiler. C++/WinRT is designed to provide C++ developers with first-class access to the modern Windows API.

Please give us your feedback as we work on the next set of features.

Bentley’s Cloud Solution is Instrumental to Europe’s Largest Construction Project


London is a dynamic city with over two millennia of history spanning several eras of buildings and infrastructure. So it’s an ambitious undertaking to build a new subway line right through the center of it. The London Crossrail railway project is the largest construction project in Europe, with a £14.8 billion budget. The project consists of over 60 miles of above- and below-ground rail, 10 new stations, and updates to 30 existing stations. The challenge for Crossrail was managing information among hundreds of contractors, with the risk that information loss and miscommunication between project phases and teams would cause errors, safety risks, and increased project costs. Crossrail also wanted to increase effectiveness during construction, so that engineers could visualize the complexities surrounding the project and design changes could easily be integrated throughout the project.

For the project, Crossrail teamed up with Bentley Systems. Bentley’s charter was to facilitate collaboration by bringing all the data into one environment, so information is continuously available to all of the contractors where and when they need it – on time and on budget. Crossrail had already been using Bentley’s modeling software to design in a virtual environment, along with Bentley’s project information and collaboration software and asset management software, in a Common Data Environment (CDE), but as the data grew they decided to extend their solution to a hybrid model powered by Microsoft Azure. By using a hybrid model with Azure, Crossrail can work with their entire supply chain, using digital technologies to manage and join up the data that underpins the design, construction, and operation activities of an asset across its lifecycle, from conception to decommissioning. It provides a single location for storing, sharing, and managing information. This creates a “virtual railroad” where the existing infrastructure and future infrastructure can be viewed simultaneously.

 Alan Kiraly, Senior Vice President of Asset Performance at Bentley, explained: “People have used 3D models to design stuff for twenty years. But we are making a comprehensive virtual world that depicts the terrain, the tunnel and all of the associated data.” You can see more about this project in the video below.

 

 

Alan Kiraly says that Bentley “use[s] the entire Azure stack for extending our solution to the cloud.” When the project is finished, Bentley’s virtual model will be used to manage ongoing operations, enabling maintenance crews to assess repairs without shutting down the subway. By using Azure, Bentley is able to streamline the construction of the tunnel and have a resource for future teams to extend or maintain the tunnel.

Digital transformation is often thought of in the context of changing how we communicate, share pictures, or hail a cab. Yet, as we can see from the London Crossrail project, it can also be used to change how we build and update our physical surroundings. By using the cloud, not only is it easier to share information, but it also leaves an asset for people in the future to use – in this case, the people who will do the repairs. As your companies transform or disrupt industries, think about new ways that things can be done better and then the technologies that can support them. Happy coding.

 

Cheers,

Guggs

@stevenguggs

In case you missed it: #AzureAD PowerShell v2.0 is now in public preview!


Howdy folks,

We have launched so many public previews, and so many capabilities have reached GA recently, that even if you are a follower of our blog you might have missed this bit of news:

#AzureAD PowerShell v2.0 is now in public preview!

We know that for many Azure AD and Windows Server AD admins, PowerShell is an essential tool, something you rely on every day to get your job done. So we're really pumped to finally get this new version into your hands.

To give you a quick rundown on this new version, I've asked Rob de Jong, the PM who drives our PowerShell efforts, to write up a guest blog, which you will find below.

I hope you'll find these new cmdlets useful!

And as always, we'd love to receive any feedback or suggestions you have.

Best Regards,

Alex Simons (Twitter: @Alex_A_Simons)

Director of Program Management

Microsoft Identity Division

——————-

Hi everyone,

It's Rob de Jong here, and today I'm excited to give you a quick tour of the new #AzureAD PowerShell v2.0, which is now in public preview.

PowerShell is an important tool in the toolkit of nearly every IT professional who manages Azure Active Directory, and we've just recently released the public preview of our new V2 version of the Azure Active Directory PowerShell cmdlets. This preview release marks the first step on a journey to renew the existing MSOL PowerShell cmdlets you are so familiar with, and we're seeing an amazing number of customers already using the new cmdlets with their production Azure Active Directory.

We have been getting great feedback on the need to publish updates to the new module to address all the new scenarios that are now available in Azure Active Directory, and the team is working hard to add new capabilities. All new capabilities will be provided through the new AzureAD PowerShell module, and you will see continual updates. Meanwhile, we're also working on making all functionality of the old MSOL module available in the new module.

When complete, you will be able to rely completely on the new AzureAD module for all of your needs.

So please start using the new AzureAD module and give us feedback – your feedback is critical to our shared success!

Azure AD PowerShell module features

One of the key features of the new module is a close alignment of the PowerShell functionality with the Graph API capabilities. We are also moving towards a faster and more agile release process for new or updated functionality of these cmdlets.

The new PowerShell cmdlets already provide more functionality in several areas, most notably for Modern Authentication and MFA, and include new management capabilities for Applications and Certificate Authority through PowerShell.

For a full list of all available cmdlets and how to use them, please read our AzureAD PowerShell reference documentation here: https://msdn.microsoft.com/en-us/library/azure/mt757189.aspx

Over time, we will fully replace the existing MSOL cmdlets. You will see regular new functionality updates to this preview release until the complete replacement is available.

Some changes

As you will notice, some things have changed when compared to the existing MSOL library. First of all, we have updated the names of all cmdlets to conform to the Azure PowerShell naming conventions. Since we're publishing a new module for these cmdlets, the name of the module has changed as well: the existing module's name was MSOL, and the new module is called AzureAD. So where, for example, an existing cmdlet was named New-MSOLUser (which adds a new user to the directory), the new cmdlet's name is New-AzureADUser.

Secondly, the parameters for the new cmdlets have sometimes changed as well. As we are developing the cmdlets in close alignment with the Graph API functionality, we're also keeping the names of objects and parameters as close as possible to what is used in the Graph API. An overview of Azure AD Graph API functionality can be found here: Getting started with Graph API
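
To make the naming and parameter changes concrete, here is a hedged sketch that connects with the new module and creates a user. The UPN and password values are placeholders, and note that New-AzureADUser takes a PasswordProfile object rather than a plain password parameter:

# Install the preview module from the PowerShell Gallery and sign in
Install-Module AzureADPreview
Connect-AzureAD

# Build the required PasswordProfile object (placeholder password shown)
$passwordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
$passwordProfile.Password = "P@ssw0rd!"

# Old module: New-MSOLUser ...   New module:
New-AzureADUser -DisplayName "Demo User" -UserPrincipalName "demouser@contoso.onmicrosoft.com" -MailNickName "demouser" -AccountEnabled $true -PasswordProfile $passwordProfile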

New functionality in AzureAD PowerShell

Using the -SearchString parameter

Based on feedback we received from early users of the V2 cmdlets, we introduced a new parameter, -SearchString. This parameter allows you to search for data in your directory based on a matching string value.

For example, executing a cmdlet with the -SearchString parameter in my demo directory returns the matching directory objects. Searching for the value Marketing returns all users where a string attribute matches that value – in my demo tenant, the match is on the Department attribute. Please note that the SearchString search scope for users currently covers the attributes City, Country, Department, DisplayName, JobTitle, Mail, mailNickName, State, and UserPrincipalName.
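
A hedged sketch of this kind of query, using the user search example:

# Return all users where one of the supported string attributes matches "Marketing"
Get-AzureADUser -SearchString "Marketing"

# Show just the display name and department of each match
Get-AzureADUser -SearchString "Marketing" | Select-Object DisplayName, Department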

Managing Token Lifetime policy settings

We're including several new cmdlets in this release that can be used to manage token lifetime settings in your directory and that support operations on Policy, ServicePrincipalPolicy and PolicyAppliedObject objects. More information and examples for this functionality can be found here.
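As a hedged sketch of what creating such a policy might look like with the new cmdlets (the JSON definition, display name and values below are illustrative only; see the linked documentation for the authoritative examples):

# Define a token lifetime policy (values are made up for illustration)
$definition = @('{"TokenLifetimePolicy":{"Version":1,"MaxInactiveTime":"20:00:00"}}')

$policy = New-AzureADPolicy -Definition $definition `
  -DisplayName "ExampleTokenLifetimePolicy" `
  -IsOrganizationDefault $false `
  -Type "TokenLifetimePolicy"

# Inspect the policy that was just created
Get-AzureADPolicy -Id $policy.Id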

Managing Certificate Authorities using PowerShell for Azure AD

These are the new cmdlets that are used to manage certificate authorities:

  • New-AzureADTrustedCertificateAuthority – Adds a new certificate authority for the tenant
  • Get-AzureADTrustedCertificateAuthorities – Retrieves the list of certificate authorities for the tenant
  • Remove-AzureADTrustedCertificateAuthority – Removes a certificate authority for the tenant
  • Set-AzureADTrustedCertificateAuthority – Modifies a certificate authority for the tenant

Please refer to https://azure.microsoft.com/en-us/documentation/articles/active-directory-certificate-based-authentication-ios/#getting-started for detailed information on how to use these cmdlets.
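For a rough idea of the flow, here is a minimal sketch, assuming you have already exported the root CA certificate to a .cer file (the path is made up, and the exact object model is described in the linked article):

# Read the certificate bytes (path is a placeholder)
$cert = Get-Content -Encoding Byte "C:\certs\rootca.cer"

# Build the certificate authority information object
$newCa = New-Object -TypeName Microsoft.Open.AzureAD.Model.CertificateAuthorityInformation
$newCa.AuthorityType = 0          # 0 = root authority (assumption based on the linked docs)
$newCa.TrustedCertificate = $cert

# Add it to the tenant and then list the configured authorities
New-AzureADTrustedCertificateAuthority -CertificateAuthorityInformation $newCa
Get-AzureADTrustedCertificateAuthorities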

Managing Applications in Azure AD using PowerShell

Several new cmdlets have been added to enable management of Applications in Azure AD using PowerShell. There is a set of cmdlets to create, modify and remove Applications:

  • New-AzureADApplication
  • Remove-AzureADApplication
  • Set-AzureADApplication

We also offer capabilities to manage Directory Extensions in PowerShell:

  • Get-AzureADApplicationExtensionProperty
  • New-AzureADApplicationExtensionProperty
  • Remove-AzureADApplicationExtensionProperty

There are new cmdlets to manage Owners for an Application:

  • Add-AzureADApplicationOwner
  • Get-AzureADApplicationOwner
  • Remove-AzureADApplicationOwner

And finally, were offering new capabilities to manage credentials for Applications in PowerShell:

  • Get-AzureADApplicationKeyCredential
  • New-AzureADApplicationKeyCredential
  • Remove-AzureADApplicationKeyCredential
  • Get-AzureADApplicationPasswordCredential
  • New-AzureADApplicationPasswordCredential
  • Remove-AzureADApplicationPasswordCredential

Here is a short video that demonstrates how you can use these new cmdlets to manage access to Applications in your directory.
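If you want a quick feel for these cmdlets before watching the video, here is a minimal sketch (the display name, identifier URI and owner lookup are illustrative, and the owner search assumes a single match):

# Create an application registration
$app = New-AzureADApplication -DisplayName "My Sample App" `
  -IdentifierUris "https://contoso.onmicrosoft.com/my-sample-app"

# Add an owner to the application
$owner = Get-AzureADUser -SearchString "Jane"
Add-AzureADApplicationOwner -ObjectId $app.ObjectId -RefObjectId $owner.ObjectId

# Create a password credential (client secret) for the application
New-AzureADApplicationPasswordCredential -ObjectId $app.ObjectId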

We invite you to try out the new AzureAD PowerShell V2 module, which you can install from the PowerShell Gallery here: http://www.powershellgallery.com/packages/AzureADPreview.

Check out the new capabilities and let us know what you think!

Regards,

Rob


Customer data—walking the line between helpful innovation and invasion of privacy


Everyone is worried about privacy these days. More information about you exists in more places today than ever before in history.

Collecting and using customer data is not a bad thing. Organizations need that data to deliver products and services customers want. The issue is where to draw the line between using customer data to deliver helpful new capabilities and invading customer privacy. The October 11 episode of Modern Workplace provided guidance on how organizations can manage these tricky policy decisions.

Hillery Nye, chief privacy officer at Glympse, explained how the startup company made a very conscious decision to not collect data that it could have easily gathered from its real-time location sharing app. The company collects customer data and uses it for very specific purposes, but it never stores that data. The company may have given up some opportunities to monetize its customer data, but Nye feels that the company gains even more by being a responsible corporate citizen and establishing a reputation for privacy. She discussed how a company’s brand is affected by its privacy policies, and how organizations can better align their privacy policies with their business strategy.

Jules Polonetsky, CEO of the Future of Privacy Forum—a think tank and advocacy group focused on data privacy issues—explained that companies need to understand what customer data they have, where it is located, what rules apply to it and who has access to it. Then they need to develop privacy standards that align with their business goals and customer expectations. Polonetsky helped draw a distinction between security and privacy. “Security is about making sure the people who aren’t authorized to have data don’t get it. Privacy is about the people who are authorized to have the information…and what you do with it once you are allowed to have it.”

What can you do to stay secure and ensure privacy? Microsoft has developed the Secure Productive Enterprise, an offering that brings together the latest, most advanced technologies for security as well as management, collaboration and analytics. On this episode, we demonstrated some of the key security capabilities across the three products of the Secure Productive Enterprise: Office 365, Enterprise Mobility & Security and Windows 10.

Watch now to learn tips on managing privacy, such as:

  • Place someone in charge of privacy.
  • Align your privacy policies with your business strategy.
  • Implement clear rules to maintain privacy standards.
  • Know your partners and how they use data.
  • Understand that there’s too much risk to ignore international privacy regulations, even if you are not a huge organization.

And tune in on November 15, 2016, at 8 a.m. PST to watch “Solving the Generational Divide: How to create cohesive teams.”

The post Customer data—walking the line between helpful innovation and invasion of privacy appeared first on Office Blogs.


Yammer bolsters security and compliance with new auditing and reporting capabilities


Yammer is designed to connect people and teams across your organization so they can work more effectively together, while also meeting your organization’s security and compliance needs. Having visibility and control over your data within the cloud services you use—whether user actions, object activities or access points—is a critical part of IT compliance and security. We’re pleased to announce we are rolling out new auditing and reporting capabilities for Yammer, powered by the Office 365 Management Activity API and the Office 365 Security & Compliance Center.

These new capabilities provide a new level of visibility for IT and build on our announcement earlier this year that Yammer is now covered by the Office 365 Trust Center. As part of the Office 365 Trust Center, Yammer complies with international and regional standards such as the Office 365 Data Processing Agreement with European Union Model Clauses (DPA with EUMC), the Health Insurance Portability and Accountability Act Business Associate Agreement (HIPAA BAA), ISO 27001, ISO 27018, Section 508 for web accessibility, and SSAE 16 SOC 1 and SOC 2 reports. We also recently announced an update to the Yammer apps for iOS and Android that allows IT administrators to protect their corporate data using mobile application management (MAM) controls in Microsoft Intune.

Auditing and reporting across various user and admin activities

The Office 365 Management Activity API is a RESTful API that already provides visibility into user and admin transactions across SharePoint Online, Exchange Online and Azure Active Directory. The Office 365 Security & Compliance Center aggregates these transactions into a single searchable log. Today, we’re pleased to announce our plans to include Yammer user and admin transactions in both the Office 365 Management Activity API and the Office 365 Security & Compliance Center.

More than 25 different Yammer operations spanning five categories will be made available for auditing:

  • Users—including activating a user, suspending a user and deleting a user.
  • Groups—including creating a group, adding a member to a group and deleting a group.
  • Files—including creating a file, viewing a file and deleting a file.
  • Admins—including exporting data, triggering private content mode and forcing all users to log out.
  • Network settings—including changing network usage policy and changing data retention policy.

This support article has the complete list of operations available.

Using the Office 365 Management Activity API to audit Yammer data

To get to the Yammer reports, open the Security & Compliance Center from the Office 365 Admin Center, then click the Search & investigation tab and select Audit log search. Currently, audit history is retained for 90 days, and admins can export results to a CSV file for additional reporting in Excel.

Yammer operations and details can be viewed through the Office 365 Security & Compliance Center.

Find more information on using the audit log reports here, from getting started to understanding all that is audited.

Applications can also consume audit data using the Office 365 Management API. The API provides a consistent schema across all activity logs and allows organizations and ISVs to integrate Office 365 audit data into their security and compliance monitoring and reporting solutions.
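As a rough sketch of what that could look like from PowerShell (the tenant ID, token acquisition and content type below are assumptions for illustration, not details from this post):

# Assumes you have registered an Azure AD application with permission to the
# Office 365 Management APIs and have already obtained an OAuth access token for it.
$tenantId    = "00000000-0000-0000-0000-000000000000"   # placeholder
$accessToken = "<ACCESS_TOKEN>"                         # placeholder
$headers     = @{ Authorization = "Bearer $accessToken" }
$baseUri     = "https://manage.office.com/api/v1.0/$tenantId/activity/feed"

# Start a subscription for the content type assumed here to carry these audit events,
# then list the available content blobs and fetch the audit records they point to.
Invoke-RestMethod -Method Post -Headers $headers -Uri "$baseUri/subscriptions/start?contentType=Audit.General"
$blobs = Invoke-RestMethod -Method Get -Headers $headers -Uri "$baseUri/subscriptions/content?contentType=Audit.General"
foreach ($blob in $blobs) {
    Invoke-RestMethod -Method Get -Headers $headers -Uri $blob.contentUri
}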

To learn more about the API overall and the Yammer specifics, see the Office 365 Management Activity API reference documentation.

Control on your terms with cross application security and compliance

Organizations want a collaboration platform that gives them the right level of control, compliance, privacy and security. Because Yammer is part of Office 365, IT departments can easily manage user access and controls and ensure that corporate data is private, secure and compliant. These new capabilities will roll out to Office 365 commercial customers over the coming weeks. More information about auditing and reporting in Yammer can be found in the support article.

The post Yammer bolsters security and compliance with new auditing and reporting capabilities appeared first on Office Blogs.

New Learning Tools help educators create more inclusive classrooms


A one-size-fits-all approach to education can be especially stifling for students with unique learning needs. To put these unique learning needs in perspective: dyslexia is estimated to impact one in five people; 72 percent of classrooms have special education students; and 73 percent of classrooms have readers that span four or more grade levels.* Understanding this classroom reality is what led our engineering teams at Microsoft to design Office 365, and its inclusive classroom technology like Learning Tools, with accessible learning experiences in mind.

With accessibility in mind, and based on direct feedback from educators and students, the Microsoft engineering teams continue to expand the capabilities and availability of the tools that help students of all abilities succeed. Many features previously exclusive to the OneNote desktop app are now coming to OneNote Online, Word Online, Word for the desktop, Office Lens and beyond, to make sure more students have access to these tools.

The accessibility features built into many of the Office 365 tools—free for students and teachers—that educators already use regularly with their students create even more inclusive experiences, enabling all learners to have that “ah-ha” moment that motivates ongoing success.

To learn more, check out the Microsoft in Education blog.

* Scholastic and the Bill and Melinda Gates Foundation survey of 20,000 public school teachers.

The post New Learning Tools help educators create more inclusive classrooms appeared first on Office Blogs.

SQL Server 2016 Express Edition in Windows containers


We are excited to announce the public availability of SQL Server 2016 Express Edition in Windows Containers! The image is now available on Docker Hub and the build scripts are hosted on our SQL Server Samples GitHub repository. This image can be used in both Windows Server Containers as well as Hyper-V Containers.

SQL Server 2016 Express Edition Docker Image | Installation Scripts

We hope you will find these images useful and leverage them for your container-based applications!

Why use SQL Server in containers?

SQL Server 2016 in a Windows container would be ideal when you want to:

  1. Quickly create and start a set of SQL Server instances for development or testing.
  2. Maximize density in test or production environments, especially in microservice architectures.
  3. Isolate and control applications in a multi-tenant infrastructure.

Prerequisites

Before you can get started with the SQL Server 2016 Express Edition image, you’ll need a Windows Server 2016 or Windows 10 host with the latest updates, the Windows Container feature enabled, and the Docker engine.

Please find the details for each of these requirements below.

  • Get a Windows Server 2016 or Windows 10 host
    • Windows Server 2016: You can start by downloading an evaluation copy from the TechNet Evaluation Center. Please make sure that all the latest Windows updates are installed, most importantly KB3176936 and KB3192366.
    • Windows 10: You will need Windows 10 Anniversary Edition Professional or Enterprise. Note: if you are on the Windows Insider builds, make sure that you are using build 14942.1000 or higher to avoid an issue with the Docker run command in older builds.
  • Enable the Windows Container feature and install the Docker Engine (a minimal PowerShell sketch for Windows Server 2016 follows this list)
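On Windows Server 2016, a minimal PowerShell sketch for this step might look like the following (this assumes the DockerMsftProvider package source that Microsoft documents for Windows Server 2016; Windows 10 instead uses Docker for Windows):

# Install the Docker provider module and the Docker engine package, then reboot.
# Per the documented Windows Server 2016 path, this also enables the Containers feature.
Install-Module -Name DockerMsftProvider -Repository PSGallery -Force
Install-Package -Name docker -ProviderName DockerMsftProvider -Force
Restart-Computer -Force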

Pulling and Running SQL Server 2016 in a Windows Container

Below are the Docker pull and run commands for running a SQL Server 2016 Express instance in a Windows container.

Make sure that the mandatory sa_password environment variable meets the SQL Server 2016 Password Complexity requirements.

First, pull the image
docker pull microsoft/mssql-server-2016-express-windows

Then, run a SQL Server container

Running a Windows Server Container (Windows Server 2016 only):

docker run -d -p 1433:1433 --env sa_password=<YOUR_SA_PASSWORD> microsoft/mssql-server-2016-express-windows

Running a Hyper-V Container (Windows Server 2016 or Windows 10):

docker run -d -p 1433:1433 --env sa_password=<YOUR_SA_PASSWORD> --isolation=hyperv microsoft/mssql-server-2016-express-windows

Connecting to SQL Server 2016

From within the container

An easy way to connect to the SQL Server instance from inside the container is by using the sqlcmd utility.

First, use the docker ps command to get the container ID that you want to connect to and use it to replace the ‘<ContainerID>’ placeholder in the commands below. You can use the docker exec -it command to create an interactive command prompt that will execute commands inside of the container.

You can connect to SQL Server by using either Windows or SQL Authentication.

Windows authentication using container administrator account

docker exec -it <ContainerID> sqlcmd

SQL authentication using the system administrator (SA) account

docker exec -it <ContainerID> sqlcmd -S. -Usa

From outside the container

One of the ways to access SQL Server 2016 from outside the container is by installing SQL Server Management Studio (SSMS). You can install and use SSMS either on the host or on another machine that can remotely connect to the host.

Connect from SSMS installed on the host

To connect from SSMS installed on the host, you’ll need the following information (a quick PowerShell check using these values follows the list):

  • The IP Address of the container
    One of the ways to get the IP address of the container is by using the docker inspect command:
    docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' <ContainerID>
  • The SQL Server port number
    This is the same port number that was specified in the docker run command. If you used 1433 you don’t need to specify the port. If you want to specify a port to connect to you can add it to the end of the server name like this: myserver,1433.
  • SQL system administrator account credentials
    The username is ‘sa’ and the sa_password that was used in the docker run command.
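Putting those pieces together, a quick way to sanity-check connectivity from a PowerShell prompt on the host is sketched below (the container ID and password are placeholders, and this assumes the sqlcmd utility is available on the host):

# Get the container's IP address (replace <ContainerID> with the value shown by 'docker ps')
$containerId = "<ContainerID>"
$ip = docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' $containerId

# Connect with the sa account and run a trivial query
sqlcmd -S "$ip,1433" -U sa -P "<YOUR_SA_PASSWORD>" -Q "SELECT @@VERSION"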

Connect from SSMS on another machine (other than the Host Environment)

To connect from SSMS installed on another machine (that can connect to the host), you’ll need the following information:

  • The IP address of the host
    You can get the host’s IP address by using the ipconfig command from a PowerShell or command prompt window.
  • The SQL Server port number
    This is the same port that was specified in the docker run command. If you used 1433 you don’t need to specify the port. If you want to specify a port to connect to you can add it to the end of the server name like this: myserver,1433.
    Note: Depending on your configuration, you might have to create a firewall rule to open the necessary SQL Server ports on the host. Please refer to this article for more information regarding container networking.
  • SQL system administrator account credentials
    The username is ‘sa’ and the sa_password that was used in the docker run command.

SQL 2016 Features Supported on Windows Server Core

Please refer to this link for all SQL Server 2016 features that are supported on a Windows Server Core installation.

Developing Using Windows 10 Containers

Check out this blog post by Alex Ellis, Docker Captain, on how to use SQL Server 2016 Express Edition in a Windows container as part of an application development and test environment on Windows 10.

Docker with Microsoft SQL 2016 + ASP.NET

Further Reading

Windows Containers Documentation
Container Resource Management
SQL Server 2016 GitHub Samples Repo
Tutorials for SQL Server 2016

Faster C++ solution load and build performance with Visual Studio “15”


With Visual Studio ‘15’ our goal is to considerably improve productivity for C++ developers. With this goal in mind we are introducing many new improvements, which you can try out in the recently released Preview 5 build. The highlights of this release include the following:

Faster C++ solution load

‘Fast project load’ is a new experimental feature for C++ projects. The first time you open a C++ project it will load faster, and the time after that it will load even faster! To try out this experimental feature set ‘Enable Faster Project Load’ to true in Tools -> Options as shown in the figure below:

The small demo below depicts these improvements on the large Chromium Visual Studio Solution which has 1968 projects. You can learn more about how faster C++ solution load operates by reading this detailed post we published earlier.

There is another experimental effort underway in Visual Studio to improve solution load called “lightweight solution load”. This is a completely different approach, and you can read about it here. Generally, it avoids loading projects at all and only loads a project when a user explicitly expands it in Solution Explorer. The C++ team has been focused on fast project load, so our support for lightweight solution load is currently minimal. In the RC release of Visual Studio “15”, we expect the fast project load feature to work in conjunction with lightweight solution load. This combination should provide a great experience.

Faster build cycle with /Debug:fastlink

Developer builds will now complete faster because of faster links with an integrated /debug:fastlink experience. Expect to see 2-4x link-time improvements for your application builds.

The figure below illustrates how /debug:fastlink helps improve link times for some popular C++ sources. You can learn more about /debug:fastlink and its integration into Visual Studio by reading this blog post we published last week.
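If you want a rough feel for the switch outside of the IDE, a minimal sketch from a developer command prompt might look like the following (the file names are made up; inside Visual Studio the option is exposed through the linker's debugging settings):

cl /Zi /c main.cpp
link main.obj /DEBUG:FASTLINK /OUT:app.exe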


Reducing out-of-memory crashes in VS while debugging

With VS “15” Preview 5, we have also taken a step towards reducing memory consumed by symbol information, while maintaining the performance benefit of pre-fetching symbol data. We now enable pre-fetching only on modules that are relevant for evaluating and displaying a variable or expression. As a result, we are now able to successfully debug the Unreal engine process and contain it within 3GB of virtual memory and 1.8GB of private working set. Previously in VS 2015 when debugging the Unreal engine process, we would run out-of-memory. Clearly this is an improvement over the previous release, but we’re not done yet. We will be continuing to drive down memory usage in native debugging scenarios during the rest of VS “15” development.

Wrap Up

As always, we welcome your feedback and we would love to learn from your experiences as you try out these features. Do let us know how these improvements scale for your C++ code base.
If you run into any problems, let us know via the Report a Problem option, either from the installer or the Visual Studio IDE itself. You can also email us your query or feedback if you choose to interact with us directly! For new feature suggestions, let us know through User Voice.

Jim Springfield, Principal Architect, Visual C++ team.

Jim is passionate about all things C++ and is actively involved in redesigning the compiler front end, the language service engine, libraries and more. Jim is also the author of the popular C++ libraries MFC and ATL, and his most recent work includes development of the initial cross-platform C++ language service experience for the Visual Studio Code editor.

Ankit Asthana, Senior Program Manager, Visual C++ team.

Ankit’s focus area is cross-platform mobile development along with native code generation tools. Ankit is also knowledgeable in compilers, distributed computing and server-side development. He has previously worked for IBM and Oracle Canada as a developer, building Java 7 (HotSpot) optimizations and telecommunication products. Back in 2008, Ankit also published a book on C++ titled ‘C++ for Beginners to Masters’, which sold several thousand copies.

Announcing Windows 10 Insider Preview Build 14946 for PC and Mobile


Hello Windows Insiders!

Today we are excited to be releasing Windows 10 Insider Preview Build 14946 for PC and Mobile to Windows Insiders in the Fast ring.

What’s new in Build 14946

Customizing your precision touchpad experience (PC): Last week, we announced that we’d been working on refining your touchpad experience. This week, we’re taking that one step further. When you go into Settings > Devices > Touchpad, you will now find a section called “Other gestures”. In this section, you now have basic customization options for your three and four finger gestures. For taps, you can select between Cortana, Action Center, play/pause or middle mouse button, and for left/right swipes, you can select between switching apps or switching virtual desktops. Try it out and let us know what you think!

Updated Touchpad settings.

However, we know that some of our Insiders prefer even more control over their experience, so for those Insiders (and power users) we have added a new Advanced Gestures Configuration page. The page can be accessed via a link at the bottom of Touchpad settings page.

Advanced Gesture Configuration settings page

The Advanced Gestures Configuration page has more configuration options, including hooking gestures up for next/previous song, creating/deleting virtual desktops, or snapping windows. In addition to these new options, we’ve also updated the Touchpad settings to include reference diagrams to remind you of what to expect when you swipe with three or four fingers in a particular direction.

Known issues to look out for with your touchpad: since the last flight, we’ve fixed the issue resulting in touchpad scrolling being too sensitive in Windows 10 apps – we appreciate everyone who shared their feedback on the subject. There’s one other known issue we’re looking into right now: click and drag with your precision touchpad might get misrecognized as a right-click in this build. You’ll also notice that there’s a “custom keyboard shortcut” option in settings – that hasn’t lit up yet, but we’ll let you know when it does. Keep the feedback coming!

Separate screen time-out settings when using Continuum for Phone (Mobile): Today, we are happy to announce the availability of a top user request for Continuum for Phone. With this update, you will now be able to turn off whichever screen you are not using with Continuum – saving battery and preventing screen burn-in. If you are working on a Word document, your phone screen will sleep without any impact on your Continuum session. If you make a call, hang up, or press the phone power button, you can keep right on working in Word with Continuum. And, if you prefer longer or shorter timeout values, you can change them independently for the phone and connected screen, using the settings found under Settings > Personalization > Lock screen. Have fun, and keep sending us great feedback on Continuum!

Updated Wi-Fi Settings page (PC and Mobile): We continue to make Settings more similar across Windows devices. We have added a new setting to the Wi-Fi settings page. When you go to Settings > Network & Internet > Wi-Fi on your PC, then turn Wi-Fi off, you can now select a time under “Turn Wi-Fi back on” to have it turn on automatically after the amount of time you choose. It’s set to Manually by default in this build.

Turn Wi-Fi back on

We are also continuing to move functionality from the Wi-Fi (legacy) screen to the new Wi-Fi settings screen on Mobile. After you turn off Wi-Fi, choose a time under Turn Wi-Fi back on to have it turn on automatically after the amount of time you choose. Based on what we heard from customers in the Windows 10 Anniversary Update, we changed the default setting from “In 1 hour” to Manually in this build.

Option to prevent autocorrection (Mobile): On your phone, if you see that an autocorrection is going to happen (i.e. the first candidate is bolded) and it’s not what you intended, you will now see what you originally typed as the second candidate. Tapping on that candidate will prevent the autocorrection, and the system will learn your vocabulary and get smarter over time.

Option to remove a word from user dictionary (Mobile): We learn from your typing on the phone and build up a local user dictionary that adapts the keyboard prediction, autocorrection and shape writing experiences to your language style. Sometimes you may type a misspelled word and send it as is, sometimes you may tap on a red squiggled word and then tap the “+” sign by accident. Both save the unwanted word into your user dictionary, and it may appear when you type something similar in the future. Now you have a way to manually remove bad entries: tap on the word you don’t like in the text box, and you will see a candidate that is the word with a “-“ sign before it. Tapping on that candidate will remove the word completely from your user dictionary and no longer suggest it.

Important note about a change to automatic backups of your phone (Mobile): We have changed the frequency of scheduled backups for Mobile to once a week. Initiating a backup manually by clicking the “Back up now” button remains unchanged and works as expected. You can do this via Settings > Update & security > Backup and clicking the “More options” link at the bottom. As always, it’s good to do a backup of your device before updating to a new build.

 Other improvements and fixes for PC

  • Optional components such as Hyper-V and Bash should remain installed after updating to this build.
  • We have fixed the issue where signing into games that use Xbox Live would not work. You should be able to sign-in to Xbox Live in games in this build.
  • We fixed the issue causing Microsoft Edge to sometimes crash on launch, or when you type in address bar or try to open a new tab. You no longer need to run the PowerShell script.
  • We fixed the issue causing touch scrolling to be too sensitive in Windows 10 apps, such as Microsoft Edge.
  • We fixed an issue where Explorer.exe would hang when attempting to open very large .MOV files.
  • We fixed an issue that could result in the network icon occasionally getting into a state where a red X would display in the taskbar despite an active internet connection, until the device had been restarted.
  • We fixed an issue where if the device’s brightness was automatically adjusted after being woken from sleep, the brightness level shown in the Action Center’s Brightness Quick Action might not reflect the current brightness of the device.
  • We fixed an issue leading to Narrator not tracking focus on the Start Menu All apps list or tiles.
  • We fixed an issue potentially resulting in the “Open with…” dialog displaying with two entries for Calculator after tapping the Calculator key on a keyboard or running the Calculator app.

Other improvements and fixes for Mobile

  • We fixed the issue causing you to get into a state where text messages will fail to send.
  • When your phone is connected to your PC, long lists of captured photos will now load significantly faster in File Explorer.
  • We’ve updated Narrator’s reading order for Windows 10 apps which display an app bar on the bottom of the app, for example OneDrive, so now the contents of the page will be read before the contents of the app bar.
  • We fixed an issue resulting in video thumbnails sometimes not being shown in WhatsApp – videos received from this build onwards will display a thumbnail.
  • We fixed an issue resulting in Camera app video recordings having a slight crackle to their audio in recent builds.

Known issues for PC

  • If you have 3rd-party antivirus products such as Bitdefender, Kaspersky Antivirus, F-Secure Antivirus and Malwarebytes installed on your PC, it might not be able to complete the update to this build and will roll back to the previous build.
  • Larger Windows Store games such as ReCore, Gears of War 4, Forza Horizon 3, Killer Instinct and Rise of the Tomb Raider may fail to launch.

Known issues for Mobile

  • If your phone has additional speech packs installed – this build will fail to install on your phone with an 0x80188319 error. This is why we polled Insiders about language pack usage this week. Based on the response, we decided to push forward and send the latest bits. This issue will be fixed in the next build we release. NOTE: Removing speech packs will not correct the issue. Your device will continue to try and download and install the update and fail. It is recommended you move to the Slow ring until this issue is fixed.

Getting misconfigured PCs up to date

Some Windows Insiders who have pre-release builds on their PCs are not targeted correctly for receiving updates because they have not opted into receiving Insider Preview builds by having selected a valid ring or are in some other misconfigured state. To re-target these PCs, we are making a server-side change that will put these PCs in the Slow ring so that they can receive updates. Specifically:

  • Going forward, any build from our Development Branch installed on a PC will be put into the Slow ring. This includes if you installed a Development Branch build from an ISO or later reset the device.
  • PCs on the Windows 10 Anniversary Update that were previously opted in to receiving builds but show a “Fix me” button on the Windows Insider Program settings page (because they no longer have a registered Microsoft account attached) will be put in the Release Preview ring.

If you would like to adjust these settings or suspend getting Development branch build updates temporarily (7 days), you can do so by going to Settings > Update & security > Windows Insider Program. To move to the Fast ring, a Windows Insider registered Microsoft Account must be attached to the PC. For enterprises running Insider Preview builds, you can manage this via these Group Policies documented here.

Team updates

Whew! On top of all the build and update system news, we also made a ton of progress on our #WINsiders4Good work for GIVE month. We had a fantastic launch of GIVE Month at the NYC Flagship Microsoft store last weekend where the Windows Insiders partnered with Educational Alliance to create solutions for after-school care for students from underrepresented backgrounds.

Dona at the Microsoft Store

The team came up with a variety of solutions, including some that are being completed with the help of Windows Insiders globally. We are creating the world’s biggest open source list of free software tools that NGOs and everyone else can use to teach and learn concepts such as visual design, coding, problem solving and more, so that EVERY person can have access to these great tools. This webpage will live on our Windows Insiders website. I’ll share more as it goes live before Friday.

We are excited to be present at the Boston Microsoft store event this weekend as well as the global rollouts of the Create-A-Thon kits.

Thank you everyone and keep hustling,
Dona <3
