
PingAccess for Azure AD: The public preview is being deployed!


Howdy folks,

Back in September, I blogged about our exciting partnership with Ping Identity.

Since then, Microsoft and Ping Identity have worked closely together to extend the capabilities of Azure AD Application Proxy to support new kinds of on-premises applications using PingAccess.

I’m happy to announce today that PingAccess for Azure AD is now ready for Public Preview and is currently being deployed across Azure AD data centers around the world. Many of you in North America will see it turn on today and it should be available to everyone by the end of the day Friday, 3/24/2017.

I’ve invited one of the program managers on our team, Harshini Jayaram, to share more details in a blog, which you’ll find below. We hope you try it out and look forward to hearing what you think!

Best regards,

Alex Simons (Twitter: @Alex_A_Simons)

Director of Program Management

Microsoft Identity Division

---

Hi all,

Many customers already use Application Proxy to provide single sign-on (SSO) and secure remote access for web applications hosted on-premises. Many of them use it for applications such as local SharePoint sites, Outlook Web Access for local Exchange servers, and other business web applications. It is a simple, secure, and cost-effective solution:

  • Simple: You don’t need to change the network infrastructure, put anything in a DMZ, or use VPN.
  • Secure: Application Proxy only uses outbound connections, giving you a more secure solution. It also works with other security features you’ve seen in Azure such as two-step verification, conditional access, and risk analysis. Learn more about this in Security considerations for Azure AD Application Proxy.
  • Cost-Effective: Application Proxy is a service that we maintain in the cloud, so you can save time and money.

Right now, all those benefits of Application Proxy are available for many different types of applications, including:

  • Web applications using Integrated Windows Authentication
  • Web applications using form-based access
  • Web APIs that you want to expose to rich applications on different devices
  • Applications hosted behind a Remote Desktop Gateway

If you want more details, you can check out our Application Proxy documentation. For this blog, I want to focus more on how we’re adding header-based applications with this new public preview!

PingAccess for Azure AD enables more apps!

Our customers have consistently asked for Application Proxy to also support apps that use headers for authentication, such as PeopleSoft, NetWeaver Portal, and WebCenter. To enable this capability for our Azure AD Premium customers, we have partnered with Ping Identity. Ping Identity’s PingAccess now allows Application Proxy to support apps that use header-based authentication.

PingAccess is installed on-premises. For apps that use header-based authentication, Application Proxy connectors route traffic through PingAccess. Existing App Proxy applications are not impacted and use the current flow with no changes. An overview of this flow is shown below, and you can always check out our overview documentation for more on App Proxy flows.

Figure 1: Application Proxy + PingAccess Infrastructure Overview

PingAccess is a separately licensed feature, but your Azure AD Premium licenses now include a free license to configure up to 20 applications with this flow. If you have more apps, you’ll need to get a license through Ping Identity.

Joining the Preview

We are excited to have you join our preview! To get started you need to:

  1. Configure Application Proxy Connectors
  2. Create an Azure AD Application Proxy Application
  3. Download & Configure PingAccess
  4. Configure Applications in PingAccess

Just head to our Application Proxy + PingAccess documentation for a walkthrough of each of these steps.

We hope you enjoy trying this preview! As always, we’d love to hear from you with any questions, comments, or feedback, so please leave us a comment or reach out to us directly at aadapfeedback@microsoft.com.

Thanks,

Harshini Jayaram


Bridge gap between app and infra with Azure management and security updates


Silos within organizations aren’t new, and they aren’t a new source of issues. As IT groups today work to eliminate bottlenecks and increase speed, one important area to consider is simplifying the connections between teams. The tools you use to manage your environment can be an important part of your approach. Today, we’re sharing recent updates to Azure management and security services that are designed to make it easier to anticipate issues and resolve them more quickly when they arise.

Service management, security, and performance management are all high priority areas where the activities of different groups may intersect. For greater efficiency in handling incidents, you can now bring together system logs and service management features into interactive dashboards for network performance and application dependency mapping. For greater insight into security incidents, we recently announced enhanced behavioral analytics and advanced prevention technologies. To bridge between application performance monitoring and traditional monitoring of servers and workloads, we now include Azure Application Insights in the Operations Management Suite.

The insight gives everyone at different levels, and from different perspectives, a better view of what’s happening with an application or a network segment. If we compare it to a water service, it’s like looking at a water pipe from inside the house to the sidewalk and all the way to the water tower.

Shawn Williams, Systems Engineer, Purdue University

Service management

Connecting your log data across datacenters, remote office sites, and critical workloads can make it easier to troubleshoot IT issues and provide faster resolutions across teams. Gain visibility into the virtual network connections in your environment using the Network Performance Monitor in Azure Insight & Analytics. This network monitoring tool scans networks for performance degradation and outages in near real-time, and enables you to drill down into each node and visualize the network segments and devices that may be causing the problem. Additionally, Service Map, currently in public preview, is an application and server dependency mapping tool that discovers and maps server and process dependencies and visualizes application components, service dependencies, and supporting infrastructure configuration. This helps you eliminate the guesswork of problem isolation, identify surprise connections and broken links in your environment, and perform Azure migrations knowing that critical systems and endpoints won’t be left behind.

Security

Security is a major focus area for most organizations, and you don’t want to be the bottleneck in ensuring your systems are secure and compliant. You can use Azure security tools to monitor security events leveraging Microsoft’s threat intelligence engine, and to respond immediately. Servers remain a top target for breaches, but with the new behavioral analytics and advanced prevention technologies now available, you can identify suspicious activity if a server becomes compromised, such as process persistence in the registry, processes masquerading as system processes, and attempts to evade application whitelisting. You can also use the power of machine learning to detect brute force attacks, and to determine whether virtual machines are taking part in a DDoS attack by joining network and machine data together, so you can work with your security counterparts to resolve issues quickly.

Performance management

Application developers and IT are both concerned with the health and performance of critical applications and the platforms on which they run. To help you improve collaboration between teams, we’ve made it possible to use the same application data across Azure services with custom views to meet different needs. Azure Insight & Analytics enables IT to see server, workload and network performance across your environment, coupled with application data on web tests, page views, server requests, exceptions and custom events, in a single, connected workspace. For developers, you can use Azure Application Insights, which was announced at Connect() in November, to visualize application behavior and usage patterns, including interactive charts for app responsiveness, load times, and slow requests. You can also deploy and test new features, using custom telemetry, to see how they are performing. To make it easier to take advantage of the combined functionality of Application Insights and Insight & Analytics, we’ve now added Application Insights to the Operations Management Suite.
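To make “custom telemetry” concrete, here is a hedged sketch using the Application Insights SDK (the Microsoft.ApplicationInsights NuGet package); the event name, property, and metric below are illustrative placeholders, not part of any product sample:

```csharp
// Hedged sketch: send a custom event with properties and metrics via the
// Application Insights SDK. Names here are illustrative placeholders.
using System.Collections.Generic;
using Microsoft.ApplicationInsights;

public static class FeatureTelemetry
{
    private static readonly TelemetryClient Telemetry = new TelemetryClient();

    public static void TrackFeatureUse(string featureName, double loadTimeMs)
    {
        // Custom events appear alongside built-in page view and request
        // telemetry, so IT and developers can look at the same data.
        Telemetry.TrackEvent("FeatureUsed",
            properties: new Dictionary<string, string> { ["feature"] = featureName },
            metrics: new Dictionary<string, double> { ["loadTimeMs"] = loadTimeMs });
    }
}
```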

Azure management and security services enable organizations to see in near real-time what is happening in a hybrid, cross-platform environment. For Shawn Williams, systems engineer at Purdue University, “The insight gives everyone at different levels, and from different perspectives, a better view of what’s happening with an application or a network segment. If we compare it to a water service, it’s like looking at a water pipe from inside the house to the sidewalk and all the way to the water tower.”

Get started now

Azure Application Insights is now included in the Operations Management Suite. To take advantage of this offer, your Application Insights resources must be configured in the Enterprise tier. Try it today with a free account, and send us your feedback.

New Year, New Dev – Windows IoT Core


To wrap up the “New Year, New Dev” blog series, we’ll go into using Windows 10 IoT Core and show how easy it is to start developing applications to deploy on IoT devices such as the Raspberry Pi 3. If you haven’t had a chance to read the first two posts in this series, you can find them here:

  1. New Year, New Dev: Sharpen your C# Skills
  2. New Year, New Dev: Developing your idea into a UWP app

Let’s begin by explaining what Windows 10 IoT Core actually is.

Windows 10 IoT Core is a version of Windows 10 that is optimized to run on smaller IoT devices, with or without a display, that allows you to build apps using the rich Universal Windows Platform (UWP). In fact, one of the goals for UWP was to enable the same application to run on PC, Xbox, HoloLens, Surface Hub or IoT Core devices. This enables you, as the developer, to create a core application with custom, adaptive user experiences appropriate for these devices.

In UWP there are extension APIs available to do things that are specific to your platform or device. For example, there are extension SDKs that give you access to unique features in Windows Mobile and on Surface Hubs. The extension SDKs for Windows IoT give you access to the APIs that can manage things like lights, sensors, motors, buses and much more.
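To illustrate the adaptive pattern these extension SDKs enable, a UWP app can probe for a device-specific type at runtime before calling it. The following is a small sketch of that check; the GPIO type is only present on devices that expose the IoT extension APIs:

```csharp
// Sketch: runtime check for a device-specific extension API before using it.
// ApiInformation is part of UWP; Windows.Devices.Gpio.GpioController exists
// only on devices (such as IoT Core boards) with the IoT extension SDK.
using Windows.Foundation.Metadata;

public static class DeviceFeatures
{
    public static bool HasGpio() =>
        ApiInformation.IsTypePresent("Windows.Devices.Gpio.GpioController");
}
```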

Windows IoT Core supports a wide range of devices, including the Raspberry Pi 3 used in this post.

Out of the box, you can use the UWP programming language you’re most comfortable with to build apps for IoT Core; these languages support apps both with and without a user interface (the latter are known as “background applications”) and ship with Visual Studio by default:

  • C#
  • C++
  • JavaScript
  • Visual Basic

Alternatively, you can use one of the following IoT-focused languages; apps written in these languages can only be background applications:

  • C/C++ with Arduino wiring
  • Node.js
  • Python

To use one of the “IoT focused” languages, you’ll need to install the Windows IoT Core Project Templates Visual Studio Extension. You can download these right from Visual Studio by going to Tools > Extensions and Updates. Or, you can install it separately by downloading it (go here for VS2015 or here for VS 2017).

For the purposes of today’s post, we will show how to install Windows 10 IoT on a device, how to install the tools you’ll need, and share several sample applications to get you started.

Getting started

The IoT Core team has made it very easy to get started by providing a flow-based Get Started page on the Windows IoT Core web page. Let me take you through the steps:

Step 1:

Go to the Get Started page here and you’ll be greeted with Step 1: Select Your Hardware. Choose the device you’re using; for this post, I’ll pick the Raspberry Pi 3:

Step 2:

Select how you want to install Windows 10 IoT Core. The most common option is to install it directly onto a blank microSD card:

Step 3:

Next, you’ll get to pick what version of Windows 10 IoT Core you want to use.

Important Note: Normally, you’d choose the non-insider version. Only choose the Insider Preview if you have the Insider Preview UWP SDK installed in Visual Studio 2017. The SDK version has to match the OS version on the device in order to deploy to it.

Step 4:

You’re done! Now click the Next button to navigate to the next phase of the setup process, getting the tools.

Installing the tools

During this phase of Getting Started, you’ll go through four high level steps. At the last step, you’ll be running an application on your IoT device! Here are the steps:

  1. Get the Tools
  2. Set up your device
  3. Set up Visual Studio
  4. Write your first app

1 – Get the Tools

In this step, you’ll download and install an amazing tool, the Windows 10 IoT Core Dashboard (approx. 54MB). This tool has a lot of features that make using Windows 10 IoT Core much easier than it has ever been. Once you’ve installed it, find it in your Start Menu’s apps list or search for “IoT Dashboard.”

You should now see a Start page like the following:

2 – Set up your device

With the Dashboard running, you can now set up your device by clicking the “Set up a new device” button. This makes the installation process very easy: just a few selections and a click of a button.

Here’s a screenshot of the “Set up a new device” page. Take note of the version of Windows IoT Core you’re installing. The current version is 14393, otherwise known as the Anniversary Update.

Once this is done, you’re good to go! Just remove the microSD card from your PC and insert it into the Raspberry Pi and boot it up.

Note: The first boot-up will take longer than normal because the device is performing its initial configuration. Be patient and do not power down during this. If you have any trouble and it doesn’t boot, just repeat the setup to get a fresh start.

3 – Set up Visual Studio

Now let’s review which tools you have installed.

If you do not have Visual Studio 2017 installed

You can download and install Visual Studio 2017 Community edition, for free, from here. This is not an “express” version; the Community edition is a feature-rich version of Visual Studio with all the tools you need to develop UWP applications and much more.

When running the installer, make sure you select the Universal Windows Platform development workload to get the tools and SDK. Here’s what the installer looks like:

If you already have Visual Studio 2017 installed

If you already installed Visual Studio, let’s check whether you have the UWP tools installed. In Visual Studio, drop down the Help menu and select “About Visual Studio.” You’ll see a modal window pop up; inside the “Installed Products” area you can scroll down to check for “Visual Studio Tools for Universal Windows Apps”:

If you do not have them installed, you can use the standalone SDK installer to install them (see UWP SDK paragraph below) or rerun the Visual Studio 2017 installer and select the “Universal Windows Platform development” workload to install it.

Note: You can use Visual Studio 2015, just make sure you’re on Update 3 in addition to having the UWP SDK installed.

UWP SDK Version

Now that you have Visual Studio 2017 and the UWP tools installed, you’ll want to have the UWP SDK version that matches the Windows 10 IoT Core version you installed. As I mentioned earlier, the current version is 14393.

If you just installed Visual Studio, this is the SDK version you already have. However, if you do need the 14393 SDK, you can get the standalone installer from here (note: if you’ve taken the Windows IoT Core Insider Preview option, you can get the Insider Preview SDK from here).

TIP: Install the IoT Core Project templates

At this point, you can build and deploy to an IoT device simply because you have the UWP SDK installed. However, you can get a productivity boost by installing the IoT Core project templates. The templates contain project types such as Background Application, Console Application, and Arduino Wiring Application. Download and install the templates from here.

Write your first app

At this point, your device and your developer environment are set up. Now it’s time to start writing apps! You may be familiar with the “Hello World” app as the first application you build when trying a new language or platform. In the world of IoT, these are known as “Hello Blinky” apps.

There are several excellent Hello Blinky sample applications, both headed and headless, to help you get a jump-start (when we say headless we mean with no user interface; you can still have a display connected to the system if you wish). A minimal sketch of a headless version follows.
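This sketch is not one of the official samples; it assumes an LED wired to GPIO pin 5 of a Raspberry Pi 3 and a project created from the Background Application (IoT) template:

```csharp
// Minimal headless "Hello Blinky" sketch. Assumptions: LED on GPIO pin 5,
// Raspberry Pi 3, Background Application (IoT) project template.
using System;
using Windows.ApplicationModel.Background;
using Windows.Devices.Gpio;
using Windows.System.Threading;

namespace HelloBlinky
{
    public sealed class StartupTask : IBackgroundTask
    {
        private const int LedPinNumber = 5; // assumption: LED wired to GPIO 5
        private BackgroundTaskDeferral _deferral;
        private GpioPin _pin;
        private GpioPinValue _value = GpioPinValue.High;
        private ThreadPoolTimer _timer;

        public void Run(IBackgroundTaskInstance taskInstance)
        {
            // Keep the background task alive while the timer runs.
            _deferral = taskInstance.GetDeferral();

            // GetDefault() returns null on devices without a GPIO controller.
            var controller = GpioController.GetDefault();
            if (controller == null) return;

            _pin = controller.OpenPin(LedPinNumber);
            _pin.SetDriveMode(GpioPinDriveMode.Output);

            // Toggle the LED every 500 ms.
            _timer = ThreadPoolTimer.CreatePeriodicTimer(_ =>
            {
                _value = (_value == GpioPinValue.High)
                    ? GpioPinValue.Low : GpioPinValue.High;
                _pin.Write(_value);
            }, TimeSpan.FromMilliseconds(500));
        }
    }
}
```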

There are even more Microsoft authored samples located at our Windows IoT Dev Center.

You can also check out what the community has built on websites such as Hackster.io, where developers open source their Windows 10 IoT Core projects, build specs, and source code. There are hundreds of projects available.

There are unlimited possibilities with Windows 10 IoT Core, from home automation to industrial robotics or even environmental monitoring. Your app doesn’t have to be a complex system: you can build a UWP app for a smart mirror, turn on your hallway lights when motion is sensed, or use a light sensor to open your shades at dawn and close them at sunset! We look forward to seeing what you build with Windows 10 IoT Core; send us a tweet @WindowsDev and share your creations with us!


Apply Now for Microsoft’s Go Mobile Tech Workshops


This is your opportunity to bring Microsoft engineering experts to discuss app development and architecture best practices with your team 1:1 

We’re excited to announce that the Microsoft engineering team is offering a limited number of technical sessions to help your team build better mobile apps faster. The Go Mobile Tech Workshops are 3-hour sessions dedicated to your team, covering everything from your technology stack and architecture to the latest in Visual Studio 2017, Azure, and DevOps best practices.

Sign-up now

What’s in it for you

  • Dedicated time with Microsoft technical experts to help you analyze your technology stack and application development practices.
  • Common patterns, architectures, and best practices for mobile to help you go faster and avoid common pitfalls.
  • Q&A with the engineering team to address your technical questions.

Don’t miss out—apply for a workshop today!

Joseph Hill, Principal Director PM, Mobile Development Tools

Prior to joining Microsoft, Joseph was VP of Developer Relations and Co-Founder at Xamarin.  Joseph has been an active participant and contributor in the open source .NET developer community since 2003. In January 2008, Joseph began working with Miguel de Icaza as Product Manager for his Mono efforts, ultimately driving the product development and marketing efforts to launch Xamarin’s commercial products.

Online Analysis Services Course: Developing a Tabular Model


Check out the excellent new online course by Peter Myers and Chris Randall for Microsoft Learning Experiences (LeX). Learn how to develop tabular data models with SQL Server 2016 Analysis Services. The complete course is available on edX at no cost to audit, or you can highlight your new knowledge and skills with a Verified Certificate for a small charge. Enrollment is available at edX.

Learn about Azure Analysis Services at the Microsoft Data Insights Summit 2017


We’re excited to participate in the Microsoft Data Insights Summit June 12 – 13, 2017 in Seattle, WA. This two-day event is designed to help you identify deeper insights, make better sense of your data, and take action to transform your business.

This year’s Microsoft Data Insights Summit will be filled with strong technical content, vibrant speakers, and an engaged community of experts. The event offers deep dive sessions, hands-on learning, industry insights, and direct access to experts. Join us to expand your skills, connect directly with Microsoft product development teams, and learn how to get the most from the Microsoft BI stack.

The Analysis Services program-management team is excited to deliver the following sessions.

Super Charge Power BI with Azure Analysis Services

Monday, June 12. 11:10 am – 12:00 pm.

Join this session for a deep dive into how you can scale up a Power BI model by migrating it to Azure Analysis Services. This new service enables larger models and allows fine-grained control of refresh behavior. We will cover migration, using the gateway for on-premises data, and new connectivity with Power Query and the M engine for Power BI compatibility and reuse. Other topics will include creating reports that tell stories, distributing in SPO or PtW, collaborative conversations across teams, data story galleries, custom visuals, Sway, and more.

Creating Enterprise Grade BI Models with Azure Analysis Services

Tuesday, June 13. 11:40 am – 12:30 pm.

Microsoft Azure Analysis Services and SQL Server Analysis Services enable you to build comprehensive, enterprise-scale analytic solutions that deliver actionable insights through familiar data visualization tools such as Microsoft Power BI and Microsoft Excel. Analysis Services enables consistent data across reports and users of Power BI. The demos will cover new features such as improved Power BI Desktop feature integration, Power Query connectivity, and techniques for modeling and data loading that enable the best reporting experiences. Various modeling enhancements will be covered, such as Detail Rows, which lets users easily see transactional records, and improved support for ragged hierarchies.

Check out the sessions page for the complete list of sessions. Don’t miss out—register today!

Office Store brings you Power BI custom visuals


The Office Store is introducing Power BI custom visuals to download and use in Power BI service reports and Power BI Desktop. Users will be able to easily discover and quickly download BI visualizations that interact with data to find key insights and drive important business decisions. Power BI custom visuals provide compelling data visualizations created by members of the community and by Microsoft. They behave just like the native rich visualizations already included with Power BI but can also be filtered, highlighted, edited and shared.

Here are some examples:

  • Word Cloud—Visualize the text in your data in a beautiful way.

  • SandDance—See all your data as grains of sand with animated transitions between views to help you explore, understand and communicate insights in your data.
  • Correlation plot—An advanced analytics visual based on R script to highlight correlations in your data.

Check Power BI custom visuals out for yourself—get started today!


Retail Customer Churn Prediction: Free How-To Guide Now Available


This post is authored by Lixun Zhang, Data Scientist, Daisy Deng, Software Engineer, and Tao Wu, Principal Data Scientist Manager, at Microsoft.

Predicting customer churn rate is among the most sought-after machine learning and analytics applications for retail stores, and of high value to companies that are eager to take advantage of the ever-increasing amounts of customer data they are collecting. Retaining existing customers is estimated to be five times cheaper than attracting new ones, so businesses want to be proactive and predict who is likely to churn before it happens. Businesses also wish to identify the factors that are related to high churn rates, which in turn helps them apply resources towards acquiring the right type of customers in the first place.

Microsoft has been active in the domain of churn prediction, having published several resources to help businesses understand the data science process behind customer churn prediction.

We are now pleased to announce the Retail Customer Churn Prediction Solution How-to Guide, available in Cortana Intelligence Gallery and a GitHub repository.


What’s the Guide About?

The Guide includes a Solution Overview for Business Audiences and a Technical Deployment Guide that provides the steps needed to implement an end-to-end solution to predict customer churn rates, including data ingestion, data storage, data movement, machine learning / advanced analytics, model operationalization, model retraining, and visualization.

The specific business case in the Guide is about predicting churn rate such that the question “What is the probability that a customer will churn soon?” can be answered.

We say a customer has churned when that customer spent no money at the store in the last 21 days. This definition can be customized along two dimensions: the number of days from today and the amount of money spent. For example, some businesses might define a churned customer as someone who has made less than $10 in purchases over the last 30 days. The problem is formulated as a two-class classification problem, and a machine learning algorithm is used to create the predictive model, which learns from simulated data based on the Tafeng dataset, available in the resource folder of this GitHub repository. The data includes transaction-level information such as user-id, item-id, quantity, and value, as well as user-level information such as age and region.
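To make the definition concrete, here is an illustrative C# sketch of the labeling rule; the Transaction type and its field names are hypothetical stand-ins, not the schema used in the guide:

```csharp
// Illustrative churn-labeling sketch. A customer is labeled churned when
// their spend in the window (21 days by default) is at or below the
// threshold ($0 by default). Types and names here are hypothetical.
using System;
using System.Collections.Generic;
using System.Linq;

public class Transaction
{
    public string UserId { get; set; }
    public DateTime Date { get; set; }
    public decimal Amount { get; set; }
}

public static class ChurnLabeler
{
    public static Dictionary<string, bool> LabelChurn(
        IEnumerable<Transaction> transactions,
        DateTime asOf,
        int windowDays = 21,        // "number of days from today"
        decimal threshold = 0m)     // "amount of money spent"
    {
        DateTime windowStart = asOf.AddDays(-windowDays);

        // Group each customer's transactions and sum the spend that falls
        // inside the churn window; at or below the threshold means churned.
        return transactions
            .GroupBy(t => t.UserId)
            .ToDictionary(
                g => g.Key,
                g => g.Where(t => t.Date >= windowStart && t.Date <= asOf)
                      .Sum(t => t.Amount) <= threshold);
    }
}
```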

Who will Benefit from the Guide?

The Guide was developed with three distinct audiences in mind: business decision makers, data scientists, and engineers.

The Solution Overview for Business Audiences helps you understand the business implications of customer churn, providing a high-level view of how churn rate analytics can be streamlined with Cortana Intelligence.

Data scientists and engineers will benefit from the Technical Deployment Guide, which provides detailed instructions on how to stitch together on-premises and Azure services. The Technical Deployment Guide includes an Azure ML experiment that provides a starting point for data scientists to develop churn prediction models. Interested data scientists can also learn to generate powerful visualizations using Power BI.

To get started and learn more, check out the Guide in the Cortana Intelligence Gallery.

How is the Guide Related to Other Resources?

Data scientists looking for guidance on building models for customer churn can visit the Retail Customer Churn Prediction Template, which covers the steps needed to implement a customer churn model, including feature engineering, label creation, training and evaluation.

To create an on-premises version of this solution using SQL Server R Services, take a look at the Customer Churn Prediction Template with SQL Server R Services, which walks you through that process.

Do put the guide to use in the real world, and share your feedback and thoughts with us, below.

Lixun, Daisy & Tao


SQL Server Replication enhancement in SQL Server 2016


Replication is a widely adopted SQL Server feature for copying and distributing data and database objects from one database to another and then synchronizing between databases to maintain consistency. In SQL Server 2016, we made several enhancements to replication so that it is easier to use and works better with other SQL Server features and the SQL platform.

Re-publisher in AlwaysOn Availability Groups

In SQL Server 2016, a transactional replication re-publisher is supported in an Always On availability group configuration.

See “Replication, Change Tracking, Change Data Capture, and Always On Availability Groups (SQL Server)” for more details.

Replication to Azure SQL Database

You can now configure transactional replication to replicate data to Azure SQL Database from on-premises SQL Server or SQL Server running in Azure VMs. Azure SQL Database can be configured as a push subscriber. See “Replication to SQL Database” for more details.

Replication to memory-optimized table subscribers

In SQL Server 2016, tables acting as snapshot and transactional replication subscribers can be configured as memory-optimized tables. Peer-to-peer transactional replication and other replication types are not supported yet. See “Replication to Memory-Optimized Table Subscribers” for more details.

Support DROP TABLE DDL

Starting with SQL Server 2016 SP1, a table included as an article in a transactional replication publication can be dropped from the database and the publication. See this document for more details.

We will continue to improve the functionality and usability of SQL Server replication in future releases. Please subscribe to the SQL Server Database Engine Blog to get the latest information.

Error Adding Windows Server 2016 and Windows 10 2016 LTSB Product Keys to VAMT 3.1


My name is Shannon Gowen and I am a Serviceability Support Escalation Engineer for Windows Client and Server Beta. This is a short blog describing an issue that you might see utilizing the Volume Activation Management Tool (VAMT) with Windows Server 2016 and Windows 10 2016 LTSB product keys and how to resolve it.

This issue was previously reported for VAMT 3.1 v1507 under KB Article 3094354, listed below under References.

Upon attempting to add a Windows Server 2016 product key (for this example, the three Windows Server 2016 Client Setup keys are used) to VAMT (installed from the Windows ADK for Windows 10, version 1511), you receive the following pop-up:

Figure 1 – Failed attempt to add Windows Server 2016 Client Setup keys to VAMT 3.1 (1511)

If you attempt to do the same with Windows 10 product keys (for this example, the Windows 10 Professional, Enterprise, and Education Client Setup keys are used), the addition is successful.

Figure 2 – Successful attempt to add Windows 10 Client Setup keys to VAMT 3.1 (1511)

If you attempt to do the same with Windows 10 LTSB product keys (for this example, the Windows 10 2015 and 2016 LTSB Client Setup keys are used), the addition of the Windows 10 2015 LTSB Client Setup key succeeds, while the Windows 10 2016 LTSB Client Setup key fails.

Figure 3 – Failed attempt to add the Windows 10 2016 LTSB Client Setup key to VAMT 3.1 (1511)

Figure 4 – Successful addition of the Windows 10 2015 LTSB Client Setup key to VAMT 3.1 (1511)

This occurs because the Windows ADK for Windows 10 v1511 was released prior to the release of Windows Server 2016 and Windows 10 2016 LTSB. The code is not aware of newer releases. This is why VAMT 3.1 v1511 reports that the product key configuration files could not be found and why the Windows Server 2016 and Windows 10 2016 LTSB product keys cannot be added successfully.

To resolve this, you can either update to VAMT 3.1 v1607 by installing the Windows ADK for Windows 10, version 1607, or update the product key configuration files in VAMT 3.1 v1511.

Update to VAMT 3.1 v1607 (Highly recommended, supported method)

1. Uninstall the Windows ADK for Windows 10, version 1511.

a. Open Control Panel > Programs > Programs and Features.

b. Click on Windows Assessment and Deployment Kit – Windows 10 (version 10.1.10586.0).

c. Click on Uninstall.

Figure 5 – Uninstall Windows ADK from Control Panel

2. Once the uninstall is complete, download the Windows ADK for Windows 10, version 1607.

3. Run through the installation with at least the Volume Activation Management Tool selected.

Figure 6 – Installing VAMT 3.1 v1607 via Windows ADK

4. Open the Volume Activation Management Tool and attempt to add any of the previously failing keys. For this example, the Windows Server 2016 and Windows 10 2016 LTSB Client Setup (shown) and KMS host (not shown) keys are used.

Figure 7 – Successful addition of Product Keys

5. All product keys are now listed.

Figure 8 – List of Installed Product Keys

Update VAMT 3.1 v1511 Code

1. Download either July 2016 update rollup (KB 3172614 or KB 3172615, listed below under References). For this example, KB 3172614 is used, and its MSU is saved to C:\KB3172614.

2. In an elevated command prompt, navigate to the location of the saved MSU, and extract the contents of the update with the following command: expand <MSU file> -f:* <destination folder>. For this example, the following command is used:

expand Windows8.1-KB3172614-x64.msu -f:* C:\KB3172614

Figure 9 – Expansion of KB 3172614’s MSU

3. Next, extract the contents of the .cab file using the following command: expand <CAB file> -f:* <destination folder>. For this example, the following command is used:

expand Windows8.1-KB3172614-x64.cab -f:* C:\KB3172614\KB3172614CAB

Figure 10 – Command Line used to Expand KB 3172614’s CAB File

4. Copy all XrML Digital License files in the subfolders to the pkconfig directory for VAMT. The default location is C:\Program Files (x86)\Windows Kits\10\Assessment and Deployment Kit\VAMT3\pkconfig. This can quickly be accomplished by opening File Explorer, navigating to the expanded CAB location, and searching for “xrm-ms”. Copy all of the resulting files into the previously mentioned pkconfig directory.

Figure 11 – List of XrML Files

5. Close and reopen VAMT, then attempt to add the problem keys. The Windows Server 2016 and Windows 10 2016 LTSB Client Setup keys still cannot be added, but the Windows Server 2016 and Windows 10 2016 LTSB MAK and KMS host keys (CSVLKs) can be.

Figure 12 – Successful Addition of a KMS Host Key (CSVLK)

Figure 13 – List of Successfully Added Product Keys

References:

“Can’t add CSVLKs for Windows 10 activation to VAMT 3.1” – https://support.microsoft.com/en-in/help/3094354/can-t-add-csvlks-for-windows-10-activation-to-vamt-3.1

“Download the Windows ADK” – https://developer.microsoft.com/en-us/windows/hardware/windows-assessment-deployment-kit

Direct Download of Windows ADK for Windows 10, version 1607 – https://go.microsoft.com/fwlink/p/?LinkId=526740

“What’s new in ADK kits and Tools” – https://msdn.microsoft.com/windows/hardware/commercialize/what-s-new-in-kits-and-tools

Direct Download of Windows ADK for Windows 10, version 1511 – https://go.microsoft.com/fwlink/p/?LinkId=823089

“Appendix A: KMS Client Setup Keys” – http://technet.microsoft.com/en-us/library/jj612867.aspx

“July 2016 update rollup for Windows 8.1 and Windows Server 2012 R2” – https://support.microsoft.com/en-us/help/3172614/july-2016-update-rollup-for-windows-8.1-and-windows-server-2012-r2

“July 2016 update rollup for Windows Server 2012” – https://support.microsoft.com/en-us/help/3172615/july-2016-update-rollup-for-windows-server-2012

Building a KMS Host on Windows Server 2008 R2


Support Lifecycle: https://support.microsoft.com/en-us/lifecycle?p1=14134

This blog post is part of a series of posts detailing the build process and activation capabilities of a KMS host on a particular host operating system. The operating system dictates which KMS host key (CSVLK) can be installed on that particular host, and that CSVLK determines which KMS-capable clients can be activated. When implementing KMS activation in an environment, it is best to determine all of the potential volume license operating systems for your KMS clients and then pick the best key. To simplify this, it is recommended that the most current KMS CSVLK be used, ensuring that all KMS-capable operating systems released at that time can be activated.

Note: Desktop KMS CSVLKs can only be installed on hosts with desktop operating systems (that support that CSVLK) and Server KMS CSVLKs can only be installed on hosts with server operating systems (that support that CSVLK).

This blog post pertains to a KMS host with Windows Server 2008 R2 with SP1 as the operating system.

Windows Server 2008 R2 with SP1 can host the following server KMS CSVLKs:

  • Windows Server 2008 R2
  • Windows Server 2012
  • Windows Server 2012 R2
  • Windows Server 2012 R2 DataCtr/Std KMS for Windows 10

Note: Windows Server 2008 R2 cannot host a Windows Server 2016 KMS CSVLK. At the release of Windows Server 2016, Windows Server 2008 R2 was no longer in mainstream support. No hotfix will be released to enable a Windows Server 2008 R2 KMS host to activate Windows Server 2016 KMS clients.

The KMS CSVLKs can activate the following KMS clients:

KMS CSVLK: Windows Server 2008 R2 (Channels A, B, and C determine the specific editions activated.)
KMS clients activated: Windows Vista, Windows Server 2008, Windows 7, Windows Server 2008 R2
Hotfix required: None needed.

KMS CSVLK: Windows Server 2012
KMS clients activated: Windows Vista, Windows Server 2008, Windows 7, Windows Server 2008 R2, Windows 8, Windows Server 2012
Hotfix required: KB Article 2757817. As Windows Server 2008 R2 was released prior to Windows 8 and Windows Server 2012, it is not aware of them; this hotfix addresses that.

KMS CSVLK: Windows Server 2012 R2
KMS clients activated: Windows Vista, Windows Server 2008, Windows 7, Windows Server 2008 R2, Windows 8, Windows Server 2012, Windows 8.1, Windows Server 2012 R2
Hotfix required: KB Article 2885698. As Windows Server 2008 R2 was released prior to Windows 8.1 and Windows Server 2012 R2, it is not aware of them; this hotfix addresses that.

KMS CSVLK: Windows Server 2012 R2 DataCtr/Std KMS for Windows 10
KMS clients activated: Windows Vista, Windows Server 2008, Windows 7, Windows Server 2008 R2, Windows 8, Windows Server 2012, Windows 8.1, Windows Server 2012 R2, Windows 10
Hotfix required: KB Article 3079821. As Windows Server 2008 R2 was released prior to Windows 10, it is not aware of it; this hotfix addresses that.

KMS Host Build Steps:

  1. Install Windows Server 2008 R2 with SP1.
  2. Patch completely.
  3. If a firewall is used, verify that there is an exception for KMS.
  4. Obtain the desired CSVLK from the VLSC site.
  5. If the KMS CSVLK is newer than Windows Server 2008 R2, install the required hotfix as per the table above.
  6. Install the KMS CSVLK:
     a. Open an elevated command prompt and navigate to Windows\System32.
     b. Run cscript.exe slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX using your KMS CSVLK.
     c. Wait for the success message.
  7. Activate the KMS CSVLK:
     a. If the system has external internet connectivity:
        i. Open an elevated command prompt.
        ii. Run cscript.exe slmgr.vbs /ato.
        iii. Wait for the success message.
     b. If the system does not have external internet connectivity, phone activate with the UI:
        i. Open an elevated command prompt.
        ii. Run slui.exe 0x4 to open the Phone Activation wizard.
        iii. Follow the prompts to complete.
        Or phone activate via the command prompt:
        i. Open an elevated command prompt.
        ii. Run cscript.exe slmgr.vbs /dti to obtain the installation ID.
        iii. Call Microsoft Phone Activation using a phone number listed in %SystemRoot%\System32\SPPUI\Phone.inf.
        iv. Follow the prompts to obtain the confirmation ID.
        v. Run cscript.exe slmgr.vbs /atp <ConfirmationID without hyphens> to apply the confirmation ID.
        vi. Wait for a success message.
  8. Run cscript.exe slmgr.vbs /dlv and verify that the License Status indicates that the KMS host is licensed.


The Windows Server 2008 R2 KMS host is now ready to begin accepting KMS activation requests. The host needs to meet a minimum threshold of five unique KMS activation requests (from desktop and/or server systems, for a total of five) before server KMS client activations begin, and a minimum threshold of twenty-five unique KMS activation requests (from desktop and/or server systems, for a total of twenty-five, not in addition to the threshold count of five for server activation) before it activates both desktop and server KMS clients. Until the minimum threshold is met, KMS clients attempting to activate against this host will report the following error:

(Screenshot: activation error 0xC004F038 – the count reported by your Key Management Service is insufficient)

When the threshold is met, all KMS clients requesting activation (that are supported by the CSVLK installed) will begin to activate. Those KMS clients that previously erred out with 0xC004F038 will re-request activation (default interval is 120 minutes) and will be successfully activated without any user interaction. An activation request can be prompted on a KMS client immediately by running cscript.exe slmgr.vbs /ato in an elevated command prompt.

Scenario:

You want to build a KMS host on Windows Server 2008 R2, to activate Windows 7, Windows Server 2008 R2, and Windows Server 2012 R2. Here are the steps necessary to achieve your goal.

  1. Determine which KMS host key (CSVLK) is needed – The KMS host key (CSVLK) needed to activate Windows 7, Windows Server 2008 R2, and Windows Server 2012 R2 is the Windows Server 2012 R2 KMS CSVLK, as per this TechNet article, under the “Plan for Key Management Services activation” section.
  2. Obtain the CSVLK – Log onto your Volume License Service Center site and locate the Windows Server 2012 R2 KMS key listed. Note this for Step #5.
  3. Build a Windows Server 2008 R2 system from Volume License media and patch – Using volume license media, build a system or utilize a system that is already built. Completely patch the system using Windows Update or whatever solution you use for applying updates/hotfixes.
  4. Apply the required hotfix – Because Windows Server 2008 R2 was released before Windows Server 2012 R2, the system needs to become aware of the newer operating systems. Applying the hotfix from KB Article 2885698 will accomplish this and enable your Windows Server 2008 R2 KMS host to activate Windows Server 2012 R2 KMS clients (along with Windows 7, Windows Server 2008 R2, Windows 8, Windows Server 2012, and Windows 8.1 KMS clients).
  5. Install the CSVLK – Open an elevated command prompt. Install the CSVLK on the KMS host by running the following command: cscript.exe slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
  6. Activate the CSVLK – In the elevated command prompt, activate the CSVLK by running the following command: cscript.exe slmgr.vbs /ato
  7. Verify – In the elevated command prompt, display the licensing information by running the following command: cscript.exe slmgr.vbs /dlv
  8. Phone activate if necessary – If you have issues with online activation in Step #6, you can start phone activation by running slui.exe 0x4 and following the prompts to activate your system. Once complete, repeat the verification if necessary.

The KMS host is now ready to begin activating any Windows 7, Windows Server 2008 R2, Windows 8, Windows Server 2012, Windows 8.1, and Windows Server 2012 R2 KMS clients.


Announcing Azure Service Fabric 5.5 and SDK 2.5


Customers around the world are delivering their mission critical business applications as always-on, scalable, and distributed services built using Azure Service Fabric. Last week we rolled out Azure Service Fabric 5.5 to Azure clusters in 26 regions across the world. Today, we’re excited to announce the release of version 2.5 of the Azure Service Fabric SDK and the corresponding 5.5 release of the Azure Service Fabric runtime and standalone Windows Server installer.

If you're using Visual Studio 2017, the Service Fabric tools are built in, so you'll only need to install the Microsoft Azure Service Fabric SDK. If you're using Visual Studio 2015, install the Microsoft Azure Service Fabric SDK and Tools.*

Get the new standalone package for Windows Server.

This release has a number of great new features along with the usual bug fixes and optimizations. Here are a few highlights of this release, in no particular order:

Support for compressed application packages for faster image store upload

Previously, application packages were always a directory structure. While this format was simple to edit, it could occasionally result in quite large application packages, which can be problematic to copy and register in a Service Fabric cluster, especially over slower connections or across larger distances. In this release we have added support for compressing packages prior to uploading them to the cluster.

Improved upgrade behavior to catch additional errors during upgrade and improve deployment safety

In this release, we’ve increased the default health check duration between upgrade domains so that the automated upgrade rollback function has a chance to catch a wider range of errors. This makes upgrades slightly longer, but much safer by default.

We also improved the health evaluation of entities by checking that they have at least one report from their system authority component. This ensures that the health store view is consistent with the state of the system as viewed by the authority components, adding to even greater upgrade safety.

ASP.NET Core integration

Integration with ASP.NET Core is now fully supported in both stateless and stateful Reliable Services, available as add-on NuGet packages. These packages allow you to easily bootstrap an ASP.NET Core web application in a stateless or stateful service using either Kestrel or WebListener. The integration also features custom Service Fabric middleware designed to help handle service resolution when connecting to an ASP.NET Core Service Fabric service. Learn more about ASP.NET Core in Service Fabric.
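As a rough sketch of what this looks like in a stateless Reliable Service (the class names, the endpoint name, and the trivial Startup class below are placeholders; see the linked documentation for the full walkthrough):

```csharp
// Sketch of a stateless service hosting an ASP.NET Core app on Kestrel,
// assuming the Microsoft.ServiceFabric.AspNetCore.Kestrel add-on package.
// "ServiceEndpoint" must match an endpoint declared in ServiceManifest.xml.
using System.Collections.Generic;
using System.Fabric;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.ServiceFabric.Services.Communication.AspNetCore;
using Microsoft.ServiceFabric.Services.Communication.Runtime;
using Microsoft.ServiceFabric.Services.Runtime;

public class Startup
{
    // Minimal placeholder pipeline; a real service would use MVC, etc.
    public void Configure(IApplicationBuilder app) =>
        app.Run(ctx => ctx.Response.WriteAsync("Hello from Service Fabric"));
}

internal sealed class WebService : StatelessService
{
    public WebService(StatelessServiceContext context) : base(context) { }

    protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
    {
        yield return new ServiceInstanceListener(serviceContext =>
            new KestrelCommunicationListener(serviceContext, "ServiceEndpoint",
                (url, listener) =>
                    new WebHostBuilder()
                        .UseKestrel()
                        .UseStartup<Startup>()
                        .UseUrls(url)
                        .Build()));
    }
}
```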

Refresh application debug mode in Visual Studio 2015 (Preview this release)

In conjunction with ASP.NET Core integration support, we’ve added a new application debug mode to the Service Fabric Application project. Refresh Mode allows you to quickly iterate between writing code and debugging and supports edit and refresh for ASP.NET Core services, so you can now develop ASP.NET Core services in Service Fabric the same way you would outside of Service Fabric.
Note that Refresh Mode is a preview feature in this release. Refresh Mode will also be available in Visual Studio 2017 soon.

.NET Core support using csproj project system in Visual Studio 2017

Service Fabric services for .NET Core now support the new simplified .csproj project system in Visual Studio 2017. Migrating to csproj from existing xproj projects is also supported, but it is a one-way migration.

For more details on these features and others, along with bug fixes and known issues, please see the detailed release notes.

*Note that there is a known issue causing occasional failures when following WebPI links in Google Chrome. If you run into this, either try the link in another browser or launch the WebPI client directly and search for Service Fabric.

Azure SQL hybrid data movement


As cloud computing becomes more and more popular, many companies are choosing to deploy a hybrid environment that mixes on-premises data centers and the public cloud. This gives businesses greater flexibility and more data deployment options. For example, a company can host business-critical or sensitive data in on-premises data centers and deploy less-critical data or test and development environments in the public cloud. A hybrid cloud environment also helps large companies migrate on-premises data centers to the cloud in multiple stages without interfering with the business.

Moving data around efficiently in a hybrid cloud environment is critical and challenging. In this blog, we introduce options for different data movement scenarios built on top of on-premises SQL Server, Azure SQL VMs, and Azure SQL Database:

  • Migrate data from on-premises SQL Server to Azure
  • Replicate data for business continuity
  • Replicate data to scale out read-only workload
  • Replicate data to refresh dev-test environment
  • Distribute reference data or multi-master
  • Backup and restore data
  • Migrate cold data from on-premises SQL Server to Azure
  • Move data into data warehouse
  • Move data into big data platform
  • Move data from other data platforms

We are going to mention the following technologies and tools in this blog:

  • Export and import .bacpac files
  • bcp
  • Transactional replication, including peer-to-peer transaction replication
  • Merge replication
  • SQL Server backup and restore, including managed backup and file snapshot backup
  • Always On availability groups
  • Data Migration Assistant (DMA)
  • Azure SQL Data Sync
  • SQL Server Integration Services (SSIS)
  • Azure SQL Database copy
  • Azure Data Factory (ADF)
  • SQL Server Migration Assistant (SSMA)
  • Attunity CDC for SSIS
  • SQL Server Stretch Database

The goal of this blog is to help you choose the right technologies and tools to implement these scenarios. Implementation details and step-by-step instructions are not covered here; however, we will provide links to related resources.

Migrate data from on-premises SQL Server to Azure

When you are migrating existing data from on-premises SQL Server databases to Azure, there are a few key facts you should measure and consider:

  1. Azure SQL Database (PaaS) or Azure SQL VM (IaaS): which is the better option? This is out of scope for this blog. Please see “Choose a cloud SQL Server option: Azure SQL (PaaS) Database or SQL Server on Azure VMs (IaaS)” for more details.
  2. How many databases are you going to migrate? How large are they?
  3. How much downtime can your service or application afford without significant business impact?

Azure SQL Databases

If you can afford some downtime, or if you are performing a test migration, you can use a .bacpac file to migrate your databases to Azure SQL Database. See the blog post “Migrating from SQL Server to Azure SQL Database using Bacpac Files” for detailed instructions.

When you migrate databases, especially large databases, using a .bacpac file, plan for sufficient application downtime; depending on the database size, the downtime can be hours.

When you cannot afford to remove your databases from production during the migration, consider using transactional replication as the migration solution (with Azure SQL Database as a push subscriber). See “Migration from SQL Server to Azure SQL Database Using Transactional Replication” and “Replication to SQL Database” for details, including the limitations of transactional replication.

See “SQL Server database migration to SQL Database in the cloud” for more about migration to Azure SQL Databases.

Azure SQL VM

If you decide to migrate and host your data in an Azure SQL VM, you will have several more options, including creating an Always On replica, backup/restore, and more. See “Migrate a SQL Server database to SQL Server in an Azure VM” for more details.

You can also use DMA (Data Migration Assistant) to migrate on-premises SQL Server databases to an Azure SQL VM. DMA can migrate not only data, but also other server objects such as logins, users, and roles. DMA can also detect compatibility issues before the migration. See the “Data Migration Assistant (DMA)” documentation for more details.

Replicate data for business continuity

Disruptive events can happen on any data platform, causing data loss or making your databases and applications unavailable. The capability to recover quickly from data loss or database downtime is important for business continuity, especially for business-critical databases and applications.

Azure SQL Databases

Azure SQL Database automatically maintains more than one copy of each database to ensure high availability (99.99% availability SLA). To prevent or mitigate business disruption in the event of a data center outage, you can either configure Active Geo-Replication or restore the database from a geo-redundant backup.

Active Geo-Replication provides minimal downtime and data loss during a data center outage. It can also be used to scale out read-only workloads (discussed in the next scenario); however, the additional geo-replication replicas introduce extra cost. Consider this option for business-critical databases. See “Overview: SQL Database Active Geo-Replication” for more details.

Alternatively, you can recover your database from a geo-redundant backup only when necessary. This involves a longer recovery time and more data loss, but it provides business continuity during a data center outage at lower cost. See “Recover an Azure SQL database using automated database backups” for more details.

You can find more details about Azure SQL Database business continuity in “Overview of business continuity with Azure SQL Database”.

Azure SQL VM

For SQL VMs, you can set up an Always On availability group or a failover cluster instance to manage downtime during VM reboots or outages.

See “High availability and disaster recovery for SQL Server in Azure Virtual Machines” for more details.

SQL VM as DR solution for on-premises SQL Server

To create a DR site without building a data center in another region, you can extend on-premises Availability Groups to Azure by provisioning one or more Azure SQL VMs and then adding them as replicas to your on-premises Availability Group. See section “Hybrid IT: Disaster recovery solution” in “High availability and disaster recovery for SQL Server in Azure Virtual Machines” for more details.

Replicate data to scale out for read-only workload

In many systems, certain applications only need to read information from the databases. For example, in an information publishing service, only the publisher needs to update the data, and all subscribers only need to read it. To offload the primary database, you can replicate data and redirect read-only workloads to other replicas.

Azure SQL Databases

In addition to providing business continuity in the event of a disaster, Active Geo-Replication can also be used to offload read-only workloads, such as reporting jobs, to the secondary databases. If you only intend to use the secondary databases for load balancing, you can create them in the same region.

See “Overview: SQL Database Active Geo-Replication” for more details about Active Geo-replication.

SQL VMs and on-premises SQL Server

To scale out a SQL VM or on-premises SQL Server, you can build readable Always On replicas. Consider building the replica in the same region unless the read traffic will come from a different region.

Replicate data to refresh dev-test environment

Before a system upgrade or the deployment of a new system, you may want to test it using a copy of the production data. Building a separate dev-test environment helps you run the test without impacting your production environment.

Azure SQL Databases

To create a copy of the live production environment in Azure SQL Database for a dev-test environment, you can use database copy.
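For reference, here is a hedged C# sketch of kicking off a database copy with the documented CREATE DATABASE ... AS COPY OF statement; the server names, database names, and credentials are placeholders:

```csharp
// Hedged sketch: start an Azure SQL database copy from C#.
// Connect to the master database of the server that will host the copy.
using System.Data.SqlClient;

public static class DatabaseCopy
{
    public static void CopyForDevTest()
    {
        const string connectionString =
            "Server=tcp:targetserver.database.windows.net;Database=master;" +
            "User ID=youruser;Password=yourpassword;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = connection.CreateCommand())
        {
            connection.Open();

            // The copy runs asynchronously on the service side; you can watch
            // its progress in sys.dm_database_copies on the target server.
            command.CommandText =
                "CREATE DATABASE MyDb_DevTest AS COPY OF sourceserver.MyDb;";
            command.ExecuteNonQuery();
        }
    }
}
```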

If you want to replicate a snapshot of the production environment with older data within the retention period (35 days for Standard and Premium; 7 days for Basic), you can restore the database to the point in time you want.

Azure SQL VM and on-premises SQL Server

To replicate data from Azure SQL Database to on-premises or an Azure SQL VM, you can export the database into a .bacpac file and import it into SQL Server running in an Azure VM or on-premises. If you only want to replicate specific tables instead of the whole database, you can run SqlPackage in an Azure VM. See “Export an Azure SQL database or a SQL Server database to a BACPAC file” for more details.
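If you would rather drive the export from code than from the SqlPackage command line, the DacFx client library (the Microsoft.SqlServer.DacFx NuGet package) exposes the same operation; a minimal sketch with placeholder connection details:

```csharp
// Minimal DacFx export sketch: exports a database to a .bacpac file.
// The connection string, database name, and output path are placeholders.
using System;
using Microsoft.SqlServer.Dac;

public static class BacpacExport
{
    public static void Export()
    {
        var services = new DacServices(
            "Server=tcp:yourserver.database.windows.net;Database=master;" +
            "User ID=youruser;Password=yourpassword;Encrypt=True;");

        // Progress messages are optional but useful for long-running exports.
        services.Message += (s, e) => Console.WriteLine(e.Message);

        services.ExportBacpac(@"C:\backups\MyDatabase.bacpac", "MyDatabase");
    }
}
```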

Distributing reference data/Multi-master

International ISVs and corporations usually have clients or branches in different countries or regions. To reduce the performance impact of network latency, they want some business reference data, such as SKUs and user information, distributed to Azure SQL or SQL Server databases. In a typical scenario, a central database hosts all reference data and distributes it to the different clients or branches. The clients or branches can also update the reference data locally and push the changes back to the central database.

Azure SQL Data Sync can be used to implement data distribution between on-premises SQL Server, Azure SQL VMs, and Azure SQL databases, either one-way or bi-directionally. See “Getting Started with Azure SQL Data Sync (Preview)” to learn more about Azure SQL Data Sync.

Azure Data Sync is currently available only in the old Azure portal; it will be available in the new Azure portal soon. See the “Azure Data Sync Update” blog for more details.

On-premises or Azure SQL VMs to Azure SQL Databases

When you design and implement such a distributed multi-master system, especially across the internet, consider sharding the database and syncing data only when necessary to reduce latency.

If the central database is hosted on-premises or in an Azure SQL VM, you can use transactional replication to distribute the data. Azure SQL databases can be configured as push subscribers that replicate data from the publisher, the central database. Transactional replication replicates data in one direction only, as sketched below.
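As a hedged sketch of the subscription step (run in the publication database at the publisher; all names are placeholders, and the publication, articles, and distributor must already be configured):

# Add the Azure SQL database as a push subscriber of an existing publication.
Invoke-Sqlcmd -ServerInstance "OnPremPublisher" -Database "ReferenceDb" -Query @"
EXEC sp_addsubscription
    @publication = N'RefDataPub',
    @subscriber = N'centralserver.database.windows.net',
    @destination_db = N'RefDataReplica',
    @subscription_type = N'Push';
"@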

Between on-premises or Azure SQL VMs

If all your data is hosted in on-premises SQL Server or Azure SQL VMs and you need to sync data bidirectionally, then besides Azure SQL Data Sync you can set up either peer-to-peer replication or merge replication.

If you can design your application or service to ensure that certain rows are modified in only one node, peer-to-peer replication is recommended. If the application requires sophisticated conflict detection and resolution capabilities, use merge replication. Merge replication introduces a more complex topology and a higher maintenance cost compared to other sync and replication technologies, so only use it when the other technologies can’t solve your problem.

Neither peer-to-peer replication nor merge replication works with Azure SQL Database, either as publisher or as subscriber.

See “Peer to peer transactional replication” and “Merge replication” for more details.

Backup and restore data

Backing up your database is essential for protecting your data: it allows you to recover from accidental corruption or deletion. We always recommend storing backup files separately from the databases, and a hybrid environment lets you implement this easily.

Azure SQL Databases

Azure SQL Database automatically backs up your databases at no additional charge. You can restore a database to any point in time within the retention period (7 days for Basic and 35 days for Standard and Premium). All backup files are replicated to multiple copies, including to a different region. See “Learn about SQL Database backups” for more details.

If your business requires a longer backup retention, you can configure long-term backup retention to keep backups for up to 10 years. See “Storing Azure SQL Database Backups for up to 10 years” for more details.

If you want to restore the database to an on-premises SQL Server, or to store the backup in your own local file system or on another cloud platform, you can export the database into a BACPAC file. You can use Azure Automation to schedule the export periodically; a sample script is provided in “Export an Azure SQL database or a SQL Server database to a BACPAC file”. A sketch of the export call follows.
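The scheduled export itself can be as small as one cmdlet call; this is a sketch with placeholder storage values, of the kind an Azure Automation runbook might run:

# Export the database to a BACPAC file in blob storage.
New-AzureRmSqlDatabaseExport -ResourceGroupName "rg-prod" -ServerName "prodserver" `
    -DatabaseName "SalesDb" `
    -StorageKeyType "StorageAccessKey" -StorageKey "<storage account key>" `
    -StorageUri "https://mystorageaccount.blob.core.windows.net/backups/SalesDb.bacpac" `
    -AdministratorLogin "sqladmin" `
    -AdministratorLoginPassword (ConvertTo-SecureString "<password>" -AsPlainText -Force)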

Azure SQL VMs

If you are running SQL Server on an Azure VM, we recommend backing up your databases to Azure storage (Backup to URL). This feature is supported in SQL Server 2012 SP1 CU2 or later, as sketched below.
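A minimal Backup to URL sketch, run here through PowerShell; the storage account, key, and database names are placeholders:

# Create a credential for the storage account, then back up straight to blob storage.
Invoke-Sqlcmd -ServerInstance "." -Query @"
CREATE CREDENTIAL AzureBackupCred
    WITH IDENTITY = 'mystorageaccount',
    SECRET = '<storage account access key>';

BACKUP DATABASE [SalesDb]
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDb.bak'
WITH CREDENTIAL = 'AzureBackupCred', COMPRESSION, STATS = 5;
"@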

You can also configure Managed Backup to enable automated backup management. The feature is available in SQL Server 2014 or later; in SQL Server 2016, it also supports custom schedules. See “Backup and Restore for SQL Server in Azure Virtual Machines” for more details.

To ensure the availability of backup files in the event of a disaster or data center outage, we recommend using GRS or RA-GRS storage for the backup files; it automatically replicates them to a different region. See “Azure Storage replication” for more details.

If you are running SQL Server 2016 and host your data files in Azure storage, another option is file-snapshot backup, which provides near-instantaneous backups and rapid restores for database files stored in Azure storage. See “File-Snapshot Backups for Database Files in Azure” for more details.

On-premises SQL Server

We always recommend storing backup files in a different location from the database files. If you are running SQL Server 2014 or later on-premises, you can use the Backup to URL feature to back up your database or transaction log directly to Azure storage. Note that, due to network latency, backing up to and restoring from Azure storage may deliver lower throughput than the same operation against a local disk. See “SQL Server Backup and Restore with Windows Azure Blob Storage Service” for more details.

Migrate cold data from on-premises SQL Server to Azure

As database sizes grow rapidly, managing and storing historical or cold data efficiently becomes a big problem. You can use Stretch Database to migrate cold data from on-premises SQL Server to Azure while keeping the data online, as sketched below. See “Introduction to Stretch Database” for more details.
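A hedged sketch of enabling Stretch on SQL Server 2016; all object names are placeholders, and a database-scoped credential for the Azure server is assumed to exist:

# Enable the instance-level setting, then stretch a cold-data table to Azure.
Invoke-Sqlcmd -ServerInstance "." -Query @"
EXEC sp_configure 'remote data archive', 1;
RECONFIGURE;
"@
Invoke-Sqlcmd -ServerInstance "." -Database "SalesDb" -Query @"
ALTER DATABASE [SalesDb] SET REMOTE_DATA_ARCHIVE = ON
    (SERVER = N'stretchserver.database.windows.net', CREDENTIAL = [StretchCred]);
ALTER TABLE dbo.OrdersHistory
    SET (REMOTE_DATA_ARCHIVE = ON (MIGRATION_STATE = OUTBOUND));
"@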

Move data into data warehouse

Companies periodically move data from OLTP systems to OLAP systems/data warehouses for data analysis and reporting. The process usually consists of extracting data from the data source, transforming it, and loading it into the target data warehouse (ETL).

The data warehouse and OLAP system can be built on top of Azure SQL Databases, Azure SQL VMs, or Azure SQL Data Warehouse.

Most on-premises SQL Server customers use SSIS (SQL Server Integration Services) to load data into the data warehouse: it extracts data from the OLTP system, transforms it, and loads it into the data warehouse. SSIS can also be used in a hybrid environment, and for existing SSIS customers this reduces the cost of building a new ETL process. If you are loading data into SQL VMs or Azure SQL databases, we recommend running SSIS on an Azure VM. See the blog “Running SSIS on Azure VM (IaaS) – Do more with less money” for more information.

If you are moving data into Azure SQL Data Warehouse, you can also use ADF (Azure Data Factory) or bcp as the loading tool. See “SQL Data Warehouse Migrate Your Data” and “Use Azure Data Factory with SQL Data Warehouse” for details.

Move data into Azure Big Data Stores for Advanced Analytics

To build a big data advanced analytics solution in Azure, users need to load online transactional data and other reference data from various data sources into a modernized, multi-platform data warehouse. This usually means staging the data in Azure Blob storage or Azure Data Lake, performing transformation activities such as Hive/Pig/Spark jobs, and finally loading the “cooked” data into Azure SQL Data Warehouse for BI and reporting.

If you want to move your data into Azure and build an advanced analytics solution on top of it, you can use ADF (Azure Data Factory), a fully managed data integration service that orchestrates the movement and transformation of data. Please see “Introduction to Azure Data Factory Service, a data integration service in the cloud” for more details about ADF.

If you are an existing SSIS user, SSIS is another option you can use to load data into big data stores such as Azure Blob storage and Azure Data Lake Store. You will need the Azure Feature Pack to load data into Azure.

Move data from other data platforms

If you want to migrate databases from other DBMSs (Oracle, MySQL, and so on) to Azure SQL Database or an Azure SQL VM, you can use SSMA (SQL Server Migration Assistant). See “SQL Server Migration Assistant” for more details.

If you want to continuously move data from other DBMSs, or from other data formats such as flat files, to Azure SQL Database or an Azure SQL VM, or if you need to perform transformations before loading data into Azure, you can use SSIS. Attunity CDC for SSIS and SQL Server CDC for Oracle by Attunity provide an end-to-end operational data replication solution. For more information, see “Attunity CDC for SSIS” and “SQL Server 2012 CDC for Oracle – a Review of One Implementation”.

Summary

In this blog, we discussed how to choose the right technologies and tools for different hybrid data movement scenarios. This is just a starting point and general guidance for these use cases; you will still need to evaluate the candidate solutions against your business needs.

If you have any further questions, please post them in the MSDN forum.

If you have any feedback on Azure SQL Database or Azure SQL VMs, please submit it at https://feedback.azure.com.

Troubleshooting Office 365 ProPlus patching through System Center Configuration Manager


Hello, Dave Guenthner back again with a discussion about Office 365 ProPlus. In February of 2016, I wrote a blog entry about how enterprise customers can service and patch Office 365 ProPlus with System Center Configuration Manager (ConfigMgr). I’ve spoken to many customers about this feature, but only recently have customers upgraded from ConfigMgr 2007 and 2012 R2 to ConfigMgr Current Branch to use the feature in production. The intention of this blog is to talk about how to troubleshoot the feature, using a recent case as a reference. Basic documentation and requirements are posted here on TechNet. The reality is, each of us learns more when things don’t work than when they do. My case started with the following question…

“Hey Microsoft, we’re not seeing any updates for Office 365 ProPlus in Software Center?”

Checklist for SCCM:

1. Verify the Office 365 Client product is selected under Products within the Software Update Point Component Properties.


2. Synchronize updates and verify that Office 365 Client updates exist within the Software Updates node.


3. Create an Automatic Deployment Rule (ADR) to download and deploy updates to a collection of machines. In my case, the customer has clients on First Release for Deferred and on Deferred. Since updates for First Release channels can occur at any time, an ADR is perfect for automating the process of keeping updates current. (*Assumes you know how to create an ADR.) The search criteria (example below) constrain the download to the two channels appropriate for this customer. Run the ADR and verify that content has been downloaded and distributed to distribution points. Important to note: customers should have ~10% of clients on a validation channel such as First Release for Deferred and ~90% on Deferred. This way, the ADR automates the download of content while the SysAdmin controls the deployment schedule.


4. Verify the Software Update Group created by the ADR is deployed to the desired collection. In this example, it is deployed to All Desktop and Server Clients.


Checklist for Client:

1. Install an N-1 version of the Office 365 Client. By intentionally installing a one-month-old version, we can be confident that updates will be applicable. As of this writing, N-1 for the Deferred channel is 6965.2117 and the latest version is 7369.2118. Install the Office 365 Client using the Office Deployment Tool with a sample unattend file. (Assumes you know how to install the Office 365 Client.)


2. System Center Configuration Manager references CDNBaseUrl to determine which channel is in scope for the Office 365 Client. In this case, the GUID below maps to “Deferred Channel”; since our ADR included this channel, we’re in good shape. You can use the RIOScan tool located here to produce a simple summary report of what is installed and how it’s configured.

HKLM\SOFTWARE\Microsoft\Office\ClickToRun\Configuration

“CDNBaseUrl”=http://officecdn.microsoft.com/pr/7ffbc6bf-bc32-4f92-8982-f9dd17fd3114
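A quick PowerShell check of the configured channel (the GUID match below is the Deferred Channel value shown above):

# Read CDNBaseUrl and report whether the client is on the Deferred Channel.
$cfg = Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\Office\ClickToRun\Configuration"
if ($cfg.CDNBaseUrl -match "7ffbc6bf-bc32-4f92-8982-f9dd17fd3114") {
    "Deferred Channel"
} else {
    "Different channel - compare the CDNBaseUrl GUID against the published channel list"
}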

3. In its applicability check, System Center Configuration Manager requires the Office COM interface to be enabled to broker communication between Office and ConfigMgr. Typically, customers have already deployed some Office 365 Clients and want to turn this functionality on. The Software Updates client setting shown below is used to enable it:


Alternatively, this can be accomplished via domain policy using the Office 2016 ADMX file.


You can verify the GPO is working by validating the existence of the following registry value. When the Microsoft Office Click-to-Run service starts, it checks for this value and registers the class to enable the COM interface.

HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\office\16.0\common\officeupdate

“officemgmtcom”=dword:00000001
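The same value can be set and verified directly with PowerShell (run elevated); this is equivalent to the GPO setting above:

# Create the policy key if needed, set officemgmtcom, and read it back.
$key = "HKLM:\SOFTWARE\Policies\Microsoft\office\16.0\common\officeupdate"
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name "officemgmtcom" -Value 1 -Type DWord
(Get-ItemProperty -Path $key).officemgmtcom   # expect 1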

4. From Control Panel, launch the Configuration Manager applet and, on the Actions tab, select Software Updates Deployment Evaluation Cycle, then click Run Now.

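The same cycle can also be triggered from PowerShell. The schedule GUID below is the commonly documented ID for the Software Updates Deployment Evaluation Cycle; treat it as an assumption and verify it for your ConfigMgr version:

# Trigger the evaluation cycle via the ConfigMgr client WMI interface.
Invoke-WmiMethod -Namespace "root\ccm" -Class SMS_Client -Name TriggerSchedule `
    -ArgumentList "{00000000-0000-0000-0000-000000000108}"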

“Hey Microsoft, we checked and double-checked all steps and still nothing is happening?”

5. Verify the COM interface is registered. It should be, if you enabled it via one of the methods above. You can check by verifying the existence of the following registry key on the client.

[HKEY_CLASSES_ROOT\CLSID\{B7F1785F-D69B-46F1-92FC-D2DE9C994F13}\InProcServer32]

@=”C:\\Program Files\\Common Files\\Microsoft Shared\\ClickToRun\\OfficeC2RCom.dll”
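A quick existence check from PowerShell:

# Test for the COM registration and show the registered .dll path.
$clsid = "Registry::HKEY_CLASSES_ROOT\CLSID\{B7F1785F-D69B-46F1-92FC-D2DE9C994F13}\InProcServer32"
if (Test-Path $clsid) {
    (Get-ItemProperty -Path $clsid).'(default)'   # should point at OfficeC2RCom.dll
} else {
    "COM interface is not registered"
}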

“Hey Microsoft, this key doesn’t exist!?”

This is the break we were looking for.

We know that if OfficeMgmtCOM is present, the Microsoft Office Click-to-Run service will register the .dll and enable the COM interface. This was not happening in the customer environment. Why? Using Process Monitor from the Sysinternals site, we can capture a trace while restarting the Microsoft Office Click-to-Run service and stop the trace once the service has started. One of the features I like is called count occurrences. I immediately saw a small number of Access Denied entries relating to registration of the .dll in question. Double-clicking the occurrence count automatically filters the trace to that operation; in our case, showing all Access Denied results.


With this information, we could see the McAfee DLP Agent (fcag.exe) and, with this evidence, make the case to disable McAfee Access Protection and restart the Office Click-to-Run service. The COM interface key was created, and the Software Update Deployment Scan finally showed the update as applicable in Software Center. Security software and filter drivers are intrusive by design; in this case, the customer can follow up with the vendor and make the appropriate tuning change. By taking third-party security software out of the picture, we allowed the COM interface registration to succeed and Configuration Manager to deem the client in scope and advertise the update.


Summary

I hope this blog provides a helpful checklist to accelerate troubleshooting and potentially resolve native patching issues with SCCM and Office 365 ProPlus.

Dave Guenthner

 

 


Five reasons to run SQL Server 2016 on Windows Server 2016 — No. 1: Security


This is the first blog in a five-part series. Keep an eye out for upcoming posts, which will cover cutting costs and improving performance of storage, BI, and analytics; improving uptime and reliability; reaching data insights faster by running analytics at the point of creation; and maintaining a consistent data environment across on-premises, hybrid, and cloud environments.

Wall, ditch, moat, palisades, watch towers, guards, highly trained soldiers: Even 2,000 years ago, when the Romans built their defenses, they deployed multiple layers of protection to deter invaders and keep intruders out. Today, on the electronic front, IT environments demand no less than a strong, layered approach to ensuring that data assets are protected from attacks such as stolen administrator credentials, unauthorized access, and pass-the-hash exploits.

You can see how important security is by examining the cost of data breaches, which is growing rapidly and represents a significant risk to business, as Figure 1 illustrates. To address this, Microsoft’s $1 billion annual investment in security demonstrates the company’s longstanding and proven commitment to building security capabilities into both its applications and operating systems. This means you can take advantage of layered security and mitigate risk.

Figure 1: Growing cost of data breach [1]


Consider SQL Server 2016 and Windows Server 2016, for example: Security is built into both. In fact, the National Institute of Standards and Technology (NIST) has shown SQL Server to consistently be the least vulnerable database.[2] Underpinning the built-in security you get with SQL Server, Windows Server 2016 adds new OS-level security capabilities to existing security functionality. As a result, if you use both SQL Server 2016 and Windows Server 2016 together, you get enterprise-scale security that meets the strictest organizational and industry standards for your infrastructure and your data.

Figure 2: Independent findings show unparalleled security[3]

SQL Server 2016 security

When you modernize your data platform to SQL Server 2016, you get access to innovative advanced security features of the least vulnerable database.[4] Three key built-in features that keep unauthorized users from accessing SQL Server data are:

  • Always Encrypted enables encryption inside client applications without revealing encryption keys to SQL Server. It allows changes to encrypted data without the need to decrypt it first, as shown in Figure 3. The combination of Transparent Data Encryption and Always Encrypted ensures that data is encrypted both at rest and in motion. (To learn more, see “Always Encrypted in SQL Server & Azure SQL Database.”)

Figure 3: Always Encrypted protection

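For illustration, a hedged T-SQL sketch of an encrypted column, run here through PowerShell. It assumes a column master key and a column encryption key named CEK1 have already been provisioned, and all object names are hypothetical:

Invoke-Sqlcmd -ServerInstance "." -Database "SalesDb" -Query @"
CREATE TABLE dbo.Customers (
    CustomerId INT IDENTITY PRIMARY KEY,
    SSN CHAR(11) COLLATE Latin1_General_BIN2   -- deterministic encryption requires a BIN2 collation
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK1,
                        ENCRYPTION_TYPE = DETERMINISTIC,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256')
);
"@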

  • Row-Level Security (RLS), which Figure 4 illustrates, enables developers to centralize row-level access logic in the database and maintain a consistent data access policy to reduce the risk of accidental data leakage. (For details, see “Limiting access to data using Row-Level Security.”)

Figure 4: Row-Level Security

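As a hedged sketch with hypothetical object names, a filter predicate that lets each sales rep see only their own rows looks like this:

# Create the predicate function, then bind it to the table with a security policy.
Invoke-Sqlcmd -ServerInstance "." -Database "SalesDb" -Query @"
CREATE FUNCTION dbo.fn_SalesFilter(@SalesRep AS sysname)
RETURNS TABLE WITH SCHEMABINDING AS
RETURN SELECT 1 AS allowed WHERE @SalesRep = USER_NAME();
"@
Invoke-Sqlcmd -ServerInstance "." -Database "SalesDb" -Query @"
CREATE SECURITY POLICY dbo.SalesPolicy
    ADD FILTER PREDICATE dbo.fn_SalesFilter(SalesRep) ON dbo.Orders
    WITH (STATE = ON);
"@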

  • Dynamic Data Masking (DDM) lets you conceal sensitive data or personally identifiable information (PII) such as customer phone numbers, bank information, or Social Security numbers. DDM and RLS help developers build applications that require restricted direct access to certain data as a means of preventing users from seeing specific information, as Figure 5 illustrates. (For deeper information, see “Use Dynamic Data Masking to obfuscate your sensitive data.”)

Figure 5: Dynamic Data Masking

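A hedged sketch with hypothetical columns, masking a phone number and an email address:

Invoke-Sqlcmd -ServerInstance "." -Database "SalesDb" -Query @"
ALTER TABLE dbo.Customers
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
"@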

To learn more about SQL Server 2016 security, you can visit the SQL Server data security webpage and read the security white paper.

Windows Server 2016 security

Just as SQL Server 2016 provides advanced security features that are not available in other data platforms, Windows Server 2016 includes built-in breach-resistance mechanisms to establish strong security layers to help thwart attacks.

The Windows Server 2016 operating system is a strategic layer in your infrastructure and serves as the foundation for your SQL Server data security. To prevent data exposure, you need the most advanced protection you can get. By modernizing both your server platform and your data platform together, you can be assured you’re doing your best to protect your business. The security functionality in Windows Server 2016 includes the following:

  • Device Guard helps lock down what runs on the server so that you are better protected from unauthorized software running on the same server as your SQL Server application.
  • Credential Guard helps protect SQL Server admin credentials from being stolen by Pass-the-Hash and Pass-the-Ticket attacks. Using an entirely new isolated Local Security Authority (LSA) process, which is not accessible to the rest of the operating system, Credential Guard’s virtualization-based security isolates credential information to prevent interception of password hashes or Kerberos tickets.
  • Control Flow Guard and Windows Defender protect against known and unknown vulnerabilities that malware can otherwise exploit. Control Flow Guard tightly restricts what application code can be executed, especially indirect call instructions. Lightweight security checks identify the set of functions in the application that are valid targets for indirect calls. When an application runs, it verifies that these indirect call targets are valid. Windows Defender works hand-in-hand with Device Guard and Control Flow Guard to prevent malicious code of any kind from being installed on your servers.

To learn more about the advanced layers of OS security, visit the Windows Server security webpage and read the white paper.

Thanks for reading our first blog in the series. For more info, check out this summary of five reasons to run SQL Server 2016 with Windows Server 2016.

Ready to give it a try? Here are some options to get started:

Windows Server Virtual Labs

Windows Server 2016 Free Evaluation

SQL Server 2016 Free Evaluation

SQL Server Virtual Labs


[1] “Data Breach Costs Rising, Now $4 Million per Incident”

[2] National Institute of Standards and Technology Comprehensive Vulnerability Database, update 2016

[3] National Institute of Standards and Technology Comprehensive Vulnerability Database, update 2016

[4] National Institute of Standards and Technology Comprehensive Vulnerability Database, update 2016



How to find the SDN gateway local address for BGP peering in Windows Server 2016


A few days back, I wrote a blog post about some issues faced by Software Defined Networking (SDN) customers. That issue was specific to changing VPN bandwidth settings in Windows Server 2016. You can read more about the issue and its solution here.

Another area where we have seen customers struggle is finding the local SDN gateway server address. The local SDN gateway server address is required for the following reasons:

  1. When you configure the remote VPN endpoint (in your enterprise or your local datacenter), you need to provide the local SDN gateway server address as the destination IP. This is the IP address advertised by the gateway for external connectivity.
  2. If you are using BGP for learning dynamic routes over VPN, you will need the local SDN gateway server address to configure the BGP peering information. Note that this address will be different from the destination IP mentioned above, since it is the IP address of the internal interface of the VPN server.

Finding the external address of SDN gateway

This address will be used as the destination IP address when you configure the on-premises VPN server (or a GRE endpoint in the same datacenter). It may be different for different tenants, as the SDN gateway is a multi-tenant server.

This address is displayed in the SCVMM console (System Center Virtual Machine Manager) when you configure the connection.


Finding the BGP router IP address of the SDN gateway

BGP Router IP for tenant connections

If you are using BGP (Border Gateway Protocol) with your tenant IPsec, GRE, or L3 connections for dynamically learning remote routes, you will need to know the BGP router IP address so that you can configure it as the peer address on the remote router. When you configure VPN connections through SCVMM, it automatically assigns an IP address from the gateway routing subnet to the tenant compartment of the gateway VM and uses that address as the BGP router IP address. Since this router is per tenant, the router address will be different for each tenant.

First, execute the following PowerShell commands on a Network Controller machine or a machine that is configured as a Network Controller client:

$gateway = Get-NetworkControllerVirtualGateway -ConnectionUri <Network Controller REST URI>

$gateway.Properties.NetworkConnections.Properties.DestinationAddress

Note that there can be multiple virtual gateways, depending on how many tenants have configured gateway connections. Also, each virtual gateway can have multiple connections (IPsec, GRE, L3). Since you already know the destination address of the connection, you can identify the correct connection based on that address. Once you have the correct network connection, execute the following command (on the corresponding virtual gateway) to get the BGP router IP address of the virtual gateway:

$gateway.Properties.BgpRouters.Properties.RouterIp
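For example, a quick way to pick out the connection for a given remote endpoint might look like this (the IP address is a hypothetical placeholder):

# Locate the tenant connection whose destination matches your on-premises endpoint.
$conn = $gateway.Properties.NetworkConnections |
    Where-Object { $_.Properties.DestinationAddress -eq "131.107.0.10" }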

This IP address must be configured on the remote router as the peer IP address.

BGP router IP for GRE gateway

If you are using GRE connectivity in your SDN deployment, you must create a GRE VIP logical network and advertise the GRE VIPs from your SDN gateways to the physical network using internal BGP peering. You can get more details in the SDN planning document here.

You need to create a BGP peer on the Top of Rack (ToR) router used by your SDN infrastructure to receive routes for the GRE VIP logical network advertised by the SDN gateways. BGP peering only needs to occur one way (from SDN gateway to external BGP peer). To configure the BGP peer, you will need to provide the peer IP, that is, the BGP router IP of the SDN gateways.

To get the BGP router IP of the SDN gateway, execute the following PowerShell commands on a Network Controller machine or a machine that is configured as a Network Controller client:

$gateway = Get-NetworkControllerGateway -ConnectionUri <Network Controller REST URI>

$gateway.Properties.BgpConfig.RouterIp

This IP address must be configured on the remote router as the peer IP address.

 

If you want to set up SDN through SCVMM, there is detailed documentation on TechNet here. Before starting the deployment, please go through the SDN planning guidance here.

This Week on Windows: Star Wars, Windows Hello, Resident Evil 7 and more


We hope you enjoyed today’s episode of This Week on Windows! Head over here to read our Windows 10 Tip on how to set active hours so your PC doesn’t restart while you’re working, save on seasons 1-4 of The Americans, get Resident Evil 7 biohazard from the Windows Store – on sale until March 27 – or keep reading to catch up on this week’s news.

In case you missed it:

Here’s what’s new in the Windows Store:

Pre-order Rogue One: A Star Wars Story today and watch it this Friday, March 24


As the Empire continues to gain power, a group of rebel spies will risk everything to steal the plans to their enemy’s most terrifying new weapon: the Death Star. Buy the epic adventure Rogue One: A Star Wars Story tomorrow on Digital HD in the Movies & TV section of the Windows Store. Can’t get enough Star Wars? Buy the Digital Six Film Collection March 24 through April 10 and get a $5 gift card to spend on even more movies, games, apps, or music! For additional details, visit microsoft.com/movies-and-tv.

TurboTax Online Tax Return App – Free

TurboTax Self-Employed: Expense Finder™ automatically finds deductible and industry-specific business expenses for you.

With one month left in tax season, we’re excited to announce the launch of TurboTax for Windows 10 PCs and tablets in the Windows Store. The app is available to download for free starting today. The TurboTax app brings technological innovation to tax filing, so you can file your taxes with confidence, anytime, anywhere. Read more in our blog post!

New content in Halo Wars 2

A fierce and fiery new ally joins the fray in the UNSC’s battle against Atriox and The Banished. Lt. Colonel Morgan Kinsano, the first of many new add-on leaders, is available today on the Xbox Play Anywhere title Halo Wars 2 for both Windows 10 PC and Xbox One. Read more over at Halo Waypoint!

NCAA March Madness LIVE – Free


The NCAA Men’s College Basketball Tournament is underway, and the official app, NCAA March Madness LIVE (Free, TV subscription required for some games), is here – ready with every tip-off, layup, 3-pointer and slam dunk of the action. It’s the place to enjoy unlimited access to live-stream video of every tournament game, along with a live, interactive bracket from Bing to keep you up-to-date, too.

Get season 2 of Into the Badlands – Buy from $18.99


Get ready for all the post-apocalyptic action you can handle with Season 2 of Into the Badlands ($24.99 HD, $18.99 SD). Catch up on past episodes of the hit martial arts drama and then dive into the season premiere, available now in the Movies & TV section of the Windows Store.

Have a great weekend!


Episode 123 on the Excel Bot with Jakob Nielsen—Office 365 Developer Podcast


In episode 123 of the Office 365 Developer Podcast, Richard diZerega and Andrew Coates talk to Jakob Nielsen about the Excel Bot.

Download the podcast.

Weekly updates

Show notes

Got questions or comments about the show? Join the O365 Dev Podcast on the Office 365 Technical Network. The podcast is available on iTunes (search for “Office 365 Developer Podcast”), or add the RSS feed directly: feeds.feedburner.com/Office365DeveloperPodcast.

About Jakob Nielsen

Jakob Nielsen is a principal designer for the Microsoft Office team working on Excel and Office for professional developers and makers. In his 20+ years at Microsoft, he has worked with enterprise customers and partners in Microsoft Consulting Services and on the Dynamics and SharePoint products.

About the hosts

Richard diZerega is a software engineer in Microsoft’s Developer Experience (DX) group, where he helps developers and software vendors maximize their use of Microsoft cloud services in Office 365 and Azure. Richard has spent a good portion of the last decade architecting Office-centric solutions, many of which span Microsoft’s diverse technology portfolio. He is a passionate technology evangelist and a frequent speaker at worldwide conferences, trainings and events. Richard is highly active in the Office 365 community, a popular blogger at aka.ms/richdizz, and can be found on Twitter at @richdizz. Richard was born, raised and is based in Dallas, TX, but works on a worldwide team based in Redmond. Richard is an avid builder of things (BoT), musician and lightning-fast runner.

 

A Civil Engineer by training and a software developer by profession, Andrew Coates has been a Developer Evangelist at Microsoft since early 2004, teaching, learning and sharing coding techniques. During that time, he’s focused on .NET development on the desktop, in the cloud, on the web, on mobile devices and most recently for Office. Andrew has a number of apps in various stores and generally has far too much fun doing his job to honestly be able to call it work. Andrew lives in Sydney, Australia with his wife and two almost-grown-up children.

Useful links

StackOverflow

Yammer Office 365 Technical Network


On-Demand Webinar: 5 days to make IT Ops more efficient


When you are managing multiple resources across heterogeneous environments, you need to make operations work for you. A few of the most critical areas where you can improve efficiency are service management, application management, and security. Join us for this on-demand webinar to learn how to remediate infrastructure issues, understand application health, and establish preventative security in your organization.

Nick Burling and Melanie Maynes from the Enterprise Cloud Management team at Microsoft will give you an overview of Azure tools that can help you increase your efficiency in 5 simple ways:

  1. Use Desired State Configuration to automate processes and improve your service level quality (see the sketch after this list)
  2. Monitor the performance and behavior of your applications and workloads
  3. Keep track of service changes that occur in your environment
  4. Track security alerts, missing updates, and malicious activity
  5. Bring app and server analytics together for dependency mapping and diagnostics
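To make item 1 concrete, here is a minimal Desired State Configuration sketch; the configuration name, feature, and output path are hypothetical:

# Declare the desired state: IIS must be present on the node.
Configuration WebServerBaseline {
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Node "localhost" {
        WindowsFeature IIS {
            Ensure = "Present"   # DSC re-applies this state if the server drifts
            Name   = "Web-Server"
        }
    }
}
WebServerBaseline -OutputPath "C:\DscMof"                 # compile the configuration to a MOF
Start-DscConfiguration -Path "C:\DscMof" -Wait -Verbose   # apply it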

See close-up screenshots from the webinar below.

  • Set Desired State Configuration
  • DSC nodes status
  • App Insights dashboard
  • App Insights failed requests
  • Automation dashboard
  • Configuration change tracking
  • Security and audit dashboard
  • Threat detection map
  • Service Map critical alert
  • Service Map security issue
