
New Office 365 capabilities help you proactively manage security and compliance risk


Missing a key security signal could mean not catching a breach, but the number of security signals is increasing exponentially, and manually prioritizing them is becoming impossible. That’s why Office 365 applies intelligence to help you proactively manage risk and ward off threats. Today, we’re pleased to introduce several new capabilities in Office 365 that help you manage risk and stay ahead of threats:

  • Office 365 Secure Score—A new security analytics tool that assigns a score to your current Office 365 security configuration.
  • Office 365 Threat Intelligence Private Preview—Service that leverages billions of data points from the Microsoft Intelligent Security Graph to provide actionable insights into the global threat landscape and help customers stay ahead of cyber threats. Office 365 Threat Intelligence is now in private preview, with general availability planned for later this quarter.
  • Office 365 Advanced Data Governance Preview—Applies machine learning to help customers find and retain the data most important to them while eliminating redundant, obsolete and trivial data that could cause risk if compromised. Office 365 Advanced Data Governance is now in preview, with general availability planned for later this quarter.

Know your Office 365 Secure Score

Do you know how you’d be rated if someone were to evaluate your security configuration? To give you better visibility into your Office 365 security configuration and the security features available to you, we’re pleased to introduce Secure Score—a new security analytics tool. Secure Score helps you understand your current Office 365 security configuration and shows you how implementing additional controls can further enhance your security and reduce risk.*

Here’s how it works:

Secure Score Summary—Displays your Secure Score and provides access to view your Score Analyzer. Your Secure Score, the numerator, is the sum of the points associated with security configurations that you have partially or fully adopted. The total score, the denominator, is the sum of the points associated with all the security controls that are available to you through your Office 365 plan.
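To make the numerator/denominator arithmetic concrete, here is a minimal Python sketch with hypothetical control names and point values (the actual controls and their weights come from your Office 365 plan):

# Minimal sketch of the Secure Score arithmetic; the control names and
# point values below are hypothetical, not actual Office 365 data.
controls = {
    "Enable MFA for admin accounts": {"points": 50, "adopted": True},
    "Enable mailbox auditing": {"points": 10, "adopted": True},
    "Review sign-in reports weekly": {"points": 45, "adopted": False},
}
secure_score = sum(c["points"] for c in controls.values() if c["adopted"])
total_score = sum(c["points"] for c in controls.values())
print("Secure Score: {} out of {} points possible".format(secure_score, total_score))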

In this example, the Secure Score is 130 out of 273 points possible:


Score Summary window showing your Secure Score.

Score Analyzer—Allows you to track and report on your score over time. The graph shows your Secure Score on any date in the past, which specific actions you had completed and which actions were available to you. Your score results can also be exported to a CSV file for easy planning and communication with your organization.


Score Analyzer graph showing the Secure Score over time.

In addition to providing insight, Secure Score suggests possible actions you can take to improve your security position. These suggestions are prioritized by the effectiveness of the action and its level of impact on end users. Actions that are highly effective with a low level of user impact are placed at the top, followed by actions that are less effective or more impactful to users. You can also filter the list by criteria such as low end-user impact or applicability to user accounts.

Secure Score can play an important role in a holistic security strategy, which encompasses how an organization strengthens its risk controls, mitigates potential losses and offsets some of the risk. To help businesses strengthen their security position, property and casualty insurer The Hartford will consider a customer’s Office 365 Secure Score as a part of the cyber insurance underwriting process.

“We believe aligning the solutions between security and insurance can make a real difference. By encouraging the use of an innovative security analytics tool like Office 365 Secure Score and making it a part of the underwriting process, businesses have more information to make risk-based decisions around privacy and security, potentially reducing their exposure to loss.”
—Tom Kang, head of Cyber Insurance at The Hartford

This builds upon the endorsement of Office 365 made by insurance industry leader AIG last year.

Watch this Microsoft Mechanics video for an in-depth look at Secure Score:

To learn more about Secure Score, check your score and see recommendations for improving your security position in Office 365, go to securescore.office.com.

Office 365 Threat Intelligence—now in private preview

According to a recent Ponemon Institute study, the average cost of a data breach has risen to $4 million. These costs can include litigation, the effects of brand or reputation damage, potential lost sales, and in some cases, complete business closure. Organizations that are prepared for a breach by spending on appropriate staffing, security training and security products can ultimately reduce their long-term costs.

Office 365 Threat Intelligence uses the Microsoft Intelligent Security Graph to analyze billions of data points from global datacenters, Office clients, email, user authentications and other incidents that impact the Office 365 ecosystem, as well as signals from our Windows and Azure ecosystems—to provide actionable insights into global attack trends.

It provides information about malware families inside and outside your organization, including breach information with details such as how much Bitcoin attackers typically request in ransomware attacks. Office 365 Threat Intelligence also integrates seamlessly with other Office 365 security features like Exchange Online Protection and Advanced Threat Protection, so you’ll be able to see analysis, including the top targeted users, malware frequency and security recommendations, related to your business.

Office 365 Threat Intelligence provides this visibility, along with rich insights and recommendations on mitigating cyber threats, supporting a proactive defense posture and reducing long-term organizational costs.


The Office 365 Threat Intelligence Dashboard provides visibility into the global threat landscape.

To sign up for the private preview of Office 365 Threat Intelligence, please contact your Microsoft account representative.

Why data governance matters

Many organizations are exposing themselves to unnecessary risk because they don’t have a good grasp on all the data they have. Often, they retain data they no longer need, such as the personal information of former employees who have long since left the company. Should this personal data be compromised in a breach, the company could be liable for costly remediation, such as lifetime credit monitoring for these former employees.

Office 365 Advanced Data Governance helps you find and retain the data that is most important to you while eliminating redundant, obsolete and trivial data that could cause risk if compromised. Office 365 Advanced Data Governance applies machine learning to intelligently deliver proactive policy recommendations; classify data based on automatic analysis of factors like the type of data, its age and the users who have interacted with it; and take action, such as preservation or deletion.

We’re already receiving a great response from legal professionals who expect Office 365 Advanced Data Governance to enhance their data management practices.

“The machine learning designed into Microsoft Office 365 Advanced Data Governance’s suite has the potential to tame the ever-increasing growth and complexity of data types we deal with. Office 365 Advanced Data Governance can apply unified policies to data in place across all Office 365 applications, regardless of when the data was ingested, to intelligently retain high-value data while disposing of what isn’t needed or is obsolete. As organizations grasp its potential to reduce compliance and security risks, this will be a game changer in information governance.”
—Paul Meyer, eDiscovery and Data Management managing counsel at Willis Towers Watson

Watch this Microsoft Mechanics video for an in-depth look at Office 365 Advanced Data Governance:

Visit Office 365 Advanced Data Governance to register for the limited public preview.

Availability

Office 365 Secure Score is now generally available to organizations with an Office 365 commercial subscription in the multi-tenant and Office 365 U.S. Government Community clouds.

Office 365 Threat Intelligence and Advanced Data Governance are expected to be generally available by the end of March 2017, and will be included in the Office 365 Enterprise E5 plan, as well as in the Secure Productive Enterprise E5 offering.

—Alym Rayani, director for the Office Security and Compliance team

*The Secure Score is a numerical summary of your security posture within Office 365, based on system configurations, user behavior and other security-related measurements. It is not an absolute measurement of how likely your system or data is to be breached; rather, it represents the extent to which you have adopted security controls available in Office 365, which can help offset the risk of being breached. No online service is completely immune from security breaches, and the Secure Score should not be interpreted as a guarantee against security breach in any manner.



Azure AD and SailPoint: Advanced identity governance across your on-premises and cloud resources


Howdy folks,

Over the past year, we’ve had the privilege to work closely with our largest customers in highly regulated industries like healthcare, financial services and pharma, helping them to successfully deploy and use Azure AD Premium. Through this close partnering, we’ve learned that to meet their unique security and compliance requirements, they need some pretty advanced access governance controls across their on-premises and cloud resources, in addition to the industry leading identity management and security they get with Azure AD Premium.

Today, we’ve got good news for these customers.

I am thrilled to announce our technical collaboration with SailPoint, a proven leader in identity governance. SailPoint’s identity governance capabilities, combined with Azure AD’s secure access and risk-based identity protection, will help cover the most demanding security and compliance needs of our joint customers. The SailPoint integration extends Azure Active Directory Premium to provide full, fine-grained provisioning and lifecycle governance across enterprise systems on-premises and in the cloud.

Let’s take a look at how the integration works through the lens of a few specific scenarios.

Identity and context synchronization

The first step in enabling advanced access governance is to synchronize the Azure AD view of users and their access to applications with SailPoint. This is performed using a direct connector that automatically aggregates user accounts, group permissions, and Microsoft Access Panel tiles and maps each of these to the SailPoint Identity Cube. It also provides the basis for SailPoint to send change events back to Azure AD when access is modified during a governance mitigation process.
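The connector itself is part of the SailPoint product, but as a rough illustration of the kind of aggregation involved, this Python sketch pages through the users of an Azure AD tenant via the Microsoft Graph API; sync_to_identity_cube is a hypothetical stand-in for the mapping into the Identity Cube:

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "..."  # OAuth 2.0 access token for Microsoft Graph; acquisition elided

def sync_to_identity_cube(user):
    # Hypothetical stand-in for the connector's mapping step
    print("syncing", user.get("userPrincipalName"))

def fetch_all_users():
    """Page through every user in the tenant, following @odata.nextLink."""
    users, url = [], GRAPH + "/users"
    while url:
        page = requests.get(url, headers={"Authorization": "Bearer " + TOKEN}).json()
        users.extend(page["value"])
        url = page.get("@odata.nextLink")  # absent on the last page
    return users

for user in fetch_all_users():
    sync_to_identity_cube(user)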

In addition to this, SailPoint will connect to applications managed outside of Azure AD, including on-premises applications like EPIC, which is widely used in healthcare. This creates a 360-degree view of all access in the organization and creates a strong foundation for comprehensive control.

Access request and lifecycle events

User access request and approval is at the core of any identity management and governance solution. The integration of SailPoint with Azure AD adds support for self-service access requests and approvals. Additionally, the integration propagates access changes based on employee lifecycle events like join, move or leave across all applications (cloud or on-premises) to ensure that access is granted according to business policy.

In both cases, the SailPoint-Microsoft combination enables end-to-end coverage of all provisioning events with full synchronization of access changes to the Microsoft Access Panel.

Identity governance certification, segregation of duty policies, and more

A key component of strong identity governance is the ability to review access on a regular basis. The integration provides a simple and effective way to automate the entire access certification process.

SailPoint’s access certifications combine data collected from the identity and context synchronization process described above with account and entitlement data from all application sources to create a single view of all access. After that, a fully automated access review process can be initiated for business and IT owners. Changes to access that result from the access review process are automatically propagated to the Azure AD Access Panel.

Another important governance control is the ability to enforce segregation of duty (SOD) policies throughout a user’s lifecycle within an organization. SOD policies can be defined and enforced by SailPoint during access reviews or access request processes to provide an additional level of policy control.

SailPoint also delivers audit and compliance reporting that demonstrates the effectiveness of the identity controls operating across the organization. This significantly reduces the burden on IT operations teams and improves visibility for the business.

Self-service password reset extension

In addition to the governance capabilities described above, the integration with SailPoint enables an important password management use case: the combined solution can automatically propagate an Azure AD password change to all connected systems in SailPoint that share a common password policy. This allows a user to change their password once in Azure AD and have it synchronized across a wide variety of on-premises and cloud-based systems.

We’re excited to bring this partnership to you and want to hear your feedback. Leave your comments below and reach out to us via Twitter! As always, we’re listening.

Best regards,

Alex Simons (Twitter: @Alex_A_Simons)

Director of Program Management

Microsoft Identity Division

Bing Predicts the GRAMMY Awards

This Sunday, February 12, music fans will huddle around their TVs and phones to see who will take home a coveted GRAMMY® Award. To see the full list of this year’s nominees, along with Bing’s predictions for who will win in each category, go to Bing and search for GRAMMYs.
 
If you’re a fan of Adele, you’ll be happy to see Bing is predicting she’ll edge out the competition to take home four GRAMMYs, including Record of the Year, Song of the Year, Pop Vocal of the Year, and Pop Solo Performance. Beyoncé, Blackstar and Work are also expected to do well, with Bing predicting each artist to take home multiple awards. And for those cheering on the new artist nominees, Bing predicts it’ll be a close race between Chance the Rapper and Maren Morris!


 
If you’re curious how this year’s list of nominees compares to last year’s winners, just select the “2016 Winners” link to see a carousel of last year’s winners.

 
We wish the best of luck to all nominees and hope you have fun cheering on your favorite artists. Regardless of who wins, it’s sure to be a great night for music!
 
-The Bing Team
 
GRAMMY® is a trademark of The Recording Academy.



 

How to give us feedback


We love hearing from you.  So what’s the best way to give us feedback?

The best way to report an issue or give a quick suggestion is the Feedback Hub on Windows 10 (Windows key + F to open it quickly). The Feedback Hub lets the product team see all of your feedback in one place and allows other users to upvote and provide further comments. It’s also tightly integrated with our bug tracking and engineering processes, so that we can keep an eye on what users are saying and use this data to help prioritize fixes and feature requests, and so that you can follow up and see what we’re doing about it.

In the latest build, we have reintroduced the Hyper-V feedback category.

After typing your feedback, selecting “Show category suggestions” should help you find the Hyper-V category under Apps and Games. It looks like a couple of people have already discovered the new category:

 

Hyper-V feedback

When you put your feedback in the Hyper-V category, we are also able to collect relevant event logs to help diagnose issues. To provide more information about a problem that you can reproduce, hit “begin monitoring”, reproduce the issue, and then “stop monitoring”. This allows us to collect relevant diagnostic information to help reproduce and fix the problem.

Begin monitoring

We also love to hear about any issues you are running into in our forums, which are a good place to get direct help from the product group as well as community members: Hyper-V Forums

That’s all for now. Looking forward to seeing your feedback!

Cheers,
Andy

Evolving the Visual Studio Test Platform – Part 4: Together, in the Open


[This is the 4th post in a 4-part series on evolving the Visual Studio Test Platform. You can read the earlier parts here:
Evolving the Visual Studio Test Platform – Part 3,
Evolving the Visual Studio Test Platform – Part 2,
Evolving the Visual Studio Test Platform – Part 1]

The Test Platform is where it is today thanks to its community – a community of adapter writers, test framework writers, extension writers, and application developers, working on platforms ranging from .NET to C++ to JavaScript. The Test Platform has grown to serve a diverse and complex range of lifecycle requirements and is now at a point where it is vital to enable this community to define and shape its evolution.

Open Sourcing of the Test Platform

Three weeks ago, we announced open sourcing the Test Platform. We have released the sources under the MIT open source license. We have published the public repositories on GitHub where the project is hosted:
https://github.com/Microsoft/vstest, and
https://github.com/Microsoft/vstest-docs.
The repos include the complete implementation of the test platform. These are fully open and ready to accept contributions. Over the next several weeks and months we will continue to transfer source and documentation into the repositories and likewise make it open for contributions.

What does this open sourcing mean?

This open source announcement means that the community now has a fully supported, fully open source, fully cross-platform Test Platform to power tests – including all elements of the lifecycle, from the runner that discovers and executes tests all the way to the extensibility model.

Our “Modernizing the Test Platform” journey

This is not just a one-off event. Rather, it should be seen in the larger context of our steps to modernize the test platform as a whole.

We started by taking a hard look at the MSTest test framework, and how to move that user base forward. Thus, with MSTest V2 we introduced support for ASP.NET Core and .NET Core, added several important and much asked-for features, published the bits to NuGet, updated the Unit Test project templates to reference MSTest V2 starting with VS “15” Preview 4, updated the Create Unit Test and Create IntelliTest wizards to support MSTest V2, and have since been updating the NuGet packages regularly – it’s just about 8 months since launch and we have updated the packages 8 times already.

We added Parallel Test Execution and several other features spanning the entire lifecycle. We added testing support for .NET Core. The “dotnet test” and the “dotnet vstest” commands are now powered by the test platform.

At one level, this means that you can carry forward the testing experience and assets that you are used to on the desktop (with vstest.console.exe). At another level, it means that the underlying testing infrastructure is now cross-platform: you can bring that familiar desktop testing experience to .NET Core, and not only on Windows but on Linux and Mac too.

Collaborative innovation, transparent development

This moment is not only about sharing the sources under an open license, either. We will engage in transparent development. Toward that, we will share, make visible, and collaborate on issues, implementation and our roadmap.

What we hope for in return is your continued support and participation in collaborative innovation.

Summary

We look at this step of open sourcing the test platform as a prerequisite to unlock collaborative innovation. It is only through our combined expertise – as a community – that the test platform can be taken to the next level of success and applicability.

Here’s looking forward to our collaboration. Come, let us grow the test platform, together, in the open.

Adding #AzureSearch to #DocumentDB collections with a click of a button



Our customers love how easy it is to use Azure Search and DocumentDB together to meet business goals. Tight integration through Indexers simplifies the task of indexing and searching in a variety of verticals from ecommerce to business applications. With the ability to load data with zero code, it’s even easier. We’re always looking for ways to boost developer productivity, so today we’re happy to announce the ability to add Search to a collection directly from DocumentDB with a click of a button.


Seamlessly select or create a Search service, and your DocumentDB configuration will be populated automatically. You’ll have all the search power you’ve come to expect. Schema inference provides an excellent starting point to easily add features like faceted navigation, intelligent language processing, and suggestions.

All of this is built on the tried and true indexer infrastructure, so expect a mature solution that’s in use by lots of customers and that will get the task done smoothly and reliably. Indexers and Search + DocumentDB enable more complex scenarios as well. DocumentDB is a globally distributed NoSQL database, which means you can create Azure Search service instances in as many regions as you want. Create an indexer in each Search service, all pointing at the same DocumentDB account, for a simple, rock-solid, low-latency, geo-distributed search application backend.
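Under the hood, the portal experience creates the same data source and indexer objects you could create yourself against the Azure Search REST API. The following Python sketch outlines that flow with placeholder service names and keys; the api-version and field values are assumptions for illustration:

import requests

SEARCH = "https://<your-service>.search.windows.net"
HEADERS = {"api-key": "<admin-key>", "Content-Type": "application/json"}
PARAMS = {"api-version": "2016-09-01"}  # assumed version for illustration

# 1. Register the DocumentDB collection as a data source
requests.post(SEARCH + "/datasources", headers=HEADERS, params=PARAMS, json={
    "name": "docdb-source",
    "type": "documentdb",
    "credentials": {"connectionString":
        "AccountEndpoint=https://<account>.documents.azure.com;"
        "AccountKey=<key>;Database=<database>"},
    "container": {"name": "<collection>"},
})

# 2. Create an indexer that feeds an existing index from that source
requests.post(SEARCH + "/indexers", headers=HEADERS, params=PARAMS, json={
    "name": "docdb-indexer",
    "dataSourceName": "docdb-source",
    "targetIndexName": "<index>",
    "schedule": {"interval": "PT5M"},  # re-run every five minutes
})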


We can’t wait to see what you build with DocumentDB and Azure Search! As always we’d love to hear from you on Twitter, User Voice, or the comments below. Happy coding!

Work smarter, not harder with Office 365—free live demos with Q&A


With our modern-day mobile workforce, more and more employees are working from home and on the go. We’ve become accustomed to hosting online meetings and collaborating in the cloud. To help expand the digital capabilities of your team, join us for our Office 365 live demo webinar series.

These free 30-minute live webinars show how you and your remote teams can work smarter and more efficiently using Office 365 apps.

Here are a few upcoming topics we will cover:

Work smarter with Microsoft Delve

Join this live demo on Wednesday, February 15 at 12 p.m. ET / 9 a.m. PT and:

  • Discover how the Office Graph is changing the ways we interact with Office 365.
  • Learn what Microsoft Delve is and how you can use it to discover content relevant to your work.
  • Find out how MyAnalytics is enhancing the Microsoft Delve experience.
  • See how Microsoft Delve is being integrated into the tools you already know and love.

Master chat-based collaboration tools in Microsoft Teams

Join this live demo Wednesday, March 15 at 12 p.m. ET / 9 a.m. PT and learn how to:

  • Use chat for today’s teams in a threaded, persistent way that keeps everyone engaged.
  • Participate in a hub that works seamlessly with Office 365 apps.
  • Customize options for each team with channels, connectors, tabs and bots.
  • Add your personality to your team with cool emojis, GIFs and stickers.

Work efficiently with SharePoint and OneDrive for Business

Join this live demo on Wednesday, April 5 at 12 p.m. ET / 9 a.m. PT and:

  • Find out when and why to use a particular tool for file sharing, storage and collaboration.
  • See how OneDrive for Business can help you get the most out of personal file storage.
  • Discover how to use some of the newest features to securely store, edit and share your documents.
  • Learn how to make professional team sites in minutes.

To register for one or all of the sessions, visit Office 365 live demo webinar series. Want to attend but can’t make a scheduled live session? We’ve got you covered. All webinars are free and available both live and on demand.

The post Work smarter, not harder with Office 365—free live demos with Q&A appeared first on Office Blogs.

TD banks on the security of the cloud with Office 365




Today’s post was written by Ron Markezich, corporate vice president for Microsoft.

When customers in highly regulated industries choose the Microsoft Cloud, it’s because they feel confident that our security capabilities will meet their needs. Microsoft has made great strides in securing approvals from global financial regulatory bodies to help them meet their compliance requirements. That’s one reason why TD Bank Group (TD)—a prominent North American financial institution with more than 90,000 employees worldwide—took a leadership role in the financial services industry with their move to Microsoft Office 365.

TD thoroughly tested the built-in security capabilities of Office 365. It also researched and confirmed that the Microsoft Cloud complies with Canadian financial regulatory requirements. And to maintain explicit control over access to their data during Microsoft service operations, they acquired Customer Lockbox for Office 365.

Jeff Henderson, executive vice-president and chief information officer at TD, can now speak to the advantages of moving to the cloud:

“As we build the bank of the future, we are providing the right tools and technology for our people, resulting in improved agility and security. We are moving forward, driving innovation with purpose, simplifying the way we work and significantly improving our colleagues’ experience. Our move to Office 365 is also helping us cut IT costs in half. We’re fully committed to the cloud as we add on all the Office 365 functionality, including the Enterprise Mobility Security Suite and Customer Lockbox.”

There is some very exciting thought leadership at TD these days, including global transformation initiatives like the recent “Work Space of the Future” project. TD will be using Office 365 to make the most of its vision to empower employees with mobility, attract new talent and keep up with customers’ expectations for digital services.

As TD recognizes the incremental benefits of moving to the cloud, their investment in Microsoft is going to pay dividends, especially now that they can realize the full capabilities of the entire Office 365 productivity platform. It’s great that Microsoft is a part of their vision—in the workplace and in the cloud.

—Ron Markezich



What’s new in Azure Active Directory B2C


Over the past few weeks, we have introduced new features in Azure AD B2C, a cloud identity service for app developers. Azure AD B2C handles all your app’s identity management needs, including sign-up, sign-in, profile management and password reset. In this post, you’ll read about these features:

  • Single-page app (SPA) support
  • Usage reporting APIs
  • Friction-free consumer sign-up

Single-page app (SPA) support

A single-page app (SPA) is a web app that loads a single HTML page and dynamically updates the page as the consumer interacts with the app. It is written primarily in JavaScript, typically using a framework like AngularJS or Ember.js. Gmail and Outlook are two popular consumer-facing SPAs.

Since JavaScript code runs in a consumer’s browser, a SPA has different requirements for securing the frontend and calls to backend web APIs, compared to a traditional web app. To support this scenario, Azure AD B2C added the OAuth 2.0 implicit grant flow. Read more about using the OAuth 2.0 implicit grant flow or try out our samples:

  • A SPA, implemented with an ASP.NET Web API backend
  • A SPA, implemented with a Node.js Web API backend

Both samples use an open-source JavaScript SDK (hello.js). Note that the OAuth 2.0 implicit grant flow support is still in preview.
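For context, the implicit grant boils down to the SPA redirecting the browser to the B2C authorize endpoint and receiving tokens directly back in the URL fragment. Here is a rough Python sketch of how such a request URL is assembled, with hypothetical tenant, app and policy values; the samples above are the authoritative reference:

from urllib.parse import urlencode

tenant = "contosob2c.onmicrosoft.com"  # hypothetical tenant
params = {
    "p": "b2c_1_sign_in",               # the B2C policy to run
    "client_id": "<application-id>",
    "response_type": "id_token token",  # implicit grant: tokens come back in the fragment
    "redirect_uri": "https://localhost/spa",
    "scope": "openid",
    "response_mode": "fragment",
    "nonce": "<random-value>",
}
authorize_url = ("https://login.microsoftonline.com/" + tenant +
                 "/oauth2/v2.0/authorize?" + urlencode(params))
print(authorize_url)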

Usage reporting APIs

A frequent ask from developers is access to rich consumer activity reports on their Azure AD B2C tenants. We’ve now made those available to you, programmatically, via REST-based Azure AD reporting APIs. You can easily pipe the data from these reports into business intelligence and analytics tools, such as Microsoft’s Power BI, for detailed analyses. With the current release, four activity reports are available:

  • tenantUserCount: Total number of consumers in your Azure AD B2C tenant (per day for the last 30 days). You can also get a breakdown by the number of local accounts (password-based accounts) and social accounts (Facebook, Google, etc.).
  • b2cAuthenticationCount: Total number of successful authentications (sign-up, sign-in, etc.) within a specified period.
  • b2cAuthenticationCountSummary: Daily count of successful authentications for the last 30 days.
  • b2cMfaRequestCountSummary: Daily count of multi-factor authentications for the last 30 days.

Get started using the steps outlined in this article.
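As a rough illustration of pulling one of these reports programmatically, here is a Python sketch; the endpoint shape and api-version are assumptions for illustration, so follow the article referenced above for the exact steps:

import requests

TENANT = "contosob2c.onmicrosoft.com"  # hypothetical tenant
TOKEN = "..."  # access token for the reporting API; acquisition elided

resp = requests.get(
    "https://graph.windows.net/" + TENANT + "/reports/b2cAuthenticationCountSummary",
    params={"api-version": "beta"},  # assumed version for illustration
    headers={"Authorization": "Bearer " + TOKEN},
)
for row in resp.json()["value"]:
    print(row)  # one record per day with the authentication count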

Friction-free consumer sign-up

By default, Azure AD B2C verifies email addresses provided by consumers during the sign-up process. This ensures that valid, and not fake, accounts are in use on your app. However, some developers prefer to skip the upfront email verification step and do it themselves later. This friction-free sign-up experience makes sense for certain app types. We’ve added a way for you to do this on your “Sign-up policies” or “Sign-up or sign-in policies.” Learn more about disabling email verification during consumer sign-up.

Feedback

Keep your great feedback coming on UserVoice or Twitter (@azuread, @swaroop_kmurthy). If you have questions, get help on Stack Overflow (use the ‘azure-active-directory’ tag).

Cloud-Scale Text Classification with Convolutional Neural Networks on Microsoft Azure


This post is by Miguel Fierro, Ilia Karmanov, Thomas Delteil, Andreas Argyriou, and Max Kaznady, all Data Scientists at Microsoft.

Natural Language Processing (NLP) is one of the fields in which deep learning has made significant progress. Specifically, the area of text classification, where the objective is to categorize documents, paragraphs or individual sentences into classes, has attracted the interest of both industry and academia. Examples include determining what topics are discussed in a document or assessing whether the sentiment conveyed in a text passage is positive, negative or neutral. This information can be used by companies to define marketing strategy, generate leads or improve customer service.

This is the fourth blog showcasing deep learning applications on Microsoft’s Data Science Virtual Machine (DSVM) with GPUs using the R API of the deep learning library MXNet. The DSVM is a custom virtual machine image from Microsoft that comes pre-installed with popular data science tools for modeling and development activities.

In our first post, we showed how to set up a deep learning environment in one of the new DSVMs with NVIDIA Tesla K80 GPUs, installing CUDA drivers, Microsoft R Server and MXNet. In the second post, we presented a pipeline for massively parallel scoring of 2.3 million images forming a collage of the Mona Lisa, using an HDInsight Apache Spark cluster. Finally, in the third post we illustrated how to train a network on multiple GPUs to classify objects among 1,000 classes, using the ImageNet dataset and the ResNet architecture.

In this installment of the deep learning series, we will demonstrate how to use Convolutional Neural Networks (CNNs) in a text classification problem. We will explain how to build an end-to-end pipeline, train a CNN for text classification and prepare the model for production so it can be queried by a user to classify sentences via a web service.

Deep Learning for Text Classification on Azure

The development of Recurrent Neural Networks (RNNs) has led to significant advances in deep learning for NLP. These networks, especially the subclass of Long Short-Term Memory (LSTM) networks, have achieved promising results in tasks related to temporal series, for instance, in speech recognition, text understanding and text classification, usually treating the text as groups of words.

The area of text classification has been developed mostly with machine learning models that use features at the word level. The use of word features such as bag of words, n-grams or word embeddings has been shown to be very successful. Some examples of text classification methods are bag of words with TFIDF, k-means on word2vec, CNNs with word embedding, LSTM or bag of n-grams with a linear classifier.

In parallel, there have been important advances in image recognition using different types of CNNs. The ResNet architecture, introduced by Microsoft Research, is an example of this and was the first to surpass human performance in image classification. The reason for this extraordinary success comes from the fact that CNNs learn hierarchical representations in increasing levels of abstraction. This means they don’t just classify features but also automatically generate them in the first place.

Motivated in part by the success of CNNs in image recognition problems, where the inputs to the network are the pixels in images, a group of researchers proposed using CNNs for text understanding, using the most atomic representation of a sentence: characters. Even though other researchers have used sub-word units as inputs to deep networks for information retrieval and anti-spam filtering, the idea of using CNNs for text classification at character level first appeared in 2015 with the Crepe model. The following year the technique was developed further in the VDCNN model and the char-CRNN model.

Text Classification with Convolutional Neural Networks at the Character Level

To achieve text classification with CNN at the character level, each sentence needs to be transformed into an image-like matrix, where each encoded character is equivalent to a pixel in the image. This process is explained in detail in Zhang et al., but here’s a quick summary.



Fig. 1: Scheme of character encoding. Each sentence is encoded as a 69×1014 matrix.

The encoding of each sentence is represented in Fig. 1. Each sentence is transformed into a matrix, where the rows correspond to a dictionary and the columns correspond to the characters in the sentence. The dictionary consists of the following characters:

abcdefghijklmnopqrstuvwxyz0123456789-,;.!?:’\”/\\|_@#$%^&*~`+ =<>()[]{}

For each character in the sentence, we compute a one-hot encoded vector, that is, for each column, we assign a 1 to the corresponding row. As an example, if we want to encode the sentence “a cat” we will create the encoding shown in Fig. 1 (left).

Generally, networks require an input that is fixed-sized (to correspond to the fixed-size weights and bias matrices). The size of the vocabulary is fixed at 69 and the length of the text is fixed at 1014 characters – longer sentences are trimmed down and shorter sentences are padded with spaces.
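As a minimal Python sketch of this encoding, using the dictionary above (here a short sentence simply leaves its trailing columns all-zero as padding; the published code is the authoritative implementation):

import numpy as np

VOCAB = "abcdefghijklmnopqrstuvwxyz0123456789-,;.!?:'\"/\\|_@#$%^&*~`+ =<>()[]{}"
CHAR_ROW = {c: i for i, c in enumerate(VOCAB)}  # 69 rows, one per character
LENGTH = 1014  # fixed number of columns (characters per sentence)

def encode(sentence):
    """One-hot encode a sentence as a 69 x 1014 matrix."""
    matrix = np.zeros((len(VOCAB), LENGTH), dtype=np.float32)
    for col, char in enumerate(sentence.lower()[:LENGTH]):  # trim long sentences
        row = CHAR_ROW.get(char)
        if row is not None:  # characters outside the dictionary stay all-zero
            matrix[row, col] = 1.0
    return matrix

print(encode("a cat").shape)  # (69, 1014)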

One of the main bottlenecks in a process that parses and transforms vast amounts of data is memory management. In most situations, the dataset will not fit in the memory of the DSVM and is thus processed in mini-batches: reading a chunk, processing it to create a group of encoded images and freeing the memory before starting again. The current version of MXNet for R provides access to a C++ iterator that allows data to be read in batches from a CSV file. We built upon this to create a custom iterator (explained in this tutorial) that not only reads the CSV file in batches but also processes (and expands) the data into the image matrix in batches.

Convolution with Characters

A convolution allows one to generate a hierarchical mapping from the inputs to the internal layers and to the outputs. Each layer sequentially extracts features from small windows in the input sequence and aggregates the information through an activation function. These windows, normally referred to as kernels, propagate the local relations of the data over the hidden layers.

As shown in Fig. 2 (left), a kernel applied to an image is usually symmetric (e.g. 5 by 5). A convolution of 3, 5 or 7 pixels can represent a small part of the image, like an edge or a shape.

In a model like Crepe or VDCNN, the first convolution has the size of 3xn or 7xn (see Fig. 2 right), where n is the size of the vocabulary.


Fig. 2: Comparison of the kernel size in an image with that in a matrix representation of a sentence.

In Fig. 3, we represent the Crepe model, composed of 9 layers (6 convolutional and 3 fully connected). We provide the code for the Crepe model in R and Python. We also provide the code for the VDCNN model in R and Python, but do not show its representation due to space constraints.


Fig. 3: Scheme of Crepe CNN. CAP stands for Convolution, Activation and Pooling. CA stands for Convolution and Activation.
FC stands for Fully Connected. This scheme assumes a batch size of 1.

As can be seen in Fig. 3, the input, which is the encoded sentence, is initially transformed with a convolution of 7 in the x-axis and 69 in the y-axis with 256 feature maps, followed by a max pooling of 3 in the x-axis, flattening the image and reducing its size in that axis. The number of feature maps is constant in the rest of the internal layers of the network. After the first convolutional layer, there is a second convolution and pooling. Following that, there are three convolution layers without pooling, and then a final convolution with pooling. Finally, there are 3 fully connected layers. All the technical details of the internal transformations of the network can be found in the paper or the code that we have provided.
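As a rough sketch of this topology in MXNet’s Python symbol API (layer sizes taken from the description above; the R and Python implementations linked earlier are the reference):

import mxnet as mx

def crepe_sketch(num_classes=7):
    """Rough sketch of the Crepe topology described above, not the reference code."""
    # Input: batch x 1 x 1014 x 69 (character positions x vocabulary)
    net = mx.sym.Variable("data")
    # Conv 1: 7 characters wide, spanning the whole 69-character dictionary
    net = mx.sym.Convolution(data=net, kernel=(7, 69), num_filter=256)
    net = mx.sym.Activation(data=net, act_type="relu")
    net = mx.sym.Pooling(data=net, pool_type="max", kernel=(3, 1), stride=(3, 1))
    # Conv 2 + pooling: kernels now see 256 feature maps, not raw characters
    net = mx.sym.Convolution(data=net, kernel=(7, 1), num_filter=256)
    net = mx.sym.Activation(data=net, act_type="relu")
    net = mx.sym.Pooling(data=net, pool_type="max", kernel=(3, 1), stride=(3, 1))
    # Convs 3-5: three convolution layers without pooling
    for _ in range(3):
        net = mx.sym.Convolution(data=net, kernel=(3, 1), num_filter=256)
        net = mx.sym.Activation(data=net, act_type="relu")
    # Conv 6 with a final pooling, then flatten for the fully connected layers
    net = mx.sym.Convolution(data=net, kernel=(3, 1), num_filter=256)
    net = mx.sym.Activation(data=net, act_type="relu")
    net = mx.sym.Pooling(data=net, pool_type="max", kernel=(3, 1), stride=(3, 1))
    net = mx.sym.Flatten(data=net)
    # Two hidden fully connected layers with dropout, then the softmax output
    for _ in range(2):
        net = mx.sym.FullyConnected(data=net, num_hidden=1024)
        net = mx.sym.Dropout(data=net, p=0.5)
    net = mx.sym.FullyConnected(data=net, num_hidden=num_classes)
    return mx.sym.SoftmaxOutput(data=net, name="softmax")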

As an interpretation of the Crepe architecture, we can speculate that, through the hidden layers, a hierarchical transformation is learned, where the first convolution of 7×69 could be loosely interpreted as a character n-gram approach, expressing something close to the average word whose length is 7 characters. The lower layers could represent groups of neighboring characters, with middle layers learning “n-grams” of these groups and final layers capturing semantic elements. However, more experiments would be needed in order to verify this hypothesis.

In Fig. 4 we show the results of training the Crepe model on the Amazon categories dataset, which can be downloaded using this script. This dataset consists of a training set of 2.38 million sentences and a test set of 420,000 sentences, divided into these 7 categories: “Books”, “Clothing, Shoes & Jewelry”, “Electronics”, “Health & Personal Care”, “Home & Kitchen”, “Movies & TV” and “Sports & Outdoors”. The model has been trained for 10 epochs on an Azure NC24 with 4 K80 Tesla GPUs. The training time was around 1 day.


Fig.4: Training and test accuracy using the Crepe model on the Amazon categories dataset.

Development of Cloud Infrastructure for Text Classification in Azure

Once we have the CNN trained model, we can use the Azure cloud infrastructure to operationalize the solution and provide text classification as a web service. The complete pipeline of creating a deep learning text classifier in Azure is detailed in Fig. 5.


Fig. 5: Workflow of the solution. In the training phase, we use a dataset to train the model parameters of the CNN in an
Azure GPU DSVM. These parameters are fed to the backend of the Azure Cloud Service for scoring. Through an API,
the front end is shown to the user, who can interact with the application. The front end can run on different clients.

As shown in Fig. 5, the first step is to process the dataset. Once the model is trained, we can host it on Azure Cloud Services and use it to classify sentences via a web service. We created a Python web service that exposes the model for scoring via a web API. There is also a simple web app, programmed in JavaScript and HTML, that consumes the API, and provides a flexible environment for experimenting with different trained models.

The front end is managed by an AngularJS application, which consumes the Python API hosted on Azure Web Apps, and visualizes the classification for the user. The Python API uses a popular framework called Flask to handle the requests and responses. The model is held in memory by the web service process for superior performance. The code and instructions are open sourced in this repo.
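The linked repo contains the actual service; as a minimal sketch of the pattern it follows (load the model once, hold it in memory, score per request), with a hypothetical helper standing in for the real model:

from flask import Flask, jsonify, request

app = Flask(__name__)

def load_trained_crepe():
    """Hypothetical helper: load the trained CNN once at startup."""
    class Model:
        def predict(self, sentence):
            return [1.0 / 7] * 7  # placeholder: uniform scores over the 7 categories
    return Model()

model = load_trained_crepe()  # held in memory for the life of the process

@app.route("/classify", methods=["POST"])
def classify():
    sentence = request.get_json()["text"]
    return jsonify({"scores": model.predict(sentence)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)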

On this web page, we show the complete system, which also contains the text analytics API from the Cortana Intelligence Suite.

Fig. 6 shows the result of the text classification API when we type the following review: “It was a breeze to configure and worked straight away.”


Fig. 6: Result of the sentence: “It was a breeze to configure and worked straight away”.

From the sentence, we can guess that the user is talking about the setup of some technological device. The system predicts that the most probable class is “Electronics”.

In Fig. 7 we input “It arrived as expected. No complaint.” The Crepe model shows a very positive sentiment. In a general scenario, something arriving as expected could be seen as neutral. However, in the context of an online retailer (such as Amazon), a review that a product arrived on time may be considered a positive experience, rather than neutral.


Fig. 7: Result of the sentence: “It arrived as expected. No complaint”.

The code for the end-to-end solution can be found in this repo, along with a more detailed explanation of the whole implementation.

Summary of the End-to-End Text Classification Solution

In this post, we showed how to create an end-to-end text classification system using deep learning and the Azure cloud infrastructure.

The CNNs we discussed in this post were trained at the character level. This means that we do not use words or groups of words as the input to a network. In contrast, we encode each sentence as a matrix, mapping each character of the sentence to a dictionary. The resulting matrix is fed into a convolutional network, which can learn increasingly abstract hierarchical representations of this input as features which can be used to determine the sentiment of the text.

We demonstrated how easy it is to train one of these networks on our high-performance GPU DSVMs by distributed computation with 4 GPUs.

Once the model is trained, one can very quickly create a web service in Azure Web Apps to host the prediction system.

All the code is open source and accessible in this repo.

Miguel, Ilia, Thomas, Andreas & Max

 

Acknowledgements

We would like to thank TJ Hazen from Microsoft for his assistance and feedback on this post.

Happy 15th Birthday .NET!



Today marks the 15th anniversary of .NET’s debut. On February 13th, 2002, the first version of .NET was released as part of Visual Studio .NET. It seems just like yesterday when Microsoft was building its “Next Generation Windows Services” and unleashed a new level of productivity with Visual Studio .NET.

Since the beginning, the .NET platform has allowed developers to quickly build and deploy robust applications, starting with Windows desktop and web server applications in 2002. You got an entire managed framework for building distributed Windows applications, ASP.NET was introduced as the next generation Active Server Pages for web-based development, and a new language, C# (pronounced “see sharp” :-)) came to be.

Over the years, .NET and its ecosystem have grown and expanded to meet the needs of all kinds of developers and platforms. As the technology landscape has changed, so has .NET. You can build anything with .NET, including cross-platform web apps, cloud services, mobile device apps, games and so much more. We have a vibrant open-source community where you can participate in the direction of .NET.

With the release of Visual Studio 2017 coming on March 7th, which is also Visual Studio’s 20th anniversary, the .NET Core tools reach 1.0. Tune in March 7th and watch the keynote and live Q&A panels. A few nights ago, we got together with the Microsoft Alumni Network and threw a big .NET birthday bash with former .NET team members and rock stars. We caught up with Anders Hejlsberg, father of the C# language, to share some stories and thoughts on .NET, open source, and the release:

Thank you to all the past and present .NET team members, rock stars and the community of passionate developers for building incredible software that has changed our industry. Share your story about .NET and Visual Studio with the hashtags #MyVSstory and #dotnet. Show your .NET pride with a t-shirt, sticker, or mug. Or bake a cake, eat it with your development team, and share it with us on Twitter @dotnet!

Here’s to another 15 years!

– The .NET Team

Azure Application Insights JavaScript SDK: reliability and performance improvements


Recently, we have improved the robustness of web page monitoring in Application Insights and introduced the ability to disable cookies. Transmission is now more reliable in the face of throttling and network issues, and when a page is about to unload.

With Azure Application Insights you can monitor the performance and usage of your apps. With a little snippet of JavaScript you can get timings of page loads and AJAX calls, counts and details of browser exceptions and AJAX failures, as well as user and session counts. To learn more about how to get started, visit our documentation.

New JavaScript SDK features

In addition to internal improvements we have fixed some bugs, cleaned up the SDK, and added a couple more features:

Add snippet.js to NPM package:

Snippet.js is for those who want to load Application Insights from a separate file, instead of inline JavaScript, using a Gulp pipeline. Contributed by RehanSaeed. Thanks, Muhammed Rehan!

Option to disable cookies

This was an ask from some of our users. Now you can disable cookies, but some features will be lost: without cookies, every page view will count as a new user and session. For more configuration options, please see the Application Insights SDK JavaScript API.

Enable dependency correlation headers

These are now turned off by default, but you can enable them manually. To correlate dependencies with server requests, set disableCorrelationHeaders to false in your config. If you opt in to this feature, you will be able to see the server requests that correlate with your client-side AJAX calls.

JavaScript SDK Improvements

The improvements below elevate our JavaScript SDK without any additional onboarding work.

Security

The JS SDK has fully switched to HTTPS. We now use HTTPS to send all telemetry and to download the library. Also, for all secure sites the SDK will create cookies with the secure flag; see the set-cookie documentation for more details.

Transmission

One of the biggest goals for Application Insights JavaScript SDK is to provide as much functionality as possible, while not degrading the instrumented page in any way. This includes performance and user experience. Because transmission reliability is a crucial part of the SDK, we’ve recently made a few improvements in this area.

Telemetry is now sent more reliably when:

1. The Application Insights portal is imposing throttling, or is temporarily unable to accept data.

Retry and error handling features help to improve transmission reliability. In the case of network issues, the SDK will retry sending the telemetry data with an exponential backoff (the sketch after this list illustrates the general technique).

2. The user navigates to a new page within the same site, shortly after your code sends a telemetry event. Previously, events not yet sent were lost when the page was unloaded. Events are now kept in a session buffer that is preserved across pages in the same site.
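As a generic illustration of the retry technique (not the SDK’s actual implementation), an exponential backoff with jitter looks something like this in Python:

import random
import time

def send_with_backoff(send, payload, max_retries=5):
    """Retry a flaky send with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        if send(payload):
            return True
        time.sleep(2 ** attempt + random.random())  # wait 1s, 2s, 4s, ... plus jitter
    return False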

Beacon API

The JavaScript SDK also provides experimental support for the Beacon API. The Beacon API is designed to transmit telemetry when the browser is not busy with time-critical operations. All data is delivered even if the user navigates away or closes a browser tab.

The Beacon API only allows queuing data for transmission; the actual transmission is handled by the browser outside of our control. Thus, the JavaScript SDK cannot receive a confirmation from the Application Insights backend that all telemetry was received and processed correctly.

If the default JavaScript SDK configuration doesn’t fully work for you, and you suspect that not all events are tracked or that transmission is impacting the performance of your page, you can try using the Beacon API. The feature is currently disabled by default, but you can enable it by setting isBeaconApiDisabled to false (see config). If you decide to send your data using the Beacon API, the SDK will automatically turn off the session storage buffer and retry features.

Performance.now in Session Class

After working with the Power BI and Edge teams, we have moved to using performance.now in session calls. This improves the performance of our library’s date/time handling.

Feedback

If you have any questions or are experiencing any problems with the JavaScript SDK, feel free to open an issue on GitHub.

Special Thanks

Special thanks to Kamil Szostak for help preparing this post.

Power BI, Flow, and PowerApps Webinars for February 23 - April 6

If you haven’t attended one of the free Business Application Platform webinars yet, the intent is to offer world-class best practices in a medium that’s easy to consume, giving you insight into your data and showing you how to automate workflows and save time. Check out these 11 upcoming free webinars on how to act on, automate, and gain insight from your data!

TFS 2017 Update 1 RC2


Today we released the final release candidate for TFS 2017.1.

This update contains a few new features and a lot of bug fixes.  To my knowledge, we have fixed all the bugs that were reported from RC1.  A small handful of bugs remain, and once those are fixed we will be ready to ship the final version of Update 1.  We recently announced that VS 2017 and TFS 2017.1 will be released on March 7th.

Most of the new features in RC2 are small but nice.  You can read the release notes for details (look for “New in RC2”).  Probably the most frequently requested addition is the breaking up of the Git repository administration permissions so that you can, for instance, give people the permission to create repos without giving them permission to administer everyone else’s.  Check out the release notes though, because there are improvements in pull requests, testing, release management and more.

This release candidate is a “go-live” release: it can be used in production environments and will be supported.  It will upgrade seamlessly to the final release.  It also has localized resources for all our supported languages (RC1 was English only).  Please try it out and, if you find any issues, let us know immediately.  We still have a little time left to fix any serious problems.

Thank you,

Brian

 

 

 

Top Ten Issues with Active Directory Trusts and Corporate Mergers


Hey Everyone. Randy, Premier Field Engineer, here to discuss some lessons learned from working with a recent merger between two corporations. I don’t have enough time or space to go into the details of this major endeavor, so I am going to talk about this experience with a “Top Ten Countdown” style BLOG POST. I’m sure there are many other headaches seen in the field and I would love to hear about them in the comments section. Maybe I can start to consolidate all this into a Wiki about Partnerships and Mergers between two dueling Active Directory environments. Let’s get started:

1. This one is the most important: never establish multiple trust paths! I have had the same conversation with countless engineers when doing phone support about setting up both a Forest Trust between the two forest roots and an External Trust between two child domains in each of the forests. This should never be done under any circumstances. The argument is often “But this is a shortcut trust.” A Shortcut Trust is between two different domains in the same forest. I have also seen arguments where certain applications (here is an example) that are performing logon routines are not able to query a forest, and therefore need a direct trust. There is likely a newer version of the application without this requirement. If there is not an update or competitive product without this requirement, then it is time to do some soul searching on what is more important. The crux of the issue is different technologies providing the trust path between the same domains, each having different characteristics and limitations. One workflow may use the enumeration of trusted domains and hit one of these limitations based on the technology invoked.

2. Beware of CNAME (alias) records in DNS: This is true regardless of traversing a trust, or in the local domain. The behavior of how SPNs are formatted in the client’s request has changed dramatically over various versions of Windows. This article talks about this behavior, although it is not that straightforward about why it is a problem. When accessing a resource using Kerberos authentication, the client has to construct a Service Principal Name based on the host name offering that service. Take a look at the example below:


Here we have a file server (FileServ1.adatum.com) hosting a share (\\SalesReports\Monthly). In DNS, we configure a CNAME record to direct any DNS query for SalesReports.adatum.com to the physical host FileServ1.adatum.com.

In Windows XP and SMBv1: We would ask the KDC/Domain Controller for CIFS/SalesReports, because that was the hostname supplied in the UNC Path.

In Vista and SMBv2: We would ask the KDC/Domain Controller for CIFS/FileSrv1.adatum.com, because that was the hostname resolved in the DNS CNAME query.

In Windows 8: We once again returned to the original behavior of asking for CIFS/SalesReports, and added “NETDOM COMPUTERNAME /ADD” to supply an alternate name that registers the SPN for you.

The recommendation from Microsoft is to avoid CNAME records if at all possible. This will spare you a variety of headaches, because you could otherwise see unexpected outcomes as you use other network transports like HTTP.
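For example, if an alias is unavoidable on Windows 8 or later, registering it as an alternate computer name instead of a bare CNAME record also registers the matching SPNs for you. A sketch based on the adatum.com example above (verify the exact syntax for your OS version):

C:\temp>netdom computername FileServ1 /add:SalesReports.adatum.com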

3. Use Fully Qualified Domain Names: When joining a domain, writing logon scripts, or configuring an application setting that requires a computer or domain name, use the FQDN; I have made this a habit ever since about 2003. There are plenty of ways that Windows can overcome flat names, but why not keep it simple wherever you can. By supplying the FQDN, we tell DNS exactly what we want without confusion. Here is a short list of problems you will avoid:

1. Same Host Names exist in multiple domains

2. Time delays having to parse through the domain suffix search order to look for a match

3. Kerberos KDC knowing to which realm to forward the ticket request

4. Disjoined Name Spaces or Unorthodox Naming Conventions

The next two items on the list are related to FQDN.

4. Kerberos Forest Search Order: In some situations, flat names may be used, which formats the SPN request made by the client as “HTTP/ServerName” (for example) without the domain suffix. The domain suffix is important because the user always goes to its local domain’s KDC, which uses the domain suffix to identify the Kerberos realm to which it should direct the user. There is a GPO setting that can be configured either for the client or the KDC which lists other realms where it can check for a matching SPN. A configuration that I found useful is to list the workstation forest in the KFSO GPO applied to the workstation. This was helpful in a situation where users had been migrated to the new AD forest, but their workstations had yet to be migrated. Let’s watch this play out in a cartoon entitled “The Adventures of Joey Adatum!”

[Cartoon: “The Adventures of Joey Adatum,” illustrating a cross-forest logon resolved by Kerberos Forest Search Order]

The beauty of this solution is that there is no performance impact (as described in the TechNet documentation), because the workstation only asks its local forest, and the setting only comes into play when the logged-on user is from a different forest. To confirm it is working, try the sketch below.
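
A minimal sketch, assuming the “Use forest search order” policy (Computer Configuration > Administrative Templates > System > Kerberos) has already been applied, for verifying that a flat-name SPN now resolves; the SPN is hypothetical:

# Clear cached tickets, then request one using the flat-name SPN:
klist purge
klist get HTTP/ServerName

# If KFSO is working, klist shows a ticket issued by the realm that
# actually holds the SPN.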

5. DFS Namespaces: To continue on the topic of domain suffixes, update DFSN to use DNS FQDNs. This ensures that any cross-forest use of DFS Namespaces resolves correctly. Another important aspect is the Special Names Table generated for the forest. The Special Names Table is a list, stored on each Windows client and obtained from a DC, of the domain names the DFS client needs to recognize when deciding whether a UNC path is a domain-based DFS namespace or a plain server name. A two-way forest trust populates this table for the partner forest without any problems. If you start playing with one-way or external trusts, you might not get the results you are looking for, especially when child domains are involved. You can see the Special Names Table by running “dfsutil /spcinfo” on a client that has the DFS RSAT tools installed. See the example below from a workstation in Widgets.Adatum.com:


One-Way Trust:

C:\temp>dfsutil /spcinfo

[*][WidgetDC1.widgets.adatum.com*widgets]

[*][widgets.adatum.com]

[-][adatum]

[-][adatum.com]

[-][widgets]

[-][widgets.adatum.com]

Two-Way Trust:

C:\temp>dfsutil /spcinfo

[*][WidgetDC1.widgets.adatum.com*widgets]

[*][widgets.adatum.com]

[-][adatum]

[-][adatum.com]

[-][widgets]

[-][widgets.adatum.com]

[-][Contoso]

[-][Contoso.com]

[-][Asia]

[-][Asia.Contoso.com]

The good news is that Windows 8 and 10 clients seem to behave; I only found this to be an issue on Windows 7 / 2008 systems. It can become a major problem with the new SMB hardening recommendations for \\*\Sysvol and \\*\Netlogon: if you have a one-way trust and Windows 7 clients with SMB hardening, they won’t get group policy across the forest because Kerberos will fail. A sketch of switching a namespace server to DNS FQDNs follows.
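
A hedged sketch of enabling DNS FQDN behavior on a DFS namespace server; the server name is the article’s example, and you should verify this dfsutil syntax against your version before running it:

# Flip the namespace server to DNS FQDN mode:
dfsutil server registry DfsDnsConfig set FileServ1.adatum.com

# Restart the DFS Namespace service so the change takes effect:
Restart-Service Dfs

# Then re-check the client's view of the Special Names Table:
dfsutil /spcinfo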

6. Will users log on interactively or via RDP to systems across the forest trust? I am including this as a separate line item because it will likely be the biggest headache you encounter. The two items above are examples of the problems you might face. It is worth thinking these situations through carefully and seeing where network access alone would be sufficient.

7. Don’t forget about UDP port 389: Firewall configuration is always important when troubleshooting cross-forest failures. Make sure all the required ports are open for Active Directory; UDP 389 is often forgotten but is very important for DC discovery (LDAP ping) operations. A quick test follows.
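
A minimal sketch, using the example partner DC name, for testing UDP 389 and DC discovery across the trust (PortQry is a separate Microsoft download):

# LDAP ping over UDP 389 against a partner-forest DC:
portqry -n WidgetDC1.widgets.adatum.com -p udp -e 389

# Force DC discovery for the partner domain and confirm it succeeds:
nltest /dsgetdc:widgets.adatum.com /force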

8. Site Affiliation: For optimal performance, we want to choose resources located near our physical location. I won’t go into detail, because a great BLOG POST written eight years ago still rings true today.

9. Make security policies consistent: Why are domain controllers all located in a single OU? Answer: to make sure they all get the same GPO security settings. When we establish a trust between two Active Directory forests, we extend our trust boundary beyond the local forest to our partner. Pay careful attention to conflicting settings that would break communications; here is a classic KB article which describes numerous potential conflicts. Make sure you work together on a common security baseline for both AD forests. If this is going to be a long-term relationship, I recommend putting a solution in place: establish a naming convention for GPOs with sensitive settings, and routinely synchronize or compare those GPOs (a rough sketch follows). Some tools that may help in this effort are the Security Compliance Manager (free!) or Advanced Group Policy Manager (requires an MDOP license).
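
A rough sketch, with hypothetical GPO and file names, of exporting the sensitive baseline GPO from each forest and eyeballing the drift:

# Requires the GroupPolicy RSAT module.
Import-Module GroupPolicy
Get-GPOReport -Name "SEC-Baseline-DCs" -ReportType Xml -Path C:\temp\adatum.xml
Get-GPOReport -Name "SEC-Baseline-DCs" -Domain "contoso.com" -ReportType Xml -Path C:\temp\contoso.xml

# A crude diff of the two reports; report metadata always differs, so
# look for differences in the actual settings:
Compare-Object (Get-Content C:\temp\adatum.xml) (Get-Content C:\temp\contoso.xml) | Select-Object -First 25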

10. Be aware of your partner’s vulnerabilities: When you open your doors to a partner environment, you could be exposing yourself to all the threats they encounter. Here are some questions to ask:

1. How do they enforce system health compliance?

2. How do they deliver patching and security updates?

3. Are they running legacy operating systems?

4. What are their Password Policies?

5. How many Domain Admins do they have, and are any of them service accounts?

6. Have they reported security breaches in the past?

Whenever your users traverse the trust, they could be exposed to credential theft. In turn, their users are now your users; whatever vulnerabilities and compromises they face are now your problems. You can set up various roadblocks to limit these issues.

1. Don’t let privileged accounts log on interactively across the trust, restrict what they can access over the network, and stay away from legacy operating systems.

2. Review the Security Considerations for Trusts and implement measures like SID Filtering or Selective Authentication (see the sketch after this list).

3. Secure privileged access with a tiered administration model and other mitigations.
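
A minimal sketch, requiring the ActiveDirectory RSAT module, for checking how each trust is configured:

# SIDFilteringQuarantined and SelectiveAuthentication show whether the
# protections above are in force on each trust:
Import-Module ActiveDirectory
Get-ADTrust -Filter * | Select-Object Name, Direction, ForestTransitive, SIDFilteringQuarantined, SelectiveAuthentication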

These are just a few examples of considerations. Keep in mind that this is a major endeavor and should be well planned. Also, think outside the box: this may be the time to consider hosting services in Azure and merging the two forests in Azure AD, while keeping your on-premises environment nicely isolated.

Good Luck,

Randy “Trust me” Turner


Work Folders for iOS can now upload files!


Hi.

We are happy to announce that we’ve just released an update to the Work Folders app on iOS that allows anyone to upload pictures and documents from other apps, take pictures, or even write a simple note, right from within the Work Folders app.

 

What's new: Take pictures and notes and upload documents

Overview

Work Folders is a Windows Server feature, available since Windows Server 2012 R2, that enables individual employees to access their files securely from inside and outside the corporate environment. The Work Folders app connects to the server and enables file access on your iPhone and iPad. Work Folders enables this while allowing the organization’s IT department to fully secure that data.

 

What’s New

Using the latest version of Work Folders for iOS, users can now:

  • Sync files that were created or edited on their device
  • Take pictures and write notes within the Work Folders application
  • When working within other applications (e.g., Microsoft Word), the Work Folders location can be selected when opening or saving files. No need to open the Work Folders app to sync your files.

For the complete list of Work Folders for iOS features, please reference the feature list section below.

 

iosaddmenu

 

Work Folders for iOS – Feature List

  • Sync files that were created or edited on your device
  • Take pictures and write notes within the Work Folders app
  • Pin files for offline viewing – saves storage space by showing all available files but locally storing and keeping in sync only the files you care about.
  • Files are always encrypted – on the wire and at rest on the device.
  • Access to the app is protected by an app passcode – keeping others out even if the device is left unlocked and unattended.
  • Allows for DIGEST and Active Directory Federation Services (ADFS) authentication mechanisms, including multi-factor authentication.
  • Search for files and folders
  • Open files in other apps that might be specialized to work with a certain file type
  • Integration with Microsoft Intune

 

Known Issues

Microsoft Office files are read-only when opened from the Work Folders app. Follow these steps to edit and sync an Office file:

  • Tap “Duplicate” to store the file locally on your iOS device.
  • Make your changes and save the file.

 

 

All the Goods

Blogs and Links

 If you’re interested in learning more about Work Folders, here are some great resources:

 Introduction and Getting Started

Advanced Work Folders Deployment and Management

Videos

Safeguarding your cloud resources with Azure security services


While cloud security continues to be a top concern, we recently shared insights from a survey showing that overall concern has dropped significantly since 2015. We’re now at a stage where half of organizations contend the cloud is more secure than their on-premises infrastructure. In conversations with our customers and partners, I increasingly hear how using the cloud improves an organization’s security posture. As many organizations push forward on their digital transformation through increased use of cloud services, understanding the current state of cloud security is essential.

Maintaining a strong security posture for your cloud-based innovation is a shared responsibility between you and your cloud provider. With Microsoft Azure, securing cloud resources is a partnership between Microsoft and our customers, so it’s essential that you understand the comprehensive set of security controls and capabilities available to you on Azure. 

Microsoft Azure is built on a foundation of trust and security. With significant investments in security, compliance, privacy, and transparency, Azure provides a secure foundation to host your infrastructure, applications, and data in the cloud. Microsoft also provides built-in security controls and capabilities to further help you protect your data and applications on Azure. These can be classified broadly into four categories:

Manage and control user identity and access: Comprehensive identity management is the linchpin of any secure system. You must ensure that only authorized users can access your environments, data, and applications. Azure Active Directory serves as a central system for managing access across all your cloud services, including Azure, Office 365, and hundreds of popular SaaS and PaaS cloud services. Its federation capability means that you can use your on-premises identities and credentials to access those services, and Azure Multi-Factor Authentication adds a further layer of sign-in security.

Increase network and infrastructure security: Azure provides you the security-hardened infrastructure to interconnect Azure VMs as well as make connections to on-premises datacenters. Additionally, you can extend your on-premises network to the cloud using secure site-to-site VPN or a dedicated Azure ExpressRoute connection. You can strengthen network security by configuring Network Security Groups, user-defined routing, IP forwarding, forced tunneling, endpoint ACLs, and Web Application Firewall as appropriate.
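
As a minimal sketch of the Network Security Group piece, with hypothetical resource names and using the Az PowerShell module (cmdlet names differ in the older AzureRM module):

# Allow only HTTPS inbound from the internet to an example subnet:
$rule = New-AzNetworkSecurityRuleConfig -Name "Allow-HTTPS-Inbound" `
    -Protocol Tcp -Direction Inbound -Priority 100 `
    -SourceAddressPrefix Internet -SourcePortRange "*" `
    -DestinationAddressPrefix "*" -DestinationPortRange 443 -Access Allow

New-AzNetworkSecurityGroup -Name "web-nsg" -ResourceGroupName "demo-rg" `
    -Location "westus" -SecurityRules $rule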

Encrypt communications and operation processes: Azure uses industry-standard protocols to encrypt data in transit as it travels between devices and Microsoft datacenters, and when it is stored in Azure Storage. You can also encrypt your virtual machine disks using Azure Disk Encryption. Azure Key Vault enables you to safeguard and control cryptographic keys and other secrets used by cloud apps and services. Azure Information Protection will help you classify, label, and protect your sensitive data.
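
As a small hedged example of the Key Vault piece (vault and secret names are hypothetical; Az PowerShell module assumed):

# Store an application secret centrally instead of in app config:
$secret = ConvertTo-SecureString "demo-value-only" -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "demo-vault" -Name "AppDbConnection" -SecretValue $secret

# Applications then retrieve it at run time:
Get-AzKeyVaultSecret -VaultName "demo-vault" -Name "AppDbConnection"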

Defend against threats: Microsoft enables actionable intelligence against increasingly sophisticated attacks using our network of global threat monitoring and insights. This threat intelligence is developed by analyzing a wide variety of signal sources and a massive scale of signals. (For example, customers authenticate with our services over 450 billion times every month, and we scan 200 billion emails for malware and phishing each month.) Our approach to protect the Azure platform includes intrusion detection, distributed denial-of-service (DDoS) attack prevention, penetration testing, behavioral analytics, anomaly detection, and machine learning. You can leverage additional services to develop a strong threat prevention, detection, and mitigation strategy.

Azure Active Directory Identity Protection helps you detect and mitigate the risks from compromised identities. It offers a cloud-powered, adaptive, machine learning-based identity protection system that can detect cyber-attacks, mitigate them in real time, and automatically suggest updates to your Azure AD configuration and conditional access policies. Services like Antimalware for Azure and Azure Security Center use advanced analytics not only to help detect threats but also to prevent them. Azure Security Center gives you a central, real-time view of the security state of all your Azure resources, including recommendations for improving your security posture. You can use Operations Management Suite to extend threat prevention, detection, and quick response across Azure and other environments (on-premises, AWS). The Log Analytics service gives you real-time insights to readily analyze millions of records across all of your workloads, regardless of their physical location.

These are just a few examples of the broad set of security controls and services available to you with Azure. Over the past year, we have expanded the portfolio with many new security services and ongoing enhancements.

Microsoft is committed to continued innovation in helping you protect your data, applications, and identities in the cloud. Innovations we have delivered most recently include:

  • New capabilities and enhancements in Azure Security Center available for preview this month include Just In Time network access to VMs, automatic discovery and recommendations for application whitelisting, and expanded Security Baselines with more than 100 recommended configurations defined by Microsoft and industry partners. Our research team continues to monitor the threat landscape and innovate on detection algorithms. Some new threat detections available to customers include Brute Force detections, outbound DDoS and Botnet detections, as well as new behavioral analytics for Windows and Linux VMs.
  • Preview of Storage Service Encryption for File Storage. IT organizations can lift and shift their on-premises file shares to the cloud using Azure Files by simply pointing applications to the Azure file share path. Azure Files now offers enhanced protection with the ability to encrypt data at rest.
  • Azure SQL Database Threat Detection is already available in preview. Last week the team announced that it will be generally available in April 2017. Azure SQL Database Threat Detection provides an additional layer of security intelligence built into the Azure SQL Database service that uses machine learning to continuously monitor, profile, and detect suspicious database activity to help customers detect and respond to potential threats.

With these tools, organizations are able to securely transition to the cloud while also complying with regulatory requirements. Read how Ricoh USA Inc. discovered that Azure exceeds the level of security it could previously offer its customers.

Azure has a vibrant partner ecosystem, so it’s also easy to bring your trusted cloud security vendor with you, enabling you to leverage your existing security solutions. Find partner security solutions in Azure Marketplace.

Microsoft Azure at RSA 2017

For those of you attending RSA Conference this week in San Francisco, we hope to connect with you at the show. You can:

  • See the keynote by Brad Smith, President and Chief Legal Officer at 8:35AM PST. You can stream it live if you’re not at RSA.
  • Attend our sessions:
    • A Vision for Shared, Central Intelligence to Ebb a Growing Flood of Alerts: SP03-T09
    • How to Go from Responding to Hunting with Sysinternals Sysmon: HTA-T09
    • Critical Hygiene for Preventing Major Breaches: CXO-F02
    • Advances in Cloud-Scale Machine Learning for Cyber-Defense: EXP-T11
    • Learnings from the Cloud: What to Watch When Watching for Breach: STR-W11
  • Visit Booth 3501 in the North Expo Hall and learn how Microsoft solutions work together to improve your organization’s security posture. See the complete Microsoft schedule for RSA 2017. Hope to see you in San Francisco!

From pit stop to checkered flag—NASCAR drives productivity through cloud-based IT innovation


NASCAR races forward with Office 365

Today’s post was written by Stephen Byrd, director of Technology Integration and Development at NASCAR.

In the midst of the deafening noise and speed of a NASCAR race, even mundane tasks like changing a tire become heroic. It’s the same with the technology we employ; simple tasks like collaborating on documents and sharing files support the essential teamwork that is at the core of the NASCAR enterprise. When the Technology Integration and Development team at NASCAR chose Microsoft Office 365, we knew we had found a platform that would simplify how we produce the exciting sporting events that are the NASCAR legacy. Today, we are about 25 percent migrated to Office 365, including SharePoint Online and OneDrive for Business.

To say our business is mobile is an understatement; NASCAR holds 38 events a year all over the country. This means setting up the compound infrastructure, arranging catering and coordinating travel for our employees. For 10 months of the year, we are a traveling circus. It’s a huge undertaking, and it’s crucial that all the aspects of our operation run smoothly. In the past, the logistics of coordinating these events came down to printing out huge runbooks and relying on email to make sure everyone had the correct information before race day. That has fundamentally changed with the introduction of Office 365 and, specifically, SharePoint Online collaboration team sites. Using SharePoint Online, we created a brand-new intranet called Inside Track that employees use to collaborate simultaneously on documents, update spreadsheets in real time and securely access the information they need on the road or from the track. Productivity is at a peak level in the company, and we’re setting up new races with maximum efficiency.

The company is also piloting Microsoft Teams to take chat-based teamwork to the next level. We have lofty expectations that Microsoft Teams can handle the extremely fast-paced collaboration that must happen trackside and in the pit. For example, we are interested to see if we can use Microsoft Teams to ascertain if a penalty has been handed out, rather than hopping on a radio channel or the phone.

Our fast-paced industry is unique, but NASCAR’s security concerns are much like those of any large enterprise. After doing our due diligence, we felt confident that the Microsoft cloud-based products were among the best in their class when it came to cloud security. We are confident that with Office 365 and a custom app built in PowerApps, we can easily manage data access for all of our employees and hundreds of freelance contractors to ensure everyone gets the information they need to get their work done.

The Technology Integration and Development team at NASCAR was born out of a single question: How do we empower the business to get more done? We chose Office 365, with its mobile-first, cloud-first benefits, to give our employees the best tools for working at the office or at the track. Putting a race together requires the same coordinated teamwork you see in the pit, and with Office 365 and SharePoint Online, we’re driving productivity into a new era of time- and cost-savings. That’s great news for NASCAR.

—Stephen Byrd

For more on NASCAR’s race toward efficiency, read the full case study.


Demo Tuesday // Shielded Virtual Machines in Windows Server 2016

$
0
0

Welcome to our Demo Tuesday Series. Each week, we will be highlighting a new product feature from the Hybrid Cloud Platform.

You can’t put virtual machines (VMs) under lock and key

Customers are virtualizing pretty much everything they can today, from SharePoint to SQL Server to Active Directory domain controllers. That has created an interesting new challenge. When workloads ran on physical machines, it was possible to physically secure them: put them in cages, padlock them, and protect them with actual security guards. Unfortunately, you can’t do that with virtual machines.

Network admins, backup admins, server admins, storage admins, and potentially others have access to your virtual machines. Any one of them could inject malicious code into a virtual machine or copy a VM onto a USB drive and take it home for inspection, and you’d be none the wiser. Take a look:

Get the benefits of virtualization with the security of physical deployment

Shielded VMs in Windows Server 2016 keep malicious actors from stealing copies of your virtual machines. They are fully encrypted and will run only on approved guarded fabrics, and the guarded fabric enforces strong isolation boundaries between the host and its virtual machines. In every other respect, a shielded VM is just like a normal VM: you can run it, back it up, and live migrate it with no additional operational hassle. That’s the power of the shield in shielded virtual machines.
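
A minimal hedged sketch, on a Windows Server 2016 Hyper-V host with a hypothetical VM name, for checking a VM’s shielding status:

# The Hyper-V PowerShell module exposes a VM's security settings:
Get-VMSecurity -VMName "SQL-VM01" | Select-Object Shielded, TpmEnabled, EncryptStateAndVmMigrationTraffic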

Try it out for yourself in our lab and learn more about how Windows Server 2016 can help shield your virtual machines in this Microsoft Virtual Academy session.

Are you still running a Windows Server technical preview?

$
0
0

If you were one of the customers who took time to evaluate the technical previews of Windows Server last year, thank you! Your testing and feedback helped us improve and deliver a great release of Windows Server 2016 for general availability in October 2016.

If you are still testing on one of these technical previews, we wanted to let you know that there are great options for evaluating the generally available version of Windows Server 2016.

We continue to value your feedback, so please reach out to our User Voice or TechNet forums and let us know what you think!
