
Detecting threat actors in recent German industrial attacks with Windows Defender ATP


When a Germany-based industrial conglomerate disclosed in December 2016 that it was breached early that year, the breach was revealed to be a professionally run industrial espionage attack. According to the German press, the intruders used the Winnti family of malware as their main implant, giving them persistent access to the conglomerate’s network as early as February 2016.

In this blog, we look at the Winnti malware implant as used by two known activity groups, BARIUM and LEAD. We look at how these groups introduce the implant to various targets, as well as the techniques Microsoft researchers use to track it.

To show how this breach and similar breaches can be mitigated, we look at how Windows Defender Advanced Threat Protection (Windows Defender ATP) flags activities associated with BARIUM, LEAD, and other known activity groups and how it provides extensive threat intelligence about these groups. We go through the Winnti implant installation process and explore how Windows Defender ATP can capture such attacker methods and tools and provide visualized contextual information that can aid in actual attack investigation and response. We then discuss how centralized response options, provided as enhancements to Windows Defender ATP with the Windows 10 Creators Update, can be used to quickly stop threats, including stopping command and control (C&C) communication and preventing existing implants from installing additional components or from moving laterally to other computers on the network.

Winnti activity groups: BARIUM and LEAD

Microsoft Threat Intelligence associates Winnti with multiple activity groups—collections of malware, supporting infrastructure, online personas, victimology, and other attack artifacts that the Microsoft intelligent security graph uses to categorize and attribute threat activity. Microsoft labels activity groups using code names derived from elements in the periodic table. In the case of this malware, the activity groups strongly associated with Winnti are BARIUM and LEAD. But even though they share the use of Winnti, the BARIUM and LEAD activity groups are involved in very different intrusion scenarios.

BARIUM begins its attacks by cultivating relationships with potential victims—particularly those working in Business Development or Human Resources—on various social media platforms. Once BARIUM has established rapport, they spear-phish the victim using a variety of unsophisticated malware installation vectors, including malicious shortcut (.lnk) files with hidden payloads, compiled HTML help (.chm) files, or Microsoft Office documents containing macros or exploits. Initial intrusion stages feature the Win32/Barlaiy implant—notable for its use of social network profiles, collaborative document editing sites, and blogs for C&C. Later stages of the intrusions rely upon Winnti for persistent access. The majority of victims recorded to date have been in electronic gaming, multimedia, and Internet content industries, although occasional intrusions against technology companies have occurred.

In contrast, LEAD has established a far greater reputation for industrial espionage. In the past few years, LEAD’s victims have included:

  • Multinational, multi-industry companies involved in the manufacture of textiles, chemicals, and electronics
  • Pharmaceutical companies
  • A company in the chemical industry
  • University faculty specializing in aeronautical engineering and research
  • A company involved in the design and manufacture of motor vehicles
  • A cybersecurity company focusing on protecting industrial control systems

During these intrusions, LEAD’s objective was to steal sensitive data, including research materials, process documents, and project plans. LEAD also steals code-signing certificates to sign its malware in subsequent attacks.

In most cases, LEAD’s attacks do not feature any advanced exploit techniques. The group also does not make special effort to cultivate victims prior to an attack. Instead, the group often simply emails a Winnti installer to potential victims, relying on basic social engineering tactics to convince recipients to run the attached malware. In some other cases, LEAD gains access to a target by brute-forcing remote access login credentials, performing SQL injection, or exploiting unpatched web servers, and then they copy the Winnti installer directly to compromised machines.

Tracking Winnti

Microsoft Analytics shows that Winnti has been used in intrusions carried out throughout Asia, Europe, Oceania, the Middle East, and the United States in the last six months (Figure 1).  The most recent series of attacks observed was in December 2016.

 


Figure 1. Winnti encounters from July to December 2016

Although tracking threats like Winnti involves old-fashioned investigative work, Microsoft Threat Intelligence analysts take advantage of machine learning to work at scale. When attackers used Winnti to maintain access to web servers, they hid the implant in plain sight by masquerading it as a trusted, legitimate file. This was the case in two known intrusions in 2015, where attackers named the implant DLL “ASPNET_FILTER.DLL” to disguise it as the DLL for the ASP.NET ISAPI Filter (Table 1).  Although there are obvious differences between the legitimate file and the malicious one, filtering out the malicious file would involve going through a data set with noise from millions of possible file names, software publishers, and certificates. Microsoft researchers used a combination of anomaly detection and supervised machine learning to reduce the data set and separate meaningful, malware-related anomalies from benign data.

 

Table 1. ASPNET_FILTER.dll comparison (legitimate vs. malicious file)

 

Dealing with Winnti intrusions

Windows Defender ATP helps network security professionals deal with intrusions from activity groups like LEAD and BARIUM in several ways. The following examples were developed using a Winnti installer that was used in attacks in December 2016.

Alerts for breach activity

Microsoft Threat Intelligence continually tracks activity groups such as LEAD and BARIUM and documents the tactics, techniques, and procedures they employ in their attacks, with a special focus on the tools and infrastructure they use to facilitate those attacks. Windows Defender ATP continuously monitors protected systems for such indicators of hostile activity and alerts security operations center (SOC) personnel to their presence (Figure 2).

 


Figure 2. Threat intelligence alert in Windows Defender ATP

To provide context around such alerts, Windows Defender ATP also features a short summary of the group’s history, goals, methods, and tools (Figure 3), with links to extensive documentation for technically minded users.

 


Figure 3. LEAD activity group summary and extensive documentation

 

Windows Defender ATP is also capable of detecting previously unknown attacks by monitoring system behavior indicative of hostile activity, including:

  • Malware installation, persistence, and activation
  • Backdoor command and control
  • Credential theft
  • Lateral movement to other machines on the network

For example, numerous malware families register themselves as services during installation to guarantee persistence across reboots. Most malware that uses this persistence technique modifies the necessary registry keys in ways that do not fit the profile of a legitimate program. Winnti is no exception, so Windows Defender ATP is able to raise behavioral alerts during Winnti’s installation process (Figure 4).

 


Figure 4. Abnormal service creation alert

To improve coverage while minimizing false positives, Windows Defender ATP uses the intelligent security graph to differentiate between suspicious and benign behavior before generating alerts. It considers the age of the file, its global prevalence, and the presence and validity of a digital signature along with the method of service creation.
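As a rough illustration of this kind of signal (a standalone triage sketch, not how Windows Defender ATP or the intelligent security graph is implemented), the following PowerShell lists installed services whose binaries do not carry a valid digital signature:

# Illustrative triage sketch only; not part of Windows Defender ATP.
# Lists services whose executable is not validly signed (best-effort path parsing).
Get-CimInstance -ClassName Win32_Service |
    Where-Object { $_.PathName } |
    ForEach-Object {
        # Strip surrounding quotes and arguments to approximate the executable path.
        $exe = if ($_.PathName.StartsWith('"')) { $_.PathName.Split('"')[1] } else { $_.PathName.Split(' ')[0] }
        [pscustomobject]@{
            Service   = $_.Name
            Path      = $exe
            Signature = (Get-AuthenticodeSignature -FilePath $exe -ErrorAction SilentlyContinue).Status
        }
    } |
    Where-Object { $_.Signature -ne 'Valid' }

Unsigned or unusually signed service binaries are not malicious by themselves, which is why signals like file age, prevalence, and the method of service creation are weighed together before an alert is raised.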

Visualized contextual information

For alerts raised either by specific threat intelligence tied to activity groups or by more generic suspicious behaviors, Windows Defender ATP provides rich, visualized technical context. This visual context enables SOC personnel to investigate alerts with all related artifacts, understand the scope of the breach, and prepare a comprehensive action plan. In the screenshots below, Windows Defender ATP clearly presents the Winnti installation where an installer drops a DLL to disk (Figure 5), loads the DLL using rundll32 (Figure 6), sets the DLL as a service (Figure 7), and saves a copy of itself in C:\Windows\Help (Figure 8).

 


Figure 5. Winnti installer drops a DLL

 


Figure 6. Winnti installer loads DLL with rundll32

 


Figure 7. Winnti sets itself as a service for persistence

 


Figure 8. Installer copied to C:\Windows\Help\

Windows Defender ATP displays these activities as process trees in a machine timeline for the infected computer. Analysts can easily extract detailed information from these trees, such as the implant DLL dropped by the installer, the command used to call rundll32.exe and load the DLL, and the registry modifications that set the DLL as a service. This information can provide an initial means by which to assess the scope of the breach.

 

Response options

The Windows 10 Creators Update will bring several enhancements to Windows Defender ATP that will provide SOC personnel with options for immediate mitigation of a detected threat. If an intruder compromises a computer that has been onboarded to Windows Defender ATP, SOC personnel can isolate the computer from the network, blocking command and control of the implant and preventing attackers from installing additional malware and moving laterally to other computers in the network. Meanwhile, connectivity to the Windows Defender ATP service is maintained. While the machine is in isolation, SOC personnel can direct the infected machine to collect live investigation data, such as the DNS cache or security event logs, which they can use to verify alerts, assess the state of the intrusion, and support follow-up actions.

 


Figure 9. Response options for the compromised machine

Another option is to simply halt and quarantine the Winnti implant itself, stopping the intrusion on a single machine. LEAD and BARIUM are not known for large-scale spear-phishing, so it is unlikely that SOC personnel would have to deal with multiple machines having been compromised by these groups at the same time. Nevertheless, Windows Defender ATP also supports blocking the implant across the entire enterprise, stopping large-scale intrusions in the early stages (Figure 10).

 


Figure 10. Response options for the Winnti implant file

Conclusion: Shorten breach detection times to reduce impact

According to news reports, the incident affecting the industrial conglomerate may have taken several months to detect and mitigate. The time between the actual breach and its detection may have given attackers enough opportunity to locate and exfiltrate sensitive information.

With the enhanced post-breach detection capabilities of Windows Defender ATP, SOC personnel are able to reduce this period to hours or even minutes, significantly lessening the potential impact of persistent attacker access to their network. Windows Defender ATP provides extensive information about activity groups responsible for the attacks, enabling customers to understand aspects of the attack that may not be obtained by network and endpoint sensors, such as common social engineering lures and the regional nature of an attack. With relevant visualized information, analysts are able to study malware behavior on impacted machines, so they can investigate further and plan out their response. And, finally, with the upcoming Creators Update, Windows Defender ATP will provide additional capabilities for detecting threats such as Winnti, as well as centralized response options, such as machine isolation and file blocking, that will enable fast containment of known attack jump off points.

Windows Defender ATP is built into the core of Windows 10 Enterprise and can be evaluated free of charge.

 

Peter Cap, Mathieu Letourneau, Ben Koehl, and Milad Aslaner

Microsoft Threat Intelligence

 

This blog post is also available in German.

 


Sanofi Pasteur unlocks quality excellence and unleashes innovation with Yammer


Today’s post was written by Celine Schillinger, head of Quality Innovation and Engagement at Sanofi Pasteur.

Improving quality and performance are common business goals, but at Sanofi Pasteur, these goals have exceptional implications. As the vaccines division of a multinational pharmaceuticals company, we are responsible for creating rare, specialized products that prevent disease and save lives. When we improve any aspect of our business, we are also improving the lives of the people who receive our products. It’s an enormous responsibility, and one that is getting a surprising boost from the Microsoft social networking tool Yammer.

Too often, corporate social networks are seen as not having real business value. At Sanofi Pasteur, we know that these networks can have tangible benefits that support our corporate mission. When we first discovered Yammer eight years ago, I had no idea it would blossom into the largest group of its kind in the company, or that it would go on to inspire my current work using social networks to crowdsource change at the company. By supporting a movement of internal activists with a “quality mindset,” and connecting them through a large Yammer group dedicated to sharing best practices and ideas around working better together, Sanofi Pasteur took a decisive move toward improving the quality of our manufacturing processes. The result was an outpouring of innovation, such as an operator who saved his manufacturing unit more than €100,000 (US $105,000) by changing the way a particular material was handled in the production process, or a shop floor manager who has created a way to empower shop floor operators around key pillars of the production, thus reducing human errors by 91 percent. These are the kind of solutions that flow on Yammer. They are shared by their proud authors, recognized by peers and management, and replicated on other sites. They inspire more ideas and create a vibrant culture of improvement wherever people sit in the hierarchy.

Ours is a very stringent industry, with incredibly strict guidelines, and my challenge was to turn quality from an imposed constraint to a shared passion. Yammer helped me create a platform where this shared passion could grow. The Yammer effect is one of democratization and liberation of knowledge. In the past, the siloed company structure put pressure on managers to exchange information with their peers. Today, everyone has the opportunity to share their experiences on Yammer more openly, creating a rich resource of insights from every individual, regardless of title or level within the company. Our chief quality officer will regularly ask for and receive input directly from Yammer. Yammer groups have also helped us facilitate more collaboration between departments. Today, groups that would never have had a reason to interact before are learning from each other’s success.

It is my belief that no one understands a company better than the people working there, and Yammer allows us to tap into this expertise at every layer of our business. When we invite the company at large to share best practices and initiatives on Yammer, we are inundated with great ideas that solve problems. With every success, participation increases, driving a global corporate sensibility for personal responsibility and change.

Yammer also speaks to the very different working realities of all our employees at Sanofi Pasteur, from those on the shop floor to every level of manager and associate in the company. For employees without access to a company computer, the mobile nature of Yammer allows them to tap into all the benefits of an engaged workplace culture from their smartphone. Sharing pictures is easy and intuitive, bringing people’s conversations to life and adding color to the discussions. The reality of a global enterprise like ours is also supported by Yammer, as it provides multilingual translators that remove geographical barriers to enable a global conversation. A couple of employees took to Yammer to launch the idea of a cross-site running group, to promote the quality improvement movement and its collaborative nature. A few months later, this network has gathered runners from 14 sites in eight countries, from Argentina to India, from France to China. Altogether, they ran 6,000 kilometers (3,700 miles)! When you see the pictures on Yammer, the bold challenge, the smiles and the pride, you realize that our company is closer and more cohesive than ever.

I knew when I started the Yammer community that our purpose was to create a living ecosystem where people could care for each other and work toward building better business outcomes. The result exceeded our expectations and has led to savings in time, money and resources as we move ahead with our mission to prevent illness the world over.

—Celine Schillinger

For more on Sanofi Pasteur’s ability to promote and support the “quality mindset,” read the full case study.


Evolving Office 365 Advanced Threat Protection with URL Detonation and Dynamic Delivery


We built Office 365 Advanced Threat Protection to provide nearly unparalleled email security with little impact on productivity. Advanced Threat Protection defends your organization from today’s growing and evolving advanced threats with powerful safeguards like Safe Links, which provides time-of-click protection to help prevent users from opening or accessing malicious links, and Safe Attachments, which protects users from opening malicious email attachments. Today, we’re pleased to announce availability of two new capabilities—URL Detonation and Dynamic Delivery—which improve the security Advanced Threat Protection provides while keeping people productive.

General availability of URL Detonation

URL Detonation helps prevent your users from being compromised by files linked to malicious URLs.


Email with malicious link to PDF file. 

When a user receives an email, Advanced Threat Protection analyzes the URLs for malicious behavior. This new capability is in addition to the URL reputation checks that Advanced Threat Protection already performs. If the user clicks a link while the scan is still in progress, the message “This link is being scanned” is displayed. If the link is identified as malicious after the scan, a pop-up window opens, notifying the user that the file is malicious and warning against opening it.


Link scan in progress notification (left). Malicious link notification (right).

IT admins can configure a Safe Links policy that turns on the URL trace to track user clicks, which is especially useful when users can bypass the warning and click through to blocked pages. This enables admins to focus remediation efforts on impacted users without disrupting the work of unaffected users.


URL trace of user activity.

Public preview of Dynamic Delivery

Since introducing Safe Attachments, we have greatly reduced the time it takes to scan emails containing attachments. While any malware solution requires a small amount of time to scan suspicious attachments, Advanced Threat Protection enables you to remain productive during this scan time. Now, with Dynamic Delivery, recipients can read and respond to the email while the attachment is being scanned. Dynamic Delivery delivers emails to the recipient’s inbox along with a “placeholder” attachment notifying the user that the real attachment is being scanned—all with minimal lag time.


Users can read the email body while the attachment is scanned in a Safe Attachments sandbox.

If a user clicks the placeholder attachment, they see a message showing the progress of the scan. If the attachment is harmless, it seamlessly re-attaches to the email so the user can access it. If it is malicious, Office 365 Advanced Threat Protection will filter out the attachment.


The scan progress page displayed when a user clicks an attachment undergoing a scan.

How to enable URL Detonation and Dynamic Delivery

URL Detonation can be enabled through the policy controls in the Safe Links admin window under Settings. To enable URL Detonation, select the On radio button and then select the Use Safe Attachments to scan downloadable content checkbox.


Admin control window for Safe Links policy. Both Linked Content Detection and Dynamic Email Delivery (through Safe Attachments) are enabled.

Dynamic Delivery can be activated through the policy controls from the Safe Attachments admin control window under Settings. Simply select the Dynamic Delivery radio button.


Admin control window for Safe Attachments policy with Dynamic Delivery activated.
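If you prefer to script these settings, the equivalent options can be set with the Exchange Online PowerShell cmdlets for Safe Links and Safe Attachments policies. The sketch below is illustrative rather than definitive: the policy names are placeholders, and parameter names may differ slightly between service updates.

# Illustrative only; policy names are placeholders and parameter names may vary by service release.
# Turn on URL Detonation (use Safe Attachments to scan downloadable content) for a Safe Links policy.
Set-SafeLinksPolicy -Identity "Default Safe Links Policy" -IsEnabled $true -ScanUrls $true

# Switch a Safe Attachments policy to Dynamic Delivery.
Set-SafeAttachmentPolicy -Identity "Default Safe Attachments Policy" -Enable $true -Action DynamicDelivery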

How to get started with Advanced Threat Protection

To learn how to turn on the new Advanced Threat Protection capabilities in the Office 365 Security & Compliance Center, watch this Office Mechanics video. If you don’t yet have Advanced Threat Protection, sign up for a trial of Office 365 E5, which provides advanced security and compliance capabilities including Advanced Threat Protection.

We’re continuously trying to improve your experience on Office 365, so be sure to let us know what you think of these Advanced Threat Protection feature enhancements!


River Dell Regional School District gives students connected classrooms and digital ink


Students and teachers decide what technology the district should adopt. Their choice: Windows 10 and Microsoft Education.

River Dell Regional School District in New Jersey has a middle school and a high school comprising more than 1,700 students and 131 educators. When it came time to renew its computer equipment, the district asked teachers and students what they wanted.


OneNote’s collaboration space encourages students to work together on the same document at the same time, whether it’s a shared writing project or gathering data for a science lab report.

River Dell Regional School District has equipped every student and teacher with a Windows 10 touch- and pen-enabled laptop with Microsoft Office 365 and OneNote. This combination enables teachers to work more efficiently, facilitates personalized learning opportunities for students and fosters a collaborative learning culture.

“We had a teacher and student committee that researched and selected the devices with us,” explains Marianthe Williams, director of technology for the district. “As a 1:1 school district for over 10 years, it was essential to have teacher and student involvement in the selection process.”

The committee developed a rubric to determine what criteria were important in and out of the classroom, and what functionality would enable them to do their best work while staying within the budget. “The group wanted 24/7 access to all their files with wireless, portable devices and digital inking capabilities,” said Williams.


While teachers roam the room, they can write on the Surface tablet and have the display projected on a screen in the front of the class.

The top priority for students was to find a device that offered a variety of modern capabilities, including keyboard, touch and pen. Williams noted, “For certain subjects such as math, science and music, students really need the ability to take notes by hand so they can make diagrams and sketches.” Research shows that digital inking results in long-term retention and better learning outcomes. Students produce 56 percent more non-linguistic content [diagrams, symbols, numbers] when using digital inking rather than a keyboard, which leads to an improvement in performance of 9 to 38 percent.* Teachers are more productive with digital ink as well, since they are able to offer individualized feedback that improves personalized student learning. Another study showed that 68 percent of educators improve the quality of their communications with students by using a touch-based device with a stylus.**

As the district began testing different types of devices, including laptops, Chromebooks, and tablets, the choice became clear. “Hands-down, the committee selected the Surface Pro 3 because it was easy to use, intuitive, and lightweight,” Williams explained. All teachers and staff were issued the Surface Pro 3. High school students received the HP EliteBook 820, while the middle school students received the HP ProBook 11 Education Edition. “We were pleasantly surprised to find the HP devices offered the touchscreen features our students craved at a price point the district could afford.”


With the Surface tablets, teachers are freed from their stationary post in front of the class.

Over the summer, the district made a smooth transition to Windows 10 on all devices. Not only did the students keep the productivity tools in Office 365 to which they were so accustomed, but they also benefited from digital inking features they craved. The upgrade to Windows 10 provided a user experience that students and teachers were already familiar with and opened a new world of touch and pen features.

Williams said, “We’re all-in with Surface, HP, Microsoft Office 365 and OneNote. It’s exciting to see how our teachers and students have taken OneNote Class Notebook to be their digital curriculum. The technology helps us to extend learning beyond the instructional period.  Our students are finding new ways to collaborate and developing skills that will serve them well throughout their lives and careers.”

With the Surface Pro 3 tablets, teachers are freed from their stationary post in front of the class. While teachers roam the room, they can write on the Surface tablet and have the display projected on a screen in the front of the class. “You have a very strong pulse of the class when you can walk around,” explains U.S. History teacher, Dawn Rivas. “I can write on the tablet, and it goes up immediately on the projector and kids will see that.”  Using the stylus allows students and teachers to incorporate notes and comments directly into the lesson in a way that does not interrupt the flow of the classroom.  Teachers report that their lessons are considerably enhanced, with students who are more engaged and are able to draw better conclusions.

The most exciting instructional catalyst was Microsoft OneNote Class Notebook, which has taken collaborative learning to a new level. “The game changer is OneNote Class Notebook. Our teachers have really embraced it,” said Williams.


River Dell Regional School District in New Jersey has had a 1:1 device program in place for 10 years. But the technology program really took off after every teacher was given a Microsoft Surface Pro 3 over the 2015-2016 school year.

With OneNote Class Notebook, students can ask for help, and teachers can give their support—directly in the student’s personal notebook. OneNote’s collaboration space encourages students to work together on the same document at the same time, whether it’s a shared writing project or gathering data for a science lab report. OneNote automatically saves notebooks so that students and teachers can view them from any device, online or offline.  Williams says, “Teachers tell us that they are more efficient as they can check student progress at any time and that the students have higher homework completion rates.”

To learn more about how Microsoft is empowering every student to achieve more, visit Microsoft in Education.

*Computer Interfaces and their Impact on Learning, Sharon Oviatt.
**Digital Ink in the Classroom, IDC InfoBrief sponsored by Microsoft, June 2015.


A Plethora of Microsoft Training Options on AI, Machine Learning & Data Science, including MOOCs


This post is authored by Kristin M. Tolle, Director of Program Management for Advanced Analytics Ecosystem Development and Training at Microsoft.

Cortana Intelligence, Microsoft’s end-to-end platform for Advanced Analytics, offers a suite of services to solve real world customer problems. The suite has many moving parts – Data Lake, HDInsight (Hadoop), Event Hub, Machine Learning and R – just to name a few, and we realize it may be challenging for some of you to experience first-hand how all these services work together in concert.

My team, which is tasked with training our partners to use these services to address their customers’ needs, is keenly aware of the breadth of that knowledge surface area. In this blog post, I outline some of the best ways for you to learn about all things Big Data and Advanced Analytics from Microsoft, including many hands-on training options, and also how to stay in the loop on our future offerings.


Online Training

Our training offerings take many forms. Overviews can be found on Channel 9. For example, we published two new series recently, one on Azure SQL Data Warehousing by Chris Testa-O’Neill of SQLBits fame and another on Operationalizing Solutions with Azure Data Factory, by Ryan Swanstrom, who, when he’s not training people for the Data Science team, runs Data Science 101, one of the most popular blogs for aspiring data scientists.

We’re also spending a lot of time working on massive open online courses (MOOCs).

Microsoft partners can take Practical Data Analytics with Cortana Intelligence through the Cloud Platform University Online (CPO), with our very own Buck Woody.

If you’re looking for a deep dive into Microsoft R there’s an EdX.org course, Analyzing Big Data with Microsoft R Server, presented by Data Scientist Seth Mottaghinejad.

Our MOOCs are not for the faint of heart – they’re close to the metal, the objective being to prepare you to become a Microsoft Certified Solutions Expert or a Microsoft Certified Professional on our platform. In fact, we recently launched (currently in beta) a certification called Analyzing Big Data with Microsoft R (Exam 70-773).

If you’re just getting started in Data Science, Microsoft Learning has an entire series of online classes, including ours, which teach you how to do real world data science using Microsoft platforms through the Data science track in the Microsoft Professional Program.

Training Materials, and Becoming a Training Partner

If you’re that type who just wants to figure it all out for yourself, we publish our training materials to a public GitHub repository where you’re welcome to access firsthand all the materials we use when we do in-person instruction. You can find us under Azure here: https://github.com/Azure/learnAnalytics-public.

These materials are open source and free for you to use to teach others about Advanced Analytics.

To become an Advanced Analytics training partner, please reach out to us here: http://learnanalytics.microsoft.com/home/trainingpartners. Before long you’ll be publishing your classes on our LearnAnalytics@MS website!

In-Person Training

You’ll often find our team running classes at data science and machine learning events all over the world – for instance at PASS, the Data Science Summit, and many, many others.

We also deliver in-person “capstone” style classes around the world called the LearnAnalytics Series. These classes are very hands-on deep dives. You can learn Microsoft R with Spark and HDInsight, develop solutions using Cortana Intelligence and Microsoft R, and learn how to put Cognitive Services, Bot Framework and Deep Learning to use in the real world, just to cite some examples.

If Bots are your thing, register today for four upcoming classes on this topic in Houston, Chicago, Munich, and Madrid.

We’ve got an offering titled “Cortana Intelligence Suite Workshop – Foundations and SQL Data Warehousing” coming up in Copenhagen, with events in Madrid (Spain), Reading (UK), Sydney, Melbourne and Hong Kong to be scheduled soon after.

For those of you interested in a deep dive into Microsoft R, our course on “Microsoft R Server for Data Science with Spark and HDInsight” will be offered in Belgrade, Reading, Budapest, Istanbul and Dubai.

Stay in the Know

As you can tell, there’s just a lot going on in this space! Be sure to regularly visit the LearnAnalytics@MS portal to stay on top of everything Advanced Analytics from Microsoft.

What’s more, you may just find us offering a class in your own backyard in the near future!

Kristin
@Kristin_Tolle

DSC Resource Kit Release January 2017


We just released the DSC Resource Kit!

This release includes updates to 7 DSC resource modules, including 10 new DSC resources. In these past 6 weeks, 71 pull requests have been merged and 37 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • AuditPolicyDsc
  • xDismFeature
  • xExchange
  • xNetworking
  • xPSDesiredStateConfiguration
  • xSQLServer
  • xWebAdministration

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

xHyper-V has changes to be released, but the tests have a few issues at the moment. We will release it as soon as those issues are resolved.
PSDscResources does not currently have any changes to release, but there are some changes queuing up. We will release it again soon once those changes are ready.
We will update this blog post when either of these modules is released.

A new version of xActiveDirectory was also released about two weeks ago with hotfixes for xADDomain and xADDomainController.

Our last community call for the DSC Resource Kit was last week on January 18. A recording of our updates as well as summarizing notes are available. Join us next time at 9AM PST on March 1 to ask questions and give feedback about your experience with the DSC Resource Kit. Keep an eye on the community agenda for the link to the call agenda.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

All resources with the ‘x’ prefix in their names are still experimental – this means that those resources are provided AS IS and are not supported through any Microsoft support program or service. If you find a problem with a resource, please file an issue on GitHub.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or Changelog.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name / Version / Release Notes
AuditPolicyDsc 1.1.0.0
  • Added the AuditPolicyCsv resource.
xDismFeature 1.2.0.0
  • xDismFeature: Resource no longer includes the Source parameter when it is not specified
  • Converted appveyor.yml to install Pester from PSGallery instead of from Chocolatey.
xExchange 1.13.0.0
  • Fix function RemoveVersionSpecificParameters
  • xExchMailboxServer: Added missing parameters except these, which are marked as “This parameter is reserved for internal Microsoft use.”
xNetworking 3.2.0.0
  • Fixed typo in the example’s Action property from “Blocked” (which isn’t a valid value) to “Block”.

  • Added support for auto-generating wiki, help files, markdown linting, and checking examples.

  • Added NetworkingDsc.ResourceHelper module based on copy from PSDscResources.
  • MSFT_xFirewall:
    • Cleaned up ParameterList table layout and moved it into a new file (MSFT_xFirewall.data.psd1).

  • Separated Localization strings into strings file.
  • Added standard help blocks to all functions to meet HQRM standards.
  • Added CmdletBinding attribute to all functions to meet HQRM standards.
  • Style changes to meet HQRM standards.
  • Fixed issue using CIDR notation for LocalAddress or RemoteAddress (see GitHub issue).

  • Fixed integration tests so that values being set are correctly tested.
  • Added integration tests for Removal of Firewall rule.
  • Added NetworkingDsc.Common module to contain shared networking functions.
  • MSFT_xDNSServerAddress:
  • Separated Localization strings into strings file.
  • MSFT_xDefaultGatewayAddress:
  • Separated Localization strings into strings file.
  • Style changes to meet HQRM standards.
  • MSFT_xDhcpClient:
  • Separated Localization strings into strings file.
  • Fix parameter descriptions in MOF file.
  • Style changes to meet HQRM standards.
  • MSFT_xDnsClientGlobalSetting:
  • Renamed Localization strings file to be standard naming format.
  • Moved ParameterList into a new file (MSFT_xDnsClientGlobalSetting.data.psd1).
  • Style changes to meet HQRM standards.
  • Removed New-TerminatingError function because never called.
  • Converted to remove Invoke-Expression.
  • MSFT_xDnsConnectionSuffix:
  • Separated Localization strings into strings file.
  • Style changes to meet HQRM standards.
  • MSFT_xHostsFile:
  • Renamed Localization strings file to be standard naming format.
  • Style changes to meet HQRM standards.
  • Refactored for performance
    • Code now reads 38k lines in > 1 second vs 4
  • Now ignores inline comments
  • Added more integration tests
  • MSFT_xIPAddress:
  • Separated Localization strings into strings file.
  • Style changes to meet HQRM standards.
  • MSFT_xNetAdapterBinding:
  • Separated Localization strings into strings file.
  • Style changes to meet HQRM standards.
  • MSFT_xNetAdapterRDMA:
  • Renamed Localization strings file to be standard naming format.
  • Style changes to meet HQRM standards.
  • MSFT_xNetBIOS:
  • Renamed Localization strings file to be standard naming format.
  • Style changes to meet HQRM standards.
  • MSFT_xNetConnectionProfile:
  • Separated Localization strings into strings file.
  • Style changes to meet HQRM standards.
  • MSFT_xNetworkTeam:
  • Style changes to meet HQRM standards.
  • MSFT_xNetworkTeamInterface:
  • Updated integration tests to remove Invoke-Expression.
  • Style changes to meet HQRM standards.
  • MSFT_xRoute:
  • Separated Localization strings into strings file.
  • Style changes to meet HQRM standards.
  • MSFT_xFirewall:
  • Converted to remove Invoke-Expression.
xPSDesiredStateConfiguration 5.2.0.0
  • xWindowsProcess
    • Minor updates to integration tests because one of the tests was flaky.
  • xRegistry:
    • Added support for forward slashes in registry key names. This resolves issue 285.
xSQLServer 5.0.0.0
  • Improvements how tests are initiated in AppVeyor
    • Removed previous workaround (issue 201) from unit tests.
    • Changes in appveyor.yml so that SQL modules are removed before common test is run.
    • Now the deploy step is no longer failing when merging code into Dev. Neither does the deploy step fail if a contributor has AppVeyor connected to a fork of xSQLServer and pushes code to the fork.
  • Changes to README.md
    • Changed the contributing section to help new contributors.
    • Added links for each resource so it is easier to navigate to the parameter list for each resource.
    • Moved the list of resources in alphabetical order.
    • Moved each resource parameter list into alphabetical order.
    • Removed old text mentioning System Center.
    • Now the correct product name is written in the installation section, and a typo was also fixed.
    • Fixed a typo in the Requirements section.
    • Added link to Examples folder in the Examples section.
    • Change the layout of the README.md to closer match the one of PSDscResources
    • Added more detailed text explaining which operating systems WMF 5.0 can be installed on.
    • Verified all resource schema files against the README.md and fixed some errors (descriptions were not verified).
    • Added security requirements section for resource xSQLServerEndpoint and xSQLAOGroupEnsure.
  • Changes to xSQLServerSetup
    • The resource no longer uses Win32_Product WMI class when evaluating if SQL Server Management Studio is installed. See article kb974524 for more information.
    • Now it uses CIM cmdlets to get information from WMI classes.
    • Resolved all of the PSScriptAnalyzer warnings that were triggered in the common tests.
    • Improvements for service accounts to enable support for Managed Service Accounts as well as other NT AUTHORITY accounts
    • Changes to the helper function Copy-ItemWithRoboCopy
      • Robocopy is now started using Start-Process and the error handling has been improved.
      • Robocopy now removes files at the destination path if they no longer exist at the source.
      • Robocopy copies using unbuffered I/O when available (recommended for large files).
    • Added a more descriptive text for the parameter SourceCredential to further explain how the parameter work.
    • BREAKING CHANGE: Removed parameter SourceFolder.
    • BREAKING CHANGE: Removed default value “$PSScriptRoot….” from parameter SourcePath.
    • Old code, that no longer filled any function, has been replaced.
      • Function ResolvePath has been replaced with [Environment]::ExpandEnvironmentVariables($SourcePath) so that environment variables still can be used in Source Path.
      • Function NetUse has been replaced with New-SmbMapping and Remove-SmbMapping.
    • Renamed function GetSQLVersion to Get-SqlMajorVersion.
    • BREAKING CHANGE: Renamed parameter PID to ProductKey to avoid collision with automatic variable $PID
  • Changes to xSQLServerScript
    • All credential parameters now also have the type [System.Management.Automation.Credential()] to work better with PowerShell 4.0.
    • It is now possible to configure two instances on the same node, with the same script.
    • Added to the description text for the parameter Credential describing how to authenticate using Windows Authentication.
    • Added examples to show how to authenticate using either SQL or Windows authentication.
    • A recent issue showed that there is a known problem running this resource using PowerShell 4.0. For more information, see issue #273
  • Changes to xSQLServerFirewall
    • BREAKING CHANGE: Removed parameter SourceFolder.
    • BREAKING CHANGE: Removed default value “$PSScriptRoot….” from parameter SourcePath.
    • Old code, that no longer filled any function, has been replaced.
      • Function ResolvePath has been replaced with [Environment]::ExpandEnvironmentVariables($SourcePath) so that environment variables still can be used in Source Path.
    • Adding new optional parameter SourceCredential that can be used to authenticate against SourcePath.
    • Solved PSSA rules errors in the code.
    • Get-TargetResource no longer returns $true when no products were installed.
  • Changes to the unit test for resource
    • xSQLServerSetup
      • Added test coverage for helper function Copy-ItemWithRoboCopy
  • Changes to xSQLServerLogin
    • Removed ShouldProcess statements
    • Added the ability to enforce password policies on SQL logins
  • Added common test (xSQLServerCommon.Tests) for xSQLServer module
    • Now all markdown files will be style checked when tests are running in AppVeyor after sending in a pull request.
    • Now all Examples will be tested by compiling to a .mof file after sending in a pull request.
  • Changes to xSQLServerDatabaseOwner
    • The example “SetDatabaseOwner” can now compile, it wrongly had a DependsOn in the example.
  • Changes to SQLServerRole
    • The examples “AddServerRole” and “RemoveServerRole” can now compile, it wrongly had a DependsOn in the example.
  • Changes to CONTRIBUTING.md
    • Added section “Tests for examples files”
    • Added section “Tests for style check of Markdown files”
    • Added section “Documentation with Markdown”
    • Added texts to section “Tests”
  • Changes to xSQLServerHelper
    • added functions
      • Get-SqlDatabaseRecoveryModel
      • Set-SqlDatabaseRecoveryModel
  • Examples
    • xSQLServerDatabaseRecoveryModel
      • 1-SetDatabaseRecoveryModel.ps1
    • xSQLServerDatabasePermission
      • 1-GrantDatabasePermissions.ps1
      • 2-RevokeDatabasePermissions.ps1
      • 3-DenyDatabasePermissions.ps1
    • xSQLServerFirewall
      • 1-CreateInboundFirewallRules
      • 2-RemoveInboundFirewallRules
  • Added tests for resources
    • xSQLServerDatabaseRecoveryModel
    • xSQLServerDatabasePermissions
    • xSQLServerFirewall
  • Changes to xSQLServerDatabaseRecoveryModel
    • BREAKING CHANGE: Renamed xSQLDatabaseRecoveryModel to xSQLServerDatabaseRecoveryModel to align with naming convention.
    • BREAKING CHANGE: The mandatory parameters now include SQLServer, and SQLInstanceName.
  • Changes to xSQLServerDatabasePermission
    • BREAKING CHANGE: Renamed xSQLServerDatabasePermissions to xSQLServerDatabasePermission to align with naming convention.
    • BREAKING CHANGE: The mandatory parameters now include PermissionState, SQLServer, and SQLInstanceName.
  • Added support for clustered installations to xSQLServerSetup
    • Migrated relevant code from xSQLServerFailoverClusterSetup
    • Removed Get-WmiObject usage
    • Clustered storage mapping now supports asymmetric cluster storage
    • Added support for multi-subnet clusters
    • Added localized error messages for cluster object mapping
    • Updated README.md to reflect new parameters
  • Updated description for xSQLServerFailoverClusterSetup to indicate it is deprecated.
  • xPDT helper module
    • Function GetxPDTVariable was removed since it no longer was used by any resources.
    • File xPDT.xml was removed since it was not used by any resources, and did not provide any value to the module.
  • Changes to the xSQLServerHelper module
    • Removed the globally defined $VerbosePreference = Continue from xSQLServerHelper.
    • Fixed a typo in a variable name in the function New-ListenerADObject.
    • Now Restart-SqlService will correctly show the services it restarts. Also fixed PSSA warnings.
xWebAdministration 1.17.0.0
  • Added removal of the self-signed certificate to the integration tests of xWebsite, fixes 276.
  • Added EnabledProtocols to xWebApplication.
  • Changed SSLFlags for xWebApplication to comma-separate multiple SSL flags, fixes 232.

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available in WMF 5.0) to find modules with DSC Resources:

# To list all modules that are part of the DSC Resource Kit
Find-Module -Tag DSCResourceKit

# To list all DSC resources from all sources
Find-DscResource

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name <module name>

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource
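Once a module is installed, a quick way to see a resource in action is to compile and apply a small configuration. Here is a minimal sketch using the xWebsite resource from xWebAdministration; the site name and paths are placeholders, so adjust them for your environment.

# Minimal sketch using xWebAdministration; site name and paths are placeholders.
Configuration SampleWebsite
{
    Import-DscResource -ModuleName xWebAdministration

    Node 'localhost'
    {
        xWebsite DefaultSite
        {
            Ensure       = 'Present'
            Name         = 'Default Web Site'
            State        = 'Started'
            PhysicalPath = 'C:\inetpub\wwwroot'
        }
    }
}

# Compile the configuration to a MOF file and apply it.
SampleWebsite -OutputPath 'C:\DscConfigs'
Start-DscConfiguration -Path 'C:\DscConfigs' -Wait -Verbose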

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the xCertificate module, go to:
https://github.com/PowerShell/xCertificate.

All DSC modules are also listed as submodules of the DscResources repository in the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Keim
Software Engineer
PowerShell Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

VIDEO: How to get started with technical public speaking!


On .NET is a weekly chat with team members from the .NET team at Microsoft. This week we put together something a little different, and honestly, I think it not only went really well, but it's an hour that provides a lot of value that goes well beyond .NET or any technology.

We put together a panel of folks at different points in their technical careers. Some just starting to speak publicly and some who've been doing it for 20+ years. Some introverts, some extroverts. Some with speaking or theater experience, others with none. And we talked!

We chatted about how to get started, where you can learn to speak on technical topics, how to form a story arc, how to best utilize your gifts, when to be critical and when to just chill.

It was great fun and included myself, Kendra Havens, Maria Naggaga Nakanwagi, Kasey Uhlenhuth, and Donovan Brown. You can view or download it here on Channel 9, or you can watch it on YouTube embedded below.

Let us know if this kind of content is useful, and if you want to see more in the future.


Sponsor: Big thumbs-up for Kendo UI! They published a comprehensive whitepaper on responsive web design and the best and fastest way to serve desktop and mobile web users in a tailored and cost-effective manner. Check it out!



© 2016 Scott Hanselman. All rights reserved.
     

VS Team Services Update – Jan 25


We have begun the process of deploying our sprint 112 work into production.  You will see the improvements show up in your account over the next week.  You can read the release notes for all the details in this deployment.

A few things worth highlighting…

This provides your first peek at our new Enterprise Agile “Delivery Plans” feature.  To get it, you actually need to go to the marketplace and install it into your account.  This feature is designed to enable you to look across teams and see how work is aligned.  This is still a very early preview and we have lots of plans to continue to evolve it but there’s enough functionality there for you to try it and start giving feedback.


The mobile work item forms that I first demoed in my session at Connect(); in November are now available in the service.  If you just click on a link to a work item from any notification email on a phone, you’ll get the new mobile web view.  It’s a tremendously better experience.  We have some work to finish optimizing work items but we have lots of other work on the backlog – like mobile pull requests views, etc.


We broke up the “administer” permission on Git repos into finer grained permissions.  I call this out because it’s been a common request – and one made in a comment on my last post.  Among other things, you can now give people permissions to create repos without giving them full administrative control over repos.

There’s plenty of other nice improvements too…

Much of this (though not all) will make it into the TFS 2017 Update 1 release.  We’ll be clear about what made it when we publish the release notes for RC2.

I hope you enjoy the update!

Thanks,

Brian


Loading data into Azure SQL Data Warehouse just got easier


Azure SQL Data Warehouse is a SQL-based, fully managed, petabyte-scale cloud solution for data warehousing. SQL Data Warehouse is highly elastic, enabling you to provision in minutes and scale capacity in seconds. You can scale compute and storage independently, allowing you to burst compute for complex analytical workloads or scale down your warehouse for archival scenarios, and pay based on what you're using instead of being locked into predefined cluster configurations.

Since announcing general availability in July 2016, we have continued to work on helping customers get data faster into their Data Warehouse to generate insights faster and grow their businesses further. Azure SQL Data Warehouse solves the data loading scenario via PolyBase, which is a feature built into the SQL Engine. It effectively leverages the entire Massively Parallel Processing (MPP) architecture of Azure SQL Data Warehouse to provide the fastest loading mechanism from Azure Blob Storage into the Data Warehouse. We recently shared how you can use Azure Data Factory Copy Wizard to load 1TB data in under 15 mins into Azure SQL Data Warehouse, at over 1.2 GB per second throughput.

To understand just how this works, let’s take a high-level look at the SQL Data Warehouse architecture. A SQL Data Warehouse is composed of a Control Node, which is where users connect and submit queries, and compute nodes, where processing occurs. Traditional loading tools load individual rows through the control node. The rows are then routed to the appropriate compute node depending on how the data is to be distributed. This can cause slower performance because the control node must read each record as they are received. PolyBase uses the compute nodes to load the data in parallel allowing for faster performance, resulting in quicker insights from your data.

Parallel Loading with PolyBase

UTF-16 support for delimited text files

To make it easier to load data into Azure SQL Data Warehouse using PolyBase, we have expanded our delimited text file format to support UTF-16 encoded files.

Support for UTF-16 encoded files is important because this is the default file encoding for BCP.exe. We’ve often seen that customers export their data from their on-premises data warehouse to Azure Blob Storage in UTF-16 format. In the past, it was then necessary to have a script re-encode the data into UTF-8 format, resulting in time-consuming processing and a duplication of data. Now, with UTF-16 supported, files can go directly from Azure Blob Storage into SQL Data Warehouse without encoding conversion.

How to import UTF-16 text files

To import UTF-16 files into SQL Data Warehouse with PolyBase, all you have to do is create a new file format with the encoding option set to ‘UTF16’. All of the additional format options, like field terminator, date format, and rejection values, are supported with both UTF-16 and UTF-8 encodings.

Below is an example of a pipe-delimited text file format that would read UTF-16 files.

[Screenshot: file format definition for pipe-delimited UTF-16 files]
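
Recreated as T-SQL, such a file format definition would look roughly like this; the format name and the options other than ENCODING are placeholders:

CREATE EXTERNAL FILE FORMAT PipeDelimitedText
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (
        FIELD_TERMINATOR = '|',
        STRING_DELIMITER = '"',
        DATE_FORMAT = 'yyyy-MM-dd HH:mm:ss',
        USE_TYPE_DEFAULT = TRUE,
        ENCODING = 'UTF16'  -- the new option; 'UTF8' remains the default
    )
);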

Next steps

In this blog post we discussed a bit about PolyBase, why it is the optimal data loading tool for SQL Data Warehouse, and our expanded support for UTF-16 encoded file formats. This is now available in all SQL Data Warehouse Azure regions worldwide. We encourage you to try it out if you are interested in moving your on-premises data warehouse into the cloud.

Learn more

What is Azure SQL Data Warehouse?

SQL Data Warehouse best practices

Load Data into SQL Data Warehouse

MSDN forum

Stack Overflow forum

Feature Requests

If you have any feature requests for Azure SQL Data Warehouse, we suggest connecting with the team via User Voice.

Our journey on building the Go SDK for Azure


Over the last few months, we've been busy adding new functionality to the Azure Go SDK and we'll keep doing so as we march towards public preview next year.

If you followed the recent changes on our GitHub repo, you probably noticed a few general improvements we made to the SDK:

Model Flattening

In the last release we added model flattening to many of our APIs (e.g. you can type resource.Sku.Family instead of resource.Properties.Sku.Family), which makes for more readable code.

Better error messages during parameter validation

During parameter validation, we enabled the SDK to return an error with the info needed to fix the JSON before sending the request out - making it easier to identify/correct potential coding mistakes.

For example, consider a scenario where a user wants to create a resource group. The location property is required for that operation, but the user forgets to include it in the request.

In previous SDK versions, the operation would fail inside Azure and the user would get the following error:

resources.GroupsClient#CreateOrUpdate: Failure responding to request: StatusCode=400 -- Original Error: autorest/azure: Service returned an error. Status=400 Code="LocationRequired" Message="The location property is required for this definition." 

In the latest SDK version, the user would get:

resources.GroupsClient#CreateOrUpdate: Invalid input: autorest/validation: validation failed: parameter=parameters.Location constraint=Null value=(*string)(nil) details: value can not be null; required parameter 
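
For reference, here is a minimal sketch of the kind of call that triggers this client-side validation; the package, client, and type names follow the autorest-generated resources package in this release of the SDK and may differ slightly in yours.

package main

import (
    "log"

    "github.com/Azure/azure-sdk-for-go/arm/resources/resources"
)

func createGroupWithoutLocation(subscriptionID string) {
    client := resources.NewGroupsClient(subscriptionID)
    // Authentication setup is omitted here; the point is that Location is
    // missing, so validation now fails locally instead of after a round trip
    // to Azure.
    if _, err := client.CreateOrUpdate("demo-rg", resources.Group{}); err != nil {
        log.Println(err) // autorest/validation: validation failed: parameter=parameters.Location ...
    }
}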

 

We also improved the coverage and functionality of the data plane of the SDK by adding support for file and directory manipulation, getting/setting ACLs on containers, working with the Storage Emulator, and various other storage blob and queue operations.

Some of the fixes and improvements added to the SDK have been provided by enthusiastic developers outside of our Microsoft team and we would like to extend our sincere gratitude and appreciation to everyone who sent us feedback and/or pull requests. We took note of your requests for better API coverage in the data plane, better documentation, release notes and samples, and we are making progress in incorporating them into our future releases.

Breaking changes

Speaking of future releases: while many API changes are expected to be additive in nature, some of the changes we are introducing will break existing clients. A recent example was issue 1559, which arose when we added parameter validation. In the near future, some methods and parameters may be added or deleted, parameter order may change, and structs may change as we consider model flattening for more APIs. This is part of the reason why we keep the 'beta' label on the Go SDK, and we are carefully examining every proposed change for alternatives that will not break existing functionality.

We’d like to thank in advance all of you who continue to use our Go SDK and send us feedback; we are committed to building the best experience for developers on our platform, and we’d like to make sure the changes have minimal impact on your development cycle as the SDK moves toward the more mature stages of public preview and GA (general availability).

We will use this blog to keep you updated on the progress and potential breaking changes, and we’ll give you a heads-up as we are approaching new milestones.
Have any suggestions for how to make the SDK better? We’d love to hear from you! Send us a PR or file an issue, and let’s talk!

Push data to Power BI streaming datasets without writing any code using Microsoft Flow

Today, I am happy to announce an exciting new update to the Power BI connector for Microsoft Flow. Coming hot on the heels of our data alert Flow trigger, we have added a new action which pushes rows of data to a Power BI streaming dataset.

Webinar for February 2: Accelerate your Retail Business with Analytics by Hitachi Solutions

A common challenge that businesses face today is creating something that improves on the value of traditional analytics while utilizing the new technologies at our disposal. Turning data into insights and action is the hallmark of a good business intelligence solution, and the increasingly rich Microsoft cloud technology landscape presents new opportunities for organizations to leverage. Attend the webinar, "Accelerate your Retail Business with Analytics" by Hitachi Solutions on February 2 to learn more!

Episode 115 on January Microsoft and community updates—Office 365 Developer Podcast


In episode 115 of the Office 365 Developer Podcast, Richard diZerega and Andrew Coates discuss Microsoft and dev community announcements in January.

Download the podcast.

Weekly updates

Got questions or comments about the show? Join the O365 Dev Podcast on the Office 365 Technical Network. The podcast is available on iTunes (search for “Office 365 Developer Podcast”), or you can subscribe directly to the RSS feed at feeds.feedburner.com/Office365DeveloperPodcast.

About the hosts

Richard diZerega is a software engineer in Microsoft’s Developer Experience (DX) group, where he helps developers and software vendors maximize their use of Microsoft cloud services in Office 365 and Azure. Richard has spent a good portion of the last decade architecting Office-centric solutions, many of which span Microsoft’s diverse technology portfolio. He is a passionate technology evangelist and a frequent speaker at worldwide conferences, trainings and events. Richard is highly active in the Office 365 community, a popular blogger at aka.ms/richdizz, and can be found on Twitter at @richdizz. Richard was born and raised in Dallas, TX, where he is still based, but works on a worldwide team based in Redmond. Richard is an avid builder of things (BoT), musician and lightning-fast runner.

 

A civil engineer by training and a software developer by profession, Andrew Coates has been a Developer Evangelist at Microsoft since early 2004, teaching, learning and sharing coding techniques. During that time, he’s focused on .NET development on the desktop, in the cloud, on the web, on mobile devices and most recently for Office. Andrew has a number of apps in various stores and generally has far too much fun doing his job to honestly be able to call it work. Andrew lives in Sydney, Australia with his wife and two almost-grown-up children.

Useful links

StackOverflow

Yammer Office 365 Technical Network


Bulk changing virtual hard disk path


I received this in email today:

“I have XCOPY’d a bunch of VHDX files from one volume to another on WS2016.    What’s the easiest / fastest way to fix up the paths for the VM’s???”

The answer to this is quite simple.  Open PowerShell and run:

$oldPath = "C:\Users\Public\Documents\Hyper-V\Virtual Hard Disks"
$newPath = "D:"
get-vm | Get-VMHardDiskDrive | ? path -Like $path* | %{Set-VMHardDiskDrive -VMHardDiskDrive $_ -Path $_.path.Replace($oldPath, $newPath)}
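
To sanity-check the result, you can list the disk paths for every VM before and after running the snippet:

Get-VM | Get-VMHardDiskDrive | Select-Object VMName, ControllerType, ControllerNumber, Path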

A couple of details on this answer:

  1. PowerShell is wonderful for these kinds of bulk operations
  2. While Hyper-V Manager does not let you edit the virtual hard disk path on a saved virtual machine, you can do this through PowerShell.  In fact, there are a lot of things that are blocked in Hyper-V Manager but possible through PowerShell.

Cheers,
Ben

Team Services Process Customization Roadmap (Jan 2017)


Work items in Visual Studio Team Services can be customized to meet the needs of your individual organization. Today, project administrators can add/remove fields to a work item form, change the way fields are displayed on a form, define states that your work item can move through, and define your own custom work item types.

This blog post gives you a sneak peek at the next set of customizations that we plan to bring to Team Services.

As always, the timelines and designs shared in this post are subject to change.

Q1 2017 (planned)
  • Add custom backlog levels
  • Improved navigation of process customization

Q2 2017 (planned)
  • Define business rules for work item types
  • Add identity fields to work item types
  • REST API support for customization

Custom backlog levels

When you create a project with any of our processes (Agile, Scrum or CMMI), each team gains access to one product backlog (Stories, Backlog items, or Requirements) and two portfolio backlogs (Epics and Features). Small organizations typically find these two portfolio levels sufficient for their needs. However, large organizations have expressed the need to add more portfolio backlogs.

With custom backlog levels, you can add new portfolio backlog levels above Epics.

In the following image, we add a portfolio backlog called “Themes” above Epics, and associate the custom work item type “Theme” with this new backlog level.
[Screenshot: adding a “Themes” portfolio backlog level above Epics]

Improved navigation for process customization

We’re updating the navigation involved in managing a process.  The main change you’ll see is how you choose and modify a work item type. You’ll choose the work item type from a drop-down menu and modify it from the form layout. You no longer have to select the area you want to customize. From the form, you can perform all actions, including adding and editing fields, modifying states, and later, maintaining business rules.

[Screenshot: improved process customization navigation]

Rules for work item types

With rules, you can set default values, clear entries, or restrict changes. With conditional rules, you can restrict a rule to run only on a specific state change or when a field has a specific value.

When you define a rule, you first choose what needs to happen (the action), and then when the rule should execute (the condition).

[Screenshot: defining a business rule]

Adding Identity fields

We currently support the most important field types such as text box, checkbox, and picklist. The number one request we’ve received is to also support custom identity fields, just like the Assigned To field.

Custom identity fields will render the same control as the Assigned To field, with the identity picker and avatar image.

[Screenshot: a custom identity field with the identity picker]

REST API support for customization

Today we have a few APIs available to get a list of work item types, but we know you need more. If you are building on top of Team Services and TFS, you want to get the list of all processes in the collection and a list of all work item types in a process. You also want to know the fields and states of a work item type, including the state color. And you not only want to retrieve that data, you also want to update that configuration.

The next set of process REST APIs will deliver these capabilities to you.

Various process models

Depending on whether you host TFS on your own servers (on-premises) or on how your account was created in Team Services (cloud), you will be using one of these process models:

  1. Inheritance
  2. Hosted XML
  3. On-premises XML

[Diagram: the three process models]

Inheritance process model (Team Services)

All new accounts created in Team Services use the Inherited process model.  The customizations and roadmap described in this post apply only to this process model.

To customize your project, you first create an inherited process from one of the system processes (Agile, CMMI or Scrum), and migrate the project to the new inherited process. A change to your inherited process immediately affects all team projects which use that process.

See our documentation to learn more about the customization of inherited processes.

Hosted XML process model (Team Services)

At the Connect() event last November, we announced the migration from TFS to VSTS.  Accounts imported from TFS to Team Services use this process model.

This model is very similar to the on-premises XML process model. The major difference is where the metadata for the team project is stored. Team projects that use the hosted XML process model read their metadata from the process. A change to the process will affect all its team projects too.

To customize a process in this model, you download the process as a zip file, make the changes locally, and then upload the full process to apply these changes.

On-premises XML process model (TFS)

This process model is the traditional model used by all on-premises installations. As opposed to the other two process models, the metadata of a team project is not read from the process. When you create a team project, the process metadata is copied to the team project. As a result, any change to the process won’t affect existing team projects. Instead, you modify the metadata of the project using the command-line tool called witadmin.
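
For example, updating a work item type definition on-premises is typically an export, edit, import round trip with witadmin; the collection URL and project name below are placeholders:

witadmin exportwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /n:Bug /f:Bug.xml
rem ...edit Bug.xml locally, then push the change back to the project...
witadmin importwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /f:Bug.xml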

Compare process models

The three process models (Inheritance and Hosted XML in Team Services, On-premises XML in TFS) differ along the following dimensions:

  • Inherit system process changes (Agile, Scrum, CMMI)
  • Whether process changes affect team projects
  • WYSIWYG editor
  • Use of witadmin to edit team projects
  • REST API (read)
  • REST API (write)
  • Create custom processes
  • Advanced customizations (global workflow, custom link types, global lists)

Future

We have received many requests to bring the Inheritance process model to on-premises. We are working through our plan for on-premises now, and expect to provide an update in the spring of 2017.

Keep the feedback coming!

Ewald Hofman


Use Windows Server 2016 and software-defined networking to build a better network


Windows Server Program Manager Greg Cusanza joins Matt McSpirit to demonstrate the software-defined networking (SDN) capabilities in Windows Server 2016. Watch as Greg explains those capabilities and shows you how to use SDN to dynamically create, secure, and connect your network to meet the evolving needs of your apps, speed up the deployment of your workloads, and contain security vulnerabilities from spreading across your network, all while reducing your overall infrastructure costs.

When you use Windows Server 2016 and SDN, you gain access to a level of agility that allows you to extend the capabilities of your existing physical network. The SDN sits on top of your physical network infrastructure and virtualizes the network and its services so that management is simpler and networks become application-specific. Apps can evolve as quickly, or become as complex, as developers want them to, because the SDN can isolate resources and eliminate shared dependencies. This connectivity flexibility doesn’t skimp on security either. Beyond isolating networks from one another, and beyond what you can do with VLANs or vSwitches, you can use policy statements to control communication channels within a network using micro-segmentation practices.

Greg pulls this all together with an example to make it concrete: he uses a two-tiered web app and an accompanying PowerShell helper script. The demonstration shows the ease with which an administrator can scale out the app tiers, add network services, and use the network controller to automate all the changes. The demonstration expands to show how the built-in services can be extended by adding third-party network services such as firewalls or WAN optimization appliances; any virtual appliance that works with Hyper-V can be integrated into the environment.

After you watch the video above, explore more resources on SDN and try the virtual lab for SDN.

Microsoft Mechanics show features software-defined networking with Windows Server 2016


Take a coffee break and watch our new 10-minute video on software-defined networking with Windows Server 2016. Join Windows Server program manager Greg Cusanza as he demonstrates how you can use software-defined networking (SDN) to dynamically create, secure and connect your network to:

  • Meet the evolving needs of your applications
  • Speed deployment of workloads
  • Contain security vulnerabilities from spreading across your networks
  • All while reducing your overall infrastructure costs.

Grab a cup of coffee and click here to check it out.

Introducing the Host Compute Service (HCS)


Summary

This post introduces a low level container management API in Hyper-V called the Host Compute Service (HCS).  It tells the story behind its creation, and links to a few open source projects that make it easier to use.

Motivation and Creation

Building a great management API for Docker was important for Windows Server Containers.  There’s a ton of really cool low-level technical work that went into enabling containers on Windows, and we needed to make sure they were easy to use.  This seems very simple, but figuring out the right approach was surprisingly tricky.

Our first thought was to extend our existing management technologies (e.g. WMI, PowerShell) to containers.  After investigating, we concluded that they weren’t optimal for Docker, and started looking at other options.

Next, we considered mirroring the way Linux exposes containerization primitives (e.g. control groups, namespaces, etc.).  Under this model, we could have exposed each underlying feature independently, and asked Docker to call into them individually.  However, there were a few questions about that approach that caused us to consider alternatives:

  1. The low level APIs were evolving (and improving) rapidly.  Docker (and others) wanted those improvements, but also needed a stable API to build upon.  Could we stabilize the underlying features fast enough to meet our release goals?
  2. The low level APIs were interesting and useful because they made containers possible.  Would anyone actually want to call them independently?

After a bit of thinking, we decided to go with a third option.  We created a new management service called the Host Compute Service (HCS), which acts as a layer of abstraction above the low level functionality.  The HCS was a stable API Docker could build upon, and it was also easier to use.  Making a Windows Server Container with the HCS is just a single API call.  Making a Hyper-V Container instead just means adding a flag when calling into the API.  Figuring out how those calls translate into actual low-level implementation is something the Hyper-V team has already figured out.

[Diagrams: container management architecture on Linux and on Windows with the HCS]

Getting Started with the HCS

If you think this is nifty and would like to play around with the HCS, here’s some information to help you get started.  Instead of calling our C API directly, I recommend using one of the friendly wrappers we’ve built around the HCS.  These wrappers make it easy to call the HCS from higher level languages, and are released open source on GitHub.  They’re also super handy if you want to figure out how to use the C API.  We’ve released two wrappers thus far.  One is written in Go (and used by Docker), and the other is written in C#.
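
To give a feel for the “single API call” mentioned above, here is a rough sketch using the Go wrapper. The function and field names follow the hcsshim package, but treat the exact configuration shape as an approximation rather than a reference.

package main

import "github.com/Microsoft/hcsshim"

func startContainer(id string, layers []hcsshim.Layer, sandbox string, hyperv bool) error {
    cfg := &hcsshim.ContainerConfig{
        SystemType:      "Container",
        Name:            id,
        LayerFolderPath: sandbox, // the writable sandbox layer
        Layers:          layers,  // read-only image layers
        HvPartition:     hyperv,  // flip this flag to get a Hyper-V Container instead
    }
    container, err := hcsshim.CreateContainer(id, cfg)
    if err != nil {
        return err
    }
    return container.Start()
}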

You can find the wrappers here:

If you want to use the HCS (either directly or via a wrapper), or you want to make a Rust/Haskell/InsertYourLanguage wrapper around the HCS, please drop a comment below.  I’d love to chat.

For a deeper look at this topic, I recommend taking a look at John Stark’s DockerCon presentation: https://www.youtube.com/watch?v=85nCF5S8Qok

John Slack
Program Manager
Hyper-V Team

Phishers unleash simple but effective social engineering techniques using PDF attachments


The Gmail phishing attack is reportedly so effective that it tricks even technical users, but it may be just the tip of the iceberg. We’re seeing similarly simple but clever social engineering tactics using PDF attachments.

These deceitful PDF attachments are being used in email phishing attacks that attempt to steal your email credentials. Apparently, the heightened phishing activity that we have come to expect every year during the holiday season has not subsided.

Unlike in other spam campaigns, the PDF attachments we are seeing in these phishing attacks do not contain malware or exploit code. Instead, they rely on social engineering to lead you on to phishing pages, where you are then asked to divulge sensitive information.

At the Microsoft Malware Protection Center, we continuously monitor the threat landscape for threats such as these PDF files that arrive via email and execute their payload from the web. We do this not only so we can create security solutions for the latest threats, but also so we understand cybercriminals’ newest schemes and can warn customers.

Awareness is an effective weapon against social engineering. We’re sharing some examples of these PDF attachments, including one that spoofs Microsoft Office, so you are armed with knowledge that you can use to detect these social engineering attacks.

Example 1: You received a document that Adobe Reader can’t display because it’s a protected Excel file, so you need to enter your email credentials

Attachment file type: PDF
Filename: Quote.pdf
Info stolen: Email credentials
Windows Defender detection: Trojan:Win32/Pdfphish.BU

One example of the fraudulent PDF attachments is carried by email messages that pretend to be official communication, for instance, a quotation for a product or a service, from a legitimate company. These email messages may spoof actual people from legitimate companies in order to fake authenticity.


When you open the attachment, it’s an actual PDF file that is made to look like an error message. It contains an instruction to “Open document with Microsoft Excel”, which is actually a link to a website.

[Screenshot: the PDF attachment posing as an error message]

Clicking the link opens your browser and brings you to a website, where the social engineering attack continues with a message that the document is protected because it is confidential, and therefore you need to sign in with your email credentials.

[Screenshot: phishing page claiming the document is confidential and asking you to sign in]

If you’re using Microsoft Edge, Microsoft SmartScreen will block this website, stopping the phishing attack.

[Screenshot: Microsoft SmartScreen in Microsoft Edge blocking the phishing site]

However, if you’re using a browser that does not block the website and you click OK, you are led to the phishing site, which asks you to enter your email address and password. The website is designed to appear as though you are opening an Excel file, and it goes to great lengths to mimic Microsoft Excel Online, but what you see on the site is not an Excel file, just an image.

[Screenshot: fake Excel Online sign-in page]

If you fall for this social engineering trick and enter your details, you are redirected to the site below, which says you entered your details incorrectly. But at this point, the attackers will have your email credentials. Once they have access to your email, the attackers can launch further phishing attacks against your contacts, or gain access to your social networking, online banking, or online gaming accounts.

[Screenshot: page claiming the details were entered incorrectly]

Example 2: You received a PDF file from Dropbox and need to log in using your email credentials

Attachment file type: PDF
Filename: ScannedbyXerox.pdf
Info stolen: Gmail, Outlook, AOL, Yahoo!, Office 365 credentials
Windows Defender detection: PWS:HTML/Misfhing.B

Another example of these PDF attachments puts on the pretense that you need to sign in to the online storage provider Dropbox to access your document. Just like the first example, this PDF document does not contain malicious code, but it does contain a link to “View .PDF online”.

[Screenshot: PDF attachment with a “View .PDF online” link]

Clicking the link takes you to a fake Dropbox login page that gives you options to sign in using your Google, Outlook, AOL, Yahoo!, Office 365 or other email credentials.

[Screenshot: fake Dropbox login page with multiple sign-in options]

Microsoft Edge users are protected from this threat: Microsoft SmartScreen stops this phishing attack from loading or serving further offending pages.

[Screenshot: Microsoft SmartScreen in Microsoft Edge blocking the phishing site]

On the phishing page, each option is tailored to look like a legitimate email sign-in page. For example, clicking the Office 365 option brings up a window that may look authentic to an untrained eye.

[Screenshot: fake Office 365 sign-in window]

It’s the same level of customization for the other options. For example, for the Google option, the window first asks you to choose whether you’d like to sign in using your organizational or individual account. This step is not present in the actual Google sign-in process, but it may be done to help the attackers identify business-related account credentials. It then brings up the sign-in page.

[Screenshots: fake Google account-type prompt and sign-in page]

If you enter your details, an actual PDF document (hosted in Google Drive, not Dropbox) is opened in a window.

[Screenshot: decoy PDF document opened after the credentials are submitted]

As part of the social engineering tactic, this is done so you don’t immediately suspect you were phished. By this time, the attackers will have your credentials. This last step can buy them more time to use your credentials before you realize you need to change your password.

Other examples: Enter your email credentials to access or download your file

We have seen other examples of PDF files being distributed via email and exhibiting the same characteristics. Just like the first two cases, these PDF files don’t contain malicious code, apart from a link to a phishing site. All of them carry the message that you need to enter your email credentials so that you can view or download the document. All of these attachments are detected as variants of Trojan:Win32/Pdfphish.

[Screenshots: additional examples of PDF phishing attachments]

How to stay safe from phishing attacks

As we saw from these examples, social engineering attacks are designed to take advantage of possible lapses in decision-making. Awareness is key; that is why we’re making these cybercriminal tactics known.

Don’t open attachments or click links in suspicious emails. Even if the emails came from someone you know, if you are not expecting the email, be wary about opening the attachment, because spam and phishing emails may spoof the sender.

In these times, when we’re seeing heightened phishing attacks with improved social engineering techniques, a little bit of paranoia doesn’t hurt. For instance, question why Adobe Reader is trying to open an Excel file. Ask why Dropbox is requiring you to enter your email credentials, not your Dropbox account credentials.

For more information, download and read this Microsoft e-book on preventing social engineering attacks, especially in enterprise environments.

Using a secure platform like Windows 10 will let you take advantage of security features that can help identify and stop phishing attacks:

  • Microsoft Edge is a secure browser that can block phishing sites and other malicious websites using Microsoft SmartScreen
  • Windows Defender can detect and block malicious PDF attachments and other malicious code
  • Office 365 has built-in content security features that can block spam and phishing emails

 

Alden Pornasdoro

MMPC

Update to Visual Studio 2017 Release Candidate


Today we have another update to Visual Studio 2017 Release Candidate.

Take a look at the Visual Studio 2017 Release Notes and Known Issues for the full list of what’s available in this update, but here’s a summary:

  • The .NET Core and ASP.NET Core workload is no longer in preview. We have fixed several bugs and improved usability of .NET Core and ASP.NET Core Tooling.
  • Team Explorer connect experience is now improved to make it easier to find the projects and repos to which you want to connect.
  • The Advanced Save option is back due to popular demand.
  • Multiple installation-related issues are now fixed in this update, including hangs. We’ve also added a retry button when installation fails, disambiguated Visual Studio installs in the Start menu, and added support for creating a layout for offline install.

Apart from these improvements, you’ll notice that we’ve removed the Data Science and Python Development workloads. As we’ve been closing in on the VS release, some of the components weren’t going to meet all the release requirements, such as translation to non-English languages. They’ll reappear soon as separate downloads. F# is still available in the .NET Desktop and .NET Web development workloads.

Please try this latest update and share your feedback. For problems, let us know via the Report a Problem option in the upper right corner of the VS title bar. Track your feedback on the developer community portal. For suggestions, let us know through UserVoice.

John Montgomery, Director of Program Management for Visual Studio

@JohnMont is responsible for product design and customer success for all of Visual Studio, C++, C#, VB, .NET and JavaScript. John has been at Microsoft for 18 years working in developer technologies.
