Channel: TechNet Technology News

Been shopping lately? Fake credit card email can spook you into downloading Cerber ransomware


As shopping sprees become increasingly frenetic during the holiday season, it’s hard not to worry about how much credit card debt we’re piling up. Some of us rely on email notifications from our banks to track the damage to our finances. So what happens when we suddenly get notified about charges for things we never bought?

Microsoft security researchers have received samples of personalized emails that appear to be MasterCard notifications. Although not without flaws, these emails can be very effective malware vectors—they can trigger an urgent need to act and open the attached payload.

The payload is a macro downloader embedded in a Word document. Starting with Office 2010, documents from untrusted sources are displayed in Protected View and macros are disabled by default. To overcome this security measure, the malware authors crafted the contents of the attached Word document so that unsuspecting users are convinced about enabling macros to see supposedly important content.

As seen in the screenshot below, the Word document provides step-by-step instructions telling users to leave Protected View and enable macros. One should note that legitimate notifications from MasterCard and other credit card companies do not ask recipients to enable macros.

Figure 1. Instructions in the attached document about enabling macros; these instructions are not from Microsoft

Once the macro is allowed to run, it downloads and launches Cerber, a known ransomware family. Recipients who lack robust antimalware protection are bound to learn a potentially pricey lesson in computing safety.

Engineering an urgent response

Although some aspects of the socially engineered emails are weak, they do have some strong points:

  • Urgency—by stating that the recipient is being billed, the attack emails can trick unsuspecting users into opening the malicious attachment without consideration for their safety.
  • Convincing workaround instructions—when the attached Word document is opened, it displays well-formatted and well-written instructions on how to enable macros, tricking recipients into facilitating payload detonation. The instructions are made to appear like help content from Microsoft and even have feedback buttons that appear functional.

Below is a recreation of one of the sample messages received by Microsoft security researchers. It has been modified to protect the original recipient.

Figure 2. Recreated attack email (original recipient information has been anonymized)

Social engineering flaws

There are some social engineering flaws in the attack emails. In our sample, the sender address does not spoof MasterCard or a bank, making the message much less convincing. The apparent use of automated code to copy the recipient’s email local-part into the salutation and into the file name of the attached document is another giveaway. We do concede, however, that this simple attempt at personalization can work and is in fact employed in attacks associated with the highly prevalent Ransom:Win32/Locky.

The email itself is crude and shows almost no attempt to feign legitimacy. It contains some typographical errors, such as the missing number between the dollar sign and the comma in our sample. Also, users who are careful enough will likely notice that the sender address does not match the signatory.

Figure 3. Social engineering flaws in the attack email

Scanner evasion and anonymization

On the technical side, the use of a password-protected Word document allows the embedded macro code to avoid detection by many email scanners. Without password protection, the macro code is easily detected by antimalware engines. (Microsoft detects the macro code in our samples as TrojanDownloader:O97M/Donoff.CU.) To an extent, password protection also makes the attachment appear legitimate—many bank documents are typically transmitted as password-protected files.

When our researchers detonated the payload by opening the attached document and enabling macros, the embedded macro code began downloading a variant of a known ransomware from the following URL:

hxxps://2cbhkcjhn5suq6t6.onion.to/explore.exe

This URL is a hidden Tor web location made available to all web browsers by Tor2web. Hidden Tor web locations allow publishers to stay anonymous and protected from political persecution. However, this anonymity can also be abused by criminals.

Once the download completes, the macro runs PowerShell commands to launch the downloaded ransomware.

Classic case of ransomware

The ransomware component is a variant of Ransom:Win32/Cerber. Like most ransomware, Cerber encrypts files to render them inaccessible. Unfortunate users who detonate the macro end up with a lot of encrypted files as shown below. Note that the extension name of the encrypted files is not static—Cerber uses a pseudorandom extension.

Figure 4. Inaccessible user files encrypted by Cerber

Cerber behavior has not changed much compared to earlier versions. After encrypting the files, Cerber attempts to collect ransom by opening a window that displays its ransom note.

Figure 5. Cerber ransom note

As an additional reminder to its victims, Cerber modifies the desktop wallpaper:

Figure 6. Cerber wallpaper serves a painful reminder to victims

In the ransom note, users are reassured that their files are intact and are told to purchase the Cerber Decryptor from a list of URLs. Victims who do not purchase the decryption tool are left unable to access the contents of their files.

Victims who do go to the URLs find the same features that the scammers have had on their website since the early versions of Cerber:

  • Support for multiple languages, including several European languages, Chinese, and Japanese
  • An image CAPTCHA mechanism to prevent robots from using the site
  • Special rates for those who purchase the decryption tool in the next few days

Below are screenshots of the ransomware website.

Figure 7. Language options on the ransomware website

Figure 8. Anti-robot CAPTCHA on the ransomware website

Figure 9. Special rate countdown on the ransomware website

Be safe and save

An effective way to avoid this ransomware attack is to be extremely wary of unsolicited emails and emails coming from unknown sources. Check the sender name and consider contacting the company or institution represented by the unsolicited email to verify the email’s authenticity.

Ransomware may also come from other sources, including pirated software and along with legitimate applications that have been repackaged inside a software bundler. Obtain software from trustworthy sources, such as the Windows Store, or directly from the software vendor’s website.

Microsoft recommends running robust antimalware, like Windows Defender, to help stop ransomware and other malicious code from causing irreversible or costly damage. Windows Defender uses behavioral heuristics—it actively checks for suspicious behavior and references advanced algorithms in the cloud. By using behavioral heuristics, Windows Defender can detect ransomware even before specific signatures become available.

The screenshot below shows Windows Defender detecting Cerber ransomware using only behavioral heuristics.

Figure 10. Windows Defender behavior-based proactive detection of Cerber ransomware

Here are some more tips:

For end users

  • Use an up-to-date, real-time antimalware product, such as Windows Defender for Windows 10.
  • Think before you click. Do not open emails from senders you don’t recognize. Upload any suspicious files here: https://www.microsoft.com/en-us/security/portal/submission/submit.aspx. This campaign spoofs MasterCard, but it can easily be modified to spoof banks and other credit card providers. The attachment is a Word document, a commonly distributed file type. Be mindful of documents that instruct you to enable macros—it’s very possible that they contain malicious macros.

For IT administrators


VMware Monitoring Solution updates


Hi, this is Keiko. We now support VMware vSphere 6.5 in the OMS VMware Monitoring Solution. For a recap, here’s the VMware monitoring with OMS blog.

We were at the HPE Discover Conference 2016 at ExCeL London last week, where more than 13,000 customers attended. In the keynote session, the OMS Container and VMware solutions were presented and well received.

Microsoft Azure Dashboard

We are looking for people who would like to give us direct feedback for our next planning session for this solution.

Please be specific about what you want:

For example: Add performance capabilities

  • Tell us the capacity this is for, such as VM, datasets, FS, etc.
  • What performance metrics are you looking for? (bandwidth, IOPS, etc.)

How to contact us for feedback

Still haven’t tried OMS?

Get a free Microsoft Operations Management + Security (#MSOMS) subscription so that you can try the VMware Monitoring Solution features. You can also get a free subscription for Microsoft Azure.

If you just want to see an online, fully interactive demo of OMS, go to the OMS Experience Center.

Keiko Harada
Program Manager
Microsoft Operations Management Team

Announcing the Sales Management Solution Template for Dynamics 365 with Data Export

Back in May, we announced the sales management solution template, which simplified and accelerated building powerful and compelling Power BI solutions on Dynamics CRM (now Dynamics 365). The template offered a fast, guided experience for creating compelling reports on an extensible, scalable, and secure architecture that could be customized however you needed. This meant that instead of spending your time on plumbing, you could spend it on extending and customizing the solution template to meet your organization’s needs. Today, I’m pleased to announce the integration of Dynamics CRM Data Export with the sales management solution template for Dynamics 365.

Milliman chooses Power BI Embedded for their Integrate application

Today, we’re excited to share how Milliman, a global market leader in actuarial products and services, has integrated Power BI Embedded into its solutions. I want to welcome Paul Maher, Principal and Chief Technology Officer of the Life Technology Solutions Practice at Milliman.

Use the Cloud to help people in need

During the holiday season, people’s thoughts turn to helping those less fortunate. If you’re a charity, helping people is your business all year round, and every dollar spent on computing infrastructure…

Introducing Change Feed support in Azure DocumentDB


 We’re excited to announce the availability of Change Feed support in Azure DocumentDB! With Change Feed support, DocumentDB provides a sorted list of documents within a DocumentDB collection in the order in which they were modified. This feed can be used to listen for modifications to data within the collection and perform actions such as:

  • Trigger a call to an API when a document is inserted or modified
  • Perform real-time (stream) processing on updates
  • Synchronize data with a cache, search engine, or data warehouse

DocumentDB's Change Feed is enabled by default for all accounts, and does not incur any additional costs on your account. You can use your provisioned throughput in your write region or any read region to read from the change feed, just like any other operation from DocumentDB.

In this blog, we look at the new Change Feed support, and how you can build responsive, scalable and robust applications using Azure DocumentDB.

Change Feed support in Azure DocumentDB

Azure DocumentDB is a fast and flexible NoSQL database service that is used for storing high-volume transactional and operational data with predictable single-digit millisecond latency for reads and writes. This makes it well-suited for IoT, gaming, retail, and operational logging applications. These applications often need to track changes made to DocumentDB data and perform various actions like update materialized views, perform real-time analytics, or trigger notifications based on these changes. Change Feed support allows you to build efficient and scalable solutions for these patterns.

Many modern application architectures, especially in IoT and retail, process streaming data in real-time to produce analytic computations. These application architectures (“lambda pipelines”) have traditionally relied on a write-optimized storage solution for rapid ingestion, and a separate read-optimized database for real-time query. With support for Change Feed, DocumentDB can be utilized as a single system for both ingestion and query, allowing you to build simpler and more cost effective lambda pipelines. For more details, read the paper on DocumentDB TCO.

 


Stream processing: Stream-based processing offers a “speedy” alternative to querying entire datasets to identify what has changed. For example, a game built on DocumentDB can use Change Feed to implement real-time leaderboards based on scores from completed games. You can use DocumentDB to receive and store event data from devices, sensors, infrastructure, and applications, and process these events in real-time with Azure Stream Analytics, Apache Storm, or Apache Spark using Change Feed support.

Triggers/event computing: You can now perform additional actions like calling an API when a document is inserted or modified. For example, within web and mobile apps, you can track events such as changes to your customer's profile, preferences, or location to trigger certain actions like sending push notifications to their devices using Azure Functions or App Services.

Data Synchronization: If you need to keep data stored in DocumentDB in sync with a cache, search index, or a data lake, then Change Feed provides a robust API for building your data pipeline. Change feed allows you to replicate updates as they happen on the database, recover and resume syncing when workers fail, and distribute processing across multiple workers for scalability.

 

Working with the Change Feed API

Change Feed is available as part of REST API 2016-07-11 and SDK versions 1.11.0 and above. See Change Feed API for how to get started with code.

 


The change feed has the following properties:

  • Changes are persistent in DocumentDB and can be processed asynchronously.
  • Changes to documents within a collection are available immediately in the change feed.
  • Each change to a document appears only once in the change feed, and only the most recent change for a given document is included. Intermediate changes may not be available.
  • The change feed is sorted by order of modification within each partition key value. There is no guaranteed order across partition-key values.
  • Changes can be synchronized from any point-in-time, that is, there is no fixed data retention period for which changes are available.
  • Changes are available in chunks of partition key ranges. This capability allows changes from large collections to be processed in parallel by multiple consumers/servers.
  • Applications can read multiple change feeds on the same collection simultaneously.
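
The semantics above can be sketched with a small, self-contained Python model. This is purely illustrative (the `ChangeFeedModel` class is hypothetical and is not the DocumentDB SDK); it mirrors two of the listed properties: per-partition modification order, and only the latest change per document being exposed.

```python
from collections import OrderedDict

class ChangeFeedModel:
    """Toy model of the change feed semantics described above (not the real SDK)."""

    def __init__(self):
        # partition key -> OrderedDict of document id -> latest document version
        self._feeds = {}

    def upsert(self, partition_key, doc_id, doc):
        feed = self._feeds.setdefault(partition_key, OrderedDict())
        # Removing and re-inserting moves the document to the end of the
        # ordering, so the feed exposes only the document's most recent
        # change, in modification order (intermediate changes are dropped).
        feed.pop(doc_id, None)
        feed[doc_id] = doc

    def read_feed(self, partition_key):
        # Sorted by order of modification within one partition key value;
        # no order is guaranteed across partition key values.
        return list(self._feeds.get(partition_key, {}).items())

feed = ChangeFeedModel()
feed.upsert("device-1", "a", {"temp": 20})
feed.upsert("device-1", "b", {"temp": 21})
feed.upsert("device-1", "a", {"temp": 25})  # supersedes the first change to "a"
print(feed.read_feed("device-1"))
# "b" now precedes "a": only a's latest change remains, in modification order
```

A consumer reading such a feed sees each document at most once per pass, which is what makes the pattern cheap compared with re-querying the whole collection.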

Next Steps

In this blog post, we looked at the new Change Feed support in Azure DocumentDB.

Support Tip: WSUS Server Cleanup Wizard task “Delete computers that have not contacted the server in 30 days or more” fails with connection error


~ Author: Moni R S

Hello Everyone, my name is Moni and I am a Support Engineer on the Windows Devices and Deployment team here at Microsoft. In this post, I’ll be discussing an issue where the Server Cleanup Wizard for WSUS 3.0 Service Pack 2 times out when attempting to delete computers that have not contacted the server in 30 days or more.

Background

The following screenshot shows what the WSUS Server Cleanup Wizard looks like in WSUS 3.0 SP2:

[Screenshot: WSUS Server Cleanup Wizard]

We could execute all the tasks except the second one: Delete computers that have not contacted the server in 30 days or more. This task was timing out, resulting in a loss of WSUS console connectivity.

[Screenshot: WSUS console connectivity error]

Details of the error are as follows:

The WSUS administration console was unable to connect to the WSUS Server Database.
Verify that SQL server is running on the WSUS Server. If the problem persists, try restarting SQL.
System.Data.SqlClient.SqlException — Access to table dbo.tbDownstreamServerClientSummaryRollup is blocked because the signature is not valid.
Source
.Net SqlClient Data Provider
Stack Trace:
at System.Windows.Forms.Control.MarshaledInvoke(Control caller, Delegate method, Object[] args, Boolean synchronous)
at System.Windows.Forms.Control.Invoke(Delegate method, Object[] args)
at Microsoft.UpdateServices.UI.SnapIn.Wizards.ServerCleanup.ServerCleanupWizard.OnCleanupComplete(Object sender, PerformCleanupCompletedEventArgs e)

It appears that access to the table dbo.tbDownstreamServerClientSummaryRollup is blocked because the signature is not valid, thereby resulting in a timeout while attempting to delete obsolete computers.

We attempted to access this table by executing the following query against the SUSDB:

select * from tbDownstreamServerClientSummaryRollup

The query failed with the message below:

Msg 33002, Level 16, State 1, Line 1
Access to table dbo.tbDownstreamServerClientSummaryRollup is blocked because the signature is not valid.

What we can infer:

  • Windows Internal Database (WID), which is essentially SQL Server Embedded Edition, was being used to host the SUSDB.
  • The table tbDownstreamServerClientSummaryRollup carries an invalid signature, which SQL Server Embedded Edition does not allow.
  • As a result, the trust between WSUS and SQL Server Embedded Edition is broken.
  • Any query that tries to access a table or module with an invalid signature fails.

Fixing this is a laborious task as we must sign the concerned table with a valid signature and re-establish the trust between WSUS and SQL Server Embedded Edition.

This led me to perform the below testing:

Testing / Repro

I built a new WSUS 3.0 SP2 lab test machine and verified the certificate by executing this SQL query against the SUSDB:

select * from sys.certificates

This returned MS_SchemaSigningCertificateD7A4348D8F461363128D655AE4589B8206B74257

There was only one certificate, which comes built in.

I removed the signature for tbDownstreamServerClientSummaryRollup by executing the SQL query below against SUSDB:

DROP SIGNATURE FROM [dbo].[tbDownstreamServerClientSummaryRollup] BY CERTIFICATE [MS_SchemaSigningCertificateD7A4348D8F461363128D655AE4589B8206B74257]

Next I tried to verify whether we could access the concerned table by executing the following SQL query against the SUSDB:

select * from tbDownstreamServerClientSummaryRollup

We got the following message:

Msg 33002, Level 16, State 1, Line 1
Access to table dbo.tbDownstreamServerClientSummaryRollup is blocked because the signature is not valid.

This means we were able to break the signature of the table and thereby reproduce the issue the customer was facing.

The latest hardening update for WSUS (2938066) includes a new certificate which is [MS_SchemaSigningCertificate77F29F0BD53B7D715AB7B4671A559131C9F9EAF6].

We verified this by downloading the update and extracting its contents: extract the .exe file -> extract the .MSP -> extract the PCW_CAB_SUS -> open the DbCertSql file in a text editor such as Notepad++.


The first section in this file gives details about the certificate that comes with the update. Scrolling down gives you information about the objects that will be re-signed using this certificate.

If this update is installed, then all the tables should be re-signed and the new certificate should be added.

I installed this update on the test machine and it needed a system restart, but before restarting the machine we verified the certificate by executing the following query:

select * from sys.certificates

The two certificates below were now present:

[MS_SchemaSigningCertificateD7A4348D8F461363128D655AE4589B8206B74257]
[MS_SchemaSigningCertificate77F29F0BD53B7D715AB7B4671A559131C9F9EAF6]

The second one above is the one that comes with the hardening update.

We also ran the query below to check which certificate signed the table [tbDownstreamServerClientSummaryRollup].

SELECT object_name(major_id), cer.name, cp.crypt_property, *
FROM sys.crypt_properties AS cp
JOIN sys.certificates AS cer
  ON cp.thumbprint = cer.thumbprint
WHERE object_name(major_id) = 'tbDownstreamServerClientSummaryRollup'

The query showed that the table is now signed by the new certificate, [MS_SchemaSigningCertificate77F29F0BD53B7D715AB7B4671A559131C9F9EAF6].

Now we executed the query below to see if we could access the table tbDownstreamServerClientSummaryRollup, which we couldn’t previously due to an invalid signature.

select * from tbDownstreamServerClientSummaryRollup

The table was accessible, and since no client computers had contacted this new WSUS server yet, the query returned no rows.

In-house testing confirmed that installing the latest hardening update for WSUS will re-sign all of the objects with the new certificate that it brings with it.

Resolution

The issue can be resolved with the following steps:

1. Make a backup of the SUSDB.

  • Right-click the SUSDB -> Tasks -> Back Up…


2. Install the latest WSUS hardening update KB2938066.

3. Restart the machine when prompted for a restart.

4. Execute the following SQL query to see if table tbDownstreamServerClientSummaryRollup is accessible:

select * from tbDownstreamServerClientSummaryRollup

5. Verify the certificate that signed the table tbDownstreamServerClientSummaryRollup by executing the following SQL query:

SELECT object_name(major_id), cer.name, cp.crypt_property, *
FROM sys.crypt_properties AS cp
JOIN sys.certificates AS cer
  ON cp.thumbprint = cer.thumbprint
WHERE object_name(major_id) = 'tbDownstreamServerClientSummaryRollup'

If the table tbDownstreamServerClientSummaryRollup is accessible, try to execute the Server Cleanup Wizard to delete obsolete computers again, and verify whether that operation is executed successfully.

Conclusion

This approach can be used whenever WSUS-related tables in the SUSDB become inaccessible due to an invalid signature, assuming a newer WSUS hardening update is available to install. The key is that each WSUS hardening update includes a unique certificate that is used to re-sign the tables.

I hope this helps, and thanks for taking the time to read this post!

Moni R S | Support Engineer | Microsoft Windows Devices and Deployment

Disclaimer: This information is provided ‘as-is’ with no warranties

Team Services December Extensions Roundup


It is the holiday season, and we get to look back on a fantastic year for the Team Services Marketplace! Thanks to our growing publisher community, there are 321 extensions in the Marketplace, and November was one of the best months ever for our installation traffic. 2017 is full of potential as we continue to invest in and grow our ecosystem. This month I’ve got two extensions for you; one of them is a must-have for our Work Item users. Happy Holidays!

Work Item Search

See it in the Marketplace: https://marketplace.visualstudio.com/items?itemName=ms.vss-workitem-search

Big and small teams rejoice! The need to create small temporary queries to find that pesky work item you lost track of is gone! With Work Item Search, you get fast and flexible text search of all Work Item fields across all projects in your account.

  • Search all of your Work Item fields—you can easily search across all work item fields, including custom fields, which enables more natural searches. The snippet view indicates where matches were found.


  • Inline search filters help you narrow it down, fast—the dropdown list of suggestions helps complete your search faster. For example, a search such as “AssignedTo: Chris WorkItemType: Bug State: Active” finds all active bugs assigned to a user named Chris.

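
As a rough illustration of how field filters like these can be separated from free text, here is a small Python sketch. Note that `parse_search_filters` is a hypothetical helper written for this post, not part of the extension, whose actual query parsing happens server-side.

```python
import re

def parse_search_filters(query: str) -> dict:
    """Split a search string like "AssignedTo: Chris WorkItemType: Bug"
    into a {field: value} mapping. Illustrative only."""
    # A field filter is a known field name, a colon, then the value text
    # running up to the next field name (or the end of the string).
    fields = ("AssignedTo", "WorkItemType", "State")
    pattern = re.compile(r"(%s):\s*" % "|".join(fields))
    # re.split with a capturing group keeps the field names in the result,
    # alternating name/value after the leading free-text chunk.
    parts = pattern.split(query)
    return {name: value.strip() for name, value in zip(parts[1::2], parts[2::2])}

print(parse_search_filters("AssignedTo: Chris WorkItemType: Bug State: Active"))
```

With this sketch, the example query from the bullet above yields the three filters `AssignedTo=Chris`, `WorkItemType=Bug`, and `State=Active`.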

Code Coverage Widgets

See it in the Marketplace: https://marketplace.visualstudio.com/items?itemName=shanebdavis.code-coverage-dashboard-widgets

Dashboard widgets are great because so many interesting scenarios can be enabled there. Code Coverage Widgets adds another set of tools for those who want to stay on top of the quality of new code in a growing project. The widget displays the percentage of unit test code coverage based on a selected build definition. If a build definition has no unit test results recognized by the widget, or if the widget has not yet been configured, a message saying so is displayed within the widget.


The widget has two customizable properties:

  • Title—the title of the widget as it is displayed on the dashboard.
  • Build Definition—the build definition you want code coverage displayed for on the widget.


You will be able to make your dashboard a build status powerhouse with Code Coverage Widgets!


    Are you using an extension you think should be featured here?

    I’ll be on the lookout for extensions to feature in the future, so if you’d like to see yours (or someone else’s) here, then let me know on Twitter!

    @JoeB_in_NC


    Help prevent user-error security breaches


    According to the Association of Corporate Counsel, unintentional employee error is the top cause of data breaches. And with 87 percent of IT professionals concerned about the security of cloud data, according to a Dimensional Research survey conducted for Druva, it’s easy to feel vulnerable. Preventing these unintentional errors can help keep your data protected.

    The problem—simple passwords

    Simple or reused passwords open the door to hackers. According to SplashData, the top five worst passwords of 2015 were:

    1. 123456
    2. password
    3. 12345678
    4. qwerty
    5. 12345

    But even a great password can pose problems when used on multiple sites. Hackers know that people like to reuse passwords, so when they crack one, they test it on multiple sites, especially those that may contain higher value information.

    Your solution—Educate employees on how to create a strong password. Then put a policy in place to ensure passwords meet minimum complexity requirements and require that users change them often. Also, encourage secure password-keeping practices such as using third-party services that store passwords in the cloud and secure them all with a master password.
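
    As an illustration, a minimal complexity check might look like the following Python sketch. The thresholds and the tiny worst-passwords list (taken from the SplashData list above) are assumptions for this example, not an official Microsoft policy.

```python
import re

# Top worst passwords from the SplashData list quoted above; a real policy
# would use a much larger banned-password list.
WORST_PASSWORDS = {"123456", "password", "12345678", "qwerty", "12345"}

def meets_complexity(password: str, min_length: int = 12) -> bool:
    """Return True if the password clears this example policy:
    not on the worst-passwords list, at least min_length characters,
    and at least three of four character classes."""
    if password.lower() in WORST_PASSWORDS or len(password) < min_length:
        return False
    classes = (r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]")
    return sum(bool(re.search(c, password)) for c in classes) >= 3

print(meets_complexity("password"))               # worst-list entry: False
print(meets_complexity("Corr3ct-Horse-Battery"))  # long, mixed classes: True
```

    A check like this can sit behind a password-change form so weak choices are rejected before they ever reach the directory.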

    The problem—falling for phishing

    According to a Verizon Data Breach report, phishing is the second most common threat and is implicated in around a quarter of all data breaches. If a phishing message ends up in an employee’s inbox, there’s a good chance they will click the link.

    Your solution—In addition to top-notch security and secure email filters, encourage users to report suspicious-looking messages—similar to reporting junk mail. Once reviewed and identified as a threat, add these messages to service-wide filters.


    In Exchange Online, Email Safety Tips provide an additional layer of protection with a warning to the user in messages that are marked suspicious.

    The problem—BYOD practices

    Bring-your-own-device (BYOD) policies are widely used in today’s business landscape, but employees accessing sensitive information from personal devices can open the door to security threats. According to research from the Ponemon Institute, 67 percent of respondents cited employees using their own devices to access company data as a likely or certain cause of data breaches.

    Your solution—Create clear BYOD policies and educate employees on how to follow these guidelines—including what’s at risk if they’re ignored. For additional layers of security, require the use of approved secure mobile apps and multi-factor authentication when accessing company information.

    The problem—lost or stolen devices

    Lost devices are another leading cause of data breaches. And not just employee-owned devices—even your company’s devices are at risk, leaving your organization exposed to threats if they are lost or stolen.

    Your solution—Educate employees on proper device security on- and off-premises, and instruct them to report lost devices as soon as possible. Enable security policies to ensure you can remotely access, locate and wipe a device if necessary.

    Keep your business and email secure

    Help protect your organization’s data with the email security features you need to move your business ahead. Office 365 has built-in, always up-to-date security and compliance features for greater peace of mind.

    Get the free eBook

    Continually educate employees to minimize risk of common user-error breaches. Security features available with Office 365 help mitigate the risks introduced by employees. Data Loss Prevention (DLP) proactively scans emails and notifies users before they send sensitive information. Information Rights Management (IRM) allows you to control email access permissions to keep unauthorized people from printing, forwarding or copying sensitive information. Additionally, Office 365 gives you the option to use Advanced Threat Protection (ATP) to safeguard mailboxes against sophisticated attacks in real time.
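
    To illustrate the kind of scan a DLP policy performs, here is a toy Python sketch that flags likely payment card numbers using a digit pattern plus the Luhn checksum. This is a simplified stand-in written for this post, not Office 365’s actual sensitive-information definitions, which are far richer.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum used by payment card numbers."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text: str) -> list:
    """Flag 13-16 digit runs (spaces/dashes allowed) that pass the Luhn
    check. A toy version of a DLP sensitive-information match."""
    hits = []
    for match in re.finditer(r"\b(?:\d[ -]?){13,16}\b", text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(digits)
    return hits
```

    The Luhn step matters: it lets the scanner skip ordinary 13-to-16-digit runs, such as order numbers, that merely look like card numbers, which keeps false positives down.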

    Learn more

    The post Help prevent user-error security breaches appeared first on Office Blogs.

    Updates to the OneNote Class Notebook add-in—read/unread indicators in Review Student Work and more


    Since the school year started, we’ve been making improvements to the Class Notebook add-in for OneNote on the desktop. Here is a summary of the highlights—as well as details for the update releasing today. To update your OneNote Class Notebook add-in, just click the Update button on your toolbar to download and install the latest version. If you’ve never installed the Class Notebook add-in, you can get it from the OneNote Class Notebook website.

    Improvements to Review Student Work

    The most exciting new feature is being released today. Teachers can now quickly and easily see updates made by students in the Review Student Work pane.

    Some common requests from teachers have been to:

    • Easily see when students have started their work in a distributed page or assignment.
    • Quickly see which pages you, the teacher, have already reviewed.
    • See if a student added any new content after a teacher has already reviewed or graded a page.

    These requests are all now possible with the new OneNote Class Notebook add-in (version 1.4.5.0) and use the familiar bold convention to denote the unread status.

    Read/unread support in the Review Student Work pane.

    Improvements to notebook and student mapping

    • Notebook mapping—If your notebook name and LMS/SIS course name match, the Class Notebook add-in will automatically map the two together.
    • Student mapping—If the format of your student names is Last Name, First Name, the student names will be automatically mapped.

    A few more improvements and detailed summary

    We have provided additional LMS support for SEQTA by adding assignment and grading integration. Also, we made some performance improvements to help speed up page distribution.

    Here is a detailed summary of all the new capabilities and improvements to the Class Notebook add-in over the last few months. This list can also be found at the bottom of the Class Notebook add-in support page.

    Version 1.2.0.0—September 2016

    • SEQTA assignment/grading integration.
    • Automatic mapping of Class Notebooks.
    • Automatic mapping of students by last name, first name.
    • Refresh tokens so teachers only need to sign in to the LMS/SIS once.
    • Display a warning when distributing pages that contain attachments.
    • Apostrophes in section names now preserved correctly during page distribution.

    Version 1.3.0.0—September 2016

    • Fixed an issue with grading in MS Classroom courses with many assignments.

    Version 1.3.1.0—October 2016

    • Fixed issue affecting Schoology sign-in.

    Version 1.3.2.0—October 2016

    • Fixed issue preventing student mappings from getting saved.

    Version 1.4.0.0—November 2016

    • Fixed bugs that happened with certain content types for page distribution.

    Version 1.4.5.0—December 2016

    • Read/unread state for Review Student Work assignments and distributions.
    • Performance improvements for page distribution.

    The post Updates to the OneNote Class Notebook add-in—read/unread indicators in Review Student Work and more appeared first on Office Blogs.

    #AzureAD Certificate based authentication is Generally Available!


    Howdy folks!

    Many large organizations that use certificates have been using the certificate-based authentication feature while it was in preview and giving us feedback. Thank you for your input! Today, I'm excited to announce the general availability (GA) of certificate-based authentication.

    This announcement enables two key scenarios:

    1. Federated Azure AD customers can sign in using certificate-based authentication (performed against the federation server) with Office applications on iOS and Android. The chart below outlines the support for certificate-based authentication across Office applications:

    clip_image002

    2. Azure AD customers can sign in using certificate-based authentication with Exchange ActiveSync mobile apps in iOS and Android when signing in to Exchange Online.

    Take a look at our certificate-based authentication documentation to get started with these scenarios.

    Of course, we always love to hear your feedback and suggestions, and look forward to hearing from you!

    Best regards,

    Alex Simons (Twitter: @Alex_A_Simons)

    Director of Program Management

    Microsoft Identity Division

    Connect(“demos”); // 2016: BikeSharing360 on GitHub


    Microsoft loves developers and is constantly investing in enabling the future of development with cloud-first, mobile-first solutions that serve any developer, any application, and any platform.

    During our Connect(); event this year we presented 15 demos in Scott Guthrie’s and Scott Hanselman’s keynotes. If you missed the keynotes, you can watch the recordings on Channel 9. I highly recommend them!

    New products, services, and tools we announced help bring innovation to your apps. We enjoy working on the demos for the keynotes and building real-world applications through which you can directly experience what’s possible using those technologies. This year, we built out a full intelligent bike sharing scenario for our Connect(); //2016 demos and are delighted to share all the source code with you today.

    clip_image002

    BikeSharing360 is a fictitious example of a smart bike sharing system with 10,000 bikes distributed in 650 stations located throughout New York City and Seattle. Their vision is to provide a modern and personalized experience to riders and to run their business with intelligence.

    In this demo scenario, we built several apps for both the enterprise and the consumer (bike riders).

    BikeSharing360 (Enterprise)

    New York, Seattle, and more coming soon!

    • Manage our business with intelligence
    • Own fleets of smart bikes we can track with IoT devices
    • Go mobile and get bike maintenance reports
    • Intelligent kiosks with face and speech recognition to help customers rent bikes easily
    • Intelligent customer service: AI-assisted customer service through bots

    Bike Riders (Consumer)

    • Go mobile! Go green! Save time, money & have fun!
    • Find and rent bikes and manage your rides
    • My rides: Discover and track your routes
    • Get personalized recommendations for events
    • Issues on the road? Chat with the BikeSharing360 bot, your customer service personal assistant

    BikeSharing360 Suite of Apps

    We want you to be inspired and learn how to use multiple tools, products, and our Microsoft application platform capabilities to unleash your productivity, help transform your businesses, and build deeply personalized apps for your customers.

    We built a suite of apps for the BikeSharing360 enterprise and bike riders. The following diagram provides a high-level overview of the apps we built:

    Watch the demos in action and download the code

    This time we are releasing multiple demo projects split across seven different demo repos, now available on GitHub:

    Websites

    BikeSharing360: Websites on GitHub

    • Web Apps focused on bike rentals and corporate users
    • BikeSharing360 Public Web Site (MVC)
    • BikeSharing360 Public Web Site (ASP.NET Core)
    • BikeSharing360 Private Web Site (ASP.NET Core 1.1)

    Mobile apps

    BikeSharing360: Mobile apps on GitHub

    • BikeRider: Native mobile apps using Xamarin Forms for iOS, Android and UWP
    • Maintenance: Cordova cross-platform mobile app

    Watch demos in action:

    Backend services

    BikeSharing360: Backend services on GitHub

    • Backend microservices used in various Connect() demos (mainly in the Xamarin apps).
    • Azure Functions

    Watch demos in action:

    Single container apps

    BikeSharing360: Single container app on GitHub

    • Single Container App: Take an existing marketing site and publish it to Azure App Service running Linux Docker Containers

    Watch demos in action:

    Multi container apps

    BikeSharing360: Multi container app on GitHub

    • Multi Container App: A more complex app that demonstrates setting up Continuous Delivery with Visual Studio 2017 RC. The project was then deployed to Azure Container Services, through the Azure Container Registry.

    Watch demos in action:

    • Watch Donovan Brown demo a single container app

    Cognitive Services kiosk app

    BikeSharing360: Cognitive Services kiosk app on GitHub

    • UWP Intelligent Kiosk with Cognitive Services (Face recognition API, Voice recognition)

    Watch demos in action:

    Bot app

    BikeSharing360: Bot app on GitHub

    • BikeSharing360 Intelligent Bot: Customer Services integrated with Language Understanding Intelligent Service (LUIS)

    Watch demos in action:

    You can also watch this Visual Studio Toolbox episode for an E2E overview of the BikeSharing360 demo apps:

    Even more demos from Connect();!

    Here are a few of our tooling demos showing the latest improvements on our Visual Studio family of products:

    It is a great time to be a developer. Create amazing apps and services that delight customers and build your business. With Microsoft’s intelligent Azure cloud, powerful data platform, and flexible developer tools, it is easier than ever to design, build, and manage breakthrough apps that work across platforms and devices.

    Enjoy BikeSharing360 from our demo team!

    Erika Ehrli Cabral, Senior Product Marketing Manager, Cloud Apps Dev and Data, @erikaehrli1

    Erika has been at Microsoft for over 12 years, working first in Microsoft Consulting and later enjoying different roles where she created content and code samples for developers. In her current role, she is focused on executive keynote demos and Visual Studio and Azure product marketing.

    Writing Declaration Files for @types


    A while back we talked about how TypeScript 2.0 made it easier to grab declaration files for your favorite library. Declaration files, if you’re not familiar, are just files that describe the shape of an existing JavaScript codebase to TypeScript. By using declaration files (also called .d.ts files), you can avoid misusing libraries and get things like completions in your editor.

    As a recap of that previous blog post, if you’re using an npm package named foo-bar and it doesn’t ship any .d.ts files, you can just run

    npm install -S @types/foo-bar

    and things will just work from there.

    But you might have asked yourself things like “where do these ‘at-types’ packages come from?” or “how do I update the .d.ts files I get from it?”. We’re going to try to answer those very questions.

    DefinitelyTyped

    The simple answer to where our @types packages come from is DefinitelyTyped. DefinitelyTyped is just a simple repository on GitHub that hosts TypeScript declaration files for all your favorite packages. The project is community-driven, but supported by the TypeScript team as well. That means that anyone can help out or contribute new declarations at any time.

    Authoring New Declarations

    Let’s say that we want to create declaration files for our favorite library. First, we’ll need to fork DefinitelyTyped, clone the fork, and create a new branch.

    git clone https://github.com/YOUR_USERNAME_HERE/DefinitelyTyped
    cd DefinitelyTyped
    git checkout -b my-favorite-library

    Next, we can run an npm install and create a new package using the new-package npm script.

    npm install
    npm run new-package my-favorite-library

    For whatever library you use, my-favorite-library should be replaced with the verbatim name that it was published with on npm.
    If for some reason the package doesn’t exist in npm, mention this in the pull request you send later on.

    The new-package script should create a new folder named my-favorite-library with the following files:

    • index.d.ts
    • my-favorite-library-tests.ts
    • tsconfig.json
    • tslint.json

    Finally we can get started writing our declaration files. First fix up the comments for index.d.ts by adding the library’s MAJOR.MINOR version, the project URL, and your username. Then, start describing your library. Here’s what my-favorite-library/index.d.ts might look like:

    // Type definitions for my-favorite-library x.x
    // Project: https://github.com/my-favorite-library-author/my-favorite-library
    // Definitions by: Your Name Here
    // Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped

    export function getPerpetualEnergy(): any[];

    export function endWorldHunger(n: boolean): void;

    Notice we wrote this as a module: a file that contains explicit imports and exports. We’re intending to import this library through a module loader of some sort, using Node’s require() function, AMD’s define function, etc.

    Now, this library might have been written using the UMD pattern, meaning that it could either be imported or used as a global. This is rare in libraries for Node, but common in front-end code where you might use your library by including a script tag. So in this example, if my-favorite-library is accessible as the global MyFavoriteLibrary, we can tell TypeScript that with this one-liner:

    export as namespace MyFavoriteLibrary;

    So the body of our declaration file should end up looking like this:

    // Our exports:
    export function getPerpetualEnergy(): any[];
    export function endWorldHunger(n: boolean): void;

    // Make this available as a global for non-module code.
    export as namespace MyFavoriteLibrary;

    Finally, we can add tests for this package in my-favorite-library/my-favorite-library-tests.ts:

    import * as lib from "my-favorite-library";

    const energy = lib.getPerpetualEnergy()[14];

    lib.endWorldHunger(true);

    And that’s it. We can then commit, push our changes to GitHub…

    git add ./my-favorite-library
    git commit -m "Added declarations for 'my-favorite-library'."
    git push -u origin my-favorite-library

    …and send a pull request to the master branch on DefinitelyTyped.

    Once our change is pulled in by a maintainer, it should be automatically published to npm and available. The published version number will depend on the major/minor version numbers you specified in the header comments of index.d.ts.

    Sending Fixes

    Sometimes we might find ourselves wanting to update a declaration file as well. For instance, let’s say we want to fix up getPerpetualEnergy to return an array of booleans.

    In that case, the process is pretty similar. We can simply fork & clone DefinitelyTyped as described above, check out the master branch, and create a branch from there.

    git clone https://github.com/YOUR_USERNAME_HERE/DefinitelyTyped
    cd DefinitelyTyped
    git checkout -b fix-fav-library-return-type

    Then we can fix up our library’s declaration.

    - export function getPerpetualEnergy(): any[];
    + export function getPerpetualEnergy(): boolean[];

    And fix up my-favorite-library's test file to make sure our change can be verified:

    import * as lib from "my-favorite-library";

    // Notice we added a type annotation to 'energy' so TypeScript could check it for us.
    const energy: boolean = lib.getPerpetualEnergy()[14];

    lib.endWorldHunger(true);

    Dependency Management

    Many packages in the @types repo will end up depending on other type declaration packages. For instance, the declarations for react-dom will import react. By default, writing a declaration file that imports any library in DefinitelyTyped will automatically create a dependency for the latest version of that library.

    If you want to snap to some version, you can make an explicit package.json for the package you’re working in, and fill in the list of dependencies explicitly. For instance, the declarations for leaflet-draw depend on the @types/leaflet package. Similarly, the Twix declarations package has a dependency on moment itself (since Moment 2.14.0 now ships with declaration files).

    As a note, only the dependencies field of package.json is necessary, as the DefinitelyTyped infrastructure will provide the rest.
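    For illustration, a minimal package.json that declares an explicit dependency might look something like this (the version range here is hypothetical):

```json
{
    "dependencies": {
        "@types/leaflet": "^1.0.0"
    }
}
```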

    Quicker Scaffolding with dts-gen

    We realize that for some packages, writing out every function in the API can be a pain. That’s why we wrote dts-gen, a neat tool that can scaffold out declaration files fairly quickly. For APIs that are fairly straightforward, dts-gen can get the job done.

    For instance, if we wanted to create declaration files for the array-uniq package, we could use dts-gen instead of DefinitelyTyped’s new-package script. We can try this out by installing dts-gen:

    npm install -g dts-gen

    and then creating the package in our DefinitelyTyped clone:

    cd ./DefinitelyTyped
    npm install array-uniq
    
    dts-gen -d -m array-uniq

    The -d flag will create a folder structure like DefinitelyTyped’s new-package script. You can peek in and see that dts-gen figured out the basic structure on its own:

    export = array_uniq;

    declare function array_uniq(arr: any): any;

    You can even try this out with something like TypeScript itself!

    Keep in mind dts-gen doesn’t figure out everything; for example, it typically substitutes parameter and return values as any, and can’t figure out which parameters are optional. It’s up to you to make a quality declaration file, but we’re hoping dts-gen can help bootstrap that process a little better.

    dts-gen is still in early experimental stages, but it’s on GitHub and we’re looking for feedback and contributions!

    A Note About Typings, tsd, and DefinitelyTyped Branches

    If you’re not using tools like tsd or Typings, you can probably skip this section. If you’ve sent pull requests to DefinitelyTyped recently, you might have heard about a branch on DefinitelyTyped called types-2.0. The types-2.0 branch existed so that infrastructure for @types packages wouldn’t interfere with other tools.

    However, this was a source of confusion for new contributors, so we’ve merged types-2.0 with master. The short story is that all new packages should be sent to the master branch, which now must be structured for TypeScript 2.0+ libraries.

    Tools like tsd and Typings will continue to install existing packages that are locked on specific revisions.

    Next Steps

    Our team wants to make it easier for our community to use TypeScript and help out on DefinitelyTyped. Currently we have our guide on Publishing, but going forward we’d like to cover more of this information on our website proper.

    We’d also like to hear about resources you’d like to see improved, and information that isn’t obvious to you, so feel free to leave your feedback below.

    Hope to see you on DefinitelyTyped. Happy hacking!

    JBoss and WildFly extension for Visual Studio Team Services


    We are pleased to announce the new JBoss and WildFly extension available from the Visual Studio Marketplace for Visual Studio Team Services / Team Foundation Server.

    This extension provides a task to deploy your Java applications to an instance of JBoss Enterprise Application Platform (EAP) 7 or WildFly Application Server 8 and above over the HTTP management interface.  It also includes a utility to run CLI commands as part of your build/release process.  Check out this video for a demo.

    screenshot

    This extension is open sourced on GitHub so reach out to us with any suggestions or issues.  We welcome contributions.

    To learn more about how to deploy to legacy JBoss EAP 6, please refer to this guide.

    To learn more about Java and cross platform support in Visual Studio Team Services and Team Foundation Server, visit http://java.visualstudio.com or follow us on twitter @JavaALM.

    Twin zero-day attacks: PROMETHIUM and NEODYMIUM target individuals in Europe

    $
    0
    0

    Targeted attacks are typically carried out against individuals to obtain intellectual property and other valuable data from target organizations. These individuals are either directly in possession of the targeted information or are able to connect to networks where the information resides. Microsoft researchers have encountered twin threat activity groups that appear to target individuals for reasons that are quite uncommon.

    Unlike many activity groups, which typically gather information for monetary gain or economic espionage, PROMETHIUM and NEODYMIUM appear to launch campaigns simply to gather information about certain individuals. These activity groups are also unusual in that they use the same zero-day exploit to launch attacks at around the same time in the same region. Their targets, however, appear to be individuals that do not share common affiliations.

    Activity group profiles

    PROMETHIUM is an activity group that has been active since at least 2012. The group primarily uses Truvasys, a first-stage malware that has been in circulation for several years. Truvasys has been involved in several attack campaigns, where it has masqueraded as one of several common computer utilities, including WinUtils, TrueCrypt, WinRAR, or SanDisk. In each of the campaigns, Truvasys malware evolved with additional features—this shows a close relationship between the activity groups behind the campaigns and the developers of the malware.

    NEODYMIUM is an activity group that is known to use a backdoor malware detected by Microsoft as Wingbird. This backdoor’s characteristics closely match FinFisher, a government-grade commercial surveillance package. Data about Wingbird activity indicate that it is typically used to attack individual computers instead of networks.

    Similarly timed attacks

    In early May 2016, both PROMETHIUM and NEODYMIUM started conducting attack campaigns against specific individuals in Europe. They both used an exploit for CVE-2016-4117, a vulnerability in Adobe Flash Player that, at the time, was both unknown and unpatched.

    PROMETHIUM distributed links through instant messengers, pointing recipients to malicious documents that invoked the exploit code to launch Truvasys on victim computers. Meanwhile, NEODYMIUM used well-tailored spear-phishing emails with attachments that delivered the exploit code, ultimately leading to Wingbird’s installation on victim computers.

    While the use of the same exploit code could be attributed to coincidence, the timing of the campaigns and the geographic location of victims lend credence to the theory that the campaigns are somehow related.

    Stopping exploits in Windows 10

    PROMETHIUM and NEODYMIUM both used a zero-day exploit that executed code to download a malicious payload. Protected View, a security feature introduced in Microsoft Office 2010, can prevent the malicious Flash code from loading when the document is opened. Control Flow Guard, a security feature that is turned on by default in Windows 10 and Microsoft Office 365 64-bit, can stop attempts to exploit memory corruption vulnerabilities. In addition, Credential Guard, an optional feature introduced in Windows 10, can stop Wingbird’s use of the system file, lsass.exe, to load a malicious DLL.

    Detecting suspicious behaviors with Windows Defender Advanced Threat Protection

    Windows Defender Advanced Threat Protection (Windows Defender ATP) is a new built-in service that ships natively with Windows 10 and helps enterprises to detect, investigate and respond to advanced targeted attacks. When activated, it captures behavioral signals from endpoints and then uses cloud-based machine learning analytics and threat intelligence to flag attack-related activities.

    Wingbird, the advanced malware used by NEODYMIUM, has several behaviors that trigger alerts in Windows Defender ATP. Windows Defender ATP has multiple behavioral and machine learning detection rules that can catch various elements of the malware kill chain. As a result, it can generically detect, without any signature, a NEODYMIUM attack in the following stages:

    • Zero-day exploits causing Microsoft Office to generate and execute malicious files
    • Zero-day exploits attempting to grant malicious executables higher privileges
    • Malicious files trying to delete themselves
    • Malicious files attempting the DLL side-loading technique, in which legitimate DLLs in non-standard folders are replaced by malicious ones so that malicious files are loaded by the operating system or by installed applications
    • Malicious files injecting code into legitimate processes

    In the example below, Windows Defender ATP alerts administrators that something is amiss. It notifies them that an Office document has dropped an executable file in one of their computers—activity that is very likely part of an attack.

    windows-defender-advanced-threat-protection-exploit-executing-file

    Additionally, Windows Defender ATP and Office 365 ATP leverage rules based on IOCs and threat intelligence specific to PROMETHIUM and NEODYMIUM. Alerts from these rules work alongside concise briefs and in-depth profiles provided in the Windows Defender ATP console to help administrators address breach attempts by these activity groups.

    For more information about Windows Defender ATP service in Windows 10, check out its features and capabilities and read more about why a post-breach detection approach is a key component of any enterprise security stack.

    Details about PROMETHIUM and NEODYMIUM along with indicators of compromise can be found in the Microsoft Security Intelligence Report volume 21.

     

    – Windows Defender ATP team


    Exploring Azure Data with Apache Drill, Now Part of the Microsoft Data Science Virtual Machine


    This post is authored by Gopi Kumar, Principal Program Manager in Microsoft’s Data Group.

    We recently came across Apache Drill, a very interesting data analytics tool. The introduction page to Drill describes it well:

    “Drill is an Apache open-source SQL query engine for Big Data exploration. Drill is designed from the ground up to support high-performance analysis on the semi-structured and rapidly evolving data coming from modern Big Data applications, while still providing the familiarity and ecosystem of ANSI SQL, the industry-standard query language.”

    Drill supports several data sources ranging from flat files, RDBMS, NoSQL databases, Hadoop/Hive stored on local server/desktop or cloud platforms like Azure and AWS. It supports querying various formats like CSV/TSV, JSON, relational tables, etc. all from the familiar ANSI SQL language (SQL remains one of the most popular languages used in data science and analytics). The best part of querying data with Drill is that the data stays in the original source and you can join data across multiple sources. Drill is designed for low latency and high throughput, and can scale from a single machine to thousands of nodes.

    We are excited to announce that Apache Drill is now pre-installed on the Data Science Virtual Machine (DSVM). The DSVM is Microsoft’s custom virtual machine image on Azure, pre-installed and configured with a host of popular tools that are commonly used in data science, machine learning and AI. Think of DSVM as an analytics desktop in the cloud, serving both beginners as well as advanced data scientists, analysts and engineers.

    Azure already provides several data services to store and process analytical data ranging from blobs, files, relational databases, NoSQL databases, and Big Data technologies supporting varied types of data, scaling / performance needs and price points. We wanted to demonstrate how easy it is to setup Drill to explore data stored on four different Azure data services – Azure Blob Storage, Azure SQL Data Warehouse, Azure DocumentDB (a managed NoSQL database) and Azure HDInsight (i.e. managed Hadoop) Hive tables.

    Towards that end, we’ve published a tutorial on the Cortana Intelligence Gallery that walks you through the installation and shows how to query data with Drill, guiding you through the steps to set up connections from Drill to different Azure data services.

    Drill also provides an ODBC/JDBC interface, allowing you to perform data exploration in your favorite BI tool such as Excel, Power BI or Tableau, using SQL queries. You can also query data from any programming language such as R or Python with ODBC/JDBC interfaces.

    While on the Data Science Virtual Machine, we encourage you to also take a look at other useful tools and samples that come pre-built. If you’re new to the DSVM (which is available in Windows and Linux editions, plus a deep learning extension to run on Nvidia GPUs), we invite you to give the DSVM a try through an Azure free trial. We also have a timed test drive, available for the Linux DSVM now, that does not require an Azure account. You will find more resources to get you started with the DSVM below.

    In summary, Apache Drill can be a powerful tool in your arsenal, and can help you be nimbler with your data science projects and gain faster business insights on your big data. Data scientists and analysts can now start exploring data in its native store without having to wait for ETL pipelines to be built, and without having to do extensive data prep or client side coding to bring together data from multiple sources. This can be a huge boost to your teams’ agility and productivity.

    Gopi

    Windows Edition:

    Linux Edition:

    Webinar:

    https://channel9.msdn.com/blogs/Cloud-and-Enterprise-Premium/Inside-the-Data-Science-Virtual-Machine (Duration: 1 Hour)

    Playing with an Onion Omega IoT to show live Blood Sugar on an OLED screen


    I've been playing with IoT stuff on my vacation. Today I'm looking at an Onion Omega. This is a US$19 computer that you can program with Python, Node.js, or C/C++. There's a current IndieGogo campaign for the Onion Omega2 for $5. That's a $5 Linux computer with Wi-Fi. Realistically you'd want to spend more and get expansion docks, chargers, batteries, etc., but you get the idea. I got the original Omega along with the bluetooth dongle, Arduino compatible base, and tiny OLED screen. A ton of stuff to play with for less than $100.

    Note that I am not affiliated with Onion at all and I paid for it with my own money, to use for fun.

    One of the most striking things about the Onion Omega line is how polished it is. There are lots of tiny Linux machines that basically drop you at the command line and say "OK, SSH in and here's root." The Onion Omega is far more polished.

    Onion Omega has a very polished Web UI

    The Omega can do that for you, but if you have Bonjour installed (for zeroconf networking) and can SSH in once to setup Wi-Fi, you're able to access this lovely web-based interface.

    Look at all the info about the Omega's memory, networking, device status, and more

    This clean, local web server and useful UI makes the Onion Omega extremely useful as a teaching tool. The Particle line of IoT products has a similarly polished web interface, but while the Onion uses a local web server and app, the Particle Photon uses a cloud-based app that bounces down to a local administrative interface on the device. There's arguments for each, but I remain impressed with how easy it was for me to update the firmware on the Omega and get a new experience. Additionally, I made a few mistakes and "bricked" it and was able, just by following some basic instructions, to totally reflash and reset it to the defaults in about 10 minutes. Impressive docs for an impressive product.

    image

    Onion Omega based Glucose Display via NightScout

    So it's a cool product, but how quickly can I do something trivial, but useful? Well, I have a NightScout open source diabetes management server with an API that lets me see my blood sugar. The resulting JSON looks like this:

    [
      {
        "_id": "5851b235b8d1fea108df8b",
        "sgv": 135,
        "date": 1481748935000,
        "dateString": "2016-12-14T20:55:35.000Z",
        "trend": 4,
        "direction": "Flat",
        "device": "share2",
        "type": "sgv"
      }
    ]

    That number under "sgv" (serum glucose value) is 135 mg/dl. That's my blood sugar right now. I could get n values back from the Web API and plot a chart, but baby steps. Note also the "direction" for my sugars is "Flat." It's neither rising nor falling in any major way.

    Let's add the OLED Display to the Onion Omega and show my sugars. Since it's an OpenWRT Linux machine, I can just add Python!

    opkg update
    opkg install python

    Some may (and will) argue that for a small IoT system, Linux is totally overkill. Sure, it likely is. But it's also very productive, fun to prototype with, and functional. Were I to go to market for real, I'd likely use something more hardened.

    As I said, I could SSH into the machine but since the Web UI is so nice, it includes an HTML-based terminal!

    A Terminal built in!

    The Onion Omega includes not just libraries for expansions like the OLED Display, but also command-line utilities. This script clears the display, initializes it, and displays some text. The value of that text will come from my yet-to-be-written Python script.

    #!/bin/sh    

    oled-exp -c

    VAR=$(python ./sugar_script.py)

    oled-exp -i
    oled-exp write "$VAR"

    Then in my Python script I could simply print the value. Or, I can use the Python module for this OLED screen directly and do this:

    #!/usr/bin/env python

    from OmegaExpansion import oledExp
    import urllib
    import json

    site="https://hanselsugars.azurewebsites.net/api/v1/entries/sgv.json?count=1"

    # Fetch the latest reading and strip the JSON down to a bare object
    jsfile=urllib.urlopen(site).read()
    jsfile=jsfile.replace("\n","")
    jsfile=jsfile.replace("/","")
    jsfile=jsfile.replace("]","")
    jsfile=jsfile.replace("[","")

    a=json.loads(jsfile)
    sugar=a['sgv']
    direction=a['direction']
    info="\n" + str(sugar)+" mg/dl and "+direction

    # Initialize the OLED expansion, clear it, and show the reading
    oledExp.driverInit()
    oledExp.clear()
    oledExp.write(info)

    Now here's a pic of my live blood sugar on the Onion Omega with the OLED! I could set this to run on a timer and I'm off to the races.
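    For the timer part, a small polling loop would do. Here's a minimal sketch; the five-minute interval and the main() wiring are my own assumptions, and note that json.loads can parse the array directly, so the replace() calls in the script above aren't strictly needed:

    ```python
    import json
    import time

    def format_reading(payload):
        """Turn the NightScout JSON array into a short display string."""
        entry = json.loads(payload)[0]  # newest entry comes first
        return "\n%d mg/dl and %s" % (entry["sgv"], entry["direction"])

    def poll(fetch, display, interval=300):
        """Fetch a reading, show it, then wait; repeat forever."""
        while True:
            display(format_reading(fetch()))
            time.sleep(interval)

    def main():
        # Device-only wiring: these imports exist on the Omega itself.
        from OmegaExpansion import oledExp
        import urllib
        site = "https://hanselsugars.azurewebsites.net/api/v1/entries/sgv.json?count=1"
        oledExp.driverInit()
        def show(text):
            oledExp.clear()
            oledExp.write(text)
        poll(lambda: urllib.urlopen(site).read(), show)
    ```

    Separating format_reading from the device wiring also means the parsing can be tested off-device.
    
    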

    Photo Dec 14, 2 16 27 PM

    The next step might be to clean up the output, parse the date better, and perhaps even dynamically generate a sparkline and display the graphic on the small B&W OLED Screen.
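    A sparkline doesn't need any graphics support to prototype: readings can be mapped onto Unicode block characters first, with pixels to worry about later. A rough sketch (the scaling and block-character approach are my own, not from the post):

    ```python
    def sparkline(values):
        """Map a list of sgv readings onto eight block characters."""
        bars = u"\u2581\u2582\u2583\u2584\u2585\u2586\u2587\u2588"
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1  # avoid dividing by zero on a flat line
        # Scale each value into the 0..7 range and pick the matching bar.
        return u"".join(bars[(v - lo) * (len(bars) - 1) // span] for v in values)
    ```

    The values would come from the same API with count=n instead of count=1; on the little B&W screen this string could sit on the line under the mg/dl reading.
    
    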

    Have you used a small Linux IoT device like the Onion Omega?


    Sponsor: Do you deploy the same application multiple times for each of your end customers? The team at Octopus have taken the pain out of multi-tenant deployments. Check out their latest 3.4 release



    © 2016 Scott Hanselman. All rights reserved.
         

    DSC Resource Kit Release December 2016


    We just released the DSC Resource Kit! Since our last release on November 2, we have added 1 new module, AuditPolicyDsc, which allows you to edit your audit policy subcategories and options. Thank you to Adam Haynes for this great new module!

    Outside of the DSC Resource Kit, we also recently published an update to GPRegistryPolicy in the PowerShell Gallery. The latest version now includes a DSC resource to help you locally manage policy registry keys.

    We have also added two new maintainers to the DSC Resource Kit:

    • Johan Ljunggren (xSqlServer)
    • Daniel Scott-Raynsford (xAdcsDeployment, xCertificate, xNetworking, xStorage)

    These guys have shown outstanding dedication to their modules and continue to be invaluable contributors to the DSC Resource Kit. Congrats!

    This release includes updates to 15 DSC resource modules, including 5 new DSC resources. In these past 6 weeks, 101 pull requests have been merged and 43 issues have been closed, all thanks to our amazing community!

    We are holding the release of xPSDesiredStateConfiguration and PSDscResources until Friday (12/16) so that we can finish updating some of the in-box resources. We will update this blog post when those modules are released on Friday.

    The modules updated in this release are:

    • SharePointDsc
    • xActiveDirectory
    • xAdcsDeployment
    • xCertificate
    • xComputerManagement
    • xDatabase
    • xDscDiagnostics
    • xExchange
    • xFailOverCluster
    • xHyper-V
    • xNetworking
    • xSCSMA
    • xSQLServer
    • xStorage
    • xWebAdministration

    For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

    Our last community call for the DSC Resource Kit was last week on December 7. A recording of our updates as well as summarizing notes are available. Join us next time to ask questions and give feedback about your experience with the DSC Resource Kit. Keep an eye on the community agenda for the next call date.

    We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

    As with past Resource Kits, all resources with the ‘x’ prefix in their names are still experimental – this means that those resources are provided AS IS and are not supported through any Microsoft support program or service. If you find a problem with a resource, please file an issue on GitHub.

    Included in this Release

    You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or Changelog.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

    Module Name | Version | Release Notes
    AuditPolicyDsc 1.0.0.0
    • Initial release with the following resources:
      • AuditPolicySubcategory
      • AuditPolicyOption
    SharePointDsc 1.5.0.0
    • Fixed issue with SPManagedMetaDataServiceApp if ContentTypeHubUrl parameter is null
    • Added minimum PowerShell version to module manifest
    • Added testing for valid markdown syntax to unit tests
    • Added support for MinRole enhancements added in SP2016 Feature Pack 1
    • Fixed bug with search topology that caused issues with names of servers needing to all be the same case
    • Fixed bug in SPInstallLanguagePack where language packs could not be installed on SharePoint 2016
    • Added new resource SPSearchFileType
    • Updated SPDatabaseAAG to allow database name patterns
    • Fixed a bug where PerformancePoint and Excel Services Service Application proxies would not be added to the default proxy group when they are provisioned
    • Added an error catch to provide more detail about running SPAppCatalog with accounts other than the farm account
    xActiveDirectory 2.15.0.0
    • xAdDomainController: Fixes SiteName being required field.
    xAdcsDeployment 1.1.0.0
    • Converted AppVeyor.yml to pull Pester from PSGallery instead of Chocolatey.
    • Changed AppVeyor.yml to use default image.
    • xAdcsCertificateAuthority:
      • Change property format in Readme.md to be standard layout.
      • Converted style to meet HQRM guidelines.
      • Added verbose logging support.
      • Added string localization.
      • Fixed Get-TargetResource by removing IsCA and changing Ensure to return whether or not CA is installed.
      • Added unit tests.
      • Updated parameter format to meet HQRM guidelines.
    • xAdcsOnlineResponder:
      • Change property format in Readme.md to be standard layout.
      • Added unit test header to be latest version.
      • Added function help.
      • Updated parameter format to meet HQRM guidelines.
      • Updated resource to meet HQRM guidelines.
    • xAdcsWebEnrollment:
      • Change property format in Readme.md to be standard layout.
      • Added unit test header to be latest version.
      • Added function help.
      • Updated parameter format to meet HQRM guidelines.
      • Updated resource to meet HQRM guidelines.
    • Added CommonResourceHelper.psm1 (copied from xPSDesiredStateConfiguration).
    • Removed Technet Documentation HTML file from root folder.
    • Removed redundant code from AppVeyor.yml.
    • Fix markdown violations in Readme.md.
    • Updated readme.md to match DSCResource.Template\Readme.md.
    xCertificate 2.3.0.0
    • xCertReq:
      • Added additional parameters KeyLength, Exportable, ProviderName, OID, KeyUsage, CertificateTemplate, SubjectAltName
    • Fixed most markdown errors in Readme.md.
    • Corrected Parameter decoration format to be consistent with guidelines.
    xComputerManagement 1.9.0.0
    • Added resources
      • xPowerPlan
    xDatabase 1.5.0.0
    • Converted appveyor.yml to install Pester from PSGallery instead of from Chocolatey.
    • Added logging for when dac deploy fails
    xDscDiagnostics 2.6.0.0
    • Added JobId parameter set to Get-xDscConfiguration
    • Added IIS binding collection
    xExchange 1.12.0.0
    • xExchangeCommon : In StartScheduledTask corrected throw error check to throw last error when errorRegister has more than 0 errors instead of throwing error if errorRegister was not null, which would otherwise always be true.
    • Fix PSAvoidUsingWMICmdlet issues from PSScriptAnalyzer
    • Fix PSUseSingularNouns issues from PSScriptAnalyzer
    • Fix PSAvoidUsingCmdletAliases issues from PSScriptAnalyzer
    • Fix PSUseApprovedVerbs issues from PSScriptAnalyzer
    • Fix PSAvoidUsingEmptyCatchBlock issues from PSScriptAnalyzer
    • Fix PSUsePSCredentialType issues from PSScriptAnalyzer
    • Fix erroneous PSDSCDscTestsPresent issues from PSScriptAnalyzer for modules that do actually have tests in the root Tests folder
    • Fix array comparison issues by removing check for if array is null
    • Suppress PSDSCDscExamplesPresent PSScriptAnalyzer issues for resources that do have examples
    • Fix PSUseDeclaredVarsMoreThanAssignments issues from PSScriptAnalyzer
    • Remove requirements for second DAG member, or second Witness server, from MSFT_xExchDatabaseAvailabilityGroup.Integration.Tests
    xFailOverCluster 1.6.0.0
    • xCluster: Fixed bug in which failure to create a new cluster would hang
    xHyper-V 3.6.0.0
    • xVHD: Updated incorrect property name MaximumSize in error message
    xNetworking 3.1.0.0
    • Changed parameter format in Readme.md to improve information coverage and consistency.
    • Changed all MOF files to be consistent and meet HQRM guidelines.
    • Removed most markdown errors (MD*) in Readme.md.
    • Added xNetAdapterRDMA resource
    • Fixes to support changes to DSCResource.Tests.
    xSCSMA 1.5.0.0
    • Added $IdentifyingNumber for TP5/RTM and small WMI improvements
    xSQLServer 4.0.0.0
    • Fixes in xSQLServerConfiguration
      • Added support for clustered SQL instances
      • BREAKING CHANGE: Updated parameters to align with other resources (SQLServer / SQLInstanceName)
      • Updated code to utilize CIM rather than WMI
    • Added tests for resources
      • xSQLServerConfiguration
      • xSQLServerSetup
      • xSQLServerDatabaseRole
      • xSQLAOGroupJoin
      • xSQLServerHelper and moved the existing tests for Restart-SqlService to it.
      • xSQLServerAlwaysOnService
    • Fixes in xSQLAOGroupJoin
      • Availability Group name now appears in the error message for a failed Availability Group join attempt.
      • Get-TargetResource now works with Get-DscConfiguration
    • Fixes in xSQLServerRole
      • Updated Ensure parameter to “Present” default value
    • Renamed *-SqlServerRole to *-SqlServerRoleMember
    • Changes to xSQLAlias
      • Add UseDynamicTcpPort parameter for option “Dynamically determine port”
      • Change Get-WmiObject to Get-CimInstance in Resource and associated pester file
    • Added CHANGELOG.md file
    • Added issue template file (ISSUE_TEMPLATE.md) for “New Issue” and pull request template file (PULL_REQUEST_TEMPLATE.md) for “New Pull Request”
    • Add Contributing.md file
    • Changes to xSQLServerSetup
      • Now Features parameter is case-insensitive.
    • BREAKING CHANGE: Removed xSQLServerPowerPlan from this module. The resource has been moved to xComputerManagement and is now called xPowerPlan.
    • Changes and enhancements in xSQLServerDatabaseRole
      • BREAKING CHANGE: Fixed so the same user can now be added to a role in one or more databases, and/or one or more instances. Now the parameters SQLServer and SQLInstanceName are mandatory.
      • Enhanced so the same user can now be added to more than one role
    • BREAKING CHANGE: Renamed xSQLAlias to xSQLServerAlias to align with naming convention.
    • Changes to xSQLServerAlwaysOnService
      • Added RestartTimeout parameter
      • Fixed bug where the SQL Agent service did not get restarted after the IsHadrEnabled property was set.
      • BREAKING CHANGE: The mandatory parameters now include Ensure, SQLServer, and SQLInstanceName. SQLServer and SQLInstanceName are keys which will be used to uniquely identify the resource which allows AlwaysOn to be enabled on multiple instances on the same machine.
    • Moved Restart-SqlService from MSFT_xSQLServerConfiguration.psm1 to xSQLServerHelper.psm1.
    xStorage 2.9.0.0
    • Updated readme.md to remove markdown best practice rule violations.
    • Updated readme.md to match DSCResources/DscResource.Template/README.md.
    • xDiskAccessPath:
      • Fix bug when re-attaching disk after mount point removed or detached.
      • Additional log entries added for improved diagnostics.
      • Additional integration tests added.
      • Improve timeout loop.
    • Converted integration tests to use $TestDrive as working folder or temp folder when persistence across tests is required.
    • Suppress PSUseShouldProcessForStateChangingFunctions rule violations in resources.
    • Rename Test-AccessPath function to Assert-AccessPathValid.
    • Rename Test-DriveLetter function to Assert-DriveLetterValid.
    • Added CommonResourceHelper.psm1 module (based on PSDscResources).
    • Added CommonTestsHelper.psm1 module (based on PSDscResources).
    • Converted all modules to load localization data using Get-LocalizedData from CommonResourceHelper.
    • Converted all exception calls and tests to use functions in CommonResourceHelper.psm1 and CommonTestsHelper.psm1 respectively.
    • Fixed examples:
      • Sample_InitializeDataDisk.ps1
      • Sample_InitializeDataDiskWithAccessPath.ps1
      • Sample_xMountImage_DismountISO.ps1
    • xDisk:
      • Improve timeout loop.
    xWebAdministration 1.16.0.0
    • Log directory configuration on xWebsite used the logPath attribute instead of the directory attribute. Bugfix for

    How to Find Released DSC Resource Modules

    To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

    Of course, you can also always use PowerShellGet (available in WMF 5.0) to find modules with DSC Resources:

    # To list all modules that are part of the DSC Resource Kit
    Find-Module -Tag DSCResourceKit

    # To list all DSC resources from all sources
    Find-DscResource

    To find a specific module, go directly to its URL on the PowerShell Gallery:
    http://www.powershellgallery.com/packages/< module name >
    For example:
    http://www.powershellgallery.com/packages/xWebAdministration

    How to Install DSC Resource Modules From the PowerShell Gallery

    We recommend that you use PowerShellGet to install DSC resource modules:

    Install-Module -Name <module name>

    For example:

    Install-Module -Name xWebAdministration

    To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

    Update-Module

    After installing modules, you can discover all DSC resources available to your local system with this command:

    Get-DscResource

    How to Find DSC Resource Modules on GitHub

    All resource modules in the DSC Resource Kit are available open-source on GitHub.
    You can see the most recent state of a resource module by visiting its GitHub page at:
    https://github.com/PowerShell/< module name >
    For example, for the xCertificate module, go to:
    https://github.com/PowerShell/xCertificate.

    All DSC modules are also listed as submodules of the DscResources repository in the xDscResources folder.

    How to Contribute

    You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
    See our contributing guide for more info on how to become a DSC Resource Kit contributor.

    If you would like to help, please take a look at the list of open issues for the DscResources repository.
    You can also check issues for specific resource modules by going to:
    https://github.com/PowerShell/< module name >/issues
    For example:
    https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

    Your help in developing the DSC Resource Kit is invaluable to us!

    Questions, comments?

    If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

    Katie Keim
    Software Engineer
    PowerShell Team
    @katiedsc (Twitter)
    @kwirkykat (GitHub)

    Episode 111 with John Bristowe and John Liu about Office 365 development with KendoUI and Angular2—Office 365 Developer Podcast


    In episode 111 of the Office 365 Developer Podcast, Richard diZerega and Andrew Coates talk with John Bristowe and John Liu about Office 365 development with KendoUI and Angular2.

    Download the podcast.

    Weekly updates

    Show notes

    Got questions or comments about the show? Join the O365 Dev Podcast on the Office 365 Technical Network. The podcast is available on iTunes (search for "Office 365 Developer Podcast"), or you can subscribe directly with the RSS feed feeds.feedburner.com/Office365DeveloperPodcast.

    About John Bristowe

    John is a member of the Developer Relations team at Progress.

    About John Liu

    Based in Sydney, John specializes, blogs and speaks frequently on client-side scripting, custom development, workflows and forms. John loves finding ways to apply the latest web technologies to extend the SharePoint platform.

    About the hosts

    Richard is a software engineer in Microsoft’s Developer Experience (DX) group, where he helps developers and software vendors maximize their use of Microsoft cloud services in Office 365 and Azure. Richard has spent a good portion of the last decade architecting Office-centric solutions, many that span Microsoft’s diverse technology portfolio. He is a passionate technology evangelist and a frequent speaker at worldwide conferences, trainings and events. Richard is highly active in the Office 365 community, a popular blogger at aka.ms/richdizz and can be found on Twitter at @richdizz. Richard is born, raised and based in Dallas, TX, but works on a worldwide team based in Redmond. Richard is an avid builder of things (BoT), musician and lightning-fast runner.

     

    A Civil Engineer by training and a software developer by profession, Andrew Coates has been a Developer Evangelist at Microsoft since early 2004, teaching, learning and sharing coding techniques. During that time, he’s focused on .Net development on the desktop, in the cloud, on the web, on mobile devices and most recently for Office. Andrew has a number of apps in various stores and generally has far too much fun doing his job to honestly be able to call it work. Andrew lives in Sydney, Australia with his wife and two almost-grown-up children.

    Useful links

    StackOverflow

    Yammer Office 365 Technical Network

     

    The post Episode 111 with John Bristowe and John Liu about Office 365 development with KendoUI and Angular2—Office 365 Developer Podcast appeared first on Office Blogs.

    SQL Server + C#: What’s new


    This post was authored by Andrea Lam, Program Manager, SQL Server

    With the release of SQL Server v.Next public preview on Linux and Windows, the ability to connect to SQL Server on Linux, Windows, Docker or macOS (via Docker) makes cross-platform support for all connectors, including .NET Framework and .NET Core SqlClient, even more important. To enable C# developers to use the newest SQL Server features, we have been updating SqlClient with client-side support for new features.

    In .NET Framework, we have provided client-side support for Always Encrypted and added Azure Active Directory as an authentication method. We’ve also added a new connection string parameter called “Pool Blocking Period.” It can be used to select the behavior of the blocking period when connecting to Azure SQL Database and SQL Server. Many applications that connect to Azure SQL Database need to render quickly, and the exponential blocking period can be problematic, especially when throwing errors. By adding the Pool Blocking Period parameter, we aim to improve the experience for connections to Azure SQL Database. Learn more about the new parameter here.

    .NET Core is the cross-platform, open-source implementation of the .NET Framework. The project includes CoreFX, the foundational libraries of .NET Core. Cross-platform support allows developers to seamlessly run applications on their operating system of choice, regardless of the platform it was developed on. For example, an app developed on Windows can be deployed to macOS and Linux, without ever having to port any code. To connect apps written in .NET Core to SQL Server (hosted anywhere), developers can use System.Data.SqlClient available in CoreFX. By developing in .NET Core on GitHub, we have been able to get feedback quickly and are actively working to enable the breadth of application scenarios and workloads.

    Get started today

    • Check out CoreFX on GitHub! Make pull requests and let us know what you think.
    • Try the new getting started tutorials that show you how to:
      • Install SQL Server on Linux/macOS/Docker/Windows
      • Create a simple app using C# and other popular programming languages with SQL Server
      • Create a simple app using popular web frameworks and Object Relational Mapping (ORM) frameworks with SQL Server
      • Try out some cool SQL Server features that can make your apps shine

    Connect with us

    Learn more

    Other videos in this series
