
Get notified if OMS Log Analytics usage is higher than expected


Summary: Good morning everyone, Richard Rundle here, and today I want to talk about how you can get notified if your usage of Log Analytics goes above a predefined limit.

New Usage view

In mid-September, we updated the Usage view in Log Analytics so that you can get more insight into how much data is being sent to Log Analytics, and which solutions and computers are sending the most data. The usage data is calculated hourly and is available from search, which makes it easier to perform your own analysis and correlations.

The included views will give you information about:

  • How much data is sent to Log Analytics and by which computers
  • How much data is sent for each solution
  • How much data isn’t associated with a computer
  • Which computers are sending data and which computers haven’t recently sent data
  • How many nodes are sending data for each of the OMS offers (Insight & Analytics, Automation & Control, and Security & Compliance)
  • How long it takes for Log Analytics to make data searchable


These views provide a good starting point for understanding the data sent to Log Analytics, but what if you want to predict what your usage might be?

Queries to show usage

The following query will tell me how many GB of data was sent in the last 24 hours:

Type=Usage QuantityUnit=MBytes IsBillable=true TimeGenerated > NOW-24HOURS | measure sum(div(Quantity,1024)) as DataGB by Type

What if I want to predict a day’s worth of data based on how much data was sent in the last 3 hours? The following sums the data sent in the last 3 hours, multiplies by 8 (to estimate usage for the day), and then divides by 1024 (to convert MB to GB).

Type=Usage QuantityUnit=MBytes IsBillable=true TimeGenerated > NOW-3HOURS | measure sum(div(mul(Quantity,8),1024)) as EstimatedGB by Type
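As a sanity check on the arithmetic, here is a minimal Python sketch of the same extrapolation, using made-up per-hour MB figures in place of the Quantity values the Usage query would sum:

```python
# Hypothetical billable data volumes (MB) for the last 3 hours; these stand
# in for the Quantity values that the Type=Usage query sums.
last_three_hours_mb = [4200.0, 3900.0, 4500.0]

# Mirror sum(div(mul(Quantity,8),1024)): sum the last 3 hours, multiply by 8
# to project a full 24-hour day, then divide by 1024 to convert MB to GB.
estimated_gb = sum(last_three_hours_mb) * 8 / 1024
print(estimated_gb)  # 98.4375 GB projected for the day
```

With roughly 4 GB/hour flowing in, the projection lands just under the 100 GB threshold used in the alerts below.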

These two queries tell me:

  • My actual usage in the last 24 hours
  • My estimated usage for the next 24 hours

Create an alert

Let’s assume that you expect to send 100 GB of data per day to Log Analytics and want to know if you have either sent more than 100 GB, or you expect to send more than 100 GB in a day.

Because we are going to create alerts, we can modify the above queries slightly: we remove the TimeGenerated part of the query so that the alert's time window controls the time range, and we add a where command to return results only if there is more than 100 GB of data. Our modified queries look like this:

Type=Usage QuantityUnit=MBytes IsBillable=true | measure sum(div(Quantity,1024)) as DataGB by Type | where DataGB > 100

Type=Usage QuantityUnit=MBytes IsBillable=true | measure sum(div(mul(Quantity,8),1024)) as EstimatedGB by Type | where EstimatedGB > 100

To alert on a different data volume, change the 100 in the above queries to the number of GB you want to alert on.

The following screenshot shows an alert being created for the first query, which fires when more than 100 GB of data has been sent in 24 hours.

I’ve set the time window to 24 hours, so the query looks at the last 24 hours of data, and I check the alert only once per hour because the usage data updates only once per hour.

Note: I don’t use a metric measurement alert because I’m only calculating a single value. If I removed the where command and added an interval to my query so that it returned multiple values, I could use a metric alert.

Create an alert for a query

I can use the same process to create an alert for the second query. For the second query, I set the time window to 3 hours and set the frequency to every 1 hour.

Now, if my data volume exceeds, or is expected to exceed, 100 GB in 24 hours, I will get a notification. Once notified, I can investigate what caused the spike in usage and make any needed adjustments.

Detailed documentation about creating alerts is available in the Log Analytics documentation.

That is all I have for you today. I would like to hear any feedback you have. If you don’t already have your own Log Analytics workspace and want to try out the new usage view, you can sign up and access our demo workspace.

Please feel free to send me an e-mail at Richard.Rundle@microsoft.com with questions, comments, and suggestions.

Richard Rundle
Microsoft Operations Management Team

Windows 10 SDK Preview Build 14965 Released


Today, we released a new Windows 10 Anniversary SDK Preview to be used in conjunction with Windows 10 Insider Preview (Build 14965 or greater). The Preview SDK is a pre-release and cannot be used in a production environment, so please install it only on your test machine. Preview SDK Build 14965 contains bug fixes and under-development changes to the API surface area. If you are working on an application that you need to submit to the Store, you should not install the preview.

The Preview SDK can be downloaded from the developer section on Windows Insider.

For feedback and updates to the known issues, please see the developer forum. For new feature requests, head over to our Windows Platform UserVoice.

Things to note:

What’s New

Known Issues: Windows SDK

  • Wrong GenXBF.DLL
    If you installed a previous Windows SDK flight (version 14951 or 14931), you may have an incorrect GenXBF.dll installed. After installing the Windows 10 SDK Preview build 14965, follow these steps:
  1. Exit Visual Studio
  2. Open an Administrative command prompt
  3. Type the following:

    DEL "c:\Program Files (x86)\Windows Kits\10\bin\x86\genxbf.dll"

    DEL "c:\Program Files (x86)\Windows Kits\10\bin\x64\genxbf.dll"

  4. Run Control Panel
  5. Select Uninstall a Program
  6. Highlight Windows Software Development Kit – Windows 10.0.14965.1000
  7. Click Change
  8. Select Repair
  9. Click Next

Windows SDK setup will restore the missing GenXBF.dll files with the appropriate version.

  • Visual Studio 2017 fails with HRESULT: 0x80041FE2 when trying to create C++ UWP apps targeting build 14965 SDK

This is a known problem. Here are steps to address this issue in your project file:

  1. Close the project
  2. Open up the project file in notepad or your favorite editor
  3. Add the following to the project file:
      false
  4. Reopen the project in Visual Studio

Known Issues: Microsoft Emulator

Microsoft Emulator Preview for Windows 10 Mobile (10.0.14965.0) crashes when launching

Impact:

Please note that there is a bug impacting the usage of hardware accelerated graphics in the latest release of the Mobile Emulator. Follow the instructions below to temporarily disable hardware accelerated graphics in the emulator and use the emulator with software rendered graphics (WARP).

NOTE: The following registry setting affects all Microsoft Emulators installed on your machine. You will need to remove this registry setting to re-enable hardware accelerated graphics in the emulator.

  1. Create the following registry subkey if it doesn’t exist: HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Xde\10.0
  2. Right-click the 10.0 key, point to New, and then click DWORD Value.
  3. Type DisableRemoteFx, and then press Enter.
  4. Double-click DisableRemoteFx, enter 1 in the Value data box, select the Decimal option, and then click OK.
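The four steps above amount to setting a single registry value; as a sketch, an equivalent .reg file (import it from an administrative prompt, and delete the value later to re-enable hardware accelerated graphics) would look like this:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Xde\10.0]
"DisableRemoteFx"=dword:00000001
```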

API Updates and Additions

The following API changes are under development and are new or updated in this release of the SDK.

namespace Windows.ApplicationModel.Preview.Notes {
  public sealed class NotesWindowManagerPreview {
    void SetFocusToPreviousView();
    IAsyncAction SetThumbnailImageForTaskSwitcherAsync(SoftwareBitmap bitmap);
    void ShowNoteRelativeTo(int noteViewId, int anchorNoteViewId, NotesWindowManagerPreviewShowNoteOptions options);
    void ShowNoteWithPlacement(int noteViewId, IBuffer data, NotesWindowManagerPreviewShowNoteOptions options);
  }
  public sealed class NotesWindowManagerPreviewShowNoteOptions
}
namespace Windows.Devices.Gpio {
  public sealed class GpioInterruptBuffer
  public struct GpioInterruptEvent
  public enum GpioOpenStatus {
    MuxingConflict = 3,
    UnknownError = 4,
  }
  public sealed class GpioPin : IClosable {
    GpioInterruptBuffer InterruptBuffer { get; }
    ulong InterruptCount { get; }
    void CreateInterruptBuffer();
    void CreateInterruptBuffer(int minimumCapacity);
    void StartInterruptBuffer();
    void StartInterruptBuffer(GpioPinEdge edge);
    void StartInterruptCount();
    void StartInterruptCount(GpioPinEdge edge);
    void StopInterruptBuffer();
    void StopInterruptCount();
  }
}
namespace Windows.Devices.Gpio.Provider {
  public interface IGpioInterruptBufferProvider
  public interface IGpioPinProvider2
  public struct ProviderGpioInterruptEvent
}
namespace Windows.Devices.I2c {
  public enum I2cTransferStatus {
    ClockStretchTimeout = 3,
    UnknownError = 4,
  }
}
namespace Windows.ApplicationModel {
  public sealed class Package {
    IAsyncOperation GetContentGroupAsync(string name);
    IAsyncOperation> GetContentGroupsAsync();
    IAsyncOperation SetInUseAsync(bool inUse);
    IAsyncOperation> StageContentGroupsAsync(IIterable names);
    IAsyncOperation> StageContentGroupsAsync(IIterable names, bool moveToHeadOfQueue);
  }
  public sealed class PackageCatalog {
    event TypedEventHandler PackageContentGroupStaging;
    IAsyncOperation AddOptionalPackageAsync(string optionalPackageFamilyName);
  }
  public sealed class PackageContentGroup
  public sealed class PackageContentGroupStagingEventArgs
  public enum PackageContentGroupState
}
namespace Windows.ApplicationModel.Activation {
  public enum ActivationKind {
    ContactPanel = 1017,
    LockScreenComponent = 1016,
  }
  public sealed class ContactPanelActivatedEventArgs : IActivatedEventArgs, IActivatedEventArgsWithUser, IContactPanelActivatedEventArgs
  public interface IContactPanelActivatedEventArgs
  public sealed class LockScreenComponentActivatedEventArgs : IActivatedEventArgs
  public sealed class ToastNotificationActivatedEventArgs : IActivatedEventArgs, IActivatedEventArgsWithUser, IApplicationViewActivatedEventArgs, IToastNotificationActivatedEventArgs {
    int CurrentlyShownApplicationViewId { get; }
  }
}
namespace Windows.ApplicationModel.Background {
  public sealed class GattCharacteristicNotificationTrigger : IBackgroundTrigger {
    public GattCharacteristicNotificationTrigger(GattCharacteristic characteristic, BluetoothEventTriggeringMode eventTriggeringMode);
    BluetoothEventTriggeringMode EventTriggeringMode { get; }
  }
  public sealed class GattServiceProviderTrigger : IBackgroundTrigger
}
namespace Windows.ApplicationModel.Contacts {
  public sealed class ContactAnnotation {
    string ContactGroupId { get; set; }
    string ContactListId { get; set; }
  }
  public enum ContactAnnotationOperations : uint {
    Share = (uint)32,
  }
  public sealed class ContactAnnotationStore {
    IAsyncOperation> FindAnnotationsForContactGroupAsync(string contactGroupId);
    IAsyncOperation> FindAnnotationsForContactListAsync(string contactListId);
  }
  public sealed class ContactGroup
  public sealed class ContactGroupMember
  public sealed class ContactGroupMemberBatch
  public sealed class ContactGroupMemberReader
  public enum ContactGroupOtherAppReadAccess
  public static class ContactManager {
    public static IAsyncOperation IsShowFullContactCardSupportedAsync();
  }
  public sealed class ContactManagerForUser {
    void ShowFullContactCard(Contact contact, FullContactCardOptions fullContactCardOptions);
  }
  public sealed class ContactPanel
  public sealed class ContactPanelClosingEventArgs
  public sealed class ContactPanelLaunchFullAppRequestedEventArgs
  public sealed class ContactPicker {
    User User { get; }
    public static ContactPicker CreateForUser(User user);
    public static IAsyncOperation IsSupportedAsync();
  }
  public sealed class ContactStore {
    IAsyncOperation CreateContactGroupAsync(string displayName);
    IAsyncOperation CreateContactGroupAsync(string displayName, string userDataAccountId);
    IAsyncOperation> FindContactGroupsAsync();
    IAsyncOperation> FindContactGroupsByRemoteIdAsync(string remoteId);
    IAsyncOperation GetContactGroupAsync(string contactGroupId);
  }
  public sealed class PinnedContactIdsQueryResult
  public sealed class PinnedContactManager
  public enum PinnedContactSurface
}
namespace Windows.ApplicationModel.Core {
  public sealed class CoreApplicationView {
    IPropertySet Properties { get; }
  }
}
namespace Windows.ApplicationModel.DataTransfer {
  public sealed class DataTransferManager {
    public static void ShowShareUI(ShareUIOptions shareOptions);
  }
  public sealed class ShareUIOptions
}
namespace Windows.ApplicationModel.Email {
  public sealed class EmailMessage {
    IVector ReplyTo { get; }
    EmailRecipient SentRepresenting { get; set; }
  }
}
namespace Windows.ApplicationModel.Store.LicenseManagement {
  public static class LicenseManager {
    public static IAsyncAction RefreshLicensesAsync(LicenseRefreshOption refreshOption);
  }
  public enum LicenseRefreshOption
}
namespace Windows.ApplicationModel.UserDataAccounts {
  public sealed class UserDataAccount {
    bool CanShowCreateContactGroup { get; set; }
    bool IsProtectedUnderLock { get; set; }
    IPropertySet ProviderProperties { get; }
    IAsyncOperation> FindContactGroupsAsync();
    IAsyncOperation> FindUserDataTaskListsAsync();
    IAsyncOperation TryShowCreateContactGroupAsync();
  }
  public sealed class UserDataAccountStore {
    IAsyncOperation CreateAccountAsync(string userDisplayName, string packageRelativeAppId, string enterpriseId);
  }
}
namespace Windows.ApplicationModel.UserDataTasks {
  public sealed class UserDataTask
  public sealed class UserDataTaskBatch
  public enum UserDataTaskDaysOfWeek : uint
  public enum UserDataTaskDetailsKind
  public enum UserDataTaskKind
  public sealed class UserDataTaskList
  public sealed class UserDataTaskListLimitedWriteOperations
  public enum UserDataTaskListOtherAppReadAccess
  public enum UserDataTaskListOtherAppWriteAccess
  public sealed class UserDataTaskListSyncManager
  public enum UserDataTaskListSyncStatus
  public static class UserDataTaskManager
  public sealed class UserDataTaskManagerForUser
  public enum UserDataTaskPriority
  public enum UserDataTaskQueryKind
  public sealed class UserDataTaskQueryOptions
  public enum UserDataTaskQuerySortProperty
  public sealed class UserDataTaskReader
  public sealed class UserDataTaskRecurrenceProperties
  public enum UserDataTaskRecurrenceUnit
  public sealed class UserDataTaskRegenerationProperties
  public enum UserDataTaskRegenerationUnit
  public enum UserDataTaskSensitivity
  public sealed class UserDataTaskStore
  public enum UserDataTaskStoreAccessType
  public enum UserDataTaskWeekOfMonth
}
namespace Windows.ApplicationModel.UserDataTasks.DataProvider {
  public sealed class UserDataTaskDataProviderConnection
  public sealed class UserDataTaskDataProviderTriggerDetails
  public sealed class UserDataTaskListCompleteTaskRequest
  public sealed class UserDataTaskListCompleteTaskRequestEventArgs
  public sealed class UserDataTaskListCreateOrUpdateTaskRequest
  public sealed class UserDataTaskListCreateOrUpdateTaskRequestEventArgs
  public sealed class UserDataTaskListDeleteTaskRequest
  public sealed class UserDataTaskListDeleteTaskRequestEventArgs
  public sealed class UserDataTaskListSkipOccurrenceRequest
  public sealed class UserDataTaskListSkipOccurrenceRequestEventArgs
  public sealed class UserDataTaskListSyncManagerSyncRequest
  public sealed class UserDataTaskListSyncManagerSyncRequestEventArgs
}
namespace Windows.Gaming.Input {
  public sealed class FlightStick : IGameController
  public enum FlightStickButtons : uint
  public struct FlightStickReading
  public enum GameControllerSwitchKind
  public enum GameControllerSwitchPosition
  public sealed class RawGameController : IGameController
}
namespace Windows.Gaming.Input.Custom {
  public sealed class HidGameControllerProvider : IGameControllerProvider
  public interface IHidGameControllerInputSink : IGameControllerInputSink
}
namespace Windows.Graphics.Printing.PrintTicket {
  public interface IPrintTicketSchemaDisplayableElement : IPrintTicketSchemaElement
  public interface IPrintTicketSchemaElement
  public interface IPrintTicketSchemaOption : IPrintTicketSchemaDisplayableElement, IPrintTicketSchemaElement
  public interface IPrintTicketSchemaParameterDefinition : IPrintTicketSchemaElement
  public interface IPrintTicketSchemaValue
  public sealed class PrintTicketSchemaCapabilities : IPrintTicketSchemaElement
  public sealed class PrintTicketSchemaFeature : IPrintTicketSchemaDisplayableElement, IPrintTicketSchemaElement
  public sealed class PrintTicketSchemaParameterInitializer : IPrintTicketSchemaElement
  public enum tagSchemaParameterDataType
  public enum tagSchemaSelectionType
  public enum tagValueType
  public sealed class WorkflowPrintSchemaTicket : IPrintTicketSchemaElement
  public sealed class XmlNode
}
namespace Windows.Graphics.Printing.Workflow {
  public interface IPrinterPropertyBag
  public sealed class PrinterQueue
  public sealed class PrintTaskBackgroundSessionManager
  public sealed class PrintTaskConfig
  public sealed class PrintTaskForegroundSessionManager
  public sealed class PrintTaskSessionState
  public enum PrintTaskSessionStatus
  public sealed class PrintTaskSetupEventArgs
  public sealed class PrintTaskSubmissionController
  public sealed class PrintTaskSubmittedEventArgs
  public sealed class PrintTaskTarget
  public sealed class PrintTaskUIActivatedEventArgs : IActivatedEventArgs
  public sealed class PrintTaskXpsDataAvailableEventArgs
  public sealed class SourceContent
  public sealed class SpoolStreamContent
  public sealed class StreamTarget
  public sealed class WorkflowTaskContext
  public sealed class WorkflowTriggerDetails
  public sealed class XpsOmContent
  public sealed class XpsOmReceiver
}
namespace Windows.Management.Deployment {
  public enum DeploymentOptions : uint {
    EnableStreamedInstall = (uint)128,
    RequiredContentGroupOnly = (uint)256,
  }
  public sealed class PackageManager {
    IAsyncOperationWithProgress AddPackageAsync(Uri packageUri, IIterable dependencyPackageUris, DeploymentOptions deploymentOptions, PackageVolume targetVolume, IIterable optionalPackageFamilyNames, IIterable externalPackageUris);
    IAsyncOperationWithProgress RegisterPackageByFamilyNameAsync(string mainPackageFamilyName, IIterable dependencyPackageFamilyNames, DeploymentOptions deploymentOptions, PackageVolume appDataVolume, IIterable optionalPackageFamilyNames);
    IAsyncOperationWithProgress StagePackageAsync(Uri packageUri, IIterable dependencyPackageUris, DeploymentOptions deploymentOptions, PackageVolume targetVolume, IIterable optionalPackageFamilyNames, IIterable externalPackageUris);
  }
}
namespace Windows.Management.Policies {
  public sealed class BinaryPolicy
  public sealed class BooleanPolicy
  public static class BrowserPolicies
  public sealed class BrowserPoliciesForUser
  public sealed class Int32Policy
  public sealed class StringPolicy
}
namespace Windows.Media {
  public sealed class MediaExtensionManager {
    void RegisterMediaExtensionForAppService(IMediaExtension extension, AppServiceConnection connection);
  }
  public sealed class MediaMarkerSpeechSentenceBoundary : IMediaMarker
  public sealed class MediaMarkerSpeechWordBoundary : IMediaMarker
  public static class MediaMarkerTypes {
    public static string SentenceBoundary { get; }
    public static string WordBoundary { get; }
  }
  public struct MediaTimeRange
}
namespace Windows.Media.Capture {
  public sealed class MediaCaptureInitializationSettings {
    bool AlwaysPlaySystemShutterSound { get; set; }
  }
}
namespace Windows.Media.Core {
  public sealed class ChapterCue : IMediaCue
  public sealed class DataCue : IMediaCue {
    PropertySet Properties { get; }
  }
  public sealed class ImageCue : IMediaCue
  public sealed class MediaBindingEventArgs {
    void SetAdaptiveMediaSource(AdaptiveMediaSource mediaSource);
    void SetStorageFile(IStorageFile file);
  }
  public sealed class MediaSource : IClosable, IMediaPlaybackSource {
    AdaptiveMediaSource AdaptiveMediaSource { get; }
    MediaStreamSource MediaStreamSource { get; }
    MseStreamSource MseStreamSource { get; }
    Uri Uri { get; }
  }
  public sealed class MediaStreamSource : IMediaSource {
    IReference MaxSupportedPlaybackRate { get; set; }
  }
  public enum TimedMetadataKind {
    ImageSubtitle = 6,
  }
  public enum TimedTextFontStyle
  public sealed class TimedTextSource {
    public static TimedTextSource CreateFromStreamWithIndex(IRandomAccessStream stream, IRandomAccessStream indexStream);
    public static TimedTextSource CreateFromStreamWithIndex(IRandomAccessStream stream, IRandomAccessStream indexStream, string defaultLanguage);
    public static TimedTextSource CreateFromUriWithIndex(Uri uri, Uri indexUri);
    public static TimedTextSource CreateFromUriWithIndex(Uri uri, Uri indexUri, string defaultLanguage);
  }
  public sealed class TimedTextStyle {
    TimedTextFontStyle FontStyle { get; set; }
    bool IsLineThroughEnabled { get; set; }
    bool IsOverlineEnabled { get; set; }
    bool IsUnderlineEnabled { get; set; }
  }
}
namespace Windows.Media.Core.Preview {
  public static class SoundLevelBroker
}
namespace Windows.Media.MediaProperties {
  public static class MediaEncodingSubtypes {
    public static string D16 { get; }
    public static string L16 { get; }
    public static string L8 { get; }
    public static string Vp9 { get; }
  }
  public enum SphericalVideoFrameFormat
  public sealed class VideoEncodingProperties : IMediaEncodingProperties {
    SphericalVideoFrameFormat SphericalVideoFrameFormat { get; }
  }
}
namespace Windows.Media.Playback {
  public enum AutoLoadedDisplayPropertyKind
  public sealed class CurrentMediaPlaybackItemChangedEventArgs {
    MediaPlaybackItemChangedReason Reason { get; }
  }
  public sealed class MediaPlaybackItem : IMediaPlaybackSource {
    AutoLoadedDisplayPropertyKind AutoLoadedDisplayProperties { get; set; }
    bool IsDisabledInPlaybackList { get; set; }
    double TotalDownloadProgress { get; }
  }
  public enum MediaPlaybackItemChangedReason
  public sealed class MediaPlaybackList : IMediaPlaybackSource {
    IReference MaxPlayedItemsToKeepOpen { get; set; }
  }
  public sealed class MediaPlaybackSession {
    bool IsMirroring { get; set; }
    MediaPlaybackSphericalVideoProjection SphericalVideoProjection { get; }
    event TypedEventHandler BufferedRangesChanged;
    event TypedEventHandler PlayedRangesChanged;
    event TypedEventHandler SeekableRangesChanged;
    event TypedEventHandler SupportedPlaybackRatesChanged;
    IVectorView GetBufferedRanges();
    IVectorView GetPlayedRanges();
    IVectorView GetSeekableRanges();
    bool IsSupportedPlaybackRateRange(double rate1, double rate2);
  }
  public sealed class MediaPlaybackSphericalVideoProjection
}
namespace Windows.Media.Protection.PlayReady {
  public interface IPlayReadyLicenseSession2 : IPlayReadyLicenseSession
  public sealed class PlayReadyLicense : IPlayReadyLicense {
    bool ExpiresInRealTime { get; }
    bool InMemoryOnly { get; }
    Guid SecureStopId { get; }
    uint SecurityLevel { get; }
  }
  public sealed class PlayReadyLicenseAcquisitionServiceRequest : IMediaProtectionServiceRequest, IPlayReadyLicenseAcquisitionServiceRequest, IPlayReadyServiceRequest {
    PlayReadyLicenseIterable CreateLicenseIterable(PlayReadyContentHeader contentHeader, bool fullyEvaluated);
  }
  public sealed class PlayReadyLicenseSession : IPlayReadyLicenseSession, IPlayReadyLicenseSession2 {
    PlayReadyLicenseIterable CreateLicenseIterable(PlayReadyContentHeader contentHeader, bool fullyEvaluated);
  }
}
namespace Windows.Media.SpeechSynthesis {
  public sealed class SpeechSynthesisOptions
  public sealed class SpeechSynthesizer : IClosable {
    SpeechSynthesisOptions Options { get; }
  }
}
namespace Windows.Media.Streaming.Adaptive {
  public sealed class AdaptiveMediaSource : IClosable, IMediaSource {
    IReference DesiredSeekableWindowSize { get; set; }
    AdaptiveMediaSourceDiagnostics Diagnostics { get; }
    IReference MaxSeekableWindowSize { get; }
    IReference MinLiveOffset { get; }
    void Close();
    AdaptiveMediaSourceCorrelatedTimes GetCorrelatedTimes();
  }
  public sealed class AdaptiveMediaSourceCorrelatedTimes
  public sealed class AdaptiveMediaSourceDiagnosticAvailableEventArgs
  public sealed class AdaptiveMediaSourceDiagnostics
  public enum AdaptiveMediaSourceDiagnosticType
  public sealed class AdaptiveMediaSourceDownloadBitrateChangedEventArgs {
    AdaptiveMediaSourceDownloadBitrateChangedReason Reason { get; }
  }
  public enum AdaptiveMediaSourceDownloadBitrateChangedReason
}
namespace Windows.Networking.NetworkOperators {
  public sealed class MobileBroadbandAccount {
    Uri AccountExperienceUrl { get; }
  }
  public sealed class MobileBroadbandDeviceInformation {
    string SimGid1 { get; }
    string SimPnn { get; }
    string SimSpn { get; }
  }
}
namespace Windows.Payments {
  public interface IPaymentItem
  public sealed class PaymentAddress
  public static class PaymentAppRegistration
  public sealed class PaymentCurrencyAmount
  public sealed class PaymentDetails
  public sealed class PaymentDetailsModifier
  public sealed class PaymentItem : IPaymentItem
  public static class PaymentMediator
  public sealed class PaymentMerchantInfo
  public sealed class PaymentMethodData
  public enum PaymentOptionPresence
  public sealed class PaymentOptions
  public sealed class PaymentRequest
  public sealed class PaymentRequestChangedEventArgs
  public delegate IAsyncOperation PaymentRequestChangedEventHandler(PaymentRequest paymentRequest, PaymentRequestChangedEventArgs args);
  public sealed class PaymentRequestChangedEventResult
  public enum PaymentRequestChangeSource
  public enum PaymentRequestCompletionStatus
  public enum PaymentRequestStatus
  public sealed class PaymentRequestSubmitResult
  public sealed class PaymentResponse
  public sealed class PaymentShippingOption : IPaymentItem
  public sealed class PaymentToken
  public sealed class PaymentTransaction
  public sealed class PaymentTransactionAcceptResult
}
namespace Windows.Perception.Spatial.Preview {
  public interface ISpatialAnchorStorage
  public sealed class SpatialAnchorMetadata
  public enum SpatialAnchorStorageContentChange
  public sealed class SpatialAnchorStorageContentChangedEventArgs
  public sealed class SpatialElement
  public sealed class SpatialElementChangedEventArgs
  public sealed class SpatialElementStore
}
namespace Windows.Perception.Spatial.Preview.Sharing {
  public interface ISpatialSharingSession
  public interface ISpatialSharingSessionHost
  public interface ISpatialSharingSessionManager
  public sealed class SessionChangedEventArgs
  public sealed class SessionInviteReceivedEventArgs
  public sealed class SessionMessageReceivedEventArgs
  public sealed class SessionParticipantEventArgs
  public sealed class SessionParticipantLeftEventArgs
  public sealed class SpatialSharingDevice
  public sealed class SpatialSharingQueryResult
  public sealed class SpatialSharingSession : ISpatialAnchorStorage, ISpatialSharingSession
  public sealed class SpatialSharingSessionHost : ISpatialSharingSessionHost
  public sealed class SpatialSharingSessionInvite
  public sealed class SpatialSharingSessionManager : ISpatialSharingSessionManager
  public sealed class SpatialSharingSessionParticipant
  public enum SpatialSharingSessionState
  public sealed class SpatialSharingSessionToken
}
namespace Windows.Security.Cryptography.Certificates {
  public sealed class CertificateExtension
  public sealed class CertificateRequestProperties {
    IVector Extensions { get; }
    SubjectAlternativeNameInfo SubjectAlternativeName { get; }
    IVector SuppressedDefaults { get; }
  }
  public sealed class SubjectAlternativeNameInfo {
    IVector DistinguishedNames { get; }
    IVector DnsNames { get; }
    IVector EmailNames { get; }
    CertificateExtension Extension { get; }
    IVector IPAddresses { get; }
    IVector PrincipalNames { get; }
    IVector Urls { get; }
  }
}
namespace Windows.Services.Cortana {
  public enum CortanaPermission
  public enum CortanaPermissionsChangeResult
  public sealed class CortanaPermissionsManager
}
namespace Windows.Services.Maps {
  public sealed class EnhancedWaypoint
  public static class MapRouteFinder {
    public static IAsyncOperation GetDrivingRouteFromEnhancedWaypointsAsync(IIterable waypoints);
    public static IAsyncOperation GetDrivingRouteFromEnhancedWaypointsAsync(IIterable waypoints, MapRouteDrivingOptions options);
  }
  public static class MapService {
    public static MapServiceDataUsagePreference DataUsagePreference { get; set; }
  }
  public enum MapServiceDataUsagePreference
  public enum WaypointKind
}
namespace Windows.Services.Maps.OfflineMaps {
  public sealed class OfflineMapPackage
  public sealed class OfflineMapPackageQueryResult
  public enum OfflineMapPackageQueryStatus
  public sealed class OfflineMapPackageStartDownloadResult
  public enum OfflineMapPackageStartDownloadStatus
  public enum OfflineMapPackageStatus
}
namespace Windows.System {
  public sealed class DispatcherQueue
  public delegate void DispatcherQueueHandler();
  public delegate IAsyncAction DispatcherQueueHandlerAsync();
  public sealed class DispatcherQueueOptions
  public enum DispatcherQueuePriority
  public sealed class DispatcherQueueTimer
}
namespace Windows.System.Preview.RemoteSessions {
  public enum BinaryChannelTransportMode
  public sealed class RemoteSession
  public sealed class RemoteSessionAddedEventArgs
  public sealed class RemoteSessionBinaryChannel
  public sealed class RemoteSessionBinaryMessageReceivedEventArgs
  public enum RemoteSessionConnectionStatus
  public sealed class RemoteSessionConnectResult
  public sealed class RemoteSessionDisconnectedEventArgs
  public enum RemoteSessionDisconnectedReason
  public sealed class RemoteSessionInfo
  public sealed class RemoteSessionInvitationManager
  public sealed class RemoteSessionInvitationReceivedEventArgs
  public sealed class RemoteSessionJoinRequest
  public sealed class RemoteSessionJoinRequestedEventArgs
  public sealed class RemoteSessionParticipant
  public sealed class RemoteSessionParticipantChangedEventArgs
  public sealed class RemoteSessionRemovedEventArgs
  public sealed class RemoteSessionUpdatedEventArgs
  public sealed class RemoteSessionWatcher
}
namespace Windows.System.Profile {
  public static class EducationSettings
}
namespace Windows.System.RemoteSystems {
  public sealed class RemoteSystem {
    IAsyncOperation GetResourceAvailableAsync(string query);
  }
}
namespace Windows.System.RemoteSystems.Preview {
  public static class RemoteSystemResourceQuery
}
namespace Windows.UI.Composition {
  public class CompositionDrawingSurface : CompositionObject, ICompositionSurface {
  }
  public sealed class CompositionGraphicsDevice : CompositionObject {
    CompositionVirtualDrawingSurface CreateVirtualDrawingSurface(Size sizePixels, DirectXPixelFormat pixelFormat, DirectXAlphaMode alphaMode);
  }
  public sealed class CompositionVirtualDrawingSurface : CompositionDrawingSurface, ICompositionSurface
  public sealed class CompositionVisualSurface : CompositionObject, ICompositionSurface
  public sealed class CompositionWindowBackdropBrush : CompositionBrush
  public sealed class Compositor : IClosable {
    CompositionVisualSurface CreateVisualSurface();
    CompositionWindowBackdropBrush CreateWindowBackdropBrush();
  }
  public sealed class LayerVisual : ContainerVisual {
    CompositionShadow Shadow { get; set; }
  }
  public class Visual : CompositionObject {
    Vector3 RelativeOffset { get; set; }
    Vector2 RelativeSize { get; set; }
    Visual TransformParent { get; set; }
  }
}
namespace Windows.UI.Core {
  public sealed class CoreWindow : ICorePointerRedirector, ICoreWindow {
    event TypedEventHandler ResizeCompleted;
    event TypedEventHandler ResizeStarted;
  }
}
namespace Windows.UI.Input {
  public static class KnownSimpleHapticsControllerWaveforms
  public sealed class RadialController {
    event TypedEventHandler ButtonHolding;
    event TypedEventHandler ButtonPressed;
    event TypedEventHandler ButtonReleased;
  }
  public sealed class RadialControllerButtonClickedEventArgs {
    SimpleHapticsController SimpleHapticsController { get; }
  }
  public sealed class RadialControllerButtonHoldingEventArgs
  public sealed class RadialControllerButtonPressedEventArgs
  public sealed class RadialControllerButtonReleasedEventArgs
  public sealed class RadialControllerConfiguration {
    RadialController ActiveControllerWhenMenuIsSuppressed { get; set; }
    bool IsMenuSuppressed { get; set; }
  }
  public sealed class RadialControllerControlAcquiredEventArgs {
    bool IsButtonPressed { get; }
    SimpleHapticsController SimpleHapticsController { get; }
  }
  public sealed class RadialControllerMenuItem {
    public static RadialControllerMenuItem CreateFromFontGlyph(string displayText, string glyph, string fontFamily);
    public static RadialControllerMenuItem CreateFromFontGlyph(string displayText, string glyph, string fontFamily, Uri fontUri);
  }
  public sealed class RadialControllerRotationChangedEventArgs {
    bool IsButtonPressed { get; }
    SimpleHapticsController SimpleHapticsController { get; }
  }
  public sealed class RadialControllerScreenContactContinuedEventArgs {
    bool IsButtonPressed { get; }
    SimpleHapticsController SimpleHapticsController { get; }
  }
  public sealed class RadialControllerScreenContactEndedEventArgs
  public sealed class RadialControllerScreenContactStartedEventArgs {
    bool IsButtonPressed { get; }
    SimpleHapticsController SimpleHapticsController { get; }
  }
  public sealed class SimpleHapticsController
  public sealed class SimpleHapticsControllerFeedback
}
namespace Windows.UI.Input.Core {
  public sealed class RadialControllerIndependentInputSource
}
namespace Windows.UI.Input.Inking {
  public enum InkPersistenceFormat
  public sealed class InkPresenterProtractor : IInkPresenterStencil
  public sealed class InkPresenterRuler : IInkPresenterStencil {
    bool AreTickMarksVisible { get; set; }
    bool IsCompassVisible { get; set; }
  }
  public enum InkPresenterStencilKind {
    Protractor = 2,
  }
  public sealed class InkStroke {
    uint Id { get; }
    IReference StrokeDuration { get; set; }
    IReference StrokeStartedTime { get; set; }
  }
  public sealed class InkStrokeBuilder {
    InkStroke CreateStrokeFromInkPoints(IIterable inkPoints, Matrix3x2 transform, IReference strokeStartedTime, IReference strokeDuration);
  }
  public sealed class InkStrokeContainer : IInkStrokeContainer {
    InkStroke GetStrokeById(uint id);
    IAsyncOperationWithProgress SaveAsync(IOutputStream outputStream, InkPersistenceFormat inkPersistenceFormat);
  }
}
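The InkPersistenceFormat enum and the new InkStrokeContainer.SaveAsync overload above make the stroke serialization format an explicit choice. A minimal sketch (must run inside an async method; `file` is an illustrative StorageFile the app can write to):

```csharp
using Windows.Storage;
using Windows.Storage.Streams;
using Windows.UI.Input.Inking;

// Sketch: persist collected strokes, choosing the serialization format explicitly.
// 'file' is assumed to be a StorageFile the app has write access to.
var container = new InkStrokeContainer();
// ... strokes would normally be collected through an InkPresenter ...

using (IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.ReadWrite))
{
    // GifWithEmbeddedIsf keeps the ink viewable as an ordinary GIF while
    // embedding the full ISF stroke data; InkPersistenceFormat.Isf saves raw ISF only.
    await container.SaveAsync(stream, InkPersistenceFormat.GifWithEmbeddedIsf);
}
```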
namespace Windows.UI.Input.Spatial {
  public sealed class SpatialHoldCompletedEventArgs {
    SpatialPointingPose TryGetPointingPose(SpatialCoordinateSystem coordinateSystem);
  }
  public sealed class SpatialHoldStartedEventArgs {
    SpatialPointingPose TryGetPointingPose(SpatialCoordinateSystem coordinateSystem);
  }
  public sealed class SpatialInteractionDetectedEventArgs {
    SpatialPointingPose TryGetPointingPose(SpatialCoordinateSystem coordinateSystem);
  }
  public enum SpatialInteractionKind
  public sealed class SpatialInteractionSource {
    bool SupportsPointing { get; }
  }
  public sealed class SpatialInteractionSourceEventArgs {
    SpatialInteractionKind InteractionKind { get; }
    SpatialPointingPose TryGetPointingPose(SpatialCoordinateSystem coordinateSystem);
  }
  public sealed class SpatialInteractionSourceState {
    bool IsGrasped { get; }
    bool IsPrimaryPressed { get; }
    bool IsSecondaryPressed { get; }
    SpatialPointingPose TryGetPointingPose(SpatialCoordinateSystem coordinateSystem);
  }
  public sealed class SpatialPointerPose {
    SpatialPointingPose TryGetPointingPose(SpatialInteractionSource source);
  }
  public sealed class SpatialPointingPose
  public sealed class SpatialTappedEventArgs {
    SpatialPointingPose TryGetPointingPose(SpatialCoordinateSystem coordinateSystem);
  }
}
namespace Windows.UI.Notifications {
  public sealed class NotificationData
  public enum NotificationUpdateResult
  public sealed class ToastCollection
  public sealed class ToastCollectionManager
  public sealed class ToastNotification {
    NotificationData Data { get; set; }
  }
  public sealed class ToastNotificationHistoryChangedTriggerDetail {
    string CollectionId { get; }
  }
  public static class ToastNotificationManager {
    public static ToastNotificationManagerForUser Current { get; }
  }
  public sealed class ToastNotificationManagerForUser {
    IAsyncOperation GetHistoryForToastCollectionIdAsync(string collectionId);
    ToastCollectionManager GetToastCollectionManager();
    ToastCollectionManager GetToastCollectionManager(string appId);
    IAsyncOperation GetToastNotifierForToastCollectionIdAsync(string collectionId);
  }
  public sealed class ToastNotifier {
    NotificationUpdateResult Update(NotificationData data, string tag);
    NotificationUpdateResult Update(NotificationData data, string tag, string group);
  }
}
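The NotificationData and ToastNotifier.Update members listed above allow a toast that is already on screen to be updated in place instead of being re-posted. A minimal sketch, assuming a toast tagged "download" was shown earlier with its progress bar bound to the {progressValue} and {progressStatus} data keys (the tag and key names are illustrative):

```csharp
using Windows.UI.Notifications;

// Sketch: update a previously shown toast in place rather than sending a new one.
var data = new NotificationData();
data.Values["progressValue"] = "0.75";          // bound value in the toast XML
data.Values["progressStatus"] = "Downloading...";

var notifier = ToastNotificationManager.CreateToastNotifier();
NotificationUpdateResult result = notifier.Update(data, "download");
// 'result' reports whether the toast was still present, e.g.
// NotificationUpdateResult.NotificationNotFound if the user already dismissed it.
```

Because Update returns a NotificationUpdateResult, the caller can detect a dismissed toast and fall back to showing a fresh one.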
namespace Windows.UI.Text {
  public enum TextDecorations : uint
}
namespace Windows.UI.ViewManagement {
  public sealed class ApplicationView {
    IAsyncOperation TryConsolidateAsync();
  }
  public sealed class ApplicationViewConsolidatedEventArgs {
    bool IsAppInitiated { get; }
  }
}
namespace Windows.UI.WebUI {
  public sealed class WebUIContactPanelActivatedEventArgs : IActivatedEventArgs, IActivatedEventArgsDeferral, IActivatedEventArgsWithUser, IContactPanelActivatedEventArgs
  public sealed class WebUILockScreenComponentActivatedEventArgs : IActivatedEventArgs, IActivatedEventArgsDeferral
}
namespace Windows.UI.Xaml {
  public sealed class BringIntoViewOptions
  public class FrameworkElement : UIElement {
    public static void DeferTree(DependencyObject element);
  }
  public class UIElement : DependencyObject {
    double KeyTipHorizontalOffset { get; set; }
    public static DependencyProperty KeyTipHorizontalOffsetProperty { get; }
    KeyTipPlacementMode KeyTipPlacementMode { get; set; }
    public static DependencyProperty KeyTipPlacementModeProperty { get; }
    double KeyTipVerticalOffset { get; set; }
    public static DependencyProperty KeyTipVerticalOffsetProperty { get; }
    XYFocusKeyboardNavigationMode XYFocusKeyboardNavigation { get; set; }
    public static DependencyProperty XYFocusKeyboardNavigationProperty { get; }
    void StartBringIntoView();
    void StartBringIntoView(BringIntoViewOptions options);
  }
}
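The new StartBringIntoView methods on UIElement above let an element ask its scrollable ancestors to scroll it into view. A minimal sketch (`targetElement` is an illustrative name for any UIElement hosted inside a ScrollViewer):

```csharp
using Windows.UI.Xaml;

// Sketch: scroll ancestors so targetElement becomes visible, animated by default.
targetElement.StartBringIntoView();

// The BringIntoViewOptions overload lets the caller opt out of the animation.
targetElement.StartBringIntoView(new BringIntoViewOptions
{
    AnimationDesired = false   // jump to the element instead of animating
});
```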
namespace Windows.UI.Xaml.Automation {
  public sealed class AutomationElementIdentifiers {
    public static AutomationProperty CultureProperty { get; }
  }
  public sealed class AutomationProperties {
    public static DependencyProperty CultureProperty { get; }
    public static int GetCulture(DependencyObject element);
    public static void SetCulture(DependencyObject element, int value);
  }
}
namespace Windows.UI.Xaml.Automation.Peers {
  public class AutomationPeer : DependencyObject {
    int GetCulture();
    virtual int GetCultureCore();
  }
  public sealed class MapControlAutomationPeer : FrameworkElementAutomationPeer, IScrollProvider, ITransformProvider, ITransformProvider2 {
    bool CanMove { get; }
    bool CanResize { get; }
    bool CanRotate { get; }
    bool CanZoom { get; }
    double MaxZoom { get; }
    double MinZoom { get; }
    double ZoomLevel { get; }
    void Move(double x, double y);
    void Resize(double width, double height);
    void Rotate(double degrees);
    void Zoom(double zoom);
    void ZoomByUnit(ZoomUnit zoomUnit);
  }
}
namespace Windows.UI.Xaml.Controls {
  public class ContentDialog : ContentControl {
    bool IsTertiaryButtonEnabled { get; set; }
    public static DependencyProperty IsTertiaryButtonEnabledProperty { get; }
    Style PrimaryButtonStyle { get; set; }
    public static DependencyProperty PrimaryButtonStyleProperty { get; }
    Style SecondaryButtonStyle { get; set; }
    public static DependencyProperty SecondaryButtonStyleProperty { get; }
    ICommand TertiaryButtonCommand { get; set; }
    object TertiaryButtonCommandParameter { get; set; }
    public static DependencyProperty TertiaryButtonCommandParameterProperty { get; }
    public static DependencyProperty TertiaryButtonCommandProperty { get; }
    Style TertiaryButtonStyle { get; set; }
    public static DependencyProperty TertiaryButtonStyleProperty { get; }
    string TertiaryButtonText { get; set; }
    public static DependencyProperty TertiaryButtonTextProperty { get; }
    event TypedEventHandler TertiaryButtonClick;
  }
  public enum ContentDialogResult {
    Tertiary = 3,
  }
  public class Control : FrameworkElement {
    Uri DefaultStyleResourceUri { get; set; }
    public static DependencyProperty DefaultStyleResourceUriProperty { get; }
  }
  public sealed class FocusEngagedEventArgs : RoutedEventArgs {
    bool Handled { get; set; }
  }
  public class Frame : ContentControl, INavigate {
    void SetNavigationState(string navigationState, bool suppressNavigate);
  }
  public class InkToolbar : Control {
    InkToolbarButtonFlyoutPlacement ButtonFlyoutPlacement { get; set; }
    public static DependencyProperty ButtonFlyoutPlacementProperty { get; }
    bool IsStencilButtonChecked { get; set; }
    public static DependencyProperty IsStencilButtonCheckedProperty { get; }
    Orientation Orientation { get; set; }
    public static DependencyProperty OrientationProperty { get; }
    event TypedEventHandler BringStencilIntoViewRequested;
    event TypedEventHandler EraserWidthChanged;
    event TypedEventHandler IsStencilButtonCheckedChanged;
    InkToolbarMenuButton GetMenuButton(InkToolbarMenuKind menu);
  }
  public enum InkToolbarButtonFlyoutPlacement
  public class InkToolbarEraserButton : InkToolbarToolButton {
    InkToolbarEraserKind EraserKind { get; set; }
    public static DependencyProperty EraserKindProperty { get; }
    bool IsClearAllVisible { get; set; }
    public static DependencyProperty IsClearAllVisibleProperty { get; }
    bool IsWidthSliderVisible { get; set; }
    public static DependencyProperty IsWidthSliderVisibleProperty { get; }
    double MaxStrokeWidth { get; set; }
    public static DependencyProperty MaxStrokeWidthProperty { get; }
    double MinStrokeWidth { get; set; }
    public static DependencyProperty MinStrokeWidthProperty { get; }
    double SelectedStrokeWidth { get; set; }
    public static DependencyProperty SelectedStrokeWidthProperty { get; }
  }
  public enum InkToolbarEraserKind
  public class InkToolbarFlyoutItem : ButtonBase
  public enum InkToolbarFlyoutItemKind
  public sealed class InkToolbarIsStencilButtonCheckedChangedEventArgs
  public class InkToolbarMenuButton : ToggleButton
  public enum InkToolbarMenuKind
  public class InkToolbarPenConfigurationControl : Control {
    InkToolbarEraserButton EraserButton { get; }
    public static DependencyProperty EraserButtonProperty { get; }
  }
  public class InkToolbarStencilButton : InkToolbarMenuButton
  public enum InkToolbarStencilKind
  public sealed class RichTextBlock : FrameworkElement {
    TextDecorations TextDecorations { get; set; }
    public static DependencyProperty TextDecorationsProperty { get; }
  }
  public sealed class TextBlock : FrameworkElement {
    TextDecorations TextDecorations { get; set; }
    public static DependencyProperty TextDecorationsProperty { get; }
  }
}
namespace Windows.UI.Xaml.Controls.Maps {
  public sealed class MapBillboard : MapElement
  public sealed class MapContextRequestedEventArgs
  public sealed class MapControl : Control {
    MapProjection MapProjection { get; set; }
    public static DependencyProperty MapProjectionProperty { get; }
    MapStyleSheet StyleSheet { get; set; }
    public static DependencyProperty StyleSheetProperty { get; }
    Thickness ViewPadding { get; set; }
    public static DependencyProperty ViewPaddingProperty { get; }
    event TypedEventHandler MapContextRequested;
    IVectorView FindMapElementsAtOffset(Point offset, double radius);
    void GetLocationFromOffset(Point offset, AltitudeReferenceSystem desiredReferenceSystem, out Geopoint location);
    void StartContinuousPan(double horizontalPixelsPerSecond, double verticalPixelsPerSecond);
    void StopContinuousPan();
    IAsyncOperation TryPanAsync(double horizontalPixels, double verticalPixels);
    IAsyncOperation TryPanToAsync(Geopoint location);
  }
  public enum MapProjection
  public enum MapStyle {
    Custom = 7,
  }
  public sealed class MapStyleSheet : DependencyObject
}
namespace Windows.UI.Xaml.Controls.Primitives {
  public class FlyoutBase : DependencyObject {
    DependencyObject OverlayInputPassThroughElement { get; set; }
    public static DependencyProperty OverlayInputPassThroughElementProperty { get; }
  }
}
namespace Windows.UI.Xaml.Documents {
  public sealed class Hyperlink : Span {
    FocusState FocusState { get; }
    public static DependencyProperty FocusStateProperty { get; }
    event RoutedEventHandler GotFocus;
    event RoutedEventHandler LostFocus;
    bool Focus(FocusState value);
  }
  public class TextElement : DependencyObject {
    double KeyTipHorizontalOffset { get; set; }
    public static DependencyProperty KeyTipHorizontalOffsetProperty { get; }
    KeyTipPlacementMode KeyTipPlacementMode { get; set; }
    public static DependencyProperty KeyTipPlacementModeProperty { get; }
    double KeyTipVerticalOffset { get; set; }
    public static DependencyProperty KeyTipVerticalOffsetProperty { get; }
    TextDecorations TextDecorations { get; set; }
    public static DependencyProperty TextDecorationsProperty { get; }
    event TypedEventHandler AccessKeyDisplayDismissed;
    event TypedEventHandler AccessKeyDisplayRequested;
    event TypedEventHandler AccessKeyInvoked;
  }
}
namespace Windows.UI.Xaml.Input {
  public sealed class AccessKeyManager {
    public static bool AreKeyTipsEnabled { get; set; }
  }
  public enum KeyTipPlacementMode
  public enum XYFocusKeyboardNavigationMode
}
namespace Windows.UI.Xaml.Markup {
  public sealed class XamlMarkupHelper
}
 
namespace Windows.Media.Capture {
  public sealed class AppCaptureDurationGeneratedEventArgs
  public sealed class AppCaptureFileGeneratedEventArgs
  public enum AppCaptureMicrophoneCaptureState
  public sealed class AppCaptureMicrophoneCaptureStateChangedEventArgs
  public enum AppCaptureRecordingState
  public sealed class AppCaptureRecordingStateChangedEventArgs
  public sealed class AppCaptureRecordOperation
  public sealed class AppCaptureServices
  public sealed class AppCaptureState
}
 
namespace Windows.Services.Store {
  public sealed class StoreContext {
    IAsyncOperation FindStoreProductForPackageAsync(IIterable productKinds, Package package);
  }
}

API Removals

namespace Windows.UI.Composition {
  public sealed class CompositionDrawingSurface : CompositionObject, ICompositionSurface {
  }
}

API Additions not yet implemented

The Bluetooth APIs were included to receive feedback from the Developer community.

namespace Windows.Devices.Bluetooth {
  public sealed class BluetoothAdapter
  public sealed class BluetoothDeviceId
  public enum BluetoothError {
    TransportNotSupported = 9,
  }
  public sealed class BluetoothLEDevice : IClosable {
    DeviceAccessInformation DeviceAccessInformation { get; }
    IAsyncOperation GetGattServicesAsync();
    IAsyncOperation GetGattServicesAsync(BluetoothCacheMode cacheMode);
    IAsyncOperation GetGattServicesForUuidAsync(GattUuid serviceUuid);
    IAsyncOperation GetGattServicesForUuidAsync(GattUuid serviceUuid, BluetoothCacheMode cacheMode);
    IAsyncOperation RequestAccessAsync();
  }
  public enum BluetoothTransportOptions : uint
}
namespace Windows.Devices.Bluetooth.Background {
  public enum BluetoothEventTriggeringMode
  public sealed class GattCharacteristicNotificationTriggerDetails {
    BluetoothError Error { get; }
    BluetoothEventTriggeringMode EventTriggeringMode { get; }
    IVectorView ValueChangedEvents { get; }
  }
  public sealed class GattServiceProviderBackgroundInfo
  public sealed class GattServiceProviderRequestActivityInfo
  public enum GattServiceProviderRequestActivityType
  public enum GattServiceProviderRequestAttributeType
  public sealed class GattServiceProviderTriggerDetails
  public enum GattServiceProviderTriggerReason
}
namespace Windows.Devices.Bluetooth.GenericAttributeProfile {
  public sealed class GattCharacteristic {
    IAsyncOperation GetDescriptorsAsync();
    IAsyncOperation GetDescriptorsAsync(BluetoothCacheMode cacheMode);
    IAsyncOperation GetDescriptorsForUuidAsync(GattUuid descriptorUuid);
    IAsyncOperation GetDescriptorsForUuidAsync(GattUuid descriptorUuid, BluetoothCacheMode cacheMode);
    IAsyncOperation WriteValueWithResultAsync(IBuffer value);
    IAsyncOperation WriteValueWithResultAsync(IBuffer value, GattWriteOption writeOption);
  }
  public sealed class GattCharacteristicsResult
  public sealed class GattClientNotificationResult
  public enum GattCommunicationStatus {
    ProtocolError = 2,
  }
  public sealed class GattDescriptor {
    IAsyncOperation WriteValueWithResultAsync(IBuffer value);
  }
  public sealed class GattDescriptorsResult
  public sealed class GattDeviceService : IClosable {
    DeviceAccessInformation DeviceAccessInformation { get; }
    GattSession Session { get; }
    public static IAsyncOperation FromIdAsync(string deviceId, GattSharingMode sharingMode);
    IAsyncOperation GetCharacteristicsAsync();
    IAsyncOperation GetCharacteristicsAsync(BluetoothCacheMode cacheMode);
    IAsyncOperation GetCharacteristicsForUuidAsync(GattUuid characteristicUuid);
    IAsyncOperation GetCharacteristicsForUuidAsync(GattUuid characteristicUuid, BluetoothCacheMode cacheMode);
    public static string GetDeviceSelector(GattUuid gattUuid);
    public static string GetDeviceSelectorForBluetoothDeviceId(BluetoothDeviceId bluetoothDeviceId);
    public static string GetDeviceSelectorForBluetoothDeviceId(BluetoothDeviceId bluetoothDeviceId, BluetoothCacheMode cacheMode);
    public static string GetDeviceSelectorForBluetoothDeviceIdAndGattUuid(BluetoothDeviceId bluetoothDeviceId, GattUuid gattUuid);
    public static string GetDeviceSelectorForBluetoothDeviceIdAndGattUuid(BluetoothDeviceId bluetoothDeviceId, GattUuid gattUuid, BluetoothCacheMode cacheMode);
    IAsyncOperation GetIncludedServicesAsync();
    IAsyncOperation GetIncludedServicesAsync(BluetoothCacheMode cacheMode);
    IAsyncOperation GetIncludedServicesForUuidAsync(GattUuid serviceUuid);
    IAsyncOperation GetIncludedServicesForUuidAsync(GattUuid serviceUuid, BluetoothCacheMode cacheMode);
    IAsyncOperation RequestAccessAsync(GattSharingMode sharingMode);
  }
  public sealed class GattDeviceServicesResult
  public sealed class GattLocalCharacteristic
  public sealed class GattLocalCharacteristicParameters
  public sealed class GattLocalDescriptor
  public sealed class GattLocalDescriptorParameters
  public sealed class GattPresentationFormat {
    public static GattPresentationFormat FromParts(byte formatType, int exponent, ushort unit, byte namespaceId, ushort description);
  }
  public static class GattProtocolError
  public sealed class GattPublishedService
  public sealed class GattReadClientCharacteristicConfigurationDescriptorResult {
    IReference ProtocolError { get; }
  }
  public sealed class GattReadRequest
  public sealed class GattReadRequestedEventArgs
  public sealed class GattReadResponse
  public sealed class GattReadResult {
    IReference ProtocolError { get; }
  }
  public sealed class GattReliableWriteTransaction {
    IAsyncOperation CommitWithResultAsync();
  }
  public sealed class GattServiceProvider
  public sealed class GattServiceProviderAdvertisingParameters
  public sealed class GattServiceProviderResult
  public enum GattServiceProviderStatus
  public sealed class GattServiceProviderStatusChangedEventArgs
  public enum GattServiceType
  public sealed class GattSession : IClosable
  public enum GattSessionStatus
  public sealed class GattSessionStatusChangedEventArgs
  public enum GattSharingMode
  public sealed class GattSubscribedClient
  public sealed class GattUuid
  public sealed class GattWriteRequest
  public sealed class GattWriteRequestedEventArgs
  public sealed class GattWriteResponse
  public sealed class GattWriteResult
}
namespace Windows.Devices.Bluetooth.Rfcomm {
  public sealed class RfcommDeviceService : IClosable {
    public static IAsyncOperation FromIdWithResultAsync(string deviceId);
  }
  public sealed class RfcommServiceProvider {
    public static IAsyncOperation CreateWithResultAsync(RfcommServiceId serviceId);
  }
  public sealed class RfcommServiceProviderResult
}
 

A lightning tour of the new Microsoft Tech Community


On today’s Microsoft Mechanics, community manager Anna Chu gives a lightning tour of the Microsoft Tech Community. If you haven’t seen or joined the community, it’s the single place where experts in the community can share, collaborate and learn about products and services across Microsoft.

In this demonstration, Anna shows how to sign up to become a member of the community, search for answers to questions, personalize your experience to just the products and services that you want to follow, post new conversations and reply to conversations, view member status and build your community status, and access popular event content within the community.

As you’ll see in this hands-on demonstration, the Microsoft Tech Community is the place to connect with peers, experts and Microsoft staff to follow and contribute to conversations across your favorite Microsoft products, services and events. We’ve recently added communities and spaces for Microsoft Teams and Surface Devices with more on the way.

To get started, sign up at the Microsoft Tech Community. You can also follow the community on Twitter for major news and promoted posts.

—Jeremy Chapman

The post A lightning tour of the new Microsoft Tech Community appeared first on Office Blogs.

Making your money go “further” with Excel—personal finance tips from the Dostals



Marshall and Megan Dostal are the owners of Further, a forward-thinking company that produces sustainable, luxurious soap. How they get glycerin, a key ingredient, is nothing short of amazing. Marshall picks up depleted cooking oil from restaurants, then converts it into biofuel in the Further warehouse. One of the byproducts of the biofuel distillation is glycerin, which he and Megan use to create their soap. The soap is then packaged and sold to restaurants, hotels and retailers nationwide. Like many small business owners, they’re obsessed with keeping the business running smoothly and don’t have much time to spend on their personal finances. That’s one major reason why they use Excel.

“I haven’t found anything better than Excel for tracking our spending and planning for things like our savings—it’s user friendly and it’s simple. What more could you ask for?” says Megan who takes care of their family budget.

“Generally speaking, from month to month, we are on track. We run a tight ship. Our budget varies depending on how monthly sales are doing.

“A typical Excel session for recording our month-to-month spending is short and sweet. I enter in a few data points and Excel calculates everything for me. I’m not spending a ton of my day on it and that’s what makes it so valuable for me. It frees up time to do other things.

“We run really lean, so any variance is likely because of a big life event, such as a family wedding,” she adds.

To keep on top of their budget, Marshall loads all the numbers into Excel and they divide and conquer from there. “Marshall reviews them for Further and I review them for our home and family,” says Megan. “It’s super easy to download Excel files from our credit card accounts and voila, the expense report is done. No need to manually enter expenses. It’s a real time saver.”

While Megan uses Excel daily to record financial matters, Marshall has also used it to automate his biodiesel production formulae. “Making biodiesel involves a number of different variables, so I have a worksheet that automates the production formula,” he says. “Figuring out the numbers involved without Excel would be a laborious process.”

To keep spending on track and catch things they need to watch out for, Marshall and Megan discuss big purchases in advance and are able to anticipate “big months” and spend accordingly.

“I also find the Sunburst and Treemap charts helpful as something to reference,” she says. “If I look at the first three months in one chart, it’s nice to see how much we spend—for budgeting for the next year—for things like food, expenses and clothing. Of course, things happen that we cannot plan for, but all in all, we just keep open communication about everything.”

What really piqued their interest was the Get &amp; Transform data-gathering feature, which they can use to import information from websites when budgeting their vacations. “Megan does a great job planning trips for us,” says Marshall. “So if we take a spring trip, Get &amp; Transform can be really helpful budgeting for that.”

Another time-saving feature they like is Flash Fill, which fills out data in Excel cells for you and starts working when it recognizes a pattern in the data you’re inputting.

When asked how they thought about savings and planned for major life events, like college for their son and retirement, Megan said they do have an account that they opened at the time they started Further and they add to it as best they can. “When we started the company, we opened the account as our “backup,” money that we would not touch if the company went bust. Today, it serves as a great savings account. As far as college and retirement go—we could be better.”

Office has created Excel templates specifically for personal finance, like Financial Vision and the Budget Wheel, available at Tools for You. Financial Vision helps you see all your financial goals at a glance and lets you compare your spending, income and savings to see what it will take to reach your goal.

“We like to plan things out,” adds Megan. “So, if we know now that in March we’re going to spend X amount for a vacation, you fill that into the Financial Vision template and it instantly adjusts for you and you know—well, that’s what we have left.”

“I really like that template because I like bar charts more than pie charts,” says Marshall. “The colors give a better indication of your budget, and break things down better. You can see visually immediately what you’re spending.”

Marshall also likes having the ability to look at percentages in the Budget Wheel and seeing what they have left over each month for savings.

So what does the future hold for these innovative soap makers? Megan says expanding their business into new markets is their primary focus for now. Then they’ll have time to further explore their concept of reusing things that seem like they’re waste, and turning them into products that people can use. “Excel has just been a steady presence in my life. Always there behind the scenes getting the job done,” says Megan.

4 cool Excel personal finance tools

Check out all these great new features in Excel. Even if you don’t run a small business like the Dostals, you’ll find some great tools to help you understand, track and visualize your own personal finances.

[Images: screenshots of four Excel personal finance tools]

The post Making your money go “further” with Excel—personal finance tips from the Dostals appeared first on Office Blogs.

Calling all Gilmore Girls fans! Binge more and watch in 4K with Microsoft Edge


Mild spoilers for seasons 1-7 ahead, read at your own risk.

It’s an exciting week for Gilmore Girls fans around the world as Netflix prepares to release four new installments of the heartwarming fan favorite. More than nine years after the show’s network run ended in 2007, we get the opportunity to catch up with Rory, Lorelai, Emily, Luke, Lane, all the wacky faces of Stars Hollow and, of course, the boyfriends. No matter what “team” you’re on, this promises to be one of the most exciting premieres of all time.

Of course, you want to see Stars Hollow in all its Ultra HD* glory this time around – and today we are happy to announce 4K content from Netflix is now available exclusively for compatible PCs and 2-in-1 devices with Windows 10, including the new Gilmore Girls: A Year in the Life, premiering November 25th.

Your binge-watching favorites on Netflix are now available in 4K exclusively on compatible PCs and 2-in-1 devices running Windows 10. Check out the new streaming experience with Microsoft Edge, the only browser that supports Netflix 4K content. If you’re looking for a new Windows 10 device that supports 4K, head over to Microsoft Stores.**

But wait, there’s more! What better way to prepare for this major moment than to re-watch all your old favorites? And since it’s Thanksgiving week for those of us in the US, you might have some downtime on a train, at Grandma’s house or in your childhood bedroom. Make sure you’re making the most of your binge-watching experience by streaming your favorite episodes from Netflix on Microsoft Edge. Nothing is worse than your battery dying right at a pivotal moment – what if your screen went dark when Rory sees Jess before Sookie’s wedding? Or right as Logan was about to ask a very important question at Rory’s graduation party? Or, worst case, before you know Luke is “all in”?

When streaming Netflix on Microsoft Edge you can get through at least one more full episode of Gilmore Girls than when streaming on Chrome on battery*** – and you know one episode can be the difference between Chris and Lorelai casually dating and being a married couple.

And now, as I get ready to cozy in for a marathon binge session this week, I leave you with my top twelve must-watch episodes to get ready for Gilmore Girls: A Year in the Life. Copper boom!

  • Season 1, Episode 2: The Lorelais’ First Day at Chilton
  • Season 1, Episode 10: Forgiveness and Stuff
  • Season 2, Episode 4: The Road Trip to Harvard
  • Season 2, Episode 22: I Can’t Get Started
  • Season 3, Episode 7: They Shoot Gilmores, Don’t They?
  • Season 3, Episode 22: Those Are Strings, Pinocchio
  • Season 4, Episode 22: Raincoats and Recipes
  • Season 5, Episode 7: You Jump, I Jump, Jack
  • Season 5, Episode 22: A House Is Not a Home
  • Season 6, Episode 7: Twenty-One is the Loneliest Number
  • Season 6, Episode 13: Friday Night’s Alright for Fighting
  • Season 7, Episode 22: Bon Voyage

*Ultra HD availability subject to your Netflix subscription plan, Internet service, device capabilities, and content availability. netflix.com/TermsOfUse
**To run Netflix in 4K on a PC device, it must have a 4K-capable screen and use a 7th Gen Intel® Core™ Processor.
***Battery life varies significantly with settings and other factors.

Updates for Excel Services and BI in SharePoint 2016 on-premises


If your organization uses Excel Services for SharePoint to enable BI, we have some updates on what has changed in SharePoint Server 2016. They include architecture changes to the SharePoint-based on-premises Microsoft BI Stack and the benefits of upgrading your BI farms to SharePoint Server 2016, SQL Server 2016 and Office Online Server with Excel Online. We also examine upgradeability, backward compatibility and licensing. For more information, please read “Deploying SQL Server 2016 PowerPivot and Power View in SharePoint 2016.”

First, let’s look at the history of Excel web solutions on-premises:

From the diagram above, you can see that Excel Services has evolved from an optional add-on application for SharePoint Server 2007 to an inherent part of Office Online Server for on-premises. In this newest 2016 release, Excel Services capabilities are moving to Office Online Server (which used to be called Office Web Apps Server) and as a result Excel Services is being replaced with Office Online Server.

Here is a summary of the main benefits brought to you by moving to Office Online Server and the latest and greatest version of Excel in your BI deployment. Many of these benefits are not limited to BI scenarios, but are valuable in other setups as well.

  • More Excel features—Additional features include the ability to search in PivotTable filters, number formatting, the ability to view and insert comments and the addition of Excel JS APIs.
  • Robust, unified deployment—You can deploy it with other Office Web apps. Use it for viewing, editing and BI.
  • Basic Excel BI capabilities with SharePoint Standard CAL—Use data model and/or access external data. Refresh and save your refreshed workbooks.
  • Evergreen service—Always stay up to date, get many of the same features and improvements as Office 365.

Here is a look at the architecture of the Microsoft BI Stack 2016 for on-premises:

Key facts

The following list summarizes key facts about our on-premises offering in Microsoft BI Stack 2016 release.

  • All Excel web functionality including BI, viewing and editing of workbooks is now under one roof. Capabilities that were previously provided by Excel Services are now provided by Excel Online in Office Online Server.
  • Office Online Server is now a prominent member of the MS BI Stack. It requires a machine of its own. Deploy Office Online Server with Excel Online and bind it to SharePoint Server for SharePoint BI features to work.
  • SharePoint Server 2016 performance and scalability are no longer impacted by Excel Services footprint. Instead, Excel Online is deployed and scaled alongside other Office web applications using an Office Online Server farm.
  • Workbooks created in previous versions of Excel desktop client are fully supported and can be published.
  • Working with a data model requires SSAS (in PowerPivot for SharePoint mode) but doesn’t require the PowerPivot for SharePoint add-in. The latter is available only with SharePoint eCAL.
  • If possible, we recommend upgrading the on-premises environment before upgrading desktop clients. This will allow new workbooks taking advantage of new features of Excel 2016 to be fully supported by the BI farm.
  • MS BI Stack 2016 on-premises should be upgraded bottom-up. To leverage advanced SharePoint BI features provided by its add-ons, on-premises MS BI Stack 2016 should be deployed in its entirety. Otherwise, if you only need the ability to access external data (e.g., use data model, connect to OLAP/Tabular DBs), it will be sufficient to upgrade your SSAS (in PowerPivot for SharePoint mode) and deploy the Office Online Server. SharePoint Server itself does not have to be upgraded in such a case.
  • Office Online Server is evergreen. This means it is now possible to keep end users on the latest and greatest functionality through a frequent upgrade cadence. Many Office 365 improvements will now find their way to your on-premises environment.

SharePoint Server 2016 licensing has also changed. First, SharePoint Server 2016 no longer offers the Foundation SKU; it now comes only in Standard and Enterprise SKUs, with Standard CAL and Enterprise CAL licenses. Second, licensing requirements for some SharePoint Insights features have been relaxed: accessing external data, refreshing and working with a data model are now possible with all SharePoint SKUs, but advanced capabilities such as the PowerPivot for SharePoint add-in, the Reporting Services add-in, Excel Web Parts and ODC file support will still require SharePoint Enterprise CAL.

Key facts

The following list summarizes key facts about our SharePoint Server 2016 licensing changes:

  • Refresh, OLAP connectivity and Data Model interactivity are now supported in all SharePoint SKUs.
  • SQL licenses still apply, where necessary. (There is no change in this respect.)

With the new SharePoint 2016 release and Office Online Server, we’ve completed the move toward a unified architecture for viewing and editing Excel workbooks using Office Online Server. Moving to the new architecture provides users with access to the latest and greatest Excel features and adds new BI-specific features, such as search in PivotTable filters, and helps IT unify the deployment around one farm of Excel servers instead of two. In addition, we’ve simplified the licensing model, and you can expect Excel to be evergreen and keep getting new capabilities on a regular basis.

Please tell us what you think by commenting below.

—The Excel team

The post Updates for Excel Services and BI in SharePoint 2016 on-premises appeared first on Office Blogs.

Introducing Service Map for dependency-aware monitoring, troubleshooting and workload migrations

This post was authored by Nick Burling, Principal Program Manager on the Enterprise Cloud Management Team.

Managing modern business services requires visibility across the complex interdependencies of your critical applications and supporting infrastructure. Yet IT operations teams currently struggle to manage the complexity of discovering and troubleshooting the interconnected services their businesses depend on. Today, we’re announcing the public preview of Service Map, a new solution in Operations Management Suite Insight & Analytics that provides this visibility by presenting your servers as you think of them: interconnected systems that rely on other technologies to deliver business services.

Real-time dependency discovery and mapping

Service Map, which was previously called Application Dependency Monitor, discovers and maps server and process dependencies in real time, without any predefinition, and visualizes application components, service dependencies, and supporting infrastructure configuration. This helps you eliminate the guesswork of problem isolation, identify surprise connections and broken links in your environment, and perform Azure migrations knowing that critical systems and endpoints won’t be left behind. The Service Map public preview supports Windows and Linux guests, in any cloud and on-premises, discovering dependencies for any TCP-connected process running in those guests. As part of the cloud management capabilities in Operations Management Suite, Service Map enhances your ability to track dependencies in your hybrid cloud environment, making it easier to manage the complexity of multiple clouds.

Service Map

Accelerated troubleshooting and root-cause analysis

In the most recent episode of Microsoft Mechanics, I showcase the new Service Map capability and demonstrate the ways that Service Map can simplify analysis of complex operational issues. This includes a hands-on look at how to automatically discover app and system dependencies to accelerate troubleshooting and root cause analysis when used in conjunction with other Operations Management Suite services like Log Analytics and Change Tracking. You’ll see how the new Service Map capability provides dependency-aware diagnostics and troubleshooting by integrating performance data, alerts, changes, and critical security issues with rich, dynamic topology views. In addition, you’ll learn how you can take advantage of Service Map to expedite your app and workload migrations, making it easier to shift to the cloud.

As we go forward with the preview, we’re excited to get feedback from you. You can submit your comments in our User Voice forum for Service Map. To get started today, create a free Microsoft Operations Management Suite account. Learn more about Service Map on our documentation page.

Service Map is just one of the capabilities available to help you gain control over your hybrid cloud. In addition to automatic discovery for application and system dependencies, check out a related Microsoft Mechanics episode focused on unified management across Linux and Windows Servers in hybrid cloud environments.


Windows 10 Tip: Draw on your memories with the Photos app update

Did you know that with our latest update to the Photos app on Windows 10, we’ve made it easy to view all your memories, edit them with new filters and even draw on them?

 Here’s how to add ink to your photos:

 Windows 10 Tip: Windows Ink with the Photos app

Choose from one of three pen types, pick a color to draw with and use the eraser to fine-tune your work. You can use Windows Ink with your device’s pen; it’s also available with a mouse or touch if you enable the touch-writing option on the ink toolbar.

You can even add ink to your animations or videos:

Windows 10 Tip: Windows Ink with the Photos app

If you ink while the video is playing, your ink will play back in sync with the video. If you pause and write something, it will fade in and out.

Even better, watch your message come to life by sharing an animation of your drawing with friends and family as a video. Share it on Facebook, send it over email, or even text it.

Have a great week!

Deep Dive: The Storage Pool in Storage Spaces Direct

Hi! I’m Cosmos. Follow me on Twitter @cosmosdarwin.

Review

The storage pool is the collection of physical drives which form the basis of your software-defined storage. Those familiar with Storage Spaces in Windows Server 2012 or 2012R2 will remember that pools took some managing – you had to create and configure them, and then manage membership by adding or removing drives. Because of scale limitations, most deployments had multiple pools, and because data placement was essentially static (more on this later), you couldn’t really expand them once created.

We’re introducing some exciting improvements in Windows Server 2016.

What’s new

With Storage Spaces Direct, we now support up to 416 drives per pool, the same as our per-cluster maximum, and we strongly recommend you use exactly one pool per cluster. When you enable Storage Spaces Direct (as with the Enable-ClusterS2D cmdlet), this pool is automatically created and configured with the best possible settings for your deployment. Eligible drives are automatically discovered and added to the pool and, if you scale out, any new drives are added to the pool too, and data is moved around to make use of them. When drives fail they are automatically retired and removed from the pool. In fact, you really don’t need to manage the pool at all anymore except to keep an eye on its available capacity.

Nonetheless, understanding how the pool works can help you reason about fault tolerance, scale-out, and more. So if you’re curious, read on!

To help illustrate certain key points, I’ve written a script (open-source, available at the end) which produces this view of the pool’s drives, organized by type, by server (‘node’), and by how much data they’re storing. The fastest drives in each server, listed at the top, are claimed for caching.

The storage pool forms the physical basis of your software-defined storage.

The confusion begins: resiliency, extents, and striping

Let’s start with three servers forming one Storage Spaces Direct cluster.

Each server has 2 x 800 GB NVMe drives for caching and 4 x 2 TB SATA SSDs for capacity.

We can create our first volume (‘Storage Space’) and choose 1 TiB in size, two-way mirrored. This implies we will maintain two identical copies of everything in that volume, always on different drives in different servers, so that if hardware fails or is taken down for maintenance, we’re sure to still have access to all our data. Consequently, this 1 TiB volume will actually occupy 2 TiB of physical capacity on disk, its so-called ‘footprint’ on the pool.

Our 1 TiB two-way mirror volume occupies 2 TiB of physical capacity, its ‘footprint’ on the pool.

(Storage Spaces can achieve resiliency using mirroring, erasure coding, or both. For simplicity, this blog will show two-way mirroring. The concepts we’ll cover apply regardless which resiliency type you choose, but two-way mirroring is by far the most straightforward to draw and explain. Check out TechNet to learn about the various resiliency types and their respective storage efficiency.)
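The footprint arithmetic generalizes: a mirror volume's footprint is its size multiplied by the number of data copies it keeps. Here is a throwaway sketch of that rule (the function name is mine for illustration, not a Storage Spaces API; parity/erasure-coded volumes are computed differently):

```python
def pool_footprint_tib(volume_size_tib: float, copies: int) -> float:
    """Physical pool capacity consumed by a mirrored volume.

    A two-way mirror keeps 2 copies and a three-way mirror keeps 3,
    so the footprint is simply size x copy count. Illustrative only;
    erasure coding has a different (more efficient) formula.
    """
    return volume_size_tib * copies

# Our 1 TiB two-way mirror occupies 2 TiB on disk:
print(pool_footprint_tib(1, 2))  # 2
```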

Okay, so we have 2 TiB of data to write to physical media. But where will these two tebibytes of data actually land?

You might imagine that Spaces just picks any two drives, in different servers, and places the copies in whole on those drives. Alas, no. What if the volume were larger than the drive size? Okay, perhaps it spans several drives in both servers? Closer, but still no.

What actually happens can be surprising if you’ve never seen it before.

Storage Spaces starts by dividing the volume into many ‘extents’, each 256 MB in size. (People sometimes use the word ‘slab’ to mean this, too. Not quite, but close enough.) This means our 1 TiB volume has 4,096 such extents!

For each extent, two copies are made and placed on different drives in different servers. This decision is made independently for each extent, successively, with an eye toward equilibrating utilization – you can think of it like dealing playing cards into equal piles. This means every single drive in the storage pool will store some copies of some extents!
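The dealing-cards behavior is easy to model. Below is a minimal sketch: a greedy allocator that always places each extent's copies on the currently least-utilized drives in different servers. This is illustrative only, not the real Storage Spaces placement logic; the only number taken from the text is the 256 MB extent size.

```python
EXTENT_MIB = 256  # extent size, per the text

def place_extents(volume_gib: int, servers: int, drives_per_server: int,
                  copies: int = 2) -> dict:
    """Deal extent copies into the pool like playing cards.

    Tracks usage as (server, drive) -> extent-copy count. Each extent's
    copies go to the least-utilized drives, subject to the rule that
    copies must land in *different* servers. Illustrative sketch only.
    """
    usage = {(s, d): 0 for s in range(servers) for d in range(drives_per_server)}
    n_extents = volume_gib * 1024 // EXTENT_MIB
    for _ in range(n_extents):
        chosen_servers = set()
        # scan drives from least-full to most-full
        for drive in sorted(usage, key=usage.get):
            server, _index = drive
            if server not in chosen_servers:
                usage[drive] += 1          # place one copy here
                chosen_servers.add(server)
                if len(chosen_servers) == copies:
                    break
    return usage

# 1 TiB volume, 3 servers x 4 capacity drives, two-way mirror:
usage = place_extents(volume_gib=1024, servers=3, drives_per_server=4)
print(sum(usage.values()))  # 8192 -- two copies of each of 4,096 extents
```

Every drive ends up storing some copies, and all twelve fill in near lockstep, which is exactly the behavior described above.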

This can be non-obvious, but it has some real consequences you can observe. For one, it means all drives in all servers will gradually “fill up” in lockstep, in 256 MB increments. This is why we rarely pay attention to how full specific drives or servers are – because they’re (almost) always (almost) the same!

Extents of our two-way mirrored volume have landed on every drive in all three servers.

(For the curious reader: the pool keeps a sprawling mapping of which drive has each copy of each extent called the ‘pool metadata’ which can reach up to several gigabytes in size. It is replicated to at least five of the fastest drives in the cluster, and synchronized and repaired with the utmost aggressiveness. To my knowledge, pool metadata loss has never taken down an actual production deployment of Storage Spaces.)

Why? Can you spell parallelism?

This may seem complicated, and it is. So why do it? Two reasons.

Performance, performance, performance!

First, striping every volume across every drive unlocks truly awesome potential for reads and writes – especially larger sequential ones – to activate many drives in parallel, vastly increasing IOPS and IO throughput. The unrivaled performance of Storage Spaces Direct compared to competing technologies is largely attributable to this fundamental design. (There is more complexity here, with the infamous column count and interleave you may remember from 2012 or 2012R2, but that’s beyond the scope of this blog. Spaces automatically sets appropriate values for these in 2016 anyway.)

(This is also why members of the core Spaces engineering team take some offense if you compare mirroring directly to RAID-1.)

Improved data safety

The second is data safety – it’s related, but worth explaining in detail.

In Storage Spaces, when drives fail, their contents are reconstructed elsewhere based on the surviving copy or copies. We call this ‘repairing’, and it happens automatically and immediately in Storage Spaces Direct. If you think about it, repairing must involve two steps – first, reading from the surviving copy; second, writing out a new copy to replace the lost one.

Bear with me for a paragraph, and imagine if we kept whole copies of volumes. (Again, we don’t.) Imagine one drive has every extent of our 1 TiB volume, and another drive has the copy of every extent. What happens if the first drive fails? The other drive has the only surviving copy. Of every extent. To repair, we need to read from it. Every. Last. Byte. We are obviously limited by the read speed of that drive. Worse yet, we then need to write all that out again to the replacement drive or hot spare, where we are limited by its write speed. Yikes! Inevitably, this leads to contention with ongoing user or application IO activity. Not good.

Storage Spaces, unlike some of our friends in the industry, does not do this.

Consider again the scenario where some drive fails. We do lose all the extents stored on that drive. And we do need to read from each extent’s surviving copy in order to repair. But, where are these surviving copies? They are evenly distributed across almost every other drive in the pool! One lost extent might have its other copy on Drive 15; another lost extent might have its other copy on Drive 03; another lost extent might have its other copy on Drive 07; and so on. So, almost every other drive in the pool has something to contribute to the repair!

Next, we do need to write out the new copy of each – where can these new copies be written? Provided there is available capacity, each lost extent can be re-constructed on almost any other drive in the pool!

(For the curious reader: I say almost because the requirement that extent copies land in different servers precludes any drives in the same server as the failure from having anything to contribute, read-wise. They were never eligible to get the other copy. Similarly, those drives in the same server as the surviving copy are ineligible to receive the new copy, and so have nothing to contribute write-wise. This detail turns out not to be terribly consequential.)

While this can be non-obvious, it has some significant implications. Most importantly, repairing data faster minimizes the risk that multiple hardware failures will overlap in time, improving overall data safety. It is also more convenient, as it reduces the ‘resync’ wait time during rolling cluster-wide updates or maintenance. And because the read/write burden is spread thinly among all surviving drives, the load on each drive individually is light, which minimizes contention with user or application activity.
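A toy model makes the repair parallelism concrete. In the sketch below, uniform-random placement stands in for the real allocator; the drive counts come from the running example (a failed drive holding roughly 683 extent copies, with 8 eligible partner drives in the other two servers).

```python
import random

def surviving_copy_spread(lost_extents: int, eligible_drives: list,
                          seed: int = 0) -> int:
    """How many distinct drives hold the surviving copies of the extents
    lost with one failed drive. Because copies are dealt evenly, with any
    realistic extent count virtually every eligible drive has something
    to contribute to the repair read. Toy model only (uniform random)."""
    rng = random.Random(seed)
    partners = [rng.choice(eligible_drives) for _ in range(lost_extents)]
    return len(set(partners))

# ~683 lost extents, 8 eligible drives (the two other servers) in our
# 3-server, 4-drives-per-server pool:
print(surviving_copy_spread(683, list(range(8))))  # 8 -- every drive helps
```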

Reserve capacity

For this to work, you need to set aside some extra capacity in the storage pool. You can think of this as giving the contents of a failed drive “somewhere to go” to be repaired. For example, to repair from one drive failure (without immediately replacing it), you should set aside at least one drive’s worth of reserve capacity. (If you are using 2 TB drives, that means leaving 2 TB of your pool unallocated.) This serves the same function as a hot spare, but unlike an actual hot spare, the reserve capacity is taken evenly from every drive in the pool.

Reserve capacity gives the contents of a failed drive “somewhere to go” to be repaired.

Reserving capacity is not enforced by Storage Spaces, but we highly recommend it. The more you have, the less urgently you will need to scramble to replace drives when they fail, because your volumes can (and will automatically) repair into the reserve capacity, completely independent of the physical replacement process.
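As a back-of-the-envelope check, the rule of thumb above can be sketched as follows. This is my arithmetic based on the text, not an official sizing tool; use the capacity calculator for real planning.

```python
def usable_two_way_mirror_tb(capacity_drives: int, drive_tb: float,
                             reserve_drives: int = 1) -> float:
    """Usable volume capacity for a two-way-mirrored pool: set aside one
    drive's worth of reserve per drive failure you want to absorb, then
    divide the remaining raw capacity by the copy count (2).
    Rule-of-thumb sketch only, not an official sizing formula.
    """
    raw_tb = capacity_drives * drive_tb
    reserve_tb = reserve_drives * drive_tb
    return (raw_tb - reserve_tb) / 2

# 12 x 2 TB capacity drives (3 servers x 4), one drive of reserve:
print(usable_two_way_mirror_tb(12, 2))  # 11.0
```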

When you do eventually replace the drive, it will automatically take its predecessor’s place in the pool.

Check out our capacity calculator for help with determining appropriate reserve capacity.

Automatic pooling and re-balancing

New in Windows 10 and Windows Server 2016, extents and their copies can be moved around between drives in the storage pool to equilibrate utilization. We call this ‘optimizing’ or ‘re-balancing’ the storage pool, and it’s essential for scalability in Storage Spaces Direct.

For instance, what if we need to add a fourth server to our cluster?

Add-ClusterNode -Name 

The new drives in this new server will be added automatically to the storage pool. At first, they’re empty.

The capacity drives in our fourth server are empty, for now.

After 30 minutes, Storage Spaces Direct will automatically begin re-balancing the storage pool – moving extents around to even out drive utilization. This can take some time (many hours) for larger deployments. You can watch its progress using the following cmdlet.

Get-StorageJob

If you’re impatient, or if your deployment uses Shared SAS Storage Spaces with Windows Server 2016, you can kick off the re-balance yourself using the following cmdlet.

Optimize-StoragePool -FriendlyName "S2D*"

The storage pool is ‘re-balanced’ whenever new drives are added to even out utilization.

Once completed, we see that our 1 TiB volume is (almost) evenly distributed across all the drives in all four servers.

The extents of our 1 TiB two-way mirrored volume are now spread evenly across all four servers.

And going forward, when we create new volumes, they too will be distributed evenly across all drives in all servers.

This can explain one final phenomenon you might observe – that when a drive fails, every volume is marked ‘Incomplete’ for the duration of the repair. Can you figure out why?

Conclusion

Okay, that’s it for now. If you’re still reading, wow, thank you!

Let’s review some key takeaways.

  • Storage Spaces Direct automatically creates one storage pool, which grows as your deployment grows. You do not need to modify its settings, add or remove drives from the pool, nor create new pools.
  • Storage Spaces does not keep whole copies of volumes – rather, it divides them into tiny ‘extents’ which are distributed evenly across all drives in all servers. This has some practical consequences. For example, using two-way mirroring with three servers does not leave one server empty. Likewise, when drives fail, all volumes are affected for the very short time it takes to repair them.
  • Leaving some unallocated ‘reserve’ capacity in the pool allows this fast, non-invasive, parallel repair to happen even before you replace the drive.
  • The storage pool is ‘re-balanced’ whenever new drives are added, such as on scale-out or after replacement, to equilibrate how much data every drive is storing. This ensures all drives and all servers are always equally “full”.

U Can Haz Script

In PowerShell, you can see the storage pool by running the following cmdlet.

Get-StoragePool S2D*

And you can see the drives in the pool with this simple pipeline.

Get-StoragePool S2D* | Get-PhysicalDisk

Throughout this blog, I showed the output of a script which essentially runs the above, cherry-picks interesting properties, and formats the output all fancy-like. That script is included below, and is also available at http://cosmosdarwin.com/Show-PrettyPool.ps1 to spare you the 200-line copy/paste. There is also a simplified version available here which forgoes my extravagant helper functions to reduce running time by about 20x and lines of code by about 2x. 🙂

Let me know what you think!

# Written by Cosmos Darwin, PM
# Copyright (C) 2016 Microsoft Corporation
# MIT License
# 11/2016

Function ConvertTo-PrettyCapacity {
    <#
    .SYNOPSIS Convert raw bytes into prettier capacity strings.
    .DESCRIPTION Takes an integer of bytes, converts to the largest unit (kilo-, mega-, giga-, tera-) that will result in at least 1.0, rounds to given precision, and appends standard unit symbol.
    .PARAMETER Bytes The capacity in bytes.
    .PARAMETER UseBaseTwo Switch to toggle use of binary units and prefixes (mebi, gibi) rather than standard (mega, giga).
    .PARAMETER RoundTo The number of decimal places for rounding, after conversion.
    #>

    Param (
        [Parameter(
            Mandatory = $True,
            ValueFromPipeline = $True
            )
        ]
    [Int64]$Bytes,
    [Int64]$RoundTo = 0,
    [Switch]$UseBaseTwo # Base-10 by Default
    )

    If ($Bytes -Gt 0) {
        $BaseTenLabels = ("bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
        $BaseTwoLabels = ("bytes", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB", "ZiB", "YiB")
        If ($UseBaseTwo) {
            $Base = 1024
            $Labels = $BaseTwoLabels 
        }
        Else {
            $Base = 1000
            $Labels = $BaseTenLabels
        }
        $Order = [Math]::Floor( [Math]::Log($Bytes, $Base) )
        $Rounded = [Math]::Round($Bytes/( [Math]::Pow($Base, $Order) ), $RoundTo)
        [String]($Rounded) + $Labels[$Order]
    }
    Else {
        0
    }
    Return
}


Function ConvertTo-PrettyPercentage {
    <#
    .SYNOPSIS Convert (numerator, denominator) into prettier percentage strings.
    .DESCRIPTION Takes two integers, divides the former by the latter, multiplies by 100, rounds to given precision, and appends "%".
    .PARAMETER Numerator Really?
    .PARAMETER Denominator C'mon.
    .PARAMETER RoundTo The number of decimal places for rounding.
    #>

    Param (
        [Parameter(Mandatory = $True)]
            [Int64]$Numerator,
        [Parameter(Mandatory = $True)]
            [Int64]$Denominator,
        [Int64]$RoundTo = 1
    )

    If ($Denominator -Ne 0) { # Cannot Divide by Zero
        $Fraction = $Numerator/$Denominator
        $Percentage = $Fraction * 100
        $Rounded = [Math]::Round($Percentage, $RoundTo)
        [String]($Rounded) + "%"
    }
    Else {
        0
    }
    Return
}

Function Find-LongestCommonPrefix {
    <#
    .SYNOPSIS Find the longest prefix common to all strings in an array.
    .DESCRIPTION Given an array of strings (e.g. "Seattle", "Seahawks", and "Season"), returns the longest starting substring ("Sea") which is common to all the strings in the array. Not case sensitive.
    .PARAMETER Array The input array of strings.
    #>

    Param (
        [Parameter(
            Mandatory = $True
            )
        ]
        [Array]$Array
    )

    If ($Array.Length -Gt 0) {

        $Exemplar = $Array[0]

        $PrefixEndsAt = $Exemplar.Length # Initialize
        0..$Exemplar.Length | ForEach {
            $Character = $Exemplar[$_]
            ForEach ($String in $Array) {
                If ($String[$_] -Eq $Character) {
                    # Match
                }
                Else {
                    $PrefixEndsAt = [Math]::Min($_, $PrefixEndsAt)
                }
            }
        }
        # Prefix
        $Exemplar.SubString(0, $PrefixEndsAt)
    }
    Else {
        # None
    }
    Return
}

Function Reverse-String {
    <#
    .SYNOPSIS Takes an input string ("Gates") and returns the character-by-character reversal ("setaG").
    #>

    Param (
        [Parameter(
            Mandatory = $True,
            ValueFromPipeline = $True
            )
        ]
        $String
    )

    $Array = $String.ToCharArray()
    [Array]::Reverse($Array)
    -Join($Array)
    Return
}

Function New-UniqueRootLookup {
    <#
    .SYNOPSIS Creates hash table that maps strings, particularly server names of the form [CommonPrefix][Root][CommonSuffix], to their unique Root.
    .DESCRIPTION For example, given ("Server-A2.Contoso.Local", "Server-B4.Contoso.Local", "Server-C6.Contoso.Local"), returns key-value pairs:
        {
        "Server-A2.Contoso.Local" -> "A2"
        "Server-B4.Contoso.Local" -> "B4"
        "Server-C6.Contoso.Local" -> "C6"
        }
    .PARAMETER Strings The keys of the hash table.
    #>

    Param (
        [Parameter(
            Mandatory = $True
            )
        ]
        [Array]$Strings
    )

    # Find Prefix

    $CommonPrefix = Find-LongestCommonPrefix $Strings

    # Find Suffix

    $ReversedArray = @()
    ForEach($String in $Strings) {
        $ReversedString = $String | Reverse-String
        $ReversedArray += $ReversedString
    }

    $CommonSuffix = $(Find-LongestCommonPrefix $ReversedArray) | Reverse-String

    # String -> Root Lookup

    $Lookup = @{}
    ForEach($String in $Strings) {
        $Lookup[$String] = $String.Substring($CommonPrefix.Length, $String.Length - $CommonPrefix.Length - $CommonSuffix.Length)
    }

    $Lookup
    Return
}

### SCRIPT... ###

$Nodes = Get-StorageSubSystem Cluster* | Get-StorageNode
$Drives = Get-StoragePool S2D* | Get-PhysicalDisk

$Names = @()
ForEach ($Node in $Nodes) {
    $Names += $Node.Name
}

$UniqueRootLookup = New-UniqueRootLookup $Names

$Output = @()

ForEach ($Drive in $Drives) {

    If ($Drive.BusType -Eq "NVMe") {
        $SerialNumber = $Drive.AdapterSerialNumber
        $Type = $Drive.BusType
    }
    Else { # SATA, SAS
        $SerialNumber = $Drive.SerialNumber
        $Type = $Drive.MediaType
    }

    If ($Drive.Usage -Eq "Journal") {
        $Size = $Drive.Size | ConvertTo-PrettyCapacity
        $Used = "-"
        $Percent = "-"
    }
    Else {
        $Size = $Drive.Size | ConvertTo-PrettyCapacity
        $Used = $Drive.VirtualDiskFootprint | ConvertTo-PrettyCapacity
        $Percent = ConvertTo-PrettyPercentage $Drive.VirtualDiskFootprint $Drive.Size
    }

    $Node = $UniqueRootLookup[($Drive | Get-StorageNode -PhysicallyConnected).Name]

    # Pack

    $Output += [PSCustomObject]@{
        "SerialNumber" = $SerialNumber
        "Type" = $Type
        "Node" = $Node
        "Size" = $Size
        "Used" = $Used
        "Percent" = $Percent
    }
}

$Output | Sort Used, Node | FT

Source Control in SQL Server Management Studio (SSMS)

This post was written by Ken Van Hyning, Engineering Manager, SQL Server Client Tools.

In the latest generation of SQL Server Management Studio, we moved to the Visual Studio 2015 Isolated Shell. While this provides SSMS a modern IDE foundation for many functional areas, it also had some consequences. Specifically, the integration with source control systems in SSMS no longer works the way it did in SSMS 2014 and prior. Previously, one could install the Visual Studio MSSCCI provider and then integrate with various source control systems. Visual Studio 2015 does not support MSSCCI, so that is no longer an option in SSMS.

Of course, the good news is that Visual Studio 2015 includes TFS and Git source control integration. With the move to VS 2015 Isolated Shell, SSMS should be able to use these packages as well, right? The answer is…yes…but! The issue for SSMS is that the TFS source control integration package VS provides also includes the entire suite of TFS integration features. If we include this package by default, SSMS will have Team Explorer in its entirety which includes things such as work item tracking, builds, etc. This doesn’t fit in the overall experience SSMS is designed for, so we aren’t going to include this package as part of SSMS. The full TFS integrated experience is included as part of SQL Server Data Tools which is designed for a more developer-centric set of scenarios.

That said, if source code integration is an important aspect of how you use SSMS, you can enable the Visual Studio packages manually.

Enabling source control integration in SSMS

To enable TFS integration in SSMS, follow these steps:

  1. Close SSMS if it is running.
  2. Install Visual Studio 2015 on your SSMS machine. If you don’t already have Visual Studio, Community Edition will work fine. This is a large download but you can save some space by unselecting all languages during the Visual Studio install if your only purpose is to enable Source Control in SSMS.
  3. Edit the ssms.pkgundef file found at C:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\ManagementStudio\ssms.pkgundef.
    • At the top of this file there is a series of packages grouped together that relate to TFS Source Control features. These packages must be removed from the pkgundef file, either by deleting the section or by commenting out each line using ‘//’. Here is an example of what the section should look like if commented out:

      // TFS SCC Configuration entries.  The TFS entries block Team Explorer from loading.
      // Microsoft.VisualStudio.TeamFoundation.VersionControl.HatPackage
      //[$RootKey$\AutoLoadPackages\{4CA58AB2-18FA-4F8D-95D4-32DDF27D184C}]
      // Microsoft.VisualStudio.TeamFoundation.Lab
      //[$RootKey$\Packages\{17c5d08a-602c-4dfb-82b5-8e0f7f50c9d7}]
      // GitHub Package
      //[$RootKey$\Packages\{c3d3dc68-c977-411f-b3e8-03b0dccf7dfc}]
      // Team Foundation Server Provider Package
      //[$RootKey$\Packages\{5BF14E63-E267-4787-B20B-B814FD043B38}]
      // Microsoft.VisualStudio.TeamFoundation.WorkItemTracking.WitPcwPackage
      //[$RootKey$\Packages\{6238f138-0c0c-49ec-b24b-215ee59d84f0}]
      // Microsoft.VisualStudio.TeamFoundation.Build.BuildPackage
      //[$RootKey$\Packages\{739f34b3-9ba6-4356-9178-ac3ea81bdf47}]
      // Microsoft.VisualStudio.TeamFoundation.WorkItemTracking
      //[$RootKey$\Packages\{ca39e596-31ed-4b34-aa36-5f0240457a7e}]
      // Microsoft.VisualStudio.TeamFoundation
      //[$RootKey$\Packages\{b80b010d-188c-4b19-b483-6c20d52071ae}]
      // Microsoft.TeamFoundation.Git.Provider.SccProviderPackage
      //[$RootKey$\Packages\{7fe30a77-37f9-4cf2-83dd-96b207028e1b}]
      // Microsoft.VisualStudio.TeamFoundation.VersionControl.SccPcwPluginPackage
      //[$RootKey$\Packages\{1b4f495a-280a-3ba4-8db0-9c9b735e98ce}]
      // Microsoft.VisualStudio.TeamFoundation.VersionControl.HatPackage
      //[$RootKey$\Packages\{4CA58AB2-18FA-4F8D-95D4-32DDF27D184C}]
      // Visual SourceSafe Provider Package
      //[$RootKey$\Packages\{AA8EB8CD-7A51-11D0-92C3-00A0C9138C45}]
      // Visual SourceSafe Provider Stub Package
      //[$RootKey$\Packages\{53544C4D-B03D-4209-A7D0-D9DD13A4019B}]
      // Microsoft.VisualStudio.TeamFoundation.Initialization.InitializationPackage
      //[$RootKey$\Packages\{75DF55D4-EC28-47FC-88AC-BE56203C9012}]
      // Team Foundation Server Provider Stub Package
      //[$RootKey$\Packages\{D79B7E0A-F994-4D4D-8FAE-CAE147279E21}]
      // Microsoft.VisualStudio.Services.SccDisplayInformationPackage
      //[$RootKey$\Packages\{D7BB9305-5804-4F92-9CFE-119F4CB0563B}]
      // Microsoft.VisualStudio.TeamFoundation.Lab.LabPcwPluginPackage
      //[$RootKey$\Packages\{e0910062-da1f-411c-b152-a3fc6392ee1f}]
      //[$RootKey$\ToolsOptionsPages\Source Control]
      //[$RootKey$\AutoLoadPackages\{11b8e6d7-c08b-4385-b321-321078cdd1f8}]
      // TFS SCC Configuration entries.
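Before making these edits, it may be worth backing up the original file; a minimal sketch (the path assumes a default SQL Server 2016 installation, so adjust as needed):

```powershell
# Run from an elevated PowerShell prompt; path assumes a default SQL Server 2016 install
$pkgundef = "C:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\ManagementStudio\ssms.pkgundef"
Copy-Item -Path $pkgundef -Destination "$pkgundef.bak"
```

If something goes wrong, copying the .bak file back restores the original SSMS behavior.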

Once completed, start SSMS and the “Team” menu should be visible in the SSMS menu bar. This menu and related features are the standard Visual Studio functionality, and they enable connections to TFS servers or Git servers. Please refer to the Visual Studio documentation for more information.

Automatic dependency discovery and mapping with Service Map – Public Preview


Summary: Use Service Map to discover TCP-connected processes and build topology views.

Hi everyone, Nick Burling here, and today I’m excited to announce that Service Map (formerly known as Application Dependency Monitor) is available in Public Preview. In this post, I’ll take you through some of the cool capabilities that Service Map provides like automatic discovery and mapping for server and process dependencies in real-time, without any predefinition.

Automatic dependency mapping

Service Map discovers every TCP-connected process in your Windows and Linux virtual machines running in Azure, in third-party clouds, or on-premises in your datacenter. It sees every connection that those processes make down to the IP and port, and it then builds intuitive, live, and historical topology views as shown in the following screenshot.

Topology views that Service Map builds

There is no predefinition required, and Service Map will discover dependencies for literally any TCP-connected process. Using Service Map, you will not only see connections to other systems in your environment, but also those going out to third-party services, giving you full visibility into what your systems are talking to.

Accelerated troubleshooting and root-cause analysis

In addition to providing automatic dependency discovery and mapping, Service Map accelerates troubleshooting and root-cause analysis when used in conjunction with other Operations Management Suite capabilities like Log Analytics alerts, Change Tracking, Security, and Update Management. In the following screenshot, you can see how Service Map integrates with Log Analytics alerts and highlights when there might be issues with any of the monitored virtual machines in a Service Map topology view.

Service Map integrates with Log Analytics

In addition to alert integration, Service Map can help you understand instantly if any important changes occurred across your environment whether in software, registry, or files. Service Map integrates with Change Tracking data to surface change events in the context of your system dependencies and troubleshooting workflow as shown in the following screenshot.

Service Map integrated with Change Tracking data

Service Map can also provide visibility into critical security issues across your virtual machines and, through integration with the Update Management solution, into missing critical patches across your Windows and Linux systems.

Service Map integration with the Update Management solution

Accelerate migration projects

In addition to enhancing your troubleshooting and root-cause analysis, Service Map helps to expedite your app and workload migrations, accelerating your transition to the cloud. Service Map helps you eliminate the guesswork of problem isolation, identify surprise connections and broken links in your environment, and perform Azure migrations knowing that critical systems and endpoints won’t be left behind. The following screenshot is an example where we have built a Power BI migration report that takes advantage of Service Map’s APIs and can be shared across the organization as part of a migration project.

A Power BI migration report that takes advantage of Service Map’s APIs

Get started

You can find detailed instructions about how to get started with Service Map, and the following is a quick overview:

  1. Make sure your OMS workspace is enrolled in the OMS Pricing Plan and that you have added either the free or paid tier for the Insight and Analytics offer. Note: Service Map is currently available in the East US region only.
  2. Go to the OMS Gallery in the OMS Portal or the Azure Marketplace, and add the Service Map solution to your workspace. You can get started for free with up to five nodes connected and sending data. If you are already using the paid tier of Insight and Analytics, you can use Service Map with any of your licensed nodes.
  3. After you have added Service Map to your workspace, click the solution tile in the OMS Portal Overview page and download the Windows and or Linux Dependency Agent:

Options to download the Windows or Linux agents

  4. In the Azure Portal, select your Log Analytics Workspace, and then click Service Map to access the Dependency Agent download links.

Select your Log Analytics Workspace, and then click Service Map to access the Dependency Agent download links

  5. Install the agent on your Windows and/or Linux virtual machines, and you’re all set to begin using Service Map.

Please send your feedback

As we go forward with the preview, we’re excited to get feedback from you. You can submit your comments in our User Voice forum for Service Map.

In addition, we’re always interested in having new customers join our cohorts to get early access to new features and help us improve Service Map going forward. If you are interested in joining our cohorts, simply fill out this quick survey.

Nick Burling
System Center & Services

Free Online Workshop on Cortana Intelligence Suite: Register Now!


Get Live, Step-by-Step Guidance from Microsoft Experts

This post is authored by Matthew Calder, Senior Content Developer at Microsoft.

Join us on Microsoft Virtual Academy December 6, 2016, from 9AM – 4PM Pacific, for an exciting look at the Cortana Intelligence Suite (CIS), and end the day with a working web app!

The “Cortana Intelligence Suite End to End” session will be taught by Microsoft Architects Todd Kitta and Jin Cho, and promises a hands-on day with the platform. During the workshop, you’ll learn how to architect solutions in the Suite and weave intelligence into your applications. We’ll look at the “What” and “Why” of CIS in an overview keynote and then explore key platform components such as Azure ML, Azure Data Factory, HDInsight Spark and Power BI, as we build the app. The instructors will talk about the Open Source capabilities of the Suite, walk you through an ML model, show you how to set up an Azure Data Factory pipeline, and more. 

What’s more, Jin and Todd will pace their instruction, giving you plenty of time to work through the project and ask questions. To best follow along with them, be sure to set up a free trial subscription to Microsoft Azure before the event.


Course Outline 

  • Analytics State of the Union + Cortana Intelligence Suite overview keynote.
  • Building a Machine Learning Model and Operationalizing.
  • Setting up Azure Data Factory.
  • Developing a Data Factory Pipeline for Data Movement.
  • Operationalizing Machine Learning Scoring with Azure Machine Learning and Data Factory.
  • Summarizing Data Using HDInsight Spark.
  • Visualizing Spark Data in Power BI.
  • Deploying an Intelligent Web App.
  • Wrap-up and Cleanup of Azure Resources.

Register Now!

Cortana Intelligence Suite End to End

Date December 6, 2016
Time 9AM – 4PM Pacific
Where Live, online virtual classroom
Cost Free!


Whether you’re a Data Professional, Data Analyst, Data Scientist, or Developer looking to incorporate intelligence into your applications, use this opportunity to enhance your industry know-how, learn what’s possible with the Cortana Intelligence Suite and get all your questions answered.

Enjoy your Thanksgiving break, and we’ll see you in our virtual classroom in early December!

Matt
@MatthewCalder1

Announcing Extended Support for WSUS 3.0 SP2


Hi everyone! Brandon Wilson here just passing along a friendly note that consumers of WSUS will probably be jumping for joy to hear. Nathan Mercer and Michael Niehaus have published some useful news in this blog post (contents can also be read below).

So, without further delay….here it is in Nathan and Michael’s words:

—–

Windows Server Update Services (WSUS) is key to the Windows servicing process for many organizations. Whether being used standalone or as a component of other products like System Center Configuration Manager or Windows Small Business Server, it provides a variety of useful features, including automating the download and installation of Windows updates.

While WSUS has been built into Windows Server 2012 and later operating systems, most people didn’t realize that it was a separate product for earlier operating systems like Windows Server 2008 R2. Because the version that complemented Windows Server 2008 R2, WSUS 3.0, was considered a separate product, it had a separate support lifecycle, and that lifecycle was due to end in July of 2017, even though extended support for Windows Server 2008 R2 continues until January of 2020.

To remedy this situation, we have extended WSUS 3.0 support to coincide with the Windows Server 2008 R2 end of support date. Now, both will be supported through January 14, 2020. While this reduces the sense of urgency, we still would like to encourage organizations to move all remaining WSUS 3.0 servers to a later version, a process that involves migrating to a new version of Windows Server where WSUS 4.0 (or later, in the case of the upcoming Windows Server 2016 release) can be used.

For those using Windows 10, it is particularly important to look at moving to WSUS 4.0 (or later) to support the deployment of Windows 10 feature updates, which add new features to Windows 10. Support for this new type of update has been added to WSUS 4.0 via an update to WSUS itself. This functionality isn’t available in WSUS 3.0 because mainstream support for that version has already ended (extended support does not add new capabilities; it only fixes security issues).

To help you with the migration process to WSUS 4.0 (or later), we have provided some additional documentation to help guide you through the process. For more information on WSUS, be sure to check out the WSUS team blog.

—–

This change is great news; however, don’t let it deter you from embracing WSUS 4.0! Hopefully this post will be of some help, and if you have questions or comments, please feel free to leave them here or on the above linked post.

Until later….Brandon Wilson

Two Ways To Accept Pipeline Input In PowerShell


Hello everyone, my name is Preston K. Parsard, and I’m a Premier Field Engineer. I’ve been focusing on Azure and PowerShell engagements recently and would like to present an illustrated article about how Windows PowerShell parameter binding works.

REQUIREMENTS

Windows PowerShell version 4.0 is used in the screenshots for this article; however, Get-Service, Stop-Service, and Set-Service have been available since PowerShell version 2.0.

PURPOSE

One of the more abstract concepts, both to learn as a new user and to explain as an instructor, is just exactly what goes on when you use the pipeline operator to combine expressions and pass values from one cmdlet to another in sequence. The good news is that once this process is understood, we can start to develop our own advanced functions while improving our scripting knowledge and capabilities. These are functions that can integrate parameters which accept pipeline input, and in fact this is how many of the native PowerShell cmdlets already work. A further discussion of building those types of functions is beyond the scope of this post, but can be found here. For now, we’ll review the process of pipeline inputs for existing native PowerShell cmdlets. First though, I’ll offer a quick description of the pipeline and the pipeline operator, for those of us who are not already familiar.

A pipeline in PowerShell is a series of values, expressions, commands or cmdlets that are combined with the pipe operator (|) to send the results of one command or expression to the next. These results are sent through the pipeline as objects or object properties, not just text as from the Windows command console (cmd.exe) or certain other non-PowerShell methods. If the results consist of an array of objects, these objects are sent one-by-one through the pipeline.


Figure 1: Using the findstr.exe utility to search for “packets”

Here I am looking for the case insensitive string of “packets” in the result.

Now if I would like to only retrieve the number of received packets using the command console, I could try to do so by modifying my original expression to the following:


Figure 2: Using the findstr.exe utility to search for “received”

Look familiar? The command returns the entire line again, so the result is identical to the previous one shown in Figure 1.

Now what if the command ping localhost were an object, and we could just specify exactly which property of that resulting object we are interested in, such as the Received property to get its value of 4? This would provide more flexibility, but unfortunately we can’t do this directly in the command shell.

In PowerShell, objects can be filtered and manipulated at the other end of the pipeline with the receiving cmdlets, so you gain more control over utilizing all the available properties of those objects than would be possible with just text results.
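To make the contrast concrete, here is a hedged sketch using Test-Connection, the object-based counterpart to ping (the reply object’s properties vary slightly by PowerShell version):

```powershell
# Each reply is an object whose properties we can query directly,
# instead of text we have to parse with findstr
$replies = Test-Connection -ComputerName localhost -Count 4

# The object equivalent of reading "Received = 4" from ping's text output
($replies | Measure-Object).Count
```

Because the replies are objects, we can keep piping them to other cmdlets rather than re-parsing text at every step.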

USE CASES


Figure 3: Stop and Set-Service

Ok, so now that we’ve quickly reviewed the pipeline, let’s turn to our scenarios and some specific examples to examine these concepts in more detail.

STOP-SERVICE

Imagine that you want to identify a service and stop it. How would you do that? What about if it’s on a remote server on your network?

It will look something like this:

Get-Service -Name BITS -ComputerName 2012r2-ms | Stop-Service -PassThru -Verbose

If we look at the InputObject parameter attributes for the Stop-Service cmdlet, which is on the receiving side of the pipeline, we’ll see below (1) that this parameter has a type of ServiceController[], which indicates it can accept multiple values: an array of services with the ServiceController .NET class type name. It also shows that (2) this parameter accepts pipeline input, by value.


Figure 4: Stop-Service InputObject parameter attribute table

If we also look at the expressions with their available properties and parameters in terms of class diagrams, it would look something like this:


Figure 5: Get-Service and Stop-Service viewed as partial class diagrams

When Get-Service -Name BITS -ComputerName 2012r2-ms is evaluated on the left side of the pipeline, it produces a result which is an object of type ServiceController and has a value of bits hosted on the computer 2012r2-ms.

We’ve also just seen from Figure 4 that the Stop-Service cmdlet accepts pipeline input, and that it accepts values coming in to the InputObject parameter specifically, so it accepts these parameters by the object value or values, depending on whether a single object or multiple objects are received.

By the way, it will be important to keep this in mind for other parameters which may accept pipeline input both by value and parameter name. The rules and processing order of how these parameters are evaluated follows:

  1. ByValue without coercion: Match incoming object type to receiving parameter type without having to try and convert the object type to the parameter type.
  2. ByPropertyName without coercion: If the incoming property name (not object value) matches the receiving parameter name, then attempt to match the incoming object property type to the receiving parameter type, again without type conversion.

    This option requires that the receiving cmdlet’s parameter name exactly match the property name of the incoming object whose property value is being sent. More about this later.

  3. ByValue with coercion: If the incoming object value (not object property) type is convertible (for example, an integer could be coerced into a string value if that’s what the receiving parameter expects based on its parameter type definition), then binding will still work.
  4. ByPropertyName with coercion: If the first match was by property name, then check to see whether the incoming object property type is convertible to what the receiving parameter has defined as its type.
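As a quick, hypothetical sketch of rule 2 in action: Get-Service accepts its -Name parameter ByPropertyName, so any incoming object with a matching Name property will bind, even though the object itself is not a string:

```powershell
# A custom object with a Name property; the property name matches the
# -Name parameter of Get-Service, so binding occurs ByPropertyName
[pscustomobject]@{ Name = 'bits' } | Get-Service
```

The custom object here stands in for anything that carries a Name property, such as imported CSV rows.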

Now that we have the breakdown of the processing order, we see that in our Stop-Service example, the match and subsequent binding will occur, based on the pipeline’s object value. This is because there is an instance of the ServiceController type, which is an object and not an object property coming through the pipeline.

More specifically, the resulting object received from the pipeline is a service object instance for the BITS service (Get-Service -Name BITS) that has a type of ServiceController. It is a single value, but it can be converted to an array consisting of a single element. The receiving parameter type now matches the sent object type, and so it binds, or is associated with, the object value, which is also referred to as the argument in the Trace-Command output. This option was taken first because ByValue without coercion is evaluated highest in the processing order; after parameter binding succeeds, the receiving expression Stop-Service -PassThru -Verbose is then evaluated and executed. Of course, we just threw in the -PassThru and -Verbose parameters here to see what’s going on and get a bit more detail in the output.

This is a good start, but how can we trace the detailed activity for what transpires with parameter binding operations? Well, valued readers, I’m glad you asked.

Like this…

Trace-Command -Name ParameterBinding -Expression { Get-Service -Name bits -ComputerName 2012r2-ms | Stop-Service -PassThru } -PSHost -FilePath c:\pshell\labs_9\Stop-Service-PBv1.0.log

Here, the -PSHost switch parameter tells us to display the results on the screen and -FilePath is used to specify a file to log the same details.


Figure 7: The parameter binding process for Stop-Service

Inside the red outline shown in Figure 7, we first see the bind operation beginning for the pipeline object for the receiving Stop-Service cmdlet.

BIND PIPELINE object to parameters: [Stop-Service]

Next, the pipeline object type is evaluated as [System.ServiceProcess.ServiceController], but we can just exclude the class namespace, and call it ServiceController from here onwards.

PIPELINE object Type = [System.ServiceProcess.ServiceController]

The pipeline parameter value is then restored, and because it matches the receiving parameter object type (ServiceController), the ByValueFromPipeline without coercion rule is applied first.

RESTORING pipeline parameter’s original values

Parameter [InputObject] PIPELINE INPUT ValueFromPipeline NO COERCION

Now we can bind the specific bits service object value instance to the Stop-Service’s InputObject parameter, but because it’s only a single service value, and an array of services is expected, the binding operation creates a new array and adds this single value of the bits service to it.

Bind arg [bits] to parameter [InputObject]

Binding collection parameter InputObject: argument type [ServiceController], parameter type [System.ServiceProcess.ServiceController[]], collection type Array, element type [System.ServiceProcess.ServiceController], no coerceElementType

Creating array with element type [System.ServiceProcess.ServiceController] and 1 elements

Argument type ServiceController is not IList, treating this as scalar

Adding scalar element of type ServiceController to array position 0

Ok, we’re almost finished with binding, but first we’ll validate that the parameter is not null or empty. This is necessary because it has already been defined as an attribute for the Parameter in PowerShell as part of the Stop-Service cmdlet source code.

Executing VALIDATION metadata: [System.Management.Automation.ValidateNotNullOrEmptyAttribute]

Are we there yet? …Yes, so finally, we can successfully bind the argument bits, now re-formatted as a ServiceController array type to the Stop-Service parameter -InputObject.

BIND arg [System.ServiceProcess.ServiceController[]] to param [InputObject] SUCCESSFUL

SET-SERVICE

Alright, now that we’ve looked at a ByValue example, let’s shift gears this time to the other pipeline input option – ByPropertyName. What’s interesting in this next example is that this parameter accepts pipeline input both ByValue AND ByPropertyName, so we’ll even get to do a bit of comparison as an added bonus.


Figure 8: Parameter attribute table for the Set-Service -Name parameter

And what would a discussion about binding be without a quick glance at another partial class diagram?


Figure 9: Class diagrams for Get-Service and Set-Service

First, we’ll execute the expression;

Figure 10: Trace-Command for Set-Service -Name ByParameterName

This command will select the bits service by its property name to set its StartupType from Auto to Manual. Notice the red outline area in Figure 10. After the first pipe operator, we single out ONLY the name property of the bits ServiceController object, not the entire object as in the first example with the Stop-Service cmdlet.
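The command itself appears only in the Figure 10 screenshot; a hedged reconstruction of what it would look like:

```powershell
# Select-Object sends only the Name property (not the whole ServiceController
# object) down the pipeline, forcing binding ByPropertyName
Trace-Command -Name ParameterBinding -Expression {
    Get-Service -Name bits | Select-Object -Property Name | Set-Service -StartupType Manual
} -PSHost
```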

Here is the relevant part of the result of the Trace-Command; the rest is omitted for brevity.


Figure 11: Partial Trace-Command output for Set-Service -Name ByParameterName

In Figure 11, if we look above the red outline, we’ll see that an attempt was previously made using the PIPELINE INPUT ValueFromPipeline NO COERCION rule to bind the incoming name property to the receiving name parameter, but it was skipped.

Parameter [Name] PIPELINE INPUT ValueFromPipeline NO COERCION

BIND arg [@{Name=bits}] to parameter [Name]

BIND arg [@{Name=bits}] to param [Name] SKIPPED

The rule is used first in the order of evaluation; however, the pipeline value in this case is not an object but a property called name having the value bits. The verdict, therefore, is that a property can’t be bound to a parameter by value, only objects can, so this rule doesn’t apply and must be skipped to process the next rule in the binding sequence.

As a convenience, here’s quick reminder list of the binding order again:

  1. PIPELINE INPUT ValueFromPipeline NO COERCION
  2. PIPELINE INPUT ValueFromPipelineByPropertyName NO COERCION
  3. PIPELINE INPUT ValueFromPipeline WITH COERCION
  4. PIPELINE INPUT ValueFromPipelineByPropertyName WITH COERCION

The red outlined area in Figure 11 shows that the name parameter of the Set-Service cmdlet will now attempt to accept pipeline input ByPropertyName without coercion, which is accepted and binding is successful. This is because the pipeline item is a property, and only a property can be bound to a parameter with rule 2 – PIPELINE INPUT ValueFromPipelineByPropertyName NO COERCION, objects can’t bind using this rule, only rules 1 and 3 apply to objects. Rules 2 and 4 apply to properties. The other reason it binds is because the property name is “name”, and the parameter name is also “name”. ByPropertyName means that these two have to match exactly in order to bind.

So what if, for the name parameter, we send an object through the pipeline instead of a property this time, since the Set-Service -Name parameter accepts input both ByValue and ByPropertyName? Again, we’re still setting the StartupType of the bits service from Auto to Manual, but now we will select the service object itself, not the name property of the bits service object as we did in the previous section. Here is the command and corresponding output below.

Figure 12: Trace-Command for Set-Service -Name ByValue

We can observe that the pipeline object type is string, which also matches the parameter type (see Figure 8), and we will also notice that rule 1 is evaluated:

Parameter [Name] PIPELINE INPUT ValueFromPipeline NO COERCION

However, because we have sent an entire object with the value “bits”, not a property of an object, through the pipeline (figure 13), and because rule 1 (ByValueFromPipeline without coercion) accepts objects whose type does not have to be converted, it meets all the criteria and binds successfully.
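Since that command is also shown only as a screenshot, here is a hedged reconstruction consistent with the trace output described above:

```powershell
# (Get-Service -Name bits).Name yields the string 'bits'; the string object
# matches the -Name parameter's type, so it binds ByValue without coercion
Trace-Command -Name ParameterBinding -Expression {
    (Get-Service -Name bits).Name | Set-Service -StartupType Manual
} -PSHost
```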

If we were to summarize these concepts visually in terms of a process diagram, it might resemble the following:


Figure 6: Parameter binding process

See, I told you this would be illustrated, didn’t I? Notice that indexed items 04.02.01, 04.02.02, 08.00 and 04.02.02.02 all have [A], [B], [C], and [D] prefix designations respectively. This is just a simple hint of the binding order we discussed previously.

SUMMARY

The key points I’d like to reinforce are: First, know what you are sending through the pipeline. Is it an object or a property of an object? If it’s a valid object, it will bind using rule 1 or 3 ([A] or [C]), depending on whether its object type requires conversion. The sending object type must also match the receiving parameter type.

If the pipeline item is a property of an object and is a valid property, it will use either rules 2 or 4 ([B] or [D]). Both the property/parameter type and the property/parameter names on both sides of the pipeline must eventually match to satisfy binding.

Finally, it’s important to point out that when we refer to the sending expression on the left side of the pipeline, we call the pipeline items either objects or properties, but on the receiving side, these are bound to their corresponding parameters. Parameters are always on the right, while properties and objects traverse the pipeline from the left.

We hope this article has been helpful and would love to get your feedback, especially any stories of how you have been best able use this information. Stay tuned for more topics and happy scripting!

REFERENCES

  1. about_Parameters (Get-Help about_Parameters -ShowWindow)

    https://technet.microsoft.com/en-us/library/hh847824.aspx

  2. about_Pipelines (Get-Help about_Pipelines -ShowWindow)

    https://technet.microsoft.com/en-us/library/hh847902.aspx



Windows Ink 1: Introduction to Ink and Pen


Using a pen and computer has an interesting history that goes farther back than you’d think. In 1888, the first patent for an “electric stylus device for capturing handwriting” was issued to Elisha Gray for the Telautograph. In fact, pen input was being used 20 years before mouse and GUI input, with systems like the Styalator tablet demonstrated by Tom Dimond in the 1950s and the RAND tablet in the 1960s; both could recognize freehand writing and turn it into computer-recognizable characters and words.

In 1992, Microsoft made its first major entrance into the pen input space with Windows for Pen Computing and also had the NCR tablet that ran Windows 3.1 with pen input as an option to interact with applications.

New ways to use Windows Ink

In the Windows 10 Anniversary Update, Inking (pen input) has taken front stage. Microsoft recently announced the Surface Studio, an all-in-one machine designed to empower the creative process with a 28-inch, pen-enabled PixelSense screen. With such a large working area for the Pen and the thin profile of the PC, the user can focus on what matters, the art.

In addition to having the work front and center, the user can now use new methods of input, such as the Surface Dial, to leverage your application’s inking features. As a developer, you can leverage the Radial Controller APIs to make accessing those inking features a natural and smooth experience for the user.

Let’s start exploring Windows Ink from two perspectives, the consumer and the developer.

User’s Perspective

On PCs with stylus support, the Windows Ink Workspace is front and center in the system tray. For the consumer, this is a highly convenient option to quickly access the applications in the Workspace: Sticky Notes, Sketchpad, and Screen Sketch, as you see here:

picture1

Depending on the PC’s pen you’re using, the pen can provide some natural interactions even before you start writing on the screen. Using a Surface Book as an example, the Surface Pen lets you quickly launch an application by clicking the pen’s eraser. One click, a double click, or a click and hold can each perform a different action. Which action is taken depends on what is set by the user; this option is highly configurable from the PC’s Pen Settings page, as seen here:

picture2

There are other settings you can configure to further customize your experience. Windows 10 already ignores your palm touching the screen while you’re writing, but you may want to completely ignore touch altogether. These options can be set on the same settings pane:

picture3

Ignoring touch input while using the pen is disabled by default because there are great simultaneous pen and touch scenarios. A good example of this would be the Windows Ink ruler! You can use one hand for the pen and the other hand to move the ruler on the screen.

Now that we’ve taken a high-level look at the Windows 10 Anniversary Update’s inking features, let’s switch gears and take a look at it from a developer’s perspective.

Developer’s Perspective

Pen input and handwriting recognition have traditionally needed a specialized developer skillset. You would have to detect the strokes made on a canvas and, using complex algorithms, determine which character was written. With the Windows 10 Anniversary Update SDK, this is no longer the case. You can add inking support to your application with just a couple lines of code.

Let’s make a small example that lets the user draw in an area of your UWP (Universal Windows Platform) app. This example can be added to any UWP app that targets the 14393 Anniversary Update SDK.

To enable inking, you only need to add the following to your XAML.
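The XAML snippet itself appears to have been stripped from this post. A minimal sketch of what it would look like (the `inkCanvas` name is the one the code-behind samples below already use):

```xml
<!-- A basic ink surface; pen strokes are collected with the default ink settings -->
<InkCanvas x:Name="inkCanvas" />
```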

That’s it! Wherever you place the InkCanvas UIElement, the user can draw on it with a pen using the default ink settings. Here’s what it looks like at runtime after I’ve written a special message:

picture4
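One detail worth noting: by default, the InkCanvas collects pen input only. If you also want mouse or touch strokes to draw (handy when testing on hardware without a pen), a single line on the InkPresenter changes that. A sketch, assuming an InkCanvas named `inkCanvas`:

```csharp
// Allow pen, mouse, and touch to draw on the canvas (the default is pen only)
inkCanvas.InkPresenter.InputDeviceTypes =
    Windows.UI.Core.CoreInputDeviceTypes.Pen |
    Windows.UI.Core.CoreInputDeviceTypes.Mouse |
    Windows.UI.Core.CoreInputDeviceTypes.Touch;
```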

The InkCanvas built-in defaults make it very easy to get started. However, what if you want to let the user change the color of the ink, or the thickness of the stroke? You can add this functionality quickly with an InkToolbar UIElement in your XAML. The only thing you need to do to wire it up is tell it which InkCanvas it controls:
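The markup was stripped here as well; a hedged sketch, binding the toolbar to the canvas through its TargetInkCanvas property (again assuming an InkCanvas named `inkCanvas`):

```xml
<!-- Points the toolbar at the canvas whose ink attributes it controls -->
<InkToolbar TargetInkCanvas="{x:Bind inkCanvas}" />
```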

Note: If you see a XAML designer error when you add the InkToolbar, you can safely ignore this as it is a known issue that is being worked on. Your code will run fine.

Let’s rerun our test app and see what this looks like after using a couple of the InkToolbar’s default tools, the ruler and a different ink color:

picture5

This is all you need to have inking enabled in the app. However, you might also want to persist the user’s strokes so they can be saved and reloaded at another time.

Saving and Loading Ink

You can embed the ink data within a GIF file so that you can save and load the user’s work. This is easily done using the InkPresenter, which is available as a read-only property of the InkCanvas.

Here’s an example of getting all the ink that’s on the canvas and saving it to a file:


        private async Task SaveInkAsync()
        {
            if (inkCanvas.InkPresenter.StrokeContainer.GetStrokes().Count > 0)
            {
                // Select a StorageFile location and set some file attributes
                var savePicker = new Windows.Storage.Pickers.FileSavePicker();
                savePicker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.PicturesLibrary;
                savePicker.FileTypeChoices.Add("Gif with embedded ISF", new List<string> { ".gif" });

                var file = await savePicker.PickSaveFileAsync();

                if (null != file)
                {
                    using (IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.ReadWrite))
                    {
                        // This single method will get all the strokes and save them to the file
                        await inkCanvas.InkPresenter.StrokeContainer.SaveAsync(stream);
                    }
                }
            }
        }

Then, the next time the user wants to load an old drawing, or you want to properly resume an application that was terminated, you only need to load that file back into the canvas. Doing so is just as easy as saving:


        private async Task LoadInkAsync()
        {
            // Open a file picker
            var openPicker = new Windows.Storage.Pickers.FileOpenPicker();
            openPicker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.PicturesLibrary;

            // filter files to show both gifs (with embedded isf) and isf (ink) files
            openPicker.FileTypeFilter.Add(".gif");
            openPicker.FileTypeFilter.Add(".isf");

            var file = await openPicker.PickSingleFileAsync();

            if (null != file)
            {
                using (var stream = await file.OpenSequentialReadAsync())
                {
                    // Just like saving, it's only one method to load the ink into the canvas
                    await inkCanvas.InkPresenter.StrokeContainer.LoadAsync(stream);
                }
            }
        }

To see this code, and many other demos, take a look at the SimpleInk demo on the official Universal Windows Platform examples GitHub page.

What’s next?

Getting started with Windows Ink is quick and easy. However, you can also create highly customized inking applications. In the next Windows Ink series post, we’ll dig deeper into the InkPresenter, pen attributes, custom pens, the custom InkToolbar, and a more complex ink data scenario that enables sharing and printing!

Resources

With move to Office 365, NGA Human Resources builds a more engaging employee experience


Working better for businesses

Trust and productivity

Today’s Microsoft Office 365 post was written by Russell Sheldon, chief information officer, senior vice president for HR consulting, application services and global technology.

At NGA Human Resources (NGA HR), the way we engage with our employees—and what we believe it takes to be a great employer—centers around building a positive employee experience. This is what we do on a daily basis for our customers, and it’s equally true for our internal operations.

When new employees join the business, regardless of location or job function, it is vital that they feel part of our global organization. (We operate in more than 35 countries, serving customers in more than 145 countries in 25 languages.) All employees need to be connected, engaging in the company culture that drives our success as a business.

In the digital economy, technology, location and time zones should not be a barrier to productivity. Given our global presence, using technology that promotes worldwide collaboration is critical. In turn, collaboration and the sharing of ideas are paramount to fostering talent. We enact our belief that employees everywhere should feel connected to their organization and that they should be able to work as easily together as they do individually.

Our corporate objective is to make HR work better for businesses. To do this, we have to make the workplace a great place for people to work. For example, we rely on the same HR and payroll platform internally that we use to empower millions of our customers’ employees around the world.

As a business and a services provider, NGA HR has a policy of investing in innovative technologies that drive business efficiencies and improve the employee experience, while continuing to adhere to the strictest compliance requirements.

That is why, when our G Suite (formerly Google Apps for Work) contract came up for renewal, we took the time to evaluate what we require as a global organization. We reviewed the market for cloud-based business tools that would help us achieve the scope of global collaboration and individual productivity that we want for our employees, yet still maintain the highest level of data security.

We selected Microsoft Office 365 and migrated our back-office applications and internal collaboration platform from Google to Office 365. We believe that Office 365 presents more aligned business services that will make it easier for us to grow, develop, and most importantly, retain our talent. Employees want to work for an organization that uses technology to improve their work experience so they can collaborate and innovate more effectively to contribute to its growth. This is the inherent value of effective business productivity tools.

A perfect example comes from our chief executive officer, Adel Al-Saleh. Today, he uses Skype for Business Online to host video calls with our 300 global leaders, something that was not possible before. Now the leadership team meets more frequently, using interactive virtual discussions to speed decision making on a global scale. I run a team of approximately 2,000 people around the world. I use Skype for Business Online to connect in real time with 30 of my senior managers, dramatically reducing the time and cost of business travel and freeing up my time and budget for allocation to more strategic requirements. Also, now that we can rely on the de facto industry standard for office collaboration, our commercial teams are responding to RFPs and collaborating on documents more efficiently than ever.

Because Microsoft includes intuitive collaborative capabilities throughout Office 365, it’s easy to be productive. You can kick off a Skype for Business Online call from your inbox and access all Office documents from any device. Now mobile employees stay in touch with work using minimal effort.

The fact that we had more than 8,000 employees regularly active on our Yammer enterprise social network just four weeks after we went live demonstrates that Google was not addressing the need we had for companywide collaboration. Today, we have listened to our employees, and we are providing them with the same ease of communication and access to data that they are used to at home.

Also, with Office 365, we can maintain a hybrid environment. This is hugely advantageous to us when working with customers whose data cannot leave their geographic borders. NGA HR manages the payroll data of millions of employees around the world every year, so we take data security very seriously. We can assure all customers that Office 365 meets our internal compliance mandate and European data privacy standards. It adheres to the Article 29 Data Protection Working Party (A29WP) opinion on cloud computing around basic principles of transparency, purpose limitation, data retention, access and disclosure restriction. We also took into consideration the positive opinion of A29WP on the Microsoft Cloud business solution, in line with European data transfer and protection clauses.

Our relationship with Microsoft got off to an incredible start with the highly successful implementation of Office 365. Thanks to the close collaboration among NGA HR, the Microsoft FastTrack team and Microsoft partner Content and Code, we migrated 8,000 employees across the globe, with all their data, in just 12 weeks.

The deployment and change management expertise of the FastTrack team helped us meet our strict deadline, imposed by the expiration of the Google contract, with comfortable breathing space. With a minimal learning curve, everyone in the organization is more mobile, connected and agile. The feedback from employees is positive, and we are already seeing great results. Today, NGA HR is looking forward to even greater collaboration and localization of our global business.

—Russell Sheldon

Read the case study for more on why NGA Human Resources moved from Google to Office 365.

The post With move to Office 365, NGA Human Resources builds a more engaging employee experience appeared first on Office Blogs.

New metric measurement alert rule type in Public Preview!


Hi folks, Anurag here. Today I want to talk about a new powerful alerting capability for Operations Management Suite (OMS): metric measurement alerts.

Traditionally, OMS alerts have only used the number of results returned from Log Search to provide alerting capabilities. With metric measurement alerts, we now allow broad alert rule definitions across a group of objects with the ability to evaluate a threshold and raise alerts on single objects. This new capability also comes with more granular trigger conditions such as single or consecutive breaches.

To provide more context for this new feature, the following table shows which alert type applies to a range of scenarios.

Examples of alerts

 

Type of alert | Scenario
Number of results | Send Alert if Computer A’s CPU is ever above 90%
Number of results | Send Alert if Computer A’s average CPU is ever above 90%
Metric measurement | Send Alert if Computer A’s average CPU goes above 90% twice over 2 hours
Metric measurement | Send Alert if Computer A’s average CPU goes above 90% twice in a row over 2 hours
Number of results | Send Alert if Computer Group A’s average CPU is ever collectively above 90%
Metric measurement | Send Alert PER Computer if any Computer in Computer Group A’s average CPU is above 90%
Metric measurement | Send Alert PER Computer if any Computer in Computer Group A’s average CPU is above 90% 3 times over 2 hours
Metric measurement | Send Alert PER Computer if any Computer in Computer Group A’s average CPU is above 90% 2 times in a row over 2 hours

Create a metric measurement alert

You create a metric measurement alert through the same workflow as traditional number-of-results alerts, and any of the on-demand aggregation queries used for performance metrics work. See the On-demand metric aggregation and visualization in OMS blog post for information about how to craft these on-demand aggregation queries.

Requirements

The main difference when you define metric measurement alerts is the following two requirements:

  • “measure” statement – Metric measurement alerts require grouping on a field to indicate which object to alert on.

Ex: Type=Event | measure count() by Computer interval 5minute

Ex: Type=Perf ObjectName=Processor CounterName= | measure avg(CounterValue) by Computer interval 2minute

  • “interval” statement – This specifies the sampling interval over which your metric data is aggregated.

Ex: Type=Perf ObjectName=Process CounterName=”% Processor Time”| measure avg(CounterValue) by InstanceName interval 3minute

Ex: Type=W3CIISLog | measure avg(TimeTaken) by Computer interval 30minute

Additional steps

  1. Switch the Alert Type toggle from Number of Results to Metric Measurement.

Select the Metric Measurement alert type

  2. Define the alert threshold.

The threshold is based on the metric aggregation from the query. For example, if you are using memory as the metric and want to alert when memory is less than 1 GB, set the threshold to Less Than and the value to 1000 (MB).

Pro-Tip: Open two tabs, one with Log Search and the metric chart and the other as the Alert Creation page. In the future, we plan to integrate visualizations straight into the alert creation process.

  3. Choose Trigger conditions.

Metric measurements come with the ability to define trigger options at a granular level. These two options are Total Breaches and Consecutive Breaches.

Total Breaches: An alert fires when X out of Y samples in the window exceed the threshold. For example, if the sampling interval is 15 minutes and a 60-minute time window is defined, there are 60/15 = 4 samples. If the trigger condition is set to greater than 2 total breaches, an alert fires when 3 of the 4 samples are greater than the threshold.

On the following chart, if the threshold is set to 15 and the trigger condition to greater than 1 total breach, the alert fires because there are two violating points in the specified time window.

Graph that shows two violating points in the specified time window

Consecutive Breaches: An alert fires when X consecutive samples exceed the AggregateValue threshold; the time window is less important in this case. For example, if the trigger condition is greater than 2 consecutive breaches, an alert is raised when the last 3 samples are all greater than the threshold.

In the following graph, if the threshold is 10 and the trigger condition is greater than 5 consecutive breaches, the alert fires for the given time window because there are 6 consecutive violations.

Graph that shows 6 consecutive violations when threshold is 10 and trigger condition is greater than 5 consecutive breaches
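The two trigger conditions above can be sketched in a few lines. The helper names here are hypothetical (the actual evaluation runs server-side in Log Analytics against the aggregated samples):

```python
# Sketch of how the metric measurement trigger conditions evaluate a
# window of aggregated samples (e.g. avg CPU % per sampling interval).

def total_breaches(samples, threshold, breaches):
    """Fire when more than `breaches` samples in the window exceed the threshold."""
    return sum(1 for s in samples if s > threshold) > breaches

def consecutive_breaches(samples, threshold, breaches):
    """Fire when more than `breaches` samples in a row exceed the threshold."""
    run = longest = 0
    for s in samples:
        run = run + 1 if s > threshold else 0
        longest = max(longest, run)
    return longest > breaches

# 60-minute window sampled every 15 minutes -> 60/15 = 4 samples
window = [80, 95, 70, 92]
print(total_breaches(window, 90, 1))        # True: 2 of 4 samples exceed 90
print(consecutive_breaches(window, 90, 1))  # False: no run of 2+ breaches
```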

Metric measurement alerts in search

Because metric measurement alerts are evaluated for each unique object in the grouping, we get a unique alert for each object. This also means actions such as email, runbook, or webhook are initiated per alert firing.

Additionally, you can group by the specific computer field. This field is then available in the alert record in search.

View an alert for a specific computer

Example queries for alerts

 

  • Alert if any computer talks to a malicious IP X times:
    MaliciousIP=* | measure count() by Computer interval 1minute

  • Alert if any Windows or Linux CPU % is greater than X:
    Type:Perf ObjectName=Processor CounterName="% Processor Time" | measure avg(CounterValue) by Computer, InstanceName interval 5minutes

  • Alert if any Windows or Linux Memory Used % is greater than X:
    Type:Perf ObjectName=Memory CounterName="% Used Memory" | measure avg(CounterValue) by Computer interval 5minutes

  • Alert if any Windows or Linux agent has missing Security Updates:
    Type:Update AND Classification="Security Updates" UpdateState=Needed Optional=false | measure count() by Computer interval 12hours

New in Intune: More conditional access, App SDK updates, and Android for Work!


A lot of teams ramp down at the end of the year, shifting into holiday hibernation mode for the final stretch. But not us. We’re still pushing at full speed, dedicated to delivering more value to you in the remainder of 2016. If you’re already making the shift into holiday mode, we suggest you bookmark this page, because you’ll want to read about all these new features and improvements in Intune when you’re back from the break and gearing up for 2017. And please check back next month for news on our final update of the year.

More conditional access goodness:

Conditional access is one of the signature experiences from Microsoft Enterprise Mobility + Security, bringing together the power of Intune and Azure Active Directory Premium to allow you to define policies that provide contextual control at the user, location, device and app levels. This rich set of features gives you the control you need to ensure your corporate data is secure, while giving your users the experience they expect in today’s world. We’re excited to announce these new features that further expand our conditional access capabilities to mobile applications and Windows PCs:

  • Conditional access for mobile apps
    This update allows you to restrict access to Exchange Online to only apps that are enabled with Intune’s mobile application protection policies, such as Outlook. If you’ve been looking for a way to block access to Exchange Online from built-in mail clients or other apps, look no further.
  • Conditional access for Windows PCs
    You can now create conditional access policies through the Intune admin console to block Windows PCs from accessing Exchange Online and SharePoint Online. You can also create conditional access policies to block access to Office desktop and universal applications.

Conditional Access Overview

Intune App SDK now supports MAM without device enrollment

Last year, we released the Intune App SDK for iOS and Android. The SDK enables developers to easily build data protection and app management features into mobile apps, allowing admins to manage these apps via Microsoft Intune. For existing line-of-business applications, we created an Intune App Wrapping Tool which allows you to add app management without making code changes.

A few months ago, we took it a step further, releasing a Cordova plugin and Xamarin component based on our SDK that makes it simpler for cross-platform mobile developers using Cordova and Xamarin to incorporate Intune’s mobile application protection controls into their standard development process.

Today, we are happy to announce that all our SDK tools have been updated to support MAM without enrollment scenarios. Whether you’re a big power player creating apps the world knows and loves, or an in-house developer creating LOB apps to fit the unique needs of your team, there’s never been a better time to use our SDK.

You can download the Intune App SDK, App Wrapping Tool, Cordova plugin, and Xamarin component on GitHub.

Android for Work now generally available

Thanks to those of you who took part in our public preview. Today, we’re pleased to announce the general availability of our Android for Work support. There’s loads of information to help you get started on our docs site.

Visit the What’s New in Microsoft Intune page for more on these and other recent developments in Intune.

Additional resources:

Empowering employees in a digital world


Pervasive access to new digital services is changing every aspect of your business—enabling growth, disrupting industry landscapes, and acting as the catalyst for new business models, products, services and experiences. Don’t let fear slow down your digital transformation. Use Microsoft Secure Productive Enterprise to empower employees with tools that fuel collaboration and productivity—and to mitigate the risks that come with a digital world.

Secure Productive Enterprise delivers the capabilities your organization needs to empower employees in a digital workplace, including:

  • Trust—Protect the organization, data and people.
  • Collaboration—Create a productive workplace that embraces diverse workstyles.
  • Mobility—Enable employees to get things done from anywhere.
  • Intelligence—Provide insights to drive faster, better business decisions.

Download the free e-book, “Empowering Employees in a Digital World,” to learn more.

The post Empowering employees in a digital world appeared first on Office Blogs.
