
New Features in C# 7.0


Here is a description of all the new language features in C# 7.0, which came out last Tuesday as part of the Visual Studio 2017 release.

C# 7.0 adds a number of new features and brings a focus on data consumption, code simplification and performance. Perhaps the biggest features are tuples, which make it easy to have multiple results, and pattern matching, which simplifies code that is conditional on the shape of data. But there are many other features big and small. We hope that they all combine to make your code more efficient and clear, and you happier and more productive.

If you are curious about the design process that led to this feature set, you can find design notes, proposals and lots of discussion at the C# language design GitHub site.

If this post feels familiar, it may be because a preliminary version went out last August. In the final version of C# 7.0 a few details have changed, some of them in response to great feedback on that post.

Have fun with C# 7.0, and happy hacking!

Mads Torgersen, C# Language PM

Out variables

In older versions of C#, using out parameters isn’t as fluid as we’d like. Before you can call a method with out parameters you first have to declare variables to pass to it. Since you typically aren’t initializing these variables (they are going to be overwritten by the method after all), you also cannot use var to declare them, but need to specify the full type:
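
For example, with a hypothetical Point type that exposes a GetCoordinates(out int x, out int y) method, the old pattern looks something like this:

int x, y;  // have to "predeclare", and can't use var
p.GetCoordinates(out x, out y);
Console.WriteLine($"({x}, {y})");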

In C# 7.0 we have added out variables; the ability to declare a variable right at the point where it is passed as an out argument:
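
With the same hypothetical GetCoordinates method as above, the declarations move into the argument list:

p.GetCoordinates(out int x, out int y);
Console.WriteLine($"({x}, {y})");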

Note that the variables are in scope in the enclosing block, so that the subsequent line can use them. Many kinds of statements do not establish their own scope, so out variables declared in them are often introduced into the enclosing scope.

Since the out variables are declared directly as arguments to out parameters, the compiler can usually tell what their type should be (unless there are conflicting overloads), so it is fine to use var instead of a type to declare them:
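
Continuing the sketch:

p.GetCoordinates(out var x, out var y);
Console.WriteLine($"({x}, {y})");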

A common use of out parameters is the Try... pattern, where a boolean return value indicates success, and out parameters carry the results obtained:
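
For instance, parsing user input (s is assumed to be a string obtained elsewhere):

if (int.TryParse(s, out var i))
    Console.WriteLine($"Parsed: {i}");
else
    Console.WriteLine("Could not parse input");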

We allow "discards" as out parameters as well, in the form of a _, to let you ignore out parameters you don’t care about:

Pattern matching

C# 7.0 introduces the notion of patterns, which, abstractly speaking, are syntactic elements that can test that a value has a certain "shape", and extract information from the value when it does.

Examples of patterns in C# 7.0 are:

  • Constant patterns of the form c (where c is a constant expression in C#), which test that the input is equal to c
  • Type patterns of the form T x (where T is a type and x is an identifier), which test that the input has type T, and if so, extract the value of the input into a fresh variable x of type T
  • Var patterns of the form var x (where x is an identifier), which always match, and simply put the value of the input into a fresh variable x with the same type as the input.

This is just the beginning – patterns are a new kind of language element in C#, and we expect to add more of them to C# in the future.

In C# 7.0 we are enhancing two existing language constructs with patterns:

  • is expressions can now have a pattern on the right hand side, instead of just a type
  • case clauses in switch statements can now match on patterns, not just constant values

In future versions of C# we are likely to add more places where patterns can be used.

Is-expressions with patterns

Here is an example of using is expressions with constant patterns and type patterns:
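
A small illustrative method (PrintStars is an invented name; o can be any object):

public static void PrintStars(object o)
{
    if (o is null) return;      // constant pattern "null"
    if (!(o is int i)) return;  // type pattern "int i"
    Console.WriteLine(new string('*', i));
}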

As you can see, the pattern variables – the variables introduced by a pattern – are similar to the out variables described earlier, in that they can be declared in the middle of an expression, and can be used within the nearest surrounding scope. Also like out variables, pattern variables are mutable. We often refer to out variables and pattern variables jointly as "expression variables".

Patterns and Try-methods often go well together:
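
For example, in this sketch the pattern variable i can be filled in either by the type pattern or by int.TryParse in the other branch:

if (o is int i || (o is string s && int.TryParse(s, out i)))
{
    Console.WriteLine(new string('*', i));
}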

Switch statements with patterns

We’re generalizing the switch statement so that:

  • You can switch on any type (not just primitive types)
  • Patterns can be used in case clauses
  • Case clauses can have additional conditions on them

Here’s a simple example:
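
A sketch, assuming hypothetical Circle and Rectangle shape classes with the obvious properties:

switch (shape)
{
    case Circle c:
        Console.WriteLine($"circle with radius {c.Radius}");
        break;
    case Rectangle s when (s.Length == s.Height):
        Console.WriteLine($"{s.Length} x {s.Height} square");
        break;
    case Rectangle r:
        Console.WriteLine($"{r.Length} x {r.Height} rectangle");
        break;
    default:
        Console.WriteLine("<unknown shape>");
        break;
    case null:
        throw new ArgumentNullException(nameof(shape));
}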

There are several things to note about this newly extended switch statement:

  • The order of case clauses now matters: Just like catch clauses, the case clauses are no longer necessarily disjoint, and the first one that matches gets picked. It’s therefore important that the square case comes before the rectangle case above. Also, just like with catch clauses, the compiler will help you by flagging obvious cases that can never be reached. Before this you couldn’t ever tell the order of evaluation, so this is not a breaking change of behavior.
  • The default clause is always evaluated last: Even though the null case above comes last, it will be checked before the default clause is picked. This is for compatibility with existing switch semantics. However, good practice would usually have you put the default clause at the end.
  • The null clause at the end is not unreachable: This is because type patterns follow the example of the current is expression and do not match null. This ensures that null values aren’t accidentally snapped up by whichever type pattern happens to come first; you have to be more explicit about how to handle them (or leave them for the default clause).

Pattern variables introduced by a case ...: label are in scope only in the corresponding switch section.

Tuples

It is common to want to return more than one value from a method. The options available in older versions of C# are less than optimal:

  • Out parameters: Clunky to use (even with the improvements described above), and they don’t work with async methods.
  • System.Tuple<...> return types: Verbose to use and require an allocation of a tuple object.
  • Custom-built transport type for every method: A lot of code overhead for a type whose purpose is just to temporarily group a few values.
  • Anonymous types returned through a dynamic return type: High performance overhead and no static type checking.

To do better at this, C# 7.0 adds tuple types and tuple literals:
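
A sketch of such a method (the lookup itself is faked for brevity):

(string, string, string) LookupName(long id)               // tuple return type
{
    string first = "John", middle = "Q", last = "Public";  // stand-in for a real data lookup
    return (first, middle, last);                          // tuple literal
}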

The method now effectively returns three strings, wrapped up as elements in a tuple value.

The caller of the method will receive a tuple, and can access the elements individually:
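
For example (id is assumed to come from elsewhere):

var names = LookupName(id);
Console.WriteLine($"found {names.Item1} {names.Item3}.");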

Item1 etc. are the default names for tuple elements, and can always be used. But they aren’t very descriptive, so you can optionally add better ones:
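
For instance, names can be put right on the tuple return type:

(string first, string middle, string last) LookupName(long id) // tuple elements have names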

Now the recipient of that tuple has more descriptive names to work with:
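
For example:

var names = LookupName(id);
Console.WriteLine($"found {names.first} {names.last}.");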

You can also specify element names directly in tuple literals:
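
For instance, inside the method body:

return (first: first, middle: middle, last: last); // named tuple elements in a literal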

Generally you can assign tuple types to each other regardless of the names: as long as the individual elements are assignable, tuple types convert freely to other tuple types.

Tuples are value types, and their elements are simply public, mutable fields. They have value equality, meaning that two tuples are equal (and have the same hash code) if all their elements are pairwise equal (and have the same hash code).

This makes tuples useful for many other situations beyond multiple return values. For instance, if you need a dictionary with multiple keys, use a tuple as your key and everything works out right. If you need a list with multiple values at each position, use a tuple, and searching the list etc. will work correctly.
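
For instance, a two-part key is just a tuple (the number here is made up):

var population = new Dictionary<(string state, string city), int>();
population[("WA", "Redmond")] = 60000;
var redmond = population[("WA", "Redmond")];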

Tuples rely on a family of underlying generic struct types called ValueTuple<...>. If you target a Framework that doesn’t yet include those types, you can instead pick them up from NuGet:

  • Right-click the project in the Solution Explorer and select "Manage NuGet Packages…"
  • Select the "Browse" tab and select "nuget.org" as the "Package source"
  • Search for "System.ValueTuple" and install it.

Deconstruction

Another way to consume tuples is to deconstruct them. A deconstructing declaration is a syntax for splitting a tuple (or other value) into its parts and assigning those parts individually to fresh variables:
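
For example, using the LookupName sketch from above (id1 is assumed):

(string first, string middle, string last) = LookupName(id1); // deconstructing declaration
Console.WriteLine($"found {first} {last}.");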

In a deconstructing declaration you can use var for the individual variables declared:
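
For example:

(var first, var middle, var last) = LookupName(id1);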

Or even put a single var outside of the parentheses as an abbreviation:
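
For example:

var (first, middle, last) = LookupName(id1);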

You can also deconstruct into existing variables with a deconstructing assignment:
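
For example:

(first, middle, last) = LookupName(id2); // deconstructing assignment into existing variables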

Deconstruction is not just for tuples. Any type can be deconstructed, as long as it has an (instance or extension) deconstructor method of the form:
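
For instance, a hypothetical Person class might pair a constructor with a deconstructor like this:

public class Person
{
    public string FirstName { get; }
    public string LastName { get; }

    public Person(string first, string last) { FirstName = first; LastName = last; }

    public void Deconstruct(out string first, out string last)
    {
        first = FirstName;
        last = LastName;
    }
}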

The out parameters constitute the values that result from the deconstruction.

(Why does it use out parameters instead of returning a tuple? That is so that you can have multiple overloads for different numbers of values).

It will be a common pattern to have constructors and deconstructors be "symmetric" in this way.

Just as for out variables, we allow "discards" in deconstruction, for things that you don’t care about:
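
For example, if the middle name isn’t needed:

var (first, _, last) = LookupName(id); // the middle name is discarded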

Local functions

Sometimes a helper function only makes sense inside of a single method that uses it. You can now declare such functions inside other function bodies as a local function:
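
An illustrative sketch (the Fibonacci flavor of the example is made up):

public int Fibonacci(int x)
{
    if (x < 0) throw new ArgumentException("Less negativity please!", nameof(x));
    return Fib(x).current;

    (int current, int previous) Fib(int i)
    {
        if (i == 0) return (1, 0);
        var (p, pp) = Fib(i - 1);
        return (p + pp, p);
    }
}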

Parameters and local variables from the enclosing scope are available inside of a local function, just as they are in lambda expressions.

As an example, methods implemented as iterators commonly need a non-iterator wrapper method for eagerly checking the arguments at the time of the call. (The iterator itself doesn’t start running until MoveNext is called). Local functions are perfect for this scenario:
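
A sketch of that shape (assuming the usual System and System.Collections.Generic usings):

public static IEnumerable<T> Filter<T>(IEnumerable<T> source, Func<T, bool> filter)
{
    if (source == null) throw new ArgumentNullException(nameof(source));
    if (filter == null) throw new ArgumentNullException(nameof(filter));
    return Iterator();

    IEnumerable<T> Iterator()
    {
        foreach (var element in source)
        {
            if (filter(element)) yield return element;
        }
    }
}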

If Iterator had been a private method next to Filter, it would have been available for other members to accidentally use directly (without argument checking). Also, it would have needed to take all the same arguments as Filter instead of having them just be in scope.

Literal improvements

C# 7.0 allows _ to occur as a digit separator inside number literals:
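
For example:

var million = 1_000_000;
var mask = 0xFF_FF_00;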

You can put them wherever you want between digits, to improve readability. They have no effect on the value.

Also, C# 7.0 introduces binary literals, so that you can specify bit patterns directly instead of having to know hexadecimal notation by heart.
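
For example, combined with digit separators:

var flags = 0b1010_1011;   // binary literal: 171 in decimal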

Ref returns and locals

Just like you can pass things by reference (with the ref modifier) in C#, you can now return them by reference, and also store them by reference in local variables.

This is useful for passing around placeholders into big data structures. For instance, a game might hold its data in a big preallocated array of structs (to avoid garbage collection pauses). Methods can now return a reference directly to such a struct, through which the caller can read and modify it.
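
A sketch of the idea, with an array of ints standing in for the big structure (Find is an invented helper):

public static ref int Find(int number, int[] numbers)
{
    for (int i = 0; i < numbers.Length; i++)
    {
        if (numbers[i] == number)
        {
            return ref numbers[i];    // return a reference to the storage location
        }
    }
    throw new IndexOutOfRangeException($"{number} not found");
}

int[] array = { 1, 15, -39, 0, 7, 14, -12 };
ref int place = ref Find(7, array);   // "place" aliases the array slot holding 7
place = 9;                            // writes 9 into the array
Console.WriteLine(array[4]);          // prints 9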

There are some restrictions to ensure that this is safe:

  • You can only return refs that are "safe to return": Ones that were passed to you, and ones that point into fields in objects.
  • Ref locals are initialized to a certain storage location, and cannot be mutated to point to another.

Generalized async return types

Up until now, async methods in C# had to return either void, Task or Task<T>. C# 7.0 allows other types to be defined in such a way that they can be returned from an async method.

For instance, we now have a ValueTask<T> struct type. It is built to prevent the allocation of a Task<T> object in cases where the result of the async operation is already available at the time of awaiting. For many async scenarios where buffering is involved, for example, this can drastically reduce the number of allocations and lead to significant performance gains.
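
A minimal sketch, assuming the System.Threading.Tasks.Extensions package for ValueTask<T> and an invented LoadAsync method for the slow path:

private int _cachedResult;
private bool _haveCache;

public async ValueTask<int> GetValueAsync()
{
    if (_haveCache) return _cachedResult;   // hot path: no Task allocation
    _cachedResult = await LoadAsync();      // slow path: hypothetical async load
    _haveCache = true;
    return _cachedResult;
}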

There are many other ways that you can imagine custom "task-like" types being useful. It won’t be straightforward to create them correctly, so we don’t expect most people to roll their own, but it is likely that they will start to show up in frameworks and APIs, and callers can then just return and await them the way they do Tasks today.

More expression bodied members

Expression bodied methods, properties etc. were a big hit in C# 6.0, but we didn’t allow them in all kinds of members. C# 7.0 adds accessors, constructors and finalizers to the list of things that can have expression bodies:
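
An illustrative class showing the new places expression bodies are allowed:

class Person
{
    private string name;

    public Person(string name) => this.name = name;      // expression-bodied constructor
    ~Person() => Console.WriteLine($"{name} finalized");  // expression-bodied finalizer
    public string Name
    {
        get => name;                                      // expression-bodied getter
        set => name = value;                              // expression-bodied setter
    }
}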

This is an example of a feature that was contributed by the community, not the Microsoft C# compiler team. Yay, open source!

Throw expressions

It is easy to throw an exception in the middle of an expression: just call a method that does it for you! But in C# 7.0 we are directly allowing throw as an expression in certain places:
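
A couple of illustrative spots: after a null-coalescing operator, and in a conditional expression:

class Person
{
    public string Name { get; }

    public Person(string name) => Name = name ?? throw new ArgumentNullException(nameof(name));

    public string GetFirstName()
    {
        var parts = Name.Split(' ');
        return (parts.Length > 0) ? parts[0] : throw new InvalidOperationException("No name!");
    }
}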


Build & Deploy Machine Learning Apps on Big Data Platforms with Microsoft Linux Data Science Virtual Machine


This post is authored by Gopi Kumar, Principal Program Manager in the Data Group at Microsoft.

This post covers our latest additions to the Microsoft Linux Data Science Virtual Machine (DSVM), a custom VM image on Azure, purpose-built for data science, deep learning and analytics. Offered in both Microsoft Windows and Linux editions, DSVM includes a rich collection of tools, seen in the picture below, and makes you more productive when it comes to building and deploying advanced machine learning and analytics apps.

The central theme of our latest Linux DSVM release is to enable the development and testing of ML apps for deployment to distributed scalable platforms such as Spark, Hadoop and Microsoft R Server, for operating on data at a very large scale. In addition, with this release, DSVM also offers Julia Computing’s JuliaPro on both Linux and Windows editions.


Here’s more on the new DSVM components you can use to build and deploy intelligent apps to big data platforms:

Microsoft R Server 9.0

Version 9.0 of Microsoft R Server (MRS) is a major update to enterprise-scale R from Microsoft, supporting parallel and distributed computation. MRS 9.0 supports analytics execution in the Spark 2.0 context. There’s a new architecture and simplified interface for deploying R models and functions as web services via a new library called mrsdeploy, which makes it easy to consume models from other apps using the open Swagger framework.

Local Spark Standalone Instance

Spark is one of the premier platforms for highly scalable big data analytics and machine learning. Spark 2.0 launched in mid-2016 and brings several improvements, such as the revised machine learning library (MLlib), scaling and performance optimization, better ANSI SQL compliance and unified APIs. The Linux DSVM now offers a standalone Spark instance (based on the Apache Spark distribution) and a PySpark kernel in Jupyter to help you build and test applications on the DSVM and deploy them on large-scale clusters like Azure HDInsight Spark or your own on-premises Spark cluster. You can develop your code using either Jupyter notebooks or the included community edition of the PyCharm IDE for Python or RStudio for R.

Single Node Local Hadoop (HDFS and YARN) Instance

To make it easier to develop Hadoop programs and/or use HDFS storage locally for development and testing, a single node Hadoop installation is built into the VM. Also, if you are developing on the Microsoft R Server for execution in Hadoop or Spark remote contexts, you can first test things locally on the Linux DSVM and then deploy the code to a remote scaled out Hadoop or Spark cluster or to Microsoft R Server. These DSVM additions are designed to help you iterate rapidly when developing and testing your apps, before they get deployed into large-scale production big data clusters.

The DSVM is also a great environment for self-learning and running training classes on big data technologies. We provide sample code and notebooks to help you get started quickly on the different data science tools and technologies offered.

DSVM Resources

New to DSVM? Here are resources to get you started:

Linux Edition

Windows Edition

The goal of DSVM is to make data scientists and developers highly productive in their work and provide a broad array of popular tools. We hope you find it useful to have these new big data tools pre-installed with the DSVM.

We always appreciate feedback, so please send in your comments below or share your thoughts with us at the DSVM community forum.

Gopi

Power BI publish to web helps nonprofits do more good

It’s been just over a year since we launched Power BI publish to web, and in that time, it’s become an important tool for bloggers, journalists, newspaper columnists, and authors who want to tell stories and share data insights online. But that’s not all! The publish to web feature has been widely adopted by businesses, civic groups, and nonprofit organizations, the latter in ways that might just help make the world a better place. Publish to web has already established itself as a great way to make data journalism even easier, but it also has found plenty of application in the nonprofit sector. These organizations are increasingly looking to leverage their data in cost-effective ways and drive real-time decision making. And nonprofits, perhaps more than other organizations, require easy, impactful ways to demonstrate issues and rally support from their constituents. Learn more about how these organizations are using Power BI.

Congratulations to this month's Featured Data Stories

Last month we put out the call for submissions using mapping tools -- along with other topics that interest you -- for the Data Stories Gallery, and we got some fantastic entries! Congratulations to the grand winner and runners-up. Want to see your work become a Featured story? Post it to the Data Stories Gallery, and then tweet a link with the hashtags #powerbi #datastory. The inspiration topic for this month is: tables and matrices!

March 2017 updates for Get & Transform in Excel 2016 and the Power Query add-in


Excel 2016 includes a powerful set of features based on the Power Query technology, which provides fast, easy data gathering and shaping capabilities and can be accessed through the Get & Transform section on the Data ribbon.

Today, we are pleased to announce two new data transformation and connectivity features that have been requested by many customers.

These updates are available as part of an Office 365 subscription. If you are an Office 365 subscriber, find out how to get these latest updates. If you have Excel 2010 or Excel 2013, you can also take advantage of these updates by downloading the latest Power Query for Excel add-in.

These updates include the following new or improved data connectivity and transformation features:

  • New transformation—horizontal list expansion.
  • Enhanced SQL Server connector—support for SQL Failover option.

New transformation—horizontal list expansion

With this update, we made it easier to extract data values from a column containing nested lists. Before this update, customers could expand nested lists within a column in a table, resulting in one new table row for each item within the nested list. This capability is accessible via the column headers in a List column, or by using the Expand ribbon entry point.

We also added a new Extract Values transformation command that allows users to extract values from a list into a new Text column, with a delimiter in between these values. This new transformation can be accessed from the column header when a column with nested Lists is selected.

Upon selecting this transformation, users are prompted to provide a delimiter to use in the new column. They can pick from a list of predefined delimiters or specify a custom one, which may also include special characters.

This transformation turns the column with nested lists into a Text column.

Enhanced SQL Server connector—support for SQL Failover option

We improved the SQL Server connector and added a new option to enable SQL Server Failover support. This new option can be found under the Advanced Options section in the SQL Server connector dialog. See “Always On Failover Cluster Instances (SQL Server)” for details about this option.

Learn more

—The Excel team

The post March 2017 updates for Get & Transform in Excel 2016 and the Power Query add-in appeared first on Office Blogs.

Introducing the new Office 365 profile experience


In the modern workplace, an organization’s most important assets are its people. The knowledge, skills and expertise found throughout your carefully recruited teams are essential to individual and collective success.

All too often, however, this specialized knowledge is obscured by physical and organizational barriers. People know what information they need, but are unable to track down the answers they’re looking for. The popular adage “It’s not what you know, it’s who you know” reminds us that the best-connected employees get the most done.

That’s where Office 365 can help. As Microsoft works to reinvent productivity for the modern workplace, our goal is to put people at the center of the connected suite experience. When you’re able to tap into the hidden knowledge throughout your organization and leverage your talent pool, you’re able to achieve more.

Starting today, we’re rolling out an extended profile card experience across Office 365 to enhance the way you collaborate with colleagues and external contacts. We’ve made several big improvements to the existing experience across three pillars, creating an intelligent, holistic and integrated profile experience.

Intelligent

Traditionally, employees looking for specific information had to manually connect the dots between people and units of knowledge. By tapping into the Office 365 graph and machine learning, the new Office 365 profile card can identify information relevant to you based on the person you’re looking up. This can help you quickly look up documents that have been shared with you, independent of how they were sent.

Holistic

We’re also working to help employees connect with people across the organization that they don’t traditionally interact with. The new Organization view shows a complete picture of the highlighted user’s position in the company, including their direct reports and co-workers. Office 365 will also surface other people relevant to the person you are looking up based on their working habits and communication.

Integrated

We’re integrating the new profile card everywhere you see a person’s name—but it’s important that the experience doesn’t interrupt your productivity. We’ve made it easy for users to look up this information with as little interruption to their workflow as possible. Hovering over a name provides a quick look at a person’s most important attributes, such as contact details, recent documents and manager. More details are only a click away with the extended flex pane, which displays additional information without navigating from the page.

Over the next few weeks, the new profile card experience will begin rolling out in OneDrive for Business and SharePoint Online for Office 365 customers enrolled in first release. We’ll continue to roll out this service for all Office 365 users over the next few months. As always, we’d love to hear what you think, so please let us know in the comments below.

—Tom Batcheler, @TomBatcheler, senior product marketing manager for the Office 365 team

The post Introducing the new Office 365 profile experience appeared first on Office Blogs.

Where is my custom log data?


The ability of OMS to ingest custom log files enables almost limitless possibilities for you to collect and analyse data from your systems. In fact, a common usage is for customers to use their own PowerShell scripts to collect, collate and create custom logs for uploading and analysis.

If this sounds appealing, you can read more about setting up custom logs at Custom logs in Log Analytics.

I am not writing this to beat the drum for using custom logs but to make you aware of a gotcha that I’ve seen a few of our customers hit when doing this.

We currently only support the ASCII and UTF-8 encodings. This is especially relevant if you are using the PowerShell out-file cmdlet because it defaults to UTF-16.

So, when writing your custom log file, be sure to include the -encoding switch and specify either ASCII or UTF-8.

For example:

out-file -Encoding utf8 -FilePath c:\logs\somelog.log

Brian McDermott
Senior Escalation Engineer
Microsoft

SQL Server Data Tools 17.0 RC, and SSDT in VS2017


It’s been a busy few months for SSDT with multiple Release Candidate releases of the 17.0 major release, and Visual Studio 2017 shipping this week. Here’s a quick update on the latest & greatest database tools updates you can try out.

SQL Server Data Tools 17.0 RC3

Today we’re announcing SSDT 17.0 RC3 release for Visual Studio 2015. SSDT 17.0 is a major update with support for SQL Server vNext and the latest Azure SQL DB features built in. We’ve released multiple RCs and are well on our way towards a GA release that’s supported against production instances.

Download the latest release candidate here.

New in SSDT 17.0 RC3: Ignore Column Order when publishing Database Projects

One of the most requested features for database project publishing is now finally here. If you’ve ever had to deal with accidental data motion when putting a column in the middle of a table definition, you’ll know how hard this can be to spot and manage. In this release you can now check the “Ignore Column Order” option in the Advanced Publish Options. This will append new columns to the end of an existing table rather than altering the table structure to add the column in its listed position.

For a full list of features for all project types see the release notes on the SSDT 17.0 Release Candidate download page.

Visual Studio 2017

Visual Studio 2017 includes SSDT relational DB support: Database Projects, Schema Compare, Data Compare, SQL Server Object Explorer and more. The Data Storage and Processing workload is optimized for DB developers, but SSDT is a recommended option in most other workloads including ASP.Net and Web Development. We encourage you to try out the new, more lightweight Visual Studio today!

Support for Analysis Services and Reporting Services BI project types is available in the VS2017 Gallery on launch day. 

To download Visual Studio 2017 and learn more about the exciting new features in this release, see www.visualstudio.com.

FAQ

When will all BI Projects be supported in VS2017?

Support for Analysis Services and Reporting Services BI project types is available in the VS2017 Gallery on launch day.  Support for Integration Services for Visual Studio 2017 is in progress, but is not yet available on launch day. For now we recommend using SSDT for VS2015 if you need to use all of the BI project types together in a solution, but stay tuned to this blog for updates on this.

When will SQL Server vNext & other SSDT 17.0 features be supported in VS2017?

We’re focused on bringing SSDT 17.0 to GA quality in the coming months. After this happens, we’ll work to ship this updated code in a future Visual Studio 2017 update. Please note that in Visual Studio 2017, database project updates are shipped via the built-in updater instead of a separate setup executable. This makes install & management much easier, but does mean that releases align with the Visual Studio release schedule rather than shipping on an ad-hoc basis. With this change, SSDT updates for Visual Studio 2017 will generally ship after updates to SSMS and other tools.

Will there be additional feature updates to SSDT in Visual Studio 2013?

No. SSDT supports the most recent 2 versions of Visual Studio, with our features shipping in both versions wherever possible. With the release of Visual Studio 2017, we are ceasing to ship feature updates to Visual Studio 2013. This release will continue to be supported via the Microsoft support lifecycle, meaning any vital security fixes and similar important fixes will be released. We recommend updating to the latest version of Visual Studio to keep getting our most up to date features.

What tools does Visual Studio 2017 have for database development?

Visual Studio 2017 has several features specific to database development to keep developers more productive, while ensuring that application’s data and schemas are part of the DevOps best practices (automatic deployment, continuous integration). Developers can leverage SQL Server Data Tools (SSDT) included in all Visual Studio editions. Additionally on Visual Studio 2017, developers can leverage Redgate Data Tools. SQL Search is available in all editions, and SQL Prompt Core and ReadyRoll Core are available for VS 2017 Enterprise subscribers.

  • SQL Server Data Tools (SSDT) turns Visual Studio into a powerful development environment for SQL Server, Azure SQL Database and Azure SQL Data Warehouse. With SSDT, developers can visually design, build, debug, test, maintain, refactor, deploy, source control and enable continuous integration & continuous deployment for their databases with a declarative model that spans all the phases of database development. Developers can work offline with a database project, or directly with a connected database instance in Azure SQL Database, Azure SQL Data Warehouse, and SQL Server running on Windows, Linux, Docker and in Azure or any cloud.
  • With ReadyRoll Core, users can develop, source control, and safely automate deployments of database changes, alongside application changes. This means the same tools used for application development can be utilized for database development and deployment, and ensures a single source of truth for both application and database changes. ReadyRoll’s migrations-based approach gives developers more control over the end database deployment script and can be easily integrated into DevOps processes, such as continuous integration and continuous delivery.
  • With SQL Prompt Core users can increase productivity with advanced IntelliSense-style code completion in Visual Studio.
  • SQL Search allows users to quickly search for SQL objects across databases. Together, Redgate Data Tools help to ensure database development is not the bottleneck to continuously delivering value to end users.

Can these tools help me with continuous integration & continuous deployment scenarios?

Yes, both SSDT Database Projects and ReadyRoll can be used to include the database in your CI and CD processes. ReadyRoll is an alternative to SSDT Database Projects that allows developers to have more control over the end database deployment script.

Read more about the differences between these in the “Which tool should I use” question.

Where do I get these database development tools from?

You can find both SQL Server Data Tools and Redgate’s SQL Search on all Visual Studio 2017 editions. Additionally, Visual Studio 2017 Enterprise subscribers have access to Redgate SQL Prompt Core and ReadyRoll Core as part of their installation, in the Data storage and processing workload.

Which tool should I use: SSDT or ReadyRoll?  

ReadyRoll is an alternative to SSDT Database Projects that allows developers to have more control over the end database deployment script. It can be used alongside other great features in SSDT like the Table Designer and ability to View and Edit Data.

SSDT and ReadyRoll Core take two different approaches to database development. SSDT takes a state-based approach, while ReadyRoll Core takes a migrations-based approach. The short version, by way of a cake-baking analogy:

  • With SSDT database projects, you are given a list of ingredients and a picture of what the end cake will look like; the beginning and end state. How you get there is derived.
  • With ReadyRoll Core, you are given the recipe telling you what to do each step of the way to turn the ingredients into a cake.

 


Announcing granular tenant settings in Power BI

Power BI allows organizations to monitor and analyze their most critical business data. As such, it’s very important for administrators to have control over how Power BI is used in their…

Giving a Workgroup Server an FQDN


Recently I needed to be able to securely, remotely manage a set of Windows Servers that were not domain joined.  One problem that I hit while setting this up was that each of the servers did not believe that they had a valid FQDN.

For example – I could:

  • Set the name of a computer to “HyperVSV1”
  • Create a DNS entry that said that “HyperVSV1.mydomain.com” resolved to that computer
  • I could then correctly ping the computer at that address

But when I tried to use tools like PowerShell Remoting or Remote Desktop – they would complain that “HyperVSV1.mydomain.com” did not believe it was “HyperVSV1.mydomain.com”.

Thankfully, this is relatively easy to fix.

If you open PowerShell and run the following two commands:

Set-ItemProperty "HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\" -Name Domain -Value "mydomain.com"
Set-ItemProperty "HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\" -Name "NV Domain" -Value "mydomain.com"

After this your workgroup server will correctly identify itself with a valid FQDN.

Cheers,
Ben

vswhere now searches older versions of Visual Studio


One of the top requests I kept hearing for vswhere was to also search older versions of Visual Studio. You can now do that starting with the latest release.

vswhere -legacy -latest -property installationPath

Even if you don’t have Visual Studio 2017 or newer installed – which means the query API is not even registered – you can use vswhere to find the installation root directory for Visual Studio 2010 and newer.

There are some caveats:

  • You cannot also specify the -products or -requires parameters.
  • Information returned is very limited. While there are a few more properties I could scavenge, the scenario that prompted the requests was to find the location.

[
{
"instanceId": "VisualStudio.14.0",
"installationPath": "C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\",
"installationVersion": "14.0"
}
]

All the other vswhere parameters that control the output format and selection still work. For example, to find the path to the latest version from Visual Studio 2010 through Visual Studio 2015 and launch devenv.exe:

$path = vswhere -legacy -latest -version '[10.0,15.0)' -property installationPath
if (test-path "$path\Common7\IDE\devenv.exe") {
  & "$path\Common7\IDE\devenv.exe" $args
}

Once the new version is moderated on Chocolatey.org you can acquire it quickly using choco or PowerShell’s install-package.

For rich information about instances, I recommend you use the VSSetup PowerShell module. Its cmdlets support deeper analysis of instances and are designed to work only with Visual Studio 2017 and newer, or related products such as Build Tools.

Released: Public Preview for Microsoft Azure SQL Database Management Pack (6.7.25.0)


We are working on significantly updating the Management Pack for Azure SQL Database. Your feedback on this public preview will improve the quality of the final release. Please install the bits from the download location below and send us your feedback. You can comment on this post or send specific feedback to sqlmpsfeedback@microsoft.com. We are looking forward to hearing from you.

You can find the release at:

Microsoft System Center Management Pack (Community Technology Preview 2) for Azure SQL Database

Please see below for the new features and improvements. More detailed information can be found in the guide that can be downloaded from the link above.

New Features and Fixes

  • Implemented performance improvements
  • Improved error handling in Add Monitoring Wizard
  • Fixed issue: “Collect Elastic Database Pool Number of Databases” rule does not collect performance data if REST monitoring is used
  • Fixed issue: “Operations Manager Expression Filter Module” error messages appear in the Operations Manager event log

We are looking forward to hearing your feedback!

Linux Integration Services 4.1.3-2


Linux Integration Services has been updated to version 4.1.3-2 and is available from https://www.microsoft.com/en-us/download/details.aspx?id=51612

This is a minor update to correct the RPMs for a kernel ABI change in Red Hat Enterprise Linux, CentOS, and Oracle Linux’s Red Hat Compatible Kernel version 7.3. Version 3.10.0-514.10.2.el7 of the kernel was sufficiently different for symbol conflicts to break the LIS kernel modules and create a situation where a VM would not start correctly. This version of the modules is compatible with the new kernel.

Nintendo Switch and The Legend of Zelda: Breath of the Wild are an ABSOLUTE JOY


I bought a Nintendo Switch last week with my allowance and I'm utterly smitten. It's brilliant. It's absolutely brilliant.

The Nintendo Switch is FAB

Now, to be clear, I'm neither a hardcore gamer nor a journalist. However, I am someone who grew up on Mario, enjoys Retrogaming and my Xbox One, and most of all, I know genius when I see it.

The Legend of Zelda: Breath of the Wild is a wonderful example of the very best that video games can offer as an art form in 2017.

It may be the best video game ever. And it is because it borrows so much from the decades of refinement whose shoulders it stands upon.

Let's break this down into two halves. First, Zelda (which is available on WiiU and Switch), and later, the Switch itself.

If you don't feel like reading this, just trust me and buy a Switch and Zelda and bask in the hundreds of hours of joy and wonder it will bring you. It's the most fun I've had with a video game in recent memory. I also profoundly recommend the gorgeous hardcover The Legend of Zelda: Breath of the Wild: The Complete Official Guide Collector's Edition. The maps, the art, and the gentle walkthroughs are more fun than googling. The kids and I have enjoyed exploring the wilderness with the giant map unfurled in front of us.

Legend of Zelda: Breath of the Wild

It's HUGE. It's estimated at 360 square kilometers. They are saying it's 1.5x the Skyrim map and may be larger than Witcher 3. A cynic could call Breath of the Wild derivative, but an optimist like me says, well, they stole every game mechanic that was awesome over the last few decades, and made the near-perfect game. I love that this is a console launch game that is polished and has at LEAST 100 hours or more for the completist.

Zelda is gorgeous.

What is Zelda like?

  • Just Cause - Fly off a cliff with a paraglider, fly over a raging river and land on an elk, tame it and ride it. Because you're awesome and you can.
  • Witcher 3 - Massive map, armor sets, crafting and more.
  • Assassin's Creed - Climbing because...it's fun. Getting maps by unlocking towers and jumping off.
  • Grand Theft Auto - The first massive sandbox without loading. You enter a new area and get a brief subtitle announcing you're in a new "neighborhood" and then you wander.
  • Skyrim - The Elder Scrolls was the first video game I played where I climbed mountains "because they were there" and really had a sense of wonder when I got to the top. Draw distance!
  • Shadow of the Colossus - There's amazing HUGE boss fights that involve climbing the enemy, racing after monsters with horses, and sometimes going inside them.
  • Bard's Tale - Because I'm old.

Complaints? Honestly, if I had to truly nit, and I mean really nit, I'd say the durability of weapons, particularly swords, is annoying. I would make them last maybe 50% longer. Also, moving in and out of Shrines has a load screen that takes 10-15 seconds. But really, that's like saying "I wish Beyoncé was 5'8", not 5'7"." I mean, REALLY. Beyoncé. Shush.

The Nintendo Switch

It's portable. Just like in the ad, you can pull the Switch out and leave. In my video below I also switch to portable AND have to re-sync the controllers, so there is one additional ceremony, but it's easy.

It feels like a console when it's plugged in. I've got it plugged into my TV and from my couch it looks as nice as any of my devices. Sure, it's not an Xbox One playing Tom Clancy: The Division. But it's a brilliant tradeoff for a device I can simply pick up and go outside with (which I've done, with considerable appreciation.)

I'm surprised that folks are complaining about the gaming resolution, frame rates, battery life, older processor, or said "it's just like an iPad with an HDMI cable." Here's why:

  • Resolution - Zelda runs at 720p (the native res of the touchscreen) at 30fps. It's just 6.5" and 720p is just fine when it's a foot or more from your face.
  • Battery - I got an easy 3 hours out of it. If you're on a plane, carry a cable and extra battery. If you need to portably game more than 3 hours, take a break. ;) Seriously, though, given my appreciation of its portability, power, and overall experience, this is reasonable. One can always complain about battery life.
  • Frame Rate - When you dock the Switch and run Zelda over your TV the resolution is 900p and sometimes it lags. If you're in the forest, and it's raining, and there's a bunch of enemies around there will totally be moments of 20 fps. But it passes. And it's still gorgeous. A small price to pay, and we don't know if it's fixable with a software patch. Given that launch titles rarely use the new hardware in an optimized fashion, it's more than reasonable to give them a break on this.
  • Older Processor - The Switch is using the older Nvidia Tegra X1 processor. As a business person this makes sense. It's a $300 device. It's not reasonable to expect all-day battery life and 4K gaming on a device that weighs two-thirds of a pound.
  • Innovation - Yes, you can plug your iPad into your TV. But most folks don't. And the iPad and iOS clearly haven't tried to optimize for this scenario. Apple has scandalously under-supported their MFi Controller Spec, even though the SteelSeries is brilliant. Frankly, Apple handed Nintendo a huge opportunity by not making a proper controller and supporting MFi better with game devs. The Switch might not exist if I could Bluetooth-pair any controller to my iPad and play Skyrim on an iPad. Oh ya, I'd have to have an iPad with expandable memory or a cartridge slot. ;) The Switch is a new category of device. It's not an iPad.

It's a fantastic device for the price and the promises, for the most part, were kept. That said, a few gentle warnings if you do get a Switch.

  • If you put the joy-cons on backwards they might get stuck and you could perhaps damage the system.
  • The joy-cons have these little wrist straps as well, and to be clear if you put these on backwards you're in trouble. Make sure you line up the plus + signs. There's a + on the right joy-con and a - on the left one. Use the correct strap for the correct joy-con.
  • If you slam the Switch into the dock it's possible you could scratch the screen. I always treat $300 equipment like it cost $300. Be somewhat careful.

My Recommended Nintendo Switch accessories (I own each of these)

These accessories are by no means required (the Switch has everything you need out of the box) but these are all 4+ star rated and I've purchased them myself and appreciate them. Yes, I've gone overboard and my $300 Switch is now a $500 Switch BUT I HAVE NO REGERTS. ;)

  • Some kind of Carrying Case. I have the Zelda Special Edition case, but all the cases that are official Nintendo are excellent.
  • The Nintendo Switch Pro Controller. If you're going to hook your Switch up to the TV you might consider the pro controller. The Switch does come with a quasi-controller that has you pop the two joy-cons into a harness to simulate a typical Xbox/PS controller, but the ergonomics aren't great by any stretch. The Pro Controller is fantastic. It's 99% the same as an Xbox controller and includes (quietly) the full 360-degree gyro support that (I believe) Switch games will be known for (see the section above on gyro in Zelda).
  • Joy-con Grips. This was a frivolous purchase but a good one. I've got big hands and the Joy-Cons are NOT comfortable when turned horizontally and used for any period of time. These little holsters turn them into tiny Pro Controllers and make two player a LOT easier.
  • Compact Playstand. The Switch has one major hardware design "flaw" in that it can't be charged while it's using its kickstand. This little folding playstand is nice because it's 3-in-1 and can also perfectly fit a 3DSXL.
  • Large 128g EXTRA-FAST microSDXC SD Card. The Switch has only 32gigs of internal space and if you (theoretically) downloaded Zelda you'll use 13gigs. I can see myself using up a LOT of space in the next year so I got this 128G SD Card. And it's FAST.
  • 6 pack of Microfiber Cleaning Cloths  - I can't stand a dirty touchscreen. Can't. I have two dozen of these spread around the house, my car, my backpack. Can't have too many given laptops, TVs, and iPads.
  • USB C cables - Both the Switch and Pro Controller use USB C (finally!) so pick up a few USB C cables that you can use to charge in a pinch from your laptop, existing car charger, or portable battery. I only buy Anker Batteries.
  • A Zelda Amiibo - Amiibos are these little figurines with an RFID/NFC dealie inside. They are registered to you and they can "light up" features in all kinds of games. In Zelda specifically you can (a little later in the game) use them to get daily food and other bonuses. Plus they look nice on your desk.

My Predictions for the Nintendo Switch in 2017

I'm looking forward to seeing what the Nintendo Switch can become. I think/predict we'll see this on the Switch in 2017.

  • A thrilling Indie Game Community. Yes, the launch titles are weak. There aren't a ton of launch games. Call it a soft launch. But give it a few months.
  • Virtual Console - The ability to play SNES/NES and other games via some kind of emulation from Nintendo. We have already seen NEO-GEO games show up in the last few days! I can imagine we'll see a Mario Collection going back 30+ years.
  • Video Apps - If they add Hulu, Netflix, and Amazon, then I'll be taking my Switch with me on every trip.
  • A USB-C to HDMI cable - I don't want to take the dock with me on trips, so I'd love a USB-C to HDMI cable from Nintendo (It'll need their magic box/chip) to free up my bag.
  • A great balance between AAA Games and "classic" games. If Zelda and Shovel Knight are any indication, the future is bright.
  • Continued updates to the online experience. I suspect we'll get firmware and store updates quarterly.

But at the same time, what's the nightmare scenario? Nothing happens. No games come out and I have a $500 Zelda-specific device. I'm totally OK with that given the joy of the last week. So between the worst-case scenario and the best case, no matter what happens it's awesome and I'm a satisfied customer.

* I've used Amazon referral links here. Please use them and you'll support this blog and my Amiibo Habit.


Sponsor: Get next level application monitoring with Raygun - The revolutionary software intelligence platform for your web and mobile apps. Take a free trial today



© 2017 Scott Hanselman. All rights reserved.
     

Build a Smarter Bracket with Bing


March means three things.

  1. Spring is soon to be sprung.
  2. It’s time to focus solely on college hoops.
  3. Build a better bracket with Bing Predicts.

We've been using Bing Predicts for the bracket selection process for two years now, and in that time, we've learned a few things.

Bing Predicts may be the magic ingredient for making all your bracket dreams happen. We use our intelligent machine-learning technology to analyze social and search signals, plus more than a decade of college hoops statistical data to bring you insights on every single one of your picks.

Bing March Madness

There are quintillions of ways to fill out a bracket, so we want to bring some intelligence into the often overwhelming decision making process. Here's how it works:

  • When selecting your first round matchups, use the Bing predictions to help guide your choices
  • In every subsequent pick for every subsequent round, the Bing models give suggestions and forecasts based on who you picked in the prior round. Even after the tournament starts, Bing will still provide predictions for matches in the round of 16

Our smarter bracket can work its magic on your phone as well with a fantastic mobile bracket experience. Download the Bing app and search “March Madness” to start swiping away at your bracket.

Want to use a bot to keep track of your teams? Follow them with the Bing Sportscaster bot on Facebook Messenger for news and game updates.

Good luck this March. We understand the agony involved in getting your bracket just right. If Bing Predicts can take a bit of the madness away, then we did our job.

- The Bing Team


Join us March 15th for the first Azure Blockchain AMA


Join us Wednesday, March 15th at 9am PST/12pm EST for the first Azure Blockchain-hosted Ask Me Anything (AMA) on Microsoft Tech Community. We receive great feedback and input about blockchain through customer engagements and other channels, but we haven’t interacted broadly in real-time, until now. You’ll be able to connect directly with the Blockchain team, who will be on hand to answer your questions and listen to feedback.

Add the AMA to your calendar!

When:

Wednesday, March 15, 2017 from 09:00 am to 10:00 am Pacific Time

Where:

The Azure Blockchain Community

What’s an AMA session?

We’ll have folks from the Azure Blockchain engineering team available to answer any questions you have. You can ask us anything about our products, services, or even our team!

Why are we doing an AMA?

We want to connect directly with customers, hear your feedback, and answer your questions, such as:

  • What is Microsoft’s strategy around blockchain?
  • Why have blockchain on Azure?
  • What are the most common blockchain scenarios for industry X or vertical Y?
  • How do I submit blockchain feature requests?

Who will be at the AMA?

We’ll have PMs, developers, and technical thought leaders from the Azure Blockchain engineering team.


We’re looking forward to this AMA and to connecting with you directly.

Azure Data Lake Analytics and Data Lake Store now available in Europe

$
0
0

Azure Data Lake Analytics and Azure Data Lake Store are now available in the North Europe region. For more information about pricing, please visit the Data Lake Analytics Pricing and Data Lake Store Pricing webpages. Data Lake Analytics is a cloud analytics service for developing and running massively parallel data transformation and processing programs in U-SQL, R, Python, and .NET over petabytes of data. Data Lake Store is a no-limit cloud data lake built so enterprises can unlock value from unstructured, semi-structured, and structured data. To learn more about these services, please visit the Data Lake Analytics and Data Lake Store webpages.

Visual Studio 2017 Poster


Since we launched Visual Studio 2017 last week, hundreds of thousands of customers like you have already installed and started using Visual Studio 2017, and we’re excited to see what you create. Visual Studio 2017 contains so many new features, from productivity enhancements to new C++ capabilities. Start using our ASP.NET Core tooling along with Live Unit Testing, and explore C# 7 and our new database DevOps features that help improve your productivity.

We thought it would be fun to put together a poster that shows off some of our favorite parts of Visual Studio 2017. You’ll also see some helpful tips and tricks to improve your productivity!


Go ahead and download this poster as a PDF, hang it up in your team room or wherever you code, and tweet us a picture at @VisualStudio. We also have a web infographic version, and if you just need a cheat sheet of handy keyboard shortcuts, you can download that page for easy printing.

A quick reminder – we’d love your feedback on Visual Studio 2017. If you run into an issue, you can Report-a-Problem directly from within Visual Studio to our Developer Community. We are always listening so you can also submit suggestions or feature ideas to be considered on our UserVoice.

Tim Sneath, Principal Lead Program Manager, Visual Studio

Tim leads a team focused on Visual Studio acquisition and extensibility. His mission is to see developers create stunning applications built on the Microsoft platform, and to persuade his mother that her computer is not an enemy. Amongst other strange obsessions, Tim collects vintage releases of Windows, and has a near-complete set of shrink-wrapped copies that date back to the late 80s.

Storage Spaces Direct throughput with iWARP

$
0
0

Hello, Claus here again. It has been a while since I last posted here and a few things have changed since last time. Windows Server has been moved into the Windows and Devices Group, we have moved to a new building with a better café, but a worse view 😊. On a personal note, I can be seen waddling the hallways as I have had foot surgery.

At Microsoft Ignite 2016 I did a demo at the 28-minute mark as part of the Meet Windows Server 2016 and System Center 2016 session. I showed how Storage Spaces Direct can deliver massive amounts of IOPS to many virtual machines with various storage QoS settings. I encourage you to watch it, if you haven’t already, or go watch it again 😊. In the demo, we used a 16-node cluster connected over iWARP using the 40GbE Chelsio iWARP T580CR adapters, showing 6M+ read IOPS. Since then, Chelsio has released their 100GbE T6 NIC adapter, and we wanted to take a peek at what kind of network throughput would be possible with this new adapter.

We used the following hardware configuration:

  • 4 nodes of Dell R730xd
    • 2x E5-2660v3 2.6Ghz 10c/20t
    • 256GiB DDR4 2133Mhz (16 16GiB DIMM)
    • 2x Chelsio T6 100Gb NIC (PCIe 3.0 x16), single port connected/each, QSFP28 passive copper cabling
    • Performance Power Plan
    • Storage:
      • 4x 3.2TB NVME Samsung PM1725 (PCIe 3.0 x8)
      • 4x SSD + 12x HDD (not in use: all load from Samsung PM1725)
    • Windows Server 2016 + Storage Spaces Direct
      • Cache: Samsung PM1725
      • Capacity: SSD + HDD (not in use: all load from cache)
      • 4x 2TB 3-way mirrored virtual disks, one per cluster node
      • 20 Azure A1-sized VMs (1 VCPU, 1.75GiB RAM) per node
      • OS High Performance Power Plan
    • Load:
      • DISKSPD workload generator
      • VM Fleet workload orchestrator
      • 80 virtual machines with 16GiB file in VHDX
      • 512KiB 100% random read at a queue depth of 3 per VM

We did not configure DCB (PFC) in our deployment, since it is not required in iWARP configurations.

Below is a screenshot from the VMFleet Watch-Cluster window, which reports IOPS, bandwidth and latency.

(Screenshot: VMFleet Watch-Cluster output showing aggregate IOPS, bandwidth and latency.)

As you can see the aggregated bandwidth exceeded 83GB/s, which is very impressive. Each VM realized more than 1GB/s of throughput, and notice the average read latency is <1.5ms.

Let me know what you think.

Until next time

@ClausJor

Leverage the Azure CLI with these examples


Customer response to our CLI has been great since the preview release last September and the GA announcement in February. This has been a great opportunity for us to work with customers and learn what is working well and what is still needed. Some of the feedback we’ve received is that we need to provide more documentation and examples to fully leverage all the new features.

Based on this feedback, we’re delivering samples for Linux and Windows VMs, Web Apps, and Azure SQL Database, all of which can be found on this overview page. For Linux VMs, we’re highlighting best practices for creating, monitoring, and troubleshooting, with similar scripts for Windows VMs. For Azure SQL Database, we get you started with scripts for creating single and pooled databases. For Web Apps, we show you how to create a Web App with integration to your favorite deployment method (git, GitHub, Docker, etc.) as well as configure, scale, connect to resources, and monitor your Web Apps, all from the command line. Here’s an example that creates an Azure Web App that is ready to deploy code from GitHub:

(Screenshot: Azure CLI commands that create a Web App configured for GitHub deployment.)

All of the example scripts can be used in the CLI “as is”, and also as documentation to help you understand how to develop your own scripts.

You can also get started with the CLI using the rest of our updated docs and samples, including installing and updating the CLI, working with Virtual Machines, creating a complete Linux environment including VMs, Scale Sets, Storage, and network. The Azure CLI is open source and on GitHub.

We’re continuing to provide updates based on your ongoing feedback, so please share any suggestions you may have. Reach out with suggestions or questions via StackOverflow using the azure-cli tag, or email us directly at azfeedback@microsoft.com.
