Channel: Windows PowerShell Blog

Separating "What" from "Where" in PowerShell DSC


As you already know, we introduced PowerShell Desired State Configuration to the world at our TechEd NA 2013 session. The session also introduced the notion of structural configuration (what) and environmental configuration (where) (at the 25:50 minute mark). Structural configuration defines what is needed and does not change based on the environment. For instance, a configuration can require IIS to be installed – whether we have one node or many nodes, whether it is a development environment or a production environment, we need IIS installed. Environmental configuration defines the environment in which the configuration is deployed. For instance, the node names and source file locations can change from a development to a test to a production environment.


DSC offers the ability to separate structural configuration from environmental configuration. This separation makes it possible to scale a configuration up or down across machines.




The way to specify environmental configuration is through the ConfigurationData automatic parameter, which is a hash table. Alternatively, the parameter can also take a .psd1 file containing the hash table.


PS C:\> Get-Command MyConfiguration -Syntax

MyConfiguration [[-InstanceName] <string>] [[-OutputPath] <string>] [[-ConfigurationData] <hashtable>]

 

This hash table needs to have at least one key, AllNodes, whose value is structured. ConfigurationData can have any number of additional key/value mappings. For example:


$MyData =
@{
    AllNodes = @();
    NonNodeData = ""
}

 

The key of interest is AllNodes. Its value is an array, and each element of the array is itself a hash table, with NodeName as a required key:


$MyData =
@{
    AllNodes =
    @(
        @{
            NodeName = "Nana-VM-1"
        },

        @{
            NodeName = "Nana-VM-2"
        },

        @{
            NodeName = "Nana-VM-3"
        }
    );
    NonNodeData = ""
}

 

Each hash table entry in AllNodes corresponds to the configuration data for one node in the configuration. Besides the required NodeName key, the hash table can have any number of other keys:


$MyData =
@{
    AllNodes =
    @(
        @{
            NodeName = "Nana-VM-1";
            Role     = "WebServer"
        },

        @{
            NodeName = "Nana-VM-2";
            Role     = "SQLServer"
        },

        @{
            NodeName = "Nana-VM-3";
            Role     = "WebServer"
        }
    );
    NonNodeData = ""
}

 

 

DSC provides three special variables for use within a configuration to access elements of the configuration data.


1.     $AllNodes: A special variable that refers to the AllNodes collection. It supports filtering using the simplified .Where() and .ForEach() syntax, so you can author a configuration like this:

 

configuration MyConfiguration
{
    node $AllNodes.Where{$_.Role -eq "WebServer"}.NodeName
    {
    }
}

 

When the configuration is invoked with the ConfigurationData parameter, the filter returns the set of nodes for use in the node statement. This avoids having to hard code the node names in a configuration (or always parameterizing them). When the above configuration is invoked with -ConfigurationData $MyData as presented above, it is equivalent to writing the following:

 

configuration MyConfiguration
{
    node Nana-VM-1
    {
    }

    node Nana-VM-3
    {
    }
}
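Outside of DSC, the same simplified .Where() method syntax (available in PowerShell 4.0 and later) can be tried directly against the hash table. A minimal, runnable sketch using the $MyData values shown earlier:

```powershell
# Environmental configuration data: an array of per-node hash tables
$MyData = @{
    AllNodes = @(
        @{ NodeName = 'Nana-VM-1'; Role = 'WebServer' },
        @{ NodeName = 'Nana-VM-2'; Role = 'SQLServer' },
        @{ NodeName = 'Nana-VM-3'; Role = 'WebServer' }
    )
}

# The same filter a configuration would use in its node statement
$webNodes = $MyData.AllNodes.Where{ $_.Role -eq 'WebServer' }.NodeName
$webNodes   # Nana-VM-1 and Nana-VM-3
```

This is exactly the expression the node statement evaluates; only the two nodes whose Role is "WebServer" are selected.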

 

2.     $Node: Once a set of nodes is filtered from $AllNodes, $Node can be used to refer to the current entry.

 

configuration MyConfiguration
{
    Import-DscResource -ModuleName xWebAdministration -Name MSFT_xWebsite

    node $AllNodes.Where{$_.Role -eq "WebServer"}.NodeName
    {
        xWebsite Site
        {
            Name         = $Node.SiteName
            PhysicalPath = $Node.SiteContents
            Ensure       = "Present"
        }
    }
}

The above configuration is equivalent to writing the following (when evaluated with a $MyData that also carries SiteName and SiteContents for each web server, as in the examples below):

 

configuration MyConfiguration
{
    Import-DscResource -ModuleName xWebAdministration -Name MSFT_xWebsite

    node Nana-VM-1
    {
        xWebsite Site
        {
            Name         = "Website1"
            PhysicalPath = "C:\Site1"
            Ensure       = "Present"
        }
    }

    node Nana-VM-3
    {
        xWebsite Site
        {
            Name         = "Website3"
            PhysicalPath = "C:\Site3"
            Ensure       = "Present"
        }
    }
}

 

Note: xWebsite is a resource that we published as part of DSC Resource Kit Wave 1. More information can be found here.

 

If you want some properties to apply to all nodes, specify an entry with NodeName set to “*” (Note: “*” is a special notation; general wildcards are not supported).

 

 

$MyData =
@{
    AllNodes =
    @(
        @{
            NodeName = "*";
            LogPath  = "C:\Logs"
        },

        @{
            NodeName     = "Nana-VM-1";
            Role         = "WebServer";
            SiteContents = "C:\Site1";
            SiteName     = "Website1"
        },

        @{
            NodeName = "Nana-VM-2";
            Role     = "SQLServer"
        },

        @{
            NodeName     = "Nana-VM-3";
            Role         = "WebServer";
            SiteContents = "C:\Site2";
            SiteName     = "Website3"
        }
    );
}

 

Now every node has a LogPath property.
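To make the “*” semantics concrete, here is an illustrative sketch only: DSC performs this merge internally when it processes AllNodes, and the function name Expand-CommonNodeData is hypothetical, shown purely so the behavior is easy to see. Node-specific values win; the “*” entry only fills in what is missing.

```powershell
# Hypothetical helper (DSC does this merge itself); mimics how '*' properties
# are applied to every concrete node entry.
function Expand-CommonNodeData {
    param([hashtable]$ConfigurationData)

    # The '*' entry holds properties common to every node
    $common = $ConfigurationData.AllNodes |
        Where-Object { $_.NodeName -eq '*' } | Select-Object -First 1
    $nodes = $ConfigurationData.AllNodes | Where-Object { $_.NodeName -ne '*' }

    foreach ($node in $nodes) {
        foreach ($key in $common.Keys) {
            # Node-specific values take precedence; '*' only fills gaps
            if ($key -ne 'NodeName' -and -not $node.ContainsKey($key)) {
                $node[$key] = $common[$key]
            }
        }
    }
    $nodes
}

$MyData = @{
    AllNodes = @(
        @{ NodeName = '*';         LogPath = 'C:\Logs' },
        @{ NodeName = 'Nana-VM-1'; Role = 'WebServer' },
        @{ NodeName = 'Nana-VM-2'; Role = 'SQLServer' }
    )
}
$expanded = Expand-CommonNodeData -ConfigurationData $MyData
$expanded.LogPath   # C:\Logs for every concrete node
```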

 

3.     $ConfigurationData: This variable can be used from within a configuration to access the configuration data hash table passed as a parameter.

 

$MyData =
@{
    AllNodes =
    @(
        @{
            NodeName = "*";
            LogPath  = "C:\Logs"
        },

        @{
            NodeName     = "Nana-VM-1";
            Role         = "WebServer";
            SiteContents = "C:\Site1";
            SiteName     = "Website1"
        },

        @{
            NodeName = "Nana-VM-2";
            Role     = "SQLServer"
        },

        @{
            NodeName     = "Nana-VM-3";
            Role         = "WebServer";
            SiteContents = "C:\Site2";
            SiteName     = "Website3"
        }
    );

    NonNodeData =
    @{
        ConfigFileContents = (Get-Content C:\Template\Config.xml)
    }
}

 

configuration MyConfiguration
{
    Import-DscResource -ModuleName xWebAdministration -Name MSFT_xWebsite

    node $AllNodes.Where{$_.Role -eq "WebServer"}.NodeName
    {
        xWebsite Site
        {
            Name         = $Node.SiteName
            PhysicalPath = $Node.SiteContents
            Ensure       = "Present"
        }

        File ConfigFile
        {
            DestinationPath = $Node.SiteContents + "\config.xml"
            Contents        = $ConfigurationData.NonNodeData.ConfigFileContents
        }
    }
}

 

Here is a complete example using configuration data (already included in the examples for the xWebAdministration module in DSC Resource Kit Wave 1):


configuration Sample_xWebsite_FromConfigurationData
{
    # Import the module that defines custom resources
    Import-DscResource -Module xWebAdministration

    # Dynamically find the applicable nodes from configuration data
    Node $AllNodes.Where{$_.Role -eq "Web"}.NodeName
    {
        # Install the IIS role
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }

        # Install the ASP .NET 4.5 role
        WindowsFeature AspNet45
        {
            Ensure = "Present"
            Name   = "Web-Asp-Net45"
        }

        # Stop an existing website (set up in Sample_xWebsite_Default)
        xWebsite DefaultSite
        {
            Ensure       = "Present"
            Name         = "Default Web Site"
            State        = "Stopped"
            PhysicalPath = $Node.DefaultWebSitePath
            DependsOn    = "[WindowsFeature]IIS"
        }

        # Copy the website content
        File WebContent
        {
            Ensure          = "Present"
            SourcePath      = $Node.SourcePath
            DestinationPath = $Node.DestinationPath
            Recurse         = $true
            Type            = "Directory"
            DependsOn       = "[WindowsFeature]AspNet45"
        }

        # Create a new website
        xWebsite BakeryWebSite
        {
            Ensure       = "Present"
            Name         = $Node.WebsiteName
            State        = "Started"
            PhysicalPath = $Node.DestinationPath
            DependsOn    = "[File]WebContent"
        }
    }
}

 

# Content of configuration data file (e.g. ConfigurationData.psd1) could be:

# Hashtable to define the environmental data
@{
    # Node specific data
    AllNodes = @(

        # All the web servers share the following identical information
        @{
            NodeName           = "*"
            WebsiteName        = "FourthCoffee"
            SourcePath         = "C:\BakeryWebsite\"
            DestinationPath    = "C:\inetpub\FourthCoffee"
            DefaultWebSitePath = "C:\inetpub\wwwroot"
        },

        @{
            NodeName = "WebServer1.fourthcoffee.com"
            Role     = "Web"
        },

        @{
            NodeName = "WebServer2.fourthcoffee.com"
            Role     = "Web"
        }
    );
}

# Pass the configuration data to the configuration as follows:
Sample_xWebsite_FromConfigurationData -ConfigurationData ConfigurationData.psd1


Separating the environmental configuration from the structural configuration helps you author configurations without having to “hard code” environment-specific information in a configuration declaration. DSC provides a mechanism to do so but does not enforce it. The separation can be very specific to a configuration and its environment, and each configuration author can use it as they see fit.


Happy configuring!!!

 

Narayanan (Nana) Lakshmanan

Development Lead - PowerShell DSC


Want to secure credentials in Windows PowerShell Desired State Configuration?


As you start using Windows PowerShell Desired State Configuration (DSC), you might need to specify credentials for resources. In a previous post we showed you how to define a resource that has a credential property.  In this post, I’ll discuss how to properly encrypt credentials when used in a DSC configuration.

Prerequisites

First, let us discuss the requirements to encrypt a DSC configuration. 

  • You must have an encryption-capable certificate on the target node, in the Local Computer’s Personal store (in PowerShell, the path to the store is Cert:\LocalMachine\My; we used the workstation authentication template, see all templates here).
  • If you are running the configuration from a machine other than the target node, you must export the public key of the certificate and import it on the machine you will be running the configuration from.

o   It is important that you keep the private key secure. Since the public key is all that is needed to encrypt, I recommend you export only the public key to the machine you are writing your configurations on, in order to keep the private key more secure.
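For instance, a sketch of exporting just the public key on the target node (the thumbprint below is the illustrative one used later in this post; Export-Certificate is part of the built-in PKI module on Windows 8.1 / Windows Server 2012 R2):

```powershell
# On the target node: export ONLY the public key of the encryption certificate.
# The thumbprint here is illustrative - substitute the one for your certificate.
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Thumbprint -eq 'AC23EA3A9E291A75757A556D0B71CBBF8C4F6FD8' }

# Export-Certificate writes the public portion only; the private key stays put
Export-Certificate -Cert $cert -FilePath 'C:\publicKeys\targetNode.cer'
```

Copy the resulting .cer file to the authoring machine; the private key never leaves the target node.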

Assumptions

For this article I’m going to assume:

  • You are using something like Active Directory Certificate Authority to issue and distribute the encryption certificates.
  • Administrator access to the target node must be properly secured, because anyone with administrator access to the target node can, with enough effort, decrypt the credentials.

Overview

In order to encrypt credentials in a DSC configuration, you must follow a process. You must have a certificate on each target node that supports encryption. You must then have the public key and thumbprint of that certificate on the machine where you are authoring the configuration. The public key must be provided using the configuration data, and I’ll show you how to provide the thumbprint using configuration data as well. You must write a configuration script that configures the machine using the credentials, and that sets up decryption by configuring the target node’s Local Configuration Manager (LCM) to decrypt the configuration using the encryption certificate, identified by its thumbprint. Finally, you must run the configuration: setting the LCM settings and starting the DSC configuration.


Configuration Data

When dealing with encryption of a DSC configuration, you must understand DSC configuration data. This structure describes, to a configuration, the list of nodes to be operated on, whether credentials in the configuration should be encrypted for each node, how the credentials will be encrypted, and any other information you want to include. Below is an example of configuration data for a machine named “targetNode”, which I’d like to encrypt using a public key I’ve exported and saved to “C:\publicKeys\targetNode.cer”.

$ConfigData = @{
    AllNodes = @(
        @{
            # The name of the node we are describing
            NodeName = "targetNode"

            # The path to the .cer file containing the
            # public key of the Encryption Certificate
            # used to encrypt credentials for this node
            CertificateFile = "C:\publicKeys\targetNode.cer"

            # The thumbprint of the Encryption Certificate
            # used to decrypt the credentials on target node
            Thumbprint = "AC23EA3A9E291A75757A556D0B71CBBF8C4F6FD8"
        };
    );
}
 

Configuration Script

After we have the configuration data, we can start building our configuration. Since credentials are important to keep secure, you should always take the credential as a parameter to your configuration, so that the credentials are stored for the shortest time possible. Below is an example of copying a file from a share that is secured to a user.


configuration CredentialEncryptionExample
{
    param(
        [Parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [PsCredential] $credential
        )

    Node $AllNodes.NodeName
    {
        File exampleFile
        {
            SourcePath      = "\\Server\share\path\file.ext"
            DestinationPath = "C:\destinationPath"
            Credential      = $credential
        }
    }
}

When you run CredentialEncryptionExample, DSC will prompt you for the credential and encrypt the MOF using the CertificateFile associated with the node in the configuration data.

 

Setting up Decryption

There is still one issue: when you run Start-DscConfiguration, the Local Configuration Manager (LCM) of the target node does not know which certificate to use to decrypt the credentials. We need to add a LocalConfigurationManager resource to tell it, setting CertificateId to the thumbprint of the certificate. The first question is how to get the thumbprint. Below is an example of how to find a local certificate that would work for encryption (you may need to customize this to find the exact certificate you want to use):

 

# Get the certificate that works for encryption
function Get-LocalEncryptionCertificateThumbprint
{
    (dir Cert:\LocalMachine\My) | % {
        # Verify the certificate is for Encryption and valid
        if ($_.PrivateKey.KeyExchangeAlgorithm -and $_.Verify())
        {
            return $_.Thumbprint
        }
    }
}

 

After we have the thumbprint, we use it to build the configuration data (as shown in the configuration data example above). Below is the updated configuration with the LocalConfigurationManager resource, which gets its value from the node in the configuration data.

 

configuration CredentialEncryptionExample
{
    param(
        [Parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [PsCredential] $credential
        )

    Node $AllNodes.NodeName
    {
        File exampleFile
        {
            SourcePath      = "\\Server\share\path\file.ext"
            DestinationPath = "C:\destinationPath"
            Credential      = $credential
        }

        LocalConfigurationManager
        {
            CertificateId = $node.Thumbprint
        }
    }
}

Running the Configuration

From this point, we need to run the configuration. It will output one *.meta.mof that configures the LCM to decrypt the credentials using the certificate installed in the local machine store and identified by its thumbprint, and one .mof that applies the configuration. First use Set-DscLocalConfigurationManager to apply the *.meta.mof, and then Start-DscConfiguration to apply the configuration. Here is an example of how you would run this:


Write-Host "Generate DSC Configuration..."
CredentialEncryptionExample -ConfigurationData $ConfigData -OutputPath .\CredentialEncryptionExample

Write-Host "Setting up LCM to decrypt credentials..."
Set-DscLocalConfigurationManager .\CredentialEncryptionExample -Verbose

Write-Host "Starting Configuration..."
Start-DscConfiguration .\CredentialEncryptionExample -Wait -Verbose

 

This example pushes the configuration to the target node. If you reference our blog post on how to set up a pull configuration, you can modify the settings in the LocalConfigurationManager resource and use these steps to deploy this using a pull server.

Summary

You should be able to build a sample that uses credentials securely in DSC using the information in this post. I have written a more complete sample and have attached the code here:

The sample expands on what we discussed here and includes a helper cmdlet to export and copy the public keys to the local machine, and an example of how to use it.

 

Travis Plunk

Windows PowerShell DSC Test

How to enable Updatable Help for your PowerShell Module


PowerShell 3.0 lets the user update Help content on a per-module basis. In this article, I will explain how you can enable this for your own PowerShell module.

Prerequisites: Have a new (script/binary) module, help content for the cmdlets of the module, and a server where the help content is located. For this particular exercise I will be using a script module.

Here is how my module is organized:

C:\Users\frangom\Documents\WindowsPowerShell\modules\TestModule\TestModule.psm1

C:\Users\frangom\Documents\WindowsPowerShell\modules\TestModule\TestModule.psd1

TestModule.psm1 is my script module. There, I defined the name of the help file.

TestModule.psd1 is the module manifest for my module. The HelpInfoURI field is the address where the help content for this module is located.

On the server side, we have:

  • The .cab file which contains the dll-Help.xml or the psm1-Help.xml file
  • The HelpInfo.xml file.

The .cab file must be named as follows:

ModuleName_ModuleGUID_UI-Culture_HelpContent.cab

where:

ModuleName: The name of the module (same as the module manifest file).

ModuleGUID: The module GUID as referenced in the module manifest.

UI-Culture: The four-letter hyphenated UI culture abbreviation (en-US, fr-FR, de-DE, etc.).

HelpContent: Indicates that this cab contains the help content file.

For example:

TestModule_d03c1cf3-f738-48a3-b845-5ead46a52671_en-US_HelpContent.cab

Similarly, the HelpInfo file must adhere to the following naming convention:

ModuleName_ModuleGUID_HelpInfo.xml

where:

ModuleName: Name of the module (same as the module manifest file).

ModuleGUID: Module GUID as referenced in the module manifest.

HelpInfo: Indicates that this is the help info file.

For example:

TestModule_d03c1cf3-f738-48a3-b845-5ead46a52671_HelpInfo.xml
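Both naming conventions can be sketched as simple format strings, using the values from this example:

```powershell
# Values taken from the example module above
$moduleName = 'TestModule'
$moduleGuid = 'd03c1cf3-f738-48a3-b845-5ead46a52671'
$uiCulture  = 'en-US'

# ModuleName_ModuleGUID_UI-Culture_HelpContent.cab
$cabName  = '{0}_{1}_{2}_HelpContent.cab' -f $moduleName, $moduleGuid, $uiCulture

# ModuleName_ModuleGUID_HelpInfo.xml
$infoName = '{0}_{1}_HelpInfo.xml' -f $moduleName, $moduleGuid

$cabName    # TestModule_d03c1cf3-f738-48a3-b845-5ead46a52671_en-US_HelpContent.cab
$infoName   # TestModule_d03c1cf3-f738-48a3-b845-5ead46a52671_HelpInfo.xml
```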

File content of the HelpInfo.xml file:

<?xml version="1.0" encoding="utf-8"?>
<HelpInfo xmlns="http://schemas.microsoft.com/powershell/help/2010/05">
  <HelpContentURI>http://www.mysite.com/PSHelpContent/</HelpContentURI>
  <SupportedUICultures>
    <UICulture>
      <UICultureName>en-US</UICultureName>
      <UICultureVersion>3.2.15.0</UICultureVersion>
    </UICulture>
  </SupportedUICultures>
</HelpInfo>

Note: The HelpContentURI should point to a container.

Place these two files in the server folder PSHelpContent, e.g., \\mysite\c$\Inetpub\wwwroot\PSHelpContent\

How to Test Updatable Help for Your Module

# First, make sure updatable help works using the -SourcePath
Update-Help -Module TestModule -SourcePath \\mysite\c$\Inetpub\wwwroot\PSHelpContent\ -Force

# After that, you can run it without the -SourcePath, which will connect to the site defined in
# the module manifest: HelpInfoURI = 'http://www.mysite.com/PSHelpContent/'
Update-Help -Module TestModule -Force


For more information on Supporting Updatable Help, please visit http://go.microsoft.com/fwlink/?LinkId=391422.


Cheers,

Francisco Gamino

PowerShell Test Team

Need more DSC Resources? Announcing DSC Resource Kit Wave 2


Good news everyone! Starting today, you can use Windows PowerShell Desired State Configuration (DSC) to configure Active Directory and SQL Server (including High Availability Groups). We are pleased to release the next wave of the DSC Resource Kit – one that enables you to start using DSC to solve your real world problems and scenarios.

When we shipped DSC in Windows Server 2012 R2, we shipped a platform with great infrastructure for configuration.  The next step for any platform like DSC is the creation of resources to make it immediately usable in significant real world scenarios.  With the resources we shipped in box, configuring SQL Server was out of reach for most of our customers.  That's changing today.  Now is the time to create a vibrant ecosystem and expansive community.

Over the past few months, we've worked hard to kick start this community.  Last month, we released the first wave of the DSC Resource Kit.  That release contained six experimental DSC resources, enabling you to configure IIS websites and Hyper-V.  Those were a limited subset of resources – a first pass at the process.  Now, we’re ready to turn it up a notch. 

This wave of the DSC Resource Kit includes fourteen new resources.  These resources are all focused on enabling you to configure Active Directory and SQL Server (including High Availability Groups).  These are real world scenarios where DSC can make an impact.  Soon, we will be blogging an involved example that uses these resources to set up a SQL High Availability Group using DSC.  In addition, we’ve updated many of the resources from the initial release, adding features and fixing bugs.

Click here to see the latest DSC Resource Kit modules.

We hope these resources will be a starting point for the DSC community – something to facilitate DSC resource creation.  Feel free to take and modify these resources to meet your needs (while following the Renaming Guidelines). We were thrilled to see that several community members created modified versions of last wave’s resources in PowerShell.Org’s GitHub repository.  Also, don’t forget that you can create your own resources – for help, check out this blog post and the DSC Resource Designer.

On a more serious note, we must reiterate that these resources come without any guarantees.  The “x” prefix stands for experimental – which means these resources are provided AS IS and are not supported through any Microsoft support program or service. We will monitor the TechNet pages, take feedback, and may provide fixes on a “fix forward” basis. 

Finally – before diving into the details – we want to invite everyone to give feedback on the DSC Resource Kit.  Are you hungry for even more resources?  Do you need DSC to enable any important scenarios?  Let us know through the comments or TechNet Q&A -- there's definitely more to come.

Description of Resources

After installing the modules, you can discover all of the resources available through the Get-DSCResource cmdlet:
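For example (a sketch; the exact output formatting depends on your system and the modules installed):

```powershell
# Lists every DSC resource on the machine, including those from the Resource Kit modules
Get-DscResource
```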

 

 

Here is a brief description of each resource (for more details on a resource, check out the TechNet pages).

 

Resource             Module Name         Description
-------------------  ------------------  ------------------------------------------------------------
xADDomain            xActiveDirectory    Create and manage an Active Directory Domain.
xADDomainController  xActiveDirectory    Create and manage an AD Domain Controller.
xADUser              xActiveDirectory    Create and manage an AD User.
xWaitForADDomain     xActiveDirectory    Pause configuration until the AD Domain is available.
                                         Used for cross-machine synchronization.
xSqlServerInstall    xSqlps              Create and manage a SQL Server installation.
xSqlHAService        xSqlps              Create and manage a SQL High Availability Service.
xSqlHAEndpoint       xSqlps              Create and manage the endpoint used to access a SQL High
                                         Availability Group.
xSqlHAGroup          xSqlps              Create and manage a SQL High Availability Group.
xWaitForSqlHAGroup   xSqlps              Pause configuration until a SQL HA Group is available.
                                         Used for cross-machine synchronization.
xCluster             xFailOverCluster    Create and manage a cluster.
xWaitForCluster      xFailOverCluster    Pause configuration until a cluster is available.
                                         Used for cross-machine synchronization.
xSmbShare            xSmbShare           Create and manage an SMB Share.
xFirewall            xNetworking         Create and manage firewall rules.
xVhdFile             xHyper-V            Manage files to be copied into a VHD.
xWebsite             xWebAdministration  Added functionality to support configuration of HTTPS
                                         websites.
xVhd                 xHyper-V            Bug fixes.

 

Renaming Guidelines

When making changes to these resources, we urge the following practice:

1.     Update the following names by replacing MSFT with your company/community name and replacing the “x” with “c” (short for “Community”) or another prefix of your choice:

a.     Module name (ex: xWebAdministration becomes cWebAdministration)

b.     Folder name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

c.     Resource name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

d.     Resource friendly name (ex: xWebsite becomes cWebsite)

e.     MOF class name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

f.     Filename for the <resource>.schema.mof (ex: MSFT_xWebsite.schema.mof becomes Contoso_cWebsite.schema.mof)

2.     Update module and metadata information in the module manifest

3.     Update any configurations that use these resources

 

We reserve resource and module names without prefixes ("x" or "c") for future use (e.g. "MSFT_WebAdministration" or "Website").  If the next version of Windows Server ships with a "Website" resource, we don't want to break any configurations that use any community modifications.  Please keep a prefix such as "c" on all community modifications.

As specified in the license, you may copy or modify these resources as long as they are used on the Windows platform.

Requirements

The DSC Resource Kit requires Windows 8.1 or Windows Server 2012 R2 with update KB2883200 (aka the GA Update Rollup). You can check whether it is installed by running the following command:

 

PS C:\WINDOWS\system32> Get-HotFix -Id KB2883200

 

Source        Description      HotFixID      InstalledBy          InstalledOn             

------        -----------      --------      -----------          -----------             

NANA-TOUCH    Update           KB2883200     NANA-TOUCH\Admini... 9/30/2013 12:00:00 AM   

 

On supported down-level operating systems, the resources require WMF 4.0. Refer to these previous blog posts for more information on WMF 4.0 and issues with partial installation.

Using Resources

For simple examples of configurations that use these resources, check out the respective TechNet pages. Soon, we will be blogging an involved example that details the configuration of a SQL High Availability Group using DSC. If you need help deploying the resources, see this blog post.

 

Thanks,

 

John Slack

Program Manager

PowerShell Team

DSC Diagnostics Module – Analyze DSC Logs instantly now!

 
Have you ever witnessed a DSC configuration run where you had no idea what it might have done behind the scenes? Well, your worries end here! During any DSC operation, the DSC engine writes to the Windows event logs, which are like bread crumbs that the engine leaves along the way during execution. The blog post here about DSC troubleshooting shows how to use the Get-WinEvent cmdlet to debug a DSC failure using event logs. However, something that really simplifies life is the new module published in Wave 2 of the DSC Resource Kit, called xDscDiagnostics.

Introduction

xDscDiagnostics is a PowerShell module that consists of two simple operations that can help analyze DSC failures on your machine – Get-xDscOperation and Trace-xDscOperation. These functions help in identifying all the events from past DSC operations run on your system, or on any other computer (Note: you need a valid credential to access remote computers). Here, we use the term DSC operation to define a single unique DSC execution from its start to its end. For instance, Test-DscConfiguration would be a separate DSC operation. Similarly, every other cmdlet in DSC (such as Get-DscConfiguration, Start-DscConfiguration, etc.) could each be identified as a separate DSC operation.

The two cmdlets are explained in more detail below. Help for the cmdlets is available when you run Get-Help <cmdlet name>.

Get-xDscOperation

This cmdlet lets you find the results of the DSC operations that run on one or multiple computers, and returns an object that contains the collection of events produced by each DSC operation.

For instance, in the following output, we ran three commands, the first of which passed, and the others failed. These results are summarized in the output of Get-xDscOperation.

Figure 1: Get-xDscOperation shows a simple output for a list of operations executed on a machine

 

Parameters

  • Newest – Accepts an integer value to indicate the number of operations to be displayed. By default, it returns the 10 newest operations. For instance:

Figure 2: Get-xDscOperation can display the last 5 operations’ event logs

 

  • ComputerName – Parameter that accepts an array of strings, each containing the name of a computer from which you’d like to collect DSC event log data. By default, it collects data from the host machine. To enable this feature, you must run the following command on each remote machine, in elevated mode, so that the firewall will allow the collection of events:

    New-NetFirewallRule -Name "Service RemoteAdmin" -Action Allow      
  • Credential – Parameter of type PSCredential that provides access to the computers specified in the ComputerName parameter.

Returned object

The cmdlet returns an array of objects, each of type Microsoft.PowerShell.xDscDiagnostics.GroupedEvents. Each object in this array pertains to a different DSC operation. The default display for this object has the following properties:

  1. SequenceID: The incremental number assigned to the DSC operation based on time. For instance, the last executed operation would have a SequenceID of 1, the second-to-last DSC operation would have a SequenceID of 2, and so on. This number is another identifier for each object in the returned array.
  2. TimeCreated: A DateTime value that indicates when the DSC operation began.
  3. ComputerName: The computer name from which the results are being aggregated.
  4. Result: A string with the value “Failure” or “Success” that indicates whether that DSC operation had an error.
  5. AllEvents: An object that represents the collection of events emitted from that DSC operation.

 

For instance, if you’d like to aggregate the results of the last operation from multiple computers, the output looks like this:

 

image003

Figure 3 : Get-xDscOperation can display logs from many other computers at once.

 

Trace-xDscOperation

 

This cmdlet returns an object containing a collection of events, their event types, and the output messages generated by a particular DSC operation. Typically, when Get-xDscOperation shows a failed operation, you would trace that operation to find out which of the events caused the failure.

Parameters

  • SequenceID: The integer value assigned to an operation on a specific computer. Specifying a sequence ID of, say, 4 outputs the trace for the DSC operation that was fourth from the last.

image004

Figure 4: Trace-xDscOperation with sequence ID specified

  • JobID: The GUID value assigned by the Local Configuration Manager (LCM) to uniquely identify an operation. When a JobID is specified, the trace of the corresponding DSC operation is output.

image005

Figure 5: Trace-xDscOperation taking JobID as a parameter – outputting the same record as above, since JobID and SequenceID are simply two identifiers for the same operation

  • ComputerName and Credential: These parameters allow the trace to be collected from remote computers. As with Get-xDscOperation, it is necessary to run the following command on each remote machine:

    New-NetFirewallRule -Name "Service RemoteAdmin" -Action Allow

image006

Figure 6: Trace-xDscOperation running on a different computer with the -ComputerName option

Note: Since Trace-xDscOperation aggregates events from the Analytic, Debug, and Operational logs, it will prompt you to enable these logs. If the logs are not enabled, an error message is displayed stating that these events cannot be read until the logs are enabled; the trace from the other logs is still displayed. This error can be ignored.

 

Returned object

The cmdlet returns an array of objects, each of type Microsoft.PowerShell.xDscDiagnostics.TraceOutput. Each object in this array contains the following fields:

  1. ComputerName: The name of the computer from which the logs are being collected.
  2. EventType: An enumerator field that indicates the type of event. It can be any of the following:
     a. Operational: The event is from the Operational log.
     b. Analytic: The event is from the Analytic log.
     c. Debug: The event is from the Debug log.
     d. Verbose: Events output as verbose messages during execution. The verbose messages make it easy to identify the sequence of events that are published.
     e. Error: Error events. By looking at the error events, you can usually find the reason for a failure immediately.
  3. TimeCreated: A DateTime value indicating when the event was logged by DSC.
  4. Message: The message that was logged by DSC into the event logs.

 

There are some fields in this object that are not displayed by default, which can be used for more information about the event. These are:

  1. JobID: The job ID (GUID format) specific to that DSC operation.
  2. SequenceID: The SequenceID unique to that DSC operation on that computer.
  3. Event: The actual event logged by DSC, of type System.Diagnostics.Eventing.Reader.EventLogRecord. This can also be obtained by running the Get-WinEvent cmdlet, as in the blog here. It contains more information, such as the task, event ID, and level of the event.

Hence, we can also obtain information on the events themselves if we save the output of Trace-xDscOperation into a variable. To display all the events for a particular DSC operation, the following command suffices:

(Trace-xDscOperation -SequenceID 3).Event

 

That displays the same result as the Get-WinEvent cmdlet, as in the output below.

image007

Figure 7 : Output that is identical to a get-winevent output. These details can be extracted using the xDscDiagnostics module as well
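Because Trace-xDscOperation emits objects, its output can be filtered with standard pipeline techniques. As a sketch (assuming a failed operation exists at SequenceID 2), the error events alone can be isolated like this:

```powershell
# Show only the error events from the second-to-last DSC operation
Trace-xDscOperation -SequenceID 2 |
    Where-Object { $_.EventType -eq 'Error' } |
    Select-Object TimeCreated, Message
```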

 

Ideally, you would first use Get-xDscOperation to list the last few DSC configuration runs on your machines. Then you can dissect any single operation (using its SequenceID or JobID) with Trace-xDscOperation to find out what it did behind the scenes.
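Putting the two cmdlets together, a typical troubleshooting session might look like the following sketch (it assumes at least one failed operation exists in the last 10):

```powershell
# 1. List recent DSC operations and pick out the most recent failure
$ops    = Get-xDscOperation -Newest 10
$failed = $ops | Where-Object { $_.Result -eq 'Failure' } | Select-Object -First 1

# 2. Trace that operation using its SequenceID to see what happened
Trace-xDscOperation -SequenceID $failed.SequenceID
```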

In summary, xDscDiagnostics is a simple tool for extracting the relevant information from DSC logs, so that you can easily diagnose operations across multiple machines. We encourage you to use it to simplify your experience with DSC.

 
 
 

Inchara Shivalingaiah
Software Developer
Windows PowerShell Team

Configuring a SQL High Availability Group with DSC


Let's use DSC to configure something complicated!  In past blogs, we’ve shown you how to use Windows PowerShell Desired State Configuration (DSC) to configure relatively simple systems.  However, the technologies you deal with on a day to day basis can sometimes become complicated.  Don’t worry, DSC can still help simplify configuration.  Let’s use a SQL High Availability Group (HAG) as an example. SQL HAG is a new SQL Server feature that enables replication on top of Windows Server Failover Clustering. While the feature is cool, configuring the environment is quite complex: it involves many steps across multiple machines, and some steps on one machine might depend on the progress or status of others.

 

In this blog post, we will demonstrate using DSC to configure a SQL HAG. Using the provided example, one PowerShell command will deploy a SQL HAG on virtual machines (VMs).

Environment

Using the DSC configuration scripts described in this blog you can fully deploy and configure the following environment: 

Configuration Overview

To deploy the environment described above in a virtual environment using DSC, a configuration is generated for each guest server described above and for the VM host machine. All of these configurations are coordinated by a single PowerShell script (Deploy-Demo.ps1). A description of what each configuration script does is below. A zip (Dsc-SqlDemo.zip) containing all of the configuration files is attached to this blog (see the bottom of the blog); download it before you read on so that you can follow along while looking at the associated scripts.

 

Configuring the Host and VMs

 

First, Deploy-Demo.ps1 runs Dsc-SqlDemo\ConfigSqlDemo.ps1.  This configures the host machine by doing the following:

 

1.       Ensure that a VM Switch for an internal network is present (in the demo, subnet of 192.168.100.*)

2.       Ensure that a local user called vmuser is present, so that VMs can access data in host

3.       Ensure that a net share (c:\SqlDemo\Sql12Sp1) is present. 

4.       Ensure that three VMs are created in the correct state by:

o   Ensuring that a DSC Configuration, DSC Resources, and other files are copied to the VHD image.

o   Ensuring that the VMs are started from the VHDs.

 

Once the host machine is configured, we have three VMs running.  Each of these VMs has a configuration that has been bootstrapped into it.  Because of the way we bootstrap the VMs, they will configure themselves after startup, using the .mof we have injected into them.

 

Stay tuned for a blog post about the bootstrapping procedure. 

 

Configuring the Primary Domain Controller - pdc

 

The .mof file injected into the Primary Domain Controller (pdc) VM was generated from the configuration in Dsc-SqlDemo\Scenarios\nodes.ps1, from the node statement Node $AllNodes.Where{$_.Role -eq "PrimaryDomainController" }.NodeName. It does the following:

 

1.       Ensure the VM has a static IPAddress

2.       Ensure necessary WindowsFeatures are present

3.       Ensure that a Domain Forest is created

4.       Set up a network share folder that will be used in the SQL replication process

 

Setting up the first SQL Server- Sql01

 

The .mof file injected into the first SQL Server (Sql01) VM was generated from the configuration in Dsc-SqlDemo\Scenarios\nodes.ps1, from the node statement Node $AllNodes.Where{$_.Role -eq "PrimarySqlClusterNode" }.NodeName. It does the following:

 

 

1.       Ensures that the machine’s IP address is correctly set

2.       Ensures that necessary WindowsFeatures are present

3.       WaitFor Primary Domain Controller to have created the AD Domain

4.       Ensure that the machine is joined to the Domain

5.       Ensure that .Net 3.5 and SQL Server 2012 SP1 are installed

6.       Ensures that Firewalls are configured such that Sqlbrowser.exe and SqlServr.exe are accessible in the private network.

7.       Ensure that a Windows Cluster is created and that Sql01 is added to the cluster

8.       Ensure that the SQL Server for High Availability (HA) service is enabled

9.       Ensure that there is an Endpoint for the HA

10.   Ensure that the SQL HA group for databases is created (in the demo, TestDB)

 

 

Setting up the second SQL Server - Sql02

 

The .mof file injected into the second SQL Server (Sql02) VM was generated from the configuration in Dsc-SqlDemo\Scenarios\nodes.ps1, from the node statement Node $AllNodes.Where{$_.Role -eq "ReplicaSqlClusterNode" }.NodeName. It does the following:

 

1.       Ensures that the machine’s IP address is correctly set

2.       Ensures that necessary WindowsFeatures are present

3.       WaitFor Primary Domain Controller to have created the AD Domain

4.       Ensure that the machine is joined to the Domain

5.       Ensure that .Net 3.5 and SQL Server 2012 SP1 are installed

6.       Ensures that Firewalls are configured such that Sqlbrowser.exe and SqlServr.exe are accessible in the private network.

7.       WaitFor the first SQL node to have created the windows cluster

8.       Ensure that Sql02 is added to the cluster

9.       Ensure that the SQL Server for High Availability (HA) service is enabled

10.   Ensure that there is an Endpoint for the HA

11.   WaitFor the first SQL node to have created the HA group

12.   Ensure that sql02 is joined to the HA group.

 

Deploy the environment

Now that you have an understanding of the environment and what the DSC scripts do, let’s go ahead and deploy the environment using the scripts. Note there is quite a bit of preparation to complete before the scripts can be executed so please be patient.

Requirements

Hardware

 

To simulate a SQL HAG, we need a decent machine capable of running Windows Server 2012 R2 and Hyper-V (64-bit), with at least 16 GB of RAM and around 100 GB of free disk space. Because this is a demo, we also recommend that you not store important items on the machine, in case it gets cleaned up.

 

Software

 

The following software is needed to perform the steps in the demo.

 

1.       An evaluation version of Windows Server 2012 R2 Datacenter (both ISO and VHD). A download can be found here.  Note: We need both the VHD and the ISO because SQL Server requires .NET 3.5, which is not available in the VHD. Fortunately, the expanded ISO image contains a folder named Sources\sxs that includes all the .NET 3.5 files.

2.       An evaluation version of SQL Server 2012 SP1 (ISO).  A download can be found here.

3.       The following DSC resources:

a.       User (Ships in Windows Server 2012)

b.      Windows Feature (Ships in Windows Server 2012)

c.       xComputerManagement (Download here)

d.      xNetworking (Download here)

e.      xHyper-V (Download here)

f.        xActiveDirectory (Download here)

g.       xFailOverCluster (Download here)

h.      xSqlps (Download here)

i.         xSmbShare (Download here)

 

 

Certificate

 

Setting up domain controllers or SQL servers requires a few credentials.  To keep these credentials secure, DSC encrypts them rather than placing them in plain text in the .mof files.  For details on this process, check out this blog. To secure credentials, DSC uses a certificate’s public key to encrypt the credentials, and the private key on the target machine to decrypt them. To ensure that this demo works correctly, we need to ensure that the host and the target machines have the appropriate certificates.

 

To do this, we first create a self-signed certificate on the host machine, then copy it with its private key to the target machines and install the certificate into each target’s local machine certificate store. Since the private key should be kept secret, it is important to clean it up as soon as possible (instructions can be found below).   Again, please ensure you do NOT run the demo in production or on machines that require security by default.

 

1.       Steps to set up the certificate on the host machine:

·         Get MakeCert.exe if you don’t have it. (It ships with the Windows SDK; a download can be found here.)

·         Create a certificate with CN=DSCDemo. To do this, open a PowerShell console with Administrator elevation, cd to a location from which MakeCert.exe can be run, and run the following command (notice that, for security reasons, I make the certificate expire as soon as possible; please adjust the highlighted date as needed).

 

makecert -r -pe -n "CN=DSCDemo" -sky exchange -ss my -sr localMachine -e 02/15/2014

 

The command line above creates a self-signed certificate in the local machine certificate store (cert:\LocalMachine\My, with Subject = “CN=DSCDemo”).  Remember the subject; we will need it very soon. In my example, the certificate store UI looks like the following (Certificates (Local Computer)\Personal\Certificates):

 

 

·         Create a folder to hold the keys for the demo. In my example, I created C:\keys

·         Public key. Export the public key of the certificate.  You can do this manually, or with the following PowerShell script. In my example, I saved the public key as C:\keys\Dscdemo.cer:

 

$certSubject = "CN=DSCDemo"

$keysFolder = Join-Path $env:SystemDrive -ChildPath "Keys"

$cert = dir Cert:\LocalMachine\My | ? { $_.Subject -eq $certSubject }

if (! (Test-Path $keysFolder ))

{

    md $keysFolder | Out-Null

}

$certPath = Export-Certificate -Cert $cert -FilePath (Join-Path $keysFolder -ChildPath "Dscdemo.cer")

 

·         Private key and protection password. For security reasons, export the private key certificate as follows:

o   In Personal\Certificates, find the certificate Issued to “DSCDemo” as shown above. Right click and select the “Export…” option.

o   Take the option of “export private key”

o   The UI will ask you for a password for protection. Enter and remember your password; you will need it very soon. For this demo, we used P@ssword

o   Export the certificate to the appropriate folder.  In my example, it is C:\keys\Dscdemo.pfx
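If you prefer to script the export instead of clicking through the UI, the PKI module's Export-PfxCertificate cmdlet (available on Windows Server 2012 R2) can do the same thing. This is a sketch using the demo's subject, path, and password:

```powershell
# Find the demo certificate in the local machine store
$cert = dir Cert:\LocalMachine\My | ? { $_.Subject -eq "CN=DSCDemo" }

# Protect the exported private key with the demo password
$pfxPassword = ConvertTo-SecureString -String "P@ssword" -Force -AsPlainText

# Export the certificate with its private key to a .pfx file
Export-PfxCertificate -Cert $cert -FilePath C:\keys\Dscdemo.pfx -Password $pfxPassword
```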

·         Certificate thumbprint. Run the following PowerShell script to get the certificate’s thumbprint; we will need it very soon.

 

dir Cert:\LocalMachine\My | ? { $_.Subject -eq "CN=DSCDemo" }

 

In my example, it is E513EEFCB763E6954C52BA66A1A81231BF3F551E

 

2.       Update the deployment scripts:

With the above steps complete, we need to update the deployment scripts to point to the correct certificate values.

 

·         Public key location: in my example, it is C:\keys\Dscdemo.cer

·         Thumbprint: in my example, it is E513EEFCB763E6954C52BA66A1A81231BF3F551E

·         Private key location: in my example, it is C:\keys\Dscdemo.pfx

·         Private key protection password: in my example, it is P@ssword

 

Update the following places in the deployment scripts:

 

2.1 ConfigSqlDemoData.psd1

 

At line 56, modify the file to point to your private key location.

 

           SourcePath ="C:\Keys\Dscdemo.pfx";

 

 

 

At line 145-146, modify the file to point to your certificate file and Thumbprint:

 

         @{

            NodeName="*"

 

            CertificateFile ="C:\keys\Dscdemo.cer"

            Thumbprint ="E513EEFCB763E6954C52BA66A1A81231BF3F551E"

 

 

2.2 deployment\installcert.ps1

 

          -Password $(ConvertTo-SecureString -String "P@ssword"

 

This corresponds to the private key protection password. Change it to the value you just entered.

               

3.       Install the certificate on the VMs.  Now that steps 1 and 2 are done, the deployment script will do the following automatically:

 

1.       Encrypt credentials for the environment that is going to be set up.

2.       Copy the private key and the script (installcert.ps1) that holds the private key protection password to each VM’s VHD file (into the VHD’s c:\deployment folder). Once the VM is started, it will install the certificate with the private key.

 

4.       Clean up the certificate.  After you are done with the demo, please remove the certificate and keys as soon as possible with the following steps:

 

1.       Delete the certificate files. In my case, I deleted all files under C:\keys.

2.       Remove the self-signed certificate we created. In my case, I used the UI to go to Certificates (Local Computer)\Personal\Certificates and deleted the certificate issued to DSCDemo.

3.       Remove the password in the deployment\installcert.ps1 file.

4.       Delete the xml files under deployment (pdc.xml, sql01.xml, sql02.xml), because they contain passwords for the VM bootstrap.

5.       In each VM, delete the files under C:\deployment.

6.       Securely empty the recycle bin of the host machine.

 

 

Prepare the host

 

Before we can run the demo, we need to make sure that we have all of the necessary files in the appropriate places. 

 

Copying Files

 

1.        Confirm that the host machine is running Windows Server 2012 R2. If it is not, you can burn the ISO downloaded above to DVD and install Windows Server 2012 R2 from there. The host is also required to have Hyper-V; please see the Hyper-V Start Guide in the reference section for more details on Hyper-V.  It is recommended to update the OS with the latest patches by running Windows Update.

2.       Create a folder named SqlDemo. In my case, I created the folder here: C:\SqlDemo

3.       Copy the Windows Server 2012 R2 VHD file to C:\SqlDemo. For me, this looks like: “c:\SqlDemo\9600.16415.amd64fre.winblue_refresh.130928-2229_server_serverdatacentereval_en-us.vhd”

4.       Copy the Windows Server ISO to C:\SqlDemo. To make things simple, you can rename the file to a short name. In my case, this looks like: C:\SqlDemo\WS12R2.ISO

5.       Similarly, copy the SQL ISO to C:\SqlDemo. Again, rename the file to a short name like this: C:\SqlDemo\Sql12SP1.iso

6.       Unzip Dsc-SqlDemo.zip. In my case, it is unzipped to C:\Dsc-SqlDemo; the entire folder looks like the following:

 

 

7.       Download the xActiveDirectory, xComputerManagement, xFailOverCluster, xHyper-V, xNetworking, xSmbShare, and xSqlPs modules if you have not already. Copy them to the root of the unzipped folder. It looks like the following in the end:

 

 

Extracting Content

 

Now that we’ve copied the ISOs into the necessary locations, we need to extract some of their content.  Specifically, we need to get the sxs files (which include .NET 3.5) and the SQL content.  While there are many ways to do this, the simplest way in this situation is to run the GetFilesFromImage.ps1 script in the DSC-SqlDemo folder.

 

1.       Open a Windows PowerShell console (with Administrator privileges), and cd to the Dsc-SqlDemo folder.

2.       Run the following script to get sxs files including .Net 3.5

 

.\GetFilesFromImage.ps1 -ImagePath c:\SqlDemo\WS12R2.ISO -SrcPath "sources\sxs" -DstPath c:\SqlDemo\Srv12R2\sxs

 

Figure 1: Note: -SrcPath has no drive letter because we don’t know which drive letter the ISO image will mount to until runtime.

 

 

3.       Similarly, get the entire Sql ISO content by running the following script:

 

.\GetFilesFromImage.ps1 -ImagePath c:\SqlDemo\Sql12SP1.ISO -SrcPath "*" -DstPath c:\SqlDemo\Sql12SP1

 

 

Remember the folders c:\SqlDemo\Srv12R2\sxs and c:\SqlDemo\Sql12SP1; we will need them later on.

 

Checking the Configuration Data File

 

It’s important to ensure the configuration data file (c:\Dsc-SqlDemo\ConfigSqlDemoData.psd1) has the correct information. If you used the same paths as above for SqlDemo, and are okay with using the default credentials, the demo should work without any changes. However, if SqlDemo or the files underneath it are in a different path, on a different drive, or have different names, their locations need to be updated in the data file.

 

Checking Credentials

 

By default, “P@ssword” is the password for every credential. You can change the passwords to your own if you would like, but please remember them. And don’t forget to clean up after the demo.

 

Also, notice that the three VMs are created on the host’s private network. In other words, they are visible only to each other and to the host. To allow the VMs to access software on the host, we create a local user, vmuser, which has read access to the SqlDemo folder (in my case, c:\SqlDemo).

 

Checking Paths

 

Confirm that the following paths in the ConfigSqlDemoData.psd1 file are correct:

 

# Windows Server 2012 R2 vhd

VhdSrcPath ="c:\SqlDemo\9600.16415.amd64fre.winblue_refresh.130928-2229_server_serverdatacentereval_en-us.vhd"

 

# .Net 3.5 source files  

@{ Source ="C:\SqlDemo\Srv12R2\sxs";    Destination ="sxs" }

 

# Sql software folder on Host

SqlSrcHostPath ="C:\SqlDemo\Sql12SP1" 

 

Running the demo

 

Once everything is ready, running the demo is as simple as executing the Deploy-Demo.ps1 script.

 

The script will ask you to enter passwords for the private domain administrator, the SQL administrator, the user to access the host file share, and the user on the host for the file share access. The last two should have the same password. In my example, I entered "P@ssword" four times for the sake of simplicity.

 

After about 30-60 minutes, the SQL HAG will be set up across three VMs running on the host machine:

 

1.       SqlDemo-pdc – the primary domain controller, which ensures the private domain for two SQL cluster nodes.

2.       SqlDemo-Sql01 – the primary node in SQL High Availability Group

3.       SqlDemo-Sql02 – the secondary node in SQL High Availability Group

 

 

Verification (How do you know it worked)

 

It's worth noting that when the configuration returns success on the host machine, that only indicates that the VMs have been created, NOT that the SQL HA deployment on the VMs is complete.  The deployment takes about 30-60 minutes, so be patient with the installation script.

 

To check for complete status:

·         Monitor the size of the VHDs being created on the host machine under Vm\. The pdc VHD should grow to about 2.4 GB, and the SQL VHDs to about 8 GB.

 

To debug a failure:

·         Check the ETW events on each VM under Applications and Services Logs\Microsoft\Windows\Desired State Configuration/Operational
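From an elevated PowerShell console inside a VM, one quick way to inspect those events is Get-WinEvent against the DSC operational channel, as a sketch:

```powershell
# Show the 20 most recent DSC events, including any errors,
# from the operational log mentioned above
Get-WinEvent -LogName "Microsoft-Windows-Dsc/Operational" -MaxEvents 20 |
    Select-Object TimeCreated, LevelDisplayName, Message |
    Format-Table -Wrap
```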

 

To confirm success:

1.       Log in to one of the SQL nodes

2.       Start “Microsoft SQL Server Management Studio”

3.       Connect to one of the SQL instances (like sql01, or 192.168.100.11 by IP)

4.       Under “AlwaysOn High Availability”, you should see something like the following snapshot:

 

 

5.       Expand the Databases folder

6.       Open TestDB

7.       Populate some data

8.       Check that it is replicated on the second node shortly thereafter.

Key Takeaways

This example is far more complex than most others that have been shown or created.  As such, it demonstrates many characteristics of configurations that may be lost in the simpler scenarios.  Here are a few things we think are worth noting.

 

  1.  Each configuration uses a configuration data file to separate the environmental configuration data from the structural configuration.  This allows the example to scale up easily.
  2. The “WaitFor” pattern is used many times to coordinate across machines.  This pattern is used in scenarios where a machine needs to wait for another machine to do something.  For example, Sql02 needed to wait for the Primary Domain Controller to create the domain before ensuring that it was joined to the domain.
  3. The configurations that ran in pdc, sql01, and sql02 were bootstrapped into the VHDs as .mof files.  This technique improves scalability and performance when configuring VMs at startup.  Stay tuned for a blog post on this later.

 

That’s it!  Let us know what you think in the comments.

 

Enjoy the fun!

 

Chen Shang, Mark Gray, John Slack, Narine Mossikyan

Windows DSC Team

Reusing Existing Configuration Scripts in PowerShell Desired State Configuration


You are an expert in PowerShell DSC (or maybe not an expert, just someone playing around with configurations in DSC) and have already written fairly large and complex configurations for configuring your environment/data center. Everything is working well and you are a great fan of DSC. There’s only one problem: your work is complicated. Before long, you have a configuration that is hundreds or thousands of lines long, with people from many different teams editing it. Finding and fixing problems becomes nearly impossible… your configuration is just too unwieldy. Then comes a day when there arises a need to add something more to your configuration (or maybe delete something). The problem looks trivial to solve, right? Just add one more resource to your (already big) configuration. But you are a forward-thinking person and find yourself wondering if there is something clever you can do to leverage your existing configuration scripts.


That is why we made configurations composable and reusable. Yes, one configuration can call another. How? That is what we are going to cover in this post.

 

The way to make a configuration reusable is by making it what we call a composite resource. Let me walk you through an example to do just that.

 

I have the following parameterized configuration (the parameters of the configuration become the properties of the composite resource) which I will turn into a composite resource:

 

Configuration xVirtualMachine
{
    param
    (
        # Name of VMs
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String[]]$VMName,

        # Name of switch to create
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String]$SwitchName,

        # Type of switch to create
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String]$SwitchType,

        # Source path for the parent VHD
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String]$VhdParentPath,

        # Destination path for the differencing VHD
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String]$VHDPath,

        # Startup memory for the VM
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String]$VMStartupMemory,

        # State of the VM
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String]$VMState
    )

    # Import the modules that define the custom resources
    Import-DscResource -Module xComputerManagement, xHyper-V

    # Install the Hyper-V role
    WindowsFeature HyperV
    {
        Ensure = "Present"
        Name   = "Hyper-V"
    }

    # Create the virtual switch
    xVMSwitch $SwitchName
    {
        Ensure    = "Present"
        Name      = $SwitchName
        Type      = $SwitchType
        DependsOn = "[WindowsFeature]HyperV"
    }

    # Check for the parent VHD file
    File ParentVHDFile
    {
        Ensure          = "Present"
        DestinationPath = $VhdParentPath
        Type            = "File"
        DependsOn       = "[WindowsFeature]HyperV"
    }

    # Check the destination VHD folder
    File VHDFolder
    {
        Ensure          = "Present"
        DestinationPath = $VHDPath
        Type            = "Directory"
        DependsOn       = "[File]ParentVHDFile"
    }

    # Create a VM-specific differencing VHD
    foreach ($Name in $VMName)
    {
        xVHD "Vhd$Name"
        {
            Ensure     = "Present"
            Name       = $Name
            Path       = $VHDPath
            ParentPath = $VhdParentPath
            DependsOn  = @("[WindowsFeature]HyperV",
                           "[File]VHDFolder")
        }
    }

    # Create each VM using the above VHD
    foreach ($Name in $VMName)
    {
        xVMHyperV "VMachine$Name"
        {
            Ensure        = "Present"
            Name          = $Name
            VhdPath       = (Join-Path -Path $VHDPath -ChildPath $Name)
            SwitchName    = $SwitchName
            StartupMemory = $VMStartupMemory
            State         = $VMState
            WaitForIP     = $true
            DependsOn     = @("[WindowsFeature]HyperV",
                              "[xVHD]Vhd$Name")
        }
    }
}

 

The key is to place the configuration in a file with the extension .schema.psm1. You can take a look here to find out how to deploy a DSC resource. Here is how it looks on my machine:

PS C:\Program Files\WindowsPowerShell\Modules\TestCompositeResource\DSCResources\xVirtualMachine> dir

    Directory: C:\Program Files\WindowsPowerShell\Modules\TestCompositeResource\DSCResources\xVirtualMachine

Mode                LastWriteTime     Length Name                                                                         

----                -------------     ------ ----                                                                         

-a---         2/25/2014   8:42 PM       2642 xVirtualMachine.psd1                                                         

-a---         2/25/2014   8:42 PM       2957 xVirtualMachine.schema.psm1   

Note: Take note of the .psd1 file (xVirtualMachine.psd1) inside the DSCResources folder. On my first attempt, I did not put that file there and wasted some time trying to figure out where I was going wrong (a valid PowerShell module must have one of the .psd1, .psm1, .cdxml, or .dll extensions, and it took me some time to realize that .schema.psm1 does not satisfy that condition).

Inside the .psd1 file, I have this line:

RootModule = 'xVirtualMachine.schema.psm1'
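Rather than authoring the manifest by hand, you can generate it with New-ModuleManifest. This is only a sketch (run from the xVirtualMachine resource folder; the author name is a placeholder):

```powershell
# Generate a minimal manifest whose root module is the composite resource
New-ModuleManifest -Path .\xVirtualMachine.psd1 `
    -RootModule 'xVirtualMachine.schema.psm1' `
    -Guid ([guid]::NewGuid()) `
    -Author 'YourName'   # placeholder
```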

 

That is it, you are done!

PS C:\> Get-DscResource -Name xVirtualMachine

ImplementedAs        Name                           Module                                                Properties                                      

-------------              ----                               ------                                                  ----------                                      

Composite               xVirtualMachine           TestCompositeResource                    {VMName, SwitchName, SwitchType, VhdParentPath...}

 

Your configuration shows up as a composite resource.

Let us now see how to use it:

Configuration RenameVM
{
    Import-DscResource -Module TestCompositeResource

    Node localhost
    {
        xVirtualMachine VM
        {
            VMName          = "Test"
            SwitchName      = "Internal"
            SwitchType      = "Internal"
            VhdParentPath   = "C:\Demo\Vhd\RTM.vhd"
            VHDPath         = "C:\Demo\Vhd"
            VMStartupMemory = 1024MB
            VMState         = "Running"
        }
    }

    Node "192.168.10.1"
    {
        xComputer Name
        {
            Name       = "SQL01"
            DomainName = "fourthcoffee.com"
        }
    }
}

 

We used the dynamic keyword Import-DscResource to make our composite resource type available in the configuration. The parameters of the composite resource become its properties. You can discover this in two ways: one is to use the Get-DscResource cmdlet as above, and the other is in the ISE. I like the ISE approach, since it does not require me to shift focus to the command window and type the cmdlet. Put the cursor where you have the name of the resource and press Ctrl+Space. You can also discover all the resources by pressing Ctrl+Space after the configuration keyword (after Import-DscResource, if you are importing custom resources).

Here is what ISE displays:

 

Untitled

 

Tab completion works on the names of the properties just like any other resource, isn’t that cool?

This way, I have a configuration in which I reused one of my existing configurations and added one more resource to the overall configuration of my machine. This configuration first creates a VM and then uses the xComputer resource to rename it. I can thus build upon my existing configurations as the need arises for more complex configurations.
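To compile this configuration into .mof files and apply it, the standard DSC pattern applies; the output path below is an arbitrary choice:

```powershell
# Generate localhost.mof and 192.168.10.1.mof under C:\Demo\RenameVM
RenameVM -OutputPath C:\Demo\RenameVM

# Push the configuration to the target nodes
Start-DscConfiguration -Path C:\Demo\RenameVM -Wait -Verbose
```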

 

 

 

 

Happy configuring!

Abhik Chatterjee

Windows PowerShell Developer

 

 

Want to Automatically Configure Your Machines Using DSC at Initial Boot-up?


Oftentimes, IT Pros want to automate software installation and configuration upon a machine’s initial boot-up. This blog will walk you through how to use Windows PowerShell Desired State Configuration (DSC) to do this. We will show you how to prepare and inject DSC configurations into your bootable media (such as VHDs) so that they are executed during the initial boot-up process. We will discuss three approaches and give a VHD example for each.

Prerequisites:

  • Windows Server 2012 R2 with update KB2883200. You can use Get-HotFix -Id KB2883200 to check whether it is installed.

Option 1 – Inject a DSC configuration (.ps1) into a VHD

Our first approach will be to inject a .ps1 with a DSC configuration directly into a VHD. This approach leverages “Unattended Windows Setup” to automatically run an inserted configuration. For information about running Windows Setup, see the reference here.

  • First, we need to write a configuration to inject. This configuration will be run when the machine boots up. For example, this sample configuration ensures that IIS is installed. Note: feel free to replace this configuration with your own. The xComputer page has more samples for common tasks including renaming your computer, joining a domain, etc.

 

powershell -c {

    configuration SampleIISInstall

    {

        node ("localhost")

        {

            WindowsFeature IIS

            {

                Ensure = "Present"

                Name   = "Web-Server"

            }

        }

    }

  

    # Compile the configuration file to a MOF format

    SampleIISInstall

 

    # Run the configuration on localhost

    Start-DscConfiguration -Path .\SampleIISInstall -ComputerName localhost -Wait -Force -Verbose

}

 

  • Save the above to SampleIISInstall.ps1. This file will be injected into your bootable VHD.
  • Next, we need to create an unattend.xml file. Below is a snippet from a sample unattend.xml (the whole document is attached at the bottom of this blog). During the “specialize” pass of Windows installation, the RunSynchronous command is invoked, which in turn calls a command file (in this case, RunDSC.cmd).

<settings pass="specialize">

        <component name="Microsoft-Windows-Deployment" processorArchitecture="amd64" …>

            <RunSynchronous>

                <RunSynchronousCommand>

                    <Order>1</Order>

                    <Path>%SystemDrive%\DSC\RunDSC.cmd</Path>

                </RunSynchronousCommand>

            </RunSynchronous>

        </component>

</settings>

  • The content of RunDSC.cmd looks something like the following. It uses Windows Task Scheduler to schedule a task that invokes SampleIISInstall.ps1.

schtasks.exe /Create /SC ONSTART /TN "\Microsoft\Windows\Desired State Configuration\DSCRestartBootTask" /RU System /F /TR "PowerShell.exe -ExecutionPolicy RemoteSigned -File c:\dsc\SampleIISInstall.ps1"

  • We now have all the necessary files ready. Inject SampleIISInstall.ps1, RunDSC.cmd, and unattend.xml into the VHD. In our case, unattend.xml is copied to the root directory, while SampleIISInstall.ps1 and RunDSC.cmd are put under %SystemDrive%\DSC. If you boot up from the VHD, or create a VM from it, you will see that IIS gets installed at first boot-up.
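The injection itself can be sketched with the Hyper-V cmdlets. The VHD path below is illustrative, and if the image contains several volumes you will need to filter Get-Volume down to the Windows volume:

```powershell
$vhd    = 'C:\Demo\Vhd\Server.vhd'   # hypothetical path -- use your own VHD

# Mount the VHD and find the drive letter of its (assumed single) volume.
$letter = (Mount-VHD -Path $vhd -Passthru | Get-Disk |
           Get-Partition | Get-Volume).DriveLetter
$root   = "${letter}:"

# Copy the three files to the locations described above.
New-Item -Path "$root\DSC" -ItemType Directory -Force | Out-Null
Copy-Item .\SampleIISInstall.ps1, .\RunDSC.cmd -Destination "$root\DSC"
Copy-Item .\unattend.xml -Destination "$root\"

Dismount-VHD -Path $vhd
```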

Option 2 – Inject a compiled DSC configuration (.mof) into a VHD

Imagine we need to customize the installation for large-scale rollouts so that we achieve a consistent configuration on each target node. The method described in Option 1 is not efficient for this; it requires compiling the DSC configuration on every node. In the following approach, we instead inject an already compiled configuration into the VHD.

There are two ways to inject a compiled DSC configuration into a VHD. Let’s go through them.

2.1 Inject a DSC configuration (.mof) document

  • To generate a compiled configuration, let’s compile the SampleIISInstall configuration from above. A localhost.mof file is generated, as shown below:

PS C:\> SampleIISInstall

 

 

    Directory: C:\SampleIISInstall

 

 

Mode       LastWriteTime            Length Name

----       -------------            ------ ----

-a---      2/14/2014   4:12 PM      1086   localhost.mof

 

  • Copy the localhost.mof to pending.mof.
  • Make sure you inject pending.mof into your VHD under the DSC Local Configuration Manager (LCM) dedicated folder: %systemdrive%\Windows\System32\Configuration.
  • As in Option 1, we will use an unattend.xml. As before, the unattend.xml will invoke a command file like RunDSC.cmd.
  • The content of RunDSC.cmd should look something like the following. It schedules a task:

schtasks.exe /Create /SC ONSTART /TN "\Microsoft\Windows\Desired State Configuration\DSCRestartBootTask" /RU System /F /TR "PowerShell.exe -NonInt -Command 'Invoke-CimMethod -Namespace root/Microsoft/Windows/DesiredStateConfiguration -ClassName MSFT_DSCLocalConfigurationManager -MethodName PerformRequiredConfigurationChecks -Arguments @{ Flags = [System.UInt32]3 }'"

The above process makes localhost.mof the pending configuration in the LCM. The scheduled task then tells the LCM to process the pending configuration (a Flags value of 3 means "apply the configuration"). Through this mechanism, localhost.mof will be applied.
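The staging steps above can be sketched the same way as in Option 1; again the VHD path is illustrative and Get-Volume may need filtering on a multi-volume image:

```powershell
$vhd    = 'C:\Demo\Vhd\Server.vhd'   # hypothetical path -- use your own VHD

# Mount the VHD and locate its Windows volume.
$letter = (Mount-VHD -Path $vhd -Passthru | Get-Disk |
           Get-Partition | Get-Volume).DriveLetter

# Stage the compiled configuration as pending.mof in the LCM folder.
Copy-Item .\SampleIISInstall\localhost.mof `
    -Destination "${letter}:\Windows\System32\Configuration\pending.mof"

Dismount-VHD -Path $vhd
```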

2.2 Inject a “meta-configuration” to download a document

In the above we discussed how to apply configurations using the “push model” of DSC; that is to say, we “pushed” the configuration onto the machine. We can also leverage the “pull model” of DSC to apply a configuration. To do this, we configure the node to “pull” its configuration from a DSC “pull server” and then apply it. We do this by configuring the Local Configuration Manager with a “meta-configuration.”

We describe the steps as below:

1. Deploy your configuration files to the pull server (see this blog for more information)

  • Copy the localhost.mof created in Option 2.1 to 43d4995d-3199-4e0d-aef5-d52d3b681ac4.mof
  • Run “New-DSCCheckSum” to generate 43d4995d-3199-4e0d-aef5-d52d3b681ac4.mof.checksum.
  • Copy the above two files to your pull server under the following folder: $env:programfiles\WindowsPowerShell\DSCService\Configuration.
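The three bullets above can be sketched as a short script run on (or against) the pull server; the GUID must match the ConfigurationID used in the meta-configuration:

```powershell
$guid   = '43d4995d-3199-4e0d-aef5-d52d3b681ac4'
$target = "$env:ProgramFiles\WindowsPowerShell\DSCService\Configuration"

# Name the compiled configuration after the ConfigurationID.
Copy-Item .\SampleIISInstall\localhost.mof ".\$guid.mof"

# Generate the matching checksum file next to it.
New-DSCCheckSum -ConfigurationPath ".\$guid.mof" -Force

# Deploy both files to the pull server's configuration store.
Copy-Item ".\$guid.mof", ".\$guid.mof.checksum" -Destination $target
```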

Now we have the configuration files ready on the pull server. Next, we will create a DSC meta-configuration to be injected to the VHD.

2. Prepare the meta-configuration for target node

Configuration SampleConfiguration

{

    Node "localhost"

    {

        LocalConfigurationManager

        {   

            ConfigurationID = "43d4995d-3199-4e0d-aef5-d52d3b681ac4";

            DownloadManagerName = "WebDownloadManager";

            DownloadManagerCustomData = @{ ServerUrl = "http://Pullserver:8080/PSDSCPullServer/PSDSCPullServer.svc"; AllowUnsecureConnection = "True" }

            RefreshMode = "PULL";

            ConfigurationMode = "ApplyAndAutoCorrect";

            RebootNodeIfNeeded = $true;

            RefreshFrequencyMins = 1;

            ConfigurationModeFrequencyMins = 30;

        }

    }

}

 

  • After compiling the above meta-configuration, you will see that a localhost.meta.mof file has been generated. Its content looks like this:

instance of MSFT_KeyValuePair as $keyvaluepair1

{

    key = "ServerUrl";

    value = "http://pullserver:8080/PSDSCPullServer/PSDSCPullServer.svc";

};

 

 

instance of MSFT_KeyValuePair as $keyvaluepair2

{

    key = "AllowUnsecureConnection";

    value = "true";

};

 

 

instance of MSFT_DSCMetaConfiguration

{

      ConfigurationID = "43d4995d-3199-4e0d-aef5-d52d3b681ac4";

      RefreshMode = "PULL";

      ConfigurationMode = "ApplyAndAutoCorrect";

      DownloadManagerName = "WebDownloadManager";

      DownloadManagerCustomData = {$keyvaluepair1, $keyvaluepair2};

      RebootNodeIfNeeded = True;

      RefreshFrequencyMins = 1;

      ConfigurationModeFrequencyMins = 30;

};

 

  • Rename localhost.meta.mof to metaconfig.mof and inject it into your VHD under the %systemdrive%\Windows\System32\Configuration folder. Note that your metaconfig.mof should contain only MSFT_DSCMetaConfiguration and MSFT_KeyValuePair instances; you may need to manually remove the OMI_ConfigurationDocument instance if it exists.
  • The rest of the steps regarding unattend.xml and RunDSC.cmd are similar to what we mentioned in Option 2.1.
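The manual cleanup of OMI_ConfigurationDocument can be scripted. This sketch assumes the instance appears as a single `instance of OMI_ConfigurationDocument … };` block in the generated file, which is typical; inspect the result before injecting it:

```powershell
# Read the generated meta-configuration as one string.
$mof = Get-Content .\SampleConfiguration\localhost.meta.mof -Raw

# Strip the OMI_ConfigurationDocument instance block, if present.
$mof = $mof -replace '(?s)instance of OMI_ConfigurationDocument.*?};', ''

# Write the cleaned content under the required name.
Set-Content -Path .\metaconfig.mof -Value $mof
```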

In this post, we discussed three approaches for applying DSC configurations during initial boot-up. You can follow these instructions to prepare your DSC configurations, unattend.xml, and command files. Once you have these files ready, you can use the xVhdFileDirectory samples from this site to inject them into your VHD. After that, you can create a VM from the VHD. If you do so, you will see that IIS is installed (or still being installed, since installing IIS takes time) when you log on to the VM.


Thanks,
Jianyun Tao
Microsoft Corporation


PowerShell Summit North America


We are delighted to draw attention to this year's PowerShell Summit - an excellent community run PowerShell event.

As described by PowerShell.Org:

Come meet the Windows PowerShell team, PowerShell MVPs, independent experts, and your peers and colleagues in the PowerShell universe! The PowerShell Summit is a one-of-a-kind annual event, and this year it’s April 27-29 right in Microsoft’s home town of Bellevue, Washington. You’ll enjoy more than 63 in-depth technical sessions about PowerShell, how to manage using PowerShell, how to develop in PowerShell, how to troubleshoot in PowerShell – pretty much everything PowerShell. 

Microsoft speakers include PowerShell’s inventor, Distinguished Engineer Jeffrey Snover – not to mention an all-star cast of developers, program managers, and others who help create PowerShell on a daily basis. You’ll meet Lee Holmes (author of the “Windows PowerShell Cookbook” and resident PowerShell Security Guru), Kenneth Hansen (Principal Program Manager), Krishna Vutukuri (Developer Lead), Jason Shirk (Senior Developer, creator of TabExpansion++ and PSReadLine), Hemant Mahawar (Senior Program Manager), and Paul Higinbotham (Senior Developer, expert on debugging and remoting). You’ll also find independent experts like Jeff Hicks, Don Jones, Jason Helmick, Aleksandar Nikolic, Richard Siddaway, and many more.

The Summit is fun, too. You’ll have a great time at the Monday evening IRON SCRIPTER tournament and reception (including a ton of great local food and brews), mingle with product team members in Microsoft’s downtown office on Tuesday night, and rub elbows with a who’s-who list of PowerShell people each and every day.

Best of all, the Summit is run by PowerShell.org, a community-owned and -operated not-for-profit. That means the event is run on a break-even basis, so it’s not an expensive boondoggle. Head over to http://powershellsummit.org for all the details. Note that in 2013 the event completely sold out, so you don’t want to delay too long in securing your space.

We hope to see you there!

The Windows PowerShell Team

DSC Resource Kit Wave 3


In September, Microsoft released PowerShell Desired State Configuration (DSC) with twelve built in resources.

 

Three months later (December), we added eight more resources with Wave One of the DSC Resource Kit.

 

Two months after that (February), we totaled fourteen additional resources with Wave Two of the DSC Resource Kit.

 

Can you guess what happens now, the month after the release of Wave Two?

 

We're happy to announce Wave Three of the DSC Resource Kit.  This wave contains eighteen new DSC resources.  This brings our total count to over 50 DSC Resources -- without even mentioning the community created resources available on the PowerShell.Org's GitHub.

 

Click here to see all of the resources in the Resource Kit!

 

What's in this Wave?

The resources available in this wave allow you to get more done.  We've drastically improved the support of IIS and SQL Server, adding resources for managing things like Web App Pools, Virtual Directories, Web Applications, and DACPACs.  We've added a whole new set of resources to support Remote Desktop Session Host.  We've added other new resources, allowing you to manage PowerShell Remoting endpoints, download files from a URI, compress/decompress .zip files, and more.

 

Now, you might be thinking, "That is an awful lot of resources.  Am I going to have to download them all individually?"  Not to worry -- you can download all of the latest DSC resources from this page.  If you want to take a more à la carte approach, you can still search for “DSC Resource Kit” on TechNet Script Center.

 

In addition to the new resources, we have also made some updates to existing resources based on feedback we have received.

Questions, Comments?

If you're looking into using PowerShell DSC, but are blocked by a lack of resources, let us know in the comments or the TechNet QA Section.

 

As always, we must reiterate that these resources come without any guarantees.  The “x” prefix stands for experimental – which means these resources are provided AS IS and are not supported through any Microsoft support program or service. We will monitor the TechNet pages, take feedback, and may provide fixes moving forward.   Also, don’t forget to check out the community versions of many resources on PowerShell.Org's GitHub.

Details

After installing the modules, you can discover all of the available resources by using the Get-DscResource cmdlet.  Here is a brief description of each resource (for more details on a resource, check out the TechNet pages).
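For example, once a module such as xWebAdministration is installed, you can list its resources and inspect the usage syntax of any one of them:

```powershell
# List every DSC resource the module provides.
Get-DscResource -Module xWebAdministration

# Show the configuration-block syntax for a single resource.
Get-DscResource -Name xWebAppPool -Syntax
```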

 

Module                        Resource                           Description
xWebAdministration            xWebAppPool                        Create, remove, start, and stop an IIS application pool
                              xWebVirtualDirectory               Create or remove a virtual directory
                              xWebApplication                    Create or remove a web application
                              xWebConfigKeyValue                 Configure the AppSettings section of Web.Config
xDatabase                     xDatabase                          Create, drop, and deploy databases
                              xDBPackage                         Back up and restore databases
xSystemSecurity               xUAC                               Enable or disable the User Account Control prompt
                              xIEEsc                             Enable or disable IE Enhanced Security Configuration
xRemoteDesktopSessionHost     xRDSessionDeployment               Creates and configures a Remote Desktop Session Host deployment
                              xRDSessionCollection               Creates an RD Session Host collection
                              xRDSessionCollectionConfiguration  Configures an RD Session Host collection
                              xRDRemoteApp                       Publish applications for your RD Session Host collection
xPSDesiredStateConfiguration  xWindowsProcess                    Adds the ability to run as a specific user to the existing WindowsProcess resource
                              xService                           Updates the existing Service resource to include creating and configuring services
                              xRemoteFile                        Download files from a URI
                              xPackage                           Adds the ability to run as a specific user to the existing resource; includes VS setup
                              xArchive                           Create, update, and extract a .zip file
                              xEndpoint                          Creates a remoting endpoint

Updates

xDscResourceDesigner, xComputer, xVMHyperV, xDNSServerAddress

Feature additions and bug fixes

The specific changes to existing resources will be noted on the individual TechNet pages.

Renaming Guidelines

When making changes to these resources, we urge the following practice:

1.       Update the following names by replacing MSFT with your company/community name and replacing the “x” with "c" (short for "Community") or another prefix of your choice:

a.       Module name (ex: xWebAdministration becomes cWebAdministration)

b.      Folder name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

c.       Resource Name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

d.      Resource Friendly Name (ex: xWebsite becomes cWebsite)

e.      MOF class name (ex: MSFT_xWebsite becomes Contoso_cWebsite)

f.        Filename for the <resource>.schema.mof (ex: MSFT_xWebsite.schema.mof becomes Contoso_cWebsite.schema.mof)

2.       Update module and metadata information in the module manifest

3.       Update any configurations that use these resources
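As a rough sketch of step 1, the renames can be scripted. The paths below are hypothetical, the replacements cover only the MSFT_x prefix, and you should review the results by hand before using the renamed module:

```powershell
$src = 'C:\Modules\xWebAdministration'   # hypothetical working copy
$dst = 'C:\Modules\cWebAdministration'

Copy-Item $src $dst -Recurse

# Rename MSFT_x* folders and files to Contoso_c*; deepest paths first so
# child items are renamed before their parent folders.
Get-ChildItem $dst -Recurse | Where-Object Name -like 'MSFT_x*' |
    Sort-Object FullName -Descending |
    ForEach-Object {
        Rename-Item $_.FullName ($_.Name -replace '^MSFT_x', 'Contoso_c')
    }

# Replace the class and resource names inside the module files as well.
Get-ChildItem $dst -Recurse -Include *.psm1, *.psd1, *.mof | ForEach-Object {
    (Get-Content $_.FullName -Raw) -replace 'MSFT_x', 'Contoso_c' |
        Set-Content $_.FullName
}
```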

 

We reserve resource and module names without prefixes ("x" or "c") for future use (e.g. "MSFT_WebAdministration" or "Website").  If the next version of Windows Server ships with a "Website" resource, we don't want to break any configurations that use any community modifications.  Please keep a prefix such as "c" on all community modifications.

As specified in the license, you may copy or modify these resources as long as they are used on the Windows platform.

Requirements

The DSC Resource Kit requires Windows 8.1 or Windows Server 2012 R2 with update KB2883200 (aka the GA Update Rollup). You can check whether it is installed by running the following command:

PS C:\WINDOWS\system32> Get-HotFix -Id KB2883200

 

Source        Description      HotFixID      InstalledBy          InstalledOn             

------        -----------      --------      -----------          -----------             

NANA-TOUCH    Update           KB2883200     NANA-TOUCH\Admini... 9/30/2013 12:00:00 AM   

 

On supported down-level operating systems, the resources require WMF 4.0. Refer to these previous blog posts for more information on WMF 4.0 and issues with partial installation.

 

Configuring an Azure VM using PowerShell DSC


At the //build/ conference today, Jeffrey Snover demonstrated bringing up an Azure virtual machine and configuring it using DSC and the Custom Script VM extension.  We are sharing the scripts he used to accomplish this.

These scripts present an example of how PowerShell DSC can be used with the Azure boot agent to create and automatically configure Azure VMs. In order to use these scripts you must have an active Azure account and some knowledge of the Azure PowerShell SDK. (In other words, these examples are not for the faint of heart...) 

NOTE: before adapting these scripts for your own use, you should review the scripts and make sure that you update the scripts to comply with your security patterns and practices.

This example will provision a virtual machine using your Windows Azure account. It will upload the configuration defined in MyConfigScript.ps1, any modules required by that configuration, and the DscBoot.ps1 script to blobs in your Azure storage account. Once the virtual machine has been provisioned, the Azure script handler extension will download and run DscBoot.ps1 to enact the configuration specified in MyConfigScript.ps1.

The provided MyConfigScript.ps1 uses the OneGet package management toolkit and DSC to configure the applications installed on the VM, and sets up an IIS website. 

In order to use these scripts, you will need a few things:

1)      A Microsoft Azure account

2)      The most recent version of the Azure PowerShell SDK

  1. You can install it from here
  2. Then as administrator, copy the Azure module into the common shared modules folder:

copy "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\azure" $env:ProgramFiles\WindowsPowerShell\Modules -Recurse -Force

3)      The AzureBootAgentPowerShellDSCexample.zip file (attached to the blog post)

  1. Unblock the file so you are able to run the scripts contained within it.

unblock-file .\AzureBootAgentPowerShellDSCexample.zip

  2. Extract the zip to a known location (e.g. "C:\temp\DSC")
  3. As administrator, copy the module definitions contained within these folders into the $env:ProgramFiles\WindowsPowerShell\Modules folder

copy .\xOneGet $env:ProgramFiles\WindowsPowerShell\Modules -Recurse

copy .\xWebAdministration $env:ProgramFiles\WindowsPowerShell\Modules -Recurse

 

Initial Setup

This script is dependent on several properties linked to your Azure account. You will need to obtain the values for the following variables in the config.ps1 file contained in the AzureBootAgentPowerShellDSCexample.zip.

  • Publish settings
  • Run the following commands to obtain information you will need to configure the scripts for your own usage.
    • Run the following command in Powershell, replacing $publishSettingsFile with the file name:

Import-Module azure; Import-AzurePublishSettingsFile -PublishSettingsFile $publishSettingsFile

    • Run the following command and note the SubscriptionName property

Get-AzureSubscription

  • If you already have an Azure storage account you use, skip the next step.
    • Create a new Azure storage account by running the following command in PowerShell, replacing “demostorage001” with a unique identifier.

New-AzureStorageAccount -storageaccountname demostorage001 -location "West US"

  • Get the secondary storage key for your storage account.
    • Run the following command in PowerShell, replacing “demostorage001” with your storage account name:

Get-AzureStorageKey demostorage001

(Storage account names can only contain lowercase letters, numbers or “-“, and must start with a letter. See Azure documentation for more details on storage account name restrictions).

  • Open config.ps1 to change the following variable assignments:
    •  Change the value of the variable below such that it points to the .publishsettings file.

$publishSettingsFile = "<your.publishsettings>"

    • Change the value of the variable below to reflect the SubscriptionName property

$subscriptionName = ""

    • Change the two variable assignments below to reflect your storage account and key.

$AccountName = "demostorage"

$AccountKey = "<your-account-key>"

Running the Demo

  • Ensure config.ps1 has been set up correctly as described in the previous section. In particular, $vmName needs to be unique. $VMServiceName will throw an error if it is not unique, but the script can continue regardless:

$vmName = "<myUserid-machineName-01>"

$VMServiceName = "<myUserid-serviceName-01>"

  • Open the demo.ps1 file and follow the steps it contains to see how the VM is created and configured. Be sure to create your own username/password for the VM.
    • Connect to the Azure Management Portal and select the Virtual Machines slice
    • Find the VM you created by name and wait for it to reach the “Ready” or “Stopped” state
    • Connect to the VM with the credentials you created in demo.ps1
  • On the target VM, verify that the configuration has been properly applied. Check:
    • that the Chrome, Firefox, and Opera web browsers have been installed
    • that the vim and sysinternals packages have been installed
    • that the Fourth Coffee website has been created.

 

Windows Management Framework V5 Preview


The Windows Management Framework V5 Preview is out! Check out Jeffrey Snover's announcement here. The preview contains updates to PowerShell Desired State Configuration, as well as two new features: NetworkSwitch cmdlets and Windows PowerShell OneGet.  OneGet is designed to dramatically simplify how you discover and install software packages, while the NetworkSwitch cmdlets allow you to manage Certified for Windows network switches.  Check out the announcement page for more details.

We're excited to share this preview with you -- let us know what you think. 

Cheers,

PowerShell Team

What’s in a name? Using prefixes in PowerShell.


We’ve talked about this in the past, but it’s time for a reminder.  PowerShell uses prefixes in front of nouns to avoid name collisions.  Imagine how many collisions there would be if people used the noun “USER” directly.  Instead, we have the cmdlets *-ADUser, *-VPNUser, and *-RDUser.  The use of a prefix avoids name collisions.  When there is a name collision, the customer must specify the full name of the cmdlet.  Did you know that the full name of Get-ADUser is actually ActiveDirectory\Get-ADUser?  How would you feel if you had to type that every time you wanted to invoke that command?

The use of prefixes also makes it easy to find related components.  Look how easy it is to find all the PowerShell settings by searching for variables with the “PS” prefix:
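Either of the following commands produces that list:

```powershell
# List all variables whose names start with the "PS" prefix.
Get-Variable -Name PS*

# The same thing, via the variable: provider.
Get-ChildItem variable:PS*
```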

 

 

It is important to use prefixes.  There are two other rules you need to know:

  1. The prefix for PowerShell is “PS”.  If you use the “PS” prefix, we can and will collide with you.
  2. There are two scenarios where PowerShell will use generic nouns:  When we cover a system resource (e.g. *-Process) or where we provide a pluggable framework for an area (e.g. Drives, Jobs, Events, etc.).

Moving forward, we anticipate continuing to create modules, environment variables, file extensions, and other content types using the “PS” prefix, and creating additional pluggable frameworks where we will use generic nouns.
If you don’t use a prefix, or you use the PS prefix, you are setting yourself up for a collision.

 

We don’t want to create naming conflicts – that’s why we have insisted on the use of prefixes from the very early days on PowerShell and why we started using the “PS” prefix. 

To Sum Up

Use noun prefixes and avoid the “PS” prefix to deliver a great customer experience and minimize/avoid name collisions.

 

Thanks,

 

Jeffrey Snover and John Slack

Collecting the Output of Remoting Commands


One question we often get around PowerShell Remoting is "How do I collect the output of remoting commands into a different file for each computer?" For example, you want to invoke a remote command across 1,000 machines as a multi-threaded job and then create 1,000 text files for their output.

This is in fact impressively easy in PowerShell due to the automatic properties that we add to remote output. The trick is to just know that it exists :)

                                                                                                                                  
106 [C:\windows\system32]
>> Invoke-Command myComputer { "Hello World" }
Hello World

107 [C:\windows\system32]
>> Invoke-Command myComputer { "Hello World" } | Format-List * -Force

PSComputerName     : myComputer
RunspaceId         : 30c313ff-00a2-4a21-a001-1309c5d12a1d
PSShowComputerName : True
Length             : 11

 

As you can see in the example, we tag remote output with the computer name from which it originated. If we leverage that property, we can now create output files for remote output quite easily:

                                                                                                                                
130 [C:\windows\system32]                                                                                                         
>> $job = Invoke-Command comp1,comp2 { "Hello World: from $(hostname): $(Get-Random)" } -AsJob

131 [C:\windows\system32]
>> Wait-Job $job

132 [C:\windows\system32]
>> $r = Receive-Job $job

133 [C:\windows\system32]
>> $r
Hello World: from comp1: 341068382
Hello World: from comp2: 1357111073

134 [C:\windows\system32]
>> $r.PSComputerName
comp1
comp2

135 [C:\windows\system32]
>> $r | % { $_ > c:\temp\$($_.PSComputerName).output }

136 [C:\windows\system32]
>> dir c:\temp\*.output

    Directory: C:\temp

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---        4/16/2014  11:04 AM         88 comp1.output
-a---        4/16/2014  11:04 AM         82 comp2.output
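When each computer returns multiple objects, grouping the results first keeps all of a computer's output together in a single file. A sketch along the same lines, assuming $r holds the output of Receive-Job as above:

```powershell
# One output file per originating computer, however many objects each returned.
$r | Group-Object -Property PSComputerName | ForEach-Object {
    $_.Group | Out-File -FilePath "C:\temp\$($_.Name).output"
}
```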

 

Lee Holmes [MSFT]
Windows PowerShell Development

A World of Scripts at your Fingertips – Introducing Script Browser

 <Today we have a guest blog entry from Microsoft Customer Services & Support and the Garage>

 

To reuse script samples from the Internet, the following steps are all too familiar to IT Pros: wandering through different script galleries, forums, and blogs; switching back and forth between webpages and the scripting environment; and countless download, copy, and paste operations. All of that back-and-forth is dizzying. Need a simpler way to search for and reuse scripts? Try out the new Script Browser add-in for PowerShell ISE!

 

Download Here

 

 

Script Browser for Windows PowerShell ISE is an app developed by Microsoft Customer Services & Support (CSS) with assistance from the PowerShell team and the Garage to save IT Pros from the painful process of searching and reusing scripts. We start from the 9,000+ script samples on TechNet Script Center. Script Browser allows users to directly search, learn, and download TechNet scripts from within PowerShell ISE – your scripting environment. Starting from this month, Script Browser for PowerShell ISE will be available for download. If you are a PowerShell scripter or are about to be one, Script Browser is a highly-recommended add-in for you.

Nearly 10,000 scripts on TechNet are available at your fingertips. You can search, download and learn scripts from this ever-growing sample repository.

 

   

 

  • We enabled offline search for downloaded script samples, so that you can search and view script samples even when you have no Internet access.

 

 

You will also get the chance to try out another new function bundled with Script Browser: Script Analyzer. A Microsoft CSS engineer used the PowerShell Abstract Syntax Tree (AST) to check your current script against some pre-defined rules. In this first version, he built 7 pilot PowerShell best-practice checking rules. When you double-click a result, the script code that does not comply with the best-practice rule is highlighted. We hope to get your feedback on this experimental feature.

 

 

It is essential that an app satisfies users’ requirements, so feedback is of prime importance. For Script Browser, Microsoft MVPs are one of the key sources of constructive feedback. When Script Browser was demoed at the 2013 MVP Global Summit in November and the 2014 Japan MVP Open Day, the MVP community proposed insightful improvements. For instance, MVPs suggested showing a script preview before users decide to download the complete script package. MVPs also wanted to be able to search for script samples offline. These were great suggestions, and the team immediately added the features to the release. We have collected a pool of great ideas (e.g. MVPs also suggested that the best-practice rule checking in Script Analyzer should be extensible). We are committed to continuously improving the app based on your feedback.

 

We have an ambitious roadmap for Script Browser. For example, we plan to add more script repositories to the search scope. We are investigating integration with Bing Code Search. We are also trying to improve the extensibility of Script Analyzer rules. Some features, like script sample sharing and searching within an enterprise, are still in their infancy.

 

We sincerely suggest you give Script Browser a try (click here to download). If you love what you see in Script Browser, please recommend it to your friends and colleagues. If you encounter any problems or have any suggestions for us, please contact us at onescript@microsoft.com. Your precious opinions and comments are more than welcome.


Debug Mode in Desired State Configuration


The DSC engine caches resources implemented as PowerShell modules for efficiency. This can be annoying when you are authoring a resource and testing it simultaneously, because the only way to make DSC load the newer version every time is to explicitly kill the process hosting the DSC engine.

As part of the WMF5 CTP release, we have introduced a new property for configuring the DSC Local Configuration Manager (LCM) called DebugMode. When set to true, it causes the engine to reload the PowerShell DSC resource on every run. Once you are done writing your resource, you can set it back to false and the engine will revert to its behavior of caching the modules.

Here is a demonstration of using the DebugMode:

PS C:\Users\WinVMAdmin\Desktop> Get-DscLocalConfigurationManager

 

 

AllowModuleOverwrite           : False

CertificateID                  :

ConfigurationID                :

ConfigurationMode              : ApplyAndMonitor

ConfigurationModeFrequencyMins : 30

Credential                     :

DebugMode                      : False

DownloadManagerCustomData      :

DownloadManagerName            :

LocalConfigurationManagerState : Ready

RebootNodeIfNeeded             : False

RefreshFrequencyMins           : 15

RefreshMode                    : PUSH

PSComputerName                 : 

 

To begin with, it is false (by default).

For the purpose of demonstration, we will be using the following PowerShell resource:

function Get-TargetResource
{
    param
    (
        [Parameter(Mandatory)]
        $onlyProperty
    )

    return @{onlyProperty = Get-Content -Path "$env:SystemDrive\OutputFromTestProviderDebugMode.txt"}
}

function Set-TargetResource
{
    param
    (
        [Parameter(Mandatory)]
        $onlyProperty
    )

    "1" | Out-File -FilePath "$env:SystemDrive\OutputFromTestProviderDebugMode.txt"
}

function Test-TargetResource
{
    param
    (
        [Parameter(Mandatory)]
        $onlyProperty
    )

    return $false
}

 

We author a configuration using the above resource (let us call it TestProviderDebugMode):

Configuration ConfigTestDebugMode

{

    Import-DscResource -Name TestProviderDebugMode

    Node localhost

    {

        TestProviderDebugMode test

        {

            onlyProperty = "blah"

        }

    }

}

ConfigTestDebugMode

 

If we read the contents of the file "$env:SystemDrive\OutputFromTestProviderDebugMode.txt", we see that it is 1. Now, let us update our resource code. For the purpose of demonstration, we will use the following script:

$newResourceOutput = Get-Random -Minimum 5 -Maximum 30

$content = @"
function Get-TargetResource
{
    param
    (
        [Parameter(Mandatory)]
        `$onlyProperty
    )

    return @{onlyProperty = Get-Content -Path "$env:SystemDrive\OutputFromTestProviderDebugMode.txt"}
}

function Set-TargetResource
{
    param
    (
        [Parameter(Mandatory)]
        `$onlyProperty
    )

    "$newResourceOutput" | Out-File -FilePath C:\OutputFromTestProviderDebugMode.txt
}

function Test-TargetResource
{
    param
    (
        [Parameter(Mandatory)]
        `$onlyProperty
    )

    return `$false
}
"@ | Out-File -FilePath "C:\Program Files\WindowsPowerShell\Modules\MyPowerShellModules\DSCResources\TestProviderDebugMode\TestProviderDebugMode.psm1"

 

 

This script generates a random number and updates the resource code accordingly, simulating the scenario where you need to make frequent updates to your resource and test them.

With the DebugMode set to false, the contents of the file “$env:SystemDrive\OutputFromTestProviderDebugMode.txt” are never changed.

Now let us set the DebugMode to true by having the following block in our configuration script:

LocalConfigurationManager

{

    DebugMode = $true

}

 

Now, if you run the above script again, you will see that the content of the file is different every time. (You can do a Get-DscConfiguration to check it). Here is what I got in two subsequent runs:

PS C:\Users\WinVMAdmin\Desktop> Get-DscConfiguration -CimSession (New-CimSession localhost)

 

onlyProperty                            PSComputerName                        

------------                            --------------                        

20                                      localhost                             

 

PS C:\Users\WinVMAdmin\Desktop> Get-DscConfiguration -CimSession (New-CimSession localhost)

 

onlyProperty                            PSComputerName                        

------------                            --------------                        

14                                      localhost                             

 

 

Happy configuring!

Abhik Chatterjee

Windows PowerShell Developer

 

Understanding Import-DscResource Keyword in Desired State Configuration


Desired State Configuration resources are used to model and change the state of different components of the system. In an earlier blog post, we discussed details about deploying and discovering those resources and introduced the Import-DscResource dynamic keyword. In this post, we will discuss the usage and internal workings of this dynamic keyword in more detail. The keyword is only available inside a configuration block. Its syntax looks like that of a PowerShell cmdlet, but it is not a cmdlet and behaves differently.

 

When you author a DSC configuration in the ISE, the PowerShell parser provides IntelliSense for resources and resource properties by automatically loading the resource definitions from the $pshome module path. Moreover, the same resource definitions are used during the compilation step (generation of the MOF document) to validate the entries of the configuration. For example, resource schema validation checks the following:

  •           Only properties defined in the schema are used
  •           The data type of each property is correct
  •           Key properties are specified
  •           No read-only properties are used
  •           Values conform to any defined value maps

 

Consider this configuration:

 

Configuration SchemaValidationInCorrectEnumValue
{
        WindowsFeature ROLE1
        {
               Name = "Telnet-Client"
               Ensure = "Invalid"
        }
}

 

Compiling it will result in error:

 

PSDesiredStateConfiguration\WindowsFeature: At least one of the values 'Invalid' is not supported or valid for property 'Ensure' on class 'WindowsFeature'. Please specify only supported values: Present, Absent.

 

This catches most errors at parse and compilation time instead of deferring them to run time (when the configuration is applied).

 

DSC provides a mechanism to add new custom DSC resources. These resources should be deployed outside of the $pshome module path, preferably under $env:SystemDrive\Program Files\WindowsPowerShell\Modules. For configurations that use custom DSC resources, the authoring and compilation steps need to know which additional resources should be loaded for validation – that’s where Import-DscResource comes into play. In the absence of an import mechanism, parsing and compilation could be very slow, because every entry in $env:PSModulePath would have to be searched for the module containing the required DSC resource. Import-DscResource is used to import both composite and PowerShell DSC resources.

 

Import-DscResource currently supports two non-positional parameters: Name and ModuleName.

 

-Name is the name of the resource to import. This has to be the class name used when defining the schema for the resource, not the friendly name. For example:

[ClassVersion("1.1.0"),FriendlyName("Website")]

class WebsiteResource : OMI_BaseResource

{

}

In the above example, the resource name is WebsiteResource, not the friendly name “Website”.

 

-Name can take any number of resources to import. If only the -Name parameter is used, the resources can belong to different PowerShell modules. For example:

Import-DscResource -Name MSFT_xADDomain, MSFT_xSmbShare

The above command loops through all modules installed in every path in $env:PSModulePath to find the ones containing resources whose names are specified as the value of the -Name parameter. Things to consider when using only the Name parameter:

1. It has performance implications, because it is a resource-intensive operation that can use a lot of memory depending on how many modules are installed on the machine.

2. It loads the first resource found with the given name, so if more than one resource with the same name is deployed on the machine, it can end up loading a different resource than the one desired.

The recommended approach is to also specify -ModuleName, as described below.

-ModuleName is the name of the module that contains the resources to be imported. This parameter can take either a string array of module names or module specification objects. For example:

Import-DscResource -ModuleName xActiveDirectory, xSmbShare

The above command imports all resources defined in the xActiveDirectory and xSmbShare modules, unless we also specify the -Name parameter to load only selected resources.

 

A module specification object is simply a hash table. The valid members are ModuleName, ModuleVersion and GUID.

 

Import-DscResource -ModuleName @{ModuleName="xActiveDirectory"},@{ModuleName="xSmbShare"}

 

The Import-DscResource command only looks for resources inside the specified modules when the ModuleName parameter is supplied, which avoids delays and performance issues during resource discovery. It also lets the user control exactly which version of a module to load when multiple versions of the same module are installed on the machine.

 

 

DSC Resource Versioning 

 

DSC resources have the same version as their containing module. The version is determined by the ModuleVersion property defined in the containing module’s .psd1 file. A DSC resource can also define its own .psd1 file with a ModuleVersion of its own, but the version from the containing module’s .psd1 is the one used to load resources.
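As a sketch, the relevant part of a containing module’s manifest looks like this. The GUID, author, and description below are hypothetical placeholders:

```powershell
# xActiveDirectory.psd1 (hypothetical example) -- the ModuleVersion here is
# the version applied to every DSC resource the module contains.
@{
    ModuleVersion = '2.0'
    GUID          = '00000000-0000-0000-0000-000000000000'
    Author        = 'Contoso'
    Description   = 'Example module containing DSC resources'
}
```

The GUID and ModuleVersion from this manifest are what a module specification object in Import-DscResource is matched against.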

 

Any update to a resource should result in an update to the containing module’s version, and the configuration author can select which version of the resource/module to load with a command like:

 

Import-DscResource -ModuleName @{ModuleName="xActiveDirectory";GUID="caedd141-8493-4af3-bda0-eef11e9ca2be";ModuleVersion="2.0"},@{ModuleName="xSmbShare";ModuleVersion="1.0"}

 

The above command will fail if it cannot find the exact module version for each module.

 

The module version number, along with the module name, is included in each resource instance in the output MOF generated during compilation. It looks like this:

 

instance of MSFT_xADDomain as $MSFT_xADDomain1ref
{
    ResourceID = "[xADDomain]FirstDS";
    ModuleName = "xActiveDirectory";
    ModuleVersion = "2.0";
}

 

This gives the LCM information about which version of the resource was used to generate the MOF; that is the version against which resource properties were validated during compilation, so the LCM attempts to find and load the same version at run time. If that version does not exist on the target node, the behavior differs between Push and Pull configuration modes. In Push mode, execution fails and an error is returned to the user pushing the configuration. In Pull mode, the LCM attempts to download the required module from the pull server and install it on the target node before execution; if it cannot find that version on the pull server, execution fails.

 

 

Support for Wildcards and Partial Parameter Names

 

Import-DscResource also supports partial parameter names:

 

Import-DscResource -N MSFT_xADDomain -Mod "xActiveDirectory"

 

We can also use wildcards when specifying resource names:

 

Import-DscResource -Name * -ModuleName "xActiveDirectory"

 

Import-DscResource -Name MSFT* -ModuleName "xActiveDirectory"

 

Wildcards are not supported for the -ModuleName parameter, because the value could map to multiple module names and would be error prone. The command below will result in an error during configuration compilation.

 

Import-DscResource -Name * -ModuleName UserConfigPro*

 

Specifying multiple values for both resource names and module names in the same command is also not supported, because it could lead to non-deterministic behavior about which resource to load from which module if the same resource exists in multiple modules. The command below will also result in an error during compilation.

 

Import-DscResource -Name UserConfigProvider*,TestLogger1 -ModuleName UserConfigProv,PsModuleForTestLogger

 

Fayza Sultan  

PowerShell Team

 

Windows Management Framework 5.0 Preview May 2014 is now available


We’re excited to announce that Windows Management Framework 5.0 Preview May 2014, a new package containing new and exciting management technologies, is now available on the Download Center!

This version of the preview includes everything in the Windows Management Framework 5.0 Preview that was released in April of 2014 and a new module called PowerShellGet.

 

This new package installs exclusively on Windows 8.1 and Windows Server 2012 R2.  If WMF 5.0 Preview (April 2014) is already installed on the machine, it must be uninstalled before installing WMF 5.0 Preview May 2014.

 

PowerShellGet

PowerShellGet is a new way to discover, install, and update PowerShell Modules.  New in WMF 5.0 Preview May 2014, PowerShellGet contains a set of cmdlets that enable users to interact with an online module gallery.

 

Instead of having to search through CodePlex, GitHub, various blogs, and TechNet Script Center for all the PowerShell Modules you want, you can use Find-Module to search through an online Gallery:

 

PS C:\windows\system32> Find-Module

 

Version         Name                                     DateUpdated               Description

-------         ----                                     -----------               -----------

1.1.0.0         AutoVars                                 5/12/2014 2:37:19 PM      Allows f...

6.0             ConversionModule                         5/6/2014 2:34:14 PM       a module...

1.0             EWS                                      5/11/2014 11:20:17 AM     Module t...

1.0             GenericMethods                           5/8/2014 10:53:45 AM      The Invo...

1.2.0.0         HardwareManagement                       5/13/2014 1:28:11 PM      Out-of-b...

1.1             IEFavorites                              5/13/2014 7:30:05 PM      Used to ...

1.0             InlineMailAttachments                    5/8/2014 10:33:11 AM      This mod...

0.1.0.0         ISEGit                                   5/11/2014 11:48:48 AM     Module t...

3.0             LocalUserManagement                      5/12/2014 2:17:19 PM      a module...

1.0.0.5         LockObject                               5/13/2014 5:41:05 PM      Lock-Obj...

1.0.0.0         MyDefaults                               5/7/2014 12:47:21 PM      Sets and...

1.4             myModule                                 5/8/2014 10:18:33 AM      My Power...

1.0             PoshInternals                            5/7/2014 5:56:43 AM       Collecti...

2.2.1           PoshWSUS                                 5/6/2014 7:48:49 PM       PowerShe...

1.2             PowerShellCookbook                       5/14/2014 11:02:23 AM     Sample s...

                         

 

Installing a module from the gallery is a simple one-liner.  You won’t need to understand (or explain) $env:PSModulePath to your coworkers again:

 

C:\Windows\System32\WindowsPowerShell\v1.0> Install-Module PowerShellCookbook -Scope CurrentUser -Verbose

VERBOSE: The specified module will be installed in 'C:\Users\joslack\Documents\WindowsPowerShell\Modules'.

VERBOSE: GET https://go.microsoft.com/fwlink/?LinkID=397631&clcid=0x409 with 0-byte payload

VERBOSE: received 160-byte response of content type text/html; charset=utf-8

VERBOSE: Found module 'PowerShellCookbook' with version '1.2'.

VERBOSE: Loading module from path 'C:\Users\joslack\AppData\Local\Temp\411069568\PowerShellCookbook\PowerShellCookbook.psm1'.

VERBOSE: Performing the operation "Install-Module" on target "Version '1.2' of module 'PowerShellCookbook'".

VERBOSE: Module 'PowerShellCookbook' was installed successfully.

 

Updating your modules is even easier – just run the Update-Module command.  Checking for updates through your web browser will be a thing of the past.  You can check which modules will be updated with the -Whatif parameter:

 

C:\Windows\System32\WindowsPowerShell\v1.0> Update-Module -WhatIf

What if: Performing the operation "Update-Module" on target "Version '1.0.0.6' of module 'PSReadLine', updating to version '1.0.0.8'".

What if: Performing the operation "Update-Module" on target "Version '1.0' of module 'xDatabase', updating to version '1.1'".

What if: Performing the operation "Update-Module" on target "Version '0.2.7' of module 'xJea', updating to version '0.2.16'".

 

To sum up, you can:

·         Discover modules in the gallery using the Find-Module command

·         Install modules from the gallery using the Install-Module command

·         Update installed modules using the Update-Module command

 

For more details, look through the PowerShellGet help:

 

C:\Windows\System32\WindowsPowerShell\v1.0> Update-Help -Module PowerShellGet

 

C:\Windows\System32\WindowsPowerShell\v1.0> Get-Help Update-Module

 

NAME

    Update-Module

   

SYNOPSIS

    Downloads and installs the newest version of specified modules from an online gallery to the local computer.

   

    

SYNTAX

    Update-Module [[-Name] <String[]>] [-RequiredVersion <Version>] [-Force] [-WhatIf] [-Confirm] [<CommonParameters>]

    

 

Other Improvements

 

For additional information about the NetworkSwitch and OneGet cmdlets introduced in the April Preview, we recommend checking Jeffrey Snover’s blog post about the WMF 5.0 Preview released in April.

 

We’re excited to provide this new iteration of WMF to you all, and we’re happy to hear any and all feedback that you may have.

As usual, our main avenue of feedback is Connect: https://connect.microsoft.com/PowerShell/Feedback

 

On behalf of everyone contributing to WMF, we hope you enjoy this Preview release.

 

 

John Lisco

PowerShell Program Manager

 

 

Announcing Windows PowerShell Desired State Configuration for Linux


For those of you fortunate enough to be at TechEd North America last week, you might have seen Jeffrey Snover announcing Windows PowerShell Desired State Configuration (DSC) for Linux!  (If you missed that session, you can watch a replay of it online).  We are excited to announce the initial availability of this feature with the release of a CTP of DSC for Linux at GitHub.com. With DSC for Linux, you can now use the same platform to concurrently manage the configuration of your Windows and Linux computers.  We are actively developing DSC for Linux and will continue to improve and extend it going forward. 

Bringing Desired State Configuration to Linux is another step in our commitment to standards-based management for the heterogeneous data center, and it empowers admins and developers to effectively and efficiently manage configuration state - irrespective of the managed computer’s operating system. The DSC for Linux implementation uses the open-source Open Management Infrastructure (OMI) as a CIM server, and the standard WS-Management protocol for remote communication.  As such, it is an open and extensible configuration management platform for managing the entire range of fabric and tenant elements of a datacenter. Unlike other configuration management efforts, DSC is focused on providing an open platform for tool vendors to build on top of. This approach allows tool vendors to maximize their efforts on differentiating value-add features and ensures that customers can choose the tool they want and be confident that it will manage all of their elements.  This collaborative, layering approach was first demonstrated by Chef in a technology demo during TechEd 2012.  The upcoming version of Chef with DSC support was demonstrated during the PowerShell Desired State Configuration and DevOps in Microsoft Azure talk at TechEd 2014.

In this initial CTP release, we have implemented the following Resource providers, which you will find to be analogous to Windows “Built-In” DSC Resources:

  • nxFile – manages files and directory state
  • nxScript – runs script blocks on target nodes
  • nxUser – manages Linux users
  • nxGroup – manages Linux groups
  • nxService – manages Linux services (System-V, Upstart, SystemD)

Note that only “push” mode is currently available for Linux in this release.  For more information on Push vs Pull modes, see this explanation.
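As a sketch, pushing a configuration that uses one of the nx resources to a Linux node looks like the following. The node name and credentials are hypothetical, and the exact CIM session options may vary across CTP builds:

```powershell
# Compile a configuration using the nxFile resource, then push the resulting
# MOF to the Linux node over WS-Management (OMI listens on HTTPS port 5986).
Configuration LinuxFileDemo
{
    Import-DscResource -Module nx

    Node "linuxhost.contoso.com"
    {
        nxFile HelloWorld
        {
            DestinationPath = "/tmp/helloworld"
            Contents        = "Hello, DSC for Linux!"
            Ensure          = "Present"
            Type            = "File"
        }
    }
}

LinuxFileDemo -OutputPath C:\Dsc\LinuxFileDemo

# Connect to the OMI CIM server on the Linux node and push the configuration.
$options = New-CimSessionOption -UseSsl -SkipCACheck -SkipCNCheck -SkipRevocationCheck
$session = New-CimSession -ComputerName "linuxhost.contoso.com" `
    -Credential (Get-Credential root) -Port 5986 -Authentication Basic `
    -SessionOption $options
Start-DscConfiguration -Path C:\Dsc\LinuxFileDemo -CimSession $session -Wait -Verbose
```

The same Start-DscConfiguration cmdlet used for Windows nodes drives the Linux node; only the transport details (OMI over HTTPS with basic authentication) differ.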

The CTP release for PowerShell Desired State Configuration for Linux is open sourced.  For a walkthrough of getting started with DSC for Linux, head over to the Building Clouds blog.

 

 

Setting up an Internal PowerShellGet Repository


At TechEd, we announced and released an early version of PowerShellGet: a package manager for PowerShell modules.  The response was positive, and many people asked the same type of question:

 

“Can I set up my own internal repository for PowerShellGet?”

 

Many enterprise-oriented shops want the ability to create private module repositories (without sharing them with the world), and many security-minded folks want to curate their own module collections.

 

The short answer to this question is: yes!  All you need to do is create your own NuGet Server and redefine a couple of PowerShellGet variables.  Detailed instructions are below.

Diversion: Why NuGet?

You might be asking yourself: “Why does PowerShellGet work against NuGet Feeds?” When designing PowerShellGet, we didn’t want to reinvent the wheel.  Making a good package manager from scratch can be hard.  Fortunately, Microsoft already has a pretty awesome package management solution in NuGet (http://www.nuget.org/).  So, we decided to build PowerShellGet on top of NuGet.

 

This way, we can craft a PowerShell-specific package management experience without needing to develop things like the protocol from scratch.

What does this mean for you?

This means that PowerShellGet can work against any NuGet repository out of the box.  Note: Do not publish PowerShell modules to NuGet.org. 


The only things you need to do to make this happen are:

  1. Create a NuGet repository.
  2. Tell PowerShellGet to work against it.

Creating a NuGet Repository

There are many ways to set up a working NuGet repository.  Here are a couple of options:

  1. Follow the instructions in NuGet’s documentation:
    1. Setting up a Local Gallery
    2. Branding the NuGet Gallery
  2. Let a company like MyGet host a NuGet feed for you (https://www.myget.org/).
Update: Commenters have also suggested the following links for help with setting up an internal repository:

For the purposes of this blog, I’m going to use MyGet to create a NuGet feed.  I ended up with a feed at this URL: https://www.myget.org/F/powershellgetdemo/.

Tell PowerShellGet to work with your Repository

For this release, PowerShellGet uses two variables to determine which gallery to work against: $PSGallerySourceUri and $PSGalleryPublishUri.  Here’s what you need to do to generate these variables for your feed:

  1. Append “/api/v2” to your feed URL for $PSGallerySourceUri (e.g. https://www.myget.org/F/powershellgetdemo/api/v2)
  2. Append “/api/v2/package” to your feed URL for $PSGalleryPublishUri (e.g. https://www.myget.org/F/powershellgetdemo/api/v2/package)

 

To make PowerShellGet work against these variables, you need to first import the PSGet module, and then override the values before using PSGet commands. 

 

Here’s an example:

C:\Windows\System32\WindowsPowerShell\v1.0> Import-Module PowerShellGet

 

C:\Windows\System32\WindowsPowerShell\v1.0> $PSGalleryPublishUri = 'https://www.myget.org/F/powershellgetdemo/api/v2/package'

 

C:\Windows\System32\WindowsPowerShell\v1.0> $PSGallerySourceUri = 'https://www.myget.org/F/powershellgetdemo/api/v2'

 

Success!

You can successfully publish and install modules from your new NuGet gallery.

C:\Windows\System32\WindowsPowerShell\v1.0> Publish-Module -Name xDscResourceDesigner -NuGetApiKey wouldnt-you-like-to-know…

 

C:\Windows\System32\WindowsPowerShell\v1.0> Find-Module xDscResourceDesigner

 

Version         Name                                     DateUpdated               Description     

-------         ----                                     -----------               -----------     

1.1.1           xDSCResourceDesigner                     5/15/2014 12:51:24 AM     The xDscResour...

 

Looking forward

As always, if you have feedback about PowerShellGet or any other PowerShell feature, please let us know: http://connect.microsoft.com/PowerShell.

 

John Slack

Program Manager

PowerShell Team

 
