
Automation–The New World of Tenant Provisioning with Windows Azure Pack (Part 5): Working with the SQL Server resource provider, and the ITIL dilemma


Hello Readers!

Yes, it’s only been a couple of minutes since Charles published parts 1 to 4 of this series, and maybe you thought we would rest a bit and take a breath before publishing new content… Well, not really!

Today, we also have part 5 of the series available for you to read, and it brings a new set of sample scripts, this time in the context of working with the SQL Server resource provider in WAP.

In this post, we will demonstrate the following:

  1. How to request an administrator token from the WAP admin API. Charles already covered this in post #2, but this time we will also include a few variations you may need when you use ADFS or signed certificates, for example. Again, requesting a token is important for many tasks, so the examples here are not limited to SQL Server database activities. This post was just an excuse to give more details about token requests, and you could use the corresponding token for other resource providers (VM Clouds, MySQL Servers, etc.)
  2. How to test our token by displaying existing plans, subscriptions and users. One of the benefits is that this will display some subscription IDs that we will be able to reuse in subsequent scripts
  3. Display the names of existing SQL Server databases for a specific user subscription
  4. Display details of a specific existing SQL Server database in a user’s subscription
  5. List all databases in all subscriptions: using our administrative token, we will cycle through all subscriptions and display database names, if any
  6. Create a new database in a subscription, using our administrative token, on behalf of the user

And as a bonus, I will include some user and subscription samples at the end. But these will be quite short and more of a teaser, as Charles is preparing a more in-depth series on this topic.

What are the scenarios?

Having the ability to run scripts as both admin and tenant is critical for automation, especially in the context of enterprises. A few examples:

- Doing actions on behalf of a tenant, after approvals in an ITSM solution

- Collecting data available in the admin and/or tenant APIs, to centralize/reconcile in a CMDB or other system

Now, there is a fine line between controlled processes as required by ITIL, and the expected and promised agility of a cloud… This is where a balance has to be found. At the end of this post, we start covering this topic by introducing what we call the “ITIL dilemma” (see section “Bonus – Creating users and adding them to plans”)


Download

Yes, the scripts from this page are available as a download!


Note: This is a single script, intended to be run sample-by-sample using selection mode (F8 in PowerShell ISE). If you run the script end-to-end right away, it will likely fail because some variables need to be edited first, like the subscription IDs. Please refer to the samples below and to the header of the script to understand what needs to be modified. By default, write actions are commented out anyway, to ensure you do not mistakenly write to your environment.


Introduction: The SQL Server resource provider in Windows Azure Pack

The SQL Server resource provider in WAP allows an administrator/service provider to delegate to tenants the ability to create/resize/delete SQL Server databases. It’s a very easy and flexible way to provide databases that could be hosted on standalone servers, or using SQL Server AlwaysOn. These databases can then be used with the Web Sites or Virtual Machines from that tenant, for instance (there are many scenarios, depending on how network routing and isolation is designed in your WAP environment). Just like with VM Clouds, the SQL Server resource provider relies on a shared fabric (this time made up of SQL Server servers – optionally categorized in groups – whereas VM Clouds rely on a fabric of Hyper-V hosts surfaced through the Virtual Machine Manager clouds). You can read more about the SQL Server resource provider here, and we will have a more detailed blog post on this very soon.


Working with the SQL Server resource provider from an API perspective

This is very similar to VM Clouds, which have been the main focus of parts 1 to 4 of this series. We need to authenticate with the WAP APIs (admin or tenant), and can then execute actions using the WAP cmdlets (from an admin standpoint) or through the tenant API web service (from a tenant standpoint).

The foundation

We’ll need the WAP admin API PowerShell module loaded, and here are two variables we are setting and will reuse for all these scripts. They correspond to the endpoints for the tenant and admin APIs.

Import-Module MgmtSvcAdmin
$TenantUri = "https://wap01.contoso.com:30005"
$AdminUri = "https://wap01.contoso.com:30004"

Note: You may need to run these samples from an elevated (administrator/privileged) PowerShell session.
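If you want to check those two prerequisites up front, here is a small optional snippet (not part of the original download) that warns if the MgmtSvcAdmin module is missing or the session is not elevated:

# Optional sanity checks (not in the original script)
# Warn if the WAP admin API PowerShell module is not available on this machine
if (-not (Get-Module -ListAvailable -Name MgmtSvcAdmin)) {
    Write-Warning "The MgmtSvcAdmin module was not found. Run these samples where the WAP admin API PowerShell module is installed."
}
# Warn if the session is not elevated, since some samples may require it
$CurrentIdentity = [Security.Principal.WindowsIdentity]::GetCurrent()
$Principal = New-Object Security.Principal.WindowsPrincipal($CurrentIdentity)
if (-not $Principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Write-Warning "This PowerShell session is not elevated. Some samples may fail."
}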

First, let’s request a token

Once again, this has been previously covered by Charles, but let’s look at the variations you may face in different environments. These variations may be due to the following:

- Whether you use self-signed certificates, or certificates signed by an approved certification authority

- Whether you have set up ADFS or not for the WAP admin interfaces

 

Requesting a token without ADFS, with self-signed certificates:

$Token = Get-MgmtSvcToken -Type Windows -AuthenticationSite "https://wap01.contoso.com:30072" -DisableCertificateValidation -ClientRealm "http://azureservices/AdminSite"

The important part here is the “-DisableCertificateValidation” switch, but it’s not enough: you will also need to add this to your script before continuing:

add-type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    
    public class NoSSLCheckPolicy : ICertificatePolicy {
        public NoSSLCheckPolicy() {}
        public bool CheckValidationResult(
            ServicePoint sPoint, X509Certificate cert,
            WebRequest wRequest, int certProb) {
            return true;
        }
    }
"@

[System.Net.ServicePointManager]::CertificatePolicy = new-object NoSSLCheckPolicy

Note: Be aware that this snippet disables SSL certificate validation for the rest of your PowerShell session. Of course, having “well-signed” certificates is recommended in production…
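If you prefer to limit the impact of that change to just these samples, one option (a small sketch, not part of the original script) is to save the current policy before overriding it, and restore it when you are done:

# Save the current certificate policy before overriding it with the NoSSLCheckPolicy type compiled above
$OriginalCertPolicy = [System.Net.ServicePointManager]::CertificatePolicy
[System.Net.ServicePointManager]::CertificatePolicy = New-Object NoSSLCheckPolicy

# ... run the samples that require certificate validation to be disabled ...

# Restore the original policy afterwards
[System.Net.ServicePointManager]::CertificatePolicy = $OriginalCertPolicy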

Requesting a token without ADFS, with “well-signed” certificates:

$Token = Get-MgmtSvcToken -Type Windows -AuthenticationSite "https://wap01.ad.corp.local:30072" -ClientRealm "http://azureservices/AdminSite"

Requesting a token with ADFS, with “well-signed” certificates:

$Token = Get-AdfsToken -domain 'contoso.com' -username 'brunosa' -password 'Pass@word1' -adfsAddress 'https://adfs01.contoso.com' -clientRealm 'http://azureservices/AdminSite'

OK, I am cheating a bit here, as “Get-AdfsToken” is actually calling a function defined in the script. But this function is provided in the downloadable script sample. It is adapted from the samples you get with any Windows Azure Pack installation – thanks to Shri for his help on this!


 

When we have a token, displaying it should look like this:

[Screenshot: the token string displayed in the PowerShell console]
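If you just want to confirm that the variable is populated without pasting the whole string (the token is returned as one long encoded string here), a quick check like this works:

# Display only the beginning of the token and its total length
if ($Token) {
    $Token.Substring(0, [Math]::Min(80, $Token.Length))
    "Token length: $($Token.Length) characters"
}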

Testing the token on the admin side

To do this, let’s display existing plans, subscriptions and users:

Get-MgmtSvcPlan -AdminUri $AdminUri -Token $Token | select DisplayName, Id, servicequotas | ft
Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | select subscriptionID, AccountAdminLiveEmailId, PlanId, state | ft

Output:

[Screenshot: plans and subscriptions listed in the PowerShell console]

Using the token to list databases for a specific tenant, with the tenant API

This is just a matter of crafting the expected web request and using our token as part of the headers – note that the subscription ID can either be found in the WAP admin portal… or in the output of the previous sample.

$SubscriptionID = "afe457ae-f4d9-46e9-81a2-67c0654bf668"
$UserID = (Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.SubscriptionId -eq $SubscriptionID}).AccountAdminLiveEmailId
$Headers = @{Authorization = "Bearer $Token"
        "x-ms-principal-id" = $UserID}
$FullUri = $TenantUri + "/" + $SubscriptionID + "//services/sqlservers/databases/"
(Invoke-RestMethod -Uri $FullUri -Headers $Headers).Name

Here is the output in PowerShell, and the view in the tenant portal:

[Screenshot: database names returned in the PowerShell console]

[Screenshot: the same databases shown in the tenant portal]

If we had wanted to query the VM Role Gallery Items available through that subscription, the last two lines to use would have been these, with a different URI:

$FullUri = $TenantUri + "/" + $SubscriptionID + "//Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03"
(Invoke-RestMethod -Uri $FullUri -Headers $Headers).Content.Properties.Name

As you can see, working with a different resource provider only requires understanding the namespace for that provider. And for the resource providers shipping out of the box, MSDN is your friend, such as this link for the SQL Server resource provider.
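To make that point concrete, here is a small hypothetical helper (not part of the downloadable script) that builds a tenant API URI from a subscription ID and a resource provider path, reproducing the URI pattern used in the samples above:

# Hypothetical helper: build a tenant API URI for any resource provider namespace
function Get-TenantApiUri {
    param(
        [string]$TenantUri,
        [string]$SubscriptionID,
        [string]$ResourcePath
    )
    return $TenantUri + "/" + $SubscriptionID + "//" + $ResourcePath
}

# SQL Server databases (same URI as used earlier)
Get-TenantApiUri -TenantUri $TenantUri -SubscriptionID $SubscriptionID -ResourcePath 'services/sqlservers/databases/'
# VM Role gallery items
Get-TenantApiUri -TenantUri $TenantUri -SubscriptionID $SubscriptionID -ResourcePath 'Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03'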

Displaying details for a specific database in a subscription

The script would look like this – it uses the URI from the previous sample to automatically display the details of the first database that was found, but you could replace $DBName with any database name that you know exists in the subscription. $SubscriptionID and $Headers also come from the previous sample (both samples are grouped together in the downloadable script).

# Query the database list once, then pick the first database name found
$Databases = Invoke-RestMethod -Uri $FullUri -Headers $Headers
If ($Databases.Count -eq 1)
    {$DBName = $Databases[0].Name}
If ($Databases.Count -gt 1)
    {$DBName = $Databases.Name[0]}
$DBName
If ($DBName)
    {
    $FullUri = $TenantUri + "/" + $SubscriptionID + "//services/sqlservers/databases/" + $DBName + "/"
    (Invoke-RestMethod -Uri $FullUri -Headers $Headers)
    }

And the output gives us:

[Screenshot: details of the database returned by the tenant API]

Listing all databases in all subscriptions

The sample below uses our token to cycle through all subscriptions and, when the SQL Server resource provider is part of the associated plan, it queries to see if databases are already created. If yes, it lists them.

$subscriptions = Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.State -eq "active"} | select subscriptionID, AccountAdminLiveEmailId, PlanId
Foreach ($subscription in $subscriptions)
    {
    $FullUri = $TenantUri + "/" + $subscription.subscriptionID + "//services/sqlservers/databases/"
    $Headers = @{Authorization = "Bearer $Token"
        "x-ms-principal-id" = $subscription.AccountAdminLiveEmailId}

    If ((Get-MgmtSvcPlan -AdminUri $AdminUri -Token $Token | where-object {$_.Id -eq $subscription.planID}).servicequotas.servicename -contains "sqlservers")
        {write-host "SQL Server Databases is part of subscription " $subscription.subscriptionID " . Database(s) found : " (Invoke-RestMethod -Uri $FullUri -Headers $Headers).Name}
        else {write-host "SQL Server Databases is not part of subscription " $subscription.subscriptionID}
    }

Output:

[Screenshot: databases listed per subscription]

Creating a database in a subscription, on behalf of a user

Reading data is great, but creating may be important once in a while, right?

This script creates a database in the specified subscription, by preparing the JSON body and making the right web request, this time with a “POST” method, versus the implied “GET” of the previous examples. Notice how we have to specify the “application/json” content type too.

$DatabaseName = "DB9999"
$SQLserverGroup = "Production"
$BaseSize = "1000"
$MaxSize = "1000"
$DBOwnerName = "User9999"
$DBOwnerPassword = "Pass@word1"
$BodyHashTable = @{
    Name = "{0}" -f $DatabaseName
    Edition = $SQLserverGroup
    BaseSizeMB = $BaseSize
    MaxSizeMB = $MaxSize
    Collation = "SQL_Latin1_General_CP1_CI_AS"
    Iscontained = "false"
    CreationDate = "0001-01-01T00:00:00+00:00"
    Status = "0"
    AdminLogon = $DBOwnerName
    Password = $DBOwnerPassword
    }
$BodyJSON = ConvertTo-Json $BodyHashTable

$SubscriptionID = "afe457ae-f4d9-46e9-81a2-67c0654bf668"
$UserID = (Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.SubscriptionId -eq $SubscriptionID}).AccountAdminLiveEmailId
$Headers = @{Authorization = "Bearer $Token"
        "x-ms-principal-id" = $UserID}
$FullUri = $TenantUri + "/" + $SubscriptionID + "//services/sqlservers/databases/"
Invoke-RestMethod -Uri $FullUri -Headers $Headers -Body $BodyJSON -ContentType 'application/json; charset=utf8' -Method POST

Note: You do not need to use the “-f” format syntax in the hash table. This is just to show that you can work with values in many different ways, as long as you do not forget to convert the hash table to JSON afterwards (using ConvertTo-Json).
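For instance, both of these produce the same value for the Name property:

$DatabaseName = "DB9999"
# Format operator, as used in the hash table above
$NameWithFormat = "{0}" -f $DatabaseName
# Plain assignment works just as well
$NameDirect = $DatabaseName
$NameWithFormat -eq $NameDirect   # Returns True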

How did we know what properties to insert in the hash table?

Well, as said before, MSDN is your friend, and in particular this link: it provides a sample JSON payload to insert in the request body. You could actually use this in the previous code, to insert the JSON body directly as a PowerShell here-string:

$BodyJSON = @"
{
  "Name": "DB9999",
  "SqlServerName": null,
  "SqlServerId": null,
  "ConnectionString": null,
  "Edition": "Production",
  "BaseSizeMB": 1000,
  "MaxSizeMB": 1000,
  "Collation": "SQL_Latin1_General_CP1_CI_AS",
  "IsContained": false,
  "CreationDate": "0001-01-01T00:00:00+00:00",
  "Status": 0,
  "SelfLink": null,
  "Quota": null,
  "AdminLogon": "User9999",
  "Password": "Pass@word1",
  "AccountAdminId": null
}
"@

Using a hash table has the benefit of making it easier to insert variables. When going from the here-string to the hash table, we also removed the “null” properties, keeping only the others. The subscription property was not added either, since we are already executing the request in the subscription’s context.
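Note that a double-quoted here-string expands variables too, so a middle-ground option is to keep the MSDN-style payload and substitute just the values you care about. A minimal sketch (trimmed to a few properties for illustration, not necessarily a complete creation payload):

$DatabaseName = "DB9999"
$SQLserverGroup = "Production"
# Variables are expanded inside a double-quoted here-string
$BodyJSON = @"
{
  "Name": "$DatabaseName",
  "Edition": "$SQLserverGroup",
  "Collation": "SQL_Latin1_General_CP1_CI_AS",
  "IsContained": false
}
"@
$BodyJSON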

Output of a database creation:

[Screenshot: response returned for the database creation]


Bonus – Creating users and adding them to plans

In addition to the previous samples, here are three more very simple scripts. We had them handy, so we figured they were worth adding here as a bonus.

- Listing all users

Get-MgmtSvcUser -AdminUri $AdminUri -Token $Token

- Creating a Windows Azure Pack user

$NewUser = "testuserPS@contoso.com"
Add-MgmtSvcUser -AdminUri $AdminUri -Token $token -Name $NewUser -email $NewUser -State 'Active'

- Adding a WAP user to a plan (see also here)

$NewUser = "testuserPS@contoso.com"
$PlanName = "Contoso Gold Plan"
$Plan = Get-MgmtSvcPlan -AdminUri $AdminUri -Token $Token | where-object {$_.DisplayName -eq $PlanName}
Add-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token -PlanId $Plan.Id -AccountAdminLiveEmailId $NewUser -AccountAdminLivePuid $NewUser -FriendlyName 'MySubscription01'

Output for the new user and new subscription creations:

[Screenshot: output of the user and subscription creation]
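If you want to double-check the results from the same session, the cmdlets used earlier can confirm the new user and its subscription (assuming the user objects expose a Name property matching the -Name value used above):

# Confirm the new user exists
Get-MgmtSvcUser -AdminUri $AdminUri -Token $Token | Where-Object {$_.Name -eq $NewUser}
# Confirm the new subscription was created for that user
Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.AccountAdminLiveEmailId -eq $NewUser} | select subscriptionID, PlanId, state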

This is just a teaser, since Charles will be creating some posts on this topic soon, with more depth and more scenarios. For example, creating the user in WAP is only one of two steps. The second step is to make sure the user actually exists in the authentication system. With ADFS, there is no need for this second step (the user is already an Active Directory user, so what you did earlier is enough in an ADFS-enabled environment). But without ADFS, the user needs to be created in the default ASP.NET membership provider too.

 

What are the scenarios for these three examples – and let’s introduce the “ITIL dilemma”

When talking with enterprises, there is often an IT Service Management (ITSM) solution already used to manage requests and approvals, often through a “service catalog”. Using the samples from this blog post, it is definitely possible to approve new database (or virtual machine) requests in the ITSM solution, and trigger API calls to actually deploy the resources on behalf of the user. Now, as organizations try to embrace the “IT as a Service” frame of mind, a balance has to be found, so that the traceability and approvals required by ITIL processes do not cancel out the agility and economics of the cloud. This is where these bonus examples might be interesting: if your processes require it, you could provide controlled/approved access to private plans in WAP, and then let users do whatever they want inside their “delegated sandboxes” (with optional chargeback).

The scenario would be like this:

1. The WAP administrator creates plans but keeps some of them “private”

2. A user goes to a well-known service catalog (System Center Service Manager comes to mind, but you may be using another solution today too)

3. The “private” plans are listed in the service catalog. They could have been added to the CMDB through automation, by listing them using the samples here (assuming your ITSM solution allows you to create CMDB items programmatically – hint: Service Manager does)

4. The approvals occur within the realm of the ITSM solution, as for any other enterprise process

5. Once approved, automation kicks in. The automation process (Service Management Automation, Orchestrator, or just a PowerShell script) could either monitor approved requests, or be called externally by the ITSM solution. This depends on your design, and on the ITSM solution and automation engine being used

6. The Runbook or script can request an admin token, create the user (if not there already), add them to the plan, and notify the user by sending them the URL to the WAP portal

7. Optionally, another Runbook could cycle through the resources created (like all databases in all subscriptions) and add/update them in the CMDB. This is a way to reconcile items potentially created outside of the ITSM process, if that is technically possible in your implementation – a starting point is sketched below
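As a starting point for that last step, here is a hedged sketch (not part of the downloadable script) that reuses the “all databases in all subscriptions” loop from earlier and shapes the results into objects you could hand to your CMDB import mechanism, for example via Export-Csv:

# Collect databases across all active subscriptions into CMDB-friendly objects
$CmdbItems = @()
$subscriptions = Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.State -eq "active"}
Foreach ($subscription in $subscriptions)
    {
    # Skip subscriptions whose plan does not include the SQL Server resource provider, as in the earlier sample
    If ((Get-MgmtSvcPlan -AdminUri $AdminUri -Token $Token | Where-Object {$_.Id -eq $subscription.PlanId}).servicequotas.servicename -notcontains "sqlservers") {continue}
    $Headers = @{Authorization = "Bearer $Token"
        "x-ms-principal-id" = $subscription.AccountAdminLiveEmailId}
    $FullUri = $TenantUri + "/" + $subscription.subscriptionID + "//services/sqlservers/databases/"
    Foreach ($Database in (Invoke-RestMethod -Uri $FullUri -Headers $Headers))
        {
        $CmdbItems += [PSCustomObject]@{
            SubscriptionID = $subscription.subscriptionID
            Owner          = $subscription.AccountAdminLiveEmailId
            DatabaseName   = $Database.Name
            }
        }
    }
# Hand the collected items to your CMDB import process; a flat CSV is one simple option (example path)
$CmdbItems | Export-Csv -Path "C:\Temp\WAPDatabases.csv" -NoTypeInformation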


What’s next?

Well, maybe this time we’ll take a breather. Parts 1 to 5 are now out, and we should be back in a few weeks with part 6, hopefully with something around Value Added Services/Offerings and Ongoing Automated Maintenance/Operations. Thanks for reading!


Blog Series Table of Contents

  1. Part 1: Intro & TOC
  2. Part 2: Automated Deployment of Tenant Network and Identity Workload (Isolated Tenant Virtual Network & Active Directory VM Role; from the Service Admin Persona)
  3. Part 3: Automated Deployment of the Identity Workload as a Tenant Admin (Active Directory VM Role; from the Tenant Admin Persona)
  4. Part 4: Automated Deployment of Tenant Workloads (Lync, SharePoint, and Exchange) (Lync, SharePoint, and Exchange VM Roles; from both Service Admin and Tenant Admin Personas)
  5. Part 5: Working with the SQL Server resource provider, and the ITIL dilemma (by Bruno Saille)
  6. Part 6: TBD (We hope to have something around: Value Added Services/Offerings and Ongoing Automated Maintenance/Operations)

