Channel: TechNet Blogs

“Modernizing” Your Data Warehouse with Microsoft


Data warehousing technology began as a framework to better manage, understand, and capitalize on data generated by the business, and it worked extremely well for many years. This is a space Microsoft knows well, having shipped warehousing capabilities in SQL Server since 1995.

However, several forces are working to stretch the traditional data warehouse. Data volume is expanding tenfold every five years. Even the most robust SMP warehouse will require costly forklift upgrades to a larger and more expensive hardware footprint to keep up with that growth. Companies are using real-time data to optimize their businesses as well as to engage in dynamic, event-driven processes. The variety of new data types is proliferating, with over 85 percent of new data coming from non-relational sources such as logs, mobile, social, RFID, and devices.

Modernizing your data warehouse with new technologies can help you meet the needs of today’s enterprise: connecting data of any volume and type to business decision makers through agile, familiar BI. This was validated by The Data Warehousing Institute (TDWI), which published a checklist for enabling the modern data warehouse. At a high level, your new warehouse must be able to handle:

  • Data of All Volumes: The modern data warehouse scales from terabytes to multi-petabyte volumes across all data types – relational and non-relational. As an example, Virginia Tech was able to crunch data from DNA sequencers (growing over 15 petabytes a year) to do cancer research.
  • Real-Time Performance: The ability to work with data in real time, keeping pace with increasing demands without losing performance. As an example, MEC was able to bring queries of customer online visitation metrics from four hours down to minutes.
  • Any Data Type: The ability to seamlessly integrate data of any type, from traditional relational sources to new non-relational sources. As an example, Direct Edge was able to join non-relational stock exchange messaging with its relational stock ticker data.

Microsoft has a comprehensive solution to modernize your data warehouse across software, appliance, and cloud for this new world of data. We invite you to learn more about our offerings:

 

|                              | Software          | Appliance                                                    | Cloud                                   |
|------------------------------|-------------------|--------------------------------------------------------------|-----------------------------------------|
| Relational Data              | SQL Server 2014   | SQL Server Fast Track; Parallel Data Warehouse with PolyBase | SQL Server for DW in WA Virtual Machine |
| Non-relational Data (Hadoop) | Hadoop on Windows | Parallel Data Warehouse with PolyBase                        | Windows Azure HDInsight                 |

 

Product Offerings:

  • SQL Server 2014: Microsoft SQL Server 2014 will be generally available on April 1 and includes technologies like an updateable in-memory columnstore that can speed up data warehouse queries by 10-100x.
  • SQL Server Parallel Data Warehouse: SQL Server Parallel Data Warehouse (PDW) is a Massively Parallel Processing data warehouse appliance that can scale out queries to petabytes of data. It includes PolyBase, a feature that lets you seamlessly query and integrate both relational data and Hadoop data with the same T-SQL query.
  • SQL Server hosted in a Windows Azure Virtual Machine: SQL Server Enterprise for data warehousing can be installed and hosted in the cloud on Windows Azure Virtual Machines. This image takes advantage of best practices from the Fast Track reference architecture to tune SQL Server for data warehousing in Windows Azure.
  • Windows Azure HDInsight: Windows Azure HDInsight is a Hadoop-based service from Microsoft that brings a 100 percent Apache Hadoop solution to the cloud. You can seamlessly process data of all types with simplicity, ease of management, and an open Enterprise-ready Hadoop service all running in the cloud. We recently announced the general availability of HDInsight running Hadoop 2.2 clusters.
  • Hortonworks Data Platform for Windows: Through a strategic partnership with Hortonworks, Microsoft co-developed the Hortonworks Data Platform for Windows for customers who want to deploy Hadoop on their own servers. Learn more about Microsoft’s contributions to Hadoop.

You can learn more about Microsoft solutions for the modern data warehouse by tuning in to the live stream of our April 15 Accelerate Your Insights event.


Intersection of Music, Technology Can Be Found in Cambridge, Mass.


Posted by Rob Knies

Music Tech Fest logo

What do musicians and computer scientists have in common?

More than you might think.

“If you think about it,” Nancy Baym says, “almost everything musicians do is technologically mediated—their instruments are technologies, they manipulate the sound with technologies, they sing through microphones that are technologies, they depend on speakers, production software … all kinds of technologies just in the music-making processes themselves.

“If you add in the music distribution and communication that happens through social media and other communication technologies, there are infinitely more ways technology shapes what musicians can and can’t do, as well as what they are expected to do or not to do.”

Baym, a principal researcher for Microsoft Research, is discussing the connection that encouraged her to host the first Music Tech Fest series in the United States. The event is scheduled for Cambridge, Mass., from March 21 to 23 at the Microsoft New England Research & Development Center.


Catch up on your reading with Kobo Books on Windows PCs and tablets



Kobo Books gives readers more than 3.5 million eBooks, comics and children’s books to choose from and read on Windows PCs or tablets. U.S. and Canada users will also find a free bonus to start off their library: Robert Ludlum’s “The Janson Command.” (Ludlum also gave the world superspy Jason Bourne.)

The Kobo catalog contains everything from classics to new releases and best sellers, with thousands of first chapter previews you can save to your library. You can create a custom reading experience, with crisp, clear text in the size and style you prefer. And whatever you’re reading now is featured on a Live Tile.

You can also pick up right where you left off, with synced bookmarks across all your devices.

Install Kobo Books from the Windows Store.


Athima Chansanchai
Microsoft News Center Staff

Office 365 - Exporting Site Collection Search Configuration using CSOM with PowerShell


Chris O'Brien has a fantastic blog post - Using CSOM in PowerShell Scripts with Office 365: http://www.sharepointnutsandbolts.com/2013/12/Using-CSOM-in-PowerShell-scripts-with-Office365.html. One of the examples he provides is how to import search configuration from an XML file, a new feature in SharePoint 2013 that is documented here - http://technet.microsoft.com/en-us/library/jj871675(v=office.15).aspx.

I've put together a PowerShell script that can be used to export the search configuration; using it along with Chris's script is a great way to copy search configuration between site collections.

Three variables need to be updated prior to running the script: $User is the username of a tenant administrator, $SiteURL is the URL of the site within the tenant to export the search configuration from, and $Schema is the path of the file to write the configuration to. ($Scope can be left as "SPSite" to export the site collection-level configuration.)

#Please install the SharePoint client components SDK - http://www.microsoft.com/en-us/download/details.aspx?id=35585 prior to running this script.

#Specify tenant admin, site URL and scope to export from
$User = "admin@tenant.onmicrosoft.com"
$SiteURL = "https://tenant.sharepoint.com/sites/site"
$Scope = "SPSite"
$Schema = "D:\SearchSchema.XML"

#Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Search.dll"

$Password = Read-Host -Prompt "Please enter your password" -AsSecureString
$Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)

#Export search configuration
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
$Context.Credentials = $Creds
$Owner = New-Object Microsoft.SharePoint.Client.Search.Administration.SearchObjectOwner($Context,$Scope)
$Search = New-Object Microsoft.SharePoint.Client.Search.Portability.SearchConfigurationPortability($Context)
$SearchConfig = $Search.ExportSearchConfiguration($Owner)
$Context.ExecuteQuery()
$SearchConfig.Value > $Schema
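
The import half of the round trip follows the same CSOM pattern. As a rough sketch (assuming the client assemblies, $Creds, and $Scope from the script above; $TargetSiteURL is a hypothetical placeholder), the SearchConfigurationPortability class also exposes an ImportSearchConfiguration method:

```powershell
#Hedged sketch of the import side - reuses the assemblies and credentials
#loaded by the export script; $TargetSiteURL is an assumption.
$TargetSiteURL = "https://tenant.sharepoint.com/sites/targetsite"
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($TargetSiteURL)
$Context.Credentials = $Creds
$Owner = New-Object Microsoft.SharePoint.Client.Search.Administration.SearchObjectOwner($Context,$Scope)
$Search = New-Object Microsoft.SharePoint.Client.Search.Portability.SearchConfigurationPortability($Context)

#Read the previously exported XML and push it into the target site collection
$SearchConfig = Get-Content $Schema -Raw
$Search.ImportSearchConfiguration($Owner,$SearchConfig)
$Context.ExecuteQuery()
```

Chris O'Brien's post linked above covers the import scenario in full; this sketch simply mirrors the structure of the export script.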


Brendan Griffin - @brendankarl

MLB.com At Bat updated with breaking news push notifications, pitch-by-pitch tracking



Now up at home plate: an updated MLB.com At Bat for Windows Phone that keeps you up to date on breaking news and lead changes through push notifications. Subscribers also receive pitch-by-pitch tracking with realistic renderings of all 30 MLB ballparks.

This official Major League Baseball app also provides Live Tile and Lock screen updates, live scores, news, video highlights, schedules and more. Also, pin your favorite teams to your Start screen.

Install At Bat from the Windows Phone Store – it’ll be your MVP this season, and every one after that.


Athima Chansanchai
Microsoft News Center Staff

Video: Indie developers share sneak peeks of new self-published Xbox One games


As the Game Developers Conference buzzes along at full steam in San Francisco, attendees heard from independent developers who shared 25 games coming to Xbox One through the ID@Xbox program. In the video above, you’ll see cameos from folks such as Ken Yeung, technical director of Capy Games, who talks about “Super Time Force,” an action-packed platform game with a time-traveling twist.

With the ID@Xbox program, independent game developers can self-publish their games. This initial sampling of new games includes game creators from nine countries. More than 250 developers have now received development kits for Xbox One via this program.
Watch the video to hear directly from those developers and head over to Xbox Wire to find out more about them and the ID@Xbox program.


Athima Chansanchai
Microsoft News Center Staff

Hosting Summit - Market Research on Cloud and Hosting


A very interesting session, led by the research firm 451 Research, on the next major trends in cloud and hosting.

The cloud is a more-than-serious opportunity for customers looking for growth

  • 45% of customers surveyed are at an advanced stage and are already piloting projects aimed at an effective cloud implementation
  • 32% of customers surveyed have developed a cloud strategy
  • Cloud early adopters are the most inclined to invest in high-growth services and applications
  • Mid-sized companies have the highest average revenue per user and the largest number of applications planned for the cloud
  • The more services you sell to a single customer, the less likely that customer is to switch providers, since recreating that environment elsewhere becomes much harder

A trusted cloud is a premium business

  • 55% of customers surveyed would pay for a premium level of customer service
  • 60% of hosted customers would pay for premium security services if they could
  • Having a Chief Customer Officer on the team helps position service as a differentiator

The road to the hybrid cloud runs through the private cloud

  • Private cloud means enjoying innovation without the risk
  • Public cloud will also be important, but customers will always plan for the long term
  • Hybrid cloud is the most likely outcome

The cloud does not change everything. Security is an opportunity.

  • Be transparent with your customers
  • Have a clear strategy that lets customers know what is inherent to your offering and what is a premium service

Help your customers grow; don't just lower prices

  • Help them expand into new geographies
  • Support new applications as well as existing ones
  • Help them accelerate their time to market

Tip of the Day: Flash support added to IE 10


Tip of the Day: A Dip in the Pool


Today’s Tip…

In Storage Spaces a pool is a logical grouping of physical disks. There are two types of pools.

Primordial Pool

The Primordial pool represents all of the disks that Storage Spaces is able to use but that are not already members of a concrete pool. Physical disks in the Primordial pool have a property named CanPool equal to “True” when they meet the requirements for creating a concrete pool.

Concrete Pool

A Concrete pool is a specific collection of Physical Disks that was formed by the user to allow creating Storage Spaces (aka Virtual Disks).
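
As an illustrative sketch (the pool name and subsystem lookup here are assumptions, not prescriptions), the Storage cmdlets in Windows 8.1 / Windows Server 2012 R2 let you inspect the Primordial pool and promote eligible disks into a concrete pool:

```powershell
#List the disks in the Primordial pool that are eligible for pooling
Get-StoragePool -IsPrimordial $true |
    Get-PhysicalDisk |
    Where-Object CanPool -eq $true

#Create a concrete pool ("Pool01" is an assumed name) from those disks
$Disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "Pool01" `
    -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName `
    -PhysicalDisks $Disks
```

Once the concrete pool exists, Storage Spaces (Virtual Disks) can be created from it.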

Look up! Atlas Titan is on tour in Germany



Following the global launch of “Titanfall” on Xbox One, a giant Atlas Titan is now on tour. If you’re in Germany in March, you can find it in Hamburg, Cologne and Stuttgart.

The Atlas Titan began its tour March 13 at a Berlin train station.

Head over to Xbox Wire to see more photos and the Titan Tour Map.


Athima Chansanchai
Microsoft News Center Staff

Friday - International Community Update - TechNet Guru



Today we're going to talk a bit about the biggest award in the TechNet Wiki Community, the TechNet Guru.

For those who already know the TechNet Wiki Day award, it may seem strange that another award exists, but its criteria are different: the TechNet Guru has a broader reach among members of the international community.

The TechNet Guru award was created in May 2013 to highlight the best solutions published each month as articles, based on the MSDN and TechNet Forums, for each Microsoft technology or product.

Over time, the articles in this competition came to have a single purpose: helping the international community with quality content, regardless of where the solution originated (a forum, a gallery, personal experience, or another source).

To participate, you need to submit your article each month. So the one who decides whether an article is fit to compete is you.

The essential requirement is that the article, original or translated, be written in English. This makes things a bit harder for those who are not fluent or native English speakers, and that raises the challenge for many of us.

If you believe you are up to it, go ahead and prepare your article to compete in the March TechNet Guru; and if you have doubts about your content or your English, then take on the challenge!

- Make a plan;

- Organize interesting content;

- Explain in an article how you built a great solution in the forums or at work;

- "Translate" your article into English.

In the end, you will see that you will help a lot of people and also learn a great deal from the experience.

Besides that, you can put your name on the TechNet Guru medalist list, or who knows, even among the winners.

Show the influence you have by helping the community, creating good articles in Portuguese and in English.

See you,

Wiki Ninja Durval Ramos (Twitter, Profile)

Hosting Summit - Accelerating Growth, or How to Lead with Strategic Foresight and Confidence


An excellent session led by Daniel Burrus, founder and CEO of Burrus Research Associates, one of the leading thinkers on major innovation trends. A session, or rather a course in business strategy!

Daniel Burrus stresses that having a growth strategy is essential. And crafting a growth strategy means offering people the ability to do what they currently cannot do... but would do if they knew they could!

Technology is precisely what makes new possibilities conceivable. And it is up to you, service providers, to lead your customers toward those new possibilities.

You can only be convincing if you know how to convey that sense of confidence to your customers: "Be a brain driver for your customers."

Meanwhile, the IT company as we know it is becoming obsolete. Technology is no longer a mere foundation. It now sits at the heart of strategy, and it is what empowers the business.

Service providers therefore have no choice but to embrace this transformation in order to play a role in this new landscape and to position themselves as credible partners to their customers. Customers now expect their service providers to go beyond mere "supply" and to be true trusted advisors, able to enlighten and guide them based on their specific challenges.

To do that, you must create your own success. That means:

  • The right mix of talent: do you have multi-generational teams?
    Juniors and seniors, also known as Generation Y and Generation X, are perfectly complementary. Young people use and think in terms of technology where their seniors do not even consider it. On the other hand, young people sorely lack experience and maturity. These two groups have everything to learn from each other, which is a real strength for your organization.
  • Think differently: you will have to change the traditional mental model. CIOs must now be Chief Innovation Officers, and CTOs Chief Transformation Officers.
  • Enter the era of communication, no longer the era of information. What's the difference, you ask? Information is static and one-way (from sender to receiver). Communication, by contrast, engages and is two-way. It invites your counterpart to react and respond. Information is passive, while communication is active.
    Entering the era of communication means adopting social tools and media suited to the business. Social media are by nature meant to create dialogue, interact, and engage. Many of them are free, too!

PowerTip: Use PowerShell to Document DSN Names


Summary: Learn how to easily document DSN names by using Windows PowerShell.

Hey, Scripting Guy! Question: How can I use Windows PowerShell to find what DSN names are defined on my system?

Hey, Scripting Guy! Answer: On your computer running Windows 8.1, use the Get-OdbcDsn function.
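
For example (a hedged sketch; the filter values and output path are just illustrations built on the cmdlet's documented -DsnType and -Platform parameters):

```powershell
#List all DSNs defined on the system
Get-OdbcDsn

#Narrow the list, e.g. to 32-bit User DSNs
Get-OdbcDsn -DsnType User -Platform 32-bit

#Document them to a CSV file for your records
Get-OdbcDsn |
    Select-Object Name, DsnType, Platform, DriverName |
    Export-Csv -Path .\DsnInventory.csv -NoTypeInformation
```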

Be rewarded and save with these training and certification offers


Whether you’re preparing for a Microsoft Certification exam, gathering best practices to help improve your company’s time to market for deployments, or just interested in deepening your technical knowledge about Microsoft products and technologies, take advantage of these offers.  

Hurry—these sweepstakes end March 31

Microsoft System Center

Windows Server

Windows Azure

Microsoft SQL Server

Microsoft SharePoint Server

Take our Data Center or Data Insights Immersion Training modules. This training will give you the knowledge to perform technical product demonstrations to customers. Take the self-assessment at the end of each module for a chance to win a Dr. Dre Beats wireless speaker. Completing all modules will enter you for a chance to win a Surface Pro 2.

Access the training links and Cloud OS Training Sweepstakes details.

Official rules.

Microsoft SQL Server

Microsoft SharePoint Server

Office 365

Microsoft SharePoint Server 2013

Windows 8

Speed up your time to market and increase customer wins by taking a Microsoft Practice Accelerator in March, and you will be entered into a $1,000 American Express gift card drawing.

Register for March sessions and see the Accelerate Your Learning Sweepstakes details.

Official rules.

Windows sweepstakes for OEM resellers

Windows 8

Windows XP

OEM reseller partners who pass the featured Windows assessments will be entered into weekly and cumulative drawings for a chance to win a Windows 8 tablet.

Take the assessments and read the Windows Partner Sweepstakes details.

Official rules.

Microsoft Certification offers

Office 365

Windows Server 2012

Windows Azure

Windows 8

Microsoft Dynamics CRM

 Save 50% on 10 of our most popular Microsoft Certification exams that can help you earn a competency. Purchase your vouchers by April 30 and use them by September 30.

Download the flyer for eligible exams and offer details.

Terms and conditions.

Microsoft System Center

Office 365

Microsoft SharePoint Server 2013

Microsoft Exchange Server 2013

Windows Server 2012

Microsoft SQL Server

Microsoft Visual Studio

Microsoft Lync Server 2013

Windows 8

Microsoft Dynamics products

 Pass a qualifying exam by April 27 and you will be entered into weekly drawings for a $500 Microsoft Store gift card and monthly drawings for a Surface Pro 2.

View qualifying exams and Microsoft Certification Sweepstakes details.

Official rules.

 

 Save up to 40% on exams, plus get a second shot to pass, with the Microsoft Competency Exam pack offer. Purchase by May 31 and take your exams by December 31.

Terms and conditions.

 

 Save 20% on individual exams and get a free Second Shot through May 31. Second Shot vouchers must be used by May 31.

Terms and conditions.

Get your #certkudos

 Join the community of Microsoft US partners pursuing training, certifications, and accreditations by following @mslearningcurve on Twitter. Use #certkudos to tell us what exams and assessments you're preparing for or just passed, and we’ll give you the kudos you deserve.

Lync Windows Store App update gives meeting presenters more control over participants, scheduling



Lync meetings are a big part of life here at Microsoft, and likely where you work, too. Now, moving closer to the desktop Lync experience, the latest update to the Lync Windows Store app gives meeting presenters more control over remote attendees. It also adds a Meet Now feature that bypasses the need to schedule a meeting, and a Start screen tile for joining or starting a new Lync meeting.

Presenters can now mute others, send private IMs, view participants’ contact cards, promote others to presenter or demote them. They can also invite more participants, mute everyone and admit people to the meeting who are waiting in the lobby. Presenters can do all this using controls at the bottom of the screen.

Lync will upgrade automatically within 24 hours for most users of Windows 8.1. You can find Lync in the Windows Store and read more about the updates on The Lync Team Blog.


Athima Chansanchai
Microsoft News Center Staff


Automation–The New World of Tenant Provisioning with Windows Azure Pack (Part 4): Automated Deployment of Tenant Workloads (Lync, SharePoint, and Exchange)


Yay! Part 4 has finally arrived…

And by Part 4, I am obviously talking about the next big post in “The New World of Tenant Provisioning with Windows Azure Pack” blog series.


What is this one all about?

Automated Deployment of Tenant Workloads (Lync, SharePoint, and Exchange)

What does that mean?

Well, let’s first take a look back: Part 1 (here) was the Intro/TOC for this fine blog series; Part 2 (here) was all about Automated Deployment of Tenant Network and Identity Workload (Active Directory Gallery Item VM Role) from the Service Admin Persona; and Part 3 (here) was all about Automated Deployment of the Identity Workload (Active Directory Gallery Item VM Role) from the Tenant Admin Persona.

So what that means is, in Part 4, we only have one major aspect of Gallery Item VM Role Deployment left to discuss in the series: Tenant Workloads (Lync, SharePoint, and Exchange Gallery Item VM Roles).


Up to this point in the blog series, we have covered an example for each of the following deployment options:

[Image: matrix of Gallery Item VM Role deployment options, with green checkmarks for the options covered so far]

Note     In this diagram, we also see that deployment options for both administrator personas have been covered. And we did this with a specific example in mind: the Active Directory Gallery Item VM Role. This was due to the fact that the subsequent example Gallery Item VM Role deployments in this blog series take a strong dependency on Active Directory. In this way, both the Gallery Item VM Role for Active Directory and the Tenant Network are “table stakes”.


Truth be told, based on just Parts 2 and 3 of this blog series, you really have everything necessary to deploy not only the Gallery Item VM Role for Active Directory, but any Gallery Item VM Role you have built (or pulled down off of WebPI (or our Blog)).

Which leads to the logical next step…

Describing what it takes and providing the necessary script updates (to what you have seen so far) to deploy the Gallery Item VM Roles for Lync, SharePoint, and Exchange (for both personas).


Which means, at least in part, that these three diagrams happen to also apply to this blog post:

[Images: deployment diagrams for the Lync, SharePoint, and Exchange Gallery Item VM Roles]


Automated Deployment of Tenant Workloads

(Lync, SharePoint, and Exchange VM Roles; from both Service Admin and Tenant Admin Personas)

Because we established a solid foundation in Parts 2 and 3 of the series, this post really just highlights the script updates necessary to get Automated WAP Deployments to work for OUR PUBLISHED Lync, SharePoint, and Exchange Gallery Item VM Roles. The reason I say “OUR PUBLISHED” in yelling-case is to emphasize which Gallery Item VM Roles the example scripts in this post directly relate to. If you are wondering, “OUR PUBLISHED” Lync, SharePoint, and Exchange Gallery Item VM Roles can be found here: Windows Azure Pack VM Role Gallery Items for Collaboration Workloads

Wait! Wait! Wait! What’s the difference in the script from VM Role to VM Role?

Good question, glad you asked.

Well, as you saw in Part 2 and 3, no one generic script can be created that handles the deployment from beginning to end. This is due to the fact that each VM Role Definition can vary, based on how that VM Role was created, what fields were included, and which Resource Extensions were added. And while I did my best to keep my scripts generic, there were just portions that had to be hardcoded - dun! dun! dun! The good news is, the hardcoded – dun! dun! dun! stuff is kept to where the scripts differ. Meaning, logic can be introduced to dynamically add the appropriate hardcoded - dun! dun! dun! script portions based on VM Role type.

Will I be able to apply what I learn here against the VM Roles I create?

Yes. There will be a “discovery” section in the post which takes you through how to enumerate “What’s different” or “What’s required” as it relates to the Resource Definition (ResDef/ResDefExt), and Resource Definition Configuration (ResDefConfig).
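
As a small taste of that discovery step, here is a hedged sketch that reuses the deserialization pattern from the Deploy-VMRole runbook below; treat the key names as exploratory rather than fixed, since each VM Role's Resource Definition differs:

```powershell
#Exploratory sketch - assumes $GIResourceDEFJSON already holds the gallery
#item's Resource Definition JSON, retrieved as in the Deploy-VMRole runbook.
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | Out-Null
$JSSerializer = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$ResDef = $JSSerializer.DeserializeObject($GIResourceDEFJSON)

#Enumerate the top-level sections and their types to see what this
#particular VM Role requires
foreach ($Key in $ResDef.Keys)
{
    $Value = $ResDef[$Key]
    if ($Value -ne $null) { "{0} : {1}" -f $Key, $Value.GetType().Name }
    else { "{0} : (null)" -f $Key }
}
```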

What about the Tenant Virtual Network?

This topic will not be discussed again here in this post. Please refer to Part 2 of this blog series for more information and example script.

Note     The TechNet Gallery Contribution download will also include the example script(s) for automatically creating a Tenant Virtual Network.


The Scope

Just like in Parts 2 and 3, there are various options to choose from when deploying Gallery Item VM Roles. These options are depicted above, in the first image (the one with all the green checkmarks). To keep this blog post manageable, I will be providing the example Runbooks / Scripts / Guidance in the following order:

Service Administrator in SMA (with the WAP Tenant API Only)

  • Existing: SMA Runbook that is the same for each Gallery Item VM Role Deployment
  • New: Updated SMA Runbooks for the Lync Gallery Item VM Role Deployment
  • New: Updated SMA Runbooks for the SharePoint Gallery Item VM Role Deployment
  • New: Updated SMA Runbooks for the Exchange Gallery Item VM Role Deployment
  • Future Discovery: How to enumerate ResDef/ResDefExt and ResDefConfig Requirements for any Gallery Item VM Role

Tenant Administrator from a PowerShell Script (With the Public WAP Tenant API)

Instead of making a whole big section, reiterating the same thing over and over, I will simply be providing the following:

  • New: Example Gallery Item VM Role Deployment Script for Active Directory, Lync, Exchange, or SharePoint (as a Tenant Admin against the Public WAP API)
  • Future Discovery: Modified Example PowerShell script to enumerate ResDef/ResDefExt and ResDefConfig Requirements for any Gallery Item VM Role (as a Tenant Admin against the Public WAP API)

Out of Scope

There are a couple of topics out of scope for this particular blog post, as they have been covered previously within the series.

  1. Automated Tenant Virtual Network Creation (can be found in Part 2)
  2. Automated Active Directory Gallery Item VM Role Deployment (can be found in both Part 2 and Part 3)
  3. Detailed Review of “The Process” for constructing a Gallery Item VM Role Deployment Script (can be found in both Part 2 and Part 3)
  4. Example VMM PowerShell for Gallery Item VM Role Deployment (“Add-CloudResource” can be found in Part 2)
  5. Pre-Requisites for WAP Tenant API Authentication (“bearer token authorization” can be found in Part 2)
  6. Pre-Requisites for Public WAP Tenant API Authentication (“Windows Azure PowerShell Module” and “certificates” can be found in Part 3)

In other words, from a deployment perspective (and from the Tenant’s view), this is our starting point:

[Image: the Tenant’s starting point from a deployment perspective]


Service Administrator in SMA (with the WAP Tenant API Only)

Existing: SMA Runbook that is the same for each Gallery Item VM Role Deployment

If we take a look at the “overall structure of my SMA Runbooks” section from Part 2, the following SMA Runbook will be the same for each Gallery Item VM Role Deployment (well, depending on your implementation, but definitely for this example) and can be leveraged generically:

[Image: the SMA Runbook structure from Part 2]

Example PowerShell workflow script for Deploy-VMRole

workflow Deploy-VMRole
{
    param
    (
        [string]$WAPServer,
        [PSCredential]$Creds,
        [string]$TenantPortalAddress,
        [string]$SubscriptionID,
        [string]$UserID,
        [string]$GalleryItemName,
        [string]$GIVersion,
        [string]$ResDefConfigJSON,
        [string]$CloudServiceName,
        [string]$VMRoleName
    )

    $VMRole = InlineScript {

        $AdminURI = "https://" + $Using:WAPServer + ":30004"
        $AuthSite = "https://" + $Using:WAPServer + ":30072"
        $ClientRealm = "http://azureservices/AdminSite"
        $token = Get-MgmtSvcToken -Type Windows -AuthenticationSite $AuthSite -ClientRealm $ClientRealm -DisableCertificateValidation

        $Headers = @{
            Authorization = "Bearer $token"
            "x-ms-principal-id" = $Using:UserID }

        # Get Gallery Item Reference
        $GIReferenceUri = "https://{0}:30005/{1}/Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID
        $GIReferenceData = [xml](Invoke-WebRequest -Uri $GIReferenceUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content)
        $GalleryItemREF = $GIReferenceData.feed.entry.content.properties.resourcedefinitionUrl | ? {$_ -match $Using:GalleryItemName}

        # Get Gallery Item Resource Definition
        $GIResDEFUri = "https://{0}:30005/{1}/{2}/?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID,$GalleryItemREF
        $GIResourceDEFJSON = Invoke-WebRequest -Uri $GIResDEFUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content 
     
        #Convert ResDef JSON to Dictionary
        [System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | Out-Null
        $JSSerializer = New-Object System.Web.Script.Serialization.JavaScriptSerializer
        $ResDef = $JSSerializer.DeserializeObject($GIResourceDEFJSON)

        #Add ResDefConfig JSON to Dictionary
        $ResDefConfig = New-Object 'System.Collections.Generic.Dictionary[String,Object]'
        $ResDefConfig.Add("Version",$Using:GIVersion)
        $ResDefConfig.Add("ParameterValues",$Using:ResDefConfigJSON)

        # Set Gallery Item Payload Variables
        $GISubstate = $null
        $GILabel = $Using:VMRoleName
        $GIName = $Using:VMRoleName
        $GIProvisioningState = $null
        $GIInstanceView = $null

        # Set Gallery Item Payload Info
        $GIPayload = @{
            "InstanceView" = $GIInstanceView
            "Substate" = $GISubstate
            "Name" = $GIName
            "Label" = $GILabel
            "ProvisioningState" = $GIProvisioningState
            "ResourceConfiguration" = $ResDefConfig
            "ResourceDefinition" = $ResDef
            }

        # Convert Gallery Item Payload Info To JSON
        $GIPayloadJSON = ConvertTo-Json $GIPayload -Depth 7

        # Get Cloud Services
        $CloudServicesUri = "https://{0}:30005/{1}/CloudServices?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID
        $CloudServicesData = [xml](Invoke-WebRequest -Uri $CloudServicesUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content)
        $CloudService = $CloudServicesData.feed.entry.content.properties.Name | ? {$_ -match $Using:CloudServiceName}
        if (!$CloudService) {
            # Set Cloud Service Configuration
            $CloudServiceConfig = @{
                "Name" = $Using:CloudServiceName
                "Label" = $Using:CloudServiceName
                }

            # Convert Cloud Service Configuration To JSON
            $CloudServiceConfigJSON = ConvertTo-Json $CloudServiceConfig

            $CloudServicesData = [xml](Invoke-WebRequest -Uri $CloudServicesUri -Headers $Headers -Method Post -Body $CloudServiceConfigJSON -ContentType "application/json" -UseBasicParsing)
            $CloudService = $CloudServicesData.entry.content.properties.Name | ? {$_ -match $Using:CloudServiceName}
        }

        # Set Gallery Item VM Role Deploy URI
        $GIDeployUri = "https://{0}:30005/{1}/CloudServices/{2}/Resources/MicrosoftCompute/VMRoles/?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID,$CloudService

        # Deploy Gallery Item VM Role
        $VMRoleDeployed = Invoke-WebRequest -Uri $GIDeployUri -Headers $Headers -Method Post -Body $GIPayloadJSON -ContentType "application/json" -UseBasicParsing

        Return $VMRoleDeployed

    } -PSComputerName $WAPServer -PSCredential $Creds

    $VMRole
}

New: Updated SMA Runbooks…

If we take another look at the “overall structure of my SMA Runbooks” section from Part 2, these two SMA Runbooks will have subtle differences for each Gallery Item VM Role Deployment:

image

Note     I provide the updated SMA Runbook examples for each Tenant Workload Deployment below.


Before we dive into the SMA Runbook updates, I want to level-set…

Because more significant updates take place in the Deploy-TenantVMRole SMA Runbook, this section of the post will concentrate more on it, rather than the Subscription-Create-Dispatcher SMA Runbook.

Remember in both Parts 2 and 3 there was a portion of the example PowerShell that created the Gallery Item Parameter Hashtable (for Common Data) and then added to the Gallery Item Parameter Hashtable (for GI Specific Data)?

If not, here is an image to help jog your memory:

image

It is right below this section in the Deploy-TenantVMRole SMA Runbook where we will be adding the updates. Essentially, more logic will be introduced to dynamically add the appropriate hardcoded - dun! dun! dun! script portions based on VM Role type.

Note     I will be including the script portions in each respective Tenant Workload sub-section below. Then an all-up Deploy-TenantVMRole SMA Runbook with configuration for all 4 (including Active Directory) Tenant Workload Deployments will be made available after the last sub-section. Also included will be the all-up Subscription-Create-Dispatcher SMA Runbook with calls for all 4 (including Active Directory) Tenant Workload Deployments.


…for the Lync Gallery Item VM Role Deployment

if ($GalleryItemName -eq "Lync")
{
    $GIParamList += @{RunAsDomainToJoin = $UserID.Split("@")[1]} 
    $GIParamList += @{LyncServer2013RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
}

…for the SharePoint Gallery Item VM Role Deployment

if ($GalleryItemName -eq "SharePoint")
{
    $GIParamList += @{RunAsDomainToJoin = $UserID.Split("@")[1]}
    $GIParamList += @{SharePointServer2013RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
    $GIParamList += @{SharePointServer2013InstallChoice = "SingleServer"}
}

…for the Exchange Gallery Item VM Role Deployment

if ($GalleryItemName -eq "Exchange")
{
    $GIParamList += @{DomainJoinDomainToJoin = $UserID.Split("@")[1]}
    $GIParamList += @{ExchangeServer2013CU2RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
    $GIParamList += @{ExchangeServer2013CU2ExchangeServer2013CU2Organization = "ExchangeOrg"}
}

Note     For each of these example script portions, I kept most of the Hashtable Variable building “complex” and/or hardcoded – dun! dun! dun! on purpose. I wanted to avoid creating and leveraging extra variables outside these individual IF blocks. In these examples, much of the string formatting is the same from IF block to IF block, so you can create a “Credential” string before all this, and leverage it throughout.
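To illustrate that suggested simplification (this refactoring is my own sketch, not part of the original runbook), the shared string formatting could be hoisted above the IF blocks once and reused:

```powershell
# Sketch: build the shared strings once, before the GI-specific IF blocks.
# Assumes $UserID, $Password, and $GIParamList are already set, as in Deploy-TenantVMRole.
$DomainToJoin = $UserID.Split("@")[1]
$RunAsCredential = "{0}\administrator:{1}" -f $DomainToJoin.Split(".")[0],$Password

if ($GalleryItemName -eq "Lync")
{
    $GIParamList += @{RunAsDomainToJoin = $DomainToJoin}
    $GIParamList += @{LyncServer2013RunAsCredential = $RunAsCredential}
}
```

The same two variables would then replace the repeated Split/format expressions in the SharePoint and Exchange IF blocks as well.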

Disclaimer     This is a set of examples to handle the GI Specific Data. There are other (and likely better) ways to do this same thing. If you have improvements, please leverage them. If you would like to share those improvements with the community, please leave comments, or send a link to your related blog post. I am always happy to cross-post.


Example PowerShell workflow script for Deploy-TenantVMRole

(with IF blocks for DomainController, Lync, SharePoint, and Exchange)

workflow Deploy-TenantVMRole
{
    param
    (
        [string]$OwnerUserRole,
        [string]$GalleryItemToDeploy,
        [string]$VMRoleName,
        [string]$VMRoleNamePattern,
        [string]$VMRoleSize
    )
   
    #Define Variables
    $WAPServer = Get-AutomationVariable -Name 'WAP Admin Server'
    $Creds = Get-AutomationPSCredential -Name 'PSCredential Name'
    $TenantPortalAddress = Get-AutomationVariable -Name 'WAP Tenant Server FQDN'
    $SubscriptionID = $OwnerUserRole.Split("_")[1]
    $UserID = $OwnerUserRole.Split("_")[0]
    $GalleryItemName = $GalleryItemToDeploy.Split(";")[0]
    $GIVersion = $GalleryItemToDeploy.Split(";")[1]
    $CloudServiceName = "CloudService-4-{0}" -f $SubscriptionID
    $OSDisk = Get-AutomationVariable -Name 'Default VM Role OS Disk'
    $Password = Get-AutomationVariable -Name 'Password'
   
    # Create Gallery Item Parameter Hashtable (for Common Data)
    $GIParamList = @{
        VMRoleVMSize = $VMRoleSize
        VMRoleOSVirtualHardDiskImage = $OSDisk
        VMRoleAdminCredential = "administrator:{0}" -f $Password
        VMRoleTimeZone = "Pacific Standard Time"
        VMRoleComputerNamePattern = $VMRoleNamePattern
        VMRoleNetworkRef = "Tenant Network ({0})" -f $UserID
        }

    # Add to Gallery Item Parameter Hashtable (for GI Specific Data)
    if ($GalleryItemName -eq "DomainController")
    {
        $GIParamList += @{DomainControllerWindows2012DomainDNSName = $UserID.Split("@")[1]} 
        $GIParamList += @{DomainControllerWindows2012DomainNETBIOSName = ($UserID.Split("@")[1]).Split(".")[0]}
        $GIParamList += @{DomainControllerWindows2012SafeModeAdminPassword = $Password}
    }
   
    if ($GalleryItemName -eq "Lync")
    {
        $GIParamList += @{RunAsDomainToJoin = $UserID.Split("@")[1]} 
        $GIParamList += @{LyncServer2013RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
    }
   
    if ($GalleryItemName -eq "SharePoint")
    {
        $GIParamList += @{RunAsDomainToJoin = $UserID.Split("@")[1]}
        $GIParamList += @{SharePointServer2013RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
        $GIParamList += @{SharePointServer2013InstallChoice = "SingleServer"}
    }
   
    if ($GalleryItemName -eq "Exchange")
    {
        $GIParamList += @{DomainJoinDomainToJoin = $UserID.Split("@")[1]}
        $GIParamList += @{ExchangeServer2013CU2RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
        $GIParamList += @{ExchangeServer2013CU2ExchangeServer2013CU2Organization = "ExchangeOrg"}
    }
   
    # Convert Gallery Item Parameter Hash To JSON
    $ResDefConfigJSON = ConvertTo-Json $GIParamList
   
    Deploy-VMRole -WAPServer $WAPServer -creds $Creds -TenantPortalAddress $TenantPortalAddress `
        -SubscriptionID $SubscriptionID -UserID $UserID -GalleryItemName $GalleryItemName `
        -GIVersion $GIVersion -ResDefConfigJSON $ResDefConfigJSON -CloudServiceName $CloudServiceName `
        -VMRoleName $VMRoleName
}

Note     Obviously I took some liberties with the data sharing in this example. Before I started this SMA Runbook, I knew that each of our Gallery Item VM Roles shared a set of “Common Data” for the Gallery Item Parameter Hashtable. At that point, I just had to figure out which parameters fell into the “GI Specific Data” for each Gallery Item VM Role to be included in the SMA Runbook. Each set of “GI Specific Data” then gets its own IF block, appending to the “Common Data” already in the Gallery Item Parameter Hashtable. More information on “Discovery” of “GI Specific Data” can be found in the very next section of this post!


Example PowerShell workflow script for Subscription-Create-Dispatcher

(with calls for each Tenant Workload)

workflow Subscription-Create-Dispatcher
{
    param
    (
        [object]$resourceObject
    )

    if ($resourceObject.AdminId.Length -gt 27) { $AdminId = $resourceObject.AdminId.SubString(0,27) } else { $AdminId = $resourceObject.AdminId }
    $OwnerUserRole = $AdminId + "_" + $resourceObject.SubscriptionID
    $SubscriptionName = $resourceObject.SubscriptionName
   
    $VMMServer = Get-AutomationVariable -Name 'VMM Server'
    $LogicalNetworkName = Get-AutomationVariable -Name 'Default VMM Logical Network'
    $PSEmailServer = Get-AutomationVariable -Name 'SMTP Server'
    $PSEmailFrom = Get-AutomationVariable -Name 'SMTP From Email'
    $PSEmailCC = Get-AutomationVariable -Name 'PSEmailCC'
   
    if ($SubscriptionName -eq "Collaboration Workloads")
    {
        $CloudName = "Tenant Cloud"
       
        Create-VMNetwork -VmmServerName $VMMServer -OwnerUserRole $OwnerUserRole `
            -CloudName $CloudName -LogicalNetworkName $LogicalNetworkName
       
        Send-SMTPNotification -SendNotificationType "Plans" -PSEmailFrom $PSEmailFrom `
            -PSEmailTo $AdminId -PSEmailServer $PSEmailServer -PSEmailCC $PSEmailCC `
            -WorkloadName $SubscriptionName

        $SubscriptionName + " Plan Selected"
       
        "Deploying Active Directory"
        Deploy-TenantVMRole -GalleryItemToDeploy "DomainController;1.0.0.0" `
            -OwnerUserRole $OwnerUserRole -VMRoleName "ActiveDirectory" `
            -VMRoleNamePattern "DC##" -VMRoleSize "ExtraSmall"
        
         #No Logic, Just Sleep in this example
         Start-Sleep -Seconds 600

        "Deploying Lync"
        Deploy-TenantVMRole -GalleryItemToDeploy "Lync;1.0.0.0" `
            -OwnerUserRole $OwnerUserRole -VMRoleName "Lync" `
            -VMRoleNamePattern "LY##" -VMRoleSize "ExtraLarge"

        "Deploying SharePoint"
        Deploy-TenantVMRole -GalleryItemToDeploy "SharePoint;1.0.0.0" `
            -OwnerUserRole $OwnerUserRole -VMRoleName "SharePoint" `
            -VMRoleNamePattern "SP##" -VMRoleSize "ExtraLarge"

        "Deploying Exchange"
        Deploy-TenantVMRole -GalleryItemToDeploy "Exchange;1.0.0.0" `
            -OwnerUserRole $OwnerUserRole -VMRoleName "Exchange" `
            -VMRoleNamePattern "EX##" -VMRoleSize "ExtraLarge"
    }
}

Note     I did not add any “wait logic” in this example. It simply deploys Active Directory, waits 10 minutes and then deploys the other workloads (Lync, SharePoint, and Exchange). For more information on Monitoring and Notifications, refer to this blog post: Automation–Monitoring and Notifying in Windows Azure Pack with SMA.
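For readers who want more than a fixed sleep, the pause could be replaced with a polling loop against the Tenant API. This is an untested sketch: the URI format and the ProvisioningState property follow the patterns used in the Deploy-VMRole runbook above, but the exact property path and timing should be verified in your environment.

```powershell
# Sketch: poll the deployed VM Roles until one reports "Provisioned" (or a timeout elapses).
# Assumes $Headers, $TenantPortalAddress, $SubscriptionID, and $CloudService are set
# exactly as in the Deploy-VMRole runbook above.
$VMRolesUri = "https://{0}:30005/{1}/CloudServices/{2}/Resources/MicrosoftCompute/VMRoles/?api-version=2013-03" -f $TenantPortalAddress,$SubscriptionID,$CloudService
$Timeout = (Get-Date).AddMinutes(60)
do {
    Start-Sleep -Seconds 60
    $VMRolesData = [xml](Invoke-WebRequest -Uri $VMRolesUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content)
    $State = $VMRolesData.feed.entry.content.properties.ProvisioningState | Select-Object -First 1
} until ($State -eq "Provisioned" -or (Get-Date) -gt $Timeout)
```

In the Subscription-Create-Dispatcher workflow, a loop like this would replace the Start-Sleep between the Active Directory deployment and the dependent workloads.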


Future Discovery: How to enumerate ResDef/ResDefExt and ResDefConfig Requirements for any Gallery Item VM Role

So, how did I know which items in the Gallery Item Parameter Hashtable were “Common” and which were “GI Specific”?

Well, other than tearing apart the JSON during my learning process for all this, there is a pretty simple method to extract the necessary data for review.

The following is an example PowerShell script which leverages the WAP Tenant API (non-Public), pulls down the specified Gallery Item VM Roles desired for comparison, and outputs a Hashtable containing the Gallery Item VM Role Name and its required Resource Parameters.

Example PowerShell script for Get-GIResourceParams

function Get-GIResourceParams {

    param
    (
        [object]$Headers,
        [string]$TenantPortalAddress,
        [string]$SubscriptionID,
        [string]$GalleryItemName
    )

    # Get Gallery Item Reference
    $GIReferenceUri = "https://{0}:30005/{1}/Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03" -f $TenantPortalAddress,$SubscriptionID
    $GIReferenceData = [xml](Invoke-WebRequest -Uri $GIReferenceUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content)
    $GalleryItemREF = $GIReferenceData.feed.entry.content.properties.resourcedefinitionUrl | ? {$_ -match $GalleryItemName}

    # Get Gallery Item Resource Definition
    $GIResDEFUri = "https://{0}:30005/{1}/{2}/?api-version=2013-03" -f $TenantPortalAddress,$SubscriptionID,$GalleryItemREF
    $GIResourceDEFJSON = Invoke-WebRequest -Uri $GIResDEFUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content 
    $ResDef = ConvertFrom-Json $GIResourceDEFJSON

    Return $ResDef.ResourceParameters.Name

}

$WAPServer = "WAP Admin Server Name"
$UserID = "User ID (email) of User with Subscription"
$TenantPortalAddress = "FQDN of Tenant Portal Address"
$SubscriptionID = "Subscription ID for Specified User"
$GalleryItems = @("DomainController","Lync","SharePoint","Exchange")

$AdminURI = "https://" + $WAPServer + ":30004"
$AuthSite = "https://" + $WAPServer + ":30072"
$ClientRealm = "http://azureservices/AdminSite"
$token = Get-MgmtSvcToken -Type Windows -AuthenticationSite $AuthSite -ClientRealm $ClientRealm -DisableCertificateValidation

$Headers = @{
    Authorization = "Bearer $token"
    "x-ms-principal-id" = $UserID }

$GIandResourceParams = @{}

foreach ($GalleryItem in $GalleryItems)
{
    $GIResourceParams = Get-GIResourceParams -Headers $Headers -TenantPortalAddress $TenantPortalAddress `
        -SubscriptionID $SubscriptionID -GalleryItemName $GalleryItem

    $GIandResourceParams += @{$GalleryItem = $GIResourceParams}
}

$GIandResourceParams

Note     I leveraged a function within this example script to minimize duplicate command execution. Modify this example as you see fit. Oh, and it needs to be executed (at least in part) from the WAP Admin Server, or wherever the MgmtSvcAdmin Cmdlets live.

What is the output of this script?

A Hashtable of values you can use to compare/contrast Gallery Item VM Role Resource Parameters:

image

Challenge     I simply provide the Hashtable worth of data; it is up to you to enumerate the data and make something fancy with the Compare-Object command to dynamically compare the Resource Parameters on the fly!
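As a starting point for that challenge, here is a sketch (my own, not part of the original post) that diffs the Resource Parameters of two Gallery Items from the $GIandResourceParams Hashtable produced above; parameters present in both are candidates for the “Common Data” set.

```powershell
# Sketch: compare Resource Parameters between two Gallery Items.
# Assumes $GIandResourceParams was populated by the Get-GIResourceParams script above.
$Comparison = Compare-Object -ReferenceObject $GIandResourceParams["Lync"] `
    -DifferenceObject $GIandResourceParams["SharePoint"] -IncludeEqual

# "==" rows are shared (Common Data); "<=" / "=>" rows are GI Specific Data.
$Common = $Comparison | Where-Object { $_.SideIndicator -eq "==" } |
    Select-Object -ExpandProperty InputObject
$Common
```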


Tenant Administrator from a PowerShell Script (With the Public WAP Tenant API)

New: Example Gallery Item VM Role Deployment Script for Active Directory, Lync, Exchange, or SharePoint (as a Tenant Admin against the Public WAP API)

#region GetWAPConnectionData

# Get WAP Subscription Information

$WAPSubscription = Get-WAPackSubscription

# Set Subscription
$SubscriptionID = $WAPSubscription.SubscriptionId

# Get Management Certificate Info
$CertThumb = $WAPSubscription.Certificate.Thumbprint
$CertPath = "Cert:\CurrentUser\My\{0}" -f $CertThumb
$Cert = Get-Item $CertPath

# Set Tenant Portal Address
$TenantPortalAddress = $WAPSubscription.ServiceEndpoint.Host

# Set Port
$Port = $WAPSubscription.ServiceEndpoint.Port

#endregion GetWAPConnectionData

#region SetVariables

# Set Gallery Item Name and Version for Match and Deploy

$GalleryItemName = "Lync"
$GIVersion = "1.0.0.0"

# Set Common Gallery Item Parameters
$UserID = "tenant@company.com"
$VMRoleNetwork = "Tenant Network ({0})" -f $UserID
$CloudServiceName = "CloudService-4-{0}" -f $SubscriptionID
$VMRoleTZ = "Pacific Standard Time"
$OSDisk = "Windows Server 2012 Datacenter"
$OSDiskVersion = "1.0.0.0"
$Password = "Password"

#Set GI Specific Gallery Item Parameters

if ($GalleryItemName -eq "DomainController")
{
    $VMRoleName = "ActiveDirectory"
    $VMRoleNamePattern = "DC##"
    $VMRoleSize = "ExtraSmall"
}

if ($GalleryItemName -eq "Lync")
{
    $VMRoleName = "Lync"
    $VMRoleNamePattern = "LY##"
    $VMRoleSize = "ExtraLarge"
}

if ($GalleryItemName -eq "SharePoint")
{
    $VMRoleName = "SharePoint"
    $VMRoleNamePattern = "SP##"
    $VMRoleSize = "ExtraLarge"
    $SPInstallChoice = "SingleServer"
}

if ($GalleryItemName -eq "Exchange")
{
    $VMRoleName = "Exchange"
    $VMRoleNamePattern = "EX##"
    $VMRoleSize = "ExtraLarge"
    $ExchangeOrg = "ExchangeOrg"
}

#endregion SetVariables

#region GetResDef

# Get Gallery Item Reference

$GIReferenceUri = "https://{0}:{1}/{2}/Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03" -f $TenantPortalAddress,$Port,$SubscriptionID
$GIReferenceData = [xml](Invoke-WebRequest -Certificate $Cert -Uri $GIReferenceUri | Select-Object -ExpandProperty Content)
$GalleryItemREF = $GIReferenceData.feed.entry.content.properties.resourcedefinitionUrl | ? {$_ -match $GalleryItemName}

# Get Gallery Item Resource Definition
$GIResDEFUri = "https://{0}:{1}/{2}/{3}/?api-version=2013-03" -f $TenantPortalAddress,$Port,$SubscriptionID,$GalleryItemREF
$GIResourceDEFJSON = Invoke-WebRequest -Certificate $Cert -Uri $GIResDEFUri | Select-Object -ExpandProperty Content

#Convert ResDef JSON to Dictionary
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | Out-Null
$JSSerializer = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$ResDef = $JSSerializer.DeserializeObject($GIResourceDEFJSON)

#endregion GetResDef

#region SetResDefConfig

# Create Gallery Item Parameter Hashtable (for Common Data)

$GIParamList = @{
    VMRoleVMSize = $VMRoleSize
    VMRoleOSVirtualHardDiskImage = "{0}:{1}" -f $OSDisk,$OSDiskVersion
    VMRoleAdminCredential = "administrator:{0}" -f $Password
    VMRoleTimeZone = $VMRoleTZ
    VMRoleComputerNamePattern = $VMRoleNamePattern
    VMRoleNetworkRef = $VMRoleNetwork
    }

# Add to Gallery Item Parameter Hashtable (for GI Specific Data)
if ($GalleryItemName -eq "DomainController")
{
    $GIParamList += @{DomainControllerWindows2012DomainDNSName = $UserID.Split("@")[1]} 
    $GIParamList += @{DomainControllerWindows2012DomainNETBIOSName = ($UserID.Split("@")[1]).Split(".")[0]}
    $GIParamList += @{DomainControllerWindows2012SafeModeAdminPassword = $Password}
}
   
if ($GalleryItemName -eq "Lync")
{
    $GIParamList += @{RunAsDomainToJoin = $UserID.Split("@")[1]} 
    $GIParamList += @{LyncServer2013RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
}
   
if ($GalleryItemName -eq "SharePoint")
{
    $GIParamList += @{RunAsDomainToJoin = $UserID.Split("@")[1]}
    $GIParamList += @{SharePointServer2013RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
    $GIParamList += @{SharePointServer2013InstallChoice = $SPInstallChoice}
}
   
if ($GalleryItemName -eq "Exchange")
{
    $GIParamList += @{DomainJoinDomainToJoin = $UserID.Split("@")[1]}
    $GIParamList += @{ExchangeServer2013CU2RunAsCredential = "{0}\administrator:{1}" -f ($UserID.Split("@")[1]).Split(".")[0],$Password}
    $GIParamList += @{ExchangeServer2013CU2ExchangeServer2013CU2Organization = $ExchangeOrg}
}
   
# Convert Gallery Item Parameter Hashtable To JSON
$ResDefConfigJSON = ConvertTo-Json $GIParamList

#Add ResDefConfig JSON to Dictionary
$ResDefConfig = New-Object 'System.Collections.Generic.Dictionary[String,Object]'
$ResDefConfig.Add("Version",$GIVersion)
$ResDefConfig.Add("ParameterValues",$ResDefConfigJSON)

#endregion SetResDefConfig

#region GenerateGIPayloadJSON

# Set Gallery Item Payload Variables

$GISubstate = $null
$GILabel = $VMRoleName
$GIName = $VMRoleName
$GIProvisioningState = $null
$GIInstanceView = $null

# Set Gallery Item Payload Info
$GIPayload = @{
    "InstanceView" = $GIInstanceView
    "Substate" = $GISubstate
    "Name" = $GIName
    "Label" = $GILabel
    "ProvisioningState" = $GIProvisioningState
    "ResourceConfiguration" = $ResDefConfig
    "ResourceDefinition" = $ResDef
    }

# Convert Gallery Item Payload Info To JSON
$GIPayloadJSON = ConvertTo-Json $GIPayload -Depth 7

#endregion GenerateGIPayloadJSON

#region GetOrSetCloudService

# Get Cloud Services

$CloudServicesUri = "https://{0}:{1}/{2}/CloudServices?api-version=2013-03" -f $TenantPortalAddress,$Port,$SubscriptionID
$CloudServicesData = [xml](Invoke-WebRequest -Uri $CloudServicesUri -Certificate $Cert | Select-Object -ExpandProperty Content)
$CloudService = $CloudServicesData.feed.entry.content.properties.Name | ? {$_ -match $CloudServiceName}
if (!$CloudService) {
    # Set Cloud Service Configuration
    $CloudServiceConfig = @{
        "Name" = $CloudServiceName
        "Label" = $CloudServiceName
        }

    # Convert Cloud Service Configuration To JSON
    $CloudServiceConfigJSON = ConvertTo-Json $CloudServiceConfig

    $CloudServicesData = [xml](Invoke-WebRequest -Uri $CloudServicesUri -Certificate $Cert -Method Post -Body $CloudServiceConfigJSON -ContentType "application/json")
    $CloudService = $CloudServicesData.entry.content.properties.Name | ? {$_ -match $CloudServiceName}
}

#endregion GetOrSetCloudService

#region DeployGIVMRole

# Set Gallery Item VM Role Deploy URI

$GIDeployUri = "https://{0}:{1}/{2}/CloudServices/{3}/Resources/MicrosoftCompute/VMRoles/?api-version=2013-03" -f $TenantPortalAddress,$Port,$SubscriptionID,$CloudService

# Deploy Gallery Item VM Role
$VMRoleDeployed = Invoke-WebRequest -Uri $GIDeployUri -Certificate $Cert -Method Post -Body $GIPayloadJSON -ContentType "application/json"
$VMRoleDeployed

#endregion DeployGIVMRole

Note(s)     Once again, I have several notes about the above script, so I will list them here:

  • This is an example script.
  • It has been tested against our Demo/Test/Dev environment multiple times.
  • The following section of the script controls the execution, simply modify these Variables to Deploy the other available VM Roles included within the script:
    image
    The logic is all based on the included IF blocks (two per Gallery Item VM Role).
    image and image
  • It absolutely requires the pre-requisites discussed in Part 3. And while there are alternatives to getting the required variable data based on the Get-WAPackSubscription command, I have found this to be the most efficient/dynamic method.
  • The $TenantPortalAddress Variable may need to be set to a specific string, rather than being extracted from the information available from the Get-WAPackSubscription command, specifically if the public portal address is different than the FQDN of the WAP Admin Server.
  • If you are getting errors during deployment like, “Disk with Name (Windows Server 2012 Datacenter) and Version (1.0.0.0) not found while translating Resource Definition for cloud resource.”, it is likely that the $OSDisk and/or $OSDiskVersion Variables have incorrect data. The Tenant API does not care so much for the actual OS Disk name; instead, it appears to require the Family Name of the OS Disk.
  • I chose to leave all the Variable settings within the script. These could very easily be presented as Script Parameters, and fed into the script to make it a bit more generic.
  • Just like the Service Admin example script, there is a portion of the Resource Definition Configuration (ResDefConfig) that is tied directly to the ResDef/ResDefExt for a given Gallery Item VM Role. When generating the Gallery Item Parameter Hashtable, I have separated the GI specific data from the common GI data. In most cases (especially for the Gallery Item VM Roles produced by my team), the only portion of the script that has to change per VM Role is this GI specific data (and, of course any specific Variable data).
  • This script (for the Tenant Admin) should look nearly identical to the script from Part 2 (for the Service Admin). This is by design, as I wanted to keep as many synergies in play as possible. Remember, there are only subtle differences (Ports and Auth).
  • While the steps in the Gallery Item VM Role deployment process will likely remain the same, the actual script could be improved in various ways: Addition of Script Parameters, Separated into Functions, Transformed into a Set of Cmdlets, etc. If anyone takes these improvements on, I will be happy to reference / endorse the published work here in this blog.
  • Finally, and once again, the script has been broken up into “regions”, each of which builds on the last, to eventually complete all the data collection / command execution for the final Invoke-WebRequest POST to deploy the Gallery Item VM Role. Here is an image illustrating the seven regions:
    image

After the dust settles on back to back executions, you should see something similar to this in the Tenant Portal:

image


Future Discovery: Modified Example PowerShell script to enumerate ResDef/ResDefExt and ResDefConfig Requirements for any Gallery Item VM Role (as a Tenant Admin against the Public WAP API)

function Get-GIResourceParams {

    param
    (
        [object]$Cert,
        [string]$TenantPortalAddress,
        [string]$SubscriptionID,
        [string]$GalleryItemName
    )

    # Get Gallery Item Reference
    $GIReferenceUri = "https://{0}:30006/{1}/Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03" -f $TenantPortalAddress,$SubscriptionID
    $GIReferenceData = [xml](Invoke-WebRequest -Uri $GIReferenceUri -Certificate $Cert -UseBasicParsing | Select-Object -ExpandProperty Content)
    $GalleryItemREF = $GIReferenceData.feed.entry.content.properties.resourcedefinitionUrl | ? {$_ -match $GalleryItemName}

    # Get Gallery Item Resource Definition
    $GIResDEFUri = "https://{0}:30006/{1}/{2}/?api-version=2013-03" -f $TenantPortalAddress,$SubscriptionID,$GalleryItemREF
    $GIResourceDEFJSON = Invoke-WebRequest -Uri $GIResDEFUri -Certificate $Cert -UseBasicParsing | Select-Object -ExpandProperty Content 
    $ResDef = ConvertFrom-Json $GIResourceDEFJSON

    Return $ResDef.ResourceParameters.Name

}

$WAPSubscription = Get-WAPackSubscription

$CertThumb = $WAPSubscription.Certificate.Thumbprint
$CertPath = "Cert:\CurrentUser\My\{0}" -f $CertThumb
$Cert = Get-Item $CertPath

$TenantPortalAddress = $WAPSubscription.ServiceEndpoint.Host
$SubscriptionID = $WAPSubscription.SubscriptionId
$GalleryItems = @("DomainController","Lync","SharePoint","Exchange")

$GIandResourceParams = @{}

foreach ($GalleryItem in $GalleryItems)
{
    $GIResourceParams = Get-GIResourceParams -Cert $Cert -TenantPortalAddress $TenantPortalAddress `
        -SubscriptionID $SubscriptionID -GalleryItemName $GalleryItem

    $GIandResourceParams += @{$GalleryItem = $GIResourceParams}
}

$GIandResourceParams

Note     This is essentially a copy/paste of the Service Admin script, with modifications to work against the Public WAP Tenant API. The major updates: URL port (changed from 30005 to 30006) and Authorization method (changed from Header with bearer token to Certificate).

And once again, the output of this script is:

image


But wait, WHY would I want to do this?

Great question!

I assume once you watch the video (below) you will have a few ideas, but here are some use cases (from both the Service Admin and Tenant Admin personas):

  • Use Case 1: As a Tenant – Simple avoidance of manual clicking to deploy Gallery Item VM Roles
  • Use Case 2: As a Tenant – Develop scripts to fully deploy a set of multiple concurrent (and/or dependent) Gallery Item VM Role Deployments (with scripts like this, you have complete control over the “what” and “when”)
  • Use Case 3: As a Service Provider (or Enterprise acting like one) – Create a custom set of cmdlets encapsulating the parameters and logic into easily consumable/executable commands
  • Use Case 4: As a Service Provider (or Enterprise acting like one) – Enabling your Tenants/End Users to automate their own Gallery Item VM Role deployments (external to any SMA efforts on the Service Admin side)

Note     Again, these are just some of the use cases I could come up with off the top of my head. I am sure you have many more scenarios in mind.


So that’s a wrap, right?

Oh, you want the VIDEO and TECHNET GALLERY CONTRIBUTION now, do you?

Okay.


Automated Tenant Provisioning, the 8-Minute-Demo Video!

Fun Fact     It may be my first 8-Minute-Demo Video that is actually 8 minutes exactly.


TechNet Gallery Contribution and Download

The download (Windows Azure Pack Tenant Provisioning Automation Toolkit.zip) includes the following (14) files:

For the Service Administrator

SMA Runbook Exports

  • Create-VMNetwork.xml
  • Deploy-TenantVMRole.xml
  • Deploy-VMRole.xml
  • Subscription-Create-Dispatcher.xml
  • VMRole-Create-Dispatcher.xml

PowerShell Scripts

  • Deploy-VMRole_OptionalVMMCommands.ps1
  • Get-GIResourceParams_asServiceAdmin.ps1

PowerShell Workflows

  • Create-VMNetwork.ps1
  • Deploy-TenantVMRole.ps1
  • Deploy-VMRole.ps1
  • Subscription-Create-Dispatcher.ps1
  • VMRole-Create-Dispatcher.ps1

For the Tenant Administrator

PowerShell Scripts

  • Deploy-TenantVMRoles_asTenantAdmin.ps1
  • Get-GIResourceParams_asTenantAdmin.ps1

Note     XML (SMA Runbooks) and PS1 (PowerShell Scripts) files are both provided in the download. Use SMART for Runbook Import and Export to leverage the provided XML files in the above download for an enhanced experience in importing the example solution into your SMA environment.

Optional     Some of the scripts within this download contain commented out “optional” portions for Monitoring and Notifications. The associated Runbooks and Variables for these options are not included in this download. For more information about Monitoring and Notifications within SMA, please see the following blog post: Automation–Monitoring and Notifying in Windows Azure Pack with SMA


Download the Windows Azure Pack Tenant Provisioning Automation Toolkit from TechNet Gallery here:

[Download link]


Oh, and have you seen this blog post yet?

Windows Azure Pack–Gallery Item VM Role–References for Creation, Configuration, and Automation

If not, check it out.

And while it does cross-reference back to this post, it covers the entire Gallery Item VM Role Lifecycle (well the important bits, anyway).


Blog Series Table of Contents

  1. Part 1: Intro & TOC
  2. Part 2: Automated Deployment of Tenant Network and Identity Workload(Isolated Tenant Virtual Network & Active Directory VM Role; from the Service Admin Persona)
  3. Part 3: Automated Deployment of the Identity Workload as a Tenant Admin(Active Directory VM Role; from the Tenant Admin Persona)
  4. Part 4: Automated Deployment of Tenant Workloads (Lync, SharePoint, and Exchange)(Lync, SharePoint, and Exchange VM Roles; from both Service Admin and Tenant Admin Personas)
  5. Part 5: Working with the SQL Server resource provider, and the ITIL dilemma(by Bruno Saille)
  6. Part 6: TBD(We hope to have something around: Value Added Services/Offerings and Ongoing Automated Maintenance/Operations)

Thanks for checking out my latest blog series! For more information, tips/tricks, and example solutions for Automation within System Center, Windows Azure Pack, Windows Azure, etc., be sure to check out the other blog posts from Building Clouds in the Automation Track!

enJOY!

Contoso Labs-Network Purchasing (Vendor)


Contoso Labs Series - Table of Contents

Now that our storage is straightened out, it's time to set our sights on network gear. As was mentioned earlier, this entire project is operating as an independent entity from Microsoft's normal IT operations. There's no shared network, authentication, or any kind of IT services at all. We'll simply be another waypoint on the Internet, which just happens to be hosted in a datacenter. That means we need to own all of our network equipment.

Unlike our storage clusters, we considered it highly desirable to make the network a one-vendor solution, both for maximum compatibility and to avoid the overhead of learning multiple CLIs and management tools. After all, while many of us are fabric specialists with lots of experience configuring hardware and networks, we're not network engineers in the traditional sense. That means we'll occasionally need to rely on others to assist with our network, so we also considered where that expertise lies.

In addition, it was important to keep an eye on the direction of the industry and the direction Microsoft and the WSSC org in general are moving. An example is the Open Management Infrastructure (OMI) initiative. Microsoft strongly believes a common management interface is critical to the success of SDN in the market. Using devices with OMI support allows us to leverage features in Windows Server and System Center for configuration and reporting, and gives us a good test bed for future development of our products.

Taken together, those options led to one network vendor: Cisco. Besides being a long-time leader in networking, there is a large base of Cisco expertise inside and outside Microsoft to take advantage of. On top of that, their excellent Nexus line has support for OMI in the particular devices we were considering, making this an ideal marriage.

In the next few posts, we'll cover the particular models we invested in, and explain why they were chosen for their roles.

Lync 2013 and OneDrive for Business are not installed when installing Office 2013 with Service Pack 1


After installing Office 2013 with Service Pack 1 from the Volume Licensing Service Center (VLSC) with a customized MSP file using the Office Customization Tool (OCT), Lync and OneDrive for Business are not installed.

We are currently investigating the issue and will provide an update soon.  In the meantime, please use one of the following workarounds:
 

1. Use the Updates folder on the original RTM release and the publicly available Service Pack 1 MSP files: http://support.microsoft.com/kb/2817430.

For further information on using the Updates folder to install Office updates, refer to the following Microsoft article: Deploying software updates with an initial Office 2013 installation
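As a sketch, the resulting installation point might look like this (file names are hypothetical; the SP1 MSP files are the ones extracted from the KB article above):

```
\\server\Office2013\            <- original RTM installation source
    setup.exe
    Updates\
        custom.msp              <- your OCT customization file
        proplus-sp1-x-none.msp  <- SP1 MSP files from KB2817430
```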


2. Use the config.xml file to add Lync and OneDrive after the initial install of Office 2013.  To accomplish this, edit the config.xml file to contain the following:

<Configuration Product="ProPlus">

               <Display Level="none" CompletionNotice="no" SuppressModal="yes" AcceptEula="yes" />
               <OptionState Id="GrooveFiles2" State="local" Children="force" />
               <OptionState Id="LyncCoreFiles" State="local" Children="force" />

</Configuration>

Note - this issue also applies to the Excel add-ins Power Map and PowerPivot.  If you need to install these add-ins, you can do so by adding the following to the config.xml file:

               <OptionState Id="ExcelAddInPowerMapFiles" State="local" Children="force" />
               <OptionState Id="ExcelAddInPowerPivotFiles" State="local" Children="force" />


Once the install of Office 2013 with SP1 has completed, run the following command-line:

<path>\setup.exe /config <path>\config.xml

For further information on using the Config.xml file in Office 2013, refer to the following Microsoft Article: Config.xml file reference for Office 2013

Do you still have legacy Address Lists and Email Address Policies in your Exchange Environment?


Frequently, when I review customer Exchange environments, I see Address Lists and Email Address Policies that have not been upgraded to OPATH (from LDAP) filtering.

Here are the steps:

Check to see if you still have legacy lists or policies:
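A minimal sketch of that check (run from the Exchange Management Shell; cmdlet and property names per Exchange 2007 and later) would be:

```powershell
# List anything still using a legacy (LDAP) recipient filter.
Get-EmailAddressPolicy | Where-Object { $_.RecipientFilterType -eq 'Legacy' } | Format-Table Name, RecipientFilterType
Get-GlobalAddressList  | Where-Object { $_.RecipientFilterType -eq 'Legacy' } | Format-Table Name, RecipientFilterType
Get-AddressList        | Where-Object { $_.RecipientFilterType -eq 'Legacy' } | Format-Table Name, RecipientFilterType
```

If all three return nothing, you have no legacy lists or policies and can stop here.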

LDAP to OPATH Conversion Script

  1. Download the script and put it in the C:\Program Files\Microsoft\Exchange Server\Scripts directory (or whichever drive you have E2k7 installed on)

 http://gallery.technet.microsoft.com/scriptcenter/7c04b866-f83d-4b34-98ec-f944811dd48d

 

TEST THE RESULTS BEFORE YOU CONVERT!

Review the resulting text files for any errors. If there are errors, you will need to adjust the LDAP filter (in your Exchange environment) before you import/convert them to OPATH.

 

Email Address Policies test convert to OPATH:

Get-EmailAddressPolicy | WHERE { $_.RecipientFilterType -eq 'Legacy' } | foreach { $_.Name + [char]9 + $_.LdapRecipientFilter + [char]9 + (.\ConvertFrom-LdapFilter $_.LdapRecipientFilter) } > C:\TEMP\EmailAddressPolicyConvert.txt

 

Global Address Lists test convert to OPATH:

Get-GlobalAddressList | WHERE { $_.RecipientFilterType -eq 'Legacy' } | foreach { $_.Name + [char]9 + $_.LdapRecipientFilter + [char]9 + (.\ConvertFrom-LdapFilter $_.LdapRecipientFilter) } > C:\TEMP\GlobalAddressListConvert.txt

 

 Address Lists test convert to OPATH:

Get-AddressList | WHERE { $_.RecipientFilterType -eq 'Legacy' } | foreach { $_.Name + [char]9 + $_.LdapRecipientFilter + [char]9 + (.\ConvertFrom-LdapFilter $_.LdapRecipientFilter) } > C:\TEMP\AddressListConvert.txt

 

Review the resulting TXT files. If a line shows only the name of a policy/list (with no converted filter after it), the script was unable to convert that item.

Once the above tests are error free, convert all legacy address lists, GALs, and email address policies, without prompting, by running these three commands:

Get-EmailAddressPolicy | WHERE { $_.RecipientFilterType -eq 'Legacy' } | foreach { Set-EmailAddressPolicy $_.Name -RecipientFilter (.\ConvertFrom-LdapFilter $_.LdapRecipientFilter) -ForceUpgrade }

Get-GlobalAddressList | WHERE { $_.RecipientFilterType -eq 'Legacy' } | foreach { Set-GlobalAddressList $_.Name -RecipientFilter (.\ConvertFrom-LdapFilter $_.LdapRecipientFilter) -ForceUpgrade }

Get-AddressList | WHERE { $_.RecipientFilterType -eq 'Legacy' } | foreach { Set-AddressList $_.Name -RecipientFilter (.\ConvertFrom-LdapFilter $_.LdapRecipientFilter) -ForceUpgrade }

 

Run the legacy-filter check again to confirm that there are no longer any Legacy lists/policies.

Automation–The New World of Tenant Provisioning with Windows Azure Pack (Part 5): Working with the SQL Server resource provider, and the ITIL dilemma


Hello Readers!

Yes, it’s only been a couple minutes since Charles published parts 1 to 4 for this series, and maybe you thought we would rest a bit and take a breath before publishing new content… Well, not really!

Today, we also have part 5 of the series available for you to read, and it brings a new set of sample scripts, this time in the context of working with the SQL Server resource provider in WAP.

In this post, we will demonstrate the following:

  1. How to request an administrator token from the WAP admin API. Charles already covered this in post #2, but this time we will also include a few variations you may need, for example when you use ADFS or self-signed certificates. Again, requesting a token is important for many tasks, so the examples here are not limited to SQL Server database activities. This post was just an excuse to give more details about token requests, and you could use the corresponding token for other resource providers (VM Clouds, MySQL Servers, etc.)
  2. How to test our token by displaying existing plans, subscriptions and users. One of the benefits is that this will display some subscription IDs that we will be able to reuse in subsequent scripts
  3. Display the names of existing SQL Server databases for a specific user subscription
  4. Display details of a specific existing SQL Server database in a user’s subscription
  5. List all databases in all subscriptions: using our administrative token, we will cycle through all subscriptions and display database names, if any
  6. Create a new database in a subscription, using our administrative token, on behalf of the user

And as a bonus, I will include some user and subscriptions samples at the end. But these will be quite short and more of a teaser, as Charles is preparing a more in-depth series on this topic.

What are the scenarios?

Having the ability to run scripts as both admin and tenant is critical for automation, especially in the context of enterprises. A few examples:

- Doing actions on behalf of a tenant, after approvals in an ITSM solution

- Collecting data available in the admin and/or tenant APIs, to centralize/reconcile in a CMDB or elsewhere

Now, there is a fine line between the controlled processes required by ITIL, and the expected and promised agility of a cloud… This is where a balance has to be found. At the end of this post, we start covering this topic by introducing what we call the “ITIL dilemma” (see section “Bonus – Creating users and adding them to plans”)


Download

Yes, the scripts from this page are available as a download!

[Download link]

Note: This is a single script, intended to be run sample-by-sample, using selection mode (“F8” in PowerShell ISE). If you run the script right away, it will likely fail because some variables need to be edited, like the subscription IDs. Please refer to the samples below and to the header of the script to understand what needs to be modified. By default, write actions are commented out anyway, to ensure you do not mistakenly write to your environment.


Introduction : The SQL Server resource provider in Windows Azure Pack

The SQL Server resource provider in WAP allows an administrator/service provider to delegate to tenants the ability to create/resize/delete SQL Server databases. It’s a very easy and flexible way to provide databases that could be hosted on standalone servers, or using SQL Server AlwaysOn. These databases can then be used with the Web Sites or Virtual Machines from this tenant, for instance (there are many scenarios, depending on how network routing and isolation is designed in your WAP environment). Just like with VM Clouds, the SQL Server resource provider relies on a shared fabric (this time made up of SQL Server instances – optionally categorized in groups – whereas VM Clouds rely on a fabric of Hyper-V hosts surfaced through Virtual Machine Manager clouds). You can read more about the SQL Server resource provider here, and we will have a more detailed blog post on this very soon.


Working with the SQL Server resource provider from an API perspective

This is very similar to VM Clouds, which have been the main focus of parts 1 to 4 of this series. We need to authenticate with the WAP APIs (admin or tenant) and can then execute actions using the WAP cmdlets (from an admin standpoint) or through the tenant API web service (from a tenant standpoint).

The foundation

We’ll need the WAP admin API PowerShell module loaded, and here are two variables we are setting and will reuse for all these scripts. They correspond to the endpoints for the tenant and admin APIs.

ipmo mgmtsvcadmin
$TenantUri = "https://wap01.contoso.com:30005"
$AdminUri = "https://wap01.contoso.com:30004"

Note: You may need to run these samples from an elevated (administrator) PowerShell session.

First, let’s request a token

Once again, this has been previously covered by Charles, but let’s look at the variations you may face in different environments. These variations may be due to the following:

- Whether you use self-signed certificates, or certificates signed by an approved certification authority

- Whether you have setup ADFS or not, for the WAP admin interfaces

 

Requesting a token without ADFS, with self-signed certificates:

$Token = Get-MgmtSvcToken -Type Windows -AuthenticationSite "https://wap01.contoso.com:30072" -DisableCertificateValidation -ClientRealm "http://azureservices/AdminSite"

The important part here is the “-DisableCertificateValidation” switch, but it’s not enough. You will also need to add this to your script before continuing:

add-type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;

    public class NoSSLCheckPolicy : ICertificatePolicy {
        public NoSSLCheckPolicy() {}
        public bool CheckValidationResult(
            ServicePoint sPoint, X509Certificate cert,
            WebRequest wRequest, int certProb) {
            return true;
        }
    }
"@

[System.Net.ServicePointManager]::CertificatePolicy = new-object NoSSLCheckPolicy

Note: Be aware that this snippet disables SSL certificate validation for the rest of your PowerShell session. Of course, having “well-signed” certificates is recommended in production…

Requesting a token without ADFS, with “well-signed” certificates:

$Token = Get-MgmtSvcToken -Type Windows -AuthenticationSite "https://wap01.ad.corp.local:30072" -ClientRealm "http://azureservices/AdminSite"

Requesting a token with ADFS, with “well-signed” certificates:

$Token = Get-AdfsToken -domain 'contoso.com' -username 'brunosa' -password 'Pass@word1' -adfsAddress 'https://adfs01.contoso.com' -clientRealm 'http://azureservices/AdminSite'

OK, I am cheating a bit here, as “Get-AdfsToken” is actually calling a function in the script. But this function is provided in the downloadable script sample. It’s also adapted from the samples you get with any Windows Azure Pack installation – thanks to Shri for his help on this!

[Screenshot]

 

When we have a token, displaying it should look like this:

[Screenshot: token output]

Testing the token on the admin side

To do this, let’s display existing plans, subscriptions and users:

Get-MgmtSvcPlan -AdminUri $AdminUri -Token $Token | select DisplayName, Id, servicequotas | ft
Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | select subscriptionID, AccountAdminLiveEmailId, PlanId, state | ft

Output:

[Screenshot: plans and subscriptions output]

Using the token to list databases for a specific tenant, with the tenant API

This is just a matter of crafting the expected web request and using our token as part of the headers – note that the subscription ID can be found either in the WAP admin portal… or in the output of the previous sample.

$SusbcriptionID = "afe457ae-f4d9-46e9-81a2-67c0654bf668"
$UserID = (Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.SubscriptionId -eq $SusbcriptionID}).AccountAdminLiveEmailId
$Headers = @{Authorization = "Bearer $Token"
        "x-ms-principal-id" = $UserID}
$FullUri = $TenantUri + "/" + $SusbcriptionID + "//services/sqlservers/databases/"
(Invoke-RestMethod -Uri $FullUri -Headers $Headers).Name

Here is the output in PowerShell, and the view in the tenant portal:

[Screenshot: PowerShell output]

[Screenshot: tenant portal view]

If we had wanted to query the VM Role Gallery Items available through that subscription, the last two lines would have used a different URI:

$FullUri = $TenantUri + "/" + $SusbcriptionID + "//Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03"
(Invoke-RestMethod -Uri $FullUri -Headers $Headers).Content.Properties.Name

As you can see, working with a different resource provider only requires understanding the namespace for that provider. And for the resource providers shipping out of the box, MSDN is your friend, such as this link for the SQL Server resource provider.
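For example, here is a hypothetical sketch of the same call against the MySQL resource provider – the `mysqlservers` namespace segment is an assumption on our part, so verify it against the MSDN reference before relying on it:

```powershell
# Same subscription and headers as above - only the namespace segment changes.
# "mysqlservers" is assumed; confirm the exact segment on MSDN.
$FullUri = $TenantUri + "/" + $SusbcriptionID + "//services/mysqlservers/databases/"
(Invoke-RestMethod -Uri $FullUri -Headers $Headers).Name
```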

Displaying details for a specific database in a subscription

The script would look like this – it uses the URI from the previous sample to automatically display the details of the first database found. But you could replace $DBName with any database name that you know exists in the subscription. $SusbcriptionID and $Headers also come from the previous sample (both samples are grouped together in the downloadable script).

If ((Invoke-RestMethod -Uri $FullUri -Headers $Headers).Count -eq 1)
    {$DBName = (Invoke-RestMethod -Uri $FullUri -Headers $Headers)[0].Name}
If ((Invoke-RestMethod -Uri $FullUri -Headers $Headers).Count -gt 1)
    {$DBName = (Invoke-RestMethod -Uri $FullUri -Headers $Headers).Name[0]}
$DBName
If ($DBName)
    {
    $FullUri = $TenantUri + "/" + $SusbcriptionID + "//services/sqlservers/databases/" + $DBName + "/"
    (Invoke-RestMethod -Uri $FullUri -Headers $Headers)
    }

And the output gives us:

[Screenshot: database details output]

Listing all databases in all subscriptions

The sample below uses our token to cycle through all subscriptions and, when the SQL Server resource provider is part of the associated plan, queries to see if any databases have been created. If so, it lists them.

$subscriptions = Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.State -eq "active"} | select subscriptionID, AccountAdminLiveEmailId, PlanId
Foreach ($subscription in $subscriptions)
    {
    $FullUri = $TenantUri + "/" + $subscription.subscriptionID + "//services/sqlservers/databases/"
    $Headers = @{Authorization = "Bearer $Token"
        "x-ms-principal-id" = $subscription.AccountAdminLiveEmailId}

    If ((Get-MgmtSvcPlan -AdminUri $AdminUri -Token $Token | where-object {$_.Id -eq $subscription.planID}).servicequotas.servicename -contains "sqlservers")
        {write-host "SQL Server Databases is part of subscription " $subscription.subscriptionID " . Database(s) found : " (Invoke-RestMethod -Uri $FullUri -Headers $Headers).Name}
        else {write-host "SQL Server Databases is not part of subscription " $subscription.subscriptionID}
    }

Output:

[Screenshot: databases per subscription output]

Creating a database in a subscription, on behalf of a user

Reading data is great, but creating may be important once in a while, right?

This script creates a database in the specified subscription by preparing the JSON body and calling the right web request, this time with a “POST” method, vs. the implied “GET” in previous examples. Notice how we also have to specify the “application/json” content type.

$DatabaseName = "DB9999"
$SQLserverGroup = "Production"
$BaseSize = "1000"
$MaxSize = "1000"
$DBOwnerName = "User9999"
$DBOwnerPassword = "Pass@word1"
$BodyHashTable = @{
    Name = "{0}" -f $DatabaseName
    Edition = $SQLserverGroup
    BaseSizeMB = $BaseSize
    MaxSizeMB = $MaxSize
    Collation = "SQL_Latin1_General_CP1_CI_AS"
    Iscontained = "false"
    CreationDate = "0001-01-01T00:00:00+00:00"
    Status = "0"
    AdminLogon = $DBOwnerName
    Password = $DBOwnerPassword
    }
$BodyJSON = ConvertTo-Json $BodyHashTable

$SusbcriptionID = "afe457ae-f4d9-46e9-81a2-67c0654bf668"
$UserID = (Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token | Where-Object {$_.SubscriptionId -eq $SusbcriptionID}).AccountAdminLiveEmailId
$Headers = @{Authorization = "Bearer $Token"
        "x-ms-principal-id" = $UserID}
$FullUri = $TenantUri + "/" + $SusbcriptionID + "//services/sqlservers/databases/"
Invoke-RestMethod -Uri $FullUri -Headers $Headers -Body $BodyJSON -ContentType 'application/json; charset=utf8' -Method POST

Note: You do not need to use the “-f” syntax in the hashtable. This is just to show that you can work with values in many different ways, as long as you do not forget to convert to JSON later (using ConvertTo-Json).
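For instance, these two hash tables serialize to the same JSON once passed through ConvertTo-Json:

```powershell
ConvertTo-Json @{ Name = "{0}" -f "DB9999" }   # format-operator style
ConvertTo-Json @{ Name = "DB9999" }            # plain literal - identical output
```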

How did we know what properties to insert in the hash table?

Well, as said before, MSDN is your friend – in particular this link: it provides a sample JSON payload to insert in the request body. You could actually use it in the previous code, inserting the JSON body directly as a PowerShell here-string:

$BodyJSON = @"
{
  "Name": "DB9999",
  "SqlServerName": null,
  "SqlServerId": null,
  "ConnectionString": null,
  "Edition": "Production",
  "BaseSizeMB": 1000,
  "MaxSizeMB": 1000,
  "Collation": "SQL_Latin1_General_CP1_CI_AS",
  "IsContained": false,
  "CreationDate": "0001-01-01T00:00:00+00:00",
  "Status": 0,
  "SelfLink": null,
  "Quota": null,
  "AdminLogon": "User9999",
  "Password": "Pass@word1",
  "AccountAdminId": null
}
"@

Using a hash table has the benefit of making it easier to insert variables. When going from the here-string to the hash table, we also removed the “null” properties, keeping only the others. The subscription property was not added either, as we are already executing the request in the subscription’s context.

Output of a database creation:

[Screenshot: database creation output]


Bonus – Creating users and adding them to plans

In addition to the previous samples, here are three more very simple scripts. We had them handy, so we figured they were worth adding here as a bonus.

- Listing all users

Get-MgmtSvcuser -AdminUri $AdminUri -Token $Token

- Creating a Windows Azure Pack user

$NewUser = "testuserPS@contoso.com"
Add-MgmtSvcUser -AdminUri $AdminUri -Token $token -Name $NewUser -email $NewUser -State 'Active'

- Adding a WAP user to a plan (see also here)

$NewUser = "testuserPS@contoso.com"
$PlanName = "Contoso Gold Plan"
$Plan = Get-MgmtSvcPlan -AdminUri $AdminUri -Token $Token | where-object {$_.DisplayName -eq $PlanName}
Add-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token -PlanId $Plan.Id -AccountAdminLiveEmailId $NewUser -AccountAdminLivePuid $NewUser -FriendlyName 'MySubcription01'

Output for the new user and new subscription creations:

[Screenshot: new user and subscription output]

This is just a teaser, since Charles will be creating some posts on this topic soon, with more depth and more scenarios. For example, creating the user in WAP is only one of two steps. The second step is to make sure the user actually exists in the authentication system. With ADFS, there is no need for the second step (the user is already an Active Directory user, so what you have above is enough in an ADFS-enabled environment). But without ADFS, the user also needs to be created in the default ASP.NET membership provider.

 

What are the scenarios for these three examples – and let’s introduce the “ITSM dilemma”

When talking with enterprises, there is often an IT Service Management (ITSM) solution already in use to manage requests and approvals, often through a “service catalog”. Using the samples from this blog post, it is definitely possible to approve new database (or virtual machine) requests in the ITSM solution, and trigger API calls to actually deploy the resources on behalf of the user. Now, as organizations try to embrace the “IT as a Service” frame of mind, a balance has to be found, so that the traceability and approvals required by ITIL processes do not offset the agility and economics of the cloud. This is where these bonus examples might be interesting: if your processes require it, you could provide controlled/approved access to private plans in WAP, and then let users do whatever they want inside their “delegated sandboxes” (with optional chargeback).

The scenario would be like this:

1. The WAP administrator creates plans but keeps some of them “private”

2. A user goes to a well-known service catalog (System Center Service Manager comes to mind, but you may be using another solution today)

3. The “private” plans are listed in the service catalog. They could have been added to the CMDB by listing them with the samples here, through automation (assuming your ITSM solution allows CMDB items to be created programmatically – hint: Service Manager does)

4. The approvals occur within the realm of the ITSM solution, as for any other enterprise process

5. Once approved, automation kicks in. The automation process (Service Management Automation, Orchestrator, or just a PowerShell script) could either monitor approved requests, or be called externally by the ITSM solution. This depends on your design, and on the ITSM solution and automation engine being used

6. The Runbook or script can request an admin token, create the user (if not there already), add him/her to the plan, and notify the user by sending the URL to the WAP portal

7. Optionally, another Runbook could cycle through created resources (like all databases in all subscriptions) and add/update them in the CMDB. This is a way to reconcile items potentially created outside of the ITSM process, if that’s technically possible in your implementation.


What’s next?

Well, maybe this time we’ll take a breather. Parts 1 to 5 are now out, and we should be back in a few weeks with part 6, hopefully with something around Value Added Services/Offerings and Ongoing Automated Maintenance/Operations. Thanks for reading!


Blog Series Table of Contents

  1. Part 1: Intro & TOC
  2. Part 2: Automated Deployment of Tenant Network and Identity Workload(Isolated Tenant Virtual Network & Active Directory VM Role; from the Service Admin Persona)
  3. Part 3: Automated Deployment of the Identity Workload as a Tenant Admin(Active Directory VM Role; from the Tenant Admin Persona)
  4. Part 4: Automated Deployment of Tenant Workloads (Lync, SharePoint, and Exchange)(Lync, SharePoint, and Exchange VM Roles; from both Service Admin and Tenant Admin Personas)
  5. Part 5: Working with the SQL Server resource provider, and the ITIL dilemma(by Bruno Saille)
  6. Part 6: TBD(We hope to have something around: Value Added Services/Offerings and Ongoing Automated Maintenance/Operations)
