Channel: TechNet Blogs

International Women’s Day profile: Meet Skype Android app engineer Kine Camara


Women who are inspiring change are being honored March 8 on International Women’s Day, and Kine Camara is one, having crossed traditional boundaries in her role as an engineer, a field still dominated by men.

Camara, who was born in Senegal, says that she grew up interested in science but did not plan to become an engineer. She enrolled at the Polytechnic University of Montreal and remembers, “At first I was completely lost, but once I worked with friends to develop a mobile app, I knew I had to become a mobile software engineer.”

She now develops new features, fixes bugs and ensures high code quality for Skype on Android.

“There is not an equal representation of men and women” in engineering, Kine says. “But things are improving. There are more and more articles that talk about women in senior positions. There are more women at the head of large tech companies who write books and talk about their experiences. I think this will inspire more girls to go into engineering and technology.”

To read the full interview with Camara, head over to Skype’s Play Blog.

You might also be interested in:

· Skype celebrates women inspiring children in science, technology, engineering and math
· Meet five Microsoft women inspiring change in technology
· Another Amelia Earhart helps women soar, and relies on Office 365 to do so

Suzanne Choney
Microsoft News Center Staff


Introducing the “Gallery Resource Import Tool”


If you’ve started to use or evaluate the Windows Azure Pack (WAP), you may have noticed the different steps required to install a VM Role Gallery Item and to configure your environment to leverage this new artifact. These include:

  1. Downloading the Gallery Resource from Web Platform Installer, or from the source blog.
  2. Checking the virtual disk requirements (OS disk, and data disk if any) in the Gallery Item documentation, and configuring disks accordingly in your environment (family, release, operating system, tags). Most of this can be done in the VMM console; the tags require PowerShell.
  3. Importing the Resource Extension in VMM via PowerShell – possibly after editing the Resource Extension to add specific files/payloads (this information is also in the Gallery Item documentation)
  4. Importing the Resource Definition in the WAP administration portal

These steps are detailed here, and rely on specific information you can find in the documentation for each VM Role Gallery Item you wish to configure and import.
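The manual route above can be sketched with the VMM cmdlets. In this sketch, the server name, disk name, family, release, operating system, tags, and package path are placeholders; the real values come from the Gallery Item documentation for the item you are importing:

```powershell
# Sketch of the manual steps (values are examples, not the real requirements)
Import-Module virtualmachinemanager

# Step 2: tag an existing VHDX so it matches the Gallery Item's disk requirements
$vhd = Get-SCVirtualHardDisk -VMMServer "vmm01" |
    Where-Object { $_.Name -eq "WS2012R2.vhdx" }
Set-SCVirtualHardDisk -VirtualHardDisk $vhd `
    -FamilyName "Windows Server 2012 R2 Datacenter" `
    -Release "1.0.0.0" `
    -OperatingSystem (Get-SCOperatingSystem |
        Where-Object { $_.Name -match "2012 R2 Datacenter" }) `
    -Tag @("WindowsServer2012R2")

# Step 3: import the Resource Extension into a VMM library share
$libShare = Get-SCLibraryShare | Select-Object -First 1
Import-CloudResourceExtension -ResourceExtensionPath "C:\GalleryItems\MyApp.resextpkg" `
    -SharePath $libShare.Path
```

Step 4, importing the Resource Definition, is then done in the WAP administration portal as described below.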

More specifically, the Resource Extension step can be seen starting at 16:36 of the VM Role Resource Extension video, and the Resource Definition step can be seen starting at 16:07 in the VM Role Resource Definition video. Both videos are part of the series published by Stephen Baron from the Virtual Machine Manager team.

To simplify, automate and assist with this process, we are happy to release today the “Gallery Resource Import Tool” (GRIT).


This post explains the goals and capabilities of the tool, as well as how to use it.

Note: Before reading further, I would recommend reading a bit about what makes up a VM Role Gallery Item, mainly the notions of “Resource Extension” and “Resource Definition”. Michael Greene has written an excellent post on this topic, and it is available here.

 

What is GRIT?

Fully written in PowerShell, the “Gallery Resource Import Tool” (GRIT) aims to simplify the discovery and installation of VM Role Gallery Items in Windows Azure Pack (WAP), and to help reduce manual errors when tagging virtual disks. All of the configuration and import steps can be completed through this single tool.

Great, where can I get it?

Here is the download link:

Orchestrator Visio and Word Generator 1.5

Capabilities overview

With GRIT, you can:

  • Browse and download Gallery Resources available from Microsoft on the internet, or use a local copy from your disk
  • Review the virtual disk prerequisites for that Gallery Resource, compare them with your existing virtual hard disks, and optionally update these disks to match the requirements for the Gallery Resource
  • Import the Resource Extension and/or Resource Definition

All without firing up PowerShell! (Well… that’s not entirely true; you still need to run PowerShell to launch the tool itself.)

How to use the tool

1. First, download the tool from the link earlier in this blog post. The package contains a readme document (essentially redirecting you to this blog post) and the tool itself (a PS1 file).

2. Edit the $VMMServer and $SPFServer variables at the top of the PS1 file to match your environment, or enter these values on the command line when starting the tool.
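For example, assuming the script exposes those variables as parameters (the parameter names here are an assumption mirroring the variable names above):

```powershell
# Hypothetical invocation; -VMMServer and -SPFServer mirror the script variables
.\GRIT.ps1 -VMMServer "vmm01.contoso.com" -SPFServer "spf01.contoso.com"
```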

3. Please make sure that:

  • You are running this tool on a machine with the VMM and SPF cmdlets installed. An easy choice is the server running Service Provider Foundation (SPF).
  • You are running the tool from an elevated PowerShell/ISE prompt, or some of the cmdlets might fail to import the packages.

4. In the window that opens, select the Gallery Resource you want to configure and/or import. If network connectivity is available, you should see all Gallery Resources from the Web Platform Installer (WebPI) feed. If not, you should only see the option to browse for a Resource Definition file on the local machine.


5. The tool should then parse the chosen Gallery Resource, and open a new window with a welcome screen similar to the one you are reading right now.

6. In the "1.Virtual Disks Configuration" tab: review the discovered requirements for the virtual disks, and pick the operating system you would like to use (it has to be one of the operating systems from the gallery item documentation). In the grid that opens, review virtual disk compatibility and update disks as needed, until you get at least one “OS Disk Match” and one “Data Disk Match”. While you can choose to import the resource extension/definition before configuring the disks, just remember that the gallery item wizard and deployment may not work until valid disks are available.


Picking the Operating System for the OS disk


After picking an OS disk, this list comes up and displays the compatibility with your existing virtual hard disks. You can select the disks and apply OS/data disk settings to the selected disks.

7. In the "2.Gallery Resource Import" tab: you can choose to import the resource extension and/or the resource definition. To import the resource extension, you must first select a VMM library share. The tool may also warn you if it detects potentially missing payloads in the package to be imported, in which case we recommend you double-check the Gallery Resource documentation, and only proceed if you are sure the extension is ready. (Note: don’t worry, you can always remove the resource extension if you need to start over.) The tool also checks whether the Resource Extension and Resource Definition are already imported, and does not try to import them again if that’s the case.


Note: When updating disks or importing files, the tool will display progress in the PowerShell console (while the tool’s window waits for the end of the job to display status).


In this example, a disk was tagged as a data disk, and both the Resource Extension and Resource Definition were imported.

You will also get a popup at the end of each major operation (disk tagging, or Resource Extension/Definition import):


Once you have imported the Resource Extension and Resource Definition, you just have to go to the Windows Azure Pack (WAP) admin portal, set the gallery item to “public” and add it to a plan to test.

Tip: If you select a row and right-click on it, the OS disk family and release fields will be populated with the currently highlighted values. You can use this tip to easily copy some values before assigning new tags. This is essentially the same as typing the Family and Release fields manually, just faster…


Detailed capabilities

This section covers in more detail what happens under the hood when you run the tool; you can see the corresponding script in the PS1 file.

The tool does the following:

1. Lets you select a Resource Definition from the local disk, or download one from the Services Model feed from Web Platform Installer (WebPI).

a. The Resource Definition file path can also be specified as a parameter. The VMM and SPF server are also script parameters.

b. If no internet connectivity is detected, only local Resource Definition files are presented

2. Copies/downloads the packages, unzips them in a temporary directory, and parses the JSON files to discover virtual disk requirements, if any (where the model allows).

a. By default, packages are downloaded to a “C:\GRIT_Downloads” folder, which will be created by the tool. You can change this default value by updating the $DownloadFolder variable in the PS1 script.

b. Specific situations are also handled, like a Resource Definition file not relying on a Resource Extension file, or multiple Resource Definition files in a WebPI package (such as a “domain” and a “workgroup” Resource Definition, both leveraging the same Resource Extension).
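The unzip-and-parse step can be approximated in a few lines of PowerShell (a simplified sketch; the package and temp paths are examples, a package is treated here as a ZIP archive, and the property names in the JSON depend on the VM Role schema version):

```powershell
# Extract a downloaded package and parse its JSON documents (sketch)
Add-Type -AssemblyName System.IO.Compression.FileSystem

$package = "C:\GRIT_Downloads\MyApp.resdefpkg"   # example path
$tempDir = Join-Path $env:TEMP "GRIT_Extract"
[System.IO.Compression.ZipFile]::ExtractToDirectory($package, $tempDir)

# Read every JSON file in the package into objects for inspection
$documents = Get-ChildItem $tempDir -Filter *.json -Recurse |
    ForEach-Object { Get-Content $_.FullName -Raw | ConvertFrom-Json }

# Each object can then be probed for disk requirement properties
$documents | ForEach-Object { $_.PSObject.Properties.Name }
```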

3. Displays the computed requirements for virtual hard disks, and gives you the option to specify the OS disk operating system (this is the only thing you would still have to check in the documentation, as it is not part of the current VM Role model).

4. Based on these requirements, existing virtual hard disks are listed, explaining which ones are matches for OS disks and Data Disks (if any)

a. All the different cases are handled: OS disk parameterized or specified through family:release; data disk parameterized, specified through family:release, or not required.

5. As a user, you can then choose to apply required settings for OS or data disks, and see if they become “matches”

a. Items like “OS Disk Operating System” are read directly from VMM, both for convenience and to minimize errors

b. When applicable, items are grayed out to limit user errors (for example, data disks must have “None” as the Operating System)

6. In the same user interface, you can choose to import the Resource Extension and/or the Resource Definition

a. The tool performs some checks on the Resource Extension, to warn you when it detects the need for manual configuration before import (an external payload declared in JSON, or folders with a single readme file). A warning is displayed in that case, and the option to import the Resource Extension is unchecked by default. You can choose to bypass this and import the Resource Extension once you’re sure you’ve made the corresponding changes, if needed. You can also disable this check by setting the $CheckResExtContent variable to $false in the PS1 file (by default, the variable exists and is set to $true).

b. Since Resource Extensions are imported as VMM custom resources in a specific library share, VMM library shares are automatically detected and listed

7. When finished, the tool cleans up temporary files (zip and copies)


Summary and Wrap-up

We hope you will find this tool useful, and we really encourage you to send feedback!

A special thanks to MVP Kaido Järvemets. His blog post on color coding a datagrid inspired the look and feel of the disk configuration grid in this tool.

Thanks again for reading our blog.

HIMSS14 conference recap: Microsoft demos technology transforming the healthcare industry


The Healthcare Information and Management Systems Society (HIMSS) 2014 Annual Conference & Exhibition in Orlando, Fla., wrapped up Feb. 27, ending a week of wide-ranging demonstrations of the ways Microsoft technology is transforming the healthcare industry.

The conference showed off how Microsoft’s solutions are helping address some of the industry’s toughest challenges and opportunities, especially in areas of mobility and collaboration, electronic medical records (EMRs) system access and integration, health analytics, patient and consumer engagement, and governance and compliance.

Attendees saw how Windows 8.1, Surface and Office 365 devices, apps and services provide mobile tools and features enabling clinicians to quickly and easily communicate with care teams. These tools also help them access their full-function EMRs from anywhere, at any time.

Several Microsoft partners, including Omnicell and Predixion, demonstrated how they’re using Windows Embedded technology and Microsoft’s business intelligence and analytics capabilities to enhance operational performance in hospital settings.

The recipients of the Microsoft Health Users Group (HUG) 2014 Innovation Awards were also announced at HIMSS.

Find out more about HIMSS14 on the Microsoft in Health blog.



Athima Chansanchai
Microsoft News Center Staff

Microsoft Joins 900+ Data Enthusiasts to Examine how Big Data Drives Business Success at Gigaom’s Structure Data


More than 900 big data practitioners, technologists and executives will gather at Structure Data in New York March 19-20 to investigate how big data can drive business success, and Microsoft will be there to take part in the conversation. 

Around the world, big data and big compute are coming together to help enterprises and organizations create better products, make cities run smoother, advance disease prevention and treatments, and much more. 

Today, Microsoft is working with customers to do just this. In the process, we’ll bring the power of big data to a billion people – connecting companies to previously untouched data sources and enabling everyone to gain insights through familiar and powerful tools that now simply live in Microsoft Excel.

At the event, John Platt, distinguished scientist at Microsoft Research, will share the latest on Microsoft’s work in machine learning in a fireside chat.  We’ll also have data experts on hand to give attendees a taste of how easy it is to turn big data into beautiful visualizations and insights.  Attendees will also get a peek at how Microsoft’s Cybercrime Center is using data to fight worldwide organized crime and BotNets. (Get a sneak peek by watching the video below!) 

So please, join the conversation by attending the event and visiting with one of the Microsoft experts on hand. You can save 25% on your registration by using this special sponsor link.

Or, connect with us on Facebook.com/sqlserver and Twitter @SQLServer and learn how Microsoft’s approach to data helps employees, IT professionals and data scientists transform data into an organizational advantage. 

Networking Configurations for Hyper-V over SMB in Windows Server 2012 R2


Dan Stolts and Jose Barreto are back in this episode to discuss networking configurations for Hyper-V over SMB in Windows Server 2012 and Windows Server 2012 R2. Tune in as they chat about how networks should be configured, and as they provide several scenarios and best practices.

  • [2:21] Let’s do a quick recap of what SMB is and what Hyper-V over SMB is
  • [4:59] How do we configure the network for Hyper-V over SMB?
  • [10:14] Can we piggyback roles onto a small number of networks?
  • [15:00] Do we have to use RDMA NICs? If we do have them, should we use them? Where is RDMA most important if we have a limited number of RDMA NICs?
  • [19:12] How does Quality of Service play into the networking configuration?
  • [27:40] What are the different types of RDMA NICs, and what does each do?
  • [31:36] What best practices can you share with us?

Catch up on the other episodes of the Windows Server 2012 SMB series HERE.

Download Windows Server 2012 R2 Evaluation

Start Your 30 Day Windows Azure Trial

Tip of the Day: BranchCache part1


Today’s Tip…

Microsoft introduced BranchCache in Windows Server 2008 R2. The idea was that branch offices would only need to download data from the main office once. Then it would be cached at that branch. Subsequent requests for the same data would then be routed to the cached data once it was determined that the data was not stale. Over the next few tips, I’ll be forwarding information on some of the improvements made to BranchCache in Windows Server 2012 which focus on new tools and a simplified deployment model.

  • BranchCache no longer requires office-by-office configuration. Deployment is streamlined because there is no requirement for a separate Group Policy object (GPO) for each location. Only a single GPO that contains a small group of settings is required to deploy BranchCache in any size organization, from a small business to a large enterprise.
  • Client computer configuration is automatic. Clients can be configured through Group Policy as distributed cache–mode clients by default; however, they search for a hosted cache server, and if one is discovered, clients automatically self-configure as hosted cache-mode clients.
  • Cache data is kept encrypted, and hosted cache servers do not require server certificates. BranchCache security provides improved data encryption and other technologies, providing data security without requiring a public key infrastructure or additional drive encryption.
  • BranchCache provides tools to manipulate data and preload the content at remote locations. Now, you can push content to branch offices so that it is immediately available when the first user requests it. This allows you to distribute content during periods of low WAN usage.
  • BranchCache is deeply integrated with the Windows file server. BranchCache uses Windows file server’s state-of-the-art technology to divide files into small pieces and eliminate duplicates. This greatly increases the chance of finding duplicate pieces in independent files, resulting in greater bandwidth savings. BranchCache is also more tolerant of small changes in large files.
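The preloading capability described above maps to the BranchCache cmdlets that ship with Windows Server 2012. As a sketch (the paths are examples, and the exported package file name may differ in your environment):

```powershell
# On the main-office content (file) server: hash the share content
# and export it as a cache package
Publish-BCFileContent -Path "D:\Shares\Installers" -StageData -StagingPath "D:\BCStage"
Export-BCCachePackage -Destination "D:\BCPackage" -StagingPath "D:\BCStage"

# On the branch-office hosted cache server: import the package so the
# content is available before the first user ever requests it
Import-BCCachePackage -Path "\\fileserver\BCPackage\PeerDistPackage.zip"
```

Because the transfer of the package can be scheduled, this is what lets you push content during periods of low WAN usage.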

US Partner News Online for Friday, March 7, 2014



Welcome to this week’s issue of the US Partner News Online! Each week, we’ll bring you the latest news and information for Microsoft partners. You can read previous weeks’ issues at http://aka.ms/uspnewsletter. If you’d like to see the next post in your email inbox, go to http://aka.ms/uspblog and enter your email address under “Subscribe and follow” on the right hand side.

To stay in touch with me and connect with other partners and Microsoft sales, marketing, and product experts, join our US Partner Community on Yammer and choose how you want to stay informed.

Top stories

Let Us Help You Plan Your Microsoft Partner Network Membership Renewal
If your company's Microsoft Partner Network membership is up for renewal in the next couple of months, use the “Renew my membership” guidance on the MPN portal to start the renewal process. For assistance, you can request one-to-one communications with an MPN Support agent through the Partner Membership forum in the Partner Support Community.

WPC 2014 Updates: US Celebration, Awards, Content, and More
The 2014 Microsoft Worldwide Partner Conference is in Washington, DC, from July 13–17. The US Partner Team has just announced that this year's US Celebration at WPC 2014 is on Tuesday, July 15 at the Smithsonian National Air and Space Museum, and we can't wait to see you there. More announcements:

Art and Science of Partnerships: Hitting the Road with Windows Phone
On her executive blog, US Channel Chief Jenni Flinders talks about the Windows Phone apps she finds indispensable when she’s on the road to visit and talk with Microsoft partners. From managing flight information to entertainment and business productivity, there are Windows Phone apps that can make travel easier.

The Internet of Things Offers Microsoft Partners New Business Opportunities
The Windows Embedded Partner Program is now part of MPN, and you can demonstrate your expertise with intelligent systems by earning the new Intelligent Systems competency. Gartner and IDC both included the Internet of Things as a business opportunity that has significant growth projections for this year.

Ready-to-Go Marketing: Help Customers Say Goodbye to Windows XP and Upgrade by April 8
The Get Modern with Windows 8.1 and Office campaign, from Ready-to-Go Marketing, provides you with resources that will help you prepare for conversations with your customers and explain the benefits of modernizing their businesses by moving to current versions of Windows, Office, and Exchange. You'll help customers increase productivity, access information anywhere, and stay secure while positioning yourself as their trusted advisor.

Microsoft Channel Chief, Phil Sorgen: Smaller Partners Make Action Pack Better
Channel Chief Phil Sorgen talks about the redesigned Microsoft Action Pack subscription. It now provides simple and flexible resources that can grow and evolve with you, whether you're focused on providing cloud, on-premises, or hybrid solutions. You asked for training customized to your needs, and for added technical advisory support to provide much-needed help with those complex customer scenarios. Those benefits are now part of the Action Pack.

New Customer Campaign Explains the Value of Working with Microsoft Partners
A new campaign from the Microsoft Partner Network reaches out to customers to tell them the benefits of working with a partner that has proven expertise with Microsoft technologies by attaining silver or gold competency status. A new MPN page shows customers how to search on Microsoft Pinpoint to find a qualified partner based on business need, competency, or industry vertical.


Show Customers How Much They Can Save with Office 365
With support ending soon for Office 2003, Exchange Server 2003, and Windows XP, now is a good time for customers to move up to Office 365. Help your customers understand the benefits of moving to Exchange and Office in the cloud by using CloudReady Insight. This third-party tool calculates the return on investment of moving to Office 365 based on a customer's present on-premises usage.

Register for the VAR Incentive Program to Earn Up to 22 Percent in Incentives
The VAR Incentive provides maximum payout when you help your customers recognize the benefits of emerging Microsoft products and technologies, including virtualization, security, and unified communications. Register before March 31 to earn this incentive retroactive to your eligible sales starting in January. You must register for the current incentive even if you have participated in previous versions of the VAR rebate. Terms and conditions.

Earn $500 for Your First Sale of Microsoft Office 365 to an SMB Customer
Register for the VAR Incentive, then make your first Microsoft Office 365 sale through Open or Advisor between March 1 and May 31, and you may be eligible for an extra $500 payment. Terms and conditions.

SMB Advantage: Earn Partner Subsidy Funds for Cloud Sales
Earn partner subsidy funds on eligible sales of Microsoft Office 365 or Microsoft Dynamics CRM Online through Open or Advisor. Redeem the purchase within 15 days, and your customer will receive a check to be used within 90 days for additional purchases or services provided by you. Offer ends May 31. Terms and conditions.

Increase Your Customer Wins and Retention with Practice Accelerators
If you have a Microsoft competency, chances are good that you have not used many, if any, of your Advisory Hours benefit. Spend five of those hours on a Practice Accelerator that will help you build your services practice in one of these key areas: mobility, cloud, big data, or enterprise social.

Accelerate Your Learning with Practice Accelerator in March, and Enter to Win
Practice Accelerators can help your organization branch your business into a new services area. These sessions enable technical consultants and architects to increase their skills for solutions in the growth areas of mobility, big data, enterprise social, and cloud. When you complete any Practice Accelerator session between March 3 and March 31, you will be entered into our sweepstakes for a chance to win* a $1,000 gift card.
* No Purchase Necessary. Open only to authorized reps of companies actively enrolled in the Microsoft Partner Network. Game ends March 31. For details, including free alternate method of entry, download the official rules.

Prepare for the Microsoft Security Bulletin for March 2014
The next security bulletin release is on Tuesday, March 11. Get information about the security updates for February 2014 that cover Microsoft .NET Framework, Microsoft Security Software, and Microsoft Windows. You can watch the February webcast on demand, and register for the next webcast, on March 12.

Test Product Migration and Application Compatibility with Previous Versions
Over 30 products from the collection of previously released trial software remaining under mainstream support have been re-released on the TechNet Evaluation Center to support your migration and application compatibility testing needs. Get the downloads and then get started with migrating your customers to the most current Microsoft products.


Cloud 1-2-3 Academy for SMB Partners: Build a Profitable Cloud Business
Whether you are new to the cloud or have closed a few cloud deals already, Cloud 1-2-3 Academy will help you make decisions about your business model to help you maximize revenues in the long term. Register to attend an in-person event, register for the online, 6-week series—or, register for and attend both.

Spread the Word About Your Business with Microsoft Community Connections
Give your business a boost by hosting local events and webinars with Microsoft Community Connections (MCC). Be the small business expert in your area, and help spread the word via local business organizations like Chambers of Commerce and other networking groups. Fully scripted presentations are available for download at the MCC site. When you register your event with us, kits with marketing resources are provided at no cost to you.

Meet Search Marketing Experts at Bing Ads Connect
Come to one of these four in-person events to engage with search marketing experts and learn about your opportunities with Bing Ads. If you have an Action Pack subscription, your benefits include Bing Ads credits that you and your customers can use to market products, services and solutions. Come learn how to put them to work.

May 12–15 in Houston, Texas: Join Us at TechEd 2014
Connect with IT pros and enterprise developers and explore key technology trends at TechEd. Find out how Microsoft innovation is transforming the cloud, application architecture, and data analysis. Choose from hundreds of sessions and start building your schedule today.

Resources

Choose How You Want to Stay Informed as a US Partner
Microsoft offers its Partner Network members programs, training, resources, and materials to help them build, sell, deploy, and support their Microsoft-based solutions. To help you stay informed about news, opportunities, recommendations, and programs relevant to your business, the US Partner Team offers you options that include social platforms, email, and blogs. Choose your favorite way (or ways) to get the information you need.

Resources to Help You Transition Your Cloud Internal Use Rights
As of February 24, all Microsoft Action Pack and competency partners now have access to Microsoft Internal Use Rights for Microsoft cloud solutions such as Office 365, Windows Azure, Dynamics CRM Online, and Windows Intune. You can now deploy your internal use rights benefit allotment the way you choose, to suit your business needs—online, on-premises, or both.

US SMB Regional Partner Calls for March
Join your Microsoft US SMB regional team for the March partner call in their monthly series. These calls cover technology, sales, and marketing information and resources for resellers that serve SMB customers. You'll hear from Microsoft experts about your opportunities, resources and tips for building your business, and best practices.

Build a Marketing Program to Nurture Customers with the Microsoft Dynamics Partner Outsource Model
Are you a Microsoft partner short on time or marketing headcount, or would you simply prefer to focus your staff on other facets of your business? The Microsoft Dynamics Partner Outsource Model (OSM) can help. This program not only offers a full 3–12 months of marketing on your behalf at a fraction of the cost of doing it yourself, it generates, on average, a 40:1 ROI for participating partners.

Microsoft Dynamics Partners: Improve Your Search Rankings
Through the Marketing Services Bureau available to Microsoft Dynamics partners, White Hat Media is offering a free one-hour consultancy, an overview audit, and a 20% discount on a full audit and implementation, through March 31.

Set Your Job Role in Your MPN Profile to Get More Relevant Communications
Along with your company's MPN profile, your individual MPN profile settings help us direct the right email communications about partner information, resources, and opportunities to you. You can quickly check your primary job role setting, and update it if necessary, in the Partner Membership Center. Your job role is in the Your Name and Job Responsibilities section.

Training and certification

Stay informed about training opportunities that align to your business, role, and Microsoft Partner Network membership.

March Hot Sheet: Find and Register for Upcoming Training and Partner Calls
The Hot Sheet is a frequently updated schedule of live virtual and in-person training and informational calls that are coming in the next few weeks. Subscribe to the US Partner Learning blog to receive the Hot Sheet as soon as it's published each month. The April issue will be available the week of March 17.

Microsoft Dynamics Training Schedule
Register for Microsoft Dynamics training for both CRM and ERP using our schedule that goes through July. We also list Dynamics training on the monthly Hot Sheet.

Save on Certification Exams that Meet the New Intelligent Systems Competency Requirements
Save up to 40% when you purchase a Microsoft Competency Exam Pack. Exams eligible for this offer include those that are required for the new Intelligent Systems competency. Each exam voucher also gives you a second shot to pass the exam.

8 Weeks of Windows 8 – Enter to Win a Cool Device
Pass a Windows assessment and be entered to win a Dell Venue 8 Pro, a Toshiba Encore, or a Lenovo IdeaCentre Horizon Multimode Table PC. Weekly and grand prize drawings. Contest ends April 13. Open to US resident employees of OEM Reseller partners who are 18 or older. Weekly entry periods are Monday through Sunday. Download the official rules.

Windows Azure Week Sessions Available On Demand
If you missed Windows Azure Week, the sessions are now available on-demand. These deep dive courses from the experts who built Azure are full of demos and real world examples designed to show you how to start using Windows Azure in your solutions today.

Featured video

Watch this video for a demonstration of the CloudReady Insight cost-assessment tool from third-party Exoprise. The tool helps partners sell Office 365 to customers on Exchange Server 2003 or Exchange Server 2007. It runs in your customer's environment and analyzes the existing Exchange environment to produce detailed reports on total cost of ownership and end-user readiness for Office 365.


Calendar Through March 21

This list is a sampling of upcoming partner calls, training, and events. Refer to the March Hot Sheet for a comprehensive list for the next several weeks that's frequently updated.

What's New in SMB for Windows Server 2012 R2


Dan Stolts welcomes back Jose Barreto to the show as they discuss what’s new in SMB for Windows Server 2012 R2. Tune in as they chat about all of the new features and functionality for Server Message Block (SMB) and how these improvements will affect your datacenter.

  • [2:10] SMB 3.0 offers new, improved SMB bandwidth management. Can you share a bit with us about the three traffic types and how they are different?
  • [3:54] SMB 3.0 on Windows Server 2012 R2 offers new automatic rebalancing of Scale-Out File Server clients. Can you tell us about this excellent technology for improving scalability and efficiency?
  • [8:38] What about the new capabilities including SMB Multichannel and high speed migrations with low CPU utilization?
  • [13:24] Can you share a bit about SMB over RDMA?
  • [20:37]  What are VHDX files and how does it work in terms of storage inside virtual machines?
  • [25:30]  SMB 1.0 is now an optional feature. Why?
  • [28:48]  What about SMB event messages?

Catch up on the other episodes of the Windows Server 2012 SMB series HERE.

Download Windows Server 2012 R2 Evaluation

Start Your 30 Day Windows Azure Trial


PowerTip: Protect the Data Produced by PowerShell Jobs


Summary: Learn how to protect the data in a Windows PowerShell job.

Hey, Scripting Guy! Question How can I protect the data that is produced by a Windows PowerShell job?

Hey, Scripting Guy! Answer If the job is running and your Windows PowerShell session or computer crashes, you'll lose the data. One possibility is to write the data to disk before the job finishes so that it's available if the session or machine stops. This would be a good approach for overnight jobs where the machine is unattended and might be restarted by an outside process.
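As a minimal sketch of that approach (the job name and file path are hypothetical; adapt them to your environment), the job itself can persist its output as it runs:

```powershell
# Hypothetical overnight job that writes its results to disk as it goes,
# so a session or machine crash doesn't lose everything collected so far.
Start-Job -Name NightlyInventory -ScriptBlock {
    Get-Process |
        Select-Object Name, Id, WorkingSet |
        Export-Clixml -Path "$env:TEMP\NightlyInventory.xml"
}

# In a later (or new) session, read back whatever was written:
# Import-Clixml -Path "$env:TEMP\NightlyInventory.xml"
```

Export-Clixml preserves object structure, so the reloaded data behaves much like the original pipeline output.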

Creating A PowerShell Driven XAML Self Help Center


Overview


In this post I will demonstrate how to make a self help center using XAML and PowerShell. In case you missed it, I demonstrated how to integrate XAML into PowerShell here. The main objective of that post was to quickly demonstrate how to integrate XAML into PowerShell, and it concluded with a simple application that showed basic operating system details. With this post I decided to demonstrate a PowerShell-driven XAML solution to an age-old problem: how to empower end users to help themselves while limiting that power to specific tasks that the IT organization has chosen to allow end users to perform. This post also introduces some new concepts, such as working with grids, XAML form states, and interacting with Active Directory.

The Challenge


The challenge associated with creating an end user self help center is twofold:

  • How do I make the self help center easily discoverable?
  • How do I limit what end users can do within the self help center?

PowerShell, XAML, and Active Directory provide the answers to both challenges. I will show you how to use PowerShell to perform complex tasks on the user's behalf, XAML to make the self help center easily discoverable for end users, and Active Directory to ensure end users cannot perform tasks that are outside the self help center's scope.

Prerequisites


The solution discussed in this post requires that the following prerequisites be met:

  • An Active Directory forest with Windows Server 2008 R2 domain controllers
  • Windows 7 or newer domain joined clients

Configure Active Directory


Before we dive into the PowerShell and XAML application, Active Directory must be configured to allow end users to write to certain attributes of their user accounts. By default, administrative credentials are needed to modify most user attributes; since the objective of this post is to demonstrate how to empower end users, Active Directory permissions must be configured to permit this behavior. The steps below were performed on a Windows Server 2012 R2 domain controller.

  1. Log into a domain controller and go to Start > Run > dsa.msc
  2. Click View  and select Advanced Features
  3. Locate a user account that resides in the OU where you wish to grant users some self-service capability.
  4. Right click > Properties > Security Tab
  5. Highlight the SELF Access Control Entry and ensure "Read personal information" and "Write personal information" are selected as shown below
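If you prefer to script this instead of clicking through the GUI, the equivalent grant can be sketched with the dsacls tool. The distinguished name below is hypothetical, and this grants only the single mobile attribute used later in this post (RP and WP are the read-property and write-property rights):

```powershell
# Grant the SELF principal read/write on the mobile attribute of one user.
# Hypothetical distinguished name - substitute a real account in your OU.
dsacls "CN=Jane Doe,OU=Staff,DC=contoso,DC=com" /G "SELF:RPWP;mobile"
```

Run this from an elevated prompt with the AD DS tools available; to cover a whole OU, apply the grant at the OU level with inheritance instead of per user.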

Create the XAML Application


Now that you have verified that Active Directory will allow the end user to modify some of the properties of the user's object, it is time to create a XAML application that will present those properties to the end user in an intuitive manner. The following XAML code is designed to effectively demonstrate the concepts in this post and to be easily customizable to specific organizational needs.


In the following code I created some textboxes, labels, buttons and grids. Using the grid XAML element I was able to transform the appearance of the XAML interface that is presented to the end user without using something more complex such as tabs or multiple XAML forms. As I mentioned in my previous XAML post, I created all of the elements in Visual Studio Express 2013 first, then made the changes necessary to integrate the XAML form into PowerShell. For this example I also added the variable $ver which stores the version number. Although the version number is not displayed anywhere on the form, it is useful for keeping track of the latest version.

#Version
$ver="1.0.03062014"
#XAML
[void][System.Reflection.Assembly]::LoadWithPartialName('presentationframework')
[xml]$XAML = @'
<Window
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    Title="LogonBanner" Height="600" Width="800" WindowStartupLocation="CenterScreen" WindowStyle='None' ResizeMode='NoResize' WindowState='Maximized'>
    <Grid HorizontalAlignment="Center" VerticalAlignment="Center">
        <TextBox HorizontalAlignment="Center" HorizontalContentAlignment="Center" Height="60" TextWrapping="NoWrap" Text="Welcome to IT Company" VerticalAlignment="Top" Width="800" FontSize="36" Background="#FF00C6FF"/>
        <TextBox Visibility="Visible" Name="txtMessage" HorizontalAlignment="Center" Height="500" Margin="0,60,0,0" TextWrapping="Wrap" Text="" FontSize="14" VerticalAlignment="Top" Width="800"/>
        <Grid Visibility="Visible" Name="grdSelfHelp" HorizontalAlignment="Center" Height="493" Margin="0,88,0,0" VerticalAlignment="Top" Width="800">
            <TextBox HorizontalContentAlignment="Center" VerticalContentAlignment="Center" HorizontalAlignment="Left" Height="23" Margin="0,-28,0,0" TextWrapping="NoWrap" Text="Self Help Center" VerticalAlignment="Top" Width="800" Background="#FFC6FF00" FontWeight="Bold" FontSize="14"/>
            <Label Content="Logon Name" HorizontalAlignment="Left" Margin="0,33,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtSAMAccountName" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,33,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label  HorizontalContentAlignment="Center" Content="Physical Address" HorizontalAlignment="Left" Margin="450,2,0,0" VerticalAlignment="Top" Width="350" Background="#FF00C6FF"/>
            <TextBox Name="txtStreetAddress" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="575,33,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Street" HorizontalAlignment="Left" Margin="450,33,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtL" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="575,64,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="City" HorizontalAlignment="Left" Margin="450,64,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtST" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="575,95,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="State/Province" HorizontalAlignment="Left" Margin="450,95,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtPostalCode" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="575,126,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Zip/Postal Code" HorizontalAlignment="Left" Margin="450,126,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtCO" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="575,157,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Country/Region" HorizontalAlignment="Left" Margin="450,157,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <Label HorizontalContentAlignment="Center" Content="Account Details" HorizontalAlignment="Left" VerticalAlignment="Top" Width="350" Background="#FF00C6FF" Margin="0,2,0,0"/>
            <Label Content="Profile Path" HorizontalAlignment="Left" Margin="0,64,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtProfilePath" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,64,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Logon Script" HorizontalAlignment="Left" Margin="0,95,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtLogonScript" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,95,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Home Folder" HorizontalAlignment="Left" Margin="0,126,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtDirectory" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,126,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="User Principal Name" HorizontalAlignment="Left" Margin="0,157,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtUserPrincipalName" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,157,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="First Name" HorizontalAlignment="Left" Margin="0,222,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtGivenName" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,222,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label HorizontalContentAlignment="Center" Content="Contact Information" HorizontalAlignment="Left" Margin="0,191,0,0" VerticalAlignment="Top" Width="350" Background="#FF00C6FF"/>
            <Label Content="Last Name" HorizontalAlignment="Left" Margin="0,253,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtSN" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,254,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Display Name" HorizontalAlignment="Left" Margin="0,284,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtDisplayName" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,285,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Description" HorizontalAlignment="Left" Margin="0,315,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtDescription" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,315,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Office" HorizontalAlignment="Left" Margin="0,346,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtPhysicalDeliveryOfficeName" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,346,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Cell Phone Number" HorizontalAlignment="Left" Margin="0,377,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtMobile" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,377,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <Label Content="Email Address" HorizontalAlignment="Left" Margin="0,408,0,0" VerticalAlignment="Top" Width="120" Background="#FF00C6FF"/>
            <TextBox Name="txtMail" VerticalContentAlignment="Center" HorizontalContentAlignment="Center" HorizontalAlignment="Left" Height="26" Margin="125,408,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="225"/>
            <TextBox Name="txtResults" VerticalContentAlignment="Top" HorizontalContentAlignment="Left" HorizontalAlignment="Left" Height="212" Margin="450,222,0,0" TextWrapping="NoWrap" VerticalAlignment="Top" Width="350" BorderThickness="0" IsReadOnly="True"/>
        </Grid>
        <Button Name="btnAccept" Content="Accept" HorizontalAlignment="Left" Margin="0,560,0,0" VerticalAlignment="Top" Width="300" Height="40" Background="#FF00FF27"/>
        <Button Name="btnSelfHelp" Content="Visit Self Help Center" HorizontalAlignment="Left" Margin="305,560,0,0" VerticalAlignment="Top" Width="190" Height="40" Background="#FFBAD3F5"/>
        <Button Name="btnDecline" Content="Decline" HorizontalAlignment="Left" Margin="500,560,0,0" VerticalAlignment="Top" Width="300" Height="40" Background="#FFFF1400"/>
    </Grid>
</Window>
'@

In the next block of code the XAML is stored in the $reader variable and prepared for display.

#Read XAML
$reader=(New-Object System.Xml.XmlNodeReader $xaml)
try{$Form=[Windows.Markup.XamlReader]::Load( $reader )}
catch{Write-Host "Unable to load Windows.Markup.XamlReader. Possible causes include: the .NET Framework is missing, PowerShell was not launched in STA mode (powershell -STA), or invalid XAML code was encountered."; exit}

In the following code, the named XAML form objects are stored in PowerShell as variables and the disclaimer message variable is filled with a disclaimer message. I envision this type of XAML application being deployed as a logon script so I created it in a way that it would show a disclaimer message that the user must accept prior to completing the logon process for a domain joined workstation.

#===========================================================================
# Store Form Objects In PowerShell
#===========================================================================
$xaml.SelectNodes("//*[@Name]") | %{Set-Variable -Name ($_.Name) -Value $Form.FindName($_.Name)}

#===========================================================================
# Fills Variable Values
#===========================================================================
$txtMessage.Text = @'
WARNING!  This computer system is the property of the IT Company.  The IT Company may monitor any activity on the system and retrieve any information stored within the system.  By accessing and using this computer, you are consenting to such monitoring and information retrieval for law enforcement and other purposes.  Users should have no expectation of privacy as to any communication on or information stored within the system, including information stored locally on the hard drive or other media in use with this unit (e.g., floppy disks, tapes, CD-ROMs, etc).
'@

The following code adds events to the XAML form objects including function calls based on button presses.

#===========================================================================
# Add events to form objects
#===========================================================================
$btnSelfHelp.Add_Click({
    if($btnSelfHelp.Content -eq "Visit Self Help Center"){fnSelfHelp; fnViewState -State "selfhelp"}
    elseif($btnSelfHelp.Content -eq "Exit Self Help Center"){fnSelfHelp; fnViewState -State "disclaimer"}
    })

#Creates actions based on form state and button click
$btnDecline.Add_Click({
    if($btnDecline.Content -eq "Decline"){fnLogout}
    if($btnDecline.Content -eq "Cancel"){fnViewState -State "disclaimer"}
    })
$btnAccept.Add_Click({
    if($btnAccept.Content -eq "Accept"){$form.Close()}
    if($btnAccept.Content -eq "Update"){

        #Updates All AD Attributes
        fnUpdateAD -DistinguishedName $sDistinguishedName -Attribute "mobile" -NewValue $txtMobile.Text
        }
    })

For a XAML form like this, it is important to avoid cluttering the form with unnecessary buttons, and the goal is to minimize the number of steps the user must take to interact with it. To accomplish this I created multiple view states that transform the XAML form based on user activity. The following function controls that view state.

#===========================================================================
# Sets View State
#===========================================================================
function fnViewState{
param([Parameter(Mandatory=$true)][string]$State)

    #Sets read only Elements
    $txtSAMAccountName.IsEnabled=$false
    $txtStreetAddress.IsEnabled=$false
    $txtL.IsEnabled=$false
    $txtST.IsEnabled=$false
    $txtPostalCode.IsEnabled=$false
    $txtCO.IsEnabled=$false
    $txtProfilePath.IsEnabled=$false
    $txtLogonScript.IsEnabled=$false
    $txtDirectory.IsEnabled=$false
    $txtMail.IsEnabled=$false
    $txtGivenName.IsEnabled=$false
    $txtSN.IsEnabled=$false
    $txtDisplayName.IsEnabled=$false
    $txtDescription.IsEnabled=$false
    $txtPhysicalDeliveryOfficeName.IsEnabled=$false
    $txtMobile.IsEnabled=$true
    $txtUserPrincipalName.IsEnabled=$false

    #Changes viewstate based on selection
    switch($State){
        "disclaimer"{
            $txtMessage.Visibility="Visible"
            $grdSelfHelp.Visibility="Hidden"
            $btnAccept.Content="Accept"
            $btnDecline.Content="Decline"
            $btnSelfHelp.Content="Visit Self Help Center"
            $txtResults.Background="#FFFFFFFF"
        }
        "selfhelp"{
            $txtMessage.Visibility="Hidden"
            $grdSelfHelp.Visibility="Visible"
            $btnAccept.Content="Update"
            $btnDecline.Content="Cancel"
            $btnSelfHelp.Content="Exit Self Help Center"
            $txtResults.Background="#FFFFFFFF"
            $txtResults.Text=""
        }
    }
}

In the following code I tie Active Directory attributes to XAML form fields. Get-ADUser would have been much simpler for this task; however, it would rely on the AD RSAT tools being installed, so instead I opted to use the .NET Framework [adsisearcher] type accelerator.

#===========================================================================
# Displays AD Object Properties
#===========================================================================
function fnSelfHelp{
    #Store AD User Object
    $oUserObject = ([adsisearcher]"(&(objectCategory=User)(objectClass=User)(SAMAccountName=$env:USERNAME))").FindOne()

    #Displays Properties
    try{
        $txtSAMAccountName.Text = $oUserObject.Properties.Item("SAMAccountName")
        $txtStreetAddress.Text = $oUserObject.Properties.Item("streetAddress")
        $txtL.Text = $oUserObject.Properties.Item("l")
        $txtST.Text = $oUserObject.Properties.Item("st")
        $txtPostalCode.Text = $oUserObject.Properties.Item("postalCode")
        $txtCO.Text = $oUserObject.Properties.Item("co")
        $txtProfilePath.Text = $oUserObject.Properties.Item("profilePath")
        $txtLogonScript.Text = $oUserObject.Properties.Item("logonscript")
        $txtDirectory.Text = $oUserObject.Properties.Item("homeDirectory")
        $txtMail.Text = $oUserObject.Properties.Item("mail")
        $txtGivenName.Text = $oUserObject.Properties.Item("givenName")
        $txtSN.Text = $oUserObject.Properties.Item("sn")
        $txtDisplayName.Text = $oUserObject.Properties.Item("displayName")
        $txtDescription.Text = $oUserObject.Properties.Item("description")
        $txtPhysicalDeliveryOfficeName.Text = $oUserObject.Properties.Item("PhysicalDeliveryOfficeName")
        $txtMobile.Text = $oUserObject.Properties.Item("mobile")
        $txtUserPrincipalName.Text = $oUserObject.Properties.Item("userPrincipalName")
      }
      catch{}

    #Sets Global DistinguishedName Variable
    Set-Variable -Name sDistinguishedName -Scope Global -Value $oUserObject.Properties.Item("distinguishedName")
}

In the following function, I use the System.DirectoryServices.Protocols assembly to search Active Directory and to update the user's attributes in Active Directory.

#===========================================================================
# Updates Active Directory With Changed Attributes
#===========================================================================
function fnUpdateAD{
param([Parameter(Mandatory=$true)][string]$DistinguishedName,[Parameter(Mandatory=$true)][string]$Attribute,[Parameter(Mandatory=$true)][string]$NewValue)

    #Stores SearchRoot
    $sSearchRoot = "DC=" + $env:USERDNSDOMAIN.replace(".",",DC=")

    #LDAP Search Filter
    $filter = "(&(objectCategory=User)(objectClass=User)(SAMAccountName=$env:USERNAME))"

    #Add Assembly
    Add-Type -AssemblyName System.DirectoryServices.Protocols

    #Create Connection
    $connection=New-Object System.DirectoryServices.Protocols.LDAPConnection($env:USERDNSDOMAIN)

    #Search Request
    $req = New-Object System.DirectoryServices.Protocols.SearchRequest($sSearchRoot,$filter,"Subtree",$null)

    #Search Response
    $rsp = $connection.SendRequest($req)
    
    #Store Current Attribute Value
    $sCurrentValue = $rsp.Entries.Item(0).Attributes.$Attribute[0]

    #Checks for existing value
    if($sCurrentValue.Length -eq 0){$Action = "Add"}else{$Action = "Replace"}

    #Update Attribute Field
    try{
        $req = New-Object System.DirectoryServices.Protocols.ModifyRequest($DistinguishedName,$Action,$Attribute,$NewValue)
        $rsp = $connection.SendRequest($req)
        $txtResults.Text+="$($rsp.ResultCode): Updating $Attribute to $NewValue`r`n"
        if($txtResults.Text -notmatch "Error"){$txtResults.Background="#FFFFFFCC"}
     }
     catch{
        $txtResults.Text+="Error: Failed to update $Attribute to $NewValue`r`n"
        $txtResults.Background="#FF0033"
     }
}

The following function logs off the currently logged on user if the user refuses to accept the terms of use agreement.

#===========================================================================
# Forces Logoff For The Current User
#===========================================================================
function fnLogout{
    # Win32Shutdown flag 4 = forced logoff of the current user
    (Get-WmiObject -Class Win32_OperatingSystem).Win32Shutdown(4)
}

Last but not least, the following function displays the form.

#===========================================================================
# Shows the form
#===========================================================================
#Set ViewState
fnViewState -State "disclaimer"

#Show Form
$Form.ShowDialog() | out-null

So after all of that code, what does it actually do?

What It Does


When a user logs in they are presented with an acceptable use agreement as shown below. The user can either opt to Accept, Visit Self Help Center or Decline. Accept will log in the user normally, Decline will log the user off, and the main feature is the Visit Self Help Center button.


If the user opts to visit the Self Help Center they are presented with the following screen. For simplicity, I disabled all fields except the Cell Phone Number field, since this is meant to be only an example.

Editing the Cell Phone Number then clicking Update displays the following output:

From there the user can then exit the Self Help Center and proceed from the disclaimer page. To verify that the change was indeed successful, you can go to Active Directory and view the attribute as shown below.

Wrap Up


In this post I have shown you how to go beyond the basic XAML application which displays data and demonstrated how to use PowerShell, XAML, and Active Directory to empower end users to help themselves. To keep this demonstration as short as possible I did not include some of the code that should be included in a production deployment such as form validation, additional error checking, reverting back to the previous value if the user clicks cancel, etc. Also, many organizations have proprietary employee databases or in house solutions outside of Active Directory. It would be relatively simple to hook into more data sources than just Active Directory to further empower end users to perform some tasks that are typically performed by the organization's service desk.

To view the complete code it is attached to this post.

Contoso Labs-Fabric Choices (Network)


Contoso Labs Series - Table of Contents

In the prior post, we discussed how our major decision revolved around storage. The choices we made there would dictate the cost of the rest of the solution, so we had to make a decision there first, and carefully at that. Once we knew we'd be using Scale-Out File Servers and a converged fabric, we had to evaluate what that meant for our network.

Speeds and Feeds

Our first inclination was to get excited by all the possibilities and performance that the SOFS+SMB 3.0 stack could enable. How awesome is your life when you have technologies like SMB Multichannel and SMB Direct (RDMA) at your disposal? Once our budget numbers came back, we realized we had to scale back our ambitions a little. While SMB Direct is amazing, it requires special RDMA-enabled 40GbE, 10GbE, or InfiniBand cards. That presented a few serious problems.

  • Dual-port, RDMA-capable cards are expensive: well over $500 apiece for a 10GbE part, with limited options because it is newer technology. Do the math for one in each of 288 nodes, and you see a tremendous cost.
  • High-speed NICs require drastically more capable and expensive switching and routing infrastructure. 48-port 10GbE switches cost 2-4x more than their 1GbE counterparts. SFP+ cables often cost 5-10x or more of an equivalent-length CAT-6 cable. When you need a thousand of them? Ouch.
  • On top of that, because our compute nodes are Gen6 servers, we learned that it's actually frighteningly easy to overwhelm the PCIe bus in some RDMA scenarios. Our network could be TOO FAST FOR OLDER PCIe. What a great problem to have.

We decided to do the math to see if we could get away with using the existing connectivity in our scenario. The compute nodes currently have four 1GbE ports each. Frankly, that's not very much. 4Gbit of throughput, split between management, storage, AND tenant traffic, is almost criminally low. We debated long and hard about whether we could get away with it. After long deliberation, we decided that our users and workload could manage with this limited throughput, since heavy stress won't be generated during steady state. Our worst-case scenario is boot storms, which we think we can manage. However, we're acknowledging there's serious risk here. Until we get users on and load test, we can't be sure what kind of user base we can support. Just like our customers, sometimes we need to learn useful things by doing them and seeing what happens.
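To put a rough number on that (back-of-the-envelope only, ignoring protocol overhead):

```powershell
# Theoretical ceiling of four 1GbE ports per compute node
$gbps = 4                  # 4 x 1GbE
$MBps = $gbps * 1000 / 8   # ~500 MB/s total
$MBps                      # shared across management, storage, AND tenant traffic
```

Roughly 500 MB/s per node, before any traffic separation, is the entire budget for every workload on that host.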

In all honesty: This is not an architecture we'd recommend in any production environment meant for serious workloads. It does not provide enough room for I/O growth, and could too easily produce bottlenecks. On the flip side, we think it's an excellent dev/lab design that could make good use of older hardware, and provide a good experience. Keep that in mind as you read more of our architecture in the future.

Changing Realm of the Tenant Portal in Windows Azure Pack


Scenario:

There are times when you are required to have a single Security Token Service (STS) (e.g., AD FS) pointing to multiple WAP Tenant or Admin Portals. This is a common occurrence when you do not want to set up more than one STS instance (possibly because it is federated with your enterprise STS and other Identity Providers that you don't want to keep setting up repeatedly for every environment). Instead, you would like to have multiple WAP installations all pointing to a single STS instance that you can reuse. In this case you have multiple Tenant portals and multiple Admin portals from different WAP installations that need to be registered with AD FS.

When you try to set up the portals as Relying Parties (RP) to the STS, the STS is going to complain after the first RP, saying you have a collision with an already existing Relying Party Identifier, and will not let you set it up. This is because all your tenant portals have an out-of-the-box Realm (aka Identifier) value of http://azureservices/TenantSite. The Realm value is used to uniquely identify a Relying Party, and the tokens issued for the user will be targeted towards that realm; hence collisions are unacceptable. In this scenario, you have to change the realm of the portals in your different WAP installations so that they are sufficiently unique from the STS's perspective.


Scope

  1. This blog post will provide information about changing the realm of a single tenant portal and does not describe changing the realm of the Admin Portal
  2. You can reuse parts of the post (and code snippet) to modify the realm value of the Admin Portal as well by replacing appropriate values.

Solution

The following script will modify the realm of the tenant portal. You can run it from any machine that has the Windows Azure Pack PowerShell modules on it.

Note:

  1. Since the point of this script is to change the Realm of the tenant portal, it assumes that you have a Tenant Authentication site as your STS. If you have AD FS, you have to modify the code snippet to add AD FS-specific code to establish trust, and/or split it up into different functions.
  2. This script will modify the values in the WAP configuration database using PowerShell. Make sure you take appropriate backups before you execute the script.

     You will need the following values for this process: 

$authPortalUrl = 'win-0f2kif9k2tg:30071'
$tenantPortalUrl = 'win-0f2kif9k2tg:30081'

$dbServer = 'win-0f2kif9k2tg\sqlexpress'
$dbPassword = 'pass@word'

$newRealm = 'http://azureservices/mySite'

The Realm value should be in a URI format. This is per the specification for the Realm in the WS-Federation protocol, so always ensure that the realm is of the form http://value/value.
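Before writing the new value, it can be worth a quick sanity check that the realm really is a well-formed absolute URI:

```powershell
# Returns $true for values like 'http://azureservices/mySite'
[System.Uri]::IsWellFormedUriString($newRealm, [System.UriKind]::Absolute)
```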

Provisioning a new Realm to the portal involves 3 steps:

1. Change the Identifiers in the Portal and Service Management API databases

  1. Portal Config store
    The Realm that will be exposed through the Federation metadata file will be specified in the Authentication.RelyingParty field under the Namespace TenantSite in the Microsoft.MgmtSvc.PortalConfigStore database in the Config.Settings table. This is a JSON value containing information about endpoints, signing certificates and Realm.

    image

    This realm needs to be updated to the new value. Note that this value is case sensitive. The following code snippet will do that for you:

       $dbValue = Get-MgmtSvcDatabaseSetting -ConnectionString $portalConfigStoreConnectionString -Name 'Authentication.RelyingParty' -Namespace 'TenantSite'
       $jsonObject = $serializer.DeserializeObject($dbValue.Value)
       $jsonObject.Realm = $newRealm
       $newDbValue = $serializer.Serialize($jsonObject)

       Set-MgmtSvcDatabaseSetting -ConnectionString $portalConfigStoreConnectionString -Name 'Authentication.RelyingParty' -Namespace 'TenantSite' -Value $newDbValue -Force
  2. API Config Store
    The value here is used to validate that an incoming token was issued to the appropriate Relying Party. It is specified in the Authentication.RelyingParty.Primary field under the Namespace TenantAPI in the Microsoft.MgmtSvc.Store database. This is a JSON value containing information about endpoints, signing certificates and Realm.

    image

    This realm needs to be updated to the new value. Note that this value is case sensitive. The following code snippet will do that for you:

       $apiDbValue = Get-MgmtSvcDatabaseSetting -ConnectionString $storeConnectionString -Name 'Authentication.RelyingParty.Primary' -Namespace 'TenantAPI'
       $jsonObject = $serializer.DeserializeObject($apiDbValue.Value)
       $jsonObject.Realm = $newRealm
       $newapiDbValue = $serializer.Serialize($jsonObject)

       Set-MgmtSvcDatabaseSetting -ConnectionString $storeConnectionString -Name 'Authentication.RelyingParty.Primary' -Namespace 'TenantAPI' -Value $newapiDbValue -Force

I have used a JSON serializer in this sample. But if you do not have access to that library, you can use the ConvertTo-Json and ConvertFrom-Json PowerShell cmdlets to the same effect.
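As a sketch of that alternative (assuming PowerShell 3.0 or later, where those cmdlets are available), the first snippet could be rewritten as:

```powershell
# Same update as above, using the built-in JSON cmdlets instead of
# System.Web.Script.Serialization.JavaScriptSerializer.
$dbValue = Get-MgmtSvcDatabaseSetting -ConnectionString $portalConfigStoreConnectionString `
    -Name 'Authentication.RelyingParty' -Namespace 'TenantSite'
$jsonObject = $dbValue.Value | ConvertFrom-Json
$jsonObject.Realm = $newRealm
$newDbValue = $jsonObject | ConvertTo-Json -Compress

Set-MgmtSvcDatabaseSetting -ConnectionString $portalConfigStoreConnectionString `
    -Name 'Authentication.RelyingParty' -Namespace 'TenantSite' -Value $newDbValue -Force
```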

2. Restart the websites

You should restart the Tenant Portal and Tenant API services for the changes to get picked up. You can verify that the new value has been picked up by visiting the Federation Metadata endpoint of the portal at https://&lt;portalurl&gt;/FederationMetadata/2007-06/FederationMetadata.xml and validating the Realm value.

3. Establish Trust between the Portal and the STS

Once these services are restarted and the new realm value has propagated into the federation metadata, you have to re-establish trust between the portal and the STS. In the sample script provided, I have assumed that the Tenant Authentication Site is the STS. If you are using AD FS, please use the appropriate cmdlets for establishing trust.

   1:Set-MgmtSvcRelyingPartySettings -Target Tenant `
   2:     -MetadataEndpoint https://$authPortalUrl/FederationMetadata/2007-06/FederationMetadata.xml `
   3:     -ConnectionString $portalConfigStoreConnectionString `
   4:     -DisableCertificateValidation
   5:  
   6:Set-MgmtSvcIdentityProviderSettings -Target Membership `
   7:     -MetadataEndpoint "https://$tenantPortalUrl/FederationMetadata/2007-06/FederationMetadata.xml" `
   8:     -ConnectionString $portalConfigStoreConnectionString `
   9:     -DisableCertificateValidation

Once this is done, visit the tenant portal and observe it redirect to the Tenant Authentication site. The simplest way to see if the realm was changed properly is to observe the URL. For example, my portal redirects to the URI:

https://win-0f2kif9k2tg:30071/Login?p=login&ReturnUrl=%2fwsfederation%2fissue%3fwa%3dwsignin1.0%26wtrealm%3dhttp%253a%252f%252fazureservices%252fmySite%26wctx%3drm%253d0%2526id%253dpassive%2526ru%253d%25252f%2526cx%253d0%26wct%3d2014-03-07T01%253a48%253a19Z&wa=wsignin1.0&wtrealm=http%3a%2f%2fazureservices%2fmySite&wctx=rm%3d0%26id%3dpassive%26ru%3d%252f%26cx%3d0&wct=2014-03-07T01%3a48%3a19Z

Note the highlighted section. This will reflect the new realm that is used by the Tenant site to identify itself. You can also validate the changed realm by looking at the Config.Settings table in the Microsoft.MgmtSvc.PortalConfigStore and Microsoft.MgmtSvc.Store databases.

image

image

Summary

The Realm value is used to uniquely identify a Relying Party. So in the scenario where you have multiple WAP installations that need to point to the same STS, you should change the Realm values of these portals to unique values. You can modify the Realm value of the Tenant Portal in 3 steps:

  1. Modify the Realm values in the Portal and API database stores
  2. Restart the services so that they pick up the updated values. The changed value will be reflected in the Federation Metadata exposed by the Tenant Portal
  3. Re-establish trust between the portal and the STS

Find out how to publish Windows Phone apps for enterprise


The latest “Inside Windows Phone” video shows how to publish Windows Phone apps for enterprise through the Windows Phone Store, as a beta and through private distribution.

The process for distributing apps through the Store is exactly the same as for consumer apps. But with Windows Phone apps for enterprise, you can choose a best practice of securing your corporate data with a login.

You can also distribute your creation as a beta app; the advantage of that is that there’s no expiration. You need to provide the Live IDs of all your users and send them the link to the Store.

The last option is to side-load apps, which can be done via email or website.

Find more details about deploying Windows Phone apps for enterprise on the Windows Phone Developer Blog.


Athima Chansanchai
Microsoft News Center Staff

Check out Dell’s new Latitude 13 Education Series laptop for classrooms


clip_image001

Dell has announced the new classroom-ready Latitude 13 Education Series laptop, built to meet the needs of any student computing program.

This laptop, built with a rubberized LCD and base trim, is designed to be tough and has been subject to military-standard testing. It has a fully sealed keyboard and touchpad, a 13.3-inch display and on models with touch it has Corning Gorilla Glass NBT for scratch resistance.

It is also built to be secure, with comprehensive encryption, advanced authentication and integration with Microsoft System Center.

The Latitude 13 Education Series starts at $539 (in the U.S., with non-touch models available in red, blue and black and touch models available in black.)

Read more about the laptop on the Windows Experience Blog.


Athima Chansanchai
Microsoft News Center Staff

‘Call of Duty: Ghosts’ ‘Devastation’ coming first to Xbox April 3


“Devastation,” the second downloadable content pack for “Call of Duty: Ghosts,” debuts April 3, first on Xbox. It features four unique multiplayer maps, an all-new, tactical 2-in-1 weapon and “Episode 2: Mayday” – the next installment in Extinction’s four-part episodic narrative.

That 2-in-1 weapon is known as The Ripper, and gives players two different configurations, writes Jeff Rubenstein on Major Nelson’s blog: “Medium to short-range capability as a Sub Machine-Gun (SMG), and medium to long-range as an Assault Rifle (AR), giving maximum flexibility to adjust to a variety of map types and tactics.”

Read more about “Devastation” and the new maps – named Ruins, Behemoth, Collision and Unearthed – on Major Nelson’s blog.

You might also be interested in:

· Calling all fans! “Titanfall” for Xbox One: Microsoft retail stores midnight launch events
· Get ready for “Titanfall” with these three apps for Xbox One
· Limited time offer: Buy Xbox One, get a free copy of “Forza Motorsport 5”

Suzanne Choney
Microsoft News Center Staff


Automation–The New World of Tenant Provisioning with Windows Azure Pack (Part 2): Automated Deployment of Tenant Network and Identity


Hello once again!

So, here we are, Part 2 of The New World of Tenant Provisioning with Windows Azure Pack blog series (find Part 1: Intro & TOC here). Finally, some PowerShell/SMA Runbook examples!


Automated Deployment of Tenant Network and Identity

What does that mean, exactly?

Well for the context of this blog series, it means I am going to provide the PowerShell/SMA Runbook scripts necessary to create an Isolated Software Defined Network (SDN) & Active Directory VM Role. So, for organization’s sake, this blog post will be split into two main sections – one for automated SDN deployment, and one for automated VM Role deployment.

Automated SDN Deployment

For the most part, the PowerShell/SMA Runbook example for this already exists. Granted, you would have to be a Building Clouds Blog super-fan to know exactly where it is, but it does exist on this very blog. That said, for this series, we want to promote it much more, and underline the significance of it in this example solution.

So where did this example live before being reestablished here?

In this blog post: Automation–PowerShell Workflow Script Spotlight–Deploying Virtual Machine Manager Service Templates “OnBehalfOf” Tenant Administrator User Roles

An admittedly under-promoted yet valuable blog post on the automation of various VMM resources via PowerShell workflow, from the Service Administrator “OnBehalfOf” the Tenant Administrator.


In fact, I believe now is a great time to take a moment and describe the “Scope of Management” for these two personas in an image:

image

The reason I believe this is important is that I will be referring to each persona as this blog series continues. Now, this is not a comprehensive list of all the potential areas each persona has management over, but it covers what we need here.

Note     There may be other uses or ways to access the WAP Tenant API (non-Public) than just as a Service Administrator. From what I have seen, it requires bearer token authorization. And since the best way to get this token is via the WAP Admin PowerShell Cmdlet (Get-MgmtSvcToken), I made some assumptions.


Okay, let’s knock this SDN stuff out…

Create a VM Network “OnBehalfOf” a User Role

The following PowerShell workflow script (Create-VMNetwork) will create a VMM VM Network with the following settings (leveraging VMM PowerShell Commands):

  • VM Network Name:<VM Network Name generated by Owner User Role Name defined with parameter>
  • Subnet Name:<defined in script: “TenantSubnet”>
  • Subnet Value:<defined in script: “192.168.0.0/24”>
  • IP Address Pool Name:<defined in script: “TenantIPPool”>
  • IP Address Range Start:<defined in script: “192.168.0.100”>
  • IP Address Range End:<defined in script: “192.168.0.199” – providing for 100 available addresses>
  • DNS IP:<defined in script: “192.168.0.100” – first IP in Pool>
  • OnBehalfOfUser:<User Name parsed from Owner User Role Name defined with parameter>
  • OnBehalfOfUserRole:<User Role parsed from Owner User Role Name defined with parameter>

Note     You may keep these example settings, or modify to fit your deployment specifications.

Example PowerShell workflow script for Create-VMNetwork

workflow Create-VMNetwork
{
    param
    (
    [string]$OwnerUserRole,
    [string]$VmmServerName,
    [string]$CloudName,
    [string]$LogicalNetworkName
    )
 
    inlinescript
    {
        $subnetValue = "192.168.0.0/24"
        $subnetName = "TenantSubnet"
        $dnsIP = "192.168.0.100"
        $ipAdressPoolName = "TenantIPPool"
        $ipAddressRangeStart = "192.168.0.100"
        $ipAddressRangeEnd = "192.168.0.199"
        $UserRole = $Using:OwnerUserRole
        $User = $UserRole.Split("_")[0]
        $vmNetworkName = "Tenant Network ($User)"
       
        Get-SCVMMServer -ComputerName $Using:VmmServerName -ForOnBehalfOf | Out-Null
        $OwnerUserRoleObj = Get-SCUserRole | where {$_.Name -match $Using:OwnerUserRole}

        $VMNetwork = Get-SCVMNetwork -OnBehalfOfUser $User -OnBehalfOfUserRole $OwnerUserRoleObj
        if(!$VMNetwork) {
            $CloudObj = Get-SCCloud -Name $Using:CloudName
            $logicalNetwork = Get-SCLogicalNetwork -Cloud $CloudObj -Name $Using:LogicalNetworkName
            $vmNetwork = New-SCVMNetwork -Name $vmNetworkName -LogicalNetwork $logicalNetwork `
                -OnBehalfOfUser $User -OnBehalfOfUserRole $OwnerUserRoleObj
       
            $subnet = New-SCSubnetVLan -Subnet $subnetValue 
            $vmSubnet = New-SCVMSubnet -Name $subnetName -VMNetwork $vmNetwork -SubnetVLan $subnet `
                -OnBehalfOfUser $User -OnBehalfOfUserRole $OwnerUserRoleObj

            $allDnsServer = @($dnsIP)

            $staticIPAddressPool = New-SCStaticIPAddressPool -Name $ipAdressPoolName `
                -VMSubnet $vmSubnet -Subnet $subnetValue -IPAddressRangeStart $ipAddressRangeStart `
                -IPAddressRangeEnd $ipAddressRangeEnd -DNSServer $allDnsServer `
                -RunAsynchronously -OnBehalfOfUser $User -OnBehalfOfUserRole $OwnerUserRoleObj
        }
    }
}

Note     In general, the Create-VMNetwork workflow gets called once per Tenant Admin (Owner User Role, which is the equivalent of User + Plan Subscription). In fact, this workflow is most often called as part of the SMA Runbook linked to the Subscription.Create event within WAP/SPF, meaning that as soon as a Tenant Admin User Subscribes to the related Plan, the SDN is created automatically for that Subscription. For more information (and a specific example) about how WAP leverages SPF events to initiate SMA Runbooks, see the following TechNet Article: Using automation with Virtual Machine Clouds and blog post: Automation–Monitoring and Notifying in Windows Azure Pack with SMA

Calling the Create-VMNetwork PowerShell workflow

The following is a very basic example for calling this workflow:

$VMMServer = "MY_VMM_SERVER" 
$UserRole = "USER_ROLE" 
$CloudName = "My Tenant Cloud" 
$LogicalNetworkName = "Contoso Logical Network" 

Create-VMNetwork -VmmServerName $VMMServer -OwnerUserRole $UserRole -CloudName $CloudName -LogicalNetworkName $LogicalNetworkName

Again, there are lots of options here; choose the one that makes sense for your deployment. I am not going to dive into the details for this example, but obviously you can leverage alternate objects/variables to collect/pass the parameter data within this call (as it is a 1:1 Tenant Admin User Role:VM Network mapping in this example, where Tenant Admin User Role = User + Subscription to a Plan).


Why go directly against VMM, as opposed to leveraging the WAP Tenant API?

For two reasons, really. First, I wanted to highlight and leverage existing, known-good and well-used scripts. Second, search as I might, I have not found a published Windows Azure Pack Virtual Networks Tenant API (unlike what is readily available for Virtual Machine Roles), nor anything in the SPFAdmin cmdlets. So, the above script is what we have been using on my team in our Demo/Test/Dev environment for months now.

Also, one might argue that Network belongs to Fabric Management, which in turn belongs in VMM. Either way, based on what I could find, VMM is your [current] best bet.

Note    All this being said, I do have confirmation that this API does exist, and as soon as I have a public link for the API Reference for creating Virtual Networks in WAP, I will update this post right here.


Automated Active Directory VM Role Deployment

Let’s break this down…

The Options

In fact, this is another great time to illustrate this in image form, for these two personas:

image

Note     These options are the same for any Automated VM Role Deployment, Active Directory happens to be the first one to be deployed, so it gets all the attention.


The Process

Before we get started on the process, now is the perfect time to give a shout out to one of my WSSC CAT teammates, Nader Benmessaoud [MSFT], whom I would like to thank for paving the way toward my understanding of this process, with early work against the Public Tenant API, PowerShell examples, and foundational support in this effort. That said, I believe it is important for everyone to understand the current step-by-step process necessary to automatically deploy a VM Role via PowerShell against the available endpoints. It will let you appreciate the script that much more.

  1. Generate the Gallery Item VM Role Reference URI (based on Subscription ID and Tenant Portal Address)
  2. Invoke-WebRequest to Get the Gallery Item VM Role Reference (portion of the Gallery Item VM Role Resource Definition (ResDef) URI specific to the Gallery Item VM Role, based on the Gallery Item VM Role Name and data returned from Step #1)
  3. Generate the Gallery Item VM Role ResDef URI (based on data returned from Step #2)
  4. Invoke-WebRequest to Get the Gallery Item VM Role ResDef (based on the URI from Step #3, data returned in JSON)
  5. Convert (Deserialize) the returned ResDef JSON to a 'System.Collections.Generic.Dictionary[String,Object]'
  6. Create the Gallery Item VM Role Parameter Hashtable (based on custom variable data)
  7. Convert the Gallery Item VM Role Parameter Hashtable to JSON
  8. Create the Gallery Item VM Role Resource Definition Configuration (ResDefConfig) 'System.Collections.Generic.Dictionary[String,Object]' (based on converted Gallery Item VM Role Parameter data (JSON) and Version information)
  9. Create the Gallery Item VM Role Payload Hashtable (based on custom variable data, Gallery Item VM Role ResDef and ResDefConfig Dictionary Objects)
  10. Convert the Gallery Item VM Role Payload Hashtable to JSON
  11. Verify/Create Cloud Service Name (based on custom variable data)
  12. Generate the Gallery Item VM Role Deployment URI (based on Subscription ID, Tenant Portal Address, and Cloud Service Name)
  13. Invoke-WebRequest to Post the Gallery Item VM Role Payload JSON (based on the URI from Step #12)

Step #11 (Cloud Service Creation/Verification) has several Sub-Steps:

  1. Generate the Cloud Service URI (based on Subscription ID and Tenant Portal Address)
  2. Invoke-WebRequest to Get the Cloud Service Data and verify if Cloud Service already exists (based on the URI from Sub-Step #1)
  3. If it exists, Output Cloud Service Name
  4. If it does not exist, Create the Cloud Service Parameter Hashtable (based on custom variable data)
  5. Convert the Cloud Service Parameter Hashtable to JSON
  6. Invoke-WebRequest to Post the Cloud Service JSON (based on the URI from Sub-Step #1)
  7. Once created, Output Cloud Service Name

Some people like images better, so here is one that represents the text above:

image


So, if you were counting, that is…

  • 4 URIs Dynamically Generated
  • 3 Invoke-WebRequest GETs (for ResDef information)
  • 4 Conversions (1 from JSON, 3 to JSON)
  • 3 Hashtables Created
  • 2 'System.Collections.Generic.Dictionary[String,Object]'
  • 1 Cloud Service Creation (or reference to an existing Cloud Service)
  • 2 Invoke-WebRequest POSTs (1 for the Cloud Service Creation, 1 for the VM Role Deployment)

The magic here really exists in handling the data from JSON to Dictionary, Hashtable to JSON, etc. In fact, as you will see in the final script, JSON is the preferred method for data storage and transfer between sections of the script (from InlineScript to InlineScript and workflow to workflow).
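As a minimal sketch of that JSON hand-off pattern (hypothetical workflow and names, not part of the deployment script below), a hashtable is serialized to a JSON string in one InlineScript and deserialized in the next:

```powershell
workflow Test-JsonHandoff
{
    # Scope 1: build the data and serialize it to a JSON string,
    # which crosses the InlineScript boundary as plain text.
    $payloadJSON = InlineScript {
        $payload = @{ Name = "MyVMRole"; Version = "1.0.0.0" }
        ConvertTo-Json $payload
    }

    # Scope 2: deserialize the string and use the data.
    InlineScript {
        $payload = ConvertFrom-Json $Using:payloadJSON
        Write-Output $payload.Name
    }
}
```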

As a reference, be sure to review the following TechNet Article: Windows Azure Pack VM Roles Tenant API


Even More Magic

I realize I keep using this word to emphasize technical significance, but when I find something so useful and so “simple”, I label it as such. That said, there is a CRITICAL piece in this process that HAS to be pointed out specifically:

Get and Usage of the WAP MgmtSvcToken

I mention it in passing above, but this really makes everything work from a “Service Administrator leveraging the WAP Tenant API (non-public)” perspective.

Here is the TechNet Library Article on the WAP command used to retrieve the token: Get-MgmtSvcToken

Here is the snippet of script (you will see it within the larger script below) where the token is retrieved from WAP, and then placed in a $Headers Hashtable, along with the identity of the User (essentially the WAP Tenant API’s version of “OnBehalfOf”):

001
002
003
004
005
006
007
008
009
$AdminURI = "https://" + $Using:WAPServer + ":30004"
$AuthSite = "https://" + $Using:WAPServer + ":30072"
$ClientRealm = "http://azureservices/AdminSite"
$token = Get-MgmtSvcToken -Type Windows -AuthenticationSite $AuthSite `
    -ClientRealm $ClientRealm -DisableCertificateValidation

$Headers = @{
    Authorization = "Bearer $token"
    "x-ms-principal-id" = $Using:UserID }

Note     This $Headers Hashtable gets reused over and over, each time an Invoke-WebRequest is made against the WAP Tenant API (non-Public). In fact, this bit of script is the only reason the InlineScript is leveraged, remoting to the WAP Admin Server – the rest of the calls are 100% Invoke-WebRequests and do not require the WAP Cmdlet. It is also important to note, this functionality is not restricted to VM Clouds, but can be leveraged for WAP Tenant API (non-public) calls, in general.
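To illustrate, here is a sketch of a generic GET against a Tenant API (non-public) endpoint reusing that $Headers hashtable (the URI pattern and variable names are assumptions drawn from the script further below):

```powershell
# List the Cloud Services for a subscription, authenticating with the
# same $Headers hashtable built from the token and the user's identity.
$CloudServicesUri = "https://{0}:30005/{1}/CloudServices?api-version=2013-03" -f `
    $TenantPortalAddress, $SubscriptionID
$response = Invoke-WebRequest -Uri $CloudServicesUri -Headers $Headers -UseBasicParsing
([xml]$response.Content).feed.entry.content.properties.Name
```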


The Scripts

Based on the various options listed above, and considering the already tremendous length of this blog post, I am going to choose a [hopefully] winning combination that will satisfy the masses. Here goes…

Colonel Mustard In The Study (With A Candlestick)

Oh wait - wrong combination!

Service Administrator in SMA (with the WAP Tenant API Only)

That’s better!

So, what does this mean?

  • Service Administrator– The persona the script will be executed from and written for
  • in SMA– Service Management Automation, the PowerShell workflow engine in WAP (the very same thing that will automatically Create a VM Network “OnBehalfOf” a User Role, as described above)
  • (with the WAP Tenant API Only)– Because the script is executed from the Service Admin persona, there are a couple options, one is a mix of WAP Tenant API and VMM Cmdlet calls, the other is purely WAP Tenant API calls (no direct VMM commands used). Going with the WAP Tenant API Only option, keeps the script a bit more simple, and allows you to call the VM Role Deployments in the same way the Tenant Portal calls them. This means you can leverage the MicrosoftCompute.VMRole SPF event for monitoring and notification if desired (again, see the following blog post: Automation–Monitoring and Notifying in Windows Azure Pack with SMA for more information).

Let’s get started…

As you know, using SMA means using PowerShell workflow. It is a bit more complex than just “regular PowerShell”, but many liberties can be taken when leveraging the InlineScript functionality. Taking liberties, making my life easier, and built-in PowerShell Remoting are the three main reasons I chose to go with InlineScript in these examples. I am not opposed to other methods, this is just the one I chose.

So, the overall structure of my SMA Runbooks is as follows:

  1. Subscription-Create-Dispatcher: Top Tier Dispatch Runbook; hooked directly to the SPF event call; captures SPF event Data (Subscription and User info, etc.); used to decide which Subscription to act against, and which User to act for; calls numerous other Sub-Runbooks (including the one(s) for SDN Creation and VM Role Deployments – this is where multiple VM Role Deployments can be called at once)
  2. Deploy-TenantVMRole: Middle Tier Sub-Runbook; called by Dispatch Runbook; collects, parses and organizes deployment data from input parameter and variable data; calls Lowest Tier Sub-Runbook with specific deployment criteria
  3. Deploy-VMRole: Lowest Tier Sub-Runbook; called by Middle Tier Sub-Runbook; collects and uses input parameter data; executes all required VM Role Deployment calls (this is where some of the “options” come in – for example, instead of executing only against the WAP Tenant API, the actual VM Role Deployment could execute the required VMM Cmdlet command (Add-CloudResource))

And to help with the overall visualization as well as where these SMA Runbooks fit in the all-up process, here is another image:

image


Did someone say Scripts?

Yeah, yeah. I am getting there. Remember, it is not only about the “code”, but also how that “code” came to be. Or at least that is my reasoning for taking you on this journey.

Here goes…

I am going to start with the Lowest Tier Sub-Runbook (Deploy-VMRole), and work up.


Deploy a Gallery Item VM Role

The following PowerShell workflow script (Deploy-VMRole) will deploy a WAP Gallery Item VM Role with the following settings (with the WAP Tenant API Only):

  • WAP Server:<Name of WAP Admin Server defined with parameter used by InlineScript Remoting and by WAP Admin PowerShell Cmdlet>
  • Credentials:<PSCredential defined with parameter used by InlineScript Remoting>
  • Tenant Portal Address:<FQDN of the WAP Tenant Portal Server defined with parameter used to generate URIs for Invoke-WebRequest API Calls to WAP Tenant API>
  • Subscription ID:<Subscription ID for the User for whom the VM Roles will be deployed, defined with parameter>
  • User ID:<User ID of the User for whom the VM Roles will be deployed, defined with parameter>
  • Gallery Item Name:<Partial or Full Name of the Gallery Item VM Role to be Deployed, defined with parameter, used to match the Gallery Item Reference Data to generate the Gallery Item ResDef URI>
  • Gallery Item Version:<Version of the Gallery Item VM Role to be Deployed, defined with parameter, used to identify correct version of the Gallery Item VM Role>
  • ResDefConfig JSON:<Resource Definition Configuration data (in JSON format), defined with parameter, used to satisfy the ResDef, part of the Gallery Item VM Role Payload>
  • Cloud Service Name:<Name of the Cloud Service, defined with parameter, used in the Gallery Item VM Role Deployment>
  • VM Role Name:<Name of the VM Role to be Deployed, defined with parameter, part of the Gallery Item VM Role Payload>

Note     Each of these parameters are leveraged throughout the Deploy-VMRole workflow and are configured for maximum re-use and flexibility. The intention of this Low Tier Sub-Runbook is to remain as generic as possible. Specific parameter setting occurs at the Top and Middle Tier Runbooks.

Example PowerShell workflow script for Deploy-VMRole

workflow Deploy-VMRole
{
    param
    (
        [string]$WAPServer,
        [PSCredential]$Creds,
        [string]$TenantPortalAddress,
        [string]$SubscriptionID,
        [string]$UserID,
        [string]$GalleryItemName,
        [string]$GIVersion,
        [string]$ResDefConfigJSON,
        [string]$CloudServiceName,
        [string]$VMRoleName
    )

    $VMRole = InlineScript {

        $AdminURI = "https://" + $Using:WAPServer + ":30004"
        $AuthSite = "https://" + $Using:WAPServer + ":30072"
        $ClientRealm = "http://azureservices/AdminSite"
        $token = Get-MgmtSvcToken -Type Windows -AuthenticationSite $AuthSite -ClientRealm $ClientRealm -DisableCertificateValidation

        $Headers = @{
            Authorization = "Bearer $token"
            "x-ms-principal-id" = $Using:UserID }

        # Get Gallery Item Reference
        $GIReferenceUri = "https://{0}:30005/{1}/Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID
        $GIReferenceData = [xml](Invoke-WebRequest -Uri $GIReferenceUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content)
        $GalleryItemREF = $GIReferenceData.feed.entry.content.properties.resourcedefinitionUrl | ? {$_ -match $Using:GalleryItemName}

        # Get Gallery Item Resource Definition
        $GIResDEFUri = "https://{0}:30005/{1}/{2}/?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID,$GalleryItemREF
        $GIResourceDEFJSON = Invoke-WebRequest -Uri $GIResDEFUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content 
     
        #Convert ResDef JSON to Dictionary
        [System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | Out-Null
        $JSSerializer = New-Object System.Web.Script.Serialization.JavaScriptSerializer
        $ResDef = $JSSerializer.DeserializeObject($GIResourceDEFJSON)

        #Add ResDefConfig JSON to Dictionary
        $ResDefConfig = New-Object 'System.Collections.Generic.Dictionary[String,Object]'
        $ResDefConfig.Add("Version",$Using:GIVersion)
        $ResDefConfig.Add("ParameterValues",$Using:ResDefConfigJSON)

        # Set Gallery Item Payload Variables
        $GISubstate = $null
        $GILabel = $Using:VMRoleName
        $GIName = $Using:VMRoleName
        $GIProvisioningState = $null
        $GIInstanceView = $null

        # Set Gallery Item Payload Info
        $GIPayload = @{
            "InstanceView" = $GIInstanceView
            "Substate" = $GISubstate
            "Name" = $GIName
            "Label" = $GILabel
            "ProvisioningState" = $GIProvisioningState
            "ResourceConfiguration" = $ResDefConfig
            "ResourceDefinition" = $ResDef
            }

        # Convert Gallery Item Payload Info To JSON
        $GIPayloadJSON = ConvertTo-Json $GIPayload -Depth 7

        # Get Cloud Services
        $CloudServicesUri = "https://{0}:30005/{1}/CloudServices?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID
        $CloudServicesData = [xml](Invoke-WebRequest -Uri $CloudServicesUri -Headers $Headers -UseBasicParsing | Select-Object -ExpandProperty Content)
        $CloudService = $CloudServicesData.feed.entry.content.properties.Name | ? {$_ -match $Using:CloudServiceName}
        if (!$CloudService) {
            # Set Cloud Service Configuration
            $CloudServiceConfig = @{
                "Name" = $Using:CloudServiceName
                "Label" = $Using:CloudServiceName
                }

            # Convert Cloud Service Configuration To JSON
            $CloudServiceConfigJSON = ConvertTo-Json $CloudServiceConfig

            $CloudServicesData = [xml](Invoke-WebRequest -Uri $CloudServicesUri -Headers $Headers -Method Post -Body $CloudServiceConfigJSON -ContentType "application/json" -UseBasicParsing)
            $CloudService = $CloudServicesData.entry.content.properties.Name | ? {$_ -match $Using:CloudServiceName}
        }

        # Set Gallery Item VM Role Deploy URI
        $GIDeployUri = "https://{0}:30005/{1}/CloudServices/{2}/Resources/MicrosoftCompute/VMRoles/?api-version=2013-03" -f $Using:TenantPortalAddress,$Using:SubscriptionID,$CloudService

        # Deploy Gallery Item VM Role
        $VMRoleDeployed = Invoke-WebRequest -Uri $GIDeployUri -Headers $Headers -Method Post -Body $GIPayloadJSON -ContentType "application/json" -UseBasicParsing

        Return $VMRoleDeployed

    } -PSComputerName $WAPServer -PSCredential $Creds

    $VMRole
}

Note     There are a ton of nuances within this example workflow. It is because of this, I outlined The Process above. In fact, this example workflow includes the following process steps: 1-5, 8-13 and all Cloud Service Creation sub-steps. Steps 6 and 7 occur in the Middle Tier Sub-Runbook.

Calling the Deploy-VMRole PowerShell workflow

The following PowerShell workflow script (Deploy-TenantVMRole) will call the Deploy-VMRole workflow with the following settings:

  • WAP Server:<Name of WAP Admin Server, defined by SMA Variable>
  • Credentials:<PSCredential, defined by SMA Variable>
  • Tenant Portal Address:<FQDN of the WAP Tenant Portal Server, defined by SMA Variable>
  • Subscription ID:<Subscription ID for the User for whom the VM Roles will be deployed, defined by parsing the $OwnerUserRole parameter>
  • User ID:<User ID of the User for whom the VM Roles will be deployed, defined by parsing the $OwnerUserRole parameter>
  • Gallery Item Name:<Partial or Full Name of the Gallery Item VM Role to be Deployed, defined by parsing the $GalleryItemToDeploy parameter>
  • Gallery Item Version:<Version of the Gallery Item VM Role to be Deployed, defined by parsing the $GalleryItemToDeploy parameter>
  • Cloud Service Name:<Name of the Cloud Service, generated by combining hardcoded custom variable data and $SubscriptionID variable>
  • OS Disk:<Name of the Default OS Disk to be used in the Gallery Item VM Role Deployment, defined by SMA Variable>
  • Password:<Password used in the Gallery Item VM Role Deployment, defined by SMA Variable>
  • ResDefConfig JSON:<Resource Definition Configuration data (in JSON format), defined by numerous parameters/variables, contains common and Gallery Item VM Role specific data>

Note     You may keep these example settings, or modify to fit your deployment specifications. Each of these parameters/variables are leveraged in the Deploy-VMRole workflow call and are configured for maximum re-use and flexibility. The intention of this Middle Tier Sub-Runbook is to balance common and Gallery Item VM Role specific data so that the line between them is clear and updates are simple (as the number of Gallery Item VM Roles in your environment grows). Even more specific parameter settings occur in the Top Tier Runbook.

Example PowerShell workflow script for Deploy-TenantVMRole

workflow Deploy-TenantVMRole
{
    param
    (
        [string]$OwnerUserRole,
        [string]$GalleryItemToDeploy,
        [string]$VMRoleName,
        [string]$VMRoleNamePattern,
        [string]$VMRoleSize
    )
   
    #Define Variables
    $WAPServer = Get-AutomationVariable -Name 'WAP Admin Server'
    $Creds = Get-AutomationPSCredential -Name 'PSCredential Name'
    $TenantPortalAddress = Get-AutomationVariable -Name 'WAP Tenant Server FQDN'
    $SubscriptionID = $OwnerUserRole.Split("_")[1]
    $UserID = $OwnerUserRole.Split("_")[0]
    $GalleryItemName = $GalleryItemToDeploy.Split(";")[0]
    $GIVersion = $GalleryItemToDeploy.Split(";")[1]
    $CloudServiceName = "CloudService-4-{0}" -f $SubscriptionID
    $OSDisk = Get-AutomationVariable -Name 'Default VM Role OS Disk'
    $Password = Get-AutomationVariable -Name 'Password'
   
    # Create Gallery Item Parameter Hashtable (for Common Data)
    $GIParamList = @{
        VMRoleVMSize = $VMRoleSize
        VMRoleOSVirtualHardDiskImage = $OSDisk
        VMRoleAdminCredential = "administrator:{0}" -f $Password
        VMRoleTimeZone = "Pacific Standard Time"
        VMRoleComputerNamePattern = $VMRoleNamePattern
        VMRoleNetworkRef = "Tenant Network ({0})" -f $UserID
        }

    # Add to Gallery Item Parameter Hashtable (for GI Specific Data)
    if ($GalleryItemName -eq "DomainController")
    {
        $GIParamList += @{DomainControllerWindows2012DomainDNSName = $UserID.Split("@")[1]} 
        $GIParamList += @{DomainControllerWindows2012DomainNETBIOSName = ($UserID.Split("@")[1]).Split(".")[0]}
        $GIParamList += @{DomainControllerWindows2012SafeModeAdminPassword = $Password}
    }
   
    # Convert Gallery Item Parameter Hash To JSON
    $ResDefConfigJSON = ConvertTo-Json $GIParamList
   
    Deploy-VMRole -WAPServer $WAPServer -creds $Creds -TenantPortalAddress $TenantPortalAddress `
        -SubscriptionID $SubscriptionID -UserID $UserID -GalleryItemName $GalleryItemName `
        -GIVersion $GIVersion -ResDefConfigJSON $ResDefConfigJSON -CloudServiceName $CloudServiceName `
        -VMRoleName $VMRoleName
}

Did you miss it?

Up to this point, you have been inundated with “generic” workflow examples to deploy any Gallery Item VM Role. So, you may have actually missed where this post went from “generic” to “specific”. If you did, look at the Deploy-TenantVMRole workflow script again, and check out the if ($GalleryItemName -eq "DomainController") block. For this Middle Tier Sub-Runbook, that is the extent of the “specific”. Believe me, I would have loved to make this and the Lowest Tier Sub-Runbook 100% generic, but it is just not possible. More about this, and the decisions I made, are in the note below.

Note     The Gallery Item VM Role specific data (as it relates to the other Gallery Item VM Roles in this blog series, and from the Building Clouds Blog Gallery) is kept in a separate section of the script above and is surrounded by “if” logic. The intention of this section is to logically store the Gallery Item VM Role specific data by $GalleryItemName. I put quite a bit of thought into where the best place for this “unique” data should live within these example workflows, as well as more dynamic ways to process it; this happens to be where it landed and what it looks like. And obviously, this will not work for every possible VM Role created, but it is one idea/implementation that has worked for my team’s deployment scenario. Oh, and before I forget, this is the portion of the examples where Steps 6 and 7 of The Process take place.

Calling the Deploy-TenantVMRole PowerShell workflow

The following PowerShell workflow script (Subscription-Create-Dispatcher) will call the Deploy-TenantVMRole workflow with the following settings:

  • Gallery Item To Deploy:<Concatenated string made up of the Gallery Item Name (full or partial) and Gallery Item Version in GalleryItemName;1.0.0.0 format, defined by custom data>
  • Owner User Role:<Concatenated string made up of the User ID (email address) and Subscription ID (GUID) in email@address.com_SUBSCRIPTION-GUID-STRING format, defined by SPF event data stored in the $resourceObject input variable>
  • VM Role Name:<Name of the VM Role to be Deployed (will be how the VM Role is seen in the Tenant Portal), defined by custom data>
  • VM Role Name Pattern:<Name Pattern of the VM to be Deployed (will be how the VM is named within the hypervisor) in NN## format, defined by custom data>
  • VM Role Size:<Size of the VM Role to be Deployed, restricted to available VMM Hardware Profile/Gallery Item VM Role definitions, defined by custom data>
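
Before moving on, here is a minimal, hypothetical sketch of how those two concatenated strings are assembled and then pulled apart (the sample email address and GUID are placeholders, not real tenant data); the Split calls mirror the ones used in Deploy-TenantVMRole:

```powershell
# Assemble the concatenated inputs (sample values are placeholders)
$GalleryItemToDeploy = "DomainController;1.0.0.0"
$OwnerUserRole = "tenant@contoso.com_11111111-2222-3333-4444-555555555555"

# Pull them apart the same way Deploy-TenantVMRole does
$GalleryItemName = $GalleryItemToDeploy.Split(";")[0]   # DomainController
$GIVersion       = $GalleryItemToDeploy.Split(";")[1]   # 1.0.0.0
$UserID          = $OwnerUserRole.Split("_")[0]         # tenant@contoso.com
$SubscriptionID  = $OwnerUserRole.Split("_")[1]         # the Subscription GUID
```

Note that this parsing assumes the GUID portion contains no underscores (standard GUIDs use hyphens), otherwise Split("_")[1] would truncate it.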

Note     The data entered here is completely customizable and should fit your deployment specifications. The intention of this Top Tier Runbook is to set Gallery Item VM Role specific variables and parameters to be passed on to the much more generic Middle and Lowest Tier Sub-Runbooks. In fact, this Top Tier Runbook is where the Tenant configuration is built out:

  • SDN
  • Multiple concurrent (and/or dependent) Gallery Item VM Role Deployments
  • Job Monitoring
  • Notifications
  • etc.

Remember, as the Top Tier Runbook, it is the one called by the Subscription.Create WAP/SPF event, configured in the VM Clouds Resource Provider:

image

It is the main “hook” for the Tenant Provisioning Process of “Subscribe to a Plan, Get a Fully Deployed Set of Collaborative Workloads”.

Example PowerShell workflow script for Subscribe-Create-Dispatcher

workflow Subscription-Create-Dispatcher
{
    param
    (
        [object]$resourceObject
    )

    if ($resourceObject.AdminId.Length -gt 27) { $AdminId = $resourceObject.AdminId.SubString(0,27) } else { $AdminId = $resourceObject.AdminId }
    $OwnerUserRole = $AdminId + "_" + $resourceObject.SubscriptionID
    $SubscriptionName = $resourceObject.SubscriptionName
   
    $VMMServer = Get-AutomationVariable -Name 'VMM Server'
    $LogicalNetworkName = Get-AutomationVariable -Name 'Default VMM Logical Network'
    $PSEmailServer = Get-AutomationVariable -Name 'SMTP Server'
    $PSEmailFrom = Get-AutomationVariable -Name 'SMTP From Email'
    $PSEmailCC = Get-AutomationVariable -Name 'PSEmailCC'
   
    if ($SubscriptionName -eq "Collaboration Workloads")
    {
        $CloudName = "Tenant Cloud"
       
        Create-VMNetwork -VmmServerName $VMMServer -OwnerUserRole $OwnerUserRole -CloudName $CloudName -LogicalNetworkName $LogicalNetworkName
       
        Send-SMTPNotification -SendNotificationType "Plans" -PSEmailFrom $PSEmailFrom -PSEmailTo $AdminId -PSEmailServer $PSEmailServer -PSEmailCC $PSEmailCC -WorkloadName $SubscriptionName
        $SubscriptionName + " Plan Selected"
       
        "Deploying Active Directory"
        Deploy-TenantVMRole -GalleryItemToDeploy "DomainController;1.0.0.0" `
            -OwnerUserRole $OwnerUserRole -VMRoleName "ActiveDirectory" `
            -VMRoleNamePattern "DC##" -VMRoleSize "ExtraSmall"
    }
}

Note     This example is exactly what we have been using on my team in our Demo/Test/Dev environment. As you can see, it includes more than just a call to the Deploy-TenantVMRole workflow. In fact, it also includes:

  • Extraction of SPF event data from the $resourceObject input variable
  • Example generation of the $OwnerUserRole string
  • Usage of SMA Variables
  • Logic for Subscription dispatching based on Subscription Name (one example provided)
  • Invocation of the Create-VMNetwork workflow (to create the SDN described above)
  • Invocation of the Send-SMTPNotification workflow (to send email notifications to the user about post-subscription activities) - For more information (and a specific example) about how the Send-SMTPNotification workflow works, see the following blog post: Automation–Monitoring and Notifying in Windows Azure Pack with SMA
  • And finally, Invocation of the Deploy-TenantVMRole workflow (with Gallery Item VM Role specific data)

By the way - I realize that some of the direct text output and use of hardcoded values could be better served in Write-Verbose commands and usage of variables (respectively), but for our environment, and for this example, this is what I went with. That said, you may keep these example settings, or modify to fit your deployment specifications.
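
To illustrate that point with a small, hypothetical sketch (the message text and variable value below are placeholders): the bare status strings could instead be routed to the verbose stream, keeping the workflow’s output stream clean for real return values:

```powershell
# Hypothetical: replace bare status strings with verbose-stream messages
$SubscriptionName = "Collaboration Workloads"   # placeholder value
Write-Verbose -Message ("{0} Plan Selected" -f $SubscriptionName)
Write-Verbose -Message "Deploying Active Directory"
```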

Oh, and if you are wondering about Job Monitoring, our environment has that too – it just so happens to be hooked to a different Top Level Dispatch Runbook: VMRole-Create-Dispatcher. And because the Deploy-VMRole workflow leverages the same exact method for VM Role creation as the Tenant Portal, the WAP/SPF event for MicrosoftCompute.VMRole works the same:

image

So, for fun…

Example PowerShell workflow script for VMRole-Create-Dispatcher

workflow VMRole-Create-Dispatcher
{
    param
    (
        [object]$resourceObject
    )
   
    $VMRoleOwner = $resourceObject.Owner
    $VMRoleName = $resourceObject.Name
    $SCJobID = $resourceObject.MostRecentTask.ID
    $PSEmailServer = Get-AutomationVariable -Name 'SMTP Server'
    $PSEmailFrom = Get-AutomationVariable -Name 'SMTP From Email'
    $PSEmailCC = Get-AutomationVariable -Name 'PSEmailCC'
   
    Send-SMTPNotification -SendNotificationType "Workloads" -PSEmailFrom $PSEmailFrom -PSEmailTo $VMRoleOwner -PSEmailServer $PSEmailServer -PSEmailCC $PSEmailCC -WorkloadName $VMRoleName 
    "Job ID ($SCJobID) is being monitored"
    $VMRoleDeploy = Monitor-VMMJobStatus -SleepDuration 300 -SCJobID $SCJobID -OutputStatus $false
    "Deploy of $VMRoleName : $VMRoleDeploy"
    Send-SMTPNotification -JobState $VMRoleDeploy -SendNotificationType "Workloads-Complete" -PSEmailFrom $PSEmailFrom -PSEmailTo $VMRoleOwner -PSEmailServer $PSEmailServer -PSEmailCC $PSEmailCC -WorkloadName $VMRoleName
     
}

The combination of Job Monitoring and Notifications allows for the creation of emails like this:

image  image


BONUS!

Oh, alright…Here is another example script.

What does it do?

It is the portion of Deploy-VMRole required if you want to leverage the VMM cmdlet Add-CloudResource instead of going through the WAP Tenant API only.

Note     If you want to go this route, this example script replaces the existing script portion within the Deploy-VMRole workflow, starting around line 45 and running through to the end (before the workflow’s closing bracket).

Sound interesting?

Alternate Example PowerShell workflow script for a portion of the Deploy-VMRole

$GIVMRole = InlineScript {
       
    #Convert ResDef JSON to Dictionary
    [System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | Out-Null
    $JSSerializer = New-Object System.Web.Script.Serialization.JavaScriptSerializer
    $ResDef = $JSSerializer.DeserializeObject($Using:ResDefJSON)

    #Add ResDefConfig JSON to Dictionary
    $ResDefConfig = New-Object 'System.Collections.Generic.Dictionary[String,Object]'
    $ResDefConfig.Add("Version",$Using:GIVersion)
    $ResDefConfig.Add("ParameterValues",$Using:ResDefConfigJSON)

    Get-SCVMMServer -ComputerName $Using:VMMServer -ForOnBehalfOf | Out-Null
    $OwnerUserRole = Get-SCUserRole | ? {$_.Name -match $Using:UserID}

    #Create a CloudService
    $Cloud = Get-SCCloud -Name $Using:CloudName -OnBehalfOfUser $Using:UserID -OnBehalfofUserRole $OwnerUserRole
    $CloudService = Get-CloudService -Name $Using:CloudServiceName -OnBehalfOfUser $Using:UserID -OnBehalfofUserRole $OwnerUserRole
    if (!$CloudService) {
        $CloudService = New-CloudService -Cloud $Cloud -Name $Using:CloudServiceName -OnBehalfOfUser $Using:UserID -OnBehalfofUserRole $OwnerUserRole
    }

    $CloudResource = Add-CloudResource -CloudService $CloudService -ResourceDefinition $ResDef -ResourceName $Using:VMRoleName `
        -OnBehalfOfUser $Using:UserID -OnBehalfofUserRole $OwnerUserRole -ResourceConfiguration $ResDefConfig -RunREST

    Return $CloudResource

} -PSComputerName $VMMServer -PSCredential $Creds

$GIVMRole

Note     Again, this is still example workflow script to be executed as the Service Admin, this time against VMM directly, instead of leveraging the WAP Tenant API. You will see that I leveraged a second InlineScript, this time to connect to the VMM Server (as the WAP Tenant API calls that come before it execute via an InlineScript connected to the WAP Server). You will also see that we are still leveraging “OnBehalfOf” functionality for the VMM commands; this is required if you want proper Tenant ownership of the deployed resources.

The biggest thing to know about the Add-CloudResource command is that the -ResourceDefinition and -ResourceConfiguration parameters require Dictionary objects. This is why I have both the ResDef and ResDefConfig conversion/deserialization steps in this section. In fact, this portion of the script includes steps 8-13 and all Cloud Service Creation sub-steps from The Process described above.

Finally, it is worth restating - JSON is the preferred method for data storage and transfer between sections of the script (from InlineScript to InlineScript and workflow to workflow).
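
To restate that point as a minimal sketch (the sample keys and values are placeholders): hashtable data is serialized to JSON before it crosses a workflow/InlineScript boundary, then deserialized back into a dictionary on the other side, just as the scripts above do:

```powershell
# Workflow side: serialize the parameter hashtable to JSON
$GIParamList = @{ VMRoleVMSize = "ExtraSmall"; VMRoleTimeZone = "Pacific Standard Time" }
$ResDefConfigJSON = ConvertTo-Json $GIParamList

# InlineScript side: deserialize back to a dictionary (PowerShell 2.0-compatible approach)
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | Out-Null
$JSSerializer = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$ResDefConfig = $JSSerializer.DeserializeObject($ResDefConfigJSON)
$ResDefConfig["VMRoleVMSize"]   # ExtraSmall
```

On PowerShell 3.0+ the ConvertFrom-Json cmdlet is an alternative for the deserialization step, though it returns a PSCustomObject rather than a dictionary.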


What’s Missing?

Well, from the Service Admin perspective, nothing really. Sure, I did not provide a non-workflow PS script example, but that is easy enough to generate based on the above examples.

From the Tenant Admin perspective, that is another story. In fact, I think it is time for another visual aid:

image

Why?

Well, it is not for a lack of trying. And actually, the scripts are 99% complete. So are my thoughts on how to deliver the information… It came down to a question of prioritization, for both this blog post series and the product demos/videos on my plate. In the end, the Tenant Provisioning Process of “Subscribe to a Plan, Get a Fully Deployed Set of Collaborative Workloads” took precedence.

Besides, isn’t this blog post long enough already?

When?

The very next blog post in this series! And thus, I have updated the Table of Contents below…


Blog Series Table of Contents

  1. Part 1: Intro & TOC
  2. Part 2: Automated Deployment of Tenant Network and Identity (Isolated SDN & Active Directory VM Role; from the Service Admin Persona)
  3. Part 3: Automated Deployment of Tenant Identity (Active Directory VM Role; from the Tenant Admin Persona)
  4. Part 4: Automated Deployment of Tenant Workloads (Lync, SharePoint, and Exchange VM Roles; from both Service Admin and Tenant Admin Personas)
  5. Parts 5 & 6: TBD (We hope to have something around: Value Added Services/Offerings and Ongoing Automated Maintenance/Operations)

Note     Automated Deployment of the VM Roles in these examples will include PowerShell scripts for both Service Admin and Tenant Admin personas.

Once Parts 1-4 are published, I will be creating a TechNet Gallery Contribution with a collection of all the scripts (SMA Runbooks / PowerShell Workflows). Look for a download link here in the coming weeks!


Thanks for checking out my latest blog series! For more information, tips/tricks, and example solutions for Automation within System Center, Windows Azure Pack, Windows Azure, etc., be sure to check out the other blog posts from Building Clouds in the Automation Track!

enJOY!


Italy's Lombardia region uses open data on Windows Azure to build for the future


FH_Lombardia

The Lombardia region in northern Italy has recently earned a reputation as a leader in Europe’s open data movement, launching dati.lombardia.it. Powered by the Socrata Open Data Portal running on Windows Azure, it hosts more than 430 datasets and visualizations on a range of topics, from museum locations for visitors to government spending information.

The team at Lombardia publishes datasets that are most useful to citizens, and recently hosted the OpenApp Lombardia competition, encouraging residents to create Web and mobile applications using open data. The top prize went to ReadIt, a search engine for book inventories within Lombardia libraries, giving users the ability to search by various fields and verify availability.

The region is using open data as an integral part of its strategy to modernize government and build for the future.

For more on how Lombardia is using open data solutions to improve transparency and public services, please see the Socrata site and the Openness@Microsoft blog.

Athima Chansanchai
Microsoft News Center Staff

TechNet Wiki’s Most Active Contributors of the Last Month / TOP 10


Hello, dear Wiki family and all IT enthusiasts;


We are together again for another Saturday statistics roundup. As in previous weeks, the rapid rise of the Turkish Wiki family continues this week. It is a very proud picture that 5 of the 10 most active members on the list come from the Turkish Wiki Ninjas, and I congratulate all my friends for it. I also congratulate the other members who made the list. I hope that, as the Turkish Wiki, we will produce even better work. In addition, I want to congratulate our friend Alican Dökmen; even though he has been with us for only a short time, it is a pleasure to see him working so actively, and I hope he will do even better. I wish everyone a nice, peaceful, and happy weekend.

Profiles

Ed Price

 http://social.technet.microsoft.com/profile/ed%20price%20-%20msft/

Alan Nascimento Carlos

http://social.technet.microsoft.com/profile/alan%20nascimento%20carlos/

Durval Ramos

http://social.technet.microsoft.com/profile/durval%20ramos/

Robin Gaal

http://social.technet.microsoft.com/profile/robin%20gaal/

Davut Eren

http://social.technet.microsoft.com/profile/davut%20eren%20-%20tat/

Alican Dökmen

http://social.technet.microsoft.com/profile/alican%20d%C3%B6kmen%20-%20tat/

Gökan Özcifci

http://social.technet.microsoft.com/profile/gokan%20ozcifci/

The Scripting Guys

http://social.technet.microsoft.com/profile/the%20scripting%20guys/

Elguc Yusifbeyli

http://social.technet.microsoft.com/profile/elguc%20yusifbeyli-tat/

Mehmet Parlakyiğit

http://social.technet.microsoft.com/profile/mehmet%20parlaky%C4%B0%C4%9F%C4%B0t-tat/

Prevalence of Windows XP? Third place in Germany.

Many articles have recently been devoted to the end of support for Windows XP, which, as everyone knows, is just one month away: Heise Online writes that the prevalence of XP is declining only slowly, Golem still finds Windows XP on 29 percent of all online PCs worldwide, Tom’s Hardware headlines “Windows 8 grows, XP still widespread”, and netzwoche reckons “Windows XP just won’t die”. All of these articles refer more or less to excerpts from a study conducted by the market researchers...(read more)

Updates: Process Explorer v16.02, Process Monitor v3.1, PSExec v2.1, Sigcheck v2.03


Process Explorer v16.02: This minor update adds a refresh button to the thread’s stack dialog and ensures that the Virus Total terms of agreement dialog box remains above the main Process Explorer window.

Process Monitor v3.1: This release adds registry create file disposition (create vs. open) and a new switch, /saveapplyfilter, which has Process Monitor apply the current filter to the output file as it saves it.

PSExec v2.1: This update to PsExec, a command-line utility that enables you to execute programs on remote systems without preinstalling an agent, encrypts all communication between local and remote systems, including the transmission of command information such as the user name and password under which the remote program executes.

Sigcheck v2.03: This version corrects a bug that caused the output of the –u switch to include signed files, and fixes several other minor bugs.
