Resources for IT Professionals


    Welcome to another Wiki Life.

    Today's goal is to present a summary of the new features included in Update 1 for Visual Studio 2015, released on 11/30/2015.

    The main highlights of this first Visual Studio update package are:

    • Tools for Apache Cordova (TACO), a set of tools designed to simplify the development of mobile solutions based on Apache Cordova;
    • Tools for Universal Windows Apps version 1.2, with features for building Windows Store applications using the Windows 10 SDK;
    • The .NET Framework 4.6.1, which includes fixes and improvements on top of version 4.6 of the .NET platform;
    • Features that allow C# to be used as a scripting language;
    • Improvements to the IDE's debugging and diagnostics mechanisms.

    For a complete description of what's new, click the link below:


    To learn more about Visual Studio 2015 and see examples of its use, also check out the links:

    Visual Studio 2015 Portal

    ALM - Visual Studio 2015 - Edições e Upgrades

    Utilizando o GitHub no Visual Studio 2015


    Until next time!

    Wiki Ninja Renato Groffe (Wiki, Facebook, LinkedIn, MSDN)


    Summary: Boe Prox shows how to use Windows PowerShell to display hidden files.

    Hey, Scripting Guy! Question: How can I use Windows PowerShell to display hidden files and folders?

    Hey, Scripting Guy! Answer: There are a couple of approaches that use Get-ChildItem:

        • Get-ChildItem -Hidden
        • Get-ChildItem -Attributes H
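    Both parameters were introduced with the PowerShell 3.0 file system provider; a quick sketch of how they might be used (the path is illustrative, and -Force is the classic fallback on older versions):

```powershell
# List only hidden items in the current directory (PowerShell 3.0+)
Get-ChildItem -Hidden

# Same result, using the -Attributes parameter
Get-ChildItem -Attributes Hidden

# Include hidden and system items alongside normal ones (also works on PowerShell 2.0)
Get-ChildItem -Force

# Recurse and show only hidden files under a folder
Get-ChildItem -Path C:\Temp -Recurse -Hidden -File
```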


    Traditional integration will be taken over by the cloud, and I will tell you why! I have worked in the Microsoft space for over a decade, predominantly with BizTalk and other integration technologies. I have noticed recently that the role of BizTalk is slowly moving towards the backend, enabling (legacy) systems to expose their data. However, countless systems these days have capabilities on board to expose and accept data. Besides that, some if not nearly all of them have the ability to transform incoming and outgoing data into another format. Hence, a message broker is not required, unless you need to send data to and accept data from various systems and services on a large scale.

    The cloud is slowly taking over the IT landscape. And who wants to run their IT on premises these days? The elasticity, the pay-as-you-go model, the vast amount of services (capabilities) and the time to market make on-premises IT practically obsolete. Even large LOB systems, which people thought would never migrate to the cloud, eventually will! The only justification for keeping data on premises, locked into systems, is its sensitivity.

    Looking at integration, the way systems, devices and applications are connected together is going to change. A bulky message broker is not required anymore; integration can be done using APIs over different protocols communicating JSON. Will XML be past tense? Not yet, as EDI and HL7 still aren't past tense and are still relevant. However, REST and JSON are the magic words today. And the microservices paradigm totally kills the idea of having a bus. Modern integration can be done leveraging Microsoft Azure services such as Service Bus, Logic Apps and API Apps to build lightweight integrations. These services offer messaging capabilities, hosting of pre-defined processes and access to resources. Several BizTalk Server capabilities can currently be found in Logic Apps, and more will move to Azure over time.

    Integration itself will not disappear and remains a necessity in IT. However, it will change, and this will be reflected in my job and that of anyone else working in the integration space with Microsoft technology. Therefore, we need to shift our focus from traditional to modern means of integration, which involves Microsoft Azure. We need to invest in Microsoft Azure Service Bus, Logic Apps, APIs and several other technologies. Take, for instance, the "Internet of Things", a trend in IT that will disrupt traditional ways of building solutions, if it hasn't already. IoT will create a totally different market for us, with a critical role for integration.

    Does that mean I should now stop developing integration solutions with BizTalk Server? No, since the cloud technology for integration is currently not mature enough. However, that will change in the short to mid term (a few months to a little over a year). Therefore, you need to think about a migration strategy for your on-premises integration assets to the cloud. Your current and future solutions depend on it and should be made cloud ready. This can be achieved by having a distinct separation of concerns, i.e. loose coupling between artefacts. The artefacts either remain on premises or move to the cloud as a Logic App or API App. The TechNet Wiki will provide articles around them, just as it does for BizTalk Server today!

    Note: Microsoft has released its road map for integration just recently and it aligns with my thoughts above. You can find the road map here.



    Hashing Algorithms

    Hashing algorithms take variable-length input and produce a fixed-length output. Hashing algorithms have a number of desired properties. One is that the hash should not be reversible: it should not be possible to determine the data that was input into the hash from the hash itself. Also, it should not be feasible to produce collisions. A collision occurs when two separate inputs into a hash algorithm produce the same hash.
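    These properties are easy to see in practice with, for example, the SHA256 class from the .NET Framework, which PowerShell can call directly. In this illustration, inputs of different lengths all yield 256-bit (64 hex character) digests, and a small change to the input produces a completely different hash:

```powershell
# Compute SHA-256 digests of three similar strings
$sha256 = [System.Security.Cryptography.SHA256]::Create()
foreach ($text in 'Hello', 'Hello World', 'Hello World!') {
    $bytes = [System.Text.Encoding]::UTF8.GetBytes($text)
    $hex   = ($sha256.ComputeHash($bytes) | ForEach-Object { $_.ToString('x2') }) -join ''
    '{0,-13} -> {1}' -f $text, $hex
}
$sha256.Dispose()
```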

    How a Digital Signature is created

    To produce a digital signature, the data that is going to be signed is passed through a hash algorithm to create a hash.

    The resulting hash is encrypted with a key. That key is the private key of a private/public key pair. The private key is held securely by the signer, and the public key is widely distributed in the form of an X.509 certificate.

    The encrypted hash is the digital signature.

    Verifying a digital signature

    To verify a digital signature, the verifier must first retrieve or possess a copy of the signer’s certificate. This certificate includes the public key of the signer.

    The verifier must also have a copy of the data or document that was signed, and must know which hash algorithm and which asymmetric cryptographic algorithm were used to create the signature. Information on which algorithms were used is usually contained in the metadata of the signature. And finally, the verifier must have a copy of the signature itself.

    The verifier runs the document or data through the same hashing algorithm as the signer. The verifier then decrypts the digital signature with the signer’s public key. The result of the decryption is the hash that was created by the signer. That hash is compared to the hash generated by the verifier. If the hashes match the verifier now knows two things. First, the verifier knows that the data has not been modified since signed by the signer. Second, since the public key decrypted a matching hash, the verifier knows that the document or data was signed with the corresponding private key and thus knows that the identity associated with the signer via the certificate is actually the signer of the data or document.
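    The sign-then-verify flow can be sketched with the .NET RSACryptoServiceProvider class. This is a conceptual illustration only, not the certificate-based workflow; in practice the keys would come from a key store and the signer's X.509 certificate:

```powershell
# Signer: hash the data with SHA256, then encrypt the hash with the private key
$signer = New-Object System.Security.Cryptography.RSACryptoServiceProvider 2048
$data   = [System.Text.Encoding]::UTF8.GetBytes('Document to be signed')
$signature = $signer.SignData($data, 'SHA256')

# Verifier: holds only the public half of the key pair
$verifier = New-Object System.Security.Cryptography.RSACryptoServiceProvider
$verifier.ImportParameters($signer.ExportParameters($false))   # $false = public key only

# Re-hash the data and compare it against the decrypted signature
$verifier.VerifyData($data, 'SHA256', $signature)              # True

# Any modification to the data invalidates the signature
$tampered = [System.Text.Encoding]::UTF8.GetBytes('Document to be signed!')
$verifier.VerifyData($tampered, 'SHA256', $signature)          # False
```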

    Impact of Collisions on Certificates

    Certificates are digitally signed data, signed by a Certification Authority. If the signature is determined to be valid using the Certification Authority's public key, then that certificate is understood to have been issued by that Certification Authority, assuming the information used for chain building also regards that CA as the issuer.

    A significant problem arises if collisions can be created for the hash algorithm that is used to sign a certificate, because it gives an adversary an opportunity to create fraudulent certificates. If an adversary can create a fraudulent certificate whose hash is the same as another certificate's, then the adversary can place the signature from the non-fraudulent certificate onto the fraudulent one. This allows the adversary to create certificates that have the same chain of trust as those signed by that CA. This is the reason that, once a hashing algorithm is considered broken in the sense that collisions can be created, it must stop being used in the certificate signing process.

    Attacks against SHA1

    Example of a current attack against SHA1:

    SHA1 Deprecation

    Microsoft SHA1 Deprecation Policy as of 11/26/2015

    Public Key Infrastructures whose Root CA certificates are installed in the Windows OS through the Microsoft Root Certificate Program will no longer be allowed to issue SSL and Code Signing certificates that are signed using the SHA1 hashing algorithm. This policy affects certificates issued as of January 1, 2016. The official policy can be viewed here:

    3rd Party SHA1 Policies

    Google: Gradually sunsetting SHA-1

    Mozilla: Phasing Out Certificates with SHA-1 based Signature Algorithms


    Beginning with Windows Server 2008, Windows began supporting Suite B cryptographic protocols. This support includes the SHA2 family of hashing algorithms; specifically, SHA256, SHA384, and SHA512 are supported by Windows. Typically, when choosing a SHA2 hashing algorithm, SHA256 is selected. I have not run into compatibility issues with SHA384 or SHA512; however, it is more likely that SHA256 is supported in 3rd party software than SHA384 or SHA512. Also, it is common practice for 3rd party Certification Authorities to issue certificates signed using the SHA256 hashing algorithm.
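    Before reconfiguring anything, you may want to confirm which hashing algorithm a CA is currently configured to sign with. One way, assuming the CA uses a CNG key storage provider, is to query the CA configuration with certutil; the certificate file name below is illustrative:

```powershell
# Query the configured signing hash algorithm (run on the CA itself)
certutil -getreg ca\csp\CNGHashAlgorithm

# Or inspect an issued certificate file to see which algorithm signed it
certutil -dump .\IssuedCert.cer | findstr /i "Algorithm"
```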


    Reconfiguring a PKI to use SHA2 instead of SHA1

    The steps below will illustrate how one can reconfigure their PKI so that the CA certificates are signed with the SHA2 hashing algorithm and certificates issued by the PKI are signed using the SHA2 hashing algorithm. This walkthrough is an example of how this can be accomplished on a Two Tier PKI. If you are going to perform these steps in a production environment you should first backup your existing PKI. It is also recommended that you engage Premier Field Engineering, Microsoft Support, or Microsoft Consulting Services if you are performing these steps in a Production Environment. One last note of caution, you should first run through these steps in a pre-production environment to ensure you get the proper results.

    Reconfiguring the Root CA

    In the following steps I will show the steps used to reconfigure the Root CA to issue certificates signed using the SHA256 hashing algorithm.

    The screenshot below shows the current Root CA certificate in my test lab and the fact that the hashing algorithm is SHA1.


    Step 1: Open the registry editor

    Step 2: Navigate to HKLM\System\CurrentControlSet\Services\CertSvc\Configuration\<CA Name>\CSP

    Step 3: Open the registry setting CNGHashAlgorithm


    Step 4: Replace Value data with SHA256

    Step 5: Click OK



    Step 6: Restart the Active Directory Certificate Services service. One way to do this is to open an elevated command prompt and run the following command to stop the service: net stop certsvc, and then run the following command to start the service: net start certsvc.
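    Steps 1 through 6 can also be scripted. A sketch in PowerShell, run from an elevated prompt on the CA (the Active value under the Configuration key names the active CA; as noted above, back up the CA before changing a production environment):

```powershell
# Locate the active CA's CSP registry key and switch the signing hash to SHA256
$base   = 'HKLM:\SYSTEM\CurrentControlSet\Services\CertSvc\Configuration'
$caName = (Get-ItemProperty $base).Active
Set-ItemProperty -Path "$base\$caName\CSP" -Name CNGHashAlgorithm -Value 'SHA256'

# Restart Active Directory Certificate Services so the change takes effect
Restart-Service certsvc
```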


    Renewing the Root CA Certificate

    In the following steps I will show how the Root CA certificate can be renewed. Since I reconfigured my Root CA to issue certificates signed using the SHA256 hashing algorithm, the resulting certificate will be signed with this hashing algorithm.

    Step 1: Open the Certification Authority MMC (certsrv.msc)

    Step 2: Right-click on the CA Name

    Step 3: Select All Tasks and then Renew CA Certificate… from the context menu


    Step 4: When prompted to stop the Active Directory Certificate Services service, click Yes


    Step 5: When prompted to Renew CA Certificate, click OK. Note: If you do not want to renew the CA certificate with a new key pair, select No and then click OK.


    Verifying the certificate

    To verify that the certificate has been renewed and signed using the SHA256 hashing algorithm, perform the following steps:

    Step 1: Open the Certification Authority MMC (certsrv.msc)

    Step 2: Right-click on the CA Name and select Properties from the context menu.


    Step 3: Select the newest certificate in the list, which will be the last one in the list.

    Step 4: Click View Certificates


    Step 5: When the Certificate opens, click on the Details tab

    Verify Signature hash algorithm is set to SHA256.


    Publishing the Root CA Certificate and CRL

    Next you will need to publish the new CA Certificate to any AIA repositories. You will also need to publish the new CRL to any CDP locations. <Add section on finding these sections>

    Step 1: On the Root CA navigate to C:\Windows\System32\CertSrv\CertEnroll

    Step 2: Copy the newest CA Certificate and CRL


    Step 3: Make the CA Certificate and CRL available to a machine where you are logged in as Enterprise Admin.

    Step 4: Run the following command in an elevated command prompt, from the same path where there is a copy of the CA Certificate:

    certutil -f -dspublish <CA Certificate File Name> RootCA


    Step 5: Run the following command in an elevated command prompt, from the same path where there is a copy of the CRL:

    certutil -f -dspublish <CRL File Name>


    Step 6: Copy the Root CA Certificate to the HTTP AIA Repository

    Step 7: Copy the Root CRL to the HTTP CDP Repository


    Renewing the Issuing CA Certificate

    Step 1: Open the Certification Authority MMC (certsrv.msc) on the Issuing CA

    Step 2: Right-click on the CA Name

    Step 3: Select All Tasks and then Renew CA Certificate… from the context menu


    Step 4: When prompted to stop the Active Directory Certificate Services service, click Yes


    Step 5: When prompted to Renew CA Certificate, click OK. Note: If you do not want to renew the CA certificate with a new key pair, select No and then click OK.


    Step 6: When prompted to submit the CA Certificate Request click Cancel


    Step 7: Locate the newest request file in the C:\ directory

    Step 8: Copy that request file to the Root CA


    Step 9: On the Root CA open up the Certification Authority MMC (certsrv.msc)

    Step 10: Right-click on the CA Name and from the context menu, select All Tasks and then Submit new request…


    Step 11: Navigate to the request file and click Open.


    Step 12: Return to the Certification Authority MMC and click on Pending Requests

    Step 13: Right-click on the appropriate request and from the context menu click All Tasks and then Issue


    Step 14: Navigate to Issued Certificates

    Step 15: Right-click on the certificate you just generated and click Open


    Step 16: Click on the Details Tab (Note that the Signature hash algorithm is set to SHA256)

    Step 17: Click the Copy to File… button


    Step 18: When the Certificate Export Wizard opens click Next


    Step 19: On the Export File Format page of the wizard, select Cryptographic Message Syntax Standard – PKCS #7 Certificates (.P7B)

    Step 20: Click Next


    Step 21: On the File to Export page, click Browse…


    Step 22: Navigate to the location where you want to save the resulting CA Certificate File, and then click Save


    Step 23: Click Next


    Step 24: Click Finish


    Step 25: When prompted that The export was successful., click OK


    Step 26: Copy the certificate from the Root CA to the Issuing CA

    Step 27: From the context menu right-click on the CA Name and select All Tasks and then Install CA Certificate…


    Step 28: When prompted to stop the Active Directory Certificate Services service, click Yes


    Step 29: Browse to the CA Certificate Envelope file and click Open


    Verify the Issuing CA Certificate

    To verify that the Issuing CA certificate was renewed with SHA256 used to sign the certificate, perform the following steps.

    Step 1: Open the Certification Authority MMC

    Step 2: Right-click on the CA Name and from the context menu click Properties


    Step 3: Select the last certificate in the list

    Step 4: Click View Certificate


    Step 5: Verify that the Signature hash algorithm is set to SHA256


    Publish CA Certificate

    An Enterprise Issuing CA is normally configured to automatically publish its CRL to both the HTTP and LDAP CDP repositories. Enterprise Issuing CAs are also typically configured to publish their CA certificate to Active Directory. But in my experience the Enterprise Issuing CA will not publish its CA certificate to an HTTP AIA repository. The following steps illustrate how to copy the CA Certificate to the HTTP AIA repository.

    Step 1: Navigate to C:\Windows\System32\CertSrv\CertEnroll on the Issuing CA.

    Step 2: Copy the newest CRT file.


    Step 3: Locate the server(s) that host the HTTP AIA repository, paste the CRT file to the appropriate directory.


    Validating PKI Status

    The following steps will allow you to validate that all of the AIA and CDP repositories are now accessible and up to date with the appropriate files.

    Step 1: Open pkiview.msc

    Step 2: In the left hand pane click on the Root CA

    Step 3: Verify that there are no errors in the right-hand pane


    Step 4: Click on the Issuing CA

    Step 5: Verify that there are no errors in the right-hand pane


    Configure the Issuing CA

    Next you will need to configure the Issuing CA to issue end-entity certificates that are signed using the SHA256 hash algorithm.

    Step 1: Open the registry editor

    Step 2: Navigate to HKLM\System\CurrentControlSet\Services\CertSvc\Configuration\<CA Name>\CSP

    Step 3: Open the registry setting CNGHashAlgorithm


    Step 4: Replace Value data with SHA256

    Step 5: Click OK



    Step 6: Restart the Active Directory Certificate Services service. One way to do this is to open an elevated command prompt and run the following command to stop the service: net stop certsvc, and then run the following command to start the service: net start certsvc.



    Administrator Workstations (published 12/27/15)

    I had previously published this information to my blog and accidentally removed it from here, so I am re-adding the posting. I hope to find time to update this for Windows 10 in the future; Windows 10 has a feature named Credential Guard which greatly increases the security of credentials and helps limit their exposure. This blog posting covers one possible way that Administrator Workstations could be configured to reduce the attack surface for Administrator Accounts. If you would like assistance with this, Microsoft Consulting Services has a service named Privileged Administrator Workstation, where they can assist you with implementing Administrator Workstations. Their service is much more detailed and comprehensive than what I have provided here. The instructions here simply take Microsoft's PtH recommendations and show how they could be implemented.


    Productivity Workstation: Workstation used by a user to check email, access the internet, view and edit documents. This workstation is less trusted since it has a larger attack surface and is extremely susceptible to compromise.

    Administrator Workstation: A workstation used solely for systems administration. This workstation is very trusted since it is locked down, reducing the overall attack surface and making it less susceptible to compromise.


    To help secure environments against Credential Theft attacks and to prevent escalation the following conceptual model should be used to help understand how the environment should be secured.


    In order to protect credentials, especially those of privileged accounts the following rules should be followed:

    · A user should never log on to a system that is in a lower tier. (For example, a Domain Administrator should not log in to a server or a non-administrator workstation with their credentials.)

    · A tier 0 asset should not be managed by an account or service that is in a lower tier. (For example, the WSUS server that patches all infrastructure servers should not be the same one that patches Domain Controllers.)

    Administrator Workstations

    Users who will be logging in with accounts that will be members of privileged groups should have a dedicated workstation for administration.

    When these types of users perform both administration and productivity work from the same workstation, the likelihood of their credentials being compromised increases. Opening documents, reading email, and browsing the web are all pathways for malware or an adversary to gain a foothold on a workstation.

    Exposure of credentials

    Even if an administrator is logged on to a workstation with a non-admin account, their admin credentials can be exposed if they use the "Run As" functionality with their admin credentials.

    If an administrator uses RDP to manage servers or domain controllers with a privileged account, their password can be exposed if there is malware on the system running a keystroke logger.

    Cost of exposure

    If a highly privileged account such as a domain administrator or enterprise administrator account is compromised, the compromise is extremely costly. In some cases it may result in the organization having to build an entirely new forest, since the existing forest can no longer be trusted. In the best case, the organization will have to spend several months having all systems reviewed by a forensic team to try to rid them of malware or of an adversary that has established a foothold. Although a second laptop for an administrator adds cost, that cost pales in comparison to rebuilding an entire forest. Having a dedicated workstation for administrators is a fundamental mitigation; if it is not in place, it reduces the effectiveness of the other mitigations put in place to protect administrator credentials.

    Options for Administrator Workstation

    Option 1: Dedicated Hardware


    Advantages: A dedicated laptop used exclusively for performing administration is the most secure option, because the administrator's credentials are completely isolated. Since the user also has a dedicated productivity machine, they are allowed greater control over it and will be able to use USB storage, add drivers, and generally have more control over that environment.

    Disadvantages: Requires the user to have two machines.

    Option 2: Physical Machine for Administrator Workstation with VM for Productivity Workstation.

    Advantages: The advantage of this configuration is it isolates the productivity workstation from the administrator workstation using a hypervisor. This allows a single physical machine to be used for both an administrator workstation and productivity workstation.

    Additional Information: One question that may be posed is "Can the physical machine be the Productivity Workstation and the VM be the Administrator Workstation?". The answer is: No. The Productivity Workstation is a Tier 2 workstation and the Administrator Workstation is a Tier 0 workstation. If the physical machine is a Productivity Workstation and the VM is an Administrator Workstation, you then have a Tier 2 workstation managing a Tier 0 workstation, and from the previous rules we know that a lower tier should not be able to manage a higher tier. If the physical machine were compromised, it would have control over the VM: it could overlay fake images over the VM, modify the VM configuration, add files to the VM image, and capture keystrokes through the physical machine.


    Disadvantages: Since the laptop is the Administrator Workstation, it is very locked down. That means the user has little control over the machine, including using USB storage and adding device drivers. The user must compromise on the usefulness of the laptop in exchange for carrying one machine.

    Hardening Requirements for Administrator Workstation

    To ensure the integrity of the Administrator Workstation, certain configuration needs to be enforced. The following is a list of those configuration items.

    · The administrator using the Administrator Workstation cannot have administrator level access to the OS on the Administrator Workstation. This must be enforced to prevent the administrator from bypassing security configuration on the workstation.

    · Latest Operating System - Each iteration of the Windows Operating System adds features to mitigate known attacks; additionally, each OS adds security features that make it possible to further secure the OS through proper configuration. To ensure the security of the Administrator Workstation, the latest Operating System should be installed. At the time of writing, the latest Operating System is Windows 8.1.

    · Secure Boot- Protects the Operating System by preventing the loading of unsigned code during the boot up process

    · AppLocker- Application Whitelisting should be implemented so that only approved applications can be run on the Administrator Workstation.

    · BitLocker- Whole volume encryption should be implemented to secure the device in case of loss or theft.

    · USB Restrictions- The Administrator Workstation should be prevented from using USB storage to prevent infection or compromise via malware on a USB device.

    · Internet Restrictions - The Administrator Workstation should be made incapable of accessing the internet. This restriction can be implemented by configuring the browser to point to a "black hole" proxy. Alternatively, the organization's proxy server can be configured to block outbound connections initiated by an administrator account.

    · Network Isolation- Windows Firewall should be enabled and configured to block incoming connections.

    · Antimalware - Antimalware, also known as Antivirus, should be installed on the Administrator Workstation and kept current with signatures.

    · Exploitation Mitigation- EMET should be installed to mitigate common methods that malware and adversaries use to run code on the system.

    · Attack Surface Analysis - During the build of the Administrator Workstation, a tool such as Attack Surface Analyzer should be run to determine the attack surface area of the deployment.

    · Imaging- A Golden Image should be produced of the Administrator Workstation and should be kept up to date. In the event of a potential exposure the machine can quickly be wiped and rebuilt.

    Restricting Administrator Access

    Below are instructions to demonstrate how restricted groups can be used to control members of the Local Administrators Group.

    Step 1: Open GPMC.msc

    Step 2: Right click on the OU that contains Administrator Workstations and select Create a GPO in this domain and Link it here…


    Step 3: Give the GPO a name and click OK


    Step 4: Right click on the new GPO and select Edit… from the context menu


    Step 5: Navigate to \Computer Configuration\Policies\Windows Settings\Restricted Groups

    Step 6: Right click on Restricted Groups and select Add Group… from the context menu


    Step 7: In the Add Group dialog box enter .\Administrators and click OK


    Step 8: A new window will open. Under the Members of this group section, add the users or groups that should be Local Administrators. Be sure to exclude the users of the Administrator Workstation. Click OK when finished.


    The resulting GPO should look something like the example below.


    Current Operating System

    Windows 8.1 should be installed as the operating system on the Administrator Workstation.

    Windows 8.1 includes all of the security features of Windows 7 along with the additional security functionality available in Windows 8.1. Below are some of the key security features that help add additional security to privileged users.

    Secure Boot

    Secure Boot prevents unsigned code from being run during the boot process. This prevents attackers from loading attack tools or malware during boot.

    Trusted Boot

    Early-Launch Anti-Malware (ELAM) is run early in the boot process and is used to prevent malware from launching prior to the Operating System.

    Virtual Smart Cards

    The Trusted Platform Module is used to securely store a certificate that is used for Smart Card Logon. The user must enter a PIN to use the Virtual Smart Card. So a user must have possession of the device as well as knowledge of the PIN to perform a Smart Card Logon with a Virtual Smart Card.

    Protected Groups

    · Protected Groups add the following protections: members of the Protected Users group cannot authenticate by using NTLM, Digest Authentication, or CredSSP. On a device running Windows 8.1, passwords are not cached, so a device that uses any one of these Security Support Providers (SSPs) will fail to authenticate to the domain when the account is a member of the Protected Users group.

    · The Kerberos protocol will not use the weaker DES or RC4 encryption types in the pre-authentication process. This means that the domain must be configured to support at least the AES cipher suite.

    · The user’s account cannot be delegated with Kerberos constrained or unconstrained delegation. This means that former connections to other systems may fail if the user is a member of the Protected Users group.

    · The default Kerberos Ticket Granting Tickets (TGTs) lifetime setting of four hours is configurable by using Authentication Policies and Silos, which can be accessed through the Active Directory Administrative Center (ADAC). This means that when four hours has passed, the user must authenticate again.
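    Assuming the domain meets the Windows Server 2012 R2 requirements for the group, an administrator account can be placed into Protected Users with the ActiveDirectory module; the account name below is illustrative:

```powershell
# Requires the ActiveDirectory RSAT module
Import-Module ActiveDirectory
Add-ADGroupMember -Identity 'Protected Users' -Members 'adm.jdoe'

# Confirm membership
Get-ADGroupMember -Identity 'Protected Users' | Select-Object SamAccountName
```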

    Restricted Admin Mode for RDP

    This allows an administrator to RDP to another system without exposing their credentials to that system.
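    From the command line, a Restricted Admin connection might be started like this (the server name is illustrative, and both the client and the target must support the feature):

```powershell
# Start a Remote Desktop session in Restricted Admin mode
mstsc.exe /restrictedAdmin /v:server01.contoso.com
```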

    Secure Boot

    Secure Boot requires firmware at UEFI version 2.3.1 or greater, and it must be enabled in the firmware (BIOS) setup. Contact your PC vendor for instructions on how to enable Secure Boot on your particular model. Below is an example of a BIOS setup screen where Secure Boot is enabled.




    AppLocker

    AppLocker is an Application Control feature in Windows 7 and Windows 8. AppLocker is used to control which applications a user can run. This control reduces the chance of a user accidentally running malware or other software that can leave a system exposed. In terms of the Administrator Workstation, it is used for "Application Whitelisting". The first step in Application Whitelisting is to determine which applications the user should be allowed to run, so first you will want to inventory which applications an administrator uses. Productivity software and alternate web browsers should not be allowed to run on the Administrator Workstation.

    Once you have the applications that should be whitelisted you can use the following steps to create a GPO to enforce the AppLocker settings.

    First, you will need a reference machine on which to create your settings. The reference machine should have Windows 8.1 installed, along with any applications that you have chosen to whitelist. You will also need the RSAT tools installed, unless you plan on importing the policy into Group Policy on another machine. The RSAT tools can be downloaded from here:

    Step 1: Launch the Local Group Policy Editor (gpedit.msc) on the reference machine.

    Step 2: Navigate to \Computer Configuration\Windows Settings\Security Settings\Application Control Policies\AppLocker

    Step 3: Click on Configure rule enforcement


    Step 4: Under Executable rules, select Configured and choose Enforce rules, then click OK. This step enables rule enforcement for executable files such as .exe files.

    Step 5: Right click on Executable Rules and select Create Default Rules from the context menu.


    Step 6: The Create Default Rules option creates three rules:

    · Allow Everyone to run All files located in the Program Files folder

    · Allow Everyone to run All files located in the Windows folder

    · Allow Builtin Administrator to run All files

    The rule that allows Everyone to run all files located in the Program Files folder should be deleted, so that you have greater control over which programs the user can run. Do not delete the other two rules; deleting them can leave the user unable to log on.


    Step 7: When prompted about deleting the rule, click Yes


    Step 8: Right click on Executable Rules, and select Create New Rule… from the context menu


    Step 9: Check Skip this page by default, and click Next


    Step 10: Select Allow. Select a group or user that will be using the Administrator Workstation and click Next.


    Step 11: Select Publisher and click Next.


    Step 12: Check, Use Custom Values and next to the File Version select Exactly from the Drop Down menu.


    Step 13: Click Next


    Step 14: Click Create

    Follow Steps 10-14 for each application that needs to be whitelisted.
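    As an alternative to clicking through the wizard for every application, the AppLocker PowerShell cmdlets can generate publisher rules in bulk for later review and import. The folder path and group name below are placeholders; a sketch:

    # Build publisher-based allow rules for every signed .exe under a folder
    # and save the result as an AppLocker policy XML file
    Get-AppLockerFileInformation -Directory 'C:\Program Files\AdminTools' -Recurse -FileType Exe |
        New-AppLockerPolicy -RuleType Publisher -User 'CONTOSO\PAW-Admins' -Optimize -Xml |
        Out-File '.\AdminWorkstationRules.xml'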


    Step 15: In the Group Policy Editor, right click on AppLocker, and select Export Policy… from the context menu.


    Step 16: Give the Policy File a name and click Save


    Step 17: When prompted that the rules were exported, click OK.


    Step 18: Open Group Policy Management (GPMC.msc). On the OU that contains the Administrator Workstations, right click and select Create a GPO in this domain, and Link it here…


    Step 19: When prompted, give a name for the GPO and click OK


    Step 20: Right click on the newly create GPO and select Edit


    The Application Identity Service must be running for Applocker to function so we will enable it through Group Policy.

    Step 21: Navigate to \Computer Configuration\Policies\Windows Settings\Security Settings\System Services.

    Step 22: Double click on Application Identity


    Step 23: Select Define this policy setting and then select Automatic , and then click OK


    The service should now show up as Automatic
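    On a single reference machine you can achieve the same result locally with two commands (Group Policy remains the right tool for enforcing it across the OU, and note that later Windows builds protect this service's start type):

    # Set the Application Identity service to start automatically and start it now
    Set-Service -Name AppIDSvc -StartupType Automatic
    Start-Service -Name AppIDSvc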


    Now we are going to import the previously created policy.

    Step 24: Navigate to \Computer Configuration\Windows Settings\Security Settings\Application Control Policies\AppLocker

    Step 25: Right click on AppLocker

    Step 26: Select Import Policy… from the context menu


    Step 27: Navigate to the Policy File you created in step sixteen, and click Open


    Step 28: When prompted if you want to import the policy, click Yes


    Step 29: When prompted that the rules were imported, click OK


    Internet Restrictions

    Internet access on the Administrator Workstation should be blocked. The steps below demonstrate how internet access can be blocked with a GPO. These instructions assume that no web browsers other than Internet Explorer are installed on the system.

    Step 1: Right click on the OU that contains Administrator Workstations, and select Create a GPO in this domain, and Link it here…


    Step 2: Type a name for the new GPO and click OK


    Step 3: Right click on the newly created GPO and select Edit from the context menu


    Step 4: Navigate to \Computer Configuration\Administrative Templates\Windows Components\Internet Explorer\


    Step 5: Open the setting Prevent changing proxy settings, select Enabled, and click OK


    Step 6: Open the setting Disable changing Automatic Configuration settings, select Enabled, and then click OK


    Step 7: Open the setting Make proxy settings per-machine (rather than per-user) select Enabled and click OK.


    Step 8: Open Disable changing connection settings, select Enabled, and click OK


    Step 9: Navigate to \Computer Configuration\Administrative Templates\Windows Components\Internet Explorer\Internet Control Panel


    Step 10: Open the setting Disable the Connections page, select Enabled, and click OK


    Step 11: Create a new Group Policy that is linked to the OU that contains Administrator accounts.

    Step 12: Navigate to \User Configuration\Preferences\Internet Settings

    Step 13: Right click on Internet Settings, select New and then Internet Explorer 10 from the context menu


    Step 14: Navigate to the Connections Tab, and click LAN Settings


    Step 15: Select Use a proxy server for your LAN (These settings will not apply to dial-up or VPN connections), under Address enter, for Port enter 80, and ensure that Bypass proxy server for local addresses is selected. Make sure everything has a green underline, if not toggle the F5 and F8 keys until it turns green. Click OK.


    Step 16: Close the Group Policy Editor.
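    The net effect of steps 12 through 15 is a per-user proxy entry that points the browser at a dead-end proxy. The same values can be sketched directly in the registry; the loopback address here is an example placeholder, since the proxy only needs to be unreachable:

    # Per-user Internet Explorer proxy settings (example values)
    $key = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings'
    New-ItemProperty -Path $key -Name ProxyEnable -Value 1 -PropertyType DWord -Force | Out-Null
    New-ItemProperty -Path $key -Name ProxyServer -Value '127.0.0.1:80' -PropertyType String -Force | Out-Null
    New-ItemProperty -Path $key -Name ProxyOverride -Value '<local>' -PropertyType String -Force | Out-Null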


    USB Restrictions

    Malware is often spread via Removable Storage. Therefore it is important that Removable Storage cannot be used with the Administrator Workstation. Below are steps to lock down Removable Storage on the Administrator Workstation.

    Step 1: Open the Group Policy Management Console.

    Step 2: Right click on the OU that contains the Administrator Workstation, and click on Create a GPO in this domain, and Link it here…


    Step 3: Enter a name for the newly created GPO, and click OK


    Step 4: Right click on the newly created GPO, and select Edit


    Step 5: Navigate \Computer Configuration\Administrative Templates\System\Removable Storage Access and open the setting All Removable Storage classes: Deny all access


    Step 6: Select Enabled, and click OK



    BitLocker Drive Encryption should be enabled on the Administrator Workstation to secure the machine in case it is lost or stolen. The following steps illustrate how to configure BitLocker securely. Note that these instructions do not enable the ability to recover BitLocker with a recovery key: if the administrator is unable to boot, the machine should be considered compromised and rebuilt from known good media.

    Step 1: Right click on the OU that contains the Administrator Workstation, and click on Create a GPO in this domain, and Link it here…


    Step 2: Enter a name for the newly created GPO, and click OK


    Step 3: Right click on the newly created GPO, and select Edit


    Step 4: Navigate to \Computer Configuration\Administrative Templates\Windows Components\BitLocker Drive Encryption\Operating System Drives


    Step 5: Open the setting Require additional authentication at startup, select Enabled, and click OK


    Step 6: Open the setting Allow enhanced PINs for startup, select Enabled, and click OK


    Step 7: Open Configure minimum PIN length for startup, select Enabled, and click OK


    Step 8: Open the setting Enforce drive encryption type on operating system drives, select Enabled, and click OK


    Step 9: Select the setting Choose how BitLocker-protected operating system drives can be recovered, select Disabled, and click OK


    Step 10: Navigate to \Computer Configuration\Administrative Templates\Windows Components\BitLocker Drive Encryption


    Step 11: Open the setting Choose drive encryption method and cipher strength, select Enabled, and click OK
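    Once the policy is in place, encryption still has to be started on each workstation. A sketch using the BitLocker cmdlets (available from Windows 8/Server 2012 onward); the drive letter and cipher choice are examples:

    # Prompt for an enhanced PIN and enable BitLocker with TPM + PIN on the OS drive
    $pin = Read-Host -AsSecureString -Prompt 'Enter BitLocker startup PIN'
    Enable-BitLocker -MountPoint 'C:' -EncryptionMethod Aes256 -TpmAndPinProtector -Pin $pin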


    Network Isolation

    The Windows Firewall should be enabled to protect the Administrator Workstation from Network Attacks. The following instructions cover how to create a Group Policy to configure the Windows Firewall for Administrator Workstations.

    Step 1: Right click on the OU that contains the Administrator Workstation, select Create a GPO in this domain, and Link it here…


    Step 2: Enter a name for the New Group Policy, and click OK


    Step 3: Right click on the newly created GPO and select Edit


    Step 4: Navigate to \Computer Configuration\Policies\Windows Settings\Security Settings\Windows Firewall with Advanced Security\

    Step 5: Right click on Windows Firewall with Advanced Security, and select Properties


    Step 6: Configure the settings as defined below:

    Firewall state: On (recommended)

    Inbound connections: Block (default)

    Outbound connections: Allow (default)

    Step 7: Then under Settings, click Customize…


    Step 8: Change Apply local firewall rules to No

    Step 9: Change Apply local connection security rules to No

    Step 10: Click OK


    Step 11: Repeat Steps 6 – 10 for each Profile. When finished, click OK.

    The resulting setting should look similar to those in the screenshot below:
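    The same profile settings can be expressed with the NetSecurity cmdlets, which is handy for verifying on a workstation that the GPO landed as intended:

    # Enable all profiles, block inbound, allow outbound, and ignore local rules
    Set-NetFirewallProfile -Profile Domain,Private,Public -Enabled True `
        -DefaultInboundAction Block -DefaultOutboundAction Allow `
        -AllowLocalFirewallRules False -AllowLocalIPsecRules False

    # Confirm the resulting configuration
    Get-NetFirewallProfile | Format-Table Name, Enabled, DefaultInboundAction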



    (This article is a translation of Build your cloud business: financial models & scenarios to help get you there, posted on the Microsoft Partner Network Blog on November 17, 2015. Please refer to the linked original page for the latest information.)


    How do you build a profitable cloud business?
    I have put this question to partners around the world. Their answers revealed best practices for improving profitability, as well as remedies for cloud businesses that are not yet profitable.
    The request I hear most often from partners is for financial models that show how to make a cloud business profitable. In response, a few months ago we published several profitability financial models (in English), along with tips to help you put them to use.
    You can use these financial models at many points on the path to profitability: when deciding which cloud services to offer, when planning future resourcing, and when estimating how long it will take for your cloud business to reach positive cash flow.

    We provide standalone models for Office 365, Azure, and CRM Online, plus a combined model covering Office 365, Azure, and CRM Online together with Windows and devices. Start by downloading the financial models (in English). Here is a brief overview of how the models are structured.

    ・Services Overview sheet: Shows, by scenario, the project services, managed services, and intellectual property that other leading partners offer in the market. The data comes from a survey of 1,260 partner companies conducted last spring. This list is useful when deciding which services to include in your own cloud business and what to charge for them. See here for details on each of the 20 scenarios (in English) shown in the models.
    ・Definitions sheet: Explains the key terms and assumptions used in the models.
    ・Key Variables sheet: Enter every variable specific to your business on this sheet. Fill in the cells shown in red. Hover over a cell to see a tip describing what to enter in each field; see the Definitions sheet for additional information. The revenue mix and estimated deal values are recalculated automatically as you enter your data.
    ・P&L Detail sheet: After entering all your data on the Key Variables sheet, click P&L Detail to see your business's profit and loss over four years. You can use this data to build a business case when seeking support or investment from your board or from investors.
    ・Users sheet: Shows how the number of customer companies you support grows over four years across all services.
    ・Cash flow sheet: Predicts when your cash flow will turn positive, based on the figures entered on the Key Variables sheet. Determine how long you can sustain the business on cash flow and adjust the figures on the Key Variables sheet accordingly. On average, other partners report that a cloud business gets on track and cash flow turns positive within 18 to 24 months.
    ・Resourcing sheet: The profitability of project services and managed services depends heavily on securing the right number of resources to support them. This sheet uses your inputs to estimate how much staff you will need to keep up with the growth of your business.


    I hope you find this article useful. We welcome your feedback on the financial models (in English) and the scenario overviews, and I am happy to answer questions. Feel free to reach out by email, Twitter, or LinkedIn.






    The introductory Active Directory video sessions I posted in October 2010 are actually among the most-viewed posts on my blog. People have repeatedly asked for introductory AD sessions at various seminars and IT camps, and when I think about it, AD is without question the most fundamental of the technologies a Microsoft engineer needs. Since those videos covered Windows Server 2008 R2, I wanted to create something deeper that people could keep coming back to for a long time, so to close out this year I have put together roughly six hours of video under the title "Active Directory Essentials". Part 1: Background you must know before exploring Active Directory - definitions of the basic terms you need to understand before looking at Microsoft Active Directory. Part 2: Active Directory's...(read more)


    At Microsoft, we continually strive to deliver on our commitment to the security of our customers and their ecosystem. Informing Windows users about the safety of the websites, apps, and software they access online is part of our strategy, and a core piece of it is built into the Microsoft Trusted Root Certificate Program. This program accepts root certificates from authorized certificate authorities (CAs) around the world and distributes them to your devices, telling them which programs, apps, and websites Microsoft trusts. The work we do to give users a seamless, secure experience usually happens behind the scenes, but today we want to tell you about some changes we have made to this program. These important changes will help us better defend against evolving threats affecting the website and application ecosystem, but they may affect a small number of customers who use certificates from the affected partners. This spring, we began engaging with certificate authorities (CAs) to gather feedback and discuss upcoming changes to our trusted certificate program. Among other things, the changes include stricter technical and audit requirements. The final program changes were published in June 2015. Since then, we have been helping our partners understand and meet the new program requirements, both through direct engagement and through community forums. Through this work, we identified some partners who will no longer participate in the program, either because they chose to withdraw voluntarily or because they cannot meet the new requirements...(read more)



    Script Download:  
    The script is available for download from  You can also use  Microsoft Script Browser for Windows PowerShell ISE to download the sample with one button click from within your scripting environment. 

    This script shows how to change the Internet Time Synchronization Update Interval in Windows using PowerShell DSC.

    You can find more All-In-One Script Framework script samples at


    The IKEv2 implementation differences between Windows RRAS Gateway and Cisco ASA result in non-interoperability between the two VPN devices (documented in this VPN Interoperability guide). This affects a number of customers, because Cisco ASA has a large user base. However, with Cisco ASA being an end-of-life product, an OS update for these devices seems unlikely. So, to help customers use Cisco ASA devices with Windows Server 2012 R2 RRAS Gateways, Microsoft has released a hotfix...(read more)


    Welcome to the Training Schedule, a comprehensive schedule of trainings and webcasts. We update this post regularly. If you want to attend these trainings, please Contact us.

    Date (GMT+08:00) Workshop Registration
    1/06/2016 10:00-12:00 Skype for Business - What's new Register
    1/11/2016 10:00-12:00 Win10 Introduction and value proposition Register
    1/12/2016 10:00-12:00 Win10 deployment Register
    1/12/2016 10:00-13:00 OMS Red Carpet Register
    1/13/2016 10:00-12:00 Win10 management Register
    1/13/2016 10:00-12:00 Skype for Business-Reference Architectures and Design Register
    1/14/2016 10:00-12:00 Windows As- A-Service Register
    1/14/2016 10:00-12:00 Power BI Red Carpet Register
    1/15/2016 10:00-12:00 What's new in Win10 Threshold 2 (Nov update) Register
    1/26/2016 10:00-12:00 EMS Red Carpet Register
    1/27/2016 10:00-12:00 Upgrade to Skype for Business - What should I know Register
    1/28/2016 10:00-12:00 Power BI Red Carpet Register


    Welcome to the Training Schedule, a comprehensive schedule of trainings and webcasts. We update this post regularly. If you want to attend these trainings, please Contact us.

    Date (GMT+08:00) Workshop Registration
    2/17/2016 10:00-12:00 Skype for Business Meeting and video Register
    2/18/2016 10:00-12:00 Exchange Online – Troubleshooting Hybrid Mail Flow Register
    2/19/2016 10:00-12:00 Lync Online - Overview and Troubleshooting Register
    2/24/2016 10:00-12:00 Skype for Business Voice Register
    2/25/2016 10:00-12:00 Office 365 – Troubleshooting Identity, DirSync, Single Sign-On and ADFS Register


    Welcome to the Training Schedule, a comprehensive schedule of trainings and webcasts. We update this post regularly. If you want to attend these trainings, please Contact us.

    Date (GMT+08:00) Workshop Registration
    3/02/2016 10:00-12:00 Scalability and Performance of Cloud-hosted Web Apps Register
    3/03/2016 10:00-12:00 Approaches to Hybrid Cloud Register
    3/03/2016 10:00-12:00 Exchange Online – Troubleshooting Hybrid Migration Register
    3/04/2016 10:00-12:00 Introduction to DevOps and ALM for Cloud Apps Register
    3/04/2016 10:00-12:00 Skype for Business Hybrid Deep Dive Register
    3/07/2016 10:00-12:00 Microsoft Azure Storage and StorSimple Register
    3/08/2016 10:00-12:00 Microsoft Azure Networking Register
    3/10/2016 10:00-12:00 Exchange Online – Troubleshooting Public Folder Migration Register
    3/11/2016 10:00-12:00 Solution Accelerator- Business Continuity with Azure Site Recovery Register
    3/17/2016 10:00-12:00 Exchange Online – Troubleshooting Outlook Connectivity Register
    3/18/2016 10:00-12:00 Office 365 - Troubleshooting SharePoint Online Register
    3/22/2016 10:00-12:00 Microsoft Azure Security Solutions Register
    3/23/2016 10:00-12:00 Edge architectures and configuration Register
    3/24/2016 10:00-12:00 Microsoft Azure Identity and Access Register
    3/24/2016 10:00-12:00 Enterprise Mobility Suite PPE Register
    3/24/2016 10:00-12:00 Office 365 – Troubleshooting OneDrive for Business Register
    3/29/2016 10:00-12:00 Azure lab for Developers Register
    3/30/2016 10:00-12:00 What’s New with Enterprise Mobility Register
    3/31/2016 10:00-12:00 EMS Red Carpet Register




    Date (GMT+08:00) Workshop
    2/01/2016 14:00-16:00 What's New in Skype for Business
    2/02/2016 14:00-16:00 Skype for Business Reference Architectures and Design
    2/18/2016 14:00-16:00 Upgrading to Skype for Business - What Should I Know
    2/19/2016 14:00-16:00 CRM Online Marketing Management Overview
    2/19/2016 14:00-16:00 Skype for Business: Meetings and Video
    2/23/2016 14:00-16:00 OneDrive for Business Product and Technology Overview
    2/24/2016 14:00-16:00 Exchange Online – Troubleshooting Hybrid Migration
    2/25/2016 10:00-12:00 Dynamics CRM Online Overview, Administration, and Licensing
    2/25/2016 14:00-16:00 Power BI Red Carpet
    2/25/2016 14:00-16:00 Dynamics CRM Online Integration with Exchange Online and SharePoint Online
    2/26/2016 14:00-16:00 Dynamics CRM Online Integration with Skype for Business, Yammer, and Power BI
    2/26/2016 14:00-16:00 Exchange Online – Troubleshooting Hybrid Mail Flow

    0 0



    Date (GMT+08:00) Workshop
    3/03/2016 14:00-16:00 Enterprise Mobility Suite PPE
    3/03/2016 14:00-16:00 Skype for Business Core, Voice Improvements, and Architecture Services
    3/07/2016 14:00-16:00 Microsoft Azure Storage and StorSimple
    3/08/2016 14:00-16:00 Key Features and User Experience in Windows 10
    3/10/2016 14:00-16:00 Microsoft Azure Networking
    3/10/2016 14:00-16:00 Introduction to SharePoint Online Workflows
    3/11/2016 14:00-16:00 CRM Online Customer Service Management Overview
    3/11/2016 14:00-16:00 Skype for Business Edge Architectures and Configuration
    3/14/2016 14:00-16:00 Windows 10 Deployment
    3/15/2016 14:00-17:00 OMS Red Carpet
    3/15/2016 14:00-16:00 Microsoft Azure Security Solutions
    3/15/2016 14:00-16:00 Scalability and Performance of Cloud-hosted Web Apps
    3/16/2016 14:00-16:00 Building Hybrid Integration
    3/16/2016 14:00-16:00 Office 365 – Troubleshooting Identity, DirSync, Single Sign-On, and ADFS
    3/17/2016 10:00-12:00 Dynamics CRM Online Overview, Administration, and Licensing
    3/17/2016 14:00-16:00 DevOps and ALM for Cloud Apps
    3/17/2016 14:00-16:00 Dynamics CRM Online Integration with Exchange Online and SharePoint Online
    3/18/2016 14:00-16:00 Dynamics CRM Online Integration with Skype for Business, Yammer, and Power BI
    3/18/2016 14:00-16:00 Microsoft Azure Identity and Access
    3/22/2016 14:00-16:00 Windows 10 Management
    3/24/2016 14:00-16:00 Introduction to SharePoint Online Hybrid Scenarios
    3/25/2016 14:00-16:00 Exchange Online – Troubleshooting Public Folder Migration
    3/29/2016 14:00-16:00 Microsoft Edge Browser in Windows 10, and Windows 10 Identity and Security
    3/30/2016 14:00-16:00 Windows as a Service


    Summary: Matthew Hitchcock, Microsoft MVP, takes us through a deeper look at Azure DSC.

    Today we have the first part of a two-part series by Matthew Hitchcock. Here’s a bit about him:

    Matthew has deep experience in identity and directory services, including hybrid identity for Office 365, and Azure Active Directory federation and identity life cycle management. He also has extensive Active Directory migration and consolidation experience, primarily from 10 years in the banking industry. He now specializes in Microsoft Azure, helping customers automate their operations, start dealing with infrastructure-as-code, and adopting cloud services. Matthew is working as a contract staff consultant at Microsoft, where he works with customers in Singapore in telecommunications, logistics, banking, education, and government.

    Take it away, Matthew…

    It's an honor for me to be writing my first Hey, Scripting Guys! Blog post! I'd like to share something I have been doing a lot of recently: using advanced Desired State Configuration (DSC) files with Azure virtual machines, and perhaps more importantly, learning how to troubleshoot them! Let's face it, this is IT, and we wouldn't have jobs if everything was easy and worked the first time.

    Keen readers of the Hey, Scripting Guys! Blog will have seen a series by Honorary Scripting Guy, Sean Kearney: Use PowerShell to Create Virtual Machine in Azure. He discusses how to create virtual machines and assign DSC files through the DSC VM Extension. In Part 4, Sean created a basic configuration to configure a web server. It was a simple example with static data to teach a concept, but when we get into doing this for real we often need to do something a bit more advanced. 

    What is an advanced DSC file?

    By advanced DSC, I mean something that is parameterized and reusable. Take, for example, a baseline server configuration. You may want to have a DSC file that joins the server to a domain. The domain that you join may change from VM to VM, and the credential used may change also. You may want to pass a file share path for source files, which may change depending on the environment. In short, anything that you parameterize so you can reuse it is considered advanced.

     A note about prerequisites

    If you're following along, you'll need to know that I am using two downloadable DSC resources: xComputerManagement and xSmbShare. I am also using the Azure PowerShell module. I have an Azure subscription, and I have already opened a PowerShell session, connected to my Azure subscription, and specified a default storage account. I will not cover these steps in this post, but there are plenty of articles online that discuss setting up your Azure connection. You can download the DSC resources and the Azure module from the PowerShell Gallery

    Creating an advanced configuration file

    With the PowerShell ISE open and my subscription established, I’ll create the DSC file that I want to use and save it as Fileserver.ps1:

    Configuration FileServer
    {
     param (
      [string]$domainName,
      [PSCredential]$domainCred,
      [string]$FolderPath
     )

     Import-DSCResource -Module xComputerManagement,xSmbShare

     Node localhost
     {
      LocalConfigurationManager
      {
       ActionAfterReboot = 'ContinueConfiguration'
       ConfigurationMode = 'ApplyOnly'
       RebootNodeIfNeeded = $true
      }

      xComputer JoinDomain
      {
       DomainName = $domainName
       Credential = $domainCred
       Name = $env:computername
      }

      File MyPackageDirectory
      {
       DestinationPath = $FolderPath
       Type = "Directory"
       Ensure = "Present"
       DependsOn = "[xComputer]JoinDomain"
      }

      xSmbShare MyFileShare
      {
       Ensure = "present"
       Name = "SourceFiles"
       Path = $FolderPath
       FullAccess = "Domain Admins"
       Description = "Source for all Packages in the environment"
       DependsOn = "[File]MyPackageDirectory"
      }
     }
    }
    As you can see, there are a few parameters at the top of the file. They take the domain name, the domain credential used to join the domain, and a folder path for the file share I want to create.

    "Credentials?!" I hear you cry. "This will mean storing plain text passwords or much jiggery-pokery with certificates to encrypt them!"

    Fret not, my security conscious admin, I will put your mind at ease shortly. After I have my DSC script with parameters, it is time to upload it to my Azure storage account. I can do this by using the following command:

    Publish-AzureVMDscConfiguration -ConfigurationPath ".\FileServer.ps1"

    This uploads a file into the storage account associated with my Azure subscription. If you don't have one defaulted, you will be prompted to set it. If this is your first time uploading a DSC file, a folder called windows-powershell-dsc will be created. The file is uploaded as a ZIP file, and it includes the modules that are needed to apply this configuration to the target server. There is a lot more that you can configure with this cmdlet, and I encourage you to explore the additional parameters.

    Note   If you run into issues with the Help topics for Azure PowerShell cmdlets, please let the team know on GitHub at Azure PowerShell issues.

    Assign the configuration to a virtual machine

    After it is uploaded, I can assign it to my virtual machine. I have an existing virtual machine that I will use:

    get-azurevm -ServiceName "hitchfscloud" -Name "HitchFS01" | Set-AzureVMDscExtension -Version "2.9" -WmfVersion latest -Verbose -ConfigurationArchive "" -ConfigurationName "FileServer" -ContainerName "windows-powershell-dsc" -ConfigurationArgument @{domainName = ""; domainCred = Get-Credential; FolderPath = "C:\SourceFiles"} | Update-AzureVM

    Image of command output 

    With this command I have assigned the DSC Extension to the virtual machine in Azure and assigned my DSC file from Azure Storage to the extension. There are a few interesting parameters to pay attention to:

    • -WmfVersion refers to the version of the Windows Management Framework that will be used by the extension. This may matter to you. At the time of writing, the default if not specified is "latest", which is the WMF 5 Preview.
    • -ConfigurationArgument is the most interesting parameter here. See how each argument I specify matches up to a parameter in my script? Here is what happens next:
      A file is created on the virtual machine with the values I have given it. This file is used with the DSC file to set up the virtual machine. Azure recognizes that I have provided a credential and uses the certificate for the virtual machine to encrypt that credential before writing it to the Azure VM disk. Clever, huh? We will take a deeper look at that shortly.

    What's going on?

    So now if I execute this command, I can wait a few minutes and use the Get-AzureVMDSCExtensionStatus cmdlet to see that it's actually running:

    Image of command output

    If I run that command a few more times, I can see the state changing, so I know everything is running perfectly:

    Image of command output 

    Image of command output

    The Azure DSC Extension is doing the following:

    • Creating a folder called C:\Packages\Plugins\Microsoft.Powershell.DSC for the extension
    • Creating a folder for the version of the extension, in my case: C:\Packages\Plugins\Microsoft.Powershell.DSC\
    • Creating the relevant files and folder structure in this folder
    • Downloading the WMF files and DSC file into the following folder: C:\Packages\Plugins\Microsoft.Powershell.DSC\\DSCWork
    • Extracting the required modules from the DSC ZIP folder that accompanied my configuration and placing them in C:\Program Files\WindowsPowerShell\Modules
    • Installing the WMF package and restarting to apply the DSC Configuration

    The parameters that we specified when we assigned the configuration to the VM and the password were written to:


    Here is what it looks like on disk:

    Image of command output 

    Everything is in there, nice and secure. Perfect.

    Now that you have an understanding of how to write and apply these configurations, what about troubleshooting? I have a feeling that something is going to go wrong with this configuration, and I am going to have to fix it. Join me in Part 2 where I’ll take a deeper look at what's going on inside of the Azure VM and how we can troubleshoot the errors.


    I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

    Ed Wilson, Microsoft Scripting Guy


    Explanation of Terms
    Microsoft Business Intelligence
    - Extracting relevant information from data that may not be apparent at first glance
    Microsoft offers two approaches:
    1. Corporate (enterprise) BI
     1.1. Data warehouses
     1.2. OLAP cubes
     1.3. ETL - data pumps
     1.4. SQL Reporting Services

    2. Self-Service BI – tools for the end user
     2.1. BI built on Excel add-ins (from Excel 2010 onward)
       2.1.1. Power Query (data transformation, ETL)
       2.1.2. Power Pivot (analytical model; building the model, logic, and outputs)
       2.1.3. Power View (presentation layer, visualization)
       2.1.4. Power Map (3D visualization based on geolocation)

    2.2. The new cloud service Power BI
       2.2.1. Power BI Desktop
       2.2.2. Mobile apps
       2.2.3. Portal

    Power Query
    Power Query is a data analysis feature available for Excel that lets you discover, combine, and refine data.
    Microsoft Power Query for Excel enhances self-service business intelligence (BI) in Excel with intuitive, consistent capabilities for discovering, combining, and refining data from a wide variety of sources - relational, structured and semi-structured, data feeds, the web, Azure Marketplace, and more. Power Query also lets you search for public data from sources such as Wikipedia.
    With Power Query you can share and manage queries as well as search for data within your organization. Users in the enterprise can find and use these shared queries (if shared with them) and use the underlying data in the queries for their own data analysis and reporting.
    What you can do with Power Query
    • Find and connect to data from a wide variety of sources.
    • Merge and shape data sources to match your data analysis requirements, or prepare them for further analysis and modeling in tools such as Power Pivot and Power View.
    • Create custom views of data.
    • Create visualizations over big data and Azure HDInsight using the JSON parser.
    • Perform data cleansing operations.
    • Import data from multiple log files.
    • Search online for data from a large collection of public data sources, including Wikipedia tables, a subset of Microsoft Azure Marketplace, and a subset of
    • Create queries based on Facebook "Likes" rendered in an Excel chart.
    • Add data to Power Pivot from new data sources (such as XML, Facebook, or file folders) as refreshable connections.
    • With Power Query version 2.10 and later, you can share and manage queries and search for data within your organization.

    Power Query data sources
    • Web page
    • Excel or CSV file
    • XML file
    • Text file
    • Folder
    • SQL Server database
    • Microsoft Azure SQL Database
    • Access database
    • Oracle database
    • IBM DB2 database
    • MySQL database
    • PostgreSQL database
    • Sybase database
    • Teradata database
    • SharePoint list
    • OData feed
    • Microsoft Azure Marketplace
    • Hadoop file system (HDFS)
    • Microsoft Azure HDInsight
    • Microsoft Azure Table Storage
    • Active Directory
    • Microsoft Exchange
    • Facebook

    Power Pivot is an Excel add-in (from Excel 2010 onward) that opens the door to the world of business intelligence. The self-service BI approach lets users adopt BI tools quickly and easily, without deep expertise.
    Key benefits:
    - Users build their own BI applications with a tool they already know well: Excel.
    - Individual applications can be built quickly and easily, without detailed knowledge of BI or application development.
    - Everything can be shared, for example via Office 365 and SharePoint.
    - Analyses can refresh automatically to reflect changes in the source data.
    Power Pivot applications themselves look like ordinary Excel workbooks, but they additionally contain Power Pivot metadata and data. All the data is stored with high compression as part of the Excel workbook. Compared with Excel alone, Power Pivot gives the user access to multiple data sources, more advanced filtering with DAX, offline access to data, and, not least, the ability to work with far more data than Excel itself can handle.

    Power View is a data visualization technology that lets you create interactive charts, reports, maps, and other visuals that bring your data to life. Power View is available as an add-in in Excel, in SharePoint, in SQL Server, and in Power BI.
    Power View visualization types

    The following pages provide detailed information about the various visualizations available in Power View:
    • Charts and other visualizations in Power View (a useful overview of the different types)
    • Line charts in Power View
    • Pie charts in Power View
    • Maps in Power View
    • Tiles in Power View
    • Cards in Power View
    • Images in Power View
    • Tables in Power View
    • Multiples visualizations in Power View (a visualization type)
    • Bubble and scatter charts in Power View
    • Key performance indicators (KPIs) in Power View

    Power Map
    This tool lets you create charts laid over maps. Power Map can be downloaded and installed, though it is not available in every edition of MS Office.

    What is open data

    Open data is information and figures available on the internet free of charge, in a structured, machine-readable form, published in a way that places no unnecessary technical or other obstacles in the way of its use.

    The format and structure of open data therefore allow bulk computer processing, for which the publisher has given legal permission. Thanks to this, the data can be freely processed further, including inside software applications.

    Examples include timetables, state revenues, lists of social service providers, a minister's calendar or air quality measurements. The data comes from universities, non-governmental organizations, private companies or public administration.

    In the Czech Republic, thanks to the Freedom of Information Act, we can so far obtain only part of the data that public authorities collect, and only if someone requests it and the authority complies. Open data, by contrast, should be published on the internet by the publisher itself, so that everyone can easily find and download it. Open data can also be understood in a broader sense: besides public-sector information, data from commercial entities can be used as well.
    Benefits of open data:
    - Greater efficiency: releasing data makes it possible to share and analyse it
    - Support for the economy: data is a source of innovation, business opportunities and jobs, and can be used in transport, logistics, healthcare or banking, for example. Companies treat data as a raw material and build applications on top of it that generate added value and profit
    - Transparency, efficiency and oversight of public administration: published data makes it possible to check how tax money is managed or what the costs of the organizations we support are
    - Involving citizens in decision-making: thanks to data and analyses, citizens can take a better-informed part in how the state works
    - Data journalism: open data is an irreplaceable source of information for journalists
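
    The "machine-readable" part is what makes bulk processing possible: once a dataset is published as, say, CSV, a few lines of code suffice to analyse it. Below is a minimal Python sketch; the sample data is invented for illustration (real open-data portals serve similar files for download):

    ```python
    import csv
    import io

    # Tiny invented sample in the shape of a typical open-data CSV
    # (e.g. air quality measurements per station).
    raw = """station,pollutant,value
    Praha-Libus,PM10,21.5
    Brno-Turany,PM10,18.0
    Praha-Libus,PM10,24.5
    """

    # Bulk processing: average the measured value per station.
    totals = {}
    counts = {}
    for row in csv.DictReader(io.StringIO(raw)):
        station = row["station"].strip()
        totals[station] = totals.get(station, 0.0) + float(row["value"])
        counts[station] = counts.get(station, 0) + 1

    averages = {s: totals[s] / counts[s] for s in totals}
    print(averages)  # {'Praha-Libus': 23.0, 'Brno-Turany': 18.0}
    ```

    The same loop would work unchanged on a file downloaded from an open-data portal.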


    Power BI
    Power BI Desktop
    You can download Power BI Desktop from the web; after registering you get the Power BI cloud portal solution with datasets, reports and dashboards. Among the downloads you will find the Power BI Gateway for connecting an on-premises environment to the portal, as well as Power BI Desktop and mobile apps for Windows, Android and iOS. Power BI Desktop is also available in Czech and Slovak.

    Power BI Desktop is an application that makes working with reports easier. It lets you connect to data from many different sources and combine it, using Power Query in Power BI Desktop. From the datasets assembled in the Query Editor you can build rich reports and visualizations in Power BI Desktop, and once they are built you can easily publish them to the Power BI cloud service.

    The next article will be a practical one, on the topic of visualizing data from Facebook.

    Author: Karel Rejthar, MIE Educator


    Azure DevTest Labs is a service (currently in Preview) that lets developers and testers quickly create environments in Azure while minimising waste and cost. You can get environments for trying out the latest version of an application or configuration by rapidly provisioning Windows and Linux through reusable templates and artifacts. DevTest Labs also brings the following advantages: it always lets you keep under...(read more)



    2015/12/29 9:30 (JST): The issue has been resolved. We apologise for any inconvenience caused.


    2015/12/28 20:00 (JST): We are currently investigating an issue in which connections to the Microsoft Intune management console and the Intune portal site are failing.




  12/28/15--04:00: Predictions for data in 2016

    As the expert in our team on Machine Learning, which is all about making predictions from data, Harry has asked me to make some predictions for 2016. For this article, however, I'll be relying on my experience and not on an API built from a neural network!

    Prediction 1: Machine learning is a good place to start

    Arthur C. Clarke once remarked that any sufficiently advanced technology is indistinguishable from magic, but I would suggest that in 2016 some of this technology will disappear by simply becoming ordinary and commonplace.

    Let me give you an example: in 2001: A Space Odyssey, the on-board computer HAL recognises Dr Bowman and says hello. It has taken a little longer, but Windows 10 has Windows Hello, which will log you in by recognising your face. This took a little longer not because recognising a face from pictures is hard, but because it works from a 3D camera, so Windows knows your face by the depth of its features and won't accept a photograph of you.

    So what if you want to embed this sort of intelligence in your own applications? 

    Prediction 2: 2016 will be the year of R

    R is the universal language for machine learning across all platforms, and it is coming to you in SQL Server 2016, in Power BI and, at some point, in Visual Studio. R is not only set based like SQL, but also has very rich and extensible functions for statistics, and it can plot results:

    Some data about Bill Gates…


    A tiny bit of R…

    # Read the image data from the Azure ML Studio input port (columns R, G, B)
    img = maml.mapInputPort(1)
    # Encode each row's RGB components as a colour string
    img_rgb = rgb(img$R, img$G, img$B, maxColorValue = 255)
    # Reshape the flat colour vector into a 160 x 160 matrix ready for plotting
    dim(img_rgb) = c(160,160)

    And here’s Bill as a 160x 160 plot of that data:


    Prediction 3: The importance of APIs

    API’s will become ever more important as a way of stitching services together without exposing the code and data behind an application which is a good thing because it limits what data we choose to share (subject to the APIs being secure of course)

    Prediction 4: Less of a prediction, more of a hope

    I hope U-SQL will take off. In a world of many computer languages and technologies, particularly around big data, you might wonder how another one can make a difference, but actually that diversity indicates that there is no killer language for big data. U-SQL is Microsoft's attempt at getting past the problems of architecting and executing processes that run against big data by doing nothing more than combining C# and SQL: C# is there to describe the data, which is then analysed using familiar SQL.

    Prediction 5: More integration and better tooling in the Cortana Analytics Suite

    Following on from U-SQL, I think we'll also see more integration and better tooling in the Cortana Analytics Suite. If you haven't read up on this, it's a collection of data-related online services in Azure and Office 365 (Power BI). Many of these, however, were developed independently, so what we will see in 2016 is the integration and extension of these services. This is actually part of a continuing programme: for example, Stream Analytics can now call Machine Learning in line, and Azure Data Factory (ADF) is already aware of Azure Data Lake (ADL). Next year, however, we'll see even more improvements here and in the tooling, to make it easier and more agile. For example, ADF is still very code heavy, so everything needs to be hand crafted in JSON. While we now have nice ADF templates and solutions in Visual Studio, it hasn't got the simple-to-use drag and drop UI that we have in SQL Server Integration Services or Informatica. I understand that this will change, but I don't know when or to what. We'll also see the launch of SQL Server 2016, which is itself an example of integration, both with R and with Hadoop via the PolyBase technology.

    Oh, and another example just landed in my inbox: a dedicated VM for Data Science in the Azure gallery.

    Prediction 6: Human learning

    Cortana is maturing to the point where there will be Microsoft certifications in this space, but this brings challenges, as learning materials and exams need to keep pace with new developments. What I can say for sure is that I am trying out the beta exams in January, and I will keep you posted.

    Prediction 7: SQL Bits

    If you are serious about your data-driven career, I'll be seeing you at SQLBits on 4-7 May in Liverpool. And if not, at a SQL Relay, a SQLSaturday, or some other event in 2016.
