Channel Description:

Resources for IT Professionals



    Mar. 2
    Script Download:  
    The script is available for download from https://gallery.technet.microsoft.com/How-to-check-forchange-352e20da. You can also use Microsoft Script Browser for Windows PowerShell ISE to download the sample with one click from within your scripting environment.

    The VBScript helps to change the user theme from Aero to Classic.

    You can find more All-In-One Script Framework script samples at http://aka.ms/onescriptingallery



    Performing well at school can sometimes be a bit of a challenge, but what about for those with learning disabilities like dyslexia?

    To assist those with learning disabilities, Microsoft is introducing Learning Tools for OneNote – a toolbar designed specifically to improve reading and writing experiences for all students, including those who experience learning challenges.

    In addition to this nifty add-on (click here for a preview), which will include special text formatting, immersive reading and other features that make classroom activities easier, OneNote is one of the most productive education tools around.

    Here are some of our favourite features:

    Record a class
    Students or educators who want to record a lesson for an absent student, or students who want to record a class to listen to again can do so easily with OneNote. And, if you don’t want to go back and listen to the whole lesson, the notes typed in OneNote while recording will be synced to the exact point in the audio recording. Teachers can also record verbal instructions or feedback for their students so that they no longer need to repeat themselves.

    Convert images to text
    It’s become the norm to see students pulling out their phones to take a photo of an important slide or notes their teacher has written on the board. With OneNote’s Optical Character Recognition (OCR), photos you take on a Windows Phone are scanned, with any text being captured and converted to editable text in OneNote.

    Learn with doodles and sketches
    For visual learners, drawings and pictures can be useful learning tools – and OneNote makes it easy to doodle and sketch to your heart’s content. The Draw tab gives you all the drawing tools you need whether you’re creating a random doodle, a graph or an ecological cycle. You can choose to draw directly on your existing notes or pin a Quick Note to them.

    Capture your web research easily
    When you’re researching online, you can send a whole page or parts of a page directly to OneNote for easy reference. It’s as simple as pressing Ctrl+P and selecting Send to OneNote. You’ll find all your research saved in Quick Notes, ready for you to put together your lesson or project.

    Solve equations easily
    Take the stress out of Maths with OneNote. Get help performing tricky calculations by typing in your mathematical expression anywhere on a OneNote page and pressing Enter to see the result. There is also an equation editor to help you construct and enter your Maths equations, or you can even write the equation by hand and see it transformed into text.

    Create your own digital textbook
    Audio and video create a more engaging learning experience, and OneNote makes it possible for educators to create custom digital textbooks incorporating these elements along with Office documents, pictures and web clips. You can even make it a collaborative project by asking students for their input.

    Collaborate and interact virtually
    Collaboration is a buzzword of 21st century learning, and the OneNote Class Notebook Creator helps make this a reality. From offering a collaboration space for classroom projects, to creating a Classroom Library for distributing handouts and assignments and a Student Space for each student that allows for teacher-student interaction, the virtual environment is changing the nature of the classroom. OneNote Class Notebook Creator is an online SharePoint app that is available to educators for free with Office 365.

    Never lose work again
    You’ll never need to use “The dog ate my homework” as an excuse again. OneNote saves everything for 60 days after you edit or delete it. This means that you can recover anything from the trash and, with version management, you can even go back to an earlier version of a note. This is also a great tool to encourage accountability, as educators can use the Find by Author option to see who did what in a collaborative class assignment.

    Making use of technology in the classroom can help to empower teachers and students. If you’re just starting out with OneNote as a teacher or student, browse through the tips on OneNote for Teachers and stay updated with the Microsoft OneNote education blog for more tips.



    Mar. 2

    Script Download:  
    The script is available for download from https://gallery.technet.microsoft.com/Check-out-how-many-c252278d. You can also use Microsoft Script Browser for Windows PowerShell ISE to download the sample with one click from within your scripting environment.

    The script scans the logon events in the security event viewer log and counts the number of users and computers authenticated in a particular domain controller.

    You can find more All-In-One Script Framework script samples at http://aka.ms/onescriptingallery



    Written by Joakim Knutsen, Product Manager, Office 365

    One of the most common reasons Norwegian businesses choose Office 365 is that they get a solution that is always up to date and modern. At Microsoft we have a strong focus on innovation and user experience, and we constantly release updates to Office 365 with new or improved functionality. Examples from the past year include the upgrade of Lync to Skype for Business, Office and Skype for Business adapted for all phones and tablets, the introduction of Office 365 Groups, a new interface for OneDrive for Business and updates to the Office suite.

    Whether you work with selling Office 365 or are a customer who has purchased it, keeping up with all the announced updates can be time-consuming. Here I will share my three best tips for staying on top of what's coming in Office 365 at any time.

    News portal for Office 365

    A page I check a couple of times a week is our global Office blog. News and updates to Office 365 are announced there, along with useful information about when they will be made available to customers and what the updates involve.

    Roadmap for Office 365

    I also visit our Roadmap portal a couple of times a month. There I get information about what's coming to Office 365 going forward. A good tip is to use the search box if there are specific updates you are looking for. For example, you can type «Yammer» to see upcoming Yammer updates.

    Fully updated in 10 minutes

    Once a month we release an «Office 365 Updates» video on YouTube. It covers the most important Office 365 news in 10 minutes. We make a new video every month, so if you only have 10 minutes to spare during the month, this is where to spend them. You can find the video channel here.

    Feel free to join our global Yammer network, where you will meet 80,000 other people interested in Office 365. There you can take part in discussions and ask questions about topics related to technical rollout, adoption and use of Office 365.



    While helping Windows Enterprise customers deploy and realize the benefits of Windows 10, I've observed there's still a lot of confusion regarding the security features of the operating system. This is a shame since some of the key benefits of Windows 10 involve these deep security features. This post serves to detail the Device Guard and Credential Guard feature sets, and their relationship to each other.

    First, let's set the foundation by thinking about the purpose of each feature:

    Device Guard is a group of key features, designed to harden a computer system against malware. Its focus is preventing malicious code from running by ensuring only known good code can run.

    Credential Guard is a specific feature, not part of Device Guard, that aims to isolate and harden key system and user secrets against compromise, helping to minimize the impact and breadth of a Pass-the-Hash-style attack in the event that malicious code is already running via a local or network-based vector.

    The two are different, but complementary, as they offer different protections against different types of threats. Let's dive in and take a logical approach to understanding each.

    It’s worth noting here that these are enterprise features, and as such are included only in the Windows Enterprise client.

     

    Virtual Secure Mode

    The first technology you'll need to understand before we can really dig into either Device Guard or Credential Guard, is Virtual Secure Mode (VSM). VSM is a feature that leverages the virtualization extensions of the CPU to provide added security of data in memory. We call this class of technology Virtualization Based Security (VBS), and you may have heard that term used elsewhere. Anytime we’re using virtualization extensions to provide security, we're essentially talking about a VBS feature.

    VSM leverages the on chip virtualization extensions of the CPU to sequester critical processes and their memory against tampering from malicious entities.

    The way this works is that the Hyper-V hypervisor is installed – the same way it gets added when you install the Hyper-V role. Only the hypervisor itself is required; the Hyper-V services (which handle shared networking and the management of VMs) and management tools are optional, needed only if you’re using the machine for ‘real’ Hyper-V duties. As part of boot, the hypervisor loads and later calls the real 'guest' OS loaders.

    The diagram below illustrates the relationship of the hypervisor with the installed operating system (usually referred to as the host operating system):

    image

    The difference between this and a traditional architecture is that the hypervisor sits directly on top of the hardware, rather than the host OS (Windows) directly interacting at that layer. The hypervisor serves to abstract the host OS (and any guest OS or processes) from the underlying hardware itself, providing control and scheduling functions that allow the hardware to be shared.

    In VSM, we’re able to extend this by tagging specific processes and their associated memory as actually belonging to a separate operating system, creating a ‘bubble’ sitting on top of the hypervisor where security sensitive operations can occur, independent of the host OS:

    image

    In this way, the VSM instance is segregated from normal operating system functions and is protected from attempts to read information in that mode. The protections are hardware assisted, since the hypervisor requests that the hardware treat those memory pages differently. This is the same way two virtual machines on the same host cannot interact with each other; their memory is independent and hardware regulated to ensure each VM can only access its own data.

    From here, we now have a protected mode where we can run security sensitive operations. At the time of writing, we support three capabilities that can reside here: the Local Security Authority (LSA), and Code Integrity control functions in the form of Kernel Mode Code Integrity (KMCI) and the hypervisor code integrity control itself, which is called Hypervisor Code Integrity (HVCI).

    Each of these capabilities (called Trustlets) is illustrated below:

    image

    When these capabilities are handled by Trustlets in VSM, the Host OS simply communicates with them through standard channels and capabilities inside of the OS. While this Trustlet-specific communication is allowed, having malicious code or users in the Host OS attempt to read or manipulate the data in VSM will be significantly harder than on a system without this configured, providing the security benefit.

    Running LSA in VSM causes the LSA process itself (LSASS) to remain in the Host OS, while a special, additional instance of LSA (called LSAIso, which stands for LSA Isolated) is created. This allows all of the standard calls to LSA to still succeed, offering excellent legacy and backwards compatibility, even for services or capabilities that require direct communication with LSA. In this respect, you can think of the remaining LSA instance in the Host OS as a ‘proxy’ or ‘stub’ instance that simply communicates with the isolated version in prescribed ways.

    Deploying VSM is fairly straightforward. You simply need to verify you have the appropriate hardware configuration, install certain Windows features, and configure VSM via Group Policy.

    Step One: Configure Hardware

    In order to use VSM, you’ll need a number of hardware features to be present and enabled in the firmware of the machine:

    1. UEFI running in Native Mode (not Compatibility/CSM/Legacy mode)
    2. 64-bit Windows and its associated requirements
    3. Second Level Address Translation (SLAT) and Virtualization Extensions (e.g., Intel VT-x or AMD-V)
    4. A Trusted Platform Module (TPM) is recommended.

    Step Two: Enable Windows Features

    The Windows features you’ll need to make VSM work are called Hyper-V Hypervisor (you don’t need the other Hyper-V components) and Isolated User Mode:

    image

    If these options are greyed out or unavailable for install, it will typically indicate that the hardware requirements in step one haven’t been met.

    You’ll notice the feature is named Isolated User Mode here. It actually is the Virtual Secure Mode feature – you can thank a last-minute name change for that. To avoid confusing people, there are no plans to rename it to reflect the VSM name at this time, though it may be integrated as a standard Windows feature at a later stage.
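    If you prefer the command line, the same two features can be enabled from an elevated PowerShell prompt. A minimal sketch, assuming the feature names used on Windows 10 builds of this era (verify them on your image with Get-WindowsOptionalFeature -Online):

```powershell
# Enable the hypervisor and Isolated User Mode features without the GUI.
# Run from an elevated PowerShell prompt; a reboot is required afterwards.
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-Hypervisor -All -NoRestart
Enable-WindowsOptionalFeature -Online -FeatureName IsolatedUserMode -NoRestart

# Confirm both features report State = Enabled after the reboot:
Get-WindowsOptionalFeature -Online |
    Where-Object { $_.FeatureName -in 'Microsoft-Hyper-V-Hypervisor', 'IsolatedUserMode' }
```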

    Step Three: Configure VSM

    VSM and the Trustlets loaded within are controlled via either Mobile Device Management (MDM) or Group Policy (GP).

    For the purposes of this article, I’ll cover the Group Policy method as that’s the most commonly used option, but the same configuration is possible with MDM.

    The GP setting you need to know about is called Turn On Virtualization Based Security, located under Computer Configuration \ Administrative Templates \ System \ Device Guard in the Group Policy Object Editor:

    image

    Enabling this setting and leaving all the options blank or at their defaults will turn on VSM, ready for the Device Guard and Credential Guard steps below. In this default state, only Hypervisor Code Integrity (HVCI) runs in VSM until you enable the features below (protected KMCI and LSA).
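    For lab machines that aren't domain joined, the same configuration can be applied directly through the registry. A sketch, assuming the value names documented for this era of Windows 10 – check them against your build before relying on this:

```powershell
# Registry equivalent of the "Turn On Virtualization Based Security" policy.
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\DeviceGuard'
New-Item -Path $key -Force | Out-Null

# 1 = turn Virtualization Based Security on
Set-ItemProperty -Path $key -Name EnableVirtualizationBasedSecurity -Value 1 -Type DWord

# Platform security level: 1 = Secure Boot, 2 = Secure Boot and DMA protection
Set-ItemProperty -Path $key -Name RequirePlatformSecurityFeatures -Value 1 -Type DWord
```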

    Device Guard

    Now that we have an understanding of Virtual Secure Mode, we can begin to discuss Device Guard. The most important thing to realize is that Device Guard is not a single feature; rather, it is a set of features designed to work together to prevent and eliminate untrusted code from running on a Windows 10 system.

    Device Guard consists of three primary components:

    • Configurable Code Integrity (CCI) – ensures that only trusted code runs from the boot loader onwards.
    • VSM Protected Code Integrity – moves Kernel Mode Code Integrity (KMCI) and Hypervisor Code Integrity (HVCI) components into VSM, hardening them from attack.
    • Platform and UEFI Secure Boot – ensures the boot binaries and UEFI firmware are signed and have not been tampered with.

    When these features are enabled together, the system is protected by Device Guard, providing class leading malware resistance in Windows 10.

    Configurable Code Integrity (CCI)

    CCI dramatically changes the trust model of the system to require that code is signed and trusted for it to run. Other code simply cannot execute. While this is extremely effective from a security perspective, it provides some challenges in ensuring that code is signed.

    Your existing applications will likely be a combination of code that is signed by the vendor and code that is not. For code that is not signed, the easiest option is to use a tool called signtool.exe to generate security catalogs (signatures) for just about any application.

    More detail on this in an upcoming post.

    The high-level steps to configure code integrity for your organization are:

    1. Group devices into similar roles – some systems might require different policies (or you may wish to enable CCI for only select systems such as Point of Sale systems or kiosks).
    2. Use PowerShell to create integrity policies from “golden” PCs
      (use the New-CIPolicy Cmdlet)
    3. After auditing, merge code integrity policies using PowerShell (if needed)
      (Merge-CIPolicy Cmdlet)
    4. Discover unsigned LOB apps and generate security catalogs as needed (Package Inspector & signtool.exe – more info on this in a subsequent post)
    5. Deploy code integrity policies and catalog files
      (GP setting below + copying .cat files to catroot - C:\Windows\System32\catroot\{F750E6C3-38EE-11D1-85E5-00C04FC295EE}\)
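    Steps 2, 3 and 5 above can be sketched with the ConfigCI cmdlets. The paths and the PcaCertificate rule level below are illustrative choices, not the only valid ones:

```powershell
# Step 2: scan a "golden" reference PC and write a code integrity policy.
# -UserPEs includes user-mode binaries in the scan.
New-CIPolicy -Level PcaCertificate -UserPEs -FilePath C:\CI\GoldenPC.xml

# Step 3: merge policies captured from several golden machines (if needed).
Merge-CIPolicy -PolicyPaths C:\CI\GoldenPC.xml, C:\CI\KioskPC.xml -OutputFilePath C:\CI\Merged.xml

# Step 5: convert the XML policy to the binary form that the
# "Deploy Code Integrity Policy" GP setting points at.
ConvertFrom-CIPolicy -XmlFilePath C:\CI\Merged.xml -BinaryFilePath C:\CI\SIPolicy.bin
```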

    The Group Policy setting in question is Computer Configuration \ Administrative Templates \ System \ Device Guard \ Deploy Code Integrity Policy:

    image

    VSM Protected Code Integrity

    The next component of Device Guard we’ll cover is VSM-hosted Kernel Mode Code Integrity (KMCI). KMCI is the component that handles the control aspects of enforcing code integrity for kernel mode code. When you use Configurable Code Integrity (CCI) to enforce a Code Integrity policy, it is KMCI – and its user-mode cousin, UMCI – that actually enforces the policy.

    Moving KMCI to being protected by VSM ensures that it is hardened to tampering by malware and malicious users.

    Platform & UEFI Secure Boot

    While not a new feature (introduced in Windows 8), Secure Boot provides a high-value security benefit by ensuring that firmware and boot loader code is protected from tampering using signatures and measurements.

    To deploy this feature you must be UEFI booting (not legacy), and the Secure Boot option (if supported) must be enabled in the UEFI. Once this is done, you can build the machine (you’ll have to wipe & reload if you’re switching from legacy to UEFI) and it will utilize Secure Boot automatically.

    For more information about the specifics of deploying Device Guard, start with the deployment guide.

    Credential Guard

    Although separate from Device Guard, the Credential Guard feature also leverages Virtual Secure Mode by placing an isolated version of the Local Security Authority (LSA – or LSASS) under its protection.

    The LSA performs a number of security sensitive operations, the main one being the storage and management of user and system credentials (hence the name – Credential Guard).

    Credential Guard is enabled by configuring VSM (steps above) and configuring the Turn On Virtualization Based Security Group Policy setting with Credential Guard enabled.

    Once this is done, you can easily check if Credential Guard (or many of the other features from this article) is enabled by launching MSINFO32.EXE and viewing the following information:

    image

    You can also check for the presence of the LSAIso process, which is running in VSM:

    image
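    Beyond MSINFO32 and Task Manager, these states can also be read programmatically. A quick sketch using the Win32_DeviceGuard WMI class (present on Windows 10 Enterprise builds where the feature is available):

```powershell
# Report which VBS services are configured vs. actually running.
Get-CimInstance -Namespace root\Microsoft\Windows\DeviceGuard -ClassName Win32_DeviceGuard |
    Select-Object VirtualizationBasedSecurityStatus, SecurityServicesConfigured, SecurityServicesRunning

# If Credential Guard is running, the isolated LSA process should exist:
Get-Process -Name LsaIso -ErrorAction SilentlyContinue
```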

    I hope this article has been useful for you and answered at least some of your questions about Device Guard and Credential Guard.

    If your thirst for knowledge is not yet quenched and you need more information while you wait for the follow up posts, check out the following Channel9 videos that cover this topic:

    https://channel9.msdn.com/Blogs/Seth-Juarez/Isolated-User-Mode-in-Windows-10-with-Dave-Probert
    https://channel9.msdn.com/Blogs/Seth-Juarez/Isolated-User-Mode-Processes-and-Features-in-Windows-10-with-Logan-Gabriel

    Stay tuned for further posts about this and other Windows 10 features. Hey, why not subscribe?

    Ash.



    Some time has passed since the start of the working year, and the data just keeps piling up – how should you deal with it all? Today the Office blog will show you how to sort this data quickly. Tired of copy-and-paste? Learn this trick and a copy-and-paste job that used to take 20 minutes will take you just 30 seconds.

    Let's look at an example!

    Andy considers himself a busy man, always expecting great things of himself. But today, ten minutes before he is due to present a million-dollar proposal, his manager makes a request:

    "Andy, pull the English names out of this email for me; I need to print the list in a moment."

     

    What! The big presentation is only minutes away. How can Andy finish this task quickly and go shine in the meeting?

    Today the Office blog will show you how to complete this simple task within 30 seconds. It really only takes 30 seconds.

    Step 1: Under the English surname column heading, type the first person's surname: Huang


    Step 2: On the ribbon, click Flash Fill under the Data tab

    Step 3: All done!



    Tip: Memorize the Ctrl + E shortcut to save even more time.

     

    It really takes less than 30 seconds, and all the data you need is filled in correctly and automatically. Copy-and-paste is old news – come try this little trick!

    Now Andy can surely get back to preparing his million-dollar proposal!

     

    If you run into any problems, leave a comment below and we will do our best to help you.

    And if there is a handy feature you would like the next Excel tips column to introduce, feel free to ask in the comments below!



    Hello, this is Seko from Windows Platform Support.

    This post walks through how to set up alert notifications for IaaS VM Backup.

    If you first want to see what alerts can do in general, please refer to the following published documentation:

    Title: Manage and monitor Azure virtual machine backups
    https://azure.microsoft.com/ja-jp/documentation/articles/backup-azure-manage-vms/#-8
    Relevant section: Alert notifications

    Title: Add-AlertRule
    https://msdn.microsoft.com/en-us/library/mt282468.aspx

    The alert configuration steps are described below.

    By using the Add-AlertRule command, you can have an alert delivered by email when a backup fails. The procedure is as follows.
     
    - Configuring an alert
    --------------------------------------------------------------------
    1. Log on to "Microsoft Azure PowerShell" with your Azure account.
     
    - Example
    ----------------------------------------------------------
    PS C:\Users\taseko.000> Login-AzureRmAccount
     
    Environment           : AzureCloud
    Account               : ms@microsoft.com
    TenantId              : 72f988bf-86f1-41af-91ab-2d7cd01*****
    SubscriptionId        : 0d8ad84b-f76a-40f7-a159-1095089*****
    CurrentStorageAccount :
     
     
    2. Run the following command to identify the backup vault that the alert will target.
     
    Command: Get-AzureRmResource | Where-Object { $_.ResourceName -eq "<backup vault name>" }
     
    - Example
    ----------------------------------------------------------
    PS C:\Users\taseko.000> Get-AzureRmResource | Where-Object { $_.ResourceName -eq "EAST-US-TEST1" }
     
    Name              : EAST-US-TEST1
    ResourceId        : /subscriptions/0d8ad84b-f76a-40f7-a159-1095089*****/resourceGroups/RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US/providers/microsoft.backup/BackupVault/EAST-US-TEST1
    ResourceName      : EAST-US-TEST1
    ResourceType      : microsoft.backup/BackupVault
    ResourceGroupName : RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US
    Location          : eastus
    SubscriptionId    : 0d8ad84b-f76a-40f7-a159-1095089*****
     
     
    3. Run the Add-AlertRule command to add the alert.
     
    Command:
    Add-AlertRule -EventName Backup -EventSource Administrative -Level Error -Location <String> -Name <String> -OperationName Microsoft.Backup/backupVault/Backup -Operator GreaterThanOrEqual -ResourceGroup <String> -ResourceId <String> -ResourceProvider Microsoft.Backup -RuleType Event -Status <String> -SubStatus <String> -Threshold <Double> [-CustomEmails <String[]> ] [-Description <String> ] [-DisableRule] [-EmailAddress <String> ] [-SendToServiceOwners] [-WindowSize <TimeSpan> ] [ <CommonParameters>]
     
    - Example
    ----------------------------------------------------------
    PS C:\Users\taseko.000> Add-AlertRule -EventName Backup -EventSource Administrative -Level Error -Location eastus -Name TestCase -OperationName Microsoft.Backup/backupVault/Backup -Operator GreaterThanOrEqual -ResourceGroup RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US -ResourceId /subscriptions/0d8ad84b-f76a-40f7-a159-1095089*****/resourceGroups/RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US/ -ResourceProvider Microsoft.Backup -RuleType Event -Status Failed -SubStatus Failed -Threshold 1 -EmailAddress ms@microsoft.com  -SendToServiceOwners
     
    RequestId                            StatusCode
    ---------                            ----------
    12b8bf6b-bc29-4212-999d-9778f01*****    Created
     
     
    - About the command options
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    - EventName
    Specify "Backup" to create an alert for backup failures.
    For alerts on IaaS VM backup, the supported values are:
    Register, Unregister, ConfigureProtection, Backup, Restore, StopProtection, DeleteBackupData, CreateProtectionPolicy, DeleteProtectionPolicy, UpdateProtectionPolicy
     
    - EventSource
    The name of the event source monitored by the rule. For Azure Backup the source is "Administrative".
     
    - Level
    Supported values are Informational and Error. Use Error for alerts on failed operations and Informational for alerts on successful jobs.
     
    - Location
    The location of the target backup vault. Check "Location" in the output of step 2.
     
    - Name
    The name of the alert rule.
     
    - OperationName
    To create a backup alert, specify this in the form "Microsoft.Backup/backupVault/Backup".
     
    - Operator
    The comparison operator used to evaluate the rule condition. To be notified when a backup fails, specify GreaterThanOrEqual. The four possible values are:
    (GreaterThan | GreaterThanOrEqual | LessThan | LessThanOrEqual)
     
    - ResourceGroup
    The resource group of the resource that triggers the operation. Check "ResourceGroupName" in the output of step 2.
     
    - ResourceId
    Check "ResourceId" in the output of step 2.
     
    - ResourceProvider
    The resource provider. "Microsoft.Backup" is the resource provider used for Azure Backup.
     
    - RuleType
    Specify Event, since this is an event-based backup alert.
     
    - Status
    Supported values are Started, Succeeded and Failed. Specify "Failed" to alert on failures. If you specify Succeeded for Status, specify Informational for Level.
     
    - SubStatus
    Same as Status for the backup operation.
     
    - Threshold
    The threshold at which the alert fires.
     
    - Description
    A description of the alert rule.
     
    - CustomEmails
    Custom email addresses to which alert notifications are sent.
     
    - SendToServiceOwners
    If specified, alert notifications are sent to all administrators and co-administrators of the subscription.
     
    - WindowSize  "hours:minutes:seconds (00:05:00)" (minimum is 5 minutes)
    Adding this option lets you specify how often the alert condition is evaluated. Consider adjusting it to suit your operations.
     

    - Checking an alert
    --------------------------------------------------------------------
    1. Run the following command to identify the resource group name of the backup vault the alert targets.
     
    Command: Get-AzureRmResource | Where-Object { $_.ResourceName -eq "<backup vault name>" }
     
    - Example
    ----------------------------------------------------------
    PS C:\Users\taseko.000> Get-AzureRmResource | Where-Object { $_.ResourceName -eq "EAST-US-TEST1" }
     
    Name              : EAST-US-TEST1
    ResourceId        : /subscriptions/0d8ad84b-f76a-40f7-a159-1095089*****/resourceGroups/RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US/providers/microsoft.backup/BackupVault/EAST-US-TEST1
    ResourceName      : EAST-US-TEST1
    ResourceType      : microsoft.backup/BackupVault
    ResourceGroupName : RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US
    Location          : eastus
    SubscriptionId    : 0d8ad84b-f76a-40f7-a159-1095089*****
     
     
    2. The following command lists the alerts associated with the target resource group.
     
    Command: Get-AlertRule -ResourceGroup <resource group name>
     
    - Example
    ----------------------------------------------------------
    PS C:\Users\taseko.000> Get-AlertRule -ResourceGroup RecoveryServices--*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US
     
    Properties : Microsoft.Azure.Management.Insights.Models.Rule
    Tags       : {[$type, Microsoft.WindowsAzure.Management.Common.Storage.CasePreservedDictionary, Microsoft.WindowsAzure.Management.Common.Storage], [hidden-link:/subscriptions/0d8ad84b-f76a-40f7-a159-1095089*****/resourceGroups/RecoveryServices-*****TB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US/, Resource]}
    Id         : /subscriptions/0d8ad84b-f76a-40f7-a159-1095089*****/resourceGroups/RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US/providers/microsoft.insights/alertrules/TestCase
    Location   : eastus
    Name       : TestCase
     
     
    To see the full details, run the command with the "-DetailedOutput" option:
     
    Command: Get-AlertRule -ResourceGroup <resource group name> -DetailedOutput
     
     
    - Deleting an alert
    --------------------------------------------------------------------
    Run the following command to delete the target alert.
     
    Command: Remove-AlertRule -Name <alert name> -ResourceGroup <resource group name>
     
    - Example
    ----------------------------------------------------------
    PS C:\Users\taseko.000> Remove-AlertRule -Name TestCase -ResourceGroup RecoveryServices-*****ZBYRTB6YDWNPCPY5Z5PJH3GAKHOGSLYZXAELCN74LBJRL5Q-East-US
     
    RequestId                            StatusCode
    ---------                            ----------
    *****abe-6c0d-44a7-98ea-93f5fb152690         OK



    Hi everyone! Just a quick tip regarding this message. I was at a customer site and found that this command does not always return data, so I had to check it more carefully.

    Symptom

    Suppose you are sure there are log entries matching the filter you have specified – say a Correlation ID – but after you execute the following command:

    Merge-SPLogFile -Path C:\temp\ErrorJob.log -Correlation "84FC629D-03AF-A09C-2AC2-C59A1650BB9E"

    You receive the message

    WARNING: Cmdlet did not return any records in the log file. Check your time range or filters.

    And you just stare at the message knowing that it is a LIAR! How do you teach it who's the boss?

    Reason

    This is because Merge-SPLogFile normally hasn't found any matching entries in the most recent files (my assumption is that it only checks the latest ULS files), so you need to provide a date filter.

    Solution

    Instead of just asking for the Correlation ID, provide a date filter; the narrower the time span, the faster it will execute:

    Merge-SPLogFile -Path C:\temp\ErrorJob.log -Correlation "84FC629D-03AF-A09C-2AC2-C59A1650BB9E" -StartTime "2/26/2016 12:00:00" -EndTime "2/26/2016 15:00:00"

    Voilà!

    You now won't see the message and will have your beloved trimmed log.

    Enjoy!



    On March 3 at 11:00 (MSK) we will begin our webinar "Protecting Windows 10 Devices with Device Guard", where you will get to know the components of Device Guard, the device and software requirements, configuration specifics and usage scenarios for this new solution.

    Device Guard is a set of hardware- and software-based protection technologies available for devices running Windows 10. Once configured, Device Guard ensures that only trusted drivers and applications can run, much as modern smartphones do today. This approach significantly reduces the risk of malicious code getting into the system. On the other hand, it naturally requires planning and implementation on the part of the IT department.

    Topics covered:
    • Device Guard components
    • Creating and applying Code Integrity policies
    • Adding line-of-business applications to a catalog
    • Device Guard usage scenarios

    Speaker: Alexander Shapoval, Technology Evangelist, Microsoft Russia

    Anyone who wishes may join the webinar; registration is required.



    We would like to invite you to an online webinar taking place tomorrow at 10:00. When you develop web or cloud applications, you need a tool that is fast and lightweight. But you still need an editor that lets you write code efficiently and helps you debug your applications right in the editor. That is why we built Visual Studio Code for you: a free...(read more)

    0 0

    March 2, 2016

    By Meghan Liese, Senior Product Marketing Manager, StorSimple

     

    Today’s enterprises are embracing digital transformation and are strategically defining revenue-generating experiences as key to their growth. A byproduct of this rapid transformation is massive amounts of business data, growing at a double-digit rate every year. This has directly resulted in rising storage costs, a growing storage infrastructure footprint, and increasing complexity of storage and data management within enterprises of all sizes, creating significant hurdles for IT to scale efficiently in support of the growing demands of the business.

    However, the time has come to transition the focus of IT from infrastructure management to business agility, where the goal is not just connecting storage and server infrastructure, but rather connecting data to the users and applications quickly and efficiently. A key to this transition is the adoption of the cloud.

    With Microsoft Azure StorSimple, customers have reduced their storage costs, eliminated infrastructure sprawl, simplified data management and increased IT agility to help transform their businesses.  “Prior to StorSimple, we simply added more and more storage to accommodate growing data,” says Lee Bingham, Head of IT for leading European fashion designer Paul Smith.  “Now we are using StorSimple to create policies that automatically archive data to Azure storage in the cloud if it hasn’t been accessed in 12 months.”

    Today we are announcing the general availability of the StorSimple Virtual Array, as part of the overall StorSimple product offering for Microsoft Azure. The StorSimple Virtual Array provides hybrid cloud storage using a VM that can be run on Hyper-V or VMware hypervisors and supports either a Network Attached Storage (NAS) or Storage Area Network (SAN) configuration. The StorSimple Virtual Array also delivers integrated primary storage, data protection, archiving, and disaster recovery capabilities in a single, easy to deploy solution designed for small, remote environments where there is minimal IT infrastructure and management.

    With the StorSimple Virtual Array, customers no longer need to centralize data protection and disaster recovery at the main datacenter. Instead, the StorSimple Virtual Array allows customers to have a highly scalable, consistent, and cost-effective approach to managing data growth and data protection across all their environments, including remote and branch offices. Rand Morimoto, President of Convergent Computing, an early adopter of the StorSimple Virtual Array explains that, “Microsoft has helped us consolidate several storage solutions into a comprehensive strategy that includes high speed appliance-based storage, versatile and cost effective Virtual Array technology with a hybrid integration of our storage needs on-premises and the agility of extending storage and backup into the cloud.”

    The StorSimple Virtual Array is available to all Microsoft Azure customers with an Enterprise Agreement. The StorSimple Virtual Array is managed by the StorSimple Manager, which provides a single point of management along with other StorSimple solutions.

    With today’s announcements, we are excited that more enterprises can adopt a hybrid storage strategy based on StorSimple to transform their businesses – by reducing costs, simplifying IT processes and helping increase IT agility in support of their business goals.

    According to leading storage analyst firm Taneja Group, “Microsoft Azure StorSimple efficiently extends the local storage infrastructure into the cloud. With the Virtual Array, StorSimple does so much more.  Now the enterprise can bring remote and branch offices into the fold with virtualized StorSimple arrays that can run from remote office premises, or in case of a disaster, from Azure. Virtual Array rounds out the StorSimple product family and lets Microsoft broadly expand its customer base to ROBO locations,” states Arun Taneja, Founder, Taneja Group.


    0 0

    Hello again!  Tim Macaulay here from the Identity Support team at Microsoft.  Recently I worked through an issue where we had CNF objects that had fully synchronized to the cloud.  These objects were in some weird state in Active Directory, so our goal was to prevent them from making it to the cloud by setting the cloudFiltered attribute to True.  Setting the cloudFiltered attribute to True will only allow the object to go as far as the Metaverse. 

    To accomplish this task, we have a couple options available to us depending on the build of Azure AD Connect that you are currently utilizing in your environment.

    *NOTE: I am not certain of the specific Azure AD Connect build in which the ability to clone default Synchronization Rules was introduced, so I am basing this blog on build 1.0.9131.0, which I know has this feature available.  If you are using a build that contains this feature, then you can use option #2 as well.

      • Option #1: If you are using a build of Azure AD Connect that is below 1.0.9131.0
      • Option #2: If you are using build 1.0.9131 or later

     

    Option #1

     If you are utilizing a build of Azure AD Connect in your environment that is below 1.0.9131.0, then you will need to create a new Synchronization Rule.  Please find below some detailed steps to guide you through the creation of the Synchronization Rule. 

      1. Open the Synchronization Rules Editor
        1. Start > All Programs > Azure AD Connect > Synchronization Rules Editor
      2. Click the Add New Rule button in the upper right - The Synchronization Rule Dialog should open on the Description Tab.

        DESCRIPTION TAB (This tab is pretty self-explanatory.   Here is a reference to help out)

        NAME Title of the Synchronization Rule - This is how the Sync Rule will be displayed in the Sync Rule Editor
        DESCRIPTION Purpose of Synchronization Rule
        CONNECTED SYSTEM On-Premise Active Directory
        CONNECTED SYSTEM OBJECT TYPE User
        METAVERSE OBJECT TYPE Person
        LINK TYPE Join
        PRECEDENCE 50
        TAG (leave blank)
        ENABLE PASSWORD SYNC (leave not checked)
        DISABLED (leave not checked)



     *NOTE: The below snapshot is for illustration purposes




    SCOPING FILTER TAB
    Use this tab to identify the objects to which this synchronization rule should apply.

     

     JOIN RULES TAB (You will need a Join Rule here.  A Join Rule should be based on an attribute that provides a unique value to uniquely identify the object.)

      

     

    TRANSFORMATION TAB

    This is where we will set the cloudFiltered attribute.

    1. Click Add Transformation

    FLOW TYPE Expression
    TARGET ATTRIBUTE cloudFiltered
    SOURCE IIF(IsPresent([isCriticalSystemObject]) || IsPresent([sAMAccountName]) = False || [sAMAccountName] = "SUPPORT_388945a0" || Left([mailNickname], 14) = "SystemMailbox{" || Left([sAMAccountName], 4) = "AAD_" || (Left([mailNickname], 4) = "CAS_" && (InStr([mailNickname], "}") > 0)) || (Left([sAMAccountName], 4) = "CAS_" && (InStr([sAMAccountName], "}") > 0)) || Left([sAMAccountName], 5) = "MSOL_" || CBool(IIF(IsPresent([msExchRecipientTypeDetails]),BitAnd([msExchRecipientTypeDetails],&H21C07000) > 0,NULL)) || CBool(InStr(DNComponent(CRef([dn]),1),"\\0ACNF:")>0), True, NULL)
    APPLY ONCE (leave not checked)
    MERGE TYPE Update
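Read end to end, the expression above flags an object as cloudFiltered when it is a critical system object, has no sAMAccountName, matches one of the well-known service account name patterns, or is a CNF (conflict) object. Here is a minimal Python restatement of that decision, purely for illustration; the names are mine and it omits the msExchRecipientTypeDetails bitmask test that the real rule performs:

```python
def is_cloud_filtered(sam=None, mail_nickname=None, dn="", critical=False):
    """Simplified restatement of the cloudFiltered expression above.

    Returns True when the object should be filtered, i.e. stop at the
    Metaverse and never reach Azure AD. Illustrative only; the real
    rule also inspects msExchRecipientTypeDetails.
    """
    if critical or not sam:                      # isCriticalSystemObject / no sAMAccountName
        return True
    if sam == "SUPPORT_388945a0":                # legacy support account
        return True
    if mail_nickname and mail_nickname.startswith("SystemMailbox{"):
        return True
    if sam.startswith(("AAD_", "MSOL_")):        # sync service accounts
        return True
    for value in (mail_nickname, sam):           # Exchange CAS_{...} accounts
        if value and value.startswith("CAS_") and "}" in value:
            return True
    # CNF objects carry a 0x0A + "CNF:" marker in the first DN component
    first_rdn = dn.split(",", 1)[0]
    return "CNF:" in first_rdn
```

So an object such as CN=Tim Macaulay\0ACNF:&lt;guid&gt; would be flagged, while a normal user object flows through untouched.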

     

     Option #2

    In this option, if you are using a build that contains the ability to clone a Synchronization Rule, such as build 1.0.9131.0, then we can use the steps below to accomplish our goal.

    1. Open the Synchronization Rules Editor
      1. Start > All Programs > Azure AD Connect > Synchronization Rules Editor
    2. Select the In from AD - User Join Synchronization Rule
    3. Click the Edit Button
    4. In the Pop-Up Window click the Yes button

    Now that we have cloned the In from AD - User Join, let's make it a bit more understandable for the task at hand.  For this option, we only need to modify a few properties on the Description tab.  Let's go over them now.

    DESCRIPTION TAB

    Leave all other properties alone, except for the below mentioned properties.

    NAME In from AD - User Join - Cloned - Filter Joined CNF Objects
    DESCRIPTION Filter Joined CNF Objects
    LINK TYPE Join
    PRECEDENCE 50

     

    TESTING

    Ok.  Now that we have created this new customized synchronization rule, we need to test it to ensure that it works correctly in the environment we are working in.  To test, we will use a feature known as Preview. 

    1. Open the Synchronization Service Manager console and select the Connectors tab
    2. Select the On-Premise Active Directory Connector
    3. From the Actions menu, select Search Connector Space
    4. Change the Scope to RDN
    5. In the text box to the right, enter the start of the Distinguished Name (e.g. CN=Tim Macaulay)
      1. This will be the Distinguished Name of the object in question
    6. Click the Search button
    7. Once found, double-click the object to review its connector space properties
    8. Click the Preview button
    9. Click the Generate Preview button
      1. Generate Preview will display what would happen to the object in question if it were synchronized
    10. Click Import Attribute Flow
    11. Locate and review the Metaverse attribute cloudFiltered
      1. cloudFiltered should now be set to True
      2. If cloudFiltered is not set to True, then something did not execute correctly
    12. Click Connector Updates
      1. Expect to see just one connector, for the On-Premise Active Directory Connector
      2. It is possible that you will also see the Azure Active Directory Connector.  If it is there, it should contain the word "Deprovision"

     

     

    ADDITIONAL INFORMATION

     


    0 0

    A new free eBook from Microsoft Press has recently become available in several formats:

    Deploying Windows 10: Automating deployment by using System Center Configuration Manager
    by Andre Della Monica, Russ Rimmerman, Alessandro Cesarini and Victor Silveira.

    image

    With Windows 10, Microsoft provides its most modern operating system for enterprise use. This book shows how Windows 10 can be installed automatically and administered with System Center Configuration Manager.

    Windows 10 is available as a platform "as a service". The days of multi-year upgrade cycles are over. At the same time, companies expect smooth, continuous operation of the platform, which the new update and deployment mechanisms provide.

    The Windows 10 lifecycle spans deployment, day-to-day use, and updates and adjustments during ongoing operation. Automation is the key to successfully planning and administering Windows in the enterprise. The authors are proven experts (Premier Field Engineers) in Windows and automation and have poured their real-world experience into the book.

    The book is available for free download in the common formats:

    Across 95 pages, it describes methods for automated Windows 10 deployment, up to and including the use of System Center Configuration Manager.

    More literature on Windows 10 is also available in the free eBook Introducing Windows 10 for IT Professionals, Technical Overview.

    Enjoy the free eBook Deploying Windows 10!


    0 0

     

    image

     

     

     

    The Exchange 2013 MP has been released for some time now.  The current version as of this writing is 15.0.666.19, which you can get HERE

    This MP can be used to discover and monitor Exchange Server 2013 and 2016.

     

     

     

     

    However, one of the things I always disliked about this MP is that it does not use a seed class discovery.  Instead, it runs a PowerShell script every 4 hours on EVERY machine in your management group, looking for Exchange servers.  The problem is that this doesn't follow best practices: as a general best practice, we should NOT run scripts on all servers unless truly necessary.  Another issue is that many customers have servers running Windows Server 2003 and 2008 that DON'T have PowerShell installed!  You will see nuisance events like the following:

     

    Event Type:    Error
    Event Source:    Health Service Modules
    Event Category:    None
    Event ID:    21400
    Date:        3/2/2016
    Time:        3:29:26 AM
    User:        N/A
    Computer:    WINS2003X64
    Description:
    Failed to create process due to error '0x80070003 : The system cannot find the path specified.
    ', this workflow will be unloaded.
    Command executed:    "C:\WINDOWS\System32\WindowsPowerShell\v1.0\powershell.exe" -PSConsoleFile "bin\exshell.psc1" -Command "& '"C:\Program Files\Microsoft Monitoring Agent\Agent\Health Service State\Monitoring Host Temporary Files 85\26558\MicrosoftExchangeDiscovery.ps1"'" 0 '{3E7D658E-FA5E-924E-334E-97C84E068C4A}' '{B21B34F9-2817-4800-73BD-012E79609F7E}' 'wins2003x64.dmz.corp' 'wins2003x64' 'Default-First-Site-Name' 'dmz.corp' '' '' '0' 'false'
    Working Directory:    C:\Program Files\Microsoft Monitoring Agent\Agent\Health Service State\Monitoring Host Temporary Files 85\26558\
    One or more workflows were affected by this. 
    Workflow name: Microsoft.Exchange.15.Server.DiscoveryRule
    Instance name: wins2003x64.dmz.corp
    Instance ID: {B21B34F9-2817-4800-73BD-012E79609F7E}
    Management group: OMMG1

     

     

    So, I have created an addendum MP which should resolve this.  My MP creates a class and discovery, looking for “HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ExchangeServer\v15\Setup\MsiInstallPath” in the registry.  If it finds the registry path, SCOM will add the machine as an instance of my seed class.

    image

     

    Then, I created a group of Windows Computer objects that “contain” an instance of the seed class. 

    image

     

    Next, I added an override to disable the main script discovery in the Exchange 2013 MP.

    Finally, I added an override to enable this same discovery for my custom group.  The effect is that the Exchange discovery script ONLY runs on servers that actually have Exchange installed (based on the registry key).
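The seed pattern above boils down to a cheap registry-existence test deciding which machines ever run the expensive discovery script. A minimal sketch of that selection logic follows; the registry is simulated with plain dicts and the function name is mine, not the MP's:

```python
# Registry value whose presence marks an Exchange 2013/2016 server
SEED_KEY = r"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ExchangeServer\v15\Setup\MsiInstallPath"

def discover_seed_instances(machines):
    """Return the machines whose (simulated) registry contains the
    Exchange v15 install path. Only these become seed class instances,
    so only they are targeted by the script-based discovery override."""
    return sorted(name for name, registry in machines.items()
                  if SEED_KEY in registry)
```

Every other machine in the management group is skipped entirely, which is what eliminates the 21400 nuisance events shown above.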

    image

     

     

    This works for discovering Exchange 2013 and Exchange 2016 with the current Exchange 2013 MP.

     

    You can download this sample MP at the following location:

    https://gallery.technet.microsoft.com/Exchange-Server-2013-and-cfdfcf2f


    0 0

    The following post is provided as-is with no warranty or support of any sort. It illustrates how flexible the Federation Role can be if you have a little imagination and some time to spare. This is not a security feature or anything like that... As a matter of fact, if you'd like to detect suspicious logon activity, I highly encourage you to have a look at this: Use Azure Active Directory sign-in and audit reports.

    Here, the idea is to trigger Multi-Factor Authentication if the user connects from a country different from the one set for the user in Active Directory. Again, this is just an example; it isn't really a security solution.

    So why this example? To illustrate how flexible the product can be, to show how to use claim rules and MFA triggers, and to show how to leverage a custom attribute store...

    What's the plan?


    When...

    • the user is connected through the Web Application Proxy servers
    • the user tries to access to a specific RP
    • the user has a country defined in Active Directory different from the one its current connection is coming from

    We...

    • trigger Multi-Factor Authentication; in my case a phone call is placed and the user has to pick up and enter a PIN

    So this assumes that the user already has a country set in Active Directory Domain Services (the attribute co) and that you already have an MFA provider. In my example, I am using an Azure MFA server (but really, any MFA provider does the trick). If you are not familiar with Azure MFA and wish to get a glimpse of it, please watch/listen to this: TechNet Radio: Delivering Results: How Microsoft is Simplifying Authentication with Azure MFA. This post does not explain how to configure Azure MFA; it just leverages it.

    Step 0 - Setting a country in Active Directory for Alice

    Yes in IT everything starts with a 0. So Alice will be our test user.

    Set-ADUser -Identity Alice -Country "CA"

    Interestingly, in Active Directory you set the country by setting the country code. So here in my example, CA means Canada.

    Step 1 - Creating a new claim definition

    This is optional, since claim rules are fairly easy-going and do not enforce the existence of a claim definition before you use one. But for the sake of using PowerShell, let's do it. We are going to need two new claims: one to store the country of the user's connection, and one as a flag to determine whether or not the country of the connection matches the country of the user in Active Directory.

    Add-AdfsClaimDescription -Name "Country of the user" -ClaimType "http://yoga.corp/Claims/CountryIP" -ShortName "CountryUser"

    Add-AdfsClaimDescription -Name "Country Match Flag" -ClaimType "http://yoga.corp/Claims/CountryMatch" -ShortName "CountryUserMatch"

    Step 2 - Detecting the country from where Alice is connected

    For this there is nothing out of the box. So let's be creative. When we want to query for information which exists neither in the claim pipeline nor in Active Directory, we can query an attribute store. By default, only Active Directory is listed as an attribute store. In fact, if you are very finicky, there are some attribute stores which are hidden... the _OpaqueIdStore, the _PasswordExpiryStore (maybe that will be the theme of another post), but none of them provide what we need: the country from where Alice is connected. When things are not here by default, we can extend the default capabilities with some customization. Here we are going to use a custom attribute store to give us the country. Basically, we are going to feed this custom attribute store a public IP address; it will query a public online webservice, the webservice will return the country, and we will add the returned country as a claim. Custom attribute stores are DLLs that you have to develop yourself... Because I don't want to trigger multiple support calls ;) I'll just point to some documentation if you'd like to do the same: How to create a Custom Attribute Store for Active Directory Federation Services 3.0. Here is an excerpt of my custom attribute store:

    using System;
    using System.Collections.Generic;
    using System.Text;
    using System.Net;
    using System.Xml;
    using Microsoft.IdentityServer.ClaimsPolicy.Engine.AttributeStore;
    using System.IdentityModel;

    namespace GeoIPv4AS
    {
        public class IpOperations : IAttributeStore
        {
            public IAsyncResult BeginExecuteQuery(string query, string[] parameters, AsyncCallback callback, object state)
            {
                if (String.IsNullOrEmpty(query) || parameters == null || parameters.Length != 1 )
                {
                    throw new AttributeStoreQueryFormatException("Something wrong with the input");
                }
                string inputString = parameters[0];

                if (inputString == null)
                {
                    throw new AttributeStoreQueryFormatException("Query parameter cannot be null.");
                }

                string result = null;

                switch (query)
                {
                    case "country":
                        {

                            //Get the data from the webservice
                            //Blablablabla: check that the IP is a valid IP with a RegExp
                            //I am hiding the actual code and URL of the webservice I am using
                            //You can find them online yourself :)
                            //Return the name of the country
                            result = countryName;
                            break;
                        }
                    default:
                        {
                            throw new AttributeStoreQueryFormatException("The query string is not supported.");
                        }
                }
                string[][] outputValues = new string[1][];
                outputValues[0] = new string[1];
                outputValues[0][0] = result;

                TypedAsyncResult<string[][]> asyncResult = new TypedAsyncResult<string[][]>(callback, state);
                asyncResult.Complete(outputValues, true);
                return asyncResult;
            }

            public string[][] EndExecuteQuery(IAsyncResult result)
            {
                return TypedAsyncResult<string[][]>.End(result);
            }

            public void Initialize(Dictionary<string, string> config)
            {
                // No initialization is required for this store.
            }
        }
    }

    Once compiled, I have my GeoIPAS.dll, which I copy to the C:\Windows\ADFS folder of all my ADFS servers. Then I create my custom attribute store in the console:


    Once the store has been created, I restart my ADFS server and confirm the event 251 in the ADFS admin logs:


    Now I can call the store with an IP as input and it will return the country. The input in our case will be the http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip claim, which is set to the external IP address of the client when it comes through a Web Application Proxy. The following claim rule gives you an example of how to leverage the custom attribute store:

    c:[ Type == "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip" ]
     => Issue( Store = "GeoIPAS", Types = ("http://yoga.corp/Claims/CountryIP"), Query = "country", Param = c.Value );

    This rule will issue a claim http://yoga.corp/Claims/CountryIP with the country for the IP of http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip.

    Step 3 - Creating the MFA trigger

    So to trigger MFA we will need to do 3 things (well, there are multiple ways to do it).

    1. We query the attribute store if the user is connected from the WAP (so http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork is false) and has an IP address in the claim (http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip has a value). If so, we call the custom attribute store, which calls the webservice, which returns the name of the country, and we issue it in a claim of type http://yoga.corp/Claims/CountryIP (well, technically we could use the add statement, since nothing issued at this level of the pipeline actually ends up in the token).

      c1:[ Type == "http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork", Value == "false" ] && c2:[ Type == "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip" ]
       => Issue( Store = "GeoIPAS", Types = ("http://yoga.corp/Claims/CountryIP"), Query = "country", Param = c2.Value );

    2. We query AD and check whether the user's country in AD matches the country returned by the webservice (via the custom attribute store). If so, we issue a new claim http://yoga.corp/Claims/CountryMatch (its value doesn't actually matter, since we are just using it as a flag):

      c1:[ Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY" ] && c2:[ Type == "http://yoga.corp/Claims/CountryIP" ]
       => Issue( Store = "Active Directory", Types = ("http://yoga.corp/Claims/CountryMatch"), Query = "(co={1});co;{0}", Param = c1.Value , Param = c2.Value );

    3. Finally, if the flag http://yoga.corp/Claims/CountryMatch does not exist in the pipeline, we issue the claim http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod with the value http://schemas.microsoft.com/claims/multipleauthn which will trigger the MFA, else nothing happens and the claim engine moves on:

      NOT EXISTS( [ Type == "http://yoga.corp/Claims/CountryMatch" ] )
      => Issue( Type = "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod", Value = "http://schemas.microsoft.com/claims/multipleauthn");
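Taken together, the three rules implement the decision below, restated in Python for clarity. The two callables stand in for the GeoIPAS custom attribute store and the Active Directory co lookup; all names here are mine, and the real rule 2 also requires the windowsaccountname claim, which this sketch omits:

```python
def requires_mfa(claims, geoip_lookup, ad_country_matches):
    """Mirror of the three claim rules above (illustrative sketch).

    claims: dict of short claim names to values.
    geoip_lookup: ip -> country name (custom attribute store stand-in).
    ad_country_matches: country -> bool (AD 'co' comparison stand-in).
    """
    # Rule 1: external connection through the WAP -> issue CountryIP
    if (claims.get("insidecorporatenetwork") == "false"
            and "x-ms-forwarded-client-ip" in claims):
        claims["CountryIP"] = geoip_lookup(claims["x-ms-forwarded-client-ip"])

    # Rule 2: AD country matches the connection country -> issue the flag
    if "CountryIP" in claims and ad_country_matches(claims["CountryIP"]):
        claims["CountryMatch"] = "true"

    # Rule 3: no CountryMatch flag in the pipeline -> trigger MFA
    return "CountryMatch" not in claims
```

Note that, read literally, rule 3 also fires for any request that never received a CountryIP claim in the first place, which is worth keeping in mind when you scope the relying party.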

    Now that we have the logic, we will add the trigger for a specific relying party trust. In my case, the RP is called MFACountryExample:

    $_countryMFA = @"
    c1:[ Type == "http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork", Value == "false" ] && c2:[ Type == "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip" ]
    => Issue( Store = "GeoIPAS", Types = ("http://yoga.corp/Claims/CountryIP"), Query = "country", Param = c2.Value );
    c1:[ Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY" ] && c2:[ Type == "http://yoga.corp/Claims/CountryIP" ]
    => Issue( Store = "Active Directory", Types = ("http://yoga.corp/Claims/CountryMatch"), Query = "(co={1});co;{0}", Param = c1.Value , Param = c2.Value );
    NOT EXISTS( [ Type == "http://yoga.corp/Claims/CountryMatch" ] )
    => Issue( Type = "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod", Value = "http://schemas.microsoft.com/claims/multipleauthn");
    "@

    Get-AdfsRelyingPartyTrust -Name "MFACountryExample" | Set-AdfsRelyingPartyTrust -AdditionalAuthenticationRules $_countryMFA

    And here you go... Note that as soon as you are using custom rules for MFA triggers, you won't see the options in the GUI anymore:


    Step 4 - Playing with it

    Of course, in a real-life example you have many more things to consider... First, the error handling of your custom attribute store should be solid, including parsing the input, because after all this is an HTTP header: the client could try to modify it and put something in it that would make your code crash. Then you create a dependency on that webservice and on an outgoing network connection to the Internet... For that you could use your own webservice or come up with some multi-tier thing... Anyhow, instead of going fancy on your ADFS server, just use the one we created for you: Use Azure Active Directory sign-in and audit reports.

     


    0 0

    This post explains how to handle Exchange Server 2007 and 2010 public folder databases that cannot be deleted.

    When decommissioning or replacing an Exchange Server, the public folder database may fail to delete, depending on its state.
    One typical case in which a public folder database cannot be deleted is when replicas remain on it; in that case, the following message is shown at deletion time.

    As the error message indicates, the database cannot be deleted because replicas remain on the public folder database.

    In this case, first check whether any public folders remain on the database in question.
    If replicas remain, replicate the public folders and their items to another server.

    However, depending on your situation, you may not plan to use public folders after the migration, or only system folders that do not need to be replicated may remain.
    If the following conditions apply, you can forcibly delete the public folder database using the procedure described later.

     - The required folders and items have already been replicated to another server
     - There are no plans to use public folders after the server is decommissioned
     - Public folders have never been used
     - Only unused items remain, or only system folders remain

    Note that the conditions above are general guidelines, not an exhaustive list.
    Therefore, before deleting the public folder database with ADSI Edit as described below, we recommend backing up Active Directory just in case.
    Also, regardless of whether you have a backup, please verify before starting that the deletion is really safe and that you are deleting the correct object.

    If you have any concerns about deleting the public folder database, please contact Support; we will review your environment and advise on the appropriate approach.


    The procedure for forcibly deleting a public folder database follows.

    How to forcibly delete a public folder database
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Log on to a domain controller or the Exchange Server with administrator privileges and delete the Exchange 2007 / 2010 public folder database information with the following steps.


    1. Start [ADSI Edit] from [Start] - [Administrative Tools], or type [adsiedit.msc] in [Run] and click [OK].

    2. On the menu bar, click [Action] - [Connect to].

    3. Under [Connection Point], select [Select a well known Naming Context], choose [Configuration] from the list, and click [OK].

    4. In the left pane, expand the AD objects as shown below.
    The paths differ between Exchange 2007 and 2010.

    5. Right-click [CN=<public folder database name>] and click [Delete].

    6. Click [Yes] on the confirmation screen.

    7. Continue expanding the left pane as shown below.

    8. Right-click [CN=Exchange Administrative Group (FYDIBOHF23SPDLT)] and click [Properties].

    9. In the attributes, check the siteFolderServer value. In the figure below, the public folder database "pfdb2007" is registered.

    10. If the value contains anything other than <not set>, double-click it.

    11. Click the [Clear] button, then click [OK]. Confirm that the value has been removed, as shown below.

    12. Click [OK] to close the properties.

    13. Continue expanding the left pane as shown below.

    14. Right-click [CN=<mailbox database name>] and click [Properties].

    15. In the attributes, check the msExchHomePublicMDB value. In the figure below, the public folder database "pfdb2007" is registered.

    16. If the value contains anything other than <not set>, double-click it.

    17. Click the [Clear] button, then click [OK]. Confirm that the value has been removed, as shown below.

    18. Click [OK] to close the properties.

    19. If multiple databases exist under [CN=Databases], repeat steps 14 through 18 for each of them.

    20. Close ADSI Edit.

    This completes the procedure for removing Exchange 2007 / 2010 public folder database information from Active Directory.


    0 0

    Microsoft and WIS expect to bring 100,000 small and midsize businesses to the cloud within three years

    An important Office 365 milestone for small and midsize businesses

    The largest single-partner Office 365 customer agreement to date

      (March 3, 2016, Taipei) Following the launch of the Microsoft CSP (Cloud Solution Provider) program and its success in helping Taiwanese CSP partners upgrade their business models, Microsoft Taiwan today announced an expanded CSP partnership with WIS, a leading Taiwanese professional brand in domain registration, website hosting and virtual hosting services. WIS has committed that, over the next three years, it expects to bring the 100,000 small and midsize businesses (SMBs) it serves onto the cloud through Office 365 cloud services combined with its own products and services, letting these 100,000 SMBs move to the cloud with ease. This agreement also marks an important milestone for Office 365 in the SMB market.

      "Globally, Office 365 commercial business grew 66% year over year, and since its launch in Taiwan in 2012, Office 365 revenue has grown more than 100% every year; this year, commercial Office 365 revenue growth in Taiwan was twice the global rate," said Ralph Haupter, Corporate Vice President and CEO of Microsoft Greater China. "We are delighted to conclude with WIS the largest single-partner Office 365 customer agreement to date. It is not only an important result for the CSP program, but also carries considerable significance for Office 365 in Taiwan's SMB market."

    WIS targets the cloud market with a three-year goal of 100,000 Office 365 customers

      According to 2015 research from IDC, the cloud market will reach a 14% compound annual growth rate (CAGR) by 2018, generating more than US$200 billion in revenue and more than 2.3 trillion in cloud business opportunities. To respond to the evolution of internet applications and the cloud, WIS has invested in the cloud market to increase business profitability and is transforming to a new type of internet service model, extending from serving customers' website needs to their office PC applications and adding network service products to meet continuously expanding customer demand, which is also why it is deepening its cooperation with Microsoft.

      "Over the next three years, WIS will bring the 100,000 Taiwanese SMBs it serves onto Office 365. Through the characteristics of the CSP platform combined with more value-added services, we help users not only move email to the cloud to cut costs and raise productivity, but also combine services including corporate branded domains, corporate website design, website hosting and BYOD cloud mailbox security management, strengthening Microsoft cloud capabilities such as the Office 365 cloud office and enterprise file sharing. This greatly improves security, saves management time and cost, further solves SMB IT challenges, and provides a higher grade of information security, letting SMBs adopt the cloud services favored by large enterprises at a very low cost," explained WIS General Manager 林宜鋒. "Since its founding in 1999, WIS has accumulated a very large SMB customer base covering a wide range of industries. Precisely because of this, we understand the difficulties and worries SMBs encounter when moving to cloud services, and we want to help them find cloud products and services that are familiar, highly secure and quick to adopt. With Microsoft as a strong backer, we and our SMB customers can focus on our core businesses, converting the time and cost saved into greater value and profit for the company."

      Microsoft Taiwan General Manager 邵光華 said: "WIS is one of the top three professional leading brands in website hosting and virtual hosting services in Taiwan, with deep roots among Taiwanese SMBs. Microsoft Taiwan is very pleased to cooperate with WIS on the CSP program. Through its understanding of SMBs and its professional consulting services, we can lower the technical barrier for SMBs adopting Office 365 cloud services, and Microsoft has the opportunity to bring one-stop, full-spectrum cloud services to more customers, letting SMBs move to the cloud easily with Office 365 and work efficiently on the go."

    The Microsoft CSP program's new business model: two partnership models for shared profit

      The Microsoft CSP program offers partners four major services: IP Services (own applications and services), Managed Services, Project Services and Resale Services, and lets partners choose a direct (1-Tier Direct) or indirect (2-Tier Indirect) operating model, taking responsibility for sales and value-added solution services for products such as Office 365, Microsoft Azure and CRM Online:


    ● Direct (1-Tier) cloud solution resellers:

    1. Partners choose for themselves based on qualifications and business opportunity
    2. Partners are supported by dedicated Microsoft staff: partner sales executives (PSE), telesales (Tele) and programmatic systems
    3. Partners that do not qualify are advised to become 2-Tier resellers
    4. Develop entirely new customer segments through different plans and services

    ● Indirect (2-Tier) cloud solution distributors:

    1. Conditional recruitment with strict evaluation criteria
    2. Indirect sales executed through reseller partners
    3. Partners offer high scalability, high capacity and high capability
    4. Develop entirely new customer segments through different plans and services
    5. Recruit new reseller partners

      "The Microsoft CSP program lets companies choose between direct and indirect partnership models, which not only makes business model upgrades more flexible, but also lets the program fit partners of all kinds of profiles and organizational structures," said 康容, Microsoft Greater China Region Vice President and General Manager of Marketing and Operations. "Microsoft will continue to invest in the cloud and hopes to invite more Taiwanese companies of all kinds to become Microsoft CSP partners, expanding the cloud solution ecosystem, uncovering more potential opportunities and jointly creating profit."

    For more details about the CSP program, please visit the official site at https://partner.microsoft.com/en-US/Solutions/cloud-reseller-overview


    0 0

    Here is the set of steps to go through to ensure that Exchange 2010 Service Pack upgrades or Exchange 2013 Cumulative Update installations happen smoothly:

    • Make sure the execution policies do not block the setup scripts. Check them by running:

       Get-ExecutionPolicy -List

       To clear a restrictive policy at a given scope, run Set-ExecutionPolicy Undefined -Scope <Scope> (for example, LocalMachine or CurrentUser).

       Refer

       You receive error 1603 when you try to install the Exchange Server 2010 RU1

       <https://support.microsoft.com/en-us/kb/981474>

       This is also applicable to Exchange 2013

    • Since the server is already in production, the Windows OS updates and system requirements should already be met.

       However, I would still suggest going through the hotfixes section.

       https://technet.microsoft.com/en-us/library/bb691354(v=exchg.141).aspx

       https://technet.microsoft.com/en-us/library/aa996719(v=exchg.141).aspx

    • Make sure we do not have anything additional installed that is unsupported.

       Exchange Server Supportability Matrix

       <https://technet.microsoft.com/en-us/library/ff728623(v=exchg.150).aspx>

       For example, the .NET Framework and Windows Management Framework versions installed should fall within the supportability matrix. There are instances where administrators inadvertently or unknowingly install an unsupported version of .NET or WMF that prevents future upgrades.

       Check the PowerShell version by running $PSVersionTable.

       The .NET Framework version can be checked under the registry location:

       HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full
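       As a quick illustration, the Release DWORD under that key maps to a specific .NET Framework version. Below is a minimal sketch of that lookup using the thresholds Microsoft publishes for 4.5 through 4.6.2; on a live server you would read the actual value with Get-ItemProperty (or winreg on Python for Windows), so the hard-coded values here are just examples.

```python
# Map the Release DWORD from
# HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full
# to a .NET Framework version (threshold table per Microsoft's docs).
RELEASE_TO_VERSION = [
    (394802, "4.6.2"),
    (394254, "4.6.1"),
    (393295, "4.6"),
    (379893, "4.5.2"),
    (378675, "4.5.1"),
    (378389, "4.5"),
]

def net_version(release: int) -> str:
    """Return the highest .NET 4.x version implied by a Release value."""
    for threshold, version in RELEASE_TO_VERSION:
        if release >= threshold:
            return version
    return "unknown (below 4.5)"

print(net_version(379893))  # 4.5.2
print(net_version(394271))  # 4.6.1
```

       Thresholds are checked highest-first because later servicing builds report slightly different Release numbers than the RTM value for the same version.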

    • Interim Updates, if any, need to be uninstalled before an Exchange 2010 Service Pack installation. For Exchange 2013 they need not be uninstalled; you can go ahead with the Cumulative Update installation.

      If any UM language packs are installed, they need to be uninstalled before upgrading, for both Exchange 2010 and Exchange 2013.

    • Make sure you have the latest backup of the Exchange databases, and also a system state backup of the AD environment.

    • Note down any customizations such as OWA, config files on servers, registry changes, Lync integration, or third party add-ons, prior to the installation.

    • You can run PrepareAD in advance, separately. This only needs to be done once for the whole environment.

       Exchange 2010

       setup.com /preparead

       Exchange 2013

       setup.exe /preparead /IAcceptExchangeServerLicenseTerms

       If this is not done, setup will automatically perform it at the time of the upgrade of the first server.

    • Put the server to be upgraded into maintenance mode if it holds the Mailbox role in Exchange 2013 (with or without a DAG), or if it is a DAG member in Exchange 2010.

        For Exchange 2010 :

       .\StartDagServerMaintenance.ps1 <ServerName>

       Remove it from maintenance mode using the command below

       .\StopDagServerMaintenance.ps1  <ServerName>

       <https://technet.microsoft.com/en-IN/library/ee861125(v=exchg.141).aspx>

       For Exchange 2013 :

       Use the steps as suggested in the article below,

       Exchange 2013 Maintenance mode

       <http://blogs.technet.com/b/nawar/archive/2014/03/30/exchange-2013-maintenance-mode.aspx>

       Or the script ,

       Exchange Server 2013 Maintenance Mode Script (Start)

       <https://gallery.technet.microsoft.com/office/Exchange-Server-2013-ff6c942f>

       For CAS, take the node to be upgraded out of the Load Balancer.

    • Then proceed with the installation from the GUI, or from an administrator command prompt by pointing to the location where the files were downloaded. The files should be downloaded from the Microsoft Download Center.

       Exchange 2010

       Setup.com /m:upgrade

        Exchange 2013

       Setup.exe /m:upgrade /IAcceptExchangeServerLicenseTerms

    • Make sure to use an elevated command prompt and to close all the other Exchange windows at the time of setup.

    • Also make sure there were no pending reboots prior to beginning the installation.

    • The order of installation for Exchange 2010 :

       Client Access Server (if you have multiple sites, internet facing sites first)

       Hub Transport

       Mailbox

       The order of installation for Exchange 2013 :

       Mailbox (Internet-facing site first)

       CAS

       Edge

    • After the installation, reboot the Exchange server.

    • I would also recommend testing the setup in a lab before installing it in production.

     

    - Shweta



    The terms used to describe MDM and cloud account sign-in and management have evolved over time from the original terms used in Windows 8. This post's purpose is to give you the low-down on the options available to you in Windows 10, Version 1511 and onwards.

    These features have had many names and functions, but the key point you should take away from this post is that there are only two types of enterprise cloud identity (Azure AD) you need to consider in Windows 10, and either can trigger MDM enrolment if you wish. So we've had many names, but we're actually only talking about three things: Azure AD Join, Work or School Accounts, and Device Management (MDM).

    A simple representation of this is as follows:

    (diagram not reproduced here)

    We’ve had many names for these things in older versions of Windows 10, and in Windows 8:

    • Device Enrolment
    • Device Registration
    • MDM management
    • Workplace Join
    • and many more..

    Forget everything you’ve heard before. There are just two ways to use an AAD account:

    • Azure Active Directory Join (AADJ) – where the device joins Azure AD instead of your on-premises Windows Server AD the way it typically would. This allows the user to sign in with their corporate (Azure AD) credentials at the sign-in screen as their primary account. You can't be Azure AD joined and traditionally AD joined at the same time.
    • Add Work or School Account ("Add Account") – which adds an additional account (a secondary account) to the user's login that facilitates single sign-on (SSO) even when the user is logged on with some other account as their primary account (e.g. a local account or Microsoft account). You can be traditionally AD joined and still use the Add Account functionality.

    There is also just one Mobile Device Management (MDM) type you need to worry about, which is MDM Enrolment. The term ‘enrol’ or ‘enrolment’ is used to imply MDM.

    Finally, you can either use the methods above to add an AAD account and automatically enrol in an MDM (using the enrolment URL capabilities of Azure AD), or you can manually enrol in MDM without adding or using any corporate identity (other than for the enrolment itself, of course).

    When should I use one versus the other?

    One common question after covering the above is where each is intended to be used. With some exceptions depending on your organization’s needs, the identity options are geared as follows:

    • Azure AD Join (AADJ) is primarily designed for corporate-owned devices. Since you’ll be primarily signing in with a corporate identity, it’s not as suitable for personally-owned devices.
    • Add a Work or School Account is intended for all (other) use cases, so that corporate credentials can be added to the logged on user’s account, regardless of what they are logged on with – for example the user may be logged on with their Microsoft Account, but they may wish to add their corporate or school account as a secondary account.

    I hope this has helped you better understand the options available in Windows 10!



    Hello, this is Seko from Windows Platform Support.

    This post covers a case where the very first backup fails after Azure Backup has been configured.

    1. When the backup fails, an event like the following is recorded in the GUI.



    2. Next, check the "CloudBackup" events in Event Viewer.



    Log Name:  CloudBackup
    Source:    CloudBackup
    Date:      2016/02/27 1:00:05
    Event ID:  11

    3. Click the "Details" tab at the bottom of the center pane and check the contents of EventData.




    - EventData StopInfo
    StopInfo <?xml version="1.0"?> <CBJob><JobId>060f2660-e41a-413c-a955-b6ed089a2895</JobId>
    <JobType>Backup</JobType><JobStatus><JobState>Aborted</JobState><StartFileTime>13100976
    0052990143</StartFileTime><EndFileTime>131009760053180108</EndFileTime><FailedFileLog>
    </FailedFileLog><ErrorInfo><ErrorCode>120002</ErrorCode><DetailedErrorCode>-2147024773
    </DetailedErrorCode><ErrorParamList/></ErrorInfo><DatasourceStatus><CBDatasourceStatus>
    <JobState>Aborted</JobState><LastCompletedJobState>Initializing</LastCompletedJobState>
    <ErrorInfo><ErrorCode>120002</ErrorCode><DetailedErrorCode>-2147024773</DetailedErrorCode>
    <ErrorParamList/></ErrorInfo><Datasource><DataSourceId>2026778163991805067</DataSourceId>
    <DataSourceName>C:\</DataSourceName></Datasource><ByteProgress><Total>0</Total><Changed>
    0</Changed><Progress>0</Progress><Failed>0</Failed></ByteProgress><FileProgress><CurrentFile>
    </CurrentFile><Total>0</Total><Changed>0</Changed><Progress>0</Progress><Failed>0</Failed>
    </FileProgress></CBDatasourceStatus><CBDatasourceStatus><JobState>Aborted</JobState>
      :
      :
      :
    </CBJob>
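    The interesting fields in that payload are ErrorCode and DetailedErrorCode; the latter is a 32-bit HRESULT printed as a signed decimal. As a minimal sketch (using a trimmed copy of the XML above, not the full payload), you can extract them and render the HRESULT in the usual hex form:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the StopInfo payload from the event above.
stop_info = """<?xml version="1.0"?>
<CBJob><JobId>060f2660-e41a-413c-a955-b6ed089a2895</JobId>
<JobType>Backup</JobType><JobStatus><JobState>Aborted</JobState>
<ErrorInfo><ErrorCode>120002</ErrorCode>
<DetailedErrorCode>-2147024773</DetailedErrorCode><ErrorParamList/></ErrorInfo>
</JobStatus></CBJob>"""

root = ET.fromstring(stop_info)
code = root.find(".//ErrorInfo/ErrorCode").text
detailed = int(root.find(".//ErrorInfo/DetailedErrorCode").text)
hresult = detailed & 0xFFFFFFFF  # reinterpret the signed value as unsigned
print(code, f"0x{hresult:08X}")  # 120002 0x8007007B
```

    0x8007007B is ERROR_INVALID_NAME ("The filename, directory name, or volume label syntax is incorrect"), which is consistent with the malformed scratch path described below.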

    4. Check the following registry values:

    HKLM\SOFTWARE\Microsoft\Windows Azure Backup\Config
      Value: ScratchLocation

    HKLM\SOFTWARE\Microsoft\Windows Azure Backup\Config\CloudBackupProvider
      Value: ScratchLocation

    In environments where the problem occurs, the path stored in these registry values contains an extra "\".

    Example:  F:\MARSAgent\Scratch\\Scratch
     

    5. If the above check confirms this issue, edit the values to remove the unnecessary "\". Make this change to both of the registry values checked in step 4.

    Example:  F:\MARSAgent\Scratch\Scratch
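    The fix amounts to collapsing any run of backslashes after the drive prefix into a single one. A minimal sketch of that normalization (assuming a local drive path like the example, not a UNC path; the function name is illustrative):

```python
import re

def fix_scratch_location(path: str) -> str:
    """Collapse doubled backslashes in a ScratchLocation value,
    leaving the drive prefix (e.g. 'F:\\') untouched."""
    drive, rest = path[:3], path[3:]
    return drive + re.sub(r"\\{2,}", r"\\", rest)

print(fix_scratch_location(r"F:\MARSAgent\Scratch\\Scratch"))
# F:\MARSAgent\Scratch\Scratch
```

    The function is idempotent, so running it on an already-clean path leaves the value unchanged.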

    6. Restart the Azure Backup agent service so that the registry change takes effect. From an elevated command prompt, run the following commands:

      net stop obengine
      net start obengine

    7. Run the backup again and confirm that it completes successfully.


    [Reference]
    To confirm this condition from the logs, check the detailed behavior in the CBEngine logs.

    Default log path: C:\Program Files\Microsoft Azure Recovery Services Agent\Temp
    The path above varies depending on the agent's installation folder.

    Extract the logs from around the time the issue occurred and check whether entries like the ones below are recorded. Note that timestamps inside the log are recorded in GMT, so add 9 hours to get Japan Standard Time.

    - CBEngine log
    3380 6D6C 02/26 16:00:05.388 18 dsmfsenumerator.cpp(150) [000000001A36E370] 060F2660-E41A-413C-A955-B6ED089A2895 WARNING Failed: Hr: = [0x8007007b] : FindFirstFile failed For Dir:\\?\F:\MARSAgent\Scratch\\Scratch\*
    3380 6D6C 02/26 16:00:05.388 18 fsutils.cpp(2409)  060F2660-E41A-413C-A955-B6ED089A2895 WARNING Failed: Hr: = [0x8007007b] : FindFirstFile failed for Path [\\?\F:\MARSAgent\Scratch\\Scratch\], FileSpec [*]
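    To automate that check, here is a small sketch that pulls the failing path out of a CBEngine log line, flags the stray doubled backslash (ignoring the \\?\ long-path prefix), and shifts the GMT timestamp by +9 hours to JST. The year is an assumption, since these log lines omit it.

```python
import re
from datetime import datetime, timedelta

# One of the CBEngine warning lines shown above.
line = (r"3380 6D6C 02/26 16:00:05.388 18 fsutils.cpp(2409) WARNING "
        r"Failed: Hr: = [0x8007007b] : FindFirstFile failed for "
        r"Path [\\?\F:\MARSAgent\Scratch\\Scratch\], FileSpec [*]")

# Extract the path between "Path [" and the first "]".
path = re.search(r"Path \[(.+?)\]", line).group(1)
body = path[4:] if path.startswith("\\\\?\\") else path  # drop \\?\ prefix
if "\\\\" in body:                                       # literal '\\'
    print("suspect path:", path)

# Log timestamps are GMT; add 9 hours for Japan Standard Time.
ts = datetime.strptime("2016 02/26 16:00:05", "%Y %m/%d %H:%M:%S")
print("JST:", (ts + timedelta(hours=9)).strftime("%m/%d %H:%M:%S"))
# JST: 02/27 01:00:05
```

    The converted JST time, 02/27 01:00:05, lines up with the 2016/02/27 1:00:05 event shown in Event Viewer earlier in this post.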

