
Creating Uninstall Applications (NOT packages) - SCCM 2012


BLUF: Who said creating an application just to run a one-line script couldn't be easy?!

Old Software Cleanup

It is not uncommon to have a lot of old software (usually vulnerable to security risks) on your network.  Some applications do not uninstall older versions when installing newer versions, so cleaning up this software is important.

An uninstall script is usually a one-liner.  Creating Applications in Configuration Manager 2012 for scripts is generally a lot more work than just creating a non-content package to run a one-line script.  Unfortunately, you cannot use the supersedence feature of Configuration Manager Applications with packages.  To use the supersedence feature in Applications, you need at minimum two things: an uninstall string and a detection method.  With MSI-installed applications, one piece of information meets both requirements: the Product ID, or GUID.

Preparations

  • Find current versions of software on the network

  • Create base MSI application (any will do)

  • Create Uninstall applications

Finding the software

How do you know if you have more than one version of software on the network?  I use queries (Monitoring->Queries) and usually return Display Name and Version with a query for Installed Applications.Display Name like %<softwareName>%.  To make this process a little easier, I suggest returning Product ID as well.  This value will be a GUID for MSI-installed applications.  As all of us SCCM administrators know, this value is very important when looking for a way to uninstall an application (msiexec.exe /x {GUID} /qn).
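As a sketch, the underlying WQL for such a query might look like the following; SMS_G_System_ADD_REMOVE_PROGRAMS is a standard Configuration Manager inventory class exposing DisplayName, Version, and ProdID, and the %Java% filter is just an example software name:

select SMS_R_System.Name,
       SMS_G_System_ADD_REMOVE_PROGRAMS.DisplayName,
       SMS_G_System_ADD_REMOVE_PROGRAMS.Version,
       SMS_G_System_ADD_REMOVE_PROGRAMS.ProdID
from SMS_R_System
inner join SMS_G_System_ADD_REMOVE_PROGRAMS
    on SMS_G_System_ADD_REMOVE_PROGRAMS.ResourceID = SMS_R_System.ResourceId
where SMS_G_System_ADD_REMOVE_PROGRAMS.DisplayName like "%Java%"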

I like to have my data in Excel when working with it.  To do this, select one entry from the query results, press Ctrl+A to select all results, press Ctrl+C to copy the data, and press Ctrl+V to paste it into Excel.

Note: If the Product IDs returned are not GUIDs, more research will need to be done to collect the uninstall strings and detection methods for your software.  Uninstall string is usually found in the registry in HKLM:\Software\Microsoft\Windows\CurrentVersion\Uninstall\<ProductID> and HKLM:\Software\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\<ProductID> for x64 systems.  Examples of Detection Methods are file versions or registry settings.
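As a minimal PowerShell sketch for collecting uninstall information from both registry locations on a machine (the '*Java*' filter is an example):

$paths = 'HKLM:\Software\Microsoft\Windows\CurrentVersion\Uninstall\*',
         'HKLM:\Software\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
# List display name, version, and uninstall string for matching products
Get-ItemProperty -Path $paths -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -like '*Java*' } |
    Select-Object DisplayName, DisplayVersion, UninstallString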

Create Base MSI application

As the requirements indicate, the MSI needed for this step is not important.  Once one MSI application is created (feel free to copy an existing one), there are just a few updates that need to be made to it to make it your base Uninstall Application.

Once you have created the MSI application, open the application's Properties window.  Change the name to something like Uninstall Application Template on both the General Information and Application Catalog tabs.  Then open the Properties of the Deployment Type from the Deployment Types tab.  Change the name of the existing DT to Uninstall <application>.  On the Content tab, delete the contents of the Content location.  On the Programs tab, change the Install program to dummy.msi; this is a required field, but we will not be using it.  Select OK twice to close the Properties windows and save the new application.

Create Uninstall Applications

This base MSI application can now be used to create the uninstall applications.  All we really need to change in the copied application is the name (for clarity), the Uninstall program (for the correct action), and the Detection Rule clause (so the installed application can be found and acted on).  I am sure you can change names without instruction, so I will leave that to you.  This is where the spreadsheet comes in nicely; just copy the GUID and get ready to paste it in two to three places.  Now all we have to do is create one uninstall application per old application.  The first place to paste the GUID is the Uninstall program, the second may or may not be the Product code field on the Programs tab (I have seen it work both with and without), and the last is the Detection Rule clause on the Detection Method tab.  Now we are ready to rid ourselves of this old software.
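For example, with a made-up GUID of {12345678-ABCD-1234-ABCD-1234567890AB}, the Uninstall program would read:

msiexec.exe /x {12345678-ABCD-1234-ABCD-1234567890AB} /qn

The same GUID goes into the Product code field (when present) and into the MSI product code Detection Rule clause.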

Setup Applications for Uninstall

On the Properties window of the Application, the last tab is Supersedence.  This tab allows you to select an existing application to be superseded, and there is a checkbox for Uninstall.  Brilliant!  Just enter each of the uninstall applications, select the checkbox, and be done.  You don't even have to look for which computers need the old application removed.  If the newest application targets a machine that has any version of the application installed, it will install the newest version and uninstall all older versions, per the supersedence rules.  I have not verified this, but I believe that if the newest application is already installed and the superseded applications are added afterward, they will still be uninstalled, given the application's install requirements.

Summary

The applications that customers have most often sought to uninstall are Java and Adobe products, as they seem to just hang around; I've even done VLC Player.  This model has proven effective for cleaning up your network.

If you are looking for an even easier way to keep your network clean, App-V 5.0 SP2 brings an easy way to manage these same types of applications.  My previous blog post covers this: http://blogs.technet.com/b/christianwb/archive/2014/03/15/configmgr-2012-app-v-5-0-and-internet-explorer-plug-ins.aspx.


Configuring Relative Dates for Any Scheduled SCOM Report


One of my favorite features of Operations Manager has always been the ability to tap into the data it collects and run reports. It's like being able to travel back in time and examine what was happening on our servers 3, 6, or 12 months ago. To me, it is a type of Business Intelligence, and just like BI data, it should be leveraged for the benefit of the organization generating it. One of the ways reports are most useful is through the scheduling functionality provided by SQL Reporting Services. OM reports can be scheduled to run on a regular basis (daily, weekly, monthly, quarterly) with convenient delivery right to your mail inbox or a file share, in pretty much any file format you would want.

A major challenge for Operations Manager administrators when scheduling reports, however, is that some of the reports which come with Operations Manager out of the box, or which ship with management packs, are not configured to use relative dates but rather use fixed dates when they are executed through the Administration Console. The ACS reports, as well as the Exchange 2010 reports are prime examples of this problem:

The above example shows the Unsuccessful Logon Attempts report from the Operations Manager Audit Reports (ACS) library. Note how the Start Date and End Date are configured as specific dates. Although this is convenient when running the report manually, because it just "runs" immediately for a two day period without having to configure a date range or other parameters, it presents a problem when the report is scheduled.

If you schedule the report above, after having run it, and configure it to run say on a weekly basis on Friday afternoons, the report which is generated on Friday afternoons will ALWAYS have the same starting date (March 12, 2014) and ending date (March 14, 2014). Always getting the same report, with the same data, for the same window of time, week after week does not make for a very useful report.

The problem is that when scheduling the report, the scheduling wizard does not give us a way to pick relative dates for the execution of the report. By relative, we mean dates relative to the time when the report is executed:

 

Notice that the Start Date, and End Date settings above are hard-wired to specific dates, which completely ruins the scheduling feature. Compare that with a report which does allow relative dates, like the Most Common Alerts report from the Microsoft Generic Report Library:

 

Notice that in this report, the From and To date fields are not hard-wired, and instead allow for the use of "Friday previous week." When the report is run, as scheduled, it figures out what that date corresponds to and adjusts it in the report execution accordingly.

 

So, does that mean that there is no hope for reports in the Audit Reports (ACS) library or other reports like the Exchange 2010 reports? Fortunately the answer is no. We can modify these reports to provide us with the relative date functionality. To do so, we must first open the reports using SQL Reporting Service's Report Builder. You can access Report Builder from the SQL Reporting Services web page by using the Report Builder button at the top:

 

I am using Report Builder 3.0 from SQL Server 2008 R2, but the functionality should be the same in other versions:

 

Once Report Builder launches, find and open the report you want to modify by clicking the Open option and browsing to the report you wish to schedule in the Open Report dialog box.

 

 

After clicking the Open button, the report opens up in the design view of Report Builder, and it should look something like this:

 

The first thing we should do is to save a copy of the report, so we are working on the copy and not altering the original. Click the circle in the top left corner of the Report Builder window, and select Save As. Save the report with a new name, perhaps by adding some text to the end of the report file name indicating the report window. For example, if I intend to run this report for the last 7 days using relative dates, I might save my report with the name: "Access_Violation_-_Unsuccessful_Logon_Attempts_RelativeDates_Last_7_Days". With the report copy saved, we can now feel free to alter it, without fear of losing the original report, which might still be useful for manual execution in the future.

Now, note that the report data is configured using the Report Data pane on the left. If you expand the Datasets tree node, you will see the data set items which are being requested from the Operations Manager database, either the Data Warehouse or ACS databases. This particular report gets data from the ACS database, about the user, IP address, computer, logon process, authentication package, etc. The Parameters folder, meanwhile, contains the report parameters which are entered by the user at run-time. Here is where our problem is, and what we need to correct.

Start by right-clicking the StartDate item under the Parameters folder and selecting Parameter Properties. Click the Default Values tab on the right, which shows the default values used for this parameter when the report is run.

Click the "fx" button on the right side which will bring up the Expression dialog box. Here we can see how the default value for the StartDate Parameter is calculated. The formula highlighted at the top obtains Today's date, and uses the AddDays() function to add -1 days, in other words, move the date backward one day, starting today.

You should also explore the functions in the function category control at the bottom left and get familiar with the Date & Time functions. Of specific interest for a future step is the DateAdd function highlighted in the middle list control above. This function can be used to add or subtract time from a given date, and its beauty is that it can work with days, weeks, months, years, you name it. There is an excellent reference for this function, with a complete list of parameters and some sample uses, on TechNet: http://technet.microsoft.com/en-us/library/ms186819.aspx. For now, simply document the expression used for the StartDate parameter, and do the same for the EndDate parameter.

Armed with this knowledge, let's alter the report so that it no longer takes input from a user. Start by right-clicking on the "dataSet" item in the Datasets folder, and then click on the Parameters tab. Here we see that the dataSet gets its values for StartDate and EndDate from the Parameters we explored above.

We do not want to get the dates from the Parameters, so we will remove the link between the two. To do so, click on the "fx" button next to the StartDate, and we get an Expression dialog box similar to the one we saw before in the Parameter Properties for StartDate, but here the expression formula is different: it links the dataSet's configuration to the parameters the user configures at execution. We are going to remove that link and instead hard-wire the report to run for a fixed period of time relative to the current date. For this example we will configure the report to run for the period of the LAST 7 days prior to today. We can use the DateAdd() function to add or subtract days as I mentioned previously, and the Today() function to obtain the current date. Finally, we can nest one function inside the other, so that today's date becomes a parameter for the execution of the DateAdd() function, like this:
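The screenshot is missing here; for a last-7-days window, nesting Today() inside DateAdd() gives an expression along these lines:

=DateAdd(DateInterval.Day, -7, Today())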

I won't get into every scheduling variation, but one other useful one is for when you want to run the report for the past month. In that scenario the value would be:
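That is, something like:

=DateAdd(DateInterval.Month, -1, Today())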

Click OK to complete the configuration of the StartDate parameter, and then configure the expression for the EndDate as follows:
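Since our example window ends today, the EndDate expression can simply be:

=Today()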

 

After doing so, the Parameters tab of the Dataset properties looks like this:

 

Now we can delete the two parameters of the report, StartDate and EndDate, under the Parameters folder. Since we hard-wired the StartDate and EndDate values, the parameters are not needed. Simply right-click on each one, and select the Delete option:

 

 

Save the report, and then run it. If you schedule it from the SCOM admin console, you will notice that it does not prompt for StartDate and EndDate parameters any longer, and now when scheduled will use the relative dates you configured.

Office 365 ProPlus Deployment Options


Q: (from Dan)

One quick question for you as we're preparing to roll out Office ProPlus from 365 E3. I believe you indicated it's possible to download a "traditional" MSI or other kind of installer rather than do the streaming install to every machine in the customer environment. Can you point us in the right direction?

 

A:

I did a Bing search on “local Office 365 deployment” and found the following information:

http://technet.microsoft.com/en-us/library/gg998766.aspx

 

Steve

How to Customize Server Manager in Windows Server 2012 and 2012 R2 – Get Creative!


Hello everyone, Hilde here, greeting you from the Seattle area, where I have the honor of attending our advanced internal technical training event.

In this post, I will cover a lesser-known but cool customization feature, offering a fresh perspective on making a server administrator's job easier and more efficient.

If you have used Windows Server 2012 or 2012 R2, you have surely noticed that Server Manager is very different from earlier versions. But did you know you can edit the Tools menu in Server Manager?

Yes, you can. The Tools drop-down menu is simply a UI over the shortcuts in the Administrative Tools folder (C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Administrative Tools\).

Consider what this means:

  • You can add/remove the tools shown here
  • You can create folders here to organize the list more effectively
  • You can create/copy/rename shortcuts here
  • You can add shortcuts for third-party applications/tools
  • You can run PowerShell scripts as "tools" from here

A typical default Tools list

Let's customize this to serve my needs better

On the target system, open the following location:

C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Administrative Tools\

Within this folder, you can:

  • Create subfolders for better organization, and create your own tool shortcuts (or copies) inside them
  • In the screenshot below, I created a folder named '- IT – Admin Tools'
  • That name places the folder at the top of the Tools list
  • Create new shortcuts or copies of shortcuts, and name them to control their order
  • A few caveats:
      • If you want to change built-in shortcuts, I recommend copying them and editing the copies, because a modified or missing tool shortcut can prevent a role/feature from being removed.
      • Installing a service pack or hotfix may rewrite/overwrite your modified defaults.
  • In the example below, I made copies of existing shortcuts, saved them into my folder, and renamed them to match my naming standard.
  • Again, I chose these names to control how the tools are ordered/organized.
  • Create a PowerShell "tool" as a shortcut to a PowerShell script:
      • Create the script and store it in your usual scripts directory
      • Create a shortcut to the script and edit the shortcut's properties (see below)
      • Save the modified shortcut into the custom subfolder created above

 

  • Edit the shortcut's "Target" property
  • The edits in the example below accomplish the following:
      • Bypass the local PowerShell execution policy (leaving the system-wide setting untouched while still allowing the script to run)
      • Keep the console open after the script runs (so you can actually see the code's results/output)

  

%systemroot%\System32\WindowsPowerShell\v1.0\powershell.exe -executionpolicy bypass -noexit -file "C:\Users\administrator.CHILD\Documents\POSH - dedup.ps1"
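If you would rather script the shortcut creation itself, a small sketch like this (the folder and script paths are examples) does it via the WScript.Shell COM object:

$shell = New-Object -ComObject WScript.Shell
# Create the .lnk file in the custom Admin Tools subfolder
$lnk = $shell.CreateShortcut("C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Administrative Tools\- IT - Admin Tools\- IT - POSH Dedup Status Utility.lnk")
$lnk.TargetPath = "$env:SystemRoot\System32\WindowsPowerShell\v1.0\powershell.exe"
$lnk.Arguments  = '-executionpolicy bypass -noexit -file "C:\Scripts\POSH - dedup.ps1"'
$lnk.Save()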

 

Now, the customized IT – Admin Tools list

 

With this layout, I can find my tools easily (they sit at the top). When I click "- IT – POSH Dedup Status Utility", this window pops up on my screen and stays there until I close it:

  • By the way, can you believe the space savings from deduplicating the server's virtual machine and ISO drives? (See the screenshot below.)


 

If I need to manage a storage array, I can run the "– IT – HP Array Configuration Utility" from the Tools folder:


 

More options:

  • Create a toolbar on the taskbar that points to your new Tools folder – if toolbars are your thing

  • Create and name an app group on the Start screen and pin your new shortcuts there – if the Start screen is your thing

And there you have it.

Beyond this great flexibility, I am also hoping the Win+X menu (right-click the Start button) gets this kind of customization someday. We'll see!

For a "classic overview" of Server Manager in 2012, see my previously published post.

The Server Manager blog post discusses some of the core points raised in this article; read it for more insight/information.

Now get creative and share your ideas for making server administration more efficient in the comments. Help us all "race toward a better tomorrow."

Calling a WCF dynamically using a BizUnit Test Step


Introduction

Using BizUnit to implement complex test cases is a common practice that we have all relied on for a while now. This very useful library of test steps makes it easy to write test cases, maintain them, and execute them. One common challenge is how to call web services, or more specifically WCF services, from a test step. There has been an out-of-the-box test step to call SOAP 1.1 web services or WCF services based on BasicHttpBinding, and another custom one (here) to call a WCF service with a common strongly typed contract. Of course, there is always the good old HTTP step to invoke everything yourself and complicate your life.
In this post I will show you a custom test step that I have written to call a WCF service dynamically, without knowing anything about its contract. I am basing the code below on the out-of-the-box web service test step already in BizUnit 4.0.

Solution

So, like I said, I wanted to call any WCF service with any binding and any contract, without limitations. I also wanted the flexibility to configure the service as I want using the app.config file.
So I extracted the out-of-the-box web test step and started looking into how to customize it. Here is what I did. I started by adding a couple of properties for my test step, as below:
Property – Description
DataLoaderBase RequestBody – The standard request body loader of the web service call test step.
string ServiceUrl – The URL of the WCF service you want to call.
string Username – The username to add to the client credentials if the security mode is set to username; otherwise, do not specify this value.
string Password – The password to add to the client credentials if the security mode is set to username; otherwise, do not specify this value.
string BindingTypeName – The full type name of the binding to use when calling the WCF service.
MessageVersion MsgVersion – The request message version to use when calling the WCF service.
string BindingConfigurationName – The name of the binding configuration in the app.config file.
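For illustration, the binding configuration that BindingConfigurationName points to could look like this in app.config (a sketch; the name matches the WSHttpBinding_IService1 example used later, and the settings shown are placeholders):

<system.serviceModel>
  <bindings>
    <wsHttpBinding>
      <!-- Binding configuration loaded by name via the binding's constructor -->
      <binding name="WSHttpBinding_IService1" sendTimeout="00:02:00">
        <security mode="None" />
      </binding>
    </wsHttpBinding>
  </bindings>
</system.serviceModel>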
 
Then I validated my test step as per the below method:
if (string.IsNullOrEmpty(ServiceUrl))
{
    throw new StepValidationException("ServiceUrl may not be null or empty", this);
}

if (string.IsNullOrEmpty(Action))
{
    throw new StepValidationException("Action may not be null or empty", this);
}

if (string.IsNullOrEmpty(BindingTypeName))
{
    throw new StepValidationException("Binding type name may not be null or empty", this);
}

if (string.IsNullOrEmpty(BindingConfigurationName))
{
    throw new StepValidationException("Binding configuration name may not be null or empty", this);
}

RequestBody.Validate(context);
 
And then I started working on the Execute method. The first thing I wanted was to create the binding; I used reflection to do this, together with the binding configuration in the configuration file to customize the binding as I want.
Type bindingType = Type.GetType(BindingTypeName);
Binding binding = (Binding)Activator.CreateInstance(bindingType, BindingConfigurationName);
 
Then once I have the binding I created the address as below:
var epa = new EndpointAddress(new Uri(serviceUrl));
 
I also created a dummy WCF service contract so that it would be a generic contract for any WCF service as below:
///<summary>
/// A dummy WCF interface that will be manipulated by the CallWebMethod above
///</summary>
[ServiceContract]
interface genericContract
{
    [OperationContract(Action = "*", ReplyAction = "*")]
    Message Invoke(Message msg);
}
 
Then I created the ChannelFactory using the EndpointAddress and the Binding created above as below:
cf = new ChannelFactory<genericContract>(binding, epa);
 
One final note: I used the message version property to control which message version my WCF service uses when I create the request message, as below:
request = Message.CreateMessage(MsgVersion, action, r);
 
The remaining code is standard with no changes. Then I started to call my service as below:
var testCase = new TestCase();
 
var wcftststep = new WcfGenericRequestResponseStep();
wcftststep.ServiceUrl = "http://localhost:16987/Service1.svc";
wcftststep.Action = "http://tempuri.org/IService1/GetData";
wcftststep.BindingConfigurationName = "WSHttpBinding_IService1";
wcftststep.BindingTypeName = typeof(System.ServiceModel.WSHttpBinding).AssemblyQualifiedName;
wcftststep.FailOnError = true;
wcftststep.RunConcurrently = false;
wcftststep.MsgVersion = System.ServiceModel.Channels.MessageVersion.Soap12WSAddressing10;
wcftststep.RequestBody = new FileDataLoader()
{
    FilePath = @"SampleInput.xml"
};
var xmlvalstep = new XmlValidationStep();
xmlvalstep.XmlSchemas.Add(new SchemaDefinition()
    {
        XmlSchemaPath = @"OutputSchema.xsd",
        XmlSchemaNameSpace = @"http://tempuri.org/"
    });
xmlvalstep.XPathValidations.Add(new BizUnit.TestSteps.Common.XPathDefinition()
    {
        XPath = "/*[local-name()='GetDataResponse' and namespace-uri()='http://tempuri.org/']/*[local-name()='GetDataResult' and namespace-uri()='http://tempuri.org/']",
        Value = "You entered: 0"
    });
 
wcftststep.SubSteps.Add(xmlvalstep);
 
testCase.ExecutionSteps.Add(wcftststep);
 
var bu = new BizUnit.BizUnit(testCase);
bu.RunTest();
 
The complete code for the WcfGenericRequestResponseStep is listed below:
using BizUnit;
using BizUnit.TestSteps.Common;
using BizUnit.TestSteps.Soap;
using BizUnit.Xaml;
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.IO;
using System.Linq;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.Text;
using System.Threading.Tasks;
using System.Xml;
 
namespace BizUnitTester
{
    public class WcfGenericRequestResponseStep : TestStepBase
    {
        private Stream _request;
        private Stream _response;
        private Collection<SoapHeader> _soapHeaders = new Collection<SoapHeader>();
 
        public DataLoaderBase RequestBody { get; set; }
        public string ServiceUrl { get; set; }
        public string Action { get; set; }
        public string Username { get; set; }
        public string Password { get; set; }
        public string BindingTypeName { get; set; }
        public MessageVersion MsgVersion { get; set; }
        public string BindingConfigurationName { get; set; }
 
        public WcfGenericRequestResponseStep()
        {
            SubSteps = new Collection<SubStepBase>();
        }
 
        public Collection<SoapHeader> SoapHeaders
        {
            set
            {
                _soapHeaders = value;
            }
            get
            {
                return _soapHeaders;
            }
        }
 
        public override void Execute(Context context)
        {
            _request = RequestBody.Load(context);
 
            context.LogXmlData("Request", _request, true);
 
            _response = CallWebMethod(
                _request,
                ServiceUrl,
                Action,
                Username,
                Password,
                context);
 
            Stream responseForPostProcessing = _response;
            foreach (var subStep in SubSteps)
            {
                responseForPostProcessing = subStep.Execute(responseForPostProcessing, context);
            }
        }
 
        public override void Validate(Context context)
        {
            if (string.IsNullOrEmpty(ServiceUrl))
            {
                throw new StepValidationException("ServiceUrl may not be null or empty", this);
            }

            if (string.IsNullOrEmpty(Action))
            {
                throw new StepValidationException("Action may not be null or empty", this);
            }

            if (string.IsNullOrEmpty(BindingTypeName))
            {
                throw new StepValidationException("Binding type name may not be null or empty", this);
            }

            if (string.IsNullOrEmpty(BindingConfigurationName))
            {
                throw new StepValidationException("Binding configuration name may not be null or empty", this);
            }
 
            RequestBody.Validate(context);
        }
 
        private Stream CallWebMethod(
            Stream requestData,
            string serviceUrl,
            string action,
            string username,
            string password,
            Context ctx )
        {
            try
            {
                Stream responseData;
                Type bindingType = Type.GetType(BindingTypeName);
                Binding binding = (Binding)Activator.CreateInstance(bindingType, BindingConfigurationName);
 
                var epa = new EndpointAddress(new Uri(serviceUrl));
 
                ChannelFactory<genericContract> cf = null;
                genericContract channel;
                Message request;
                Message response;
                string responseString;
 
                try
                {
                    cf = new ChannelFactory<genericContract>(binding, epa);
                    if (!string.IsNullOrWhiteSpace(username) && !string.IsNullOrWhiteSpace(password))
                    {
                        cf.Credentials.UserName.UserName = username;
                        cf.Credentials.UserName.Password = password;
                    }
                   
                    cf.Open();
                    channel = cf.CreateChannel();
                    using (new OperationContextScope((IContextChannel)channel))
                    {
                        XmlReader r = new XmlTextReader(requestData);
 
                        request = Message.CreateMessage(MsgVersion, action, r);
 
                        foreach (var header in _soapHeaders)
                        {
                            MessageHeader messageHeader = MessageHeader.CreateHeader(header.HeaderName, header.HeaderNameSpace, header.HeaderInstance);
                            OperationContext.Current.OutgoingMessageHeaders.Add(messageHeader);
                        }
                       
                        response = channel.Invoke(request);
 
                        string responseStr = response.GetReaderAtBodyContents().ReadOuterXml();
                        ctx.LogXmlData("Response", responseStr);
                        responseData = StreamHelper.LoadMemoryStream(responseStr);
                    }
                    request.Close();
                    response.Close();
                    cf.Close();
                }
                catch (CommunicationException ce)
                {
                    ctx.LogException(ce);
                    if (cf != null)
                    {
                        cf.Abort();
                    }
                    throw;
                }
                catch (TimeoutException te)
                {
                    ctx.LogException(te);
                    if (cf != null)
                    {
                        cf.Abort();
                    }
                    throw;
                }
                catch (Exception e)
                {
                    ctx.LogException(e);
                    if (cf != null)
                    {
                        cf.Abort();
                    }
                    throw;
                }
 
                return responseData;
            }
            catch (Exception ex)
            {
                ctx.LogException(ex);
                throw;
            }
        }
 
        ///<summary>
        /// A dummy WCF interface that will be manipulated by the CallWebMethod above
        ///</summary>
        [ServiceContract]
        interface genericContract
        {
            [OperationContract(Action = "*", ReplyAction = "*")]
            Message Invoke(Message msg);
        }
    }
}
 

Environmental Insight That Helps the Bottom Line


We’ve talked a lot about data—consumer data, to be specific—and how transforming it into insight could mean savings, profit, or both for your business. But according to the Computational Ecology and Environmental Science Group (CEES), a part of Microsoft’s Cambridge Research team that we got to meet this month as part of TechFest, ecological data will be just as big of a business opportunity as consumer data. The predictive models these researchers are creating can be applied to many other kinds of business, from Coca-Cola’s manufacturing process to protecting resources.

When physics meets machine learning

CEES combines the knowledge of ecologists and environmental scientists with the skills of software engineers to create predictive models of our environment. Given historical data, the team can create simulation models to show how the natural world responds to various occurrences in the environment.

"Ecological data will be just as big of a business opportunity as consumer data"

Data like this can be used to visualize how the distribution of carbon will change in polar versus equatorial regions or to model changes in Earth’s marine biomass over the next decade. Although the association may not be immediately apparent, with insights such as these, major corporations such as Coca-Cola, whose bottling plants depend on local water sources, can act to protect resources.  Coca-Cola can use this data to instantly and independently conduct local source vulnerability assessments to inventory risks to the water sources supplying their facilities and the surrounding communities.

Another example would be the construction of data centers: before the first blueprint is ever drawn, enterprises can consider the availability and cost of electric power in addition to the likelihood of disruptive weather events in a particular location. This way, the company will know what to expect in the years to come (e.g., the probability and frequency of server crashes).  The same techniques can also be used to understand past events. For instance, to understand why a major server crashed, analysts can use data to reconstruct the circumstances of the event.

Data deficient

When it comes to ecological data helping the bottom line, the challenge isn’t technology—it’s a lack of data. Our data resources are limited because ecologists (and scientists in general) aren’t necessarily the most technology savvy. Whether data is shared in PDFs or stored in an Excel sheet on a researcher’s computer, it remains either locked in a difficult-to-use format or siloed and inaccessible.

Microsoft’s Cambridge Research team is working to compile this data and has already made big strides—from reaching out to scientists and researchers to organizing data from all sorts of public sources. Even major corporations such as Coca-Cola realize what an important asset this data is; as part of its water stewardship effort, the company is “donating data to speed water-risk mapping.” Its goal is to create “an open, transparent and publicly available database that provides geographical and sector-specific water risk context to companies, investors, governments and others.”

Until now, the push to organize ecological data in a structured way has been somewhat of a grassroots effort led (for the most part) by teams such as CEES, nonprofits, and other disparate groups of scientists and researchers. Enterprises are now realizing the potential value of these insights. Beyond the enterprise, how could this insight help your business? Beyond your business, how could it affect your life as a homeowner or human on this planet?

Windows 8.1 Deployment: From the ground up.

There has been a lot of talk about the end of Windows XP, the end of support for XP that is… As I write this post there are only a few days left of Windows XP support. What is end of support? (The following is taken word for word from here.) After 12 years, support for Windows XP will end on April 8, 2014. There will be no more security updates or technical support for the Windows XP operating system. It is very important that customers and partners migrate to a modern operating system such as Windows...(read more)

Team up with millions in first ever community challenge for ‘World of Tanks: Xbox 360 Edition’


Roll out your tank skills and work with millions of other players around the world to unlock new maps in Map Madness – the first ever community challenge for “World of Tanks: Xbox 360 Edition.”

Xbox Live Gold members can download this free-to-play title and join this massive event to unlock new battlefields beginning Tuesday and ending March 24, at 11:59 p.m. PDT.

Read more about Map Madness on Xbox Wire. Good luck!


Athima Chansanchai
Microsoft News Center Staff


Blog Intro and purpose


As a person with experience and background as an IT Professional, I realize my blog might at first glance seem to have an interesting title: IT Pro's becoming Agile.

'Agile' is a term I have borrowed from the software development methodology that promotes development, teamwork, collaboration, and process adaptability throughout the life cycle of a project.

Just as Developers are embracing the evolution of Application Lifecycle Management (ALM), with the Business requiring and expecting faster iterations of an application, IT Professionals are in exactly the same boat for Infrastructure.

This blog is intended to assist and guide IT Professionals towards becoming Agile while leveraging the licensed products they have at their disposal, with a focus on the platform and the management of that platform. I want the reader to have a take-away from each article: how they can use the information to develop the technology or their skills around it, to work as a team and collaborate with Developers and the Business, and to adapt to each change on the horizon.

The Business expects movement and turnaround from IT departments in using services such as Cloud solutions, whether Private, Public or Hybrid, and at the same time expects to see a return on its investment for the particular licensing agreement(s) it has with Microsoft. This could be in the form of maximising product usage, Software Assurance benefits and the like…

The Business benefits:
Each post in my blog will have this section within it, allowing an IT Professional to engage with their relevant Business. This content can be used by an IT Professional to explain/negotiate/sell or advise the Business of the benefits that will be seen by deploying the relevant technology and its feature set within their organisation.

Why have this section you ask?…

The number one problem I have seen in almost every organization I have consulted to is that IT and Business are unable to speak the same language: IT struggles to explain the business value of deploying various products and their feature sets, while Business struggles to translate its business roadmap to IT. All of this results in IT remaining the traditional cost centre it has always been!

Along with this, many IT Pros and Developers are not fully aware of what products they own under, or the benefits that can be derived from, the Microsoft license agreement(s) that Business has signed. On the other side of the divide, most Businesses are not fully aware of what “IT” has deployed in their environment, keeping them in the dark about the real value in their license agreement.

Amongst the how-to articles on this blog, there will also be articles explaining and guiding IT Professionals on how to leverage many of the Solution Accelerators and license agreement benefits provided by Microsoft.

The first series of posts will focus on Developer/Operations, commonly known as DevOps, and for a change these articles will be written from the IT Pro's point of view. Just as Developers are adapting to faster release cycles for their applications or application updates, IT Professionals need to be able to adapt at a faster pace: not only to the changing and challenging IT landscape that Cloud in its various forms is bringing, but also because we can play a key role within Application Lifecycle Management.

The IT landscape IS changing and moving forward at a rapid pace; however, this is a prime opportunity for IT Professionals to embrace these movements and build up to showing REAL value back to the Business for all the investment it has made.

To ensure this blog stays relevant and timely, I encourage you as a reader to give feedback on the articles posted here, as well as on any IT infrastructure changes you are currently going through, thus ensuring we all move towards becoming as Agile as possible…

Morgan
@morgan_msft

25,000 Linux and Unix Servers Hijacked in Operation “Windigo”


Specialists at the security vendor Eset, working with CERT-Bund, the European Organization for Nuclear Research (CERN) and other national organizations, have uncovered a large-scale cybercriminal operation. The criminals apparently took covert control of tens of thousands of web servers running Linux and Unix.

Up to 25,000 Linux/Unix systems may be affected by the attack, which Eset has dubbed “Windigo”; together they are capable of sending up to 35 million spam e-mails per day. Infected systems must be rebuilt with a fresh installation of the operating system, and all passwords and private OpenSSH keys should be replaced.

The hackers apparently also used the hijacked web servers to infect Windows computers with malware and to display advertising for dating websites on Apple computers. Apple iPhones were apparently redirected to sex sites so the hackers could earn money from them.

Eset has published a detailed technical analysis of “Operation Windigo” as a PDF document. The experts believe the criminal campaign has been running for more than two and a half years and went undetected until now. The antivirus specialists estimate that every day more than half a million computers are at risk of being infected with malware after visiting a compromised server.

For this, Windigo apparently uses a whole bundle of powerful malware components, such as Linux/Ebury, Linux/Cdorked, Perl/Calfbot, Linux/Onimiki, Win32/Glubteba.M and Win32/Boaxxe.G.

In a single weekend, Eset says it found more than 1.1 million distinct IP addresses passing through Windigo's infrastructure before being redirected to servers hosting exploit kits. The affected machines ran operating systems reaching back as far as Windows 95 and 98, one expert explained.

According to Eset, webmasters can check their machines for a Windigo infection by entering the following command line:

$ ssh -G 2>&1 | grep -e illegal -e unknown > /dev/null && echo "System clean" || echo "System infected"

Guest post by Michael Kranawetter, Chief Security Advisor (CSA) at Microsoft Germany. On his own blog, Michael publishes everything worth knowing about vulnerabilities in Microsoft products and the software updates released for them.

Running Windows Azure VMs with Static IP Addresses


Hi,

it's finally here. Where a workaround had to suffice before, this is now possible out of the box.

Windows Azure VMs can be run with a static IP address. :-)

HOW?

Four PowerShell cmdlets are the key:

  • Get-AzureStaticVNetIP
  • Set-AzureStaticVNetIP
  • Remove-AzureStaticVNetIP
  • Test-AzureStaticVNetIP

Documentation - http://msdn.microsoft.com/en-us/library/windowsazure/dn630228.aspx

The new cmdlets were published on March 12, 2014 in update 0.7.3: PowerShell cmdlets for Windows Azure version 0.7.3

-- check whether the IP address is still free in the network:

$vnetname = "MyBackendVNetWestEurope"
Test-AzureStaticVNetIP -VNetName $vnetname -IPAddress 192.168.4.7

-- give a new VM a subnet and a static IP address

$vm1 = New-AzureVMConfig -Name $vmname -ImageName $img -InstanceSize Small;

Set-AzureSubnet -VM $vm1 -SubnetNames $sub;

Set-AzureStaticVNetIP -IPAddress 192.168.4.7 -VM $vm1;

New-AzureVM -ServiceName $vmsvc1 -VMs $vm1 -AffinityGroup "WestEuropeAG";

and something especially nice: assigning a new static IP to an existing Azure VM

Get-AzureVM -ServiceName "MeinCloudService" -Name "MeineAzureVM" |
Set-AzureStaticVNetIP -IPAddress 192.168.4.7 |
Update-AzureVM

Further changes are documented here, including what is contained in version 0.7.4:

https://github.com/WindowsAzure/azure-sdk-tools/releases

VM extension cmdlets

  • Set-AzureVMExtension
  • Get-AzureVMExtension
  • Remove-AzureVMExtension
  • Set-AzureVMAccessExtension
  • Get-AzureVMAccessExtension
  • Remove-AzureVMAccessExtension
  • Multi-thread support in storage cmdlets
  • Add YARN support via -Yarn parameter on Add-AzureHDInsightConfigValues

All the details for 0.7.3:

Web Site cmdlets

  • Slot
    • All Web Site cmdlets takes a new -Slot parameter
    • Switch-AzureWebsiteSlot to swap slots
  • WebJob
    • Get-AzureWebsiteJob
    • New-AzureWebsiteJob
    • Remove-AzureWebsiteJob
    • Start-AzureWebsiteJob
    • Stop-AzureWebsiteJob
    • Get-AzureWebsiteJobHistory
  • Publish project to Web Site via WebDeploy
    • Publish-AzureWebsiteProject
  • Test Web Site name availability
    • Test-AzureName -Website
  • Virtual Machine cmdlets
    • Generic extension
      • Get-AzureVMAvailableExtension
      • Get-AzureServiceAvailableExtension
    • BGInfo extension
      • Get-AzureVMBGInfoExtension
      • Set-AzureVMBGInfoExtension
      • Remove-AzureVMBGInfoExtension
    • VM role size
      • Get-AzureRoleSize
      • New-AzureQuickVM -InstanceSize takes a string instead of enum
    • Other improvements
      • Add-AzureProvisioningConfig will enable guest agent by default. Use -DisableGuestAgent to disable it
  • Cloud Service cmdlets
    • Generic extension
      • Get-AzureServiceExtension
      • Set-AzureServiceExtension
      • Remove-AzureServiceExtension
    • Active directory domain extension
      • Get-AzureServiceADDomainExtension
      • Set-AzureServiceADDomainExtension
      • Remove-AzureServiceADDomainExtension
      • New-AzureServiceADDomainExtensionConfig
  • Virtual Network cmdlets
    • Get-AzureStaticVNetIP
    • Set-AzureStaticVNetIP
    • Remove-AzureStaticVNetIP
    • Test-AzureStaticVNetIP
  • Storage cmdlets
    • Metrics and logging
      • Get-AzureStorageServiceLoggingProperty
      • Set-AzureStorageServiceLoggingProperty
      • Get-AzureStorageServiceMetricsProperty
      • Set-AzureStorageServiceMetricsProperty
    • Timeout configuration via -ServerTimeoutRequest and -ClientTimeoutRequest parameters
    • Paging support via -MaxCount and -ContinuationToken parameters
      • Get-AzureStorageBlob
      • Get-AzureStorageContainer
  • ExpressRoute cmdlets (in ExpressRoute module)
    • Get-AzureDedicatedCircuit
    • Get-AzureDedicatedCircuitLink
    • Get-AzureDedicatedCircuitServiceProvider
    • New-AzureDedicatedCircuit
    • New-AzureDedicatedCircuitLink
    • Remove-AzureDedicatedCircuit
    • Remove-AzureDedicatedCircuitLink
    • Get-AzureBGPPeering
    • New-AzureBGPPeering
    • Remove-AzureBGPPeering
    • Set-AzureBGPPeering

Best regards,

Patrick

Bing brings Machu Picchu to you



Tuesday’s Bing home page took us thousands of miles – and centuries – away with a looping video of fog rolling over the Machu Picchu ruins in Peru.

The Inca built this mountain city, considered one of the Seven Wonders of the World, around 1450. Its construction appeals to visitors as much as its location: no mortar or modern tools were used to create this masterpiece of craftsmanship. It is slowly being restored.

After occupying it for a century, the Inca abandoned Machu Picchu when Spanish conquistadors arrived in what is now Peru. In 1911, American explorer Hiram Bingham discovered this hidden city and revealed it to the world.

Check out the Bing home page and explore Machu Picchu at your own pace, without ever having to take any of the arduous steps up.


Athima Chansanchai
Microsoft News Center Staff

A jolly good landing for ‘Titanfall’ as it hits the UK


“Titanfall” for Xbox One hit the United Kingdom Friday, and players celebrated its arrival in style. Outside and inside a warehouse in London, “pilots” were in force, jumping, leaping and running in true Titan fashion.

Check out the video above, or on Xbox Wire, for a look at some of the action fans got to see.

You might also be interested in:

· “Metal Gear Solid V: Ground Zeroes” lands on Xbox One and Xbox 360
· Friends’ notifications, game save progress bar and more will be added to Xbox One in April
· Xbox One coming to 26 more countries in September

Suzanne Choney
Microsoft News Center Staff

Microsoft Partnership Helps Teachers Teach with Technology: By Curt Kolcun, Vice President, U.S. Public Sector, Microsoft Corp.

Over the weekend, I had the great honor of joining Ron Thorpe, President of the National Board for Professional Teaching Standards (National Board), at their annual conference to announce an important partnership that will provide technology training to National Board Certified Teachers. As technology continues to transform learning, teacher training is a critical element in ensuring technology tools are implemented in an effective and impactful way. This is an issue that has been front and center...(read more)

Marlee Matlin: ‘Glad Microsoft has invested so much into making the Xbox accessible’



In a guest post on the Microsoft Accessibility Blog, Academy Award-winning actress Marlee Matlin writes about how she learned Xbox One could be used for watching closed-captioned movies and television shows, connecting with friends over Skype, and surfing the Web. Until then, she thought that her four children would be using it mostly for marathon “FIFA 14” and “Madden NFL” games.

She uses Xbox One to watch favorite new shows such as “Orange is the New Black” and “Girls” with closed captioning through Netflix and HBO. Matlin lobbied Congress nearly 30 years ago to help bring closed captioning to television.

And now, she writes, “Everyone should be able to watch TV shows and movies on their phones, tablets and computers, and that means including closed captions.”

She’s looking forward to finding out more about the console.

“I may not have used all of Xbox’s accessibility tools yet, but trust me I’ll be learning more about Skype and Kinect,” she writes. “After fighting for closed captioned access for three decades, it’s my responsibility to get ahead of the technology curve.”

Read the rest of her guest post on the Microsoft Accessibility Blog.


Athima Chansanchai
Microsoft News Center Staff


School Pitch Helps Drive Innovative Ideas to Transform Learning

By Akhtar Badshah, senior director, Citizenship and Public Affairs At the Microsoft in Education Global Forum last week, I had the honor to celebrate some of the best and brightest educators and school leaders from around the world. It was inspiring to be amongst them listening, sharing ideas, learning from one another and meeting some of their young students. These educators are taking action within their schools and communities to ensure our youth in countries from Tunisia to Mexico and Singapore...(read more)

App-V 5: On Using Sequencing Templates

$
0
0

Sequencing Template (.APPVT) files are designed to automate the sequencing of applications. While you can take advantage of some of the benefits of templates with manual, interactive sequencing, be careful about making assumptions when sequencing after importing a template into the Sequencer GUI. Sequencing templates are also essential for upgrading packages.

Remember this from the App-V Sequencing Guide:

“Templates are also very important for upgrade scenarios.  The Sequencer does not save state so when a new Sequencer session is open and a package is opened for upgrade, the settings are in the default state.  If certain sequencer settings were changed when sequencing a package, the changes will not remain at time of upgrade.  Therefore, it is recommended to save a template for any package that has Sequencer customizations, and re-apply them on upgrade.  A template may also contain additional options such as Package Deployment Settings and Advanced Monitoring Options.”

Creating a Sequencing Template

Creating a sequencing template is pretty straightforward. You launch the App-V Sequencer and first set your advanced options for sequencing. You do this by going to the “Tools” menu and selecting “Options.”

All of the General Items and Exclusion Items can be adjusted using this dialog box. All of these settings will be saved into the template.

If you plan on using only these settings in your template, you can go ahead and save the template via the “File” menu (“Save as Template”). However, if you want to include additional settings (for automated sequencing with PowerShell), instead of saving the template now, proceed through the process of creating a blank dummy package. Make sure you click through to the advanced options so you can configure:

  • Operating System Options
  • Advanced Interaction

 

 

Once you have all of these settings the way you want them, you can proceed to save the template. Notice that you will get a specific alert when doing so.

 

While the message implies that the additional settings (OS, COM, objects) will not be saved in the template, you will find that they are, in fact, saved. The actual effect behind this message is that any settings other than General Options and Exclusion Items will NOT be imported when you import the template into the Sequencer GUI to sequence a new package.

 

 

All of the settings will, however, be used if the template is used in conjunction with the New-AppvSequencerPackage PowerShell cmdlet, which supports all of the template items. The use of PowerShell with templates opens the door to many possibilities for automating the sequencing of your packages. Here is an example:
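The example screenshot is not reproduced here; a sketch of such a call might look like the following (all names and paths are hypothetical):

# Sequence an installer automatically, applying the saved template's settings
New-AppvSequencerPackage -Name "7-Zip" `
    -Installer "C:\Installers\7z920-x64.exe" `
    -PrimaryVirtualApplicationDirectory "C:\Program Files\7-Zip" `
    -Path "C:\Packages\7-Zip" `
    -TemplateFilePath "C:\Templates\MyDefaults.appvt"

Because the template travels with the command, the OS, COM, and advanced interaction settings saved in it are honored, unlike a GUI import.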

 

Once the package has been created, you can verify the configuration that was applied by examining the information in the App-V manifests.

 

Happy Automation!!

The Microsoft Channel: Technology’s Melting Pot


by Jenni Flinders, Vice President, Microsoft US Partner Group

America has long been considered the world’s melting pot, blending various cultures, nationalities and ethnicities together. This diversity has long defined the “American” culture, with unique flavors working together to create a wonderful new dish. Similarly, my vision for the channel is as technology’s melting pot – where diverse backgrounds, skills and experiences blend together to create one enhanced workforce. Much like you can pick out the hints of curry, bay leaves, and onion in beef curry soup, this melting pot highlights the unique flavors of each employee while delivering a more complete solution.

Diversity, in all of its forms, has always been an important subject to me, and it’s why I’m happy to announce we will be launching a diversity focus study through our People for Innovation group on the US Partner Community Yammer network. I’m excited about the conversations I’ve been having with partners around diversity, both one-on-one and at our WPC diversity events, and I’m eager to have more.

Through the coming weeks, questions will be posted to the Yammer group to engage the US community of partners and Microsoft professionals in discussions about diversity. Mixed in with these questions, we’ll release the findings and encourage dialogue on them.

The goal of this study, and the larger goal of the People for Innovation Yammer group, is to extend the traction we achieve each year at our diversity event during WPC. Every year I’ve seen the participation and attendance of this gathering grow, and these discussions will build upon the exciting plan we have for promoting diversity at WPC this summer. Our event is still being planned, and we’ll use the Yammer group to announce the details and invite you to join us.

Together, we can enhance the flavor and returns of our channel melting pot!

Connect with me on the People for Innovation Yammer group and join the conversation. Instructions to join the US Partner Community Yammer network are here.

Managing a Distribution Group membership from Outlook in Exchange 2010 and Exchange 2013

In Exchange 2003 and Exchange 2007, to let a user manage distribution groups from Outlook, we simply set him as the distribution group owner, and he is then able to add and remove users from the DG as needed. Starting with Exchange 2010 we no longer leverage ACLs; we introduced RBAC (for more detail see my previous post). Now, if we act as we used to, meaning we assign a user as a distribution group owner and then try to modify the DG membership from Outlook, this is...(read more)

SQL Server 2014 RTM today - Available 1st April 2014 (USA date)


SQL Server 2014, the foundation of Microsoft’s cloud-first data platform, is released to manufacturing and will be generally available on April 1st 2014 (US date). SQL Server 2014 provides organisations with a data platform:

  • Delivering breakthrough performance with In-Memory OLTP
  • Enabling greater availability and data protection with new hybrid scenarios
  • Providing more customer value for mission-critical applications and hybrid cloud

For more information, see the data platform team's blog post - http://blogs.technet.com/b/dataplatforminsider/archive/2014/03/18/sql-server-2014-releases-april-1.aspx

Sign up for the SQL Server 2014 release, available April 1

For those who want access to the release as soon as possible, please sign up to be notified once the release is available.
