Channel: TechNet Blogs

Develop Windows Universal Apps with Azure AD and the Windows 10 Identity API

Howdy folks, It's time for a new installment in our Azure AD and Windows 10 series! In earlier posts you have learned how Windows 10 offers deep Azure AD integration, both on enterprise owned and personal devices. Windows 10 makes it possible to sign in your Windows 10 device directly with your work account in Azure AD, or add your work account to your personal Windows 10 device (to which you signed in with your personal Microsoft account). In both cases, this allows you to access Azure AD...(read more)

PowerTip: Find PowerShell Help Content


Summary: Learn how to find all Windows PowerShell Help content.

Hey, Scripting Guy! Question: I updated the Help files on my installation of Windows PowerShell. I can find the cmdlet Help, but how can I see the Help text files?

Hey, Scripting Guy! Answer: Use the Get-Help cmdlet, search for the HelpFile category, select all of them with a wildcard character, and then pipe the output to more:

Get-Help –category helpfile * | more
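To page through one of the topics this returns in full, you can pass its name back to Get-Help; for example, with the built-in about_Operators topic:

```powershell
# List every conceptual (about_*) Help file, then read one of them
Get-Help -Category HelpFile *
Get-Help about_Operators | more
```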

Imagine Cup Alumni Enter the Big Leagues [Part 2]

Let's continue with the conversations we had with the Imagine Cup teams who decided to follow their dreams and make their idea a reality; these are the remaining teams. Team Chemicalium, Imagine Cup 2014 World Finals, People's Choice Award. Members: Amra Buljubasic, Samir Supcic, Ema Begulic, Hamza Sabljakovic, and Mohamed El-Zayat. Description: An edutainment application designed to help students enjoy learning difficult subjects that are traditionally taught. This...(read more)

The Microsoft System Center 2012 R2 Solution Guidebook Is Here!

A comprehensive solution guidebook for Microsoft System Center 2012 R2 was finished just yesterday evening, so I am posting it here on 꼬알라의 하얀집 first. The guidebook introduces System Center 2012 R2 and concisely summarizes solutions built on SCOM, SCCM, SCVMM, SCDPM, SCO, and SCSM 2012 R2, explaining the benefits customers can gain from these technologies. It closes with Microsoft's technical recommendations for customer requirements, including case studies built on those recommendations. Seeing is believing! You can download the file via the download link below. It will also be officially available for download from Microsoft.com soon, and a printed offline edition of the guidebook is planned, so you should be able to find it at various events....(read more)

Configure State to SCOM Group


You create a group in the SCOM console called "My Group" with computer objects as members.

In Discovered Inventory under the Monitoring tab, you can see that the group does not have a State. That is because a group does not inherit the monitors from its members.

To configure a state that depends on member status (useful in dashboards), you need to create a dependency monitor.

  

Configure the dependency monitor on Group membership (choose the second option).

As you can see, the group now gets its state from its members.
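Once the dependency monitor is in place, the rolled-up state can also be checked from the Operations Manager Shell. A minimal sketch, assuming a group named "My Group" and the standard OperationsManager module (method names as per the SCOM SDK):

```powershell
Import-Module OperationsManager

# Fetch the group and show its rolled-up health state
$group = Get-SCOMGroup -DisplayName "My Group"
$group.HealthState

# Compare with the health of the individual member objects
$group.GetRelatedMonitoringObjects() |
    Select-Object DisplayName, HealthState
```

Without a dependency monitor the group typically shows an Uninitialized state; with one, it rolls up from its members.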

Coworking & Travel: Searching for an Answer to How Well You Can Really Work in a Borderless Europe – A #CoworkingEU Guest Post Series


Tobias Schwarz (Isarmatrose) and Kati Kremkau (Ostseegoere) want to test this summer how well you can really work in a borderless Europe. For two months they are going on an #OutofOffice trip under the hashtag #CoworkingEU, with Office 365 and the Surface 3, asking why we tend to work from the Starbucks at Jungfernstieg or the Sankt Oberholz at Rosenthaler Platz, rather than sending a tweet from Montmartre in Paris, blogging on St Mark's Square in Venice, or building a presentation in the shadow of the Sagrada Familia. Their first stops were Dresden and Leipzig. They report on their experiences for us here on the blog every week.



Report #1
- What is coworking, anyway?

The year before last, my grandfather, now 80 years old, visited me in Berlin. I showed him my workplace in a coworking space. He listened to what I do here, who else works in this room, and what it is like to work in a coworking space. Then he looked around once more in silence, walked over to my empty desk, and explained to me in a single sentence that this is the spot on the table where he would put his telephone. In that moment I became fully aware of how enormous the internet's impact on the working world is.

A new world of work

I have my telephone with me at all times; it fits in my pocket and connects me to the whole world. Almost all of my work as head of the blog Netzpiloten.de can be done on my smartphone. I do the rest on my notebook, which I put back in my bag when the work is done, leaving an empty desk behind. In my job as a blogger, today's technology makes it possible to work almost without borders. Still, I spend most of my time in a Berlin coworking space.

This summer I am changing that, trying not only to test my own options but also to find out how mobile work can be in a borderless Europe. Together with my girlfriend, who normally works from home for a startup and a Berlin state politician, I will be travelling through Europe for eight weeks this summer. During this time we will do our work in coworking spaces.

Coworking is not always coworking

Our first two stops, in Dresden and Leipzig, were a good reminder of what coworking is. In both cities we found one example of what coworking can be, and one of what merely calls itself coworking. For both of us, coworking is the chance to meet other people, exchange ideas with them, and see where a collaboration, or even just a conversation, can lead. It is the appeal of serendipity: finding what you were not even looking for.


The Dresden coworking space Cloudsters and the Raumstation in Leipzig are two places that, in our opinion, give the term coworking real meaning. Here we found a creative and above all communicative working environment, in which conversations with strangers proved immensely valuable. We received advice we did not even know we should ask for. Just having been here, having done our daily work, and having taken time for a question, a conversation, and lunch together made the visit worthwhile.

Lunch break at the Raumstation

With CoFab and Studio Delta we also visited two spaces that likewise fly the coworking flag but, in our opinion, are not coworking. Rather, they are quite traditional shared offices, where coworking is first and foremost a business model, not a philosophy. Cloudsters and the Raumstation also live off renting out free desks, but there the focus is on community, while in the other two spaces it is on service. For some people that may be enough; from time to time I too need nothing more than a quiet, well-kept desk, but for me that is not coworking.

The journey continues…

We have since arrived in Barcelona and have already visited four coworking spaces in the city. Tomorrow we move on to Toulouse. We report on our journey on Twitter and Instagram, and you can read our reports from the various coworking spaces on the blog coworking-and-travel.eu. Next week I will report here on the Microsoft blog about our experiences in Spain. Adiós!

 

 

A guest post by Katharina Kremkau and Tobias Schwarz
www.coworking-and-travel.eu

- - -

About the authors


Kati, the Ostseegoere, works as a freelance social media all-rounder for a Berlin eLearning startup, an environmental politician, and the Kiezsauna, probably the hottest place in Friedrichshain. She is from Rostock and studied political science in Berlin. What she really loves about the internet are lists on Foursquare and Yelp of places where she wants to eat delicious food.



Tobias runs the online magazine Netzpiloten.de, which has been exploring the internet since 1998. Before that he worked for Tumblr, McKinsey, and the Greens. He has an "always on" mentality, but knows what else matters in life: an ice-cold beer (he prefers a Russ) and time outdoors with Kati.

Interview with a Wiki Ninja, Author, Speaker, and C# Expert: Pooja Baraskar


Welcome to Monday - Interview with a Wiki Ninja!

 

This week we're interviewing Pooja Baraskar!


Pooja has contributed 8 Wiki articles and is an author, speaker, and developer! Here's an example article:

Silver Award Winner

 

Pooja Baraskar: Conditional Compilation in Windows Universal Apps

Carmelo La Monica: "Great article, the conditional compile is very useful, especially in Universal Apps. Congratulations for your article."
JH: "Conditional compilation becomes more important in UWAs. This article shows a simple example of how you can work with it."

Some classics:

Localization in Windows Universal Apps

Azure: Creating Virtual Machines

winRT apps: Implementing Screen Reading

 

Let's begin the interview!

 

Who are you, where are you, and what do you do? What are your specialty technologies?

Hello everyone, my name is Pooja Baraskar and I am from Betul, a small town in India. I work with a startup called Geek Monkey Studios, which is focused on innovation. I love working with C#; I use it to develop Windows apps and games. I am currently focused mostly on the Internet of Things. I am also a winner of a Microsoft IoT competition and was given an award at the Azure Conference in Pune. Creating devices with Intel boards and accessing them via Azure is something that excites me. I am also an Intel Software Innovator on the IoT and RealSense side. You can learn more about me on my personal blog.

  

What are your big projects right now?

I am working on an IoT-based product, and my goal for now is to take it to market. We have also integrated the Intel RealSense 3D camera with the Intel Galileo board; with that, and by implementing the "I" part with Azure and Windows UAP, we can do many more amazing things. Working on the Internet of Things is like turning your imagination into reality. It gives you wings and the power to do anything, to make a big change in this world. I am also working on a book on the Internet of Things, and there is a lot more to come from me on TechNet Wiki.

 

What is TechNet Wiki for? Who is it for?

TechNet Wiki is for everyone who wants to learn, whether a student or an expert; every time you visit, you will find something new and useful. It is a place to share and learn. The people who really love Microsoft technologies are here.

What do you do with TechNet Wiki, and how does that fit into the rest of your job?

This is one of the most trusted sources on Microsoft technologies. I learn from the great people here, and it helps me keep my knowledge up to date. I write about the things I am good at, and the experts here help me get better at them. Sometimes my articles act as a repository for me: there are things I did in the past and no longer remember, and then I refer back to them. The feature I like best is that experts update my articles and correct my mistakes.

  

What are your favorite Wiki articles you’ve contributed?

My favorite article that I have written is my first article on TechNet, about localization in Windows RT. For it I received my first award on TechNet Wiki, along with some valuable feedback, which motivated me to contribute more. I will soon update it or write another version based on Windows UAP.

Localization in Windows Universal Apps

   

What could we do differently on TechNet Wiki?

You guys are doing a great job; a big thanks for that. There are some authors, like Carmelo La Monica, who are writing well on the Internet of Things. IoT is an emerging technology and there are very few resources on it. I think we should promote and motivate more authors so that we can create a vast repository that people can refer to. I also see that articles written on IoT fall under the Miscellaneous category. It would be good if we could create a specific category for them; this would encourage people to write more on Windows IoT and IoT with Azure.

 

Who has impressed you in the Wiki community, and why? 

I am impressed by everyone here. People give their valuable time and share their knowledge to build an awesome community. There is something to learn from every article, no matter whether it is big or small. They are always useful.

 

Do you have any tips for new Wiki authors? 

For new authors, I suggest you keep contributing. Follow the general TechNet Wiki guidelines. It is always good to attach some working code so that learners can play around with it. Do not worry about mistakes; the community is here to guide you, and with its suggestions you will improve day by day. Take each and every comment positively and work on yourself. Love the community and the community will love you.

    

 

Please join me in thanking Pooja for all her community contributions!

   - Ninja Ed

Colors and Line Styles in Visio


Line styles, just like visual effects, can make a diagram clearer. Let's look at a few line types and patterns that can be used for shape borders and connectors.

...(read more)

Arduino Due and Mobile Service


In the previous article, Arduino Due: Development, we gave an introduction to this board, starting with the installation of the necessary libraries and finishing with the setup of the Ethernet shield, which lets the Arduino Due access the Internet. In this second article, we will see how the Arduino can interact with Microsoft Azure services, specifically with the Mobile Service. We will start by creating a Mobile Service, then build a small circuit using the DHT11 temperature and humidity sensor, and finish with the piece of code responsible for transferring the information detected by the sensor to Microsoft Azure. We will cover, in order:

  • Create a Mobile Service.
  • Build the electronic circuit.
  • Write the code to upload data from the DHT11 sensor to Microsoft Azure.
  • Test the application.
  • Conclusion.

Navigate to this article http://social.technet.microsoft.com/wiki/contents/articles/31660.arduino-due-and-mobile-service.aspx

Happy reading to all ;)

5 Years of Blogging!



The TechNet UK Blog is 5 years old today and boy it's been a good 5 years getting to know you all. In this post we want to share with you some of your favourite blog posts and also let you know what our 6 TechNet Editors are up to now. 

From the very first post the editors have had a great time working with our writers to come up with new and interesting articles and competitions. Here are just a few of our highlights: 

The Fun Articles 

Celebrating our SysAdmins

On the last Friday of July every year we like to celebrate and show our appreciation for the great work our SysAdmins do for us here at Microsoft and in every company. We sure do love our gifs (and cake): 

Competitions

We have had some fantastic competitions over the years, here are two of the highlights:

Popular Blog Posts

Over the years we have published a vast array of articles and it's hard to single out just a few but here are the articles that you guys seemed to love the most: 

“Don’t focus on having a great blog. Focus on producing a blog that’s great for your readers.” (Brian Clark, 2012) - this quote really encompasses how we've aspired to create a blog that you can go to and find useful content to help you be successful in your jobs. However, we know we're not perfect and are always looking to improve. With this in mind as a birthday present to us we would love to hear about what we have done well and most importantly where we can improve so the next 5 years sees you continue to receive content that you really want to read! Let us know your thoughts in the comments section and we will get our editors to work. 

Editors of the Past and Present

Over the past 5 years we have had 6 TechNet Editors who have done a fantastic job with the TechNet UK Blog in their own special way. Here's what they're up to now:

 Rachel Collier

 

Editor from launch until September 2011

In my last post I said that I was leaving my role 'to deploy and manage a baby'; the project has been running for four years and now has a little brother. These days I answer to them. I've had more reasonable managers! My favourite thing about being editor was definitely getting the blog started in the first place. There was often news we wanted to share between editions of the TechNet newsletter, or stuff for which we needed more space: competitions, interviews, events, etc. I loved the competitions we ran. It's fair to say the prizes varied, from a Windows phone to a TechNet mug, but the responses we had always got us giggling (though they were occasionally unprintable). As an editor it's great to hear back from the folks you write for.

 Sarah Lamb

Editor from November 2011 - June 2013

After working at Microsoft, I went self-employed and started my own social media and comms consultancy. I still love social media and comms, and I learnt a lot at Microsoft that I now use to benefit startups like digi.me. There are many highlights and it's really hard to choose just one! Giving the winner of the 12 Days of Geekmas their Micro Server in person at their office was definitely a highlight, as was working with the team on TechDays Online. My favourite articles were the interviews with various people, including Jason Zander and Stephen Rose, and the guest articles we had from the MVPs; they were always fascinating and had plenty of technical depth to them.

 Alexander Guy

Interim Editor from September 2011 – November 2011

After a great couple of months standing in as TechNet Editor, I completed my internship with the TechNet Team and returned to University to finish my degree. I am now back in the team, managing our Social Media communities and campaigns. I really enjoyed how as Editor of the blog, you can immediately see the fruits of your labour. Having worked with a contributor or composed a blog yourself, from the minute you hit ‘publish’ you can watch the view count climb and start to gauge how interested IT Pros are in what you’re saying. Be it positive or not so, the feedback was always very enlightening! My favourite article was the first piece that I researched and wrote myself from scratch to mark the 900-day milestone in the run-up to Windows XP End of Support. Using cold hard evidence to articulate to our readers why they should look to upgrade their venerable friend, as well as some interesting comparisons to technologies which had come and gone since XP’s launch, was particularly fun to write. Furthermore, it helped me to understand what End of Support means from an IT Pro’s perspective, really getting a feel for why some folks were so reluctant to move.

Steven Mullaghan  

Editor from July 2013 to July 2014

Seems like only yesterday I was keeper of content across the blog and newsletter channels. I have recently graduated from Newcastle Business School with a First Class BA (Hons) in Marketing Management and am delighted to be joining Microsoft again as a Graduate in September, working as a Partner Sales Executive within SMS&P in Reading. Needless to say, I'll always have a soft spot for the TechNet team, and no doubt I'll find a way to pop by when I can. My favourite thing about being editor was curating the content that mattered to our technical audience. Not to be cliché, but what I enjoyed the most was being out and about, on the road at events, gaining insights from you on what's best to publish, and then working with our awesome team of content creators to make that happen. My favourite article was the TechDays 2013 wrap-up, mostly because of the memories I associate with it: the hard work, the team collaboration, the Steve B interview, the Countdown conundrum and... my Xbox onesie cameo. A top event all round, well summed up in that article (I'm getting all nostalgic). It's been great to grace the blog pages once more; hopefully I'll bump into a few familiar faces when I'm back.

Charlotte Utting

Editor from July 2014 to July 2015

Being TechNet Editor for the past year has been a fantastic experience and I have many highlights. Firstly, I must say it has been great getting to know you all and building up a network of great writers who have written lots of fab articles for the TechNet UK Blog over the years; we really can't thank you enough for those! One of my highlights was definitely leading the Learning at Work Week campaign back in May, championed by the Campaign for Learning organisation, which aims to put the spotlight on the importance and benefits of learning and development within the workplace. We saw this as a fantastic initiative to encourage our IT Pros and Developers to use the week as an opportunity to develop their learning, which is important in order to stay up to date with the ever-evolving technological landscape. It was a lot of fun sourcing and writing the blog posts highlighting the campaign, as well as many of the great online learning resources for our readers to ingest. Another highlight was getting the chance to be involved with the two TechDays Online events and to work amongst the production team throughout; we had a lot of fun producing the events and we hope you enjoyed them too. Life after being the TechNet Editor sees me heading back to Nottingham Trent University shortly to finish my Information Systems degree. Hopefully this is not a goodbye but more of a see you later.


Editor from July 2015 to Present 

I may only have been editor for a month, but I can already tell that it is the people who make the blog what it is. Over the next year I hope to meet as many of you as possible. I want to get to know the writers: they are experts in their field and I intend to learn as much from them as I can. I also wish to meet as many readers as possible; I have already met a few of you at events and really enjoyed speaking to you. SysAdmin Day has been by far the highlight of my work for the blog thus far. We got to commission a cake, present it to our SysAdmins (which made them happy) and then, most importantly, we got to eat the cake! Easily the best day yet! I loved writing the article that we published on the blog for the day; it was a bit of fun and different to anything else I have written so far. I used my previous experience as a SysAdmin and some funny GIFs to create an article that I hoped would make people laugh. Thankfully no one has told me it is rubbish, so I take it to be a success!

Thank you for everything so far and bring on the next 5 years! #HappyBlogging

TNWiki Article Spotlight - Power BI, Text Analytics and the United States Congress


Hello and welcome everybody to our TNWiki Article Spotlight on Tuesday.

Today I want to introduce you to an article written by Seth Moupre. He saw a demo of an interactive word cloud based on votes in the UK parliament, and the idea was to do something similar with Power BI and the United States Congress. In Power BI, Text Analytics and the United States Congress he shows how to do text analytics with Power BI. Seth goes step by step through how he started his analysis, where he gets his data from, and how he models the data. At the end of the article you have a working example.

If you want to see the simplicity and capabilities of Power BI this article is the best point to get started.

- German Ninja Jan (Twitter, Blog, Profile)

State Tiles Widget + PowerShell Widget


Dashboard options for Windows Operating System, Logical Disks and SQL DB Engine

 


I closely follow the Wei out there blog series as it is a great source of information on building dynamic dashboards.

A few weeks ago he posted an interesting article for showing operating system summary dashboards and one of my customers was keen to tweak this for other classes and also to integrate the dashboard with the State Tiles widget.

I spent some time looking at the code that Wei provides and found there were 4 key sections for customisation.

1. The lines where we select the classes that we want to display on the dashboard:

$class = Get-SCOMClass -Name Microsoft.Windows.Server.OperatingSystem
$serverOSes = Get-SCOMClassInstance -Class $class

2. The lines where we display the Discovery Data

#Get values of the Logical Processors and Physical Memory properties:
$properties = @('LogicalProcessors', 'PhysicalMemory')
$properties | % {
    $prop = $serverOS."[Microsoft.Windows.OperatingSystem].$($_)"
    AddColumnValue $dataObject $prop.Type.DisplayName $prop.Value
}

3. The lines where we display the Performance Data

#Get % Processor Time stat
if ($perfRule.CounterName -eq "% Processor Time") {
    $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
    AddColumnValue $dataObject $perfRule.CounterName $data.Average
}

#Get Processor Queue Length stat
if ($perfRule.CounterName -eq "Processor Queue Length") {
    $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
    AddColumnValue $dataObject $perfRule.CounterName $data.Average
}

#Get % Memory Used stat
if ($perfRule.CounterName -eq "PercentMemoryUsed") {
    $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
    AddColumnValue $dataObject $perfRule.CounterName $data.Average
}

- Note that GetMonitoringPerformanceData in $perfRules = $serverOS.GetMonitoringPerformanceData() retrieves data recorded in UTC. You may need to add some extra lines to the script to handle that, as some local times may be a future date and no data will be returned. Wei handled it like this:

#Last 16 hours
$aggregationInterval = 16
$dt = New-TimeSpan -Hours $aggregationInterval
$now = Get-Date
$from = $now.Subtract($dt)

 

4. The lines where we do unit conversions, e.g. from MB to GB (this is actually towards the start of the script, but we need to identify the discovery data and performance data before we configure the unit conversions).

$unitReplacements = @{
    "Physical Memory (MB)" = @{ "name" = "Total RAM (GB)"; "coeff" = 1048576 };
}
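As a rough illustration of how such a replacement table is consumed, the column is renamed and the raw value divided by the coefficient. The helper below is hypothetical, a sketch of the idea rather than Wei's actual code:

```powershell
# Hypothetical helper: apply a $unitReplacements entry to one column value
function Convert-ColumnValue($name, $value, $unitReplacements) {
    if ($unitReplacements.ContainsKey($name)) {
        $r = $unitReplacements[$name]
        # Divide by the coefficient and use the friendlier column name
        return @{ Name = $r["name"]; Value = [math]::Round($value / $r["coeff"], 2) }
    }
    return @{ Name = $name; Value = $value }
}

$unitReplacements = @{
    "Physical Memory (MB)" = @{ "name" = "Total RAM (GB)"; "coeff" = 1048576 }
}
Convert-ColumnValue "Physical Memory (MB)" 16777216 $unitReplacements
# With coeff 1048576 this yields Total RAM (GB) = 16
```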

So if we want a dashboard for Logical Disks or SQL DB Engines, how do we go about tweaking the above code? It is actually quite straightforward. We'll start with Logical Disks:


1. Change the Class

$class = get-scomclass -Name Microsoft.Windows.Server.LogicalDisk
$serverOSes = Get-SCOMClassInstance -class $class

2. The lines where we select the discovery data:

#Get the value of the Logical Disk Size property
$properties = @('SizeNumeric')
$properties | % {
    $prop = $serverOS."[Microsoft.Windows.Server.LogicalDisk].$($_)"
    AddColumnValue $dataObject $prop.Type.DisplayName $prop.Value
}

3. The lines where we select the performance counters

#Get Free Megabytes stat
if ($perfRule.CounterName -eq "Free Megabytes") {
    $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
    AddColumnValue $dataObject $perfRule.CounterName $data.Average
}

#Get Current Disk Queue Length stat
if ($perfRule.CounterName -eq "Current Disk Queue Length") {
    $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
    AddColumnValue $dataObject $perfRule.CounterName $data.Average
}

#Get Avg. Disk sec/Transfer stat
if ($perfRule.CounterName -eq "Avg. Disk sec/Transfer") {
    $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
    AddColumnValue $dataObject $perfRule.CounterName $data.Average
}

#Get % Free Space stat
if ($perfRule.CounterName -eq "% Free Space") {
    $data = $perfRule.GetValues($from, $now) | % { $_.SampleValue } | Measure-Object -Average
    AddColumnValue $dataObject $perfRule.CounterName $data.Average
}

How do we know which counters to select? In the SCOM console, go to Authoring, Management Pack Objects, Rules, scope to Windows Server 2012 or 2008 Logical Disk, and look through the performance collection rules. To find the counter name:

  • Double click the Collection Rule
  • On the Configuration Tab, click View (next to Data sources):

  • This shows us the Counter Name that we need:

Just repeat this for the performance counters that you want to display.
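As an alternative to clicking through the console, the same API the dashboard script already uses can list which counters an object actually records; a quick sketch for one logical disk:

```powershell
# List the distinct performance counters recorded for a single logical disk
$class = Get-SCOMClass -Name Microsoft.Windows.Server.LogicalDisk
$disk  = Get-SCOMClassInstance -Class $class | Select-Object -First 1

$disk.GetMonitoringPerformanceData() |
    Select-Object -Property ObjectName, CounterName -Unique
```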

4. Let's not forget that we want to change MB to GB:

$unitReplacements = @{
    "Size (MBytes) (Numeric)" = @{ "name" = "Size (GB)"; "coeff" = 1024 };
    "Free Megabytes" = @{ "name" = "Free GB"; "coeff" = 1024 };
}

When you put it all together you get this. And when you plug it into the PowerShell Grid Widget you'll see the following:

You can follow the same process for SQL Server DB Engine.

When you put it all together you get this. And when you plug it into the PowerShell Grid Widget you'll see the following:

And you can even create a state view widget based on a group of Operating System objects and then make the dashboard contextual. When you select a state widget at the top, the dashboard updates with details of the servers in that group. Script

I'll be starting a new series of blog posts on Visual Studio Authoring and will use it to show how we can make the groups that I use in the State Tiles widget dynamic, so you can get Windows Operating System objects for your SQL Server DB Engines, and Windows Operating System objects for your Exchange servers.

Windows 10 Cheat Sheet Compilation!


Hello all, I hope your summer is going well! As you know, Windows 10 is here and has hit general availability! With that in mind, I wanted to put out a few quick things that may help get the gears turning for testing and evaluating Windows 10.
 

  • Windows 10 Build 10240 is the RTM build!

  • EMET 5.2 is not supported with Windows 10; it will cause Internet Explorer not to start. The next version of EMET will be compatible with Windows 10.

  • Windows 10 is a FREE upgrade on consumer PCs; reserve and get your copy today. If you don't see the icon in your taskbar tray, install KB3035583, or use the Media Creation Tool to upgrade to Windows 10 now.

    After you download the Media Creation Tool and launch it, there will be an option to upgrade your PC now:


  • Windows 10 ADK is available here

  • Windows 10 activation via KMS requires hotfix KB3058168 to be installed on the KMS host server, and the KMS host key for Windows 10 to be installed.

  • There will be an updated Microsoft Deployment Toolkit (MDT) available in August.

  • The Microsoft BitLocker Administration and Monitoring (MBAM) 2.5 client is supported on Windows 10. MBAM 2.5 SP1 will add features for Windows 10, including customizing the recovery screen, a new PowerShell v2-compatible script for enabling BitLocker during a task sequence in MDT or SCCM (or standalone), and a few new WMI methods, as well as some other features.

  • The current version of UE-V is not supported on Windows 10; it will be supported in the next version of UE-V.

  • Windows 10 and the KMS host keys will be available for Enterprise customers on the VLSC on 01AUG.

  • Create a Windows 10 Reference Image
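For the KMS activation bullet above, the commands involved are the standard slmgr.vbs ones. This is only a sketch: the host key itself comes from the VLSC and is not shown, and kmshost.contoso.com is a placeholder name.

```powershell
# On the KMS host (after installing hotfix KB3058168):
# install the Windows 10 KMS host key from the VLSC, then activate it
cscript //nologo C:\Windows\System32\slmgr.vbs /ipk <KMS-host-key-from-VLSC>
cscript //nologo C:\Windows\System32\slmgr.vbs /ato

# On a Windows 10 client: point at the KMS host (if not using DNS
# auto-discovery), activate, and show detailed license status
cscript //nologo C:\Windows\System32\slmgr.vbs /skms kmshost.contoso.com
cscript //nologo C:\Windows\System32\slmgr.vbs /ato
cscript //nologo C:\Windows\System32\slmgr.vbs /dlv
```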

 

Well I hope you find this information useful for getting started!  We are all very excited about Windows 10!  As we run into any issues, gotchas, or helpful tips we'll post them here.  Do you plan on testing, evaluating, and deploying Windows 10 in your organization?


Jesse Esquivel

Excel for Security Analysts - Episode 2


In today's episode on Excel for Security Analysts, we will import SSH logs to Excel. To read more episodes of Excel for Security Analysts, go here.

The idea for this post came from a TechNet forum thread that was started by a Data Analyst here (thank you, Rajender, for sharing the challenge and the logs).

You start with a log file like this: 

    

And you should end up with a table in Excel:

Looks challenging, doesn't it? Well, with Power Query you can transform this type of data with simple mouse clicks and a few easy Power Query (M) formulas.

Note: As usual, we will use the integrated Power Query experience in Excel 2016 Preview, which is available on the Data ribbon in the Get & Transform group. If you haven't tried Excel 2016 Preview, download it here. If you are still on Excel 2010 or 2013, you can download the Power Query add-in here to perform the same data transformations described in this post. You can also use Power BI Desktop to accomplish all the steps below. Download Power BI Desktop here.

Let's start:

 

Connect Phase

In Excel 2016 Preview, go to the Data ribbon. In the Get & Transform group, click New Query, go to From Other Sources, and click From Web.

In From Web dialog, paste the following URL to the URL box, and click OK:

https://gallery.technet.microsoft.com/SSH-Log-Sample-to-import-b4088fd4/file/140762/1/ssh.txt

In Access Web Content dialog, ensure you are in the Anonymous tab, select the second URL, and click Connect.

The Query Editor will open with a preview of the original log file. You can see that at this stage we have a single column whose cells contain the original lines. It's difficult to work with this kind of data structure.

 

Cleanup Phase

Let's filter out all the lines that are not relevant for us.

Click the filter button on the header of Column1, click Text Filters, and click Contains...

In the Filter Rows dialog (as shown in the screenshot below), enter $ssh and abc in the first and second text boxes. Select the Or radio button, and select begins with in the second drop-down menu.

We will now tune our filter to accept a combination of OR statements, and ensure that all the relevant lines are filtered in.

In Home tab, click Advanced Editor.

We will now modify the Power Query expression (M):

Change the expression

let
    Source = Table.FromColumns({Lines.FromBinary(Web.Contents("https://gallery.technet.microsoft.com/SSH-Log-Sample-to-import-b4088fd4/file/140762/1/ssh.txt"),null,null,1252)}),
    #"Filtered Rows" = Table.SelectRows(Source, each Text.Contains([Column1], "$ssh") or Text.StartsWith([Column1], "abc"))
in
    #"Filtered Rows"

To the following expression:

let
    Source = Table.FromColumns({Lines.FromBinary(Web.Contents("https://gallery.technet.microsoft.com/SSH-Log-Sample-to-import-b4088fd4/file/140762/1/ssh.txt"),null,null,1252)}),
    #"Filtered Rows" = Table.SelectRows(Source, each
        Text.Contains([Column1], "$ssh") or
        Text.StartsWith([Column1], "MAC Address") or
        Text.StartsWith([Column1], "Port") or
        Text.StartsWith([Column1], "PEVLAN") or
        Text.StartsWith([Column1], "TrustFlag") or
        Text.StartsWith([Column1], "Peer IP") or
        Text.StartsWith([Column1], "Aging time") or
        Text.StartsWith([Column1], "TimeStamp"))
in
    #"Filtered Rows"

You can see that we added OR statements to filter the data by lines that start with MAC Address, Port, PEVLAN, etc. These lines contain the multiline records that we want to transform into a tabular format.

Click Done in the Advanced Editor dialog.
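If you'd rather reason about the filter in code, here is a small Python sketch (an illustration only, not part of the original walkthrough; the noise line is invented) of the same OR logic:

```python
# A hypothetical Python equivalent of the OR filter in the M expression:
# keep lines that contain "$ssh" or start with one of the record prefixes.
PREFIXES = ("MAC Address", "Port", "PEVLAN", "TrustFlag",
            "Peer IP", "Aging time", "TimeStamp")

def keep(line):
    # str.startswith accepts a tuple of prefixes, mirroring the OR chain.
    return "$ssh" in line or line.startswith(PREFIXES)

sample = [
    "[u0039853@sam ~]$ssh 10.35.9.15",
    "Total items displayed = 7",   # hypothetical noise line, filtered out
    "MAC Address: 0000-5e00-01a8     VLAN/VSI/SI: MOB-RAN-1158",
]

filtered = [line for line in sample if keep(line)]
print(len(filtered))  # → 2
```

The tuple form of `startswith` keeps the filter short as the prefix list grows.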

 

Extracting SSH Server IP Addresses

In Add Column tab, click Add Custom Column.

Paste the following formula to the Custom column formula box, and click OK.

if Text.Contains([Column1], "$ssh") then [Column1] else null


This formula will copy the cells from Column1 to a new column if the cell contains the text $ssh.

Select column Custom (by clicking on its header). In the Home tab, click Split Column, then click By Delimiter.

In the Split by Delimiter dialog, select Custom as the delimiter, enter "$ssh ", and click OK.

The previous step will split Custom into two columns: Custom.1 and Custom.2.

Select Custom.1 and delete it (by pressing the Delete key).

Rename Custom.2 to IP (by double-clicking the header and editing the header name).

Right click on column IP, click Fill, and click Down.

With the last step, we turned the IP addresses from occasional log lines, such as the one below, into a column that associates each IP address with its relevant log data.

[u0039853@sam ~]$ssh 10.35.9.15

We can now filter out the rows that contain $ssh in Column1.

Click the filter button on the header of Column1, select Text Filters, and click Does Not Contain...

In the Filter Rows dialog, enter $ssh, and click OK.
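To see the same idea outside the Query Editor, here is a hypothetical Python sketch (with abbreviated sample lines) of the split at "$ssh " followed by the fill-down:

```python
# Hypothetical sketch of the IP extraction: split each "$ssh" line at the
# "$ssh " delimiter, then "fill down" that IP onto the data lines below it.
lines = [
    "[u0039853@sam ~]$ssh 10.35.9.15",
    "MAC Address: 0000-5e00-01a8     VLAN/VSI/SI: MOB-RAN-1158",
    "Port       : Eth-Trunk11.1198   Type       : dynamic",
]

ip = None
data_lines = []
for line in lines:
    if "$ssh" in line:
        # Keep only the text after "$ssh " (the server IP).
        ip = line.split("$ssh ", 1)[1].strip()
    else:
        # Fill down: every data line inherits the last seen IP,
        # and the "$ssh" line itself is dropped.
        data_lines.append((ip, line))

print(data_lines[0][0])  # → 10.35.9.15
```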

Obtain unique IDs for Multiline Records

The next phase of our transformation will help us to associate multiple rows into a single record. We will use the technique that I shared with you here.

Assuming that each record is in the multiline structure shown below, we would like to assign a unique ID to each record. We will assume that every multiline record starts with the prefix MAC Address.

MAC Address: 0000-5e00-01a8     VLAN/VSI/SI   : MOB-RAN-1158                  
Port       : Eth-Trunk11.1198    Type          : dynamic                       
PEVLAN     : 1                  CEVLAN        : 1198                          
TrustFlag  : 0                  TrustPort     : -                             
Peer IP    : -                  VC-ID         : -                             
Aging time : 300                LSP/MAC_Tunnel: 1/16407                       
TimeStamp  : 7918170       

In Add Column tab, click Add Index Column.

Now let's create a new column with the index values from the rows that start with MAC Address.

In the Add Custom Column dialog, enter the following formula, and click OK.

if Text.StartsWith([Column1], "MAC Address") then [Index] else null

Select column Custom, and in the Transform tab, click Fill, and click Down.

We now have a unique ID for each record. We can delete column Index, and proceed to the next phase.
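The Index + Fill Down combination above can be sketched in plain Python (an illustration only, with shortened sample lines):

```python
# Hypothetical sketch: tag every line with the index of the most recent
# "MAC Address" line, giving each multiline record a unique ID.
lines = [
    "MAC Address: 0000-5e00-01a8     VLAN/VSI/SI: MOB-RAN-1158",
    "Port       : Eth-Trunk11.1198   Type       : dynamic",
    "MAC Address: 0000-5e00-01a9     VLAN/VSI/SI: MOB-RAN-1159",
    "Port       : Eth-Trunk11.1199   Type       : dynamic",
]

record_ids = []
current_id = None
for index, line in enumerate(lines):
    if line.startswith("MAC Address"):
        current_id = index         # a new record starts here
    record_ids.append(current_id)  # "fill down" the last seen ID

print(record_ids)  # → [0, 0, 2, 2]
```

Every line in a record shares the index of its "MAC Address" line, exactly what the Fill Down step produces.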

Split, Trim & Merge

In this phase we will perform a series of split and trim transformations to change the cluttered data in Column1 into four well-structured columns in a field, value, field, value layout.

Right click on the header of column Column1, select Split Column and click By Delimiter...

In the Split by Delimiter dialog, select Colon as the delimiter, then select to split At each occurrence of the delimiter, as shown in the following screenshot, and click OK.

After the previous step, we have three columns: Column1.1, Column1.2 and Column1.3.

Right click on Column1.2, select Transform, and click Trim.

Right click on Column1.2, select Split Column, and click By Delimiter...

In the Split by Delimiter dialog, select Space as the delimiter, choose to split At the left-most delimiter, and click OK.

The column Column1.2 is now transformed into Column1.2.1 and Column1.2.2.

We will now trim the first four columns.

Select Column1.1, Column1.2.1, Column1.2.2 and Column1.3 (you can use the Shift key to select a range of columns). Right click on any of the column headers, select Transform, and click Trim.

We have now reached the point of having four columns in a field, value, field, value layout.

The next phase includes two merge operations that stage the next transformation.

Select Column1.1 and then Column1.2.1. Right click on one of the selected columns and click Merge Columns.

In Merge Columns dialog, select Colon as a separator and click OK.

Select Column1.2.2 and then Column1.3. Right click on one of the selected columns and click Merge Columns.

In Merge Columns dialog, select Colon as a separator and click OK.

As a result of this stage, we have two columns in the format of field:value.

The Magic Starts...

The next phase is where the really cool things start.

Select the first two columns (Merged and Merged.1), right click on one of the column headers, and click Unpivot Columns.

Click the filter button on the Value header, select Text Filters, and click Does Not Begin With...

In the Filter Rows dialog, enter a colon character in the text box adjacent to the does not begin with drop-down menu, and click OK.

Delete column Attribute (by selecting the column and pressing the Delete key).

Right click on column Value, select Split Column, and click By Delimiter...

In the Split by Delimiter dialog, select Colon as the delimiter and click OK.

Rename the last two columns to Field and Value.

Select the column Field, and in the Transform tab click Pivot Column.

In the Pivot Column dialog, select Value in the Values Column, expand the Advanced options, then select Don't Aggregate in Aggregate Value Function, and click OK.

Delete Column Custom, and we are done.

In the Home tab, click Close & Load, or Close & Load To... and select your load options (if you have more than 1M rows, don't load the data to a table; use the Data Model option and build a PivotTable to analyze the data).
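To make the unpivot/split/pivot trick concrete, here is a hypothetical pure-Python sketch of the same reshaping (invented sample values; two merged field:value columns per row, plus a record ID):

```python
# Hypothetical sketch of the unpivot -> split -> pivot steps: rows of
# "field:value" strings sharing a record ID collapse into one record.
rows = [
    {"id": 0, "Merged": "MAC Address:0000-5e00-01a8",
              "Merged.1": "VLAN/VSI/SI:MOB-RAN-1158"},
    {"id": 0, "Merged": "Port:Eth-Trunk11.1198",
              "Merged.1": "Type:dynamic"},
]

records = {}
for row in rows:
    rec = records.setdefault(row["id"], {})
    # Unpivot: visit each merged column as a single "field:value" string.
    for col in ("Merged", "Merged.1"):
        # Split at the first colon, then pivot: the field becomes a column.
        field, _, value = row[col].partition(":")
        rec[field.strip()] = value.strip()

print(records[0]["Port"])  # → Eth-Trunk11.1198
```

Each record ID ends up with one dictionary whose keys are the field names, which is exactly the wide table the Pivot Column step produces.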

Conclusions

All the data is now loaded in the desired structure, and it is quite different from the data we started with, don't you think?

You can now start using the power of Excel to analyze and visualize this data.

To download the workbook that was used for this episode, click here.

To skip all the steps above, and build the same query in your favorite Query Editor (Excel 2016 Preview, Excel 2013, Excel 2010 or Power BI Desktop), open the Advanced Editor and paste the following Power Query (M) expression:

let
    Source = Table.FromColumns({Lines.FromBinary(Web.Contents("https://gallery.technet.microsoft.com/SSH-Log-Sample-to-import-b4088fd4/file/140762/1/ssh.txt"),null,null,1252)}),
    #"Filtered Rows" = Table.SelectRows(Source, each
        Text.Contains([Column1], "$ssh") or
        Text.StartsWith([Column1], "MAC Address") or
        Text.StartsWith([Column1], "Port") or
        Text.StartsWith([Column1], "PEVLAN") or
        Text.StartsWith([Column1], "TrustFlag") or
        Text.StartsWith([Column1], "Peer IP") or
        Text.StartsWith([Column1], "Aging time") or
        Text.StartsWith([Column1], "TimeStamp")),
    #"Added Custom" = Table.AddColumn(#"Filtered Rows", "Custom", each if Text.Contains([Column1], "$ssh") then [Column1] else null),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Added Custom","Custom",Splitter.SplitTextByEachDelimiter({"$ssh "}, null, false),{"Custom.1", "Custom.2"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Custom.1", type text}, {"Custom.2", type text}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Custom.1"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"Custom.2", "IP"}}),
    #"Filled Down" = Table.FillDown(#"Renamed Columns",{"IP"}),
    #"Filtered Rows1" = Table.SelectRows(#"Filled Down", each not Text.Contains([Column1], "$ssh")),
    #"Added Index" = Table.AddIndexColumn(#"Filtered Rows1", "Index", 0, 1),
    #"Added Custom1" = Table.AddColumn(#"Added Index", "Custom", each if Text.StartsWith([Column1], "MAC Address") then [Index] else null),
    #"Filled Down1" = Table.FillDown(#"Added Custom1",{"Custom"}),
    #"Removed Columns1" = Table.RemoveColumns(#"Filled Down1",{"Index"}),
    #"Split Column by Delimiter1" = Table.SplitColumn(#"Removed Columns1","Column1",Splitter.SplitTextByDelimiter(":"),{"Column1.1", "Column1.2", "Column1.3"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter1",{{"Column1.1", type text}, {"Column1.2", type text}, {"Column1.3", type text}}),
    #"Trimmed Text" = Table.TransformColumns(#"Changed Type1",{{"Column1.2", Text.Trim}}),
    #"Split Column by Delimiter2" = Table.SplitColumn(#"Trimmed Text","Column1.2",Splitter.SplitTextByEachDelimiter({" "}, null, false),{"Column1.2.1", "Column1.2.2"}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Split Column by Delimiter2",{{"Column1.2.1", type text}, {"Column1.2.2", type text}}),
    #"Trimmed Text1" = Table.TransformColumns(#"Changed Type2",{{"Column1.1", Text.Trim}, {"Column1.2.1", Text.Trim}, {"Column1.2.2", Text.Trim}, {"Column1.3", Text.Trim}}),
    #"Merged Columns" = Table.CombineColumns(#"Trimmed Text1",{"Column1.1", "Column1.2.1"},Combiner.CombineTextByDelimiter(":", QuoteStyle.None),"Merged"),
    #"Merged Columns1" = Table.CombineColumns(#"Merged Columns",{"Column1.2.2", "Column1.3"},Combiner.CombineTextByDelimiter(":", QuoteStyle.None),"Merged.1"),
    #"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Merged Columns1", {"IP", "Custom"}, "Attribute", "Value"),
    #"Filtered Rows2" = Table.SelectRows(#"Unpivoted Columns", each not Text.StartsWith([Value], ":")),
    #"Removed Columns2" = Table.RemoveColumns(#"Filtered Rows2",{"Attribute"}),
    #"Split Column by Delimiter3" = Table.SplitColumn(#"Removed Columns2","Value",Splitter.SplitTextByEachDelimiter({":"}, null, false),{"Value.1", "Value.2"}),
    #"Changed Type3" = Table.TransformColumnTypes(#"Split Column by Delimiter3",{{"Value.1", type text}, {"Value.2", type text}}),
    #"Renamed Columns1" = Table.RenameColumns(#"Changed Type3",{{"Value.1", "Field"}, {"Value.2", "Value"}}),
    #"Pivoted Column" = Table.Pivot(#"Renamed Columns1", List.Distinct(#"Renamed Columns1"[Field]), "Field", "Value"),
    #"Removed Columns3" = Table.RemoveColumns(#"Pivoted Column",{"Custom"})
in
    #"Removed Columns3"

 

If you find this post useful, please share your comments with us.

To read more episodes of Excel for Security Analysts, go here.

Which other log files do you think we should try next? 

 

Let's meet up in North Carolina Monday evening (8/10)


Windows Performance Recorder (WPR.exe) now in-box in Windows 10.


Applies to:

Windows 10

 

Starting with Windows 10, the command-line version of the Windows Performance Recorder (WPR.exe) ships in-box.

Note:  The WPR.exe file is located in %SystemRoot%\System32, i.e. C:\Windows\System32.

It contains the common profiles (scenarios):

Profile                     Description
GeneralProfile              First level triage
CPU                         CPU usage
DiskIO                      Disk I/O activity
FileIO                      File I/O activity
Registry                    Registry I/O activity
Network                     Networking I/O activity
Heap                        Heap usage
Pool                        Pool usage
VirtualAllocation           VirtualAlloc usage
Audio                       Audio glitches
Video                       Video glitches
Power                       Power usage
InternetExplorer            Internet Explorer
EdgeBrowser                 Edge Browser
Minifilter                  Minifilter I/O activity
GPU                         GPU activity
Handle                      Handle usage
XAMLActivity                XAML activity
HTMLActivity                HTML activity
DesktopComposition          Desktop composition activity
XAMLAppResponsiveness       XAML App Responsiveness analysis
HTMLResponsiveness          HTML Responsiveness analysis
ResidentSet                 Resident Set analysis
XAMLHTMLAppMemoryAnalysis   XAML/HTML application memory analysis
UTC                         UTC Scenarios
DotNET                      .NET Activity
WdfTraceLoggingProvider     WDF Driver Activity

Note:  For the On/Off scenarios (boot, fast startup, shutdown, rebootCycle, standby (sleep), hibernate), and for Interrupts and DPCs, you will need to use the Windows 10 SDK or the Windows 10 ADK version of the Windows Performance Toolkit, which ships WPRUI.exe, WPR.exe, and xperf.exe.


C:\Program Files (x86)\Windows Kits\10\Windows Performance Toolkit>wpr -?

Microsoft Windows Performance Recorder Version 10.0.10240 (CoreSystem)
Copyright (c) 2015 Microsoft Corporation. All rights reserved.

        Usage: wpr options ...

-help                     - Provide command line help information
-profiles                 - Enumerates the profile names and descriptions from a profile file
-start                    - Starts one or more profiles
-marker                   - Fires an event marker
-markerflush              - Fires an event marker and flushes the working set
-status                   - Displays status on active recording (if any)
-profiledetails           - Displays the detailed information about a set of profiles
-providers                - Displays detailed information about providers
-cancel                   - Cancels recording initiated via WPR (if any)
-stop                     - Stops recording initiated via WPR (if any) and saves
-log                      - Configure debug logging to the event log
-disablepagingexecutive   - Change the Disable Paging Executive settings
-heaptracingconfig        - Change heap tracing settings for a process
-capturestateondemand     - Capture states for the configured providers in the current recording
-setprofint               - Set sampled profile interval
-profint                  - Query the current profile interval
-resetprofint             - Restores the default profile interval values

Here are the three most common scenarios:

Scenario 1:  You are seeing high CPU usage.

Start, CMD (Run As Admin)

md c:\temp

WPR.exe -start CPU

<reproduce the issue>

WPR.exe -stop c:\temp\HighCPU.etl

 

Scenario 2:  You are seeing high storage utilization.

Start, CMD (Run As Admin)

md c:\temp

WPR.exe -start CPU -start DiskIO -start FileIO

<reproduce the issue>

WPR.exe -stop c:\temp\CPUDiskFileIO.etl

 

Scenario 3:  Something hangs for 1-60 seconds and then becomes responsive again.

Start, CMD (Run As Admin)

md c:\temp

WPR.exe -start CPU -start DiskIO -start FileIO -start Network

<reproduce the issue>

WPR.exe -stop c:\temp\WaitAnalysis.etl


Note:  For scenario 3, there is a good chance that it’s related to something coming over the network, so capture a matching network trace.  Network tracing (packet sniffing) is built in to Windows clients (7, 8, 8.1, 10) and Windows Server (2008 R2, 2012, 2012 R2).

Introducing the Azure RemoteApp Evaluation Guide


Microsoft Japan would like to introduce Azure RemoteApp to companies implementing telework: it lets you use Office programs and other applications from anywhere with an Internet connection, on a variety of devices including Windows, Mac OS X, iOS, and Android.

 

Azure RemoteApp is a mechanism that keeps your applications and data in the cloud and lets you use them securely from portable tablets, home PCs, and other devices.

 


 

If you telework, there are likely situations where you want to use the same data and the same applications as at the office. Why not take this opportunity to try a mechanism that lets you do so securely, on whatever devices suit your telework environment?

 

We have prepared an evaluation guide that lets you evaluate the service in a free one-month trial environment. Please make use of it during Telework Week.

 

Reference page:

Microsoft Azure RemoteApp product page


 

(SQL) Tip of the Day: You shouldn’t be using Auto Export anymore


Today’s (SQL) Tip…

Automatic export is a tool built into SQL Azure. It was formerly used as the primary way to back up SQL Azure databases in an automated fashion; in fact, for a long time it was the only way to automate backups. It works by first creating a copy of your database and then running an export to your Azure Storage account.

This has a large number of moving parts and can get very expensive if used daily or more frequently. Also, unless you have a well-defined business need, it is mostly pointless.

With any tier of SQL Azure database other than Web or Business, you get point-in-time recovery, which backs up your database every five minutes. You also get Geo-Restore, which gives you full disaster recovery in a geo-redundant location.

This all boils down to: unless you specifically need a bacpac of your database on a regular, automated basis, you should use the features already built into SQL Azure to handle backups for you.

Support Tip: Error message when deleting the Subscribers object from Administration\Notifications in Operations Manager


~ Keshav Deo Jain | Support Engineer

FIXHere’s a quick note on an issue I’ve run into a couple times that might help you out if you happen to see it. What happens is that the error message below may appear while deleting the Subscribers object from Administration\Notifications in Microsoft System Center Operations Manager 2007 (OpsMgr 2007), System Center 2012 Operations Manager (OpsMgr 2012) or System Center 2012 R2 Operations Manager (OpsMgr 2012 R2):

1

The Error Message

The full error message you receive will be similar to the following:

The following information was gathered when the operation was attempted. The information may appear cryptic but provides context for the error. The application will continue to run.

System.InvalidOperationException: The notification subscription was not valid for the update. See inner exception for details.

---> System.ArgumentException: The notification subscription is not valid for insert. See inner exception for details. ---> Microsoft.EnterpriseManagement.Common.ManagementPackException: Database error. MPInfra_p_ManagementPackInstall failed with exception:
Database error. MPInfra_p_ManagementPackInstall failed with exception:
Could not allocate space for object 'dbo.DeployedManifest'.'PK_DeployedManifest' in database 'OperationsManager' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
at Microsoft.EnterpriseManagement.Common.Internal.ServiceProxy.HandleFault(String methodName, Message message)
at Microsoft.EnterpriseManagement.Common.Internal.MonitoringNotificationServiceProxy.UpsertNotificationSubscription(String conditionDetectionConfiguration, String dataSourceConfiguration, IList`1 endpoints, String name, String internalName, String displayName, String description, String languageCode, String targetString, String targetManagementPackId, String targetManagementPackVersion, String targetManagementPackPublicKey, String targetManagementPackAlias, Int32 subscriptionType, Boolean enabled, Boolean resetStateOnEnabled, Boolean isUpdate)
at Microsoft.EnterpriseManagement.MonitoringNotificationManagement.UpsertNotificationSubscription

The Fix

Generally this error occurs when there is not enough space available for the OperationsManager database. You can run the following query in SQL Management Studio to verify the free space available for the OperationsManager database:

exec sp_spaceused

This query will give you the database size as well as the amount of free space as shown below:

2

If you come across this error, usually the easy fix is to simply create additional free space for the database. To do this, right-click the OperationsManager database in SQL Management Studio and choose Properties. In Properties, click the Files page. From there you can change the database size.

Alternately, you could also enable the autogrow setting for the database. Note that the OperationsManager database has autogrow disabled by default, which is what we recommend. This is because if the OperationsManager database is too small and is set to autogrow, the autogrow process will use more system resources than normal operation does, which can impair system performance. With that said, you can find details about autogrow in the following Knowledge Base article:

KB 315512: Considerations for the "autogrow" and "autoshrink" settings in SQL Server

Keshav Deo Jain | Support Engineer | Microsoft GBS Management and Security Division

Get the latest System Center news on Facebook and Twitter:


System Center All Up: http://blogs.technet.com/b/systemcenter/

Configuration Manager Support Team blog: http://blogs.technet.com/configurationmgr/ 
Data Protection Manager Team blog: http://blogs.technet.com/dpm/ 
Orchestrator Support Team blog: http://blogs.technet.com/b/orchestrator/ 
Operations Manager Team blog: http://blogs.technet.com/momteam/ 
Service Manager Team blog: http://blogs.technet.com/b/servicemanager 
Virtual Machine Manager Team blog: http://blogs.technet.com/scvmm

Microsoft Intune: http://blogs.technet.com/b/microsoftintune/
WSUS Support Team blog: http://blogs.technet.com/sus/
The RMS blog: http://blogs.technet.com/b/rms/
App-V Team blog: http://blogs.technet.com/appv/
MED-V Team blog: http://blogs.technet.com/medv/
Server App-V Team blog: http://blogs.technet.com/b/serverappv
The Surface Team blog: http://blogs.technet.com/b/surface/
The Application Proxy blog: http://blogs.technet.com/b/applicationproxyblog/

The Forefront Endpoint Protection blog : http://blogs.technet.com/b/clientsecurity/
The Forefront Identity Manager blog : http://blogs.msdn.com/b/ms-identity-support/
The Forefront TMG blog: http://blogs.technet.com/b/isablog/
The Forefront UAG blog: http://blogs.technet.com/b/edgeaccessblog/

Now available: Cumulative Update 1 for System Center 2012 R2 Configuration Manager SP1 and System Center 2012 Configuration Manager SP2

Author: Brian Huneycutt, Software Engineer, Enterprise Client and Mobility Cumulative Update 1 (CU1) for System Center 2012 R2 Configuration Manager SP1 and System Center 2012 Configuration Manager SP2 is now available as a hotfix download from KB 3074857. This update contains fixes for various issues, including an updated version of the Endpoint Protection client, and contains all of the changes from prior cumulative updates. For more information about the Endpoint Protection client, refer...(read more)