The (Solar) Winds of Change…

I used to love System Center.  Simply put, if you were a systems administrator / engineer / architect it did… everything.  It monitors, automates, protects, virtualizes, scripts, patches, deploys, integrates… everything in your environment.  It is, in a word, comprehensive.

It is also big.  There was a time (prior to System Center 2012) when you could pick and choose the components you wanted to buy – if you only wanted monitoring then all you bought was System Center Operations Manager (SCOM).  If all you wanted was virtualization management then all you bought was System Center Virtual Machine Manager (SCVMM).

When Microsoft announced in 2012 that all of the pieces would now be sold as a single package, I thought it was a good decision for Microsoft, but not necessarily a good one for the customer.  Certainly it would increase their market share for components such as System Center Data Protection Manager (DPM) – which probably went from a 0.1% market share to something somewhat higher – but that was not what customers wanted.  I wanted a reasonably simple monitoring tool that could be deployed (and purchased) independently of everything else; I could then use the backup tool that I wanted, the deployment tools that I wanted, the anti-malware tools that I wanted.

So when I got an e-mail from a representative of SolarWinds asking if I would try out their product (Server & Application Monitor) I decided to give it a try.  After all, I knew SolarWinds by reputation, and due to the non-invasive nature of the tool I could easily deploy it alongside my existing SCOM environment and monitor the same servers without risk.

The Good

The first thing I noticed about SolarWinds was the ease with which it installed.  Compared to SCOM (which was a bit of an ordeal even to install – see article), it was a simple install: it did not take long, and it was pretty straight-forward.

While the terminology was a little different than SCOM's, it was easy to understand the differences, and I suspect it would be pretty easy for a junior sysadmin to pick up.  At the top of the Main Settings & Administration page the first option is Discovery Central, which allows SAM to search your entire environment for servers.

The Alerts & Reports option helps you set up your mail account that sends alert & notification e-mails to the admins based on the current environment and issues.  It is just as easy to send these e-mails to individuals as to groups, and configuring what is sent to whom is relatively simple.

Fortunately, SAM is completely Active Directory integrated, so I can simply authorize my Domain Admins and other groups to access the data they need in SAM, and grant individuals and groups granular permissions to see and/or change what they are allowed to.

The dashboard is easy to read and understand, as well as customize.  I want my graphs to be at the top, and I want to know anything critical up front.  As with any good monitoring tool, Green=Good, Red=Bad.  All of my alerts are hyper-linked so if I see something Red I can just click and go right to it.


Actions, not words… "If this happens, then do that" is a requirement in this day and age.  It is great if my monitoring tool can notify me that a service is down… but how much better if it can bring that service back up for me at the same time.  The response can be as simple or as complicated as you need, but the fact that certain conditions can trigger actions, and not just alerts, is key for me.  This was a simple task in SAM.
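SAM configures this kind of thing through its UI, but the underlying pattern is easy to sketch.  Here is a minimal, hypothetical illustration in Python – the use of Windows' `sc` command and the parsing of its output are my own assumptions for the example, not anything SAM-specific:

```python
import subprocess

def service_running(sc_query_output: str) -> bool:
    """Decide from 'sc query' output whether a service is up."""
    return "RUNNING" in sc_query_output

def check_and_remediate(name: str) -> str:
    """Check a Windows service; if it is stopped, try to start it again.

    Restarting a service requires administrative rights.
    """
    result = subprocess.run(["sc", "query", name],
                            capture_output=True, text=True)
    if service_running(result.stdout):
        return "ok"
    # Condition met -> action, not just an alert
    subprocess.run(["sc", "start", name], check=False)
    return "restart attempted"
```

A real implementation would loop over a watch list on a schedule and alert only if the restart fails – which is exactly the sort of escalation logic a tool like SAM gives you without writing any of this yourself.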

Of course it is important to realize that some system admins will not be as comfortable learning a tool this powerful on their own, and the fact that SolarWinds offers scores of free training resources is key.  The Customer Portal has more than just videos; they offer live classes and expert sessions with their engineers and experts which you can attend live or watch later.  They have on-demand recordings of everything you might want to learn.  Their Virtual Classroom is an amazing resource for customers who need help – whether that is learning a simple tidbit in a few minutes, or going from zero to hero over the course of a few days.

My initial impression of SolarWinds SAM was that it would be a great tool for smaller businesses; that impression changed drastically, and reasonably quickly.  Yes, I installed SAM in one of my 100-server environments in Q3 2015, and it performed brilliantly.  However as I learned about it and got to know the product I was convinced it was definitely Enterprise-class, and by the end of Q1 2016 I also had it installed at a client with 19,000 users and thousands of servers.

The Bad and the Ugly…

There is really only one aspect of SolarWinds that irked me, and that is the licensing model.  With some monitoring tools if you have 200 servers you know you need 200 licenses.  With SolarWinds a single server may require 100 licenses, depending on what you are monitoring.  That is not to say that SAM will be more expensive than other tools… it is just a different way of looking at the calculations that I needed to wrap my head around.  A small thing to be sure, but it is certainly an issue for me.


I was offered a trial period with SAM to try it out in my environment, and when that trial period ended I decided to renew.  SolarWinds has a great tool here, but more important to me is the support that I have been able to get from the company, which has extended beyond simple ‘how do I…’ questions.  Their engineers have gotten on-line with me to help solve a couple of custom issues that arose, and they were happy to do it.

The product offering is a home run for system admins who want a monitoring and reporting tool and do not want to break the bank… or change out all of their other management tools to drink Microsoft’s Kool-Aid.

Small environment or large, SolarWinds is worth it.  Contact them for more information, and a demo of their offerings!


The Perils of a Manual Environment

I am not going to lie to you and say that every environment that I manage or have managed is an optimized Secure, Well-Managed IT Environment.  It’s just not true.

In a secure, well-managed IT environment we monitor to make sure that things are working the way they are supposed to.  When we spin up a new server, for example, the proper agents for anti-malware and monitoring are installed without our lifting a finger.  Tuesday evening a new server is spun up; Wednesday morning it is already letting us know how well it is running.

But what about the other environments?  Many smaller environments do not have automated deployment infrastructures that make sure every new server is built to spec.  What do we do for those?

The answer is simple… where automation is lacking, we have to be more vigilant in our processes.  When a new server (virtual or otherwise) is created, we not only install an operating system… we also install the monitoring agent and the anti-virus agent, and we schedule proper backups – because if we don't, it will all be for naught when everything goes down.

So the answer is to make my environment completely automated, right?

Well, yes of course it is… in an ideal world.  In the real world there are plenty of reasons why we wouldn’t automate everything.  The cost of such systems might outweigh the benefits, for example… or maybe we do not have an IT Pro managing it, just the office computer guy.  Ideally we would get that guy trained and certified in all of the latest and greatest… but if you work in small business you know that might not always be the reality.

So what IS the answer?

Simple.  I have a friend who has made a fortune telling people around the world how to make checklists.  I am not the guru that Karl is, and you don't have to be either.  But if you do have a manual environment, spend the time to make a checklist for how you build out systems – make one for servers, one for desktops, and probably one for each specific type of server.  You don't have to do it from memory… the next time you build a machine, write down (or type!) every step you take. 1) Create virtual machine. 2) Customize virtual machine. 3) Install operating system… and so on.  When you are satisfied that your system is built the way you want it (every time), then try it again… but rather than relying on what you know, follow the checklist.
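If your "computer guy" is even slightly script-inclined, the checklist can live as data rather than memory, which makes revising it later a one-line edit.  A minimal sketch in Python (the step names are just the ones from this post; your own list will be longer and more specific):

```python
# A server-build checklist kept as data, so it can be reviewed and revised
# instead of being rebuilt from memory every time.
SERVER_BUILD_CHECKLIST = [
    "Create virtual machine",
    "Customize virtual machine",
    "Install operating system",
    "Install monitoring agent",
    "Install anti-virus agent",
    "Schedule backups",
]

def render_checklist(steps):
    """Number the steps the way the article does: '1) Create virtual machine'."""
    return [f"{i}) {step}" for i, step in enumerate(steps, start=1)]

for line in render_checklist(SERVER_BUILD_CHECKLIST):
    print(line)
```

Printed out and taped to the rack, or kept in a wiki, the effect is the same: the process is written down, so it can be followed and improved.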

These checklists, I should mention, should not be written in stone.  There are ten rules that were so written, and that's enough.  Thou shalt not murder is pretty unambiguous.  Thou shalt install Windows 8.1 may change when you decide to upgrade to Windows 10.  So make sure that every time you use the checklist you do so with a critical eye, trying to see if there is a way to improve upon the process.  The Japanese word for this is kaizen.  They are pretty good at a lot of things, from what I have seen.

True story: I gave this advice to a colleague once who thought it was great.  He started creating checklists, and had his employees and contractors follow them.  One day he invited me for a drink and told me a funny story.  His client had been using System Center Operations Manager (SCOM) to monitor all of their servers.  He had a checklist that included installing the SCOM agent in all servers.  One day the client decided to switch from SCOM to SolarWinds (a great product!) and after several weeks he decommissioned his SCOM infrastructure.  Six months later the client (a pretty big small business) complained that since they switched from SCOM to SW all of their new servers kept reporting a weird error.  It seems that the IT Pro who was following the checklists had continued installing the SCOM Agent into their servers, and since it could not find a SCOM server to report to, it was returning an error.  As I said, these checklists should be living documents, and not set in stone.


There is no one right or wrong answer for every environment.  What is a perfect inexpensive solution for one company might be cost prohibitive for another.  The only thing you have to do is use your mind, keep learning, use common sense, and keep reading The World According to Mitch!

Welcome to What’s Next…

There is irony in the title of this post… What’s next.

I posted on Friday that it was my last day working full time at Yakidoo.  I really enjoyed my time there, and am glad that my next venture will allow me to stay on there on a limited basis.

This afternoon I am meeting a colleague at the airport in Seattle, and that will begin my first day at my new gig.  I will talk more about it in a few weeks, even though today will be my first billable day.  That is What's Next.

However the reason he and I will be in Seattle – Bellevue/Redmond, actually – is the Airlift for Windows Server, System Center (WSSC), and Windows Azure vNext… the next generation of datacenter and cloud technologies that Microsoft is 'showing off' to select Enterprise customers several months prior to launching them.  It will be a week of deep-dive learning, combined with the usual Microsoft marketing machine.  How do I know?  It's not my first kick at the can.

It is, of course, not my first such Airlift.  The first one I attended was for System Center Configuration Manager (SCCM) 2007, back in November of that year; a consulting firm had sent me, in advance of my heading off to Asia to teach it.  I have since been to a couple of others, each time as either a consultant, a Microsoft MVP, or a Virtual Technology Evangelist for Microsoft.  I have not given this a lot of thought, but this will be my first Airlift / pre-launch event that I attend as a customer.  It will be interesting to see if and how they treat me differently.

I suspect that the versions of WSSC that I will learn about this week will be the first that I will not be involved in presenting or evangelizing in any way dating back to Windows Server 2003.  I will not be creating content, I will not be working the Launch Events, and I will not be touring across Canada presenting the dog and pony show for Microsoft.  I will not be invited by the MVP Program to tour the user groups presenting Hyper-V, System Center, or Small or Essential Business Servers.  I will not be fronting for Microsoft showing off what is new, or glossing over what is wrong, or explaining business reasons behind technology decisions.  It is, in its way, a liberating feeling.  It is also a bit sad.

Don’t get me wrong… I will still be blogging about it.  Just because Microsoft does not want me in their MVP program does not mean that I will be betraying my readers, or the communities that I have helped to support over the years.  I will be writing about the technologies I learn about over the next week (I do not yet know if there will be an NDA or publication embargo) but at some point you will read about it here.  I will also, if invited, be glad to present to user groups and other community organizations… even if it will not be on behalf of (or sponsored by) Microsoft.  I was awarded the MVP because I was passionate about those things and helping communities… it was not the other way around.

What else can I say?  I am at the airport in Toronto, and my next article will be from one of my favourite cities in North America… see you in Seattle!

Onboard SAN… Issues.

A client of mine is a small business with a couple of physical servers and a couple of virtualization hosts.  One of the physical servers, a Lenovo ThinkServer, has been acting as a file server, so it has really been very under-used.  It is a good server that has never been used to its potential (like myself) but has been nonetheless a very important file server.  It has eight hard drives in it, managed by the on-board RAID controller.

When the server rebooted for no discernible reason last week, we were concerned.  When it didn’t come up again, and did not present any hard drives… we realized we had a problem.

I was relieved to discover that it was still under warranty from Lenovo, with NBD (next business day) on-site support.  I called them, and after the regular questions they determined that there might be a problem with the system board.  They dispatched one to me, along with a technician, for the next morning.  Their on-site service is still done by IBM, and in my career I have never met an unprofessional IBM technician.  These guys were no exception: very professional and very nice.  Unfortunately they weren't able to resolve the problem.

Okay, in their defense, here is what everyone (including me) expected to happen:

1) Replace the system board.

2) Plug in all of the devices (including the hard drives).

3) Boot it up, and during the POST get a message like ‘Foreign drive configuration detected.  Would you like to import the configuration?’

4) We answer YES, the configuration rebuilds, and Windows boots up.

Needless to say, this is NOT what happened.  Why?  Let's start with the fact that low-end on-board RAID controllers apparently suck.  Is it possible that a procedure was not properly followed?  I am not sure, and I am not judging.  I watched most of what they did, and did not see them do anything that I felt was overtly wrong.

The techs spent six hours on-site, a lot of that spent in consultation with the second level support engineer at Lenovo, who had the unenviable task of telling me, at the end of the effort, that all was lost, and I would have to restore everything from our backup.

I should mention at this point that we did have a backup… but because of maintenance we were doing to that system over the December holidays the most recent successful backup was twelve days old.


Okay, we would go ahead and do that.  In the meantime, the client and I set out to rebuild the RAID configuration.  We decided that although we were going to bolster the server – including a new RAID controller – we would first try to rebuild the array configuration exactly as it had been, and see what happened.

Let me be clear… even the Lenovo Engineer agreed that this was a futile effort, that there was no way that this was going to work.  Of course it would work as a new array, we just weren’t going to recover anything.  I agreed… but we tried it anyways.

…and the server booted into Windows.

To say that we were relieved would be an understatement.  We got it back up and running exactly as it had been, with zero data loss.  We were not going to leave it this way of course… I spent the next day migrating data into new shares on redundant virtual servers.  But nothing was lost, and we all learned something.

I want to thank Jeff from Lenovo, as well as Luke and Brett from IBM who did their best to help.  Even though we ended up resolving it on our own (and that credit goes mostly to my client), they still did everything they could to make it right.

So my client has a new system board in their server, and hopefully with a new RAID controller, some more memory, and an extra CPU this server can enjoy a new and long, productive life as a vSphere host in the cluster.

…But I swear to you, I will never let a customer settle for on-board ‘LSI Software RAID Mega-RAID’ type devices again!

Happy week-end.

End Of Days 2003: The End is Nigh!

In a couple of days we will be saying goodbye to 2014 and ringing in the New Year 2015.  Simple math should show you that if you are still running Windows Server 2003, it is long since time to upgrade.  However here’s more:

When I was a Microsoft MVP, and then when I was a Virtual Technical Evangelist with Microsoft Canada, you might remember my tweeting the countdown to #EndOfDaysXP.  While we had some pushback from people who were not going to migrate, I think we were all thrilled by the positive response and the overwhelming success we had in getting people migrated onto either Windows 8 or, at least, Windows 7.  We did this not only by tweeting, but also with blog articles and in-person events (including a number of national tours) helping people understand (a) the benefits of the modern operating system, and (b) how to plan for and implement a deployment solution that would facilitate the transition.  All of us who were on the team during those days – Pierre, Anthony, Damir, Ruth, and I – were thrilled by your response.

Shortly after I left Microsoft Canada, I started hearing from people that I should begin a countdown to #EndOfDaysW2K3.  Of course, Windows Server 2003 is over a decade old, and while it would outlast Windows XP, support for that hugely popular platform will end on July 14th, 2015 (I have long wondered if it is a coincidence that it ends on Bastille Day).  Depending on when you read this article it might be different, but as of right now the countdown is around 197 days.  You can keep track yourself by checking out the website here.

It should be said that with Windows 7 there was an #EndOfDaysXP Countdown Gadget for the desktop, and when I migrated to Windows 8 I used a third party app that sat in my Start Menu.  One friend suggested I create a PowerShell script, but that was not necessary.  I don’t remember exactly which countdown timer I used, but it would work just as well for Windows Server 2003 – just enter the date you are counting down to, and it tells you every day how much time is left.

The point is, while I think that migrating off of Windows Server 2003 is important, it was not at that point (nor is it now) an endeavour that I wanted to take on.  To put things in perspective, I was nearing the end of a 1,400-day countdown during which I tweeted almost every day.  I was no longer an Evangelist, and I was burnt out.

Despite what you may have heard, I am still happy to help the Evangelism Team at Microsoft Canada (although I think they go by a different name now).  So when I got an e-mail on the subject from Pierre Roman, I felt it important enough to share with you.  As such, here is the gist of that e-mail:

1) On July 14, 2015 support for Windows Server 2003 will come to an end.  It is vital that companies be aware of this, as there are serious dangers inherent in running unsupported platforms in the datacenter, especially in production.  As of that date there will be no more support and no more security updates.

2) The CanITPro team has written (or re-posted) several articles that will help you understand how to migrate off your legacy servers onto a modern Server OS platform, including:

3) The Microsoft Virtual Academy also has great educational resources to help you modernize your infrastructure and prepare for Windows Server 2003 End of Support, including:

4) Independent researchers have come to the same conclusion (IDC Whitepaper: Why You Should Get Current).

5) Even though time is running out, the Evangelism team is there to help you. You can e-mail them if you have any questions or concerns surrounding Windows Server 2003 End of Support.

Of course, these are all from them.  If you want my help, just reach out to me and if I can, I will be glad to help!  (Of course, as I am no longer with Microsoft or a Microsoft MVP, there might be a cost associated with engaging me.)

Good luck, and all the best in 2015!

Obvious? No…

I learned a lesson today, after I thought I had heard it all.  It seems that when there is mould in your server room, it is important to specify to everyone involved, at every level, that the equipment in there – often valued at hundreds of thousands of dollars, not to mention the potential for lost productivity – is extremely sensitive to the elements, and that pneumatic pressure hoses are not to be used in this room for anything ever ever ever.

You know, I always thought that there were some things that were so blatantly obvious that you just didn’t have to say anything.  I was reminded today that I was wrong about that.  So: For those of you who may ever be asked to clean a Server Room: NO PRESSURE HOSES.

That’s all.

Server Core: Save money.

I remember an internal joke floating around Microsoft in 2007, about a new way to deploy Windows Server.  There was an ad campaign around Windows Vista at the time that said ‘The Wow Starts Now!’  When they spoke about Server Core they joked ‘The Wow Stops Now!’

Server Core was a new way to deploy Windows Server.  It was not a different license or a different SKU, or even different media.  You simply had the option during the installation of clicking ‘Server Core’ which would install the Server OS without the GUI.  It was simply a command prompt with, at the time, a few roles that could be installed in Core.

While Server Core would certainly save some resources, it was not really practical in Windows Server 2008, or at least not for a lot of applications.  There was no .NET, no IIS, and a bunch of other really important services could not be installed on Server Core.  In short, Server Core was not entirely practical.

Fast forward to Windows Server 2012 (and R2) and it is a completely different story.  Server Core is a fully capable Server OS, and with regard to resources the savings are huge.  So when chatting with the owner of a cloud services provider recently (one with hundreds of physical and thousands of virtual servers) I asked what percentage of his servers were running Server Core, and he answered 'Zero'.  I could not believe my ears.

The cloud provider is a major Microsoft partner in his country, and is on the leading edge (if not the bleeding edge) on every Microsoft technology.  They recently acquired another datacentre that was a VMware vCloud installation, and have embarked on a major project to convert all of those hosts to Hyper-V through System Center 2012.  So why not Server Core?

The answer is simple… When Microsoft introduced Server Core in 2008 they tried it out, and recognizing its limitations decided that it would not be a viable solution for them.  It had nothing to do with the command line… the company scripts and automates everything in ways that make them one of the most efficient datacentres I have ever seen.  They simply had not had the cycles to re-test Server Core in Server 2012 R2 yet.

We sat down and did the math.  The graphical user interface (GUI) in Windows Server 2012 takes about 300MB of RAM – a piddling amount when you consider the power of today's servers.  However in a cloud datacentre such as this one, in which every host contains 200-300 virtual machines running Windows Server, that 300MB of RAM adds up quickly – a host with two hundred virtual machines requires 60GB of RAM just for GUIs.  If we assume that the company was not going to go out and buy more RAM for its servers simply for the GUI, it means that, on average, a host comfortably running 200 virtual machines with the GUI could easily run 230 virtual machines on Server Core.

In layman’s terms, the math in the previous paragraph means that the datacentre capacity could increase by fifteen percent by converting all of his VMs to Server Core.  If the provider has 300 hosts running 200 VMs each (60,000 VMs), then an increased workload of 15% translates to 9,000 more VMs.  With the full GUI that translates to forty-five more hosts (let’s conservatively say $10,000 each), or an investment of nearly half a million dollars.  Of course that is before you consider all of the ancillary costs – real estate, electricity, cooling, licensing, etc…  Server Core can save all of that.
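For anyone who wants to play with the numbers, the back-of-envelope math is easy to reproduce.  A quick sketch using the round figures from the paragraphs above (the 15% capacity gain and the $10,000-per-host cost are this article's working assumptions, not vendor figures):

```python
GUI_RAM_MB = 300          # RAM consumed by the GUI in each Windows Server VM
VMS_PER_HOST = 200        # average VMs per host with the GUI installed
HOSTS = 300               # hosts in the provider's datacentre
HOST_COST_USD = 10_000    # conservative cost per host

# RAM freed on one host by uninstalling the GUI from every VM on it
ram_saved_gb = GUI_RAM_MB * VMS_PER_HOST / 1000   # 60.0 GB per host

# A 15% capacity gain across the whole datacentre
extra_vms = int(HOSTS * VMS_PER_HOST * 0.15)      # 9,000 more VMs
hosts_avoided = extra_vms // VMS_PER_HOST         # 45 hosts not purchased
capital_avoided = hosts_avoided * HOST_COST_USD   # $450,000

print(ram_saved_gb, extra_vms, hosts_avoided, capital_avoided)
```

Swap in your own host counts and VM densities; the shape of the result – real capital avoided for a single reboot per VM – tends to hold at any scale.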

Now here’s the real kicker: Had we seen this improvement in Windows Server 2008, it still would have been a very significant cost to converting servers from GUI to Server Core… a re-install was required.  With Windows Server 2012 Server Core is a feature, or rather the GUI itself is a feature that can be added or removed from the OS, and only a single reboot is required.  While the reboot may be disruptive, if managed properly the disruption will be minimal, with immense cost savings.

If you have a few servers to uninstall the GUI from then the Server Manager is the easy way to do it.  However if you have thousands or tens of thousands of VMs to remove it from, then you want to script it.  As usual PowerShell provides the easiest way to do this… the cmdlet would be:

Uninstall-WindowsFeature Server-Gui-Shell -Restart

There is also a happy medium between the GUI and Server Core called MinShell… you can read about it here.  However remember that in your virtualized environment you will be doing a lot more remote management of your servers, and there is a reason I call MinShell ‘the training wheels for Server Core.’

There’s a lot of money to be saved, and the effort is not significant.  Go ahead and try it… you won’t be disappointed!

What’s New in Windows Server 2012 R2 Lessons Learned Week 1

Dan Stoltz asked me to republish this article, and it is well worth it!  Check out all of the links – a lot of great material! -MDG

It has been an incredible start to the Windows Server 2012 R2 Launch Series.  Here is a brief summary of what we have covered so far…

  1. Windows Server 2012 R2 Launch Blog Series Index #WhyWin2012R2 – In the series' opening and index page we learned that from Oct 18th, and every day until Thanksgiving, we should visit to learn all about Windows Server 2012 R2. You can also follow the excitement on Twitter at #WhyWin2012R2. Download the calendar .ICS to populate your calendar here.  This post started the new launch series in which Microsoft platform experts would cover why Windows Server 2012 R2 is important; how to deploy, manage, and configure any number of components in Windows Server 2012 R2; how the new OS capabilities stack up against competitors; how R2 integrates with and leverages cloud services like Windows Azure; and many, many more categories. This series is deep technical content with lots of How-To's and Step-By-Step instructions. You will learn about storage, cloud integration, RDS, VDI, Hyper-V, virtualization, deduplication, replica, DNS, AD, DHCP, high availability, SMB, backup, PowerShell and much, much more!
  2. Why Windows Server 2012 R2 Rocks! #WhyWin2012R2 – You are probably like most people and realize that Windows Server 2012 was a very substantial upgrade over Windows Server 2008 R2. What would you say to Microsoft doing it again, and even better? WOW! That is exactly what Windows Server 2012 R2 has done. In this post we will look at some of the coolest additions and improvements to Windows Server 2012 R2. Regardless of which of the four pillars of focus (Enterprise-Class, Simple and Cost-Effective, Application Focused, User Centric) you are most interested in, you will find plenty in this post to appreciate! @ITProGuru will show you as he counts the top 10 biggest, most relevant and/or most differentiated new features in Windows Server 2012 R2.
  3. Where Are All The Resources For Windows Server 2012 R2? – We learned where to go to get free resources for Windows Server 2012 R2, including downloading a Free Trial of Windows Server 2012 R2, Free online cloud servers, a Free EBook on Windows Server 2012 R2, Free Posters, Free Online Training from Microsoft Virtual Academy, and much more.
  4. Implementing Windows Server 2012 R2 Active Directory Certificate Services Part 1 &
  5. Implementing Windows Server 2012 R2 Active Directory Certificate Services Part 2 – PKI is heavily employed in cloud computing for encrypting data and securing transactions. As Windows Server 2012 R2 is developed as a building block for cloud solutions, there is an increasing demand for IT professionals to acquire proficiency in implementing PKI with Windows Server 2012 R2. This two-part blog post series is meant to help those who, like me, perhaps do not work on Active Directory Certificate Services (AD CS) every day, but every so often need to implement a simple PKI for assessing or piloting solutions, better understand and become familiar with the process.
  6. Step-by-Step: Automated Tiered Storage with Storage Spaces in R2 – Windows Server 2012 R2 includes a number of exciting storage virtualization enhancements, including automated storage tiering, scale-out file server re-balancing and performance tuning for high-speed 10Gbps, 40Gbps and 56Gbps storage connectivity.  IT Pros with whom I've spoken are leveraging these new enhancements to build cost-effective SAN-like storage solutions using commodity hardware.  In this article, we'll begin part 1 of a two-part mini-series on storage.  I'll provide a technical comparison of Windows Server 2012 R2 storage architecture to traditional SAN architecture, and then deep-dive into the new Storage Spaces enhancements for storage virtualization.  At the end of this article, I'll also include Step-by-Step resources that you can use to build your own Storage Spaces lab.  In part 2 of this mini-series, we'll finish our storage conversation with the new improvements around Scale-Out File Servers in Windows Server 2012.
  7. iSCSI Target Server – Super Fast Mass Server Deployment! – #WhyWin2012R2 – There have been some significant updates to Windows Server 2012 with the R2 release. One of these updates helps IT Pros deal with a growing problem: how do I deploy a large number of servers quickly, at scale, without adding massive amounts of storage?  The updates to the iSCSI Target Server technologies allow admins to share a single operating system image stored in a centralized location and use it to boot large numbers of servers from a single image. This improves efficiency, manageability, availability, and security. iSCSI Target Server can boot hundreds of computers by using a single operating system image!
  8. Why Windows Server 2012 R2: Reducing the Storage Cost for your VDI Deployments with VHD De-duplication for VDI – Windows Server 2012 introduced data deduplication for storage workloads, and customers saw phenomenal storage reduction.  Windows Server 2012 R2 deduplication now supports live VHDs for VDI, which means that data de-duplication can now be performed on open VHD/VHDX files on remote VDI storage with CSV volume support. Remote VHD/VHDX storage de-duplication allows for significantly increased VDI storage density, reducing VDI storage costs and enabling faster read/write of optimized files and advanced caching of duplicated data.
  9. Importing & Exporting Hyper-V VMs in Windows Server 2012 R2 – One of the biggest benefits of server virtualization is the ability to backup or restore entire systems easily and quickly.  Though they are infrequently used features, Hyper-V import and export are very fast, versatile, and easy to use.  In Windows Server 2012 R2 these features get even better.  I will take a look at how this functionality works and why it is useful.  I’ll also discuss how they are very different from the commonly used checkpoints in Hyper-V, and how you can automate this process.
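The automation mentioned in item 9 can be sketched in a few lines of Windows PowerShell.  This is a minimal, hedged example; the VM name and paths are placeholders for your own environment:

```powershell
# Export a VM to a folder; in Windows Server 2012 R2 this works even
# while the VM is running (live export)
Export-VM -Name "WEB01" -Path "D:\Exports"

# On the destination host, import the copy with a new VM ID so it does
# not collide with the original
$config = Get-ChildItem "D:\Exports\WEB01\Virtual Machines" -Filter *.xml |
    Select-Object -First 1
Import-VM -Path $config.FullName -Copy -GenerateNewId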

Stay plugged in to the series to continue learning about Windows Server 2012 R2.


Vancouver Helping Calgary

The news is ablaze with stories of the terrible flooding in Calgary.  As I wrote in an article yesterday (Leaving Calgary…) I got out before the worst of it, but only barely.  The rivers are overflowing, entire neighborhoods are under water, and the news is not getting better.  At least two people are dead, and many are discovering that their insurance policies will not cover the damage.

On Saturday I spent the day with the Vancouver Technology Users Group (VANTug).  We spent the morning talking Windows 8 and Office 365, and then in the afternoon we discussed System Center 2012 and Microsoft’s Private Cloud solutions.  We had a great time at the Burnaby campus of BCIT.  I always love coming out to Vancouver, and today was no different.

And yet I couldn’t get Calgary out of my mind.  I know that a lot of people are scared, cold, wet, and hungry… and will have a very tough time rebuilding.  I am sure that when the IT Pros of Southern Alberta do get back into their offices they will have discussions around disaster recovery, business continuity, and minimizing loss.  Today, and through the middle of the week I expect most of them are with their families worrying about things much more important… their homes, their memories.

I showed up at BCIT with a Big Box o’ Swag full of prizes, and as is always the case at Install Fests I was asked early on if they were going to get licenses of Windows 8.  They were not… but as luck would have it I had one license in my laptop case that I had received at an event a few weeks ago that I did not really need, so I told them I would raffle off that license at the end of the day.

When the raffle time came some fifteen people won mice, keyboards, and Xbox controllers.  I then put all of the winning tickets back into the hat and was about to draw for the Windows 8 Pro license when I had a thought…

I had a one year subscription to Microsoft Office 365 Home Premium in my bag that I was supposed to give to a friend last week, but didn’t see them.  As I stood at the front of the room I asked the group leader (Peter) if they support charities, and he said that they did.  Normally they support the local children’s hospital, but for this I asked him to agree to support the Red Cross Alberta Floods Fund.  I told the group that I would draw for a winner of the Windows 8 license, and if the winner was willing to donate $50 to the fund (through VanTug) then he or she would also receive the subscription for Office 365.

The winner agreed and is now the proud owner of two great products… but should be even prouder to be helping a very important cause that is near and dear to my heart, and one that should be important for all Canadians.

I received a comment on my blog that same morning in response to an article I wrote about the relationship between Quebec and the rest of Canada.  The commenter said, in essence, that we have nothing in common across this great land (my paraphrase, not his words).  I disagree.  I think we share a heart and a love of our fellow man that transcends the political views of one side or another of any political debate, most of which seem petty in the face of disasters that befall regions and peoples from time to time.  I will respond to that comment in an article later this week, but in the meantime I hope my Quebec reader takes some food for thought from this one, and says a prayer or even donates a little to the people of Alberta… so distant, but so close to all of us.

Rebuilding SWMI… Not the company, just the infrastructure

I have been talking about rebuilding the domain for my company for several months, and finally sat down to do it this weekend.  Because I was essentially destroying the old domain there were a few steps I needed to perform before going ahead.

I performed a Backup of all of my data.  Nobody in their right mind would destroy an environment before they back up their data… especially if they are planning to actually delete the machines and start from scratch.

I performed a complete test-Restore of all of my data.  Now that my Mail Server is completely cloud-based this was easier than it might have been – If I had Exchange, SQL, and SharePoint it would have been more complicated, but also more crucial.  I always stress the importance of doing test-restores because the worst time to find out that your backup did not work is when you need to recover it.  Make sure that it works before you are faced with real data loss.

Planning was actually relatively simple for me, because the main environment was going to look very similar to the lab environment I had recently built for my Private Cloud camp.  I still had the planning documents for that, and I was able to follow them pretty closely for the first few machines. There was a time when I would have done the planning in my head, but now I make sure that I have all of my plans on paper before I go forward.  As the old adage goes, measure twice, cut once.  By having your thoughts on paper it is easier to stay on track… and if you do have to veer then you should document why you did.

Cleaning Up may not seem all that important, but destroying a cluster before destroying the domain is infinitely simpler than doing so afterwards.  It is doable of course, but there are PowerShell commands such as Remove-ClusterResource -Force that you will get intimately familiar with if you do not think ahead.
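As a sketch of that think-ahead cleanup (the cluster name SWMI-CL01 is a placeholder), the orderly version looks something like this:

```powershell
# Remove the clustered resources while the domain (and the cluster's
# Active Directory objects) still exist
Get-ClusterResource -Cluster "SWMI-CL01" | Remove-ClusterResource -Force

# Then destroy the cluster itself and clean up its AD objects
Remove-Cluster -Cluster "SWMI-CL01" -Force -CleanupAD
```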

Make sure you have all of the installation media at hand… either on physical DVD or in an ISO repository.  This should include not only the obvious ones such as operating systems and applications, but also the latest hardware drivers.  By looking at my plan I know that I will need the following media:

Additionally I would need several bits that I would simply download as one-offs… the Report Viewer, Silverlight, and things like that.  However since my networking topology is already in place, I would be able to do that from within the virtual machines.

Now that I have everything ready to go, I am ready to move forward.  Building an environment from scratch (green-field) would be simpler, but there are some aspects that prevented that.  In your production environment (should you ever decide to start from near-scratch) you will have to run through the same sort of project plan.  Make sure you think it out – do not simply sit down one morning and expect to implement in the afternoon; rather make sure you observe your environment for a few cycles and build your plan over time so that you don’t run into any surprises.

In my next piece I will go through the actual build architecture of how I decided to build my server infrastructure; I will also introduce some actual build videos of the System Center components.  If there is something in particular that you would like to see please let me know by commenting! -M

Windows Server 2012: Roles & Features

A colleague asked me earlier today if I knew off the top of my head how many roles and features there are in Windows Server 2012, and I had to admit that I did not know.  As Albert Einstein reputedly said, why memorize what you can look up?  However as a quick exercise I decided to not only count them, but type them up into an article for Kalvin… and for all of you!

NOTE: I included all of the sub-roles and sub-features as well, except under the Remote Server Administration Tools, which would list a tool for each of the roles and features.


1. Active Directory Certificate Services

2. Active Directory Domain Services

3. Active Directory Federation Services

4. Active Directory Lightweight Directory Services

5. Active Directory Rights Management Services

6. Application Server

7. DHCP Server

8. DNS Server

9. Fax Server

10. File and Storage Services

a. File and iSCSI Services

i. File Server

ii. BranchCache for Network Files

iii. Data Deduplication

iv. DFS Namespaces

v. DFS Replication

vi. File Server Resource Manager

vii. File Server VSS Agent Services

viii. iSCSI Target Server

ix. iSCSI Target Storage Provider

x. Server for NFS

b. Storage Services

11. Hyper-V

12. Network Policy and Access Services

13. Print and Document Services

14. Remote Access

15. Remote Desktop Services

16. Volume Activation Services

17. Web Server (IIS)


1. .NET Framework 3.5 Features

a. .NET Framework 3.5 (includes .NET 2.0 and 3.0)

b. HTTP Activation

c. Non-HTTP Activation

2. .NET Framework 4.5 Features

a. .NET Framework 4.5

b. ASP.NET 4.5

c. WCF Services

i. HTTP Activation

ii. Message Queuing (MSMQ) Activation

iii. Named Pipe Activation

iv. TCP Activation

v. TCP Port Sharing

3. Background Intelligent Transfer Service (BITS)

a. IIS Server Extension

b. Compact Server

4. BitLocker Drive Encryption

5. BitLocker Network Unlock

6. BranchCache

7. Client for NFS

8. Data Center Bridging

9. Enhanced Storage

10. Failover Clustering

11. Group Policy Management

12. Ink and Handwriting Services

13. Internet Printing Client

14. IP Address Management (IPAM) Server

15. iSNS Server Service

16. LPR Port Monitor

17. Management OData IIS Extension

18. Media Foundation

19. Message Queuing

a. Message Queuing Services

b. Message Queuing DCOM Proxy

20. Multipath I/O

21. Network Load Balancing

22. Peer Name Resolution Protocol

23. Quality Windows Audio Video Experience

24. RAS Connection Manager Administration Kit (CMAK)

25. Remote Assistance

26. Remote Differential Compression

27. Remote Server Administration Tools

28. RPC over HTTP Proxy

29. Simple TCP/IP Services

30. SMTP Server

31. SNMP Server

a. SNMP WMI Provider

32. Subsystem for UNIX-based Applications (Deprecated)

33. Telnet Client

34. Telnet Server

35. TFTP Client

36. User Interfaces and Infrastructure

a. Graphical Management Tools and Infrastructure

b. Desktop Experience

c. Server Graphical Shell

37. Windows Biometric Framework

38. Windows Feedback Forwarder

39. Windows Identity Foundation 3.5

40. Windows Internal Database

41. Windows PowerShell

a. Windows PowerShell 3.0

b. Windows PowerShell 2.0

c. Windows PowerShell ISE

d. Windows PowerShell Web Access

42. Windows Process Activation Service

a. Process Model

b. .NET Environment 3.5

c. Configuration APIs

43. Windows Search Service

44. Windows Server Backup

45. Windows Server Migration Tools

46. Windows Standards-Based Storage Management

47. Windows System Resource Manager (Deprecated)

48. Windows TIFF IFilter

49. WinRM IIS Extension

50. WINS Server

51. Wireless LAN Service

52. WoW64 Support

53. XPS Viewer

Now: Adding roles and features in Windows Server 2012 is easier than it was previously.  You can either use the Add Roles and Features Wizard (see my article and video here), or you can use Windows PowerShell (which is the preferred way to do it) with the Install-WindowsFeature cmdlet.  Even though there is a distinction between Roles and Features, the cmdlet to install them is the same for both.
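For example (a quick sketch; use Get-WindowsFeature to confirm the exact feature names on your own server):

```powershell
# List every role and feature, along with its install state
Get-WindowsFeature

# Roles (Hyper-V) and features (Failover Clustering) install the same way
Install-WindowsFeature -Name Hyper-V, Failover-Clustering -IncludeManagementTools -Restart
```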

Now go forth and serve, my fellow IT Pros!

Windows Server 2012: More than Virtualization!

Since it was in pre-release I have been evangelizing Windows Server 2012.  I have gone from sea to shining sea – Launch events, Partner showcases, IT Camps, user groups – talking about how much better it is than Windows Server 2008.  More importantly, I chiefly discuss the improvements to Hyper-V over previous versions, and how it (together with System Center 2012) compares to VMware’s vSphere 5.1 and vCenter Server.

While all of that is true, to say that virtualization is the only benefit of Windows Server 2012 is doing it a disservice.  Don’t get me wrong, Hyper-V officially rocks; but if virtualization were the only benefit of the new Server, couldn’t companies simply deploy the new version on their host hardware, and leave their virtual machines running Windows Server 2008 R2?

Going forward when someone asks me what is new and exciting in Windows Server, I am going to start with the improvements to Hyper-V… but then we can go into the real meat of the product, and see where it takes us.  Improvements such as:

Storage Spaces (or Storage Pools), which I have equated to software-RAID after ten generations of improvement.  With Storage Spaces you can build your volume from multiple disks of equal or disparate size, on similar or disparate architecture.  Imagine having three SAS disks of 450GB, 146GB, and 72GB combined into a single volume of 668GB… or a 146GB SAS disk, a 500GB SATA disk, and a 2TB USB disk combined into a volume of roughly 2.6TB.  Add to that the ability to hot-add drives on the fly (in a recent demo I added two disks in under 30 seconds), and to have your volume protected by Mirroring or Parity.  All of this is built into Windows Server 2012, and we have written about it extensively.  Try it for yourself by following my article here.
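For the PowerShell-inclined, a pool and a mirrored space can be sketched in a few lines (the friendly names here are arbitrary examples):

```powershell
# Grab every disk that is eligible for pooling, regardless of size or bus type
$disks = Get-PhysicalDisk -CanPool $true

# Create the pool, then carve out a mirrored virtual disk using all available space
New-StoragePool -FriendlyName "Pool1" -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $disks
New-VirtualDisk -StoragePoolFriendlyName "Pool1" -FriendlyName "Data" -ResiliencySettingName Mirror -UseMaximumSize
```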

Data Deduplication is built into the operating system.  What was previously a tool that storage-conscious companies paid thousands of dollars to third-party vendors for is now a check box away when creating your volume.  Once it is enabled on your volume you can use either the GUI tool or, if you are efficient, Windows PowerShell to schedule your dedup jobs or run them immediately, on either your local or remote systems.
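If you are the efficient type, the PowerShell path looks roughly like this (E: is an example data volume; the feature must be installed first):

```powershell
# Install the feature, enable dedup on the volume, and run a job immediately
Install-WindowsFeature -Name FS-Data-Deduplication
Enable-DedupVolume -Volume "E:"
Start-DedupJob -Volume "E:" -Type Optimization

# Later, see how much space you actually reclaimed
Get-DedupStatus -Volume "E:"
```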

Software iSCSI Target was exclusively a feature of Microsoft Storage Server until April of 2010, when Microsoft released it as a fully supported free download.  Now integrated into Server 2012, it gives you the ability to create a software SAN device on your server with the functionality of most hardware SANs, but at a fraction of the cost.  While it will not replace hardware SAN devices in large organizations, it brings that functionality to smaller businesses without the budget for the extra hardware.  Couple this feature with Storage Spaces and Data Dedup and you have yourself a real ballgame!  To get started check out our article here.
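A minimal sketch of standing up a target (paths, names, and the initiator IQN are placeholders; check Get-Help New-IscsiVirtualDisk on your build for the exact parameter names):

```powershell
# Create a virtual disk to act as the LUN, create a target that accepts
# a specific initiator, and map the two together
New-IscsiVirtualDisk -Path "E:\iSCSI\LUN1.vhd" -Size 100GB
New-IscsiServerTarget -TargetName "Target1" -InitiatorIds "IQN:iqn.1991-05.com.microsoft:server1.contoso.com"
Add-IscsiVirtualDiskTargetMapping -TargetName "Target1" -Path "E:\iSCSI\LUN1.vhd"
```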

MinShell is the new ‘compromise’ step between the full GUI Server installation and the Server Core installation.  It allows you to have a sort of ‘safety net’ of the GUI management tools, without actually having the Windows GUI environment installed.  You will save tons of resources across your virtualized environment because you no longer need the GUI on hundreds of virtual machines, as we wrote about here.
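Switching an existing full-GUI server down to MinShell is a one-liner, and just as easy to reverse:

```powershell
# Remove the shell but keep the Graphical Management Tools and Infrastructure
Uninstall-WindowsFeature Server-Gui-Shell -Restart

# To go back to the full GUI later
Install-WindowsFeature Server-Gui-Shell -Restart
```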

Server Manager was introduced in Windows Server 2003 R2 with all of the ho-hum yawning that it deserved.  Okay, a lot of our tasks were brought into one app, but that was about it.  That is why I was so surprised that the modern Server Manager in Server 2012 blew me away with its true multi-server management, the Dashboard functionality that gives the administrator a bird’s-eye view of the health of all of his or her systems, and the ability to manage… well, everything from one console.  Install roles and features on your local or remote servers with the same ease.  Manage multiple servers from the same console – add them by simply right-clicking the All Servers context, and then without any more work see all of the services running on that (or those) remote server(s) instantly added to your Dashboard.  I recorded a video of some of the great functionality in Server Manager for our blog here.

PowerShell 3.0 is the breakout version of this already incredible scripting environment, with nearly ten times as many cmdlets available out of the box as before.  Add to that the Integrated Scripting Environment (ISE) and you have a powerful scripting environment that is even easier to learn and use than before!
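Curious how many commands you have on your own build?  A quick way to check:

```powershell
# Count every cmdlet and function visible to the shell; in PowerShell 3.0
# modules are discovered automatically, so this includes role and feature modules
(Get-Command -CommandType Cmdlet, Function).Count

# Launch the Integrated Scripting Environment from any prompt
ise
```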

Active Directory Administration Center is a new all-encompassing tool for Active Directory management.  No longer will admins have to open one of several different consoles depending on what they wanted to do, the ADAC is it… plain and simple!

Active Directory Recycle Bin was introduced in Windows Server 2008 R2, and is now even easier to use.  Enable it in the ADAC (remember that once enabled it cannot be disabled).  To learn how to enable it read our article here, and to use it to restore an object we have another article here.
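PowerShell works here too.  A sketch, assuming a forest named contoso.com and a deleted user jsmith (substitute your own):

```powershell
# Enable the Recycle Bin for the forest; remember, this cannot be undone
Enable-ADOptionalFeature 'Recycle Bin Feature' -Scope ForestOrConfigurationSet -Target 'contoso.com'

# Restoring a deleted object later is then a one-liner
Get-ADObject -Filter 'Name -like "*jsmith*"' -IncludeDeletedObjects | Restore-ADObject
```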

Windows PowerShell History Viewer records the underlying Windows PowerShell commands when action is taken in the Active Directory Administrative Center so that the admin can copy and reuse the scripts.  This is also a great way for admins to start learning PowerShell!

Cloning and Snapshotting Domain Controllers, along with DCs that are fully aware of virtualization, mean that we no longer need to maintain a physical domain controller in our fully virtualized (or cloud-based) organization.  I can rapidly deploy new domain controllers (either in an existing or new domain), and quickly and easily restore business continuity during disaster recovery.  I can rapidly provision test environments and quickly meet increased capacity needs in branch offices.  Our virtualized domain controllers will detect snapshot restoration and non-authoritatively synchronize the delta of changes for Active Directory and the SYSVOL, making DC virtualization safer.
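The cloning workflow itself is refreshingly short.  A sketch with example values (the names and IP details are placeholders, and the source DC must be a member of the Cloneable Domain Controllers group):

```powershell
# On the source DC: check for installed software that is not clone-safe
Get-ADDCCloningExcludedApplicationList

# Generate DCCloneConfig.xml with the clone's new identity, then export
# the VM and import the copy on the target host
New-ADDCCloneConfigFile -CloneComputerName "DC02" -Static `
    -IPv4Address "192.168.1.12" -IPv4SubnetMask "255.255.255.0" `
    -IPv4DefaultGateway "192.168.1.1" -IPv4DNSResolver "192.168.1.10"
```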

Fine-Grained Password Policies in Active Directory allow me to have better security for my infrastructure: users with no access to sensitive information can be given more lenient password policies, while stricter policies are enforced for users with more access and for service accounts.  While everyone will still have to have password awareness, this will see a marked decrease in Post-It Note security violations.
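A sketch of a stricter policy for privileged accounts (all values are examples):

```powershell
# Create a policy that wins over the domain default (a lower Precedence wins)
New-ADFineGrainedPasswordPolicy -Name "Admins-FGPP" -Precedence 10 `
    -MinPasswordLength 15 -ComplexityEnabled $true -MaxPasswordAge "42.00:00:00"

# Apply it to the Domain Admins group
Add-ADFineGrainedPasswordPolicySubject -Identity "Admins-FGPP" -Subjects "Domain Admins"
```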

Dynamic Access Control is a new way of securing your information, whether on file shares, in SharePoint Document Libraries, or even in e-mail.  It works with Rights Management Server using Central Access Policies to verify who is accessing what information from where (what device).  The expression-based access policies determine before decrypting the content that both the user and the device are trusted.  If you have highly sensitive information that should only be accessed on corporately managed devices this is going to be a great new security feature available to you!

DirectAccess was introduced in the 2008 era with a plethora of complex requirements and prerequisites needed to implement.  In 2009 Rodney Buike wrote an article that is a great explanation of DirectAccess on our blog which can be read here.  In Server 2012 it is so much simpler to plan for, deploy, and use.  Anthony Bartolo wrote the article about what it is, what it needs, and what it does recently, and you can read that article here.

…and the list just keeps going and going.  I urge you to download the evaluation software and try it out by clicking on the appropriate link:

Windows Server

System Center 2012

Windows 8

In addition to downloading the software and reading our articles, you could have a chance at winning your own lab computer by participating in the free Microsoft Virtual Academy.  To have a chance to win an HP EliteBook Revolve and two chances to win 400 Microsoft Points, enter here.  Complete two TechNet evaluations and take the selected Microsoft Virtual Academy courses for your chance at a $5,000 grand prize!

Windows Fever… Catch It!

This morning we in the field of Information Technology awake to a new reality… A virtual reality if you will, but one that is quite real.

On July 5th I posted an article in this space called A Response to VMware’s ‘Get the Facts’ page comparing vSphere to Hyper-V and System Center.  In the four weeks since the article was published it has become the fourth most-read article on my blog (I have 437 articles publicly posted, dating back several years… the statistics cited are since the re-launch of The World According to Mitch in November of 2010).  It is certainly the most discussed and commented on.

The first comment, from a manager at VMware, says that we should compare what is in the market TODAY with what’s in the market TODAY, and since vSphere 5.1 was (and remains) in a private beta, we should not discuss Windows Server 2012.

Today Microsoft is releasing to manufacturing (RTM) Windows Server 2012, and while VMware is still the number one virtualization technology in the market with regard to market share, they have a lot to worry about with today’s release.

I have been saying for several years that when Microsoft puts its mind to something (as well as its considerable financial and intellectual resources) you should never bet against them.  In February of 2008 they released Hyper-V, and two years later they released Hyper-V 2008 R2.  The former was decent, but (as VMware enthusiasts were quick to point out) lacked a lot of the features that enterprise IT departments needed.  The second release did a good job of adding many of those features, and with Service Pack 1 came even more features.

I have been a Hyper-V evangelist for a little over two years now, and I have seen the writing on the wall.  Even with Hyper-V 2.1 (2008 R2 SP1) Microsoft offered most of the features and functionality that businesses needed and wanted, but at a fraction of the cost.

Today, with the launch of Hyper-V 3.0, the circle is now complete.  The technological advantages of VMware have evaporated in the momentum of progress that Microsoft has made to Windows Server, Hyper-V, and System Center.

Over the course of the coming weeks and months you will be reading a lot about Hyper-V, both from myself and others.  If you are Canadian you might want to come out to an IT Pro Boot Camp offered by Microsoft Canada.  Even if you are not, I encourage you to download the preview and try it.  Play with it, and when you read about tricks in blog articles try them yourself.  It will not take long for you to realize that it is not just hype surrounding Windows Server 2012, it is substance, it is momentum, and it is a new era of server capabilities, without having to pay a fortune for the privilege.

Welcome to Server 2012 my friends… It is going to be an exciting one!

The Demise of Windows SBS

On Thursday July 5th Microsoft made an announcement that shocked the small business IT world, although it came as little surprise to the SBS MVPs who had, I am told, been informed (much to their communal displeasure) months ago.

Microsoft Windows Small Business Server, a product that launched a thousand (and more) careers in IT, that has a loyal and vocal following that has blossomed since the very early days of Windows NT, is in its final iteration as we know it. (See the official announcement on the Windows SBS blog: Windows Small Business Server Essentials becomes Windows Server 2012 Essentials)

I couldn’t even begin to tell you when I first started writing about SBS – it was on newsgroups way before I started blogging.  I do remember when I first heard about it.  You would think that it would have been in one of the many MOC courses I had taken on Windows Server, Active Directory, and so on, but it wasn’t.  I was actually in a job interview for a company called Poppy Industries that would eventually hire me.  Fred Blauer – a consultant that company used – asked me how I would configure the infrastructure for a given organization, and I told him that it would require six servers – a domain controller (two if they were smart), a mail server, a database server, a SharePoint/web server, a firewall, and a file server.  Fred said to me ‘Would you consider using Small Business Server?’ He proceeded to tell me what SBS was – a single Windows Server box that was a domain controller, Exchange Server, SQL Server, SharePoint Server, ISA Server, and more… all for just under $2,000.

Obviously back then I knew everything, and I told him that no such product existed.  He opened up a web browser and showed it to me.  I told him that what he (and the page) was telling me was going to break every rule of enterprise best practices, but I would definitely look at the product and see if it was really all that.

I did… and I fell in love.

How cool was it that Microsoft had taken all of those products that I had been learning about and, rather than requiring an investment in six individual servers (virtualization was not yet a serious option), put it all into one relatively low-end box?

Over the next few years I spent most of my career working in SBS.  I deployed and supported it for dozens of customers, supported the community, and for a short time I was even an SBS MVP.  I wrote courseware and an exam for SBS 2003 and the exam for SBS 2008 for Microsoft Learning, lectured on it to dozens of groups around the world, and wrote numerous articles for my blog.  On January 17, 2007 the Microsoft Canada DPE team’s blog (IT Pro Connection) published my article ‘Why I am not an SBSer’.  To say that it ruffled a few feathers is the very nicest thing that can be said about it.  I would go so far as to say that it was the beginning of the end of my amicable relationship with the entire SBS MVP group.

Over the last few years many of you have heard me predict the end of SBS.  In fact months ago I submitted a session to SMB Nation (which I will present at that conference this October in Las Vegas) called ‘The SWMI Vision of SMB IT in a post-SBS Era’.  I had no inside information when I coined the term ‘Post-SBS Era’ but it looks like I finally called one right.

Now to be fair, I had been predicting for years that the next version of SBS – that is, the successor to SBS 2011, which is a full-blown viable product – would rely heavily on virtualization, separating the roles onto multiple servers – AD, Exchange, SharePoint, and SQL on four different virtual machines.  That did not happen.  I predicted that the future of SBS would look more like an enterprise datacentre in a single box with several virtual machines, all managed by System Center 2012.  I never liked the idea of a ‘Windows Server Essentials’ that would encourage the SMB to use cloud-based Exchange services, but alas, that is what we are seeing as the future of the product.  The idea was first floated to the SBS MVPs at least two years ago (and maybe three) at the MVP Summit… it was long enough ago that I was still attending SBS sessions at Summit, and the outcry against it was loud and strong.

So today when the announcement was made, it was indeed a sad day for SBSers, although not one that was unforeseen and certainly not one that was unexpected.  My SBS MVP friends will probably have to find new categories to fall into, as I had to when EBS was discontinued.  I expect some of them will transition to a new award category called Windows Server Essentials MVP, and others will find new categories.  I expect that some will resign the award in protest, and others, having lost the passion, will stop contributing to community and will eventually lose the award.

Most, however, will adapt and persevere.  Greg Starks, SMB Solutions Program Manager for Hewlett Packard, wrote this on his Facebook page, and I could not have put it better myself:

To all my SBSer, MVP and SMB IT Pro friends… I know today’s SBS End-Of-Life news is kind of a kick in the gut to some of you, but remember that it’s YOU, not the product, that helps your small business customers succeed.  The market will adapt and so will you.  You are too good at what you do to let the comings and going of a single product inhibit your success.  No matter which OS, SMB IT Pros RULE!!!!

I am looking forward to seeing Greg at WPC next week… He and his team at HP have done so much to support the SBS community over the past several years, and I expect they will continue to do so in the future… no matter what the product is called.

In the meantime, I will raise a toast to all of my friends, past and present, who are mourning the loss of SBS this week.  My thoughts are with you!

Managing Your SMB IT Without a Server


You have a small business.  You have been running Windows Small Business Server 2003 for six years, and you know that it is time to retire it.  The question is, what should replace it?

Before you make any definitive decisions, why not review what you need your server to do:

  • File Server
  • Mail Server
  • Internet Portal
  • Centralized Management

For the past several years you have paid a consultant to manage the server and your client PCs, and have primarily called him in for break-fix issues.  Maybe you were industrious and decided to learn the basics of IT so you could do a lot of the maintenance yourself.  You might even be a small-business IT consultant who has been managing and maintaining SBS environments for your clients.

You have heard so much about the cloud that you are in a bit of a fog… you know that people are talking about cloud-services, but haven’t quite figured out how they can work for you… to save you money, to earn you money.

Replacing the Server

For most small businesses I still recommend a centralized server; Active Directory is still the best mechanism you will find for centralized user management, and Group Policy allows you to lock down your environment.

With that being said, many of the functionalities offered in Microsoft Small Business Server are now available as part of two cloud-services offerings from Microsoft.  Microsoft Office 365 offers all of the functionality listed above (File Server, Mail Server, Internet Portal) and much more.  It is actually all of the following products in the cloud:

Office 365 allows you to have the functionality of all of these tools… without having to purchase or maintain them.  It also means that you will always have the latest versions of all of these… without having to upgrade.  ‘Your servers’ will be maintained by the Microsoft IT team, without your having to pay them hundreds of dollars per hour.  If any of your services go down (and admittedly they do occasionally) you can rest assured that before you even discover the outage the people who know the products best will already be well on their way to fixing the issues.

Managing the Desktop

Between the operating system and the applications, there is a lot of work that goes into the proper maintenance of your PCs.  That includes anti-malware, patch management, policies, and more.  Additionally, being able to generate and view reports is a huge benefit – as I always say: If you cannot measure it, you cannot manage it!

Before we get into the application side of things, let’s discuss the benefits of the second cloud-services offering, Windows InTune.  InTune installs as a simple agent on your Windows PC, and the list of benefits is amazing:

  • Upgrade rights to Windows 7 Enterprise
  • Windows InTune Endpoint Protection (centralized anti-malware solution)
  • Centralized Patch Management
  • Policy Deployment
  • Application Deployment
  • Device Reporting
  • Alerts
  • License Management

When you subscribe to Windows InTune (per-PC subscription) you get the right to upgrade your legacy Windows client (Professional/Business/Enterprise SKUs) to Windows 7 Enterprise.  Right there you have the basis for the common operating system required to simplify management.

Windows 7 Enterprise Edition includes two features that Business Edition does not:

  1. Multiple language support; and
  2. BitLocker drive encryption technology

With the prevalence of mobile computing these days, as well as organizations doing business around the world, there is no question that Windows 7 Enterprise is an easier feature-by-feature sell than the lower-priced options; but that lower price so often seems to be the deciding factor.  With the Use Rights in Windows InTune you don’t have to settle.

Once the Windows InTune agent is deployed on a PC it will start feeding the individual computer’s information into the InTune system, and you will be able to get a better idea of what you have.  On the Devices screen you will be able to see:

  • Computer Name
  • Chassis Type
  • Manufacturer & Model
  • Operating System
  • Total Disk Space
  • Used Disk Space
  • Free Disk Space
  • Physical Memory
  • CPU Speed
  • Last User to Log On
  • Serial Number
  • Last Hardware Status
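As an aside, most of the fields on the Devices screen are facts any machine can report about itself.  Purely as an illustration – this is not the InTune agent or any InTune API, just a sketch using Python’s standard library – here is how a few of those same categories could be read locally:

```python
# Illustrative sketch only: approximates a few of the inventory
# categories the InTune Devices screen shows, using nothing but the
# Python standard library on the local machine.
import getpass
import platform
import shutil


def local_inventory(path="/"):
    """Return a small dictionary of device facts for the given disk path."""
    total, used, free = shutil.disk_usage(path)
    return {
        "Computer Name": platform.node(),
        "Operating System": f"{platform.system()} {platform.release()}",
        "Total Disk Space": total,   # bytes
        "Used Disk Space": used,     # bytes
        "Free Disk Space": free,     # bytes
        "Last User to Log On": getpass.getuser(),
    }


if __name__ == "__main__":
    for field, value in local_inventory().items():
        print(f"{field}: {value}")
```

The point of a tool like InTune, of course, is that you get this information for every managed PC in one central console, rather than running a script on each machine.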

Included in the Windows InTune installation is the Windows InTune Endpoint Protection engine, which will protect your PCs from malware.  It uses the built-in patch management system to keep the definitions up to date, and offers real-time protection, as well as centralized reporting and e-mail alerts to the Help Desk / Support Team / IT Guy when a computer is infected.

InTune 2.0 added the ability to centrally deploy applications to client PCs.  InTune 3.0 adds to this the ability for end-users to install published applications on-demand.  The new Company Portal allows users to help themselves on-line, before eventually ‘escalating the call’ to you.

Users can also deploy the InTune client themselves from the portal, provided they have the proper credentials.  They can download the client using their corporate credentials, rather than you having to send them the installation file (along with the ACCOUNTCERT file) – which would, in theory, allow anyone to install it on any device, and that device would then automatically be managed by… you.

By far the most common application suite found on desktops in the workplace is Microsoft Office.  The most common complaint I hear about Office is the cost (followed by the difficult-to-understand SKUs).  With Office in the name, it is no wonder that it is part of Office 365.

There are, of course, several different SKUs of Office 365, and each one has different offerings.  The small business SKU (P1) costs $6/month, and does not include the installable suite.  However it does include Office Web Apps, which means you can create and edit Word documents, Excel spreadsheets, PowerPoint presentations, and of course use OneNote… all within your web browser.  This is great if you work on multiple systems, or if you are ever remote and need to work on a document.  The convenience loses its thrill when you realize you cannot work if you don’t have an Internet connection.

The E1, E2, and E3 SKUs do come with the client software, so if that is a requirement then those SKUs (which cost quite a bit more) are probably better for you.

Why you should consider maintaining a server on-site

Our mail server is gone… so are our SharePoint and File Servers.  Why then would I still recommend a small server in a small business environment? There are several reasons.

  1. Active Directory.  As I mentioned earlier in the article, AD is a great way to centralize security and credentials.  Additionally there are plenty of hooks from Active Directory into Office 365 (which can be covered in a later article).
  2. Deployment Server.  Microsoft Deployment Toolkit 2012 is the perfect companion to your new Windows 7 Enterprise licenses.  In under an hour you can create a deployment point that will deploy Windows and all of your applications (including the Lync Client and the Windows InTune agent) in fifteen minutes (or less).  It is by far the easiest way to deploy Windows to your desktops, laptops, and even tablets!
  3. Hyper-V.  Although many of our applications will be installed directly onto the laptop, many companies still have server-based applications that require an application server.  Hyper-V is the best way to create those servers on-site, for a plethora of reasons that have been outlined ad nauseam previously, here and on countless other sites.  Of course, your virtualized application servers can run any version of the Windows Server operating system, but they can also run any supported client OS, as well as several iterations of Linux (supported and enlightened) and any other x86-based OS (neither supported nor enlightened).
  4. Group Policy.  Although Windows InTune v3 has much better policy support than its predecessors, there is no denying that Group Policy is the best way to granularly control, configure, and secure your client and server environments.  Whether you want to enforce secure passwords, BitLocker, or simply set a centralized screen saver and desktop wallpaper, the best way to do it is by creating a GPO in Active Directory.

As you can see, the combination of cloud-based services from Microsoft and an on-site Windows Server is the best way to manage your entire SMB IT infrastructure – but even if you are not going to maintain an on-premise server, the cloud-based services can meet most of the needs of most SMBs.

By the way, there is one more advantage to these solutions… you will always have the latest and greatest.  Right now the Windows InTune subscription comes with use rights for Windows 7 Enterprise.  When Windows 8 is released, you will automatically have access to that platform.  Office 365 comes with Office 2010… but when the next version is released you will have that version right away too!

Interested in hearing more?  Drop me a line and we’ll talk… or you can download 30-day trials of each!