Seven steps Windows Server administrators should consider

Published on 26 May 2011 / Last Updated on 26 May 2011


Introduction

I’ve been working with Windows since the earliest days of Windows NT, while also working with the then-stalwart NetWare product. While there has certainly been change in that time, that evolutionary change pales in comparison to the confluence of events hitting Windows administrators today. In this article, I will describe what I see as the major events coming down the pike that administrators should take notice of. I will also point you to some resources that provide additional thoughts and direction on each topic.

1. Start honing your IPv6 chops

The Internet is almost out of IPv4 addresses. Over the years, many workarounds, such as Network Address Translation (NAT), have been implemented to extend the life of the address pool, but those efforts will not stall forever the need to move the world to the new IP standard, IPv6. Although IPv6 is not yet required everywhere, some Windows 7 and Windows Server 2008 R2-based services – most notably, DirectAccess – either depend on IPv6 or simply work better with it. Windows system administrators don’t necessarily need to become overnight experts in this networking technology, but it makes sense to start studying it now to get the basics down so that you’re prepared when it comes time to implement the standard throughout your organization.
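If you want to experiment with IPv6 notation before touching production gear, Python’s standard ipaddress module makes a handy scratchpad. The sketch below uses the 2001:db8::/48 documentation prefix purely for illustration; it shows compressed versus exploded notation and just how much larger the IPv6 space is than all of IPv4:

```python
import ipaddress

# An IPv6 address is 128 bits, written as eight groups of four hex digits;
# leading zeros and one run of all-zero groups can be compressed with "::".
addr = ipaddress.IPv6Address("2001:0db8:0000:0000:0000:0000:0000:0001")
print(addr.compressed)   # 2001:db8::1
print(addr.exploded)     # 2001:0db8:0000:0000:0000:0000:0000:0001

# A common site allocation is a /48; each LAN segment gets a /64, which
# leaves 16 bits for subnetting -- 65,536 possible subnets per site.
site = ipaddress.IPv6Network("2001:db8::/48")
print(site.num_addresses)             # 1208925819614629174706176 (2**80)
print(next(site.subnets(new_prefix=64)))  # 2001:db8::/64

# Compare with the entire IPv4 address space:
print(ipaddress.IPv4Network("0.0.0.0/0").num_addresses)  # 4294967296
```

Even one /48 site allocation dwarfs the whole IPv4 Internet, which is why subnet planning in IPv6 is about structure, not conservation.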

Here are some resources to help you in your journey to understanding IPv6:

2. Begin investigating the cloud, but be careful…

I’ve written before about some of my thoughts about moving services to the cloud. Before I get into some additional thoughts on the cloud, this bears repeating and remembering: Moving services to the cloud is the new way of saying that services will be outsourced, managed and controlled by someone else. There are definitely some services that can be easily and securely moved into the cloud but there are others that simply must remain internal.

Communications services such as e-mail are ripe for the cloud, but core business services such as identity management and anything that carries significant trade secrets should be kept internal.

As you are investigating cloud services, keep a few things in mind:

  • There is no such thing as 100% uptime. If a cloud vendor is offering you a 100% guarantee, you’re being lied to. The recent multi-day Amazon outage is a stark reminder that even the most carefully engineered infrastructure can fall victim to the unexpected.
  • Another lie: “You don’t need any support from your IT department in order to use our service.” I hear this all the time from vendors, and in every case the exact opposite is true. The IT department simply must be fully engaged in any effort to move services to the cloud or to add new ones.
  • A thorough review of your intended vendor’s security practices and policies is in order. Make sure that your vendors comply with the regulations necessary for your industry.
  • Don’t be afraid of the cloud – or of outsourcing very specific services. Just don’t be too eager to jettison what could be a robust infrastructure based on vendor claims.
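To put the uptime point in concrete terms, a few lines of arithmetic translate an SLA percentage into the downtime it actually permits per year. The tiers below are illustrative examples, not any particular vendor’s figures:

```python
# Translate an SLA "uptime" percentage into permitted downtime per year --
# useful for pressing a vendor on what its guarantee really means.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(uptime_percent: float) -> float:
    """Minutes of downtime per year still within the stated SLA."""
    return MINUTES_PER_YEAR * (1 - uptime_percent / 100)

for sla in (99.0, 99.9, 99.99):
    allowed = downtime_minutes_per_year(sla)
    print(f"{sla}% uptime still allows {allowed:,.0f} minutes of downtime per year")
```

Even "three nines" permits nearly nine hours of outage a year, so read the fine print on how the vendor measures and credits downtime.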

If you’re considering cloud services for your organization, here are some additional resources that you should review:

3. Review all of your licenses and make sure you’re not overpaying and that you’re in compliance

Licensing has changed a lot since virtualization became a mainstay in the data center. Many vendors have modified their licensing policies to reflect the fact that many workloads no longer run directly on physical hardware. Microsoft, for example, has taken great strides toward creating virtualization-friendly policies that make it easy to deploy software without having to jump through licensing hoops. If you haven’t recently reviewed your software licenses, now is the time. Are you taking full advantage of all programs offered by your vendors? More importantly, are you complying with all of your vendor licensing requirements, whether the licenses are for physical or virtual use? While some vendors have created virtualization-friendly licensing programs, others have decided that supporting virtual environments is not a good business model and have thrown up barriers in their licensing agreements. Pay attention to these clauses so you don’t end up on the wrong side of a legal issue.
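A quick break-even calculation can make such a review concrete. The license prices below are invented placeholders, not Microsoft’s actual figures – plug in the numbers from your own agreements:

```python
# Hypothetical comparison: per-VM licensing vs. a per-host licence that
# covers unlimited virtual instances. Prices are invented placeholders.
PER_VM_LICENSE = 900     # $ per virtual instance (hypothetical)
PER_HOST_LICENSE = 3600  # $ per host, unlimited instances (hypothetical)

break_even = PER_HOST_LICENSE / PER_VM_LICENSE
print(f"Per-host licensing wins once a host runs more than {break_even:.0f} VMs")

for vms in (2, 4, 8):
    per_vm_total = vms * PER_VM_LICENSE
    cheaper = "per-host" if PER_HOST_LICENSE < per_vm_total else "per-VM"
    print(f"{vms} VMs: per-VM ${per_vm_total:,} vs per-host ${PER_HOST_LICENSE:,} -> {cheaper}")
```

The point is not the specific numbers but the exercise: as VM density grows, the licensing model that was cheapest at deployment time may no longer be.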

Understand your Microsoft licensing options with these resources:

4. Make sure you’re fully leveraging your virtual assets

Unless you’re running services that absolutely must be housed on physical hardware, now is the time to virtualize, well, everything reasonable in your environment: your Exchange systems, SQL Server databases, Active Directory domain controllers, file servers and more. Then begin leveraging the benefits that come with the technology, including new ways to add high availability to the environment, easier methods for adding resources to specific workloads, more efficient use of hardware and lower costs.

Of course, your virtual infrastructure has to be able to keep up, so make sure that all of the important aspects – processing, memory and storage – are adequately addressed for all of the workloads that might run.
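A rough sanity check along these lines can be scripted before you consolidate. The host and workload figures below are hypothetical placeholders for your own measured peaks:

```python
# Quick sanity check: will the planned workloads fit the host's resources?
# All figures are hypothetical; a real sizing exercise uses measured peaks.
host = {"vcpus": 32, "memory_gb": 128, "storage_gb": 2000}

workloads = [
    {"name": "Exchange",   "vcpus": 8, "memory_gb": 32, "storage_gb": 500},
    {"name": "SQL Server", "vcpus": 8, "memory_gb": 48, "storage_gb": 800},
    {"name": "DC + file",  "vcpus": 4, "memory_gb": 16, "storage_gb": 400},
]

for resource in ("vcpus", "memory_gb", "storage_gb"):
    demand = sum(w[resource] for w in workloads)
    ratio = demand / host[resource]
    print(f"{resource}: {demand}/{host[resource]} ({ratio:.0%} committed)")
```

Headroom matters: commitments near or over 100% on any one resource – memory in particular – are where consolidated workloads start to suffer.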

Here are some links to resources for further reading that can help you see some possibilities in this venture:

5. It’s time to pilot VDI – do it

With the great success that many organizations have had with server virtualization, the desktop is the next great frontier on the way to enterprise efficiency and, possibly, lower costs. Desktop virtualization is a server-centric pursuit that can streamline desktop chaos, enable new ways to deliver key business applications to an increasingly mobile workforce, and create the conditions for efficient bring-your-own-device initiatives driven by the consumerization of IT.

Like server virtualization, desktop virtualization provides an abstraction layer that enables workloads to run on a variety of devices, from typical desktop computers to terminals to mobile devices such as iPads and even smartphones. Server virtualization has been successful because it allows organizations to fully harness the technology assets on hand. No longer do servers use just a fraction of their capacity; with virtualization, physical hardware is pushed to its limits, which is as it should be.

A similar situation exists at the desktop level, but executing on a corrective plan is much more complex due to the sheer variety of workloads involved. Every user uses his or her PC differently, even when the same work is being performed. As you look at the future of your desktop environment, it’s time to consider whether or not VDI fits into your specific picture.

TechGenix has a number of resources that can introduce you to VDI, its success and some options with regard to the technology:

6. Learn PowerShell

For many years, Windows administrators have relied mostly on traditional GUI-based management tools to perform key operational tasks. With just about every new software release from Microsoft, however, you might have noticed that the tools now include complete command-line management capabilities through PowerShell. For those admins who like to streamline ongoing administrative tasks, PowerShell has become a potent weapon in the arsenal.

Microsoft really got going with PowerShell with the release of Exchange 2007, but even VMware has gotten into the PowerShell game with their PowerCLI tool, which enables vSphere administrators to use PowerShell for a variety of tasks, including monitoring hosts and automating routine administrative tasks.

Here are some PowerShell resources to get you on your way:

7. Disaster recovery has many new options – Learn them

Disaster recovery used to be simple, back before IT infrastructures grew to encompass pretty much every aspect of the business. Back then, recovery consisted of retrieving tapes from off-site locations and rebuilding servers as necessary. Today, taking days to recover is not an option; recovery must happen in hours, and business must resume as quickly as possible. So organizations are augmenting their tape-based recovery mechanisms with disk-based and even cloud-based systems to further protect their environments and enable quicker recovery.

Obviously, if you’re considering cloud-based backup, the same concerns expressed earlier apply. Backups should be encrypted, and you need to watch the usage charges, particularly if your information changes a lot. Many cloud-based providers charge hefty fees for data storage and, on top of that, charge on a per-kilobyte basis for both incoming and outgoing transfers. So watch the bill, and understand up front how much your data is likely to change.
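It helps to model the charges before signing. The sketch below uses entirely hypothetical per-gigabyte rates – substitute your vendor’s real price sheet – to show how the change rate, not just the stored volume, can dominate the bill:

```python
# Rough monthly cost model for a cloud backup service. All rates here are
# hypothetical placeholders -- substitute your vendor's actual price sheet.
STORAGE_PER_GB = 0.15       # $/GB stored per month (hypothetical)
TRANSFER_IN_PER_GB = 0.10   # $/GB uploaded (hypothetical)
TRANSFER_OUT_PER_GB = 0.15  # $/GB downloaded, e.g. during a restore (hypothetical)

def monthly_cost(stored_gb, uploaded_gb, downloaded_gb):
    """Estimated monthly bill given storage footprint and transfer volumes."""
    return (stored_gb * STORAGE_PER_GB
            + uploaded_gb * TRANSFER_IN_PER_GB
            + downloaded_gb * TRANSFER_OUT_PER_GB)

# 500 GB retained, with 20% of it changing (and re-uploading) every day
# over a 30-day month, and no restores:
print(f"${monthly_cost(500, 500 * 0.20 * 30, 0):,.2f}")
```

In this made-up example the transfer charges are several times the storage charge, which is exactly the kind of surprise to uncover before the first invoice arrives.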

If you keep backups internal, it’s worth exploring disk-based backup methods that can keep up with ever-changing data so that you can recover to more points in time. Many backup tools, such as Microsoft Data Protection Manager, back data up continuously by writing only the blocks of data that have changed to the backup store. Older software performed operations at the file level, which could be inefficient because too much data is transmitted when only a small chunk of a file has changed.
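The block-level idea can be sketched in a few lines. This is a simplified illustration – fixed-size blocks compared by hash – not how Data Protection Manager is actually implemented:

```python
import hashlib

BLOCK_SIZE = 4096  # bytes; block-level products track changes at this sort of granularity

def changed_blocks(previous: bytes, current: bytes, block_size=BLOCK_SIZE):
    """Return (index, block) pairs for blocks that differ from the last backup.

    Only these blocks need to be shipped to the backup store, instead of
    re-sending the whole file the way file-level backup does.
    """
    changed = []
    for i in range(0, len(current), block_size):
        cur = current[i:i + block_size]
        prev = previous[i:i + block_size]
        if hashlib.sha256(cur).digest() != hashlib.sha256(prev).digest():
            changed.append((i // block_size, cur))
    return changed

# A 1 MB file in which a single byte changes: file-level backup re-sends all
# 1,048,576 bytes; block-level backup re-sends one 4 KB block.
old = bytes(1024 * 1024)
new = bytearray(old)
new[500_000] = 0xFF
delta = changed_blocks(old, bytes(new))
print(len(delta), "changed block(s),", sum(len(b) for _, b in delta), "bytes to transfer")
```

The savings compound across frequent backup passes, which is what makes continuous, multiple-recovery-point protection practical on disk.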

Disk-based backup presents its own challenges, of course. Disks are more difficult to move off site – which is often one of the reasons that disk-based backups are augmented with cloud-based services.

Summary

The world of the network administrator is changing, and administrators need to keep up with these changes in order to remain effective in their roles. These are just seven items that demand consideration to ensure that the IT infrastructure best meets the needs of the business.
