WindowsNetworking.com - Monthly Newsletter - November 2015

Welcome to the WindowsNetworking.com newsletter by Debra Littlejohn Shinder, MVP. Each month we will bring you interesting and helpful information on the world of Windows Networking. We want to know what all *you* are interested in hearing about. Please send your suggestions for future newsletter content to: dshinder@windowsnetworking.com

 

1. The Evolution of Data Storage

It seems as if storage is the one thing in life that we never have quite enough of. About ten years ago, my husband and I moved from a 1500 square foot house to a 3700 square foot one. One of the things that most impressed me about my new home at the time was the big closet in the master bedroom. I remember my son saying “I could live in here.”

Fast forward a decade and that huge closet is packed full, as are the several other walk-in closets in the house. Murphy might not have mentioned it, but Shinder’s Law of Storage Space says that the amount of “stuff” you have will always expand to fill the amount of space you have – and then a little more.

It’s just as true in the context of digital data as it is in home design. I still recall how excited I was when I got my first hard drive, which held an incredible 10 MB (yes, megabytes) of data. Prior to that, everything was stored on big old floppy disks (the 5 ¼ inch kind that really were floppy, unlike the rigid 3 ½ inch “floppies” that came along later). For those of you who are too young to remember, when I say “everything,” I mean everything. My first IBM PC had two floppy drives: you loaded the operating system from one drive and saved your data on the second one. Each floppy held 180 KB, or a whopping 360 KB if it was a double-sided disk.

That 10MB hard drive was the first of many, many hard drives I bought over the years, either as part of a new computer or as a component that I added when the original one got too full. Back then, adding a new drive wasn’t quite as easy as plugging in a USB cable; you had to actually open up the case. But people take drastic measures when they’re out of room. We’re considering remodeling the house to add more storage space, and when we started feeling crowded back in the glory years of computing, we didn’t hesitate to perform PC surgery to rectify the situation – for a while, at least.

Over the years, those hard drives kept growing and growing. In the mid-90s I thought I had reached the pinnacle when I popped the panel on my amazingly fast 486 and added a drive that held a whole gigabyte of files. Today my primary desktop computer has 1.5 terabytes of storage – and that’s actually less than the one I had right before it, because I went SSD with this one. Heck, even my phone has 96 GB – 32 GB internal and a 64 GB microSD card.

Here’s the kicker, though: Out of that 1.5 TB of space, I still (after almost two years) have 1.1 TB free, empty, unused. How did that happen? And given that huge hard drives can be had these days for pennies per gig, why is it that many of today’s laptops and desktops come with a measly 500 GB or so of storage space? Does that seem to violate Shinder’s Law above?

It’s certainly not that we’re creating fewer data files or that those files are smaller. High resolution video is all the rage, and a 3 minute 1080p MPEG2 file can be almost 2 GB in size. We’re creating and downloading high res video and multi-megapixel photos by the dozens. The difference is in where we’re storing our stuff. A while back, we stopped keeping all of our files on our individual computers at our house and started putting them on our file server. Now that beast has 4 TB of drive capacity, two terabytes of which are on external drives that are easy to add or swap out. And it’s all backed up to a second server.

Those servers take up a lot of the other kind of space, though – room in our house that could better be used for that other kind of storage (clothes, suitcases, holiday decorations and all those “keepsakes” that we can’t throw away because of sentimental attachments – not to mention the boxes and boxes of books and the Museum of Obsolete Electronics). So we are now looking at moving some of our data again. Where can we put it that will allow us to get rid of a server or two, reduce our electric bills and free up part of the server room for other things? The answer is obvious: the cloud.

Public cloud storage services abound – some targeted at consumers, some at businesses and some at both. The concept has been around longer than you might think; in the early 1980s, online services such as CompuServe gave users space where they could upload files, albeit not very much (a little over 100k). Many of us who had web sites hosted by ISPs in the early 1990s used our web space to upload (via FTP) and store some files that weren’t actually linked to the HTML pages that displayed our sites.

Then along came real online storage services. The first ones were mostly marketed for backup purposes. Rather than having the cloud be your data’s primary resting place, you kept it on your local machine or local network but uploaded a copy of it to a remote storage location. These became popular in the 1990s.

After the turn of the century (and millennium), online storage services exploded onto the market. Everybody and his dog had an Internet connection by the middle of the 2000s, and backup services like Mozy and Carbonite offered consumers a place to put their data where it would be safely off-site in case of a physical catastrophe. Amazon Web Services started up in the early 2000s and launched the S3 storage service in 2006.

In the meantime, mobile computing was on the rise. Laptops gained popularity over desktops, and smart phones gave us computers that fit in our pockets. In 2008, we got storage services such as Dropbox, where the idea was to put your files in the cloud and access them from anywhere, with any device. Soon Google was offering Google Drive, Microsoft was offering SkyDrive (now OneDrive), and Amazon, which still had S3 for devs and businesses, was offering Amazon Cloud Drive for consumers. There were soon dozens of other alternatives such as Box, Livedrive, Apple’s iCloud, and many more.

Most of today’s file hosting services synchronize your files across multiple devices and allow you to share the files with whomever you want or keep them private, giving you seamless cloud functionality.

According to this article, the IBM “disk storage unit” in 1956 cost $3200 per month with a storage capacity of 3.75 MB of data. Today’s cloud storage services allow customers to store gigabytes of information at low cost or free of charge. Fierce competition driving the race to the bottom for online storage prices resulted in unsustainable (but nice while it lasted) offerings such as Microsoft’s unlimited free OneDrive storage plan, which has now been cut back to one terabyte.

You can still get a good bit of free online storage, though, especially if you’re willing to spread your data out across multiple services. Microsoft currently offers 15 GB of free storage to those who don’t have Office 365, but has announced plans to cut that back to 5 GB next year. Google provides 15 GB free. Amazon Cloud Drive gives you unlimited photo storage plus 5 GB for other files free with a Prime membership. A free account with Dropbox will snag you another 2 GB. Box gets you 10 GB free.

Mega provides a whopping 50 GB of free space, and includes end-to-end encryption, but also imposes a 10 GB bandwidth limit. Copy is a cloud storage service that gives you 15 GB at no cost and you can “earn” 5 GB more for every friend you refer who uses the service. StreamNation provides a generous 20 GB of storage and lets you send files to Chromecast, but you can only share files and folders with others who also use StreamNation. You can get 25 GB free with hubiC and 50 GB free with SFShare, although shared folders aren’t allowed and you can only upload one file at a time. You can check out even more free online file hosting services here. Even phone carriers such as Verizon and Sprint offer their customers cloud storage space.

When you move up to paid services, you’ll find even more options. Google Drive offers a terabyte of storage for $120/year, you can get the same amount for $99 with Dropbox or $59.50 with iDrive, and Amazon Cloud Drive beats them all with unlimited storage for $59.99 per year, although given Microsoft’s experience with unlimited, it’s hard to guess how long that will last. Most of the storage providers mentioned above also have plans that are targeted at businesses, and others, such as Barracuda and Acronis, focus on business customers.

Where and how we store our data has undergone quite a few changes over the last 30 years and there’s little doubt more changes are over the horizon. With cloud computing becoming the new norm, it’s likely that most of the new storage technologies will be aimed at high capacity data centers. Helium-filled drives that use less power to spin the disks and allow for larger capacity (Western Digital has announced one that holds 10 TB) are one technology of the future. Even more futuristic is the idea of storing data in DNA molecules. Holographic storage was the great hope for a long time, but seems to have somewhat faded from the picture in recent years. Quantum computing, once we have a better understanding of quantum physics, is likely to bring other new ways of storing data as well as transferring it.

Although we might not know at this time what the next big breakthrough in storage tech will be, we can be pretty sure that it will enable us to store even more information and access it even more quickly. It’s been exciting to be around to watch the progress made in this area over my adult lifetime. Now if only someone would come up with a way to buy an external USB closet and plug it into my house…

‘Til next time,

Deb

dshinder@windowsnetworking.com

=======================

Quote of the Month

Don’t jump to conclusions. Just because your spouse says “I need more space,” it doesn’t mean your marriage is in trouble. It might just mean he/she is lusting after a new four terabyte hard drive. - me

=======================

 

2. Windows Server 2012 Security from End to Edge and Beyond – Order Today!

Windows Server 2012 Security from End to Edge and Beyond

By Thomas Shinder, Debra Littlejohn Shinder and Yuri Diogenes

From architecture to deployment, this book takes you through the steps for securing a Windows Server 2012-based enterprise network in today’s highly mobile, BYOD, cloud-centric computing world. Includes test lab guides for trying out solutions in a non-production environment.

Order your copy of Windows Server 2012 Security from End to Edge and Beyond. You'll be glad you did!

   


Click here to order your copy today.

 


3. WindowsNetworking.com Articles of Interest

This month on WindowsNetworking.com, we introduce one brand new topic and continue with new installments for three popular article series.

Wi-Fi Site Survey Tips

Properly deploying and maintaining a wireless network is much different from doing the same for a wired network. Wi-Fi signals can have relatively low, unpredictable, and varying ranges inside buildings, as the walls, furniture, people and other objects within attenuate and reflect the signal. Additionally, you can’t control the airwaves and must share them with others. This standalone article by Eric Geier looks at how performing a Wi-Fi site survey can help you overcome these obstacles and deal with interference.
http://www.windowsnetworking.com/articles-tutorials/wireless-networking/wi-fi-site-survey-tips.html
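
If you just want a quick, rough picture of the airwaves around a Windows client before breaking out a dedicated survey tool, the built-in netsh command will do the trick (this isn’t from Eric’s article, just a handy starting point):

netsh wlan show networks mode=bssid

# From PowerShell, you can also log your current connection's signal strength
# as you walk the building:
$signal = (netsh wlan show interfaces | Select-String 'Signal' | Select-Object -First 1).ToString().Trim()
'{0}  {1}' -f (Get-Date -Format 'HH:mm:ss'), $signal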

Active Directory Insights (Part 9)

This very comprehensive series on Active Directory by Mitch Tulloch has already addressed virtual domain controllers, NIC teaming, DC hardware sizing, read-only DCs, trusts and configuring DNS. This time, in Part 9, Mitch goes into how to automate the provisioning of user accounts via Windows PowerShell.
http://www.windowsnetworking.com/articles-tutorials/windows-server-2012/active-directory-insights-part9.html
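
If you haven’t tried that yet and want a taste of what it looks like, here’s a minimal sketch of my own (not Mitch’s code; the CSV path, column names, OU and domain are all made-up placeholders) that bulk-creates accounts from a CSV file with the ActiveDirectory module:

Import-Module ActiveDirectory

# Each row of the (hypothetical) CSV is expected to contain FirstName, LastName,
# SamAccountName and InitialPassword columns.
Import-Csv 'C:\Temp\NewUsers.csv' | ForEach-Object {
    New-ADUser -Name "$($_.FirstName) $($_.LastName)" `
        -GivenName $_.FirstName `
        -Surname $_.LastName `
        -SamAccountName $_.SamAccountName `
        -UserPrincipalName "$($_.SamAccountName)@contoso.com" `
        -Path 'OU=Staff,DC=contoso,DC=com' `
        -AccountPassword (ConvertTo-SecureString $_.InitialPassword -AsPlainText -Force) `
        -ChangePasswordAtLogon $true `
        -Enabled $true
}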

PowerShell for storage and file system management (Part 5)

In his latest series on using PowerShell to manage storage and file system tasks, Brien Posey has discussed a number of the storage-related administrative actions that you can perform using PowerShell, beginning with the simple task of checking the health of a physical disk. In this installment, Part 5, Brien focuses on how you can retrieve SMART data from multiple servers via PowerShell.
http://www.windowsnetworking.com/articles-tutorials/netgeneral/powershell-storage-and-file-system-management-part5.html
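
To give you a flavor of the cmdlets involved, here’s a rough sketch (not Brien’s actual code): the server names are placeholders, the targets need a recent version of Windows Server, and PowerShell remoting must be enabled on them.

# Check basic disk health and pull SMART-style reliability counters from a few
# servers at once via PowerShell remoting.
$servers = 'FS1', 'FS2', 'FS3'

Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-PhysicalDisk |
        Select-Object FriendlyName, HealthStatus, OperationalStatus

    Get-PhysicalDisk | Get-StorageReliabilityCounter |
        Select-Object DeviceId, Temperature, ReadErrorsTotal, Wear, PowerOnHours
}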

Hybrid Network Infrastructure in Microsoft Azure (Parts 8 and 9)

In this series, I’ve delved pretty deeply into the topic of hybrid network infrastructure and hybrid cloud, and the networking functionality that you get when you adopt Azure Infrastructure Services. We looked at data center extension architecture, site-to-site and point-to-site VPNs, the Azure dedicated WAN link service (ExpressRoute) and the Azure Virtual Gateway. Then we moved on to Azure Virtual Networks and how to use PowerShell to configure internal load balancing. In Part 8, we continue that discussion and also introduce Network Security Groups.
http://www.windowsnetworking.com/articles-tutorials/cloud-computing/hybrid-network-infrastructure-microsoft-azure-part8.html
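
For a sense of what a Network Security Group looks like in PowerShell, here’s a rough sketch using the Azure Resource Manager cmdlets (the article may take a different route, and the rule, NSG name, resource group, location and address ranges here are all hypothetical):

# Allow RDP from an on-premises subnet; anything not matched by a
# higher-priority rule falls through to the NSG's default rules.
$rdpRule = New-AzureRmNetworkSecurityRuleConfig -Name 'Allow-RDP' `
    -Description 'Allow RDP from the on-premises subnet' `
    -Access Allow -Protocol Tcp -Direction Inbound -Priority 100 `
    -SourceAddressPrefix '10.0.0.0/24' -SourcePortRange '*' `
    -DestinationAddressPrefix '*' -DestinationPortRange 3389

New-AzureRmNetworkSecurityGroup -Name 'FrontEnd-NSG' `
    -ResourceGroupName 'HybridRG' -Location 'West US' `
    -SecurityRules $rdpRule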

In Part 9, we moved on to talk about virtual machine access control lists (ACLs) and how to configure them to allow inbound traffic from the Internet.
http://www.windowsnetworking.com/articles-tutorials/cloud-computing/hybrid-network-infrastructure-microsoft-azure-part9.html
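
Just to illustrate the idea, an endpoint ACL that permits inbound traffic only from a specific address range looks something like the sketch below. This is my own quick example with a made-up cloud service, VM, endpoint and subnet, using the classic Service Management cmdlets, so it may not match the approach taken in the article:

# Build an ACL with a single Permit rule; once a Permit rule exists on an
# endpoint, everything else is denied by default.
$acl = New-AzureAclConfig
Set-AzureAclConfig -ACL $acl -AddRule -Order 100 -Action Permit `
    -RemoteSubnet '203.0.113.0/24' -Description 'Allow the office IP range'

# Apply the ACL to an existing endpoint on a classic VM.
Get-AzureVM -ServiceName 'MyCloudService' -Name 'WebVM1' |
    Set-AzureEndpoint -Name 'HTTP' -Protocol tcp -PublicPort 80 -LocalPort 80 -ACL $acl |
    Update-AzureVM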

 

4. Administrator KB Tip of the Month

Failure to back up EFS key
by Mitch Tulloch

A tip about what you can try if you move a drive containing EFS-encrypted files to a different computer without backing up your encryption key and certificate.

Your computer won't power up anymore, so you remove the hard drive and install it as a second drive in a different computer. Then you realize that you've encrypted some of the files on the drive using EFS but you failed to back up the encryption key and certificate to removable media. Does this mean those files are lost forever?

Maybe. But first you could try using the reccerts.exe utility to recover the certificate as described in this thread on the Security forum on TechNet:
http://www.wservernews.com/go/1348060049487

You can obtain this utility by opening a ticket with Microsoft Support:
http://www.wservernews.com/go/1348060054018

And if you don't want to pay for support on this issue, you could try following the somewhat advanced instructions found here:
http://www.wservernews.com/go/1348060059690

I've known at least one person who successfully recovered encrypted files using this procedure, but I haven't tried it myself and can't guarantee the results. In any case, you might want to clone the drive before you try to recover encrypted data from it.
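
And for next time: it only takes a minute to back up the key before anything goes wrong. The built-in cipher.exe exports your EFS certificate and private key to a password-protected .pfx file (the path below is just an example; copy the resulting file to removable media and store it somewhere safe):

# Back up the EFS certificate and private key to a .pfx file; you'll be
# prompted to confirm and to set a password on the file.
cipher /x "$env:USERPROFILE\Documents\EFS-KeyBackup"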

The above tip was previously published in an issue of WServerNews, a weekly newsletter from TechGenix that focuses on the administration, management and security of the Windows Server platform in particular and cloud solutions in general. Subscribe to WServerNews today by going to http://www.wservernews.com/subscribe.htm and join almost 100,000 other IT professionals around the world who read our newsletter!



5. Windows Networking Links of the Month

This month, many of us who have upgraded computers to Windows 10 will be dealing with its first major update. Here are some links to see you through and help you avoid the pitfalls:

Meanwhile, there are some folks who haven’t upgraded in a long, long time:

And a few more links of interest:

6. Ask Sgt. Deb

Do I really need a hybrid cloud?

QUESTION:

It’s not that I’m lazy – LOL. I’m overworked and underpaid, and now the boss at our medium-sized company thinks we need to get in on the cloud craze, but without going too far. In other words, I have to implement a hybrid network and hook up our on-premises network to the cloud. We know we want to go with Azure since we’re a Microsoft shop all the way. – Dennis C.

ANSWER:

The whole point of the cloud is to make things easier (and cheaper), but hybrid cloud can get complicated. I just ran across something last month that might work for you. It’s basically “Azure in a box”: at Dell World, Satya Nadella joined Michael Dell on stage to introduce Dell’s version of this system, called the Cloud Platform System Standard. It integrates servers, storage and software designed to connect to Azure with minimal effort.

Check out this article about it: