Oct. 31, 2006
Data Center

Strategies to Alleviate Time-Consuming Data Backups

New storage architectures and processes ease the strain of protecting vast amounts of information.

Electronic data is growing so quickly that I’m reminded of those low-budget horror films like Creature From the Black Lagoon and The Blob, in which runaway forces threaten Earth’s very existence. Perhaps a modern-day sequel to these cult classics should be Attack of the Killer JPEG File.

In reality, as a local government information technology manager, I’m not terribly worried about killer amounts of data. Thanks to hard drives with ever-higher capacities, we never seem to run out of room to store all the new data that our city creates each year.

What I do fret about, however, is backing up all this data.

I’ve been in this business long enough to remember the days of disk drives spinning in a back room that housed storage cabinets as tall as me. Each of those drives held less than a megabyte of data. Today, every one of our servers offers many times more storage capacity than that entire room did.

But with all this capacity comes the challenge of making sure we’re protecting the data effectively. What’s the best way to regularly back up gigabytes or terabytes of data in a cost-effective manner? Many third-party services perform high-volume backups for organizations like ours, but those services are out of our price range.

Taking Control

Instead, we’ve taken two big steps to bring our backups under control.

First, we’ve created a backup server in a disk-to-disk-to-tape configuration. We copy all of the data from our production servers (disk layer one) to a backup server (disk layer two), and then perform our tape backups from that backup server. This approach reduces the time our front-line servers have to devote to the backup process.
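For readers who want to picture the nightly flow, here is a minimal Python sketch of a disk-to-disk-to-tape pass. The server names, paths and tape device are illustrative assumptions on my part, not our city’s actual configuration, and the sketch presumes standard rsync and GNU tar on a Linux backup server.

    import subprocess
    from datetime import date

    # Hypothetical names and paths, for illustration only.
    PRODUCTION_SERVERS = ["fileserver-cityhall", "fileserver-remote1"]
    STAGING_ROOT = "/backup/staging"   # disk layer two, on the backup server
    TAPE_DEVICE = "/dev/st0"           # first SCSI tape drive on a Linux host

    def stage_from_production() -> None:
        # Disk to disk: pull each production server's data onto the backup
        # server, so front-line servers are tied up only for this quick copy.
        for host in PRODUCTION_SERVERS:
            subprocess.run(
                ["rsync", "-a", "--delete",
                 f"{host}:/srv/data/", f"{STAGING_ROOT}/{host}/"],
                check=True,
            )

    def write_to_tape() -> None:
        # Disk to tape: archive the staged copy, labeled with today's date,
        # without touching the production servers at all.
        subprocess.run(
            ["tar", "-cf", TAPE_DEVICE,
             f"--label=backup-{date.today().isoformat()}", STAGING_ROOT],
            check=True,
        )

    if __name__ == "__main__":
        stage_from_production()
        write_to_tape()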

Second, we take advantage of our thin-client architecture to ease backup procedures. To do this, we dedicated an individual Linux file server to each department in City Hall and to our seven remote locations.

This means that each department’s e-mails and any files it creates—including spreadsheets, slides for presentations, text documents and digital photos—all reside on that dedicated file server, instead of locally on individual workstations.

Dedicated departmental servers have had a big impact on our backup procedures. When we were using servers at City Hall as the home servers for the remote locations, we were backing up the remote data over the network. This meant that data was taking up a large amount of our City Hall storage capacity and that the slow network connection was impeding the backup process. By deploying separate Linux servers in each department, we eased our disk storage concerns and sped up the connection for the remote locations.

The thin-client approach offers other backup benefits. Most notably, we have to back up only a relatively small number of servers every night, and we don’t have to worry about backing up data saved on hundreds of PCs.

Reducing Risk

In a typical PC environment, if somebody creates the most important document the city has ever produced and his or her PC crashes, the IT staff immediately moves into crisis mode. Fortunately, we don’t face that kind of panic attack because our management information systems department controls all the backups centrally. We know that all the city’s data is saved in a timely fashion, and taking this responsibility out of the hands of end users significantly reduces the room for error.

If a file is accidentally damaged or erased, we simply retrieve the copy we saved centrally and send it back to the appropriate departmental server. Everyone is up and running within a few minutes.
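A restore is the same idea in reverse. As a purely illustrative sketch that reuses the hypothetical staging layout from the script above, the recovery step amounts to copying one staged file back out:

    import subprocess
    from pathlib import Path

    STAGING_ROOT = Path("/backup/staging")  # same hypothetical staging area

    def restore_file(relative_path: str, department_host: str) -> None:
        # Find the centrally staged copy and push it back to the departmental
        # server (assumes the destination directory already exists there).
        staged = STAGING_ROOT / department_host / relative_path
        if not staged.exists():
            raise FileNotFoundError(f"No staged copy of {relative_path}; pull from tape")
        subprocess.run(
            ["rsync", "-a", str(staged),
             f"{department_host}:/srv/data/{relative_path}"],
            check=True,
        )

    # Example: restore_file("finance/budget-2007.xls", "fileserver-cityhall")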

And horror stories are relegated to the movies—where they belong.

RUTH SCHALL is MIS Director for the city of Kenosha, Wis.
