Mention the prospect of a disaster wiping out an organization’s financial, donor and member data and people think of earthquakes, floods and tornadoes. But, sometimes humans pose the greatest danger.
The YWCA of the USA had three hard drives for all its business and financial systems. When the main hard drive failed and needed to be replaced, Kenneth Klum, the chief financial officer for the New York City-based organization, didn’t think that he had a reason to be concerned.
Yet, when the vendor’s technician not only removed the wrong hard drive – the primary backup drive – but also dropped it, months of work were lost. The vendor had to reinstall the drives, which also had lots of special design components.
“We lost about a month of time,” Klum said, “which really wasn’t too bad considering. … We had two dead drives out of three.”
The YW also lost an audit team that was finalizing the organization’s financial statements and couldn’t get the team back for more than two months. The organization ended up late or on extension for all sorts of filings.
Wait, there’s more. Software had to be reinstalled. The original software had been installed with some modifications … by someone no longer with the organization.
“Murphy’s Law was exploding all over the place,” Klum said. “Once you get behind on that stuff, it’s hard to catch up.” And when he had to explain to the board what happened, “They thought I was kidding,” Klum said.
The loss of vital data is no joke to organizations that have amassed large volumes of donor and organizational information over the years. To ensure the continued safety of compiled information, some nonprofits are leaning on outside organizations to guard-dog their files. Others are using the tried and true method of backing up the disks.
“The trend in all businesses and all industries, including nonprofits, is to move away from the model that dictates that data is stored in an on-site server,” explained Kathryn Engelhardt-Cronk, president of Austin, Texas-based Community TechKnowledge. “A lot of companies outside of the nonprofit industry have already moved to that ASP (Application Service Provider) model. What that means is that the nonprofit would contract with a company that would build them a database, then lease it to them on a monthly basis.”
A nonprofit’s data storage scheme is developed with a handle on the present and an eye on the future. Moving from one system to another is not a simple data entry task. It’s about evaluating the costs, effectiveness and security levels of available storage scenarios.
All of the options can be grouped within three basic scenarios, said Amit Motwani, chief technology officer at Community TechKnowledge. The most antiquated is the “file cabinet” scenario where an organization doesn’t use any system and keeps everything locked up in a cabinet onsite. The second is having an onsite server where data is stored. The third involves the outsourcing of an organization’s data storage to an ASP.
The first scenario is both simple and widely used, Motwani said. In contemplating the second scenario, a nonprofit needs to address the issue of resources. Do they have the resources to spend on a server and the necessary security firewalls to protect their server? In addition to the hardware and software, a nonprofit must also consider hiring staff to manage the server.
The third scenario is the option with the strongest defenses. An ASP typically has the firepower to be able to handle thousands of servers, and the more servers mirroring the data, the safer it is. The beauty of most ASPs is that if anything short of an apocalyptic event hits the primary server, an organization’s data will be safe in another server or mirror site that may be housed anywhere across the globe.
The Women’s Advocacy Project (WAP) in Austin is one nonprofit that recently switched to the ASP model. The organization dispenses legal advice and education to women throughout the state and managers decided that the method of gathering and comparing data was antiquated.
“We had been working off the FileMaker Pro database for something like five years and we had a volunteer handling the data,” explained Danielle Hayes, deputy director at WAP. “It was getting frustrating because one change could take as much as six weeks to get done.”
In addition, some of the attorneys that work for the nonprofit were constructing databases via Microsoft Word and comparing the data by placing an unmanageable number of printed fact sheets side-by-side.
Getting all of the attorneys on one page was the goal. “We also have attorneys who are located in two different parts of the city who couldn’t access that database because it was onsite and only available through our Intranet,” said Hayes. “Even though FileMaker is Web-ready we didn’t feel comfortable starting our own server and going that whole nine yards.”
WAP hooked up with Community TechKnowledge in the fall of 2000 and now has one of their five necessary databases up and running. Hayes admitted that the organization was “10 years behind the technology curve” before switching over, but expects the new databases to boost WAP’s effectiveness.
As with most systems, ASPs have a hierarchy of checks and balances. In the case of Community TechKnowledge, they outsource their information to computer giant Dell. By contract, Dell assumes accountability if their level of security is breached, Engelhardt-Cronk explained.
The nonprofit receives that security, backups, 24/7 administration and confidentiality. In addition, there is assurance in the form of insurance policies. So if something happens to the data, organizations do not assume liability and can turn to the vendors for solutions, thereby washing their hands of problems ranging from servers crashing to hackers.
“ASPs are in the business of security, meaning that they’re in the business of setting up huge complicated firewalls to keep hackers out,” Motwani said. They have monitoring systems that are watching servers constantly and they report any kind of illicit activity that might be occurring.
One example is “hammering.” Many hackers who try to compromise data will hammer a server with repeated connection attempts until they make it into one port. Whenever something like this occurs, automatic algorithms and staff monitoring activity at the ASP block the hammering.
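The defense described above can be sketched as a sliding-window counter: track each source’s recent connection attempts and block the source once it crosses a threshold. The window size, threshold, and addresses below are illustrative assumptions, not how any particular ASP implements it.

```python
from collections import defaultdict, deque

# Illustrative limits: block any source that makes more than
# MAX_ATTEMPTS connection attempts inside a WINDOW_SECONDS window.
WINDOW_SECONDS = 60
MAX_ATTEMPTS = 10

class HammeringMonitor:
    def __init__(self, window=WINDOW_SECONDS, max_attempts=MAX_ATTEMPTS):
        self.window = window
        self.max_attempts = max_attempts
        self.attempts = defaultdict(deque)  # source address -> attempt timestamps
        self.blocked = set()

    def record_attempt(self, source, now):
        """Record one connection attempt; return True if the source is blocked."""
        if source in self.blocked:
            return True
        q = self.attempts[source]
        q.append(now)
        # Drop timestamps that have aged out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) > self.max_attempts:
            self.blocked.add(source)
        return source in self.blocked
```

In practice this logic lives in firewalls and intrusion-detection systems rather than application code, but the mechanism is the same: repeated attempts in a short window trip an automatic block.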
Since many nonprofits have yet to board the ASP storage vessel Engelhardt-Cronk says that fears of budget-busting services are unfounded. “When you talk about a nonprofit buying and maintaining an internal server and trying to keep the firewalls up, you’re talking tens of thousands of dollars a year – aside from the staffing costs involved,” she said.
With ASPs, if you’re a $3 million to $5 million organization, you’re going to pay $200 to $300 monthly to ensure a high level of security and storage.
The storage space available from ASPs far outdistances anything a nonprofit could muster on its own. Outsourced information often ends up in buildings filled with servers offering a seemingly endless amount of record-holding capacity.
Masters avoiding disasters
According to Michael Smith, manager of information systems for the San Francisco-based Evelyn & Walter Haas, Jr. Fund, three months’ worth of data would be destroyed in the event of a disaster. “That’s if the whole state of California went down,” he said.
What the Fund does as an archival backup is ship tapes via FedEx every three months thousands of miles away to the East Coast.
The president of the Fund is a board member of the New York City-based Heron Foundation, and that’s where the tapes are archived, Smith explained.
“Once we send the tape out to them at the end of a term then they send the one they’ve been keeping back to us and then we revolve like that,” Smith said. “The Heron Foundation tapes sent out are really more archival safety in terms of overall Fund business.”
The Fund recycles 12 tapes (the tapes cost $50 each) with its Eastern Standard Time partner. The FedEx charge is basically the entire overhead for the archival backup system. The Fund also has an in-house mirrored hard drive, and an employee who takes tapes home.
If something were to happen to the office building, the business of the Fund would go on because all the information would be on the home computer of the vice president of finance and administration, who takes home backup tapes twice a week. Daily backup tapes remain at the office.
“We run daily backups that stay on site and then within any given week we have two backups that are taken off site. Our vice president of finance and administration happens to be that person; it could be another person in another situation,” said Smith.
“He has a duplicate database over there so he can pull any of the data from any of our backup tapes that he has with him should there be a failure and disperse them accordingly if email systems are up,” said Smith.
“Our heart line, if you will, is really to the East Bay (of San Francisco) so that if the San Francisco office were to go down in some respect, we would have him able to still run things from there.”
If a smaller catastrophe occurs, Smith said, eight hours of battery backup power is available on the server and telephone system. Many of the 25 employees have the ability to work from home and therefore would be able to stay there until things were restored to normal.
As for those employees who can’t work from home, “we’ve arranged to have one of our conference rooms set up as sort of a central back up area where people could go to laptops and work there,” said Smith. “We always keep a number of laptop batteries on charge for that very purpose.”
The Fund also operates a mirrored hard drive on its server. A second server is kept in the same building, however, and would be destroyed if the building burned to the ground.
“And that basically means that if any of our hard drives, if our main hard drive fails, we have a hard drive that’s a duplicate of it,” Smith said. “If the data is wiped out from the primary one there’s always another hard drive there for us to switch over to.”
Smith explained the mirrored hard drive system as one where the server itself has an extra physical drive which operates simultaneously with the main drive and anything that’s put on the main drive gets copied on to the second physical hard drive.
Smith said mirrored hard drives are pretty common with data servers in today’s ever-changing technological environment. “If any mechanical part of that (main hard drive) should fail suddenly, which occasionally happens with hard drives, or if the data on that one drive were corrupted for some reason, (it’s backed up by the other hard drive),” Smith said.
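The mirroring Smith describes is normally handled by the disk controller or operating system (RAID 1), but the principle can be sketched in a few lines: every write lands on both a primary and a mirror location before it is considered complete, so either copy can serve reads if the other fails. The class, directory names, and file names below are illustrative.

```python
from pathlib import Path

class MirroredStore:
    """A toy write-through mirror: two directories stand in for two drives."""

    def __init__(self, primary_dir, mirror_dir):
        self.dirs = [Path(primary_dir), Path(mirror_dir)]
        for d in self.dirs:
            d.mkdir(parents=True, exist_ok=True)

    def write(self, name, data: bytes):
        # The same bytes are written to both "drives".
        for d in self.dirs:
            (d / name).write_bytes(data)

    def read(self, name) -> bytes:
        # Fall back to the mirror if the primary copy is gone.
        for d in self.dirs:
            f = d / name
            if f.exists():
                return f.read_bytes()
        raise FileNotFoundError(name)
```

Real RAID mirrors at the block level rather than the file level, but the failure behavior is the same: losing one copy leaves the data intact on the other.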
Having a backup plan for data storage doesn’t have to be an overwhelming task. While mirrored hard drives and storing archived tapes on the other side of the country are effective methods of keeping data safe, tried and true methods of backing up data remain practical.
The Christian Children’s Fund (CCF), for example, conducts nightly backups Monday through Friday, and the archived tapes are taken off the Richmond, Va., grounds by an off-site storage company located 30 to 45 minutes away from the office.
John Watts, director of information systems for CCF, said the organization does what it calls hot and cold backups. It uses an Oracle hot backup, which backs up all data from the point at which the last backup was made, said Watts.
A cold backup, done on a single day such as a Saturday, copies everything: the database, all the files, programs, applications and data.
“On Monday night for instance, we would take a hot backup, and the hot backup would be everything that was processed, everything that had happened to our databases during the day on that Monday, and we would do the same on Tuesday.”
Therefore, if CCF had a failure on Wednesday, it would begin by restoring the cold backup, then Monday’s hot backup, and then Tuesday’s hot backup.
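That restore order can be sketched with dictionaries standing in for the database: start from the weekend’s full (cold) copy, then replay each day’s changes (hot backups) in sequence. The donor names and amounts are invented for illustration.

```python
def restore(cold_backup, hot_backups):
    """Rebuild database state from a full backup plus ordered incrementals."""
    db = dict(cold_backup)       # the weekend's full (cold) copy
    for changes in hot_backups:  # Monday's changes, then Tuesday's, ...
        db.update(changes)       # each hot backup holds one day's changes
    return db

cold = {"donor_1": 100, "donor_2": 250}   # Saturday full backup
monday = {"donor_2": 300, "donor_3": 50}  # Monday's hot backup
tuesday = {"donor_1": 120}                # Tuesday's hot backup

# A Wednesday failure is recovered as: cold, then Monday, then Tuesday.
state = restore(cold, [monday, tuesday])
```

The order matters: applying the incrementals out of sequence could overwrite Tuesday’s values with Monday’s older ones.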
Another aspect of CCF’s data backup plan is a maintenance agreement with a company in Richmond that carries the same level of equipment. CCF could go there and restore its system.
“Then we’d have to set up communications to our people because they’d also have to work sometime,” Watts said. “We have 120 people in this building, if the building burned down I can restore and bring the system back up. I would say within 48 to 72 hours we can be back working.”
CCF’s computers are connected via the Internet to approximately 22 locations around the world, Watts said. And since the organization shares information with all of them on a daily basis, all those connections would have to be remade if the system went down.
Watts said the aftermath of a disaster involves more than just restoring data. “It’s not just computers coming back up again because, let’s face it, we’re sitting here in a large building with lots of offices and 120-odd people. So the reality of it is you would have to create not just the computer environment but the business environment so that people could work.”
Jon Stahl, program manager for Seattle-based ONE/Northwest, an organization that works with nonprofit environmental groups, said although ASPs are the newest craze in database technology, not all nonprofits are quick to jump at the idea. “I think they are definitely intrigued by the idea of not having to be responsible for maintaining their own infrastructure,” Stahl said.
According to Stahl, there are two fundamental approaches an organization can take to database management. First, the more traditional approach is to buy or build database software and run it on a machine on a network in the office. There may or may not be a server, depending on the size of the organization.
“The point is that you’re hosting your own application and in many cases you’re buying off the shelf software, or you’re writing the database yourself,” Stahl explained.
Or there is ebase, which is software developed by and for the grassroots community. “That’s sort of a middle point, where there’s software that you’re getting but then it’s expected that you will then customize that,” Stahl said. “But then again it’s software that you’re running inside your office.” According to Stahl, in the ASP model the software package does not run on a computer in the organization’s office; it runs on a central server that is owned and operated by the ASP company, “or potentially it could be a nonprofit.”
With nonprofits still getting familiar with the ASP model, the dot-com shake out has also added to the concern of investing in or leasing from an outside source rather than owning the software, he added.
“I think they’re mostly storing data on their networks… on their computers,” he said. “If you’re looking to a trend, I do not yet see a trend of people using ASPs for data storage… And that would be the new thing… I don’t see it in my (environmental groups) community.”
Stahl said the ASPs suit more of the larger nonprofits than the medium-sized or smaller groups because they have more data, and especially more donor information to store. The theoretical advantage of the ASP model is that the nonprofit does not have to maintain an elaborate infrastructure other than a solid Internet connection.
Robert Walker, executive director of The Management Center in San Francisco, said that TMC would not be subject to the scheduled rolling blackouts this summer in California because of the building’s location on an electrical grid, which has a hospital, police station, or other exempt building on it.
“We do backups in that our IT manager keeps a separate backup disk in his personal home,” said Walker, “and he does so on a weekly basis.”
The Management Center publishes Opportunity Knocks and that database is backed up on a daily basis, he said. The remaining organizational data is chronicled twice a week. The IT manager “takes the backup and has the physical (copy) at home so if indeed the place blew up or something like that we can restore most of our data to within about a few (days worth of information),” Walker said.
“We’d lose a little bit on a daily basis but the point is that we’d be able to restore all our data to within seven days of where we were.”
Lou Attanasi, vice president of product development for Charleston, S.C.-based Blackbaud, said that he recommends tried and true methods of database backups to his clients.
“They should be backing up their data regularly. What I mean by regularly, I mean like every other day,” he said. “Once a week at the outside. But they need to be taken off site.”
Attanasi also recommends rotating backup tapes, CDs, or whatever medium is used to an off-site location, which is quite effective in terms of keeping everything current.
“If they have an earthquake, or if the place blows up they are a day or two away from being completely (up and running),” said Attanasi. “I think that there are some technologies around to sort of the high-end nonprofits these days that will allow them to maybe take advantage of some more sophisticated techniques like database mirroring, things of that nature, but I think the bulk of nonprofits really need to rely on (steady back up techniques).”
Attanasi gave credence to the fact that the majority of nonprofits are not capable of hosting another server off site, nor do they have the infrastructure to be able to support that kind of technology. But, there are several 501(c)(3)s that do have the infrastructure to do highly sophisticated data storage, he added.
The old adage of “you get what you pay for” applies to the cost of storing data. Obviously it varies upon how elaborate a system you want.
“An example of the most primitive (backup system) would be a single computer, with your database and your software sitting on that box,” explained Attanasi. “And then obviously a nice, fresh supply of CDs, tapes, or diskettes, whatever you prefer.”
The simplest system runs well under $10,000, according to Attanasi. “They’d be backing up regularly, but they would be rotating… Take last Friday’s and turn it into this Thursday’s or something like that,” he said.
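The rotation Attanasi describes can be sketched as round-robin reuse of a fixed tape pool: each new backup overwrites the oldest tape, so the pool always holds the most recent backups. The pool size of 12 here echoes the Haas Fund’s tape count and is purely illustrative.

```python
def tape_for_backup(backup_number, pool_size=12):
    """Return the pool slot (0-indexed) the Nth backup is written to."""
    return backup_number % pool_size

# With a 12-tape pool, backup 12 overwrites the tape used for backup 0,
# and at any moment the pool holds the 12 most recent backups.
```

Real-world schemes are often fancier (e.g., grandfather-father-son rotations that keep some weekly and monthly tapes longer), but simple round-robin is the pattern behind “take last Friday’s and turn it into this Thursday’s.”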
The most sophisticated backup systems have literally banks of servers that store data. In addition, they can also replicate and mirror the data.
“For example if a donation is added to one person in this database it can be mirrored over to another server so they have a hot, live server in case the line goes down,” said Attanasi.
Attanasi guessed that if The NonProfit Times interviewed nonprofits at random, “My bet would be you would find that many of them don’t do anything.”