Backing Up is Hard to Do?
Joe Barr writes "NewsForge is running a story this morning on a personal hardware/software backup solution for your Linux desktop (NewsForge is owned by Slashdot's parent OSTG). The solution doesn't require a SCSI controller, or tape drive, or the ability to grok a scripting language or archiving tool to work, either. It's based on point-and-click free software. Plus it includes a dead-parrot joke by Linus Torvalds."
Backup painful? (Score:5, Funny)
Re:Backup painful? (Score:2)
Re:Backup painful? (Score:5, Informative)
Backup:
tar -czf backup.tar.gz $HOME    # the original omits the paths to archive; $HOME is a stand-in
Then use k3b or something to record the file to CD
Restore:
Take a wild guess:-)
Restore individual files:
Use mc to browse the tarball (slow but works)
Now, do you see me bragging about this trivial shit on slashdot? No?
Eh, wait...
I keep backups on my iPod (Score:4, Interesting)
I have a 20GB iPod, but only about 12GB is used. My $HOME is about 2GB, including a bunch of digital photos, but also a bunch of documents, my email, and other stuff I'd rather not lose.
My solution is simple:
This is a very simple shell script that deletes the backup file already on the iPod, then does a 'tar czf - $HOME' and pipes it into gpg using symmetric encryption (that is, a passphrase). The encrypted, compressed tarball (about 1.7GB) is written directly to the iPod. Takes about 20 minutes.
I've used this backup copy to do restores, and it's really as simple as plugging in the iPod, using gpg to decrypt the file, and piping that into 'tar xvzf -' to re-create my $HOME. I can move all my stuff back to where it needs to be after that.
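The post describes the script but doesn't show it; a minimal sketch of both halves, assuming the iPod mounts at /mnt/ipod and with a made-up file name:

#!/bin/sh
rm -f /mnt/ipod/home-backup.tar.gz.gpg                       # delete the previous backup on the iPod
tar czf - $HOME | gpg -c > /mnt/ipod/home-backup.tar.gz.gpg  # compress, encrypt with a passphrase

And the restore side:

gpg -d /mnt/ipod/home-backup.tar.gz.gpg | tar xvzf -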
(For those who wonder: I always make an encrypted backup file in case my iPod is ever lost or stolen. Sure, the bad guy can probably run something to brute force the passphrase, if that's something he's interested in doing, but it's a tough passphrase. I don't worry about it so much, and it's "only" email and family photos.)
My backup script: (Score:3, Interesting)
(I'm the type that creates one big ass partition for…
Restoration, I do the lazy way:
mkdir test
cd test
tar -xvf /path/to/backup.tar    # the archive name was omitted in the original
and then I grab the files (the RAID array usually has plenty of space).
backup2l (script) (Score:3, Informative)
Re:Backup painful? (Score:2)
Restoring is a pain when the backups are incomplete or the backup media is faulty (quite common). Instead of keeping a backup of the complete system, people just back up user data, gambling that reinstalling the OS and then restoring will be a breeze. Ouch! Now they have to install numerous vendor patches, as well as redo other undocumented tweaks made to the system, before restoring. Yup, restoring here can be painful. The only funny thing about the parent post is the moderators modding it up.
Re:Backup painful? (Score:2)
When you back up to DVD or CD-ROM, make sure to run a verify after every backup!
Verify means comparing the files you meant to write against what actually reads back from the disc (diff -r does the job).
You'd be surprised how often many CD/DVD-writers screw up some files or even the whole disc. If you skip the verify you'll learn it the hard way.
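A minimal way to do that verify, assuming the disc mounts at /mnt/cdrom and the originals live in /data/tosave:

mount /mnt/cdrom
diff -r /data/tosave /mnt/cdrom && echo "verify OK"
umount /mnt/cdrom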
MacOS X (Score:2)
First off, I don't do the default OS X install. I always slice up the partition (not partition the drive; this is a *BSD-type OS, people!)…
Then on a semi-regular basis, I will (should, rather) clone everything save…
Carbon Copy Cloner [bombich.com]. This would be applicable to other *nixes for making a bootable clone, minus…
Re:Backup painful? (Score:2)
Re:Backup painful? (Score:2)
a) she knows there's a computer in the house
b) she knows it's the most important thing to you in the house
c) she will find it and "get creative"
d) If not c), then destroy the rest of the house searching until c) can be completed
e) Goto d)
I keep all my computers in the basement except one decoy (802.11 prevents following the cords)
And here is the joke... (Score:2)
Re:And here is the joke... (Score:2, Informative)
The cast:
MR. PRALINE John Cleese
SHOP OWNER Michael Palin
The sketch:
A customer enters a pet shop.
Mr. Praline: 'Ello, I wish to register a complaint.
(The owner does not respond.)
Mr. Praline: 'Ello, Miss?
Owner: What do you mean "miss"?
Mr. Praline: I'm sorry, I have a cold. I wish to make a complaint!
Owner: We're closin' for lunch.
Mr. Praline: Never mind that, my lad. I wish to complain about this parrot what I purchased not half an hour ago from this very boutique.
My choice for backups: (Score:5, Informative)
Re:My choice for backups: (Score:3, Informative)
For those who do not know, rsnapshot uses rsync to do the backup. What makes it unique is its ability to use hard links to keep full copies of each backup (i.e., during a restore, just go into the folder you want and copy the files back out).
rsnapshot is run via cron, so you can configure it to email you when it runs (to verify correct operation).
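For reference, the cron side can be as simple as a couple of crontab entries like these (interval names must match your rsnapshot.conf; the schedule here is made up):

0 */4 * * *  /usr/bin/rsnapshot hourly
30 23 * * *  /usr/bin/rsnapshot daily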
I have had to restore…
Re:My choice for backups: (Score:2)
Too easy (Score:2)
For the more typing inclined people, create a directory and do this:
rsync -av --delete --no-whole-file /folder-to-backup/ /backupfolder
Re:Too easy (Score:2)
Re:Too easy (Score:2)
Rsync or mkzftree for backups (Score:5, Informative)
One small improvement over rsync (IMO) is to use mkzftree from the zisofs-tools [freshmeat.net] package. It's designed to create compressed ISO filesystems which are transparently uncompressed when mounted under Linux (and other supporting operating systems; it's a documented ISO extension). mkzftree supports an option for creating a hardlinked forest (like cp -al and rsync), with the advantage that the files are compressed, thus saving space. ISO isn't quite as flexible as ext2 for things like hardlinks, so what I do is keep DVD-sized disk images formatted as ext2 to store the snapshots. I burn the disk images directly to DVD; each one can hold ten or twenty compressed snapshots (of my data, anyway). The disadvantage is that I can't read the files directly (because they're compressed, and the transparent decompression only works with ISO), but it's easy to decompress a file or folder to wherever I need it.
It shouldn't be hard to make the transparent decompression code work with other filesystems than ISO, as long as they're mounted read-only. The files are just gzipped with a header block indicating they are compressed.
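A rough sketch of the zisofs workflow (paths are hypothetical; mkzftree comes from zisofs-tools, and mkisofs's -z flag records the zisofs extension):

mkzftree /data/snapshot /tmp/ztree            # gzip every file, preserving the tree
mkisofs -z -R -o /tmp/backup.iso /tmp/ztree   # build an ISO that marks the files as zisofs-compressed
mount -o loop /tmp/backup.iso /mnt/iso        # on a zisofs-aware kernel, files decompress on read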
Re:Rsync or mkzftree for backups (Score:2)
Easy (Score:5, Interesting)
1. Reach over and plug in USB 120 gig drive.
2. Become root, and go to /root.
3. Type "./backup.sh".
That is a script that goes to all the directories I care about (/root, /etc, /srv/www, /usr/local/share, and my home directory), and basically does this for each directory:
cd $DIR
rsync -avz --progress --delete . $MNT/$DIR
where $MNT is where the USB drive mounts.
4. Unmount the drive and unplug it.
This is quick (a few minutes) and easy, and since rsync reads the files from the last backup to figure out what needs to be copied, it should catch it if I develop a bad sector on the USB drive.
I left it out in the above, but the backup script also, before doing the rsyncs, lists my crontab into a file, so that gets backed up.
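Pieced together from that description, backup.sh probably looks something like this (the mount point and home directory are guesses):

#!/bin/sh
MNT=/mnt/usbdisk
crontab -l > /root/crontab.saved    # save the crontab, as mentioned in the post
for DIR in /root /etc /srv/www /usr/local/share /home/me ; do
    cd $DIR && rsync -avz --progress --delete . $MNT/$DIR
done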
no incremental (Score:2)
There's a reason incremental backups have been around for two-plus decades, and "update the difference between two drives with rsync" is not "incremental".
If you were going to reply and say "oh, but I only do it every X weeks", well, you'll now lose weeks of work if you lose a file/drive.
Re:no incremental (Score:3, Informative)
b-loo
Re:no incremental (Score:2)
I do it over nfs and smbfs mounts using rsync to dvd-r's and dvd-rw's.
Re:no incremental (Score:2)
Yep, but luckily rsync provides the options --backup and --backup-dir, which make it easy to create increments. Strictly speaking they're not increments, but rather the files changed or deleted between the previous and the latest rsync run; in the end, though, they serve pretty much the same purpose.
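Something along these lines (paths invented for illustration):

rsync -av --delete --backup --backup-dir=/backup/`date +%F` /home/ /backup/current/

Each run refreshes the mirror in /backup/current/, and whatever changed or disappeared since the previous run lands in a dated directory.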
So is a boot to the head (Score:5, Insightful)
I make daily differential backups (via AMANDA) to a rotating set of 12 tapes. If I accidentally delete /etc/shadow or some other important file, I have nearly two weeks to discover the problem and restore a previous version from tape. Your idea gives you, oh, until about the time that rsync discovers the missing file and dutifully nukes it from your "backup" drive.
What you're doing is certainly better than nothing, but it's not a backup solution by any definition of the term beyond "keeps zero or one copy of the file somewhere else".
Far, far better would be for your script to use dump or tar to create incremental backup files on your USB drive and to rotate them out on a regular basis.
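For example, GNU tar's snapshot-file mechanism does exactly this (file names are illustrative):

tar --listed-incremental=/var/backups/home.snar -czf /mnt/usb/home-`date +%F`.tar.gz /home

The first run with a fresh .snar file is a full backup; every later run saves only what changed since.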
Re:Easy (Score:2)
I've yet to find an adequate FOSS backup solution that meets the following requirements:
1. Able to back up only specified directories (I really don't need a backup of…
2. Backups are strongly encrypted.
3. Backups are fault-tolerant. (If I lose one byte in the middle of a CD, I don't want to lose the whole thing.)
Right now my…
Re:Easy (Score:3, Informative)
Re:Easy (Score:2)
I simply do the following in a cron job that runs nightly.
tar -czvf …
This dumps the contents of all drives to the SDLT tape drive. One tape holds 240GB of files... and then the job ejects the tape. Now, swapping the tapes in each server every morning is the hard part... Every tape is a complete backup; that way I have 5 copies of all files on hand, each one day older than the next, plus 16 copies off site: 4 weeklies and 12 monthlies.
The larger database server has a jukebox…
What about remote backups? (Score:2)
super compression (Score:3, Funny)
Re:super compression (Score:2)
What? (Score:3, Funny)
You can read from /dev/null just fine:
tar -xzvf /dev/null
will restore a whole directory's worth of files back to what's stored in the backup. If you want to make an exact copy of the whole filesystem stored in the backup:
tar -xzvf /dev/null -C /
Now, that's just off the top of my head, so I won't take any blame (or credit) if you try that out on your own system.
Re:super compression (Score:2)
We used to take "bogus" backups from our DB and run them to /dev/null.
Of course, now that I think about it, since mv is just a pointer command, it might be... cp *
-WS
Re:super compression (Score:2, Funny)
Re:super compression (Score:5, Funny)
Re:super compression (Score:2)
Cool Open Source Backup Software (Score:2)
Re:Cool Open Source Backup Software (Score:2)
I knew about Bacula before but never got around to actually testing it.
It seems a bit bloated to me, and while many of its features are nice, it lacks the smartness of dar.
In case of /. ing (Score:3, Interesting)
The hardware
My desktop machine includes three IDE drives and an ATAPI CD-ROM drive. I have Debian installed on hda, SUSE on hdc, and my…
In the past I've researched tape drives and found that for a decent drive, I would also have to add a SCSI controller. Those two items can be pretty pricey. I opted for a less expensive configuration.
I decided to go with a removable IDE drive, connected via USB. I bought a 3.5-inch hard disk enclosure with USB 2.0 connectivity on eBay. It cost roughly $45, including shipping. With three drives to back up, I needed a large-capacity IDE drive to hold all the data. It turns out I already had one, just waiting for me to use it. I raided the stash of goodies I've been hoarding to build a killer MythTV box and found a 250GB Hitachi DeskStar -- just what the doctor ordered. I got it on sale at Fry's Electronics a couple of months ago for $189.
I have the mechanical skills of a three-toed sloth, but I still managed to cobble together the drive and the enclosure, neither of which came with directions. Four screws hold the faceplate on the enclosure, and four more hold the drive in place inside. Even I was able to puzzle it out.
The most difficult part was the stiffness of the IDE cable running between the faceplate and the drive. In hindsight, I recommend connecting the power and data cables from the faceplate to the drive before screwing the drive in place inside the enclosure. I also recommend not forgetting to slide the top of the enclosure back in place before reattaching the faceplate.
I connected the USB cable to the enclosure and the PC and powered on. Using the SUSE partitioning tool, I created an ext3 filesystem and formatted it on the Hitachi drive, using the default maximum start and stop cylinders. That worked, but there was a problem. My great big 250GB drive yielded only 32GB.
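For reference, the command-line equivalent of that step, assuming the enclosure shows up as /dev/sda:

fdisk /dev/sda        # create one partition spanning the disk
mkfs.ext3 /dev/sda1   # make the ext3 filesystem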
One of my OSTG cohorts asked if I had clipped the drive for a 32GB max, but I had done no such thing. All I did was check to see how the drive was strapped out of the box. It was set to Cable Select, which was fine with me, so I left it like that. His question worried me, though, because I had never heard of a 32GB clip thingie before.
I called Hitachi support to find out what was up with that. A tech answered quickly. When I explained what was going on, he agreed that it sounded like the drive was clipped to limit its capacity. This functionality allows these big honkers to be used on old systems that simply cannot see that much space. Without it, the drive would be completely unusable on those machines.
I asked why in the world they would ship 250GB drives configured for a max of 32GB by default, and he denied that they had. He asked where I got the drive, then suggested that Fry's had "clipped" it for some reason. There are jumper settings to limit the capacity, but my drive had not been jumpered that way. Perhaps Fry's sold me a returned drive that a customer had "clipped", then returned the jumpers to their original position. We'll never know.
The tech told me how it should be jumpered for Cable Select without reducing capacity. I opened the USB enclosure, pulled out the drive, and found it was already jumpered as he described. Undaunted, I pressed on.
On the Hitachi support page for the drive, I found a downloadable tool which…
I use a DVD-RAM burner (Score:2)
Heh, noob mistake (Score:5, Interesting)
Now, when his system borks, how does he restore? Or did he think that far ahead?
I skimmed the article, and nothing about restoring. Your backup is useless if you can't restore it.
Does he have to install and configure linux, X, and KDE just to be able to access KDar?
Forget all this jibberjabber, and emerge or apt-get or type whatever command you use to get Mondo/Mindi. Just perfect for home boxes, and most other use.
Burn yourself a bootable CD that can recreate your box, just like Norton Ghost for Linux. I have it write out the iso files and boot disk for…
I run a separate job to back up…
What's important is to separate system from user data when it comes to backups. This also forms my "archiving" system, since old "/home" backups stick around, so if I want to take a look at the version of foo.c I was writing 6 months ago, it's easy enough to find.
As much as I love Mondo/Mindi, it's not the be-all and end-all. AMANDA is a better choice for a corporate (more elaborate) environment. It's a PITA and not worth getting involved with for a simple user box.
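For a home box, the whole Mondo run can be one command; a sketch, with the output directory invented and /home excluded for the separate job mentioned above:

mondoarchive -O -i -d /var/cache/mondo-isos -E /home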
Or, if you have another computer handy... (Score:3, Informative)
mount -t smbfs -o username=me,password=secret //10.0.1.111/backup /mnt/backup
cd /mnt/backup
tar -cvjf …
(If I recall the commands correctly.) I use this all the time to make quick snapshots of my Gentoo installation before emerging some bleeding edge package.
Re:Or, if you have another computer handy... (Score:2)
You can try bru, but that costs $ and isn't that user friendly.
Re:Or, if you have another computer handy... (Score:2)
sudo tar -cvjf
Re:Or, if you have another computer handy... (Score:2)
Restoration problem??? hmm... (Score:2)
I don't know about everyone else, but isn't that one of those things that should have come up pretty early in beta testing?
"Great backup program.. too bad it can't restore"
Re:Restoration problem??? hmm... (Score:2)
My solution (Score:2)
2. A cron job that runs 4 times a day that does
for DEST in /mnt/usb1 /mnt/usb2 ; do    # the destination list was cut off; these are placeholders
    if [ -d $DEST/home ] ; then
        rsync -aSuvrx --delete / $DEST/    # the destination argument was also cut off
    fi
done
If anything went wrong with the main disks, it would be pretty simple to get grub installed on the USB drive, and whip…
Rdiff-backup (Score:2)
Also, rdiff-backup allows for remote operations. So you can have a central server back up many desktops, with relative ease. It doesn't have a nice GUI, but then again, I'm running it all through a cron job anyhow, so who cares.
Restoration is a breeze because the most recent snapshot is just an ordinary copy of the source directory.
My solution: backuppc (Score:5, Informative)
Restore is also straightforward - it can be done in place, or by downloading a zip/tar file.
Old box archive. (Score:2)
Mmmmm... Images (Score:2)
What I like about this is that I always have a week-old fallback if I mess something up. Or if the original drive fails, I just swap in the backup in less than a minute.
And yes, I also do selective daily data backups (email, etc.).
KDar? (Score:5, Funny)
Backups onto USB removable drives = pain (Score:2)
Not if you _care_ about your data! The drivers for this stuff seem very beta-ish: lockups, garbled writes, non-standard implementations.
It's just not worth the hassle...
Snapshot? (Score:2)
I mean, after all, who cares if he backs up to DVD or CD or network, or whatever. We all know Linux is good at moving data. I usually back up with tar -cz to a tarball on CD, and I can restore from it from a minimal CD boot, and I don't get the urge to brag about it on Slashdot either.
Choosing this or that media is a non-problem, as long as you understand the difference be…
Re:Snapshot? (Score:2)
Biggest problem: consistency during backup (Score:2)
For example, you might have something that changes both…
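One standard way around this (not from the truncated comment, just a common technique) is to back up a frozen snapshot instead of the live filesystem, e.g. with LVM (volume names are hypothetical):

lvcreate -L 1G -s -n homesnap /dev/vg0/home   # snapshot the volume
mount -o ro /dev/vg0/homesnap /mnt/snap
tar -czf /backup/home.tar.gz -C /mnt/snap .   # back up the consistent view
umount /mnt/snap
lvremove -f /dev/vg0/homesnap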
unison (Score:2)
How I do it. (Score:2)
today=$(date +%F)
zip -r -u /backup/collected-$today.zip /backup/collected   # paths were cut off in the original; these are placeholders
cp -v -f -u /backup/collected-$today.zip /backup/staging/
rm -r -f /backup/collected/*
mkisofs -r -J -o /backup/$today.iso /backup/staging
cdrecord -v -speed=16 dev=0,0,0 -data /backup/$today.iso
eject cdrom1
Where "collected" are files copied to the local machine through a series of smb mounts and copied across the network as well as important files on the local system t
Pathetic (Score:2, Informative)
He should read the Tao of Backup http://www.taobackup.com/ [taobackup.com] and be enlightened.
My simple fairly inexpensive solution (Score:2)
I have sort of an interesting situation at work (Score:2)
Our solution was to buy a couple 160GB FireWire LaCie hard drives [lacie.com]. They have heavy-duty aluminum cases and USB2, FireWire and FireWire 800 interfaces. I use CMS Products' free BounceBack Backup Express [cmsproducts.com] software to automatically synchronize the files on the server to the files on the hard disks.
It works g
Two Words: NORTON GHOST (Score:3, Insightful)
With just one or two boot floppies, I can back up and restore my Linux drives to any of: other internal IDE drives, other partitions on the same drive, external USB1 and USB2 drives, burnable CDs, or burnable DVDs.
Heck, it is so fast and reliable, I've been known to backup the drive just before even *trying out* new software or options, and if I don't like it, I just Ghost it back to how it was.
Now, I know it isn't free, or even Linux based, but it is hard to argue with cheap, reliable, and fast backup procedures that just work all the time...
And that is easy (Score:5, Interesting)
I've read the whole article. My! You'd better be a geek to cope with all the little worries...
Getting cheap AND working hardware on eBay. My mom will not do that for the sake of her computer.
A 32GB limitation set by jumpers. Not obvious for an end user.
Booting up *nixes from various drives in order to access the limited drive, then fiddling with partitions. I still don't dare touch my configs for more than one OS at a time, let alone various OSes on various drives.
Compiling KDar?! Compiling what? What do I have to do? "Comp..??" You have to admit, it's not for the dummy kind.
Definitely not "Backup made easy" but "Made not so expensive", since the price tag still reaches $300 (drive + box from eBay + screws + shots of Valium to calm you down when your machine refuses to boot after all the offense you just did to it).
I bought Linux Hacks [amazon.ca]. This, Webmin [webmin.com] and a remote machine accessible using Samba or sftp does the daily backup just fine.
Programs are not the problem (Score:2)
kDAR question, OT ? (Score:2)
If you lose the disk that was backed up, and the catalogue with it, is the KDar archive file useless?
I use rsync to keep track of daily changes, and tapes to make backups. Tapes have the advantage of not showing up as a drive that can be destroyed if the system gets hacked.
Shameless rdiff-backup plug (Score:2)
With rdiff-backup, you can back up dozens of gigabytes effortlessly and restore just as effortlessly from any point in time. Add it to a nightly cron job and you are golden!
From the description: "rdiff-backup backs up one directory to another, possibly over a network. The target directory ends up a copy of the source directory, but extra reverse diffs are stored in a special subdirectory of that target directory, so you can still recover files lost some time ago. The idea is to combine the best features of a mirror and an incremental backup."
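Typical usage is two one-liners (host and paths invented for illustration):

rdiff-backup /home backuphost::/srv/backups/home                 # nightly push over SSH
rdiff-backup -r 10D backuphost::/srv/backups/home/foo.c foo.c    # restore foo.c as it was 10 days ago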
Network Harddrive? (Score:2)
here's how I perform backups (Score:3, Interesting)
RSync makes backup easy... (Score:4, Informative)
rsync -e ssh \
--delete \
--relative \
--archive \
--verbose \
--compress \
--recursive \
--exclude-from=list_of_files_you_dont_wanna_backup \
--backup \
--backup-dir=/backup/`date -I` \
/dir/to/backup \
user@other_host:/backup/current/
This command mirrors everything in the source directory (shown here as the placeholder /dir/to/backup, since the original listing omitted it) to /backup/current/ on the other host, while anything changed or deleted since the last run is stashed in a dated directory under /backup/.
The disadvantage is that you can't easily restore an exact old state of the directory you backed up; however, you can retrieve all the files very easily.
There are also some shell scripts floating around which add some voodoo to the above rsync line to hardlink the different dated directories, so that you have a normal browsable copy of each and every day while only spending the space for the changes.
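The core of that voodoo is usually just two commands (paths hypothetical):

cp -al /backup/current /backup/`date -I`        # hardlink the current tree under a dated name
rsync -a --delete /home/user/ /backup/current/  # changed files get fresh inodes, breaking the links

Unchanged files share one inode across all the dated directories, so each extra day costs only the changed data.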
And there are also tools which optimize this a bit more by compressing the changes made to files, like http://www.nongnu.org/rdiff-backup/
However, overall I found the plain rsync solution the most practical, since it doesn't require special tools to access the repo and 'just works' the way I need it.
Backup? (Score:2, Informative)
When I was young (early 20s) I saved everything. Then I had an HD crash. I started over and, several years later, my new HD inherited an unrecoverable problem. I started…
Backups to removable media and offsite (Score:2)
I still think that removable media (e.g. tape) is the most effective form of backup [baheyeldin.com].
Under Linux, a tape drive can be used effectively to back up a home network [baheyeldin.com], especially when you have offsite storage (e.g. take the monthly backup to a friend's place or to your work).
Granted, this is only for 10 or 20 GB worth of data, but I am not even half there yet. This does not apply to guys who have a, let's say extensive, collection of movies, or have a huge set of, ahem, images.
One Danger of hard drive backup (Score:3, Insightful)
If someone gets into your system and does an rm -r *, is your backup drive mounted?
What if they're clever and do a mount -a, or find your backup.sh first?
I've seen some people take the first and last steps of "inserting the USB cable" and "removing the USB cable". Is there any kind of automated system that would ease this, or is it the hard drive equivalent of "remove tape, insert new tape"?
USB drives also suffer from problems with catastrophic failure, like a fire in your home.
I wonder if there exist any online backup systems that let you do offsite daily differential backups of your system (or critical files), and that would let you download an image of your hard drive, or mail you one on DVD-R along with restore software, in case anything went wrong. You could charge directly by bandwidth used. Hmm, interesting idea.
Re:One Danger of hard drive backup (Score:3, Insightful)
I've thought of that too. I like to back up my gradebook to another server.
So, you're asking yourself: what keeps the malicious intruder from logging into the 2nd server after perusing my backup script?
I used a little-used feature of ssh that allows you to restrict a session to a single pre-specified command. My backup script has only the ability to write new gradebook backups to the server. It cannot execute any other command.
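The restriction lives in the authorized_keys file on the server; the entry looks something like this (the script name is hypothetical):

command="/usr/local/bin/store-gradebook.sh",no-port-forwarding,no-pty ssh-rsa AAAA... backup-key

Whatever the client asks to run, sshd executes only the forced command.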
Make backups easy by using... (Score:2)
mondo will do a full image of your drives (including making images of ntfs/fat32 drives). You boot off the image you create with mondo, and you can nuke the machine and do a full restore from cd/dvd, or do a partial restore.
backuppc is Perl-based and works wonders on a network for daily backups. (You can back up the server backuppc is running on, too!)
Re:Make backups easy by using... (Score:2)
Expired Websites (Score:2)
Obligatory KNOPPIX post (Score:2)
(Linux/win)
I generally just use partimage, but mc browses tar archives just fine.
In the "early" days there was some brain damage with certain versions of partimage, but those are long gone.
implicit backup HW (Score:2)
mkcdrec (Score:3, Informative)
mkcdrec is a really neat program that packs up your whole system and makes a recovery disk. It's something any sysop should take a look at.
See the homepage [mkcdrec.ota.be] here.
Use RAID-1 (Score:3, Informative)
Because RAID-1 is an exact mirror, I get a complete, bootable backup copy of my system at the time of the shutdown. Downtime is limited to the few minutes it takes to shut down and swap drives. The lengthy process of mirror rebuilding takes place while the system runs normally. And of course, RAID also protects me against random (single) hard drive failures.
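With Linux software RAID, the swap-and-rebuild cycle is a couple of mdadm commands (device names are examples):

mdadm /dev/md0 --fail /dev/sdb1 --remove /dev/sdb1   # retire the drive being pulled as the backup
mdadm /dev/md0 --add /dev/sdc1                       # add the replacement; the rebuild runs online
cat /proc/mdstat                                     # watch the resync progress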
This solves the full image backup problem, leaving only the more frequent partial backups you should also be doing. For this, rsync is your friend. The stuff that changes most often on my system are my IMAP folders, which I periodically (several times per day) rsync to my laptop. Besides backing up my mail server, this gives me copies I can carry around and read when I'm offline.
Tape is obsolete. It's just too slow, expensive, unreliable and small. Hard drives are so cheap, fast and capacious that there's little excuse to not run RAID on any machine that can physically hold more than one hard drive. Unfortunately, this leaves out most laptops.
Re:here ya go: (Score:2)
You don't call that a backup command, do you?
For all practical purposes, that is useless.
Re:here ya go: (Score:2)
Re:here ya go: (Score:2)
Actually, I did this for a while, with /dev/hda smaller and faster, and /dev/hdb slightly larger and significantly cheaper (and slower). Back when CD burners cost big bucks, I lucked onto a cheap, big, slow second hard drive, and had instant backup.
The bad news is that you only have one backup copy, rather than a history of backups. If you do dd if=/dev/hda of=/dev/hdb, then remember that you want that file you erased just beforehand (which you just wrote over), you're out of luck.
Re:here ya go: (Score:2)
Re:here ya go: (Score:2)
Using dd to move a partition to another drive is a good use; using it for backup is pretty much doomed to cause havoc.
Re:here ya go: (Score:2)
Otherwise, unless you plan on only doing this in single-user mode or while that disk is unmounted, you run the risk that stuff might change while you're doing the copy.
That, and on big drives it can take a wee bit longer than you probably want to wait. You really don't need to back up the free space that badly.
Re:here ya go: (Score:2)
Re: (Score:2)
Re:simple (Score:2)
If he was worried about single-file restores and didn't want something complicated, why not just drag his entire drive over using a file manager like Nautilus? It isn't brain surgery (and hell, cp will even do it cheaply and easily, and can be set up in cron in a snap by anyone but the newest newbie).
Re:simple (Score:2)
This part had me rolling. WTF good is a backup if you can't do single-file restores?
I hope no one who is considering Linux stumbles across this article and thinks that's the best we have for backing things up. If I had written the article, KDar would've been skipped for something that can completely do the job, even if that meant hacking out some scripts and setting up a cron job.
Re:simple (Score:2)
The main reason I would want backup software is so that I don't have to think about doing a backup: it just happens. I have a hard drive I use for backups, but I find that I don't end up dragging and dropping as much as I should. People tend to get lazy and not do backups if they aren't automatic.
Re:simple (Score:2)
Re:simple (Score:2)
Re:simple (Score:2)
Now, take that USB disk and drop it from 1 meter a couple of times. Even better, drop it to the floor while it's writing data. Hard disk probably damaged, and backup unreliable. Granted, USB disk backup is better than none at all. If only tape backup devices weren't so damn expensive...
Re:Let's get these out of the way... (Score:2)
(2)Do not eat backup
Re:As usual (Score:2)
Really, "tar -clzf
for the perplexed:
Re:As usual (Score:2)