Sbackup restore fails

Asked by Eivind Ødegård

I am unable to restore with Sbackup version 0.10.4, using either the GUI or the command line, and for both partial and full restores. The backup was made to an external USB disk as part of my preparations to install Hardy from scratch.

I would be glad if someone could review this command-line output:

eivind@boxen:~$ sudo srestore.py /media/IAUDIO/tryggingskopiar/2008-04-23_16.10.26.333349.boxen.ful /home/lisbeth /home/lisbeth/gamalt
[sudo] password for eivind:
Traceback (most recent call last):
  File "/usr/sbin/srestore.py", line 159, in <module>
    ret = r.restore( sys.argv[1], sys.argv[2], sys.argv[3] )
  File "/usr/sbin/srestore.py", line 112, in restore
    shutil.move( os.path.join(tdir,spath), dpath )
  File "/usr/lib/python2.5/shutil.py", line 199, in move
    copy2(src,dst)
  File "/usr/lib/python2.5/shutil.py", line 91, in copy2
    copyfile(src, dst)
  File "/usr/lib/python2.5/shutil.py", line 46, in copyfile
    fsrc = open(src, 'rb')
IOError: [Errno 2] No such file or directory: '/home/lisbeth/tmpIp7e3K/home/lisbeth'

Any help appreciated!

Question information

Language:
English
Status:
Solved
For:
Ubuntu sbackup
Assignee:
No assignee
Solved by:
Oumar Aziz OUATTARA
Revision history for this message
Eivind Ødegård (eivind) said :
#1

Additional information: the directory /home/lisbeth/tmpIp7e3K from the traceback above does get created, but it is empty.

Revision history for this message
Nick (weegreenblobbie) said :
#2

Take a look at the file /media/IAUDIO/tryggingskopiar/2008-04-23_16.10.26.333349.boxen.ful/files.tgz.

How big is it? Is it as big as you would expect?

Try extracting the file manually:

$ cd /tmp
$ mkdir my_backup
$ cd my_backup
$ tar xfz /media/IAUDIO/tryggingskopiar/2008-04-23_16.10.26.333349.boxen.ful/files.tgz

I'd bet that tar will complain about "Unexpected end of file." That would indicate that the backup you created is broken; perhaps sbackup crashed while creating it and you didn't notice. It happened to me.
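
If you want a quicker check before extracting anything, you can test the archive in place. The paths below are copied from your command, and this assumes the usual GNU gzip and tar:

$ ls -lh /media/IAUDIO/tryggingskopiar/2008-04-23_16.10.26.333349.boxen.ful/files.tgz
# test the gzip stream without writing anything; a truncated backup gives "unexpected end of file"
$ gzip -t /media/IAUDIO/tryggingskopiar/2008-04-23_16.10.26.333349.boxen.ful/files.tgz
# list the contents; this also stops with an error if the tarball is cut short
$ tar tzf /media/IAUDIO/tryggingskopiar/2008-04-23_16.10.26.333349.boxen.ful/files.tgz > /dev/null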

Revision history for this message
Eivind Ødegård (eivind) said :
#3

Thanks for replying.

I'm not at the computer in question now, but I have examined the backup archive earlier.

Yes... it was somewhat smaller than I expected it to be (a bit over 4 GB), but at the time, I attributed this to gzip compression (alas!), so now I feel an urge to bang my head against the wall. The initial backup seemingly went just fine, and I didn't suspect errors. There were no messages to tell me anyway.

Not surprisingly, both file-roller and the command line complained about an unexpected EOF when I tried to extract the archive.

Sigh... I do wish I'd used rsync or something. In my opinion this case validates the existing bug reports, and demonstrates that you should be extra careful before releasing and/or using backup software. I am not blaming the developers for my woes; they just happen to be working on software where faults cannot be tolerated, and I happened to use it before it was ready. That makes for an unlucky combination, and I guess I can regard my backup as lost (?).
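
For comparison, even a bare rsync run would have left every file on the disk individually readable instead of locked inside one huge archive. A minimal sketch (the destination directory is only an example, and on a FAT disk the Unix permissions are lost anyway):

# -r recurses, -t keeps modification times, -v lists files as they are copied
$ rsync -rtv /home/lisbeth/ /media/IAUDIO/tryggingskopiar/lisbeth/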

Revision history for this message
Best Oumar Aziz OUATTARA (wattazoum) said :
#4

hello,

Can you please tell me which filesystem you were using to make those backups? I bet it's FAT32. The FAT filesystem doesn't support files bigger than 4 GB.

So sbackup will crash silently without even telling you.
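
You can check how the disk is mounted with either of these commands (using the mount point from your command; this assumes the usual util-linux mount and GNU df):

$ mount | grep IAUDIO
# or, to print the filesystem type directly
$ df -T /media/IAUDIO

If the type column says vfat, the 4 GB limit applies.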

I have forked Sbackup into a program called NSsbackup, which keeps log files and would have told you that the backup had failed.
https://edge.launchpad.net/nssbackup

In version 0.2, NSsbackup will also be able to make a backup on a VFAT filesystem by splitting the backup file.
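
Until that release, the same idea can be applied by hand: pipe the tarball through split so that no piece reaches the FAT32 limit, and join the pieces again before extracting. The paths and sizes below are only an example:

# back up: every piece stays below 4 GB
$ tar czf - /home/lisbeth | split -b 3900m - /media/IAUDIO/tryggingskopiar/files.tgz.
# restore: the shell glob expands the pieces in order, so cat reassembles them correctly
$ cat /media/IAUDIO/tryggingskopiar/files.tgz.* | tar xzf -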

Anyway, I don't think the file you're looking for is in the incomplete backup file you have. Unlike file-roller, which fails as soon as it notices that the archive is incomplete, Sbackup tries to get the file from the incomplete archive. If it didn't find it, that means it wasn't there.

Hope this helps !

Revision history for this message
Eivind Ødegård (eivind) said :
#5

The hard disk/media player is automounted as vfat, which probably means that my backup is (at least partly) destroyed.

Personal opinion follows: Because sbackup has these severe limitations on its use, I would recommend that it be removed from the repositories. This is the right thing to do now, especially as the software isn't safe to use for the non-technical users it is targeted towards. It is very annoying that it has made an invalid backup without giving me the slightest notice, but I guess I can only blame myself for having entrusted sbackup with the task of backing up my files. It doesn't matter now; I have lost my files either way.

I'll mark this problem as solved, although it isn't.

Thanks for answering.

Revision history for this message
Eivind Ødegård (eivind) said :
#6

Thanks Oumar Aziz OUATTARA, that solved my question.

Revision history for this message
Alex Thurgood (alex-thurgood) said :
#7

I have an issue with marking this as solved, because I have just spent the last three months using SimpleBackup to create tgz files of my system to, wait for it, an external Iomega 500 GB USB disk. Now that the motherboard on my system has been burnt out by a short circuit in a VGA D-sub connector, I have had the displeasure of trying to restore my system to a brand new PC from the backup, and what do you know? Empty folders. This is with a Lucid Lynx netbook remix on an Aspire One.

OK, so maybe instead of saying "here is an end-user product in the mainstream distro that you can use to back up to any media", someone, somewhere ought to mention that the product will not work with VFAT-formatted USB hard drives, i.e. the majority of the consumer drives available? That would at least warn the user in advance not to put his hopes in a backup that is doomed to fail (without even telling him) once the file size exceeds 4 GB.

In my opinion, the product is defective because it lulls the user into a false sense of security, so personally, I would still class this as a bug. Why? Because it involves data loss, and that is considered unacceptable in most other projects.

Just my 2cents, and yes, I am angry.

Alex Thurgood

Revision history for this message
Nick (weegreenblobbie) said :
#8

I use Areca for backup now. Instead of creating one monstrous tgz file, it compresses each file individually. Areca is similar to Acronis, and by default it will verify the backup when you create it.
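
If you do stick with tar-based backups, GNU tar can at least compare a freshly written archive against the files it was made from, which would have caught a truncated backup straight away. A minimal sketch (the destination path is just an example):

$ cd /
# create the archive with member names relative to /
$ tar czf /media/backup/files.tgz home/lisbeth
# compare the archive against the source files; differences and read errors are reported
$ tar dzf /media/backup/files.tgz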

Revision history for this message
Avery (docaltmed) said :
#9

Either fix this bug or do not promote this software as a backup system. Or, at the very least, put up a warning.

I just lost ALL OF THE FINANCIAL DATA FOR MY ENTIRE BUSINESS FOR THE ENTIRE YEAR BECAUSE OF THIS BUG.

I am utterly appalled.

Revision history for this message
bash.vi (bash-vi) said :
#10

Hey guys. I just wanted to restore my home directory with all my data, and instead I ran into this bug.
WHAT THE HECK DO YOU THINK YOU'RE DOING?
I'm NOT going to upgrade to 10.10, but this data is worth a damned lot!
As Avery said: take this package out of the repositories! It's a broken backup system. That's about the worst thing that can happen!

Please fix the damned bug!