Problem with cpio premature end of file

This topic is closed.


    Hello folks,
    I work as an admin at a small company and have a problem restoring a backup with cpio. The background story is that we had an old server running SuSE with Samba installed, which was being used for file exchange and storage. Last week the hard drive had a mechanical failure and stopped working. Luckily the server had an external hard drive attached, on which backups were made periodically. I know very little about the setup of the server, since it was set up by my predecessor and he left no documentation about it.

    Since the server's hard disk is beyond repair, I decided to try my luck with the backup unit. I took a computer that was lying around, burned a copy of Kubuntu and attached the backup unit to it. So far everything went smoothly. Having a look at the backup, I found out it was in cpio.bz2 format, so I tried to restore the files. After a little research on the Internet I found the following command:
    sudo bzcat backup.cpio.bz2 | sudo cpio -i
    The restore went smoothly until, about halfway through, it stopped with the following error message:
    cpio: premature end of file
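    One thing worth ruling out before blaming cpio is damage in the compression layer itself; a minimal check, using the archive name from above:
    # -t tests the compressed stream without writing anything, -v reports the result
    bzip2 -tv backup.cpio.bz2
    If bzip2 reports the file as ok, the truncation lies inside the cpio archive rather than in the compression.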

    I tried unzipping the backup with Ark and managed to extract the cpio file; then I tried restoring it with:
    sudo cpio -i < backup.cpio
    but I got the same error at the same point.
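    Just listing the archive's table of contents, without writing any files, shows how far cpio can read it; a minimal sketch, assuming the extracted archive is named backup.cpio:
    # -t lists the contents instead of extracting them, -v gives a long listing
    cpio -itv < backup.cpio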

    Since the written files amount to about half the size of the extracted cpio file, I suppose the rest of the backup is still in the cpio file. I had a look at the last file it was writing, and it appears to be a backup of a Thunderbird inbox, which has a size of 2.3 GiB. From what I have already read on different forums, it seems that cpio has problems with big file sizes (over 2 GiB), so that seems to be the problem (I'm not really sure about this, but some of the threads I read pointed in that direction). If this is the problem, then I would be OK with cpio skipping this file (it doesn't seem to be particularly important) as long as I can restore the rest of the files. My problem is that I don't know how to achieve this; on the other forums I searched I didn't find much advice, and man cpio wasn't much help either.
    At the moment I'm trying my luck extracting the backup with Ark again (yes, I deleted my extracted cpio file when I found out it gave the same error), and, when it is finished, I will try to open the extracted cpio in Ark.

    So my questions are, basically, the following three:

    -Is the problem I have due to a limitation at the time the file was written (in which case I imagine I'm in trouble), or is it a limitation of the extraction?
    -Is there any way I can tell cpio to skip this file (or any file that gives this error) and extract the rest of the files?
    -Do you think that using Ark instead of the command-line cpio can solve my problem?

    If you need any further information, feel free to ask; it's just that I don't know what information could be useful to you.

    Best regards, and thank you for reading my questions.

    Joder Illi

    #2
    I know nothing about cpio but there is an option to ignore a file pattern:


    -f pattern (i mode only) Ignore files that match pattern.
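    A concrete invocation might look like this (the pattern is only a guess at the mailbox file mentioned above; adjust it to the real path):
    # -f excludes files matching the pattern; everything else is extracted
    sudo bzcat backup.cpio.bz2 | sudo cpio -idv -f '*Mailbox*'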



      #3
      The -f option didn't work for me: it skips the file, but still gives the same error just after cpio gets to the point where the file would have been processed. I did a little research in the meantime, and I think I found something interesting:

      From the German Wikipedia: Of the two UNIX commands cpio and tar, tar is more widely known and doesn't have the limitation of 4 GB (2 GB for implementations with a signed int for the file size) that applies to the SVr4 format, or 8 GB for the POSIX format.
      From the English Wikipedia: The cpio utility was standardized in POSIX.1-1988. It was dropped from later revisions, starting with POSIX.1-2001, because of its 8 GB file size limit. The POSIX-standardized pax utility can be used to read and write cpio archives instead.
      I think the problem might be that the cpio I'm using (the one that ships out of the box with Kubuntu) has a 2 GB limit, while the cpio that was used to create my backup probably was the version with the 4 GB limit. I will now have a look at the pax utility that is mentioned in the English Wikipedia; maybe this helps me along.
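      One way to check which format the backup actually uses is to inspect its magic bytes; a minimal sketch, assuming the extracted archive is named backup.cpio:
      # newc archives start with the ASCII string "070701", odc with "070707";
      # the old binary format appears as the bytes c7 71 on little-endian machines
      head -c 6 backup.cpio | od -c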

      In the meantime, can somebody tell me if I'm right with my assumption about the 2 GB limit?

      It seems that pax doesn't handle input files over 2 GB. So I'm back to square one. :-(
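      For reference, a plain pax read of a cpio archive looks like this (a minimal sketch, not necessarily the exact invocation used here):
      # -r reads an archive from standard input, -v lists entries as they are read
      pax -rv < backup.cpio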
      Last edited by joder; Nov 08, 2012, 04:01 AM.

      Comment


        #4
        The OpenSolaris docs say that their versions of pax/cpio have the 8 GB limit. Maybe you could install OpenSolaris in a virtual machine and get past this one task?

        The manpage for pax on Ubuntu states:

        CAVEATS

        Different file formats have different maximum file sizes. It is recommended that a format such as cpio or ustar be used for larger files.

        Format      Maximum file size
        ar          10 Gigabytes - 1 Byte
        bcpio       4 Gibibytes
        sv4cpio     4 Gibibytes
        cpio        8 Gibibytes
        tar         8 Gibibytes
        ustar       8 Gibibytes

        So they say 8 GB is supported. Your problem might lie elsewhere. Have you tried verbose mode to see if any errors show up?

        cpio -idvk < FILENAME.cpio



          #5
          I tried with -k, since I think this could skip the error I'm getting, but cpio tells me:
          cpio: invalid option -- 'k'
          Try `cpio --help' or `cpio --usage' for more information.

          The error I'm getting is the same one I already stated above (it doesn't matter whether I turn verbose mode on or not):
          cpio: premature end of file
          The cpio version I'm using is 2.11.
          I'm starting to think that the problem might not be the version of cpio I'm using, but the version used at the point the backup was created. I think this because I played around with the -H option, and the only format that actually manages to extract something out of the backup is the bin format, which, according to the GNU manual, has a limitation of 2,147,483,647 bytes per file (more or less the 2 GB limit I hit with my file).
          Now I'm trying to get an implementation of cpio that supports the -k option (I already tried apt-get install cpio, but it tells me that I have the most current version). Any ideas about how I can achieve this?
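          For completeness, the -H experiment mentioned above amounts to something like this (a sketch, using the archive name from earlier in the thread):
          # force the old binary format instead of letting cpio autodetect it
          sudo cpio -idv -H bin < backup.cpio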



            #6
            Sorry, that was a typo, I meant to write

            -idv


            Another cause of that error could be that the archive was written with a non-standard block size.
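            If so, GNU cpio can be told the block size explicitly; a couple of sketches, same archive name as before:
            # -B switches from the 512-byte default to 5120-byte blocks
            sudo cpio -idv -B < backup.cpio
            # or set an arbitrary I/O block size in bytes
            sudo cpio -idv --io-size=32768 < backup.cpio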

            Have you tried opening the file with Ark?



              #7
              Originally posted by oshunluvr:
              Sorry, that was a typo, I meant to write

              -idv
              -idv just gives me the cpio: premature end of file error

              Another cause of that error could be that the archive was written with a non-standard block size.

              Have you tried opening the file with Ark?
              Yes. When I select the option "Open with Ark", Ark fires up and starts loading the archive. It keeps loading for a while and then closes itself. The same happens if I open Ark first and then tell it to open the file. So I suppose it internally runs into the same error as I do.

              I still think that the best solution for me would be to get hold of a cpio that supports the -k option. It could then probably skip the one file that is causing all the problems and restore the rest of the backup.



                #8
                I tried my luck with:
                afio -i -v -k - < backup.cpio
                but I ran into the following error at the same spot as with cpio:
                afio: "path/to/file/Mailbox-Backup.zip": Invalid argument
                path/to/file/Mailbox-Backup.zip -- okay
                Segmentation fault (core dumped)

                I hoped that afio would be able to skip the error with the -k option, but now I run into a segmentation fault :-(
                Any ideas?

                Edit: I just ran afio again, this time with sudo. The output is basically the same, except that it doesn't show the segmentation fault; afio still exits after printing path/to/file/Mailbox-Backup.zip -- okay.
                I suppose this means that the problem is in the cpio file itself (cpio hit an error at this file and exited while creating the backup). The only thing I find a little strange is that my cpio file is about 60 GB, but the extraction only manages to write about 35 GB.

                I will try afio again, trying to skip this file, but I suppose there is little hope for me.

                Edit: I tried afio again, this time skipping the problematic file, and afio exited just after skipping this file. So I suppose this means that my backup is messed up and I can't recover it.
                Last edited by joder; Nov 14, 2012, 04:33 AM.

