remaster puppy deleted everything on my drive
Hmmm... I tried the ram2sfs script a few times doing different things each time hoping to get it to work, but all I ended up with each time was a 36k lupu_528.sfs file.
Oh well, at least it didn't delete everything on my partition.
[color=blue]A life! Cool! Where can I download one of those from?[/color]
miriam,
As the maintainer of lucid pup 5.2.8.7 and its remasterpup2 script, I am very concerned about this problem. Thank you for reporting it.
The main clue, so far, is the duplicated free space field in /tmp/schoices.txt. The only way I can see that that could happen is if the 'df -m' command produced a second line beginning with "/dev/sda2". I am not sure how that would affect the command that wiped your partition, but I am not ruling it out as a cause.
The command that produces the lines:
sda2 "Filesystem: ext4 Size: 873805M Free: 73743
73743M (currently mounted)" \
is
Code: Select all
AFREE="`df -m | grep "$AFPATTERN" | tr -s " " | cut -f 4 -d " "`"
where AFREE becomes
Code: Select all
73743
73743
The data we need to confirm this possibility is what you would see by executing the command, "df -m" (w/o quotes). Could you post that for us?
Richard
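The duplication is easy to demonstrate in isolation. Below is a minimal sketch using a mocked 'df -m' output (the mock_df function is invented for illustration); the pipeline and variable names follow the remasterpup2 snippet above.

```shell
#!/bin/sh
# Mocked 'df -m' output: the hypothesised case where /dev/sda2
# appears twice (e.g. mounted in two places).
mock_df() {
cat <<'EOF'
Filesystem 1M-blocks Used Available Use% Mounted on
/dev/sda2 873805 800062 73743 92% /mnt/big
/dev/sda2 873805 800062 73743 92% /initrd/mnt/big
EOF
}

AFPATTERN="/dev/sda2"

# Unpatched pipeline: both matching lines survive, so AFREE
# ends up holding two newline-separated values.
AFREE="`mock_df | grep "$AFPATTERN" | tr -s " " | cut -f 4 -d " "`"
echo "unpatched: $AFREE"

# With -m 1, grep stops after the first match, so AFREE is a
# single number no matter how many times the pattern appears.
AFREE="`mock_df | grep -m 1 "$AFPATTERN" | tr -s " " | cut -f 4 -d " "`"
echo "patched: $AFREE"
```

The -m/--max-count option is available in both GNU grep and busybox grep, so the fix should be safe in a Puppy environment.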
Sorry it took me a while to get back on this. I've been spending a while recovering my drive (well, the data) and sifting through the 2.7 million files thereon. I've written some programs that might help some others recover lost data. When I've finished doing this I'll post them to the forum.
I've installed a new version of Puppy, though this one is Puppy Slacko 632 as a frugal install until I can afford either a new computer or a third internal hard drive.
Running
Code: Select all
probepart -m 2> /dev/null | grep '^/dev/' | grep -E 'ext2|ext3|ext4|reiserfs|msdos|vfat|ntfs' | cut -f 1-3 -d '|'
gives the result:
Code: Select all
/dev/sda1|ext3|40063
/dev/sda2|ext4|873805
/dev/sda3|ext4|20000
/dev/sda4|ext4|20000
/dev/sdb1|ext3|435973
/dev/sdb2|ext3|40962
When I run the command
Code: Select all
df -m
it prints out:
Code: Select all
Filesystem 1M-blocks Used Available Use% Mounted on
/dev/sda1 39307 25446 11858 69% /mnt/lupu
tmpfs 201 200 2 100% /initrd/mnt/tmpfs
/dev/loop0 171 171 0 100% /initrd/pup_ro2
/dev/loop4 30 30 0 100% /initrd/pup_z
unionfs 39307 25446 11858 69% /
tmpfs 823 2 822 1% /tmp
devtmpfs 1644 0 1644 0% /dev
shmfs 715 0 715 0% /dev/shm
/dev/sda2 859966 396176 420085 49% /mnt/big
/dev/sda3 19557 14451 4091 78% /mnt/lucid
/dev/sda4 19558 5139 13404 28% /mnt/precise
/dev/sdb1 429003 390168 17037 96% /mnt/work
/dev/sdb2 40192 35560 2584 94% /mnt/temp
/dev/loop1 142 142 0 100% /initrd/pup_ro4
As you see, I have 2 internal hard drives; sda is partitioned into 4 and sdb into 2. I mount them all by their label names (makes it much easier to work out what window belongs to what on a screen cluttered with several windows). I don't know if it makes a difference, but I also always have a ramdisk (/mnt/ram0) mounted which dynamically alters its size to accommodate whatever data I give it. I have 3GB of RAM, which is not a lot nowadays, though was a heck of a lot not long ago... God! I just bought a 128 GB thumbdrive for $40 yesterday. Holy cow! I love technological advancement. But gimme a petabyte drive already! Although, as this thread shows, the more data you have, the more you risk losing.
me too
I had 2 big crashes. First some years ago, when my old PC (before 2004 and 1G RAM, Palomino and geforce256) crashed on XP. I switched to Puppy Wary and recovered most, including the machine. This was all right.
But almost 2 years ago I installed different puppies, started experimenting... and had my HDD wiped. It had 7 or 8 partitions (and a swap), all of them were empty.
I discovered it only when I tried to start up again, so I am not sure if I had done a remaster. But it is quite possible.
If so, I am quite sure I built the new master-sfs on an external harddisc and not on the internal HDD. Maybe it matters in finding the cause?
I had tahr, wary, precise, lupu, classic (214X), Gamer (LegacyOS) and probably dsl and HBCD installed.
One partition used to hold most data on vfat.
I recovered a lot of data with recuva on HBCD, but I lost some precious data too (documents, photographs and movies, some music... ). It was bad.
Since then I do no remasters anymore - though I was not aware this could have been the cause. But I was considering it lately, since I use Tahr with quite some replacements for the standard apps. Now I certainly reconsider!!
(A barebone version of every distro would suit me.)
I use 2 external harddiscs now. They both hold a backup of my data and documents. I backup a few times a month, sometimes much more. One of the discs I only connect for this purpose.
Sometimes I copy a pupsave and things I hold in /mnt/home (like personal settings like .Mozilla), but I do not consider installed distros valuable because I fiddle with them all the time.
Miriam, thank you for sharing your experience on this forum. It rang a bell for me.
I hope you recover well.
And it looks like something good is coming out of it as well.
Re: me too
chilidog wrote:Since then I do no remasters anymore - though I was not aware this could have been the cause. But I was considering it lately, since I use Tahr with quite some replacements for the standard apps. Now I certainly reconsider!!
(A barebone version of every distro would suit me.)
1. remastering an iso is easy. you mount the iso, get the files on it, make a new image. you can do it using the isomaster gui, leaving the boot stuff intact.
2. you can copy the puppy sfs file, mount or unsquash that, and do the same thing, mksquash it back up, and put it back in the iso with isomaster.
1 and 2 are two levels of remastering. 1 lets you add packages (like pets) though i dont know if they will automatically load on boot- possibly. 2. lets you change basically all the files in puppy, except the ones in initrd. that i havent tried yet.
the kind of remastering that people are talking about here, is probably based on copying the contents of the drive, or the ramdrive (aufs?) same process, only difference is where it gets its stuff. but if you want a safer remaster you can skip the usual remaster scripts. you can do it without them, thats the thing im trying to say.
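for the curious, the unsquash/resquash route in step 2 looks roughly like this. this is only a sketch, assuming squashfs-tools is installed and you run it as root; the sfs name and paths below are examples, not the actual files on any given install.

```shell
#!/bin/sh
# Sketch only: hand-rolled "level 2" remaster. Assumes squashfs-tools
# (unsquashfs/mksquashfs) is available and this runs as root. The sfs
# name below is an example, not the real file on your system.
set -e
mkdir -p /tmp/remaster && cd /tmp/remaster

# 1. unpack the main sfs into an editable directory tree
unsquashfs -d rootfs /mnt/home/puppy_slacko_6.3.2.sfs

# 2. change whatever you like inside rootfs/
#    (swap apps, delete packages, edit configs ...)

# 3. squash the tree back up into a fresh sfs
mksquashfs rootfs puppy_slacko_6.3.2.sfs -noappend

# 4. put the new sfs back into the iso tree with isomaster (gui),
#    leaving the boot files untouched, and save the new image.
```

nothing here touches the partition you booted from, which is the point: the working copies live in /tmp/remaster and the iso is rebuilt from them.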
Last edited by learnhow2code on Tue 26 Jul 2016, 15:51, edited 2 times in total.
miriam,
Thanks for posting the output of 'df'. It seems different than it would have to be to cause the duplicated value, possibly because the partition is different now.
miriam wrote:I mount them all by their label names (makes it much easier to work out what window belongs to what on a screen cluttered with several windows)
Or might it be related to that? Was sda2 mounted as you say when that command was issued? If not, could you mount sda2 that way and then run 'df'?
Although we might never know why the "wipe" occurred, I conclude that the "grep" in the statement I posted should limit the number of "hits" to one, to prevent the possibility of duplication (df -m | grep -m 1 "$AFPATTERN" |...). Maybe that will also avoid the "wipe". I need to look further to determine how the "wipe" might happen.
Richard
@learnhow2code
learnhow2code wrote:2. lets you change basically all the files in puppy, except the ones in initrd. that i havent tried yet.
It's the 2nd thing that interests me.
[I just discovered my favorite browser, dillo, cannot execute all formatting buttons in this forum. Must be javascript? I have to add the code myself. I am new here.]
Thank you!!
I didn't realize it myself, but you are so right.
It's probably even easier than using the remaster script.
I would probably take more out than put in, which is a bit tricky too - sfs is there for adding things.
I'll try it.
By the way, I did already edit and change init in initrd.gz.
me too
I looked up some details of my 'wipe out'. It happened on 3 feb 2016. I didn't realize it was so recently.
I used the remaster script in Tahr 6.0.2(5).
I made a working directory on an external NTFS harddisc to do it. The .iso too was made on the external. I probably had a rather small pupsave, which I thought could be a problem.
Re: me too
chilidog wrote:I made a working directory on an external NFTS harddisc to do it. The .iso too was made on the external.
this could be old/outdated, but i never trust ntfs partitions when doing extensive work with files from gnu/linux. yes its possible for it to go well, but traditionally the ntfs support is not to be trusted for "everyday" use. ymmv. this is not necessarily the issue you had, however.
Sylvander, I used photorec to recover as many of the files as possible to another drive. Unfortunately the directory structure was irretrievable. The problem with photorec is that it has no idea what the files' names are, so it gives them uninformative names, and it doesn't know where they are supposed to be in the directory structure. This is a real hassle with the hundreds of log files I instruct my scripts to write to their current directory. Those files don't make much sense without the path as context.
The easiest to organise are the photos. The metadata in the files usually records the name of the camera (or smartphone) used to snap the images. It also records the date and time of the photos. If your camera or phone has a GPS device and it's switched on then it may record the location and direction pointed. (If you ever photograph something of questionable legitimacy it is wise to remember this.) I've already written a simple script that renames photos according to the camera and date and collects them in folders under year and month.
Next easiest are the mp3 files. If you are as obsessive as I am you'll have ensured your mp3 files have the appropriate artist, track name, album name (or speaker, talk title, site name, etc if they're talks). They can be renamed from that data.
HTML files generally have the <title></title> tags in the head of each file so sed can be used to get that for renaming the files.
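A minimal sketch of that sed trick, assuming the <title> sits on one line (common, but not guaranteed); the helper name and file paths are made up for illustration:

```shell
#!/bin/sh
# Rename an html file after its <title> tag. rename_by_title is a
# hypothetical helper; titles containing '/' would need sanitising first.
rename_by_title() {
    f="$1"
    title=$(sed -n 's/.*<title>\(.*\)<\/title>.*/\1/p' "$f" | head -n 1)
    # only rename when a title was actually found
    [ -n "$title" ] && mv "$f" "$(dirname "$f")/${title}.html"
}

# usage: rename_by_title recovered/f0012345.html
```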
I try to always put the name, author, date, description details at the top of all my scripts (bash scripts, sed, awk, python, etc). I'm currently working out the best way to reliably rename files according to that data.
A bad problem is the hundreds of thousands of text files. I'm working out a way to add the title, author, and path info into the top of each file. It doesn't help my current problem, but will be a lifesaver if this happens to me again. I have formed the habit over many years of writing most of my notes with the first line being the same as the title of the file, so I may be able to retrieve a lot of my own notes this way already.
Worst of all are the video files. While mp4 (which has become the standard) is capable of carrying metadata about author, title, date, etc, few actually do. I like to watch Ben Heck's geeky electronics videos. He puts all that data in, but I also like to watch Blender3D tutorials by various people and they almost never include it. TED Talks have that info. None of the YouTube videos do. I need to write a program that ripples through my folders writing name, date, and other related data to mp4 videos.
I use exiftool to find the metadata in files. It's a perl program. Exiftool is an incredibly versatile program. It not only allows you to query particular tags in files, but also to rename the files according to those tags. For a couple of years I've had a script in my context menus to rename mp3 files according to their duration (so they'll be named something like "Radiolab 2011-06-20 - Talking to Machines [1:05:47].mp3" -- oh, by the way, Radiolab is a wonderful show, well worth listening to).
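For anyone wanting to try the same approach, exiftool's tag-copying syntax can drive renames directly. This is only a sketch; the directory names are examples, and you'd point it at wherever photorec put the recovered files.

```shell
#!/bin/sh
# Sketch of metadata-driven renaming with exiftool. Directory names
# are examples, not real paths on any particular system.

# photos: sort into year/month folders by their embedded timestamp
exiftool '-Directory<DateTimeOriginal' -d 'sorted/%Y/%m' recovered_photos/

# mp3s: rebuild a readable name from the artist and title tags
# (%e keeps the original file extension)
exiftool '-FileName<${Artist} - ${Title}.%e' recovered_mp3s/
```

Files with no usable tags are simply left alone, which makes it safe to run over a mixed pile of recovered files.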
I'm doing a lot of other things at the same time, so I won't finish these scripts for a while, but when I do I'll post them here.
learnhow2code wrote:ntfs support is not to be trusted for "everyday" use
Both my external HDs are in ntfs, from the factory. They are backups. What should I use for that?
learnhow2code wrote:this is not necessarily the issue you had, however
Probably not. I still have the directory and the iso that were made by the script.
I used the utility to remove some modules from the installation. They are not really removed, but marked and then removed on remastering. Could that be something?
@miriam
1. "Sylvander, I used photorec to recover as many of the files as possible to another drive. Unfortunately the directory structure was irretrievable. The problem with photorec is that it has no idea what the files' names are, so it gives them uninformative names, and it doesn't know where they are supposed to be in the directory structure. This is a real hassle with the hundreds of log files I instruct my scripts to write to their current directory. Those files don't make much sense without the path as context."
Same here with my photo files recovered using PhotoRec.
My photo files don't make much sense without the path and filename as context.
2. "The easiest to organise are the photos. The metadata in the files usually records the name of the camera (or smartphone) used to snap the images."
Different for me here.
Mine are all old paper photos that I scanned to make into photo files.
3. I believe the photo files are the only highly valuable stuff, but there's always a possibility I could be wrong.
I spent YEARS scanning and identifying and sorting all those photos, and I gave up [conceded defeat] just short of completing the job.
The unscanned photos are in a separate box, but I now have no scanner.
Then I LOST all of the computer copies of the scanned photos...
And so far cannot get up the energy/will-power to attempt to go through all of them again.
I'm getting old, my remaining years are numbered, is this how I should spend them?
I'd like to hand them down to my offspring, but so far all 3 seem disinterested in the Family Tree [got back to the 1600's; didn't lose it] and the photos.
Made paper copies of the Family Tree for each of my brothers and sisters, but they seemed disinterested.
An Australian descendant of my wife's ancestors made contact and was interested, and we were swapping info [learned some interesting stuff about the 1st Australian settlers], but then he died.
He told me of an American branch, but I haven't been in contact with them.
I contacted some Canadian descendants of my ancestors.
Sylvander, oh dear. I just checked some of my scanned images and they don't have much metadata at all stored in them. That's certainly something to be aware of in the future.
There might be a quick way to sort through all the salvaged images. If you divide the time up into say, half an hour each day then you could get through the pictures in a surprisingly short time -- a month or two, perhaps. Many times we balk at the enormity of a task when we know how large it is, whereas if we didn't understand that we would just do it.
I count as one of my best assets a peculiar kind of blindness to my limitations. I often attempt things that a more sensible person would immediately realise can't be done, but because of my inability to see that, I very often manage to complete these "impossible" tasks anyway.
If you had a simple script that displayed a picture alongside a dialog box into which you could type a name and other details (date, names of other people in the pic, etc), and that on closing popped up the next picture with another dialog waiting for more input, it would take a lot of the tedium out of the task. You could have the box of photos to hand to refer to anything written on their backs. It would also probably be smart to have the data that you enter automatically inserted into the image's metadata. Exiftool can add the metadata. And yad is a great tool for building dialogs.
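The core loop could be sketched like this, assuming yad and exiftool are installed. The folder name and field list are examples; a real version would want title sanitising and a resized preview for big scans.

```shell
#!/bin/sh
# Sketch of the tag-as-you-go loop: show each photo, ask for details,
# write the answers into the file's metadata. Folder and fields are
# examples only. Large images are shown unscaled by yad --image.
for f in scans/*.jpg; do
    # yad --form returns the typed values separated by '|';
    # cancelling the dialog ends the session
    data=$(yad --form --image="$f" --title="$f" \
               --field="Title" --field="Date" --field="People") || break
    title=$(echo "$data"  | cut -d '|' -f 1)
    pdate=$(echo "$data"  | cut -d '|' -f 2)
    people=$(echo "$data" | cut -d '|' -f 3)
    # write what was typed straight into the image's metadata
    exiftool -overwrite_original \
        -Title="$title" -CreateDate="$pdate" -Keywords="$people" "$f"
done
```

Because the answers end up inside the files themselves, the naming survives any future copy or recovery, which is exactly the lesson of this thread.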
If you don't feel up to writing something like this give me a little while and I'll have a shot at it and post it here. It'll be a little while though... could be months.
Don't worry too much about the young-uns not being interested in the past. That's normal. They'll become interested in time. Just make it ready for them so it isn't lost.
Just want to highlight a post by slavvo that may be of interest to people with file recovery / identification issues:
http://www.murga-linux.com/puppy/viewto ... 987#917987