Strange behavior in relation to the zip api - Lucid 511
I tried to zip a group of .jpg files and noticed that the resulting files were only about 30 bytes each. That is far too much compression for even the ultimate compression program. The original files were overwritten, which of course means the files were lost.
I would like to know the proper procedure for zipping a group of files to avoid making this kind of error in the future.
Thanks for any help on this!
========
The command used was <gzip *.*>, after changing to the directory where I had put the files. All the files were .jpg.
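For what it's worth, gzip compresses each file in place and replaces it with a .gz version, so the originals disappear by design. A minimal sketch in a scratch directory (the file names here are made up); using "gzip -c" writes the compressed data to stdout instead, so the original survives:

```shell
# Made-up example files in a scratch directory.
rm -rf /tmp/gzip-demo && mkdir -p /tmp/gzip-demo && cd /tmp/gzip-demo
printf 'example image data' > photo1.jpg
printf 'example image data' > photo2.jpg

# Plain gzip replaces the file with a compressed copy:
gzip photo1.jpg                      # photo1.jpg -> photo1.jpg.gz

# With -c, the output goes where you redirect it and the input stays:
gzip -c photo2.jpg > photo2.jpg.gz   # photo2.jpg is untouched
ls
```

Newer gzip builds also accept -k ("keep"), but -c works everywhere.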
All I can say is: back up your files first.
Just create a new directory and copy the files into it, then try to zip them in that directory; this will leave the originals in their own directory.
You can create a new directory in Rox by right clicking in an open directory and selecting 'New' / 'Directory'.
Thanks, Ian, for responding! I did as you said and copied about 321 KB in 39 .png files to another location/partition on the same (NTFS) drive.
The problem is that I used the default compression ratio, and after running the command <cat "forziptest" | gzip > forziptest.gz> I'm left with two files in the directory: 'forziptest' and 'forziptest.gz'. Together those two files are under 1 KB, and even for the best compression program ever, that is too much compression for the aggregated total. So I'm assuming I'm doing something awfully wrong.
In case you're wondering about the format of the file containing the list of files to be compressed: the first time, I used the file produced by the command <ls > forziptest>. As you're surely thinking by now, that was wrong for my purpose, because I believed the cat command expects a list of files separated by spaces rather than by carriage returns. I noticed the discrepancy, edited the file, and ran the command again. The results were similar in both cases.
I haven't tried to uncompress the .gz file yet, because I would first like to know how to send the output of the appropriate command somewhere else.
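If it helps: "cat forziptest | gzip" compresses only the text inside forziptest (the list of names), not the files it names, which is why the .gz comes out tiny no matter how the list is separated. A sketch in a scratch directory with made-up names, reading the list line by line and compressing each listed file:

```shell
rm -rf /tmp/list-demo && mkdir -p /tmp/list-demo && cd /tmp/list-demo
printf 'aaaa' > a.png
printf 'bbbb' > b.png
ls *.png > forziptest

# This only gzips the few bytes of text in the list file itself:
cat forziptest | gzip > forziptest.gz

# To compress the files the list names, read it one line at a time:
while read -r f; do
    gzip -c "$f" > "$f.gz"    # -c keeps each original alongside its .gz
done < forziptest
ls
```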
If you're referring to the application with an icon on the desktop named 'zip', then yes. I started that app, but I don't know how to use it for compression. I have used it a couple of times before for decompressing .tar files, though. I tried to follow the instructions found under its Help tab, but all I created was a copy of the files; nothing got compressed.
From the instructions there it seems it should be possible, but not tonight for me... I did create a folder for the compressed files inside the folder where the original files were, but they just got copied into it.
I'll try to play a little more with this tomorrow.
Thanks for responding, Karl Godt! I'm guessing you were trying to point out that the command <cat "forziptest" | gzip > forziptest.gz> is weird. The explanation is as follows:
1) I issued the command "ls > forziptest" to create a file listing the files in the current directory;
2) I edited forziptest to separate the file names with spaces instead of the CR+LF it was originally created with;
3) I intended to use the edited file as the source for the gzip command (which didn't work);
4) I later tried to gzip the directory the files resided in, as "gzip fotos" (from the parent directory), but gzip did not accept that command.
I must say I would like to understand your code better. I have the feeling it involves a for loop, with 'i' as a variable holding the names of the files found under the directory of interest.
Would you mind clarifying it? Thanks!
The star '*' is a wildcard/joker in this case, and matches both directories and files in a for loop.
"find /dir -type f" restricts the matches to regular files.
If you use a database file, make sure the basename and dirname of each entry are correct.
For a database.file:
"cat database.file | while read line ; do echo $line ; done"
"line" is also a variable, like "i", holding the currently read line.
I really think you understood the code right.
You can of course replace "line" or "i" with something like "Mp3Line" or "INTERioR".
Just watch out for PATH and other variables shown by the "set" or "env" commands.
Code: Select all
PATH=`cat database.file`
for CurrentPath in $PATH ; do
basename $CurrentPath
done
would likely give something like "basename: command not found", because /bin, /usr/bin, /sbin, ... are no longer in the PATH.
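A quick way to see that warning in safe form: the same loop with a neutral variable name works fine, because nothing the shell relies on gets overwritten. The database file and paths below are made up for illustration:

```shell
rm -rf /tmp/path-demo && mkdir -p /tmp/path-demo && cd /tmp/path-demo
printf '/some/dir/file1\n/some/dir/file2\n' > database.file

# "FileLine" is harmless; assigning to PATH here would have broken
# the lookup of basename itself.
for FileLine in `cat database.file` ; do
    basename "$FileLine"    # prints file1, then file2
done
```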
It took me a little time to answer; I wasted the whole day finding the dependencies for compiling Xorg-7.1.1.1 in Lucid, and a newer header.h still forced me to run "make -k".
I tried your first 'for' command, but the result was a per-file expansion rather than compression (the resulting files were bigger than the originals). It might have to do with the fact that they're .png files, most of them under 5 KB. On that point, I should say first that I used the default compression factor, and second that I've read it should be possible to combine similar files into just one bigger compressed file; that last approach should improve the overall compression (I haven't tried it, though).
I also tried your second command, but with this one I had no luck. Its output was:
# find . -maxdepth 1 -type f -name "*.png" exec gzip {} \;
find: paths must precede expression: exec
Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression]
Since I don't understand the command, I haven't tried troubleshooting it yet.
Thanks for your kind help, Karl Godt!
That is the same error message I get when I forget the '-' before "exec".
So the error message is confusing; the path is already given by '.', which means the current directory.
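With the '-' restored, the clause parses and gzip runs once per match. A sketch in a scratch directory with made-up .png names:

```shell
rm -rf /tmp/find-demo && mkdir -p /tmp/find-demo && cd /tmp/find-demo
printf 'xx' > a.png
printf 'yy' > b.png

# -exec (note the dash) runs gzip on each match; '{}' is the file
# name placeholder and '\;' terminates the clause.
find . -maxdepth 1 -type f -name "*.png" -exec gzip {} \;
ls    # a.png.gz and b.png.gz; the originals were replaced
```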
To compress many files you could also use the "tar" command.
"tar -czf /mypictures.tar.gz /path/to/my/pic/dir"
would c=create, z=compress with gzip, f=use the file name /mypictures.tar.gz, for the directory of pics.
tar is one of the very few commands that, unlike find, doesn't need a '-' before its options, but I have put it there anyway.
For further information, try "pman tar" or just "man tar". Puppy is cut down, so unless you copy man pages from a larger distro into /usr/share/man, you will need the online manual pages that pman uses.
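The tar route can be checked end to end without extracting anything; the directory and file names below are made up, and "tar -t" just lists what is inside the archive:

```shell
rm -rf /tmp/tar-demo && mkdir -p /tmp/tar-demo/pics && cd /tmp/tar-demo
printf 'img-one' > pics/one.png
printf 'img-two' > pics/two.png

# One gzip-compressed archive of the whole directory; the original
# files stay in place.
tar -czf mypictures.tar.gz pics

# t=list: show the archive contents without extracting.
tar -tzf mypictures.tar.gz
```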
Sorry to bother you with this, but I've already read the information at die.net regarding gzip and its options. Nevertheless, I tried the corrected <find...> command you gave above (the first time there was a syntax mistake introduced by me), with the -9 option for the gzip command at the end. I'm interpreting the '{}' symbols at the end of your command line as a place for introducing any gzip option (correct me here if I'm wrong).
The message I have now is:
# find . -maxdepth 1 -type f -name "*.png" -exec gzip -9
find: missing argument to `-exec'
#
I thought the output of 'find' would be used by the next command, in this case 'gzip'. I'll try other variants in the meantime.
Thanks!
The '{}' stands for the file, and every -exec clause needs the '\;' at the end. For example:
Code: Select all
find /dev -type b -name "*hda*" -exec ls -l {} \; -exec file {} \;
would give output like:
brwxrwxr-x 1 root 6 3, 13 2000-09-27 12:31 /dev/hda13
/dev/hda13: block special (3/13)
brw-r--r-- 1 root root 3, 10 2011-04-06 19:07 /dev/hda10
/dev/hda10: block special (3/10)
You should really run "file /path/to/the pic.jpg" to check whether the .jpgs are already compressed. I don't know if mtpaint uses gzip for that. In my mtpaint there are compression values selectable for .tga and .png files at the moment, but they don't work for now; they must be disabled at compile time.
And I have not run gzip on already-gzipped files; maybe gzip reverses and unzips them without an error message.
Code: Select all
find . -maxdepth 1 -name "*.png" -exec file {} \; -exec gzip {} \;
find . -maxdepth 1 -name "*.png.gz" -exec file {} \; -exec ls -s {} \;
./SHOT2.png.gz: gzip compressed data, was "SHOT2.png", from Unix, last modified: Wed Jun 8 22:16:06 2011
240 ./SHOT2.png.gz
Code: Select all
find . -maxdepth 1 -name "*.png.gz" -exec file {} \; -exec gzip {} \;
./SHOT2.png.gz: gzip compressed data, was "SHOT2.png", from Unix, last modified: Wed Jun 8 22:16:06 2011
gzip: ./SHOT2.png.gz already has .gz suffix -- unchanged
Code: Select all
find . -maxdepth 1 -name "*.png.gz" -exec file {} \; -exec gunzip {} \;
find . -maxdepth 1 -name "*.png" -exec file {} \; -exec ls -s {} \;
./SHOT2.png: PNG image data, 1280 x 1024, 8-bit/color RGB, non-interlaced
3856 ./SHOT2.png
I think I should have said something important much earlier: my nephew's OS is a Windows version, so I'm supposing he'll be using the zip application that every 'recent' Windows release comes with. That's the reason to just zip the files, without any other feature that might give him problems uncompressing them later.
I know there should be a way to compress all those files at once with the best compression rate available (-9).
My 'command' of Linux shell commands is quite lacking, as is my knowledge of the available tools. I think I should spend time working on this on my own with the help of a good info source/manual.
Thanks Karl for all your help!
As far as I know, Windows really zips and does not gzip.
There should actually be
Code: Select all
file `which zip unzip`
in Puppy, but in some Puppies it might be a symlink to busybox, which contains good working compression commands.
/usr/bin/zip: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), for GNU/Linux 2.0.0, dynamically linked (uses shared libs), stripped
/usr/bin/unzip: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), for GNU/Linux 2.0.0, dynamically linked (uses shared libs), stripped
gzip and zip are different applications.
Thank you, Karl, for all your patience and help!
I finally did what I wanted. As I thought from the very beginning, it was very easy; the problem was, hum, me! I didn't read the available info carefully enough.
I decided to use the <zip> command from the shell. What drove me away from this included feature was the name of that icon on my desktop, 'gzip', which put in my mind the notion that it was fully compatible with, or some kind of, Windows zip. After reading about it for a few days, I stumbled upon this other Puppy-included application (zip).
I haven't tried using it via the GUI, but at least from the shell it did what I wanted: create a single zipped file from a series of similar files. With option -9 it reduced the overall size by half. The command used (from the directory containing the said files) was:
<zip -T -m -9 koko *>.
This, I think, makes this thread solved. I'll check how to mark it as such, if possible.