UNIX compress (12 or 13 bit)
APW C port by Andy McFadden (fadden@cory.berkeley.edu)
Version 1.1 December 1989
compress [-dcvfV] [-b maxbits] [file ...]
-d: Decompress instead of compressing.
-c: Write output to stdout; don't remove the original.
-b: Limit the maximum number of bits per code [12 or 13].
-f: Force the output file to be generated, even if one already exists
    and even if no space is saved by compressing.  If -f is not used,
    an existing output file will not be overwritten.
-v: Write compression statistics.
-V: Print version info.
file..: Files to be compressed.  If none are specified, stdin is used.
Output goes to file.Z (with the same attributes as the original), or to
stdout (if stdin is used as input).  The original file is not replaced if
no compression is achieved.  Filenames must be short enough to allow the
".Z" suffix to be appended.
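For example (the filenames here are only illustrative):

   % compress -v -b13 letter.txt       (creates letter.txt.Z, prints statistics)
   % compress -d letter.txt.Z          (restores letter.txt)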
To uncompress a file, the best way is to put
alias uncompress compress -d
in your login file. Then just use uncompress with the above options.
Note that the maximum for this version is ** 13 bits ** . If you use
the compress command normally under UNIX, you will make a compressed file
which may use 16 bit codes. This port CANNOT uncompress such a file, so make
sure you do something like
% compress -b13 file1 file2 ...
when you are compressing the files initially.
- **** Benchmarks / statistics:
(note these were made with a 1K I/O buffer; see comments below)
File          Storage      Compress        Uncompress     Size
--------------------------------------------------------------------------------
Moria GS      hard drive   870/1024 sec    345/330 sec    66% / 64% of original.
  (577K)
compress.c    3.5" drive    55 / 69 sec     38 / 37 sec   47% / 41% of original.
  (45K)       hard drive    46 / 60 sec     28 / 30 sec
              /ram5         41 / 54 sec     24 / 24 sec
The double entries correspond to 12-bit and 13-bit compress.  13-bit compress
generally takes slightly longer to compress, but about the same time (or, in
the case of Moria GS, even LESS) to uncompress, so it is probably to your
advantage to use 13-bit codes.
By way of comparison, ShrinkIt v2.1 takes about two minutes to compress
Moria GS, with slightly better compression.  It takes about 10 seconds to
compress "compress.c"; the resulting file is 49% of the original.  Generally
speaking, ShrinkIt compresses binary or sparse files better than UNIX compress
can, but it has a hard time matching compress's crunching power on text files.
It should be painfully obvious from the few statistics here that disk speed
plays a large role in the time required. After Larry Virden pointed out
the virtues of setvbuf(), I boosted the buffer size to 8K.  The times to
compress and uncompress "compress.c" on a 3.5" drive were reduced by six and
three seconds, respectively.
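To give an idea of what that change involves, here is a minimal sketch of
using setvbuf() on stdio streams.  This is NOT the actual compress source;
the copy loop, names, and error handling are only illustrative, and 8192
simply matches the 8K figure mentioned above.

    #include <stdio.h>

    #define IOBUFSIZ 8192   /* 8K stdio buffer, per the note above */

    int main(int argc, char *argv[])
    {
        static char inbuf[IOBUFSIZ], outbuf[IOBUFSIZ];
        FILE *in, *out;
        int c;

        if (argc < 3)
            return 1;
        in = fopen(argv[1], "rb");
        out = fopen(argv[2], "wb");
        if (in == NULL || out == NULL)
            return 1;

        /* setvbuf() must be called before the first read or write on a
         * stream; _IOFBF requests full buffering with our own buffers. */
        setvbuf(in, inbuf, _IOFBF, IOBUFSIZ);
        setvbuf(out, outbuf, _IOFBF, IOBUFSIZ);

        while ((c = getc(in)) != EOF)   /* the real (de)compression loop goes here */
            putc(c, out);

        fclose(in);
        fclose(out);
        return 0;
    }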
- This will require about 125K free memory to run.
- The executable file is NOT compacted.  When I ran compact on it, no
errors were reported, but the resulting program crashed when run.  I
tried this several times, all to no avail.  Apparently this is due to
the increased array sizes for 13-bit compress; there were no problems
compacting a 12-bit-only version (obtainable by setting USERMEM to
65000 in the first few lines of the program; see the sketch below).
Using static arrays instead of global arrays didn't help (it still
crashed, in more or less the same bank 0 location).
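For reference, the 12-bit-only build described above amounts to a change
like the following near the top of compress.c (the surrounding #ifdef
logic varies between source versions, so take this as a sketch):

    #define USERMEM 65000   /* small enough that only 12-bit code tables are built */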
- When uncompressing, either don't type the last .Z or type ".Z", not
".z". Under UNIX, ".z" is a different kind of file, and some parts of
compress do distinguish case.
- Compressing a file to stdout will probably be a mistake, since I believe
that will convert linefeeds (hex 0a) to carriage returns (hex 0d). However,
uncompressing a text file to a file works fine. Uncompressing a text file
to the screen has the usual APW one-line weirdness.
- This version does not properly support wildcards, although it does
correctly handle ".." and device numbers. Things generally work the way
you would expect them to, except for '=' and '?'.
- The UNIX version prompts before overwriting; APW (apparently) does not
allow reading from stderr, which is what compress wants to do, so I just
made it not overwrite the file unless the -f option is used.
- If you compress a file on UNIX and try to download it, it may grow in size
because some transfer programs/protocols append null characters to the
ends of files. Unfortunately this may cause compress to become confused...
the best solution is probably to encapsulate it with something like NuLib
(archive w/o compression) and then use ShrinkIt to extract the compressed
file (extract uncompressed; works pretty fast). This will preserve the
original EOF.
- This won't work on subdirectories or extended files.