A lossless data compression library built in Scala, leveraging the Cats Effect and FS2 libraries.
- Format code -> `sbt formatAll` (see alias.sbt)
- Build/Compile -> `sbt compile`
- Run unit tests -> `sbt test`
- Run unit tests with code coverage report -> `sbt unitTests` (see alias.sbt)
  - Check the report on index.html
- Run program:
  - Compress mode -> `sbt "run compress -f samples/sample_1kb.txt"`
  - Decompress mode -> `sbt "run decompress -f samples/sample_1kb.txt.pico"`
- Run Async profiler -> `./asprof -e cpu -d 30 -f profiler.html (pid)`
  - Get the pid by running `jps -l | grep -i pico`
  - Also available: `-e alloc` (memory), `-e lock` (lock contention), `-e wall` (wall clock time)
- Run JMH benchmarks -> `sbt clean Jmh/run`
  - With options -> `sbt clean "Jmh/run -i 3 -wi 3 -f 1 -t 2"` (3 iterations, 3 warmup iterations, 1 fork, 2 threads; quoting passes the options to the `Jmh/run` task)
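A minimal sketch of what a benchmark picked up by `Jmh/run` can look like. The class, method, and payload below are hypothetical, not the project's actual benchmarks; the placeholder body is where the library's compress/decompress calls would go.

```scala
import java.util.concurrent.TimeUnit
import org.openjdk.jmh.annotations._

// Hypothetical benchmark class; the real benchmarks and the compression
// entry point they exercise are defined by the project, not shown here.
@State(Scope.Benchmark)
@BenchmarkMode(Array(Mode.AverageTime))
@OutputTimeUnit(TimeUnit.MILLISECONDS)
class CompressionBench {

  // Repeatable pseudo-random payload roughly the size of samples/sample_1kb.txt.
  private val payload: Array[Byte] =
    Array.tabulate(1024)(i => ('a' + (i * 31 % 26)).toByte)

  @Benchmark
  def roundTripPlaceholder(): Array[Byte] =
    // Placeholder body: call the library's compress/decompress here.
    payload.reverse
}
```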
- Command: `compress|decompress`
  - Selects compress or decompress mode
- File: `-f fileName` or `--file fileName`
  - Target file to compress or decompress
- ChunkSize: `-c chunkSize` or `--chunkSize chunkSize`
  - Only applies to the `compress` command mode; sets the source file stream chunk size in kilobytes
- Verbosity: `-d` or `--debug` (for debug mode) and `-t` or `--trace` (for trace mode)
  - Updates the root log level for more verbose logging
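For example, the documented options can be combined in a single run; the chunk size value here is only illustrative:

`sbt "run compress -f samples/sample_1kb.txt -c 64 --debug"`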
- Compression (see the sketch after these steps):
  - Stream the source file data into fixed-size chunks
  - For each chunk, build a Huffman tree and encode both the tree and the data
  - Stream the output into the target file, wrapping each encoded chunk with a chunk delimiter
- Decompression:
  - Stream the compressed file data, splitting it by the chunk delimiter
  - For each chunk, decode the Huffman tree and use it to decode the encoded data
  - Stream the output into the target file
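A minimal FS2 sketch of the two pipelines above, assuming fs2 3.x and Cats Effect 3. The per-chunk Huffman encode/decode steps and the single-byte delimiter are placeholders (the real project encodes a tree per chunk and may use different framing); the sketch only shows the streaming shape.

```scala
import cats.effect.IO
import fs2.{Chunk, Stream}
import fs2.io.file.{Files, Path}

object PipelineSketch {
  // Placeholder framing: a single delimiter byte between encoded chunks.
  // The real format likely uses a more robust chunk delimiter.
  private val Delimiter: Byte = 0x00

  // Placeholders for the per-chunk Huffman steps: the real code builds a
  // Huffman tree from the chunk and encodes/decodes tree + data.
  private def encodeChunk(data: Chunk[Byte]): Chunk[Byte] = data
  private def decodeChunk(data: Chunk[Byte]): Chunk[Byte] = data

  def compress(in: Path, out: Path, chunkSizeKb: Int): IO[Unit] =
    Files[IO]
      .readAll(in)
      .chunkN(chunkSizeKb * 1024)                              // fixed-size chunks
      .flatMap(c => Stream.chunk(encodeChunk(c) ++ Chunk.singleton(Delimiter)))
      .through(Files[IO].writeAll(out))
      .compile
      .drain

  def decompress(in: Path, out: Path): IO[Unit] =
    Files[IO]
      .readAll(in)
      .split(_ == Delimiter)                                   // one segment per encoded chunk
      .flatMap(seg => Stream.chunk(decodeChunk(seg)))
      .through(Files[IO].writeAll(out))
      .compile
      .drain
}
```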
- Release as Scala Native
See LICENSE