I've never had a chance to use Blosc myself, but it sounds really interesting, with its claim of outperforming raw RAM speeds. I'm not sure what the applications are, though: columnar data processing, Parquet files, etc.?
It gets used a fair amount in the weather-data space. Forecasting and climate-reanalysis grids are typically large (gigabytes) N-dimensional arrays of float32 values, and Blosc provides enough tunable knobs that it's fairly easy to find a combination that performs acceptably, without writing a bunch of custom handling code to keep track of which underlying compression schemes and settings were used. Additionally, it supports byte- and bit-shuffle filters, which can really boost the compressibility of certain data sets.
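Blosc's actual shuffle filter is SIMD-optimized C, but the idea is easy to show. Here's a rough stdlib-only Python sketch (the synthetic float32 series is an illustrative assumption, not real weather data) comparing zlib compression with and without a byte shuffle. Shuffling regroups the bytes so that byte 0 of every element comes first, then byte 1, and so on; for smoothly varying floats, the exponent and high-mantissa bytes form long near-constant runs that compress much better:

```python
import struct
import zlib

def byte_shuffle(buf, typesize):
    # Regroup bytes: all byte-0s of each element, then all byte-1s, etc.
    return bytes(buf[j] for i in range(typesize)
                 for j in range(i, len(buf), typesize))

def byte_unshuffle(buf, typesize):
    # Inverse: element j's byte i lives at position i * n + j.
    n = len(buf) // typesize
    return bytes(buf[i * n + j] for j in range(n) for i in range(typesize))

# Synthetic smoothly varying series, loosely mimicking a grid slice.
values = [20.0 + 0.001 * i for i in range(4096)]
raw = struct.pack("<%df" % len(values), *values)

plain = zlib.compress(raw, 9)
shuffled = zlib.compress(byte_shuffle(raw, 4), 9)
print("raw:", len(raw), "plain:", len(plain), "shuffled:", len(shuffled))
```

On data like this the shuffled stream typically compresses several times smaller than the unshuffled one; Blosc applies the same trick per cache-sized block, which is part of why it can beat generic codecs on numeric arrays.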
https://www.blosc.org/pages/blosc-in-depth/