Enable compression for hash_table #210

As a follow-up to #205, can we enable compression for the hash_table datasets? Ideally this should be configurable in some way, maybe as an argument to VersionedHDF5File(f, hash_table_compression='lzf')? If this turns out to be difficult, it's also okay to use some default compression for all hash_table datasets.
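For illustration, the proposed call would look roughly like the sketch below. The hash_table_compression keyword is only the suggestion made in this issue, not a confirmed part of the VersionedHDF5File API, and the file, version, and dataset names are made up:

```python
import numpy as np
import h5py
from versioned_hdf5 import VersionedHDF5File

with h5py.File("data.h5", "a") as f:
    # hash_table_compression is the keyword proposed in this issue; treat it
    # as hypothetical rather than an argument the library is known to accept.
    vf = VersionedHDF5File(f, hash_table_compression="lzf")
    with vf.stage_version("r0") as sv:
        sv.create_dataset("values", data=np.arange(10))
```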
Comments

Also, the […]
The hash table looks the same regardless of what is in the dataset. So I would think it's better to just find a compression that works and use it everywhere.
Both 'gzip' and 'lzf' seem to work and are part of the standard h5py compression filters.
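For reference, here is a minimal plain-h5py sketch of the two filters mentioned above, independent of versioned-hdf5's internals; the file name, dataset names, shapes, and chunk sizes are made up:

```python
import numpy as np
import h5py

# Fake "hash table"-like data for the comparison.
data = np.random.default_rng(0).integers(0, 256, size=(4096, 32), dtype=np.uint8)

with h5py.File("compression_demo.h5", "w") as f:
    # lzf ships with h5py itself: fast, moderate compression ratio.
    f.create_dataset("ht_lzf", data=data, chunks=(512, 32), compression="lzf")
    # gzip is part of every HDF5 installation: slower, tunable level 0-9.
    f.create_dataset("ht_gzip", data=data, chunks=(512, 32),
                     compression="gzip", compression_opts=4)

with h5py.File("compression_demo.h5", "r") as f:
    for name in ("ht_lzf", "ht_gzip"):
        dset = f[name]
        # Report the filter, chunk shape, and on-disk bytes used.
        print(name, dset.compression, dset.chunks, dset.id.get_storage_size())
```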
lzf sounds good, let's go with that for now. Aaron: […]
Sorry, didn't see this comment until just now. I have a fix at #211. I will merge it and make a release as soon as the tests pass.
I didn't make the chunk size configurable yet. Hopefully that isn't also something you need done urgently. I'm actually not even sure if the hash_table dataset needs to be chunked at all. I might need to play with this.
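One general HDF5 detail worth noting on the chunking question (plain h5py behaviour, not anything specific to versioned-hdf5): compression filters only apply to chunked datasets, so enabling compression implies some chunk layout even if the chunk size is never made configurable. A minimal sketch with made-up names and shapes:

```python
import numpy as np
import h5py

data = np.zeros((25_000, 4), dtype=np.uint64)

with h5py.File("chunking_demo.h5", "w") as f:
    # Explicit chunk shape chosen by the caller.
    explicit = f.create_dataset("explicit", data=data, chunks=(1024, 4),
                                compression="lzf")
    # No chunk shape given: because HDF5 compression filters only work on
    # chunked datasets, h5py picks a chunk shape automatically.
    auto = f.create_dataset("auto", data=data, compression="lzf")
    print(explicit.chunks)  # (1024, 4)
    print(auto.chunks)      # whatever h5py's auto-chunker chose
```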
Thanks, just the compression helps us a lot already!