Provide an API to work with streams? #114
It depends on the AEADs API, so only https://github.com/RustCrypto/AEADs/blob/master/README.md. How big is your file? You can get
Considering this library is the one wrapping it, I guess it would make sense to provide a higher-level streaming interface around the lower-level AEAD APIs? Of course, it would require incrementing the IV for each chunk, or generating a new ephemeral key for each one (at least for AES; I don't know about ChaCha20).
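As an illustration of that idea (this is not an API of this crate), here is a minimal sketch of chunked encryption built directly on the RustCrypto `aes-gcm` crate, with one key for the whole stream and a counter-based nonce per chunk. The function name, chunk size, and length-prefix framing are invented for the example, and a real design would also need to authenticate the end of the stream so a truncated ciphertext is rejected:

```rust
use std::io::{Error, ErrorKind, Read, Result, Write};

use aes_gcm::{
    aead::{Aead, KeyInit},
    Aes256Gcm, Key, Nonce,
};

const CHUNK_SIZE: usize = 64 * 1024;

/// Encrypt `reader` into `writer` in CHUNK_SIZE pieces under one key.
/// `nonce_prefix` must never repeat for the same key.
fn encrypt_chunks<R: Read, W: Write>(
    key_bytes: &[u8; 32],
    nonce_prefix: [u8; 4],
    mut reader: R,
    mut writer: W,
) -> Result<()> {
    let cipher = Aes256Gcm::new(Key::<Aes256Gcm>::from_slice(key_bytes));
    let mut counter: u64 = 0;
    let mut buf = vec![0u8; CHUNK_SIZE];
    loop {
        let n = reader.read(&mut buf)?;
        if n == 0 {
            break; // EOF
        }
        // Nonce = 4-byte prefix || 8-byte big-endian chunk counter, so every
        // chunk of the stream gets a distinct 96-bit nonce under the same key.
        let mut nonce_bytes = [0u8; 12];
        nonce_bytes[..4].copy_from_slice(&nonce_prefix);
        nonce_bytes[4..].copy_from_slice(&counter.to_be_bytes());
        let ciphertext = cipher
            .encrypt(Nonce::from_slice(&nonce_bytes), &buf[..n])
            .map_err(|_| Error::new(ErrorKind::Other, "AES-GCM encryption failed"))?;
        // Length-prefix each chunk so decryption knows where it ends.
        writer.write_all(&(ciphertext.len() as u32).to_be_bytes())?;
        writer.write_all(&ciphertext)?;
        counter += 1;
    }
    Ok(())
}
```

Only one chunk is ever held in memory at a time, which is the whole point of a streaming interface.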
I don't know, but that doesn't matter; let's say it's a 10 TiB file.
I already use
Maximum message size is 64 GB (AES) or 256 GB (XChaCha20), so you need to split your 10 TiB file for encryption anyway. In that case, exchanging a key first and encrypting chunks of the file with the same key on your own is probably better.
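A minimal sketch of that "exchange a key first" idea, assuming this crate's top-level `encrypt` function and its `SecpError` re-export plus the `rand` crate; `wrap_file_key` is a made-up helper, not an existing API:

```rust
use ecies::{encrypt, SecpError};
use rand::{rngs::OsRng, RngCore};

/// Generate a one-off 256-bit file key and ECIES-wrap it for the recipient.
/// Only the small key, not the whole file, ever goes through `ecies::encrypt`;
/// the wrapped key travels alongside the chunked ciphertext.
fn wrap_file_key(receiver_pub: &[u8]) -> Result<([u8; 32], Vec<u8>), SecpError> {
    let mut file_key = [0u8; 32];
    OsRng.fill_bytes(&mut file_key);
    let wrapped_key = encrypt(receiver_pub, &file_key)?;
    Ok((file_key, wrapped_key))
}
```

The file chunks would then be encrypted under `file_key` with whatever symmetric scheme you choose (for instance the chunked AEAD loop sketched earlier), and the receiver recovers the key with `ecies::decrypt` before decrypting the chunks.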
Good to know, thanks. But that was not my point; 10 TiB was just a random number. My point is that, in the current state of the library, you have to load the whole file (or whatever it is) into memory in order to encrypt it, which is not realistic in any major, well-behaved application. Think embedded systems, for example: do you think they can afford 2 GB of RAM just to be able to encrypt a 2 GB file?

I believe it makes sense for an ECIES library to provide streaming APIs, so that you can encrypt multiple chunks of data using a single ephemeral key. Not to mention that AES-GCM and ChaCha20 are stream ciphers; they are perfect for the job, and it would be a shame not to take advantage of that.
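For what it's worth, the RustCrypto `aead` crate already ships a STREAM construction in its `aead::stream` module (behind the `stream` cargo feature), which does exactly this: many chunks under one key, with a built-in chunk counter and a "last chunk" flag. A rough sketch of using it directly, assuming `aes-gcm` 0.10 with the `stream` feature enabled; the function name, chunk size, and length-prefix framing are again made up for the example:

```rust
use std::io::{Error, ErrorKind, Read, Result, Write};

use aes_gcm::{
    aead::{stream::EncryptorBE32, KeyInit},
    Aes256Gcm, Key,
};

const CHUNK_SIZE: usize = 64 * 1024;

/// Encrypt `reader` into `writer` with the STREAM construction: one key, one
/// short stream nonce, per-chunk counters handled by the encryptor itself.
fn encrypt_with_stream<R: Read, W: Write>(
    key_bytes: &[u8; 32],
    stream_nonce: &[u8; 7], // 12-byte GCM nonce minus 5 bytes of STREAM overhead
    mut reader: R,
    mut writer: W,
) -> Result<()> {
    let aead = Aes256Gcm::new(Key::<Aes256Gcm>::from_slice(key_bytes));
    let mut encryptor = EncryptorBE32::from_aead(aead, stream_nonce.as_ref().into());

    // Read one chunk ahead so we know which chunk is the last one.
    let mut buf = vec![0u8; CHUNK_SIZE];
    let mut next = vec![0u8; CHUNK_SIZE];
    let mut filled = reader.read(&mut buf)?;
    loop {
        let n = reader.read(&mut next)?;
        if n == 0 {
            // Final chunk: `encrypt_last` consumes the encryptor and seals the
            // stream, so truncated ciphertexts are rejected on decryption.
            let ciphertext = encryptor
                .encrypt_last(&buf[..filled])
                .map_err(|_| Error::new(ErrorKind::Other, "encryption failed"))?;
            writer.write_all(&(ciphertext.len() as u32).to_be_bytes())?;
            writer.write_all(&ciphertext)?;
            return Ok(());
        }
        let ciphertext = encryptor
            .encrypt_next(&buf[..filled])
            .map_err(|_| Error::new(ErrorKind::Other, "encryption failed"))?;
        writer.write_all(&(ciphertext.len() as u32).to_be_bytes())?;
        writer.write_all(&ciphertext)?;
        std::mem::swap(&mut buf, &mut next);
        filled = n;
    }
}
```

A higher-level streaming API in this crate could presumably wrap something like this behind the ECIES key derivation.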
Well, that's pretty much what ECIES is for, right?
Currently the API takes a `&[u8]` as input. It would be nice to be able to process huge files without holding them entirely in memory, by processing chunks one after another. Maybe by using `std::io::BufReader` / `std::io::BufWriter`?
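For discussion, a purely hypothetical shape such a streaming API could take; none of these functions exist in the crate today, and the names, signatures, and return types are illustrative only:

```rust
use std::io::{Read, Result, Write};

/// Encrypt everything readable from `reader` into `writer`, chunk by chunk,
/// reusing a single ephemeral key / ECIES header for the whole stream.
/// Returns the number of ciphertext bytes written.
#[allow(unused_variables)]
pub fn encrypt_stream<R: Read, W: Write>(
    receiver_pub: &[u8],
    reader: &mut R,
    writer: &mut W,
) -> Result<u64> {
    // Would write the ECIES header once, then length-prefixed AEAD chunks.
    unimplemented!("sketch only")
}

/// Streaming counterpart of `decrypt`: verifies and decrypts chunk by chunk,
/// so only one chunk needs to be held in memory at a time.
#[allow(unused_variables)]
pub fn decrypt_stream<R: Read, W: Write>(
    receiver_sec: &[u8],
    reader: &mut R,
    writer: &mut W,
) -> Result<u64> {
    unimplemented!("sketch only")
}
```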