
Provide an API to work with streams? #114

Open
ShellCode33 opened this issue Dec 18, 2023 · 4 comments
Labels
question Further information is requested

Comments

@ShellCode33

Currently the API takes a &[u8] as input.

It would be nice to be able to process huge files without loading them entirely into memory, by processing chunks one after another.

Maybe by using std::io::BufReader/std::io::BufWriter?
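The idea above can be sketched as a fixed-size read loop; `CHUNK_SIZE` and `process_chunks` are illustrative names, not part of any existing API:

```rust
use std::io::{BufReader, Read};

// Illustrative chunk size; any bound that fits the target's memory works.
const CHUNK_SIZE: usize = 64 * 1024;

// Read a source in fixed-size chunks instead of loading it all at once.
// Each chunk could be handed to an encrypt call that takes &[u8].
fn process_chunks<R: Read>(source: R, mut handle: impl FnMut(&[u8])) -> std::io::Result<()> {
    let mut reader = BufReader::new(source);
    let mut buf = vec![0u8; CHUNK_SIZE];
    loop {
        let n = reader.read(&mut buf)?;
        if n == 0 {
            break; // end of input
        }
        handle(&buf[..n]);
    }
    Ok(())
}
```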

@kigawas
Member

kigawas commented Dec 19, 2023

It depends on the AEADs API, so only &[u8] is supported:

https://github.com/RustCrypto/AEADs/blob/master/README.md

How big is your file? You can get a &[u8] with BufReader::buffer()

@kigawas added the question (Further information is requested) label on Dec 19, 2023
@ShellCode33
Author

> It depends on the AEADs API, so only &[u8] is supported

Considering this library is the one wrapping it, I guess it would make sense to provide a higher-level streaming interface around the lower-level AEAD APIs? Of course it would require incrementing the IV for each chunk, or generating a new ephemeral key for each one (at least for AES; I don't know about ChaCha20).
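The per-chunk IV increment mentioned above could be sketched like this; `chunk_nonce` is a hypothetical helper, and a real design would also need to bound the counter and authenticate chunk ordering:

```rust
// Hypothetical helper: derive a unique 96-bit nonce per chunk by XOR-ing a
// big-endian chunk counter into the last four bytes of a base nonce, so the
// same AEAD key can safely seal many chunks.
fn chunk_nonce(base: &[u8; 12], counter: u32) -> [u8; 12] {
    let mut nonce = *base;
    for (b, c) in nonce[8..].iter_mut().zip(counter.to_be_bytes().iter()) {
        *b ^= *c;
    }
    nonce
}
```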

> How big is your file?

I don't know, but that doesn't matter; let's say it's a 10 TiB file.

> You can get a &[u8] with BufReader::buffer()

I already use fill_buf(), but I have to handle the chunks manually. It would be nice to be able to stream data from one "pipe" to another: read from file -> compress -> encrypt -> write to file.
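The "pipe" idea maps naturally onto composing `std::io::Read` adapters. A minimal sketch, with a toy XOR transform standing in for a real compressor or cipher stage (none of these types exist in the library):

```rust
use std::io::{self, BufReader, BufWriter, Read, Write};

// Each stage wraps a `Read`, so stages compose like encrypt(compress(file)).
// `XorStage` is a toy transform, not a real compressor or cipher.
struct XorStage<R: Read> {
    inner: R,
    key: u8,
}

impl<R: Read> Read for XorStage<R> {
    fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
        let n = self.inner.read(buf)?;
        for b in &mut buf[..n] {
            *b ^= self.key;
        }
        Ok(n)
    }
}

// Drive the pipeline source -> stages -> sink in constant memory.
fn pump<R: Read, W: Write>(reader: R, writer: W) -> io::Result<u64> {
    let mut reader = BufReader::new(reader);
    let mut writer = BufWriter::new(writer);
    let n = io::copy(&mut reader, &mut writer)?;
    writer.flush()?;
    Ok(n)
}
```

Stacking two `XorStage`s with the same key is an easy way to check the pipeline round-trips, since XOR applied twice cancels out.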

@kigawas
Member

kigawas commented Dec 21, 2023

The maximum message size is 64 GB (AES) or 256 GB (XChaCha20), so you would need to split your 10 TiB file and encrypt each part separately.

In this case, it's probably better to exchange a key first and then encrypt the chunks of the file with the same key on your own.

@ShellCode33
Author

ShellCode33 commented Dec 22, 2023

> The maximum message size is 64 GB (AES) or 256 GB (XChaCha20), so you would need to split your 10 TiB file and encrypt each part separately.

Good to know, thanks. But that was not my point; 10 TiB was just a random number. My point is that in the current state of the library, you have to load the whole file (or whatever it is) into memory in order to encrypt it, which is not realistic in any major, well-behaved application. Think of embedded systems, for example: do you think they can afford 2 GB of RAM just to encrypt a 2 GB file?

I believe it makes sense for an ECIES library to provide streaming APIs, so that you can encrypt multiple chunks of data using a single ephemeral key. Not to mention that AES-GCM and ChaCha20 are stream ciphers; they are perfect for the job, and it would be a shame not to take advantage of that.
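One hypothetical shape for such a streaming API, with a toy XOR "seal" standing in for a real AEAD call (nothing here exists in the library today; a real version would hold a 32-byte key from ECDH + KDF and a 96-bit nonce):

```rust
// Hypothetical streaming encryptor: one ephemeral key per stream, a
// counter-derived "nonce" per chunk so chunks never reuse a (key, nonce) pair.
struct StreamEncryptor {
    key: u8,        // in reality: a 32-byte key derived via ECDH + KDF
    base_nonce: u8, // in reality: a 96-bit AEAD nonce
    counter: u32,
}

impl StreamEncryptor {
    fn encrypt_chunk(&mut self, chunk: &[u8]) -> Vec<u8> {
        let nonce = self.base_nonce ^ (self.counter as u8); // unique per chunk
        self.counter += 1;
        toy_seal(self.key, nonce, chunk)
    }
}

// Stand-in for an AEAD seal call; real code would use an authenticated cipher.
fn toy_seal(key: u8, nonce: u8, chunk: &[u8]) -> Vec<u8> {
    chunk.iter().map(|b| b ^ key ^ nonce).collect()
}
```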

> In this case, it's probably better to exchange a key first and then encrypt the chunks of the file with the same key on your own.

Well, that's pretty much what ECIES is for, right?
