Support Lambda response streaming #318

Open · jabrks opened this issue Apr 1, 2024 · 6 comments
Labels: enhancement (New feature or request)

Comments


jabrks commented Apr 1, 2024

It would be great if LLRT supported Lambda response streaming as per https://docs.aws.amazon.com/lambda/latest/dg/runtimes-custom.html#runtimes-custom-response-streaming.
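For context, the linked docs describe the runtime posting the function's output to the runtime API with a streaming response-mode header and chunked transfer encoding. Below is a rough TypeScript sketch of that call under Node.js; the `streamResponse` helper name and the fallback address are illustrative and not part of LLRT or the AWS runtime client.

```ts
import http from "node:http";

// Hypothetical helper (not LLRT code): stream chunks back to the Lambda
// runtime API for one invocation, per the response-streaming docs linked above.
function streamResponse(
  requestId: string,
  chunks: AsyncIterable<Uint8Array>
): Promise<void> {
  // AWS_LAMBDA_RUNTIME_API is "host:port" inside the execution environment.
  const [host, port] = (process.env.AWS_LAMBDA_RUNTIME_API ?? "127.0.0.1:9001").split(":");
  return new Promise((resolve, reject) => {
    const req = http.request(
      {
        host,
        port: Number(port),
        method: "POST",
        path: `/2018-06-01/runtime/invocation/${requestId}/response`,
        headers: {
          // Tells the runtime API that this response is streamed.
          "Lambda-Runtime-Function-Response-Mode": "streaming",
          "Transfer-Encoding": "chunked",
        },
      },
      (res) => {
        res.resume(); // drain the acknowledgement body
        res.on("end", () => resolve());
      }
    );
    req.on("error", reject);
    (async () => {
      for await (const chunk of chunks) {
        req.write(chunk); // each write goes out as its own HTTP chunk
      }
      req.end();
    })().catch(reject);
  });
}
```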


jabrks commented Apr 1, 2024

I had a go at implementing this myself (my Rust isn't quite up to scratch, so the code that interacts with the Lambda runtime API is written in TypeScript for the moment), but I don't think we have a way to stream a request body in LLRT currently. I would ordinarily have used the http API in Node.js, but that isn't implemented here; the recommendation is to use fetch instead, which only seems to accept a string, Array, ArrayBuffer or Uint8Array as the body (a sketch of the kind of streaming fetch call I mean is below).

I'd be happy to take another look if there's a preferred way forward here?
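For illustration, this is roughly what a streaming request body looks like where fetch supports it (e.g. Node.js 18+ via undici); LLRT's fetch doesn't accept a ReadableStream body today, which is the gap described above. The `postStream` helper is hypothetical:

```ts
// Illustrative only: fetch can take a ReadableStream body when duplex: "half"
// is set, in runtimes whose fetch supports streaming request bodies.
async function postStream(url: string, parts: Iterable<string>): Promise<Response> {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const part of parts) controller.enqueue(encoder.encode(part));
      controller.close();
    },
  });
  return fetch(url, {
    method: "POST",
    body,
    // Required for streaming request bodies; not yet in every TS lib definition.
    duplex: "half",
  } as RequestInit & { duplex: "half" });
}
```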

richarddavison (Contributor) commented

Hi @jabrks. We want to support response streams, but first we have to implement native streaming for performance reasons (see #178).

gc-victor commented

richarddavison (Contributor) commented

Thanks for the suggestion, but we already have a streams polyfill, and it won't perform as well as we'd like due to the lack of a JIT. Basically, every byte that flows through a stream implemented in JavaScript causes multiple function calls, branches, allocations, etc.
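To make that cost concrete, here is an illustrative sketch (not LLRT's actual polyfill) of the hot loop a JavaScript streams consumer runs for every chunk; without a JIT, the allocations and call overhead in each iteration are paid in full:

```ts
// Illustrative only: each iteration allocates a promise and a { value, done }
// result object and crosses several function boundaries before a single
// chunk's bytes are touched.
async function drain(stream: ReadableStream<Uint8Array>): Promise<number> {
  const reader = stream.getReader();
  let total = 0;
  for (;;) {
    const { value, done } = await reader.read(); // promise + result allocation per chunk
    if (done) break;
    total += value.length;
  }
  return total;
}
```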


jabrks commented Apr 1, 2024

Thanks @richarddavison, I'll sit tight for now in that case!

richarddavison added the enhancement (New feature or request) label on Apr 2, 2024

codingnuclei commented Sep 27, 2024

Hi 👋 - just posting to show that there is still interest in this enhancement. We rely heavily on middy, so it would be awesome to be able to use it with LLRT 😄

Thanks for the ongoing effort 💯
