
Customizable Warmup endpoint #327

Closed
RickVerkuijlen opened this issue Oct 21, 2022 · 7 comments
Labels
waiting for response Waiting for some kind of feedback from the user

Comments

@RickVerkuijlen

It would be nice to be able to configure which endpoint the warmup lambda triggers. The way it is done right now is to override the Lambda handler and listen for the serverless-plugin-warmup event. Would it be possible to handle this event inside the Lambda itself? For example, when creating an API, the event could be handled by one of the API endpoints.

@juanjoDiaz
Owner

Hi @RickVerkuijlen,

I don't really get your proposal.
At the moment this plugin simply invokes your lambdas based on a schedule.

How do you propose that we do it instead?

@juanjoDiaz juanjoDiaz added the waiting for response Waiting for some kind of feedback from the user label Oct 30, 2022
@RickVerkuijlen
Author

I get a lot of null pointer exceptions because I cannot override my Lambda handler; that's done by Quarkus. I thought it might be possible to invoke the lambda through a REST endpoint that you can set yourself?

@funkel1989

I think this would also be an interesting feature to add: having the warmer hit an endpoint to warm up, in my case a health check endpoint.

I'm using a dotnet Lambda container with an average response time of about 120ms. Its Lambda warm-up time is 4.5 seconds and its code warm-up time is about 3 seconds. Using this plugin in its current state successfully saves me 1.5 seconds of cold-start time, but it doesn't resolve my issue because my DI container has not been initialized yet.

I would propose an Endpoint mode where you can specify the endpoint and the warmup lambda would use Axios to make an HTTP call to the specified endpoint.

@RickVerkuijlen
Author

> I would propose an Endpoint mode where you can specify the endpoint and the warmup lambda would use Axios to make an HTTP call to the specified endpoint.

Yes! This is exactly what I mean. Thanks for putting it into words.

@funkel1989

I failed to do so yesterday, but I wanted to add some more clarity on why this is important for me.

I'm currently running a dotnet 7 Lambda container, similar to what is posted above. Because the language is strongly typed, the incoming HTTP REST API v2 request does not have a field for source, so the default warmer invocation has failed me: the incoming object is completely empty and the handler doesn't know what to do with it.

I have done the following to fix it:

  1. Secured my health check endpoint (otherwise useless, because it's a lambda, except for this scenario) with an API key authentication scheme and built out code for that. I needed it for other reasons, but it happens to come in handy here.
  2. Edited the payload of the lambda to be the following:

     warmup:
       officeHoursWarmer:
         enabled: true
         verbose: true
         prewarm: true
         concurrency: 2
         payload:
           body: '${self:custom.domainMap.${self:provider.stage}} 483fefbe-3c66-4f59-8df9-21e904369b7a ${self:custom.warmerApiKey}'

     Note: the plugin should consider whether to log the payload being sent. In this case it includes a secret, so I feel this should be a debug-mode-only kind of thing.
  3. Edited my lambda handler file and overrode FunctionHandlerAsync with the following:

    public override async Task<APIGatewayHttpApiV2ProxyResponse> FunctionHandlerAsync(
        APIGatewayHttpApiV2ProxyRequest request,
        ILambdaContext lambdaContext)
    {
        // Warmer invocations don't come through API Gateway, so RawPath is empty.
        if (string.IsNullOrWhiteSpace(request.RawPath))
        {
            Console.WriteLine("In overridden FunctionHandlerAsync…");

            // Payload body is space-separated: 0. url, 1. ssid, 2. apikey
            var bodyItems = request.Body.Split(" ");

            HttpClientHandler handler = new HttpClientHandler()
            {
                AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
            };

            HttpClient httpClient = new HttpClient(handler);

            httpClient.BaseAddress = new Uri($"https://{bodyItems[0]}/");
            httpClient.DefaultRequestHeaders.Add("x-source-system-id", bodyItems[1]);
            httpClient.DefaultRequestHeaders.Add("x-api-key", bodyItems[2]);

            // Call the authenticated health check so the DI container initializes.
            HttpResponseMessage response = httpClient.GetAsync("api/HealthCheck").Result;
            response.EnsureSuccessStatusCode();

            var result = response.Content.ReadAsStringAsync().Result;
            Console.WriteLine(result);

            // Hold this instance briefly so concurrent warmup calls don't all
            // land on the same (already warm) instance.
            Thread.Sleep(2000);

            return new APIGatewayHttpApiV2ProxyResponse() { StatusCode = 200, Body = result };
        }

        return await base.FunctionHandlerAsync(request, lambdaContext);
    }

This basically says that if the request didn't come from API Gateway (everything should, except this warmer lambda), I bring in an HTTP client and make a health check call to my application, which warms a lambda. I pause for 2 seconds because there is a chance these health checks complete so fast that the warmer invocations reuse the same lambda instance for the health checks. Most times this doubles my expected concurrency, but oh well in this case.

Why is this important:
The dotnet cold start for my app is about 4.5 seconds; 1.5 seconds of this is the actual Lambda cold start. I still have a 3-second wait for the DI container to populate before the app can handle requests, though. I use the health check to pass a request through to the base FunctionHandlerAsync, which causes the DI container to start instantiating and the code to actually be ready to handle requests... which knocks off about 2.8 seconds, leaving me with a 200ms response time on average.
Keep in mind these numbers are averages.

What this leaves me with, though, is that some lambdas are completely warm with a DI container and some are not and are simply just lukewarm. So this does not completely fix the problem, but it's better than nothing at all.

If the warmer function were able to take an endpoint and headers (this can probably be done in the payload, but I tried and it was weird and I couldn't get it to work), then at least in my scenario it would be a TON better.

Unfortunately, I don't have a ton of time to contribute something like this, but if I do in the future and no one has done it yet, I'll open up a PR.

@juanjoDiaz
Owner

Hi,

Sorry for the slow response.
And thank you very much for the detailed explanation.

I still don't fully understand the problem, so please bear with me and let's try to reach an understanding so I can help you 🙂

> HttpResponseMessage response = httpClient.GetAsync("api/HealthCheck").Result;
> response.EnsureSuccessStatusCode();

You seem to detect the warmer call and then call a separate endpoint?
I don't fully get that. Why is that needed? Why can't you warm up everything from the handler itself?

> What this leaves me with though is some lambdas are completely warm with a DI container and some are not and are simply just lukewarm.

I don't understand this either.
If you do the same initialization for all requests, why are they not all equally warmed up?

> If the warmer function was able to take an endpoint and headers(prob can be done in payload but I tried and it was weird and I couldn't get it to work) then at least in my scenario it would be a TON better.

Can you elaborate?
The warmer calls the endpoint that represents your lambda in API GW.
What would be the other endpoint?

And what would headers provide over the payload or context?
You seem to be able to parse the body without a problem.
Is that to pass the API key and the source system ID?

@juanjoDiaz
Owner

Closing since there hasn't been a response in a long time.
Feel free to reopen if you think that there is still more to do.
