Add a GRPCClient that is re-usable by the same VU #10

Status: Open. Wants to merge 3 commits into main.
45 changes: 45 additions & 0 deletions k6_test/grpc/grpc_client.js
@@ -0,0 +1,45 @@
import { check } from "k6";
import { Counter } from "k6/metrics";
import grpc from "k6/net/grpc";

const grpcReqs = new Counter("grpc_reqs");
Member commented:
@aluu317 do you know why this is needed? I thought k6 already counts the requests itself?

aluu317 (Contributor, Author) commented on Jun 10, 2022:
I copied this from the MLServer benchmark test. I actually was not able to see how the counter grpcReqs is used in the MLServer example; I assume it may be for logging or tracking purposes, which we could certainly log in our k6 test.
Do you think it would be useful to know how many gRPC infer requests each VU actually makes, and to log that?

aluu317 (Contributor, Author) commented:

But to answer your question: yes, k6 logs it, something like this:

 ✗ status is OK
      ↳  0% — ✓ 0 / ✗ 272

     █ setup

       ✗ status is OK
        ↳  0% — ✓ 0 / ✗ 1

     █ teardown

     checks...............: 0.00%  ✓ 0          ✗ 273 

So 273 is the total number of checks, but it is unclear how many came from each VU.
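The thread above asks how to see how many requests each individual VU makes. One way is a per-VU tally keyed by VU id; in a real k6 script the id would come from `k6/execution` (`exec.vu.idInTest`), for instance attached to the Counter as a tag. A minimal sketch in plain Node (no k6 runtime; `recordRequest` and `perVU` are hypothetical names, not part of the PR):

```javascript
// Hypothetical sketch: keep a per-VU request tally keyed by VU id.
// In k6 proper, the id would come from k6/execution (exec.vu.idInTest)
// and could be attached to the Counter as a tag instead of a Map.
const perVU = new Map();

function recordRequest(vuId) {
  perVU.set(vuId, (perVU.get(vuId) || 0) + 1);
}

// Simulate three VUs making an uneven number of requests.
[1, 1, 2, 3, 3, 3].forEach(recordRequest);

console.log(perVU.get(1), perVU.get(2), perVU.get(3)); // → 2 1 3
```

Tagging the Counter rather than keeping a Map would let k6's own summary break the metric down per VU.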


function getClient(protoFilePath) {
  const client = new grpc.Client();

  client.load([], protoFilePath);

  return client;
}

function checkResponse(res) {
  check(res, {
    "status is OK": (r) => r && r.status === grpc.StatusOK,
  });
}

export class GrpcClient {
  constructor(options) {
    this.grpcHost = options.grpcHost || 'modelmesh-serving:8033';
    this.client = getClient(options.protoFilePath || '../k6_test/kfs_inference_v2.proto');
    this.inferRPCName = options.inferRPCName || 'inference.GRPCInferenceService/ModelInfer';

    // Client can't connect in the init context
    this.connected = false;
  }

  infer(data, params) {
    if (!this.connected) {
      this.client.connect(this.grpcHost, { plaintext: true });
      this.connected = true;
    }

    const res = this.client.invoke(this.inferRPCName, data, params);
    checkResponse(res);
    grpcReqs.add(1);
  }

  close() {
    this.client.close();
  }
}
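The key behavior of the class above is the guarded `connected` flag: the connection is deferred out of the init context and made exactly once, on the first `infer()` call, then reused across iterations. A plain-Node sanity-check sketch of that pattern (no k6 runtime; `LazyClient` is a hypothetical stand-in, with counters in place of the real `connect`/`invoke` calls):

```javascript
// Plain-JS stand-in for GrpcClient's lazy-connect pattern:
// connect on first use, then reuse the channel for every later call.
class LazyClient {
  constructor(host) {
    this.host = host;
    this.connected = false;
    this.connectCount = 0; // stands in for client.connect(...)
    this.requestCount = 0; // stands in for client.invoke(...)
  }

  infer(data) {
    if (!this.connected) {
      this.connectCount += 1;
      this.connected = true;
    }
    this.requestCount += 1;
    return { status: 0, data };
  }
}

const c = new LazyClient("modelmesh-serving:8033");
c.infer({ x: 1 });
c.infer({ x: 2 });
c.infer({ x: 3 });
console.log(c.connectCount, c.requestCount); // → 1 3
```

Three calls, one connection: this is what makes the client re-usable by the same VU instead of reconnecting on every iteration.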
30 changes: 8 additions & 22 deletions k6_test/grpc/script_grpc_skmnist.js
@@ -1,37 +1,23 @@
-import grpc from 'k6/net/grpc';
-import { check } from 'k6';
-import execution from 'k6/execution';
+import { GrpcClient } from "../k6_test/grpc/grpc_client.js";

 {{k6_opts}}

-const client = new grpc.Client();
-client.load([], '../k6_test/kfs_inference_v2.proto');
+const sharedClient = new GrpcClient({
+  grpcHost: '{{base_url}}'
+});
 const inputsData = JSON.parse(open(`../k6_test/payloads/{{payload}}`));
 let params = {
   tags: { model_name: `{{model_name}}` },
 }

-export function setup(){
-  // Abort on connection errors
-  try {
-    client.connect('{{base_url}}', { plaintext: true});
-  } catch (error) {
-    check(error, {"Setup error": (error) => error === null})
-    execution.test.abort(error);
-  }
-}
-
 export default () => {
-  client.connect('{{base_url}}', { plaintext: true });
   const data = {
     "model_name": "{{model_name}}",
     "inputs": inputsData["inputs"]
   };
-  const response = client.invoke('inference.GRPCInferenceService/ModelInfer', data, params);
-
-  check(response, {
-    'status is OK': (response) => response && response.status === grpc.StatusOK,
-  });
+  sharedClient.infer(data, params);
 };

-client.close();
+export function teardown() {
+  sharedClient.close();
+};
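The rewritten script leans on the k6 lifecycle: the `default` function runs once per iteration per VU, and `teardown()` runs exactly once at the end of the test, which is where the shared client is closed. A plain-Node simulation of that ordering (the `runK6Like` helper is hypothetical, not part of k6):

```javascript
// Minimal simulation of the k6 lifecycle the new script relies on:
// default() runs once per iteration, teardown() runs once at the end.
function runK6Like(script, iterations) {
  for (let i = 0; i < iterations; i++) {
    script.default();
  }
  if (script.teardown) {
    script.teardown();
  }
}

let infers = 0;
let closed = 0;

runK6Like(
  {
    default: () => { infers += 1; },  // stands in for sharedClient.infer(...)
    teardown: () => { closed += 1; }, // stands in for sharedClient.close()
  },
  5
);

console.log(infers, closed); // → 5 1
```

Closing in `teardown()` rather than at module scope matters: top-level code runs in the init context, before any connection exists.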