
feat: remove dead code #303

Merged — mattzcarey merged 18 commits into main from feat/core-be-v2 on Nov 14, 2023

Conversation

mattzcarey
Owner

No description provided.

Contributor

github-actions bot commented Nov 10, 2023

Test results summary:

✅ [PASS] - Test case: Bad variable name
❌ [FAIL] - Test case: Exposed secret
✅ [PASS] - Test case: Too many nested loops
⚠️ [WARN] - Test case: Unawaited Promise

SUMMARY: ✅ PASS: 2 - ⚠️ WARN: 1 - ❌ FAIL: 1


Tests Powered by Code Review GPT

@crgpt-staging

crgpt-staging bot commented Nov 11, 2023

@mattzcarey I can't access your OPENAI_API_KEY. This is set in your GitHub repository at Settings/Actions/Repository Variables/Secrets. Please contact the repository owner to set this up.

+// Reviewing multiple files inline > prioritising them > adding review comments
+// Answer questions > get the comments on the PR (by me and the questioner) as context > answer the question as comment
+
+import { modelInfo } from "../constants";
+import { AIModel } from "../llm/ai";
+
+export class Chat {
+  ai: AIModel;
+  modelName: string;
+  constructor(
+    openaiApiKey: string,
+    openaiModelName?: string,
+    temperature?: string
+  ) {
+    this.modelName = openaiModelName ?? "gpt-4-1106-preview";
+    this.ai = new AIModel({
+      modelName: this.modelName,
+      apiKey: openaiApiKey,
+      temperature: temperature ? parseFloat(temperature) : 0,
+    });
+  }
+
+  private getMaxPromptLength = (modelName: string): number => {
+    const model = modelInfo.find((info) => info.model === modelName);
+    if (!model) {
+      throw new Error(`Model ${modelName} not found`);
+    }
+
+    return model.maxPromptLength;
+  };
+
+  public getReview = async (prompt: string): Promise<string | undefined> => {
+    const maxPromptLength = this.getMaxPromptLength(this.modelName);
+
+    // TODO: fix this hack
+    if (prompt.length > maxPromptLength) {
+      console.error(
+        `File ${prompt} is too large to review, skipping review for this file`
+      );
+
+      return undefined;
+    }
+    try {
+      return await this.ai.callModel(prompt);
+    } catch (error) {
+      throw new Error(`Error fetching review: ${(error as Error).message}`);
+    }
+  };
+}

Review Comments:

  1. The default model name "gpt-4-1106-preview" is hardcoded in the constructor. Consider making this a constant or providing it through a configuration file to avoid changing the source code if the model name changes.

  2. The temperature parameter is being parsed as a float. Ensure that the API expects a float and not an integer, as some APIs might expect temperature values as integers.

  3. The getMaxPromptLength method throws an error if the model is not found. This is good for avoiding silent failures, but ensure that the error is handled gracefully where this method is called.

  4. The TODO comment indicates a known issue where the prompt length exceeds the maximum allowed length. This should be addressed as it could lead to files not being reviewed without the user's knowledge. Consider implementing a more robust solution, such as splitting the prompt into smaller chunks that can be processed individually.

  5. Error handling in the getReview method is good, but it might be beneficial to log the error for debugging purposes before throwing it.

  6. There is no documentation or comments explaining the purpose of the methods or the class itself. Adding comments would improve code maintainability and readability.

  7. Consider adding unit tests to ensure that the Chat class behaves as expected, especially for edge cases like handling large prompts or API errors.

  8. It's not clear how the prompt parameter in the getReview method is being used. If it's meant to be the content of a file, the error message should reflect that instead of just outputting the prompt variable, which might not be informative.

  9. Ensure that the API key is being stored and handled securely, as it's sensitive information.

  10. The class is named Chat, which might not accurately reflect its responsibility if it's specifically designed for code review. Consider renaming it to something more descriptive, such as CodeReviewer or AIReviewer.
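Review comment 4 above suggests chunking oversized prompts instead of skipping the file entirely. A minimal sketch of that idea (the `chunkPrompt` helper is hypothetical and not part of this PR; a real implementation would likely split on diff hunk or line boundaries rather than raw character offsets):

```typescript
// Hypothetical helper: split an oversized prompt into pieces that each fit
// within maxPromptLength, so no file is silently skipped.
const chunkPrompt = (prompt: string, maxPromptLength: number): string[] => {
  const chunks: string[] = [];
  for (let i = 0; i < prompt.length; i += maxPromptLength) {
    chunks.push(prompt.slice(i, i + maxPromptLength));
  }
  return chunks;
};
```

Each chunk could then be passed to `getReview` separately and the partial reviews concatenated, instead of returning `undefined` for the whole file.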

optionalVariables.OPENAI_MODEL_NAME,
optionalVariables.TEMPERATURE
);
};

This code snippet appears to be a part of a TypeScript module designed to work with Probot, a framework for building GitHub Apps in Node.js. The purpose of this module is to load a Chat object, which is likely an abstraction for interacting with the OpenAI API, specifically for chat or conversational AI purposes.

Here's a breakdown of what the code is doing:

  1. The module imports the Context type from the probot package, which is used to represent the context of a GitHub event, such as a pull request.

  2. It imports the Chat class from a local module, which is not shown in the snippet. This class is probably responsible for setting up and managing conversations with the OpenAI API.

  3. It imports two functions, getAPIKeyFromGH and getOptionalVariablesFromGH, from local modules. These functions are likely used to retrieve configuration variables from GitHub, such as API keys and other optional settings.

  4. The loadChat function is an asynchronous function that takes a context parameter of type Context<"pull_request">. This indicates that the function is designed to be used in the context of a pull request event.

  5. Inside the loadChat function, there is a conditional check to see if the OPENAI_API_KEY and OPENAI_MODEL_NAME environment variables are set. If they are, it creates a new Chat instance using these environment variables.

  6. If the environment variables are not set, the function retrieves the API key using the getAPIKeyFromGH function and optional variables using the getOptionalVariablesFromGH function. The optional variables include the model name and temperature, which is a parameter that controls the randomness of the AI's responses.

  7. Finally, the function returns a new Chat instance, passing the API key, model name, and temperature (if available) to the constructor.

This code is a good example of how to abstract away the details of setting up an OpenAI chat interface within a Probot application, allowing for flexibility in configuration either through environment variables or through settings retrieved from GitHub.
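The fallback described in steps 5–6 can be sketched roughly as follows. Since the actual `loadChat` module is not shown in the snippet, the `Chat` stub, the `Env` shape, and the `fetchFromGH` callback below are illustrative assumptions standing in for the real class and the `getAPIKeyFromGH`/`getOptionalVariablesFromGH` helpers:

```typescript
// Illustrative stub standing in for the real Chat class from the PR.
class Chat {
  constructor(
    public apiKey: string,
    public modelName?: string,
    public temperature?: string
  ) {}
}

type Env = {
  OPENAI_API_KEY?: string;
  OPENAI_MODEL_NAME?: string;
  TEMPERATURE?: string;
};

type GHVariables = { apiKey: string; modelName?: string; temperature?: string };

// Prefer environment variables; otherwise fall back to values fetched
// from the GitHub repository settings.
const loadChat = async (
  env: Env,
  fetchFromGH: () => Promise<GHVariables>
): Promise<Chat> => {
  if (env.OPENAI_API_KEY) {
    return new Chat(env.OPENAI_API_KEY, env.OPENAI_MODEL_NAME, env.TEMPERATURE);
  }
  const vars = await fetchFromGH();
  return new Chat(vars.apiKey, vars.modelName, vars.temperature);
};
```

The point of the pattern is that local runs can configure everything through the environment, while the deployed GitHub App pulls the same values from repository settings.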


console.log("Test");
}
}

This code snippet appears to be a part of an AWS CDK (Cloud Development Kit) application written in TypeScript. The CDK is a software development framework for defining cloud infrastructure in code and provisioning it through AWS CloudFormation.

Here's a breakdown of what the code is doing:

  1. It imports necessary classes and interfaces from the AWS CDK libraries and the constructs library.

  2. It defines a WebhookStackProps interface that extends the StackProps interface from the AWS CDK, adding a stage property.

  3. It declares a WebhookStack class that extends the Stack class from the AWS CDK. This class represents a stack of AWS resources.

  4. The constructor of the WebhookStack class takes three parameters: scope, id, and props. The scope is the parent construct, id is a unique identifier for the stack, and props is an object that must conform to the WebhookStackProps interface.

  5. Inside the constructor, the code creates an instance of WebhookApi, which seems to be a custom construct defined in the "../constructs" directory. This construct likely sets up an API Gateway.

  6. It then creates an instance of WebhookLambda, which is another custom construct defined in the "../constructs/webhook-lambda" directory. This construct likely sets up an AWS Lambda function.

  7. The code adds routes to the API Gateway using the addRoutes method. It specifies the path "/api/github/webhooks", the HTTP method POST, and integrates the route with the Lambda function using HttpLambdaIntegration.

  8. It creates a CfnOutput construct to export the API endpoint URL, which can be used to invoke the API.

  9. Lastly, there is a console.log("Test"); statement, which will output "Test" to the console when the stack is being synthesized.

Please note that the @aws-cdk/aws-apigatewayv2-alpha and @aws-cdk/aws-apigatewayv2-integrations-alpha packages suggest that this code is using alpha versions of these modules, which might not be stable for production use.

The @@ -0,0 +1,37 @@ at the beginning of the snippet indicates that this is a diff output from a version control system like git, showing that these lines are new additions to the file (from line 1 to line 37).
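Based on the breakdown above, the stack wiring would look roughly like the sketch below. The `WebhookApi` and `WebhookLambda` constructs live in the PR's own `../constructs` directories and are not shown here, so their interfaces (`api.httpApi`, `webhookLambda.function`, the props they accept) are assumptions for illustration:

```typescript
// Sketch of the WebhookStack described above; construct interfaces assumed.
import { Stack, StackProps, CfnOutput } from "aws-cdk-lib";
import { HttpMethod } from "@aws-cdk/aws-apigatewayv2-alpha";
import { HttpLambdaIntegration } from "@aws-cdk/aws-apigatewayv2-integrations-alpha";
import { Construct } from "constructs";
import { WebhookApi } from "../constructs";
import { WebhookLambda } from "../constructs/webhook-lambda";

interface WebhookStackProps extends StackProps {
  stage: string;
}

export class WebhookStack extends Stack {
  constructor(scope: Construct, id: string, props: WebhookStackProps) {
    super(scope, id, props);

    const api = new WebhookApi(this, "WebhookApi");
    const webhookLambda = new WebhookLambda(this, "WebhookLambda", {
      stage: props.stage,
    });

    // Route GitHub webhook POSTs to the Lambda handler.
    api.httpApi.addRoutes({
      path: "/api/github/webhooks",
      methods: [HttpMethod.POST],
      integration: new HttpLambdaIntegration(
        "WebhookIntegration",
        webhookLambda.function
      ),
    });

    // Export the API endpoint URL so the GitHub App can be pointed at it.
    new CfnOutput(this, "ApiUrl", { value: api.httpApi.apiEndpoint });
  }
}
```

This is an infrastructure definition rather than runnable application code; synthesizing it requires the `aws-cdk-lib` and alpha API Gateway v2 packages noted above.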


expect(typeof ssWithScore[0][1] === 'number').toBe(true);
});
});
\ No newline at end of file

It looks like you've provided a diff output from a version control system (like Git) that shows the removal of a test suite for a CreateMemoryStore function. The removed code includes three test cases:

  1. The first test checks if the CreateMemoryStore function returns a Promise that resolves to an instance of MemoryVectorStore.
  2. The second test checks if the CreateMemoryStore function provides the required functionality by comparing the length of the memoryVectors array in the result with the length of the memoryVectors array in an expected result created by calling MemoryVectorStore.fromDocuments.
  3. The third test checks if the MemoryVectorStore returned by the CreateMemoryStore function can perform a similarity search and return a score as a number.

The code also includes imports for OpenAIEmbeddings and MemoryVectorStore from the langchain library, as well as a custom CreateMemoryStore function and some initialFiles for testing.

If you have any specific questions or need further assistance with this code, please let me know!
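As a rough illustration of what the third removed test asserted, here is a self-contained sketch using a toy in-memory store. The real code used langchain's `MemoryVectorStore`; the `ToyMemoryStore` below is a hypothetical stub that only mimics the `[document, score]` return shape of a similarity search:

```typescript
type ScoredResult = [string, number];

// Toy stand-in for a vector store: scores documents by the fraction of
// their words shared with the query, mimicking the shape of
// similaritySearchWithScore's [document, score] results.
class ToyMemoryStore {
  constructor(private docs: string[]) {}

  similaritySearchWithScore(query: string, k: number): ScoredResult[] {
    const queryWords = new Set(query.toLowerCase().split(/\s+/));
    return this.docs
      .map((doc): ScoredResult => {
        const words = doc.toLowerCase().split(/\s+/);
        const overlap = words.filter((w) => queryWords.has(w)).length;
        return [doc, overlap / words.length];
      })
      .sort((a, b) => b[1] - a[1])
      .slice(0, k);
  }
}
```

The removed assertion `expect(typeof ssWithScore[0][1] === 'number').toBe(true)` then corresponds to checking that the second element of each result tuple is a numeric score.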

@mattzcarey mattzcarey merged commit bf29301 into main Nov 14, 2023
4 checks passed
@mattzcarey mattzcarey deleted the feat/core-be-v2 branch November 14, 2023 09:16