git commit message (#656)
* custom message

* Add interactive commit prompt utility using inquirer for staged Git changes

* Add @inquirer/prompts dependency for interactive commit prompt
Upgrade various scripts to use provider:name syntax for model
Fix error logging in util.ts

* Enhance staging workflow and improve commit message generation process

* quote args

* Rename gcm.genai.mts and implement inner option for stream handling in chat completion logic

* Change default confirmation for staging changes to true in gcm.genai.mts

* Improve console logging and error handling with color-coded output and execution feedback

* "Update documentation and scripts to streamline automated Git commit message generation"
pelikhan authored Aug 27, 2024
1 parent bb44bb6 commit c497d30
Showing 19 changed files with 507 additions and 20 deletions.
147 changes: 147 additions & 0 deletions docs/src/content/docs/guides/auto-git-commit-message.mdx
@@ -0,0 +1,147 @@
---
title: "Automated Git Commit Messages"
keywords: ["GenAI", "Git", "Automation"]
sidebar:
order: 15
---

In software development, writing consistent and informative commit messages is crucial but often overlooked.
The task can become tedious, especially when you are in the flow of coding.
To help with this, we've crafted a [script tailored to automate generating Git commit messages](https://github.com/microsoft/genaiscript/blob/main/genaisrc/gcm.genai.mts),
ensuring they are meaningful and saving you time.

The script acts as a regular Node.js automation and uses [runPrompt](/genaiscript/reference/scripts/inner-prompts)
to issue calls to the LLM, then asks the user to confirm the generated text.

## 🔍 **Explaining the Script**

The script begins by importing necessary functions from [@inquirer/prompts](https://www.npmjs.com/package/@inquirer/prompts):

```ts
import { select, input, confirm } from "@inquirer/prompts"
```

These functions will be used to interact with the user, asking them to confirm actions or input data.

Next, we check if there are any staged changes in the Git repository:

```ts
let { stdout } = await host.exec("git", ["diff", "--cached"])
```

If no changes are staged:

```ts
if (!stdout) {
```
We ask the user whether they want to stage all changes. If they confirm, we stage everything; otherwise, we bail out.
```ts
    const stage = await confirm({
        message: "No staged changes. Stage all changes?",
        default: true,
    })
    if (stage) {
        await host.exec("git", ["add", "."])
        stdout = (await host.exec("git", ["diff", "--cached"])).stdout
    }
    if (!stdout) cancel("no staged changes")
}
```
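The branching above reduces to a three-way decision. A minimal sketch of that decision table (the `StagingAction` type and `decideStaging` helper are hypothetical illustrations, not part of the script):

```typescript
// Hypothetical model of the staging flow above:
// - staged changes exist       → proceed with them
// - none staged, user agrees   → stage everything, then re-check
// - none staged, user declines → cancel the run
type StagingAction = "proceed" | "stage-all" | "cancel"

function decideStaging(stagedDiff: string, stageAll: boolean): StagingAction {
    if (stagedDiff) return "proceed"
    return stageAll ? "stage-all" : "cancel"
}
```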

We generate an initial commit message using the staged changes:

```ts
message = (
    await runPrompt(
        (_) => {
            _.def("GIT_DIFF", stdout, { maxTokens: 20000 })
            _.$`GIT_DIFF is a diff of all staged changes, coming from the command:
\`\`\`
git diff --cached
\`\`\`
Please generate a concise, one-line commit message for these changes.
- do NOT add quotes`
        },
        { cache: false, temperature: 0.8 }
    )
).text
```

The prompt configuration above instructs the model to produce a concise, one-line message
describing the `git diff --cached` output, without surrounding quotes.
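Even with these instructions, models occasionally wrap the message in quotes or add extra lines. A small post-processing sketch that could normalize the output (the `sanitizeCommitMessage` helper is an assumption, not part of the script):

```typescript
// Hypothetical cleanup for an LLM-generated commit message:
// keep only the first line, trim whitespace, and strip one pair
// of wrapping quotes if present.
function sanitizeCommitMessage(raw: string): string {
    let msg = raw.split("\n")[0].trim()
    const first = msg[0]
    if (msg.length >= 2 && (first === '"' || first === "'") && msg[msg.length - 1] === first)
        msg = msg.slice(1, -1).trim()
    return msg
}
```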

The user then chooses how to proceed with the generated message:

```ts
choice = await select({
    message,
    choices: [
        { name: "commit", value: "commit", description: "accept message and commit" },
        ...
    ],
})
```

Options are given to edit or regenerate the message. If the user chooses to edit the message, we ask them to input a new message:

```ts
if (choice === "edit") {
    message = await input({
        message: "Edit commit message",
        required: true,
    })
    choice = "commit"
}
```

If the user chooses to commit the message, we commit the changes:

```ts
if (choice === "commit" && message) {
    console.log((await host.exec("git", ["commit", "-m", message])).stdout)
}
```
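Taken together, the steps form a loop: generate a message, let the user decide, and repeat until something is committed. A minimal sketch of that control flow (the `generate` and `choose` callbacks are hypothetical stand-ins for the LLM call and the inquirer prompts):

```typescript
// Hypothetical driver for the commit/edit/regenerate loop:
// `generate` stands in for runPrompt, `choose` for select/input;
// `choose` returns the accepted message, or null to regenerate.
async function commitLoop(
    generate: () => Promise<string>,
    choose: (message: string) => Promise<string | null>
): Promise<string> {
    while (true) {
        const draft = await generate()
        const accepted = await choose(draft)
        if (accepted !== null) return accepted
    }
}
```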

## 🚀 **Running the Script**

You can run this script using the [CLI](/genaiscript/reference/cli).

```bash
genaiscript run gcm
```

Since it uses the [@inquirer/prompts](https://www.npmjs.com/package/@inquirer/prompts) package, you will need to install this package first:

```bash
npm install --save-dev @inquirer/prompts
```

If you are using [npx](https://docs.npmjs.com/cli/v10/commands/npx), you can run the script without installing the packages first:

```bash
npx -p @inquirer/prompts -p genaiscript -- genaiscript run gcm
```

This command runs the script and guides you through generating and committing a Git message using AI, making your commits more informative and consistent.

You can wrap this command in a `gcm.sh` file or in the `scripts` section of your `package.json`:

```json '"gcm": "genaiscript run gcm"'
{
"devDependencies": {
"@inquirer/prompts": "...",
"genaiscript": "..."
},
"scripts": {
"gcm": "genaiscript run gcm"
}
}
```

Then you can run the script using:

```bash
npm run gcm
```
72 changes: 72 additions & 0 deletions genaisrc/gcm.genai.mts
@@ -0,0 +1,72 @@
import { select, input, confirm } from "@inquirer/prompts"

// Check for staged changes and stage all changes if none are staged
let { stdout } = await host.exec("git", ["diff", "--cached"])
if (!stdout) {
    const stage = await confirm({
        message: "No staged changes. Stage all changes?",
        default: true,
    })
    if (stage) {
        await host.exec("git", ["add", "."])
        stdout = (await host.exec("git", ["diff", "--cached"])).stdout
    }
    if (!stdout) cancel("no staged changes")
}

console.log(stdout)

let choice
let message
do {
    // Generate commit message
    message = (
        await runPrompt(
            (_) => {
                _.def("GIT_DIFF", stdout, { maxTokens: 20000 })
                _.$`GIT_DIFF is a diff of all staged changes, coming from the command:
\`\`\`
git diff --cached
\`\`\`
Please generate a concise, one-line commit message for these changes.
- do NOT add quotes`
            },
            { cache: false, temperature: 0.8 }
        )
    ).text

    // Prompt user for commit message
    choice = await select({
        message,
        choices: [
            {
                name: "commit",
                value: "commit",
                description: "accept message and commit",
            },
            {
                name: "edit",
                value: "edit",
                description: "edit message and commit",
            },
            {
                name: "regenerate",
                value: "regenerate",
                description: "regenerate message",
            },
        ],
    })

    // Handle user choice
    if (choice === "edit") {
        message = await input({
            message: "Edit commit message",
            required: true,
        })
        choice = "commit"
    }
    // Commit the changes
    if (choice === "commit" && message) {
        console.log((await host.exec("git", ["commit", "-m", message])).stdout)
    }
} while (choice !== "commit")
4 changes: 3 additions & 1 deletion package.json
@@ -63,7 +63,8 @@
"genai:test": "node packages/cli/built/genaiscript.cjs run test-gen",
"genai:blog-post": "node packages/cli/built/genaiscript.cjs run blog-generator",
"genai:readme": "node packages/cli/built/genaiscript.cjs run readme-updater",
"genai:blogify": "node packages/cli/built/genaiscript.cjs run blogify-sample --no-cache"
"genai:blogify": "node packages/cli/built/genaiscript.cjs run blogify-sample --no-cache",
"gcm": "node packages/cli/built/genaiscript.cjs run gcm"
},
"release-it": {
"github": {
@@ -74,6 +75,7 @@
}
},
"dependencies": {
"@inquirer/prompts": "^5.3.8",
"glob": "^11.0.0",
"zx": "^8.1.4"
}
7 changes: 4 additions & 3 deletions packages/cli/src/log.ts
@@ -1,18 +1,19 @@
import { stdout } from "node:process"
import console from "node:console"
import { CONSOLE_COLOR_DEBUG, CONSOLE_COLOR_WARNING, CONSOLE_COLOR_ERROR } from "../../core/src/constants"

export const info = console.error

export function debug(...args: any[]) {
    if (!isQuiet) console.error(...wrapArgs(34, args))
    if (!isQuiet) console.error(...wrapArgs(CONSOLE_COLOR_DEBUG, args))
}

export function warn(...args: any[]) {
    console.error(...wrapArgs(95, args))
    console.error(...wrapArgs(CONSOLE_COLOR_WARNING, args))
}

export function error(...args: any[]) {
    console.error(...wrapArgs(91, args))
    console.error(...wrapArgs(CONSOLE_COLOR_ERROR, args))
}

export let consoleColors = !!stdout.isTTY
6 changes: 5 additions & 1 deletion packages/cli/src/nodehost.ts
@@ -280,8 +280,12 @@ export class NodeHost implements RuntimeHost {
if (command === "python" && process.platform !== "win32")
    command = "python3"

const quoteify = (a: string) => (/\s/.test(a) ? `"${a}"` : a)
logVerbose(
    `exec ${cwd ? `${cwd}> ` : ""}${quoteify(command)} ${args.map(quoteify).join(" ")}`
)
trace?.itemValue(`cwd`, cwd)
trace?.item(`\`${command}\` ${args.join(" ")}`)
trace?.item(`${command} ${args.map(quoteify).join(" ")}`)

const { stdout, stderr, exitCode, failed } = await execa(
    command,
18 changes: 14 additions & 4 deletions packages/cli/src/run.ts
@@ -1,6 +1,6 @@
import { capitalize } from "inflection"
import { resolve, join, relative, dirname } from "node:path"
import { isQuiet } from "./log"
import { isQuiet, wrapColor } from "./log"
import { emptyDir, ensureDir, appendFileSync } from "fs-extra"
import { convertDiagnosticsToSARIF } from "./sarif"
import { buildProject } from "./build"
@@ -28,6 +28,7 @@ import {
UNRECOVERABLE_ERROR_CODES,
SUCCESS_ERROR_CODE,
RUNS_DIR_NAME,
CONSOLE_COLOR_DEBUG,
} from "../../core/src/constants"
import { isCancelError, errorMessage } from "../../core/src/error"
import { Fragment, GenerationResult } from "../../core/src/generation"
@@ -241,6 +242,7 @@ export async function runScript(
trace.options.encoder = await resolveTokenEncoder(info.model)
await runtimeHost.models.pullModel(info.model)
result = await runTemplate(prj, script, fragment, {
    inner: false,
    infoCb: (args) => {
        const { text } = args
        if (text) {
@@ -249,11 +251,19 @@
        }
    },
    partialCb: (args) => {
        const { responseChunk, tokensSoFar } = args
        const { responseChunk, tokensSoFar, inner } = args
        tokens = tokensSoFar
        if (responseChunk !== undefined) {
            if (stream) process.stdout.write(responseChunk)
            else if (!isQuiet) process.stderr.write(responseChunk)
            if (stream) {
                if (!inner) process.stdout.write(responseChunk)
                else
                    process.stderr.write(
                        wrapColor(CONSOLE_COLOR_DEBUG, responseChunk)
                    )
            } else if (!isQuiet)
                process.stderr.write(
                    wrapColor(CONSOLE_COLOR_DEBUG, responseChunk)
                )
        }
        partialCb?.(args)
    },
3 changes: 2 additions & 1 deletion packages/cli/src/server.ts
@@ -97,7 +97,7 @@ export async function startServer(options: { port: string }) {
trace: MarkdownTrace
): Promise<ChatCompletionResponse> => {
const { messages, model } = req
const { partialCb } = options
const { partialCb, inner } = options
if (!wss.clients?.size) throw new Error("no llm clients connected")

return new Promise<ChatCompletionResponse>((resolve, reject) => {
@@ -120,6 +120,7 @@
tokensSoFar,
responseSoFar,
responseChunk: chunk.chunk,
inner,
})
finishReason = chunk.finishReason as any
if (finishReason) {
3 changes: 2 additions & 1 deletion packages/core/src/aici.ts
@@ -143,7 +143,7 @@ const AICIChatCompletion: ChatCompletionHandler = async (
trace
) => {
const { messages, response_format, tools } = req
const { requestOptions, partialCb, cancellationToken } = options
const { requestOptions, partialCb, cancellationToken, inner } = options
const { headers, ...rest } = requestOptions || {}

if (tools?.length) throw new NotSupportedError("AICI: tools not supported")
@@ -342,6 +342,7 @@
responseSoFar: chatResp,
tokensSoFar: numTokens,
responseChunk: progress,
inner,
})
}
pref = chunk
2 changes: 2 additions & 0 deletions packages/core/src/chattypes.ts
@@ -78,6 +78,7 @@ export interface ChatCompletionsProgressReport {
    tokensSoFar: number
    responseSoFar: string
    responseChunk: string
    inner: boolean
}

export interface ChatCompletionsOptions {
@@ -90,4 +91,5 @@
    retry?: number
    retryDelay?: number
    maxDelay?: number
    inner: boolean
}
4 changes: 4 additions & 0 deletions packages/core/src/constants.ts
@@ -220,3 +220,7 @@ export const PLACEHOLDER_API_KEY = "<your token>"

export const VSCODE_CONFIG_CLI_VERSION = "cli.version"
export const VSCODE_CONFIG_CLI_PATH = "cli.path"

export const CONSOLE_COLOR_DEBUG = 34
export const CONSOLE_COLOR_WARNING = 95
export const CONSOLE_COLOR_ERROR = 91
1 change: 1 addition & 0 deletions packages/core/src/generation.ts
@@ -72,6 +72,7 @@ export interface GenerationOptions
        ModelOptions,
        EmbeddingsModelOptions,
        ScriptRuntimeOptions {
    inner: boolean
    cancellationToken?: CancellationToken
    infoCb?: (partialResponse: { text: string }) => void
    trace: MarkdownTrace
Expand Down