A gnorm solution for generating strongly typed Go code using database/sql with postgres.
This code generates CRUD calls that can be dropped into any Go project.
These templates make a few assumptions. If a primary key column has a default, the key is ignored on insert and the database generates it (e.g. a UUID or an auto-incrementing integer). This avoids the problem where Go's zero value is a valid key, so a forgotten ID gets inserted as ID == 0 (which is usually wrong).
Columns named updated_at and created_at are assumed to be maintained by the database: they are never inserted or updated manually, and their values are returned (along with generated IDs) on insert using Postgres' RETURNING clause.
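To make that concrete, here's a rough sketch of the insert pattern those assumptions produce. The table, column names, and function signature below are hypothetical, not copied from the generated code; look under the generated directory for the real output.

package users

import (
	"context"
	"database/sql"
	"time"
)

// Row mirrors a hypothetical users table.
type Row struct {
	ID        int64
	Name      string
	Age       int
	CreatedAt time.Time
	UpdatedAt time.Time
}

// Insert sketches how database-generated values are handled: id, created_at,
// and updated_at are left out of the column list and read back via RETURNING,
// so the database stays the source of truth for them.
func Insert(ctx context.Context, db *sql.DB, r *Row) error {
	const sqlstr = `INSERT INTO public.users (name, age)
VALUES ($1, $2)
RETURNING id, created_at, updated_at`
	return db.QueryRowContext(ctx, sqlstr, r.Name, r.Age).
		Scan(&r.ID, &r.CreatedAt, &r.UpdatedAt)
}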
These templates are the ones we use at Mattel for our gRPC API server that saves data in postgres.
Everything under the generated directory was generated using gnorm. The database was set up using the schema defined in gnorm.sql, and then gnorm gen was run with the configuration specified in gnorm.toml. Read the output under generated and then look at the templates under gnorm_templates to get an idea of what the templates output.
Note that there's an example plugin under gnorm_templates/plugin. You'll need to go build that plugin first, since the templates expect a plugin executable in that directory.
The templates, under gnorm_templates, are where all the work goes. They generate strongly typed code for CRUD operations, including strongly typed queries.
Do you have code in your repo that looks like this?
var product Product
db.First(&product, "code = ?", "L1212") // find product with code L1212
// Update - update product's price to 2000
db.Model(&product).Update("Price", 2000)
Sure, it works now. But what happens if you pass, say, a UUID into that first line? It'll compile. It'll run... until it hits that exact line, and then it errors out at runtime.
What happens if you typo "code" as "coed" in that first line? Again, a runtime error.
What happens if you someday migrate your database and decide that "Price" should now be called "ItemPrice"? Runtime error.
Here's an example that uses the code generated by this repo instead of an ORM (this is the stuff you write):
import "github.com/yourOrg/yourProj/models/gnorm/user"
func insert(ctx context.Context, db *sql.DB) error {
// plain old boring struct. Look ma, no struct tags!
// The fields of Row are all generated from your database schema,
// so they're guaranteed to match the DB and guaranteed to be strongly typed.
u := users.Row{
Name: "Bob Smith",
Age: 84,
Role: enum.AdminRole,
}
// user.Insert takes a user.Row. Strongly typed!
if err := users.Insert(ctx, db, u); err != nil {
return err
}
}
func findOldMods(ctx context.Context, db *sql.DB) ([]*users.Row, error) {
	// Strongly typed queries. Age can only be compared to an int, Role can
	// only be compared to the Role enum type.
	mods, err := users.Query(ctx, db, gnorm.AndClause(
		users.AgeCol.GreaterThan(40),
		users.RoleCol.In(enum.AdminRole, enum.ModeratorRole),
	))
	if err != nil {
		return nil, err
	}
	return mods, nil
}
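How do those typed column helpers work? Roughly, the templates emit one small typed value per column, and its comparison methods build the where clause. The sketch below is illustrative only; the names, method set, and clause plumbing in the real generated code may differ.

package users

import "strconv"

// IntField is a typed handle for an integer column; something like AgeCol
// would be generated for users.age.
type IntField struct{ col string }

var AgeCol = IntField{col: "age"}

// GreaterThan only accepts an int, so comparing age to a string or a UUID
// is a compile error rather than a runtime one.
func (f IntField) GreaterThan(v int) Clause {
	return Clause{expr: f.col + " > ", val: v}
}

// Clause is a single "column op $N" fragment plus its bound value.
type Clause struct {
	expr string
	val  interface{}
}

// String renders the fragment using the next placeholder index.
func (c Clause) String(idx *int) string {
	s := c.expr + "$" + strconv.Itoa(*idx)
	*idx++
	return s
}

// Values returns the arguments to pass to the query.
func (c Clause) Values() []interface{} { return []interface{}{c.val} }

A clause built this way plugs straight into the where.String(&idx) and where.Values() calls you can see in the generated Query function further down.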
The best thing about this is that if you migrate your database and re-run gnorm, you'll get compile-time errors if column, table, or enum names or types have changed. Decided you wanted to change the type column on users to user_role? With gnorm and postgres-go, it's no big deal: you'll get a compile error everywhere in your data layer where you accessed it via the old name.
Not only do you get compile-time errors, but there's no reflection, so it's screaming-fast.
Gnorm lets you generate files in whatever directory structure you like, so we can use Go's package namespacing to avoid unwieldy names like db.UsersAgeCol and instead just have users.AgeCol, without worrying about whether some other table has an age column of a different type.
The final benefit is that gnorm generates boring old Go code. This is the code you'd write, written the way you'd write it, if you had the time and patience to do that (and if you didn't already know about gnorm). It's trivial to read and debug. There's no magic.
// Inside generated/public/authors/authors.go

// Query retrieves rows from 'authors' as a slice of Row.
func Query(ctx context.Context, db generated.DB, where generated.WhereClause) ([]*Row, error) {
	const origsqlstr = `SELECT
id, name
FROM public.authors WHERE (`
	idx := 1
	sqlstr := origsqlstr + where.String(&idx) + ") "
	var vals []*Row
	q, err := db.QueryContext(ctx, sqlstr, where.Values()...)
	if err != nil {
		return nil, errors.Wrap(err, "query Authors")
	}
	for q.Next() {
		r := Row{}
		err := q.Scan(&r.ID,
			&r.Name,
		)
		if err != nil {
			return nil, errors.Wrap(err, "query Authors")
		}
		vals = append(vals, &r)
	}
	return vals, nil
}
All the functions take a gnorm.DB, an interface satisfied by both *sql.DB and *sql.Tx, so you can easily bundle a bunch of changes into a transaction if you want.
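For example, to bundle two inserts into one transaction you can pass a *sql.Tx anywhere you'd pass the *sql.DB. This is a sketch only; it assumes the users.Insert signature from the example above and the illustrative import path.

import (
	"context"
	"database/sql"

	"github.com/yourOrg/yourProj/models/gnorm/users"
)

func insertBoth(ctx context.Context, db *sql.DB, a, b users.Row) error {
	tx, err := db.BeginTx(ctx, nil)
	if err != nil {
		return err
	}
	// Both inserts run on the same *sql.Tx, so either both commit or neither does.
	if err := users.Insert(ctx, tx, a); err != nil {
		tx.Rollback()
		return err
	}
	if err := users.Insert(ctx, tx, b); err != nil {
		tx.Rollback()
		return err
	}
	return tx.Commit()
}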
Joins are not yet implemented, and there are probably a lot of optimizations that could be made to the code. Pull requests are welcome.