Replies: 2 comments
- We're running full async in production. Feel free to explore the setup at https://gitlab.com/fluidattacks/universe/-/tree/trunk/integrates/back/src/api
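  For anyone wondering what "full async" looks like in practice with Ariadne, here is a minimal sketch (not the fluidattacks code, just an illustration): the schema is served through Ariadne's ASGI application and every resolver is a coroutine, so request handling never blocks the event loop.

  ```python
  from ariadne import QueryType, make_executable_schema
  from ariadne.asgi import GraphQL

  type_defs = """
      type Query {
          status: String!
      }
  """

  query = QueryType()


  @query.field("status")
  async def resolve_status(*_):
      # Async resolver: awaitable work (HTTP calls, async DB drivers, ...)
      # can run here without blocking other requests.
      return "ok"


  schema = make_executable_schema(type_defs, query)

  # ASGI app; serve it with an async server, e.g. `uvicorn app:app`.
  app = GraphQL(schema, debug=True)
  ```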
- Depending on the project, we are using either Starlette, FastAPI, or Mangum. For dataloaders we are using aiodataloader. Our Ariadne APIs are proxies for microservices, so we are not using databases much in them. But when we do, it's either encode/databases + Alembic, or SQLAlchemy 2 DBAL and Alembic.
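  To illustrate the aiodataloader part, here is a sketch (with a hypothetical service URL and type names, not one of our actual proxies): each request gets its own loader, the loader batches all `load()` calls from one GraphQL operation into a single downstream request, and results come back in key order as aiodataloader requires.

  ```python
  import httpx
  from aiodataloader import DataLoader
  from ariadne import QueryType, make_executable_schema
  from ariadne.asgi import GraphQL

  type_defs = """
      type Query {
          user(id: ID!): User
      }
      type User {
          id: ID!
          name: String
      }
  """


  class UserLoader(DataLoader):
      async def batch_load_fn(self, keys):
          # One request to the (hypothetical) users microservice for all
          # keys collected during this tick of the event loop.
          async with httpx.AsyncClient() as client:
              response = await client.get(
                  "http://users-service/users",
                  params={"ids": ",".join(str(key) for key in keys)},
              )
          users_by_id = {user["id"]: user for user in response.json()}
          # aiodataloader expects results in the same order as the keys.
          return [users_by_id.get(key) for key in keys]


  query = QueryType()


  @query.field("user")
  async def resolve_user(_, info, id):
      return await info.context["user_loader"].load(id)


  schema = make_executable_schema(type_defs, query)

  # Build a fresh loader per request so batching/caching never leaks
  # across requests (the exact context_value signature depends on the
  # Ariadne version in use).
  app = GraphQL(
      schema,
      context_value=lambda request: {"request": request, "user_loader": UserLoader()},
  )
  ```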
Hi community, we at the boxtribute project were wondering which stacks people use Ariadne in.
Especially to the Mirumee folks @rafalp @patrys: can you share what stack works for you?
Background
For the backend of our web application, we started with Flask + Ariadne + Peewee. At some point we had to integrate batching via dataloaders, so we ended up with Flask running Ariadne asynchronously, with async dataloaders wrapping synchronous Peewee ORM calls...
We'd like to be consistent 😅
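Concretely, the current mix looks roughly like the sketch below (model and database names are simplified placeholders, and offloading the sync Peewee query to a worker thread via `asyncio.to_thread`, Python 3.9+, is just one way to keep it from blocking the event loop):

```python
import asyncio

from aiodataloader import DataLoader
from peewee import CharField, Model, SqliteDatabase

db = SqliteDatabase("app.db")  # placeholder database


class Base(Model):
    class Meta:
        database = db


class User(Base):  # placeholder model
    name = CharField()


class UserLoader(DataLoader):
    async def batch_load_fn(self, keys):
        # Peewee is synchronous, so the blocking query runs in a worker
        # thread while the async GraphQL execution stays responsive.
        def run_query():
            rows = User.select().where(User.id.in_(list(keys)))
            return {row.id: row for row in rows}

        users_by_id = await asyncio.to_thread(run_query)
        # Results must come back in the same order as the requested keys.
        return [users_by_id.get(key) for key in keys]
```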