Full-stack starter kit for building cross-platform mobile AI apps with OpenAI LLMs, real-time streaming text, and chat user interfaces.
Before diving into the setup, I'd like to extend sincere thanks and recognition to the React Native AI repository and its author. The React Native portion of this project draws heavily from that repository; I've refined the original code to focus on OpenAI support, streamlining it for that specific purpose.
On the backend side, as a Python enthusiast, I noticed a lack of templates or starter kits combining React Native with FastAPI and Langchain. To fill this gap, I developed a simple yet effective FastAPI backend. It features a clean, easily extendable folder structure and a single endpoint, making it ideal for Python developers stepping into the world of mobile AI applications.
There are two methods to set up the backend:
### Option 1: Using Docker

- Add your `OPENAI_API_KEY` to the `.env` file located at `fastapi-backend-app/.env`.
- Execute `docker-compose up -d` from the root directory to start the backend.
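For orientation, a compose file for this setup typically looks something like the following; the service name, build context, and port mapping here are assumptions, so check the repo's actual `docker-compose.yml`.

```yaml
# Hypothetical shape of the compose file; details may differ from the repo's.
services:
  backend:
    build: ./fastapi-backend-app   # assumed build context
    ports:
      - "8000:8000"                # expose the FastAPI app on port 8000
    env_file:
      - ./fastapi-backend-app/.env # supplies OPENAI_API_KEY to the container
```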
### Option 2: Setting Up Locally with a Python Virtual Environment (venv)

- Change to the backend directory: `cd fastapi-backend-app/`.
- Create a Python virtual environment: `python -m venv env`.
- Activate the virtual environment:
  - On macOS/Linux: `source env/bin/activate`
  - On Windows: `env\Scripts\activate`
- Install the required dependencies: `pip install -r requirements.txt`.
- Add your `OPENAI_API_KEY` to the `fastapi-backend-app/.env` file.
- Start the FastAPI backend app with `uvicorn --reload --proxy-headers --host 0.0.0.0 --port 8000 src.main:app`.
  - Note: If you encounter a `ModuleNotFoundError`, make sure the virtual environment is activated.
To run the React Native frontend:

- Navigate to the React Native app directory: `cd react-native-app/`.
- Install the dependencies: `npm i`.
- Launch the React Native app: `npm start`.