```json
{
  "bigqueryTableID": "yourBigQueryProject.yourDataset.yourTable",
  "tableLocation": "EU",
  "firestoreCollection": "theFirestoreCollectionName",
  "columnName": "documentID"
}
```
You need to specify the BigQuery table ID and the location the data is read from, the Firestore collection the data is written to, and the column that serves as the unique identifier for the Firestore documents. Every row in the BigQuery table is written to one Firestore document, and the document ID is taken from the table column named in `columnName`. A rough sketch of this behavior follows.
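The actual implementation ships with this repository; the following is only a minimal sketch of what the "init" entry point might do, assuming the JSON object above is sent as the request body (all variable names here are illustrative):

```js
const { BigQuery } = require('@google-cloud/bigquery');
const { Firestore } = require('@google-cloud/firestore');

const bigquery = new BigQuery();
const firestore = new Firestore();

// HTTP entry point ("init"): reads the whole BigQuery table and mirrors it to Firestore.
exports.init = async (req, res) => {
  const { bigqueryTableID, tableLocation, firestoreCollection, columnName } = req.body;

  // Read all rows from the configured table in the configured location.
  const [rows] = await bigquery.query({
    query: `SELECT * FROM \`${bigqueryTableID}\``,
    location: tableLocation,
  });

  // One Firestore document per row; the document ID comes from the chosen column.
  await Promise.all(
    rows.map((row) =>
      firestore.collection(firestoreCollection).doc(String(row[columnName])).set(row)
    )
  );

  res.status(200).send(`Transferred ${rows.length} rows.`);
};
```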
- Create a service account

Create a Service Account in Google Cloud Platform for the Cloud Function created in the next step. Roles needed: BigQuery Job User, Cloud Datastore User, Cloud Functions Invoker. A sketch of the corresponding gcloud commands follows.
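As a sketch, the service account and its role bindings could be created from the CLI like this (the account name `bq-to-firestore` and `YOUR_PROJECT_ID` are placeholders, not names from this repository):

```sh
# Create the service account (name is a placeholder).
gcloud iam service-accounts create bq-to-firestore \
  --display-name="BigQuery to Firestore transfer"

# Grant the three roles listed above.
for role in roles/bigquery.jobUser roles/datastore.user roles/cloudfunctions.invoker; do
  gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member="serviceAccount:bq-to-firestore@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
    --role="$role"
done
```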
- Create the Cloud Function

Create a Cloud Function with trigger type HTTP and "Allow unauthenticated invocations" enabled. Choose the service account created in the previous step. As runtime choose Node.js 16 and make sure the entry point is "init". Copy the code from this repository, including the package.json. A deployment sketch via the gcloud CLI is shown below.
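Deploying through the console works as described; as an alternative sketch with the gcloud CLI (function name and service account reuse the placeholder names from above):

```sh
# Run from the directory containing the repository code and package.json.
gcloud functions deploy bq-to-firestore \
  --runtime=nodejs16 \
  --trigger-http \
  --allow-unauthenticated \
  --entry-point=init \
  --service-account=bq-to-firestore@YOUR_PROJECT_ID.iam.gserviceaccount.com
```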
- Create a scheduled job

To run the data transfer on a schedule, you can use Cloud Scheduler. Create an HTTP job targeting the function's URL, set the content type to "application/json", and put the JSON object from above into the request body, as in the sketch below.
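For example, a daily job might be created like this (the job name, schedule, and function URL are placeholders; the body is the config object from above):

```sh
gcloud scheduler jobs create http bq-to-firestore-daily \
  --schedule="0 3 * * *" \
  --uri="https://REGION-YOUR_PROJECT_ID.cloudfunctions.net/bq-to-firestore" \
  --http-method=POST \
  --headers="Content-Type=application/json" \
  --message-body='{"bigqueryTableID":"yourBigQueryProject.yourDataset.yourTable","tableLocation":"EU","firestoreCollection":"theFirestoreCollectionName","columnName":"documentID"}'
```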