RevenueCat -> Cloud Storage -> BigQuery, Data Transfer Configuration

  • 11 March 2022
  • 5 replies


Hey team, I have an integration that sends a daily dump of our subscription info into Cloud Storage.

Now, I’d like to create a daily job that imports this into BigQuery. I’m looking at the “Data Transfer” configuration and am a bit stuck on the “Cloud Storage URI” field.

How can I get it to pick up _only_ the CSV for a certain date string?


I know this is more of a Google Cloud question, but I’d guess everyone who exports to Storage wants to do this too, so thought I’d ask here: if you could share your data transfer configuration with me, I’d appreciate it!
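Not the Data Transfer UI itself, but in case it helps anyone landing here: BigQuery accepts a single `*` wildcard in a Cloud Storage URI, so you can scope a table to one date’s folder. A sketch as an external table definition (the bucket name, folder layout, and dataset/table names here are hypothetical, and it assumes the CSVs have a header row):

```sql
-- Hypothetical layout: one folder per export date inside the bucket.
-- Only one * wildcard is allowed per URI, and it must come after the bucket name.
CREATE OR REPLACE EXTERNAL TABLE my_dataset.rc_export_one_day
OPTIONS (
  format = 'CSV',
  skip_leading_rows = 1,  -- assumes the export CSVs include a header row
  uris = ['gs://your-bucket/2022-03-11/*.csv']
);
```

Swapping the dated folder for a wildcard (`gs://your-bucket/*.csv`) pulls in every date at once instead.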




@Stepan Parunashvili I am running into the same issue! Did you ever find a way to get the data into BQ?


Alright, I think I figured it out (mostly)! Turns out that BQ allows wildcards when creating a table from a GCS folder, so I was able to point BQ at the base folder (the one that holds all of the dated data folders) and reference it with a wildcard. The resulting table contained the files from every date.

What is still missing with this approach is the actual date of each data file. All of the CSVs are joined together, with no way of determining which date a particular record belongs to.
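One way to recover the date: when you query an *external* table (as opposed to loading the files into a native table), BigQuery exposes a `_FILE_NAME` pseudo-column holding each row’s source object path. If the export date appears in that path, you can parse it out per record. A sketch, assuming an external table `my_dataset.rc_exports` defined over the base-folder wildcard URI (names hypothetical) and a `YYYY-MM-DD` date somewhere in the path:

```sql
-- _FILE_NAME is only available on external tables, and is not included in SELECT *.
SELECT
  PARSE_DATE('%Y-%m-%d',
             REGEXP_EXTRACT(_FILE_NAME, r'(\d{4}-\d{2}-\d{2})')) AS export_date,
  *
FROM my_dataset.rc_exports;
```

From there you could schedule this query to materialize a date-stamped native table.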


Hi guys,

How did you solve this?

I am stuck trying to get the data from Google Cloud Storage into BigQuery in an efficient way.




+100 on this thread, cc @Jens. This is a huge data workflow that seems pretty complex when you factor in the file/folder structure going into BQ.


Hello, have you guys solved it? We are looking into ways of getting access to RC data in BigQuery.