Integrating BigQuery with Bifrost

Google BigQuery is a fully managed, serverless cloud data warehouse that lets you store and analyze petabyte-scale datasets.

Bifrost supports Google BigQuery as a source from which you can ingest data and route it to your SuprSend workspace.

Granting permissions

Bifrost requires certain IAM permissions on your BigQuery warehouse in order to access data from it.

Perform the following steps in the exact order to grant these permissions:

Step 1: Creating a role and granting permissions

  1. Go to the Roles section of the Google Cloud Platform dashboard and click CREATE ROLE.

  2. Fill in the role details.

  3. Click ADD PERMISSIONS and add the following permissions:

bigquery.datasets.get
bigquery.jobs.create
bigquery.jobs.list
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list

  4. Finally, click CREATE.
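If you prefer the CLI over the console, the same custom role can be defined in a YAML file and created with `gcloud`. A minimal sketch; the role title, description, role ID, and project ID below are placeholders, not values from this document:

```yaml
# bifrost-role.yaml -- custom role definition with the permissions listed above
title: Bifrost BigQuery Reader
description: Read-only BigQuery access for Bifrost ingestion
stage: GA
includedPermissions:
- bigquery.datasets.get
- bigquery.jobs.create
- bigquery.jobs.list
- bigquery.tables.get
- bigquery.tables.getData
- bigquery.tables.list
```

The file can then be applied with `gcloud iam roles create bifrostReader --project=<your-project-id> --file=bifrost-role.yaml`, where `bifrostReader` is a placeholder role ID.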

Step 2: Creating a service account and attaching a role to it

  1. Go to Service Accounts and select the project that contains the dataset or table you want to use.

  2. Click CREATE SERVICE ACCOUNT.

  3. Fill in the service account details and click CREATE AND CONTINUE.

  4. Fill in the role details, attaching the role created in Step 1, and click CONTINUE.

  5. Click DONE to return to the list of service accounts.

Step 3: Creating and downloading the JSON key

  1. Under Actions for the service account you just created, click the three dots icon and select Manage keys.

  2. Click ADD KEY, then Create new key.

  3. Select JSON and click CREATE.

  4. A JSON key file is downloaded to your system. You will need this file when creating the BigQuery warehouse connection in Bifrost, as described later in this document.
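Before wiring the key into Bifrost, it can help to sanity-check that the downloaded file really is a service account key. A minimal sketch using only the Python standard library; the helper name and the fields checked are our own choice, based on the fields every Google service account key contains:

```python
import json

# Fields present in every Google service account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(path: str) -> bool:
    """Return True if the file parses as JSON and carries the fields
    a service account key is expected to have."""
    with open(path) as f:
        key = json.load(f)
    return key.get("type") == "service_account" and REQUIRED_FIELDS <= key.keys()
```

For example, `looks_like_service_account_key("bifrost-369512-bdffacc07706.json")` should return True for the file downloaded above.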

Step 4: Modify the Docker Compose file

Save the downloaded JSON file in the same folder as your Bifrost config file, and update your Docker Compose file to pass the file name as the GOOGLE_APPLICATION_CREDENTIALS environment variable to both the Bifrost server and worker instances, as in the bifrost-server and bifrost-worker services in the following example.

services:
  redis:
    container_name: redis
    image: "bitnami/redis:latest"
    volumes:
      - ./redis-data:/bitnami/redis/data
    environment:
      - ALLOW_EMPTY_PASSWORD=yes
  bifrost-server:
    container_name: bifrost-server
    restart: on-failure
    image: "ghcr.io/suprsend/bifrost:main"
    environment:
      - GOOGLE_APPLICATION_CREDENTIALS=bifrost-369512-bdffacc07706.json # Only required for BigQuery integration
    entrypoint: /bifrost-server
    depends_on:
      - redis
    volumes:
      - "./:/bifrost/"
  bifrost-worker:
    container_name: bifrost-worker
    restart: on-failure
    image: "ghcr.io/suprsend/bifrost:main"
    environment:
      - GOOGLE_APPLICATION_CREDENTIALS=bifrost-369512-bdffacc07706.json # Only required for BigQuery integration
    entrypoint: /bifrost-worker
    depends_on:
      - redis
    volumes:
      - "./:/bifrost/"
  caddy:
    container_name: caddy
    image: caddy:latest
    depends_on:
      - redis
      - bifrost-server
    ports:
      - "443:443"
      - "80:80"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
      - ./caddy:/data/caddy/pki

Step 5: Create BigQuery Connection in Bifrost

  1. Go to Connection Manager and create a new connection.

  2. Enter the connection string in the following format:

bigquery://projectid/dataset

  3. Click Test & Save.
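The connection string follows standard URI conventions, so its parts can be pulled apart with Python's standard library. A sketch; `my-project` and `my_dataset` are placeholder values, not ones from this document:

```python
from urllib.parse import urlparse

# Placeholder project and dataset in the bigquery://projectid/dataset format.
conn = "bigquery://my-project/my_dataset"

parts = urlparse(conn)
project = parts.netloc            # the project ID, e.g. "my-project"
dataset = parts.path.lstrip("/")  # the dataset name, e.g. "my_dataset"
```

This is handy for double-checking that the project ID and dataset name land where Bifrost expects them before clicking Test & Save.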
