BigQuery input to TD via Data Connector

Import data from BigQuery by specifying a BigQuery SQL statement.

If you want to import data from BigQuery today, a TD user can dump a BigQuery job result into GCS and then use the GCS Data Connector. However, this requires a lot of manual work, and it makes the data pipeline difficult to manage for existing BigQuery users.

It would be much easier if we had a BigQuery input Data Connector: the user would just specify a BigQuery SQL statement, and the connector would handle all of the underlying complexity.

  • Kazuki Ohta
  • May 14 2016
  • Under Review
  • Admin
    Rob Parrish commented
    September 29, 2016 20:24

    While we do aim to have this available, please note that there is a workaround for this today. Specifically, you can export from BigQuery to the GCS object store, and then use our GCS Data Connector to load the data into Treasure Data.
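
    For reference, a minimal sketch of the BigQuery-side half of this workaround, using the google-cloud-bigquery Python client (the project, dataset, table, and bucket names are hypothetical placeholders):

        from google.cloud import bigquery

        client = bigquery.Client(project="my-project")  # hypothetical project ID

        # Export an existing BigQuery table to GCS; the default destination
        # format is CSV. The TD GCS Data Connector can then load these files.
        extract_job = client.extract_table(
            "my-project.my_dataset.my_table",          # hypothetical source table
            "gs://my-bucket/exports/my_table-*.csv",   # hypothetical GCS path
        )
        extract_job.result()  # block until the export job completes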

  • Ryutaro Yada commented
    June 12, 2017 05:21

    If a BigQuery table has RECORD (nested) columns, CSV-formatted export to GCS is not available.
    JSON- or Avro-formatted export is possible, but the output embeds type information and is usually difficult to parse with the Data Connector.

    This is the case with Firebase data exported to BigQuery, for example.

    In that case, the user has to copy data from the original table into a temporary table, using a query that produces CSV-compatible (flattened) output.
    Then, the data is exported from the temporary table to GCS.
    There is no way to output a query result directly to GCS.
    So the total workflow is as follows.

    BQ -(1)-> BQ -(2)-> GCS -(3)-> TD

    1. Run a query that writes a CSV-compatible result to a temporary table.
    2. Export the temporary table to GCS with a table export.
    3. Load from GCS with the Data Connector for GCS.

    This workflow is very costly and not friendly to users.
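
    For concreteness, a minimal sketch of steps 1 and 2 with the google-cloud-bigquery Python client (all project, dataset, table, and bucket names are hypothetical placeholders):

        from google.cloud import bigquery

        client = bigquery.Client(project="my-project")  # hypothetical project ID

        # Step 1: flatten the RECORD columns into a CSV-compatible temporary table.
        job_config = bigquery.QueryJobConfig(
            destination="my-project.my_dataset.tmp_flat",  # hypothetical temp table
            write_disposition="WRITE_TRUNCATE",
        )
        client.query(
            "SELECT event_name, event_timestamp"
            " FROM `my-project.my_dataset.events`",        # hypothetical flattening query
            job_config=job_config,
        ).result()

        # Step 2: export the temporary table to GCS as CSV (the default format).
        client.extract_table(
            "my-project.my_dataset.tmp_flat",
            "gs://my-bucket/exports/tmp_flat-*.csv",       # hypothetical GCS path
        ).result()

        # Step 3 is then the existing Data Connector for GCS.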

    If we had a Data Connector for BQ that could import query results directly into TD,
    the data flow would be much simpler, as follows.

    BQ -> TD
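
    Internally, such a connector would only need to run the user's SQL and stream the result rows; a minimal sketch of that idea with the google-cloud-bigquery Python client (the query and names are hypothetical):

        from google.cloud import bigquery

        client = bigquery.Client(project="my-project")  # hypothetical project ID

        # Run the user-supplied SQL and iterate the result rows directly;
        # no temporary table and no GCS staging step are needed.
        rows = client.query(
            "SELECT event_name, event_timestamp FROM `my-project.my_dataset.events`"
        ).result()
        for row in rows:
            record = dict(row.items())  # one record per row
            # hand the record off to the TD import step here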