
qoega | 3 years ago

I don't know exactly why you need BigQuery in your scenario. If I wanted Postgres->ClickHouse, it would take a single INSERT ... SELECT run from ClickHouse.

If you just need a CSV result dumped to GCS, you can use clickhouse-local mode, which has all the same features: integrations with Postgres and GCP, all the output formats, etc.
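A minimal sketch of that flow, under the assumption that Postgres is reachable at db.example.com and the GCS bucket has HMAC keys (every host, bucket, credential, and column name below is a placeholder):

```sql
-- Dump a Postgres table straight to a CSV object on GCS.
-- GCS is addressed through ClickHouse's s3() table function,
-- using GCS's S3-interoperability endpoint and HMAC credentials.
INSERT INTO FUNCTION s3(
    'https://storage.googleapis.com/my-bucket/exports/users.csv',
    'HMAC_KEY', 'HMAC_SECRET', 'CSV')
SELECT id, email, created_at
FROM postgresql('db.example.com:5432', 'mydb', 'users', 'pg_user', 'pg_password');
```

Run it with `clickhouse-local --query "..."`; nothing is stored locally, the rows stream from Postgres to GCS.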

The only catch is that it is not a service with a UI where your data analysts can click and drag what they want to export. But the SQL can be simple enough for them to write, and you need nothing more than a trivial cron-job analogue to run it.
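That cron-job analogue can be a single crontab line; the script path and log path here are illustrative:

```
# m h dom mon dow   command: run the nightly export at 02:00
0 2 * * * clickhouse-local --query "$(cat /etc/etl/nightly_export.sql)" >> /var/log/etl.log 2>&1
```

Analysts maintain the .sql file; ops owns the one-line schedule.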


yashap | 3 years ago

We're using BigQuery as our data warehouse, with a dashboarding UI (Metabase) connected to it. We're not looking to do local analysis (clickhouse-local); this is about dumping data into a central data warehouse. The data warehouse could indeed be ClickHouse, but we're using BigQuery.

Either way, the need would be similar - a simple, cheap tool, with a web UI, that lets analysts easily set up nightly jobs to dump data out of Postgres and into the data warehouse.

qoega | 3 years ago

There is no need to use clickhouse-local for 'local' analysis. It is just the same ClickHouse, but it will not store any data locally. Just use it as an ETL tool. Something like this will work:

  INSERT INTO FUNCTION s3(
      'https://storage.googleapis.com/BUCKET/test_data/file.csv',
      'XXXXX', 'XXXXX', CSV)
  SELECT col1, col2
  FROM postgresql(postgres1, schema='schema1', table='table1')
  WHERE col1 > col2;
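The `postgres1` argument in that postgresql() call is a named collection: connection details defined once in the server config, so queries don't embed credentials. A sketch of such a definition, with placeholder values throughout:

```xml
<clickhouse>
    <named_collections>
        <!-- referenced from SQL as postgresql(postgres1, ...) -->
        <postgres1>
            <host>db.example.com</host>
            <port>5432</port>
            <user>pg_user</user>
            <password>pg_password</password>
            <database>mydb</database>
        </postgres1>
    </named_collections>
</clickhouse>
```

Per-query arguments like schema='schema1' and table='table1' then override or extend the collection's defaults.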