This got me curious about our Exasol environment, which we've been running since 2016 at Piedmont Healthcare. We average 2 million queries per day (DDL/DML/DQL). Our query failure rate is ~0.1%, and only 7% of those failures were due to hitting resource limits; the rest were SQL issues: constraint errors, data type issues, etc. Average connected users is ~400. Average concurrent queries is ~7, with a daily max averaging ~78. Average query time across DQL statements is around 10 seconds, but that figure is inflated by some extreme outliers -- I have users who like to put 200k values in a WHERE clause IN statement, and Tableau sometimes likes to write gnarly SQL with LOD calcs and relationship models.

TPC-H benchmarks are what convinced us to purchase Exasol 10 years ago. Still happy with that decision! Congrats to the Exasol team on these results vs ClickHouse.
asteroidtunnel|1 month ago
"200k values in a WHERE clause IN statement"? What is that column about?
Average concurrent queries is ~7 over what time period?
ugamarkj|1 month ago
Regarding the 200k values in a WHERE clause: we have some users who do research across published data sources in Tableau. They will copy account IDs from one report and paste them into a filter in another. Our connections from Tableau to Exasol are live, and Tableau doesn't have great guardrails on the SQL it issues to the database.
The concurrent-query figure comes from a daily statistics table in Exasol. There are average and max concurrency measures aggregated per day; I averaged those over the last 30 days. Exasol doesn't really explain its sampling methodology in the documentation: https://docs.exasol.com/db/latest/sql_references/system_tabl...
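For anyone who wants to pull the same numbers, it's roughly a query like this -- note the table and column names here (EXA_STATISTICS.EXA_USAGE_DAILY, QUERIES, QUERIES_MAX, INTERVAL_START) are from memory and may not match exactly, so verify them against the system table docs linked above:

```sql
-- Sketch: 30-day averages of the per-day concurrency measures.
-- Table/column names are assumptions -- check the EXA_STATISTICS docs.
SELECT AVG(QUERIES)     AS avg_concurrent_queries,   -- per-day average concurrency
       AVG(QUERIES_MAX) AS avg_daily_max_concurrency -- per-day peak concurrency
FROM   EXA_STATISTICS.EXA_USAGE_DAILY
WHERE  INTERVAL_START >= ADD_DAYS(CURRENT_DATE, -30);
```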