
qoega | 3 years ago

The license states the following. All other modifications are not standardized, so you can't just compare systems. Otherwise there would already be another standardized benchmark in the list you propose to run and publish.

> c. Public Disclosure: You may not publicly disclose any performance results produced while using the Software except in the following circumstances: (1) as part of a TPC Benchmark Result. For purposes of this Agreement, a "TPC Benchmark Result" is a performance test submitted to the TPC, documented by a Full Disclosure Report and Executive Summary, claiming to meet the requirements of an official TPC Benchmark Standard. You agree that TPC Benchmark Results may only be published in accordance with the TPC Policies, viewable at http://www.tpc.org (2) as part of an academic or research effort that does not imply or state a marketing position (3) any other use of the Software, provided that any performance results must be clearly identified as not being comparable to TPC Benchmark Results unless specifically authorized by TPC.

menaerus | 3 years ago

I see, thanks for the context, it seems like a PITA.

But given that each database system has its own flavor of SQL, vanilla TPC benchmarks may not work out of the box, so one needs to tweak them a bit, and that tweaking might be exactly what disqualifies the published results from falling under any of the clauses above.

I can also imagine that the combination of clauses (2) and (3) is what some of those who publish results are taking advantage of.

[1] https://www.oracle.com/mysql/heatwave/performance/

[2] https://www.singlestore.com/blog/tpc-benchmarking-results/

[3] https://docs.pingcap.com/tidb/v6.2/v5.4-performance-benchmar...

[4] https://www.monetdb.org/blogs/learning-from-benchmarking/