This is really nice, especially the PDF report generation.
I feel very moronic making a dashboard for any product now. Enterprise customers prefer that you integrate into their ERPs anyway.
I think we lost the plot as an industry. I've always advocated for having a read-only database connection available for your customers to make their own visualisations. This should have been the standard 10 years ago, and its case is only stronger in this age of LLMs.
We get so involved with our products that we forget our customers are humans too. Nobody wants another account to manage or remember. Analytics and alerts should be push-based: configurable reports should be auto-generated and sent to your inbox, alerts should be pushed via notifications or emails, and customers should have the option to build their own dashboards with something like this.
Sane defaults make sense but location matters just as much.
> I've always advocated for having a read-only database connection available for your customers to make their own visualisations.
Roughly three decades ago, that *was* the norm. One of the more popular tools for achieving that was Crystal Reports[1].
In the late 90s, it was almost routine for software vendors to bundle Crystal Reports with their software (very similar to how the MSSQL installer gets invoked by products), then configure an ODBC data source which connected to the appropriate database.
In my opinion, the primary stumbling block of this approach was the lack of a shared SQL query repository. So if you weren’t intimately familiar with the data model you wanted to work with, you’d lose hours trying to figure it out on your own, or rely on your colleagues sharing queries via sneakernet or email.
Crystal Reports has since been acquired by SAP, and I haven’t touched it since the early ‘00s so I don’t know what it looks or functions like today.
1: https://en.wikipedia.org/wiki/Crystal_Reports
100% agreed regarding shipping a read replica for any sufficiently complex enterprise app (ERP, CRM, accounting, etc.).
Customers need it to build custom reports, archive data into a warehouse, drive downstream systems (notifications, audits, compliance), and answer edge-case questions you didn’t anticipate.
Because of that, I generally prefer these patterns over a half-baked built-in analytics UI or an opinionated REST API:
- Provide a read replica or CDC stream. Let sophisticated customers handle authz, modelling, and queries themselves. (This gets harder with multi-tenant DBs.)
- Optionally offer a hosted Data API, using something like PostgREST, Hasura, or Microsoft DAB. You handle permissions and safety, but stay largely unopinionated about access patterns.
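For the read-replica pattern, the access control side can be as small as one dedicated read-only role. A minimal PostgreSQL sketch; the database, schema, and role names (`appdb`, `app`, `reporting_ro`) are made up for illustration:

```sql
-- Illustrative names only: database "appdb", schema "app", role "reporting_ro".
CREATE ROLE reporting_ro LOGIN PASSWORD 'change-me';
GRANT CONNECT ON DATABASE appdb TO reporting_ro;
GRANT USAGE ON SCHEMA app TO reporting_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA app TO reporting_ro;
-- Cover tables created later, and force read-only even if a write grant slips through.
ALTER DEFAULT PRIVILEGES IN SCHEMA app GRANT SELECT ON TABLES TO reporting_ro;
ALTER ROLE reporting_ro SET default_transaction_read_only = on;
```

Point this role at the replica's connection string and customers can bring whatever BI tool they like.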
Any built-in metrics or analytics layer will always miss edge cases.
With AI agents becoming first-class consumers of enterprise data, direct read access is going to be non-negotiable.
Also, I predict the days of charging customers to access their own goddamn data behind rate-limited, metered REST APIs are behind us.
1999-2000, the company I worked at gave a smallish number of key users full read access to SAP (minus HR), shortly after introducing SAP to that company's global supply chain. The key users came from all orgs using SAP; basically every department had one or two key users.
I was part of this and "saw the light". We had such great visibility into all the processes, it was unreal. It tremendously sped up cross-org initiatives.
Today, I guess, only agents get that privilege.
Hi, dev building Shaper here. I agree re sending reports vs dashboards.
Many users use Shaper mostly as a UI to filter data and then download a PDF, PNG, or CSV file to use elsewhere.
We are also currently working on functionality to send those files out directly as messages using Shaper's task feature.
I get your point, but generally with most enterprise-scale apps you really don’t want your transactional DB doubling as your data warehouse. The “push-based” operation should be limited to moving data from your tx environment to your analytical one.
Of course, if the “analytics” are limited to simple static reports, then a data warehouse is overkill.
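As a toy illustration of that push from the transactional environment to the analytical one, here is a watermark-based incremental copy, sketched with Python's stdlib sqlite3 standing in for both databases. Every table and column name here (`orders`, `updated_at`, etc.) is invented for the example, not taken from any real product:

```python
import sqlite3

# Two separate databases: "tx" is the transactional store, "olap" the analytical copy.
tx = sqlite3.connect(":memory:")
olap = sqlite3.connect(":memory:")

tx.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at INTEGER)")
tx.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, 10.0, 100), (2, 20.0, 105), (3, 30.0, 110)])

olap.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at INTEGER)")

def sync(since):
    """Push rows updated after `since` from tx to olap; return the new watermark."""
    rows = tx.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?", (since,)
    ).fetchall()
    olap.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    return max((r[2] for r in rows), default=since)

wm = sync(0)   # initial load: all three rows move
tx.execute("UPDATE orders SET amount = 25.0, updated_at = 120 WHERE id = 2")
wm = sync(wm)  # incremental run: only the changed row moves
print(olap.execute("SELECT amount FROM orders WHERE id = 2").fetchone())  # (25.0,)
```

A real setup would use a replication slot or CDC stream instead of a watermark column, but the shape of the flow is the same: the transactional side pushes, the analytical side absorbs.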
Customers don’t want to learn your schema or deal with your clever optimizations either. If you expose a DB, make sure you abstract everything away behind views and treat them like a versioned API.
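A small sketch of the views-as-versioned-API idea, using Python's stdlib sqlite3 as a stand-in; the table, column, and view names are invented for illustration. The internal table can be refactored freely as long as the `reporting_v1_orders` contract holds:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Internal schema with "clever optimizations" customers should never see.
con.execute("CREATE TABLE ord_t (oid INTEGER, cust_k TEXT, amt_cents INTEGER, st INTEGER)")
con.execute("INSERT INTO ord_t VALUES (1, 'acme', 1999, 2), (2, 'acme', 500, 1)")

# The versioned view is the stable "API" customers actually query:
# readable names, real units, decoded status codes.
con.execute("""
CREATE VIEW reporting_v1_orders AS
SELECT oid               AS order_id,
       cust_k            AS customer,
       amt_cents / 100.0 AS amount,
       CASE st WHEN 1 THEN 'pending' WHEN 2 THEN 'shipped' END AS status
FROM ord_t
""")

rows = con.execute(
    "SELECT order_id, customer, amount, status FROM reporting_v1_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 'acme', 19.99, 'shipped'), (2, 'acme', 5.0, 'pending')]
```

When the internal schema changes incompatibly, you ship `reporting_v2_*` views and deprecate v1 on a schedule, exactly like an API version.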
> I've always advocated for having a read-only database connection available for your customers to make their own visualisations.
A layer on top of the database to account for auth etc. would be necessary anyway. It could be achieved to some degree with views, but I'd prefer an approach where you choose the publicly available data explicitly.
GraphQL almost delivered on that dream. Something more opinionated would've been much better, though.
We've (https://www.definite.app/) replaced quite a few Metabase accounts now, and we have a built-in lakehouse using DuckDB + DuckLake, so I feel comfortable calling us a "DuckDB-based Metabase alternative".
When I see the title here, I think "BI with an embedded database", which is what we're building at Definite. A lot of people want dashboards / AI analysis without buying Snowflake, Fivetran, BI and stitching them all together.
Hi, dev building Shaper here. Both Shaper and Metabase can be used to build dashboards for business-intelligence functionality and embedded analytics, but the use cases are different: Metabase is feature-rich, with lots of self-serve functionality that lets non-technical users easily build their own dashboards and drill down as they please. With Shaper you define everything as code in SQL. It's much more minimal in terms of what you can configure, but if you like the SQL-based approach it can be pretty productive to treat dashboards as code.
Nice work! I met Jorin a couple years ago at a tech meetup and this was just an idea at the time. So cool to see the consistent progress and updates and to see this come across HN.
Is there any way to run the query -> report generation standalone, in process? Like maybe just outputting the HTML (or using the React components in a project).
I was looking to add similar report generation to a VS Code extension I've been building[0].
[0]: https://github.com/ChuckJonas/duckdb-vscode
I use it daily and it never crashed. How long ago was this?
I am a big fan of DuckDB. Plowing through hundreds of GB of logs on a 5-year-old Linux laptop - no problem.
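For anyone who hasn't tried it, that kind of log crunching is a few lines of SQL. A hedged sketch, assuming gzipped JSON-lines logs under a logs/ directory with `status` and `latency_ms` fields (the file layout and field names are made up):

```sql
-- DuckDB globs and decompresses the files on the fly,
-- streaming rather than loading everything into memory.
SELECT status,
       count(*) AS hits,
       round(avg(latency_ms), 1) AS avg_ms
FROM read_json_auto('logs/*.json.gz')
GROUP BY status
ORDER BY hits DESC;
```

The same pattern works for CSV via `read_csv_auto` and for Parquet via `read_parquet`.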
Hi, dev building Shaper here. Shaper allows you to visualize data and build dashboards just by writing SQL. The SQL runs in DuckDB, so you can use all DuckDB features. It's for when you are looking for a minimal tool that lets you just work in code. You can use Shaper to build dashboards that you share internally, or customer-facing dashboards that you want to embed into another application.
Comparing the two:
- SQLPage (https://github.com/sqlpage/SQLPage): more focused on UI building; doesn't use DuckDB
- Shaper: more focused on analytics/dashboards, with features like PDF generation; uses DuckDB