As someone who runs teams deploying BI tools to internal stakeholders (product, sales), I am pretty fed up with them.
Firstly, Tableau, QlikView, and Power BI all do pretty much the same thing, whichever flavour you prefer.
We find that maybe 10% of users will actually use them to garner new, actionable insights. The other 90% will moan that there's too much data for them to process, and no one appreciates the immense amount of data munging that goes on behind the scenes to make those pretty graphs.
Where do we go from here? Personally, I see a lot of the insights being automated and turned into actions on the server side, without anyone touching a BI tool. This can be achieved with a rules-based approach, plus perhaps some correlation and trend analysis over time-series data. Where you do need to go beyond that, perhaps AR/VR will provide a novel and more valuable approach.
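That rules-based, server-side approach can be surprisingly small. A hedged sketch assuming pandas, with an illustrative window and threshold (not anyone's production rules):

```python
import pandas as pd

def trend_alerts(series: pd.Series, window: int = 7, pct_change: float = 0.2) -> pd.Series:
    """Flag points that deviate more than pct_change from their rolling mean."""
    rolling = series.rolling(window, min_periods=window).mean()
    deviation = (series - rolling) / rolling
    return series[deviation.abs() > pct_change]

# Example: daily revenue with a sudden drop on the last day.
revenue = pd.Series(
    [100, 102, 98, 101, 99, 100, 103, 60],
    index=pd.date_range("2024-01-01", periods=8, freq="D"),
)
alerts = trend_alerts(revenue)  # only the drop to 60 is flagged
```

A scheduler can run a check like this and route the flagged rows straight to Slack or email, so no one has to open a dashboard to notice the drop.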
I see things going the same direction. I've always been pretty anti-dashboard because of the lack of long-term utility. If you want to make data-driven decisions, you need to spend the time codifying the decision making process and automating that. Automated actioning off data is far more impactful than automated visualization of data.
When it comes to business stakeholders, the biggest obstacle is trust. If you're not a data person or a developer, decision logic gated behind code is a scary black box. "If I can't control it, I can't trust it." That feeling gets even worse when we build systems that are controlled by AI or ML.
I think we need more solutions for data teams that allow business stakeholders to take part in the automated workflow deployment. Let them see how things are connected together. Let them verify that the decisions are being made correctly every day. Let them tweak levers so they have a say in how things are working. That's the only way to move beyond the current environment where everyone wants a dashboard, but no one looks at it.
That's a big problem I'm aiming to solve right now.
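One hedged sketch of what those "tweakable levers" could look like: the decision thresholds live in a plain config a business stakeholder can read and edit, while the code only enforces them. All names and rules here are invented for illustration:

```python
# Thresholds a non-developer can inspect and change (e.g. loaded from YAML/JSON);
# the values below are made up for the example.
LEVERS = {
    "discount_if_cart_over": 200.0,   # currency units
    "discount_rate": 0.10,
    "flag_order_if_over": 1000.0,
}

def decide(order_total: float, levers: dict = LEVERS) -> dict:
    """Apply the configured business rules to one order."""
    decision = {"discount": 0.0, "needs_review": False}
    if order_total > levers["discount_if_cart_over"]:
        decision["discount"] = round(order_total * levers["discount_rate"], 2)
    if order_total > levers["flag_order_if_over"]:
        decision["needs_review"] = True
    return decision
```

Because the levers are data rather than code, stakeholders can verify and adjust the rules without touching the black box itself.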
I've been doing a fair amount of BI work myself recently.
I reckon that, of the 90% who don't make any data-driven decisions with the dashboards they're provided, there are three causes I run into frequently:
1. They can't describe what info they need.
2. That info isn't available.
3. They don't make decisions based on data at all.
I feel I'm getting better at addressing 1 and 2 the more I do it, but I figure there's a fair number in group 3 who demand the reports nonetheless...
As an aside, since you mention Tableau, QlikView, and Power BI: have you tried Apache Superset at all?
We provide Power BI access to the data our app runs on, and it has been a mixed bag. The customers absolutely love it. But they build dashboards where hitting the refresh button can bring a super-expensive DB to its knees for half an hour.
So then we have to get our DB people in to rewrite all of their reports so they don't do this, which eliminates the benefit of having the customers do it.
In the end there is just no avoiding having experts build the dashboards.
I couldn't agree more! We built Lexio to address exactly that issue. (Our tagline is literally "Stop building dashboards that no one uses".) It does data storytelling (discovering the insights and conveying them in natural language) on the server and gives users a newsfeed experience for consuming them. I'm happy to talk shop if folks are interested in how it works. https://narrativescience.com/lexio
* Provide a clean minimal view of the data that other teams can pivot table on top of or download in the BI tool. We do onboarding, support and all that for this data.
* Provide dashboards and reports for more complicated ongoing questions that require data that's not cleaned yet.
We don't use Looker but based on the sales demo it seems optimized for this type of workflow. The core analytics team maintains the data and complex workflows while other teams have business analysts for day to day questions.
>perhaps AR/VR will provide a novel and more valuable approach
Yes. A lot of darts are being thrown as to where XR fits on which hype cycles, but one of the most promising uses to me is as a new end-user engagement layer for BI(g) data. The new UI/UX possibilities of immersive 3D change, at a low level, what information can be rendered in meaningful ways, and how. It's still early days in this space, but here is a quick list of some work in progress for anyone interested:
[1] https://flowimmersive.com/
Focused on data-driven storytelling through an in-house 3D visualization tool. Strong on AR collaboration, and responsible for "The Data Guy" popular on TikTok.
[2] https://www.badvr.com/
Industry-focused, using XR for specific applications like 5G radio-coverage heatmap modeling and smart-city interfacing tools.
[3] https://3data.io/
Use of XR for IT Operations Center applications (NOC/SOC)
[4] https://github.com/ACEMS/r2vr
This project lets you output basic WebXR 3D visualizations using R.
I work as a data engineer in a BI team, our product being part of a large software suite in healthcare. We have a small number of power users who really use the dashboards, get real results from them, and actively help our development. We also have a larger group of people who use it once in a while.
But most people seem to log in once or twice and then never again. And if you ask why, it's often because they couldn't find the answer to their incredibly specific, one-time-only question, so they decided it's all useless and went back to their own personal homegrown BI suite in Excel.
Also, we use Qlik and I absolutely hate it. Luckily I don’t have to work with it too often but whenever I do I feel like I’m always fighting it. Using Power BI and Tableau felt much more like working together to solve issues.
Would love to hear your feedback on https://hal9.ai -- we are building an open-source platform for data analysis based on reusable code blocks, plus a community that can build and monetize contributions. We are pretty early in our journey (launched our alpha and getting ready for our beta release), but would love to hear your thoughts. You can find me at javier at hal9.ai. Cheers!
On the flip side, though, I've had some success with self-service BI platforms like Metabase.
Allowing stakeholders some leeway to conduct their own independent analysis (after a short training session) has allowed our strained data team to hand over simpler analysis to focus on the harder problems.
I have been working as a consultant in the field for some time. The approach I've found works very well is to create a BI org that is driven by business needs, and to only keep data for what you actually need.
Less is more.
Amplitude has been incredible at my new workplace. It's fast, lots of pretty graphs are easily generated, and it's used by tonnes of different people throughout the company!
The ugly truth of BI tools is that they are of most benefit to the organizations which are least capable of using them.
The goal of properly exploiting BI comes with many prerequisites which sound superficially reasonable but turn out to be decade-long side-quests. Things like having all your data in one data model. Things like understanding where your data comes from, and exactly what it means.
These prerequisites are easy for small orgs, but small orgs benefit least from BI and typically get better bang-for-buck from Excel.
Large orgs find themselves mired in the political meta-problems of meeting those prerequisites, like joining up the fiefdoms that own data sources with the cabals that run data governance and the accountants who want a return on the investment of simplifying a sprawling legacy estate.
BI tools are generally incredibly poor at dealing with the bag-of-spanners heterogeneous data landscapes that exist in these organizations, and their analytics nirvana remains largely unattainable.
The trend that the article describes - towards simpler, composable BI components, each with more modest goals - is progress. It helps move focus away from the relatively-easy problem of visualization, to the rest of the data stack.
This is so true. I worked at a BI tooling startup and it was a lot of fun. But the big clients all had the same types of issues, which ultimately doomed their quests. At some point the CFO, CTO, or CEO notices they are spending more on "BI" than on marketing, and they start to get very curious about what they are getting for their money. It's never quite what they expect.
> BI tools are generally incredibly poor at dealing with the bag-of-spanners heterogeneous data landscapes
It's not that they are poor; it's that they eventually require some moderately challenging coding, because there is only so much complexity you can hide behind point-and-click before things become intractable. Programmers get bored doing BI plumbing, and those commercial ecosystems are quite secretive and often expensive to train in, so the talent pool is small and costs can skyrocket pretty quickly, making the whole proposition unappealing.
Maybe if we accepted that "BI programmer" is a respectable career and fundamentally unavoidable, and provided clear career paths inside companies for it, instead of trying so hard (and failing) to get rid of such figures as soon as they progress a bit, there would be more predictability and less angst in the space.
I’ve spent the last 10 years of my life building “Big Data” platforms for companies (Kafka, Druid, Hadoop, Databricks, Presto, Teradata, Informatica, MicroStrategy, etc etc) and I still have no idea what people need most of this crap for. There’s so much duplicative functionality, and a basically never ending list of OLAP adjacent products emerging every year. I mean it’s good for me personally, it’s pretty much a gravy train, but I always question if a lot of it is a solution searching for a problem lol
Interesting. To me, you seem to be conflating pieces of data-infra plumbing with very sensible, concrete goals (Kafka, Hadoop) with adjacent tools that may or may not make sense. Regardless of any tech details, the idea behind Kafka makes a lot of sense: https://engineering.linkedin.com/distributed-systems/log-wha...
Or did you mean, "I have no idea what business needs all this data for", a much broader question?
I wonder about this a lot. The IT department where I work is rolling out a new “data platform”. I’m in the finance unit and a prime customer but fairly up to date on the tech. This thing has soooo many logos on it. I suspect there is someone just saying yes to everything their consultant pitches.
Like, why have Databricks, Azure Synapse, AND Snowflake on the same picture? I'm certain I could do everything they say they're going to do with this shit with an integration tool, Snowflake, a machine to run Python on, and our BI tool in the same cloud.
Similar experience for me. Years ago I made a decision to jump into data engineering and reporting instead of pushing more into enterprise app development (meaning, Java/C# stuff).
I share the question about a solution searching for a problem. I will say that I continue to see efforts that seem focused on adding a bullet point to some executive's yearly accomplishments rather than providing value back to the org . . .
I've been working in the BI industry for almost 20 years and I don't know why the author believes that the "Original BI" includes ETL/storage. He mentions BusinessObjects and MicroStrategy as examples of traditional BI tools popular 20 years ago, but neither of them had ETL/storage capabilities. Both were basically visual SQL generators. (Although BusinessObjects later acquired an ETL company.) Qlik has a bit of everything: ETL, columnar storage, and dataviz. But that's more the exception than the rule.
Nevertheless, the author has a point: BI seems not to have a clear path forward. There are experiments with natural-language queries (won't work), automatic text-summary generation (useless), AI-assisted automatic insights (might be a nice feature, but barely a "product"), and a few more. The self-service story of BI never really took off (I believe self-service ETL is a more interesting story [1]). Neither did storytelling. Dashboards are oversold and are mainly used to impress CxOs to close big deals. Geospatial dataviz is useful but has limited application.
I would envision two possible directions to advance for the BI industry:
1) Analytical notebooks: something like Jupyter notebooks, but no-code and for BI, might be a good idea to explore. I haven't seen anything like that yet.
2) No-code/low-code analytical app building. Reporting and dashboarding tools are just highly specialized app builders. Why not take a step further and generalize them a bit?
[1] https://easymorph.com (Disclaimer: it's my company)
> 2) No-code/low-code analytical app building. Reporting and dashboarding tools are just highly specialized app builders. Why not take a step further and generalize them a bit?
Yes exactly! Dashboarding tools only get you halfway - there's no easy way to take action from within the dashboard itself. I see analytic app building as a cross between Tableau and Airplane/Retool
> I've been working in the BI industry for almost 20 years and I don't know why the author believes that the "Original BI" includes ETL/storage.
I've done some BI, but mostly DW/ETL, for over twenty years... And I can't remember ever seeing ETL viewed as a sub-category of BI in any project I've been part of. It hasn't even been mentioned as such -- except in weird articles like this -- this side of the turn of the century. Before that, yes, the categories were muddled. But not since then.
> 2) No-code/low-code analytical app building. Reporting and dashboarding tools are just highly specialized app builders. Why not take a step further and generalize them a bit?
Sure. I've long thought the best tool, one that could do both ETL and BI -- is an IDE with a full-blown programming language and a good library: Something like D̵e̵l̵p̵h̵i̵ Free Pascal / Lazarus.
And what it actually means is "visualizations and dashboards". Why make decisions based on analyzing the data from a report when you can get a vague impression from a pretty chart and go with that instead?
IMX, the core problem is that all BI software is built around two ideas: that everything is a financial value which can always be meaningfully aggregated in arbitrary ways, and that looking pretty is more important than presenting the data in a way your organization actually finds logical.
It's a way to sell very expensive software to decision-makers, while also producing software complex enough that it can't be configured well enough to properly evaluate until well after you've already bought it. Only then do you find that the feature set is very broad but very shallow. It's only 18 months later that you discover how limited the report writer is, or that you have to format everything this one way even when you really need it slightly different.
It's like using a pivot table in Excel and trying to control the order of the columns, or to make one table aggregate two values distinctly, etc. You end up with 10 seconds to pull your data and create the table, and 4 hours trying to get it to display in the manner you want before giving up.
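For comparison, pandas is one place where that pivot-table fight mostly goes away: you can mix aggregations in one table and fix the column order explicitly. A small sketch with made-up data:

```python
import pandas as pd

df = pd.DataFrame({
    "region": ["EU", "EU", "US", "US"],
    "product": ["A", "B", "A", "B"],
    "revenue": [100, 150, 200, 250],
    "orders": [10, 12, 20, 22],
})

# Two different aggregations in one pivot: sum one column, average the other.
table = df.pivot_table(
    index="region",
    values=["revenue", "orders"],
    aggfunc={"revenue": "sum", "orders": "mean"},
)[["revenue", "orders"]]  # explicit column order, no fighting the GUI
```

The trade-off, of course, is that you need someone comfortable writing these four lines in the first place, which is exactly the "experts build the dashboards" problem from upthread.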
The interesting point here is that the unbundling of BI is resulting in tools for every occasion. What I'm seeing on my end is that the "modern data stack" is starting to become very fragmented as a result.
Sure, each of these tools is better suited to accomplishing a very specific task with your data. But the problem is that now these tools aren't talking to each other. No one has a single place to view their data end-to-end. No one can show me every automated process in their organization that relies on a specific table. Everything is tied together with trust. Trust that the data engineering team won't deploy bad code. Trust that the API won't change how it returns data. Trust that the data actually loaded completely.
This trust-based system means that one problem during ingestion/cleaning ends up spelling disaster for all of the downstream ML models, reports, data extracts, etc., and causes a day's worth of headaches to resolve.
If BI used to hold everything together under one roof, we need something to make these tools talk to each other. The only solution, as I see it, is to build out better data orchestration that effectively glues these tools together and lets you see the "big picture" with your data.
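As a toy illustration of that "big picture" view, here is a hedged sketch of a dependency graph in which each asset declares its upstreams, so you can answer "what does this table feed downstream?" before a bad load propagates. All asset names are invented:

```python
from collections import defaultdict

# Each asset declares its upstream dependencies (lineage metadata).
DEPS = {
    "raw.orders": [],
    "clean.orders": ["raw.orders"],
    "ml.churn_model": ["clean.orders"],
    "report.weekly_sales": ["clean.orders"],
}

def downstream_of(asset: str, deps: dict = DEPS) -> set:
    """Everything that transitively depends on `asset`."""
    children = defaultdict(list)
    for node, parents in deps.items():
        for parent in parents:
            children[parent].append(node)
    out, stack = set(), [asset]
    while stack:
        for child in children[stack.pop()]:
            if child not in out:
                out.add(child)
                stack.append(child)
    return out
```

Real orchestrators keep a graph like this anyway; the point is exposing it, so that when `raw.orders` fails to load, you know immediately which models and reports are now stale.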
I remember learning decades ago that "in UNIX, everything is a file".
I approach BI now as "In BI, everything is in your database".
With everything in the database (SQL Server in my case) I can deliver BI to my users with quality tools like Tableau and Power BI - or just with generic web reports.
It really is amazing how far you can get by just dumping everything into a single database and letting people do joins. We’re doing the same thing with Stitch replicating data from production databases and a few SaaS products into a big Postgres instance and it’s so low on maintenance hassle. The most advanced our “BI pipeline” gets is a few stored procedures run from a cron job to aggregate data in a particularly large table.
I do believe we have seen solutions trying to be horizontal answers to these problems, and I think the evolution will be toward vertical solutions. I have also invested a lot of time here: we are a couple of people who have been building a vertical BI tool for the SaaS space for about a year.
The main goals are:
- Easily accessible for novice users. We want to build the Google of BI to help empower non-technical people (no more salespeople asking devs for reports)
- More advanced editors for the more technical people
- Advanced alerts + integrations to 3rd parties
- Later on, proactive reports
Hit me up on @philipanderse if you want to test it out.
I apologise in advance for probably using poor terminology...
Maybe I am a naive fool, but IMHO the benefit of BI is reconciliation of the business model:
- reconciling processes
- reconciling data
- reconciling governance
It's a painful process to implement, and you will hit all of your compliance, governance, and change-management issues head on.
IMHO, if you accept that this is what you are doing, then it's potentially very valuable for your business. But it must be done with humility, as every single measure, and many of the dimensions, are each a project of their own.
The pretty graphs are largely irrelevant, apart from making employees feel valued and reinforcing abstractions via visual metaphors.
I get pissed off when end users are not trained in consuming the data warehouse through tools like Excel or Access. For most companies, such a project should include a general IT education programme.
Also: if the end users cannot use a pivot table, then you need to teach them that, and prototype a bunch of stuff in Excel, before you go anywhere near BI of any flavour.
A lot of conventional BI was focused on "single player" mode: basically an analyst running analysis at their computer and maybe dropping a graph into a presentation. This fundamentally limited the usefulness of BI results (they got stuck in time and rarely updated) and made follow-ups hard, since the graph was no longer connected to the data source.
All the while things like Google Docs and Slack made collaboration around documents and ideas much easier with @mentions, threads, etc.
So BI can be a lot more useful if it is:
1. Accessible anywhere (not just on desktop or via a crappy mobile app)
2. Collaborative, bringing mentions, shared questions, and the ability to make decisions together
3. Not crazy expensive (looking at you, Tableau and Looker). At these high price points the tool doesn't get offered to everybody; access ends up limited, and hence the tool is less useful across a company
Full disclosure, I built Zing Data which is an app for mobile first business intelligence and is free for small teams. Works with PostgreSQL, MySQL, and Google BigQuery. https://www.getzingdata.com/
Would love any feedback folks have -- we're actively improving it and I'm sure this community has a lot of great ideas!
Data is clay. We are still struggling to make bricks. We are still very far from building a house out of it, much less a large building. And we're going to eventually find out that we can't make skyscrapers out of it at all.
Just like bricklayers haven't even remotely been replaced with robots, I doubt the data-mungers will be, either. We are probably on the cusp of a whole new generation of "data people" who will have careers that span a generation, and do nothing but sift through data.
In my experience BI has no business being done by technical people. You need business people who are just technical enough to use the system for what they need. You need domain knowledge and deep product understanding to know what data to be looking for, not a stats and CS whiz. In 99% of cases, SQL backed by a fast distributed analytics database and Tableau+Excel is enough for that.
Even as a software engineer doing basic analytics charts for the service I own I don't want to be writing JavaScript. Ideally I wouldn't be writing SQL either.
A little Python training in the Python stack (pandas/NumPy/matplotlib or other visualization libraries) can go a long way toward simplifying the tech stack and getting rid of these mind-numbing BI tools.
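As a hedged example of the kind of one-screen pandas analysis being described (the data and names are invented; in practice you'd point read_csv at a real export):

```python
import pandas as pd

# In practice: df = pd.read_csv("export.csv")
df = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb"],
    "team": ["sales", "support", "sales", "support"],
    "tickets": [30, 45, 28, 50],
})

# The dashboard's "tickets per month" widget, in one line.
summary = df.groupby("month", sort=False)["tickets"].sum()
# One more line gets the chart a dashboard would show:
# summary.plot(kind="bar")  # requires matplotlib
```

Nothing here replaces governance or data modeling, but for the one-off questions that dominate upthread, this is often faster than licensing and learning a BI tool.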
But I also see companies spending more on these tools, in the name of innovation, because some bigshot likes tool X and that's what he wants to use. And guess what: now you also need it to be made available in the cloud.
[+] [-] infinite8s|4 years ago|reply
Yes exactly! Dashboarding tools only get you halfway - there's no easy way to take action from within the dashboard itself. I see analytic app building as a cross between Tableau and Airplane/Retool
[+] [-] CRConrad|4 years ago|reply
I've done some BI, but mostly DW/ETL, for over twenty years... And I can't remember ever seeing ETL viewed as a sub-category of BI in any project I've been part of. It hasn't even been mentioned as such -- except in weird articles like this -- this side of the turn of the century. Before that, yes, the categories were muddled. But not since then.
> 2) No-code/lo-code analytical app building. Reporting and dashboarding tools are just highly specialized app builders. Why not make a step further and generalize it a bit?
Sure. I've long thought the best tool, one that could do both ETL and BI -- is an IDE with a full-blown programming language and a good library: Something like D̵e̵l̵p̵h̵i̵ Free Pascal / Lazarus.
[+] [-] drfuchs|4 years ago|reply
[+] [-] da_chicken|4 years ago|reply
IMX, the core problem is that all BI software is built around the idea that everything is a financial value that can always be arbitrarily aggregated meaningfully, and that looking pretty is more important than presenting the data in a way that your organization actually finds logical.
It's a way to sell very expensive software to decision-makers, while also producing software complex enough that it can't be configured well enough to properly evaluate until well after you've already bought it. Only then do you find that the feature set is very broad, but very shallow. It's only 18 months later that you discover how limited the report writer is, or that you have to do everything one fixed way even when you really need it formatted slightly differently.
It's like using a pivot table in Excel and trying to control the order of the columns, or to make one table aggregate two values distinctly, etc. You spend 10 seconds pulling your data and creating the table, then 4 hours trying to get it to display the way you want before giving up.
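For contrast, the same pivot done in code gives you full control over both distinct aggregations and column order. A minimal pandas sketch, with invented sales data:

```python
import pandas as pd

# Hypothetical data for illustration; column names are made up.
df = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [100, 120, 90, 110],
    "units": [10, 12, 9, 11],
})

# Two different aggregations in one table -- the thing GUI pivot
# builders often make awkward: sum one value, average another.
pivot = pd.pivot_table(
    df,
    index="region",
    columns="quarter",
    values=["revenue", "units"],
    aggfunc={"revenue": "sum", "units": "mean"},
)

# Explicit column ordering: put Q2 before Q1 within each measure.
pivot = pivot[[("revenue", "Q2"), ("revenue", "Q1"),
               ("units", "Q2"), ("units", "Q1")]]
print(pivot)
```

In code the column order is just a list you write down, instead of a fight with a drag-and-drop UI.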
[+] [-] majormajor|4 years ago|reply
(Though even in the target audience, I'm not taking much away from this. Tools should be better. Yep.)
[+] [-] blakeburch|4 years ago|reply
Sure, each of these tools is better suited to accomplishing a very specific task with your data. But the problem is that now these tools aren't talking to each other. No one has a single place to view their data end-to-end. No one can show me every automated process in their organization that relies on a specific table. Everything is tied together with trust. Trust that the Data Engineering team won't deploy bad code. Trust that the API won't change how it returns data. Trust that the data actually loaded completely.
This trust-based system means that one problem during ingestion/cleaning ends up spelling disaster for all of the downstream ML models, reports, data extracts, etc. and causes a day worth of headaches to resolve.
If BI used to hold everything together under one roof, we need something to make these tools talk to each other. The only solution, as I see it, is to build out better data orchestration that effectively glues these tools together and lets you see the "big picture" with your data.
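That "big picture" is essentially a lineage graph: datasets as nodes, edges pointing at whatever is built from them. A minimal sketch of the idea, with invented dataset names, that answers "what breaks downstream if this table loads badly?":

```python
from collections import deque

# Hypothetical lineage: each dataset maps to the assets built from it.
# All names here are invented for illustration.
lineage = {
    "raw.orders": ["clean.orders"],
    "clean.orders": ["report.revenue", "ml.churn_model"],
    "ml.churn_model": ["report.churn"],
}

def downstream(table, edges):
    """Return every asset that transitively depends on `table` (BFS)."""
    seen, queue = set(), deque(edges.get(table, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(edges.get(node, []))
    return seen

# One bad load of raw.orders breaks all of these:
print(downstream("raw.orders", lineage))
```

Real orchestrators would derive these edges from the pipeline definitions rather than a hand-written dict, but the query is the same.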
[+] [-] oxfordmale|4 years ago|reply
I don't think BI analysts have to get worried quite yet.
[+] [-] PhilipA|4 years ago|reply
The main goals are:
- Easily accessible for novice users. We want to build the Google of BI to empower non-technical people (no more salespeople asking devs for reports)
- More advanced editors for the more technical people
- Advanced alerts + integrations to 3rd parties
- Later on, proactive reports
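At its simplest, the "advanced alerts" item above is a rule evaluated against a metric value. A minimal sketch; the `AlertRule` class and the `daily_signups` metric are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class AlertRule:
    """A threshold rule a non-technical stakeholder could configure."""
    metric: str
    threshold: float
    direction: str  # "above" or "below"

    def fires(self, value: float) -> bool:
        # True when the observed value crosses the configured threshold.
        if self.direction == "above":
            return value > self.threshold
        return value < self.threshold

# Alert when daily signups drop under 50.
rule = AlertRule(metric="daily_signups", threshold=50, direction="below")
print(rule.fires(42))
```

The 3rd-party integration part is then just "when `fires()` is true, post to Slack/email/etc." on the server side.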
Hit me up on @philipanderse if you want to test it out.
[+] [-] goodlinks|4 years ago|reply
Maybe I am a naive fool but IMHO the benefits of BI are reconciliation of the business model:
- reconciling processes
- reconciling data
- reconciling governance
It's a painful process to implement, and you will hit all of your compliance, governance and change management issues head on.
IMHO if you accept that this is what you are doing then it's potentially very valuable for your business, but it must be done with humility, as every single measure and many dimensions are each a project of their own.
The pretty graphs are largely irrelevant, apart from making employees feel valued and reinforcing abstractions via visual metaphors.
I get pissed off when end users are not trained in consuming the data warehouse through tools like Excel or Access. A BI project should include a general IT education programme for most companies.
Also... if the end users cannot use a pivot table, then you need to teach them that, and prototype a bunch of stuff in Excel before you go anywhere near BI of any flavour.
[+] [-] zhendlin|4 years ago|reply
All the while things like Google Docs and Slack made collaboration around documents and ideas much easier with @mentions, threads, etc.
So BI can be a lot more useful if it is (1) accessible anywhere (not just desktop or a crappy mobile app), (2) collaborative -- bringing the mentions, shared questions, and the ability to make decisions together, and (3) not crazy expensive (looking at you, Tableau + Looker)... because at these high price points the tool doesn't get offered to everybody; access ends up more limited, and hence the tool is less useful across a company.
Full disclosure, I built Zing Data which is an app for mobile first business intelligence and is free for small teams. Works with PostgreSQL, MySQL, and Google BigQuery. https://www.getzingdata.com/
Would love any feedback folks have -- we're actively improving it and I'm sure this community has a lot of great ideas!
[+] [-] 0xbadcafebee|4 years ago|reply
Just like bricklayers haven't even remotely been replaced with robots, I doubt the data-mungers will be, either. We are probably on the cusp of a whole new generation of "data people" who will have careers that span a generation, and do nothing but sift through data.
[+] [-] d--b|4 years ago|reply
Because I believe that BI should be done by people who have a mixed set of skills between stats and programming.
A programmer alone can't do the stats, and a mathematician alone can't build a reporting tool.
Jig is a shortcut to build reports quickly, but I am somehow convinced that JavaScript is just not good enough for that job.
I think I might have to try and design a language that has indexed containers as first class objects…
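As a sketch of what "indexed containers as first-class objects" buys you, pandas Series already behave this way: values are keyed by label, and arithmetic aligns on the index instead of on position. Example data is invented:

```python
import pandas as pd

# Two "indexed containers": fruit counts keyed by label, not position.
q1 = pd.Series({"apples": 10, "pears": 4})
q2 = pd.Series({"pears": 6, "apples": 12, "plums": 3})

# Subtraction aligns on the index automatically; a label missing on
# one side becomes NaN instead of silently pairing wrong positions.
growth = q2 - q1
print(growth)
```

A language where this alignment is built into the container type, rather than bolted on via a library, is roughly the design space the comment is pointing at.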
[+] [-] b9a2cab5|4 years ago|reply
Even as a software engineer doing basic analytics charts for the service I own I don't want to be writing JavaScript. Ideally I wouldn't be writing SQL either.
[+] [-] deshpand|4 years ago|reply
And companies are trying. Ex: https://www.bobsguide.com/articles/barclays-gordon-risk-mana...
But I also see companies spending more on these tools, in the name of innovation, because some bigshot likes tool X and that's what he wants to use. And guess what, now you also need it to be made available in the cloud.
[+] [-] nivenkos|4 years ago|reply
Tableau is a nightmare (no Linux support for a start, never mind the hassle of editing each dashboard individually).
[+] [-] 58x14|4 years ago|reply
I wonder how long this took to write.
[+] [-] MangoCoffee|4 years ago|reply
We use Microsoft PowerBI. It works great within the Microsoft ecosystem, but if you try to pull data from NetSuite with PowerBI, it goes to crap.
[+] [-] alach11|4 years ago|reply
If you and I sit in a room and race to analyze a few CSVs of data, I bet I can find compelling trends faster using a BI tool.
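For what it's worth, the code-first side of that race isn't hopeless either: a few lines of pandas can surface a simple trend from a CSV. The data below is invented to stand in for "a few CSVs":

```python
import io
import pandas as pd

# A tiny stand-in for one of the CSVs (contents invented).
csv = io.StringIO(
    "month,region,sales\n"
    "1,East,100\n2,East,130\n3,East,170\n"
    "1,West,200\n2,West,190\n3,West,185\n"
)
df = pd.read_csv(csv)

# Average month-over-month change per region:
# a positive number means the region is growing.
trend = df.groupby("region")["sales"].apply(lambda s: s.diff().mean())
print(trend)
```

Whether this beats drag-and-drop in a BI tool mostly comes down to which workflow the analyst already has muscle memory for.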