I use OpenOffice Calc, not Excel, but same basic idea. In my case, I use it for a couple of things.
1. A file viewer for certain kinds of data (eg, CSV files and the like).
2. A quick and dirty way to get simple summary stats from some data. Spreadsheets make it pretty easy to point at some data and get things like the mean, median, mode, std deviation, etc.
3. Quick and dirty plotting of small amounts of data. This use isn't as frequent for me, as I tend to use GNUPlot for most of this kind of stuff, but sometimes if you're already in the spreadsheet program poking at some data, you want a chart with a trend-line or something.
4. Exploratory models to play with "what if" analysis. Useful for a first pass at coming up with financial projections, evaluating pricing models, etc. The nice thing about a spreadsheet for this is that you write up all the various formulas you care about, and then you can update one cell and quickly see all the updated values. In a sense, spreadsheets are an implementation of a sort of Dataflow programming[1], and this kind of thing can be very handy.
It's good for getting answers to questions like "If I want this company to generate $1MM in revenue this year, and we set the price for our product to z, how many customers do we have to sign up?" and others of that nature. Of course there are plenty of other ways to do this kind of analysis, but spreadsheets are pretty convenient here.
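That kind of what-if model can be sketched in a few lines of Python (the price points below are made-up placeholders): change an input, recompute, and every derived value updates, which is the same dataflow idea as the spreadsheet.

```python
# A minimal "what if" sketch of the revenue question above.
# All numbers here are hypothetical placeholders.

def customers_needed(revenue_target, price):
    """How many customers must sign up to hit the revenue target?"""
    return revenue_target / price

# Change any input and recompute, like editing one cell in a spreadsheet.
target = 1_000_000  # $1MM annual revenue goal
for price in (49, 99, 199):
    print(f"price ${price}: {customers_needed(target, price):,.0f} customers")
```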
I spent the last year building a predictive tennis model using Google Sheets, which is basically the same as Excel. I have no technical skills, so this was the easiest way to do it, plus it helped me learn SQL. Basically, I have around 50,000 tennis matches in one sheet, filter them into another sheet to get around 100 comparative data points, and in the end produce a probability for each upcoming match and bet accordingly. I also use it to produce a little head-to-head stats visual and automated 200-word match previews.
It all works perfectly, but obviously just for one user at a time. So I brought on a dev friend a few months ago and will launch a user-facing site next week, as well as a B2B API for match previews / SEO-rich content.
I'm actually looking for seed funding up to around 25k, if anyone wants to reach out about that, or has other questions.
If you just want to follow some of the tips, I'm running a very basic blog at www.tennisacca.com sharing my most highly recommended picks (rather than listing hundreds of games a week and needing a huge bankroll), which win at an ROI of 10%+ on average. It's mostly just for friends until the main site launches next week.
Hi! I did something similar a few years ago, except I used Python and machine learning algorithms plus custom Elo implementations for tennis matches. Unfortunately it worked too well, and I was banned from all the bookmakers.
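For anyone curious, here is a textbook Elo sketch in Python (not the commenter's custom variant): the expected win probability from a rating gap, plus the standard post-match update.

```python
# Standard Elo rating formulas: win probability from a rating gap,
# and a simple k-factor update after one match.

def elo_win_prob(ra, rb):
    """Probability that player A (rating ra) beats player B (rating rb)."""
    return 1 / (1 + 10 ** ((rb - ra) / 400))

def elo_update(ra, rb, a_won, k=32):
    """Return the updated (ra, rb) ratings after one match."""
    pa = elo_win_prob(ra, rb)
    ra += k * ((1 if a_won else 0) - pa)
    rb += k * ((0 if a_won else 1) - (1 - pa))
    return ra, rb

# Equal ratings: a coin flip before the match, then the winner gains points.
print(elo_win_prob(1500, 1500))      # 0.5
print(elo_update(1500, 1500, True))  # (1516.0, 1484.0)
```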
I haven't used Excel for 10 years. I use LibreOffice Calc.
I have used it to chart data extracted from syslog, journald, and dmesg, for example when troubleshooting a Wi-Fi subsystem or doing boot-time optimization work.
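A rough Python sketch of that extraction step, assuming the default dmesg timestamp format: pull out (seconds, message) pairs ready to paste into Calc or feed to a plotting tool.

```python
import re

# Hypothetical sketch: parse dmesg-style lines ("[    2.345678] msg")
# into (seconds, message) pairs for charting boot-time behavior.
DMESG_LINE = re.compile(r"^\[\s*(\d+\.\d+)\]\s*(.*)")

def parse_dmesg(lines):
    rows = []
    for line in lines:
        m = DMESG_LINE.match(line)
        if m:
            rows.append((float(m.group(1)), m.group(2)))
    return rows

sample = [
    "[    0.000000] Linux version 5.x",
    "[    2.345678] wlan0: authenticated",
]
# Tab-separated output pastes straight into a spreadsheet.
for ts, msg in parse_dmesg(sample):
    print(f"{ts:.6f}\t{msg}")
```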
And of course all kinds of project management type stuff. My job forces me to do that in Google Sheets these days. Action points, bug lists, etc. are easily maintained in a spreadsheet. In a way it's a bit of a misuse of the tool, but as long as they are short and simple lists with tens rather than hundreds of entries, it can be the most efficient solution.
The main drawback is that version history is a pain. If something goes wrong (wrong formula, wrong editing) it can be impossible to understand what happened and fix the problem afterwards.
Financial modeling. I work as a corporate finance consultant and am a former investment banker, and it's an extremely valuable tool in my opinion. I think people are crazy for trying to run a company without making heavy use of a financial model.
Actually, I'd love to talk with you about this. I find that writing my own tools gives me what I need but takes me longer. I wrote my own ncurses spreadsheet that lets me add lambdas to be executed in each cell (if the cell defines one). I could use some advice on modeling itself too...
I had a commercial real estate appraiser as a client once. She had a massive Excel workbook for the calculations she and her employees used. Originally it was set up so they input everything by hand, used a calculator to figure everything out, typed the answers back into Excel, then copied and pasted everything manually into a report template.
I made it so that after inputting fewer than ten variables, it automatically calculated everything they needed, generated several charts and graphs, and produced a fully completed report at the press of a button. I turned a two- or three-day job into a ten-minute one. Needless to say, my client was fairly excited.
Most of the Excel files I use are automatically generated as a build pipeline/Jenkins step (except 2 and 4):
1. Accumulating results of performance tests over time. We add columns with the results of each nightly regression and keep additional columns like std/worst/avg for the last 10, last 100, etc.
2. Keeping indexes with the locations of our data and their metadata, so we can quickly find data that matches specific problems we need to solve.
3. Specification files for our performance tests. Listing KPIs for specific cases, KPIs per type of machine, how to measure (bst5, avg5, single), tracking what should show up as red/green/orange in our dashboards.
4. Running FMEA using an Excel FMEA template that automatically fills/calculates certain columns based on answers in the others.
5. All sorts of automatically generated regression overviews for our unit tests over periods of time, showing trends, spec violations, etc.
6. Sheets that are connected to our TFS so we can have overviews and do mass updates on items.
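The rolling columns from item 1 can be sketched in a few lines of Python (assuming "worst" means the slowest, i.e. maximum, run time):

```python
import statistics

# Sketch of rolling std / worst / avg over the last n nightly results.
# "worst" is assumed to mean the maximum (slowest) run time.

def rolling_stats(results, n):
    window = results[-n:]
    return {
        "avg": statistics.mean(window),
        "std": statistics.pstdev(window),
        "worst": max(window),
    }

nightly = [102, 99, 101, 100, 140, 98]  # made-up run times in ms
print(rolling_stats(nightly, 5))
```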
Any time I have some repetitive numerical calculation or procedure I am going to want to do multiple times (whether because it's a common task, or I want to try many different inputs in a row), I quickly throw an Excel sheet together for it.
In one place where I worked, someone had programmed financial models in Excel. It used to run for a long time, crash often, etc. But it was good enough for his team, and they were still using it when I left. That one single Excel file was worth a lot.
This is going to be a boring answer, but I use it to fill in data for SQL inserts. It's also nice for repetitive backend server commands: since Excel places a line break after each row, copying and pasting into a terminal is convenient.
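A rough Python equivalent of that spreadsheet trick, with made-up table and column names (for real data, parameterized queries beat string building):

```python
# One generated INSERT per row, newline-separated so the whole batch
# pastes cleanly into a terminal or SQL console.
# Table/column names are hypothetical; real applications should use
# parameterized queries rather than string concatenation.

rows = [("alice", 30), ("bob", 25)]

stmts = "\n".join(
    f"INSERT INTO users (name, age) VALUES ('{name}', {age});"
    for name, age in rows
)
print(stmts)
```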
My previous company used a set of Excel files to create construction tenders. They included a lot of formulas and VBA to calculate quantities, costs, labor hours, etc.
I had to set up dedicated backup jobs for those templates.
Not me personally, but the amount of complex workflow that I've seen relying on tortured Excel and VBA is mind-boggling.
In the hands of someone who knows how to use it, Excel is incredibly powerful. Though at some point it crosses the threshold of "why don't you just get a junior developer to code this?" And the answer is always "we are technological dinosaurs, and we think that any development project costs at least $200k."
I see! In your opinion, if I am trying to track various parts of my life, like a schedule and to-do items, should I just code it myself as a programmer instead of using Excel?
I did the same for a while, usually over a short time horizon. Then I tried GnuCash (no undo) and hledger, which nowadays has some features for planning ahead. But in the end I decided to write my own web app for my personal finances and long-term budgeting.
If your excel files have anything for forward planning and you want to share those files, I'd be interested.
[1]: https://en.wikipedia.org/wiki/Dataflow_programming
- Plan for miles run every day of the week, side-by-side with what I actually ran, notes about pace, weather, and other factors
- Sum of total weekly mileage done vs. planned, amount left to meet my goals
- Number of weeks left until race
All in one "dashboard." It's nothing fancy, but the date and sum functions made it trivial to build a template for this exactly how I wanted it.
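A toy Python version of that dashboard, with hypothetical mileage numbers and race date:

```python
from datetime import date

# Toy "dashboard": planned vs actual weekly mileage and weeks left
# until a hypothetical race date. All numbers are made up.

planned = [5, 3, 0, 6, 3, 0, 10]   # miles per day, Mon-Sun
actual  = [5, 4, 0, 5, 3, 0, 8]

weekly_planned = sum(planned)
weekly_actual = sum(actual)
remaining = weekly_planned - weekly_actual  # miles left to meet the goal

race_day = date(2025, 10, 12)  # hypothetical race date
weeks_left = (race_day - date.today()).days // 7

print(f"{weekly_actual}/{weekly_planned} miles, {remaining} to go, "
      f"{weeks_left} weeks until race day")
```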
Like my father, who runs his entire gas station/towing company/repair shop with Excel macros. lol.
Here is someone doing paintings in Excel: https://pasokonga.com/
Large corporate governance is, at best, a shitshow.
I'm curious to hear about things which are easy to do in Excel, but hard in the Jupyter workflow.
Even from JSON or XML.
The other advantage is that you have a full language (Wolfram Language) in the event you want to do further evaluation.