thastings | 3 years ago
At analysis, files containing multiple sweeps were averaged to reduce random noise (YYMMDDXX.Ann files), then these averaged files were exported as a CSV-like series of time-voltage pairs (YYMMDDXX.Tnn files). Tables containing the calculated variables were also generated and exported (YYMMDD.Nnn files). The fun part is that these files had to be named MANUALLY. Each and every one of them. I can't stress enough how repetitive this got... We generated a double-digit number of files every day, and analyzing each file thoroughly required around 30-50 keypresses to move around in the menu and to name the files. Lucky for me, no mouse use was required, so the keypresses could at least be automated.
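To make the scheme concrete, a day's folder looked something like this (the date and counters below are made up, only the pattern is what I described):

    # hypothetical illustration of the naming scheme:
    # YYMMDD = recording date, XX = recording counter, nn = analysis counter
    day=230514
    for xx in 01 02; do
        echo "${day}${xx}.A01"   # averaged sweeps
        echo "${day}${xx}.T01"   # exported time-voltage pairs
    done
    echo "${day}.N01"            # table of calculated variables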
I used DOSBox on Debian to do the analysis, and I ended up creating a bash script that could automatically analyze whole folders of these files in a few minutes. To achieve this, I generated xmacro files that would be played back while the DOSBox window was open; the keystrokes to open each file were included in these macros as well. The generation of the macros was wrapped in a bash script that kept track both of the input files in the folder and of the files generated by the analysis. If a file was supposed to be there but something broke inside DOSBox, the script would just stop before playing the macro for the next file, so it could be restarted relatively easily.
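The wrapper was essentially a loop like the sketch below. This is from memory and heavily simplified: the input extension, the menu keystrokes, and the two-line macro are hypothetical stand-ins for the real 30-50 keypresses per file, and it assumes the xmacro package (xmacroplay) with DOSBox already running and focused.

    #!/bin/bash
    # minimal sketch of the macro-driving wrapper (names are illustrative)
    for f in data/*.DAT; do
        base=$(basename "$f" .DAT)
        out="data/${base}.T01"
        [ -e "$out" ] && continue          # already analyzed, skip it
        {
            echo "Delay 1"
            echo "KeyStr Return"           # enter the file-open menu (hypothetical)
            echo "String ${base}"          # type the file name
            echo "KeyStr Return"
            echo "Delay 3"                 # wait for the export to finish
        } | xmacroplay "$DISPLAY"
        # if the expected output never appeared, something broke inside
        # DOSBox: stop here so the run can be restarted cleanly
        [ -e "$out" ] || { echo "missing ${out}, stopping" >&2; exit 1; }
    done

Checking for the expected output after each macro is what made restarts painless: re-running the script just skipped everything that had already been analyzed.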
A few months later, I met the guy who wrote the software for our team, and asked him if he could write us a script to unpack the binaries into CSVs. From there, I could come up with my own completely automated solution for analysis, and everything was much, much faster. I also showed him the macro-monster I had created. I'm still not sure if he was amazed or if he just thought I was impatient.