1a527dd5 | 1 month ago
2. Don't have logic in your workflows. Workflows should be dumb and simple (KISS) and they should call your scripts.
3. Having standalone scripts will allow you to develop/modify and test locally without having to get caught in a loop of hell.
4. Design your entire CI pipeline for easier debugging, put that print statement in, echo out the version of whatever. You don't need it _now_, but your future self will thank you when you do need it.
5. Consider using third party runners that have better debugging capabilities
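Points 2-4 can be sketched together: the workflow step reduces to a one-liner like `run: ./ci/test.sh`, and everything else lives in a standalone script that also runs on a laptop. The script name and the test command below are placeholders, not anything from the thread:

```shell
#!/usr/bin/env sh
# Hypothetical ci/test.sh -- the workflow just calls this script,
# so it can be developed and debugged locally (points 2 and 3).
set -eu

# point 4: echo the environment up front so failed CI runs are
# easier to debug later
echo "host: $(uname -s) $(uname -m)"

run_tests() {
    # placeholder for the project's real test command
    # (`make test`, `pytest`, `cargo test`, ...)
    echo "all tests passed"
}

result="$(run_tests)"
echo "$result"
```

The workflow file itself then carries no logic; swapping CI providers means changing one `run:` line, not porting build steps.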
dijit|1 month ago
I'm a huge fan of "train as you fight", whatever build tools you have locally should be what's used in CI.
If your CI can do things that you can't do locally: that is a problem.
zelphirkalt|1 month ago
Of course, if you use something else as a task runner, that works as well.
Wilder7977|1 month ago
Using makefiles mixes execution contexts between the CI pipeline and the code within the repository (which ends up containing the logic for the build), instead of using centrally stored external workflows that contain all the business logic for the build steps (e.g., compiler options, docker build steps etc.).
For example, how can you attest in the CI that your code is tested if the workflow only contains "make test"? You need to double check at runtime what the makefile did, but the makefile might have been modified by that time, so you need to build a chain of trust etc. Instead, in a standardized workflow, you just need to establish the ground truth (e.g., tools are installed and are at this path), and the execution cannot be modified by in-repo resources.
pydry|1 month ago
Neither do most people, probably, but it's kinda neat how their suggested fix for github actions' ploy to maintain vendor lock-in is to swap it with a language invented by that very same vendor.
jayd16|1 month ago
Does anyone have a way to mark script sections as separate build steps with defined artifacts? Would be nice to just have scripts with something like.
They could noop on local runs but be reflected in the github/gitlab as separate steps/stages and allow resumes and retries and such. As it stands there's no way to really have CI/CD run the exact same scripts locally and get all the insights and functionality. I haven't seen anything like that but it would be nice to know.
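A partial workaround for the log-visibility half of this: GitHub Actions recognizes the `::group::` / `::endgroup::` workflow commands in stdout, so a script can emit collapsible sections in CI and degrade to plain headings locally. A sketch (the `step` helper is hypothetical, and this gives grouping only, not per-step artifacts or retries):

```shell
#!/usr/bin/env sh
# Run a named section; collapsible in GitHub Actions logs,
# a plain "== name ==" heading when run locally.
set -eu

step() {
    name="$1"; shift
    if [ "${GITHUB_ACTIONS:-}" = "true" ]; then
        echo "::group::$name"
    else
        echo "== $name =="
    fi
    "$@"    # run the section's command; set -e aborts on failure
    [ "${GITHUB_ACTIONS:-}" = "true" ] && echo "::endgroup::" || true
}

step "build" echo "building..."
step "test"  echo "testing..."
```

GitLab has an analogous but differently-formatted collapsible-section syntax, so a portable helper would need to branch on the CI environment.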
arwhatever|1 month ago
It seems like if you
> 2. Don't have logic in your workflows. Workflows should be dumb and simple (KISS) and they should call your scripts.
then you’re basically working against or despite the CI tool, and at that point maybe someone should build a better or more suitable CI tool.
ufo|1 month ago
For my actions, the part that takes the longest to run is installing all the dependencies from scratch. I'd like to speed that up but I could never figure it out. All the options I could find for caching deps sounded so complicated.
embedding-shape|1 month ago
You shouldn't. Besides caching that is.
> All the options I could find for caching deps sounded so complicated.
In reality, it's fairly simple, as long as you leverage content-hashing. First, take your lock file, compute the sha256sum. Then check if the cache has an artifact with that hash as the ID. If it's found, download and extract, those are your dependencies. If not, you run the installation of the dependencies, then archive the results, with the ID set to the hash.
There really isn't more to it. I'm sure there are helpers/sub-actions/whatever Microsoft calls them, for doing all of this in 1-3 lines or something.
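The hash-then-check-then-restore-or-install recipe above fits in a few lines of portable shell. This is a self-contained demo under a temp dir; the lock file contents and the `install_deps` body are stand-ins for your package manager:

```shell
#!/usr/bin/env sh
# Sketch of content-hash dependency caching, per the steps above.
set -eu

workdir="$(mktemp -d)"
cd "$workdir"
mkdir -p cache
printf 'somelib==1.2.3\n' > lock.txt      # stand-in lock file

install_deps() {
    # placeholder for `npm ci` / `pip install -r requirements.txt` / ...
    mkdir -p deps
    echo installed > deps/marker
}

# 1. content-hash the lock file to get the cache key
key="$(sha256sum lock.txt | cut -d' ' -f1)"
archive="cache/$key.tar.gz"

# 2. hit: restore the archive; miss: install, then archive under the key
if [ -f "$archive" ]; then
    echo "cache hit: $key"
    tar -xzf "$archive"
else
    echo "cache miss: $key"
    install_deps
    tar -czf "$archive" deps
fi
```

Because the key is derived from the lock file's contents, any dependency change produces a new key automatically, and stale caches are simply never matched.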
plagiarist|1 month ago
https://docs.github.com/en/actions/how-tos/manage-runners/la...
philipp-gayret|1 month ago
For caching you use GitHub's own cache action.
1a527dd5|1 month ago
For things like installing deps, you can use GitHub Actions' caching, or one of several third party runners that have their own caching capabilities, which are more mature than what GHA offers.
tracker1|1 month ago
Not to mention, Deno can run TS directly and can reference repository/http modules directly without a separate install step, which is useful for shell scripting beyond what pwsh can do. ex: pulling a dbms client and interacting directly for testing, setup or configuration.
For the above reasons, I'll also use Deno for e2e testing over other languages that may be used for the actual project/library/app.
newsoftheday|1 month ago
What? Bash is the best scripting language available for CI flows.
jayd16|1 month ago
If you don't like it, you can get bash to work on windows anyway.