Great question. With the presented approach in Zig, since we work on the binary directly, after you finish your hot-code-reloading session you can still run the generated binary from disk (and debug it, or whatnot), because all writes to memory were also committed to the underlying file on disk. There is therefore no need to recompile your program into an executable from a dynamic library, as I guess would be the case for the approach taken by Nim/V.
The presented approach might also be more resource-efficient, as it writes directly to the program's memory rather than unloading and reloading a dynamic library, but this is very much a guess and I would need to do some benchmarking to get a better feel for it.
In general though, this approach is possible in Zig because, first of all, we have our own linker for each target file format (ELF, MachO, COFF-coming-soon-tm); secondly, the compiler generates the executable file directly; and thirdly, incremental updates are super granular in order to minimise writes to the file as much as possible.
mihaigalos|4 years ago
They can support you with good advice in the further development of Zig, I'm sure! Good luck!