item 28738069

dchapp | 4 years ago

> Yeah, there are reasons to use dynamic linking, but I'm still not sure why dynamic linking gives you better profiling and tracing.

It's not so much that the tracing becomes better, but that it becomes feasible at all. Two specific situations come to mind, both MPI-adjacent: (1) running a PMPI-based tool on code you can't recompile yourself (e.g., you need a Q clearance to see the source, but not to actually execute it -- weird, I know, but not that uncommon in the DOE labs); and (2) running multiple PMPI-based tools simultaneously, composed at runtime via PnMPI.

gnufx | 4 years ago

Exactly, and even if you can rebuild it, you usually don't want to. In principle you can instrument static binaries with dyninst, for instance, but in practice you may not be able to for various reasons. Then, as a system manager, you'd like global profiling of whatever runs -- among other things, so you can put the results for a program in front of its user. If it's all dynamically linked, you can hook in and do that by default, with varying levels of insistence depending on the hooks.

ghoward | 4 years ago

I addressed these arguments. If you don't have access to the source, I think that's a security problem in itself.

That includes not having access to the source because of security clearances. As far as I am concerned, your superiors in the DOE are adversaries to you and your machine.

Other people will have different viewpoints on that, and that's fine.

gnufx | 4 years ago

I want free software, and I'm aghast at what some people run, but insisting on it isn't the real world. How many examples do you want? If we're talking security, what's the threat model? What I run on a decently managed compute cluster (managed with the aid of dynamically linked libraries) should only put my own data at risk.

As I understood it, there actually has been a push for free software solutions on CORAL systems, but I don't remember where that came from.