From personal experience, it sometimes seems you can "remind" the more established models and bring the context back into focus, but why that works, I can only guess.
What methods have you found to brute-force through the problem?