A lot of GOFAI was about symbolic reasoning, which is back in fashion thanks to areas like Neurosymbolic AI and Graph Learning. One could argue that generative AI like Stable Diffusion incorporates some of these techniques: what are prompt engineering and textual inversion if not the manipulation of symbols? Add the more recent interest in areas like Answer Set Programming, and "what is old is new again". It's not as if GOFAI disappeared. Even technologies like expert systems still see use under new names. Sometimes you don't need the heuristic approximation of the rules that you get from ML approaches; you need exact representations, or you already know the rules and the challenge is inference.
PartiallyTyped|3 years ago
This is no different from tree search, not really anyway; it's just that ANNs are used to handle the intractability of expanding an infinite and continuous space, or to provide the probabilities within an MCTS, which is what AlphaGo and AlphaZero do.
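As a rough sketch of that second point: in AlphaZero-style MCTS, the policy network's output enters the selection rule as a prior, so the search is biased toward promising moves instead of expanding everything uniformly. The PUCT-style selection below is a minimal illustration; the `prior` numbers stand in for a policy network's output, and none of this is AlphaZero's actual implementation:

```python
import math

class Node:
    def __init__(self, prior):
        self.prior = prior      # p(action | state) from the policy network
        self.visits = 0
        self.value_sum = 0.0

    def value(self):
        return self.value_sum / self.visits if self.visits else 0.0

def puct_select(children, c_puct=1.5):
    """PUCT selection: exploit observed value, but bias exploration by the prior."""
    total_visits = sum(child.visits for child in children.values())
    def score(child):
        u = c_puct * child.prior * math.sqrt(total_visits + 1) / (1 + child.visits)
        return child.value() + u
    return max(children.items(), key=lambda kv: score(kv[1]))

# Hypothetical priors from a policy net over three candidate moves:
children = {"a": Node(0.7), "b": Node(0.2), "c": Node(0.1)}
move, node = puct_select(children)
print(move)  # "a": with no visits yet, the prior term dominates
```

The point is only that the network replaces a hand-written heuristic inside an otherwise classical search loop.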
One could argue that LMs / causal Transformers are also doing tree search; indeed one of our evaluation metrics, perplexity, is a measurement of the model's uncertainty over the transitions p(token_i | token_{i-1}, ..., token_0).
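Concretely, perplexity is the exponentiated average negative log-probability of those transitions. A minimal sketch (the token probabilities here are made up for illustration):

```python
import math

def perplexity(token_probs):
    # token_probs[i] = p(token_i | token_{i-1}, ..., token_0) under the model
    # perplexity = exp(-(1/N) * sum_i log p(token_i | ...))
    n = len(token_probs)
    avg_nll = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_nll)

# A model that assigns probability 0.25 to every token in a sequence is
# effectively choosing uniformly among 4 options at each step:
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # 4.0
```

Lower perplexity means the model is less uncertain about each next-token transition, which is why it doubles as a branching-factor intuition for the "tree" the model is implicitly searching.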