You could start by looking into multitask transformers, or really any general seq2seq model like T5. T5 simply learns to transform one text sequence into another, so you could fine-tune it to produce your target sequence: rather than emitting an explicit Python list of tuples, it would emit a string that looks like a sequence of tuples, which you then parse back. Or maybe skip all that and outsource it to GPT:
https://imgur.com/a/BQv6C3K
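To make the "string that looks like a sequence of tuples" idea concrete, here's a minimal sketch of the serialization/parsing round trip you'd wrap around the model. The delimiter format (`;` between pairs, `:` within a pair) and the function names are my own assumptions, not anything T5-specific:

```python
import re

def tuples_to_string(pairs):
    # Serialize [(label, value), ...] into the flat target string
    # the seq2seq model is trained to emit.
    return " ; ".join(f"{label} : {value}" for label, value in pairs)

def string_to_tuples(text):
    # Parse the model's decoded output back into a list of tuples.
    # Tolerates stray whitespace and skips malformed chunks.
    pairs = []
    for chunk in text.split(";"):
        m = re.match(r"\s*(.+?)\s*:\s*(.+?)\s*$", chunk)
        if m:
            pairs.append((m.group(1), m.group(2)))
    return pairs

target = tuples_to_string([("city", "Paris"), ("year", "1889")])
print(target)                    # city : Paris ; year : 1889
print(string_to_tuples(target))  # [('city', 'Paris'), ('year', '1889')]
```

During fine-tuning you'd train on `(input_text, tuples_to_string(gold_tuples))` pairs, and at inference call `string_to_tuples` on the decoded output. Picking delimiter characters that can't appear inside your labels or values avoids ambiguity when parsing.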