What you said makes sense. The existing training methods for language models are reaching their limits. My friends and I have recently been exploring new training approaches, and we believe topology will be the next breakthrough in language model architecture. Anyone who is interested is welcome to discuss it with me!
HsuWL|6 months ago
o11ywhisperer|6 months ago
From the observability realm (check username!), modeling the relationships between data is a challenging problem. Standards like OpenTelemetry tackle it by describing relationships between technology elements via attributes and resource.attributes, along with context propagation using span and trace IDs.
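The context-propagation part is simple enough to sketch. Here's a toy of the W3C Trace Context traceparent header that OTel uses on the wire (version-traceid-spanid-flags); the function names are mine, not from any SDK:

```python
import secrets

def new_traceparent() -> str:
    """Mint a root traceparent header: version-traceid-spanid-flags."""
    trace_id = secrets.token_hex(16)  # 32 hex chars, shared by the whole trace
    span_id = secrets.token_hex(8)    # 16 hex chars, unique to this operation
    return f"00-{trace_id}-{span_id}-01"

def propagate(parent: str) -> str:
    """A downstream service keeps the trace ID but mints its own span ID."""
    version, trace_id, _parent_span, flags = parent.split("-")
    return f"{version}-{trace_id}-{secrets.token_hex(8)}-{flags}"

root = new_traceparent()
child = propagate(root)
print(root)
print(child)
```

Every hop keeps the same trace ID, so a backend can stitch the spans back into one tree; that shared ID is the entire "relationship" the standard gives you, which is why the richer questions need context from elsewhere.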
OTel is effectively a relational database schema. The larger questions like “If the Detroit Tigers make it to the playoffs, how much will a head of lettuce be in Berlin?” require context that machines (and humans!) lack. And, since the question is entirely made up, there might not be any relevant context.
Context powered by topology feels like the next step. Extrapolating that topology to search queries still feels like science fiction today.