masteranza | 2 months ago
Here's a very cool analogy from GPT 5.1 which hits the nail on the head in explaining the role of low-dimensional subspaces in learning new tasks, by analogy with 3D graphics:
Think of 3D character animation rigs:
• The mesh has millions of vertices (the 11M weights).
• Expressions are controlled via a small set of sliders:
  • "smile"
  • "frown"
  • "blink"
Each expression update is just:
mesh += α_i * basis_expression_i
Hundreds of coefficients modify millions of coordinates.
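A minimal NumPy sketch of that update, in case it helps make it concrete. All names and sizes here are illustrative, scaled down from the 11M/hundreds in the analogy so it actually runs:

```python
import numpy as np

# Illustrative sizes, scaled down so this runs quickly.
# The analogy's real numbers: ~11M coordinates, a few hundred sliders.
n_coords = 100_000  # stands in for the mesh's millions of vertices
n_basis = 300       # the "smile", "frown", "blink", ... sliders

rng = np.random.default_rng(0)

# Fixed basis directions: each slider owns a full-size displacement
# of the mesh (random stand-ins here for real blendshape deltas).
basis = rng.standard_normal((n_basis, n_coords)).astype(np.float32)

# Neutral mesh (the pretrained weights in the analogy).
mesh = rng.standard_normal(n_coords).astype(np.float32)

# "Learning a new task" = picking a few hundred coefficients alpha;
# the coordinates themselves are never edited directly.
alpha = 0.01 * rng.standard_normal(n_basis).astype(np.float32)

# mesh += sum_i alpha_i * basis_i : a low-dimensional update
# steering a high-dimensional object.
mesh += alpha @ basis

print(mesh.shape, alpha.shape)  # (100000,) (300,)
```

The thing to notice is that `alpha` is the only free parameter: a 300-entry vector steers the full 100k-entry `mesh` through the frozen `basis`, which is the whole point of learning inside a subspace.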
topspin | 2 months ago
Are there novel tasks? Within the limits of physics, tasks are finite, and most of them are pointless. One can certainly entertain tasks that transcend physics, but that isn't necessary if one merely wants an immortal and indomitable electronic god.
mlpro | 2 months ago