Indeed, the context is "people using natural language to make requests". No soul on earth would consider/use your phrasing.
I (a human) have no clue what your request is for - "lowest observed request volume"...??? Try "raise the lights where we usually aren't asking you for much" and you might get the same result. As far as I can tell, with the brightness increase in the garage (where, I'd guess, you've made the fewest requests), the AI apparently understood better than you or I what you meant.
selalipop|2 years ago
With super primitive wake word detection and transcription, the most you get is:
- What the user said
- How loudly each microphone in the house heard it.
If you take a look at the mock object in that transcript, that's what it maps to...
```json
{
  "request": "I'm finding it hard to read",
  "observedRequestVolume": {
    "3eQEg": 30,
    "iA0TN": 60,
    "h1T3y": 59,
    "5Qg1M": 10
  }
}
```
The only part that would be human provided is: "I'm finding it hard to read"
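The loudness heuristic is simple enough to sketch: pick the room whose microphone heard the request loudest. A minimal illustration, where the mic IDs come from the mock payload above but the mic-to-room mapping is entirely invented for the example:

```python
# Hypothetical sketch: infer the speaker's likely room from per-mic loudness.
# Mic IDs match the mock payload above; the room mapping is invented.

request = {
    "request": "I'm finding it hard to read",
    "observedRequestVolume": {"3eQEg": 30, "iA0TN": 60, "h1T3y": 59, "5Qg1M": 10},
}

# Invented mapping from microphone ID to room, for illustration only
mic_rooms = {
    "3eQEg": "kitchen",
    "iA0TN": "living room",
    "h1T3y": "study",
    "5Qg1M": "garage",
}

def likely_room(volumes: dict, rooms: dict) -> str:
    """Return the room whose microphone reported the highest volume."""
    loudest_mic = max(volumes, key=volumes.get)
    return rooms[loudest_mic]

print(likely_room(request["observedRequestVolume"], mic_rooms))  # -> living room
```

Note the mock gives the LLM only the raw numbers, not this logic; the point of the exercise is that the model derives something like this inference on its own.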
The invented challenge was to see whether, given a suboptimal set of inputs (we didn't tell it where we are), it could figure out how to act.
It's this zero-shot capability that makes LLMs suitable for assistants: traditional assistants can barely handle being told to do something they're capable of in the wrong word order, while this can go from a hastily invented representation of a house and an ambiguous command to rational actions with no prior training on that specific task.