chanchar's comments

chanchar | 4 years ago | on: GraphQL at PayPal: An Adoption Story

> Nothing stops you building another layer of endpoints to service your frontend.

They could have added a new endpoint for that specific use case, but I assume they simplified their actual needs down to this one example.

chanchar | 6 years ago | on: California governor issues statewide 'stay at home' order
chanchar | 6 years ago | on: Let me give you a list of the top scams coding bootcamps use to steal your money
chanchar | 6 years ago | on: Fractional Shares
chanchar | 6 years ago | on: PayPal to acquire shopping and rewards platform Honey for $4B
chanchar | 6 years ago | on: Google I/O Developer Keynote [video]
chanchar | 7 years ago | on: How to Get Started with Elixir
chanchar | 7 years ago | on: Market Conditions, Trends and Home Prices in San Francisco
chanchar | 7 years ago | on: The best laptop right now: Huawei Matebook X Pro
chanchar | 7 years ago | on: Ask HN: How many GPUs do you train deep models on?
chanchar | 8 years ago | on: TensorFlow 1.7.0 released
chanchar | 8 years ago | on: Behind the Motion Photos Technology in Pixel 2
chanchar | 8 years ago | on: Show HN: M1 Finance – Automatically invest in what you want, for free

chanchar | 1 year ago | on: What We Learned from a Year of Building with LLMs
Love the idea of adding CoT as a field in the expected structured output; it also makes it easier, from a UX perspective, to show or hide internal versus external outputs.

> Structuring your inputs with XML is very good. Even if you're trying to get JSON output, XML input seems to work better. (Haven't extensively tested this because eval is hard.)

Would be neat to see LLM-specific adapters that could be used to swap different formats in and out of the prompt.
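To make the CoT-as-a-field point concrete, here's a minimal Python sketch. Everything in it is illustrative, not from the article: the `reasoning`/`answer` field names, the parsing, and the sample reply are all assumptions about what a structured-output schema might look like.

```python
import json
from dataclasses import dataclass

# Hypothetical structured output: the model is prompted to emit its chain of
# thought as an explicit "reasoning" field alongside the user-facing "answer".
@dataclass
class ModelOutput:
    reasoning: str  # internal: chain of thought, hidden by default in the UI
    answer: str     # external: what the end user actually sees

def parse_output(raw: str) -> ModelOutput:
    """Parse a JSON reply into a typed object."""
    data = json.loads(raw)
    return ModelOutput(reasoning=data["reasoning"], answer=data["answer"])

def render(output: ModelOutput, show_internal: bool = False) -> str:
    """Show or hide the internal reasoning -- the UX point above."""
    if show_internal:
        return f"[reasoning] {output.reasoning}\n[answer] {output.answer}"
    return output.answer

# A reply a model might return (fabricated for illustration).
raw = '{"reasoning": "2 dozen = 24; 24 - 3 = 21", "answer": "21 eggs remain"}'
out = parse_output(raw)
print(render(out))                      # user-facing view: answer only
print(render(out, show_internal=True))  # debug view: reasoning + answer
```

Because the reasoning is just another field, the show/hide toggle is a pure presentation decision rather than string surgery on the model's raw text.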
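Roughly what I mean by format adapters, as a hedged Python sketch: the same structured context rendered as either XML or JSON, so you can swap serializations per model and A/B them in evals. The adapter names and the flat-dict context are assumptions for illustration.

```python
import json
from xml.sax.saxutils import escape

# Hypothetical prompt adapters: one context dict, multiple serializations.
def to_xml(fields: dict) -> str:
    """Render each key/value pair as an XML element (flat dicts only)."""
    return "\n".join(f"<{k}>{escape(str(v))}</{k}>" for k, v in fields.items())

def to_json(fields: dict) -> str:
    """Render the same context as pretty-printed JSON."""
    return json.dumps(fields, indent=2)

ADAPTERS = {"xml": to_xml, "json": to_json}

def build_prompt(instruction: str, context: dict, fmt: str = "xml") -> str:
    """Assemble the prompt with the chosen serialization of the context."""
    return f"{instruction}\n\n{ADAPTERS[fmt](context)}"

context = {
    "document": "Q3 revenue grew 12%.",
    "question": "How much did revenue grow?",
}
print(build_prompt("Answer using only the context below.", context, fmt="xml"))
print(build_prompt("Answer using only the context below.", context, fmt="json"))
```

Swapping `fmt` is then a one-line change, which is exactly what you'd want when testing the "XML input works better" claim against JSON input for a given model.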