item 35364229


aiappreciator | 2 years ago

Prompt engineering is merely the gateway drug to AI wrangling.

Stable Diffusion has gotten to the point where someone can fine-tune a model (via LoRAs) with 2 days of time and $2 of GPU time.
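(For anyone wondering why LoRA fine-tuning is so cheap: instead of updating the full weight matrix, you freeze it and train a small low-rank adapter on top. A minimal numpy sketch of the idea — the dimensions and the single linear layer are made up for illustration, not any particular model:)

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pretrained weight matrix (d_out x d_in); never updated during fine-tuning.
d_out, d_in, rank = 64, 64, 4
W = rng.standard_normal((d_out, d_in))

# LoRA adds a trainable low-rank update: delta_W = B @ A.
# B starts at zero, so training begins exactly at the pretrained model.
A = rng.standard_normal((rank, d_in)) * 0.01
B = np.zeros((d_out, rank))

def forward(x):
    # Adapted layer: W x + B(A x) -- only A and B would receive gradients.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
# Before any training, the adapter is a no-op:
assert np.allclose(forward(x), W @ x)

# The adapter trains (d_out + d_in) * rank params instead of d_out * d_in.
full_params = d_out * d_in
lora_params = (d_out + d_in) * rank
print(f"full: {full_params} params, LoRA adapter: {lora_params} params")
```

That parameter ratio (512 vs 4096 here, and far more lopsided at real model sizes) is where the "$2 of GPU time" comes from.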

There'll be roles for AI wranglers in every large company, where you'll be gathering the dataset and building LoRA plugins so the AI adapts specifically to your codebase/customer base/documentation, etc.

There are also processes involved in building APIs for the AI (an AIPI?) to use and interface with your documentation and systems, setting up vector databases, monitoring AI output, etc.
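(A vector database is, at its core, just stored embeddings plus nearest-neighbor lookup. A toy in-memory sketch — the `VectorStore` class, the help-desk documents, and the 3-d "embeddings" are invented for illustration; a real setup would use an embedding model and a proper ANN index:)

```python
import math

class VectorStore:
    """Tiny in-memory vector store: add (text, embedding) pairs, query by cosine similarity."""

    def __init__(self):
        self.items = []  # list of (text, vector) pairs

    def add(self, text, vec):
        self.items.append((text, vec))

    def query(self, vec, k=3):
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb)
        # Rank all stored docs by similarity to the query vector; return top k texts.
        ranked = sorted(self.items, key=lambda it: cosine(it[1], vec), reverse=True)
        return [text for text, _ in ranked[:k]]

store = VectorStore()
store.add("How to reset your password", [1.0, 0.0, 0.2])
store.add("Billing and invoices", [0.0, 1.0, 0.1])
store.add("Password complexity rules", [0.9, 0.1, 0.3])

# A password-related query embedding should retrieve the two password docs first.
print(store.query([1.0, 0.0, 0.25], k=2))
```

The "AI wrangler" work is everything around this loop: choosing what to embed, chunking the documentation, and feeding the retrieved snippets into the model's prompt.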

People who think there won't be jobs for expert AI users are just coping, thinking "haha, AI will kill your job too." The steam engine was more powerful than 100 men, yet in the end it required something like 30 people up and down the value chain to support each engine, from coal mining to coal shoveling to maintenance to manufacturing.


majormajor | 2 years ago

I'm not sure most codebases are unique enough for that. There will certainly be some of that at places that are doing new things, but for the average online service backend or frontend app programming tasks, I think things like Copilot will see enough and get trained well enough out of the box to be pretty one-size-fits-all.

There will be a lot of business pressure towards using the "good enough" out-of-the-box ones too. If you've got a team of less than a hundred people, rolling your own "datasets, LoRA plugins, APIs for AI, vector databases, monitoring, etc." is a multi-person effort and a significant chunk of new expense. So is the incremental gain worth it for small-to-medium teams with relatively "standard" problems?

Kinda like self-hosting at that scale vs using a cloud vendor.

whatshisface | 2 years ago

"Language model, write me a Python script to fine-tune a language model."