top | item 42853309

heyflyguy|1 year ago

Stuff like this troubles me. I am in defense tech working with LLMs and I am here to tell you the guards and fences are down at the chow truck while commoners are using LLMs.

brandon272|1 year ago

> I am here to tell you the guards and fences are down at the chow truck while commoners are using LLMs.

I've read this over and over and have no idea what this means.

ebiester|1 year ago

I think it means people in the military and contractors are already using openai and other tools and nobody is able to stop them, even if it leaks secrets.

umeshunni|1 year ago

I hate to be that guy, but here's what ChatGPT says:

This expression uses metaphorical language to describe a situation where traditional barriers, hierarchies, or protections have been removed, allowing broader access or disrupting the status quo. Here's a breakdown:

1. "Guards and fences are down" - This implies that the usual controls, restrictions, or gatekeepers are no longer in place.

2. "At the chow truck" - The chow truck symbolizes something previously exclusive or regulated, like access to resources, opportunities, or knowledge.

3. "While commoners are using LLMs" - Refers to everyday people (as opposed to elites or specialists) now having access to advanced technology, specifically Large Language Models (LLMs), like AI tools for generating text.

Together, the expression likely means that AI-powered tools have democratized access to knowledge and creativity, breaking down barriers that once limited these capabilities to experts, institutions, or privileged individuals. It highlights a significant shift where advanced tools are now accessible to "commoners," disrupting traditional power dynamics.

jcfrei|1 year ago

The real revolution will come (hopefully) when voters start using LLMs to figure out which of their representatives actually vote for policies that improve their life.

netdevphoenix|1 year ago

Is that not happening yet? It will be the equivalent of people searching for that kind of info on Google.

nzach|1 year ago

I don't really understand why people bash so hard on LLMs. In some cases it is a spectacularly bad tool, I get that. But it is only a tool.

Imagine if you have a judge giving out sentences based on astrology books. I don't think anyone would argue the problem would be resolved by banning astrology books from our libraries.

TiredOfLife|1 year ago

From Douglas Adams (of "Hitchhiker's Guide to the Galaxy" fame)

1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

3. Anything invented after you’re thirty-five is against the natural order of things.

ourguile|1 year ago

I believe the main issue in regard to LLMs is that there is a real chance of the prevalence and ease of use of LLMs to erode critical thinking skills. Regardless of boilerplate warnings to "check the validity of answers" coming from the LLM, plenty of people in society outside of this tech savvy audience wouldn't even know where to begin. There was a recent Big Think article on this: https://bigthink.com/thinking/artificial-intelligence-critic....

To be fair, I do think there are plenty of uses for LLMs, but with adoption skyrocketing there really are no guardrails against misuse.

aeturnum|1 year ago

Sure, that judge is a problem, but I think your metaphor is a bit malformed.

In your example you should probably drop the judge, but you should also make a rule saying astrology books aren't a legitimate source of sentencing guidance. That's what people are annoyed about re: LLMs. People keep insisting they are a legitimate source in different situations.

You wouldn't ban them overall, but you do want some kind of society-level taboo against relying on them. You can't just deal with it on the level of people who get fooled into using them.

topaz0|1 year ago

That's all well and good except that we look out and see all of the actively bad uses being hyped as the way of the future, at untold expense in both dollars and energy. The LLM is just a model that is what it is, bashing it doesn't make sense. People are bashing how it is used, both currently and in prospect.

dylan604|1 year ago

First graph, okay. Second graph, the wheels come off in spectacular fashion (unless you're from Florida).

Banning books does not ban knowledge. It just makes it a bit more inconvenient. Banning drugs has not stopped people from getting access.

adamtaylor_13|1 year ago

.... WAT.

Just kick out the judge who clearly... lacks judgment.

nextworddev|1 year ago

Wait till defense contractors start using DeepSeek with Aider...

UltraSane|1 year ago

That would get the contractor fired on the spot along with whatever network engineer allowed it to happen.