jazzkingrt|2 years ago

Serious question: is it typical to describe client-side computing as "on the edge"?

I thought running something on the edge referred to running it in close network proximity to the user, rather than users having control and running things themselves.

wsgeorge|2 years ago

I believe this has been extended to mean "on device", which is interesting. See Gerganov's post on GitHub [0]. I wrote about this here [1], where I contrast the core and the edge. I think the term maps well to this meaning.

What I find more interesting is that, in the classic "close network proximity" sense, some parts of the world may not have benefited as much from that trend, since the closest nodes of a global delivery network can be several countries away.

[0] https://github.com/ggerganov/llama.cpp/discussions/205

[1] https://medium.com/sort-of-like-a-tech-diary/consumer-ai-is-...
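
To make the "on device" sense concrete, here's a minimal sketch of fully local inference, assuming the llama-cpp-python bindings to the llama.cpp project linked in [0] (the linked discussion drives the C++ CLI directly, and the model path below is a placeholder):

    # "Edge" as "on device": the model runs entirely on local hardware,
    # with no network round-trip to a cloud or CDN node.
    # Assumes llama-cpp-python; the model path is hypothetical.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/ggml-model-q4_0.bin")  # local quantized model
    out = llm("Q: What does 'edge computing' mean? A:", max_tokens=64)
    print(out["choices"][0]["text"])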

TeMPOraL|2 years ago

> I believe this has been extended to mean "on device", which is interesting.

I don't like the connotations this carries. This is almost openly talking about reaching all the way into people's hardware to run your software, for your benefit, on their devices, without their knowledge, consent, or control...

capableweb|2 years ago

Yes, "edge computing" can refer to both computing done as close to the user as possible geographically, or even on the device itself. If someone says "I wanna do edge computing" it's not clear enough to know if they just want to have servers they control as close to the user as possible, or do the computing on the device itself. I think Apple would say "edge computing" is on the actual device while CloudFlare would say "edge computing" is on their infrastructure, but distributed to be physically closer to the end user.

iamerroragent|2 years ago

I guess I've been out of the loop for a bit and didn't realize that "edge computing" had become a term since cloud computing took off.

It is kind of cyclical then, is it not?

By that I mean computers used to be shared, and you logged into them through a terminal.

Then the PC came around.

Then, about 15 years ago, cloud computing became all the rage (really an extension of, or a more sophisticated take on, the early time-shared computers).

Now we're back to local computing. I even see more self-hosting and moving away from the cloud due to costs.

All that rant is to say: it's interesting.

Side note: getting this AI to run as locally as possible will, I imagine, be really useful in the medical industry, because it eases HIPAA compliance.

dragonwriter|2 years ago

> Serious question: is it typical to describe client-side computing as “on the edge”?

Somewhat; it’s consistent with, e.g., Google’s “Edge TPU” designation for its client-side neural processors.

> I thought running something on the edge referred to running it in close network proximity to the user

Typically, but on the client device is the limit-case of “close network proximity to the user”, so the use is consistent.

aargh_aargh|2 years ago

Because the term "on the edge" is ambiguous, referring both to close network proximity and to the device closest to the user (as evidenced by this thread), I would suggest using a new term, at least in the context of AI: the AI running on the device closest to the user should be called a "terminator".

layer8|2 years ago

“Edge computing” arguably implies there’s a network you are connected to, one that you’re on the edge of, so I wouldn’t apply the term to applications that can function completely offline. With edge computing there’s usually still a notion of some sort of internet integration, as with IoT devices.

dannyobrien|2 years ago

I've used "edge" in this context for around 15 years[1], and I've always intended it to mean "at the edge of the network", which can include being on the other side of the world to a user.

[1] from https://www.oblomovka.com/wp/2007/08/ at least