There could be a #4 "historically, streaming could use protocols with unreliable delivery and with limited or no retransmission" (which is somewhat related to #1 and #2). For example, there have been media streaming protocols built on UDP rather than TCP, so packets that are lost are not automatically retransmitted. The idea is that for a real-time stream transmission, older frames are no longer considered relevant (as they would not be rendered at all if they were received late), so there is typically no benefit in retransmitting a dropped packet.
That means you could get drop-outs when data gets lost in transmission, but the overall data consumption of the protocol wouldn't go up as a result.
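That "drop, don't retransmit" policy can be sketched in a few lines. This is a toy model, not any real protocol: the `play_stream` helper and its `(seq, frame)` format are made up for illustration. Frames carry sequence numbers; anything that arrives after the playhead has moved past it is stale and is written off rather than re-requested.

```python
def play_stream(arrivals):
    """arrivals: iterable of (seq, frame) in arrival order; gaps mean loss.

    Returns (rendered, dropped): frames shown in order, plus the sequence
    numbers that were skipped or stale and simply never re-requested.
    """
    playhead = 0                  # next sequence number the renderer expects
    rendered, dropped = [], []
    for seq, frame in arrivals:
        if seq < playhead:        # arrived too late to be rendered
            dropped.append(seq)
            continue
        # jump the playhead forward, writing off anything lost in between
        dropped.extend(range(playhead, seq))
        rendered.append(frame)
        playhead = seq + 1
    return rendered, dropped

# Packet 1 was lost in transit and packet 0 shows up late: both become a
# brief drop-out, and no extra bandwidth is spent on retransmission.
rendered, dropped = play_stream([(2, "f2"), (0, "f0"), (3, "f3")])
```

The point of the sketch is the trade: the viewer may see a glitch, but total data consumption never grows in response to loss.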
Not all that long ago, this prompted lots of debate about QoS and prioritization and paid prioritization and network neutrality and stuff. People were arguing that media streams needed higher priority on the Internet than downloads (and other asynchronous communications). Effectively, different Internet applications were directly competing with one another, yet they had very different degrees of tolerance to delays, packet reordering, and packet loss. Wouldn't ISPs have to intervene to prioritize some applications over others?
I remember reading from Andrew Odlyzko that this controversy was mostly resolved in an unexpected way: faster-than-realtime streams with buffering (as the network was typically faster overall than what was needed for a given level of media quality, you could use TCP to reliably download frames that were still in the future with respect to what would be played, and then buffer those locally). This is indeed the scenario depicted in this article.
What about actual live events? My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay relative to realtime, specifically to allow for significant buffering on the client, and then using reliable TCP faster-than-realtime downloads of the "near future" of the video content. Since these streams are purely unidirectional, users don't have a way to notice that they're not literally live. (I don't understand how this interacts with the typical ability to start watching almost instantly, with no visible buffering delay, though.)
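The faster-than-realtime buffering idea is just arithmetic, and a back-of-the-envelope model shows why a short outage can go unnoticed. The numbers and the `buffer_level` helper here are illustrative assumptions, not any real player's algorithm: the client downloads at `speedup` times the playback rate, starts playing once `startup_buffer` seconds are queued, and the buffer grows from there.

```python
def buffer_level(t, speedup, startup_buffer):
    """Seconds of media buffered at wall-clock time t (toy model).

    The client pre-rolls until `startup_buffer` seconds are queued, then
    plays at 1x while continuing to download at `speedup`x.
    """
    fill_time = startup_buffer / speedup      # time to fill the initial buffer
    if t < fill_time:
        return t * speedup                    # still pre-rolling, not playing
    # after playback starts: inflow is speedup, outflow is 1 second/second
    return startup_buffer + (t - fill_time) * (speedup - 1)

# With a 2x download rate and a 2-second pre-roll, after one minute the
# client has banked about a minute of future video.
banked = buffer_level(60, 2, 2)
```

That banked minute is exactly what lets a 30-second network outage pass without a visible glitch, while the pre-roll stays short enough that startup still feels instant.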
There's a big difference for bidirectional conversation, like phone calls, because there even tiny delays are extremely psychologically noticeable. It appears that Zoom, for instance, is still using unreliable UDP streams for call content, which allows skips or dropouts, but keeps the latency relatively low so that it feels comparatively more like a face-to-face interactive conversation.
> My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay relative to realtime,
Yeah. The rule of thumb with Twitch used to be 11 seconds. You can still measure this because many streams replay the chat in the stream as an overlay, both so you can see when the streamer has seen your message and to preserve the chat in VODs for archival purposes.
> don't understand how this interacts with the typical ability to start watching almost instantly, with no visible buffering delay, though.
There's a buffer on the CDN (which they have anyways because they're recording the VOD) and you start playback at the point t seconds back.
> What about actual live events? My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay relative to realtime, specifically to allow for significant buffering on the client, and then using reliable TCP faster-than-realtime downloads of the "near future" of the video content. Since these streams are purely unidirectional, users don't have a way to notice that they're not literally live. (I don't understand how this interacts with the typical ability to start watching almost instantly, with no visible buffering delay, though.)
For TV, the last time I worked on a system like this, the clients received data the same way as for non-live streams: HTTP streaming (HLS or DASH), where you fetch playlists and small video files and the player stitches them all together. There's buffering all along the pipe, and the 30-60s total delay (which you'll notice if you watch sports and chat with someone who has cable and is watching the same thing) is cumulative, so you don't see a one-minute startup delay; you just near-instantly get dropped into something that's already quite a bit behind.
Not sure what Twitch does. The over-the-network video-game streaming services are obviously a completely different world from TV land; they couldn't get away with that kind of latency there. But for TV, the expense of doing better isn't seen as worth it.
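The playlist-plus-segments flow described above can be sketched with a deliberately minimal parser. The `parse_media_playlist` helper name is an assumption for illustration, and real M3U8 playlists have far more tags than this handles; the point is only that "streaming" here is ordinary HTTP fetches of a text index and small files.

```python
def parse_media_playlist(text):
    """Return the media segment URIs from an HLS-style playlist body."""
    segments = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):   # tags and comments start with '#'
            segments.append(line)
    return segments

playlist = """#EXTM3U
#EXT-X-TARGETDURATION:4
#EXTINF:4.0,
seg_001.ts
#EXTINF:4.0,
seg_002.ts
"""
# A real player would now fetch seg_001.ts, seg_002.ts, ... over plain
# HTTP(S), feed them to the decoder, and for a live stream re-poll the
# playlist as new segments are appended at the end.
segments = parse_media_playlist(playlist)
```

The cumulative delay the comment mentions falls out naturally: each stage (encoder, packager, CDN, player buffer) holds a few segments, and the player starts at an edge that is already well behind realtime.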
> My impression is that Twitch and YouTube livestreaming are using a 10-30 second delay
This used to be the case, and may still be for some streamers, but mostly when I watch it's less than a couple seconds of delay with low-latency mode enabled in the browser.
> ...you could use TCP to reliably download frames that were still in the future with respect to what would be played, and then buffer those locally
I was mucking around with my network recently, with Netflix playing in the background. Rebooted the router, and to my utter surprise, the stream continued to play uninterrupted for the entire (30+ seconds) time it takes my network stack to reinitialize. I did not realize how aggressively the providers buffer, but it completely papered over the lack of internet service for the window.
>Since these streams are purely unidirectional, users don't have a way to notice that they're not literally live.
Depending on the delay, this can cause problems when switching from delayed streaming to real life. For example, watching the countdown on a rocket launch via streaming then going outside to watch the actual launch. Usually, for me, a few seconds delay is OK, because I can't see the rocket until about 30 seconds after liftoff due to trees. But when I have a better view of the launch pad the delay can become an issue.
I remember having countless depressing conversations about this all the way back in the very early 2000s when potential clients wanted us to program a video "streaming" system that did not allow downloading, and fruitlessly trying to convince them that streaming was downloading--there's no meaningful technical difference. People were convinced that "streaming" was some weird distinct mode that the Internet could be converted into, and that you just need to program harder to do it.
Streaming does not save to the device by default. There are ways around that, and it's pointless to invest too much in fighting them. Making streaming good and reasonably priced makes legit customers out of pirates.
You will never defeat piracy through technology, only through economics.
I remember a line from the 2002 movie Big Fat Liar involving a school assignment, that went something like, "And don't even think about downloading something from the internet; I want that essay hand-written".
One reason I remember it is the implicit assumption that one couldn't transcribe a digital essay onto physical paper.
The other reason is because it seemed strange to call it "downloading" when I was imagining a web page. Aside from the possibility that it was downloading a non-HTML document file, "downloading" didn't feel right for "visiting a web page", even though of course it is downloading in a more technical sense.
Like "streaming", downloading a web page into your browser's memory isn't saving it to long term storage.
On capable devices, actual downloading is even supported as a USP (unique selling point) by most providers for offline/travel scenarios.
Besides that there are even more externalities that differentiate them:
Client and User requirements and targeted devices, therefore mass adoption and market penetration.
Downloading requires comparatively expensive hardware, usually in quite complicated setups, to get a TV-like experience, and it requires the user to do active file management (which includes deleting files at some point, or buying more local infrastructure). To become a mass-market consumer thing, this needs to be externalized.
A streaming client is far cheaper to build and market, since it doesn't need any relevant amount of volatile or non-volatile memory to speak of. That the user experience is easier to sell is also quite obvious, as witnessed by the golden last decade; it's only now getting tainted by encroaching advertising, platform proliferation, etc.
(Music) Streaming being a "rented" download is the analogy I used to use back in the day.
e.g. the "rented" downloads can be removed from file system at any time by the service you've "rented" from, while a "purchased" or "owned" download is only removed by the person who purchased it.
Streaming:
1) From a remote server: the server provides chunks of data, each chunk with a predefined length. Even in a text format you have to identify each chunk in order to process it; the `Transfer-Encoding: chunked` HTTP header, for example.
> No local streaming in the remote computer.
2) From a local storage system: you can stream any length of data from the storage drive to RAM.
Downloading:
1) From a remote server: the client requests chunks of data, and you can request any length. You don't need to identify the chunks; you can append the downloaded data without any processing.
> There is local streaming: the remote computer actually streams data from its storage to RAM.
2) Locally, copying from a peripheral device is also called downloading; I've seen a "Downloading" label in a microcontroller burner.
Presenting, storing, or deleting the data in either case is your choice. Not only with streams: you can also watch, listen to, or read downloaded content without storing it on an actual drive, or before the download finishes. These are all actions that have nothing to do with the techniques behind the terms.
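The `Transfer-Encoding: chunked` framing mentioned above is simple enough to decode by hand. This is a simplified sketch (no chunk extensions or trailers): each chunk is a hexadecimal length line, CRLF, that many bytes of data, CRLF, and a zero-length chunk ends the body.

```python
def decode_chunked(body: bytes) -> bytes:
    """Decode a chunked HTTP message body into the original payload."""
    out = bytearray()
    pos = 0
    while True:
        eol = body.index(b"\r\n", pos)
        size = int(body[pos:eol], 16)       # chunk length is hexadecimal
        if size == 0:                       # zero-size chunk terminates the body
            break
        start = eol + 2
        out += body[start:start + size]
        pos = start + size + 2              # skip chunk data and trailing CRLF
    return bytes(out)

decoded = decode_chunked(b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n")
# decoded == b"Wikipedia"
```

Note how this supports the comment's point: the receiver must identify each chunk boundary to reassemble the payload, unlike a plain download where bytes can simply be appended as they arrive.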
At Koofr[1] one of the most requested features was an option to prevent downloading files from public links. We didn't want to lie to our users so we added a "Hide download button" option because that's the only thing you can do. You can hide the download button but you can never really prevent the download.
>> rights holders have engaged in a fundamentally-doomed arms race of implementing copy-protection strategies
Not entirely true. They simply haven't succeeded in creating an industry-standard secure pipeline to the pixels on the display. Aside from the "analogue hole", eventually all of the gaps will be plugged, the same way we use secure sockets today. All media devices (including home HDTV/8K/etc.) will extend the chain of trust farther into the pipeline. A set of signed apps and hardware will be required to watch any DRM films on an HDTV, with each stage using authenticated encryption, completely annihilating any MITM siphoning of the video.
So, it's not doomed, just moving slowly, but it absolutely WILL arrive. I know, because I'm working on secure embedded video codec hardware, and our customers are targeting this.
At some point you hit the pixel driver with a bunch of bits; unless your pipeline involves digital signing of copyrights in everyone's future cyber eyeballs, it will always be possible to get the video if you have hardware access.
And the article goes over how there is already an industry standard for the encryption pipeline that goes all the way to monitors and television sets themselves and how you can get a cheap device which just pretends to be a TV and passes on an unencrypted HDMI out.
There are still people watching television on 1980’s hardware. Full HD televisions have been essentially feature complete for over 20 years and should remain relevant for another 20 years, since the vast majority of broadcasts are still 480p and 720p. There are now hundreds of millions of 4k and 8k televisions and projectors with expected service life and lifecycles extending into 2050s.
Bricking those devices en masse is a PR disaster and invites legal scrutiny from regulators, and any individual service suddenly requiring special hardware is shooting itself in the face financially.
All it takes is one person to figure out how to get the bits out, and then the only other potential solution would be to make devices that cannot play unencrypted content.
>I know, because I'm working on secure embedded video codec hardware, and our customers are targeting this..
Why? Or more specifically, why you, doing that?
You can say no, you know. To solve your problem, you're making, for some of the least scrupulous people on the planet (Hollywood types), the primitives of a guaranteed technologically enforceable tyranny. Remember that just because someone says they won't do something with a thing doesn't mean the heel turn isn't coming. Sometimes you just don't build things, because people can't be trusted with them.
So, why are you doing it?
You might think it's just harmless bits now... But today's harmless bits are tomorrow's chain links. Seriously asking. Might help me out of a mental hang up I'm trying to work through.
This is another one of those little technical debates we all like to have. Most of it comes down to the intent of what we're trying to achieve and how we place language around it.
If I want to stream something, it traditionally means I want to watch it in (near) real time as it is being downloaded, without necessarily having to store the entire thing. If I am downloading, I want the whole thing first before I watch it and I want to keep all of it.
The intent determines the technical solution put in place to do that thing well, as there are technical differences. So yes, watching a movie requires downloading it in some way, shape, or form.
Something that doesn't need as much discussion as it gets. One of those "well technically it is..." type of arguments.
>Like all these technologies, HDCP was cracked almost immediately and every subsequent version that’s seen widespread rollout has similarly been broken by clever hacker types
IIUC it doesn't matter much if HDCP is cracked because the licenced chips (or knock-offs from the same factory) end up in stripping devices (or devices that are marketed as having another function like display cloning but also effectively strip the HDCP).
On top of that most pirates prefer to crack the encryption much earlier. Ideally the video stream is captured before the video is decoded. This avoids quality loss that would occur when re-encoding the video.
So cracking HDCP is only "interesting" if you don't want to buy the (very available) hardware and are not going to re-encode or are ok with the generation loss.
There are some "streaming" systems that just download clips of a few seconds, which Javascript in the client reassembles into a longer video. This allows moving forwards and backwards in the video stream while using standard HTTPS.
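Moving forwards and backwards over standard HTTPS usually rests on byte-range requests: the client sends a `Range` header and the server answers with 206 Partial Content. Here is a toy sketch of the server-side slice logic against an in-memory "file"; the `serve_range` helper name is hypothetical, and only the single-range `bytes=start-end` form is handled.

```python
def serve_range(data: bytes, range_header: str):
    """Resolve a 'bytes=start-end' Range header against an in-memory file.

    Returns (payload, content_range) as a server would send in a 206
    Partial Content response. Open-ended ranges ('bytes=start-') read to EOF.
    """
    spec = range_header.split("=", 1)[1]
    start_s, end_s = spec.split("-", 1)
    start = int(start_s)
    end = int(end_s) if end_s else len(data) - 1    # open-ended: to end of file
    payload = data[start:end + 1]
    return payload, f"bytes {start}-{end}/{len(data)}"

video = bytes(range(100))                 # stand-in for a media file
chunk, hdr = serve_range(video, "bytes=10-19")
# chunk is the ten bytes at offsets 10..19; hdr == "bytes 10-19/100"
```

Seeking is then just the client mapping a playback position to a byte offset and issuing a new range request, with no special streaming protocol involved.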
Well, there is yt-dlp, if you count that as a browser. It has hacks for downloading from nearly every website that has weak DRM. It also has a fallback for guessing how to download from arbitrary websites.
If you mean drm'd content, I think that's hopeless now, the security is at the hypervisor level nowadays. I can't even play that content on my old vga monitors. For the average youtube video, I don't see why not in principle.
Lovely article. When discussing streaming and downloading, we should ignore the transport; it's not relevant to the definition.
Downloading is a superset of streaming.
Low-level streaming is the process of transferring data: between CPU and RAM; between RAM, controller, and mass storage device; between machines connected to the internet. The data is streamed: it's sent serially, a stream of data. (The width of the lake does not matter; if the entire thing does not arrive at once, it's streaming.)
Low-level loading is the process of loading something down to a computer: if something is loaded, it's stored _SOMEWHERE_, even if only in RAM. (The LOAD instructions in a CPU place a value into a storage location; a higher-level use of the word is the LOAD command in BASIC, which streams data from somewhere into some location in RAM for medium-term usage.)
When you're viewing a video without saving it to a file, we call it streaming, and it fits the metaphor: even if _chunks_ of the video are DOWNLOADED into temporary storage, the entire video is not (intentionally at least, though if it's small it might be) stored as a whole before being presented. So from a high-level perspective, the video is streamed. If the data is not persisted, it's streamed.
The high-level use of the word streaming fits the low-level metaphor. The process is: stream data from the content provider to the client; the client loads enough data to start presenting; the client presents loaded data while new data streams in and is immediately loaded into buffers; after data has been presented, it is discarded.
Downloading is a different word from loading, but it historically fits the metaphor: when a computer requests data from another computer, it pulls the data "down" from the other computer (as opposed to "up" from mass storage) and into some medium-term storage location, often RAM. (When you browse a website, your browser downloads the data into memory; that medium means it can't automatically discard the data after presenting it, because it does not know when presentation is done, and it may even cache some of what it downloaded on mass storage.)
The newer use of "download" is a slight misnomer, but only slight: the intention is to keep the file for long-term use, so it is streamed into memory buffers before being saved to disk. "Saving" would be a more appropriate term; the distinction of where it is saved FROM is not as important anymore as people used to think.
tldr:
Low-level streaming: Transfer data sequentially.
High-level streaming: Transfer data sequentially, present as needed, reclaim storage.
Downloading: low-level streaming into long-term storage.
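The tl;dr above can be put as code. A generator-based sketch, illustrative only: both paths consume the identical byte stream, and the sole difference is whether each chunk is persisted or presented and then reclaimed.

```python
import io

def byte_stream(blob, chunk_size=4):
    """Low-level streaming: hand the data out sequentially, chunk by chunk."""
    for i in range(0, len(blob), chunk_size):
        yield blob[i:i + chunk_size]

def stream(blob, present):
    """High-level streaming: present each chunk as needed, then reclaim it."""
    for chunk in byte_stream(blob):
        present(chunk)          # render/play the chunk; nothing is persisted

def download(blob, sink):
    """Downloading: the same low-level stream, into long-term storage."""
    for chunk in byte_stream(blob):
        sink.write(chunk)

shown = []                      # collected here only to make the chunks visible
stream(b"the same bits either way", shown.append)

saved = io.BytesIO()
download(b"the same bits either way", saved)
# Both consumed an identical stream; only `saved` kept the bytes.
```

In other words, "downloading is a superset of streaming" falls straight out of the code: `download` is `stream` with a persistent sink.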
> You can still measure this because many streams replay the chat in the stream as an overlay

Plenty of streamers show the chat on screen and talk with people in the chat. This is not true.
> the 30-60s total delay (which you'll notice if you watch sports and chat with someone who has cable and is watching the same thing)

This can be a problem, for example when sports fans receive out-of-band notification of a goal before they see it happen on their "live" stream.
Sailing the high seas since Napster.
I give thanks to: BitTorrent, private trackers, the Subsonic API, Navidrome, Invidious, yt-dlp, Infuse, and mpv.
[1] https://koofr.eu
> HDCP was cracked almost immediately and every subsequent version that's seen widespread rollout has similarly been broken

Is there proof HDCP 2.3 has been cracked?
What you can say is that saving and streaming are not the same.
Streaming UIs use "download" as a synonym for "save to local file". That particular sense of "download" is not the same as "stream".
> There are some "streaming" systems that just download clips of a few seconds

https://en.wikipedia.org/wiki/HTTP_Live_Streaming
https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_ove...
Storage space has become so inexpensive. Memory, too. I have enough RAM to store anything I download for the short term.
I will always prefer downloading.