Ask HN: Why is there no P2P streaming protocol like BitTorrent?
236 points| memet_rush | 10 months ago
I was thinking most people nowadays have at least 30 Mbps upload, and a 1080p stream only needs ~10 Mbps while 720p needs ~5. Also, I think it wouldn't have to be live; people definitely wouldn't mind some amount of lag. I was thinking the complexity of packets propagating out through the network should be O(log N), since if a master sharing the content is connected to 10 slaves, each of those is connected to 10 more slaves, and so on.
The other limitation I could think of is prioritizing who gets the packets first, since there are a lot of people with 1 Gbps connections or >10 Mbps connections. Also deprioritizing leechers to keep them from degrading the stream.
Does anyone have knowledge of why it isn't a thing, though? It's super easy to find streams on websites, but they're all 360p or barely load. I saw the original creator of BitTorrent was building something like this over 10 years ago, and it seems to be a dead project. Also, this is ignoring the huge time commitment it would take to program something like this. I want to know whether it's technically possible to have streams of, let's say, 100,000 people, and why or why not.
Just some thoughts, thanks in advance!
bawolff|10 months ago
If you want live high-quality streaming, a lot of the reasons BitTorrent works so well go away.
Latency matters. In BitTorrent, if a peer goes away, no big deal: just try again in 5 minutes with another peer. You're downloading in random order, so who cares if one piece is delayed 5 minutes? In a live stream, your app is broken if it cuts out for 5 minutes.
In BitTorrent, everyone can divide the work: clients try to download the part of the file that the fewest peers have, so rare parts of the file quickly spread everywhere. In streaming, everyone needs the same piece at the same time.
BitTorrent punishes people who don't contribute by deprioritizing sending data to peers that freeride, and it can do this at the individual level. In a P2P streaming setup, you probably have some peers getting the feed and then sending it to other peers. The relationship isn't reciprocal, so it's harder to punish freeriders: you can't tell at the local level whether the peer you're sending data to is pushing data to the other nodes it's supposed to.
I'm sure some of these have workarounds, but they are hard problems that aren't really satisfactorily solved.
arghwhat|10 months ago
> Latency matters. In bit torrent if the peer goes away, no big deal, just try again in 5 minutes with another peer, you are downloading in random order, who cares if one piece is delayed 5 minutes. In a live stream your app is broken if it cuts out for 5 minutes.
First of all, BitTorrent clients do not download in random order or wait 5 minutes. They usually download the rarest block first, but can do whatever they want, whenever they want.
Second, standard HLS sets a nominal segment size of 6 seconds (some implementations go as high as 10 seconds), and a client will usually cache multiple segments before playing (e.g., 3). This means you have 18 seconds before a segment becomes critical.
This is not a difficult thing for a P2P network to handle. You'd adapt things to introduce timing information and manage number of hops, but each client can maintain a connection to a number of other clients and have sufficient capacity to fill a segment if a connection fails. Various strategies could be used to distribute load while avoiding latency penalties.
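As a rough illustration of that failover budget: a client trying peers in turn with the ~18-second slack described above. All names, peer objects, and latencies here are invented for the sketch, not any real client's API.

```python
# Hypothetical sketch: fetch one HLS segment from a set of peers before a
# deadline, falling back to another peer if one fails. A real client would
# hit a CDN origin as the last resort.

SEGMENT_DEADLINE_S = 18.0  # ~3 buffered segments x 6 s, per the comment above

def fetch_segment(peers, fetch_fn, deadline=SEGMENT_DEADLINE_S):
    """Try peers in order until one delivers within the remaining time budget."""
    remaining = deadline
    for peer in peers:
        ok, elapsed = fetch_fn(peer)
        remaining -= elapsed
        if ok:
            return peer, remaining
        if remaining <= 0:
            break  # too late: the player would stall here
    return None, remaining

def fake_fetch(peer):
    # Simulated transfer: (success, seconds spent). Deterministic for the demo.
    return peer["up"], peer["latency"]

peers = [
    {"id": "a", "up": False, "latency": 2.0},  # a dead peer costs us 2 s
    {"id": "b", "up": True,  "latency": 1.0},
]
winner, slack = fetch_segment(peers, fake_fetch)
```

Even after burning 2 seconds on a dead peer, the client delivers the segment with 15 seconds of slack left, which is the point being made above.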
Low-latency HLS uses much smaller segments and would be more demanding, but isn't impossible to manage.
> BitTorrent punishes people who don't contribute
Private communities punish this behavior, BitTorrent clients do not. Most new downloads will appear as freeriders for a long time, and only over long periods far exceeding the download time will enough compatible seeding opportunities arise for them to contribute in any substantial way.
The network does not need everyone to seed, it only needs enough people to seed.
delusional|10 months ago
Video-on-demand is perfectly implementable on top of BitTorrent. As you say, there are some latency pitfalls you'll have to avoid, but that's nothing you can't hack yourself out of.
Livestreaming is a different beast. As you say, the problem with livestreaming is that everyone needs the same content at the same time. If I spend 200ms downloading the next 500ms worth of content, then there's nobody to share it with, they all spent the 200ms doing the same. BitTorrent relies on the time shift that is allowed between me downloading the content and you requesting it. If you request it before I've got it, well I can't fulfil that request, only the guy I intend to get it from can.
If you wanted to implement something like that, you would probably pick a tree of seeders: the protocol would pick a subset of trusted nodes to upload the content to before allowing them to seed it, and then have them do the same recursively.
That would obviously introduce a bunch of complexity and latency, and would very much not be BitTorrent anymore.
Protostome|10 months ago
1. Locally randomize the segment download order
2. Create a larger buffer
3. Prioritize parts coming from slower connections
karel-3d|10 months ago
And the order is requested by the client; there are clients that download in sequential order, like Deluge.
HumblyTossed|10 months ago
Just a mental curiosity is all.
calvinmorrison|10 months ago
That's basically true for one client (Transmission), which specifically refuses to allow linear ordering. Most clients implement it.
To enable it, it's about a three-line change.
I hate clients that don't work for the user.
PaulRobinson|10 months ago
This P2P stack was meant to allow for mass scaling of lowish latency video streaming, even in parts of the World with limited peer bandwidth to original content source servers. The VC-1 format got into a legal quagmire, as most video streaming protocols do, and it speaks volumes that by the time I turned up in ~2012-ish, the entire stack was RTMP, RTSP, HDS and HLS with zero evidence of that P2P tech stack in production.
My main role was to get the ingest stack out of a DC and into cloud, while also dealing with a myriad of poor design decisions that led to issues (yes, that 2013 outage in the first paragraph of the wiki article was on my watch).
At no point did anybody suggest to me that what we really needed to turn our attention back to was P2P streaming. The company did build a version of Periscope (Twitter's first live-streaming product) and launched it weeks or months before they did, and was pivoting towards a social media platform, at which point I decided to go do other things.
The technical and legal problems are real, and covered elsewhere here. People want reliable delivery. Even Spotify, YouTube and others who have licensed content and could save a pile by moving to DRM-ified P2P don't go near it, and that should tell you something about the challenges.
I'd love more widespread adoption of P2P tech, but I'm not convinced we'll ever see it in AV any time soon, unfortunately.
[0] https://en.wikipedia.org/wiki/LiveStation
garganzol|10 months ago
Thank you for bringing up the warm memories I thought I no longer had.
apitman|10 months ago
andruby|10 months ago
Key part of that tech was that it synchronized the playback between all peers. That was nice for stock market announcements and sport events for example.
https://web.archive.org/web/20131208173255/http://splitcast....
https://www.youtube.com/watch?v=R5UYu9jeQbY
https://www.crunchbase.com/organization/splitcast-technology
martinald|10 months ago
For 'hobbyists' there is a lot of complexity with setting up your own streaming infrastructure compared to just using YouTube or Twitch.
Then for media companies who want to own it, they can just buy their own infra and networking, which is outrageously cheap. HE.net advertises 40 Gbit/s of transit for $2,200/month. I'm oversimplifying somewhat; you do have issues with cheap transit and probably need backups, especially for certain regions. But there isn't much of a middle ground between hobbyists and big media cos.
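A hedged back-of-envelope on the figure quoted above (this ignores redundancy, peak-to-average ratios, and protocol overhead, so treat it as an upper bound):

```python
# Rough capacity math for the HE.net transit figure mentioned in the comment.
transit_gbps = 40
transit_cost_per_month = 2200  # USD, the advertised price quoted above

def concurrent_viewers(stream_mbps):
    """How many simultaneous streams fit in the transit pipe (idealized)."""
    return int(transit_gbps * 1000 / stream_mbps)

viewers_1080p = concurrent_viewers(5)    # ~5 Mbps per 1080p stream, per the OP
viewers_720p = concurrent_viewers(3.5)
cost_per_1080p_viewer = transit_cost_per_month / viewers_1080p  # USD/month
```

That works out to roughly 8,000 concurrent 1080p viewers for about $0.28 per viewer-month of transit, which is why "just buy infra" beats P2P for anyone past hobby scale.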
For piracy (live sports streams), I've read about https://en.wikipedia.org/wiki/Ace_Stream being used for this exact purpose, FWIW. This was a while back, but I know it had a lot of traction at one point.
imtringued|10 months ago
Minimum-latency broadcast forms a balanced tree, and a tree is by definition not peer-to-peer. The number of branches per node is the upload speed divided by the bandwidth of the stream. This branching factor is extremely low for residential internet, with its asymmetric high download and low upload speeds.
Once you add malicious adversaries or poor network connectivity to the P2P network, each client will need to read multiple erasure-coded streams at once and switch over when a peer loses its connection.
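The branching-factor arithmetic can be sketched directly. The 30 Mbps upload and 5 Mbps stream figures are the illustrative numbers from the OP, and the depth formula is the usual rough log approximation, not an exact tree count:

```python
import math

# Fanout per relay node = upload capacity / stream bitrate;
# broadcast depth (and hence hop latency) grows like log_fanout(N).

def fanout(upload_mbps, stream_mbps):
    return int(upload_mbps // stream_mbps)

def tree_depth(viewers, branching):
    if branching < 2:
        return None  # branching factor <2: no spreading tree is possible
    return math.ceil(math.log(viewers, branching))

# Asymmetric residential link: 30 Mbps up serving a 5 Mbps stream.
f = fanout(30, 5)
depth_100k = tree_depth(100_000, f)  # hops for the OP's 100,000-viewer stream
```

With fanout 6, a 100,000-viewer tree is about 7 relay hops deep, so per-hop delay and churn compound quickly — which is the comment's point about asymmetric links.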
throwaway920102|10 months ago
miyuru|10 months ago
In my opinion, NAT and the extensive tracking that has led users to distrust sharing their IP addresses are the reasons why it hasn't caught on.
Imagine YouTube using P2P technology; it would save a lot of the money spent on caching servers.
bawolff|10 months ago
> Imagine YouTube using P2P technology; it would save a lot of the money spent on caching servers.
I think its money well spent.
miohtama|10 months ago
I remember it; it was one of the rare apps built in XUL, the same framework as Mozilla's apps (Firefox).
https://en.m.wikipedia.org/wiki/Joost
pests|10 months ago
whalesalad|10 months ago
lathiat|10 months ago
j45|10 months ago
elmerfud|10 months ago
In general, people aren't tolerant of lag and spinning circles and other such things when they're trying to watch streaming content. If you're fine with watching it a little later, you might as well queue it up and let the whole thing download so it's ready when you are.
jeroenhd|10 months ago
Popcorn Time got taken down pretty hard because they became too popular too fast.
A commercial solution could have a seed server optimized for streaming the initial segments of video files to kickstart the stream, and let basic torrents deal with the rest of the stream.
bayesianbot|10 months ago
memet_rush|10 months ago
The main reasons I think it would be useful: 1. streaming sites seem to lose a lot of money, and 2. sports streams are really bad, even paid ones. I have DAZN and two other sports streaming services, and they still lag and are only 720p.
pabs3|10 months ago
BiteCode_dev|10 months ago
It's similar to Popcorn Time, which was killed through legal means, so I'd say they did take off.
Stremio smartly avoids being killed by making pirating an optional plugin you have to install from another site, so they get deniability.
It works well and saves my ass from needing thousands of subscriptions.
reliablereason|10 months ago
The former don't want to use it as it degrades their control over the content, and the latter don't want to build a new system because systems built on torrents are good enough.
littlestymaar|10 months ago
I then left and the company later got acquired by Level 3 so I don't know exactly how it evolved but it's likely that they abandoned the illegal streaming market for reputational reasons and stuck with big players.
aaron695|10 months ago
Encryption (can work with sharing), signatures, fall back to CDN. Control is not an issue.
> torrents are good enough.
Torrents can't serve the massive livestream market, like sports or season finales or reality TV/news. This is the entire point of the question.
> The only entities
And everyone kicked off of YouTube, or who doesn't want to use big corporations on principle, like hacker cons or the open source community.
rklaehn|10 months ago
Our library is general purpose and can be used whenever you need direct connections, but on top of Iroh we also provide iroh-blobs, which provides BLAKE3 verified streaming over our QUIC connections.
Blobs currently is a library that provides low level primitives and point to point streaming (see e.g. https://www.iroh.computer/sendme as an example/demo )
We are currently working on extending blobs to also allow easy concurrent downloading from multiple providers. We will also provide pluggable content discovery mechanisms as well as a lightweight content tracker implementation.
There is an experimental tracker here: https://github.com/n0-computer/iroh-experiments/tree/main/co...
Due to the properties of the BLAKE3 tree hash you can start sharing content even before you have completely downloaded it, so blobs is very well suited to the use case described above.
We already did a few explorations regarding media streaming over iroh connections, see for example https://www.youtube.com/watch?v=K3qqyu1mmGQ .
The big advantage of iroh over bittorrent is that content can be shared efficiently from even behind routers that don't allow manual or automatic port mapping, such as many carrier grade NAT setups.
Another advantage BLAKE3 has over the BitTorrent protocol is that content is verified incrementally: if somebody sends you wrong data, you will notice after at most ~16 KiB. BitTorrent has something similar in the form of piece hashes, but those are more coarse-grained. Also, BLAKE3 is extremely fast due to its very SIMD-friendly design.
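The incremental-verification idea can be illustrated with a simplified stand-in: plain per-chunk SHA-256 hashes from the standard library, not BLAKE3's actual tree hash, and all function names here are invented for the sketch.

```python
import hashlib

CHUNK = 16 * 1024  # ~16 KiB, the verification granularity mentioned above

# Simplified stand-in for incremental verification: the sender publishes one
# hash per 16 KiB chunk; the receiver checks each chunk as it arrives and
# rejects bad data immediately, instead of after a whole multi-MiB piece.

def chunk_hashes(data):
    return [hashlib.sha256(data[i:i + CHUNK]).digest()
            for i in range(0, len(data), CHUNK)]

def verify_stream(expected, incoming_chunks):
    """Yield verified chunks; raise as soon as one is corrupt."""
    for i, chunk in enumerate(incoming_chunks):
        if hashlib.sha256(chunk).digest() != expected[i]:
            raise ValueError(f"corrupt chunk {i}")
        yield chunk

data = bytes(range(256)) * 200          # ~50 KiB of sample payload
expected = chunk_hashes(data)
chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
chunks[2] = b"x" * len(chunks[2])       # corrupt the third chunk in transit

received = []
try:
    for c in verify_stream(expected, chunks):
        received.append(c)
except ValueError:
    pass  # corruption detected after 2 good chunks, not after the whole file
```

The contrast with coarse piece hashes: here the bad data is caught at the 16 KiB boundary, so only that chunk needs re-fetching.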
We are big fans of bittorrent and actually use parts of bittorrent, the mainline DHT, for our node discovery.
Here is a talk from last year explaining how iroh works in detail: https://www.youtube.com/watch?v=uj-7Y_7p4Dg , also briefly covering the blobs protocol.
anacrolix|10 months ago
notepad0x90|10 months ago
netsharc|10 months ago
nisa|10 months ago
Surprisingly the channels that are available work really well if you just use the mpegts stream.
In a past life I added a few channels to a tvheadend instance on a VPS. It reliably crashed Kodi on some channels, and I've wondered whether it was just broken streams or something more interesting going on.
If you open the ports and watch popular channels it's easily saturating bandwidth - there is no limit.
I've since stopped using it; it's the kind of thing that breaks often enough to be annoying, but not often enough to be useless.
It's IPv4-only and seems to use its own tracker, or at least calls some URLs for initial peer discovery.
Building something similar as true open source would be great but I guess the usecase is mostly illegal streaming.
Be careful: it attempts to use UPnP to open ports on the router, and even just looking through the channel lists makes you upload fragments.
Still a fascinating tool. It gets close to what the OP is looking for, but I think it has scalability issues, and everything about it is kind of shady and opaque.
extraduder_ire|10 months ago
I was hopeful about bittorrent-live when that was announced, but they didn't open source that for some reason either.
wmf|10 months ago
globular-toast|10 months ago
The real reason is centralised architecture gives them control and ability to extract rent.
LargoLasskhyfv|10 months ago
memet_rush|10 months ago
Imustaskforhelp|10 months ago
They use Bao hashing, which is something I discovered through them (IIRC), and it's really nice.
One could create such a protocol, though BitTorrent/IPFS is fine.
I once wanted to host a simple static website, and I used some IPFS gateway to push it from my browser and got a link to that static website, all anonymous.
Kind of great, tbh.
Imustaskforhelp|10 months ago
There are other genuinely useful crypto projects (like Monero, for privacy; I don't like the idea of smart contracts).
I really want to tell you that most crypto is a scam. These guys first went into crypto, and now I'm seeing so much crypto + AI.
As someone genuinely interested in crypto from a technology (decentralization) perspective, I see transactions as a byproduct, not the end result, and people wanting to earn a quick buck feel really weird to me.
Also, crypto isn't safe. These days it mostly just correlates with tech stocks, and 99% of the time it's run by scams, so it's absolutely worse.
The technology is still fascinating, but just because the technology is fascinating doesn't mean it's valuable. Many people are overselling their stuff.
That being said, I have actually managed to use crypto to create permanent storage (something like IPFS, but forced to store content forever), so I think it can be used where anonymity/decentralization is required. But this could be done without involving money in the process as well, and crypto is still not as decentralized as one might imagine.
jauntywundrkind|10 months ago
Dibby053|10 months ago
For livestreams there's AceStream built on BitTorrent, but I think it's closed-source. They do have some SDK but I never looked into it. It's mostly used by IPTV pirates. I've used it a few times and it's hit-or-miss but when it works well I have been able to watch livestreams in HD/FullHD without cuts. Latency is always very bad though.
Then for video-on-demand there are some web-based ones like PeerTube (FOSS) and, I think, BitChute? Sadly WebTorrent is very limited.
cess11|10 months ago
You might want to look into the tradeoffs Discord decided to go with, https://discord.com/blog/how-discord-handles-two-and-half-mi....
Here's some boilerplate for rolling your own, https://blog.swmansion.com/building-a-globally-distributed-w....
In theory you could gain resilience from a P2P architecture, but you're going to have to sacrifice some degree of liveness, i.e., have rendering clients hold relatively large buffers, to handle jitter, network problems, hostile nodes, and so on.
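A minimal sketch of that buffering tradeoff: a jitter buffer that releases packets strictly in sequence and stalls, rather than glitching, while a packet is still missing. The class and packet format are assumptions for the demo, not any real library's API.

```python
import heapq

class JitterBuffer:
    """Hold out-of-order packets; release them only in sequence order."""

    def __init__(self):
        self.heap = []      # min-heap of (sequence_number, payload)
        self.next_seq = 0

    def push(self, seq, payload):
        heapq.heappush(self.heap, (seq, payload))

    def pop_ready(self):
        """Release every packet that is next in sequence; stall on gaps."""
        out = []
        while self.heap and self.heap[0][0] == self.next_seq:
            out.append(heapq.heappop(self.heap)[1])
            self.next_seq += 1
        return out

buf = JitterBuffer()
buf.push(1, "b")                  # arrives early: held back, playback stalls
assert buf.pop_ready() == []      # gap at seq 0, nothing released yet
buf.push(0, "a")                  # late packet fills the gap
released = buf.pop_ready()        # now both come out, in order
```

The bigger the buffer you're willing to hold, the more peer churn and jitter you can absorb, at the direct cost of liveness.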
RedNifre|10 months ago
Such a shame that it failed, nothing after it ever came close.
alganet|10 months ago
It is a thing.
wmf|10 months ago
pabs3|10 months ago
syndeo|10 months ago
1970-01-01|10 months ago
rollcat|10 months ago
Netflix famously offers ISPs an appliance.
zinekeller|10 months ago
nottorp|10 months ago
And if you pay for the streaming, why would you donate your bandwidth to them? Would you get a discount?
glxxyz|10 months ago
Live events, e.g. sports?
"why would you donate your bandwidth to them?"
I don't know but people donate bandwidth for torrents, maybe it's 'free' for them?
PotterSys|10 months ago
Besides bandwidth problems (you can't rely 100% on remote connections), any P2P solution would mean the same fragment is shared many times between clients, something CDN networks have already solved (just serving content instead of juggling signalling).
pdubouilh|10 months ago
0: https://github.com/pldubouilh/live-torrent
foobarbecue|10 months ago
I think your other constraints (tree topology & connection prioritization) already describe how BitTorrent works.
There's one thing you'd need to change for /live/ streaming, where the file is actually being created /during/ the broadcast: the file-verification hash systems require the seeder to have the entire file when initially seeding. Magnet links and .torrent files are based on a hash of the entire file, so maybe you need some kind of modification to the DHT and the .torrent/magnet formats to support verification by sequential chunks.
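One shape such a modification could take, sketched under heavy assumptions: the broadcaster publishes a signed hash per segment as the stream grows, so peers can verify data from a file that isn't finished yet. The HMAC below stands in for a real public-key signature, and all names are invented for the demo.

```python
import hashlib
import hmac

BROADCASTER_KEY = b"demo-signing-key"   # stand-in for a real signing keypair

def announce_segment(seq, segment_bytes):
    """Broadcaster side: hash the new segment and sign (seq, hash)."""
    digest = hashlib.sha256(segment_bytes).digest()
    sig = hmac.new(BROADCASTER_KEY, bytes([seq]) + digest, "sha256").digest()
    return {"seq": seq, "hash": digest, "sig": sig}

def peer_verify(announcement, segment_bytes):
    """Peer side: check the segment against the signed per-segment hash."""
    if hashlib.sha256(segment_bytes).digest() != announcement["hash"]:
        return False
    expected = hmac.new(BROADCASTER_KEY,
                        bytes([announcement["seq"]]) + announcement["hash"],
                        "sha256").digest()
    return hmac.compare_digest(expected, announcement["sig"])

seg = b"\x00" * 1000                    # stand-in for 2 s of encoded video
ann = announce_segment(0, seg)
ok = peer_verify(ann, seg)              # genuine segment verifies
bad = peer_verify(ann, b"tampered")     # altered segment is rejected
```

The key difference from a .torrent infohash is that verification metadata is produced incrementally, segment by segment, instead of once over the finished file.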
chyueli|10 months ago
1. Hybrid architecture (CDN + P2P):
   - Use a CDN for backbone traffic, with edge nodes distributing via P2P to reduce pressure on the central server (e.g., Livepeer, which combines blockchain and P2P).
   - Platforms such as Youku have experimented with such solutions, but they need to weigh cost against effect.
2. Protocol optimization:
   - Sliced transmission: divide the streaming media into small pieces and improve efficiency through multi-path transmission.
   - Dynamic priority: adjust the data-allocation strategy according to node bandwidth and latency.
   - Buffering and preloading: let users tolerate higher latency in exchange for more stable transmission (the HLS/DASH protocol idea).
3. Decentralized network exploration:
   - Projects such as IPFS and BitTorrent Live have tried real-time streaming, but are limited by technical maturity and ecosystem support.
   - Web3 projects (such as Theta Network) use token incentives to encourage nodes to contribute bandwidth, which may drive development.
dcow|10 months ago
One possibility as you allude to is licensing. In a P2P streaming model “rights” holders want to collect royalties on content distribution. I’m not sure of a way you could make this feel legal short of abolishing copyright, but if you could build a way to fairly collect royalties, I wonder if you’d make inroads with enforcers. But overall that problem seems to have been solved with ads and subscription fees.
Another data point is that the behemoths decided to serve content digitally. Netflix and Spotify showed up. The reason the general population torrented music is because other than a CD changer, having a digital library was a requirement in order to listen to big playlists of songs on your… Zune. Or iPod. That problem doesn't exist anymore and so the demand dried up. There was also an audiophile scene but afaik with Apple Lossless the demand there has diminished too.
And finally, since people were solving the problem for real, we also entertained big deal solutions to reduce the strain on the network. If you stream P2P your packets take the slow lane. Netflix and other content providers build out hardware colocated with last mile ISPs so that content distribution can happen even more efficiently than in a P2P model.
In short: streaming turned into a real “industry”. Innovators and capitalists threw lots of time and money at the problem. Streaming platforms emerged, for better and for worse. And here we are today, on the cusp of repeating the past, because short-sighted business mongers have balkanized access with exclusive content libraries to chase user numbers.
karel-3d|10 months ago
Modern streaming protocols sometimes go to absurd lengths to avoid extra hops so you get the data as soon as possible; torrents involve many jumps and negotiations before you get to the actual file. That's good for decentralization, but decentralization and efficiency pull against each other.
SonuSitebot|10 months ago
Churn & reliability: peers come and go, making stable streaming tricky.
Latency: BitTorrent-style protocols aren't built for real-time delivery.
Incentives: without rewards, too many users just leech.
WebRTC: it hits limits fast and often relies on centralized relays.
Legal risks: media companies don't play nice with decentralized distribution.
Bram Cohen tried with BitTorrent Live, but it fizzled out. Would love to see someone revive this with modern tech — still feels like untapped potential.
benlivengood|10 months ago
IPv6 multicast is probably the way forward for livestreams but I haven't really been keeping up on recent developments. In theory there could be dynamic registrations of multicast addresses that ISPs could opt-in to subscribe to and route for their customers.
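For illustration, here is what the subscription side of that model looks like at the socket level. The group address and port are arbitrary picks, and this sketch only builds the IPv4 `ip_mreq` join request without touching the network; the IPv6 equivalent uses `IPV6_JOIN_GROUP` with an `ff0e::/16` global-scope group.

```python
import ipaddress
import socket
import struct

# The network fans the stream out once per link, instead of the server
# pushing N unicast copies; the client just registers interest in a group.
GROUP, PORT = "239.1.2.3", 5004   # 239/8 = administratively scoped IPv4 multicast

def join_request(group):
    """Build the ip_mreq struct passed to setsockopt(IP_ADD_MEMBERSHIP)."""
    addr = ipaddress.ip_address(group)
    if not addr.is_multicast:
        raise ValueError("not a multicast group")
    # 4-byte group address + 4-byte local interface (INADDR_ANY here)
    return struct.pack("4s4s", addr.packed, socket.inet_aton("0.0.0.0"))

mreq = join_request(GROUP)
# A real receiver would then do:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.bind(("", PORT))
#   sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
```

The catch, as the comment notes, is that this only works if every network between broadcaster and viewer routes the group, which is exactly the ISP opt-in problem.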
nicman23|10 months ago
Calwestjobs|10 months ago
It's insane to me that people need to watch toxic channels like the LinusTechTips livestream, regurgitating weeks-old toxic marketing disinformation, and need 0 ms latency for it... XD
Why does everyone need low latency for a one-way stream? It's an unnecessary hurdle just for the sake of the hurdle, with no benefit to anything.
But I agree with you that if companies forgot IPv4 existed, the internet would be simpler, faster, and more usable, at a lower price for everyone.
Am4TIfIsER0ppos|10 months ago
I work on low latency and live broadcast. The appropriate latency of any video stream is the entire duration of it. Nobody else seems to share this opinion though.
johanvts|10 months ago
hwpythonner|10 months ago
If the goal is to cut costs — like vendors trying to avoid AWS/CDN bills — that’s a very different problem than building for censorship resistance or resilience.
Without a clear “why,” the tradeoffs (latency, peer churn, unpredictable bandwidth) are hard to justify. Centralized infra is boring but reliable — and maybe that's good enough for 99% of use cases.
The interesting question is: what’s the niche where the pain is big enough to make P2P worth it?
xbmcuser|10 months ago
m-s-y|10 months ago
Even “modern” cities like NYC are limited to a maximum of 30 Mbps upstream due to ISP monopolies and red tape.
It's getting better, but Spectrum is still literally the only ISP available to many city residents, and their offerings are so lopsided that their highest-end package is a whopping 980/30.
That's right: if you use the majority of that 980 Mbps downstream, the IP overhead will gladly take that 30 Mbps, leaving just about zero headroom for anything else.
snvzz|10 months ago
0. https://ja.wikipedia.org/wiki/PeerCast
mannyv|10 months ago
What does that mean?
The steps to live are pretty simple on the server side (assuming HLS):
1. Stream to your encoder, ideally at a bitrate higher than the transcoded bitrate.
2. Encode and transcode your video, ideally to 540/720/1080p at 30 fps. Each resolution will have its own bitrate, so maybe 2/3.5/5.5 Mbps respectively. Assume 2-second segments and a manifest duration of 10 seconds, so you have 5 segments out there at any given time (though there are usually a few more hanging around).
3. Put the 3 newest segments into storage, and rewrite the four manifests with the new segment URLs. (Do you need to rewrite the top-level manifest? I believe you do, but I can't remember.)
4. Delete the older segment(s) (optional)
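The manifest rewriting in step 3 can be sketched as a sliding-window playlist writer. Segment naming and window size are assumptions matching the 2-second, 5-segment numbers above.

```python
# Minimal sketch of rewriting a live HLS media playlist so it always lists
# the newest few segments. A real encoder pipeline rewrites this file every
# time a new segment lands in storage.

def live_media_playlist(newest_seq, window=5, seg_duration=2):
    first = max(0, newest_seq - window + 1)
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{seg_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{first}",   # tells clients segments rolled off
    ]
    for seq in range(first, newest_seq + 1):
        lines.append(f"#EXTINF:{seg_duration}.0,")
        lines.append(f"segment{seq}.ts")
    # No #EXT-X-ENDLIST: its absence is what marks the playlist as live,
    # so clients keep re-fetching it.
    return "\n".join(lines) + "\n"

playlist = live_media_playlist(newest_seq=7)
```

Each rewrite bumps `EXT-X-MEDIA-SEQUENCE` and drops the oldest entry, which is why a late-joining client can only ever start within the advertised window.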
So when the client requests the manifest (the m3u8), it'll fetch the three sub-manifests (the media playlists) and choose the appropriate resolution. It'll also start loading segments. Ideally it looks at the manifest and fetches the latest segment, so it starts nearer to "now."
Then the client will periodically re-fetch the manifests to get the new segments (the manifest is marked as live; VoD manifests don't require reloading). The fetch interval probably must be less than the segment duration, which is in the manifest somewhere.
All that takes time. It takes time for the server to encode, time for the encoder to put the file(s), time for the client to fetch the manifests, and time for the client to fetch a video segment.
Looking at the above sequence, a client can be generally 0-10 seconds behind everyone else, depending on how the client behaves. And that's a few seconds behind "live," because receiving, encoding and putting files takes time.
So can you do P2P live? As long as you relax the constraints on what you mean by "live," yes. As you can imagine, the chain of latency keeps growing the more peers a segment passes through. And that segment is only really good for 2 seconds (or up to 10 seconds, if the client is sloppy). If live means "up to 20 seconds from now," then yes, you can definitely do it. The tighter that time window gets, the less likely you'll be able to. You might manage it with a lower-bandwidth stream, but even TLS negotiation takes time. Does your client not use TLS? That will save you some.
slicksicknick|10 months ago
jannw|10 months ago
ValdikSS|10 months ago
* Asymmetric network links, slow upload especially on cellular
* Data-plan limits, where both download and upload are counted
* Some ISPs are very much against P2P; sometimes it's government policy (China banned "residential CDNs")
* NAT
dewcifer|10 months ago
I imagine that cutting out the live service ($$$) and SaaS have a large role to play.
paulcole|10 months ago
Plus instead of a million people all wanting to watch Spider-Man 2, those million people have infinite options of short videos or whatever to watch. The desire to watch A Specific Video isn’t what it used to be.
Times have changed and P2P as a common way of sharing stuff is dead to the average person.
silcoon|10 months ago
Saris|10 months ago
But the reality is for 99% of people Youtube and Twitch work just fine.
Plus most residential ISPs have really poor upload speed, and very restrictive data caps.
mikhailbolton|10 months ago
https://www.bittorrent.com/blog/2016/05/17/bittorrent-live-m...
jkhanlar|10 months ago
greenavocado|10 months ago
globular-toast|10 months ago
Tepix|10 months ago
nayuki|10 months ago
Calwestjobs|10 months ago
The torrent PROTOCOL does not require downloading pieces in random order.
It's ONLY BitTorrent, Inc., the COMPANY that releases the APPLICATIONS NAMED uTorrent and BitTorrent, that doesn't want legal trouble from media/music companies, because STREAMING is a different legal category than downloading. There is no other reason for the torrent PROTOCOL not to deliver file pieces in sequential order.
If you need an instant, nanosecond-delayed stream, that doesn't exist anywhere; even radio and TV stations broadcasting over the air are delayed so they all transmit synchronized. So 0 latency and synchronized can be mistaken for each other.
_flux|10 months ago
> if you need instant nanosecond delayed stream
I believe nobody was suggesting that.
ValdikSS|10 months ago
kevinmhickey|10 months ago
ramesh31|10 months ago
dp-hackernews|10 months ago
wmf|10 months ago
protocolture|10 months ago
But after working at an ISP for a while, I realised that getting ISPs to use cool protocols is just impossible, and everything must be built at higher levels.
memet_rush|10 months ago
giorgioz|10 months ago
gunalx|10 months ago
GTP|10 months ago
teddyh|10 months ago
globular-toast|10 months ago
pabs3|10 months ago
https://github.com/johang/vlc-bittorrent/
oldgregg|10 months ago
dboreham|10 months ago
immibis|10 months ago
zveyaeyv3sfye|10 months ago
We had torrent client/streaming video players maybe 20 years ago already.
> Does anyone have knowledge on why it isn't a thing still though?
It is a thing; it seems you didn't do your research.
There are articles all over the interweb if you go and look, such as
https://www.makeuseof.com/best-torrent-streaming-apps/
remram|10 months ago
behringer|10 months ago
greenavocado|10 months ago
markus_zhang|10 months ago
6510|10 months ago
If you have, say, a 640 MB recording at 120 fps, you would only need to successfully receive 2.5 MB at 30 fps to be able to watch the entire thing. With a slight delay in playback you could even hop from one subset of channels to another.
It should work offline too. One could have the cutting edge crispy resolution on a large display or watch the same on a crappy old laptop. (and everything in between)
For fun I once converted a 3.5-hour lecture to 75 MB and was stunned by how watchable it still was.
guerrilla|10 months ago
i5heu|10 months ago
mystified5016|10 months ago
Nadya|10 months ago
noman-land|10 months ago
memet_rush|10 months ago
slashink|10 months ago
jeroenhd|10 months ago
john_the_writer|10 months ago
storytellerjr|10 months ago
defdefred|10 months ago
notpushkin|10 months ago
finalhacker|10 months ago
scotty79|10 months ago
dackdel|10 months ago
rogueptr|10 months ago
[deleted]
demo4000|10 months ago
[deleted]
grezql|10 months ago
[deleted]
aaron695|10 months ago
[deleted]
liaheve|10 months ago
[deleted]
shemulray667|10 months ago
[deleted]
Szpadel|10 months ago
One issue I can imagine: each part would discover peers independently, even though most peers of previous parts could be expected to have the later parts too.
A second idea would be to use IPFS instead of torrents. That would probably have a much easier time reusing peer discovery between parts, and it would also solve the question of when to stop seeding, since that's already built into the protocol.
I guess creating a distributed Twitch based on IPFS would be feasible, but I'm not sure how many people would want to install an IPFS node first. It's a chicken-and-egg problem: you need a lot of people before the system starts working really well, but to get interest it needs to perform well enough that people would migrate from Twitch-like services.
Of course you can use public gateways; AFAIK Cloudflare has a public IPFS endpoint that could serve as a fallback.
immibis|10 months ago