That's not an accurate assessment of what's going on. The linked post says "no contact with the maintainers", not just inactivity, and if you follow the linked GitHub issues, they're people wondering what's going on and whether anyone else can approve pull requests, because there are lots of pull requests waiting. There are 843 pull requests at this time, and I just looked: over 50 are from the last month alone.
It's not that the repo is inactive; it seems this is an extremely active repo in which everything ground to a halt when the admins went dark. That's quite a bit different than just "inactivity".
> There's 843 pull requests at this time, and I just looked and over 50 are from just the last month
That's kinda overwhelming, though ... imagine: if the maintainer pops up somewhere, suddenly 100 motivated people may chime in with "hey, please review this important pull request that's been sitting here for a while".
There are some kinds of open source projects that are prone to this ... some are really not so bad to maintain if you have the right kind of discipline, because they converge on a stable set of functionality and platform compatibility evolves slowly, but some just naturally have endless room for variations and special cases, so as users increase, PRs increase linearly (instead of sub-linearly, as you'd hope). I'm thinking in particular of https://github.com/oauth2-proxy/oauth2-proxy (I contributed to an older fork of it).
Months before the lawsuit, youtube-dl's maintainers frequently closed issues that reported ongoing breakage without giving a reason. Here is one especially illuminating example:
https://github.com/ytdl-org/youtube-dl/issues/23860
And there is also an entire fork that fixes support for just a single provider, NicoNico, because the maintainers ignored its issues.
https://github.com/animelover1984/youtube-dl
A quote from its README:
> All code in this project is licensed solely with the condition that any portion of it is not permitted to be used in the main youtube-dl fork, either directly or indirectly. It is also not permitted to be used in any project that contains contributions from either remitamine or dstftw.
The two users mentioned are, or previously were, major contributors to youtube-dl.
It seems that youtube-dl was already a dysfunctionally managed project at the time of the lawsuit, and happened to ride the wave of good PR for a couple of months before returning to stagnation once again.
To me it sounds like a plugin system would have prevented the centralization and the need for forks, but would have made distribution harder for average users.
2 months is a long time in youtube-dl world. It's not really a "software project" in the traditional sense, where you can stop working on it once it's "done". It's more of a "social project", a focal point for the ongoing activity necessary to keep sites working. youtube-dl without daily commits is useless.
I'm currently using it to download a YouTube video. If it still works for its main function, maybe it's just not a high-priority project until something breaks?
For a project like youtube-dl it is a long time, because it relies on the unofficial APIs (a fancy word for scraping) of video sites, which can shift even on a daily basis. If you look at their GitHub issues, it is just people endlessly complaining that some website is broken again.
Sounds like an exhausting thing to maintain. It's not like writing scrapers (or even just chasing slight variations in an API) is terribly interesting.
Having contributed to youtube-dl in the past, long turnaround times from the maintainers were pretty normal. I've had (and still have some) open PRs that have been ready to merge for going on a year. The two months is really not that big of a deal.
That being said, the project probably could use some reorganization. It takes a lot of community contributions to keep all the extractors maintained, so long turnaround times for reviews aren't ideal.
Also, in common with other massive online properties such as Amazon, changes don't land consistently: they sometimes roll out a bit at a time, so users in different areas see different versions, either because the global roll-out is a staged process or because a UI experiment is being performed. That usually doesn't matter to a well-written scraper, since the core data is still accessible in the same way despite the changed UI sugar coating, but sometimes there are significant enough changes under the hood that the scraper needs to handle them while still supporting the older format(s).
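A scraper can hedge against exactly that kind of staged roll-out by trying each layout it knows about, newest first. A minimal sketch (the field names below are invented for illustration, not YouTube's actual ones):

```python
def extract_title(payload: dict):
    """Try each known page layout, newest first (field names are hypothetical)."""
    known_layouts = (
        ("videoDetails", "title"),      # pretend current layout
        ("content", "video", "title"),  # pretend older layout, still served in some regions
    )
    for path in known_layouts:
        node = payload
        try:
            for key in path:
                node = node[key]
            return node
        except (KeyError, TypeError):
            continue
    return None  # unknown layout: time for another commit

# The same extractor handles both the staged new format and the old one.
new_page = {"videoDetails": {"title": "Example"}}
old_page = {"content": {"video": {"title": "Example"}}}
print(extract_title(new_page), extract_title(old_page))
```

Each layout youtube-dl-style tools drop support for is one fewer fallback here, which is why "support the older format(s)" tends to linger in extractor code for a long time.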
I guess it depends on whether sub-2-month feature development is needed to keep up with YouTube's changes or not.
I maintain a GCC code coverage tool on GitHub, and since GCC doesn't change very often and the feature set of the tool is fairly complete, I sometimes go 6+ months without commits. Usually I don't touch it unless someone opens an issue.
Dark thought, but I've had multiple instances in the last year or so where someone hasn't posted or tweeted in a while and I wonder for a second: maybe they died? The pandemic is real, and random people disappearing unexpectedly is part of it.
I hope all is well.
OJFord|4 years ago
Using a non-public API is not at all the same as scraping, which refers to parsing a rendered HTML page for the content you want.
Both have this maintenance problem, but one's not a fancy word for the other.
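Roughly, the difference looks like this. The JSON response and page markup below are made up for illustration, but the failure modes are real: the API path breaks when the internal schema changes, while the scraper breaks whenever the markup moves.

```python
import json
from html.parser import HTMLParser

# "Unofficial API" style: the site's own frontend fetches JSON from an
# internal endpoint; a downloader calls the same endpoint and reads
# structured fields. (Response shape invented for this example.)
api_response = '{"videoDetails": {"title": "Example", "videoId": "abc123"}}'
details = json.loads(api_response)["videoDetails"]

# Scraping style: fetch the rendered page and dig the same data out of
# markup that was written for humans, not programs.
class TitleScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = None
    def handle_starttag(self, tag, attrs):
        if tag == "h1" and ("class", "video-title") in attrs:
            self.in_title = True
    def handle_data(self, data):
        if self.in_title and self.title is None:
            self.title = data.strip()
    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_title = False

page = '<html><body><h1 class="video-title">Example</h1></body></html>'
scraper = TitleScraper()
scraper.feed(page)

print(details["title"])   # structured field from the JSON response
print(scraper.title)      # same datum, recovered from the markup
```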
hatsunearu|4 years ago
https://old.reddit.com/r/DataHoarder/comments/p9riey/youtube...
Sounds more like maybe the person is sick or something.
pwdisswordfish8|4 years ago
They don’t even bother removing spam.