top | item 43962427

Show HN: CLI that spots fake GitHub stars, risky dependencies and licence traps

122 points| artski | 9 months ago |github.com

When I came across a study that traced 4.5 million fake GitHub stars, it confirmed a suspicion I’d had for a while: stars are noisy. The issue is they’re visible, they’re persuasive, and they still shape hiring decisions, VC term sheets, and dependency choices—but they say very little about actual quality.

I wrote StarGuard to put that number in perspective, using my own methodology inspired by that study, and to fold a broader supply-chain check into one command-line run.

It starts with the simplest raw input: every starred_at timestamp GitHub will give. It applies a median-absolute-deviation test to locate sudden bursts. For each spike, StarGuard pulls a random sample of the accounts behind it and asks: how old is the user? Any followers? Any contribution history? Still using the default avatar? From that, it computes a Fake Star Index, between 0 (organic) and 1 (fully synthetic).
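The MAD test itself is tiny. Here's a minimal sketch of the idea (a simplification, not the actual StarGuard code; the 3.5 cutoff is a common convention, not necessarily what StarGuard uses):

```python
import statistics

def find_spikes(daily_counts, k=3.5):
    """Return indices of days whose star count is a MAD outlier."""
    med = statistics.median(daily_counts)
    mad = statistics.median(abs(c - med) for c in daily_counts)
    if mad == 0:
        mad = 1  # flat series: avoid division by zero
    return [i for i, c in enumerate(daily_counts) if (c - med) / mad > k]

print(find_spikes([3, 2, 4, 3, 120, 5, 2]))  # [4]: the 120-star day
```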

But inflated stars are just one issue. In parallel, StarGuard parses dependency manifests or SBOMs and flags common risk signs: unpinned versions, direct Git URLs, lookalike package names. It also scans licences—AGPL sneaking into a repo claiming MIT, or other inconsistencies that can turn into compliance headaches.
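The manifest checks boil down to simple pattern rules. A stripped-down sketch for pip-style requirements (flag names and rules here are illustrative, not StarGuard's exact ones):

```python
def flag_requirement(line):
    """Flag common risk signs in a single requirements.txt line."""
    line = line.strip()
    flags = []
    if line.startswith("git+") or "://" in line:
        flags.append("direct URL dependency")
    elif "==" not in line:
        flags.append("unpinned version")
    return flags

print(flag_requirement("requests"))                    # ['unpinned version']
print(flag_requirement("git+https://github.com/x/y"))  # ['direct URL dependency']
print(flag_requirement("flask==2.3.2"))                # []
```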

It checks contributor patterns too. If 90% of commits come from one person who hasn’t pushed in months, that’s flagged. It skims for obvious code red flags: eval calls, minified blobs, sketchy install scripts—because sometimes the problem is hiding in plain sight.
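The "plain sight" scan is essentially a regex pass. Roughly (these patterns are examples picked for illustration, not the shipped list):

```python
import re

RED_FLAGS = {
    "eval call": re.compile(r"\beval\s*\("),
    "exec call": re.compile(r"\bexec\s*\("),
    "curl-pipe-shell": re.compile(r"curl[^\n]*\|\s*(ba)?sh"),
}

def scan_source(text):
    """Return the names of any red-flag patterns found in the source text."""
    return [name for name, pat in RED_FLAGS.items() if pat.search(text)]

print(scan_source("result = eval(user_input)"))  # ['eval call']
```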

All of this feeds into a weighted scoring model. The final Trust Score (0–100) reflects repo health at a glance, with direct penalties for fake-star behaviour, so a pretty README badge can’t hide inorganic hype.
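In spirit, the scoring is just a weighted penalty sum. A toy version (weights and signal names are invented for illustration; the real model differs):

```python
WEIGHTS = {"fake_star_index": 40, "dependency_risk": 25,
           "license_risk": 20, "maintainer_risk": 15}

def trust_score(signals):
    """signals: dict of risk values in [0, 1]; returns a score in [0, 100]."""
    penalty = sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)
    return round(100 - penalty)

print(trust_score({"fake_star_index": 0.8, "dependency_risk": 0.2}))  # 63
```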

For the fun of it, I also made it generate a cool little badge for the trust score lol.

Under the hood, it's all heuristics and a lot of GitHub API paging. Run it on any public repo with:

python starguard.py owner/repo --format markdown

It works without a token, but you’ll hit rate limits sooner.

Please provide any feedback you can.

72 comments

the__alchemist|9 months ago

> It checks contributor patterns too. If 90% of commits come from one person who hasn’t pushed in months, that’s flagged.

IMO this is a slight green flag; not red.

sethops1|9 months ago

I have to agree - the highest quality libraries in my experience are the ones maintained by that one dedicated person as their pet project. There's no glory, no money, no large community, no Twitter followers - just a person with a problem to solve and making the solution open source for the benefit of others.

artski|9 months ago

Fair take—it's definitely context-dependent. In some cases, solo-maintainer projects can be great, especially if they’re stable or purpose-built. But from a trust and maintenance standpoint, it’s worth flagging as a signal: if 90% of commits are from one person who’s now inactive, it could mean slow responses to bugs or no updates for security issues. Doesn’t mean the project is bad—just something to consider alongside other factors.

Heuristics are never perfect and it's all iterative but it's all about understanding the underlying assumptions and taking the knowledge you get out of it with your own context. Probably could enhance it slightly by a run through an LLM with a prompt but I prefer to keep things purely statistical for now.

255kb|9 months ago

Also, isn't that just 99% of OSS projects out there? I maintained a project for the past 7+ years, and despite 1 million downloads, tens of thousands of monthly active users, it's still mostly me, maintaining and committing. Yes, there is a bus factor, but it's a common and known problem in open-source. It would be better to try to improve the situation instead of just flagging all the projects. It's hard enough to find people ready to help and work on something outside their working hours on a regular basis...

lispisok|9 months ago

It's gonna flag most of the clojure ecosystem

j45|9 months ago

Not sure if this is a red flag.

coffeeboy|9 months ago

Very nice! I'm personally looking into bot account detection for my own service and have come up with very similar heuristics (albeit simpler ones since I'm doing this at scale) so I will provide some additional ones that I have discovered:

1. Fork to stars ratio. I've noticed that several of the "bot" repos have the same number of forks as stars (or rather, most ratios are above 0.5). Typically a project doesn't have nearly as many forks as stars.

2. Fake repo owners clone real projects and push them directly to their account (not fork) and impersonate the real project to try and make their account look real.

Example bot account with both strategies employed: https://github.com/algariis
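The fork-to-stars heuristic from point 1 is trivial to bolt on (the 0.5 threshold is the one from this comment; treat it as a rough rule of thumb):

```python
def suspicious_fork_ratio(stars, forks, threshold=0.5):
    """True if the forks/stars ratio is unusually high for an organic repo."""
    if stars == 0:
        return False
    return forks / stars > threshold

print(suspicious_fork_ratio(1000, 40))   # False: typical organic ratio
print(suspicious_fork_ratio(1000, 980))  # True: forks nearly equal stars
```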

artski|9 months ago

Crazy how far people go for these things tbh.

hungryhobbit|9 months ago

Dependencies: PyPI, Maven, Go, Ruby

This looks like a cool project, but why on earth would it need Python, Java, Go, AND Ruby?

27theo|9 months ago

It doesn't need them, it parses SBOMs and manifests from their ecosystems. I think you misunderstood this section of the README.

> Dependencies | SBOM / manifest parsing across npm, PyPI, Maven, Go, Ruby; flags unpinned, shadow, or non-registry deps.

The project seems like it only requires Python >= 3.9!

deltaknight|9 months ago

I think these are just the package managers that it supports parsing dependencies for. The actual script seems to just be a single python file.

It does seem like the repo is missing some files though; make is mentioned in the README but no makefile and no list of python dependencies for the script that I can see.

catboybotnet|9 months ago

Why care about stars in the first place? GitHub is a repo of source repos; using it like social media is pretty silly. If I like a project, it goes into a folder in my bookmarks - that's the 'star' everyone should use. For VCs? What, are you looking to make an open source todo app into a multi-million-dollar B2B SaaS? VCs are the almighty gods of the world, and we humble peons need not lend our assistance to help them lose money :-)

Outside of that, neat project.

never_inline|9 months ago

The social-ification of tech is indeed a worrying trend.

This is exacerbated by low-quality / promotional Medium articles, LinkedIn posts, etc., which promote the Nth copycat HTML GPT wrapper app or Kubernetes dashboard as the best thing since sliced bread, with not-so-technical programmers falling for it.

This applies to some areas more than others (eg, generative AI, cloud observability)

ngangaga|9 months ago

> they still shape hiring decisions, VC term sheets, and dependency choices

This is nuts to me. A star is a "like". It carries no signal of quality, and even as a popularity proxy it's quite weak. I can't remember the last time I looked at stars and considered them meaningful.

rurban|9 months ago

A star is more of a "follow" or "watch", because each update then shows up in my timeline.

pkkkzip|9 months ago

the difference means getting funded or not. people fake testimonials and put logos of large companies too on their saas.

some people even buy residential proxies and create accounts on communities that can steer them towards specific action like "hey lets short squeeze this company let me sell you my call option" etc.

there's no incentive to be honest, i know two founders where one cheated with fake accounts, github likes and exited. the other ultimately gave up and worked in another field.

the old saying "if you lie to those who want to be lied to, you will become wealthy" rings true.

however at the end of the day it is dishonest and money earned through deception is bad.

Yiling-J|9 months ago

It would be interesting if there were an AI tool to analyze the growth pattern of an OSS project. The tool should work based on star info from the GitHub API and perform some web searches based on that info.

For example: the project gets 1,000 stars on 2024-07-23 because it was posted on Hacker News and received 100 comments (<link>). Below is the static info of stargazers during this period: ...

artski|9 months ago

Yeah I thought about this and maybe down the line, but wanted to start with the pure statistics part as the base so it's as little of a black box as possible.

knowitnone|9 months ago

Great idea. This should be done by Github though. I'm surprised Github hasn't been sued for serving malware.

swyx|9 months ago

> I'm surprised Github hasn't been sued for serving malware.

do you want a world where people can randomly sue you for any random damages they suffer or do you want nice things like free code hosting?

artski|9 months ago

Yeah, to be fair, that would be great; sometimes just giving a nudge and showing people want these features is the first step to getting an official integration.

binary132|9 months ago

I approve! It would be cool to have customizable and transparent heuristics. That way if you know for example that a burst of stars was organic, or you don’t care and want to look at other metrics, you can, or you can at least see a report that explains the reasoning.

feverzsj|9 months ago

CTOs don't care about github stars. They are behind tons of screening processes.

throwaway314155|9 months ago

Believe me, CTOs of startups do.

zxilly|9 months ago

Frankly, I think this program is ai generated.

1. there are hallucinatory descriptions in the Readme (make test), and also in the code, such as the rate limit set at line 158, which is the wrong number

2. all commits are done on github webui, checking the signature confirms this

3. too verbose function names and a 2000 line python file

I don't have a complaint about AI, but the code quality clearly needs improvement. The license detection only lists a few common examples, the thresholds for detection seem to be set randomly, and the entire _get_stargazers_graphql function is commented out and performs no action; it says "Currently bypassed by get_stargazers". Did you generate the code without even reading through it?

Bad code like this gets over 100 stars; it seems like you're doing a satirical fake-star performance art.

zxilly|9 months ago

I checked your past submissions and yes, they are also ai generated.

I know it's the age of ai, but one should do a little checking oneself before posting ai generated content, right? Or at least one should know how to use git and write meaningful commit messages?

artski|9 months ago

Well, I initially planned to use GraphQL and started to implement it, but switched to REST for now as it's still not fully complete, just to keep things simpler while I iterate, and because it's not currently required. I'll bring GraphQL back once I've got key cycling in place and things are more stable. As for the rate limit, I've been tweaking things manually to avoid hitting it constantly, which I did to an extent; that's actually why I want to add key rotation... and I am allowed to leave comments for myself in a work in progress, no? Or does everything have to be perfect from day one?

You would assume that if it was purely AI-generated it would have the correct rate limit in the comments and the code... but honestly I don't care, and yeah, I ran the README through GPT to 'prettify' it. Arrest me.

Am4TIfIsER0ppos|9 months ago

What is a license trap? This "AGPL sneaking into a repo claiming MIT"? Isn't that just a plain old license violation?

artski|9 months ago

Basically what I mean by it is, for example, a repository appears to be under a permissive license like MIT, Apache, or BSD, but actually includes code that’s governed by a much stricter or viral license—like GPL or AGPL—often buried in a subdirectory, dependency, or embedded snippet. The problem is, if you reuse or build on that code assuming it’s fully permissive, you could end up violating the terms of the stricter license without realising it. It’s a trap because the original authors might have mixed incompatible licenses, knowingly or not, and the legal risk then falls on downstream users. So yeah, essentially a plain old license violation, which is relatively easy to miss or not think about.
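A minimal version of that check is just comparing the declared top-level license against license identifiers found deeper in the tree (the license sets below are illustrative, not exhaustive):

```python
COPYLEFT = {"GPL-3.0", "AGPL-3.0", "LGPL-3.0"}
PERMISSIVE = {"MIT", "Apache-2.0", "BSD-3-Clause"}

def license_conflicts(declared, found_licenses):
    """Flag copyleft licenses buried under a permissive declared license."""
    if declared in PERMISSIVE:
        return sorted(COPYLEFT & set(found_licenses))
    return []

print(license_conflicts("MIT", ["MIT", "AGPL-3.0"]))  # ['AGPL-3.0']
```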

tough|9 months ago

they get around it by licensing differently only packages / parts of the codebase

sesm|9 months ago

How does it differentiate between organic (like project posted on HN) and inorganic star spikes?

colonial|9 months ago

Just spitballing, but assuming the fake stars are added in a "naive" manner (i.e. as fast as possible, no breaks) you could distinguish the two by looking for the long tail usually associated with organic traffic spikes.

Of course, the problem with that is the adversary could easily simulate the same effect by mixing together some fall-off functions and a bit of randomness.
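That long-tail idea can be sketched as a single statistic: what fraction of a spike's stars arrive after the first couple of hours (all numbers below are made up to show the shape):

```python
def tail_fraction(hourly_counts, head_hours=2):
    """Share of stars arriving after the first head_hours of a spike."""
    total = sum(hourly_counts) or 1
    return 1 - sum(hourly_counts[:head_hours]) / total

organic = [50, 40, 30, 20, 15, 10, 8, 5]   # gradual fall-off
botlike = [400, 350, 2, 1, 0, 0, 0, 0]     # everything lands up front

print(round(tail_fraction(organic), 2))  # 0.49: sizeable tail
print(round(tail_fraction(botlike), 2))  # 0.0: almost no tail
```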

artski|9 months ago

For each spike it samples the users from that spike (I set the sample size high enough that it currently grabs essentially all of them for 99.99% of repos; that should be optimised so it's faster, but I figured I'd just grab every single one for now whilst building it). It checks the users who caused the spike for signs of being "fake accounts".

nfriedly|9 months ago

I love the idea! How feasible would it be to turn it into a browser extension?

edoceo|9 months ago

Could you add support for PHP via package.json? Accept patch?

artski|9 months ago

I haven't done that before so it would be a small learning curve for me to figure that out. Feel free to make a pull request.

nottorp|9 months ago

Of course, GitHub could just drop the stars, but everything has to enshittify towards "engagement" and add social network features.

Or users could ignore the stars and go old school and you know, research their dependencies before they rely on them.

Vanclief|9 months ago

Stars are just a signal. When I am looking at multiple libraries that do the same thing, I am going to trust a repo with 200 stars more than one with 0. It's not perfect, but I don't have the time to go through the entire codebase and try it out. If the repo works for me I will star it to contribute to the signal.

benwilber0|9 months ago

Github was a "social network" from its very beginning. The whole premise was geared around git hosting and "social coding". I don't think it became enshittified later since that was the entire value proposition from day 1.