PythagoRascal | 3 years ago
I don't see how that invalidates the copyright/license argument. So, instead of just a straight up license violation it's a license violation via plagiarism.
That argument wouldn't hold up even if it were a human who caused the violation. You can't simply paraphrase someone's licensed work, deny having looked at it, and pretend you made it yourself — yet that is essentially what seems to happen with Copilot, as it doesn't automatically reproduce the license of the code it reproduces.
abigail95 | 3 years ago
Yes you can. That's exactly why you paraphrased it instead of copying verbatim.
At the fringes, your transformation may not be enough to overcome the requirements, but that's an exception. Nearly all paraphrasing is legal by default.
ClassAndBurn | 3 years ago
The arguments against my point always assume perfect memory of everything this model has consumed. That is the plagiarism position. In reality, some patterns are more common than others and generate code that looks nearly identical. I can't speak to the reasons for this, as I'm not familiar with all of the methods. However, I don't assume that is the current working state or intent of Codex.
BeefWellington | 3 years ago
It remains to be seen whether ML is true "learning" in the sense of developing a skill the way a human does over time.
It is, however, irrelevant to the manner in which this model operates today.