It isn’t, but I thought that made for a good title. You have probably heard about GitHub Copilot, the new AI-driven pair programming buddy. Copilot is trained on a wealth of publicly-available code, including code under copyleft licenses like the GPL. This has led many people to question the legality of using Copilot. Does code developed with it require a copyleft license?
The legal parts
Reminder: I am not a lawyer.
No. While I’d love to see the argument play out in court, I don’t think it’s an issue. For as much as I criticize how we apply AI in society, I don’t think this use is illegal. In the same way that the book I’m writing on program management isn’t a derivative work of all the books and articles I’ve read over the years, Copilot-produced code isn’t a derivative work either.
“But, Ben,” you say. “What about the cases where machine learning models have produced verbatim snippets of code?” In those cases, I doubt the snippets rise to the level of copyrightability on their own. It’d be one thing to reproduce a dozen-line function. But even reproducing two or three lines…eh.
The part where verbatim reproduction gets interesting is the leaking of secrets. I’ve seen anecdotal tales of Copilot helpfully suggesting private keys. This is either Copilot producing a gibberish string because it expects gibberish in that spot, or Copilot reproducing a string that someone accidentally checked into a repo. The latter seems more likely. And it’s not a licensing concern at that point. I’m not sure it’s any legal concern at all. But it’s a concern to the owner of the secret if that information gets out into the wild.
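To make that failure mode concrete, here’s a minimal, hypothetical sketch. The variable names, the environment variable, and the placeholder key are all invented for illustration, not anything Copilot actually produced: a secret hardcoded in source ends up in a repo’s history (and potentially in a training corpus), while a secret read from the environment never appears in the code at all.

```python
import os

# The kind of line that gets committed by accident and then lives forever in a
# public repository's history (the key value here is a made-up placeholder):
API_KEY = "sk_live_EXAMPLE_DO_NOT_USE_1234567890"

# The safer pattern: keep the secret out of the source entirely and read it
# from the environment at runtime, so there is nothing for a model to memorize.
API_KEY = os.environ.get("PAYMENT_API_KEY", "")
```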
The community parts
But being legally permissible doesn’t mean Copilot is acceptable to the community. It certainly feels like a two-trillion-dollar company (Microsoft, the parent of GitHub) taking advantage of individual and small-team developers—people who are generally under-resourced. I can’t argue with that. I understand why people would find it gross, even if it’s legal. Of course, open source licenses by nature often permit behavior we don’t like.
Pair programming works well, or so I’m told. If a service like Copilot can be that second pair of eyes sometimes, then it will be a net benefit for open and proprietary code alike. In the right context, I think it’s a good idea. The execution needs some refinement. It would be good to see GitHub proactively address the community’s concerns about services like this. I don’t think Copilot is necessarily the best solution, but it’s a starting point.
[Full disclosure: I own a limited number of Microsoft shares.]