[Nut-upsuser] FOSDEM 2026
Greg Troxel
gdt at lexort.com
Thu Feb 5 13:05:26 GMT 2026
Thanks for the FOSDEM summary/comments.
My take on AI is that LLM output is a derived work of the training
data, and there is no licensing story, so people submitting it are
sending the project code without the ability to make the normal
inbound=outbound license grant.
Consider LICENSE-DCO:
Developer Certificate of Origin
Version 1.1
Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.
Developer's Certificate of Origin 1.1
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.
For LLM output:
point (a) is not true
point (b) is not true
point (c) is either not applicable or not true
Merging LLM output means that the codebase is contaminated and that
there is no longer clear permission to copy under the GPL.
The other issue, totally separate, is that I believe it is outright
unethical to ask humans to review or even read LLM output.
So yes, we live in a world where improper behavior is common, but that
doesn't mean we have to say it's ok.
Thus:
No LLM output may be submitted in a PR, inserted into a ticket, sent
to a mailing list, or sent privately to any maintainer.