TL;DR
No.
The LGPL is just one license, and it didn't impact the lives and freedoms of millions of Europeans by enabling thousands of AI systems to avoid the legal and scientific scrutiny that the AI Act imposes on them.
A surprising argument…
During the last Town Hall, @stefano quoted Stallman about the strategic value of the linking exception granted by LGPL (slide 18).
As far as I understood, the argument was: “open source has a long history of compromises, so allowing unshareable datasets in Open Source AI is fine if it leads to more AI systems certified by the OSI as Open Source”.
The LGPL was suggested as an example of such compromises.
But is it a reasonable precedent?
LGPL doesn’t compromise on the four freedoms
The Lesser/Library GPL is a copyleft license that grants a linking exception to developers, so that one can link a covered library from a program without releasing that program under a copyleft license.
However, none of the four freedoms is compromised by adopting the LGPL: developers who modify an LGPL-covered work still have to grant users access to the modified sources under the same license.
LGPL uses a different name than GPL
Since the very beginning, the license was not presented as “the GPL” but as a surrogate for the GPL, to be used strategically to maximize users’ freedoms.
To stay with the comparison, the current draft should be named the “Almost Open Source AI Definition”, the “Somewhat Forkable AI Definition”, the “Freely Fine-tunable AI Definition”, or even the “OSI-Certifiable AI Definition” — but not Open Source, since without access to training data users lose two of the four freedoms.
LGPL aims to maximize freedoms, not FSF income
The LGPL was designed to pragmatically maximize users’ freedoms, not to maximize the market of worthless certifications.
By contrast, an Open Source AI definition that does not require training data would severely inhibit users’ freedoms while raising the number of compliant AI systems.
So we would have more systems bragging about being “Open Source AI” and, at the same time, less freedom for their users.
Maybe a win-win for a hypothetical OSI Corporation (which could launch a lucrative business selling certificates of paper compliance) and for all the companies trying to escape the legal and technical scrutiny imposed by the AI Act.
But a net loss for everybody else.
LGPL is just one license
To be honest, I don’t know how many compromises the OSI made with the Open Source Definition, but reading the Fair License I can trust @stefano’s words about them.
However, the LGPL was just one license — just like the Fair License or the CAL.
The Open Source AI definition will impact the safety of all Europeans!
Thus I don’t think the LGPL can be cited as a precedent to justify an open-washing AI definition like the latest draft.
Am I missing something obvious?