I think the issue of certification may also come up in relation to the EU AI Act. After all, this act puts legal weight on the term “open source” and provides open source systems with certain exemptions — exemptions which are apparently attractive enough to induce companies like Meta and Mistral to go all-in on co-opting the term “open source”.
Now, the EU AI Act stipulates a “template” that specifies some forms of disclosure even for “open source” systems, and an “AI Office” that will draw up this template and presumably oversee its enforcement (though by which processes and with what powers is unclear at this point). As we point out in our FAccT paper:
If this exemption or one like it stays in place, it will have two important effects: (i) attaining open source status becomes highly attractive to any generative AI provider, as it provides a way to escape some of the most onerous requirements of technical documentation and the attendant scientific and legal scrutiny; (ii) an as-yet unspecified template (and the AI Office managing it) will become the focus of intense lobbying efforts from multiple stakeholders (e.g., [12]). Figuring out what constitutes a “sufficiently detailed summary” will literally become a million dollar question.
The ‘million dollar question’ is not even hyperbolic: Meta currently spends €8 million annually in Brussels lobbying on the DSA and the EU AI Act. If organisations like OSI and other open source players can play a role in regulation and certification (and especially in ensuring and advocating for maximum transparency), this might well strengthen the open source ecology.