I agree with @Mark that certification of open source AI will be a key issue under the AI Act. I would add that the open source status of the licenses themselves will also matter in this context.
The AI Act uses the term “under a free and open-source license”, which is itself somewhat confusing (I would expect “free and open”). One could assume that the definition is fairly clear and essentially covers OSI-compliant licenses. But I think one could just as well argue that responsible AI licensing fits the broad definition included in the Act.
Looking at it differently, the question of whether responsible AI licensing counts as a form of open-source licensing remains open today. And it needs to be resolved so that there is clarity about the AI Act’s open source exemptions.
There is a possible simple answer: these licenses are not OSI-compliant, so they are not open source. But I am not sure that answer will suffice, because responsible AI licenses are gaining significant traction with developers – as HuggingFace data shows, for example. So as a legislator I could see the sense of exempting them from some of the regulation as well.
But to complicate things even further, there are several licenses – such as the Llama and Falcon licenses – that introduce restrictions yet are dubbed “open source”. So the European AI Office, as it works to clarify open-source licensing, may face pressure to accept a definition that includes these various responsible / restrictive licenses.
The conversation on this forum has focused on issues related to systems and their compliance with the OSAID. I think all of this points to the need to reach consensus on the licenses themselves.