Thanks also for starting this separate thread.
I suspect the majority of folks involved with AI systems will be able to use the OSAID to “self-certify”, e.g. “I meet the standard so I can use the open source label”. Lots of this will be straightforward and uncontroversial, needing minimal bottlenecks or external interference.
I agree with Zack that the most likely scenario where someone wants a form of objective “certification” will be where some kind of arbitration is required. Obvious mis-alignment with the OSAID will also be easy to identify. The real work will be in nuanced edge cases.
Focussing on the arbitration element, rather than some all-purpose certification process, is in my view worth seriously considering.
The OSD and the development of OSS licenses have benefitted from over 25 years of community practice and discussion. We have a better and more informed understanding of how the OSD works in practice, with good precedents to point to and expanded guidance alongside the OSD that supports the development of new licenses (https://opensource.org/licenses/review-process).
The OSAID is inevitably going to go through a similar maturation process, with the same discussions, precedent setting, and emergent good practice. There’s merit in thinking about how best to support that, as ultimately it will strengthen the OSAID as it has the OSD.
Having said all of that, I also wonder whether a simple self-certification tool/register would be of use to the community? Something quick, taking about 5 mins to fill in, which checks whether a system aligns with the definition or not (based on stating which licenses apply to which components), potentially with some info about versions and locations of components. Beyond making it really easy to check alignment with the OSAID, it would create a registry of what good practice looks like and promote transparency.
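To make the idea concrete, here is a minimal sketch of what the core check of such a tool might look like. Everything in it is illustrative: the approved-license set is a hypothetical subset, and a real tool would consult the actual OSI-approved license list and the OSAID's component categories rather than a flat name-to-license mapping.

```python
# Hypothetical sketch of the self-certification check described above.
# APPROVED_LICENSES is an illustrative subset only; a real tool would
# pull from the OSI license list and handle OSAID component classes.

APPROVED_LICENSES = {"Apache-2.0", "MIT", "BSD-3-Clause", "CC-BY-4.0"}


def check_alignment(components):
    """Given a dict of component name -> declared license ID,
    return (aligned, report) where report maps each component
    to True/False depending on whether its license is approved."""
    report = {name: lic in APPROVED_LICENSES for name, lic in components.items()}
    return all(report.values()), report


# Example submission: one component carries a non-approved license,
# so the system as a whole does not align.
aligned, report = check_alignment({
    "model-weights": "Apache-2.0",
    "training-code": "MIT",
    "dataset": "Proprietary-EULA",
})
print(aligned)  # False: the dataset license is not on the approved list
```

The per-component report is what would feed the public registry part of the idea: it shows exactly which pieces of a submission align and which don't, rather than a bare pass/fail.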