- Aren’t “privacy-preserving” techniques contradictory to open source in the first place? You might think it is safe to “open-source” a federated-learning model trained on private face data, but there are many known methods to extract private training data from such a model, even if the reconstruction is not byte-for-byte identical.
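One of the simplest such methods is a membership-inference attack: an overfit model assigns noticeably lower loss to its own training samples than to unseen ones, so an attacker who can query the model can tell whether a given record was in the private training set. Below is a minimal sketch of this idea; the model, data, and threshold choice are all illustrative assumptions (using scikit-learn), not taken from any specific federated-learning system.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic "private" training data (members) and unseen data (non-members),
# drawn from the same distribution.
X_members = rng.normal(size=(50, 20))
y_members = (X_members[:, 0] > 0).astype(int)
X_nonmembers = rng.normal(size=(50, 20))
y_nonmembers = (X_nonmembers[:, 0] > 0).astype(int)

# Deliberately overfit a small model on the members only.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
model.fit(X_members, y_members)

def per_sample_loss(model, X, y):
    # Negative log-likelihood of the true label for each sample.
    p = model.predict_proba(X)
    return -np.log(np.clip(p[np.arange(len(y)), y], 1e-12, None))

loss_in = per_sample_loss(model, X_members, y_members)
loss_out = per_sample_loss(model, X_nonmembers, y_nonmembers)

# The attacker guesses "member" whenever the loss falls below a threshold.
threshold = np.median(np.concatenate([loss_in, loss_out]))
tpr = np.mean(loss_in < threshold)   # members correctly flagged
fpr = np.mean(loss_out < threshold)  # non-members wrongly flagged
print(f"member mean loss={loss_in.mean():.3f}, "
      f"non-member mean loss={loss_out.mean():.3f}, "
      f"TPR={tpr:.2f}, FPR={fpr:.2f}")
```

The gap between member and non-member loss is exactly the leakage: the attacker never sees the training set, only the model’s outputs, yet can still distinguish who was in it. Real attacks on face models go further (model inversion can reconstruct recognizable face images), which is why releasing the model weights alone does not make the underlying data private.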
- “Access to data”: anyone should be able to download the original training/validation datasets anonymously, free of charge, and without registration. An example is the COCO dataset: COCO - Common Objects in Context