Ed Newton-Rex, a former head AI designer at TikTok and executive at Stability AI, has launched a nonprofit organisation called Fairly Trained, which has introduced a certification program called L Certification, akin to "fair trade" certification labels. The L certification, applicable to generative AI models, signifies that these models do not use copyrighted work without proper licensing. The organisation claims that by obtaining this certification, companies demonstrate their commitment to ethical training practices and respect for creators' rights. To obtain the certification label, a company must provide evidence that its training data are:

  • Explicitly licensed for training purposes,
  • Available in the public domain,
  • Offered under an appropriate open license, or
  • Already owned by the model developer.

Nine companies have received certification so far, including Bria AI and LifeScore Music. Bria AI licenses its image generator training data exclusively from legitimate sources such as Getty Images, while LifeScore Music licenses work from all major record labels for its music generation platform. Several other companies are in the process of completing their certification. Fairly Trained charges a certification fee ranging from $500 to $6,000, depending on the size of the applicant's business.

Why does it matter?

Considering the rise of copyright lawsuits against the generative AI industry in 2023, this certification process could be a first step in addressing the ethical use of AI. Essentially, the efforts of Fairly Trained and the certified companies contradict OpenAI's claim that it is impossible to build generative AI services like ChatGPT without using unlicensed data.

At the same time, Newton-Rex told Wired that overturning the AI industry's standard approach of scraping training data is still in its early stages. Copyright activist and former Recording Industry Association of America executive Neil Turkewitz also highlighted that the certification process is still too limited, as it does not confirm whether a company is behaving fairly or reflect creators' expectations for how they want their work to be used.
