China, Catching Up to ChatGPT, Moves to Regulate AI

China is moving to regulate artificial intelligence (AI). Under draft regulations released this week, Chinese tech companies must register generative AI products with China’s cyberspace agency and submit them for a security assessment before releasing them to the public. Beijing wants to control the technology, from how AI is trained to how users interact with it. Under the new rules, tech companies will be responsible for the “legitimacy of the source of pre-training data” so that content reflects the “core value of socialism.” AI must not call for the “subversion of state power” or the overthrow of the ruling Chinese Communist Party (CCP), incite moves to “split the country” or “undermine national unity,” or produce content that is pornographic or that encourages violence, extremism, terrorism, or discrimination. Users must verify their real identity before using these products.

Violators face fines of between 10,000 yuan ($1,454) and 100,000 yuan ($14,545) and potentially a criminal investigation.

The European Union has proposed the AI Act to classify which kinds of AI are “unacceptable” and banned, which are “high risk” and regulated, and which are left unregulated. The proposal follows the 2018 passage of the EU’s General Data Protection Regulation, so far one of the toughest data privacy-protection laws in the world. Brazil is also working toward AI regulation.

So far, Chinese regulators have introduced data privacy rules, created a registry of algorithms, and started to regulate deep synthesis, aka “deepfake,” technology. Big tech companies in China must follow the direction set by the party-state.

Two early Chinese chatbots were taken offline after they told users they did not love the CCP and wanted to move to the US. Chinese competitors to ChatGPT, including Baidu’s ERNIE, are trained on data from outside China’s “Great Firewall,” including information from Wikipedia and Reddit. Regulators may choose not to strictly enforce the rules initially unless they find particularly egregious violations, or they may decide to make an example of a particular company.
