Anthropic's AI Code Leak Sparks Frenzy Among Chinese Developers Amid Export Ban

American artificial intelligence firm Anthropic has consistently advocated for stringent export controls on U.S. AI software and hardware to China, with CEO Dario Amodei frequently labeling China as an adversarial nation. Mirroring the stance of tech giants such as Microsoft-backed OpenAI and Alphabet's Google, Anthropic has withheld its services from mainland China, citing national security concerns. As a result, China sits on a restricted list, alongside Russia, North Korea, Afghanistan, Iran, and Cuba, where access to the Claude chatbot and Anthropic's advanced AI models is blocked entirely.

Accidental Source Code Exposure Triggers Widespread Interest

Last week, a significant security breach occurred when Anthropic inadvertently made the source code for its acclaimed vibe-coding tool, Claude Code, publicly accessible. This revelation quickly spread across developer communities, with the code being reposted extensively on the popular platform GitHub. The incident particularly captivated Chinese developers, sparking a wave of intense activity and analysis.

The leaked source code, comprising over 512,000 lines embedded deep within the software package, was first identified and decrypted by software engineer and cybersecurity researcher Shou Chaofan, who shared the findings on Twitter and drew immediate attention. According to a report from the South China Morning Post, Chinese developers have been enthusiastically engaging with the leaked material, rushing to download copies and meticulously examining the files to uncover every technical detail.


Advanced Capabilities Drive Enthusiasm

What reportedly fuels this enthusiasm among Chinese developers is the advanced coding prowess of Anthropic's AI models. On various Chinese online forums, users have been actively sharing insights they consider the "secret recipe" behind Claude Code, discussing elements such as its architectural design, agent frameworks, and memory mechanisms. One particularly popular discussion thread, titled "Claude Code source code leak incident," has amassed millions of views, with local developers exchanging techniques they have learned and strategizing on how best to apply the tool.

While some industry experts note that the leaked files contained only the code for Claude Code and not the critical model weights, others view the data as an invaluable resource. Zhang Ruiwang, a Beijing-based IT system architect, told the South China Morning Post that the code batches are "a treasure for AI companies or developers" because they revealed the key engineering decisions Anthropic made.

Background of Tensions and Allegations

This leak emerges merely a month after Anthropic leveled serious accusations against Chinese AI companies for data theft. In a February blog post, Anthropic alleged that three Chinese firms—DeepSeek, Moonshot AI, and MiniMax—created over 24,000 fraudulent accounts to interact with the Claude AI model. These accounts reportedly prompted Claude more than 16 million times, extracting information to train and enhance their own products.

Anthropic acknowledged that distillation techniques have legitimate uses, such as building smaller versions of proprietary products, but warned that they can also be exploited to develop competing offerings in a fraction of the time and at a fraction of the cost. This context underscores the ongoing technological rivalry and security concerns shaping U.S.-China relations in the AI sector.

The incident highlights the complex dynamics of global AI competition, where security measures and accidental disclosures can significantly impact developer communities and international tech policies.
