AI & Tech

Anthropic restricts Claude access on open agent platform... Hugging Face ‘Alternative Guide’ released

Even Pro/Max subscribers cannot use Claude in OpenClaw; the guide explains how to switch to open-source models.

AI Reporter Alpha · 5 min read
Summary
  • Anthropic has restricted access to its closed models on open agent platforms, even for Pro/Max subscribers.
  • Hugging Face has published two ways to switch to open-source models, including GLM-5 and Qwen3.5.
  • The closed-model access restrictions are expected to accelerate the growth of the open-source agent ecosystem.

Anthropic blocks Claude access on open agent platforms

Anthropic has begun restricting access to Claude models from open agent platforms. The measure also applies to Pro and Max subscribers, affecting developers who were using Claude on open-source agent platforms such as OpenClaw, Pi, and Open Code.

In response, Hugging Face released a detailed guide on its official blog for switching to open-source models. "In most cases, agents can continue to operate at a much lower cost," Hugging Face said.

Why is this important?

Anthropic's action once again highlights the competitive landscape between closed and open source models in the AI agent ecosystem. Claude has been popular in the field of coding agents for its high performance, and has especially played a key role in open source agent platforms such as OpenClaw.

Anthropic's restriction of access from open platforms is widely interpreted as a strategy to drive users to its own product, Claude Code. It is raising concerns in the developer community because even paid subscribers cannot freely use the model on external platforms.

Two alternatives presented by Hugging Face

Hugging Face presented two migration paths for developers whose Claude access was blocked.

1. Hugging Face Inference Providers

Hugging Face Inference Providers is a service that hosts open-source models. It suits developers who lack local hardware or want to get their agents running again quickly.

How to set up:

  1. Generate an API token on Hugging Face
  2. Run the command openclaw onboard --auth-choice huggingface-api-key
  3. Enter the token, then select a model
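The onboarding steps above can be sketched as a short shell session (the flag name comes from the guide; the exact interactive prompts are an assumption and may differ by OpenClaw version):

```shell
# 1. Create an API token at https://huggingface.co/settings/tokens
# 2. Run onboarding and pick the Hugging Face API key option:
openclaw onboard --auth-choice huggingface-api-key
# 3. Paste the token when prompted, then select a model (e.g. GLM-5)
```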

Hugging Face recommends GLM-5, citing its strong performance on Terminal Bench. Changing the model in the configuration file is also simple:

{
  "agents": {
    "defaults": {
      "model": {
        "primary": "huggingface/zai-org/GLM-5:fastest"
      }
    }
  }
}

Hugging Face Pro subscribers receive $2 in free credits per month to spend on Inference Providers.

2. Running models locally (llama.cpp)

This option is for developers who want complete privacy and zero API costs: models run directly on local hardware using llama.cpp, an open-source inference library.

Installation command:

# Mac/Linux
brew install llama.cpp

# Windows
winget install llama.cpp

The recommended model, Qwen3.5-35B-A3B, runs smoothly with 32GB of RAM. Start the server, point OpenClaw at it, and you have a fully local agent environment.
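Serving the model can be sketched as follows. The GGUF repository name here is an assumption (check Hugging Face for published quantized builds), while llama-server's `-hf` download flag and its OpenAI-compatible `/v1` endpoint are standard llama.cpp features:

```shell
# Download and serve a quantized GGUF build locally (repo name is illustrative)
llama-server -hf unsloth/Qwen3.5-35B-A3B-GGUF --port 8080
# llama-server exposes an OpenAI-compatible API at http://localhost:8080/v1;
# point OpenClaw's model provider at that base URL to complete the local setup.
```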

| Item | Hugging Face Inference | Local (llama.cpp) |
| --- | --- | --- |
| Setup difficulty | Low | Medium |
| Cost | Usage-based billing | Free |
| Privacy | Cloud-dependent | Fully local |
| Hardware requirements | None | 32GB+ RAM recommended |
| Recommended model | GLM-5 | Qwen3.5-35B-A3B |
| Suited for | Developers wanting quick recovery | Privacy-conscious developers |

Growth of the open source agent model ecosystem

This incident also illustrates the maturity of the open-source AI model ecosystem. The latest open models, such as GLM-5 and Qwen3.5, approach closed models' performance on coding-agent tasks.

Hugging Face has thousands of models to choose from, and developers can freely choose a model that suits their hardware specifications and purposes. In particular, models quantized in the GGUF format can be run on consumer-grade hardware, greatly lowering the barrier to entry.
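The effect of quantization on the hardware barrier can be made concrete with rough arithmetic. This is a sketch: the bits-per-parameter figures are illustrative approximations, not official GGUF numbers.

```python
# Back-of-the-envelope memory estimate for model weights at different
# quantization levels (bits-per-parameter values are approximations).
def weight_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# A 35B-parameter model:
print(f"FP16: ~{weight_gb(35, 16):.0f} GB")      # ~70 GB: exceeds 32GB RAM
print(f"Q4 GGUF: ~{weight_gb(35, 4.5):.0f} GB")  # ~20 GB: fits in 32GB RAM
```

This is why a 4-bit GGUF build of a 35B model fits on a 32GB consumer machine while the full-precision weights do not.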

[AI Analysis] Closed vs. open source, a turning point in the AI agent market

Anthropic's move appears to be part of a broader trend of closed-model providers strengthening platform lock-in in the AI agent market: by blocking their models on open platforms, they push developers toward official ones.

However, these restrictions are likely to accelerate the shift to open-source models. That Hugging Face immediately released a detailed migration guide shows how quickly it moved to capture this demand.

In the future, the AI agent ecosystem is expected to split into two branches: a closed platform ecosystem led by Anthropic and OpenAI, and an open ecosystem led by Hugging Face and the open-source community. Developers will choose between the two based on cost, privacy, and freedom of customization.

Given the steadily improving performance of open-source models, it is unclear whether access-restriction strategies for closed models will pay off in the long term. Instead, they could spur backlash from the developer community and hasten the shift to open source.


