Source Archive Page

Garry Tan clarifies that he supports local models, announcing that OpenClaw now supports the high-efficiency inference server inferrs

Source author: Garry Tan (@garrytan)
Original source: https://x.com/garrytan/status/2042020290218373214

Overview

Garry Tan clarifies that he supports local models and announces that OpenClaw now supports inferrs, a high-efficiency inference server.

Original Text (Markdown)

Some folks try to spin a narrative that I don't like local models, meanwhile I spent a lot of time making it easy to use OpenClaw with them. Latest release adds support for inferrs, which is a new super efficient TurboQuant inference server: https://docs.openclaw.ai/providers/inferrs