Some folks try to spin a narrative that I don't like local models, when in fact I've spent a lot of time making it easy to use OpenClaw with them. The latest release adds support for inferrs, a new, highly efficient TurboQuant inference server: https://docs.openclaw.ai/providers/inferrs