Setting up OpenClaw with Ollama Cloud
I wrote an article on how I set up OpenClaw using Ollama, while using Ollama cloud models.
It is straightforward, easy to do, and very affordable.
The AI can read and write files, do research on the web, manage task lists, write computer code, and more.
Here’s my 8-step guide to getting started:
To run OpenClaw and a good LLM entirely locally, I would look at a Mac Studio with a ton of RAM, probably 128 GB. Expensive! But that is my plan for the future.
Right now I have OpenClaw running on my Linux box, but for the LLM I am pointing it at a cloud-hosted model (Qwen 3.5).
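Under the hood, "pointing it at a cloud model" just means OpenClaw talks to a running Ollama server, which serves local and cloud-backed models through the same chat API. Here is a minimal sketch of that kind of request, assuming Ollama's standard `/api/chat` endpoint on the default port; the model name is illustrative, so substitute whatever `ollama list` shows on your machine or account:

```python
import json
from urllib import request

# Assumption: a running Ollama server on the default port. Cloud-backed
# models are served through the same endpoint as local ones.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, prompt: str) -> bytes:
    """Build the JSON body Ollama's /api/chat endpoint expects."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete reply, not a token stream
    }
    return json.dumps(payload).encode("utf-8")

def ask(model: str, prompt: str) -> str:
    """Send a single chat turn to the Ollama server and return the reply text."""
    req = request.Request(
        OLLAMA_URL,
        data=build_chat_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

# Example (needs `ollama serve` running and the model pulled/signed in):
# print(ask("qwen3.5:9b", "Say hello in five words."))
```

OpenClaw does this wiring for you once you give it the server URL and model name; the sketch just shows there is no extra magic between the agent and the model.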
I first tried it on my M4 Mac Mini, running OpenClaw with Qwen3.5-9b completely locally, but it was painfully slow.
I don't have my agents (castle residents!) talking with each other yet, but that sounds fun 🙂