Andy API Local Client

Contribute your GPU power to the distributed AI compute pool.

Could Not Connect to Local Client

We tried to connect to your local client at http://localhost:5000, but it doesn't seem to be running. The local client allows you to share your computer's processing power with the Andy API network, helping to run AI models for everyone.
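The check this page performs can be sketched as a simple HTTP probe. This is a minimal illustration, not the page's actual implementation: it only tests whether anything answers at the root URL, since the client's exact status endpoint isn't specified here.

```python
import urllib.request
import urllib.error

def is_client_running(url="http://localhost:5000", timeout=3):
    """Return True if an HTTP server answers at `url`, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Any 2xx/3xx response means something is listening.
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, or DNS failure: no client running.
        return False

print(is_client_running())
```

If this prints `False`, work through the steps below and run the check again.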

1. Install Ollama

Ollama is required to run the language models on your machine. Download and install it from the official website.
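Once Ollama is installed and running, you can confirm it is reachable before starting the local client. This sketch queries Ollama's standard local API (it listens on port 11434 by default and lists installed models at `/api/tags`); the function name and error handling here are illustrative.

```python
import json
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434"  # Ollama's default API address

def list_local_models(base_url=OLLAMA_URL, timeout=3):
    """Return the names of locally installed Ollama models,
    or None if the Ollama server is not reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

models = list_local_models()
if models is None:
    print("Ollama is not running -- start it, then try again")
else:
    print("Installed models:", models)
```

An empty list means Ollama is running but no models have been pulled yet; `None` means the server itself isn't up.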

2. Get the Local Client

Download the Andy API Local Client from its GitHub repository. This contains the script you'll need to run.

3. Run the Client

Follow the instructions in the `README.md` file on GitHub to install dependencies and start the client. Once it's running, refresh this page or click the "Local Client" link in the navigation bar again.
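If you'd rather not refresh by hand, you can poll until the client comes up. This is a hedged sketch: it reuses the http://localhost:5000 address named above, and the retry counts and delay are arbitrary illustrative values.

```python
import time
import urllib.request
import urllib.error

def wait_for_client(url="http://localhost:5000", attempts=10, delay=2, timeout=3):
    """Poll `url` until an HTTP server answers; return True on success,
    or False after `attempts` failed tries."""
    for i in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=timeout):
                return True
        except (urllib.error.URLError, OSError):
            # Not up yet; wait before the next try (skip the final sleep).
            if i < attempts - 1:
                time.sleep(delay)
    return False

if wait_for_client():
    print("Local client is up -- refresh the page")
else:
    print("Client never answered; check the README troubleshooting steps")
```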