This project is a proxy server that lets GitHub Copilot use local LM Studio models by exposing an Ollama-compatible API. Currently, setup requires manual Python environment configuration and local ...
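The core translation the proxy performs can be sketched as a pure function: LM Studio serves an OpenAI-style `/v1/models` payload, while Copilot (speaking Ollama's protocol) expects the shape of Ollama's `/api/tags` response. The exact field set below is an assumption for illustration, not the project's actual mapping:

```python
import json


def lmstudio_models_to_ollama_tags(lmstudio_response: dict) -> dict:
    """Translate an OpenAI-style /v1/models payload (as served by LM Studio)
    into the shape of Ollama's /api/tags response.

    The placeholder metadata fields (size, digest, details) are assumptions;
    real Ollama responses carry model metadata the proxy would need to fill in.
    """
    return {
        "models": [
            {
                "name": model["id"],
                "model": model["id"],
                "size": 0,       # placeholder: LM Studio does not report size here
                "digest": "",    # placeholder: no content digest available
                "details": {},   # placeholder: family/quantization metadata
            }
            for model in lmstudio_response.get("data", [])
        ]
    }


# Example: a minimal LM Studio-style model list (model name is hypothetical).
sample = {"object": "list", "data": [{"id": "qwen2.5-coder-7b", "object": "model"}]}
print(json.dumps(lmstudio_models_to_ollama_tags(sample)))
```

A real proxy would wrap functions like this behind HTTP routes, forwarding each Ollama-style request to the corresponding LM Studio endpoint and translating both directions.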
If you're behind a corporate firewall or on a restricted network, and your access to license services or other online features is limited, configure a proxy in ...
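In a typical setup, proxy configuration is supplied through the standard `HTTP_PROXY`/`HTTPS_PROXY` environment variables, which most Python HTTP clients honor. The host and port below are placeholders; substitute the values from your network administrator:

```shell
# Route outbound traffic through a corporate proxy.
# "proxy.internal:3128" is a placeholder -- use your own proxy address.
export HTTP_PROXY="http://proxy.internal:3128"
export HTTPS_PROXY="http://proxy.internal:3128"

# Local traffic to LM Studio should bypass the proxy.
export NO_PROXY="localhost,127.0.0.1"
```

With `NO_PROXY` set, requests to the locally running LM Studio server go direct while license or update checks traverse the proxy.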