# LLM-Proxy-Metrics
A Go implementation of a reverse proxy for Ollama and OpenAI endpoints that gathers Prometheus metrics about the traffic it forwards.
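
As a rough illustration of the idea (not the actual implementation in this repo), the sketch below shows one common way to build such a proxy in Go: `httputil.NewSingleHostReverseProxy` forwards requests to the backend, while a small wrapper records request latency and status codes in a Prometheus histogram exposed on `/metrics`. The metric name, label set, ports, and upstream URL are all assumptions for the example.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"strconv"
	"time"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

// requestDuration tracks proxied request latency, labeled by path and status.
// Metric name and labels are illustrative, not this project's actual schema.
var requestDuration = promauto.NewHistogramVec(
	prometheus.HistogramOpts{
		Name: "llm_proxy_request_duration_seconds",
		Help: "Duration of requests proxied to the LLM backend.",
	},
	[]string{"path", "status"},
)

// statusRecorder captures the status code the proxy writes to the client.
type statusRecorder struct {
	http.ResponseWriter
	status int
}

func (r *statusRecorder) WriteHeader(code int) {
	r.status = code
	r.ResponseWriter.WriteHeader(code)
}

func main() {
	// Hypothetical upstream; a local Ollama server typically listens on :11434.
	upstream, err := url.Parse("http://localhost:11434")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(upstream)

	// Wrap the proxy so every forwarded request is timed and labeled.
	handler := http.HandlerFunc(func(w http.ResponseWriter, req *http.Request) {
		rec := &statusRecorder{ResponseWriter: w, status: http.StatusOK}
		start := time.Now()
		proxy.ServeHTTP(rec, req)
		requestDuration.
			WithLabelValues(req.URL.Path, strconv.Itoa(rec.status)).
			Observe(time.Since(start).Seconds())
	})

	mux := http.NewServeMux()
	mux.Handle("/metrics", promhttp.Handler()) // Prometheus scrape endpoint
	mux.Handle("/", handler)                   // everything else is proxied

	log.Fatal(http.ListenAndServe(":8080", mux))
}
```

With a setup like this, clients point at the proxy (here `:8080`) instead of the backend, and Prometheus scrapes `/metrics` on the same port to collect per-path latency histograms.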