MCP.Pizza Chef: akramsheriff5
I’ve developed a scalable MCP (Model Context Protocol) client designed to work seamlessly with local LLMs. It currently runs against LLaMA 3.2 and is architected to support additional models with minimal configuration.
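As a rough illustration of the "multiple models with minimal configuration" idea, the sketch below shows one common pattern: a small registry that maps model names to their connection settings, so swapping models is a one-line change. All names, endpoints, and context-window values here are hypothetical placeholders, not the actual implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelConfig:
    """Connection settings for one local model (illustrative fields only)."""
    name: str
    endpoint: str        # e.g. a local inference server URL
    context_window: int  # max tokens the model accepts

# Hypothetical registry: adding a model is just adding an entry here.
MODEL_REGISTRY = {
    "llama3.2": ModelConfig("llama3.2", "http://localhost:11434", 128_000),
    "mistral":  ModelConfig("mistral",  "http://localhost:11434", 32_000),
}

def resolve_model(name: str, default: str = "llama3.2") -> ModelConfig:
    """Look up a model by name, falling back to the default if unknown."""
    return MODEL_REGISTRY.get(name, MODEL_REGISTRY[default])

# The client would resolve its backend once at startup:
cfg = resolve_model("mistral")
print(cfg.name, cfg.endpoint)
```

With this shape, the MCP client itself stays model-agnostic: it only ever talks to whatever `ModelConfig` it was handed, and an unrecognized model name degrades gracefully to the default.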