Hi! My name is Sergii and I am the developer behind Geeps. I'm based in Canada and at work I build software for the healthcare industry. Geeps is a project that I develop in my spare time.
Soon after the release of ChatGPT, OpenAI gave developers access to the same powerful models through its API. Projects like Chatbot UI and MacGPT quickly became popular and demonstrated that users could connect to top models like GPT-4 using their own API keys. I really liked this approach: I could still pay for this valuable service while making my usage more flexible and cost-efficient.
The pace of development in the AI world is insane, and the big players like OpenAI, Google, Anthropic, and others release new models and capabilities on a monthly basis. All of these providers naturally try to lock you into their ecosystems by bundling many capabilities into their Pro/Max subscriptions.
Those are really great offerings, and I'm a subscriber myself, but their combined cost can escalate quickly, especially if you want to use SOTA models: a top-tier subscription from each of those companies costs $200+ per month, and that's without counting other players in the field like xAI, Mistral, Cohere, and all the Chinese labs, each charging its own monthly subscription.
And if you don't pay a subscription? The ads are coming your way!
My personal preference is to access intelligence as a utility and pay only for what I use. It's great to be able to switch between providers and models based on the task at hand.
Fortunately, this approach is still possible through direct APIs and already works great in AI-assisted programming apps like Cursor, Copilot, Cline, Claude Code, OpenCode, Codex, Pi, and countless others. There are also excellent LLM routers like OpenRouter that provide flexible access to multiple providers and models. The aim of Geeps is to be a simple, portable, cross-platform client that works with multiple AI providers in a single app.