Hi! My name is Sergii and I'm the developer behind Geeps. I'm based in Canada; at my day job I build software for the healthcare industry, and Geeps is a project I develop in my spare time.
The release of ChatGPT in 2022 unleashed a new tech revolution, and the API that followed gave developers access to the same powerful models that power it. Projects like Chatbot UI and MacGPT quickly became popular and demonstrated that users could connect to the best models, like GPT-4, using their own API keys. I really liked this approach: I could still pay for a valuable service while making my usage more flexible and cost-efficient.
However, most of the apps that worked with my key were either too technical, limited to macOS, or had interfaces I didn't like. I wanted a simpler app that would work on my iPhone, iPad, and Mac, and that less tech-savvy people could also use. At the same time, I wanted to catch up with SwiftUI development and dig deeper into the APIs from OpenAI and other providers.
The pace of development in the AI world is only accelerating, and the big players like OpenAI, Google, Anthropic, and others ship new models and capabilities at an unprecedented rate. This week it may be a new Claude, and a week later it's Gemini or GPT or something else. And all these providers naturally try to lock you into their ecosystems by bundling many capabilities into their Pro/Max subscriptions.
Those are really great offerings, and I'm a subscriber myself, but their combined cost can escalate quickly, especially if you want to use SOTA models: a top-tier subscription from each of those companies costs $200+ per month, and that's without mentioning other players in the field like xAI, Mistral, Cohere, all the Chinese labs, etc., who each charge their own monthly subscription.
My personal preference is to access intelligence as a utility and pay only for what I use. I would like to avoid hard lock-in to a single vendor's ecosystem and prefer the option to switch between providers and models based on specific needs. And I don't want to juggle multiple apps and subscriptions to keep up with the latest developments.
Fortunately, this approach is possible through direct APIs and already works great in AI-assisted programming tools like Cursor, GitHub Copilot, Cline, Roo, Claude Code, OpenCode, and countless others. There are also excellent LLM routers like OpenRouter that give flexible access to multiple providers and models. I hope Geeps can fit into the same category: a simple, portable, cross-platform client that works with multiple AI providers, letting you choose the best model for your needs, all from a single app.