The smallest resolution the full-featured UI could support would be a tablet/iPad. I could look at getting the core UI functionality working with model selection and thread view, but many of the extensions are unlikely to ever support a mobile resolution. The smallest resolution I would consider supporting for the core functionality would be 6.7" (2796x1290).
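To make the 6.7" figure concrete: 2796x1290 are physical pixels, and at a typical 3x device pixel ratio that works out to roughly 932x430 CSS pixels in portrait. A minimal sketch of what a breakpoint gate for the core UI might look like (the names, the assumed DPR, and the helper are all hypothetical, not taken from the project):

```typescript
// Assumed physical portrait resolution of the smallest supported device (6.7").
const PHYSICAL = { width: 1290, height: 2796 };

// Assumption: a typical 3x device pixel ratio for a phone of this class.
const assumedDpr = 3;

// Convert physical pixels to CSS pixels: 1290 / 3 = 430, 2796 / 3 = 932.
const cssWidth = PHYSICAL.width / assumedDpr;
const cssHeight = PHYSICAL.height / assumedDpr;

// A media-query string the core UI could gate on (hypothetical, not actual project code).
const coreUiQuery = `(min-width: ${cssWidth}px)`;

// Hypothetical helper: is a given viewport width wide enough for the core UI?
function isSupportedViewport(widthPx: number): boolean {
  return widthPx >= cssWidth;
}
```

Under these assumptions, a 430px-wide viewport would just clear the bar, while narrower phones would fall below it.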
For my friends and family I run an open-webui instance where they can talk to whatever OpenRouter models they want. As for myself, I strongly prefer to run this llmspy tool on localhost: the web interface is much faster and more lightweight, image generation is built in, and creating extensions is straightforward.

After the recent introduction of the basic user management system to this project, I briefly experimented with making an llmspy instance available online for my friends, but I found that almost nobody used it: they mostly chat with models from their smartphones, and the current interface is essentially unusable from a mobile device.

I would be happy with even a highly simplified mobile-friendly interface: for example, a list of models, a list of previous chats, and the currently opened chat, nothing else (no "model settings" dialog, no gallery, no analytics, no skill browsers, just a basic on/off toggle for all tools, etc.).

I'm not sure if this would be considered in scope for llmspy, since this suggestion only makes sense for the "chat with an LLM" use case -- I use opencode when I want an agent to edit files and run tools, but I guess the project creator has a quite different use case in mind.