#openwebui

Mike Stone:
@Gina If I had to choose, I'd probably go with #Ollama (which has been mentioned several times already). It's licensed under the MIT license and the models are about as close to open source as you can get. When I play with LLMs, it's what I use. Locally run, and with an API that could be used to integrate with other stuff. I also have #OpenWebUI to make things prettier. Both can run locally, though OpenWebUI can integrate with cloud LLMs too. Of course, tomorrow everything could change.
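As a rough illustration of the kind of local integration mentioned above, here's a minimal sketch that queries Ollama's HTTP API on its default port; the model name is just an example and assumes it has already been pulled.

```python
# Minimal sketch: query a locally running Ollama instance over its HTTP API.
# Assumes Ollama is listening on the default port (11434) and that the
# example model ("llama3") has already been pulled with `ollama pull`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # example model name; substitute your own
        "prompt": "Summarize the MIT license in one sentence.",
        "stream": False,     # return a single JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```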
Stephen McNamara:
#nixos roadblock getting an m.2 #google Coral TPU up 😮‍💨 no more steam left this Sunday morning. Swapping for #Ubuntu and Docker so I can start testing with #OpenWebUI.

I have not been defeated yet though! Once my PCIe to m.2 e-key adapter shows up, I will attempt to get the device working with NixOS once again. I want the device on the main app server.
Jcrabapple:
I basically have a DIY Perplexity setup running in OpenWebUI (which is running politely alongside Plex). I'm using Mistral-Large with web search via SearXNG and the system prompt that Perplexity uses for their Sonar Pro searches.

And since OpenWebUI has an OpenAI-compatible API, I can connect to it from this GPTMobile app on my phone and query my custom search assistant model.

#AI #LLM #LLMs #OpenWebUI #Mistral #Perplexity
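Since the post leans on OpenWebUI's OpenAI-compatible API, here's a minimal sketch of what an external client connection might look like; the local URL, port, model id, and environment variable are assumptions for illustration, and the API key would come from OpenWebUI's own account settings.

```python
# Minimal sketch: talk to OpenWebUI through its OpenAI-compatible endpoint.
# The URL, port, model id, and env var name are assumptions for illustration;
# an API key can be generated in OpenWebUI's account settings.
import os
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/api",       # assumed local OpenWebUI address
    api_key=os.environ["OPENWEBUI_API_KEY"],    # key created in OpenWebUI settings
)

reply = client.chat.completions.create(
    model="mistral-large",                      # example model id configured in OpenWebUI
    messages=[{"role": "user", "content": "What is SearXNG?"}],
)
print(reply.choices[0].message.content)
```

Any client that speaks the OpenAI wire format (like the GPTMobile app mentioned above) can point at the same endpoint instead of a hosted provider.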
Jcrabapple:
I set up #OpenWebUI on one of my more powerful servers, and it is fantastic. I'm running a couple of smaller local Llama models, and I hooked up my Anthropic and OpenRouter API keys to get access to Claude and a bunch of other models, including Mistral and DeepSeek. I also linked up my Kagi search API key to give web search capabilities to the models that don't have a web index. I will probably lower my Kagi Ultimate subscription to Professional since I no longer have a need for their Assistant.

#AI #LLM #LLMs #selfhosted #selfhosting
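For the external providers mentioned here, OpenRouter also exposes an OpenAI-compatible API, so a quick sanity check that a key works before wiring it into OpenWebUI's connection settings might look like the sketch below; the environment variable name is just an example.

```python
# Minimal sketch: verify an OpenRouter key by listing the models it can reach.
# OpenRouter speaks the OpenAI-compatible API at https://openrouter.ai/api/v1;
# the env var name is an example, not anything OpenWebUI itself requires.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

for model in client.models.list().data[:10]:   # print the first few model ids
    print(model.id)
```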