#gpt4all

Anton
What does #mastodon like for #gpt4all models? Do you all go for #Llama 3 8B?
Please share this post. #ai #MetaAI #CustomGPT
🇩🇪 Tinca Tinca 🐟
Dear Nerdfront!
Who can tell me which model in #GPT4All is best suited for creating German texts? It's only for private use (e.g. help with writing job applications). Thanks 🙂!

#nomic #llm #gpt #ai
Sebastian Meineck
With #GPT4All you can get local #Sprachmodelle (language models) onto your laptop without a lot of fiddling, and use them offline. I've written up what you can do with them in the latest Online-Recherche newsletter.

https://sebmeineck.substack.com/i/160694890/gptall-lokale-sprachmodelle-ohne-hurden
Kuketz-Blog 🛡
#UnplugTrump - Tip 24:
Avoid AI offerings from big US corporations such as ChatGPT and Gemini. These systems are trained on questionable data, violate copyright, and influence opinions. Use open-source alternatives like Ollama or GPT4All instead, which you can run locally, without surveillance by Big Tech.

#AI #GPT4All #Ollama #Privacy #FOSS
ricardo :mastodon:
How to Set Up #GPT4All for #AI Editing in #ONLYOFFICE on #Ubuntu #Linux

https://www.tecmint.com/gpt4all-ai-editing-in-onlyoffice/
Kuketz-Blog 🛡
I've added the topic "AI tools" to the recommendations section (Empfehlungsecke). The solutions presented there are all privacy-friendly, since they can be used offline and work with local language models. 👇

https://www.kuketz-blog.de/empfehlungsecke/#ki-tools

#KI #gpt4all #lmstudio #noscribe
Karl Voit :emacs: :orgmode:
I played around with local #LLMs like #Llama 3.1 via the #GPT4all UI. GPT4All works fine, but it ties me to a single host.

Can somebody suggest a similar local web-based UI, so that I can use the LLMs on my main machine from other hosts on the same LAN via a browser?

I don't need fancy bells and whistles; I'd prefer a rather simple setup with a selection of LLMs to download and a plain chat window for my prompts and answers.

#AI
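A minimal sketch of the kind of cross-host access being asked about here, assuming the machine running the models exposes an OpenAI-compatible HTTP endpoint (for example GPT4All's optional local API server); the LAN address, port, and model name below are placeholder assumptions, not verified defaults:

```python
import requests

# Hypothetical address of the LAN machine that hosts the local LLM.
# An OpenAI-compatible chat endpoint is assumed; adjust host/port to your setup.
API_URL = "http://192.168.1.50:4891/v1/chat/completions"

payload = {
    "model": "Llama 3 8B Instruct",  # whatever model is currently loaded on the host
    "messages": [
        {"role": "user", "content": "Summarise the appeal of local LLMs in one sentence."}
    ],
    "max_tokens": 128,
}

# Send the prompt across the LAN and print the generated reply.
response = requests.post(API_URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Any simple browser-based chat front end that can talk to an OpenAI-compatible endpoint could in principle be pointed at the same address, which would cover the "other hosts on the same LAN" part of the question.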
David Clubb
#GPT4All's new update has amazingly captured my brain from the moment I wake up.

#AI
David Clubb
Ok, I *think* the #GPT4All LocalDocs functionality is now working (it wasn't for me about a month ago).

This could be a game changer for my research/analysis...

https://www.nomic.ai/gpt4all
Karl Voit :emacs: :orgmode:
Meta #llama 3.1 8B via #GPT4all on my Intel Core i5-13500 (no dedicated graphics card) runs in CPU-only mode at roughly one word every 3-4 seconds, consuming all of the 6+8 physical cores.

Most of my rare questions for the #LLM can be sent off, generated in the background, and read later on.

For occasional use, that's a price I'm willing to pay for not uploading my tasks to a cloud service.

#privacy #AI
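For context, a minimal sketch of the same kind of CPU-only generation driven from the GPT4All Python bindings, assuming the `gpt4all` package is installed; the model file name and thread count below are illustrative assumptions, not a prescribed configuration:

```python
from gpt4all import GPT4All

# Illustrative model file name; the exact .gguf file in the GPT4All catalogue may differ.
MODEL = "Meta-Llama-3-8B-Instruct.Q4_0.gguf"

# With no dedicated GPU, generation runs on the CPU; n_threads limits how many
# cores the backend may occupy, trading throughput against system responsiveness.
model = GPT4All(MODEL, n_threads=8)

with model.chat_session():
    # Send the prompt, let it generate in the background of your day, read it later.
    answer = model.generate(
        "Explain in two sentences why CPU-only inference is slower than GPU inference.",
        max_tokens=200,
    )
    print(answer)
```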
David Clubb
@alexisbushnell You can run your own GPT on your own computer and avoid all the dodgy stuff. It's especially easy on Linux via Flathub.

I'm using it right now and it's better than some more famous online tools: https://flathub.org/apps/io.gpt4all.gpt4all

#GPT4All
David Clubb
Honestly, there's a lot to like about this chat AI tool that's hosted on your own machine.

You can grab your own copy from Flathub: https://flathub.org/apps/io.gpt4all.gpt4all

#GPT4All

Kudos to Andriy Mulyar and Yaroslav Halchenko.

(I got recommended this by the fab podcast @DestinationLinux)