Thanks, will do all that!
Start now! Install it, get a Python environment up and running if you haven’t already, and get that first play-around project working that you can build outwards from!
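A minimal sketch of that kind of first play-around project, assuming a locally running Ollama server on its default port (the model name `llama3.2:3b` is just an example of a small model you might have pulled; swap in your own):

```python
import json
import urllib.request

# Default endpoint for a local Ollama install (an assumption; adjust if yours differs)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send one prompt to the local server and return the reply text."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (needs the server running):
#   print(ask("llama3.2:3b", "Explain list comprehensions in one sentence."))
```

From here you can work outwards: add chat history, try different prompts, or time the responses.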
catty@lemmy.world (OP), Selfhosted@lemmy.world, “What is a self-hosted small LLM actually good for (<= 3B)”, 7 days ago:
Thanks, when I get some time soon I’ll have another look at it, and at Cherry AI with a local install of Ollama.
catty (OP), 10 days ago:
Any suggestions for solutions?
catty (OP), 10 days ago:
You’re conflating my asking how to use these tools with your misusing them. I see you still don’t accept that what you’re doing is wrong. But go you.
catty (OP), 11 days ago:
Please be very careful. The Python code it’ll spit out will most likely be outdated and won’t work as well as it should (the code isn’t “thought out” as if a human wrote it).
If you want to learn, dive in, set yourself tasks, get stuck, and f around.
catty (OP), 11 days ago:
Yeah, shell scripts are one of those things where you never remember how to do something and always have to look it up!
catty (OP), 11 days ago:
Was this system vibe coded? I get the feeling it was…
catty (OP), 11 days ago:
lol. Way to contradict yourself.
catty (OP), 11 days ago:
I haven’t actually found the coder-specific ones to be much (if at all) better than the generic ones. I wish I had. Hopefully LLMs can become more efficient in the very near future.
catty (OP), 11 days ago:
Some questions, and, since you don’t actually understand, the answers too.
- What context does the LLM operate on? (Other users’ data, owned by Twitch.)
- How is the LLM fed that data? (You store it and feed it to the LLM.)
- Do you use Twitch’s data and its users’ data through an AI without their consent? (Most likely, yes.)
- Do you have consent from the users to store “facts” about them? (You’re pissy, so obviously not.)
- Are you then storing that processed data? (Yes, you are: it’s written to a file.)
- Is the purpose of this data processing commercial? (Yes, it is: it’s designed to increase viewer count for the user of this system. And before you retort “OMG it helps Twitch too”: no, Twitch has the viewers either way; if they’re not watching him, they’re watching someone else.)
I mean, yeah, it’s a use case, but own up to the fact that you’re wrong. Or be pissy. I don’t care.
catty (OP), 11 days ago:
Doesn’t Twitch own all the data that’s written there? Their TOS will state something like you can’t store that data locally yourself.
catty (OP), 11 days ago:
No, what is it? How do I try it?
catty (OP), 12 days ago:
Surely none of that uses a small LLM <= 3B?
catty@lemmy.world (OP), Selfhosted@lemmy.world, “What can I use for an offline, selfhosted LLM client, pref with images, charts, python code execution”, 14 days ago:
But won’t this be a mish-mash of different Docker containers and projects, creating an installation, dependency, and upgrade nightmare?
catty (OP), 14 days ago:
But its website is in Chinese. Also, what’s the GitHub repo?
catty (OP), 15 days ago:
This looks interesting. Do you have experience with it? How reliable / efficient is it?
catty (OP), 15 days ago:
Try the beta on the GitHub repo, and use a smaller model!
catty (OP), 15 days ago:
I’m getting very near real-time responses on my old laptop. Maybe a delay of 1–2s whilst it creates the response.
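Part of why local setups can feel near real-time is streaming: servers like Ollama emit the reply as newline-delimited JSON chunks, so text appears as it is generated instead of after the full delay. A sketch of parsing that stream (the chunk shape follows Ollama’s streaming /api/generate responses; treat it as an assumption for other servers):

```python
import json
from typing import Iterable, Iterator

def stream_text(lines: Iterable[str]) -> Iterator[str]:
    """Yield the text fragment from each NDJSON chunk as it arrives.

    Ollama's streaming /api/generate replies look roughly like:
      {"response": "Hel", "done": false}
      {"response": "lo", "done": false}
      {"response": "", "done": true}
    """
    for line in lines:
        if not line.strip():
            continue  # skip blank keep-alive lines
        chunk = json.loads(line)
        if chunk.get("done"):
            break  # final chunk carries stats, no more text
        yield chunk.get("response", "")

# Example with canned chunks; in practice you'd iterate over the HTTP response body:
chunks = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo", "done": false}',
    '{"response": "", "done": true}',
]
reply = "".join(stream_text(chunks))  # "Hello"
```

Printing each fragment as it is yielded (rather than joining at the end) is what makes a 1–2s generation feel immediate.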
Sounds like a great first question! Go for it!