coffee_with_cream@sh.itjust.works to Selfhosted@lemmy.world • Self-Hosted AI is pretty darn cool
Imo it’s worthwhile to just run the biggest model available and rent expensive GPU time. The cost still amounts to very little overall, and you get much better results. Project dependent, of course.
You probably want 48 GB of VRAM or more to run the good stuff. I recommend renting GPU time instead of using your own hardware, via AWS or other vendors; runpod.io is pretty good.
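For a rough sense of what fits in 48 GB, a back-of-envelope estimate (parameter count × bytes per parameter, plus some overhead for KV cache and activations) gets you most of the way. The sketch below uses illustrative model sizes and an assumed 20% overhead factor, not measured figures:

```python
# Back-of-envelope VRAM estimate for loading an LLM.
# Weights only, plus a rough overhead factor for KV cache / activations.

BYTES_PER_PARAM = {"fp16": 2, "int8": 1, "int4": 0.5}

def vram_needed_gb(params_billion: float, precision: str = "fp16",
                   overhead: float = 1.2) -> float:
    """Approximate VRAM (GB) needed to load a model of the given size."""
    weight_gb = params_billion * BYTES_PER_PARAM[precision]  # ~1 GB per billion params per byte
    return weight_gb * overhead

for size in (7, 13, 34, 70):
    for prec in ("fp16", "int4"):
        print(f"{size}B @ {prec}: ~{vram_needed_gb(size, prec):.0f} GB")

# A 70B model in fp16 (~168 GB) won't fit on a single 48 GB card,
# but a 4-bit quantized 70B (~42 GB) roughly does.
```

By that math, a single rented 48 GB card covers quantized 70B-class models, and anything bigger is where multi-GPU rentals start to make sense.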