TheBigBrother@lemmy.world to Selfhosted@lemmy.world · English · edited 3 months ago
What's the bang-for-the-buck, go-to setup for AI image generation and LLM models?
kata1yst@sh.itjust.works · edited 3 months ago
KoboldCpp or LocalAI will probably be the easiest way out of the box that has both image generation and LLMs.
I personally use vLLM and HuggingChat, mostly because of vLLM's efficiency and speed.
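For anyone weighing the vLLM route: vLLM serves models behind an OpenAI-compatible HTTP API, so any OpenAI-style client or frontend (like HuggingChat) can point at it. A minimal sketch of querying it, assuming a vLLM server is already running locally on its default port 8000; the model name and prompt below are placeholders, not something from this thread:

```python
# Minimal sketch: call a locally running vLLM server through its
# OpenAI-compatible chat completions endpoint.
# Assumes the server was started with something like:
#   vllm serve mistralai/Mistral-7B-Instruct-v0.2
# The model name and prompt are illustrative placeholders.
import requests

response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "mistralai/Mistral-7B-Instruct-v0.2",  # assumed model name
        "messages": [
            {"role": "user", "content": "Summarize why quantized models fit on smaller GPUs."}
        ],
        "max_tokens": 128,
    },
    timeout=60,
)
# Print the generated reply from the first (and only) choice.
print(response.json()["choices"][0]["message"]["content"])
```

Because the endpoint speaks the OpenAI wire format, you can also swap in the official `openai` Python client by setting its base URL to `http://localhost:8000/v1`.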