Ronon Dex@lemmy.sdf.org to Selfhosted@lemmy.world • Guide: Self-hosting open source GPT chat with no GPU using GPT4All • 1 year ago
I tried that. GPT4all is a hog. You’ll need at least 16GB of RAM.