1rre@discuss.tchncs.de to Selfhosted@lemmy.world • Is it possible to run a LLM on a mini-pc like the GMKtec K8 and K9? (English)
7 days ago

Intel Arc also works surprisingly well and consistently for ML if you use llama.cpp for LLMs or Automatic1111 for Stable Diffusion. In terms of usability it's definitely much closer to Nvidia than it is to AMD.
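For anyone curious what that looks like in practice, here's a rough sketch of getting llama.cpp running on an Arc GPU via its SYCL backend (assumes you've installed Intel's oneAPI toolkit; check llama.cpp's own docs for the current flag names, and `model.gguf` is a placeholder for whatever model you download):

```shell
# Sketch: build llama.cpp with the SYCL backend for Intel Arc.
# Assumes the oneAPI Base Toolkit is installed at the default path.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
source /opt/intel/oneapi/setvars.sh

# Configure with SYCL enabled, using Intel's icx/icpx compilers
cmake -B build -DGGML_SYCL=ON \
      -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx
cmake --build build --config Release

# Run inference, offloading all layers to the GPU (-ngl 99)
./build/bin/llama-cli -m model.gguf -ngl 99 -p "Hello"
```

The `-ngl` flag controls how many layers get offloaded to the GPU; on a mini-PC with shared memory you may need to lower it if you hit allocation errors.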
LLMs have a very predictable and consistent approach to grammar, punctuation, style, and general cadence, which is easily identifiable when compared to human-written content. It's kind of a watermark, but it's one the creators are aware of and are seeking to remove. That means if you want to use an LLM as a writing aid of any sort and want the result to read somewhat naturally, you'll have to either get it to generate bullet points and expand on them yourself, or get it to generate the content and then rewrite it word for word in your own style.