

Sure, but you at least have something to work with rather than whatever you know off the top of your head
Because it’s like a search box you can explain a problem to and get a bunch of words related to it without having to wade through blogspam, 10-year-old Reddit posts, and snippy stackoverflow replies. You don’t have to post on discord and wait a day or two hoping someone will maybe come and help. Sure, it’s frequently wrong, but it’s often a good first step.
And no I’m not an AI bro at all, I frequently have coworkers dump AI slop in my inbox and ask me to take it seriously and I fucking hate it.
Charybdis, hippo, appa, Momo, pabu
Yes it’s an LLM called pandoc, you can run it locally
People don’t actually do this, right? Docker inside docker inside a VM inside another VM? On windows? Right???
Well, that’s just blatantly false. They’re extremely useful for the initial stage of research, when you’re not really sure where to begin or what to even look for, when you don’t know what you should read or even what the correct terminology surrounding your problem is. They’re “language models”, which means they’re halfway decent at working with language.
They’re noisy, lying plagiarism machines that have opened a whole Pandora’s box of problems and are being shoved into many places where they don’t belong. That doesn’t make them useless in all circumstances.