• 9 Posts
  • 45 Comments
Joined 1 month ago
Cake day: November 25th, 2024


  • sith@lemmy.zip to People Twitter@sh.itjust.works · "Feelings? Nah" · +7/−8 · edited · 2 days ago

    My belief is that most women believe they want a sensitive man (after all, that’s the cultural norm), until they actually get one. It’s not super cool IRL, unfortunately. Though it’s very rare that a woman admits this to herself or others. Usually you can find another believable excuse that fits the norm. Abnormal sensitivity often comes with extra baggage.

    But there are of course exceptions, and that’s what you should look for if you’re a guy and know you’re on the more sensitive side of the spectrum.

    Also, don’t fall for any of that “patriarchy” crap that is being spammed here. It’s just a useless concept (or religion), usually advocated by people with close to zero life experience and a taste for conspiracy theories. And in this context it’s almost dangerous, because even if it were true, its advocates draw the wrong conclusions (like that a less patriarchal society would appreciate sensitive men more).

    If you want to understand why the world feels unjust, or why you feel you’ve been fooled, I would start with reading about evolutionary game theory, and maybe watch Robert Sapolsky’s video lectures on human behavioral biology on YouTube. Then do some reading on moral realism (and why it’s stupid). If you’re American (sorry), it’s probably more likely that you are a firm believer in moral realism and that you don’t know much about evolution and biology. Don’t go for Jordan Peterson, because he’s just a completely incoherent thinker (or simply put, a quite stupid guy), who’s also into mysticism. Or maybe just read some Peterson and you will hopefully understand. He’s very average, but had good timing, I guess.


  • I’m an active user who posts and comments regularly, and I would say that the experience is very similar to Reddit, except for fewer ads and smaller numbers on the main/all page. The experience is probably very different if you’re mainly a passive consumer of content.

    Though I’ve never been active in “large” subreddits, and I tend to block them from my feed. So I guess I don’t know what I’m missing.


  • Is that still true, though? My impression is that AMD works just fine for inference with ROCm and llama.cpp nowadays. And you get much more VRAM per dollar, which means you can fit a bigger model. You might get fewer tokens per second compared with a comparable Nvidia card, but that shouldn’t really be a problem for a home assistant, I believe. Even an Arc A770 should work with IPEX-LLM. Buy two Arc or Radeon cards with 16 GB of VRAM each, and you can fit a Llama 3.2 11B or a Pixtral 12B without any quantization. Just make sure ROCm supports that specific Radeon card if you go for team red.
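    The “will it fit” arithmetic above can be sketched roughly. A minimal back-of-the-envelope estimate, assuming fp16 weights (2 bytes per parameter) and an illustrative ~20% overhead factor for KV cache and activations (the `vram_gb` helper and the overhead figure are my assumptions, not measured numbers):

    ```python
    # Ballpark VRAM estimate for LLM inference.
    # Assumes fp16 weights (2 bytes/param) and a rough 20% allowance
    # for KV cache and activations; real usage varies with context length.

    def vram_gb(params_billion: float, bytes_per_param: float = 2.0,
                overhead: float = 1.2) -> float:
        """Return an approximate VRAM requirement in GB."""
        return params_billion * bytes_per_param * overhead

    # Llama 3.2 11B unquantized (fp16): ~26 GB, so one 16 GB card
    # is not enough, but two cards (32 GB total) should fit it.
    print(round(vram_gb(11), 1))                        # 26.4

    # The same model at ~4-bit quantization (~0.5 bytes/param)
    # would fit comfortably on a single 16 GB card.
    print(round(vram_gb(11, bytes_per_param=0.5), 1))   # 6.6
    ```

    This is why the 16 GB-per-card figure matters: a single card covers quantized models, while unquantized 11B–12B models need the pooled VRAM of two.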

  • Yes, but all of that is true for Facebook, Reddit, and whatever. It’s still nice to have this feature in the “reference” implementation of Lemmy, I think. It will also make it easier for instance owners and moderators to follow any local laws that require this.

    I don’t know if this is already in the ActivityPub protocol, but it would be nice if all instances that have a copy of some content deleted it once it has been marked for deletion by the creator, or by the owner of the instance where it was first posted. There will always be actors that specifically archive posts marked for deletion, but I still think this is preferable.
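    For what it’s worth, ActivityPub does define a Delete activity: the spec says receiving servers should remove the object, optionally replacing it with a Tombstone, though honouring it is ultimately up to each implementation. A minimal sketch of such a payload (the actor and post URLs are hypothetical):

    ```python
    # Sketch of an ActivityPub Delete activity as a plain dict.
    # The actor and object IDs below are made-up examples, not real URLs.

    delete_activity = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Delete",
        "actor": "https://example.instance/u/alice",    # hypothetical author
        "object": {
            "type": "Tombstone",                        # marker left in place
            "id": "https://example.instance/post/123",  # hypothetical post
            "formerType": "Note",
        },
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    }

    print(delete_activity["type"])            # Delete
    print(delete_activity["object"]["type"])  # Tombstone
    ```

    Federation of the Delete is the easy part; as noted above, nothing in the protocol can force a remote instance (or a scraper) to actually discard its copy.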