• 0 Posts
  • 155 Comments
Joined 1 year ago
Cake day: April 3rd, 2024

  • Yeah, and in the 70s they estimated they’d need about twice that to make significant progress in a reasonable timeframe. Fusion research is underfunded – especially when you look at how the USA dumps money into places like the NIF, which research inertial confinement fusion.

    Inertial confinement fusion is great for developing better thermonuclear weapons but an unlikely candidate for practical power generation. So of that one billion bucks a year, a significant amount is pissed away on weapons research instead of power generation candidates like tokamaks and stellarators.

    I’m glad that China is funding fusion research, especially since they’re in a consortium with many Western nations. When they make progress, so do we (and vice versa).



  • Auto-writing boilerplate code doesn’t change the fact that you still have to reimplement the business logic, which is what we’re talking about. If you want to address the “reinventing the wheel” problem, LLMs would have to be able to spit out complete architectures for concrete problems.

    Nobody complains about reinventing the wheel on problems like “how do I test a method”; they’re complaining about reinventing the wheel on problems like “how can I refinance loans across multiple countries in the SEPA area while complying with all relevant laws”.
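
    To make the distinction concrete, here’s a minimal sketch (Python; all names are hypothetical, purely for illustration). The first part is the kind of boilerplate an LLM can reliably spit out; the second is the kind of domain logic that still has to come from someone who actually understands the problem.

    ```python
    import unittest

    # Boilerplate: an LLM can auto-write trivial tests like this all day.
    def add(a: int, b: int) -> int:
        return a + b

    class TestAdd(unittest.TestCase):
        def test_add(self):
            self.assertEqual(add(2, 3), 5)

    # Business logic: the names below are hypothetical stand-ins. Which
    # national regulations apply, which notice periods, which fee caps -
    # none of that is boilerplate, and none of it can simply be
    # regurgitated from training data.
    def refinance_sepa_loan(loan_id: str, source_country: str, target_country: str):
        raise NotImplementedError("domain-specific logic goes here")
    ```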





  • I fully agree. LLMs create situations that our laws aren’t prepared for and we can’t reasonably get them into a compliant state on account of how the technology works. We can’t guarantee that an LLM won’t lose coherence to the point of ignoring its rules as the context grows longer. The technology inherently can’t make that kind of guarantee.

    We can try to add patches like a rules-based system that scans chats and flags them for manual review if certain terms show up, but whether those patches suffice remains to be seen.
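
    For illustration, a minimal sketch of such a patch (Python; the term list and review logic are assumptions, not any real product’s ruleset):

    ```python
    import re

    # Hypothetical watchlist; a real deployment would need a much larger
    # ruleset maintained by compliance staff.
    FLAGGED_TERMS = [r"\bmedical advice\b", r"\blegal advice\b", r"\bself-harm\b"]
    PATTERNS = [re.compile(term, re.IGNORECASE) for term in FLAGGED_TERMS]

    def needs_manual_review(transcript: str) -> bool:
        """Flag a chat transcript for human review if a watched term appears."""
        return any(pattern.search(transcript) for pattern in PATTERNS)

    if needs_manual_review("I need medical advice about my prescription"):
        print("flagged: route to manual review queue")
    ```

    Of course a keyword scan catches phrasing, not meaning, which is exactly why it’s a patch rather than a guarantee.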

    Of course most of the tech industry will instead clamor for an exception because “AI” (read: LLMs and image generation) is far too important to let petty rules hold back progress. Why, if we try to enforce those rules, China will inevitably develop Star Trek-level technology within five years and life as we know it will be doomed. Doomed I say! Or something.



  • I saw the term “bioresonance” and immediately knew that this ostensible medical practitioner couldn’t get in touch with reality if they used a special reality-seeking pole constructed from a thousand dowsing rods.

    I used to work adjacent to the medical field, close enough to have to deal with a certain kind of medical practitioner a lot. For some reason, that part of medicine attracts people who believe in the supernatural so I’m familiar with bullshit from anthroposophy to quantum healing.

    That shit gets real wild real fast. Bioresonance is already terrible (it’s basically the same kind of bullshit Scientology’s “E-meters” pretend to do, but now as a “therapeutic” device with thirty buttons). But the worst must be quantum healing.

    In quantum healing, actually seeing the patient in person is not necessary. Neither is knowing a lot about the patient. In fact, the less the practitioner knows, the better. Just give them a picture and a really vague description of the symptoms and the person (or pet; it “works” for those, too), and the practitioner will do something at some point in the future that will have some positive effect on either the person or the universe as a whole, even if it’s not obvious. Source: Trust me, bro.

    And they charge real money for that shit. Real medical practitioners who went to a real university and have a real degree in human medicine.

    Absolutely incredible.



  • Water hardness matters a lot, too. When I visit my family and shower with their super soft water, I could use industrial degreaser and my hair would be just fine.

    But when I’m at home where the water is super hard? I better use a shampoo without sodium laureth sulfate and condition regularly or my hair will become an uncombable abomination within a few days.







  • Jesus_666@lemmy.world to Linux@lemmy.ml · Linux Users- Why? · 2 months ago

    I run Garuda because it’s a more convenient Arch with most relevant things preinstalled. I wanted a rolling release distro because in my experience traditional distros are stable until you have to do a version upgrade, at which point everything breaks and you’re better off just nuking the root partition and reinstalling from scratch. Rolling release distros have minor breakage all the time but don’t have those situations where you have to fix everything at the same time with a barely working emergency shell.

    The AUR is kinda nice as well. It certainly beats having to manually configure/make obscure software myself.

    For the desktop I use KDE. I like the traditional desktop approach and I like being able to customize my environment. Also, I disagree with just about every decision the Gnome team has made since GTK3 so sticking to Qt programs where possible suits me fine. I prefer Wayland over X11; it works perfectly fine for me and has shiny new features X11 will never have.

    I also have to admit I’m happy with systemd as an init system. I do have hangups over the massive scope creep of the project but the init component is pleasant to work with.

    Given that after a long spell of using almost exclusively Windows I came back to desktop Linux only after Windows 11 was announced, I’m quite happy with how well everything works. Sure, it’s not without issues but neither is Windows (or macOS, for that matter).

    I also have Linux running on my home server but that’s just a fire-and-forget CoreNAS installation that I tell to self-update every couple months. It does what it has to with no hassle.




  • To quote that same document:

    Figure 5 looks at the average temperatures for different age groups. The distributions are in sync with Figure 4 showing a mostly flat failure rate at mid-range temperatures and a modest increase at the low end of the temperature distribution. What stands out are the 3 and 4-year old drives, where the trend for higher failures with higher temperature is much more constant and also more pronounced.

    That’s what I was referring to. I don’t see a total age distribution for their HDDs, so I have no idea if they simply didn’t have many HDDs in the three-to-four-year range, which would explain how they didn’t see a correlation in the total population. However, they do show a correlation between high temperatures and AFR for drives after more than three years of usage.

    My best guess is that HDDs wear out slightly faster at temperatures above 35–40 °C, so if your HDD is going to die of an age-related problem, it’s going to die a bit sooner if it’s hot. (Also note that we’re talking about average temperature, so peak temperatures might have been much higher.)

    In a home server where the HDDs spend most of their time idling (probably even below Google’s “low” usage bracket), you probably won’t see a difference within the expected lifespan of the HDD. Still, a correlation does exist and it might be prudent to have some HDD cooling if temps exceed 40 °C regularly.
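
    If you want to keep an eye on that, here’s a minimal sketch using smartmontools (Python; the device list and the 40 °C threshold are assumptions to adjust for your setup, and smartctl typically needs root):

    ```python
    import subprocess

    DEVICES = ["/dev/sda", "/dev/sdb"]  # hypothetical; list your drives here
    THRESHOLD_C = 40

    def drive_temperature(device: str) -> int | None:
        """Read the current drive temperature from smartctl's attribute table."""
        out = subprocess.run(
            ["smartctl", "-A", device],
            capture_output=True, text=True, check=False,
        ).stdout
        for line in out.splitlines():
            # SMART attribute 194 (Temperature_Celsius) on most HDDs; the raw
            # value is the tenth column, e.g. "38" or "38 (Min/Max 20/45)".
            if "Temperature_Celsius" in line:
                return int(line.split()[9])
        return None

    for dev in DEVICES:
        temp = drive_temperature(dev)
        if temp is not None and temp > THRESHOLD_C:
            print(f"{dev}: {temp} °C - consider improving cooling")
    ```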