

No, just complete. Whatever the dude does may have nothing to do with what you needed it to do, but it will be “done”


Why he was suddenly “onboard”:
He thought that the Democrats were bluffing, assuming that Trump would veto. With the threat of Trump actually signing it, they would get too scared of their sacred cow Clinton getting caught up in it.
He might have been assured by the DOJ that the files were safe to release (now).
His demand to get investigations going again locks the files back up. An official in an interview basically said as much: that starting investigations would block “portions of the files” regardless of this bill.


Even if it is someone you actively wanted, the reaction should be to feel betrayed by them failing to live up to your standards, not to try to rationalize child sex trafficking to preserve your image of the guy.


Also possible: some invented “Epstein files” that add people he doesn’t like.


Also assuming it became prolific enough to appear in output, would that mean it is “correct”?


I would assume that a screen reader will pronounce it properly. If it doesn’t, then that reader needs an update. Still think it’s a pointless thing to try to resurrect that character from the past and kind of annoying, but at least screen readers should in principle be able to pronounce it.
Note that this outage by itself, based on their chart, was kicking out errors over a span of about 8 hours. This one outage alone would have almost entirely blown their downtime allowance under a 99.9% availability criterion.
If one big provider actually delivered 99.9999%, that would be about 30 seconds of total outage over a typical year. Not even long enough for most users to be sure there was an ‘outage’ at all. That wouldn’t be bad at all.
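The arithmetic behind those figures is quick to sketch (the 8-hour outage and the availability percentages come from the comment above; everything else is just unit conversion):

```python
# Downtime allowed per year at a given availability percentage.
YEAR_SECONDS = 365 * 24 * 3600  # 31,536,000 seconds in a non-leap year

def downtime_per_year(availability_pct: float) -> float:
    """Seconds of total outage permitted per year at this availability."""
    return YEAR_SECONDS * (1 - availability_pct / 100)

# 99.9% ("three nines") allows ~8.76 hours/year, so one 8-hour outage
# nearly exhausts the whole annual budget.
print(downtime_per_year(99.9) / 3600)

# 99.9999% ("six nines") allows ~31.5 seconds over the entire year.
print(downtime_per_year(99.9999))
```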


Problem is that it isn’t that simple. The general fact we do know is that the atmosphere will be more energetic and, on average, the globe will be hotter. The more energetic atmosphere means more weather activity and changes to various currents. So local weather may be impacted more by a change in wind currents than by temperature directly. The change in storm activity may be the bigger concern rather than temperatures. We don’t know how much viable agriculture will be possible or exactly where it will be. There’s also the question of soil quality. It’s a dangerous gamble to assume a straightforward “lots of snowy land now means lots of agriculture in a warmed globe”.
To be fair, it says “bubba” and the Clinton link is speculative.
The Republicans have started trying to blame Obama for this year’s hikes…
It’s quite a leap: they’re trying to say the ACA blew it all up, but it just took almost 20 years for the pain to hit.
It’s a narrative that really only works for the ride-or-die Republicans, but it’s all they have to try, since they have no actual answer they want to propose…


Of course, this is all talk to convince investors to throw money at them, so they’ll cherry-pick their interpretation.
In this case I think they’re referring to already having the real estate, buildings, power, and cooling. So “all” they have to do is rip out their rigs and dump a bunch of nVidia gear in. All they need is a few hundred million from some lucky investors and they’ll be off…


Same way a lot of the “AI” companies make money: investors that have no idea but want to get in on the ground floor of the next nVidia or OpenAI.


So he’s going to get Epsteined before that can happen …


I get the sentiment, but the steam machine will have an x64 processor…
The VR headset won’t, but the PC will…
I doubt this one. It would require that Trump ever would care about the pleasure of anyone other than himself.


Well even with your observation, it could well be losing share to Mac and Linux. The Windows users are more likely to jump ship, and Mac and Linux users tend to stick with the platform more, mainly because it’s not actively working to piss them off. Even if zero jump to Mac or Linux, the share could still shift.
The upside of ‘just a machine to run a browser’ is that it’s easier than ever to live with Linux desktop, since that nagging application or two that keeps you on Windows has likely moved to browser hosted anyway. Downside of course being that it’s much more likely that app extracts a monthly fee from you instead of ‘just buying it’.
Currently for work I’m all Linux, precisely because work was forced to buy Office365 anyway, and the web versions work almost as well as the desktop versions for my purposes (I did have to boot Windows because I had to work on a presentation and the weird ass “master slide” needed to be edited, and for whatever reason that is not allowed on the web). VSCode natively supports Linux (well, ‘native’, it’s a browser app disguised as a desktop app), but I would generally prefer Kate anyway (except work is now tracking our GitHub Copilot usage, and so I have to let Copilot throw suggestions at me to discard in VSCode or else get punished for failing to meet stupid objectives).


“Agentic” is the buzzword to distinguish “LLM will tell you how to do it” versus “LLM will just execute the commands it thinks are right”.
Particularly if a process is GUI driven, Agentic is seen as a more theoretically useful approach since a LLM ‘how-to’ would still be tedious to walk through yourself.
Given how LLMs usually mis-predict and don’t do what I want, I’m nowhere near the point where I’d trust “Agentic” approaches. Hypothetically, if it could be constrained to a domain where it can’t do anything that can’t trivially be undone, maybe, but given, for example, a recent VS Code issue where the “jail” placed around Agentic operations turned out to be ineffective, I’m not thinking too much of such claimed mitigations.


My career is supporting business Linux users, and to be honest I can see why people might be reluctant to take on the Linux users.
“Hey, we implemented a standard partition scheme that allocates almost all our space to /usr and /var, your installer using ‘/opt’ doesn’t give us room to work with” versus “Hey, your software went into /usr/local, but clearly the Linux filesystem standard is for such software to go into /opt”. Good news is that Linux is flexible and sometimes you can point out “you can bind mount /opt to whatever you want”, but then some of them will counter “that sounds like too much of a hack, change it the way we want”. Now this example by itself is simple enough to handle: make this facet configurable. But rinse and repeat for an insane number of possible choices.
Another group at my company supports Linux, but just as a whole virtual machine provided by the company; the user doesn’t get to pick the distribution or even access bash on the thing, because they hate the concept of trying to support Linux users.
Extra challenge, supporting an open source project with the Linux community. “I rewrote your database backend to force all reads to be aligned at 16k boundaries because I made a RAID of 4k disks and think 16k alignment would work really well with my storage setup, but ended up cramming up to 16k of garbage into some results and I’m going to complain about the data corruption and you won’t know about my modification until we screen share and you try to trace and see some seeks that don’t make sense”.


People’s laziness?
Well yes, that is a huge one. I know people who, when faced with Google’s suggested password, say “hell no, I could never remember that”, then proceed to use leet-speak, thinking computers can’t guess those because of years of ‘use a special character to make your password secure’. People at work give their password to someone else to take care of something because everything else is a pain and the stakes are low to them. People being told their bank is using a new authentication provider dutifully log into the cited ‘auth provider’, because this is the sort of thing that companies (though generally not banks) do to people.
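To put a rough number on why leet-speak doesn’t save a memorable password — the dictionary size and substitution count below are illustrative assumptions, not measured figures, but the gap they show is typical:

```python
import math

# Assumed attack model: the cracker enumerates a word list and tries the
# common leet substitutions (a->4, e->3, etc.) on each word.
dict_words = 100_000        # assumed attacker word list size
leet_variants = 2 ** 6      # ~6 substitutable letters, each swapped or not
leet_bits = math.log2(dict_words * leet_variants)

# Versus a randomly generated 16-character password over the full
# printable set (26 upper + 26 lower + 10 digits + 32 symbols = 94).
charset = 94
random_bits = 16 * math.log2(charset)

print(round(leet_bits, 1))    # roughly 22-23 bits: trivial to enumerate
print(round(random_bits, 1))  # roughly 105 bits: out of reach
```

The exact bit counts move with the assumptions, but a leet-ified dictionary word stays in brute-forceable territory no matter how clever the substitutions feel.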
to an extent
Exactly, it mitigates, but there’s still a gap. If they phish for your bank credential, you give them your real bank password. It’s unique, great, but the only thing the attacker wanted was the bank password anyway. If they phish a TOTP code, then they have to make sure they use it within a minute or so, but it can be used.
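That replay window falls straight out of how TOTP works — here’s a minimal stdlib sketch of the RFC 6238 algorithm (the secret below is an arbitrary example, not anything from this discussion):

```python
import base64
import hmac
import struct

def totp(secret_b32: str, t: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    key = base64.b32decode(secret_b32)
    counter = t // step  # same counter for every second inside one window
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F  # RFC 4226 dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

SECRET = "JBSWY3DPEHPK3PXP"  # made-up example secret

# Any two timestamps inside the same 30-second window produce the same code,
# so a phished code is replayable until the window rolls over (plus whatever
# grace window the server accepts).
assert totp(SECRET, 0) == totp(SECRET, 29)
assert totp(SECRET, 0) != totp(SECRET, 30)
```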
actually destroys any additional security added by 2fa
From the perspective of a user who knows they are using machine-generated passwords, yes, that setup is redundant. However, from the perspective of the service provider, which has no way of enforcing good password hygiene, TOTP at least gives them control over generating the secret. Sure, a ‘we pick the password for the user’ policy would get to the same end, but no one accepts that.
But this shows that if you are fanatical about MFA, TOTP doesn’t guarantee it anyway, since the secret can be stuffed into a password manager. Passkeys have an ecosystem more affirmatively trying to enforce those MFA principles, even if it is ultimately in the power of the user to overcome them (you can restrict to certain vendors’ keys, but that’s not practical for most scenarios).
My perspective is that MFA is overblown and mostly fixes some specific weaknesses:
- “Thing you know” largely sucks as a factor: if a human can know it, then a machine can guess it, and on the service provider side there’s real risk that such a factor gets guessed at a faster rate than you want, despite mitigations. Especially since you generally let a human select the factor in the first place. In physical security it helps mitigate the risk of a lost/stolen badge on a door by also requiring a paired code, but that’s a context where the building operator can reasonably audit attempts at the secret, which is generally not the case for online services. So broadly speaking, the additional factor is just trying to mitigate the crappy nature of “thing you know”.
- “Thing you have” used to be easier to lose track of or get cloned. A magstripe badge gets run through a skimmer and gets replicated. A single-purpose security card gets lost and you don’t think about it because you don’t need it for anything else. The “thing you have” nowadays is likely to lock itself and require local unlocking, essentially being the ‘second factor’ enforced client side. Passkey implementations generally require just that: a locally managed ‘second factor’.
So broadly ‘2fa is important’ is mostly ‘passwords are bad’ and to the extent it is important, Passkeys are more likely to enforce it than other approaches anyway.
It will offer an explanation, one that sounds consistent, but it’s a crap shoot whether it accurately describes the code, and there’s no easy way of knowing if the description is good or bad without reviewing it yourself.
I do try to use the code review feature, though that can declare bugs based on bad assumptions often as well. It’s been wrong more times than it has caught something for me.