

Last night, I had a dream that fucking Kevin came into the house with a raw chicken like fucking Dionysus, and he just laughed, swinging it around, spraying chicken juice all over everyone and everything. I am so mad.


I usually give up or find a free alternative. Typically, if something is available at a good price I won’t bother trying to get it free to begin with.


It’s been a while since I’ve listened to him, but I felt like he really struck an authentic vein with young men, especially queer kids. I wouldn’t say his lyrics are genius, but his music was refreshing for popular hip-hop. I think a lot of his fans value the feeling his music gives them more than they value other rappers in the space, even when those rappers’ technical ability surpasses his.
Also his music production team sounds great, imo.
I don’t remember the institution, but I remember reading a paper on a simulated trading environment with several AI agents that didn’t know about each other. The LLMs were pretty conservative with profits and deliberately bought and sold in predictable ways. They all ended up “colluding” by deliberately not competing with each other.


How could I say no to a face like that?
Sweaty gamer meme. I think the government is supposed to be the gamer? https://www.youtube.com/watch?v=Jb9Ebe_rA8M&t=0
Serious question: what sort of crimes do I have to commit to be reincarnated as a penguin in Australia but not Antarctica?


Life’s too short to pick your partners based on other people’s aesthetic preferences. The whole ‘is this guy too young for me’ thing is a more complicated question, IMO. Maybe you’re overthinking it. Have you been on any dates recently?
I wonder if this actually happened to someone or if this is a case of armchair survivalism.


It is? I’d like to read about that



It just looked a lot like an AI image classifier.


I don’t expect current AI systems are really configured in such a way that they suffer or exhibit more than rudimentary self-awareness. But it’d be very unfortunate to be a sentient, conscious AI in the near future and to be denied fundamental rights because your thinking is done “on silicon” rather than on meat.


Do you mean conventional software? Typically, software doesn’t exhibit emergent properties and operates within expected parameters. Machine learning and statistically driven software can produce novel results, but even that novelty is expected; it’s designed to behave that way.


Really? I mean, it’s melodramatic, but if you went back through history and asked writers and intellectuals whether a machine could write poetry, solve mathematical equations, and radicalize people effectively enough to cause a minor mental health crisis, I think they’d be pretty surprised.
LLMs do expose something about intelligence, which is that much of what we recognize as intelligence and reason can be distilled from sufficiently large quantities of natural language. Not perfectly, but isn’t it just the slightest bit revealing?


A child may hallucinate, lie, misunderstand, etc., but we wouldn’t say the foundations of a complete adult are not there, and we wouldn’t assess the child as not conscious. I’m not saying that LLMs are conscious because they say so (they can be made to say anything), but rather that it’s difficult to be confident that humans possess some special spice of consciousness that LLMs do not, because we can also be convinced to say anything.
LLMs can reason (somewhat unreliably) with a fraction of a human brain’s compute power while running on hardware that was made for graphics processing. Maybe they are conscious, but only in some pathetically small way, which will only become evident when they scale up, like a child.


I don’t believe that consciousness strictly exists. Probably, the phenomenon emerges from something like the attention schema. AI exposes, I think, the uncomfortable fact that intelligence does not require a soul: we evolved it, like legs to walk with, and just as easily as robots can be made to walk, they can be made to think.
Are current LLMs as intelligent as a human? Not any LLM I’ve seen, but give it 100 trillion parameters instead of 2 trillion and maybe.


Why can’t complex algorithms be conscious? In fact, AI models can be directed to reason about themselves, context can be made persistent, and we can measure activations showing that they are doing so.
I’m sort of playing devil’s advocate here, but “consciousness requires contemplation of self, which requires the ability to contemplate” is subjective, and nearly any AI model, even a rudimentary one, is capable of insisting that it contemplates itself.


Once they finally lock down the player so it’s impossible to block or skip ads, I look forward to coding a script which screen-records each video on my sub list, feeds each recording into a purpose-made classifier model which labels the ads, cuts the ads out with FFmpeg, and then uploads the result to my Jellyfin server.
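If I ever actually build it, the FFmpeg stitching step would look something like this. Just a sketch: the screen recorder and the ad classifier are assumed to already exist and to hand over ad spans as (start, end) seconds, and strip_ads is a made-up name.

```python
import os
import subprocess
import tempfile

def strip_ads(src: str, ad_spans: list[tuple[float, float]], dst: str) -> None:
    """Re-assemble src with the labeled ad spans removed, writing to dst."""
    # Total duration via ffprobe (ships with FFmpeg).
    duration = float(subprocess.check_output([
        "ffprobe", "-v", "error", "-show_entries", "format=duration",
        "-of", "default=noprint_wrappers=1:nokey=1", src,
    ]).strip())

    # Invert the ad spans into the spans we want to keep.
    keep, cursor = [], 0.0
    for start, end in sorted(ad_spans):
        if start > cursor:
            keep.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < duration:
        keep.append((cursor, duration))

    with tempfile.TemporaryDirectory() as tmp:
        parts = []
        for i, (start, end) in enumerate(keep):
            part = os.path.join(tmp, f"part{i}.mp4")
            # Stream copy is fast but only cuts on keyframes; re-encode here
            # if frame-accurate cuts matter.
            subprocess.run(
                ["ffmpeg", "-y", "-i", src, "-ss", str(start), "-to", str(end),
                 "-c", "copy", part],
                check=True,
            )
            parts.append(part)

        # Join the kept parts with FFmpeg's concat demuxer.
        listfile = os.path.join(tmp, "parts.txt")
        with open(listfile, "w") as f:
            f.writelines(f"file '{p}'\n" for p in parts)
        subprocess.run(
            ["ffmpeg", "-y", "-f", "concat", "-safe", "0", "-i", listfile,
             "-c", "copy", dst],
            check=True,
        )
```

The concat demuxer with -c copy keeps it fast, at the cost of only cutting on keyframes.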
I built several nodes. I think it’s most useful as an asset-tracking tool, but the battery life isn’t great. Like, I have a couple of premade credit-card-sized nodes. It’s pretty neat to ping them and get their GPS position (rough sketch at the bottom of this comment). But, to be honest, for that application your money would probably be better spent on an iPhone and Apple’s AirTags.
For communication: there are dubiously legal, cheap radios you can get off Amazon that would probably be 100% more useful.
I did enjoy it, though, and I still have some nodes. Also, it’s illegal to send encrypted messages over Meshtastic.
Edit: some Meshtastic nodes double as a low-power GPS with a screen. These may be useful on their own.
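For the “ping them and get their GPS” part, something like this with the meshtastic Python package is roughly what I had in mind. I’m going from memory, so treat the exact dict keys (position/latitude, user/longName) as assumptions to check against the library docs.

```python
# Rough sketch: broadcast a ping from a locally attached node, then dump the
# GPS positions in its node database. Assumes the `meshtastic` Python package
# and a node plugged in over USB.
import time

import meshtastic.serial_interface

iface = meshtastic.serial_interface.SerialInterface()  # auto-detects the port
iface.sendText("ping")  # broadcast a text message to the mesh
time.sleep(30)          # arbitrary wait so remote nodes have a chance to be heard

for node_id, node in (iface.nodes or {}).items():
    name = node.get("user", {}).get("longName", node_id)
    pos = node.get("position", {})
    # latitude/longitude only show up once that node has reported a GPS fix
    print(name, pos.get("latitude"), pos.get("longitude"))

iface.close()
```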