

The problem is that big businesses like Temu can bulk-ship and still pay only a percentage-based duty.
But it will ruin small businesses that only do small shipments and will now face a flat fee that may be half or more of the value of the goods.
So I googled it, and if you have a Pi 5 with 8GB or 16GB of RAM it is technically possible to run Ollama, but the speeds will be excruciatingly slow. My Nvidia 3060 12GB typically runs 14B (billion-parameter) models at around 11 tokens per second, while this site shows a Pi 5 only runs an 8B model at 2 tokens per second - each query will literally take 5-10 minutes at that rate:
Pi 5 Deepseek
It also shows you can get a reasonable pace out of the 1.5B model, but those are whittled down so much I don't believe they're really useful.
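To put those token rates in perspective, here's the back-of-envelope math behind the "5-10 minutes per query" claim (the ~1,000-token answer length is just an assumption for illustration):

```python
# How long a response of a given length takes at a given generation speed.
def response_minutes(tokens: int, tokens_per_second: float) -> float:
    return tokens / tokens_per_second / 60.0

# A typical ~1,000-token answer:
print(f"Pi 5, 8B @ 2 tok/s:   {response_minutes(1000, 2):.1f} min")   # ~8.3 min
print(f"3060, 14B @ 11 tok/s: {response_minutes(1000, 11):.1f} min")  # ~1.5 min
```

Longer "thinking" models make this even worse, since reasoning tokens count too.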
There are lots of lighter-weight services you can host on a Pi though. I highly recommend an app called Cosmos Cloud - it's really an all-in-one solution for building your own self-hosted services:
- It has its own reverse proxy (like Nginx or Traefik), including Let's Encrypt security certificates, URL management, and incoming-traffic security features.
- It has an excellent UI for managing Docker containers and a large catalog of prepared Docker Compose files to spin up services with the click of a button.
- It has more advanced features you can grow into, like an OpenID SSO manager, your own VPN, and disk management/backups.
It’s still very important to read the documentation thoroughly and expect occasional troubleshooting will be necessary, but I found it far, far easier to get working than a previous Nginx/Docker/Portainer setup I used.
Using Ollama depends a lot on the hardware you run it on - you should aim for at least 12GB of VRAM/unified memory to run models. I have one copy running in a Docker container on CPU under Linux and another running on the GPU of my Windows desktop, so I can give install advice for either OS if you'd like.
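For reference, the CPU-only Docker route I mentioned boils down to a couple of commands (the model name here is just an example - pick whatever fits your RAM):

```shell
# CPU-only Ollama in Docker; for an NVIDIA GPU add --gpus=all
# (which also requires the NVIDIA Container Toolkit on the host)
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and chat with a model inside the container
docker exec -it ollama ollama run llama3.2
```

The `-p 11434:11434` part is what exposes the API on your LAN so other machines and web UIs can reach it.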
I’m actually right there with you, I have a 3060 12gb and tbh I think it’s the absolute most cost effective GPU option for home use right now. You can run 14B models at a very reasonable pace.
Doubling or tripling the cost and power draw just to get 16-24GB doesn't seem worth it to me. If you really want an AI-optimized box, I think something with the new Ryzen AI Max chips would be the way to go - like an ASUS ROG Flow Z13, a Framework Desktop, or the GMKtec option, whatever it's called. Apple's new Mac minis are also great options. Both Ryzen AI Max and Apple silicon use shared CPU/GPU memory, so you can go up to 96GB+ at much, much lower power draw.
They didn’t even use WhatsApp or Signal - they were literally sending plain, unencrypted texts.
As an Xperia 5 III user I’m feeling very left out
SSO is single sign-on, so you don’t need an individual username and password for every service. It’s a bit more advanced, so don’t worry about it until you’ve had what you want working properly for a while.
DNS is like the yellow pages of the internet - when you type www.google.com, your computer uses a DNS server to look up which actual IP address corresponds to the website name. The point of AdGuard or Pi-hole is that when a website tries to load an ad, your custom DNS server just says it doesn’t recognize the address.
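A toy sketch of that idea in Python (the domains and IPs here are hypothetical stand-ins, not real blocklist entries or DNS records):

```python
# Toy model of an ad-blocking DNS server like Pi-hole or AdGuard:
# ordinary lookups get real answers, blocklisted domains get a dead end.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}  # hypothetical entries

def resolve(domain: str, records: dict) -> str:
    if domain in BLOCKLIST:
        return "0.0.0.0"  # dead-end answer: the ad simply never loads
    return records.get(domain, "NXDOMAIN")

# Hypothetical records standing in for a real upstream DNS server:
records = {"www.google.com": "142.250.80.36"}
print(resolve("www.google.com", records))   # normal site resolves fine
print(resolve("ads.example.com", records))  # ad request goes nowhere
```

The real services do the same thing at the network level: every device on your LAN points at the Pi for DNS, so ads disappear without installing anything on the devices themselves.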
Check out Cosmos - I struggled piecing things together, but when I restarted from scratch with it as the base it has been SO much easier to get services working, while still being able to see how things work under the hood.
It’s basically a docker manager with integrated reverse proxy and OpenID SSO capability, with optional VPN and storage management
I’m sorry, bike lanes piss you off?
Funny that decades of the GOP sabotaging progress on renewable energy like solar and wind, and torpedoing research into modernizing the power grid - specifically because it would make deploying renewables easier - will end up helping hand the lead in possibly the most consequential industry of the 21st century over to China.
This has some details on Azerbaijan distancing itself:
The meme in the article cracked me up fwiw
I haven’t used any but have researched it some:
The Minisforum DEG1 looks like the most polished option, but you’d have to add an M.2-to-OCuLink adapter and cable.
ADT-Link makes a wide variety of kits as well, with varying PCIe generations and varying included equipment.
Blackout typically means it just doesn’t let any light into the room - it’s not necessarily the actual color of the curtain.
Homebox has this capability too - you can generate QR codes for assets and scan them later to identify whatever’s inside.
The current version does have the ability to create QR codes for your assets and scan them later for identification, but I don’t know of a way to scan a new item and identify it automatically.
I think it becomes more useful as you accumulate stuff - I get frustrated when my wife buys crap on Amazon that we already have. So while I don’t have the time or energy to sort everything in our house, I am beginning to catalogue things as we buy or find them in the hope it becomes more useful over time to find things we rarely use and/or avoid re-buying excess items
https://a.co/d/hh2N98y Something as simple as that, though I’m not sure 5V/3A is enough for a Pi 5 - you’d have to check the power specs.
I would especially advise against relying on battery banks due to the heat. If you’re just going to use it in the car and are already going to the trouble of customizing so much hardware, I’d find a way to run a power supply off a switched 12V fuse or wire from the car - just convert it to 5V/USB power. I’m sure there are generic kits online.
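The quick math on why that 5V/3A adapter is marginal: the official Pi 5 supply is rated 5.1V/5A (~27W), so a 15W adapter leaves a lot less headroom once USB peripherals draw power.

```python
# Power headroom comparison for a Pi 5.
def watts(volts: float, amps: float) -> float:
    return volts * amps

print(f"5V/3A adapter:        {watts(5.0, 3.0):.0f} W")   # 15 W
print(f"Official Pi 5 supply: {watts(5.1, 5.0):.1f} W")   # 25.5 W
```

The Pi 5 will boot on a 15W supply, but it caps current to USB peripherals, which matters if you hang an SSD or other accessories off it in the car.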
When you’re on your Wi-Fi, try navigating in your browser to your Windows computer’s address with a colon and the port 11434 at the end. It would look something like this:
http://192.168.xx.xx:11434/
If it works, your browser will just load the text "Ollama is running".
From there you just need to figure out how you want to interact with it. I personally pair it with Open WebUI for the web interface.
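If you’d rather script the check than poke at a browser, here’s a small Python sketch of the same test (the IP is a placeholder - substitute your Windows machine’s actual LAN address):

```python
# Minimal reachability check for an Ollama server on your LAN.
from urllib.request import urlopen

def ollama_url(host: str, port: int = 11434) -> str:
    return f"http://{host}:{port}/"

def is_ollama_up(host: str, port: int = 11434) -> bool:
    """True if the server responds with its 'Ollama is running' banner."""
    try:
        with urlopen(ollama_url(host, port), timeout=3) as resp:
            return b"Ollama is running" in resp.read()
    except OSError:
        return False

print(ollama_url("192.168.1.50"))  # placeholder host
```

The same URL (with `/api/...` paths appended) is what front-ends like Open WebUI point at.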