If RAM and GPUs were cheap, people like us would be more likely to set up local LLMs to prevent our data from being productized by power-grabbing corporations.
I think it’s more likely that they’re setting up to push VDI.
The vast majority of consumers would not be able to set up a local LLM, and they know the people who are able to do so aren’t going to use their services in the first place.
The actual explanation is much simpler.
I’m not claiming that’s the reason, since it clearly isn’t; only that it would help drive traffic to commercial AI products.