I’m not fully up to speed on Waymo, or whether they’ve ever released remote-assistance-per-mile figures, but when Cruise went through that shitstorm a year or two ago, it came out that their cars were asking for help every few miles.
Cruise was essentially all smoke and mirrors.
Interesting. Stuff like this makes me suspicious of the current LLM hype. I know it’s not necessarily language models per se being used by these vehicles, but still. If we were really on the cusp of AGI then I’d expect us to have at least cracked autonomous driving by now.
Ya, I don’t buy the hype around AGI. Like, a Waymo drove into a telephone pole because of something they had to fix in their code. I’m not doubting there’s AI involved, neural nets, machine learning, whatever, but this isn’t an AGI-level development. Nor do I think they need AGI to do this.
I’m also not convinced this LLM stuff can ever lead to AGI either. I think it can do some pretty impressive things with some very real drawbacks/caveats and there is definitely room to keep improving them, but that the whole architecture is flawed if you want to make an AGI.
Yeah, I guess there’s a lot of interesting stuff we can do with AI without necessarily achieving AGI. What about programming? Even if we don’t get AGI soon, do you still think LLMs will be snatching up a sizeable chunk of programming jobs?
So I’m a developer, I do mobile apps, and I do use Claude/GPT.
I could be wrong, but I don’t foresee any imminent collapse of developer jobs. It does have its uses, though. If anything, I think it’ll mean fewer lower-end positions, but if you don’t hire and teach new devs, that’s going to have repercussions down the road.
For example, I needed to make a webpage, and I’m not a webdev, and it helped me create a static landing page. I can tell the code is pretty shitty, but it works for its purposes. This either replaced a significant amount of time learning how to do it myself, or replaced hiring a contractor to do it. But I’m also not really any better at writing a webpage if I needed to make a 2nd one, as I didn’t learn much in the process.
But setting it all up also meant I had to work on the infrastructure behind it. The AI was able to guide me through that as well, though it did less of the work itself. That part I did learn, and I’d be able to leverage it for future work.
When it comes to my actual mobile work, I don’t like asking it to do anything substantial, as the quality is usually pretty low. I might ask it to build a skeleton of something that I can fill out, or I’ll ask its opinion on a small piece of code I wrote and look for a better way to write it, and in that case it has helped me learn new things. I’ll also talk through a plan with it and get some insights on a topic before I write any code.
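To give a sense of what I mean by a "skeleton": something like this, which is a made-up example (the class and method names are hypothetical, not from any real project). The LLM can rough out the shape quickly, and the TODOs mark the actual logic I’d still write myself.

```java
// Hypothetical example of an LLM-generated skeleton: the structure is
// in place, but the real logic is left as TODOs for the developer.
import java.util.ArrayList;
import java.util.List;

public class SettingsViewModel {
    private final List<String> enabledFeatures = new ArrayList<>();
    private boolean darkMode = false;

    public boolean isDarkModeEnabled() {
        return darkMode;
    }

    public void toggleDarkMode() {
        // TODO: persist the preference instead of keeping it in memory
        darkMode = !darkMode;
    }

    public void enableFeature(String name) {
        // TODO: validate against the list of known feature flags
        if (!enabledFeatures.contains(name)) {
            enabledFeatures.add(name);
        }
    }

    public List<String> getEnabledFeatures() {
        // Return a copy so callers can't mutate internal state
        return new ArrayList<>(enabledFeatures);
    }
}
```

Nothing here is hard, but having the boilerplate typed out saves time, and filling in the TODOs keeps the real decisions with me.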
It gives almost as many wrong/flawed answers as right ones if there’s even a tiny bit of complexity, so you need to know how to sift through the crap, which you won’t if you aren’t a developer. It will tell you APIs exist that don’t. It will recommend APIs that were deprecated years ago. The list goes on and on. This also happened while I was making the webpage, so my developer skills were still required to get to the end product I wanted.
I can’t see it replacing a sizeable chunk of developers yet, but I think, used properly, it could enhance existing devs and lead to fewer hires being needed.
When I hear things like 30% of Microsoft’s code now being written by AI, it makes sense why shit is breaking all the time and quality is going down. They’re forcing it to do what it can’t do yet.