garbage account

  • devedeset@lemmy.zip · 2 days ago

    I’m really split on it. I’m not a 10x “rockstar” <insert modern buzzword> programmer, but I’m a good programmer. I’ve always worked at small companies with small teams. I can figure out how to parse requirements, choose libraries/architecture/patterns, and develop apps that work.

    Using Copilot has sped my work up by a huge amount. I did have 10 YoE before Copilot existed, and I can use it to help write good code much faster. The code it produces may not be perfect, but my code wouldn’t have been perfect without it either. The thing is, I have enough experience to know when it’s leading me down the wrong path, and that still happens pretty often. What it helps with is implementing common patterns, especially with common libraries. It basically automates the “google the library docs/Stack Overflow and use the code there as a starting point” part of programming. (edit: it also helps a lot with logging, writing tests, and rewriting existing code as long as it isn’t too wacky, and even then you really need to understand the existing code to avoid a mess of bugs)
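    To give a rough idea of the “common pattern” boilerplate I mean, here’s a minimal made-up sketch (the module, function, and cases are hypothetical, not actual Copilot output) of the test-plus-logging scaffolding it’s good at filling in once it has seen one example:

    ```python
    # Hypothetical example: repetitive test/logging scaffolding that an
    # assistant tends to autocomplete well once the pattern is established.
    import logging

    import pytest

    from myapp.pricing import apply_discount  # made-up module/function

    logger = logging.getLogger(__name__)


    @pytest.mark.parametrize(
        "price, percent, expected",
        [
            (100.0, 10, 90.0),
            (100.0, 0, 100.0),
            (50.0, 50, 25.0),
        ],
    )
    def test_apply_discount(price, percent, expected):
        logger.debug("checking discount: price=%s percent=%s", price, percent)
        assert apply_discount(price, percent) == pytest.approx(expected)


    def test_apply_discount_rejects_negative_percent():
        with pytest.raises(ValueError):
            apply_discount(100.0, -5)
    ```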

    But yeah, search is completely fucked now. I don’t know for sure, but I would guess Stack Overflow use is way down. It does feel like many people are being pigeonholed into using the LLM tools because they are the only things that sort of work. There’s also the vibe coding phenomenon, where people without experience will just YOLO out pure tech debt, especially with the latest and greatest languages/libraries/etc., where the LLMs don’t work very well because there isn’t enough training data.

    • Metju@lemmy.world · 2 days ago

      LLMs are an okay-ish tool if your code style doesn’t veer from what 99% of the open-source code out there looks like. Use any fringe concept in a language (for example, treat errors as values in a language riddled with exceptions, or use functional concepts in an OOP language) and you will have problems.
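      For example, here’s a rough Python sketch (my own illustration, not generated output) of what I mean by treating errors as values instead of raising exceptions - exactly the style where the suggestions keep fighting you and reaching for try/except:

      ```python
      # Sketch of "errors as values": return an explicit result instead of raising.
      # Requires Python 3.10+ for structural pattern matching.
      from dataclasses import dataclass
      from typing import Generic, TypeVar, Union

      T = TypeVar("T")


      @dataclass
      class Ok(Generic[T]):
          value: T


      @dataclass
      class Err:
          message: str


      Result = Union[Ok[T], Err]


      def parse_port(raw: str) -> Result[int]:
          if not raw.isdigit():
              return Err(f"not a number: {raw!r}")
          port = int(raw)
          if not 0 < port < 65536:
              return Err(f"out of range: {port}")
          return Ok(port)


      match parse_port("8080"):
          case Ok(value):
              print("port:", value)
          case Err(message):
              print("error:", message)
      ```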

      Also, this crap tends to amount to automated copy-paste, which is especially bad when it skips an abstraction you would have noticed if you’d written the code yourself.

      Source: own experience 😄

      • devedeset@lemmy.zip · 2 days ago

        Totally agree. In my day-to-day work, I’m not dealing with anything groundbreaking. Everything I want/need to code has already been done.

        If you have a Copilot license and are using the newest Visual Studio, it enables the agentic capabilities by default: it will actually write the code into your files directly. I have not done that and will not do that. I want to see and understand what it is trying to do.

    • pulsewidth@lemmy.world · 2 days ago

      I agree it’s great at writing and scaffolding parts of code and at selecting libraries - it definitely has value for coding. Whether it represents $1,500 billion of value, though, I doubt.

      My main concern lies with the next generation of programmers. Making sense of what ChatGPT (and Claude, etc.) outputs and adjusting or correcting it to fit a project’s scope and requirements takes significant prior programming experience, and that skill will be much harder for junior devs to learn with LLMs doing all the groundwork - essentially the same problem wider education has now, with kids/teens just using LLMs to write their homework and essays. The consequences will be long-term and significant. In addition (for coding), it’s taking away the entry-level work that junior devs would usually do and then have cleaned up for prod by senior devs - and that’s not theory, the job market for junior programmers is already dying.