• zerofk@lemm.ee
    8 days ago

    As someone who has interviewed candidates for developer jobs for over a decade: this sounds like “in my day everything was better”.

    Yes, there are plenty of candidates who can’t explain the piece of code they copied from Copilot. But guess what? A few years ago there were plenty of candidates who couldn’t explain the code they copied from StackOverflow. And before that, there were those who failed at the basic programming test we gave them.

    We don’t hire those people. We hire the ones who use the tools at their disposal and also show they understand what they’re doing. The tools change, the requirements do not.

    • uranibaba@lemmy.world
      7 days ago

      I think LLMs have just made it easier for people who want the answer without the learning. Piecing together posts from all over the internet required you to understand what you pasted if you wanted it to work (not always, but the bar was higher). With ChatGPT, you can just throw errors at it until you have the code you want.

      While the requirements never changed, the tools sure did, and they made it a lot easier not to understand.