This article argues that organizations either tried skills-based hiring and reverted to degree-required hiring because the reversal was warranted, or they never actually adapted their process despite the executive vision.

Since the article isn’t industry-specific, what are your observations or opinions about the technology sector? What about the general business sector?

Should employees in first-world countries be required to obtain a degree if they reasonably expect a business-related job?

Do college experiences and academic rigor reveal higher-achieving employees?

Is an undergraduate education a minimum standard for a more enlightened society? Or a way to maintain separation between classes of people and status?

Is a master’s degree the new way to differentiate yourself, the way the undergrad degree was before?

Edit: multiple typos, I guess that’s proof that I should have done more college 😄

  • tsonfeir · 9 months ago

    I think it would be easier for you to give me an idea of the clusterfuck you have experienced, and I can let you know if that cluster is still fucking.

    What I do know is that it’s significantly better: nullable types, multi-catch, typed properties, arrow functions, etc.
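
    For instance, a minimal sketch of those features, assuming PHP 8 (the class and helper names are just illustrative):

    ```php
    <?php
    declare(strict_types=1);

    class User
    {
        // Typed properties (PHP 7.4+); the "?" prefix makes a type nullable
        public string $name;
        public ?string $email = null;

        public function __construct(string $name, ?string $email = null)
        {
            $this->name = $name;
            $this->email = $email;
        }
    }

    function loadUser(string $json): User
    {
        try {
            $data = json_decode($json, true, 512, JSON_THROW_ON_ERROR);
            return new User($data['name'], $data['email'] ?? null);
        } catch (JsonException | TypeError $e) {
            // Multi-catch (PHP 7.1+): one block handles several exception types
            throw new RuntimeException('Bad user payload', 0, $e);
        }
    }

    // Arrow function (PHP 7.4+): implicitly captures $domain from the enclosing scope
    $domain = 'example.com';
    $isInternal = fn(User $u): bool => str_ends_with((string) $u->email, '@' . $domain);
    ```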

    • SpaceNoodle@lemmy.world · 9 months ago

      I’m not going to dig up decades-old code for you to pick over, but I do recall that the labyrinthine, ever more complex, and buggy behavior of the multitudinous builtins was an unending pain in the ass.

      • tsonfeir · 9 months ago

        I was just wondering if you had anything off the top of your head. Any language can be spaghetti if you make it spaghetti. 🤷‍♂️

              • tsonfeir · 9 months ago

                Using them incorrectly would be incorrect. Without an example, it’s hard to tell.

                But pretty much everyone was doing the web “wrong” back in the day. Server-side HTML generation? Gag me. Or worse, inserting PHP into HTML?! Shudder. But that’s how it was for many backend languages.
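
                For anyone who never saw it, that era looked roughly like this, assuming a database connection opened earlier in the script (the mysql_* functions shown were standard then and were removed entirely in PHP 7):

                ```php
                <!-- Logic and markup interleaved in one file, old-school style -->
                <ul>
                <?php
                $result = mysql_query('SELECT name FROM users');
                while ($row = mysql_fetch_assoc($result)) {
                    echo '<li>' . htmlspecialchars($row['name']) . '</li>';
                }
                ?>
                </ul>
                ```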

                IMO, nowadays, if it’s not a reactive JS front end using the backend as an API, it’s doing it wrong. But I’m sure in 10 years we will all be laughing at how seriously we were taking JavaScript.
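
                For contrast with the snippet above, a minimal sketch of the backend-as-an-API half of that split, with the data hard-coded purely for illustration:

                ```php
                <?php
                // users.php (hypothetical endpoint): the server returns only JSON;
                // all rendering happens in the reactive JS front end.
                declare(strict_types=1);

                header('Content-Type: application/json');

                // Hard-coded stand-in for a real data source
                $users = [
                    ['id' => 1, 'name' => 'Ada'],
                    ['id' => 2, 'name' => 'Grace'],
                ];

                echo json_encode($users, JSON_THROW_ON_ERROR);
                ```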

                • SpaceNoodle@lemmy.world · 9 months ago

                  It makes me shudder to think how the modern web is just treating browsers as JavaScript application environments. Converting a little backend load into a massive frontend headache is the exact opposite of where we thought we were headed twenty years ago.

                  • tsonfeir · 9 months ago

                    Well, it’s not a massive front-end headache if you do it right. And by passing off a lot of the easy stuff to the browser, your server can handle more load. As a bonus, it’s easier to decouple your architecture. Not only is this more efficient, but it’s easier to maintain, test, and deploy.