As the title says, I am currently learning to be a programmer, and my tablet does not suffice for the job.

I have already finished a small MEAN-stack application to learn TypeScript, learned some Java syntax (I expect nothing more exciting than a sorting algorithm, but the exam language is Java, so…), and the next stop will most likely be plain vanilla C to learn about handling hardware.

Windows I hate with a passion, and I don’t know squat about Macs, so I am thinking of getting myself a decently sized laptop for a sensible Linux install.

History (I started my Linux journey with SuSE Linux 4.4.1, way back when) taught me to be very wary of driver issues on laptops, so I thought I'd ask you for recommendations for laptops that play fair with Linux.

(as an aside, if I could play GuildWars2 on it in the evening and attach my two big monitors when at home, that would be super cool)

  • boonhet · 1 year ago

    Now Linux is obviously a great OS for development, but there’s so much misinfo here.

    Other people (MacOS/Winblows) will spend a significant amount of time trying to emulate a Linux environment.

    It’s 2023, most shit is either platform-agnostic (Anything front-end, Java, etc) or runs in Docker nowadays. Or both. I run plenty of Java shit in Docker despite the fact that it’d run natively on any major desktop OS. It’s easier to guarantee an environment in a container than get a bunch of Linux users to agree on an environment. Otherwise you get one dev with Java 8 as default as per company spec and then another with Java 17 because some tool they use requires it and they’re too lazy to set 8 as the default and invoke 17 specially for that use case.
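    Containers make that concrete. A minimal Dockerfile sketch (image tag and jar path are illustrative, not from this thread) that pins the runtime for every dev, whatever their host's default `java` happens to be:

    ```dockerfile
    # Everyone building or running this image gets the JDK the base tag
    # names, regardless of what `java` resolves to on their host.
    FROM eclipse-temurin:17-jre
    COPY target/app.jar /app/app.jar
    ENTRYPOINT ["java", "-jar", "/app/app.jar"]
    ```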

    Matter of fact, most companies I know people at either use MacBooks or give you a choice.

    Does it really take a significant amount of time trying to emulate a Linux environment? Eh, I suppose. I first install brew and THEN install docker. Whereas on Linux, I’d just use the distro’s built-in package manager to install docker, because everything gets deployed in containers for k8s anyway, so why would I run it without docker locally and complicate things?

    Linux is the correct (and only) choice for programmers

    Also, funnily enough, according to the latest Stack Overflow survey, Windows is actually the most popular OS among developers, probably because of all the ancient legacy win32 shit. MacOS is second, but if Linux weren't split into different distros, it might well take that second spot (it's a multiple-choice survey, so you can't just add the distro percentages together linearly), though it'd be a close call either way.

    99.9% of the internet runs on Linux. When you get a job, you’ll most likely deploy to Linux servers

    This part is technically true (I believe the real number is closer to 96%, with Windows Server and FreeBSD accounting for the rest), but it’s highly irrelevant because most modern backend applications in production run on multiple layers of abstraction to the point where it doesn’t matter if the development takes place in Windows, Linux or MacOS.

    At the end of the day, I want my dev machines to always work, so they’re Macs. My personal desktop is mostly used for gaming and tinkering, so it doesn’t matter if I fuck something up and have downtime. That runs Gentoo with nVidia, KDE and Wayland because I hate myself and want to suffer. That said, OP is a student and should use Linux precisely because they can probably afford the downtime so what better time to tinker? But for work, reliable and polished > tinkering and infinite customizability.

    • jg1i@lemmy.world · 1 year ago

      Preface: I’m assuming most people are going to get jobs at a SaaS or hardware company. Obviously, this doesn’t make sense if you’re a macOS, iOS or Windows developer. Also, this is about the work experience, not casual, personal.

      most shit is either platform-agnostic… or runs in Docker nowadays

      This kind of ties into my point. Docker, WSL, Vagrant, brew. Think of all the time and effort spent on these tools that… just try to create a Linux environment on other OSes. Plus, you have to learn how to use and configure those tools. Or you could just use the thing they're trying to mimic: Linux.

      Especially at work, making things “platform-agnostic” sucks. You’re always going to need to support Linux—optionally you can add macOS and Windows support, but that’s not required and adds unnecessary complexity. It doubles or triples your workload with low-to-no value tasks that your customers, CI, and app don’t care about.

      You could be done with your sed bash script to get CI working again, but wait, you have to go back and handle macOS differently and if you have Windows, sheesh. Debugging something? Oh, ip isn’t on macOS. Oh ss isn’t on macOS. Oh nc is on macOS, but it’s different. tar is on macOS, but it’s different.
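      The sed case is the classic trap: GNU sed and BSD (macOS) sed disagree on the in-place flag, so a one-liner that works in Linux CI errors out on a Mac. A small sketch of the difference and the usual portable workaround (file names here are made up):

      ```shell
      # GNU sed (Linux):  sed -i 's/old/new/' file      # -i takes no argument
      # BSD sed (macOS):  sed -i '' 's/old/new/' file   # -i demands a backup suffix
      # Portable workaround: skip -i, write to a temp file, then move it back.
      printf 'deploy target: staging\n' > /tmp/ci-demo.txt
      sed 's/staging/production/' /tmp/ci-demo.txt > /tmp/ci-demo.txt.tmp \
        && mv /tmp/ci-demo.txt.tmp /tmp/ci-demo.txt
      cat /tmp/ci-demo.txt   # -> deploy target: production
      ```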

      Who wants to learn the tooling and workings of 3 different OSes, when only 1 is required?

      Am I saying that software doesn’t run on other platforms? No. I’m saying, when you’re ready, you won’t have to.

      Windows is actually the most popular OS

      I’m not talking about developer preferences. I’m talking about the tools that are required at work.

      it’s highly irrelevant because … multiple layers of abstraction

      This is simply not true. And in fact, this is actually a huge problem. Sure, everything is nice when it works, but the problem is when something goes wrong and you’re trying to debug. The layers of abstraction will get in the way. At some point, you’ll have to SSH into a machine and figure out why your app is crashing. Or if you don’t SSH, you’ll try to reproduce the issue locally. The closer your dev environment is to the prod environment, the easier debugging gets.

      Additionally, all of those abstractions will change. Each job will use slightly different services, and new products will come while old ones die. Linux skills are more transferable from job to job and over the long term.

      But for work, reliable and polished > tinkering and infinite customizability

      Yeah, I’m with you here. Linux gives you the freedom to do whatever the heck you want, even if it’s wrong. I would advise against using Linux for tinkering and customizability. That’s how you break stuff. Stick to well tested hardware and software and get your work done. I’ve always been issued a boring Ubuntu LTS laptop at jobs and I don’t customize it.

      • boonhet · 1 year ago

        Preface: I’m assuming most people are going to get jobs at a SaaS or hardware company. Obviously, this doesn’t make sense if you’re a macOS, iOS or Windows developer. Also, this is about the work experience, not casual, personal.

        But that is also what I'm talking about. I don't know of a single modern SaaS company that deploys straight on Linux anymore. It's always going to be some form of container nowadays, making it irrelevant which OS you use. Sure, Docker might be replaced one day by something else, but you're still unlikely to run anything straight on Linux.

        Especially at work, making things “platform-agnostic” sucks. You’re always going to need to support Linux—optionally you can add macOS and Windows support, but that’s not required and adds unnecessary complexity. It doubles or triples your workload with low-to-no value tasks that your customers, CI, and app don’t care about.

        What are you talking about here? MacOS and Windows support for the actual backend you’re developing? It’s highly irrelevant, because you’d be using Docker so it only needs to support Linux. And for most popular runtimes nowadays, it’s actually easy to do cross-platform if you were so inclined. JVM doesn’t care what it runs on, Python doesn’t care much either, and Rust compiles for multiple platforms. My jars and Rust binaries still go inside docker if they’re supposed to be backend services, rather than client applications.

        Or if you were talking about MacOS and Windows support for desktop applications… I don’t think NOT supporting Windows is wise, MacOS is pretty optional.

        You could be done with your sed bash script to get CI working again, but wait, you have to go back and handle macOS differently and if you have Windows, sheesh.

        Why do you need to have CI working on your local machine rather than your CI servers (which should be running Linux)?

        I’m not talking about developer preferences. I’m talking about the tools that are required at work.

        Actually, I’m saying Windows is used more, not that people prefer it more. 47% of people reported using Windows for work, 33% MacOS, 26% Ubuntu and so on.

        it’s highly irrelevant because … multiple layers of abstraction

        This is simply not true. And in fact, this is actually a huge problem. Sure, everything is nice when it works, but the problem is when something goes wrong and you’re trying to debug. The layers of abstraction will get in the way. At some point, you’ll have to SSH into a machine and figure out why your app is crashing. Or if you don’t SSH, you’ll try to reproduce the issue locally. The closer your dev environment is to the prod environment, the easier debugging gets.

        My dev environment is actually closer to the prod environment than it would be if I was running everything without these abstraction layers. The abstraction layers are precisely what guarantee sameness. Prod is a Linux container running on Linux. Dev is a Linux container running on MacOS. Since they’re both pulling from the same base images, it guarantees that I have the same version of each system library, etc. No chance of something like prod being Ubuntu 20.04 and dev being 22.04, or prod having Eclipse Temurin while I use Azul, or prod being OpenJDK 17.0.7 and dev being OpenJDK 17.0.8.
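        Pinning is what enforces that sameness. A hedged sketch (the exact tag here follows Temurin's naming scheme but is illustrative; check the registry for the current one):

        ```dockerfile
        # Dev-on-macOS and prod-on-Linux both pull the same Linux base layers,
        # so the JDK patch level and system libraries underneath are identical.
        FROM eclipse-temurin:17.0.8_7-jre
        # Stricter still: pin the image digest, e.g.
        # FROM eclipse-temurin:17.0.8_7-jre@sha256:<digest>
        COPY target/app.jar /app/app.jar
        ENTRYPOINT ["java", "-jar", "/app/app.jar"]
        ```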

        Yeah, I’m with you here. Linux gives you the freedom to do whatever the heck you want, even if it’s wrong. I would advise against using Linux for tinkering and customizability. That’s how you break stuff. Stick to well tested hardware and software and get your work done. I’ve always been issued a boring Ubuntu LTS laptop at jobs and I don’t customize it.

        And that's why most companies I know about give their devs MacBooks nowadays. It's much harder to fuck things up, and regardless of whether you give your employee a MacBook or a boring Ubuntu LTS laptop, they're going to be running nearly everything in containers anyway.

        I know one company near me that does give out Linux laptops - they created their own distro for better control of what updates get installed from the repos. The rest of us use containers to abstract away the host operating system of the dev machine.