After chatting with some of you on this forum and seeing that we are all on Lemmy rather than Reddit, I think it would be a good idea for us to form some study groups to improve our technological literacy and competency.

During my time on Lemmy, I’ve been able to increase my digital literacy and overall knowledge surrounding my system. I’ve loved the nearly endless rabbit holes Wikipedia has pulled me into, as well as the resulting happiness that comes from finally fixing a broken Linux system or piece of technology.

But what exactly does technological literacy encompass, one might ask? I’d like to illustrate via anecdote. When I first got into Linux, I was told to “Get a terminal emulator to SSH into the HPC so that you can run computational jobs”. To most of you this sentence is completely normal, but to my unconditioned mind, it felt like a big bright light was being flashed in my eyes while my PI spoke Martian to me. After the initial disorientation, I downloaded what I thought was my only option for a terminal emulator (MobaXterm) and found myself sitting in front of a pitch-black terminal screen with a blinking prompt. I didn’t know what a host was, how to manage a network, or any Linux commands (coreutils? never heard of her…), and couldn’t really do anything past opening up WoW and Google Docs. The only thing more advanced than the plug-and-play Google/Microsoft software solutions I’d use was my botched LaTeX setup, which I used to typeset math equations for my students, homework, and lab reports because I could type in TeX far faster than I could click on every Greek letter/symbol I needed. Overall, it really hampered my ability to do the research I was tasked with. I was supposed to learn how to use Vim as my IDE when the only IDE I had ever worked in was Spyder from Anaconda! VSCodium, Code::Blocks, Emacs, etc.: I did not know that any of these existed.

Needless to say, it was extremely discouraging to be thrown headfirst into a difficult scenario with very little assistance while trying to juggle coursework and outside responsibilities. My humble beginnings reinforced the fear that if I experimented with my computer and messed up on the OS side, I’d brick my hardware and end up with some variation of Homer Simpson holding up the “So You Broke the Family Computer” book.

I’m sure that we all come from varying levels of computer literacy, so I’ve proposed a couple of possible areas of study that we could set up in small or large groups depending on interest. The frequency, literature references (textbooks, white papers, blogs, forums, etc.), and the project goal (which could be concrete or abstract) should be drawn up and worked toward to keep each topic focused. I’ve come up with a couple of fields for us to start with; feel free to add to the list or modify what I’ve written.

  1. Cryptography with a rigorous mathematical foundation, applied to both classical and quantum computing paradigms (AES, RSA, hash functions explored deeper than just the surface, information theory (we love our boy Claude Shannon), cryptographic primitives, Shor’s algorithm, etc.)
  2. A hardware agnostic study of firmware (What are some unifying principles about firmware that can empower the user to understand why certain aspects of the device are not functioning)
  3. Hardware architectures (GPU, NPU, TPU, CPU, RAM, DIMM)
  4. Form factors (how geometry can impose certain design decisions, and so forth)
  5. Fundamentals from first principles, i.e., condensed matter physics theories for understanding classical computing systems. The group could also choose to segue into topological states of matter (Dirac fermions, Weyl semimetals, Mott insulators, and a myriad of other cool states of matter that aren’t really discussed outside of physics/graduate engineering classes), qubits (Bloch sphere representations), and loads of other things that I’m sure exist but am unaware of.
  6. LLM inference technology and how it can be applied to case law, accounting, stocks, and various other fields where the solution to the problem lies somewhere in an encoded technical language.
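To give a tiny taste of topic 1, here is a minimal Python sketch (standard library only, so treat it as an illustration rather than anything cryptographically rigorous) that computes the Shannon entropy of a byte string and demonstrates the avalanche effect of SHA-256. The sample strings are arbitrary placeholders.

```python
import hashlib
from collections import Counter
from math import log2

def shannon_entropy(data: bytes) -> float:
    """Average information content in bits per byte (Shannon, 1948)."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

# A perfectly repetitive message carries no information per symbol...
print(shannon_entropy(b"aaaaaaaa"))        # 0.0
# ...while uniformly distributed bytes hit the 8 bits/byte maximum.
print(shannon_entropy(bytes(range(256))))  # 8.0

# Avalanche effect: "p" (0x70) and "q" (0x71) differ by a single bit,
# yet roughly half of SHA-256's 256 output bits flip.
a = hashlib.sha256(b"study group").digest()
b = hashlib.sha256(b"study grouq").digest()
flipped = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
print(f"{flipped} of 256 output bits differ")
```

That avalanche behavior is exactly the kind of property that’s worth understanding “deeper than just the surface”: it’s what makes hash outputs useless for inferring anything about similar inputs.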

I’d like to begin the discussion with this as our starting framework. Does anyone have any interest in the topics listed above, or suggestions for other subjects? How should we manage these groups? Should we add some chats to the Matrix instance?

  • nickwitha_k (he/him)@lemmy.sdf.org
    1 year ago

    Oooh! Very cool stuff. As one whose academic background was in chemistry, I really like that.

    > I’ve heard interesting things about the RISC-V architecture which have made me want to look more into it. Logic design is something that I feel isn’t touched upon nearly as much as it should be in other technical disciplines. My undergraduate engineering coding experience was complete shit, for lack of a better term.

    Yeah. I’ve been slowly working through Learning the Art of Electronics but too much has come up this year. RISC-V, coupled with artificial hardware shortages (scalpers), is really what got me interested in digital circuits. I’ve been a FOSS proponent for years and have been growing both more annoyed at the lack of innovation and effort to rectify supply shortages, and less and less comfortable with centralized control of hardware implementation (repairability, privacy, modifiability, and the general ability to figure out how a thing works).

    RISC-V, to me, seems to be solving a lot of that by encouraging more open approaches to ISA development. It isn’t quite GPL-level, but some of the most performant and general-purpose-oriented implementations have been open-sourced by their implementers (e.g. the Berkeley BOOM series of cores and even Alibaba T-Head’s cores, which are likely to be present in servers by the end of the decade).

    The biggest challenge, to me, is that there are not currently any great ways to make the ISA implementations democratized, accessible, and performant in modern use cases. The performance side is directly related to the two options for implementation: FPGA and custom silicon.

    FPGAs are limited both in their number of logic cells and physically, due to the size and nature of their logic gates (bigger and further apart means slower and less efficient). For example, an implementation of a modified Berkeley SonicBOOM (BOOMv3) that is hardened against Spectre attacks takes about 115k LUTs per 100 MHz core; my Xilinx 7-series FPGA, which is 4 generations old, only has an 85k-LUT capacity, and a cutting-edge FPGA costs thousands of dollars just for the bare chip.

    Custom silicon is just insanely expensive and out of reach for most. The lithography equipment necessary for current process nodes is pretty much only available to multinational corporations. I’m hoping alternatives that are “good enough” for modern workloads become more available.

    > I think that having a masterclass to get everyone on board with using Vim/Neovim/Emacs (or at least something with much more versatility as an editor) would be a good idea as well.

    Absolutely. That sounds like a worthwhile undertaking.

    > Embedded systems hold particular interest for me too. I took a control theory class in my senior year where we had to analyze reactors, distillation columns, and disturbances in steady states (i.e. fluctuations in P, T, x_{i} (liquid-phase compositions), and many more). I’m assuming the embedded systems that actually implement this could go into the curriculum too.

    Indeed. I’m mainly self-taught there, with a focus on hobbyist MCUs, but the entry requirements, both financial and knowledge-wise, have gone down significantly. A Raspberry Pi Pico W, a dual-core 32-bit ARM MCU with onboard WiFi and BLE, can be had for $6. It can be programmed in bare C, Arduino C, at least two implementations of Python, Rust, and a number of other languages, including drag-and-drop languages like MakeCode.

    Definitely interested to see where this goes and contribute.