Another post about time travel got me wondering: how far back in time could I hypothetically leave a modern computer such that the most capable engineers of the era could reverse engineer it, even partially?

  • flamingo_pinyata@sopuli.xyz
    1 year ago

    The biggest issue would be the microchips, which require some really precise machinery to manufacture.

    1930s - complete reverse engineering
    By then they had an understanding of both semiconductors and computational theory. Using semiconducting materials to compute wasn't yet a thing, but the concept wouldn't come as much of a surprise. Some kind of reproduction is likely: not a 5 nm manufacturing process like modern chip factories, but they could make something.
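    The "computing with switches" idea the comment alludes to is exactly what 1930s engineers were already doing with relays: any computation can be built out of a universal gate like NAND, whether the switch is a relay or a transistor. A minimal sketch in Python (the construction is the standard one, the function names are mine):

```python
# A NAND gate: the output is 0 only when both inputs are 1.
# Relays, vacuum tubes, and transistors can all implement this switch.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

# Standard construction of XOR and AND from NAND alone.
def xor(a: int, b: int) -> int:
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def and_(a: int, b: int) -> int:
    return nand(nand(a, b), nand(a, b))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits: returns (sum, carry)."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

    Chain enough of these and you get arithmetic, which is why the conceptual leap in the 1930s is small even if the fabrication leap is enormous.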

    1890s - eventual understanding, but not able to manufacture
    Measuring devices were sensitive enough by then to detect tiny electrical fluctuations. They would be able to tell that the device functions by processing electrical signals, and even capture those signals. The biggest missing piece is mathematical theory: they wouldn't immediately understand how those electrical signals produce images and results. Reproduction: no. Maybe they would get an idea of what's needed (refining silicon and doping it with other elements), but there's no way they could do it with the equipment of the day.
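    The "missing mathematical theory" for how signals become images is mostly encoding: a raster image is just bits laid out in rows. A toy illustration (the 8x8 bit pattern here is made up for the example):

```python
# Interpret a flat bitstring as a monochrome raster image:
# each '1' becomes a lit pixel ('#'), each '0' a dark one ('.').
def render(bits: str, width: int) -> list[str]:
    rows = [bits[i:i + width] for i in range(0, len(bits), width)]
    return ["".join("#" if b == "1" else "." for b in row) for row in rows]

bits = (
    "00111100"
    "01000010"
    "10100101"
    "10000001"
    "10100101"
    "10011001"
    "01000010"
    "00111100"
)

for line in render(bits, 8):
    print(line)
```

    Without knowing the convention (row width, bit order, color depth), captured signals would look like noise, which is the comment's point.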

    1830s - electricity goes into a tiny box and does calculations, wow!
    This is the age of the first great electrical discoveries. They would be in awe at what is possible, and understand at a high level how it's supposed to work. Absolutely no way to make it themselves.

    1730s - magic, burn the witch!

    • ColeSloth@discuss.tchncs.de
      1 year ago

      The novel ways we've come up with to make processors and circuit boards over the past 40 years have been pretty amazing. I believe you're giving the people of the 1930s too much credit here. For instance, the entire industry has known for 40+ years that making chips smaller, with more transistors, yields better performance. It's taken that long, coming up with manufacturing "tricks", to get down to what we have today. The same goes for RAM and hard drives.
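      To put some rough numbers on that 40+ year scaling trend, here's the popular "doubling every two years" rule of thumb applied from the Intel 4004 (1971, ~2,300 transistors). This is an idealized illustration, not a claim from the comment; real progress was bumpier:

```python
# Back-of-the-envelope Moore's-law arithmetic: assume transistor
# counts double roughly every two years starting from the Intel 4004.
BASE_YEAR, BASE_COUNT = 1971, 2_300  # Intel 4004

def transistors(year: int) -> float:
    doublings = (year - BASE_YEAR) / 2
    return BASE_COUNT * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(year, f"~{transistors(year):.1e} transistors")
```

      Decades of doublings, each one earned through a new manufacturing trick, is exactly why a 1930s lab couldn't just skip to the end.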

      And the code that programs it all to run would be completely unreadable, much less the code for concepts that wouldn't have been named, created, or even thought of yet. Or how to program and read anything off a solid-state drive or the RAM.
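      To see why the code would be unreadable: everything they could extract from the machine is just bytes, with nothing marking which bytes are instructions and which are data. For example, these six bytes are real x86-64 machine code for "return 42" (mov eax, 42; ret), but without the instruction-set manual they're indistinguishable from numbers:

```python
# x86-64 machine code for: mov eax, 42 ; ret
# B8 is "mov eax, imm32", followed by 42 as a 32-bit little-endian
# immediate, then C3 ("ret"). To someone without the ISA spec this
# is just an opaque byte sequence.
code = bytes([0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3])
print(code.hex(" "))
```

      And that's the easy layer; above it sit compilers, file systems, and flash-controller firmware, none of which existed even as ideas.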

      The first "digital computer" was built in 1945. Giving them a laptop in the 1930s might move that date up a bit, but most progress since then has been refining the manufacturing process. They wouldn't be able to recreate the laptop at all; not even in the 1980s could they have built it.