My wife and I started talking about this after she had to help an old lady at the DMV figure out how to use her iPhone to scan a QR code. We’re in our early 40s.

  • nednobbins
    11 months ago

    I think you’re sort of right but it will depend heavily on how radical a shift the new technology is. In order for there to be this kind of divide there needs to be a steep learning curve to the technology. People are only willing to put up with those learning curves if there’s a significant advantage. That means that manufacturers can only successfully market “difficult” technologies if they provide a big advantage.

    I’m not aware of any old people having difficulty transitioning from quills to fountain pens to ballpoint pens. They all basically did the same thing, and you only had to make minor adjustments. Nobody bothered learning how to use the Writer since it didn’t actually let you do anything better. People were willing to climb the steep learning curve of typewriters because, once they did, they could write significantly faster.

    Computers and cell phones are a whole different way of interacting with people and information than “hardcopy” was. You didn’t just swap in new objects that did the same thing with a different approach. It wasn’t even just a slightly different way of doing the same thing. Those technologies allowed us to interact with the world in a totally new way. It was worth learning a bunch of weird computer stuff that older generations had never heard of because we could do things they never dreamed of. (e.g. I used to get rushed when talking with my grandmother to save on long distance bills; now I don’t even think about long distance costs other than latency.)

    I’m sure that sort of thing will happen again, but it would require a far more disruptive technology than AR. That’s a small iteration that we’ve already been primed for. When Terminator 1 came out, nobody was confused when it switched to “terminator vision” and you saw the AR display. That’s why I joke about neural interfaces. In theory, those could give a person significantly higher throughput to their computer, and there are all kinds of potential benefits too. It would be worth it for people to put up with steep learning curves, unintuitive interfaces, and lots of troubleshooting if it meant they could suddenly “read” at 10,000 words a minute or control complex robots. Not everyone would go through that effort, and it would create the kinds of divides that we saw with computers.

    When I look at current technologies as an old(ish) person, it’s a very different view than my parents and grandparents had. They didn’t understand the new technologies. I have no trouble understanding them; I just think a lot of them are a waste of my time (unlike screwing around on Lemmy, which is totally productive /s).