Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

  • Nevoic · 7 months ago

    Just to be clear, the claim is that human thought is qualitatively different from an algorithm; I just haven’t been convinced of that claim. I chose my words incredibly carefully here; this isn’t me being pedantic.

    Anyway, I don’t know how you’ve come to the definitive conclusion that emotions somehow aren’t information, or that thoughts and dreams are somehow not the outputs of some process.

    Nothing you’ve outlined is necessarily impossible to derive as the output of some process. In fact, it’s quite possible that these things can only ever arise as the outputs of some process, unless you think they’re spawned into existence without causes, which I believe is what religious people hold (it’s the essence of a free soul). I’m not religious.

    • Budwig_v_1337hoven [he/him]@hexbear.net · 7 months ago

      “Some process”, sure, but not every process is an algorithm. My digestion is a complex process with outputs, but I wouldn’t describe it as algorithmic. You might want to, and you probably can, but I’d argue that at that point you’re just flattening an incredibly complex, species-spanning process into a mathematical representation for ideological reasons.

      • Nevoic · 7 months ago (edited)

        The question is whether human thought can be represented algorithmically. It seems we agree that it’s plausible?

        • Budwig_v_1337hoven [he/him]@hexbear.net · 7 months ago

          Yeah, I think we might agree there, but I don’t think that supports the original assertion that human thought is nothing but an (exceedingly complex) algorithm. You can also represent human thought as a system of hydraulic pressures; that’s what early psychology did, and it’s how we got words like “repression”. But just because you can do that, and maybe even gain some useful knowledge from it, doesn’t mean human thought is actually made up of a complex system of pressures and valves, or of algorithms. Your map may seem useful, but it ain’t the territory, is what I’m trying to get at, I guess.

          To be clear, I don’t think AGI/ASI is an impossible idea, but I’m pretty confident that current approaches won’t even get us in the ballpark, because they are fundamentally not the right tool for the job. Any allusion to having built the “almost AGI, swear, we’re this close this time” seems to me to be little more than marketing hype for Silicon Valley products and tech stocks. Maybe some day gluing enough of these products together will get you something indiscernible from AGI, but I really do doubt that whole premise. A text transformer won’t become sentient just by throwing more text at it and telling it to process; that’s just a hand-wavy sci-fi premise at best.