Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa.

    • LazaroFilm@lemmy.world

      “By the way, you can now pay for Alexa AI option if you want me to reply in a slightly smarter way, but I will still cut you off with ads and other useless things. To activate AlexaAI say activate”

        • spitfire@infosec.pub

          Just made the switch to NextDNS. For $2/month I get a lot of the same features but also on my phone when not on WiFi. Still love my pihole though!

      • JonEFive@midwest.social

        “No”

        “I heard ‘activate’. Thank you! Your credit card will be charged $129 annually. To cancel, please log on to the website because there’s no way we’re letting you get out of this mess the same way we got you into it.”

        • bobs_monkey

          To cancel, please log on to the website because there’s no way we’re letting you get out of this mess the same way we got you into it.

          Unless you’re in California

      • FireTower@lemmy.world

        *to the same degree of intelligence as you’ve previously experienced. (PS: if you don’t, we’re making Alexa have a room-temp IQ)

    • pyldriver@lemmy.world

      Mine can’t ever seem to tell the difference between on and off if there is any sound in my house

      • glimpseintotheshit@sh.itjust.works

        Siri was always shit but somehow managed to devolve even further lately. I never trusted her to do more than turning lights on or off, but now this shit happens:

        Me: Siri, turn off the lights in the living room

        Siri: OKAY, WHICH ROOM? BATHROOM, BEDROOM, KITCHEN, HALLWAY, LIVING ROOM?

        Imagine living in a mansion with this cunt

        • Staple_Diet@aussie.zone

          I use Google to turn on my TV by saying ‘turn on TV’, easily done. But then when I ask it to adjust volume it asks me which TV… I only have one TV online and it had just turned it on.

    • scarabic@lemmy.world

      But he acknowledged that Alexa will need to drastically improve before that happens.

      I get tired of the outrage-headline game.

  • arthurpizza@lemmy.world

    We need to move AI from the cloud to our own hardware running in our homes. Free, open source, privacy focused hardware. It’ll eventually be very affordable.

    • LEX

      That’s already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.

      Llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there’s no dedicated graphics card available (although more slowly). And there are numerous open source models available that can be trained for just about any task.

      Hell, you can even run llama.cpp on Android phones.
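
      For anyone curious what “running a model locally” looks like in practice, here’s a minimal sketch using the llama-cpp-python bindings for llama.cpp; the model path is a placeholder for whatever quantized GGUF file you’ve downloaded, and the parameters are just reasonable defaults:

      ```python
      # Minimal local, CPU-only inference with llama-cpp-python
      # (the Python bindings for llama.cpp). No GPU or internet needed.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder GGUF file
          n_ctx=2048,    # context window size
          n_threads=8,   # CPU threads to use
      )

      out = llm(
          "Q: Name the planets in the solar system. A:",
          max_tokens=64,
          stop=["Q:"],
      )
      print(out["choices"][0]["text"].strip())
      ```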

      This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

      • Zetta@mander.xyz

        Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.

        • LEX

          Absolutely and there are many, many models that have iterated on and surpassed Pygmalion as well as loads of uncensored models specifically tuned for erotic chat. Steamy role play is one of the driving forces behind the rapid development of the technology on lower powered, local machines.

            • LEX

              Hugging Face is where the models live. Anything that’s uncensored (and preferably based on Llama 2) should work.

              Some popular suggestions at the moment might be HermesLimaRPL2 7B and MythomaxL2 13B for general roleplay that can easily include nsfw.

              There are lots of talented people releasing models every day, tuned to assist with coding, translation, roleplay, general assistance (like ChatGPT), writing, all kinds of things, really. Explore and try different models.

              General rule: if you don’t have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.
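
              If you do have a dedicated GPU, the same llama-cpp-python setup can offload layers to it, which is what makes the bigger models practical; a rough sketch (the model file name is hypothetical):

              ```python
              from llama_cpp import Llama

              # With a dedicated GPU, larger quantized models (13B and up) become usable.
              # n_gpu_layers=-1 asks llama.cpp to offload every layer it can to the GPU;
              # lower the number if VRAM runs out.
              llm = Llama(
                  model_path="./models/mythomax-l2-13b.Q5_K_M.gguf",  # placeholder file name
                  n_gpu_layers=-1,
                  n_ctx=4096,
              )
              ```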

          • Zetta@mander.xyz

            Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha

            • LEX

              Hey, I replied below to a different post with the same question, check it out.

                • LEX

                  lol nothing to be sorry about, I just wanted to make sure you saw it.

        • LEX

          Thanks for this, I haven’t tried GPT4All.

          Oobabooga is also very popular and relatively easy to run, but it’s not my first choice, personally.

        • LEX

          13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?

          It is relative, so I guess if you’re comparing that to an Atari 2600 cartridge then, yeah, it’s hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
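
          Those figures line up with a simple back-of-the-envelope estimate: on-disk size is roughly parameter count times bits per weight. A quick illustration (the bit-widths are approximate averages for common quantization levels):

          ```python
          # Rough on-disk size of a quantized model: params * bits_per_weight / 8 bytes.
          def approx_size_gb(params_billions: float, bits_per_weight: float) -> float:
              return params_billions * 1e9 * bits_per_weight / 8 / 1e9

          print(approx_size_gb(7, 4.5))   # ~3.9 GB  (7B at ~4-bit quantization)
          print(approx_size_gb(13, 4.5))  # ~7.3 GB  (13B at ~4-bit quantization)
          print(approx_size_gb(13, 6.0))  # ~9.8 GB  (13B at ~6-bit quantization)
          ```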

          • scarabic@lemmy.world

            Yeah that’s not a lot. I mean… the average consumer probably has 10GB free on their boot volume.

            It is a lot to download if we’re talking about ordinary consumers. Not unheard of, though - some games on Steam are 50GB+.

            So okay, storage is not prohibitive.

        • arthurpizza@lemmy.world

          Storage is getting cheaper every day and the models are getting smaller with the same amount of data.

      • teuast@lemmy.ca

        In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

        You’re probably right, but I kinda hope you’re wrong.

          • teuast@lemmy.ca

            Call it paranoia if you want. Mainly I don’t have faith in our economic system to deploy the technology in a way that doesn’t eviscerate the working class.

            • LEX

              Oh, you are 100% justified in that! It’s terrifying, actually.

              But what I am envisioning is using small, open source models installed on our phones that can answer questions or just keep us company. These would be completely private, controlled by the user only, and require no internet connection. We are already very close to this reality: local AI models can be run on Android phones, but the small AI “brains” that are best for phones are still pretty stupid (for now).

              Of course, living in our current Capitalist Hellscape, it’s hard not to imagine that going awry to the point where we’ll all ‘rent’ AI from some asshole who spies on everything we do, censors the AI for our own ‘protection’, or puts ads in there somehow. But I guess I’m a dreamer.

    • pyldriver@lemmy.world

      God I wish, I would just love local voice control to turn my lights and such on and off… but noooooooooooo

        • pyldriver@lemmy.world

          I have Home Assistant, but have not heard anything good about Rhasspy. I just want to control lights and be able to use it to play music and set timers. That being said, I run Home Assistant right now and can control it with Alexa and Siri, but… I would like local only.
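
          For what it’s worth, Home Assistant already exposes its Assist/conversation engine over the local REST API, so text commands never have to leave the LAN; a rough sketch (the host, token, and command are placeholders), with local speech-to-text in front of it being the missing piece:

          ```python
          import requests

          # Send a plain-text command to a local Home Assistant instance through its
          # conversation API. The URL and long-lived access token are placeholders.
          HA_URL = "http://homeassistant.local:8123"
          TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

          resp = requests.post(
              f"{HA_URL}/api/conversation/process",
              headers={"Authorization": f"Bearer {TOKEN}"},
              json={"text": "turn off the living room lights", "language": "en"},
              timeout=10,
          )
          resp.raise_for_status()
          print(resp.json())  # includes the Assist response for the command
          ```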

      • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net

        I have that with just my phone, using Wiz lights and IFTTT. It’s the only home automation I even have, because it’s the only one I found that doesn’t necessarily need a special base station like an Alexa or Google Home.

        • AA5B@lemmy.world

          But you want a local base station, else there’s no local control. You want to use local-only networks like Z-Wave, Zigbee, Thread, Bluetooth, etc., even though they require a base station, because that’s what gives you a local-only way of controlling things.

          Matter promises a base station may no longer be necessary for smart devices to control each other, but it is rolling out very slowly

          I also wonder what I’ll be able to do with the Thread radio in the iPhone 15 Pro

          • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net

            The base stations are what uses the cloud/AI shit. The setup I have doesn’t even require an Internet connection or wifi; it’s entirely bluetooth. Why in the hell would I want a base station that costs money, is controlled by Amazon or Google, and requires an Internet connection for my local shit?

            I don’t want a piece of hardware that does nothing but act like a fucking middleman for no good reason.

            • AA5B@lemmy.world

              I’m a huge fan of Home Assistant. You might look into it

            • foggenbooty@lemmy.world

              That is not necessarily true. Some base stations use the internet, yes, but not all. For example, a Philips Hue hub does not require internet access, nor does Lutron Caseta. As the other person posted, Home Assistant is the absolute best (IMO) way to do everything locally without the internet.

              Your system, while it might work for you, does not scale well due to the limited range and reliability of Bluetooth. You’d likely be better off adopting a more robust protocol like Z-Wave or Zigbee and getting a hub that you have full control over.

    • a1studmuffin 🇦🇺@aussie.zone

      It’s the year of the voice for Home Assistant. Given their current trajectory, I’m hopeful they’ll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year. Setting timers, shopping list management, music streaming, doorbell/intercom management. If you’re on the fence about a Nabu Casa subscription, pull the trigger as it helps them stay independent and not get bought out or destroyed by commercial interests.

      • AA5B@lemmy.world

        Thumbs up for Nabu Casa and Home Assistant!

        I haven’t yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I’m on the fence between buying a few more vs. finding something with a little more power, specifically for voice processing.

        • foggenbooty@lemmy.world

          Get something with a little more power. Pis are priced beyond the point where they make sense these days. You can get an Intel N100 system on AliExpress/Amazon for pretty cheap now, and I’ve got mine running Proxmox hosting all kinds of stuff.

    • Captain Aggravated@sh.itjust.works

      I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that can run entirely on the phone. It’s got a shocking amount of processing power in it.

    • AA5B@lemmy.world

      While you may have points against Apple and how effective Siri may be, with this latest generation of products even the watch has enough processing power to do voice processing on device. No ads. No cloud services.

      • whofearsthenight

        Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of Echos early, then they got a little shitty, but I was in, and now I just want them out of my house except for one thing - music. Spotify integration makes for easy multi-room audio in a way that doesn’t really work as well on the other platform I’ll consider (Apple/Siri), and basically adds Sonos-like functionality for a tiny fraction of the price. The Siri balls and AirPlay are just not as good, and of course, don’t work as well with Spotify.

        But alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that even though it’s a little less convenient because I’m really goddamned tired of hearing “by the way…”

  • NocturnalMorning@lemmy.world

    AI is being touted as the solution to everything these days. It’s really not, and we are going to find that out the hard way.

      • HughJanus@lemmy.ml

        Yes, but so much more. An actually useful assistant that could draft emails, set reminders appropriately, create automations, etc. would be worth A LOT of money to me.

        • whofearsthenight

          I think if there ends up actually being a version of AI that is privacy focused and isn’t screwing over creators it’d be so much less controversial. Also, everyone (including me) is really, really fucking sick of hearing about it all of the time in the same way that everyone is/was sick of hearing about the blockchain. As in: “Bro your taco stand needs AI/the blockchain.”

          • HughJanus@lemmy.ml

            You wouldn’t need any kind of special training for this. Just the ability to do simple things like make calendar appointments, draft emails/responses, and set reminders based on time/locations/etc. It really doesn’t seem very complicated but as far as I know no one has figured out how to do it yet. All the existing “assistants” are so bad that I don’t even bother trying to use them anymore. They can’t even do something simple like turning on a light with any degree of reliability.

    • Valmond@lemmy.mindoki.com

      Hey, that’s only because Amazon, Google, and Microsoft (et al.) just don’t have the money to make it good!!

      So what about 9.99 a month?

      4.99 if you pay up front for a year?

      Euh, or how much can you cough up, like for a year or at least for Q4, I’m literally on a bad roll here.

      • NocturnalMorning@lemmy.world

        I’m not going to buy into a subscription model for something I’ve already paid for. This subscription model crap is complete bullshit.

        We’ve even seen it tried with heated seats recently - install heated seats in the car, but disable them in software. It’s crazy that companies think they can get away with this.

        • Stumblinbear@pawb.social

          I think there’s a massive difference between unlocking a feature that’s already there and requires no maintenance and a cloud-based service that demands 24/7 uptime and constant developer support, as well as ongoing feature development

        • slumberlust@lemmy.world

          While I agree with you, they are 💯 going to get away with it, because your average consumer just doesn’t care.

    • Hoomod@lemmy.world

      If IBM actually manages to convert COBOL into Java like they’re advertising, they’ll end up killing their own cash cow

      So much still runs on COBOL

  • 5BC2E7@lemmy.world

    Alexa is more like a telemarketer disguised as an assistant. Every interaction is followed by a “by the way…”. It’s a shit experience, so I stopped using mine.

    • Corkyskog@sh.itjust.works

      Alexa was designed explicitly for that purpose. They lose money on every Echo sold; the whole idea was they would make money selling you stuff. Turns out people would rather use their Echo to check the weather, get recipes, etc. than voice shop.

      • hightrix@lemmy.world

        I just can’t see a use case for voice shopping. There are almost zero instances where I want to buy something without having a visual of that thing in front of me at time of purchase.

        I could possibly see something like “buy another stick of deodorant”, but even then I want to see if there are deals or some other options and would want to check the price at a minimum.

        Seems like yet another MBA idea.

        • SpaceCowboy@lemmy.ca

          Yeah, it seems the execs who had the idea for Alexa never used Amazon for shopping. It’s a shit shopping site full of scammy products. I’d never buy anything from them without checking out the prices, reviews, etc.

        • OpenPassageways@lemmy.zip

          It’s really only good for re-ordering things you’ve already ordered. It will let you know that it found something in your order history and then you can decide whether you want to order again.

          • hightrix@lemmy.world

            And this makes sense, but I’d still want to check prices to make sure that my $3 deodorant didn’t get discontinued and priced at $30/stick.

            • GamingChairModel@lemmy.world

              Well, you think this way because you’ve seen what happened to Amazon over the past 10 years. Ten years ago, when they were getting ready to launch the Echo, Amazon was a great retailer that people trusted. Now, after a decade of sellers gaming listings and reviews and Amazon customer service deteriorating, we’ve been trained not to trust Amazon’s defaults.

      • Cort@lemmy.world

        Ha, I use mine almost exclusively as a light switch. I don’t have to get out of bed to turn off my lights or turn on my fan. I’m sure they’re losing a bunch of money on me

    • o0oradaro0o@lemmy.world

      Setting all my Alexas to UK English got rid of all the marketing “by the ways.” I still regret going with the Alexa ecosystem, but at least for now there is a workaround for the most rage-inducing part of it.

    • locuester@lemmy.zip

      By the way, did you know that you can find out more about telemarketing with an audiobook from Audible on the subject? Would you like to hear a preview of that now?

  • Treczoks@lemmy.world

    So they expect that people pay for being spied upon and seriously data mined?

      • Esqplorer@lemmy.zip

        I don’t know about that. They never delivered on Smart Home promises and the only truly useful thing my Google AI does is to give me the forecast. Otherwise it’s just a wifi speaker.

        If they finally integrate Bard, I would actually consider paying for the service.

  • galaxies_collide@lemmy.world

    So they get massive amounts of free data for Machine Learning, but want to charge users for supplying it?

    • HughJanus@lemmy.ml

      It’s like charging you for cable and then shoving ads down your throat.

    • tempest@lemmy.ca

      That’s often the case. They can have their cake and eat it too. Shareholders would expect nothing less.

    • ChaoticNeutralCzech@feddit.de

      I think the data is probably less valuable than people think, especially if the users expect an AI response whenever a data point can be collected from them.

  • Dem Bosain@midwest.social

    Alexa has a feature where you tell it you’re leaving the house and it will listen for smoke detectors or breaking glass, alerting you through your phone if it detects something. Amazon is putting that behind a paywall next year.

    • Natanael@slrpnk.net

      And much of it can be listened to by staff who are hired to label it in order to train the model.

    • ViewSonik@lemmy.world

      Yep, used to be much better. There was SO much potential with it too. I wish there was a smart speaker with ChatGPT integration. I’d love to stand in the shower and ask it shit.

      • deranger@lemmy.world

        You can do this with a Siri shortcut.

        It still falls short because LLMs aren’t smart, they’re just approximately not wrong most of the time. I thought it would be a lot cooler than it actually is.

    • CoderKat

      Yeah, they’re all pretty disappointing. I’d love to have something that feels like how movies portray digital assistants. Movie assistants never misunderstand you or say “I’m sorry, I couldn’t recognize your voice”. I’ve mostly used the Google one and it’s so bad at doing what I feel like is feasible even with inaccuracy.

      Eg, I’ve tried to tell my assistant to like a song that was currently playing on YTM but could not find a voice command that worked (and some commands backfired by making it skip to the next song). I’ve had very poor success with getting assistant to cast something to my Chromecast with my voice. It sometimes works, but it fails or gets it wrong so often that it’s not worth the time.

      Sometimes I use it for rewinding (e.g., “ok google, rewind 30 seconds”) because many apps don’t have granular rewind buttons and tracking on the track bar is way too inaccurate. But lol, it’s so slow! It takes a few seconds to figure out what I said (so I have to ask it to rewind more than I wish) and it seems every app is unoptimized for rewinding, as it usually takes several seconds of loading.

      It can’t really do any kind of research either. You basically can just ask it to google things, and it sometimes is able to extract the meaningful part from simple questions. It’s a far cry from how Hollywood thinks a digital assistant will work.

    • RaoulDook@lemmy.world

      I never got the appeal of those things even ignoring how their design is the antithesis of privacy. It just seems dumb to talk to the computer box, like it’s a thing to talk to when it’s just a microphone and software. I simply prefer direct, precise, and silent control of devices

      • eronth@lemmy.world

        It’s good for hands/device free control. Setting timers while cooking by simply saying “set a timer” or controlling lights from across the room without fiddling with a phone or remote.

        • ram@bookwormstory.social

          “Set a timer” and “set an alarm” are the only two I ever found useful, personally. I stopped using Google Assistant because it just legitimately stopped understanding me correctly and I got frustrated with it.

      • GladiusB@lemmy.world

        It’s very sci fi. Star Trek amongst many others from the 80s. If you are old enough then you would remember that this was the stuff of fantasy. I can see why it appeals to people with disabilities and possibly kids for homework or something. But I am 1000 percent with you on the privacy part. No thanks.

    • DarienGS@lemmy.world

      From the article:

      Amazon has bet big on AI, with the company unveiling a new, AI-powered version of Alexa alongside updated versions of its Echo Frames and Carrera smart glasses last week.

  • OrangeCorvus@lemmy.world

    Good luck, I guess? Got the first Google Home; at first it was great, and I was asking it tons of questions. Then the questions stopped, and I used it for turning on the lights and other automations. Then I installed Home Assistant, and the only command Google Home got was to set a timer to know when to pull things out of the oven. Eventually I stopped doing that.

    At the moment all Google/Nest Homes have their mic cut off, I only use them to stream music in my house from my NAS via Plex. So yeah…

    • tony@lemmy.hoyle.me.uk

      All mine do is turn lights on and off… very occasionally they might be used to find a phone, or set a reminder, but I wouldn’t miss it if that went.

      I wondered if I was unusual in not using the voice features much, but according to this thread it seems I’m not.