This rootless Python script rips Windows Recall's screenshots and its SQLite database of OCRed text, and lets you search them.
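The searchable part really is that simple once the database is in hand: a substring query against the OCR table. A minimal sketch, assuming a hypothetical table named `ocr_text` with `timestamp` and `content` columns (the real Recall schema may differ):

```python
import sqlite3

def search_captures(db_path: str, term: str):
    """Case-insensitive substring search over OCRed screen text.

    'ocr_text' is a hypothetical table name; Recall's actual schema
    may differ, but the idea is the same.
    """
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT timestamp, content FROM ocr_text "
            "WHERE content LIKE ? ORDER BY timestamp",
            (f"%{term}%",),
        ).fetchall()
    finally:
        conn.close()
    return rows
```

SQLite's `LIKE` is case-insensitive for ASCII by default, which is usually what you want when digging through screenshots of your own screen.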

  • qjkxbmwvz@startrek.website · 5 months ago

    Hilarious to me that it OCRs the text. The text is generated by the computer. It’s almost like when Lt. Cmdr. Data wants to get information from the computer database, so he tells the computer to display it and just keeps increasing the speed: there are far more efficient ways of getting information from A to B than displaying it, imaging it, and running it through image processing!

    I totally get that this is what makes sense, and it’s independent of the method/library used for generating text, but still…the computer “knows” what it’s displaying (except for images of text), and yet it has to screenshot and read it back.

    • Wispy2891@lemmy.world · 5 months ago

      The same thing happens on Android, for some reason.

      Like 5-8 years ago, the Google Assistant app was able to select and copy text from any app when invoked; I think it was called “Now on Tap”. Then, because they’re Google and contractually obligated to remove features after some time, they removed this from the Google app and integrated it into the Pixel app switcher (and who cares if 99% of Android users aren’t using a Pixel, they say). The new implementation sucks, as it does OCR instead of just accessing the raw text…

      It only works well with US English, not with other languages. But maybe that’s to be expected, since Google’s development style is US-centric.

      • nawa@lemmy.world · 5 months ago

        Now on Tap also used OCR. Both Google Lens and Now on Tap get the same bullshit results in any language that doesn’t use the Latin alphabet. Literally, Ж gets read as >|< by both, exactly the same.

    • 4am · 5 months ago

      Hey, yeah… why aren’t they just tapping the font rendering DLL?

      Are they tapping the font rendering DLL??

      • HelloHotel · 5 months ago

        My guess is that they looked at their screen reader API, saw that it didn’t capture 100% of the text on screen, and said fuck it! We’re using OCR!
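        That gap is easy to picture: accessibility APIs only expose text that applications bother to report. A toy model below (my own illustration, not any real screen-reader API such as UIA or AT-SPI) walks a UI tree and collects reported text, showing how a widget that paints its own glyphs simply comes back empty:

```python
# Toy accessibility tree: each node is a dict with an optional "text"
# field. Apps that render text themselves (games, canvases, some
# browsers) often expose nothing here, which is the OCR motivation.
def collect_text(node: dict) -> list:
    texts = []
    if node.get("text"):  # only text the app chose to expose
        texts.append(node["text"])
    for child in node.get("children", []):
        texts.extend(collect_text(child))
    return texts

window = {
    "role": "window",
    "children": [
        {"role": "button", "text": "OK"},
        # Custom-rendered canvas: pixels contain words, API sees none.
        {"role": "canvas", "text": "", "children": []},
    ],
}
```

        Here `collect_text(window)` returns only `["OK"]`, even if the canvas visually shows a paragraph of text.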

    • space@lemmy.dbzer0.com · 5 months ago

      Having worked on a product that actually did this, it’s not as easy as it seems. There are many ways of drawing text on the screen.

      GDI is the most common, and it’s part of the Windows API. But some applications do their own rendering (including browsers).

      Another difficulty: even if you could tap into every draw call, you would also need a way to determine what is visible on the screen and what is covered by something else.
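      Even the crudest version of that visibility check takes real work. A sketch (my own illustration, not how any actual compositor works): treat each draw call as an axis-aligned rectangle in z-order, and keep a text draw only if nothing drawn after it fully covers it:

```python
def covers(a, b) -> bool:
    """True if rect a fully covers rect b; rects are (x1, y1, x2, y2)."""
    return a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2] and a[3] >= b[3]

def visible_texts(draws):
    """draws: list of (kind, rect, payload) tuples in z-order
    (later entries are drawn on top). Return the payloads of text
    draws that are not fully hidden by a later draw."""
    out = []
    for i, (kind, rect, payload) in enumerate(draws):
        if kind != "text":
            continue
        if any(covers(r, rect) for _, r, _ in draws[i + 1:]):
            continue  # fully covered by something drawn on top
        out.append(payload)
    return out
```

      Note this only handles *full* occlusion; real windows overlap partially, so a production version would need region arithmetic, which is part of why OCRing the final composited frame looks attractive by comparison.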

    • catloaf · 5 months ago

      That’s the thing: it doesn’t really know what it’s displaying. I can send a bunch of textboxes, but if they’re hidden, drawn off-screen, or underneath another element, then they’re not actually displayed.

    • Eager Eagle@lemmy.world · 5 months ago

      Text from OCR is one kind of match. Recall also runs visual comparisons against the stored image tokens.
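      The visual side can be imagined as nearest-neighbour search over image embeddings. A pure-Python sketch, with made-up two-dimensional vectors standing in for whatever representation Recall actually stores:

```python
import math

def cosine(a, b) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(query, stored):
    """stored: {snapshot_id: embedding}. Return the id whose
    embedding is most similar to the query embedding."""
    return max(stored, key=lambda k: cosine(query, stored[k]))
```

      A real system would use learned embeddings and an approximate index rather than a linear scan, but the matching principle is the same.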

    • TheGrandNagus@lemmy.world · 5 months ago

      To be fair, Data was designed to be like a human, made in the image of his creator. A number of his design decisions come down to his creator wanting to build something human-like, including the one you describe.

      Data was never intended to work like a PC; it’s entirely in character that he can’t just wirelessly interface with stuff.