• LiPoly@lemmynsfw.com · ↑72 ↓3 · 1 day ago

    In theory, there are a million awesome business applications for it.

    Let’s say you’re in construction and your glasses tell you exactly what to build where and how.

    You’re a waiter and the glasses tell you which table ordered what, needs attention, etc.

    You’re a network engineer and the glasses show you on every port which device is connected.

    And don’t even get me started on the military applications.

    Of course we’re not there yet. But that’s why they’re so obsessed with it. They want to be the first.

    • Meron35@lemmy.world · ↑1 · 1 hour ago

      We were already there 10 years ago with Google Glass. Despite its failure in the consumer market, it found significant success in enterprise settings in the exact scenarios you’ve listed.

      Except all of these scenarios are in blue-collar work. Apple seems hell-bent on making this succeed in white-collar areas with its emphasis on meetings, which is extremely baffling.

      How Is Google Glass Doing in Enterprise and Industrial Settings? - Engineering.com - https://www.engineering.com/how-is-google-glass-doing-in-enterprise-and-industrial-settings/

    • CandleTiger@programming.dev · ↑4 · 11 hours ago

      How does the construction app know what needs to be constructed and how?

      How does the waiter app know which table ordered what, needs attention, etc?

      How does the IT app know on which port every device is connected?

      These things are all real hard to know. Having glasses that display the knowledge could be really nice but for all these magic future apps, having a display is only part of the need.
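
      Of the three questions, the network one is probably the most tractable: managed switches already expose a MAC address table, so the hard part is mostly plumbing. A minimal sketch of the lookup an AR overlay would need, using hypothetical CLI output and a made-up MAC-to-hostname map (not any vendor's real API):

```python
# Sketch: mapping switch ports to connected devices -- the data an AR
# overlay would need. Assumes a Cisco-style "show mac address-table"
# dump and a MAC -> hostname map (e.g. scraped from DHCP leases).
import re
from collections import defaultdict

MAC_TABLE = """\
Vlan    Mac Address       Type        Ports
----    -----------       ----        -----
  10    aa11.2233.4455    DYNAMIC     Gi1/0/1
  10    aa11.2233.6677    DYNAMIC     Gi1/0/2
  20    bb22.3344.5566    DYNAMIC     Gi1/0/2
"""

DHCP_LEASES = {  # hypothetical MAC -> hostname mapping
    "aa11.2233.4455": "printer-3f",
    "aa11.2233.6677": "pos-terminal-1",
    "bb22.3344.5566": "camera-lobby",
}

def port_to_devices(table: str, leases: dict) -> dict:
    """Return {port: [hostname, ...]} from a MAC-table dump."""
    row = re.compile(r"^\s*\d+\s+([0-9a-f.]+)\s+\S+\s+(\S+)\s*$")
    result = defaultdict(list)
    for line in table.splitlines():
        m = row.match(line)
        if m:
            mac, port = m.groups()
            # Fall back to the raw MAC when no lease entry exists.
            result[port].append(leases.get(mac, mac))
    return dict(result)
```

      In practice you'd pull this over SNMP or the switch's management API rather than scraping CLI text, and the hostname map would come from DHCP leases or an inventory system -- which is exactly the "real hard to know" part.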

      • Valmond@lemmy.world · ↑1 · 7 hours ago

        If you have all that info you could probably remove the human from the equation and automate it.

        As for the NPC-Waiter 🤢

      • Lvdwsn@lemmy.world · ↑3 · 11 hours ago

        As somebody who wanted google glass back in the day and thinks AR glasses would be really really cool, this is ultimately where I end up on it, and with a lot of tech in general: the primary usefulness of any of this shit is in accurate and relevant information, and that’s the part of the equation that these big companies are definitely NOT in the business of producing. In fact, they seem to have discovered a while back that inaccurate and irrelevant information being blasted in your face is the real money maker. And now with AI/ML producing so much/filling in gaps, I just can’t imagine that it’s going to get any better.

        That being said, I think the tech is so cool. I’d love to travel to a new city and be able to get directions around to different sightseeing spots and real time star ratings above all the restaurants instead of anxiously glancing at my phone the entire time. If we ever get to that level of goodness I’m in, but I have a lot of doubts that it’ll ever be more than another attention-seeking thing attached to your body.

    • floofloof@lemmy.ca · ↑50 · 1 day ago

      In the current US political climate, giving everyone glasses with always-on cameras run by big tech companies seems particularly dangerous.

      • Inaminate_Carbon_Rod@lemmy.world · ↑6 · 1 day ago

        I think for the most part society has gotten used to being on someone’s camera when in public at pretty much all times.

        It’s something I used to think about, now I just, don’t.

        Everyone has been looking for the next big hardware thing. It looked like it might be foldable phones for a little while but I reckon AR Glasses are the ultimate endgame until they start making bio implants.

        • floofloof@lemmy.ca · ↑2 · 1 day ago

          They’ve gotten used to it in different political circumstances. But as people start to see how an authoritarian and vindictive fascist government works with surveillance tech to invade and endanger people’s lives, attitudes to things like always-on cameras may start to shift.

      • Tarquinn2049@lemmy.world · ↑2 · edited · 20 hours ago

        If it helps, they don’t have the battery life to be constantly recording or sending that much traffic. And that stuff can’t be invisible; us nerds can see it all. That’s one of the things dystopian sci-fi dramas have to gloss over: it all still runs on the laws of physics. Even if the contents of a wireless message are encrypted, we can still figure out where it’s going and how much data it carries just by reading the wave. There’s no way to block that from being possible.
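
        The side channel being described here is classic traffic analysis: an observer who only sees packet sizes and timing can still tell a sustained upload apart from idle chatter. A toy sketch using synthetic packet metadata (made-up numbers, not a real capture):

```python
# Traffic-analysis sketch: even with encrypted payloads, an observer
# who can see packet sizes and timing can estimate sustained
# throughput and flag a constant video upload.
# Each packet is a (timestamp_seconds, size_bytes) tuple -- metadata only.

def looks_like_streaming(packets, window=10.0, min_kbps=500.0):
    """True if average throughput exceeds min_kbps over a capture
    spanning at least `window` seconds."""
    if len(packets) < 2:
        return False
    span = packets[-1][0] - packets[0][0]
    if span < window:
        return False
    total_bits = sum(size * 8 for _, size in packets)
    return (total_bits / span) / 1000.0 >= min_kbps

# A steady ~1 Mbps flow: one 1500-byte packet every ~12 ms for ~15 s
steady = [(i * 0.012, 1500) for i in range(1250)]
# Sporadic background chatter: one small packet every 2 s
idle = [(i * 2.0, 200) for i in range(8)]
```

        Real tools (Wireshark I/O graphs, flow exporters) do the same thing with far more sophistication, which is exactly why a pair of glasses silently streaming video would be hard to hide from anyone watching the radio.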

        Plus, there is no reason to be covert or secretive about manipulating people. They have been literally saying it out loud for years now, and it’s still just as effective.

    • CosmoNova@lemmy.world · ↑1 · edited · 13 hours ago

      Sounds like a robot would just steal your job if that was implemented well. (And that is a big IF) Meanwhile you would pay off your AR glasses by watching a constant stream of ads for months.

    • MDCCCLV@lemmy.ca · ↑3 ↓1 · 17 hours ago

      Even lightweight glasses can be irritating, and the extra weight of steel vs. plastic is noticeable. There will never be AR glasses or goggles that are comfortable to wear all the time.

    • Knock_Knock_Lemmy_In@lemmy.world · ↑3 ↓1 · 17 hours ago

      All of this can be done with AR on a mobile phone.

      Only when you need to do this AND have both hands free do AR glasses become necessary. So surgery, bomb defusal, or something niche like that.

      • LiPoly@lemmynsfw.com · ↑1 ↓4 · 17 hours ago

        This might be the dumbest take I’ve heard today.

        Everything your smartphone does, your laptop can do too. Therefore, smartphones are useless!!

        Everything AR can do that your smartphone does today will be a hundred times more convenient, because you don’t have to carry a slab of glass with you all the time. You just have to wear glasses. Like I already do anyway.

        The only reason for smartphones to still exist in a world where AR is compact will be if we can’t figure out a way to efficiently input data without annoying everyone around us. As soon as that problem’s solved, nobody will be using smartphones anymore.

        • Prandom_returns@lemm.ee · ↑1 · 3 hours ago

          > This might be the dumbest take I’ve heard today.

          You’re forgetting that AR headgear requires you to WEAR THAT THING ON YOUR FACE AT ALL TIMES.

          No matter how compact (don’t even start talking about some techbro “all contained in a lens” type of shit), there will absolutely, always be people who refuse to wear it. (Ask any former glasses user who switched to contact lenses.)

          A phone that sits in your pocket and that you glance at only when you need it is a million times more convenient than something that goes over your eyes all of the time.

          Your world where external compact computing devices (phone/tablet/smartwatch/a slab of glass) are no longer needed is mostly constructed out of flatulence of the technology brotherhood.

    • StinkyFingerItchyBum@lemmy.ca · ↑5 ↓1 · 1 day ago

      Imagine being anyone anywhere whipped like an Amazon worker. Will the waitress have to piss in bottles? Bad for tips I think.

    • Lv_InSaNe_vL@lemmy.world · ↑1 · edited · 20 hours ago

      > you’re a network engineer and the glasses show you on every port which device is connected

      UniFi equipment can already sorta do this! The little dot pattern on the screen is an AR code, and you can use the app to see this. It’s pretty cool, actually. I’ve never used it for real work, though; I just look at the dashboard on my laptop and find the port that way.

      It would be really really cool to be able to just touch the physical port and be able to change the settings in real space with AR glasses though.

    • AA5B@lemmy.world · ↑1 ↓2 · 22 hours ago

      This could also be the breakout app for AI. While AR glasses obviously need shape recognition and manipulation, the real world has many, many more things than are ever likely to be codified. How do you deal with that? AI. How do you do arbitrary summaries of whatever you’re looking at? AI. How do you interact with the glasses and the real world? Speech recognition and AI.

      You heard it here first, folks. Two hot new technologies with no real use yet will find each other and turn into something useful.

      • Tarquinn2049@lemmy.world · ↑2 · edited · 19 hours ago

        I mean, technically, we heard it first at the demonstrations of the Meta and Google glasses, where that is exactly the main use demonstrated. But they also do smartphone stuff, like projecting directions when you look straight ahead and a map when you glance downwards, or translating stuff you’re looking at. Their AI stuff was like, “Where did I leave my keys?”, “Can you play me the first song off this album (while holding a record)?”, and they also did more general memory stuff like “What was the title of the white book on the shelf?”.

        But yeah, even “indoor” VR headsets have an AI assistant on them now that can help with context aware intelligence. Like “What is this thing I’m looking at?” And it can be used in both the real world and the virtual world. Like, “Is this everything I need to bake a cake?” or “how do I kill this boss?” See, real world and virtual world… lol. Or like, “Can you give me a hint on this puzzle? Not too big of a hint though.”.

        I just personally don’t like asking questions out loud.