Apple Vision Pro – a mixed-reality headset the company hopes is a “revolutionary spatial computer that transforms how people work, collaborate, connect, relive memories, and enjoy entertainment” – begins shipping to the public in the United States.

The data Apple collects is not “consumer” data like the brand of toothpaste you buy. It is more akin to medical data.

For instance, analysing a person’s unconscious movements can reveal their emotional state or even predict neurodegenerative disease. This is called “biometrically inferred data” as users are unaware their bodies are giving it up.

Apple suggests it won’t share this type of data with anyone.

  • narc0tic_bird@lemm.ee · 2 years ago

    Spying on our bodies? The device processes data about your face and surroundings in order to function the way it does. This is all processed on-device (it works offline) and is not sent to Apple in any way.

    Calling this “spying” is the equivalent of saying a camera is spying on you when you record video with it.

      • 4dpuzzle@beehaw.org · 2 years ago

        It doesn’t upload your offline activity?

        As this WaPo article states, they don’t even have to upload your activity for it to be very invasive. Imagine mapping your room and your house and uploading it to share with your visitors – this will happen. It technically falls within what Apple considers private, but it is still very dangerous. The yardstick to judge Apple by is the case of AirTags. They didn’t care about the stalking problem with AirTags until there was a huge uproar. And even then, the solution they released was very half-hearted.

        • conciselyverbose@kbin.social · 2 years ago

          What are you talking about?

          Despite the fact that unrestricted GPS trackers literally already existed, were unconditionally legal and legitimate to own, and were readily available to bad actors, Apple heavily limited AirTags’ functionality out of the gate to limit their usefulness for malicious purposes.