• dual_sport_dork 🐧🗡️@lemmy.world · 4 hours ago

    This is exactly the thing.

    Whatever the dress may be in reality, the photo of it that was circulated was either exposed or twiddled with such that the pixels it’s made of are indeed a slightly bluish grey trending towards white (i.e. above 50% grey) and a tannish, browny gold.

    That is absolutely not up for debate. Those are the color values of those pixels, end of discussion. (If you want to check for yourself, there’s a quick pixel-sampling sketch at the end of this comment.)

    Edit to add: This entire debacle is a fascinating case of people either failing or refusing to separate the concept of a physical object from its very inaccurate representation. The photograph of the object is not the object: ce n’est pas une robe (this is not a dress).

    The people going around in this thread and elsewhere putting others down and calling them “stupid” or whatever else, purely because they know from external information that the physical dress itself is black and blue, are studiously ignoring the fact that this is not what the photograph of it shows. That’s because the photograph is extremely cooked and is not an accurate depiction. The debate only exists at all if one party or the other lacks the complete set of information, and at this point in history, now that this stupid meme has been driven into the ground quite thoroughly, I should hope that all of us have it.

    It’s true that our brains can and will interpret false color data based on either context or surrounding contrast, and it’s possible that somebody deliberately messed with the original image to amplify this effect in the first place. But the fact remains that arguing about what the dress is versus how it’s been inaccurately depicted is stupid, and anyone still trying that at this late stage is probably doing so in bad faith.
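
    Here’s the promised sketch of how you could sample the pixels yourself, assuming Pillow and a local copy of the image; the file path, coordinates, and reference colours below are placeholders and rough guesses rather than measured values.

    ```python
    # Minimal pixel-sampling sketch. Assumes Pillow is installed and you have a
    # local copy of the image; path, coordinates, and reference colours are
    # placeholders / rough guesses, not measured values.
    from PIL import Image

    REFERENCES = {
        "white": (255, 255, 255),
        "black": (0, 0, 0),
        "pale blue-grey": (150, 160, 185),   # rough "white" camp reading
        "dark navy": (20, 30, 90),           # rough "true" dress colour
        "browny gold": (130, 100, 55),
    }

    def nearest_reference(rgb):
        """Name of the reference colour closest to rgb (plain RGB distance)."""
        return min(REFERENCES, key=lambda n: sum((a - b) ** 2 for a, b in zip(rgb, REFERENCES[n])))

    def sample(path, x, y):
        r, g, b = Image.open(path).convert("RGB").getpixel((x, y))
        lum = 0.2126 * r + 0.7152 * g + 0.0722 * b          # Rec. 709 luminance
        side = "above" if lum > 127.5 else "below"
        print(f"({x},{y}) -> RGB ({r},{g},{b}), luminance {lum:.0f}/255 "
              f"({side} 50% grey), nearest reference: {nearest_reference((r, g, b))}")

    # sample("the_dress.jpg", 300, 400)   # placeholder path and coordinates
    ```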

    • Feathercrown@lemmy.world · 4 hours ago

      The “white” pixels are literally blue. The “black” ones can be considered gold due to the lighting.

      • auraithx@lemmy.dbzer0.com · 2 hours ago

        Yes, a very light blue; nobody is seeing brilliant white. But on a colour slider it’s much closer to white than to the ‘true’ dark blue of the dress. If you sample the sleeve (or whatever that is hanging over it), it’ll be even closer to pure white.
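
        To put a rough number on “much closer to white”: a toy RGB-distance check, where the sampled colour is an illustrative guess at the pale blue-grey in the photo rather than a measured value.

        ```python
        # Toy RGB-distance check. The "sampled" colour is an illustrative guess,
        # not a value measured from the photo.
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

        sampled   = (150, 160, 185)   # pale blue-grey, roughly what the photo shows
        white     = (255, 255, 255)
        true_navy = (20, 30, 90)      # roughly the dress's actual colour

        print(round(dist(sampled, white)))       # ~158
        print(round(dist(sampled, true_navy)))   # ~207 -- noticeably further away
        ```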

      • pftbest@sh.itjust.works · 3 hours ago

        You missed the whole point. If I take a white dress, shine a blue lamp on it, and then take a photo, the pixels will be 100% blue. But would that mean the dress itself is blue?
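
        Roughly, the camera records surface reflectance times illumination per channel, so a white surface under a strongly blue lamp comes out as blue pixels. A toy sketch; the lamp colour is an arbitrary choice for illustration.

        ```python
        # Toy model: recorded colour ~ surface reflectance * illuminant, per channel.
        # The lamp colour is an arbitrary illustrative choice.
        def photographed(reflectance, illuminant):
            return tuple(round(r * i) for r, i in zip(reflectance, illuminant))

        white_dress = (255, 255, 255)      # reflects every channel fully
        blue_lamp   = (0.25, 0.35, 1.0)    # strongly blue light

        print(photographed(white_dress, blue_lamp))   # -> (64, 89, 255): solidly blue pixels
        ```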

        • MotoAsh@lemmy.world · 1 hour ago

          That’s… literally not what this phenomenon is about, either. Talk about missing the point.

          • blockheadjt@sh.itjust.works · 23 minutes ago

            It’s exactly the point. White fabric will appear blue in blue light, which is why some people see this white dress and think it’s blue.

          • Liz@midwest.social · 34 minutes ago

            That is literally what causes the argument: perceptual adaptation to lighting conditions.

    • Genius@lemmy.zip · 3 hours ago

      They’re not stupid; their visual cortex just lacks the ability to calibrate to context. You can see in the picture that the scene is very brightly lit. If your visual cortex is in working order, you’ll adjust your perception of the colours. The picture reveals that some people struggle to do that.
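
      For what it’s worth, that “calibrate to context” step is roughly what white-balance correction does: estimate the illumination and divide it out. A toy von-Kries-style sketch, done crudely in RGB rather than cone space; the pixel value and illuminant estimate are assumptions for illustration, not measurements.

      ```python
      # Toy von-Kries-style adaptation: estimated surface ~ recorded / assumed
      # illuminant, per channel. Pixel and illuminant values are illustrative.
      def adapt(recorded, assumed_illuminant):
          return tuple(min(255, round(c / i)) for c, i in zip(recorded, assumed_illuminant))

      recorded         = (150, 160, 185)   # pale blue-grey pixel
      very_bright_warm = (2.2, 2.1, 1.8)   # "the scene is very brightly lit"

      print(adapt(recorded, very_bright_warm))   # -> (68, 76, 103): a darker blue,
                                                 # i.e. the blue-and-black reading
      ```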

      • auraithx@lemmy.dbzer0.com · 2 hours ago

        fMRI studies show that white-and-gold perceivers exhibit more activity in frontal and parietal brain regions, suggesting that their interpretation involves more top-down processing. This means they are more, not less, engaged in contextual interpretation.

        Some differences may relate to physiological traits like macular pigment density, which affects how much blue light is absorbed before reaching the retina. People with higher density tend to see white and gold.

        Color perception is not only about the visual cortex’s function but also about the image’s properties and the brain’s inferential processes. You’d know this if you weren’t a dumb blue-n-black’er.

        • Genius@lemmy.zip · 1 hour ago

          How come the gold and whiters are simultaneously claiming they use more top-down processing, AND that the pixels are white and gold? Looking at the pixel colour is bottom-up processing.

          If the dress is actually blue and black, how is doing more contextual processing supposed to get you a less accurate perception? Imagine if it was a snake and you needed to tell what colour it was so you’d know if it’s going to bite you. If your perception of the snake’s colour changes depending on the lighting, you’re going to die.

          The correct interpretation of that study is that you white and golders are doing 10,000 calculations per second and they’re all wrong… Or, you know, the BOLD activation was in inhibitory pathways.

          • auraithx@lemmy.dbzer0.com · 1 hour ago

            The claim mixes up how perception works and what people actually mean when they talk about top-down processing. White and gold viewers aren’t saying the pixels are literally white and gold; they’re saying the colors they perceive match most closely with that label, especially when those were the only options given. Many of them describe seeing pale blue and brown, which are the actual pixel values. That’s not bottom-up processing in the strict sense, because even that perception is shaped by how the brain interprets the image based on assumed lighting. You don’t just see wavelengths; you see surfaces under conditions your brain is constantly estimating. The dress image is ambiguous, so different people lock into different lighting models early in the process, and that influences what the colors look like.

            The snake example doesn’t hold up either. If the lighting changes and your perception doesn’t adjust, that’s when you’re more likely to get the snake’s color wrong. Contextual correction helps you survive; it doesn’t kill you.

            As for the brain scan data, higher activity in certain areas means more cognitive involvement, not necessarily error. There’s no evidence those areas were just shutting things down. The image is unstable, people resolve it differently, and that difference shows up in brain activity.
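
            To make the “different lighting models” point concrete, here is the same crude divide-out-the-light idea run with two different assumed illuminants: the same recorded pixel yields two different surface estimates. All numbers below are illustrative assumptions, not measurements from the photo.

            ```python
            # Same ambiguous pixel, two assumed illuminants, two inferred surfaces.
            # Every value here is an illustrative assumption, not a measurement.
            def infer_surface(recorded, assumed_illuminant):
                return tuple(min(255, round(c / i)) for c, i in zip(recorded, assumed_illuminant))

            recorded = (150, 160, 185)                  # ambiguous pale blue-grey pixel

            cool_bluish_shadow = (0.62, 0.66, 0.76)     # prior: dress is in bluish shade
            bright_warm_light  = (2.2, 2.1, 1.8)        # prior: dress is strongly, warmly lit

            print(infer_surface(recorded, cool_bluish_shadow))   # -> (242, 242, 243): near white
            print(infer_surface(recorded, bright_warm_light))    # -> (68, 76, 103): dark-ish blue
            ```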