Mindful: October 2018
…you’re seeing? For instance, you see a gray box even though it’s actually white because of how it’s positioned on a checkered background, or you see a triangle in a diagram where there isn’t one because of strategically placed wedges and angles. Optical illusions point to how easy it is to fill in what we “see” based on memory, the biology of vision, and our brain’s need for coherence, and they reveal how much our minds can trick us.

Because we perceive the world with relative accuracy most of the time, we’re surprised when we get duped. We believe that our senses are exact, so it shocks us to find out that is not the case. It’s the same with uncovering blind spots, and there lies the possibility for life-changing insights to appear.

The “easy way” is harder

One of the best things we can do is pull back the curtain on our unconscious operating systems. Once we’ve seen the inner workings of those systems, examining how we organize and filter information, we are far more likely to catch ourselves when we’re falling into our own innocent little traps.

Daniel Kahneman and his colleague Amos Tversky developed some groundbreaking insights into human judgment as it applies to behavioral economics: namely, that people who make decisions and form judgments under uncertainty make systematic mistakes that are common to all humans. Their findings have called into question the long-held assumption of human rationality, and have had broad impact across diverse domains, bringing to light the ways our cognitive biases cause us to make errors in judgment and decision-making.

As we have discovered, we humans tend to think we are rational most of the time. It’s that exact blind spot that is most in our way. Our blind spots are created through our unwillingness to question the fixed ideas and assumptions that we hold about ourselves, others, and the world around us.
Some of the most problematic blind spots, however, are created and supported by the tiniest and most innocent of biases and moments, combining to form ideas and stories that keep confirming themselves and feeling believable. Mental shortcuts are just such things. They are lightning-quick intuitive judgments that are common to all of us. They often work, but when they don’t, they lead to cognitive biases that obscure our seeing. These can take many forms, three of which I’ll highlight here:

- Believing ideas because they are readily available to us: availability bias
- Finding data that confirms what we already believe: confirmation bias
- Thinking we saw things coming when we didn’t, which makes us think we are better at predicting the future than we are: hindsight bias

Mental shortcuts like these help us simplify things as we navigate the complexities and unknowns of life. But they hinder us when they cause us to gloss over or misperceive complexities that might actually require our attention, and that’s bound to go wrong sometimes. When do mental shortcuts interfere with our ability to experience mindful awareness throughout our day? When do they back up untrue stories about ourselves or the world around us? Can seeing how our biases help us filter the world also help us see where we may have hidden blind spots?

Questioning biases and opening to a larger, more nuanced story doesn’t have to destabilize us. Coming out of denial doesn’t mean we need to act on our feelings. Rather, it can help us make wiser, more informed decisions and be more compassionate and understanding. It takes practice and experience to trust yourself enough to open to what you don’t know, and to realize that it’s safe and important to do so. But don’t worry: you’re not going to die if you admit you don’t have a clue! You don’t have to exert effort all the time to check whether you are believing something untrue.
Hacking these biases sets you up for a different autopilot: It opens you to the unknown, and in doing so, you move the arrow from blindness to mindfulness. It doesn’t take much more than that! ●

Adapted from The Blind Spot Effect: How to Stop Missing What’s Right in Front of You by Kelly Boys. Copyright © 2018 by Kelly Boys. Published in July 2018 by Sounds True.