Psychology Archive III

The previous contents of this page are now in Psychology Archives I & II. There will be more posts on this page as and when appropriate.

January 1st 2023



That Feeling When You Have So Many Things to Do You Can’t Do Any of Them? Psychologists Have a Name for It, and a Solution

When your to-do list is so scary you can’t do any of it, that’s called overwhelm freeze. Psychology has a solution.

By Jessica Stillman, Contributor


It’s the season of joy and togetherness, which means, of course, I have approximately 687 unfinished items on my to-do list. There are (many) work projects to wrap up, gifts to buy, visits to prepare for, school holiday concerts to attend. The list feels endless–and paralyzing. 

Logically, you’d think that when your to-do list was at its longest would be when you’d kick into high gear and start ticking things off like a productivity superstar. But in my personal experience, often the opposite happens. When my to-do list grows this long, I panic, my motivation tanks, and my brain fogs. I end up stressing more than accomplishing. Am I a weirdo? 

Possibly, but not because of my to-do list panic, according to a recent, gratifying New York Times article. Apparently, having so many things to do that you can’t do any of them is a recognized psychological phenomenon with a name and, better yet, a suggested cure. 

Memo to my brain: Your to-do list isn’t a saber-toothed tiger

In the article, I learn from writer Dana G. Smith that I am suffering from overwhelm freeze. Just from the name, that sounds about right, but Smith talks to Ellen Hendriksen, a professor at Boston University’s Center for Anxiety and Related Disorders, who explains that we freeze in the face of an overwhelming to-do list for the same reason our ancestors froze in response to a stalking predator.  

“Our bodies react to threat the same way, whether the threat is external, like the proverbial saber-toothed tiger, or the threat is internal,” she said. So basically my brain is so stressed by the thought of everything I probably won’t have time to do that my prefrontal cortex, which should be planning, organizing, and generally orchestrating the show, just lies down and plays possum. 

So what do I do about it?

Knowing that I am experiencing something called overwhelm freeze, and that it is common enough to merit its own terminology, is soothing. But I still have to figure out what to get for my impossible-to-buy-for father and when I am going to squeeze in that meeting with my accountant. Do psychologists have any practical advice on how to get my brain moving again? 

As overwhelm freeze is a response to stress, all the usual stress-reducing techniques like deep breathing, physical exercise, and reminding yourself it’s normal and human to be less-than-perfect are a good place to start. Beyond that, Smith and the experts she talks to advise those paralyzed by their to-do lists to start small–very small. 

“How do you eat an elephant? One bite at a time,” jokes University of Calgary professor and procrastination expert Piers Steel. No matter the size of the elephant, those individual bites should be so ridiculously small and concrete that you can’t possibly stress about them. I can feel my blood pressure rise when I contemplate a task like “buy presents for entire family” but it’s much harder to get worked up about “order cookbook for mom.”

Think about rewriting your to-do list as if you were giving “instructions to a teenager who doesn’t really want to do it, so you have to be really specific,” explains Steel. Which makes sense — my brain does feel quite a bit like a sulky 16-year-old these days. 

Once you’ve got your grumpy teen-proof to-do list, the next step is to actually get started. The experts suggest beginning with something easy and pleasant to help you build momentum. For activities that promise absolutely zero joy, good old-fashioned self-bribery can help.

“If there’s an email you need to send that you keep putting off (and off and off), promise yourself ten guilt-free minutes of internet celebrity gossip afterward,” suggests Smith. 

Whatever you need to do, get the ball rolling because the longer you let yourself stay frozen, the worse it’s likely to get. I guess I am off to order a cookbook then. What small task could you undertake to break your freeze?


December 17th 2022

How to Stop Overthinking Everything

Deliberation is an admirable and essential leadership quality that undoubtedly produces better outcomes. But there comes a point in decision making where helpful contemplation turns into overthinking.

Harvard Business Review

  • Melody Wilding


As a product lead at a major technology company, Terence’s job is to make decisions. How should the team prioritize features to develop? Who should be staffed on projects? When should products launch? Hundreds of choices drive the vision, strategy, and direction for each product Terence oversees.

While Terence loved his job, making so many decisions caused him a lot of stress. He would waste hours in unproductive mental loops — analyzing variables to make the “right” choices. He would worry about the future and imagine all the ways a launch could go wrong. Then, he would beat himself up for squandering valuable time and energy deliberating instead of taking action. In other words, his thoughtfulness, which was typically a strength, often led him to overthink situations.

Terence is what I like to call a sensitive striver — a high-achiever who processes the world more deeply than others. Studies show that sensitive people have more active brain circuitry and neurochemicals in areas related to mental processing. This means their minds not only take in more information, but also process that information in a more complex way. Sensitive strivers like Terence are often applauded for the way they explore angles and nuance. But at the same time, they are also more susceptible to stress and overwhelm.

Deliberation is an admirable and essential leadership quality that undoubtedly produces better outcomes. But for Terence and others like him, there comes a point in decision making where helpful contemplation turns into overthinking. If you can relate, here are five ways to stop the cycle of thinking too much and drive towards better, faster decisions.

1. Put aside perfectionism

Perfectionism is one of the biggest blockers to swift, effective decision-making because it operates on faulty all-or-nothing thinking. For example, perfectionism can lead you to believe that if you don’t make the “correct” choice (as if there is only one right option), then you are a failure. Or that you must know everything, anticipate every eventuality, and have a thorough plan in place before making a move. Trying to weigh every possible outcome and consideration is paralyzing.

To curb this tendency, ask yourself questions like:

  • Which decision will have the biggest positive impact on my top priorities?
  • Of all the possible people I could please or displease, which one or two people do I least want to disappoint?
  • What is one thing I could do today that would bring me closer to my goal?
  • Based on what I know and the information I have at this moment, what’s the best next step?

After all, it’s much easier to wrap your head around and take action towards a single next step rather than trying to project months or years into the future.

2. Right-size the problem

Some decisions are worth mulling over, while others are not. Before you make a call, write down what goals, priorities, or people in your life will be impacted. This will help you differentiate between what’s meaningful and what’s not worth obsessing over.

Likewise, if you’re worried about the prospect of a decision bombing, try the 10/10/10 test. When the fear of falling flat on your face seizes you, think about how you’ll feel about the decision 10 weeks, 10 months, or 10 years from now. It’s likely that the choice will be inconsequential or that you won’t even remember it was a big deal. Your answers can help you put things in perspective and rally the motivation you need to take action.

3. Leverage the underestimated power of intuition

Intuition works like a mental pattern matching game. The brain considers a situation, quickly assesses all your experiences, and then makes the best decision given the context. This automatic process is faster than rational thought, which means intuition is a necessary decision-making tool when time is short and traditional data is not available. In fact, research shows that pairing intuition with analytical thinking helps you make better, faster, and more accurate decisions and gives you more confidence in your choices than relying on intellect alone. In one study, car buyers who used only careful analysis were ultimately happy with their purchases about a quarter of the time. Meanwhile, those who made intuitive purchases were happy 60 percent of the time. That’s because relying on rapid cognition, or thin-slicing, allows the brain to make wise decisions without overthinking.

Terence, the product lead I mentioned earlier, was so intrigued by the idea of making decisions from his gut that he planned a “Day of Disinhibition” during which he followed his own intuition about everything he said and did for twenty-four hours. The result? Going with his gut gave him the courage to stop censoring himself and make tough calls, even when he knew it might upset some stakeholders. “It wasn’t just what I got done, but how I got it done, how quickly, and how I felt about it,” he later told me. “It put me in the best frame of mind to deal with whatever is in front of me.” Try the “Day of Disinhibition” experiment for yourself, or simply set aside a few minutes today and list three to five times you trusted your gut and whether the outcome was favorable.

4. Limit the drain of decision fatigue

You make hundreds of decisions a day — from what to eat for breakfast to how to respond to an email — and each depletes your mental and emotional resources. You’re more likely to overthink when you’re drained, so the more you can eliminate minor decisions, the more energy you’ll have for ones that really matter.

Create routines and rituals to conserve your brainpower, like a weekly meal plan or capsule wardrobe. Similarly, look for opportunities to eliminate certain decisions altogether, such as by instituting best practices and standardized protocols, delegating, or removing yourself from meetings.

5. Construct creative constraints

You may be familiar with Parkinson’s Law, which states that work expands to fill the time we allow it. Put simply, if you give yourself one month to create a presentation, it will take you one full month to finish it. But if you only had a week, you’d finish the same presentation in that shorter time.

I’ve observed a similar principle among sensitive strivers — that overthinking expands to the time we allow it. In other words, if you give yourself one week to worry about something that is actually a one-hour task, you will waste an inordinate amount of time and energy.

You can curb this tendency by creating accountability through creative constraints. For example, determine a date or time by which you’ll make a choice. Put it in your calendar, set a reminder on your phone, or even contact the person who is waiting for your decision and let them know when they can expect to hear from you. A favorite practice of my clients is “worry time,” which involves earmarking a short period of the day to constructively problem solve.

Above all, remember that your mental depth gives you a major competitive advantage. Once you learn to keep overthinking in check, you’ll be able to harness your sensitivity for the superpower that it can be.

Melody Wilding, LMSW is an executive coach and author of Trust Yourself: Stop Overthinking and Channel Your Emotions for Success at Work. Get a free copy of Chapter One here.

December 15th 2022

A Healthy Social Life Goes Beyond Friends and Family

When we have a variety of social interactions—with not just intimates, but acquaintances and strangers—we may be happier and healthier for it.

By Jill Suttie | December 5, 2022

Like many people, I prioritize making time for my closest friends and family when I socialize. When it comes to reaching out to people I don’t know as well, I have a harder time and often find myself reluctant to engage—maybe because I’m introverted or just plain busy.

This could be a big mistake, though, according to a new study. Having a variety of different types of social interactions seems to be central to our happiness—something many of us discovered firsthand during the pandemic, but may already have forgotten.

In a series of surveys (done pre-pandemic), researchers looked at how having a socially diverse network related to people’s well-being. Just to be clear, they weren’t looking at racial, ethnic, or gender diversity, but how much people interacted with different types of social contacts—friends, family, colleagues, neighbors, classmates, community members, etc.

In one survey, 578 Americans reported on what activities they’d been engaged in, with whom, and for how long over the past 24 hours, while also saying how happy and satisfied with life they were. The researchers then gave them a score for “social diversity” based on the variety of social contacts they’d had and the length of time spent with each type of contact.

After analyzing the results, they found that people with more diverse social networks were happier and more satisfied with life than those with less diverse networks—regardless of how much time they’d spent socializing overall. This pattern held even after taking into account a person’s gender, age, employment status, and other potential influences on happiness.

Having a wider set of social contacts seems to be important for happiness, says lead researcher Hanne Collins of Harvard Business School.

“The more you can broaden your social portfolio and reach out to people you talk to less frequently—like an acquaintance, an old friend, a coworker, or even a stranger in the grocery store—the more it could have really positive benefits for your well-being,” she says.

To further test this idea, she and her colleagues looked at large data sets from the American Time Use Survey (which provided detailed information from over 19,000 Americans about what activities they engaged in during a typical day) and the World Health Organization’s Study on Global Aging and Adult Health (which did the same for 10,447 respondents from China, Ghana, India, Mexico, the Russian Federation, and South Africa).

In both cases, they found that when people had a broader range of social interactions, they experienced greater happiness and well-being—including better physical health. These gains in well-being went above and beyond gains related to how active people were, what country they were from, or other demographics—meaning the results seemed to apply to people from all stations of life and from many cultures.

“Across many measures, many populations, and many different studies, we’re finding the same kind of positive association between a greater social portfolio and well-being,” says Collins.

In this type of analysis, though, it’s hard to tell whether social diversity leads to happiness or if happier people just attract more diverse social contacts. To try to get at that, Collins and her colleagues did another analysis, using data from a mobile app that 21,644 French-speaking people used to report on their daily social activities and happiness.

There, they found that when someone experienced greater-than-average social diversity one week, they were happier that week and the week after—independent of how active or social they were overall. This finding probably jibes with people’s own experience during the pandemic, Collins adds.

“The pandemic narrowed people’s social portfolios, in terms of having different relationship partners to talk to, and they missed that wider network,” she says. “We benefit from having access to more distant others.”

Why is that? Other research supports the importance of “weak ties” in our social networks—and how interacting with a stranger can be more pleasurable than we predict. It could be that being with different people elicits different kinds of emotions, says Collins, and that emotional variety may be a driving force in our happiness.

Alternatively, it could be that having a more diverse network allows you to get different kinds of social support when you need it—whether that’s emotional or financial support from friends or family, or informational and practical support from an acquaintance (like letting you borrow a tool or helping you find a job). Having various kinds of social support can be tied to well-being, too, she says.

Whatever the case, Collins hopes her research will spur people to expand their social networks when they can. She suggests reaching out to old friends, joining a class, having an extra meeting with a colleague to touch base, or chatting with the grocery store cashier. Especially after going through the isolation imposed during the pandemic, she says, getting back to a wider set of social interactions could really help improve people’s well-being.

“As we try to find out what a ‘new normal’ looks like, recovering our diverse social portfolios may be really impactful for people,” she says. “Just trying to be conscious about who you’re talking to and making the effort to foster moments of connection with people with less obvious access to your life could be powerful.”


About the Author
  • Jill Suttie, Psy.D., is Greater Good’s former book review editor and now serves as a staff writer and contributing editor for the magazine. She received her doctorate of psychology from the University of San Francisco in 1998 and was a psychologist in private practice before coming to Greater Good.

November 10th 2022

Psychosis (young people)

Medication for Psychosis

Most people had been prescribed a number of different medications over the time they had been having psychotic experiences. These included:

  • Anti-psychotics (e.g. olanzapine, risperidone, quetiapine, aripiprazole)
  • Anti-depressants (e.g. citalopram)
  • Benzodiazepines (e.g. diazepam, lorazepam, clonazepam)

These medicines are usually prescribed by a psychiatrist, and should be regularly monitored. You can read more about taking anti-depressants here. Finding a medication that works can often involve trial and error, and it can take months to find the right dosage or the best combination of medications to suit a person. Most people find medications that are helpful for their symptoms, although a few of the young people we spoke to said they had never found ones that suited them.

Green Lettuce tried many different medications and said that most didn’t help. Seroquel (quetiapine) only made the voices go away for a short time, but diazepam helped a lot.

Effects of taking medication for psychosis
Some people found medication provided short relief from their psychotic experiences. Others found it stopped hallucinations, delusions and paranoia for longer periods, or reduced them, and that it could also “kick start” their recovery. Quetiapine (an antipsychotic) helped reduce the number of voices Dominic heard from seven to three, and helped him to sleep. However, most people felt that medication was part of the solution rather than something that solved everything. Nikki says medication takes her from “bad to not as bad” and she uses self-help techniques to get herself from “not as bad to better”: “So it doesn’t do all the work, but it helps.”

People we spoke to were often prescribed different types of medication (anti-depressants, benzodiazepines and anti-psychotics) together, and some were able to compare the effects of each. Green Lettuce says benzodiazepines like lorazepam and diazepam help him more than the antipsychotics, and that diazepam has longer-lasting effects. Some felt that anti-depressants interfered with anti-psychotics or made them feel worse. Andrew X, whose psychosis is linked to depression and low mood, finds that anti-depressants make the low mood worse before they make things better.

Nikki prefers taking anti-depressants to taking anti-psychotics. They make her feel “lighter” and she feels a little less suicidal.


October 21st 2022

Sins of the mothers in today’s feminist broken homes.

Edward John Cook Jnr on an educational trip with his father in Northampton. His father has not seen him since April 2008, and Edward has since been sectioned; for legal reasons I am not allowed to say more. Mothers have the power to destroy their sons with impunity because they know it all. Kieran Cook has been barred by police from seeing his brother for the past 14 years, during which Edward has been shut securely away for 18 years. Police have been trying to jail or section Kieran along with his father for the past 14-plus years.

10 Proven Ways to Learn Faster to Boost Your Math, Language Skills and More Quickly

Neuroscience has taught us a lot about how our brains process and hold on to information.

By Deep Patel | November 21, 2018 | Updated: September 14, 2022

Opinions expressed by Entrepreneur contributors are their own.

Learning new things is a huge part of life — we should always be striving to grow and learn a new skill. Whether you’re learning Spanish or want to do math fast, it takes time to learn each lesson, and time is precious. So how can you make the most of your time by speeding up the learning process? Thanks to neuroscience, we now have a better understanding of how we learn and the most effective ways our brains process and hold on to information.

If you want to get a jump start on expanding your knowledge, here are 10 proven ways you can start being a quick learner.

1. Take notes with pen and paper.

Though it might seem that typing your notes on a laptop during a conference or lecture will be more thorough, thus helping you learn faster, it doesn’t work that way. To speed up your learning, skip the laptop and take notes the old-fashioned way, with pen and paper. Research has shown that those who type in their lecture notes process and retain the information at a lower level. Those who take notes by hand actually learn more.

While taking notes by hand is slower and more cumbersome than typing, the act of writing out the information fosters comprehension and retention through muscle memory. Reframing the information in your own words helps you retain the information longer, meaning you’ll have better recall and will perform better on tests.


2. Have effective note-taking skills.

The better your notes are, the faster you’ll learn. Knowing how to take thorough and accurate notes will help you remember concepts, gain a deeper understanding of the topic and develop meaningful learning skills. So, before you learn a new topic, make sure you learn different strategies for note taking, such as the Cornell Method, which helps you organize class notes into easily digestible summaries.

Whatever method you use, some basic tips for note-taking include:

  • Listen and take notes in your own words.
  • Leave spaces and lines between main ideas so you can revisit them later and add information.
  • Develop a consistent system of abbreviations and symbols to save time.
  • Write in phrases, not complete sentences.
  • Learn to pull out important information and ignore trivial information.

3. Distributed practice.

This method involves distributing multiple practices (or study sessions) on a topic over a period of time. Using short, spaced-out study sessions will encourage meaningful learning, as opposed to long “cram sessions,” which promote rote learning. The first step is to take thorough notes while the topic is being discussed. Afterward, take a few minutes to look over your notes, making any additions or changes to add detail and ensure accuracy.

Do this quickly, once or twice following each class or period of instruction. Over time, you can begin to spread the sessions out, starting with once per day and eventually moving to three times a week. Spacing out practice over a longer period of time is highly effective because it’s easier to do a small study session and you’ll stay motivated to keep learning.


4. Study, sleep, more study.

You have a big project or a major presentation tomorrow and you’re not prepared. If you’re like many of us, you stay up too late trying to cram beforehand. Surely your hard work will be rewarded, even if you’re exhausted the next day… right? However, that’s not the most efficient way for our brains to process information.

Research shows a strong connection between sleep and learning. It seems that getting some shut-eye and taking short breaks are important elements in bolstering how our brains remember something. Deep sleep (non-rapid-eye-movement sleep) can strengthen our long-term memory if the sleep occurs within 12 hours of learning the new information. And students who both study and get plenty of sleep not only perform better academically; they’re also happier.


5. Modify your practice.

If you’re learning a skill, don’t do the same thing over and over. Making slight changes during repeated and deliberate practice sessions will help you master a skill faster than doing it the same way every time. In one study of people who learned a computer-based motor skill, those who learned a skill and then had a modified practice session where they practiced the skill in a slightly different way performed better than those who repeated the original task over and over.

This only works if the modifications are small — making big changes in how the skill is performed won’t help. So, for instance, if you’re practicing a new golf swing or perfecting your tennis game, try adjusting the size or weight of your club or racket.

6. Try a mnemonic device.

One of the best ways to memorize a large amount of information quickly is to use memory techniques like a mnemonic device: a pattern of letters, sounds or other associations that assist in learning something. One of the most popular mnemonic devices is one we learned in kindergarten — the alphabet song. This song helps children remember their “ABCs,” and it remains deeply ingrained in our memory as adults. Another is “i before e except after c” to help us remember a spelling rule.

Mnemonics help you simplify, summarize and compress information to make it easier to learn a new word or new skill. They can be really handy for students in medical school or law school, or people studying a new language. So, if you need to memorize and store large amounts of new information, try a mnemonic and you’ll find you remember the information long past your test.


7. Use brain breaks to restore focus.

Information overload is a real thing. In order to learn something new, our brains must send signals to our sensory receptors to save the new information, but stress and overload will prevent your brain from processing it effectively.

When we are confused, anxious or feeling overwhelmed, our brains effectively shut down. You can see this happen when students listening to long, detailed lectures “zone out” and stop paying attention to what’s being said.

They simply aren’t able to effectively transfer that information into their memory banks, so learning shuts down. The best way to combat this is by taking a “brain break,” or simply shifting your activity to focus on something new. Even a five-minute break can relieve brain fatigue and help you refocus.

8. Stay hydrated.

We know we should drink water because it’s good for us — it’s good for our skin and our immune system, and it keeps our body functioning optimally. But staying hydrated is also key to our cognitive abilities. Drinking water can actually make us smarter. According to one study, students who took water with them to an examination room performed better than those who didn’t.

Dehydration, on the other hand, can seriously affect our mental function. When you fail to drink water, your brain has to work harder than usual.

9. Learn information in multiple ways.

When you use multiple ways to learn something, whether it’s language learning or speed reading, you’ll use more regions of the brain to store information about that subject. This makes that information more interconnected and embedded in your brain. It basically creates a redundancy of knowledge within your mind, helping you truly learn the information and not just memorize it.

You can do this through spaced repetition or by using different media to stimulate different parts of the brain, such as reading notes, reading the textbook, watching a video on social media and listening to a podcast or audio file on the topic. The more resources you use, the faster you’ll learn.

10. Connect what you learn with something you know.

The more you can relate new concepts to ideas that you already understand, the faster you’ll learn the new information. According to the book Make It Stick, many common study habits are counterproductive. They may create an illusion of mastery, but the information quickly fades from our minds.

Memory plays a central role in our ability to carry out complex cognitive tasks, such as applying knowledge to problems we haven’t encountered before and drawing inferences from facts already known. By finding ways to fit new information in with preexisting knowledge, you’ll find additional layers of meaning in the new material. This will help you fundamentally understand it better, and you’ll be able to recall it more accurately.

Elon Musk, founder of Tesla and SpaceX, uses this method. He said he views knowledge as a “semantic tree.” When learning new things, his advice is to “make sure you understand the principles, i.e., the trunk and big branches, before you get into the leaves/details or there is nothing for them to hang on to.” When you connect the new to the old, you give yourself mental “hooks” on which to hang the new knowledge.


October 17th 2022

Why Talented People Don’t Use Their Strengths

We often undervalue what we inherently do well.

Harvard Business Review

  • Whitney Johnson


If you’ve watched the Super Bowl in recent years, you’ve probably seen the coaches talking to each other over headsets during the game. What you didn’t know is that during the 2016 season, the NFL made major league-wide improvements to its radio frequency technology, both to prevent interference from media using the same frequency and to prevent tampering. This was a development led by John Cave, VP of football technology at the National Football League. It’s been incredibly helpful to the coaches. But it might never have been built, or at least Cave wouldn’t have built it, had it not been for his boss, Michelle McKenna-Doyle, CIO of the NFL.

When McKenna-Doyle was hired, she observed that a number of her people were struggling, but not because they weren’t talented — because they weren’t in roles suited to their strengths. After doing a deep analysis, she started having people switch jobs. For many, this reshuffling was initially unwelcome and downright uncomfortable. Such was the case with Cave.

Cave had the talent to create products and build things. But he didn’t have time to do it, because he had the big job of system development, including enterprise systems. “Why was he weighed down with the payroll system when he could figure out how to evolve the game through technology?” McKenna-Doyle asked. As she later explained to me, she envisioned a better role for his distinctive strengths. The coaches wanted to talk to each other. The technology didn’t exist. She tasked Cave with creating it. “At first, he was concerned, because his overall span was shrinking. ‘Just trust me,’ I said. ‘You’re going to be a great innovator,’ and he is.”

Experts have long encouraged people to “play to their strengths.” And why wouldn’t we want to flex our strongest muscle? But based on my observations, this is easier said than done. Not because it’s hard to identify what we’re good at. But because we often undervalue what we inherently do well.

Often our “superpowers” are things we do effortlessly, almost reflexively, like breathing. When a boss identifies these talents and asks you to do something that uses your superpower, you may think, “But that’s so easy. It’s too easy.” It may feel that your boss doesn’t trust you to take on a more challenging assignment or otherwise doesn’t value you — because you don’t value your innate talents as much as you do the skills that have been hard-won.

As a leader, the challenge is not only to spot talent but also to convince your people that you value their talents and that they should, too. This is how you start to build a team of employees who bring their superpowers to work.

Begin by identifying the strengths of each member of your team. Some of my go-to questions are:

What exasperates you? This can be a sign of a skill that comes easily to you, so much so that you get frustrated when it doesn’t come as easily to others. I’m weirdly good at remembering names, for example, and often get annoyed with others who don’t. I have a terrible sense of direction, however, and probably irritate other people who intrinsically sense which way is north.

What compliments do you dismiss? When we’re inherently good at something, we tend to downplay it. “Oh, it was nothing,” we say — and maybe it was nothing to us. But it meant something to another person, which is why they’re thanking you. Notice these moments: They can point to strengths that you underrate in yourself but are valuable to others.

What do you think about when you have nothing to think about? Mulling over something is a sign that it matters to you. Your brain can’t help but come back to it. If it matters to you that much, maybe you’re good at it.

In group settings, I’ll also ask people why they hired so-and-so — what that person’s genius is. Rarely is this a skill listed on their résumé.

When people bring up new ideas, you can ask them, Will this leverage what you do well? Are you doing work that draws on your strengths? Are we taking on projects that make the most of your strengths?

Once each person has identified their strengths, make sure everyone remembers them. Brett Gerstenblatt, VP and creative director at CVS, has his team take a personality assessment, then post their top five strengths on their desk. Brett wants people to wear their strengths like a badge. Not to tell others why they’re great, but to remind them to use them.

Diana Newton Anderson, an entrepreneur turned social good activist, shares a story of her college basketball coach, who had her team take shots from different places on the court: the key, the elbow, the paint. He would record their percentages, and then had every person on the team memorize those percentages. This would allow the team to literally play to each other’s strengths. You can do something similar with your team.

As with McKenna-Doyle, building a team that can play to their strengths begins with analysis. Observe people, especially when they are at their best. Because some will undervalue what they do well, it may be up to you to place a value on what they do best. Understanding and acknowledging each person’s strengths can be a team-building exercise. Then you can measure new ideas, new products, and new projects against these collective superpowers, asking: Are we playing to our strengths? When people feel strong, they are willing to venture into new territory, to play where others are not, and to consider ideas for which there isn’t yet a market.

How to Get Comfortable With Uncertainty and Change

When life is uncertain, our usual responses and coping strategies might not always work. The practice of mental agility can help us be resilient.

By Kira M. Newman | October 4, 2022

I recently moved to a new apartment, an occasion that calls for celebration—preferably outdoors in my brand-new backyard. But I didn’t expect how much being in a different space would disrupt my sense of safety. So I worried—about my cat escaping out the front door, how to protect my family from COVID, raccoon-transmitted diseases, and more.

After reading Elaine Fox’s new book Switch Craft: The Hidden Power of Mental Agility, I have a better idea of what’s going on. I fall into the category of someone who’s uncomfortable with uncertainty. I love a good routine, and moving disrupted all of mine. I have a need to feel in control of my circumstances, but just about everything in my immediate surroundings changed.

Maybe you fit this description, too, and you have trouble coping when life is full of unknowns or when things don’t turn out as you expected. According to Fox, what we need to cultivate is mental agility—a nimbleness in how we think, feel, and act that will allow us to adapt to changing circumstances.

Feeling uncomfortable with uncertainty

Uncertainty arises when we’re in new situations, like a move or a new job, or when we’re in unpredictable situations—like when we have a job interview, a medical test, an injury, or the possibility of layoffs at work.

Because our brains are future-predicting machines, it’s natural to want to avoid ambiguity. “As human beings, we crave security, and that is why all of us are intolerant of uncertainty to some extent,” writes Fox.

But some have this tendency more than others. For example, you might be intolerant of uncertainty if you love planning, hate surprises, and get frustrated when unexpected things mess up your day. Someone who has trouble with uncertainty might find it hard to make decisions in ambiguous circumstances, because they feel like they don’t have enough information and don’t want to make the wrong choice. 

To avoid the discomfort of uncertainty, some of us engage in what Fox calls “safety behaviors”—things like making lots of lists, constantly double checking, overpreparing, or seeking reassurance from others. For example, you might read a restaurant menu in advance, or repeatedly check in on your kid to make sure they’re doing OK.

If you dislike uncertainty, you might also be a worrier, because worrying actually gives us a sense of control in a difficult situation—at least we’re doing something! You might also shy away from challenges that you could fail at, and lean on tried-and-true pathways in life.

The power of mental agility

To get more comfortable with uncertainty, we need to practice what Fox calls mental agility, or what psychologists call psychological flexibility. Research suggests that people who are more psychologically flexible have higher well-being and tend to be less anxious and worried.

Someone who is psychologically flexible is open to change, or may even find change exciting. When they’re working on a problem, they try lots of different solutions. They don’t see the world in black and white, they like to learn from others, and they often have some unusual ideas of their own.

Mental agility shines when we’re facing change, when things don’t go as expected, or when the future is particularly unpredictable—say, when travel plans fall through, during a divorce, or in a pandemic. At that moment, some people dig their heels in and keep doing what they’ve always done. But mentally agile people are able to recognize when what they’re doing isn’t working, and change things up.

“There is no one-size-fits-all solution to any of life’s problems,” writes Fox.

She likes the metaphor of using different clubs on a golf course, depending on whether you’re hitting a long shot, swinging from a bunker, or putting. “Life is exactly like that—we’re going to be faced with quite different types of problems and different types of obstacles to get around, and we need different approaches for all of those.”

It comes down to the choice of stick or switch: Should I keep pursuing the same thoughts, feelings, and actions, or do I need to switch to something new?

For example, she says, parents need a veritable smorgasbord of strategies to raise their children, everything from tough discipline and strict boundaries to treating kids to ice cream and a day off. Knowing when to use which one is a sign of healthy flexibility. The same goes for leaders at work, who might want to change the way they manage their employees when the company is going through a season of stress.

Switch Craft: The Hidden Power of Mental Agility (HarperOne, 2022, 352 pages).

Coping strategies are another good example. Psychologists like to group them into two main types: emotion-focused and problem-focused. Emotion-focused strategies change the way we feel, like distracting ourselves, getting support from friends, or looking at the situation from a different perspective. Problem-focused strategies, on the other hand, involve taking action to solve the problem directly.

No one strategy works all the time, and you’ll often see people get stuck in their favorite way of coping. If you tend toward distraction and denial, you might avoid dealing with a problem that you actually could have solved; if you’re an inveterate problem-solver, you might feel helpless and angry when confronting a problem—your own or a loved one’s—that has no solution, when all that’s really needed is support and connection.

How to cultivate mental agility

Fox’s book is full of tips to cultivate mental agility, as well as other related skills that can help you roll with the punches in life. Here are a few that felt most practical and new to me.

Surrender to transitions. When something changes in your life—you leave a job, end a relationship, or lose someone you love—recognize that you’re now in a transition. Transitions take time to move through, and they can’t be rushed. Your identity (as an employee, partner, or friend, perhaps) will have to shift and change, as well. Be kind and accepting, and don’t expect too much of yourself as you struggle through this time.

Prepare for change in advance. Sometimes change is unexpected, and other times you see it coming. When you anticipate a big change in life, spend some time exploring your feelings around it. You can list all the ways your life will change, and identify the ones that are causing you anxiety. Give yourself the opportunity to mourn what you will leave behind, but also devote some of your attention to new opportunities that you’re excited about.

Seek out small uncertainties. You can build up your tolerance for uncertainty, Fox explains, by gradually exposing yourself to it on purpose. For example, you could reach out to an acquaintance you haven’t seen in a while, try bargaining for an item you want to buy, or check social media less frequently.

Change up your perspective. One way to do this is to find something small that annoys you, and try to see the silver lining to it. For example, maybe your commute got longer, but that means you have extra time to listen to podcasts.   

When you’re facing a problem, you could change your perspective by brainstorming a handful of solutions, rather than trying to figure out the one perfect answer. Or make a list of people you admire, and ask yourself: What would they do in your place?

Ask a different question. When life is hard, we often find ourselves harping on “why” questions: “Why is this happening to me?” In those moments, Fox suggests letting go of the “why” and asking “how” instead: “How can I change this situation?” Or perhaps you’re already asking a “how” question, but the wrong one: Instead of “How do I stop working so much?,” she explains, try an easier question: “How can I find time to go to the gym?”

Move past worry. Repetitive worrying is one of the most common rigid thought patterns we get stuck in. To break free from it, identify whether the problem you’re worrying about is solvable or not—and take action if you can. If there’s nothing you can do, Fox suggests recording yourself talking in detail about your worries, and then listening to the recording repeatedly until your worries don’t have as much of a hold on you. 

It’s been about two months since my move, and my brain has calmed down about all the changes. (Surrender to transitions—got it.) I definitely see the appeal of being someone who moves through life agilely and with curiosity, letting things happen as they may and feeling confident I’ll figure out how to deal with them. Lists gripped tightly in hand, I have trouble ever imagining myself that way.

But Fox’s book helped put a name and an explanation on something I struggle with, so at least I have a goal to aspire to. Since reading it, I have noticed my knee-jerk resistance to plans changing or doing things someone else’s way, and I have been able to let go. I doubt anyone will ever call me spontaneous and easygoing, but at least I can make a point to expect the unexpected in life.

About the Author

Kira M. Newman is the managing editor of Greater Good. Her work has been published in outlets including the Washington Post, Mindful magazine, and Social Media Monthly, and she is the co-editor of The Gratitude Project.

October 15th 2022

Neuropsych — October 13, 2022

Opening the Stasi files: Would you read the secrets your government kept on you?

What if your best friend was an informant?

Credit: Annelisa Leinbach

Key Takeaways

  • In 1991, the German government allowed the public to open the “Stasi files” that the East German secret police had kept on them.
  • It’s estimated that less than half of those who thought they had files applied to see them. The majority didn’t want to know.
  • One of the biggest reasons given for not finding out was that people were worried that their present-day relationships would be damaged if they learned that others close to them were informants. 

An interesting area of “deliberate ignorance” concerns once-secret government data on individuals. Would you want to know what spies and surveillance teams have found out about you? Are you curious as to what it would say? As it happens, there’s a research paper about just that.

Opening the Stasi files

In the decades after World War II, East Germany was a fearful place of suspicion, surveillance, and spies. The Communist state’s secret police, the Stasi, wiretapped, bugged, and tracked citizens on an enormous scale. By the time the Berlin Wall came down in 1989, the Stasi had over 90,000 employees and an estimated 200,000 informants. While the Stasi disappeared — reabsorbed into a healing, reunited nation — the millions of pages of information they had collected on people did not.

In 1991, the reunified German government passed a bill that allowed people to access and view the Stasi files that were kept on them. What do you think people did? What would you do? A lot of people wanted to know. By 2020, 2.17 million people had applied to see their files. But that’s not as many as you might have thought. Given that an estimated 5.25 million people in Germany believe they have a file to see, more than half chose not to see their Stasi files. So, why didn’t all those people want to know?

The study revealed that the vast majority of people simply thought the information on them wasn’t relevant. The fact that they read “capitalistic literature” or fraternized with certain unsavory characters simply didn’t matter anymore in 1990s Germany. What’s more interesting, though, is the next largest reason: People didn’t want to know if their friends, family, or colleagues were informants.

Trust issues

The Stasi files research reveals that people did not want to ruin the relationships they had in their present lives. Imagine, for instance, that you opened a file and found information that only your spouse or a very close family member could have divulged. It would ruin that relationship forever. If you discovered that your best friend’s name appeared at the top of an “informants” document, how would that change your friendship? For a society that had been shrouded in suspicion and mistrust for so long, the opening up of personal Stasi files served only to erode that trust further.

What this also teaches us is just how far people are willing to forgive, or at least forget, the wrongdoings of those “on the other side.” It’s something that a lot of us do not think about in the 21st century. When Nazi Germany fell to the Allies, the thousands of bureaucrats of the Nazi machine simply found new work in the new country. Few to no questions were asked. Thousands of informants, party members, enablers, and soldiers were reabsorbed into a healing society (although, the most notorious were either arrested or subjected to “denazification”). The same story was told in Vichy France, as well as all Nazi-occupied countries across Europe. It’s a story being told even today where you find sudden, bloody regime change such as in Afghanistan or after the Arab Spring.

For a country to heal, perhaps there is a necessary element of “deliberate ignorance” — a no-questions-asked policy. It’s something mirrored, on a smaller scale, in the wake of elections in democracies. Invariably, the winner’s speech will be one of reconciliation, renewal, and getting on with the job. It was a hard, vitriolic battle, but let’s move on now. The case of the Stasi files reveals how deeply this is ingrained in our collective ability and desire to forget.

Jonny Thomson teaches philosophy in Oxford. He runs a popular account called Mini Philosophy and his first book is Mini Philosophy: A Small Book of Big Ideas.

October 11th 2022

There are 3 main attachment styles in every relationship—here’s the ‘healthiest’ type, says therapist

Published Wed, Sep 28 2022, 10:36 AM EDT | Updated Thu, Sep 29 2022, 3:33 PM EDT

John Kim, Contributor, @angrytherapist

Our attachment style is shaped and developed in early childhood by our relationships with our parents.

According to attachment theory, first developed by psychologist Mary Ainsworth and psychiatrist John Bowlby in the 1950s, we mirror the dynamics we had with our parents — or primary caregivers — as infants and children.

As a therapist who specializes in relationships, I’ve found that attachment style discussions are not typical until much later on in life, when we must start to examine our relationship patterns and connect the dots.

The 3 main attachment styles: Which one are you?

Attachment theory is nuanced, like humans are. Although it is a spectrum of four styles, common parlance refers to only three: anxious, avoidant and secure.

Studies show that people who are securely attached have the healthiest relationships, and it’s the type that everyone should strive for.

Understanding which style you fall under — and the specific details surrounding it — can help you take control of how you relate to other people, particularly in stressful situations.

1. Anxious attachment style

Anxious attachment is characterized by a concern that the other person, whether a significant other, friend or family member, will not reciprocate your level of availability.

This is generally caused when a child learns that their caregiver or parent is unreliable and does not consistently provide responsive care towards their needs.

I am anxiously attached. My parents came to America with very little money. They worked a lot and were more worried about paying the bills than creating a safe emotional space where secure attachments grew. 

Anxious attachment types have a sense of unworthiness but generally evaluate others positively. As a result, they strive for self-acceptance by tying their worth to approval and validation from their relationships.

Knowing this about myself has been a game-changer in my current relationship. Instead of demanding, wanting more, feeling rejected and undesired, I can take ownership and remind myself that how I feel may not be the reality.

2. Avoidant attachment style

My partner Vanessa leans toward an avoidant attachment style. Children who fall under this category tend to avoid interaction with their parents, and show little or no distress during separation. The child may believe that they cannot depend on the relationship.

An avoidant attachment style shows up in adults who hold a positive self-image and a negative image of others. They prefer to avoid close relationships and intimacy in order to retain a sense of independence and invulnerability. It’s a way to hide and not truly show themselves.

The avoidant struggles with intimacy and expressing feelings, thoughts and emotions. They are often accused of being distant and closed off. The closer someone gets and the needier they seem to become, the more an avoidant withdraws.

Knowing that Vanessa has more of an avoidant attachment style makes me understand and listen to her more, instead of immediately jumping to blame. 

3. Secure attachment style

People who are securely attached appreciate their own self-worth and ability to be themselves in their relationships. They openly seek support and comfort from their partner, and are similarly happy when their partner relies on them for emotional support.

During the childhood years, their caregivers made sure they felt valued, supported, heard and reassured. Here are some ways securely attached kids show up as adults:

  • They are able to regulate emotions and feelings in a relationship.
  • They show strong goal-oriented behavior when on their own.
  • They don’t struggle with opening up and trusting others.
  • They are comfortable being alone and use that time to explore their emotions.
  • They have a strong capacity to reflect on how they are maneuvering in a relationship.

Secure attachment is what everyone is swimming towards, including Vanessa and me. But it takes awareness and practice.

The good news about attachment styles

We can become more and more securely attached as we experience healthy attachment habits in our adult relationships.

Because Vanessa is aware of her tendency to be avoidant, for example, she’s able to reflect on her emotional responses and see that they are mostly a knee-jerk reaction she’s adopted for protection. Then she can challenge herself to choose differently based on the kind of connection she truly wants.

We both give each other the space and the loving boundaries that we expect from one another.

Rewiring yourself to be more securely attached has to be a lifestyle, an everyday thing. Because as humans, we snap back if we are not intentional and just live by our default.

We all have our stories; no one has a perfect childhood. And it’s not about blaming or living in the past. It’s about looking at who we are now, and healing and evolving to become more secure.

John Kim, LMFT, is a therapist and life coach based in Los Angeles. He is also the author of “It’s Not Me, It’s You” and “Single on Purpose.” Follow him on Twitter and Instagram.

September 27th 2022

The big idea: should we drop the distinction between mental and physical health?

The current false dichotomy holds back research and stigmatises patients


Illustration: Elia Barbieri

Edward Bullmore | Mon 12 Sep 2022 12.30 BST | Last modified on Tue 13 Sep 2022 02.50 BST

A few months ago, I was infected by coronavirus and my first symptoms were bodily. But as the sore throat and cough receded, I was left feeling gloomy, lethargic and brain-foggy for about a week. An infection of my body had morphed into a short-lived experience of depressive and cognitive symptoms – there was no clear-cut distinction between my physical and mental health.

My story won’t be news to the millions of people worldwide who have experienced more severe or prolonged mental health outcomes of coronavirus infection. It adds nothing to the already weighty evidence for increased post-Covid rates of depression, anxiety or cognitive impairment. It isn’t theoretically surprising, in light of the growing knowledge that inflammation of the body, triggered by autoimmune or infectious disease, can have effects on the brain that look and feel like symptoms of mental illness.

However, this seamless intersection of physical and mental health is almost perfectly misaligned with the mainstream way of dealing with sickness in body and mind as if they are completely independent of each other.

In practice, physical diseases are treated by physicians working for medical services, and mental illnesses are treated by psychiatrists or psychologists working for separately organised mental health services. These professional tribes follow divergent training and career paths: medics often specialise to focus exclusively on one bit of the body, while psychs treat mental illness without much consideration of the embodied brain that the mind depends on.

We live in a falsely divided world, which draws too hard a line – or makes a false distinction – between physical and mental health. The line is not now as severely institutionalised as when “lunatics” were exiled to remote asylums. But the distinction remains deeply entrenched despite being disadvantageous to patients on both sides of the divide.

A 55-year-old woman with arthritis, depression and fatigue, and a 25-year-old man with schizophrenia, obesity and diabetes, have at least this in common: they will probably both struggle to access joined-up healthcare for body and mind. Psychological symptoms in patients with physical disease are potentially disabling yet routinely under-treated. Physical health problems in patients with major psychiatric disorders contribute to their shockingly reduced life expectancy, about 15 years shorter than that of people without them.

Why do we stick with such a fractured and ineffective system? I will focus on two arguments for the status quo: one from each side, from the tribes of medics and psychs.

For the medics, the problem is that we just don’t know enough about the biological causes of mental illness for there to be a deep and meaningful integration with the rest of medicine. Psychiatry is lagging behind scientifically more advanced specialities, such as oncology or immunology, and until it catches up in theory it can’t be joined up in practice. To which I would say yes but no: yes, greater detail about biological mechanisms for mental symptoms will be fundamental to the fusion of mind and body medicine in future; but no, that is not a sufficient defence of the status quo, not least because it discounts how much progress has already been made in making biomedical sense of illnesses such as schizophrenia.

When I started as a psychiatrist, about 30 years ago, we knew that schizophrenia tended to run in families; but it is only in the last 5-10 years that the individual genes conferring inherited risk have been identified. We were unsure whether schizophrenia was linked to structural changes in the brain; but MRI scanning studies have established beyond doubt that it is. We were puzzled that the risk of diagnosis was increased among young adults born in the winter months, when viral infections are more common; but now we can begin to see how the mother and child’s immune response to perinatal infection could disrupt the synaptic pruning process which is crucial to development of brain networks throughout childhood and adolescence.

For the psychs, the problem is fear of excessive reductionism: that the personal and social context of mental illness will be neglected in pursuit of an omnipotent molecule or other biological mechanism at the root of it all. That would indeed be a dead end, but it’s not a likely destination.

We have known since Freud that childhood experience can have a powerful effect on adult mental health. There is now massive epidemiological evidence that social stress, broadly speaking, and early life adversity in particular, are robust predictors of both mental illness and physical disease. Only a biomedical zealot in denial would claim this doesn’t matter. But the question remains: how does experience of poverty, neglect, abuse or trauma in the first years of life have such enduring effects on health many decades later?

Freud’s answer was that traumatic memories are buried deep in the unconscious mind. A more up-to-date answer is that social stress can literally “get under the skin” by rewriting the script for activation of the genetic blueprint. Molecular modifications called epigenetic marks cause long-term changes in the brain and behaviour of young rats deprived of maternal affection or exposed to aggression. Similar mechanisms could biologically embed the negative impacts of early-life adversity in humans, exacerbating inflammation and steering brain development on to paths that lead to mental health problems in future.

As things stand, these are plausible theories based on animal experiments rather than established facts in patients. But already they tell us this is not a zero-sum game. Drilling down on the biological mechanisms doesn’t mean that we must abandon or devalue what we know about the social factors that cause mental illness. Anxious anticipation of such a binary choice is itself a symptom of the divided way of thinking that we need to escape.

So, if we could entirely free ourselves from this unjustified class distinction between mental and physical health, what changes might we hope to see in future?

For medics and psychs, there will be more educational and career paths that cut across, rather than entrench, specialisations. Diagnostic labels categorically ordained by the bible of psychiatric diagnosis, the Diagnostic and Statistical Manual of Mental Disorders (DSM), will be reformulated in terms of the interactions between biomedical and social factors that cause mental symptoms. There will be new treatments to tackle the physical causes of mental illness, which are expected to be many and variable between patients, rather than trying to smother symptoms by “one size fits all” treatment regardless of cause. Knowing more about their physical roots, we should be much more successful at predicting and preventing mental health disorders.

For patients, the result will be better physical and mental health outcomes. There will be more integrated specialist physical and mental health services, like the new hospital we are planning in Cambridge for children and young people, so that body and mind can be treated under one roof throughout the first two decades of life. There will be more opportunities for people with relevant lived experience to co-produce research investigating the links between physical and mental health. But the biggest impact of all could be on stigma. The sense of shame or guilt that people feel about being mentally ill is an added load, a meta-symptom, culturally imposed by the false dichotomy between physical and mental health. Without it, the stigma of mental illness should fade away, just as the stigma attached to epilepsy, tuberculosis and other historically mysterious disorders has been diminished by an understanding of their physical causes.

Ultimately it is easier to imagine a better future for mental and physical health together than for either alone.

Edward Bullmore is professor of psychiatry at the University of Cambridge and author of The Inflamed Mind: A Radical New Approach to Depression (Short Books).

Further reading

Inventing Ourselves: The Secret Life of the Teenage Brain by Sarah-Jayne Blakemore (Black Swan, £9.99)

The Body Keeps the Score: Mind, Brain and Body in the Transformation of Trauma by Bessel van der Kolk (Penguin, £12.99)

Illness as Metaphor & Aids and its Metaphors by Susan Sontag (Penguin Classics, £14)

Comment Societies are inherently unequal. Lies to the contrary engender escapism through all manner of self-abuse, guilt and self-blame. Body chemistry responds to perceptions, the most obvious example being the brain and body’s chemical response to triggers for sexual arousal.

Many get off on power, even if the person is so lowly that the best they can do is kick the cat. Depression comes from such circumstances as lack of hope, isolation, being scapegoated or convicted for things you haven’t done. These have physical consequences through perceptions and brain changes, starting with anxiety and self-neglect, leading into alcoholism and drug addiction, which can cause schizophrenia and hearing voices due to brain pathway damage, and homelessness. Guilt is a huge issue, but it should be psychiatrists and the elite who feel it.

So enters the state psychiatrist, young, brainwashed into believing that all the answers are in the 6kg DSM rather than in R.D. Laing’s ‘The Divided Self’, a book that should have been groundbreaking.

They are straight out of med school or uni, depending on whether they have studied medicine or psychology. There they have learned less and less about less and less, as education is progressively about social control and creating minions with what the system defines as ‘normal psychology.’ The job is the same: save society from the guilt and punishment it deserves. Record numbers of increasingly hopeless, scared, depressed and demoralised youth are committing suicide. As with the current high-profile U.K case, the knee-jerk response is to blame the internet, because the corrupt elite running society need protection. That is why moronic lackeys from the police do the ground work, banging on doors and dragging in the patients, handcuffed if necessary so the officers are safe. Society is one big sickness and getting worse. Covid lockdowns in response to a man-made virus generated a mental health epidemic, health-destroying ruin and suicide, because the elite needed to create health-destroying fear and conformity. R J Cook

Miss Roberta Jane Cook: So-called health professionals, conspicuously directed by police lies, ruined her when she was labelled a paranoid personality, schizophrenic, bi-polar and delusional, ‘not needing hospital yet.’ The effects on her mental and physical health have been massive. Society says she is in denial and needs police surveillance 24/7. To suggest that society’s elite are paranoid and in denial is taken as more evidence of paranoia. As with the casino, and as demonstrated by Julian Assange, the House always wins, and is run by little more than gangsters and henchmen.

September 22nd 2022

Why Companies Are So Interested in Your Myers-Briggs Type

If you’ve looked for a job recently, you’ve probably encountered the personality test. You may also have wondered if it was backed by scientific research.

Getty/Jonathan Aprea. By Ben Ambridge, September 7, 2022. 5 minute read.

How many piano tuners are there in the entire world? How much should you charge to wash every window in Seattle? How many golf balls can you fit in a school bus?

According to urban legend, these are all questions that Google asks at interviews. Or at least, that Google used to ask. Apparently the off-beat questions have fallen out of favor. So, what do they do instead? Well, as well as asking more standard interview questions (e.g., “Tell us about a time you faced and overcame an important challenge”), Google uses personality tests. In fact, according to Psychology Today, around 80 percent of Fortune 500 companies use personality tests in some form.

But is the use of personality testing for making hiring decisions backed by scientific research? Well, it’s complicated. There are two basic approaches to personality tests: trait-based and type-based.

Mainstream academic psychology has gone almost exclusively down the route of trait-based approaches. By far the dominant approach is known as the “Big 5,” as it assumes that personality can essentially be boiled down to five traits, summarized by the acronym OCEAN: Openness to experience; Conscientiousness; Extraversion; Agreeableness; and Neuroticism (these days, more often referred to as “Emotional Stability”). What makes this a trait-based (as opposed to a type-based) approach is that each of these follows a sliding scale. For example, you might score 82/100 for Conscientiousness, 78/100 for Extraversion, 48/100 for Agreeableness, and so on (if you’re interested, there are many places online where you can take a free version of this personality test yourself). What this approach does not do is categorize people into types (e.g., “He’s an extrovert,” “She’s an introvert”).

Do scores on the “Big 5” predict aspects of performance in the workplace? Psychologists Leonard D. Goodstein and Richard I. Lanyon answer with a cautious “yes.” Unsurprisingly, conscientiousness shows a correlation with most measures of job performance, regardless of the particular job or of how performance is measured, though the size of the correlation is modest. Researchers measure the relationship between two things (in this case, conscientiousness on a questionnaire and some measure of job performance) on a scale from 0 (no relationship whatsoever) to 1 (a perfect relationship: i.e., if you know a person’s conscientiousness score, you can predict their score on the measure of job performance with perfect accuracy).

On the 0–1 scale, the relationship between conscientiousness and job performance (as measured by this so-called “r value”) was 0.22; not trivial, but by no means large. This means only around 5 percent of the variation between different people on their job performance can be explained by their conscientiousness score on the personality questionnaire (calculated by squaring 0.22 to give the “r-squared value”). Similarly modest correlations were observed between extraversion and performance, but only—as you might expect—for employees involved in sales (r=0.15) or managing others (r=0.18). Openness to experience (creativity, enjoying new things) was positively correlated with employees’ ability at training others (r=0.25), but not with their job performance per se.
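The arithmetic behind these figures is easy to check. The sketch below is purely illustrative (all data are simulated, not taken from the studies cited): it computes a Pearson r between made-up conscientiousness scores and performance ratings, then squares it to get the "variance explained" figure described above.

```python
# Illustrative only: simulated data, chosen so the correlation comes out
# modest (around r = 0.22), as in the studies described in the article.
import math
import random

random.seed(0)

# Simulate 200 employees: performance is only weakly driven by conscientiousness.
conscientiousness = [random.gauss(50, 10) for _ in range(200)]
performance = [0.22 * c + random.gauss(0, 10) for c in conscientiousness]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(conscientiousness, performance)
r_squared = r ** 2  # proportion of variance in performance "explained"

# Squaring an r of 0.22 gives about 0.048, i.e. roughly 5% of the variation,
# which is where the "around 5 percent" figure in the text comes from.
print(f"r = {r:.2f}, r-squared = {r_squared:.3f}")
```

The point of the r-squared step is that even a "not trivial" correlation leaves about 95 percent of the person-to-person variation in job performance unaccounted for.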

So far, so (cautiously) good. But here’s the thing: Outside of academic psychology, in the world of big business, employers tend not to use the “Big 5” or other trait-based measures of personality. Instead, they lean toward type-based measures such as the Myers-Briggs. Type-based measures don’t give people scores on continuous scales but instead categorize them into distinct “types.”

In the case of Myers-Briggs, there are 16 different types, defined by the test-takers’ preferences on four dimensions:

Extraversion (outgoing, life and soul of the party) or Introversion (prefer calmer interactions)
Sensing (relying mainly on your eyes and ears, etc.) or Intuition (seeing patterns or connections)
Thinking (prioritizing logic in decision making) or Feeling (prioritizing emotions in decision making)
Judging (living life in a planned, orderly way) or Perceiving (living life in a flexible, spontaneous way).

Your personality type is simply a combination of your preferences. For example, my type would be “Extraversion Intuition Thinking Judging.”

How does the Myers-Briggs fare as a measurement of personality? The answer—unlike for the “Big 5,” which is generally well supported by a large body of research—is that we just don’t know. A 2017 systematic review and meta-analysis set out to investigate “the validity and reliability of the Myers-Briggs,” trying to determine whether the test measures what it claims to measure (validity), and whether it comes up with more or less the same answer when people take the test several times (reliability).


The researchers came up almost blank: Out of 221 studies of the Myers-Briggs Type Indicator, only seven studies met their criteria for inclusion: four looking at validity and three at reliability. The four validity studies concluded that individual Myers-Briggs scores do seem to correlate well with one another and/or other personality measures, although the studies were too different to allow their results to be combined (as is usually done for meta-analysis). The three reliability studies concluded that the correlation between an individual’s score on different sittings of the same test is generally good (r=0.7–0.8), although almost all were conducted on college students, who are not necessarily representative of the general population.
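To get a feel for what a test-retest correlation of roughly 0.7–0.8 does and does not guarantee once continuous scores are cut into letter types, here is a toy simulation (every number in it is invented for illustration, not drawn from the studies above): the same simulated people "sit" a test twice with correlated scores, and we count how often their Extraversion/Introversion letter changes, since anyone whose score lands near the cutoff can flip letter even when the underlying scores correlate well.

```python
# Illustrative sketch: simulate 1,000 people taking a test twice, with the
# noise level chosen so the two sittings correlate at roughly r = 0.75,
# then dichotomize each sitting at the midpoint as a type-based test would.
import random

random.seed(1)

N = 1000
# Underlying extraversion on a 0-100 scale, centred at 50.
true_scores = [random.gauss(50, 10) for _ in range(N)]
# Each sitting adds measurement noise (SD chosen so between-sitting r ~ 0.75).
sitting1 = [t + random.gauss(0, 6) for t in true_scores]
sitting2 = [t + random.gauss(0, 6) for t in true_scores]

def type_label(score, cutoff=50):
    """Dichotomize a continuous score into a Myers-Briggs-style letter."""
    return "E" if score >= cutoff else "I"

flips = sum(type_label(a) != type_label(b) for a, b in zip(sitting1, sitting2))
print(f"{flips / N:.0%} of simulated people changed letter between sittings")
```

In runs of this toy model a noticeable minority of people change letter, which is one reason the distinction between score reliability (good, per the meta-analysis) and type stability matters when interpreting type-based results.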

And that’s it—there’s simply very little data on how well the Myers-Briggs (and other type-based personality tests) measures personality and even less on how it might predict job performance. We just don’t know. It’s too early to say that the test is “meaningless,” “totally meaningless,” or “a fad.” One thing is clear: valid and reliable alternatives that have been shown to correlate with job performance—tests based on the “Big 5” model—are widely available for free.

Comment This is not about accuracy. It is about reinforcing conformity, implanting in candidates what they should aspire to regardless of class, race, gender or experience. Britain is a leader in police state controls, using notions of the DSM (Diagnosis, Statistics and Medication) along with malicious police devices like PNC criminal markers, created by malicious, moronic police officers on dubious so-called ‘soft intelligence’, the likes of which saw a young black man shot dead in Streatham. A person can go a lifetime without getting a good job, or any job at all, and so no mortgage, because of this. Background checks are secretive, with confidential calls and memos destroying people’s lives – from vindictive police and careerists who never admit mistakes, closing ranks to protect themselves.

The propaganda that Britain or the U.S exemplify democracy is absurd. The rampant Royalism which the elite and their media are now projecting onto the once-vilified Charles and Camilla should be hard to stomach for people with a brain, unless they are part of the ruling elite interest group or their lackeys, who certainly have questionable intelligence.

Britain has reached new levels of fake diversity. Take the deaths of Stephen Lawrence, Dalian Atkinson, Jean Charles de Menezes and the Streatham police killing. Police response in this country to their misconduct and crimes is always knee-jerk cover-up ‘in the public interest.’ They mean ‘class interest.’ This was pretty obvious from all the black dictators and other tyrants turning up welcome at the Queen’s funeral. The Queen was a totem who inherited a life of privilege, now portrayed as a Mother Teresa with a life of sacrifice. The self-indulgence of the Royal family does not bear scrutiny. The masses love it, because like puppets they cannot see the strings. That is what the money-spinning Myers-Briggs test is all about.

R J Cook

Miss Roberta Jane Cook, 1990: “I never made a secret of being transsexual. My wife knew. But LGBTQI is not about accepting us as women, as hate crime against me, including by police, demonstrates.
I knew my marriage was over by then, but we have to be treated as freaks and forced to conform, as through devices like Myers-Briggs.
I stayed married to protect my children in a world that sanctifies cis women regardless of what they do. These cis women perceive us trans ladies as a threat to their efforts to masculinise their gender while behaving like the worst power maniacs and little Hitlers. They can say what they like about us but want hate laws and restraining orders to keep themselves and ‘their children’ safe. They perpetuate the myth that cis women never lie, are always victims, care for children (ironic, as an Irish woman recently locked her toddlers into her car and set it on fire) and fight for equal rights. If it had been a man, there would have been a national self-righteous media outburst asserting the need to do more to control male violence. It is against this background of liars, posers, manipulators and cheats that the Myers-Briggs test should be judged.”

September 19th 2022

Don’t Insist on Being Positive—Allowing Negative Emotions Has Much to Teach Us

Leaning into difficult feelings can help you find the way forward, according to a refreshing new wave of books, says Jamie Waters.

The Guardian


‘Complaining is natural because language converts a “menacing cloud” into “something concrete”. Photo by erhui1979/Getty Images

Eight years ago, when Whitney Goodman was a newly qualified therapist counselling cancer patients, it struck her that positive thinking was being “very heavily pushed”, both in her profession and the broader culture, as the way to deal with things. She wasn’t convinced that platitudes like “Look on the bright side!” and “Everything happens for a reason!” held the answers for anyone trying to navigate life’s messiness. Between herself, her friends and her patients, “All of us were thinking, ‘Being positive is the only way to live,’ but really it was making us feel disconnected and, ultimately, worse.”

This stayed with her and, in 2019, she started an Instagram account, @sitwithwhit, as a tonic to the saccharine inspirational quotes dominating social media feeds. Her posts included: “Sometimes things are hard because they’re just hard and not because you’re incompetent…” and “It’s OK to complain about something you’re grateful for.” It took off: the “radically honest” Miami-based psychotherapist now has more than 500,000 followers.

Goodman’s 2021 book, Toxic Positivity, expands on this thinking, critiquing a culture – particularly prevalent in the US and the west more broadly – that has programmed us to believe that optimism is always best. She traces its roots in the US to 19th-century religion, but it has been especially ascendant since the 1970s, when scientists identified happiness as the ultimate life goal and started rigorously researching how to achieve it. More recently, the wellness movement – religion for an agnostic generation – has seen fitness instructors and yogis preach about gratitude in between burpees and downward dogs. We all practise it in some way. When comforting a friend, we turn into dogged silver-lining hunters. And we lock our own difficult thoughts inside tiny boxes in a corner of our brains because they’re uncomfortable to deal with and we believe that being relentlessly upbeat is the only way forward. Being positive, says Goodman, has become “a goal and an obligation”.

Toxic Positivity is among a refreshing new wave of books attempting to redress the balance by espousing the power of “negative” emotions. Their authors are hardly a band of grouches advocating for us to be miserable. But they’re convinced that leaning into – rather than suppressing – feelings, including regret, sadness and fear brings great benefit. The road to the good life, you see, is paved with tears and furrowed brows as well as smiles and laughter. “I think a lot of people who focus on happiness, and the all-importance of positive emotions, are getting human psychology wrong,” says Paul Bloom, a psychology professor at Yale and the author of The Sweet Spot, which explores why some people seek out painful experiences, like running ultra marathons and watching horror movies. “In a life well lived, you should have far fewer negative than positive emotions, but you shouldn’t have zero negative emotions,” adds Daniel Pink, the author of The Power of Regret. “Banishing them is a bad strategy.”

The timing of these new works – which also include Helen Russell’s podcast (following her book of the same name) How To Be Sad – is no coincidence. In light of the pandemic and now the conflict in Ukraine, it seems trite to suggest a positive outlook is all we need. Strong negative emotions – fear, anxiety and sadness – are a natural response to what’s happening around the world right now and we shouldn’t have to deny them.

These authors want you to know that “negative” emotions are, in fact, helpful. Russell talks about sadness being a “problem-solving” emotion. Research from the University of New South Wales shows that it can improve our attention to detail, increase perseverance, promote generosity and make us more grateful for what we’ve got. “It’s the emotion that helps us connect to others,” she adds. “We’re nicer, better people in some ways when we are sad.”

It’s tougher making an argument for regret, which might be the world’s most maligned emotion, but Pink is game. From a young age we are instructed to never waste energy on regrets. The phrase “No regrets” is inked into arms and on to bumper stickers and T-shirts. Seemingly every famous person has a quip about living without regrets (I would know: as someone who tends to linger on thoughts of what might have been, I’ve read them all). Pink says we’re getting it all wrong. “A ‘No regrets’ tattoo is like having a tattoo that says ‘No learning’,” says Pink, who was also a speechwriter for Al Gore, speaking from Dallas, Texas. He became interested in this topic because he couldn’t shake his own regrets about the fact that, while a university student, he wasn’t kind to fellow pupils excluded at social events. “If it has bothered me for a month, a year, or in this case 20 years, that’s telling me: ‘Hey, you might not realise it, but you care about kindness,’” he says. “Regrets clarify what matters to us and teach us how to do better. That’s the power of this emotion – if we treat it right.”

The problem? We’re not taught how to effectively process these difficult emotions. A good starting point is to familiarise ourselves with these feelings by acknowledging them and sitting with them for a beat. That takes practice, says Goodman. “It can include learning how your emotions feel in your body, and what to call them. When we’re able to put a name to a feeling, it makes it less scary. And when something is known, we can figure out what we want to do with it.”

Telling others about it lightens the weight. Complaining is perfectly natural, says Goodman. And articulating it helps us pinpoint what it is that’s bothering us, because language converts this “menacing cloud” into “something concrete”, says Pink. That disclosure could be to a friend, therapist or total stranger. In his Regret Survey, 18,000 people anonymously shared their biggest regrets, while Russell suggests a “buddy” system, in which you make a reciprocal agreement with someone to talk about your worries without interruption. (A note, if you are comforting a friend: listen and ask questions rather than immediately reaching for pick-me-ups.)

Your next step will likely depend on the nature – and severity – of the emotion. To help us sit with sadness, Russell advocates being in nature. Cultural pursuits can help, too. “It sounds a little ‘woo’, but there are lots of studies about the effectiveness of reading therapy and looking at a piece of art – and how music can change our moods,” she says. “Sad music can act as a companion when we’re feeling sad, rather than making us feel lower. I do think it’s liberating when you finally kind of surrender to it all.”

Pink, whose approach is a little more structured, differentiates between regrets of action (wrongs you’ve committed) and inaction (opportunities not seized). For both, you must comfort yourself with the knowledge that everyone has regrets – and recognise that that single thing doesn’t define you. “Don’t look at a mistake as St Peter at the gate passing final judgment on your worth,” he says, but as “a teacher trying to instruct you.” He recommends stepping outside yourself and considering what you would recommend a friend do in a similar situation, whether that’s making amends for past acts, grasping a new opportunity, or ensuring you don’t make a similar misstep in the future.

Crucially, processing negative emotions “should all feel somewhat productive in the end”, says Goodman. Meaning: instead of ending up in a funk of wallowing, with your feelings replaying on a loop, “the wheels are turning, you’re making connections, you’re figuring things out,” she says. That doesn’t mean you need to come out of it feeling happy, or with a neat fix. “Sometimes you just get to a place where you say, ‘That was really hard, and now it’s over or now I’m not dealing with that any more’,” says Goodman. “And if it comes up for me again, I’ll deal with it.”

Leaning into negative thoughts should ultimately leave you with a sense of fulfilment. While we might instinctively think that filling our days solely with joy and excitement is the dream, “if we want to live a meaningful and purposeful life, a lot of pain is going to be part of it”, says Bloom. “What I really want is for people to be able to enjoy the full range of the human experience,” adds Goodman. Armed with the knowledge that you can do it in a methodical way, don’t be afraid to let the darkness in.

September 10th 2022

J Psychiatr Pract. 2012 May;18(3):221-4. doi: 10.1097/

Inevitable suicide: a new paradigm in psychiatry

Benjamin J Sadock


The author suggests that a new paradigm may be needed which holds that some suicides may be inevitable. The goal of this paradigm would be to diminish the sense of failure and inadequacy felt by many psychiatrists who experience the suicide of a patient and to increase understanding of the unique biopsychosocial profile of those whose suicides appear to be inevitable. The author stresses that this proposed paradigm should not be misconstrued as therapeutic nihilism but rather should serve to stimulate efforts to treat this patient population more effectively. Risk factors that place individuals at high risk for suicide are reviewed, including presence of a mental illness, genetic predisposition, and factors such as a history of abuse, divorce, unemployment, male gender, recent discharge from a psychiatric hospital, prior suicide attempts, alcohol or other substance abuse, a history of panic attacks, and persistent suicidal thoughts, especially if coupled with a plan. The author notes that, in those suicides that appear to have been inevitable, risk factors are not only numerous but at the extreme end of profound pathology. The example of Ernest Hemingway is used to illustrate how such a combination of risk factors may have contributed to his eventual suicide. Psychiatrists, like other doctors, may have to acknowledge that some psychiatric disorders are associated with a high mortality rate as a natural outcome. This could lead to heightened vigilance, a more realistic view of what can and cannot be achieved with therapy, and efforts to improve the quality of life of patients at high risk for suicide with the goal of reducing this risk and prolonging their lives. (Journal of Psychiatric Practice 2012;18:221-224).


September 9th 2022

How to Spot the Potential Warning Signs of Suicide

Suicide expert Dr. Mark Russ on recognizing suicidal behavior, and how to help someone who is suffering and may be contemplating taking their own life.


14 Min Read • Mental Health • Story By Courtney Allison

Despite growing public attention and efforts to curb the country’s suicide rate, the statistics are sobering: Suicide is the 10th-leading cause of death in the United States, according to the American Foundation for Suicide Prevention. In 2017, more than 47,000 Americans died by suicide, a 33% increase since 1999.

“Suicide is both a public health problem and, of course, an extremely individual one,” says Dr. Mark Russ, vice chair for clinical programs and medical director at NewYork-Presbyterian Westchester Division. “It’s critically important to continue to raise public awareness of suicide and suicide risk to try to get our arms around this, to do more research to understand the underpinnings of suicidal behavior, and educate people as best we can, including how to intervene on the level of families, schools, employers, and the medical community.”

Stressful life events are unavoidable, but it’s important to remember that painful feelings will not last forever, Dr. Russ emphasizes.

“I think many people feel that what they are feeling in the moment they will always be feeling, and therefore ‘I need to end this in some way,’” he says. “That is not true. We know that moments of emotionally intensely painful feeling tend to come and go. They don’t last forever; that is just sort of the way the brain works. Sometimes, just letting people know that what they are feeling now is not something they’re going to need to bear forever can help — that even if they do nothing, in time, they are likely to feel better.”

How do you know if someone you love is considering taking their life? And is there a way to help? Health Matters spoke with Dr. Russ to better understand what may drive someone to take their life, and potential warning signs of suicide.

Why do you think the suicide rate is rising?
Dr. Russ: It’s unclear why the rate is increasing. It could have to do with more people reporting it, or societal stressors. Some have speculated that it could be related to the opioid crisis, social media, availability of information, bullying, or copycatting. Financial concerns are also a huge stressor for many people.

Who is most at risk?
The rates are rising in middle-aged individuals, particularly among white men, as well as adolescents and young adults. There are also increases among the elderly and the African American community.

We can only speculate, but for a middle-aged man it could have something to do with financial stressors, phase of life, the prospect of losing one’s job, or actually losing one’s job, retirement without adequate resources, or interpersonal issues involving family or divorce. For young people, possible factors include social media stress in terms of bullying, and tremendous social stress around getting into and succeeding in college and beyond.

Suicide is highest in people with existing mental and psychiatric disorders. We believe that there is a genetic component to suicide risk, and we know from studies that suicide can run in families.

What could trigger suicide?
Any kind of a loss is potentially a trigger for thinking about ending one’s life. It’s important to note that there is no absolute scale for loss. You can’t judge the meaning or impact of a loss to an individual. A circumstance or event that one person might regard as relatively trivial may be extremely impactful to an individual based on their experience, their interpersonal dynamics and who they are in the world. Some people may be very sensitive to humiliation or to being slighted. One person may think that’s just a part of life, but for another person it may be devastating.

“Struggling with depression, anxiety, or having thoughts of suicide is not uncommon, yet many people seem to distance themselves from it — perhaps because it’s not the way that people want to see themselves.”

— Dr. Mark Russ

What are the warning signs that someone may be considering suicide?

  • A change in someone’s mood or behavior, particularly along depressive and anxious lines, or isolating themselves. This can come across as either a sad mood, a feeling of disconnection, or social withdrawal. People at risk for suicide may isolate themselves. They may be quieter or not enjoy things the way they used to. Adolescents who used to love to engage in sports or play video games may stop doing that. If you get the sense that they are lacking pleasure in life, that is an extremely important warning sign.
  • Any statements that express a sense of hopelessness, helplessness, or worthlessness are evidence the individual may be becoming depressed. People may make a direct or indirect comment about suicide. Often, they may use euphemisms, like “I can’t take this anymore,” “I’m at the end of my rope,” “I want to throw in the towel,” or “Not sure how much longer I can go on.”
  • Changes in intake of alcohol or use of illicit drugs.
  • Signs of intense agitation, anxiety, or feelings of tremendous inner pain. This is an extraordinarily important warning sign of suicide because it may suggest that they are going to act soon.

What do you do if you see these warning signs of suicide?
Awareness is most important. People may deny that they or a loved one could be considering suicide because it’s too painful a thought. The natural tendency is to minimize symptoms and risks when they see them in other people. A loved one, for example, who has just suffered a traumatic life event, or experienced a situation that is painful, problematic, or difficult, might be overwhelmed by it and might be thinking that this is not something they can take anymore. The first step is to consider the possibility that suicide is among the potential outcomes of the situation and to listen very carefully.

How can you be a good listener?
Being a good listener can be tough; it’s a skill we in behavioral health spend many years of training to develop. The natural inclination for most people is to jump in, share their experiences, give advice, and make judgments, none of which may really help somebody who is going through a very, very tough time.

  • Keep your ears open and resist the temptation to talk a lot. Be present for the person and don’t interject with “quick” or “easy” solutions. Understand that if there were a quick and easy solution, this person would likely have already thought of it.
  • Convey a sense of caring and interest. This can be done nonverbally and verbally, but try to provide a validation and acknowledgment of their current situation without making a judgment about it. Show that you see they are extremely depressed or despondent, and ask if they can tell you more about it.
  • Ask open-ended questions that allow the person to talk as much as possible, and allow periods of silence. Silence conveys respect for the person and an interest: that you’re willing to forgo your own needs in the moment for the sake of helping them. It’s not easy to do, but it’s important.
  • Create the time and space for someone to open up. You want to create a space that is private and give them enough time to talk.

What else can someone do to help if they suspect someone is considering taking their life or see a potential warning sign of suicide?
It’s OK to come out and ask “Are you feeling suicidal?” It’s not going to put the idea in their head if it wasn’t there before. Some may say they don’t have suicidal thoughts because it may be too painful to admit or it may be embarrassing because of the stigma associated with mental illnesses and suicide. Sometimes, it’s better to ease into the question by using terms that are less charged, like “Are you feeling overwhelmed?,” “Are you feeling that you can’t go on?,” or “Are you feeling at the end of your rope?” The idea is to normalize the experience and show that it’s understandable for someone who has gone through something difficult to have these thoughts.

Dr. Mark Russ, expert in identifying the warning signs of suicide.

What else is important to know?
Getting someone professional help is critical. You don’t want to be in a situation where you are the lifeline for another human being; that is fraught with danger for that person and for you. Even if the person doesn’t kill themselves, you are left in an untenable situation that you’re not prepared for and that will likely cause you a great deal of anxiety. The best thing you can do is whatever it takes to get that person into treatment. That may range from providing a suicide hotline number to actually calling the police and an ambulance to bring them to the emergency room, depending on the acuity of the circumstance.

What if someone has ongoing feelings of suicide?
Often, suicidal feelings can last a long time and be chronic. In addition to treatment, it’s important to create a safety plan. There should be a very clear, and preferably written, point-by-point plan of what a person will do if they feel suicidal. This is hopefully done in the context of treatment, but it’s important for the family to be aware of it whenever possible. The plan can include anything from providing a suicide hotline number to engaging in coping mechanisms they’ve learned as part of therapy, going to an ER, or calling a therapist, friend, or parent. Coping skills may include distracting behaviors, like walking around the block, taking a bath, or watching a funny movie — whatever seems to work for that individual. But a list of activities and steps that they are going to take before they act or harm themselves is useful and important.

Even if they are not suicidal in the moment, these feelings tend to recur. We should have the expectation that they’re going to feel this way again and help them anticipate the circumstances under which these thoughts or feelings are likely to emerge. A young adult with a history of becoming suicidal after they fail a test may not feel suicidal now, but might if they fail another test. Knowing the steps to take to help in the moment to get over the crisis is extremely important.

Is suicide an impulsive act?
Not always. Some people think about suicide for a very long time and plan carefully and basically make a decision that they feel is rational, although others would not agree, that their life is just not worth living. Others may be extremely impulsive, and the suicidal act may come at a moment of heightened emotion, in particular heightened anxiety and this sense of what we call psychic angst — pain in your being that just feels unbearable in the moment and may push someone to do something impulsively. Then, of course, there’s everything in between.

Can you predict who will take their life?
We cannot predict who is going to die by suicide. Most people who experience or exhibit warning signs of suicide don’t take their lives. There are many risk factors, but they lack specificity: many people get depressed or go through periods of hopelessness. Because suicide is relatively infrequent, and because no risk factor or group of risk factors is specific to suicidal behavior, it is statistically impossible to predict who will die by suicide.

There is no blood test. There is no brain scan. There is no psychological test that can tell us if and when an individual will engage in suicidal behavior. We look for triggers and warning signs that have been associated with suicide risk in vulnerable individuals because it makes common sense to do so and can save lives. But in the end, we don’t know what is in the mind of that person who ultimately decides to take their own life. We don’t know that final thought because it’s never available to us. Because of that, there’s a certain degree of humility that I think all of us who work with people who struggle with suicidal feelings have to accept and understand.

Is there still a stigma around mental illness and suicide?
I think it’s getting better very slowly, but it is still a huge problem. Efforts of organizations like the National Alliance on Mental Illness are making an impact, and public figures coming out and sharing their mental health struggles helps too. But I think that the stigma remains a problem, particularly in some communities more than others. When there are cultural prohibitions or consequences to having a mental illness or getting treated for one, it makes it that much more difficult to access care.

Struggling with depression, anxiety, or having thoughts of suicide is not uncommon, yet many people seem to distance themselves from it — perhaps because it’s not the way that people want to see themselves. The sense of emotional well-being is so closely connected to who you are in the world. It’s different than if you have pneumonia or diabetes, if something happens to you — somehow the idea of mental illness isn’t viewed the same way, perhaps because it’s so closely aligned with a sense of self and identity — even though these are brain diseases.

How can people who have lost someone to suicide cope?
Be aware of your feelings and understand and accept the fact that you are going to have very strong feelings about what happened. That’s normal and appropriate, and you will need support. The extent of that support may be limited or may be extensive in terms of getting into therapy.

There is no constructive role for guilt or self-blame — there has already been one casualty, and we need not create any more. Feelings of guilt, loss, anger, and, of course, tremendous sadness are all natural feelings that can be dealt with and understood in the context of the circumstance.

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.


September 6th 2022

déjà vu

The Search for Scientific Proof for Premonitions

In the 1960s, a British researcher launched one of the largest ever studies of people who believed they could see the future.

Children looking over Aberfan, Wales, in the wake of the disaster there in 1966 (Credit: Ron Burton/Mirrorpix/Getty Images)

When it finally happened, shortly after nine o’clock in the morning on October 21, 1966—when the teetering pile of mining waste known as a coal tip collapsed after days of heavy rain and an avalanche of black industrial sludge swept down the Welsh mountainside into the village of Aberfan, when rocks and mining equipment from the colliery slammed into people’s homes and the schools were buried and 116 young children were asphyxiated by this slurry dark as the river Styx—the anguished public response was that someone should have seen this disaster coming, ought to have predicted it.  

Someone did. 

The Premonitions Bureau: A True Account of Death Foretold, by Sam Knight (Penguin Press, 256 pp., $28)

Or at least, they claimed they had. Shortly after the tragedy at Aberfan, several women and men recalled having eerily specific premonitions of the event. A piano teacher named Kathleen Middleton awoke in North London, only hours before the tip fell, with a feeling of sheer dread, “choking and gasping and with the sense of the walls caving in.” A woman in Plymouth had a vision the evening before the disaster in which a small, frightened boy watched an “avalanche of coal” slide towards him but was rescued; she later recognized the child’s face on a television news segment about Aberfan. One of the children who died had first dreamt of “something black” smothering her school. Paul Davies, an 8-year-old victim, drew a picture the night before the catastrophe that showed many people digging in a hillside. Above the scene, he had written two words: The End.

Premonitions this dramatic and alarming are likely rare. But most of us have experienced odd coincidences that make us feel, even for an instant, that we have glimpsed the future. A phrase or scene that triggers a jarring sensation of déjà vu. Thinking of someone right before they text or call. Inexplicably dreaming about a long-lost acquaintance or relative only to wake and find that they have fallen ill or died. It’s mostly accepted that these are not really forms of precognition or time travel but instead fluky accidents or momentary brain glitches, explainable by science. And so we don’t give them a second thought or take them that seriously. But what if we did?

The Premonitions Bureau, an adroit debut from The New Yorker staff writer Sam Knight, draws us into a world not that far gone in which psychic phenomena were yet untamed by science and uncanny sensations still whispered of the supernatural, of cosmic secrets. Knight’s book registers the spectral shockwaves that rippled out from Aberfan through the human instrument of John Barker, a British psychiatrist who began cataloguing and investigating the country’s premonitions and portents in the wake of the accident. Barker spent his career seeking out the hidden joints between paranormal experience and modern medicine, asking scientific questions about the occult that we have now agreed no longer to ask. In Knight’s skillful hands, the life of this forgotten clinician becomes a meditation on time and a window through which we can perceive the long human history of fate and foresight. It’s also a tale about how we decide what is worthy of science and what it feels like to be left behind. It is a story about a scientific revolution that never happened. 

Forty-two years old when the country learned of Aberfan, John Barker was a Cambridge-educated psychiatrist of terrific ambition and rather middling achievement. In his thirties, he had been an unusually young hospital superintendent at a facility in Dorset; a nervous breakdown led to his demotion and reassignment, by the mid-’60s, to Shelton Hospital, where he cared for about 200 of the facility’s thousand patients. Shelton was a Victorian-era asylum in western England, not far from Wales, and a hellish world unto itself. Local doctors called it the “dumping ground,” this 15-acre gothic facility of red-brick buildings hidden behind red-brick walls, where women and men suffering from mental illness were deposited for the rest of their lives. One-third of Shelton’s population had never received a single visitor. Like other mental health facilities in midcentury Britain, it was a place of absolutely crushing neglect. “Nurses smoked constantly,” Knight writes, “in part to block out Shelton’s all-pervading smell: of a house, locked up for years, in which stray animals had occasionally come to piss.” Every week or two, another suicide. “The primary means of discharge was death.”

As a clinician, Barker was tough and demanding. He was also complicated (like all of us) and tough to caricature. Barker had arrived at Shelton as calls for psychiatric reform were growing louder, and he supported efforts to make conditions “as pleasant as possible” for the hospital’s permanent residents, including removing locks from most of the wards and arranging jazz concerts. But he also favored aversion shock therapies and once performed a lobotomy—which, to his credit, he later regretted. At any rate, Barker’s true passion lay elsewhere. As a young medical student, he collected ghost stories from nurses and staff at the London hospital where he was training: sudden and unaccountable cold presences late at night, spectral ward sisters who shouldn’t have been there and who vanished when you looked twice. A “modern doctor” committed to rational methods, Barker was nonetheless drawn to all things paranormal, an interest that led him to join Britain’s Society for Psychical Research, whose members had been studying unexplained occult phenomena since 1882. Barker had a crystal ball on his desk and spent his weekends at Shelton rambling around haunted houses with his son. He was a man caught between worlds who would eventually fall through the cracks.

The day following the disaster, Barker showed up in Aberfan to interview residents for an ongoing project about people who frightened themselves to death. But he realized quickly that his questioning was insensitive—and as he learned more about the uncanny portents and premonitions that were already swirling around the tragedy, he sensed a much greater opportunity. Barker contacted Peter Fairley, a journalist and science editor at the Evening Standard, with his hunch that some people may have foreseen the disaster through a kind of second sight. Days later, the paper broadcast Barker’s paranormal appeal to its 600,000 subscribers: “Did anyone have a genuine premonition before the coal tip fell on Aberfan? That is what a senior British psychiatrist would like to know.” 

A gifted scientific popularizer, Fairley shared with Barker a knack for publicity as well as tremendous ambition. Within weeks, the two men had dramatically expanded the project. From January 1967, readers were told to send general auguries or prophecies to a newly established “Premonitions Bureau” within the newsroom. “We’re asking anyone,” Fairley told a BBC radio interviewer, “who has a dream or a vision or an intensely strong feeling of discomfort” which involves potential danger to themselves or others “to ring us.” With Fairley’s brilliant assistant Jennifer Preston doing most of the work, the team categorized the predictions and tracked their accuracy. Their hope was to prove that precognition was real and convince Parliament to use this psychic power for good by developing a national early warning system for disasters. “Nobody will be scoffed at,” Fairley insisted. “Let us simply get at the truth.”

Seventy-six people wrote to Barker claiming premonitory visions of the Aberfan disaster. Throughout 1967, another 469 psychic warnings were submitted to the Bureau. Many of these submissions came from women and men who claimed to be seers, who experienced precognition throughout their lives as a sort of sixth sense. Kathleen Middleton, the piano teacher who awoke choking before the coal tip collapse, became a regular Bureau contact who had been sensitive to occult forces since she was a girl. (During the Blitz, a vision of disaster convinced her to stay home one night instead of going out with friends; the dance hall was bombed.) Another frequent contributor was Alan Hencher, a telephone operator who wrote that he was “able to foretell certain events” but with “no idea how or why.” 

The premonitions gathered by Barker ran the gamut of believability. Some were instantly disqualified. Others were spookily prescient. In early November 1967, both Hencher and Middleton warned of a train derailment; one occurred days later, near London, killing 49 people. Hencher suffered a severe headache on the evening of the disaster and suggested the time of the accident nearly to the minute, before the news had been reported. Most of the premonitions appear to have been vague enough to be right if you wanted them to be, if you were willing to cock your head to one side and squint. A woman reported a dream about a fire; on the day she mailed her letter, a department store in Brussels burned. One day in May 1967, Middleton warned about an impending maritime disaster; an oil tanker ran aground. Visions of airliner crashes inevitably, if one waited long enough, came true somewhere in the world. Barker was determined to believe in them. “Somehow,” he told an interviewer, seers like Hencher and Middleton “can gate-crash the time barrier … see the unleashed wheels of disaster before the rest of us.… They are absolutely genuine. Quite honestly, it staggers me.”

For those of us unable to gate-crash time itself, one wonders what it would be like to have this kind of premonitory sense, to perceive the future so viscerally and so involuntarily. It was like knowing the answer for a test, some explained, with cryptic keywords floating in space in their imaginations. ABERFAN. TRAIN. Others had physiological symptoms. Odd smells, like earth or rotting matter, that nobody else could perceive, or a spasm of tremors and pain at the precise moment when disaster struck far away. People who sensed premonitions explained to Barker that it was an awful burden, that they grappled with, as one put it, “the torment of knowing” and “the problem of deciding whether we should tell what we have received” in the face of potential ridicule or error. 

Prone to a certain grandeur, Barker believed that the stakes of the project, which he called “essential material and perhaps the largest study on precognition in existence,” were high. Practically speaking, he thought it would help avert disaster. (If the Premonitions Bureau had been up and running earlier, he boldly claimed, Aberfan could have been avoided and many children’s lives saved.) More daringly, Barker thought that proving the existence of precognition would overturn the basic human understanding of linear time. He wondered if some people were capable of registering “some sort of telepathic ‘shock wave’ induced by a disaster” before it occurred. It might be akin to the psychic bonds felt between twins, but able to vanquish time as well as space. Inspired by Foreknowledge, a book by retired shipping agent and amateur psychic researcher Herbert Saltmarsh, Barker thought that our conscious minds could likely only experience time moving forward, and in three distinct categories: past, present, and future. To our unconscious, however, time might be less stable and more permeable. If scientists would “accept the evidence for precognition from the cases” gathered by the Bureau, he said, they would be “driven to the conclusion that the future does exist here and now—at the present moment.” Barker sensed a career-defining discovery just around the corner.  

But it was not to be. John Barker died on August 20, 1968, of a sudden brain aneurysm. He was 44 years old. The Bureau, which Jennifer Preston dutifully continued through the 1970s, and which ultimately included more than 3,000 premonitions, represented the last, unfinished chapter of his brief life. He never wrote his book on precognition and fell into obscurity. The morning before he died, Kathleen Middleton woke up choking.

Knight narrates Barker’s story with considerable generosity and evident care. Rather than condescend or deride him as a crank, Knight thinks with Barker: about the strangeness of time and our human ways of moving through it, about how we make meaning from chaos and resist the truly random, about prediction and cognition and our hunger for prophecy. Yet the many disappointments in Barker’s career were not incidental to his significance, and emphasizing them does not diminish him. In fact, his life can also be framed as a tale told much too rarely in the history of science, about how scientific inquiry relies as much upon failure as success in order to function, on exclusion as much as expansion.

Around the time Barker was appointed to his role at Shelton, the American historian and philosopher of science Thomas Kuhn published a book called The Structure of Scientific Revolutions, a landmark work that now structures practically everyone’s thinking without them realizing it. What Kuhn proposed was that scientific research always occurs within a paradigm: a set of rules and assumptions that reflect not only what we think we know about how the universe works, but also the questions we are permitted to ask about it. At any given moment, “normal science” beavers away within the borders of the current paradigm, working on “legitimate problems” and solving puzzles. For a long while, Kuhn explained, phenomena “that will not fit the box are … not seen at all,” and “fundamental novelties” are suppressed. Eventually, however, there are too many anomalies for which the reigning paradigm cannot account. When a critical mass is reached, the model breaks and a new one is adopted that can better explain things. This is a scientific revolution.

For Barker, precognition constituted what Kuhn would have called a legitimate problem within normal science: It ought to be studied using experimental methods and would, he thought, one day be explained by them. But he admitted the risk that modern psychiatry might not ever be able to accommodate the occult, that his work on premonitions could break the paradigm altogether. Hunches and visions that came true might demand a new way of explaining time and energy. “Existing scientific theories must be transformed or disregarded if they cannot explain all the facts,” he lectured his many critics. “Although unpalatable to many, this attitude is clearly essential to all scientific progress.” He seems to have seen himself as a contemporary Galileo, insisting upon empirical truth in the face of “frivolous and irresponsible” gatekeepers. “What is now unfamiliar,” he argued in the BMJ, usually tends to be “not accepted, even despite overwhelming supportive evidence. Thus for generations the earth was traditionally regarded as flat, and those who opposed this notion were bitterly attacked.” Barker wanted the ruling scientific paradigm to make room for the paranormal—or give way.

It wasn’t so implausible, in midcentury Britain, that it just might. A craze for spiritualism and the paranormal had swept the country between the two world wars, and a rash of new technologies that seemed magical (telegram, radio, television, etc.) left many Britons, not unreasonably, to wonder if “supernatural” phenomena like prophecies or telepathy might turn out to be explainable after all. In Barker’s Britain, one quarter of the population had reported believing in some form of the occult. Even Sigmund Freud, nervously protecting the reputation of psychoanalysis, refused to dismiss paranormal activities “in advance” as being “unscientific, or unworthy, or harmful.” In physics, too, Knight points out, “the old order of time was collapsing” by midcentury, thanks to developments in relativity as well as quantum mechanics. For experts, time had become less predictable and mechanisms of causation less clear, both subatomically and cosmically. Barker had been formed, in other words, by “a society in which one set of certainties had yet to be eclipsed by another.”

But instead of rearranging itself around Barker’s research into precognition, the paradigm shifted away from him and snapped more firmly into place. The walls sprang up, and the questions that interested Barker became seen as illegitimate and unscientific. The Bureau he built with Fairley was not all that successful. Only about 3 percent of submissions ever came true, and in February 1968 a deadly fire at Shelton Hospital itself went unpredicted, to the unabashed glee of critics and satirists. Barker’s supervisors grew skeptical and then embarrassed. As time went on, and the boundaries of the scientific paradigm in which we still live grew less permeable, occult phenomena were explained not by bending time, but with recourse to cognitive science and neurology. Premonitions became understood not in terms of extrasensory perception but simply misperception: the work of cognitive error or misfiring neurons rather than the supernatural.  

The popular understanding of scientific revolutions still revolves around big ruptures and great scientists, the paradigm-defining concepts (like heliocentrism, gravity, or relativity) that transform how human beings think they understand the universe: We shift the frame to move forward. Yet there is just as much to be learned from the times when revolutions don’t occur, when scientific inquiry is defined not by asking thrilling new questions, but by the determination that some old questions will no longer be asked. What’s so brilliant about Knight’s account, in the end, is the way it portrays a creative workaday researcher rather than a modern-day Newton or Einstein, a man aspiring to do normal science while the rules shifted around him; the way it conveys the rarely captured feeling of a paradigm closing in around you and your ideas, until it all fades to black.

Ian Beacock

Ian Beacock is a frequent contributor to The New Republic. He lives in Vancouver, where he’s working on a book about democratic emotions.

September 5th 2022

By Melissa Hogenboom, 23rd August 2022

Comments about our looks from our loved ones and friends can cause lifelong insecurities. How can we teach kids to feel confident about their bodies instead?

Picture the scene: a little girl tries on a sparkly dress, does a twirl and with great satisfaction, smooths it down. The adults around her echo her delight, and tell her how pretty she is. Later she looks at her favourite books, and sees slim people and slender animals going on exciting adventures, while their heavier counterparts are portrayed as slow or clumsy. Sometimes, she notices her own parents fretting about their weight or looks.

By the time she is a teen, her parents may worry how social media influencers are affecting her body image. But research suggests that in reality, her perception of bodies and their social acceptance will have been shaped long before then, in those very early years.

When we think about our relationship with our bodies, it’s often hard to pinpoint precisely where our satisfaction or dissatisfaction comes from. If we cast our minds back to our childhood, however, we may remember a collection of off-hand comments or observations. None of them may seem hugely impactful in themselves. And yet, their cumulative effect can be surprisingly potent.

The writer Glennon Doyle still recalls how her looks as a child earned her praise from the adults around her: “I could see it on their faces… They would light up, and so I learned, this is a currency,” she says on her podcast. But when she grew older and was considered less pretty, that adoration stopped – it was, she says, as if the world had turned away from her.

Whether it comes in the form of compliments or criticism, that kind of attention to body shapes can lay down beliefs and insecurities that are hard to shake off. The consequences can be tremendously damaging, as research shows, with family attitudes and derogatory comments about weight linked to mental health problems and eating disorders. In addition, the broader stigmatisation of overweight children has increased – affecting their self-esteem and, of course, their body image.

Given how early this awareness of body ideals begins, what can parents and caregivers do to help children feel confident about themselves – and more supportive of others?

From a young age, children are influenced by their parents’ views about physical appearance (Credit: Getty/Javier Hirschfeld)

Family Tree

This article is part of Family Tree, a series that explores the issues and opportunities that families face all over the world. You might also be interested in other stories on childhood and development:

You can also climb new branches of the Family Tree on BBC Worklife and BBC Culture – and check out this playlist on changing families by BBC Reel

Body shame is taught, not innate

Physical ideals hugely differ across time and different cultures – a quick look at any painting by Peter Paul Rubens, or indeed the 29,500-year-old figurine known as the “Venus of Willendorf”, shows just how exuberantly humans have embraced curvy features. But today, despite a growing body positivity movement that celebrates all shapes and sizes, the idea that a thin body is an ideal one remains dominant on social media, on traditional media, on television, on the big screen and in advertising.

Awareness of body ideals starts early, and reflects children’s experience of the world around them. In one study, children aged three to five were asked to choose a figure from a range of thin to large sizes, to represent a child with positive or negative characteristics. They were, for example, asked which children would be mean or kind, who would be teased by others, and whom they would invite to a birthday party. The children tended to choose the bigger figures to represent the negative characteristics.

Crucially, this bias was influenced by others: for example, their own mothers’ attitudes and beliefs about body shapes affected the outcome. Also, the older children displayed a stronger bias than the younger ones, which again indicates that it was learned, not innate. The findings “suggest children’s social environments are important in the development of negative and positive weight attitudes”, the researchers conclude.

“We see the patterns whereby children are attributing the positive characteristics to the thinner figures, and negative characteristics to the larger figures,” says Sian McLean, a psychology lecturer at La Trobe University in Melbourne, Australia, who specialises in body dissatisfaction. “They’re developing that quite early, which is a concern because they potentially have the chance to internalise that perception, that being larger is undesirable and being thinner is desirable and associated with social rewards.”

While parents play an important role in shaping their children’s attitudes and views, it should be emphasised that they are far from the only influence youngsters are exposed to, and can often have a positive effect that can counteract messages from other sources. But the research shows that parents’ views do matter.

Girls as young as five use dieting to control their weight

Another study showed that children as young as three were influenced by their parents’ attitude towards weight. Over time, the children’s negative associations with large bodies, and awareness of how to lose weight, increased. There is often a gender element to these perceptions, with sons more affected by their fathers’ views, and daughters by their mothers’ attitudes. The use of dieting to control weight has even been reported in girls as young as five. Here the main factors were exposure to media, as well as conversations about appearance.  

The studies show just how early young children take on the societal perceptions of those around them, paying close attention to how adults behave and talk about bodies and food. That pattern continues, and can even worsen, as they grow older. Research assessing the level of body dissatisfaction and dieting awareness in children aged five to eight found that “the desire for thinness emerges in girls at around age six”. From that age, girls rated their ideal figure as significantly thinner than their current figure. Again, the children’s perception of their mothers’ body dissatisfaction predicted whether the girls then also felt dissatisfied with their own bodies. “A substantial proportion of young children have internalised societal beliefs concerning the ideal body shape and are well aware of dieting as a means for achieving this ideal,” the authors concluded.

Thinking back, most of us will have experienced off-hand comments or observations during our childhood (Credit: Alamy/Javier Hirschfeld)

The danger of teasing

Many parents may feel shocked to hear that their own insecurities – which may after all be completely involuntary, and not something they wish to pass on – can have such an impact. But some family members also magnify this effect through derogatory comments.

In a study on the effects of teasing by family members on body dissatisfaction and eating disorders, 23% of participants reported appearance-related teasing by a parent, and 12% were teased by a parent about being heavy. More reported being teased by their fathers than their mothers. Such paternal teasing was a significant predictor of body dissatisfaction as well as bulimic behaviours and depression, and also increased the odds of being teased by a sibling. Maternal teasing was a significant predictor of depression. Being teased about one’s appearance by a sibling had a similarly negative impact on mental health and self-esteem, and raised the risk of eating disorders.

The authors suggested that understanding a family history of teasing would help health care providers identify those at risk for “body image and eating disturbance and poor psychological functioning”.

I still have a problem eating in front of my mom. She always criticised my eating and weight starting from when I was six. Maybe even before – 49-year-old study participant

Other research on children aged seven to eight has shown that mothers’ comments about weight and body size are linked to disordered eating behaviour among their children. Similarly, girls “whose mothers, fathers, and friends encouraged them to lose weight and be lean” were more likely to endorse negative beliefs about others’ weight, known as “fat stereotypes”. This is especially alarming given the rise in weight-related stigmatisation and bullying.

Even adult women can still feel the pain of weight stigma experienced in childhood, a study found, with the participants mainly pointing to their mothers as the source of such stigma. It was “the most hurtful thing I’ve ever experienced“, one participant said. The study quoted women in their 40s, 50s and 60s describing vivid memories of being weight-shamed by their families, and the profound sadness they still felt. “The constant criticism from my mother about my weight led to issues of self-confidence I have struggled with all my life,” one participant reported. “My father and brothers used to hum the ‘baby elephant walk’ tune when I was around eight–11 years old,” said another. “I still have a problem eating in front of my mom,” a 49-year-old participant stated. “She always criticised my eating and weight starting from when I was six. Maybe even before.” 

One respondent recalled her mother putting her on a diet at the age of 10: “My feelings of my lack of attractiveness will probably never go away and have been with me all my life even when I was thinner. It is very painful.”

However, some respondents also said they felt their mothers projected their own insecurities, and perhaps intended the comments and advice to be helpful rather than mean.

Some adult women still feel the pain of weight stigma experienced in childhood (Credit: Getty Images/Javier Hirschfeld)

Beyond the family

There’s a reason why parental influence is so strong. Rachel Rodgers, a psychologist at Northeastern University, says that when a parent is concerned with their own body image, they will be modelling behaviours that show “this is important”.

“Even if they’re not mentioning the child’s physical appearance, they’re still acting in a way that suggests to the child, ‘this is something that worries me, this is something that I’m preoccupied with’, and so children pick up on that.”

In addition, many parents do tend to comment on what children are eating or wearing, or how they look, often in a well-meaning way, and that can increase the preoccupation with looks and weight. The resulting “thin idealisation” – a preference for thin bodies – sets children up to believe that their “social worth is contingent on their physical appearance and that’s going to lead them to invest in it in terms of their self-esteem, as well as their time and energy”, says Rodgers.

Of course, parents are not the only source of body stigma, especially as the child grows older. Their peers and the media tend to assume a greater role over time. Even toys such as dolls have an influence. One study featuring girls aged five to nine found that when they played with an extremely thin doll, it shifted their ideal body size towards being thinner.

Unless they are countered, these influences can reinforce each other. Many studies show that media exposure contributes to appearance ideals – young girls who watched music videos were more focused on their appearance afterwards, for example. If friends then also talk about weight and appearance, that effect can be magnified.

“The way in which media ideals are supported and endorsed by their peers/friends was a more crucial factor than direct media exposure itself,” explains Jolien Trekels, a psychologist studying body image at KU Leuven in Belgium, who led research into the role friends play in shaping appearance ideals.

On a positive note, this may mean that young people are not simply at the mercy of media ideals, but can collectively shape their own responses to them.

The danger of “thinspiration”

The type of social platform and activity also plays a role. One 2022 review found that Instagram and Snapchat (both extremely visual) were more negatively linked to body image than Facebook, while taking and manipulating selfies was more damaging than actually posting them.

Unsurprisingly, “thinspiration” content, which promotes thinness and dieting, also showed negative effects (driven by negative self-comparisons), as did fitness-promoting posts categorised as “fitspiration”.

Although viewing posts about exercise has been shown to increase exercise among adult women, it has also been linked to the internalisation of thin ideals, according to a 2019 study. This means that the early inspiring effect is not necessarily long-lasting, as the study notes: “As time passes and women see no major effects of dieting and exercising, they may become frustrated which may consequently result in body dissatisfaction.”

A negative body image is problematic for many reasons. “Self-worth is often intertwined with one’s bodily self-perceptions,” explains Trekels.

This is especially the case for women and girls. Once a negative body image develops, it is a strong predictor of eating disorders and depression. The statistics paint a sobering picture: estimates suggest that up to half of pre-adolescent girls and teens report body dissatisfaction.

A negative body image in childhood is also likely to persist into adolescence. A recent survey of adults by the charity Butterfly, which offers evidence-based support for eating disorders, found that of those who developed body dissatisfaction early on, 93% said it got worse during adolescence.

Focusing more on a child’s interests rather than how they look could improve a sense of self-satisfaction (Credit: Getty Images/Javier Hirschfeld)

Are girls more at risk?

While girls often seem to be more affected by body image concerns, this may partly reflect the fact that more research has focused on girls, as well as how consistently the female body is objectified and sexualised from an early age. Emerging research on boys shows a similar level of dissatisfaction, though their body ideals tend to differ, with a greater focus on wanting to be muscular, for example.

“Really everyone in a body can experience body dissatisfaction, it doesn’t matter what you look like on the outside, it’s how you’re thinking and feeling on the inside,” says Stephanie Damiano, who works at Butterfly.  

Trekels has noted similar trends: “Generally, we find more or stronger effects for girls than for boys. However, this does not mean that boys aren’t vulnerable to experiencing these influences, too.”

One reason the effect is stronger for girls could be because, from an early age, girls and boys are socialised differently. Girls are often told that their social value lies in how attractive they are, says Rodgers. “That their bodies are made to be looked at, they are supposed to be contained, docile and not take up too much space,” she says. “Boys are socialised to understand that their bodies are functional, that they’re strong, which is a very different message.”

Given how all-pervasive these messages are, what can parents do to counter them and instead nurture a more generous, positive and empowering body image?

First, as the evidence shows, the way adults talk about bodies around children matters. “We would encourage parents or educators not to make comments about body image, even if they’re positive,” McLean says.

Instead, parents should focus on what the children enjoy doing and are interested in, placing “more value on who they are and their special skills and talents and less focus on what they look like”, says Damiano. This helps children get a sense of satisfaction and self-worth that’s not tied to their appearance. It may also mean working on our own self-perception and self-esteem, given that the research shows how easy it is to transmit our insecurities.

Positive family relationships can help to reduce the negative effects of body dissatisfaction (Credit: Getty Images/Javier Hirschfeld)

Family support makes a difference

Damiano also recommends parents avoid talking about weight or constantly telling children to eat healthier foods. “The more we focus on higher weight as being a problem, or certain foods as being ‘bad’, the more guilt, shame, and body dissatisfaction children are likely to feel.”

Instead, parents can talk about exercise as being important for general health and wellbeing, rather than as a way to lose weight. Families can also normalise eating healthy meals, rather than overtly talking about specific foods being bad for you. We all like a treat, after all, so it seems counter-productive to teach children to feel guilty about having one; enjoying the occasional treat can be part of a healthy attitude towards food. Watching TV cooking programmes featuring healthy food can also subtly encourage children to eat healthier foods.

Family relationships can play an important positive part: one study showed that a good relationship between mothers and their adolescent children can reduce the negative effects of social media use on body dissatisfaction. Limiting children’s time on social media can reduce “appearance comparisons” as well as improve mental health.

“The way that parents provide meaning to what the child is seeing”, is also really important, Rodgers says, as it can help a child decode what the images truly show.  And of course, not all social media is bad – it can be a source of community and encouragement, too.

Parents may find it useful to team up with schools. The Butterfly Body Bright programme in Australia helps primary school children develop a positive body image and lifestyle choices. In a pilot programme, the children’s body image was found to improve after one lesson. Intervention programmes that focus on building self-esteem have also shown success. Reflecting on these programmes and their messages may even help parents examine their own ideas around weight and bodies, and cast off long-held, harmful beliefs.

As for what we can do at home, an easy change might be to pause whenever we’re about to praise a child’s appearance, and think of something else we like about them, and want them to know. Instead of telling them “I love your dress”, we could simply smile and tell them how nice it is to see them, and how much fun they are to be around.

* Melissa Hogenboom is the editor of BBC Reel. Her book, The Motherhood Complex, is out now. She is @melissasuzanneh on Twitter.

Join one million Future fans by liking us on Facebook, or follow us on Twitter or Instagram.

If you liked this story, sign up for the weekly features newsletter, called “The Essential List” – a handpicked selection of stories from BBC Future, Culture, Worklife, Travel and Reel delivered to your inbox every Friday.

By Sophia Smith Galer, 31st August 2022

In the digital age, kids need a trusted source they can turn to with questions about love and sex – and research shows how parents can get it right.

I never got the opportunity to do something that’s almost a rite of passage among British teens – spend a sex education class peeling a condom out of its stiff foil packet and rolling it down a banana. It wasn’t until I was 27 years old that I would finally get to do it, but in a very different capacity. I wasn’t learning how to put a condom on. I was learning how I’d teach somebody else to put it on. 

About 15 newly trained sex educators and I sat in front of our computers, condomed bananas in hand. “We often use flavoured condoms,” explained our teacher over Zoom, “because the smell is a bit more appealing than normal condoms.” He took a moment to look at the participants’ expressions, and evidently found some of them less composed than he’d hoped. “It’s really important that you don’t look or feel squeamish when you do this,” he said. “That’s not how you want young people to feel when you’re encouraging them to use these.”

Many parents may feel a similar sense of squeamishness when trying to talk to their children about physical intimacy – though attitudes to sex education can vary widely between countries and families, research shows.

A review of research on British parents’ involvement in sex education found that they often felt embarrassed, for example, and feared they lacked the skills or the knowledge to talk to their children. However, that same review also found that in countries such as the Netherlands and Sweden, parents talked openly to their children about sex from an early age, and that possibly as a result, teenage pregnancies and sexually transmitted diseases were far less common than in England and Wales.

Parents who do feel awkward talking about sex can find themselves in a difficult spot. Many would like their children to know that they can come to them with questions and problems, especially in the digital age, with children coming across graphic online content at an increasingly young age. But they may struggle to decide when and how to start.

Parents may feel squeamish when trying to talk about bodies and intimacy. (Credit: Prashanti Aswani)

Eva Goldfarb, professor of public health at Montclair State University, co-authored a systematic literature review of the past 30 years of comprehensive sex education. While the review focuses on schools, Goldfarb says her research holds important lessons for parents, too. One basic insight is that sex education has a positive, long-term impact, such as helping young people form healthy relationships. Her advice to parents is not to skip or delay these chats.

“Start earlier than you think,” she says. “Even with very young children you can talk about names of body parts and functions, body integrity and control.”

This includes talking about issues that parents may not even think about as sex-related, but that are about relationships more broadly: “Nobody gets what they want all the time, it’s important to treat everyone with kindness and respect.”

In fact, parents tend to find it easier to talk to their children about sex when these conversations start at a young age and come up naturally, separate research suggests. Answering young children’s questions openly and honestly can set a positive pattern that makes it easier to talk about more complex issues later.

This step-by-step approach can also be beneficial for children in terms of understanding their own origins and identity. For example, research has shown that children who were conceived with the help of sperm donation, and whose parents explained this from the start with the help of books and stories, felt more positive about their origins than those who found out later.

For parents who want to broach the subject of sex but don’t quite know how, research has revealed a number of ways to get started.

Many parents would like to be a trusted source for their children, especially in the digital age. (Credit: Prashanti Aswani)

What was your own sex education like?

Over the past few years, I have interviewed dozens of sex educators for my book debunking sex myths and misinformation, Losing It. They are pretty much unanimous when it comes to Lesson One of sex education training – figuring out your own level of sex education before considering passing it on to anybody else.

Numerous studies and surveys suggest that adults often do not know as much about sex and the body as they would like to, and may even have completely inaccurate ideas that are grounded in myth or guesswork. For example, many people around the world erroneously believe that the state of a woman’s hymen can prove whether she is a virgin – an idea that has no scientific basis.

Parents’ basic level of knowledge can vary widely. Some may identify with the subjects of a study in Namibia, which found that many parents didn’t talk to their children about sex because they themselves felt that their knowledge about human sexuality, or their ability to explain it, was inadequate. But a survey of almost 2,000 parents of young children in China found that parents’ own sexual knowledge and sex education was generally good, though they were less knowledgeable when it came to issues around child development, which made it difficult for them to be effective educators. 

Some of the Namibian respondents also avoided the topic because they viewed sex as taboo, or they thought discussing it was going to encourage young people to have sex. The idea that talking to children about sex will encourage them to think about things that aren’t age appropriate, or seek out sexual experiences, remains common around the world, including in the US. It tends to be linked to the belief that teaching abstinence from sex until marriage is the best way to protect young people’s health and safety.

However, research has shown the opposite. Simply telling teenagers not to have sex has been conclusively proven not to work. The American Academy of Pediatrics calls educational programmes that only promote abstinence “ineffective”, based on a systematic review of the evidence. The review also shows that comprehensive sex education helps prevent and reduce the risks of teen pregnancy and sexually transmitted diseases, echoing the findings in the Netherlands and Sweden.

When parents delay or skip the topic of sex, young people can be left vulnerable to misinformation (Credit: Prashanti Aswani)

In fact, when parents, and especially mothers, talk to their teenage children about sex, the teens are more likely to delay having sex for the first time, and engage in safer behaviour when they do have sex, especially in the case of girls. The study of British families suggests that it is important to involve fathers in the conversations too, not least because boys often feel sex education tends to be weighted towards the experience of girls.

In short, teaching young people what it truly means to be ready the first time they have sex, and what to consider when doing so, is far more likely to protect them than telling them nothing.

In Finland, parents prefer calling sex education “Kehotunnekasvatus” – “body emotion education”

What might be helpful, however, is reframing what exactly parents think sex education is. In Finland, researchers conducted an experiment where they changed the name of sexuality education to “Kehotunnekasvatus” – “body emotion education” – and evaluated how early childhood education professionals and parents felt about the term.

The majority preferred the new phrase as it was “more neutral, downplaying thoughts of sex”. The researchers note that “one problem impeding the promotion of childhood sexuality education has been the lack of terms free from adulthood connotations”, and that using child-centred words might be how many of us are able to talk more easily.

“Using different words for children’s sexuality is not a repressed, evasive or euphemistic representation, but can help adults to see the difference and to overcome their rejections, misunderstandings and objections,” write the authors.

Such changes can come with a risk, though. One study in India observed that altering the name of a local programme to ‘lifestyle’ education ended up being counter-productive, sweeping the sex education agenda “under the carpet”.

Rephrasing or hiding vocabulary around sex and sexual development when talking to young people also risks accidentally tainting the original words with shame, instead of presenting them as a normal part of a frank conversation.

Parents may find that talking openly is easier than they thought (Credit: Prashanti Aswani)

Family Tree

This article is part of Family Tree, a series that explores the issues and opportunities families face today – and how they’ll shape tomorrow.

Step by step

Parents who are unsure when and how to start these conversations may find it helpful to seek out material for schools. In a UK study in 2016, parents who were shown the books used for their children’s sex education classes felt that they better understood the subject – and also reported that it made them feel more confident talking to their children about sex. Eva Goldfarb says that it can also be helpful for parents to have evening meetings with their children’s sex education teachers and receive information about what their children will be learning at the beginning of the school year.

International guidelines for sex education, such as a comprehensive, evidence-based guide published by Unesco, can also be a good starting point for parents looking for age-appropriate advice. The Unesco guide uses basic, clear ideas around bodies and healthy relationships as building blocks, rather than storing it all up for one big conversation. For a child aged 5-8 years, for example, one key idea is that “everyone has the right to decide who can touch their body, where, and in what way”.

For teenagers, the conversations can include discussions around emotional health, such as what it means to take responsibility for oneself and others, or ways to counter peer pressure, as well as providing specific information about condoms and other contraceptives, according to the guide.

Discussing pleasure can help young people practise more safe sex, have more knowledge and positive attitudes about sex – Mirela Zaneva

One factor has been found to be surprisingly powerful in sex education, but remains relatively little used: pleasure. A new systematic review into health interventions that incorporated pleasure found that explaining enjoyment around sex may encourage safer habits. Programmes that taught people about achieving sexual pleasure were found to improve condom use more than those that focused on the dangers of unprotected sex.

“It’s worth talking about the positives beyond protection, too, such as how using a condom can be fun and can help you connect with a partner,” says Mirela Zaneva, one of the study’s authors and a PhD candidate in experimental psychology at the University of Oxford.

Zaneva found that pleasure tends not to be mentioned much, or at all, in sex education. This means that if your child isn’t hearing about pleasure from you, it’s very likely they’re not hearing about it from school, either. “It is likely that a lot of young people miss out on positive, empowering conversations about sex in their current school sex education,” she says.

She notes that the Pleasure Project, a public health project involved in the research, offers a range of practical tips on how to incorporate pleasure into discussions with young people about sex.

“The evidence so far is that discussing pleasure can help young people practise more safe sex, have more knowledge and positive attitudes about sex, as well as have more confidence and self-efficacy.”

Finding trusted sources

Parents are usually the primary source of sex education for young children, but adolescents tend to tap many sources for information, such as their peers, teachers, and popular culture. And parents may not be the only ones who can feel squeamish. Research undertaken in Ireland found that while in the past, parents’ ignorance and embarrassment were the main obstacles to open discussions of sex, nowadays it was the young people who tended to block these talks, by claiming to already know the facts, becoming irritated or annoyed, or even leaving the room. That does not mean parents should avoid the subject, but it does show how important it is to frame the chats in a way that makes everyone feel comfortable.

“Let your child know ahead of time when you want to discuss something delicate, potentially embarrassing or difficult to talk about. They don’t feel ambushed this way, and they are more likely to be prepared and to talk with you,” says Goldfarb.

Overcoming that squeamishness may even turn out to be a freeing experience. After all, sex and healthy relationships – or as the Finnish researchers call it, “body emotions” – are important at any stage of adult life. Young people are at the start of that journey, and have the chance to define values, habits and priorities that can benefit them over a lifetime, not just in intimate situations, but as part of moving through the world safely and considerately. You may find that it is life-affirming, and not remotely awkward, to be part of that journey.

* Sophia Smith Galer is the author of Losing It: Sex Education for the 21st Century, published by Harper Collins.


September 4th 2022

Recognizing Suicidal Behavior

In many cases, suicide can be prevented. Learn the risk factors and warning signs, which include depression, a change in personality, self-harm behavior, a recent life crisis and talk of wanting to die. If a family member or friend talks about suicide, take them seriously. Listen without judgment and encourage them to seek professional help.

What is suicide?

Suicide is death caused by self-inflicted injury with the intent to die.

Suicide is the tenth leading cause of death in the U.S. One person dies by suicide about every 11 minutes. It is the second leading cause of death among people ages 10 to 34, the fourth leading cause among people ages 35 to 44 and the fifth leading cause among those ages 45 to 54.

Groups of people who have higher rates of suicide include:

  • American Indian/Alaska Native and non-Hispanic White people.
  • Veterans.
  • Rural dwellers.
  • Young people who are lesbian, gay, bisexual or transgender.

What are the situations – risk factors – that could lead someone to consider suicide?

Although you may not know what might cause a friend or loved one to attempt suicide, there are at least some common characteristics to be aware of.

Known factors that increase an individual’s risk of suicide include:

Individual factors

  • Has attempted suicide in the past.
  • Has a mental health condition, such as depression, a mood disorder, schizophrenia or an anxiety disorder.
  • Has long-term pain or a disabling or terminal illness.
  • Expresses feelings of hopelessness.
  • Has money or legal problems.
  • Has violent or impulsive behavior.
  • Has alcohol or other substance abuse problems.
  • Has easy access to self-harm methods, such as firearms or medications.

Relationship factors

  • Has a history of physical, emotional or sexual abuse; or neglect or bullying.
  • Has lost relationships through break-up, divorce or death.
  • Has a family history of death by suicide.
  • Is socially isolated; lacks support.

Community, cultural, societal factors

  • Is ashamed to ask for help, especially help for mental health conditions.
  • Lacks access to healthcare services, especially mental health and substance abuse treatment.
  • Holds cultural or religious beliefs that suicide is a noble way of resolving a personal dilemma.
  • Has become aware of an increased number of local suicides or an increase in media coverage of deaths by suicide.

What are some of the most common suicide warning signs?

Some of the more common warning signs that a person may be thinking of ending their life include:

  • Being sad or moody: The person has long-lasting sadness and mood swings. Depression is a major risk factor for suicide.
  • Sudden calmness: The person suddenly becomes calm after a period of depression or moodiness.
  • Withdrawing from others: The person chooses to be alone and avoids friends or social activities. They also lose interest or pleasure in activities they previously enjoyed.
  • Changes in personality, appearance, sleep pattern: The person’s attitude or behavior changes, such as speaking or moving with unusual speed or slowness. Also, they suddenly become less concerned about their personal appearance. They sleep much more or much less than typical for that person.
  • Showing dangerous or self-harmful behavior: The person engages in potentially dangerous behavior, such as driving recklessly, having unsafe sex or increasing their use of drugs and/or alcohol.
  • Experiencing recent trauma or life crisis: Examples of crises include the death of a loved one or pet, divorce or break-up of a relationship, diagnosis of a major illness, loss of a job or serious financial problems.
  • Being in a state of deep despair: The person talks about feeling hopeless, having no reason to live, being a burden to others, feeling trapped or being in severe emotional pain.
  • Making preparations: The person begins to put their personal business in order. This might include visiting friends and family members, giving away personal possessions, making a will and cleaning up their room or home. Often the person will search online for ways to die or buy a gun. Some people will write a note before attempting suicide.
  • Threatening suicide or talking about wanting to die: Not everyone who is considering suicide will say so, and not everyone who threatens suicide will follow through with it. However, every threat of suicide should be taken seriously.

Can suicide be prevented?

In many cases, suicide can be prevented. The best way you can help prevent suicide is to:

  • Learn the risk factors for suicide.
  • Be alert to the signs of depression and other mental health conditions.
  • Recognize suicide warning signs.
  • Provide caring support.
  • Ask directly if the person has considered hurting themselves.

People who receive support from caring friends and family and who have access to mental health services are less likely to act on their suicidal impulses than are those who are isolated from support.

What should I do if someone I know is talking about suicide?

If your friend or loved one is not in immediate danger but is talking about suicide and is showing risk factors for harming themselves, take them seriously. If you can, remove any objects that could be used in a suicide attempt. Encourage them to call – or call together – support services such as the National Suicide Prevention Lifeline: 1-800-273-TALK (1-800-273-8255). Conversations are with skilled, trained counselors, are free and confidential, and are available 24 hours a day, seven days a week.

If the friend or loved one appears to be extremely distressed, don’t leave the person alone. Try to keep the person as calm as possible and get immediate help. Call 911 or take the person to an emergency room.

A note from Cleveland Clinic

If someone you know is exhibiting warning signs for suicide, don’t be afraid to ask if he or she is depressed or thinking about suicide. Listen without judging. In some cases, your friend or family member just needs to know that you care and are willing to hear them talk about how they are feeling. Encourage them to seek professional help.

If you have suicidal thoughts, know that you are not alone. Also know that help is available 24/7. Call your healthcare provider, go to the emergency room or call the National Suicide Prevention Lifeline, 1-800-273-TALK.

Comment: I wrote and presented a seminar paper on suicide in 1973. It is too easy to call the impulse simply mental illness. I have nearly succeeded on three occasions I can remember. The last time was after being locked in a cold, dirty, dark, smelly police cell for 18 hours. I strangled myself under the cover, listening to other prisoners screaming, shouting, swearing and banging. Then, found semi-conscious, I was taken to a secure mental health unit for the next 12 hours on August 25th 2020.

Two senior doctors and a senior white female mental health nurse judged me sane and let me go. Since then I have longed to die from natural causes so that my son would not lose the insurance money, and be rid of me. When police lied that I was and am a gay whore, having me labelled a long-term alcoholic with alcoholic peripheral neuropathy, a paranoid personality with schizophrenia, delusions and bipolar disorder, they had me sacked from the truck driving job I loved, costing me a fortune in lost income and credibility. It is quite rational to want to die in such deleterious circumstances, after nearly 15 years of high-level police harassment, lies and persecution. I won’t recover now. It is far too late.

The brain is like any other organ. It can only cope with so much social and economic poison before failing terminally. That is what suicide is: chronic brain failure, which the authorities like to feign caring for with an anti-psychotic drug-fed ‘living death.’ R J Cook.

R J Cook, truck driver, Bristol old docks, December 2016.
The psychology of trucking: huge responsibility and split-second decisions where mistakes kill. R J Cook started out in 2003 and was sacked in 2019 thanks to corrupt, lying cops and a seriously corrupt mental health ‘diagnosis’ (sic). They refuse to disclose records of the alleged investigations passed on to the GP, Whiteleaf psychiatrists and the GIC. R J Cook’s conclusion: Dr R D Laing was quite correct to label State psychiatrists ‘whores’.

September 3rd 2022

MIT Technology Review, Biotechnology

How do strong muscles keep your brain healthy?

There’s a robust molecular language being spoken between your muscles and your brain.

August 22, 2022

Illustration: Selman Design

We’ve often thought about muscle as a thing that exists separately from intellect—and perhaps that is even oppositional to it, one taking resources from the other. The truth is, our brains and muscles are in constant conversation with each other, sending electrochemical signals back and forth. In a very tangible way, our lifelong brain health depends on keeping our muscles moving. 

Skeletal muscle is the type of muscle that allows you to move your body around; it is one of the biggest organs in the human body. It is also an endocrine tissue, which means it releases signaling molecules that travel to other parts of your body to tell them to do things. The protein molecules that transmit messages from the skeletal muscle to other tissues—including the brain—are called myokines. 

Myokines are released into the bloodstream when your muscles contract, create new cells, or perform other metabolic activities. When they arrive at the brain, they regulate physiological and metabolic responses there, too. As a result, myokines have the ability to affect cognition, mood, and emotional behavior. Exercise further stimulates what scientists call muscle-brain “cross talk,” and these myokine messengers help determine specific beneficial responses in the brain. These can include the formation of new neurons and increased synaptic plasticity, both of which boost learning and memory.

In these ways, strong muscles are essential to healthy brain function. 

In young muscle, a small amount of exercise triggers molecular processes that tell the muscle to grow. Muscle fibers sustain damage through strain and stress, and then repair themselves by fusing together and increasing in size and mass. Muscles get stronger by surviving each series of little breakdowns, allowing for regeneration, rejuvenation, regrowth. As we age, the signal sent by exercise becomes much weaker. Though it’s more difficult for older people to gain and maintain muscle mass, it’s still possible to do so, and that maintenance is critical to supporting the brain.

 Even moderate exercise can increase metabolism in brain regions important for learning and memory in older adults. And the brain itself has been found to respond to exercise in strikingly physical ways. The hippocampus, a brain structure that plays a major role in learning and memory, shrinks in late adulthood; this can result in an increased risk for dementia. Exercise training has been shown to increase the size of the hippocampus, even late in life, protecting against age-related loss and improving spatial memory. 


Further, there is substantial evidence that certain myokines have sex-differentiated neuroprotective properties. For example, the myokine irisin is influenced by estrogen levels, and postmenopausal women are more susceptible to neurological diseases, which suggests that irisin may also have an important role in protecting neurons against age-related decline.

Studies have shown that even in people with existing brain disease or damage, increased physical activity and motor skills are associated with better cognitive function. People with sarcopenia, or age-related muscle atrophy, are more likely to suffer cognitive decline. Mounting evidence shows that the loss of skeletal muscle mass and function leaves the brain more vulnerable to dysfunction and disease; as a counter to that, exercise improves memory, processing speed, and executive function, especially in older adults. (Exercise also boosts these cognitive abilities in children.)

There’s a robust molecular language being spoken between your muscles and your brain. Exercise helps keep us fluent in that language, even into old age. 

by Bonnie Tsui


This story was part of MIT Technology Review’s September/October 2022 issue, The Gender Issue.


September 2nd 2022

Can we diagnose suffering without knowing a person’s history? | Psyche


Photo by Angelo Rezutto/The Library of Congress

Christos Tombras is a supervising psychoanalyst with a Lacanian orientation, practising in London.

Edited by Cameron Allan McKean

This is Whiteleaf Centre, Bierton Road, Aylesbury. Aylesbury’s main centre for mental patients was originally Stone Asylum, four miles south-west of Aylesbury off the Oxford Road. It was a standing joke among Winslow yokels regarding those who didn’t fit in: “The men in white coats will come and take you away in the yellow van.” The local dustcart driver, Mr Mason, had the humiliation of having his wife taken away to Stone. Around the same time, Mr Donald Carpenter, the chimney sweep turned telephone operator, ended up in the same place. As a boy, I remember Mr Donald Carpenter riding by on his bike, immaculate in trilby hat, mackintosh, collar and tie. Every time he passed, Don Carpenter would smile vacantly, asking a very philosophical question: ‘Do you know where you are going?’

Sadly, I recall a nice girl for whom I was form tutor. She showed me pictures of her parents’ last holiday, taken to repair their marriage. Her father had trained as a doctor, coming to Aylesbury to work in a local hospital. This lovely girl, with a promising life ahead of her, introduced me to one of her equally beautiful sisters, of whom she had two.

A few weeks later, I read in the local paper that the father had strangled the younger daughter, lying in wait to kill the others before setting fire to the large family home. The doctor feared this world and wanted to take them all to a better place. He was committed to Stone Asylum, where he hanged himself in the grounds. Friends and colleagues recalled him as a very nice man.

Stone was closed down, with mental health care moved to the old Tindal General Hospital in the 1980s, where my father had died from a terminal illness in 1962. Just a little way west, along Bierton Road, a new purpose-built mental hospital, Whiteleaf, was opened. It is a day and secure unit.

I have personal experience of this establishment, as well as a long history of difficulties with the police, who have a triage role committing people there. I was referred there to former Aylesbury Grammar School boy and Nottingham Medical School old people’s mental health ‘expert’ Dr C R Ramsay – Charlie for short.

Curiously, he considered me a suitable candidate for sex change treatment and wrote a glowing referral. But in 2018, with an axe to grind, Thames Valley Police informed him and the GIC, via my obsequious GP Kamble, who has met me only once, that I was working from home as a ‘gay escort’ in a brothel for my son Kieran and his gangster associates.

So, biding their time, with me awash with oestrogen and my sex change surgery listing overdue in February 2019, while I was working as a truck driver, not a whore, Dr Kirpal Sahota announced to me when I visited her clinic: ‘You are very elegant.’ She went on to tell me I needed anti-psychotic drugs. She informed my GP that I had a secure female identity and was being prepared for sex change surgery. A few weeks later, after I had been out truck driving for 14 hours, Charlie Ramsay arrived unexpected with a burly mental health nurse and a smirking posh medical student. After one hour, having assured me they were not necessarily going to take me to hospital, Ramsay went away. Three days later he uploaded his report that I have a paranoid personality disorder, abnormal psychology, schizophrenia, bipolar disorder and delusions. Health professionals tell me that I cannot see my related medical records without police permission.

Miss Roberta Jane Cook.
Elegant Roberta Jane Cook as she prepares to leave for her last appointment at the London Gender Identity Clinic after nearly three years of treatment, suddenly being told she is psychotic, seriously mentally ill and needs drugs before surgery.

The power of diagnosis is becoming more potent. In 2022, the Diagnostic and Statistical Manual of Mental Disorders (DSM), the ‘bible’ of mental health professionals, appeared on The Wall Street Journal’s bestseller list for the first time. It was also the top-selling psychiatry book on Amazon. Five editions have been published since 1952, and the latest, the DSM-5-TR (2022), is perhaps the most popular of them all. Why? Is it the promise that science can assess and understand human suffering? Is it the belief that this understanding can help us find specific, appropriate and effective treatment for our problems?

These are tempting but misleading promises. Human suffering is not defined by abstract categories. It does not exist independently of humans who are suffering. Useful as the DSM is, any project that seeks to list and categorise psychological problems in terms of some deviation from a definition of what is ‘normal’ runs the risk of forgetting that disorders do not appear out of thin air. They have their own history. They are also part of our histories. Moreover, they do not remain constant; they change just as we change.

From the moment it was conceived in the mid-20th century, the DSM was hailed by many as a liberating and revolutionary scientific project. Not everyone has agreed. It and other diagnostic tools have also been criticised for being vehicles of corporatised medicine, products of bureaucratic health systems, riddled with false categories, and for forgetting that psychological suffering is connected to the society that produced it. However, within this debate, the personal histories of those who actually experience suffering are often overlooked.

Human experience is distributed, non-specific, and flows in time. The checklist-style organisation of the disorders that populate diagnostic manuals and tests – including online questionnaires, mental health apps and personality ‘inventories’ – tend to forget that people have some awareness of themselves as agents in a timeline that is coming from somewhere (the past) and going towards somewhere (the future). This distinctly human feature of our experience, its historicity, has been at the centre of the work of philosophers like Martin Heidegger and psychoanalysts like Jacques Lacan. Less a question about the factual specifics of one’s timeline (what happened when and where), ‘historicity’ in philosophy refers to the fact that we are constantly creating and re-creating self-narratives. This is how we try to make sense of our lives as we move along the myriad pathways that connect our past to our future. These meandering and confusing pathways, full of dead ends and false connections, often contribute to our suffering.

It’s tempting to believe that we can see ourselves objectively reflected in the diagnostic criteria and checklists of ‘bibles’ like the DSM, but our individual stories and anxieties elude easy diagnosis because they do not exist independently of our history or our attempts to articulate them, make sense of them, and fit them into the identities we are constantly trying to form of ourselves.

Robert is finally able to give me a glimpse into his suffering. At this moment, I could try to turn his symptoms into a diagnosis

I am a psychoanalyst working in London. The people who come to see me, mostly self-referrals, are seeking help with those articulations, that sense-making and identity-construction. Robert (not his real name) first came to see me when he was in his 30s. He came because he was worried about his job at an art gallery, where he felt like an intruder, having nothing in common with his colleagues. Just like his experience in previous jobs, he had a deep, debilitating fear that colleagues and bosses were always watching, waiting for him to make a decisive mistake. This fear would make him freeze. He couldn’t think. He wanted to disappear.


Emotionally Extreme Experiences Are More Meaningful in Life

Peak emotional experiences, not just “negative” and “positive” experiences, are the most meaningful ones in our lives.

Scientific American

By Scott Barry Kaufman

What does it take to live a meaningful life? In trying to answer this question, most researchers focus on the valence of the life experience: is it positive or negative?

Researchers who focus on positive emotions have amassed evidence suggesting that we are more likely to find more meaning in our lives on days when we experience positive emotions. In contrast, researchers taking a meaning-making perspective tend to focus on meaning in the context of adjustment to stressful events. These two areas of research are often treated separately from each other, making it difficult to answer the question about which valence of our emotional life—positive or negative—is most likely to be meaningful.

Both perspectives may be at least partly right. In their classic paper “Some Differences between a Happy Life and a Meaningful Life,” Roy Baumeister and his colleagues zoomed in on the different outcomes associated with happiness (controlling for meaning) and meaningfulness (controlling for happiness). Whereas happiness was positively correlated with the frequency of positive events in one’s life and negatively related to the frequency of negative events, greater meaningfulness was related both to a higher frequency of positive events and a higher frequency of negative events, as well as reports of more stress, time spent worrying, and time spent reflecting on struggles and challenges. What’s going on here? How can meaning be positively associated with both positive and negative experiences?

In a 2019 paper, Sean Murphy and Brock Bastian suggest that a focus on emotional valence may have been a red herring for the field. By intentionally pitting “positive” experiences against “negative” experiences, researchers have focused on the difference between these experiences. However, Murphy and Bastian argue that this has neglected our understanding of similarities in how the positivity and negativity of experiences are related to meaningfulness. They raise the intriguing possibility that the more relevant factor may be the extremity of the experience, not the valence. Perhaps both extremely pleasant and extremely painful events relative to more neutral events share a common set of characteristics that might lead them to be found more meaningful.

They set out to test this idea for the first time. Across three studies, they collected reports of the most significant events in people’s lives across the emotional spectrum and measured the meaningfulness of the experiences. In line with their prediction, they found that the most meaningful events were those that were extremely pleasant or extremely painful.

They also looked at various qualities of the event that might explain the impact of emotional extremity on meaningfulness. They found that extreme events were found more meaningful in large part because of their emotional intensity and the contemplation they inspired (e.g., “I find myself analyzing this experience to try to make sense of it”). In fact, they consistently found that positive and negative events inspired contemplation to about the same degree. While the field has focused mostly on how traumatic events inspire contemplation, this finding is in line with research looking at the rumination that often occurs following positive moods.

Their findings also point to the importance of intensity in building a meaningful life, a factor that hasn’t received as much attention in the field as the valence of the emotion. This work is important because it ties together literatures on meaning that have often been treated separately, or even in opposition, to each other. As the researchers note, the “commonalities reveal a more complete and nuanced picture about what determines the events we find meaningful and memorable.”

Rethinking the Good Life

Their findings have a number of important implications for our understanding of the good life as well as our understanding of human nature more generally. On the surface, it may seem perplexing that so many people intentionally behave in counter-hedonic ways, actively seeking out unpleasant experiences.

For instance, in their paper “Glad to be Sad, and Other Examples of Benign Masochism,” Paul Rozin and his colleagues found that 29 initially aversive activities—including watching frightening movies, viewing sad paintings, listening to sad music, eating spicy food, listening to disgusting jokes, going on thrill rides, having a painful massage, and being physically exhausted—produced pleasure in a substantial number of individuals. Rozin and his colleagues ended their paper noting that if “we had a better understanding of the function of sadness, we would no doubt be able to make more sense of this.”

However, the findings of Murphy and Bastian suggest that it’s not the sadness, per se, that is enjoyable, but the intensity of the experience, which is enjoyable because it leads to a greater sense of meaning. This makes sense from a narrative identity perspective: our life story and our sense of who we are is a carefully constructed selection of meaningful events from our lives. The events that we find most worthy of incorporating into our life story may be those that are most intense. The greater contemplation associated with intense experiences may increase the likelihood that we consider such events self-defining.

Over 50 years ago, Abraham Maslow talked about the importance of “peak experiences,” which he described as “rare, exciting, oceanic, deeply moving, exhilarating, elevating experiences that generated an advanced form of perceiving reality, and are even mystic and magical in their effects…” While people often talk about the euphoria of peak experiences, Maslow often pointed out how overcoming intense challenges and setbacks can be a key trigger for a peak experience.

Similarly, in his 2018 book The Other Side of Happiness: Embracing a More Fearless Approach to Living, Bastian argues that suffering and sadness are actually necessary ingredients for happiness. He notes that “the most thrilling moments in our lives are often balanced on a knife edge between pleasure and pain… Our addiction to positivity and the pursuit of pleasure is actually making us miserable… without some pain, we have no real way to achieve and appreciate the kind of happiness that is true and transcendent.” Yale psychologist Paul Bloom has also been making sense of the “pleasures of suffering.”

These findings also have implications for the mindfulness craze, and provide a much-needed counterpoint to the current trend of viewing calm and tranquil experiences as most conducive to a life well lived. To be sure, mindfulness, meditation, and cultivating inner calm can be beneficial for reducing anxiety, improving depression and helping us cope with pain.

However, the intensity of peak experiences may be more likely to define who we are. At the end of our lives, will we look back and remember most poignantly all of the calm and tranquil meditation sessions we had, or will we remember the moments that plumbed the depths of our emotional life, that made us feel most alive?

The views expressed are those of the author(s) and are not necessarily those of Scientific American.

August 25th 2022

Schizophrenia is a serious mental disorder in which people interpret reality abnormally. Schizophrenia may result in some combination of hallucinations, delusions, and extremely disordered thinking and behavior that impairs daily functioning, and can be disabling.

People with schizophrenia require lifelong treatment. Early treatment may help get symptoms under control before serious complications develop and may help improve the long-term outlook.

Symptoms

Schizophrenia involves a range of problems with thinking (cognition), behavior and emotions. Signs and symptoms may vary, but usually involve delusions, hallucinations or disorganized speech, and reflect an impaired ability to function. Symptoms may include:

  • Delusions. These are false beliefs that are not based in reality. For example, you think that you’re being harmed or harassed; certain gestures or comments are directed at you; you have exceptional ability or fame; another person is in love with you; or a major catastrophe is about to occur. Delusions occur in most people with schizophrenia.
  • Hallucinations. These usually involve seeing or hearing things that don’t exist. Yet for the person with schizophrenia, they have the full force and impact of a normal experience. Hallucinations can be in any of the senses, but hearing voices is the most common hallucination.
  • Disorganized thinking (speech). Disorganized thinking is inferred from disorganized speech. Effective communication can be impaired, and answers to questions may be partially or completely unrelated. Rarely, speech may include putting together meaningless words that can’t be understood, sometimes known as word salad.
  • Extremely disorganized or abnormal motor behavior. This may show in a number of ways, from childlike silliness to unpredictable agitation. Behavior isn’t focused on a goal, so it’s hard to do tasks. Behavior can include resistance to instructions, inappropriate or bizarre posture, a complete lack of response, or useless and excessive movement.
  • Negative symptoms. This refers to reduced or lack of ability to function normally. For example, the person may neglect personal hygiene or appear to lack emotion (doesn’t make eye contact, doesn’t change facial expressions or speaks in a monotone). Also, the person may lose interest in everyday activities, socially withdraw or lack the ability to experience pleasure.

Symptoms can vary in type and severity over time, with periods of worsening and remission of symptoms. Some symptoms may always be present.

In men, schizophrenia symptoms typically start in the early to mid-20s. In women, symptoms typically begin in the late 20s. It’s uncommon for children to be diagnosed with schizophrenia and rare for those older than age 45.

Symptoms in teenagers

Schizophrenia symptoms in teenagers are similar to those in adults, but the condition may be more difficult to recognize. This may be in part because some of the early symptoms of schizophrenia in teenagers are common for typical development during teen years, such as:

  • Withdrawal from friends and family
  • A drop in performance at school
  • Trouble sleeping
  • Irritability or depressed mood
  • Lack of motivation

Also, recreational substance use, such as marijuana, methamphetamines or LSD, can sometimes cause similar signs and symptoms.

Compared with schizophrenia symptoms in adults, teens may be:

  • Less likely to have delusions
  • More likely to have visual hallucinations

When to see a doctor

People with schizophrenia often lack awareness that their difficulties stem from a mental disorder that requires medical attention. So it often falls to family or friends to get them help.

Helping someone who may have schizophrenia

If you think someone you know may have symptoms of schizophrenia, talk to him or her about your concerns. Although you can’t force someone to seek professional help, you can offer encouragement and support and help your loved one find a qualified doctor or mental health professional.

If your loved one poses a danger to self or others or can’t provide his or her own food, clothing, or shelter, you may need to call 911 or other emergency responders for help so that your loved one can be evaluated by a mental health professional.

In some cases, emergency hospitalization may be needed. Laws on involuntary commitment for mental health treatment vary by state. You can contact community mental health agencies or police departments in your area for details.

Suicidal thoughts and behavior

Suicidal thoughts and behavior are common among people with schizophrenia. If you have a loved one who is in danger of attempting suicide or has made a suicide attempt, make sure someone stays with that person. Call 911 or your local emergency number immediately. Or, if you think you can do so safely, take the person to the nearest hospital emergency room.

Causes

It’s not known what causes schizophrenia, but researchers believe that a combination of genetics, brain chemistry and environment contributes to development of the disorder.

Problems with certain naturally occurring brain chemicals, including neurotransmitters called dopamine and glutamate, may contribute to schizophrenia. Neuroimaging studies show differences in the brain structure and central nervous system of people with schizophrenia. While researchers aren’t certain about the significance of these changes, they indicate that schizophrenia is a brain disease.

Risk factors

Although the precise cause of schizophrenia isn’t known, certain factors seem to increase the risk of developing or triggering schizophrenia, including:

  • Having a family history of schizophrenia
  • Some pregnancy and birth complications, such as malnutrition or exposure to toxins or viruses that may impact brain development
  • Taking mind-altering (psychoactive or psychotropic) drugs during teen years and young adulthood


Complications

Left untreated, schizophrenia can result in severe problems that affect every area of life. Complications that schizophrenia may cause or be associated with include:

  • Suicide, suicide attempts and thoughts of suicide
  • Anxiety disorders and obsessive-compulsive disorder (OCD)
  • Depression
  • Abuse of alcohol or other drugs, including nicotine
  • Inability to work or attend school
  • Financial problems and homelessness
  • Social isolation
  • Health and medical problems
  • Being victimized
  • Aggressive behavior, although it’s uncommon


Prevention

There’s no sure way to prevent schizophrenia, but sticking with the treatment plan can help prevent relapses or worsening of symptoms. In addition, researchers hope that learning more about risk factors for schizophrenia may lead to earlier diagnosis and treatment.

By Mayo Clinic Staff, Jan. 07, 2020





August 24th 2022

10 common phrases that make you sound passive-aggressive in the workplace

Published Fri, Aug 19 2022, 10:00 AM EDT, by Ashton Jackson (@ashtonlinnell)

For many professionals, the majority of daily work communication happens through emails. Unfortunately, it can be hard to gauge someone’s tone through a computer — and your emails could be coming off a little aggressive.

Whether intentional or unintentional, being passive-aggressive in the workplace could make others uncomfortable, create tension and even jeopardize your job.

A recent study from WordFinder by Your Dictionary, an online word search tool, collected data from Ahrefs and Google AdWords to find the most-used passive-aggressive work phrases, most of which seem pretty harmless at first glance.

“For better or worse, digital communication, whether it’s through email or direct messages on platforms like Slack, doesn’t let us see each other’s immediate reactions — which is why we look for ways to ‘politely’ express irritation,” WordFinder representative Joe Mercurio tells CNBC Make It. “As a result, employee frustration and miscommunication are at an all-time high, with tone alone being misinterpreted quite a bit in email communication.”

According to the findings, here are the top 10 most passive-aggressive phrases in the workplace:

1. Please advise 

2. Noted

3. Friendly Reminder

4. Will do 

5. Thanks in advance 

6. Per our last conversation 

7. Circling back

8. As per my last email 

9. As promised 

10. As discussed 

WordFinder also identified some of the least passive-aggressive work phrases, including “Sorry to bother you again,” “Any update on this” and “I’ll take care of it.”

According to Mercurio, the difference in the delivery of these phrases has to do with timing and attitude. He also urges employees to think twice before sending their emails.

“To communicate effectively, employees should remember not to respond to messages or emails when in a state of frustration. They should also assume good intent, show empathy and encouragement, and avoid digital ghosting. As a rule of thumb: if you feel uncomfortable reading it directed toward you, try rethinking your approach.”

Communication is one of the most important parts of an effective workplace, according to BetterUp, as it “boosts employee morale, engagement, productivity, and satisfaction.” Communication also enhances teamwork and coordination and helps ensure better performance for organizations as a whole.

Mercurio advises professionals to bring the “Golden Rule” into the workplace when it comes to interacting and communicating with others.

“Treat people how you would like to be treated. Start by deciding if the request is better suited for an email or a face-to-face conversation. If it’s something that can be relayed over email, reread the email and think about things like tone and reception. Overall, open and honest communication is the best way people can work together.”

Comment Women excel at passive aggression because of inherent physical weakness due to hormones. Feminists are admitting this hormone issue by blocking transsexual competitors. Hence the expression ‘sarcasm is the lowest form of wit and a lady’s privilege.’

What concerns me is that this is another example of policing speech and therefore what and how we are allowed to say and therefore think. Aggression is a natural animal instinct. There are no peaceful wars. Passive people don’t get rich and powerful. Satire, a form of passive aggression, has played a part in deflating dangerous political egos. Articles like these are about brain training and zombie creation. They are a good back up for the anti psychotic drugs dished out to children along with the anti androgens. This is Grave New World, a world of State approved media and living dead.

Miss Roberta Jane Cook. She says ‘When you look like this and it is your internal self image in a savage world, one has to deal with aggression from the likes of authority as well as evil minds lurking at work or in the undergrowth. Passive aggression is the safest option.’

R.J Cook.


August 23rd 2022

Did we all believe a myth about depression?

By Rachel Schraer
Health and disinformation reporter


A study showing depression isn’t caused by low levels of the “happy hormone” serotonin has become one of the most widely shared medical articles.

It has provoked a wave of misleading claims about antidepressant drugs, many of which increase the amount of serotonin in the brain.

This research doesn’t show the drugs aren’t effective.

But the response to it has also sparked some genuine questions about how people treat, and think about, mental illness.


After Sarah had her first major psychiatric episode, in her early 20s, doctors told her the medication she was prescribed was like “insulin for a diabetic”. It was essential, would correct something chemically wrong in her brain, and would need to be taken for life.

Her mother had type 1 diabetes, so she took this very seriously.

Sarah stayed on the drugs even though they seemed to make her feel worse, eventually hearing menacing voices telling her to kill herself and being given electroconvulsive therapy (ECT).

Yet the claim she needed the drug like a diabetic needs insulin wasn’t based on any medical evidence.

“You feel betrayed by the people that you trusted,” she says.

Her reaction to the drugs was extreme but the “chemical imbalance” message she was given was not unusual.

Image caption: Sarah and her mother, who took insulin for type 1 diabetes

Many psychiatrists say they have long known low levels of serotonin are not the main cause of depression and this paper doesn’t say anything new.

Yet the unusually large public response to it suggests this was news to many.

But some made an inaccurate leap from saying antidepressants don’t work by fixing a chemical imbalance, to saying they don’t work at all.

And doctors fear, in the confusion, people might stop taking their medication abruptly and risk serious withdrawal effects.

The National Institute for Health and Care Excellence (NICE) says these drugs shouldn’t be stopped abruptly except in medical emergencies and reducing the dose slowly can minimise withdrawal symptoms.

Image caption: Sarah has since developed speech and movement difficulties

What did the research show?

This latest research looked at 17 studies and found people with depression didn’t appear to have different levels of serotonin in their brains from those without.

The findings help to rule out one possible way the drugs might work – by correcting a deficiency.

“Many of us know that taking paracetamol can be helpful for headaches and I don’t think anyone believes that headaches are caused by not enough paracetamol in the brain,” Dr Michael Bloomfield points out.

So do antidepressants work?

Research suggests antidepressants work only a bit better than placebos (dummy drugs people are told could be the real thing). There are debates among researchers about how significant this difference is.

Within that average is a group of people who experience much better results on antidepressants – doctors just don’t have a good way of knowing who those people are when prescribing.

Some people who take antidepressants say the drugs have helped them during a mental health crisis, or allow them to manage depression symptoms day to day.

Prof Linda Gask, at the Royal College of Psychiatrists, says antidepressants are “something that help a lot of people feel better quickly”, especially in a crisis.

But one of the authors of the serotonin paper, Prof Joanna Moncrieff, points out most research by drug companies is short-term, so little is known about how well people do after the first few months.


“You have to say we will continue to review them and we won’t keep you on them any longer than you need to be on them,” something which often doesn’t happen, Prof Gask agrees.

While there are risks to leaving depression untreated, some people will experience serious side-effects from antidepressants – which the serotonin study’s authors say need to be more clearly communicated.

These can include suicidal thoughts and attempts, sexual dysfunction, emotional numbing and insomnia, according to NICE.

Since last autumn, UK doctors have been told they should offer therapy, exercise, mindfulness or meditation to people with less severe cases of depression first, before trying medication.

Image caption: Local health teams might offer group therapy, recommend exercise or community activities

How was the research talked about?

One typical misleading response claimed the study showed the prescribing of antidepressants was “built on a myth”.

Post labelled ‘misleading’, reading: “This anti-depressant study is huge. Big Pharma has made billions prescribing wonder drugs to treat depression but there was never any solid scientific evidence that the drugs would work. Now we know that the whole thing was built on a myth. Big Pharma’s greatest scam of all time.”

But the study didn’t look at antidepressant use at all.

Serotonin plays a role in mood, so tweaking it can make people feel happier at least in the short term, even if they didn’t have abnormally low levels to start with. It may also help the brain make new connections.

Others have claimed this study shows depression was never an illness in people’s brains but a reaction to their environments.

“Of course it’s both,” Dr Mark Horowitz, one of the paper’s authors, says.

“Your genetics affects your sensitivity to stress,” for example.

But people having an understandable response to difficult circumstances might be better helped with “relationship counselling, financial advice, or a change of jobs” than medication.

However, Zoe, who lives in south-east Australia and experiences both severe depression and psychosis, says rebranding depression as “distress” that would go away if we “just fix all the social problems” is also too simplistic and overlooks people with more severe mental illnesses.

Psychosis runs in her family, but episodes are often triggered by stressful events such as exam deadlines.

Zoe says she has found medication, including antidepressants, life-changing and has been able to make a “calculation” side-effects are “worth it” to avoid severe episodes.

And that’s one thing all of the experts who spoke to BBC News agree on – patients need to have more information, better explained, so they can make these difficult calculations for themselves.

For details of organisations offering advice and support, go to BBC Action Line.

Some contributors asked that their surnames be withheld

Follow Rachel on Twitter or email your stories about antidepressants to


August 17th 2022

How Psychology Can Help Fight Climate Change—And Climate Anxiety

Scientists and activists have deployed many tactics to help combat climate change: expanding technologies like wind and solar power, building better batteries to store that renewable energy, and protecting forests, all the while striving to reduce greenhouse gas emissions.

On Aug. 4, during the American Psychological Association’s Convention in Minneapolis, nearly a dozen experts turned the spotlight on another more surprising tool: psychology.


“I used to begin my presentations by talking about temperature data and heat-trapping gasses, but now I begin most of my presentations in the same way: by asking people, ‘How do you feel about climate change?’” said Katharine Hayhoe, chief scientist for the Nature Conservancy, a nonprofit environmental organization, during a panel discussion. “I get the same words everywhere: anxious, worried, frustrated, concerned, devastated, overwhelmed, angry, hopeless, horrified, frightened, heartbroken, and afraid.”

Simply simmering in those negative emotions won’t accomplish much: “If we don’t know what to do with them, that can cause us to withdraw, to freeze, to give up rather than take action,” Hayhoe says.

February 25th 2022

Who Lives, Who Dies: The Remarkable Life and Untimely Death of Dr. Paul Farmer

By Amy Goodman and Denis Moynihan
“Who Lives, Who Dies: Reimagining Global Health and Social Justice” was the title of a talk delivered virtually at the University of Hawaii on February 17th by renowned public health physician Dr. Paul Farmer. He was speaking from a hospital in Rwanda that he helped build along with Partners in Health, the global non-profit organization he co-founded in 1987. Paul Farmer talked of his life’s work transforming healthcare systems worldwide, where too often access to care is reserved for the wealthy while the poor are left to die. With characteristic humility, he described healthcare as a human right and his years of what he called “pragmatic solidarity” in scores of countries. The clinics and hospitals he developed in the world’s poorest regions have saved patients from tuberculosis, HIV, Ebola, cancer and more. Four days after giving his talk, Paul Farmer died in his sleep, of an acute cardiac event. He was 62 years old.

“We are gutted by this loss,” Dr. Joia Mukherjee, chief medical officer for Partners In Health, said on the Democracy Now! news hour. “Just a deep, deep sorrow, a sorrow for the whole world…he combined a very fierce intellect with just an absolutely expansive heart and generosity and a real enthusiasm and joy for service and fellowship that was unparalleled. At the same time, he had impossibly high standards — high standards for medicine, that everyone should get a very First World care, that there is no First, Second and Third World, high standards for dignity.”

Paul Farmer had an unorthodox upbringing, living with his parents and siblings in a converted bus in Florida. After college, he spent a year in Haiti, where he committed to helping Cange, one of Haiti’s poorest communities. The people of Cange were …
Remembering Dr. Paul Farmer: A Public Health Pioneer Who Helped Millions from Haiti to Rwanda
We remember the life and legacy of Dr. Paul Farmer, a public health icon who spent decades building …

HR manager loses race discrimination case over ‘inner chimp’ comment

Mark Duell for MailOnline

A black HR manager has lost a race discrimination case after her boss asked her about her ‘inner chimp’ during a meeting about ‘Chimp Paradox’ theory.

Lindani Sibanda was offended when Clinisupplies chief executive Paul Cook asked her in a meeting how she kept ‘the chimp at bay’, an employment tribunal was told.

The question was a reference to the ‘Chimp Paradox’, a mind management tool used to differentiate between irrational, emotional thoughts and rational ones.

It was coined by psychiatrist Steve Peters and has been used to help Olympic gold medallists Victoria Pendleton and Sir Chris Hoy to victory on the cycling track.


Comment This woman clearly has anxieties about being associated with monkeys because chimps have black fur – pink without it – and presumably is typically religion-influenced into thinking humans are mini Gods. The reality that we are all primates offends ‘normal’ people. Only so much science is allowed where social control has primacy. The obsession with racism and anti racism is a source of mental illness. The reality of it all being about politics of control is not supposed to be realised – unless of course she was just on the make. I am sure it wasn’t that, but in a dog eat dog world, I would not blame her if she was. R J Cook February 15th 2022.

What Are Precognitive (Premonition) Dreams?

Updated July 15, 2021. Written by Sarah Shoen; medically reviewed by Dr. Abhinav Singh.

Psychopaths & The Police


13 Clear Traits of a Psychopath (Spot Them)


The link between crime and the antisocial lifestyle aspects of psychopathy (e.g., Self-Centered Impulsivity; SCI) is well established. However, some psychopathic traits may be adaptive in specific institutions and cultures. Lykken (1995) conjectured that socialization within first responder culture may enable “heroes” to utilize the interpersonal and affective aspects of psychopathy (e.g., Fearless Dominance; FD) in a manner that benefits society. Previous research (Falkenbach et al., in press) suggests that psychopathy and its correlated traits differ between law enforcement officers and the community. The objective of the present study was to bring further clarity to the presence of these traits within the police community and consider the influence of socialization within police culture. Self-report measures were used to evaluate how aggression, behavioral inhibition/activation, empathy, affect, and anxiety were related to factors of the Psychopathic Personality Inventory-Revised. These measures were administered to 1450 police officers, including recruits, detectives, sergeants, lieutenants, and executives. Recruits had lower SCI and behavioral inhibition, and higher FD and positive affect, than higher ranking officers. These findings support the hypothesis that there are differences in the expression of psychopathic traits and correlates in those just starting a police career.

Read More: Psychopathy and Associated Traits in Police Officers (PDF)

Is That Cop Dangerous? 4 Tips to Detect Psychopaths in Uniform

Author and lawyer Joseph Tully, November 17th, 2017


Psychopathic cops can be more dangerous than criminals. They are responsible for police brutality, unjustified shootings, false testimony, and many other forms of police misconduct.

Every year, dozens of people who were convicted based on a cop’s testimony are released from prison because they were innocent. In three out of four homicide exonerations, official misconduct is a factor.

Thousands of Americans have died at the hands of cops in suspicious circumstances. This kind of behavior is, more often than not, the work of a psychopath.

What is a Psychopath?

One of the problems with psychopaths is that they are incapable of remorse.

For Jon Ronson, author of The Psychopath Test, “Psychopathy is probably the most pleasant-feeling of all the mental disorders… All of the things that keep you good, morally good, are painful things: guilt, remorse, empathy.” For neuroscientist James Fallon, author of The Psychopath Inside, “Psychopaths can work very quickly, and can have an apparent IQ higher than it really is, because they’re not inhibited by moral concerns.”

Psychopaths have cognitive empathy, they can understand what others are feeling, but they lack the ability to feel it, which is known as emotional empathy. “This all gives certain psychopaths a great advantage, because they can understand what you’re thinking, it’s just that they don’t care, so they can use you against yourself,” Fallon explains.

In fact, research has shown that psychopaths are extremely adept at identifying vulnerability.

Psychopaths Often Become Cops

What happens when a person like that, someone who has zero concern for our feelings, is handed a gun and put in a position of power?

An encounter with a psychopath in a police uniform can be life-threatening. That’s why it is so important to be able to detect them. When you are in front of a psychopath, you need to adjust your behavior, because normal social behaviors can trigger unexpected responses.

Research has shown that police officer is one of the top 10 professions chosen by psychopaths, ranking at number 7.

As I wrote in my book – California: State of Collusion, “Power, such as we give to law enforcement, prosecutors and judges, actually attracts psychopathic personalities who want to exert violent dominance under the color of authority. Innocent people can be subjected to a power trip police encounter, can be arrested by a megalomaniacal cop, jailed by a sadist, prosecuted by a manipulative Machiavellian and judged by a sociopath on an ego trip.”

How to Tell if a Cop is a Psychopath – Watch for these 4 Signs

Next time you dial 911, are pulled over on the road, handcuffed, interrogated, or arrested, watch for these signs to determine if you are in the presence of a psychopathic police officer.

1. They are Ambidextrous

If a cop uses both hands interchangeably, that increases his or her chances of being a psychopath. Research has shown that ambidextrous people consistently score higher on standard psychopathy tests. This has been observed in both male and female individuals.

2. They Maintain Constant Eye Contact and Show Particular Body Language

According to the gold standard for detecting psychopathic personalities, which was developed by Robert Hare, lying is one of the strategies psychopaths use to dominate. Sometimes they lie about things that don’t matter, lying is like a sport for them.

While a non-psychopath will often blink and move about in times of stress, psychopaths remain still and hold consistent eye contact. They always appear confident and secure, and they tend to be still.

Because they lack relaxed and natural movements, sometimes their perfect poise gives way to contradictory movements. For example, a psychopath’s nod may say yes while their words say no. They can also point to the left while talking about something that is on their right.

3. They Speak in a Grandiose Manner

Studies have also shown that the speech of psychopaths has some recognizable characteristics. For example, they use words like ‘because,’ ‘since,’ and ‘that’s why,’ to describe their own actions. They often speak in past tense and use utterances like ‘uh’ and ‘um’ rather frequently.

It is very common for psychopathic cops to speak highly of their own behaviors and blame others for all their problems. It is the mark of a psychopath to take no responsibility for their own mistakes.

4. They Manufacture Negative Reactions

Psychopaths enjoy creating chaos. Afterwards, they feign innocence and put the blame on you for reacting. They are masters of provocation. If a cop baits you into an argument and then pretends to be surprised, watch out; chances are, you are dealing with a psychopath.

Conclusion: Trust Your Instincts and Stay Safe

Not all cops care about justice. Many of them abuse their power. Psychopathic ones beat up innocent men, women, and minors. They shoot unarmed victims and lie to secure convictions. If you ever come across a cop, trust your instinct. If there is even the smallest sign that they might be a psychopath, you must watch your back.

Unfortunately, not all cops are there to protect you. Some will try to harm you and send you over the edge. If that happens, try to remain silent, and put safety first.

Our attorneys are used to dealing with problematic police officers and prosecutors who display psychopathic traits. We have psychiatrists on our teams who can efficiently detect psychopaths. If you have run into trouble with a law enforcement officer, call us, we can help restore your civil rights and hold psychopathic cops and their employers accountable.


Life after death? Scientist says experiences ‘incredibly real’ as ‘brain goes haywire’ – September 6th 2021

Sebastian Kettley

Out-of-body experiences are a fairly common phenomenon with an estimated 10 percent of people experiencing them at least once in their life. The OBEs can be triggered by a wide array of factors – from brain tumours to epilepsy – but are often associated with so-called near-death experiences, or NDEs. Cardiac arrest patients, in particular, frequently recall a sensation of floating outside of their bodies or even looking down at their hospital beds from the ceiling.

One woman who temporarily died as a child believes her spirit escaped her body before being sucked right back in.

A similar account was shared by a man who claimed to have left his body after suffering a near-fatal heart attack.

For many people, these experiences are deeply profound and serve to inform their beliefs in God, the afterlife and give them some sort of “meaning in life”.

Scientists are, however, unconvinced about the spiritual aspect of OBEs and instead point to a growing body of evidence that might explain what is going on in the brain during these phenomena.

Dr Jane Aspell, a cognitive neuroscientist at Anglia Ruskin University, has been experimenting with OBEs through the use of virtual reality headsets to better understand how the brain creates an identity of the “self”.

What is the difference between incel and feminism? I ask this as a serious question. Why are there always excuses for female miscreants and killers but none for men – if equality is the true State and Feminist goal for the masses, which it isn’t at all? August 20th 2021.


Mother, 31, held her head in hands as she appeared in court accused of murdering two-year-old son at family’s £500,000 suburban home

A mother held her head in her hands as she today appeared in court over the alleged murder of her toddler son. Natalie Steele … Posted August 19th 2021


Robert Weiss Ph.D., MSW

Love and Sex in the Digital Age

7 Key Reasons Why Some Women Cheat – Posted July 7th 2021

Women are unfaithful nearly as often as men.

Posted August 29, 2018 |  Reviewed by Lybi Ma

Source: Sementsova Lesia/Shutterstock

First things first: If you’re female and reading this wondering why I’m only writing about women who cheat, know that a post I published a few months ago — “13 Reasons Why Men Cheat” — has become one of my most widely read.

But now it’s time to look at female infidelity.

There is a common misperception that it’s only men who step out on their partners and that women are always faithful. To that, I say: Who are all these men cheating with exactly? Do heterosexual men only cheat with single women and each other?

The simple truth is that approximately as many married, heterosexual women cheat as married, heterosexual men. Research suggests that 10 to 20 percent of men and women in marriages or other committed (monogamous) relationships will actively engage in sexual activity outside of their primary relationship. And these numbers are likely under-reported, possibly by a wide margin, thanks to denial and confusion about what constitutes infidelity in the digital era. For example: Are you cheating if you look at porn? If you flirt on social media? If you have a profile on Ashley Madison that you check regularly, even though you never hook up in person?

To help couples answer these questions, I offer you my fully functional, digital-era definition of what it means to cheat:

Infidelity (cheating) is the breaking of trust that occurs when you keep profound, meaningful secrets from a committed primary partner.

I like this definition for four primary reasons:

1. The definition speaks to the most basic element of what happens when we cheat on our partners. We betray their trust. In such cases, even more than our sextracurricular activity, it is the lying and the secrecy of betrayal that wounds a beloved and unknowing partner (male or female).

2. The definition encompasses both online and real-world sexual activity, as well as sexual and romantic activities that stop short of intercourse: everything from looking at porn to kissing another man/woman to something as simple as flirting (now commonly referred to as micro-cheating).

3. The definition is flexible depending on the couple. It lets couples define their own version of sexual fidelity based on honest discussions and mutual decision-making. This means that it might be just fine to look at porn or to engage in some other form of extramarital sexual activity, as long as your mate knows about this behavior and is okay with it.

4. The definition helps the cheater understand that the problem he or she created occurred the moment he or she started lying to accommodate or cover up his or her infidelity. The harm is not a spouse finding out the bad news — the harm is that it was covered up.

None of that, of course, explains why women cheat. Nor does it address the fact that women and men often cheat for very different reasons.

So Why Do Women Cheat?

Typically, females step out on a committed partner for one or more of the following reasons:

  • They feel underappreciated, neglected, or ignored. They feel more like a housekeeper, nanny, or financial provider than a wife or girlfriend. So they seek an external situation that validates them for who they are, rather than the services they perform.
  • They crave intimacy. Women tend to feel valued and connected to a significant other more through non-sexual, emotional interplay (talking, having fun together, being thoughtful, building a home and social life together, etc.) than sexual activity. When they’re not feeling that type of connection from their primary partner, they may seek it elsewhere.
  • They are overwhelmed by the needs of others. Recent research about women who cheat indicates that many women, despite stating that they deeply love their spouse, their home, their work, and their lives, cheat anyway. These women often describe feeling so under-supported and overwhelmed by having to be all things to all people at all times that they seek extramarital sex as a form of life-fulfillment.
  • They are lonely. Women can experience loneliness in a relationship for any number of reasons. Maybe their spouse works long hours or travels for business on a regular basis, or maybe their spouse is emotionally unavailable. Whatever the cause, they feel lonely, and they seek connection through infidelity to fill the void.
  • They expect too much from a primary relationship. Some women have unreasonable expectations about what their primary partner and relationship should provide. They expect their significant other to meet their every need 24/7, 365 days a year, and when that doesn’t happen, they seek attention elsewhere.
  • They are responding to or re-enacting early-life trauma and abuse. Sometimes women who experienced profound early-life (or adult) trauma, especially sexual trauma, will re-enact that trauma as a way of trying to master or control it.
  • They’re not having enough satisfying sex at home. There is a societal misconception that only men enjoy sex. But plenty of women also enjoy sex, and if they’re not getting it at home, or it’s not enjoyable to them, for whatever reason, they may well seek it elsewhere.

As with male cheaters, women who cheat typically do not realize (in the moment) how profoundly infidelity affects their partner and their relationship. Cheating hurts betrayed men just as much as it hurts betrayed women. The keeping of secrets, especially sexual and romantic secrets, damages relationship trust and is incredibly painful regardless of gender.

If a couple chooses to address the situation together, couple’s counseling can turn a relationship crisis into a growth opportunity. Unfortunately, even when experienced therapists are extensively involved with people committed to healing, some couples are unable to ever regain the necessary sense of trust and emotional safety required to make it together. For these couples, solid, neutral relationship therapy can help the people involved to process a long overdue goodbye. But cheating doesn’t have to be seen as the end of a relationship; instead, it can be viewed as a test of its maturity and ability to weather the storm.

Comment: I can only conclude that these women are immature, greedy, attention-seeking fun-lovers with scant interest in the welfare of any unfortunate children they may have. Robert Cook

A Generation of Sociopaths review – how Trump and other Baby Boomers ruined the world – Posted July 6th 2021

Bruce Cannon Gibney’s study convinces Jane Smiley of the damage her own American generation has done

‘Shiny advertisements of middle-class perfection’ … 1950s America. Photograph: Lambert/Getty Images

Jane Smiley, Wed 17 May 2017 07.30 BST

The day before I finished reading A Generation of Sociopaths, who should pop up to prove Bruce Cannon Gibney’s point, as if he had been paid to do so, but the notorious Joe Walsh (born 1961), former congressman and Obama denigrator. In answer to talkshow host Jimmy Kimmel’s plea for merciful health insurance, using his newborn son’s heart defect as an example, Walsh tweeted: “Sorry Jimmy Kimmel: your sad story doesn’t obligate me or anyone else to pay for somebody else’s health care.” Gibney’s essential point, thus proved, is that boomers are selfish to the core, among other failings, and as a boomer myself, I feel the “you got me” pain that we all ought to feel but so few of us do.

Gibney is about my daughter’s age – born in the late 1970s – and admits that one of his parents is a boomer. He has a wry, amusing style (“As the Boomers became Washington’s most lethal invasive species … ”) and plenty of well parsed statistics to back him up. His essential point is that by refusing to make the most basic (and fairly minimal) sacrifices to manage infrastructure, address climate change and provide decent education and healthcare, the boomers have bequeathed their children a mess of daunting proportions. Through such government programmes as social security and other entitlements, they have run up huge debts that the US government cannot pay except by, eventually, soaking the young. One of his most affecting chapters is about how failing schools feed mostly African American youth into the huge for-profit prison system. Someday, they will get out. There will be no structures in place to employ or take care of them.

The boomers have made sure that they themselves will live long and prosper, but only at the expense of their offspring. That we are skating on thin ice is no solace: “Because the problems Boomers created, from entitlements on, grow not so much in linear as exponential terms, the crisis that feels distant today will, when it comes, seem to have arrived overnight.” As one who has been raging against the American right since the election of Ronald Reagan, as someone with plenty of boomer friends who have done the same, I would like to let myself off the hook, but Gibney points out that while “not all Boomers directly participated, almost all benefited; they are, as the law would have it, jointly and severally liable”.

Dick Cheney … ‘Others became overly aggressive.’ Photograph: Win McNamee/Getty Images

Gibney’s theories about how we boomers got to be sociopaths (inclined to “deceit, selfishness, imprudence, remorselessness, hostility”) are a little light: no experience of the second world war, unlike the Europeans; coddled childhoods owing to 1950s prosperity; and TV – “a training and reinforcement mechanism for deceit”, not to mention softening viewers up for ever more consumption of goods.

My own theories are based on my experience of the cold war. I think that the constant danger of nuclear annihilation and the drumbeat on TV and radio of the Soviet threat raised our fight-flight instincts so that some of us became overly cautious (me) and others overly aggressive (Dick Cheney). I also think that our parents were not “permissive”, but that they produced too many children in an era when there was nothing much for the children to do but get out of the house and into trouble – few time-consuming tasks around the house or on the farm, plus bored mothers and absent fathers, who felt a sense of despair when they compared themselves with the shiny advertisements of middle-class perfection they saw everywhere, not just on TV. This was what America had to offer – washing machines, high heels, perfect hairdos, Corn Flakes, TV dinners, patriotism and imminent destruction.

Gibney’s book includes more than 100 pages of documents and notes, and he is best at analysing the financial details of the various forms of national and environmental debt that our children and grandchildren will eventually have to pay. He slides around the obvious – to me – solution of just shooting us so that we can’t suck social security dry (I am not in favour of shooting even rats: the gun rights advocate Wayne LaPierre, born the exact day I was due in 1949, though I came six weeks early, is surely the ultimate example of this book’s sociopaths, completely indifferent as he is to the lives lost to the gun rights lobby). Yet Gibney does convince me that those of us born between 1940 and 1965 (his definition) are a drag on the future.

His last chapter concerns what can be done before it is too late. “Remediating the sociopathic Superfund site of Boomer America will be expensive,” he writes. “In money alone, the project will require $8.65 trillion soon and over $1 trillion in additional annual investment.” Then he asserts that it can be done, that the investment will pay off, that “it will be helpful to view reform as a process of manageable fiscal adjustments”. Good luck with that, and I say that with deep sincerity. As I watch my fellow boomers, Paul Ryan, Donald Trump and Mike Pence grin and fistbump at the idea of killing their fellow Americans with their newly passed health bill, I suspect that no one, not even their children, can redeem these people.

The first sociopath I actually knew, in the 1980s, was born in 1961, just like Joe Walsh. In the 80s, he was in finance. When, over dinner, I objected to off-shoring jobs and destroying unions, he said, in a sneering, Ayn Randian way: “They don’t have a right to those jobs!” He ran through his millions and is now in jail for pimping his girlfriend. I doubt he has learned a thing.

Read A Generation of Sociopaths and hope for the best. Gibney is more optimistic than those who predict an imminent third world war, than the scientists who warn of sudden climate shifts and the end of antibiotics, and even – in one sense – than the evangelicals who believe in the Rapture. He also has a better sense of humour.

A Generation of Sociopaths: How the Baby Boomers Betrayed America is published by Hachette.

Donnie Darko – Mad World – YouTube


As a psychologist, I can tell you that Dominic Cummings is an enemy that few would wish to cross May 28th 2021

Emma Kenny

Cummings & Goings

For a man undergoing a national and very public grilling over one of the most controversial and divisive situations the UK has ever experienced, Dominic Cummings seemed, in my professional opinion as a psychologist, unnaturally comfortable.

Even the relaxed open-collared white shirt he chose to wear during his inquiry yesterday (26 May) seemed more fitting for a casual meal out with friends, as opposed to discussing an alleged national catastrophe – and one that he was very much a part of.

It was worryingly easy for Cummings to betray Boris Johnson, the man who once employed him as a close ally, and who invited him into his circle of trust. To the less discerning viewer, his apparent openness may perhaps wrongly be interpreted as transparency, authenticity, and even honesty. But having practised for 23 years as a registered psychological therapist, with a BSc in psychology, an advanced diploma in therapy, a master’s in integrative therapy and 14 years’ experience as a media psychologist, I have a few observations.

In cases where someone so effortlessly offers up damning anecdotal information, it is potentially less about stating facts and more about positioning themselves as the perfect sniper, seeming to carefully assassinate every single layer of their perceived enemy. In such cases, a deeper and more discerning analysis is required.

At every single point when questioned during Wednesday’s inquiry, he was very much in charge. He appeared not to see this as an interrogation, and he appeared to feel firstly in control, and secondly superior to those requiring his responses. While initially, his body language was closed, as he became more comfortable with his performance, his open posture, in my opinion, betrayed him – becoming both emotionally and literally laid back in his chair was at complete odds with the content of his conversation, particularly in regard to his own responsibilities and failures where the Covid strategy is concerned.

At every single opportunity, he carefully offered his analysis of certain important players, and he was clever in his tactics, using first names where Chris Whitty and Patrick Vallance were concerned, casting these as the heroes in this unravelling, and suggesting that he held a “closer” relationship with these apparent “good guys”, while he simultaneously dismissed professionals like Professor Carl Heneghan, and Sunetra Gupta – two highly accomplished experts in their fields – as if their views were irrelevant, dangerous even. To me, as a psychologist, this offers further insight into his feelings of superiority and righteousness in regard to his own behaviour.

While he did apologise for the role that he played, this apparently heartfelt statement fell short due to his constant blame of others. To be truly responsible and accountable for one’s own actions, it is essential that personal culpability is fully acknowledged. Instead, Cummings constantly rationalised that he was entirely powerless and beholden to an out of control, chaotic and personally irresponsible prime minister, and in particular the health minister.

Yet, there was a flaw in his own argument towards the prime minister, as by Cummings’s own admission, Boris Johnson didn’t wish to lock down at all. Instead, he saw the long-term potential catastrophe on the economy as a bigger burden for society to bear. Yet, Johnson did lock down – again, and again – which makes little sense if he was indeed the chaotic megalomaniac Cummings would have you believe.

When he suggested wanting a “dictator” with “kingly authority” to oversee the Covid crisis, it seems to me he was likely referring to himself. His judgements of most of those who surrounded him were full of contempt, and while it is possible that Matt Hancock could have made huge errors of judgement and failures that will be exposed, the vitriol that Cummings had towards him seemed personal, as opposed to professional.

There is absolutely no doubt that Dominic Cummings is a master of spin, brilliant at his work and an enemy that few would wish to encounter. He portrayed himself as a victim of circumstance and a good man who found himself silenced by a foolish leader, and while each will come to their own conclusion as to his integrity, it is always worth asking oneself if, placed in a similar position, would you be so willing to potentially destroy those you once counted as close colleagues and friends with such ease?

Lockdowns remain a contentious issue, their impact will be felt for generations to come and their effectiveness in the scientific world is still being fiercely debated, meaning that whatever Dominic Cummings’s personal feelings are, they are not enough to cast Johnson as the villain.

Emma Kenny is a psychological therapist (MBACP)

Meet the People Who Believe They’ve Traveled to a Past Life

Christopher was an ancient Egyptian prisoner. Stephanie’s dating the man who had her murdered. They and many others swear by the controversial benefits of past-life regression. Posted April 19th 2021


  • Michael Stahl


Could past-life regression be far less time-consuming and costly than traditional forms of psychotherapy? Illustrations by Micky Walls

Christopher Benjamin was imprisoned in Ancient Egypt, alone, barefoot and cold. The stone wall he leaned against felt frigid and bone dry. Through a small cutout in the high ceiling of his cell, a single beam of sunlight taunted him. Gazing up at the peephole, he sensed that the world on the opposite side was far warmer — but certainly not more welcoming.

“I felt like I had really screwed up because I told these people in charge what they were doing wrong,” Benjamin says. “I kind of shot my mouth off. … They did not like what I told them, and so they put me in this dungeon.”

Not a single soul was around to save him. “I’m screwed,” Benjamin thought to himself. “There’s no way out.”

Fortunately, in a version of reality more adherent to its traditional definition, there was an exit: his therapist’s office door.

The unsettling visions and sensations Benjamin experienced while imprisoned thousands of years ago were part of what he thinks may have been a past life. His mind traveled to that time and place during a session of past-life regression, a practice in which a person, under hypnosis or in a meditative state, experiences a memory that they believe is from a time when their soul inhabited another body.

Some who engage in past-life regression do so simply out of curiosity about their former selves, perhaps discovering they were a knight in shining armor — or the town wench who waited on one. But others hope to treat a range of mental health issues, including phobias, addictions, anxiety and depression, which they believe could have sprung from a past-life trauma. By reliving their trauma’s origin story, they hope to better understand, and possibly ameliorate, the emotional damage lingering in their current life.

The American Psychological Association is deeply skeptical of past-life regression’s viability, and there are serious questions about the ethics of using it as a treatment. But the practice’s most steadfast backers contend that its impact can be immediate, and far less time-consuming and costly than traditional forms of psychotherapy.

Eli Bliliuos, a Manhattan-based hypnotist who often employs past-life regression in client sessions, says that “just the shift in perspective” about a traumatic event, which can materialize instantaneously with past-life regression, “can be profound.”

In psychoanalysis, an analyst will explore their patient’s life events, particularly those from childhood, in order to bring unconscious material into the conscious mind. Then they work to restructure the personality adversely affected by those repressed, emotionally damaging experiences. This process can take months or even years.

But, Bliliuos says, “In hypnosis, you go always to the most important memory you’ve experienced,” whether that’s in this life or perhaps a previous one. Thus, in past-life regression, a reckoning can begin right away.

Christopher Benjamin — a 58-year-old Milwaukee resident who asked that his real name not be used here — started psychotherapy in 2012, seeking relief from lifelong anxiety, as well as depression that had more recently developed from what amounted to a midlife crisis. Though his therapist engaged him in typical talk therapy, she also employed some out-of-the-ordinary approaches. She did not hypnotize Benjamin, but instead focused his mind through brainspotting, a type of exposure therapy. Brainspotting administrators guide the patient’s eyes across their field of vision as they recall an upsetting event. Eventually, their eyes locate a position that activates feelings associated with the memory.

“All of a sudden, it’s sort of like magic, you’re blabbing out something that was completely hidden,” Benjamin says.

After a few moments of the patient’s eyes fixing on that point, their discomfort slowly fades — not unlike how someone with a fear of heights might grow more at ease the longer they remain atop a ladder.

In the brainspotting sessions, Benjamin recounted memories from his present life, like his uncle’s suicide 30 years earlier. He’d never truly grieved the loss, and he had an unexpectedly emotional response to the memory in his therapist’s office. He was, as he puts it, “a complete mess.” But reliving the memory helped him recognize for the first time that his family has a history of mental illness, which proved oddly comforting. It wasn’t his fault that sadness seemed to consistently find him; it was genetics. He realized he just needs to work harder than most to live a more emotionally healthy, content life, and he’d already taken that upon himself by seeking therapy.

When Benjamin visited Ancient Egypt during another round of brainspotting, he was focused on one particular manifestation of his anxiety.

“I can’t speak up,” Benjamin says. “I am deathly afraid of saying anything to offend people.”

His therapist asked him to think of an instance in which he felt betrayed by something he’d once said. Soon, his thoughts drifted from present-life memories down into the dungeon. Benjamin says it became “sort of self-evident” that he’d tapped into a past-life memory. He emerged from the past-life regression “more mindful” of the root of his anxiety, and determined to conquer the fear of speaking up that haunts him in this life.

Belief in reincarnation dates back to at least the oldest scriptural texts of Hinduism, written nearly 3,000 years ago in India. Greek philosophers like Socrates and Plato also entertained the idea, as did some Gnostic Christian groups from the turn of the first millennium, as well as 17th-century Jews who practiced Kabbalah. Today, 33 percent of American adults believe in reincarnation. (American women subscribe to the belief at a 12 percent higher rate — 39 percent overall — than men do.)

Notions of reincarnation are diverse and nuanced, but for past-life regression advocates like Eli Bliliuos, the New York hypnotist, “The basic belief is that we are souls; we choose to incarnate these bodies for purposes of learning from experience, growing from experience.”

In another past-life regression session, Christopher Benjamin found himself canoeing in wide-open waters, enjoying a day out with a person who he understood intuitively to be his younger brother. As the elder sibling, Benjamin naturally accepted the responsibility of caring for him, of protecting him from danger.

“It was really, really blue and wavy, choppy water,” Benjamin says, “and I don’t specifically remember how it happened, but he went overboard.”

His little brother disappeared, into the void of the water.

“And I’m freaking out, and then I can’t find him,” Benjamin continues. “And the rest of [the memory] is the dread of having to tell everybody what happened. That was hideous.”

Before this particular past-life regression, Benjamin was talking with his therapist about a strange preoccupation he’d developed with a co-worker, whom he found himself always trying to protect. In the canoe that day, Benjamin sensed that his past-life brother was a previous incarnation of the co-worker. The past-life version of Benjamin blamed himself for the man’s untimely death, but experiencing the memory altered Benjamin’s perspective about why he obsessed over his co-worker in this life. On some core level, Benjamin wanted to look out for him, to protect him. Going back to his past life, he says, helped him move on.

“After the session, I’m like, Oh, OK. Well that happened way back in the past, and I don’t have to worry about that anymore,” Benjamin remembers.

Sigmund Freud introduced the concept of regression in the 1890s, utilizing it in psychoanalysis, at first via hypnosis and then in talk therapy. Through regression, his patients could confront childhood events and traumas that continued to create emotional stress in their adult lives.


The Search for Bridey Murphy, a book written by Morey Bernstein and published by Doubleday, became a best seller in 1956 and was adapted into a movie. The 1995 New York Times obituary for Virginia Tighe — later Virginia Mae Morrow — said, “Bridey Murphy became a 1950s phenomenon rivaling the Hula-Hoop. There were Bridey Murphy parties (‘come as you were’) and Bridey Murphy jokes (parents greeting newborns with ‘Welcome back’).” The Times also reported that The Search for Bridey Murphy “triggered an interest in reincarnation and the use of hypnosis to regress a subject to early childhood, and perhaps beyond.”

During the following two decades, Ian Stevenson, then chair of psychiatry at the University of Virginia, chronicled about 3,000 cases of people from around the world, mostly children, with past-life memories, and Morris Netherton wrote what he claimed to be the first book in the field of past-life regression therapy.

Dr. Brian L. Weiss has been perhaps the most prominent American figure in the practice since the 1980s, publishing 10 books on past-life regression and related subjects. Once a traditional psychotherapist, Weiss — who declined to be interviewed for this story due to his busy schedule — has written that he was a past-life regression skeptic at first. But a hypnotized patient of his, whom he called “Catherine” in one of his books, recounted past-life memories that were so precisely outlined and, as it turned out, historically accurate, that he felt it was impossible she could have invented them.

Weiss has led mass past-life regression sessions, and he conducts five-day training workshops for psychotherapists and others. In a New York Times article that chronicled one of his training workshops, Weiss drew more than 200 people into a meditative state and encouraged them to walk through doors labeled with years such as 1850, 1700 and 1500, where some past-life memories may have resided. Weiss was quoted in the piece as saying, “Any good therapist can use these techniques and you can learn them in a week.” There are also online past-life regression courses, open to anyone, some as short as a few hours long and costing only about $100.

Matthew Brownstein, founder of The Institute of Interpersonal Hypnotherapy, says these quickie training programs are “holding back a glorious profession.”

“I’m actually appalled by what’s out there,” he says. “We worked really hard, myself and other leaders in the field, to make hypnotherapy a federally and state-level acknowledged occupation, and it’s technically illegal to train somebody as a hypnotherapist if you’re not licensed to do so.”

Brownstein adds that if a person could become a medical doctor in four hours, “it wouldn’t make medicine look all that appealing.”

He says the hands-on training offered at a school like his is extremely important. Aspiring practitioners of hypnosis should know that “some very dark stuff can come up.”

“Even though you’re looking for, just say, past-life [memories],” he continues, “there are a lot of other … really out-of-this-world phenomena that occur in the altered state that someone needs to be trained to deal with.” Examples of these special instances, Brownstein says, range from remembrances of alien abductions to “channeling,” when a client acts as a conduit for the spirit of a deceased person and communicates their messages.

Stephanie Riseley rested one hand on her husband’s chest as he died in his hospital bed after a months-long battle with leukemia. Riseley says her husband’s “heart just stopped, and then I had this feeling that he flew through me.”


Perhaps sensing that she was having difficulty forgiving herself for not doing enough to save him, he added: “You must honor what we had together and forget the rest. … Please forgive me. And forgive yourself for not being superwoman.”

Soon thereafter, her husband “started chatting with me and having wild sex,” Riseley says. “I am not the only widow that this happens to, by the way.”

Once she opened herself up to the idea that her husband could interact with her from the beyond, she says, he began running his energy through her body, and orgasms were a regular occurrence. The encounters — sexual and conversational — went on for nearly a year, before he had to move on to his next life.

“He ultimately became a little African girl,” Riseley says.

Riseley, who lives in Los Angeles, wrote a book about these interactions called Love From Both Sides. She is now a hypnotist who trained under Brian Weiss, and began guiding her own clients through past-life regressions in 2004.

She says that over the past eight years she’s also been “dating the man who had me murdered in my direct past life.”

Riseley believes she was a male descendant of the Rothschilds, the famed family of bankers, in her most recent past life. She says she was a Jewish doctor who lived in Germany as the Nazis rose to power. Riseley was married, but cheating with a showgirl, who also slept around with Nazi SS officers. When the past-life version of Riseley told the showgirl he was fleeing Germany with his wife and daughter, the showgirl sold him out to the Nazis, and the whole family perished in the Auschwitz concentration camp.

Riseley also says that she realized during her past-life regression into Nazi Germany that her wife from that past life had been reincarnated as her high school best friend in this life, while the showgirl returned as her current boyfriend.

“In that past life I was a complete narcissist: rich, entitled, totally self-involved,” Riseley says. “I wasn’t appreciative. … And in this lifetime? My lesson is to see what it feels like to be surrounded by complete narcissists.”

From exploring her past life, Riseley says, she is learning compassion and forgiveness. “You’re not supposed to die wanting to kill somebody, that’ll come and bite you in the butt,” she explains. “So I’m forgiving him.”

“Certainly, I can’t prove to anybody that past-life regression is real or not real,” says Eli Bliliuos, “but doing it as often as I do and having people have similar experiences, after a while it sort of proves itself.”

Bliliuos first sought out past-life regression himself in his late teens, after losing both of his parents within a year — his father died of cancer; his mother passed after a fall on icy subway stairs. He says that in his regressions into past lives, he has encountered his parents’ souls, which eased his anguish because he knew he’d come across them again in some future life. His subsequent journeys into other past lives — one sent him to 13th-century Italy, where he was a merchant with an unknown but debilitating illness — have continued to help him improve his outlook on this life.

Bliliuos and other advocates of past-life regression say that you don’t have to believe in reincarnation to benefit from the experience. He recalls one session with a client who told him that they thought their past-life regression was a figment of their imagination.

“That’s perfectly fine,” Bliliuos responded, but he asked them to at least consider whether the vision might be a message from their unconscious. If it had some relevance to their life now, then the important element of the experience was the lesson they took away from it.

“Who cares if it’s ‘real’?” he adds.

As effective as past-life regression may be for some, the practice has drawn plenty of criticism throughout the years. In a 1995 Chicago Tribune article, Dr. Mel Sabshin, who was the medical director for the American Psychiatric Association at the time, said that the organization regards past-life regression as “pure quackery.”

“[P]sychiatric diagnosis and treatment today is based on objective scientific evidence,” Sabshin explained. “There is no accepted scientific evidence to support the existence of past lives let alone the validity of past life regression therapy.”

The American Psychological Association (APA) calls past-life regression “highly controversial,” stating that most hypnotherapists are “skeptical of the practice and do not recognize it as a legitimate therapeutic tool.” The APA adds that “clinicians generally consider actual past-life enactments to be manifestations of psychopathology.”

Since the Bridey Murphy episode entered the zeitgeist in the 1950s, specialists have concluded that the hypnosis-induced interviews were the result of cryptomnesia, which the American Psychological Association defines as “people mistakenly believ[ing] that a current thought or idea is a product of their own creation when, in fact, they have encountered it previously and then forgotten it.” The APA likens cryptomnesia to “inadvertent” or “unconscious plagiarism.” It turns out that when Virginia Tighe — the woman who believed she was once Bridey Murphy — was growing up in Chicago, she lived across the street from a woman named Bridey. Tighe also had an Irish aunt who told her stories from the old country.

Gabriel Andrade, assistant professor of psychology at Ajman University in the United Arab Emirates, published a 2017 medical journal article titled “Is Past Life Regression Therapy Ethical?” He argued that “the reincarnation hypothesis … is not supported by evidence, and in fact, it faces some insurmountable conceptual problems.” He cites the world’s population growth as one example, asking, “Where did these additional souls come from?”


Past-life regression supporters, however, say that a subject will experience the past-life regression, even a ghastly death, only to recognize that their soul carried on, unabated, past the trauma, and they can take solace in that ultimate outcome.

“Nobody comes in [my office] and goes through a traumatic experience they can’t handle,” Bliliuos says. “It doesn’t work that way.”

But, as Andrade says in an email to me: “The client/patient may develop new phobias, and may have increased anxiety and depression, as a result of believing that he/she underwent traumatic experiences, even though these experiences may have never been real in the first place.”

In Andrade’s paper, he pointed to “the Satanic Ritual Abuse moral panic of the 1980s” in the United States as a far-reaching, stark example of the damage memory implantation can cause. Throughout the decade, hundreds of people underwent hypnosis to recover memories of sexual and ritual abuses, allegedly conducted against them during early childhood.

“A thorough FBI investigation was carried out, and no evidence whatsoever was found to support the allegations of sexual and ritual abuse,” Andrade wrote. Because their hypnotists had asked leading questions, “these false memories had come to be perceived as quite real by the subjects … who had to face the troubling consequences of having false memories of traumatic events that, in fact, had never happened to them.”

“My conclusion, then,” Andrade states, “is that it is better to play it safe.” He advocates that people seek out more evidence-based forms of treatment instead.

I am not a person of faith. I thank God for the gift of cookie dough ice cream as I savor the first spoonful out of a fresh pint, but for the most part, that’s about as spiritual as I get. The idea of reincarnation has never seemed plausible to me. I have had brushes with “the unexplained,” however. When I was about 3, and just learning to speak in sentences, I was sitting with my dad and his father, watching 30-year-old 8-millimeter home movies. The clips starred the two of them, great-grandparents and other deceased relatives of mine, as well as an uncle and some cousins, then in stages of toddlerhood and infancy. I’ve no memory of this, but my father says that, out of nowhere, I leaped from my seat, pointed at the screen, and yelled something to the effect of There she is that sunuvabitch!

“You went wild,” Dad tells me in a recent phone call. “And the way you talked was so clear, that was part of the shock of it.”

He has no idea who appeared on the TV that got me so riled up, and he says that my outburst was very out of character. A belief in past lives might suggest that I’d previously been one of my departed family members, or that I channeled someone else’s spirit and shouted out searing anger on their behalf. Personally, I’m pretty skeptical of either explanation.

So when I arrived at a past-life regression group workshop conducted by Johanna Derbolowsky, a Los Angeles–based spiritual healer, I maintained my robust skepticism. The event was held in Midtown Manhattan, at a community space geared toward mindful practices and performances. Derbolowsky, a German immigrant, speaks softly and has a calming presence. She characterized the workshop as a “past-life regression meditation,” devoid of hypnotism. She stood in front of a tightly quartered crowd of 38 attendees, almost all women.

I’ve been in talk therapy for nearly six years — treatment for a moderate panic disorder and depression — and I have experienced continuous progress throughout. In that, I’m a believer. But at the top of the past-life regression session, Derbolowsky mentioned that it didn’t matter if we regarded past lives as legitimate phenomena. She instructed us to close our eyes and focus on breathing deeply. As we relaxed, she told us each to think about something that we’ve frequently worried about. For me, that’s concern about my ability to financially sustain myself as a freelance writer. It often feels as though at any given moment I could be completely out of work. With the cost of living being what it is in New York City, I still have roommates at the age of 41 and no semblance of a retirement fund. The forecast for my future sometimes seems gloomy.

In the session, we each considered our fear, and then Derbolowsky said, “Now, I want you to throw it away.”

Just like that, I stopped worrying for a moment. My body relaxed even more deeply.

After engaging in some group sharing about giving our fears the mental boot, Derbolowsky again told us to close our eyes, breathe deeply and picture a vast closet, teeming with costumes. She then said something to the effect of “Find the costume you’re most drawn to, and try it on.”

I found myself fully engaged in the vision. I picked out an engulfing, furry animal skin shroud, dark trousers and high boots, like something Leonardo DiCaprio wore in The Revenant.

“Now, go out into the world with your costume on,” I recall Derbolowsky saying. “Look around, take in the sights, the sounds.”

I was in a deep wood. It was sunny, but water droplets falling from high above sporadically made the leaves around me flutter. A beautiful deer poked its head out from the foliage, eyeballed me briefly, and then disappeared. I felt at peace, but lonely.

After a few minutes, Derbolowsky told us to take ourselves into “the next scene,” and suddenly it was night. Still alone, I was eating meat cooked over a campfire. Still lonesome, I thought to myself, Look around for your family, but no figures emerged.

I took the vision as a reflection of my real-life loneliness and isolation, due in part to my career choice. (As a freelance writer who works from home, it’s not uncommon for me to go an entire day without having an in-person conversation with anyone.) I’m also currently single and, since childhood, I’ve always felt somewhat disconnected from my family, filling the “weird one” role nicely. The vision gave all of this new weight and immediacy, motivating me to address it somehow.

In a phone interview a few days later, Derbolowsky offers a different observation.

“You came up in the visualization as somebody that’s in the middle of nowhere, and his food and supply just appeared,” she says. “You didn’t have to worry.”

She told me that I am probably a pretty resourceful guy, and that because I’ve figured out how to take care of myself for this long, there’s no reason for it not to continue.

“Yeah, I just have to keep pluggin’ away,” I say.

“Obviously you have to act and look and do,” she adds, “but you also have to know that opportunities will show up.”

Though I took more than I expected away from Derbolowsky’s workshop, it did not reveal that I was once an influential philosopher, a warrior leader, a powerful sultan or any other figure of particular import. The most popular examples of past-life regression found on the internet feature such grandiose characters — like the actress Shirley MacLaine’s yarns of having sex with Charlemagne, and her belief that her dog is the reincarnation of an Egyptian god. To that point, hypnotist Eli Bliliuos says, “If you say that you were Eisenhower, you’re gonna get more hits on your Facebook.”

While Derbolowsky says that one client of hers regressed not only to prehistoric times but also into the body of a saber-toothed tiger, she, like Bliliuos, confirms that most people who experience past-life regressions recount ordinary lives of fairly common folk.

illustration of a woman holding a cat-like creature in front of a painting of Anubis

Derbolowsky insists that suggesting such grandiose fodder is “useless.” The value in past-life regression does not materialize from the status of the person you were, but rather from how the experiences are relevant to your current life.

Along with his visions in Ancient Egypt and of canoeing with his brother, Christopher Benjamin has also had a third notable past-life regression.

“I’m dancing in a group setting and, it sounds so cheesy, but as like a flamenco dancer,” Benjamin says. “It felt like Spain or Portugal — I don’t know. But I’m picturing it was the 1800s.”

Benjamin had told his therapist in session that day that he wanted to focus on the malaise and lack of joy that had settled into his life, compliments of his midlife crisis and his family’s history of mental health struggles. She instructed him to think of a time when he didn’t feel so disgruntled and numb — a time, instead, when he relished life.

“And I’m dancing with this woman, who is extremely beautiful, and I desired her with that, like, burst of passion when you first meet somebody,” Benjamin continues. “It represented being alive, with passion that I lacked.”

As the remembrance continued, he slowly understood that the soul of the woman he’d danced with a couple hundred years ago was that of someone still very close to him today.

“I realized that it was my daughter,” he says, “which is freaky when you think about it because you don’t want to think that way about your daughter, but I did.”

Putting the unwanted implication of incest aside, Benjamin says that the vision “was so refreshing because, when you feel so bad, to see a time when you did feel good is so important. And it made me feel so much better.”

Whatever he may or may not have actually experienced during that dance, Benjamin says he takes great comfort from the fact that he — his soul — is loved, has always been loved, and will always be loved.

Michael Stahl is a freelance writer, editor and journalist based in Queens, New York, whose work has been published in many print and digital publications, including Rolling Stone, Vice, Vulture, Village Voice, Mic, Quartz, and CityLab. His first book, Big Sexy: Bartolo Colón in His Own Words, was published by Abrams Books in May 2020. He is a contributing editor and the layout manager at Narratively.

How Not to Care When People Don’t Like You April 19th 2021

Everyone is disliked by someone. Don’t let it slow you down.


By Rebecca Fishbein


When I was in high school, I found out that my friends didn’t like me. One of the girls in my “group” told me I wasn’t invited to a birthday party because “everyone” thought I was annoying—which, to be honest, at 15 I probably was—and for months I was ostracized. It took some time for me to worm my way back into the gang, but until then, I was devastated, and I swore I would spend the rest of my life being likable.

But, as David Foster Wallace (sorry) wrote in Infinite Jest (sorry again), “certain persons simply will not like you no matter what you do,” and no matter how likable you think you are, you’re not going to win over every person you meet. “Remember that it is impossible to please everyone,” Chloe Brotheridge, a hypnotherapist and anxiety expert, tells us. “You have your own unique personality which means some people will love and adore you, while others may not.” Of course, while this concept is easy to understand on its face, it’s difficult to keep your perspective in check when you find you’re, say, left out of invitations to happy hours with co-workers, or getting noncommittal responses from potential new friends, or you overhear your roommates bad-mouthing you. Rejection is painful in any form, whether it be social or romantic, and it’s a big ego blow to get bumped from the inner circle.

Before you freak out, keep in mind that it’s not just normal to be occasionally disliked, but in fact, it’s healthy. Rejection is a way to suss out who’s compatible with whom, and just as getting romantically dumped by someone leaves you open to finding a better suited partner, getting axed from a social group gives you space to find folks that are a little more your speed. Plus, it’s empowering not to fear being disliked—not that you should run around violating social norms, but when you’re not wasting energy molding your personality to someone else’s to be accepted, you’re more likely to find people who genuinely like you for you, and those relationships are far less exhausting to keep up.

Still, it sucks to feel disliked. Here’s how to get through it without falling down a rabbit hole of sadness.

It’s okay to feel the pain

Humans are social creatures, and so we experience painful biological responses to rejection. “Historically it was essential for our survival,” Brotheridge explains. “When we were evolving and living in tribes, being rejected and kicked out of the community would have been a matter of life or death.” When we get rejected, our brains register an emotional chemical response so strong, it can physically hurt. We’re also likely to cycle through a series of responses that’s not dissimilar to the stages of grief. First, the blame game starts. “The first stop on the train is self blame: ‘It’s my fault, I did something to upset them,’” Sean Grover, a psychotherapist and author of When Kids Call the Shots, tells us. Up next is shame: “You feel ashamed, you feel humiliated, you feel weak,” Grover says.

Then, like any dumped individual, you’ll probably try to win back your rejecter. “Not because, necessarily, you want them to like you, but you just don’t like this feeling of being disliked,” Grover says. “It’s, ‘Let me get you to like me so I can feel better about myself.’” Last but not least, you’ll likely feel like you’re a failure, and that’s when it gets dark. “These are very, very, primitive early feelings. For somebody not to like you, it induces a regression,” Grover says. “Generally, that brings you back to high school, middle school, elementary school, when it was all about whether you’re cool or not. Once you get caught in the feeling, it really pulls you under, and then you’re struggling.”

These feelings aren’t exactly pleasant, but they’re also perfectly healthy and normal, so long as you don’t end up dwelling on them, preventing yourself from moving forward.

Know that it’s not (totally) your fault

This type of rejection is literally personal, and it’s easy to start questioning your self worth when someone makes it clear they don’t like you. But we all act out of our own insecurities and unique experiences, and for the most part, being disliked is a measure of mutual compatibility. So, it’s not really that it’s not you but them, so much as it’s both you and them. “This person, this situation, where they are in their life, it’s not compatible to where you are,” Jennifer Verdolin, an animal behavior expert and adjunct professor at Duke University, tells us. “We have preferences in terms of personality, and that’s not to say that your personality is bad. It’s different from mine, and I prefer to hang around people who are similar to me.”

Sometimes, the people who dislike you don’t think certain facets of your personality jibe with theirs; sometimes, you just don’t offer them enough social capital to be worth their time. “Because we’re a very social species with a pretty intense dominance hierarchy, especially when it comes to work, and sometimes in social situations, people make specific strategic alliances and switch alliances as it suits them to meet their needs as they define them,” Verdolin says. “So people will try to achieve status, and a lot of time, whether they like you or don’t like you may have nothing to do with who you are.”

Either way, likability has a lot to do with what you bring to someone else’s table, whether or not you realize it. “We see this in all kinds of species. They preferentially tend to spend time, outside of mating, with either individuals who are similar to them in status, individuals who are similar to them in personality, individuals who are similar to them in some sort of way genetically, so, family,” Verdolin says. “So if you don’t have anything in common that is equally valuable to both parties, then you will likely be rejected. It’s kind of an inevitability.”

But watch for signs of your own bad behavior

While you shouldn’t always blame yourself if someone doesn’t like you, if you’re finding this is a pattern, you may want to take an unbiased look at your own behavior. “When I put people in a [therapy] group, I get to see immediately what problems or tics or bad social habits they have,” Grover says. He recalls a successful, handsome male patient of his who was having trouble holding onto romantic relationships. Though they were unable to solve the problem together in individual therapy, Grover managed to convince the patient to join a group. “Within five minutes, I was horrified,” Grover says. “He gets very anxious in front of people, and to camouflage his anxiety he becomes overly confident, which comes across as arrogant. The women in the group commented that he was becoming less popular the more they got to know him.”

The patient’s anxiety was manifesting in such a way that he had difficulty relating to people in a social setting, but because our own egos tend to protect us from our faults, he wasn’t aware of his bad habits. “I had to help him be aware of how his anxiety manifested,” Grover said. “Anxiety can make people act aggressive or really anxious, and in a group situation it’s super effective to see that.”

One way to find out what’s going on, Verdolin says, is to ask for feedback as to why you’re disliked. Then, if someone tells you, say, you’re annoying, or overly braggy, or self-obsessed, you can take a step back and analyze whether there’s some validity to the criticism. “Ultimately you have to know who you are well enough to say, okay, that information sounds pretty valid, I do tend to do that, I can see why that might not be attractive to other people, so I’m going to work on changing it,” Verdolin says. “You might be being given important information that you should take a look at seriously, and evaluate to see if there’s truth to it.”

Still, remember that while some of your behaviors might turn people off, likability is typically a two-way street. “It is, more often than not, some sort of reflection of [the other person’s] history, their prejudices, their fears,” Grover says.

Remind yourself that making new friends is no easy task

One of my greatest fears is that I’ll start a new job or move to a new place where I don’t know anyone and have to make new friends. Changing your social circle can be isolating; it’s when you’re most likely to feel disliked or suffer from social anxiety. “I think we have a little bit of an unrealistic expectation that we should be able to [enter social groups] anywhere, with all people,” Verdolin says. “When you’re first trying to establish rapport in relationships with people in, say, a new work environment, you’re coming into a dynamic that’s already set in structure. There are already cliques, there are already personalities, there are already dynamics, and you have no idea what you’re stepping into.”

Verdolin suggests that people faced with starting a new job or making a big move start slowly to get a sense of their new social environment. “With animals, sometimes they’ll integrate by having a sampling interaction with everyone else in the group before making decisions, to kind of get a lay of the land, so to speak, before trying to jump right in,” Verdolin says. At a new job, for instance, it might be worth suggesting going to lunch with folks one-on-one, to find the group’s friendliest entry point. “Some people are very welcoming and some people are not,” Verdolin says. Get to know people slowly, and focus your energy on those who seem most receptive, rather than the group’s most exclusive members, or toughest nuts to crack.

Spend extra time with the people who do like you

Even if you find yourself on the outs with some folks, chances are, you’ve at least got a few people you can rely on when you’re feeling low. “Spending time with people that care about you can boost your self-esteem and help you to feel more secure,” Brotheridge says. Besides acting as a balm to your wounded ego, focusing your energies on relationships with people who appreciate you will, in the larger picture, be a much more fulfilling use of your time and social energy.

And keep in mind that the best way to make genuine friendships is to be genuine yourself. “If you just walk around wanting to be liked, it’s very stressful, and people will read that as inauthentic,” Grover says.

And tell the haters to suck it

At least, tell them in your head. Grover says that when all else fails, it’s best to embrace having the occasional enemy. “Delight in it. Really, just enjoy it,” he says. After all, as Grover says, sometimes it’s actually better to be formidable. “If people are jealous or whatever, all feelings are welcome.” You don’t need to go around antagonizing people, but if someone doesn’t like you and the feeling is mutual, you don’t necessarily have to go out of your way to appease them, either.

How the Physical Body Holds Mental Tension

The connection between mind, body, posture, and stress. Posted April 18th 2021

By Tami Bulmash, April 6

Image: Alexander Jawfox/Unsplash

If you’ve ever endured a nerve-racking situation followed by a throbbing noggin, it wouldn’t seem far-fetched to connect one with the other. In the United States, nearly one in four adults reports experiencing multiple headaches every year, and the World Health Organization estimates that 50% of all adults have at least one headache annually. Though there are over 150 types of headaches, tension headaches are the most common and are often triggered by stress. Yet while doctors might agree the two can be linked, they still don’t understand exactly how.

Brian Cole, MD, an orthopedic sports medicine surgeon at Midwest Orthopaedics at Rush and a professor of orthopedics, anatomy, and cell biology at Rush University Medical Center, agrees with this sentiment. “The exact reason why stress creates headaches is still unclear. One theory is that muscle tightness in the neck and head, which can reflexively increase with stress, results in dull tension headaches.”

It’s reasonable to presume a relationship between a headache and stressful thought exists because of where the brain is located. Since thoughts tend to be associated with the mind — which is often synonymous with the brain — they all appear to reside in the same place. However, the farther down pain travels in the body — further distancing itself from the head — the murkier the relationship between thoughts and tension becomes.

When pain is felt, it’s usually assigned to a specific body part. Localizing pain to a certain region inadvertently disconnects it from the rest of the body, thus making its origin harder to pinpoint. For instance, there is a low likelihood a medical doctor would suggest the onset of torticollis, a form of neck strain, is the result of ruminating thoughts. Linking kyphosis, or hunchback, with depression sounds even less plausible. But are connections like these really so hard to believe? Not according to Erik Peper, PhD, a professor of holistic health studies at San Francisco State University.

The manifestation of thoughts in the body

In his study, “How Posture Affects Memory Recall and Mood,” Peper illustrates how thoughts and body tension are interrelated. Participants were asked to recall negative memories while sitting in slouched and upright positions. They were told to do this again, but this time to think of positive memories. Peper’s research found that 86% of participants reported it was easier to access negative memories when collapsed than upright, and 87% of participants said it was easier to access positive images while erect than hunched over.

Peper explains, “If you are in a collapsed position, I think almost everybody finds it harder to access or be really involved in a positive memory and vice versa. When you’re in the upright position, it doesn’t mean you can’t have negative or hopeless thoughts. However, if you keep that opposition — as if you’re slightly removed from the emotional impact, once you’re in the upright position, it’s easier to access the more empowering and positive thoughts. Because thoughts and body are not separate.”

The evolutionary patterns of posture are described in Peper’s book, Tech Stress, where he examines the prevalent symbols of power throughout evolution. “A depressed or collapsed posture is a universal symbol of constricted posture in both humans and animals. In nature, throughout the animal kingdom, collapsed posture indicates submission. On the other hand, erect posture universally indicates leadership.”

But don’t get too excited about straightening your back just yet. There is a difference between sitting tall — which connotes length — and sitting straight. The latter has become a well-known way to command sitting correctly. Thoughts evoked by picturing a “straight” back might resemble a soldier pose, the most common depiction of what “standing up straight” looks like. Yet, if the images associated with a straight back were truly congruent with alignment — or balance for that matter — good posture would not be analogous to a fixed and tense figure.

The power of habit

The desire to perform activities successfully is a deep-seated norm. “Sitting up straight” is the perfect example of how people may unwittingly arch backs, lift chests, pull back shoulders, and tighten necks — namely contort their bodies — in a strained attempt to do it the “right” way. Because the alternative — collapse — represents defeat.

A century-old method known as the Alexander technique was developed upon the principle of unknown habits. The founder of the technique, F.M. Alexander, was a Shakespearean actor who repeatedly lost his voice during performances. He thought he was doing everything right, yet his problem only worsened.

Alexander sought the aid of medical professionals to fix his problem but to no avail. He came to wonder if perhaps it was something he was doing to himself that caused his laryngitis. He learned that once he stopped repeating certain habits — such as standing soldier-like while performing — his voice returned. More importantly, he discovered the self-imposed pressure to perform successfully resulted in tense and tight positioning that constricted his breathing and voice.


Common contributing stressors

Wanting to succeed in life is part of human nature. Social comparison theory tells us that individuals determine their own worth based on thoughts of how they stack up against others. Yet, comparison doesn’t always lead to self-improvement. For example, social media has made measuring up seem impossible as it takes a considerable toll on self-esteem and can lead to depression.

In addition to the pressure of achieving or maintaining success, Americans are notorious workaholics. They work long hours, take few vacations, are more likely to work at night or on weekends, and feel pressured for time. It’s no wonder Americans are among the most stressed people in the world.

Beret Arcaya, a master Alexander technique teacher and the founding member of the American Society for the Alexander Technique, describes this as a “habit of thoughts.” She says this is something we do with our brain and we don’t have to. “You don’t have to be on that rat race. You really don’t. But you’ve got to get conscious of when you’re doing it.”

The way you react to each situation is a choice, though until that choice is realized, the reaction remains a habit. When it comes to lack of time, Arcaya explains, “You have to say, ‘Ah, I was just thinking that way again — I don’t have any time, I don’t have any time, I don’t have any time!’ Wait a minute. Stop and feel what happens to your breath.”

Arcaya tells her students there may never be enough time, but that’s fine. There might even be a way out of the rat race if you don’t react to it. She says, “Don’t even argue with the habit. Don’t even argue with ‘I don’t have enough time, or I didn’t have time today.’ It will make you stay in the moment, and that will elongate the time.” In other words, wait before responding to a stressful thought because that delay keeps you in the present.

Psycho-physical pain

Having taught the Alexander technique for over 40 years, Arcaya is no stranger to human behavior and how habits present themselves as tension. A renowned teacher, she is often sought out by people who have exhausted all other options to ameliorate their pain.

One such woman came to her after suffering for years with terrible neck pain. Doctors couldn’t find a way to help her. They didn’t consider her stress level even though thoughts of neglect, abandonment, betrayal, and loss consumed the woman’s life. She developed a wrenching spasm of the neck called torticollis shortly after the passing of someone beloved to her.

Arcaya recalls that it was far beyond a stiff neck; the woman was essentially trapped in her pain. The pain made it impossible to eat or walk properly, forcing her to walk sideways. It seemed the woman couldn’t face the events of her life. Her head was turned away, and the tension locked it there. It was as if it was all too much; she couldn’t even look at it.

When asked if she believed thoughts could directly impact the body, Arcaya responded, “Thoughts and the way you think — and the whole way you are when you are thinking — is your body tension. It doesn’t impact it; it is it.”

See it to believe it

As past president of the Association for Applied Psychophysiology and Biofeedback, Peper is an established biofeedback expert. His work focuses on training individuals to become aware of and gain control over body functions with the aid of electromyography (EMG). During a treatment, biofeedback sensors attached to the skin measure the body’s biological signals, which are displayed as feedback to inform and assist health improvement.

The psycho-physiological principle asserts that “every change in the physiological state is accompanied by an appropriate change in the mental-emotional state, conscious or unconscious, and conversely, every change in the mental-emotional state, conscious or unconscious, is accompanied by an appropriate change in the physiological state.”

Another study led by Peper, who was instrumental in establishing the first holistic health program at a public university in the United States, recorded physiological signals demonstrating the mind-body connection.

A 25-year-old participant who had been playing the piano for 16 years was asked to relax and then imagine playing a musical piece in a series of intervals. Muscle activity was recorded from her right forearm extensor muscles and displayed on a large screen so that other group participants could observe. Each time she imagined playing the piano, the forearm extensor muscle tension increased, even though there were no observed finger or forearm movements. The physiological monitoring showed how her body responded to “playing” only in her thoughts. When recordings of her movement were shown later, she reported being completely unaware of activating her muscles — especially since her forearm appeared to stay in a relaxed position.

If the mere conscious thought of performing an activity (such as playing the piano) can evince body tension, what can be said about the impact of unconscious thinking? This raises the question of what might happen to the body with repeated thoughts of anger, resentment, and hopelessness — or with thoughts of kindness and love.

What you can do (or not do) about your thoughts

According to Arcaya, thoughts don’t have to become tension. She says, “We’re always directing ourselves. We don’t realize it a lot because it is subconscious.” She notes that direction provides a purpose toward a goal that serves to guide or motivate. “To have conscious direction is to take agency over your life. It’s to have self-possession. To take your energy and decide what you will and what you will not do.”

When thought is attached to an expectation and the outcome is unmet, the reaction will translate into tension. That is unless the habitual response is recognized first. The next time you sense the slightest tinge of stress — whether you’re running late or feeling angry, frustrated, or generally inadequate — just pause. Take note of what is happening in your body. Catching that moment is key. Pay attention to your thoughts. Giving yourself even 10 seconds to pause might allow you to substantially calm your nerves and prevent you from going down a familiar spiral that could lead to stress and pain.

Albert Einstein is often credited with saying, “The definition of insanity is doing the same thing over and over again but expecting different results.” A habit can only be changed if it is recognized. Then it’s up to you to choose which direction to take. You could repeat the same habit and get the same outcome, or pause and see if a new path presents itself. Who knows, you might even bypass a headache (or two or three or 20) along the way.

When the mind is dark, making art is a thrilling way to see – posted April 17th 2021

Art from the Heart

Adam Zeman is professor of neurology at the University of Exeter Medical School in the UK. His interests lie at the intersection of neurology, psychology and psychiatry. He was chairman of the British Neuropsychiatry Association from 2007 to 2011, and is the author of Consciousness: A User’s Guide (2003) and A Portrait of the Brain (2008).

Edited by Christian Jarrett

In the early 2000s, I encountered a patient who had lost his previously active ‘mind’s eye’ following neural complications arising from heart surgery: he could no longer conjure up the faces of family and friends, visualise places he’d visited or imagine scenes in a novel; even his dreams were now without imagery. Before his loss, the vividness of his imagery approached ‘real seeing’ – afterwards he ‘saw’ nothing in his mind, though he could still think perfectly well about objects and concepts, and he functioned well in everyday life.

I was intrigued: visualisation is, for most of us, such a central ingredient of our mental lives, at work in memory, reverie, dreaming, reading, problem-solving and creativity. Think of an apple for a moment. Was it green or red? Shiny or gnarled? Was there a stalk? Could you have used the stalk to twirl the apple on request? If you can answer these questions, or at least they make sense to you, it’s likely that you experience ‘visual imagery’, as most of us do.

Given his deficit, it seemed remarkable that our patient – MX, as we called him – was able to function so well. In fact, MX was an ideal research participant – intelligent, interested, cognitively intact in all respects bar this single, subtle but striking deficit. Using functional MRI scanning, we showed that, while his neural activity was normal when he looked at famous faces, once he tried to visualise them mentally, he failed to engage the visual parts of his brain as you or I would if we engaged our mind’s eye (assuming you can visualise). His presentation matched a disorder of ‘visual imagery generation’ previously described by the American psychologist Martha Farah, drawing on the background work of another psychologist, Stephen Kosslyn. I found MX’s case fascinating and odd, but unlikely to lead much further. Then events, as they sometimes do, took an unexpected turn.

In 2010, the US science journalist Carl Zimmer described our work on MX’s case in the science magazine Discover. Over the next few years, 21 people contacted me explaining that they’d recognised themselves in Zimmer’s lucid description of MX, but with one difference – they’d never been able to visualise. For as long as they could remember, they’d been aware of this quirk in their psychological nature: while others seemed to have a somewhat visual experience when they thought of an apple, or their front door, or their last holiday, these folk saw blackness.

If you’re aphantasic, and you want to know what something looks like, one solution is to draw it

Reading around the topic, I discovered that this quirk had been commented on before. Francis Galton, the first psychologist to measure the vividness of visual imagery, recognised in the 1880s that a small number of those whom he approached lacked imagery entirely: their ‘powers of visualisation … [were] zero’. But oddly this intriguing observation had lain in the long grass ever since. One reason might have been that it lacked a convenient name. So when my colleagues and I came to describe in the research literature the 21 individuals who’d made contact with us, we decided to coin one. At the suggestion of my classicist friend David Mitchell, we borrowed from Aristotle, who used the term ‘phantasia’ to denote the mind’s eye, adding the prefix ‘a’ to denote absence. In 2015, we published our paper reporting these participants as having ‘congenital aphantasia’.

Five years on, I have heard from around 14,000 people with unusual mental imagery capacities, most of them at the aphantasic end of the spectrum, but also a minority with ‘hyperphantasia’, who experience imagery ‘as vivid as real seeing’. Depending on the precise definitions used, aphantasia is estimated to occur in around 3 per cent of the general population, and hyperphantasia in around 6 per cent.

Through questionnaire and face-to-face studies, we have found some consistent associations with these extremes of imagery vividness, helping to assuage the worry that they might simply reflect vagaries of introspection. People with aphantasia are more likely to work in science, mathematics and IT, while those with hyperphantasia are more likely to work in traditionally ‘creative’ industries. Aphantasia is associated with difficulty with autobiographical memory and with imagining the future; people with aphantasia also tend to be relatively poor at recognising faces (which is intriguing, as this is a perceptual rather than a representational problem) and they’re diagnosed with autistic spectrum disorder at above-average rates. Extremes of imagery tend to cluster in families, hinting at a genetic link or links. Some people with aphantasia report avisual dreams, but many dream visually, which suggests wakeful and dreaming imagery depend on somewhat separate processes. Some lack imagery in any of the senses – no mind’s ear or mind’s nose, for example – while others have vivid imagery in at least one other sense. In short, aphantasia is almost certainly more than one ‘thing’.

You might be feeling that, on the whole, people with aphantasia have a raw deal but, while it has some drawbacks, there are advantages too. For example, lack of distraction by emotional imagery might allow people with aphantasia to be ‘present’ in ways that some of us might envy, and it’s abundantly clear that aphantasia is compatible with high achievement: Craig Venter, who led the first draft sequences of the human genome, Ed Catmull, past president of Pixar and Disney Animation Studios and recent recipient of the Turing Award for his work on computer-generated animation, Blake Ross, co-creator of Mozilla Firefox, and Oliver Sacks, the acclaimed neurological author, have all declared their aphantasia.

We are immensely complex creatures: the possession of visual imagery is one small piece in the huge jigsaw of cognition

When I first encountered people lacking visual imagery in some numbers, I was fascinated by this invisible variation in human experience. I wondered how to make sense of it. My first pass was that it probably reflected an extreme form of the old distinction between ‘verbalisers’ and ‘visualisers’. I assumed that people lacking imagery would be more dependent on language and generally less interested in the visual world than those with vivid imagery. My colleagues in our Eye’s Mind project at the University of Exeter – two historians, a philosopher, an artist and a neuroscientist – broadly agreed. The finding that aphantasia predisposes people to work in scientific rather than creative trades seemed to bear out this assumption. Imagine our puzzled surprise, then, as aphantasic visual artists got in touch in growing numbers – about 150 so far, alongside a smattering of actors, authors and architects. Why would you want to depict a world you can’t imagine? How can artists create without any imagery to guide their creation?

I realised belatedly, after hearing from these people, that aphantasia might sometimes increase rather than reduce interest in the visual world. People with aphantasia see perfectly well and can have an eye for beauty. In fact, deprived of the ability to contemplate the look of things in their mind’s eye, their visual attention to the here and now might be heightened. And lacking a mind’s eye might also increase the motivation to represent the visual artistically – as my colleague, the cultural historian Matthew MacKisack pointed out to me, if you’re aphantasic, and you want to know what something looks like, one solution is to draw it. This meshed with the report from many of our study participants that they used photography extensively to capture appearances they would otherwise be unable to retrieve. As Sheri Bakes, a Canadian artist who – unusually – had lost imagery following a stroke at the age of 29, told us: ‘The paintings have become the picture inside that I can’t see.’ She wrote elsewhere: ‘After the stroke, I taught myself to paint before I could even walk or speak again properly; it was the only thing I could really do.’

Increasingly struck by the accounts we were receiving, in 2019 we mounted an exhibition of work by aphantasic and hyperphantasic artists at the Tramway in Glasgow and then at the Royal Albert Memorial Museum in Exeter – Extreme Imagination: Inside the Mind’s Eye, curated by MacKisack and the artist Susan Aldworth (now accessible virtually). I would challenge any visitor to ‘diagnose’ which category each artist belonged to. Artists with aphantasia produced works ranging from the photorealist to the abstract, conjuring vivid detail as well as suggestive ambiguity. But their accounts of their artistic process were distinctive.

The point of departure for the aphantasic artists’ work is not a visual image, but a ‘feeling’, ‘emotion, desire, energy’, ‘thoughts and ideas’ or a sense of a spatial arrangement. Some then make use of existing materials, through collages or reworkings of famous paintings; others rely on drawing from life, or refer to archives of images and photos; a third group, intriguingly to me, describes using the developing work on the page or canvas as a kind of externalised imagination, allowing them to ‘push and engage with the appearing image’, sometimes with a ‘thrilling sense of stepping out into the unknown’.

A lifetime of looking, of drawing and – I suspect – a wealth of unconscious imagery inform aphantasic artists’ use of their work to dream itself into being. Catmull was startled to discover that one of his favourite illustrators, Glen Keane, animator of The Little Mermaid (1989) and Beauty and the Beast (1991), lacks imagery, like Catmull himself. Yet Keane was named a Disney Legend in 2013, and won a 2017 Academy Award for his animated film made with Kobe Bryant, Dear Basketball (2017). I’ve seen a clip of Keane drawing that shows him peering intently at a preliminary, indecipherable scribble, hunting for the cues that will help him to conjure a vivid drawing from the page. Intriguingly, he belongs to a lineage of artists – his father Bil was a celebrated cartoonist, and his daughter, Claire, helped to illustrate Frozen (2013).

Our finding that aphantasia predisposes people to work in the sciences suggests that our initial assumption, that it would militate against a career in art, was not wholly wrong. But we are immensely complex creatures: the possession of visual imagery is one small piece in the huge jigsaw of cognition. There are many ways to represent things in their absence besides a sensory image – language for example and, as our exhibition illustrated, art! Our work with aphantasic creatives drives home another broad conclusion: we shouldn’t confuse visualisation with imagination, the far broader capacity to represent, reshape and reconceive things in their absence. Imagination can certainly make use of imagery – but it doesn’t have to. As the examples of Bakes, Catmull, Keane, Ross, Sacks and Venter amply demonstrate, aphantasia is no bar to an imaginative life. If, reading this, you come to the conclusion that you too might lack imagery, or have it in spades, don’t hesitate to contact me, as we still have much to learn.

Danger Signals by R.J Cook. The following two articles are also on the Hate Male Page. April 11th 2021

I studied social sciences at the University of East Anglia in the early 1970s – eventually majoring in economics following a good grounding in all of them. The key issue with all was grounding, verifying and evidence gathering. Marxism and concern for the underclass was fashionable then.

Returning to postgraduate study at London University in the late 1970s, I renewed an interest in psychology which had begun when, aged 12, I read one of my late father’s encyclopedias on the subject. Following his death I developed acute OCD and anxiety issues. Paying attention at school was an issue for me. I was bullied and didn’t want to be there.

Luckily, during my formative years – alternating between a secure rural community and a snug North London community in the 1950s, where there was no shortage of role models regardless of our material poverty – I already had an advanced reading age. So in London, I combined studying to be a teacher with evening sessions on the part-time MA Psychology course.

I never had the temerity to go through with plans to be an educational psychologist because I doubted my ability to use their methods to make an accurate, safe assessment. So my note of caution regarding the following is based on a feeling that the real issue about girls and boys diagnosed as having ADHD has much to do with society’s influencers using the education system for social engineering.

At Goldsmiths’ College, we were told that our mission in education was fighting sexism and racism by challenging racial and gender stereotypes. I took this all too seriously, leaving London for my native Buckinghamshire as one of the advanced guard warriors.

To cut a long story short, I took matters to their logical conclusion. In between universities, I had abandoned plans to become a military pilot for the mythical Britain I had grown up in. Infected by hippy unisex culture at the University of East Anglia, I was dismayed when a female officer psychiatrist told me the job might well resolve itself into me being required to kill without question.

Males and females of her exalted position do not get there without an identity based on social position and a good education. The problem these days is that the wider society – the rainbow multicultural world where we can supposedly be whoever and whatever we like – offers no sound basis for real choice, given what young people have become in a world without secure role models. In short, it is a con.

For my part, I took all of this gender identity choice too seriously, teaching and writing a book about it. My ex-wife was not impressed – a complex and difficult subject in its own right. It is only relevant here because, coming from an earlier age, ADHD was the least of her problems.

As my marriage failed miserably over several years, she was not happy when I started thinking I might have a female brain and should dress and modify my behaviour accordingly. The outcome of her and other people’s responses was interesting, to put it mildly. As an inveterate writer – though maybe not a good one – I found there was an element of experiment on my part.

The world of transgender has officially been closed to question and scrutiny, but that won’t prevent prejudice, ostracism and social punishment. No doubt experts are working on gender reassignments where patients may have full reproductive potential, meaning that in richer countries gender may become a matter of choice. But gender alone is not a route to happiness or contentment. More of a problem for society is the nonsense that we can ignore nature while attempting to control it. Looking back over 18 years of teaching, I saw the system shifting, slowly at first, softening people up, toward outright brainwashing. Now it is at warp speed.

Economic and social change – outside of the Islamic world at the moment, but coming to them soon – has broken traditional families and communities. Schools and media are working flat out for change. The problem here is that once-docile girls are beginning to rebel, attracting the same ADHD label as the boys. Shut up with an ever more constrictive national curriculum and robotic, target-driven teachers in schools managed by gender/racially neutral super-clones, they are expected to pay attention and absorb it all, shape-shifting accordingly in mind, if not body.

For my own part, having gone to the analyst my ex-wife insisted upon, I was told to find the ‘woman within’ on the basis of a book I had written. I have already said that I am an inveterate writer (and photographer). So I took him seriously, monitoring my changes while planning a book on how the hero/heroine might contrive to have that woman within murdered – rough draft on this site.

Interestingly, I was told by the gender identity clinic that they could not complete the process until I agreed to anti-psychotic drugs. These are basically zombie drugs, affecting motor control, bowel movements and memory. Maybe some people in our poorly educated society are desperate or ignorant enough to accept them.

It seems to me that what we have in the following is more pseudo-science – of the kind I wasted years studying – used to justify more spending to adapt young women to take seriously some very serious and unnatural diktats packaged as education for their own freedom.

‘The Divided Self’ Robert and Roberta Cook (take your pick)

Young Robert Cook with his beloved and constant companion Yamaha Classical guitar, 1980.
Image Appledene Photographics Archive.
Roberta Jane Cook, 2018. She was told that to complete her gender reassignment surgery ‘she’ must agree to anti-psychotic drugs. Otherwise, their female psychiatrist went on record to say ‘Roberta has a secure female identity.’
So do all modern females need drugs to fit in with the ‘slave new world’?
Image Appledene Photographics/RJC
Pupils at Grange School Aylesbury, 1983. Modern education is about social engineering, in my view and qualified opinion.
R.J Cook

Decades of Failing to Recognize ADHD in Girls Has Created a “Lost Generation” of Women Posted April 11th 2021

Not the gender equality we had hoped for.


  • Jenny Anderson

Read when you’ve got time to spare.

A painful discovery. Photo by Giada Fiorindi

Girls are closing one gender gap we don’t want: diagnoses of attention deficit hyperactivity disorder (ADHD). Between 2003 and 2011, parents reported a 55% increase in ADHD diagnoses for girls, compared to 40% for boys, according to a 2015 study in the Journal of Clinical Psychiatry.

And yet girls continue to be misdiagnosed in spades, with alarming consequences, Dr. Ellen Littman, clinical psychologist and co-author of Understanding Girls with AD/HD, tells Quartz. “The outcomes for girls are horrendously negative compared to boys,” she says.

ADHD materializes dramatically differently in girls.

“Anxiety and depression turn into low self-esteem and self-loathing, and the risk for self-harm and suicide attempts is four-to-five times that of girls without ADHD,” 2012 research shows.

“This is not about having trouble with their homework,” Littman says.

Unlike boys, many of whom show hyperactivity, girls’ symptoms veer more toward inattentiveness and disorganization. Girls tend to develop ADHD later than boys. They frequently mask it in an attempt to conform to society’s expectation that they be on the ball and organized. And while some ADHD symptoms can become less intense for boys after they pass through puberty, for many girls, it gets worse.

“I think we have a lost generation of women who are diagnosed with ADHD later in life, who have had to manage the condition on their own and deal with it on their own for the majority of their lives,” Michelle Frank, a clinical psychologist and ADHD expert, tells Quartz. “The diagnosis is a blessing and a curse: it’s a great relief, but they wonder what could have been different if they had only known.” 

ADHD is Harder to Recognize in Girls 

In Understanding Girls with AD/HD, Littman and her co-authors explain that ADHD was first diagnosed in young, white boys, with a key indicator being hyperactivity. As a result, guidelines were written around how it manifests in boys, and research has focused almost exclusively on boys (only 1% is specific to girls, Littman says).

It also materializes much later in girls, which was problematic when the American Psychiatric Association’s diagnosis criteria called for symptoms to be visible by age 7. It recently changed the age to 12, allowing more girls to be captured.

Dr. Patricia Quinn, one of Littman’s co-authors on the book and a pediatrician in Washington, DC, who founded the National Center For Girls and Women With ADHD, told HuffPo Parents that girls’ symptoms include:

  • a tendency toward daydreaming
  • trouble following instructions
  • making careless mistakes on homework and tests.

ADHD is a chronic neurobiological disorder which affects the brain structurally and chemically, as well as the ways in which various parts of the brain communicate with one another. It is highly heritable, says Frank.

Pressure to perform means many girls internalize their symptoms—disorganization or carelessness—as personal flaws rather than medical issues to be treated through medicine and therapy.

Girls with ADHD are significantly more likely to experience major depression, anxiety, and eating disorders than girls without. “They tend to have few friendships,” Littman says. “As a result of their low self-esteem, they often choose unhealthy relationships in which they may accept punitive criticism and or abuse.”

Teachers and parents often miss the warning signs because feeling disorganized or unfocused often leads to depression and anxiety. When the condition is not properly diagnosed, girls miss out on critical academic services and accommodations, as well as therapy and medication. Many girls end up misdiagnosed and treated with anti-anxiety or depression drugs, some of which exacerbate the effects of ADHD.

The Numbers

Progress is being made. Not long ago, the ratio of diagnosed boys vs. girls with attention-deficit hyperactivity disorder was 10 to 1. Today, it is somewhere between 4 to 1 and 2 to 1, Littman says.

According to the Journal of Clinical Psychiatry study, ADHD affected 7.3% of girls in 2011, compared to 16.5% of boys. Oddly, as awareness grows about ADHD in girls, there is evidence that boys are being wildly over-diagnosed with ADHD, including 17-year-old boys who want extra time to complete the SAT for college applications.

“There’s no question it is both under-diagnosed and over-diagnosed,” Littman says.

This is not to say that millions of boys and men don’t have ADHD or suffer from the consequences of it.

“It’s important to remember that while we have to focus on increasing awareness and services to girls and women, boys and men are also profoundly affected by ADHD. We have a long way to go in addressing the immense stigma and gross misunderstanding that surrounds this diagnosis,” says Frank.

But Littman says the myths around ADHD and girls remain pervasive and she is deluged daily by people who are seeking help. “Girls are still being told by pediatricians and primary care doctors that ‘you are a girl, you can’t get ADD.’” (ADHD includes the symptom of physical hyperactivity while ADD does not.)

Women and ADHD 

Other pernicious myths around ADHD include the perception that adult women, including successful professionals, can’t have ADHD.

Just the opposite. When the structure of school and college makes way for the anarchy of balancing work and maybe having children, keeping ADHD at bay becomes harder.

Littman works with, and has studied, the impact of ADHD on high-IQ men and women, many of whom spent years masking their symptoms with their high abilities. As she tries to determine whether ADHD is the issue, she asks whether they are “constantly in a state of being overwhelmed and frantic about coping with day-to-day basic things?” Most burst into tears. “These are the people least likely to be acknowledged and because of the shame of feeling smart, they don’t feel they are entitled to help.”

Not surprisingly, she says, there are as many female as male patients in adult clinics. One study of ADHD medication showed women were the fastest-growing population. Between 2008 and 2012, the number of Americans using medication to treat ADHD rose 36%; among women aged 26 to 34, the figure rose 85%.

Frank says even those who were diagnosed, though perhaps late, face serious longer-term consequences. “You can treat ADHD, you can get support and strategies, but the self-esteem challenges are going to be left over and you have to work at that for a lot longer.”

Personal stories have helped to raise awareness about the fact that girls can have ADHD, and that it presents itself differently. Maria Yagoda wrote in the Atlantic about being diagnosed as a junior at Yale:

My peers were also confused, and rather certain my psychiatrist was misguided. “Of course you don’t have ADHD. You’re smart,” a friend told me, definitively, before switching to the far more compelling topic: medication. “So are you going to take Adderall and become super skinny?” “Are you going to sell it?” “Are you going to snort it?”

The answer, clearly, was no.

“Medication is certainly not a cure-all, but when paired with the awareness granted by a diagnosis, it has rendered my symptoms more bearable—less unknown, less shameful,” she wrote.

The Five Universal Laws of Human Stupidity Posted April 6th 2021

We underestimate the stupid, and we do so at our own peril.


  • Corinne Purtill

Read when you’ve got time to spare.

Not just a danger to themselves. Photo by Reuters/Susana Vera

In 1976, a professor of economic history at the University of California, Berkeley, published an essay outlining the fundamental laws of a force he perceived as humanity’s greatest existential threat: stupidity.

Stupid people, Carlo M. Cipolla explained, share several identifying traits: they are abundant, they are irrational, and they cause problems for others without apparent benefit to themselves, thereby lowering society’s total well-being. There are no defenses against stupidity, argued the Italian-born professor, who died in 2000. The only way a society can avoid being crushed by the burden of its idiots is if the non-stupid work even harder to offset the losses of their stupid brethren.

Let’s take a look at Cipolla’s five basic laws of human stupidity:

Law 1: Always and inevitably everyone underestimates the number of stupid individuals in circulation.

No matter how many idiots you suspect yourself surrounded by, Cipolla wrote, you are invariably lowballing the total. This problem is compounded by biased assumptions that certain people are intelligent based on superficial factors like their job, education level, or other traits we believe to be exclusive of stupidity. They aren’t. Which takes us to:

Law 2: The probability that a certain person be stupid is independent of any other characteristic of that person.

Cipolla posits stupidity is a variable that remains constant across all populations. Every category one can imagine—gender, race, nationality, education level, income—possesses a fixed percentage of stupid people. There are stupid college professors. There are stupid people at Davos and at the UN General Assembly. There are stupid people in every nation on earth. How numerous are the stupid amongst us? It’s impossible to say. And any guess would almost certainly violate the first law, anyway.

Law 3: A stupid person is a person who causes losses to another person or to a group of persons while himself deriving no gain and even possibly incurring losses.

Cipolla called this one the Golden Law of stupidity. A stupid person, according to the economist, is one who causes problems for others without any clear benefit to himself.

The uncle unable to stop himself from posting fake news articles to Facebook? Stupid. The customer service representative who keeps you on the phone for an hour, hangs up on you twice, and somehow still manages to screw up your account? Stupid.

This law also introduces three other phenotypes that Cipolla says co-exist alongside stupidity. First there is the intelligent person, whose actions benefit both himself and others. Then there is the bandit, who benefits himself at others’ expense. And lastly there is the helpless person, whose actions enrich others at his own expense. Cipolla imagined the four types along a graph, like this:

Stupidity, graphed. Photo by Vincedevries on Wikimedia, licensed under CC-BY-SA 4.0
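Cipolla’s quadrants map cleanly onto the sign of the two payoffs: what an action nets the actor, and what it nets everyone else. As a toy sketch of that logic (my own illustration, not anything from Cipolla’s essay; the function name and the zero threshold are assumptions), it could be written as:

```python
def cipolla_type(gain_self: float, gain_others: float) -> str:
    """Classify an action on Cipolla's two axes: net benefit to
    the actor and net benefit to everyone else."""
    if gain_others > 0:
        # Others gain: intelligent if the actor gains too,
        # helpless if the actor loses out.
        return "intelligent" if gain_self > 0 else "helpless"
    # Others lose: a bandit profits from their loss; the stupid
    # person inflicts the loss while gaining nothing themselves.
    return "bandit" if gain_self > 0 else "stupid"
```

On this reading, the Golden Law’s stupid person occupies the quadrant where both payoffs are negative, which is why, unlike the bandit, their behaviour cannot be predicted from self-interest.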

The non-stupid are a flawed and inconsistent bunch. Sometimes we act intelligently, sometimes we are selfish bandits, sometimes we act helplessly and are taken advantage of by others, and sometimes we’re a bit of both. The stupid, in comparison, are paragons of consistency, acting at all times with unyielding idiocy.

However, consistent stupidity is the only consistent thing about the stupid. This is what makes stupid people so dangerous. Cipolla explains:

Essentially stupid people are dangerous and damaging because reasonable people find it difficult to imagine and understand unreasonable behavior. An intelligent person may understand the logic of a bandit. The bandit’s actions follow a pattern of rationality: nasty rationality, if you like, but still rationality. The bandit wants a plus on his account. Since he is not intelligent enough to devise ways of obtaining the plus as well as providing you with a plus, he will produce his plus by causing a minus to appear on your account. All this is bad, but it is rational and if you are rational you can predict it. You can foresee a bandit’s actions, his nasty maneuvres and ugly aspirations and often can build up your defenses.

With a stupid person all this is absolutely impossible as explained by the Third Basic Law. A stupid creature will harass you for no reason, for no advantage, without any plan or scheme and at the most improbable times and places. You have no rational way of telling if and when and how and why the stupid creature attacks. When confronted with a stupid individual you are completely at his mercy.

All of which leads us to:

Law 4: Non-stupid people always underestimate the damaging power of stupid individuals. In particular non-stupid people constantly forget that at all times and places and under any circumstances to deal and/or associate with stupid people always turns out to be a costly mistake.

We underestimate the stupid, and we do so at our own peril. This brings us to the fifth and final law:

Law 5: A stupid person is the most dangerous type of person.

And its corollary:

A stupid person is more dangerous than a bandit.

We can do nothing about the stupid. The difference between societies that collapse under the weight of their stupid citizens and those that transcend them is the makeup of the non-stupid. Those progressing in spite of their stupid possess a high proportion of people acting intelligently – people who counterbalance the stupid’s losses by bringing about gains for themselves and their fellows.

Declining societies have the same percentage of stupid people as successful ones. But they also have high percentages of helpless people and, Cipolla writes, “an alarming proliferation of the bandits with overtones of stupidity.”

“Such change in the composition of the non-stupid population inevitably strengthens the destructive power of the [stupid] fraction and makes decline a certainty,” Cipolla concludes. “And the country goes to Hell.”

Corinne Purtill writes about culture, behavioral science, and management. Based at various times in Washington, D.C., Phnom Penh, New York, and London, she has written about everything from terrorism to the search for the Loch Ness Monster. She has a BA in English from Stanford University and reports now from southern California.

Warning April 4th 2021

The following article should be read with caution. The key and most worrying phrase, apart from all the sophistry, is this part of the writer’s conclusion: ‘We are organisms, not computers. Get over it. Let’s get on with the business of trying to understand ourselves, but without being encumbered by unnecessary intellectual baggage. The IP metaphor has had a half-century run, producing few, if any, insights along the way. The time has come to hit the DELETE key.’

IP, for those who don’t know, means information processing. Epstein is effectively and insidiously saying that we humans are incapable of storing, sorting and retrieving information as the need arises. In short, we are not equipped to think for ourselves. This begs the question of how on earth humans created and developed computers in the first place.

All we need to know is that the rich, powerful folk controlling their use, including the police, should be trusted to do our thinking for us – keeping us safe(ly away from them). This includes accepting government ‘science’-based computer modelling telling us we must get used to perpetual lockdowns to control the spread of the virus.

Whatever your brain tells you, you are not IP equipped. He uses a young female student of his to make the point that we can’t even store the image of what a bank note looks like. His experiment assumes we are all the same.

Well, few of us study bank notes, but we certainly process the image and scan for errors. As for not being able to draw from memory, I counter Epstein’s nonsensical and patronising garbage with the case of Stephen Wiltshire, in a post following on from Epstein’s. Epstein talks about us being ourselves while implying we are all equally limited, and there is a little undercurrent about spirituality, presumably to appease and please Muslims.

This is the WOKE cancel culture, and people like Epstein need tenure. Woke sits uncomfortably between the Islamic rigid pattern and the LGBTQI box algorithm – an interesting challenge for the old set theory that I used to teach along with binary numbers.

I played classical guitar for years but, due to over 13 years of serious police harassment, I haven’t practised for years. Just now I tested myself, discovering that I can still play complex music simply by thinking of what I want to play.

It is one thing to say that the brain is so complex we can’t really understand how it works. It is quite another thing to say we cannot store or process information. It raises questions of timing and motive, as psychology and psychiatry dictate conditions and medication according to the DSM (the Diagnostic and Statistical Manual of Mental Disorders) in a world of flourishing mental illness. Robert Cook

Robert Cook practising classical guitar in 1980. According to the following expert writer, our brains are not computers and cannot process or store data. Robert said: ‘I worked with an amazing guitarist, writing and performing, from 2008-11. She couldn’t read music and freaked out when I tried to teach her. She just played.
For me, without a musical education and an understanding of theory, I would never have been able to play or write anything. She could play anything she heard. When I wrote the melody to our songs, she just put in the chords by ear.’
Just because we don’t understand the brain doesn’t mean it is not a computer. To say that it is an organism is to state the obvious, but that doesn’t mean it has no IP potential. It thrives on exercise and decays early if misfed or otherwise put at risk. Epstein’s article should be read with caution and attention to motive. Robert Cook

The Empty Brain

Your brain does not process information, retrieve knowledge, or store memories. In short: Your brain is not a computer. Posted April 4th 2021


  • Robert Epstein

No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.

Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.

To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections.

A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.

Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.

But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.

Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog. One single image – say, the photograph of my cat Henry on my desktop – is represented by a very specific pattern of a million of these bytes (‘one megabyte’), surrounded by some special characters that tell the computer to expect an image, not a word.
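Epstein’s byte example is easy to reproduce. The short Python sketch below (the word ‘dog’ and the 8-bit ASCII/UTF-8 patterns are simply the article’s own example) prints the bit pattern that stands for each letter and the numeric byte values that sit side by side to form the word:

```python
# Each character is stored as a pattern of ones and zeroes; in
# ASCII/UTF-8, one byte (8 bits) per letter for plain English text.
word = "dog"
for ch in word:
    # ord() gives the byte value; format(..., "08b") shows its 8 bits
    print(ch, ord(ch), format(ord(ch), "08b"))

encoded = word.encode("utf-8")
print(list(encoded))  # [100, 111, 103] - three bytes, side by side
```

The same principle scales up to Epstein’s megabyte photograph: an image is just a much longer run of such byte patterns, prefixed by header bytes that tell the machine to expect an image rather than text.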

Computers, quite literally, move these patterns from place to place in different physical storage areas etched into electronic components. Sometimes they also copy the patterns, and sometimes they transform them in various ways – say, when we are correcting errors in a manuscript or when we are touching up a photograph. The rules computers follow for moving, copying and operating on these arrays of data are also stored inside the computer. Together, a set of rules is called a ‘program’ or an ‘algorithm’. A group of algorithms that work together to help us do something (like buy stocks or find a date online) is called an ‘application’ – what most people now call an ‘app’.

Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.

Humans, on the other hand, do not – never did, never will. Given this reality, why do so many scientists talk about our mental life as if we were computers?

In his book In Our Own Image (2015), the artificial intelligence expert George Zarkadakis describes six different metaphors people have employed over the past 2,000 years to try to explain human intelligence.

In the earliest one, eventually preserved in the Bible, humans were formed from clay or dirt, which an intelligent god then infused with its spirit. That spirit ‘explained’ our intelligence – grammatically, at least.

The invention of hydraulic engineering in the 3rd century BCE led to the popularity of a hydraulic model of human intelligence, the idea that the flow of different fluids in the body – the ‘humours’ – accounted for both our physical and mental functioning. The hydraulic metaphor persisted for more than 1,600 years, handicapping medical practice all the while.

By the 1500s, automata powered by springs and gears had been devised, eventually inspiring leading thinkers such as René Descartes to assert that humans are complex machines. In the 1600s, the British philosopher Thomas Hobbes suggested that thinking arose from small mechanical motions in the brain. By the 1700s, discoveries about electricity and chemistry led to new theories of human intelligence – again, largely metaphorical in nature. In the mid-1800s, inspired by recent advances in communications, the German physicist Hermann von Helmholtz compared the brain to a telegraph.

Each metaphor reflected the most advanced thinking of the era that spawned it. Predictably, just a few years after the dawn of computer technology in the 1940s, the brain was said to operate like a computer, with the role of physical hardware played by the brain itself and our thoughts serving as software. The landmark event that launched what is now broadly called ‘cognitive science’ was the publication of Language and Communication (1951) by the psychologist George Miller. Miller proposed that the mental world could be studied rigorously using concepts from information theory, computation and linguistics.

This kind of thinking was taken to its ultimate expression in the short book The Computer and the Brain (1958), in which the mathematician John von Neumann stated flatly that the function of the human nervous system is ‘prima facie digital’. Although he acknowledged that little was actually known about the role the brain played in human reasoning and memory, he drew parallel after parallel between the components of the computing machines of the day and the components of the human brain.

Propelled by subsequent advances in both computer technology and brain research, an ambitious multidisciplinary effort to understand human intelligence gradually developed, firmly rooted in the idea that humans are, like computers, information processors. This effort now involves thousands of researchers, consumes billions of dollars in funding, and has generated a vast literature consisting of both technical and mainstream articles and books. Ray Kurzweil’s book How to Create a Mind: The Secret of Human Thought Revealed (2013), exemplifies this perspective, speculating about the ‘algorithms’ of the brain, how the brain ‘processes data’, and even how it superficially resembles integrated circuits in its structure.

The information processing (IP) metaphor of human intelligence now dominates human thinking, both on the street and in the sciences. There is virtually no form of discourse about intelligent human behaviour that proceeds without employing this metaphor, just as no form of discourse about intelligent human behaviour could proceed in certain eras and cultures without reference to a spirit or deity. The validity of the IP metaphor in today’s world is generally assumed without question.

But the IP metaphor is, after all, just another metaphor – a story we tell to make sense of something we don’t actually understand. And like all the metaphors that preceded it, it will certainly be cast aside at some point – either replaced by another metaphor or, in the end, replaced by actual knowledge.

Just over a year ago, on a visit to one of the world’s most prestigious research institutes, I challenged researchers there to account for intelligent human behaviour without reference to any aspect of the IP metaphor. They couldn’t do it, and when I politely raised the issue in subsequent email communications, they still had nothing to offer months later. They saw the problem. They didn’t dismiss the challenge as trivial. But they couldn’t offer an alternative. In other words, the IP metaphor is ‘sticky’. It encumbers our thinking with language and ideas that are so powerful we have trouble thinking around them.

The faulty logic of the IP metaphor is easy enough to state. It is based on a faulty syllogism – one with two reasonable premises and a faulty conclusion. Reasonable premise #1: all computers are capable of behaving intelligently. Reasonable premise #2: all computers are information processors. Faulty conclusion: all entities that are capable of behaving intelligently are information processors.
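The invalid form of that syllogism can be checked mechanically. This tiny Python counter-model (the entities and their properties are invented purely for illustration) makes both premises true while the conclusion comes out false:

```python
# The syllogism has the shape:
#   All C are I; all C are P; therefore all I are P.  (invalid)
# Counter-model: 'human' is intelligent but, on Epstein's view,
# not an information processor - so the conclusion fails.
entities = {
    "laptop": {"computer", "intelligent", "processor"},
    "human": {"intelligent"},
}

computers = [p for p in entities.values() if "computer" in p]
intelligent = [p for p in entities.values() if "intelligent" in p]

premise1 = all("intelligent" in p for p in computers)   # all C are I
premise2 = all("processor" in p for p in computers)     # all C are P
conclusion = all("processor" in p for p in intelligent) # all I are P

print(premise1, premise2, conclusion)  # True True False
```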

Setting aside the formal language, the idea that humans must be information processors just because computers are information processors is just plain silly, and when, some day, the IP metaphor is finally abandoned, it will almost certainly be seen that way by historians, just as we now view the hydraulic and mechanical metaphors to be silly.

If the IP metaphor is so silly, why is it so sticky? What is stopping us from brushing it aside, just as we might brush aside a branch that was blocking our path? Is there a way to understand human intelligence without leaning on a flimsy intellectual crutch? And what price have we paid for leaning so heavily on this particular crutch for so long? The IP metaphor, after all, has been guiding the writing and thinking of a large number of researchers in multiple fields for decades. At what cost?

In a classroom exercise I have conducted many times over the years, I begin by recruiting a student to draw a detailed picture of a dollar bill – ‘as detailed as possible’, I say – on the blackboard in front of the room. When the student has finished, I cover the drawing with a sheet of paper, remove a dollar bill from my wallet, tape it to the board, and ask the student to repeat the task. When he or she is done, I remove the cover from the first drawing, and the class comments on the differences.

Because you might never have seen a demonstration like this, or because you might have trouble imagining the outcome, I have asked Jinny Hyun, one of the student interns at the institute where I conduct my research, to make the two drawings. Here is her drawing ‘from memory’ (notice the metaphor):

And here is the drawing she subsequently made with a dollar bill present:

Jinny was as surprised by the outcome as you probably are, but it is typical. As you can see, the drawing made in the absence of the dollar bill is horrible compared with the drawing made from an exemplar, even though Jinny has seen a dollar bill thousands of times.

What is the problem? Don’t we have a ‘representation’ of the dollar bill ‘stored’ in a ‘memory register’ in our brains? Can’t we just ‘retrieve’ it and use it to make our drawing?

Obviously not, and a thousand years of neuroscience will never locate a representation of a dollar bill stored inside the human brain for the simple reason that it is not there to be found.

A wealth of brain studies tells us, in fact, that multiple and sometimes large areas of the brain are often involved in even the most mundane memory tasks. When strong emotions are involved, millions of neurons can become more active. In a 2016 study of survivors of a plane crash by the University of Toronto neuropsychologist Brian Levine and others, recalling the crash increased neural activity in ‘the amygdala, medial temporal lobe, anterior and posterior midline, and visual cortex’ of the passengers.

The idea, advanced by several scientists, that specific memories are somehow stored in individual neurons is preposterous; if anything, that assertion just pushes the problem of memory to an even more challenging level: how and where, after all, is the memory stored in the cell?

So what is occurring when Jinny draws the dollar bill in its absence? If Jinny had never seen a dollar bill before, her first drawing would probably have not resembled the second drawing at all. Having seen dollar bills before, she was changed in some way. Specifically, her brain was changed in a way that allowed her to visualise a dollar bill – that is, to re-experience seeing a dollar bill, at least to some extent.

The difference between the two diagrams reminds us that visualising something (that is, seeing something in its absence) is far less accurate than seeing something in its presence. This is why we’re much better at recognising than recalling. When we re-member something (from the Latin re, ‘again’, and memorari, ‘be mindful of’), we have to try to relive an experience; but when we recognise something, we must merely be conscious of the fact that we have had this perceptual experience before. 

Perhaps you will object to this demonstration. Jinny had seen dollar bills before, but she hadn’t made a deliberate effort to ‘memorise’ the details. Had she done so, you might argue, she could presumably have drawn the second image without the bill being present. Even in this case, though, no image of the dollar bill has in any sense been ‘stored’ in Jinny’s brain. She has simply become better prepared to draw it accurately, just as, through practice, a pianist becomes more skilled in playing a concerto without somehow inhaling a copy of the sheet music.

From this simple exercise, we can begin to build the framework of a metaphor-free theory of intelligent human behaviour – one in which the brain isn’t completely empty, but is at least empty of the baggage of the IP metaphor.

As we navigate through the world, we are changed by a variety of experiences. Of special note are experiences of three types: (1) we observe what is happening around us (other people behaving, sounds of music, instructions directed at us, words on pages, images on screens); (2) we are exposed to the pairing of unimportant stimuli (such as sirens) with important stimuli (such as the appearance of police cars); (3) we are punished or rewarded for behaving in certain ways.

We become more effective in our lives if we change in ways that are consistent with these experiences – if we can now recite a poem or sing a song, if we are able to follow the instructions we are given, if we respond to the unimportant stimuli more like we do to the important stimuli, if we refrain from behaving in ways that were punished, if we behave more frequently in ways that were rewarded.

Misleading headlines notwithstanding, no one really has the slightest idea how the brain changes after we have learned to sing a song or recite a poem. But neither the song nor the poem has been ‘stored’ in it. The brain has simply changed in an orderly way that now allows us to sing the song or recite the poem under certain conditions. When called on to perform, neither the song nor the poem is in any sense ‘retrieved’ from anywhere in the brain, any more than my finger movements are ‘retrieved’ when I tap my finger on my desk. We simply sing or recite – no retrieval necessary.

A few years ago, I asked the neuroscientist Eric Kandel of Columbia University – winner of a Nobel Prize for identifying some of the chemical changes that take place in the neuronal synapses of the Aplysia (a marine snail) after it learns something – how long he thought it would take us to understand how human memory works. He quickly replied: ‘A hundred years.’ I didn’t think to ask him whether he thought the IP metaphor was slowing down neuroscience, but some neuroscientists are indeed beginning to think the unthinkable – that the metaphor is not indispensable.

A few cognitive scientists – notably Anthony Chemero of the University of Cincinnati, the author of Radical Embodied Cognitive Science (2009) – now completely reject the view that the human brain works like a computer. The mainstream view is that we, like computers, make sense of the world by performing computations on mental representations of it, but Chemero and others describe another way of understanding intelligent behaviour – as a direct interaction between organisms and their world.

My favourite example of the dramatic difference between the IP perspective and what some now call the ‘anti-representational’ view of human functioning involves two different ways of explaining how a baseball player manages to catch a fly ball – beautifully explicated by Michael McBeath, now at Arizona State University, and his colleagues in a 1995 paper in Science. The IP perspective requires the player to formulate an estimate of various initial conditions of the ball’s flight – the force of the impact, the angle of the trajectory, that kind of thing – then to create and analyse an internal model of the path along which the ball will likely move, then to use that model to guide and adjust motor movements continuously in time in order to intercept the ball.

That is all well and good if we functioned as computers do, but McBeath and his colleagues gave a simpler account: to catch the ball, the player simply needs to keep moving in a way that keeps the ball in a constant visual relationship with respect to home plate and the surrounding scenery (technically, in a ‘linear optical trajectory’). This might sound complicated, but it is actually incredibly simple, and completely free of computations, representations and algorithms.
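The heuristic can be sketched in a few lines. In this toy one-dimensional simulation (all numbers are illustrative assumptions, not values from the McBeath paper), the fielder never predicts a landing point: he simply keeps standing wherever the tangent of the ball’s elevation angle continues to rise at the constant rate implied by his first glimpse, and that alone carries him to the catch:

```python
# Toy sketch of the optical-trajectory idea: hold tan(elevation)
# rising at a constant rate r, and you arrive where the ball lands.
G = 9.8
vx, vy = 15.0, 20.0                # ball's launch velocity (m/s), assumed
T = 2 * vy / G                     # flight time of the parabola
landing = vx * T                   # true landing point (~61.2 m)

start, t0, dt = 75.0, 0.5, 0.001   # fielder watches from 75 m for 0.5 s
y0 = vy * t0 - 0.5 * G * t0 ** 2
r = y0 / ((start - vx * t0) * t0)  # rate implied by his initial view

fielder, t = start, t0
while t < T - dt:
    x_ball = vx * t
    y_ball = vy * t - 0.5 * G * t ** 2
    # stand where tan(elevation) = r * t keeps holding
    fielder = x_ball + y_ball / (r * t)
    t += dt

print(f"true landing {landing:.1f} m, fielder ends at {fielder:.1f} m")
```

Note that no trajectory model is ever built: as the ball comes down, any constant-rate strategy converges on the landing point, which is exactly why the control rule needs no estimates of force, angle or wind.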

Two determined psychology professors at Leeds Beckett University in the UK – Andrew Wilson and Sabrina Golonka – include the baseball example among many others that can be looked at simply and sensibly outside the IP framework. They have been blogging for years about what they call a ‘more coherent, naturalised approach to the scientific study of human behaviour… at odds with the dominant cognitive neuroscience approach’. This is far from a movement, however; the mainstream cognitive sciences continue to wallow uncritically in the IP metaphor, and some of the world’s most influential thinkers have made grand predictions about humanity’s future that depend on the validity of the metaphor.

One prediction – made by the futurist Kurzweil, the physicist Stephen Hawking and the neuroscientist Randal Koene, among others – is that, because human consciousness is supposedly like computer software, it will soon be possible to download human minds to a computer, in the circuits of which we will become immensely powerful intellectually and, quite possibly, immortal. This concept drove the plot of the dystopian movie Transcendence (2014) starring Johnny Depp as the Kurzweil-like scientist whose mind was downloaded to the internet – with disastrous results for humanity.

Fortunately, because the IP metaphor is not even slightly valid, we will never have to worry about a human mind going amok in cyberspace; alas, we will also never achieve immortality through downloading. This is not only because of the absence of consciousness software in the brain; there is a deeper problem here – let’s call it the uniqueness problem – which is both inspirational and depressing.

Because neither ‘memory banks’ nor ‘representations’ of stimuli exist in the brain, and because all that is required for us to function in the world is for the brain to change in an orderly way as a result of our experiences, there is no reason to believe that any two of us are changed the same way by the same experience. If you and I attend the same concert, the changes that occur in my brain when I listen to Beethoven’s 5th will almost certainly be completely different from the changes that occur in your brain. Those changes, whatever they are, are built on the unique neural structure that already exists, each structure having developed over a lifetime of unique experiences.

This is why, as Sir Frederic Bartlett demonstrated in his book Remembering (1932), no two people will repeat a story they have heard the same way and why, over time, their recitations of the story will diverge more and more. No ‘copy’ of the story is ever made; rather, each individual, upon hearing the story, changes to some extent – enough so that when asked about the story later (in some cases, days, months or even years after Bartlett first read them the story) – they can re-experience hearing the story to some extent, although not very well (see the first drawing of the dollar bill, above).

This is inspirational, I suppose, because it means that each of us is truly unique, not just in our genetic makeup, but even in the way our brains change over time. It is also depressing, because it makes the task of the neuroscientist daunting almost beyond imagination. For any given experience, orderly change could involve a thousand neurons, a million neurons or even the entire brain, with the pattern of change different in every brain.

Worse still, even if we had the ability to take a snapshot of all of the brain’s 86 billion neurons and then to simulate the state of those neurons in a computer, that vast pattern would mean nothing outside the body of the brain that produced it. This is perhaps the most egregious way in which the IP metaphor has distorted our thinking about human functioning. Whereas computers do store exact copies of data – copies that can persist unchanged for long periods of time, even if the power has been turned off – the brain maintains our intellect only as long as it remains alive. There is no on-off switch. Either the brain keeps functioning, or we disappear. What’s more, as the neurobiologist Steven Rose pointed out in The Future of the Brain (2005), a snapshot of the brain’s current state might also be meaningless unless we knew the entire life history of that brain’s owner – perhaps even about the social context in which he or she was raised.

Think how difficult this problem is. To understand even the basics of how the brain maintains the human intellect, we might need to know not just the current state of all 86 billion neurons and their 100 trillion interconnections, not just the varying strengths with which they are connected, and not just the states of more than 1,000 proteins that exist at each connection point, but how the moment-to-moment activity of the brain contributes to the integrity of the system. Add to this the uniqueness of each brain, brought about in part because of the uniqueness of each person’s life history, and Kandel’s prediction starts to sound overly optimistic. (In a recent op-ed in The New York Times, the neuroscientist Kenneth Miller suggested it will take ‘centuries’ just to figure out basic neuronal connectivity.)

Meanwhile, vast sums of money are being raised for brain research, based in some cases on faulty ideas and promises that cannot be kept. The most blatant instance of neuroscience gone awry, documented recently in a report in Scientific American, concerns the $1.3 billion Human Brain Project launched by the European Union in 2013. Convinced by the charismatic Henry Markram that he could create a simulation of the entire human brain on a supercomputer by the year 2023, and that such a model would revolutionise the treatment of Alzheimer’s disease and other disorders, EU officials funded his project with virtually no restrictions. Less than two years into it, the project turned into a ‘brain wreck’, and Markram was asked to step down.

We are organisms, not computers. Get over it. Let’s get on with the business of trying to understand ourselves, but without being encumbered by unnecessary intellectual baggage. The IP metaphor has had a half-century run, producing few, if any, insights along the way. The time has come to hit the DELETE key.

Robert Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology in California. He is the author of 15 books, and the former editor-in-chief of Psychology Today. 

After flying just once over Mexico City, artist Stephen Wiltshire drew the entire cityscape from memory on a 13-foot canvas. Photograph by Paolo Woods, National Geographic

See This Incredible Artist Draw a Whole City From Memory Posted April 4th 2021

Diagnosed with autism at age three, Stephen Wiltshire is now famous for producing highly detailed scenes after just a brief glance. By Nina Strochlic. Published April 18, 2018.

Today, Stephen Wiltshire is one of Britain’s best-known artists. His commissions have a four- to eight-month waiting list, and videos of him sketching panoramic cityscapes in perfect scale have a tendency to go viral.

But when Stephen was in school, his teachers didn’t know what to do with him. Diagnosed with autism at age three, he didn’t say his first word (“paper”) until age five. Still, as a child, Stephen could sketch stunningly accurate images of wildlife and caricatures of his teachers.

Later he began drawing the buildings he was seeing around London with impressive detail. His older sister Annette would take him to the home of a school friend who lived on the 14th floor of an apartment building, so he could see a sprawling view of the city. He marveled at its layout and landmarks. From that point on, she says, “his passion became obsessive.”

At age eight, he got his first commission—from the British prime minister. Language didn’t come easily until the next year, but by age 13, he had published his first book of drawings. The public and the media became fascinated by the young teen’s incredible memory. Stephen was featured on television shows and in documentaries about so-called savants.

Drawing a Crowd: Wiltshire completed his sketch of Mexico City in front of onlookers inside the city’s Bancomer bank. Photograph by Paolo Woods, National Geographic

On a trip to New York for an interview, he met Oliver Sacks and drew a perfect replica of the neurologist’s house after taking a quick glance at it. “The combination of great abilities with great disabilities presents an extraordinary paradox: how can such opposites live side by side?” Sacks later wrote in the foreword to Wiltshire’s second book.

Two years later, in 1989, he visited Venice and drew his first panorama. From then on, Stephen became known for his incredibly detailed cityscapes, each done from memory with hundreds of streets, landmarks, and other minutia in perfect scale. He drew cities around the world, from Jerusalem to Sydney. His latest project brought Mexico City to life on a 13-foot canvas.

After just a brief look, Wiltshire re-creates cities to scale with intricate detail, including buildings with the right number of windows. Photograph by Paolo Woods, National Geographic

In New York, he took a 20-minute helicopter ride and then sketched everything he saw onto a 19-foot-long piece of paper as viewers watched live via webcam.

“Despite Stephen’s astounding memory, whilst in Manhattan he still managed to get lost and walk 45 minutes in the wrong direction before finding Cheyenne’s Diner,” says a playful anecdote on his official website.

In 2006, Prince Charles presented Stephen with his appointment as a Member of the Order of the British Empire for his contributions to the art world. That year, he opened his own gallery in central London. Today, his photograph welcomes visitors to London’s Heathrow airport.

Wiltshire drew this picture of the Manhattan skyline after taking a 20-minute helicopter ride.
The completed version of Wiltshire's Mexico City drawing spanned 19 feet.
Wiltshire created this moody view of the Chicago River in January 2017.
This 2013 drawing of Monte Carlo shows the city illuminated at night.
The iconic statue of Christ the Redeemer gazes over the skyline in Wiltshire's 2012 re-creation of Rio de Janeiro.


New York, New York: Wiltshire drew this picture of the Manhattan skyline after taking a 20-minute helicopter ride. Illustration by Stephen Wiltshire

“Stephen is extremely humble and not fazed at all,” says Annette, who manages the gallery.

Fame “hasn’t altered his concentration or even made him nervous … I think it pushes his abilities even further.”

And thanks to his prolific and celebrated career, the once-silent artist now communicates easily with millions of people. “Stephen’s art speaks a language that we can all understand,” she says.

Someone I Would Like To Kill, by Robert Cook. Posted April 2nd 2021

Television was a magic box to me in the 1950s. The first ones were large pieces of furniture with big polished wood cabinets and flickering blackish and whitish images on tiny screens. To have an H-shaped aerial attached to your chimney was a status symbol, like owning a car. We never had a car. Dad rode a bike 10 miles to his long day’s work as a lorry driver. Just before his accident, he stepped up in the world, purchasing an NSU moped called a ‘Quickly.’ He was fluent in German and had fought the Germans during the war, but we had German relatives and he admired their engineering.

The war had a big impact on my parents. They were both Londoners, living in a city where the Blitz cooked and killed people. Mother lost a brother and a boyfriend. Our television arrived in 1957 and I was allowed to watch it without censorship until 9 p.m. War films, where death was always heroic, interested me, but it was not until 1962, when my father ended up on a terminal ward in Aylesbury, that I heard the sound of death as we visited on a regular basis. The sound of the death rattle was horrible. Eventually it was my father’s turn. He was 41. Brought up as an agnostic, I had no doubt he had gone for good. Very poor already due to his long illness, our family’s situation got worse. Life seemed horrible, so I wanted to go too.

It was my role to look after my mother, doing jobs before and after school, including long hours on the farm, where the life and death of animals was quite normal.

My first memory of desperately wanting to die comes from just before I went up to university. I had become very anxious that something might happen to my mother, and about the lonely prospect of being in a world without her.

The difficulty with the suicidal impulse is that, in my experience, it is like one of those old wartime blackout curtains that we still had on our front windows in 1960. It closes out all light, and therefore any sense of hope.

So while working for the Inland Revenue in Havant, near my beloved Portsmouth, I took lodgings with the wonderful Bill and Jean Neal in Lymbourne Road. I had recently been dumped by my girlfriend because she found me too depressing. I was writing a lot of poetry, including lines about this person that I wanted to kill.

The person was, of course, myself. The blackout curtain came down so that I saw nothing else. I went to my doctor and he prescribed amitriptyline, better known as Tryptizol. It would take a lot of explanation as to how I reached this stage, but my ex-girlfriend observed when we met in Norwich that, in her words, ‘You are very insecure.’ She did her best to help me, but psychology is a blunt instrument where a scalpel is required. No such tool exists in that field of medicine, beyond lobotomy.

The year was 1975. Back in 1974, the English folk singer Nick Drake had died from an overdose of Tryptizol. As an aspiring folk singer myself, I knew that. The drug is a tricyclic antidepressant with sedative properties. The maximum effective dose is 150 mg per day. It was a Friday. I collected my prescription from Boots in West Street on my way home from the tax office just around the corner. The tiny pills were crammed into a little brown glass bottle.

My lodgings were a short walk away. I was home in time for dinner with Bill and Jean. They always went to the British Legion Club in South Street. I went up to my little back bedroom and swallowed all the pills. In my hazy half sleep, I heard them come home that night, then nothing until Monday morning.

Unhappy to have woken up, I rang in sick and took time to recover.  A few weeks later, I returned to my doctor asking for extra pills because I was going away on holiday. Something people miss about those of us who become suicidal is that once we have decided what we are going to do, we put on an image. So I got the pills and drove home up country to the house I then owned in Winslow, where I took lots of the pills, spending another two days in a coma.

People miss, or don’t want to face, the reality of why young men commit suicide. This is Robert Cook at his friend Vernon’s home in Leigh Park, Havant, a few days before overdosing on Tryptizol in 1975. Image from ‘Havant & Hayling’ in Old Photographs by Robert Cook, 1996.

Coming out of it was like swimming up from the depths. I persuaded myself that I was going to become more than a tax man. I was going to be a great writer and folk singer. Over the years, all that hope was gradually taken from me. I went on to attempt suicide by hanging in March 2007, and took another overdose in 2016, following seven traumatic court hearings before my hollow Crown Court victory.

Still I was unable to get at the truth, which I cannot talk about now for legal reasons. So I took an overdose of temazepam whilst on leave from work in December 2016. There were efforts to persuade me that I am transgender, stemming from my having had a book published on the subject. The final report on the matter noted that I have a ‘strong female identity.’

This led to a psychiatrist following up with a diagnosis of paranoid delusion and a prescription for antipsychotic drugs. I came to the conclusion that these people cannot be trusted, and that the only identity I have is the one the police gave me on October 9th 2008, when they created intriguing and life-destroying records and a PNC Criminal Marker, meant to be secret, recording that I was a violent stalker. One cannot get much more hopeless than carrying an identity like that. It is near impossible to earn a living that way, and the record was expanded to incriminate my eldest son.

Life without hope is a terrible experience. If it had not been for the support of my eldest son, who also depends on me for reasons I am not allowed to mention, I would be dead. When the blackout curtain is drawn, one sees nothing but darkness, which is why I nearly succeeded in strangling myself with my tee shirt whilst in police custody last August 25th. Death, where is thy sting? I would do it again. I am not allowed to say why, and no one believes me anyway, which is why I have been labelled a paranoid personality, schizophrenic and deluded. Who am I to argue? It would be kind of them to give me euthanasia instead of leaving me plotting to kill that someone who is me. R.J Cook

Robert Cook, farm boy, 1963. The buildings immediately in the background are the outside toilets. We were poor, but we were honest, as the song goes. But does honesty really pay? I don’t think so. It is such a privilege to be a white male in ‘liberal’ Britain, isn’t it? Image: Appledene Photographic Archive
A poem, one of many written in my little room overlooking the remnants of the old ‘Hayling Billy’ railway line in 1975, where I attempted suicide by overdosing on Tryptizol. From the book ‘Havant & Hayling’ in Old Photographs by Robert Cook, 1996.

Why are suicides so high amongst men? Posted March 16th 2021

77% of suicides are by men. Are the myths around depression responsible for their deaths?

Each year across the UK, approximately 6,000 people take their own lives. Around 90% of suicide victims suffer from a mental health condition, and those at highest risk include middle-aged men living with depression. Some of these men may become dependent on alcohol, which possibly starts as a coping strategy but eventually worsens their mood and leads to other problems affecting their health, employment and relationships. Despite the overall number of suicides falling in 2012, the total for men increased. Figures from the Office for National Statistics show that the highest number of suicides was recorded among men aged 40 to 44. In this age group, men were more than four times as likely as women to commit suicide. In total, men make up 77% of all suicides in the UK.

Why more men than women?

There are many possible reasons why middle-aged men are more at risk of depressive disorder than other groups in the population, and also why they might be less likely to seek help even after they become depressed. These include the changing role of women in our society, who have become less dependent on their male partners; the decline of traditional male-dominated jobs, leading to a loss of identity as well as income; and relationship breakdown, which can be more devastating for men than for women. Women generally tend to be more ‘emotionally literate’ and are able to discuss their feelings with others rather than internalising their emotions or using alcohol or recreational substances as ways of coping with distress. Men’s relative lack of this emotional literacy, together with the fear and stigma of revealing low mood, might further hinder their readiness to seek help for their symptoms.

A Mad World

Posted March 3rd 2021

A diagnosis of mental illness is more common than ever – did psychiatrists create the problem, or just recognise it?


  • Joseph Pierre


Photo from Chinnapong / Getty Images.

When a psychiatrist meets people at a party and reveals what he or she does for a living, two responses are typical. People either say, ‘I’d better be careful what I say around you,’ and then clam up, or they say, ‘I could talk to you for hours,’ and then launch into a litany of complaints and diagnostic questions, usually about one or another family member, in-law, co-worker, or other acquaintance. It seems that people are quick to acknowledge the ubiquity of those who might benefit from a psychiatrist’s attention, while expressing a deep reluctance ever to seek it out themselves.

That reluctance is understandable. Although most of us crave support, understanding, and human connection, we also worry that if we reveal our true selves, we’ll be judged, criticised, or rejected in some way. And even worse – perhaps calling upon antiquated myths – some worry that, if we were to reveal our inner selves to a psychiatrist, we might be labelled crazy, locked up in an asylum, medicated into oblivion, or put into a straitjacket. Of course, such fears are the accompaniment of the very idiosyncrasies, foibles, and life struggles that keep us from unattainably perfect mental health.

As a psychiatrist, I see this as the biggest challenge facing psychiatry today. A large part of the population – perhaps even the majority – might benefit from some form of mental health care, but too many fear that modern psychiatry is on a mission to pathologise normal individuals with some dystopian plan fuelled by the greed of the pharmaceutical industry, all in order to put the populace on mind-numbing medications. Debates about psychiatric overdiagnosis have amplified in the wake of the 2013 release of the newest edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the so-called ‘bible of psychiatry’, with some particularly vocal critics coming from within the profession.

It’s true that the scope of psychiatry has greatly expanded over the past century. A hundred years ago, the profession had a near-exclusive focus on the custodial care of severely ill asylum patients. Now, psychiatric practice includes the office-based management of the ‘worried well’. The advent of psychotherapy, starting with the arrival of Sigmund Freud’s psychoanalysis at the turn of the 20th century, drove the shift. The ability to treat less severe forms of psychopathology – such as anxiety and so-called adjustment disorders related to life stressors – with the talking cure has had profound effects on mental health care in the United States.

Early forms of psychotherapy paved the way for the Mental Hygiene Movement that lasted from about 1910 through the 1950s. This public health model rejected hard boundaries of mental illness in favour of a view that acknowledged the potential for some degree of mental disorder to exist in nearly everyone. Interventions were recommended not just within a psychiatrist’s office, but broadly within society at large; schools and other community settings were all involved in providing support and help.

A new abundance of ‘neurotic’ symptoms stemming from the trauma experienced by veterans of the First and Second World Wars reinforced a view that mental health and illness existed on a continuous spectrum. And by the time DSM was first published in 1952, psychiatrists were treating a much wider swath of the population than ever before. From the first DSM through to the 2013 revision, inclusiveness and clinical usefulness have been guiding principles, with the profession erring on the side of capturing all of the conditions that bring people to psychiatric care in order to facilitate evaluation and treatment.

In the modern era, psychotherapy has steered away from traditional psychoanalysis in favour of more practical, shorter-term therapies: for instance, psychodynamic therapy explores unconscious conflicts and underlying distress on a weekly basis for as little as a few months’ duration, and goal-directed cognitive therapy uses behavioural techniques to correct disruptive distortions in thinking. These streamlined psychotherapeutic techniques have widened the potential consumer base for psychiatric intervention; they have also expanded the range of clinicians who can perform therapy to include not only psychiatrists, but primary care doctors, psychologists, social workers, and marriage and family therapists.

In a similar fashion, newer medications with fewer side effects are more likely to be offered to people with less clear-cut psychiatric illnesses. Such medications can be prescribed by a family physician or, in some states, a psychologist or nurse practitioner.

Viewed through the lens of the DSM, it is easy to see how extending psychiatry’s helping hand deeper into the population is often interpreted as evidence that psychiatrists think more and more people are mentally ill. Epidemiological studies based upon DSM criteria have suggested that half or more of the US population will meet the threshold for mental disorder at some point in their lives. To many, the idea that it might be normal to have a mental illness sounds oxymoronic at best and conspiratorially threatening at worst. Yet the widening scope of psychiatry has been driven by a belief – on the parts of both mental health consumers and clinicians alike – that psychiatry can help with an increasingly large range of issues.

The diagnostic creep of psychiatry becomes more understandable by conceptualising mental illness, like most things in nature, on a continuum. Many forms of psychiatric disorder, such as schizophrenia or severe dementia, are so severe – that is to say, divergent from normality – that whether they represent illness is rarely debated. Other syndromes, such as generalised anxiety disorder, might more closely resemble what seems, to some, like normal worry. And patients might even complain of isolated symptoms such as insomnia or lack of energy that arise in the absence of any fully formed disorder. In this way, a continuous view of mental illness extends into areas that might actually be normal, but still detract from optimal, day-to-day function.

While a continuous view of mental illness probably reflects underlying reality, it inevitably results in grey areas where ‘caseness’ (whether someone does or does not have a mental disorder) must be decided based on judgment calls made by experienced clinicians. In psychiatry, those calls usually depend on whether a patient’s complaints are associated with significant distress or impaired functioning. Unlike medical disorders where morbidity is often determined by physical limitations or the threat of impending death, the distress and disruption of social functioning associated with mental illness can be fairly subjective. Even those on the softer, less severe end of the mental illness spectrum can experience considerable suffering and impairment. For example, someone with mild depression might not be on the verge of suicide, but could really be struggling with work due to anxiety and poor concentration. Many people might experience sub-clinical conditions that fall short of the threshold for a mental disorder, but still might benefit from intervention.

The truth is that while psychiatric diagnosis is helpful in understanding what ails a patient and formulating a treatment plan, psychiatrists don’t waste a lot of time fretting over whether a patient can be neatly categorised in DSM, or even whether or not that patient truly has a mental disorder at all. A patient comes in with a complaint of suffering, and the clinician tries to relieve that suffering independent of such exacting distinctions. If anything, such details become most important for insurance billing, where clinicians might err on the side of making a diagnosis to obtain reimbursement for a patient who might not otherwise be able to receive care.


Though many object to psychiatry’s perceived encroachment into normality, we rarely hear such complaints about the rest of medicine. Few lament that nearly all of us, at some point in our lives, seek care from a physician and take all manner of medications, most without need of a prescription, for one physical ailment or another. If we can accept that it is completely normal to be medically sick, not only with transient conditions such as coughs and colds, but also chronic disorders such as farsightedness, lower back pain, high blood pressure or diabetes, why can’t we accept that it might also be normal to be psychiatrically ill at various points in our lives?

The answer seems to be that psychiatric disorders carry a much greater degree of stigma compared with medical conditions. People worry that psychiatrists think everyone is crazy because they make the mistake of equating any form of psychiatric illness with being crazy. But that’s like equating a cough with tuberculosis or lung cancer. To be less stigmatising, psychiatry must support a continuous model of mental health instead of maintaining an exclusive focus on the mental disorders that make up the DSM. If general medicine can work within a continuous view of physical health and illness, there is no reason why psychiatry can’t as well.

Criticism of this view comes from concern over the type of intervention offered at the healthier end of the continuum. If the scope of psychiatry widens, will psychiatric medications be vastly overprescribed, as is already claimed with stimulants such as methylphenidate (Ritalin) for attention deficit hyperactivity disorder (ADHD)? This concern is well worth fretting over, given the uncertain effectiveness of medications for patients who don’t quite meet DSM criteria. For example, a 2008 study by the Harvard psychologist Irving Kirsch published in PLOS Medicine found that, for milder forms of depression, antidepressants are often no better than placebos. Likewise, research suggests that children at risk of developing psychosis – but not diagnosable just yet – might benefit more from fish oil or psychotherapy than antipsychotic drugs.

In the end, implementing pharmacotherapy for a given condition requires solid evidence from peer-reviewed research studies. Although by definition the benefit of medications decreases at the healthier end of a mental health continuum (if one isn’t as sick, the degree of improvement will be less), we need not reject all pharmacotherapy at the healthier end of the spectrum, provided medications are safe and effective. Of course, medications aren’t candy – most have a long list of potential side effects ranging from trivial to life-threatening. There’s a reason such medications require a prescription from a physician and why many psychiatrists are sceptical of proposals to grant prescribing privileges to health practitioners with far less medical training.


Pharmacotherapy for healthier individuals is likely to increase in the future as safer medications are developed, just as happened after selective serotonin re-uptake inhibitors (SSRIs) supplanted tricyclic antidepressants (TCAs) during the 1990s. In turn, the shift to medicating the healthier end of the continuum paves a path towards not only maximising wellness but enhancing normal functioning through ‘cosmetic’ intervention. Ultimately, availability of medications that enhance brain function or make us feel better than normal will be driven by consumer demand, not the Machiavellian plans of psychiatrists. The legal use of drugs to alter our moods is already nearly ubiquitous. We take Ritalin, modafinil (Provigil), or just our daily cup of caffeine to help us focus, stay awake, and make that deadline at work; then we reach for our diazepam (Valium), alcohol, or marijuana to unwind at the end of the day. If a kind of anabolic steroid for the brain were created, say a pill that could increase IQ by an average of 10 points with a minimum of side effects, is there any question that the public would clamour for it? Cosmetic psychiatry is a very real prospect for the future, with myriad moral and ethical implications involved.

In the final analysis, psychiatrists don’t think that everyone is crazy, nor are we necessarily guilty of pathologising normal existence and foisting medications upon the populace as pawns of the drug companies. Instead, we are just doing what we can to relieve the suffering of those coming for help, rather than turning those people away.

The good news for mental health consumers is that clinicians worth their mettle (and you might have to shop around to find one) don’t rely on the DSM as a bible in the way that many imagine, checking off symptoms like a computer might and trying to ‘shrink’ people into the confines of a diagnostic label. A good psychiatrist draws upon clinical experience to gain empathic understanding of each patient’s story, and then offers a tailored range of interventions to ease the suffering, whether it represents a disorder or is part of normal life.

Joseph Pierre is a professor of psychiatry at the University of California, Los Angeles and chief of the Hospital Psychiatry Division at the VA Greater Los Angeles Healthcare System. He writes the Psych Unseen blog for Psychology Today. Aeon


How to Think Like a Genius, According to Nobel Laureate Richard Feynman Posted February 28th 2021

You don’t have to understand quantum mechanics to use this advice to start solving your toughest problems.

By Jessica Stillman (@EntryLevelRebel)

Richard Feynman.

IQ may be largely fixed, but that doesn’t mean intelligence is. While you’re stuck with a certain amount of intellectual horsepower, how you employ that talent makes a big difference. Learning different ways to approach problems and dodge cognitive pitfalls effectively makes you smarter. Even changing the time of day you tackle a problem can make you smarter.

So how do you set yourself up to maximize your intelligence? There are few better qualified to answer this question than a certified genius.

How to be a genius, according to a genius. 

Physicist Richard Feynman received the Nobel Prize for his work unraveling one of the most mind-bending subjects known to humankind: quantum physics. He was also famous for his clear and engaging communication style. The man wasn’t just brilliant, he was also great at explaining the process he used to think brilliantly. 

I’ve covered a few of these tips here before, but recently came across another great one on the blog Farnam Street. The post highlights a classic lecture by mathematician and MIT professor Gian-Carlo Rota on how to get students to pay attention in class. 

Many of these ideas are useful to anyone trying to seize and hold attention, but one tip is useful for just about anyone who has ever faced a problem in their life (so all of us, then). It comes from Feynman originally, according to Rota:

Richard Feynman was fond of giving the following advice on how to be a genius. You have to keep a dozen of your favorite problems constantly present in your mind, although by and large they will lay in a dormant state. Every time you hear or read a new trick or a new result, test it against each of your 12 problems to see whether it helps. Every once in a while there will be a hit, and people will say: “How did he do it? He must be a genius!”

The joy of this advice is that it is as simple as it is powerful, and you don’t need a super brain to implement it. It’s all about your system, not your talent.

Rather than a sky-high IQ, Feynman’s approach requires the foresight to make a catalog of your most pressing problems. Add to that the attentiveness to watch out for new mental models, hacks, and relevant concepts (particularly from fields not usually thought of as relevant) and you have a simple recipe for a steady stream of fresh, useful ideas. 

And finding ideas like that is what intelligence is in practice. Cracking brain teasers or spotting patterns on some abstract test might give you bragging rights (or a leg up in college admissions). But the ability to solve problems and improve the world is real-life genius. Follow Feynman’s simple framework and you’ll be well on your way to making more of such smart moves.

Neuroscience Readies for a Showdown Over Consciousness Ideas

To make headway on the mystery of consciousness, some researchers are trying a rigorous new way to test competing theories. Posted February 8th 2021

Quanta Magazine

  • Philip Ball


Neuroscientists are preparing to test their ideas about the origins of consciousness — the cognitive state of experiencing your own existence. Credit: Ryan Garcia for Quanta Magazine.

Some problems in science are so hard, we don’t really know what meaningful questions to ask about them — or whether they are even truly solvable by science. Consciousness is one of those: Some researchers think it is an illusion; others say it pervades everything. Some hope to see it reduced to the underlying biology of neurons firing; others say that it is an irreducibly holistic phenomenon.

The question of what kinds of physical systems are conscious “is one of the deepest, most fascinating problems in all of science,” wrote the computer scientist Scott Aaronson of the University of Texas at Austin. “I don’t know of any philosophical reason why [it] should be inherently unsolvable” — but “humans seem nowhere close to solving it.”

Now a new project currently under review hopes to close in on some answers. It proposes to draw up a suite of experiments that will expose theories of consciousness to a merciless spotlight, in the hope of ruling out at least some of them.

If all is approved and goes according to plan, the experiments could start this autumn. The initial aim is for the advocates of two leading theories to agree on a protocol that would put predictions of their ideas to the test. Similar scrutiny of other theories will then follow.

Whether or not this project, funded by the Templeton World Charity Foundation, narrows the options for how consciousness arises, it hopes to establish a new way to do science for difficult, contentious problems. Instead of each camp championing its own view and demolishing others, researchers will collaborate and agree to publish in advance how discriminating experiments might be conducted — and then respect the outcomes.

Dawid Potgieter, a senior program officer at the Templeton World Charity Foundation who is coordinating the endeavor, says that this is just the beginning of a sustained effort to winnow down theories of consciousness. He plans to set up several more of these “structured adversarial collaborations” over the next five years.

He is realistic about the prospects. “I don’t think we are going to come to a single theory that tells us everything about consciousness,” he said. “But if it were to take a hundred years to solve the mystery of consciousness, I hope we can cut it down to fifty.”

A Workspace for Awareness

Philosophers have debated the nature of consciousness and whether it can inhere in things other than humans for thousands of years, but in the modern era, pressing practical and moral implications make the need for answers more urgent. As artificial intelligence (AI) grows increasingly sophisticated, it might become impossible to tell whether one is dealing with a machine or a human merely by interacting with it — the classic Turing test. But would that mean AI deserves moral consideration?

Understanding consciousness also impinges on animal rights and welfare, and on a wide range of medical and legal questions about mental impairments. A group of more than 50 leading neuroscientists, psychologists, cognitive scientists and others recently called for greater recognition of the importance of research on this difficult subject. “Theories of consciousness need to be tested rigorously and revised repeatedly amid the long process of accumulation of empirical evidence,” the authors said, adding that “myths and speculative conjectures also need to be identified as such.”

You can hardly do experiments on consciousness without having first defined it. But that’s already difficult because we use the word in several ways. Humans are conscious beings, but we can lose consciousness, for example under anesthesia. We can say we are conscious of something — a strange noise coming out of our laptop, say. But in general, the quality of consciousness refers to a capacity to experience one’s existence rather than just recording it or responding to stimuli like an automaton. Philosophers of mind often refer to this as the principle that one can meaningfully speak about what it is to be “like” a conscious being — even if we can never actually have that experience beyond ourselves.

Plenty of cognition takes place outside the grasp of conscious awareness — in that sense, we respond to some cues and stimuli “unconsciously.” A distinguishing feature of our minds, however, is that we can hold on to a piece of information, an idea or an intention as a motivation for subsequent decisions and behaviors. If we’re hungry, we salivate as a reflex, but we might also choose to eat, go to the kitchen and get what we want from the cupboard.

Some researchers, such as the cognitive scientist Stanislas Dehaene of the Collège de France in Paris, suggest that this conscious behavior arises when we hold a piece of information in a “global workspace” within the brain, where it can be broadcast to brain modules associated with specific tasks. This workspace, he says, imposes a kind of information bottleneck: Only when the first conscious notion slips away can another take its place. According to Dehaene, brain-imaging studies suggest this “conscious bottleneck” is a distributed network of neurons in the brain’s prefrontal cortex.

This picture of consciousness is called global workspace theory (GWT). In this view, consciousness is created by the workspace itself — and so it should be a feature of any information-processing system capable of broadcasting information to other processing centers. It makes consciousness a kind of computation for motivating and guiding actions. “Once you have information and the information is made broadly available, in that act consciousness occurs,” said Christof Koch, chief scientist and president of the Allen Institute for Brain Science in Seattle.

Credit: Lucy Reading-Ikkanda / Quanta Magazine.

But to Koch, the argument that all of cognition, including consciousness, is merely a form of computation “embodies the dominant myth of our age: that it’s just an algorithm, and so is just a clever hack away.” According to this view, he said, “very soon we’ll have clever machines that model most of the features that the human brain has and thereby will be conscious.”

He has been developing a competing theory in collaboration with its originator, the neuroscientist Giulio Tononi of the University of Wisconsin-Madison. They say that consciousness is not something that arises while turning inputs into outputs but rather an intrinsic property of the right kind of cognitive network, one that has specific features in its architecture. Tononi christened this view integrated information theory (IIT).

In contrast to GWT, which starts by asking what the brain does to create the conscious experience, IIT begins instead with the experience. “To be conscious is to have an experience,” Tononi said. It doesn’t have to be an experience about anything, although it can be; dreams, or some “blank mind” states attained by meditation also count as conscious experiences. Tononi has sought to identify the essential features of these experiences: namely, that they are subjective (they exist only for the conscious entity), structured (their contents relate to one another: “the blue book is on the table”), specific (the book is blue, not red), unified (there is only one experience at a time) and definitive (there are bounds to what the experience contains). From these axioms, Tononi and Koch claim to have deduced the properties that a physical system must possess if it is to have some degree of consciousness.

Credit: Lucy Reading-Ikkanda / Quanta Magazine.

IIT does not portray consciousness as information processing but rather as the causal power of a system to “make a difference” to itself. Consciousness, Koch said, is “a system’s ability to be acted upon by its own state in the past and to influence its own future. The more a system has cause-and-effect power, the more conscious it is.”

This harks back to the famous “cogito, ergo sum” dictum of René Descartes in the 17th century. “The one thing, the only thing, that is [a] given is my experience,” Koch said. “That’s Descartes’ central insight.”

To Tononi and Koch, systems in which information is merely “fed forward” to convert inputs to outputs, as in digital computers, can only be “zombies,” which might act as if they are conscious but cannot truly possess that property. Much of Silicon Valley may believe that computers will eventually become conscious, but to Koch, unless those machines have the right hardware for consciousness, they will just constitute a “deep fake.”

“Digital computers can simulate consciousness, but the simulation has no causal power and is not actually conscious,” Koch said. It’s like simulating gravity in a video game: You don’t actually produce gravity that way.

‘Surrounded and Immersed’ in Consciousness

One of the most striking features of IIT is that it makes consciousness a matter of degree. Any system with the required network architecture may have some of it. “No matter whether the organism or artifact hails from the ancient kingdom of Animalia or from its recent silicon offspring, no matter whether the thing has legs to walk, wings to fly, or wheels to roll with,” Koch wrote in his 2012 book Consciousness: Confessions of a Romantic Reductionist. “If it has both differentiated and integrated states of information, it feels like something to be such a system.”

This view arouses a lot of skepticism. The influential American philosopher of mind John Searle of the University of California, Berkeley derides the idea as a form of panpsychism: crudely, a belief that mind and awareness infuse the whole cosmos. In a withering critique of IIT, Searle has asserted that “the problem with panpsychism is not that it is false; it does not get up to the level of being false. It is strictly speaking meaningless because no clear notion has been given to the claim.” Consciousness, he wrote, “cannot be spread over the universe like a thin veneer of jam” — it “comes in units and panpsychism cannot specify the units.”

Koch, however, is perfectly happy to think that “we are surrounded and immersed” in consciousness. He believes “that consciousness is a fundamental, elementary property of living matter. It can’t be derived from anything else.”

But this doesn’t mean it is spread equally everywhere. Koch and Tononi assert that, while consciousness can be an attribute of many things, a significant amount of it can exist only in particular kinds of things, notably human brains (indeed, in specific parts of human brains). And to turn IIT into a quantitative, testable theory, Koch and Tononi have formulated a criterion for what kinds of things those are.

To reflect how conscious an information-processing network is, Koch and Tononi define a measure of “information integration,” which they call Φ (the Greek letter phi). It represents the amount of “irreducible cause-effect structure”: how much the network as a whole can influence itself. This depends on interconnectivity of feedback. If a network can be divided into smaller networks that don’t exert causal power on one another, then it will have a correspondingly low value of Φ no matter how many processing nodes it has.

Equally, “any system whose functional connectivity and architecture yield a Φ value greater than zero has at least a trifle of [conscious] experience,” Koch said. That includes the biochemical regulatory networks found in every living cell, and also electronic circuits that have the right feedback architecture. Since atoms can influence other atoms, “even simple matter has a modicum of Φ.” But systems that have enough Φ to “know” of their existence, as we do, are rare (although the theory anticipates that higher animals will also have a degree of that experience).
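The partition intuition behind Φ can be made concrete with a deliberately simplified score. To be clear, this is not the real IIT calculation, which is defined over cause-effect repertoires and probabilities rather than raw edge counts; the toy measure below merely counts how many directed connections the weakest bipartition of a network must sever, so a network that splits cleanly into causally independent parts scores zero.

```python
# Toy illustration of the partition idea behind Phi (NOT the real IIT
# calculation): if a network can be cut into two parts with no
# connections between them, its "integration" is zero; otherwise the
# score is the number of edges the weakest cut must sever.
from itertools import combinations

def toy_phi(nodes, edges):
    """Minimum number of edges crossing any bipartition of the nodes."""
    nodes = list(nodes)
    best = None
    for r in range(1, len(nodes) // 2 + 1):
        for part in combinations(nodes, r):
            part = set(part)
            crossing = sum(1 for a, b in edges
                           if (a in part) != (b in part))
            best = crossing if best is None else min(best, crossing)
    return best

# A feedback loop among three nodes: every possible cut severs edges.
loop = [("A", "B"), ("B", "C"), ("C", "A")]
print(toy_phi("ABC", loop))        # 2 — irreducible to independent parts

# Two modules with no causal influence on each other: a free cut exists.
split = [("A", "B"), ("C", "D")]
print(toy_phi("ABCD", split))      # 0 — reducible, so the toy score is zero
```

The exhaustive search over bipartitions is also why real Φ is notoriously expensive to compute: the number of cuts grows exponentially with network size.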

Because of this effort to make IIT quantitative and testable, the computer scientist Scott Aaronson puts it “in something like the top 2 percent of all mathematical theories of consciousness ever proposed.” He believes that the theory is flawed — but contrary to Searle, he says that “almost all competing theories of consciousness have been so vague, fluffy and malleable that they can only aspire to wrongness.”

Seeking Neural Correlates

Koch would concur with that. “Everybody seems to have a pet theory of consciousness, but few of them are quantitative or predictive,” he said. He believes that both GWT and IIT are testable. “Logically speaking, they could be wrong, or both could capture certain aspects of reality.” How, though, do we test them?

Enter the Templeton World Charity Foundation, which has assigned $20 million to the task of testing theories of consciousness to destruction. It is starting with what Potgieter calls a “structured adversarial collaboration” involving IIT and GWT because they are able to make testable and contrasting predictions. The plan is for the proponents of the two theories to agree in advance to an experimental protocol that ought to distinguish whether either or both of the theories are wrong. “The condition was that the leaders of the theories would sign off on this protocol, in the sense of acknowledging that the predictions accurately represent the theory,” Potgieter said. (He credited the willingness of Dehaene and Tononi “to put themselves on the line” as one of the considerations that led to the choice of GWT and IIT as the first theories on the block.)

The collaboration will get a top journal to commit to publishing the outcome of the experiments, come what may. The study will also include replication experiments. “This is basically open science,” Potgieter said. “If we can use the best practices in open science to demonstrate progress in an area where no one has done very much, it could show that it’s a useful approach.”

He says that the researchers now have a final experimental design to test incompatible predictions of GWT and IIT head-to-head. The details are yet to be disclosed, but they will deploy a battery of brain-monitoring techniques, such as functional magnetic resonance imaging (fMRI), electrocorticography and magnetoencephalography. The experiment seems to be “the first time ever that such an audacious, adversarial collaboration has been undertaken and formalized within the field of neuroscience,” Potgieter added. He hopes that if the project is approved, the experimental work will be able to start after the summer and run for about three years, involving 10-12 labs.

What differences between the theories will the experiments test? One is in the location of consciousness in the brain. According to GWT, the “neural correlates of consciousness” — the patterns of neuron activity that reflect the conscious state — should show up in parts of the brain that include the parietal and frontal lobes of the cortex. The parietal lobe processes sensory data such as touch and spatial sense. The frontal lobe is associated with cognitive processing for “higher” functions such as memory, problem solving, decision-making and emotional expression.

But people who have had a large fraction of the frontal lobe removed — as used to happen in neurosurgical treatments of epilepsy — can seem remarkably normal, Koch says. According to IIT, the seat of consciousness is instead likely to be in the sensory representation in the back of the brain, where the neural wiring seems to have the right character. “I’m willing to bet that, by and large, the back is wired in the right way to have high Φ, and much of the front is not,” Tononi said.

We can compare the locations of brain activity in people who are conscious or have been rendered unconscious by anesthesia, he says. If such tests were able to show that the back of the brain indeed had high Φ but was not associated with consciousness, he admits that “IIT would be very much in trouble.”

A recent fMRI study of brain activity in volunteers who were either conscious or under general anesthesia, conducted by a group that included Dehaene, showed distinct patterns corresponding to the two states. During periods of unconsciousness, brain activity persisted only among regions with direct anatomical connections, whereas during conscious activity, complex long-distance interactions did not seem constrained by the brain’s “wiring.”

However, one of the authors of the study, the physicist-turned-neuroscientist Enzo Tagliazucchi of the University of Buenos Aires and the Pitié-Salpêtrière Hospital in Paris, stresses that the findings don’t yet clearly support any particular theory of consciousness. “It would be premature to frame our work within one theory or the other,” he said. “It doesn’t tip any balance, nor is it intended to do so.”

Another prediction of GWT is that a characteristic electrical signal in the brain, arising about 300-400 milliseconds after a stimulus, should correspond to the “broadcasting” of the information that makes us consciously aware of it. Thereafter the signal quickly subsides. In IIT, the neural correlate of a conscious experience is instead predicted to persist continuously while the experience does. Tests of this distinction, Koch says, could involve volunteers looking at some stimulus like a scene on a screen for several seconds and seeing whether the neural correlate of the experience persists as long as it remains in the consciousness.

Not everyone is optimistic that it will be possible to find rigorous, definitive ways of testing and adjudicating these two theories. “The current project is an attempt in good faith in this direction,” said Francis Fallon, a philosopher of mind at St. John’s University in Queens, New York, who is involved in the Templeton project. But he noted that because both theories have already been shaped by existing empirical evidence, it would be surprising to find new data with which either seems fundamentally inconsistent.

Hakwan Lau, a psychologist who studies behavioral neuroscience at the University of California at Los Angeles, is not convinced IIT is even a truly scientific theory. “IIT is based on armchair theorizing,” he said. He thinks that what IIT advocates regard as the likely locus of consciousness doesn’t necessarily follow from the theory but is just their subjective view. “To make empirical predictions [of the theory] testable by current methods,” he said, “many additional assumptions and approximations need to be made.”

To him, Lau says, IIT and GWT are “so different that I don’t know how to begin to compare them.” In contrast, Tagliazucchi thinks it possible that the two are essentially the same theory, but “developed from third- and first-person viewpoints.”

The cognitive scientist Anil Seth of the University of Sussex in the U.K. shares reservations about whether the Templeton project might prove premature. A “definitive rebuttal or validation” seems unlikely, he said, because the theories “make too many different assumptions, have different relations to testability and may even be trying to explain different things. GWT seems mostly about function and cognitive access, while IIT is a theory based primarily on phenomenology, not function, and is difficult to test.”

Tononi and his collaborators would counter that they have been developing experimental tests of IIT for many years — work that has led to the development of a crude but effective tool for evaluating consciousness in brain-damaged patients. Yet even Tononi agrees that, because both theories are still under construction and remain so “far apart,” it might be too much to expect a definite outcome. “Their predictions aren’t as precise as in physics,” he said.

Still, he argues that “in the interests of making progress, you have to start with what you’ve got.” Besides, the exercise “forces the theories to say something specific.” Regardless of the outcome, Tononi feels sure that the tests will teach us something new and useful about the brain.

Other Contending Theories

No one imagines that eliminating GWT or IIT would solve the mystery of what consciousness is. For one thing, there are other serious theories, too.

Among them, two common classes are called first-order and higher-order theories (HOTs). “A first-order theory says that there’s nothing more to the mind than the basic cognitive processing of sensory information,” according to Lau. What brings some of that sensory information into consciousness, first-order theorists say, is something unidentified but intrinsic to how it’s represented in the brain — for example, the dynamics of the interactions among elements in its neural network.

In contrast, he said, “higher-order theorists say that the mind does something with the representation, over and above the cognition itself, to produce consciousness.” In a HOT, a conscious experience is not merely a record of the perceptions involved but involves some additional mechanism that draws on that representation. That higher-order state doesn’t necessarily serve some function in processing the information, as in GWT; it just is.

“Compared to other existing theories, HOT can more readily account for complex everyday experiences, such as emotions and episodic memories,” Lau and his colleagues, the philosopher Richard Brown of LaGuardia Community College and the neuroscientist Joseph LeDoux of New York University, wrote recently.

The Templeton World Charity Foundation has assigned further funds to test such ideas as it will GWT and IIT. “I hope to host about nine meetings over the next five years, to bring together two or more incompatible theories and try to hash it out between those theories,” Potgieter said. He admits that “it might be that none of the current ideas is correct.”

It may also turn out that no scientific experiment can be the sole and final arbiter of a question like this one. “Even if only neuroscientists adjudicated the question, the debate would be philosophical,” Fallon said. “When interpretation gets this tricky, it makes sense to open the conversation to philosophers. Many of the neuroscientists in this field are already engaging in philosophy, some quite excellently.”

Potgieter hopes that the adversarial approach will allow progress on other big questions — like understanding how consciousness arose in the first place, or how life itself did. “Wherever there is a big question with a bunch of different theories that are all strong but all siloed away from each other, we will try to make progress by breaking down the silos,” he said.

“I think it is a wonderful initiative, and should be much more frequent in science,” Tononi said. “It forces the proponents to focus and enter some common framework. I think we all stand to gain one way or another.”

Philip Ball is a science writer and author based in London who contributes frequently to Nature, New Scientist, Prospect, Nautilus and The Atlantic, among other publications.

This post originally appeared on Quanta Magazine and was published March 6, 2019. This article is republished here with permission.


All Mixed Up by R.J Cook February 3rd 2021

Roberta Jane Cook: There are clear and established differences in gender behaviours. Denying this, and imposing a one-size-fits-all blob identity on the masses while the elite do as they like, is a recipe for conflict and disaster. R.J Cook

After this post you can read what I consider a very nasty and dangerous piece of what is basically more poisonous propaganda from feminists and their ‘he for she’ supporters – who are like eunuchs guarding the harem. There was a word for such men, but it is now a hate crime to use it as language is progressively censored, with consequent limitations on our thoughts, behaviour, interactions and sense of identity. This is because we are in a police state, the leftist liberal paradise currently under ongoing and ever more restrictive development. Here they don’t just hand out identity cards and Covid injection certificates, like the one I have. They hand out the complete identity package and the necessary brain training – a black senator actually referenced ‘reprogramming’ Trump supporters.

Hence the confines of LGBTQI and BLM, and sensitivities toward Islam, whose adherents are blob-sensitive only to themselves. We are supposed to be atoms in a blob, worshipping a God who we are supposed to believe built it all, along with God’s earthly representatives like the Queen, the police and the NHS, who keep us safe and need more taxes to pay them more for doing such a good job.

For the slightest mistake, or through malicious, dishonest, moronic police behaviour, a person has their DNA, fingerprints and photographs taken. To relate all of this, and more, as I will, is to advance abstractions beyond the grasp of our new wave of school and ‘uni’ production-line products.

Language limitations are wonderful. It was one of my old postgraduate teachers at London University who started groundbreaking research on language limitation and capacity for abstract thought. He was Professor Basil Bernstein, a highly gifted academic – unlike most professors, who are politicians blowing with the wind, not in it. So Bernstein was rubbished by 80s trendies.

However, as with Snowball and Napoleon in ‘Animal Farm’, these trendies latched on to the broad concept of language codes. So while disparaging Bernstein as classist on the one hand, they set out not to challenge working-class restricted language codes but to extend them to the masses, along with an extension of brain-numbing ‘uni’ degrees and student loan debts forcing the new ‘graduates’ into mind-numbing careers (sic) in the call centres that have replaced what old fogies like to call ‘good old fashioned service.’

So if a person reads the following so-called ‘new research’, most won’t question its premise, the reality being that this type of researcher starts with the conclusion to their hypothesis, seeking evidence accordingly. Many won’t understand what the word research means, let alone the issues of sampling and questionnaires, or the necessary brain tissue sampling, environmental and cultural variations, or that it is known that experience modifies DNA. The word research on its own is enough to conjure up images of serious-looking boffins beavering away in science laboratories bubbling with liquids, and whiteboards covered in calculations which only a born race of scientists could ever understand. The masses think like that because they are like the babies in the ‘nursery’ in Huxley’s ‘Brave New World’, crawling across the metal floor and given electric shocks as soon as they lay hands on the interesting colourful things set out for them when they get to the other side.

What these typical researchers – having more in common with Nazi doctors than truth seekers – want to do is abolish the concept of gender difference. Hence their allegedly scientific research offers the carrot that having a gender-balanced brain is good for you, and this is how to get one. That will be the way to a better if not perfect life. It’s the tosh you find written by a plethora of conceited ‘academics’ (sic) in magazines like ‘Psychology Today.’ That publication routinely has a middle-class, well made-up model’s face on the cover, along with a list of all the horrors, burdens and vulnerabilities of being ‘WOMAN.’

People must not be allowed to be individuals unless they are part of the ruling, thieving, money-soaked, rich, power-mad, lying, manipulative elites. Women must be forgiven and understood because of pre-menstrual tension, post-pregnancy depression, murder and God knows what else. They must be worshipped and seen as able to do anything the ‘man blob’ can do, but men mustn’t be allowed to compete with women in sport because it is a fact that oestrogen is relevant to having babies, while testosterone is the hormone that helps men fight wars and challenge nature in a dangerous and risky world.

Articles like the one below are viewed through the drivel of so-called gender equality, when in reality it is about scapegoating and castrating men, ignoring their gender-specific achievements and taking for granted the technology of a very unnatural environment whose fragility has been made all too obvious by the Covid panic, where the average woman is the first to run for cover because nature tells them they are the baby carriers and home makers. As young people, women are brainwashed into thinking they can be anything but wives and mothers.

This is poison. Lockdown has left many young women shut in, the egotistic, flirty world of work ladders fading, with depression and suicide a clear demonstration of female behaviour in the absence of male company, sexual possibilities, scapegoats and protection. The last point is crucial because women have less muscle mass and more fat. Brains respond and develop according to our sense of who we are, what we look like, where we come from, where we go and all the obstacles that we overcome or are beaten by. Whether we look like men or women is crucial and a matter of fact.

However, in the following article, its pumped-up authors expect their list of qualifications – I have many but know them for what they are worth – to impress the ignorant masses, who they patronise as morons because that is what school and ‘uni’ train them to be. As products of the post-80s uni system these authors are also morons, but don’t know it. They know only what they have been told they are – look up the Kirkald experiment, which involved former Prime Minister Gordon Brown; it has a lot to answer for.

I rarely reveal my perfectly legal female identity because it nauseates me how people want to make an issue of gender identities imposed on them. Identifying with feminism is a sure sign of cowardice and lack of individuality. The same goes for LGBTQI, with all their petty infighting and feminist-style victimhood. The law has an appalling bias in favour of women. The law should not be about quotas. The premises, prejudices and methodology of the GIC need questioning and reforming. It is driven by social engineering targets rather than science.

The demand for MTF sex change should raise questions, but moronic responses from the masses and police, with their restricted language codes, should face massive penalties and custodial sentences – as simple as that, not all the LGBTQI pride posturing and exhibitionism holding up the traffic.

The same goes for the revolting Terfs (Trans Exclusionary Radical Feminists). The very fact that such bigots exist makes a nonsense of the following garbage about mixed-gender brains. Real and historic research has established strong evidence of a continuum of gender behaviours, commensurate with the need for biological diversity and appropriate, natural interaction. There are profoundly significant psychological tests which show defined differences in skill sets at an early age.

Social engineers have been working hard at re-educating, but one only has to watch girls’ football to notice key differences in behaviour. That is not to say we cannot learn from each other in many ways. However, having serious politically motivated restrictions on thought and language, backed by feminists and their lackeys claiming ‘new research’ as follows, ultimately causes more mental health issues and conflict. R.J Cook

‘Male’ vs ‘female’ brains: having a mix of both is common and offers big advantages – new research

January 20, 2021 2.55pm GMT


  1. Barbara Jacquelyn Sahakian, a Friend of The Conversation; Professor of Clinical Neuropsychology, University of Cambridge
  2. Christelle Langley, Postdoctoral Research Associate, Cognitive Neuroscience, University of Cambridge
  3. Qiang Luo, Associate Principal Investigator of Neuroscience, Fudan University
  4. Yi Zhang, Visiting PhD Candidate, University of Cambridge

Disclosure statement

Barbara Jacquelyn Sahakian receives funding from the Wellcome Trust, the Lundbeck Foundation, the Leverhulme Trust, Eton College and the Wallitt Foundation. Her research is conducted within the NIHR Cambridge Biomedical Research Centre (Mental Health and Neurodegeneration Themes) and the NIHR MedTech and Invitro Diagnostic Co-operative (MIC). I thank Thomas Piercy of the University of Cambridge for the image of the androgynous brain.

Qiang Luo receives funding from the National Natural Science Foundation of China and the Natural Science Foundation of Shanghai.

Christelle Langley and Yi Zhang do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.



University of Cambridge provides funding as a member of The Conversation UK.


CC BY-ND. We believe in the free flow of information. Republish our articles for free, online or in print, under a Creative Commons licence.

From advertising to the workplace, it is often assumed that men and women are fundamentally different – from Mars and Venus, respectively. Of course, we all know people who are more androgynous, having a mix of personality traits that are stereotypically considered to be male or female. Importantly, such “psychological androgyny” has long been associated with traits such as better cognitive flexibility (the mental ability to shift between different tasks or thoughts), social competence and mental health.

But how does this relate to the brain? Are people who are more androgynous in their behaviour going against their biological nature, doing things that their brains are not optimised for? It’s long been unknown whether there is such a thing as brain androgyny. But our new study, published in Cerebral Cortex, suggests it does exist – and it’s common.

Psychological androgyny is thought to be psychologically protective. For example, we know it is associated with fewer mental health problems such as depression and anxiety. It has also been linked to higher creativity.

We’re all familiar with the traits that are stereotypically classified as male or female. Men, for example, are not encouraged to express feelings or cry when upset. Instead they are expected to be tough, assertive, rational and good at visuospatial tasks such as map reading. Women, on the other hand, are often expected to be more emotional, nurturing and better at language.

But these differences are likely to be partly down to social norms and expectations – we all want to be liked, so we conform. If a girl is told that it is rude or unbecoming to be assertive, for example, she may change her behaviour to accommodate this, affecting her future career choices. Female adolescents, for example, may not be encouraged by friends and family to consider rewarding but dangerous careers such as the military or policing.

Sex in the brain

Scientists have long argued over how different male and female brains really are. There are many reports of differences between male and female brains in the literature. Other researchers, however, argue that these differences are tiny and the categories are anything but absolute. One study suggested that, psychologically, most of us are in fact probably somewhere on a spectrum between what we stereotypically consider a “male” and a “female”.

But does that mean that the people who fall somewhere in the middle are more androgynous in their brains as well as their behaviour? To test this, we created a brain continuum using a machine-learning algorithm and neuroimaging data. While male and female brains are similar, the connectivity between different brain areas has been shown to differ. We used these connectivity markers to characterise the brains of 9,620 participants (4,495 male and 5,125 female).
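The article does not specify the algorithm, so the following is only a generic sketch of how a continuum can be read off from labelled connectivity data. It uses synthetic numbers and a simple centroid-projection score in place of whatever the study actually used: project each subject’s connectivity profile onto the axis joining the two group means, then rescale to the range 0 to 1, so that scores near the middle of the range read as “androgynous”.

```python
# Generic sketch of a "brain continuum" (synthetic data; the published
# study's actual algorithm and connectivity features are not used here).
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10                          # stand-ins: subjects x connectivity markers
y = rng.integers(0, 2, size=n)          # synthetic labels: 0 = female, 1 = male
X = rng.normal(size=(n, d))
X[:, 0] += 0.8 * y                      # inject a weak group difference on one marker

# Project each brain onto the axis joining the two group centroids and
# rescale to [0, 1]: a crude continuum from female-typical to male-typical.
axis = X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)
score = X @ axis
continuum = (score - score.min()) / (score.max() - score.min())

# Subjects in a middle band would be read as "androgynous" on this toy scale.
androgynous = float(np.mean((continuum > 0.35) & (continuum < 0.65)))
print(f"{androgynous:.0%} of subjects fall in the middle band")
```

Any classifier with a continuous decision score (logistic regression, an SVM margin) would serve the same purpose; the essential move is replacing a binary label with a position on a one-dimensional scale.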

Outdated: Men are from Mars and women are from Venus. wjarek

We discovered that brains were indeed distributed across the entire continuum rather than just at the two ends. In a subsample, approximately 25% of brains were identified as male, 25% as female and 50% were distributed across the androgynous section of the continuum. What’s more, we found that participants who mapped at the centre of this continuum, representing androgyny, had fewer mental health symptoms, such as depression and anxiety, compared with those at the two extreme ends.

These findings support our novel hypothesis that there exists a neuroimaging concept of brain androgyny, which may be associated with better mental health in a similar way to psychological androgyny.

Why androgyny benefits us

To learn new things in order to adapt to the ever-changing global environment, we need to be able to be attentive to the world around us. We must also have mental wellbeing, flexibility and be able to employ a wide range of life strategies.

These skills enable us to rapidly understand external context and decide on the optimal response. They help us take advantage of time-limited opportunities and instil resilience. These skills may therefore confer an advantage on people with androgynous brains, while those without them are less likely to flourish.

But why is this the case? A meta-analysis of 78 studies of about 20,000 participants revealed that men who conform to typical masculine norms, for example never relying on others and exercising power over women, suffered more psychiatric symptoms than others, including depression, loneliness and substance abuse. They also felt more isolated, lacking social connections to others.

Image of a macho-looking man with a big beard.
Being macho doesn’t seem to make men happy. Volodymyr TVERDOKHLIB/Shutterstock

Women who try to conform pay a price too, perhaps opting out of their dream job because the industry is dominated by men or taking on the majority of tedious household chores. An androgynous person, however, is not influenced by gender norms to the same extent.

That doesn’t mean that there’s no hope for those at the extreme ends of the spectrum. The brain is changeable (plastic) to an extent. It is likely that the androgynous brain is influenced by both genetic and environmental factors, as well as an interaction between the two. Our own study suggests that people’s level of brain androgyny may change over the life course.

Future research is required to understand the influences on brain androgyny across the life span and how environmental factors, such as education, may affect it. Given that we have found that an androgynous brain offers better mental health, it follows that, for optimal performance in school, work and for better wellbeing throughout life, we need to avoid extreme stereotypes and offer children well-balanced opportunities as they grow up.

Excusing Religion Posted February 2nd 2021

The following article is long-winded and tedious, excusing religion as if it were helpful in identifying truth or reality. Religion is back, big time, because the masses need comfort, a fake friend, guide and defender and, above all, an afterlife. I will add to this comment later.

Even more so, the ruling Christian, Islamic and Judaic religious elites, along with the political cadres, need the masses distracted, divided, deluded and ritualistic to stop them wondering why 20% of the world’s population own more of the world’s resources than the rest of the world’s people put together, with the top 1% controlling most of it and getting ever richer from lockdown, while the masses have been losing more and more of their life’s work daily. R.J Cook

Who should have the right to decide what is delusion and what is real? Who has the right to judge the insanity of anyone in a world of multiculturalism and diversity? Is multiculturalism and diversity an insane religion? Is religion a form of mental illness? Am I allowed to ask such questions in this age of diversity? What does god think? As I wrote in my novel
‘Man, Mad, Woman’, God laughed nastily.
R.J Cook Image Appledene Photographics

Do atheists think differently? patrice6000/Shutterstock

Are the brains of atheists different to those of religious people? Scientists are trying to find out

January 18, 2021 1.45pm GMT


Miguel Farias, Associate Professor in Experimental Psychology, Coventry University

Disclosure statement

Miguel Farias receives funding from the John Templeton Foundation, the BIAL Foundation, and the Templeton Religion Trust.


Coventry University

Coventry University provides funding as a member of The Conversation UK.


The cognitive study of religion has recently reached a new, unknown land: the minds of unbelievers. Do atheists think differently from religious people? Is there something special about how their brains work? To illustrate what they’ve found, I will focus on three key snapshots.

The first one, from 2003, is probably the most photogenic moment of “neuro-atheism”. Biologist and atheist Richard Dawkins travelled to the lab of Canadian neuroscientist Michael Persinger in the hope of having a religious experience. In this BBC Horizon film, God on the Brain, a retro science-fiction helmet was placed on Dawkins’ head. This “god helmet” generated weak magnetic fields, applied to the temporal lobes.

Picture of Richard Dawkins.
Richard Dawkins. CC BY-SA

Persinger had previously shown that this kind of stimulation triggered a wide range of religious phenomena – from sensing the presence of someone invisible to prompting out-of-body experiences. With Dawkins, though, the experiment failed. As it turned out, Persinger explained, Dawkins’ temporal lobe sensitivity was “much, much lower” than is common in most people.

The idea that the temporal lobes may be the seat of religious experience has been around since the 1960s. But this was the first time that the hypothesis was extended to explain the lack of religious experience based on the lower sensitivity of a brain region. Despite the exciting possibility of testing this hypothesis with a larger sample of atheists, it remains to be done.

Image of Rodin's The Thinker.
Rodin’s The Thinker. wikipedia, CC BY-SA

The second snapshot takes us to 2012. Three articles published by labs in the USA and Canada presented the first evidence linking an analytical, logical thinking style to unbelief. Psychologists have been theorising about different ways that brains process information for a long time: conscious versus unconscious, reflective versus experiential, analytical versus intuitive. These are linked to activity in certain brain areas, and can be triggered by stimuli including art. The researchers asked participants to contemplate Rodin’s famous sculpture, The Thinker, and then assessed their analytical thinking and disbelief in god. They found that those who had viewed the sculpture performed better on the analytical thinking task and reported less belief in god than people who hadn’t seen the image.

In the same year, a Finnish lab published the results of a study where their scientists tried to provoke atheists into thinking supernaturally by presenting them with a series of short stories and asking if the punchline was a “sign of the universe” (interpreting something as a “sign” is more supernatural than interpreting something as, for example, a coincidence). They did this while scanning their brains using functional magnetic resonance imaging (fMRI). The more the participants suppressed supernatural thinking, the stronger the activation of the right inferior frontal gyrus was. We know this area is involved in cognitive inhibition, an ability to refrain from certain thoughts and behaviours.

Together, these studies suggest that atheists have a propensity to engage more in analytical or reflective thinking. If believing in gods is intuitive, then this intuition can be overridden by more careful thinking. This finding certainly raised the possibility that the minds of atheists are simply different from those of believers.

Replication crisis

So how robust are the findings? In 2015, a “replication crisis” hit the field of psychology. It turned out that the results of many classic studies couldn’t be achieved when running them again. The psychology of religion and atheism was no exception.

The experiment with Rodin’s Thinker was the first to be investigated. Three new studies were conducted with larger samples than the original — and they all failed to replicate the original results. With one sample, they found the very opposite: contemplating the Thinker increased religious belief.

One possible limitation with the original studies is that they had all been undertaken in the USA. Could culture act in such a decisive way that the analytical cognitive style associated with atheism in one country might be nonexistent elsewhere? The author of the original Rodin study attempted to answer this in a new study which included individuals from 13 countries. The results confirmed that a cognitive analytical style was only linked to atheism in three countries: Australia, Singapore and the USA.

In 2017, a double-blind study tested the link between unbelief and cognitive inhibition in a more robust way. Instead of using brain imaging to see which area lit up, the researchers used a brain stimulation technique to directly stimulate the area responsible for cognitive inhibition: the right inferior frontal gyrus. Half of the participants, however, were given a sham stimulation. The results showed that the brain stimulation worked: participants who received it performed better on a cognitive inhibition task. However, it had no effect on decreasing supernatural belief.

The complexity of atheism

The third snapshot is this one: a man is standing against a background which looks like a church. He appears to be doing the sign of the cross with his right hand while his left hand rests on his heart. He is a priest – but not of any church that believes in gods: he presides over the Positivist Temple of Humanity, a church for atheists and agnostics created by Auguste Comte in the 19th century. This priest is not doing the sign of the cross but the Positivist blessing.

Together with photographer Aubrey Wade, I stumbled upon this active temple in the south of Brazil, while collecting data for a large ongoing project involving over 20 labs across the world: Understanding Unbelief.

Image of a man doing the positivist blessing.
Positivist blessing. @Aubrey Wade, Author provided (No reuse)

Finding an active church of unbelievers dedicated to the love of humanity — its golden principle being “live for others” — ruptured how I thought of atheists and the boundary separating them from the religious. And this has implications for how we develop studies in this area. When doing experiments with believers we can use multiple stimuli, from religious images to music, to trigger a religious effect or cognition in the lab. But finding an equivalent for unbelievers has proved hard.

One brain imaging study conducted at Oxford University compared an image of the Virgin Mary with that of a regular woman, both painted in the same period. Researchers found that when Roman Catholics concentrated on the Virgin Mary while being subjected to electric shocks, this alleviated their perception of pain compared to looking at the other woman. This decrease in pain was associated with an engagement of the right ventro-lateral prefrontal cortex, a region known to drive pain inhibitory circuits.

No similar effect was found for the unbelievers, although they rated the secular image as more pleasant than the religious one. But what if the unbelievers being tested were members of the Positivist Temple and were instead shown an image of their goddess of humanity — would this have alleviated pain in a similar way to that experienced by the religious individuals?

The future cognitive science of atheism will have to think hard about how to move forward. It needs to develop models that account for cultural variations as well as consider the implications of atheists engaging with rituals that celebrate humanity.

Single Mothers: Psychological Problems for Kids?

Long-held stigmas about single mothers are wrong. Posted January 17th 2021

Posted Aug 08, 2016

Source: Dindo Jimenez

My friend “Andrea” was at the head of the trend toward a new kind of family. In an earlier post, “On NOT Waiting for Mr. Right,” she shared her perspective as a single woman who was five months pregnant.

“This isn’t what I dreamed of,” Andrea told me. She became pregnant via sperm donor insemination, joined Single Mothers by Choice and also took childbirth and parenting classes. But she had to explain her choice to most people—even to those teaching the courses. “There is an expectation that you have a partner or spouse who will show up at some point. I have to ask if I can bring a friend,” said Andrea, who was 40 years old when her daughter was born.

During previous generations, single mothers were viewed askance. Much of the skepticism and distrust was fueled by views of teenage pregnancies and poor outcomes for the children of young, usually single teen mothers. Similarly, older, unmarried women who had babies faced criticism fed, in part, by fixed beliefs about what a family should be—you know, a mom, dad, and two children. Whatever their age or socio-economic status, single mothers struggled for legitimacy.

The 21st century has changed some, but not all, tired—possibly unrealistic—attitudes about single women in general and single mothers in particular. As recently as 2010, the Pew Research Center found that 69 percent of people felt that a single woman having and raising a child without a man’s help is “a bad thing for society.”

Fewer Waiting for “Mr. Right”

Yet, among women today, we have what amounts to “the invention of independent female adulthood as a norm, not an aberration, and the creation of an entirely new population: adult women who are no longer economically, socially, sexually, or reproductively dependent on or defined by the men they marry,” as Rebecca Traister described the shift in a New York Magazine article about single women’s political power.

Many of these women are parents. Single-parent households in both the US and England have jumped from less than 10 percent in the 1970s to roughly 30 percent today. Some women are single parents through divorce or separation or unplanned pregnancies, but a growing number choose to have and raise babies on their own. In other words, fewer and fewer women are waiting for Mr. Right.

Women who decide to be solo parents are in large part educated, responsible, emotionally mature, and fiscally able to support their offspring. Many of them are in their 30s and 40s and embrace advances such as sperm donation and in-vitro fertilization to become mothers.

As enlightened as we are about single women, the belief lingers that two parents are better—significantly better—than one. The concern that children raised by single mothers will have difficulties remains.

Single Mothers: Problems for Children?

Researchers studied solo mothers and two-parent families when the babies were infants. They revisited the question two years later and published their findings in the study, “Solo mothers and their donor insemination infants: follow-up at age 2 years.” Again they compared solo mothers and married women who became pregnant via donor insemination (DI).

They reported: “This route to parenthood (via DI for solo mothers) does not necessarily seem to have an adverse effect on mothers’ parenting ability or the psychological adjustment of the child.” In fact, “The solo DI mothers showed greater pleasure in their child and lower levels of anger accompanied by a perception of their child as less ‘clingy’. Fewer emotional and behavioural difficulties were shown by children of solo than married DI mothers.” The results during those typically more trying “terrible twos” were similar when studies examined the quality of parenting and children’s psychological adjustment at ages 3, 7, and 10, again with DI solo mothers and DI married parents.

In a 2016 study published in the Journal of Family Psychology, “Single mothers by choice: Mother-child relationships and children’s psychological adjustment,” children of single mothers were compared with children in two-parent households. The children ranged in age from four to nine and were all conceived by donor insemination—50 solo mothers and 51 two-parent families.

Susan Golombok and her colleagues at the University of Cambridge used a series of interviews with parents, researcher observation, teacher reports, and measurements for psychological problems such as ADHD, and autism. The findings for both family types were the same on an array of measures: warmth, conflict, stress, adjustment problems, mother’s well-being, among others. No significant differences were discovered.

Golombok noted: “The low level of psychological problems among the children of single mothers by choice in the present study suggests that lack of knowledge of the identity of their biological father does not have a negative impact on their psychological wellbeing.”

Some of the positive results might be attributed to the connection to the mother’s carrying the baby herself. One study, “Children born through reproductive donation: a longitudinal study of psychological adjustment,” looked a different means of having a baby through reproductive donation—sperm, egg or embryo donation, surrogacy. The conclusion: “The absence of a gestational connection to the mother may be more problematic for children than the absence of a genetic link.”

When thinking about the positive results of solo motherhood, especially using donor insemination, the desire to have a baby, psychological screenings, and the expense and difficulty of becoming pregnant should be considered. Single women who choose motherhood often wait until they are older to start their families. Many also go to great lengths to become mothers, making children of single mothers very wanted children—all of which may help explain the optimistic outcomes. Solo mothers by choice are certainly, as Rebecca Traister wrote, not “economically, socially, sexually, or reproductively dependent on or defined by the men…”

Note: The number of unmarried women deciding to become mothers is growing; however, to date, research on single women who choose motherhood is limited, and hence the children studied are young. Although donor insemination for single motherhood is in its “infancy,” future studies are bound to follow.
Copyright @2016 by Susan Newman


Related: On NOT Waiting for Mr. Right · Too Old to Have a Baby? · Women: Want to Make More Money? Have Babies After 30


Bock, Jane D. “Doing the Right Thing? Single Mothers by Choice and the Struggle for Legitimacy.” Gender and Society 14.1 (2000): 62-86.

De Wert, G., Dondorp, W., Shenfield, F., Barri, P., Devroey, P., Diedrich, K., and Pennings, G. “ESHRE Task Force on Ethics and Law 23: Medically assisted reproduction in singles, lesbian and gay couples, and transsexual people.” Human Reproduction 29.9 (2014): 1859–1865.

Golombok, Susan, Lucy Blake, Polly Casey, Gabriela Roman and Vasanti Jadva. “Children born through reproductive donation: a longitudinal study of psychological adjustment.” Journal of Child Psychology and Psychiatry 54.6 (2013): 653–660.

Golombok, Susan, Sophie Zadeh, Susan Imrie, Venessa Smith and Tabitha Freeman. “Single Mothers by Choice: Mother–Child Relationships and Children’s Psychological Adjustment.” Journal of Family Psychology 30.4 (2016): 409–418.

Murray, C. and Golombok, S. “Solo mothers and their donor insemination infants: follow-up at age 2 years.” Human Reproduction 20.6 (2005): 1655–1660.

Pew Research Center. “The Decline of Marriage And Rise of New Families.” Pew Research Center, 18 November 2010.

Traister, Rebecca. “The Single American Woman.” New York Magazine. New York Media, LLC, 22 February.

Susan Newman, Ph.D., is a social psychologist and author. Her latest book is The Book of No: 365 Ways to Say it and Mean it—and Stop People-Pleasing Forever. Online:Susan Newman, Ph.D., Twitter, Facebook, LinkedIn

The Key to a Good Life? Lose Yourself in Something.

Whether it’s you that benefits most or someone else, don’t be afraid to go all in.

Brad Stulberg

One of the best feelings in the world is losing your attachment to yourself.

So much of our time is spent in self-focused ways. What happens if I do this? Or that? Doubt. Fear. Self-judgement. The judgement of others against ourselves. Planning. Scheming. It’s a whole lot of I, I, and I. You get the point.

Yet there’s a paradox: all of this self-focus is not very good for ourselves. Studies show that self-absorption is associated with clinical depression, personality disorders, and anxiety.

On the other hand, releasing from such a tight attachment to one’s self is a hallmark of flow, or that highly sought after state of being fully in the zone. Losing oneself is also the goal of most spiritual disciplines. (And athletic and creative ones, too.) The more you forget about yourself, the better you’ll feel, the better you’ll do, and the better you’ll be.

Unfortunately, the current ethos promotes self-absorption. Examples include social media, the supposed importance of building a “personal brand”, and the self-improvement and self-esteem movements. More than ever, it seems, we’re being sold the idea of a separate self. This is a trap. And while there are a handful of ways out, I want to briefly explore two of the most dependable ones.

Pursue Mastery (In Anything)

More than 2,000 years ago, Aristotle wrote that integral to a meaningful life is striving for arête, or what we might today call excellence or mastery. Aristotle pointed out, however, that achieving arête — be it by throwing oneself fully into a work of art, intellect, or athletics — is not always pleasant: “A virtuous life,” he wrote, “requires exertion, and does not consist in amusement.” But he also wrote that it is in such virtuous acts — making ourselves vulnerable and giving something our all — that we lose ourselves.

Centuries later, in his wildly popular Drive, a book that at its core is about what makes people tick, author Daniel Pink makes a similar case: “Mastery,” writes Pink, “is pain.” Yet, like Aristotle, Pink also argues that mastery is meaningful, that the benefits of taking on a challenge out of one’s own volition and losing oneself in an activity are immense.

For a study published in the Journal of Personality and Social Psychology, psychologist Carol Ryff surveyed more than 300 men and women, in order to identify correlates of well-being. She found that people who had “a feeling of continued development,” and saw themselves as “growing and expanding” were more likely to score high on assessments of life satisfaction and self-esteem than those who did not. Other research shows that when people throw themselves into an activity for the sake of the activity itself — and not for some sort of external reward, like money or fame or Instagram followers — they tend to report long-term well-being and fulfillment.

Attempting to master a craft may seem inherently selfish, but that’s not the case. In interviews with over 100 highly productive scientists, artists, and other creative types, the psychologist Mihaly Csikszentmihalyi discovered that many found meaning in their lives precisely because they lost themselves in their pursuit, or because they turned themselves over to it. He coined this “vital engagement,” or a relationship to an activity that manifests when one becomes fully absorbed in it. Meaning, Csikszentmihalyi writes, “derives from the connection of the individual to a tradition, enterprise, and community of practice that lie beyond the self.”

The specific craft need not matter. For some it may be running, for others sculpting, cooking, or playing the cello. What does matter is that you respect and honor the traditions of the craft, pursue long-term progress in it, and participate not for the sake of raising yourself up (i.e., an ego boost) but for the sake of transcending the very notion of your “self” altogether. You want to express yourself in the work and lose yourself in the work at the same time.

Though some may say that pursuing this kind of mastery is self-serving, or worse, selfish, I’d argue otherwise. I’ve never met someone who is in pursuit of mastery, who pays close attention to their craft and cares deeply about it, who isn’t a good person. Plus, whatever they create tends to end up helping lots of other people anyway. (Exhibit A: Mike Posner’s recent walk across America.)

Be Kind

As meaningful as devoting oneself to mastery may be, devoting oneself directly to helping others is perhaps even more powerful. (Of course, the two aren’t mutually exclusive.) One of the world’s foremost happiness researchers, Sonja Lyubomirsky, has told me that her research continues to show that one of the best ways to boost both happiness and meaning is to perform acts of kindness, such as volunteering, mentoring, coaching, or even just writing someone a letter of gratitude. When individuals participate in these activities, she says, they report more positive emotions, both immediately and over time.

A recent series of studies published in The Journal of Positive Psychology, “Prosociality Enhances Meaning in Life,” bears this out. The psychologist Daryl Van Tongeren and his colleagues asked over 400 participants how often they engage in altruistic endeavors, and then asked them how meaningful their lives felt. Those who were more altruistic reported more meaning in their lives.

Next, Van Tongeren conducted a controlled experiment: he took a group of individuals, measured their sense of meaning at baseline, and then instructed half the participants to engage in altruistic acts and the other half not to. The participants who partook in the altruistic acts reported significantly greater increases in meaning than those who did not, suggesting a causal relationship: acts of kindness are not merely associated with meaning but actually create it.

Though the exact mechanism by which performing acts of kindness enhances meaning is unknown, researchers speculate that doing so makes us feel more connected to and rooted in community. Additionally, doing nice things for others affords us a purpose that is beyond ourselves and the opportunity to contribute to a greater cause — both of which are associated with increased meaning.

Today’s world is all about quick fixes, hot takes, and outrage. Yet, according to science and the longstanding wisdom traditions, the keys to a good life are the exact opposite.

Brad Stulberg (@Bstulberg) writes about performance and wellbeing. He is the bestselling author of Peak Performance and The Passion Paradox, and co-creator of the


We’ve Got Depression All Wrong. It’s Trying to Save Us.

New theories recognize depression as part of a biological survival strategy. Posted Here January 9th 2021

Posted Dec 22, 2020


For generations, we have seen depression as an illness, an unnecessary deviation from normal functioning. It’s an idea that makes sense because depression causes suffering and even death. But what if we’ve got it all wrong? What if depression is not an aberration at all, but an important part of our biological defense system?


Depression is a courageous biological strategy to help us survive. Source: ActionVance/Unsplash

More and more researchers across specialties are questioning our current definitions of depression. Biological anthropologists have argued that depression is an adaptive response to adversity and not a mental disorder. In October, the British Psychological Society published a new report on depression, stating that “depression is best thought of as an experience, or set of experiences, rather than as a disease.” And neuroscientists are focusing on the role of the autonomic nervous system (ANS) in depression. According to the Polyvagal Theory of the ANS, depression is part of a biological defense strategy meant to help us survive.

The common wisdom is that depression starts in the mind with distorted thinking. That leads to “psychosomatic” symptoms like headaches, stomachaches, or fatigue. Now, models like the Polyvagal Theory suggest that we’ve got it backward. It’s the body that detects danger and initiates a defense strategy meant to help us survive. That biological strategy is called immobilization, and it manifests in the mind and the body with a set of symptoms we call depression.

When we think of depression as irrational and unnecessary suffering, we stigmatize people and rob them of hope. But when we begin to understand that depression, at least initially, happens for a good reason we lift the shame. People with depression are courageous survivors, not damaged invalids.

Laura believes that depression saved her life. Most of the time her father only hurt her with words, but it was when she stood up to him that Laura’s dad got dangerous. That’s when he’d get that vicious look in his eyes. More than once his violence had put Laura’s life at risk.

Laura’s father was so perceptive that he could tell when she felt rebellious on the inside even when she was hiding it. And he punished her for those feelings.

It was the depression that helped Laura survive. Depression kept her head down, kept her from resisting, helped her accept the unacceptable. Depression numbed her rebellious feelings. Laura grew up at a time when there was no one to tell, nowhere for her to get help outside her home. Her only strategy was to survive in place. And she did.

Looking back, Laura does not regret her childhood depression. She values it. Going through her own healing process and working with her therapist helped her see how depression served her.

Laura’s story is stark. It’s ugly. And it helps us understand that even though depression may happen for a good reason, that does not make it a good thing. Laura suffered deeply and describes the pain of her hopelessness vividly. Her depression was a bad experience that started as the last resort of a good biological system.

Depression starts with immobilization

According to the Polyvagal Theory, discovered and articulated by neuroscientist Stephen Porges, our daily experience is based on a hierarchy of states in the autonomic nervous system. When the ANS feels safe, we experience a sense of well-being and social connection. That’s when we feel like ourselves.

But the autonomic nervous system is also constantly scanning our internal and external environment for signs of danger. If our ANS detects a threat or even a simple lack of safety, its next strategy is the fight or flight response which we often feel as anxiety.

Sometimes the threat is so bad or goes on for so long, that the nervous system decides there is no way to fight or to flee. At that point, there is only one option left: immobilization.

The immobilization response is the original biological defense in higher animals. This is the shutdown response we see in reptiles. Also known as the freeze or faint response, immobilization is mediated by the dorsal vagus nerve. It turns down the metabolism to a resting state, which often makes people feel faint or sluggish.


The immobilization response dulls pain. Source: Owlie Harring/Unsplash

Immobilization has an important role. It dulls pain and makes us feel disconnected. Think of a rabbit hanging limply in the fox’s mouth: that rabbit is shutting down so it won’t suffer too badly when the fox eats it. And the immobilization response also has a metabolic effect, slowing the metabolism and switching the body to ketosis. Some doctors speculate that this metabolic state could help healing in severe illness.

In humans, people often describe feeling “out of their bodies” during traumatic events, which has a defensive effect of cushioning the emotional shock. This is important because some things are so terrible, we don’t want people to be fully present when they happen.

So the immobilization response is a key part of the biological defense, but it is ideally designed to be short term. Either the metabolic shut down preserves the organism, i.e. the rabbit gets away, or the organism dies and the fox eats the rabbit.

But if the threat continues indefinitely and there is no way to fight or flee, the immobilization response continues. And since the response also changes brain activity, it impacts people’s emotions and their ability to solve problems. People feel like they can’t get moving physically or mentally; they feel hopeless and helpless. That’s depression.

Does depression have value?

It’s easy to see why Laura’s childhood circumstances would set off the immobilization response, and even how it might have helped her survive. But why does it happen in people with less obvious adversity? Our culture tends to treat depression in a person who finds work too stressful as a sign of weakness. Self-help articles imply that with a little more mental toughness they could lean in and solve it. Even some therapists tell them that their depression is a distorted perception of circumstances that aren’t so bad.

But that is not how the body sees it. The defense responses in the autonomic nervous system, whether fight/flight or immobilization, are not about the actual nature of the trigger. They are about whether this body decides there is a threat. And that decision happens at a pre-conscious level: the biological threat response starts before we think about it, and then our higher-level brain makes up a story to explain it. We don’t get to choose this response; it happens before we even know it.

Studying anxiety has revealed that many modern circumstances can set off the fight or flight response. For instance, low rumbling noises from construction equipment sound to the nervous system like the growl of a large predator. Better run. Or feeling like they are being evaluated at school removes kids’ sense of safety and triggers fight or flight. Better give the teacher attitude or avoid homework. And to most of us, fight or flight feels like anxiety.

Eventually, if these modern triggers last long enough, the body decides it can’t get away. Next comes immobilization which the body triggers to defend us. According to Porges, what we call depression is the cluster of emotional and cognitive symptoms that sits on top of a physiological platform in the immobilization response. It’s a strategy meant to help us survive; the body is trying to save us. Depression happens for a fundamentally good reason.

And that changes everything. When people who are depressed learn that they are not damaged, but have a good biological system that is trying to help them survive, they begin to see themselves differently. After all, depression is notorious for the feelings of hopelessness and helplessness. But if depression is an active defense strategy, people may recognize they are not quite so helpless as they thought.

Shifting out of immobilization

If depression is the emotional expression of the immobilization response, then the solution is to move out of that state of defense. Porges believes it is not enough to simply remove the threat. Rather, the nervous system has to detect robust signals of safety to bring the social state back online. The best way to do that? Social connection.

One of the symptoms of depression is shame, a sense of having let other people down or being unworthy to be with them. When people are told that depression is an aberration, we are telling them that they are not part of the tribe. They are not right, they don’t belong. That’s when their shame deepens and they avoid social connection. We have cut them off from the path that leads them out of depression.

It is time that we start honoring the courage and strength of depressed people. It is time we start valuing the incredible capacity of our biology to find a way in hard times. And it is time that we stop pretending depressed people are any different than anyone else.


Porges, Stephen. (2009). The polyvagal theory: New insights into adaptive reactions of the autonomic nervous system. Cleveland Clinic Journal of Medicine.

Porges, Stephen. (2007). The polyvagal perspective. Biological Psychology.

Suicide in the UK. Posted January 9th 2021, from the ONS.

  • In 2019, there were 5,691 suicides registered in England and Wales, an age-standardised rate of 11.0 deaths per 100,000 population and consistent with the rate in 2018.
  • Around three-quarters of registered deaths in 2019 were among men (4,303 deaths), which follows a consistent trend back to the mid-1990s.
  • The England and Wales male suicide rate of 16.9 deaths per 100,000 is the highest since 2000 and remains in line with the rate in 2018; for females, the rate was 5.3 deaths per 100,000, consistent with 2018 and the highest since 2004.
  • Males aged 45 to 49 years had the highest age-specific suicide rate (25.5 deaths per 100,000 males); for females, the age group with the highest rate was 50 to 54 years at 7.4 deaths per 100,000.
  • Despite having a low number of deaths overall, rates among the under 25s have generally increased in recent years, particularly 10- to 24-year-old females where the rate has increased significantly since 2012 to its highest level with 3.1 deaths per 100,000 females in 2019.
  • As seen in previous years, the most common method of suicide in England and Wales was hanging, accounting for 61.7% of all suicides among males and 46.7% of all suicides among females.
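The age-standardised rates quoted above are, in essence, weighted averages of age-specific rates, weighted by a standard population so that years and countries with different age structures can be compared. A minimal sketch of the arithmetic (all figures below are hypothetical placeholders, not ONS data):

```python
# Sketch of how rates per 100,000 relate to counts, and how
# age-standardisation works. The bands, rates, and weights here
# are invented for illustration; they are not ONS figures.

def crude_rate(deaths: int, population: int) -> float:
    """Deaths per 100,000 population."""
    return deaths / population * 100_000

def age_standardised_rate(age_rates, standard_weights):
    """Weighted average of age-specific rates, using a standard
    population's age-group shares (weights should sum to 1)."""
    return sum(r * w for r, w in zip(age_rates, standard_weights))

# Hypothetical example with three age bands:
rates = [5.0, 20.0, 12.0]   # deaths per 100,000 in each band
weights = [0.3, 0.5, 0.2]   # standard population shares
print(round(age_standardised_rate(rates, weights), 1))
```

In practice the ONS uses the European Standard Population for the weights, which is why its age-standardised rates can differ from the crude rate for the same year.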

Talking out loud to yourself is a technology for thinking. Posted January 3rd 2021

Photo by Marcos Brindicci/Reuters

Nana Ariel is a writer, literary scholar and lecturer in the Faculty of Humanities at Tel Aviv University, a fellow of the Minducate Science of Learning Research and Innovation Center, and a guest lecturer at Harvard University. She specialises in theoretical and practical rhetoric and in adventurous pedagogy. She lives in Tel Aviv.

This week, a woman was strolling in my street, walking in circles and speaking out loud to herself. People were looking at her awkwardly, but she didn’t particularly mind, and continued walking vigorously and speaking.

Yes, that woman was me.

Like many of us, I talk to myself out loud, though I’m a little unusual in that I often do it in public spaces. Whenever I want to figure out an issue, develop an idea or memorise a text, I turn to this odd work routine. While it’s definitely earned me a reputation in my neighbourhood, it’s also improved my thinking and speaking skills immensely. Speaking out loud is not only a medium of communication, but a technology of thinking: it encourages the formation and processing of thoughts.

The idea that speaking out loud and thinking are closely related isn’t new. It emerged in Ancient Greece and Rome, in the work of such great orators as Marcus Tullius Cicero. But perhaps the most intriguing modern development of the idea appeared in the essay ‘On the Gradual Formation of Thoughts During Speech’ (1805) by the German writer Heinrich von Kleist. Here, Kleist describes his habit of using speech as a thinking method, and speculates that if we can’t discover something just by thinking about it, we might discover it in the process of free speech. He writes that we usually hold an abstract beginning of a thought, but active speech helps to turn the obscure thought into a whole idea. It’s not thought that produces speech but, rather, speech is a creative process that in turn generates thought. Just as ‘appetite comes with eating’, Kleist argues, ‘ideas come with speaking’.

A lot of attention has been given to the power of spoken self-affirmation as a means of self-empowerment, in the spirit of positive psychology. However, as Kleist says, talking to oneself is also a cognitive and intellectual tool that allows for a wider array of possible use cases. Contemporary theories in cognition and the science of learning reaffirm Kleist’s speculations, and show how self-talk contributes not only to motivation and emotional regulation, but also to some higher cognitive functions such as developing metacognition and reasoning.

If self-talk is so beneficial, why aren’t we talking to ourselves all the time? The dynamic between self-talk and inner speech might explain the dubious social status of the former. Self-talk is often seen as the premature equivalent of inner speech – the silent inner voice in our mind, which has prominent cognitive functions in itself. The tendency to express our inner thoughts in actual self-talk, typical of children, is internalised, and transforms to voiceless inner speech in adulthood, as the developmental psychologist Lev Vygotsky already speculated in the 1920s.

Vygotsky’s view stood in opposition to a competing one from the psychological school known as behaviourism, which saw children’s self-talk as a byproduct of (supposedly) less competent minds. But Vygotsky claimed that self-talk has an active mental role. He observed children performing tasks while speaking to themselves out loud, and reached the conclusion that their ‘private-talk’ is a crucial stage in their mental development. Gradually, a child’s interaction with others turns into an uttered conversation with the self – self-talk – until it becomes muted inner speech in adulthood. Vygotsky’s successors, such as the psychologist Charles Fernyhough, have demonstrated that inner speech goes on to facilitate an array of cognitive functions including problem solving, activating working memory and preparation for social encounters. It is inner speech rather than self-talk, then, that has been the focus of research in adults.

However, the internalisation of self-talk isn’t necessarily evidence of cognitive maturity: rather, it could represent the degeneration of an essential cognitive skill in the face of social pressure. The sociologist Erving Goffman noted that self-talk is taboo because it is a ‘threat to intersubjectivity’ and violates the social assumption that speech is communicative. As he wrote in his book Forms of Talk (1981): ‘There are no circumstances in which we can say: “I’m sorry, I can’t come right now, I’m busy talking to myself”.’ Self-talk is deemed legitimate only when done in private, by children, by people with intellectual disabilities, or in Shakespearean soliloquies.

Yet self-talk enjoys certain advantages over inner speech, even in adults. First, silent inner speech often appears in a ‘condensed’ and partial form; as Fernyhough has shown, we often tend to speak to ourselves silently using single words and condensed sentences. Speaking out loud, by contrast, allows the retrieval of our thoughts in full, using rhythm and intonation that emphasise their pragmatic and argumentative meaning, and encourages the creation of developed, complex ideas.

Not only does speech retrieve pre-existing ideas, it also creates new information in the retrieval process, just as in the process of writing. Speaking out loud is inventive and creative – each uttered word and sentence doesn’t just bring forth an existing thought, but also triggers new mental and linguistic connections. In both cases – speech and writing – the materiality of language undergoes a transformation (to audible sounds or written signs) which in turn produces a mental shift. This transformation isn’t just about the translation of thoughts into another set of signs – rather, it adds new information to the mental process, and generates new mental cascades. That’s why the best solution for creative blocks isn’t to try to think in front of an empty page and simply wait for thoughts to arrive, but actually to continue to speak and write (anything), trusting this generative process.

Speaking out loud to yourself also increases the dialogical quality of our own speech. Although we have no visible addressee, speaking to ourselves encourages us to actively construct an image of an addressee and activate our ‘theory of mind’ – the ability to understand other people’s mental states, and to speak and act according to their imagined expectations. Mute inner speech can appear as an inner dialogue as well, but its truncated form encourages us to create a ‘secret’ abbreviated language and deploy mental shortcuts. By forcing us to articulate ourselves more fully, self-talk summons up the image of an imagined listener or interrogator more vividly. In this way, it allows us to question ourselves more critically by adopting an external perspective on our ideas, and so to consider shortcomings in our arguments – all while using our own speech.

You might have noticed, too, that self-talk is often intuitively performed while the person is moving or walking around. If you’ve ever paced back and forth in your room while trying to talk something out, you’ve used this technique intuitively. It’s no coincidence that we walk when we need to think: evidence shows that movement enhances thinking and learning, and both are activated in the same centre of motor control in the brain. In the influential subfield of cognitive science concerned with ‘embodied’ cognition, one prominent claim is that actions themselves are constitutive of cognitive processes. That is, activities such as playing a musical instrument, writing, speaking or dancing don’t start in the brain and then emanate out to the body as actions; rather, they entail the mind and body working in concert as a creative, integrated whole, unfolding and influencing each other in turn. It’s therefore a significant problem that many of us are trapped in work and study environments that don’t allow us to activate these intuitive cognitive muscles, and indeed often even encourage us to avoid them.

Technological developments that make speaking seemingly redundant are also an obstacle to embracing our full cognitive potential. Recently, the technology entrepreneur Elon Musk declared that we are marching towards a near future without language, in which we’ll be able to communicate directly mind-to-mind through neural links. ‘Our brain spends a lot of effort compressing a complex concept into words,’ he said in a recent interview, ‘and there’s a lot of loss of information that occurs when compressing a complex concept into words.’ However, what Musk chalks up as ‘effort’, friction and information loss also involves cognitive gain. Speech is not merely a conduit for the transmission of ideas, a replaceable medium for direct communication, but a generative activity that enhances thinking. Neural links might ease intersubjective communication, but they won’t replace the technology of thinking-while-speaking. Just as Kleist realised more than 200 years ago, there are no pre-existing ideas; rather, there is a heuristic process by which speech and thought co-construct each other.

So, the next time you see someone strolling and speaking to herself in your street, wait before judging her – she might just be in the middle of intensive work. She might be wishing she could say: ‘I’m sorry, I can’t chat right now, I’m busy talking to myself.’ And maybe, just maybe, you might find yourself doing the same one day.

Married to Someone Who’s Always Right?

One personality trait can be especially frustrating. Posted January 1st 2021 (originally published Aug 27, 2019)


Relationships are rife with possible conflicts because they require the navigation of two different personalities. While there are many personality traits that can bother you in a spouse or partner, few traits elicit as strong an emotional reaction as the trait where a person acts as if they’re always right.

This trait can be frustrating in a friend but is much more difficult to bear in a romantic relationship that involves so many emotional ties and such constant close proximity. If you’re married to someone who acts as if they’re always right, there are a few things to keep in mind that can make your interactions with them less conflictual.

Before exploring the topic further, it’s important to note that research on this subject is challenging. Who, for example, wants to admit that they always need to be right? Because of the challenges in getting honest self-reports in this area, I’ll draw on my 20 years of clinical experience with men and women of various ages and social demographics. 

Acting as if one is always right reflects a pervasive psychological defense mechanism.

There isn’t one simple cause for this complex personality trait, but most individuals who have a need to always be right share one important characteristic: Their need to always be right indicates a strong and pervasive defense mechanism (including, but not limited to, a denial of their vulnerability, an inherent part of everyone’s human experience, whether one likes it or not).

The definition of a defense mechanism is: “a way in which somebody behaves without thinking about it to protect themselves from unpleasant feelings or situations” (“Defense mechanism,” 2019). Note that the part of the definition that includes “without thinking about it” is also known clinically as an unconscious process, meaning that the personality trait — always being right — has become so ingrained in the person’s thinking and personality that the person isn’t fully aware of just how right they always need to be. Though people who act as if they’re always right know that they like to be right, they would not necessarily understand consciously that they act this way because they are overcompensating for feelings of shame, a sense of insufficiency, and fear that would arise if they were wrong.

Psychologically, men and women who are never wrong would feel extremely exposed if others witnessed them being wrong. Being wrong under any circumstances in front of others reflects to them a weakness or flaw, even when most people would not consider being wrong here or there as rising to the level of a flaw! In contrast, people with good self-esteem accept that they are sometimes wrong (read: occasionally vulnerable and always imperfect) because they are human.

Why people who act as if they’re never wrong are so averse to the notion of occasionally being wrong, even in the most mundane or trivial circumstances

A history of early experiences in which being vulnerable resulted in getting emotionally hurt: Men and women who are never wrong developed this defensive personality trait many years ago, and many of them developed it because someone very important in their early life made them feel emotionally unsafe. When they were young, many of these men and women learned that it isn’t safe to let their guard down and be vulnerable, because when they let their guard down and were vulnerable in the past, they got emotionally hurt, criticized or even punished.

For example, men and women who are always right often had the experience of sharing an emotional experience with someone, and watching as information about that experience was used against them later. Other men and women with this problem were shamed at critical points during their development for “failing,” or they were made to feel stupid or even pathetic, at times, by parents or peers at school. Years ago, these individuals (unconsciously, without realizing it) began construction on a moat-like defensive response style to protect their ego from ever feeling small, insufficient, defective, or stupid again.

A lack of praise, feeling unvalued: Another factor in the life of a child that gives rise to this personality trait is a lack of feeling praised and valued enough as a child. Because these men and women weren’t praised and valued enough as boys and girls, their ego development, their self-esteem, suffered. Later in life, these men and women learned to overcompensate for self-doubt and feelings of shame for not being good enough by flipping the script. Outwardly, they learned to act as if they were strong, superior and infallible, even when logic would tell them that no such person exists. 

Growing up with a parent who was always right, too: In some cases, the person who is always right developed this orientation based on social modeling. Specifically, these individuals may have grown up with a parent who was always right, too. Children who have a parent who is never wrong and always right often feel angry and resentful, because the parent’s perspective feels rigid and unfair, and often betrays reality or objectivity.

These children often live with an underlying sense that they are inferior to the superior, always-right parent, and they internalize the sense that they’re not inherently as good or as valuable as others. As a result, these children usually go through childhood feeling resentful and angry that they aren’t “heard” or valued, and that they are dismissed or discounted by those who matter. How do they cope with these feelings? They begin to operate with others using the same personality trait they fell victim to with their parent, acting with others now as if they are the ones who are always right.

What this personality trait means diagnostically

For some men and women, the never-wrong personality trait is a part of a larger problem: an entire personality organization that is distorted in crucial ways. These individuals may have what clinicians call a personality disorder, and this trait is most common among individuals who have what is known as Cluster B personality disorders (Narcissistic, Borderline, and Antisocial Personality Disorders, especially), each of which is outlined in the current edition of the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 2013).

The Cluster B personalities involve distorted expectations of others, a disordered view of the self, and disordered relationships. Men and women who have a Cluster B personality disorder often have a need to feel superior to others, which often requires dismissing the thoughts and feelings of others. The thinking goes like this: What does reality really matter when my ego is on the line? I protect my ego to feel big and strong at all costs. For men and women who are never wrong, protecting their fragile ego is their number one goal.

Personality is not the only factor at work, however, in the construction and maintenance of this trait. Individuals who have the need to always be right may have this problem as a function of their cognitive (thinking) style. Specifically, they may suffer from an extremely rigid cognitive style, one with fixed ideas. Men and women who are never wrong may meet some or all of the criteria for Obsessive-Compulsive Disorder, a disorder that includes rigid, fixed ideas and behaviors.

Similarly, men and women who lie at the highly functional end of Autism Spectrum Disorder (what clinicians historically called “Asperger’s Disorder”) often present rigid or fixed thoughts and behaviors. When such individuals have the thought that they are right in a particular situation, they have great difficulty “shifting sets” and seeing another person’s perspective in the very same situation.

The previous disorders are only a subset of the possible disorders that can coexist with the personality trait of always needing to be right. If you’re concerned that your spouse may present this personality type as a symptom of a larger psychological problem, the best practice is to meet with a mental health professional to discuss the issue in depth. Though such professionals can only diagnose individuals they have clinically assessed themselves, a professional can listen to your circumstances and share feedback to help you manage this relationship dynamic more effectively.

Unmet needs in the individual’s current life

While it’s helpful to understand what may be going on clinically with your spouse, it’s also helpful to reflect on what factors in your spouse’s current life could be exacerbating the problem (the need to prove how right they always are). Sigmund Freud, the revolutionary neurologist and founder of psychoanalysis, believed that a person’s primary emotional issues come out in one of two areas in life: one’s work life or romantic life. In my clinical work, I’ve found this theory to be remarkably accurate. People who act like they’re never wrong often have a strong unmet need in their personal life, whether it’s their romantic life, home life, or social life.

Unmet need for recognition in one’s professional or work-contribution life: Having a sense of purpose and feeling needed are crucial to a person’s well-being. People have a driving need to feel that the work contribution they make in life is important and valued by those close to them. Whether their job is a formal one — from a cashier to a CEO — or their work life is in the home — managing a household and/or taking care of children — it’s crucial for a person’s mood and self-esteem to feel that the contribution they make is recognized and valued. If the need for recognition in one’s professional or work-contribution life is not met, the gaping unmet need will typically result in anger, resentment, sadness, and even depression.

Those who have an unmet need for recognition in their work life become defensive. They overcompensate for the unmet need by trying harder to make everyone close to them acknowledge their contribution and their overall worth and value. In other words, if one feels undervalued and unappreciated, one goes into psychological overdrive to get everyone to see their worth and value. Because men and women who are never wrong have such a deep, profound unmet need for recognition, they devote much of their energy to constructing a persona in which they are seen as the opposite of someone who is vulnerable or flawed. They start acting to the world as if they’re an authority figure, one who is gifted and superior to most others.

Unmet need for recognition in one’s personal life: To feel happy enough and to be able to socialize consistently and harmoniously with others in close proximity to them (spouses, partners, close friends, co-workers, and bosses), people need to have their basic emotional needs for respect and caring met. When people feel unnoticed, unappreciated or disrespected for too long, they start to feel bitter, angry, and even depressed. Without question, someone who acts as if they’re always right isn’t getting their basic emotional needs met for respect and recognition in their daily life.

If a person doesn’t feel sufficiently valued by those closest to them in their personal life, that person is going to become defensive and is going to take on personality characteristics and defense mechanisms that protect their ego from feeling bad or insufficient. These individuals will often adopt the I’m-never-wrong attitude as a way to overcompensate for the feelings that come up for them because crucial people in their current personal life cause them to feel invisible or unimportant.

Why their approach doesn’t work

Despite the effort, this mental approach doesn’t work. The personality orientation – always right, never wrong – isn’t authentic or rooted in reality (because it’s impossible for anyone to be superhuman, or always right), so the foundation of this belief system is faulty and maladaptive. As a result, men and women who act as if they’re never wrong don’t actually achieve their goal of making others respect and recognize them. Instead, this rigid personality style only causes conflicts, leading others to resent or dislike them even more. Sadly, the cycle continues. The person who’s never wrong gets even more triggered because they’re trying so hard to demand the respect they believe they deserve, but they’re still not getting sufficiently acknowledged. Over time, they become more bitter and angry, and are even more intent on proving their value and rightness. The need to be respected and valued is so fundamental that people will do almost anything to get it, even if that means self-sabotaging.

How to cope when your spouse is never wrong

You’ve heard the trope about trying to change the stripes of a tiger. Simply put, trying to change your spouse or partner’s most fundamental personality characteristics is a losing endeavor. The psychological need for these individuals to always be right and never be wrong is so strong and so many years in the making that the personality trait’s closest relative is titanium; it’s simply not budging. What can budge, however, is how you, their spouse or partner, react to them. How do you cope? You use a series of mental approaches.

Talk to a mental health professional to get some perspective.

First, the most effective strategy in dealing with a spouse who is never wrong is to seek out a couples therapist. Though many men and women with this personality trait won’t want to talk with a therapist because their self-esteem isn’t strong enough to withstand any constructive criticism or feedback from a therapist, it’s always a good idea to suggest therapy — even just one session — as an option. If therapy is not an option, the only other option (aside from ending the relationship, which may not be necessary) is to shift how you react to their frustrating personality trait.

Don’t take their defensive, always-right personality personally.

It feels personal when your spouse acts as if they’ve descended from the heavens to grace you with their superior, always-right presence, but it’s most definitely not personal. Your spouse is this way with anyone with whom they are in close proximity professionally or personally. Understand that your spouse’s need to always be right isn’t a sign that they think you’re inherently inferior to them; they are simply terrified of being disrespected or unvalued by anyone — stranger, boss, spouse — and that’s why they act the way they do. Though they act superior and high-and-mighty, they actually suffer from a somewhat fragile ego. Of course, people who feel good about themselves don’t need to be right all the time; it’s the ones who battle self-doubts and low self-esteem who insist on being the smartest, wisest people in the room. Men and women who are never wrong can’t ever really be vulnerable because being vulnerable, according to their distorted thinking, would end up hurting them or being used against them.

Choose your battles.

People who are never wrong need to win and be voted “Most Respected” at all costs. These men and women will match you note for note if you challenge them, so when the issue is not important, let them win. When the two of you are navigating an issue that is important, sit with the issue for a day or two and plan a measured, non-emotional approach to the issue. Showing these people any sort of negative feelings, like anger or frustration, will only fuel them more. The arch-enemy of these individuals is accountability, so don’t waste your energy trying to hold them accountable and asking for fairness. When these individuals’ need to be right gets triggered, they will never, ever acknowledge any vulnerability at all.

Make sure you have a long list of coping outlets when you get triggered.

You’re not crazy to expect fairness and a mutual acknowledgment of reality in a relationship. Sadly, these men and women don’t value those things. It’s not realistic to never again be bothered by this personality trait, but it is realistic to make sure that you don’t lose your mind dealing with them. You can cope well and keep the relationship working well enough as long as you have sufficient prosocial outlets. Examples: talking to a therapist, meditation, various types of physical exercise, venting to close friends, writing in a journal, talking to your minister, rabbi, preacher, etc.

The overall point

Don’t take your spouse’s need to always be right personally, but also don’t engage too emotionally when your spouse’s need to be right gets triggered. Ultimately, everyone has flaws, and it is our own job to make sure that we figure out a way to react to those closest to us in a way that makes us feel good, connected, and supported.


Stigma, schizophrenia and being transgender Posted December 26th 2020

When Ashley Ford-McAllister was diagnosed with schizophrenia, treatment to confirm his gender slowed to a crawl. Here he explores how you can have a mental illness and be transgender simultaneously, and why the medical community – and society in general – need to change their definition of the ‘accepted truth’.

Words by Ashley Ford-McAllister and artwork by Olivia Twist 20 March 2020

Illustration by Olivia Twist, in black, purple and red tones: a person sits in an armchair, a ruckled rug with things hidden under it at their feet; to the left, a small bookcase and a house plant.

When the whole world seems convinced that being transgender is a form of madness, what does it mean to be both a transgender individual and someone with a mental illness?

I’m a transgender man, though I rarely use the ‘trans’ prefix. I completed my transition, to the extent I’m happy with, in 2011, and have been living, loving and working as your run-of-the-mill working-class guy ever since.

In 2007, I was diagnosed with schizophrenia. This slowed my transition down by almost two years, because when you’re assumed to be crazy by reason of your identity, it’s a bit rich, as far as the medical community are concerned, to have a genuine mental illness as well.

The first reaction of many people – both individuals experiencing the twists and turns of personhood, and the professionals they come into contact with – is to assume that the transgender feelings are part of the actual mental illness.

I’d expressed definitively male aspirations from the age of five. Aged nine, I cut my hair short and insisted on being called by a male name (the same name I use now, almost a quarter of a century on).

At 20, following what is recorded as my first psychotic episode (but was more likely at least my second or third), I considered mentioning that I saw myself as male, rather than female. However, I decided not to do so, and quietly hoped that the conviction that I was ‘supposed to be’ male would be resolved by whatever medication I was prescribed.

In the 1990s, there was very little representation and information about transgender people, and nobody seemed to have a problem with me ‘being a tomboy’. Had I been born in the 1990s, rather than growing up in them, I might have discussed how I saw my gender sooner; it simply wasn’t a possibility I was aware of as a child.

An either-or situation

Many people with mental illnesses avoid raising transgender feelings or experiences with consulting clinicians. It takes so long to actually get to see a professional about mental health challenges in general that people don’t want to do anything that might cause treatment to be put on hold. Meanwhile, people who are already being seen by professionals as transgender individuals can be very wary of mentioning mental health issues, as these are frequently used as a reason to discontinue hormone therapy, and put the person off the clinical pathway to transition.

The psychiatric community has, to date, been very insistent that you can either undergo gender-confirming treatment, or receive treatment to help stabilise and manage an unrelated mental health condition; doing both simultaneously is, they insist, not possible. Society at large loudly asserts that being transgender is madness, yet transgender individuals wanting to medically validate their gender are required to be sane.

But we’re not always sane, because transgender experiences don’t discriminate. Your gender does not particularly care about your neurology, or your sanity, just as it doesn’t really care about your sexuality. Psychiatrists, however, care an awful lot about both, and they let their concern about them override the requirement that they treat their patients with courtesy, and with professional compassion.

Society at large loudly asserts that being transgender is madness.

In the face of neurodivergence or mental illness, psychiatry often denies transgender people the presumption of competence. We are no longer a relatively straightforward psychosocial ‘quirk’, to be quickly popped back on the track to being as close to cisgender as possible, but an unhinged mess that can’t be neatly tidied up.

It becomes an awful lot easier, as with many stubborn stains, to throw a rug over it and forget about it. The ‘rug’, in this case, being medical professionals’ insistence that our gender identity is simply a ‘manifestation’ of our wider ‘loss of contact with accepted reality’.

Schizophrenics like me are told that “fluid interpretations of gendered experience are quite usual with schizophrenia”. The fact that the most fluid my gender has ever been is wondering whether velvet T-shirts were a good look is glossed over with an assurance that I will have been experiencing shifts in my sense of gender but may not have been fully aware of them.

Individuals on the autism spectrum, meanwhile, are regularly ‘reminded’ that they “don’t really understand the concept of gender identity”. Despite this, many individuals I know on the spectrum – online and as friends, both trans and not – are as certain and understanding of their gender as anyone else.

Creating more realistic expectations

Is transgender experience, or gender dysphoria, a kind of madness? As someone who has experienced being both transgender and clinically insane, no, it isn’t. Schizophrenia means I can struggle to understand and interpret the world around me, which causes me varying degrees of distress.

Transgender identities, by contrast, result in the world around the individuals living those identities struggling to understand and interpret them, causing the people of the wider world varying degrees of distress. ‘Mental illness’ is defined, in part, as causing distress to the individual experiencing it. I was never distressed by presenting as male; people around me were.

Are there overlaps between transgender individuals and those with mental illnesses? Very often, yes. Overwhelmingly, these overlaps are positive – a shared agreement that individuals are the authorities on themselves, their experiences and their identities. A common refusal to acknowledge that what society believes to be ‘obvious’, is, or deserves to be, the accepted truth. A letting-go of unrealistic expectations, and the radical acceptance of a very simple but frequently disputed truth: that if something does not cause harm to anyone, then it really isn’t an issue.

If something does not cause harm to anyone, then it really isn’t an issue.

Other overlaps are less positive, and come not from the communities themselves, but from wider society. These include the persistent association of transgender feelings with insanity, as well as the belief that other peoples’ reactions to our lived experience and personal expression matter more than our feelings about ourselves.

It is these overlaps that continue the stigma faced by both transgender individuals and those with mental health conditions. This stigma reinforces the barriers that prevent people, whatever their personal identity and experience, from being seen and accepted as ‘normal’.

About the contributors

Photograph of Ashley Ford-McAllister

Ashley Ford-McAllister


Raised working poor, diagnosed with schizophrenia in 2007; a lifelong learning curve based on these two core points. Married, with dogs. Interests mostly became obsolete at least a century ago. He/him. 

Photograph of Olivia Twist

Olivia Twist

Illustrator, @yesoliviatwist on Instagram

Olivia Twist is an illustrator, arts facilitator and lecturer from east London with an MA in Visual Communication from the Royal College of Art. The key threads that can be found in her work are place, the mundane and overlooked narratives. Her striking visual language is comprised of a myriad of esoteric layers informed by a propensity for human-centered research methodologies.


How to Get Smarter Every Day, According to Neuroscience Posted December 7th 2020

Education matters. But so does fluid intelligence. Here’s how to improve yours.

By Jeff Haden, Contributing editor, Inc.@jeff_haden

How to Get Smarter Every Day, According to Neuroscience
Getty Images

A friend of mine spends 20 to 30 minutes a day solving Sudoku puzzles. He says it improves his speed of mental processing and makes him, well, smarter.

Hold that thought.

Ask people which factor contributes the most to success and most will choose intelligence, even though science says you also have to be lucky: Right place, right time. Right person, right time. Right idea, right market, right audience at the right time. 

Yet even though there are ways to “create” your own luck, you can’t control luck.

But you can control, to some degree, how smart you are. 

Let’s Define “Smart.”

While there are a number of different forms of intelligence, let’s focus on two. Crystallized intelligence is accumulated knowledge: facts, figures. Think “educated.”

Of course we all know people who are “book smart” but not necessarily smart smart. That’s where fluid intelligence comes into play: The ability to learn and retain new information and then use it to solve a problem, to learn a new skill, to recall existing memories and modify them with new knowledge. Think “applied intelligence.”

Becoming more educated is, while not easy, certainly simple.

Improving fluid intelligence is harder, which is one reason why brain games–crossword puzzles, Sudoku, brain training apps, etc.–are fairly popular.  

But do they make you smarter? Do they improve fluid intelligence? 

Basically, No.

A 2007 study published in Behavioral and Brain Sciences assessed the impact of brain training games on fluid intelligence. After participants played Tetris–yes, Tetris–for several weeks, cortical thickness and cortical activity increased. 

Both are signs of an increase in neural connections and learned expertise. In simple terms, their brains bulked up and got smarter.

But after those first few weeks, cortical thickness and activity started to decrease, eventually returning to pre-Tetris levels–even though their skill levels remained high. Participants didn’t lose brain power.

Instead, their brains became so efficient at playing Tetris those increased neural connections were no longer necessary. Using more mental energy was no longer necessary. As with most things, once they kinda figured it out, it got easy.

Unfortunately, no matter how much work it took to learn new information or gain new skills, “easy” doesn’t help improve fluid intelligence. Once knowledge or skill is in your pocket, you certainly benefit from the increase in crystallized intelligence.

But your fluid intelligence soon returns to a more baseline level. 

That’s the problem with brain training games. Solving Sudoku puzzles, and only solving Sudoku puzzles, won’t improve my friend’s fluid intelligence in any other areas. 

It only makes him better at solving Sudoku puzzles.

Learning how to use a new inventory management system will improve your fluid intelligence, until you’ve mastered it. Setting up Quickbooks for a new business will improve your fluid intelligence, until you’ve mastered the accounting process basics.

Once you achieve a level of comfort, your brain no longer has to work as hard, and all that new mental muscle gained starts to atrophy.

So what can you do?

Stay Uncomfortable.

Easy: Once you’ve mastered a new game, a new process, a new skill, a new anything–move on to something new.

At work. At home. Anywhere. Just keep challenging yourself.

Not only will you pocket a constant flow of new information and skill, your brain will stay “bulked up” and forging new neural connections, making it easier to keep learning and growing.

And then there’s this: The more you know, the more you can leverage the power of associative learning–the process of relating something new to something you already know.

Not in a Pavlov’s dog kind of way, but by learning the relationship between seemingly unrelated things. In simple terms, whenever you say, “Oh, that makes sense: This is basically like that,” you’re using associative learning. 

The more you learn, the more likely you will be able to associate “old” knowledge to new things. Which means you only have to learn differences or nuances. And you’ll be able to apply greater context, which also helps with memory storage and retrieval, to the new information you learn.

All of which makes learning even easier, which research shows will result in your being able to learn even more quickly–and retain a lot more.

So if you like brain training games, master one and then move on to another. And another.

Better yet, keep pushing yourself to learn new things about your business, your customers, your industry, etc.

Not only will that help you become more successful, you’ll also get to improve your crystallized intelligence and fluid intelligence–which will surely help you become even more successful.

Where win-wins are concerned, that’s a tough one to beat.

It’s not necessarily deluded to feel in control when you’re not

Daniel Yon is a cognitive neuroscientist and experimental psychologist. He is a lecturer at Goldsmiths, University of London, where he heads a research lab investigating how our brains build models of ourselves and the world around us.

A distorted awareness of our capacities and capabilities is often a sign of serious mental illness. Take ‘Sophie’, a British woman living in Oxfordshire, who – in the grips of a delusional episode – developed the bizarre belief that she was God, and so able to take flight from sheer cliff drops and walk effortlessly on water. Though this episode subsided before she came to any serious harm, she later recounted that, if she had indeed tried leaping from slightly taller heights or treading in deeper pools of water, her already unsettling story could have had a fatal ending.

Sophie’s story is so unnerving because we take for granted that our insight into our actions and their consequences is accurate. This intuition is deeply embedded in many of our formal institutions. When a jury finds a defendant guilty of a crime or a tribunal disciplines a doctor for malpractice, we tacitly assume that the blameworthy party had a good awareness of their actions and the outcomes that ensued. The same thought seems to guide our personal and social relationships. When we praise someone for a thoughtful gift or admonish them for a hurtful comment, we do so because we believe those close to us are well aware of how their actions can affect us – and perhaps that they should have known better.

However, evidence from the cognitive sciences suggests our subjective awareness of what we can and cannot control is not always reliable. This was demonstrated in a seminal set of experiments in 1979 by the psychologists Lauren Alloy and Lyn Abramson using a fiendishly simple piece of equipment – a button wired up to a light bulb. Alloy and Abramson asked student volunteers at the University of Pennsylvania to play around with the button, and to judge how much their presses influenced the flashing bulb. Unbeknown to the students, sometimes the button was disconnected from the bulb, and all the flashes were programmed to occur at random. Surprisingly, perfectly healthy volunteers still reported feeling that they could influence when the flashes would occur, even when those flashes were completely uncontrollable.

Psychologists have termed these kinds of experiences ‘illusions of control’ – where we feel a sense of agency over events in the world that we can’t truly influence. And examples of these illusions crop up in our everyday lives, if you know where to look. For instance, in the 1960s the sociologist James Henslin observed American cab drivers as they gambled away their profits on curbside games of craps. Henslin spotted that when one of the cabbies needed a higher number on the dice, he’d superstitiously throw them harder against the curb – even though this can’t possibly affect the outcome. Those of us who live in cities might often experience illusions of control, since many of the mechanical buttons we interact with – at pedestrian crossings, in elevators, on office thermostats – have become obsolete, with the underlying systems controlled by centralised computers and automatic timers. Nonetheless, every day, thousands of us push these ‘placebo buttons’ – not realising that they do nothing at all.

Illusions of control have led scientists to claim that human beings have a fundamentally grandiose picture of how much they can influence the world around them. Some perspectives view the problem through an evolutionary lens, and suggest that exaggerated beliefs about our actions – while false – could be a useful product of natural selection. Thinkers in this camp reason that creatures with overly optimistic beliefs about their chances of success will seize more of the opportunities that the environment offers up. On this reading, we descend from those plucky primates who were overconfident about their ability to snatch food from a rival or seduce an attractive mate, and were thus more likely to survive, multiply and pass on this disposition to us.

These evolutionary ideas are complemented by perspectives from social psychology that suggest an exaggerated sense of control is a key ingredient to healthy self-esteem. One striking demonstration of this comes from studies of people with depression – who do not experience illusions of control in the same way. This observation led Alloy and Abramson to suggest that depression gives us a ‘sadder but wiser’ view of our capabilities. On this view, the feelings of powerlessness associated with the illness arise because the scales have fallen from the patient’s eyes, and they see how little they can shape the world around them. To be healthy is to be deluded.

The idea that humans are fundamentally deluded creatures has had a wide-ranging impact on studies of the mind and brain. However, my colleagues and I have been re-examining illusions of control, asking whether the psychological evidence really does mean we are afflicted by delusions of grandeur. We came up with a new hypothesis: maybe when people hallucinate control over some event in the world, they are noticing something that everyone else is missing.

These hallucinations of control were linked to spurious correlations between action and outcome

This idea was partly inspired by a longstanding puzzle in studies of human perception: why do we often see or hear things that aren’t really there? For more than a century, a branch of experimental psychology called ‘psychophysics’ has investigated the limits of human perception using tightly controlled laboratory tasks. For example, a volunteer in a typical psychophysical experiment might be placed in a dark, soundproof room and asked to find degraded black-and-white patterns embedded in ‘visual noise’ – similar to the television static you see when the signal fails. Such studies reveal that observers often raise a ‘false alarm’ – seeing patterns even when the experimenter hasn’t embedded one in the noisy display.

For a long time, it was thought that these ‘false alarms’ were strategic guesses: observers know that sometimes there’ll be a pattern in the noise and, if they have a lapse of attention, they might guess and hope they’re correct. However, a study led by the cognitive neuroscientist Valentin Wyart in 2012 suggested that observers really do detect patterns in the random noise. In particular, this study revealed that false alarms were more likely to occur when (just by luck) the noise spuriously looked like the pattern they were looking for. These hallucinations were also exaggerated when observers strongly expected the pattern to be there.
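
The pattern Wyart’s study describes can be illustrated with a toy signal-detection sketch (my own construction, not the study’s actual stimuli or model): an observer reports ‘pattern present’ whenever a stimulus’s similarity to a target template crosses a criterion. On noise-only trials, false alarms then occur precisely when the noise happens, by chance, to resemble the template:

```python
import random

rng = random.Random(7)
N = 64  # "pixels" per stimulus
template = [rng.choice([-1.0, 1.0]) for _ in range(N)]

def similarity(stimulus, template):
    # Template matching: dot product between stimulus and target pattern.
    return sum(s * t for s, t in zip(stimulus, template))

CRITERION = 10.0  # arbitrary decision criterion for this sketch

# 400 noise-only trials: pure Gaussian noise, no pattern embedded.
noise_trials = [[rng.gauss(0.0, 1.0) for _ in range(N)] for _ in range(400)]
scores = [similarity(trial, template) for trial in noise_trials]

false_alarm_scores = [s for s in scores if s > CRITERION]
correct_rejection_scores = [s for s in scores if s <= CRITERION]

print(f"false alarms on {len(false_alarm_scores)} of 400 noise-only trials")
```

By construction, every false alarm lands on a trial whose noise spuriously matched the target, which is the core of the ‘detection, not guessing’ account.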

My colleagues Clare Press, Carl Bunce and I thought that the same kind of thing could explain illusions of control: it might not be that we disregard the evidence in front of our eyes and decide irrationally that we can control things that we can’t. Instead, we might just be especially sensitive to the ways that changes in the world co-vary with our actions, and able to pick up on correlations that occur just by chance. If this were true, illusions of control would be a sign that humans are sensitive – not that they are grandiose.

We recently tested this idea using some new experimental techniques. Volunteers came into the lab and completed a task where they waved their hands over an infrared motion tracker to move a virtual dot – a bit like using a mouse to move a cursor on screen. Participants were told that sometimes the movements of the dot would be yoked to the movements they were actually performing, and sometimes the trajectory would be controlled by a computer. They would have to tell us if they thought they controlled the dot or not.

Our volunteers experienced strong illusions of control – feeling that they controlled the dot even when it was objectively programmed by the computer and they had no influence over it whatsoever. However – because we tracked how the participants moved – we could see that these hallucinations of control were linked to spurious correlations between action and outcome: participants were very sensitive to times when the uncontrolled dot randomly corresponded with what they were doing, and felt control when this correspondence was strong.

To further test this idea, we constructed computational models of how these decisions might be unfolding in our volunteers’ heads. This involved simulating different artificial agents and seeing how these machines would make judgments about what they can and can’t control if we placed them in our participants’ shoes. Importantly, an artificial agent programmed only to focus on the evidence at hand, rather than a grandiose agent with built-in delusional beliefs, provided a better explanation for the illusions of control we saw in our real volunteers.
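
The logic of such an evidence-tracking agent can be sketched in a few lines (a hypothetical stand-in for the actual models, offered only to make the idea concrete): the agent measures how well the dot’s trajectory correlates with its own movements and reports ‘control’ when that correlation is high. Even when hand and dot are generated independently, chance correspondences occasionally push the agent over its decision threshold, producing illusions of control with no built-in grandiose belief:

```python
import math
import random

def correlation(xs, ys):
    """Pearson correlation between two equal-length traces."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0.0 or sy == 0.0:
        return 0.0
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def feels_control(hand, dot, threshold=0.3):
    # Evidence-only agent: no prior belief in its own agency.
    return correlation(hand, dot) > threshold

rng = random.Random(3)
trials = 500
illusions = 0
for _ in range(trials):
    # Computer-controlled trials: dot trajectory is independent of the hand.
    hand = [rng.gauss(0.0, 1.0) for _ in range(20)]
    dot = [rng.gauss(0.0, 1.0) for _ in range(20)]
    if feels_control(hand, dot):
        illusions += 1

print(f"felt control on {illusions} of {trials} uncontrollable trials")
```

The agent’s occasional ‘felt control’ here comes entirely from spurious action-outcome correlation, mirroring the sensitivity account rather than a grandiosity account.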

These findings put pressure on the idea that humans have hardwired delusions about their actions, ignoring the evidence in front of their eyes and instead relying on exaggerated beliefs about the kinds of things they can influence. They suggest illusions of control could arise because we are very sensitive to how the world changes when we act, and can sometimes spot spurious relationships between our behaviour and changes in the environment. While the beliefs are false (we don’t actually control things), the inference might still be a rational one in an uncertain and changeable world.

This could also give us cause to think differently about the links between feelings of control and mental health. Under our theory, illusions of control are a counterintuitive sign that we are sensitive to the relationship between actions and outcomes. The same disposition that makes us occasionally hallucinate control equips us to spot weak but genuine correlations between what we do and what we see. If this is true, the absence of illusions of control in illnesses such as depression would mean that these patients don’t necessarily have a ‘sadder but wiser’ view of their capabilities. Indeed, real insight might be knowing that our control over our environments is almost never absolute, and it is important to appreciate the slight influences we have.

In Reinhold Niebuhr’s Serenity Prayer, the supplicant asks for the serenity to accept the things they cannot change, the courage to change the things they can, and the wisdom to know the difference. It has been tempting for scientists to think that illusions of control mean the human mind is rich on courage and lean on wisdom. But this might be premature, and new tools will allow scientists to reveal how our beliefs about our abilities are calibrated to the world around us – and whether the illusion of control is an illusion after all.

Pills given to selected victims of exempted criminals and abusers are not an answer; they are just more abuse, along with all the psychobabble and official bull-hit. R.J Cook Posted December 3rd 2020

Trauma unmakes the world of the self. Can stories repair it?

Anna Gotlib is an associate professor of philosophy at Brooklyn College in New York, specialising in bioethics, moral psychology and philosophy of law. She is the editor of The Moral Psychology of Sadness (2018) and The Moral Psychology of Regret (2020). In January 2020, she was a Fulbright Specialist Scholar at the University of Iceland.

Edited by Sam Dresser

Human beings are storytelling creatures: we spin narratives in order to construct our world. Whether on the cave walls of Lascaux or the golden record stored on the Voyager spacecraft, we want to share our selves and what matters to us through words, actions, even silence. Self-making narratives create the maps of the totality of our physical reality and experiences – or, as philosophers sometimes say, of the lifeworlds that we inhabit. And just as narratives can create worlds, they can also destroy them.

Trauma, in its many guises, has been part of these narratives since time immemorial, often by shattering the topographies of our lifeworlds. Breaking our most fundamental, most taken-for-granted means of self-understanding, it replaces our familiar narratives with something dreadful, something uncanny, sometimes something unspeakable.

What is trauma? Rather than just fear or guilt or unwanted memories, trauma is a totalising force that unmakes our worlds, leading to a kind of world-loss. It draws sharp lines marked ‘before’ and ‘after’: the ‘before’ demarcates the prelapsarian world, the self that we knew; the ‘after’ is the devastation of a broken lifeworld that remains.

Because we are natural storytellers, we turn to narratives in order to try to make sense of trauma. Our stories can vary as widely as human experience itself. Sometimes, while trauma breaks down our sense of who we are, it also shocks us into greater clarity about alternative, perhaps better, versions of ourselves. And so, while the Epic of Gilgamesh (c1800 BCE) offers a glimpse of the pain of trauma, it also explores its transformative effects. Through his grief over losing his beloved friend Enkidu, the arrogant Gilgamesh, touched by personal tragedy, becomes more connected to the mortal and the temporary – indeed, he becomes more human.

Trauma can be something that we choose to do to each other

Alternatively, in the ‘after’, we might find ourselves in the difficult liminal spaces that trauma makes for us. Sigmund Freud argued that we can be traumatised by a contradictory sense of what he called the ‘uncanny’: the almost-recognisable, forgotten or repressed thing that frightens us. We can remember the horror in pieces and parts, in vivid flashes and in opaque memorylessness. For this, too, we need narratives by which to navigate these borderlands of memory. And so Toni Morrison, in her novel Beloved (1987), writes narratives born of what she calls the ‘rememory’ of the survivors of slavery who confront the uncanny by recollecting and reassembling their histories, their families, their communities, their very selves. Trauma fuels the narratives, and the narratives themselves become the loci of trauma, the battlegrounds where suffering and memory meet.

Fictionalised narratives offer a kind of reckoning with our collective traumas. But understanding and responding to trauma also calls for personal stories that take us into singular experiences of lifeworld loss. What we need, then, is a more direct way to glimpse what trauma is like, and how we might go on in its wake.

These personal narratives can emerge against the background of world-historical tragedies. For instance, the Italian writer and Holocaust survivor Primo Levi describes his arrival at Auschwitz:

Driven by thirst, I eyed a fine icicle outside the window, within hand’s reach. I opened the window and broke off the icicle, but at once a large, heavy guard prowling outside brutally snatched it away from me. ‘Warum?’ [Why?] I asked him in my poor German. ‘Hier ist kein warum’ (there is no why here), he replied, pushing me inside with a shove.

Levi’s words transport us into his world, into his trauma – what it looks like, feels like, sounds like. We see a man dehumanised not only by the brutality of the camp itself, but by the attitudes of his tormentors that make such traumatisation possible. What Levi’s narrative makes clear is that how (or whether) we traumatise each other depends on what Ludwig Wittgenstein calls our ‘attitude towards a soul’: the role that our actions and words play in recognising another as human. Levi’s Nazi captors saw no such humanity in him. It is this dehumanisation that reveals to us how trauma can be something that we choose to do to each other. But it also suggests how it might be resisted: in the harshest of circumstances, such as a war, our attitudes toward another soul matter.

Does trauma have to be understood, expressed, and confronted narratively? For many, not at all. Instead, it might be viewed as an illness, reified by the psychiatric handbook the DSM‐III (1980), leading directly to the psychiatrist’s office for medication, to the psychologist for therapy, or else to the local bookstore for self-help literature. This medicalisation of trauma positions it as a disease, as an alien within that must be treated as one would treat a virus: identify the culprit, find the appropriate tool for combat, and destroy the enemy.

Yet because trauma’s complexities so often elude purely medicalised solutions, this is at best incomplete. Let’s then return to trauma as a felt, existentially threatening experience. A brush with serious illness in the intensive care unit leaves one with paralysing fear at the possibility of returning there during the pandemic. A victim of violence suffers from post-traumatic stress disorder, her world forevermore unsafe. In the traumatic ‘after’, we’re not simply facing negative emotions that can be medicated away. We lack the means of more permanent repair – the kind that doesn’t merely blunt the pain, but re-establishes meaningful lifeworlds. We are emotionally, narratively and psychologically adrift, having ‘outlived’ ourselves, as the philosopher and trauma survivor Susan Brison notes in Aftermath: Violence and the Remaking of a Self (2001), without a way back. What, then, is left to remake?

In the midst of despair, we can still find – indeed, create – meaning by embracing ‘tragic optimism’

When all else fails, we might remake the story itself. Because most of us can’t do much about the conditions in which we find ourselves, we can begin by repairing the stories about who we are.

How we proceed partly depends on what we want these narratives to do for us – after all, not all stories are reliable, or good, or restorative. Some narratives might insist on impenetrable hopelessness in the wake of suffering. Others might counsel forgetting of the trauma in favour of epistemic lacunas. But as the former lovers in the film Eternal Sunshine of the Spotless Mind (2004) discover, no amount of wilful forgetting fully erases our deepest, most unwanted recollections.

What also fails is a kind of magical thinking. Those who embrace the largely American tendencies toward triumphant, happiness-centred narratives offer stories of what the political activist Barbara Ehrenreich calls ‘reckless optimism’ in her book Bright-sided (2009), or Smile or Die as it’s titled in Britain. As Ehrenreich notes:

[W]e cannot levitate ourselves into that blessed condition by wishing it. We need to brace ourselves for a struggle against terrifying obstacles, both of our own making and imposed by the natural world.

None of these options, it seems to me, get us any closer to world-repair.

So what does? Surprisingly, a kind of optimism – but not the reckless variety. Viktor Frankl, the Austrian neurologist, psychiatrist and Holocaust survivor, argued in a postscript to his memoir Man’s Search for Meaning (1946) that in the midst of despair, tragedy and suffering we can still find – indeed, create – meaning by embracing what he called ‘tragic optimism’. This odd kind of optimism allows us to remake ourselves and our worlds, despite what Frankl calls the ‘tragic triad’ of pain, guilt and death.

It is to Frankl’s ‘tragic optimism’ that we might turn in the midst of trauma. Tragic optimism is found in the story of the Brooklyn ICU nurse who chooses to stay by the bedside of a dying COVID-19 patient, bearing witness to his suffering. Fully cognizant of the trauma of death in isolation, she offers a counternarrative by taking an attitude towards his soul that restores his lifeworld, and her own, even if a little. She bears witness in silence and in spoken narratives – ‘You are not alone.’ Her actions remake the trauma into something more meaningful: the isolation of human suffering and death is no longer unintelligible, but shared through a profound experience of compassion. And while her actions might not be life-saving in that most basic sense, they are world-saving for the patient, for whatever time remains.

So tragic optimism calls for letting go of our happiness-seeking tendencies. We face the difficult process of world-repair through the restoration of meaning – through our work, our relationships, and through engaging with suffering itself. And what this requires is not a denial of trauma’s existence, of its destructive powers, but the deliberate decision to act in ways that affirm our shared humanity by sustaining each other’s lifeworlds.

Trauma is not a virus to be medicated away, nor a tale to be forgotten, nor a deep sadness to be replaced with reckless optimism. What it can be is a catalyst for different stories – better stories – about who we are, what we value, and how we might live in the ‘after’. And these stories are not happiness-seeking – they are meaning-making, meaning-remaking. They are the narratives of tragic optimism that don’t fall prey to comfortable amnesias or myths of human invulnerability. They harbour no illusions about the indestructibility of our worlds. Perhaps if we engage with our traumas less reluctantly and open up to the possibilities of narrative world-remaking, we might integrate some of our worst experiences into the ever-evolving stories about who we are. However uneasily, we just might coexist with, and even flourish in, their glare. Because trauma can, and will, unmake our worlds again.

Write to Reward Your Reader

November 18, 2020

Illustration by Asia Pietrzyk

Chances are that every time you sit down to write — whether it’s a report or a speech or a white paper or an op-ed — you hear a little voice. It’s your high-school teacher or college professor reciting the rules of writing: Use the active voice. Choose strong verbs and nouns. Show, don’t tell.

But are these the right rules? Do they put the focus on what most matters? Is there another — even better — approach?

Research shows there is. Thanks to the work of psychologists and neuroscientists using MRIs, EEGs, PETs, and other tools, we can observe in never-before-seen detail what entices readers to read and listeners to listen. We now know how readers respond to simple words (versus complex), to specific language (versus abstract), to aesthetic features (versus literal ones), to metaphor (versus plain language), and more.

All of the research points to a single principle: You can actually write in a way that rewards our primal learning needs, prompting the release of pleasing chemicals in the reader’s “reward circuit,” a cluster of midbrain regions that drive desire and behavior. The first chemical out of the gate is dopamine, released when your neurons sense a cue for a likely reward. If the reward pays off, eventually a half dozen pleasure hotspots may glow.

You can craft a winning communication strategy that specifically taps the reward circuit embedded within our hunter-gatherer brains.  Here are five tactics:

Keep it simple: People may say they love complexity, but they’re usually praising wine, not prose. So favor simple words, simple sentences, and, above all, simple concepts distilled from complex ones.

Princeton University scientist Daniel Oppenheimer researched how readers viewed complexity. He asked 71 Stanford University students to assess two written passages. One was composed of simple words, the other, complex. Both said the same thing. The students, quizzed later, consistently said the authors of the complex passage were less intelligent.

Research has even shown that it literally pays to keep your writing simple: Researchers Byoung-Hyoun Hwang and Hugh Hoikwang Kim used a computer to rank the readability of shareholder reports from closed-end investment companies. Their finding: Companies whose reports were hard to read traded at a 2.5% discount to competitors.
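As a rough illustration of how this kind of automated readability ranking works, here is a sketch of the Gunning Fog index, one classic formula of the kind such studies rely on. (Hwang and Kim’s exact measure isn’t specified here, so treat this as illustrative; the syllable count is a crude vowel-group estimate.)

```python
import re

def fog_index(text):
    """Approximate Gunning Fog readability index.

    Fog = 0.4 * (average words per sentence + percentage of
    'complex' words), where a complex word has three or more
    syllables (estimated crudely by counting vowel groups).
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        return len(re.findall(r"[aeiouy]+", word.lower()))

    complex_words = [w for w in words if syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100 * len(complex_words) / len(words))

plain = "We made money this year. Sales grew. Costs fell."
dense = ("The corporation realized considerable appreciation in "
         "profitability notwithstanding macroeconomic uncertainty.")
print(fog_index(plain) < fog_index(dense))  # plainer prose scores lower
```

Lower scores correspond to plainer prose; a dense, polysyllabic passage like the second example scores dramatically higher on both components of the formula.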

So divide your big sentences in two, omit unnecessary adjectives and adverbs, cut useless transitions, and omit caveats that clutter your message. Make your writing accessible.

Keep it specific: Concrete details light up neurons that process smell, sight, sound, and motion. Your brain, as it turns out, yearns for full-bodied stimuli — and then it runs an internal multimedia show.

Scientists have shown that when people in MRI scanners read words like garlic, cinnamon, and jasmine, their olfactory circuits light up.  The same thing happens with sight, sound, and motion. So write as if you’re scripting lines for readers’ internal cinema.

Keep it stirring: You may think you persuade people with logic, not emotion, but our brains process emotions much faster than thoughts. Each emotion also comes programmed with reflexive reactions and motivations — fear, for example, prompting dry mouth and the urge to run, which served our hunter-gatherer ancestors who needed to outrun fires and snakes.

The lesson is that how your words make people feel shapes what they understand. Emotion and language deliver meaning together. The leaner you are on emotion, the slower readers are on comprehension.

Jonah Berger and Katherine Milkman tracked the virality of 7,000 New York Times articles. Stories carrying emotions — anger, awe, anxiety, surprise — got 34% more shares, and those with positive emotions did best of all. So at least pair your logic with some zeal. And favor metaphors as a potent way to do so.

Keep it social: Even hints of connection count. Experiments with poems, for example, show that a social signal as slight as a quotation mark — to indicate someone speaking — engages people’s reward circuitry. We are driven to seek out social cues as hungrily as any other reward.

So flavor your writing with your voice, character, and experience. Self-revelation — measured and apt — connects readers to you and turns on rewards.

An overlooked way to keep it more social is to write in the second person (i.e., “you”). Research on song lyrics and poems found that people preferred those that spoke directly to the audience. No other pronoun (“he,” “she,” or “they”) has the same power to create a sense of social connection.

Keep it story-driven: Evolutionarily, stories are believed to have served as a primary vehicle for sharing lessons. We’re wired to ask, “What did she do next?” and “What happened?” So play to your readers’ thirst with whodunnit or how-did-it narratives.

Telling stories can literally pay off. For example, researchers who looked at two kinds of business crowdfunding campaigns found that campaigns with richer narratives earned higher marks for entrepreneur credibility and legitimacy, and increased people’s intentions to invest and share. The implication: No stories, no funding!

Tastes vary, of course, but we’re all affected by basic evolutionary drives. Ultimately readers don’t listen to what you say because they like your style. They listen because they love how you reward them in the ancient midbrain. That principle ties all the rules of great communication together.

So, the next time you’re struggling for the right words, turn not just to your teachers’ advice. Turn inward as well to your ancient muse and ask, “What would a hunter-gatherer read?”

Bill Birchard is a business author and book-writing coach. His book Eight Secrets from Science for Aspiring Writers is in progress. His previous books include Merchants of Virtue, Stairway to Earth, Nature’s Keepers, Counting What Counts, and others. For more tips, see his website.

A Johns Hopkins Study Reveals the Scientific Secret to Double How Fast You Learn

November 25th 2020

Making one small change to the way you practice can make a huge difference in how quickly you gain new skills.


  • Jeff Haden


Photo from Getty Images.

When you’re trying to learn something new — like, say, making that new sales demo really sing — you need to practice. When you’re trying to gain expertise, how much you practice is definitely important.

But even more important is the way you practice.

Most people simply repeat the same moves. Like playing scales on the piano, over and over again. Or going through the same list of vocabulary words, over and over again. Or, well, repeating anything over and over again in the hopes you will master that task.

Not only will your skills not improve as quickly as they could, in some cases, they may actually get worse.

According to researchers at Johns Hopkins, “What we found is if you practice a slightly modified version of a task you want to master, you actually learn more and faster than if you just keep practicing the exact same thing multiple times in a row.”

Why? The most likely cause is reconsolidation, a process where existing memories are recalled and modified with new knowledge.

Here’s a simple example: trying to get better at shooting free throws in basketball. The conditions are fixed. The rim is always 10 feet above the floor. The free throw line is always 15 feet from the basket.

In theory, shooting from the same spot, over and over again, will help you ingrain the right motions into your muscle memory so your accuracy and consistency will improve.

And, of course, that does happen — but a better, faster way to improve is to slightly adjust the conditions in subsequent practice sessions.

Maybe one time you’ll stand a few inches closer. Another time you might stand a few inches to one side. Another time you might use a slightly heavier, or lighter, ball.

In short, each time you practice, you make the conditions a little different. That primes the reconsolidation pump — and helps you learn much more quickly.

But Not Too Different — or Too Soon

But you can’t adjust the conditions more than slightly. Do something too different and you’ll simply create new memories — not reconsolidated ones.

“If you make the altered task too different, people do not get the gain we observed during reconsolidation,” the researchers say. “The modification between sessions needs to be subtle.”

And you’ll also need to space out your practice sessions appropriately.

The researchers gave the participants a six-hour gap between training sessions, because neurological research indicates it takes that long for new memories to reconsolidate.

Practice differently too soon and you haven’t given yourself enough time to “internalize” what you’ve learned. You won’t be able to modify old memories — and therefore improve your skills — because those memories haven’t had the chance to become old memories.

So if you want to dramatically improve how quickly you learn a new skill, try this.

How to Learn a New Skill

The key to improvement is making small, smart changes, evaluating the results, discarding what doesn’t work, and further refining what does work.

When you constantly modify and refine something you already do well, you can do it even better.

Say you want to improve a skill; to make things simple, we’ll pretend you want to master a new presentation.

1. Rehearse the basic skill. Run through your presentation a couple of times under the same conditions you’ll eventually face when you do it live. Naturally, the second time through will be better than the first; that’s how practice works. But then, instead of going through it a third time …

2. Wait. Give yourself at least six hours so your memory can consolidate. (Which probably means waiting until tomorrow before you practice again, which is just fine.)

3. Practice again, but this time …

  • Go a little faster. Speak a little — just a little — faster than you normally do. Run through your slides slightly faster. Increasing your speed means you’ll make more mistakes, but that’s OK — in the process, you’ll modify old knowledge with new knowledge — and lay the groundwork for improvement. Or …
  • Go a little slower. The same thing will happen. (Plus, you can experiment with new techniques — including the use of silence for effect — that aren’t apparent when you present at your normal speed.) Or …
  • Break your presentation into smaller parts. Almost every task includes a series of discrete steps. That’s definitely true for presentations. Pick one section of your presentation. Deconstruct it. Master it. Then put the whole presentation back together. Or …
  • Use a different projector. Or a different remote. Or a lavaliere instead of a headset mic. Switch up the conditions slightly; not only will that help you modify an existing memory, it will also make you better prepared for the unexpected.

4. And then, next time, slightly modify another condition.
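The four steps above can be sketched as a toy schedule generator. The six-hour minimum gap and the list of tweaks come from the article; the function name and everything else about the structure are invented for illustration.

```python
from datetime import datetime, timedelta
import random

# Toy practice scheduler: enforce at least six hours between
# sessions (so memories can reconsolidate) and vary exactly one
# condition, slightly, on each later session.
MIN_GAP = timedelta(hours=6)
TWEAKS = [
    "speak slightly faster",
    "speak slightly slower",
    "drill one section in isolation",
    "switch projector / remote / mic",
]

def practice_schedule(start, sessions):
    random.seed(0)  # reproducible example
    plan, t = [], start
    for i in range(sessions):
        # First run establishes the baseline; later runs each
        # modify a single condition.
        tweak = "baseline run" if i == 0 else random.choice(TWEAKS)
        plan.append((t, tweak))
        t += MIN_GAP  # never practice again sooner than six hours
    return plan

for when, tweak in practice_schedule(datetime(2020, 11, 25, 9, 0), 4):
    print(when.strftime("%a %H:%M"), "-", tweak)
```

In practice the gap will usually be longer than six hours (overnight, say); the point encoded here is only the minimum spacing and the one-small-change-per-session rule.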

Keep in mind you can extend this process to almost anything. While it’s clearly effective for improving motor skills, the process can also be applied to nearly any skill.

Don’t do the same thing over and over again in hopes you’ll improve. You will, but not nearly as quickly as when you slightly modify the conditions in subsequent practice sessions — and then give yourself the time to consolidate the new memories you’ve made.

Keep modifying and refining a skill you already do well and you can do it even better.

And a lot more quickly.

That’s the fastest path to expertise.

Source: Inc.

A Math Theory for Why People Hallucinate

Posted November 23rd 2020

Psychedelic drugs can trigger characteristic hallucinations, which have long been thought to hold clues about the brain’s circuitry. After nearly a century of study, a possible explanation is crystallizing.

Quanta Magazine

  • Jennifer Ouellette


Credit: aeforia and Olena Shmahalo / Quanta Magazine.

In the 1920s, decades before counterculture guru Timothy Leary made waves self-experimenting with LSD and other psychedelic drugs at Harvard University, a young perceptual psychologist named Heinrich Klüver used himself as a guinea pig in an ongoing study into visual hallucinations. One day in his laboratory at the University of Minnesota, he ingested a peyote button, the dried top of the cactus Lophophora williamsii, and carefully documented how his visual field changed under its influence. He noted recurring patterns that bore a striking resemblance to shapes commonly found in ancient cave drawings and in the paintings of Joan Miró, and he speculated that perhaps they were innate to human vision. He classified the patterns into four distinct types that he dubbed “form constants”: lattices (including checkerboards, honeycombs and triangles), tunnels, spirals and cobwebs.

Some 50 years later, Jack Cowan of the University of Chicago set out to reproduce those hallucinatory form constants mathematically, in the belief that they could provide clues to the brain’s circuitry. In a seminal 1979 paper, Cowan and his graduate student Bard Ermentrout reported that the electrical activity of neurons in the first layer of the visual cortex could be directly translated into the geometric shapes people typically see when under the influence of psychedelics. “The math of the way the cortex is wired, it produces only these kinds of patterns,” Cowan explained recently. In that sense, what we see when we hallucinate reflects the architecture of the brain’s neural network.

But no one could figure out precisely how the intrinsic circuitry of the brain’s visual cortex generates the patterns of activity that underlie the hallucinations.

Heinrich Klüver classified the shapes he saw while under the influence of hallucinogenic drugs into four categories, known as “form constants.” Credit: Lucy Reading-Ikkanda / Quanta Magazine.

An emerging hypothesis points to a variation of the mechanism that produces so-called “Turing patterns.” In a 1952 paper, the British mathematician and code-breaker Alan Turing proposed a mathematical mechanism for generating many of the repeating patterns commonly seen in biology — the stripes of tigers or zebra fish, for example, or a leopard’s spots. Scientists have known for some time that the classic Turing mechanism probably can’t occur in a system as noisy and complicated as the brain. But a collaborator of Cowan’s, the physicist Nigel Goldenfeld of the University of Illinois, Urbana-Champaign, has proposed a twist on the original idea that factors in noise. Experimental evidence reported in two recent papers has bolstered the theory that this “stochastic Turing mechanism” is behind the geometric form constants people see when they hallucinate.

Sweaty Grasshoppers

Images we “see” are essentially the patterns of excited neurons in the visual cortex. Light reflecting off the objects in our field of view enters the eye and comes to a focus on the retina, which is lined with photoreceptor cells that convert that light into electrochemical signals. These signals travel to the brain and stimulate neurons in the visual cortex in patterns that, under normal circumstances, mimic the patterns of light reflecting off objects in your field of view. But sometimes patterns can arise spontaneously from the random firing of neurons in the cortex — internal background noise, as opposed to external stimuli — or when a psychoactive drug or other influencing factor disrupts normal brain function and boosts the random firing of neurons. This is believed to be what happens when we hallucinate.

But why do we see the particular shapes that Klüver so meticulously classified? The widely accepted explanation proposed by Cowan, Ermentrout and their collaborators is that these patterns result from how the visual field is represented in the first visual area of the visual cortex. “If you opened up someone’s head and looked at the activity of the nerve cells, you would not see an image of the world as through a lens,” said Peter Thomas, a collaborator of Cowan’s who is now at Case Western Reserve University. Instead, Thomas explained, the image undergoes a transformation of coordinates as it is mapped onto the cortex. If neuronal activity takes the form of alternating stripes of firing and non-firing neurons, you perceive different things depending on the stripes’ orientation. You see concentric rings if the stripes are oriented one way. You see rays or funnel shapes emanating from a central point — the proverbial light at the end of the tunnel common in near-death experiences — if the stripes are perpendicular to that. And you see spiral patterns if the stripes have a diagonal orientation.
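The geometry behind this stripes-to-shapes correspondence can be sketched with the standard log-polar approximation of the map from retina to primary visual cortex (a simplification of the full Ermentrout–Cowan model):

```latex
% A retinal point in polar coordinates is w = r e^{i\theta}.
% The retino-cortical map is approximately the complex logarithm,
%   z = \ln w = \ln r + i\theta,
% so cortical coordinates are (x, y) = (\ln r, \theta).
% Stripes of cortical activity then map back to the visual field as:
\begin{align*}
x = c       &\;\longmapsto\; r = e^{c}              && \text{(concentric rings)} \\
y = c       &\;\longmapsto\; \theta = c             && \text{(rays and funnels)} \\
y = mx + c  &\;\longmapsto\; \theta = m\ln r + c    && \text{(logarithmic spirals)}
\end{align*}
```

So the same cortical stripe pattern yields rings, rays, or spirals depending only on its orientation, just as described above.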

But if geometric visual hallucinations like Klüver’s form constants are a direct consequence of neural activity in the visual cortex, the question is why this activity spontaneously occurs — and why, in that case, it doesn’t cause us to hallucinate all the time. The stochastic Turing mechanism potentially addresses both questions.

Alan Turing’s original paper suggested that patterns like spots result from the interactions between two chemicals spreading through a system. Instead of diffusing evenly like a gas in a room until the density is uniform throughout, the two chemicals diffuse at different rates, which causes them to form distinct patches with differing chemical compositions. One of the chemicals serves as an activator that expresses a unique characteristic, such as the pigmentation of a spot or stripe, while the other acts as an inhibitor, disrupting the activator’s expression. Imagine, for example, a field of dry grass dotted with grasshoppers. If you start a fire at several random points, with no moisture present, the entire field will burn. But if the heat from the flames causes the fleeing grasshoppers to sweat, and that sweat dampens the grass around them, you’ll be left with periodic spots of unburned grass throughout the otherwise charred field. This fanciful analogy, invented by the mathematical biologist James Murray, illustrates the classic Turing mechanism.
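The grasshopper story can be made concrete with a minimal one-dimensional activator-inhibitor simulation. The kinetics below are Gierer–Meinhardt-style, and every parameter value is illustrative rather than drawn from the article or from Cowan’s model; the essential ingredient is only that the inhibitor diffuses much faster than the activator.

```python
import numpy as np

# Minimal 1-D reaction-diffusion sketch of the Turing mechanism.
# The activator amplifies itself and produces its own inhibitor;
# the inhibitor diffuses much faster, so it suppresses activation
# at a distance while local self-activation survives.
rng = np.random.default_rng(0)
n, steps, dt = 200, 4000, 0.01
Da, Di = 0.25, 8.0                          # inhibitor spreads much farther
a = 2.0 + 0.01 * rng.standard_normal(n)     # activator near steady state
h = 2.0 + 0.01 * rng.standard_normal(n)     # inhibitor near steady state

def laplacian(u):
    # Nearest-neighbor diffusion on a ring (periodic boundary).
    return np.roll(u, 1) + np.roll(u, -1) - 2.0 * u

for _ in range(steps):
    da = Da * laplacian(a) + a * a / (h + 1e-9) - a   # self-activation
    dh = Di * laplacian(h) + a * a - 2.0 * h          # fast inhibition
    a += dt * da
    h += dt * dh

# The tiny initial noise grows into a stable pattern of alternating
# peaks and troughs, the 1-D analogue of spots and stripes.
print(float(a.std()))
```

Starting from near-uniform noise, the fast-spreading inhibitor carves the slow, self-amplifying activator into periodic peaks and troughs, which is exactly the role Cowan and Ermentrout assign to short-range excitatory and long-range inhibitory neurons in the visual cortex.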

Turing acknowledged that this was a greatly simplified toy model for how actual patterns arise, and he never applied it to a real biological problem. But it offers a framework to build on. In the case of the brain, Cowan and Ermentrout pointed out in their 1979 paper that neurons can be described as activators or inhibitors. Activator neurons encourage nearby cells to also fire, amplifying electrical signals, while inhibitory neurons shut down their nearest neighbors, dampening signals. The researchers noticed that activator neurons in the visual cortex were mostly connected to nearby activator neurons, while inhibitory neurons tended to connect to inhibitory neurons farther away, forming a wider network. This is reminiscent of the two different chemical diffusion rates required in the classic Turing mechanism, and in theory, it could spontaneously give rise to stripes or spots of active neurons scattered throughout a sea of low neuronal activity. These stripes or spots, depending on their orientation, could be what generates perceptions of lattices, tunnels, spirals and cobwebs.

While Cowan recognized that there could be some kind of Turing mechanism at work in the visual cortex, his model didn’t account for noise — the random, bursty firing of neurons — which seemed likely to interfere with the formation of Turing patterns. Meanwhile, Goldenfeld and other researchers had been applying Turing’s ideas in ecology, as a model for predator-prey dynamics. In that scenario, the prey serve as activators, seeking to reproduce and increase their numbers, while predators serve as inhibitors, keeping the prey population in check with their kills. Thus, together they form Turing-like spatial patterns. Goldenfeld was studying how random fluctuations in predator and prey populations affect these patterns. He knew about Cowan’s work in neuroscience and soon realized his insights could apply there as well.

Houses With Eyes and Jaws

A condensed matter physicist by training, Goldenfeld gravitates toward interdisciplinary research, applying concepts and techniques from physics and math to biology and evolutionary ecology. Roughly 10 years ago, he and his then graduate student Tom Butler were pondering how the spatial distribution of predators and prey changes in response to random local fluctuations in their populations, for instance if a herd of sheep is attacked by wolves. Goldenfeld and Butler found that when a herd’s population is relatively low, random fluctuations can have big effects, even leading to extinction. It became clear that ecological models need to take random fluctuations into account rather than just describe the average behavior of populations. “Once I knew how to do the fluctuation calculation for pattern formation,” Goldenfeld said, “it was an obvious next step to apply this to the hallucination problem.”

In the brain, it’s the number of neurons that are on or off that randomly fluctuates rather than sheep and wolf populations. If an activator neuron randomly switches on, it can cause other nearby neurons to also switch on. Conversely, when an inhibitory neuron randomly switches on, it causes nearby neurons to switch off. Because the connections between inhibitory neurons are long-range, any inhibitory signals that randomly arise spread faster than random excitatory signals — exactly what’s needed for a Turing-like mechanism. Goldenfeld’s models suggested that stripes of active and inactive neurons will form in a Turing-like pattern. He dubbed these stochastic Turing patterns.

However, to function properly, the visual cortex must be primarily driven by external stimuli, not by its own internal noisy fluctuations. What keeps stochastic Turing patterns from constantly forming and causing us to constantly hallucinate? Goldenfeld and colleagues argue that even though the firing of neurons can be random, their connections are not. Whereas short-range connections between excitatory neurons are common, long-range connections between inhibitory neurons are sparse, and Goldenfeld thinks this helps suppress the spread of random signals. He and his cohorts tested this hypothesis by creating two separate neural network models. One was based on the actual wiring of the visual cortex, and the other was a generic network with random connections. In the generic model, normal visual function was substantially degraded because the random firing of neurons served to amplify the Turing effect. “A generically wired visual cortex would be contaminated by hallucinations,” Goldenfeld said. In the realistic model of the cortex, however, internal noise was effectively dampened.

Nigel Goldenfeld, a physicist at the University of Illinois, Urbana-Champaign, hypothesizes that the stochastic Turing mechanism underlies visual hallucinations. Credit: Seth Lowe for Quanta Magazine.

Goldenfeld suggests that evolution has selected for a particular network structure that inhibits hallucinatory patterns: The sparseness of connections between inhibitory neurons prevents inhibitory signals from traveling long distances, disrupting the stochastic Turing mechanism and the perception of funnels, cobwebs, spirals and so forth. The dominant patterns that spread through the network will be based on external stimuli — a very good thing for survival, since you want to be able to spot a snake and not be distracted by a pretty spiral shape.

“If the cortex had been built with these long-range inhibitory connections all over the place, then the tendency to form these patterns would be stronger than the tendency to process the visual input coming in. It would be a disaster and we would never have survived,” Thomas said. Because long-range inhibitory connections are sparse, “the models don’t produce spontaneous patterns unless you force them to, by simulating the effects of hallucinogenic drugs.”

Experiments have shown that hallucinogens like LSD appear to disrupt the normal filtering mechanisms the brain employs, perhaps boosting long-range inhibitory connections and therefore permitting random signals to amplify in a stochastic Turing effect.

Goldenfeld and collaborators have not yet tested their theory of visual hallucinations experimentally, but hard evidence that stochastic Turing patterns do arise in biological systems has emerged in the last few years. Around 2010, Goldenfeld heard about work done by Ronald Weiss, a synthetic biologist at the Massachusetts Institute of Technology who had been struggling for years to find the appropriate theoretical framework to explain some intriguing experimental results.

Years earlier, Weiss and his team had grown bacterial biofilms that were genetically engineered to express one of two different signaling molecules. In an effort to demonstrate the growth of a classic Turing pattern, they tagged the signaling molecules with fluorescent markers so that the activators glowed red and the inhibitors glowed green. Although the experiment started out with a homogeneous biofilm, over time a Turing-like pattern emerged, with red polka dots scattered throughout a swath of green. However, the red dots were much more haphazardly located than, say, leopards’ spots. Additional experiments also failed to yield the desired results.

When Goldenfeld heard about these experiments, he suspected that Weiss’ data could be viewed from a stochastic point of view. “Rather than trying to make the patterns more regular and less noisy,” Weiss said, “we realized through our collaboration with Nigel that these are really stochastic Turing patterns.” Weiss, Goldenfeld and collaborators finally published their paper in the Proceedings of the National Academy of Sciences last month, 17 years after the research began.

The biofilms formed stochastic Turing patterns because gene expression is a noisy process. According to Joel Stavans of the Weizmann Institute of Science in Israel, that noise is responsible for disparities among cells, which can have the same genetic information yet behave differently. In recently published work, Stavans and his colleagues investigated how noise in gene expression can lead to stochastic Turing patterns in cyanobacteria, ancient organisms that produce a large proportion of the oxygen on Earth. The researchers studied Anabaena, a type of cyanobacteria with a simple structure of cells attached to one another in a long train. An Anabaena’s cells can specialize to perform one of two activities: photosynthesis, or converting nitrogen in the atmosphere into proteins. An Anabaena might have, for instance, one nitrogen-fixing cell, then 10 or 15 photosynthesis cells, then another nitrogen-fixing cell, and so on, in what appears to be a stochastic Turing pattern. The activator, in this case, is a protein that creates a positive feedback loop to produce more such proteins. At the same time, the protein may also produce other proteins that diffuse to neighboring cells and inhibit the first protein’s production. This is the primary feature of a Turing mechanism: an activator and an inhibitor fighting against each other. In Anabaena, noise drives the competition.

Researchers say the fact that stochastic Turing processes appear to be at work in these two biological contexts adds plausibility to the theory that the same mechanism occurs in the visual cortex. The findings also demonstrate how noise plays a pivotal role in biological organisms. “There is not a direct correlation between how we program computers” and how biological systems work, Weiss said. “Biology requires different frameworks and design principles. Noise is one of them.”

There is still much more to understand about hallucinations. Jean-Paul Sartre experimented with mescaline in Paris in 1935 and found it distorted his visual perception for weeks. Houses appeared to have “leering faces, all eyes and jaws,” clock faces looked like owls, and he saw crabs following him around all the time. These are much higher-level hallucinations than Klüver’s simple form constants. “The early stages of visual hallucination are very simple — these geometric patterns,” Ermentrout said. But when higher cognitive functions kick in, such as memory, he said, “you start to see more complex hallucinations and you try and make sense of them. I believe that all you’re seeing is the spontaneous emergence of [stored memories] as the higher brain areas become more excited.”

Back in the ’20s, Klüver also worked with subjects who reported tactile hallucinations, such as cobwebs crawling across their skin. Ermentrout thinks this is consistent with a cobweb-like form constant mapped onto the somatosensory cortex. Similar processes might play out in the auditory cortex, which could account not only for auditory hallucinations but for phenomena like tinnitus. Cowan agrees, noting that the brain has similar wiring throughout, so if a theory of hallucinations “works for vision, it’s going to work for all the other senses.”

Jennifer Ouellette is a freelance writer and an author of popular science books, including “Me, Myself, and Why: Searching for the Science of Self.”

How is paranoid personality disorder treated?

People with PPD often do not seek treatment on their own because they do not see themselves as having a problem. The distrust of others felt by people with PPD also poses a challenge for health care professionals because trust is an important factor of psychotherapy (a form of counseling). As a result, many people with PPD do not follow their treatment plan and may even question the motives of the therapist.

When a patient seeks treatment for PPD, psychotherapy is the treatment of choice. Treatment likely will focus on increasing general coping skills, especially trust and empathy, as well as on improving social interaction, communication, and self-esteem.

Medication generally is not used to treat PPD. However, medications—such as anti-anxiety, antidepressant, or anti-psychotic drugs—might be prescribed if the person’s symptoms are extreme, or if he or she also suffers from an associated psychological problem, such as anxiety or depression.

What are the complications of paranoid personality disorder?

The thinking and behaviors associated with PPD can interfere with a person’s ability to form and maintain relationships, as well as their ability to function socially and in work situations. In many cases, people with PPD become involved in legal battles, suing people or companies they believe are “out to get them.”

Comment: This is a catch-all for anyone criticising the state and its public services. It is incredibly convenient in an age where so many are, apparently for no specific reason, labelled mentally ill. R.J Cook


What is paranoid personality disorder? Posted November 19th 2020

Paranoid personality disorder (PPD) is a long-term, mental health condition. PPD causes you to be suspicious, distrusting, and hostile toward others. This is because you think they want to hurt you or take advantage of you. You may have trouble trusting or getting along with others. These thoughts and behaviors can cause problems with your relationships and daily activities.

What causes PPD?

The cause may not be known. Your risk for PPD is increased if you have a family history of the disorder. You are also at risk if you were abused or neglected as a child.

What are the symptoms of PPD?

  • You think other people will harm, trick, or take advantage of you.
  • You think that your friends might not be loyal. You may think about how they have let you down. You may search for proof that they cannot be trusted.
  • You are nervous about talking to other people because you are afraid they will use the information against you.
  • You often hold grudges against people who you believe have done something bad to you. You believe that the actions were done to hurt you, and you cannot forgive the people who did them. You may see people as your enemies, and want to get back at them.
  • You think that others are trying to insult you. You may hear a person say one thing, but you think that they mean something else.
  • You suspect that your partner has been unfaithful.
  • You think that certain people are trying to make you look bad to others. You may react by getting angry or attacking them back. You may also believe that your reputation is being threatened.

What other conditions might I have with PPD?

  • Depression
  • Obsessive-compulsive disorder
  • Agoraphobia
  • Alcohol or substance abuse

How is PPD diagnosed?

Your healthcare provider will ask about your history and whether you want to hurt yourself or others. They will also ask about your behaviors, feelings, and relationships with others.

How is PPD treated?

Medicines can help decrease anxiety or depression and make you feel more stable.

How can I manage my symptoms?

Go to individual or group therapy. You may need any of the following types of therapy:

  • Supportive psychotherapy helps you understand your behaviors and actions. This can help you cope with your disorder so you can have positive relationships.
  • Family therapy helps you and your family communicate and teaches your family how they can best support you.

When should I contact my healthcare provider?

  • You are depressed.
  • You feel anxious or worried.
  • You do not want to leave your house.
  • You begin to drink alcohol, or you drink more than usual.
  • You take illegal drugs.
  • You take medicines that are not prescribed to you.
  • You have questions or concerns about your condition or care.

When should I seek immediate care or call 911?

  • You have severe depression.
  • You want to hurt yourself or others.

Care Agreement

You have the right to help plan your care. Learn about your health condition and how it may be treated. Discuss treatment options with your healthcare providers to decide what care you want to receive. You always have the right to refuse treatment. The above information is an educational aid only. It is not intended as medical advice for individual conditions or treatments. Talk to your doctor, nurse or pharmacist before following any medical regimen to see if it is safe and effective for you.

© Copyright IBM Corporation 2020 Information is for End User’s use only and may not be sold, redistributed or otherwise used for commercial purposes. All illustrations and images included in CareNotes® are the copyrighted property of A.D.A.M., Inc. or IBM Watson Health

Brain Fog November 14th 2020

Mental fog is often described as a “cloudy-headed” feeling.

Common symptoms of brain fog include poor memory, difficulty focusing or concentrating, and trouble articulating thoughts.

Imagine if you could concentrate your brain power into one bright beam and focus it like a laser on whatever you wish to accomplish.

Many people struggle to concentrate. And when you can’t concentrate, everything you do is harder and takes longer than you’d like.

Give Up the Clutter

Mess creates stress.

There’s a strong link between your physical space and your mental space.

Clutter is bad for your mind and health. It can create long-term, low-level anxiety.

When the book The Life-Changing Magic of Tidying Up: The Japanese Art of Decluttering and Organizing, by Marie Kondo, became a best-seller, it wasn’t too surprising.

We are all looking for ways to create more meaningful lives with less to distract us.

Get rid of clutter at your office, on your desk, in your room, and you will send a clear message of calm directly to your brain.

Start decluttering today in small, focused bursts. You’re not going to clean up your entire space in a day, so start small to make it a daily habit that sticks.

Set yourself up for success by making a plan and targeting specific areas you’re going to declutter, clean up, and organize over a prolonged period of time.

Multi-Tasking Doesn’t Work

The ability to multi-task is a false badge of honor.

Task switching has a severe cost.

Your concentration suffers when you multitask.

It compromises how much actual time you spend doing productive work, because you’re continually unloading and reloading the hippocampus (your short-term memory).

Research shows that task switching actually burns more calories and fatigues your brain – reducing your overall capacity for productive thought and work.

Commit to completing one task at a time.

Remove potential distractions (like silencing your mobile, turning off email alerts) before you start deep work to avoid the temptation to switch between tasks.

Use the 3-to-1 method!

Narrow down your most important tasks to 3, and then give one task your undivided attention for a period of time.

Allow yourself to rotate between the three, giving yourself a good balance of singular focus and variety.

Give Up the Urgent Distraction

Disconnect. Your productivity, creativity and next big idea depend on it.

Urgency wrecks productivity. Urgent but unimportant tasks are major distractions.

Last-minute distractions are not necessarily priorities.

Sometimes important tasks stare you right in the face, but you neglect them and respond to urgent but unimportant things.

You need to reverse that. It’s one of the only ways to master your time.

Your ability to distinguish urgent and important tasks has a lot to do with your success.

Important tasks are things that contribute to your long-term mission, values, and goals. Separating the two is simple enough to do once, but doing so continually can be tough.

Stop Feeding Your Comfort

Comfort provides a state of mental security.

When you’re comfortable and life is good, your brain can release chemicals like dopamine and serotonin, which lead to happy feelings.

But in the long-term, comfort is bad for your brain.

Without mental stimulation, dendrites (the connections between brain neurons that keep information flowing) shrink or disappear altogether.

An active life increases dendrite networks and also increases the brain’s regenerating capacity, known as plasticity.

“Neglect of intense learning leads plasticity systems to waste away,” says Norman Doidge in his book, The Brain That Changes Itself.

Michael Merzenich, a pioneer of plasticity research, and author of Soft-wired: How the New Science of Brain Plasticity Can Change Your Life says that going beyond the familiar is essential to brain health.

“It’s the willingness to leave the comfort zone that is the key to keeping the brain new,” he says.

Seeking new experiences, learning new skills, and opening the door to new ideas inspire us and educate us in a way that improves mental clarity.

Don’t Sit Still

Sitting still all day, every day, is dangerous.

Love it or hate it, physical activity can have potent effects on your brain and mood.

The brain is often described as being “like a muscle”. It needs to be exercised for better performance.

Research shows that moving your body can improve your cognitive function.

30–45 minutes of brisk walking, three times a week, can help fend off the mental wear and tear.

What you do with your body impinges on your mental faculties.

Find something you enjoy, then get up and do it. And most importantly, make it a habit.

Stop Consuming Media and Start Creating Instead

It’s extremely easy to consume content.

You are passive. Even relaxed.

But every piece of content you consume displaces a piece of content you could have created.

Limit your mass media consumption.

Embrace the creation habit.

Start paying attention to the noise that you let seep into your eyes and ears.

Ask, Is this benefitting my life in any way?

Does all this information make me more prone to act?

Does it really make me more efficient? Does it move me forward in any significant way?

Let creation determine consumption.

Allow curiosity to lead you to discover and pursue something you deeply care about. Make time to create something unique.

The point is to get lost in awe and wonder like you did when you were a child. When you achieve that feeling from a certain activity, keep doing it!

Share your authentic self with the rest of us.

Thomas Oppong is the founder of AllTopStartups and writes science-based answers to life’s problems, covering creativity, productivity, and self-improvement.

The Amazing Psychology of Japanese Train Stations

The nation’s famed mastery of rail travel has been aided by some subtle behavioral tricks. Posted November 10th 2020


  • Allan Richarz


Passengers line up for a bullet train at a platform in Tokyo Station. Photo by Yuya Shino/Reuters

It is a scene that plays out each weekday morning across Tokyo. Suit-clad office workers, gaggles of schoolchildren, and other travelers gamely wend their way through the city’s sprawling rail stations.

To the casual observer, it is chaos; commuters packed shoulder-to-shoulder amid the constant clatter of arriving and departing trains. But a closer look reveals something more beneath the surface: A station may be packed, yet commuters move smoothly along concourses and platforms. Platforms are a whirl of noisy activity, yet trains maintain remarkable on-time performance. Indeed, the staggering punctuality of the Japanese rail system occasionally becomes the focus of international headlines—as on May 11, when West Japan Railways issued a florid apology after one of its commuter trains left the station 25 seconds early.

Tokyo is home to the world’s busiest train stations, with the capital’s rail operators handling a combined 13 billion passenger trips annually. Ridership of that volume requires a deft blend of engineering, planning, and psychology. Beneath the bustle, unobtrusive features are designed to unconsciously manipulate passenger behavior, via light, sound, and other means. Japan’s boundless creativity in this realm reflects the deep consideration given to public transportation in the country.

Passengers wait for a train at a platform of a station in Kawasaki. Photo by Kim Kyung-Hoon/Reuters

Rail stations, whether in Japan or elsewhere, are also great places to see “nudge theory” at work. Pioneered by behavioral economist Richard Thaler, who was awarded the 2017 Nobel Memorial Prize for his work, and Harvard Law School professor Cass Sunstein, the theory posits that gentle nudges can subtly influence people towards decisions in their own (or society’s) best interests, such as signing up for private pension schemes or organ donation. In the U.K., there’s a government office devoted to the idea, the Behavioural Insights Team (or “nudge unit”), and their work often shows up in the transit realm.

In 2016, for instance, London Underground operator Transport for London partnered with the behavioral science department at the London School of Economics to develop ways of encouraging riders to queue on both sides of station escalators as a means of increasing capacity at the capital’s Holborn Station. Among other measures, simple hand- and footprints were painted on each side of the “up” escalators. In Australia, researchers conducted an experiment with lighted directional arrows on signposts to improve flows of departing passengers: using a camera system designed to distinguish, say, brisk-walking businesspeople from dawdling tourists, green arrows would flash to direct commuters along an efficient route towards the exit.

When it comes to passenger manipulation, what sets the stations of Japan apart from their counterparts is both the ingenuity behind their nudges and the imperceptible manner in which they are implemented. Japan’s nudges reflect a higher order of thinking. The orderliness of society is taken as a given—Japanese commuters know how to queue on an escalator and can easily navigate the confusing, but wide-open, spaces of Tokyo’s rail stations without assistance. This allows rail operators to instead focus on deeper psychological manipulation.

The Ultimate in Mood Lighting

Japan has one of the highest suicide rates among OECD nations, and often, those taking their own lives do so by leaping from station platforms into the path of oncoming trains, with Japan averaging one such instance each day. It is a brutal, disruptive end that can also wreak havoc across the transit system.

To address the issue, stations across Tokyo and the rest of Japan installed chest-high barriers as a means of preventing suicide attempts. But platform barriers are expensive, and about 70 percent of Japan’s largest and most-travelled stations do not have the platform space or structural strength to accommodate them. While there are hopes to have platform barriers installed in all 243 of Tokyo’s train stations by 2032 (at a cost of $4.7 billion), rail operators in the interim have come up with alternative approaches.

Standing at either end of a platform in Tokyo’s labyrinthine Shinjuku Station, one might detect a small square LED panel emitting a pleasant, deep-blue glow. Nestled among vending machines and safety posters, the panel might be dismissed as a bug zapper. But these simple blue panels are designed to save lives.

A blue-light panel in a Japanese train station, designed to calm agitated passengers. Photo by Allan Richarz/CityLab

Operating on the theory that exposure to blue light has a calming effect on one’s mood, rail stations in Japan began installing these LED panels as a suicide-prevention measure in 2009. They are strategically located at the ends of each platform—typically the most-isolated and least-trafficked area, and accordingly, the point from which most platform jumps occur. Some stations, such as Shin-Koiwa Station in Tokyo, bolster their LED regime with colored roof panels, allowing blue-tinted sunlight to filter down on to platforms.

It is an approach that has proven to be surprisingly effective. According to a study by researchers at the University of Tokyo published in the Journal of Affective Disorders in 2013, data analyzed over a 10-year period shows an 84 percent decline in the number of suicide attempts at stations where blue lights are installed. A subsequent study revealed no corresponding increase in suicide attempts at neighboring stations lacking such lights.

The idea has been picked up in the U.K.: Several stations in England now emulate the Japanese approach, with blue LED light panels on station platforms.

A Song for a More Peaceful Departure

Commuting during rush hour in Japan is not for the faint of heart. The trains are jam-packed at as much as 200 percent capacity during the height of rush hour, and razor-thin connection times to transfer from one train to another leave little margin for error. Compounding the stressful nature of the commute in years past was the nerve-grating tone—a harsh buzzer used to signal a train’s imminent departure. The departing train buzzer was punctuated by sharp blasts of station attendants’ whistles, as harried salarymen raced down stairs and across platforms to beat the train’s closing doors.

To calm this stressful audio environment, in 1989 the major rail operator JR East commissioned Yamaha and composer Hiroaki Ide to create hassha melodies—short, ear-pleasing jingles to replace the traditional departure buzzer.

Also known as departure or train melodies, hassha tunes are brief, calming and distinct; their aim is to notify commuters of a train’s imminent departure without inducing anxiety. To that end, most melodies are composed to an optimal length of 7 seconds, owing to research showing that shorter-duration melodies work best at reducing passenger stress and rushing incidents, as well as taking into account the time needed for a train to arrive and depart.

The tunes feature whimsical titles like “Seaside Boulevard” and range from the wistful to the jaunty. Most stations have their own melodies, forming de facto theme songs that become part of a station’s identity. Tokyo’s Ebisu Station, for example, is known for its departure melody—a short, stylized version of the theme from The Third Man.

As more stations have added melodies over the years, the original thesis has proven correct. A study conducted in October 2008 at Tokyo Station, for instance, found a 25 percent reduction in the number of passenger injuries attributable to rushing after the introduction of hassha melodies on certain platforms.

The use of these jingles is not without controversy, however. Shortly after their introduction, residents living near open-air rail stations, weary of hearing endless repetitions of the same jingles all day, complained of noise pollution.


Despite, or perhaps because of, its reputation as a remarkably safe country, Japan is nonetheless vigilant in combatting youth delinquency. Train stations are particularly sensitive in that regard, since large congregations of young people pass through stations at all hours of the day.

To address the Japanese fear of loitering and vandalism by young riders, some train stations deploy ultrasonic deterrents—small, unobtrusive devices that emit a high-frequency tone. The particular frequency used—17 kilohertz—can generally only be heard by those under the age of 25. (Older people can’t detect such frequencies, thanks to the age-related hearing loss known as presbycusis.) These devices—the brainchild of a Welsh inventor and also used to fend off loitering teens in the U.S. and Europe—have been enthusiastically adopted in Japan.

Standing outside one of Tokyo Station’s numerous exits on a recent summer day, it was easy to see the effectiveness of this deterrent in action. Weary salarymen and aged obaachan passed under the sonic deterrent without changing pace. Among uniform-clad students, however, the reactions were evident—a suddenly quickened pace, a look of confusion or discomfort, and often a cry of urusai! (Loud!) None appeared to connect the noise to the deterrents placed almost flush in the ceiling panels above.

Pointing the Best Way Forward

Rail employees are not exempt from the behavioral hacks of their employers. Perhaps most famously, Japanese train conductors, drivers, and platform attendants are mandated to use the “point and call” method—called shisa kanko—in executing tasks. By physically pointing at an object, and then verbalizing one’s intended action, a greater portion of the brain is engaged, providing improved situational awareness and accuracy. Studies have repeatedly shown that this technique reduces human error by as much as 85 percent. Pointing-and-calling is now a major workplace safety feature in industries throughout Japan.

So, why don’t train workers everywhere do this? Like so many aspects of Japanese transit culture, shisa kanko has proved resistant to export (though pointing-and-calling has been adopted in modified form by New York City’s transit authority). In this, as in so many things, Japan’s rail system stands largely alone.

Allan Richarz is a privacy lawyer and writer based in Tokyo, Japan.

‘It Feels Like a Derangement’: Menopause, Depression, & Me

Estrogen is more powerful and more wide-ranging than is assumed, and its removal or diminishment brings effects ludicrously understated by “the change.” Posted November 9th 2020

The New York Review of Books

  • Rose George


Vilhelm Hammershøi: Rest, 1905. Credit: René-Gabriel Ojéda / Google Arts & Culture.

Menopause: the ceasing of menstruation, or the period in a woman’s life (typically between forty-five and fifty-five) when this occurs.

I stare stupidly at it. It’s nothing much to look at. It’s only a small pile of clothing: the shorts and tank top that I wear in bed, which I have thrown onto the floor before getting into the shower. I stare stupidly at the clump because I can’t pick it up. It’s astonishing I managed to shower, because I know already that this is a bad day, one when I feel assaulted by my hormones, which I picture as small pilots in those huge Star Wars armored beasts that turn me this way and that, implacable. On this morning, I wake up with fear in my stomach—fear of nothing—and I know it will be a bad day.

For a while, I thought I could predict these days. I have had practice. This is my second menopause: the first was chemically induced seven years ago to treat my endometriosis, a condition that has riddled my insides with adhesions of endometrial tissue, and stuck my organs together. The adhesions are exacerbated by estrogen; the drug switched it off. (The same drug can block other hormones and is also used to treat pedophilia and prostate cancer.) I hated that menopause; it was a crash off a cliff into sudden insomnia and depression and a complete eradication of sexual desire. “The symptoms will last six months,” said the male ob-gyn, with a voice he thought was kind but that sounded only casual. They lasted far longer. The nurse giving me the first injection said, “He keeps prescribing this stuff, but women hate it.”

This menopause is the natural one. I’m two years in. It doesn’t feel natural. It feels like a derangement. With each menopause, I have chosen to take hormone replacement therapy (HRT). The first time because I wanted my sleep back. This time because I spent a year researching menopause for a magazine article, and because I have weighed the risks and judged them acceptable, and because I know what happened last time, when I was broken. The two occasions when I asked for HRT are the only two on which I have cried in a doctor’s office.

Every Wednesday and Saturday, I take two 100-microgram transdermal patches of estradiol (a form of estrogen). I fix them to my abdomen, swapping sides each time. They never fall off, though I go running for hours at a time and sweat. This is the maximum dose of estrogen, and it took about a year for me to understand I needed this amount, a year of peeling skin, sore tendons, poor sleep, awful sadness, inexplicable weeping, and various other “symptoms” of menopause that you can find listed if you look beyond the hot flashes and insomnia. (I don’t know why Americans say “flash” instead of “flush”; I prefer the British-English word, less fleeting than a flash, a better fit for that rise in temperature, violently sudden and overwhelming, that makes you feel as if you had never been cool or would be again.) Estrogen is more powerful and more wide-ranging than is assumed, and its removal or diminishment brings effects ludicrously understated by “the change.”

A friend gave me access to her university library and I start to swim among papers, sometimes floundering. I learn that estrogen is a gonadal steroid produced by the ovaries and essential to female reproduction. It is a sex hormone but—it is now known—far more besides. There are receptors for estrogen all over the body. In the brain, the densest amounts are in the amygdala, the hippocampus, and the hypothalamus. Estrogen influences serotonin, dopamine, glutamate, and noradrenaline. It is involved in cognitive function. Its diminishment can impair verbal dexterity, memory, and clarity of thought. Recently, scientists discovered that estrogen is also produced in the adrenal glands, breasts, adipose tissue, and brain. This is astonishing. But so is the extent of the unknown.

Peri-menopausal women (whose periods may be irregular, who have symptoms, but who are not yet post-menopausal) are twice as likely to have depressive symptoms or depression as pre-menopausal women. Peri-menopausal women who were vulnerable to depression during the menstrual cycle are more susceptible to depression when they enter menopause or its hinterlands. This is accepted, but there is disagreement about how to fix it. Antidepressants often don’t work. Studies show both success and failure when women are given estrogen to counter depression. Controversy exists over whether the menopausal transition is a risk factor for the development of depression, I read. And, I think, the person who wrote that has probably never been on a menopause forum, where women’s stories and pain would make me weep, if I didn’t feel like weeping already, from menopause.

Because I have a womb—though it is likely of no use for fertility, thanks to the endometriosis—I also take progesterone for ten days a month. This induces the womb to shed its endometrium, which may otherwise thicken to cancer-risky proportions. So I still bleed, and choose to. I knew from my research that the gentlest version of progesterone is micronized, something that my doctor had to look up. I didn’t know that taking it orally, as I had for many months, would bring me profound sadness, fatigue, weight gain, awfulness. That wasn’t something I discovered in my research, and no one told me.

I can’t pick up the clothes. I can’t explain the granite of that “can’t” to anyone else, the way it feels impossible to beat. Look at me looking at the pile and you will think, Just pick it up. For fuck’s sake. But I don’t. I look at it, and the thought of accomplishing anything makes my fear and despair grow. Every thought brings on another and that prospect is frightening. All those thoughts. I write that down and I feel stupid and maudlin and dramatic. A privileged freelance writer who does not have a full-time job that requires her presence in an office and can be indulgent of what the medical profession calls “low moods.” In fact, plenty of menopausal women leave their jobs, endure wrecked relationships, suffer, and cope. Or don’t. But I don’t feel maudlin and dramatic in the bathroom, or on any other of a hundred occasions over the past two years. I feel terrified. I have no reason to feel fear. But my body acts as though I do: the blood rushing from my gut to my limbs in case I need to flee, leaving the fluttering emptiness that is called “butterflies,” though that is too pretty a description.

Still, I set off on my bicycle to my writing studio. I hope I can overcome the day. I always hope, and I am always wrong. A few hours later, I find myself cowering in my workspace, a studio I rent in a complex of artists’ studios, scared to go downstairs to the kitchen because I can’t bear to talk to anyone I might find there. I have done nothing of use all day. Every now and then, I stop doing nothing and put my head in my hands because it feels safe and comfortable, like a refuge. I look underneath my desk and think I might sit there. There is no logic to this except that it is out of sight of the door and no one will find me.

Even so, when the phone rings I answer it. I shouldn’t, but I am hopeful that I can manage it and mask it, and I haven’t spoken to my mother for a few days and would like to. It goes well for a few minutes, because I’m not doing the talking. Then she asks me whether I want to accompany her to a posh dinner, several weeks hence. She doesn’t understand when I ask to be given some time to think about it. “Why can’t you decide now?” I say it’s one of the bad days, but I know this is a mixed message: If it’s that bad, how am I talking on the phone and sounding all right? Because I am a duck: talking serenely above, churning below, the weight on my chest, the catch in my throat, the inexplicable distress. I try to explain but I’m also trying hard not to weep, and so I explain it badly.

She doesn’t understand. This is not her fault. She is a compassionate woman, but she had an easy menopause, so easy that she can say, “Oh, I barely remember it.” One of those women: the lucky ones. She doesn’t understand depression, though both her children experience it, because she has never had it. “But you sounded well,” she says, “I thought you were all right.” Now she says, “I don’t understand how your not being well is stopping you deciding whether you want to go to dinner.” Because it is a decision, and a decision is too hard, requiring many things to happen in my brain and my brain is too busy being filled with fear and panic and tears and black numbness. There is no room to spare.

I hang up because I can’t explain this. I stay there for a while, sitting on my couch, wondering how to face cycling home or leaving my studio or opening the door. All these actions seem equally impossible.

It takes a while but finally I set off. I know where I’m going. I have learned. On days like this, there are only two places to be. One is in my darkened bedroom with my cat lying next to me. On days like this, she takes care to lie closer to me than usual because she knows and because she loves me. Maybe my darkness has a smell.

The other place to be is in unconsciousness.

These are the safe places because everything is quiet. On days like this, I wonder if this is what autism feels like, when sensation is overwhelming. Not just noise, but thoughts, sights, all input. It is on the bad days that I realize what a cacophony of impressions we walk through every day, and how good we are at receiving and deflecting, as required. Every day, we filter and sieve; on the bad days, my filters fail.

I sometimes call these bridge days, after a footbridge near my studio that goes at a great height over the busy A64 road. On days like this, that bridge is a danger for me. I am not suicidal, but I have always had the urge to jump. This is a thing with a name. HPP: high place phenomenon. The French call it “l’appel du vide.” So very Sartre of them: the call of emptiness. The A64 is the opposite of emptiness, but still, it is a danger. Today I don’t have the filter that we must all have to function: the one that stops us stepping into traffic or fearing the cars or buses that can kill us at any time. The one that mutes the call of the HPP.

I avoid the bridge. I cycle home, trying not to rage at drivers who cut me off and ignore me. I have no room for rage along with everything else. Thoughts that would normally flow now snag. Every observation immediately triggers a negative thread, a spiral, and a worsening. On a good day, I can pass a child and a mother and think, How nice. Nothing more. Fleeting. Unimportant. On a bad day, I see the same and think of my own infertility, how I have surely disappointed my mother by not giving her grandchildren; how it is all too late, and what have I done with my life, and my book will be a failure and today is lost and I can’t afford to lose the time. It goes on and on. Snagging thoughts that drag me down, that are relentless.

When I get inside my house, I cry. I try to watch something or read, but nothing interests me. This is called anhedonia and is a symptom of depression: the forgetting how to take pleasure. The best thing to do is sleep away the day, as much as I can.

Toward evening, I begin to feel a faint foolishness. This is my sign. Embarrassment. Shame at the day and at my management of it. When I am able to feel that and see that, I am getting better. Now I manage to watch TV, though only foreign-language dramas. Without the filmmaking industries of northern Europe, my menopause would be even bleaker. Foreign words go somewhere shallower in the brain; they are less heavy. But soon I switch it off. I don’t care about the plot. I don’t care about anything. I take a sleeping pill to get the day over with, so the better next day can begin.

Twenty-four hours earlier, I had been wearing a Santa hat, running for five miles through icy bogs on a Yorkshire moor, happy to be doing that for fun, happy to be alive.

April 4. Sleep mostly OK; a few days of melatonin after stopping progesterone. Last night I was exhausted but slept badly. Mood difficult but not dreadful. Angry and irritated. No bleed after progesterone. Peeling skin. Weepy and panic now. Can I face people?

Depression, wrote William Styron, is a noun “with a bland tonality and lacking any magisterial presence, used indifferently to describe an economic decline or a rut in the ground, a true wimp of a word for such a major illness.” It was pioneered by a Swiss psychiatrist who, Styron thought, perhaps had a “tin ear” and “therefore was unaware of the semantic damage he had inflicted by offering ‘depression’ as a descriptive noun for such a dreadful and raging disease.”

Black dog. Walking through treacle. Low moods. Nothing I have read of depression has conveyed the crippling weight of it, that is a weight made out of nothing.

I do not have depression according to most authoritative clinical definitions of the condition. Depression is a long-term chronic illness. Mine is unpredictable, and before I got my HRT dose right, it lasted weeks at a time; but usually, these days, it lasts no more than twenty-four hours. My now-and-thens do not qualify as a disease. I do not count as depressed. Instead, I am one of the women of menopause, who struggle to understand why we feel such despair, why now we cry when before we didn’t, why understanding what is left and what is right takes a fraction longer than it used to: all this is “low mood” or “brain fog.” These diminishing phrases, which convey nothing of the force of the anguish or grief that assaults us, are reserved for women and usually relate to menstrual cycles or hormones.

I have never been sunny. People who can rise from their beds and see joy without working at it have always been a mystery to me. I still feel guilty for once asking a cheery person, cheery very early in the morning, why he was so happy—I made it sound like an accusation, not praise—and I watched as his face fell and his warmth iced over. I’m still sorry. Cheeriness always seems like an enviable gift. I have always been susceptible to premenstrual upheaval: two days a month when things feel awful, as though they have never been anything else. I endured them. There have been therapists sometimes and antidepressants now and then, and, for the last few years, running, in whatever wilds I can find. The best therapy. I have managed.

Then I became what I am. A menopausal woman. In the eyes of evolution, that makes me a pointless person. I can no longer reproduce, if I ever could. The grandmother theory of menopause—that women live beyond their reproductive utility in order to care for grandchildren—doesn’t persuade me. Also, I have no grandchildren. I cannot account for how awful menopause can be, unless I think that we were not meant to survive it. A useless evolutionary blip.

Thursday 14. Removed old patch, added half a new one. Mood immediately plunged. Awful: anhedonia, anxiety, panic, weepiness. I still ran but stopped to cry in the middle. So sick of this, and I can’t work.

For months, I resisted HRT. I endured as my periods got erratic, as I lost my ability to sleep through the night, as my temperature rose furiously and intolerably at unpredictable moments, all the time. I had forgotten from the first time what it was like to stink, to carry around a fan, to wear so much black so the sweat didn’t show. I had forgotten what a hot flash—such an innocent phrase—felt like; what the night sweats—such an innocent phrase—felt like. I woke up in the night boiling hot and pouring sweat. I use “pouring” deliberately because I was drenched. Sometimes, I woke up freezing because I was covered in cold sweat. Every athlete knows to change clothes as soon as possible because sweat chills so fast. Every night, it was as though I was running several races. I woke up fatigued, stinking, and angry that something so common, something that affects millions of women, is still such a medical mystery. Why do we get hot flashes? We don’t know. Why is sleep broken? We don’t know. Why are we the only creatures to get menopause apart from two types of whales? We don’t know.

I saw my doctor, who prescribed a low dose of HRT and a visit to a specialized menopause clinic, of which there are far too few. The symptoms continued, and were far more numerous than the hot flashes and insomnia to which menopause is usually reduced in common perception. I made a list: at various points, my skin peeled, my ears rang with tinnitus, my posterior tibial tendon swelled, my lubrication disappeared, my eyes dried so it felt as if I had grit in them, my jaw locked. My menopause doctor prescribed a higher dose of HRT, but the troubles continued; I got a higher one still and still they came. Finally, I sat in her office and said I couldn’t think straight. I felt like I was going mad. I became clumsier, dropping things. I forgot everything: names, events, appointments. My partner began to say, carefully, too often, “Yes, you’ve mentioned that,” in the same way I used to say it to my dad when he had dementia. The menopause doctor said, “This is just your age.” I never went back. The year before, aged forty-six, I had had no brain confusion. Forty-seven, and menopausal, I did. And she was a specialist.

I paid to see a private menopause specialist who immediately said I could be on the maximum dose of estrogen, that she couldn’t understand why no one had told me that taking progesterone orally causes many women troubles such as profound fatigue and depression, or that I could take it vaginally in half the dose for less of the time, which would be better (it is). She also prescribed testosterone, a clinical decision that is controversial in the small circle of medical professionals who take an interest in menopause. It is unnecessary, say skeptics, because the ovaries produce enough testosterone, and mine are still there, though sputtering into dysfunction. But it can help, say others, because, in the same way that estrogen is far more than a sex hormone, testosterone can lift energy, mood, life. Perhaps I would get a libido back. Perhaps I would remember what desire feels like, rather than looking at my partner and thinking how lovely he is, but distantly, through a glass pane, as if someone else were thinking it, as if that thought had nothing to do with me.

I took my new boxes of patches, a pump gel of estrogen to top up with on the bad days, my precious testosterone, and went home with hope. It took months, but things stabilized. Now, there is never more than one bad day at a time of these “low moods.” The phrase is belittling. My depression is not simply feeling miserable or glum. I know what that feels like. I know that that can be fixed by fresh air or effort. This depression is dysfunction, derangement. I hate myself so hard. And I miss myself, the woman who didn’t feel like this. The woman who felt uncomplicated sexual desire, whose skin healed quickly and didn’t scar so easily, whose hands did not dry and flake, whose ears didn’t ring; whose bladder didn’t leak. On the good days, I am at peace with my age, with what I have done, with who I am, menopausal or not. I delight in what I can do, and when I run, I hurtle headlong down a steep descent with the joy of a child, aged nearly fifty. But on other days, that woman seems like someone else.

Monday 25. First morning I haven’t felt dread and weepiness. Not giddy like before, but like things are possible. But also scared of mood flipping—and it did. Horribly. Weepy, panicking, total anhedonia. I haven’t left the house. At 3:30 I went to bed and woke up at 6. I feel profoundly sad, black, AWFUL. Did it all change after I drank coffee?

Tuesday 26. No coffee. Panic, dread, weepy. Can’t focus, can’t wash up.

I grasp for reason. I look for patterns. I keep a diary for eighteen months. If I can understand the patterns, I can predict the bad days and allow for them. I can plan for them. Tom Cruise in Minority Report had “pre-crime” to prevent and disrupt future criminal threats. Perhaps I can have pre-depression. For many months, I think that the bad days come when my estrogen dips on the last day before I get new patches. I stop scheduling things on Mondays and Fridays. But then the pattern changes so that I know it never was a pattern. Sometimes, it’s a Tuesday. Sometimes, a Sunday. I can’t tell. I give up the diary.

I try to take control by being less embarrassed. Once, when I still had flashes and was out at dinner, I got out my fan and a relative said, “Must you?” I don’t understand this reaction. People are not mortified by cancer patients on chemo who sweat and use fans. Is it because menopause is to do with periods? Is it because women’s health must be hidden and quiet? Is it because women do hide it? I can’t think why the irregularities of the hypothalamus should be socially unacceptable. I kept using my fan for as long as I needed to, though I felt faintly uneasy.

The only acceptable place for menopause is in menopause jokes. The humor that masks distress and shame. The woman in a meeting who laughs off her sweating, who talks of “power surges.” The comedians and their mothers-in-law and their flushes or flashes. What if it came out of jokes and into accepted conversation?

For many months, I told people I was “unwell.” Not crippled, not weeping, not disabled. “Unwell.” The implication: that there is something physically wrong. A proper illness, not depression. A definition of depression is heartache, but it is my head that aches. What if I told everyone I had a severe headache? A broken ankle not brain? They would understand better. Then, one day, as I sit at my computer and think of the writing deadline that is today and feel despair, and I try to read serious medical literature and instead put my head in my hands again, I decide to write to the commissioning editor, even though she is new and this may form her opinion of me, and say: I can’t function today. I can’t write. And it is because of depression. Please give me leeway. It shames me to write it, but I do. And I do it again, when needed. So far, every response has been profoundly kind. I should have done it sooner.

Mental illness. Such an odd concept. How strange to put a division between mental and physical illness, as if the brain is not in the body. As if emotions are not regulated by the brain. As if feelings are not linked to hormones. As if all maladies are not of the body. And still mental illness is put in its place, which is in a different category. Not “real” illness. Not physical. Easier to fix, to underfund, to sweep into the dark corner of the unspoken. Imagine the contrary. Have you broken your ankle? Cheer up. Do you have third-degree burns? Chin up. Think yourself better, you with your chronic lymphocytic leukemia. Smile.

May 4. Finally felt better yesterday. Tweeted fury about BBC menopause doc and all its “low moods.” Messaged with a doctor who thinks 50mcg of estradiol is too low, particularly for someone who was prone to PMT. She also thought I should try testosterone. I immediately went downstairs and put another patch on. Retroactively furious with Dr. X for sticking so firmly to the dose, but maybe I played down the depression. Today I slept well. Mood good. A feeling in my stomach that is positivity, like I can do things.

In London, at an event: clever people all around me, and I am on a panel discussing clever things. But I do not feel clever. I feel like a dolt, that when my mouth opens stupidity and cotton wool come out. I meet people I know and like from social media, and am happy to learn that I like them in real life, too. We go for a drink, but I want to leave. There is no reason for me to feel this way: the people are nice, the place is great, the cocktails look tasty. I mostly drink water and leave early and walk through the quietening streets of London and feel so numb I can’t even be bothered to loathe myself. The next morning, I wake gloomy, my head foggy apparently from just one glass of prosecco. The room is hot, the city noises are infuriating. I put new estrogen patches on my abdomen. I smear testosterone gel, two pea-sized globs, on my inner thighs. I go through the motions of other activities and wait. Half an hour later, as I am walking to the station, I feel a quiet flood of good mood. It feels as though the estrogen is lifting me slightly. I picture a tide floating buoys higher and higher in a harbor. Estrogen is hefting and hauling me out of depression, for today.

This is my theory. It is unproven, according to the literature. I wish the urge to better understand the extent of estrogen’s reach, and the devastation its fluctuation can bring, had arisen decades ago. There has been more research in recent years, but I doubt that the driver for this knowledge is how poorly menopause is treated or understood; more likely it is that the loss of estrogen is implicated in higher rates of Alzheimer’s disease in postmenopausal women. There is money in Alzheimer’s, but not in making women’s lives better.

Friday 22. Woke up at 10. Awful, awful, awful. Got up at 12 and ran 10 miles, got back and burst into tears. Profound sadness, depression, weepiness. One of the worst yet. Panic at night.

My mother says, the day after another bad day, “I feel so awful for you. Why can’t they fix it?” They are doing all they can, I say. I don’t really believe this. If women’s health were taken as seriously as men’s, this probably would have been solved a while ago.

The trouble with women is we cope. We always do.

I keep fit. I gave up alcohol for months, reasoning that it plunges me into depression the next day—and I can produce those days all on my own without paying money to make them happen. Over the years, I have taken citalopram, sertraline, black cohosh, red clover, omega 3, magnesium, iron, vitamin D. For a while, I saw a serene herbalist, who mixed dark potions and told me I should eat chickpeas and tofu to get their phyto-estrogens to bind to the receptors all over my body, that these are good estrogens and binding them is something I want to do. But I don’t understand her explanation and imagine only battalions of chickpeas marching around my body seeking docking stations. Many peri-menopausal women with depression are prescribed antidepressants. I hope theirs work, as mine did nothing. I know the iron helps, and I think the magnesium does, too, because when I forget to take it, I start to feel stupider.

In scientific papers, researchers argue about whether women feeling depressed in menopause (pre-, peri-, post-) are actually just experiencing the ups and downs of life. We are brought low, they reason, by the hot flashes and sleeplessness, not by hormonal fluctuations. Or we are diminished by life. At that age, I read, women may have aging parents to care for; grown children and an empty house; empty marriages. Their depressive symptoms are a mourning for who they were and what is to come. They have what is called “the redundancy syndrome.” It’s just coincidence that they are also menopausal. “Research has found,” I read, “that depressed mood and depressive disorder in middle-aged women are related less to menopause than to the vicissitudes of life.”

I bristle at this. Although I wonder. I remember a month away in France when I had not a single bad day. I notice that my mood lifts once my book is written and its huge pressure is also lifted. I wonder: Is my problem not menopause-specific depression, but that the removal of estrogen leaves me less protected against my natural lows? This theory lasts until the next bad day when I remember how elemental it feels. There is no choice involved. I would not choose to feel the way I do. Who the hell would?

May 2. I slept fine and took no pills but today was the same. Sad, weepy, furious. I can interact with people but in-between is awful. I went home at 3 and went to bed until 6. I hate this.

Today. Today is a decent day. It has taken me months to write this essay because when I am bad, I can’t write, and when I am not, I don’t want to remember. Tomorrow? My menopausal status is being masked by HRT, so I won’t know when I become post-menopausal until I dare to stop my artificial bolster of hormones. My post-menopausal friends tell me everything is better on the other side. I want to believe them and ask my doctor, a young woman half my age, when I can stop taking HRT and what will happen if I do. She says, “Four years? That’s about right.” Stay on HRT for four years, wean yourself off it, and then see. She doesn’t say that this means I have to plan for a period of life when I can risk being brutalized by depression and insomnia for weeks at a time, not days. When I can crash to the bottom again. Even on a good day, I think that time will be never.

Rose George is a British-based journalist and writer who has contributed to The Guardian, the Financial Times, Details, and Condé Nast Traveler, among other publications. She is also the author of “A Life Removed,” “The Big Necessity: The Unmentionable World of Human Waste and Why It Matters,” “Ninety Percent of Everything,” and, most recently, “Nine Pints.” Follow her on Twitter: @rosegeorge3. (The New York Review of Books)


Why Do We See Dead People?

Humans have always sensed the ghosts of loved ones. It’s only in the last century that we convinced ourselves this was a problem. By Patricia Pearson
Illustration by Megan Kyak-Monteith. Published 15:54, Oct. 27, 2020; updated 19:38, Nov. 6, 2020. (The Walrus)

In the late spring of 2015, my brother-in-law paid a visit to my sister’s grave, in a lush meadow cemetery amid the Gatineau Hills of southern Quebec. My sister had been dead, at this point, for seven years, and the couple had been separated for twelve. Doug sat in the grass among planted geraniums for half an hour or so, musing about the rise and fall of their marriage. He told Katharine, or her grave, that he was sorry for the part he had played in the dissolution. Then, plucking up and tossing a handful of grass, desultory, he began his two-and-a-half-hour motorcycle journey back to Montreal.


“The landscape is open there, with a big wide sky, but it was overcast and had started to rain—just barely, but it made me a bit nervous,” Doug later told me. Even fit riders in their fifties experience the occasional lapse in confidence. “It wasn’t until I was maybe halfway home that I felt her presence.”

“The sense wasn’t physical at first,” he went on, “just this really nice, strong awareness of her. And then I had the distinct sensation of her arms around me and her leaning in close against my back. It was tactile and fantastic. I felt warm. I was completely calm and happy, smiling from ear to ear. That hardly ever happens to me.” His nervousness about the rain ebbed, and it occurred to him that Katharine was there to keep him safe on behalf of their two sons. She—her presence, her spirit—rode behind him for twenty minutes or so. “What I know is that it did not feel at all like a product of my imagination,” he said. “It felt external to me. It felt real.”

He wasn’t prepared to name what the experience pointed to: that he had been visited by my sister’s ghost. Like other secular North Americans, he is aware that we must uphold a certain paradigm and say “this cannot be.” After all, Doug considers himself a rationalist: the son of an engineer, himself an amateur astronomer. Nevertheless, the sensed presence mattered deeply to him. “It was,” he said, “a remarkable, indelible experience.”

Sigmund Freud was the first to articulate the concept of “wishful psychosis” in grief, a notion of temporary madness featuring wilfully conjured visions of the dead. A person who’s lost someone might see the face of their beloved, hear their voice, notice the smell of their pipe or perfume, or simply be struck by a feeling of their presence. Such ghostly apparitions were diagnosed as fanciful yearnings by Freud—warning signs of some lingering dependency. In his 1917 essay “Mourning and Melancholia,” he urged his patients toward recovery by severing bonds with the dead: move on and let go, lest sorrow bedevil and sink you. For decades, this was one of the counselling profession’s central models for grief recovery: a sort of tacit agreement played out between therapist and patient that what the latter sensed, no matter how comforting it may be or how real it may seem, dwelled in their head and would best be forgotten. When the physician W. Dewi Rees uncovered the prevalence rate of these hallucinations in a 1972 study of Welsh widows and widowers—about 50 percent—he also found that three-quarters of them had never spoken of the experience before being asked in his survey. Unsurprisingly, these people didn’t wish to be pathologized. They also didn’t want to move on.

In 1970, English author Sylvia Townsend Warner, a frequent contributor of short stories to The New Yorker, had an unexpected visit from her dead lover, Valentine Ackland, lost the previous year to breast cancer. Roused one night at three, Warner found, as she later wrote in her diary, that Ackland had followed her to bed. “Not remembered,” she clarified, “not evoked, not a sense of presence. Actual.” In the dark quiet of their British cottage, this “actual” Ackland, solid yet ephemeral, engaged in a reuniting embrace. Then she was gone. “I held her again,” Warner noted with deep satisfaction. “It was. It is.”

Ought anyone to have argued with her? Death and its accompanying grief are often shrouded by awkward silences, but the unwavering prevalence of these apparitions, whether viewed as grief hallucinations or as ghosts, lays bare a metaphysical crisis at the heart of our common model of mourning: for there to be efficacy in recovery, these experiences must be respected as real. As counselling psychologist Edith Maria Steffen notes in her book, Continuing Bonds in Bereavement, there is a “controversial reality status” at play that can erode the trusting relationship between therapist and bereaved person if not handled with care and nuance. The same can be said for family and friends. The question is not whether these apparitions are real, it’s why the first impulse of many is to stifle these stories and dismiss the experiences as impossible.

Pulling power

Women find a man more attractive once they learn he has a wife or girlfriend, study suggests Posted November 8th 2020

They think he is more likely to be kind and faithful and, therefore, a good dad

Published: 30 Jan 2018, 0:53 | Updated: 30 Jan 2018, 3:54

WOMEN find a man more attractive once they learn he has a wife or girlfriend, a study suggests.

They think he is more likely to be kind and faithful and, therefore, a good dad, psychologists believe.

A study has revealed that having a partner gives men an “attractiveness boost”. Credit: Alamy

Having a partner gives men an “attractiveness boost”, they say.

But women may simply be swayed by the opinions of others: they were just as likely to raise their ratings of artworks that others liked.

The findings come from an experiment testing the notion of “mate-choice copying”, which is seen in female birds and fish.

It can offer an evolutionary advantage by boosting their chances of finding a good sexual partner.

Researchers asked 49 female volunteers to rate men’s faces, men’s hands and a piece of art.

When they were then shown other women’s ratings and asked to rate the faces and the art again, their scores moved 13 per cent closer to the average facial score and 14 per cent closer to the average art rating.

Research leader Dr Kate Cross, from the University of St Andrews, said: “Women appear to copy the mate preferences of other women but this might simply be because humans have a general tendency to be influenced by the opinions of others.”


Antipsychotic Medicines Posted October 27th 2020

Authored by Dr Laurence Knott, Reviewed by Dr Hannah Gronow | Last edited 29 Jun 2018


Antipsychotics are medicines that are mainly used to treat schizophrenia or mania caused by bipolar disorder. There are two main types of antipsychotics: atypical antipsychotics and older antipsychotics. Both types are thought to work as well as each other. Side-effects are common with antipsychotics. You will need regular tests to monitor for side-effects while you take these medicines.


What are antipsychotics?

Antipsychotics are a group of medicines that are mainly used to treat mental health illnesses such as schizophrenia, or mania (where you feel high or elated) caused by bipolar disorder. They can also be used to treat severe depression and severe anxiety. Antipsychotics are sometimes also called major tranquillisers.


There are two main types of antipsychotics:

  • Older (“typical”, or first-generation) antipsychotics.
  • Newer (“atypical”, or second-generation) antipsychotics.

Antipsychotics are available as tablets, capsules, liquids and depot injections (long-acting), and come under various brand names.

Older antipsychotics have been used since the 1950s and are still prescribed today. Newer antipsychotics were developed from the 1970s onwards. It was originally thought that these newer medicines would have fewer side-effects than the older type; however, we now know that they too can cause quite a few side-effects.

How do antipsychotics work?

Antipsychotics are thought to work by altering the effect of certain chemicals in the brain, called dopamine, serotonin, noradrenaline and acetylcholine. These chemicals have the effect of changing your behaviour, mood and emotions. Dopamine is the main chemical that these medicines have an effect on.

By altering the effects of these chemicals in the brain they can suppress or prevent you from experiencing:

  • Hallucinations (such as hearing voices).
  • Delusions (having ideas not based on reality).
  • Thought disorder.
  • Extreme mood swings that are associated with bipolar disorder.

When are antipsychotics usually prescribed?

As discussed above, antipsychotics are usually prescribed to help ease the symptoms of schizophrenia, mania (caused by bipolar disorder), severe depression or severe anxiety. Normally they are started by a specialist in psychiatry, or your GP will ask a specialist for advice on when to start them.

Also, for many years antipsychotics were used to calm elderly people who had dementia. However, this use is no longer recommended. This is because these medicines are thought to increase the risk of stroke and early death – by a small amount. Risperidone is the only antipsychotic recommended for use in these people. Even then, it should only be used for a short period of time (less than six weeks) and for severe symptoms.

Which antipsychotic is usually prescribed?



Antipsychotic Medicines

Authored by Dr Laurence Knott, Reviewed by Dr Hannah Gronow | Last edited 29 Jun 2018 | Meets Patient’s editorial guidelines 

In this series: Schizophrenia Psychosis

Antipsychotics are medicines that are mainly used to treat schizophrenia or mania caused by bipolar disorder. There are two main types of antipsychotics: atypical antipsychotics and older antipsychotics. Both types are thought to work as well as each other. Side-effects are common with antipsychotics. You will need regular tests to monitor for side-effects while you take these medicines.

What are antipsychotics?

Antipsychotics are a group of medicines that are mainly used to treat mental health illnesses such as schizophrenia, or mania (where you feel high or elated) caused by bipolar disorder. They can also be used to treat severe depression and severe anxiety. Antipsychotics are sometimes also called major tranquillisers.

There are two main types of antipsychotics:

  • Older (typical) antipsychotics.
  • Newer (atypical) antipsychotics.

Antipsychotics are available as tablets, capsules, liquids and long-acting (depot) injections. They come under various brand names.

Older antipsychotics have been used since the 1950s and are still prescribed today. Newer antipsychotics were developed in the 1970s onwards. It was originally thought that these medicines would have fewer side-effects than the older type of antipsychotics. However, we now know that they can also cause quite a few side-effects.

How do antipsychotics work?

Antipsychotics are thought to work by altering the effect of certain chemicals in the brain, called dopamine, serotonin, noradrenaline and acetylcholine. These chemicals have the effect of changing your behaviour, mood and emotions. Dopamine is the main chemical that these medicines have an effect on.

By altering the effects of these chemicals in the brain they can suppress or prevent you from experiencing:

  • Hallucinations (such as hearing voices).
  • Delusions (having ideas not based on reality).
  • Thought disorder.
  • Extreme mood swings that are associated with bipolar disorder.

When are antipsychotics usually prescribed?

As discussed above, antipsychotics are usually prescribed to help ease the symptoms of schizophrenia, mania (caused by bipolar disorder), severe depression or severe anxiety. Normally they are started by a specialist in psychiatry, or your GP will ask a specialist for advice on when to start them.

Also, for many years antipsychotics were used to calm elderly people who had dementia. However, this use is no longer recommended. This is because these medicines are thought to increase the risk of stroke and early death – by a small amount. Risperidone is the only antipsychotic recommended for use in these people. Even then, it should only be used for a short period of time (less than six weeks) and for severe symptoms.

Which antipsychotic is usually prescribed?

The choice of antipsychotic prescribed depends upon what is being treated, how severe your symptoms are and if you have any other health problems. There are a number of differences between the various antipsychotic medicines. For example, some are more sedating than others. Therefore, one may be better for one individual than for another. A specialist in psychiatry usually advises on which to use in each case. It is difficult to tell which antipsychotic will work well for you. If one does not work so well, a different one is often tried and may work well. Your doctor will advise.

It is thought that the older and newer types of antipsychotics work as well as each other. The exception to this is clozapine – it is the only antipsychotic that is thought to work better than the others. Unfortunately, clozapine has a number of possible serious side-effects, especially on your blood cells. This means that people who take clozapine have to have regular blood tests. See below.

In some cases, an injection of a long-acting antipsychotic medicine (depot injection) is used once symptoms have eased. The medicine from a depot injection is slowly released into the body and is given every 2-4 weeks. This aims to prevent recurrences of symptoms (relapses). The main advantage of depot injections is that you do not have to remember to take tablets every day.

How well do antipsychotics work?

It is thought that for every 10 people who take these medicines, 8 will experience an improvement in their symptoms. Unfortunately, antipsychotics do not always make the symptoms go away completely, or for ever. A lot of people need to take them in the long term even if they feel well. This is in order to stop their symptoms from coming back. Even if you take these medicines on a long-term basis and they are helping, sometimes your symptoms can come back.

Symptoms may take 2-4 weeks to ease after starting medication and it can take several weeks for full improvement. The dose of the medicine is usually built up gradually to help to prevent side-effects (including weight gain).

What is the usual length of treatment?

This depends on various things. Some people may only need to take them for a few weeks but others may need to take them long-term (for example, for schizophrenia). Even when symptoms ease, antipsychotic medication is normally continued long-term if you have schizophrenia. This aims to prevent relapses, or to limit the number and severity of relapses. However, if you only have one episode of symptoms of schizophrenia that clears completely with treatment, one option is to try coming off medication after 1-2 years. Your doctor will advise.

Stopping antipsychotics

If you want to stop taking an antipsychotic you should always talk to your doctor first. This is in order to help you decide if stopping is the best thing for you and how you should stop taking your medicine. These medicines are usually stopped slowly over a number of weeks. If you stop taking an antipsychotic medicine suddenly, you may become unwell quite quickly. Your doctor will usually advise you to reduce the dose slowly to see what effect the lower dose has on your symptoms.

What about side-effects from antipsychotics?

Side-effects can sometimes be troublesome. There is often a trade-off between easing symptoms and having to put up with some side-effects from treatment. The different antipsychotic medicines can have different types of side-effects. Also, sometimes one medicine causes side-effects in some people and not in others. Therefore, it is not unusual to try two or more different medicines before one is found that is best suited to an individual.

The following are the main side-effects that sometimes occur. However, you should read the information leaflet that comes in each medicine packet for a full list of possible side-effects.

Common side-effects include:

  • Dry mouth, blurred vision, flushing and constipation. These may ease off when you become used to the medicine.
  • Drowsiness (sedation), which is also common but may be an indication that the dose is too high. A reduced dose may be an option.
  • Weight gain which some people develop. Weight gain may increase the risk of developing diabetes and heart problems in the longer term. This appears to be a particular problem with the atypical antipsychotics – notably, clozapine and olanzapine.
  • Movement disorders which develop in some cases. These include:
    • Parkinsonism – this can cause symptoms similar to those that occur in people with Parkinson’s disease – for example, tremor and muscle stiffness.
  • Akathisia – a feeling of inner restlessness, often felt in the legs.
    • Dystonia – this means abnormal movements of the face and body.
    • Tardive dyskinesia (TD) – this is a movement disorder that can occur if you take antipsychotics for several years. It causes rhythmical, involuntary movements. These are usually lip-smacking and tongue-rotating movements, although it can affect the arms and legs too. About 1 in 5 people treated with typical antipsychotics eventually develop TD.

Atypical antipsychotic medicines are thought to be less likely to cause movement disorder side-effects than typical antipsychotic medicines. This reduced incidence of movement disorder is the main reason why an atypical antipsychotic is often used first-line. Atypicals do, however, have their own risks – in particular, the risk of weight gain. If movement disorder side-effects occur then other medicines may be used to try to counteract them.

Will I need any tests while taking an antipsychotic?

Your doctor will want to monitor you regularly for side-effects if you take an antipsychotic. The tests needed and how often you will need to have them depend on which antipsychotic you are taking.

In general, your doctor will take a sample of blood for certain tests before you start treatment. The tests look at:

  • How many blood cells you have.
  • How well your kidneys and liver are working.
  • How much lipid (fat) is in your blood.
  • Whether you have diabetes.

When you take clozapine, your white blood cell (leukocyte) and differential blood counts must be normal before treatment is started. After beginning treatment, a full blood count should be taken every week for 18 weeks, then at least every two weeks after that. If clozapine is continued and the blood count is stable after one year, monitoring should occur at least every four weeks, and for four weeks after you finish treatment.

The other blood tests listed above may be repeated in the first three or four months of treatment. After this they are normally done every year. However, your doctor may advise you to have these tests more often.
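The staged clozapine monitoring schedule described above can be sketched as a simple lookup. This is an illustrative sketch only — the function name and the stable-count flag are my own inventions, and real monitoring intervals are always decided by the prescriber, not by software:

```python
def fbc_interval_weeks(weeks_on_clozapine: int, count_stable_after_one_year: bool = True) -> int:
    """Illustrative sketch of the clozapine full-blood-count schedule
    described in the text. Not clinical guidance or clinical software.

    Returns the maximum interval (in weeks) between full blood counts.
    """
    if weeks_on_clozapine < 0:
        raise ValueError("weeks_on_clozapine must be non-negative")
    if weeks_on_clozapine < 18:
        return 1  # weekly for the first 18 weeks
    if weeks_on_clozapine < 52 or not count_stable_after_one_year:
        return 2  # at least every two weeks up to one year, or if counts are not stable
    return 4      # at least every four weeks thereafter
```

For example, a patient 10 weeks into treatment would be tested weekly, while a patient past one year with stable counts would be tested at least every four weeks.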

Your weight and blood pressure are usually measured before you start treatment and every few weeks after this for the first few months. After this they are normally measured every year.

The blood level of prolactin (a hormone) may also be measured before starting treatment and six months later. Usually it is then measured every year after this. The prolactin level is measured because sometimes antipsychotics can make you produce too much of this hormone. If you make too much prolactin it can lead to your breasts growing bigger and breast milk being produced.

Who cannot take antipsychotics?

Antipsychotics are usually not prescribed for people who are in a coma (comatose), have depression of their central nervous system, or who have a tumour on the adrenal gland (phaeochromocytoma).

Can I buy antipsychotics?

No – they are only available from your pharmacist, with a doctor’s prescription.

How to use the Yellow Card Scheme

If you think you have had a side-effect to one of your medicines you can report this on the Yellow Card Scheme. You can do this online at

The Yellow Card Scheme is used to make pharmacists, doctors and nurses aware of any new side-effects that medicines or any other healthcare products may have caused. If you wish to report a side-effect, you will need to provide basic information about:

  • The side-effect.
  • The name of the medicine which you think caused it.
  • The person who had the side-effect.
  • Your contact details as the reporter of the side-effect.

It is helpful if you have your medication – and/or the leaflet that came with it – with you while you fill out the report.


This is astonishing stuff. Following my very serious criminal allegations against two police forces' command and control, made over nearly 13 years, and their many efforts to have me tried and jailed, my doctor et al were contacted. They first tried this in 2013, when a senior forensic psychiatrist concluded that I was not suffering from any known mental illness. The police and my GP ignored this report. It was not in the disclosures, redacted and sent to me at my request after a year of arguing with them.

In March 2018, two psychiatrists and a mental health nurse turned up at my home, making three 40-minute calls in total, and concluded that I do not have normal psychology, that I have a paranoid personality disorder and that I should be subject to a multi-agency approach.

Their report said that I would be upset if I saw all of the combined police/NHS records on my case, and that I did not need hospital at the moment but should take antipsychotic drugs, as detailed, with their side-effects, above. Another report, based on my being persuaded to attend the Gender Identity Clinic, concluded that I have 'a secure female identity'.

The police still refuse to disclose records of their alleged investigations into my criminal behaviour and alleged domestic violence – the first of which I heard while in court being prosecuted for repeating criminal allegations against individuals including senior police officers.

I spent another 12 hours in police custody on August 24th, in a cold, dark, dirty cell, before nearly succeeding in strangling myself. I was then transferred to a secure mental health facility, kept for 12 hours, and went before a panel of two senior doctors and a senior mental health worker. I was judged sane and fit to leave.

Interestingly, the police, who have a very big axe to grind with me, informed my GP that I am a violent, mentally ill alcoholic, advising them to inform a consultant urologist dealing with other aspects of my medical care. This GP, Dr Roger Dickson, Principal of Norden House Surgery, Avenue Rd, acted accordingly.

The urologist informed me, and Dr Ramasamay of the same surgery put me in a position where I could copy, from his VDU, a letter from Reading Police Station informing Norden House that I am, in their opinion, mentally ill. It is a very serious matter when police can make these allegations as if they were fact. Meanwhile, Dickson seemed to have forgotten that he had regularly passed me fit to drive HGVs, which I had been doing for the 12 years leading up to lockdown. There is something very rotten about our public services, particularly the police.

This is life in a very dangerous police state today. I have worked in a wide range of occupations, including teaching, journalism, engineering, truck driving and construction, and have written many books. No one apart from the little police and NHS clique seems to have noticed how mad I am. Praise be to our wonderful police and NHS (sic). The situation is ongoing. R.J Cook

NHS incompetence killed both my parents, and the NHS is working with police in what they call 'a multi-agency approach' to have me take antipsychotics, and ultimately to section me.
R.J Cook
There will never be real reform of the British police because the British Elite's Dictatorship needs them; they are essential to oppression and elite privilege.