[…] Van Prooijen said the results suggest that “the relationship between education and belief in conspiracy theories cannot be reduced to a single psychological mechanism but is the product of the complex interplay of multiple psychological processes.”
The nature of his study means we can’t infer that education or the related factors he measured actually cause less belief in conspiracies. But it makes theoretical sense that they might be involved: for example, more education usually increases people’s sense of control over their lives (though there are exceptions, for instance among people from marginalized groups), while feelings of powerlessness are one of the things that often attract people to conspiracy theories.
Importantly, Van Prooijen said his findings help make sense of why education can contribute to “a less paranoid society” even when conspiracy theories are not explicitly challenged. “By teaching children analytic thinking skills along with the insight that societal problems often have no simple solutions, by stimulating a sense of control, and by promoting a sense that one is a valued member of society, education is likely to install the mental tools that are needed to approach far-fetched conspiracy theories with a healthy dose of skepticism.”
In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.
Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.
As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.
In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”
[…] She had always had a talent for remembering. She had also always dreaded change. Knowing that after they left New Jersey, nothing could ever be the same, Price tried to commit to memory the world she was being ripped away from. She made lists, took pictures, kept every artefact, every passed note and ticket stub. If this was a conscious effort to train her memory, it worked, perhaps better than she ever imagined.
Jill Price was the first person ever to be diagnosed with what is now known as highly superior autobiographical memory, or HSAM, a condition she shares with around 60 other known people. She can remember most of the days of her life as clearly as the rest of us remember the recent past, with a mixture of broad strokes and sharp detail. Now 51, Price remembers the day of the week for every date since 1980; she remembers what she was doing, who she was with, where she was on each of these days. She can actively recall a memory of 20 years ago as easily as a memory of two days ago, but her memories are also triggered involuntarily.
It is, she says, like living with a split screen: on the left side is the present, on the right is a constantly rolling reel of memories, each one sparked by the appearance of present-day stimuli. With so many memories always at the ready, Price says, it can be maddening: virtually anything she sees or hears can be a potential trigger.
Before Price, HSAM was a completely unknown condition. So how did it come to light? It started the day she sent an email to a Dr James McGaugh at the University of California, Irvine. That was 8 June 2000, a Thursday. Price was 34 years and five months old.
At least when I was in grade school, we learned the very basics of how the Third Reich came to power in the early 1930s. Paramilitary gangs terrorizing the opposition, the incompetence and opportunism of German conservatives, the Reichstag Fire. And we learned about the critical importance of propaganda, the deliberate misinforming of the public in order to sway opinions en masse and achieve popular support (or at least the appearance of it). While Minister of Propaganda Joseph Goebbels purged Jewish and leftist artists and writers, he built a massive media infrastructure that played, writes PBS, “probably the most important role in creating an atmosphere in Germany that made it possible for the Nazis to commit terrible atrocities against Jews, homosexuals, and other minorities.”
How did the minority party of Hitler and Goebbels take over and break the will of the German people so thoroughly that they would allow and participate in mass murder? Post-war scholars of totalitarianism like Theodor Adorno and Hannah Arendt asked that question over and over, for several decades afterward. Their earliest studies on the subject looked at two sides of the equation. Adorno contributed to a massive volume of social psychology called The Authoritarian Personality, which studied individuals predisposed to the appeals of totalitarianism. He invented what he called the F-Scale (“F” for “fascism”), one of several measures he used to theorize the Authoritarian Personality Type.
Arendt, on the other hand, looked closely at the regimes of Hitler and Stalin and their functionaries, at the ideology of scientific racism, and at the mechanism of propaganda in fostering “a curiously varying mixture of gullibility and cynicism with which each member… is expected to react to the changing lying statements of the leaders.” So she wrote in her 1951 Origins of Totalitarianism, going on to elaborate that this “mixture of gullibility and cynicism… is prevalent in all ranks of totalitarian movements.”
Hannah Arendt on Loneliness as the Common Ground for Terror and How Tyrannical Regimes Use Isolation as a Weapon of Oppression
“Loneliness is personal, and it is also political,” Olivia Laing wrote in The Lonely City, one of the finest books of the year. Half a century earlier, Hannah Arendt (October 14, 1906–December 4, 1975) examined those peculiar parallel dimensions of loneliness as a profoundly personal anguish and an indispensable currency of our political life in her intellectual debut, the incisive and astonishingly timely 1951 classic The Origins of Totalitarianism (public library).
Arendt paints loneliness as “the common ground for terror” and explores its function as both the chief weapon and the chief damage of oppressive political regimes. Exactly twenty years before her piercing treatise on lying in politics, she writes:
Just as terror, even in its pre-total, merely tyrannical form ruins all relationships between men, so the self-compulsion of ideological thinking ruins all relationships with reality. The preparation has succeeded when people have lost contact with their fellow men* as well as the reality around them; for together with these contacts, men lose the capacity of both experience and thought. The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist.
What Drives Trump Supporters? Sociologist Arlie Russell Hochschild on the Anger and Mourning of the Right
Amy Goodman and Juan Gonzalez speak to renowned sociologist Arlie Russell Hochschild, who has spent much of the past five years with some of Donald Trump’s biggest supporters, researching her new book, Strangers in Their Own Land: Anger and Mourning on the American Right. (Democracy Now!)
In a series of conversations leading up to the U.S. presidential election in November, Christopher Graves — a recent Rockefeller Foundation Bellagio Resident honoree for behavioral science, Global Chairman of Ogilvy Public Relations, and chair of the PR Council — and Steve Simpson — Chief Creative Officer of Ogilvy & Mather North America — will dissect and debate the candidates’ communications and marketing strategies and techniques.
Graves: Steve, let’s kick off with two television spots that have been running head to head from Clinton and Trump. Trump’s first authorized general election spot, called “Two Americas: Immigration” sets a vision of a dystopian America under Clinton against a safe America under Trump. The first half, under Clinton, is dark and foreboding. Then the tone shifts abruptly.
Imprisonment in America often means complete seclusion from nature. Take the case of the maximum security inmates at Snake River Correctional Institution in Oregon: they spend 23 hours a day locked in 7-by-12-foot concrete cells. The only windows face inside the unit. Four or five times a week they can spend an hour in an exercise yard that is about one-twentieth the size of a basketball court. From the yard, prisoners can glimpse the sky—the only “nature” they ever see. And this is typical.
Of course, prisons weren’t designed for comfort, and one could argue that access to nature is one of many pleasures that convicted criminals forfeit. But mental illness is a growing problem in prisons, the impacts of which society at large bears when inmates are released. Solitary confinement (a staple of maximum security units) has been shown to cause mental health issues, or when preexisting, exacerbate them. And it turns out that isolation from other human beings might not be the only factor. Researchers in the field of ecopsychology believe that nature deprivation can also damage mental well-being.
That concept resonated with the administrators at Snake River who were struggling to address violence and suicide among their most troubled inmates. The superintendent of the prison, Mark Nooth, encouraged his staff to explore novel solutions.
“In the state of nature, profit is the measure of right,” wrote Thomas Hobbes, a philosopher with a dim view of human nature. Hobbes’s fellow-thinkers have spent centuries pondering whether humans tend to be self-serving or are more inclined to straight-dealing. (Obviously, people exhibit both kinds of behaviour. The question is which comes most easily.)
So far, most experiments have tended to favour the first idea—that humans are dishonest by default when it serves their self-interest. In one study led by Shaul Shalvi at the University of Amsterdam, for example, participants were told to roll a die secretly three times and write down the results of the first roll. They would then receive 10 times that number in Israeli New Shekels. The researchers found that people who were asked to report their die roll within 20 seconds tended to report higher numbers than those who were given no time limit (though both groups reported higher numbers, on average, than would be expected if they were being truthful).
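The elegance of such die-roll paradigms is that no individual lie is detectable, yet the group average gives dishonesty away. The sketch below is a hypothetical illustration of that logic, not the researchers’ actual code; the “tempted” strategy (reporting the highest of the three rolls rather than the first) is just one assumed way participants might inflate their report.

```python
import random

def simulate(n_participants=100_000, seed=42):
    """Monte Carlo sketch of the die-under-cup paradigm (assumed setup:
    each participant rolls three times but is paid only for the first roll)."""
    rng = random.Random(seed)
    honest_total = 0   # truthfully report the first roll
    tempted_total = 0  # report the highest of the three rolls instead
    for _ in range(n_participants):
        rolls = [rng.randint(1, 6) for _ in range(3)]
        honest_total += rolls[0]
        tempted_total += max(rolls)
    return honest_total / n_participants, tempted_total / n_participants

honest_mean, tempted_mean = simulate()
# An honest sample should average about 3.5, while reporting the maximum
# of three rolls pushes the average toward roughly 4.96. A group mean
# well above 3.5 is therefore statistical evidence of over-reporting,
# even though no single participant can be caught lying.
```

This is why the researchers could compare reported averages across conditions without ever checking anyone’s actual roll.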
Now, though, the waters have been muddied by a new study published on arXiv, an online preprint site, by Valerio Capraro of the University of Middlesex. Dr Capraro argues that previous studies have allowed people to ponder in advance how they can best maximise their gains. That means such studies did not properly test how the participants respond under pressure. His study, which presents participants with details of their task just before they perform it, finds that people may be naturally truthful after all.
Steven Pinker’s excellent 2011 book, The Better Angels of Our Nature: Why violence has declined, upsets many people. They believe that the world is more dangerous now than ever before. And they believe it is dangerous to go around saying that the world isn’t more dangerous now. They are wrong, according to Pinker. And this is because of human psychology. There is, here, a fascinating story to tell about us and our fears.
Some readers might not want to tackle the hefty Better Angels from cover to cover; it is almost 800 pages long. Never fear: Pinker has in several places provided good synopses of his research and conclusions. One can google “Pinker, safer world” to find them.
[…] Our current world is the safest time in all of human history. In his Wall Street Journal article, Pinker sums up his conclusions: “Violence has been in decline for thousands of years, and today we may be living in the most peaceable era in the existence of our species.” Hallelujah!
Nevertheless, we Americans do fear (and not just Americans). Few will relax and rejoice upon learning of Professor Pinker’s data-driven conclusions about the decline of violence. Many will find his conclusions unbelievable (partly because we are a nation that doesn’t like or trust science). In fact, we know that there is an entire political campaign for the U.S. presidency based on fear of increasing violence: the campaign of Mr. Trump.
Despite Hillary Clinton’s recent claims that Donald Trump is temperamentally unsuited to be president, and her suggestion that he may be unhinged, clinicians have steered clear of volunteering their professional opinion in the public forum. There is a reason for this.
According to an American Psychiatric Association regulation known informally as The Goldwater Rule, psychiatrists are not allowed to volunteer their opinion on the mental health of a public figure without having had a private consultation with the individual, and, what’s more, received their authorization to make a public statement. There is provision, however, for psychiatrists to comment on more general issues of mental health (and this often happens). A very similar rule applies to psychologists.
The advantage of this is, according to Susan H. McDaniel, president of the American Psychological Association, that psychiatrists and psychologists don’t give the impression of having “a professional relationship … with people in the public eye”.
One disadvantage is that voters can only access amateur psycho-babble.
[…] Research going back as far as 1998 suggests that modern politicians are more narcissistic than people in other professions. But in fact, politicians—at least those in positions of high power—might also be more narcissistic than ever. In a 2013 study published in Psychological Science, we and several colleagues examined a trait called grandiose narcissism, which comprises immodesty, boastfulness and interpersonal dominance (a certain presidential candidate in a gold-plated tower in Manhattan comes to mind). For every president up to and including George W. Bush, we asked eminent biographers and experts to complete extensive personality ratings for the five years before each president took office. The ratings revealed an intriguing trend: Grandiose narcissism levels are higher in more recent presidents than in earlier ones. Despite some caveats to this result, we also found that levels of several other traits, such as those linked to interpersonal oddity, were not higher in more recent presidents. Ultimately, our findings raise the possibility that the mounting pressures on candidates to be telegenic and adept at self-presentation may be selecting for heightened self-centeredness.
Rising narcissism levels in our politicians might be cause for concern. But today’s grueling campaigns might also select for such adaptive traits as emotional resilience, stick-to-itiveness and impulse control. Of course, even if a presidential candidate is driven partly by ambition, this does not mean he or she is not also driven by love of country. Even egomaniacs can be animated by a higher calling.
[…] I do a lot of people watching and I have noticed that we are now spending more time with our faces staring at our phone than we spend with our faces looking around the world or looking directly at another person.
In a recent study, colleagues and I asked 216 undergraduate students to use an app called Instant Quantified Self that tallied the number of times the student unlocked his/her phone during the day and how many minutes it remained unlocked. Strikingly, the average student (and our students are typically older, averaging about 25 years old instead of the usual 20-year-old college student) unlocked his/her phone roughly 60 times a day for about 4 minutes each time. In all, the phone was in use for about 4 hours a day! And this does not count time spent on a laptop, tablet, or any other device.
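The headline figure follows directly from multiplying the two averages the app recorded. A quick back-of-envelope check, using the values as reported above:

```python
# Back-of-envelope check of the study's reported averages.
unlocks_per_day = 60       # average number of unlocks per day
minutes_per_unlock = 4     # average length of each session, in minutes

total_minutes = unlocks_per_day * minutes_per_unlock  # 240 minutes
total_hours = total_minutes / 60

print(total_hours)  # → 4.0, matching the "4 hours" figure in the text
```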
What are they doing on their phones? Mostly accessing social connections: text messaging, reading or posting on social media, dealing with email, or using any app that involves connecting with another human being.
Pavlov paired food with a bell; we seem to be pairing our human connection with our phone. We may not salivate but our brain is certainly responding to those internal and external alerts.
Since the World Economic Forum (WEF) was founded in 1971, its annual meeting in Davos has served as a useful indicator of the global economic zeitgeist. These conferences, which last a few days in late January, bring together corporate executives, senior politicians, representatives of NGOs and a sprinkling of concerned celebrities to address the main issues confronting the global economy and the decision-makers tasked with looking after it.
In the 1970s, when the WEF was still known as the ‘European Management Forum’, its main concern was slumping productivity growth in Europe. In the 1980s, it became preoccupied with market deregulation. In the 1990s, innovation and the internet came to the fore, and by the early 2000s, with the global economy humming, it began to admit a range of more ‘social’ concerns, alongside the obvious post-9/11 security anxiety. For the five years after the banking meltdown of 2008, Davos meetings were primarily concerned with how to get the old show back on the road.
At the 2014 meeting, rubbing shoulders with the billionaires, pop stars and presidents was a less likely attendee: a Buddhist monk. Every morning, before the conference proceedings began, delegates had the opportunity to meditate with the monk and learn relaxation techniques. ‘You are not the slave of your thoughts’, the man in red and yellow robes, clutching an iPad, informed his audience. ‘One way is to just gaze at them . . . like a shepherd sitting above a meadow watching the sheep’. A few hundred thoughts of stock portfolios and illicit gifts for secretaries back home most likely meandered their way across the mental pastures of his audience.
Proud and happy moments in our lives become cherished memories, kept in relatively crisp condition in our noggins for the occasional uplifting retrieval. But memories of not so pleasant events, such as a moment of weakness when we cheated on a math test or snuck a candy bar from a store, may get roughed up in our brains, perhaps to the point where we can’t clearly recall them anymore, according to a new study.
Collecting data from a series of nine experiments involving 2,109 participants, researchers suggest that our brains actively blur and junk memories of our own misdeeds to help avoid dissonance between our actions and moral values. This mental hazing, the researchers hypothesize in the Proceedings of the National Academy of Sciences, helps us maintain a positive moral self-image and sidestep distress.
“Because morality is such a fundamental part of human existence, people have a strong incentive to view themselves and be viewed by others as moral individuals,” the authors write. But with lying, cheating, and stealing being common occurrences, the use of unethical amnesia “can explain why ordinary, good people repeatedly engage in unethical behavior and also how they distance themselves from such behavior over time.”
Social media isn’t necessarily good for us. In fact, studies suggest that Facebook, Twitter and other social media platforms may have fueled a spike in suicide, addiction, and a host of other mental health problems. Now, a new study in Depression and Anxiety finds that social media use is strongly linked to depression among young adults living in the United States.
“Social media use was significantly associated with increased depression,” the authors write. “Given the proliferation of social media, identifying the mechanisms and direction of this association is critical.”
Perhaps the first case of social media-induced psychosis involved Jason Russell, the man behind the mega-viral Kony 2012 campaign that raised awareness about child soldiers in Uganda. Shortly after being catapulted into internet fame, Russell had an emotional meltdown on camera (which, incidentally, went almost as viral as Kony 2012). Doctors called it “reactive psychosis”, saying that his sudden fame had inspired a sort of temporary insanity.
But since then, even scientists have been slow to recognize that internet culture may have deleterious effects on our brains—at least in part because nobody wants to be “waving a cane at electric light or blaming the television for kids these days,” as one Newsweek article put it. Indeed, a peer reviewer once famously rejected an article on the psychiatric study of internet abuse in 2006, quipping, “What’s next? Microwave abuse and Chapstick addiction?”
Match.com and OKCupid are the Amazon.com of courtship.
Researchers and social scientists argue that dating and economics have evolved in tandem. “The story of dating began when women left their homes and the homes of others where they had toiled as slaves and maids to cities where they took jobs and let them mix with men,” writes Moira Weigel, author of the new book “Labor of Love: The Invention of Dating.” Even “picking up” someone made dating sound like some kind of consumer transaction, she adds, as do common dating terms like “on the market” and “off the market” (or meat market). “The way we think about online dating has completely permeated the concepts of economics,” Weigel says.
By that logic, lovelorn singletons should apply the same principles to their dating profiles as advertisers apply to a bottle of shampoo competing for attention on a supermarket shelf, according to a study published in 2015 by Sameer Chaudhry, assistant professor at the University of Texas Southwestern Medical Center, and his colleague Khalid Khan, professor of women’s health and clinical epidemiology at Queen Mary University of London. Chaudhry had good reason to choose this as a research topic. “I was having trouble Internet dating,” he says. By employing the study’s findings in his own search for a partner, Chaudhry says he finally found the right match.
Never in my wildest dreams did I imagine I’d earn the ire of a character named Turd Flinging Monkey, the nom de plume of a popular online men’s rights activist. A leader in the MGTOW (Men Going Their Own Way) movement, which encourages men to avoid romantic relationships with women, Monkey did not take kindly to my new Prager University video talking up the benefits of marriage for men. In the video, I noted, among other things, that married men work harder (about 400 more hours), smarter (they’re less likely to quit without having found another job), and more successfully (they make about $16,000 more per year) than their single peers. I described these as features, not bugs, of married life for men.
In response, in a video of his own, Monkey unloaded on marriage, arguing that the things I had described as features of marriage were in fact bugs.
- For men, marriage equals slavery: “Marriage, in essence, is a man choosing his slave master.”
- For men, marriage equals unrequited sacrifice: “So married men work 400 hours more per year than single men; that’s not a good thing. They’re not hanging out with their friends… They’re sacrificing their life for other people. Now, you may think that’s noble, but that’s not a benefit for the man.”
- For men, marriage equals emasculation: it means “giving a woman power over your life, power over your income.”
- And above all, for men, marriage equals a soul-destroying divorce: “talk to the men in MGTOW who have had their wallets ripped out their a** in family court. Go to the graves of men who killed themselves after they were unemployed and couldn’t afford child support and faced jail. Talk to those men about how wonderful marriage was… Ask them about the hundreds of hours they work extra each year to avoid going to prison because they owe so much child support or alimony that they gotta move in with their parents.”
This is Turd Flinging Monkey’s view of marriage. And judging by the thousands of internet comments and emails my video making the case for marriage to men has garnered, I’d say his perspective resonates with a substantial minority of men. There are lots of men out there who harbor a deeply misogynistic view of the opposite sex, an unremittingly negative view of love and commitment, and a complete lack of faith in marriage to deliver on their deepest dreams and desires.
Acclaimed journalist, author and political activist Barbara Ehrenreich explores the darker side of positive thinking. (RSA Animate)
Scientists now believe that language and music co-evolved to simulate the most abiding truths of nature. Indeed, for as long as we’ve been able to articulate the human experience, we’ve made music about the most inarticulable parts of it and then used language to extol music’s power — nowhere more beautifully than in Aldous Huxley’s 1931 meditation on how music stirs the soul, in which he asserted that music’s greatest potency lies in expressing the inexpressible.
This, perhaps, is why music is so sublime a solace when it comes to the most inexpressible of human emotions: grief.
Lesser, who doesn’t consider herself “a particularly musical person,” contemplates the way in which music bypasses the intellect and speaks straight to the unguarded heart.
- How Repetition Enchants the Brain and the Psychology of Why We Love It in Music
- Aldous Huxley on the Transcendent Power of Music and Why It Sings to Our Souls
- How Music and Language Mimicked Nature to Evolve Us
- 7 Essential Books on Music, Emotion, and the Brain
- How music helps children to deal with bereavement
- Great Writers on the Power of Music
- How Music Works
In 1979, a secret memo from the tobacco industry was revealed to the public. Called the Smoking and Health Proposal, and written a decade earlier by the Brown & Williamson tobacco company, it revealed many of the tactics employed by big tobacco to counter “anti-cigarette forces”.
One of the paper’s most revealing sections looks at how to market cigarettes to the mass public: “Doubt is our product since it is the best means of competing with the ‘body of fact’ that exists in the mind of the general public. It is also the means of establishing a controversy.”
This revelation piqued the interest of Robert Proctor, a science historian from Stanford University, who started delving into the practices of tobacco firms and how they had spread confusion about whether smoking caused cancer.
Proctor found that the cigarette industry did not want consumers to know the harms of its product, and that it spent billions obscuring the facts about the health effects of smoking. This search led him to create a word for the study of the deliberate propagation of ignorance: agnotology.
- Cultural production of ignorance provides rich field for study
- Meet the ‘Merchants of Doubt’ Who Sow Confusion about Tobacco Smoke and Climate Change
- ExxonMobil gave millions to climate-denying lawmakers despite pledge
- “Dark Money” Funds Climate Change Denial Effort
- Smoked out: tobacco giant’s war on science
- From tobacco to climate change, ‘merchants of doubt’ undermined the science
- The Denial Industry
- Agnotology: The Making and Unmaking of Ignorance (Book)
- Toxic Sludge is Good for You: Lies, Damn Lies, and the Public Relations Industry (Book)
- Trust Us, We’re Experts!: How Industry Manipulates Science and Gambles with Your Future (Book)
- Agnotology: The Study of Ignorance
- Merchants of Doubt
[…] It’s not that belief in conspiracy theories is becoming more widespread, says Viren Swami, professor of social psychology at Anglia Ruskin University: while the research hasn’t been done yet, he tells me, there’s lots of anecdotal evidence to suggest that belief in conspiracies has remained fairly stable for the last half-century or so. What has changed, however, is the speed with which new theories are formed. “It’s a symptom of a much more integrated world,” he says. The internet speeds everything up, allowing conspiracy-minded individuals to connect and formulate their ideas. In contrast, it took months for theories about Pearl Harbor to develop.
Karen Douglas, another social psychologist, echoes this point. “People’s communication patterns have changed quite a lot over the last few years. It’s just so much easier for people to get access to conspiracy information even if they have a little seed of doubt about an official story. It’s very easy to go online and find other people who feel the same way as you.”
Is everyone prone to this kind of thinking, or is it the preserve of an extreme fringe? Douglas reckons it’s more common than most of us realise. “Recent research has shown that about half of Americans believe at least one conspiracy theory,” she says. “You’re looking at average people; people you might come across on the street.”
That’s also the view of Rob Brotherton, whose new book, Suspicious Minds, explores the traits that predispose us to belief in conspiracies. He cautions against sitting in judgment, since all of us have suspicious minds – and for good reason. Identifying patterns and being sensitive to possible threats is what has helped us survive in a world where nature often is out to get you. “Conspiracy theory books tend to come at it from the point of view of debunking them. I wanted to take a different approach, to sidestep the whole issue of whether the theories are true or false and come at it from the perspective of psychology,” he says. “The intentionality bias, the proportionality bias, confirmation bias. We have these quirks built into our minds that can lead us to believe weird things without realising that’s why we believe them.”
- Podcast: Why are conspiracy theories so attractive?
- Paper: Conspiracy Theories and the Paranoid Style(s) of Mass Opinion
- National Security and Double Government (Book)
- The American Deep State (Book)
- Disinfo Wars: Alex Jones’ War on Your Mind
- Insights into the Personalities of Conspiracy Theorists
- CIA Popularized ‘Conspiracy Theory’ Term to Silence Dissent
- Lecture: A Social Psychological Perspective On Conspiracy Theories
- Social consequences of conspiracism: Exposure to conspiracy theories decreases the intention to engage
- If I’m Drunk, Then You Stepped On My Toes On Purpose
- The Rough Guide To Conspiracy Theories (Book)
- The Paranoid Style in American Politics: And Other Essays (Book)
- Conspiracy Theories: Secrecy and Power in American Culture (Book)
- Top 10 Conspiracy Theories That Turned Out To Be TRUE
- Them: Adventures with Extremists (Book)
- FAQs about State Crimes Against Democracy (SCADs)
- State Crimes Against Democracy – Wikipedia
- Deep state – Wikipedia
The psychology of mass government surveillance: How do the public respond and is it changing our behaviour?
‘Amnesty International has today reported the outcome of a YouGov survey of 15,000 people across 13 countries, studying for the first time international views of mass surveillance and whether the public believe it is changing their own behaviour.
[…] Just how accepting are people of surveillance in the first place? In short, not very. Across all 13 countries, there was no majority support for surveillance – only 26% of people, overall, agreed that the government should monitor the communications and Internet activity of its own citizens, while a similar number (29%) felt their government should monitor overseas citizens. Only 17% of respondents believed their government should monitor everybody – citizens, foreign nationals, and foreign countries – while twice as many (34%) believed their government should never monitor any of these groups.’
‘If Kremlinology made for a viable career track at the Pentagon during the cold war, Putinology is its pale 21st-century successor, complete with geopolitical guessing games, spycraft and the unknowable machinations of the man inside Red Square. The latest contribution to the field comes courtesy of a Pentagon thinktank: a suggestion that Vladimir Putin has an autistic disorder.
Studies from 2008 and 2011, commissioned by the Pentagon and revealed by USA Today through a freedom of information request, suggest Putin has “an autistic disorder which affects all of his decisions” and may be Asperger syndrome. But the studies, which focused on videos of the Russian president, do not claim to make a diagnosis and are primarily the brainchild of one person, Brenda Connors of the US Naval War College (USNWC) in Newport, Rhode Island.’
‘Geoff Heyes, campaigns and policy manager at the mental health charity Mind, talks to Going Underground host Afshin Rattansi about the crisis in mental health treatment on the NHS. He says that two thirds of people feel their health gets worse while waiting for support after visiting a GP or nurse, with 1 in 6 people attempting suicide in that period and 40% self-harming.’ (Going Underground)
‘[…] The Central Park Five had falsely confessed — because, they said, they’d been coerced by police.
Don’t think that it could happen to you? Sorry, but a first-of-its-kind study shows that it could — easily. With a little misinformation, encouragement and three hours, researchers were able to convince 70 percent of the study’s participants that they’d committed a crime.
And the college-aged students who participated in the study didn’t merely confess — they recalled full-blown, detailed experiences, says lead researcher Julia Shaw, a lecturer in forensic psychology from the University of Bedfordshire. The results were “definitely unexpected,” says Shaw, who predicted only a 30 percent rate.’
‘Abby Martin interviews Margaret Heffernan, author of ‘Willful Blindness’ and ‘A Bigger Prize’, about the destructive impact of competition and alternative models of incentivizing people to work together for the greater good.’ (Breaking the Set)
- Margaret Heffernan at TED: The dangers of “willful blindness”
- Margaret Heffernan at TED: Dare to disagree
- The Psychology of Our Willful Blindness and Why We Ignore the Obvious at Our Peril
- Wilful Blindness: Why We Ignore the Obvious (Book)
- A Bigger Prize: How We Can Do Better than the Competition (Book)
‘[…] We are now living in an age of abundance in the West. Before, material goods were expensive and scarce. Clothes were so hard to come by that they were handed down from generation to generation. A historian called Eve Fisher has calculated that before 1750 and the onset of the industrial revolution a shirt would have cost around £2,000 in today’s money. But now, things – shirts, shoes, toys and a million other consumer items – are cheap.
Once again, our inbuilt impulses have yet to catch up. As a result, many millions of us are filling our homes and lives, and suffocating under too much stuff.
This problem, which I call “stuffocation”, is the material version of the obesity epidemic. Since obesity is one of the most worrying problems we face, as individuals and as a society, saying that stuffocation is similar is quite a statement.’
- Stuffocation: Living More With Less (Book)
- Stuffocation: Living More With Less (Book Review)
- The Century of the Self (2002 Documentary)
- New Research Reveals Shift In Consumption Patterns
- How to edit your wardrobe and avoid ‘stuffocation’
- In every woman’s closet, 22 items she never wears
- Don’t buy stuff. Do stuff
‘In China, if you are a kid who spends a long time online, you had better watch out. Your parents may send you off for “treatment”.
At the Internet Addiction Treatment Centre in Beijing, children must take part in military-style activities, including exercise drills and the singing of patriotic songs. They are denied access to the internet. One of the first experiences internees undergo is brain monitoring through electroencephalography (EEG). The programme is run by psychologist Tao Ran, who claims the brains of internet and heroin addicts display similarities.’
- Internet addiction disorder
- Is Internet Addiction a Real Thing?
- Psychology Today on Web Addiction
- Internet Addiction: The Next New Fad Diagnosis
- Internet Addiction: What once was parody may soon be diagnosis
- Teenager’s death sheds light on brutal discipline at military-style camps for internet addicts
- Behind “Web Junkie,” a Documentary About China’s Internet-Addicted Teens
- China’s Web Junkies: Internet Addiction Documentary (NYT Op-Doc)
- Web Junkies: China’s Addicted Teens (2013 BBC Documentary)
- Web Surfers Hooked On Using Mobile Devices (2013)
- Young men are hooked on the web – even in bed (2010)
- “Matrix” has you: hooked on the web end up in clinic (2009)
- Hooked on the Web: Help Is on the Way (2005)
‘The new law on domestic violence would make it illegal for someone to exercise ‘coercive control’ over their partner.
The proposals, under which those found guilty could face a maximum of 14 years in prison, will be unveiled by the Government this week.
Campaigners have long called for a change in the law to put psychological exploitation on a par with physical violence, in the hope it will encourage more victims to come forward and report abuse in the home.’