The ascent up Maslow's hierarchy of needs might have a dark side; a US psychologist claims that the ideal of self-actualisation has created a world in which romantic relationships are more likely to fail. Eli Finkel of Northwestern University posits a “suffocation” model of marriage, asserting that, as what we need from a partner has shifted from shared survival in a hostile environment, through romantic love, to mutual self-discovery, and as the time couples spend with one another has decreased under external time constraints, it has become harder for any actual relationship with another human being (especially one who also wishes to discover themselves) to fit the bill:
"People used to marry for basic things like food and shelter. In the 1800s, you didn't have to have profound insight into your partner's core essence to tend to the chickens or build a sound physical structure against the snow," Finkel said. "Back then, the idea of marrying for love was ludicrous."
"In 2014, you are really hoping that your partner can help you on a voyage of discovery and personal growth, but your partner cannot do that unless he or she really knows who you are, and really understands your core essence. That requires much greater investment of time and psychological resources," he said.
Book of Lamentations, a review of the American Psychological Association's Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition as a work of dystopian literature:
If the novel has an overbearing literary influence, it’s undoubtedly Jorge Luis Borges. The American Psychiatric Association takes his technique of lifting quotes from or writing faux-serious reviews for entirely imagined books and pushes it to the limit: Here, we have an entire book, something that purports to be a kind of encyclopedia of madness, a Library of Babel for the mind, containing everything that can possibly be wrong with a human being. Perhaps as an attempt to ward off the uncommitted reader, the novel begins with a lengthy account of the system of classifications used – one with an obvious debt to the Borgesian Celestial Emporium of Benevolent Knowledge, in which animals are exhaustively classified according to such sets as “those belonging to the Emperor,” “those that, at a distance, resemble flies,” and “those that are included in this classification.”
As you read, you slowly grow aware that the book’s real object of fascination isn’t the various sicknesses described in its pages, but the sickness inherent in their arrangement... This mad project is clearly something that its authors are fixated on to a somewhat unreasonable extent. In a retrospectively predictable ironic twist, this precise tendency is outlined in the book itself. The entry for obsessive-compulsive disorder with poor insight describes this taxonomical obsession in deadpan tones: “repetitive behavior, the goal of which is […] to prevent some dreaded event or situation.” Our narrator seems to believe that by compiling an exhaustive list of everything that might go askew in the human mind, this wrong state might somehow be overcome or averted. References to compulsive behavior throughout the book repeatedly refer to the “fear of dirt in someone with an obsession about contamination.” The tragic clincher comes when we’re told, “the individual does not recognize that the obsessions or compulsions are excessive or unreasonable.” This mad project is so overwhelming that its originator can’t even tell that they’ve subsumed themselves within its matrix. We’re dealing with a truly unreliable narrator here, not one that misleads us about the course of events (the narrator is compulsive, they do have poor insight), but one whose entire conceptual framework is radically off-kilter. As such, the entire story is a portrait of the narrator’s own particular madness. With this realization, DSM-5 starts to enter the realm of the properly dystopian.
Builders of Star Trek-inspired rooms recently in the news: a convicted paedophile (or, to be precise, another one, though his Star Trek-inspired flat has been in the news previously), and the US National Security Agency.
A behavioural economist from Yale has posited the theory that how one's primary language handles the future tense influences how much one plans for the future, with one consequence being that English speakers save less for their old age than speakers of languages such as Mandarin and Yoruba, which lack a separate future tense and instead treat the future as part of the present. Professor Keith Chen's theory is that, by treating the future as part of the present, such languages encourage and entrench habits of thought more conducive to mindfulness of one's future than languages where the future is hived off into a separate grammatical tense:
Prof Chen divides the world's languages into two groups, depending on how they treat the concept of time. Strong future-time reference languages (strong FTR) require their speakers to use a different tense when speaking of the future. Weak future-time reference (weak FTR) languages do not.
"The act of savings is fundamentally about understanding that your future self - the person you're saving for - is in some sense equivalent to your present self," Prof Chen told the BBC's Business Daily. "If your language separates the future and the present in its grammar, that seems to lead you to slightly disassociate the future from the present every time you speak."

The effect is not limited to exotic non-European languages; similar differences are present within European languages to an extent (for example, one often uses the present tense in German to refer to events in the future, which is not the case in English, French or Italian; whether this has any causal relationship with the higher rate of personal saving in Germany remains to be determined).
Professor Chen's paper, The Effect of Language on Economic Behavior: Evidence from Savings Rates, Health Behaviors, and Retirement Assets, is (here), in PDF format.
If this effect holds true, all may not be lost; one could consciously intervene in English to an extent without breaking too much, by forcing oneself to say things like “I'm going to the seminar” rather than “I will go to the seminar”. Further flattenings-out of the future tense, however, get more awkward; saying, at age 29, “I'm retiring to the south of France” could raise a few eyebrows.
Researchers in the US have been investigating the question of what is “cool” from a psychological perspective, hitting upon the dichotomy between the two opposite poles the term can describe: on one hand, agreeability and popularity; on the other, a vaguely antisocial countercultural/oppositional stance reflected in the classic iconography of rebels and outlaws from the history of cool:
"I got my first sunglasses when I was about 13," said Dar-Nimrod. "There wasn't a cooler kid on the block for the next few days. I was looking cool because I was distant from people. My emotions were not something they could read. I put a filter between me and everyone else. That, in my mind, made me cool. Today, that doesn't seem to be supported. If anything, sociability is considered to be cool, being nice is considered to be cool. And in an oxymoron, being passionate is considered to be cool—at least, it is part of the dominant perception of what coolness is. How can you combine the idea of cool—emotionally controlled and distant—with passionate?"
"We have a kind of a schizophrenic coolness concept in our mind," Dar-Nimrod said. "Almost any one of us will be cool in some people's eyes, which suggests the idiosyncratic way coolness is evaluated. But some will be judged as cool in many people's eyes, which suggests there is a core valuation to coolness, and today that does not seem to be the historical nature of cool. We suggest there is some transition from the countercultural cool to a generic version of it's good and I like it. But this transition is by no way completed."

The researchers claim that the concept of “cool” is mutating away from the oppositional/rebellious sense and towards straight agreeability.
If this phenomenon does bear out, there may be a number of possible explanations. Perhaps, as the countercultural struggles against the repressive hegemony of the “squares” have receded into folk memory of The Fifties, and everyone wears jeans, listens to rock and has smoked a joint at least once in their lives, the idea of the rebel is left with even less of a cause than before. Perhaps the shift in the meaning of “cool” has something to do with the ongoing commodification of the counterculture, with the sneers and icy glares of vintage cool now being little more than a mask for agreeable dudes to put on when the occasion suits. Or perhaps, in the information age, being agreeable and well-connected confers a greater advantage than being tough and detached. One would imagine that this would be the case in most normal situations, in which case the old world of tough guys and strong, silent types would have been an anomalous case, a hostile environment which traumatised its inhabitants into growing expensive carapaces of character armour.
Another option would be that the meaning of “cool” is not, in fact, changing (this study doesn't seem to involve surveys done decades earlier to gauge what people thought at the time, and compares living attitudes with canned stereotypes), and that the word “cool” has several meanings; when it's used as a term of approval for a person, it has always indicated agreeability, whereas when talking about fictional characters, it suggested a certain type of antiheroic asshole.
A study of user questionnaires on the site yourmorals.org (whose stated goal is “to understand the way our "moral minds" work”) suggests a truth about the nature of the left-right political divide: that levels of empathy are positively correlated with political engagement among liberals, and negatively among conservatives. (This is a study in the US, hence the terminology.)
Cracked's David Wong has a list of five telltale indicators of a bullshit political story; in this case, a “bullshit political story” is one which ignores the actual issues and treats politics as a sporting event, appealing to the audience's identification with one team or other:
The answer is that many (if not most) people don't follow politics in order to find out who to vote for as part of their duty as citizens living in a democracy. They follow it purely as a form of entertainment. They're like sports fans, rooting for their "team" to win. And as you're going to find out, virtually all political news coverage is written to appeal to those people. They're the most rabid "consumers" of news, and their traffic is the most reliable, so the news is tailored to appeal to them. In the business, they derisively call it "horse race journalism," where the stories focus purely on the "sport" of politics rather than the consequences.

The telltale signs are stories with the word “gaffe” in the headline (generally some content-free event giving one half of the stadium cause to hoot and jeer at what dumbasses the other side are), anything about a politician “blasting” the other side (which appeals to the audience's inner wrestling fan), weasel-worded headlines asking a question (the answer to which is generally “probably not”), headlines attempting to escalate random low-ranking members of one political side, generally with non-mainstream opinions, to the status of “lawmakers” or “advisors” and demanding that the leadership take responsibility for them, and real-world political issues being framed as a “blow to” one political side or other:
That's where the gaffe stories come in. See, in this game, your "team" scores a point each time the other team says something stupid. It lets all of the supporters of your team mock and humiliate the supporters of the opposing team, on Internet message boards and around water coolers and in coffee shops nationwide. "Haha! The supposed 'genius' Obama thinks there are 57 states in the U.S.!" "Oh, yeah? Well, your last president said he was going to help terrorists plan their next attack!"
Hey, did you know that Barack Obama is an out-of-touch elitist because he puts fancy Dijon mustard on his hamburgers? Did you know that Mitt Romney is an insane sociopath because he once made his pet dog ride on top of his car 26 years ago? Did you know John Kerry can't relate to the average person because he puts Swiss cheese on his Philly cheese steaks? Did you know that George W. Bush hates foreigners so much that he wiped his hand after shaking hands with a Haitian? Did you know that all of this is petty schoolyard bullshit that wastes valuable time and energy that you'll never get back?
And, as smarter commentators have pointed out, there's an even bigger problem with this: It actually implies that the issue itself is completely unimportant. For instance, if the courts overturn some regulation about mercury in the water or Congress blocks car mileage standards, it always gets reported as "A Blow to Environmentalists." Oh, no, it's not a blow to the people who have to drink the water or breathe the air, or the taxpayers who have to fund the regulations, or the businesses that lose jobs over it. It's either a "blow to environmentalists" or it's not. They specifically make it sound like the effects extend purely to some fringe special interest group and absolutely no one else.
I'm telling you from experience, watching political races this way is addictive as shit. You have thousands of years of violent tribal instincts pumping through your veins, itching for a fight. That makes you an easy tool for manipulation, and every good politician and pundit knows how to push those buttons to make people march neatly in formation. Don't succumb. Or else you'll start supporting the most bullshit legislation just because your guy is for it. Or you'll start knee-jerk rejecting anything the other "team" proposes. Not because it's bad for the country, but because you want to deny them a "win."
When testing drugs for treating depression on lab mice, it is important to have ways of determining whether or not a mouse is depressed, or suffering from the mouse equivalent of depression. Not surprisingly, mice deemed to be depressed are the ones which give up and stop struggling when faced with difficulty, and which get little joy from life:
Forced swimming test. The rat or mouse is placed into a cylinder partially filled with water from which escape is difficult. The longer it swims, the more actively it is trying to escape; if it stops swimming, this cessation is interpreted as depressionlike behavior, a kind of animal fatalism.
Sugar water preference. The preference an animal shows for sugar water is taken as an indication of its ability to derive pleasure, a quality that is missing in depression. Most rodents, when given two identical-looking sources of water, will drink much more of the sweetened water than the plain water. Rodents exposed to chronic stress or whose brains have been manipulated show no such preference.

(Previously: Scientists create a mouse that's permanently happy.)
Apropos of the previous post, a few tidbits from research on what information one can determine from someone's Facebook profile, without even looking at their activity:
People who tested as "extroverts" on the personality test tended to have more friends, but their networks tended to be more sparse, meaning that they made friends with lots of different people who are less likely to know each other.
The researchers also found that people with long last names tended to be more neurotic, perhaps because "a lifetime of having one's long last name misspelled may lead to a person expressing more anxiety and quickness to anger," according to the study, which is being presented this week at the Computer Human Interaction conference in Vancouver.
Psychologist Bruce Levine makes the claim that, in the US, the psychological profession has a bias towards conformism and authoritarianism, and against anti-authoritarian tendencies. This bias apparently results from the institutional structure of the profession, which selects for and reinforces pro-conformist and pro-authoritarian tendencies, and manifests itself, among other things, in those who exhibit “anti-authoritarian tendencies” being caught, diagnosed with various mental illnesses and medicated into compliance before they can develop into actual troublemakers:
In my career as a psychologist, I have talked with hundreds of people previously diagnosed by other professionals with oppositional defiant disorder, attention deficit hyperactive disorder, anxiety disorder and other psychiatric illnesses, and I am struck by (1) how many of those diagnosed are essentially anti-authoritarians, and (2) how those professionals who have diagnosed them are not.
Anti-authoritarians question whether an authority is a legitimate one before taking that authority seriously. Evaluating the legitimacy of authorities includes assessing whether or not authorities actually know what they are talking about, are honest, and care about those people who are respecting their authority. And when anti-authoritarians assess an authority to be illegitimate, they challenge and resist that authority—sometimes aggressively and sometimes passive-aggressively, sometimes wisely and sometimes not.
Some activists lament how few anti-authoritarians there appear to be in the United States. One reason could be that many natural anti-authoritarians are now psychopathologized and medicated before they achieve political consciousness of society’s most oppressive authorities.

Showing hostility to or resentment of authority will get one diagnosed with various conditions, such as “oppositional defiant disorder (ODD)”, a condition which manifests itself as deficits in “rule-governed behaviour”, and for which, as for many parts of the human condition, there are many types of corrective medication these days. (Compare this to the condition of “sluggish schizophrenia”, which only existed in the Soviet Union and manifested itself as a rejection of the self-evident truth of Marxism-Leninism.)
While pretty much every hierarchical society has mechanisms for encouraging conformity to some degree, Dr. Levine's contention is that the increase in psychiatric medication in recent years may be leading to a more authoritarian and conformist society.
(via jwz)
Stage magician Teller explains some of the cognitive principles behind the magician's craft:
1. Exploit pattern recognition. I magically produce four silver dollars, one at a time, with the back of my hand toward you. Then I allow you to see the palm of my hand empty before a fifth coin appears. As Homo sapiens, you grasp the pattern, and take away the impression that I produced all five coins from a hand whose palm was empty.
2. Make the secret a lot more trouble than the trick seems worth. You will be fooled by a trick if it involves more time, money and practice than you (or any other sane onlooker) would be willing to invest. My partner, Penn, and I once produced 500 live cockroaches from a top hat on the desk of talk-show host David Letterman. To prepare this took weeks. We hired an entomologist who provided slow-moving, camera-friendly cockroaches (the kind from under your stove don’t hang around for close-ups) and taught us to pick the bugs up without screaming like preadolescent girls. Then we built a secret compartment out of foam-core (one of the few materials cockroaches can’t cling to) and worked out a devious routine for sneaking the compartment into the hat. More trouble than the trick was worth? To you, probably. But not to magicians.
3. It’s hard to think critically if you’re laughing. We often follow a secret move immediately with a joke. A viewer has only so much attention to give, and if he’s laughing, his mind is too busy with the joke to backtrack rationally.
The New York Times has a fascinating article about how retail companies use data mining and carefully targeted coupons to analyse and influence the spending habits of consumers, without the consumers getting wise to what's happening and resisting:
Almost every major retailer, from grocery chains to investment banks to the U.S. Postal Service, has a “predictive analytics” department devoted to understanding not just consumers’ shopping habits but also their personal habits, so as to more efficiently market to them. “But Target has always been one of the smartest at this,” says Eric Siegel, a consultant and the chairman of a conference called Predictive Analytics World. “We’re living through a golden age of behavioral research. It’s amazing how much we can figure out about how people think now.”

A specific case the article describes is that of identifying parents-to-be, who will soon have to form new spending habits, before the competition can identify them from public information; which is to say, inferring from statistical information whether a customer is likely to be pregnant, and at which stage, and then subtly manipulating her spending through the various milestones of her child's development:
The only problem is that identifying pregnant customers is harder than it sounds. Target has a baby-shower registry, and Pole started there, observing how shopping habits changed as a woman approached her due date, which women on the registry had willingly disclosed. He ran test after test, analyzing the data, and before long some useful patterns emerged. Lotions, for example. Lots of people buy lotion, but one of Pole’s colleagues noticed that women on the baby registry were buying larger quantities of unscented lotion around the beginning of their second trimester. Another analyst noted that sometime in the first 20 weeks, pregnant women loaded up on supplements like calcium, magnesium and zinc. Many shoppers purchase soap and cotton balls, but when someone suddenly starts buying lots of scent-free soap and extra-big bags of cotton balls, in addition to hand sanitizers and washcloths, it signals they could be getting close to their delivery date.
One Target employee I spoke to provided a hypothetical example. Take a fictional Target shopper named Jenny Ward, who is 23, lives in Atlanta and in March bought cocoa-butter lotion, a purse large enough to double as a diaper bag, zinc and magnesium supplements and a bright blue rug. There’s, say, an 87 percent chance that she’s pregnant and that her delivery date is sometime in late August. What’s more, because of the data attached to her Guest ID number, Target knows how to trigger Jenny’s habits. They know that if she receives a coupon via e-mail, it will most likely cue her to buy online. They know that if she receives an ad in the mail on Friday, she frequently uses it on a weekend trip to the store. And they know that if they reward her with a printed receipt that entitles her to a free cup of Starbucks coffee, she’ll use it when she comes back again.

The uncanny accuracy of the algorithm is demonstrated by an anecdote about an angry father storming into a Target store demanding to know why they had sent his teenaged daughter coupons for nappies and prams, and then, some time later, returning to apologise, having discovered that she had, in fact, become pregnant.
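The kind of purchase-based scoring the article gestures at can be sketched as a naive log-odds model. To be clear, this is an illustration only: the indicator products are taken from the quoted examples, but the weights, base rate and method are invented here, and Target's actual model is not public.

```python
import math

# Hypothetical weights: log-odds contributed by each indicator purchase,
# relative to the base rate of pregnancy among shoppers. All values invented.
INDICATOR_WEIGHTS = {
    "unscented lotion": 1.2,
    "cotton balls (large bag)": 0.9,
    "zinc supplement": 0.7,
    "magnesium supplement": 0.7,
    "scent-free soap": 0.8,
    "hand sanitizer": 0.4,
    "washcloths": 0.5,
}

def pregnancy_score(basket, base_rate=0.04):
    """Combine indicator purchases into a probability-like score.

    Start from the prior log-odds implied by the base rate, add the
    weight of each indicator product seen in the basket, then map the
    total back to a probability with the logistic function.
    """
    log_odds = math.log(base_rate / (1.0 - base_rate))
    for item in basket:
        log_odds += INDICATOR_WEIGHTS.get(item, 0.0)
    return 1.0 / (1.0 + math.exp(-log_odds))

basket = ["unscented lotion", "zinc supplement",
          "magnesium supplement", "cotton balls (large bag)"]
print(round(pregnancy_score(basket), 2))
```

An empty basket returns the base rate unchanged; each indicator purchase pushes the score up, which is why a combination of individually innocuous items (lotion, supplements, cotton balls) can add up to a confident prediction.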
Of course, there is the small issue of how to use this wealth of information in a plausibly deniable sense, without being obviously creepy. People, after all, tend to react badly to being surreptitiously watched and manipulated, especially so when deeply personal matters are involved:
“With the pregnancy products, though, we learned that some women react badly,” the executive said. “Then we started mixing in all these ads for things we knew pregnant women would never buy, so the baby ads looked random. We’d put an ad for a lawn mower next to diapers. We’d put a coupon for wineglasses next to infant clothes. That way, it looked like all the products were chosen by chance. And we found out that as long as a pregnant woman thinks she hasn’t been spied on, she’ll use the coupons. She just assumes that everyone else on her block got the same mailer for diapers and cribs. As long as we don’t spook her, it works.”
(via jwz)
In 1995, the New Mexico state senate passed an amendment which would have required psychologists and psychiatrists to be dressed as wizards when giving evidence in court:
When a psychologist or psychiatrist testifies during a defendant’s competency hearing, the psychologist or psychiatrist shall wear a cone-shaped hat that is not less than two feet tall. The surface of the hat shall be imprinted with stars and lightning bolts. Additionally, a psychologist or psychiatrist shall be required to don a white beard that is not less than 18 inches in length, and shall punctuate crucial elements of his testimony by stabbing the air with a wand. Whenever a psychologist or psychiatrist provides expert testimony regarding a defendant’s competency, the bailiff shall contemporaneously dim the courtroom lights and administer two strikes to a Chinese gong…

The amendment passed unanimously, but was removed from the final law, to the detriment of the theatrical beard and Chinese gong industries.
Idea of the day: the Happy Recession; the idea that the internet, through relentlessly driving costs down, will permanently deflate both prices and wages; the post-internet world, it seems, jams econo:
The most pernicious aspect of Internet entertainment is that it’s so easy to measure and so easy to mass-produce. So the moment something on the Internet gets fun enough to be competitive with the real-world analogue, it starts getting relentlessly improved until it’s vastly superior. World of Warcraft soaks up upwards of forty hours per week from serious fans, who pay about $15 per month for their subscriptions. Few other hobbies can consume so much time at such a low cost.
The web makes it easier to access non-traditional employees at much lower salaries. As we argued in our Demand Media analysis, the real story here is that a stay-at-home mom with a Masters in Journalism can write content that is good enough compared to a typical Madison Avenue copywriter, especially when the rate is $15 per article instead of six figures per year. This disaggregation of writing skill means that companies no longer have to hire good writers in order to write 5% good copy and 95% mediocre work; they can outsource the mediocre stuff and relegate the high-end work to a short-term freelancer.
The web offers cheap social status: In the long term, this may have a bigger effect than the web merely making digitizable products cheaper. Social status games drive a huge amount of economic activity: people strive to get into high-paying, high prestige career tracks, to win promotions and attendant raises, to live in the best neighborhoods and send their kids to the best schools. Few status games lack some kind of economic output—people who play sports well below the professional level still get some job opportunities out of it.

One could probably also add a geographical factor: in an age of cheap, ubiquitous connectivity, access to economic and cultural opportunities depends less on being located in a buzzing metropolis or creative-class hive. If a copywriter or app developer can work from anywhere, and if music and art scenes (or whatever replaces the post-punk rock'n'roll era construction of the "music scene" in the cultural ecosystem) are centred around blogs rather than physical venues, one no longer needs to move to a different place to find like-minded people; there would then be less competition for living in the most desirable areas, since the price of living elsewhere no longer includes disconnection from as many opportunities.
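The World of Warcraft figures quoted above make the "low cost" claim concrete; a quick back-of-the-envelope calculation (the subscription price and hours are from the quoted passage, the weeks-per-month factor is my own approximation):

```python
# Cost per hour of entertainment for a "serious fan" as described above.
subscription = 15.00          # dollars per month, from the quoted passage
hours_per_week = 40           # hours played, from the quoted passage
weeks_per_month = 52 / 12     # roughly 4.33

cost_per_hour = subscription / (hours_per_week * weeks_per_month)
print(f"${cost_per_hour:.3f} per hour")
```

At well under ten cents an hour, it is hard to name another hobby, or indeed any pre-internet entertainment product, that comes close.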
The street finds its own uses for things; in this case, the financial sector and psychopath profiling tests:
My companion, a senior UK investment banker, and I are discussing the most successful banking types we know and what makes them tick. I argue that they often conform to the characteristics displayed by social psychopaths. To my surprise, my friend agrees.
He then makes an astonishing confession: "At one major investment bank for which I worked, we used psychometric testing to recruit social psychopaths because their characteristics exactly suited them to senior corporate finance roles."
Here was one of the biggest investment banks in the world seeking psychopaths as recruits.
Writing in the Pinboard blog, Maciej Ceglowski tears apart the concept "social graph", saying that it is neither social nor a graph, but a sort of pseudoscience invented by socially-challenged geeks and now peddled by hucksters out to monetise you and your relationships:
Last week Forbes even went to the extent of calling the social graph an exploitable resource comparable to crude oil, with riches to those who figure out how to mine it and refine it. I think this is a fascinating metaphor. If the social graph is crude oil, doesn't that make our friends and colleagues the little animals that get crushed and buried underground?

The first part of his argument has to do with the inadequacy of the "social graph" model for representing all the nuances of human social relationships in the real world: the many gradations of friendship and acquaintance; the ways relationships change and evolve, making a mockery of nailed-down static representations; the way that describing a relationship can change it in some cases; and various issues of privacy and multi-faceted identity, things which exist trivially in the real world, even if they're in violation of the Zuckerberg Doctrine.
One big sticking point is privacy. Do I really want to find out that my pastor and I share the same dominatrix? If not, then who is going to be in charge of maintaining all the access control lists for every node and edge so that some information is not shared? You can either have a decentralized, communally owned social graph (like Fitzpatrick envisioned) or good privacy controls, but not the two together.
This obsession with modeling has led us into a social version of the Uncanny Valley, that weird phenomenon from computer graphics where the more faithfully you try to represent something human, the creepier it becomes. As the model becomes more expressive, we really start to notice the places where it fails.
You might almost think that the whole scheme had been cooked up by a bunch of hyperintelligent but hopelessly socially naive people, and you would not be wrong. Asking computer nerds to design social software is a little bit like hiring a Mormon bartender. Our industry abounds in people for whom social interaction has always been more of a puzzle to be reverse-engineered than a good time to be had, and the result is these vaguely Martian protocols.

Of course, whilst the idea of the social graph may not be good for modelling real-life social interactions with naturalistic fidelity, it has been a boon for targeting advertising; the illusion of social fulfilment is enough to keep people clicking and volunteering information about themselves. From the advertisers' point of view, the fish not only jump right into the boat, they fillet themselves in mid-air and bring their own wedges of lemon:
Imagine the U.S. Census as conducted by direct marketers - that's the social graph. Social networks exist to sell you crap. The icky feeling you get when your friend starts to talk to you about Amway, or when you spot someone passing out business cards at a birthday party, is the entire driving force behind a site like Facebook.

There is some good news, though: while general-purpose social web sites with the ambition of mediating (and monetising) the entirety of human social interaction may fail creepily as they approach their goal, special-purpose online communities can thrive in their niches:
The funny thing is, no one's really hiding the secret of how to make awesome online communities. Give people something cool to do and a way to talk to each other, moderate a little bit, and your job is done. Games like Eve Online or WoW have developed entire economies on top of what's basically a message board. MetaFilter, Reddit, LiveJournal and SA all started with a couple of buttons and a textfield and have produced some fascinating subcultures. And maybe the purest (!) example is 4chan, a Lord of the Flies community that invents all the stuff you end up sharing elsewhere: image macros, copypasta, rage comics, the lolrus. The data model for 4chan is three fields long - image, timestamp, text. Now tell me one bit of original culture that's ever come out of Facebook.

I wonder whether there is a dichotomy there between sites and networks; would a special-interest site that used, say, Facebook's social graph as a means of identifying users (rather than having its own system of accounts, usernames, profiles, and optionally friendship/trust edges) be infected by the Zuckerbergian malaise?
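The three-field data model mentioned in the quote above is small enough to write down literally; a minimal sketch (the field types are my guess, not 4chan's actual schema):

```python
from dataclasses import dataclass
import time

# The entire data model: image, timestamp, text.
@dataclass
class Post:
    image: bytes      # raw image data (or a reference to it)
    timestamp: float  # seconds since the epoch
    text: str         # the post body

p = Post(image=b"", timestamp=time.time(), text="lurk moar")
```

Everything else (threads, culture, the lolrus) emerges from people using those three fields.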
In the wake of riots across the UK, the BBC asks what turns ordinary people into looters:
Psychologists argue that a person loses their moral identity in a large group, and empathy and guilt - the qualities that stop us behaving like criminals - are corroded. "Morality is inversely proportional to the number of observers. When you have a large group that's relatively anonymous, you can essentially do anything you like," according to Dr James Thompson, honorary senior lecturer in psychology at University College London.
He rejects the notion that some of the looters are passively going with the flow once the violence has taken place, insisting there is always a choice to be made.
Workman argues that some of those taking part may adopt an ad hoc moral code in their minds - "these rich people have things I don't have so it's only right that I take it". But there's evidence to suggest that gang leaders tend to have psychopathic tendencies, he says.
[Criminologist Prof. John Pitts] says most of the rioters are from poor estates who have no "stake in conformity", who have nothing to lose. "They have no career to think about. They are not 'us'. They live out there on the margins, enraged, disappointed, capable of doing some awful things."
Today in weaponised sociolinguistics: the US intelligence research agency IARPA is running a programme to collect and catalogue metaphors used in different cultures, hopefully revealing how the Other thinks. This follows on from the work of cognitive linguist George Lakoff, who theorised that whoever controls the metaphors used in language can tilt the playing field extensively:
Conceptual metaphors have been big business over the last few years. During the last Bush administration, Lakoff – a Democrat – set up the Rockridge Institute, a foundation that sought to reclaim metaphor as a tool of political communication from the right. The Republicans, he argued, had successfully set the terms of the national conversation by the way they framed their metaphors, in talking about the danger of ‘surrendering’ to terrorism or to the ‘wave’ of ‘illegal immigrants’. Not every Democrat agreed with his diagnosis that the central problem with American politics was that it was governed by the frame of the family, that conservatives were proponents of ‘authoritarian strict-father families’ while progressives reflected a ‘nurturant parent model, which values freedom, opportunity and community building’ (‘psychobabble’ was one verdict, ‘hooey’ another).
But there’s precious little evidence that they tell you what people think. One Lakoff-inspired study that at first glance resembles the Metaphor Program was carried out in the mid-1990s by Richard D. Anderson, a political scientist and Sovietologist at UCLA, who compared Brezhnev-era speeches by Politburo members with ‘transitional’ speeches made in 1989 and with post-1991 texts by post-Soviet politicians. He found, conclusively, that in the three periods of his study the metaphors used had changed entirely: ‘metaphors of personal superiority’, ‘metaphors of distance’, ‘metaphors of subordination’ were out; ‘metaphors of equality’ and ‘metaphors of choice’ were in. There was a measurable change in the prevailing metaphors that reflected the changing political situation. He concluded that ‘the change in Russian political discourse has been such as to promote the emergence of democracy’, that – in essence – the metaphors both revealed and enabled a change in thinking. On the other hand, he could more sensibly have concluded that the political system had changed and therefore the metaphors had to change too, because if a politician isn’t aware of what metaphors he’s using who is?

The article is vague on the actual IARPA research programme, but reveals that it involves extracting metaphors from large bodies of texts in four languages (Farsi, Mexican Spanish, Russian and English) and classifying them according to emotional affect.
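The article doesn't describe IARPA's actual pipeline, but the general shape of corpus-scale metaphor cataloguing can be sketched: tag each sentence by the source domain of the metaphorical vocabulary it uses. The domain word lists below are invented for illustration; a real system would need far richer linguistic machinery to separate metaphorical from literal uses.

```python
import re
from collections import Counter

# Toy source-domain lexicons (invented for illustration).
DOMAINS = {
    "war":   {"surrender", "invasion", "battle", "defend"},
    "water": {"wave", "flood", "tide", "swamped"},
}

def tag_domains(sentence):
    words = set(re.findall(r"[a-z]+", sentence.lower()))
    # Count each domain at most once per sentence.
    return Counter(d for d, vocab in DOMAINS.items() if words & vocab)

corpus = ["We must not surrender to terrorism.",
          "A wave of illegal immigrants.",
          "The flood of cheap imports swamped the market."]
totals = sum((tag_domains(s) for s in corpus), Counter())
print(totals)  # Counter({'water': 2, 'war': 1})
```

Classifying the tagged metaphors by emotional affect, as the programme reportedly does, would be a further step on top of this.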
The IARPA metaphor programme follows an earlier proposal to weaponise irony:
If we don’t know how irony works and we don’t know how it is used by the enemy, we cannot identify it. As a result, we cannot take appropriate steps to neutralize ironizing threat postures. This fundamental problem is compounded by the enormous diversity of ironic modes in different world cultures and languages. Without the ability to detect and localize irony consistently, intelligence agents and agencies are likely to lose valuable time and resources pursuing chimerical leads and to overlook actionable instances of insolence. The first step toward addressing this situation is a multilingual, collaborative, and collative initiative that will generate an encyclopedic global inventory of ironic modalities and strategies. More than a handbook or field guide, the work product of this effort will take the shape of a vast, searchable, networked database of all known ironies. Making use of a sophisticated analytic markup language, this “Ironic Cloud” will be navigable by means of specific ironic tropes (e.g., litotes, hyperbole, innuendo, etc.), by geographical region or language field (e.g., Iran, North Korea, Mandarin Chinese, Davos, etc.), as well as by specific keywords (e.g., nose, jet ski, liberal arts, Hermès, night soil, etc.). By means of constantly reweighted nodal linkages, the Ironic Cloud will be to some extent self-organizing in real time and thus capable of signaling large-scale realignments in the “weather” of global irony as well as providing early warnings concerning the irruption of idiosyncratic ironic microclimates in particular locations—potential indications of geopolitical, economic, or cultural hot spots.

The proposal goes on to suggest possibilities of using irony as a weapon:
Superpower-level political entities (e.g., Roman Empire, George W. Bush, large corporations, etc.) have tended to look on irony as a “weapon of the weak” and thus adopted a primarily defensive posture in the face of ironic assault. But a historically sensitive consideration of major strategic realignments suggests that many critical inflection points in geopolitics (e.g., Second Punic War, American Revolution, etc.) have involved the tactical redeployment of “guerrilla” techniques and tools by regional hegemons. There is reason to think that irony, properly concentrated and effectively mobilized, might well become a very powerful armament on the “battlefield of the future,” serving as a nonlethal—or even lethal—sidearm in the hands of human fighters in an information-intensive projection of awesome force. Without further fundamental research into the neurological and psychological basis of irony, it is difficult to say for certain how such systems might work, but the general mechanism is clear enough: irony manifestly involves a sudden and profound “doubling” of the inner life of the human subject. The ironizer no longer maintains an integrated and holistic perspective on the topic at hand but rather experiences something like a small tear in the consciousness, whereby the overt and covert meanings of a given text or expression are sundered. We do not now know just how far this tear could be opened—and we do not understand what the possible vital consequences might be.
A new study has shown that violent video games decrease crime rates. While they do increase aggression in the players, this is outweighed by the incapacitation effect: players drawn into sitting in front of a computer or console for extended periods are unlikely to attack anything larger than a plate of nachos in reality.
Cognitive bias of the day: the name letter effect, which causes people to be subconsciously more favourably inclined to names and words that sound like their name:
The researchers then moved on to career choices. They combed the records of the American Dental Association and the American Bar Association looking for people named either Dennis, Denice, Dena, Denver, et cetera, or Lawrence, Larry, Laura, Lauren, et cetera. That is: were there more dentists named Dennis and lawyers named Lawrence than vice versa? Of the various statistical analyses they performed, most said yes, some at the p < .001 level. Other studies determined that there was a suspicious surplus of geologists named Geoffrey, and that hardware store owners were more likely to have names starting with 'H' compared to roofing store owners, who were more likely to have names starting with 'R'.
Some other miscellaneous findings: people are more likely to donate to Presidential candidates whose names begin with the same letter as their own; people are more likely to marry spouses whose names begin with the same letter as their own; women are more likely to show name preference effects than men (but why?); and batters with names beginning in 'K' are more likely than others to strike out (strikeouts being symbolized by a 'K' on the records).
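The Dennis-the-dentist comparison boils down to a 2x2 test of profession against first-initial group; a sketch with invented counts (not the study's data):

```python
# Pearson chi-square for an r x c contingency table, from scratch.
def chi_square(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    return sum((obs - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
               for i, r in enumerate(table)
               for j, obs in enumerate(r))

# Rows: dentists, lawyers; columns: Den*-names, Law*-names.
# These counts are invented for illustration.
table = [[482, 257],
         [271, 515]]
stat = chi_square(table)
print(stat > 3.84)  # True: exceeds the p < .05 critical value at 1 df
```

With counts this lopsided, the statistic lands far beyond the p < .001 threshold the studies report; the real analyses, of course, used the associations' actual membership rolls.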
(via David Gerard)
Jon Ronson looks at what makes psychopaths tick:
I met an American CEO, Al Dunlap, formerly of the Sunbeam Corporation, who redefined a great many of the psychopath traits to me as "business positives": Grandiose sense of self-worth? "You've got to believe in yourself." (As he told me this, he was standing underneath a giant oil painting of himself.) Cunning/manipulative? "That's leadership."
I wondered if sometimes the difference between a psychopath in Broadmoor and a psychopath on Wall Street was the luck of being born into a stable, rich family.

The article, which is excerpted from Ronson's new book, "The Psychopath Test", also follows the story of Tony, a youth who, when tried for a violent crime, feigned insanity in an attempt to avoid prison, and instead was diagnosed as a manipulative psychopath and committed to the notorious Broadmoor psychiatric hospital. After 12 years amongst killers and the criminally insane, he secured a hearing, which found him, whilst mildly psychopathic, fit to be released into society:
"The thing is, Jon," Tony said as I looked up from the papers, "what you've got to realise is, everyone is a bit psychopathic. You are. I am." He paused. "Well, obviously I am," he said.
"What will you do now?" I asked.
"Maybe move to Belgium," he said. "There's this woman I fancy. But she's married. I'll have to get her divorced."
Science blogger Ben Goldacre points us to an interesting psychology paper (unfortunately paywalled), analysing changes over the past few decades in the subject matter of popular song lyrics:
The current research fills this gap by testing the hypothesis that one cultural product—word use in popular song lyrics—changes over time in harmony with cultural changes in individualistic traits. Linguistic analyses of the most popular songs from 1980–2007 demonstrated changes in word use that mirror psychological change. Over time, use of words related to self-focus and antisocial behavior increased, whereas words related to other-focus, social interactions, and positive emotion decreased. These findings offer novel evidence regarding the need to investigate how changes in the tangible artifacts of the sociocultural environment can provide a window into understanding cultural changes in psychological processes.

Compare and contrast: Hypebot's analysis of 2010 commercial pop lyrics, coming up with an example of perfectly generic pop lyrics, circa 2010:
Oh baby, yeah, Imma rock your body hard—like damn
Chick I wanna know, cause I get around now—like bad
Love gonna stop, Imma rock your body hard—like damn
Had enough tonight, I wanna break the love—like bad

I wonder how much of this is actually emblematic of a deeper cultural shift towards short-term values: a world in which everything is a dynamic market of novelty and possibility, and "love" just means a temporary arrangement for mutually negotiated gratification.
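The paper's method, LIWC-style category word counting, can be sketched in a few lines; the word list and lyrics below are invented for illustration:

```python
import re

# A toy word-category lexicon in the LIWC style.
SELF_FOCUS = {"i", "me", "my", "mine", "imma"}

def category_rate(lyric, category):
    # Fraction of the lyric's words that fall into the category.
    words = re.findall(r"[a-z']+", lyric.lower())
    return sum(w in category for w in words) / len(words)

older = "we all want to change the world you and me together"
newer = "imma rock my body i get mine like damn"
print(category_rate(older, SELF_FOCUS) < category_rate(newer, SELF_FOCUS))  # True
```

Run over an actual charts corpus year by year, this kind of counting is what yields the reported upward trend in self-focus words.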
Scientists in China have found that mice bred to not be receptive to serotonin have no sexual preferences for either sex:
When presented with a choice of partners, they showed no overall preference for either males or females. When just a male was introduced into the cage, the modified males were far more likely to mount the male and emit a "mating call" normally given off when encountering females than unmodified males were.
However, a preference for females could be "restored" by injecting serotonin into the brain.

The researchers have cautioned against drawing conclusions about human sexuality from the result.
The lazy takeaway from this, as seen in news sites, is that serotonin affects sexual orientation, with the suggestion that low serotonin might be the secret to the inexplicable condition known as homosexuality. I'm wondering whether a more plausible conclusion is that, with sexual selection being about competition amongst fit individuals, a prerequisite for having an active sexual preference is passing an internal test of subjective fitness, i.e., being aware that one has sufficiently high status to be picky. In other words, mice without functioning serotonin receptors perceive themselves as losers who will take anything that's warm, and regard it as a win, it being more than they're entitled to.
The Hathaway Effect: an observation that when Hollywood celebrity Anne Hathaway makes headlines, stock in Berkshire Hathaway, Warren Buffett's company, rises, with no other apparent cause. Is it a demonstration of human irrationality (significant numbers of investors getting a good feeling about a stock by virtue of having its name in their mind for an unconnected reason), or, as the article suggests, automated trading systems picking up entertainment headlines and buying stock on the basis of them?
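The second explanation is easy to believe once you sketch how a naive headline-driven signal might work; the watchlist, sentiment list and headlines here are all invented:

```python
# A deliberately naive news-sentiment trader: match headline words against
# company-name keywords and a positive-word list.
POSITIVE = {"dazzles", "wins", "rises", "triumph"}
WATCHLIST = {"BRK.A": "hathaway"}  # ticker -> name keyword

def buy_signals(headline):
    words = set(headline.lower().split())
    return [ticker for ticker, name in WATCHLIST.items()
            if name in words and words & POSITIVE]

print(buy_signals("Anne Hathaway dazzles at the Oscars"))  # ['BRK.A']
print(buy_signals("Berkshire reports steady earnings"))    # []
```

A bag-of-words matcher has no way of knowing that this "hathaway" is the actress, which is exactly the confusion the Hathaway Effect posits.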
Positivity considered harmful (2): A new study suggests that social software such as Facebook may be making its users unhappy, by causing them to overestimate how contented their peers are with their lives (unlike themselves). The theory goes that, as these sites are self-curated experiences where users present generally positive images of themselves, other users don't get well-rounded views of how an online acquaintance's life is going, but have a cognitive bias to thinking that they do. Consequently, we overestimate our online acquaintances' life satisfaction, compare it to our own, and feel unhappy:
The human habit of overestimating other people's happiness is nothing new, of course. Jordan points to a quote by Montesquieu: "If we only wanted to be happy it would be easy; but we want to be happier than other people, which is almost always difficult, since we think them happier than they are." But social networking may be making this tendency worse. Jordan's research doesn't look at Facebook explicitly, but if his conclusions are correct, it follows that the site would have a special power to make us sadder and lonelier. By showcasing the most witty, joyful, bullet-pointed versions of people's lives, and inviting constant comparisons in which we tend to see ourselves as the losers, Facebook appears to exploit an Achilles' heel of human nature. And women—an especially unhappy bunch of late—may be especially vulnerable to keeping up with what they imagine is the happiness of the Joneses.

Which makes sense, assuming one accepts the premise that social software strongly discourages expressions of negativity or unhappiness. This is clearly not the case on all social sites; witness, for example, the (somewhat old) stereotype of the LiveJournal Angstpuppy, characterised by demonstrative levels of self-pity, often encoded into musical and/or sartorial preferences. Granted, that was in an earlier, weirder internet, and might get one unfriended or laughed at in today's more mainstream networks, though one does see a fair amount of kvetching on Facebook. Perhaps the best solution for the collective mental health is to encourage a culture of moderate self-pity and commiseration?
Mark Dery critically examines the relentlessly upbeat politics of enthusiasm in the age of the Tumblr blog and the Like button:
At its brainiest, this sensibility expresses itself in the group blog Boing Boing, a self-described “directory of wonderful things.” Tellingly, the trope “just look at this!,” a transport of rapture at the wonderfulness of whatever it is, has become a refrain on the site, as in: “Just look at this awesome underwear made from banana fibers. Just look at it.” Or: “Just look at this awesome steampunk bananagun. Just look at it.” Or: “Just look at this bad-ass volcano.” Or: “Just look at this illustration of an ancient carnivorous whale.” Because that’s what the curators of wunderkammern do—draw back the curtain, like Charles Willson Peale in “The Artist in His Museum,” exposing a world of “wonderful things,” natural (bad-ass volcanoes, carnivorous whales) and unnatural (steampunk bananaguns, banana-fiber underwear), calculated to make us marvel.

Of course, there is a downside to this relentless boosterism: the positive becomes the norm (how many things can you "favourite"?); meanwhile, critical thought becomes delegitimised. When everybody's building shrines to their likes, any expression of negativity is an attack on someone's personal taste, making one a "hater" (a term originally from hip-hop culture which, tellingly, gained mainstream currency in the past decade). From this relentlessly upbeat point of view, critics are no more legitimate than griefers, the players in multi-player games who destroy others' achievements motivated by sadism:
At their wound-licking, hater-hatin’ worst, the politics of enthusiasm bespeak the intellectual flaccidity of a victim culture that sees even reasoned critiques as a mean-spirited assault on the believer, rather than an intellectual challenge to his beliefs. Journal writer Christopher John Farley is worth quoting again: dodging the argument by smearing the critic, the term “hater” tars “all criticism—no matter the merits—as the product of hateful minds.” No matter the merits.

The culture of enthusiasm, and the culture of disenthusiasm (which Dery mentions), seem to be founded on the assumption that we are defined by the things we like and dislike. It's a form of commodity fetishism taken into the cultural sphere, one step removed from the accumulation of material goods, dealing instead in approval and disapproval. Not surprisingly, it's often associated with youth subcultures; take, for example, punks' leather jackets: the names which appear on the back, and those omitted for obviousness or inauthenticity, signal their wearers' authenticity and legitimacy in the culture. (Hipsters take it further, into the realm of irony, where one's status is measured by how close one can surf to the void of kitsch; being into, say, Hall & Oates or M.C. Hammer is worth more than safe choices like Joy Division and the Velvet Underground, which are so obvious a part of every civilised person's background that trumpeting one's enthusiasm for them is immediately suspect.)
However, likes and dislikes, when worn as badges of identity, can become mere totemism. Do you like, say, The Strokes or Barack Obama, because you find them interesting, or because you wish to be identified as the kind of person who does? Or, as A Softer World put it:
Cultural products (a term which encompasses everything from pop stars to public intellectuals, from comic books to politicians) can fulfil two functions: they can be valued for their content or function (does this band rock? Is this book interesting?), or for their role in establishing the consumer's identity. Much like vinyl record sleeves framed on trendy apartment walls by people who don't own turntables to project an aura of cool, favourite books or movies or bands or public figures can be trotted out to buttress one's public image, without ever being fully digested. (Witness, for example, the outspokenly religious American "Conservatives" who idolise Ayn Rand, a strident atheist who expressed a Nietzschean contempt for religion.) Likes and dislikes, in other words, are like flags, saluted or burned as much out of habit or social obligation as for any intrinsic value they may hold.
At the end, Dery points out that, far more interesting and telling than what we like or dislike are the things we both like and dislike, or else find fascinating; things which compel us with a mixture of fascination and repulsion, in whatever quantities, rather than neatly falling into one side or the other of the love/hate binary.
Freed from the confining binary of loving versus loathing, Facebook Like-ing versus hateration, we can imagine an index of obsessions, an inventory of intrigues that more accurately traces the chalk outline of who we truly are.
Imagine a more anarchic politics of enthusiasm, poetically embodied in a simulacrum of the self that preserves our repulsive attractions and attractive repulsions, reducing us not to our Favorites, nor even to our likes and dislikes, but to our obscure obsessions, our recurrent themes, the passing fixations that briefly grip us, then are gone—not our favorite things, but the things that Favorite us, whether we like it, or even know it, or not.
New research has shown that oxytocin, the neurochemical which promotes feelings of love and trust, also induces racism, or to be more precise, sharper discrimination against those ethnically or culturally different from oneself and one's group:
When asked to resolve a moral dilemma, such as choosing to save five lives from a runaway train by sacrificing one life, oxytocin-sniffing Dutch men more often saved fellow countrymen over Arabs and Germans than those who didn’t get a hormonal whiff.
“Earlier research of oxytocin paints a very rosy view of it. We thought it was odd a neurological system that survived evolution would make people indiscriminately loving toward others,” said social psychologist Carsten De Dreu of the University of Amsterdam, co-author of a Jan. 10 study in the Proceedings of the National Academy of Sciences. “Under oxytocin we saw an increase of in-group favoritism, which has the downside of discrimination against people who are not part of your group.”

The questions this raises are interesting. In modern Western society at least, the idea of love is almost a secular religion; it is seen as an unequivocally positive phenomenon, whose only fault is that it is, alas, not everywhere, not washing over everyone and making everything alright. Anyone who dissents from this opinion must be some kind of pitiably twisted curmudgeon; entire subgenres of Hollywood romantic comedies have been made about such sourpusses seeing the light and gaining a new faith in the redeeming power of love, replete with montage sequences. But if the biological conditions underlying the phenomenon of love also measurably amplify less positive tendencies, such as reducing empathy towards those outside one's in-group, could love follow religion in going from something seen as universally good to something subjected to more radical reassessment? Perhaps, in future, we'll see the same rational scepticism that has been applied to the virtue of religious faith applied to the universal beneficence of love.
(via BHA)
A study from University College London, involving brain scans correlated with political surveys, has found that self-proclaimed liberals and conservatives have differently shaped brains:
"The anterior cingulate is a part of the brain that is on the middle surface of the brain at the front and we found that the thickness of the grey matter, where the nerve cells or neurons are, was thicker the more people described themselves as liberal or left wing and thinner the more they described themselves as conservative or right wing," he told the programme.
"The amygdala is a part of the brain which is very old and very ancient and thought to be very primitive and to do with the detection of emotions. The right amygdala was larger in those people who described themselves as conservative.
Following in the footsteps of OKCupid's data-mining blog, some people at Facebook have recently analysed a sample of status updates by word category, extracting correlations between word categories (as well as overall subject matter and positivity/negativity), time of day, and the probability of updates being liked or commented on.
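The kind of correlation described can be sketched by comparing engagement on updates that do and don't use a word category; the category list, updates and comment counts below are invented sample data, not Facebook's:

```python
from statistics import mean

# A toy negative-emotion word category.
NEGATIVE_EMOTION = {"sad", "angry", "hate", "tired"}

def uses_category(text, category):
    return bool(set(text.lower().split()) & category)

# (status update, number of comments) - invented sample data
updates = [("so tired of everything", 4),
           ("great day at the beach", 1),
           ("i hate mondays", 3),
           ("new job announcement", 2)]

with_cat = [n for text, n in updates if uses_category(text, NEGATIVE_EMOTION)]
without = [n for text, n in updates if not uses_category(text, NEGATIVE_EMOTION)]
print(mean(with_cat) > mean(without))  # True, in this toy sample
```

The actual analysis presumably did something of this shape at vastly larger scale, across many word categories and with time of day as an extra dimension.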
Among the British Medical Journal's Christmas season of light-hearted articles this year: Mozart’s 140 causes of death and 27 mental disorders, an amusing piece about the tendency to posthumously diagnose illustrious historical figures:
Schoental, an expert in microfungi, thought that Mozart died from mycotoxin poisoning. Drake, a neurosurgeon, proposed a diagnosis of subdural haematoma after a skull fracture identified on a cranium that is not Mozart’s. Ehrlich, a rheumatologist, believed he died from Behçet’s syndrome. Langegger, a psychiatrist, contended that he died from a psychosomatic condition. Little, a transplant surgeon, thought he could have saved Mozart by a liver transplant. Brown, a cardiologist, claimed he succumbed to endocarditis. On the basis of a translation error of Jahn’s biography of Mozart, Rappoport, a pathologist, thought Mozart died of cerebral haemorrhage. Ludewig, a pharmacologist, suggested poisoning or self poisoning by drinking wine adulterated with lead compounds. For some, Mozart manifested cachexia or hyperthyroidism, but for others it was obesity or hypothyroidism. Ludendorff, a psychiatrist, and her apostles, claimed in 1936 that Mozart had been murdered by the Jews, the Freemasons, or the Jesuits, and assassination is not excluded by musicologists like Autexier, Carr, and Taboga.
What clearly emerges is that Mozart’s medical historiography is made out of various alternatives, with a general time trend as tenable diagnostic hypotheses are progressively exhausted: the more recent they are the less probable. The most likely diagnoses—such as influenza, typhoid fever, and typhus—were proposed first, and only rare and irrelevant conditions such as Goodpasture’s syndrome, Wegener’s granulomatosis, Still’s disease, or Henoch-Schönlein syndrome were left for those who came later.
Thus, highly selective readings of the sources, blatant misquotations, and perversions of the diagnostic criteria have led to shoddy medical interpretations. Mozart allegedly had thought disorder, delusions, musical dysfluency, and epileptic fits, plus he did not actually compose music but merely displayed musical hallucinations. He was a manic depressive, a pathological gambler, and had an array of psychiatric conditions such as Capgras’ syndrome, attention deficit/hyperactive disorder, paranoid disorder, obsessional disorder, dependent personality disorder, and passive-aggressive disorder. This has resulted in psychiatric narratives that blend an uninterrupted long tradition of defamation—the film Amadeus was one of the last public expressions of this tradition.
This phenomenon is Mozart’s medical nemesis. It covers the hidden intent to pull an exceptional creator down from his pedestal through some obscure need to cut great artists down to size. It is reminiscent of Rameau’s nephew in Diderot’s novel who says about people of exceptional creativity: “I never heard any single one of them praised without it making me secretly furious. I am full of envy. When I hear some degrading feature about their private life, I listen with pleasure. This brings me closer to them. It makes me bear my mediocrity more easily.”
A clinical psychologist at Liverpool University has put forward a proposal to classify happiness as a psychiatric disorder:
It is proposed that happiness be classified as a psychiatric disorder and be included in future editions of the major diagnostic manuals under the new name: major affective disorder, pleasant type. In a review of the relevant literature it is shown that happiness is statistically abnormal, consists of a discrete cluster of symptoms, is associated with a range of cognitive abnormalities, and probably reflects the abnormal functioning of the central nervous system. One possible objection to this proposal remains--that happiness is not negatively valued. However, this objection is dismissed as scientifically irrelevant.
(via David Gerard)
Satoshi Kanazawa, an evolutionary psychology researcher at the London School of Economics, has published a list of ten controversial assertions about human nature. They vary from well-trodden ones (men being naturally sexually promiscuous and drawn to younger partners; several points are drawn from the asymmetry of sexual selection) to more contentious ones. Kanazawa contends that most suicide bombers are Muslims because polygyny, and the sexual frustration of a society in which powerful men monopolise the pool of women, serves as a powerful motivator; this sounds a bit reductionistic, and would suggest that suicide bombers would predominantly be of low status or prospects, which has not been the case. Meanwhile, liberals are more intelligent than conservatives (as measured by IQ scores) because conservatism is a no-brainer:
"The ability to think and reason endowed our ancestors with advantages in solving evolutionarily novel problems for which they did not have innate solutions. As a result, more intelligent people are more likely to recognise and understand such novel entities and situations than less intelligent people, and some of these entities and situations are preferences, values, and lifestyles," Dr Kanazawa said.
Humans are evolutionarily designed to be conservative, caring mostly about their family and friends. Being liberal and caring about an indefinite number of genetically unrelated strangers is evolutionarily novel. So more intelligent children may be more likely to grow up to be liberals.

Also, both creativity and criminality have a common basis in costly peacock-tail behaviour:
The tendency to commit crimes peaks in adolescence and then rapidly declines. But this curve is not limited to crime – it is also evident in every quantifiable human behaviour that is seen by potential mates and costly (not affordable by all sexual competitors). In the competition for mates men may act violently or they may express their competitiveness through their creative activities.
Today's big question: does country music increase suicide rates? The authors of one paper think that it does, and that country music fans are at significantly higher risk of suicide than nonfans, for reasons involving gun ownership, marital discord and the inherent job and financial stresses affecting America's working poor (which are often referred to in country song lyrics). The authors of a second paper, however, dispute this, claiming methodological errors, and that there is no evidence of country music making people more likely to off themselves than any other genre. (Whether music in general, or music with lyrics more specifically, correlates to depression or suicide risk is, of course, another question.)
In his latest Independent column, the inimitable Rhodri Marsden writes about the psychologically brutalising arena of online dating:
Internet dating pivots around profiles: lists of attributes, paragraphs where you attempt to make yourself sound appealing, a handful of flattering photographs. But there's already a problem. Dozens of books and websites offer advice on how to write profiles; third-party services even charge 40 quid to save you the bother. As a result, the uniformity is hilarious. Everyone loves travelling, particularly to Machu Picchu – which, if the profiles are to be believed, is an Inca site swarming with thousands of backpacking singletons. Men are singularly obsessed with skiing. All of us love to curl up on the sofa with a bottle of wine and a DVD (or a VD, as one unfortunately misspelled profile said).
But we're forced to filter the mass of potential datees, and we do it savagely. We start to adopt a power-shopping mentality, disregarding people for arbitrary reasons; as my friend Sam put it, we cruise past people's pictures as if they're caravans in Daltons Weekly. "Yeah, no, no, yeah – ooh, yes! – no, no, ugh." It's a compelling, but ultimately exhausting, process that these services have adapted, refined and streamlined because it's a brilliant way for them to make money. While a service might lure you with a strapline saying "Meet sexy singles in your area", the truth is more like, "Reject perfectly decent singles in your area while waiting for the maddeningly elusive sexy ones." Everyone is trading off current opportunities against future possibilities. In a thoughtful moment, you might even realise there are people you've had relationships with in the past who, if they appeared as an online match, you might reject. And when you're the one being rejected, it can hurt.
Long-term internet dating participants know only too well, however, the cycle of knock-back followed by a speedy return to the site in search of someone else. You start seeing the same faces across multiple sites, and some people (especially men) will start to play the percentage game, firing off multiple cut-and-paste emails in the hope that someone will reply. One friend of mine was even sent a cheery message of introduction from a man who she had already had a disastrous date with via another dating website.
Douglas Coupland, who epitomised the late-80s/early-90s slacker zeitgeist in Generation X, offers a list of terms for aspects of the human condition circa the 2010s:
Blank-Collar Workers: Formerly middle-class workers who will never be middle-class again and who will never come to terms with that.
Grim Truth: You're smarter than TV. So what?
Instant Reincarnation: The fact that most adults, no matter how great their life is, wish for radical change in their life. The urge to reincarnate while still alive is near universal.
Intraffinital Melancholy vs Extraffinital Melancholy: Which is lonelier: To be single and lonely, or to be lonely within a dead relationship?
Zoosumnial Blurring: The notion that animals probably don't see much difference between dreaming and being awake.
A new study has shown that older people enjoy reading news stories that portray younger people negatively:
All the adults in the study were shown what they were led to believe was a test version of a new online news magazine. They were given a limited time to look over either a negative or a positive version of 10 pre-selected articles. Each story was paired with a photograph depicting someone of either the younger or the older age group. The researchers found that older people were more likely to choose to read negative articles about those younger than themselves. They also tended to show less interest in articles about older people, whether negative or positive.

The study concluded that this is a result of a youth-centric society, and that stories which take young people down a few notches serve to boost the self-esteem of older readers.
I wonder whether this factor, plus the aging of the baby boom cohort and the populist bent of the market-driven media, could be behind so many beat-ups in the news, from scare stories about killer hoodies to dire warnings about internet addiction, shrinking attention spans and the imminent collapse of civilisation as we know it. (And whether, historically, the same factor has played a part in fuelling moral panics about youth-oriented trends such as rock'n'roll music, comic books, swing dancing, and so on.)
Peter McGraw, a behavioural economist from Colorado, has a grand unified theory of humour: he calls it the Benign Violation Theory; the gist of it is that, for something to be amusing, it has to involve a violation of norms, albeit one in which nobody is actually harmed.
Every kind of humor McGraw and Warren could think of fit into the BVT. Slapstick worked: Falling down the stairs, a physical violation, is only funny if nobody's actually hurt. A dirty joke trades on moral or social violations, but it's only going to get a laugh if the person listening is liberated enough to consider risqué subjects such as sex benign. Puns can be seen as violations of linguistic norms, though only cerebral types and grammarians care enough about the violation to chuckle.
McGraw believes the BVT may even help explain why, biologically, humans evolved with the ability to laugh. It is clearly a beneficial trait to be able to correctly perceive when a violation is benign and communicate that to others via laughter, he points out. Early humans who were afraid of every apparent violation, real or not, weren't going to last long — nor were those who took one look at a woolly mammoth charging their way and did nothing but bust a gut.

Which more or less makes sense, though McGraw's attempt to use this theory to explain laughter as a reaction to being tickled seems to be grasping at straws. (I'd be more inclined to believe that the internal state arising from being tickled is quite different from that arising from perceiving a joke, even though they have the same external symptom.)
A theory of humour I once saw elsewhere suggested that laughter was a reflexive reaction to a frame of reference suddenly and abruptly being changed, and to being suddenly faced with the need to reevaluate an entire story, scene or proposition, especially if it has become more exciting or unusual in doing so. Of course, this is biased towards conceptual humour, such as a told joke in which a sudden wordplay causes the carefully constructed word-picture to come crashing down (take, for example: "When I die, I want to die peacefully in my sleep like my grandfather, not screaming like the passengers in his car"), or else stepping out of the frame and wantonly changing the (implied) terms of reference of the text of the first part of the joke ("What's orange and sounds like a parrot? A carrot"). This act of conceptual violence triggers a minor earthquake in the listener's mind, which manifests itself as laughter (or a groan of disapproval if they've heard the joke before). Slapstick (and the bodily-function gross-out gags on which current Hollywood comedies are founded) are basically this for people who'd rather not mess with ideas. But both seem to be encompassed by the benign-violation framework.
Of course, the benignness is a negotiable point. One can tell a joke in which people die horribly (or worse), if the people are clearly hypothetical, stuffed straw dummies whose only purpose is to be sacrificed in a joke. Among bigots, jokes at the expense of out-groups also work because, being dehumanised, the out-group don't count as actual people. (Popularly tolerated echoes of this are things like lawyer jokes, which work because nobody really believes in the possibility of exterminating all members of a profession.)
Huffington Post co-founder Jonah Peretti has posted a presentation, titled "Mormons, Mullets and Maniacs", on what makes online content "viral", i.e., likely to be passed along by bored people:
One key point: content that goes viral tends to appeal to people's personality disorders, or at least gives them an opportunity to score points, laugh at or put down those they disagree with, or express their obsessions, self-identification or narcissistic attention-seeking tendencies.
A new theory claims that human monogamy is a direct result of the development of beer; or more precisely, firstly, that monogamy was the result of social changes that arose from the shift from a nomadic to an agricultural (and thus hierarchical and patriarchal) lifestyle, and secondly, that the main impetus to move to agriculture wasn't so much a desire to build cities or empires but to brew beer.
The Rap Guide To Human Nature is a hip-hop album about evolutionary psychology by a Canadian "rap troubadour" named Baba Brinkman. It's not a joke: the beats are sharp, and Brinkman rhymes with the speed and dexterity of an accomplished rapper, deftly laying out the theories and controversies of evolutionary psychology, from kin selection to the biological roots of religious and political belief, from twin studies to alternative models of human nature, and, of course, covering areas such as sexual competition and social status where hip-hop culture and evolutionary psychology intersect. Note that, as one would expect from rap, the lyrics are probably not suitable for children.
Cow Clicker: a distillation of addictive, potentially expensive Facebook games to their purest essence:
You get a cow. You can click on it. In six hours, you can click it again. Clicking earns you clicks. You can buy custom "premium" cows through micropayments (the Cow Clicker currency is called "mooney"), and you can buy your way out of the time delay by spending it. You can publish feed stories about clicking your cow, and you can click friends' cow clicks in their feed stories.
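The loop described above is simple enough to sketch in a few lines. Here's a toy model of the mechanic as quoted (the class name, method names and the premium-skip cost are hypothetical, not from Cow Clicker itself):

```python
class CowClicker:
    """Toy model of the click-cooldown-currency loop described above."""

    COOLDOWN = 6 * 60 * 60  # six hours, in seconds

    def __init__(self):
        self.clicks = 0       # clicking earns you clicks
        self.mooney = 0       # the premium currency
        self.last_click = None

    def click(self, now):
        """Click the cow; allowed only once per six-hour cooldown."""
        if self.last_click is not None and now - self.last_click < self.COOLDOWN:
            return False  # still cooling down
        self.clicks += 1
        self.last_click = now
        return True

    def skip_cooldown(self, cost=10):
        """Buy your way out of the time delay by spending mooney."""
        if self.mooney >= cost:
            self.mooney -= cost
            self.last_click = None
            return True
        return False
```

The entire "game" is the cooldown timer plus the option to pay it away, which is precisely the point of the distillation.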
Until now, Google and social software haven't been ideas that went together naturally. The famously engineering-focussed company had experimented with social, though mostly in engineers' 20% time, and with mixed results. Orkut became spectacularly successful in Brazil, but largely bobbed along in the wake of Friendster elsewhere until the vastly technically inferior MySpace came along and seized the market, Google Friend Connect got its lunch eaten by Facebook Connect, and other forays into social made the mistake of being a bit too clever and automatically inferring the user's social graph from their online activity, crossing the line between nifty and disturbing.
Now, however, this is likely to change. There are rumours afoot that Google have made social software a strategic priority, establishing teams to work on the problem of social as part of their regular 80% job, and that a social platform, possibly named Google Me, is in the works. Of course, as far as social platforms go, Facebook have the area sewn up, with a pretty sophisticated API, leaving little space for newcomers (or even Google) to expand into, unless they find and solve problems in the way Facebook does it.
Which brings us to this slide presentation from Google user-experience researcher Paul Adams. The presentation rigorously examines the social uses of software, and the natures of social connections (Adams mentions strong ties and weak ties, and adds a third category, temporary ties, or pairs of people involved in once-off interactions; think someone you buy something from on eBay) and pinpoints possible shortcomings of simple models such as Facebook's (the fact that people have different social circles and needs to expose different facets of their identities to different circles, and that tools such as Facebook's privacy filters have a high overhead to use satisfactorily in this way), not to mention unresolved mismatches between the way human beings intuitively perceive social interaction working and the way it does in the age of social software (for example, we are not intuitively prepared for the idea of our conversations being recorded and made searchable). All in all, it looks like a pretty rigorous survey of social software, condensed down to 216 slides. (An expanded version may be the contents of a book, Social Circles, which comes out in August.)
If Google, who have not given much weight to social software in the past, are investing in this level of research into it, they may well have a Facebook-beating social platform in the works. Though (assuming that it exists, of course) only time will tell whether Google have finally grasped social enough to pull it off.
A team of evolutionary psychologists have revised Maslow's Hierarchy of Needs. The original hierarchy is a pyramid of needs, with basic ones (food, shelter and, because it was invented in the 1960s, sex) at the bottom, and subsequent layers adding more advanced ones, like love, esteem and, at the apex, self-actualisation. Douglas Kenrick's team, however, does away with all that fluffy human-potential thinking and replaces it with the brute certainties of evolutionary psychology: at the top is not self-actualisation but parenting; i.e., doing what your genes built you to do and passing them on. The levels below have to do with acquiring and retaining a genetically fit mate, and building up the necessary social status to compete for the prize.
I am generally a fan of evolutionary psychology as an explanatory tool, though this doesn't sit well with me; it strikes me as a bit too reductionistic, and a bit too basic a model. Is the ultimate goal really to breed? Can we say that someone who has settled down in anonymous suburbia with a stable if dull job and started pumping out the children is more fulfilled than one who has found self-actualisation (through social, creative or otherwise constructive pursuits) but is childless? Are those who choose the latter path deluding themselves? It seems to say so.
ABC Radio National's All In The Mind recently interviewed a US psychiatrist who claims that psychiatry was used as a weapon against the civil rights movement in the 1960s. According to Jonathan Metzl, author of The Protest Psychosis: How Schizophrenia Became A Black Disease, the definition of schizophrenia was tweaked to apply to a lot of discontented African-Americans, with many activists being institutionalised in mental hospitals. (Until then, schizophrenia had been seen as a passive, disengaged condition mostly affecting white women; as mental institutions were repurposed for containing the civil rights movement, many such patients were rediagnosed with depression and deinstitutionalised.)
All of a sudden in 1968 the second Diagnostic Manual comes out, the DSM 2, in the context of probably the most racially charged year in the history of the Civil Rights Movement—1968, where there are many riots, many protests. And also the DSM 2 importantly added language in the paranoid sub-type of schizophrenia, it added several important terms, it said the new criteria included aggression, hostility and projection. These hadn't been characteristics in DSM 1 and the manual explained, 'the patient manifests the characteristics of aggression and hostility and also attributes to others characteristics he cannot accept in himself.'

Even the advertisements at the time for sedative drugs used for treating patients echoed this racial paranoia:
I unearthed a series of advertisements for serious tranquilisers, Haldol, Stelazine, Thorazine, that either represented African iconography, so African tribal masks, and would use incredibly charged racial language—so it would say this is the tool of primitive psychiatry and they would show these African masks—or images that quite literally showed, shockingly enough, angry black men protesting in the streets. And there's one image I reproduce in the beginning of the book, it's a Haldol advertisement that shows an angry black man in a burning urban scene who's shaking his fist. And the important point for both of these is that the iconography from these images, literally appearing in the leading psychiatric journals, was taken directly from the themes of the Civil Rights movement. The kind of Return to Africa movement played out in these African scenes, and the idea of a clenched fist which was...

This wasn't the first example of psychiatry being used in the service of racism in the US; in the 1850s, a surgeon named Samuel Cartwright put forward the theory that escaped slaves were suffering from illnesses he called drapetomania and dysesthesia aethiopis, his argument being that, as Negroes were psychologically unfit to cope with the pressures of freedom, escaping from one's rightful master was a sign of mental illness. This idea was, of course, very useful to those with a stake in maintaining the status quo, and flourished for some time for that reason.
Anyway, the shifting of the meaning of schizophrenia during the civil rights era was subsequently remedied in part by a deliberate programme to harmonise diagnoses with those used in Europe, though one might argue that the likelihood of the mentally ill slipping through the cracks into the prison system is part of the legacy of this phenomenon (according to Metzl, those diagnosed with schizophrenia in the US today are far more likely to end up in prison than in hospital; given that in America's neo-Calvinist penology, prisons are emphatically places of punishment first and rehabilitation a distant second, this is particularly disturbing).
Meanwhile, back in Europe, a converse relationship between mental illness and radical politics was posited from the other side; West Germany's Sozialistisches Patientenkollektiv, a radical Marxist group comprised of mental patients and the odd psychiatrist, argued that mental illness was a cultural construct, a reaction to the iniquities of capitalism.
A few quick links to things recently seen:
A Spanish mathematician has created a mathematical model of how marriages and relationships break down. Termed "sentimental dynamics", José-Manuel Rey's theory is based on the second law of thermodynamics, and posits an optimum amount of "energy" which needs to be fed into a relationship to sustain it:
The results of the mathematical analysis showed that when both members of a union are similar emotionally they have an “optimal effort policy,” which results in a happy, long-lasting relationship. The policy can break down if there is a tendency to reduce the effort because maintaining it causes discomfort, or because a lower degree of effort results in instability. Paradoxically, according to the second-law model, a union everyone hopes will last forever is likely to break up, a feature Rey calls the “failure paradox”.

The paper may be found here. (Aside: note the use of the Unicode ♥ character in the equations; I wonder how common unusual Unicode symbols are in mathematical or scientific papers these days.)
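The "second law" analogy can be illustrated with a minimal toy model (my own sketch, not Rey's actual equations): the felt quality of a relationship decays on its own unless it is sustained by a steady input of effort, so the feeling settles at a level proportional to the effort invested.

```python
def simulate_relationship(effort, r=0.5, a=1.0, x0=1.0, dt=0.01, steps=2000):
    """Euler-integrate dx/dt = -r*x + a*effort.

    x is the felt quality of the relationship, r the natural decay
    rate (the 'entropy' term), and effort the sustaining input; the
    parameter values here are illustrative, not from Rey's paper.
    """
    x = x0
    for _ in range(steps):
        x += dt * (-r * x + a * effort)
    return x
```

With zero effort the feeling decays towards nothing; an effort of `r/a` times the desired level sustains it indefinitely, which is the sense in which the model demands a continuous energy input just to stand still.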
A Facebook intern and PhD student in human-computer interaction has used Facebook to measure the relationship between sharing and wellbeing. Moira Burke's findings, gained by measuring the interactions between Facebook users who filled in surveys, show, unsurprisingly, that active sharing (such as posting content and sending messages) correlates more strongly with wellbeing than passive consumption does.
In user interface design, sometimes worse is better, as in the case of the Bloomberg Terminal, a proprietary computer terminal used by financial traders. The Bloomberg Terminal's interface, which hasn't been updated for a decade or so, is generally seen as cluttered and ugly. Proposals for more elegant redesigns have been knocked back, because the existing users like the macho ugliness of the interface and the aura of hardcore expertise it bestows on them:
Simplifying the interface of the terminal would not be accepted by most users because, as ethnographic studies show, they take pride in manipulating Bloomberg's current "complex" interface. The pain inflicted by blatant UI flaws such as black background color and yellow and orange text is strangely transformed into the rewarding experience of feeling and looking like a hard-core professional.

In other words, the Bloomberg Terminal is one of a class of items whose bad design is a feature serving a higher-level social function; in this case, the function is that of being a badge of proficiency or status, and an artificial handicap to keep usurpers out. In this way, it functions somewhere between the tail of a peacock (which is expensive to grow and makes its bearer more visible to predators, but which, by that token, also acts as proof of fitness) and the regalia and rituals of Freemasonry back when it was a force to be reckoned with. Of course, secrets are inherently leaky and can hold power only for so long, so sooner or later, someone (possibly Apple or Google?) may come along with a more elegantly-designed system that demystifies what the terminal does and, in doing so, holes Bloomberg's boat below the waterline (unless they do so first).
(via Daring Fireball)
If you've ever found yourself compelled to keep playing a video game despite realising that you're not actually enjoying it, you may have been a victim of the Behaviourist conditioning techniques game designers use to get people hooked. Video game designers are applying Skinnerian techniques of behaviour reinforcement to compel players to keep playing, to get hooked early, and to invest more time (and often money) into levelling up. (And playing a game does not necessarily equal enjoying it; the stimulus of unpredictable rewards, and the fear of losing one's carefully built-up progress, are sufficient to compel a player who might otherwise have preferred to do something else.)
His theories are based around the work of BF Skinner, who discovered you could control behavior by training subjects with simple stimulus and reward. He invented the "Skinner Box," a cage containing a small animal that, for instance, presses a lever to get food pellets. Now, I'm not saying this guy at Microsoft sees gamers as a bunch of rats in a Skinner box. I'm just saying that he illustrates his theory of game design using pictures of rats in a Skinner box. This sort of thing caused games researcher Nick Yee to once call Everquest a "Virtual Skinner Box."
First, set up the "pellets" so that they come fast at first, and then slower and slower as time goes on. This is why they make it very easy to earn rewards (or level up) in the beginning of an MMO, but then the time and effort between levels increases exponentially. Once the gamer has experienced the rush of leveling up early, the delayed gratification actually increases the pleasure of the later levels. That video game behavior expert at Microsoft found that gamers play more and more frantically as they approach a new level.

Behaviourist game design techniques are becoming more prevalent in the age of online games, where the maker's revenue comes not from once-off purchases but from time (and money) spent in the course of playing the game; hence, game designers have to get their players hooked before the other guy comes along and milks them. And milking is perhaps an apt metaphor, given that one of the leading examples of this sort of game design is the Facebook game FarmVille, which, by all accounts, is more of a socially conditioned obligation than a ludic activity:
Farmville is not a good game. While Caillois tells us that games offer a break from responsibility and routine, Farmville is defined by responsibility and routine. Users advance through the game by harvesting crops at scheduled intervals; if you plant a field of pumpkins at noon, for example, you must return to harvest at eight o’clock that evening or risk losing the crop. Each pumpkin costs thirty coins and occupies one square of your farm, so if you own a fourteen by fourteen farm a field of pumpkins costs nearly six thousand coins to plant. Planting requires the user to click on each square three times: once to harvest the previous crop, once to re-plow the square of land, and once to plant the new seeds. This means that a fourteen by fourteen plot of land—which is relatively small for Farmville—takes almost six hundred mouse-clicks to farm, and obligates you to return in a few hours to do it again. This doesn’t sound like much fun, Mr. Caillois. Why would anyone do this?
The secret to Farmville’s popularity is neither gameplay nor aesthetics. Farmville is popular because it entangles users in a web of social obligations. When users log into Facebook, they are reminded that their neighbors have sent them gifts, posted bonuses on their walls, and helped with each other’s farms. In turn, they are obligated to return the courtesies. As the French sociologist Marcel Mauss tells us, gifts are never free: they bind the giver and receiver in a loop of reciprocity. It is rude to refuse a gift, and ruder still to not return the kindness. We play Farmville, then, because we are trying to be good to one another. We play Farmville because we are polite, cultivated people.

Here's more about FarmVille's use of the Cialdini reciprocity principle, as beloved of grifters. Meanwhile, other gaming companies are using other techniques to keep the marks coming back, like taking advantage of players' loss aversion ("your account is now flagged to have your characters below level 20 deleted as part of maintenance. Please re-activate your account now to ensure that your characters progress and names stay intact").
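Incidentally, the click-and-coin arithmetic in the quoted piece holds up:

```python
side = 14                # a fourteen-by-fourteen farm
squares = side * side    # 196 squares, one pumpkin each
clicks = squares * 3     # harvest, re-plow, re-plant each square
cost = squares * 30      # thirty coins per pumpkin

print(clicks)  # 588  -- "almost six hundred mouse-clicks"
print(cost)    # 5880 -- "nearly six thousand coins"
```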
On a tangent, there is a blog titled The Psychology of Games; some of its content has to do with psychological manipulation techniques to control and monetise gamers, though it also covers examples of game theory (in the Prisoner's Dilemma sense) in games, psychoeconomics, the enjoyment of gaming as an activity, and, indeed, a wealth of psychological phenomena as illustrated through video gaming.
Sleep Talkin' Man: a log of the bizarre, surreal and often obscenity-filled utterances of a man afflicted with the condition of sleep talking, as transcribed (and sometimes recorded and posted online) by his wife:
"Don't move a muscle. Bushbabies are everywhere... everywhere... Shoot the fucking big-eyed wanky shite fucks! Kick 'em. Stamp them. Poke 'em in their big eyes! Take that for scaring the crap out of me."
"My badger's gonna unleash hell on your ass. Badgertastic!"
"It's a good thing your breath smells of shit. It colors your words beautifully. Gives it an edge."
"Tea bags, see? Better be careful with the tea bags. They're delicate creatures. Handle them with care."
A new study from Bristol University has looked into the differences between cat owners and dog owners. As well as the usual stereotypes (cat owners are more likely to be women who live alone), they discovered that cat owners are more likely to have degrees than dog owners (47.2% of households with cats have one person with a degree, compared to 38.4% with a dog):
"Our best guess is that it's to do with working hours and perhaps commuting to work, meaning people have a less suitable lifestyle for a dog. It's really just a hunch though."

Or perhaps there are common psychological traits associated with a fondness for cats and a likelihood to apply oneself to study (or, indeed, a fondness for dogs and a likelihood to quit wastin' time and go out into the real world)?
Life imitates New Waver lyrics yet again: A psychological study at Leeds University has found a connection between depression and heavy internet use:
The authors found that a small number of users had developed a compulsive internet habit, replacing real life social interaction with online chat rooms and social networking sites.
They classed 18 respondents - 1.2% of the total - as "internet addicts". This group spent proportionately more time on sex, gambling and online community websites... The internet addicts were significantly more depressed than the non-addicted group, with a depression score five times higher.

Of course, the whole concept of "internet addiction" is a dubious one, and often tinged with tabloid-style moral panic, so there's a danger that the advocates of the "internet addiction" industry will wave this around as proof, ignoring the fact that the addictive behaviours there are more usefully described as gambling and/or pornography addiction.
The report does not put forward any causal links between heavy internet use and depression. Do specific patterns of internet use weaken social contacts, contributing to depression, or do depressed people use the internet to self-medicate?
Also, the inclusion of online community websites along with sex and gambling websites seems somewhat dubious; while the latter are masturbatory replacements for natural stimuli, especially those one leading an impoverished life may lack, can one really imply that social community sites substitute for and weaken social ties rather than facilitating them? I recall a study from a few years ago which showed that users of social web sites actually have stronger social connections, and improved wellbeing as a result of those. Though it is always possible that various characteristics of particular social websites (which may be influenced by their design and/or emergent from organic patterns of use) influence their ability to facilitate psychologically useful social ties.
The boffins at OKCupid have posted another statistical tour of the mysteries of human sexual attraction, this time looking at profile pictures, and what makes them work (or fail). Some of the findings: the "MySpace shot", cheesy as it may sound, does work for women (though only if they're looking for something other than interesting conversation), and if you're male, you're advised to get your shirt off:
Canadian psychology professor Bob Altemeyer has made available online the text of a book examining the psychology of authoritarianism. Altemeyer looks at what he calls Right-Wing Authoritarianism, a personality trait which manifests itself in a high degree of submission to the established authorities, high levels of aggression in the name of the authorities, and a high level of conventionalism, and correlates with the political right, at least in North America. (He also mentions left-wing authoritarianism—think dogmatic Maoism or similar—in passing, though dismisses it as having all but died out in North America, whereas right-wing authoritarianism is going from strength to strength.)
It’s about what happened to the American government after "conservatives" gained control of Congress in the 1990s and the White House in 2000. It’s about the disastrous decisions that government made, which have created the enormous problems we face now. It’s about the corruption that rotted the Congress. It’s about how traditional conservatism has nearly been destroyed by authoritarianism. It’s about how the “Religious Right” teamed up with amoral authoritarian leaders to push its un-democratic agenda onto the country.
For example, take the following statement: “Once our government leaders and the authorities condemn the dangerous elements in our society, it will be the duty of every patriotic citizen to help stomp out the rot that is poisoning our country from within.” Sounds like something Hitler would say, right? Want to guess how many politicians, how many lawmakers in the United States agreed with it? Want to guess what they had in common?

Altemeyer puts forward a Right-Wing Authoritarian personality scale, with higher scores correlating with the trait. High-RWA individuals have a "Daddy knows best" attitude to the authorities. They defer to their leaders, and even while they often believe that the law, however harsh, must be obeyed, they will exempt their leaders from this if the ends justify the means (such as approving of illegal activities against "radicals" or "enemies of society"). They view the world in terms of in-groups and out-groups, with little sympathy for the latter and an us-vs.-them outlook, exhibit aggression against those seen to be transgressing against the norms of society, and are quicker than average to join with others to take action against them. And, being highly conventional, they interpret a lot of things as existential threats to the established order. (Authoritarianism, in other words, seems to tie in with a survival-values worldview, driven by the perception of existential threats and the need to deal with them.) Being driven by faith in authority, high-RWAs are more capable than most of compartmentalising contradictory beliefs and resisting challenges to their beliefs posed by logic or evidence.
The Authoritarians looks at the RWA scale and other phenomena, such as religious fundamentalism, social-dominance orientation and real-world politics. Not surprisingly, there are correlations between right-wing authoritarianism and religious fundamentalism, and both are strong predictors of prejudice against out-groups. (Paradoxically, many high-RWA people exhibit both racial prejudice and hostility to overt racism, largely because they do not see themselves or their peers as racially prejudiced; this would be the dampening effect authoritarianism has on insight and analysis.) Meanwhile, there are both parallels and differences between right-wing authoritarian followers and people who score highly on the social-dominance scale; the former don't necessarily want personal power, whereas the latter are less likely to be religious or constrained by rules, though will often happily feign religiosity as a means to an end. Some individuals, of course, score highly on both scales. Because authoritarian followers are receptive to messages that feel right, and are suspicious of critical thought, right-wing authoritarian movements attract more than their share of power-hungry sociopaths happy to pound the right talking points in order to gain unquestioning followers.
The bad news is that the authoritarians have been ascendant over the past decade (in the US, Altemeyer says, they have largely seized the Republican Party). The good news is that right-wing authoritarianism, as a tendency, can be defeated. Studies have found that fear increases RWA scores, in effect making people shut up and follow the leader. (This was used to great effect by the Bush White House, for example, which instituted a prominent colour-coded terror threat level, seldom let it dip below "severe", and raised it inexplicably before elections.) Fearful societies are governed by authoritarian survival values, which have a harder time getting a grip in the absence of fear. Exposure to people unlike oneself and one's "in-group" also weakens authoritarian tendencies, as does a liberal education. A study cited by Altemeyer showed university students' RWA scores declining steadily over the course of their studies, and remaining low throughout their lives. (Parenting, meanwhile, causes one's RWA scores to increase slightly.)
There are other ideas Altemeyer's Right-Wing Authoritarianism scale ties into, such as Lakoff's strict-father/nurturing-parent family dichotomy (which Altemeyer examines, though finds only weakly connected), Milgram's obedience experiment, Zimbardo's Stanford Prison Experiment, and theories about the mass psychology of fascism. (Of these, Altemeyer's strikes me as one of the more useful; while it may be fun to posit connections between fascism and manned flight or the mass spectacle of rock'n'roll, those are probably less useful for actually understanding the threat of fascism as a mass movement.)
The Gervais Principle, or the social psychology of how organisations really function, as seen in the Office TV comedies:
Now, after four years, I’ve finally figured the show out. The Office is not a random series of cynical gags aimed at momentarily alleviating the existential despair of low-level grunts. It is a fully-realized theory of management that falsifies 83.8% of the business section of the bookstore. The theory begins with Hugh MacLeod’s well-known cartoon, Company Hierarchy ..., and its cornerstone is something I will call The Gervais Principle, which supersedes both the Peter Principle and its successor, The Dilbert Principle.

The MacLeod hierarchy, and the theory which is the cornerstone of Ricky Gervais' comedy (and its US remake), divides organisations into three psychological types, somewhat facetiously labelled Sociopaths (i.e., those driven by the desire to control and dominate, without whom no decisions would be made), Losers (i.e., those who have traded control of their destiny for security; these need not be losers in the colloquial sense) and the Clueless (who are in the middle of the hierarchy, but are one level below losers in self-awareness; whereas the loser typically puts in the minimum they can get away with, the clueless give their loyalty to the organisation out of a misplaced faith that it will be reciprocated). Initially, organisations start off with a few Sociopaths in the driving seat and a corps of Losers doing the gruntwork in exchange for a regular paycheque; as they get larger, a layer of Clueless is added, and expands. This layer may be imagined as a dense, inert substance which serves to keep the otherwise inherently unstable organisation from imploding.
A sociopath-entrepreneur with an idea recruits just enough losers to kick off the cycle. As it grows it requires a clueless layer to turn it into a controlled reaction rather than a runaway explosion. Eventually, as value hits diminishing returns, both the sociopaths and losers make their exits, and the clueless start to dominate. Finally, the hollow brittle shell collapses on itself and anything of value is recycled by the sociopaths according to meta-firm logic.

The Gervais Principle builds on this, and describes how Losers who put in more than is in their best interest get promoted to middle-management, not because of their talents, or because of their incompetence (as per the Peter Principle or Dilbert Principle), but because they are most useful as pebbles in the insulating layer of the Clueless.
Sociopaths, in their own best interests, knowingly promote over-performing losers into middle-management, groom under-performing losers into sociopaths, and leave the average bare-minimum-effort losers to fend for themselves.
A loser who can be suckered into bad bargains is set to become one of the clueless. That’s why they are promoted: they are worth even more as clueless pawns in the middle than as direct producers at the bottom, where the average, rationally-disengaged loser will do. At the bottom, the overperformers can merely add a predictable amount of value. In the middle they can be used by the sociopaths to escape the consequences of high-risk machinations like re-orgs.
Which brings us to the other major management book that is consistent with the Gervais Principle: Images of Organization, Gareth Morgan’s magisterial study of the metaphors through which we understand organizations. Of the eight systemic metaphors in the book, the one that is most relevant here is the metaphor of an organization as a psychic prison. The image is derived from Plato’s allegory of the cave, which I won’t get into here. Suffice it to say that it divides people into those who get how the world really works (the sociopaths and the self-aware slacker losers) and those who don’t (the over-performer losers and the clueless in the middle).

(Paging Greg Wadley...)
(via Lef)
The OKCupid people have been running a free online dating service, backed by psychological matching algorithms driven by user-written tests, for many years, and have built up a huge corpus of data about how people interact. Now they have started a blog, where they discuss the statistical findings that may be gathered from comparing people's profiles and message counts.
One blog post looks at how well different profile attributes predict whether two people will match. Not surprisingly, the zodiac signs of any two people have no effect on their actual personalities, and thus none on how well they would get along:
Race has a slightly greater influence (of a few percentage points either way), presumably because of uneven distribution of cultural backgrounds, but it is still fairly small. (Keep in mind that the match scores are computed from how users answer others' questions, and not from explicitly asking questions like "would you date a Virgo/Polynesian/Buddhist".) Religion, however, turns out to be a lot more telling:
According to this, atheists, agnostics, Jews and Buddhists seem to get along just swell (in fact, Buddhists appear to be slightly more compatible with the nonbelievers than with other Buddhists), whereas the Christians, Hindus and Muslims tend to be somewhat more contentious, getting along less well not only with other religions but also with each other. Additionally, the more seriously one takes religion, it seems, the less likely one is to get along with others.
Looking again at the issue of race, while race doesn't seem to affect actual compatibility scores, it does affect how likely people are to get responses:
Love may be blind, but it also seems that it, or at least attraction, is deeply racist.
On a lighter note, OKCupid have crunched the word frequencies of successful and unsuccessful opening messages and discovered what to write if you want a reply. Netspeak and "hip" misspellings ('u', 'luv', 'wat') and physical compliments are out, whereas mentions of specific interests are helpful. Unsurprisingly, mentioning religion is generally a bad idea as well.
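As an aside, OKCupid has publicly described the rough shape of its matching algorithm: for each question, a user gives their own answer, marks which answers they would accept from a partner, and rates the question's importance; the match percentage is the geometric mean of the two one-way satisfaction scores. The sketch below follows that published description, but the importance weights and example data are invented, and the real algorithm's margin-of-error correction (which penalises pairs who share only a few questions) is omitted:

```python
# Simplified sketch of an OKCupid-style match computation. Importance
# weights and the example profiles are illustrative, not OKCupid's
# actual values.

IMPORTANCE = {"irrelevant": 0, "a little": 1, "somewhat": 10,
              "very": 50, "mandatory": 250}

def satisfaction(asker, answerer, questions):
    """Fraction of asker's importance-weighted points that answerer earns."""
    earned = possible = 0
    for q in questions:
        weight = IMPORTANCE[asker[q]["importance"]]
        possible += weight
        if answerer[q]["answer"] in asker[q]["acceptable"]:
            earned += weight
    return earned / possible if possible else 0.0

def match_percentage(a, b):
    common = set(a) & set(b)  # only questions both users have answered
    # Geometric mean of the two one-way scores, so one very lopsided
    # pairing cannot produce a high match.
    return 100 * (satisfaction(a, b, common) * satisfaction(b, a, common)) ** 0.5

alice = {"deity": {"answer": "no", "acceptable": {"no"}, "importance": "very"},
         "horror": {"answer": "yes", "acceptable": {"yes", "no"}, "importance": "a little"}}
bob = {"deity": {"answer": "no", "acceptable": {"no", "yes"}, "importance": "mandatory"},
       "horror": {"answer": "no", "acceptable": {"no"}, "importance": "somewhat"}}
print(f"{match_percentage(alice, bob):.0f}%")  # prints "98%"
```

The geometric mean is the interesting design choice here: unlike an arithmetic mean, it drags the result towards the lower of the two satisfaction scores, so mutual acceptability matters more than one-sided enthusiasm.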
The Graun has turned what was meant to be a review of the latest Hollywood romantic comedy into a critique of this culture's emphasis on romantic love:
Romcoms don't merely provide an evening's harmless escapism. They help underpin one of the most potent doctrines of our culture: the sanctity of romantic love. It's a doctrine in which many find relief from the materialism, apathy and banality of a society no longer hallowed by religious transcendence. Yet it comes at a price.
The involuntary cognitive state that Jennifer Aniston finds herself depicting so frequently is real enough, but not particularly mystical. Brain scans show it to be generated by the frisky interaction of chemicals like norepinephrine and dopamine. If this hubbub's triggered by recognition of genetic quality, as now seems to be assumed, that would explain why Aniston and her ilk have to be so annoyingly good-looking.
What we call love induces some of the worst behaviour that we're likely to encounter. Yet when this occurs, it usually invites no censure, let alone punishment. Romantic love is a get-out-of-jail-free card that legitimises actions which would otherwise be thought contemptible. Home-wreckers steal something cherished far more deeply than money or possessions. Nonetheless, they go on to build their happiness on the misery of others without having to endure the slightest disapproval. After all, they had no choice but to do what they did: they were in love.
In other cultures, romantic love enjoys no comparable status. Our own ancestors might find our veneration of it as puzzling as we find their worship of pagan gods. In our otherwise disrespectful age, the persistence of its dominion is rather remarkable. Would it have proved so enduring without the big screen's relentless promotion of its supposedly limitless benefits?

I have been wondering whether the emphasis on romantic love in popular culture (a significant proportion of mainstream pop songs seem to be about the transitions into and out of the state of being in love, for example) and the sexualisation of the media are not two sides of the same coin, namely a focus on relations between people as a marketplace of potential partners, rather than less glamorous and less dynamic forms of relations. Could this be a sort of social Reaganism/Thatcherism, the ideological assertion that "everything is a market" translated into the realm of interpersonal relations?
Psychological experiments at the University of Amsterdam have found connections between romantic love and creative thinking and sex and analytical thinking:
The clever experiments demonstrated that love makes us think differently in that it triggers global processing, which in turn promotes creative thinking and interferes with analytic thinking. Thinking about sex, however, has the opposite effect: it triggers local processing, which in turn promotes analytic thinking and interferes with creativity.
Why does love make us think more globally? The researchers suggest that romantic love induces a long-term perspective, whereas sexual desire induces a short-term perspective. This is because love typically entails wishes and goals of prolonged attachment with a person, whereas sexual desire is typically focused on engaging in sexual activities in the "here and now". Consistent with this idea, when the researchers asked people to imagine a romantic date or a casual sex encounter, they found that those who imagined dates imagined them as occurring farther into the future than those who imagined casual sex.
A global processing style promotes creative thinking because it helps raise remote and uncommon associations. Consider, for example, the act of finding a gift for your partner. If we think about a gift while in a local mindset, then we’ll probably focus on more literal and concrete options, most of which involve a tangible object wrapped in colorful paper. We’ll probably consider the usual suspects, such as a watch, a book, or perfume. However, thinking about a gift more globally might inspire us to consider a gift as "anything that will make him/her happy". This may, in turn, bring to mind more diverse and original ideas, such as going on a joint vacation, writing a song, or cleaning and remodeling the house. Of course, this doesn’t mean we should always think globally. While local processing might interfere with creativity, it also promotes analytic thinking, which requires us to apply logical rules. For example, if you are looking for a piece of furniture in a big display according to a pre-defined list of criteria (e.g., size, color, price), a local mindset may help you find a match, by preventing you from being side-tracked by attractive but irrelevant options and by making you pay more attention to relevant details.

I wonder how this ties into other things, such as holism and reductionism, or whether there's a correlation between short-term thinking and a prevalence of sexualised imagery/metaphors.
Doing its part to cope with the global obesity epidemic, the subway system in São Paulo, Brazil, has introduced double-width, reinforced seats for fat people:
The seats are bright blue and have stickers above them marking them out as special seats for the use of obese persons. Strangely enough, they seem almost always empty; presumably, a lot of those who can fit into a regular seat or can bear standing are not keen to identify themselves as severely overweight.
One wonders how facility planners could cater for a widening population without coming up against stigma and denial. They could, of course, go back to flat benches without divisions, except that those allow homeless people to sleep on them, which is unacceptable for various reasons. (Not all people are sufficiently enlightened and compassionate to share their daily commute with the aromatically homeless, and if public transport facilities adopted a secondary role as a homeless shelter, this would drive out many of those who are sufficiently well-off to avoid public transport, putting more cars on the road and resulting in the money spent on running the actual trains being wasted; but I digress.) Possibly some sort of design with all seats being double-width, with a low-key or movable divider in the middle, would do the trick; though that could have the unintended consequence of encouraging amorous couples.
Wikipedia link of the day: Spite houses, or where malice and architecture intersect:
A spite house is a building (generally found in an urban environment) which was constructed or modified because the builder felt wronged by someone who did not want it there. Typically built to annoy someone, in most cases a neighbor, these buildings serve primarily as obstructions, blocking out light or access to neighboring buildings, or as flamboyant symbols of defiance. Because actually inhabiting such structures is usually a secondary goal at most, they often have strange and impractical layouts.
Researchers at the University of Sussex have discovered how domestic cats have developed a purr that psychologically manipulates humans, subtly triggering a sense of urgency without being as confronting as a meow. The "soliciting purr" contains an embedded high-frequency component similar to a cry:
Dr McComb and her team set up an experiment which tested human responses to the different purring types. She says: “When humans were played purrs recorded while cats were actively seeking food at equal volume to purrs recorded in non-solicitation contexts, even those with no experience of cats judged the ‘solicitation’ purrs to be more urgent and less pleasant.”
Not all cats, however, use this solicitation purring: “It seems to most often develop in cats that have a one-on-one relationship with their owners rather than in large households where there is a lot going on and such purring might get overlooked. Meowing seems to be more common in these situations.”

Cats tend to use the "soliciting purr" at times such as early in the morning, to elicit compliance from humans who may otherwise prefer to do something else, such as remaining asleep. It appears to be individually learned rather than an evolved instinct. There are more details, including embedded video, here.
The British Psychological Society's journal, The Psychologist, has a fascinating article about outbreaks of mass hysteria and "dancing plagues" in the Middle Ages:
The year was 1374. In dozens of medieval towns scattered along the valley of the River Rhine hundreds of people were seized by an agonising compulsion to dance. Scarcely pausing to rest or eat, they danced for hours or even days in succession. They were victims of one of the strangest afflictions in Western history. Within weeks the mania had engulfed large areas of north-eastern France and the Netherlands, and only after several months did the epidemic subside. In the following century there were only a few isolated outbreaks of compulsive dancing. Then it reappeared, explosively, in the city of Strasbourg in 1518. Chronicles indicate that it then consumed about 400 men, women and children, causing dozens of deaths (Waller, 2008).
Not long before the Strasbourg dancing epidemic, an equally strange compulsion had gripped a nunnery in the Spanish Netherlands. In 1491 several nuns were ‘possessed’ by devilish familiars which impelled them to race around like dogs, jump out of trees in imitation of birds or miaow and claw their way up tree trunks in the manner of cats. Such possession epidemics were by no means confined to nunneries, but nuns were disproportionately affected (Newman, 1998). Over the next 200 years, in nunneries everywhere from Rome to Paris, hundreds were plunged into states of frantic delirium during which they foamed, screamed and convulsed, sexually propositioned exorcists and priests, and confessed to having carnal relations with devils or Christ.

The article examines these phenomena, dismissing various theories (such as their being caused by ergotism, or the consumption of bread contaminated with hallucinogenic mould), and makes the case that they were culture-bound psychogenic illnesses, enabled by accepted beliefs about the supernatural and triggered by stress:
Similarly, it is only by taking cultural context seriously that we can explain the striking epidemiological facts that possession crises so often struck religious houses and that men were far less often the victims of mass diabolical possession. The daily lives of nuns were saturated in a mystical supernaturalism, their imaginations vivid with devils, demons, Satanic familiars and wrathful saints. They believed implicitly in the possibility of possession and so made themselves susceptible to it. Evangelical Mother Superiors often made them more vulnerable by encouraging trance and ecstasy; mind-altering forms of worship prepared them for later entering involuntary possession states. Moreover, early modern women were imbued with the idea that as the tainted heirs of Eve they were more liable to succumb to Satan, a misogynistic trope that often heightened their suggestibility.
Theological conventions also conditioned the behaviour of demoniac nuns. This is apparent from the fact that nearly all possession epidemics occurred within a single 300-year period, from around 1400 to the early 1700s. The reason is that only during this period did religious writers insist that such events were possible (Newman 1998). Theologians, inquisitors and exorcists established the rules of mass demonic possession to which dissociating nuns then unconsciously conformed: writhing, foaming, convulsing, dancing, laughing, speaking in tongues and making obscene gestures and propositions. These were shocking but entirely stereotypical performances based on deep-seated beliefs about Satan’s depravity drawn from religious writings and from accounts of previous possessions. For centuries, then, distress and pious fear worked in concert to produce epidemics of dancing and possession.

The article concludes with examples of modern occurrences of such phenomena, from the rather feeble examples (such as epidemics of fainting) one could find in a materialistic post-Enlightenment society, to "spirit possession" among factory workers drawn from rural communities in Malaysia and Singapore, to delusions of penis-stealing witchcraft in western Africa.
A few interesting posts from LiveJournal: Momus writes about the nihilism of heat:
In hot places, human life seems to be worth less. Iraq is a place where an army of Mersaults in the form of soldiers and "contractors" are constantly on the brink of breaching their own culture's taboos, driven to acts of violence by the murderous heat. Under the sun, nothing matters any more.
Some say that cultural psychology changes as you move closer to the equator. In Discovering Psychology with Philip Zimbardo, a precis of a film about cultural psychology made for Stanford University, James Jones of the University of Delaware argues that a way of being has evolved near the equator featuring particular uses of time, rhythm, improvisation, orality, and spirituality. What he describes is a failure to defer gratification.

And Lord Whimsy (author of The Affected Provincial's Companion) examines cod-Victorianism (think steampunk, goth and McSweeney's-esque retro typography) and cod-Modernism (think Shibuya-kei, Mod revivalism and, for want of a better word, Helvetipunk):
Cod-Vic usually has an element of knowing perversity to it, relishing its bad taste rather than recoiling from it. Cod-Vic has also opened the floodgates to a host of one-liners, cheap novelties, and dead ends--some delightful, others less so. The most successful Cod-Vic offerings seem to strike a deft balance, borrowing from overwrought Victorian forms while maintaining a modern crispness and rigor.
Cod-Mod is another form of retro, but an altogether less problematic one than Cod-Vic since its clean lines, unadorned materials and simple color schemes lend a laid-back, contemporary air of unaffected sophistication, making it much easier to defend as an aesthetic. Cod-mod has been mined heavily by indie music (Camera Obscura and Belle & Sebastian, anyone?), as well as interior design types like Jonathan Adler and film directors like Wes Anderson. Oh yeah, and Ikea.
To figure out the current state and direction of the global economy, economists are turning to somewhat unusual indicators, such as the membership of extramarital infidelity websites and the price of prostitution in Latvia:
The Web site crunched its traffic and membership numbers and found that there was a big increase in both when there was a turning point in the FTSE-100 index, which measures the leading companies listed in London. When the market collapses, people plot affairs. And when the bulls rage, the same thing happens. When it is trading sideways, they stick with their partners.
“It has to do with people’s confidence levels,” says Rosie Freeman-Jones, a spokeswoman for the site. “When the markets are up, they think they can have an affair because they feel they can get away with anything. When the market hits the bottom, they are looking for a way to relieve the pressure.”

And here is more information on the prostitution index, and why prostitution prices make a good economic indicator.
Anyway the problem is that most industries have contractual arrangements which fix prices. Wages are very hard to flex downwards. Rents are fixed over sustained periods and the like. All of this means that people go bust rather than reduce prices – simply because prices are sticky.
Well – most prices. The contractual terms of prostitution are short (an hour, a night) and entry to the industry is unconstrained. That means that the prices are very flexible. Extraordinarily flexible.
As the economic crisis bites, credit card companies are turning to advanced psychological techniques to manage their customers, using their purchasing records to develop detailed psychological models of their behaviour:
Martin could often see precisely what cardholders were purchasing, and he discovered that the brands we buy are the windows into our souls — or at least into our willingness to make good on our debts. His data indicated, for instance, that people who bought cheap, generic automotive oil were much more likely to miss a credit-card payment than someone who got the expensive, name-brand stuff. People who bought carbon-monoxide monitors for their homes or those little felt pads that stop chair legs from scratching the floor almost never missed payments. Anyone who purchased a chrome-skull car accessory or a “Mega Thruster Exhaust System” was pretty likely to miss paying his bill eventually.
Martin’s measurements were so precise that he could tell you the “riskiest” drinking establishment in Canada — Sharx Pool Bar in Montreal, where 47 percent of the patrons who used their Canadian Tire card missed four payments over 12 months. He could also tell you the “safest” products — premium birdseed and a device called a “snow roof rake” that homeowners use to remove high-up snowdrifts so they don’t fall on pedestrians.
By the time he publicized his findings, a small industry of math fanatics — many of them former credit-card executives — had started consulting for the major banks that issued cards, and they began using Martin’s findings and other research to build psychological profiles. Why did birdseed and snow-rake buyers pay off their debts? The answer, research indicated, was that those consumers felt a sense of responsibility toward the world, manifested in their spending on birds they didn’t own and pedestrians they might not know. Why were felt-pad buyers so upstanding? Because they wanted to protect their belongings, be they hardwood floors or credit scores. Why did chrome-skull owners skip out on their debts? “The person who buys a skull for their car, they are like people who go to a bar named Sharx,” Martin told me. “Would you give them a loan?”

It's not only your purchasing record that's mined for psychological data, though:
Most of the major credit-card companies have set up systems to comb through cardholders’ data for signs that someone is going to stop making payments. Are cardholders suddenly logging in at 1 in the morning? It might signal sleeplessness due to anxiety. Are they using their cards for groceries? It might mean they are trying to conserve their cash. Have they started using their cards for therapy sessions? Do they call the card company in the middle of the day, when they should be at work? What do they say when a customer-service representative asks how they’re feeling? Are their sighs long or short? Do they respond better to a comforting or bullying tone?

The card companies have, as you might imagine, a variety of uses for this data. On the blunter side of the spectrum, signs of potential unreliability (bills for dive bars or marriage counselling services, unusual login patterns) may trigger card companies to raise interest rates or start pushing more aggressively for repayment. More subtly, though, if your credit card company calls you to discuss your bill, the person talking to you will be trained in psychological techniques and will have on their screen a detailed psychological profile of you, all the better to elicit compliance:
Santana had actually already sought permission from the bank to settle for as little as $10,000. It’s an open secret that if a debtor is willing to wait long enough, he can probably get away with paying almost nothing, as long as he doesn’t mind hurting his credit score. So Santana knew he should jump at the offer. But as an amateur psychologist, Santana was eager to make his own diagnosis — and presumably boost his own commission.
“I don’t think that’s going to work,” Santana told the man. Santana’s classes had focused on Abraham Maslow’s hierarchy of needs, a still-popular midcentury theory of human motivation. Santana had initially put this guy on the “love/belonging” level of Maslow’s hierarchy and built his pitch around his relationship with his ex-wife. But Santana was beginning to suspect that the debtor was actually in the “esteem” phase, where respect is a primary driver. So he switched tactics.
“You spent this money,” Santana said. “You made a promise. Now you have to decide what kind of a world you want to live in. Do you want to live around people who break their promises? How are you going to tell your friends or your kids that you can’t honor your word?”
The man mulled it over, and a few days later called back and said he’d pay $12,000.
“Boom, baby!” Santana shouted as he put down the phone. “It’s all about getting inside their heads and understanding what they need to hear,” he told me later. “It really feels great to know I’m helping people in pain.”

Of course, another way to look at this is that, had the chump (who, according to the article, had recently been left by his wife) not offered to pay up extra, the friendly man from the card company would have known exactly which buttons to push to kick him down further. Which is all very well (Personal Responsibility, after all, is What Made America Great, as any card-carrying Libertarian will tell you), were it not for the inherent asymmetry of going up against a huge organisation with frighteningly powerful intelligence-gathering abilities and no interest in your welfare beyond what's required to maximise its profits.
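The purchase-based risk scoring described in the article is, at bottom, ordinary predictive modelling: purchase categories become features, and a fitted model maps them to a probability of missed payments. A toy logistic-regression-style sketch, with entirely invented weights (the real models are fitted on millions of accounts):

```python
from math import exp

# Toy illustration of behavioural credit scoring: each purchase
# category carries a weight (invented here, echoing the article's
# examples), and a logistic function turns the weighted sum into a
# default probability.

WEIGHTS = {
    "generic_motor_oil":        0.8,   # correlated with missed payments
    "chrome_skull_decor":       1.2,
    "felt_furniture_pads":     -1.0,   # correlated with reliable payment
    "carbon_monoxide_monitor": -0.9,
    "premium_birdseed":        -1.1,
}
BIAS = -2.0  # base rate: most cardholders don't default

def default_probability(purchases):
    """Map a set of purchase categories to a default probability."""
    z = BIAS + sum(WEIGHTS.get(p, 0.0) for p in purchases)
    return 1 / (1 + exp(-z))  # logistic link squashes z into (0, 1)

risky = default_probability({"generic_motor_oil", "chrome_skull_decor"})
safe = default_probability({"premium_birdseed", "felt_furniture_pads"})
print(f"risky basket: {risky:.2f}, safe basket: {safe:.2f}")
```

The point of the logistic link is that evidence combines additively in log-odds space: each chrome skull or bag of birdseed nudges the score, and no single purchase can push the probability past 0 or 1.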
When US filmmaker Andrea Wachner was invited to attend her 10-year high-school reunion in the affluent Los Angeles suburb of Palos Verdes, she didn't want to go; so she recruited an exotic dancer to pretend to be her, fitting her with an earpiece and coaching her interactively on the people she was meeting. Tattooed, scantily-clad "Cricket" claimed that she was Andrea, had had reconstructive surgery and suffered amnesia after a car accident, and that she was working as a stripper to pay for her graduate school tuition. She was followed by a camera crew, ostensibly making a documentary about the daily lives of artists. Cricket finished off her performance by doing a striptease to a Lisa Loeb song.
Most of the people were taken in by this, or at least sufficiently uncertain to not raise a fuss in case they ended up making fools of themselves, and found out only later, when Wachner posted video to YouTube, as a teaser for a 40-minute documentary titled "I Remember Andrea Better" she was making on the incident.
I have just been catching up with some blogs, and have found, in Momus' blog, an interesting taxonomy of recent pop-cultural history:
The anxious interval: The anxious interval is the recent past. It's long enough ago to feel not-contemporary, but not long enough ago to feel utterly removed. It's at an uncomfortable distance, which is why I call it "anxious". You could think of the anxious interval as the temporal equivalent of the uncanny valley, that place where robots are similar enough to us to give us an uncomfortable shudder. You could also say the anxious interval is a place, a style, a set of references we avoid, repress, sublimate, have selective amnesia about, stow away, throw out, deliberately forget.
An example: Devendra Banhart and the scene that was called Freak Folk or New Weird America. The Wire magazine cover feature on New Weird America dates from August 2003. By April 2005 the San Francisco Chronicle is telling us that Freak Folk Flies High. By June 2006 the New York Times is telling its readers that "a music scene called freak folk is bursting up from underground" but adding that "it looked like a trend of the moment a couple of years ago". By 2009, it's safe to say that a reference to Freak Folk would be more likely to puncture your credibility than bolster it. Freak Folk is in "the anxious interval".
The goldmine is the cultural era the present is currently reviving. I've put a picture of Buggles, because in general we're reviving the 80s at the moment. You know, the guy from Hot Chip wears Buggles-like glasses, and so on. The goldmine is a goldmine for people who run secondhand clothes stores and have lots of stock from the requisite era, or people who are selling synths from that era, or people who've got a bunch of cheap Chinese Ray Ban copy frames. The smartest people in the present are remembering the goldmine and sifting through its waters like a crowd of panhandlers.
The battlefront is the area right at the edge of the goldmine -- the place where the acceptable and lucrative revival era meets a time which is currently repressed, neglected, and a-slumber. What's so interesting about the battlefront is that the process of reassessment is so visible here, and the revaluation is so daringly and consciously done. An elite of taste-leaders and taste-formers unafraid of ridicule are hard at work here, foraging for bargains, bringing an unacceptable era into fresh acceptability. There's a kind of shuddering repulsion for long-neglected, long-repressed artifacts, and yet something compellingly taboo about them. Their hiddenness makes them fascinating -- it's as if their very sublimation has given these cultural objects some kind of big power over our unconscious. The best curators and fashionistas are to be found at the battlefront, battling for the fascinating-repellant things they find in that twilit zone between acceptability and unacceptability.
New research from England (where else?) claims that having had a passionate first love can damage one's chances of finding happiness, and it's better to avoid the throes of extreme passion altogether:
Brynin found that the euphoria of first love can damage future relationships. "Remarkably, it seems that the secret to long-term happiness in a relationship is to skip a first relationship," said Brynin. "In an ideal world, you would wake up already in your second relationship."
While researching the components of successful long-term partnerships, Brynin found intense first loves could set unrealistic benchmarks, against which we judge future relationships. "If you had a very passionate first relationship and allow that feeling to become your benchmark for a relationship dynamic, then it becomes inevitable that future, more adult partnerships will seem boring and a disappointment," he said.
Cory Doctorow, freelance writer and novelist, has written a short article on how to write productively in the age of ubiquitous distraction. The advice he gives is rather novel; he dismisses the usual advice about switching off one's internet connection, and is also scornful of the idea of ceremony, or of setting the right mood. (And understandably so; acknowledging the idea of there being a right mood or atmosphere for evoking one's inner muse could lead to finding excuses, consciously or subconsciously, for not actually doing anything.)
The single worst piece of writing advice I ever got was to stay away from the Internet because it would only waste my time and wouldn't help my writing. This advice was wrong creatively, professionally, artistically, and personally, but I know where the writer who doled it out was coming from. Every now and again, when I see a new website, game, or service, I sense the tug of an attention black hole: a time-sink that is just waiting to fill my every discretionary moment with distraction. As a co-parenting new father who writes at least a book per year, half-a-dozen columns a month, ten or more blog posts a day, plus assorted novellas and stories and speeches, I know just how short time can be and how dangerous distraction is.
Short, regular work schedule. When I'm working on a story or novel, I set a modest daily goal — usually a page or two — and then I meet it every day, doing nothing else while I'm working on it. It's not plausible or desirable to try to get the world to go away for hours at a time, but it's entirely possible to make it all shut up for 20 minutes. Writing a page every day gets me more than a novel per year — do the math — and there's always 20 minutes to be found in a day, no matter what else is going on. Twenty minutes is a short enough interval that it can be claimed from a sleep or meal-break (though this shouldn't become a habit). The secret is to do it every day, weekends included, to keep the momentum going, and to allow your thoughts to wander to your next day's page between sessions. Try to find one or two vivid sensory details to work into the next page, or a bon mot, so that you've already got some material when you sit down at the keyboard.
Leave yourself a rough edge. When you hit your daily word-goal, stop. Stop even if you're in the middle of a sentence. Especially if you're in the middle of a sentence. That way, when you sit down at the keyboard the next day, your first five or ten words are already ordained, so that you get a little push before you begin your work. Knitters leave a bit of yarn sticking out of the day's knitting so they know where to pick up the next day — they call it the "hint." Potters leave a rough edge on the wet clay before they wrap it in plastic for the night — it's hard to build on a smooth edge.
Realtime communications tools are deadly. The biggest impediment to concentration is your computer's ecosystem of interruption technologies: IM, email alerts, RSS alerts, Skype rings, etc. Anything that requires you to wait for a response, even subconsciously, occupies your attention. Anything that leaps up on your screen to announce something new occupies your attention. The more you can train your friends and family to use email, message boards, and similar technologies that allow you to save up your conversation for planned sessions instead of demanding your attention right now, the easier it is to carve out your 20 minutes. By all means, schedule a chat — voice, text, or video — when it's needed, but leaving your IM running is like sitting down to work after hanging a giant "DISTRACT ME" sign over your desk, one that shines brightly enough to be seen by the entire world.
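Doctorow's "do the math" aside above does check out, under a couple of common publishing conventions (roughly 250 words to a manuscript page and about 80,000 words to a novel — both are stand-in figures, not from his article):

```python
words_per_page = 250      # typical manuscript-page convention (assumed)
novel_length = 80_000     # a common novel length in words (assumed)
pages_per_year = 365      # one page every day, weekends included

words_per_year = words_per_page * pages_per_year   # 91,250
print(words_per_year > novel_length)               # -> True
```

Even at one page a day, a year's output comfortably exceeds a typical novel, which is the whole point of the "short, regular" schedule.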
(via Where do you think?)
A study at Edinburgh's Heriot Watt University has claimed that watching romantic comedies can damage relationships, by inculcating unrealistic expectations:
They found fans of films such as Runaway Bride and Notting Hill often fail to communicate with their partner. Many held the view that if someone is meant to be with you, then they should know what you want without you telling them.
Kimberly Johnson, who also worked on the study, said: "Films do capture the excitement of new relationships but they also wrongly suggest that trust and committed love exist from the moment people meet, whereas these are qualities that normally take years to develop."
New advances in neural imaging are shedding light on what makes a psychopath a psychopath:
In a landmark 1991 E.R.P. study conducted at a prison in Vancouver, Robert Hare and two graduate students showed that psychopaths process words like “hate” and “love” differently from the way normal people do. In another study, at the Bronx V.A. Medical Center, Hare, Joanne Intrator, and others found that psychopaths processed emotional words in a different part of the brain. Instead of showing activity in the limbic region, in the midbrain, which is the emotional-processing center, psychopaths showed activity only in the front of the brain, in the language center. Hare explained to me, “It was as if they could only understand emotions linguistically. They knew the words but not the music, as it were.”
Today, Kiehl and Hare have a complementary but complicated relationship. Kiehl claims Hare as a mentor, and sees his own work as validating Hare’s checklist, by advancing a neurological mechanism for psychopathy. Hare is less gung ho about using fMRI as a diagnostic tool. “Some claim, in a sense, this is the new phrenology,” Hare said, referring to the discredited nineteenth-century practice of reading the bumps on people’s heads, “only this time the bumps are on the inside.”
But the problem is that “psychopathic behavior”—egocentricity, for example, or lack of realistic long-term goals—is present in far more than one per cent of the adult male population. This blurriness in the psychopathic profile can make it possible to see psychopaths everywhere or nowhere. In the mid-fifties, Robert Lindner, the author of “Rebel Without a Cause: A Hypnoanalysis of a Criminal Psychopath,” explained juvenile delinquency as an outbreak of mass psychopathy. Norman Mailer inverted this notion in “The White Negro,” admiring the hipster as a “philosophical psychopath” for having the courage of nonconformity. In the sixties, sociopathy replaced psychopathy as the dominant construct. Now, in our age of genetic determinism, society is once again seeing psychopaths everywhere, and this will no doubt provoke others to say they are nowhere, and the cycle of overexposure and underfunding will continue.

One researcher is studying prison inmates (a population in which psychopathy is greatly over-represented, as one might expect) using a brain scanner and tests reminiscent of the Voight-Kampff test in Blade Runner, measuring their responses to moral questions that non-psychopathic individuals would respond to viscerally.
The fMRI machine started up with a high-pitched whirring sound. I began to see photographs. One was of a baby covered with blood. I thought first about the blood, then realized the circumstances—birth—and rated the moral offense zero. A man was lying on the ground with his face beaten to a bloody pulp: I scored this high. There was a picture of Osama bin Laden. I scored it four, although I felt that I was making more of an intellectual than a moral judgment. Two guys inadvertently butting heads in a soccer game got a zero, but then I changed it to a one, because perhaps a foul was called. I had considered deliberately giving wrong answers, as a psychopath might. But instead I worked at my task earnestly, like a good fifth grader.
(via Wired News)
A biologist and a sociologist have put forward a new theory of brain development and mental disorders. Crespi and Badcock's theory posits a spectrum running between autism and related social dysfunctions on one side and schizophrenia, depression and bipolar disorder on the other, with the struggle between maternal and paternal genes in the womb determining where the child's neurology will fall on this axis:
Dr. Crespi and Dr. Badcock propose that an evolutionary tug of war between genes from the father’s sperm and the mother’s egg can, in effect, tip brain development in one of two ways. A strong bias toward the father pushes a developing brain along the autistic spectrum, toward a fascination with objects, patterns, mechanical systems, at the expense of social development. A bias toward the mother moves the growing brain along what the researchers call the psychotic spectrum, toward hypersensitivity to mood, their own and others’. This, according to the theory, increases a child’s risk of developing schizophrenia later on, as well as mood problems like bipolar disorder and depression.
It was Dr. Badcock who noticed that some problems associated with autism, like a failure to meet another’s gaze, are direct contrasts to those found in people with schizophrenia, who often believe they are being watched. Where children with autism appear blind to others’ thinking and intentions, people with schizophrenia see intention and meaning everywhere, in their delusions. The idea expands on the “extreme male brain” theory of autism proposed by Dr. Simon Baron-Cohen of Cambridge.
“Think of the grandiosity in schizophrenia, how some people think that they are Jesus, or Napoleon, or omnipotent,” Dr. Crespi said, “and then contrast this with the underdeveloped sense of self in autism. Autistic kids often talk about themselves in the third person.”
Faced with a wave of ostalgie, misty-eyed nostalgia for the fallen East German Communist regime, Germany's educational authorities have created a mockup of an East German classroom, in which school students would be subjected to the Communist experience. There they would be threatened with disciplinary action for wearing Western clothes, ordered to sing Communist marching songs and told of field trips to border guard regiments, by a "teacher" attired in authentic East German synthetic fabrics. One student would also volunteer in advance to play the child of dissidents, who would then be alternately criticised and ignored by the teachers. What the organisers hadn't planned on was that the whole thing would turn into a small-scale reenactment of the Stanford Prison Experiment, with dissident "Steffen"'s erstwhile classmates turning on him and joining in persecuting him like good cogs in the totalitarian machine:
The other pupils began to ostracise "Steffen" themselves and accused him of disrupting the class. Although they were encouraged to stand up against the system before the session, none of the pupils rallied to Steffen's support when he was told he could not visit the border-guard unit, or at any other time.
During these sessions Elke Urban models herself on Margot Honecker, the leader's wife who was also a hardline education minister. She said that only one group had dared to stand up and defend the dissident pupil during her classes. "I deliberately create a totalitarian atmosphere and I am still always shocked how quickly and easily people are conditioned by it," she said. "East Germany may have left a pile of Stasi files behind rather than a pile of corpses, but the similarities with the Nazi regime are there."
A study has shown that people who grew up watching black and white television dream in black and white, whilst people who grew up with colour television dream in colour:
Research from 1915 through to the 1950s suggested that the vast majority of dreams are in black and white but the tide turned in the sixties, and later results suggested that up to 83 per cent of dreams contain some colour.
Only 4.4 per cent of the under-25s' dreams were black and white. The over-55s who had had access to colour TV and film during their childhood also reported a very low proportion of just 7.3 per cent. But the over-55s who had only had access to black-and-white media reported dreaming in black and white roughly a quarter of the time.

It isn't clear what sorts of dreams people who grew up without television have; whether they're less visual and more verbal, more three-dimensional, or just less spectacular.
The US election season is proving a bonanza to scientists studying deception, from incongruous body language to the vague phraseology of "spin":
BLINK and you would have missed it. The expression of disgust on former US president Bill Clinton's face during his speech to the Democratic National Convention as he says "Obama" lasts for just a fraction of a second. But to Paul Ekman it was glaringly obvious.
"Given that he probably feels jilted that his wife Hillary didn't get the nomination, I would have to say that the entire speech was actually given very gracefully," says Ekman, who has studied people's facial expressions and how they relate to what they are thinking for over 40 years.

Another algorithm scores politicians on the amount of spin, or manipulative content-free language, in their speeches, using word frequencies:
The algorithm counts usage of first person nouns - "I" tends to indicate less spin than "we", for example. It also searches out phrases that offer qualifications or clarifications of more general statements, since speeches that contain few such amendments tend to be high on spin. Finally, increased rates of action verbs such as "go" and "going", and negatively charged words, such as "hate" and "enemy", also indicate greater levels of spin. Skillicorn had his software tackle a database of 150 speeches from politicians involved in the 2008 US election race (see diagram).
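The counting scheme described above can be sketched as a toy scorer. The word lists and weights here are illustrative stand-ins, not Skillicorn's actual lexicons or model:

```python
import re

# Illustrative word lists -- guesses for the sketch, not Skillicorn's lexicons.
FIRST_PERSON_SINGULAR = {"i", "me", "my"}         # tends to indicate less spin
FIRST_PERSON_PLURAL = {"we", "us", "our"}         # tends to indicate more spin
QUALIFIERS = {"however", "although", "unless", "except"}  # clarifications: less spin
ACTION_VERBS = {"go", "going", "do", "doing"}     # more spin
NEGATIVE_WORDS = {"hate", "enemy", "fight"}       # more spin

def toy_spin_score(speech: str) -> float:
    """Crude spin score per 100 words: positive means more spin."""
    words = re.findall(r"[a-z']+", speech.lower())
    if not words:
        return 0.0
    count = lambda lexicon: sum(w in lexicon for w in words)
    raw = (count(FIRST_PERSON_PLURAL) + count(ACTION_VERBS) + count(NEGATIVE_WORDS)
           - count(FIRST_PERSON_SINGULAR) - count(QUALIFIERS))
    return 100.0 * raw / len(words)
```

A real model would be trained against a baseline corpus (the article's zero point is the average across all 150 speeches analysed) rather than using hand-picked words and unit weights.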
In general, though, Obama's speeches contain considerably more spin than either McCain's or Clinton's. For example, of their speeches accepting their party's nomination for president, Obama's scored a spin value of 6.7 - where 0 is the average level of spin across all the political speeches analysed, and positive values represent higher spin. In contrast, McCain's speech scored -7.58, while Hillary Clinton's speech at the Democratic National Convention scored 0.15. Skillicorn also found that Sarah Palin's speeches contain slightly more spin than average.

So whilst Obama is one slick player, the straight-talkin', plain-dealin' McCain has little to rejoice about, according to a different metric:
"The voice analysis profile for McCain looks very much like someone who is clinically depressed," says Pollermann, a psychologist who uses voice analysis software in her work with patients. Previous research on mirror neurons has shown that listening to depressed voices can make others feel depressed themselves, she says.
Additionally, McCain's voice and facial movements often do not match up, says Pollermann, and he often smiles in a manner that commonly conveys sarcasm when addressing controversial statements. "That might lead to what I would call a lack of credibility."
A piece by online communication expert Suw Charman-Anderson about how and why email is so dangerous to getting things done:
In a study last year, Dr Thomas Jackson of Loughborough University, England, found that it takes an average of 64 seconds to recover your train of thought after interruption by email (bit.ly/email2). So people who check their email every five minutes waste 8½ hours a week figuring out what they were doing moments before.

The distractive (and some would say destructive) effects of email come down partly to the psychology of addiction and reinforcement:
Tom Stafford, a lecturer at the University of Sheffield, England, and co-author of the book Mind Hacks, believes that the same fundamental learning mechanisms that drive gambling addicts are also at work in email users. "Both slot machines and email follow something called a 'variable interval reinforcement schedule' which has been established as the way to train in the strongest habits," he says.
"This means that rather than reward an action every time it is performed, you reward it sometimes, but not in a predictable way. So with email, usually when I check it there is nothing interesting, but every so often there's something wonderful - an invite out or maybe some juicy gossip - and I get a reward." This is enough to make it difficult for us to resist checking email, even when we've only just looked. The obvious solution is to process email in batches, but this is difficult. One company delayed delivery by five minutes, but had so many complaints that it had to revert to instantaneous delivery. People knew that there were emails waiting and champed at the bit to get hold of them.

Things like weekly "no email days" don't work either, because they don't actually change people's compulsive email-checking habits. Charman-Anderson's article recommends other notification technologies, such as Twitter and RSS aggregators, as better alternatives.
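The interruption arithmetic quoted above is easy to verify; a 40-hour work week is an assumption here, as the article doesn't state one:

```python
recovery_seconds = 64         # recovery time per email interruption (Jackson)
check_interval_minutes = 5    # how often email is checked
work_hours_per_week = 40      # assumed; not stated in the article

checks_per_week = work_hours_per_week * 60 / check_interval_minutes  # 480
lost_hours = checks_per_week * recovery_seconds / 3600
print(round(lost_hours, 1))   # -> 8.5
```

Which matches the article's figure of eight and a half hours a week lost to recovering one's train of thought.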
On a similar tangent, one of the tips for getting more done in Tim Ferriss' The 4-Hour Work Week is a somewhat counterintuitive-sounding low-information diet; rather than binging on magazines, news, books, blogs, podcasts and such, he advocates cutting all that out as much as possible, reasoning that we can get by on much less information than we habitually consume and still know enough, whilst having more time to actually do things. The holes in what we know will soon enough be filled by what we hear in smalltalk, learn from friends, or unavoidably see in front of the newspaper kiosk on the way to the shops. (Which sort of makes sense; think, for example, of how much you know of the plots of various well-known movies you haven't seen or books you haven't read. Whether or not you've seen Star Wars or read Animal Farm (to cite two examples), you can probably come up with a summary of what they're about.)
A study recently published in the Australasian Psychiatry journal has found correlations between musical preferences and a variety of mental illnesses and antisocial tendencies, and recommends that doctors ask their teenaged patients what sorts of music they listen to. The study, by Dr. Felicity Baker of the University of Queensland, is not online, but these articles contain various points from it. Among them:
There's an intriguing article in the Guardian about the descendants of German Nazis who converted to Judaism and moved to Israel. The article interviews several such converts (the son of an SS man who is now an Orthodox rabbi, a left-wing lesbian campaigner for Palestinian rights, and a professor of Jewish Studies who is related to Hitler, and who describes his (Israeli-born, Arab-hating) son as a "fascist").
One somewhat obvious explanation for this phenomenon is the assuaging of guilt by rejecting the oppressor population one came from and identifying with the victims, and this explanation is floated by an expert on the psychology of the children of perpetrators. Interestingly, though, none of those interviewed, when asked why they converted to Judaism, mention the Holocaust or Nazism, instead giving theological reasons:
"During my theological studies at university it became clear that I couldn't be a minister in the church," he says. "I concluded that Christianity was paganism. One of [its] most important dogmas is that God became man, and if God becomes man then man also can become God." He pauses. "Hitler became a kind of god."
I tell Bar-On they talk obsessively about the Trinity. But is incredulity really a reason for abandoning a religion with a three-in-one god for one that still believes bushes talk and that waves are parted by the will of God? "That is another way of saying what I have already told you," he says. "They want to join the community of the victim. They may have their own way of rationalising it."
A new study has revealed a correlation between the number of bumper stickers on a car and the aggressiveness of the driver's behaviour, presumably as bumper stickers indicate a territorial mindset on the part of the driver. Interestingly enough, there was no correlation between the content of the bumper stickers and the driver's behaviour, so a "Visualise World Peace" sticker would be as much of a danger sign as a "Don't Mess With Texas" one.
Psychology experiments have shown that subliminal exposure to brands can prime people with the attributes those brands have cultivated. For example, when students were exposed to either an Apple or IBM logo and asked to list all the uses for a brick they could imagine, the Apple ("creativity, nonconformity") group came up with significantly more than the IBM ("tradition, responsibility") group. In a subsequent experiment, candidates primed with the Disney logo behaved more honestly than those primed with the logo of E! Channel (which, I believe, is a celebrity-gossip cable-TV channel in the US).
The practical consequences of this are interesting: if this is to be taken at face value then, by the sheer power of subliminal conditioning and marketing, brands do have magical properties, and branded products would perform better than physically identical unbranded ones. A brand logo is a macro, a tightly-encoded package of ideas, instantaneously decoded by appropriately conditioned consumers (and that means all of us, given the studies showing that young children learn to recognise brands before they learn to read), and priming has been shown to work. (In one experiment (previously mentioned here), students were asked to sort words, and then surreptitiously timed as they walked down the corridor on leaving. Those given words relating to old age—including, memorably, "Florida"—walked more slowly than those given youth-related words. Another experiment showed that exposure to alcohol-related words increased men's sex drive.)
Putting these facts together, it seems that using an Apple computer would make you more creative, even if you work in the same version of Microsoft Word you could just as easily use on Windows; so would having an Apple iPod, and Nike shoes could make you run faster than generic trainers of exactly the same composition, and so on. It's not necessarily even limited to brands, but could extend to any perceptible medium associated with qualities or values. It'd be interesting to see whether, for example, if one took two groups of students, surreptitiously exposed one to Belle & Sebastian and the other to 50 Cent, and then asked them to play a game, members of one group would be more aggressive or competitive than the other.
Anyway, this finding could be seen as a justification for big brands' steep markups of otherwise average products: they're not exploiting a gullible public, they're selling the psychological magic of their brand. Though if you don't want to pay the markup, you could just as easily clip ads out of papers and tape them around your cubicle/kitchen/locker/wherever, which might get you a similar result, at the risk of making you look like a tragic. Just keep reminding yourself that you're not a gullible dupe or an unpaid human billboard, but a cunningly rebellious pirate, sticking it to The Man by stealing his magic without paying.
I wonder, though, whether candidates subliminally exposed to craptacular knockoffs of Apple products would experience a boost of creativity or a drop in IQ.
Striking another blow against the modern idea that 100% cheerfulness is attainable or desirable, an expert on mood disorders at King's College argues that depression may be good for you:
The fact it has survived so long - and not been eradicated by evolution - indicates it has helped the human race become stronger.
"I have received e-mails from ex-sufferers saying in retrospect it probably did help them because they changed direction, a new career for example, and as a result they're more content day-to-day than before the depression."
Aristotle believed depression to be of great value because of the insights it could bring. There is also an increased empathy in people who have or have had depression, he says, because they become more attuned to other people's suffering.
Two Oxford University sociologists look at the question of why graduates in science, engineering and medicine are overrepresented in terrorist and extremist groups:
However, contrary to popular speculation, it's not technical skills that make engineers attractive recruits to radical groups. Rather, the authors pose the hypothesis that "engineers have a 'mindset' that makes them a particularly good match for Islamism," which becomes explosive when fused by the repression and vigorous radicalization triggered by the social conditions they endured in Islamic countries.
Whether American, Canadian or Islamic, a disproportionate share of engineers, they pointed out, seem to have a mindset that makes them open to the quintessential right-wing features of "monism" (why argue when there is one best solution?) and "simplism" (if only people were rational, remedies would be simple).
The internet, with its detachment between online and offline actions and its lack of a private register, has spawned the phenomenon of griefers, or highly organised subcultures of people (mostly young men) who delight in ruining other people's online fun:
Consider the case of the Avatar-class Titan, flown by the Band of Brothers Guild in the massively multiplayer deep-space EVE Online. The vessel was far bigger and far deadlier than any other in the game. Kilometers in length and well over a million metric tons unloaded, it had never once been destroyed in combat. Only a handful of player alliances had ever acquired a Titan, and this one, in particular, had cost the players who bankrolled it in-game resources worth more than $10,000.
So, naturally, Commander Sesfan Qu'lah, chief executive of the GoonFleet Corporation and leader of the greater GoonSwarm Alliance — better known outside EVE as Isaiah Houston, senior and medieval-history major at Penn State University — led a Something Awful invasion force to attack and destroy it.
"The ability to inflict that huge amount of actual, real-life damage on someone is amazingly satisfying," says Houston. "The way that you win in EVE is you basically make life so miserable for someone else that they actually quit the game and don't come back."
To see the philosophy in action, skim the pages of Something Awful or Encyclopedia Dramatica, where it seems every pocket of the Web harbors objects of ridicule. Vampire goths with MySpace pages, white supremacist bloggers, self-diagnosed Asperger's sufferers coming out to share their struggles with the online world — all these and many others have been found guilty of taking themselves seriously and condemned to crude but hilarious derision.

Griefers defend their behaviour by claiming that they're merely giving those who take the internet far too seriously a reality check. The implied subtext is that anything that happens online is just a game and doesn't count. Though, given how the internet has become a mainstream part of many people's lives (witness, for example, the rise of social networking websites), this assertion makes about as much sense as Tom Hodgkinson's call to kill your Facebook account, throw away your email address and instead socialise in the pub with people near you. It's not a great leap from asserting that anything that happens online doesn't really count to absurdly Luddite claims like "if you don't know what someone smells like, they're a stranger".
On the other hand, there is no such thing as the right to be respected, or even to not be ridiculed. If one posts a web page detailing one's peculiar political views, conspiracy theories and/or sexual fetishes online, one can expect to be laughed at and even snidely remarked about. Though there is a distinction between demolishing someone's homepage in a blog or discussion forum and actively gathering a posse and going out to hound them off the net.
Griefing happens in the real world too, though it's usually called other things, such as bullying. The difference is that the internet has democratised bullying. In the real world, in more conformist societies, bullies are typically those who either hold alpha social status or are contending for it, enforcing an exaggerated version of majority values by picking on those perceived not to conform to them (witness the use of the word "gay", sometimes semi-euphemised as "ghey", as a general-purpose term of derision); in more liberal or pluralistic environments, even that is frowned upon. Online, anyone can find a group of like-minded misfits, make up a cool-sounding name, set up a virtual clubhouse and start picking on mutually agreed targets, with little fear of social consequences.
A study in Singapore has shown that the sight or smell of appetising food can compel people to make impulse purchases, or else compromise their ability to judge risks and payoffs:
Similarly, another experiment used a cookie-scented candle to further gauge whether appetitive stimulus affects consumer behavior. Female study participants in a room with a hidden chocolate-chip-cookie-scented candle were much more likely to make an unplanned purchase of a new sweater -- even when told they were on a tight budget -- than those randomly assigned to a room with a hidden unscented candle (67 percent vs. 17 percent).

The researchers make the further claim that "the presence of an attractive woman in the trading room might propel an investor to choose the investment option providing smaller but sooner rewards".
A pair of 35-year-old identical twins met for the first time, after having been raised apart without knowing of each other's existence, as part of a psychological experiment:
"It was a relief I think for both of us that we were not carbon copies. As similar as we looked when we compared pictures of ourselves as kids, as adults we have our own distinct style."
"We had the same favourite book and the same favourite film, Wings of Desire," says Elyse. "It was amazing," says Paula. "We felt we were conducting our own informal study on nature versus nurture in a way".

Which raises the question: how do you know that the way you live, and what you accept as normal today, is not actually part of some psychological experiment?
Don't envy the super-rich, this article says; their wealth has almost certainly made them miserable:
According to de Vries, the super-rich are increasingly succumbing to what has been labelled Wealth Fatigue Syndrome (WFS). When money is available in near-limitless quantities, the victim sinks into a kind of inertia.
"The rich are never happy, no matter what they have," he told CNN. "There was this man who owned a 100ft yacht. I said: 'This is a terrific boat.' He said: 'Look down the harbour.' We looked down the marina, and there were boats two and three times as large. He said: 'My 100ft yacht today is like a dinghy compared to these other boats.' When else in history has someone been able to call a 100ft yacht a dinghy?"
Some of our friends have jumped from nice five-bedroom houses in South Kensington to gated mansions in St John's Wood, complete with hot and cold running staff. But many who join the super-rich find it hard to keep their old circles of support. Happiness studies have repeatedly shown that being marginally better off than your neighbours makes you feel good, but being a hundred times richer makes you feel worse. So either you change your friends or live with the envy of others.

The article goes on to expound numerous other causes of wealth-induced misery: social support networks break down, as relationships with old friends are strained by the wealth disparity and poisoned by real or perceived envy; and all the cars, yachts and new houses your money can buy just become boring much more quickly. (It's the hedonic treadmill effect: one becomes acclimatised to one's level of comfort and contentment, and it takes ever more to stave off ennui.) Meanwhile, the wives of the super-rich (and most of the super-rich are men; presumably husbands of super-rich women, or gay partners, would fare similarly) suffer the same psychological consequences as the unemployed (that is, when they're not traded in for younger, prettier models), and their children, shuttled between nannies and estates, often end up clinically depressed.
The conclusion is that money can buy happiness — but only up to a point. A key component of happiness is social connectedness, of the sort that cannot be bought:
The happiest nations, he says, are those where people feel most equal, even if that means being less wealthy. Pentecost, a tiny island in the South Pacific, has recently been voted the happiest place on earth. They don't have WFS – in fact, they don't have money; they use pigs' horns instead.
In places such as Pentecost, people actually talk to each other – indeed, belonging to a community is one of the single most important prerequisites for happiness.
A study of hundreds of written threats to US politicians has yielded the conclusion that emailed threats showed far fewer signs of serious mental illness than posted ones. This is presumably because the internet has lowered the barrier to entry to threatening one's congresscritter, making it available to people who are only slightly nuts.
As one might expect, the emailed threats also contained more obscene language and were more disorganised.
A new book takes to task the accepted belief that men and women think and/or communicate differently, as expounded by popular psychology books like Men Are From Mars, Women Are From Venus:
The bones of Cameron’s argument, set out in The Myth of Mars and Venus, are that Gray et al have no scientific basis for their claims. Great sheaves of academic papers, says Cameron, show that the language skills of men and women are almost identical. Indeed, the central tenets of the Mars and Venus culture – that women talk more than men, that men are more direct, that women are more verbally skilled – can all be debunked by scientific research. A recent study in the American journal Science, for instance, found men and women speak almost exactly the same number of words a day: 16,000.
Where the book becomes interesting is when she asks why we have become interested in these myths. “The first point to make is that in the past 20 years we have become obsessed by communication,” she says. “And that’s not just in relationships; it’s in customer care, it’s in politics. All problems are seen to be communication problems.
Cameron is not simply irritated that the Mars and Venus books have filled too many Christmas stockings. Her fervour on this issue runs deeper. There is, she thinks, something regressive, deeply conservative, in this outlook, because what it seems to be saying is that we can't change.

The author, Deborah Cameron, is a feminist philologist and Rupert Murdoch professor of Language at Oxford (really); other than the Mars-and-Venus brigade, she has in her sights Darwinists (which, I'm guessing, means the likes of Steven Pinker and/or Richard Dawkins), Tories, man-hating "pseudo-feminists" and punctuation/grammar pedants:
“You had people like Prince Charles and Norman Tebbit inferring that if people were making spelling mistakes it was only a short step to them coming in dirty to school and then there’d be no motivation for them to stay out of crime,” says Cameron. “There were these illogical slippery slope arguments: how, if children didn’t know how to use the colon properly, it was only a few steps from drug-taking and criminality. There was a deep moral and social dimension to it all.
Experiments in political psychology have shown that people become more receptive to conservatism, authoritarianism, intolerance and zero-sum "us against them" worldviews when reminded of their own mortality (going some way to explaining the "values" vote for Bush in 2004, and indeed the Howard government's successive landslides in Australia):
Their experiments showed that the mere thought of one's mortality can trigger a range of emotions--from disdain for other races, religions, and nations, to a preference for charismatic over pragmatic leaders, to a heightened attraction to traditional mores.
To test the hypothesis that recognition of mortality evokes "worldview defense"--their term for the range of emotions, from intolerance to religiosity to a preference for law and order, that they believe thoughts of death can trigger--they assembled 22 Tucson municipal court judges. They told the judges they wanted to test the relationship between personality traits and bail decisions, but, for one group, they inserted in the middle of the personality questionnaire two exercises meant to evoke awareness of their mortality. One asked the judges to "briefly describe the emotions that the thought of your own death arouses in you"; the other required them to "jot down, as specifically as you can, what you think will happen to you physically as you die and once you are physically dead." They then asked the judges to set bail in the hypothetical case of a prostitute whom the prosecutor claimed was a flight risk. The judges who did the mortality exercises set an average bail of $455. The control group that did not do the exercises set it at an average of $50. The psychologists knew they were onto something.

The researchers did other experiments involving priming one group of subjects with the thought of their mortality. In one example, they found that awareness of one's mortality can induce xenophobia and distrust of difference (students at a Christian college who did the exercises had a more negative opinion of an essay they were told was written by a Jewish author than a control group did) and aggressive patriotism (those who did the exercises took a far more negative view of an essay critical of the United States, and also expressed more reverence for national icons).
After 9/11, the researchers did experiments specifically showing that Bush's popularity in the US was enhanced by Americans' awareness of their mortality:
The control group that completed a personality survey, but did not do the mortality exercises, predictably favored Kerry by four to one. But the students who did the mortality exercises favored Bush by more than two to one. This strongly suggested that Bush's popularity was sustained by mortality reminders. The psychologists concluded in a paper published after the election that the government terror warnings, the release of Osama bin Laden's video on October 29, and the Bush campaign's reiteration of the terrorist threat (Cheney on election eve: "If we make the wrong choice, then the danger is that we'll get hit again") were integral to Bush's victory over Kerry. "From a terror management perspective," they wrote, "the United States' electorate was exposed to a wide-ranging multidimensional mortality salience induction."

The induction of mortality salience is also claimed to have been instrumental in popular antagonism to perceived enemies (including France, Germany and Canada), and a mass shift towards reactionary conservative positions such as the defense of tradition and religious dictates (from rising opposition to abortion, gay marriage and liberal attitudes to the rise of the "strict father" model of the family, which on 10 September 2001 seemed like a laughable relic of the 1950s):
Indeed, from 2001 to 2004, polls show an increase in opposition to abortion and gay marriage, along with a growing religiosity. According to Gallup, the percentage of voters who believed abortion should be "illegal in all circumstances" rose from 17 percent in 2000 to 20 percent in 2002 and would still be at 19 percent in 2004. Even church attendance by atheists, according to one poll, increased from 3 to 10 percent from August to November 2001.

In the 1980s, some figure associated with the Thatcher government in the UK was quoted as saying that "the facts of life are Conservative". Whether or not that is the case, it seems that the facts of death are.
A new study from the University of North Carolina suggests that Iraqi citizens experience sadness and a sense of loss when relatives, spouses, and even friends perish — emotions that have until recently been identified almost exclusively with Westerners:
Iraqis have often been observed weeping and wailing in apparent anguish, but the study offers evidence indicating this may not be exclusively an outward expression of anger or a desire for revenge. It also provocatively suggests that this grief can possess an American-like personal quality, and is not simply a tribal lamentation ritual.
Psychologists and anthropologists have thus far largely discounted the study, claiming it has the same bias as a 1971 Stanford University study that concluded that many Vietnamese showed signs of psychological trauma from nearly a quarter century of continuous war in southeast Asia.
"We are, in truth, still a long way from determining if Iraqis are exhibiting actual, U.S.-grade sadness," Mayo Clinic neuropsychologist Norman Blum said. "At present, we see no reason for the popular press to report on Iraqi emotions as if they are real."
A study in Bath has shown that wearing a bicycle helmet may actually increase risk, as motorists give helmeted cyclists less room to manoeuvre.
If you want drivers to cut you more slack when you're cycling, however, the advice is to dress as a woman (if you're not one already, that is):
His findings, published in the March 2007 issue of Accident Analysis & Prevention, state that when Walker wore a helmet drivers typically drove an average of 3.35 inches closer to his bike than when his noggin wasn't covered. But, if he wore a wig of long, brown locks -- appearing to be a woman from behind -- he was granted 2.2 inches more room to ride.
A study in Japan has shown that Japanese and Americans interpret facial expressions differently. In Japan, people pay attention to the eyes for emotional cues, whereas in America (and, presumably, elsewhere in the West), they look to the mouth.
The exact reasons for this are not known, though one theory is that it is because the Japanese attempt to suppress their emotions in the presence of others more than the loud, demonstrative gaijin do, and in such cases, the eyes provide more of a clue to someone's emotional state. One consequence of this, of course, is the difference between the way Westerners and Japanese draw happy-face symbols in ASCII characters, with the Japanese smiley looking like ^_^ (note the emphasis on the eyes), and the Western one being the familiar :-):
So when Yuki entered graduate school and began communicating with American scholars over e-mail, he was often confused by their use of emoticons such as smiley faces :) and sad faces, or :(.
"It took some time before I finally understood that they were faces," he wrote in an e-mail. In Japan, emoticons tend to emphasize the eyes, such as the happy face (^_^) and the sad face (;_;). "After seeing the difference between American and Japanese emoticons, it dawned on me that the faces looked exactly like typical American and Japanese smiles," he said.
Meanwhile, Google has filed a patent for using online games to build up psychological profiles of users, and using these to target ads:
The company thinks it can glean information about an individual's preferences and personality type by tracking their online behaviour, which could then be sold to advertisers. Details such as whether a person is more likely to be aggressive, hostile or dishonest could be obtained and stored for future use, it says.
The patent says: "User dialogue (eg from role playing games, simulation games, etc) may be used to characterise the user (eg literate, profane, blunt or polite, quiet etc). Also, user play may be used to characterise the user (eg cautious, risk-taker, aggressive, non-confrontational, stealthy, honest, cooperative, uncooperative, etc)."
Players who spend a lot of time exploring "may be interested in vacations, so the system may show ads for vacations". And those who spend more time talking to other characters will see adverts for mobile phones.
Not all the inferences made by monitoring user activity rely on subtle psychological clues, however. "In a car racing game, after a user crashes his Honda Civic, an announcer could be used to advertise by saying 'if he had a Hummer, he would have gotten the better of that altercation', etc," the patent says. And: "If the user has been playing for over two hours continuously, the system may display ads for Pizza Hut, Coke, coffee."

And on a related note, Bruce Schneier on how today's likely surveillance dystopias differ from Orwell's totalitarian vision:
Data collection in 1984 was deliberate; today's is inadvertent. In the information society, we generate data naturally. In Orwell's world, people were naturally anonymous; today, we leave digital footprints everywhere.
1984's Big Brother was run by the state; today's Big Brother is market driven. Data brokers like ChoicePoint and credit bureaus like Experian aren't trying to build a police state; they're just trying to turn a profit. Of course these companies will take advantage of a national ID; they'd be stupid not to. And the correlations, data mining and precise categorizing they can do is why the U.S. government buys commercial data from them.
And finally, the police state of 1984 was deliberately constructed, while today's is naturally emergent. There's no reason to postulate a malicious police force and a government trying to subvert our freedoms. Computerized processes naturally throw off personalized data; companies save it for marketing purposes, and even the most well-intentioned law enforcement agency will make use of it.
A funny thing happened during a recent test of a military mine-disposal robot:
At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.

Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.

The human in command of the exercise, however -- an Army colonel -- blew a fuse.

The colonel ordered the test stopped.

Why? asked Tilden. What's wrong?

The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.

This test, he charged, was inhumane.

This is not the only incident of a curious camaraderie developing between soldiers and robots; the article describes other stories of US troops in Afghanistan and Iraq (where robots are being widely deployed) befriending their mechanical compatriots, ascribing quirks of individual robots to personalities, giving them names and (virtual) battlefield honours, and even going fishing with their robot buddies.
Which could be more evidence that the human mind automatically perceives anything whose actions show signs of intention to have psychological states. An example (in the book Mind Hacks) describes children being shown a display with two shapes moving on a screen, one after the other, and when asked what was happening, describing one as chasing the other. Could it be that when a machine tries to defuse a bomb, its operators, even though they know it is a machine, can't help but mentally classify it as being "alive"?
The Guardian has an excerpt from a recent book by Barbara Ehrenreich, which postulates that the rise of subjective individual self-awareness and the decline of the collective celebrations common in mediæval times may have touched off an epidemic of depression we've been living in ever since:
And very likely the phenomena of this early "epidemic of depression" and the suppression of communal rituals and festivities are entangled in various ways. It could be, for example, that, as a result of their illness, depressed individuals lost their taste for communal festivities and even came to view them with revulsion. But there are other possibilities. First, that both the rise of depression and the decline of festivities are symptomatic of some deeper, underlying psychological change, which began about 400 years ago and persists, in some form, in our own time. The second, more intriguing possibility is that the disappearance of traditional festivities was itself a factor contributing to depression.
One approaches the subject of "deeper, underlying psychological change" with some trepidation, but fortunately, in this case, many respected scholars have already visited this difficult terrain. "Historians of European culture are in substantial agreement," Lionel Trilling wrote in 1972, "that in the late 16th and early 17th centuries, something like a mutation in human nature took place." This change has been called the rise of subjectivity or the discovery of the inner self and since it can be assumed that all people, in all historical periods, have some sense of selfhood and capacity for subjective reflection, we are really talking about an intensification, and a fairly drastic one, of the universal human capacity to face the world as an autonomous "I", separate from, and largely distrustful of, "them".
But the new kind of personality that arose in 16th- and 17th-century Europe was by no means as autonomous and self-defining as claimed. For far from being detached from the immediate human environment, the newly self-centered individual is continually preoccupied with judging the expectations of others and his or her own success in meeting them: "How am I doing?" this supposedly autonomous "self" wants to know. "What kind of an impression am I making?"

If this hypothesis is correct, then the epidemic of depression and mental illness that began in the 1600s (which Ehrenreich provides supporting evidence for, in historical records) is a side-effect of a step in the evolution of human psychology that began at around that time, with the pressures of communication, trade and social organisation dragging the human mind kicking and screaming from a sleepy collective life to a more dynamic way of living. In this case, a lot of the anxiety, angst and low-level distress people feel routinely is not a result of human nature, but rather of human nature reacting against "unnatural" circumstances. Small wonder that many have sought relief in an annihilation of the self, from hippie communes to Communist utopias, from meditation to severe religious submission, from the Arcadian pastoral utopias throughout art (Tolkien, William Morris and the Arcade Fire, to name three examples off the top of my head) to the transcendental nihilism of drugs (take, for example, Lou Reed wishing he had been born "a thousand years ago" in Heroin).
So where does that leave us? Perhaps, given enough time (hundreds if not thousands of years), human psychology will evolve in depression-resistant directions, assuming that some kind of technological catastrophe doesn't cut the process short. Genetic evolution is slow, but cultural evolution is faster, and it could be argued that our technologies and cultural institutions are part of the "extended phenotype" of humanity, and that the invention of antidepressant drugs is an adaptation to these changes in our environment. It's a crude, reactionary adaptation, merely treating the symptoms, though there is hope on the horizon. There has recently been a lot of focus on the study of the psychology of happiness, and of what factors make for environments conducive to sustainable happiness. With any luck, this will lead to improvements in areas from urban planning to social policy to economics.
Then again, if the hypothesis is true, would it be possible to somehow get the best of both worlds? Could one have the happy, fulfilling collective connectedness people (allegedly) had before the 16th century, whilst retaining the gains made since then? Or is the very presence of subjective thought, the demarcation between the self and the collective, poisonous?
(On the other hand, L. Ron Hubbard claims that depression comes from humanity's early ancestor, the clam, and the tension between the desire to open and close its hinge.)
(via del.icio.us:cos)
After Stephen Fry commented that British actors have an unfair advantage in America because Americans mistake British accents for brilliance, the BBC has published a piece on what a British accent gets you in the US. (And, apparently, a "British accent" includes anything from Hugh Grant plumminess to deepest darkest Geordie.)
"For most Americans, there's no distinction between British accents. For us, there's just one sort of British accent, and it's better than any American accent - more educated, more genteel," says Rosina Lippi-Green, a US academic and author of English with an Accent: Language, Ideology and Discrimination in the United States.
"There was a sitcom called Dead Like Me with a Brit [Callum Blue] in it. He was a scruffy, 20-something drug dealer. Even he had that sort of patina - his was not an RP accent, it was a working class London accent."
Katharine Jones, author of Accent of Privilege: English Identities and Anglophilia in the US, says the "educated and cultured" associations have a long history. "British etiquette books have been used for years; and although Americans say they have no class system, they do - and the American upper class apes the British upper class."

Another point the article makes: British expatriates in Australia (where their accent is associated with complaining and being bad at cricket, and where refinement and intelligence have traditionally been associated with weakness and/or metaphorical or literal homosexuality rather than any positive attributes) tend to lose their accents pretty quickly, whereas those in the US (where their accents make them appear intelligent and sophisticated, and often get them preferential treatment) retain theirs. Funny, that.
While it may seem that we live in an age of unprecedented violence and atrocity, according to Steven Pinker, violence has been steadily declining over the past few centuries, and we are now living at the most peaceful time in the history of humanity so far.
The decline of violence, he tells us, is a fractal phenomenon - we see it over the centuries, the decades and the years. That said, we see a tipping point in the 16th century - the age of reason - particularly in England and Holland.
One on one death has plummeted through the middle ages, with an "elbow" of the curve in the 16th century. Despite a slight uptick in the 1960s - "perhaps those who thought that rock and roll would lead to a decline in moral values had it right" - we've seen a two-orders-of-magnitude fall in one on one violence from the middle ages to today. State sponsored violence has also fallen sharply - we've seen a 90% reduction in genocide since the end of the cold war. State on state conflicts are dropping every decade.

Pinker then calls bullshit on the Rousseauvian "noble savage" myth that, in some state of long-lost primordial innocence, our distant ancestors lived in blessed harmony with one another, and that ills such as warfare and violence are the result of the noxious effects of language/capitalism/agriculture/urbanisation.
Until 10,000 years ago, all humans were hunter gatherers. This is the group that some believe lived in primordial harmony - there's no evidence of this. Studying current hunter-gatherer tribes, the percent of male adults who die in violence is extraordinary - from 20 to 60% of all males. Even during the violent 20th century, with two world wars, less than 2% of males worldwide died in warfare.
The Middle Ages were filled with mutilation and torture as routine punishments for transgressions we'd punish with fines today. This was merely another charming feature of a time that featured pastimes like "cat burning", dropping cats into a fire for entertainment purposes. Some of the most creative inventions of the Middle Ages were fantastically cruel forms of corporal punishment.

Pinker offers several reasons for the illusion that violence is increasing and the past was more idyllic: improved communications (we have more awareness of acts of violence, petty and enormous, than people had in earlier centuries), the cognitive illusion that makes memorable events (which include acts of spectacular brutality) seem more common, and the fact that popular standards of what's acceptable are changing faster than behaviour actually is. He also offers four explanations for why violence is becoming less common: the Hobbesian hypothesis (that states with monopolies on violence reduce it), a decline in the belief that life is cheap, the rise of more non-zero-sum games such as international trade, which make potential rivals more valuable alive than dead, and the hypothesis of the "expanding circle":
By default, we empathize with a small group of people, our friends and family. Everyone else is subhuman. But over time, we've seen this circle expand, from village to clan to tribe to nation to other races, both sexes and eventually other species. As we learn to expand our circles wider and wider, perhaps violence becomes increasingly unacceptable.
A team of six specialists assembled by the BBC has discovered the ten steps to true happiness:
The team, which consists of a psychologist, a psychotherapist, two "workplace specialists", a "social entrepreneur" and "Richard Reeves, whose expertise spans philosophy, public policy and economics", has been given the task of increasing the levels of happiness in Slough, a town whose name has stood as a byword for post-industrial alienation and the dehumanising effects of modernity since John Betjeman penned his famous ode to the place in 1937.
- Plant something and nurture it
- Count your blessings - at least five - at the end of each day
- Take time to talk - have an hour-long conversation with a loved one each week
- Phone a friend whom you have not spoken to for a while and arrange to meet up
- Give yourself a treat every day and take the time to really enjoy it
- Have a good laugh at least once a day
- Get physical - exercise for half an hour three times a week
- Smile at and/or say hello to a stranger at least once each day
- Cut your TV viewing by half
- Spread some kindness - do a good turn for someone every day
Culture-bound syndrome of the day: "Paris Syndrome". This is a condition affecting Japanese tourists who travel to Paris, romantic scenes from Amélie in their minds, only to discover that the city is considerably dirtier and—shock, horror—full of very rude people. This shock can cause a psychiatric breakdown:
An encounter with a rude taxi driver, or a Parisian waiter who shouts at customers who cannot speak fluent French, might be laughed off by those from other Western cultures.
But for the Japanese - used to a more polite and helpful society in which voices are rarely raised in anger - the experience of their dream city turning into a nightmare can simply be too much.
This year alone, the Japanese embassy in Paris has had to repatriate four people with a doctor or nurse on board the plane to help them get over the shock.

As many as 12 Japanese tourists fall victim to Paris Syndrome each year. The Japanese embassy has established a 24-hour hotline to help those afflicted.
Psychologists Sara Gutierres, Ph.D., and Douglas Kenrick, Ph.D., both of Arizona State University, demonstrated that the contrast effect operates powerfully in the sphere of person-to-person attraction as well. In a series of studies over the past two decades, they have shown that, more than any of us might suspect, judgments of attractiveness (of ourselves and of others) depend on the situation in which we find ourselves. For example, a woman of average attractiveness seems a lot less attractive than she actually is if a viewer has first seen a highly attractive woman. If a man is talking to a beautiful female at a cocktail party and is then joined by a less attractive one, the second woman will seem relatively unattractive.
The strange thing is, being bombarded with visions of beautiful women (or for women, socially powerful men) doesn't make us think our partners are less physically attractive. It doesn't change our perception of our partner. Instead, by some sleight of mind, it distorts our idea of the pool of possibilities.
Our minds have not caught up. They haven't evolved to correct for MTV. "Our research suggests that our brains don't discount the women on the cover of Cosmo even when subjects know these women are models. Subjects judge an average attractive woman as less desirable as a date after just having seen models," Kenrick says.
So the women men count as possibilities are not real possibilities for most of them. That leads to a lot of guys sitting at home alone with their fantasies of unobtainable supermodels, stuck in a secret, sorry state that makes them unable to access real love for real women. Or, as Kenrick finds, a lot of guys on college campuses whining, "There are no attractive women to date."

This effect apparently manifests itself in higher rates of divorce or persistent singleness, as people exposed to quantities of images of attractiveness their brains are not evolutionarily adapted to develop dissatisfaction with actual potential partners.
Mind you, this article is rather male-centric (it's partly a survey of studies, and partly a lament from the head of a Los Angeles PR agency, kvetching bitterly about all the unfeasibly gorgeous women he is surrounded by and how their presence is making his life a hell), and doesn't cover the female perspective; i.e., whether women are bombarded with images of unfeasibly attractive potential male partners, and whether this causes them to feel dissatisfied with actual partners (or potential partners) to the same extent.
Cognitive neuroscience researcher Ogi Ogas describes how he used techniques from neuroscience to win a quiz show, correctly answering questions he did not consciously know the answers to:
Cognitive models developed by my advisor Gail Carpenter suggest that a more effective way to evaluate an intuition is to consider its mnemonic associations. If you can mentally trace some of the cognitive links of an intuition (through a process similar to priming), these links may suggest whether the intuition is meaningfully connected to the correct answer or whether the link is trivial, incidental, or wrong. For example, given the question "Bucharest is the capital of what European country?", you might have an intuition that the answer is Hungary, because the actual capital of Hungary--Budapest--sounds like "Bucharest" and is thus unconsciously linked. In this case, naively following your unexamined intuition would lead you away from the correct response: Romania.
My $250,000 question presented me with a case of pure intuition. "The department store Sears got its start by selling what specific product in its first catalog?" Since pop culture esoterica and business origins are outside my domains of interest, I did not know the answer. But for some reason, even before the four possible answers appeared, I thought of watches. When "watches" turned up as one of the choices, I reflected on it further. I did not feel any certainty. But why did my brain come up with "watches?" ... As I concentrated on my watch intuition, I began to think about railroads. My brain's memory pattern of watches was somehow linked to a memory pattern of railroads, and my railroad memory also evoked a memory of Sears. Though I still could not work out the explicit connection between watches and Sears, I satisfied myself that "watches" had some deep mnemonic relationship to both railroads and Sears--perhaps at some point in my life I had read that Sears originally delivered their watch catalogs by railroad?
Later, in the tranquility of my apartment, I discovered that 23-year-old railroad station agent Richard Sears sold watches to other station agents along the Minneapolis and St. Louis Railway for a full year before meeting up with Alvah C. Roebuck. I never did discover how this obscure factoid had left its faint trace upon my brain.
A new book by a US sociologist examines the phenomenon of bed sharing, which has, so far, been overlooked by science:
In researching his book, Dr. Rosenblatt said even though many couples said they slept better alone, they still shared a bed. "When I asked why, they looked at me as if I'd asked them why they keep breathing," he said.
The subjects he interviewed invariably had their own side of the bed, and responsibilities like putting out the cat or opening the windows before turning in. They usually had rituals like watching the television news before lights out or snuggling before falling to sleep. And they often had signals for when they wanted affection, wanted to talk or wanted to be left alone.
"How they arrived at these systems could be said to mirror their relationships," said Dr. Rosenblatt. The most successful systems were those formed out of compromise and sensitivity to the other's needs.
The Daily Telegraph has some excerpts from a book on the psychology of the joke:
Jerry Seinfeld compares telling a joke to attempting to leap a metaphorical canyon, taking the audience with him. The set-up is the nearside cliff, and the punchline is the far side. If they're too far apart, the listeners don't make it to the other side. And if they are too close together, the audience just steps across the gap without experiencing any exhilarating leap. The joke-hearer gets far more pleasure from the joke if he or she has to do a little work.
The surprise mechanism doesn't work without effective timing. It's almost impossible to explain in print because our eyes always skip ahead to the punchline before the set-up is properly digested. But next time you listen to a comedian, listen to the pauses. They're not that funny on their own - obviously, they're just tiny silences - but the point is, neither are the jokes.
It's not that long, wordy jokes can't be funny, but if too much is explained, there's no logical leap for the audience to make, and the paradigm shift which elicits laughter is lost.
Compare: I'm not a homosexual. Mind you, I might be mistaken for one if I went to the north of England. In places like Newcastle, there's such a culture of macho posturing that they go out in their shirtsleeves in all kinds of weather, so if you wear a coat they think you're gay.
And: I'm not gay. Unless you're from Newcastle and by 'gay' you mean, 'owns a coat'.
In RPS circles a common mantra is "Rock is for Rookies", because males have a tendency to lead with Rock on their opening throw. It has a lot to do with the idea that Rock is perceived as "strong" and "forceful", so guys tend to fall back on it. Use this knowledge to take an easy first win by playing Paper. This tactic is best used in pedestrian matches against someone who doesn't play much, and generally won't work in tournament play.
When playing with someone who is not experienced at RPS, look out for double runs - in other words, the same throw twice in a row. When this happens, you can safely eliminate that throw and guarantee yourself at worst a stalemate in the next game. So, when you see a two-Scissors run, you know their next move will be Rock or Paper, so Paper is your best move. Why does this work? People hate being predictable, and the perceived hallmark of predictability is to come out with the same throw three times in a row.
When playing against someone who asks you to remind them of the rules, take the opportunity to subtly "suggest a throw" as you explain, by physically showing them the throw you want them to play; i.e., "Paper beats Rock, Rock beats Scissors (show Scissors), Scissors (show Scissors again) beats Paper." Believe it or not, when people are not paying attention, their subconscious mind will often accept your "suggestion". A very similar technique is used by magicians to get someone to take a specific card from the deck.
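The "Rock is for Rookies" opener and the double-run rule amount to a simple decision procedure; here's a minimal sketch in Python (the function and dictionary names are my own, not from any RPS canon):

```python
import random

# What each throw defeats: Rock beats Scissors, Paper beats Rock,
# Scissors beats Paper.
DEFEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def next_throw(opponent_history):
    """Suggest a throw using the two heuristics described above.

    Opening: play Paper, on the assumption that novices lead with Rock
    ("Rock is for Rookies").  Double run: after seeing the same throw
    twice in a row, assume the opponent will switch; playing the throw
    their doubled throw defeats then beats one of their two remaining
    options and draws with the other.  Otherwise, fall back to a
    random throw.
    """
    if not opponent_history:
        return "paper"
    if len(opponent_history) >= 2 and opponent_history[-1] == opponent_history[-2]:
        # e.g. after a two-Scissors run, play Paper: it beats Rock and
        # draws with Paper, the opponent's two likely choices.
        return DEFEATS[opponent_history[-1]]
    return random.choice(list(DEFEATS))
```

The double-run branch works because RPS is a cycle: if the opponent abandons throw X, their next throw either loses to or draws with the throw X defeats, so that throw can never lose.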
Research published yesterday shows that men who went to single-sex schools are significantly more likely to be divorced by their early forties. Which, presumably, is a result of a lack of socialisation with the opposite sex in adolescence damaging the relationship skills men need to sustain long relationships.
Figures show 37 per cent of men from boys-only comprehensives are divorced by the time they reach the age of 42 - compared with just 28 per cent from co-educational schools.

Interestingly enough, women who went to single-sex schools do not suffer any similar effect.
Organisers of the Bestival festival on the Isle of Wight asked attendees to come dressed as clowns, but have had to change the theme after a number of ticket holders said they had a fear of clowns:
Organisers have suggested people instead turn up in "bunny ears, a Spam tin outfit, an astronaut's helmet, a witch's hat or just a plain old Buzz Lightyear lycra all-in-one" for the concerts which include performances by the Pet Shop Boys and the Scissor Sisters.
Coulrophobia - fear of clowns - can cause panic attacks, shortness of breath, rapid breathing, irregular heartbeat, sweating, nausea and overall feelings of dread.

What I'm wondering is: did anyone suffer from coulrophobia before John Wayne Gacy and the emergence of the modern psycho-killer-clown meme?
A new study has shown that, whilst self-sacrifice and discipline can make you feel proud in the short term, in the long run, you regret it:
In another experiment, students who'd just come back from their break were polled. The ones who'd partied it up regretted their actions -- while those who studied were virtuously smug. But when asked to recall the spring break from the previous year, suddenly more students regretted their choice not to party. When alumni were asked to recall their spring breaks of 40 years ago, the results were even starker: Those who hadn't been doing beer shots out of a barber's chair were stricken with remorse.
Why the reversal? Why do we opt for virtue in the short term, but prefer vice in the long? The reason, the researchers suggest, lies in the mechanics of guilt: it's an intense and painful emotion in the here and now, but fades over time. As they write:

Whereas guilt is an acute, hot emotion, missing out is a colder, contemplative feeling. Therefore, indulgence guilt is expected to predominate in the temporal proximity of the relevant self-control choice, but subsequently diminish over time.

Now there's a conclusion that will deeply freak out social conservatives.
The study was done in America in 2006; I wonder whether a similar study done in a culture (say, Japan) with a stronger sense of duty would yield different results.
More evidence of neoteny being a characteristic of evolutionary advancement: as coping with the modern world requires more flexibility, immaturity levels in adults are rising. Which sounds alarming, until you consider that "maturity" (and the nebulous "wisdom" that comes with it) is a sclerotic, set-in-one's-ways inflexibility and resistance to change, which no longer cuts it:
"The psychological neoteny effect of formal education is an accidental by-product -- the main role of education is to increase general, abstract intelligence and prepare for economic activity," he explained. "But formal education requires a child-like stance of receptivity to new learning, and cognitive flexibility."
"When formal education continues into the early twenties," he continued, "it probably, to an extent, counteracts the attainment of psychological maturity, which would otherwise occur at about this age."
While the human mind responds to new information over the course of any individual's lifetime, Charlton argues that past physical environments were more stable and allowed for a state of psychological maturity. In hunter-gatherer societies, that maturity was probably achieved during a person's late teens or early twenties, he said.
"By contrast, many modern adults fail to attain this maturity, and such failure is common and indeed characteristic of highly educated and, on the whole, effective and socially valuable people," he said.

Some of the symptoms of neoteny include novelty-seeking, which ties in with the possibility of a "neophilia gene" previously mentioned here. In fact, if there were a genetic mutation that caused neophilia, the abovementioned article suggests that, in today's environment, it would be strongly selected for.
More on the subject of happiness and its exact nature: Edge.org talks to Daniel Gilbert, a researcher on the subject (he is director of Harvard's Hedonic Psychology Laboratory):
My research with Tim Wilson shows that when people try to simulate future events -- and to simulate their emotional reactions to those events -- they make systematic errors. Modern people take the ability to imagine the future for granted, but it turns out that this is one of our species' most recently acquired abilities -- no more than three million years old. The part of our brain that enables us to simulate the future is one of nature's newest inventions, so it isn't surprising that when we try to use this new ability to imagine our futures, we make some rookie errors. The main error, of course, is that we vastly overestimate the hedonic consequences of any event. Neither positive nor negative events hit us as hard or for as long as we anticipate.
We're all told that variety is the spice of life. But variety is not just over-rated, it may actually have a cost. Research shows that people do tend to seek more variety than they should. We all think we should try a different doughnut every time we go to the shop, but the fact is that people are measurably happier when they have their favorite on every visit -- provided the visits are sufficiently separated in time.
Those last four words are the important ones. If you had to eat 4 doughnuts in rapid succession, variety would indeed spice up your experience and you'd be wise to seek it. But if you had to eat 4 doughnuts on 4 separate Mondays, variety would lower your overall enjoyment. The human brain has tremendous difficulty reasoning about time, and thus we tend to seek variety whether the doughnuts are separated by minutes or months.
Even in a technologically sophisticated society, some people retain the romantic notion that human unhappiness results from the loss of our primal innocence. I think that's nonsense. Every generation has the illusion that things were easier and better in a simpler past, but the fact is that things are easier and better today than at any time in human history.
Our primal innocence is what keeps us whacking each other over the head with sticks, and it is not what allows us to paint a Mona Lisa or design a space shuttle. It gives rise to obesity and global warming, not Miles Davis or the Magna Carta. If human kind flourishes rather than flounders over the next thousand years, it will be because we embraced learning and reason, and not because we surrendered to some fantasy about returning to an ancient Eden that never really was.
A new study has confirmed that people like the music that other people like, rather than judging it objectively:
Researchers created an artificial "music market" of 14,341 participants drawn from a teen-interest Web site. Upon entering the study's Internet market, the participants were randomly, and unknowingly, assigned to either an "independent" group or a "social influence" group. Participants could then browse through a collection of unknown songs by unknown bands.
In the independent condition, participants chose which songs to listen to based solely on the names of the bands and their songs. While listening to the song, they were asked to rate it from one star ("I hate it") to five stars ("I love it"). They were also given the option of downloading the song for keeps.
In the social influence group, participants were provided with the same song list, but could also see how many times each song had been downloaded.
Researchers found that, in the social influence group, popular songs became more popular and unpopular songs remained unpopular, largely regardless of their quality as established by the independent group. They also found that as a particular song's popularity increased, participants selected it more often.
So what drives participants to choose low-quality songs over high-quality ones? "People are faced with too many options, in this case 48 songs. Since you can't listen to all of them, a natural shortcut is to listen to what other people are listening to," Salganik said. "I think that's what happens in the real world where there's a tremendous overload of songs."

Which certainly explains [insert massively popular yet mediocre artist in the genre of your choice here].
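The snowballing dynamic the study describes is a cumulative-advantage process, and it's easy to see in a toy simulation. The following sketch is purely illustrative - the parameters and weighting scheme are invented, not the study's actual methodology:

```python
import random

def simulate_market(quality, listeners, social=True, seed=0):
    """Toy cumulative-advantage model of the song market described above.

    Each listener downloads one song.  In the 'independent' condition
    (social=False), choices are driven by song quality alone; in the
    'social influence' condition, the running download counts are added
    to each song's appeal, so early random fluctuations snowball into
    large popularity differences.
    """
    rng = random.Random(seed)
    downloads = [0] * len(quality)
    for _ in range(listeners):
        appeal = [q + (downloads[i] if social else 0)
                  for i, q in enumerate(quality)]
        # Pick a song with probability proportional to its appeal.
        pick = rng.choices(range(len(quality)), weights=appeal)[0]
        downloads[pick] += 1
    return downloads
```

Re-running the social condition with different seeds tends to produce different winners, while the independent condition tracks quality much more closely - which is essentially the study's point.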
The Mind Hacks blog has a report of an interesting study on subliminal influence:
You go to the supermarket and stop by some shelves offering French and German wine. You buy a bottle of French wine. After going through the checkout you are asked what made you choose that bottle of wine. You say something like "It was the right price", or "I liked the label". Did you notice the French music playing as you took it off the shelf? You probably did. Did it affect your choice of wine? No, you say, it didn't.
That's funny because on the days we play French music nearly 80% of people buying wine from those shelves choose French wine, and on the days we play German music the opposite happens.

The study in question used stereotypical examples of national music (French accordion music and German "oom-pah" band music), yielding the results mentioned; the technique is effective primarily due to its subtlety. It would not be enough to make someone not intending to buy wine buy some, but it is enough to influence the choice of wine.
What would be the effect, I wonder, of having someone stand by the shelves saying to the customers as they passed "Why don't you buy a French wine today"? My hunch is that you'd make people think about their decision a lot more - just by trying to persuade them you'd turn the decision from a low involvement one into a high involvement one. People would start to discount your suggestion. But the suggestion made by the music doesn't trigger any kind of monitoring. Instead, the authors of this study believe, it triggers memories associated with the music - preferences and frames of reference. Simply put, hearing the French music activates ideas of 'Frenchness' - maybe making customers remember how much they like French wine, or how much they enjoyed their last trip to France. For a decision which people aren't very involved with, with low costs either way (both the French and German wines are pretty similar, remember, except for their nationality) this is enough to swing the choice.
John Birmingham puts forward the case that the political right pretty much has a monopoly on humour: the left has become too puritanical and politically correct to laugh, and the voices that dare to be outrageous are predominantly right-wing, from shock-jocks and reactionary bloggers to institutions like VICE Magazine (infamously offending the uptight by pejoratively calling things "gay") and the creators of South Park and Team America (who skewered Hollywood liberals and left-wing sanctimony alike).
Of course, this relies on a rather broad definition of "right-wing", as anything that goes against a doctrinaire liberal/progressive view of propriety and "political correctness". By this token, one would classify Coco Rosie as a right-wing band, placing them in the same ideological milieu as Pat Robertson and Little Green Footballs, because one of their number attended "Kill Whitey" parties. And while VICE's Gavin McInnes claimed in American Conservative to represent a hip new conservatism (a view he later retracted, claiming he was joking/being ironic), the cocaine-snorting, nihilistic libertinism epitomised in the magazine, as much as it may offend "liberals" (or straw-man caricatures thereof), hardly fits well with the canon of conservatism and its emphasis on values, tradition and authority. However, it does fit in with the recently noted shift towards Hobbesian nihilism and radical individualism.
On a tangent: some American conservatives are concerned about FOXNews' alarming slide to the radical left; the channel, once the shining beacon of all things Right-thinking, has been compromising its Fair And Balanced™ reputation by running programmes on topics such as global warming. Pundits blame the influx of liberally-inclined ex-CNN reporters, the staffers having spent too long in Godless New York, away from the Biblical certainties of the Red States, or Murdoch not really being "One Of Us", but rather a cynical opportunist.
And finally, a study on the neurology of political belief has shown that True Believers of both stripes are adept at ignoring facts which don't jibe with their beliefs, and experience a rush in the reward centres of the brain when they do:
"We did not see any increased activation of the parts of the brain normally engaged during reasoning," said Drew Westen, director of clinical psychology at Emory University. "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts."
The test subjects on both sides of the political aisle reached totally biased conclusions by ignoring information that could not rationally be discounted, Westen and his colleagues say. Then, with their minds made up, brain activity ceased in the areas that deal with negative emotions such as disgust. But activity spiked in the circuits involved in reward, a response similar to what addicts experience when they get a fix, Westen explained.
The New York Times has a long and interesting article on the Japanese phenomenon of hikikomori, or of young Japanese dropping out of society and shutting themselves in their rooms for months at a time, emerging only to go to convenience stores at night or not at all:
A leading psychiatrist claims that one million Japanese are hikikomori, which, if true, translates into roughly 1 percent of the population. Even other experts' more conservative estimates, ranging between 100,000 and 320,000 sufferers, are alarming, given how dire the consequences may be. As a hikikomori ages, the odds that he'll re-enter the world decline. Indeed, some experts predict that most hikikomori who are withdrawn for a year or more may never fully recover. That means that even if they emerge from their rooms, they either won't get a full-time job or won't be involved in a long-term relationship. And some will never leave home. In many cases, their parents are now approaching retirement, and once they die, the fate of the shut-ins - whose social and work skills, if they ever existed, will have atrophied - is an open question.
In other societies the response from many youths would be different. If they didn't fit into the mainstream, they might join a gang or become a Goth or be part of some other subculture. But in Japan, where uniformity is still prized and reputations and outward appearances are paramount, rebellion comes in muted forms, like hikikomori. Any urge a hikikomori might have to venture into the world to have a romantic relationship or sex, for instance, is overridden by his self-loathing and the need to shut his door so that his failures, real or perceived, will be cloaked from the world.
By Japanese standards, his room was enormous, with a wall of delicate shoji screens leading to a rock garden. But it was hard to imagine what he did there all day. There were no stacks of manga, the popular Japanese comic books, no DVD's, no computer games, all things found in the rooms of most hikikomori. The TV was broken, and the hard drive was missing from his computer. There were a few papers on his desk, including a newsletter from New Start that Kawakami brought on her last visit. Otherwise, the only evidence that this was a hikikomori's room were three holes in the wall - the size of fists. Shut-ins often describe punching their walls in a fit of anger or frustration at their parents or at their own lives. The holes were suggestive too of the practice of "cutting" among American adolescent girls. Both acts seemed to be attempts to infuse feeling into a numb life.
By the time parents seek help, often their child has been shut in for a year or more. "When they call," Dr. Saito said, "I offer them three choices: 1) Come to me for counseling; 2) Kick your child out; 3) Accept your child's state and be prepared to take care of him for the rest of your life. They choose Option 1." He also offers poignantly simple parenting tips, like not leaving dinner at a child's doorstep. "You make dinner and call him to the table, and if he doesn't come then let him fend for himself." In addition to meals, parents often provide monetary allowances for their adult child, and in rare cases, if a child has become verbally or physically abusive, parents move out, leaving their home to the shut-in.

Parents of hikikomori now have support programmes to turn to, including volunteers known as "rental sisters", who try to befriend their children and coax them out of their rooms and into support centres, often over months or years.
There are multiple theories trying to explain the hikikomori phenomenon, but several frame it as a conscious rejection of the high pressure to conform and succeed placed on individuals in Japanese society; a conscious, if not particularly sustainable, decision to drop out of the traditional school-university-work career path.
On a similar note, Momus' latest piece in Wired News celebrates Japan's aging population and embrace of the "slow life".
A US professor of psychology has found the difference between English and American smiles. Apparently, Americans smile naturally and warmly, whereas an English smile is a suppressed grimace, or a signal of acquiescence to hierarchy (which, apparently, is internalised in the English national character):
Keltner hit upon this difference in national smiles by accident. He was studying teasing in American fraternity houses and found that low-status frat members, when they were teased, smiled using the risorius muscle - a facial muscle that pulls the lips sideways - as well as the zygomatic major, which lifts up the lips. It resulted in a sickly smile that said, in effect, I understand you must paddle me, brother, but not too hard, please. Several years later, Keltner went to England on sabbatical and noticed that the English had a peculiar deferential smile that reminded him of those he had seen among the junior American frat members. Like the frat brothers', the English smile telegraphed an acknowledgment of hierarchy rather than just expressing pleasure.
The first-hand account from a waiter of how he seduced an entire table of women into ordering desserts, coming up against concerted resistance and coming out triumphant.
A hoaxer in the US Midwest has reprised the Milgram obedience experiments by calling fast-food restaurants posing as a police officer and instructing managers to strip-search employees, subjecting them to bizarre and degrading ordeals. The managers in question, being selected for unthinking obedience, never realised that anything was wrong, accepting "Officer Scott"'s authoritative tone of voice, stated reasons and the sounds of police radios in the background as sufficient reason to start obeying, and the fact that they were already obeying as sufficient reason to keep doing so, up to committing rape.
On May 29, 2002, a girl celebrating her 18th birthday -- in her first hour of her first day on the job at the McDonald's in Roosevelt, Iowa -- was forced to strip, jog naked and assume a series of embarrassing poses, all at the direction of a caller on the phone, according to court and news accounts.
He had mastered the police officer's calm but authoritative demeanor. He sprinkled law-enforcement jargon into every conversation. And he did his homework. He researched the names of regional managers and local police officers in advance, and mentioned them by name to bolster his credibility. He called some restaurants in advance, somehow getting names and descriptions of victims so he could accurately describe them later.
In her book, "Making Fast Food: From the Frying Pan into the Fryer," Canadian sociologist Ester Reiter concludes that the most prized trait in fast-food workers is obedience. "The assembly-line process very deliberately tries to take away any thought or discretion from workers," said Reiter, who teaches at Toronto's York University and who spent 10 months working at a Burger King as part of her research. "They are appendages to the machine."

Several people who followed orders were jailed for rape and related crimes. The hoaxer was later found to be a 38-year-old prison guard with a fantasy of being a police officer. Meanwhile, one of the victims is suing McDonalds for allowing this to happen; McDonalds blames her for not reading the employee manual, where it said that strip searches were prohibited, and for not recognising that the caller wasn't a real police officer.
A New York Times article hypothesises on the significance of stuffed toys lashed to the front of trucks:
Robert Marbury, an artist who photographed dozens of Manhattan bumper fauna for a project in 2000 (see urbanbeast.com/faq/strapped.html), said he had once asked a trash hauler why he had a family of three mismatched bears strapped to his rig. "He said: 'Yo, man, I drive a garbage truck. How am I going to get the ladies to look at me?' " Mr. Marbury recalled.
Monroe Denton, a lecturer in art history at the School of Visual Arts, traced the phenomenon's roots back to the figureheads that have animated bows of ships since the time of the pharaohs. "There was some sort of heraldic device to deny the fact of this gigantic machine," he said. "You would have these humanizing forms, anthropomorphic forms - a device that both proclaims the identity of the machine and conceals it."
"There's a transference in this," she said. "There's this soft, flesh-and-bone sanitation worker, who knows very well they could be crushed against this truck. The creature could be the sanitation worker in a very dangerous position, so the animal could be a stand-in."
"Binding a soft thing to a very powerful truck - there's a kind of macho thing about that," she said.
Scooby's story lends credence to the theory of Mr. Denton, the art historian, that the grille-mounted stuffed animal draws from the same well as the "abject art" movement that flourished in the 1990's and trafficked heavily in images of filth and of distressed bodies.
In 1950, a book titled The Authoritarian Personality advanced the claim that, far from being alien, fascist tendencies were commonplace in American society. The book is best known for the "F Scale", a test of how inclined one would be, should the opportunity present itself, to don the jackboots of authoritarianism. The test consisted of a number of multiple-choice questions, with the answers added to give a score; in that sense, it's an ancestor of numerous OKCupid tests and LiveJournal memes.
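The additive scoring described above is about as simple as psychometrics gets: rate each statement on an agreement scale, then sum the ratings. A minimal sketch (the scale bounds and example ratings are placeholders of my own, not the real F Scale items or norms):

```python
def f_scale_score(ratings, low=1, high=7):
    """Sum per-item agreement ratings into a single score.

    Each rating is a number on an agreement scale (here 1-7);
    higher totals indicate stronger agreement with the test's
    statements, i.e. stronger authoritarian leanings.
    """
    if any(not (low <= r <= high) for r in ratings):
        raise ValueError("rating out of range")
    return sum(ratings)

# A respondent rating six placeholder statements:
score = f_scale_score([6, 5, 7, 2, 4, 6])
```

This is also, structurally, all an OKCupid test or LiveJournal meme does: map answers to numbers and add them up.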
The F Scale is not perfect: for one, it focuses almost exclusively on a strain of traditionalist, right-wing authoritarianism, ignoring other strains, such as Soviet-style social engineering. (This could be because one of the authors was the famous Marxist critical theorist Theodor Adorno, and/or because authoritarian utopianism à la Lenin never had more than niche popularity in the US, where the research was carried out.) However, according to this article, it's more relevant today than it was when it was written:
In the June 19, 2005, issue of The New York Times Magazine, the journalist Russell Shorto interviewed activists against gay marriage and concluded that they were motivated not by a defense of traditional marriage, but by hatred of homosexuality itself. "Their passion," Shorto wrote, "comes from their conviction that homosexuality is a sin, is immoral, harms children and spreads disease. Not only that, but they see homosexuality itself as a kind of disease, one that afflicts not only individuals but also society at large and that shares one of the prominent features of a disease: It seeks to spread itself." It is not difficult to conclude where those people would have stood on the F scale.
Consider the case of John R. Bolton, now our ambassador to the United Nations. While testifying about Bolton's often contentious personality, Carl Ford Jr., a former head of intelligence within the U.S. State Department, called him "a quintessential kiss-up, kick-down sort of guy." Surely, in one pithy sentence, that perfectly summarizes the characteristics of those who identify with strength and disparage weakness. Everything Americans have learned about Bolton -- his temper tantrums, intolerance of dissent, and black-and-white view of the world -- steps right out of the clinical material assembled by the authors of The Authoritarian Personality.
One item on the F scale, in particular, seems to capture in just a few words the way that many Christian-right politicians view the world in an age of terror: "Too many people today are living in an unnatural, soft way; we should return to the fundamentals, to a more red-blooded, active way of life."
(via ALDaily)