The Null Device
Posts matching tags 'society'
Observing political debate, I have noticed a trope that keeps recurring, particularly (these days) on the Right. I'll call it the Gordian Knot Delusion. It says, in essence, “the so-called experts/eggheads/‘intellectuals’ keep going on about how complex things are, but they're liars. When you get down to it, things really are simple.” (There is an implicit “Watch this!” after that, as the speaker purports to bulldoze their way through some issue that namby-pamby liberals and ivory-tower boffins have been wringing their hands ineffectually over, like the two-fisted, lantern-jawed hero of one of those old sci-fi paperbacks the Sad Puppies lament aren't being written any more.) An example of the Gordian Knot Delusion, on that favourite subject of taxes/economics, recently manifested itself in the following tweet from Conservative commentator Daniel Hannan:
If tobacco taxes disincentivise smoking, and petrol taxes disincentivise driving, what do you suppose income taxes do?— Daniel Hannan (@DanielJHannan) September 4, 2016

It does not need to be pointed out that this is an extremely simplistic argument, more an act of trolling (in its original sense of seeking to provoke a pile-on of responses) than a serious inquiry in good faith; at least, if one assumes that the author is not a simpleton. It achieved its aim, in that others piled on with rebuttals on the same level, along the lines of “if olive oil is made of olives, what is baby oil made of?”. But if one takes the premise beneath it at face value, or at least treats it as something more meaningful than wordplay, the Gordian Knot Delusion comes through. Taxes disincentivise prosperity, it implies, without qualification; cut taxes to the bone and watch prosperity take off like a rocket. And ignore the tweedy, elbow-patched fellow there saying that it's more complicated than that; the man looks like a commie, and is probably after your piece of the pie.
The Gordian Knot Delusion, the idea that things are simpler than they are claimed to be, is trotted out by amateur spectators in a lot of fields. Economics is a big one: witness the “common-sense” idea that national economies work like household budgets, with a largely fixed income that is unaffected by the level of spending. By this token, one can believe that deficit spending is inherently irresponsible and austerity is, in itself, good economic housekeeping. (This, of course, falls apart when one considers that economic activity generates wealth, and that savings at rest have no economic impact, but it feels enough like common sense that one can persuade oneself that these objections are sophistry by ivory-tower eggheads, Marxists and moochers.) Ecology and the environment are another area: nobody can see global warming, and even when its effects are visible, one can believe that the evidence is still out (or, once it isn't, that it's too late to do anything, so crank your air conditioning up and enjoy the ride); and as for that habitat of endangered newts the hippies are protesting about, let's just drive a motorway through it and see what happens; betcha that everything will be alright. The bees are dying off? Who cares about a buncha dumb bugs! The coral reefs are too? The tabloids say they're not. And if they are, so what?
And then there's modern society in general: gender-neutral job titles and ladies wearing trousers and lactose-free milk in the supermarket, oh my! Your son, who used to be your daughter, is taking medication for ADHD, your other daughter has a girlfriend, your boss wears a nose ring, and the golliwog doll from your childhood is now a potential hate crime. In the good old days, these things didn't exist, or if they did, they were hammered flat like a lump under the rug; people accepted their lot in life, and, as the refrain goes, everything was alright. (One part of this is the myth that these complex conditions, from gluten intolerance to gender dysphoria, don't actually exist, but are made up by an unholy alliance of bureaucrats, drug companies, the liberal media and people who want to feel like special snowflakes; the corollary: were it not for the conspiracy, a sharp clip around the ear would sort them out just as well.)
At its core, the Gordian Knot Delusion is an application of the 80/20 rule to the modern world at large; the belief that complexity is superfluous, and that rather than fretting over it, one should just stride over and cut the knot, deciding that the world is actually simple; witnessing the lack of an immediate catastrophe, one will find one's common sense and derring-do vindicated. (The original Gordian Knot was cut by that gung-ho man of action, Alexander the Great, which is always a flattering comparison.) The other part of the Gordian Knot Delusion is the stab-in-the-back narrative of how the world started to look deceptively complex. As the paranoiac's dictum goes, shit doesn't just happen, but is caused by assholes; in this case, all that talk about how complex things are is the work of a conspiracy; a motley crew of commie traitors, ivory-tower academics, so-called “intellectuals” corrupted by book-learning, miscellaneous perverts, Satanic cultists and out-and-out crooks and thieves out to keep the gravy-train of complexity going, all the better to steal from the simple honest folks. (The trope about climate change being a massive fraud for the purpose of maintaining funding for otherwise worthless research is a classic of the genre.) It is, as conspiracy theories tend to be, a compelling story, especially for those who feel themselves bewildered or victimised by the world.
Whilst ostensibly associated with the Right these days, the Gordian Knot Delusion is actually the very antithesis of Edmund Burke's Conservatism, formulated in the wake of that catastrophic leftist severing of this knot, the French Revolution. Burke's argument (framing Conservatism for a world where the divine right of kings was no longer accepted and the University of Chicago School of Economics had yet to come into being and coin its modern analogue, trickle-down economics) was that things are much more complicated than one can comprehend, that bold attempts to destroy ancient injustices are also likely to have countless unintended consequences, and that one should stick to gradual, tentative reforms at best, if not to just give up and learn to live with the world as it is in all its richness and iniquity. Today, one might expect to hear that sort of argument, but only from a hair-shirted greenie warning against tampering with Mother Gaia's blessings. The Robespierres of the Right are all too happy to break things and observe that, on a macro level, everything is alright (whilst circularly classifying those for whom they are not alright as bums and sore losers). These radicals are in alliance with a growing number of people who are anything but radical in temperament, but who have been radicalised by the rapid pace of change, and for whom the idea of turning back the clock to (what in retrospect seems like) a simpler time has appeal. The shift of the Gordian Knot from the Left to the Right could be a result of the increasingly rapid pace of social and technological change.
An interesting article on the artificial and constructed nature of neoliberalism, the supposedly neutral post-ideological ideology, touted as the mirror of that which is and cannot but be:
The public at large would be hard pressed to know that this strange doctrine of economics actually had a place of origin in the Mont Pèlerin Society, which was the brainchild of Friedrich Hayek. Mirowski tells us that one of the keys to Neoliberalism has always been a sort of “double-truth”, the ability to convey both an outer exoteric version of the truth to the public at large, while at the same time conveying an inward esoteric truth to those in the know: what he terms the Neoliberal Thought Collective.

One fundamental contradiction of neoliberalism is that between its assertion of naturalism—the market is the state of nature, to deny it is to fall foul of it, and resistance is useless—and that the specific kinds of market forces, and specific order of society, championed by neoliberalism are anything but natural; in that, neoliberalism diverges from the classical liberalism it purports to be, and becomes a constructivist ideological project; human relations need to be redefined to fit its market-based paradigm, and alternatives need to be bulldozed out of the way, to fit the vested interests of its stakeholders. No alternative must be allowed to challenge this model:
6. Neoliberalism thoroughly revises what it means to be a human person. “Individuals” are merely evanescent projects from a neoliberal perspective. Neoliberalism has consequently become a scale-free Theory of Everything: something as small as a gene or as large as a nation-state is equally engaged in entrepreneurial strategic pursuit of advantage, since the “individual” is no longer a privileged ontological platform. Second, there are no more “classes” in the sense of an older political economy, since every individual is both employer and worker simultaneously; in the limit, every man should be his own business firm or corporation; this has proven a powerful tool for disarming whole swathes of older left discourse. Third, since property is no longer rooted in labor, as in the Lockean tradition, consequently property rights can be readily reengineered and changed to achieve specific political objectives; one observes this in the area of “intellectual property,” or in a development germane to the crisis, ownership of the algorithms that define and trade obscure complex derivatives, and better, to reduce the formal infrastructure of the marketplace itself to a commodity. Indeed, the recent transformation of stock exchanges into profit-seeking IPOs was a critical neoliberal innovation leading up to the crisis. Classical liberals treated “property” as a sacrosanct bulwark against the state; neoliberals do not. Fourth, it destroys the whole tradition of theories of “interests” as possessing empirical grounding in political thought.
7. Neoliberals extol “freedom” as trumping all other virtues; but the definition of freedom is recoded and heavily edited within their framework. It is economic freedom only.

Of course, imposing the economic freedom of those with the means to exercise it requires considerable effort, with significant proportions of the economy being employed as guard labour:
12. The neoliberal program ends up vastly expanding incarceration and the carceral sphere in the name of getting the government off our backs.

Friedrich Hayek himself, as much as he cautioned against the threat of Communist serfdom (which would be an inevitable consequence of state intervention in the economy), was not a fan of democracy, seeing it at worst as a problem to be mitigated, and at best a placebo button, giving those at the top of the pyramid a veil of legitimacy that, say, the Bourbons and Romanovs didn't have:
Hayek, no friend of democracy, looked upon this closed society of elite intellectuals as if it were a new Platonic Academy. In fact he saw democracy as a hindrance to the neoliberal world view, saying to his friend Bertrand de Jouvenel, “I sometimes wonder whether it is not more than capitalism this strong egalitarian strain (they call it democracy) in America which is so inimical to the growth of a cultural elite.”

Meanwhile, it turns out that the neoliberal order is not particularly good for the mental health of those subjected to it, or so argues that perennial Cassandra of the Left, George Monbiot, citing a new book by a Belgian psychoanalyst, Paul Verhaeghe. (No idea whether he identifies as a Marxist psychoanalyst, a peculiar school of thought which, these days, seems to outnumber Marxist economists.)
The market was meant to emancipate us, offering autonomy and freedom. Instead it has delivered atomisation and loneliness. The workplace has been overwhelmed by a mad, Kafkaesque infrastructure of assessments, monitoring, measuring, surveillance and audits, centrally directed and rigidly planned, whose purpose is to reward the winners and punish the losers. It destroys autonomy, enterprise, innovation and loyalty, and breeds frustration, envy and fear. Through a magnificent paradox, it has led to the revival of a grand old Soviet tradition known in Russian as tufta. It means falsification of statistics to meet the diktats of unaccountable power.
These shifts have been accompanied, Verhaeghe writes, by a spectacular rise in certain psychiatric conditions: self-harm, eating disorders, depression and personality disorders. Of the personality disorders, the most common are performance anxiety and social phobia: both of which reflect a fear of other people, who are perceived as both evaluators and competitors – the only roles for society that market fundamentalism admits. Depression and loneliness plague us.

And here is a piece by the BBC's esoteric historian, Adam Curtis, about the men who brought Hayek's neoliberalism—and its vector, the pseudo-academic, ideological PR agency known as the think tank—to Britain; a tale also involving legendary pirate radio station Radio Caroline, rock star turned gonzo anti-politician Screaming Lord Sutch, and the small-'l' libertarianism of the Sixeventies:
The Think Tank that Antony Fisher set up was very different. It had no interest in thinking up new ideas because it already knew the "truth". It already had all the ideas it needed laid out in Professor Hayek's books. Its aim instead was to influence public opinion - through promoting those ideas. It was a big shift away from the RAND model - you gave up being the manufacturing dept for ideas and instead became the sales and promotion dept for what Hayek had bluntly called "second-hand ideas".
Reg Calvert was part of an old, unruly tradition of true independence and libertarian freedom. A real buccaneer who would ignore rules and the structure of class and power in Britain while merrily going his own way. Smedley on the other hand was a "privateer" only to the extent that he wanted to bring the private sector back to power in Britain. Other than that he wanted the traditional power structure to remain the same. And to do this he (and his Think Tank) wanted to reinvent the free market as a managed system - managed by them, and any true "privateer" - like Reg - who challenged that power was doomed.

If one sees the epoch from the youthquake of the Sixeventies to the triumph of Reagan and Thatcher's free-market ideology and a globalised quasi-feudal corporatism with the levers of power far out of the reach of the populace as one revolution, it could be argued that this was another case of a revolution (in this case, loosely defined as a period of phase transition or instability, in which old certainties come undone and the future is up for grabs) being seized by a well-organised faction with its own Nietzschean will to power. In this case, the Bolsheviks are the interests of concentrated wealth and power, their ability to exercise their power limited by things such as Roosevelt's New Deal, or even the post-Enlightenment doctrines of universal human rights, and the hippies (in the loose sense of the word) are the chumps who kicked off the revolution and then got it taken from them by someone far more devious.
On occasion of a Women In Rock mini-festival on Melbourne radio station 3CR, Mess+Noise got Ninetynine's Laura Macfarlane and the members of the all-female rock trio Dead River to interview each other:
Laura: Overall I think things with gender equality in music have improved slightly but it still needs more work. There could be more female presence in the technical side of music. For instance there aren’t many female masterers still. It also varies a lot between countries. Ninetynine has played in countries and cities where being a female musician is still a novelty. Those shows always stick out in my memory because usually one female person in the audience will come up and tell you that they really appreciate seeing female musicians. Maybe they were thinking of starting their own band, but hadn’t seen a live band with women in it. It is always special to feel like maybe you have helped encourage other women in some small way.
Laura: Although Ninetynine does not exclusively reference Get Smart, we do like a lot of things people relate to the name, including agent 99. She’s great. We also wanted to use a number as a band name because it can work well in countries where people don’t speak a lot of English. I think The Shaggs would be my favourite ’60s girl group.
Dead River: Despite plenty of evidence that women are capable and creative masters of their instruments and gear (PJ Harvey, Savages, Kim Gordon, to name a few), there are prevailing paternalistic attitudes that continue to undermine women in music. I’m sure many female musicians can relate to the experience of a male mixer walking on stage and adjusting her amp or telling her how to set her levels. Or being asked if you’re the ”merch girl” or “where’s your acoustic guitar?” after you’ve just lugged an entire drum kit or Orange stack through the door.

Meanwhile, the members of Ninetynine have recorded a song to raise funds for protests against the East-West road tunnel, under the name “Tunnel Vision Song Contest”. It sounds like Ninetynine at their most Sonic Youth-influenced, though is a bit light on the Casiotone and chromatic percussion.
On Smarm, an essay pointing out that the problem with the internet is not snark but its condemnation, and through that, smarm; i.e., emotive appeals to the idea of positivity as a virtue (as if it were motherhood or apple pie or adorable kittens), and condemnation of negativity in general:
Over time, it has become clear that anti-negativity is a worldview of its own, a particular mode of thinking and argument, no matter how evasively or vapidly it chooses to express itself. For a guiding principle of 21st century literary criticism, BuzzFeed's Fitzgerald turned to the moral and intellectual teachings of Walt Disney, in the movie Bambi: "If you can't say something nice, don't say nothing at all."

Smarm (whose genesis, in its current form, the article lays at the feet of that one-man Coldplay of letters, Dave Eggers, who exhorted readers to “not dismiss a movie until you have made one”, singlehandedly reserving the right to engage in, rather than merely consuming, culture for those within the culture industry) may be most obviously evident on the web, in cloyingly snark-free websites like Buzzfeed and Upworthy (the latter of which spawned a satirical webtoy), and the one-sided boosterism of the “like” button, but its effects go beyond the risk of ending up with an overly warmed heart and a jaw needing to be picked up off the floor. As a content-free (and thus outside of the criteria of debate) appeal to a nebulous ideal of civility or niceness (and surely everybody loves niceness, much like kittens and cupcakes), it is a tool for disingenuously shutting down challenging voices, and is very useful for bolstering the status quo when appeals to, say, the divine right of kings or the Hobbesian necessity of there being an ultimate authority no longer hold water: don't do it because I said so, but do it because kittens.
Smarm hopes to fill the cultural or political or religious void left by the collapse of authority, undermined by modernity and postmodernity. It's not enough anymore to point to God or the Western tradition or the civilized consensus for a definitive value judgment. Yet a person can still gesture in the direction of things that resemble those values, vaguely.

As concerns about “civility” and the “tone of debate” and such are raised, the result is often a soupy homogenate of truisms, motherhood statements and content-free manufactured consensus, meeting in the middle and staying there, bathed in a glow of positive sentiment: democratic debate reduced to calming mood lighting. Which undoubtedly serves interests behind the scene just fine.
Here is Obama in 2012, wrapping up a presidential debate performance against Mitt Romney: “I believe that the free enterprise system is the greatest engine of prosperity the world's ever known. I believe in self-reliance and individual initiative and risk-takers being rewarded. But I also believe that everybody should have a fair shot and everybody should do their fair share and everybody should play by the same rules, because that's how our economy is grown. That's how we built the world's greatest middle class.”
The lone identifiable point of ideological distinction between the president and his opponent, in that passage, is the word "but." Everything else is a generic cross-partisan recitation of the indisputable: Free enterprise ... prosperity ... self-reliance ... initiative ... a fair shot ... the world's greatest middle class.

And, of course, smarm is useful for ruling out points of view deemed to be inadmissible, on the grounds that they are too negative, or confrontational, or that we have outgrown such petty squabbling about actual issues:
The New York Times reported last month that in 2011, the Obama Administration decided not to nominate Rebecca M. Blank to be the head of the Council of Economic Advisers, because of "something politically dangerous" she had written in the past: In writing about poverty relief, she had used the word "redistribution."
Like every other mode, snark can sometimes be done badly or to bad purposes. Smarm, on the other hand, is never a force for good. A civilization that speaks in smarm is a civilization that has lost its ability to talk about purposes at all. It is a civilization that says "Don't Be Evil," rather than making sure it does not do evil.

Topically, we are currently witnessing a tsunami of smarm over the recently deceased Nelson Mandela, as right-wing politicians, many of whom wore HANG MANDELA badges at their Conservative Students meetings or lobbied against sanctions against the apartheid regime, fawningly profess what an inspiration the great man had been to them, with the implication that Mandela was not a freedom fighter but some kind of apolitical, beatific self-help guru, a Princess Diana in Magical Negro form, come to heal us with peace and love. It's ironic to think that, as utterly wrong as Margaret Thatcher was when she denounced Mandela as a terrorist, her view was at least grounded in reality, unlike the insipid words of content-free praise her successors are heaping upon him.
From an article by Nick Cohen about the current Frieze art fair in London, an observation on the function that huge, tacky-looking artworks fulfil in validating their wealthy purchasers' status; Cohen's argument is that monumental kitsch is a peacock-tail-like mechanism for (expensively and unforgeably) demonstrating that one's status puts one beyond the criticism of one's inferiors:
The justifications from the critics whom galleries always seem able to find to dignify the shallow add to the melancholy spectacle. They talk of challenging our notions of what is art as Duchamp did with his urinal. They forget that Duchamp offered his "fountain" to a New York show in 1917 – almost a century ago, and his once radical ideas are now so established they will soon deserve a telegram from the Queen. "Everything changes except the avant garde," said Paul Valéry. Yet Frieze shows one change, although not a change for the better.
Collectors do not buy Koons because he challenges their definitions of art. The ever-popular explanation that the nouveau riche have no taste strikes me as equally false – there's no reason why the nouveau riche should have better or worse taste than anyone else.
What a buyer of a giant kitten or a gargantuan fried egg says to those who view his purchase is this: "I know you think that I am a stupid rich man who has wasted a fortune on trash. But because I am rich you won't say so and your silence is the best sign I have of my status. I can be wasteful and crass and ridiculous and you dare not confront me, whatever I do."
Apparently Thailand these days is full of homeless European/American blokes; mostly middle-aged, and often alcoholic, they spend their time drinking and sleeping rough on beaches, which is considerably less idyllic than the big-rock-candy-mountain image the description evokes:
Steve, who declined to give his surname over fears that his long-expired visa could land him in jail, said he has spent two years sleeping rough on Jomtien Beach, a 90-minute drive from Bangkok. “I’ve gone 14 days without food before. I lived off just tea and coffee,” he told The Independent. After his marriage of 33 years ended seven years ago, Steve began regular visits to Thailand before setting up permanently in Pattaya, a seaside resort with a sleazy reputation close to Jomtien. “I’m a bit of a sexaholic,” he says, also admitting a fondness for alcohol.
Paul Garrigan, a long-time Thai resident, isn’t surprised by the growing problem of homeless and stranded Westerners. The 44-year-old spent five years “drinking himself to death” in Thailand before giving up alcohol in 2006 and writing a book called Dead Drunk about his ordeal and the expats who have fallen on hard times in the country. He told The Independent: “I’d been living in Saudi Arabia where I worked as a nurse but I’ve been an alcoholic since my teens and, after a holiday to Thailand in 2001, I decided I may as well drink myself to death on a beautiful island in Thailand. Like many people I taught English at a school but spent much of my time on islands such as Ko Samui where I could start drinking early in the morning and not be judged.”

Meanwhile in the US, some homeless people are apparently surviving on Bitcoin; spending their days in public libraries earning the coins by doing vaguely sketchy online work (watching videos to bump up YouTube counters is mentioned; perhaps armies of the destitute to solve CAPTCHAs, artisanally hand-spam blog comments or otherwise laboriously defeat anti-bot countermeasures could make economic sense in today's climate too) and then cashing out through gift card services. Meanwhile, homelessness charities are embracing Bitcoin:
Meanwhile, Sean’s Outpost has opened something it calls BitHOC, the Bitcoin Homeless Outreach Center, a 1200-square-foot facility that doubles as a storage space and homeless shelter. The lease – and some of the food it houses – is paid in bitcoins through a service called Coinbase. For gas and other supplies, Sean’s Outpost taps Gyft, the giftcard app Jesse Angle and his friends use to purchase pizza.

(I suspect that the photo of the homeless man “mining Bitcoins” on the park bench on his laptop is mislabelled; wouldn't all the easily minable Bitcoins have been tapped out, with the computational power required to mine any further Bitcoins essentially amounting to already having thousands of dollars of high-end graphics cards lying around and using them to heat your house, rather than something one could do with an old battery-operated laptop on a park bench?)
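The intuition behind that suspicion can be made concrete. Bitcoin mining is hashcash-style proof of work: keep trying nonces until the double SHA-256 hash of the block header falls below a target, where each extra bit of difficulty doubles the expected number of hashes. A minimal sketch (the header bytes, nonce encoding and toy difficulty here are illustrative stand-ins, not real Bitcoin consensus parameters):

```python
import hashlib


def mine(header: bytes, difficulty_bits: int) -> tuple[int, str]:
    """Find a nonce whose double-SHA-256 hash of header+nonce has
    at least `difficulty_bits` leading zero bits.

    Expected number of attempts: 2 ** difficulty_bits.
    """
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1


# A toy 16-bit difficulty is solvable in ~65,000 hashes on any machine.
nonce, digest = mine(b"example block header", 16)
print(nonce, digest)
```

At 16 bits a laptop finds a solution in a fraction of a second, but the work grows exponentially with difficulty; at the network difficulty of the time, the expected work per block was many orders of magnitude beyond anything a battery-powered laptop on a park bench could attempt, which is why mining had moved to rooms full of GPUs and ASICs.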
An interesting article on cryptolects, secret group languages whose purpose is to conceal meaning from outsiders:
Incomprehension breeds fear. A secret language can be a threat: signifier has no need of signified in order to pack a punch. Hearing a conversation in a language we don’t speak, we wonder whether we’re being mocked. The klezmer-loshn spoken by Jewish musicians allowed them to talk about the families and wedding guests without being overheard. Germanía and Grypsera are prison languages designed to keep information from guards – the first in sixteenth-century Spain, the second in today’s Polish jails. The same logic shows how a secret language need not be the tongue of a minority or an oppressed group: given the right circumstances, even a national language can turn cryptolect. In 1680, as Moroccan troops besieged the short-lived British city of Tangier, Irish soldiers manning the walls resorted to speaking as Gaeilge, in Irish, for fear of being understood by English-born renegades in the Sultan’s armies. To this day, the Irish abroad use the same tactic in discussing what should go unheard, whether bargaining tactics or conversations about taxi-drivers’ haircuts. The same logic lay behind North African slave-masters’ insistence that their charges use the Lingua Franca (a pidgin based on Italian and Spanish and used by traders and slaves in the early modern Mediterranean) so that plots of escape or revolt would not go unheard. A Flemish captive, Emanuel d’Aranda, said that on one slave-galley alone, he heard ‘the Turkish, the Arabian, Lingua Franca, Spanish, French, Dutch, and English’. On his arrival at Algiers, his closest companion was an Icelander. In such a multilingual environment, the Lingua Franca didn’t just serve for giving orders, but as a means of restricting chatter and intrigue between slaves. 
If the key element of the secret language is that it obscures the understandings of outsiders, a national tongue can serve just as well as an argot.

The article goes on to mention Polari, which originated as a travelling entertainers' argot and ended up being a cryptolect used by gay men in 20th-century Britain, becoming largely obsolescent after homosexuality was decriminalised, surviving as a piece of period colour in artefacts like Morrissey's song Piccadilly Palare.
With its roots in Yiddish, cant, Romani, and Lingua Franca, Polari was a meeting-place for languages of those who were too often forced to hit the road; groups who, however chatty, tend to remain silent in traditional historical accounts. Today, the spirit of Polari might be said to live on in Pajubá (or Bajubá), a contact language used in Brazil’s LGBT community, which draws its vocabulary from West African languages – testimony to the hybrid, polyvocal processes through which a cryptolect finds voice.

Of course, as the whole point of a cryptolect is to conceal meaning, as soon as some helpful soul compiles a crib sheet, they kill that particular version of the language as surely as a butterfly collector with a killing jar. (An example of this that has become a comedic trope is parents, politicians and other grown-ups trying to be hip to the groovy lingo of teenagers and falling flat.)
The work of the chronicler of cryptolect must always end in failure. These are languages which need to do more than keep up with current usage: they have to stay ahead of it, burning bridges where the vernacular has come too close; keeping their distance from the clear, the comprehensible. When Harman returned to the subject of pedlars’ French, his promises of understanding came with a new caveat: ‘as [the canting crew] have begun of late to devise some newe tearmes for certaine things: so will they in time alter this, and devise as evill or worse’. We can’t write working dictionaries of secret languages, any more than we can preserve a childhood or catch a star.

Not all cryptolects belong to marginalised, disempowered or nefarious outsider groups (say, itinerant thieves, galley slaves, sexual minorities or minors under the totalitarian regime of parental authority); various technical jargons have something of the cryptolect about them where they avoid using laypersons' terminology in favour of synonymous terms specific to their subcultures. This could be argued to be a good thing, as confusion can occur when words have both technical and vernacular meanings (take for example the word “energy” as used by physicists and New Age mystics). Indeed, whether, say, International Art English is a cryptolect could come down to whether it serves to actually communicate to an in-group or just as a form of ritual display.
Tom Ellard, formerly of industrial electropop combo Severed Heads and now an academic teaching the digital arts, takes the world of art and the vocation of the Artist to task in an essay titled Five Reasons Why I Am Not An ‘artist’. His targets include the various hierarchies, hypocritical masquerades and rituals enforced on those playing the role of Artist, from refraining from lowering oneself to doing anything too hands-on or technical (there are operators for that) to the politics and carefully circumscribed modes of relating to other people within the art world (a place seemingly as formalised as an 18th-century aristocratic court), to the somewhat less than inspiring reality facing an Artist who has Made It:
When I worked in advertising I was surprised to meet people who didn’t do anything. They are called ‘art directors’. People like myself that perform the actual tasks are called ‘operators’ and there is a strong class distinction which leads ‘art directors’ to cross their arms while speaking near any object that they may accidentally use*. I was employed to move text on a page for an irate person standing a few feet away from the means to do it. Apparently their pureness of thought would be sullied by contact with a mechanism.
I’ve said it too many times: the ideal of an artistic career is inertia. Innovate for a while. Find a practice, a style, a scheme that earns attention. Repeat it endlessly, never daring to step outside your persona because the system will need to bind you to an iconic representation of yourself. Do you reproduce famous paintings as slow motion videos? Or use a skateboard as your macguffin? Better stick to that. Keep on making action painting, or ‘industrial’ tape cut up until you die – which is your prime function, sealing off the quantity of your saleable work.
Artists that constrain themselves are recognised more quickly, they are funded, they are more acceptable to publications because they are easier to digest. They are the cheddar cheese of creativity, and when I am told that ‘all the best work is happening over here’, I know the place to look is anywhere but there. Innovation is part of a continuing vitality, and confusedly being alive is more important than being neatly dead. We should never ever pre-organise ourselves into categories that fit nicely in museums, journals and repositories. That’s like pinning yourself into a display case.
What will we call ourselves? The Kraftwerk guys were onto something when they called themselves ‘music workers’. But I have another idea. In advertising the term ‘creative’ is a mixed signal, it seems to be a positive, but can be a polite substitute for ‘operator’. I’ve often heard somebody say, ‘we’ll get our creatives onto that’. It means ‘all slaves to the oars’. If so, perhaps we can claim ‘creative’ or ‘operator’ back. It can be our own swearword.
The Bacon-Wrapped Economy, an article looking at how the rise of a stratum of extremely well-paid engineers and wealthy dot-com founders, mostly in their 20s, has changed the San Francisco Bay Area, economically and culturally:
You don't need to look hard to see the effects of tech money everywhere in the Bay Area. The housing market is the most obvious and immediate: As Rebecca Solnit succinctly put it in a February essay for the London Review of Books, "young people routinely make six-figure salaries, not necessarily beginning with a 1, and they have enormous clout in the housing market." According to a March 11 report by the National Low Income Housing Coalition, four of the ten most expensive housing markets in the country — San Francisco, San Mateo, Santa Clara, and Marin counties — were located in the greater Bay Area. Even Oakland, long considered a cheaper alternative to the city, saw an 11 percent spike in average rent between fiscal year 2011-12 and the previous year; all told, San Francisco and Oakland were the two American cities with the greatest increases in rent. Parts of San Francisco that were previously desolate, dangerous, or both are now home to gleaming office towers, new condos, and well-scrubbed people.

The economic effects of gentrification, soaring costs of living and previous generations of residents being priced out are predictable enough (and San Francisco has been suffering similar effects since the 1990s dot-com boom, when a famous graffito in one of the city's then-seamy neighbourhoods read “artists are the shock troops of gentrification”). And then there are the effects of the city's wealthy elite being replaced by a new crop of the wealthy who, being in their 20s and from the internet world, share little of the aesthetic tastes and cultural assumptions of the traditional plutocracy, favouring street art over oil on canvas and laptop glitch mash-ups over the philharmonic; their clout has sent shockwaves through the philanthropic structures of patronage that supported high culture in the city:
Historically, most arts funding has, of course, come from older people, for the simple reason that they tend to be wealthier. But San Francisco's moneyed generation is now significantly younger than ever before. And the swath of twenties- and thirties-aged guys — they are almost entirely guys — that represents the fattest part of San Francisco's financial bell curve is, by and large, simply not interested.
"If you're talking the symphony or other classical old-man shit, I would say [interest] is very low," an employee at a smallish San Francisco startup recently told me. "The amount of people I know that give a shit about the symphony as opposed to the amount of people I know who would look at a cool stencil on the street ... is really small."

And not only has the content of philanthropy changed, but so have the mechanisms. Just handing over money to a museum, without any strings, no longer cuts it for a generation of techies raised on test-driven development and the market-oriented philosophy of Ayn Rand, and believing in fast iteration, continuous feedback and quantifiable results. Consequently, donations to old-fashioned arts institutions have declined with the decline of the old money, but have largely been replaced by the rise of crowdfunding, with measurable results:
(Kickstarter) The self-described "world's largest funding platform for creative projects" has, in its three-year existence, raised more than half a billion dollars for more than 90,000 projects and is getting more popular by the day; at this point, it metes out roughly twice as much money as the National Endowment for the Arts. And though hard statistics are difficult to come by, it's clear that this is a funding model that's taken particular hold in the tech world, even over traditional mechanisms of philanthropy. "Arts patronage is definitely very low," one tech employee said. "But it's like, Kickstarters? Oh, off the map." Which makes sense — Kickstarter is entirely in and of the web, and possibly for that reason, it tends to attract people who are interested in starting and funding projects that are oriented toward DIY and nerd culture. But it represents a tectonic shift in the way we — and more specifically, the local elite, the people with means — relate to art.
"A lot of this is about the difference between consuming culture and supporting culture," a startup-world refugee told me a few weeks ago: If Old Money is investing in season tickets to the symphony and writing checks to the Legion of Honor, New Money is buying ultra-limited-edition indie-rock LPs and contributing to art projects on IndieGoGo in exchange for early prints. And if the old conception of art and philanthropy was about, essentially, building a civilization — about funding institutions without expecting anything in return, simply because they present an inherent, sometimes ineffable, sometimes free market-defying value to society, present and future, because they help us understand ourselves and our world in a way that can occasionally transcend popular opinion — the new one is, for better or for worse, about voting with your dollars.

Which suggests the idea of the societal equivalent of the philosophical zombie: a society radically restructured by a post-Reaganite, market-essentialist worldview, in which all the inefficient, inflexible bits of the old society, from philanthropic foundations in support of a greater Civilisation to senses of civic values and community, have been replaced by the effects of market forces: a world where, if society is assumed to be nothing but the aggregation of huge numbers of self-interested agents interacting in markets, things work as they did before, perhaps more efficiently in a lot of ways, and to the casual observer it looks like a society or a civilisation, only at its core, there's nothing there. Or perhaps there is one supreme value transcending market forces, the value of lulz, an affectation of nihilistic nonchalance for the new no-hierarchy hierarchies.
The article goes on to describe the changes to other things in San Francisco, such as the attire by which the elite identify one another and measure status (the old preppie brands of the East Coast are out, and in their place are luxury denim and “dress pants sweatpants” costing upwards of $100 a pair – a way of looking casual and unaffected, in the classic Californian-dude style, to the outside observer, whilst signalling one's status to those in the know as meticulously as a Brooks Brothers suit would in old Manhattan), the dining scene (which has become more technical and artisanal; third-wave coffee is mentioned) and an economy of internet-disintermediated personal services which has cropped up to tend to the needs of the new masters of the online universe:
And then there are companies like TaskRabbit and Exec, both of which serve as sort of informal, paid marketplaces for personal assistant-style tasks like laundry, grocery shopping, and household chores. (Workers who use TaskRabbit bid on projects in a race-to-the-bottom model, while Execs are paid a uniform $20 per hour, regardless of the work.) According to Molly Rabinowitz, a San Franciscan in her early twenties who briefly made a living doing this kind of work — though she declined to reveal which service she used — many tech companies give their employees a set amount of credit for these tasks a month or year, and that's in addition to the people using the services privately. "There's no way this would exist without tech," she said. "No way." At one point, Rabinowitz was hired for several hours by a pair of young Googlers to launder and iron their clothes while they worked from home. ("It was ridiculous. They didn't want to iron anything, but they wanted everything, including their T-shirts, to be ironed.") Another user had her buy 3,000 cans of Diet Coke and stack them in a pyramid in the lobby of a startup "because they thought it would be fun and quirky." Including labor, gas, and the cost of the actual soda, Rabinowitz estimated the entire project must have cost at least several hundred dollars. "It's like ... you don't care," she said. "It doesn't mean anything because it's not your money. Or there's just so much money that it doesn't matter what you spend it on."
The economic difference between London, a global centre of finance, where wealth is conjured into being and every Russian oligarch and Saudi princeling worth his salt has to have a pied à terre, and the rest of Britain is drawn into sharp relief by a recent property value survey:
Research shows that the net value of properties in just 10 London boroughs – Westminster, Kensington & Chelsea, Wandsworth, Barnet, Camden, Richmond, Ealing, Bromley, Hammersmith & Fulham and Lambeth – now outstrips the worth of all the properties in Wales, Northern Ireland and Scotland combined.
The capital’s richest borough, Westminster, with 121,600 dwellings, is worth £95bn – more than twice the value of Edinburgh (pop 500,000) and three times that of England’s sixth most populous city, Bristol.

That's one thing one forgets about living in London: that this isn't normal. The high property values (and rents), the billions of pounds poured into public transport, the presence of everything from world-class art exhibitions to lunch options more interesting than a supermarket sandwich: none of this would be here were London not a global city-state of its stature, alongside the Singapores and Dubais of this world.
Of course, the downside of this is that London is considerably less affordable for those who aren't oligarchs, princelings or otherwise loaded.
Another consequence of the Zuckerberg Doctrine, the belief that every person has one and only one identity which they use for all online social interactions: doctors in Britain are reporting an increase in infatuated patients pursuing them romantically via Facebook:
Figures compiled by the Medical Defence Union (MDU) show that the number of cases of doctors seeking its help because they are being pursued by a lovestruck patient rose from 73 in 2002-06 to 100 in 2007-11. Patients are increasingly using social media rather than letters or flowers to make their feelings clear, such as following a doctor on Twitter, "poking" them on Facebook or flirting with them online.
A female GP was asked out for a drink by a male patient as she left her surgery. When she declined, he began to pester her via Facebook and sent her a bunch of lilies, which she had listed as her favourite flowers on her Facebook page. On MDU advice, she changed her security and privacy settings on the site so that only chosen friends could view her postings.

Of course, it is unreasonable to ask doctors (and, indeed, other public-facing professionals; teachers, police, social workers and legal aid workers come to mind) to delete their Facebook accounts and not use social software. For one, in this day and age, disconnecting from social software means virtual exile; Facebook refuseniks find themselves out of the loop, relying on the charity of friends with Facebook accounts and free time to keep them informed of everything from party invitations to when mutual friends had a baby, got divorced or moved abroad. And then there is the increasing public expectation that well-adjusted citizens have a Facebook profile, and one with normal activity patterns. Already there is talk of governments requiring citizens to log in with Facebook/Google identities to access services, so a normal Facebook record, with the requisite casual-though-not-debauched photos and history of social chatter, is increasingly starting to look like a badge of good citizenship, well-adjustedness and general non-terroristicity. And having two accounts, one for your professional persona and one for your personal life, is expressly verboten by order of Mark Zuckerberg and Vic Gundotra, as mandated by the advertisers who demand accurate records of eyeballs sent their way and the shareholders who demand steady advertising revenue.
So now, by the immutable facts of neoliberal capitalism in the internet age, we have a world where people have only one face they present to the world, one with their wallet name, career record, list of friends and social activity attached. This face is visible to everyone from old friends to employers to any members of the public one has a professional duty of care to. Perhaps there's a Californian jeans-and-T-shirts casualness to forcibly unifying these facets; to not allowing a distinction between the uniform of professionalism one wears in one's career and the accoutrements of one's casual, personal life; to knowing that your doctor's favourite flower is the lily, your geography teacher was in a moderately well-known math-rock band, or the police officer you reported your lost phone to is an Arsenal fan and known to his mates as Beans; though the downside of the casualisation of professional life is the professionalisation of casual life, a sort of Bay Area take on superlegitimacy. And while in Britain today, that may take the form of doctors self-censoring to avoid the possibility of obsessive patients, in parts of the US, where employers can fire workers for their political or personal views, sexual orientation or even sporting loyalties, the stakes are higher.
Whether the Zuckerberg Doctrine is the inescapable future, in which everyone is coerced into an endless, joyless social game of simulating a model citizen as if under the watchful eyes of an outsourced Stasi, is another question. Facebook's once-unquestionable hegemony is starting to show its first cracks. For now, it remains the default grapevine, the standard channel of social chatter; however, its declining share price seems to be pushing Facebook to more aggressively monetise the relationships of its nominally captive audience: pushing more ads and sponsored stories, asking users to pay for their messages to be seen by their friends (whose feeds can only contain so many updates, after all, and there are commercial sponsors to compete with), and, the implication goes, throttling back how much unsponsored chatter a user sees. As this ratchets up, eventually people will notice that what reaches them is not their friends' announcements and photos but the fact that a friend ostensibly likes Toyota or Red Bull, and will start tuning out. Then Facebook will decline, as MySpace and Friendster did before it, and something else will take its place.
Perhaps the best thing to hope for is that whatever fills the niche occupied by Facebook will be not so much a service as a decentralised system of independent services, each free to set its own terms and policies. They could be based on a protocol such as Tent or Diaspora*, and, as the servers interact, allow for great diversity: some servers will be free to use but spam your eyeballs with ads until they bleed, while others will charge, say, $25 a year and offer ad-free unlimited hosting; some will have Zuckerbergian wallet-name policies, while others will allow users to choose pseudonyms (as, say, LiveJournal did back in the day, and community-oriented web forums often do), with some uptight silos federating only with others that enforce wallet-name policies, and being seen as terminally square by everyone outside them. And, of course, unlike on Facebook, there will be nothing stopping someone from having multiple accounts. There will also be nothing preventing people from running their own silos, though any system which depends on people doing this will become a ghetto of deep geeks with UNIX beards who enjoy setting up such systems, to the exclusion of everyone else.
An article in the Guardian presents a scenario on the privacy risks even the most careful social media output could pose when analysed with data-mining software descended from that currently in existence:
"Tina Porter, 26. She's what you need for the transpacific trade issues you just mentioned, Alan. Her dissertation speaks for itself, she even learned Korean..." He pauses.

"But?..." asks the HR guy.

"She's afflicted with acute migraine. It occurs at least a couple of times a month. She's good at concealing it, but our data shows it could be a problem," Chen says.

"How the hell do you know that?"

"Well, she falls into this particular Health Cluster. In her Facebook babbling, she sometimes refers to a spike in her olfactory sensitivity – a known precursor to a migraine crisis. In addition, each time, for a period of several days, we see a slight drop in the number of words she uses in her posts, her vocabulary shrinks a bit, and her tweets, usually sharp, become less frequent and more nebulous. That's an obvious pattern for people suffering from serious migraine. In addition, the Zeo Sleeping Manager website and the stress management site HeartMath – both now connected with Facebook – suggest she suffers from insomnia. In other words, Alan, we think you can't take Ms Porter in the firm. Our Predictive Workforce Expenditure Model shows that she will cost you at least 15% more in lost productivity."

Of course, if employers (and health insurance companies and the police and organised criminals and advertising firms and psychotic stalkers) can data-mine a tendency to get migraines from fluctuations in the vocabulary of one's posts, one might suggest that those with a healthy amount of paranoia should avoid social media altogether, beyond having a simple static page that gives away absolutely nothing. Except that not having an active social media profile is increasingly seen as suspicious in itself; if you're not tweeting your TV viewing or Instagramming your sandwiches and leaving a statistically normal trail of well-adjusted narcissistic exhibitionism, there's a nonzero probability that you might be the next Unabomber; and, in any case, the HR department which knocked Tina Porter back for her carefully concealed migraines would certainly not even look at the CV of the potential ticking timebomb whose online profile draws a blank.
So, if this sort of thing comes to pass (and whether that sort of data could be extracted from social data with few enough false positives to be useful is a big if), we may eventually see an age of radical transparency, where everyone knows who's likely to be marginally more or less productive, along with possible laws regulating when this may be taken into account. Either that or the evolution of Gattaca-style systems and techniques for chaffing one's social data trail and masking any deficiencies which it may betray, in an ever-escalating arms race with new analytical techniques designed to detect such gaming.
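The crudest version of the signal the scenario describes — a shrinking active vocabulary across a user's posts — is trivially easy to sketch, which is part of what makes it plausible. Here is a minimal toy illustration; every name and threshold is invented, and a real system (and its false-positive rate) would be vastly more complicated:

```python
# Toy sketch: flag a drop in vocabulary richness between two periods of
# posts. All function names and the threshold are illustrative inventions.

def vocabulary_size(posts):
    """Count distinct (lowercased, whitespace-split) words across posts."""
    words = set()
    for post in posts:
        words.update(post.lower().split())
    return len(words)

def vocabulary_drop(baseline_posts, recent_posts, threshold=0.25):
    """True if the recent period's vocabulary has shrunk by more than
    `threshold` relative to the baseline period."""
    base = vocabulary_size(baseline_posts)
    recent = vocabulary_size(recent_posts)
    if base == 0:
        return False
    return (base - recent) / base > threshold

baseline = ["Sharp analysis of transpacific trade flows today",
            "Korean phrasebook arrived, splendid morphology notes"]
recent = ["tired today", "tired again today"]

print(vocabulary_drop(baseline, recent))  # prints: True
```

Even this toy version hints at the false-positive problem: a quiet holiday week would trip it just as readily as a migraine.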
The Observer has an article about the phenomenon of “friend clutter” on social network services; in short: while it's easy to “friend” people, removing someone from one's circle of acquaintance is inherently a hostile act; there is no cultural provision for severing notional ties with people one has no actual ties with on a no-fault basis. (At least, this is the case in England, where making a scene is something impetuous foreigners do, and Just Not Done; it'd be interesting to see whether people are quicker to sever online acquaintances in more brusque locales — say, Berlin, Moscow or Tel Aviv.) And hence, we end up with friend lists full of strangers:
Even "unfriending" someone on Facebook, the closest equivalent to Bierce's proposal, feels like delivering a slap in the face (and not even a well-timed slap, since you can't be sure when they'll find out). Facebook itself hates unfriending, for commercial reasons, and thus makes it easy to hide updates from tiresome contacts without their knowing – a deeply unsatisfactory arrangement that leaves you at constant risk of meeting someone face-to-face who assumes you must already know they've got engaged, or had another baby, or been dumped, or fired, or widowed.
If that sounds a heartless way to think about other people, consider the parallels. Physical clutter, as a widespread problem, is only as old as modern consumerism: before the availability of cheap gadgets, clothes and self-assembly furniture, it wasn't an option for most people to accumulate basements full of unwanted exercise bikes, games consoles or broken Ikea bookshelves. We think we want this stuff, but, once it becomes clutter, it exerts a subtle psychological tug. It weighs us down. The notion of purging it begins to strike us as appealing, and dumping all the crap into bin bags feels like a liberation. "Friend clutter", likewise, accumulates because it's effortless to accumulate it: before the internet, the only bonds you'd retain were the ones you actively cultivated, by travel or letter-writing or phone calls, or those with the handful of people you saw every day. Friend clutter exerts a similar psychological pull. The difference, as Bierce understood, comes with the decluttering part: exercise bikes and PlayStations don't get offended when you get rid of them. People do. 
So we let the clutter accumulate.

And while the psychological impact of severing a friendship (even one that only exists as a row in a database, in which neither party remembers who the other actually is) can be mildly traumatic (there have been neurological studies showing that social/romantic rejection stimulates the same parts of the brain as physical pain; I wouldn't be surprised if awareness of a severed connection worked similarly), another factor is the business models of social software services such as Facebook, whose balance sheets depend on as many people as possible seeing which brands other people they “know”, in some sense or other, liked. Hence another layer of polite hypocrisy is invented: the hidden, passive “friendship”, in which one doesn't have to see anything about the life of one's notional acquaintance, but can avoid the minor agony of forever writing them out of one's life. (And unfriending, it goes without saying, is forever, or at least until a damned good apology.)
The more profound truth behind friend clutter may be that, as a general rule, we don't handle endings well. "Our culture seems to applaud the spirit, promise and gumption of beginnings," writes the sociologist Sara Lawrence-Lightfoot in her absorbing new book, Exit: The Endings That Set Us Free, whereas "our exits are often ignored or invisible". We celebrate the new – marriages, homes, work projects – but "there is little appreciation or applause when we decide (or it is decided for us) that it's time to move on". We need "a language for leave-taking", Lawrence-Lightfoot argues, and not just for funerals. A terminated friendship, after all, needn't necessarily signal a horrifying defeat, to be expunged from memory. One might just as easily think of it as "completed".
Mullany recommends a friend-decluttering exercise that she admits sounds "weird", but that she predicts will become more and more widely accepted. She advises making a public proclamation on Facebook in which you specify the criteria by which you'll henceforth be defining people as "friends". Maybe you'll resolve only to remain Facebook friends with people you've met at least once in real life, or maybe you'll use a stricter standard, such as whether you'd invite that person to your wedding. Explain, in the same proclamation, that the consequent defriending shouldn't be taken personally, and that you're doing it to a number of people at once. Then start clearing out the clutter.
If Zuckerberg's insistence that everyone should be friends with everyone prompts us, out of necessity, to winnow our lists to a smaller group of people we truly cherish, he'll have done something admirable, even if it's the opposite of what he intended.

Indeed; though whether it prompts people to circle the wagons and insist on remaining attached only to people they have met recently or would go for a drink with is another question. Part of the utility of services like Facebook, and whatever succeeds it, is to keep in low-level ambient contact with people whom one is not friends with in the classic sense of friendship: old school buddies, ex-coworkers, people one met a few times some years ago, and so on. Of course, the amount of attention these people might have for one is probably somewhat limited, so updates would be limited to the major things: changes of location, marital status, sex, and that sort of thing. Which strikes me as quite distinct from the interactions one has with one's active online friends: the stream of updates about one's life peppered with amusing links, usually involving cats.
New York is facing an onslaught of middle-aged subway taggers; latchkey kids from the 1970s and 1980s who never put graffiti vandalism behind them, or else who decided to recapture their lost youth, spurred on by one thing or another to pick up their spraycans and get back into the game:
In torn jeans and saddled with a black backpack, Andrew Witten glances up and down the street for police. The 51-year-old then whips out a black marker and scribbles "Zephyr" on a wall covered with movie posters. He admires his work for a few seconds before his tattooed arms reach for his daughter, holding her hand as he briskly walks away.
Witten's brush with fame now often comes with his freelance art writing and his sporadic visits to his daughter's school, where he teaches her classmates how to draw. Lulu knows her father draws "crazy art," a term she picked up from seeing graffiti on trains.
For decades, Ortiz, 45, has been known on Manhattan's Lower East Side as LA II. A traumatic loss of a girlfriend brought him out of a 14-year hiatus from graffiti writing. He has since been caught three times spraying his tag on property, each time while walking a friend's dog. "Everywhere that dog stopped to pee I would write my name," Ortiz says. "The streets were like my canvases. I just started writing my name everywhere."

Alternatively, it could be argued that he and his dog bonded by participating in territory-marking activities together.
There's a piece in the Guardian about the rise of crowdfunding, and how it has shifted from a medium for art projects to a means of decentralised organisation of practical endeavours requiring money, and into a way of circumventing market failures:
Kickstarter itself is changing under the influence of digital culture. At first it was about making established forms of art. Film was big – documentaries about organic community vegetable gardens were not uncommon. Now that is changing. It is becoming a land of gadget makers and gamers.
This new communal instinct can do amazing things like route around the warping influence of capitalism and digital platform wars. Look at projects like Open Trip Planner. This takes a bit of unravelling but basically the benefit of good maps on smartphones became endangered by Apple's titanic battle for market supremacy with Google. Apple are attempting to strip Google products like maps from iPhones and this left users with crappy transport info – Open Trip Planner is the communal answer to a hierarchical fall out.

The article mentions OpenTripPlanner, an open-source alternative to trip planning systems which seems to be doing for trip planning what OpenStreetMap did for geodata, and the Pebble watch, a Bluetooth-enabled smart watch designed without the backing of a large electronics corporation, and the fact that Kickstarter is expanding to the UK.
In other crowdfunding news, Matthew Inman, who runs The Oatmeal web comic, recently launched a crowdfunding campaign to raise US$850,000 to buy Nikola Tesla's old laboratory (put on the market by AGFA and expected to be bought by property developers); the campaign met its target in under a week and has since raised over a million dollars.
Wary of the possibility that a population of educated, frustrated women could result in pressure for political liberalisation, Iran's government has moved to preempt this by declaring 70 university courses to be for men only:
It follows years in which Iranian women students have outperformed men, a trend at odds with the traditional male-dominated outlook of the country's religious leaders. Women outnumbered men by three to two in passing this year's university entrance exam.
Senior clerics in Iran's theocratic regime have become concerned about the social side-effects of rising educational standards among women, including declining birth and marriage rates.

Of course, if women outperform men in academic fields, banning women may make the men feel better about their performance, but it will be less salutary for Iran's economy if half of all potential knowledge workers are prohibited by law from developing their potential.
This is at odds with the arguably more progressive Saudi approach (and "progressive" and "Saudi" aren't two words I expected to write adjacent to each other) of planning segregated women-only cities, where the nation's educated, otherwise frustrated women can work in industry on a “separate but equal” basis. (I wonder how long that will last; eventually, I imagine it'll lead to those invested in the status quo deciding that it's a threat and attacking it; starving it of resources, imposing crippling restrictions on it, and eventually shutting it down and sending the women back to the authority of their male family members, and the city will go the way the USSR's Jewish Autonomous Oblast did once Stalin found it too threatening.)
And Iran is further reminding women of their place under an Islamic theocracy by moving to legalise the marriage of girls under 10. The current age at which girls can be married in the Islamic Republic is 10, down from 16 before the revolution.
Researchers in the US have been investigating the question of what is “cool” from a psychological perspective, hitting the dichotomy between the two opposite poles which can be described with this term: on one hand, agreeability and popularity, and, on the other hand, a vaguely antisocial countercultural/oppositional stance reflected in the classic iconography of rebels and outlaws from the history of cool:
"I got my first sunglasses when I was about 13," said Dar-Nimrod. "There wasn't a cooler kid on the block for the next few days. I was looking cool because I was distant from people. My emotions were not something they could read. I put a filter between me and everyone else. That, in my mind, made me cool. Today, that doesn't seem to be supported. If anything, sociability is considered to be cool, being nice is considered to be cool. And in an oxymoron, being passionate is considered to be cool—at least, it is part of the dominant perception of what coolness is. How can you combine the idea of cool—emotionally controlled and distant—with passionate?"
"We have a kind of a schizophrenic coolness concept in our mind," Dar-Nimrod said. "Almost any one of us will be cool in some people's eyes, which suggests the idiosyncratic way coolness is evaluated. But some will be judged as cool in many people's eyes, which suggests there is a core valuation to coolness, and today that does not seem to be the historical nature of cool. We suggest there is some transition from the countercultural cool to a generic version of it's good and I like it. But this transition is by no way completed."

The researchers claim that the concept of “cool” is mutating away from the oppositional/rebellious sense and towards straight agreeability.
If this phenomenon does bear out, there are a number of possible explanations. Perhaps, as the countercultural struggles against the repressive hegemony of the “squares” have receded into folk memory of The Fifties, and everyone wears jeans, listens to rock and has smoked a joint at least once in their lives, the idea of the rebel is left with even less of a cause than before. Perhaps the shift in the meaning of “cool” has something to do with the ongoing commodification of the counterculture, with the sneers and icy glares of vintage cool now being little more than a mask for agreeable dudes to put on when the occasion suits. Or perhaps, in the information age, being agreeable and well-connected confers a greater advantage than being tough and detached. One would imagine that this would be the case in most normal situations; in which case, the old world of tough guys and strong, silent types would have been an anomaly, a hostile environment which traumatised its inhabitants into growing expensive carapaces of character armour.
Another option would be that the meaning of “cool” is not, in fact, changing (this study doesn't seem to involve surveys done decades earlier to gauge what people thought at the time, and compares living attitudes with canned stereotypes), and that the word “cool” has several meanings; when it's used as a term of approval for a person, it has always indicated agreeability, whereas when talking about fictional characters, it suggested a certain type of antiheroic asshole.
In the UK, nightclub bouncers are requiring punters to show them their Facebook profiles on their phones as a condition of entry, ostensibly to weed out the underage. Civil liberties groups and a door staff training firm claim that this is illegal, while some bar owners and bouncers defend the practice, citing heavy fines levied in the event of staff accidentally letting in a minor with a fake ID.
A modest proposal from security commentator Alec Muffett: Could we replace marriage with incorporation of private limited companies?
in the UK the biggest hurdle would be IR35 and certain aspects of expense policy, but you’d have a contract between the directors, clear dissolution clauses; all income could be directed to the corporation – taxed differently and offset against OPEX like nappies and so forth – and salaries paid out, plus it would own and depreciate shared assets like houses and cars. You could avoid much income tax by taking dividends – I know tax avoidance is infra dig at the moment, but I’m thinking of the future here – and you could offshore your marriage for better breaks.
But there would be no gender issues – no incorporation is between a man and woman – plus you could have more than one company director, have corporations that outlast their founders, and in extreme circumstances outsource large chunks of the whole enterprise.
A new study in the US has found a positive correlation between the number of big-box retail stores in an area and the number of hate groups in that area; the study used Wal-Mart as a proxy for big-box retailers:
The amount of Wal-Mart stores in a county was more statistically significant than other factors commonly regarded as important to hate group participation, such as the unemployment rate, high crime rates and low education, the research found.
"Wal-Mart has clearly done good things in these communities, especially in terms of lowering prices," said Stephan Goetz, a Penn State University professor who also serves as the director of the Northeast Regional Center for Rural Development. "But there may be indirect costs that are not as obvious as other effects."

It is speculated that the correlation may be due to the fraying of the social ties that exist in areas with smaller, less impersonal shops. Whether Wal-Mart's owners' politics (generally well on the right of the Republican party) have anything to do with the correlation is unclear.
Psychologist Bruce Levine makes the claim that, in the US, the psychological profession has a bias towards conformism and authoritarianism, and against anti-authoritarian tendencies. This bias apparently results from the institutional structure of the profession, which selects for and reinforces pro-conformist and pro-authoritarian tendencies, and manifests itself, among other things, in those who exhibit “anti-authoritarian tendencies” being caught, diagnosed with various mental illnesses and medicated into compliance before they can develop into actual troublemakers:
In my career as a psychologist, I have talked with hundreds of people previously diagnosed by other professionals with oppositional defiant disorder, attention deficit hyperactive disorder, anxiety disorder and other psychiatric illnesses, and I am struck by (1) how many of those diagnosed are essentially anti-authoritarians, and (2) how those professionals who have diagnosed them are not.
Anti-authoritarians question whether an authority is a legitimate one before taking that authority seriously. Evaluating the legitimacy of authorities includes assessing whether or not authorities actually know what they are talking about, are honest, and care about those people who are respecting their authority. And when anti-authoritarians assess an authority to be illegitimate, they challenge and resist that authority—sometimes aggressively and sometimes passive-aggressively, sometimes wisely and sometimes not.
Some activists lament how few anti-authoritarians there appear to be in the United States. One reason could be that many natural anti-authoritarians are now psychopathologized and medicated before they achieve political consciousness of society’s most oppressive authorities.

Showing hostility to or resentment of authority will get one diagnosed with various conditions, such as “oppositional defiant disorder (ODD)”, a condition which manifests itself in deficits in “rule-governed behaviour”, and for which, as for many parts of the human condition, there are many types of corrective medication these days. (Compare this to the condition of “sluggish schizophrenia”, which existed only in the Soviet Union and manifested itself as a rejection of the self-evident truth of Marxism-Leninism.)
While pretty much every hierarchical society has mechanisms for encouraging conformity to some degree, Dr. Levine's contention is that the increase in psychiatric medication in recent years may be leading to a more authoritarian and conformist society.
Idea of the day: the Happy Recession; the idea that the internet, through driving prices and costs down, will permanently deflate both prices and wages; the post-internet world, it seems, jams econo:
The most pernicious aspect of Internet entertainment is that it’s so easy to measure and so easy to mass-produce. So the moment something on the Internet gets fun enough to be competitive with the real-world analogue, it starts getting relentlessly improved until it’s vastly superior. World of Warcraft soaks up upwards of forty hours per week from serious fans, who pay about $15 per month for their subscriptions. Few other hobbies can consume so much time at such a low cost.
The web makes it easier to access non-traditional employees at much lower salaries. As we argued in our Demand Media analysis, the real story here is that a stay-at-home mom with a Masters in Journalism can write content that is good enough compared to a typical Madison Avenue copywriter, especially when the rate is $15 per article instead of six figures per year. This disaggregation of writing skill means that companies no longer have to hire good writers in order to write 5% good copy and 95% mediocre work; they can outsource the mediocre stuff and relegate the high-end work to a short-term freelancer.
The web offers cheap social status: In the long term, this may have a bigger effect than the web merely making digitizable products cheaper. Social status games drive a huge amount of economic activity: people strive to get into high-paying, high prestige career tracks, to win promotions and attendant raises, to live in the best neighborhoods and send their kids to the best schools. Few status games lack some kind of economic output—people who play sports well below the professional level still get some job opportunities out of it.

One could probably also add a geographical factor: in an age of cheap, ubiquitous connectivity, access to economic and cultural opportunities is less dependent on being located in a buzzing metropolis or creative-class hive. If a copywriter or app developer can work from anywhere, and if music and art scenes (or whatever replaces the post-punk rock'n'roll-era construction of the "music scene" in the cultural ecosystem) are centred around blogs rather than physical venues, one doesn't even need to move to find like-minded people; and once the price of living somewhere cheaper no longer includes disconnection from opportunity, there would be less competition for the most desirable areas.
The Bleeding Obvious: A sociological study from Australia has shown that people who fly national flags on their cars are more likely to harbour xenophobic attitudes, with both exclusionary views of who belongs in their society and hostility to those outside that circle:
Professor Fozdar said 43 per cent of those with car flags said they believed the White Australia Policy had saved Australia from many problems experienced by other countries, while only 25 per cent without flags agreed.
A total of 55 per cent believed migrants should leave their old ways behind, compared with 30 per cent of those without flags.
"Very clear statistical differences in attitudes to diversity between those who fly car flags and those who don't, show that flag waving − while not inherently exclusionary – is linked in this instance to negative attitudes about those who do not fit the 'mainstream' stereotype," she said.

The study also revealed more young people flying flags than older people; perhaps a sign that a more liberal older generation, who grew up in the wake of the cultural struggles of the Sixeventies and the Whitlam-era progressive consensus, is being supplanted by the children of Howard, Hanson and Hillsong, whose views on what belongs in Australia are a lot narrower?
The study was done in Australia, surveying people flying the Australian flag on their cars in the run-up to Australia Day, though I imagine similar results would be found among people flying the flag in other circumstances (such as wearing it as a cape at the Big Day Out), or in other countries: I imagine those flying the Cross of St. George in England are more likely to vote UKIP, to have uttered the phrase "bloody Pakis" at some point in their lives, to have an aversion to eating "foreign muck", and to complain about foreigners "coming here, stealing our jobs and not working".
A few random odds and ends which, for one reason or another, didn't make it into blog posts in 2011:
- Artificial intelligence pioneer John McCarthy died this year; before he did, though, he wrote up a piece on the sustainability of progress, contending that progress is both sustainable and desirable, for at least the next billion years, with resource limitations being largely illusory.
- As China's economy grows, dishonest entrepreneurs are coming up with increasingly novel and bizarre ways of adulterating food:
In May, a Shanghai woman who had left uncooked pork on her kitchen table woke up in the middle of the night and noticed that the meat was emitting a blue light, like something out of a science fiction movie. Experts pointed to phosphorescent bacteria, blamed for another case of glow-in-the-dark pork last year. Farmers in eastern Jiangsu province complained to state media last month that their watermelons had exploded "like landmines" after they mistakenly applied too much growth hormone in hopes of increasing their size.
Until recently, directions were circulating on the Internet about how to make fake eggs out of a gelatinous compound comprised mostly of sodium alginate, which is then poured into a shell made out of calcium carbonate. Companies marketing the kits promised that you could make a fake egg for one-quarter the price of a real one.
- The street finds its own uses for things, and places develop local specialisations and industries: the Romanian town of Râmnicu Vâlcea has become a global centre of expertise in online scams, with industries arising to bilk the world's endless supply of marks, and to keep the successful scammers in luxury goods:
The streets are lined with gleaming storefronts—leather accessories, Italian fashions—serving a demand fueled by illegal income. Near the mall is a nightclub, now closed by police because its backers were shady. New construction grinds ahead on nearly every block. But what really stands out in Râmnicu Vâlcea are the money transfer offices. At least two dozen Western Union locations lie within a four-block area downtown, the company’s black-and-yellow signs proliferating like the Starbucks mermaid circa 2003.
It’s not so different from the forces that turn a neighborhood into, say, New York’s fashion district or the aerospace hub in southern California. “To the extent that some expertise is required, friends and family members of the original entrepreneurs are more likely to have access to those resources than would-be criminals in an isolated location,” says Michael Macy, a Cornell University sociologist who studies social networks. “There may also be local political resources that provide a degree of protection.”
- Monty Python's Terry Jones says that The Life Of Brian could not be made now, as it would be too risky in today's climate of an increasingly strident religiosity exercising its right to take offense:
The 69-year-old said: "I took the view it wasn't blasphemous. It was heretical because it criticised the structure of the church and the way it interpreted the Gospels. At the time religion seemed to be on the back burner and it felt like kicking a dead donkey. It has come back with a vengeance and we'd think twice about making it now."
- The Torygraph's Charles Moore: I'm starting to think that the Left might actually be right:
And when the banks that look after our money take it away, lose it and then, because of government guarantee, are not punished themselves, something much worse happens. It turns out – as the Left always claims – that a system purporting to advance the many has been perverted in order to enrich the few. The global banking system is an adventure playground for the participants, complete with spongy, health-and-safety approved flooring so that they bounce when they fall off. The role of the rest of us is simply to pay.
- The sketchbooks of Susan Kare, the artist who designed the icons, bitmaps and fonts for the original Macintosh, and went on to an illustrious career as a pixel artist (Microsoft hired her to do the Windows 3.x icons, and some years ago, Facebook hired her to design the virtual "gifts" you could buy for friends.) The sketchbooks show her original Macintosh icons, which were drawn by hand on graph paper (because, of course, they didn't have GUI tools for making icons back then).
- How To Steal Like An Artist: advice for those who wish to do creative work.
- The street finds its own uses for things (2): with the rise of the Arduino board (a low-cost, hackable microcontroller usable for basically anything electronic you might want to program), anyone can now make their own self-piloting drone aircraft out of a radio-controlled plane. And it isn't actually illegal in itself (at least in the US; YMMV).
- An answer to the question of why U2 are so popular.
In praise of Joanne Rowling's Hermione Granger series, which lauds the popular novelist for standing up to commercial pressure to adhere to traditional gender stereotypes and pepper her story with hackneyed clichés because they're, you know, "more marketable":
And what a show it is. In Hermione, Joanne Rowling undermines all of the cliches that we have come to expect in our mythic heroes. It’s easy to imagine Hermione’s origin story as some warmed-over Star Wars claptrap, with tragically missing parents and unsatisfying parental substitutes and a realization that she belongs to a hidden order, with wondrous (and unsettlingly genetic) gifts. But, no: Hermione’s normal parents are her normal parents. She just so happens to be gifted. Being special, Rowling tells us, isn’t about where you come from; it’s about what you can do, if you put your mind to it. And what Hermione can do, when she puts her mind to it, is magic.
The character of Harry Potter is an obnoxious error in the Hermione Granger universe, made more obnoxious by his constant presence. It’s tempting to just write Harry off as a love interest who didn’t quite work out; the popular-yet-brooding jock is hardly an unfamiliar type. And, given that Hermione is constantly having to rescue Harry, he does come across as a sort of male damsel-in-distress.

But, if we look closely, we can see that Harry is a parody of every cliche Rowling avoided with Hermione. Harry is not particularly bright or studious; he’s provided with an endless supply of gifts and favors; he’s the heir to no less than two huge fortunes; he’s privileged above his fellow students, due to his fame for something he didn’t actually do himself; he even seems to take credit for “Dumbledore’s Army,” which Hermione started. Of course this character is obnoxious. It’s only by treating ourselves to the irritation caused by Harry that we can fully appreciate Hermione herself.

Which makes for an astute critique of the reactionary elements of popular fiction, of which Harry Potter is an exemplar. Whether it's convincing as a counterfactual history, though, is another matter; had Rowling written her books in the way the article describes, what's to say they wouldn't have sunk into obscurity like a lot of worthily didactic left-wing fiction, championed only by those so cultishly right-on that they condemn the Grauniad as a right-wing hate sheet?
A piece in the Observer looks at the privatisation of public space in Britain: how many of the "public spaces" created by private developers in neo-Thatcherite Britain are not actually public spaces but private ones, where the developers allow the public in, with conditions, much like shopping malls. The public who use these spaces do so on the sufferance of the owners, who are legally within their rights to prohibit anything from photography to public displays of affection to any sort of democratic unpleasantry:
City Hall – the riverside HQ of London's elected government – stands in a privately owned and managed development called More London. Should anyone wish to protest here against the actions of the mayor, they would not be allowed to do so.
With the Liverpool One development a large part of the city effectively became a shopping mall without a roof. Formerly public streets are now privately managed, and a popular indoor market was closed. Liverpool One is not gated but its architectural style and treatment create what has been called an "invisible wall" around it.
The redevelopment of Paternoster Square, next to St Paul's Cathedral, has in its middle a piazza repeatedly described as a "public space". When its owners feared that Occupy London protesters would move into it, however, a sign went up saying that it is "private land".

Whilst a product of St. Margaret's vanquishment of post-WW2 quasi-socialism, the privatisation of public space found its place after the fall of the Berlin Wall, in the zeitgeist of Francis Fukuyama's "End of History". After all, if history is over and we're all happy consumers forever, things like public squares are as anachronistic as castles; there are no more issues of ideology to be thrashed out that could necessitate the unsightly spectacle of public protest, and democracy is best left to professional managers and corporate stakeholders, all watched over by the beneficent invisible hand of the free market.
Two decades later, however, it emerges that the seemingly endless boom of consumer capitalism was the product of a middle class with disposable income, a class now being eroded, with increasing numbers of people facing poorer standards of living than their parents and grandparents did; this may be when the privatisation of public space comes into its own. For protests to pass a tipping point, there has to be collective awareness of a reality: it's not enough for everyone to know that the emperor has no clothes; everyone also has to know that everyone else knows before anyone can act on this without fear. This is why public spaces (such as, say, Tahrir Square or Tiananmen Square) can breed protest, and consequently trouble for the stakeholders of the status quo. Abolishing such spaces, and effectively interdicting anybody who looks like starting any sort of protest, may be a necessary move as the squeeze takes hold.
Writing in the Pinboard blog, Maciej Ceglowski tears apart the concept of the "social graph", saying that it is neither social nor a graph, but a sort of pseudoscience invented by socially-challenged geeks and now peddled by hucksters out to monetise you and your relationships:
Last week Forbes even went to the extent of calling the social graph an exploitable resource comparable to crude oil, with riches to those who figure out how to mine it and refine it. I think this is a fascinating metaphor. If the social graph is crude oil, doesn't that make our friends and colleagues the little animals that get crushed and buried underground?

The first part of his argument has to do with the inadequacy of the "social graph" model for representing all the nuances of human social relationships in the real world: the many gradations of friendship and acquaintance; the ways relationships change and evolve, making a mockery of nailed-down static representations; the way that describing a relationship can change it in some cases; and various issues of privacy and multi-faceted identity, things which exist trivially in the real world, even if they're in violation of the Zuckerberg Doctrine.
One big sticking point is privacy. Do I really want to find out that my pastor and I share the same dominatrix? If not, then who is going to be in charge of maintaining all the access control lists for every node and edge so that some information is not shared? You can either have a decentralized, communally owned social graph (like Fitzpatrick envisioned) or good privacy controls, but not the two together.
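The access-control problem described above is easy to make concrete. Below is a toy sketch (all names, classes and fields are hypothetical, not any real site's API) of a shared social graph in which every edge carries its own access-control list; even at this tiny scale, every relationship must declare its audience, and every query over the graph has to filter through all of the per-edge ACLs:

```python
# Toy sketch (hypothetical names throughout): a shared social graph where
# every edge carries its own access-control list (ACL).

class Edge:
    def __init__(self, a, b, kind, visible_to):
        self.a, self.b, self.kind = a, b, kind
        self.visible_to = set(visible_to)  # who may see this edge

class SocialGraph:
    def __init__(self):
        self.edges = []

    def relate(self, a, b, kind, visible_to=()):
        # The two parties can always see their own relationship;
        # anyone else must be granted access explicitly.
        self.edges.append(Edge(a, b, kind, {a, b, *visible_to}))

    def edges_visible_to(self, viewer):
        # Every query must consult every per-edge ACL.
        return [(e.a, e.b, e.kind) for e in self.edges
                if viewer in e.visible_to]

g = SocialGraph()
g.relate("alice", "pastor", "congregant")
g.relate("alice", "mistress_d", "client")   # ACL: just the two parties
g.relate("pastor", "mistress_d", "client")

# The pastor sees his own edges, but not Alice's second one:
print(g.edges_visible_to("pastor"))
```

In a decentralised, communally owned graph, there is no single party positioned to maintain all those `visible_to` sets, which is the trade-off the quoted paragraph points at.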
This obsession with modeling has led us into a social version of the Uncanny Valley, that weird phenomenon from computer graphics where the more faithfully you try to represent something human, the creepier it becomes. As the model becomes more expressive, we really start to notice the places where it fails.
You might almost think that the whole scheme had been cooked up by a bunch of hyperintelligent but hopelessly socially naive people, and you would not be wrong. Asking computer nerds to design social software is a little bit like hiring a Mormon bartender. Our industry abounds in people for whom social interaction has always been more of a puzzle to be reverse-engineered than a good time to be had, and the result is these vaguely Martian protocols.

Of course, whilst the idea of the social graph may not be good for modelling real-life social interactions with naturalistic fidelity, it has been a boon for targeting advertising; the illusion of social fulfilment is enough to keep people clicking and volunteering information about themselves. From the advertisers' point of view, the fish not only jump right into the boat, they fillet themselves in mid-air and bring their own wedges of lemon:
Imagine the U.S. Census as conducted by direct marketers - that's the social graph. Social networks exist to sell you crap. The icky feeling you get when your friend starts to talk to you about Amway, or when you spot someone passing out business cards at a birthday party, is the entire driving force behind a site like Facebook.

There is some good news, though: while general-purpose social web sites with the ambition of mediating (and monetising) the entirety of human social interaction may fail creepily as they approach their goal, special-purpose online communities can thrive in their niches:
The funny thing is, no one's really hiding the secret of how to make awesome online communities. Give people something cool to do and a way to talk to each other, moderate a little bit, and your job is done. Games like Eve Online or WoW have developed entire economies on top of what's basically a message board. MetaFilter, Reddit, LiveJournal and SA all started with a couple of buttons and a textfield and have produced some fascinating subcultures. And maybe the purest (!) example is 4chan, a Lord of the Flies community that invents all the stuff you end up sharing elsewhere: image macros, copypasta, rage comics, the lolrus. The data model for 4chan is three fields long - image, timestamp, text. Now tell me one bit of original culture that's ever come out of Facebook.

I wonder whether there is a dichotomy there between sites and networks; would a special-interest site that used, say, Facebook's social graph as a means of identifying users (rather than having its own system of accounts, usernames, profiles and, optionally, friendship/trust edges) be infected by the Zuckerbergian malaise?
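For contrast with the heavyweight social-graph model, the "three fields long" data model quoted above really can be written down in a few lines. A minimal sketch (the field types are my assumptions; the quote names only the fields):

```python
# Minimal rendering of the "three fields long" data model: image,
# timestamp, text. Everything else -- threads, memes, culture -- is
# emergent rather than encoded in the schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    image: Optional[bytes]  # attached image, if any
    timestamp: float        # posting time, as epoch seconds
    text: str               # the message body

# A "thread" is nothing more than a list of such posts:
thread = [
    Post(image=b"lolrus.jpg", timestamp=1316300000.0, text="i has a bukkit"),
    Post(image=None, timestamp=1316300042.0, text="oh noes they be stealin mah bukkit"),
]
```

The design point is that the schema encodes almost nothing; all the structure people care about lives in the community's conventions, not the data model.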
Around Leicester Square and Chinatown, one sometimes sees a hoody with a traffic cone:
The figure, sitting there in the middle of pedestrian traffic, burbling into a traffic cone, seemingly oblivious to the comings and goings of tourists and Londoners, seems like some non-specific element of as-yet undifferentiated satire or social criticism, some amalgam of Hogarth and Banksy, Chris Morris and Thom Yorke. But what does he represent? Is he the feral Other, attired in the uniform of Britain's demonised youth, brazenly possessing a traffic cone he is unlikely to have acquired legitimately, and embodying threat? The drug-zombie, mouth clamped onto a pipe, oblivious to the mores of respectable society? Or the embattled Everyman, reacting to unreasonable circumstances in the only reasonable way, by curling up into a ball and gibbering? Is he a satire on the malaise of Broken Britain, or the mindset of the sorts of people who use the phrase "Broken Britain" in the mistaken belief that it was ever not, or both?
More on the Pirate Party's recent electoral success in Berlin: Der Spiegel asks who the Pirate Party are (spoiler: they're the new Greens):
Voter analysis from Sunday would seem to back up that assessment. The survey group Infratest established that 17,000 former Green Party supporters switched their votes to the Pirate Party on Sunday, more than came from any other party. The SPD lost 14,000 voters to the Pirates and the far-left Left Party 13,000.
The party's largest coup, however, came from its ability to attract fully 23,000 people to the polls who had never voted before. More votes came from former East Berlin, where the party secured 10.1 percent of the vote, than from former West Berlin. Most of the party's supporters are young, well-educated men -- as are 14 of the 15 Pirates who will now take their seats in the Berlin city-state parliament.

And a Spiegel survey of editorials from various German newspapers (conveniently annotated with their political slants) links the Pirate vote to the rise of the laptop-and-latte generation in Berlin, a city now said to be Europe's IT start-up hub. Which raises the question of whether the Pirates are a progressive party for an age of gentrification.
Meanwhile, the Grauniad asks whether something like that could happen in Britain. (Spoiler: not in a first-past-the-post system; and Britain's politicians also seem less technologically clueful, and more beholden to the old-media powerbrokers, than Germany's):
The German government was one of the first to decide that national-security systems should not be based on proprietary software. In such a climate it's predictable that a campaigning political party with a radical online agenda would find a ready audience. The bovine way in which the last House of Commons passed Lord Mandelson's digital economy bill, with its clueless 'anti-piracy' provisions, does not exactly engender confidence in the British political class's understanding of these matters.
As the world celebrated Talk Like A Pirate Day (with the true hardcore eschewing the "yarrr"s and brushing up on their Somali), the good burghers of Berlin have gone one better; there, the Pirate Party has won 15 seats in the city-state's 149-seat parliament: about half as many as the Greens and slightly fewer than the neo-Communist Left Party.
Indeed, the support for the party -- founded in 2006 on a civil liberties platform that focused on Internet freedoms -- was sensational. Not only will the Pirate Party enter a regional parliament for the first time, but its results far surpassed the five percent hurdle needed for parliamentary representation. The success was so unexpected that the party had only put 15 candidates on its list of nominations. Had their support been just a little higher, some of their seats would have remained empty because post-election nomination of candidates isn't allowed.

Many of the seats came at the expense of the neoliberal Free Democrats, who were wiped out in Berlin. The Pirate Party (which started campaigning on a copyright-reform and online privacy platform, and expanded this to include the decriminalisation of drugs, the abolition of Germany's church tax system and a basic living wage for all) seems, in fact, to be taking over the mantle of forward-looking progressive party from the Greens, who were once considered dangerous radicals (in the Reagan-era action film Red Dawn, the Greens winning West German elections was the catalyst that led to a Soviet invasion of the USA) but have now become all but part of the establishment.
The Pirates also have something other parties have long since lost -- credibility, authenticity and freshness. The erstwhile alternative Greens, whose share of the vote in the Berlin election fell well behind their expectations, were also once the young party with funny mottos and unconventional campaign methods. When they entered the Berlin parliament in 1981, other parties were skeptical. At the time, the now imploding Free Democrats described the Greens as "domestic policy anarchists and foreign policy gamblers", while lead CDU candidate Richard von Weizsäcker, who would later be appointed German President, said they were "impossible to describe."

It used to be that the concept of "Green" (i.e., ecological consciousness and sustainability) was the hook on which to hang progressive ideals; now, it seems, the idea of the Pirate (as defined in opposition to the propaganda of Big Copyright, the steady privatisation of the public sphere and an encroaching authoritarian surveillance state) may be replacing Greenness as the banner that draws in progressives.
There's an interesting piece in Der Spiegel about the rise of secularism and the psychological differences between religious and secular people. According to the article, the non-religious (atheists, agnostics and the unaffiliated) make up about 15% of the world's population, placing them third behind Christians and Muslims in number. Meanwhile, secularism is on the rise: the oft-discussed religious revivals, in Europe, the US and elsewhere, have more often than not turned out to be illusory, and an increasing number of people who identify as religious on surveys admit that they don't actually believe in a deity. (In the US, a country associated with almost mediaeval levels of religiosity in public life, churches are losing up to 1 million members a year.)
According to Boston University psychologist Catherine Caldwell-Harris, the differences between the religious and secular minds may emerge from different thinking styles, with religious people being more likely to attribute sentient agency than secular people:
Caldwell-Harris is currently testing her hypothesis through simple experiments. Test subjects watch a film in which triangles move about. One group experiences the film as a humanized drama, in which the larger triangles are attacking the smaller ones. The other group describes the scene mechanically, simply stating the manner in which the geometric shapes are moving. Those who do not anthropomorphize the triangles, she suspects, are unlikely to ascribe much importance to beliefs. "There have always been two cognitive comfort zones," she says, "but skeptics used to keep quiet in order to stay out of trouble."

The rise of secularism has led to more study of what secularists do actually believe. And, it seems, there are a few outlooks they tend to share:
Sociologist Phil Zuckerman, who hopes to start a secular studies major at California's Pitzer College, says that secularists tend to be more ethical than religious people. On average, they are more commonly opposed to the death penalty, war and discrimination. And they also have fewer objections to foreigners, homosexuals, oral sex and hashish.
The most surprising insight revealed by the new wave of secular research so far is that atheists know more about the God they don't believe in than the believers themselves. This is the conclusion suggested by a 2010 Pew Research Center survey of US citizens. Even when the higher education levels of the unreligious were factored out, they proved to be better informed in matters of faith, followed by Jewish and Mormon believers.

The article also looks at the case of religiosity in Germany, where the East was ruled by an officially atheistic totalitarian dictatorship while the West retained strong links to Christianity. After reunification, the East remained considerably poorer than the West. Perhaps surprisingly, these conditions did not result in a new religious revival spreading through the East, but rather the opposite:
When the GDR ended its period of religious repression, no process of re-Christianization occurred. "After the fall of the Berlin Wall, the withdrawal of a church presence in the east actually sped up," says Detlef Pollack, a professor in the sociology of religion at the University of Münster. Ironically, the link between church and state contributed to secularization in the East, he says. Publicly funded theological professorships, military chaplaincies, and the presence of church representatives on broadcasting councils were common. As a result, public perception came to closely link authority with religion, which was seen as coming from the West.

As rapidly as secularism is rising, though, we might not see a powerful secular lobby any time soon. For one, secularists remain mistrusted in many places (in the US, according to a 2010 Pew Research survey, atheists are the most disliked group, ahead of even Muslims and homosexuals). And secondly, given the broad differences in a movement by definition not bound by any dogma, the emergence of any sort of consensus is unlikely:
Then he tells of a meeting of secular groups last year in Washington. They were planning a big demonstration. "But they couldn't even agree on a motto," he says. "It was like herding cats, straight out of a Monty Python sketch." In the end, the march was called off.
In the wake of riots across the UK, the BBC asks what turns ordinary people into looters:
Psychologists argue that a person loses their moral identity in a large group, and empathy and guilt - the qualities that stop us behaving like criminals - are corroded. "Morality is inversely proportional to the number of observers. When you have a large group that's relatively anonymous, you can essentially do anything you like," according to Dr James Thompson, honorary senior lecturer in psychology at University College London.
He rejects the notion that some of the looters are passively going with the flow once the violence has taken place, insisting there is always a choice to be made.
Workman argues that some of those taking part may adopt an ad hoc moral code in their minds - "these rich people have things I don't have so it's only right that I take it". But there's evidence to suggest that gang leaders tend to have psychopathic tendencies, he says.
[Criminologist Prof. John Pitts] says most of the rioters are from poor estates who have no "stake in conformity", who have nothing to lose. "They have no career to think about. They are not 'us'. They live out there on the margins, enraged, disappointed, capable of doing some awful things."
There have been three days of rioting across London (and now other parts of Britain). The riots started after the fatal shooting by police of a Tottenham man, said to be a responsible family man, though alleged to have current or former gang links. The riots soon spread, with gangs of youths organising through BlackBerry Messaging (which is harder for police to monitor than the internet) and looting shops. Sportswear and consumer electronics were reportedly the most stolen items, though baby buggy shops were broken into (presumably for use in carrying stolen goods), while other groups systematically mugged passersby. (One gang stormed a posh Notting Hill restaurant and robbed the patrons.) Meanwhile, another group trashed a gay bookshop in Camden, whilst leaving other businesses alone. Elsewhere, ordinary people were burned out of their homes when shops were torched; a news photograph shows a woman leaping for her life from a burning building in Croydon.
In a sense, this was an aspirational riot. While it may have started with anger over police violence, and mistrust of the police, it soon degenerated into an excuse to stock up on Nikes and plasma TVs, as well as engage in lots of fun ultra-violence (just like a video game, only better!) The conditions for the riot may have been set by the Thatcherite-Blairite ideology of helping the rich get richer and letting the devil take the hindmost, but the riots were an affirmation of these values, filtered through the equally sociopathic antihero mythology of US gang culture, from Scarface, via gangsta rap, to video games (some rioters referred to the police as "the feds", presumably imagining themselves to be 50 Cent or the guy from Grand Theft Auto or something), secondhand Jamaican gang machismo (which could explain attacking gay bookshops) and even half-digested bits of Tea Party-style moronic entitlement (one junior Dagny Taggart with an armful of trainers was heard to say "I'm getting my taxes back"). There was no challenge to the status quo here, only an extrapolation of it.
Meanwhile in North London, the Sony music/video distribution warehouse was burned to the ground, destroying the inventory of dozens of independent music labels and film distributors, among them Beggars/4AD, Domino, Thrill Jockey, FatCat and Soul Jazz. I wonder how many of them, already kicked by the recession, will go under.
Meanwhile, the Metropolitan Police are posting CCTV images of looters to see if anyone can identify them, and so is an independent site. And in riot-hit areas, the local communities have united to clean up after the riot, with volunteers signing on via social networks.
Sydney's status as one of the world's most liveable cities has recently been threatened by spiralling rents and property prices. In an attempt to turn this about, the New South Wales government has announced that it will pay Sydneysiders to leave, with eligible residents standing to get AU$7,000 to move to the countryside:
The one-off grants to move to country areas will be payable to individuals or families provided they sell their Sydney home and buy one in the country. The country home must be worth less than $600,000 (£390,000), something that won't be hard in most rural areas. It will cost the taxpayer up to $47m (£30m) a year.
As much as boosting regional areas, the scheme is also about making Sydney more liveable. The city's population is 4.5m and predicted to grow by 40% over the next 30 years, putting unprecedented pressure on infrastructure and housing.

The government reportedly considered, for almost five minutes, increasing the building density in Sydney to something approaching European levels, before it was pointed out that doing so would be fundamentally un-Australian, and would violate Australians' rights to a house on a quarter-acre block with a two-car garage, which, much like Americans' right to bear arms, is sacrosanct and not negotiable.
It's not clear whether Melbourne (which is about as expensive as Sydney these days, making up for its lack of a spectacular harbour with a thousand funky laneway bars) will follow this lead and offer people money to move to Geelong or Moe or somewhere.
This article looks at the malaise in indie/hipster culture, and places the blame squarely at the feet of 1990s proto-hipster Beck:
The two most common characteristics of the “indie” persona these days, at least in North America, are an aversion to overt seriousness and the ability to find everything “awesome”. These characteristics often intermingle and feed off one another, creating the voracious indie devourer who is able to simultaneously enjoy every kind of music while at the same time not particularly caring about anything. They are the ultimate consumer, willing to embrace and discard bands at a moment’s notice while never questioning what led them to lose interest in one band and embrace another. Awkward inquiries about almost any subject can be dealt with in a detached and deliberately ironic manner — following trends is awesome, selling out is awesome, being shallow is awesome, sweatshops are awesome. When it comes to fashion, trashiness battles against both vintage store retro and American Apparel chic as the dominant form, and everyone thinks that everybody but themselves is a hipster. How this persona was birthed is a relatively straightforward tale, as suburban America fell in love with the vulgar commercial product of its youth. An ironic approach was already somewhat popular but something, or in this case someone, happened in the ‘90s to turn what was a mere aspect of American culture into the dominant personality trait of American teenagers, twenty-somethings and, at this point, thirty-somethings. That someone was Beck.
Cinema in the 90s reflected this shift in taste, with the ultra-violence of Quentin Tarantino’s movies creating a detached, cartoonish reality that allowed the viewer to feel unconcerned as to the repercussions of the savagery on screen. The character’s brutal transgressions are played out for entertainment and amusement rather than illustrating any kind of painful struggle. Tarantino’s movies were also filled with pop culture references that allowed the viewer to feel like they were part of the director’s insular self-congratulatory world. If America in the 70s wrestled with moral dilemmas and a diminished sense of individuality and reach, then pop culture mavens in the 90s merely wanted to be in on the joke. To music fans who imagined themselves to be more alternative in their approach, Beck fulfilled this need. His music basked in the mindset of trash culture and knowing irony, of sneering at seriousness, of adopting hip-hop beats to play up the now utterly commonplace “look at me I’m a nerdy white guy rapping about ridiculous things” persona that has managed to all but reduce hip-hop to a comedy sideshow for those who need an occasional break from their Arcade Fire or Vampire Weekend albums.

The ironic stance, the article argues, was a false victory, delivering the counterculture straight into the arms of the consumerist mainstream. After all, you can buy more crap if you're doing so ironically:
Consumerism thrives on people getting excited about, and buying, things that they ultimately don’t care about. In this sense the ironic persona is the ultimate gift to consumerism. Mainstream music revels in easy sentiment and soul-crushing banality and can only truly be enjoyed by not paying attention to the lyrics. Beck’s meaningless babble trained a generation of young ears to seek out amusing sound-bites over articulate content and in doing so helped break down the last vestiges of ‘alternative’ music by making it as equally meaningless as, and therefore all but identical to, mainstream drivel.

I'm wondering whether the rise to dominance of the stance of ironic detachment and the tendency of musicians and bands to define themselves publicly by catalogues of their influences ("we're kraut-punk meets Afrobeat meets New Jack Swing") could not both be symptoms of a more abstract shift from directness and immediacy towards mediation and referentiality, an addition of levels of abstraction to the processes of culture, a tendency to see and do things from one step removed.
A new study has shown that violent video games decrease crime rates. While they do increase aggression in the players, the incapacitation effect of the players being drawn into sitting in front of a computer or console for extended periods of time, and thus unlikely to attack anything larger than a plate of nachos in reality, outweighs this.
The Guardian looks at whether intellectuals get as little respect in British culture as one is inclined to think:
Britain is a country in which the word "intellectual" is often preceded by the sneering adjective "so-called", where smart people are put down because they are "too clever by half" and where a cerebral politician (David Willetts) was for years saddled with the soubriquet "Two Brains". It's a society in which creative engineers are labelled "boffins" and kids with a talent for mathematics or computer programming are "nerds". As far as the Brits are concerned, intellectuals begin at Calais and gravitate to Paris, where the fact that they are lionised in its cafes and salons is seen as proof that the French, despite their cheese- and wine-making skills, are fundamentally unsound. Given this nasty linguistic undercurrent, a Martian anthropologist would be forgiven for thinking that Britain was a nation of knuckle-dragging troglodytes rather than a cockpit of vibrant cultural life and home to some of the world's best universities, most creative artists, liveliest publications and greatest theatres and museums.

There are various theories attempting to explain the British disdain for intellectuals: that Britain, because of its temperate cultural climate and historical good fortune, has not had to evolve an intelligentsia as more fraught countries such as France and Germany have; that Britain (or at least England), in valuing the empirical over the theoretical (or, conversely, being a "nation of shopkeepers", as Napoleon put it), has little room for the kinds of florid theorists who flourish across the Channel, preferring more practical thinkers; or (as the article suggests) that Britain is every bit as governed by ideas as the Continent is, and the supposed disdain for intellectuality is actually a disdain for blowing one's own horn or being too earnest. Or, perhaps, a combination of these.
And while English anti-intellectualism (the Scots may well argue that it is strictly a south-of-the-border phenomenon) may disdain the more abstract and less market-ready areas of thought, the colonial strains are considerably more virulent:
Marginson thinks there is a particular problem for science common to most English-speaking countries except Canada, which has a strong French influence. He says that in Australia, particularly in working-class cultures: ''Not all people think it is smart to learn; some feel it is not going to help them much and they think people who do well at school are wankers. It is a view pretty commonly felt and is not terribly conducive to having a highly educated population.''

To be fair, I've seen the same argument made about British working-class culture, though combined with nostalgia for an age when self-improvement was a widespread working-class ideal, now sadly replaced by the acquisition of bling.
On the eve of the Eurovision Song Contest, Der Spiegel has a piece on a group of academics who are looking at what the competition says about European cultures:
Take the 2007 winner, Serbia's Marija Serifovic. Many interpreted her act to be that of a campy, butch lesbian, but Gluhovic argues that people in the East viewed it differently, noting that the song's title, "Molitva" ("prayer"), is almost the same word in many Slavic languages. Viewers in Prague, Zagreb or Moscow may have been more inclined to think of the song as a prayer for a Serbia where EU sanctions against the former Milosevic regime had only just been lifted.
One thing neither academic disputes is the fact that countries in Eastern Europe and far beyond are investing heavily in their Eurovision acts as a way of polishing their images abroad. From Kiev to Moscow to Baku, tens of millions of euros have been spent on campaigns to burnish their images at Eurovision. Two approaches have proven highly popular -- either attempts to "self-exoticize" a country's "Orientalness" or Eastern culture, or to bring in famous producers to emulate Western pop styles.

And while new arrivals go for nouveau-riche glamour to make an impression, those closer in seek to tone their appearance down, to distance themselves from their arriviste neighbours, not unlike the English class system:
Despite all the exuberant performers, some new entrants take a conservative approach. Researchers working on the Eurovision 'New Europe' project have seen a trend in Poland in which the country eschews the more outlandish performances adopted by some of its neighbors in favor of more mainstream pop. "In terms of their look and the way they sound, they have a strategy of disidentification with the more exotic East, thereby claiming its position in the Central European cultural core and values." The strategy has been a loser in terms of votes, however.

Meanwhile, there is the question of Eurovision's campness and function as a signifier of gay identity, particularly in places where open homosexuality is disapproved of or worse:
At times, she continues, Eurovision can be outrageous, and at others downright silly, which all plays into its camp appeal. And in the past, Eurovision was a "secret code or club" for being gay in countries like Ireland, where homosexuality was only decriminalized in 1993. "You had a secret and your friends had a secret and you had those parties every year," Fricker says.
More recently, Eurovision has underscored differences in acceptance of homosexuality in different parts of Europe that give little reason to celebrate. When Belgrade hosted the contest in 2008, welcome packages for Eurovision attendees included warnings against displaying same-sex affection in a city that gets low marks for gay-friendliness. Moscow, which hosted in 2009, isn't exactly known as a bastion of tolerance either.

Interestingly enough, in Australia, where Eurovision is broadcast most of a day later (a function of Australia having a lot of descendants of European migrants with connections to their old countries; the US, incidentally, doesn't have Eurovision, and Americans I've spoken to have found it befuddling, in the same way westerners see Japanese game shows), Eurovision isn't seen as a specifically gay thing, but rather a piece of kitsch to have a good laugh at with friends. This seems to be particularly common in the inner-city areas, populated by bohemians and avant-bourgeoisie who, thanks to SBS, have a finely tuned taste for Euro-kitsch.
Christopher Hitchens weighs in on the Royal Wedding, and, as usual, pulls no punches. The Hitch in full form is a splendid thing to behold:
A hereditary monarch, observed Thomas Paine, is as absurd a proposition as a hereditary doctor or mathematician. But try pointing this out when everybody is seemingly moist with excitement about the cake plans and gown schemes of the constitutional absurdity's designated mother-to-be. You don't seem to be uttering common sense. You sound like a Scrooge. I suppose this must be the monarchical "magic" of which we hear so much: By some mystic alchemy, the breeding imperatives for a dynasty become the stuff of romance, even "fairy tale." The usually contemptuous words fairy tale were certainly coldly accurate about the romance quotient of the last two major royal couplings, which brought the vapid disco-princesses Diana and Sarah (I decline to call her "Fergie") within range of demolishing the entire mystique. And, even if the current match looks a lot more wholesome and genuine, its principal function is still to restore a patina of glamour that has been all but irretrievably lost.
For Prince William at least it was decided on the day of his birth what he should do: Find a presentable wife, father a male heir (and preferably a male "spare" as well), and keep the show on the road. By yet another exercise of that notorious "magic," it is now doubly and triply important that he does this simple thing right, because only his supposed charisma can save the country from what monarchists dread and republicans ought to hope for: King Charles III. (Monarchy, you see, is a hereditary disease that can only be cured by fresh outbreaks of itself.) An even longer life for the present queen is generally hoped for: failing that a palace maneuver that skips a generation and saves the British from a man who—like the fruit of the medlar—went rotten before he turned ripe.
Myself, I wish her well and also wish I could whisper to her: If you really love him, honey, get him out of there, and yourself, too. Many of us don't want or need another sacrificial lamb to water the dried bones and veins of a desiccated system. Do yourself a favor and save what you can: Leave the throne to the awful next incumbent that the hereditary principle has mandated for it.
As the Royal Wedding approaches, progressive commentator Johann Hari makes a case against the monarchy, and what he terms its subtly corroding effect on the nation's psyche:
Of course, when two people get married, it's a sweet sight. Nobody objects to that part. On the contrary: republicans are the only people who would let William Windsor and Kate Middleton have the private, personal wedding they clearly crave, instead of turning them into stressed-out, emptied-out marionettes of monarchy that are about to jerk across the stage. We object not to a wedding, but to the orgy of deference, snobbery, and worship for the hereditary principle that will take place before, during and after it.
Kids in Britain grow up knowing that we all bow and curtsy in front of a person simply because of their unearned, uninteresting bloodline. This snobbery subtly soaks out through the society, tweaking us to be deferential to unearned and talentless wealth, simply because it's there.
We live with a weird cognitive dissonance in Britain. We are always saying we should be a meritocracy, but we shriek in horror at the idea that we should pick our head of state on merit. Earlier this month, David Cameron lamented that too many people in Britain get ahead because of who their parents are. A few minutes later, without missing a beat, he praised the monarchy as the best of British. Nobody laughed. Most monarchists try to get around this dissonance by creating – through sheer force of will – the illusion that the Windsor family really is steeped in merit, and better than the rest of us. This is a theory that falls apart the moment you actually hear Charles Windsor speak.

Monarchy, after all, is just a polite word for "hereditary dictatorship"; the difference between a monarchy and North Korea is the layers of glamour and mystique from centuries of submission and acclimatisation, which still remain in kitschy old fairy tales of wise kings, beautiful princesses and enchanted castles. Granted, in a constitutional monarchy like Britain's, in which the monarch has no power but to sit in a gilded cage, cut the odd ribbon and mouth the words written by elected politicians, the idea of monarchy is watered down almost to homeopathic levels, though the unpalatable reality of what a real monarchy would be like intrudes from time to time. For example, it is still a tradition for monarchs to act as a slightly peculiar global club and invite other crowned heads to their occasions, even if those crowned heads are actual hereditary dictators of the old school, like the Crown Prince of Bahrain, whose government recently massacred pro-democracy protesters. (The crown prince has withdrawn, much to his regret, though royals from Saudi Arabia and Swaziland, blessed with less immediately conspicuous human-rights issues, are still coming. Kim Jong Il, however, has not been invited, being too much of a hopped-up nouveau-riche to make the club.)
In a few days' time, Britain will have its first Royal Wedding since 1981; though, while it's big news in some places (such as the US), Britain seems somewhat more apathetic about the whole thing, particularly compared to Charles and Diana's nuptials:
The street party response has been disappointing – many fewer than in 1981 and a more markedly southern retro flavour to those that are planned. But 1981 was practically BC in terms of the changes to the kinds of communities that have street parties – company towns, motor towns, mining villages, the lot. After all, 1981 was the year of the Specials' "Ghost Town" and every famous riot going, but there were still communities to fight for. Now it's all grassed over for Call Centre and Office Park Britain. Gone. How could you expect cheery knees-ups everywhere?
What sociologists call our "reference groups" have changed. For the UK over-class, it's their global-rich peers (they compare themselves to Wall Street, Silicon Valley or Shanghai). And for the uniquely durable British underclass, it's Lottery winners, football players and entertainers, people in the low-end celebrity press, Fickle Fingers of Fate people. To stay relevant, royal people and styles have both to acknowledge all this, yet still stay aloof from it.

Which is not to say Britain doesn't have communities with cohesion—one of them just rioted against the opening of a Tesco—but they're not the sorts of places that put up the bunting of Royalism, except perhaps in the most backhandedly sarcastic way (at the markets of Hackney, they're flogging boxing-match-flyer-style tea towels and commemorative mugs with the wrong prince printed on them to the bohemians and avant-bourgeoisie). Meanwhile, in the harrumphingly blue-ribboned Tory shires, they're probably too cosseted in their SUVs and too busy with serious business to spend much time putting up bunting and organising egg-and-spoon races; it's not really the sort of thing you can hire some Polish and/or Lithuanian workers to do either. Everybody else, meanwhile, thanks the two young people for a long weekend to fly to Spain for and/or commiserates with Ms. Middleton on the impending end of her life as a private individual.
London folk singer Emmy The Great has written a song in back-handed tribute to the Royal Wedding. Titled Mistress England, it is dedicated to the mothers of the young women whom Prince William didn't end up choosing as his future queen, and it positively drips with a very British, very measured wit:
The subject has inspired a touching, tender song. "Fold up your clean white invitations/ There is no need to keep them now," run the lyrics. "He found a Queen/ He chose another." The middle eight conjures distant churchbells, but in the Union Jack-decked garden, "no celebration here". "I'm two years younger than Kate Middleton," says Moss. "I honestly knew girls who applied to St Andrews to meet him. Presumably they're a bit miffed now."
"I keep trying to put myself in Kate Middleton's place," says Moss. "She did a degree, right, that's how she met him? I have never, ever heard it said what she studied there. But I do know what boots she likes to wear. That's a bit depressing, isn't it?"
The BBC News Magazine has an article about the shifting meaning of the adjective "bohemian", a word which started off describing vagabonds and those beyond the pale of respectable society, shifted via itinerant actors and musicians to refer to self-selected artistic outsiders who rejected bourgeois values and social norms, and is now increasingly used to refer to fashion-conscious types who engage in slightly more trendy modes of consumption (note the rise of "bobos", or "bourgeois bohemians", sometimes provocatively referred to as "White People").
In essence, bohemianism represented a personal, cultural and social reaction to the bourgeois life. And, once the latter was all but swept away by the maelstrom that was the 1960s, the former was doomed, too.

Perhaps we need a word to refer to the "bohemians"-who-aren't-really-bohemians, in that, whilst engaging with culture outside of the feeding trough of the mainstream, they do live a comfortable bourgeois life, with respectable jobs, stable living arrangements and disposable income to spend on accoutrements such as limited-edition trainers, designer glasses, fancy bicycles and Apple products. How about the "avant-bourgeoisie"?
Once a rich, almost craftsmanly, criminal tradition, pickpocketing is dying out in America, due to the success of law enforcement campaigns against it and/or the shorter attention spans of today's juvenile delinquents. And some criminologists and folk historians are lamenting this loss:
Pickpocketing in America was once a proud criminal tradition, rich with drama, celebrated in the culture, singular enough that its practitioners developed a whole lexicon to describe its intricacies. Those days appear to be over. "Pickpocketing is more or less dead in this country," says Harvard economist Edward Glaeser, whose new book, Triumph of the City, deals at length with urban crime trends. "I think these skills have been tragically lost. You've got to respect the skill of some pickpocket relative to some thug coming up to you with a knife. A knife takes no skill whatsoever. But to lift someone's wallet without them knowing …"
But even if Fagins abounded in the United States, it's unclear whether today's shrinking pool of criminally minded American kids would be willing to put in the time to properly develop the skill. "Pickpocketing is a subtle theft," says Jay Albenese, a criminologist at Virginia Commonwealth University. "It requires a certain amount of skill, finesse, cleverness, and planning, and the patience to do all that isn't there" among American young people. This is "a reflection of what's going on in the wider culture," Albenese says. If you're not averse to confrontation, it's much easier to get a gun in the United States than it is in Europe (though the penalties for armed robbery are stiffer). Those who have no stomach for violence can eke out a living snatching cell phones on the subway, which are much easier to convert to cash than stolen credit cards, or get into the more lucrative fields of credit card fraud or identity theft, which require highly refined skills that people find neither charming nor admirable in the least. Being outwitted mano a mano by a pickpocket in a crowded subway car is one thing; being relieved of your savings by an anonymous hacker is quite another.

Fortunately or unfortunately, depending on how you look at it, the craft of pickpocketing is alive and well in Europe, the home of many highly refined traditions and systems of apprenticeship:
This is not the case in Europe, where pickpocketing has been less of a priority for law enforcement and where professionals from countries like Bulgaria and Romania, each with storied traditions of pickpocketing, are able to travel more freely since their acceptance into the European Union in 2007, developing their organizations and plying their trade in tourist hot spots like Barcelona, Rome, and Prague. "The good thieves in Europe are generally 22 to 35," says Bob Arno, a criminologist and consultant who travels the world posing as a victim to stay atop the latest pickpocketing techniques and works with law enforcement agencies to help them battle the crime. "In America they are dying off, or they had been apprehended so many times that it's easier for law enforcement to track them and catch them."
A call for papers has been issued for an anthology of academic papers addressing a hitherto underexamined niche: zombies, the undead, and higher-education institutions:
This book takes up the momentum provided by the recent resurgence of interest in zombie culture to explore the relevance of the zombie trope to discussions of scholarly practice itself. The zombie is an extraordinarily rich and evocative popular cultural form, and zombidity, zombification and necromancy can function as compelling elements in a conceptual repertoire for both explaining and critically ‘enlivening’ the debates around a broad variety of cultural and institutional phenomena evident in the contemporary university. We propose to canvas a range of critical accounts of the contemporary university as a living dead culture. We are therefore seeking interdisciplinary proposals for papers that investigate the political, cultural, organisational, and pedagogical state of the university, through applying the metaphor of zombiedom to both the form and content of professional academic work.

Zombies in the Academy: Living Death in Higher Education is scheduled for publication in 2012, and will address three broad topics: "corporatisation, bureaucratisation, and zombification of higher education", "technology, digital media and moribund content distribution infecting the university", and the intriguingly phrased "zombie literacies and living dead pedagogies". The call for papers has a number of example paper topic suggestions, in which the metaphor of the undead is applied to everything from moribund institutions to Marxist critiques of "undead labour" (did Marx actually use the word "undead"?) to the question of whether zombiedom could be a positive adaptation to the academic environment.
As Britain's Tory-led government prepares to bring in a regime of cuts that make High Thatcherism look like 1970s Sweden, one of the services set to be hit hardest will be libraries, with nearly half of libraries closing in some councils (because, the reasoning seems to go, if they don't make a profit they can't be very important to the public after all, and if there is a profit to be made, the free market, in its beneficent wisdom, will step in and provide what the public wants). As for those libraries that are left standing, there won't be money to pay for professional librarians to staff them, so those will be replaced with volunteers from the general public (because, to paraphrase Jeremy Clarkson, how hard can it be to put books back on shelves?).
Here is the transcript of a speech from Philip Pullman, delivered at a meeting to save Oxfordshire's libraries, about why this ostensibly economically rational measure is actually an act of gross cultural and social vandalism:
Does he think the job of a librarian is so simple, so empty of content, that anyone can step up and do it for a thank-you and a cup of tea? Does he think that all a librarian does is to tidy the shelves? And who are these volunteers? Who are these people whose lives are so empty, whose time spreads out in front of them like the limitless steppes of central Asia, who have no families to look after, no jobs to do, no responsibilities of any sort, and yet are so wealthy that they can commit hours of their time every week to working for nothing? Who are these volunteers? Do you know anyone who could volunteer their time in this way? If there’s anyone who has the time and the energy to work for nothing in a good cause, they are probably already working for one of the voluntary sector day centres or running a local football team or helping out with the league of friends in a hospital. What’s going to make them stop doing that and start working in a library instead?
Imagine two communities that have been told their local library is going to be closed. One of them is full of people with generous pension arrangements, plenty of time on their hands, lots of experience of negotiating planning applications and that sort of thing, broadband connections to every household, two cars in every drive, neighbourhood watch schemes in every road, all organised and ready to go. ... I’m not knocking them. But they do have certain advantages that the other community, the second one I’m talking about, does not. There people are out of work, there are a lot of single parent households, young mothers struggling to look after their toddlers, and as for broadband and two cars, they might have a slow old computer if they’re lucky and a beaten-up old van and they dread the MOT test – people for whom a trip to the centre of Oxford takes a lot of time to organise, a lot of energy to negotiate, getting the children into something warm, getting the buggy set up and the baby stuff all organised, and the bus isn’t free, either – you can imagine it. Which of those two communities will get a bid organised to fund their local library?
The greedy ghost understands profit all right. But that’s all he understands. What he doesn’t understand is enterprises that don’t make a profit, because they’re not set up to do that but to do something different. He doesn’t understand libraries at all, for instance. That branch – how much money did it make last year? Why aren’t you charging higher fines? Why don’t you charge for library cards? Why don’t you charge for every catalogue search? Reserving books – you should charge a lot more for that. Those bookshelves over there – what’s on them? Philosophy? And how many people looked at them last week? Three? Empty those shelves and fill them up with celebrity memoirs. That’s all the greedy ghost thinks libraries are for.
Mark Dery critically examines the relentlessly upbeat politics of enthusiasm in the age of the Tumblr blog and the Like button:
At its brainiest, this sensibility expresses itself in the group blog Boing Boing, a self-described “directory of wonderful things.” Tellingly, the trope “just look at this!,” a transport of rapture at the wonderfulness of whatever it is, has become a refrain on the site, as in: “Just look at this awesome underwear made from banana fibers. Just look at it.” Or: “Just look at this awesome steampunk bananagun. Just look at it.” Or: “Just look at this bad-ass volcano.” Or: “Just look at this illustration of an ancient carnivorous whale.” Because that’s what the curators of wunderkammern do—draw back the curtain, like Charles Willson Peale in “The Artist in His Museum,” exposing a world of “wonderful things,” natural (bad-ass volcanoes, carnivorous whales) and unnatural (steampunk bananaguns, banana-fiber underwear), calculated to make us marvel.

Of course, there is a downside to this relentless boosterism: the positive becomes the norm (how many things can you "favourite"?); meanwhile, critical thought becomes delegitimised. When everybody's building shrines to their likes, any expression of negativity is an attack on someone's personal taste, making one a "hater" (a term originally from hip-hop culture which, tellingly, gained mainstream currency in the past decade). From this relentlessly upbeat point of view, critics are no more legitimate than griefers, the players in multi-player games who destroy others' achievements, motivated by sadism:
At their wound-licking, hater-hatin’ worst, the politics of enthusiasm bespeak the intellectual flaccidity of a victim culture that sees even reasoned critiques as a mean-spirited assault on the believer, rather than an intellectual challenge to his beliefs. Journal writer Christopher John Farley is worth quoting again: dodging the argument by smearing the critic, the term “hater” tars “all criticism—no matter the merits—as the product of hateful minds.” No matter the merits.

The culture of enthusiasm, and the culture of disenthusiasm (which Dery mentions), seem to be founded on the assumption that we are defined by the things we like and dislike. It's a form of commodity fetishism taken into the cultural sphere, one step removed from the accumulation of material goods, dealing instead in approval and disapproval. Not surprisingly, it's often associated with youth subcultures; take, for example, punks' leather jackets: the names which appear on the back, and those omitted as too obvious or inauthentic, signal their wearers' legitimacy in the culture. (Hipsters take it further, into the realm of irony, where one's status is measured by how close one can surf to the void of kitsch; being into, say, Hall & Oates or M.C. Hammer is worth more than safe choices like Joy Division and the Velvet Underground, which are so obviously part of every civilised person's background that trumpeting one's enthusiasm for them is immediately suspect.)
However, likes and dislikes, when worn as badges of identity, can become mere totemism. Do you like, say, The Strokes or Barack Obama, because you find them interesting, or because you wish to be identified as the kind of person who does? Or, as A Softer World put it:
Cultural products (a term which encompasses everything from pop stars to public intellectuals, from comic books to politicians) can fulfil two functions: they can be valued for their content or function (does this band rock? Is this book interesting?), or for their role in establishing the consumer's identity. Much like vinyl record sleeves framed on trendy apartment walls by people who don't own turntables to project an aura of cool, favourite books or movies or bands or public figures can be trotted out to buttress one's public image, without ever being fully digested. (Witness, for example, the outspokenly religious American "Conservatives" who idolise Ayn Rand, a strident atheist who expressed a Nietzschean contempt for religion.) Likes and dislikes, in other words, are like flags, saluted or burned as often out of habit or social obligation as for any intrinsic value they may hold.
At the end, Dery points out that, far more interesting and telling than what we like or dislike are the things we both like and dislike, or else find fascinating; things which compel us with a mixture of fascination and repulsion, in whatever quantities, rather than neatly falling into one side or the other of the love/hate binary.
Freed from the confining binary of loving versus loathing, Facebook Like-ing versus hateration, we can imagine an index of obsessions, an inventory of intrigues that more accurately traces the chalk outline of who we truly are.
Imagine a more anarchic politics of enthusiasm, poetically embodied in a simulacrum of the self that preserves our repulsive attractions and attractive repulsions, reducing us not to our Favorites, nor even to our likes and dislikes, but to our obscure obsessions, our recurrent themes, the passing fixations that briefly grip us, then are gone—not our favorite things, but the things that Favorite us, whether we like it, or even know it, or not.
If you're thinking of doing a PhD to advance your career, you may want to reconsider: there is a glut of PhDs in the market and not enough jobs for them, other than postdoctoral work at slave-labour wages:
One thing many PhD students have in common is dissatisfaction. Some describe their work as “slave labour”. Seven-day weeks, ten-hour days, low pay and uncertain prospects are widespread. You know you are a graduate student, goes one quip, when your office is better decorated than your home and you have a favourite flavour of instant noodle. “It isn’t graduate school itself that is discouraging,” says one student, who confesses to rather enjoying the hunt for free pizza. “What’s discouraging is realising the end point has been yanked out of reach.”
Whining PhD students are nothing new, but there seem to be genuine problems with the system that produces research doctorates (the practical “professional doctorates” in fields such as law, business and medicine have a more obvious value). There is an oversupply of PhDs. Although a doctorate is designed as training for a job in academia, the number of PhD positions is unrelated to the number of job openings. Meanwhile, business leaders complain about shortages of high-level skills, suggesting PhDs are not teaching the right things. The fiercest critics compare research doctorates to Ponzi or pyramid schemes.

As a result, the traditional bargain (crummy pay now for an academic career later) no longer holds, and postdocs are starting to see themselves not as apprentices on the first step to something better but as disposable cheap labour. (In Canada, apparently 80% of postdocs earn no more than the salary of a construction worker.) This has led to a new development: the rise of trade unions of PhD-accredited teaching staff.
As far as non-academic careers go, the picture isn't much brighter. Having a PhD no longer gets one a salary premium over having a mere Master's (in some areas, such as engineering and technology, a PhD actually earns you less than a Master's). Meanwhile, the functions of a PhD (i.e., advanced knowledge potentially applicable to a field) have been taken over by more specialised, market-oriented courses:
Dr Schwartz, the New York physicist, says the skills learned in the course of a PhD can be readily acquired through much shorter courses. Thirty years ago, he says, Wall Street firms realised that some physicists could work out differential equations and recruited them to become “quants”, analysts and traders. Today several short courses offer the advanced maths useful for finance. “A PhD physicist with one course on differential equations is not competitive,” says Dr Schwartz.

I imagine this is part of an ongoing theme: an education infrastructure that developed largely from the Middle Ages onward is not keeping pace with technologically-driven social and economic change. Chances are that, over the next few decades, the assumptions about how education works and what functions it fulfils will have to be looked at anew on all levels.
The Economist Intelligence Unit's 2010 Democracy Index, a ranking of countries from most to least democratic, is out. The actual report requires registration, but the Wikipedia page contains a list, and various news sites across the world accompany this with explanatory commentary. A press release is here.
The report divides the world into four blocks, in order from best to worst: full democracies, flawed democracies, hybrid regimes and authoritarian regimes. The largest group, by population, is flawed democracies, followed by authoritarian regimes and, some distance behind, full democracies.
The four most democratic countries are—quelle surprise!—Norway, Iceland, Denmark and Sweden. They're followed immediately by New Zealand (which is looking increasingly like a chunk of Scandinavia in the Antipodes) and Australia. That's right, Australia is more democratic than Finland, Switzerland and Canada (#7, #8 and #9). The United States is at #17 (with a score of 8.18/10) and the UK is at #19. (The US loses points due to the War On Terror, whereas the UK's problem seems to be political apathy. Though is that the cause or, as Charlie Stross argued, a symptom?)
Meanwhile, France under Sarkozy has fallen out of the league of full democracies, and been relegated to the flawed democracies; there it is kept company by Berlusconi's Italy, Greece, and most of the Eastern European countries (with the notable exception of the Czech Republic, who are one step above the US), along with South Africa, Israel, India, East Timor, Brazil, Thailand, Ukraine and a panoply of African, Asian, Latin American and Caribbean countries.
Below the flawed democracies lie the hybrid regimes; these include Hong Kong (a notional democracy with Communist China keeping it on a leash), Singapore (a model of "managed democracy"), Turkey, Venezuela, Pakistan, Palestine and Russia. And at the bottom are authoritarian regimes, including the usual suspects: Cuba, China, the United Arab Emirates, Iran, Saudi Arabia and such. It will surprise few to learn that the bottom spot is held by North Korea, with a score of 1.08 out of 10, followed by Chad, Turkmenistan, Uzbekistan and Burma.
The press release states that the democracy ratings are worse than in previous years, with democracy declining across the world. Several factors are cited for this decline, including the economic crisis, the War On Terror, and declining confidence in political institutions. The press release also says that the crisis may have increased the attractiveness of the Chinese authoritarian model.
The scale of the rankings is, of course, not scientific. A rating of 9.8/10, as Norway has, would suggest that 98% of policy is decided at the ballot box, rather than in negotiations with other states, interest groups, bondholders and the like. And if 81.6% of Britain's decisions were democratically made, grossly unpopular decisions like trebling university tuition fees or invading Iraq would not have happened. One could imagine a more accurate scale, which estimates what percentage of a country's public affairs are decided through democratic discourse. A better measure would also have to take into account media pluralism, the education levels of the public, and access to unfiltered information; if a country's media is controlled by a few media tycoons, the expressed will of the people will be little more than a low-pass-filtered version of those tycoons' opinions.
An observation I recently had about the way the various classes of "indie" music fall across the spectrum of class in Britain:
According to sociologist Eric Klinenberg, we are witnessing an unprecedented rise in people choosing to live alone; and while this trend has been building for over a century, living alone is now becoming a stable state for large numbers of people, rather than a temporary one between childhood, youthful house-sharing and nuclear parenthood:
You’d think that the United States, with its cult of individualism, would be the world leader in living alone, but it’s not. Sweden, Norway, Finland, and Denmark, among others, come in ahead of us. That’s because they’re advanced welfare states that combine their own emphasis on the individual with extensive social safety nets.
In the absence of such safety nets, terrible things can happen, especially to those who grow old in isolation. Klinenberg, who is not yet 40, won a reputation as a leading figure in his field with his much-discussed first book, “Heat Wave: A Social Autopsy of Disaster in Chicago,’’ which analyzed the deaths of over 700 people in Chicago during a weeklong period in July of 1995. Most of them were senior citizens who died at home and alone. Klinenberg showed how they were victims not just of the weather but of a social order that left them without the support of family, community, or government.

Klinenberg also rejects the usual clichés about living alone being a symptom of alienation and social atomisation (i.e., "bowling alone") and a pathological state, raising the claim that people who live alone often have richer social lives than those in traditional nuclear family arrangements:
“One reason so many people live alone today is that they can do it while being extremely social,’’ Klinenberg told me in an email. “You needn’t live a traditional lifestyle to have a community. In fact, people who live alone are more likely to socialize with friends and neighbors than are married people.’’
Big changes in the structure of everyday life have converged to enable us to live alone: the greater freedom and economic power of women, the communications revolution, longer life spans. Klinenberg sees living alone as a choice, not a form of exile, and it’s a choice we value because it’s infused with principles that are important to us: individual freedom, personal control, self-realization.

In other words, living with other people is not so much the ideal state, or the most psychologically beneficial one, as the least-worst state in societies where individuals don't have the means of living richly social lives from autonomous bases; and, indeed, the continuous stream of compromises resulting from sharing quarters with others can confine one to the lower rungs of the Maslow hierarchy of needs. (Of course, some traditionalists would contend that lack of self-actualisation is just another word for character-building, and that the self-actualised (or self-actualising; it's not clear whether self-actualisation is a state one can ever actually reach) character is a woefully underbuilt one, but that's another discussion.) Put another way, what common sense tells us is the natural order of things is really the system of compromises we have become familiar with, to the point of assuming that that's the way things are meant to be. (Aside: if human neurologies have evolved to form stable social orders, then it's likely that humans have an innate bias towards classifying long-standing circumstances as natural rules, if not divine commandments, and not questioning them.)
The New York Times (registration required) has a convincing essay by one Mark Greif on what the word "hipster" actually means in a social/cultural context. It's a largely pejorative word that nobody will admit applies to them, though many of those who use it derogatorily of others look suspiciously like the stereotypical hipster themselves. The key, it seems, lies in the writings of French sociologist Pierre Bourdieu, whose thesis was that taste (in everything from diet to dress to the various arts) is neither arbitrary nor objective, but correlates rigidly with one's social stratum, and serves a competitive role in jockeying for position in the social hierarchy. And this is where hipsters come in.
According to Greif, the people who might be classified as "hipsters" fall into three different groups: upper-middle-class, university-educated "culture workers" (i.e., Richard Florida's "Creative Class"); upper-class "trust fund hipsters", the scions of the aristocracy seeking to convert financial capital into cultural capital; and the old-guard, lower-middle-class hipsters, wearing thrift-shop clothes they acquired before those became expensively trendy, serving the aforementioned two categories in dive bars and boutiques and then repairing to crappy bedsits or borrowed couches. These last may be the most authentic, but are looked down upon by the others for their lower standing, their unpurchased cultural authenticity giving them a form of superiority that affords no economic mobility. All three categories use the H-word as a weapon in an ongoing cultural jousting match, knocking each other down and belittling each other's cultural standing by denying its authenticity:
All hipsters play at being the inventors or first adopters of novelties: pride comes from knowing, and deciding, what’s cool in advance of the rest of the world. Yet the habits of hatred and accusation are endemic to hipsters because they feel the weakness of everyone’s position — including their own. Proving that someone is trying desperately to boost himself instantly undoes him as an opponent. He’s a fake, while you are a natural aristocrat of taste. That’s why “He’s not for real, he’s just a hipster” is a potent insult among all the people identifiable as hipsters themselves.
In his latest Independent column, the inimitable Rhodri Marsden writes about the psychologically brutalising arena of online dating:
Internet dating pivots around profiles; lists of attributes, paragraphs where you attempt to make yourself sound appealing, a handful of flattering photographs. But there's already a problem. Dozens of books and websites offer advice on how to write profiles; third-party services even charge 40 quid to save you the bother. As a result, the uniformity is hilarious. Everyone loves travelling, particularly to Machu Picchu – which, if the profiles are to be believed, is an Inca site swarming with thousands of backpacking singletons. Men are singularly obsessed with skiing. All of us love to curl up on the sofa with a bottle of wine and a DVD (or a VD, as one unfortunately misspelled profile said).
But we're forced to filter the mass of potential datees, and we do it savagely. We start to adopt a power-shopping mentality, disregarding people for arbitrary reasons; as my friend Sam put it, we cruise past people's pictures as if they're caravans in Daltons Weekly. "Yeah, no, no, yeah – ooh, yes! – no, no, ugh." It's a compelling, but ultimately exhausting, process that these services have adapted, refined and streamlined because it's a brilliant way for them to make money. While a service might lure you with a strapline saying "Meet sexy singles in your area", the truth is more like, "Reject perfectly decent singles in your area while waiting for the maddeningly elusive sexy ones." Everyone is trading off current opportunities against future possibilities. In a thoughtful moment, you might even realise there are people you've had relationships with in the past who, if they appeared as an online match, you might reject. And when you're the one being rejected, it can hurt.
Long-term internet dating participants know only too well, however, the cycle of knock-back followed by a speedy return to the site in search of someone else. You start seeing the same faces across multiple sites, and some people (especially men) will start to play the percentage game, firing off multiple cut-and-paste emails in the hope that someone will reply. One friend of mine was even sent a cheery message of introduction from a man who she had already had a disastrous date with via another dating website.
If you've ever wondered why what is commonly called Christianity in the US is so weird (why it so often condemns the poor as being responsible for their own misfortune, defends the right to make a profit above all else, and is so obsessed with the evils of homosexuality and abortion), a guy named Brad Hicks wrote an illuminating essay in five parts (1, 2, 3, 4, 5) about how political expediency during the Cold War drove evangelical Christians (until then suspicious of worldly wealth) and the Republican Party (until then the party of east-coast industrialists, with little time for religious pieties) into each others' arms. The result is a Christianity that emphasises condemnation over redemption (though, granted, that's hardly new; Calvinism had been there for a few hundred years, though not quite to the same Randian extent), is not at all uncomfortable with getting filthy rich (as long as one donates to the Republican Party), and, whilst not throwing any bones to the not-so-rich, manages to unite them with a common activity everyone can get behind: reinforcing a personal morality based in an idealised view of just-before-one-was-born (nowadays, the upright 1950s, that suburban patriarchal Garden of Eden before the serpent that was The 1960s came along and ruined everything), with a call to war against those who transgress against it (gays, feminists, abortionists and such).
The convergence of Christianity and right-wing politics in America has brought its own problems for both, with growing numbers of young Americans turning away from organised religion to avoid the politics. Granted, most of them aren't yet declaring themselves atheists (in America, it seems, one has to be pugnaciously anti-religious to feel comfortable using that label), but are filling in their religious orientation as "none".
This backlash was especially forceful among youth coming of age in the 1990s and just forming their views about religion. Some of that generation, to be sure, held deeply conservative moral and political views, and they felt very comfortable in the ranks of increasingly conservative churchgoers. But a majority of the Millennial generation was liberal on most social issues, and above all, on homosexuality. The fraction of twentysomethings who said that homosexual relations were "always" or "almost always" wrong plummeted from about 75% in 1990 to about 40% in 2008. (Ironically, in polling, Millennials are actually more uneasy about abortion than their parents.)
Meanwhile, in Finland, proponents of conservative Christianity have their own problems: after representatives of the state Lutheran church spoke against gay marriage on a TV current affairs programme, a record number of Finns resigned from the state church. (Finland, like many European countries, has a state church, records citizens' religious affiliations, and levies an additional "church tax" on church members, to be paid to their respective churches.)
The next big thing in today's socially atomised, time-stressed world could be friend rental services, allowing people who don't have the time or social connections to make friends the traditional way to pay people to hang out with them:
While it is free to become a friend and advertise your social services on the site, anyone wanting to rent a friend must pay $24.95 a month (about £17 in the UK) or $69.95 a year (£47) to become a member. Some friends offer their services for free, while others charge anything from $10 to $50 an hour, plus all expenses incurred on the friend "date".
"I guess some people who use the site are losers and maybe disconnected from a regular social life, but most people I've met seem normal. In a big city like New York it's not always easy to meet people," she says.
"If it feels like work to hang out with someone or like I'm their shrink then I'd definitely need to get well paid. But I haven't met anyone that boring yet."

Of course, whether the person whom you're buying dinner in return for company can be considered a friend is debatable, to say the least; perhaps "rent a companion" sounded too sleazily euphemistic?
Douglas Coupland, who epitomised the late-80s/early-90s slacker zeitgeist in Generation X, offers a list of terms for aspects of the human condition circa the 2010s:
Blank-Collar Workers: Formerly middle-class workers who will never be middle-class again and who will never come to terms with that.
Grim Truth: You're smarter than TV. So what?
Instant Reincarnation: The fact that most adults, no matter how great their life is, wish for radical change in their life. The urge to reincarnate while still alive is near universal.
Intraffinital Melancholy vs Extraffinital Melancholy: Which is lonelier: To be single and lonely, or to be lonely within a dead relationship?
Zoosumnial Blurring: The notion that animals probably don't see much difference between dreaming and being awake.
The latest use of offshore personal outsourcing: cutting the drudgery out of online dating:
Anyway, last weekend I was talking to an acquaintance about his use of such services. He has his assistant seducing women for him. His assistant, who is female and lives in India, logs onto his account on a popular dating site, browses profiles and (pretending to be him) makes connections with women on the site. She has e-mail conversations and arranges first dates. Then her employer reads the e-mail conversation and goes to the date. (Perhaps he also does a quick vet before arranging a date to be sure the assistant has chosen well, but I did not confirm that.)

Currently, this seems anomalous and a bit sleazy, but perhaps there'll come a time when a variant of this (minus the sketchy subterfuge) becomes the norm. After all, the pace of life continues to accelerate and people have less unstructured time. (This is so across the spectrum, from high-powered executives to overworked students holding down two jobs to keep their heads above water.) Spare time is a declining luxury these days. Meanwhile, online dating, at least in its early stages, is a labour-intensive activity: reading dozens of profiles and crafting charming responses tailored to individual strangers, most of whom will not reply. This is a tedious and unrewarding activity, and clearly not the sort of thing today's time-stressed professional has time to spare on.
Perhaps the offshore-dating-assistant position will evolve into a sort of dating agent: half recruitment consultant, half marketing professional, with a touch of seduction guru thrown in (depending on how much of a bro the client sees themself as). There will be differently priced tiers of service. Those with the means looking for a partner (or a hook-up) will hire them, getting generally the level of service (in finding and wooing suitable partners, and selling them) they paid for. Those who don't will either do the job themselves, cutting into sleeping time or whatever, or go bowling alone.
The Running of the Dead, a longish but eminently readable and illuminating article by one Christian Thorne, examining the shift in zombie film tropes from the slow, shambling zombies of George Romero's original films to the fast zombies of "updated" remakes and films like 28 Days Later, and what it says about changes in assumptions about civilisation between the late 1960s and the Homeland Security age:
Slow-zombie movies are a meditation on consumer society—on a certain excess of civilization, as it were; and fast-zombie movies are pretty much the opposite. So the simple question: In the Dawn remake, how do the zombies look? And the simple answer is: They look like rioters or encamped refugees. If you say that zombie movies are always about crowds, a person might respond: Yeah, I see, the mob—but if you’re talking about George Romero and the slow-zombie movie, the word “mob” isn’t quite right, since white people in formal wear aren’t exactly the mob, and, casting a glance at Romero’s original Dawn, shoppers aren’t either, except on the day after Thanksgiving. Fear of the mob has usually been the hallmark of an anti-democratic politics. The phrase “mob rule” remains common enough; eighteenth-century writers used to call it “mobacracy.” And that’s not what Romero’s after. Romero is worried that the crowd isn’t democratic enough, and one of his more remarkable achievements, back in 1968, was to start a cinematic conversation about the dangers of crowds that ducked the problem of “the mob,” that bracketed that concept out. This couldn’t have been easy to do, since the one term substitutes so easily for the other. And the pokeyness of the zombies is central to this feat, because corpses that look like they’re wading through gelatin are going to seem grinding and methodical or maybe doped and so not like looters or protestors or the Red Cross’s Congolese wards. By making the zombies fast—or rather, by merely accelerating them back to normal human speeds—Snyder allows his dead to seethe and roil. 
Once the movie’s survivors decide they have to leave the mall where they’ve been hiding—once they head out, in armored buses, into the teeming parking lot—they have entered an American Gaza.

In short, according to Thorne, while Romero's slow zombie movies are inherently democratic, fast-zombie movies are fundamentally authoritarian, advancing Thomas Hobbes' argument: civilisation is a fragile thing, one step away from collapse, and must be upheld by arbitrary authority (which is to say, authority one cannot question, whose rightness or wrongness is not open to debate). Hobbes' argument (a cornerstone of right-wing authoritarian thought) is predicated on fear, and has been gaining cultural currency in the post-9/11 Long Siege; the pro-democratic, small-L-libertarian tropes of the cultural shifts of the 1960s seem impossibly quaint, almost Rousseauvian in their naïveté, and haven't aged into anything more than a period piece, a code for the somewhat goofy epoch of pot-smoking, group sex and poor personal hygiene we call The Sixties. Meanwhile, the Other—the terrorist, the inner-city looter, the paedophile, the ultra-ruthless foreign gangster—is at the door. The freedom that was liberating to our hippy parents and grandparents is positively terrifying, and perhaps we need more authority.
The rise in fear-driven authoritarianism has manifested itself in other places; the zero-tolerance culture in schools (at least in the US; your mileage may vary), brought in after Columbine and 9/11, is, as one author suggests, conditioning a generation of young people to unquestioningly accept and submit to authority, instilling subconscious values which will later assert themselves when future social contracts are thrashed out. Tomorrow's citizens will be a little (or a lot) more accepting of the encroachment of authority, less likely to question it, and more likely to dismiss anti-authoritarian arguments as invalid or irrational.
Allegedly the next big thing in Berlin: cannibal cuisine:
In a prominent advertising campaign on the internet, in German newspapers and on television, the restaurant, Flime, is appealing for willing donors and diners to become members of what it hints at being a new dining movement. "Members declare themselves willing to donate any part of their body," the advertisement reads, adding that any resulting hospital costs will be taken on by the restaurant. They say they are also looking to employ an "open-minded surgeon".
The restaurant cites as its inspiration the indigenous Brazilian Waricaca tribe, which once practised the ritual of "compassionate cannibalism", or eating parts of the corpse of a loved one to emphasise the connection between the living and the dead, which was said to help with mourning.

I bet this is a prank; it sounds like something Joey Skaggs might have come up with. Though you never know; perhaps there is someone who thinks that a cannibal restaurant could work.
Movies that would have been ruined by Facebook (or, more specifically, whose premises fall apart if their characters are on Facebook):
Some culture-jammers in New York have affixed official-looking "Spoiler Alert" signs to LED train arrival signs in the subway.
Their rationale is that the recently installed signs erode faith in the system, create false hopes, erase the "mystery and magic" of the Subway and "threaten historical social behaviors, rendering obsolete the time-honored New York tradition of leaning over the platform edge with the hope of glimpsing headlights from an approaching train".
A study has claimed that rising rates of obesity in the US have resulted in almost a billion gallons of extra fuel consumption per year:
One key finding was that almost 1 billion gallons of gasoline per year can be attributed to passenger weight gain in non-commercial vehicles between 1960 and 2002 -- this translates to 0.7 percent of the total fuel used by passenger vehicles annually. Researchers also estimated that over 39 million gallons of fuel is used annually for every pound gained in average passenger weight. It is noted that while this is relatively small considering other factors such as more people on the roads, it is still a large amount of fuel that will continue to grow as the obesity rate increases.
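The two quoted figures can be squared with a back-of-envelope multiplication. A minimal sanity check, assuming an average US adult weight gain of roughly 24 lb over 1960–2002 (my assumption, broadly consistent with CDC survey figures; it is not a number from the article):

```python
# Back-of-envelope check on the study's quoted figures.
# gallons_per_pound comes from the quote; avg_weight_gain_lb is an
# assumed figure for average US adult weight gain, 1960-2002.
gallons_per_pound = 39e6    # extra gallons/year per pound of average weight gain
avg_weight_gain_lb = 24     # assumed average gain over the period

extra_gallons = gallons_per_pound * avg_weight_gain_lb
print(round(extra_gallons / 1e9, 2))  # ≈ 0.94 billion gallons/year
```

Which lands in the right neighbourhood of the study's "almost 1 billion gallons" headline figure.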
Evolutionary psychologist Daniel Nettle claims that many of the social problems associated with socioeconomic deprivation are actually evolutionarily adaptive strategies for maximising opportunities in the face of uncertain prospects. To wit: risk-taking behaviour such as gambling and crime makes sense when individuals' ordinary prospects look bleak; unhealthy diets make sense when there isn't much of a future to plan for (junk food, after all, is a far more economical source of energy in the short term than eating healthily); and, as for teenage pregnancy, that's what's known as a fast reproductive strategy (i.e., have as many offspring as quickly as possible and hope that some do OK, rather than putting all your proverbial eggs in one basket):
At a meeting last year, Sarah Johns at the University of Kent in Canterbury, UK, reported that in her study of young women from a range of socioeconomic backgrounds in Gloucestershire, UK, those who perceived their environment as risky or dangerous, and those that thought they might die at a relatively young age, were more likely to become mothers while they were in their teens. "If your dad died of a heart attack at 45, your 40-year-old mum has got chronic diabetes and you've had one boyfriend who has been stabbed, you know you've got to get on with it," she says.
Fathers in deprived neighbourhoods are more likely to be absent, which could be because they are following "fast" strategies of their own. These include risky activities designed to increase their wealth, prestige and dominance, allowing them to compete more successfully with other men for sexual opportunities. These needn't necessarily be antisocial, but often they are. "I'm thinking about crime here, I'm thinking about gambling," says Nettle, and other risky or violent behaviours that we know are typical of men in rough environments. A fast strategy also means a father is less likely to stick with one woman for the long term, reducing his involvement with his children.
Once you are in a situation where the expected healthy lifetime is short whatever you do, then there is less incentive to look after yourself. Investing a lot in your health in a bad environment is like spending a fortune on maintaining a car in a place where most cars get stolen anyway, says Nettle. It makes more sense to live in the moment and put your energies into reproduction now.

These fast strategies, unfortunately, form a feedback loop: children brought up with minimal investment by fast-strategy parents are more likely to perceive their prospects as bleak and engage in similar strategies (studies have shown daughters of absentee fathers being more likely to become pregnant in their teens, for example). Meanwhile, junk-food diets stunt cognitive development, further sabotaging attempts to break the cycle.
The upshot of all this is that, if Nettle's theory holds, campaigns against unhealthy or antisocial behaviours are merely treating the symptoms, and real improvements can only come from addressing the underlying causes of such insecurity, i.e., poverty and uncertainty. Of course, actually doing so is a lot more expensive and could prove electorally unpopular, especially when opportunistic politicians are willing to promise cheaper solutions and voters are eager to believe that they will work.
John Perry Barlow, co-founder of the Electronic Frontier Foundation, claims that the internet has broken the US political system, with the deluge of information rendering the country "ungovernably information-rich":
Barlow also said that President Barack Obama's election, driven largely by small donations, has fundamentally changed American politics. He said a similar bottom-up structure is needed for governing as well. "It's not the second coming, everything won't get better overnight, but that made it possible to see a future where it wasn’t simply a matter of money to define who won these things," Barlow said. "The government could finally start belonging to people eventually."

That's one perspective; another one is that the true stakeholders are not the plebeians who vote but the corporations who buy government bonds, and that, to keep an economy stable, the levers of power have to be moved well out of reach of the ignorant rabble and those who would pander to them; i.e., that governments' hands have to be tied by international treaties on anything that might affect the bottom line, reserving democracy for largely symbolic issues. Of course, with the people empowered from the edges by new tools, and the stakeholders pushing to seize more power, things could end up getting ugly.
The Exterminator's Want Ad, a new story by scifi author and design theorist Bruce Sterling, set after an economic/ecological collapse and collectivist revolution, and from the point of view of a particularly despicable corporate tool butting up against the rehabilitation process (the revolutionaries, it seems, are a bit more soft-hearted than the Bolsheviks or Cubans were).
Me, I was more of the geek technician in our effort. My job was to methodically spam and troll the sharing-networks. I would hack around with them, undermine them, and make their daily lives difficult. Threaten IP lawsuits. Spread some Fear Uncertainty and Doubt. Game their reputation systems. Gold-farm their alternative economies. Engage in DDOS attacks. Harass the activist ringleaders with blistering personal insults. The usual.
Claire and I hated the sharing networks, because we were paid to hate them. We hated all social networks, like Facebook, because they destroyed the media that we owned. We certainly hated free software, because it was like some ever-growing anti-commercial fungus. We hated search engines and network aggregators, people like Google -- not because Google was evil, but because they weren't. We really hated "file-sharers" -- the swarming pirates who were chewing up the wealth of our commercial sponsors.
Because the inconvenient truth is that, authentically, about fifteen percent of everybody is no good. We are the nogoodniks. That's the one thing the Right knows, that the Left never understands: that, although fifteen percent of people are saintly and liberal bleeding hearts, and you could play poker with them blindfolded, another fifteen are like me. I'm a troll. I'm a griefer. I'm in it for me, folks. I need to "collaborate" or "share" the way I need to eat a bale of hay and moo.
(via Boing Boing)
A new frontrunner has emerged in Colombia's presidential elections: Antanas Mockus, a former academic and two-time mayor of Bogotá whose terms in power there were characterised by a sort of surrealist urbanism (previously). While Mayor, he inaugurated voluntary women-only nights in the city's streets, deployed mimes to mock those flouting traffic rules and issued citizens with thumbs-up and thumbs-down cards for commenting on others' behaviour. Mockus is running for the Presidency under the banner of the Green Party, and has attracted an Obama/Clegg-style following of young voters, who have joined the campaign with another kind of surrealist intervention, the flash mob.
Star Wars Modern has another excellent post, this time about the history of cities and urbanism as seen through superhero comics, or more specifically, Superman and Batman.
That Depression Era mash of eugenics, nationalism, and progress/self-improvement, when introduced into the settings of the already popular crime pulps, gave birth to two enduring strains of superheroes: those that are inhumanly-super, like Superman; and those that are merely humanly-super, like Batman. Each has a place, an urban setting. More than childhood trauma or costume choices, it is these negative spaces that surround the heroes that make them what they are.

While both embody the idea of the übermensch, leavened by Depression-era anxieties, they represent different outlooks in the 20th-century debate on the condition of living in cities; Superman embodies the modernist utopianism of slum clearances and gleaming high-rise tower blocks à la Le Corbusier (in one early story, he demolishes a slum teeming with criminality, forcing the authorities to hastily erect modern tower blocks). Batman, however, represents a more pessimistic, Hobbesian world-view, of the urban condition as irredeemably producing vice and evil, of urban dwellers as rats, their depravity justifying Batman's brutal methods. (Or, as John Powers writes, "In Batman's Gotham, human-nature makes the city a bad place. In Superman's Metropolis, exactly like More's Utopia, it is the city that makes people bad, and it needs to be physically reordered".) Both, however, were founded in the same prevalent assumption that the urban condition breeds vice, and that a more wholesome life is to be found in small towns, villages or the newly erected Levittown-style suburbs.
A lot of the anti-urbanist arguments cited a 1962 psychology paper titled Population Density and Social Pathology, by John B. Calhoun, in which Calhoun crammed increasing numbers of rats into a small space, observed them attacking and cannibalising one another, and inferred that rat psychology applies equally to humans: that city-like population densities trigger acts of depravity in human populations. To this day you hear the "rats in a cage" argument trotted out as folklore, because it's a vivid, lurid image. The problem is, the inference doesn't hold; humans in a city aren't rats in a cage, and cities, even densely-packed slum-like ones, left to their own devices, evolve remarkable (if not always aesthetically pleasing) mechanisms of community and cooperation; in fact, the sprawling shanty-town is the ur-city:
Like Jacobs in 1961, who was opposed to Modernist slum clearance and saw density as a positive quality invisible to her contemporaries, Brand sees the high density of slums of contemporary South America, Asia and Africa as the model for future city life. While Jacobs pointed to so-called slums as healthy, but underserved neighborhoods in Boston and New York, and argued that they were positive examples to be emulated by planners, Brand points to vast squatter cities that house billions of people globally as feral urbanism that needs to be legitimized and fostered. The favelas and katchi abadi are thousands of times larger than the neighborhoods Jacobs wrote about, but Brand points out that San Francisco started out as a shanty town, and while he is quick to admit that "new squatter cities look like human cesspools and often smell like them," these are still neighborhoods, they are a legitimate form of urban development. These are not the "breeding ground for suffering and injustice" that Nolan has cast them as. In Brand's description squatter cities are vibrant:
The Ten Stupidest Utopias is not a Cracked-style list of wacky stuff, but a fairly insightful rundown of various utopian ideals, from Plato's Republic, through Thomas More's original Utopia, and then onwards through various permutations of utopian ideals (the American Calvinist city on a hill, racial-supremacist and feminist-separatist enclaves, heavy-handedly didactic Libertarian scifi utopias, the pipe dreams of cyberpunks, and the diametrically opposite yet oddly similar quasi-totalitarian visions of Le Corbusier and the Situationists):
The violence of More's historical period is never far from the surface of More's island Utopia, where a single act of adultery is punishable by slavery and serial adulterers are punished with death. If More's narrator had looked past the happy smiling faces of Utopia, what fear and violence might he have seen?
Exactly how stupid is Plato's Republic, and who am I to call one of history's greatest philosophers "stupid"? Is Plato's time simply too different from our own for us to pass judgment? I don't think so, for The Republic lives on in the rhetoric of contemporary political movements of both right and left—every elitist and technocratic fantasy of our time has grown from the seed of The Republic. Plato would not have understood the term "dehumanization" as we understand it—he'd never, of course, seen a factory floor or a gas chamber—but when his ideas have been enacted in places like the Soviet Union, Mussolini's Italy, or modern state-capitalist China, they have proven brutally dehumanizing, his apparat of "guardians" thoroughly corrupted by power.

A more mundane utopia in the list is that of post-war American Suburbia:
Historian Robert Fishman calls American suburbia a "bourgeois utopia," whose hopes for community stability were founded "on the shifting sands of land speculation," backed up by racially discriminatory covenants and lending standards. The postwar American suburb, each a Nueva Germania of the soul, organized men's life around commutes and women's life around the home: the result was absent fathers, isolated mothers, and alienated children, who seldom knew anyone of a different race. In providing for the material needs of the growing middle class, the suburb created social and spiritual cavities that numerous social movements—from the 1960s New Left to today's Christian fundamentalism—have tried to fill.
Gay marriage: the database engineering perspective, or how different definitions of the institution of marriage would be reflected in different (relational) database schemas. Not surprisingly, the strictly traditionalist schemas do hideously inelegant things like have different tables for men and women, or mark one gender as subordinate to the other (i.e., have the males table contain a wife_id column), while the most elegant ones reduce marriage to a type of edge in generalised social networking, leaving policy (can you marry yourself for tax reasons? can more than two people be married?) outside of the schema.
I wouldn't be surprised if, at some point, some technically ignorant legislator in some conservative backwater proposed a law requiring databases to have separate tables for men and women or something similarly brain-damaged.
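To make the contrast concrete, here is a minimal sketch of the "elegant" end of the spectrum, in SQLite via Python; all table and column names are my own illustration, not taken from the linked article:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One gender-neutral persons table; marriage is just another kind of
# edge between people. (Contrast the "traditionalist" schema with
# separate men/women tables, or a wife_id column on a males table.)
# Policy questions -- self-marriage, polygamy -- live in application
# code, outside the schema.
cur.executescript("""
CREATE TABLE persons (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE relationships (
    person_a INTEGER NOT NULL REFERENCES persons(id),
    person_b INTEGER NOT NULL REFERENCES persons(id),
    kind     TEXT NOT NULL   -- 'marriage', 'friendship', ...
);
""")

cur.execute("INSERT INTO persons (name) VALUES ('Alex'), ('Sam')")
cur.execute("INSERT INTO relationships VALUES (1, 2, 'marriage')")

cur.execute("SELECT COUNT(*) FROM relationships WHERE kind = 'marriage'")
print(cur.fetchone()[0])  # one marriage; gender appears nowhere
```

Note that nothing in the schema itself stops a person marrying themselves or appearing in several marriage edges; whether that's a bug or generality is precisely the policy question the article leaves outside the schema.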
Australian architects and urban planners are pondering the challenges of adapting to population growth and climate change:
Australia survived the global financial crisis, due largely to China buying its resources, and while resource exports will continue to bolster its economy for decades, future prosperity may be threatened by a growing, ageing population, according to an Australian government report released in February.
Australia's post-World War Two sprawling suburbia is under strain due to inadequate transport and public facilities. "We're at risk of seeing increasingly dysfunctional cities ... we're starting to see sort of fragmentation and breakdown of the transport systems and increasing frustration for the residents of those cities trying to get around," said Jago Dodson, urban researcher at Griffith University.

A number of problems loom on the horizon: Australia's urban planning uniformly follows the post-WW2 American model, with sprawling cities of quarter-acre blocks, large "McMansion"-style houses, near-universal car ownership and a neglected public transport system half-heartedly run as welfare for the carless poor (though, to its credit, without the sort of neo-Calvinist contempt America manages for its have-nots). Most of Australia's land mass is desert, and its arable land, being geologically older than that of other continents, is less fertile, needing more fertiliser (typically petrochemical-based). Fresh water supplies are scarce; farmers are affected by drought and cities hit with water restrictions. The bulk of Australia's population is located around the coasts, with the country's largest cities being coastal; if global warming results in rising sea levels, populated areas could flood. Moving inland is, for obvious reasons, not an easy solution.
With the population set to rise, some key assumptions about life in the Lucky Country are being reconsidered, from the ideal of owning a huge house on a quarter-acre block in suburbia to the right to drive anywhere in your own car. However, as in the US, there is considerable resistance to a change in these values.
Demographer Salt questions whether Australians will give up the "Neighbours" dream, citing the worldwide TV hit about life in a suburban Australian street. "Neighbours...is absolutely integral to the Australian psyche," said Salt, a partner at KPMG.
"Despite concern about climate change, road use in our cities is predicted to grow significantly in the next 20 to 30 years," said Transurban in a 2009 sustainability report. "New road projects will increasingly be part of integrated transport solutions for entire cities or transport corridors."

Despite this resistance, demographers predict that Australia's cherished sprawl will give way to higher-density urban planning (some even talk of "Manhattanization"), and there are already proposals for European-style mass transit projects (such as an underground rail network for Sydney). Whether the multi-billion-dollar sums routinely spent in Europe on subways, high-speed trains and other infrastructure will materialise remains to be seen; until now, public transport has been the red-headed stepchild of state governments, with billions spent on freeways (which, after all, serve the outer-suburban motorists whose votes swing elections) and public transport getting mostly empty promises, with the occasional bus service extended from 6:30pm to 7:00pm on weekdays. (In Melbourne, there doesn't seem to be enough money to keep the existing service, patchy as it is, from falling apart.)
Not even bohemian Berlin is immune from the forces of gentrification; luxury apartments are going up where the Wall stood, and the city's legendary bars and clubs are threatened with closure by rising rents and noise complaints. The city's non-yuppie residents are fighting back in a number of ways; some are torching luxury cars, while others are uglifying their areas with yuppie-repelling camouflage:
A recent meeting at SO36 discussed non-violent ways to keep out "unwanted" residents. Erwin Riedmann, a sociologist, proposed an "uglification strategy" – to "go around wearing a ripped vest and hang food in Lidl bags from the balcony so that it looks like you don't have a fridge". The suggestion drew laughs, but is a strategy being adopted.
An "anti-schicki micki" website, esregnetkaviar.de (it's raining caviar), offers the following tips to make a neighbourhood unattractive for newcomers: "Don't repair broken windows; put foreign names on the doorbell, and install satellite dishes."
Weaponizing Mozart, an article on how classical music is being used in Britain's war on its own youth:
The weaponization of classical music speaks volumes about the British elite’s authoritarianism and cultural backwardness. They’re so desperate to control youth—but from a distance, without actually having to engage with them—that they will film their every move, fire high-pitched noises in their ears, shine lights in their eyes, and bombard them with Mozart. And they have so little faith in young people’s intellectual abilities, in their capacity and their willingness to engage with humanity’s highest forms of art, that they imagine Beethoven and Mozart and others will be repugnant to young ears. Of course, this becomes a self-fulfilling prophecy.
The dangerous message being sent to young people is clear: 1) you are scum; 2) classical music is not a wonder of the human world, it’s a repellent against mildly anti-social behavior.
Poptimist Tom Ewing has written a future history of the 2020s CD revival:
But for the fans, the music is still at the core. Unlike today's collaborative, crowdsourced, and automatically generated playlists, a CD's tracklisting is fixed, and the CD-burning scene is an opportunity for music lovers to show their deep individual loves of music, its sequencing and presentation. The 74 Sessions is one of many CD-burning clubs and groups-- some ban members from remixing or mashing up material, others ask people to theme their CD-Rs. Chantal Fielding, who runs the Prismatic Spray trading club out of Rochester, NY, loves the way CD-Rs make her focus her fandom. "You've got all this information, literally everything you look at you can find out everything about it right there, and for music that means there's no mystery anywhere. So saying no, you can't explore endlessly, you have to reduce it down-- it's powerful."

The romance of CDs in Ewing's 2020s world isn't just about working within finite physical constraints, like a sort of music-curatorial Lomography; while there is that, and undoubtedly an element of nostalgia as the hipsters and scenesters of the day relive hazy early childhood memories of the CD age (you've probably seen these kids, being wheeled through Stokey or Fitzroy in three-wheeled prams, dressed up in their Ramones onesies), a lot of the physical media revival would be driven by a backlash against the network-centric age of social software, recommendations, playlists and crowdsourcing, and the ever-hungry target-marketing apparatus beneath the surface. (Or, as one of the interviewed CD fetishists says, "when you can't see what the product is and someone's still making money, the product is you.")
While earlier physical-music movements fought to preserve analog formats in the face of digitization, CD revivalists see music's physical existence as a rebuke to a world where people's digital presence has overtaken their physical one. "It's not just about the music," explains Wolfe. "Words like 'social' and 'sharing' became absolutely twisted. It used to mean things people did together, now it's about how well you fit into algorithms. We leave snail trails of data everywhere, and all 'social' means now is that two trails have crossed and somebody's making money off it. Forcing people to collaborate for a fuller experience helps restore some of the real idea of 'social.'"
Wolfe sees CD-R revivalists as part of a 'post-social' wave of digital mischief-makers and situation-builders, in the tradition not of industrial or noise culture but of Fluxus and Neoism. He's sympathetic to "troll artists" like bot-creators and recommendation-scramblers. A friend of his was involved with the 'artificial hipster' Karen Eliot, a digital taste bundle whose infiltration of music friendship networks in 2020 caused scores of trusted playlist generators to start throwing in 00s tracks like "Starstrukk" and "My Humps".

Another dimension of CD revivalism would, of course, be the sonic characteristics of the medium: the brittleness of 44.1kHz, 16-bit audio compared to whatever everybody's listening to in the future. The revival would take this even further; much as 2000s "electro" ramped up the electronicness of 1980s synthpop by throwing in anachronistically vocoded/robotised vocals, some participants in the CD revival will go beyond the limitations of the CD and start playing around with low-bitrate audio compression, with subsubcultures of hipsters settling upon the right form of crappiness as a cultural touchstone.
The sound on most CDs Wolfe releases is deliberately low-bitrate, with a glossy, uneasy, skinny sheen that's a stark contrast to the lossless warmth of most streamed music. Some fans call lo-bit music "ghostwave", because, as Hall Of Mirrors act Cursor Daly puts it, "you start listening to stuff that isn't there, phantom sound-- your ears are filling in the gaps. Below 128 kbps you're essentially hallucinating sound, no two people hear the same thing. Loads of CD nerds were neuroscience majors."
Veteran Australian pop satirist New Waver has a new album, Bohemian Suburb Rhapsody, out.
New Waver's stock-in-trade has hitherto been a relentlessly bleak pessimism, extrapolating the principles of neo-Darwinian evolution into a viciously competitive world seen from the loser's perspective, resulting in records like The Defeated and Darwin Junior High. Bohemian Suburb Rhapsody veers from this theme into an examination of the modern post-industrial age, casting a jaundiced eye over Richard Florida's concept of the "Creative Class" from the unaffordably gentrified inner north of Melbourne.
In the thesis of Bohemian Suburb Rhapsody, several phenomena of the past few decades (the shifting of industrial production to China, the move to a post-industrial economy and the rise of DIY art/music and internet-based user-generated content lowering the barriers to artistic creativity) have created a glut of "artists", with exhibitions and indie bands and bedroom music projects all over the inner suburbs. Artists have, as many have observed, congregated in undesirable suburbs hollowed out by deindustrialisation (at least in Melbourne; in Berlin, the collapse of Communism had the same effect), attracting hipsters, trendies, yuppies and ultimately the wealthy, aesthetically conservative haute-bourgeoisie, by which time the artists themselves have been forced out by rising rents. (In the words of a famous graffito in 1990s San Francisco, "artists are the shock troops of gentrification"; though it may make more sense to think of them as a sort of baker's yeast, whose job is to make the bread rise and then perish.) Meanwhile, the ease of creating (and copying) art, and indeed any sort of intellectual products, in the digital age has led to supply exceeding demand; not only is it harder to survive making art, but it is harder to get people to devote time to looking at your creations.
As with many of his previous recordings, New Waver expresses this thesis through the medium of cover versions of popular songs, assembled using General MIDI files. The opening track, Lugging For Nothing, turns Dire Straits' anthem of the rock'n'roll dream on its head; in New Waver's acerbically realistic reworking, the people to be envied are the tradesmen, high-school drop-outs and cashed-up bogans, doing lucratively uncopiable physical work and spending their money on material luxuries. Like neo-Rousseauvian ignoble savages, impervious to the siren song of cultural engagement, they're happy to take the money of those afflicted by it (by renting them rehearsal rooms and such), while aspiring musicians infected by the rock'n'roll dream pack into small rooms and toil doing shitwork to pay off records and tours. The idea of cultural engagement as a parasitic replicator reemerges behind Media, I Gave You The Best Years Of My Life, which recounts the lot of the culturally engaged, struggling to afford to rent enough space to store their record collections and spending their spare hours discussing music and arthouse films on social websites; it is not difficult to square this with author Greg Wadley's well-documented interest in evolutionary psychology and conclude that the culturally engaged are the victims of parasitic memes, deprived of the chance to live a comfortable existence in a McMansion in suburbia (watching junk TV on their plasma screen and listening to whatever's on the radio) by the terrible compulsion to impoverish themselves playing in bands, exhibiting art or otherwise trading time, wealth and effort for arbitrary signifiers of status, all the while helping to reproduce these memes.
Other songs touch on different, but related, themes; Party Like It's 1979 (a Prince cover, of course) looks at the resurgence of retro-styled indie music genres, from White Stripes-like garage bands to post-punk ("Fleetwood Mac's probably the most influential band today", "I got some classic rock released six months ago, some psychedelic folk, some white guys playing disco"), and the fetishisation of the vinyl format, reframing it as a cargo-cult commodity fetish, a subconscious belief that imitating one's idols will bring one their fame, wealth and sexual success. Inner City Drug Use, one of New Waver's older songs, is Queen's You're My Best Friend rewritten to be about dependence on coffee, and My Memory Stick Weighs A Ton (a cover of a song by Melburnian 1980s post-punk turned suave crooner Dave Graney) is about the glut of media produced by those who can be loosely categorised as "white-collar", and the declining likelihood of any of those items finding a willing audience. The closing track, The Cars That Ate Melbourne, returns to the uncultured bogan "other", and this time to their habit of cruising around the inner cities in souped-up cars with blaring stereos; it does this by combining a house/commercial-dance beat, car engine noise and a porn dialogue sample; it is somewhat reminiscent of New Waver's 1990s commercial-dance track, "We're Gonna Get You After School".
The standout track, in my opinion, is "Hey Dude"; here, New Waver has taken the famous Beatles song and turned it into a missive from property developers and landlords to artists, hipsters and the creative classes, urging them to take a sad suburb and make it better by putting on exhibitions, opening cafés, organising events and looking hip, and reminding them that they carry investments on their shoulders. As commentary on gentrification, it is perfect. For what it's worth, there is a video here.
Consistent with its thesis, Bohemian Suburb Rhapsody is not being manufactured on CD or offered in shops (though there are rumours of a limited-edition memory-stick release), but is available for free downloading from New Waver's website. Which is not at all a bad deal for what will undoubtedly be one of the most apposite pieces of social commentary committed to the format of music this year.
Journalism researcher Nina Funnell recounts her encounter with Australian PM Kevin Rudd:
At that point one of my friends introduced me, dropping in that I am completing a PhD. At this, Rudd rolled his eyes and in a terse voice lacking any sense of irony remarked that that is the "excuse" that "all" young women are using nowadays to avoid starting families. Since then I've come up with numerous one-line retorts, but in the moment I just froze in shock.

It would appear that, in post-Howard Australia, there is an increasing tendency to treat women who shirk their childbearing duties as demographic bludgers of sorts, freeloaders who are cheating society of what they owe it.
Similarly women who do not wish to have children should also not be punished or labelled non-maternal. As a young woman I find it frustrating to see women like Gillard constantly attacked and ascribed derogatory labels like "empty fruit bowl", as though her worth is a sole function of her ability and inclination to reproduce.

This is not unprecedented across societies; other states have, in the past, rewarded fertile women and punished those who shirked their reproductive duty to society. Mind you, those other states have included Ceaușescu's Romania and Nazi Germany, and are probably not a club Australia should seek to join.
Pete Warden, a programmer and amateur researcher, has analysed the data from public Facebook profiles, including the relative locations of pairs of friends and people's names and fan pages, and used this to divide the US into seven relatively self-contained clusters, which he terms "Stayathomia" (i.e., the northeast to midwest), "Dixie" (the old South), "Greater Texas" (encompassing Oklahoma and Arkansas), "Mormonia" (no prizes for guessing where that is), the "Nomadic West" (places like Idaho, Oregon and Arizona, where people's connections span wide areas), Socalistan (i.e., most of California and parts of Nevada) and Pacifica (essentially Seattle). The clusters were derived from performing cluster analysis on the social graph, and not imposed on the data a priori.
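Warden's clusters emerged from the friend graph itself rather than being drawn on a map first. As a minimal sketch of how that kind of community detection can work, here is label propagation over a toy friendship graph; the cities, edges and tie-breaking rule below are purely illustrative, not Warden's actual data or method.

```python
from collections import Counter

# Toy friendship graph: each city maps to the cities its users are most
# linked to. Illustrative data only, not Warden's dataset.
friends = {
    "Boston":  ["NYC", "Philly"],
    "NYC":     ["Boston", "Philly"],
    "Philly":  ["Boston", "NYC"],
    "Dallas":  ["Houston", "Austin"],
    "Houston": ["Dallas", "Austin"],
    "Austin":  ["Dallas", "Houston"],
}

# Label propagation: start with every city in its own cluster, then let
# each city repeatedly adopt the most common label among its friends
# (ties broken alphabetically, to keep the sketch deterministic).
labels = {city: city for city in friends}
for _ in range(20):
    changed = False
    for city, buddies in friends.items():
        counts = Counter(labels[b] for b in buddies)
        best = max(counts.items(), key=lambda kv: (kv[1], kv[0]))[0]
        if labels[city] != best:
            labels[city], changed = best, True
    if not changed:
        break

# Group cities by their final label.
clusters = {}
for city, label in labels.items():
    clusters.setdefault(label, set()).add(city)
print(sorted(sorted(c) for c in clusters.values()))
# -> [['Austin', 'Dallas', 'Houston'], ['Boston', 'NYC', 'Philly']]
```

At Facebook scale the machinery is obviously heavier, but the principle is the same: the clusters fall out of the link structure, not out of any map drawn in advance.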
Warden posts various findings he gained from crunching the data on these clusters:
Probably the least surprising of the groupings, the Old South is known for its strong and shared culture, and the pattern of ties I see backs that up. Like Stayathomia, Dixie towns tend to have links mostly to other nearby cities rather than spanning the country. Atlanta is definitely the hub of the network, showing up in the top 5 list of almost every town in the region. Southern Florida is an exception to the cluster, with a lot of connections to the East Coast, presumably sun-seeking refugees. God is almost always in the top spot on the fan pages, and for some reason Ashley shows up as a popular name here, but almost nowhere else in the country.
(Greater Texas:) God shows up, but always comes in below the Dallas Cowboys for Texas proper, and other local sports teams outside the state. I've noticed a few interesting name hotspots, like Alexandria, LA boasting Ahmed and Mohamed as #2 and #3 on their top 10 names, and Laredo with Juan, Jose, Carlos and Luis as its four most popular.
(Mormonia:) It won't be any surprise to see that LDS-related pages like Thomas S. Monson, Gordon B. Hinckley and The Book of Mormon are at the top of the charts. I didn't expect to see Twilight showing up quite so much though, I have no idea what to make of that!

Mormons like their vampires sparkly and pro-abstinence; who would have guessed?
(Socalistan:) Keeping up with the stereotypes, God hardly makes an appearance on the fan pages, but sports aren't that popular either. Michael Jackson is a particular favorite, and San Francisco puts Barack Obama in the top spot.

Warden also has this tool for browsing aggregate profiles of countries based on their residents' public Facebook profiles.
Tabloids and Tory politicians have been claiming that Britain is a "broken society"; The Economist looks at the figures and shows that, actually, that's a load of rubbish; while Britain does have its share of social problems, it had much worse before:
As for family breakdown, some commentators seem to think that sex really was invented in 1963. British grannies know differently. Teenage pregnancy is still too common, but it has been declining, with the odd hiccup, for ages. A girl aged between 15 and 19 today is about half as likely to have a baby in her teens as her grandmother was. Her partner will probably not marry her and he is less likely to stick with her than were men in previous generations, but he is also a lot less likely to beat her. In homing in on the cosier parts of the Britain of yesteryear, it is easy to ignore the horrors that have gone. Straight white men are especially vulnerable to this sort of amnesia.

The perpetuators of the myth of "broken Britain", a society in violent decay, are building a narrative that strengthens kneejerk culture-war reactions, such as the Tories' tax breaks for married couples (read: "sin taxes" on the unmarried), whilst ignoring the cause of Britain's social problems: too little spent on education:
The waning of the manufacturing jobs that used to be the mainstay of the working class has created a generation of young males, in particular, who don’t know what to do with themselves. Britons have been boozers and scrappers for centuries, but self-destructive behaviour today in part reflects the perception that their lives are not worth much. As for children bearing children, there is evidence elsewhere that if girls are given better education—not just about sex, but also in areas likely to improve their job prospects—they are less likely to get pregnant at 16. Yet for all the official talk at home about ever-improving exam results, Britain is beginning to slide down the international league table of educational attainment.
(via David Gerard)
A new study from Bristol University has looked into the differences between cat owners and dog owners. As well as the usual stereotypes (cat owners are more likely to be women who live alone), they discovered that cat owners are more likely to have degrees than dog owners (47.2% of households with cats have one person with a degree, compared to 38.4% with a dog):
"Our best guess is that it's to do with working hours and perhaps commuting to work, meaning people have a less suitable lifestyle for a dog. It's really just a hunch though."Or perhaps there are common psychological traits associated with a fondness for cats and a likelihood to apply oneself to study (or, indeed, a fondness for dogs and a likelihood to quit wastin' time and go out into the real world)?
Following the death of J.D. Salinger, author of Catcher In The Rye, the seminal formulation of mid-20th-century teen angst, the Observer's Barbara Ellen asks whether Holden Caulfield's angst, alienation and stance against "phonies" has any relevance to today's affluent, materialistic teenagers:
Watch Skins, which has a new series on E4, but also take an honest look at your own teenager/s. Compare them with your teen self. Better dressed (check), more affluent (check), perma-partying (check), healthier, better looking, better skin (check, check, check!). There have been times when I've stared at my teenage daughter and thought: "What happened to acne?" Not only acne, but having to wear horrible clothes, because you didn't get an allowance, or sitting in cold bus shelters for hours with your friends because there was nowhere else to go.
They were humbling mechanisms of youth, so boring at the time, but also so important because they gave you an incentive to get a life. All gone. A particular breed of metro-teens already has a nice life, thank you very much. In fact, many of them seem to have the lives of salaried twentysomethings. Alienated? Only if being alienated is being infatuated with one's youth, to the point of having no interest in previous generations. Do a Holden and resent and judge "phoney adults"? You'd be lucky with this lot. They barely notice we're alive.
One realises that things are more complex than that – recessions, vanishing university places, the feeling that this relentless selfdom is doubtless a mere carapace with myriad complexities bubbling beneath. Besides, I like the carapace – that merit-less self-glorification, the stubborn refusal to glance out of their yoof bubble to see how the rest of us may be doing. At least they're not wasting their glory years picking their noses to the Smiths. However, this doesn't alter the fact that the dislocated, angst-ridden "blah" of Catcher is no longer a good fit for modern teens. The defining work for this generation would more likely be the Argos catalogue.

Ellen concludes with the claim that the truly lost generation aren't the hoody-wearing, Vice-reading, iPhone-toting party kids but their broke, exhausted parents, though, alas, there is no market for a book about middle-agers raging against "phoney teens".
From a Momus blog post, in which he, on departing from Osaka, speculates on how he might possibly live there:
I've never seriously thought about living in Osaka before. I love Tokyo best of all. But increasingly, my outlook has Berlinified, by which I mean I regard expensive cities like New York, London and Tokyo as unsuited to subculture. They're essentially uncreative because creative people living there have to put too much of their time and effort into the meaningless hackwork which allows them to meet the city's high rents and prices. So disciplines like graphic design and television thrive, but more interesting types of art are throttled in the cradle.

Momus makes an interesting observation, one which may seem somewhat paradoxical at first: first-tier global cities like London, New York, Paris and Tokyo are less creative than second-tier cities, largely because the pressure of their dynamic economies makes all but the most commercial creative endeavours unsustainable. I have noticed this myself, having lived both in Melbourne (Australia's "Second City" and home of the country's most vibrant art and music scenes; generally seen by almost everyone to be ahead of Sydney in this regard) and London (a city associated, in the public eye, with pop-cultural cool, from the Swingin' Sixties, through punk rock and Britpop, but now more concerned with marketing and repackaging than creating; it also serves as the headquarters of numerous media companies and advertising agencies). In London, it seems that people are too busy working for a living to make art in the way they do in Melbourne or Berlin, and the arts London leads in are the commercial ones Momus names Tokyo as leading in: graphic design, the media, and countless onslaughts of meticulously market-researched "indie" bands.
Those who thrive in London (and presumably New York, Paris and Tokyo) tend to be not the free-wheeling bricoleurs but the repackagers and cool-hunters, one eye on the stock market of trends and another on the repository of past culture, looking for just the right thing to pick up and just the right way to market it. (Examples: various revivals (Mod, Punk, New Wave), each more cartoonish and superficial than the last.) "Moving to London" is an artistic cliché, shorthand for wanting to hit the commercial mainstream, to surf the big waves.
There are, of course, counter-examples, but they tend to be scattered. For one, the more vibrant a cultural marketplace a city is, the more money is floating around, the more rents and prices are driven up, and the more those who are not driven by a commercial killer instinct find themselves unable to keep up, without either channeling their energies into money-spinning hackwork or whoring themselves to the marketing ecosystem, subordinating their creative decisions to its meretricious logic.
Also, as Paul Graham pointed out, cities have their own emphases encoded in their cultures; a city is made up as much of cultural assumptions as of buildings and roads, and there is only space for one main emphasis in a city. If it's about commerce or status, it's not going to be about creative bricolage. (This was earlier discussed in this blog, here.) The message of a city is subtle but pervasive, replicating through the attitudes and activities of its inhabitants, subtly encouraging or discouraging particular decisions (not through any system of coercion, but simply through the interest or disinterest of its inhabitants). As Graham writes, Renaissance Florence was full of artists, whereas Milan wasn't, despite both being of around the same size; Florence, it seems, had an established culture encouraging the arts and attracting artists, whereas Milan didn't.
When a city is said to be first-tier—in the same club of world cities as London and New York—the implication is that its focus is on status and success, and the city attracts those drawn to these values, starting the feedback loop. Second-tier cities (like Melbourne and Berlin and, according to Momus, Osaka) are largely shielded from this by their place in the shadows of first-tier cities and their relatively cooler economic temperature. (There's a reason why music scenes flourish disproportionately in places like Manchester and Portland, often eclipsing the Londons and New Yorks for a time.) Of course, as second-tier cities are recognised as "cool", they begin to heat up and aspire towards first-tier status. (One example is San Francisco; formerly the hub of the 1960s counterculture (which, of course, birthed the personal computing revolution), then the seat of the dot-com boom, and now promoting itself as the Manhattan of the West Coast.) Cities, however, fill niches; they can't all be New York, and the number of first-tier "world cities" is, by its nature, limited.
Much has been said about the alleged epidemic of random alcohol-fuelled violence outside Melbourne's night spots and its possible causes. Now, The Age's Fiona Scott-Norman suggests that it might be due to the boom in venues playing house music, once confined to Chapel Street, but now part of every venue aiming for the cashed-up-bogan dollar; in particular, to house music being poorly suited for facilitating social interaction:
And then there's house music. It's pretty much the ultimate "anti-romance" music. It's played loud, it's repetitive, it's not fun, it's unremarkable and unmemorable — even if you can make yourself heard over the top, it gives you nothing to talk about, and appears to be the first music ever created by humankind that bypasses the emotions. Again, fine if your aim is to dance like a maniac until 6am, or whenever you start coming down, but truly terrible if you're not on chemicals.
So the clubs are chock-full of young folk who can't talk to each other, can't touch each other, have zero opportunity for intimacy, and can only dance in their own little world and hope someone's looking at their booty. The only tools in their seriously denuded seduction kit are alcohol and shouting. So yet another night ends, they're disconnected and frustrated, back on the streets, and totally hammered. Gee, I wonder why there's so much violence.
Playing almost any other kind of music would reduce street violence. Doesn't matter if it's disco, funk, yacht rock, indie pop, Mongolian throat-singing, gypsy punk, neo-lounge or Latin, so long as it's not joyless, thumping background music.
Canadian psychology professor Bob Altemeyer has made available online the text of a book examining the psychology of authoritarianism. Altemeyer looks at what he calls Right-Wing Authoritarianism, a personality trait which manifests itself in a high degree of submission to the established authorities, high levels of aggression in the name of the authorities, and a high level of conventionalism, and correlates with the political right, at least in North America. (He also mentions left-wing authoritarianism—think dogmatic Maoism or similar—in passing, though dismisses it as having all but died out in North America, whereas right-wing authoritarianism is going from strength to strength.)
It’s about what happened to the American government after "conservatives" gained control of Congress in the 1990s and the White House in 2000. It’s about the disastrous decisions that government made, which have created the enormous problems we face now. It’s about the corruption that rotted the Congress. It’s about how traditional conservatism has nearly been destroyed by authoritarianism. It’s about how the “Religious Right” teamed up with amoral authoritarian leaders to push its un-democratic agenda onto the country.
For example, take the following statement: “Once our government leaders and the authorities condemn the dangerous elements in our society, it will be the duty of every patriotic citizen to help stomp out the rot that is poisoning our country from within.” Sounds like something Hitler would say, right? Want to guess how many politicians, how many lawmakers in the United States agreed with it? Want to guess what they had in common?

Altemeyer puts forward a Right-Wing Authoritarian personality scale, with higher scores correlating with the trait. High-RWA individuals have a "Daddy knows best" attitude to the authorities. They defer to their leaders, and even while they often believe that the law, however harsh, must be obeyed, they will exempt their leaders from this if the ends justify the means (such as approving of illegal activities against "radicals" or "enemies of society"). They view the world in terms of in-groups and out-groups, with little sympathy for the latter and an us-vs.-them outlook; they exhibit aggression against those seen to be transgressing against the norms of society, and are quicker than average to join with others to take action against them. And, being highly conventional, they interpret a lot of things as existential threats to the established order. (Authoritarianism, in other words, seems to tie in with a survival-values worldview, driven by the perception of existential threats and the need to deal with them.) Being driven by faith in authority, high-RWAs are more capable than most of compartmentalising contradictory beliefs and resisting challenges to their beliefs posed by logic or evidence.
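For the flavour of how a scale like this is scored, here is a hedged sketch of Likert-style scoring: respondents rate statements from -4 (strongly disagree) to +4 (strongly agree), reverse-worded items are flipped so that a high score always means more authoritarian agreement, and responses are shifted onto a 1-to-9 range before summing. The item count and responses below are placeholders, not Altemeyer's actual questionnaire.

```python
# Sketch of Likert-scale scoring in the style of Altemeyer's RWA scale.
# Responses run from -4 (strongly disagree) to +4 (strongly agree).
# Placeholder items and values, not the real scale.
def rwa_score(responses, reversed_items):
    total = 0
    for i, r in enumerate(responses):    # each r is in -4..+4
        if i in reversed_items:
            r = -r                       # flip reverse-worded statements
        total += r + 5                   # map -4..+4 onto 1..9
    return total

# Three placeholder items, the second of them reverse-keyed:
score = rwa_score([+4, -4, 0], reversed_items={1})
print(score)  # -> 23 (9 + 9 + 5)
```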
The Authoritarians looks at the RWA scale and other phenomena, such as religious fundamentalism, social-dominance orientation and real-world politics. Not surprisingly, there are correlations between right-wing authoritarianism and religious fundamentalism, and both are strong predictors of prejudice against out-groups. (Paradoxically, many high-RWA people exhibit both racial prejudices and hostility to overt racism, largely due to not seeing themselves or their peers as racially prejudiced; this would be the dampening effect authoritarianism has on insight and analysis.) Meanwhile, there are both parallels and differences between right-wing authoritarian followers and people who score highly on the social dominance scale; the former don't necessarily want personal power, whereas the latter are less likely to be religious or constrained by rules, though will often happily feign religiosity as a means to an end. Some individuals, of course, score highly on both scales. Because authoritarian followers are receptive to messages that feel right, and are suspicious of critical thought, right-wing authoritarian movements attract more than their share of power-hungry sociopaths willing to pound the right talking points to get willing, unquestioning followers.
The bad news is, the authoritarians have been ascendant over the past decade (in the US, Altemeyer says, they have largely seized the Republican Party). The good news is that right-wing authoritarianism, as a tendency, can be defeated. Studies have found that fear increases RWA scores, in effect making people shut up and follow the leader. (This was used to great effect by the Bush White House, for example, by instituting a prominent colour-coded terror threat level, seldom dipping below "severe", and raising it inexplicably before elections.) Fearful societies are governed by authoritarian survival values; without fear, those values have a harder time getting a grip. Exposure to people unlike oneself and one's "in-group" also weakens authoritarian tendencies, as does a liberal education. A study cited by Altemeyer showed university students' RWA scores declining steadily over the course of their studies, and remaining low throughout their lives. (Parenting, meanwhile, causes one's RWA scores to increase slightly.)
There are other ideas Altemeyer's Right-Wing Authoritarianism scale ties into, such as Lakoff's strict-father/nurturing-parent family dichotomy (which Altemeyer examines, though finds only weakly connected), Milgram's obedience experiment, Zimbardo's Stanford Prison Experiment, and theories about the mass psychology of fascism. (Of which this strikes me as one of the more useful ones; while it may be fun to posit connections between fascism and manned flight or the mass spectacle of rock'n'roll, those are probably less useful for actually understanding the threat of fascism as a mass movement.)
(via Boing Boing)
A heterosexual couple in London are planning to sue for the right to enter a civil partnership. Tom Freeman and Katherine Doyle want the same rights as a married couple, but reject the separate-but-equal doctrine that separates gay and straight couples into "civil partnerships" and "marriages". Unfortunately, the law insists that, being heterosexual, they are only entitled to marriage (an institution with its roots in religious tradition, to the point that the government created "civil partnership" to keep the queers from defiling it with their existence).
Tom Freeman and Katherine Doyle, both 25, want the same legal rights as any husband and wife, but said they did not want to be seen to be "colluding with the segregation that exists in matrimonial law between gay civil partnerships and straight civil marriage".
The couple applied for a civil partnership at Islington register office, in north London, but were refused because UK law bans opposite-sex civil partnerships.

Freeman and Doyle have the support of gay- and human-rights campaigner Peter Tatchell.
I wish them the best of luck. To say that a heterosexual couple's partnership automatically lumps them in with all other heterosexuals and separates them from non-heterosexuals is a denial of freedom of association. Furthermore, it casts light on the oddness of the concept of "marriage", which, on one hand, is a secular institution provided by a secular state, and on the other hand is robed in so much fusty religious tradition that the state imposes absolutely inflexible barriers along lines of ancient prejudice. It would be far more sensible to abolish secular marriage altogether, and have the state-recognised component be a civil partnership; if couples want to get married by their church (and their church approves), they could do that in addition to the state institution. The process could be streamlined, with churches acting as agents for the civil partnership agency whilst performing marriages, but at the end, all partnered couples, gay and straight, religious and irreligious, would have exactly the same title and status in the eyes of the law.
Australians work some of the longest hours in the developed world, mostly through unpaid overtime. Now, a progressive group calling itself The Australia Institute has designated today to be Go Home On Time Day, urging employees to leave the office exactly at leaving time:
Each year, Australians work more than 2 billion hours of unpaid overtime.
Around half of all employees work more hours than they are paid for. On average, a typical employee works 49 minutes of unpaid overtime per day. For full-time workers, the average daily amount of unpaid work is 70 minutes, which equates to 33 eight-hour days per year, or six and a half standard working weeks. Put another way, this is the equivalent of 'donating' more than your annual leave entitlement back to your employer.
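The report's arithmetic checks out, give or take the assumed number of working days; a quick back-of-the-envelope version (the 226-day figure is my assumption, roughly 52 weeks of five days minus annual leave and public holidays):

```python
# Back-of-the-envelope check of the Australia Institute figures quoted
# above. The 226 working days per year is an assumption (52 weeks of
# 5 days, minus four weeks' leave and roughly ten public holidays).
minutes_per_day = 70          # average unpaid overtime for full-timers
working_days = 226
total_hours = minutes_per_day * working_days / 60
eight_hour_days = total_hours / 8
working_weeks = eight_hour_days / 5
print(round(eight_hour_days), round(working_weeks, 1))  # -> 33 6.6
```

Which lines up with the quoted 33 eight-hour days, or about six and a half standard working weeks, a year.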
The Gervais Principle, or the social psychology of how organisations really function, as seen in the Office TV comedies:
Now, after four years, I’ve finally figured the show out. The Office is not a random series of cynical gags aimed at momentarily alleviating the existential despair of low-level grunts. It is a fully-realized theory of management that falsifies 83.8% of the business section of the bookstore. The theory begins with Hugh MacLeod’s well-known cartoon, Company Hierarchy ..., and its cornerstone is something I will call The Gervais Principle, which supersedes both the Peter Principle and its successor, The Dilbert Principle.

The MacLeod hierarchy, and the theory which is the cornerstone of Ricky Gervais' comedy (and its US remake), divides organisations into three psychological types, somewhat facetiously labelled Sociopaths (i.e., those driven by the desire to control and dominate, without whom no decisions would be made), Losers (i.e., those who have traded control of their destiny for security; these need not necessarily be losers in the colloquial sense) and the Clueless (who are in the middle of the hierarchy, but are one level below losers in self-awareness; whereas the loser typically puts in the minimum they can get away with, the clueless give their loyalty to the organisation out of a misplaced faith that it will be reciprocated). Initially, organisations start off with a few Sociopaths in the driving seat and a corps of Losers doing the gruntwork in exchange for a regular paycheque; as they get larger, a layer of Clueless is added, and expands. This layer may be imagined as a dense, inert substance, which serves to keep the otherwise inherently unstable organisation from imploding.
A sociopath-entrepreneur with an idea recruits just enough losers to kick off the cycle. As it grows it requires a clueless layer to turn it into a controlled reaction rather than a runaway explosion. Eventually, as value hits diminishing returns, both the sociopaths and losers make their exits, and the clueless start to dominate. Finally, the hollow brittle shell collapses on itself and anything of value is recycled by the sociopaths according to meta-firm logic.

The Gervais Principle builds on this, and describes how Losers who put in more than is in their best interest get promoted to middle-management, not because of their talents, or because of their incompetence (as per the Peter Principle or Dilbert Principle), but because they are most useful as pebbles in the insulating layer of the Clueless.
Sociopaths, in their own best interests, knowingly promote over-performing losers into middle-management, groom under-performing losers into sociopaths, and leave the average bare-minimum-effort losers to fend for themselves.
A loser who can be suckered into bad bargains is set to become one of the clueless. That’s why they are promoted: they are worth even more as clueless pawns in the middle than as direct producers at the bottom, where the average, rationally-disengaged loser will do. At the bottom, the overperformers can merely add a predictable amount of value. In the middle they can be used by the sociopaths to escape the consequences of high-risk machinations like re-orgs.
Which brings us to the other major management book that is consistent with the Gervais Principle: Images of Organization, Gareth Morgan’s magisterial study of the metaphors through which we understand organizations. Of the eight systemic metaphors in the book, the one that is most relevant here is the metaphor of an organization as a psychic prison. The image is derived from Plato’s allegory of the cave, which I won’t get into here. Suffice it to say that it divides people into those who get how the world really works (the sociopaths and the self-aware slacker losers) and those who don’t (the over-performer losers and the clueless in the middle).

(Paging Greg Wadley...)
Some observations on Halloween costumes, from Conrad "Ignatz" Heiney:
The Halloween costume for women that I call the "Slutty Noun" outfit is now a topic of debate and outrage; I've been complaining about it for years. It's mainstreaming the sex industry, dragging women back into the Playboy Bunny past, and in poor taste. Yuck!
Last year I realized something worse. While the women dress as stereotyped available objects (nurse, catwoman, stripper outfits, little French maid, showgirls) the men have their own roles. They're pirates, soldiers, cops, horror movie murderers, Dracula, barbarians. These roles have something in common too: they're powerfully violent and often depicted assaulting women.
What's the message? Men are rapists and women are their victims. And now every year the men and women dress that way, go to parties and bars and get sloshed, and see what happens.
Anyone is free to explore sexuality and enjoy role-playing I don't like. In this case it would be less worrisome if any of these people knew what roles they were taking on and where that might go.
The OKCupid people have been running a free online dating service, backed by psychological matching algorithms driven by user-written tests, for many years, and have built up a huge corpus of data about how people interact. Now they have started a blog, where they discuss the statistical findings that may be gathered from comparing people's profiles and message counts.
One blog post looks at how well different profile attributes predict whether two people will match. Not surprisingly, the zodiac signs of any two people have no effect on their actual personalities, and thus on how well they would get along:
Race has a slightly greater influence (of a few percentage points either way), presumably because of uneven distribution of cultural backgrounds, but it is still fairly small. (Keep in mind that the match scores are computed from how users answer others' questions, and not from explicitly asking questions like "would you date a Virgo/Polynesian/Buddhist".) Religion, however, turns out to be a lot more telling:
According to this, atheists, agnostics, Jews and Buddhists seem to get along just swell (in fact, Buddhists appear to be slightly more compatible with the nonbelievers than with other Buddhists), whereas the Christians, Hindus and Muslims tend to be somewhat more contentious, getting along less well not only with other religions but also with each other. Additionally, the more seriously one takes religion, it seems, the less likely one is to get along with others.
Looking again at the issue of race, while race doesn't seem to affect actual compatibility scores, it does affect how likely people are to get responses:
Love may be blind, but it also seems that it, or at least attraction, is deeply racist.
On a lighter note, OKCupid have crunched the word frequencies of successful and unsuccessful opening messages and discovered what to write if you want a reply. Netspeak and "hip" misspellings ('u', 'luv', 'wat') and physical compliments are out, whereas mentions of specific interests are helpful. Unsurprisingly, mentioning religion is generally a bad idea as well.
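OKCupid have described, in broad strokes, how those match percentages are computed: each user rates every question by importance, a "satisfaction" score measures how well the other person's answers fit your stated preferences, and the overall match is roughly the geometric mean of the two satisfaction scores. Here is a minimal sketch of that scheme; the importance weights, function names and sample data are illustrative assumptions, not OKCupid's actual code:

```python
from math import sqrt

# Illustrative importance weights (OKCupid's published scale is similar in spirit).
IMPORTANCE_POINTS = {"irrelevant": 0, "a little": 1, "somewhat": 10, "very": 50}

def satisfaction(questions):
    """questions: list of (their_answer, my_acceptable_answers, my_importance).

    Returns the fraction of importance-weighted points the other person earns
    by giving answers I consider acceptable."""
    earned = possible = 0
    for their_answer, acceptable, importance in questions:
        points = IMPORTANCE_POINTS[importance]
        possible += points
        if their_answer in acceptable:
            earned += points
    return earned / possible if possible else 0.0

def match_percentage(a_view_of_b, b_view_of_a):
    # The geometric mean rewards mutual compatibility over one-sided enthusiasm:
    # a single low satisfaction score drags the whole match down.
    return 100 * sqrt(satisfaction(a_view_of_b) * satisfaction(b_view_of_a))

# A answers two of B's questions acceptably on the big one, badly on the small one;
# B answers A's single question acceptably.
a_view_of_b = [("yes", {"yes"}, "very"), ("no", {"yes"}, "a little")]
b_view_of_a = [("yes", {"yes"}, "somewhat")]
print(round(match_percentage(a_view_of_b, b_view_of_a), 1))
```

Note how one "very important" mismatch would hurt far more than several trivial ones, which is presumably why the blog's aggregate numbers (religion mattering, star signs not) fall out of the question weights users themselves assign.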
The BBC looks at the sociological phenomena behind the rising popularity of cupcakes:
The humble cupcake has even been linked to political culture. Ms Twilley sees cupcakes as both democratic - one each - and libertarian - there is no imperative to share and everyone chooses a flavour - in marked contrast to the communal cake.
Dr Smith believes there could be something behind this theory. "There are more diverse kinds of families now," he says. "These social changes could have an impact upon the type of baking we're producing. Quantity could have changed - it might be that we prefer lots of little cakes to one huge one now."

Cupcakes as a symptom of social atomisation/the bowling-alone phenomenon/the decline of collective institutions?
Steven Levitt and Stephen Dubner, the authors of Freakonomics, have a sequel coming out next week; titled, naturally, Superfreakonomics, it looks like the same winning blend of insights, economic detective stories and at times glib reductionism:
Thus in the new book we learn, for example, that more deaths are caused per mile, in America at least, from drunk walking than drunk driving – so when you drive to a party and get plastered, it's not necessarily a wise decision to leave the car and walk home. We discover that female emergency-room doctors are slightly better at keeping their patients alive than male ones, and that Hezbollah suicide bombers, far from being the poorest of the poor, with nothing to lose, tend to be wealthier and better-educated than the average Lebanese person. There's an artful takedown of the fashionable "locavore" movement: transportation, Levitt and Dubner argue, accounts for such a small part of food's carbon footprint that buying all-local can make matters worse, because small farms use energy less efficiently than big ones.
Their mission to explain the world through numbers alone can give Superfreakonomics an oddly detached feel: major social issues are addressed, then instantly reduced to a statistical parlour game. An interesting section on the transactions between pimps and prostitutes, for example, shows that working with a pimp confers great financial advantages: they're much more helpful to prostitutes than estate agents are to house-sellers, for a start. But it neglects to consider the notion that there might, just possibly, be some negative aspects to the pimp-prostitute relationship. (The authors, apparently aware that they have made it look as if selling sex is the business plan to beat all others, add a helpful clarification: "Certainly, prostitution isn't for every woman.")
The Graun has turned what was meant to be a review of the latest Hollywood romantic comedy into a critique of this culture's emphasis on romantic love:
Romcoms don't merely provide an evening's harmless escapism. They help underpin one of the most potent doctrines of our culture: the sanctity of romantic love. It's a doctrine in which many find relief from the materialism, apathy and banality of a society no longer hallowed by religious transcendence. Yet it comes at a price.
The involuntary cognitive state that Jennifer Aniston finds herself depicting so frequently is real enough, but not particularly mystical. Brain scans show it to be generated by the frisky interaction of chemicals like norepinephrine and dopamine. If this hubbub's triggered by recognition of genetic quality, as now seems to be assumed, that would explain why Aniston and her ilk have to be so annoyingly good-looking.
What we call love induces some of the worst behaviour that we're likely to encounter. Yet when this occurs, it usually invites no censure, let alone punishment. Romantic love is a get-out-of-jail-free card that legitimises actions which would otherwise be thought contemptible. Home-wreckers steal something cherished far more deeply than money or possessions. Nonetheless, they go on to build their happiness on the misery of others without having to endure the slightest disapproval. After all, they had no choice but to do what they did: they were in love.
In other cultures, romantic love enjoys no comparable status. Our own ancestors might find our veneration of it as puzzling as we find their worship of pagan gods. In our otherwise disrespectful age, the persistence of its dominion is rather remarkable. Would it have proved so enduring without the big screen's relentless promotion of its supposedly limitless benefits?

I have been wondering whether the emphasis on romantic love in popular culture (a significant proportion of mainstream pop songs seem to be about the transitions into and out of the state of being in love, for example) and the sexualisation of the media are not two sides of the same coin, namely a focus on relations between people as being a marketplace of potential partners, rather than less glamorous and less dynamic forms of relations. Could this be a sort of social Reaganism/Thatcherism, the ideological assertion that "everything is a market" translated into the realm of interpersonal relations?
Two more institutions of 20th-century Britain celebrate their anniversaries this month. It was 40 years ago today that Monty Python's Flying Circus first aired on the BBC. And almost exactly ten years later, BBC Radio aired an odd little radio play titled The Hitchhiker's Guide to the Galaxy, which encapsulated a Pythonesque sense of absurdism, a now quaint view of post-WW2, pre-Thatcherite Britain, and visions of futuristic technologies that, seen from today, are at once uncannily prescient and jarringly cautious, in the way that yesterday's futurism tends to be:
Within this handy framework, the Hitchhiker stories make up a sort of folk-art depiction, like on a tribal carpet, of the late-1970s English middle-class cosmic order. So there he is, the hapless Arthur Dent, in the middle, his maths insufficient to grasp even the first thing about his current position, in a county in a country, on a continent on a planet, in a solar system, in a galaxy, and so on. (Even now, the only way I can get the hierarchy right is by referring to the products of Mars Inc.) Except that the universe, 1979-style, would have seemed different from the one we know, and don't know, today, with space travel, in the years between the Moon landings and the Challenger disaster, both current and glamorous-feeling in a way it certainly isn't now. Tomorrow's World went out on the BBC every Thursday; Carl Sagan's Cosmos went out in 1980; cool space-junk was everywhere, Star Wars and Close Encounters, Bowie and P-Funk and the Only Ones. Relativity and the space-time continuum, wormholes and the multiverse featured everywhere in science fact and fiction, and were easily bent and twisted into the sort of paradox at which Adams's mind excelled – the armada of spaceships diving screaming towards Earth, "where, due to a terrible miscalculation of scale, the entire battle fleet was accidentally swallowed by a small dog"; the Restaurant at the End of the Universe, where you can pay for dinner by putting 1p in a present-day savings account, meaning that "when you arrive at the End of Time . . . the fabulous cost of your meal has been paid for".
Except that the Guide wasn't just a literary device, a concept. It really was a "Book", a thing of plastic, an actual piece of tech. It looked, we are told, "rather like a largish electronic calculator" – as such a device would have had to look in the 1970s, before iPhones, Kindle, Ernie Wise's Vodafone. On it, "any one of a million 'pages' could be summoned at a moment's notice" – what, only a million?, 21st-century readers object.
But there's a definite tea theme, and a lot of Englishness, and a distinctive note of piscine melancholy: So Long, and Thanks for All the Fish; The Salmon of Doubt. If Adams's books were a domestic appliance, they'd be a Sinclair ZX80, wired to a Teasmade, screeching machine code through quadraphonic speakers, and there'd probably be a haddock in there somewhere, non-compatible and obsolete.

I'm slightly disappointed that Google didn't put up a special commemorative logo.
Hospitals and prisons in the UK are removing alcohol-based hand sanitiser dispensers introduced during the swine flu pandemic because people are drinking the gel.
An article in Radical Philosophy (a journal of "socialist and feminist philosophy") looks at the phenomenon of those Keep Calm And Carry On posters, and what the reprinting and near-ubiquitous popularity of a WW2-era propaganda poster that was never originally released says about contemporary British society, alienation from the consumer-capitalist values of Thatcherism-Blairism, and a displaced longing for an imagined utopia of benign paternalistic bureaucracy and modernistic optimism that happened (as all golden ages happen) decades before those contemplating it were born.
Initially sold in London by the Victoria & Albert Museum, the poster only gradually became the middlebrow staple it is now when the recession, euphemistically the ‘credit crunch’, hit. Through this poster, the way to display one’s commitment to the new austerity was to buy more consumer goods, albeit with a less garish aesthetic than was customary during the boom. It is in a sense not so different to the ‘keep calm and carry on shopping’ commanded by George W. Bush both after September 11 and when the sub-prime crisis hit America – though the ‘wartime’ use of this rhetoric has escalated during the economic turmoil, especially in the UK. Essentially, the power of ‘Keep Calm and Carry On’ comes from a yearning for an actual or imaginary English patrician attitude of stoicism and muddling through, something which survives only in the popular imaginary, in a country devoted to services and consumption, and given to sudden outpourings of sentiment and grief, as over the deaths of celebrities like Diana Spencer or Jade Goody. The poster isn’t just a case of the return of the repressed, it is rather the return of repression itself, a nostalgia for the state of being repressed – solid, stoic, public-spirited, as opposed to the depoliticized, hysterical and privatized reality of Britain over the last thirty years. 
At the same time as it evokes a sense of loss over the decline of this idea of Britain and the British, it is both reassuring and flattering, implying a virtuous (if highly self-aware) stoicism in the displayer of the poster or wearer of the T-shirt.

The article then mentions a number of other examples of related "austerity chic" or fetishisations of a more high-minded and public-spirited pre-Thatcherite idyll: celebrity chef Jamie Oliver's "Ministry of Food" concept, aesthetic references to Jan Tschichold's Penguin paperback covers, and plates printed with austerely clean images of 1930s Modernist buildings (which the author of the essay links to a reaction against the Blatcherite culture of high-Gini property speculation). One can add to this list a number of other icons. For example, on things like CD covers, the British Rail logo seems to have taken the place of the Mod RAF roundel as an insignia of retro cool, also referencing a pre-privatisation institution remembered more fondly in retrospect than its better-marketed, fare-gouging Blatcherite successors. Perhaps if Rupert Murdoch gets his way, we'll see hipsters wearing BBC logo badges in the not too distant future?
It's not all benign nostalgia for a kinder, gentler age, though: the article mentions a more sinister edge, from the police force's Keep Calm-esque "We'd Like To Give You A Good Talking To" posters (ironically juxtaposing an attention-grabbing tone of authoritarian brutality with a "caring" official message, which happened, as the author points out, near the harshly suppressed G20 protests) to Ken Livingstone's ironically Orwellian "Secure Beneath Their Watchful Eyes" posters, promoting the comprehensive surveillance society as a beneficial thing. (One is reminded of the "High Security Holiday Resort" posters in Terry Gilliam's Brazil.)
An article in the New Republic examines the rise of Ayn Rand's ideology, which asserts that one's wealth is literally proportional to one's value as a human being, selfishness is a virtue and altruism is evil, from the fringes to the mainstream of American conservative thought:
In these disparate comments we can see the outlines of a coherent view of society. It expresses its opposition to redistribution not in practical terms--that taking from the rich harms the economy--but in moral absolutes, that taking from the rich is wrong. It likewise glorifies selfishness as a virtue. It denies any basis, other than raw force, for using government to reduce economic inequality. It holds people completely responsible for their own success or failure, and thus concludes that when government helps the disadvantaged, it consequently punishes virtue and rewards sloth. And it indulges the hopeful prospect that the rich will revolt against their ill treatment by going on strike, simultaneously punishing the inferiors who have exploited them while teaching them the folly of their ways.
Today numerous CEOs swear by Rand. One of them is John Allison, the outspoken head of BB&T, who has made large grants to several universities contingent upon their making Atlas Shrugged mandatory reading for their students. In 1991, the Library of Congress and the Book of the Month Club polled readers on what book had influenced them the most. Atlas Shrugged finished second, behind only the Bible. There is now talk of filming the book again, possibly as a miniseries, possibly with Charlize Theron. Rand's books still sell more than half a million copies a year. Her ideas have swirled below the surface of conservative thought for half a century, but now the particulars of our moment--the economic predicament, the Democratic control of government--have drawn them suddenly to the foreground.
Around the age of five, Alissa Rosenbaum's mother instructed her to put away some of her toys for a year. She offered up her favorite possessions, thinking of the joy that she would feel when she got them back after a long wait. When the year had passed, she asked her mother for the toys, only to be told she had given them away to an orphanage. Heller remarks that "this may have been Rand's first encounter with injustice masquerading as what she would later acidly call ‘altruism.’ " (The anti-government activist Grover Norquist has told a similar story from childhood, in which his father would steal bites of his ice cream cone, labelling each bite "sales tax" or "income tax." The psychological link between a certain form of childhood deprivation and extreme libertarianism awaits serious study.)

While the premises of Randian ideology—the myth of the heroic self-made man, the correlation between wealth and value—have a certain sort of glibly narcissistic appeal (in particular to those wishing to rationalise their beliefs in themselves as good people), they fall apart under closer examination:
Is income really a measure of productivity? Of course not. Consider your own profession. Do your colleagues who demonstrate the greatest skill unfailingly earn the most money, and those with the most meager skill the least money? I certainly cannot say that of my profession. Nor do I know anybody who would say that of his own line of work. Most of us perceive a world with its share of overpaid incompetents and underpaid talents. Which is to say, we rightly reject the notion of the market as the perfect gauge of social value.
Now assume that this principle were to apply not only within a profession--that a dentist earning $200,000 a year must be contributing exactly twice as much to society as a dentist earning $100,000 a year--but also between professions. Then you are left with the assertion that Donald Trump contributes more to society than a thousand teachers, nurses, or police officers. It is Wall Street, of course, that offers the ultimate rebuttal of the assumption that the market determines social value. An enormous proportion of upper-income growth over the last twenty-five years accrued to an industry that created massive negative social value--enriching itself through the creation of a massive bubble, the deflation of which has brought about worldwide suffering.
The reality of the contemporary United States is that, even as income inequality has exploded, the average tax rate paid by the top 1 percent has fallen by about one-third over the last twenty-five years. Again: it has fallen. The rich have gotten unimaginably richer, and at the same time their tax burden has dropped significantly. And yet conservatives routinely describe this state of affairs as intolerably oppressive to the rich. Since the share of the national income accruing to the rich has grown faster than their average tax rate has shrunk, they have paid an ever-rising share of the federal tax burden. This is the fact that so vexes the right.

And here is a piece on the sociopathic dimension of Randian ideology:
Interestingly, despite her general disdain for humanity, there were people she seemed to admire greatly, such as William Edward Hickman, whose credo, "What is good for me is right," she described in her Journals as, "The best and strongest expression of a real man's psychology I have heard." But Hickman was no simple expositor of personal greed and self-interest; no mere modern day libertarian; no pedestrian practitioner of excessive self-love. No indeed. He was a sociopathic murderer. In 1927 he kidnapped a 12-year old girl from a school in Los Angeles by the name of Marian Parker, chopped off her legs, cut out her internal organs, drained all of her blood and then spread parts of her body all over the city.
Of Hickman, this sick murderer, Rand had almost nothing but positive things to say. She indeed critiqued those who would condemn Hickman's actions for having committed "worse sins and crimes," such as those she ascribed to his jury. Among those "greater" crimes--greater than mutilating a child--she included being, "Average, everyday, rather stupid looking citizens. Shabbily dressed, dried, worn looking little men. Fat, overdressed, very average, 'dignified' housewives." Their ordinariness, in other words, placed them below Hickman, in Rand's mind. "How can they decide the fate of that boy? Or anyone's fate?" she implored in her Journals.
As rising oil prices bite, people are talking about moving to a 4-day work week to reduce fuel consumption. The idea has been tried in Utah, but as befits a conservative Mormon state in the US whose emblem is the beehive, it didn't result in an extra day of leisure time, but rather four 10-hour workdays. Nonetheless, the results have been promising, and the experiment has proven popular, with 82% of participants preferring to stick with it:
"If employees are on the road 20 percent less, and office buildings are only powered four days a week," Langmaid says, "the energy savings and congestion savings would be enormous." Plus, the hour shift for the Monday through Thursday workers means fewer commuters during the traditional rush hours, speeding travel for all. It also means less time spent idling in traffic and therefore less spewing of greenhouse gases and other pollutants. The 9-to-5 crowd also gets the benefit of extended hours at the DMV and other state agencies that adopt the four-day schedule.
An article in the Graun looks at how changes in telephone technology have affected the plots of novels, plays and films:
Victoria Wood's Talent, first staged in 1978 and now showing at the Old Laundry theatre in Bowness-on-Windermere, includes a scene in which an important call is made from a coin-operated phonebox. Wood, who is directing, had to explain to young members of the cast how the strange apparatus worked: listen for an answer, then push in your sometimes-resistant 10p pieces. It sounded like science fiction to the young actors. So, while the play's first audiences will have regarded this scene as social realism (perhaps reflecting on their own experiences of trying to find an unvandalised phone that didn't spit your silver out), the same sequence, within three decades, has become social history.
In Pedro Almodóvar's latest movie, Broken Embraces, which cuts from the present to the 1980s, the director uses mobiles as a visual clue to where we are. The older sequences are signalled by the wielding of brick-like instruments, while present-day characters effortlessly palm their thin, flippable devices.
Wood, in Talent, wrings comic embarrassment from the fact that a character's mother has an extension in her bedroom. But the detail is revealing in other ways: in the 1970s, multiple receivers in the home distinguished the upper-middle classes from the plebs who had a single instrument in the hallway. In a later TV play by Wood, it matters that a character has a "kitchen extension". Indeed, in Dial M for Murder, the "perfect murder plot" turns on luring a woman to the living room to answer the phone. Within 20 years, Knott's plot had been rendered a period piece by multi-phone homes; after a further 30, mobiles had made the plot absurd.
But, although mobiles have provided writers with rich new storylines, they have also worryingly closed off many traditional developments. This is of most concern to authors of horrors, thrillers and mysteries, in which a regular premise is the protagonist's total isolation. If there had been a Nokia in Janet Leigh's handbag, Hitchcock's Psycho would have been a short film with a happy ending. The avoidance of this problem has already created a new movie cliche: the closeup showing the "no signal" warning on the star's phone.
There's a new documentary which looks at how Italian president and media mogul Silvio Berlusconi has, over the past three decades, changed his country's culture, society and politics. Videocracy, by Swedish-based Italian filmmaker Erik Gandini, starts 30 years ago, when Berlusconi's channels started introducing gratuitous female nudity to mainstream programming, ramping up the amount of vacuous, vaguely pornographic titilliation, and culminating with their owner becoming president, twice. The film interviews a number of characters symbolic of the system, including a hapless, fame-hungry talent-show contestant, a fascist-sympathising media fixer, and a paparazzo and convicted extortionist turned celebrity. There are more details here, and (with a trailer) here.
Beginning with a young Berlusconi’s arrival on the commercial Italian TV scene three decades ago, the film opens with archive footage of a stripping housewife quiz show – proving that this Italian TV gem is not the urban legend some assume it to be – and segues into a montage of busty variety show starlets (one of whom is now minister for gender equality in Berlusconi’s government). Videocracy then introduces us to the three main characters who are used as benchmarks for the moral madness that the director sees as being induced by the dumbed down world of Italian commercial TV.

Videocracy is screening at the Venice and Toronto film festivals. Perhaps unsurprisingly, though, both Berlusconi's private TV channels and the Italian state broadcaster RAI have refused to run advertisements for it.
Every so often, one reads horror stories about how technology is destroying our ability to write: about kids robbed of grammatical ability by text messaging, or PowerPoint presentations eroding the ability to string sentences together. Now, a professor of writing claims the opposite: thanks to the internet, we are entering a golden age of literacy. According to Professor Andrea Lunsford of Stanford University, thanks to the internet, email, blogs, forums and instant messaging, people write more than in living memory, and consequently, more people than ever have the sorts of highly developed and practiced writing skills that previously were the domain of an elite cadre of professional wordsmiths:
Before the Internet came along, most Americans never wrote anything, ever, that wasn't a school assignment. Unless they got a job that required producing text (like in law, advertising, or media), they'd leave school and virtually never construct a paragraph again.
But is this explosion of prose good, on a technical level? Yes. Lunsford's team found that the students were remarkably adept at what rhetoricians call kairos—assessing their audience and adapting their tone and technique to best get their point across. The modern world of online writing, particularly in chat and on discussion threads, is conversational and public, which makes it closer to the Greek tradition of argument than the asynchronous letter and essay writing of 50 years ago.
The brevity of texting and status updating teaches young people to deploy haiku-like concision. At the same time, the proliferation of new forms of online pop-cultural exegesis—from sprawling TV-show recaps to 15,000-word videogame walkthroughs—has given them a chance to write enormously long and complex pieces of prose, often while working collaboratively with others.
Doing its part to cope with the global obesity epidemic, the subway system in São Paulo, Brazil, has introduced double-width, reinforced seats for fat people:
The seats are bright blue and have stickers above them marking them out as special seats for the use of obese persons. Strangely enough, they seem almost always empty; presumably, a lot of those who can fit into a regular seat or can bear standing are not keen to identify themselves as severely overweight.
One wonders how facility planners could cater for a widening population without coming up against stigma and denial. They could, of course, go back to flat benches without divisions, except that those allow homeless people to sleep on them, which is unacceptable for various reasons. (Not all people are sufficiently enlightened and compassionate to share their daily commute with the aromatically homeless, and if public transport facilities adopted a secondary role as a homeless shelter, this would drive out many of those who are sufficiently well-off to avoid public transport, putting more cars on the road, and resulting in the money spent on running the actual trains being wasted, but I digress.) Possibly some sort of design with all seats being double-width with a low-key, or movable, divider in the middle, would do the trick; though that could have the unintended consequence of encouraging amorous couples.
A review of a new book on changing patterns of recreational drug use in the USA:
...according to several metrics, acid use was at "an historic low: 3.5 percent." By 2003, it was down to 1.9 percent. Why? It wasn't just that LSD had gone out of style, although it had, somewhat. Grim found evidence of a perfect storm of causes for the decline. In 2000, the DEA had arrested a man named William Pickard, thought to be the manufacturer of as much as 95 percent of the available acid in the U.S. The Grateful Dead, whose concerts provided an opportunity for suppliers and users to connect and network, had stopped touring after the 1995 death of Jerry Garcia, and Phish, a jam band that had stepped in to fill the gap, also stopped touring by the end of 2000. The rave scene began to fade away under pressure from authorities who threatened to arrest organizers for drug offenses committed at their events.

And then this depressing picture of an atomised, asocial society, which ties in with the bowling-alone mass-alienation idea:
Today's kids aren't smoking much pot because pot is a "social" drug, shared among peers who gather in parking lots and other hangouts; teens have less unstructured time now and tend to socialize online. They still get high, only on prescription drugs pilfered from adults or ordered off the Internet. "There's no social ritual involved," he observes, "just a glass of water and a pill," which "fits well into a solitary afternoon."

The rest of the review looks pretty interesting, including the theory that recreational drugs have cycles, in which they become popular, then become lame, and then come back sometime later to a generation who have never witnessed their effects, illustrated by an anecdote about kids regarding Ecstasy as "too hard on your body" and cocaine as "not that bad".
Prospect Magazine has an interesting article about Sweden:
Inevitably, the subject turns to sex and marriage. I'll never forget asking one group what they thought of marriage in a country where most educated young people (and half go to university) don't get married or bear children until they are well over 30. A young woman gave me a thoughtful answer and so I asked her, "What are you looking for in a husband?" Without batting an eye or pausing for thought, she answered: "Three things. One, he must be good in bed. Two, he must be a good father. Three, when we divorce, he mustn't be bitter."
At the moment Reinfeldt is leading the four conservative parties who form the government to reform some of the deeply held attitudes of Swedish society. "What we are is anti-conformity. We have opened up the schools and health services to competition and worked to end the many monopolies in our society." The propensity towards conformity bugs both Reinfeldt and many of the foreigners who work or study here. When I said that I find the Swedes are the Japanese of Europe, he nodded his head in agreement.
Apart from a well-travelled elite, the majority of people look down on those who buck the Swedish lifestyle trend—those who are a day late at the end of every month with paying their bills, those who cross the street before the light turns green even if no traffic is coming, those who miss a meeting of the committee of tenants that supervise their block of flats, those who don't do immediately what the committee has told them to do. (The penalties are severe—as I found out. You can be thrown onto the street for disobeying, even though you own your flat). Swedes just about can bring themselves to vote for different parties. But when it comes to big issues they usually follow the Stockholm elite's consensus. Very rarely is there a furious debate in parliament or the courtroom. People prefer to agree than disagree.
Swedes tell you that there is pressure in society not to raise your head too far above the parapet. One shouldn't push too hard to get ahead, to ask too demandingly for a salary increase, to engage in conspicuous consumption, to build too big a house or to own too posh a car or dress in a fancy or even stylish way. The very well cut business suit or skirt and jacket, much less the bejewelled theatre, concert or partygoer are not welcomed.
Gropecunt Lane (pronounced /ˈɡroʊpkʌnt ˈleɪn/) was a street name found in English towns and cities during the Middle Ages, believed to be a reference to the prostitution centred on those areas; it was normal practice for a medieval street name to reflect the street's function or the economic activity taking place within it. Gropecunt, the earliest known use of which is in about 1230, appears to have been derived as a compound of the words "grope" and "cunt". Streets with that name were often in the busiest parts of medieval towns and cities, and at least one appears to have been an important thoroughfare.
Although the name was once common throughout England, changes in attitude resulted in it being replaced by more innocuous versions such as Grape Lane. Gropecunt was last recorded as a street name in 1561.

There are currently no streets named Gropecunt Lane in England, all of them having been renamed to things like Magpie Lane, Grape Lane, or in some cases, Grope Lane (attributed by the prudish to the narrowness and darkness of the street, not to any untoward activity having taken place there). Perhaps the time has come for a campaign to undo these shameful acts of vandalism and restore this piece of English history?
(via David Gerard)
An artist has placed 30 street pianos in the City of London. The pianos are padlocked to prevent antisocial people from vandalising them and bear the legend "Play Me, I'm Yours".
Your Humble Correspondent saw one such piano this afternoon at Liverpool Street station, adjacent to the exit to the station plaza. I walked up to it, and, before I could reach it, was approached by a beggar, who presumably had found a lucrative pitch around the piano. I, of course, did what any decent person would have done and gave the poor wretch a crisp ten-pound note.* After he was off, I approached the piano, wondering whether it actually worked.
Before I could touch the keyboard, I heard another voice. Another man, with bad skin and beady eyes, had walked up to me, and was pointing to a discarded Tesco bag on the ground near the piano.
"Is this your bag?"
I assured him that it wasn't, and returned my attentions to the piano. I managed to find that the lowermost D key did actually work. My unwelcome companion, however, would not be ignored.
"It's a disgrace, that's what it is! People leaving shopping bags around."
It seems that he had a chip on his shoulder and a lot to say, and had found someone to say it to, or at least to say it at. In any case, he didn't have an audience for long; I abandoned the piano and left. The piano went unplayed; the only people who found it useful, it seems, were Trevor the Tramp and Mr. Cranky.
I suspect that ideas like street pianos, while they may well work in, say, Berlin or Copenhagen or somewhere, don't work in London; because of the culture of London and/or England, they would attract the antisocial who would render them unusable for their intended purpose; i.e., awakening the spirit of creative play in random passers-by. This comes down to the social contract of London.
To live in London for any length of time is to learn the rule that other people are at best an annoyance. There are too many of them and too little space to go around. When they're not assaulting your hearing with their tinny music-playing mobile phones, accosting you for money or subjecting you to their inanities and prejudices, they're taking up space you want to move into, or looking like they might well threaten to do one of these things. One deliberately avoids eye contact, avoids acknowledging other people as anything more than roughly human-sized obstacles, lest the barrier come down and one find oneself subjected to the outflowings of the (quite probably disagreeable) personality of the entity sitting across the Tube carriage. That is, in the event that they don't start moving away from the weirdo who's making eye contact with them. People in general are, the belief goes, not very nice. (There is a technical term used for this state of being; it starts with C.)
Of course, the flipside of believing that people are, as a rule, unpleasant is to believe that being unpleasant is the accepted behaviour, part of the social contract. The only people who are conspicuously nice are charity muggers and con artists, i.e., people who are up to something. Unpleasantry is honesty.
Why this is so is not clear. The Left might claim that it's the legacy of Thatcherism and the dog-eat-dog market society, whereas the Right may pronounce gloomily that it is part of mankind's Hobbesian nature, the eternal war of all against all which can only be quelled with suppressive force from above. Perhaps it has something to do with England being what anthropologists call a "negative politeness" society (where politeness is about leaving people alone, rather than reaching out to them, as in a "positive politeness" society), and thus the only people who deign to enter the bubbles of mutually agreed isolation are outsiders, those with little to lose. Well, them and chuggers, but I digress. And some (including author Stuart Maconie) claim that it is only London that is like this, that if you head far enough north, or far enough south, people become actually quite agreeable.
* Actually, I lie. I headbutted him to the ground, kicked him repeatedly in the groin and told him, firmly, to be off lest I give him a proper hiding.
A few interesting posts from LiveJournal: Momus writes about the nihilism of heat:
In hot places, human life seems to be worth less. Iraq is a place where an army of Mersaults in the form of soldiers and "contractors" are constantly on the brink of breaching their own culture's taboos, driven to acts of violence by the murderous heat. Under the sun, nothing matters any more.
Some say that cultural psychology changes as you move closer to the equator. In Discovering Psychology with Philip Zimbardo, a precis of a film about cultural psychology made for Stanford University, James Jones of the University of Delaware argues that a way of being has evolved near the equator featuring particular uses of time, rhythm, improvisation, orality, and spirituality. What he describes is a failure to defer gratification.

And Lord Whimsy (author of The Affected Provincial's Companion) examines cod-Victorianism (think steampunk, goth and McSweeney's-esque retro typography) and cod-Modernism (think Shibuya-kei, Mod revivalism and, for want of a better word, Helvetipunk):
Cod-Vic usually has an element of knowing perversity to it, relishing its bad taste rather than recoiling from it. Cod-Vic has also opened the floodgates to a host of one-liners, cheap novelties, and dead ends--some delightful, others less so. The most successful Cod-Vic offerings seem to strike a deft balance, borrowing from overwrought Victorian forms while maintaining a modern crispness and rigor.
Cod-Mod is another form of retro, but an altogether less problematic one than Cod-Vic since its clean lines, unadorned materials and simple color schemes lend a laid-back, contemporary air of unaffected sophistication, making it much easier to defend as an aesthetic. Cod-mod has been mined heavily by indie music (Camera Obscura and Belle & Sebastian, anyone?), as well as interior design types like Jonathan Adler and film directors like Wes Anderson. Oh yeah, and Ikea.
Found in Bruce Schneier's account of a workshop on security and human behaviour, this gem:
Great phrase: the "laundry belt" -- close enough for students to go home on weekends with their laundry, but far enough away so they don't feel as if their parents are looking over their shoulder -- typically two hours by public transportation (in the UK).
To figure out the current state and direction of the global economy, economists are turning to somewhat unusual indicators, such as the membership of extramarital infidelity websites and the price of prostitution in Latvia:
The Web site crunched its traffic and membership numbers and found that there was a big increase in both when there was a turning point in the FTSE-100 index, which measures the leading companies listed in London. When the market collapses, people plot affairs. And when the bulls rage, the same thing happens. When it is trading sideways, they stick with their partners.
“It has to do with people’s confidence levels,” says Rosie Freeman-Jones, a spokeswoman for the site. “When the markets are up, they think they can have an affair because they feel they can get away with anything. When the market hits the bottom, they are looking for a way to relieve the pressure.”

And here is more information on the prostitution index, and why prostitution prices make a good economic indicator.
Anyway the problem is that most industries have contractual arrangements which fix prices. Wages are very hard to flex downwards. Rents are fixed over sustained periods and the like. All of this means that people go bust rather than reduce prices – simply because prices are sticky.
Well – most prices. The contractual terms of prostitution are short (an hour, a night) and entry to the industry is unconstrained. That means that the prices are very flexible. Extraordinarily flexible.
A website named Double X, which seems to be a broadly feminist publication run by the Newsweek people, has a piece examining the phenomenon of women using photographs of their children as their Facebook profile photos, and what it says about their identity and social position:
These Facebook photos signal a larger and more ominous self-effacement, a narrowing of our worlds. Think of a dinner party you just attended, and your friend, who wrote her senior thesis in college on Proust, who used to stay out drinking till five in the morning in her twenties, a brilliant and accomplished woman. Think about how throughout the entire dinner party, from olives to chocolate mousse, she talks about nothing but her kids. You waited, and because you love this woman, you want her to talk about…what?…a book? A movie? A news story? True, her talk about her children is very detailed, very impressive in the rigor and analytical depth she brings to the subject; she could, you couldn’t help but think, be writing an entire dissertation on the precise effect of a certain teacher’s pedagogical style on her 4-year-old. But still. You notice at another, livelier corner of the table that the men are not talking about models of strollers. This could in fact be a 19th-century novel where the men have retired to a different room to drink brandy and talk about news and politics. You turn back to the conversation and the woman is talking about what she packs for lunch for her child. Are we all sometimes that woman? A little kid talk is fine, of course, but wasn’t there a time when we were interested, also, in something else?
Facebook, of course, traffics in exhibitionism: It is a way of presenting your life, at least those sides of it you cherry pick for the outside world, for show. One’s children are of course an important achievement, and arguably one’s most important achievement, but that doesn’t mean that they are who you are. It could, of course, be argued that the vanity of a younger generation, with their status postings on what kind of tea they are drinking, is a worse kind of narcissism. But this particular form of narcissism, these cherubs trotted out to create a picture of self is to me more disturbing for the truth it tells. The subliminal equation is clear: I am my children. And perhaps for their health and yours and ours, you should be other things as well.
A new exhibition in Spain explores sexual relations under Franco's fascist régime, through official advice given to women by the Feminine Section of the Falange, the fascist party:
"If your husband asks you for unusual sexual practices, be obedient and don’t complain. If he suggests union, agree humbly… When the culminating moment arrives, a small whimper on your part is sufficient to indicate any pleasure that might have been experienced."
From 1937 to 1977, some three million women aged 17 to 35 joined an organisation that urged young girls "not to burden themselves with books … there’s no need to be an intellectual". And although sport was encouraged – one of few positives of the mass mobilisation – enthusiasts were warned: "Don’t take sport as a pretext to wear scandalous costumes."
The head of the women’s section was Pilar Primo de Rivera, sister of Jose Antonio Primo de Rivera, who founded the fascist Falange, the ideological backbone of Franco’s rule. "The life of every woman … is nothing more than the eternal desire to find someone to submit to," she wrote.
For 40 years, women were drafted into "social service", a form of military conscription that supplied free labour for hospitals, publicly run restaurants and other social institutions.
A study in Japan, correlating suicide rates with lithium levels in water supplies, has found that lithium in the water supply reduces the suicide rate:
High doses of lithium are already used to treat serious mood disorders. But the team from the universities of Oita and Hiroshima found that even relatively low levels appeared to have a positive impact on suicide rates.
Levels ranged from 0.7 to 59 micrograms per litre. The researchers speculated that while these levels were low, there may be a cumulative protective effect on the brain from years of drinking this tap water.

The researchers have stopped short of recommending that lithium be added to the water supply, in the way that fluoride is.
They now have atheist summer camps:
The five days in Somerset will consist of traditional outdoor activities such as canoeing and cycling, combined with discussions about religion and non-belief. The centrepiece of the camp is an ongoing discussion where participants are encouraged to try to disprove the existence of unicorns, which serve as a metaphor for God.
Campers are told that two unicorns live in the area and cannot be seen, heard or touched. The adult counsellors pretend to believe in the unicorns on the basis that an ancient book handed down through the generations says they exist. The children are encouraged to try to prove that the unicorns do not exist. If anyone is successful they will be awarded a £10 note which has a picture of Charles Darwin on it and is signed by leading atheist academic Richard Dawkins.
In the US the prize is a "godless" $100 bill from before 1957, which was when the US placed the phrase "In God We Trust" on all its notes. No child has definitively disproved the existence of unicorns and won the prize. "The idea of the unicorn debate is not to prove God doesn't exist, it is to illustrate that having such debates with religious people is futile because in the end faith trumps everything," said Miss Stein.
The Gini coefficient is a number from 0 to 1 representing the equality or inequality of income distribution in an economy; 0 is theoretical absolute equality, and 1 is one person having everything and everyone going without. In practice, it varies from about 0.2 to about 0.7.
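The definition above translates directly into a calculation. Here's a minimal sketch in Python, using the standard rank-based formula over sorted incomes (the `gini` function and the sample figures are mine, for illustration):

```python
def gini(incomes):
    """Gini coefficient of a list of non-negative incomes.

    Returns 0 for perfect equality; approaches 1 as a single person
    takes everything (exactly (n - 1) / n for a sample of size n).
    Uses the standard formula over incomes sorted ascending:
        G = (2 * sum(i * x_i)) / (n * sum(x)) - (n + 1) / n
    with 1-based rank i.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    rank_weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2 * rank_weighted / (n * total) - (n + 1) / n

print(gini([100, 100, 100, 100]))  # perfect equality: 0.0
print(gini([0, 0, 0, 400]))        # one person has everything: 0.75
```

Note that the second figure is 0.75 rather than 1 only because the sample has four people; with a whole country's worth of households, "one person has everything" comes out arbitrarily close to 1.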
By this measure, Europe ranges from the mid-.20s to the high .30s, with a few outliers in the low .40s. At the most egalitarian end, unsurprisingly, are the Jante states of Denmark and Sweden, as well as Iceland (perhaps surprisingly, if it's meant to have been an experiment in cut-throat neoliberalism). Things get less equitable moving into Norway, Finland, France, Germany and Switzerland (which stays under .28, despite being home to a lot of the global super-rich), and then on to Italy, Spain, Britain and Ireland, and beyond that, Poland and Lithuania. The most unequal country in Europe is Turkey, which has a Gini coefficient of 0.436, somewhere between Guyana and Nigeria, or, if you prefer, Delaware and Hawaii.
The United States is, unsurprisingly, a lot less egalitarian in income than Europe. American states' Gini coefficients range from 0.41 (the solidly Mormon state of Utah, whose state emblem is the beehive, has a Gini coefficient equivalent to Russia's) to a whopping 0.537 in the District of Columbia (comparable to Honduras). Other states are twinned with parts of the developing world; Alabama and Mississippi are most like Nepal, California has the income distribution of Rwanda, and New York, barely under the .5 mark, is twinned with Costa Rica. According to the article, this is an astonishing state of affairs for a developed country:
According to the CIA World Factbook (table compiled here), the lowest Gini score in the world is Sweden's, at .23, followed by Denmark and Slovenia at .24. The next 20 countries are all in either Western Europe or the former Communist bloc of Eastern Europe. The EU as a whole is at .307. Russia has the highest number in Europe (.41); Portugal is the highest in Western Europe (.38). Japan is at .381; Australia is .352; Canada is .321.
And then there is the United States, sandwiched between Cote d'Ivoire and Uruguay at .450. Not counting Hong Kong (.523), the US is a complete loner among developed countries. In fact, as you can see from the map above, there is no overlap between any single US state and any other developed country; no state is within the normal range of income distribution in the rest of the developed world. Here's a list of the states with their Gini index numbers, and the country where income distribution is most comparable in parentheses:

Other interesting maps on the site include a map of religious nonbelief in the UK (which points out that Scotland and Northern Ireland are the most religious, and asks whether that correlates to the Scots-Irish roots of the US "Bible belt"), of antidepressant use in England and Wales (summary: it's grim up north, and in Cornwall too; either that or Londoners prefer a line of coke), and one suggesting that, as global warming advances, Australia is ecologically fux0red.
In Britain, the government is making plans to let artists and community groups take over shops hollowed out by the recession, to sow the seeds of Berlin-style regeneration (which, for all its lack of respect for the sanctity of property rights, is a lot nicer than the alternative, urban wasteland):
Planning rules will be relaxed to allow changes of use which go against local guidelines. For example, a disused clothes shop could become an art gallery or an empty Woolworths an NHS drop-in centre.
Temporary lease agreements will enable owners who want to retain a vacant property in the long term to make it available for community or creative use during the recession. Councils will be urged to take control of empty properties until the recession ends.
"Empty shops can be eyesores or crime magnets," Blears said. "Our ideas for reviving town centres will give communities the knowhow to temporarily transform vacant premises into something innovative for the community - a social enterprise, a showroom for local artists or an information centre - and stop the high street being boarded up."

Of course, as always, the devil is in the details. What exactly "relaxation of planning rules" involves is uncertain. As long as the shopfronts are used for community centres or art spaces and not, say, cut-rate toxic-waste processing facilities or something, that's a good idea.
Not all artists and activists are waiting for Her Majesty's Government to hand them the keys to a disused Woolworths, though; some have taken matters into their own hands:
The slack space movement has echoes in previous slumps when many now successful architects, magazine publishers and artists moved into vacant premises. There is certainly room for creativity again. One in six shops will be vacant by the end of the year, according to the data company Experian. It predicts that 72,000 retail outlets could close during 2009, more than doubling the number of empty units to 135,000 in the UK.

Of course, some artists still haven't shaken off the language of Thatcherism-Blairism, and talk not of "community spaces" but of "business development". Art, you see, is a means to an economic end, and, even immediately after the recessionary shock, in Anglocapitalist cultures, there is the assumption that artists and squatters' role is merely that of the microbes in the soil of commerce, to prepare the ground for the next wave of aspirational consumerism, and hopefully make a few quid at the end of it:
"Rather than letting lots of pound shops appear, we are encouraging people to start up businesses," said Firmin. "We know recessions are awful but can be a good time for artists as creative ideas start appearing while otherwise redundant people are sitting at home fiddling and doing creative stuff."

And here is a profile of various groups of artist-squatters, including the Da! Collective, notorious for outraging the tabloids by having the temerity to move into a disused mansion, rather than a warehouse or something more appropriate; not to mention a chronology of the history of squatting in Britain (and Europe).
Via Momus, who's, understandably, over the moon about this, hailing it as a triumph for the Berlin model (which, for a while, looked like it was going to be ground under the wheels of yuppification):
Since it's a global recession, I also like to think Berlin has now become a sort of template for cities all over the world. Whereas we might once have looked like a museum of crusty subcultures past their sell-by date, this city now looks like the future of Tokyo, the future of London, and the future of New York. We're your best-case scenario, guys, your optimal recessionary outcome. Everything else is dystopia, Escape-From-New-York stuff.
If the major cities of the world all become "Berlins", though, I can't guarantee I'd stay in the actual Berlin, the black flagship, the Big Squat itself. If Tokyo, for instance, got as cheap and cheerfully creative as Berlin -- if it became the kind of city you could simply occupy without having to scuttle around pointlessly making rent -- I'd be there in a flash. Secretly, what I'm doing here in Berlin is waiting for Tokyo to Berlinify.
Something to do this Valentine's Day in London: anarchist speed dating, organised by hardcore anarchist group Class War, no less:
Proceedings at the Cross Kings - a boozer in an as-yet un-gentrified corner of north London - will be overseen by a dominatrix known as Miss Scarlett L'amour. There will be no segregation by gender, out of respect to the lesbian, gay, bisexual and transgender attendees.
Anarchist literature will be on sale all through the anti-Valentine's evening. And entertainment will be provided by punk bands with such names to set the heart a-flutter as Active Slaughter and Headjam.

The BBC piece, of course, starts with the "Anarchist speed dating? What a laugh." angle; after all, in the eyes of the tabloid-reading public, anarchists are nihilistic thugs who like smashing stuff, and the idea of them engaging with romantic love is a piece of wacky juxtaposition. The piece is, to its credit, quite sympathetic to the anarchist position, and goes beyond the stereotype:
He doesn't mind if that somebody is "a boy or a girl, as long as they're pretty". And he doesn't find anything strange about agitators being equally committed to bringing down the system and finding companionship.
"From a distance, when you think of anarchists you think of big boots and fighting with policemen," he says. "But all the ones I've met have been very nice, very committed people. They believe in something and they want to find love, just like everyone else. Why would that surprise anyone?"
A Europe-wide study has shown that, of all the countries in Europe, Britain has the lowest levels of "trust and belonging" amongst under-50s.
The ESS tries to measure trust and belonging by comparing answers to questions such as these:
- Generally speaking, would you say that most people can be trusted, or that you can't be too careful in dealing with people?
- Do you think that most people would try to take advantage of you if they got the chance, or would they try to be fair?
- Would you say that most of the time people try to be helpful or that they are mostly looking out for themselves?
The researchers suggest that our low "trust and belonging" score may be "the result of the development of a highly individualistic culture in the UK". Basically, the suggestion is that we are in danger of becoming the most selfish nation in Europe.

That's one explanation, that the low level of trust is symptomatic of Thatcherite-Blairite Hobbesian anglocapitalist values, where man is expected to be wolf to man (after all, were this not the case, that would be grossly inefficient and uncompetitive). Other factors could include greater geographical mobility (a society of immigrants, expatriates and the global superrich would be less cohesive than a tightly-knit local society). Interestingly enough, the countries with the highest level of trust and belonging in Europe appear to be Denmark and Norway (Sweden, it seems, has pulled away towards high-Gini competitive individualism, undoubtedly buoyed by the success of Ikea, H&M and a dozen supercool indie-folk and fashion-electro bands); could this be a reflection of the vaunted Scandinavian egalitarianism and/or the internalised repression of individualism of the Jante Law, or just of more homogeneous societies?
Having seen their previous bets turned into a spectacular wipeout, some hedge-fund managers are now betting on the total collapse of civilisation:
In his book Wealth, War and Wisdom, published last year, former Morgan Stanley chief global strategist Barton Biggs advised people to prepare for the possibility of a total breakdown of civil society. A senior analyst whose reports are read at hedge funds all over the city wrote just before Christmas that some of his clients are “so bearish they’ve purchased firearms and safes and are stocking their pantries with soups and canned foods.” This fear is very much reflected in the market—prices of corporate bonds have been so beaten down at various points that they suggest a higher default rate than during the Great Depression. Meanwhile, while the overall gold market has fluctuated, the premium for quarter-ounce gold coins—meaning the difference between the price for gold you can hold in your hand and that for “paper gold,” such as exchange-traded funds—rose to an all-time high of 20 percent. “Gold is transportable, it’s 100 percent liquid, and it’s perfectly divisible in the context of ounces, bars, or coins,” says the head of a California research firm who keeps a supply of it, along with food, water, and guns, on hand. “And most important, there’s no counterparty”—i.e., it’s an investment beholden to no one, and perhaps one of the few assets that will retain value if the financial system collapses.
While it may look like these Wall Streeters are betting on such a collapse, their embrace of survivalism is an outgrowth of their professional habits of mind: Having observed the economy’s shaky high-wire act from their ringside seats, they are trying to manage their risk and “hedge” against a potential fall. “It’s like insurance,” says an investor who has stockpiled MREs and a hand-cranked radio. “And by the time you need it, it’s way too late.” Leave it for others to weep for the collapse of the social order. These guys would prefer to be in a high-speed boat or ex-military vehicle, heading off toward their fully provisioned compounds in pursuit of the ultimate goal: to win the chaos.

I wonder how many of these Wall Street supersharks would be able to translate their killer instincts to being effective post-apocalyptic warlords, or whether most of them would end up as mincemeat when the first killer caravan rolls through their survivalist homestead. I also wonder how long until "Post-Apocalyptic Warlordship for Dummies" replaces "Flipping Houses for Dummies" in the Business sections of bookshops.
(via Boing Boing)
In Japan, it is possible to pay to spend time with a cat:
Lola - or Rora, to give her a slightly more Japanese pronunciation - is a beauty and she knows it. Customers pay by the hour for her company. Usually they just want to stroke her, but as a special treat for favoured clients, she will lie back in a chair, close her eyes and pose for photographs.
Lola is a Persian cat who works at the Ja La La Cafe in Tokyo's bustling Akihabara district. It is one of a growing number of Cat Cafes in the city which provide visitors with short but intimate encounters with professional pets.
It costs about £8 ($10) an hour to spend time in a Cat Cafe. If felines do not appeal, other establishments will rent you a rabbit, a ferret or even a beetle. There are more than 150 companies in Tokyo which are licensed to hire out animals of various kinds, and although beetles may be cheap, dogs are much more popular.

This is, of course, not the only Japanese institution of this type. There are the obvious ones, such as hostess bars and maid cafés, which allow men with more cash than time or social skills to rent the company and flattery (and occasionally other services) of attractive women. On a less salacious note, there are apparently also parks in Japan where salarymen suffering from alienation from nature can pay to rake leaves.
Germany's Potsdam University is offering its computer science master's degree students a new subject: a practical course in flirting skills, ostensibly designed to improve students' social skills and ability to operate in the real world:
The 440 students enrolled in the master's degree course will learn how to write flirtatious text messages and emails, impress people at parties and cope with rejection.
Philip von Senftleben, an author and radio presenter who will teach the course, summed up his job as teaching how to "get someone else's heart beating fast while yours stays calm."

(Ah, those Germans: they even make the sport of love sound like a duelling society...)
Of course, the idea of flirting courses for compulsively-systematising geeks is not a new one, though they have usually consisted of sneaky hacks for getting laid, typically boiling down to embedding subliminal messages in one's speech, surreptitiously pointing to one's crotch and going to bars wearing ridiculously flamboyant boots and a LED belt buckle to get attention. It's not clear whether the Potsdam course will follow in this vein.
Atheist bus ads roll out across the UK. Having raised £135,000, the Atheist Bus Campaign has broadened its scope considerably, and rather than the original 30 bus ads in London, they're rolling out 800 across the UK, with interior ads giving quotes from famous atheists including Douglas Adams, Albert Einstein and Emily Dickinson.
A similar campaign has run in the USA, not a country typically associated with atheism. However, when the local atheist society tried to run one in Australia (not usually a religiously strident place), the advertising company knocked them back, considering promotion of Godlessness (with the scandalous slogan "Celebrate reason", no less!) to be a bit too much for Australian mores.
There's an essay in the Graun about the fall from grace of the idea of kindness. Far from being a virtue to be embraced, conspicuously kind behaviour is now seen as a bad thing; it signifies either that the individual is a loser of low status (and best avoided; loserdom's contagious, you know, and you don't want to associate with losers), or that they are up to something predatory or dishonest. Even if they're not, they're being kind for some sort of psychological payoff, and thus are really just as selfish and corrupt as the rest of us grasping predators in the urban jungle:
Kindness - not sexuality, not violence, not money - has become our forbidden pleasure. In one sense kindness is always hazardous because it is based on a susceptibility to others, a capacity to identify with their pleasures and sufferings. Putting oneself in someone else's shoes, as the saying goes, can be very uncomfortable. But if the pleasures of kindness - like all the greatest human pleasures - are inherently perilous, they are none the less some of the most satisfying we possess.
All compassion is self-pity, DH Lawrence remarked, and this usefully formulates the widespread modern suspicion of kindness: that it is either a higher form of selfishness (the kind that is morally triumphant and secretly exploitative) or the lowest form of weakness (kindness is the way the weak control the strong, the kind are kind only because they haven't got the guts to be anything else). If we think of humans as essentially competitive, and therefore triumphalist by inclination, as we are encouraged to do, then kindness looks distinctly old-fashioned, indeed nostalgic, a vestige from a time when we could recognise ourselves in each other, and feel sympathetic because of our kindness - if such a time ever existed. And what, after all, can kindness help us win, except moral approval; or possibly not even that, in a society where "respect" for personal status has become a leading value.
Margaret Thatcher's 1979 electoral victory marked the defeat in Britain of the Beveridge/Tawney/Titmuss vision of a kindly society, while the rise of Reaganism in 80s America saw a similar erosion of welfare values there. Kindness was downgraded into a minority motivation, suitable only for parents (especially mothers), "care professionals" and assorted sandal-wearing do-gooders. The "caring, sharing" 90s proclaimed a return to community values, but this proved to be rhetorical flimflam as Thatcher and Reagan's children came of age, steeped in free-market ideology and with barely a folk memory of the mid-century welfare vision. With the 1997 triumph of New Labour in Britain, and George W Bush's election to the American presidency in 2000, competitive individualism became the ruling consensus. The taboo surrounding "dependency" became even stronger, as politicians, employers and a motley array of well-fed moralists harangued the poor and vulnerable on the virtues of self-reliance. Tony Blair called for "compassion with a hard edge" to replace the softening variety advocated by his predecessors. "The new welfare state must encourage work, not dependency," he declared, as a plague of cost-cutting managers chomped away at Britain's social services.
So kindness is not just camouflaged egoism. To this old suspicion, modern post-Freudian society has added two more: that kindness is a disguised form of sexuality, and that kindness is a disguised form of aggression - both of which again reduce kindness to a covert selfishness. Insofar as kindness is a sexual act it is seen as a seduction (I am being very nice to you so I can get to have sex and/or babies), or as a defence against the sexual event (I'll be so kind to you that you will forget about sex and we can do something else together), or as a way of repairing the supposed damage done by sex (I'll be nice to you to make up for all my harmful desires). Insofar as kindness is an aggressive act it is seen as a placation (I feel so aggressive towards you that I can only protect both of us by being very kind), or a refuge (my kindness will keep you at arm's length). "One can always, for safety, be kind," as Maggie Verver says to her father in Henry James's The Golden Bowl.

Of course, kindness is socially corrosive as it makes people less self-reliant and prevents the weak from being weeded out as Nature intended; to wit, there are formal systems for discouraging, or even punishing, acts of kindness, as described in the MetaFilter thread; for example, the old practice of passing unwanted all-day rail tickets on to strangers after using them is now a prosecutable offence in much of the UK. (Then again, in a world where business models are sacrosanct, this is just as criminal an act of theft as copying a MP3 for a friend or getting up during the ad break on TV.) It's as if the philosophy of Ayn Rand has assimilated itself into the unwritten cultural assumptions of society.
Meanwhile, if you encounter what looks like unwarranted kindness or friendliness from a stranger in a big city, your first reaction is to wonder whether they're trying to con or scam you in some fashion. Though I suspect this is not an invention of Reaganism-Thatcherism or even of Ayn Rand, but just the way that things have been in big cities full of strangers since time immemorial. While the article makes some interesting points about the social consequences of the drive for free-market competitiveness as the primary principle of social organisation, of social planning on the assumption of individuals as gold-seeking individualists, the idea that there ever was a golden age of widespread kindness does seem somewhat implausible. (A time of widespread hypocritical lip service to the virtue of kindness seems more plausible.)
Texan Cyberpunk sci-fi author turned father of the Viridian pro-green design/technology movement turned Belgrade-based design theorist Bruce Sterling gives his annual state-of-the-world address to the Inkwell forum. It's focussed mostly on the economic cataclysm in progress, and it's full of the sorts of apposite powder-dry black humour you'd expect from him:
Do we HAVE to talk about the economy this year? I'm wondering what conceivable event could overshadow the fiscal crisis. Maybe a cozy little nuclear war? An Indo-Pakistani nuclear war might conceivably take a *back page* to the fiscal crisis.
I'm a bohemian type, so I could scarcely be bothered to do anything "financially sound" in my entire adult life. Last year was the first year when I've felt genuinely sorry for responsible, well-to-do people. Suddenly they've got the precariousness of creatives, of the underclass, without that gleeful experience of decades spent living-it-up.
If the straights were not "prone to hostility" before that experience, they might well be so after it, because they've got a new host of excellent reasons. The sheer galling come-down of watching the Bottom Line, the Almighty Dollar, revealed as a papier-mache pinata. It's like somebody burned their church.

After indulging in terriblisma for a while, Sterling turns his attention to Dmitry Orlov's prediction of the US disintegrating, and ideas for a "new localism" that might arise in the event of catastrophic collapse:
In any case, after eight glum years of watching Bush and his neocons methodically wreck the Republic, both Kunstler and Robb have gotten really big on American localism -- "resilient" localism. Kunstler has this painterly, small-town-America, Thoreauvian thing going on, kinda locavore voluntary simplicity, with lots of time for... I dunno, group chorale singing. Kunstler seems kinda hung up on the singing effort, somehow... Whereas Robb has a military background and is more into a gated-community, bug-out-bag, militia rapid-response thing.
Certainly neither of these American visions look anything like what happened to Russia. As Orlov accurately points out, in the Russian collapse, if you were on a farm or in some small neighborly town, you were toast. The hustlers in the cities were the ones with inventive opportunities, so they were the ones getting by.
So the model polity for local urban resilience isn't Russia. I'm inclined to think the model there is Italy. Italy has had calamitous Bush-levels of national incompetence during almost its entire 150-year national existence.

Meanwhile, Clay Shirky gives his predictions for 2009. Whether or not we're all toast, a lot of the old media, such as newspapers, seem to be:
The great misfortune of newspapers in this era is that they were such a good idea for such a long time that people felt the newspaper business model was part of a deep truth about the world, rather than just the way things happened to be. It's like the fall of communism, where a lot of the eastern European satellite states had an easier time because there were still people alive who remembered life before the Soviet Union - nobody in Russia remembered it. Newspaper people are like Russians, in a way.
Why pay for it at all? The steady loss of advertising revenue, accelerated by the recession, has normalised the idea that it's acceptable to move to the web. Even if we have the shallowest recession and advertising comes back as it inevitably does, more of it will go to the web. I think that's it for newspapers. What we saw happen to the Christian Science Monitor [the international paper shifted its daily news operation online] is going to happen three or four dozen times (globally) in the next year. The 500-year-old accident of economics occasioned by the printing press - high upfront cost and filtering happening at the source of publication - is over. But will the New York Times still exist on paper? Of course, because people will hit the print button.

Shirky's not one for terriblisma, so not much about social collapse, cannibalism or killer caravans marauding the post-apocalyptic landscape there. For that, you'll have to read Charlie Brooker's column:
Dim your lights. Here's the highlights reel. The worst recession in 60 years. Broken windows and artless graffiti. Howling winds blowing empty cans past boarded-up shopfronts. Feral children eating sloppy handfuls of decomposed-pigeon-and-baked-bean mulch scraped from the bottom of dustbins in a desperate bid to survive. The pound worth less than the acorn. The City worth less than the pound. Your house worth so little it'll collapse out of shame, crushing you in your bed. Not that you'll die peacefully in your sleep - no, you'll be wide awake with fear, worrying about the situation in the Middle East at the precise moment a chunk of ceiling plaster the size of a flagstone tumbles from on high to flatten your skull like a biscuit under a shoe, sending your brain twizzling out of your earholes like pink-grey toothpaste squeezed from a tube. All those language skills and precious memories splattered over your pillows. It'll ruin the bedclothes. And instead of buying expensive new ones, your grieving, impoverished relatives will have to handwash those bedclothes in cold water for six hours to shift the most upsetting stains before passing them down to your orphaned offspring, who are fated to sleep on them in a disused underground station for the rest of their lives, shivering in the dark as they hear bombs dipped in bird flu dropping on the shattered remains of the desiccated city above.
(via Boing Boing)
Of all 3,141 counties in the United States, New York County is the unrivaled leader in single-individual households, at 50.6 percent. More than three-quarters of the people in them are below the age of 65. Fifty-seven percent are female. In Brooklyn, the overall number is considerably lower, at 29.5 percent, and Queens is 26.1. But on the whole, in New York City, one in three homes contains a single dweller, just one lone man or woman who flips on the coffeemaker in the morning and switches off the lights at night.
These numbers should tell an unambiguous story. They should confirm the common belief about our city, which is that New York is an isolating, coldhearted sort of place. Mark Twain called it “a splendid desert—a domed and steepled solitude, where the stranger is lonely in the midst of a million of his race.” (This from a man who settled in Hartford, Connecticut.) In J. D. Salinger’s 1952 short story “De Daumier-Smith’s Blue Period,” the main character observes that wishing to be alone “is the one New York prayer that rarely gets lost or delayed in channels, and in no time at all, everything I touched turned to solid loneliness.” Modern movies and art are filled with lonesome New York characters, some so familiar they’ve become their own shorthand: Travis Bickle (in Taxi Driver, calling himself “God’s lonely man”); the forlorn patrons in Nighthawks (inspired, Edward Hopper said, “by a restaurant on New York’s Greenwich Avenue”); Ratso Rizzo (“I gotta get outta here, gotta get outta here,” he kept muttering in Midnight Cowboy … and died before he could).

There are several assumptions here: the equation of living alone (outside of a stable nuclear family) with loneliness and psychological toll is one of them. Another one is the great American myth about small-town values, one we see trotted out (often by people on the right of culture-war politics) time and time again.
In American lore, the small town is the archetypal community, a state of grace from which city dwellers have fallen (thus capitulating to all sorts of political ills like, say, socialism). Even among die-hard New Yorkers, those who could hardly imagine a life anywhere else, you’ll find people who secretly harbor nostalgia for the small village they’ve never known.

One problem with "small-town values" is that the phrase is often a dog whistle for a certain brand of reactionary intolerance; a strong in-group-vs.-out-group distinction, knee-jerk traditionalism, bigotry and petty authoritarianism, only painted in folksy Thomas Kinkade colours. One example of this that came around not so long ago was failed US vice-presidential candidate Sarah Palin approvingly quoting a fascist newspaper columnist's praise of "small-town America". If small towns stand as a symbol of intolerance and conformism, all things urban could be said to represent the opposite, cosmopolitanism and liberalism.
Cities, in other words, are the ultimate expression of our humanity, the ultimate habitat in which to be ourselves (which may explain why half the planet’s population currently lives in them). And in their present American incarnations—safe, family-friendly, pulsing with life on the street—they’re working at their optimum peak. In Cacioppo’s data, today’s city dwellers consistently rate as less lonely than their country cousins. “There’s a new sense of community in cities, an increase in social capital, an increase in trust,” he says. “It all leads to less alienation.”
Cacioppo and Patrick cite a range of studies showing that students in classes with the best rapport imitate each other’s body language; same goes for athletes on winning teams. The presence of other human beings puts a natural limit on how freakily we can behave. And where better to find them than in cities, where we have more ties? (Think about the sociopathic kids who shot other kids in Red Lake, Minnesota; at Northern Illinois University; at Virginia Tech—what do they have in common? They were living in isolated places.) Robert Sampson, paraphrasing Durkheim, puts it this way: “The tie itself provides health benefits. That’s where I started with my work on crime.”
In any case, recent research has revealed that the equation of living alone with loneliness does not hold; for one, what sociologists call "weak ties" are at least as important to psychological wellbeing as more intimate connections, and cities full of singletons are swarming with potential weak ties (and often stronger ones as well):
“In our data,” adds Lisa Berkman, the Harvard epidemiologist who discovered the importance of social networks to heart patients, “friends substitute perfectly well for family.” This finding is important. It may be true that marriage prolongs life. But so, in Berkman’s view, does friendship—and considering how important friendship is to New Yorkers (home of Friends, after all), where so many of us live on our own, this finding is blissfully reassuring. In fact, Berkman has consistently found that living alone poses no health risk, whether she’s looking at 20,000 gas and electricity workers in France or a random sample of almost 7,000 men and women in Alameda, California, so long as her subjects have intimate ties of some kind as well as a variety of weaker ones. Those who are married but don’t have any civic ties or close friends or relatives, for instance, face greater health risks than those who live alone but have lots of friends and regularly volunteer at the local soup kitchen. “Any one connection doesn’t really protect you,” she says. “You need relationships that provide love and intimacy and you need relationships that help you feel like you’re participating in society in some way.”
In fact, many Internet and city behaviors we consider antisocial have social consequences. Think of people who lug their laptops into public settings. In 2004, Hampton and his colleagues looked at just those people—at Starbucks, in fact, in Seattle and Boston—and concluded that a full third of them were basically using their laptops and interacting at the same time. (Cafés, in other words, were like dog runs, and laptops were like pugs, encouraging interaction among solitaries.) Hampton did a similar study of laptop users in Bryant Park, and the same proportion, or one-third, reported meeting someone they hadn’t before. Fifteen percent of them kept in touch with that person over time (meaning that about 5 percent made lasting ties out of a trip to Bryant Park with a laptop).
Conversely, married people—women especially—have smaller friendship-based social networks than they did as single people, according to Claude Fischer. In a recent phone conversation with the sociologist, I mentioned a related curiosity I came across in a paper about the elderly and social isolation in New York City: The neighborhoods where people were at the greatest risk, it seemed, were in neighborhoods where people seemed very married—family neighborhoods, in fact, like Borough Park and Ridgewood. “That’s not strange at all,” he says. “They’re the prime category of people to be isolated.” He explains that these people “aged in place,” as sociologists like to say, staying in the homes where they raised their own families. Then their spouses died, and so did their cohort (or it moved to a retirement community), and they’re suddenly surrounded by strange families, often of different classes or ethnic backgrounds, with whom they’re likely to have far less in common. “Unless they have children living nearby,” he says, “they’re likely to be quite isolated.”

The article concludes with the notion that the internet—another thing often pooh-poohed as alienating and antisocial—functions, in terms of facilitating weak ties, much like a city; in fact, like the ultimate city:
Think about it: Serendipitous encounters between people who know each other well, sort of well, and not at all. People of every type, and with every type of agenda, trying to meet up with others who share that same agenda. An environment that’s alive at all hours, populated by all types, and is, most of the time, pretty safe. What he was saying, really, was that New York had become the Web. Or perhaps more, even: that New York was the Web before the Web was the Web, characterized by the same free-flowing interaction, 24/7 rhythms, subgroups, and demimondes.
(via Mind Hacks)
If you live in Britain, you've probably been confronted by chuggers: those armies of attractive, bubbly young people in jackets bearing the logo of some charity or other who loiter in packs in pedestrian areas, waiting for you to pass before intercepting you, latching on like a lamprey and attempting to guilt a direct debit out of you, leaving you only two options: surrender and pay up, or be rude to a nice person and feel like a heel for it.
The charity watchdog Intelligent Giving has investigated the activities of chuggers and found almost all of them to be breaking both the law and codes of conduct, harassing shoppers and asking them to commit fraud to bolster their commissions, and is calling on the public to boycott them.
[A] survey of their tactics has found that some face-to-face fundraisers are not as good as the causes they represent. They have been caught out misleading the public about how they are paid, harassing shoppers who say they are not interested, and asking donors to lie on direct debit forms to help them meet their targets.
The watchdog also found that 15 fundraisers from nine charities broke the Institute of Fundraising's own code of conduct by refusing to back off when asked to do so. These included fundraisers for the British Red Cross and Scope.

So next time you tell a chugger to naff off (with all due politeness, of course), remember: you're not being a miser or a crabby old troll, but a responsible citizen.
The Mind Hacks blog has a summary of a paper looking at the content of another adulterated street drug, in this case, heroin. Not surprisingly, your average wrap of street smack contains a lot of adulterants; the analysis lists random medications and pharmaceutical substances, anaesthetics, dietary supplements and chelating agents for metals, as well as other street drugs including cocaine and amphetamines.
The article also looks at the economics and business practices behind the adulteration of heroin. Obviously, taking advantage of the lack of quality control and getting more money out of less actual expensive heroin is a major consideration for the dealers, but it isn't the only one:
Interestingly, the paper also notes that professional heroin cutters are expensive, charging up to $20,000 for a kilo of heroin. This is likely due to the skill and knowledge needed to select ingredients that will have certain effects, which can be different for 'smokers', 'snorters' and 'injectors'.
Additionally, some ingredients are added purely for their psychoactive effect to give a different experience and 'brand' the dope.
(via Mind Hacks)
Music journalist Jon Savage, who has recently put together a compilation of music from the gay underground of the 1960s and 70s, claims that today's popular music and pop culture is a lot less tolerant of difference and nontraditional sexual roles than it was in the bad old days:
A few years after Sylvester's triumph, explicitly gay music - Frankie Goes to Hollywood, Bronski Beat, the muscle-bound thud of high-energy dance music - was accepted into the British charts in a way that Joe Meek or the shadowy figures behind the Brothers Butch and Camp Records could never have anticipated. Twenty years on, Radio 1's breakfast show presenter is using the word "gay" as an insult.
"Lad culture has been a disaster for pop music," says Savage. "That definition of a heterosexual man - beer and football, Nick Hornby - is so restrictive. It's important that pop musicians play around with gender and sexual divergence. The fact that it's gone back to Oasis from the Rolling Stones, Mick Jagger being very camp, is just pathetic, it's a complete failure. People are scared of nonconformity in music, so this album is a less-than-fragrant reminder of a time when pop music was less sanitised than it is now."

Perhaps he has a point: the role of the rock star as a pansexual shaman of kink seems to have largely been displaced by that of a laddish alpha-male, with rock'n'roll's rebellious energy being focussed not so much at overthrowing repressive social strictures as at enforcing them and gay-bashing those who transgress. (Witness the reactionary "rebellion" of "alternative" bands like Limp Bizkit, which has more politically in common with right-wing talk radio than any sort of progressive movement.) Where it exists, it is either used as a retro cliché (think Of Montreal's glam sleaze) or in a sanitised, cartoonish form (i.e., Mormon boy band The Killers' faux-transgressivism).
Then again, one could argue that rock'n'roll was always a regressive force; Susan Sontag, for example, equated it with "aggressive normality", and in his blog, Momus (of "Tender Pervert" fame) has asserted that rock music is inherently fascist. Could the 1970s glam nexus of rock music and gender-bending be more like oil and water, less a natural symbiosis than a chance collision brought on by external pressures (in that case, opposition to the strictures of "straight" society)? With mainstream conformity eroded, in favour of a marketing-driven arms race of sexualisation, the brute berserker force of rock has no external targets to be directed against, so it lashes out against the usual targets, and the rebels become bullies?
Theory of the day: the political tone of a time is reflected in the theme of its undead-themed horror films; to be more precise, conservative periods produce zombie movies, whereas progressive periods produce vampire movies:
One answer: These gore-flecked flicks are really competing parables about class warfare. “Democrats, who want to redistribute wealth to 'Main Street,' fear the Wall Street vampires who bleed the nation dry,” Newitz argued, noting that Dracula and his ilk arose from the aristocracy. “Republicans fear a revolt of the poor and disenfranchised, dressed in rags and coming to the White House to eat their brains.”

Whilst that could be reading too much into it, zombie films can be equated with leftist critiques of conservative societies: George Romero's original films are widely regarded as critiques of post-war American consumerism, while other films make the connection even more explicit (the British zombie film Dead Creatures, for example, is essentially a Ken Loach film with zombies). Not sure what Shaun Of The Dead would be, though; Blairism, perhaps?
The latest in the line of "Stuff (group) Like" blogs is Stuff Geeks Love, which mentions things like "zombies", "Libertarianism", "cancelled TV shows" and "sex", and shines a revealing yet harsh light on stereotypical "geek" obsessions:
Comic book geeks are especially prone to faux boycotts. Every week hundreds of comic book fans declare that, because of some perceived outrage, they will never buy anything from DC or Marvel again. And the following week they proceed to do so because otherwise their runs on titles will be incomplete and because what else are they supposed to do? They’ve been reading X-Men since they were nine and aren’t going to stop now! Within weeks of the “true fan” declaring that he’ll never buy another Marvel comic again he’ll proudly declare victory for Marvel when an issue of their current “event” comic sells a few dozen more issues than an issue of DC’s current “event” comic.
It’s not surprising geeks have affection for zombies; these creatures are arrested in their existence, unable to change or grow. Geeks feel a oneness with them. And although zombies are frightening to look at, they don’t seem on the surface to be a serious threat, but their numbers and sheer tenacity make them possibly the most sinister killers of all. This is another thing geeks like to think they feel a oneness with; the underestimated lethal threat. Also, zombies desire, above all else, brains.
Geeks enjoy being Libertarians for two reasons. First, it allows them to be Conservative without having to belong to one of the two mainstream parties that the regular sheep are part of. Second, it gives them a political party that is just as self-absorbed as they are. Conservatives don’t care if you think they’re selfish pricks. Libertarians wonder why you don’t admire them for it.
Having a show canceled also has another upside for the geek. If it’s no longer in production, all those meddling writers, producers, actors, and studios can’t “mess it up” for him by having things happen on the show that blatantly contradict the obvious “right way” things would happen, were the geek in charge. It saves him the later trouble of having to declare he’s going to boycott the show (he won’t) because someone on the show did something that was “totally out of character”. It puts the show into a little snowglobe the geek can cradle and protect from the cruel outside world. The geek and his friends now own and control it and it is finally where it belongs, in the hands of the “true fans”.
A few articles about class in Britain today: The Guardian has one looking at what, if anything, constitutes class in neo-Thatcherite Britain:
The idea of class as an expression of wealth was always a misconception. Our modern obsession with its outward manifestations was entrenched at a time when wealth was at its most even level of distribution in Britain, in the immediate post-Second World War era. It was people's relative proximity in money terms that led them to find alternative ways of distinguishing themselves from their neighbours. Before the war, most people knew their place (and outside the political left, accepted it). But an acceleration of social mobility in the Forties and Fifties led to a boom in petty snobbery. It was the era of the 'u' and 'non-u' distinctions notoriously codified by Nancy Mitford. As the solidarity of the war years receded, but austerity kept people's incomes relatively homogenous, it became almost existentially important whether you said 'napkin' or 'serviette'; 'toilet' or 'loo'; 'how d'you do?' or 'pleased to meet you.'
Professor Dorling has a graph that shows what a purely wealth-based class structure looks like. There is a big chunk in the middle - 50 per cent of the population who qualify as 'normal'. Beneath them is a 15 per cent chunk of 'poor' people, and another 10 per cent who are 'very poor'. There is a sizeable chunk - 20 per cent - near the top who are rich. But the remaining five per cent are stratified into ever smaller distinctions of extreme wealth. These are people who, in Dorling's phrase, 'exclude themselves from the norms of society' - the footballers, pop stars, Russian oligarchs, oil sheikhs, hedge-fund managers. 'The top-level people all meet each other,' says Dorling, 'and the thing they have in common is money.'
But there is no agreement on where the boundaries of 'chavdom' begin and end. It is an extraordinarily polyvalent word, which can be used as a slur against the urban poor and the suburban rich. It is a weapon in a petty civil war waged almost entirely within the swollen ranks of the middle class, often between people of equivalent incomes, in houses of equal value.

And The Times has a piece by (half-Asian, though very "middle-class") journalist India Knight, who writes from personal experience about the assumptions Britons have about race and class:
The class/race issue confuses many. I’ve had people pretend I was white since I was a child, despite the evidence of their own eyes. I am café au lait: this means I’ve been asked if I was Spanish, Italian, Greek, Turkish, South American. I don’t think anyone would have asked me if my family ran a corner shop and I had an Indian accent or wore a sari (although it’s always fun to stick one on: if people have only ever seen you in heels and dresses, you can see their bewilderment). I don’t think anyone genuinely wonders if I am Spanish; I think my middle-classness automatically “promotes” me to being manageably European, rather than problematically “foreign”.
This week has seen not one but two demonstrations of the power of internet-enabled bottom-up organisation. Firstly, Barack Obama's campaign capped its outflanking of rivals (both Democratic and Republican) by making him the first post-broadcast-age president, and now Rick Astley has won MTV's Best Act Ever award, largely because of his timeless songwriting (read: the lulz factor).
A code of practice published by the British government reminds owners of dogs and cats to ensure that their pets are not only fed properly but provided with adequate entertainment and mental stimulation. Rumours that the government will distribute laser pointers to all cat owners to assist in this could not be confirmed.
In his WIRED column, Bruce Schneier puts forward a new model for understanding why people become terrorists. The conventional model, that they do it to achieve political aims or address grievances, doesn't adequately describe the real world, in which terrorist groups have vague or changing goals and eschew actions more likely to actually achieve those goals, and the actual terrorists often adopt and change ideologies and targets at the drop of a hat. Instead, being a terrorist is not about changing the world, but rather about being part of a community:
The evidence supports this. Individual terrorists often have no prior involvement with a group's political agenda, and often join multiple terrorist groups with incompatible platforms. Individuals who join terrorist groups are frequently not oppressed in any way, and often can't describe the political goals of their organizations. People who join terrorist groups most often have friends or relatives who are members of the group, and the great majority of terrorists are socially isolated: unmarried young men or widowed women who weren't working prior to joining. These things are true for members of terrorist groups as diverse as the IRA and al-Qaida.
For example, several of the 9/11 hijackers planned to fight in Chechnya, but they didn't have the right paperwork so they attacked America instead. The mujahedeen had no idea whom they would attack after the Soviets withdrew from Afghanistan, so they sat around until they came up with a new enemy: America. Pakistani terrorists regularly defect to another terrorist group with a totally different political platform. Many new al-Qaida members say, unconvincingly, that they decided to become a jihadist after reading an extreme, anti-American blog, or after converting to Islam, sometimes just a few weeks before. These people know little about politics or Islam, and they frankly don't even seem to care much about learning more. The blogs they turn to don't have a lot of substance in these areas, even though more informative blogs do exist.
All of this explains the seven habits. It's not that they're ineffective; it's that they have a different goal. They might not be effective politically, but they are effective socially: They all help preserve the group's existence and cohesion.

The implications of this theory are that terrorist groups are the emergent product of mass social alienation, which suggests a solution to terrorism: give everyone internet access and multiplayer online games. Which would mean that those drawn to malignant, tightly-knit social groups would merely become trolls and griefers rather than actual real-world terrorists.
Legal doctrine of the day: the Three-Pony Rule, used in determining when child support claims are excessive:
While acknowledging there are unique problems with determining the reasonable needs of children of high-earning families, the court said trial judges should nevertheless avoid overindulgence -- citing the doctrine of In re Patterson, 920 P.2d 450 (Kan. App. 1996), that "no child, no matter how wealthy the parents, needs to be provided [with] more than three ponies."
But the appeals court said Convery failed to make a detailed examination of Jean Strahan's child support request and instead merely accepted her recitation of the children's needs. Those "needs," wrote Appellate Division Judge Lorraine Parker, included the children giving their nanny a 10-day vacation in Jamaica; diamond jewelry for their grandmother; $30,000 yearly for landscaping expenses; $36,000 a year for "equipment and furnishings"; and $3,000 yearly for audio visual equipment. Jean set their clothing needs at $27,000 a year, since the children needed new outfits every time they saw their father and one of them demanded a new purse every time she left the house.
"[T]he court made no distinction between what needs were reasonable, given the age of the children, and what simply amounted to a 'fourth pony,'" wrote Parker, who was joined by Judges Rudy Coleman and Thomas Lyons.
The Dutch authorities have trained about 200 inspectors to catch people illegally smoking tobacco in marijuana cafes. Smoking tobacco indoors has just been outlawed in the Netherlands, as it has elsewhere, though smoking marijuana in Amsterdam's famous "coffee shops" is allowed, as long as the weed isn't diluted with tobacco.
A Swedish school confiscated birthday party invitations handed out by an 8-year-old pupil because he failed to invite two of his classmates, violating their rights, and possibly the Jante Law as well:
The school, in Lund, southern Sweden, argues that if invitations are handed out on school premises then it must ensure there is no discrimination.
He says the two children were left out because one did not invite his son to his own party and he had fallen out with the other one.

The boy's father lodged a complaint with the parliamentary ombudsman. A verdict is expected in September.
The New York Times has a piece on the traditional Albanian institution of sworn virgins, whereby women could swear an oath of lifelong virginity and assume the role of men, dressing and behaving like men and wielding male authority. Some did this to escape undesired arranged marriages (or, presumably, the restricted world of female gender roles in general), while others did so out of obligation to provide a "male" head of their family or avenge the family honour:
“Back then, it was better to be a man because before a woman and an animal were considered the same thing,” said Ms. Keqi, who has a bellowing baritone voice, sits with her legs open wide like a man and relishes downing shots of raki. “Now, Albanian women have equal rights with men, and are even more powerful. I think today it would be fun to be a woman.”
The sworn virgin was born of social necessity in an agrarian region plagued by war and death. If the family patriarch died with no male heirs, unmarried women in the family could find themselves alone and powerless. By taking an oath of virginity, women could take on the role of men as head of the family, carry a weapon, own property and move freely.

While Albanian culture formally codified a way for women to assume male gender roles, it would not surprise me if other traditional societies had their share of women who, disaffected with their prospects, secretly pursued lives in a male disguise.
Kevin Kelly (one of the original WIRED contributors) and Brian Eno (no introduction needed) have a game where they try to come up with improbable trends for the near future and extrapolate them. While some of them are (at least nowadays) somewhat lacking in the "improbable" aspect (computer power plateauing has been predicted for a while, people are avoiding American citizenship for tax reasons, São Paulo in Brazil has already banned billboard advertising, a deadly airborne plague has been feared since SARS and bird flu, and there are predictions that the end of cheap oil will enrich inner cities whilst turning formerly affluent suburbs into impoverished backwaters), others (particularly some of Eno's) are thought-provokingly out-there:
Everybody becomes so completely cynical about the election process that voter turnout drops to 2 percent (families and relatives of prospective politicians) until finally the "democratic process" is abandoned in favour of a lottery system. Everything immediately improves.
Suicide becomes not only commonplace but socially acceptable and even encouraged. People choose when to die: living too long is considered selfish and old-fashioned.
A new profession -- cosmetic psychiatry -- is born. People visit "plastic psychiatrists" to get interesting neuroses and obsessions added into their makeup.
A new kind of holiday becomes popular: you are dropped by helicopter in an unknown place, with two weeks' supply of food and water. You are assured that you will not see anyone else in this time. There is a panic button just in case.
A highly successful new magazine -- Ordinary People, edited by the nonagenarian Studs Terkel -- focuses only on people who have never done anything in particular to deserve attention.
A new type of artist arises: someone whose task is to gather together existing but overlooked pieces of amateur art, and, by directing attention onto them, to make them important. (This is part of a much larger theory of mine about the new role of curatorship, the big job of the next century.)
Manufacturers of underwear finally realize that men have different-sized balls.
(via Boing Boing)
According to a new book, Americans are increasingly segregating themselves from people with different values or political views, mostly along the liberal-conservative culture-war faultlines:
In 1976 Jimmy Carter won the presidency with 50.1% of the popular vote. Though the race was close, some 26.8% of Americans were in “landslide counties” that year, where Mr Carter either won or lost by 20 percentage points or more.
The proportion of Americans who live in such landslide counties has nearly doubled since then. In the dead-heat election of 2000, it was 45.3%. When George Bush narrowly won re-election in 2004, it was a whopping 48.3%. As the playwright Arthur Miller put it that year: “How can the polls be neck and neck when I don't know one Bush supporter?” Clustering is how.
For example, someone who works in Washington, DC, but wants to live in a suburb can commute either from Maryland or northern Virginia. Both states have equally leafy streets and good schools. But Virginia has plenty of conservative neighbourhoods with megachurches and Bushites living on your block. In the posh suburbs of Maryland, by contrast, Republicans are as rare as unkempt lawns and yard signs proclaim that war is not the answer but Barack Obama might be.

The Big Sort manifests itself in where people live (another manifestation of Paul Graham's observation that cities reinforce different ambitions; neighbourhoods and communities also reinforce (positively or negatively) political and cultural values), but goes beyond that. Many Americans have retreated into cognitive gated communities; they watch cable news that reinforces their beliefs, meet their mates on dating websites exclusively for liberals or conservatives (the British equivalent would presumably be the Times/Guardian/Torygraph's respective dating sites; in Britain, newspaper preference is an ideological marker), and in some cases, homeschool their kids to protect them from "wrong" ideas such as evolution or homosexuality. (AFAIK, homeschooling seems to be more a religious conservative phenomenon.) According to a University of Pennsylvania survey of people from 12 countries, Americans are the least likely to talk about politics with those who disagreed with them. And then there was the survey of an online book-recommendation service from some years ago, showing clusters of books read by liberals and conservatives, with next to no connection between them.
“We now live in a giant feedback loop,” says Mr Bishop, “hearing our own thoughts about what's right and wrong bounced back to us by the television shows we watch, the newspapers and books we read, the blogs we visit online, the sermons we hear and the neighbourhoods we live in.”

The downside of this is that, when people segregate themselves, their own opinions become more extreme and uncompromising, and those of the other side become demonised, making workable political compromise difficult:
Voters in landslide districts tend to elect more extreme members of Congress. Moderates who might otherwise run for office decide not to. Debates turn into shouting matches. Bitterly partisan lawmakers cannot reach the necessary consensus to fix long-term problems such as the tottering pensions and health-care systems.
America, says Mr Bishop, is splitting into “balkanised communities whose inhabitants find other Americans to be culturally incomprehensible.” He has a point. Republicans who never meet Democrats tend to assume that Democrats believe more extreme things than they really do, and vice versa. This contributes to the nasty tone of many political campaigns.
With the continuing rise in oil prices, some are saying that the age of cheap flights is over, as airlines raise their prices and/or collapse. Think about the implications of that for a moment: historic Eastern European town centres empty of drunken Britons, speculators unable to flog second homes on the Bulgarian Riviera to Ryanair junkies from Gillingham, people actually packing onto trains to go from, say, London to Manchester (and Britain's chronically underfunded railway infrastructure creaking under the weight of the extra patronage). As for bargain shopping in New York, forget it: if you want to see New York, your best bet may be to buy an Xbox 360 and Grand Theft Auto IV. Perhaps the end of the age of cheap travel will finally usher in the Stay-At-Home Century, when tomorrow's people will range as far from their homes as their mediæval peasant ancestors, instead communicating through broadband links.
Meanwhile, General Motors is shutting down four plants that make its Hummer SUV, which for a long time embodied the ugly side of the American Dream. This is after gasoline (that's petrol to the Europeans/Australians reading this) reached $4 a gallon (incidentally, breaking the mechanical pumps at some older gas stations, whose designers never envisioned a gallon of gasoline costing more than $3.99), and dealerships are having trouble moving the hulking behemoths. Perhaps soon we will hear an old joke about Eastern Bloc cars being repurposed?
And still in America, CNN is now running articles about whether the age of the railroads has returned. (Mostly in reference to Europe and Asia as the paragons of modernity.)
The writer L.P. Hartley once stated that the past is a foreign country where they do things differently. Today's Cat And Girl revises this, arguing that, in the age of globalisation, instantaneous communication and accelerating change, our past is more foreign to our present selves than actual foreign countries are.
Paul "Hackers and Painters" Graham has an interesting essay on the implicit messages that cities send to their inhabitants:
Great cities attract ambitious people. You can sense it when you walk around one. In a hundred subtle ways, the city sends you a message: you could do more; you should try harder.
The surprising thing is how different these messages can be. New York tells you, above all: you should make more money. There are other messages too, of course. You should be hipper. You should be better looking. But the clearest message is that you should be richer.
What I like about Boston (or rather Cambridge) is that the message there is: you should be smarter. You really should get around to reading all those books you've been meaning to.
How much does it matter what message a city sends? Empirically, the answer seems to be: a lot. You might think that if you had enough strength of mind to do great things, you'd be able to transcend your environment. Where you live should make at most a couple percent difference. But if you look at the historical evidence, it seems to matter more than that. Most people who did great things were clumped together in a few places where that sort of thing was done at the time.

Graham lists the messages sent by other cities: while New York is about money and Cambridge, Massachusetts, is about knowledge, Silicon Valley is all about power, Berkeley is about being civilised and living better, while Washington DC and Los Angeles are about whom you know, although in different ways. Internationally, Oxford and Cambridge (the original one) are intellectual, though not as strongly as Cambridge, MA. Paris is about style (people care about art there, though it's not an intellectual centre). Meanwhile, London is residually about being more aristocratic (at least, when viewed from an American viewpoint), though it also places a value on "hipness" and on keeping abreast of trends.
He also speculates that this effect, by which a city accumulates a set of values and, in its outlook, encourages certain motivations (whilst implicitly discouraging ones at odds with them), was behind historical phenomena such as mediæval Florence being home to a disproportionate number of artists compared to other, equally big, cities.
You can see how powerful cities are from something I wrote about earlier: the case of the Milanese Leonardo. Practically every fifteenth century Italian painter you've heard of was from Florence, even though Milan was just as big. People in Florence weren't genetically different, so you have to assume there was someone born in Milan with as much natural ability as Leonardo. What happened to him?
If even someone with the same natural ability as Leonardo couldn't beat the force of environment, do you suppose you can?

The invisible cultural environment, manifested in a myriad of tiny things, has considerable power to nurture or thwart ambitions:
No matter how determined you are, it's hard not to be influenced by the people around you. It's not so much that you do whatever a city expects of you, but that you get discouraged when no one around you cares about the same things you do.
There's an imbalance between encouragement and discouragement like that between gaining and losing money. Most people overvalue negative amounts of money: they'll work much harder to avoid losing a dollar than to gain one. Similarly, though there are plenty of people strong enough to resist doing something just because that's what one is supposed to do where they happen to be, there are few strong enough to keep working on something no one around them cares about.
Because ambitions are to some extent incompatible and admiration is a zero-sum game, each city tends to focus on one type of ambition. The reason Cambridge is the intellectual capital is not just that there's a concentration of smart people there, but that there's nothing else people there care about more. Professors in New York and the Bay area are second class citizens—till they start hedge funds or startups respectively.
At the moment, San Francisco's message seems to be the same as Berkeley's: you should live better. But this will change if enough startups choose SF over the Valley. During the Bubble that was a predictor of failure—a self-indulgent choice, like buying expensive office furniture. Even now I'm suspicious when startups choose SF. But if enough good ones do, it stops being a self-indulgent choice, because the center of gravity of Silicon Valley will shift there.

Speaking as one who has moved from Melbourne to London, this rings true. Melbourne seems more about creativity and collaborative expression, whereas London is more about status and success. In Melbourne, people get involved in creative projects that don't have a career or strategy for success attached to them, be it making zines, playing music or other arts. In London, it's more about success; all the artists want to be Damien Hirst or Tracey Emin (or, indeed, Banksy) and all the indie bands have stylists and want to be on the cover of NME. (All of this reflects itself in the nature of art produced here; the calculatedly stylised nature of it, falling in a spectrum between naked commercialism and (equally mercantile) fashion-driven hipness.) People pay more attention to keeping score here, and that affects the way the game is played. If you tinker around on creative projects with no commercial potential, you may as well be building model train sets in your shed, or, indeed, playing World Of Warcraft. (In fact, if you did the latter, you'd probably have more of an impact on the world.) People humour you, but nobody's interested in that any more than they are in what you had for lunch. I don't know whether this was always the case in the capital, or whether it was a result of Thatcherite/Blairite entrepreneurial values being adopted by the broader population, though it is a noticeable phenomenon.
(via Architectures Of Control)
Atheism is gaining popularity in the US (by some accounts, it is now more popular than bubonic plague). Now some atheists are discussing whether or not atheists should have their own church. After all, churches (particularly in America) fulfil a social function, distinct from their religious function, as centres of communities and bring people together (which, incidentally, is the literal meaning of the word "religion"), and with recent studies pointing out the health benefits of having a good sense of community, perhaps, the argument would go, it is time for a church for the godless?
Many atheist sects are experimenting with building new, human-centered quasi-religious organizations, much like Ethical Culture. They aim to remove God from the church, while leaving the church, at least large parts of it, standing. But this impulse is fueling a growing schism among atheists. Many of them see churches as part of the problem. They want to throw out the baby and the bathwater—or at least they don’t see the need for the bathwater once the baby is gone.

There are already vaguely churchlike organisations for atheists (or those with religious (non)beliefs indistinguishable from atheism): the article mentions the Society for Ethical Culture, a 19th-century "secular cathedral", and Humanist Judaism, which maintains the traditions of the Jewish faith but jettisons the faith bit. And then there are the Unitarian Universalists and other content-free quasi-religions.
Not surprisingly, there is not only no agreement on what the new atheist creed is meant to contain, but also what it should call itself.
At this point, the movement can’t even agree on a name. Christopher Hitchens, author of God Is Not Great, prefers the term anti-theist because he’s entertained the possibility that God exists and finds the prospect frightening, the spiritual equivalent of living in North Korea. Daniel Dennett continues to promote the term bright, which, he has said, is “modeled very deliberately and very consciously on the homosexual adoption of the word gay.” (In the first chapter of God Is Not Great, Hitchens dismisses the term as conceited.) And Sam Harris, brash young scientist that he is, triggered a minor revolt last fall at the Atheist Alliance International Conference in Crystal City, Virginia, when he lashed out against the term atheist, disparaging those who identify with a negation. “It reverberated in atheist circles as a sacrilege,” Harris told me. “But what’s worse is adopting language that was placed on us by religious people. We don’t feel the need to brand ourselves non-astrologers or non-racists.”
Dennett sees value in atheism’s great awakening, in the energy and money that come from organizing, but he counsels caution. “The last thing atheists want to see is their rational set of ideas yoked up with the trappings of a religion,” he says. “We think we can do without that.” Even Richard Dawkins is not one to reject certain memes based on their churchly pedigree. He calls himself a “cultural Christian,” admitting that he likes to sing Christmas carols as much as the next guy. But there’s a limit to his tolerance of religion.

While I can understand the arguments, the idea of an atheist church seems a bit absurd. For one, atheism is a purely negative belief, by which I do not mean that it is harmful or wrong, but that it is only a statement of what one does not believe. If I tell you that someone is an atheist, I am telling you nothing about what that person actually does believe; they could be anything from a Buddhist to a Marxist to a secular humanist, to name but a few possibilities; the only thing you know is that their belief system does not include a personal supreme being. As such, atheism in itself is not much of a rock on which to found a church. Granted, one could beef it up with a range of complementary beliefs or values (such as beliefs in the beneficence of science, the innate dignity of the individual, the equality of races and sexes or the humour of Monty Python), though then it ceases to be merely atheism and becomes something else.
Besides which, I doubt whether an atheist church could be remotely successful by any standard. Without the promise of eternal salvation (or some equivalent form of supernatural brownie points), going to church becomes just another activity, competing for time with a myriad possible other activities. Do you go to the Church of No God to hear a reading from Douglas Adams and then discuss it over tea and biscuits, or do you read a book or catch up with a friend or go rollerblading or see that new exhibition you've read about? Without the all-seeing gaze of the Almighty keeping tabs on His flock (or, more precisely, the common belief in such), such a church would more often than not take second place to other activities.
In fact, the whole question of whether atheists need their own church appears, to me, to be the wrong question, particularly when attendance of mainstream churches has been declining in recent years. A better question would be how the social function that churches fulfil could be best fulfilled, in today's society, without religion. (The key phrase is "in today's society"; in a world where people move around much more than they used to, don't necessarily live amongst people who share their cultural or religious outlooks, and where communications are often mediated by increasingly powerful technology, such as mobile phones and the internet.) While these changes have led to the breakdown of traditional social structures, they are also ushering in new forms of social connection (as Clay Shirky describes in Here Comes Everybody), and it is far from clear that creating an atheist church would make any more sense than designing a new high-tech buggy whip.
It has emerged that children in Britain are posing as paedophiles online to intimidate each other.
Officers have warned parents and children to be vigilant after as many as nine youngsters in Padstow, Cornwall, were targeted through the networking sites Bebo and MSN. Police initially believed a local man was trying to groom the children by befriending them online and arranging to meet them. But a member of the public has come forward and told them that youngsters are trying to settle playground disputes by posing as a paedophile to frighten their rivals.
A spokesman for Devon and Cornwall police said: "Information from the public has highlighted a possibility that the offenders could be children aged 10 and over, masquerading as a paedophile. The investigations are continuing and at this moment we are looking into every line of inquiry and are not ruling out any possibility. However, the language used on the social networking sites such as Bebo and MSN is at times childish. It could be youngsters playing a sick game to try and intimidate friends they have fallen out with. This will be treated seriously and we will be contacting the families of the children involved and we will try and help them by involving social services."

Granted, a lot of this is the inevitable modern variant of kids trying to scare each other with imaginary serial killers/monsters/urban myths, updated for the age of paedoterror, though it wouldn't surprise me if, in these jumpy times, some 12-year-old ended up on the sex offenders' register after pulling such a stunt.
(via Boing Boing)
Clay Shirky, author of Here Comes Everybody, posits an interesting theory: that entertainment television, an arguably stupefying medium, arose in the 20th century as a temporary coping mechanism for dealing with a surplus of free time and cognitive capacity, a way for people to harmlessly manage free time they had no traditional uses for. A parallel he quotes was the explosion in consumption of gin (in those days a disreputable, highly intoxicating drink) during the mass migration from the countryside to the cities in Britain:
The transformation from rural to urban life was so sudden, and so wrenching, that the only thing society could do to manage was to drink itself into a stupor for a generation. The stories from that era are amazing-- there were gin pushcarts working their way through the streets of London.
And it wasn't until society woke up from that collective bender that we actually started to get the institutional structures that we associate with the industrial revolution today. Things like public libraries and museums, increasingly broad education for children, elected leaders--a lot of things we like--didn't happen until having all of those people together stopped seeming like a crisis and started seeming like an asset.

Television, Shirky argues, fulfils the same role. During the 20th century, a majority of the population found itself with something they didn't have before: free time. Since there was no use for this, it was more of a crisis than an opportunity, and once again, society turned to an intoxicant as a means of control:
If I had to pick the critical technology for the 20th century, the bit of social lubricant without which the wheels would've come off the whole enterprise, I'd say it was the sitcom. Starting with the Second World War a whole series of things happened--rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before--free time.
And what did we do with that free time? Well, mostly we spent it watching TV.
We did that for decades. We watched I Love Lucy. We watched Gilligan's Island. We watch Malcolm in the Middle. We watch Desperate Housewives. Desperate Housewives essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat.

Now, Shirky claims, society is figuring out ways to use surplus cognitive capacity more productively than by watching sitcoms. With the internet, people are starting to turn the television off and use their time, if not more productively, more interactively. This can take the form of amateur collective efforts such as Wikipedia or of pasting captions onto photographs of cats or playing multiplayer games. (Granted, in this early stage, even contributions to Wikipedia often are about TV shows, but this will probably pass):
And television watching? Two hundred billion hours, in the U.S. alone, every year. Put another way, now that we have a unit, that's 2,000 Wikipedia projects a year spent watching television. Or put still another way, in the U.S., we spend 100 million hours every weekend, just watching the ads. This is a pretty big surplus. People asking, "Where do they find the time?" when they're looking at things like Wikipedia don't understand how tiny that entire project is, as a carve-out of this asset that's finally being dragged into what Tim calls an architecture of participation.

This is not a passing phase, Shirky asserts, but a profound social shift; he cites as an example an anecdote illustrating that young children today are already in a post-television mindset, in which a one-directional consumeristic medium is seen as broken, rather than just as "the way things are and have always been":
I was having dinner with a group of friends about a month ago, and one of them was talking about sitting with his four-year-old daughter watching a DVD. And in the middle of the movie, apropos nothing, she jumps up off the couch and runs around behind the screen. That seems like a cute moment. Maybe she's going back there to see if Dora is really back there or whatever. But that wasn't what she was doing. She started rooting around in the cables. And her dad said, "What you doing?" And she stuck her head out from behind the screen and said, "Looking for the mouse."

Will, in a generation or two, our descendants look back on the entire 20th century as an age of stupidity and conformism, sort of like the mythical Leave-it-to-Beaver 1950s writ large? (Assuming, of course, they're not too busy avoiding starvation or fighting over the Earth's remaining oil supplies or something.)
(via Boing Boing)
This is not the Onion: The latest children's book to be making a ripple is "My Beautiful Mommy", written by Florida plastic surgeon Michael Salzhauer, and intended to help children come to terms with their mothers' plastic surgery:
"My Beautiful Mommy" is aimed at kids ages four to seven and features a plastic surgeon named Dr. Michael (a musclebound superhero type) and a girl whose mother gets a tummy tuck, a nose job and breast implants. Before her surgery the mom explains that she is getting a smaller tummy: "You see, as I got older, my body stretched and I couldn't fit into my clothes anymore. Dr. Michael is going to help fix that and make me feel better." Mom comes home looking like a slightly bruised Barbie doll with demure bandages on her nose and around her waist.
Then there are the body image issues raised by cosmetic surgery—especially for daughters. Berger worries that kids will think their own body parts must need "fixing" too. The surgery on a nose, for example, may "convey to the child that the child's nose, which always seemed OK, might be perceived by Mommy or by somebody as unacceptable," she says.
(via Boing Boing)
Research is showing that a compound found in cannabis has antipsychotic effects. The compound, cannabidiol, naturally occurs in cannabis, though it is perhaps no surprise that the high-potency varieties of "skunk" now on the market, which have been bred for maximum THC (the psychoactive compound in cannabis, which has been linked to psychosis), have less cannabidiol than older varieties.
Which, IMHO, is an argument for legalisation and regulation of cannabis. Alcohol is regulated; relatively safe varieties are easily available, and those selling liquor with ingredients considered unsafe (from poisonous methanol to excessive amounts of thujone) face prosecution. With cannabis-induced psychosis looming as a public health issue, perhaps a law restricting the ratio of THC to cannabidiol would ameliorate the crisis?
The other solution, and one infinitely more culturally appropriate for the Anglo-Saxon world, is the familiar zero-tolerance Reaganite war-on-drugs approach. Perhaps if we build more prisons, jail more users and dealers (and perhaps execute a few particularly bad apples for good measure to put the fear of God into potheads), and institute a regime of mass surveillance and appropriate abridgements to civil liberties to catch offenders, then maybe, just maybe, we'll eventually achieve a drug-free society (or the damned horse will learn to fly).
(via Mind Hacks)
One of the most striking differences between European and Asian societies is the question of individualism versus collectivism. This arguably goes beyond the question of individual rights and social obligations, and into the way people think about entities versus systems:
There is no better way to shatter someone's "we are all the same" illusion than to show pictures of a monkey, a panda and a banana to someone from Japan and someone from Britain. Ask them which two images go together. Chances are, the Japanese will pick the monkey and the banana, because they have a functional relationship: the former eats the latter. The Brit will select the panda and the monkey, because they are both mammals. As Richard Nisbett of the University of Michigan described in his 2003 book, "The Geography of Thought: How Asians and Westerners Think Differently … and Why," Westerners typically see classifications where Asians see relationships. He means "see" literally. When students in one study looked at tanks holding a large fish, a bunch of small fry and the usual aquarium plants and rocks, the Japanese later said they'd seen lots of background elements; the Americans saw the big fish.

Now a new hypothesis from evolutionary psychology suggests that these cognitive traits could have been the result of natural selection driven by disease-causing microbes, i.e., in pathogen-rich environments, tendencies towards collectivism were adaptive (i.e., you were more likely to survive), whereas where there were fewer pathogens, populations had the luxury of evolving more ruggedly individualistic tendencies:
A reluctance to interact with strangers can protect against pathogens because strangers are more likely to carry strange microbes that the group lacks immunity to, says Mark Schaller of the University of British Columbia; xenophobia keeps away strangers and their strange bugs. Respect for traditions also works: ways of preparing food (using hot pepper, say, which kills microbes), rules about hygiene and laws about marriage (wed only in-group members, whose microbes you're probably immune to) likely arose to keep pathogens at bay. "Conformity helps maintain these buffers against disease," says Corey Fincher of the University of New Mexico; mavericks are dangerous. In places with a high prevalence of pathogens, such cultural traits—which happen to be the hallmarks of societies that value the group over the individual—would be adaptive. Put another way, societies that arose in pathogen-rife regions and did not have such traits would be wiped out by disease. Societies that did have them would survive.
When the scientists examined how closely collectivism tracked the prevalence of pathogens, they found a strong correlation, they will report in Proceedings of the Royal Society B. In general, tropical regions have more pathogens, and societies there tend to be more group-oriented than those at higher latitudes. Ecuador, Panama, Pakistan, India, China and Japan are the world's most group-first societies—and historically have had the highest prevalence of natural pathogens due to their climate and topography. The most individualistic are in Northern Europe and the United States, where there have historically been fewer native pathogens. For years scientists have scratched their heads over why collectivism declines with distance from the equator, and why living in colder regions should promote individualism (you'd think polar people would want to huddle together more). The answer seems to be that equatorial regions breed more pathogens.

The research acknowledges that nurture and culture play a significant role (i.e., Asian immigrants in America soon become as individualistic as other members of their adoptive society), so any genetic bias may be a subtle one. Though when a number of individuals form a civilisation, it may only take a slight cognitive bias to change the basic cultural assumptions that evolve.
On the other hand, given that the research is of American and Canadian (i.e., Western) origin, perhaps it rests on a western, individualist cultural bias. Which doesn't necessarily invalidate it, though it makes one wonder how a Japanese or Chinese evolutionary psychologist would theorise the origins of the differences between individualist and collectivist societies.
Via Crikey, an account of an earlier Olympic torch protest, this one before the Melbourne Olympics in 1956:
With this escort around him, the runner made his way through the streets all the way to the Sydney Town Hall. He bounded up the steps and handed the torch to the waiting mayor who graciously accepted it and turned to begin his prepared speech.
Then someone whispered in the mayor’s ear, “That’s not the torch.” Suddenly the mayor realized what he was holding. Held proudly in his hand was not the majestic Olympic flame. Instead he was gripping a wooden chair leg topped by a plum pudding can inside of which a pair of kerosene-soaked underwear was burning with a greasy flame. The mayor looked around for the runner, but the man had already disappeared, melting away into the surrounding crowd.

The hoaxer was a veterinary student named Barry Larkin, who (along with eight other students from the University of Sydney) planned the prank to take the piss out of a Nazi-era tradition which they felt was being treated with too much reverence.
Surprisingly, Larkin was treated as a hero; even the rector of the University of Sydney reportedly walked up to him the following day and said "well done, son". If he faced any punishment, it is not mentioned in the article. It's hard to imagine something like this happening these days without universal condemnation from the press and criminal charges, larrikinism being best left to professionals (such as TV celebrities) who can keep it safe for all. Could 1956-era Australia have been, in some ways, less conservative than the present day?
In today's paranoid age, controlling parents have ever-increasing options for monitoring everything their children do:
The SnoopStick looks like a memory stick. You plug it into your teenager's computer when they are not around, and it installs stealth software on to the machine. Then you plug it into your own computer and can sit back at your leisure and observe, in real time, exactly what your child is doing online - what websites they are visiting, the full conversations they are having on the instant messenger (IM) service, and who they are sending emails to. It is as if you are sitting and invisibly spying over their shoulder.
Significantly, the £37.50 device comes with the warning that, if you use it to monitor an employee's computer without notifying them, you may well be in breach of employment laws. But install it secretively on the computer of your teenager, who has absolutely no rights at all, and no one can touch you. The moral argument doesn't come into it.
The following devices, please note, are not just being marketed to private detectives to catch errant spouses; they are being targeted at parents of teenagers. You can get clothes with tracking devices fitted into them. You can fit such devices covertly into mobile phones. For $149 you can purchase a mobile spy data extractor, which reads deleted text messages from a SIM card. For $79 you can buy a semen detection kit, to test your teenage daughter's clothing. And for $99, if you really want to ape the mad ex-Marine father in American Beauty, you can buy a drug identification kit which can detect up to 12 different illegal drugs.
The SnoopStick symbolises the modern obsession with control. The American psychologist Robert Epstein, who wrote the controversial book The Case Against Adolescence, estimates that young Americans are now ten times more restricted than adults, and twice as restricted as convicted criminals. He says teenagers are infantilised and deprived of human rights. As well as the obvious legal bar to prevent them smoking, drinking, marrying, voting and gambling, teenagers have no privacy rights, no property rights, no right to sign contracts or make decisions regarding their own medical or psychiatric treatment.
The government of Thailand has outlawed cosmetic castration, a surgical procedure that was becoming surprisingly popular as a cheaper, quicker alternative to sex change operations:
However, at the lower end of the market, clinics have responded to demand from teenage boys to look more like girls by posting Internet advertisements offering castration for as little as 4,000 baht ($125).

Demand from teenage boys to look more like girls? Is this driven by particularly harsh economic considerations, or is it an extreme manifestation of fashion?
Momus observes that, far from being centres of culture or creativity, districts which attract "funky" bars are merely centres of drunkenness:
I thought that being in the midst of a district dominated by theatre and retail I'd be living in a refined environment. Instead, I found I was living in a sewer. Brydges Place, of an evening, became an open toilet, used as a slash-wall of last resort by many of the thousands of people who descended on central London every evening to drink... heavily. My friend Thomi, who had a studio above John Calder's publishing house on Green's Court in Soho, had it even worse: people would stand on his step and pee right through the letterbox. Later I moved to the Chinese end of the Lower East Side just in time to see it teeter between a quietly industrious Asian district by day and a burgeoning, boisterous white people's drinking district by night.

Momus lays the blame squarely at the feet of white people:
White people -- if you'll forgive the generalisation -- drink, and the further north you go the more immoderately and self-destructively they tend to drink. Or, to put that a little differently, the whiter your district gets, the more bars are going to pop up, and the more your Friday and Saturday nights will fill up with piss, shouting, boom-boom-boom, swagger and bravado.

Momus' solution to avoiding being surrounded by vomiting revellers is simple: choose an area with a large Islamic population.
It is apparently possible to travel around England entirely by local bus, if one doesn't mind doing so at a leisurely pace. And here are the timetables for getting from Penzance to Berwick-upon-Tweed entirely on local buses; the journey takes six days.
Other than obsessive bus anoraks (of which there must be some), this may be of interest to thrifty pensioners, for whom local buses across England have just become entirely free. Though, judging by the comments, not everyone's happy with that:
These baby boomers really know how to look after themselves. Their war veteran parents over the last 20 years had to pay. Never heard them getting free national bus travel. And their kids had to get out big loans to go to University while they got full grants. The FREEBIE generation.
Misguided, that word "free"! Yes, the pensioners will get a nice free ride but everyone else will be forced to subsidise it via higher bus prices. Good PR for the government; everyone else however will suffer further price increases. The bus companies will not let us off the hook as they still have to pay for the services. Gordon Brown cheers
Richard Kendrick, Leeds
In 2002, Teresa Nielsen Hayden wrote up a taxonomy of the various forms virtually all fraud falls into, from pyramid schemes to promises of inside information to variants of classics like the "Spanish Prisoner", to the numerous "tax protest" frauds rife among the mad-as-a-rattlesnake class in the US. Anyway, amongst the illuminating commentary, there is the following insight:
A couple of days ago I finally put my finger on something I’ve been sensing but not grasping—you know, one of those itchy back-of-the-brain apprehensions that there’s a pattern here, only you can’t quite see what it is. Somehow it’s felt like literary analysis. The question is, why do these scams—inheritance cons, MLMs, tax dodges, Make Money Fast, hot stock tip swindles, et cetera—take the forms they do?
What did it was looking at my list of basic scams and observing that what they have in common is the promise of lucrative, risk-free investments. Lord knows the things exist, I thought, but nobody ever gives them away. In theory, high rates of return are the investor’s payoff for taking on higher-risk investments. Achieving that happy state of all payoff and no risk is the main reason the wealthy and powerful manipulate the system.
These scams take the forms they do because they’re parodies—no, a better way to put it: they’re cargo-cult effigies—of the deals the ruling class cut for themselves. If you’re an insider, if you have the secret, you can have a job where you make heaps of money for very little work. You can avoid paying your taxes. You can inherit a pile of money because an ancestor of yours left a moderate fortune that’s been appreciating ever since. You can be your own boss. You can have other people working for you, who have other people working for them, who all pay you a percentage of the take.

Which, when applied to get-rich-quick schemes, from scams and frauds to perfectly honest (if dumber than a sack of hammers) ideas based on visualisation, prayer, ritual or other forms of magical thinking (such as "the Secret", as found in the self-help sections of bookshops across the US), makes perfect sense. The original cargo cults consisted of Melanesian islanders who, upon witnessing American airmen arrive during World War 2 with food rations, clothing and other useful goods (whose provenance their culture had not equipped them to understand), reasoned that these goods must be boons from the gods and that, if they carried out the same rituals as the Americans (i.e., parading in handmade US Army uniforms, building makeshift runways and control towers), they would reap the same benefits. Could it not be that this magical mode of thinking is not purely the province of "primitive" cultures, but is an idiosyncrasy of the human mind's irrational pattern-matching tendencies, the same tendencies that attribute misfortune to elaborate (and unfalsifiable) conspiracies over mere chance? After all, our instincts say, there must be a man behind the curtain.
Elsewhere in the article, there is the following observation about one persistent category of frauds: the ever-thriving business of telling people that they don't really need to pay taxes, and that, for a fee, they can know the secret of how to get away with not paying it (which, unsurprisingly, seldom works):
Somewhat humorously, in several cases where the IRS has gone after promoters of “Don’t File” schemes, it was determined that the promoter—while advocating not filing returns—had been filing their returns all along. This really isn’t surprising, since most of the promoters will secretly confide that they really don’t believe these theories either, but it makes them good money.
US Department of Homeland Security convenes a group of science fiction writers, dubbed "SIGMA", to brainstorm ideas for defending the nation; writers, instead, go off on bizarre tangents:
Niven said a good way to help hospitals stem financial losses is to spread rumors in Spanish within the Latino community that emergency rooms are killing patients in order to harvest their organs for transplants.
“The problem [of hospitals going broke] is hugely exaggerated by illegal aliens who aren’t going to pay for anything anyway,” Niven said.
(via Boing Boing)
The Graun has an article on the phenomenon of fried chicken shops in Britain, tying in the class aspect (fried chicken as a signifier of underclass status), the racial and cultural dimensions and the connection with Islam:
The increasing number of halal fried chicken shops in the UK is testament to changing demographic and eating patterns. "The Muslim community here is growing," says Enam Ali, chair of the Guild of Bangladeshi Restaurateurs. "Fried chicken is cheap - [people who eat it] are young, students, with limited pocket money." Masood Khawaja, president of the Halal Food Authority, says, "A great percentage of third generation Muslims are not eating the original cuisine of their families - they want more takeaways, more convenience foods."
"Let's just grasp the nettle here," says black comic Paul Ricketts, whose stand-up observations often turn to this issue. "All black areas have loads of fried chicken outlets. It is a socio-economic thing. Chicken is one of the cheapest birds you can get. When people go on about smelly food, what they really mean is fried chicken, and they're having a dig at the people eating it - we have an era where we don't mention class any more, we just call them chavs or hoodies - it's a term for working-class scum."
At Halal Southern Fried Chicken in London's Brick Lane, they lace their hot wing batter with chilli powder, turmeric, cumin and coriander. Most customers are men in their 20s. The story is the same further down the road at Al-Badar Fried Chicken and Curry Restaurant, where their hot wings are coated in cinnamon, coriander and fresh and crushed chillies. Manager Amer Salim differentiates his product from the nearby KFC, which, he says, caters to another market. "In London's Tower Hamlets, the Bangladeshi community like spicy with more and more chilli," he says. "Fried chicken in KFC is not spicy."

It doesn't mention the iconographic idiosyncrasies of these shops, with their varyingly plausible faux-Americanisms (from "_ Fried Chicken" shops named after random US states to shops whose signage evokes images of cowboys frying chicken over campfires on the Rio Grande to the ubiquitous cartoon mascots of chickens in cowboy hats).
Concerned about its young citizens being too busy working hard to find partners, the government of Singapore (perhaps one of the most efficiently managed societies in history) has begun offering lessons in seduction. Not that type of seduction, though, of course, but something altogether more wholesome and befitting of a place described as "Disneyland with the death penalty":
Students at two polytechnics can earn two credits towards their final degree by choosing the love elective. Activities include watching romantic films, holding hands and "love song analysis".

(They need a course with credits for holding hands? Good grief. Has all spontaneity really been disciplined out of the Singaporean spirit to the point where they need to be directed on how to fall in love?)
But it is not so easy to put Singaporean youth in the mood for love. Another student who did the course, Kamal Prakash, said: "I'm not really looking for a girlfriend now as I want to concentrate on my studies."
The practice of street photography, taking spontaneous photographs in public places, is under threat, as photographers find themselves lumped in with the shadowy paedoterrorist hordes who are out to kill us all and molest our children:
In the past year, the photography blogs have buzzed with tales of harassment, even violence. There's the war photographer who dodged bullets abroad only to be beaten up in his own South London backyard by a paranoid parent who (wrongly) thought his child was being photographed. There's the amateur photographer punched prostrate in the London Tube after refusing to give up his film to a stranger; the case of the man in Hull, swooped on by police after taking photographs in a shopping centre. “Any person who appears to be taking photos in a covert manner should expect to be stopped and spoken to by police ...” ran the Humberside force's statement.
Sophie Howarth is a curator specialising in street photography. She says she's noticed - despite the difficulties - a boom for the art, enabled by technology, and with London at the centre. “In France, traditionally one of the great centres of street photography, the law now says you own the rights to your own image, so street photography's become a dead art. In London there's a growing community of photographers, using digital technology, not just cameras, but blogs, too, to document the city and give each other instant feedback.”

When did the law in France change? Was that one of Sarkozy's neo-Galambosian intellectual-property-maximalist reforms, like pushing for EU-wide copyright term extension?
“I'm not going to belittle the issue of terrorism, but this is paranoia. And unfortunately, since Lady Di and now this link with terrorists, photography's seen by many people as something that's a little ... cheap.”
The Napsterization blog (which is not about craptacular DRM-shackled music-rental services but about the social and economic implications of disruptive technologies) has a piece on the lengths to which Facebook application authors go to get people to install their applications; sleazy tricks like not only requiring users to install an application to see messages from friends, but wilfully misleading them into believing that if they don't forward a message (of a pornographic tone) to some friends, they won't get to see it. As a result, the maker of the app gets a juicy boost to its installation figures, whilst pissing all over people's social relationships and making their user experience that much crappier.
In this case, the culprit was Slide, with their popular FunWall application, though neither Slide nor Facebook will accept the blame for this:
Facebook pointed the finger at Slide (the app maker in this case), and said, "There is nothing we can do. We have no control over the apps people make or the stuff they send." Oh, and if I wanted Facebook to change the rules for apps makers? I'd have to get say, 80k of my closest Facebook friends to sign on a petition or group, and then they might look at the way they have allowed porn spam to trick people into forwarding, but until then, there would be no feature review.
Slide said that they thought Facebook was the problem, because as the "governing" body, Facebook makes the rules and "Slide wouldn't be competitive if they changed what they do, and their competitors weren't forced to as well." In other words, Slide's competitors use the same features to get more users (or trick more users as the case may be) and Slide didn't want to lose out on getting more users with similar features, regardless of the effect the features have on us and our relationships.

And things aren't likely to change by much. Human psychology being what it is, people are willing to put up with a lot of annoyance in software as long as it provides a social function. (How else could you explain MySpace, with its spammy, craptacular user experience, going from strength to strength and maintaining its position as the dominant social software site?) Some people may be genuinely amused by every piece of spam that comes in, or believe that, like billboard advertising, it brightens up people's otherwise dull lives. Others may put up with it due to the peer pressure to not seem cranky and antisocial; after all, the argument would go, that's what they do here, and if you don't like it, why did you join? (The corollary to this argument is, of course, the attrition rate as people who get sick of having three wall applications and being awash in a sea of silly surveys and chain letters stop logging in one day.)
Community activism can cut both ways; in San Diego, for example, activists are mobilising to occupy benches to keep the homeless off. The noble cause behind this: dissatisfaction by the benches' donors (a merchants' association) that they were attracting undesirables:
Esther Viti, who oversees the donation of public benches for a merchants' association in La Jolla, sent an e-mail to 45 other activists last week asking them to sit in three-hour shifts, no bathroom breaks allowed.
"After all, you MUST OCCUPY THAT BENCH continually for three hours to prevent that homeless person from sitting on that bench," the e-mail said.

Interesting how the non-judgemental phrase "homeless person" has wasted no time in soaking up the same connotations as politically-incorrect phrases like "bum" and "tramp" that it displaced.
Finnish newspaper Helsingin Sanomat has a beautifully poetic and thought-provoking article about the death of a recluse, found in his Helsinki apartment some three years after his death:
The odd invoice arrived, followed by their reminders, and then not even them.
Direct debit arrangements handled most of the bills, including the maintenance charge on the apartment.
The guy who comes to read the electricity meter didn’t ring the doorbell, because he didn’t need to: the meter is in the basement.
The man lay in the bathroom doorway.
At some point the bathroom lamp gave up the ghost, as they do, and he was left in the dark.
The makers of Stuff White People Like bring us two more slightly uncomfortable satirical glimpses into race and class in today's America: firstly, Stuff White Trash People Like (including the likes of "boxed wine", "NASCAR", and "High School Sweethearts"):
#1: America

Budweiser, fake tits, the V8, Little Debbies, the Fourth of July, all you can eat buffets, Viagra, yeah, America invented all that shit. Not enough for you? Tell you what, every other country that’s been to the moon raise your hand.

That’s what we thought.

If America’s not the best country ever, then why did Jesus invent it? See, you can’t argue with that logic.

And then there's Stuff Educated Black People Like (like "Getting Dressed Up", "Conferences", "Poetry Slams" and "Moving To Atlanta").
The Irish Independent has a piece on how social networking websites are changing relationships, and in particular, how they end and what happens afterward:
I started getting clues that I might be about to become a free man when my girlfriend's friends posted messages to her that read: "Good luck with tonight -- it's for the best."
First came the announcement online of my new 'Single' status. Deftly inserted into Facebook's running newsfeed, it informed everyone that both she and I knew that I had been dumped, in much the same way that Reuters proclaims the engagement of a minor member of the British royal family. There was no way of deleting it, so it sat there haunting me.
But then her status updates started to tell a story. Just three days after we broke up, she changed hers to: "2008, new job (check), new flat (check), new man (working on it)."
Your ex's blog may only be read by five and half people, but you still don't really want them telling complete strangers how you were unable to put the loo seat down and never really gave the choosing of shelves the attention it deserved, and how these things were symptomatic of your lack of commitment to the relationship.
It makes me think that our grandparents had an easier time. If one of their relationships went bad they could always go to sea -- or at least the next village -- and never see the other person again.

The whole issue of relationship breakups in the age of the internet recently hit the spotlight spectacularly with Wikipedia founder Jimmy Wales' breakup with his girlfriend, FoxNews journalist Rachel Marsden. Wales apparently dumped her on Wikipedia, and she retaliated by releasing transcripts of their online chats, the major upshot of which was a revelation that these lofty public figures were, scandalously, quite into having sex with each other while they were going out.
It'll be interesting to see how the standards of socially acceptable conduct evolve once it is literally impossible to dissociate oneself from an ex without becoming a hermit. Will slagging off one's exes and their failings in public blogs become taboo, or restricted to some acceptable bounds of fair play? Or will people get used to the fact that anyone in the dating marketplace probably has several scathingly negative references from their various exes? (Perhaps there is a niche for a site which aggregates exes' references, along with reputation scores for the referrers?) Will things like Rachel Marsden's release of the chat transcripts become unacceptable, the social equivalent of a nuclear first strike?
A poll has shown that fewer than a third of Americans consider nanotechnology to be morally acceptable; a considerably smaller proportion than in Europe:
In a sample of 1,015 adult Americans, only 29.5 percent of respondents agreed that nanotechnology was morally acceptable.
In European surveys that posed identical questions about nanotechnology to people in the United Kingdom and continental Europe, significantly higher percentages of people accepted the moral validity of the technology. In the United Kingdom, 54.1 percent found nanotechnology to be morally acceptable. In Germany, 62.7 percent had no moral qualms about nanotechnology, and in France 72.1 percent of survey respondents saw no problems with the technology.

The authors of the poll believe that this is not so much due to any specific moral issue concerning the making of molecule-sized materials or devices per se, but due to many Americans subscribing to a religious worldview that takes a dim view of "tampering with God's creation":
The catch for Americans with strong religious convictions, Scheufele believes, is that nanotechnology, biotechnology and stem cell research are lumped together as means to enhance human qualities. In short, researchers are viewed as "playing God" when they create materials that do not occur in nature, especially where nanotechnology and biotechnology intertwine, says Scheufele.
The moral qualms people of faith express about nanotechnology are not a question of ignorance of the technology, says Scheufele, explaining that survey respondents are well-informed about nanotechnology and its potential benefits. "They still oppose it," he says. "They are rejecting it based on religious beliefs. The issue isn't about informing these people. They are informed."

Which is somewhat ironic, if the first post-Enlightenment nation is now dominated by a steadfastly pre-Enlightenment worldview; a people at peace with technology but hostile to the scientific mindset that makes it possible. Or, in the words of one member of a Christian Fundamentalist web forum (of course):
Technology makes peoples lives easier. Technology is the product of inventive geniuses who were inspired by God. Inventions and innovations improve life.
Science causes confustion and makes things complicated. Everytime there is a new discovery the old discoveries and old wisdom are discarded! And theories get more and more complex. Science makes people confused and complicates things. Who is the author of confusion? Satan of course. The bible it the opposite of science. Biblical wisdom NEVER CHANGES, and anyone can get it. Scientific wisdom is always changing and contradicting itself, and really nobody gets it.

On a similar tangent: "Dumb and Dumber: are Americans Hostile to Knowledge?", a review of a new book claiming that anti-intellectualism is on the rise in the US.
Two Oxford University sociologists look at the question of why graduates in science, engineering and medicine are overrepresented in terrorist and extremist groups:
However, contrary to popular speculation, it's not technical skills that make engineers attractive recruits to radical groups. Rather, the authors pose the hypothesis that "engineers have a 'mindset' that makes them a particularly good match for Islamism," which becomes explosive when fused by the repression and vigorous radicalization triggered by the social conditions they endured in Islamic countries.
Whether American, Canadian or Islamic, they pointed out that a disproportionate share of engineers seem to have a mindset that makes them open to the quintessential right-wing features of "monism" (why argue where there is one best solution) and "simplism" (if only people were rational, remedies would be simple).
As overt expressions of racism become unacceptable, good ol' boys in the US South have adapted by referring to black people as "Canadians":
Last August, a blogger in Cincinnati going by the name CincyBlurg reported that a black friend from the southeastern U.S. had recently discovered that she was being called a Canadian. "She told me a story of when she was working in a shop in the South and she overheard some of her customers complaining that they were always waited on by a Canadian at that place. She didn't understand what they were talking about and assumed they must be talking about someone else," the blogger wrote.
A University of Kansas linguist said that a waitress friend reported that "fellow workers used to use a name for inner-city families that were known to not leave a tip: Canadians. ‘Hey, we have a table of Canadians.... They're all yours.' "
Stefan Dollinger, a postdoctoral fellow in linguistics at University of British Columbia and director of the university's Canadian English lab, speculated that the slur reflects a sense of Canadians as the other. "This ‘code' word, is the replacement of a no-longer tolerated label for one outsider group, with, from the U.S. view, another outsider group: Canadians. It could have been terms for Mexicans, Latinos etc. but this would have been too obvious," he said. "What's left? Right, the guys to the north."

The comments to the Boing Boing post which mentioned this are enlightening as well:
I work with an American who recently emigrated to Canada from one of the "suh-then" states. He tells me our early stance against the "war" on Iraq left a bad taste toward Canada in the rural south. Raw hate and "we should invade those b*stards and kick them out on an ice flow" rage was quite common in his semi-rural area. Using "Canadian" in this fashion would be a logical progression. They're not being ironic at all.
My friend has parents that used to use the word frequently until she married an actual Canadian. When she told them that he was Canadian they went totally ape-shit. She informed them that they were not invited to the wedding. When they found out he wasn't black (oh the relief... you should've seen it), they apologized. They're still bigots, but possibly one degree less so now.
I live in Pennsylvania, where I've heard a similar practice; many of my father's friends use the term "Democrat" instead of "Canadian" for the exact same purpose. Most of these guys are old, white Republicans, and many of them are also Freemasons.
Actually, the term "Canadian" in reference to black people has been around and in prevalent use for years, like seven or eight of them. It can't have taken that long for the mainstream to have figured that out. The new term is "German" because it was feared that black folks were catching on to the "Canadian" thing a couple of years ago.

It is not clear whether "Canadian" started off as restaurant slang for "cheapskate" (presumably because Canada, with its higher minimum wages, lacks a tipping culture like that of the US?) or was a racist euphemism all along.
(via Boing Boing)
A chap named Virgil Griffith has correlated the most popular books at every college in the US (as fetched from Facebook) with that school's average test score to find a correlation between intelligence and favourite books. According to Griffith's study, the book most correlated with high scores is Vladimir Nabokov's Lolita, and that with low scores is the Holy Bible (not to be confused with the Bible, which is around the middle, just below Harry Potter); other books correlated with high scores are Ayn Rand's Atlas Shrugged, Kurt Vonnegut's Cat's Cradle, and Freakonomics, and books dumber than "I Dont Read" include various erotica and hip-hop/ghetto fiction, The Purpose Driven Life and Fahrenheit 451. Slightly smarter than not reading are the likes of Fight Club, Dan Brown and John Grisham, along with Shakespeare's Hamlet; sci-fi, fantasy and geek/fan-interest books like The Lord of the Rings, Dune and, umm, Eragon rate more highly. (Mind you, this is correlated to test scores, not cultural well-roundedness.)
It would be interesting to see one of these correlating a measure of intelligence (such as test scores) with other factors, such as favourite music (I imagine things that a lot of geeks listen to, like metal, industrial and prog rock would come out on top, and rap-metal/nu-metal and R&B would come out fairly low), movies, or even which Facebook groups/applications one has installed.
(via Boing Boing)
With it being Australia Day, The Age has a raft of articles looking at Australian culture and identity, including one examining the idea of something being "un-Australian" (for the first time since the end of the Howard era):
Current generations might believe that to be un-Australian and its attendant "ism" were coined in the conservative 1990s, when the values debate raged and the then prime minister, John Howard, spearheaded a failed attempt to get the term mateship enshrined in the constitution. But its ancestry goes back much further. Etymologically, it began life as a literal recognition of things that were not Australian in character; the first recorded use, in 1855, described a part of the landscape similar to Britain.
Cultural commentator Hugh Mackay has argued that anything labelled un-Australian is, in fact, Australian: "Surely it's 'Australian' to do whatever Australians do. It's Australian to drink and drive, to get hopelessly into debt, lie to secure an advantage — whether political, commercial or personal — and engage in merciless and slanderous gossip. It's Australian to give vent to our xenophobia through outbreaks of racism, to reserve our nastiest prejudices for indigenous people, and to worship celebrity … It's Australian to do such things because it's human to do them."

And there's also a piece titled "I speak Aboriginal every day", about the surfeit of Aboriginal place names in Australia, most of their meanings all but forgotten to most of the people who use them:
Prahran turns out to be an Aboriginal word — a corruption of Birrarung (mist, or land surrounded by water). Dandenong was Tathenong (big mountain). Geelong comes from Tjalang (tongue). Moorabbin means "mother's milk". Looking up a single page in a street directory now to check a spelling (because I know these words better spoken than written), I find Kanooka, Kanowindra, Kanowna, Kantara, Kantiki, Kanu, Kanuka, Kanumbra, Kanyana … and on and on. Forty-five per cent of Victorian place names are Aboriginal.

I didn't know that Prahran was an Aboriginal word; if pressed, I would have guessed that it was taken from India, perhaps in honour of some earlier triumph of the British Empire (by analogy to the area near Flemington which is sometimes referred to as Travancore), or alternately came, badly mangled, from some indigenous British minority language.
The internet, with its detachment between online and offline actions and its lack of a private register, has spawned the phenomenon of griefers, or highly organised subcultures of people (mostly young men) who delight in ruining other people's online fun:
Consider the case of the Avatar class Titan, flown by the Band of Brothers Guild in the massively multiplayer deep-space EVE Online. The vessel was far bigger and far deadlier than any other in the game. Kilometers in length and well over a million metric tons unloaded, it had never once been destroyed in combat. Only a handful of player alliances had ever acquired a Titan, and this one, in particular, had cost the players who bankrolled it in-game resources worth more than $10,000.
So, naturally, Commander Sesfan Qu'lah, chief executive of the GoonFleet Corporation and leader of the greater GoonSwarm Alliance — better known outside EVE as Isaiah Houston, senior and medieval-history major at Penn State University — led a Something Awful invasion force to attack and destroy it.
"The ability to inflict that huge amount of actual, real-life damage on someone is amazingly satisfying" says Houston. "The way that you win in EVE is you basically make life so miserable for someone else that they actually quit the game and don't come back."
To see the philosophy in action, skim the pages of Something Awful or Encyclopedia Dramatica, where it seems every pocket of the Web harbors objects of ridicule. Vampire goths with MySpace pages, white supremacist bloggers, self-diagnosed Asperger's sufferers coming out to share their struggles with the online world — all these and many others have been found guilty of taking themselves seriously and condemned to crude but hilarious derision.

Griefers defend their behaviour by claiming that they're merely giving those who take the internet far too seriously a reality check. The implied subtext is that anything that happens online is just a game and doesn't count. Though, given how the internet has become a mainstream part of many people's lives (witness, for example, the rise of social networking websites), this assertion makes about as much sense as Tom Hodgkinson's call to kill your Facebook account, throw away your email address and instead socialise in the pub with people near you. It's not a great leap from asserting that anything that happens online doesn't really count to absurdly Luddite claims like "if you don't know what someone smells like, they're a stranger".
On the other hand, there is no such thing as the right to be respected, or even to not be ridiculed. If one posts a web page detailing one's peculiar political views, conspiracy theories and/or sexual fetishes online, one can expect to be laughed at and even snidely remarked about. Though there is a distinction between demolishing someone's homepage in a blog or discussion forum and actively gathering a posse and going out to hound them off the net.
Griefing happens in the real world, though it's usually called other things, such as bullying. The difference is that the internet has democratised bullying. In the real world, in more conformist societies, bullies are typically those holding, or contending for, alpha social status, enforcing an exaggerated version of majority values by picking on those perceived not to conform to them (witness the use of the word "gay", sometimes semi-euphemised as "ghey", as a general-purpose term of derision); in more liberal or pluralistic environments, even that is frowned upon. Online, anyone can find a group of like-minded misfits, make up a cool-sounding name, set up a virtual clubhouse and start picking on mutually agreed targets, with little fear of social consequences.
A bus company in Yorkshire is facing accusations of discrimination against alternative lifestyles after a Goth leading his girlfriend on a leash was stopped from boarding a bus:
"Our primary concern is passenger safety and while the couple are very welcome to travel on our buses, we are asking that Miss Maltby remove her dog lead before boarding the bus.
"It could be dangerous for the couple and other passengers if a driver had to brake sharply while Miss Maltby was wearing the lead."

Which raises the issue of when something becomes discrimination. Is there a difference between Goths (who, in this case, are presumably BDSM fetishists or Goreans or something as well; AFAIK, this sort of thing is not a fundamental part of the Goth subculture) leading each other on leashes and, say, some Muslim women covering their faces? Both behaviours are at odds with accepted social norms. If there is a difference, is it because religious justifications automatically bear more weight than non-religious ones?
While we're on the subject of multiculturalism in the UK: a children's educational CD-ROM based on the story of the Three Little Pigs has been rejected from a government agency's annual awards because it may offend Muslims.
The New York Times has an excellent piece by Steven Pinker on the human instinct for moral reasoning:
At the same time, many behaviors have been amoralized, switched from moral failings to lifestyle choices. They include divorce, illegitimacy, being a working mother, marijuana use and homosexuality. Many afflictions have been reassigned from payback for bad choices to unlucky misfortunes. There used to be people called “bums” and “tramps”; today they are “homeless.” Drug addiction is a “disease”; syphilis was rebranded from the price of wanton behavior to a “sexually transmitted disease” and more recently a “sexually transmitted infection.”
This wave of amoralization has led the cultural right to lament that morality itself is under assault, as we see in the group that anointed itself the Moral Majority. In fact there seems to be a Law of Conservation of Moralization, so that as old behaviors are taken out of the moralized column, new ones are added to it. Dozens of things that past generations treated as practical matters are now ethical battlegrounds, including disposable diapers, I.Q. tests, poultry farms, Barbie dolls and research on breast cancer. Food alone has become a minefield, with critics sermonizing about the size of sodas, the chemistry of fat, the freedom of chickens, the price of coffee beans, the species of fish and now the distance the food has traveled from farm to plate.

While what is considered a moral issue differs between cultures and societies (and, to an extent, periods of time), there appear to be five categories for moral judgment hardwired into human psychology: harm, fairness, community (or group loyalty), authority and purity.
The five spheres are good candidates for a periodic table of the moral sense not only because they are ubiquitous but also because they appear to have deep evolutionary roots. The impulse to avoid harm, which gives trolley ponderers the willies when they consider throwing a man off a bridge, can also be found in rhesus monkeys, who go hungry rather than pull a chain that delivers food to them and a shock to another monkey. Respect for authority is clearly related to the pecking orders of dominance and appeasement that are widespread in the animal kingdom. The purity-defilement contrast taps the emotion of disgust that is triggered by potential disease vectors like bodily effluvia, decaying flesh and unconventional forms of meat, and by risky sexual practices like incest.
The ranking and placement of moral spheres also divides the cultures of liberals and conservatives in the United States. Many bones of contention, like homosexuality, atheism and one-parent families from the right, or racial imbalances, sweatshops and executive pay from the left, reflect different weightings of the spheres. In a large Web survey, Haidt found that liberals put a lopsided moral weight on harm and fairness while playing down group loyalty, authority and purity. Conservatives instead place a moderately high weight on all five. It’s not surprising that each side thinks it is driven by lofty ethical values and that the other side is base and unprincipled.

And, further down:
Though wise people have long reflected on how we can be blinded by our own sanctimony, our public discourse still fails to discount it appropriately. In the worst cases, the thoughtlessness of our brute intuitions can be celebrated as a virtue. In his influential essay “The Wisdom of Repugnance,” Leon Kass, former chair of the President’s Council on Bioethics, argued that we should disregard reason when it comes to cloning and other biomedical technologies and go with our gut: “We are repelled by the prospect of cloning human beings . . . because we intuit and feel, immediately and without argument, the violation of things that we rightfully hold dear. . . . In this age in which everything is held to be permissible so long as it is freely done . . . repugnance may be the only voice left that speaks up to defend the central core of our humanity. Shallow are the souls that have forgotten how to shudder.”
There are, of course, good reasons to regulate human cloning, but the shudder test is not one of them. People have shuddered at all kinds of morally irrelevant violations of purity in their culture: touching an untouchable, drinking from the same water fountain as a Negro, allowing Jewish blood to mix with Aryan blood, tolerating sodomy between consenting men. And if our ancestors’ repugnance had carried the day, we never would have had autopsies, vaccinations, blood transfusions, artificial insemination, organ transplants and in vitro fertilization, all of which were denounced as immoral when they were new.
The latest must-have accessory on the Tokyo subway is a portable subway strap. Such a strap, of course, doesn't provide support, but it does keep one's hands occupied, and provides proof that one is not using them to grope women in the crush (something which happens a lot).
Such as the following: 1) simulate how a crowd flees from a burning car toward a single evacuation point; 2) test out how a pathogen might be transmitted through a mobile pedestrian over a short period of time; 3) see how the existing urban grid facilitate or does not facilitate mass evacuation prior to a hurricane landfall or in the event of dirty bomb detonation; 4) design a mall which can compel customers to shop to the point of bankruptcy, to walk obliviously for miles and miles and miles, endlessly to the point of physical exhaustion and even death; 5) identify, if possible, the tell-tale signs of a peaceful crowd about to metamorphosize into a hellish mob; 6) determine how various urban typologies, such as plazas, parks, major arterial streets and banlieues, can be reconfigured in situ into a neutralizing force when crowds do become riotous; and 7) conversely, figure out how one could, through spatial manipulation, inflame a crowd, even a very small one, to set in motion a series of events that culminates into a full scale Revolution or just your average everyday Southeast Asian coup d'état -- regime change through landscape architecture.
Or you quadruple the population of Chicago. How about 200 million? And into its historic Emerald Necklace system of parks, you drop an al-Qaeda sleeper cell, a pedophile, an Ebola patient, an illegal migrant worker, a swarm of zombies, and Paris Hilton. Then grab a cold one, sit back and watch the landscape descend into chaos. It'll be better than any megablockbuster movie you'll see this summer.

And here are emotional maps of various urban areas, including parts of London and San Francisco, created by having volunteers walk around them with GPS units and galvanic skin response meters.
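The first scenario on that wish-list — a crowd fleeing toward a single evacuation point — is the kind of thing a few lines of agent-based simulation can sketch. Here's a toy model (every parameter invented for illustration, far simpler than any real crowd-dynamics code): each agent simply steps toward the exit every tick, and the simulation reports how many ticks the full evacuation takes.

```python
import random

def step_toward(pos, target, speed=1.0):
    """Move a 2-D point one step of length `speed` toward `target`."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed:
        return target  # close enough: snap to the exit
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

def evacuate(agents, exit_point, max_ticks=500):
    """Tick until every agent has reached the evacuation point."""
    for tick in range(max_ticks):
        if all(a == exit_point for a in agents):
            return tick
        agents = [step_toward(a, exit_point) for a in agents]
    return max_ticks

# A crowd of 100 agents scattered over a 50x50 plaza, one exit at the corner.
random.seed(1)
crowd = [(random.uniform(0, 50), random.uniform(0, 50)) for _ in range(100)]
print(evacuate(crowd, (0.0, 0.0)))
```

Real models add the interesting parts this sketch omits — collision avoidance, panic, bottlenecks at doorways — which is precisely where the pathological behaviours the quoted passage wants to study come from.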
(via schneier, mind hacks)
The source code of the classic urban-planning simulation game, SimCity, has now been released under the GPL. You can find it here. The code is based on the original, UNIX/X11/Tcl/Tk version of SimCity, with a few changes: (a) the game has been renamed to Micropolis (which was its original working title), as "SimCity" is an Electronic Arts trademark for their commercial urban-simulation games, (b) it has been ported to the OLPC XO-1 (the cute green laptop being given to children in developing countries), and (c) everything has been placed in a C++ class and bound to a Python interpreter, making the entire game eminently hackable and extensible in Python. Let a million hacks bloom.
Scientists have developed a vaccine against cocaine, which permanently reconfigures the immune system to attack and destroy cocaine molecules before they can reach the brain:
The developers of the new cocaine vaccine, known as 'TA-CD', are doing essentially the same thing by cleverly combining a deactivated cocaine molecule with a deactivated cholera toxin molecule. The deactivated cholera toxin is enough to trigger the immune system, which finds and adapts to the new invader.
If effective, you can see that some parents might want to vaccinate their non-addicted, perfectly healthy children, so they are 'immune' to cocaine. The difference here, is that once given, the 'immunity' may be permanent. In other words, you would make the decision that your child will never be able to experience the effects of cocaine for the rest of their life.

Another option (and one with a whiff of authoritarianism about it, though perhaps not much more than the militarised, prison-filling War On Drugs) would be a compulsory mass vaccination programme, perhaps of all school-aged children. Implemented on a large enough scale, this could be the only way of killing off the cocaine cartels other than legalising the stuff (politically unpalatable) or rendering coca extinct by biological means (an ecological non-starter).
A vaccine against heroin may also be possible, though one wouldn't want to ever be in need of strong painkillers if one has had one of those.
(via Mind Hacks)
New research shows that the human race is evolving faster than ever, and that, far from the accepted truth of the human race being biologically homogeneous, different populations have, over the past 10,000 years, been evolving apart, pushed by different selection pressures:
“Genes are evolving fast in Europe, Asia and Africa, but almost all of these are unique to their continent of origin. We are getting less alike, not merging into a single, mixed humanity.
“Our study denies the widely held assumption that modern humans appeared 40,000 years ago, have not changed since and that we are all pretty much the same. We aren’t the same as people even 1,000 or 2,000 years ago.”
The scientists said this reflected the great increase in human populations over that period, which has allowed more beneficial mutations to emerge. Changes in the human environment, particularly the rise of agriculture, also created new selective pressure to which humans adapted.

Examples of evolutionary divergences include lactose tolerance among descendants of northern European populations (where dairy farming made this mutation beneficial), differences in resistance to diseases such as malaria, smallpox and HIV between European and African populations, and, more controversially, the hypothesis that above-average intelligence in Ashkenazi Jews is a result of selection pressures in mediaeval Europe (where they were confined to a small number of primarily cognitively demanding vocations). (And wasn't there a study a while ago that showed that the average IQ of a population was proportional to how many generations their ancestors had lived in urban environments?)
The latest minority (or perhaps majority) demanding justice is the ugly (or perhaps Ugly). At least in Buenos Aires, a city famous for its beautiful people and fastidious attention to appearances. Ugly Liberation activist Gonzalo Otalora is campaigning for taxes on beautiful people to offset the advantage they get from their appearance.
It seems that Ugly Pride may be an idea whose time has finally come. As the Ugly assert their identity, perhaps the next battle will be against discriminatory phrases like "an ugly situation".
Allegations have emerged of a secret mailing list used by a cabal of Wikipedia admins to unaccountably ban people they suspect of being enemies of Wikipedia trying to infiltrate the site. (The evidence against one banned user was that they seemed too skilled and too productive for a new user, and thus were probably a troll or sleeper agent trying to infiltrate Wikipedia and win its trust before doing god knows what.)
Meanwhile, it turns out that "wikipedia" is an ingredient in the food at at least one Chinese restaurant:
In this case, "wikipedia" turns out to be a type of fungus. This is not the first time that a restaurant in Asia has listed "wikipedia" as an ingredient; some two years ago, a French restaurant in Taiwan listed cheesecake as "wikipedia", and there are other reports of another establishment using the word to denote squid. Could it be some quirk of novice attempts at Chinese-to-English translation that causes this word to be mistaken for so many culinary ingredients?
Writing in InformationWeek, Cory Doctorow delivers a scathing indictment of Facebook, and its eyeball-herding business model:
Facebook is no paragon of virtue. It bears the hallmarks of the kind of pump-and-dump service that sees us as sticky, monetizable eyeballs in need of pimping. The clue is in the steady stream of emails you get from Facebook: "So-and-so has sent you a message." Yeah, what is it? Facebook isn't telling -- you have to visit Facebook to find out, generate a banner impression, and read and write your messages using the halt-and-lame Facebook interface, which lags even end-of-lifed email clients like Eudora for composing, reading, filtering, archiving and searching. Emails from Facebook aren't helpful messages, they're eyeball bait, intended to send you off to the Facebook site, only to discover that Fred wrote "Hi again!" on your "wall." Like other "social" apps (cough eVite cough), Facebook has all the social graces of a nose-picking, hyperactive six-year-old, standing at the threshold of your attention and chanting, "I know something, I know something, I know something, won't tell you what it is!"
If there was any doubt about Facebook's lack of qualification to displace the Internet with a benevolent dictatorship/walled garden, it was removed when Facebook unveiled its new advertising campaign. Now, Facebook will allow its advertisers to use the profile pictures of Facebook users to advertise their products, without permission or compensation. Even if you're the kind of person who likes the sound of a benevolent dictatorship this clearly isn't one.

To be honest, Facebook doesn't seem quite as bad about this as others (such as MySpace, which has the chutzpah to make logging-in users click through interstitial ads, knowing that cool-obsessed teenagers will endure any amount of intrusive advertising as long as it's bright and flashy and ugly-nu-rave enough). Though all this could change if it does start using your name and picture to endorse some product which you once bought. (Though if it does this, it could be on shaky legal ground. It's quite likely that its retroactively amended click-through agreement would, following the great click-wrap power-grab tradition, state that users consent to endorsing all products they buy without their knowledge in return for their fix of zombie vampire monkey robot ninja action, though whether any sane court of law would find this reasonable is another matter.) Certainly, their having removed the ability to opt out of the marketing programme does feel rather sleazy.
Fear not, though, as Cory says that Facebook, like all other social networks before it, is doomed, by a simple law which limits the lifespan of a social network to the initial period of growth:
Sure, networks generally follow Metcalfe's Law: "the value of a telecommunications network is proportional to the square of the number of users of the system." This law is best understood through the analogy of the fax machine: a world with one fax machine has no use for faxes, but every time you add a fax, you square the number of possible send/receive combinations (Alice can fax Bob or Carol or Don; Bob can fax Alice, Carol and Don; Carol can fax Alice, Bob and Don, etc).
Having watched the rise and fall of SixDegrees, Friendster, and the many other proto-hominids that make up the evolutionary chain leading to Facebook, MySpace, et al, I'm inclined to think that these systems are subject to a Brooks'-Law parallel: "Adding more users to a social network increases the probability that it will put you in an awkward social circumstance." Perhaps we can call this "boyd's Law" for danah boyd, the social scientist who has studied many of these networks from the inside as a keen-eyed net-anthropologist and who has described the many ways in which social software does violence to sociability in a series of sharp papers.

As more people join a social network, the tensions increase. Turning down a friend request is socially awkward, and unfriending someone literally says to them "you're dead to me". (OMG, teh drama!) As people from all walks of life join your friends list, the range of things that are suitable for discussion among all of them narrows considerably. Eventually, with your boss, your relatives and your friends all reading your profile, you're restricted to the most innocuously content-free of communications, until you stop bothering to log in, and your Facebook account goes the way of your long-moribund Friendster, Tribe and Orkut logins.
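The quadratic growth that Metcalfe's Law describes is easy to make concrete with Cory's fax-machine analogy. A quick sketch, counting the ordered (sender, receiver) combinations among his four fax owners:

```python
from itertools import permutations

def send_receive_pairs(machines):
    """Every ordered (sender, receiver) combination in the network."""
    return list(permutations(machines, 2))

# n machines give n * (n - 1) possible faxes, which grows roughly as the
# square of the network's size -- the heart of Metcalfe's Law.
owners = ["Alice", "Bob", "Carol", "Don"]
for n in range(1, len(owners) + 1):
    print(n, len(send_receive_pairs(owners[:n])))
```

One machine yields zero possible faxes; four yield twelve. The catch, per boyd's Law above, is that the number of potentially awkward pairings grows on exactly the same curve.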
Of course, then comes along the next social network service, and the cycle begins again. Perhaps the next service will learn from its predecessors' mistakes and offer users the vitally important ability to compartmentalise information, to make certain parts of one's profile visible only to certain subsets of one's friends list. This is not a new idea; LiveJournal has allowed its users to do this with journal posts for a long time, and Flickr has a somewhat more limited version of this concept (allowing photos to be restricted to people flagged as "family" or "friends"). However, if a social network system is to be able to cope with real-world social relationships, and the fact that people present different aspects of themselves to different friends and acquaintances, such a mechanism is essential.
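What such compartmentalisation might look like in code is straightforward; here's a minimal sketch (all names and data invented for illustration, not any real site's API): each profile section records which friend-groups may see it, and a viewer sees whatever their group memberships permit.

```python
# Sections of a profile, each mapped to the friend-groups allowed to see it.
profile = {
    "photos":    {"friends", "family"},
    "rants":     {"friends"},
    "job_title": {"friends", "family", "colleagues"},
}

# Which groups each viewer belongs to; unknown viewers belong to none.
membership = {
    "alice": {"friends"},
    "bob":   {"colleagues"},
    "mum":   {"family"},
}

def visible_sections(viewer):
    """Return the profile sections this viewer is allowed to see."""
    groups = membership.get(viewer, set())
    return {section for section, allowed in profile.items() if groups & allowed}

print(sorted(visible_sections("bob")))  # a colleague sees only the job title
print(sorted(visible_sections("boss")))  # an unknown viewer sees nothing
```

The mechanics are trivial; the hard part, as the paragraph above suggests, is getting users to maintain the group lists, which is presumably why so few services bothered.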
(via Boing Boing)
As social network websites with user-generated content become mainstream, online dating websites, as a category, are dying. Which makes sense: online dating sites (which are like the online equivalent of leks, full of people putting on their best dating-profile face and saying whatever they think makes them look more attractive) now look even more starkly naff than they did when they were the only game in town. And, now that there are alternatives, going anywhere specifically to pick up doesn't reflect well on oneself:
There's a reason Mulligan and Helm are above online dating. They're part of the social networking generation. Neither would admit to going on sites like Facebook or News Corp.'s MySpace (NWS) expressly looking to hook up. And that's precisely why it's such a better answer to the problem of meeting someone interesting. It's like going to a bar with your friends. Maybe you are going to meet someone special, but maybe you're just going to hang out with your friends too. You can play it cool. "MySpace and Facebook feel like going to a nature preserve, [whereas] a dating site is like walking past a bunch of animals in cages at the zoo," Helm says.
Other sites that meld user-generated content with social networking to accomplish certain tasks can be even handier. Consider Yelp, where people write reviews of their favorite restaurants, bars, and other haunts, or Digg, where users vote and post comments on their favorite online stories. You can scope out Yelp or Digg users on their profile pages, which show pictures and list basic likes and dislikes. But you can really find out about them from the locations they Yelp about or stories they Digg. Both sites have features that even let you connect with fellow users based on shared traits. It's like a version of eHarmony you don't have to opt into. And while many online dating sites charge a fee, most new Web sites are free.
And use of social web sites isn't the only thing that has gone mainstream; the Business Week article linked above signs off with:
The Web moves fast. And sorry online dating, but you just didn't keep up. In the parlance of the kids who won't use you, you got "pwned."
Don't envy the super-rich, this article says; their wealth has almost certainly made them miserable:
According to de Vries, the super-rich are increasingly succumbing to what has been labelled Wealth Fatigue Syndrome (WFS). When money is available in near-limitless quantities, the victim sinks into a kind of inertia.
"The rich are never happy, no matter what they have," he told CNN. "There was this man who owned a 100ft yacht. I said: 'This is a terrific boat.' He said: 'Look down the harbour.' We looked down the marina, and there were boats two and three times as large. He said: 'My 100ft yacht today is like a dinghy compared to these other boats.' When else in history has someone been able to call a 100ft yacht a dinghy?"
Some of our friends have jumped from nice five-bedroom houses in South Kensington to gated mansions in St John's Wood, complete with hot and cold running staff. But many who join the super-rich find it hard to keep their old circles of support. Happiness studies have repeatedly shown that being marginally better off than your neighbours makes you feel good, but being a hundred times richer makes you feel worse. So either you change your friends or live with the envy of others.
The article goes on to expound numerous other causes of wealth-induced misery: social support networks break down, as relationships with old friends are strained by the wealth disparity and poisoned by real or perceived envy; and all the cars, yachts and new houses your money can buy you just become boring much more quickly. (It's the hedonic treadmill effect: as one becomes acclimatised to one's level of comfort and contentment, it takes ever more to not succumb to ennui.) Meanwhile, the wives of the super-rich (and most of the super-rich are men; presumably husbands of super-rich women or gay partners would suffer the same) suffer the same psychological consequences as the unemployed (that is, when they're not traded in for younger, prettier models), and their children, shuttled between nannies and estates, often end up clinically depressed.
The conclusion is that money can buy happiness — but only up to a point. A key component of happiness is social connectedness, of the sort that cannot be bought:
The happiest nations, he says, are those where people feel most equal, even if that means being less wealthy. Pentecost, a tiny island in the South Pacific, has recently been voted the happiest place on earth. They don't have WFS – in fact, they don't have money; they use pigs' horns instead.
In places such as Pentecost, people actually talk to each other – indeed, belonging to a community is one of the single most important prerequisites for happiness.
A new study has discovered the phenomenon of suicide tourism, in which people intent on suicide travel to die at or near iconic landmarks or historic locations. The study claims that one in every 10 suicides in Manhattan is by an out-of-towner who travelled to the city expressly to die:
Some 274 suicides by non-residents were recorded in Manhattan between 1990 and 2004, more than half of them as a result of long falls from bridges and high-rise commercial buildings, including hotels, according to the report.
I once read that luxury hotels are a big suicide magnet, with many treating themselves to a luxurious exit, though this is the first time I heard of suicide tourism as such (not counting specific examples, such as various bridges).
In recent health-related news: a cure may have been discovered for the debilitating condition of unrequited love. Researchers in Alabama and Iran have found that a combination of the hormones melatonin and vasotocin may alleviate the condition:
Intense romantic love is associated with specific physiological, psychological and behavioural changes, including euphoria, obsessiveness, and a craving for closeness with the target.
The key is the pea-sized pineal gland, which produces melatonin. This hormone plays a key role in the circadian cycle. It has also shown anti-dopamine activities in part of the brain, while a second hormone, arginine-vasotocin, also has a key role in romantic love. The researchers suggest that giving the two hormones may be a cure for non-returned romantic love.
(Alabama and Iran? I wonder whether there's any deeper significance to two places known for religiously-based social conservatism being at the forefront of research to control a powerful and sometimes disruptive phenomenon. Is it heartening or disturbing that, even as talk of a US/Iranian war grows louder, US and Iranian scientists can join forces in the War On Unrequited Love?)
Also in the same article: taking showers may cause a neurodegenerative condition associated with inhalation of manganese, keeping dogs may cause breast cancer and sunlight may increase violent impulses.
The Graun has a piece on Don Tapscott's recently released book Wikinomics, and the theory that computer-aided networking may soon make large corporations redundant:
Ronald Coase had noticed something odd about capitalism. The received wisdom, among western economists, was that individuals should compete in a free market: planned economies, such as Stalin's, were doomed. But in that case, why did huge companies exist, with centralised operations and planning? The Ford Motor Company was hailed as a paragon of American business, but wasn't the Soviet Union just an attempt to run a country like a big company? If capitalist theory was correct, why didn't Americans, or British people, just do business with each other as individual buyers and sellers in the open market, instead of organising themselves into firms?
The answer - which won Coase a Nobel prize - is that making things requires collaboration, and finding and linking up all the people who need to collaborate costs money. Companies emerge when it becomes cheaper to gather people, tools and material under one roof, rather than to go out looking for the best deal every time you need a few hours' labour, or a part for a car. But the internet, Tapscott argues, is radically lowering the cost of collaborating. Companies - certainly big companies - are losing their raison d'être. Individuals, and tiny companies, can collaborate without corporate behemoths to organise them. Considering how many of us spend our weekdays working for big companies, and then spend our weekends giving our money to them, this is a far-reaching thought.
Tapscott cites a number of examples, from a struggling gold-mining concern which, facing bankruptcy, opened up its geological survey data and, with the help of experts across the web, made a recovery, to Chinese motorcycle manufacturing, which rather than being dominated by large companies as in Japan or America, consists of networks of small suppliers and assemblers who meet in tea shops to do deals. (Which sounds weird, but it is exactly how a big chunk of the PC industry has been operating for a while: non-brand-name PCs, assembled from separately-bought parts by end users or small businesses.) And, of course, the user-generated content phenomenon.
If anything, it is tempting to suggest that Tapscott is too kind to large companies. (His multimillion-dollar research was, after all, funded by a consortium of them.) Wikinomics is a book for existing corporations who want to learn how to survive: he suggests, for example, turning consumers into "prosumers", with an active role in product design, as with Lego Mindstorms, a range of construction toys with robotic bricks, aimed at adults. And he's scathing about record labels and others who don't see that the internet is a platform on which they can build new, profitable products, rather than something to be fought with lawsuits. But in the very long term, there's no particular reason why large corporations should survive at all. If Ronald Coase's 1937 insight remains valid, we could yet see the day when big companies such as Google begin to look rather prehistoric, because they are still, after all, big companies.
The Times has the poignant story of the death of a 42-year-old loner, whose body was not found until, two months after his death, a neighbour (who did not know him) noticed an odd smell coming from his north London council flat:
For some, the decision to disappear is gradual. It begins with an impulse, a desire to disconnect. It could mean turning the phone off and retreating under the duvet. For most people, it’s a fleeting escape. Family and friends are what keep them tethered. But what happens to those who become untethered? Or let go on purpose? Days, months, even years can pass. They have slipped through the cracks. Despite the presence of CCTV cameras and telecoms technology, which make most of us feel we are constantly monitored, it has become easier for those who live alone to avoid human contact altogether.
The pharmacist said he was always dressed neatly. He described him as “shy and pleasant – nothing mentally ill about him”, and admitted that when he didn’t see him for a while, he just assumed that Smith had moved away.
A few doors down from his flat, at No 168, Andrew’s neighbour, a postman, described Andrew as quiet, tall and thin. They lived near each other for 13 years but had only spoken to say hello when they passed each other coming and going on the stairs. In all the years he lived there, he said, he had seen no friends, ever. Andrew kept to himself.
Apparently there is a study which claims that we lose 1.2 friends each year between the ages of 20 and 40.
The recent cheap electronics boom (made possible by inexpensive, flexible microcontrollers and cheap manufacturing in places like China) has spawned a new wave of technological solutions to antisocial problems:
A Tennessee company has created a $50 device that shuts up other people's dogs by answering their barks with an ultrasonic squeal that humans can't hear. (The unit is disguised as a birdhouse.) British inventors are exporting a new product for people who hate lousy drivers -- it's a luminescent screen that fits in a car's rear window and, at the driver's command, flashes any one of five messages to other motorists.
One of the best known examples of this phenomenon is the TV-B-Gone, a keyring-sized infrared transmitter which, at the press of a button, sends out the "switch off" codes for hundreds of models of television. Some business owners have taken to removing or masking infrared receivers on their televisions to prevent people from switching them off, while hackers have customised their TV-B-Gones by embedding them in hats or else enhancing them with massive arrays of LEDs for extra range.
There's an updated version of the TV-B-Gone in the works that will be powerful enough to shut off televisions from behind sheets of glass. A well-publicized British invention called "the Mosquito" that emits high-frequency sounds particularly irritating to congregations of teenagers is now being marketed in the U.S. by a company called Kids Be Gone.
As Labour in Britain toys with the idea of giving 16-year-olds the vote, an advisor to the (recently resigned) premier of Victoria has come up with a uniquely Australian extension of this: giving votes to all children, to be exercised by their parents until they turn 18. Thus a two-parent family with three children would have five votes, which would break the crippling stranglehold of selfish childless people on the political process and introduce a new era of "family-friendly" policies.
Curiously enough, the proponent of this policy, Evan Thornley, is not a religious right-winger, but a member of the Fabian Society, that very Britishly pragmatic socialist organisation which once had George Bernard Shaw as one of its members (and, during the Cold War, was accused by Bircher types of using its shadowy influence over the Labor Party to implement "Sovietisation by stealth").
There are, of course, numerous problems with this proposal. Were it to be adopted, politicians would start bidding for the votes of large families by giving them more money, taken by punitively taxing the suddenly all-but-disenfranchised non-breeders. (What are they going to do, vote for someone else?) This would result in a system which effectively regards not having children as deviant behaviour to be penalised; once this is a matter of bureaucratic fact, the culture would soon follow. And then there is the likelihood of a bias towards large families bringing with it a bias towards religious conservatism; all of a sudden, Victoria would look like the repressively paternalistic 1950s white-picket-fence dystopia John Howard didn't quite succeed in building.
Of course, that's if such a policy were ever adopted. There are practical problems with implementing it, such as deciding which parent gets their children's votes. Granted, they could be split in half (with each parent in the 3-child family having 2.5 votes), though this proposal effectively changes the paradigm of democracy, from one comprised of voting individuals to one comprised of voting families. It has echoes of the top-down "strict-father" model of the family so favoured by conservatives, and at the heart of the culture war in America and Australia: it reinforces the idea of a family being defined by a chain of authority residing in the head of the household. Granted, it does not define a head of the household, though it is a short distance from accepting the paradigm that votes are allocated per household, and not per individual, to accepting that the votes for all members of the household are cast by the head of the household.
Mind you, given that Thornley's boss has suddenly resigned, this proposal is likely to be even more dead in the water than it was before. Unless the Howard government decide that it has battler-rallying potential and put it to a referendum, or else Rudd decides to use it to outflank the family-values warriors on the right.
A study of social network website users in the US has shown a class divide between MySpace and Facebook users. Apparently Facebook has more users from wealthier homes and more academic backgrounds, while MySpace has more working-class teenagers, minorities and members of social groups ostracised by the popular kids in high school (this may include music- and fashion-related youth subcultures).
A new study in the UK shows that the law-abiding majority is a myth, and more than 6 out of 10 Britons regularly commit crimes against the government, their employers or businesses. These crimes include such heinous acts as stealing stationery from work (18%), paying "cash in hand" to avoid taxation (34%), and padding out insurance claims to get more money (7%).
The Guardian reveals an all-but-forgotten fragment of the social history of 1980s Britain: a ZX Spectrum game named Hampstead, which codified the aspirational values of Thatcher-era Britain in the blocky, primary-coloured computer graphics of the period:
Hampstead was the ultimate 1980s adventure game, yet one of the few that broke from the traditional orcs and goblins fare. In it, you took the role of a down and out dreamer trapped in a grotty east London flat with ideals of leafy suburbs and affluence.
As aspirational games go, this text adventure was pretty high on the narcissistic scale. With the right clothes, the right education, the right muesli and the right girl (Pippa, of course), all that stood between you and your freehold was her Dad. And he was a pussycat. Hampstead taught a generation of future Brees and Tarquins how to climb the social ladder and how to look good while doing it.
Secularist philosopher A.C. Grayling weighs in on the curious case of why the recent publication of half a dozen anti-religious books has caused so much alarm, while the constant flood of religious books attracts no attention:
Half a dozen anti-religious books; what is amazing is how little, if anything, is said about the many thousands of pro-religious books published every year all round the world. The magazine Publishers Weekly reported earlier this year that the member publishing houses of the Evangelical Christian Publishers Association between them produced 13,400 new titles in the two years 2005-6 alone. This is just one segment of the religious publishing industry in just one wing of one of the world religions; the mind boggles at the extent of forests being felled for purveyance of religious doctrine, opinion, exhortation and polemic in every shade, nuance and type.
I had the good fortune to see Grayling speak at the Hay-on-Wye festival recently, and while he is in a similar philosophical camp to the likes of Richard Dawkins and Sam Harris, he certainly couldn't be classified as a "militant atheist". Then again, according to this blog post (also via Peter), the very phrase "militant atheist" is one of those weasel words, so thoroughly assimilated into the vernacular that people use it to describe people of quite moderate views, which just happen to be anti-religious:
From the meaning of "militant", you might expect that Dawkins, Harris, and Hitchens are burning down churches, or at least leading protests, stirring up crowds with their fiery rhetoric. You would be disappointed, of course. What Dawkins, Harris, and Hitchens have done is write books. Hitchens is more of a curmudgeon than a militant, and Dawkins and Harris are both rather mild-mannered. Nobody is leaving their public events carrying torches and singing the atheist analogue of the Horst Wessel song.
I'm not sure I'd agree about Harris; his The End Of Faith seemed to echo a lot of rather ugly neoconservative warblogger polemic.
Though the blogger seems to have a point that a lot of people are willing to cut people a lot more slack if their behaviour or demeanour has a religious justification.
When Jerry Falwell died recently, newspaper obituaries rarely described him as "militant", even though the adjective fit him much better than mild-mannered atheists like Harris. Ironically, however, the Associated Press obituary by Sue Lindsey referred to Falwell's father and grandfather as "militant atheists".
Cory Doctorow has an essay in Forbes, asserting that ubiquitous surveillance, of the sort that has recently become technologically feasible, not only doesn't make cities more secure but undermines the social contracts that make them work:
The key to living in a city and peacefully co-existing as a social animal in tight quarters is to set a delicate balance of seeing and not seeing. You take care not to step on the heels of the woman in front of you on the way out of the subway, and you might take passing note of her most excellent handbag. But you don't make eye contact and exchange a nod. Or even if you do, you make sure that it's as fleeting as it can be.
I once asked a Japanese friend to explain why so many people on the Tokyo subway wore surgical masks. Are they extreme germophobes? Conscientious folks getting over a cold? Oh, yes, he said, yes, of course, but that's only the rubric. The real reason to wear the mask is to spare others the discomfort of seeing your facial expression, to make your face into a disengaged, unreadable blank--to spare others the discomfort of firing up their mirror neurons in order to model your mood based on your outward expression. To make it possible to see without seeing.
Crazy, desperate, violent people don't make rational calculus in regards to their lives. Anyone who becomes a junkie, crack dealer, or cellphone-stealing stickup artist is obviously bad at making life decisions. They're not deterred by surveillance.
(via Boing Boing)
The Sussex Police are deploying extra officers in Brighton on nights where there is a full moon after the force's research showed a correlation between full moons and the frequency of violent incidents. No corresponding increase in lycanthropy has been reported, though an increase in violence has been found to occur on paydays (presumably because of people drinking portions of their paycheques).
Jan Grzebski, a Pole who has woken from a long coma, describes a transformed country:
"When I went into a coma there was only tea and vinegar in the shops, meat was rationed and huge petrol queues were everywhere," Mr Grzebski said.
"Now I see people on the streets with mobile phones and there are so many goods in the shops it makes my head spin," he told Polish television.
If the UK free tabloids are to be believed, up to 2,000 people in Japan have been sold lambs and told that they were poodles (which are both extremely fashionable and rare in Japan):
Entire flocks of lambs were shipped over from the UK and Australia to Japan by an internet company and marketed as the latest 'must have' accessory. But the scam was only spotted after a leading Japanese actress said her 'poodle' didn't bark and refused to eat dog food.
New research from Cardiff University has found a correlation between violence and the price of beer; namely, the cheaper beer is, the more violence there is:
The researchers examined admissions to 58 hospital accident and emergency departments over a five year period and found that as the price of beer increased, violence-related injuries decreased.
The study also looked at other factors, finding that increases in poverty, youth unemployment, diversity of ethnic population, major sporting events and it being summer also independently predicted an increase in violence.
I wonder how much of the study (which was carried out in England and Wales) is specific to Anglo-Saxon or British cultural factors, and how much of it would translate to other societies.
Recently, an article in the press quoted a British doctor who was proposing raising the drinking age in Britain from 18 to 21. His rationale seemed to be that Blairite attempts at introducing a "Continental drinking culture" were doomed to fail because Anglo-Saxons were incapable of handling alcohol as responsibly as the French and Italians, and hence Britain should learn from that other great Anglo-Saxon state across the Atlantic. This was duly lambasted by commentators aghast at yet another proposal to import more crude American ideas whilst ignoring the more sophisticated and humane ones across the Channel.
(via Mind Hacks)
The Guardian has an excerpt from a recent book by Barbara Ehrenreich, which postulates that the rise of subjective individual self-awareness and the decline of the collective celebrations common in mediæval times may have touched off an epidemic of depression we've been living in ever since:
And very likely the phenomena of this early "epidemic of depression" and the suppression of communal rituals and festivities are entangled in various ways. It could be, for example, that, as a result of their illness, depressed individuals lost their taste for communal festivities and even came to view them with revulsion. But there are other possibilities. First, that both the rise of depression and the decline of festivities are symptomatic of some deeper, underlying psychological change, which began about 400 years ago and persists, in some form, in our own time. The second, more intriguing possibility is that the disappearance of traditional festivities was itself a factor contributing to depression.
One approaches the subject of "deeper, underlying psychological change" with some trepidation, but fortunately, in this case, many respected scholars have already visited this difficult terrain. "Historians of European culture are in substantial agreement," Lionel Trilling wrote in 1972, "that in the late 16th and early 17th centuries, something like a mutation in human nature took place." This change has been called the rise of subjectivity or the discovery of the inner self and since it can be assumed that all people, in all historical periods, have some sense of selfhood and capacity for subjective reflection, we are really talking about an intensification, and a fairly drastic one, of the universal human capacity to face the world as an autonomous "I", separate from, and largely distrustful of, "them".
But the new kind of personality that arose in 16th- and 17th-century Europe was by no means as autonomous and self-defining as claimed. For far from being detached from the immediate human environment, the newly self-centered individual is continually preoccupied with judging the expectations of others and his or her own success in meeting them: "How am I doing?" this supposedly autonomous "self" wants to know. "What kind of an impression am I making?"
If this hypothesis is correct, then the epidemic of depression and mental illness that began in the 1600s (which Ehrenreich provides supporting evidence for, in historical records) is a side-effect of a step in the evolution of human psychology that began at around that time, with the pressures of communication, trade and social organisation dragging the human mind kicking and screaming from a sleepy collective life to a more dynamic way of living. In this case, a lot of the anxiety, angst and low-level distress people feel routinely is not a result of human nature, but rather human nature reacting against "unnatural" circumstances. Small wonder that many have sought relief in an annihilation of the self, from hippie communes to Communist utopias, from meditation to severe religious submission, from the Arcadian pastoral utopias throughout art (Tolkien, William Morris and the Arcade Fire to name three examples off the top of my head) to the transcendental nihilism of drugs (take, for example, Lou Reed wishing he had been born "a thousand years ago" in Heroin).
So where does that leave us? Perhaps, given enough time (hundreds if not thousands of years), human psychology will evolve in depression-resistant directions, assuming that some kind of technological catastrophe doesn't cut the process short. Genetic evolution is slow, but cultural evolution is faster, and it could be argued that our technologies and cultural institutions are part of the "extended phenotype" of humanity; that the invention of antidepressant drugs is an adaptation to these changes in our environment. It's a crude, reactionary adaptation, merely treating the symptoms, though there is hope on the horizon. There has recently been a lot of focus on the study of the psychology of happiness, and what factors make for environments conducive to sustainable happiness. With any luck, this will lead to improvements in areas from urban planning to social policy to economics.
Then again, if the hypothesis is true, would it be possible to somehow get the best of both worlds? Could one have the happy, fulfilling collective connectedness people (allegedly) had before the 16th century, whilst retaining the gains made since then? Or is the very presence of subjective thought, the demarcation between the self and the collective, poisonous?
(On the other hand, L. Ron Hubbard claims that depression comes from humanity's early ancestor, the clam, and the tension between the desire to open and close its hinge.)
After Stephen Fry commented that British actors have an unfair advantage in America because Americans mistake British accents for brilliance, the BBC has published a piece on what a British accent gets you in the US. (And, apparently, a "British accent" includes anything from Hugh Grant plumminess to deepest darkest Geordie.)
"For most Americans, there's no distinction between British accents. For us, there's just one sort of British accent, and it's better than any American accent - more educated, more genteel," says Rosina Lippi-Green, a US academic and author of English with an Accent: Language, Ideology and Discrimination in the United States.
"There was a sitcom called Dead Like Me with a Brit [Callum Blue] in it. He was a scruffy, 20-something drug dealer. Even he had that sort of patina - his was not an RP accent, it was a working class London accent."
Katharine Jones, author of Accent of Privilege: English Identities and Anglophilia in the US, says the "educated and cultured" associations have a long history. "British etiquette books have been used for years; and although Americans say they have no class system, they do - and the American upper class apes the British upper class."
Another point the article makes: British expatriates in Australia (where their accent is associated with complaining and being bad at cricket, and/or where refinement and intelligence have traditionally been associated with weakness and/or metaphorical or literal homosexuality rather than any positive attributes) tend to lose their accents pretty quickly, whereas those in the US (where their accents make them appear intelligent and sophisticated, and often get them preferential treatment) retain theirs. Funny, that.
An article in New York Magazine argues that a confluence of recent technological phenomena (the rise of the internet, social software, the decline of privacy) has produced the greatest generation gap since the dawn of Rock and Roll. Whereas subsequent "gaps" (punk rock kids rebelling against their Buddy Holly-listening parents, mall-goths and gangsta-rap kids rebelling against their new-waver parents, and such) were merely the new generation individuating itself by adopting a different dress code and slang, this one is a much more substantial rift, as kids who have grown up with the internet think differently, and their parents (much like the bemused parents of the young rockers of the early 1950s) don't quite know what to make of it all:
It's been a long time since there was a true generation gap, perhaps 50 years--you have to go back to the early years of rock and roll, when old people still talked about "jungle rhythms." Everything associated with that music and its greasy, shaggy culture felt baffling and divisive, from the crude slang to the dirty thoughts it was rumored to trigger in little girls. That musical divide has all but disappeared. But in the past ten years, a new set of values has sneaked in to take its place, erecting another barrier between young and old. And as it did in the fifties, the older generation has responded with a disgusted, dismissive squawk. It goes something like this:
"Kids today. They have no sense of shame. They have no sense of privacy. They are show-offs, fame whores, pornographic little loons who post their diaries, their phone numbers, their stupid poetry--for God's sake, their dirty photos!--online. They have virtual friends instead of real ones. They talk in illiterate instant messages. They are interested only in attention--and yet they have zero attention span, flitting like hummingbirds from one virtual stage to another."

Those on the younger side of the generation gap differ from their elders in several ways. They consider themselves to have an audience, and where older people have discarded the ephemera of their adolescence, the kids are archiving it, keeping a bridge to the past. Most tellingly, as the article puts it, their skin is thicker than yours. Where older people might consider concealing their private lives (in the name of privacy, security or just in case), the kids recognise that privacy is futile, and are more likely to reveal all.
And after all, there is another way to look at this shift. Younger people, one could point out, are the only ones for whom it seems to have sunk in that the idea of a truly private life is already an illusion. Every street in New York has a surveillance camera. Each time you swipe your debit card at Duane Reade or use your MetroCard, that transaction is tracked. Your employer owns your e-mails. The NSA owns your phone calls. Your life is being lived in public whether you choose to acknowledge it or not, and if being seen is inevitable, one might as well embrace it and make the best of it.

This attitude manifests itself in various ways:
From their perspective, it's the extreme caution of the earlier generation that's the narcissistic thing. Or, as Kitty put it to me, "Why not? What's the worst that's going to happen? Twenty years down the road, someone's gonna find your picture? Just make sure it's a great picture."
Consider Casey Serin. On Iamfacingforeclosure.com, the 24-year-old émigré from Uzbekistan has blogged a truly disastrous financial saga: He purchased eight houses in eight months, looking to "fix 'n' flip," only to end up in massive debt. The details, which include scans of his financial documents, are raw enough that people have accused him of being a hoax, à la YouTube's Lonelygirl15. ("ForeclosureBoy24," he jokes.) He's real, he insists. Serin simply decided that airing his bad investments could win him helpful feedback--someone might even buy his properties. "A lot of people wonder, 'Aren't you embarrassed?' Maybe it's naïve, but I'm not going to run from responsibility."
"If that girl's video got published, if she did it in the first place, she should be thick-skinned enough to just brush it off," Xiyin muses. "I understand that it's really humiliating and everything. But if something like that happened to me, I hope I'd just say, well, that was a terrible thing for a guy to do, to put it online. But I did it and that's me. So I am a sexual person and I shouldn't have to hide my sexuality. I did this for my boyfriend just like you probably do this for your boyfriend, just that yours is not published. But to me, it's all the same. It's either documented online for other people to see or it's not, but either way you're still doing it. So my philosophy is, why hide it?"

Of course, as this phenomenon is in its early stages, nobody knows entirely what kind of society will emerge from it:
For anyone over 30, this may be pretty hard to take. Perhaps you smell brimstone in the air, the sense of a devil's bargain: Is this what happens when we are all, eternally, onstage? It's not as if those fifties squares griping about Elvis were wrong, after all. As Clay Shirky points out, "All that stuff the elders said about rock and roll? They pretty much nailed it. Miscegenation, teenagers running wild, the end of marriage!"
Because the truth is, we're living in frontier country right now. We can take guesses at the future, but it's hard to gauge the effects of a drug while you're still taking it. What happens when a person who has archived her teens grows up? Will she regret her earlier decisions, or will she love the sturdy bridge she's built to her younger self--not to mention the access to the past lives of friends, enemies, romantic partners? On a more pragmatic level, what does this do when you apply for a job or meet the person you're going to marry? Will employers simply accept that everyone has a few videos of themselves trying to read the Bible while stoned? Will your kids watch those stoner Bible videos when they're 16? Is there a point in the aging process when a person will want to pull back that curtain--or will the MySpace crowd maintain these flexible, cheerfully thick-skinned personae all the way into the nursing home?
US discount store chain Target has withdrawn a line of CD cases with Che Guevara's image, after critics protested at the glorification of an architect of totalitarianism (not, as one might have guessed, because the Cuban government sued for copyright violation, or because socialists protested at the commercialisation of his image):
"What next? Hitler backpacks? Pol Pot cookware? Pinochet pantyhose?" wrote Investor's Business Daily in an editorial earlier this month, citing the Guevara case as a model of "tyrant-chic".
In North Pole, Alaska, it is Christmas every day. The decorations never come down, the streetlights are painted like candy canes, and even the McDonalds is Christmas-themed. Meanwhile, the town's new mayor wants to extend the Christmas theme, having shop workers wear elf costumes. Good cheer is a civic duty, and for some reason, not everybody's happy with that.
Recently, a group of high-school children was arrested after planning a Columbine-style high-school massacre:
Earl says the goths were non-Christmassy outcast loners, bullied by the jocks, their intended victims. I was a bullied goth at school and so I understand the impulse to want to kill bullies. But there's a big difference between them and me. There were 15 of them. Six ringleaders and nine others who knew about it and were to play subsidiary roles. A gang of 15 can hardly call themselves bullied loners.

Fifteen is a huge number in a town of 1,600. It's 25% of the school's 13-year-olds. And they were going to kill dozens of their classmates. This sounds to me like civil war, the non-Christmassy kids against the Christmassy ones.

The kids were all (a) identified as "goths" (apparently the goths in American Red States are a lot more violent and nihilistic than the ones elsewhere; the Mordorian Orcs of the goth world?), and (b) 13, which means that they would have recently done their first stint of letter-writing-elf duty, replying to some of the letters sent by children around the world to "Santa, North Pole". Some speculate that the shock of discovering that there is no Santa Claus, combined with the avalanche of human misery in the letters, may have pushed some of them to breaking point:
She explains: the town keeps the practice a secret from the younger children. They have no idea that they'll one day - at the age of 11 or 12 - be obliged to become letter-writing elves. She says it can be quite a shock. Jessie says it isn't as bad as it could be. They do have rules: "If someone writes something like, 'Dear Santa, my mom has cancer. Can you make it go away?' we don't deal with those. We give them back to the teacher." But still, she says, it's a disappointment.
"You'll probably see it in their faces. They prepare you for a few weeks before, but there's always that one person who's like, 'Wait. What are we doing?' And that's the person you should be looking out for. The person who wasn't paying attention in class until the letters are right in front of them. And then they're shattered. It's a weird experience."