Sam Han | Eportfolio

Archive for 'Uncategorized'

The Guardian: Arabs are democracy’s new pioneers | Michael Hardt and Antonio Negri | Comment is free

One challenge facing observers of the uprisings spreading across north Africa and the Middle East is to read them as not so many repetitions of the past but as original experiments that open new political possibilities, relevant well beyond the region, for freedom and democracy. Indeed, our hope is that through this cycle of struggles the Arab world becomes for the next decade what Latin America was for the last – that is, a laboratory of political experimentation between powerful social movements and progressive governments from Argentina to Venezuela, and from Brazil to Bolivia.

These revolts have immediately performed a kind of ideological house-cleaning, sweeping away the racist conceptions of a clash of civilisations that consign Arab politics to the past. The multitudes in Tunis, Cairo and Benghazi shatter the political stereotypes that Arabs are constrained to the choice between secular dictatorships and fanatical theocracies, or that Muslims are somehow incapable of freedom and democracy. Even calling these struggles “revolutions” seems to mislead commentators who assume the progression of events must obey the logic of 1789 or 1917, or some other past European rebellion against kings and czars.

These Arab revolts ignited around the issue of unemployment, and at their centre have been highly educated youth with frustrated ambitions – a population that has much in common with protesting students in London and Rome. Although the primary demand throughout the Arab world focuses on an end to tyranny and authoritarian governments, behind this single cry stands a series of social demands about work and life not only to end dependency and poverty but to give power and autonomy to an intelligent, highly capable population. That Zine al-Abidine Ben Ali and Hosni Mubarak or Muammar Gaddafi leave power is only the first step.

The organisation of the revolts resembles what we have seen for more than a decade in other parts of the world, from Seattle to Buenos Aires and Genoa and Cochabamba, Bolivia: a horizontal network that has no single, central leader. Traditional opposition bodies can participate in this network but cannot direct it. Outside observers have tried to designate a leader for the Egyptian revolts since their inception: maybe it’s Mohamed ElBaradei, maybe Google’s head of marketing, Wael Ghonim. They fear that the Muslim Brotherhood or some other body will take control of events. What they don’t understand is that the multitude is able to organise itself without a centre – that the imposition of a leader or being co-opted by a traditional organisation would undermine its power. The prevalence in the revolts of social network tools, such as Facebook, YouTube, and Twitter, is a symptom, not a cause, of this organisational structure. These are the modes of expression of an intelligent population capable of using the instruments at hand to organise autonomously.

Although these organised network movements refuse central leadership, they must nonetheless consolidate their demands in a new constituent process that links the most active segments of the rebellion to the needs of the population at large. The insurrections of Arab youth are certainly not aimed at a traditional liberal constitution that merely guarantees the division of powers and a regular electoral dynamic, but rather at a form of democracy adequate to the new forms of expression and needs of the multitude. This must include, firstly, constitutional recognition of the freedom of expression – not in the form typical of the dominant media, which is constantly subject to the corruption of governments and economic elites, but one that is represented by the common experiences of network relations.

And given that these uprisings were sparked not only by widespread unemployment and poverty but also by a generalised sense of frustrated productive and expressive capacities, especially among young people, a radical constitutional response must invent a common plan to manage natural resources and social production. This is a threshold through which neoliberalism cannot pass and capitalism is put to question. And Islamic rule is completely inadequate to meet these needs. Here insurrection touches on not only the equilibriums of north Africa and the Middle East but also the global system of economic governance.

Hence our hope for the cycle of struggles spreading in the Arab world to become like Latin America, to inspire political movements and raise aspirations for freedom and democracy beyond the region. Each revolt, of course, may fail: tyrants may unleash bloody repression; military juntas may try to remain in power; traditional opposition groups may attempt to hijack movements; and religious hierarchies may jockey to take control. But what will not die are the political demands and desires that have been unleashed, the expressions of an intelligent young generation for a different life in which they can put their capacities to use.

As long as those demands and desires live, the cycle of struggles will continue. The question is what these new experiments in freedom and democracy will teach the world over the next decade.

Oh okay H + N.

Posted via email from sam han’s posterous

Dear Jews Who May Have Been Sitting Near Me And My Friends In Connecticut College’s Harris Dining Hall Fall Semester, 1989 | The Awl

Dear Jews who may have been sitting near me and my friends in Connecticut College’s Harris dining hall fall semester, 1989,

Sorry for making anti-Semitic slurs.

It was an honest mistake. As a group, we would never have wanted to say anything to offend anyone along racial or ethnic lines. But that fall, we’d taken to using an expression without knowing its etymology. And if you knew this expression’s etymology, and if you were Jewish, and happened to be sitting near us in Harris dining hall, where we ate most of our meals, because it was attached to the complex of dorms where we all lived, the Plex, and if you heard us using this expression, which you probably would have, because we used it frequently for a while there, and were the type of college freshman and sophomores who spoke loudly, I think, each of us trying to say something funnier than what had just been said beforehand, and often needing to fairly shout to be heard above the din of uproarious laughter at the table, because we did always find ourselves to be very, very funny, you might have been offended. In fact, if you knew the etymology of the expression, you might have been offended even if you were not Jewish.

The expression is “jay,” used as a verb, meaning to steal or cheat or otherwise unethically beat someone out of something. To “screw someone over” is probably the most accurate definition. It was usually used in the past tense, “jayed,” as in, “Five dollars for a bagel with cream cheese?! Oh, man, you got jayed!” Or, in the context of a college dining hall, where the food comes free with tuition: Matt gets up to get a bagel from the bagel station. Todd asks him to get him a bagel. Matt returns with an everything bagel for himself and a cinnamon raisin bagel for Todd.

Todd: “Cinnamon raisin? [Sarcastically] Thanks a lot.”

Matt: “Sorry, dude. [Smugly spreads cream cheese on his everything bagel.] That was the only one left.”

Group: [Uproarious laughter]

Me: “Ha, Todd, he totally jayed you!”

You can see why we had to speak so loudly.

Oddly, we were a predominantly Jewish bunch. Matt’s last name is Coen. Todd’s is Schwartz. Mine, in case you’ve ever wondered, is an acronym, originally Hebrew, that stands for “Ben Rabbi Yitzok” (or “Yisrael”—my grandfather’s genealogical research found evidence for both.) Steve Arnoff, another Jew, was often at our table, too. Mike Brockhaus, Carter Beal, and Will Noonan made up the rest of the core group. They’re not Jews, but I can vouch for their enlightened views on the subject.

The expression came to us through Carter, I think. This would make sense, because Carter grew up in Minnesota, where there are fewer Jews than, say, New York City, or Westchester, or Yisrael. When I think back to hearing the expression for the first time, I hear it in Carter’s voice—which has traces of the hard-bite Minnesotan accent I knew best before meeting him from Replacements records, like the part at the end of “Treatment Bound” on the Hootenanny album, after the song has sort of fallen apart and the music has stopped and Tommy Stinson laughs and asks, “What were the chords to that one part?” and his older brother Bob, sounding very drunk (as was his wont), says, “Fucked ’em up…” The Replacements were my favorite group in the world at that time of my life, and I remember really liking the way it sounded when Carter said “jayed,” and half-consciously emulating his pronunciation, putting a more nasal stress on the “A” than I would have in New Jersey.

Which is not to blame Carter, or Minnesota. But just to say I can imagine how widespread usage of a Jewish slur could go on more easily in a place where there are not so many Jews. Surely, much of the usage was innocent. Lots of people in lots of places use lots of expressions without knowing all the attendant connotations. That’s how it works when you learn new slang. You hear it, you hear it again, you just pick it up. You might ponder the derivation, but if you don’t know it for sure, it’s not going to stop you from saying it.

So one day, we’re sitting at lunch in Harris and someone says someone got jayed, and Matt sort of stiffens up and says, “You guys realize you’re making an anti-Semitic slur every time you say that word, right?”

“Huh?”

“What?”

“No.”

“I don’t think it’s really cool to be saying ‘jayed’ all the time,” he said. “It’s totally anti-Semitic.”

“It is?” I said.

“What did you think the ‘J’ stood for?”

I looked around. We all had dumb looks on our faces. “I thought it meant, like, ‘jacked,’” I said. That’s what I did think. “Like, robbed. Like car-jacking.”

I looked at Carter.

“That’s what I thought, too,” he said. “Or ‘jerked.’ Like, ‘Don’t jerk me around.’”

I think someone even said they thought it was a reference to the bird, because blue jays were known to steal food from other birds or something. They are aggressive birds, blue jays.

“Nope.” Matt had apparently been told differently. This was maybe soon after Thanksgiving. Maybe he’d gone home, recently, to Rhode Island. Maybe he’d gone with his family to Friday night services at his synagogue. Maybe he’d said ‘jayed’ to some people there and they’d told him. Maybe it was the rabbi himself. I don’t remember exactly, but he’d been told. “Every time you say, ‘jayed,’ you’re saying, ‘Jewed.’ Like, to ‘jew someone down’ on a business deal. It’s like ‘shyster.’”

It made sense. Certainly, I was aware of the stereotypes surrounding Jewish business practices. I had heard the expression ‘to Jew someone down’ before. While Jews are notoriously paranoid about this kind of stuff—there’s that famous scene from Annie Hall, when Woody Allen tells Tony Roberts the story about hearing the words “did you” as “Jew”—this actually seemed legit. Without any evidence more persuasive than, “I just figured…” in support of the more innocent explanations, I was inclined to believe Matt was right. That is how we used it, like, “ripped off.”

“Huh.”

“Wow.”

“Yeah.”

It was quiet for a while, while we thought. I looked around, noticed the tables nearby, filled with our classmates—people we didn’t know very well, people who didn’t know us very well, people of undetermined ethnic origin. There were no yarmulkes in sight. Just as there were none at our table.

“We’re idiots,” Todd said, breaking into a self-effacing smile, and soon we got back to loud laughter and talking.

We were. But we stopped saying ‘jayed’ after that.

I had the pleasure of meeting Dave Bry recently at a birthday party of a mutual friend. (Jay-Z was also at this party. NBD.) I literally STAN-ed* out when I met him, citing previous entries of his great Public Apology column at The Awl.

Here’s his most recent piece. SOOO GOOOD.

*STAN (v.) – to act like an obsessed fan, like the protagonist of Eminem’s song “Stan” (you know the one that samples Dido).

Posted via email from sam han’s posterous

Shia Crescent – Wikipedia, the free encyclopedia

“Shia Crescent” is a term that we will all be familiar with very soon. I’m not sure how much analytic power it has but it will be the surd in every single article regarding Mid-East politics. I can almost guarantee it.

For instance: http://www.nytimes.com/2011/02/24/world/middleeast/24saudis.html?hp

Posted via email from sam han’s posterous

Arab uprisings mark a turning point for the taking | Peter Hallward | Comment is free | guardian.co.uk

Hallward is exceptional as a commentator on Deleuze and as an expert on Haiti. Quite good on the Middle East too, as you can see.

Posted via email from sam han’s posterous

Haruki Murakami’s ‘Norwegian Wood’: International Trailer Goes Live « Word and Film

NYT Graphic: Global income inequality

USA! USA! We’re the best at everything, including leaving our poor way poorer than our rich.

H/T: http://twitter.com/iKnewdles/

Posted via email from sam han’s posterous

Hawking contra Philosophy | Philosophy Now

Hawking contra Philosophy

Christopher Norris presents a case for the defence.

Stephen Hawking recently fluttered the academic dovecotes by writing in his new book The Grand Design – and repeating to an eager company of interviewers and journalists – that philosophy as practised nowadays is a waste of time and philosophers a waste of space. More precisely, he wrote that philosophy is ‘dead’ since it hasn’t kept up with the latest developments in science, especially theoretical physics. In earlier times – Hawking conceded – philosophers not only tried to keep up but sometimes made significant scientific contributions of their own. However they were now, in so far as they had any influence at all, just an obstacle to progress through their endless going-on about the same old issues of truth, knowledge, the problem of induction, and so forth. Had philosophers just paid a bit more attention to the scientific literature they would have gathered that these were no longer live issues for anyone remotely au fait with the latest thinking. Then their options would be either to shut up shop and cease the charade called ‘philosophy of science’ or else to carry on and invite further ridicule for their head-in-the-sand attitude.

Predictably enough the journalists went off to find themselves media-friendly philosophers – not hard to do nowadays – who would argue the contrary case in a suitably vigorous way. On the whole the responses, or those that I came across, seemed overly anxious to strike a conciliatory note, or to grant Hawking’s thesis some measure of truth as judged by the standards of the natural science community while tactfully dissenting with regard to philosophy and the human sciences. I think the case needs stating more firmly and, perhaps, less tactfully since otherwise it looks like a forced retreat to cover internal disarray. Besides, there is good reason to mount a much sturdier defence on principled grounds. These have to do with the scientists’ need to philosophize and their proneness to philosophize badly or commit certain avoidable errors if they don’t take at least some passing interest in what philosophers have to say.

Science is Philosophical

Professor Hawking has probably been talking to the wrong philosophers, or picked up some wrong ideas about the kinds of discussion that currently go on in philosophy of science. His lofty dismissal of that whole enterprise as a useless, scientifically irrelevant pseudo-discipline fails to reckon with several important facts about the way that science has typically been practised since its early-modern (seventeenth-century) point of departure and, even more, in the wake of twentieth century developments such as quantum mechanics and relativity.

Science has always included a large philosophical component, whether at the level of basic presuppositions concerning evidence, causality, theory-construction, valid inference, hypothesis-testing, and so forth, or at the speculative stage where scientists ignore the guidance offered by well-informed philosophers only at risk of falling into various beguiling fallacies or fictions. Such were those ‘idols of the theatre’ that Bacon warned against in his New Organon of 1620, and such – albeit in a very different philosophic guise – those delusive ideas that, according to Kant, were liable to lead us astray from the path of secure investigation or truth-seeking enquiry. This was sure to happen, he warned, if the exercise of pure (speculative) reason concerning questions outside and beyond the empirical domain were mistakenly supposed to deliver the kind of knowledge that could be achieved only by bringing sensuous intuitions under adequate or answering concepts. While in no way wishing to lumber science with the baggage of Kantian metaphysics I would suggest that this diagnosis, or something like it, applies to a great many of the speculative notions nowadays advanced by theoretical physicists including proponents of string theory (Hawking among them) and some of the more way-out quantum conjectures. These thinkers appear unworried – blithely unfazed, one is tempted to say – by the fact that their theories are incapable of proof or confirmation, or indeed of falsification as required by Karl Popper and his followers. After all, it is the peculiar feature of such theories that they posit the existence of that which at present, and perhaps forever, eludes any form of confirmation by observation or experiment.

True, science has achieved some of its most notable advances precisely by venturing beyond the furthest limits of evidential proof. It has often broken new ground by following some speculative line of thought that involves a readiness, at least for the time being, to make do without the props and securities of ‘good’ scientific method. Indeed, this reliance on theoretical commitments that exceed the utmost scope of empirical testing is something that some philosophers would attribute even to basic physical laws or widely taken-for-granted scientific truths. On their view there is no such thing as plain empirical self-evidence, since observations are always to some degree theoretically informed. By the same token, scientific theories are always ‘underdetermined’ by the best evidence to hand, meaning that the evidence is always open to other, equally rational interpretations given some adjustment of this or that ‘auxiliary hypothesis’ or negotiable element of background belief. All the same, I don’t want to push that line of argument too far, because among some philosophers of science it has now become an article of faith; a dogma maintained just as fixedly as any precept of the old, unreconstructed positivist creed. Moreover it has given rise to a range of relativist or ‘strong’ sociological approaches which use the theory-ladenness and underdetermination theses to cast doubt on any distinction between true and false theories, valid and invalid hypotheses, or science and pseudo-science.

Very likely it is notions of this kind – ideas with their home ground in sociology, or cultural studies, or on the wilder shores of philosophy of science – which provoked Professor Hawking to issue his pronouncement. However they are in no way germane to my point about the speculative element involved in many episodes of major scientific advance and how philosophy has played its jointly enabling and regulative part in that process. By this I mean its role as a source of new ideas or creative hypotheses and also as a source of guiding precepts with respect to such matters as empirical evidence, logical validity, inductive warrant, corroboration, falsification, hypothesis-testing, causal reasoning, probability-weighting, and so forth. These serve to keep science securely on track and prevent it from taking the seductive turn toward pure, evidentially unanchored speculation or sheer science-fiction fantasy. That scientists can mostly do this for themselves is no doubt true enough although, I should add, it is very largely the long-term result of the work of philosophers. Ever since Aristotle there has existed a close though historically fluctuating relationship between the natural sciences and those branches of philosophy that took it as a part of their task to provide science with a clearer grasp of its own methodological bearings. Moreover it has sometimes been primarily a shift of philosophical perspective that has brought about some epochal change of scientific paradigm such as those whereby, in the insouciant phrase of American philosopher W.V. Quine, “Kepler superseded Ptolemy, or Einstein Newton, or Darwin Aristotle.”

I have no quarrel with Hawking’s aversion to philosophy of science in so far as it is provoked by the kind of wholesale paradigm-relativism that Quine was seeking to promote. On Quine’s account (and that of Thomas Kuhn) we should think of scientific theory-change as involving so radical a shift of conceptual schemes as to render the history of science rationally unaccountable and philosophy of science a poor (since entirely dependent) relation of sociology and behavioural psychology. If that were the sole position available to present-day philosophers owing to some large-scale failure of intellectual nerve then Hawking would be fully justified in launching his anti-philosophy salvo. However this ignores the strong turn toward a realist and causal-explanatory approach that has been the single most conspicuous feature of philosophy of science during the past two decades. In place of that earlier relativist drift these thinkers advocate a robust conception of natural kinds along with their essential structures, properties, and causal dispositions. Crucially in the present context their approach offers a critical purchase on the issue of what properly counts as scientific enquiry and what should more aptly be classed as metaphysical conjecture or (at the limit) mere invention.

So philosophy of science now looks set to reoccupy its native ground by getting back in touch with physics. This is not just a relatively trivial semantic point about the physical sciences having been described as so many branches of ‘natural philosophy’ until quite recently. Rather it is the point that scientific theories – especially theories of the ultra-speculative kind that preoccupy theoretical physicists like Hawking – involve a great deal of covert philosophising which may or may not turn out to promote the interests of knowledge and truth. This had better be recognised if we are not to be taken in by a false appeal to the authority of science as if it possessed the kind of sheer self-evidence or indubitable warrant that could rightfully claim to evict ‘philosophy’ as a relic from the pre-scientific past.

Least of all should philosophers carry their justified respect for science and its many impressive achievements to the point of ceding all authority over issues that lie within their own sphere of competence. Thus it is counter-productive for everyone concerned, philosophers and physicists alike, when Quine and others suggest that we should always be willing to change the ground-rules of logic so as to help us find room for certain otherwise puzzling, anomalous, or downright baffling results. Perhaps the seeming quantum paradox of wave/particle dualism can have its sting temporarily removed by lifting the classical rules of bivalence or excluded middle, i.e., those that would require that we accept either the statement ‘light propagates as waves’ or the statement ‘light is a stream of particles’ but surely not both on pain of logical contradiction. However the revisionist ‘solution’ gives rise to yet more intractable problems since it leaves both scientists and philosophers stuck with a huge normative deficit. After all, if they accepted Quine’s proposal then they would lack the most basic conceptual resources for assessing statements, theories or hypotheses in point of their internal (logical) consistency or even concerning the extent to which they hung together properly with other items of scientific lore.

Here again philosophers would do much better to stick to their guns, reject this particular line of least resistance, and hold out for the indispensability (on empirical as well as ‘purely’ rational grounds) of a due respect for the classical rule of bivalent truth/falsehood. Not that it could ever achieve what Hawking seems to envisage in the final paragraph of his book when he marvels at the thought of how ‘abstract logic’ could have thrown up the sheer wondrous profusion of present-day scientific knowledge. Here the point needs making – one to which his own book bears ample witness – that the knowledge in question has resulted from a disciplined yet often highly inventive project of enquiry wherein ‘abstract’ reasoning plays a crucial though far from all-encompassing or self-sufficiently productive role. This project combines the basic procedures of logical, e.g., hypothetico-deductive thought and inductive reasoning on the evidence with a whole range of ancillary resources such as analogy, thought experiments, rational conjecture, and – subsuming all these – inference to the best, most adequate explanation.

Hawking offers numerous examples of the use of each of these philosophical tools in the course of his book, along with other cases where their joint operation is the only thing that could possibly explain how science has been able to achieve some particular advance. All the same he is compelled by the ‘abstract logic’ of his own doctrinaire science-first approach to push that evidence temporarily out of sight when declaring the total irrelevance of philosophy for anyone possessed of an adequate (i.e., scientifically informed) worldview. Indeed it may be good for philosophers occasionally to remind scientists how their most productive thinking very often involves a complex interplay of empirical data, theories, working hypotheses, testable conjectures and even (sometimes) speculative fictions. Likewise absent from Hawking’s account is philosophy’s gatekeeper role in spotting those instances where science strays over without due acknowledgement from one to another mode, or – as frequently happens nowadays – where certain evidential constraints are lifted and empirically informed rational conjecture gives way to pure fabulation.

Besides this, there are supposedly cutting-edge theories which turn out, on closer inspection, to unwittingly replicate bygone notions from the history of thought that have been criticised and eventually laid to rest. Hawking’s book puts forward two such theories. One is his linchpin ‘M-theory’ having to do with the multiple dimensions – eleven at the latest count – that are taken to constitute the ultimate reality beyond appearances despite our sensory perception being limited to the three-plus-one of our familiar spatio-temporal world. On this account there cannot be a single, comprehensive ‘Theory of Everything’ of the kind favoured by sanguine types like Steven Weinberg but we can hope to get a whole range of specially tailored, region-specific theories which between them point toward the nature and structure of ultimate reality. The other, closely related to that, is Hawking’s idea of ‘model-dependent realism’ as an approach that makes allowance (as per orthodox quantum mechanics) for the effect of observation on the item observed but which nonetheless retains an adequate respect for the objectivity of scientific truth.

Here Hawking’s argument shows all the signs of a rudderless drifting between various positions adopted by different philosophers from Kant to the present. He spends a lot of time on what seems to be a largely unwitting rehash of episodes in the history of idealist or crypto-idealist thought, episodes which have cast a long shadow over post-Kantian philosophy of science. That shadow still lies heavy on Hawking’s two central ideas of M-theory and model-dependent realism. They both look set to re-open the old Kantian split between a ‘noumenal’ ultimate reality forever beyond human knowledge and a realm of ‘phenomenal’ appearances to which we are confined by the fact of our perceptual and cognitive limits. So if Hawking is right to charge some philosophers with a culpable ignorance of science then there is room for a polite but firm tu quoque, whether phrased in terms of pots calling kettles black or boots on other feet. For it is equally the case that hostility or indifference toward philosophy can sometimes lead scientists, especially those with a strong speculative bent, not only to reinvent the wheel but to produce wheels that don’t track straight and consequently tend to upset the vehicle.

A firmer grasp of these issues as discussed by philosophers during the past few decades might have moderated Hawking’s scorn and also sharpened his critical focus on certain aspects of current theoretical physics. My point is not so much that a strong dose of philosophic realism might have clipped those speculative wings but rather that philosophers are well practised in steering a course through such choppy waters, or in managing to navigate despite all the swirls induced by a confluence of science, metaphysics, and far-out conjecture. After all, physics has increasingly come to rely on just the kind of disciplined speculative thinking that philosophers have typically invented, developed, and then criticised when they overstepped the limits of rationally accountable conjecture. Such are those ‘armchair’ thought-experiments that claim to establish some substantive, i.e., non-trivial thesis concerning the nature of the physical world by means of a rigorous thinking-through that establishes the truth (or, just as often, the demonstrable falsehood) of any statement affirming or denying it.

No doubt there is room to debate whether these are really (and remarkably) instances of scientific discovery achieved through an exercise of a priori reasoning or whether they amount, as sceptics would have it, to a species of disguised tautology. However there are just too many impressive examples in the history of science – from Galileo’s marvellous thought-experiment showing that Aristotle must have been wrong about falling bodies to a number of crucial quantum-related results – for anyone to argue convincingly that results obtained in the ‘laboratory of the mind’ can only impress philosophers keen to defend their patch. Indeed, there is a sense in which the scientific enterprise stands or falls on the validity of counterfactual-conditional reasoning, that is to say, reasoning from what necessarily would be the case should certain conditions obtain or certain hypotheses hold. In its negative guise, this kind of thinking involves reasoning to what would have been the outcome if certain causally or materially relevant factors had not been operative in some given instance. Hawking constantly relies on such philosophical principles in order to present and justify his claims about the current and likely future course of developments in physics. Of course he is very welcome to them but he might do better to acknowledge their source in ways of thinking and protocols of valid argumentation that involve distinctly philosophical as well as scientific grounds.

This brings us back to the point likely to provoke the most resistance from those scientists – chiefly theoretical physicists – who actually have the most to gain from any assertion of philosophy’s claim to a hearing in such matters. It is that scientists tend to go astray when they start to speculate on issues that exceed not only the current-best observational evidence but even the scope of what is presently conceivable in terms of testability. To speak plainly: one useful job for the philosopher of science is to sort out the errors and confusions that scientists – especially theoretical physicists – sometimes fall into when they give free rein to a speculative turn of mind. My book Quantum Theory and the Flight from Realism found numerous cases to illustrate the point in the statements of quantum theorists all the way from Niels Bohr – a pioneering figure but a leading source of metaphysical mystification – to the current advocates (Hawking among them) of a many-worlds or ‘multiverse’ theory. To adapt the economist Keynes’ famous saying: those scientists who claim to have no use for philosophy are most likely in the grip of a bad old philosophy or an insufficiently thought-out new one that they don’t fully acknowledge.

There is a large supply of present-day (quasi-)scientific thinking at the more – let us say – creative or imaginative end of the scale that falls into just this hybrid category of high-flown metaphysical conjecture tenuously linked to certain puzzling, contested, or at any rate far from decisive empirical results. Nor is it mere hubris for philosophers to claim a special competence in judging when thought has crossed that line from the realm of rational, scientifically informed but so far unproven conjecture to the realm of unanchored speculation or outright science fiction fantasy. One has only to pick up a copy of New Scientist or Scientific American to see how much of the latest thinking inhabits that shadowy border-zone where the three intermingle in ways that a suitably trained philosopher would be best equipped to point out. Nowhere is this more evident than in the past hundred years of debate on and around the seemingly paradoxical implications of quantum mechanics. Those paradoxes include wave/particle dualism, the so-called ‘collapse of the wave-packet’, the observer’s role in causing or inducing said collapse, and – above all since it appears the only way of reconciling these phenomena within anything like a coherent ontology – faster-than-light interaction between widely separated particles.

I shall risk the charge of shameless self-advertisement and suggest that readers take a look at my book for the case that these are pseudo-dilemmas brought about by a mixture of shaky evidence, dubious reasoning on it, fanciful extrapolation, and a flat refusal to entertain alternative theories (such as that of the physicist David Bohm) which considerably lighten the burden of unresolved paradox. At any rate we are better off trusting to the kinds of advice supplied by scientifically-informed philosophers with a well-developed sense of how speculative thinking can sometimes go off the rails than the kinds – including the advice ‘let’s put a stop to philosophy’ – issued by philosophically under-informed scientists.

Conclusions

No doubt there is a fair amount of ill-informed, obtuse, or ideologically angled philosophy that either refuses or tries but fails to engage with the concerns of present-day science. One can understand Hawking’s impatience – or downright exasperation – with some of the half-baked notions put around by refuseniks and would-be engageniks alike. All the same he would do well to consider the historically attested and nowadays more vital than ever role of philosophy as a critical discipline. It continues to offer the sorts of argument that science requires in order to dispel not only the illusions of naïve sense-certainty or intuitive self-evidence but also the confusions that speculative thought runs into when decoupled from any restraining appeal to regulative principles such as that of inference to the best explanation. To adapt a quotation by Kant in a different though related context: philosophy of science without scientific input is empty, while science without philosophical guidance is blind. At any rate it is rendered perilously apt to mistake the seductions of pure hypothetical invention for the business of formulating rationally warranted, metaphysically coherent, and – if only in the fullness of time – empirically testable conjectures.

© Prof. Christopher Norris 2011

Christopher Norris is Professor of Philosophy at Cardiff University.

Further Reading
• Stephen Hawking with Leonard Mlodinow, The Grand Design: new answers to the ultimate questions of life (Bantam Press, 2010)
• Christopher Norris, Quantum Theory and the Flight from Realism: philosophical responses to quantum mechanics (Routledge, 2000)
• David Papineau (ed.), The Philosophy of Science (O.U.P., 1996)

This is utterly obvious, but I’m glad someone has presented it neatly. I mean, why would anyone, especially Hawking of all people, think philosophy is outmoded? That’s, in essence, what he does!

Posted via email from sam han’s posterous

Bruno Latour: Where is res extensa? An Anthropology of Object | Continental Philosophy

The real threat of Glenn Beck’s fantasies | Frances Fox Piven | Comment is free | guardian.co.uk

Dissent Magazine – Cult Stud Mugged: Why We Should Stop Worrying and Learn To Love a Hip English Professor

Cult Stud Mugged: Why We Should Stop Worrying and Learn To Love a Hip English Professor

Kevin Mattson – January 31, 2011

BACK IN the late eighties, I was an undergraduate at the New School for Social Research, along with other punk rockers, political activists, and wannabe writers. It was a hotbed of the postmodern academic Left: my fellow students would “interrogate” texts while carefully avoiding “logocentrism,” deconstruct television shows, and write essays that “de-gendered” literary works they hadn’t read.

Like many college students, I was confused and horny. So it didn’t take long for me to notice numerous young women clutching copies of Joan Scott’s Gender and the Politics of History (her then-husband taught history at the New School). Read this book pronto, my libido told me. In short order, I learned to challenge “the accuracy of fixed binary distinctions” and was making casual conversational references to stylish French thinkers like Michel Foucault and Jacques Derrida.

Midway through my reeducation, I came across an essay in Joan Scott’s book that set out to dismantle E.P. Thompson, the historian of the “English working class.” Scott took him on by blowing up the very concept of “rights” and the language of inclusion, as used, for example, by the movement that demanded that people without property be allowed to vote. She questioned Thompson’s faith in “rational” politics and the “abstract individual, the bearer of rights.” I stumbled through Scott’s cumbersome sentences—somehow critiques of abstracted individualism never yielded decent prose. And I remember thinking to myself: aren’t rational arguments in favor of rights a good thing? And especially for anyone who claims to be on the Left, seeing that universal rights are the basis of…well, just about everything?

No, the other students in my classes told me, because such a “position” hadn’t “sufficiently problematized” (a term I heard a lot back then) the binary oppositions inherent in rights and universalism. Conversation after conversation like this ended with my head buzzing and my heart broken.

After getting a Ph.D. in American history and entering the joyous condition of chronic underemployment that it secures, I finally acquired a full-time teaching position in 2001, just a week before September 11. As we all know, a very different world soon opened up. It included not simply a war on terror carried out by a tongue-tied president, but also a conservative movement that was emboldened, by having conquered every branch of government, to search out tenured radicals and make war on the press.

I often wondered where Joan Scott was through the past decade. It turned out that she was heading up Committee A of the American Association of University Professors (AAUP), an organization founded almost a hundred years ago by the liberal philosopher John Dewey to fight for the distinctly bourgeois ideal of free inquiry. Committee A focuses on “academic freedom” and tenure issues, and therefore I had to rub my eyes a bit when I came across the powerful testimony Scott gave to numerous state legislatures that were considering rules that would allow politicians to smoke out and dismiss radical professors. Her defense of “academic freedom”—including the right to control classroom content—was steadfast and thorough, and she cited a document that the AAUP had created in 1940 that elaborated on the idea. It read, in bold and clear language, “The common good depends upon the free search for truth and its free exposition.”

I thought back to my New School days, dwelling on the many ironies that attended Scott’s new professional mission. Academic freedom? Now, aren’t there a lot of binary oppositions and gendered compromises that riddle this dead-white-male, liberal value? Isn’t this an abstract universal proposition, and a meta-narrative to boot? Shouldn’t she interrogate or problematize the idea for those listening legislators?

But we were no longer in the eighties or nineties. The Right had come with guns blazing to legislate against “academic freedom,” and suddenly a left-wing academic realized that the liberal and universal values she used to criticize weren’t such bad things after all. It turned out to be far more important to defend such ideas than to interrogate them. I suddenly realized I had witnessed an intellectual mugging.

IRVING KRISTOL famously quipped in the early 1980s that a neoconservative—a member of a faction then defecting from the Democratic Party in droves—was “a liberal who has been mugged by reality.” What he meant was that people like himself—intellectuals who had leaned left—were being driven to the right thanks to their experience with the New Left in the late sixties. Mugged: as in, having your ideas taken away and replaced by something else; a signal that the world no longer operates the way you expected.

Today we’re seeing another mugging occur on the intellectual landscape—a more subdued and drawn-out shifting of intellectual coordinates than anything announced by the self-advertising Kristol, but a mugging just the same. The inflated pomo world I had inhabited at the New School has popped like the dot-com bubble. Joan Scott’s retreat into universal liberal verities was an early symptom, but several years on, the lack of seriousness that had been synonymous with the nineties—the intellectual fads, the pop culture studies, the French theories—had collapsed under the weight of an economic meltdown. What once appeared to be a liberating application of high theory to essential aspects of political and cultural experience now seems silly. Tenured radicals have awakened out of their comfortable nineties slumber to reckon with full-scale catastrophe.

One figure encapsulates the shift from the heyday of cultural studies in the eighties and nineties more fully than any other: NYU American studies professor Andrew Ross. Here was the studliest cult stud of them all. Ross was the celebrity professor who rebelled against literature in favor of popular culture, who left a tenured position at Princeton’s stodgy English department to head up NYU’s booming American studies department, a place where gay porn mattered more than Hemingway. “I am glad to be rid of English departments,” he told a reporter for New York in 1994. “I hate literature for one thing, and English departments tend to be full of people who love literature.” He edited the red-hot academic journal Social Text, wrote books published by the theory-happy publishing house Routledge, and even dressed the part of puckish culture rebel with not just one but two earrings. A New York Times reporter at the Modern Language Association (MLA) conference in 1991 remembered Ross for his “hand-painted Japanese tie,” “mango wool-and-silk Comme des Garcons blazer,” and “wedge-heeled suede lace-ups recently acquired on West Eighth Street in Greenwich Village,” as much as for the paper he presented on “Mapplethorpe and 2 Live Crew”: “Tall, lean, with saturnine good looks, Ross attracts attention wherever he goes. ‘That’s him!’ comes a reverent whisper from a group of graduate students nearby. ‘That’s Andrew Ross!’”

In 1997, Ross’s status as king of the cult studs was confirmed in James Hynes’s novella Queen of the Jungle, whose central character sweats away as a literature postdoc in Iowa, hoping to clinch a position at a university in Chicago where his wife just received tenure. He writes a paper that makes a “linkage … between ‘The Metamorphosis’ and My Mother the Car.” His wife calls it “a little too Andrew Ross for me,” but that just fuels his desire to outline a new book with chapters like “The Sitcom at the End of the New Frontier: The Brady Bunch and The Wild Bunch in Contrapuntal Perspective.” So Andrew Ross.

Such gently satirical callouts were meant as homage to Ross’s wide-ranging influence. He was a wordsmith, prolific in intellectual output. His career took off in 1989 with No Respect: Intellectuals and Popular Culture. The book rode a cresting wave of voguish populism among the university’s theory elite. Ross derided the “well-known, conspiratorial view of ‘mass culture’ as imposed upon a passive populace.” He celebrated “the ability of people to variously interpret and use what they see and hear in mass-produced culture.” He sneered at intellectual elites who suffered from “paternalism, containment, and even allergic reaction” to pop culture—including, bizarrely, those who criticized the rigging of the quiz shows during the fifties. These snobs ignored the “extraordinary success and immense popularity” the shows evinced—meaning, one supposes, that their popular mandate entitled them to continue jacking up ratings by fraud.

The larger message of No Respect now seems embarrassingly trite. It amounted to this: if you find yourself digging television, don’t sweat it. Even pornography, as the reader found out in the second to last chapter, should elicit no concerns about the mechanization of sex but instead should be seen (if not celebrated) as representing an exciting conflict between the “discourses of popular pleasure” and the “morality laid down by the appointed or self-styled intellectual protectors of the public interest.” The Rossian populist revolt might be summed up with the slogan, Run to the video store!

The cult studs of the nineties took the putative political implications of such work very, very seriously. As George Herbert Walker Bush dispatched troops to the Middle East to free oil-rich Kuwait from Saddam Hussein, cult studs mounted the barricades of protest, armed with pop guns. Here’s the ringing first line from an op-ed that Ross wrote with Constance Penley condemning the Gulf War in the New York Times: “As scholars of popular culture, we spend a good deal of our time resisting the widespread assumption that people are passive consumers of the mass media.” In other words, they took the occasion of war to assail the “myth” of the “couch potato”—a stratagem that fell considerably short of inspiring the multitudes to take to the streets chanting, “No war for oil!”

What Ross and Penley succeeded in was advancing the kind of vapid sloganeering that passed for campus radicalism in those days. “It is war that makes people stupid, not TV,” they wrote. Never mind that a lot of people were getting their information about war from TV. What really mattered was the “racist and xenophobic aggression” that drove Americans’ war-time fervor. Porn was OK, and so were rigged quiz shows, but war was not. Unfortunately, this time the people were on the other side, supporting the war by a wide margin.

As it turned out, the Gulf War didn’t last long enough to test audience-reception theories and their relationship to military mobilization. And by 1992, everything had changed. The Clinton years were about to start. The tech bonanza was just around the corner, the Internet was blossoming, the hills were alive with the sound of globalization. As the economy boomed, cultural studies did too.

In 1992, the movement produced its foundational document, a doorstop of a book titled Cultural Studies that bulged with essays and transcripts of conversations held at a conference at the University of Illinois. Ross’s essay in the volume, titled “New Age Technoculture,” explored how “modern science’s founding sacraments”—note the equation of science and religious faith—were “rapidly disintegrating.” This erosion was on full, deliquescent display, Ross argued, in the wacky New Age cults that had grown out of the sixties counterculture. Ross wrote that educated elites saw New Age stuff as mere dross, the “lowest of the low.” But not an arch-populist like Ross: he saw “political lessons” in the New Age movement, which, to give just a smattering of examples, included such empowering diversions as “aromatherapy,” “Bach Flower Therapy,” “chakra therapy,” and “quantum healing.”

The essay was filled, as such work usually was, with flashes of silliness, mountains made out of molehills, and things-that-make-you-go-hmm elevated to a level of grave academic import. Consider how, in this hothouse atmosphere, Ross pounced on a certain New Age guru named Dr. Welles for failing to fully understand the politics of the toilet. Ross drove his verdict home against his opponent, who lacked “any description of the historical or ideological conditions under which immaculately white porcelain toilet technology was developed to demarcate squatting from non-squatting populations, and thereby create, if you will, an international division of excremental labor.” (Less high-flown critics might have taken this observation to mean that we were dealing with some real shit here.) Ross slammed shut the door on the traditional, compromised misreading of the toilet with a rhapsody of relativism: “the highly technological concept of a ‘correct’ or ‘natural’ human posture is itself a culturally relative idea for which no universal norm can be assumed.” The death of master narratives and universals explain everything else, so why not the ways we relieve ourselves?

Ross’s next step was to bend one knee at the altar of Madonna. One of the signature cult-stud moves in those days was to transform the disco chanteuse into a readymade vector of all things subversive, pansexual, and patriarchy-challenging. So in the collected set of essays titled (what else?) Madonnarama, our heroic theorist of demotic porn greeted the publication of Madonna’s salacious book-like object Sex with another overheated salute to dirty pictures. Sex represented “twenty post-Stonewall years of sex radicalism,” Ross wrote; the multivalent Madonna opus was “nurtured by its uneasy bedfellowship with free market enterprise.” Uneasy? The most cursory flip through the title in question showed it to be an all-out, race-to-the-bottom romance with the market.

In a saner discourse about mass culture, other questions might command attention—such as, who the hell cares about scandal-addled pop singers, and why would we want to read even more about omnipresent celebrities? Why bother putting academic gloss on what everyone knows is a commercial boondoggle? There was something so nineties about this sort of thing.

Which is not to say that the Rossian worldview was ephemeral. It has in many ways formed the template for so many later waves of fake-populist bullshit, delivered by the Ivy League hipsters in the squalid orbit of Vice magazine or the “creative class” working in the culture industry. There was also—and this explains in part Ross’s prolific nature—something easy about this work. Call it couch potato theory. Its user manual went roughly like this: absorb some liberating cathode rays; read something published by Routledge; then write, write, write. This briskly productive ethos accounts in part for the insular thinking that governed the allegedly populist cultural-studies academy. The sheer scale of theoretical labor on display ensured that few would challenge this or that reading of on-demand porn or the Weather Channel or the Madonna revolution. Ross never bothered to interview any ordinary Americans to find out if they were lulled by pop culture, or weren’t racist after all, or didn’t really give a shit about what sort of titillating fare went out for public consumption under the Madonna brand. Hell, no one did that in those days.

Ross just barreled ahead. He was in his groove, able to toss off observations about every species of American cultural effluvia. His career was skyrocketing. His visage made it into New York, the New York Times, even GQ.

THE ROSS star, like the NASDAQ index, continued its remorseless ascent, but only for so long. First came cracks, and then a bust. Reality started to mug Andrew Ross—albeit in distinct stages. The first crack in his brand came in May 1996, via an admirably executed literary hoax: the so-called Sokal affair.

Many of us remember this event as a turning point in American intellectual life (or at least a turning point for the cultural studies movement). Alan Sokal, a physicist at NYU, wrote a complete bullshit article, putatively about physics but mainly significant because it was chock-full of postmodern jargon and quotes from then-fashionable theorists. (Its absurd title: “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity.”) Sokal sent it to Social Text, hoping Ross and his fellow editors would publish it, positioning him thereby to expose their vapidity. The trick was a wild success, even garnering front-page coverage in the New York Times.

What we might not remember is just how squarely Sokal’s blow hit Andrew Ross. It was Ross, after all, who took leadership in obtaining the article. He wrote Sokal back in November 1994, soon after the physics professor submitted the article, to say that the editors found it “interesting”; in March 1995, he followed up with a letter to Sokal requesting that he revise the piece for inclusion in a forthcoming science-themed issue of Social Text. The editors then went ahead and accepted Sokal’s piece as it was. And at that point in the trajectory of Ross’s career, the editors of an anthology called The Sokal Hoax recount, “Ross’s visibility helped to ensure that Sokal’s hoax reached a wide audience.” More to the point, though, Ross’s views on science—as just another fiction or belief system relative to other contested narratives—were just the kind of balderdash that Sokal wanted to mock. Here, for example, is Ross’s critique of “objectivity” in Strange Weather, his book-length meditation on the cultural politics of the Weather Channel: “Any picture of the world purporting to be ‘natural’ and fundamental is in fact heavily underscored by particular moral and political beliefs about nature and social behavior.”

The Sokal hoax showed, in other words, all the classic signs of an intellectual mugging. Ross himself described feeling “snakebit” in the wake of the embarrassing disclosure that the whole thing had been a put-up job. Still, stodgy empirical matters could never deter the appointed course of theory, so Ross and his co-editor Bruce Robbins engaged in acrobatic apologetics. They explained that Sokal’s article appeared “a little hokey” to them but “not knowing the author or his work”—and not even bothering to pay him a visit in his nearby office on the NYU campus—“we engaged in some speculation about his intentions, and concluded that the article was” in earnest. But they didn’t send it out to anyone with a knowledge of science any deeper than what you might learn from the Weather Channel or the various philosophers of science published by Verso Books. In an especially telling maneuver, Ross tried to turn the political tables on Sokal, accusing the physicist of defending the science status quo. (The populist rebel in Ross just wouldn’t die.) At one point, Ross told the New York Times that Sokal had written “caricatures of complex scholarship,” now sounding like a boundary-policing academic elitist. He zigged this way and that. Katha Pollitt reported on a conversation with Ross in which he argued that “Sokal had possibly written his article seriously, and only now claimed it as a parody,” that “its being a parody was, in any case, irrelevant to its content,” and that “leftists should support Social Text out of ‘unity and solidarity.’” Solidarity, it seems, being the last refuge of the mugged.

Today, the idea that science is an elitist practice that excludes what ordinary citizens want to believe is no longer the domain of the populist academic Left. Like so many of the populist tendencies in cultural debate, it has become a hallmark of the Right. The same year the Sokal hoax occurred, the Discovery Institute was founded in Seattle, a think tank devoted to promoting an updated version of creationism called “intelligent design.” Discovery Institute scholars began to crank out papers echoing one of the better-known refrains from the academic culture wars, urging educators to follow an open-minded course of “teaching the controversy.” Teachers don’t need to endorse creationist curricula, the Institute’s argument goes; instead, they can teach intelligent design as just another paradigm like evolution—itself a master narrative requiring interrogation. Phillip Johnson, the chief intellectual guru behind the Discovery Institute, admitted his politics were rightward but claimed his ideas were “dead-bang mainstream” in “academia these days.” The same dynamic lurks behind the Right’s climate-change denialism—right down to the think-tank front groups. In other words, the postmodern Left of the nineties provided fertile ground for the anti-intellectual backlash of the following decade.

ANOTHER CRACK in the Andrew Ross brand had nothing to do with science or even with Ross himself; rather, it involved academia and the homelier aspects of casualized labor. Recall that Ross solidified his academic celebrity when he moved from stodgy Princeton to hipster NYU. NYU had not always been hip, of course. But in 1984, the school launched a bonanza fundraising drive, aiming for $1 billion that would be used immediately to upgrade infrastructure—which in New York City parlance means real estate. NYU was able to leverage the support of trustees who just happened to work at such places as Chemical Bank, J.P. Morgan, and Salomon Brothers, reaching the $1 billion target well ahead of schedule. I can remember from my New School days in the East Village how the NYU brand spread like an ink blot through the neighborhood, the university’s trademark royal purple flags hovering over more and more buildings, its dormitories pushing out old immigrant housing, even encroaching upon the Lower East Side.

As NYU boomed along with the dot-com bubble, according to The University Against Itself, it became “the most popular choice for college applicants” in America, and its bank accounts overflowed with ballooning tuition fees. Ross had no role in the fundraising scheme or the skyrocketing enrollment costs, but he was an asset of intangible value all the same. In much the same way that the boutiques of Eighth Street and Fifth Avenue helped lure suburban kids into NYU’s orbit, Ross was something of a boutique intellectual, a magnet for the fashionable and au courant. Trustees from big financial institutions couldn’t have cared less that Ross still claimed to be a Marxist—especially if that Marxist said crazy things that made for good publicity. Media profiles of Ross made the university’s stock go up. And graduate students, in their hip-looking glasses and nose rings, came in droves to study the hermeneutics of Star Trek and porn.

Hard as it may be to believe now, there was a time when grad students were cool. More important, they also worked as teaching assistants, running classes and grading papers on the cheap. No one paid it much attention at the time, but this pool of cheap labor helped nurture the NYU boom—and it was to prove an early casualty of late-nineties bubble culture. A year after the Sokal hoax, NYU grad assistants created the Graduate Student Organizing Committee and contacted the United Automobile Workers to organize a union. By 1998, they had enough cards signed to warrant a union election. They filed a complaint about their status as university workers in 1999, and a year later the National Labor Relations Board ruled in the students’ favor, affirming their right to organize as NYU employees. One person who testified on their behalf, a friend of mine named Joel Westheimer, found himself denied tenure soon thereafter. As part of his case, an attorney made public an email from Dean Ann Marcus that explained the labor practices of NYU in terms so bold they still shock: “We need people we can abuse, exploit and then turn loose.” Westheimer departed for a teaching position in Canada, and the graduate students are still trying to get NYU to recognize their union to this day.

To his credit, Andrew Ross came out on the side of the graduate students, thereby locking horns with the administration that had recruited him as a badge of NYU’s academic status. The edgy souls who had once looked like future symbolic analysts started to look more like sweaty workers; nose-ringed hipsters now were a pool of low-wage employees with lousy benefit packages—what the more class-conscious breed of cult stud might call the “global south” of American academe. The NYU boom had helped jump-start Ross’s renown, but now its dark underbelly was impossible to ignore. No longer could a cocoon of academic prattle shield theorists from the market’s abrasive workings.

IT’S HARD to miss the change in Ross’s tone since the late nineties. His writing has steadily shifted from consumption to work, from leisure to labor, from science in the abstract to workplace organization. He has, in his own words, “grown weary of armchair opinion” and resolved to become more of a reporter—not tossing off observations about the Weather Channel from the couch but actually going to workplaces and interviewing people.

There’s something more visceral and weighty about his work now. After all, it’s nearly impossible to get inside the heads of TV watchers to see whether they’re manipulated or empowered by the habit. But an observer can actually see the impact of globalization on work, or what Ross calls “downward wage pressure, and the establishment of a permanent climate of job insecurity” in Chinese workplaces. Merely by opening one’s eyes, one can see labor “casualization” in places like Silicon Valley and New York City, which serves as the basis of Ross’s important 2003 book, No-Collar.

In a series of interviews, Ross has explained the pronounced shift in his published work. He tells one interlocutor that there was a “certain degree of overcompensation involved in the cultural turn” he and others had pursued during the nineties—something of a world-class understatement. Still, he insists, there are points of continuity between his pre- and post-boom careers; he began examining the inner workings of the international textile sweatshop, for instance, thanks to his ongoing interest in fashion consumption. It’s better than presenting oneself as a mugging victim, I suppose.

Consider his most recent book, Nice Work If You Can Get It. Here Ross examines how paying attention to questions of labor can change the way we perceive culture. In explaining the rise of digital music, Ross writes, “As a devotee of these electronic genres, I could certainly count myself among those who believed that their inventive use of drum machines, samplers, and sequencers ushered in a quantum leap in musical progress.” This is the old language of the cult stud—imagining consumers appropriating cultural shards for their own, invariably progressive purposes. “Yet,” he goes on,

whenever I asked no-name working musicians who depended on live club and bar bookings what they thought of ‘DJ music,’ I was guaranteed an earful. There was no question in their minds that owners of live venues welcomed and encouraged a DJ-based economy of pre-recordings over musical acts because it cut their overheads and labor costs by eliminating drummers, keyboard players, guitarists, and vocalists. Killing off live music may have been sold to fans as a worthy crusade against the pretensions to authenticity of the rock aristocracy, but it was also a serious labor problem.

Once the fake populist posturing of the cult stud gives way to matters economic, the callow characterization of performing musicians as an “aristocracy” becomes instantly unsupportable.

Indeed, Ross’s broader engagement with labor relations has pushed him steadily away from cult-stud truisms and the New Economy boosterism they often resembled. Recall how the management theorists of the nineties championed the economy of the “free agent”? It was a simple matter, back in that heady age, to join the language of the empowered consumer to a celebration of free agents in the workplace, who had supposedly liberated themselves from stodgy corporate structures.

But the new Andrew Ross sees through the liberationist cant. He has spent a good deal of time observing those who are contending with the real-world consequences of the liberated workplace, and he has some bad news for the “creative class” apostrophized by writers like Richard Florida. Ross observes “a stripping away, or shredding, of layers of protection and social insurance against risk and insecurity. In the absence of safeguards and protections, the ultra-humane workplace could easily turn into a medium of self-exploitation—with bottomless seventy-hour-plus workweeks, and a dissolution of all boundaries between company and personal time.” The rebel employees of the nineties simply laid the groundwork for the workplace horrors that more and more workers will face in our brave new world. “No longer on the margins of society,” Ross observes about free agents, “in bohemia or the ivory tower, they are providing a rationale for the latest model of exploitation in core sectors of the information economy, and pioneering the workplaces of tomorrow.” In a remorselessly narrowing job market, the rebel worker and free agent have found themselves pretty much bankrupted.

What’s most satisfying about all this is that Ross is no longer talking about “the people” in the abstract, the way he did when he celebrated them as consumers. Now Ross talks about actual people in his backyard—even the graduate students at NYU. “The struggle for fair labor is not solely a geographically distant matter, played out in the poorest corners of the world, or among the lowest-paid domestic workers. It also applies to the degradation of domestic white-collar professions as the casualization of work in the domestic economy continues apace.”

Ross’s claim to be a man of the Left no longer sounds as weird or affected as it did during the nineties. Today he moves easily from observations on unfair work conditions to specific policy solutions—including labor union activity, “green-blue” alliances between environmental activists and organized labor, and consumer movements for reform. Increasingly, Ross sounds like a labor policy wonk. In Nice Work If You Can Get It, he discusses the sprawl and environmental devastation around Phoenix, Arizona, and proposes solutions that sound not sexy and daring but sensible and even viable. He calls for “an infill program” that would provide “tax credits, incentives and waivers” in order to “build in central city areas.” He explains that such a program could not only cut down on sprawl but also provide high-wage employment for construction workers. Yes, construction workers! How very un-Andrew Ross.

THE WEIGHTLESSNESS of the nineties is gone forever. And Andrew Ross was not the only one to change; there was an epidemic of intellectual muggings. The Boston University sociologist Alan Wolfe, who drifted rightward during the Clinton era, learned about the true nature of radical conservatism and has now written a book about the virtues of liberalism. Conservative bloggers like Andrew Sullivan and Charles Johnson have been mugged by the xenophobia of mainstream Republican Party discourse.

With these reversals in view, let me come full circle and propose a master narrative for contemporary American intellectual life: the silliness of the nineties has melted into a seriousness for the 2000s (and hopefully beyond). It feels as if the country’s going through a change similar to the one from the twenties to the thirties. During those gloriously awful years, you could hear the word “commitment” used to describe the attitude of writers moving out of the alienated “jazz age.” There was a sense that intellectuals had to make their work accessible to ordinary citizens by addressing their suffering. There was a downside to this, of course—the corrupt Stalinism of the era, and some really bad proletarian fiction that no one should ever have to read again. Still, the thirties were a time when “the people”—those whom writers like Sinclair Lewis and H.L. Mencken loved to bash as morons during the twenties—invaded the work of writers and intellectuals and pressed to be taken seriously. Economic insecurity changed the work of the mind. Highbrow writers like Edmund Wilson, who reported on the impact of the Great Depression in The American Jitters, felt obliged to go out, scuff their shoes, and observe people toiling through hard times. Such reporting, grounded in direct engagement with ordinary Americans, actually informed the policies that New Deal brain-trusters pursued—and sometimes even pushed FDR leftward.

None of this is to say that a new demotic turn in cultural inquiry will follow from our own recent wave of intellectual muggings. Nevertheless, it surely says something that a star professor who made his career by musing over the populist nature of cable programming and the liberating formal innovations of pornography would go on to ponder zoning and labor policy in ever-wonkier treatises on the downward spiral of working conditions. Sometimes it takes a good mugging, after all, to wake a person up.

Kevin Mattson is the author, most recently, of What the Heck Are You Up To, Mr. President?, now out in paperback.

My God. SHOTS FIRED.

I wonder what the squabble is really about because there are axes being ground here.
