The Guardian’s annual mating ritual with ‘business success’ only served to confirm the old proverb: post coitum omne animal triste est. The 1988 prize went to John Gunn of British & Commonwealth Holdings; two years later B&C crashed with £1 billion in liabilities. Richard Brewster, chief executive of the packaging group David S. Smith (Holdings), took the palm in 1990, praised by the judges for his ‘management skills in rebuilding an important British industry’. By 1991, he had gone – ‘hounded’, as one City commentator cruelly observed, ‘by the curse of having been nominated Guardian Young Businessman of the Year’. The award itself, like so many of its winners, conked out soon afterwards.
Typical British defeatism, some might say: a true professional would heed Victor Kiam’s wise words and ‘turn those negatives into positives!’ In 1987, five years after Tom Peters initiated the craze for management blockbusters with In Search of Excellence, most of the US firms he had marked as ‘excellent’ were in steep decline. The cover of Business Week magazine, which first noticed his remarkable inability to pick winners, carried the single word ‘Oops!’ Worse still, a comparative study in the Financial Analysts’ Journal found that whereas stocks in two-thirds of Peters’s model corporations had underperformed Standard & Poor’s 500 index, those in thirty-nine companies which were reckoned abysmal by Peters’s six ‘measures of excellence’ had actually outperformed the market over the same five-year period.
Was Peters abashed? Of course not: when opportunity knocks, the entrepreneur is always home! With admirable chutzpah, he capitalised on the blunder by writing another bestseller, which advocated entirely different solutions to the problems of American business. ‘Excellence isn’t,’ he announced in the opening sentences of Thriving on Chaos (1987). ‘There are no excellent companies.’ With his customary impeccable timing the book was published on Black Monday, when the Dow Jones plummeted by 20 per cent – thus apparently confirming his new discovery that the world had spun out of control and ‘no company is safe’.
Or, to quote the unimprovable headline on a despatch from the Agence France Presse news agency in the closing months of the year 2000: ‘Order, Chaos Vie to Shape 21st Century’.
3 It’s the end of the world as we know it
‘Before I draw nearer to that stone to which you point,’ said Scrooge, ‘answer me one question. Are these the shadows of the things that Will be, or are they the shadows of the things that May be only?’
Still the Ghost pointed downward to the grave by which it stood.
‘Men’s courses will foreshadow certain ends, to which, if persevered in, they must lead,’ said Scrooge. ‘But if the courses be departed from, the ends will change. Say it is thus with what you show me!’
CHARLES DICKENS, A Christmas Carol
For public intellectuals in the early 1980s, one little prefix was obligatory. Post-modernism, post-feminism, post-Fordism and ‘post-culture’ (a term coined by Professor George Steiner) all joined the lexicon of modish discourse. Within a few years, however, even these concepts had been superseded. When the economist Lester C. Thurow said that ‘the sun is about to set on the post-industrial era’, James Atlas of the New York Times posed the obvious question: ‘What follows post?’
In The Sense of an Ending, the British literary critic Sir Frank Kermode argued that humans require an illusion of order and narrative: ‘To make sense of their span they need fictive concords with origins and ends, such as give meaning to lives and to poems.’ With the fin de siècle and indeed the end of the millennium looming, there was a palpable sense of imminent closure and conclusion: the critic Arthur Danto announced ‘the end of art’; the New Yorker writer Bill McKibben was paid a $1 million advance for his book The End of Nature; an editor at the Scientific American, John Horgan, published The End of Science: Facing the Limits of Knowledge in the Twilight of the Scientific Age. As Atlas commented, ‘clearly, it’s late in the day’. President Reagan had celebrated the arrival of ‘morning in America’; before most people had even managed to finish their breakfast, however, the shadows of evening were already lengthening, and citizens who had been enchanted by the Gipper’s sunny optimism now seemed to have an almost masochistic yearning for gloomy jeremiads.
One unlikely beneficiary of this new appetite was Paul Kennedy, a bearded, soft-spoken Englishman from the history faculty at Yale University, whose 677-page tome The Rise and Fall of the Great Powers: Economic Change and Military Conflict from 1500 to 2000 was published by Random House in January 1988, with an initial print-run of 9,000. By mid-February it had sold more than 100,000 copies. Few of these readers, one must assume, were particularly curious about the old imperial powers of China, Spain, Holland or Britain, whose histories comprised most of the book. Nor, one guesses, would Kennedy have been inundated with requests from TV talk shows or invitations to address congressional committees had he stuck to his original plan and ended the narrative in about 1945. Deciding that this would be ‘a cop-out’, he added a speculative final chapter predicting that American hegemony was waning. Simply put, the Kennedy thesis asserted that great empires establish themselves through economic might, but are then obliged to spend an ever larger proportion of their wealth on military prowess with which to protect themselves from upstart rivals. The consequent ‘imperial overstretch’ seals their fate. It happened to the Bourbons and the Habsburgs, and would just as surely happen to the apparently omnipotent United States within a few years.
‘When a serious work of history with more than a thousand footnotes starts selling in Stephen King-like quantities,’ the New Republic commented, ‘you can be sure it has touched something in the popular mood.’ And, sure enough, a great power overburdened with defence commitments duly succumbed to imperial overstretch soon afterwards. Alas for Kennedy, it was not the United States but the Soviet Union, whose speedy and spectacular demise he had quite failed to foresee. Undaunted, he interpreted this unexpected plot-twist as confirmation of his prophecy of American decline, since the end of the Cold War reduced ‘the significance of the one measure of national power in which the United States had a clear advantage over other countries’, that is military strength. (The other measures of American dominance – economic, cultural, technological – were apparently invisible from Professor Kennedy’s study at Yale.) Besides, didn’t the collapse of Soviet Communism, which even many American Cold Warriors had thought impossible, re-emphasise his central point: that nothing lasts for ever? With that, at least, few could take issue – or so it seemed until the summer of 1989, when the National Interest magazine carried a fifteen-page article entitled ‘The End of History?’ Its author was an obscure young official from the policy planning staff of the US State Department, Francis Fukuyama.
Once again, a shy, tweed-jacketed historian woke up to find himself famous. Even those who disagreed with Fukuyama paid tribute to his intellectual audacity, which was further rewarded with book contracts, lecture invitations and a professorship at the Johns Hopkins School of Advanced International Studies. ‘How is it that some people become famous while others do not?’ a jealous rival asked. ‘Of course, it smacks of sour grapes for one of the latter to ask this about one of the former, but Francis Fukuyama’s career begs for the question. How exactly do you get ahead by boldly making one of the worst predictions in social science?’ The question answers itself: if you are going to be wrong, be wrong as ostentatiously and extravagantly as possible. Had Fukuyama confined himself to saying that the end of the Cold War marked a victory for economic and political liberalism, scarcely anyone would have paid attention, since identical observations could be found in newspaper editorials any day of the week. But he understood what was required to titillate the jaded palate of the chattering classes: simplify, then exaggerate. ‘What we are witnessing’, he proclaimed, ‘is not just the end of the Cold War, or a passing of a particular period of postwar history, but the end of history as such: that is, the end point of mankind’s ideological evolution and the universalisation of Western liberal democracy as the final form of human government.’ By the time he had expanded his essay into a book, two years later, even the question-mark in the title had disappeared.
The obvious flaws in this terminalist teleology were magicked away with similar nonchalance. The German philosopher G. W. F. Hegel, who was cited by Fukuyama as his chief inspiration, also believed that we had reached ‘the last stage of history, our world, our own time’ – but dated it to Napoleon’s victory at the Battle of Jena in 1806. Some political soothsayers might interpret this precedent as a cautionary tale of reckless complacency, but not Fukuyama. With nimble dialectic – or, if you prefer, shameless chutzpah – he argued that Hegel was right after all, since ‘the present world seems to confirm that the fundamental principles of socio-political organisation have not advanced terribly far since 1806’. Ergo, we had reached the zenith and terminus of political evolution: Nazism and Communism were mere ‘bypaths of history’. (‘How far shall we trust a “Universal History” that relegates the conflagrations of two world wars and the unspeakable tyranny of Hitler and Stalin as “bypaths”?’ the American commentator Roger Kimball asked in a review of Fukuyama’s book. ‘I submit that any theory which regards World War II as a momentary wrinkle on the path of freedom is in need of serious rethinking.’)
History is itself an ambiguous term, of course. It can mean no more than what occurs in the world, or the techniques for finding this out, but it is also the discipline that orders events and experiences into an evolutionary narrative – summarised by the Enlightenment historian Lord Bolingbroke as ‘philosophy teaching by examples’, and later defined by R. G. Collingwood (in The Idea of History) as the reality of the present tempered by the necessity of the past and the possibilities of the future. This too was declared obsolete by the Terminalists. Fukuyama did not merely foreclose all possibilities other than the universal and perpetual reign of liberal American capitalism, the predominant present reality; he was also implicitly slamming the door on the past, muffling the cries and whispers of previous generations. Since ‘all of the really big questions had been settled’, he argued, ‘in the posthistorical period there will be neither art nor philosophy, just the perpetual caretaking of the museum of human history’. Imagination, heroism and idealism would be supplanted by economic calculation.
Not the most appealing manifesto for a brave new world, you might think. And Fukuyama would agree – sometimes. When celebrating the ultimate triumph of liberal capitalism he chides those tiresome nations which are still somehow ‘stuck in history’, and means it as an insult; yet in his more wistful moments he admits that ‘the end of history will be a sad time’. Sad, and deeply dull: he fears that sheer boredom, married to ‘a powerful nostalgia for the time when history existed’, may yet ‘serve to get history started again’. The incredulous italics are mine: having asserted that modern American-style capitalist democracy is so manifestly unimprovable that it has seen off every conceivable challenger, Fukuyama casually concedes that this invincible titan could yet be overthrown by nothing stronger than the sleepy ennui of its beneficiaries. One is reminded of Karl Marx’s private confession to Frederick Engels after writing a newspaper article on the likely outcome of the Indian mutiny in the 1850s: ‘It is possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way.’
Fukuyama’s dialectic is similarly artful. He parades Hegel’s philosophy of history as supporting evidence for his own blithe certainty; yet he acknowledges (if only in a footnote) that the Hegelian historical terminus, the supreme desideratum, was not American capitalism but the absolute monarchy of nineteenth-century Prussia, described by Hegel as ‘the achievement of the modern world, a world in which the substantial Idea has won the infinite form’. No doubt Stone Age men and women, if they ever gave it a moment’s thought, assumed that their own way of life was just as immutable: few people have ever been able to imagine any kind of society other than the one that they inhabit. If Hegel was wrong about the eternal reign of Prussian absolutism, why should we believe that the present system has any more staying power? Fukuyama has an answer to that, too: ‘We cannot picture to ourselves a world that is essentially different from the present one, and at the same time better. Other, less reflective ages also thought of themselves as the best, but we arrive at this conclusion exhausted, as it were, from the pursuit of alternatives we felt had to be better than liberal democracy.’
The contempt for ‘less reflective ages’ is deliciously ironic. Fukuyama’s beloved Hegel had a fatal penchant for concepts such as the Substantial Idea and the World Spirit – the Geist – but, thanks to the Enlightenment’s legacy, he did work in an era which enjoyed an embarras de richesse of truly imaginative reinterpretations of the world. After the collapse of Communism there was an eruption of grand universal theories whose reflectiveness was in inverse proportion to their réclame. Cretinous oversimplification seemed to be what policy-makers and political analysts required. Where Fukuyama led, his old tutor from Harvard, Samuel Huntington, soon followed. His pitch for the Big Idea market – global chaos theory – mimicked Fukuyama’s own product-launch so closely, indeed, that one wondered if both men had taken the same correspondence course on How to Be a Modern Political Guru in Three Easy Lessons.
First, summarise your tentative thesis in an American policy journal: Huntington’s essay ‘The Clash of Civilizations?’ (note the query) was published by Foreign Affairs in the summer of 1993.
Secondly, devise a concept so arrestingly simple that it can be understood and discussed even by half-witted politicians or TV chat-show hosts. Again, Huntington was happy to oblige. ‘It is my hypothesis that the fundamental source of conflict in this new world will not be primarily ideological or primarily economic,’ he wrote. ‘The great divisions among humankind and the dominating source of conflict will be cultural. Nation states will remain the most powerful actors in world affairs, but the principal conflicts of global politics will occur between nations and groups of different civilisations. The clash of civilisations will dominate global politics. The fault lines between civilisations will be the battle lines of the future.’
Finally, having got everyone talking about your provocative new idea (Huntington’s article was translated into twenty-six languages), reap the rewards by expanding it into a bestselling book. The Clash of Civilizations and the Remaking of World Order was duly published in 1996.
There, however, any resemblance to Fukuyama ceased – or so it appeared. Huntington’s paradigm was generally taken as a rebuttal of Fukuyama’s Panglossian optimism. As well it might be, for Huntington was a perfect specimen of the gloomy realist, committed to maintaining the balance of power and profoundly mistrustful of utopian dreamers – or indeed anyone who thought or hoped that the human condition was susceptible to improvement. As he told an interviewer, ‘I am a child of Niebuhr’ – Reinhold Niebuhr, a Protestant theologian who believed that order could be preserved only by severe restrictions designed to bridle humanity’s inherent wickedness.
Huntington’s fellow-postgraduates at Harvard in 1950 included a chubby, precocious émigré called Henry Kissinger; a few years later, as a young don at the university’s School of Government, his closest colleague was Zbigniew Brzezinski, later a hawkish National Security Adviser to President Carter. Unlike his friends Kissinger and Brzezinski, Huntington remained in academe, seeking inspiration from tutorials and seminars rather than crisis meetings in the Oval Office (though he did advise Lyndon Johnson’s administration in 1967, and wrote a few speeches for Jimmy Carter a decade later). His modus operandi was set out in his first book, The Soldier and the State, in 1957. While admitting that ‘actual personalities, institutions and beliefs do not fit into neat logical categories’, he nevertheless insisted that ‘neat logical categories are necessary if a man is to think profitably about the real world in which he lives and to derive from it lessons for broader application and use’. Without abstraction, generalisation and simplification there could be no understanding. One reviewer complained that the text was ‘noisy with the sounds of sawing and stretching as the facts are forced into the bed that has been prepared for them’.
This was the technique he exercised thirty-five years later (and rather profitably, to purloin his own adverb) when formulating the Clash of Civilisations theory. He divided the world into ‘seven or eight’ distinct civilisations – Western, Islamic, Hindu, Latin American, Slavic-Orthodox, Confucian, Japanese and ‘possibly’ African. The artificiality of this taxonomy became most apparent with his startling declaration that Greece ‘is not part of Western civilisation’; because it happened to have the wrong sort of Christianity, the birthplace of European culture was filed alongside Russia under ‘Orthodox’. Guessing that this might raise eyebrows, Huntington cited an extra reason for excluding Greeks from the Western bloc: for a few years in the 1960s and 1970s they were ruled by a military dictatorship. Yet Spain, which endured the dictatorship of General Franco at the same time, was welcomed into his club without any awkward questions from the membership secretary.
The categorisation unmistakably reflected his own values and prejudices, as when he rebuked politicians in Australia for betraying the country’s Western heritage by seeking to ‘cultivate close ties with its [Asian] neighbours’. Mixed marriages between countries representing different cultures can never succeed, Huntington said, because ‘successful economic association needs a commonality of civilisation’. With characteristic perversity, however, he decided that incongruous alliances outside the Western world were entirely natural: hence, for example, his warning that an Islamic-Confucian coalition ‘has emerged to challenge Western interests, values and power’, as proved by the sale of Chinese weapons to Iran and Pakistan in the 1980s. The supply of Western arms to Saudi Arabia in that period, exemplified most conspicuously by the multi-billion-dollar Al-Yamamah contract, did not lead him to conclude that there is an equally ‘natural’ Christian-Islamic connection.
As the argument proceeds, it becomes increasingly clear that pedantic distinctions between, say, Japan and Thailand or Italy and Greece are a flimsy camouflage intended to disguise his even cruder overstatement: that the modern world can be defined as ‘the West vs the Rest’. Not that any camouflage was necessary. In a further article published by Foreign Affairs later in 1993, Huntington replied to those who had accused him of oversimplification with a defiant plea of guilty as charged: ‘When people think seriously, they think abstractly; they conjure up simplified pictures of reality called concepts, theories, models, paradigms. Without such intellectual constructs, there is, William James said, only “a bloomin’ buzzin’ confusion”.’
True enough, up to a point: free thinkers should always keep Occam’s Razor within reach, to cut through needless complexities and obfuscation. But Huntington’s arbitrary bladework served only to obliterate the reality that most conflict is not between civilisations but within them, as the inhabitants of Rwanda, Northern Ireland and countless other tribal cockpits know to their cost. As Edward Said pointed out, the theory made no allowance for ‘the internal dynamics and plurality of every civilisation; or for considering that the major contest in most modern cultures concerns the definition or interpretation of each culture; or for the unattractive possibility that a great deal of demagogy and downright ignorance is involved in presuming to speak for a whole religion or civilisation’.
Each of Huntington’s supposedly monolithic ‘civilisations’, even that of the West, includes different currents – fundamentalism, traditionalism, modernism, liberalism and so on. Timothy McVeigh’s bombing of the Murrah Building in Oklahoma City, like the sarin gas attack on Tokyo subway passengers by the Aum Shinrikyo cult, was a spectacular (though mercifully rare) manifestation of the tensions to be found within even modern and democratic cultures. These destructive assaults differ only in scale, not in kind, from the more frequent atrocities perpetrated by groups such as the Muslim Brotherhood in Egypt, the Taliban in Afghanistan or the Islamic Salvation Front in Algeria against fellow-members of their own ‘civilisation’.
For all its apparent novelty, Huntington’s eye-catching model was largely a reworking of the classical ‘realist’ theories that have long dominated the study of foreign relations, in which international politics is essentially an unending struggle for power between coherent but isolated units, each striving to advance its own interests in an anarchic world. The only difference, as critics pointed out, was that Huntington ‘has replaced the nation-state, the primary playing piece in the old game of realist politics, with a larger counter: the civilisation. But in crucial respects, the game itself goes on as always.’
Curiously, Samuel Huntington’s conservative pessimism – with its emphasis on cultural predestination, its narrow religiocultural definition of what constitutes a ‘civilisation’, its reluctance to accept the possibility of cross-pollination between cultures – echoed many of the tenets promoted by those self-styled radicals in the West who had marched down the dead-end of ‘identity politics’. Both effectively denied people the freedom to choose their own affiliations and associations, imposing lifelong allegiance to a club which they never applied to join. The Nobel laureate Amartya Sen described the Clash of Civilisations theory as nothing less than ‘a violation of human rights’, which may sound like hyperbole until one recalls that the Ayatollah Khomeini ordered the murder of Salman Rushdie because the London-based ‘blasphemer’ had Muslim forebears; had The Satanic Verses been written by a white Anglo-Saxon, no fatwa could have been promulgated. Professor Sen cited the bloodshed in Rwanda, Congo, Bosnia and Kosovo – and the rise of violent Hindu chauvinism in his own birthplace, India – as further evidence that the amplification of one distinctive identity ‘can convert one of many co-existing dividing lines into an explosive and confrontational division’.
The Clash of Civilisations and the End of History were invariably regarded as opposites – often, indeed, the only two alternatives available. ‘These are the two touchstones of any debate about the future direction of the world,’ the Washington Post reported. ‘They’re the theoretical elephants in the room. The old debate about capitalism vs. communism has been replaced by Fukuyama vs. Huntington.’ Because of the yearning for binary simplicity, and the obvious tonal contrast between the respective optimism and pessimism of these two academic jumbos, few noticed just how much they had in common. Both were rigidly determinist in their insistence that humanity’s fate had been preordained, whether ideologically or culturally, and grotesquely reductionist in their refusal to acknowledge the complex pluralities that constitute those vague abstractions ‘history’ and ‘civilisation’. Just as Fukuyama effectively erased Nazism and Stalinism from his account of the past 200 years because they didn’t fit, so Huntington ignored the fact that neither the number nor the causes of conflicts had changed much over the years. People still took up arms for the traditional reasons – territorial hunger, economic desperation, religious zeal, lust for power, defence against external threats or internal rivals. Nevertheless, the polished sheen of his neat Manichean theorising dazzled many Western policy-makers – not least because the phrase ‘global chaos theory’ gave an extra veneer of scientific method to his coarse generalisations. Although he would be horrified by the comparison, Huntington aped the techniques of Soviet Communists who boasted of the inevitability and irrefutability of ‘scientific socialism’; and perhaps he had learned a trick or two from the post-modernist intellectuals of the 1980s whose freestyle riffs about truth and reality were given a semblance of empirical rigour by being expressed in the language of advanced physics and mathematics. It might seem an unlikely influence, since the deconstructionists presented themselves as radicals bent on demolishing reactionary grand narratives such as the Clash of Civilisations.
But were they? As we shall see, these self-styled progressives had more in common with the conservatives than they would care to admit.
4 The demolition merchants of reality
You propose then, Philo, said Cleanthes, to erect religious faith on philosophical scepticism; and you think that if certainty or evidence be expelled from every other subject of inquiry, it will all retire to these theological doctrines, and there acquire a superior force and authority. Whether your scepticism be as absolute and sincere as you pretend, we shall learn by and by, when the company breaks up: We shall then see, whether you go out at the door or the window; and whether you really doubt if your body has gravity, or can be injured by its fall; according to popular opinion, derived from our fallacious senses, and more fallacious experience.
DAVID HUME, Dialogues Concerning Natural Religion (1779)
Colin MacCabe, an obscure young Fellow of King’s College, Cambridge, was denied a lectureship by the English faculty’s appointments board in January 1981. Not the sort of news that would usually merit a paragraph in the university newspaper, let alone the national press: yet the rebuff to MacCabe, an expert on the novels of James Joyce and the films of Jean-Luc Godard, was reported on the front page of the Guardian. When MacCabe returned to England from a trip abroad a couple of days later, he found himself mobbed by reporters and photographers at Heathrow airport. His failure to gain tenure at the university provoked demonstrations in the streets of Cambridge and earnest debate on current affairs programmes. Newsweek cleared a page for the story (under the inevitable headline, ‘Unquiet Flow the Dons’), which it described as ‘one of the most extraordinary debates in the [university’s] eight-century history’:
Dons who normally confine their disputes to sherry parties leak damaging rumours about each other and threaten libel suits. Charges of academic sleaziness and intellectual persecution fly back and forth. Television crews roam King’s Parade to catch the carping of talkative academicians … Angry students began seeking to have the entire English faculty board suspended, and MacCabe sympathisers spoke of breaking away to form their own department.
Even some of his enemies agreed that MacCabe was an excellent scholar and teacher; but he was also a ‘post-structuralist’ who believed in analysing literature through study of its linguistic rules and cultural assumptions. Although MacCabe argued that these methods were no great radical departure from the traditions established by earlier generations of Cambridge dons – I. A. Richards and William Empson both undertook close formal analysis of the language of literary texts, while F. R. Leavis and Raymond Williams attempted to place novels within the general cultural history of the country – he did admit that it was the ‘enormous explosion of work in the mid-Sixties in Paris’ by structuralist and deconstructionist pioneers such as Jacques Lacan, Jacques Derrida, Roland Barthes, Louis Althusser and Michel Foucault which had ‘galvanised me and many others’, thus confirming the suspicion among traditionalists that MacCabe was the carrier of a dangerous foreign germ which would infect the whole corpus of English teaching unless he were swiftly quarantined. In the words of the anti-structuralist don Christopher Ricks, ‘It is our job to teach and uphold the canon of English literature.’ Ricks’s colleague Ian Jack added that ‘one does want to keep the attention of students focused on the great writers’. On the other side of the barricades, Dr Tony Tanner described the treatment of MacCabe as ‘the most unjust thing I have ever seen in academic life’ and resigned from the faculty’s degree committee in protest. Raymond Williams, the grand old man of Marxist criticism, was voted off the appointments board for defending MacCabe. So, more surprisingly, was Professor Frank Kermode; though not a structuralist or semiologist himself, he argued that the university ought to accommodate a plurality of critical styles and techniques.
Having succeeded in forcing out Colin MacCabe, the Cambridge conservatives continued to guard the gates against foreign barbarians for many years. (As a young lecturer observed, ‘Cambridge is an island in some ways, cut off from the rest of the country. When I ran into a colleague in London once, he said: “Fancy seeing you in England.”’) At a degree-awarding ceremony in March 1992, three of them shocked the hundreds of proud parents assembled in Senate House by standing up and shouting ‘non placet’ – thus imposing a temporary veto on the proposal to give an honorary doctorate to Jacques Derrida, the sixty-two-year-old doyen of deconstructionism. But although Cambridge may have won the odd battle, it was the continental theorists who won the war. When Derrida came to speak in Oxford a few weeks before the Cantabrigian yell of ‘non placet’, he drew an audience of 1,800 – as against the 400 who turned up at the Oxford Union that month to hear the Hollywood star Warren Beatty. The success of the theorists’ long march through the institutions can also be gauged by Colin MacCabe’s career: immediately after his eviction from Cambridge a full-blown professorship was created for him at Strathclyde University; three years later he was appointed head of production at the British Film Institute and, for good measure, professor of English at the University of Pittsburgh.