Monday, February 23, 2009
One Legend, then Another, then a New One
Much impressed by American Conservatory Theater's production of Souvenir: A Fantasia on the Life of Florence Foster Jenkins. Foster Jenkins was a '30s- and '40s-era socialite who fancied herself a soprano diva notwithstanding her near-total lack of talent. Her fame grew through recitals for privileged audiences until she rented Carnegie Hall, to her eternal fame and shame. She was evidently oblivious, and died without warning, while purchasing sheet music, only a few months later.
Judy Kaye is magnificent in the title role. The whole thing was riotously funny ... and poignant, and they played both to the hilt.
There is not a lot of room for rumination in this one, the more so given that the audience is obviously willing to laugh without reservation at a woman who really should be embarrassed. At the time, the audiences hid their ridicule sufficiently that she could and would carry on. We have no such compunction these days. It took me some time before I could watch the early auditions of American Idol without my finger on the channel changer. The really awful acts are obvious setups, but there is this stratum of people who truly think they can sing but they just plain cannot. They ask for it, they get it, and we get to watch. But it is agonizing to witness their humiliation, and even worse to see them rail against it. No matter how plainly the plain-spoken Simon spits it out, they just do not know that they are "awful."
No excuses for me here ... I am a big American Idol fan and I will blog again this year about some of the highlights and lowlifes. But the terrible screeching of American Idol just does not compare to the symphonic monstrosity of Florence Foster Jenkins ... try this on for size:
The other performance in Souvenir was Donald Corren's rendering of the aptly named Cosmé McMoon, the Mexican-born and obviously gay man who was her long-time accompanist. The narration was introspective ... he wondered how it was that he ended up doing this thing. I somehow doubt the original Cosmé had any such reservations ... money is money to the starving artist. And society is society to the fag trying to find a niche. It's never easy, but it was particularly hard for a gay man to make it in the New York of the 20s, 30s, and 40s. I thought Corren was spectacular. And the closet he played was very After Dark, which was apropos, I suppose.
But I am an out-of-the-closet fightin' fag, and so ...
The Oscars and All That
Huzzah to Sean Penn who gets it, and talks the talk and walks the walk. Who will care about the skimpy charms of Slumdog Millionaire in a year or a month or a decade? But Milk cuts, makes history, renders for anyone to see something that must be seen and that the enemies of sight deny to sight. (Shame on Jonathan Rauch for his scandalous pandering to religious bigots. It has never been about assuaging bigotry ... they can keep their bigotry. We have no claim on it. But we deserve to live as any other person lives, and that is what Sean Penn understands, but Jonathan Rauch is willing to give up for a bag of fool's gold.)
I saw Cleve Jones on Rachel Maddow, and he was crystal clear about the nature of the task. He said that the lesson was not so much 1978 as 1964, when the black movement understood that we can never get our freedom state by state or county by county. The bigots thrive in the states and the counties. We need a clear federal mandate that gay people are people. Period. That's it. Religious bigots are welcome to hide under whatever bridge they can find or whatever cave they can slither into. But they have no place in the civil rights of fellow human beings.
Did you hear that, Obama?
Before I come back to Obama, I have to note that seeing Cleve was a little shocking. We all age. I knew Cleve in passing ... our paths crossed as he was moving on to a kind of entrepreneurial activism and I, a recent immigrant to San Francisco, remained a true believer in the hardy band of lovers who had created the gay movement in the 70s. I mean no slight by the term "entrepreneurial activism", the more so because ultimately Cleve managed to create the Quilt whose power needs to be witnessed to be understood. But I did not have that in me, and my true believerism eventually devolved into retirement and ultimately a career outside the movement. So it goes.
But Cleve back then was such a sweet puppy of a young man ... and now he is not. His voice is as clear as ever, and his understanding unbowed by compromise or cowardice. I admire him. I just wish we did not have to age.
So to Obama, who needs to get with the program ...
That said, what a speech:
Not much to add to all the press, but the non-State-of-the-Union was a masterpiece. How much joy to the rhetorician's heart is a President who rides the language like a surfer on the waves of the ocean of speech.
Posted by Arod in San Francisco at 20:56
Sunday, February 15, 2009
Stanley Goes to the Fish Market
I read Stanley Fish's blog in the New York Times pretty regularly, and he recently published a piece called The Last Professor. The post questions the survivability of the humanities in the academe, and refers to a recent book by one of Fish's students, Frank Donoghue, entitled "The Last Professors: The Corporate University and the Fate of the Humanities."
Fish argues for "higher education as an enterprise characterized by a determined inutility" ... I refer to a similar concept in the phrase "the tyranny of relevance." Fish writes (and I have conflated two paragraphs for clarity's sake):
This view of higher education as an enterprise characterized by a determined inutility has often been challenged, and the debates between its proponents and those who argue for a more engaged university experience are lively and apparently perennial. The question such debates avoid is whether the [ideal of Michael Oakeshott that "there is an important difference between learning which is concerned with the degree of understanding necessary to practice a skill, and learning which is expressly focused upon an enterprise of understanding and explaining"] (celebrated before him by Aristotle, Kant and Max Weber, among others) can really flourish in today’s educational landscape. It may be fun to argue its merits (as I have done), but that argument may be merely academic – in the pejorative sense of the word – if it has no support in the real world from which it rhetorically distances itself. In today’s climate, does it have a chance?
Both Fish and Donoghue say "no" ... regardless of the merits, scholarship as an end in itself untethered to an explicit and argued social utility cannot and will not survive.
There is plenty of evidence to back up their skepticism, not the least of which is that students increasingly look to post-secondary education with career foremost in mind. The last flush of inutilitarian idealism might have been the 70s, when people like me took Chinese language classes premised on the notion that memorizing Tang dynasty poetry could lead to fluency. The result is that I can imperfectly recite two lines of a famous poem and scratch out around 25 characters.
I believe, and I am in agreement with Fish here, that scholarship is justified for anything that is knowable, or anything about which we still do not know everything. That covers pretty much any subject. To require an explicit tie between knowability and utility orients the scholar's work not toward the object studied but toward a present paradigm that may have nothing to do with that object. This was precisely the tyranny-of-relevance error of the whole theory digression that bedeviled literary studies for a couple of decades ... I am happy to aver that the grip of airless theory seems to be loosening.
Having said that ... that study itself is sufficient argument for scholarship ... methinks Professor Fish got lost in the marketplace. Because as with everything, the pure argument, the single source of truth, is always wrong, at least because it is incomplete, but more so because it closes the door to the fuller explication of the underlying dialectic. You see, scholarship has never been free of the tyranny of relevance in any meaningful overarching sense. Certainly there is the Professor X who spent three decades deciphering runes while his colleagues snickered at his tawdry clothes and poor hygiene. But even Professor X lived in an institution situated within a society, an institution subject to rulership, economics, pressures, and social changes.
Complexity is what interests me, and reductive-to-the-single-point arguments run out of steam. I can stamp my foot all I want about pure scholarship ... I could rage on about the undeniable fact that most professors produce precious little of memorable quality ... and I could feel good about it. But it would not get me closer to understanding the dynamics that are shaping scholarship in the current era.
I think you need to start by counterposing pure scholarship and social context. They are never separate, though social context certainly pays vastly less attention to pure scholarship than the other way around. The fascination is in the complexity of that relationship. I wrote a few days ago about a forum on the term "Islamism", and I hope that I prefigured a couple of these points there. Emmerson, the scholar talking about Islamism, took the term out of its robust and fractious social context and laid it before us as something to be studied. I think that Emmerson lost a chance to change scholarship by insisting on writing a book about this term instead of creating a living monument on the web. But that demonstrates precisely the struggle between the pure scholarship of wresting the term from its context and looking at it as a captured wave (not a fixed or reified object), and the social context that wants some feedback from the pointy-heads who do this kind of stuff.
Emmerson lives in a rarefied world, the world of the endowed scholar who just thinks and writes. Great work if you can get it. I think that world has a responsibility to knowability, even when the objects it might choose to study are not relevant, or, what is more significant, posed-as-relevant, to some current issue or problem. In other words, we can demand of endowed scholars that they do something, and then we can tell them that we are free to make of what they do what we want. They can't own it or keep it back. Again, that is a dialectic.
The key to saving the humanities is to inhabit both terms of that dialectic.
At MRU, the major research university where I pile seashells on the beach in exchange for loaves and fishes, I have argued on numerous occasions ... and you must realize that I have no formal standing to make this argument ... that the humanities departments need to seize the idea of the double major. I like this one: French and Civil Engineering. Or, Classics and Urban Studies. Or, Art History and Mechanical Engineering. Or, History and Earth Systems.
Yes, there is a grave danger here that the humanities find themselves tied to a stated and required relevance. But, so long as one keeps this slippery slope in mind, such formally combined majors enable the humanities to inject into the real world the perspective of pure scholarship. More significantly from the social context point of view, they can round up some students and increase their own footprint.
So, the French and Civil Engineering student works in Africa for the United Nations, the Classics and Urban Studies student intertwines classical ideas of open space and citizenship with modern transportation systems, the Art History and Mechanical Engineering student retains the notion of beauty as he proposes a bridge on a freeway, and the History and Earth Systems student comes to this most complex of human issues with an understanding that we have ripped our environments to shreds since we first walked the earth.
The humanities can surrender the social context parts of the double major to the science and engineering departments, and thereby liberate themselves to their own genius, to wit the inutilitarian search for the knowable.
I propose this example by way of outlining a path for the humanities to engage with the world that threatens them, even as they operate to save their most precious perquisite ... the ability of scholars to define for themselves what is worth studying. Moaning that the world has no room for us, well, that is an ancient complaint that has an odor not of striving and research but of coffee and upholstery.
A few further notes on complexity and military history
Last night at a dinner of five pointy-heads, my excellent friend Ian and I sparred about the Dresden bombing, and that spun into Hiroshima and the Japanese surrender. Dresden is in the news these days because February 13 was the 64th anniversary, and the day often occasions neo-Nazi marches about war crimes committed against Germany. There is a nice piece in the English-language Der Spiegel here, in which Frederick Taylor, a British historian who has written extensively on Dresden, is interviewed. (I suspect he is the son of the noted left-wing British historian A. J. P. Taylor, but I could not find any proof of that.)
Taylor rejects the view that Dresden was chosen only because it was an architectural jewel, and as part of a singular effort to destroy Germany by destroying civilian morale; he argues that these were certainly factors, but that it was also a significant military target. One can quibble, and we certainly now know that the carpet bombing strategy was not a significant factor in the defeat of Germany. But I find the reduction of the bombing of Dresden to a war crime to be a kind of moral cubbyhole that exempts the viewer from the full range of dialectics at play in the moment. In other words, once we assign the single factor and require that all argument start there, we lose the ability to understand how things actually happen, how strategies in the high sense devolve into tactics on the ground, if I can make a macabre pun. But complexity of argument can readily come back to its starting point, and Taylor, the champion of a complex argument about Dresden, does precisely that. The aforementioned interview ends thus:
SPIEGEL ONLINE: Do you think it was justifiable?
Taylor: Personally, though I can trace the logic of it, I have serious doubts. It is a ghastly example of how war depletes the moral reserves even of democratic nations. Goetz Bergander, who survived the bombing of his native city as an 18-year-old and has written widely about its destruction, has described the bombing in his characteristically forgiving way as "outsize." It was certainly all of that.
I think a similar moral cubbyhole has long obscured clarity in argument, especially among liberals, about the events that ended the Pacific War in 1945. It has been argued that the Japanese were ready to surrender, but that either we needed to give them time or we needed to figure out some way not to force unconditional surrender. There is no evidence that I know of that supports this other than assertion. Significantly ... and this is a grave error in anyone seeking to understand military history ... no credit in this view is given to the implications of the fog of war. What the Americans knew was that resistance had not stopped and that large numbers of troops were being amassed in the South in preparation to resist the assumed invasion. That the Russians had entered the war likely had more impact on the Americans than on the Japanese military leadership, much of which was committed to a fight to the death of the last Japanese; why would they care whether that final plucked life was given up to a Russian or an American?
But once we credit a notion that the Japanese were ready to surrender before Hiroshima, or that they really surrendered after Hiroshima because they were afraid of the Russians and not the bomb, we avoid a grindingly difficult moral argument: did the A-bombs result in a net reduction in deaths; and, if so, were they thereby justifiable?
That is the moral crux of the end of the Japanese war, and it should not be avoided.
A last note:
I have meant to write about this for a few days, so I append it here. There was a touching article in the New York Times entitled "My Sister's Keeper". It concerns aging colonies of lesbian separatists who have kept the faith, as it were, with that peculiar 70s ideology. I want to note that the lesbian separatists did damage to the gay movement, and they are not heroes of mine even if they have lived their lives courageously as they saw fit and so desired. The gay movement of the 70s was overwhelmingly male not because we rejected women but because women rejected us. They had lots of excuses ... all men are rapists, the gay boys have too much sex, we expected the women to make the coffee while we did the movement ... but only one argument is real. Feminism of the 70s was homophobic in a dominant sense, that is in its majority and in its actions. Lesbians active in the women's movement felt they had to hide or at least downplay their homosexuality. The separatists, while hardly hiding from feminism, hid from the world and proclaimed that isolation as liberation. They contributed nothing to our present freedom, and their absence weakened us both in the short term and in the long term. I wish them well in their retirement, but the record tells the tale.
Photos by Arod. The owl is from a window on Divisadero Street; the lion is from a Board of Education building on Van Ness; the window reflection shot is just off Market near Gough; the statue is from Golden Gate Park; the finger is from Hayes Street (I think) near Masonic; all are from San Francisco.
Posted by Arod in San Francisco at 10:15
Labels: Gay, History, Lit crit, Mythologies, Thought, Tyranny of Relevance
Saturday, February 07, 2009
You took two parking spots!
The roommate, with whom I commute as well as live, had jury duty, so I was sitting alone in the wee '86 Honda Civic Si, which I alternately call Red or Czar, listening to the BBC News. In the rain, the rare, rare rain. Counting the minutes until I made a short dash to the train platform under an elevated freeway to catch the 7:19 Baby Bullet Caltrain #314 to MRU, the major research university where I sort hayseeds for contributions to my wellness fund.
And there across the street is a gigantic planet-killing SUV, slowly, inexpertly, inauspiciously backing into a clumsy park and, characteristically, taking two parking spots. The young woman gets out of the car, and unloads sundry purses and bags, along with a tiny dog. I am a cranky old bugger and, as regular readers will know, I am occasionally given to expressing my crankiness at particular breaches of the social contract. So I open the door and lean my leonine head into the drizzle to shout, "Ma'am, you took two parking spots!"
She heard me, paused a moment, and then carried on with her yuppie unloading ... and walked away without apparently giving even a second's thought to rectifying her self-absorbed assault on subsequent parkers. I was steamed, but I tried to put it out of mind to pay more attention to the latest update on the Sri Lankan operations against the Tamil Tigers ... I'm for the government in that one, by the way.
Not the worst crime, taking two parking spots, but emblematic of this me-me-me culture of giganto planet-killers and cell-phone Nazis and barren suburban sprawl. Riles me.
But that is not my point.
I got to thinking about a committee on which I serve at MRU ... in fact, a committee I chair. The committee has proposed an activity which I, in my majesty, think is a waste of time. I fret about such things ... fretting is genetically predetermined in me, and I inherit it from my sainted mother but not, I think, from old Dad, whom I resemble in so many other ways. Fretting used to be nothing more than self-torture, but in my magisterial aging, I have found ways to put fretting to use. One factor in this is to have a ready-made retreat from fretting available ... I learned how to do this during a particularly nasty academic hair-pulling and mud-wrestling exercise that took place all around me in the terminal years of my doctoral "process". I try to maintain, at ready head as it were, a salubrious or at least diverting topic to which I can turn whenever I find myself turning pirouettes over something I cannot control, leastwise in the immediate term. One of my favorite retreat topics is Mars exploration; another is imagining what it would be like to be abandoned on an island, like Alexander Selkirk.
But I digress ... in fact, this entire blog is a digression ... in fact, there is nothing I like more than digressing ... in fact, I think I have digressed enough that I return to my main point which is ... digressing is how I manage to think about things, so given the threat of a committee not agreeing with me, I wonked around the Internet until I stumbled upon a Washington Post opinion by David Ignatius entitled, in classic self-congratulation, "The Death of Rational Man." You can read it for yourself if you want his point. My point is to make my point, not his point, so even pointing to his article is pointless if my point is to keep you on my point. Lord ...
The sentence that provided me with a strategy to explain my opposition to the committee's agenda was this one:
A pre-mortem analysis can provide a real "stress test" to conventional thinking. Let's say that a company or government agency has decided on a plan of action. But before implementing it, the boss asks people to assume that five years from now, the plan has failed -- and then to write a brief explanation of why it didn't work.
Thank you, Mr. Ignatius ... that thought has given me something to "keep in mind" for the rest of my life. It is always worth thinking through what it would be like, and what the factors would be, if your plan utterly failed. Glass half full indeed. I suppose this is something I do all the time anyway ... but it is gratifying, crystallizing to read it put so deftly.
I clapped and wrote a little script for Monday when the committee meets, and I have rethought all my previous thoughts, and fretted myself to the point where I think that I can make my objections rational and positive ... the key to winning one's point in most cases ... and co-opt the intentions of those who proposed something else into a solution that includes all of us. Goody, goody.
Then I read this:
... a Japanese proverb ... "An inch ahead is darkness."
An inch ahead is darkness. From the corporeal present to the ethereal future. If the committee goes against me, it will amount to an annoyance and not much more. But those bastards taking two parking spots, driving planet killers, consuming like there is no tomorrow ... an inch ahead is darkness.
So my fretting now is no longer the immediate "committee" but the disasters that await us all.
That is so much more satisfying.
Tonight's beverages ... classics both ... a Tonga Zombie and a Long Island Iced Tea. I guess I am no longer a "cheap drunk".
Photos by Arod, all part of a set I uploaded today to Flickr.
Posted by Arod in San Francisco at 20:00
Labels: Hell, Rambling, Three Rules
Thursday, February 05, 2009
Islamism: What's in a Word?
I went to a lecture and exchange today featuring the Southeast Asianist scholar Donald Emmerson. He is writing a book with a colleague that amounts to a debate concerning the use of the word "Islamism" and the validity or lack thereof of self-censorship when confronting such a charged term. This colloquium was a presentation of his thought process with reference to his colleague's counter process. It raised some issues that are dear to my heart.
So, pursuant to a challenge issued at the event by the scholar Olivier Roy, I will start by providing a definition. Islamism, in my view, is the ideological use of Islam to influence state and society to adopt Islamic policies and practices. The term may be new, but the practice is old. I find it risible to suggest that Islam is just a religion and that political action taken in its name should not reflect upon it. Islamists, like fundamentalists of all stripes, draw on their religious heritage to justify their campaigns, whatever they may be. All religious textual traditions contain irresolvable antinomies precisely so that they can be used variously to argue for peace and war, love and hate, acceptance and rejection. So no matter how violent or moderate the Islamist, they are from the religion and they argue their own legitimacy from the religion.
Enough of me.
Emmerson proposed a little diagram to explain his argument about the validity or lack thereof of self-censorship in addressing a term like "Islamism" ... I am not going to reproduce the graphic because I am not sure if it is proprietary or if he wants it bandied about. But he counterposes accuracy and considerateness, so that something that is accurate and considerate is contextualization; something that is considerate but inaccurate is denial; something that is accurate but inconsiderate is candor; and something that is inaccurate and inconsiderate is stigmatization. Nice, and I think you can see where this is going.
I thought Emmerson produced a lengthy and erudite rationale for sometimes using the word so long as it is well-grounded. His colleague ... I cannot for the life of me remember the name ... refuses to use the term ever. And the aforementioned Olivier Roy thinks it is a perfectly fine term ... he claims to have invented the phrase moderate Islamist in 1984 ... and uses it whenever he bloody well pleases, just so long as it is well-defined.
The subsequent comments and Emmerson's responses represented an erudite debate about language, usage, the role of scholarship in the world, and methodology. A knee-jerk response might be to find it all pointy-headed and airless, but that was just not what was actually happening. Emmerson pointed out that scholars actually have an increased role in public debate these days; I hasten to point out that the notion that involvement in public debate justifies a given scholarship is a kind of tyranny of relevance. But it happens to be true in this case. There are many scholar talking heads on the news shows and especially on PBS. They too get rounded up into the sound-bite hell of modern broadcast TV, but they do have the opportunity to introduce terms and perspectives that would have been lost in earlier eras.
So the first issue is this ... since language matters and since language is fluid ... sometimes like water, sometimes like glass, mostly like something in between ... public rumination on the particulars of language has a high function. This is hardly confined to the present. I am presently reading Peter Brown's excellent The Rise of Western Christendom, and he relates how the seizure of Latin by the church during Carolingian times liberated the vernaculars to develop much more freely ... and all of that arose as part of the debate about precise terminology and usage. In the case of "Islamism", the media increasingly use the term as equivalent to terrorist, and that usage certainly serves a purpose. But simply to surrender a term, especially in these days of aggressive and ephemeral appropriation, is to surrender a discourse. No matter that one might eventually lose a debate, one always needs to make one's points. So at the least, that function was served today.
At one point Emmerson argued that he had tried to change his usage of Islamism from representing a political Islam to representing a public Islam. I thought this was a dangerous slide because public does not need to represent an attempt to enforce a politico-religious ideology. But another participant had a sharper notion as a counterpoise to Emmerson's idea that public was broader (and, by my way of seeing his point, more forgiving) than political. This young man with an Arabic accent said that in repressive societies you can include secret activity in the political, and that might be broader than the restricted realm of the public. So the public protestation of devout Islam in a repressive circumstance might be seen by authority as a challenge, and hence Islamist, but it is more readily tolerated and subsumed than secret political activity, which is thereby decidedly Islamist. Because, as always with religion, it comes back to the state. There is no Islamism without looking at the state.
But more interesting to me than this ramble is this curiosity. Why did Emmerson choose to write a book that amounted to a dialogue, and to include in this book the responses of 12 selected contributors? Why not mount this on some kind of web site, and make it a dynamic experiment in public scholarship? The reason is two-fold ... first of all, the academe at its highest levels has been incapable of finding a route into the new modes of publication, and no one has risen to take the bull by the horns, as it were, to force the issue. So, secondly, Emmerson seeks to freeze something in time, to make a monument. But the time for monuments is past, and the Emmersons of the world lose the immediacy of their arguments by fixing them in a book. This sort of argument is Internet-worthy, but not book-worthy, because by the time it appears its urgency will have passed.
In my work, there are no excuses. The new technologies press in on us every day, and the urgency is constantly present. It's time for that to be true of the scholars who inhabit the sandstone building a stone's throw from the modular unit where I toil for wages.
This post developed vastly differently from what I imagined as I walked to the train tonight ... but here it is nonetheless.
Photos by Arod. Gawd noze why I chose these, though gawd noze everthang so it must be rite.
Posted by Arod in San Francisco at 19:23
Labels: Islam, Lit crit, Tyranny of Relevance
Tuesday, February 03, 2009
Justinian and Aurengzeb
These two great monarchs, separated by more than a millennium, Justinian (b 482 - r 527 - d 565) and Aurengzeb (b 1618 - r 1658 - d 1707), are exemplars of the lost opportunity hidden by bitter military success. But in each case, it is not the lost opportunity that is well-known. Allow me to elaborate.
History readers give up on mystery to a great degree because, generally, when you study a period, you know how it turns out. So when I speak of lost opportunities, I suppose I am re-introducing an element of mystery. The path not taken leads nowhere in the event, and speculations on where it might have led are fruitless if still potentially fascinating. So from prologue to provisos ... both men stood on the cusp of Central Asia. Justinian's empire crept up to Central Asia but never conquered it, as Justinian wasted precious time in heading west; Aurengzeb's empire was a sort of fugitive from Central Asia, and he famously turned his back on his origins and headed south.
Justinian was arguably the last great ancient emperor. We in the West prefer to think that Rome died in 476 with Odoacer's displacement of Romulus Augustulus ... one of the really lucky sods ever to rule in the western empire, as he survived his displacement and lived out his life in comfort near Ravenna. But by then, Rome had been increasingly and predominantly eastern and Greek for at least two centuries. Justinian confronted on his eastern border the Persian Sasanian empire, another last of the ancients. I am not well schooled in the Sasanians, and, frankly, few are. They hung on for a couple of centuries, famously swept aside like so much dust by the Arabs at Qadissiya (in modern Iraq) in 637. But for the hundred years before then, they were more or less constantly at war with the Romans ... whom we prefer inaccurately to call the Byzantines, but I don't think they are Byzantine until after the Arab conquests, perhaps not until, say, 843, which is traditionally the date ascribed to the birth of Eastern Orthodoxy.
The Sasanian parts of what is now the Middle East, parts of Syria, Iraq, and Iran, had huge Christian populations, too little studied and known. Christianity was actually making great headway across Central Asia in a similar trajectory to what the Muslims would pursue only a century later. The Sasanians periodically repressed Christianity, primarily when it was adopted by their own nobility who, they felt, should stick to imperial Zoroastrianism. But the non-military Christian populations were allowed to muddle on, and by reason of the state's unconcern for orthodoxy, heterodoxy flourished, particularly in the form of the Monophysites (i.e., Christ is of one divine nature, perhaps the right wing of the argument in which the Nestorians, for whom Christ is of human nature, are the left wing; the Orthodox and the Catholics fit in the middle where Christ is both and simultaneously divine and human ... theology is bunk, and one needs these little signposts to keep pressing through the historical issues). Such Christians might prefer the Persians over the Romans because their theological disputations would not end up in death or exile or some other mortification. That preference explains to a good degree why the Christians of the Middle East quickly accommodated to their Arab conquerors.
So my question is this: why didn't Justinian set himself to the task of destroying the Sasanians; why did he not look East in his early reign? He wasted his time reconquering Italy and North Africa, and they provided him with insufficient revenues to justify the expense of holding them, even as he increasingly stripped them of troops in favor of the "eastern front". In 542-3, a great plague ripped through the Mediterranean world, and the initiative was lost. Justinian spent most of the rest of his reign ... he famously did not physically leave Constantinople for 50 years ... trying by subtle persuasions and brutal repressions to impose upon his entire empire his sole view of Christian orthodoxy.
He did not succeed. The combination of the ravages of the plague and Justinian's religious nonsense and the liminal character of much of what is now the Middle East led to a situation where imperial control was distant and imperial allegiances were absent. Cities atrophied and urban communities found their primary allegiance through a religious communitarianism in which the enemy was their heretical neighbor. When the Arabs conquered in a mad whirlwind dash only seven decades later, they found a society that did not care a whole lot who ruled them so long as they were left alone in their cellular religious particularism.
What would have happened if Justinian's early career had been devoted to crushing the Sasanians instead of the pointless exercise of resubjecting the littorals of the Western Mediterranean? If he had ignored the imperial self-absorption of requiring each of his subjects to have the identical view of the precise nature of the godhead? Perhaps then the Arabs would have confronted a society that resisted their religion over the centuries, a society that subordinated its conquerors, as has so many times been the pattern, rather than the conquerors converting the conquered.
What ifs are perhaps best left to cocktail conversation ... and in defense of this notion, as I write this I am down one sublime martini made with Dolin vermouth and Junipero gin and one "PampleRose", which is a gin and Campari invention of my excellent roommate and bartender R. But let me aver that the failure of Justinian to look past his religious Caesarism served to foredoom Roman Asia to Islam. Religious monomania is always destructive ... and I suppose that is a subset of religion being the root of all evil.
This tragic false focus repeats itself a millennium later in northern India. Aurengzeb was as much a religious fanatic as Justinian, albeit a Muslim one, and it is his religious blindness that brought his dynasty, the Mughals, low and, one might argue, cursed India for centuries to come.
Aurengzeb was the sixth Mughal emperor, the last great one, and the only one of the great emperors to successfully overthrow his father, Shah Jahan, notwithstanding the shibboleth that the Mughal sons were prone to overthrow their fathers ... in fact only one succeeded, despite various other attempts. The important point is that the Turkic system of succession was designed to turn royal sons into generals and governors so that the ultimate winner would be best positioned to advance the interests of the dynasty. Jahangir tried to revolt against Akbar, but Akbar defeated his attempt. When Akbar died, Jahangir moved quickly, perhaps inspired by his unsuccessful revolt, to counter his potential enemies, including his son Khusraw whom he blinded. The future Shah Jahan revolted against Jahangir, but he too was defeated only to succeed a year later when dad died, and against the wishes of his father's favorite wife, the famous Nur Jahan.
Westerners tend to look at the Mughals in a comic-bookish sort of way. We remember the bizarre and corpulent lassitude of the hangers-on and placeholders whom the British used and abused and eventually displaced. And we warm to the aforementioned reductionist tales of sons tormenting fathers and all that. But the Mughal tale is a pivot in Indian history. As with many pivots, it did not have to turn out as it did, and contemporaneous observers of the early Mughals probably thought it would not turn out as it did. Babur (b 1483 - r 1526 - d 1530), a Chaghatay Turk said to descend from both Genghis Khan and Timur, generally known as Tamerlane, swept down from Afghanistan and showed northern India what Central Asia had known about Turks and Mongols on horseback. Babur, by the way, was certainly gay. His son Humayun (b 1508 - r 1530-1539 - r 1555 - d 1556) lost the empire to Sher Shah Suri, struggled with his brothers, found refuge with the Shiite Safavids in Persia, and then regained the empire just in time to die a bizarre death ... he caught his foot on his robe and fell to his death.
Humayun was succeeded by the great Akbar (b 1542 - r 1556 - d 1605), whose reign was the occasion of a grand experiment in religious and artistic syncretism. That experiment continued with the succession of his son Jahangir (b 1569 - r 1605 - d 1627) and his son Shah Jahan (b 1592 - r 1627-1658 - d 1666), whose artistic achievements were crowned by the Taj Mahal. But Shah Jahan was a decidedly more Islamic ruler, and less given to the syncretism that had slowly curdled since the death of Akbar.
The real story of the accession of Aurengzeb (b 1618 - r 1658 - d 1707) is religion. He was a Muslim fanatic, one of four brothers who went after each other during their father's sudden, only seemingly fatal illness from which he ultimately completely recovered. The key battle was between Aurengzeb and the oldest brother Dara Shukoh (sometimes Dara Shikoh). They met at Samugarh near Agra on May 30, 1658 ... one of those great turning point battles little known beyond locals and experts. After Samugarh and a long flight, Dara Shukoh was ultimately subjected to a cruel execution.
Dara Shukoh probably took his succession for granted, and certainly made errors of hubris in his war against his younger brother. Dara Shukoh was a religious syncretist, the pupil of a Sufi shaikh who argued that the Vedas and the Upanishads were the concealed scriptures mentioned in the Qur'an (Richards, page 152). His accession to the throne would have continued a religious experiment and perhaps created a party of syncretism that would exist to this day. More importantly, he would not have been the arrogant and unquenchable pursuer of Hindu rebels in the South that Aurengzeb became.
The state that Aurengzeb inherited was rich and stable and produced vast tax revenues. But he could not sit still; he spent precious little time in his capital and pursued for the bulk of his reign destructive campaigns against Hindu rebels in the South, of whom the most storied is Shivaji Bhonsla (1627-80). Neither side could win, and they exhausted themselves in endless campaigns. Meanwhile, imperial authority in the north atrophied, and so did the authority of the dynasty and of Aurengzeb's successors, who were a bunch of nincompoops! That is what the British confronted as their power rose.
So to me Aurengzeb's victory over Dara Shukoh is the lost opportunity: a powerful kingdom with a syncretic ruler and patron of the arts might have confronted the British and negotiated a different sort of colonial encounter. His victory was a victory of orthodoxy, and orthodoxy stifles.
So Aurengzeb left a weakened empire that was conquered by Christians; he also bequeathed modern India its hard differentiation between Muslims and Hindus. And Justinian left a weakened empire that was largely conquered by Muslims, and bequeathed the modern Middle East a few weird Christian sects struggling for air in a vast Muslim sea.
Might it have been different? We'll never know.
Photos by Arod; any connection with the post is obviously enigmatic at best. The middle photo is one of the dolls that Winfield expertly makes for his Christmas tree.
Posted by Arod in San Francisco at 19:19
Labels: History, Horses and Peoples, Islam, Religion