MONTHLY BLOG 43, MIS-SPEAKING …AND HOW TO RESPOND

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

When we talk for a living and don’t do it to a written script, there’s always a chance of getting the words wrong. Mostly it doesn’t matter. Phrases can be rephrased, self-corrections swiftly made. The sentences flow on and listeners hardly notice. Yet sometimes a sudden silence tells the speaker that a blunder or infelicity has been noted. Funnily enough, I remember a few times when I’ve felt that sudden frigidity in the atmosphere, but can hardly remember exactly what I said wrong. So my attempt at a confessional is somewhat thwarted by the human capacity for benign forgetting.

For many years now, I have adopted the policy of giving all my lectures and talks from notes. They are sometimes written and detailed, sometimes just in my head. There’s always a structure, often threefold. I began that policy when one of my old friends protested that he was disappointed when I lectured from a fully written script. (Strangely, when I next heard him lecture, he too had a written text). But there’s no doubt that such a practice is much more boring than free-speech. So I threw away my scripts and launched into freedom. It was nerve-wracking at first but then became really good fun. I now positively enjoy lecturing, because free-speaking requires a great mix of relaxation and concentration, which really keeps one mentally on one’s toes. Talk about living in the here-and-now. But, as already admitted, there’s always a chance of mis-speaking.

The quickest response to a blunder is a quick admission, ‘No, that came out wrongly’ or ‘No, forget that: let me put the point a better way’. Another option is a self-deprecating joke. That’s generally the best way, thawing the atmosphere and making room for a revised statement. Alas, however, the appropriate quips don’t always come to mind immediately. How often does one wake in the middle of the night with the perfect riposte, which had proved elusive during the daylight hours?

(The answer to that rhetorical question is actually: not that often, since I generally sleep soundly. But sometimes …)

In fact, I often mull over conversations after the event, thinking of what was said or unsaid. It’s one way of understanding my partner in life, who is a keep-his-cards-close-to-his-chest sort of person. I appreciate that, since I have the same trait, under an outward show of chattiness.

Anyway, in the course of mulling over my contributions to asking questions in academic seminars, I am aware that there’s a fine line between jokes and jibes that work, and those that don’t. My aim is to make some genial general observation, which is intended to open up the wider implications of the question in hand, before homing in on a specific query. Doesn’t always work, but that’s my aim. It’s not a tactic that I recommend to beginners in academic life; but something that I require of myself as a comparative senior.

On one occasion, I made a sharp remark about the panel of speakers, who were enthusing over historic riots. My aim was to tease them about the contrast between their academic respectability and their admiration for lawlessness (if in a good cause). It was the precursor to my question, not the major point. But anyway, it went down like the proverbial lead balloon. Made me seem to be avoiding engagement with the issues at stake – just the reverse of my intention.

These particular panellists reminded me somewhat of my late uncle, Christopher Hill, the eminent Marxist historian. He loved historic outlaws, pirates, highwaymen, and vagrants, as well as earnest seventeenth-century Puritans, who challenged the unquestioned authority of traditional religious teaching in an era when it was difficult to do so. In fact, Hill wrote a book about them, entitled Liberty against the Law (1996), which aptly expressed his appreciation.2 The fact that the worthy Puritans of whom he wrote approvingly would have hated the irreligious and a-religious outlaws with whom they were yoked did not trouble him. From his virtuous life of laborious and enjoyable study, Hill enjoyed the raffish life of the outlaws vicariously. And why not? Many of us have mixtures of Puritanism and libertinage within us. I was too hard on him, in my thoughts; and needlessly sardonic with my colleagues.

Unlikely fellows in the cause of ‘Liberty’: (T) an ascetic Puritan divine, in this case the American theologian/evangelist Jonathan Edwards, from an engraving by R. Babson and J. Andrews; and (B) the highwayman Dick Turpin on his famous steed Black Bess (in a Victorian image).

So what should I have done? Worded my point in a more felicitous way, which I would have done, if writing. Or deleted my little joke at their expense? Probably the latter. I was playing the footballer and not the ball. Breaking my own rules for seminar questions. (The point might not be amiss in a review, where viewpoints can be explained more fully.) So the occasion – and the disapproving silence from the audience – has taught me something useful for the future.

Lastly, a chance to record a fine response to another example of mis-speaking, this time not by me. The occasion was the book launch of F.M.L. (Michael) Thompson’s urban history of Hampstead (1974). The Mayor of Camden had been asked by the publisher to make a suitable speech. That he did, before ending, ungraciously: ‘But I shan’t read this book’. Probably he didn’t mean to be so rude. Perhaps he really meant something like: ‘But I fear that this volume may be a bit too learned for me …’. Either way, his remark did not meet the moment. It seemed to express a traditional and unhelpful strand of anti-intellectualism in the working-class Labour movement. (Not the entire story, of course, since there is another strand that values engagement with learning and adult education).

Be that as it may, I still remember Mike Thompson’s lunging riposte, at the end of his gracious speech in reply. Having thanked his wife, publisher and friends, he then thanked the Mayor in his civic capacity: ‘But I shan’t vote for you’.

Well done, Mike. I hope not to mis-speak again. Yet, if it happens accidentally, I hope that I get as neat a riposte.

1 For more on Christopher Hill (1912-2003), see P.J. Corfield, ‘“We are All One in the Eyes of the Lord”: Christopher Hill and the Historical Meanings of Radical Religion’, History Workshop Journal, 58 (2004), pp. 110-27; and within PJC website as Pdf/5.

2  C. Hill, Liberty against the Law: Some Seventeenth-Century Controversies (Penguin: London, 1996).

3 F.M.L. Thompson, Hampstead: Building a Borough, 1650-1964 (Routledge: London, 1974).


MONTHLY BLOG 42, CHAIRING SEMINARS AND LECTURES

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

The aim is to get everyone involved in a really good discussion, aiding the speaker and the seminar/lecture participants alike. By ‘good’, I mean critical but supportive. Any criticisms, of course, should be directed at the paper, not at the speaker: as in football, kicking the ball, not the person.

Okay, that sounds pretty easy. How best to promote the desired result? At the start, it’s essential to begin the proceedings in an open and genial manner, with a joke or, failing that, at least a humorous tone. Nothing like a murmur of laughter to weld a group together. Then the speaker should be introduced pithily, without notes. None of these lengthy recitations of everything that he or she has ever done, which make everyone drowsy. And certainly no advance-guessing by the chair of the points that the speaker ought to make – thus stealing (or bodging) the thunder before the show has begun.

By the way, from the start the chair should make a point of visibly and fairly slowly looking all round the room, bringing everyone within an encompassing gaze. And do this more than once. I call it giving the lighthouse beam.

During the paper or lecture, whether good or bad, the chair has to look alert and listen. It encourages the speaker and the audience; and it’s necessary, as from time to time the speaker refers to the chair (perhaps to ask how much time is left). Actually, that’s why I like chairing, as it keeps one wide-awake. Ideally, speakers should have been briefed before the meeting about the length of talk required. But the chair should always confirm that at the start; and then gently halt speakers who go on for too long. On a formal occasion, a printed card saying TIME! can be passed to the speaker but, informally, a hand signal usually suffices. There’s always some leeway on these things. If the speaker is part of a panel, then strict timekeeping is essential. In other circumstances, it’s the chair’s judgement call. But don’t allow too much over-running, as the audience gets at first restive and then somnolent.

While the speaker is talking, I usually make a mental list of the key questions raised by the paper. A good seminar or lecture audience will usually spot them all; but it’s a useful backup. Immediately after the paper, it’s absolutely essential for the chair to make some suitable response while people gather their thoughts. It’s always bad news when the chair just says abruptly: ‘Any questions?’ And even worse when there’s a great silence and the chair adds dolefully: ‘Well, I can see it’s going to be a difficult session’. Lead balloons all round.

Instead, the chair should briefly thank the speaker (nothing over the top) and note the range of issues raised by the paper (that’s helpful for beginners). Followed by an ‘opening’ question, to get the discussion going. Not too detailed or heavy; but not a patsy either.

While the speaker answers, the chair should look intently round the room to encourage people to signal that they have questions. This is the really crucial bit. If at all possible, the chair should sit up, or semi-stand, leaning against a chair or table, to free the sightlines. Then the lighthouse beam can skim lightly over everyone there. Preferably with a smile. People usually give almost imperceptible signals – a nod or lift of the hand. It’s rather like the sly nods and winks at an auction, though fortunately not quite as covert.

Usually, the questions are taken in the order that they come. But, if there’s a long list of respondents, then it’s helpful to call people from different parts of the room. That draws everyone into the discussion.

Very rarely indeed, there are rude or out-of-order questions. The chair should then intervene, extracting the element within the question that can be answered and telling the speaker to ignore the rest. Or, if the question is completely out of order, the chair should simply say so. That is more likely to happen in political meetings than in academic gatherings. And even then, it’s rare. Other problems sometimes occur with poorly phrased or incomprehensible questions. The speaker is entitled to look to the chair for help, so be ready to paraphrase the question into something answerable.

Discreetly, the chair is conducting the discussion; and should have a range of questions up his/her sleeve to throw into the pool, if the questioning flags. Difficult depth-charges can be used especially against the good and the great, who shouldn’t be let off too easily.

Beginners, however, should not be given too hard a time – enough to test them but not to destroy. It’s good to intervene with some supportive words, if they are seriously floundering, though the debate must be allowed to flow.

In terms of manner, the chair should be genial; but not too ‘in’. It’s best to avoid calling people to speak by their first names or, even worse, by unfamiliar nicknames. Such references make the group seem too cliquey and seriously deter newcomers. When calling people, I refer to them by location: ‘A question at the back’ and ‘Now a question from this side of the room’ and so forth. Not the sartorial references favoured by TV chat-show hosts: ‘the person in the blue jumper’; ‘the woman with glasses’, which are too impersonal.

It should go without saying, in free-flowing academic events, that speakers should not be called in order of academic seniority. The old-style seminars, when the professors speak before everyone else, in rank order, can still be found and even appreciated as a rarity. But that’s what they should remain.

Lastly, the question of tone. The chair must be friendly but not over-fulsome. That just sounds sycophantic. At the same time, the chair must be critical but not too sardonic. As an auto-critic, I wince at memories of the times when I’ve tried to be sharp but just come over as waspish. The sardonic remark that isn’t funny really isn’t funny. Luckily these moments (only a few, I hope) get lost in the flow. It’s the paper or lecture that gets remembered. Where is the next academic gathering to chair? I’m ready with my lighthouse beam.
1 See PJC BLOG no 27, February 2013: ‘Asking Questions’.


MONTHLY BLOG 41, HISTORICAL REPUTATIONS: DISAPPEARING FROM HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

What does it take for individuals to disappear from recorded history? Most people manage it. How is it done? The first answer is to die young. That deed has been achieved by far too many historic humans, especially in eras of highly infectious diseases. Any death before the age of (say) 21 erases immense quantities of potential ability.

After all, how many child prodigies or Wunderkinder have there been? Very few whose fame has outlasted the immediate fuss in their own day. A number of chess-masters and mathematicians have shown dramatic early abilities. But the prodigy of all prodigies is Wolfgang Amadeus Mozart, who began composing at the age of five and continued prolifically for the remaining thirty years of his life. His music is now more famous and more widely performed than it ever was in his own day. Mozart is, however, very much the exception – and his specialist field, music, is also distinctive in its ability to appeal across time and cultures.

A second way of avoiding the attentions of history is to live and die before the invention of writing. Multiple generations of humans did that, so that all details of their lifestyles, as inferred by archaeologists and palaeo-anthropologists, pertain to the generality rather than to individuals. Oblivion is particularly guaranteed when corpses have been cremated or have been buried in conditions that lead to total decay.

As it happens, a number of frozen, embalmed, or bog-mummified bodies from pre-literate times have survived for many thousands of years. Scholars can then study their way of life and death in unparalleled and fascinating detail. One example is Ötzi the Iceman, found in a high glacier on the Italian/Austrian border in 1991, and now on dignified display in the South Tyrol Museum of Archaeology at Bolzano, Italy. His clothing and weaponry reveal much about the technological abilities of Alpine hunters from over five thousand years ago, just as his bodily remains are informative about his diet, health, death, and genetic inheritance.1 Nonetheless, the world-views of individuals like Ötzi are matters of inference only. And the time-survivors from pre-literate eras are very few.2

Ötzi the Iceman, over 5000 years old but initially thought to be a recent cadaver when discovered in 1991:
now in the South Tyrol Museum of Archaeology, Bolzano, Italy.

The third way of avoiding historical attention is to live a quiet and secluded life, whether willingly or unwillingly. Most people in every generation constitute the rank-and-file of history. Their deeds might well be important, especially collectively. Yet they remain unknown individually. That oblivion applies especially to those who remain illiterate, even if they live in an era when reading and writing are known.

‘Full many a flower is born to blush unseen/ And waste its sweetness on the desert air’, as Thomas Gray put it eloquently in 1751 (in context, talking about humans, not horticulture).3 One might take his elegiac observation to constitute an oblique call for universal education (though he didn’t). Yet even in eras of widening or general literacy, it remains difficult for every viewpoint to be recorded and to survive. In nineteenth-century Britain, when more people than ever were writing personal letters, diaries and autobiographies, those who did so remained a minority. And most of their intimate communications, especially if unpublished, have been lost or destroyed.

Of course, past people can also be known through many other forms of surviving evidence. The current vogue in historical studies (in which I participate) is to encourage the analysis of all possible data about as many individuals as possible, whether ‘high’ or ‘lowly’, by making the information available and searchable on-line.4 Nonetheless, historians, however determined and assiduous, cannot recover everybody. Nor can they make all recovered information meaningful. Sometimes past data is too fragmented or cryptic to have great resonance. It can also be difficult to link imperfect items of information together, with attendant risks: on the one hand, of making false linkages and, on the other hand, of missing real ones.

Moreover, there are still many people, even in well documented eras, whose lives left very little evidence. They were the unknowns who, in George Eliot’s much-quoted passage at the end of Middlemarch (1871/2): ‘lived faithfully a hidden life, and rest in unvisited tombs’.5 She did not intend to slight such blushing violets. On the contrary, Eliot hailed their quiet importance. ‘The growing good of the world is partly dependent on unhistoric acts’, she concluded. A realist might add that the same is true of the ‘bad of the world’ too. But again many lives remain hidden from historic record, even if the long-term impact of their collective actions and inactions has not.

Finally, there is concealment. Plenty of people then and now have reasons for hiding evidence – for example, pertaining to illegitimacy, adultery, addiction, crime, criminal conviction, or being on the losing side in warfare. And many people will have succeeded, despite the best efforts of subsequent scholar-sleuths. Today, however, those seeking to erase their public footprint face an uphill task. The replicating powers of the electronic media mean that evidence removed from one set of files returns, unbidden, in other versions or lurks in distant master files. ‘Delete’ does not mean absolute deletion.

Concluding the saga of The Mayor of Casterbridge (1886), the bipolar anti-hero Michael Henchard seeks to become a non-person after his death, leaving a savage will demanding ‘That I be not buried in consecrated ground & That no sexton be asked to toll the bell … & That no flowers be planted on my grave & That no man remember me’.6

Non-Person © www.idam365.com (2014)

Today: yes, people can still be forgotten; or even fall through the administrative cracks and become a non-person. But to disappear from the record entirely is far from easy. Future historians of on-line societies are going to face the problems not of evidential dearth but of massive electronic glut. Still, don’t stop writing BLOGs, tweets, texts, emails, letters, books, graffiti. If we can’t disappear from the record, then everyone – whether famous, infamous, or unknown – can take action and ‘bear witness’.

1 For Ötzi, see: http://en.wikipedia.org/wiki/%C3%96tzi.

2 See P.V. Glob, The Bog People: Iron-Age Man Preserved, transl. R. Bruce-Mitford (London, 1969); D.R. Brothwell, The Bog Man and the Archaeology of People (London, 1986).

3 T. Gray, ‘Elegy Written in a Country Churchyard’ (1751), lines 55-6.

4 See e.g. Proceedings of the Old Bailey Online, 1674-1913: www.oldbaileyonline.org; London Lives, 1690-1800: www.londonlives.org; Clergy of the Church of England Database, 1540-1835: www.theclergydatabase.org; London Electoral History, 1700-1850: www.londonelectoralhistory.com.

5 G. Eliot [Mary Ann Evans], Middlemarch: A Study of Provincial Life (1871/2), ed. W.J. Harvey (Harmondsworth, 1969), p. 896.

6 T. Hardy, The Mayor of Casterbridge: The Life and Death of a Man of Character (1886), ed. K. Wilson (London, 2003), p. 321.


MONTHLY BLOG 40, HISTORICAL REPUTATIONS THROUGH TIME

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

What does it take to get a long-surviving reputation? The answer, rather obviously, is somehow to get access to a means of endurance through time. To hitch a lift with history.

People in sports and the performing arts, before the advent of electronic storage/replay media, have an intrinsic problem. Their prowess is known at the time but is notoriously difficult to recapture later. The French actor Sarah Bernhardt (1844-1923), playing Hamlet on stage when she was well into her 70s and sporting an artificial limb after a leg amputation, remains an inspiration for all public performers, whatever their field.1 Yet performance glamour, even in legend, still fades fast.

Bernhardt in the 1880s as a romantic Hamlet

What helps to keep a reputation well burnished is an organisation that outlasts an individual. A memorable preacher like John Wesley, the founder of Methodism, impressed many different audiences, as he spoke at open-air and private meetings across eighteenth-century Britain. Admirers said that his gaze seemed to pick out each person individually. Having heard Wesley in 1739, one John Nelson, who later became a fellow Methodist preacher, recorded that effect: ‘I thought his whole discourse was aimed at me’.2

Yet there were plenty of celebrated preachers in Georgian Britain. What made Wesley’s reputation survive was not only his assiduous self-chronicling, via his journals and letters, but also the new religious organisation that he founded. Of course, the Methodist church was dedicated to spreading his ideas and methods for saving Christian souls, not to the enshrining of the founder’s own reputation. It did, however, forward Wesley’s legacy into successive generations, albeit with various changes over time. Indeed, for true longevity, a religious movement (or a political cause, come to that) has to have permanent values that outlast its own era but equally a capacity for adaptation.

There are some interesting examples of small, often millenarian, cults which survive clandestinely for centuries. England’s Muggletonians, named after the London tailor Lodowicke Muggleton, were a case in point. Originating during the mid-seventeenth-century civil wars, the small Protestant sect never recruited publicly and never grew to any size. But the sect lasted in secrecy from 1652 to 1979 – a staggering trajectory. It seems that the clue was a shared excitement of cultish secrecy and a sense of special salvation, in the expectation of the imminent end of the world. Muggleton himself was unimportant. And finally the movement’s secret magic failed to remain transmissible.3

In fact, the longer that causes survive, the greater the scope for the imprint of very many different personalities, different social demands, different institutional roles, and diverse, often conflicting, interpretations of the core theology. Throughout these processes, the original founders tend quickly to become ideal-types of mythic status, rather than actual individuals. It is their beliefs and symbolism, rather than their personalities, that live.

As well as beliefs and organisation, another reputation-preserver is the achievement of impressive deeds, whether for good or ill. Notorious and famous people alike often become national or communal myths, adapted by later generations to fit later circumstances. Picking through controversies about the roles of such outstanding figures is part of the work of historians, seeking to offer not anodyne but judicious verdicts on those ‘world-historical individuals’ (to use Hegel’s phrase) whose actions crystallise great historical moments or forces. They embody elements of history larger than themselves.

Hegel himself had witnessed one such giant personality, in the form of the Emperor Napoleon. It was just after the battle of Jena (1806), when the previously feared Prussian army had been routed by the French. The small figure of Napoleon rode past Hegel, who wrote: ‘It is indeed a wonderful sensation to see such an individual, who, concentrated here at a single point, astride a horse, reaches out over the world and masters it’.4

(L) The academic philosopher G.W.F. Hegel (1770-1831) and (R) the man of action, Emperor Napoleon (1769-1821), both present at Jena in October 1806

The means by which Napoleon’s posthumous reputation has survived are interesting in themselves. He did not found a long-lasting dynasty, so neither family piety nor institutionalised authority could help. He was, of course, deposed and exiled, dividing French opinion both then and later. Nonetheless, Napoleon left numerous enduring things, such as codes of law; systems of measurement; structures of government; and many physical monuments. One such was Paris’s Jena Bridge, built to celebrate the victorious battle.5

Monuments, if sufficiently durable, can certainly long outlast individuals. They effortlessly bear diachronic witness to fame. Yet, at the same time, monuments can crumble or be destroyed. Or, even if surviving, they can outlast the entire culture that built them. Today a visitor to Egypt may admire the pyramids, without knowing the names of the pharaohs they commemorated, let alone anything specific about them. Shelley caught that aspect of vanished grandeur well, in his poem to the ruined statue of Ozymandias: the quondam ‘king of kings’, lost and unknown in the desert sands.6

So lastly what about words? They can outlast individuals and even cultures, provided that they are kept in a transmissible format. Even lost languages can be later deciphered, although experts have not yet cracked the ancient codes from Harappa in the Punjab.7  Words, especially in printed or nowadays digital format, have immense potential for endurance. Not only are they open to reinterpretation over time; but, via their messages, later generations can commune mentally with earlier ones.

In Jena, the passing Napoleon (then aged 37) was unaware of the watching academic (then aged 36), who was formulating his ideas about revolutionary historical changes through conflict. Yet, through the endurance of his later publications, Hegel, who was unknown in 1806, has now become the second notable personage who was present at the scene. Indeed, via his influence upon Karl Marx, it could even be argued that the German philosopher has become the historically more important figure of those two individuals in Jena on 13 October 1806. On the other hand, Marx’s impact, having been immensely significant in the twentieth century, is also fast fading.

Who from the nineteenth century will be the most famous in another century’s time? Napoleon? Hegel? Marx? (Shelley’s Ozymandias?) Time not only ravages but provides the supreme test.

1  R. Gottlieb, Sarah: The Life of Sarah Bernhardt (New Haven, 2010).

2 R.P. Heitzenrater, ‘John Wesley’s Principles and Practice of Preaching’, Methodist History, 37 (1999), p. 106. See also R. Hattersley, A Brand from the Burning: The Life of John Wesley (London, 2002).

3  W. Lamont, Last Witnesses: The Muggletonian History, 1652-1979 (Aldershot, 2006); C. Hill, B. Reay and W. Lamont, The World of the Muggletonians (London, 1983); E.P. Thompson, Witness against the Beast: William Blake and the Moral Law (Cambridge, 1993).

4 G.W.F. Hegel to F.I. Neithammer, 13 Oct. 1806, in C. Butler (ed.), The Letters: Georg Wilhelm Friedrich Hegel, 1770-1831 (Bloomington, 1984); also transcribed in www.Marxists.org, 2005.

5 See http://napoleon-monuments.eu/Napoleon1er.

6  P.B. Shelley (1792-1822), Ozymandias (1818).

7 For debates over the language or communication system in the ancient Indus Valley culture, see: http://en.wikipedia.org/


MONTHLY BLOG 39, STUDYING THE LONG AND THE SHORT OF HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

A growing number of historians, myself included, want students to study long-term narratives as well as in-depth courses.1 More on (say) the peopling of Britain since Celtic times alongside (say) life in Roman Britain or (say) medicine in Victorian times or (say) the ordinary soldier’s experiences in the trenches of World War I. We do in-depth courses very well. But long-term studies are also vital to provide frameworks.2

Put into more abstract terms, we need more diachronic (long-term) analysis, alongside synchronic (short-term) immersion. These approaches, furthermore, do not have to be diametrically opposed. Courses, like books, can do both.

That was my aim in an undergraduate programme, devised at Royal Holloway, London University.3  It studied the long and the short of one specific period. The choice fell upon early nineteenth-century British history, because it’s well documented and relatively near in time. In that way, the diachronic aftermath is not too lengthy for students to assess within a finite course of study.

Integral to the course requirements were two long essays, both on the same topic X. There were no restrictions, other than analytical feasibility. X could be a real person; a fictional or semi-fictionalised person (like Dick Turpin);4 an event; a place; or anything that lends itself to both synchronic and diachronic analysis. Students chose their own, with advice as required. One essay of the pair then focused upon X’s reputation in his/her/its own day; the other upon X’s long-term impact/reputation in subsequent years.

There was also an examination attached to the course. One section of the paper contained traditional exam questions; the second just one compulsory question on the chosen topic X. Setting that proved a good challenge for the tutor, thinking of ways to compare and contrast short- and long-term reputations. And of course, the compulsory question could not allow a simple regurgitation of the coursework essays; and it had to be equally answerable by all candidates.

Most students decided to examine famous individuals, both worthies and unworthies: Beau Brummell; Mad Jack Mytton; Queen Caroline; Charles Dickens; Sir Robert Peel; Earl Grey; the Duke of Wellington; Harriette Wilson; Lord Byron; Mary Shelley; Ada Lovelace; Charles Darwin; Harriet Martineau; Robert Stephenson; Michael Faraday; Augustus Pugin; Elizabeth Gaskell; Thomas Arnold; Mary Seacole; to name only a few. Leading politicians and literary figures tended to be the first choices. A recent book shows what can be done in the case of the risen (and rising still further) star of Jane Austen.5 In addition, a minority preferred big events, such as the Battle of Waterloo; or the Great Exhibition. None in fact chose a place or building; but it could be done, provided the focus is kept sharp (the Palace of Westminster, not ‘London’.)

Studying contemporary reputations encouraged a focus upon newspaper reports, pamphlets, letters, public commemorations, and so forth. In general, students assumed that synchronic reputation would be comparatively easy to research. Yet they were often surprised to find that initial responses to X were confused. It takes time for reputations to become fixed. In particular, where the personage X had a long life, there might well be significant fluctuations during his or her lifetime. The radical John Thelwall, for example, was notorious in 1794, when on trial for high treason, yet largely forgotten at his death in 1834.6

By contrast, students often began by feeling fussed and unhappy about studying X’s diachronic reputation. There were no immediate textbooks to offer guidance. Nonetheless, they often found that studying long-term changes was good fun, because more off-the-wall. The web is particularly helpful, as wikipedia often lists references to X in film(s), TV, literature, song(s) and popular culture. Of course, all wiki-leads need to be double-checked. There are plenty of errors and omissions out there.

Nonetheless, for someone wishing to study the long-term reputation of (say) Beau Brummell (1778-1840), wikipedia offers extensive leads, providing many references to Brummell in art, literature, song, film, and sundry stylistic products making use of his name, as well as a bibliography.7

Beau Brummell (1778-1840), from L to R: as seen in his own day; as subject of enquiry for Virginia Woolf (1882-1941); and as portrayed by Stewart Granger in Curtis Bernhardt’s film (1954).

Plus it is crucial to go beyond wikipedia. For example, a search for relevant publications would reveal an unlisted offering. In 1925, Virginia Woolf, no less, published a short tract on Beau Brummell.8 The student is thus challenged to explore what the Bloomsbury intellectual found of interest in the Regency Dandy. Of course, the tutor/examiner also has to do some basic checks, to ensure that candidates don’t miss the obvious. On the other hand, surprise finds, unanticipated by all parties, proved part of the intellectual fun.

Lastly, the exercise encourages reflections upon posthumous reputations. People in the performing arts and sports, politicians, journalists, celebrities, military men, and notorious criminals are strong candidates for contemporary fame followed by subsequent oblivion, unless rescued by some special factor. The minor horse-thief Dick Turpin, for example, was catapulted from conflicted memory in the eighteenth century into the dashing highwayman of the novel Rookwood (1834). That fictional boost gave his romantic myth another 100 years before it began to fade again.

Conversely, a tiny minority can go from obscurity in their lifetime to later global renown. But it depends crucially upon their achievements being transmissible to successive generations. The artist and poet William Blake (1757-1827) is a rare and cherished example. Students working on the long-and-the-short of the early nineteenth century were challenged to find another contemporary with such a dramatic posthumous trajectory. They couldn’t.

But they and I enjoyed the quest and discovery of unlikely reactions, like Virginia Woolf dallying with Beau Brummell. It provided a new way of thinking about the long-term – not just in terms of grand trends (‘progress’; ‘economic stages’) but by way of cultural borrowings and transmutations between generations. When and why? There are trends but no infallible rules.

1 ‘Teaching History’s Big Pictures: Including Continuity as well as Change’, Teaching History: Journal of the Historical Association, 136 (Sept. 2009), pp. 53-9; and PJC website Pdf/3.

2 My own answers in P.J. Corfield, Time and the Shape of History (2007).

3 RH History Course HS2246: From Rakes to Respectability? Conflict and Consensus in Britain 1815-51 (content now modified).

4 Well shown by J. Sharpe, Dick Turpin: The Myth of the Highwayman (London, 2004).

5 C. Harman, Jane’s Fame: How Jane Austen Conquered the World (Edinburgh, 2009).

6 Two PJC essays on John Thelwall (1764-1834) are available in PJC website, Pdf/14 and Pdf/22.

7 See http://en.wikipedia.org/wiki/Beau_Brummell.

8 See V. Woolf, Beau Brummell (1925; reissued by Folcroft Library, 1972); and http://www.dandyism.net/woolfs-beau-brummell/.


MONTHLY BLOG 38, WHY IS THE LANGUAGE OF ‘RACE’ HOLDING ON SO LONG WHEN IT’S BASED ON A PSEUDO-SCIENCE?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

Of course, most people who continue to use the language of ‘race’ believe that it has a genuine meaning – and a meaning, moreover, that resonates for them. It’s not just an abstract thing but a personal way of viewing the world. I’ve talked to lots of people about giving up ‘race’ and many respond with puzzlement. The terminology seems to reflect nothing more than the way things are.

But actually, it doesn’t. It’s based upon a pseudo-science that was once genuinely believed but has long since been shown as erroneous by geneticists. So why is this language still used by people who would not dream of insisting that the earth is flat, or the moon made of blue cheese?

Part of the reason is no doubt the power of tradition and continuity – a force of history that is often under-appreciated.1 It’s still possible to hear references to people having the ‘bump of locality’, meaning that they have a strong topographical/spatial awareness and can find their way around easily. The phrase sounds somehow plausible. Yet it’s derived from the now-abandoned study of phrenology. This approach, first advanced in 1796 by the German physician F.J. Gall, sought to analyse people’s characteristics via the contours of the cranium.2  It fitted with the ‘lookism’ of our species. We habitually scrutinise one another to detect moods, intentions, characters. So it may have seemed reasonable to measure skulls for the study of character.

Phrenologist’s view of the human skull: point no. 31 marks the bump of locality, just over the right eyebrow.

Yet, despite confident Victorian publications explaining The Science of Phrenology3 and advice manuals on How to Read Heads,4 these theories turned out to be no more than a pseudo-science. The critics were right after all. Robust tracts like Anti-Phrenology: Or a Chapter on Humbug won the day.5 Nevertheless, some key phrenological phrases linger on. My own partner in life has an exceptionally strong sense of topographical orientation. So sometimes I joke about his ‘bump of locality’, even though there’s no protrusion on his right forehead. It’s just a linguistic remnant of vanished views.

That pattern may apply similarly in the language of race, which is partly based upon a simple ‘lookism’. People who look like us are assumed to be part of ‘our tribe’. Those who do not look like us seem to be ‘a race apart’ (except that they are not). The survival of the phrasing is thus partly a matter of inertia.

Another element may also spring, paradoxically, from opponents of ‘racial’ divisions. They are properly dedicated to ‘anti-racism’. Yet they don’t oppose the core language itself. That’s no doubt because they want to confront prejudices directly. They accept that humans are divided into separate races but insist that all races should be treated equally. It seems logical therefore that the opponent of a ‘racist’ should be an ‘anti-racist’. Statistics of separate racial groups are collected in order to ensure that there is no discrimination.

Yet one sign of the difficulty in all official surveys remains the utter lack of consistency as to how many ‘races’ there are. Early estimates by would-be experts on racial classification historically ranged from a simplistic two (‘black’ and ‘white’) to a complex 63.6 Census and other listings these days usually invent a hybrid range of categories. Some are based upon ideas of race or skin colour; others of nationality; or a combination. And there are often lurking elements of ‘lookism’ within such categories (‘black British’), dividing people by skin colour, even within the separate ‘races’.7

So people like me who say simply that ‘race’ doesn’t exist (i.e. that we are all one human race) can seem evasive, or outright annoying. We are charged with missing the realities of discrimination and failing to provide answers.

Nevertheless, I think that trying to combat a serious error by perpetrating the same error (even if in reverse) is not the right way forward. The answer to pseudo-racism is not ‘anti-racism’ but ‘one-racism’. It’s ok to collect statistics about nationality or world-regional origins or any combination of such descriptors, but without the heading of ‘racial’ classification and the use of phrases that invoke or imply separate races.

Public venues in societies that historically operated a ‘colour bar’ used the brown paper bag test for quick decisions, admitting people with skins lighter than the bag and rejecting the rest. As a means of classifying people, it’s as ‘lookist’ as phrenology but with even fewer claims to being ‘scientific’. Copyright © Jessica C (Nov. 2013)

What’s in a word? And the answer is always: plenty. ‘Race’ is a short, flexible and easy term to use. It also lends itself to quickly comprehensible compounds like ‘racist’ or ‘anti-racist’. Phrases derived from ethnicity (national identity) sound much more foreign in English. And an invented term like ‘anti-ethnicism’ seems abstruse and lacking instant punch.

All the same, it’s time to find or to create some up-to-date phrases to allow for the fact that racism is a pseudo-science that lost its scientific rationale a long time ago. ‘One-racism’? ‘Humanism’? It’s more powerful to oppose discrimination in the name of reality, instead of perpetrating the wrong belief that we are fundamentally divided. The spectrum of human skin colours under the sun is beautiful, nothing more.

1 On this, see esp. PJC website BLOG/1 ‘Why is the Formidable Power of Continuity so often Overlooked?’ (Nov. 2010).

2 See T.M. Parssinen, ‘Popular Science and Society: The Phrenology Movement in Early Victorian Britain’, Journal of Social History, 8 (1974), pp. 1-20.

3 J.C. Lyons, The Science of Phrenology (London, 1846).

4 J. Coates, How to Read Heads: Or Practical Lessons on the Application of Phrenology to the Reading of Character (London, 1891).

5 J. Byrne, Anti-Phrenology: Or a Chapter on Humbug (Washington, 1841).

6 P.J. Corfield, Time and the Shape of History (London, 2007), pp. 40-1.

7 The image comes from Jessica C’s thoughtful website, ‘Colorism: A Battle that Needs to End’ (12 Nov. 2013): www.allculturesque.com/colorism-a-battle-that-needs-to-end.


MONTHLY BLOG 37, HOW DO PEOPLE RESPOND TO ELIMINATING THE LANGUAGE OF ‘RACE’?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

Having proposed eliminating from our thoughts and vocabulary the concept of ‘race’ (and I’m not alone in making that suggestion), how do people respond?

Indifference: we are all stardust. Many people these days shrug. They say that the word ‘race’ is disappearing anyway, and what does it matter?

Indeed, a friend with children who are conventionally described as ‘mixed race’ tells me that these young people are not worried by their origins and call themselves, semi-jokingly, ‘mixed ray’. It makes them sound like elfin creatures from the sun and stars – rather endearing really. Moreover, such a claim resonates with the fact that many astro-biologists today confirm that all humans (among all organic and inorganic matter on earth) are ultimately made from trace elements from space – or, put more romantically, from stardust.1

So from a cosmic point of view, there’s no point in worrying over minor surface differences within one species on a minor planet, circling a minor sun, which itself lies in but one quite ordinary galaxy among a myriad of galaxies.

Ethnic pride: On the other hand, we do live specifically here, on earth. And we are a ‘lookist’ species. So others give more complex responses, dropping ‘race’ for some purposes but keeping it for others. Given changing social attitudes, the general terminology seems to be disappearing imperceptibly from daily vocabulary. As I mentioned before, describing people as ‘yellow’ and ‘brown’ has gone. Probably ‘white’ will follow next, especially as lots of so-called ‘whites’ have fairly dusky skins.

‘Black’, however, will probably be the slowest to go. Here there are good as well as negative reasons. Numerous people from Africa and from the world-wide African diaspora have proudly reclaimed the terminology, not in shame but in positive affirmation.

Battersea’s first ‘black’ Mayor, John Archer (Mayor 1913/14) was a pioneer in that regard. I mentioned him in my previous BLOG (no 35). Archer was a Briton, with Irish and West Indian ancestry. He is always described as ‘black’ and he himself embraced black consciousness-raising. Yet he always stressed his debt to his Irish mother as well as to his Barbadian father.

In 1918 Archer became the first President of the African Progress Union. In that capacity, he attended meetings of the Pan-African Congress, which promoted African decolonisation and development. The political agenda of activists who set up these bodies was purposive. And they went well beyond the imagery of negritude by using a world-regional nomenclature.

Interestingly, therefore, the Pan-African Congress was attended by men and women of many skin colours. Look at the old photograph (1921) of the delegates from Britain, continental Europe, Africa and the USA (see Illus 1). Possibly the dapper man, slightly to the left of centre in the front row, holding a portfolio, is John Archer himself.

Illus 1: Pan-African Congress delegates in Brussels (1921)

Today, ‘black pride’, which has had a good cultural run in both Britain and the USA, seems to be following, interestingly, in Archer’s footsteps. Not by ignoring differences but by celebrating them – in world-regional rather than skin-colourist terms. Such labels also have the merit of flexibility, since they can be combined to allow for multiple ancestries.

Just to repeat the obvious: skin colour is often deceptive. Genetic surveys reveal high levels of ancestral mixing. As American academic Henry Louis Gates has recently reminded us in The Observer, many Americans with dark skins (35% of all African American men) have European as well as African ancestry. And the same is true, on a lesser scale, in reverse. At least 5% of ‘white’ male Americans have African ancestry, according to their DNA.

Significantly, people with mixed ethnicities often complain at being forced to choose one or the other (or having choice foisted upon them), when they would prefer, like the ‘Cablinasian’ Tiger Woods, to celebrate plurality. Pride in ancestry will thus outlast and out-invent erroneous theories of separate ‘races’.

Just cognisance of genetic and historic legacies: There is a further point, however, which should not be ignored by those (like me) who generally advocate ‘children of stardust’ universalism. For some social/political reasons, as well as for other medical purposes, it is important to understand people’s backgrounds.

Thus ethnic classifications can help to check against institutionalised prejudice. And they also provide important information in terms of genetic inheritance. To take one well known example, sickle-cell anaemia (drepanocytosis) is a condition that can be inherited by humans whose ancestors lived in tropical and sub-tropical regions where malaria is or was common. It is obviously helpful, therefore, to establish people’s genetic backgrounds as accurately as possible.

All medical and social/political requirements for classification, however, call for just classification systems. One reader of my previous BLOG responded that it didn’t really matter, since if ‘race’ was dropped another system would be found instead. But that would constitute progress. The theory of different human races turned out to be erroneous. Instead, we should enquire about ethnic (national) identity and/or world-regional origins within one common species. Plus we should not use a hybrid mix of definitions, partly by ethnicities and partly by skin colour (as in ‘black Britons’).

Lastly, all serious systems of enquiry should ask about plurality: we have two parents, who may or may not share common backgrounds. That’s the point: men and women from any world-region can breed together successfully, since we are all one species.

1 S. Kwok, Stardust: The Cosmic Seeds of Life (Heidelberg, 2013).

2 For John Richard Archer (1869-1932), see biog. by P. Fryer in Oxford Dictionary of National Biography: on-line; and entry on Archer in D. Dabydeen, J. Gilmore and C. Jones (eds), The Oxford Companion to Black British History (Oxford, 2007), p. 33.

3 The Observer (5 Jan. 2014): New Review, p. 20.

4 M. Tapper, In the Blood: Sickle Cell Anaemia and the Politics of Race (Philadelphia, 1999).


MONTHLY BLOG 36, TALKING OF LANGUAGE, IT’S TIME TO UPDATE THE LANGUAGE OF RACE

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Names matter. Geneticists have long told us that all humans form part of one big human race.1  Indeed, we share biological characteristics not only with one another but also with a surprising number of other species. Nature is versatile in its ability to try many elegant variations within the common building blocks of life. As a result, the old view, that there were many separate human species, which were incapable of inter-marrying and inter-breeding, has gone. So we should not still talk or think in terms of there being different races of humans. It’s simply not true. Continuing to talk that way is like talking of the flat earth or insisting that the moon is made of blue cheese.

It therefore follows that we should not describe individuals as being of ‘mixed race’. The phrase is not only scientifically erroneous but positively misleading. It is a hangover from older ideas. Some early global explorers were impressed by our common humanity. Others, in good faith, saw different races. But the latter group proved to be wrong. Hence the logic is clear. Since there are no separate races, individuals cannot be mixtures of separate races. We are all one people. All ultimately in the human diaspora ‘out of Africa’.

Out of Africa diagram © Tom Moore

Nonetheless, there are different heritages and variegated group experiences within one common human history. How can we talk of those? One possible way is to refer to different ‘peoples’ within one species. But that terminology easily becomes confusing. Another old vocabulary talked of different ‘tribes’. Yet that too is unhelpful. If ‘peoples’ seem too nebulous and vague, then ‘tribes’ seem too small and sectarian. And in neither case is it easy to talk about compound heritages, whether from ‘mixed tribes’ or ‘mixed peoples’.

In fact, a number of Victorian social anthropologists spent a lot of time trying without success to classify the world’s ‘races’. But no criteria worked systematically, not only because people are intermixed today but also because we have a long history of intermixture. So there was no consensus about the number of different ‘races’. Various criteria were proposed, including skin colour, hair texture, average heights, cranial (skull) formation, nose-shapes, and testable intelligence. But these all yielded different and inconsistent answers.3

Estimates of the number of different human ‘races’ can be found from as low as two (‘black’ and ‘white’) to as high as 63. Such a range of guesstimates indicates not just that the task was hard but that it was impossible. For example, the many subtle variations in the handsome spectrum of human skin colours, from lily to ebony, make drawing up hard-and-fast divisions based upon colour a subjective and fallible exercise.

Interestingly, most of the proposed criteria for racial identification were solely external and ‘lookist’. But it’s hardly a secret that external appearance is no automatic guide to parentage. In countries where there were colour bars, plenty of people who were classified as black were able to ‘pass’ for white and vice versa.4  There are many permutations of looks and skin colour, even amongst very close family. Look around your own.

Or consider some public examples. A current member of the British cabinet, the conservative politician Iain Duncan-Smith, has a Japanese great-grandmother. But you would not guess that he is one-eighth Japanese at a quick glance. Or take a different case: the twin British girls born to Kylee Hodgson and Remi Horder in 2005 have contrasting skin and eye colours, one being dark-skinned and brown-eyed, the other having light colouring and blue eyes. Their parents view them proudly as a genetic gift. But a stranger would not know that the girls are sisters – let alone twins – simply by looking at them.

Does it matter? Not at all, for any human who accepts humanity as we are. It only matters for those who mind about such things, usually with hostile intent towards one or other of the attributed ‘racial’ categories. Indeed, some cultures do still maintain elaborate hierarchies of public status, tending to view those with light skin as ‘higher’ than those with darker hues.7  Such attitudes are, however, historic legacies of cultural classification that are not related to innate human qualities. For that reason, plenty of people reject a colourist world-view. The long history of caste fluidity and inter-caste marriage indicates that old cultural assumptions can be overcome – or shed entirely.

At the same time, we do need to acknowledge variety in ancestry and ethnicity. There are some medical conditions that are associated with particular genetic clusters. So some form of reference is needed. In my view, the ‘lookist’ language of skin colour, though still widely used, is historically on the way out as a means of classification. It is too crude and, currently, too socially sensitive. We don’t now refer to ‘yellows’, ‘browns’ or ‘coloured’. And, in my view, references to ‘white’ and ‘black’ will also go the way of history.

That prediction relates especially to how we name others. Some may want to retain the badge of colour as a proud form of self-identification, especially when it’s done to challenge old prejudices. But such labels may still be misleading. Particularly in the USA, where mobility and inter-marriage are rife, many dark-skinned people turn out to have very diverse parentage, with ancestors who don’t look like them but are still ancestors. Read Neil Henry’s account of A Black Man’s Search for his White Family: the upwardly mobile ‘black’ professional traced his socially declining ‘white trash’ cousins. But when they met, after the initial surprise on all sides, it was just normal.8

What then remains? The obvious forms of recognising difference relate to what we call ‘ethnicity’, pertaining to the many different human nations. That form of identification covers both biological and cultural affinities. So ‘ethnicity’ is not just a grand term for race. Instead, it’s an alternative way of recognising the effects of history and geography, by acknowledging the different cultures and traditions around the globe.

All human babies in their first year babble in the phonemes of all the thousands of human languages.9  Yet each child is brought up to speak predominantly in but one – or perhaps two – of those tongues.10 It’s a good example of difference within a common ability.

Babies babble in the phonemes of all the world’s languages: baby silhouette © victor-magz.com (2013)

‘Ethnicity’ provides a neutral way of referring to variety within unity. It uses nationhood or world-region to provide a social label. Thus the ‘Japanese’ are those bred in Japan and who share the Japanese cultural identity – whatever their skin colour. Similarly, all the ‘Scots’ who will vote on the forthcoming referendum on the future of Scotland are those on the current Scottish electoral register, wherever they were born. Close neighbours, like my first cousin who self-identifies as ‘Scottish’ but lives in the north of England, will not.

The great advantage of using national or regional labels is that they can be doubled, to acknowledge diversity of heritage. Thus John Archer, known as London’s first ‘black’ Mayor (Battersea: 1913-14) can be more properly described as a Briton, born in Liverpool, with Barbadian Irish ancestry. That pays due respect to both his parents. The Americans, as a ‘people’ with a long history of immigration, are paving the way in this usage, helping individuals to acknowledge their adherence to America but also a different parental heritage: African American, Irish American, Hungarian American, and so forth.

But admittedly, there is one large complication when people have many ethnicities to acknowledge. John Archer, after all, was Barbadian Irish British. His wife was West Indian Canadian. But such convolutions can easily become cumbersome. What would their children be? Here the golfer Tiger Woods has found a witty answer. He’s pioneered the adjective ‘Cablinasian’ to name his Caucasian, Black, American Indian and Asian heritage. That should (even if it hasn’t yet) stop people trying to define him as ‘black’.

Lastly, what to do when recent politics still governs the language of social description? It’s only recently that South Africa shed its tripartite classification of ‘white’, ‘black’ and ‘Cape coloured’ (difficult as it was to implement at the multiple margins). Now perhaps one might distinguish between people of Dutch South African descent or English South African heritage. But it would then be logical to talk about African South Africans; or, for mixed ancestries, (say) Dutch African South African. It’s all too much. How about following the people of Brazil, with their mixed heritage from indigenous Americans, Portuguese, Africans, and Asians? Their National Research by Household Sample (2008) classifies people partly by self-assigned colour and partly by family origin by world-region.11 For all other purposes, however, they are ethnic ‘Brazilians’.

I guess that’s what Mandela would have wished to see happening among the next generations of South Africans. Down with skin-deepishness. Long live world-regional identities – plus their mixing.

1 L.L. and F. Cavalli-Sforza, The Great Human Diasporas: The History of Diversity and Evolution (New York, 1995): all humans should read this book.

2 See Carl Zimmer, ‘Genes are Us. And Them’, National Geographic (July 2013).

3 For an attempted scientific methodology, see R.B. Dixon, The Racial History of Man (New York, 1923), pp. 8-45, 475-523. See also, for context, E. Barkan, The Retreat of Scientific Racism: Changing Concepts of Race in Britain and the United States between the World Wars (Cambridge, 1992).

4 The difficulty of classifying individuals objectively into clearly separate and unmixed ‘races’ has vitiated various past attempts at classifying racial intelligence – quite apart from the problem of finding tests that factor out the effects of different nurture and social/biological environment.

5 M. Tempest, ‘Duncan Smith’s Secret Samurai Past’, The Guardian, 3 Sept. 2001: see www.theguardian.com/politics/2001.

6 See report by Paul Harris and Lucy Laing, Daily Mail, 30 March 2012: www.dailymail.co.uk/news/article-2123050/Look-The-black-white-twins-turn-seven.

7 On pigmentary hierarchies, which are found in some but not all cultures, see D. Gabriel, Layers of Blackness: Colourism in the African Diaspora (London, 2007); E.N. Glenn (ed.), Shades of Difference: Why Skin Color Matters (Stanford, Calif., 2009); S.B. Verma, ‘Obsession with Light Skin: Shedding Some Light upon the Use of Skin Lightening Products in India’, International Journal of Dermatology, 49 (2010), pp. 464ff.

8 N. Henry, Pearl’s Secret: A Black Man’s Search for his White Family (Berkeley, Calif., 2001).

9 D. Crystal (ed.), The Cambridge Encyclopedia of Language (Cambridge, 1994), pp. 236-7.

10 Most children are monolingual, but bilingualism is not uncommon, where the parents have different languages, or where the wider society operates with more than one official language. It’s much rarer to be polyglot: see e.g. Xiao-Lei Wang, Growing Up with Three Languages: Birth to Eleven (Bristol, 2008).

11 Brazil’s National Research by Household Sample (2008) reported that 48.43% of the Brazilian population, when surveyed, described themselves as ‘white’; 43.80% as ‘brown’ (multi-ethnic); 6.84% as ‘black’; 0.58% as ‘Amerindian’ (officially known as ‘Indigenous’); while 0.07% (about 130,000 individuals) did not declare any additional identity. A year earlier, Brazil’s National Indian Foundation also reported the existence of at least 67 different ‘uncontacted’ tribes. See en.wikipedia.org/wiki/Brazil.


MONTHLY BLOG 35, DONS AND STUDENT-CUSTOMERS? OR THE COMMUNITY OF LEARNERS?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Names matter. Identifying things – people – events – in realistic terminology means that they are being fully understood and taken seriously. Conversely, it’s warping to the mind and eventually corrosive of good thought to be constantly urged to give lip-service to the ‘wrong’ terms. People who live under dictatorial systems of would-be thought-control often testify to the ‘dead’ feeling that results from public censorship, especially when it is internalised as self-censorship.

By the way, I wrote that paragraph before remembering that this sentiment dovetailed with something I’d read about Confucius. A quick Google-check confirmed my half-memory.  Confucius long ago specified that: ‘The beginning of wisdom is to call things by their proper names’. It’s a great dictum. It doesn’t claim too much. Naming is only the ‘beginning of wisdom’, not the entirety. And there is often scope for debating what is or should be the ‘proper’ name. Nonetheless, Confucius not only highlights the good effect of clear vision, accurately acknowledged to others, but equally implies the malign effects of the reverse. The beginning of madness is to delude oneself and others about the true state of affairs.

Which brings me to my current question: are University students ‘customers’? If so, an interesting implication follows. If ‘the customer is always right’, as the business world asserts but does not always uphold, should not all students get top marks for having completed an assignment or an exam paper? Or, at very least not get bad marks?

Interestingly, now that student payments for tuition are very much up-front and personal in the form of fees (which are funded as repayable loans), the standard of degrees is gradually rising. Indeed, grade inflation has become noticeable ever since Britain’s Universities began to be expanded into a mass system. A survey undertaken in 2003 found that the third-class degree had been in steady decline since 1960 and was nearing extinction by 2000. And a decade on, the lower second (2.2) in some subjects is following the same trajectory. Better teaching, better study skills, and/or improved exam preparation may account for some of this development. But rising expectations on the part of students – and increasing reputational ambitions on the part of the Universities – also exert subtle pressures upon examiners to be generous.

Nonetheless, even allowing for a changing framework of inputs and outputs, a degree cannot properly be ‘bought’. Students within any given University course are learners, not customers. Their own input is an essential part of the process. They can gain a better degree not by more money but by better effort, well directed, and by better ability, suitably honed.

People learn massively from teachers, but also much from private study, and much too from their fellow-learners (who offer both positive and negative exemplars). Hence the tutors, the individual student, and other students all contribute to each individual’s result.2

A classic phrase for this integrated learning process was ‘the community of scholars’. That phrase now sounds quaint and possibly rather boring. Popularly, scholarship is assumed to be quintessentially dull and pedantic, with the added detriment of causing its devotees to ‘scorn delights and live laborious days,’ in Milton’s killing phrase.3  In fact, of course, learning isn’t dull. Milton, himself a very learned man, knew so too. Nonetheless, ‘the community of scholars’ doesn’t cut the twenty-first century terminological mustard.

But ‘learning’ has a better vibe. It commands ‘light’. People may lust for it, without losing their dignity. And it implies a continually interactive process. So it’s good for students to think of themselves as part of a community of learners. Compared with their pupils, the dons are generally older, sometimes wiser, always much better informed about the curriculum, much more experienced in teaching, and ideally seasoned by their own research efforts. But the academics too are learners, if more advanced along the pathway. They are sharing the experience and their expertise with the students. Advances in knowledge can come from any individual at any level, often emerging from debates and questions, no matter how naive. So it’s not mere pretension that causes many academics to thank in their scholarly prefaces not only their fellow researchers but also their students.

Equally, it’s good for the hard-pressed dons to think of themselves as part of an intellectual community that extends to the students. That concept reasserts an essential solidarity. It also serves to reaffirm the core commitment of the University to the inter-linked aims of teaching and research. Otherwise, the students, who are integral to the process, are seemingly in danger of getting overlooked while the dons are increasingly tugged between the rival pressures of specialist research in the age of Research Assessment, and of managerial business-speak in the age of the University-plc.4

Lastly, reference to ‘community’ need not be too starry-eyed. Ideals may not always work perfectly in practice. ‘Community’ is a warm, comforting word. It’s always assumed to be a ‘good thing’. Politicians, when seeking to commend a policy such as mental health care, refer to locating it in ‘the community’ as though that concept can resolve all the problems. (As is now well proven, it can’t). And history readily demonstrates that not all congregations of people form a genuine community. Social cohesion needs more than just a good name.

That’s why it’s good to think of Universities as containing communities of learners, in order to encourage everyone to provide the best conditions for that basic truth to flourish. That’s far from an easy task in a mass higher-education system. It runs counter to attempts at viewing students as individual consumers. But it’s more realistic as to how teaching actually works. And calling things by their proper names makes a proper start.

William Hogarth’s satirical Scholars at a Lecture (1736) offers a wry reminder to tutors not to be boring and to students to pay attention.

1 ‘Third Class Degree Dying Out’, Times Higher Education, 5 Sept. 2003: www.timeshighereducation.co.uk/178955/article: consulted 4 Nov. 2013.

2 That’s one reason why performance-related-pay (PRP) for teachers, based upon examination results for their taught courses, remains a very blunt tool for rewarding teaching achievements. Furthermore, all calculations for PRP (to work even approximately justly) need to take account of the base-line from which the students began, to measure the educational ‘value-added’. Without that proviso, teachers (if incentivised purely by monetary reward) should logically clamour to teach only the best, brightest, and most committed students, who will deliver the ‘best’ results.

3 John Milton, Lycidas: A Lament for a Friend, Drowned in his Passage from Chester on the Irish Seas (1637), lines 70-72: ‘Fame is the spur that the clear spirit doth raise/ (That last Infirmity of Noble mind)/ To scorn delights and live laborious days.’

4 On the marketisation of higher education, see film entitled Universities Plc? Enterprise in Higher Education, made by film students at the University of Warwick (2013): www2.warwick.ac.uk/fac


MONTHLY BLOG 34, COPING WITH WRITER’S BLOCK

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

You’re suffering from writer’s block? A common ailment. What to do?? The first and best answer is: don’t hit the bottle. It’s only too true that alcohol makes you think that things are going better (at least for a while) whilst concealing the fact that things are getting much worse. Eventually, you become so stalled that there’s no way out, other than a bleak confession of failure.

The prototype is Scott Fitzgerald’s The Crack-Up (1936).1 Beautifully written, but painful reading for all his admirers. Many famous writers have gone down this alcoholic route, almost invariably with disastrous results.2 On the other hand, recent research suggests that moderate amounts of booze, for those who are not habitually heavy drinkers, may unleash creativity and lateral thinking (at least when solving questions about word-associations).3 Great. Have your bright ideas with an alcohol buzz in your spare time. But be warned. Don’t sit down to unblock your history-writing, which requires concentrated reasoning over a good span of time, with a glass and bottle at hand.

The next bit of advice is to stand back from the blocked task and ask yourself: do you really want to do it? (Of course, this question may be resolved if the answer is that you have to undertake whatever writing is involved – say, to complete a course or to gain a qualification. In that case, skip this paragraph). Writer’s block is sometimes a deep auto-message to say that you should be doing something else. When I am advising friends on coping with this problem, I often start by giving them permission to drop the task entirely. A small but far from negligible percentage respond with sighs of relief. Their brows clear; they find a civil way to terminate their writing commitments; maybe they publish what they have done already; and then they do something else, often very enthusiastically, tapping into lots of thwarted energy.

But that’s not the case for everyone. Many want to complete the task but can’t find the time, space, self-organisation, or inspiration to proceed. It’s not a good state of mind to inhabit for any length of time, since it’s often linked with vexation, self-chiding, and various degrees of despair. Patrick Leigh Fermor, who wrote two brilliant books of a projected trilogy, agonised for years over his prolonged failure to produce the missing third volume.4 Blocked writers particularly wince when innocent bystanders ask cheerfully: how’s the writing going – why isn’t it done yet? So the following comments are addressed to those who, when given permission to drop the writing, respond with irritation that they do really want to do the task but can’t even bear talking about why it’s not getting done.

I’ve been in that situation myself – fortunately, not often, but enough times to know the mental closed-circuit that can result. One method that helped me was the technique of writing freely, in unstructured prose, a private memo to myself about the problem in a stream of consciousness, or Streamo, as I call it. No-one else need ever see this screed. It’s good to start simply by trying to work out for oneself: what is particularly troublesome about this assignment? Is it XXX? No, not really. What about the problem of YYY? or ZZZ? Perhaps, yes; perhaps, no. Writing as fast as possible. Musing to oneself. Not worrying if sentences aren’t perfectly grammatical. It can often take a long time, circling around, dredging thoughts from deep within, trying to pinpoint what factor or factors are causing the block.

Once I had stalled because I’d reached a tricky question, whose answer I couldn’t resolve. There was a genuine intellectual point at issue. The problem was that there was not one simple response but a plethora of interconnected ones. After lots of scribbling, I realised that I was worrying wrongly about the lack of one striking answer. Instead I could offer many. With a sigh of relief, I deleted all my scribbles. In the blocked chapter that I was writing, I inserted a new sentence, saying something banal like: ‘This is a complex problem, for which there is not one simple answer’. After that, bingo, my prose flowed again. Sometimes I smile when re-reading that text, to think of all the grief it caused me. But it had value. The technique of Streaming is not only useful for unblocking but also for planning new projects. So my Streamos, which I mainly delete once projects are launched, are not as substantial as first drafts but rather constitute first drafts of inspiration. They are useful as mechanisms to coalesce disparate strands of thought. Try writing one as fast as possible, preferably on-line, and see if it helps you.

(Solo meditation, for intellectual blockage, tends to be more useful, in my view, than the talking cure, which often works well in other circumstances. Vocalising writer’s block as a ‘problem’ risks giving it an unwelcome life of its own. It invites thoughts of the renowned grand projects which remain forever anticipated but forever postponed.5 )

Actual history writing, of course, moves much more slowly than the fast and furious pace of memos to oneself. So it’s important also to think about the long-term context of regular writing. Obvious things like: get a desk or working area and, ideally too, a room,6 where you are happy to spend a lot of time; find lighting that focuses a concentrated pool of light on your working area; try ear-plugs for heightened concentration; institute good filing and storage arrangements for notes, drafts etc.; implement a rigorous back-up system after every batch of writing; find a goodish span of time to write on each occasion (less than two hours is unproductive); and establish a personal start-ritual.7

Different writers have their own preferences. The prolific Charles Dickens used to patrol his house, checking that everything was in order, and then arrange the items on his desk in a specific order, before sitting down to write. Each to his/her own. Many make do these days with the sequenced rituals of switching on computers, iPads etc. But find your own preferences; stick to your sequences; and don’t open email during writing stints.

What else? Another very important way of keeping the flow of writing going is to undertake regular exercise of the repetitive kind. Swimming, riding, running, walking, yoga, these are all good. The subconscious mind can work on problems, in a non-linear way, whilst the body is absorbed in such activities. And the fresh air is an ideal antidote to the confinement of sitting for long hours at a desk, gazing into a screen. Dickens was also a great walker. But again, it’s really a case of each to his/ her own. If your preference is for an explosive sport, then go for that. Exercise of any kind is much better than nothing. But repetitive and rhythmic exercises (avoiding the obvious innuendoes here) are particularly good for unblocking, especially if sustained for at least half an hour – daily.

Lastly, to write history, you need not only something to say but also good and relevant evidence to intermesh with your analysis. That means a whole lifestyle choice. You have to do the research as well as find time to write. It’s a wonderful thing, if you have the will, the interest, and a subject that enthrals you. If you have these things, then go for it but keep running/riding/swimming regularly alongside the scholarship, and scribbling a Streamo whenever you have an intellectual problem to solve. These methods will unblock a block, if you have one; or, better still, prevent it from forming in the first place.

Burning Bush, Winkworth Arboretum © Antony Belton, 2013

1 F.S. Fitzgerald, The Stories of F. Scott Fitzgerald, Vol. 2: The Crack-Up with Other Pieces and Stories (Harmondsworth, 1965), pp. 39-56.

2 See e.g. D.W. Goodwin, Alcohol and the Writer (Kansas City, 1988).

3 A.F. Jarosz and others, ‘Uncorking the Muse: Alcohol Intoxication Facilitates Creative Problem Solving’, Consciousness and Cognition, 21 (2012), pp. 487-93.

4 Published posthumously from his notebooks: see P.L. Fermor, The Broken Road: From the Iron Gates to Mount Athos, ed. C. Thubron and A. Cooper (London, 2013), following A Time of Gifts (London, 1977) and Between the Woods and the Water (London, 1986).

5 The most celebrated fictional example remains Dr Casaubon’s ‘Key to all the Mythologies’ in George Eliot’s Middlemarch (1871/2); and a real-life case was Lord Acton’s projected ‘History of Liberty’, two chapters being published posthumously in J.E.E. Dalberg-Acton, The History of Freedom and Other Essays, ed. J.N. Figgis and R.V. Laurence (London, 1907).

6 See inevitably V. Woolf, A Room of One’s Own: An Essay on Women in Relation to Literature (London, 1929).

7 For creativity and work routines, see M. Currey, Daily Rituals: How Artists Work (New York, 2013) – even if in reality there may be variations from day to day.
