Oscar Wilde would have been on Grindr – but he preferred a more clandestine connection

This post originally appeared on The Conversation. This article was written by Jack Sargent, PhD student in History. 

It has never been so easy to find love, or sex, quickly. In 2017, there is nothing shameful or illicit about using dating apps or digital tools to connect with someone else. More than 100 years ago, of course, things were very different.

Oscar Wilde, and other men and women who like him desired same-sex relationships, had to resort to attending secret parties to meet potential partners. The idea that it would become normal to meet and flirt with an ever-changing group of strangers, sending explicit pictures or a few cheeky sentences on a device you hold in your hand, would have amused the writer. The openness about conducting such relationships would have amazed him.

But would Oscar Wilde have enjoyed the most famous gay dating app, Grindr, and the way it has contributed to gay culture? He would probably have welcomed the fact that gay men and women could easily meet new sexual partners. In the late-Victorian period, Wilde’s membership of clandestine homoerotic networks of clubs and societies was far more furtive. They were gatherings of forbidden passions and desires, shrouded in secrecy.

Wilde loved being part of this underground community. He adored being with crowds of immaculately dressed people in beautiful rooms. He believed the most important goal in life was to experience emotion and sensuality, to have intense connections and embrace beauty.

This belief came from his involvement in a movement called Aestheticism. Late-Victorian aesthetes proposed that beauty and sensation were the keys to an individual’s authentic experience of life. They argued that beauty and connections with beauty should be pursued even at the expense of conventional systems of morality, and what society considered right or wrong. For Wilde, this meant he thought about whether it was aesthetically – not morally – right to sleep with someone.

Oscar Wilde was born in Dublin in 1854 and died in Paris in 1900, a few years after his release from jail for “gross indecency” with other men. Before his imprisonment, Wilde was (I think almost uniquely) shockingly positive and active about his desire for other men. This was a time when same-sex desire and intercourse were illegal, seen as illicit and monstrous – an abhorrent illness which should be exorcised from Christian culture.

Wilde met and slept with many other men, continuing relationships for years, months, weeks, or maybe even only a night, before effectively dropping them and moving on. Is this so different to how gay relationships are conducted now?

Every part of gay culture today stems from the way that Wilde and the group of men he mixed with lived their lives. Their philosophy that they should have their own dedicated spaces to meet still stands. Those spaces first evolved into gay bars and clubs. Now those physical spaces are closing as members of the gay community go online to meet each other.

The importance of being on Grindr. Shutterstock

Grindr, now eight years old, allows people to make connections if they like the look of someone’s body. It is the same type of connection that Wilde was interested in, but it doesn’t give people the intense, sensual involvement with another human being he was looking for. You might see someone you like on Grindr, but there is no promise they will respond to your message. Downloading and using the app doesn’t automatically make you part of a network of people who are thinking and feeling intense emotional sensations. Wilde, at his parties and gatherings, taking risks and breaking the law, must have felt part of a group who came together to all feel something special and exciting.

This excitement was not only to do with the illegal nature of the acts undertaken in secret. It had something to do with the vibrancy and sensuality offered by being in a particular place, engaging sensually and physically with other people, reading them for signs of interest, right down to the smallest gesture.

Digital declarations

This is not possible on Grindr. Grindr offers instead a potentially unlimited number of possible connections, but connections which are digital, not physical. Once downloaded, the app offers a digital network of people that can be loaded and reloaded with a simple swipe of the screen. The continual possibility of meeting someone different or better means that users don’t necessarily need to commit to connecting. It seems we are in danger of creating a generation of potentially disconnected individuals, who rather than going to a gay bar, choose to spend the night in, waiting for a stranger to send them a message.

Had he been able to, Wilde would have downloaded Grindr, of that I think we can be certain. Would he have liked it? Well, he may have found some beauty in the technology and the freedom it represents. And perhaps, sometimes, he would have enjoyed the novelty.

But he would probably have preferred the clubs, societies and networks he engaged with during the late 1800s. For while they did not promise successful or happy encounters, they did foster physical relationships between men within spaces of affirmation, liberation and fulfilment. And although Grindr also offers the chance for casual sex, I think late Victorian gay men would have been saddened by the lack of opportunity for their counterparts today to connect emotionally with others.

Being lovesick was a real disease in the Middle Ages

This post originally appeared on The Conversation. The piece was written by Laura Kalas Williams, Postdoctoral Researcher in Medieval Literature and Medicine at Exeter. 

Love sure does hurt, as the Everly Brothers knew very well. And while it is often romanticised or made sentimental, the brutal reality is that many of us experience fairly unpleasant symptoms when in the throes of love. Nausea, desperation, a racing heart, a loss of appetite, an inability to sleep, a maudlin mood – sound familiar?

Today, research into the science of love recognises the way in which the neurotransmitters dopamine, adrenalin and serotonin in the brain cause the often-unpleasant physical symptoms that people experience when they are in love. A study in 2005 concluded that romantic love was a motivation or goal-orientated state that leads to emotions or sensations like euphoria or anxiety.

But the connection between love and physical affliction was made long ago. In medieval medicine, the body and soul were closely intertwined – the body, it was thought, could reflect the state of the soul.

Humoral imbalance

Text and tabular of humours and fevers, according to Galen, c.1420. In MS 49 Wellcome Apocalypse, f.43r. Wellcome Library

Medical ideas in the Middle Ages were based on the doctrine of the four bodily humours: blood, phlegm, black bile and yellow bile. In a healthy person, all four were thought to be perfectly balanced, so illness was believed to be caused by disturbances to this balance.

Such ideas were based on the ancient medical texts of physicians like Galen, who developed a system of temperaments which associated a person’s predominant humour with their character traits. The melancholic person, for example, was dominated by the humour of black bile, and considered to have a cold and dry constitution.

And as my own research has shown, people with a melancholic disposition were thought, in the Middle Ages, to be more likely to suffer from lovesickness.

The 11th-century physician and monk, Constantine the African, translated a treatise on melancholia which was popular in Europe in the Middle Ages. He made clear the connection between an excess of the black bile of melancholy in the body, and lovesickness:

The love that is also called ‘eros’ is a disease touching the brain … Sometimes the cause of this love is an intense natural need to expel a great excess of humours … this illness causes thoughts and worries as the afflicted person seeks to find and possess what they desire.

Curing unrequited love

Towards the end of the 12th century, the physician Gerard of Berry wrote a commentary on this text, adding that the lovesick sufferer becomes fixated on an object of beauty and desire because of an imbalanced constitution. This fixation, he wrote, causes further coldness, which perpetuates melancholia.

Whoever is the object of desire – and in the case of medieval religious women, the beloved was often Christ – the unattainability or loss of that object was a trauma which, for the medieval melancholic, was difficult to relieve.

But since the condition of melancholic lovesickness was considered to be so deeply rooted, medical treatments did exist. They included exposure to light, gardens, calm and rest, inhalations, and warm baths with moistening plants such as water lilies and violets. A diet of lamb, lettuce, eggs, fish, and ripe fruit was recommended, and the root of hellebore was employed from the days of Hippocrates as a cure. The excessive black bile of melancholia was treated with purgatives, laxatives and phlebotomy (blood-letting), to rebalance the humours.

Blood-letting in Aldobrandino of Siena’s ‘Régime du Corps’. British Library, MS Sloane 2435, f.11v. France, late 13thC. Wikimedia Commons

Tales of woe

It is little wonder, then, that the literature of medieval Europe contains frequent medical references in relation to the thorny issue of love and longing. Characters sick with mourning proliferate in the poetry of the Middle Ages.

The grieving Black Knight in Chaucer’s The Book of the Duchess mourns his lost beloved with infinite pain and no hope of a cure:

This ys my peyne wythoute red (remedy),
Alway deynge and be not ded.

In Marie de France’s 12th-century Les Deus Amanz, a young man dies of exhaustion when attempting to win the hand of his beloved, who then dies of grief herself. Even in life, their secret love is described as causing them “suffering”, and that their “love was a great affliction”. And in the anonymous Pearl poem, a father, mourning the loss of his daughter, or “perle”, is wounded by the loss: “I dewyne, fordolked of luf-daungere” (I languish, wounded by unrequited love).

The lover and the priest in the ‘Confessio Amantis’, early 15th century. MS Bodl. 294, f.9r. Bodleian Library, Oxford University

The entirety of John Gower’s 14th-century poem, Confessio Amantis (The Lover’s Confession), is framed around a melancholic lover who complains to Venus and Cupid that he is sick with love to the point that he desires death, and requires a medicine (which he has yet to find) to be cured.

The lover in Confessio Amantis does, finally, receive a cure from Venus. Seeing his dire condition, she produces a cold “oignement” and anoints his “wounded herte”, his temples, and his kidneys. Through this medicinal treatment, the “fyri peine” (fiery pain) of his love is dampened, and he is cured.

The medicalisation of love has persisted, as the sciences of neurobiology and evolutionary biology show today. In 1621, Robert Burton published the weighty tome The Anatomy of Melancholy. And Freud developed similar ideas in the early 20th century, in the book Mourning and Melancholia. The problem of the conflicted human heart clearly runs deep.

So if the pain of love is piercing your heart, you could always give some of these medieval cures a try.

Children have long been unfairly hit by US presidential executive orders

This post originally appeared on The Conversation. This piece was written by Rachel Pistol, Associate Research Fellow (History). 

Around 75 years ago, in February 1942, US President Franklin Delano Roosevelt signed Executive Order 9066, which led to the forced relocation and internment of more than 110,000 individuals of Japanese ancestry. The majority of them were American citizens, and a large proportion were children.

But unlike President Trump’s 2017 executive order to halt immigration and ban refugees from American soil, Roosevelt’s sweeping political move did not provoke any protest or dissent. Both presidents had invoked the notion of “national security” in their orders, and both decrees were said to be aimed at specific national groups. So is President Trump merely copying the policy of one of his more popular predecessors?

From the moment the US entered World War II in late 1941, all “enemy aliens” living in America – German, Austrian, Italian, and Japanese – were subject to restrictions on their freedom. These included the imposition of curfews and a ban on owning radios. So the real significance of EO9066, as it is known, was that it authorised the detention not just of enemy aliens, but also of American citizens. In theory, any American citizen could be relocated by order of the military.

But EO9066 was created for a particular purpose, which was to enable the internment of Japanese Americans living on the West Coast of America. It also made it possible for further orders to be authorised, such as Civilian Exclusion Order No.79, which ordered that “all persons of Japanese ancestry, both alien and non-alien” be excluded from a portion of the West Coast.

Japanese American children pledging allegiance in California, 1942. US Library of Congress

Yet one of the most striking things about EO9066 is that, unlike Trump’s executive order, it does not once talk about nationality. Instead, Roosevelt gave military commanders the right to “prescribe military areas in such places and of such extent as he or the appropriate military commander may determine, from which any or all persons may be excluded”.

Roosevelt declares war against Japan. National Archives and Records Administration

The creation of protected military areas during times of war is not unusual, and makes sense for security reasons. However, usually these zones surround military installations and coastal areas where the threat of invasion is greatest. In the case of the US during World War II, the whole of the West Coast was designated a military protected area. The most likely place for invasion, however, was the only place on American soil that had already been attacked – Hawaii.

About 40% of the population of Hawaii was of Japanese descent, as opposed to the West Coast, where they made up just over 1%. The military knew that Hawaii could not function if all the Japanese people were removed, and therefore decided to impose martial law. Individuals (usually men) considered the greatest threat to national security were arrested and interned, while the rest of their families were able to live at liberty.

The military’s decision to selectively intern on Hawaii was backed up by J. Edgar Hoover, director of the FBI, who was quoted as saying: “This evacuation isn’t necessary; I’ve already got all the bad boys.”

Currently, any immigrant or refugee who is given entry to the US goes through a stringent vetting procedure. This is partly why, according to American think tank the Cato Institute, no refugees have been involved in terrorist attacks on US soil since the Refugee Act of 1980. It is also worth noting that those behind major terrorist attacks in the US have mostly been born in America, or were permanent legal residents from countries not covered by Trump’s ban.

Land of the free?

But perhaps the greatest similarity between Roosevelt’s and Trump’s orders is how American-born children are affected. Half of those interned under EO9066 during World War II were American-born minors. Some have said this was inevitable because of the decision to intern both Japanese parents in the continental US. However, not all German, Austrian, or Italian mothers were interned, which meant that not all of their children were taken to camps.

In some cases, German-American children were left without care when both their father and mother were arrested. In other cases, families could “voluntarily” request to join interned husbands and fathers. There was no choice for Japanese-Americans. In other allied countries such as Great Britain, most enemy alien women were allowed to remain at liberty, along with their children. In the US, the children were considered as much of a threat as their foreign-born parents, leading to the internment of entire family units.

This seems to still be the case today, as demonstrated by the fact that an American five-year-old boy was detained for more than four hours as a result of Trump’s immigration order because his mother was Iranian. Sean Spicer, Trump’s press secretary, defended the decision because “to assume that just because of someone’s age and gender that they don’t pose a threat would be misguided and wrong”.

American-born children, therefore, are still considered dangerous, but only, it seems, if they are born to non-white immigrant parents. Their rights appear to remain linked to the country of their parents’ birth. Just as in 1942, the promise of “liberty and justice for all” still does not apply to all American citizens.

The busy Romans needed a mid-winter break too … and it lasted for 24 days

This post originally appeared on The Conversation and was written by Dr Richard Flowers, Senior Lecturer in Classics and Ancient History, University of Exeter.

In the Doctor Who Christmas Special from 2010, Michael Gambon’s Scrooge-like character remarks that across different cultures and worlds people come together to mark the midpoint of winter. It is, he imagines, as if they are saying: “Well done, everyone! We’re halfway out of the dark!”

The actual reasons for celebrating Christmas at this particular time in the year have long been debated. Links have often been drawn to the winter solstice and the Roman festival of Saturnalia. Some people have also associated it with the supposed birthday of the god Sol Invictus, the “unconquered sun”, since a fourth-century calendar describes both this and Christ’s birth as taking place on December 25.

Such speculation has inevitably led to claims that this traditionally Christian festival is little more than a rebranding of earlier pagan activities. But questions about the “religious identity” of public celebrations are, in fact, nothing new and were being asked in the later periods of the Roman empire as well.

This is particularly evident in the case of a rather obscure Roman festival called the Brumalia, which started on November 24 and lasted for 24 days. We cannot be sure exactly when it began to be celebrated, but one of our best accounts of it comes from the sixth century AD. A retired public official called John the Lydian explained that it had its origins in earlier pagan rites from this time of year, including Saturnalia.

Some people celebrated Brumalia by sacrificing goats and pigs, while devotees of the god Dionysus inflated goat skins and then jumped on them. We also believe that each day of the festival was assigned a different letter of the Greek alphabet, starting with alpha (α) on November 24 and finishing with omega (ω) on December 17.

A person would wait until the day that corresponded to the first letter of their own name and then throw a party. This meant that those with a wide circle of friends – and, in particular, friends with a wide variety of names – might potentially get to go to 24 consecutive celebrations.

We also have other evidence for the popularity of the Brumalia during the sixth century. A speech by the orator Choricius of Gaza praises the festivities laid on by the emperor Justinian (527–565), remarking that the emperor and his wife, Theodora, celebrated the Brumalia on adjacent days, since the letter iota (ι) – for Justinian – directly follows theta (θ) – for Theodora – in the Greek alphabet. Surviving accounts from the cellars of a large estate in Egypt also detail the wine distributions to officials and servants for the Brumalia of the master, Apion, which fell on the first day of the festival.

Yet, the origins of the Brumalia are far from clear. It seems to have been related to the earlier Roman Bruma festival, which took place on a single day in November and looked ahead to the winter solstice (or bruma in Latin) a month later, but little is known about this.

It is only really from the sixth century onwards that it appears in surviving sources, even though by then most Romans were Christians and had been ruled by Christian emperors for more than two centuries. John the Lydian also states that the “name day” aspect of the celebrations was a recent innovation at this time. As far as we can tell, therefore, this was not merely a remnant from a distant pagan past, but had actually developed and grown at precisely the same time as emperors, including Justinian, were endeavouring to clamp down on perceived “paganism” in their empire.

The historian Roberta Mazza, in one of the most comprehensive modern discussions of the festival, has argued that the Brumalia was simply too popular to get rid of entirely, but that Justinian sought to strip it of “pagan” elements. She says that in doing so, the emperor “reshaped and reinvented the meanings and purposes of the feast” and made it “both acceptable from a religious point of view and useful for constructing a common cultural identity throughout the different provinces of the empire”.

The true meaning of Brumalia

We know that the Brumalia continued to be celebrated at the imperial court in Constantinople until at least the tenth century, but it was certainly not without its opponents. John the Lydian reports that the church was opposed to the Brumalia, and similar statements of disapproval and attempts to ban it were also made by church councils in 692 and 743. For some Christians, it remained just too pagan for comfort. Controversy also surrounded other celebrations in late antiquity, including the wearing of masks at New Year, the Roman Lupercalia (with its naked runners), and the processions and dancing involved in the “Bean Festival” at Calama in North Africa.

How then should we view the Brumalia? Was it still essentially “pagan”, or had it become safely Christianised or secularised? I think that any attempt to neatly categorise these festivals, let alone their participants, is destined to fail. For some people, the religious elements will have loomed larger, while for others they will have been almost entirely irrelevant, as also happens with Christmas today.

The Brumalia could be celebrated in a variety of ways and have a multitude of meanings to different people throughout the empire, even if all of them saw themselves as Christians. Rather than arguing that Justinian or others who enjoyed the Brumalia were “less Christian” than its opponents, we might instead treat it as a vivid illustration of the fluidity and malleability of notions of culture and identity.

We cannot ever discover the true meaning of Brumalia, but we can be sure that it brought people together to commemorate being halfway out of the dark.

Medieval women can teach us how to smash gender rules and the glass ceiling

This post originally appeared on The Conversation. The post is written by Laura Kalas Williams, Postdoctoral researcher in medieval literature and medicine and Associate Tutor at the University of Exeter.

On the night of the US election, Manhattan’s magisterial, glass-encased Javits Centre stood with its ceiling intact and its guest-of-honour in defeated absence. Hillary Clinton – who has frequently spoken of “the highest, hardest glass ceiling” she was attempting to shatter – wanted to bring in a new era with symbolic aplomb. As supporters despaired in that same glass palace, it was clear that the symbolism of her defeat was no less forceful.

People wept, hopes were dashed, and more questions were raised about just what it will take for the most powerful leader on the planet to one day be a woman. Hillary Clinton’s staggering experience and achievements as a civil rights lawyer, first lady, senator and secretary of state were not enough.

The double-standards of gender “rules” in society have been disconcertingly evident of late. The Clinton campaign said FBI director James Comey’s handling of the investigation into Clinton’s private server revealed “jaw-dropping” double standards. Trump, however, lauded him as having “guts”. When no recriminating email evidence was found, Trump ran roughshod over the judicial process, claiming: “Hillary Clinton is guilty. She knows it. The FBI knows it, the people know it.” Chants of “lock her up” resonated through the crowd at a rally.

Mob-like cries for a woman to be incarcerated without evidence or trial? That’s medieval.

The heart of a king

Since time immemorial, women have manipulated gender constructs in order to gain agency and a voice in the political milieu. During her speech to the troops at Tilbury, anticipating the invasion of the Spanish Armada, Elizabeth I famously claimed:

I know I have the body but of a weak and feeble woman; but I have the heart and stomach of a king, and of a king of England too.

Elizabeth I, The Ditchley Portrait, c. 1592, National Portrait Gallery. Elizabeth stands upon England, and the top of the world itself. Her power and domination are symbolised by the celestial sphere hanging from her left ear. The copious pearls represent her virginity and thus maleness. Wikimedia Commons

Four hundred years later, Margaret Thatcher seemed obliged to follow the same approach, employing a voice coach from the National Theatre to help her to lower her voice. And Clinton told a rally in Ohio: “Now what people are focused upon is choosing the next president and commander-in-chief.” Not a million miles away from the kingly-identifications of Elizabeth, the pseudo-male “Virgin Queen”.

This gender-play has ancient origins. In the late fourth century AD, St Jerome argued that chaste women become male. Likewise, the early Christian non-canonical Gospel of Thomas claimed that Jesus would make Mary “male, in order that she also may become a living spirit like you males”.

15th century ‘Disease Woman’. Wellcome Collection, MS Wellcome Apocalypse 49, f.38r.

By the Middle Ages, this idea of female bodily inferiority became material as well as spiritual as medical texts on the topic proliferated. Women’s bodies were considered inferior and more prone to disease. Because of the interiority of female anatomy, male physicians had to rely on diagrams and texts to interpret them, often with a singular focus on the reproductive system. Since men mostly wrote the books, the lexical and pictorial construction of the female body has therefore been historically, and literally, “written” by male authors.

So women, who were socially constrained by their female bodies and living in a man’s world, had to enact radical ways to modify their gender and even their very physiology. To gain authority, women had to be chaste, and to behave like men by adopting “masculine” characteristics. Such modifications might appear to compromise feminist, or proto-feminist, ambitions, but they were in fact sophisticated strategies to undermine or subvert the status quo.

Gender-play

Illuminated image from Hildegard of Bingen’s (1098-1179) Scivias, depicting her enclosed in a nun’s cell, writing. Wikimedia Commons

Medieval women who desired a voice in religious circles (the Church was, of course, the unelected power of the day) shed their femininity by adapting their bodies, the way that they used them, and therefore the way in which they were “read” by others. Through protecting their virginity, fasting, mortifying their flesh, perhaps reading, writing, or becoming physically enclosed in a monastery or anchorhold, they reoriented the way in which they were identified.

Joan of Arc (1412-1431) famously led an army to victory in the Hundred Years’ War dressed as a soldier, in a time when women were not supposed to fight.

Catherine of Siena (1347-1380), defying social codes of female beauty, shaved her hair in defiance of her parents’ wish to have her married. She later had a powerful mystical experience whereby she received the heart of Christ in place of her own: a visceral transformation which radically altered her body and identity.

And St Agatha (231-251), whose story was widely circulated in the Middle Ages, refused to give in to sexual pressure and was tortured, finally suffering the severing of her breasts. She has since been depicted as offering her breasts on a plate to Christ and the world. Agatha subverted her torturers’ aim, exploited her “de-feminised” self and instead offered her breasts as symbols of power and triumph.

Saint Agatha bearing her severed breasts on a platter, Piero della Francesca (c. 1460–70). Wikimedia Commons

Some scholars have even argued that monks and nuns were considered a “third gender” in the Middle Ages: neither fully masculine nor feminine.

These flexible gender systems show how medieval people were perhaps more sophisticated in their conceptualisation of identity than we are today, when challenges to binary notions of gender are only now becoming widely discussed. Medieval codes of chastity might not be to most 21st-century tastes, but these powerful women-in-history took control of their own identification: found loopholes in the rules, found authority in their own self-fashioning.

The US presidential campaign has without doubt reinvigorated the politics of gender. Hillary Clinton has said: “If I want to knock a story off the front page, I just change my hairstyle”. It is easy to leap at such a comment, seeing Clinton as a media-sycophant, playing to the expectation that women are defined by their appearance. But in fact, like myriad women before her, Clinton was manipulating and exploiting the very rules that seek to define her.

Complete liberation this is not. Only when the long history of gender rules is challenged will powerful women no longer be compared to men. Like the response of Joan of Arc and her troops, it is surely now time for another call to arms: for the freedoms of tolerance, inclusion, equality and compassion. We must turn grief into optimism and words into action. To shatter not the dreams of girls around the world, but the glass ceilings that restrain them.

How Ross Poldark was a victim of Cornwall’s changing industrial landscape

This post originally appeared on The Conversation. The article was written by Joseph Crawford, Lecturer in English. 

In July 2016, the BBC announced the commissioning of a third season of costume drama Poldark, months before the second series was even due to be broadcast. This represents an impressive vote of confidence in the series, especially as season two will apparently not be repeating the famous “topless scything” scene which won the National Television Awards’ prize for TV Moment of the Year.

Go West. BBC

The real pivotal moment depicted by Poldark, however, is one of historical change in south-west England. In the mid-18th century, Cornwall and Devon were major commercial and industrial centres. Cornwall’s tin and copper mines were some of the largest and most sophisticated in Europe, while the profits from the Cornwall and Devonshire wool trade helped make Exeter one of the biggest and richest cities in England.

By the mid-19th century however, much had changed. The rise of the mechanised cloth industry in England’s North and Midlands sent the south-western wool trade into serious decline. And while Cornwall’s mining industry survived well into the 20th century, it experienced repeated crises from the 1770s onwards. This was primarily due to newly discovered tin and copper mines elsewhere in the world, leading to the large-scale emigration of Cornish miners to countries such as Mexico, Australia and Brazil.

The era depicted in Poldark shows the region on the very tipping-point of this transition. Ross Poldark’s struggles to keep his mine open and profitable are symptomatic of the economic difficulties experienced by the region as a whole during the late 18th and early 19th centuries.

As south-western towns lost their traditional role as centres of trade and industry, their focus shifted increasingly to tourism. This was especially true during the long years of the Napoleonic Wars which form the backdrop to the later Poldark novels. Cut off by war from their favoured resorts in France and Italy, a generation of English tourists began taking holidays in Devon and Cornwall instead.

By the late 18th century, writers in Devon were praising their native county for its natural beauty and its ancient history, rather than for the wealth and industry of which their parents and grandparents had been so proud. By the mid-19th century, the same was increasingly true of Cornwall.

This economic shift led, in turn, to the development of the Victorian mythology of the “romantic South-West”, still beloved of local tourist boards today.

This mythology is built upon a version of the region’s history which emphasises its remote and wild character, playing on associations with Merlin and King Arthur, druids and witches, smugglers and wreckers and pirates.

Like most costume dramas, Poldark’s primary concern is with the travails of cross-class romance. But it is also a narrative about de-industrialisation, and about the struggle of local businesses to remain competitive and economically viable within an increasingly globalised economy – a story which has some resonance in early 21st-century Britain.

The poverty of the Cornish miners with whom Ross Poldark identifies is not simply the result of gratuitous oppression. Instead they are the victims of a new economic order which has little interest in preserving local industry for its own sake.

Wild West

The show has certainly not been shy about making lavish use of the beauty of its Cornish setting, and has already triggered something of a tourism boom, with visitors flocking to the region to see for themselves the moors, cliffs, and beaches which Poldark employs to such dramatic visual effect.

But it also depicts the historical struggles of the region’s inhabitants to preserve the South West as something more than just a pretty place for other people to visit on holiday. In this sense, it is rather symbolic that season one of Poldark ends with Ross being falsely accused of wrecking. The legend of the Cornish wreckers, which reached its definitive form in Du Maurier’s Jamaica Inn, is founded on extremely slender historical evidence, but it persists because it fits in so neatly with the Victorian mythology of the South West in general, and Cornwall in particular: a mythology which viewed it as a lawless and desperate land, filled with crime and adventure, and remote from all true civilisation.

In Poldark, the looting of the wrecked vessel is motivated by hunger and poverty, which have in turn been caused by the economic depression besetting the region. But after spending the whole season struggling against Cornwall’s industrial decline, Ross finds himself in danger of being absorbed into a new kind of narrative about the South West – one which will have no place for men like him, except as picturesque savages.

Of course, in this respect, Poldark rather wants to both have its grain and (shirtlessly) reap it, too. Ross Poldark and Demelza appeal to their audience precisely because they embody the kind of romantic wildness which, since the Victorian era, has been the stock-in-trade of the south-western tourist industry.

They are passionate, free-spirited, and dismissive of class boundaries and social conventions: hardly the kind of people that the self-consciously respectable merchants and industrialists of the 18th-century South West would have wanted as their representatives or champions. But by setting its story of class antagonism against the backdrop of this crucial turning-point in the history of the South West, Poldark does serve as a reminder that the quietness of the region, which has proven so attractive to generations of tourists, is not the natural state of a land untouched by commerce or industry. It is the silence which follows their enforced departure.

The Medieval Somme: forgotten battle that was the bloodiest fought on British soil

This article originally appeared on The Conversation. It was written by James Clark, Professor of Medieval History.

Richard Caton Woodville’s The Battle of Towton.

A Battle of the Somme on British soil? It happened on Palm Sunday, 1461: a day of fierce fighting in the mud that felled a generation, leaving a longer litany of the dead than any other engagement in the islands’ history – reputed in some contemporary reports to be between 19,000 – the same number killed or missing in France on July 1 1916 – and a staggering 38,000.

The battle of Towton, fought near a tiny village standing on the old road between Leeds and York, on the brink of the North York Moors, is far less known than many other medieval clashes such as Hastings or Bosworth. Many will never have heard of it.

But here, in a blizzard on an icy cold March 29 1461, the forces of the warring factions of Lancaster and York met in a planned pitched battle that soon descended into the mayhem remembered as the Bloody Meadow. It ran into dusk, and through the fields and byways far from the battlefield. To the few on either side who carried their weapons to the day’s end, the result was by no means clear. But York in fact prevailed and within a month (almost to the day), the towering figure of Duke Edward, who stood nearly six feet five inches tall, had reached London and seized the English crown as Edward IV. The Lancastrian king, Henry VI, fled into exile.

Victor: the Yorkist Edward IV. The National Portrait Gallery

Towton was not merely a bloody moment in military history. It was also a turning-point in the long struggle for the throne between these two dynasties whose rivalry has provided – since the 16th century – a compelling overture to the grand opera of the Tudor legend, from Shakespeare to the White Queen. But this summer, as national attention focuses on the 100th anniversary of The Battle of the Somme, we might also take the opportunity to recall a day in our history when total war tore up a landscape that was much closer to home.

An English Doomsday

First, the historian’s caveats. While we know a remarkable amount about this bloody day in Yorkshire more than 550 years ago, we do not have the benefits granted to historians of World War I. Towton left behind no battle plans, memoranda, maps, aerial photographs, nor – above all others in value – first-hand accounts of those who were there. We cannot be certain of the size of the forces on either side, nor of the numbers of their dead.

A death toll of 28,000 was reported as early as April 1461 in one of the circulating newssheets that were not uncommon in the 15th century – and was taken up by a number of the chroniclers writing in the months and years following. This was soon scaled up to nearly 40,000 – about 1% of England’s entire male population – by others, a figure which also came to be cemented in the accounts of some chroniclers.

This shift points to the absence of any authoritative recollection of the battle – but almost certainly the numbers were larger than were usually seen, even in the period’s biggest clashes. Recently, historians have curbed the claims, but the latest estimate suggests that 40,000 men took to the field, and that casualties may have been closer to 10,000.

Lethal: an armour-piercing bodkin arrow, as used at Towton. by Boneshaker

But as with the Somme, it is not just the roll-call, or death-toll, that matters, but also the scar which the battle cut across the collective psychology. Towton became a byword for the horrors of the battlefield. Just as July 1 1916 has become the template for the cultural representation of the 1914-18 war, so Towton pressed itself into the popular image of war in the 15th and 16th centuries.

When Sir Thomas Malory re-imagined King Arthur for the rising generation of literate layfolk at the beginning of the Tudor age, it was at Towton – or at least a battlefield very much like it – that he set the final fight-to-the-death between Arthur and Mordred (Morte d’Arthur, Book XXI, Chapter 4). Writing less than ten years after the Yorkist victory, Malory’s Arthurian battleground raged, like Towton, from first light until evening, and laid waste a generation:

… and thus they fought all the long day, and never stinted till the noble knights were laid to the cold earth and ever they fought still till it was near night, and by that time there was there an hundred thousand laid dead upon the ground.

Lions and lambs

In his history plays, Shakespeare also presents Towton as an expression of all the terrible pain of the years of struggle that lasted over a century, from Richard II to Henry VIII. He describes it in Henry VI, Part 3, Act 2, Scene 5:

O piteous spectacle! O bloody times! While lions war and battle for their dens, poor harmless lambs abide their enmity. Weep, wretched man, I’ll aid thee tear for tear.

Both the Somme and Towton saw a generation fall. But while it was a young, volunteer army of “Pals” that was annihilated in 1916, osteo-analysis suggests that Towton was fought by grizzled older veterans. Yet in the small society of the 15th century, this was no less of a demographic shock. Most would have protected and provided for households. Their loss on such a scale would have been devastating for communities. And the slaughter went on and on. The Lancastrians were not only defeated, they were hunted down with a determination to see them, if not wiped out, then diminished to the point of no return.

Battle of Towton: initial deployment. by Jappalang, CC BY-SA

For its time, this was also warfare on an unprecedented scale. There was to be no surrender, no prisoners. The armies were strafed with vast volleys of arrows, and new and, in a certain sense, industrial technologies were deployed, just as they were at the Somme. Recent archaeology has confirmed the presence of handguns on the battlefield, evidently devastating if not quite in the same league as the Germans’ Maschinengewehr 08 in 1916.

These firearm fragments are among the earliest known to have been used in northern European warfare and perhaps the very first witnessed in England. Primitive in their casting, they presented as great a threat to the man who fired them as to their target. Surely these new arrivals would have added considerably to the horror.

Fragments of the past

Towton is a rare example in England of a site largely spared from major development, and vital clues to its violent past remain. In the past 20 years, archaeological excavations have not only extended our understanding of the events of that day but of medieval English society in general.

The same is true of the Somme. That battlefield has a global significance as a place of commemoration and reconciliation, especially as World War I passes out of even secondhand memory. But it also has significance as a site for “live” research. Its ploughed fields and pastures are still offering up new discoveries which likewise can carry us back not only to the last moments of those lost regiments but also to the lost world they left behind them, of late Victorian and Edwardian Britain.

It is essential that these battlefields continue to hold our attention. For not only do they deepen our understanding of the experience and mechanics of war, they can also broaden our understanding of the societies from which such terrible conflict springs.

Official World War I memorial rituals could create a generation uncritical of the conflict

This article first appeared on The Conversation. It was written by Catriona Pennell (Senior Lecturer in History, University of Exeter) and Mark Sheehan (Senior Lecturer, School of Education, Victoria University of Wellington).

French and British school children during a Somme Memorial in Thiepval. Yui Mok/PA wire

As commemorations to mark the centenary of the Battle of the Somme begin, it’s clear that World War I retains a lingering and vivid presence in the countries which fought in it. But the unfolding centenary anniversaries can also be understood as a moment of heightened anxiety about the future of the way the war is remembered.

As we move further away from the original event itself, much state-sponsored centenary activity in the UK, Australia and New Zealand has actively targeted young people – singling them out as the “next generation”, charged with carrying the memory of what happened on the battlefield forward.

Children take part in a Somme memorial at Manchester Cathedral. Christopher Furlong/PA Wire

In these countries, the memory of World War I has been sanctified to such an extent that perspectives other than a sense of respect for those who were directly affected by the conflict are often overlooked.

In the UK, young people are taking centre stage in all the major government-funded commemorative activities, the cornerstone of which is the £5.3m Centenary Battlefield Tours Programme, which aims to take 12,000 secondary state school pupils from England to the memorials on the western front between 2014 and 2019 as part of a national education initiative.

Revered status

Such investment, at a time of economic austerity, requires scrutiny, particularly regarding which ideas about the conflict are emphasised and at the expense of which alternatives. For example, whether children are being asked to reflect on civilians, pacifists or survivors – rather than solely the military dead – or to explore Britain’s uncomfortable relationship with its imperial past.

One secondary school pupil, who took part in a trip to World War I battlefields in spring 2015, told us how she might respond to a member of her coach party who felt remembering the war glorified conflict. She said:

I just really disagree with that viewpoint … it’s like walking into a church and you know saying that you love the devil and you hate God and everything. It’s not appropriate … the tour was to remember and to learn about that you know not many people there are going to put their hands up and agree with you because that’s not the purpose of going.

Much of the UK government’s commemorative activity involving young people is semi-religious, reverential and ritualistic. This risks closing down the opportunity for students to question the purpose of the war, to explore notions of the war’s futility in the light of the outbreak of World War II, or to consider which narratives of the war are being commemorated at the expense of others.

Anzac identity

In Australia and New Zealand, war remembrance is closely aligned with an Anzac identity. Purported to have emerged at Gallipoli in 1915, this ideal is framed around so-called “common values” of “mateship, courage, equality, self-sacrifice, duty and loyalty”. It is central to museum education programmes that continue to attract thousands of school visitors.

At this year’s Anzac Day dawn service at the Australian War Memorial in Canberra, director Brendan Nelson addressed “young Australians” directly: “Your search for belonging, meaning and values for the world you want – ends here.”

Australia’s and New Zealand’s commemorative activities also share many of the core elements of British commemoration, for example the exhortation of Laurence Binyon’s ode “we will remember them” is used at Anzac Day dawn services. At the heart of all three national cultures of remembrance is a sense of unquestioning reverence for those who served.

Anzac Day in Sydney. European Press Association

In New Zealand, all schools participated in the Ministry of Education’s Fields of Remembrance project. This saw 80,000 white crosses bearing the names of local service personnel who had died overseas hand-delivered to schools and laid in the school grounds, where, along with poppies and posters, they became the focus of war commemorations.

In Australia, pride has been used to encourage young people to connect with their country’s World War I history. Teenage duo The Berrys won the 2016 ACT Premier’s Anzac Spirit Prize with Proud, a song that thanks a dying Australian soldier and his mates for “a legend to be proud of”.

Amid these official celebrations, there appears to be little space in the UK, Australia and New Zealand for perspectives on war remembrance that go beyond pride in and reverence for the armed forces, that are inclusive of difference, and that allow young people to think critically about the significance of World War I. But if we are serious about the memories of the conflict surviving in all their diversity, we need to equip and encourage young people to engage critically as well as emotionally with this cataclysmic event, and with what it might say to us in the 21st century.

The authors would like to thank to Christina Spittel, lecturer in the school of humanities and social sciences, University of New South Wales, Canberra for help with the Australian examples.

How the Battle of the Bastards squares with medieval history

This article originally appeared on The Conversation. The post was written by James Clark, Professor of Medieval History. This article contains spoilers for Game of Thrones season six, episode nine.

A 12-foot giant, his unhuman features oddly familiar (almost homely, after two screen decades colonised by combat-ready orcs) wheels around a wintry courtyard, wondering at the thicket of arrow shafts now wound around his torso. He stops, sways somewhat, and falls, dead. So Wun Wun the Wildling met his doom in The Battle of the Bastards, the penultimate episode of this season of Game of Thrones.

It is one casualty which, with countless others in the scenes before and after, apparently has a claim to a place in history. “The most fully realised medieval battle we’ve ever seen on the small screen (if not the big one too)” is the breathless verdict from The Independent.

As a full-time historian of the other Middle Ages – Europe’s, every bit as feuding and physical as the Seven Kingdoms but with better weather – I am struck by the irony that Martin’s mock-medieval world might now be seen to set the bar for authenticity. There’s no doubt that for much of the screen’s first century, medieval was the Cinderella era: overlooked, patronised and pressed into service for clumsy stage adaptations, musical comedy and children. But over the past two decades – almost from the moment that Marsellus fired the line in Pulp Fiction (1994) – we have been “getting medieval” more and more.

Medieval millennium

Any connection between Braveheart (1995) and recorded history may have been purely coincidental, but its representation of the scale and scramble of combat at the turn of the 13th century set a new standard, pushing even Kenneth Branagh’s earnest Henry V (1989) closer to the Panavision pantomime of Laurence Olivier’s film (1944). Branagh had at least toned down the hues of his happy breed from the bold – indeed, freshly laundered – primary colours of Sir Laurence’s light brigade, but his men-at-arms still jabbed at each other with the circumspection of stage-fighters while noble knights strutted and preened.

Of course, at times it threatened to be a false dawn: First Knight (1995) and A Knight’s Tale (2001) are undeniable obstacles in making the case for a new realism. But new epics have extended the territory taken in Mel Gibson’s first rebel assault.

Now already a decade old, Kingdom of Heaven (2005) achieved a level of accuracy without reducing the cinematic to the documentary. For the first time, the scene and size of the opposing forces were not compromised by either budget or technological limitations. The audience is led to the gates of the Holy City as it would have appeared to the Crusaders. The armies’ subsequent encounter is captured with the same vivid colour and fear with which the contemporary chroniclers conjure it, catching especially the crazy spectacle of Christian liturgical performance – crucifixes, chanting priests – on the Middle Eastern plain. And descriptive details were not lost, particularly in the contentious arena of Crusader kit, now a hobbyists’ domain into which only the brave production designer – and braver historian – strays.

Meanwhile, Peter Jackson painted energetically with his medieval palette in The Lord of the Rings trilogy, not, of course, pointing us to a place or time but certainly providing a superior visual vocabulary for the experience of combat in a pre-industrial age.

Back to basics

So, has Game of Thrones bettered this?

There are certainly some satisfyingly authentic twists and turns woven around The Battle of the Bastards. The most significant casualties occur away from the melee of the pitched battle in one of a number of routs (medieval battles always ended with a ragged rout, not a decisive bloodbath). And the principal actors in the drama do not readily present themselves for a tidy dispatch. The mounted forces of Westeros are rarely decisive and even fighters of the highest status do not see out the day in the saddle.

Also accurate are the individual acts of near-bestial violence which occur, are witnessed and go on to define the significance of battle. The deliberate breaking of Ramsay’s face by Jon Snow is a point of entry into a central but still under-researched dimension of medieval conflict: ritual violence, such as the systematic, obscene dismemberment of the dead and dying English by their Welsh enemies during the Glyn Dwr wars.

Before the fall. ©2016 Home Box Office, Inc.

Yet I suspect that these are not the snapshots that have won the superlatives. No doubt it is the standout features of the battle scenes – their scale, the weaponry and the “reality” of wounding in real time – that have held most attention. And these threaten to turn us again in the direction of that Ur-Middle Ages which we had every reason to hope we had left for good.

Because medieval armies were always smaller than was claimed, far smaller than we see here. Weaponry was not fixed in time, but – more like the Western Front in 1917 than you might imagine – a fluid domain of fast-developing technology. It is time that directors gave space to firearms, which were the firsthand experience of any fighting man from the final quarter of the 15th century. They must also shed their conviction that “medieval” means hand-to-hand combat. It was sustained arrow-fire that felled armies, not swordplay, nor fisticuffs.

Life on the medieval battle path also meant poor health, rapid ageing and no personal grooming. So we are also overdue sight of a medieval fighting force as it might actually have arrived on the field: neither sporting sexy hairstyles, nor match-fit for action. They of course arrived after months of marching, if they arrived at all: dysentery passed through campaigning forces with fatal routine. They faced their foe in a youth that would have felt more like middle age to you and me.

And in the middle of this Ur-medieval battlefield there is a 12-foot giant, just to confirm that this is not medieval Europe, by any means.