
More Popular than Ever? Beards and Masculinity in History.

Associate Research Fellow Dr Alun Withey is an expert in early modern medical history. In this blog he looks at ‘beard trends’ over the centuries: the ways facial hair fashions have changed and the influence this has had on the cosmetics industry.

This post first appeared on Dr Alun Withey’s blog.

This week came the startling revelation that, in the past year, manufacturers of razors and related goods such as shaving foam have seen a drop in sales of more than £72 million. Market analysts IRI noted that men’s shopping habits were changing and, even though the total market was still worth £2.2 billion, this was a substantial dent. The cause of this change? Beards.

Express: Rise in popularity of beards cuts sales of shaving products by whopping £72m

Nobody can have failed to notice in recent months the ubiquity of facial hair. Keep your eyes open as you walk down your local high street and you will probably notice a variety of styles, with the ‘Amish’ style seemingly especially popular. It is also interesting how newsworthy beards are: just look at how often they have appeared as a topic for discussion in recent months, from the furore caused by Jeremy Paxman’s beard, to the lengthy discussions about celebrity beards at the Baftas in 2013, to the latest revelations about how much the beard is costing the economy.

This current beard trend is actually very interesting. Over the past 10 years or so beards have been less in vogue. There have been ‘spikes’ of beardedness but these have tended to be of short duration – sometimes only a matter of months.

But this latest outcrop of beards has already lasted the better part of eighteen months. By early summer 2013 the idea of ‘peak beard’ was already being put forward. Quoting the head of a major British barbering company, the Guardian suggested that ‘beards are more popular than ever… there’s a beard culture – people like talking about their beards, feeling their beards’. Now, in September 2014, passion for beards shows little sign of abating and, in many ways, appears to be going from strength to strength.

It is also interesting to note how economics has begun to intrude into the argument. By anyone’s yardstick £72 million is a large chunk of revenue to be lost to what some people see as an irrelevance – something everyday, quirky… even repulsive. In reality, though, beards have never been anything less than central to men’s conceptions of themselves. Faces, after all, are the most public part of us. The way we present ourselves to others involves all manner of things, from clothing to cosmetics, but the face is the ultimate index of character. The decision to shave, cover or adorn the face has implications for how we see ourselves and wish to be seen by others. Beards actually matter. Quite a lot. And they always have done.

Over the centuries beard trends tended to last for decades. It’s perfectly possible to identify an historical period by its facial hair. Think of sixteenth-century England. The Tudor ‘Spade beard’ was the order of the day: the long, oblong outgrowth of facial topiary sported by kings, princes and elites. Doubtless it made its way a lot further down the social scale too. This type of beard is evident in Holbein’s paintings. Not all Tudor men embraced the beard though. Thomas More, for example, was clean-shaven, perhaps in line with his austere lifestyle. Thomas Cranmer was likewise clean-shaven but, it is said, grew a beard as a symbol of his grief upon the death of Henry VIII and of his break with the past. In this sense the beard marked a turning point in his life.

Young Cranmer

Old Cranmer

Roundhead

In the seventeenth century Stuart monarchs preferred small, pointed ‘Van Dyke’ beards. Charles I and Royalist ‘Cavaliers’ often sported this type of facial hair together with flowing locks. Masculinity here was remarkably feminine, with flowing, diaphanous gowns and silk breeches the order of the day. Contrast this with the Puritans, who generally went clean-shaven, believing beards to be a mere bauble. One argument about the origins of the term ‘roundhead’ is that it referred to the shape of the head after the beard and hair had been shaved – a popular parliamentarian style – rather than to the shape of helmets.

Victorian men, after 1850, were characterised by their huge bushy beards. After nearly a century of being clean-shaven, British men were exhorted by a range of new publications with names like Why Shave?, which sought to convince them that shaving was little less than a crime against God and nature. The beard was the ultimate symbol of masculinity, and something used as a tool to prove to men that their position of superiority over women was justified. More than this, it was argued, beards had health benefits that simply couldn’t be ignored: they acted as filters to keep germs away from the nose and throat. (See my other post on Victorian beard health.)

In the twentieth century, at least up until around 1950, moustaches were much more in vogue. Charlie Chaplin’s ‘toothbrush’ moustache was a cultural icon. Whether or not (as is sometimes suggested) Adolf Hitler grew his because of Chaplin, whose work he admired, is another matter, but the military moustache was a staple of the first decades of the century, from British Tommies to the emblematic RAF pilot’s moustache.

There are many other important aspects to beards. Growing a beard has been an important marker of life stage: the transition from adolescence to adulthood. The first shave is a virtual rite of passage for a teenage boy. On the other hand, in the past, the ‘beardless boy’ was a symbol of immaturity or even of a lack of sexual prowess.

Indeed, the ability to grow a beard has been central to conceptions of masculinity through time. In the early modern period the lack of a beard was viewed in humoural medical terms as the result of a lack of heat in the ‘reins’ and therefore a lack of sexual potency. Men who had a thin, scanty beard were open to suspicion of effeminacy (in the early modern sense, literally meaning that they had feminine characteristics). In the nineteenth and even twentieth centuries, so central was the moustache to military regiments that men unable to grow one were expected to wear a false moustache made of goat’s hair.

How d’ye like me? Eighteenth-century dandy

The management of facial hair says much about how men view themselves. During the Enlightenment the mark of a civilised man was a clean-shaven face. To be bearded signified loss of control over the self and a rugged masculinity that was neither elegant nor refined. After 1850, however, as I have noted, the fashion was for huge beards, which were seen then as the ultimate symbol of God-given male authority. In this sense the beard was the emblem of the Victorian man.

After 1900, with the burgeoning market for shaving apparel and cosmetics, the situation became even more complex. It is also noteworthy that the pace of change has quickened. Where beard trends used to last decades, since the 1980s they have become more fleeting – probably a result of internet-driven celebrity culture.

If all this is true, what does the current vogue for facial hair tell us about men today? What ideal of masculinity are men in 2014 aspiring to? It is difficult to say. Unlike in the past, changes in the masculine ideal are harder to track because they are now much more transitory. Nonetheless, one of the constants has been emulation. In the early modern period monarchs provided a bearded (or indeed clean-shaven) ideal. By the Victorian period powerful and fashionable figures, and new types of industrial and military heroes, offered men something to aspire to. Now, with almost unlimited access to the lives of celebrities through the voracious media and internet, the opportunities to find fashion ‘heroes’ to emulate are almost limitless. The question now is how long this trend will last and, perhaps more interestingly, whether there will be a backlash against the beard. History suggests so.

Dr Katie Lunnon Q and A: why she is dedicating her career to dementia research

Dr Katie Lunnon is the University of Exeter lead on Alzheimer’s Research UK’s newly launched South West Research Network Centre, a partnership with Plymouth University.

Here, in a Q and A with the charity, she outlines her work and explains why she is dedicating her career to advancing knowledge on the condition.

Find out more about how the new research partnership will tackle dementia.

Why did you decide to become a dementia researcher?

From a young age I have been fascinated by science, and I always wanted to become an academic scientist. Dementia is an illness very close to my heart; both of my grandmothers sadly died with dementia. The first, Mary, developed symptoms of Lewy body dementia about 15 years ago, and watching her change from a fun-loving lady, who was the centre of every party, into a shadow of her former self was incredibly hard for all the family. When my other grandmother, Joan, developed vascular dementia in more recent years, we knew what changes to expect, but it was still very hard to see her kind personality change. My hope is that the work we do every day as dementia researchers will one day mean that other people won’t see their grandparents lost to this devastating illness.

How will the new South West Network change your research and research in the area?

The new South West Centre will significantly enhance research capacity across Devon and Cornwall. The centre will provide a platform for new collaborations within Exeter and Plymouth, as well as allowing us to interact easily with researchers at the other 14 Network centres across the UK. This support from Alzheimer’s Research UK will also foster the next generation of dementia researchers; postgraduate students will now be able to attend conferences with centre support, which they may not otherwise have been able to do. Finally, given the age demographic within the South West, the establishment of this new network centre will allow researchers to interact and work together with members of the public affected by dementia at our annual meeting. I believe that, as researchers, seeing and understanding the impact of dementia within the community we live in is incredibly important.

Dr Katie Lunnon is the University of Exeter lead for the South West Alzheimer’s Research UK Network Centre.

What has been the highlight of your career?

I work in a very new research field called “dementia epigenetics”. To put this in context, every cell in your body has the same genetic code contained within its DNA, yet a heart cell behaves and looks very different from a nerve cell, for example, because certain genes are switched on and certain genes switched off in different cell types – a process known as “epigenetic” regulation. Epigenetic changes are thought to be one mechanism through which environmental factors, such as diet, exercise and smoking, can lead to disease. I am very proud that we recently published the first studies showing that very striking and consistent epigenetic changes occur in Alzheimer’s disease, particularly in regions of the brain that are involved in learning and memory. Off the back of this work I have secured funding from a number of Alzheimer’s charities, including Alzheimer’s Research UK, the Alzheimer’s Association (US) and the Bristol-based Alzheimer’s charity BRACE, to explore these changes in more detail.

What do you think needs to be done to increase the number of dementia researchers in the UK (currently outnumbered by cancer researchers 6:1)?

One of the biggest hurdles is the significant imbalance in funding available to dementia researchers compared to those working in cancer, with twelve times as much funding for cancer research as for dementia. With a limited pool of funds available, many excellent research projects cannot be funded. As an early career researcher, you are often competing with established investigators for the same pot of money. Greater government funding for dementia research, and more early-career focussed schemes, would encourage more excellent young researchers to pursue a career in dementia research.

What do you think are the barriers for people pursuing academic careers?

As a researcher, after completing your postgraduate (PhD) studies, you often spend up to ten years as a postdoctoral scientist. These jobs are rarely permanent, and you work on fixed-term contracts ranging from a couple of months to a number of years. After each contract, you often end up moving to a new university, in a different city, and perhaps even a different country. It is hard for scientists to put down roots during this postdoctoral period, and given that the academic career path is a pyramid, with significantly fewer positions at each level, it is unsurprising that many excellent scientists leave academia for permanent positions outside the sector before reaching a tenured post.

The gender imbalance in science is often very striking, especially when it comes to the numbers of female professors. What do you think can be done to address this? Do you have any views on getting more girls into STEM subjects in the first place?

The prospect of many years on fixed-term contracts could discourage many women with children from pursuing an academic career. The Athena SWAN charter aims to encourage more women into STEM disciplines. It is incredibly encouraging for research in the South West that Exeter has been awarded a silver institutional award for addressing gender imbalances, and has working groups tasked with maintaining this progress. Within the University, flexible working patterns and adapted targets that account for career breaks are positive steps towards addressing these imbalances for the next generation of scientists.

Multi-discipline courses will help solve emerging global problems

What is the value of multi-disciplinary degrees, and will they be more successful at tackling problems such as climate change?

In this blog, Dr Amber Griffiths (née Teacher) looks at the benefits of offering more interdisciplinary degrees.

The Environment and Sustainability Institute’s Dr Griffiths is a Research Fellow with a particular interest in the biology of wildlife.

This blog first appeared in The Conversation.

Across the globe, we are experiencing rapid changes to our environment and social structures. Climate change, population growth, and social unrest are causing ever-increasing problems. The rate of change poses serious challenges for education and how we prepare graduates for an unpredictable future.

Courses addressing environmental change and social adaptability are slowly appearing in university prospectuses around the world. For the most part, these topics come in the form of new postgraduate courses.

For example, Harvard University has a graduate program in sustainability and environmental management. The prospectus states that students will be “primed to create solutions to the crises affecting our global community”. Many other universities also now run similar master’s-level courses on environmental sustainability.

Combining different subjects

But sustainability as a subject can only be taught by drawing from several academic disciplines. The answers to the big global questions cannot be found within single traditional disciplines such as biology or politics on their own.

The new courses tend to combine elements of environmental science, economics and politics. They often include modules covering new topics such as global environmental politics or the sustainability of food production. Enabling students to learn from multiple disciplines is a crucial step towards helping them address the big problems facing society. This is particularly important since we cannot predict what the future problems might be.

Undergraduate courses have lagged behind, but there are some truly interdisciplinary degree courses beginning to appear. Several universities now provide a diverse education via new BASc degrees in arts and sciences. The most successful examples are from University College London in the UK and McMaster University in Canada.

Helping to solve tomorrow’s problems. Lightbulb image via Shutterstock

The BASc degrees typically include new modules on multi-disciplinary working and communicating knowledge. These enable students to then pick and mix from pre-existing modules across many different departments. Additional features of these degrees include interdisciplinary research projects and substantial work placements, which are likely to improve employability.

Flexibility and online learning

Broad interdisciplinary degrees are unfortunately not yet widely available. However, a growing number of universities internationally now offer flexible combined honours degrees. This approach is similar to the US major/minor model of higher education.

Many university students also now routinely use Massive Open Online Courses to extend their learning beyond their degrees. Supplementing learning with online courses provides broader training than is available through standard degrees.

Such approaches are well placed to provide the diversity of knowledge students need to address the global environmental and social problems that don’t stay within the realms of a single subject. But diversifying education is only part of the change needed. The methods we use to teach and assess students also play critical roles in making them adaptable.

Problem-based learning is already at the heart of many medical and law degrees. It provides the opportunity to practise broad thinking in real-world situations. Problem-based learning also encourages self-directed and explorative learning. This approach could be used more widely to develop the adaptability that students need in the current climate.

For example, students could be faced with a local farmer who is experiencing crop failures, or a small business which is struggling due to the increasing cost of raw materials. The students then research the underlying problems and potential solutions. Both scenarios are broadly related to climate change, but the first might require pulling together subjects such as ecology, soil science, engineering, and economics. The second scenario might require research on climate forecasting, ecosystem services, and business.

Some universities now offer cross-disciplinary problem-based learning events focused around global challenges such as food security or even educational reform itself. Assessment can be directly built into these new forms of teaching, reducing the reliance on traditional exams, which have been widely criticised for being a poor test of understanding.

Skills for unpredictable situations

Rolling out modern teaching and learning approaches more broadly could help students to integrate the many disciplines needed to address global change, and to apply their knowledge to unpredictable situations.

Our education system was designed for a bygone time, and is not equipping students with the skills to thrive in our changing world. It is clear that employers increasingly need staff who are capable of working in unstructured situations. Broader society also needs the same flexibility in this time of great change. Reluctance to change is common, but universities will need to embrace new approaches to educate tomorrow’s society.


Amber Griffiths is a scientific adviser for the cultural laboratory FoAM Kernow. She currently receives funding from the EU, the Royal Society, the Natural Environment Research Council, and the Fishmongers’ Company. Her ORCID ID is 0000-0002-7455-6795.
