Wednesday 17 December 2014

On a Street Like This

Another year is drawing to a close.  Among other things, 2014 has seen a historic Scottish referendum, Winter Olympics in Russia and a World Cup in Brazil.  And in February this year, Russell T. Davies’ taboo-busting series Queer as Folk turned fifteen years old.  Fifteen.  One whole Nathan Maloney.  Nathan would be thirty now - the same age Vince turned halfway through the first series.  Stuart’s son Alfred would be fifteen himself.  And Stuart and Vince would be in their mid-forties.

Queer as Folk depicted the drug-fuelled, sex-filled party life of promiscuous Stuart, his luckless best friend Vince, and the series of events following one damp Thursday when Stuart picks up 15-year-old Nathan who’s on his first night out on Manchester’s Canal Street.  Not only was Queer as Folk ground-breaking, it was also a fantastic piece of television.  From its pitch-perfect soundtrack - ranging from club classics, to cheese pop, to Murray Gold’s pulsating, electronic score - to Davies’ brilliantly crafted script, it was an unrivalled success.  The script, in parts witty, in others excruciatingly acerbic, is trademark Davies.  His distinctive dialogue, capturing the randomness and warmth of everyday speech patterns, so familiar now to viewers following his revival of Doctor Who in 2005, is flawless.  Queer as Folk was unashamed, delightfully quirky and, above all, bursting with life – characters don’t walk, they run; through schools and hospitals and right down the middle of the road.  It’s a show you wish there was more of whilst being simultaneously glad that there isn’t.

The two-part second series that aired in 2000 leaves you wondering what the characters might be up to now.  Vince, I like to think, is happy.  Doctor Who is back on the telly, and, one hopes, he did eventually get that shag.  I imagine Nathan all grown up, bored now of being king of his small world and ready to hand on the mantle to his own protégé.  But what would Stuart Alan Jones make of 2014?  Re-watching the show now, you find it as vivacious as ever, but some aspects have clearly aged.  The chunky mobiles, dial-up internet and Vince’s video tape collection all seem a little retro.  And, some might argue, the gay scene of the 1990s has changed beyond recognition.

In terms of gay rights, 1999 was a very different time.  Back then, the age of consent for homosexual couples was 18, making Nathan’s age all the more shocking.  It was lowered to 16 in January 2001, and since then, gay rights have crept towards equality at a fairly steady pace – the Sexual Offences Act 2003 finally removed all reference to gender, civil partnerships were legally recognised by the end of 2005, the same year in which same-sex adoption became legal, and finally, in March this year, same-sex marriage was officially recognised in England and Wales.  Discriminating against someone based on their sexuality, in any area of public life, is now illegal.

In Queer as Folk, Stuart is out and proud and wants nothing whatsoever to do with the heterosexual lifestyle.  He accuses Vince of being “a straight man who f*cks men” and of “wanting a wife”, simply because Vince appears to want to settle down in a monogamous relationship.  For Stuart, being gay means promiscuity, drugs and clubs; it means embracing a whole alternate lifestyle – there are no grey areas.  As he tells Martin Brooks (he of the wife and dodgy roof) – “you either do it or you don’t, but don’t be a tourist.”

In part, the development of the gay culture that Stuart embraces was a reaction to the increasing homophobia and marginalisation of gays throughout the latter half of the 20th century.  Homosexual acts may have been decriminalised in 1967, but by no means did this allow gay people to enjoy the same lifestyle as their heterosexual peers.  They had no hope of marriage or children, the things that many people aspired to.  They were ostracised in every area of society, from education to the military, the media and politics.

This reached a head in the 1980s, with the AIDS crisis heightening the stigma of promiscuity and recklessness, and with the introduction of Section 28 of the Local Government Act in 1988, which stated that a local authority should not intentionally promote homosexuality or present it as normal.  In the light of this, it is little wonder that a gay subculture developed.  If society deemed them to be abnormal, then they would reject everything that society said they should be, and places such as Canal Street developed to accommodate these separate communities.

Canal Street itself developed as Manchester’s gay village following the decline in the use of canals and the collapse of the cotton industry by the 1960s.  The unlit and unvisited street made a good place for clandestine meetings.  During the 1980s, Greater Manchester’s Chief Constable, James Anderton, an evangelical Christian, encouraged police officers to “stalk its dank alleys and expose anyone caught in a clinch, while police motorboats with spotlights cruised for gay men around the canal's locks and bridges.”

Everything changed in 1990, with the opening of the glass-fronted club Manto - an antithesis to the notion that gay people should hide themselves away.  The club lost money initially, as people were afraid to be seen there, but other clubs and bars soon grew up around it, transforming Canal Street into a vibrant gay community.

Manto closed its doors last year.  The face of Canal Street, the Village and the gay scene as a whole is changing.  People complain that Canal Street is now too ‘straight’, overrun with hen-dos and tourists, in part a result of the ‘Queer as Folk effect’.  Gay bars elsewhere in the UK are losing money too, as the desire for segregated bars declines.  And a division is opening up between those in the gay community who want to retain their separate lifestyle and those who want to live the traditional heterosexual lifestyle, now that they can have it too.  There has been some tension over customers being turned away from clubs for ‘not looking gay enough’.


What would Stuart Jones make of all this?  Would he be a middle-aged party boy still holding the same beliefs or a responsible father embracing the changes?  Could Stuart’s world still exist in 2014?  It’s true that homosexuality is becoming more accepted, but it has not yet been normalised.  Although traditional gender roles are gradually becoming more blurred, homophobia certainly hasn’t been stamped out.  Fifteen years have gone by, and we’re all getting older.  Some things have changed, but others, sadly, haven’t: there is still a long way to go before a person’s sexuality doesn’t define them.  Stuart and Vince and Nathan may well still be out there partying away.  And if Russell T. Davies is so inclined, there are certainly a few fans who would be interested in finding out what they're up to.

Thursday 27 November 2014

Teach a Child to Fear

Last Saturday, as the people of Ferguson waited for the grand jury verdict on the shooting of Michael Brown, 500 miles away, in Cleveland, Ohio, 12-year-old Tamir Rice was shot and fatally wounded by two police officers who had been called after reports of a child playing with a gun in a playground.  The gun in question was in fact a replica ‘airsoft’ pistol.  Officers reportedly asked Tamir to put his hands up – he reached for the gun.  They fired.  He was twelve years old.

I work with 12-year-olds.  Most of them are idiots at least once a day.  Many of them are idiots for large portions of the day.  That’s ok.  They’re supposed to be idiots.  They’re kids – they’re not done yet.  I know 12-year-olds who like to break the rules and challenge authority.  I know 12-year-olds who’d think it was a bit of a laugh to wave a replica gun around and ignore a request from a police officer.  They don’t half wind me up sometimes.  But I am still happy to see them walk into the classroom every morning.  Because they sometimes ask for help with a maths problem; because sometimes they’ll reach out to help another student; because sometimes the only kind words they’ll ever hear are from teachers; and because they are so full of potential to grow out of being idiots.

Footage released today shows police officers pulling up right beside Tamir and shooting him within two seconds of arriving at the scene. It’s been reported that the officers shouted warnings as they drove into the empty park, but it appears that no attempt was made to negotiate with Tamir once they arrived.  His hand seems to go to the gun in his waistband as the police car pulls up.  I know 12-year-olds who, when in high stress situations (and one would imagine being faced by armed police officers is just a little more stressful than being asked to demonstrate something in front of the rest of the class) would not understand a simple instruction to put their hands up.  The police officers who shot Tamir Rice, we are told, however, had to make a ‘split second decision’.  What kind of world do we live in where split second decisions lead to shooting dead a child?

The unavoidable debate here is one about race.  Because Tamir Rice was black, and the officers who shot him were white.  Around 53% of Cleveland’s population is black, but blacks make up only 27% of its police force.  Over 50 years after Martin Luther King and his followers marched to Washington and with its first black President in office, the USA remains a racially divided nation, and in no area more so than the criminal justice system.  The statistics are quotable and undeniable.  Racial profiling is still rife, with blacks being twice as likely to be stopped by police in many cities in the US.  African Americans constitute nearly 1 million of the USA’s 2.3 million prison population and are incarcerated at nearly six times the rate of whites.  If current trends continue, one in three black males born today can expect to spend time in prison during his lifetime.  There are almost as many black men in prison as in college.  In 2010, African Americans comprised 13% of the population but accounted for 55% of gun homicides.  Homicide is the leading cause of death amongst African-American males aged 15-24, who are 10 times more likely to die of murder than whites of the same age group.

Although disputes over racial imbalance in the justice system remain politically explosive, fuelled further by shootings such as those of Tamir Rice, Michael Brown and Trayvon Martin, the bigger issue here is a country where officers policing the streets shoot first and ask questions later.  Guns are glorified in the US - the right to own and use them is sanctified in the constitution and politicians and a powerful gun lobby fight passionately against laws to control them.  This is a country where over 8,000 people a year are killed by guns. Its gun crime rate is comparable with some of the most corrupt and dangerous regimes in the world, including the Democratic Republic of Congo and Colombia.  It’s a country where kids want to play with toy guns because they see grown-ups playing with real ones.  And it’s a country where a 12-year-old child playing in a snowy park is presumed to be a dangerous criminal, unable to be reasoned with.  How many more children have to be shot and denied the chance to ever be a wonderful idiot again before America changes the way it thinks about guns?

Friday 7 November 2014

From Cuba with Love

When the Ebola outbreak first spiralled into an international crisis, the country that dispatched the largest contingent of medical personnel to help was not a major western power, but Cuba, a small island nation of 11 million with a GDP of $6,051 per capita[1].  There are hundreds of Cuban medical staff now in Sierra Leone, Liberia and Guinea - more than from any other nation.  In contrast, many western countries have seemed “more focused on stopping the epidemic at their borders than actually stemming it in West Africa.”

For Cuba to be playing such a major role in the medical response to the crisis is not unprecedented.  Cuba currently has around 50,000 medical workers in 66 countries around the world and has more health workers deployed abroad than any of the G8 nations.  Fidel Castro’s newly formed government first dispatched medical aid abroad in 1960, following an earthquake in Chile.  Many more missions in numerous countries followed over the subsequent decades.  The country’s current dedication to what has been termed ‘medical internationalism’ originated in the aftermath of Hurricanes Georges and Mitch, which devastated the Caribbean in 1998.  The Cuban government vowed to train one doctor for every life lost in the storms.  To achieve this aim they established the Latin American Medical School: ELAM.  Initially it was free for all students to attend (Raúl Castro’s reforms since taking power from his brother Fidel in 2008 have included charging North American students to attend ELAM) and it has subsequently trained over 33,000 students from 76 different countries, who return home to practise, usually amongst the world’s poorest people.

After the Haiti Earthquake in 2010, it was Cuba that provided the highest number of health workers, many of whom had been there since the 1998 storms.  Cuba also sent 2,250 doctors to treat survivors of the Kashmir Earthquake in 2005.  But Cuba does not limit its offers of assistance to poorer countries.  Within three days of Hurricane Katrina hitting New Orleans in 2005, Cuba had assembled 1,100 doctors and nurses and 24 tonnes of medicines, ready to fly to the US, but the Bush government did not even acknowledge the offer.

Although the Cuban Henry Reeve Brigade (named after a US volunteer in Cuba’s war of independence against Spain of 1868-1878) is trained in disaster relief medicine, Cuba also commits to long-term projects.  They dislike the paternalistic term ‘medical aid’ and instead prefer ‘medical cooperation’ or ‘collaboration’.  When Cubans are sent to disaster areas, they are there for the long haul, usually working in the affected area for two years, and will be replaced by more Cubans should the need persist.

The importance of a robust healthcare system has been a key Cuban policy since Castro’s rebels overthrew Batista in the Revolution of 1959.  Ernesto ‘Che’ Guevara, Fidel Castro’s right-hand man and himself an outsider who came to the aid of a country in need, had qualified as a doctor back home in Argentina before turning to life as a revolutionary.  Both he and Castro believed that unrestricted access to medical care was a basic human right and it was therefore enshrined in the Cuban constitution.  Despite the fact that almost half of Cuba’s doctors had fled from the communist regime by 1961, Che’s ‘revolutionary medicine’ encouraged a new generation of poor Cubans to train as doctors and return to those poor areas to practise.

The emphasis placed on good quality medical care still persists.  In Cuba, doctors visit all their patients at least once a year, regardless of whether they are ill or not.  This model has been in place since the revolution – the cornerstone of Cuba’s medical system is that prevention is better than cure.  This, alongside a comprehensive programme of vaccinations, has led to a significant reduction in the infectious diseases that blight Cuba’s Caribbean neighbours.  The net result is a mortality rate similar to that enjoyed by developed nations: “as Cubans joke, they live like the poor but die like the rich.”  In addition, “Cuba produces some 80 per cent of its own medical products, which are sold at a fraction of the price they would cost elsewhere.”

However, alongside these admirable principles lie some uncomfortable truths.  The average Cuban doctor’s salary is only $25 a month and many doctors trained for free in Cuba defect once they are sent abroad.  The Bush Administration was particularly keen on promoting this and set up a special programme in 2006, specifically targeting Cuban doctors and encouraging them to defect to the US.  Until January 2013, when the high price of exit visas for Cubans was dropped, the cost of leaving the country prevented the ‘brain drain’ suffered by other poorer nations.  Since then, there has been a marked increase in the number of Cubans exiting the country legally and it remains to be seen what impact this will have on Cuban medical care.  What is certain, however, is that medical internationalism has had an effect on Cuba’s national healthcare system, with waiting times for GPs increasing.  Yet although approximately 20% of Cuban doctors are working abroad, “the ratio of doctor to patients in Cuba is still probably the best in the world.”[2]

Prior to this year’s Ebola outbreak, Cuba had been a longstanding friend of many African nations.  In 1963, Cuban doctors and nurses accompanied Algerian soldiers fighting on the border with Morocco and the injured were brought back to Cuba for free treatment.  In 1965, Che himself went to fight alongside local insurgents in Zaire (now the Democratic Republic of Congo) and whilst there “helped launch one of Africa’s first mass immunisation campaigns." Many more operations took place over the following decade, but it was the medical support, rather than the military, which was more effective and ended up forging the stronger links.  In the 1970s, when many emerging nations were experimenting with socialism and aligning themselves with communist states who opposed their former colonial masters, ties deepened between Cuba and Africa.  When Sierra Leone president Ernest Bai Koroma welcomed the first Cuban delegation in the capital Freetown last month, he announced “this is a friendship that we have experienced since the 1970s and today you have demonstrated that you are a great friend of the country.”

Why is Cuba so committed to medical collaboration?  First and foremost, it is Cuba’s main export.  The largest cohort of Cuban medical personnel working abroad are in Hugo Chavez’s Venezuela, in exchange for around 100,000 barrels of oil a day at preferential prices.  The Cuban government is now specifically looking to expand their collaboration into wealthier countries.  Secondly, medical internationalism is unquestionably a form of soft power for Cuba, winning the country influence with potentially hostile governments and consolidating support from other smaller nations.  Just last month, 188 members of the United Nations voted for the 23rd time to condemn the US embargo against Cuba, first imposed by President Kennedy in the 1960s.  There are signs that the US may be ready to take a more open approach to Cuba’s medical internationalism, and, much to the consternation of the Republicans, the Obama administration has sent officials to meet with the Cubans to discuss a joint approach to the Ebola crisis. It is hoped that this combined response to Ebola may pave the way for an ending of the embargo.

There are those who argue that Cuba’s medical internationalism does little to negate the serious human rights abuses taking place within its own borders.  There is widespread repression of civil liberties in Cuba, especially with regards to political dissidents and journalists, with all media being heavily censored.  Though capital punishment was officially ended in 2003, there remain complaints of unfair trials, unjust sentences and torture in Cuban prisons.  Even within the revered healthcare system there is no right to privacy, patients do not have the right to refuse treatment and doctors are expected to report on the political leanings of their patients.  The favourable international view of Cuban healthcare is doubtless coloured by this suppression of dissenting views.  That such a small nation has made such a disproportionate contribution to the wellbeing of some of the world’s poorest nations is commendable, and there is no doubt that Cuban medical personnel working abroad have significantly improved the health of millions around the world.  It is only a shame that it should come at such a high cost to the freedoms of the Cuban people.






[1] To put that in context, the GDP of the USA is $53,143 per capita.
[2] There are 6.7 GPs per 1,000 people in Cuba – there are 2.4 per 1,000 people in the US and in Canada - http://www.counterpunch.org/2012/12/14/medical-internationalism-in-cuba/

Thursday 16 October 2014

Should the Falkland Islands stay British?


The strange political anachronism that is a cluster of rocky islands in the Southern Atlantic will, for the foreseeable future, be a thorn in the side of the British government; one of those small thorns which, for the most part, you are able to ignore but which occasionally flares up into a throbbing, angry, infected wound.  Whilst the average person living in the UK rarely gives the Falkland Islands a thought, for the Argentinians the question of Las Malvinas is central to their political life.  Regaining them for Argentina is one of the principal platforms on which many of their politicians stand for election.  When Britain went to war over the Falklands back in 1982, many people were surprised that Thatcher’s government would risk the lives of British servicemen over a distant outpost of a dismantled Empire.  But go to war we did, and the Islands were reclaimed.  Following the war, relations between the UK and Argentina reached a low point from which they have never recovered.

Just last week, the Welsh First Minister Carwyn Jones was the latest politician to blunder into the debate.  The Argentinian Ambassador, Alicia Castro, following a meeting with Jones, published a statement on the proceedings, in which she declared that she "refutes the propaganda from a sector of the Malvinas Islands' inhabitants portraying Argentina as hostile, in an attempt to justify the UK government's refusal to resolve the sovereignty dispute."  It is not known whether Jones knew Castro would make this statement, but there were immediate calls for Mr Jones to distance himself from such ‘distasteful’ comments.  As any British politician knows, the Falklands question is the Pandora’s Box of British politics – no good can come from trying to address it.  Britain will only come out appearing to be clinging to its colonial past whilst simultaneously hamstrung by the powerful Falklands lobby in Westminster.

Rather distressingly, then, for the British government, the United Nations has instructed the UK to address the question on numerous occasions since it passed Resolution 2065 in 1965, which called for both states to conduct bilateral negotiations to reach a peaceful settlement of the dispute.  In defiance of the cherished special relationship, the US has often backed Argentina in the debate, in a collaboration of New World against Old.  Even American actor Sean Penn decided to weigh in with his opinion, back in 2012, proclaiming that “I think that the world today is not going to tolerate any kind of ludicrous and archaic commitment to colonialist ideology."

In one of the least suspenseful elections ever held, just 3 people from a 92% turnout voted against remaining part of the UK in a referendum held in March 2013.  David Cameron suggested that this had settled the question once and for all and that the wishes of the Islanders should be respected.  Argentina labelled the referendum a political farce and has refused to drop its claims to Las Malvinas.

The Falklands are an archipelago of around 778 mountainous islands covering 4,700 square miles in the middle of the storm-battered South Atlantic. The population stands at about 2,932, primarily native Falkland Islanders (descended from 19th century British settlers), with a smattering of recent immigrants from the UK, Gibraltar, France, Scandinavia, Saint Helena and Chile.  There are fewer than 30 Argentinians living on the Falkland Islands.  Recent oil exploration has so far proved fruitless and the main industries remain fishing and agriculture.  The climate is cold, wet and extremely windy.  So why is Argentina so keen to have them back?

History
The foundation of Argentina’s claim to ownership of Las Malvinas is that the Islands were originally a colony of Spain, taken by force by the British, and therefore should have been inherited by Argentina when it declared independence from Spain in 1816.  However, discovery and colonisation of the Falklands isn’t as simple as the Argentinians might have us believe.

Although there is evidence that there may have been prehistoric occupants, the Islands were uninhabited at the time of the first recorded landing, by an English Captain, John Strong, en route to Peru and Chile in 1690.  Strong sailed on, however, and the Islands remained uninhabited until the French established Port Louis on East Falkland in 1764.  Two years later, British settlers founded Port Egmont on Saunders Island, a little to the northwest of West Falkland, but it’s entirely possible that the two colonies were unaware of each other’s existence.  That same year, 1766, the French surrendered East Falkland to the Spanish, who renamed Port Louis ‘Puerto Soledad’.  When the Spanish stumbled across Port Egmont in 1770, they expelled the British by force – almost sparking a war – but restored the settlement to them in 1771.

The Spanish and British managed to coexist in the archipelago until 1774, when the British withdrew from the Islands for economic reasons, leaving behind a plaque claiming them in the name of King George III.  In the upheaval of the Napoleonic Wars, during which Spain was allied to France, and in the aftermath of British invasions of South America, the Spanish evacuated their colony on East Falkland.  By 1811, the only inhabitants were gauchos and fishermen, and the Islands became politically undisputed fishing grounds until 1823, when a German-born merchant by the name of Luis Vernet was granted permission by Buenos Aires to fish and rear cattle in the ruins of Puerto Soledad.  He grew his enterprise and eventually brought over more settlers, and in 1829, Buenos Aires named him Military and Civil Commander of the Islands.  A raid in 1831 by the warship USS Lexington, captained by US Navy Commander Silas Duncan, ended Vernet’s tenure as governor of the Islands.  Buenos Aires attempted to reassert control over the settlement by installing a garrison there, but a mutiny in 1832 was followed by the arrival of British forces, who established British rule.

The British troops departed soon after and the same year, Vernet’s deputy, Scotsman Matthew Brisbane, returned to pick up where Vernet had left off.  However, unrest followed, and gaucho Antonio Rivero led a group of dissenters in murdering Brisbane and attacking the settlers.  In the wake of this, the British returned to impose order once more.  The Falkland Islands became an official Crown Colony in 1840 and began to be populated by Scottish and Welsh emigrants, descendants of whom have lived on the Islands ever since.

Like many places in the world, the history of the Falkland Islands is complex and muddied by the colonial one-upmanship of the European powers during the 18th and 19th centuries.  Much water has passed under the bridge and the world has changed dramatically since Buenos Aires last had control of Puerto Soledad.  But since the first official inhabitants were French, albeit briefly, if we are tracing back rightful ownership on historical grounds, should not the Falkland Islands technically belong to France?

Geography
The Islands lie approximately 300 miles off the Argentinian coast, compared with nearly 8,000 miles from the UK – admittedly, a rather significant geographical imbalance.  Argentina claims that this disparity strengthens its claim to ownership of the Islands, but the case doesn’t hold up well to scrutiny.  At its narrowest point, the English Channel is only 20.6 miles wide – does that give France the right to ownership of the UK as well?

…And a dash of politics
In the 1960s, against a backdrop of emerging nation states across the world declaring independence from their old colonial masters, the United Nations called for universal decolonisation.  Argentina seized upon this mentality as an opportunity to further its claim on Las Malvinas, egged on by the US.  But if we backtrack a little, what is it that makes Argentina (and by extension, the US) any less of a colonial nation?  Modern day Argentina is a product of the Spanish colonists who settled in South America from the 16th century onwards, persecuting and marginalising many of the indigenous people already living there.  Nowadays, studies have found that some 56% of the Argentinian population have traces of indigenous DNA in their genetic makeup.  They came, they conquered, and they entwined their cultures and produced a new nation.  Shouldn’t this be the antidote to outdated claims to territorial entitlement?

The British government has been perfectly willing for the sake of political expediency to ignore the views of native populations in its past decolonisation efforts – see Hong Kong and Diego Garcia, for example.  But the opinion of the people who have made their homes on the Falkland Islands must be considered – the overwhelming majority of them want, as their cringe-worthy Union Flag referendum-day suits demonstrated, to remain British.  They are, as has often been said, more British than the British.

As much as I have a natural aversion to flag-waving Old Boy patriotism and conceited nostalgia for the good old days of the Empire, the Argentinian government has not made union with Argentina an attractive prospect for the Islanders.  If this changes, maybe the Falkland Islanders will begin to see the economic and trade benefits of a union with their closest neighbour.  But, until then, their views must be respected.  Sean Penn was right – colonialism is an antiquated notion, and just as Britain has no right to claim territories for itself against the wishes of the native population, neither does Argentina.

Monday 11 August 2014

Is organic food really worth the extra cost?

When the recession hit, one of the sectors hit hardest was the organic food market. It was a luxury that people could ill-afford when they were struggling to pay the bills. However, figures published in February this year indicate that people are buying organic again, prompted in part by the horsemeat scandal in 2013. The recent revelations about Halal meat are likely to have a similar impact. Four out of five households now buy at least some organic food and organic baby food “makes up more than 54% of all baby-food purchases.”

Why do people buy organic? The practical reasons generally fall into three categories – a wish to eat food grown without pesticides; a desire to limit the damage that conventional farming methods inflict on the environment; and concern for animal welfare. Less practically, there is the feeling that organic produce is somehow more virtuous, representative of a more wholesome way of life that you can buy into by picking up an organic yogurt in your local supermarket. I buy into this myself, with my organic veg box delivered every week and organic joints of meat (when I can afford it).

Having grown up on the organic vegetables grown by my parents and eggs from the chickens that scratched around the bottom of our garden, I imagined that all organic produce came from a similar source – small family farms, where the farmers toil happily in the sunshine and the animals skip freely through the fields – but my eyes were opened when I read The Omnivore’s Dilemma by Michael Pollan.

In his examination of this dilemma (what do you eat when you can eat anything?), Pollan investigates the food industry in the US, tracing a selection of meals back to their source. The organic farms he visits are a far cry from the idyllic image in my head: vegetables are grown on an industrial scale, sometimes by the same agribusinesses that dominate the non-organic food market, and animals may be kept in pesticide-free environments, but they are hardly roaming free. The book made me wonder how the UK’s organic standards compare and whether I am entitled to feel quite so self-satisfied about my organic purchases.

The good news, I was pleased to discover, is that standards in the UK are stricter. The criteria are laid down in European Union law and approved by individual countries. DEFRA have this responsibility in the UK, and for farms and food producers to be approved as organic they must be certified by an officially sanctioned body, the largest and most recognised of which in the UK is the Soil Association. The Soil Association guidance lists the myriad standards with which organic producers must comply, but on closer inspection many of them turn out to be just that – guidance on best practice, and not regulations at all.

Intuitively it might appear logical that food grown without the several hundred chemical pesticides permitted in conventional farming by EU law would be better for your health. In August 2013, it was suggested that up to 98% of some fruit and vegetables sold in UK supermarkets contained traces of pesticides. However, researchers from Stanford University published a literature review in 2012 which concluded that organic food was no better for our health, a story which was seized upon with glee by the Daily Mail. And though the findings appeared to back up the conclusions of a study commissioned by the UK Food Standards Agency in 2010, the Soil Association pointed out that the “US study, of limited application in Europe, found organic food helps people avoid pesticides in their food” and “recognised that organic milk has significantly higher levels of beneficial nutrients.” A study published in March this year by Cancer Research UK concluded that organic food did not lower overall cancer risk, though the figures are hugely generalised from a small sample and the conclusions seem to ignore the finding of a “21% reduction in the risk for non-Hodgkin lymphoma in women who mostly ate organic food.” Such studies also ignore the fact that as long as organic farms border conventional farms, there will continue to be pesticides in the soil and water supplies that feed organic produce. The water industry spends huge amounts of public money every year on removing pesticides that have run off farmland from our drinking water.

There are also less obvious health benefits to buying and eating organic. Although the UK does not use antibiotics in farming in the quantities that many other countries do (ranking 8th in the EU), there are concerns that their routine use in the feed of conventionally reared livestock – to combat the effects of close living quarters, often standing in their own waste, and unnatural diets – will help spread antibiotic-resistant bacteria to humans. These bacteria can spread through the water supply and through the carcasses of slaughtered animals entering the food chain. The antibiotics themselves can also enter the food supply, through eggs and dairy products as well as meat, and decrease the effectiveness of antibiotics in combatting human disease. Organic livestock are not given antibiotics routinely, only when they are ill.

The environmental benefits of buying organic are where the arguments become most contentious. Much is made by those in the anti-organic camp of the carbon footprint of organic farms. If an organic farm uses fossil-fuelled farm machinery then its carbon footprint won’t be all that different to that of a conventional farm. And though organic farms avoid the fossil fuels burned in producing the petro-chemical fertilisers used by non-organic farms, organic dairy cows, due to their natural grass diet, produce up to twice as much methane as non-organic dairy cows. Methane is a far more destructive greenhouse gas than carbon dioxide. There are some cases in which conventional methods will win out – an organic tomato grown under heated glass in the UK, for example, has a far larger carbon footprint than a non-organic tomato grown in Southern Spain and trucked to the UK – though we could negate this by reverting to eating our fruit and veg in season.

Whilst the UK does not have the thousand-acre swathes of cornfields that blanket the mid-western states of the US, conventional farming in the UK does encourage monocultures, which are damaging for the biodiversity of the countryside. Crop rotation employed by most organic farmers in place of artificial fertilisers is better for the overall quality of the soil, as it reduces soil erosion and encourages bacterial and insect life. Organic farms have been found to support “34% more plant, insect and animal species on average compared with conventional farms”.

When we think of organically reared animals, we tend to imagine a lifestyle similar to that enjoyed by my parents’ chickens. Whilst an organic chicken is not kept in battery conditions, up to ten broiler chickens can be kept in one square metre of space, which really isn’t much at all, and they can be kept in flocks of up to 1,000. Some conventional chicken farms keep up to 40,000 birds together, so 1,000 doesn’t sound so bad in comparison, but it’s still a far cry from the image of a small, happy band of chickens pecking around in the dirt. All cattle must be fed on a minimum of 60% forage and any concentrates must be organic, which is certainly a more natural diet. Organic dairy farmers must not sell bull calves into the veal trade, but there are currently no rules against the calves being killed at birth. The life of organic livestock is unquestionably better than that of their non-organic peers, but it’s perhaps not quite the idyll we all imagine.


So am I right to feel smug about buying organic? Perhaps, but not as much as the higher prices might suggest. Whilst there are benefits to our health, the environment and animal welfare, under current organic standards, they are minimal. On the other hand, organic suppliers do vary and it might be better to do a little research into the source of your organic food, rather than just throwing it into your supermarket trolley. There are many organic suppliers who are committed to using sustainable energy to grow their organic produce. At the end of the day, if you really want to know where your food comes from, the best solution is probably to grow it yourself. There is a plethora of brownfield sites with the potential to be used to grow organic food locally and sustainably and it’s been proven that the soil in our allotments and back gardens is the healthiest in the country. And whether or not you think that organic food is better for the environment, animals or your health, if nothing else, most of the research seems to have concluded that it tastes better, so that’s something.

Saturday 14 June 2014

What the World Cup means to me

I love football.  For all its bloated, grey-suited misogynist executives, its overpaid superstars and its moronic, ugly hooliganism, I love it.  I love to play it and to watch it, whether professionals on television or lumbering amateurs in the local park on a Sunday. My winter weekends are instantly brightened by a win for West Ham.  I buy shirts and scarves and mugs and hats.  I glue my ear to the fuzzy static of Five Live commentary and watch live scores from the Scottish League Two scroll up my television screen on a Saturday afternoon.

But what I really love is football's unifying power.  The rules aren't hard to learn (yes, even the offside one) and you don't need any expensive equipment - four jumpers and something vaguely round to kick about and you're all set.  Kids (and adults) all over the world can join in with a game - on parks, and beaches and dusty streets in every country on the planet, you can bet you'll find an impromptu football game.

And I really love the World Cup.  I love the vibrant palette of colours, of the flags and the kits.  I love the melting pot of different names, languages and anthems.  I love the fans – people from all over the world united by their passion for the beautiful game.  I love the drama - the heroes (and villains) that materialise.  I love making miniature flags of the participating nations (yes, I know).  I love all the back-stories, the politics and the history, the little emerging nations, and the journeys they’ve been on to find eleven men lining up on an acre of grass to run their legs off for their country.

This year, the team that’s caught my heart and my imagination, unsurprisingly, is Bosnia-Herzegovina.  Playing in their first World Cup, the Bosnian football team is the only national institution that can say it truly embodies the Balkan country’s multicultural heritage, which was so viciously torn apart by the war of the early 1990s.  Whilst many Bosnian Croats will be supporting Croatia and the majority of Bosnian Serbs couldn't care less about the World Cup since Serbia didn't qualify, on the pitch Bosnian Croats, Serbs and Bosniaks will play side by side.  The team are of a generation that grew up in the shadow of the war; many left the country and spent their childhood as refugees abroad, but others lived through the horror of the conflict.  Surely the ultimate fairytale is that of Manchester City’s Edin Džeko, who spent the war as a child under siege in Sarajevo, and to whom the fans will be looking to set their World Cup campaign alight.  More recently the country, which has spent the last twenty years trying to rebuild itself, was devastated by flooding at the end of May.  Success at the World Cup would really give Bosnians something to smile about.  They've been drawn in a fairly open group, and have a realistic chance of coming second to Argentina, who they play in their opening match tomorrow.  I’ve got my flag and my shirt all ready.

But what I can’t stand is the England team.  I mean, I’m as English as they come.  I’ve searched extensively through my family tree and I cannot find a single drop of non-English blood in me.  (There was a time when I thought my Scottish surname might lead me somewhere, but it appears that we are of the West Ham Wilson Clan.)  Everything about the English team turns me off – we have a dreary anthem, a boring kit, thuggish fans and whiny, over-hyped players.  I can’t hear an English player or pundit on the radio without wanting to throw something at it.  I can understand why my non-football-loving friends and family would be sick of hearing about the World Cup.

I've pondered at some length whether there is something more deep-rooted beneath my superficial dislike of the England team.  Is it my uncomfortable leftie embarrassment at our colonial past?  Or my middle-class distaste at the over-enthusiasm of English fans, fuelled by my peculiarly British aversion to boastfulness?  My love of an underdog?  My association of the English flag with the far-right racism of the EDL?  The connotations of rampant nationalism that come with overt patriotism?

I think this year I’ve finally managed to pin down the answer.  It’s the sense of entitlement that the England football team carry with them.  The sense that they should be winning, because we invented the sport and because our players play for some of the best clubs in the world.  And we’ve had that entitlement for centuries.  We’ve been the best and the greatest and done pretty well out of it.  Half the players and managers giving interviews after their matches will be speaking our language.  And now, maybe it's someone else's turn.

So, for me, the World Cup is about sitting back, enjoying the spectacle and rooting for other countries; countries for whom winning even one game would mean so much more, in terms of healing, and pride, and giving their people hope.  And that’s why, for the next few weeks, I’ll be wearing blue.

Saturday 12 April 2014

"A Visionary Piece of Legislation": Reviewing the Mental Capacity Act

The House of Lords published their review of the 2005 Mental Capacity Act (MCA) a few weeks ago.  You’d be forgiven for not noticing – it slipped from the headlines almost immediately.  Much like the Act itself, the select committee findings barely penetrated the public consciousness.  The reporting about them was sloppy and sensationalist in turn, lumping the MCA together with the unwieldy and unworkable Deprivation of Liberty Safeguards, which were shoehorned into the Act in 2009.  The generic “mental health laws” that the news articles referred to were, in fact, not generic at all, but very specific.  As the Lords report stated, the Mental Capacity Act was a “visionary piece of legislation”, designed to empower and protect those who lack capacity, and which has been misused due to misunderstanding and a simple lack of awareness.

Craiglockhart Hospital, where new treatments
for shell-shocked soldiers were explored
The history of mental health legislation in the UK dates back to the Act for Regulating Madhouses of 1774, which first formalised the treatment of the mentally ill in society, who were at the time locked in madhouses in appalling conditions.  As legislation stuttered through the nineteenth century, ‘lunatics’ became ‘patients’, the madhouses were replaced by asylums and laws were introduced to regulate conditions in them.  These developments continued into the twentieth century, inching towards a more progressive view of those suffering from mental illness.  One of the major breakthroughs, in law at least, was the 1930 Mental Treatment Act, which permitted voluntary admissions to mental hospitals.  Advances in psychiatric treatment, accelerated by innovative treatment of shell-shocked soldiers during and after WWI, allowed for more patients to be treated and discharged back into the community.  But whilst legislation and treatment were changing, society’s opinion of those with ‘mental deficiencies’ was slower to catch up.

The integration of mental hospitals into the National Health Service in 1948 was critical in instigating a move away from institutionalisation in the 1950s; a move which was finally enshrined in the Mental Health Act of 1959.  The 1959 Act sought to bring treatment of mental health in line with treatment of physical illnesses.  It also removed “promiscuity or other immoral behaviour” as grounds for detention and encouraged the development of care in the community for those suffering from mental disorders.  The 1959 Mental Health Act was “heralded as a great piece of liberalising legislation”, but sadly came to be overshadowed by concerns about “failures of services and abuses of professional power.”

These concerns, fuelled also by changing public perceptions of mental health, prompted the 1983 Mental Health Act.  The 1983 Act put in place stronger legal controls around medical treatments of mental illness and detention of the mentally ill, and introduced a Mental Health Act Commission to oversee the implementation of the Act in practice (now subsumed into the Care Quality Commission).  Notwithstanding some extra sections inserted in 2007, the 1983 Mental Health Act remains in place today and provides the statutory framework for providing care to formal mental health patients.

But there was a gap.  There were no provisions in the Mental Health Act for those patients who had been informally admitted.  The 2005 Mental Capacity Act plugged this hole.  The MCA provided a statutory framework for safeguarding vulnerable people who lack the capacity to make certain decisions about their life.

The ground-breaking core principle of the MCA is that capacity is to be assumed unless proven otherwise.  The framework of the Act should be used to support individuals to make their own decisions wherever possible.  It recognises the right of the individual to make unwise decisions.  I’ve made plenty of unwise decisions in my time (see, for example, the scars on my right palm from taking a metal pan out of the oven with my bare hands), but no one is questioning my capacity to make the important decisions in my life, such as where I live or what I do with my money, just as we shouldn’t immediately conclude that someone with dementia or a learning disability is unable to make these decisions about their lives.  The MCA makes it clear that capacity is decision-specific.

The ‘best interests’ process set out in the Act ensures that a person who is found to lack capacity around a certain decision has their wishes and feelings taken into account, by involving those who care about and for them.  The Act “signified a step change in the legal rights afforded to those who may lack capacity, with the potential to transform the lives of many.”

In theory, the MCA is a wonderful piece of legislation.  The majority of professionals who gave evidence to the House of Lords select committee spoke highly of it.  However, the Act has not been implemented in the way it was intended to be.  It is not entrenched into the practice of health and social care professionals, whose prevailing cultures of “paternalism (in health) and risk-aversion (in social care) have prevented the Act from becoming widely known or embedded.” Further still, beyond these sectors the Act and its empowering ethos are scarcely recognised at all.  It has far-reaching implications and the Lords report argues that its uses and application should be more widely publicised to both service users and their families, as well as amongst other professional sectors, such as banking and policing.

The central recommendation of the Lords report is the creation of an independent body to oversee the implementation of the Mental Capacity Act.  This body should also have responsibility for publicising the Act and raising awareness of it outside of health and social care.  The report also recommends scrapping the Deprivation of Liberty Safeguards (DOLS) completely and starting again, which in the light of the recent Supreme Court judgement, which has made them an even more bureaucratic box-ticking exercise that does little to truly protect the people it claims to, is urgently required.  However, DOLS aside, the Mental Capacity Act is an incredibly powerful and positive piece of legislation. Spread the word - the more people who know about it and put it into practice, the better life will be for the people it is there to help.

See: the Mental Capacity Act (2005): http://www.legislation.gov.uk/ukpga/2005/9/contents

Thursday 13 February 2014

Dayton's Discontents: Life After Peace in Bosnia

More than any other European nation, Bosnia has struggled to shake its association with war in the eyes of the western world.  Tell someone you’re off on a trip there and more often than not they’ll ask, with a look of surprise, “is it safe?”  Until last week, apart from the landmines scattered across Bosnia’s beautiful countryside, the only signs of the war that tore the country apart in the 1990s were bullet and shell marks peppering buildings in many towns and cities.  To the casual traveller, passing through, Bosnia is a country of culture, history, good food and stunning scenery and Sarajevo easily holds its own against any of the European capitals as a vibrant, cosmopolitan city.  But scratch a little deeper and you’ll find the divisions and disillusions that have flared into violent protests over the last few days.


It started in the north-eastern town of Tuzla last Wednesday.  Unemployed workers took to the streets to protest about their poor pensions and unpaid benefits in a town that was once an industrial hub.  By the weekend the protests had spread to Sarajevo, Zenica and Mostar, where government buildings were set on fire and protesters clashed with the police.  Bosnians are taking to the streets to object to the corruption, nepotism and petty squabbling of their politicians, who they hold responsible for Bosnia’s economic stagnation.

Official statistics put unemployment in Bosnia at 27.5%, but it is believed that up to 44% of Bosnians are out of work.  Youth unemployment is estimated at around 60%.  The Bosnian economy has never fully recovered from the war.  Privatisations of state firms have been rife with crooked deals.  Added to the fact that most of these companies had already racked up massive debts during the collapse of the Yugoslav economy in the 1980s is the sad truth that Bosnia no longer has any industry to speak of and no service economy to fall back on.

The war in Bosnia ended nineteen years ago, in December 1995, with the signing of the Dayton Peace Accords.  They left the country divided between the semi-autonomous Republika Srpska (RS) and the Bosnian-Croat Federation.  The government of Bosnia-Herzegovina is headed by a three member presidency - a Bosniak and a Croat from the Federation and a Serb from the RS – with the Chair rotating every eight months.  Though the idea of electing politicians based on ethnicity is troubling, in reality Bosnia’s politicians are distant, bloated elites barely distinguishable from one another, regardless of their nominative party.  Below the national government are layers of local government, divided into cantons, which in turn are subdivided into municipalities.  This multi-layered and unwieldy system leads to frequent political deadlock, such as that seen last summer over the failure to pass a law on issuing ID numbers.  The stalemate, which resulted in the death of a young baby unable to leave the country for vital medical treatment, sparked protests in Sarajevo, which died down after only a few days.  But the embers have been glowing ever since, and last week were ignited once more.

The aim of the Dayton Peace Accords was to end the fighting in a way that would be acceptable to all sides.  In many ways it is an impressive piece of diplomacy, creating a structure within which Bosnia could begin its recovery from the conflict free from inter-ethnic power struggles.  But, nearly twenty years on, Dayton has its critics.  Eric Gordy believes that Dayton created a “permanent, parasitic political class.”  Dayton also left Bosnia, like a naughty schoolchild, under international guardianship, in the shape of the Office of the High Representative, an international envoy who is unaccountable to the Bosnian people.  Others have argued that Dayton left Bosnia’s political situation unstable and divided along ethnic lines.

Prior to Dayton, the most serious attempt at a peace deal was the Vance-Owen Peace Plan (VOPP), put forward in 1993, which proposed the division of Bosnia into ten semi-autonomous cantons.  The VOPP was rejected by the three warring parties, who could not agree on the divisions.  By August 1995, when an intense bombing campaign by NATO finally forced the Bosnian Serb Army (BSA) to surrender, the situation on the ground had changed sufficiently to allow the parties to sign a peace deal without losing face.  The UN “safe areas” in Eastern Bosnia had fallen to the BSA, eliminating the troublesome Bosniak pockets and allowing virtual ethnic homogeneity in the east of the country, and the Bosniaks and Bosnian Croats had formed an uneasy alliance against the Bosnian Serbs.  Under these circumstances, peace was achieved and Bosnia, which had dominated news bulletins for three years, slipped from the public consciousness.

 “Look at us now, my God! Look at Vucko and cry.”
The world had forgotten about Bosnia.  It took three days for western news outlets to pick up on last week’s protests, and they have now disappeared once more from the headlines.  It would seem that as long as the Bosnians aren’t killing each other, the world doesn’t want to know.  In a somewhat jarring juxtaposition with the protests, British gold medal-winning ice dancers Jayne Torvill and Christopher Dean returned to Sarajevo today to celebrate the thirtieth anniversary of their success in the 1984 Winter Olympics.  It seems it is easier to remember the Bolero than to remember the plight of the Bosnian people.

There have been calls for both further engagement by the international community and Bosnian self-determination in equal measure.  Peace deals imposed by foreign diplomats which leave nations crippled, humiliated and disgruntled can be dangerous.  Yet Dayton cannot take all the blame and it is important to note that the protests are targeted against internal politicians, not the international community.  Discontent has been brewing in Bosnia for some time, held back only by fear of descent into the violence of the 1990s.  Bosnians need to take ownership of the problems in their country.  The protests, which have resulted in the resignation of the leaders of a number of cantons, are a start.  Far better than accepting the terms imposed by outsiders is initiating internal reforms; by the people, for the people.

Whilst some believe that the protests have united Bosnians from all ethnic backgrounds, the protests have yet to have much of an impact in the RS.  Some Bosnian Serbs are fearful that the movement will undermine the autonomy of the RS, which is not as dysfunctional as the Federation.  Although it seems impossible to talk of Bosnia without mentioning ethnic divides, there is a danger that the ethno-nationalists will try to hijack the growing national discourse for their own ends and steer it off course.  If real change is to be effected in Bosnia, this cannot be allowed to happen.


"Courts and police are protecting the gangs in power"
The protests need direction if they are to achieve anything.  The frustration is understandable, but mindless vandalism is not the answer.  There are signs that the protests are starting to organise, with a series of plenums being held across the country.  The ruling political class cannot be expected to tear down a system from which they reap such great financial rewards.  What the protesters need is someone to direct the anger and despair; leaders who are not part of the status quo, who can translate their frustrations into the reforms that are needed to stimulate the economy.  The early elections that have been called for will be pointless until someone emerges to take on this mantle.  When they do, Bosnia’s Spring may finally bloom.

Saturday 18 January 2014

Marrying Mr Darcy: An Act of Feminism?

The conclusion of Jane Austen’s best-loved novel, Pride and Prejudice, reads like the plot of an unimaginative fairy tale – girl of modest means falls in love with and marries a rich and handsome man and lives happily ever after.  By today’s standards Elizabeth Bennet’s eventual achievement of finding herself a husband is not much of a triumph, yet she remains one of the most enduringly popular characters in literature.  Her story has captivated me many times over.  Is this because secretly all I long for is financial security as the wife of my very own Mr Darcy?  Or does Elizabeth hold a deeper attraction, as a character?  Is there any reasonable justification for my affection for this genteel eighteenth-century heroine?

The first thing that we notice about Elizabeth is her wit – her “impertinence” as she calls it; or the “liveliness of [her] mind”, as Mr Darcy prefers.  Like many, I sat down to read P.D. James’ Death Comes to Pemberley with glee – legitimate fanfiction, by a reputable author!  But the book left me disappointed and it took me a while to realise why.  It was because James had reduced Elizabeth to a monochrome, docile wife; she has none of the sparkling wit that so endears her to us in the original novel.  Absent was her “lively, sportive manner” of talking to her husband described in Pride and Prejudice’s closing chapter.  It is a testament to the timelessness of Austen’s writing that the vast majority of the dialogue in Andrew Davies’ 1995 BBC adaptation is lifted straight from the book and still has the power to make us laugh.

Elizabeth is always laughing.  She herself declares that she “dearly love[s] a laugh” but qualifies this with the addition that she hopes not to “ridicule what is wise and good”, only “follies and nonsense, whims and inconsistencies.”  This distinction sets her laughter apart from that of her younger sisters, Kitty and Lydia, and of her mother.  Lydia is often described as laughing; after eloping with Wickham, she writes that she can “hardly help laughing myself at your surprise tomorrow morning, as soon as I am missed.”  Lydia has no sense of when laughing is appropriate, whereas Elizabeth has a keen awareness of this.

That Elizabeth flouts social convention and expectations of women in some ways (“jumping over stiles and springing over puddles” on her way to see her sister Jane when she is taken ill, rambling in solitude in the parkland at Rosings) is tempered by her acute consciousness of decorum.  In response to her mother’s loud and vulgar pontificating on the subject of Jane and Bingley’s marriage at the Netherfield Ball, Elizabeth “blushed and blushed again with shame and vexation.”  Austen uses Elizabeth’s blushes throughout the novel to indicate a “social awareness that others lack.”[1]  Elizabeth knows when and where to appropriately direct her mischievous conversation.  Darcy, as evidenced by his conversations with Miss Bingley, though more reserved, is also capable of this.  It is this, alongside Eliza Bennet’s “fine eyes”, that first attracts him to her.  Though he believes her manners are “not those of the fashionable world” he is “caught by their easy playfulness.”  He, like us, does not care for the simpering pretensions of Miss Bingley, preferring instead intelligent exchanges with Elizabeth.

With her great eye for the “minutiae”[2], Austen fills her novels with gently mocking observations of the ridiculousness present in the society of her time, and through Elizabeth we hear Austen’s own voice.  Most of the characters she mocks are so pompous and oblivious that they do not realise it, and so we share in the private joke with Elizabeth and Austen.  It is this balance of disregard for the good opinions of those she does not respect with her recognition of the impropriety of her family’s behaviour that identifies Elizabeth as the rational voice guiding us through the follies of Regency society.

Elizabeth is not without faults, which perhaps accounts for why we love her more than Austen’s other heroines.  She is not the meek Fanny Price of Mansfield Park nor the unwaveringly sensible Elinor Dashwood of Sense and Sensibility, nor is she the sometimes irritatingly delusional eponymous Emma Woodhouse.  She certainly has her flaws – her hasty prejudices, to name one – but retains an intrinsic goodness and an all-important self-awareness, which is familiar to the modern reader.  Though she is quick to judge Darcy based on Wickham’s lies about their shared past, when she learns the truth she is happy to acknowledge her error.

There is one aspect of Elizabeth’s behaviour that divides those who debate her status as a feminist character.  Her determination to marry for love leads her to reject not one, but two proposals.  At first glance this seems a vindication of Elizabeth’s position as a thoroughly modern woman – who amongst us could contemplate the unendurable agony of a marriage to the absurd Mr Collins?  Should we not celebrate her courage in turning down Mr Darcy, with all his wealth, because she does not like him?  Mr Darcy, as would be expected of a man of his social standing at the time, is certain of her acceptance, despite the insulting pride and arrogance of his proposal.  Elizabeth’s refusal is so forceful and so unusual for the time that we cannot help but cheer her on.

We join Elizabeth in her shock at her friend Charlotte Lucas’ acceptance of Mr Collins' offer of marriage; in her “distressing conviction that it was impossible for that friend to be tolerably happy in the lot she had chosen.”  We see later that Charlotte is embarrassed by the gaucheness of her husband, but she defends herself: “I am not a romantic you know.  I never was.  I ask only a comfortable home.”  Marrying well was one of the only options available to women at this time and Charlotte has submitted to pragmatic realism.  And thus we start to question whether Elizabeth is, in fact, acting selfishly in turning down two men who have the means to provide financial security for her and her sisters, whose fortunes are entailed away from them on their father’s death.

Lyme Park, standing in for Pemberley in the 1995 BBC version of Pride and Prejudice
In Georgian England, these decisions were not merely a question of love versus money.  This “crudely reduce[s] the intricacies of human choice” for “surely the strategic and emotional are blended in all of us?”[3]  Indeed, we can never be sure that Elizabeth is not a little serious when she tells Jane that her love for Mr Darcy dates from the moment she “first saw his beautiful grounds at Pemberley.”  It does appear that Elizabeth’s feelings have already begun to change on arrival at Pemberley, before Mr Darcy has made his unexpected entrance.  In the eighteenth century, the “notion that the way a man landscaped his grounds might give some indication of his moral and mental qualities” was not uncommon[4], and we may reasonably conclude that Elizabeth’s admiration of Pemberley’s extensive woods and natural beauty is due to her appreciation of Mr Darcy’s tastes, and not to her desire for his fortune.  Nevertheless she does muse, apparently with some regret, “of all this I might have been mistress.”

Pride and Prejudice is set at the end of the eighteenth century, and its characters would doubtless have been aware of Mary Wollstonecraft’s A Vindication of the Rights of Woman, published in 1792.  Against this backdrop, we may judge Elizabeth harshly for succumbing to that which society expected of her and settling for life as a gentleman’s wife.  On the other hand, this is no marriage of convenience and we should not begrudge her a life of comfort, for it is not one chosen in haste.  In Georgian England, where marriages were primarily business arrangements, for the consolidation of assets and the procreation of heirs, and where women lived entirely subservient to their husbands, hers, we are confident, will be a marriage based on love and, most importantly, respect.  Safe in the knowledge that Mr Darcy so admires the qualities in Lizzy that we do ourselves - so much so that he has been inspired to change his opinions and his manners - we are able to anticipate her future happiness.

Is Elizabeth Bennet a feminist, a realist, or simply a romantic?  It may well be that she is a mixture of all three in equal measure, set apart from her contemporary literary heroines by her passion, her flaws, her complexities and, ultimately, her humanity.  She is as real a woman to today’s readers as she was to her 1813 audience and will unquestionably remain so for many centuries to come.

[1] John Mullan, What Matters in Jane Austen? Twenty Crucial Puzzles Solved (London: Bloomsbury, 2012), p.260.
[2] Ibid., p.5.
[3] Amanda Vickery, The Gentleman’s Daughter: Women’s Lives in Georgian England (New Haven: Yale University Press, 1998), p.44.
[4] Tony Tanner, notes to the Penguin edition of Pride and Prejudice (1972), p.397.