Archive for the ‘society’ category

The fear of genetics

August 20, 2015

Ten years ago or so I had a meeting with members of the local community living in the vicinity of the university I was then leading, Dublin City University. They had asked for the meeting to express their concerns about the development of the university’s National Institute of Cellular Biotechnology. More particularly, they were concerned, as one gentleman expressed it, that we were up to ‘Frankenstein kind of things’. I guess he was thinking of Dolly the sheep, and was wondering whether we might take that a few steps further in our newly funded institute. I explained to him that what my colleagues were working on was diabetes and cancer. My visitors were somewhat reassured, but a small group remarked to me, as they were leaving, that GMOs (genetically modified organisms) were undoubtedly evil. Why, I asked. Because everyone knows they are, they replied.

A little later, in 2008, the then newly installed Irish coalition wrote into its programme for government that Ireland would be a ‘GMO-free zone’. I was appalled by this, as I felt it would send a signal that Ireland was unwilling to engage in scientific innovation in some of the areas where it was most needed and held the most promise; the Irish Times published a piece setting out my views. All of this has now come back to me as the Scottish Rural Affairs Secretary, Richard Lochhead MSP, has announced a ban on growing genetically modified crops in Scotland. The decision has been criticised by a number of research organisations and universities and has been the subject of some media discussion. The Minister has assured his critics that there will be no ban on research carried out under controlled conditions, but the reality is probably that those seeking to do, and to fund, such research will not choose a location where the process is seen to be contrary to public policy. Innovation will go elsewhere.

In both nutrition and the life sciences, scientific innovation has tended over the last decade or two to focus on genetics. This isn’t altogether new. Insulin, with which diabetes has been treated since 1922, has for decades been manufactured using genetically modified bacteria. A good deal of medical research has moved, over recent years, from chemical synthesis to biopharmaceutical remedies, and this trend is accelerating. The capacity to feed the world as the population continues to grow may come to depend on GMO research.

For those who are not expert in this field, the available literature – or often, the propaganda – on both sides of the GMO argument is unhelpful, because both sides use ‘evidence’ that is not easily verifiable by the rest of us. But there are few signs of ‘Frankenstein kind of things’ damaging us or our environment. In any case, we need to continue to do research, and we should not conduct it in a climate of general suspicion that is not visibly evidence-based. The idea that innovation should exclude genetics is a dangerous one.

Scientific discovery and technological innovation have their risks and need ethical oversight, but we must also remember that they have done more than anything else in human history to make possible the feeding of the hungry, the healing of the sick, and the combating of poverty. We should not abandon that lightly. By all means let us make sure that new experiments with GMOs are properly controlled and subject to appropriate safety checks. But let us not start with the assumption, without any proper evidence, that this is a form of innovation to be opposed.

I strongly hope the Scottish government will reconsider its decision on this issue.

From sensitivity into intellectual vacuity

August 17, 2015

Back in the early 1990s, a British trade union developed quite a reputation for right-on radicalism. One of its innovations was a ‘speech monitor’ at its annual conference, whose task was to follow every speech as it was delivered and to identify terms and expressions deemed offensive to anyone with a progressive radical agenda; when he heard any such terms or expressions, his job was (literally) to pull the plug on the speech, switching off the microphone and forcing the speaker into an embarrassing return to their seat, and perhaps longer-term ignominy.

Furthermore, this particular power was used liberally. At one point, while I was following one of the speeches (then being televised), the speaker used the word ‘denigrate’, and before he could finish his sentence the microphone was off and he was in disgrace. He had used a word that connected ‘black’ (niger in Latin) with something negative. There was something excitingly bizarre about this, and I confess I kept watching solely in the hope of seeing one or two more of these displays of Orwellian censorship.

It is sometimes suggested that this kind of over-sensitivity has reached university campuses. In an article in The Atlantic, Greg Lukianoff and Jonathan Haidt set out a fairly disturbing picture of American universities being subjected to increasing pressure not to let anyone say anything that could possibly offend or disturb someone of a very thin-skinned disposition. Examples given include pressure not to teach rape law in a law school, or not to make English literature students read The Great Gatsby (because it ‘portrays misogyny and physical abuse’). With this comes a concept that was (to me at least) new: the ‘microaggression’, described as ‘small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless’ – such as asking someone from an ethnic minority where they were born.

It is of course right that universities seek, to the greatest extent possible, to create safe spaces for those who work or study in them. But this should not mean encouraging people to make a great effort to find things offensive. Universities need to prepare students for the world, and it is a world in which they cannot be protected from such stuff at all times. Furthermore, the university must maintain a culture of curiosity and inquiry, which should not be restricted just because in some contexts not everything is completely lovely. As Lukianoff and Haidt point out, if this approach is abandoned it will damage students both intellectually and mentally.

Respect and sensitivity must be part of any university’s framework of values. But at the same time, universities are there to challenge and stimulate. This task becomes impossible if every innocuous statement has to be examined again and again before it is made, in case somebody might unexpectedly contrive to be offended by it. The academy’s educational mission must stay on the right side of intellectual vacuity.

Talking points: Keeping watch

August 8, 2015

Is the Apple Watch a major success, or has the company made a mistake? Those assessing this particular product don’t seem to be able to make up their minds, or to agree. Recent reports suggest that Apple may have got it right again. If so, it is ironic that Apple may be about to revive the fortunes of an accessory – the watch – that its other products had been busily killing off. A group of students told me recently that they would not wear watches because their iPhones told them the time; watches were superfluous and awkward.

But of course the Apple Watch is more than a chronograph. It puts a number of elements of my smartphone onto my wrist, and it monitors my lifestyle and my health. The information it gathers can of course do more than amuse me; I suspect insurers would love to have it.

I have an Apple Watch, having been given it as a present. I like it. And I wonder what it tells us about times yet to come.

Is this for real?

July 27, 2015

One of the most memorable passages in Plato’s dialogues – the Allegory of the Cave (part of The Republic) – analyses how we can appear to perceive a reality that is not, in truth, real. The allegory describes prisoners chained in a cave for their entire lives; their heads are restrained so that they can see only the wall in front of them and nothing else. Their sole glimpse of others is through shadows cast on that wall as people walk in front of a fire burning behind the prisoners. Reality, as Plato has Socrates explain, does not consist of the shadows, and yet the prisoners may think otherwise because the shadows are all they have ever seen.

Fans of a certain genre of literature and film (The Matrix in particular, or perhaps eXistenZ – but there are many others) will of course immediately recognise an early insight into simulation. And Plato was articulating something that many of us will feel from time to time: how real is our reality, really? Is this world, indeed are we ourselves, just something that someone else has designed and in which we only imagine ourselves to be? If you are thinking this is a topic best left to a certain type of rather embarrassing nerd, you’d be wrong. Professor Nick Bostrom, a Swedish philosopher now working at the University of Oxford, presented the ‘simulation argument’ in 2003: it concludes that at least one of three propositions must be true, the most arresting being that we are almost certainly living in a computer-generated simulation.
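
For readers who want a flavour of the formal version, here is a rough sketch of the fraction at the core of the argument – my own paraphrase and notation, not a quotation from Bostrom’s paper. If f_p is the fraction of human-level civilisations that ever reach a ‘posthuman’ stage capable of running ancestor simulations, and N is the average number of simulated ancestor-populations such a civilisation chooses to create, then the fraction of all observers with experiences like ours who are simulated is roughly

\[
  f_{\mathrm{sim}} \;=\; \frac{f_{p}\, N}{f_{p}\, N + 1},
\]

which is close to 1 whenever the product f_p N is large. So either almost no civilisations get that far (or choose to run such simulations), or almost all observers like us live inside one.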

Whether we believe this or not – and the success of simulation depends on its subjects not recognising it – it does tell us something about the fragility of reality. And that is not a bad thing for universities to ponder.

Analogue tales

July 5, 2015

I was standing behind two teenagers waiting for a bus the other day, and one was telling the other about a get-together planned for that evening with some old school friends. ‘Wow’, said the other, ‘that’s so analogue Facebook’. I chuckled at the expression. But for now we can still laugh, because these two teenagers at least had a point of reference for distinguishing between a real-life meeting and social media interaction. They also understood that many things digital have, or had, an analogue antecedent.

[Photo: record]

But is the analogue world slipping away from us? Or is it more resilient than we sometimes think? After all, vinyl records are apparently making a comeback. And I have set my Apple Watch (and yes, of course I have one) to show an analogue clock face on its home screen. I still have (and use) a telephone on which I can really dial numbers.

[Photo: analog]

And in between reading stuff on my iPad, I still buy hard copy books.

[Photo: reading]

It’s not all gone.

[Photo: watch]

PS. However, all the above photos were taken with the iPhone 6 camera and edited with Photoshop. Hm.

Doing it in style?

June 30, 2015

Most academics get to where they are without receiving professional advice. By that I mean that they may have mentors, department heads, supervisors and other such helpful folk; but they will not tend to turn to a professional consultant when planning or developing their careers. But there are such people, and one of them is Karen Kelsky, who runs the website The Professor Is In. There she advises people on interview techniques, on writing skills, on preparing for retirement, and on other such matters.

She also offers advice on what to wear. In an article just published in the Chronicle of Higher Education, Kelsky makes suggestions on how to present yourself to greatest advantage at an academic interview. The article comes with photographs from what looks like a model shoot.

Am I sneering (as some academics might, I suspect)? Absolutely not. Kelsky remarks in her piece, with some understatement, that ‘academia doesn’t prioritise fashion’. It certainly doesn’t. And I’m not at all sure that this suggests integrity and seriousness of purpose, as some probably feel it does.

Some years ago I was at an academic conference, and found myself looking for a friend and colleague at the reception just before the main conference dinner. I couldn’t see my friend, but as I scanned the crowd it suddenly occurred to me that – how shall I put this – the majority of those present had not exactly made an effort to dress nicely for the event. The de rigueur uniform for the men was an open shirt – generally coloured in some shade of beige – and a pair of jeans, or corduroys for the very adventurous. Their hair was slightly too long, and generally hadn’t been washed in honour of the event. More of the women had made an effort, but in a fairly demure kind of way. And then suddenly the crowd parted, and in walked a visiting American female scholar, all easy charm, immaculate hair and make-up, in a designer dress. She moved among the academics, clearly charming both the men and the women. She talked earnestly but also with flashes of wit. So was this an interloper trivialising the whole intellectual thing? Or was this someone making effective use of what has been called ‘erotic capital’ (a term originally coined by Adam Isaiah Green of the University of Toronto in his 2008 article ‘The Social Organization of Desire’, and popularised by the British academic Catherine Hakim)?

The reality is that style is a form of communication. We are saying something when we dress, when we decorate our homes, buy our cars, choose our coffee shops or bars. We may not be saying whatever it is we want to disseminate in our academic mission, but we are creating a background that will sometimes make people more or less open to our message. The academy has, I suspect, never quite worked out whether it accepts the legitimacy of packaging of any sort. But then again, the person in rather worn clothes covered in chalk marks, with hair and beard out of control and leather elbow patches, also comes in a package; whether it is one that will help disseminate the message may be another matter.

My research says they’re out to get me

June 22, 2015

Here’s the kind of thing I really enjoy. According to an article in the Huffington Post, the Russians are demanding an international inquiry into the NASA moon landings, because as we all know these never happened and were merely staged for gullible western television audiences. We know that because the US flag planted by Neil Armstrong fluttered in a non-existent wind, there were clearly discernible studio lights, the ‘moon rock’ samples have disappeared: you get the idea.

It didn’t take the Russians to activate this particular conspiracy theory; it has been around for years. In fact, the number of such theories is impressive, and there is one to subvert every obvious historical fact you ever thought of. Napoleon was in fact a woman. The Second World War was just a staged show put up by international bankers. Aliens have landed all over the planet and various secret agencies have suppressed the news. Elvis never died (well, that one’s credible). Princess Diana was murdered. The CIA staged the 9/11 attacks. You probably have your own favourite.

But almost as resilient as the conspiracy theories are the theories about conspiracy theories. Earlier this year the University of Miami hosted a conference on the topic, with 36 presentations on various aspects of the phenomenon. The Leverhulme Trust and the University of Cambridge have conspired – oops, collaborated – on a project about conspiracy and democracy, with an eerily strange website. Conspiracy theorists are thought to process information differently from the rest of us, and the impact of their published suspicions can be significant: apparently most people believe in at least one conspiracy theory.

For myself, I find it really suspicious that the Miami conference was ‘not open to the public due to both space and catering considerations’. Really? Do they think we’re stupid?

