By Jane Charlesworth, University of Oxford.
I am a biologist. I also love computers. Sadly, I have many
colleagues who find programming scary, and I used to be one of those
people. Here is how I learned to stop worrying and embrace programming
as a tool for my research.
All my tech skills are self-taught, or picked up from eavesdropping
on computer scientist friends in the pub. Most of my colleagues are
either in the same boat or reluctant to spend their research time
learning to write reproducible code. Even those of us who are eager to
learn don't know where to begin. I think this represents a crisis in
biology education.
I did a degree at a top UK university about ten years ago. A good
95% of my degree consisted of training to work in a wet lab, doing
molecular biology. I have never worked in a wet lab. Granted, when I did
my degree, the first human genome was still unpublished, but I find it
impossible to believe that our course organisers could not have foreseen
that sequencing was a rapidly maturing technology and that the students
they were training would need to be equipped with the skills to deal with
the resulting deluge of data.
I recently looked up the course content of my degree, ten years on,
expecting things to have improved. The course descriptions were
virtually identical, despite the fact that computational biology and
bioinformatics are some of the fastest growing areas of biological
research. At university, our sole bioinformatics practical session
consisted of doing a single search of an online database (BLAST).
It didn't take much imagination to ask "is there a way of automating
this task, if you want to do it for a whole bunch of genes?" The answer
was: get a student to do it.
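(For the curious: automating that task really is only a few lines of
code these days. Below is a minimal sketch in Python using Biopython's
NCBIWWW.qblast to submit each gene in turn to NCBI's online BLAST
service; the gene names, sequences, and output filenames are invented
for illustration.)

    # Minimal sketch: looping BLAST searches over several genes with Biopython.
    from Bio.Blast import NCBIWWW

    # Hypothetical query sequences, one per gene of interest.
    genes = {
        "geneA": "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG",
        "geneB": "ATGAAAGCATTGAACGGTTCCTGA",
    }

    for name, sequence in genes.items():
        # Submit the sequence to NCBI's online BLAST service (blastn vs. nt).
        result_handle = NCBIWWW.qblast("blastn", "nt", sequence)
        # Save the raw XML so it can be parsed later with Bio.Blast.NCBIXML.
        with open(name + "_blast.xml", "w") as out:
            out.write(result_handle.read())
        result_handle.close()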
The idea that I might want to learn to program was never mentioned
until I graduated and started a PhD, where I was expected to arrive
proficient in skills that hadn't been part of my undergraduate training.
This expectation knocked my confidence and turned programming into a
big, scary thing in my mind. After spending more time than I'd like to
admit panicking, I dived into a pile of books, Googling at least every
other word.
I still spend the bare minimum of my work day writing a
functioning script, and spend my evenings generalising and refining it
(I've grown to learn that programming can be satisfying and
frustrating in equal measure). Learning tech skills on top of
full-time work as a research scientist is daunting, and it's hardly
surprising that people find it difficult to find the time to develop
their skills beyond the most rudimentary scripting.
Nevertheless, investing the time to learn some tech skills is
invaluable as a biologist. Every single thing I've learned about
software engineering has made my research massively easier, usually by
forcing me to organise myself. Tools like version control can be very
useful even for lab scientists, allowing people to collaborate on papers
or protocols in a systematic way. What researcher would say no to half a
day's training if that half-day meant saying goodbye to folders of
files with cryptic suffixes like "newest" or "latest"?
I can't think of a good reason why these tools aren't being taught,
and I hear the same complaint from colleagues over and over again.
Organisers of undergraduate biology courses need to stop believing that
biology students are scared of tech and maths and start encouraging
their students to embrace these subjects if their aim is to train the
next generation of researchers.