Nuts and bolts: gene regulation

At first glance, our DNA seems to use an alarmingly simple code. It can carry the instructions for how to make everything from a walnut, to a barracuda, to the thousands of microbes partying on your tongue, all from just four chemical bases.

This was what we learned in introductory biology: DNA is the instruction manual for life. But how can such dizzying complexity arise from different arrangements of adenine (A), cytosine (C), thymine (T), and guanine (G)? A full answer would take a library’s worth of textbooks, but in this post, I’d like to talk about one particular aspect that has fascinated me throughout my study of biology: gene regulation.

First, a quick recap of what genes do.


Check out more of Ian’s work at http://www.ianmilligan.com

When a gene is turned on, or active, we say that it is being expressed. Expression involves the DNA of a particular gene being transcribed into RNA, a “photocopy” of the gene that leaves the nucleus to direct protein assembly elsewhere in the cell. The “instructions” carried by DNA are, more precisely, instructions for how to make RNA, and the RNA is instructions for assembling proteins. So far, so good.

(As a sidenote, RNA also has a variety of other fascinating roles in the cell, but for now, I’ll focus on RNA’s job as the genetic messenger. In biology, we distinguish this RNA by calling it messenger RNA, or mRNA).

While RNA chemically resembles DNA, proteins come in an infinite array of shapes and sizes. But importantly, they all have one thing in common: they’re all made of strings of compounds called amino acids. Human cells have 20 of these amino acids to choose from, and make all their proteins by arranging them differently. Because of this common feature, DNA (and RNA) can code for the plethora of different proteins in the human body simply by specifying what order to place the amino acids in.
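If it helps to see that idea in code, here is a toy sketch of the decoding process in Python. Everything here is illustrative: the codon table is a tiny excerpt of the real 64-codon table, and real cells use ribosomes and tRNAs, not dictionaries.

```python
# Toy sketch of the DNA -> RNA -> protein decoding chain.
# Tiny excerpt of the real codon table (64 codons in total).
CODON_TABLE = {
    "AUG": "Met",  # also the "start" signal
    "UUU": "Phe",
    "GGC": "Gly",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def transcribe(dna):
    """DNA coding strand -> mRNA: swap thymine (T) for uracil (U)."""
    return dna.replace("T", "U")

def translate(mrna):
    """Read the mRNA three letters (one codon) at a time."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE.get(mrna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate(transcribe("ATGTTTGGCTAA")))  # ['Met', 'Phe', 'Gly']
```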

But, instructions aren’t quite enough to get the job done.


Proteins often work in teams to perform tasks. This is a simplified diagram of the proteins that interact to begin transcription, the process of “photocopying” DNA into RNA. (Image via nature.com: click for source.)

In life, timing is everything. Without mechanisms that allow genes to be turned on and off at precise times, it would be impossible to coordinate the reactions required to keep the cell running. Even though your cells can construct a vast number of different proteins, many do their work in complexes, cooperating with other proteins to perform tasks. These complexes can’t assemble properly if components are missing, in the same way that a car can’t run with only most of its parts. For this reason, the expression of each gene must be adjustable based on the cell’s needs at any given time.

Conversely, if every gene in our DNA sent out instructions to make protein all the time, it would also be disastrous. Losing the subtle changes in available proteins would make it impossible for us to think, breathe, react to stimuli, digest food, move our muscles, or do just about anything. There are even some proteins that cause the cell to commit suicide — on purpose!

I’m absolutely certain we’d die if this started happening; I just don’t know what would kill us fastest.

So how does the cell control when genes are turned on and off?

In the image above, you can start to appreciate that to begin transcription of a gene, a group of proteins must be able to physically bind to the DNA. Because this is all happening on the molecular level, chemical interactions are the overwhelming determinants of whether or not the right proteins can snuggle up to the gene.

Don’t worry if high school chemistry has long since left your realm of priorities — this is all about charge, and charge is pretty simple.


In this diagram, the blue blobs are histone complexes, with the DNA wrapped around them. You can see that the transcription complex (multicolored) can only access the unwound DNA. (Image via cancer.gov; click for source)

DNA accessibility has a lot to do with how the DNA is packaged. We have so much DNA in every cell that it needs to be wound up and scrunched together just to fit inside the nucleus, and the first level of this packaging occurs by wrapping the DNA around a class of proteins called histones. Not only does this reduce the space taken up by the DNA, but it also prevents other proteins (including those that start transcription to express the gene) from moving in and binding. When DNA is wound up like this, it’s turned off.

So how do we get at that DNA to turn it on? Thankfully, there are proteins for that, too. Histone-modifying enzymes can add or remove chemical groups from the histone proteins, and this completely changes how they interact with DNA. DNA carries a negative charge: if the outside of the histone is positively charged, the DNA will be held tightly to it. If that histone is modified and loses its positive charge, it releases the DNA, leaving it dangling like spaghetti in the wind, unwound and accessible. With the right proteins in place, the gene can now be expressed.

The DNA itself can also be modified, although the ways in which this can happen are more limited. If you modify DNA too much, you either break it completely, or prevent those important transcription proteins from getting at it.

By making these changes in a gene-specific way, cells can react to their environment, adapt, grow, and reproduce. It is this careful regulation that permits life to perform very complex chemical tasks in a controlled, logical way, deploying its protein arsenal in different combinations as needed. I’ve never lost my fascination for this level of biology; it’s as if we can look close enough to stop seeing the gooey growth of life, and start to see the gears turning.

Tasting Our Own Medicine

What would happen if 12 genetics students had the chance to sequence their own oncogenes, for free?

When I realized that I wanted to pursue medical genetics research, I implicitly accepted that I would need to take an ethics class. It seemed inevitable. After spending four and a half years completing an undergraduate biology degree speckled with in-class vignettes about genetic testing and Monsanto and fluorescent fish, and the public controversies they have created, I assumed I’d have to face those things head-on as part of my training.

I’m about to begin the second year of my graduate degree, and I now realize this won’t happen.

This is forgivable. My research in advanced prostate cancer involves molecular biology and genetics, and uses human cells, but I will never see the face of a patient. Also, the projects I’m involved with are far enough from actual drug development that any hard ethical decisions created by my work won’t appear until several years after I’ve graduated. But still, I feel like there is something missing. I can complete my degree without a background in ethics, but I don’t know if I will feel as fulfilled. At this point, I’m strongly considering auditing an ethics course or taking one online.

When ethical issues do arise, I can sense them. I understand the gut reaction people have when they hear “genetically modified”, or “designer babies”. And, while I understand the science behind issues rooted in genetic technology, the opinions I form sometimes feel thin. As much as anyone else, I often go by my gut. Without formal training, I worry that I’m missing expertise that would benefit me, as well as the public that I’m interested in communicating with.

This uneasy feeling has sat in the back of my mind for several years now, but it wasn’t until a month ago that I felt it stir. I received an email containing the schedule for a one-week, intensive laboratory course that I was about to take, focusing on next-generation sequencing (NGS). The email explained that we would be sequencing our own DNA, but focusing in particular on a “cancer panel” – a selection of genes known to be early indicators of cancer susceptibility.

This sounded like a terrible idea.

I want this post to be more about the ethics than the science, but it’s important to understand the science as well.

Genetic sequencing, using the most common sequencing machines, works something like this: DNA is extracted from the cells, and then chopped up into pieces by enzymes. On the end of each piece, we add a “tag”, which allows us to distinguish the side from which the sequencing starts. Through some very nifty chemistry that produces a different colour of light for each base (A, C, G, or T), the sequencing machine can read the code of each DNA fragment by keeping track of the light produced as the chemicals interact with each base, one by one.

Each piece of DNA is read multiple times by the machine. This part is key, because the fewer “reads” the machine collects of a given fragment, the less certain we can be that the sequence we get is real. Machines make mistakes, so we sequence the same piece over and over and over to ensure that we get the same code every time.
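For the code-inclined, here is a minimal illustration of why those repeats work (the reads and the error positions below are made up): if the machine's errors land at random, a simple majority vote at each position recovers the true sequence.

```python
from collections import Counter

# Five error-prone "reads" of the same 10-base fragment; reads 2 and 4
# each carry one made-up machine error.
reads = [
    "ACGTACGTAC",
    "ACGTACCTAC",  # error: position 6 reads C instead of G
    "ACGTACGTAC",
    "ACCTACGTAC",  # error: position 2 reads C instead of G
    "ACGTACGTAC",
]

# Majority vote at each position: random errors get outvoted.
consensus = "".join(
    Counter(column).most_common(1)[0][0] for column in zip(*reads)
)
print(consensus)  # ACGTACGTAC -- the errors disappear
```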

Until this year, the NGS students would sequence their entire genomes, but to very low “depth” (meaning that each DNA segment was only sequenced once or twice, on average). The quality of the sequences they received from the machine was consequently very low: they couldn’t be sure anything they saw was real. The exercise was meant to demonstrate how to prepare samples, and how the sequencer worked, and gave some “toy” data for them to analyze. For the purposes of learning, you don’t need high quality data, and when you’re sequencing the whole genome, high quality data comes with a hefty price tag.

This year, the trade-off was switched: we would only sequence certain genes (so the scope of the information was greatly reduced), and this allowed us to sequence to much greater depth, without added cost. Suddenly, the data we would receive were real. If any of us had a mutation, we would see it.
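The underlying trade-off is simple arithmetic. Here is a back-of-the-envelope sketch with round, illustrative numbers (not the course's actual figures): the same total sequencing output buys vastly more depth when the target region shrinks.

```python
# Same hypothetical sequencing budget, spent two different ways.
total_bases_sequenced = 3e9   # 3 billion bases off the machine

whole_genome = 3e9            # the human genome, ~3 billion bases
cancer_panel = 1.5e6          # a made-up ~1.5 Mb panel of selected genes

print(total_bases_sequenced / whole_genome)  # 1.0    -> ~1x depth: "toy" data
print(total_bases_sequenced / cancer_panel)  # 2000.0 -> ~2000x depth: real data
```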


Credit: Ian Milligan, Toronto graphic designer. Click the image to visit his website.

What if one of us found a mutation?

When we think about mutations, it’s easy to assume that anything you see will be bad. Thankfully, this isn’t the case. Many mutations are silent, which means that even though the code is altered, the protein that the gene codes for will not be changed. This means that each mutation needs to be characterized before we can predict its effect; for us, looking at the sequences of our cancer-related genes, it meant that it would be important not to panic if we saw a mutation. Context is everything.
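In code terms (reusing the toy codon-table idea from earlier, with a real but tiny slice of the table): a mutation is silent when the old codon and the new codon map to the same amino acid.

```python
# Silent vs. non-silent point mutations. GGU, GGC, and GGA all encode
# glycine, so a change among them leaves the protein untouched.
CODON_TABLE = {"GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "AGU": "Ser"}

def is_silent(original_codon, mutant_codon):
    return CODON_TABLE[original_codon] == CODON_TABLE[mutant_codon]

print(is_silent("GGU", "GGC"))  # True: the protein is unchanged
print(is_silent("GGU", "AGU"))  # False: glycine becomes serine
```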

As genetics students, we would have some idea of the relevance of the sequences we received. I think this is probably one of the ways our professor defended his lesson plan (it was mentioned that several researchers in the department objected to the project). When we provided consent to have our DNA sequenced, it was not only voluntary, but also very informed. Obtaining informed consent is very important in the context of, for example, clinical trials – you need to make sure the people participating understand the potential risks and benefits – and in the context of a classroom, we had every opportunity to inform ourselves on the subject before seeing our results. There was nothing stopping us from refusing, and there was nothing to indicate that we would find anything relevant to our health. We all confirmed that we understood the risks.

This was a rare opportunity for us, because it’s almost impossible to be recommended for genetic testing in Canada without prior family history of genetic problems. In the United States, you can get certain kinds of testing done privately, for a fee. But here it was, free of charge, with a turn-around time of one week from cheek swab to results. For many of us, the curiosity outweighed the risks.

As I said before, there was an overwhelming chance that none of us would have a bad mutation. Typically, if you have a mutation in a cancer susceptibility gene (we call these oncogenes), the consequences present quite young, or you know already that many family members have developed cancers at young ages. It’s doubtful that this would sneak up on any of us. And even if we did find a mutation that is connected to disease, you can find examples in the literature of people carrying seemingly deadly mutations and living long, healthy lives. Genetics is not a precise science in that sense: testing for diseases that are influenced by many genes, such as cancer, is still ludicrously difficult.

Freaking people out, on the other hand, is quite easy.

The fact is, Canada is ill-equipped to teach its students the ethics of genetic testing. The policy situation in the Great White North for genetic information has been described as “regulatory chaos”, and researchers are forced to rely on a mash-up of local, institution-specific rules and guidelines proposed in the UK or USA. No consensus exists for the Canadian institutions sitting at the interface of research and medicine: hospitals have their own rules, but in research, it becomes murky.


The day is approaching when genetic sequencing will supplement regular healthcare, but we’re not quite there yet. (Click image for source.)

For example, let’s say you volunteer to donate cells for a leukemia study as a “healthy control” – labs often need samples from healthy people to compare with their disease samples, in order to determine what may be contributing to the disease. Let’s also assume that even though your sample will be anonymized, someone along the line will have access to the database that can link it back to you. One day, while preparing your sequence for use in their analysis, a researcher notices that you have a mutation in the LRRK2 gene, which indicates that you are at risk of developing Parkinson’s disease. The study isn’t on Parkinson’s disease, but this is a well-known mutation, the risk is high, and the results are robust. In a case like this, which we call an incidental finding, should you be told?

Thankfully, Canadian research institutions are beginning to integrate these situations into their consent agreements: they either say explicitly that they will or will not inform you about incidental findings, or they give you a choice. But to my knowledge, there is no national-scale discussion of genetic ethics being held currently. You can argue that issues like this are best handled on a case-by-case (or rather, department-by-department) basis, but I’m inclined to disagree. The day is fast approaching when patients will present their doctors with genetic data and demand it be integrated into their healthcare: where will our national healthcare policies be then?

There is a big difference between genetic testing for trisomy 21 and genetic testing for cancer.

Certain disorders, such as Down’s syndrome and Huntington’s disease, have clear enough causes that we can tackle them with our current understanding of genetics. We can look for an extra chromosome, and if we see it, we can determine that this person will likely suffer from the syndrome.

I’ve written previously about the complexity of cancer, and this is where that complexity rears its ugly head. Cancer is a polygenic disease, which means it is under the control of many genes and their products. At this point, it is next to impossible for geneticists to take a genetic profile of a cancer and predict how it will grow – there are simply too many variables and unknowns. It would be even harder to predict future cancers from your healthy DNA.

With this in mind, I began reading about the ethics of genetic testing for polygenic disease, and it seems to me that for now, it comes down to a cost-benefit analysis.

Potential benefits

  • Benefit to psychological well-being (at least now you know)
  • Benefit to health (through increased vigilance, a better diet, quitting smoking, etc.)

Potential drawbacks

  • Decreased psychological well-being (emotional distress)
  • Potential for false positives/negatives
  • Genetic discrimination (a harder time acquiring health insurance, etc.)

It’s a short list, isn’t it? Many of these items can apply to single-factor genetic conditions as well (for example, knowing that your risk for Huntington’s disease is extremely high would probably cause some emotional distress) but the big difference here is the misinterpretation of risk. This acts on two levels: the patient may not understand what the doctor or genetic counsellor is telling them, but also, the field of genetics is not advanced enough to make an informed assessment in many cases. Even the most informed geneticists can make a completely incorrect judgment, simply because we do not yet understand the way genetic predisposition works for many diseases such as cancer.

This is why we don’t often perform genetic testing for cancer. In many ways, we don’t know what we’re doing yet.

I have the data from my own sequencing. The files are sitting on my laptop at home, waiting to be loaded and scrutinized. In those files are the A, C, G, and T sequences that spell out how to make key proteins in my cells – proteins that repair my DNA, tell my cells when to grow, help make energy in my mitochondria, and trigger cell death when things go wrong. If there is a defect hidden in one of these genes, it may alter the course of my life… or it may not.

The only other question is: would you want to know?

The Lab Culture

I’ve been getting a lot of great feedback about a short humor piece I wrote recently for the Science Creative Quarterly. I’d like to repost it here for anyone who missed it — and I’d encourage you all to click through and check out the SCQ itself, just because it’s awesome, and Dave Ng is a great editor with big ideas for advancing science communication and education.

THE LAB CULTURE: A TRUE SCIENTIFIC HORROR STORY

My first co-op placement was in a research lab, four and a half thousand kilometres away from my home university. For the first week or so, it was like taking a really intense lab for a university class, except the prof was just judging me instead of grading me. He would ask me questions to probe my knowledge and get me to think hard about things, but the environment and topic were so new that this usually just made me feel silly and lost. I did learn things, but not very quickly.

Soon enough, I was set up to do cell culture on my own. In a biology lab, this means sitting in front of a metal table with a box over it, all sterilized and ventilated, with room to put your hands in and manipulate things as you look in through a glorified sneeze guard. This is where you grow cells in petri dishes. But no one really calls them petri dishes anymore. You sound really foolish if you call them that. Everyone just calls them plates.

The idea was simple enough. Put the cells on the plate, and cover them with a pink nutrient-rich liquid, and put the lid back on. Then put them back in the incubator. A few days later, if you looked under a microscope and saw that there were a lot more, you could take them back off the plate with chemicals, dilute them in more pink liquid, slaughter most of them, and put the rest on a new plate. You can do this with cancer cells pretty much forever.

Every day, I would stress over these tiny, loathsome critters. Most of them were actually human cells, but you don’t think of them that way. You’re too busy feeling like a ham-fisted klutz to think about it. My arms were too short to reach the back of the workspace, my hands would tremble, and sometimes I’d almost, almost forget not to put bleach in the yellow bag, because I still hadn’t internalized the fact that a tiny garbage mix-up could literally kill me and all of my coworkers. Shit’s intense.

Mercifully, finally, I started getting the hang of it. It was now about two months into my placement, and at this point I felt comfortable enough in the lab to show up feeling mildly grumpy on Mondays. I was settling in nicely, and so far, nothing horrible had happened.

Anyway, on one particular Monday morning, I was getting ready to start growing a new kind of cell – the same type of cancer, but a different cell line, so it had different nutrient requirements. I dug around in cupboards briefly for supplements, before remembering that a previous student had grown these cells before. He had finished his honours thesis and left shortly before I started, so his stuff was still in the fridge.

This is the thing about cell culture… you’re basically putting sugar and vitamins on a plate. We keep these nutrient liquids in plastic bottles in a fridge, sealed up, and only open them in sterile containment, because there are a lot of tiny things that would be happier than a pig in shit if they found themselves in there.

Evidently, something got into this one.

The next part is a bit of a blur. As I pulled the bottle from the back of the fridge, I held it to the light to see if there was any mold growing in there. That’s typically what we’d find… mold. I don’t think that’s what this was. It was almost a perfect sphere, maybe a bit oblong like an egg, and a bit yellowish in colour. It was about the size of a tangerine – I’m not kidding. It seemed solid. I shook the bottle a bit, and it bounced off the sides.

All I could think was… oh god. It’s human.

Listen, I’m sorry. I know it’s gross. That was honestly my best guess, because the bottle was last used to feed a human cell line, and sometimes if people aren’t paying attention they’ll put the glass pipette back in the bottle after touching the plate. That’s all it takes to move cells around. This thing had been growing undisturbed for at least two months, probably more. Bacteria would make the liquid cloudy. Fungus would probably look like tendrils through the bottle. I don’t know. I panicked. It looked like a damn kidney.

So, here I am, Monday morning, holding a tumor in a bottle in one hand and spinning in circles to see if anyone else was around. It never registered in my mind that this wasn’t my fault. Really, the last guy was the most likely culprit, but I was still new enough that I was convinced I would get blamed. That’s kind of how it works in labs: the lowest on the pecking order gets blamed if no one else can be directly implicated. So, having accepted that this was now my problem and I only had moments before other people started arriving for work and discovered me, I did the only thing I knew to do when dealing with a biological contamination.

I got the bleach.

I poured off some of the liquid into the sink and bleached it a bit, then topped up the bottle with more bleach.

And then I shook it. Hard. For quite a while.

This thing was so big that it wouldn’t have come out of the opening on its own. And there was no way I was about to stick something in there to break it up. I wanted to kill it, fast, not risk pissing it off.

I shook, and shook, and sweated profusely, and hyperventilated a bit, and shook. I’d pour off a bit, rinse, bleach, repeat. Until, at last, I had broken it up into a suspension, bleached the last of it, and poured it all down the drain. I was shaking worse than at senior prom, but at least this time no one was around to see it.

So, there you have it. Terror. Stupid, weird, messed up terror – not the kind that will get you a book deal after the fact, but definitely the kind you get to inflict on other people over beers.

Giving me the proof

Filters. Syringe filters. Cell media. Damn it, stupid filter.

I woke up to these thoughts two nights ago, around 1am. I had made media for my cells that afternoon. This involves taking a 500mL bottle of pink liquid, loaded with nutrients, and adding some antibiotics and serum. We filter the serum as it goes in, even though it would have been filtered when it was bottled by one of my colleagues, just to be extra sure no contamination has slipped through.

If your media is contaminated, the cells you’re growing will either die, or be sick.

I woke up thinking about that filter. Of pressing the serum through the syringe and through the filter and watching it drip into my media. In my mind’s eye, I scrutinized it. Did I filter fast enough? Did any get sucked back into the syringe? We had run out of the large syringes, so I had to use the same one twice to filter all of the serum. It was hard for me to move the plunger up and down; I have weak wrists and at the angle I was at (working in a cell culture hood means I was reaching below a pane of fiberglass, then back up to hold the syringe over the bottle) it was almost painful for me to squeeze the plunger back into place. A few times, I thought I felt it slip backwards: one of my colleagues had warned me not to let that happen.

I silently debated the likelihood that serum had, in fact, moved backwards through the filter, and what that might mean. None of this should have mattered at 1am, or 2am, or 3:30am, when I finally managed to get back to sleep.

Sometimes, you don’t know your media is causing the problem until weeks later, wasting many hours of work. Your experiments just start failing.

I’m writing this post hoping that it will get me back into the swing of blogging. The truth is, I thought that maintaining a blog would help me stay grounded, and give me an outlet for analyzing my feelings as I run into trouble with my project. That’s not what happened. After my first project fell through, I spent the remainder of August trying desperately to gain a foothold in a new one, which required a lot of reading and a lot of thinking. In September, I left the lab for 4.5 weeks, to take a month-long class that is required for my degree program. When I got back in mid-October, I finalized my project plan, formed my thesis committee, and slowly, slowly, began to organize my experiments.


The month away was good for me I think, but when I returned to the lab, I was hit hard by the psychological and intellectual stress that I had temporarily escaped. I really was starting from square one again. Here I was, 9 months into my degree, and I was starting all over.

I became extremely nervous and self-conscious, two traits that I thought I had left behind several years ago when I entered undergrad. I began questioning my abilities, my self-worth, and my right to be in this lab. Less than a month earlier, I had watched as one of my peers was transferred to another lab because our supervisor isn’t here enough to give her the one-on-one support she needed; I worried obsessively that I would be next.

It’s odd that these things become easier to talk about in retrospect. Really, it didn’t seem as bad at the time. I had begun to accept that I was no longer an exceptional student, that my goals would have to change accordingly, and I slipped into that defeatism because I thought it was my only option. I knew I was probably becoming depressed, and stupidly, I accepted that. I was embarrassed to admit to anyone that I was feeling this way; I felt like a failure. What happened to the confident and engaging student that pushed through an honours project at Dalhousie, just a year and a half ago? Why couldn’t I just push through this too? People have made off with graduate degrees in much harder situations than mine. I can pay my bills each month, I can feed myself and my cat, and I still get to see my friends now and again. I obviously just needed to work harder.

Now that I’m starting to pull myself out of it, I realize how damaging those thoughts were. I still spend some days feeling defeated and scared, and I still have trouble sleeping.

All the while, I knew… this wasn’t my fault. I couldn’t have prepared for things going the way they did.

I submitted a grant application today, which is a story in and of itself, but not one I’m interested in digging up. It’s done, and yesterday, I had the chance to read the reference letter written by my supervisor. It hasn’t been often that I’ve had the chance to see what my supervisors write about me, but he wanted me to proof it for him. I thought, What do you expect me to say, “Tell them I’m awesome? Please?”

Earlier that day, we’d sat down for a brief meeting to cover what I’d been working on while he’d been visiting collaborators this past month. He seemed pleased with me at the time, but it’s always hard for me to interpret him. Even as we talked, I felt numb. Invincible, almost. After two months of near-constant stress, I’d run myself into the ground emotionally. I probably couldn’t feel much worse.

I avoided the reference letter for several hours before finally sitting down with it. It was a glowing review.

“Extremely intelligent”, “impressive”, “highly motivated”, “world class”, “top 2-3%”, “perfect candidate”.

“Strongest possible recommendation”.

I stared at the words. I felt elated; I felt suspicious. I had never read anything this commending of me before, and now I was hearing it from the boss I had convinced myself I was disappointing. Now, in the depth of crippling doubt. Was he just doing this because some of the award money would subsidize my stipend? When I “proof” this am I supposed to ask him to tone it down?

Then, I saw: “I am extremely selective, and she would not be in my laboratory if I did not believe this.”

This was something I knew to be true.

Last night, I slept for 7 hours straight.

Playing with fire

This is going to sound silly and perhaps a bit hypocritical. I worry sometimes about our culture’s growing fascination with genetics.

For my entire life, I’ve been known as an overly cautious person. My parents tell me that the first time I encountered snow as a toddler, I sat down on the edge of the deck and gingerly put a foot in while my younger sister launched herself head-first into the fluff. I think I’ve jumped off a diving board once in my life. I don’t think I’ve ever been on a rollercoaster, now that I think about it. I lock the apartment door even when I’m in the apartment and it’s the middle of the day (we live in a very safe neighborhood).

You could say that I’m overreacting, but as the resident purveyor of caution, I like to think that sometimes we all need a healthy dose before leaping into something new.

The genetic code is not what you think it is.

Businesses like 23andMe are beginning to offer genetic testing to the public, and let me say at the outset: I do believe this is a good thing. Genetic sequencing is getting cheaper and easier each year, and leaving this technology locked behind lab doors in universities and biotech companies doesn’t resonate with my morals. Genetic sequences can be powerful resources, and you have as much right to see your genetic code as anyone else, especially if you’re the one offering to pay for it.


The spit kit from 23andMe (click for a link to the original article in Discover Magazine).

The beauty of 23andMe’s model is that they do not, in fact, do full-genome sequencing. Instead, they focus on a set of genes believed to be relevant to health and ancestry. This means that when you get your results back, they can summarize for you the limited assessment that such a test is capable of giving. For ancestry, I’m sure this is more than adequate. It may also be adequate for certain medical purposes, such as determining whether you may carry a mutation associated with a genetic disease (I say “may” because these tests are not foolproof and require repeating, at the very least, to confirm the true presence of a mutation). By limiting the information they intend to collect, commercial sequencing services avoid some of the more complex demands of genetic analysis. There is a reason you can get results for $99 — you’re getting yes/no answers, and very little analysis by a human being (I’m not familiar with the business models of commercial “personal genomics” companies, but I’m guessing that $99 can’t cover materials, maintenance, overhead, and even one day’s wages for one of their technicians). From my understanding, your “risk” score is calculated by comparing your results to a slew of genome-wide association studies (GWAS), which essentially aim to sequence as many people in different ethnic groups as possible to determine associations between their genetic patterns and disease (as well as ethnicity-related patterns for ancestry analysis).
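To make that concrete, here is a crude sketch of the kind of additive, GWAS-derived risk score these services are often described as computing. Every SNP name, risk allele, and odds ratio below is invented for illustration; real scores involve far more markers and careful population matching.

```python
import math

# Made-up GWAS effect sizes: SNP id -> (risk allele, odds ratio).
gwas_effects = {
    "rs1111111": ("A", 1.20),
    "rs2222222": ("T", 1.05),
    "rs3333333": ("G", 0.90),  # below 1.0 means protective
}

# A made-up customer genotype (two alleles per SNP).
my_genotype = {"rs1111111": "AG", "rs2222222": "TT", "rs3333333": "CC"}

# Add up log(odds ratio) once per copy of each risk allele carried.
score = sum(
    my_genotype[snp].count(allele) * math.log(odds_ratio)
    for snp, (allele, odds_ratio) in gwas_effects.items()
)
print(f"Estimated risk relative to baseline: {math.exp(score):.2f}x")
# Estimated risk relative to baseline: 1.32x
```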

Now, if you’re doing it for fun or to do a little snooping into your ancestry, these services are harmless. Setting aside their specific methods (I’m really not here to pick apart the legitimacy of these sorts of businesses), I have problems with the influence that services like this have on our culture. First, I think people tend to place too much confidence in genetic assessments; and second, it won’t be long before commercial services are sequencing whole genomes and handing out huge packets of genetic code, and that will bring a slew of problems that we’re not quite ready to face.

When the Human Genome Project was wrapped up, we all had to face the sobering reality that a completed set of human DNA sequences will not spell out the cure for cancer (or many other genetics-based diseases). It’s important to remember just how pervasive the hype was for this project: President Clinton vehemently supported the HGP, and his words helped to fuel a confidence and excitement for genetics that science communicators today would be hard-pressed to match:

In coming years, doctors increasingly will be able to cure diseases like Alzheimer’s, Parkinson’s, diabetes and cancer by attacking their genetic roots. Just to offer one example, patients with some forms of leukemia and breast cancer already are being treated in clinical trials with sophisticated new drugs that precisely target the faulty genes and cancer cells, with little or no risk to healthy cells. In fact, it is now conceivable that our children’s children will know the term cancer only as a constellation of stars.

Who wouldn’t be moved by that? Even now, in 2013 and as a genetics student, I can read through this speech and feel inspired by it. If only it had been so easy. I was 10 years old when Mr. Clinton said this, a child of his hopeful generation, and I seriously doubt my children will be free of the suffering that cancer creates.

The problem lies in the fact that genetics is not the whole story… at least, not in the way the media makes it out to be. In fact, I would go so far as to say that sometimes biologists also put too much confidence in what the genetic code can tell them about human health and disease. That confidence is broadcast out yet again by university press offices and overenthusiastic journalists interviewing overenthusiastic researchers. I can’t blame them: in the fantastic mess that is human genetics, finding anything that looks significant can instill the giddy sense of having found a needle in a computational haystack. That excitement is contagious.

With the scientists, media, government, and businesses all a-buzz saying that your genetics will unlock the key to your health and happiness, I think it’s up to the rest of us knowledgeable-but-not-well-paid genetics folks to pull in the reins and say, shhh, hey now. Let’s not get ahead of ourselves.

Genetics is stupidly complicated. Really. Seriously. (I’m writing a post for the SciAm Guest Blog at the moment that should be out in the next week covering more of my thoughts about complex cancer genetics, so I won’t go on much about it here). The reason we haven’t extracted our answers from the code yet isn’t because we’re not smart enough, or not working hard enough, or being bribed to not talk about it (please do try to bribe me — I need a new couch).

The genetic code has offered us new insights into how biology works, but not in a simple way. It’s like we paid an engineer to build us a complex piece of machinery, and instead he dropped off the supplies and a hand-drawn sketch of what he thinks it may look like when it’s done, handed us a toolbox and said “Have fun!” Was that seriously just the cost of the materials? Well who’s going to put it all— what, me??

This will all get a bit crazier when full sequencing gets cheap enough to be offered to the public. This is what will likely happen: some company (probably backed by Google) will begin offering full-genome sequencing. They — or potentially someone else — will then provide a service for “analyzing” it, looking for mutations and essentially decrypting it for you to tell you what it all means. Regardless of whether or not they intend to be comprehensive, they probably won’t be. The speed of sequencing technology is increasing so much faster than our ability to analyze the data it spits out; the analysis of a single genome currently requires so many computational and intellectual resources that bringing it within the price range of consumers will either sacrifice the quality of the analysis, or the breadth.

Maybe we will reach a point where full-scale commercial genetics will be a legitimate endeavor, but I think it’s important that we all be cautious as this kind of technology appears on the market. Researchers learn new and important things about human genetics every single day, and that will likely still be the case when people begin looking at their own codes. Will these genetics companies thoroughly warn and counsel their clients about the problems that may be present in the data? Will the public be open to the sort of skeptical analysis required for data like this? Or will they instead demand immediate answers from us, leaving most researchers to duck into the shadows while opportunistic entrepreneurs churn out analytic tools that have bypassed peer review? What happens when the myriad of tools being developed by programmers and biologists winds up being marketed to a public hungry for clear answers?

The best science produces opportunities as the community learns and grows, and these opportunities can often be abused. I hope that when the time comes, and the public turns to academics to legitimize marketable genetics, we will be in a position to answer them. Remaining vigilant and critical is key, both for scientists and the public, and the building of trust between science and the wider community is even more important. We all know what happens when popular opinion drowns out reason: we can’t let it happen again.

Life at the interface: hot biology meets the public eye

Science journalism presents unique opportunities and challenges as it jostles for a place within the politics-rich media diet of modern life. It’s a difficult line to walk: I would imagine science journalists struggle to keep their message clear while also giving the topic the attention to detail it deserves. The ways in which we choose to communicate complex ideas say a lot about our values, as they immediately reflect what we believe to be the important aspects of a story. Luckily, the internet provides us the opportunity to select from a variety of sources, all of which will cover the topic from a different angle and to a different level of detail. Mainstream media may give you the popcorn 30-second blurb, while in-depth articles can take you further.

From my perspective in a scientific field, I find it fascinating which stories bubble up into the wider public consciousness, and which others remain in the shadows. This isn’t to say that the low-profile stories are hidden: I’m more talking about the stories that have deep implications and excite people, but get ignored beyond the circles of the highly skilled. I sometimes wonder what else would bubble up to public knowledge if more was done to communicate those ideas.

The summer of 2013 has produced a number of “blockbuster” biology stories. I’ll cover a few of them briefly, and link out to additional reading if you’d like to learn more.

The first successful cloning of human stem cells
Nature News & Views: http://www.nature.com/nature/journal/v498/n7453/full/498174a.html
Wired UK: http://www.wired.co.uk/news/archive/2013-05/17/cloned-stem-cells
Al Jazeera: http://www.aljazeera.com/news/americas/2013/05/2013515195843482384.html

In May of this year, a team of researchers at the Oregon Health & Science University finally perfected the recipe for proper cloning of human stem cells.

In areas where embryonic stem cell research has remained too controversial, and in situations when embryonic stem cells are not strictly required, scientists have substituted a special type of “reprogrammed” human cells called induced pluripotent stem cells, or iPSCs. A technological marvel in their own right, iPSCs are derived from adult tissues, such as skin, and induced to revert back to a stem cell-like state. The beauty of stem cells is that they can be used as a tabula rasa cell type, able to differentiate and form many other types of cell in the human body. This ability is called pluripotency; pluripotent cells could be valuable in creating, for example, replacement organs grown from the patient’s own tissues (thus hopefully reducing the chance of rejection). It is still unclear how similar iPSCs are to embryonic stem cells.


Graphical abstract for the landmark paper, “Human Embryonic Stem Cells Derived by Somatic Cell Nuclear Transfer”, published in Cell. Click the image to go to the paper.

This story was important for very obvious ethical reasons: many people have moral issues with the use of human embryos for research or medicine, as the embryo must be destroyed in the process. This new method bypasses the need for embryo-stage cells, requiring only a cell from the patient (e.g., a skin cell) and a donor egg. Essentially, you take the nucleus from the patient’s cell and transfer it to the egg, which has had its own nucleus removed (this process is called somatic cell nuclear transfer). With the right cocktail of proteins, electricity, and a little caffeine, the cell can be induced to begin dividing. This is the same sort of method that produced Dolly the sheep, born in 1996.

For human stem cells, it all turned out to be a matter of timing. Because human eggs don’t complete their cell divisions until after fertilization, the timing of cellular events is critical if the egg is to survive. In somatic cell nuclear transfer, certain manipulations of the egg that are crucial for transfer of the new nucleus can cause it to speed up its division, throwing the timing off and ultimately disabling the egg’s ability to become a stem cell. By introducing caffeine, the researchers were able to slow down this process and give the egg the time it needed to reach pluripotency.

Aside from replacement organs, these cells could also be useful for treating mitochondrial disorders — mitochondria are the energy-producing organelles in your cells, and defects in these structures can lead to life-long complications. This technique involves moving only the nucleus from the patient’s cell into the donor egg, so if the patient’s mitochondria are defective, they’ll be left behind. Instead, the resulting stem cells will have mitochondria derived from the egg (mitochondria have their own DNA and essentially replicate themselves when the cell divides, so when the stem cell divides to make two stem cells, both inherit mitochondria descended from the egg’s, and nuclear DNA from the patient). This could potentially lead to new treatments for mitochondrial disorders.

The new stem cells seem to differ slightly from iPSCs on the molecular level, so further validation of this research is needed. Also, it’s important to note that this isn’t the first group to claim successful human cloning. Rumors have suggested that data in the present study may have been falsified, but I would recommend no one start taking sides until the necessary validations are done. Because of the controversial nature of the subject, it was bound to be met with skepticism: this is what peer review is for.

I could go on for quite a while about this subject, but I’d like to wrap up with a thought. Although I have yet to hear anyone asking about it, I have no doubt that sooner or later someone will claim that this technology will be (or is already being) used to create human clones (like, the fully grown, people-types). If I can, I’d like to pre-emptively dispel any concerns about that. Cloning mammals is very difficult at every stage of the process: in this study, of 851 successfully cloned ferret embryos from somatic cell nuclear transfer, only one pup was born (and died shortly thereafter). In the second attempt, of 917 cloned embryos, 3 were born and 2 survived. A cloned cell does not a baby make, and the unlikelihood of such a clone coming successfully to term is reason enough for us to avoid using this technology for reproductive purposes.

Lab-grown beef is consumed for the first time
Discover Magazine: http://blogs.discovermagazine.com/crux/2013/08/05/first-test-tube-burger/#.UgQCMqzFfZ4
The New York Times: http://www.nytimes.com/2013/08/06/science/a-lab-grown-burger-gets-a-taste-test.html?_r=0
NPR: http://www.npr.org/blogs/thesalt/2013/08/05/209163204/long-awaited-lab-grown-burger-is-unveiled-in-london
Scientific American: http://www.scientificamerican.com/article.cfm?id=test-tube-burger-lab-culture

This story is a bit more recent… and delicious.


The lab-grown burger is cooked! Image via Discover Magazine (click image for link)

Dr. Mark Post’s group at Maastricht University has created the world’s first petri-dish burger. Okay, maybe they’re not calling it that, but I am. The project was funded by Google heavy-hitter Sergey Brin, who invested over $330,000 USD and didn’t even get to help eat it. I hope they at least offered to let him lick the plate.

Reports of the burger’s bland flavour abound, and that’s probably because this was pure muscle — as lean as it gets, really. No one thought it was immediately gross, so I bet if they had some fat mixed in there you could hardly tell the difference.

It took 3 months to culture the stem cells, which were taken from cow shoulder tissue (these satellite cells are key to muscle regeneration after injury). They were grown for a time in nutrient-rich media containing fetal bovine serum, which induces them to proliferate and differentiate into muscle cells and form tissue strands. The resulting muscle fibres were exposed to minor electrical stimulation to make them contract and relax, and this “exercise” allowed them to grow further. 20,000 of these strands were then put together to form the final burger.

Although this project was terrifically expensive for one measly burger, it’s important to remember that most of the 5 years it took was spent planning and optimizing their methods. The actual process only took a few months, and that’s really not so bad for a first try. Even though this is nowhere near ready for commercialization, I really don’t believe the naysayers who claim there is no potential here. If other realms of biology are any indication, if it took 3 months this year, it’ll be down to a month in a year or two. The advancement of this sort of technology depends on what hurdles need to be overcome, and from my understanding the biggest limiting factor at this point is the cost of the fetal bovine serum (which currently runs for about $250/litre).

A new vaccine for malaria
BBC News: http://www.bbc.co.uk/news/health-23607612
ScienceDaily: http://www.sciencedaily.com/releases/2013/08/130808142144.htm
Nature: http://www.nature.com/news/zapped-malaria-parasite-raises-vaccine-hopes-1.13536

I just started seeing articles about this today, so I’m guessing this is quite recent.

Malaria is caused by a blood pathogen that enters the body via mosquito bites. The PfSPZ vaccine, created by Rockville-based Sanaria Inc, is the latest in a flurry of research attempts to develop a malaria vaccine, a major goal championed by the World Health Organization.


Image via National Geographic (click image for link)

The development of the vaccine was difficult, and quite different from that of normal vaccines. After mosquitoes were allowed to feed on pathogen-infected blood, the malaria parasite was isolated from the mosquitoes’ salivary glands and irradiated. This weakens the pathogen, but does not kill it (many vaccines contain only proteins isolated from the pathogen they target; however, some vaccines contain weakened pathogens, and this approach is known to be quite safe and efficacious). In the latest clinical trial, 6 people received the vaccine, and after being bitten by infected mosquitoes, all 6 resisted the disease.

The results are promising, but the mode of delivery is problematic. To induce an effective immune response, PfSPZ must be delivered intravenously — a simple intradermal injection does not yield a sufficient response. Sanaria is currently exploring options for easier vaccine delivery, noting that the amount of vaccine required to enter the bloodstream is quite small. Further concerns about the stability of the vaccine (which must be kept at extremely low temperatures before being administered) are being addressed with the possibility of using veterinary infrastructure already in place, originally designed for the long-term storage of semen for artificial insemination of livestock in remote areas.

In 2010, an estimated one million people or more died of malaria, and while infections seem to have been reduced by programs providing items such as mosquito nets to small communities, a truly efficacious vaccine has been desperately needed and would be a huge victory for medical science. Let’s hope it lives up to all the hype.

Here, there be code dragons

In principle, I understand how genomes are sequenced. In practice… I’m starting to think no one really knows.

There are different kinds of sequencers, but at the Genome Science Centre in Vancouver they mostly use a machine called the Illumina HiSeq, so that’s what I know best. The quick version of the actual sequencing is as follows. You start with DNA that has been extracted from your sample of interest (a tissue culture, a colony of bacteria, a mouse blood sample, etc). The purified DNA is chopped up into tiny fragments, typically about 100 nucleotides (DNA letters) long. On the end of each of these fragments, an adaptor molecule is attached. This adaptor is complementary to a molecule that is attached to a glass plate, so when they bind, this sticks the DNA fragment to the plate as well. On this plate, the chemistry is performed to replicate the fragments many times, forming clusters of replicated fragments. Your DNA is now ready for sequencing.

More nucleotides are added, which bind to the DNA fragments within the clusters (bear in mind that each cluster should contain identical fragments, since we’ve created them by replicating a single fragment many times). These new nucleotides are labelled with a fluorescent molecule, causing light to be emitted when they bind to a fragment. The light released from each cluster tells the machine which base (A, G, C, or T) has just bound, so by putting on more and more nucleotides and keeping track of the light on the plate, the machine can discern the sequences of all the fragments (because the nucleotides that bind will ideally be complementary to the sequence they’re attaching to). You wind up with many, many sequences (reads) for each cluster, and these are used to determine the true sequence of each fragment.

So now you have your genome sequenced… except it’s in billions of tiny pieces.

The chemistry is not the hard part — now, you need to find a way to take all of those fragments, line them up in the right order, and determine how the resulting sequence differs from other known sequences. This is where bioinformatics comes in.

Now that the human genome has been sequenced (many, many, many times) we’ve got it pretty good. Since humans share most of their DNA, we can look back to other human sequences as a reference, and align our fragments to it by searching for similar sequences in this reference. Thankfully, we have computers for that. If we didn’t, I have no doubt it would be us grad students that would have to do it manually. This is billions of sequences. With a B. You need a fast computer.
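To show the idea (though not the real algorithm: production aligners like BWA and Bowtie use clever indexed data structures precisely so they never have to scan like this), here is the brute-force version with a made-up, absurdly short reference:

```python
# Brute-force alignment: slide the read along the reference and keep
# the position with the fewest mismatches. Fine for 21 bases; hopeless
# for 3 billion, which is why real aligners index the reference first.
reference = "TTGACCGATAGGCATTACGGA"
read = "GATAGGCAT"

def best_alignment(reference, read):
    best_pos, best_mismatches = -1, len(read) + 1
    for pos in range(len(reference) - len(read) + 1):
        window = reference[pos:pos + len(read)]
        mismatches = sum(a != b for a, b in zip(window, read))
        if mismatches < best_mismatches:
            best_pos, best_mismatches = pos, mismatches
    return best_pos, best_mismatches

print(best_alignment(reference, read))  # (6, 0): perfect match at position 6
```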


This image is actually unrelated to the text — this is my favorite tea to drink while I’m working/attending a seminar. If anyone sees this being sold in Vancouver, PLEASE let me know. I can’t find it anywhere.

Once this is done, you look for the differences between your sequence and the reference. This can be as dramatic as a whole extra or missing chromosome (Down syndrome, for example, is caused by an extra copy of chromosome 21), or as minuscule as a single changed DNA letter (a point mutation). Sometimes the letters have been shuffled, sometimes sections have been moved or replicated or lost… all sorts of changes can occur depending on the life of the cell that the DNA is coming from.
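As a toy example of what "looking for differences" means at the simplest level (made-up reference and reads, and none of the statistical care a real variant caller applies): once the reads are aligned, a point mutation shows up as a column where the reads agree with each other but disagree with the reference.

```python
from collections import Counter

reference = "ACGTACGTAC"
aligned_reads = [           # every read carries the same change at position 6
    "ACGTACATAC",
    "ACGTACATAC",
    "ACGTACATAC",
]

# Walk the reference position by position and compare it to the reads.
for pos, ref_base in enumerate(reference):
    base, count = Counter(r[pos] for r in aligned_reads).most_common(1)[0]
    if base != ref_base and count == len(aligned_reads):
        print(f"Point mutation at position {pos}: {ref_base} -> {base}")
# Point mutation at position 6: G -> A
```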

As you may recall, the key to the immunome project is that we’re taking patient tumor samples and looking for mutations. That means I need to learn how to do all of this. Also, learning these skills makes one highly prized as a lab technician, because many labs that outsource their DNA sequencing also need help making the resulting data usable to researchers who know very little about the super technical computer side of things. So, oblivious and excited, I set out to teach myself how to do all of this.

Things got out of hand very quickly.

I started by learning about the analysis workflow being developed in our lab — it made sense to stick to what my colleagues know, so if I needed help, I’d have people to go to. However, a lot of people in this lab are computer scientists (or at least somewhat trained in computer science), so a lot of their methods involve coding and database management and algorithms, and that is literally my entire knowledge of what these people do, right there, in those three terms. It became obvious that if I approached the project this way, I might as well let someone else pull out my mutations for me, because I’d have to be hand-held the entire way.

I needed something more accessible. I needed more options.

I was directed to a graduate student in another lab, who implored me to take a computer science or bioinformatics course. This would be great advice, except I can’t take one in the summer, and I can’t take one in the fall semester because I have a full-time class running through September. I needed to do this alone, but where do you even start?

A very helpful fellow on Twitter, @genetics_blog (who turns out to be Dr. Stephen Turner, director of the Bioinformatics Core at UVA), responded to my call for online resources and directed me to a list he keeps on his blog of bioinformatics training resources. I found this paper, and this associated curriculum list. Published just last September, these papers lay the groundwork for a person wishing to train themselves in bioinformatics, using only free online course materials. If the author and Dr. Turner didn’t live so far away, I’d insist on baking them both thank-you cakes. Eagerly, I outlined the courses I would need to take — a few of the later biology ones, of course, plus two mathematics courses and a few in computer science. A few of the links were broken, but it seems most places have made the courses easy enough to find. Finally, I had a plan.

This was one month ago… and only now do I understand the enormity of the task ahead.


The torrent of genomics data has left the field of genetics scrambling to pull in computational experts and design appropriate tools to manage it.

When I started in the Genome Science & Technology program, I had the impression that after a decade in the “genomics era” of sequences and sequencers and technological growth surpassing Moore’s law, genetics folks had the situation under control. This couldn’t be further from the truth. In reality, genomic sequencing has given us access to a new realm of scientific discovery, and it’s pushing not only our technological expertise, but also our statistical and analytical techniques. Because it’s all so new and constantly being updated, we always need to ask “What’s the best method for doing this? What should we consider ‘significant’? Where should we place our data filters? What assumptions are we making that may be outdated?” At every step, we second-guess ourselves, and this culture of constant learning, adapting, and updating has resulted in the creation of dozens of computational tools aiming to do similar jobs in different ways, with no clear answer as to what works best or why.

My notebook is filled with lists of programs I could use for a variety of tasks. For viewing sequence databases, there seem to be three main options: the UCSC Genome Browser, Ensembl, and NCBI. Because I have a bit of experience with UCSC, I’ve decided to go with it — the interface looks dated but is clearly laid out and easy to navigate. For alignment and mutation detection (variant calling), I think I can use Galaxy, an online suite of tools geared towards people who need to do this kind of work without a strong background in computer science. I haven’t played around with it yet, but it has vast archives of tutorials and manuals, and seems to be known for providing great user support through its documentation. There are reams of tools available (many of them command line-based) for exploring sequence data and finding interesting features, but I haven’t even attempted to sift through that list yet. Most likely I’ll have to recruit a colleague to help me select more specific analytical tools, but that’s a problem I won’t face for some time.
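For my own reference, here is roughly what those alignment and variant-calling steps look like stripped of any graphical interface. This is only a minimal sketch, assuming bwa, samtools, and bcftools are installed and an indexed reference genome is on hand; all the file names are hypothetical placeholders, not anything from our lab’s actual pipeline.

```python
# Minimal sketch of the alignment -> variant-calling steps that a tool like
# Galaxy wraps in a web interface. Assumes bwa, samtools, and bcftools are
# installed, and that reference.fa has already been indexed with `bwa index`.
# All file names are hypothetical placeholders.
import subprocess

REF = "reference.fa"         # indexed reference genome
READS = "tumor_reads.fastq"  # raw sequencing reads from a tumor sample

# 1. Align reads to the reference, then coordinate-sort the alignments.
subprocess.run(
    f"bwa mem {REF} {READS} | samtools sort -o aligned.bam -",
    shell=True, check=True,
)

# 2. Index the sorted BAM so downstream tools can access regions quickly.
subprocess.run(["samtools", "index", "aligned.bam"], check=True)

# 3. Summarize the aligned reads at each position and call variants (the
#    mutation-detection step), writing the results to a VCF file.
subprocess.run(
    f"bcftools mpileup -f {REF} aligned.bam | bcftools call -mv -o variants.vcf",
    shell=True, check=True,
)
```

The appeal of something like Galaxy is that it wraps exactly these kinds of steps in a point-and-click interface, which is why it seems like the right entry point for someone in my position.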

On top of all of this, I must constantly remind myself that most of the courses I’m taking are from 2011 or 2012, so the information they give me may already be outdated. I try to pull up papers for the tools I intend to use, but something better could already be available, and more could become available between now and when I’m ready to actually start sequencing. It seems impossible not to lag behind when the technology moves this fast.

It’s unfortunate that I don’t have more guidance for my introduction to this area, but at least I now think I’m on the right track. With any luck I’ll come out the other side with a decent understanding of genetic analysis and the available tools. It just may take a bit longer this way.

Finding a new equilibrium

When I chose to go to grad school, many people had advice to offer. Friends and family who hadn’t been to grad school (or known anyone who had) were overwhelmingly supportive. Those who had gone would pull me aside when no one was looking and insist, “Think hard about why you’re doing this.”

Why? It looks like a sweet deal. Over the next two or three years, you can work with the best researchers in your field, you can learn new things and work with cutting-edge technology, you can learn about where science is right now, today, not ten years ago when your teachers last read the literature. And they’d even pay you to do it! (Although not well. Trust me, the stipend is less an incentive and more a safety net to make sure you’re at least eating enough to get your work done.)

Well, all of those things are true. But life is not simple, and working in a complicated field bogged down by administration, tight funding, and competition certainly does not make anything easier.

I suppose people are expecting an update on my thesis. I wish I had one.

As I mentioned earlier, I’ve taken on a second project alongside my immunome project — and, unfortunately for this blog, I’m not allowed to talk about it. (“OH HEAVENS!” you say, “SHE’S WORKING ON HUMAN HYBRIDS, OR FOR MONSANTO, OR ON A BIOWEAPON, OR–”) No. None of that. As I mentioned when I explained the immunome project, researchers sometimes worry that if their idea gets out, someone else will jump on it and do the work first. If a rival lab publishes a paper first, it’s called “getting scooped”, and it renders the time and money you invested in the project worthless. This project looks promising, is specific enough that the experimental plan is clear and well defined by past research, and is within the capabilities of many other labs. If our hypothesis is right, it could be a heavy-hitting publication… but we need to hope we’re the only ones with this idea. And that means I can’t talk about it.

As for the immunome project, I’ve essentially stalled. Until we can figure out where our samples are coming from, there’s not much for me to do. In some ways this is a blessing, as it gives me ample time to teach myself everything I need to know to do our experiments (I need to learn how to do genetic sequence analyses, and without anyone willing to devote many hours to teaching me, I’ve turned to free online courses and lectures offered by places like the NIH and Harvard). This will take time… probably months. I’m trying to fit at least an hour and a half of “class” into each work day, but I’m starting to feel the squeeze between that and the background reading I have to do for the other project.

I’m starting to feel that fear, the stuff I was warned about.

At least the weather in Vancouver has been beautiful for almost the entire month. Maybe that’s the real reason I haven’t been blogging…

I can easily see how a grad degree can eat up your life, and I’ve been intentionally limiting how much I work. I don’t respond to emails in the evenings and I try very, very hard not to think about work on the weekends. So far, everything is manageable. But with an increased workload looming, and my real classes beginning again in September, I’m starting to realize that perhaps signing up for a PhD (instead of a shorter MSc) was ambitious. Honestly, I’m wondering if it’s the right decision for me.

Throughout my education, I’ve tried to make long-term decisions as late as possible. This isn’t because I want to avoid them, but because I know that if I wait until a few months before a hard limit (graduation, application deadlines, etc.), I’ll have the most possible information to make the decision. In that vein, I’m not surprised that my early decision to do a PhD (a decision usually made after a year or two of an MSc, but which I made at the very beginning) has left me feeling uncomfortable and trapped. I’ve placed myself in a position where I don’t feel able to grow and evolve academically. I see avenues I could take instead, and wonder if I’m even allowed to change my mind now.

I will see this thesis through, even if it takes a different form than I initially expected. After 7 months in Vancouver, I can say I’ve learned far more about biology than I did in the last 2 years of undergrad. My ideas and goals are changing, and I need to find a way to reconcile that with what my supervisors expect of me, and what I expect of myself.

This post is my attempt to get back into writing, after spending a month hiding away in my brain.

There are topics I’m eager to talk about: things I’ve learned or am thinking about, or things I would like to explore more. But I knew that in order to do that, I’d first want to cover where my own work is at. I know this isn’t much of a substantial update, but unfortunately it’s all I have for now. My days are spent compiling a summary of background knowledge in preparation for the new project, watching seminars, going to lab meetings, and occasionally reading new immunology papers. I’ve created an RSS list that keeps me more or less up to date on new papers in the big journals, as well as on potentially relevant press releases and blog feeds, so I’m finally starting to feel like I’m gaining some mental traction in the field. I’ve also started using Mendeley, a popular reference manager, to keep track of all of the papers I’m reading. (This has proved infinitely easier than organizing them in Dropbox folders, though I still back everything up on Dropbox so I can theoretically work elsewhere; Mendeley and Dropbox refuse to sync effectively, so it’s a bit of a workaround.)
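For the curious, here is roughly what that RSS habit amounts to if you script it instead of using a feed reader. This is just a minimal sketch, assuming the third-party feedparser package is installed; the feed URLs are illustrative placeholders, not my actual reading list.

```python
# Minimal sketch of polling journal RSS feeds for new papers.
# Assumes `pip install feedparser`; the URLs below are placeholders.
import feedparser

FEED_URLS = [
    "https://example.org/journal-one/rss",  # placeholder journal feed
    "https://example.org/journal-two/rss",  # placeholder journal feed
]

for url in FEED_URLS:
    feed = feedparser.parse(url)
    title = feed.feed.get("title", url)  # fall back to the URL if untitled
    for entry in feed.entries[:5]:       # five most recent items per feed
        print(f"{title}: {entry.title}")
        print(f"  {entry.link}")
```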

I also found out a few weeks ago that I’ve had a pretty substantial iron deficiency for some time now, so taking iron supplements has really given me a greater feeling of control over my life. Anyone who has been iron deficient knows the depths of the fatigue and helplessness that come along with it, and fighting through that over the past few months (without really acknowledging that anything was wrong) had made me think I couldn’t handle grad school. It seems silly how much difference a little pill of iron can make. It no longer takes multiple cups of coffee for me to get through the day… and in fact, I no longer drink coffee on weekdays, because now that my energy levels are normal, I find it makes me jittery and uncomfortable.

So, long story short: I’m doing well, I’m getting the hang of this, but I definitely had the wind knocked out of me this month. Science is fascinating, but the institution of science is fraught with paperwork and deadlines and restrictions and responsibilities. It’s becoming clearer to me why people who love it choose to leave, but I’m nowhere near that point yet. I’m just readjusting to a new normal.

Roadblock

I have found the perfect gif for this situation.

The model for scientific research in a capitalist society is pretty straightforward: convince people who have money that your research is worth doing, and maybe they will give you money to do it. Unfortunately, each year this gets harder. Between the sequester’s impact on US funding agencies and the cuts here in Canada, having a grant rejected is par for the course these days. But we still rely on grants to function.

We just received two rejections.

This is unfortunate for two reasons: first, because the immunome project (my thesis project) largely relied on these grants to get off the ground, and second, because I was under the impression that we were almost guaranteed to get at least one of them. Now we’re scrambling to see what we can do with the resources we do have, and what that means for the quality and timeline of the project.

Et tu, science?

Needless to say, I’m worried, and I’m sure I’m not the only one. My supervisors are meeting (without me) tomorrow, and in the meantime I’ve taken on another project offered by my local supervisor. At best, it now looks like the immunome project will take a while to get off the ground, so I need something to work on while we wait. At worst, I may need to withdraw from that project, in which case the new one will become the basis of my thesis.

In any case, it will be my decision, and it will likely come down to how determined I feel to work on this specific project. This will affect my choice about whether or not to transfer into the PhD program, and whether or not I will remain in Vancouver into 2014. It’s all still up in the air, but hopefully we’ll be able to lay out a new plan over the next week or two. It’s just startling how fast it all happened. And it feels horrible to be in limbo.

I’ll leave this story short for now, and update again when things settle down.

[Image credit: #whatshouldwecallgradschool, a website accurate enough to be my biography.]

The margins of critical thinking

I’ve never seen myself as a very good critical thinker. I’m often uncomfortable asking questions in seminars (for fear of exposing my ignorance), I allow myself to absorb lectures passively instead of actively engaging with the topic (for fear of missing key details if I start over-thinking the material), and I tend to trust what I read by default.

As an academic, I will be expected to critique not only the work of others, but also my own work. It’s important not to underestimate the difficulty of this: finding flaws in a study can be particularly difficult when it’s your own work, and many a paper has found its way through the research, writing, and publication process only to be presented as a jumble of data propped up on shaky conjecture and rooted in a foundation of gibberish.

A good example of this is the notorious “arsenic DNA” paper from 2010, which has been thoroughly slammed and ultimately refuted.

I’ve been reading about critical thinking a lot lately, because reading is the only way I can learn about something and be on Twitter at the same time (go on, judge me). It seems to me, as a newcomer to this most sacred of scientific methods, that people regularly get it wrong in two ways. As the writer of a particular article or paper (or someone with a similar vested interest in the topic), you can be sucked into what’s being discussed and lose the ability to be critical about it, or to receive criticism. Conversely, when a paper or article is deemed lacking, some researchers forgo rational discourse and are absolute dicks about it. In both cases, the cause appears to be a loss of objectivity as unnecessary emotions get involved.

Let’s explore this further with two convenient case studies.

>>People who think they know the cure for cancer

I promised myself I wouldn’t write about this, but I’m clearly craving some sweet, sweet hatemail.

We’ll start this with the obvious caveats: I haven’t been in this research biz for very long (3 years, and only about a year of that has been in cancer research). I am not a specialist; most of my knowledge is scattered across several cancer types and subjects (genetics, immunology, metabolism, biochemistry, proteomics) in varying degrees. I don’t know the answer, but I know the problem, and I spend about 6 hours each day learning more about it. I have a long way to go before I’m an authority. That being said, I at least know the depth of my ignorance. Ignorance on its own is not dangerous: it becomes dangerous when it becomes a virtue. And it becomes a virtue when people lose faith in the providers of scientific knowledge.

Here’s a pretty picture of fireworks. It is here because as I write I will occasionally stop to look at it, to give my blood pressure time to go back down. I encourage you to join me.

Occasionally, we see an article pop up in the media: “New drug promises to stop [insert cancer here] in its tracks”. Oh, but wait — they’re not getting funding? Why?! Or “[Insert cheap supplement/vitamin] prevents cancer development”. When this happens, people begin claiming that researchers prevent the public from learning about “the cure for cancer” because it would bring the behemoth of the pharma industry to its knees.

Let me be abundantly clear from the outset: if there existed on this sweet earth a “cure for cancer”, no pharmaceutical company in existence could stop the public from finding out about it. It does not exist. In fact, it is the opinion of most sane researchers that it will never exist. It will not be purified from a rainforest plant or naked mole rat serum. Cancer is a very complex set of diseases, and will require a battery of sophisticated treatments to truly master, not one silver bullet.

The reason this rumor persists is that a) pharmaceutical companies are profit-driven, and b) many research projects are halted in their early stages for financial reasons. That said, there exists no evidence that anyone is being “silenced” — if Snowden can run to the press, researchers could too. The reason we don’t is that we simply do not have those answers. If the bigwigs of the pharma industry were smart enough to silence thousands of well-meaning people (virtually all of whom have lost someone to cancer), they would certainly be smart enough to find a way to profit off some magical “cure” for all cancers. The myth is unfounded and ridiculous, and it reeks of paranoia.

But then, what of these news stories? I’ve read things about up-and-coming treatments that look really promising! There are researchers going to the press to draw attention to their work! These people are heroes! … Right?

This is where the critical thinking kicks in: given that we have been throwing the most advanced technology ever developed by mankind at this problem, and given that some of the brightest minds on the planet are working together to tackle it, you should be immediately suspicious of any claimed “leap” in treatment. This has nothing to do with whether or not a particular researcher is right: the critiquing of new ideas, called peer review, is absolutely essential to how science works, because it helps us decide who has presented enough evidence to make a compelling case for their ideas.

One of the best pieces of advice I have read lately for gearing your mind toward critical thinking is simply “Begin in a position of disbelief, and demand to be convinced.” This is far, far harder for the general public to do for something like a cancer therapy, because understanding the reasoning behind a treatment requires a mountain of background information that is often not provided in press releases or media coverage (in fact, the media can distort the truth quite a bit). To properly comb through a newly published study, you need people who are familiar with the topic and context — the experts, other scientists. When a claim of a breakthrough reaches the public before the scientific community has had a chance to scrutinize it properly, it can be presented as a miracle treatment with no one to dispute it (often, the media will eat it up). When other scientists do move in, it is sometimes interpreted as “silencing”, and a lingering refuted claim can do disastrous damage. It’s all a matter of timing.

Pay attention to how claims are communicated. Is there a paper published in a scientific journal about this? If you do a Google search on the topic, does it seem like reputable sources (universities, organizations like the NIH, scientist bloggers) support the idea? If you know someone who works in a related field, have they heard anything about this study?

The take-home message: if the scientific community is not convinced, it is because the evidence is unconvincing and the claim does not have adequate support. Scientists shouldn’t dodge the critical evaluation of other scientists and run to the press with an idea, but when it’s your project, YOUR life’s work, it can seem to deserve it. These people are not necessarily trying to be manipulative; being emotionally invested in a project can prevent them from looking at it critically, just as being emotionally invested can make the public want to believe it.

>>ENCODE and the ongoing “junk DNA” debate

A very brief background on this topic: over the past few years, the ENCODE Consortium has been working to characterize the “functional elements” of the human genome. You may have heard that only a small fraction of our genetic code is “functional”: this is because only about 2% of our DNA can be used as a blueprint for proteins. The rest is jumbled code: leftovers from genes too mutated to function anymore, stretches of repeated DNA that serve no apparent purpose, and fragments of viral code inserted into our genome over millennia and retained for no particular reason. In a controversial paper published several years ago, the ENCODE team flipped our view of the human genetic code on its head, claiming to have found that 80% of our DNA is actually functional.

This was met with immediate backlash from other geneticists, and for good reason. ENCODE’s criteria for classifying regions as “functional” were very loose: if a stretch of DNA could bind a protein (something necessary for transcription to occur), it was deemed “functional”, even though there was no evidence that this binding would result in transcription. Although some of these stretches of DNA may be transcribed into RNA, the team did nothing to show that this RNA would have any purpose. There is no reason to believe that proteins can be coded from it, and even if they were, there is no reason to believe those proteins have any function themselves. Further, although RNA itself can be useful (the ribosome, for example, is largely made of RNA, and RNA can interfere with how other genes are transcribed), no effort was made to show that any of this is happening. Extraordinary claims require extraordinary evidence, and in the opinion of many geneticists, ENCODE fell far short of convincing us.
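To make the scale of the disagreement concrete, here is a toy back-of-the-envelope comparison. The roughly 3.2-billion-base-pair genome size is a standard figure; the 2% and 80% values are the ones discussed above, and the rest is just arithmetic.

```python
# Back-of-the-envelope comparison of how the definition of "functional"
# changes the headline number. The ~3.2 Gb genome size is a standard
# figure; 2% and 80% are the fractions discussed in the text.
GENOME_BP = 3_200_000_000  # approximate human genome size, in base pairs

definitions = {
    "codes for protein (classic ~2%)": 0.02,
    "shows any biochemical activity (ENCODE's ~80%)": 0.80,
}

for label, fraction in definitions.items():
    bases = fraction * GENOME_BP
    print(f"{label}: ~{bases / 1e6:,.0f} million bp")

# Output:
#   codes for protein (classic ~2%): ~64 million bp
#   shows any biochemical activity (ENCODE's ~80%): ~2,560 million bp
```

The forty-fold gap between those two numbers is exactly why the definition of “functional” matters so much: the data aren’t in dispute nearly as much as the criterion is.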

The problem I have with the critiques offered by some scientists (worth the read if you like schadenfreude) is the way in which they are presented. Yesterday I read an article by a geneticist who has spoken repeatedly about his distaste for ENCODE’s methods (link at the bottom of this post), and following the theme of his previous posts and the tone of others’, he was blatantly sarcastic, snarky, and borderline vitriolic. His emotions come through very plainly: he was likely offended by the studies pushed by ENCODE, and he does nothing to hide it.

We are all human, and just as our attachments to our work can make us blind to its flaws, they can also make us aggressively disdainful of research that we see as unfit. In both cases, we lose our ability to communicate our ideas. I will admit that when the backlash to ENCODE began surfacing, I was immediately defensive of their work. When we feel offended, we lose our objectivity. This is why science is not supposed to take sides.

What is the point of peer review? Science is a never-ending debate, but it should be a respectful one. The goal is to convince others, not to beat them into shame and submission (regardless of what American politics has taught you). If you’re out to bully people more than to educate them, then you’re not acting in the best interests of the field.

Critical thinking is a balance between cynicism and passion.

Scientists must aim to be cynical about the claims of others, but not cynical of other people. We must be passionate about what we do, but not so passionate as to shield our work from all criticism. Like parents, we should give our hypotheses a chance to get pushed into the mud a few times, in the hope that it will be a growing experience and that eventually the strong ones will learn to hold their own intellectually. Following on the theme of being good parents, we shouldn’t throw an elbow into everyone who irks us — that’s someone’s child, damn it! How would you feel if someone did that to yours? Are you out to help your colleagues grow, or to bully them into taking your side?

Additional References

Davies, Paul. “The final frontier in the war on cancer“. The Telegraph. http://www.telegraph.co.uk/science/science-news/9065707/The-final-frontier-in-the-war-on-cancer.html

Moran, Laurence A. “How to Make a Scientific Argument”. Sandwalk (personal blog). http://sandwalk.blogspot.ca/2013/07/how-to-make-scientific-argument.html?m=1

Graur, D. et al. “On the immortality of television sets: ‘function’ in the human genome according to the evolution-free gospel of ENCODE”. Genome Biology and Evolution (2013). http://gbe.oxfordjournals.org/content/early/2013/02/20/gbe.evt028.short?rss=1
