What is Gene Therapy?

By: Danning Li

On December 19th, 2017, Leber’s congenital amaurosis, an inherited disease that causes severe vision loss and blindness in children, met its match in the form of voretigene neparvovec (AAV2-hRPE65v2, or Luxturna for ease of pronunciation), the first gene replacement therapy approved by the FDA for sale in the United States. The previous sentence might contain some words that are not part of the current medical curriculum, but they will become an increasingly important part of our future practice as technology develops. Already, the rate of approval for these novel therapies is ramping up. Last year, the FDA approved three new genetic therapies for public sale (for those interested, the FDA news releases are here, in chronological order: Kymriah, Yescarta, and finally Luxturna). So, for today’s blog entry, with the possible dawn of the genetic therapy age upon us, let us focus on just what a “gene replacement therapy” is, and why it is such a big deal.

Gene Replacement Therapy 

Gene replacement therapy, as its component words suggest, is a treatment that aims to provide a replacement copy of a gene to a patient’s body. Whether the patient lacks the gene (and therefore the protein or enzyme it encodes) entirely, or carries a mutant, non-functional, or less-functional version of it, gene therapy seeks to rectify the issue. The goal of this therapy is for the artificial replacement copy of the gene to produce a functional enzyme or enzyme subunit that the patient’s own body needs and can use effectively.

Now, some might be wondering why we don’t just supply the missing enzyme instead; after all, Enzyme Replacement Therapies (ERTs) are well-established treatments for several diseases already. However, the answer to this question is simple: economics. Remember how enzymes are constantly made and broken down in the body naturally? Well, this applies to inherited diseases too: the patient would break down the injected replacement enzymes over time, and the effects would wear off. Since the replacement enzymes are made in a lab and have to be reinjected into the patient regularly, we would have to constantly produce the purified enzymes, and the patient would be on the hook for the rest of their life. This might not sound like such a bad problem; a lot of medications are like this already, right? Unfortunately, it turns out that treating someone for decades with regular ERTs costs around USD $9-10 million or more. Insurance companies and OHIP won’t be very happy if a lot of patients start lining up for therapies with these kinds of price tags attached.

Looking Towards the Future

Naturally, when faced with such a tantalizingly difficult problem, scientists and physicians thought up a ridiculously simple solution in the 1960s. If making these enzymes outside of the body and then injecting them into the patient is too costly, why don’t we just give the patient’s body the genetic information it needs to make the missing enzymes itself? After all, the most efficient enzyme production facility is a living, breathing body that converts regular nutrients into precious enzymes. Even better, if we were lucky enough and the inserted DNA managed to stick around inside the body somehow, we would have cured the disease completely, improved the patient’s quality of life, and opened up a completely new world in science. Thus, the idea for gene replacement therapy was born, and brilliant minds around the world would spend the next half-century trying to make this ridiculously simple idea a reality.

Author: Danning Li


Danning Li completed his BSc. majoring in Physiology at McGill University. Afterwards, he worked for two years on developing a gene replacement therapy for Canavan Disease, a rare inherited leukodystrophy, at the Horae Gene Therapy Center at the University of Massachusetts Medical School. Now a medical student at Schulich, he wants to bring attention to the interesting genetic therapies that will become available in the not so distant future.

Photo Credits: Creative Commons, Gene Therapy Infographic

Title: Cost and Effect: Do We Over-Regulate Drug Development?

By: James Payne

On one of my very first tests in medical school, I was asked which branch of government was responsible for approving new drugs in Canada, and I’m ashamed to say the question gave me pause.  One frequently hears about the actions of the FDA in the US; but is there even a comparable body in this country?  Of course, I’m being glib.  The Health Products and Food Branch of Health Canada (totally didn’t have to look that up) does indeed regulate which drugs are available for patients. So, we can all sleep safe tonight knowing we’re protected from the profit-hungry machinations of Big Pharma; but should we?

Government regulations are designed to keep us safe, and it’s not hard to see why they’re so important.  One needs only to recall the failure of the authorities to protect young mothers and their children from thalidomide in the early 1960s.  Even after the drug was withdrawn from markets all around the world due to its terrible effects on fetal development, thalidomide was still legally sold in this country. Ironically, it was a Canadian physician, Dr. Frances Kelsey, who blocked approval of the drug in the United States even while up against enormous pressure; and so spared that country from suffering the same fate as her own.  In 1962, Health Minister J.W. Monteith specifically cited the thalidomide tragedy when pushing a new bill with stricter regulations on drug development through Parliament.

Drugs are among the most important discoveries in the history of science.  How many lives have been saved by antibiotics?  By chemotherapeutics?  By the almighty beta-blocker!? But these drugs can pose a danger to us all as well.  They are designed to change things within our bodies; yet our bodies are so complex that sometimes they may change the wrong things.  This is why thorough and extensive drug testing is so important to protect both patients’ safety and their confidence in the healthcare system.  Regulation is essential.  But have we recently gone too far?

Big Pharma and Drug Development

People often look with contempt at big pharmaceutical companies.  Yet, we should also remember that these companies are responsible for many major drug breakthroughs over the past century.  They’re in the business of making money, but saving lives is a happy side effect.  I don’t expect you to feel sorry for Pfizer or Novartis or Roche, but I will ask you to consider that when they don’t make money, they don’t make drugs. Their bottom lines can be very much our problem.

Regulations on drug development increase drug costs (okay, fair enough; as we have discussed above, regulations are important!).  But the question of the day is: Do we over-regulate?  Over the past decades, evidence standards for drug efficacy and safety have steadily risen, and thus so have the costs of development.  Yes, this has likely prevented injury and even death from dangerous drugs (which cannot be overstated); but if there is a steadfast rule of economics, it is that when firms pay more, we pay more.  Fifty years ago, the cost of developing a new drug was approximately $250 million USD (adjusting for inflation).  Today, it may be as high as $5 billion.

These prices are astonishing, so it’s not hard to imagine some valuable areas of research being dismissed as unprofitable, especially for diseases more prevalent in the developing world. Of course, there is a strong argument to be made that regulations help drug companies to an extent; getting to claim an FDA stamp of approval can be used as justification to charge more per pill, and skyrocketing drug costs are not solely a function of increased regulation. However, stringent regulations may be costing us even more than mere dollars and cents.

Understanding the Process

As standards rise, so does time of development.  The time required to take a new drug from discovery to distribution has increased steadily since the 1960s; currently, the average ‘bench to bedside’ period for a new drug is 14 years.  Let us consider the implications of this delay: drugs limit mortality and morbidity.  Therefore, does increasing the time required to produce a new drug lead, in the interim, to increased death?  Basic logic suggests that it does, and some studies have estimated the numbers to be in the hundreds of thousands.

Moving forward, should the drug approval process become a simple matter of guessing whether it will kill or save more people?  Of course not.  Medicine is more than arithmetic; public confidence and peace of mind depend on the knowledge that what doctors prescribe is safe.  But can we reduce those regulations without seriously undermining the efficacy of authorities like the FDA?  Several ideas have been proposed, and I encourage you to read some of them here.

To leave some food for thought: are we so preoccupied with making sure our medications don’t kill us, that we allow diseases to do just that?  Is one worse than the other?  These are difficult questions.  But it may be time to ask ourselves: when it comes to regulating drugs, what is the cost, and what is the effect?

Author: James Payne

With a last name tailor-made for a future doctor, James really couldn’t have wound up anywhere but Schulich!  A London native, he did his undergrad at Queen’s, where he majored in Chemistry and Economics; the latter of which was the focus of his work for the UWOMJ journal.  James loves sports (he can catch a football better than Tom Brady), music, and, as you may come to find out, semi-colons.


Understanding the Health Impacts of Climate Change: Advancing the advocacy role of doctors and healthcare professionals

By: George T. Kitching

In 2009, a Lancet Commission on the health impact of climate change warned that climate change poses the greatest global health threat of the 21st century and has the potential to undermine the past 50 years of progress in global health. In June 2015, a second Lancet Commission on Climate Change (this one focused on health policy implications) concluded that “tackling climate change could be the greatest global health opportunity of the 21st century.” In the same month, the Lancet published a joint commission with The Rockefeller Foundation entitled Safeguarding human health in the Anthropocene epoch, with the intent to create a new interdisciplinary, collaborative research field named Planetary Health.

Defining ‘Planetary Health’

There is no agreed-upon definition of Planetary Health; however, it was first envisioned as the achievement of the highest attainable standard of health, wellbeing, and equity worldwide through judicious attention to the human systems (political, economic, and social) that shape the future of humanity, and to the Earth’s natural systems that define the safe environmental limits within which humanity can flourish. Time will tell whether this research field has staying power over the long term; however, a PubMed search for ‘Planetary Health’ already identifies 191 scientific journal articles on the topic published since June 2015. Much of the activity around Planetary Health has focused on defining the new field and distinguishing it from Global Health and the concepts of One Health and EcoHealth.

This process has been assisted by the launch of a new Lancet sub-journal in Planetary Health in April 2017. Topics covered in its first eleven issues include, among others, the health co-benefits of action on climate change and, in the September issue, the health impact of pollution. A comparison between One Health (which focuses on the connections between humans, animals, and their environment), EcoHealth (which values biodiversity, including all forms of life, and incorporates concepts of environmental sustainability), and Planetary Health found commonalities between the various approaches to health, but also identified differences in their contributing sciences, core focus, and core values. In particular, Planetary Health was noted to take a more anthropocentric approach to health, with greater focus on equity in human health, and to differ from Global Health through its emphasis on sustainability based on natural resources.

The Real-World Health Impact 

The development of the field of Planetary Health, with its direct focus on human health, speaks to the growing awareness of the impact of political determinants of health, mediated through the local and global environment. Air pollution and accumulating microplastics are two examples of factors that transcend national borders to contribute to global climate change and environmental degradation. The Lancet Commission on Pollution, published in October 2017, found that diseases caused by pollution were responsible for an estimated 9 million premature deaths in 2015, with the greatest share attributable to air pollution alone. These 9 million premature deaths represent approximately 16% of all deaths worldwide, and three times more than deaths from AIDS, tuberculosis, and malaria combined. A 2017 investigation by Orb Media found that 83% of potable tap water samples, collected from locations around the world, were contaminated with microplastics. In addition, further research into human consumption of microplastics has identified other daily sources of ingestion, such as sea salt, commonly used in cooking. The impact of microplastics on human health is still unknown; however, they have been recognized as potential vectors for persistent organic pollutants.

So, what is the value of a new field such as Planetary Health? Well, it provokes the examination of climate change through a human health lens. It spurs reflection on the ways that human health is connected to the environment and affected by political, economic and social paradigms. It challenges the public and global health scientific community to engage in interdisciplinary research. AND it challenges physicians and healthcare providers to advocate on behalf of their patients, for the development of sustainable societies through aggressive climate change mitigation strategies.

Looking Forward 

In September 2015 Canada joined countries around the world in adopting the Sustainable Development Goals (SDGs) to ‘end poverty, protect the planet, and ensure prosperity for all’. While addressing climate change is recognized within the SDGs (Goal 13), it is interwoven into the meaningful attainment of every other SDG. Without addressing climate change, ending poverty (Goal 1) or ensuring health and well-being for all (Goal 3), may be transient victories, reversed rapidly for those displaced from lands no longer habitable.

In summary, definitive, lasting attainment of the targets set out in the SDGs rests upon robust and aggressive climate change mitigation. We need increased research by the global health scientific community into the health implications of climate change and its mitigation, and increased advocacy by our physicians and healthcare providers in understanding and preventing the local and global impacts of unsustainable resource use. And to achieve this, the field of Planetary Health helps to remind us all of the inseparable link between our environment and our health.

Author: George T. Kitching

George Tjensvoll Kitching is of Canadian and Norwegian ancestry, originally from Toronto. He has completed a BSc. in Biochemistry (Dalhousie University) and a MSc. in Public Health specializing in Global Health (Norwegian University of Science and Technology). George is currently studying at Western as a first-year medical student at the Schulich School of Medicine and Dentistry, at the Windsor campus. He is interested in understanding the role of healthcare providers in mitigating the health consequences of climate change.

Photo Credit: Creative Commons, Wind Turbines