Tuesday, September 6, 2011

Silencing a mysterious cellular passenger.


Malaria is an ancient killer. For millennia it has claimed the lives of the world’s most vulnerable populations: the youngest and the poorest. Despite the human species' long history with the disease, it remains a mystery to us in many ways. We really only know the basics. Plasmodium falciparum, the parasitic microorganism responsible for the deadliest form of malaria, has an incredibly complex lifecycle. It is transmitted to humans in the saliva of a tiny Anopheles mosquito. When the mosquito bites, the parasites flow into the bloodstream. Once inside a human, the parasite goes through multiple phases, taking up residence in liver cells and red blood cells. In each phase it feeds on the resources within the human cell as it grows rapidly and replicates its cellular machinery many times. Then it suddenly divides into many new individuals, aggressively bursting out of the human cell. This parasitic amplification occurs once in a liver cell; after its forceful exit from the liver, the parasite amplifies many more times in red blood cells.

Not only does the Plasmodium undergo a very complicated lifecycle, it also carries some unusual cellular equipment: a plastid, an organelle related to the chloroplasts that perform photosynthesis in plants. What could a plastid be doing inside a human parasite? The answer has eluded researchers for twenty years. One thing we know for sure is that it’s not performing photosynthesis like its chloroplast brethren.

The plastid found in Plasmodium is called an apicoplast. Plastids are believed to have once been free-living bacteria that were gobbled up by algae roughly 1.5 billion years ago and harnessed for their photosynthetic ability. Like all complex plastids, the apicoplast found its way into the Plasmodium lineage eons ago, when an ancestor of Plasmodium engulfed a single-celled red alga that contained plastids. Over millions of years of evolution, however, the Plasmodium’s plastid passenger has lost its photosynthetic power and shipped most of its genetic information to the nucleus of the Plasmodium. In many ways the apicoplast seems like nothing more than an evolutionary relic. However, if the apicoplast serves any critical functions for the Plasmodium, it could be a key target for anti-malarial treatments.

Scientists have high hopes for apicoplast-targeted malaria treatments for one very important reason. Even though it has resided in more complex organisms for ages, the apicoplast is still of bacterial origin. In contrast, the Plasmodium, like a human cell, is a eukaryote. In practical terms, this means that Plasmodium metabolism is more similar to metabolism in human cells than it is to apicoplast metabolism. Therefore, treatments that interrupt metabolic pathways of the apicoplast are likely to leave human cells unharmed, whereas treatments that target the Plasmodium itself risk harming human cells in the same way. This leaves scientists to ponder the role of the apicoplast in the Plasmodium. What essential functions does the apicoplast perform for the Plasmodium that the Plasmodium cannot do for itself?

Both apicoplasts and human cells produce important molecules called isoprenoid precursors. The human and apicoplast versions of these molecules may look identical after synthesis, but since human cells are eukaryotic and apicoplasts are more like bacteria, the metabolic pathways that produce them are completely different. After the isoprenoid precursors are made in the apicoplast, they are shipped out into the parasite where they are used to make isoprenoids, a diverse and biologically important class of molecules. This step occurs during the blood stage of the parasite’s life cycle. Scientists believe that the synthesis and export of these isoprenoid precursors may be the only function of the apicoplast that is actually essential for parasitic growth.

In a recent publication in PLoS Biology, Drs. Ellen Yeh of Stanford and Joseph DeRisi of UCSF were able to demonstrate just how important these isoprenoid precursors are to the Plasmodium. It is known that several antibiotics are effective at combating malaria. Antibiotics attack the bacteria-like apicoplast, not the Plasmodium itself. However, their effectiveness at killing the malaria parasite suggests that functions of the apicoplast must be essential to the survival of the Plasmodium. Yeh and DeRisi took this information one step further, seeking to understand the mechanism by which these antibiotics kill the parasite.

They grew Plasmodium in a laboratory culture and treated the culture with antibiotics. As expected, the parasites stopped growing or died. Next they treated the cultures with antibiotics but added isoprenoid precursor molecules. By doing this they found that one particular molecule, isopentenyl pyrophosphate (IPP), “rescued” the Plasmodium in the culture. Even though the antibiotics killed the apicoplast, the Plasmodium survived with the addition of IPP. From this simple experiment, they were able to deduce that the essential function that the apicoplast performs for the Plasmodium is the synthesis of IPP.

Now that scientists have discovered the role of the apicoplast, they can target this metabolic function when developing new anti-malarials. In addition, through their experiment Yeh and DeRisi produced a Plasmodium strain that lacks the apicoplast. This strain will be a powerful tool for Plasmodium studies, especially for identifying apicoplast drug targets and developing more advanced vaccines. Well done, Dr. Yeh and Dr. DeRisi!

Yeh E, DeRisi JL (2011) Chemical Rescue of Malaria Parasites Lacking an Apicoplast Defines Organelle Function in Blood-Stage Plasmodium falciparum. PLoS Biol 9(8): e1001138. doi:10.1371/journal.pbio.1001138

The image comes from work by Waller, et al. (2000). This image shows the apicoplast, stained green, inside a Plasmodium during its many cellular stages in the blood phase. Notice that the apicoplast is replicated many times before the parasite divides into several new individual cells. The EMBO Journal (2000) 19, 1794 - 1802 doi:10.1093/emboj/19.8.1794

Sunday, August 28, 2011

Be a good landlord.

Antibiotic-resistant superbugs are scary, but they are not the only negative, long-term consequence of our overuse of antibiotics. Dr. Martin Blaser of the Department of Medicine at NYU recently wrote a commentary for Nature regarding the liberal use of antibiotics and its destructive impact on beneficial bacteria. Our gastrointestinal tracts provide habitat for a community of microorganisms that aid in digestion, produce vitamin K, and guard against harmful invaders. From an ecological perspective, these are mutualisms – relationships in which both organisms, the human and the bacterium, derive a benefit. This relationship should be protected. Instead, we cause irreparable damage to the community of helpful bacteria with repeated courses of antibiotics. A therapeutic dose of amoxicillin may clear up an ear infection, but not without collateral damage to these beneficial microbiota. Many people experience an upset stomach during a course of antibiotics. This is an indication that our helpful bacteria have been eliminated, but the consequences may go far beyond a tummy ache.

I spent some time in the Blaser Lab this summer where scientists and students were hard at work researching Helicobacter pylori. As Dr. Blaser explains in his essay, H. pylori was the dominant microbe in the stomachs of most people in the twentieth century. By the turn of the twenty-first century, however, fewer than 6% of children in the United States, Germany and Sweden were carrying the organism. H. pylori may have a bad rap for its connection to ulcers and stomach cancer, but its eradication has several surprising effects. For instance, people without the bacterium are more likely to develop asthma, hay fever, and skin allergies. Moreover, H. pylori helps regulate ghrelin and leptin, hormones that control appetite and metabolism, which may have implications in obesity. A dose of amoxicillin administered to treat a respiratory infection will also eliminate H. pylori in 20 – 50% of cases.

Farmers have noticed that repeated low doses of antibiotics cause animals to gain weight with less food. The Blaser lab has discovered that comparable sub-therapeutic doses cause changes in body fat and tissue composition in mice. Large doses, like those used to treat childhood infections, have similar results. Dr. Blaser goes on to emphasize the importance of age. The physiological changes that are triggered by antibiotic usage early in life are the hardest to reverse, yet the average child in the United States receives 10 – 20 courses of antibiotics before age 18.

To read more about threats to your friendly bacterial tenants and what we should do to protect them, read Dr. Blaser’s expert opinion in his essay for Nature.

Saturday, August 27, 2011

A hurricane to Tip the scale.

Tonight New York City is bracing for Hurricane Irene, and while the storm will undoubtedly deliver aggressive winds and major flooding, the level of panic is a little excessive for a fading Category 1 hurricane. Don’t get me wrong, people should take all necessary precautions, and those New Yorkers in Zone A should certainly obey evacuation orders. As Governor Christie put it, “Get the hell off the beach... you’ve maximized your tan.” I just think that come Monday, people may feel a little bit silly for clearing out the grocery stores of every last crumb of bread and drop of milk.

Although the memory of Hurricane Katrina is still fresh (even more so for fans of Treme), New York City on Monday morning will still resemble New York City, not a post-Katrina New Orleans. Let’s get some perspective. Hurricanes, or tropical cyclones as they are known worldwide, are characterized by a region of extreme low pressure at the center that is surrounded by thunderstorms, causing powerful winds and heavy rain. The lower the pressure and the stronger the winds, the more intense the storm. The Saffir-Simpson Hurricane Scale categorizes storms by wind speed alone, although central pressure is also a good indicator of hurricane strength.

Right now, Irene is hanging out about 100 miles south of Ocean City, Maryland. The average sustained wind speed is 80 mph, which makes it a Category 1 storm, and the pressure at the eye of the storm is 951 millibars. When the winds slow below 74 mph, it will be downgraded to a tropical storm. When Hurricane Katrina hit New Orleans on August 29, 2005, it was a Category 3 storm with sustained winds of 125 mph and pressure of 920 millibars.
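Those wind-speed cut-offs amount to a simple lookup. Here is a minimal sketch in Python, using the pre-2012 category thresholds that were in effect when this post was written (the exact bounds and the function name are my reading of the scale, not part of the post):

```python
def saffir_simpson_category(wind_mph):
    """Classify a tropical cyclone by its sustained wind speed in mph.

    Thresholds follow the pre-2012 Saffir-Simpson scale; the precise
    cut-offs here are an assumption for illustration.
    """
    if wind_mph < 39:
        return "tropical depression"
    if wind_mph < 74:
        return "tropical storm"
    # Check category lower bounds from strongest to weakest.
    for category, lower_bound in ((5, 156), (4, 131), (3, 111), (2, 96), (1, 74)):
        if wind_mph >= lower_bound:
            return f"Category {category}"

print(saffir_simpson_category(80))   # Irene off Maryland -> Category 1
print(saffir_simpson_category(125))  # Katrina at landfall -> Category 3
print(saffir_simpson_category(190))  # Typhoon Tip's peak -> Category 5 (the official scale tops out here)
```

Note that 190 mph still comes back as Category 5: the scale has no official Category 6, which is why Tip's ranking below is described as hypothetical.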

Katrina was the costliest tropical cyclone ever, with damage exceeding $100 billion. It also was one of the deadliest, claiming 1836 lives. But let’s talk about another storm. The most intense tropical cyclone ever recorded was Typhoon Tip in 1979. Tip formed in the northwestern Pacific Ocean, and it made landfall on Guam and southern Japan. Although it didn’t claim as many lives as Katrina, the statistics are staggering. The highest sustained winds were 190 mph over one minute and 160 mph over ten minutes. The pressure at the eye of the storm reached an unheard-of 870 millibars. It was also the largest storm ever, extending 1380 miles across and covering roughly half the area of the continental United States. Typhoon Tip had such extreme winds that it ranks as a hypothetical Category 6 on the Saffir-Simpson Scale. This means that the storm’s potential damage is beyond catastrophic.

Hurricane Irene may damage some roofs and fell trees, but we will emerge relatively unscathed. In comparison, read the description of a Category 5 hurricane below:

"People, livestock, and pets are at very high risk of injury or death from flying or falling debris, even if indoors in mobile homes or framed homes. Almost complete destruction of all mobile homes will occur, regardless of age or construction. A high percentage of frame homes will be destroyed, with total roof failure and wall collapse. Extensive damage to roof covers, windows, and doors will occur. Large amounts of windborne debris will be lofted into the air. Windborne debris damage will occur to nearly all unprotected windows and many protected windows. Significant damage to wood roof commercial buildings will occur due to loss of roof sheathing. Complete collapse of many older metal buildings can occur. Most unreinforced masonry walls will fail which can lead to the collapse of the buildings. A high percentage of industrial buildings and low-rise apartment buildings will be destroyed. Nearly all windows will be blown out of high-rise buildings resulting in falling glass, which will pose a threat for days to weeks after the storm. Nearly all commercial signage, fences, and canopies will be destroyed. Nearly all trees will be snapped or uprooted and power poles downed. Fallen trees and power poles will isolate residential areas. Power outages will last for weeks to possibly months. Long-term water shortages will increase human suffering. Most of the area will be uninhabitable for weeks or months." From the National Weather Service.

The above photograph is pretty much the reason for this whole post. Courtesy of KeystoneUSA-Zuma/Rex Features.


Too much of a good thing.

Like all living organisms, plants depend on nitrogen as an essential nutrient for growth. In natural ecosystems plants are supplied with nitrogen by microorganisms in the soil that decompose organic matter and break down large, complex organic nitrogen molecules into the smaller, usable forms nitrate and ammonium. Other microbes assist plants with their nitrogen needs by fixing atmospheric nitrogen into mineral nitrogen, or by helping plants reach distant soil nitrogen when it is in short supply.

The nitrogen cycle reliably churns along, supporting plant life and all dependent organisms, as long as all of the elements are in place. In agricultural systems, the most important part of the equation is removed when the crops are harvested. Without that plant matter returning to the ground to decompose, soils quickly become nitrogen deficient, and agricultural yield drops. To combat this, farmers must restore the lost nitrogen, and they do so with the addition of nitrogen fertilizers. Unfortunately, fertilizers are often applied in excess as farmers try to maximize yield. Far more nitrogen is added to the system than can be utilized by crops, and the excess finds its way into the atmosphere and bodies of water, where it wreaks havoc. Kilogram for kilogram, nitrous oxide has roughly 296 times the 100-year global warming potential of carbon dioxide. When nitrogen fertilizers reach the water, they promote the growth of algae, leading to massive blooms that choke off marine and aquatic life.
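To see what a warming potential of 296 means in practice: emissions are commonly expressed in carbon dioxide equivalents by multiplying the mass emitted by the gas's global warming potential. A minimal sketch (the function name is mine):

```python
# 100-year global warming potential of nitrous oxide relative to CO2,
# the value cited in the text (from the IPCC Third Assessment Report).
# GWP is defined on a per-kilogram basis.
N2O_GWP_100 = 296

def co2_equivalent_kg(n2o_kg):
    """Express an N2O emission as the mass of CO2 with equal 100-year warming effect."""
    return n2o_kg * N2O_GWP_100

# One tonne of escaped N2O warms the climate like ~296 tonnes of CO2.
print(co2_equivalent_kg(1000))  # 296000
```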

An essay by Allen Good and Perrin Beatty published in this month’s PLoS Biology draws attention to the imbalance of nitrogen fertilizer usage in different regions of the world. For instance, China uses far more nitrogen than is needed for optimal yields, yet its fertilizer use continues to rise. In contrast, the countries of sub-Saharan Africa don’t use enough nitrogen, and as a result they have nutrient-poor soils and low yields. When faced with poor water quality due to nitrogen surplus, the European Union established and implemented best nutrient management practices in 1987, resulting in a 56% decrease in usage over twenty years.

How was this achieved? Scientists conducted long term studies to determine the optimal amount of nitrogen fertilizer for each crop species in various regions of the world. Previously, farmers often applied fertilizers willy-nilly, with little consideration for the specifics of the plant species, application method, and fertilization rate. The results of these experiments showed that even in well-balanced systems, farmers can reduce the application of nitrogen fertilizer with no loss in yield.

Good and Beatty used this idea and took it a step further by quantifying the potential economic and environmental savings to be gained if fertilizer usage is reduced. First they determined the economic cost associated with the environmental damage of excess fertilizer use. Then they used fertilizer use and price projections to calculate the cost savings if nitrogen use is reduced to match the regional recommendations. All of the countries that were analyzed, which account for 74% of global fertilizer use, required either no change in nitrogen use or a reduction of 5 to 20%. Based on their analysis, Good and Beatty found that directed nutrient management strategies could achieve a total savings of $19.8 billion a year by 2020 and $56 billion a year by 2030. These values are shocking, not only because of the amount of money that is wasted through careless use of fertilizers, but also because of the magnitude of environmental damage that is incurred year after year. To learn more about the study, and to see Good and Beatty’s recommendations, you can find their essay at PLoS Biology.
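The shape of that calculation can be sketched in a few lines: for each region, the avoided cost is current use times the recommended fractional reduction times the fertilizer price. The regions, tonnages, reductions, and price below are hypothetical placeholders for illustration, not figures from Good and Beatty's analysis:

```python
# Hypothetical fertilizer price in USD per tonne of nitrogen (placeholder value).
PRICE_PER_TONNE_N = 500.0

# region: (tonnes of N applied per year, recommended fractional reduction)
# All numbers below are made up for illustration.
regional_use = {
    "region_a": (30_000_000, 0.20),
    "region_b": (12_000_000, 0.05),
    "region_c": (9_000_000, 0.00),   # already at the recommended rate
}

# Avoided spending = use x reduction x price, summed across regions.
total_savings = sum(
    tonnes * reduction * PRICE_PER_TONNE_N
    for tonnes, reduction in regional_use.values()
)
print(f"${total_savings / 1e9:.1f} billion per year")  # -> $3.3 billion per year
```

Summing region by region like this also shows why the heaviest over-users dominate the answer: a 20% cut in a very large system dwarfs everything else.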

Thursday, August 4, 2011

Universal flu care.

Designing the annual flu vaccine is not unlike playing the stock market. Each year the World Health Organization (WHO) creates a portfolio of three strains of flu, each representing a different influenza virus. This portfolio is the trivalent inactivated vaccine, the official name for your annual flu shot. The strains that are selected are predicted to be the predominant source of flu infection in the upcoming season. Inactivated forms of the viruses are combined into one shot, which gives your immune system a preview of what it can expect to fight in the upcoming months. Your body produces antibodies to those strains in advance so that it is ready to attack when flu season begins.

Year after year the scientists at WHO have had considerable success in predicting the most harmful strains of flu; however, their selection process is by no means infallible. While the scientists have plenty of data to draw upon, the decision is, at best, an educated guess. There is no way to know for sure which strains present the greatest risk, and those strains that don’t make it into the vaccine can still infect you and make you sick. Moreover, foresight is greatly limited by the speed with which the viruses evolve. Flu viruses mutate so rapidly that vaccines lose their effectiveness every year. Even after your body builds a supply of antibodies to a particular flu strain, it may fail to recognize that strain’s mutated descendants the following year.

A virus is a very simple entity consisting of a piece of genetic material (RNA, in the case of influenza) contained within a protein coat. An antibody recognizes one specific site on the virus’s surface proteins, attaching there and signaling white blood cells to attack. Usually the antibodies that develop in response to the annual flu vaccine target a highly variable site on the head of hemagglutinin, the flu virus’s major surface protein, meaning that a different antibody is needed for each strain of the virus every year. During the 2009 H1N1 pandemic, however, vaccinated patients produced a different kind of antibody. These bound to a region of the protein that is conserved among all Influenza A subtypes, including seasonal flu, avian flu, and swine flu. This high degree of conservation suggests that the site may not mutate from year to year.

Of course the antibody itself is not a vaccine, but it will inform the design of a future universal flu vaccine. Knowing the best region to target, scientists can now work toward a vaccine that triggers your immune system to produce antibodies that latch onto the same conserved site, regardless of the year or strain. Such a vaccine would eliminate the annual guesswork and protect against unexpected strains. You can learn more about this stunning development in immunology in the original publication in Science, as well as additional reporting in Nature News.

Sunday, January 9, 2011

Rumination on chemistry.

I just learned that 2011 has been designated as the International Year of Chemistry. For a science that has found stardom only among the product and pharmaceutical giants, such a declaration is a pretty big deal. Let's be honest, chemistry plays an all-important role in developing the materials we use, but it hasn't captured the public's imagination the way that biology has (I'll admit that Walter White's crystal meth production in Breaking Bad may indicate a changing tide). But now that my final biochemistry grade has been calculated and recorded, honoring the central science is a celebration that I can get on board with.

I have always found chemistry to be the most fascinating, if not the most challenging, science I have ever studied. Chemistry and I definitely have a love/hate relationship, but now that I have completed what might be my last chemistry course ever, I'm leaning heavily towards the love side. Learning that all the matter in the universe is governed by the properties of only a handful of different particles is truly an awe-inspiring, if not life-altering, realization. Chemistry doesn't just give meaning to life; it gives meaning to everything. If I dedicate a moment to thinking about that one idea, it absolutely blows me away. Indeed, I have shed a tear while reading my chemistry text, and it's not the kind of tear (rage, desperation, frustration) you may expect.

In my undergraduate chemistry courses, I had trouble grasping some of the advanced concepts simply because I hadn't moved beyond the basics. It's not that I didn't believe them or understand them, it's just that I found the basics far too amazing to simply gloss over in the first two lectures. I didn't have the chance to fully appreciate them before I had to accept them as "oh-yeah-well-duh!"-type truths. I have always thought that atomic structure, the periodic table, principles of chemical bonding, and the properties of water should have their own semester-long course. That course would be taught in an antique parlor somewhere, where students can sit in comfortable chairs under dim lighting, surrounded by candles. There would be group readings, perhaps some chanting, and lots of emotion. It would be something that inspires fits of passion, tears of joy, and students speaking in tongues. To me, chemistry is that amazing.

Maybe it's blasphemy to suggest that chemistry deserves the kind of dramatic and mystical treatment usually reserved for religion. I am among the first to argue that science and religion are incompatible. Scientific inquiry is, after all, the antithesis of faith. But chemistry is a discipline rooted in models of reality - the best possible description of that which we cannot actually see. True, models are scientific theory - they are well-tested and supported by independent inquiry, and they offer a sound explanation for natural phenomena. But models have flaws. Models are subject to refinement (plum pudding, anyone?). Indeed, such is the goal of science. To proceed with confidence in the study of chemistry, you must trust the models. You must ignore the nagging worry that somewhere in the world an innovative chemist will soon use the latest technology to debunk, or at least improve upon, the model that you will devote years, a career, your life, to elucidating. It probably won't upend the entire discipline, but it will definitely shake its core. That beautiful eureka moment drives science forward, but it can also break the individual scientist.

Forgive me for being melodramatic. I just read a recent article by Philip Ball in Nature News that reinforced this idea that science is determined to evolve and that nothing is absolutely certain. New techniques are casting doubt on the ABCs of the entire discipline - chemical bonds. No, seriously. The plastic sticks that link the plastic balls in your organic chemistry modeling kit may not actually exist. In fact, the hydrogen bond has already been redefined based on new experimental results that change the idea of electrostatic attraction (I panic - what about protein folding?!). Today's accepted model of the quantum chemical bond is based on interactions between electrons as governed by their wavefunctions, a mathematical tool used to describe the quantum state of a particle. The wavefunction cannot be solved one electron at a time, however, because the behavior of each electron depends on the behavior of its neighbors. Now here's the kicker - there is currently no exact method for computing this correlation energy among electrons, so the description of every quantum chemical bond is at best an approximation.
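For the curious, "correlation energy" has a standard definition: it is the part of the true electronic energy that a mean-field (Hartree-Fock) calculation misses,

```latex
E_{\mathrm{corr}} = E_{\mathrm{exact}} - E_{\mathrm{HF}}
```

and it is precisely this term that quantum chemists can only approximate, by methods of varying cost and accuracy.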

The existence of a chemical bond also depends on the exact time at which it is characterized. Linus Pauling, the author of the valence-bond description, said that a group of atoms can be considered bonded "when it is convenient for the chemist to consider it as an independent molecular species." Now more than ever this ambiguous definition rings true. Ultra-fast laser spectroscopy, which makes it possible to study molecules on an extremely short time-scale, has produced results suggesting that chemical bonds may not accurately characterize a molecule's structure and reactivity. Moreover, a bond between two atoms embedded within a molecule can be difficult to determine - they may simply be held in close proximity by the surrounding atoms. There is also little certainty in deciding which electrons belong to which atoms. The article reinforces the idea that molecules are ultimately just a collection of nuclei embedded in a continuous electron cloud. The interactions between atoms - the sticks in our modeling kits - are not as hard and fast as we once thought. Although chemical bonds have been taught as a fundamental concept of chemistry for decades, we are watching the idea unravel as new technology and contemporary research challenge our interpretation of the classical work.

What can we do with the theories and models that have been replaced? What becomes of the knowledge that scientists have dedicated their careers to building and that students have toiled to understand? The plum pudding model is actually pretty impressive when you realize that Thomson didn't even know about the atomic nucleus when he came up with the idea in 1904. The nucleus was discovered soon after with Geiger and Marsden's beautiful gold foil experiment in 1909. But Rutherford's planetary model of the atom, a theory born of the gold foil experiment, was subsequently improved by Niels Bohr using early quantum theory. Even the Bohr model has since been invalidated and replaced with the frustratingly complicated atomic orbital model. Though they may seem silly now, these models were not replaced because the science behind them was shoddy or careless. On the contrary, each was revolutionary and derived from elegant experimentation and theory. Scientists can only use the technology available to them to make careful leaps from existing knowledge. These limits to scientific research are better seen in hindsight.

Luckily, disproved or refined models do not decay into obsolescence; rather, they enrich the discipline. I realized this fact when I found myself able to recall the antecedent atomic models just now. Even though it may not appear on an exam, we are taught the history of chemistry in class because it honors the ground-breaking work of chemists, and it illustrates the way in which science grows from existing knowledge. Refinement of existing theories is proof that science works. Classical research is meaningful, even if it is no longer useful. As Roald Hoffmann of Cornell University says of the hotly debated valence bond and molecular orbital theories, "discarding any one of the two theories undermines the intellectual heritage of chemistry." Honoring past research gives value to modern science and motivates scientists who know that their discoveries may ultimately be bested. The beauty of science is that it is humble in the present, constantly striving to improve itself, while being deeply reverent of the past.

References: Ball, P. (2011). "Beyond the bond." Nature 469(7328): 26-28.