
The Visible Embryo

Week Ending FRIDAY May 22, 2009

Twins Born After Fertility Treatment Have a Higher Risk of Problems at Birth and in the First Three Years of Life
Twins born as a result of assisted reproductive technology (ART) are more likely to be admitted to neonatal intensive care and to be hospitalised in their first three years of life than spontaneously conceived twins, according to new research published online (Wednesday 20 May) in Europe’s leading reproductive medicine journal Human Reproduction.

It is known already that ART twins are at higher risk of problems such as low birth weight and premature delivery than singletons around the time of their birth, but, to a large extent, these risks exist as part of the problems associated with multiple births in general. Up to now there has been conflicting evidence about whether assisted reproduction itself is responsible for adding to the number of problems seen in ART twins.

To answer this question, researchers in Australia and the UK looked at perinatal outcomes and hospital admissions for all twin children born in Western Australia between 1994 and 2000, whether as a result of ART or spontaneous conception.

Twins that arise as a result of ART usually do so because two (or sometimes more) separate embryos are implanted in the woman’s womb. They are non-identical and each has its own placenta. However, twins that arise as a result of spontaneous conception can either be non-identical because two eggs have been fertilised at the same time, or identical because one fertilised egg has divided to make two embryos. Identical twins share a placenta in about two-thirds of all cases, and this is associated with an increased risk of death and other complications. In order to ensure that, as far as possible, they were comparing like for like, the researchers matched the ART twins with spontaneously conceived, non-identical twins of different sexes (referred to in the study as “unlike sex spontaneously conceived twins”, or “ULS SC twins”).

Michèle Hansen, a researcher and PhD student at the Telethon Institute for Child Health Research in Western Australia, said: “We found that twins conceived following ART treatment had a greater risk of adverse perinatal outcome, including preterm birth, low birthweight and death, compared with spontaneously conceived twins of unlike sex. ART twins had more than double the risk of perinatal death compared to ULS SC twins, although the risk was similar to that of all SC twins, including identical twins.

“ART twins stayed longer in hospital than ULS SC twins at the time of their birth: an average of 12 days compared with eight days. ART twins were four times more likely to be admitted to neo-natal intensive care than ULS SC twins, and were more likely to be admitted to hospital during the first three years of their life. After adjusting for confounding factors such as year of birth, maternal age, parity and so on, ART twins still had a nearly two-thirds higher risk of being admitted to neo-natal intensive care, and a higher risk of being admitted to hospital in their first three years of life, although this was only statistically significant in their second year, when their risk was nearly two-thirds higher.”

Ms Hansen continued: “Couples undergoing fertility treatment should be aware that, in addition to the known increased perinatal risks associated with a twin birth, ART twins are more likely than spontaneously conceived twins to be admitted to neonatal intensive care and to be hospitalised in their first three years of life.

“We don't know the reason for the increased risks of adverse perinatal outcome and hospitalisation and preliminary analysis of specific diagnoses does not provide any answers. The underlying causes of parental infertility and/or components of the ART procedure may be increasing the risks of adverse outcome, and increased concern about children born after a long period of infertility may also be contributing to their increased risk of hospitalisation. Estimates of the cost of an ART twin delivery should take into account these increased risks, and, in order to reduce the problems associated with twin births, clinicians and couples should consider the benefits of opting for single embryo transfer.”

A second study, also published online today in Human Reproduction, provides reassuring evidence on the outcome of children born after embryos were frozen and stored, before being thawed and transferred to the womb [2]. The results are good news as an increasing number of children, estimated to be 25% of ART babies worldwide, are now born after freezing or vitrification (a process similar to freezing that prevents the formation of ice crystals).

The study, led by Dr Ulla-Britt Wennerholm, an obstetrician at the Institute for Clinical Sciences, Sahlgrenska Academy (Goteborg, Sweden), reviewed the evidence from 21 controlled studies that reported on prenatal or child outcomes after freezing or vitrification.

She found that children born from embryos frozen shortly after they started to divide (early cleavage stage embryos) had obstetric outcomes (measured as preterm birth and low birth weight) that were as good as, or better than, those of children born from fresh cycles of IVF (in vitro fertilisation) or ICSI (intracytoplasmic sperm injection). Malformation rates were comparable between fresh and frozen cycles. There were limited data available for freezing of blastocysts (embryos that have developed for about five days) and for vitrification of early cleavage stage embryos, blastocysts and eggs.

“Slow freezing of embryos has been used for 25 years and data concerning infant outcome seem reassuring, with even higher birthweights and lower rates of preterm birth and low birthweight than in children born after fresh IVF/ICSI. For the newly introduced technique of vitrification of blastocysts and oocytes, very limited data have been reported on obstetric and neonatal outcomes. This emphasises the urgent need for properly controlled follow-up studies of neonatal outcomes and a careful assessment of the evidence currently available before these techniques are added to daily routines. In addition, long-term follow-up studies are needed for all cryopreservation techniques,” concluded Dr Wennerholm.



Gene Therapy Could Expand Stem Cells' Promise

Once placed into a patient's body, stem cells intended to treat or cure a disease could end up wreaking havoc simply because they are no longer under the control of the clinician

But gene therapy has the potential to solve this problem, according to a perspective article from physician-scientists at NewYork-Presbyterian Hospital/Weill Cornell Medical Center published in a recent issue of the journal Cell Stem Cell. The paper details strategies for genetically modifying stem cells prior to transplantation in order to ensure their safety.

"Stem cell therapy offers enormous potential to treat and even cure serious diseases. But wayward stem cells can turn into a runaway train without a conductor. This is an issue that can be dealt with and we have the technology to do that in the form of gene therapy," says senior author Dr. Ronald G. Crystal, chief of the Division of Pulmonary and Critical Care Medicine at NewYork-Presbyterian Hospital/Weill Cornell Medical Center, and the Bruce Webster Professor of Internal Medicine and Professor of Genetic Medicine at Weill Cornell Medical College.

Stem cells have the capacity to differentiate into any of the different tissues making up the human body, thus holding the promise of treating or curing diseases such as multiple sclerosis or spinal-cord injury by replacing diseased cells with healthy cells.

But unlike other therapies such as chemotherapy, antibiotics or aspirin, stem cells have no expiration date, and that poses a real problem.

"Almost all therapeutics we use have a half life. They only last a certain amount of time," Dr. Crystal says. "Stem cells are the opposite. Once the future stem cell therapist does the therapy, stem cells have the innate potential to produce more cells."

The challenge takes on even more urgency with recent developments, including a federal administration now more open to exploring the potential of stem cells, the recent FDA approval of a human trial involving embryonic stem cells, as well as the reported case of a young boy who developed a brain tumor four years after receiving a stem-cell treatment for a rare genetic disorder.

As evidenced by this boy's experience, one of the biggest potential problems with stem cell therapy is the development of tumors.

But there are other problems as well.

Stem cells directed to become beating heart cells might mistakenly end up in the brain. Or insulin-producing beta cells that cannot be switched off might leave the body unable to regulate its insulin levels.

"You've totally lost control," Dr. Crystal says. "What do you do?"

The best chance of circumventing these issues is genetic modification of the stem cells prior to actually transplanting them, Dr. Crystal says. Theoretically, this is similar to how gene therapy is used to treat cancer, but with important improvements.

"Instead of gene therapy being done in the patient, as is the case in cancer, it's being done in the cells in a laboratory before doctors use them for therapy so that they still have control of these cells," Dr. Crystal explains.

Therapists would rig certain genes to respond to a "remote control" signal. For instance, giving a certain drug could prompt a "suicide" gene to kill a budding tumor.

But gene therapy also needs to be carefully done and, ideally, two independent gene-manipulation systems would be used to ensure that stem cells remain firmly under the control of clinicians.




BPA, Chemical Used to Make Plastics, Found to Leach from Polycarbonate Drinking Bottles Into Humans
A new study from Harvard School of Public Health (HSPH) researchers found that participants who drank for a week from polycarbonate bottles, the popular hard-plastic drinking bottles and baby bottles, showed a two-thirds increase in the level of the chemical bisphenol A (BPA) in their urine

Exposure to BPA, used in the manufacture of polycarbonate and other plastics, has been shown to interfere with reproductive development in animals and has been linked with cardiovascular disease and diabetes in humans. The study is the first to show that drinking from polycarbonate bottles increased the level of urinary BPA, and thus suggests that drinking containers made with BPA release the chemical into the liquid that people drink in sufficient amounts to increase the level of BPA excreted in human urine.

The study appears on the website of the journal Environmental Health Perspectives and is freely available at http://www.ehponline.org/members/2009/0900604/0900604.pdf.

In addition to polycarbonate bottles, which are refillable, popular among students, campers and others, and also used as baby bottles, BPA is found in dentistry composites and sealants and in the lining of aluminum food and beverage cans. (In bottles, polycarbonate can be identified by the recycling number 7.) Numerous studies have shown that it acts as an endocrine disruptor in animals, with effects including early onset of sexual maturation, altered development and tissue organization of the mammary gland, and decreased sperm production in offspring. It may be most harmful in the stages of early development.

"We found that drinking cold liquids from polycarbonate bottles for just one week increased urinary BPA levels by more than two-thirds. If you heat those bottles, as is the case with baby bottles, we would expect the levels to be considerably higher. This would be of concern since infants may be particularly susceptible to BPA's endocrine-disrupting potential," said Karin B. Michels, associate professor of epidemiology at HSPH and Harvard Medical School and senior author of the study.

The researchers, led by first author Jenny Carwile, a doctoral student in the department of epidemiology at HSPH, and Michels, recruited Harvard College students for the study in April 2008. The 77 participants began the study with a seven-day "washout" phase in which they drank all cold beverages from stainless steel bottles in order to minimize BPA exposure. Participants provided urine samples during the washout period. They were then given two polycarbonate bottles and asked to drink all cold beverages from the bottles during the next week; urine samples were also provided during that time.

The results showed that the participants' urinary BPA concentrations increased 69% after drinking from the polycarbonate bottles. (The study authors noted that BPA concentrations in the college population were similar to those reported for the U.S. general population.) Previous studies had found that BPA could leach from polycarbonate bottles into their contents; this study is the first to show a corresponding increase in urinary BPA concentrations in humans.
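To make concrete how a percentage increase like that 69% figure is computed, here is a minimal sketch in Python; the urine concentration values below are hypothetical illustrations, and the actual study compared geometric mean concentrations and adjusted for other factors.

# Simplified before/after comparison, loosely mirroring the study design described above.
# The concentration values are hypothetical; they are not the study's data.
washout_week = [1.1, 1.3, 0.9, 1.2]        # urinary BPA (micrograms/L) during the stainless-steel week
polycarbonate_week = [1.9, 2.1, 1.5, 2.0]  # urinary BPA (micrograms/L) during the polycarbonate week

mean_before = sum(washout_week) / len(washout_week)
mean_after = sum(polycarbonate_week) / len(polycarbonate_week)
percent_increase = (mean_after - mean_before) / mean_before * 100
print(f"Mean urinary BPA rose by about {percent_increase:.0f}%")   # ~67% with these made-up numbers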

One of the study's strengths, the authors note, is that the students drank from the bottles in a normal use setting. Additionally, the students did not wash their bottles in dishwashers nor put hot liquids in them; heating has been shown to increase the leaching of BPA from polycarbonate, so BPA levels might have been higher had students drunk hot liquids from the bottles.

Canada banned the use of BPA in polycarbonate baby bottles in 2008 and some polycarbonate bottle manufacturers have voluntarily eliminated BPA from their products. With increasing evidence of the potential harmful effects of BPA in humans, the authors believe further research is needed on the effects of BPA on infants, as well as on reproductive disorders and breast cancer in adults.

"This study is coming at an important time because many states are deciding whether to ban the use of BPA in baby bottles and sippy cups. While previous studies have demonstrated that BPA is linked to adverse health effects, this study fills in a missing piece of the puzzle-whether or not polycarbonate plastic bottles are an important contributor to the amount of BPA in the body," said Carwile.

The study was supported by the Harvard University Center for the Environment and the National Institute of Environmental Health Sciences Biological Analysis Core, Department of Environmental Health, HSPH. Carwile was also supported by the Training Program in Environmental Epidemiology.



Back to Basics: Scientists Discover a Fundamental Mechanism for Cell Organization
Scientists have discovered that cells use a very simple phase transition - similar to water vapor condensing into dew - to assemble and localize subcellular structures that are involved in formation of the embryo.

The discovery, which was made during the 2008 Physiology course at the Marine Biological Laboratory (MBL), is reported in the May 21 early online edition of Science by Clifford P. Brangwynne and Anthony A. Hyman of the Max Planck Institute for Molecular Cell Biology and Genetics in Dresden, Germany, and their colleagues, including Frank Jülicher of the Max Planck Institute for the Physics of Complex Systems, also in Dresden.

Working with the worm C. elegans, the scientists found that subcellular structures called P granules, which are thought to specify the “germ cells” that ultimately give rise to sperm or eggs, are liquid droplets that transition between dissolved and condensed states. In newly fertilized one-cell embryos, the P granules are dissolved throughout the cell, like water droplets at high temperature. But prior to the first cell division, the P granules rapidly condense at one end of the cell, as if the temperature were suddenly lowered there. The progenitor germ cell subsequently forms where the P granules have condensed.

“This kind of phase transition could potentially be working for many other subcellular structures similar to P granules,” Brangwynne says. P granules are ribonucleoprotein assemblies (RNPs), and a given cell may contain dozens of different types of RNPs.

“It is interesting to think about this in the context of evolution and the origin of life,” he says. “What we have found is that, in some cases, simple physical-chemical mechanisms, such as a classic phase transition, give rise to subcellular structure…This is likely the kind of thing that happened in the so-called primordial soup; but it's not surprising that even highly evolved cells continue to take advantage of such mechanisms.”

The insight emerged when Brangwynne, a biophysicist who was a teaching assistant in the MBL Physiology course, watched a movie of P granules fusing that had been made by a student in the course, David Courson of the University of Chicago. “We were looking at that and thinking, man, that looks exactly like two liquid droplets fusing,” Brangwynne says. They began making measurements of liquid-type behaviors in P granules, and made the first estimates of P granule viscosity and surface tension. By the end of the course they were “90 percent sure” that P granules are liquid droplets that localize in the cell by controlled dissolution and condensation, a concept that Brangwynne further confirmed after he returned to Dresden.

Brangwynne credits the discovery to the “dynamic nature” of the MBL Physiology course, where scientists from different fields (biology, physics, computer science) work intensively together on major research questions in cell biology. In addition to Courson, the other co-authors of the Science paper who were in the Physiology course are Hyman and Jülicher, who were Physiology faculty members, and Jöbin Gharakhani, who was a teaching assistant. The paper also credits Physiology course co-director Tim Mitchison for valuable discussions.

“There are so many molecules in the cell, and we are coming out of the age of cataloguing them all, which was critical, to find out who the players are,” Brangwynne says. “Now we are putting it all together. What are the principles that come out of these complex interactions (between molecules)? In the end, it may be relatively simple principles that help us understand what is really happening.”


THURSDAY May 21, 2009

Smoking in Pregnancy Likely to Produce Children Who Smoke
Children of mothers who smoked during pregnancy and their early childhood years may be predisposed to take up smoking as teens and young adults, compounding the physical damage they sustained from the smoke exposure

"It is well-known that maternal smoking influences a developing fetus in myriad ways, contributing to low birth weight, premature birth and a host of other health problems after birth," said Roni Grad, M.D., associate professor of clinical pediatrics at the University of Arizona College of Medicine. "Previous studies have suggested that maternal smoking during pregnancy may increase the risk of the offspring becoming regular smokers as adults, but the impact of postnatal cigarette smoke exposure was hard to differentiate from prenatal exposure."

The study results will be presented on Tuesday, May 19, at the American Thoracic Society's 105th International Conference in San Diego.

To determine the impact of maternal smoking during pregnancy and early childhood on the smoking behavior of the offspring as young adults, the researchers used data from the Tucson Children's Respiratory Study. Maternal smoking reported during pregnancy and when the child was nine days, 1.5 months and 1.5 years old was used to assess smoke exposure during pregnancy and the early life of the child. Maternal smoking was further assessed at ages six, nine and eleven years to evaluate smoke exposure during the school-age years of the child. The smoking behavior of the offspring was then assessed at ages 16 and 22 years.

The researchers found that maternal smoking during pregnancy and the early childhood years was associated with the offspring being regular smokers at the age of 22, independent of whether the mother smoked during the school-age years of the child. Furthermore, of all the offspring who had ever smoked, those whose mothers smoked during pregnancy and early life were less likely to quit than those whose mothers had never smoked or had taken up the habit only when the child reached the school-age years. Finally, the impact of early maternal smoking was independent of the effect of paternal smoking and of exposure to peer smoking during the offspring's adolescence. The greatest impact on the smoking behavior of the offspring as young adults was linked to .

"Smoking during pregnancy by mothers who stopped smoking by the time the child reached the school age years is a risk factor for smoking in their offspring during early adulthood," said Dr. Grad. "The data suggest that a biological effect is in play, and that eliminating maternal smoking during pregnancy and the preschool years of the child will reduce the risk of her children becoming regular smokers in adulthood. In children of mothers who did smoke during this critical period, it is important to prevent experimentation with tobacco during the adolescent years."



Ghrelin Hormone Increases Appetite AND Abdominal Fat

The ghrelin hormone not only stimulates the brain giving rise to an increase in appetite, but also favours the accumulation of lipids in visceral fatty tissue, located in the abdominal zone and considered to be the most harmful

The Metabolic Research Laboratory of the University Hospital of Navarra, Spain, published their conclusions recently in the International Journal of Obesity.

Ghrelin is a hormone produced in the stomach which tells the brain that the body has to be fed. Thus, the level of ghrelin increases before eating and decreases after. It is known to be important in the development of obesity, explained Ms Amaia Rodríguez Murueta-Goyena, doctor in biology and main researcher on the study.

However, researchers at the University Hospital of Navarra have also discovered that, besides stimulating the hypothalamus to generate appetite, ghrelin acts directly on visceral fatty tissue. They observed how this hormone favours the accumulation of lipids there by causing the over-expression of the genes that take part in the retention of lipids, explained Ms Rodríguez.

It is precisely this accumulated abdominal fat that is deemed to be most harmful, as visceral obesity is related to higher blood pressure and/or type 2 diabetes. Abdominal fat is in direct contact with the liver, leading to the formation of liver fat and increasing the risk of developing insulin resistance. Through its association with hypertension, high levels of triglycerides, insulin resistance and hypercholesterolemia, visceral fat promotes metabolic syndrome, the researcher pointed out.

Ghrelin occurs in acylated (with an octanoic acid group attached) and deacylated forms, according to Ms Rodríguez. Previously it was thought that only the acylated form was active in the process of weight gain, but many studies now point to both forms being biologically functional.

Future development of pharmaceutical drugs
The discovery of the twin actions of ghrelin opens the door to future treatment for obesity. The full function of a hormone must be known in order to design effective pharmaceuticals.

Ms Rodríguez points out that the acylated form of ghrelin in the blood increases amongst obese persons, particularly those suffering from diabetes. Obese persons with diabetes have a greater tendency to accumulate visceral fat than obese persons with normal glycemic levels.



Research Team Finds Important Role for Junk DNA
Scientists have called it "junk DNA."

They have long been perplexed by these extensive strands of genetic material that dominate the genome but seem to lack specific functions. Why would nature force the genome to carry so much excess baggage?

Scientists are beginning to find, however, that much of this so-called junk plays important roles in the regulation of gene activity. No one yet knows how extensive that role may be.

Now researchers from Princeton University and Indiana University who have been studying the genome of a pond organism have found that junk DNA may not be so junky after all. They have discovered that DNA sequences from regions of what had been viewed as the "dispensable genome" are actually performing functions that are central for the organism. They have concluded that the genes spur an almost acrobatic rearrangement of the entire genome that is necessary for the organism to grow.

It all happens very quickly. Genes called transposons in the single-celled pond-dwelling organism Oxytricha produce cell proteins known as transposases. During development, the transposons appear to first influence hundreds of thousands of DNA pieces to regroup. Then, when no longer needed, the organism cleverly erases the transposases from its genetic material, paring its genome to a slim 5 percent of its original load.

In order to prove that the transposons have this reassembly function, the scientists disabled several thousand of these genes in some Oxytricha. The organisms with the altered DNA, they found, failed to develop properly.

The study was led by Laura Landweber, a professor of ecology and evolutionary biology at Princeton. Other authors from Princeton's Department of Ecology and Evolutionary Biology include: postdoctoral fellows Mariusz Nowacki and Brian Higgins; 2006 alumna Genevieve Maquilan; and graduate student Estienne Swart. Former Princeton postdoctoral fellow Thomas Doak, now of Indiana University, also contributed to the study.

Landweber and other members of her team are researching the origin and evolution of genes and genome rearrangement, with particular focus on Oxytricha because it undergoes massive genome reorganization during development.

In her lab, Landweber studies the evolutionary origin of novel genetic systems such as Oxytricha's. By combining molecular, evolutionary, theoretical and synthetic biology, Landweber and colleagues last year discovered an RNA (ribonucleic acid)-guided mechanism underlying its complex genome rearrangements.

"Last year, we found the instruction book for how to put this genome back together again - the instruction set comes in the form of RNA that is passed briefly from parent to offspring and these maternal RNAs provide templates for the rearrangement process," Landweber said. "Now we've been studying the actual machinery involved in the process of cutting and splicing tremendous amounts of DNA. Transposons are very good at that."

The term "junk DNA" was originally coined to refer to a region of DNA that contained no genetic information.

Instead, scientists now sometimes refer to these regions as "selfish DNA" if they make no specific contribution to the reproductive success of the host organism. Like a computer virus that copies itself ad nauseam, selfish DNA replicates and passes from parent to offspring for the sole benefit of the DNA itself.

The present study suggests that some selfish DNA transposons can instead play an important role for their hosts, thereby establishing themselves as long-term residents of the genome.



Area of Brain that Makes You a 'People Person' - Or Not
Cambridge University researchers have discovered that whether someone is a ‘people-person’ may depend on the structure of their brain: the greater the concentration of brain tissue in certain parts of the brain, the more likely they are to be a warm, sentimental person

Why is it that some of us really enjoy the company of others while some people are detached and independent? In an effort to explore these questions, Maël Lebreton and colleagues from the Cambridge Department of Psychiatry, in collaboration with Oulu University, Finland, examined the relationship between personality and brain structure in 41 male volunteers.

The volunteers underwent a brain scan using Magnetic Resonance Imaging (MRI). They also completed a questionnaire that asked them to rate themselves on items such as 'I make a warm personal connection with most people', or 'I like to please other people as much as I can'. The answers to the questionnaire provide an overall measure of emotional warmth and sociability called social reward dependence.

The researchers then analysed the relationship between social reward dependence and the concentration of grey matter (brain-cell containing tissue) in different brain regions. They found that the greater the concentration of tissue in the orbitofrontal cortex (the outer strip of the brain just above the eyes), and in the ventral striatum (a deep structure in the centre of the brain), the higher they tended to score on the social reward dependence measure. The research is published in the European Journal of Neuroscience.

Dr Graham Murray, who is funded by the Medical Research Council and who led the research, said: "Sociability and emotional warmth are very complex features of our personality. This research helps us understand at a biological level why people differ in the degrees to which we express those traits." But he cautioned, "As this research is only correlational and cross-sectional, it cannot prove that brain structure determines personality. It could even be that your personality, through experience, helps in part to determine your brain structure."

Interestingly, the orbitofrontal cortex and ventral striatum have previously been shown to be important for the brain's processing of much simpler rewards like sweet tastes or sexual stimuli.

Dr Murray explained: "It's interesting that the degree to which we find social interaction rewarding relates to the structure of our brains in regions that are important for very simple biological drives such as food, sweet liquids and sex. Perhaps this gives us a clue to how complex features like sentimentality and affection evolved from structures that in lower animals originally were only important for basic biological survival processes."

The research could also lead to new insights into psychiatric disorders where difficulties in social interaction are prominent, such as autism or schizophrenia.

"Patients with certain psychiatric conditions often experience difficulties in feeling emotional closeness, and this can have a big impact on their life. It could be that the cause of these difficulties is at least partly due to brain structural features of those disorders," said Dr Murray.



Why Do People with Down Syndrome Have Less Cancer?
Research in mice and human stem cells suggests new therapeutic targets

Most cancers are rare in people with Down syndrome, whose overall cancer mortality is below 10 percent of that in the general population. Since they have an extra copy of chromosome 21, it's been proposed that people with Down syndrome may be getting an extra dose of one or more cancer-protective genes. The late cancer researcher Judah Folkman, MD, founder of the Vascular Biology Program at Children's Hospital Boston, popularized the notion that they might be benefiting from a gene that blocks angiogenesis, the development of blood vessels essential for cancer's growth, since their incidence of other angiogenesis-related diseases like macular degeneration is also lower. A study from Children's confirms this idea in mice and human cells and identifies specific new therapeutic targets for treating cancer.

Publishing online May 20 in the journal Nature, cancer researcher Sandra Ryeom, PhD, and colleagues from Children's Vascular Biology Program show that a single extra copy of Dscr1 (one of the 231 genes on chromosome 21 affected by trisomy, with three copies rather than two) is sufficient to significantly suppress angiogenesis and tumor growth in mice, as well as angiogenesis in human cells. The team also found its protein, DSCR1, to be elevated in tissues from people with Down syndrome and in a mouse model of the disease.

Further study confirmed that DSCR1 acts by suppressing signaling by the angiogenesis-promoting protein vascular endothelial growth factor (VEGF). In a mouse model of Down syndrome, endothelial cells (which make up blood vessel walls) showed a decreased growth response to VEGF when they had an extra copy of Dscr1. An extra copy of another chromosome 21 gene, Dyrk1A, also appeared to decrease cells' response to VEGF.

Finally, Ryeom and colleagues showed that these extra genes suppress VEGF signaling via a specific signaling pathway inside endothelial cells -- the calcineurin pathway. Until now, Ryeom says, little has been known about the internal pathways VEGF activates once it binds to cellular receptors; most existing anti-VEGF drugs work by simply binding to VEGF (like Avastin) or blocking its ability to bind to its cellular receptors.

"We're now moving further downstream by going inside the cell," Ryeom says. "When we targeted calcineurin, we suppressed the ability of endothelial cells to grow and form vessels. While it's likely not the only pathway that's involved, if you take it out, VEGF is only half as effective."

Ryeom and her group next validated the mouse findings in human cells. In collaboration with George Daley, MD, PhD, and colleagues in the Stem Cell program at Children's, she worked with induced pluripotent stem cells (iPS cells) created from skin cells from a patient with Down syndrome - one of 10 disease-specific lines recently developed in Daley's lab.

Knowing that iPS cells tend to induce tumors known as teratomas when inserted into mice, Ryeom guessed that teratomas grown from iPS cells with an extra chromosome 21 would have far fewer blood vessels than teratomas from iPS cells with the normal number of chromosomes. She was right: blood vessels budded in the Down teratomas, but never fully formed.

"The studies in the iPS cells helped validate and confirm that the suppression of angiogenesis that we saw in mouse models also holds true in humans," says Ryeom. "It suggests that these two genes might point to a viable cancer therapy."

Ryeom's group has identified which part of the DSCR1 protein blocks calcineurin and is testing to see whether that fragment can be delivered specifically to endothelial cells, to prevent them from forming new blood vessels, without affecting calcineurin pathways in other cells and causing side effects. "Immunosuppressive drugs also target calcineurin in T-cells," Ryeom notes. "We think that Dscr1 blocks calcineurin specifically in endothelial cells, without affecting T-cells, but we need to check."

Folkman's interest in why patients with Down syndrome have such a reduced risk for cancer focused on endostatin, an anti-angiogenic compound made by the body. Discovered in the Folkman lab, endostatin is a fragment of collagen 18 - whose gene is also on chromosome 21. People with Down syndrome reportedly have almost doubled levels of endostatin because of the extra copy of the gene.

"I think there may be four or five genes on chromosome 21 that are necessary for angiogenesis suppression," says Ryeom. "In huge databases of cancer patients with solid tumors, there are very few with Down syndrome. This suggests that protection from chromosome 21 genes is pretty complete."


WEDNESDAY May 20, 2009

Excessive Cola Consumption Can Lead to Super-Sized Muscle Problems Warn Doctors
Doctors have issued a warning about excessive cola consumption after noticing an increase in the number of patients suffering from muscle problems, according to the June issue of IJCP, the International Journal of Clinical Practice

"We are consuming more soft drinks than ever before and a number of health issues have already been identified including tooth problems, bone demineralisation and the development of metabolic syndrome and diabetes" says Dr Moses Elisaf from the Department of Internal Medicine at the University of Ioannina, Greece.
"Evidence is increasing to suggest that excessive cola consumption can also lead to hypokalaemia, in which the blood potassium levels fall, causing an adverse effect on vital muscle functions."

A research review carried out by Dr Elisaf and his colleagues has shown that symptoms can range from mild weakness to profound paralysis. Luckily all the patients studied made a rapid and full recovery after they stopped drinking cola and took oral or intravenous potassium.

The case studies looked at patients whose consumption ranged from two to nine litres of cola a day.

They included two pregnant women who were admitted with low potassium levels.

The first, a 21 year-old woman, was consuming up to three litres of cola a day and complained of fatigue, appetite loss and persistent vomiting. An electrocardiogram also revealed she had heart block, while blood tests showed she had low potassium levels.

The second also had low potassium levels and was suffering from increasing muscular weakness. It turned out she had been drinking up to seven litres of cola a day for the last 10 months.

In a commentary on the paper, Dr Clifford Packer from the Louis Stokes Cleveland VA Medical Centre in Ohio relates the strange case of the ostrich farmer who returned from the Australian outback with muscle weakness. He had been drinking four litres of cola a day for the last three years and drank up to 10 litres a day when he was in the outback, causing a rapid reduction in his potassium levels.

He also relates a puzzling case he saw in his own clinical practice, which was solved when the patient turned up at his office with a two-litre bottle of cola in the basket of his electric scooter. It turned out he routinely drank up to four litres a day. He refused to stop drinking cola, but halved his consumption and the muscle weakness he had been complaining of improved.

In 2007 the worldwide annual consumption of soft drinks reached 552 billion litres, the equivalent of just under 83 litres per person per year, and this is projected to increase to 95 litres per person per year by 2012. However the figure has already reached an average of 212 litres per person per year in the United States.
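As a rough sanity check of that per-person figure, the short Python sketch below divides the reported worldwide total by an assumed 2007 world population of about 6.65 billion; the population number is our assumption, not given in the article.

# Rough per-capita check of the consumption figures quoted above.
total_litres_2007 = 552e9          # worldwide annual soft drink consumption, litres (from the article)
world_population_2007 = 6.65e9     # assumed 2007 world population
litres_per_person = total_litres_2007 / world_population_2007
print(f"Approximately {litres_per_person:.0f} litres per person per year")   # ~83, matching the figure quoted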

It appears that hypokalaemia can be caused by excessive consumption of three of the most common ingredients in cola drinks – glucose, fructose and caffeine.

"The individual role of each of these ingredients in the pathophysiology of cola-induced hypokalaemia has not been determined and may vary in different patients" says Dr Elisaf. "However in most of the cases we looked at for our review, caffeine intoxication was thought to play the most important role. This has been borne out by case studies that focus on other products that contain high levels of caffeine but no glucose or fructose. "Despite this, caffeine free cola products can also cause hypokalaemia because the fructose they contain can cause diarrhoea."

The authors argue that in an era when portion sizes are becoming bigger and bigger, the excessive consumption of cola products has real public health implications. "Although most patients recover when they stop drinking cola and take potassium supplements, cola-induced chronic hypokalaemia can make them more susceptible to potentially fatal complications, such as an irregular heartbeat" says Dr Elisaf. "In addition, excessive consumption of any kind of cola can lead to a range of health problems including fatigue, loss of productivity and muscular symptoms that vary from mild weakness to profound paralysis. We believe that further studies are needed to establish how much is too much when it comes to the daily consumption of cola drinks."

Dr Packer agrees that the problem needs to be addressed.
"Cola drinks need to be added to the physician's checklist of drugs and substances that can cause hypokalaemia" he says. "And the soft drink industry needs to promote safe and moderate use of its products for all age groups, reduce serving sizes and pay heed to the rising call for healthier drinks."


Caltech, UCSF Scientists Determine How Body Differentiates Between a Scorch and a Scratch

You can tell without looking whether you've been stuck by a pin or burnt by a match. But how?

In research that overturns conventional wisdom, a team of scientists from the California Institute of Technology (Caltech) and the University of California, San Francisco (UCSF), has shown that this sensory discrimination begins in the skin at the very earliest stages of neuronal information processing, with different populations of sensory neurons--called nociceptors--responding to different kinds of painful stimuli.

Their findings were published this week in the early online edition of the Proceedings of the National Academy of Sciences (PNAS).

"Conventional wisdom was that the nociceptive neurons in the skin can't tell the difference between heat and mechanical pain, like a pin prick," says David Anderson, Seymour Benzer Professor of Biology, a Howard Hughes Medical Institute (HHMI) Investigator, and one of the paper's lead authors. "The idea was that the skin is a dumb sensor of anything unpleasant, and that higher brain areas disentangle one pain modality from another, to tell you if you've been scorched or scratched."

This conventional wisdom came from recording the electrical responses of nociceptive neurons, where it was shown that these neurons are capable of sensing pretty much every kind of painful stimulus--from pin pricks to heat to cold. But this, Anderson notes, was not sufficient to understand the control of pain-avoidance behavior. "We were asking the cells what the cells can sense, not asking the animal what the cells can sense," he explained.

And so Anderson and coprincipal investigator Allan Basbaum, chair of the Department of Anatomy at UCSF, decided to ask the animal. To do so, they created a genetically engineered mouse in which specific populations of pain-sensing neurons can be selectively destroyed. They were then able to see if the mouse continued to respond to different types of stimuli by pulling its paw away when exposed to a relatively gentle heat source or poked with a nylon fishing line.

What the researchers found was that, when they killed off a certain population of nociceptor neurons, the mice stopped responding to being poked, but still responded to heat. Conversely, when the researchers injected a toxin to destroy a different population of neurons, the mice stopped responding to heat, but their sense of poke remained intact.

"This tells us that the fibers that mediate the response to being poked are neither necessary nor sufficient for a behavioral response to heat," Anderson explains, "and vice versa for the fibers that mediate the response to heat."

In addition, Anderson notes, neither of these two classes of sensory neurons seems to be required for responding to a painful cold stimulus, like dry ice. Research into pinpointing that population of cells is ongoing.

"This tells us that the discernment of different types of painful stimuli doesn't happen only in the brain--it starts in the skin, which is therefore much smarter than we thought," says Anderson. "That's a pretty heretical point of view."

It's also a potentially useful point of view, as Anderson points out. "If doctors want to repair or replace damaged nerve fibers in conditions such as diabetic neuropathy," he explains, "they need to make sure they're replacing the right kind of nerve fibers."

In addition to Anderson, the paper's coauthors include graduate student Daniel Cavanaugh from UCSF, postdoctoral scholar Hyosang Lee and HHMI Research Specialist Liching Lo from Caltech, Shannon Shields from UCSF (now at the Hospital Nacional de Paraplejicos in Toledo, Spain), and Mark Zylka, a former postdoctoral fellow at Caltech now on the faculty at the University of North Carolina, Chapel Hill.

Work on the PNAS paper, "Distinct subsets of unmyelinated primary sensory fibers mediate behavioral responses to noxious thermal and mechanical stimuli," was funded by grants from the National Institutes of Health, the National Alliance for Research on Schizophrenia and Affective Disorders, the Searle Scholars Program, the Whitehall, Klingenstein, Sloan and Rita Allen Foundations, the Christopher and Dana Reeve Foundation, and the Howard Hughes Medical Institute.



International Team Tracks Clues to HIV
Rice University's Andrew Barron and his group, working with labs in Italy, Germany and Greece, have identified specific molecules that could block the spread of the deadly virus by taking away its ability to bind with other proteins

Using computer simulations, researchers tested more than 100 carbon fullerene, or C-60, derivatives initially developed at Rice for other purposes to see if they could be used to inhibit a key viral enzyme, HIV-1 protease (HIV-1 PR), by attaching themselves to its binding pocket.

"There are a lot of people doing this kind of research, but it tends to be one group or one pharmaceutical company taking a shotgun approach -- make a molecule and try it out, then make another molecule and try it out," said Barron, Rice's Charles W. Duncan Jr.-Welch Professor of Chemistry and professor of materials science. "This is interesting because we're tackling an important problem in a very rational way."

The groups reported their findings in a paper published on the American Chemical Society's Journal of Chemical Information and Modeling Web site last week.

Their method of modeling ways to attack HIV may not be unique, but their collaboration is. Research groups from five institutions -- two in Greece, one in Germany, one in Italy and Barron's group at Rice -- came together through e-mail contacts and conversations over many months, each working on facets of the problem. "Not all the groups have ever met in person," Barron said. Most remarkable, he said, is that their research to date has been completely unfunded.

Using simulations to narrow down a collection of fullerenes to find the good ones is "the least time-consuming low-cost procedure for efficient, rational drug design," the team wrote.

"A long time ago, people noticed that C-60 fits perfectly into the hydrophobic pocket in HIV, and it has an inhibition effect," Barron said. "It's not particularly strong, but there's potentially a very strong binding effect. The problem is, it's not the perfect unit." The objective was to find an existing fullerene derivative molecule that could be easily modified to become the perfect unit.

Rice got involved, he said, "because we make the molecules and the other guys had a great method for in-silico testing of molecules. They approached us and said, 'Do you think we could use some of these?' Then we started bouncing ideas around.

"We began thinking about a very simple experiment to calculate the binding efficiency of a molecule in the HIV pocket, then calculate that for a series of molecules, decide which one is best, make that molecule in real life and see if it correlates," Barron said. "If it does, then you've got a way to design your ultimate molecule. Our work was the first step in the process."

In fact, through their "in-silico," or computer-based, calculations, they found two good fits among the fullerene derivatives tested and are now working to enhance their binding properties to get that perfect molecule, one that sticks "like Velcro" to the virus and can be fine-tuned for various strains.

"This is just one component of the problem -- we're not going to cure HIV," Barron cautioned. The hope, he said, is to develop a method for the rapid creation of drugs to address various strains of HIV and other diseases.

Authors of the paper with Barron were Manthos Papadopoulos of the National Hellenic Research Foundation, Athens; Serdar Durdagi of the National Hellenic Research Foundation and the Freie Universitat, Berlin; Claudiu Supuran of the University of Florence, Italy; T. Amanda Strom, Nadjmeh Doostdar and Mananjali Kumar of Rice; and Thomas Mavromoustakos of the National Hellenic Research Foundation and the University of Athens.

The impromptu nature of the project intrigued Barron as much as the subject itself. "Here you've got computational people, experimental people, synthesis people, characterization people who've come together naturally as a collaboration and developed this protocol, developed their own methodologies.

"And no one's paid us to collaborate. Serdar Durdagi’s graduate fellowship was funded by the European Union. The fellowships of Rice graduate students Amanda Strom, Nadjmeh Doostdar and Mananjali Kumar were funded, in part, by Rice's Center for Biological and Environmental Nanotechnology. This is purely an academic collaboration." He said the group is working on a second paper and seeking funding to expand the project.




Evidence for Epigenetic Inheritance in a Wide Range of Species

For years, genes have been considered the one and only way biological traits could be passed down through generations of organisms.

Not anymore.

Increasingly, biologists are finding that non-genetic variation acquired during the life of an organism can sometimes be passed on to offspring—a phenomenon known as epigenetic inheritance. An article forthcoming in the July issue of The Quarterly Review of Biology lists over 100 well-documented cases of epigenetic inheritance between generations of organisms, and suggests that non-DNA inheritance happens much more often than scientists previously thought.

Biologists have suspected for years that some kind of epigenetic inheritance occurs at the cellular level. The different kinds of cells in our bodies provide an example. Skin cells and brain cells have different forms and functions, despite having exactly the same DNA. There must be mechanisms—other than DNA—that make sure skin cells stay skin cells when they divide.

Only recently, however, have researchers begun to find molecular evidence of non-DNA inheritance between organisms as well as between cells. The main question now is: How often does it happen?

"The analysis of these data shows that epigenetic inheritance is ubiquitous," write Eva Jablonka and Gal Raz, both of Tel-Aviv University in Israel. Their article outlines inherited epigenetic variation in bacteria, protists, fungi, plants, and animals.

These findings "represent the tip of a very large iceberg," the authors say.

For example, Jablonka and Raz cite a study finding that when fruit flies are exposed to certain chemicals, at least 13 generations of their descendants are born with bristly outgrowths on their eyes. Another study found that exposing a pregnant rat to a chemical that alters reproductive hormones leads to generations of sick offspring. Yet another study shows higher rates of heart disease and diabetes in the children and grandchildren of people who were malnourished in adolescence.

In these cases, as well as the rest of the cases Jablonka and Raz cite, the source of the variation in subsequent generations was not DNA. Rather, the new traits were carried on through epigenetic means.

There are four known mechanisms for epigenetic inheritance. According to Jablonka and Raz, the best understood of these is "DNA methylation." Methyls, small chemical groups within cells, latch on to certain areas along the DNA strand. The methyls serve as a kind of switch that renders genes active or inactive.

By turning genes on and off, methyls can have a profound impact on the form and function of cells and organisms, without changing the underlying DNA. If the normal pattern of methyls is altered—by a chemical agent, for example—that new pattern can be passed to future generations.

The result, as in the case of the pregnant rats, can be dramatic and stick around for generations, despite the fact that underlying DNA remains unchanged.
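As a purely illustrative toy model in Python (the gene names and the on/off rule are invented, not taken from the article), the sketch below shows the idea: methyl marks switch genes on or off and can be altered and inherited, while the DNA sequence itself never changes.

# Toy model of heritable DNA methylation acting as a gene on/off switch.
# Gene names and the silencing rule are invented for illustration only.
def expressed_genes(genome, methylation):
    # A methyl mark on a gene switches it off in this simplified model.
    return sorted(g for g in genome if not methylation.get(g, False))

genome = {"geneA", "geneB", "geneC"}       # the DNA itself stays the same throughout
parent_marks = {"geneB": True}             # geneB is methylated (silenced) in the parent

# A chemical exposure alters the methylation pattern, not the DNA...
altered_marks = dict(parent_marks, geneC=True)
# ...and the offspring inherits the altered pattern along with the unchanged DNA.
offspring_marks = dict(altered_marks)

print(expressed_genes(genome, parent_marks))      # ['geneA', 'geneC']
print(expressed_genes(genome, offspring_marks))   # ['geneA']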

Lamarck Revisited

New evidence for epigenetic inheritance has profound implications for the study of evolution, Jablonka and Raz say.

"Incorporating epigenetic inheritance into evolutionary theory extends the scope of evolutionary thinking and leads to notions of heredity and evolution that incorporate development," they write.

This is a vindication of sorts for 18th century naturalist Jean Baptiste Lamarck. Lamarck, whose writings on evolution predated Charles Darwin's, believed that evolution was driven in part by the inheritance of acquired traits. His classic example was the giraffe. Giraffe ancestors, Lamarck surmised, reached with their necks to munch leaves high in trees. The reaching caused their necks to become slightly longer—a trait that was passed on to descendants. Generation after generation inherited slightly longer necks, and the result is what we see in giraffes today.

With the advent of Mendelian genetics and the later discovery of DNA, Lamarck's ideas fell out of favor entirely. Research on epigenetics, while yet to uncover anything as dramatic as Lamarck's giraffes, does suggest that acquired traits can be heritable, and that Lamarck was not so wrong after all.




New Stem Cell Lab for Horses Opens at UC Davis Veterinary School
Focused on providing the latest in stem cell therapies for horses, the UC Davis School of Veterinary Medicine today opened its new Regenerative Medicine Laboratory at the William R. Pritchard Veterinary Medical Teaching Hospital

The new laboratory provides a state-of-the art facility for processing, culturing and storing stem cells for use in horses. It is one of only four such university-based veterinary stem cell labs in the nation, providing services to clients and referring veterinarians.

“We are excited to be able to offer this new clinical service to our clients for their horses as a complement to our stem-cell research program,” said Bennie Osburn, dean of the School of Veterinary Medicine. “Stem cell science is leading us into a new era in human and veterinary medicine.”

In recent years, scientists have made significant advances in using stem cells to treat horses suffering from diseases including colic and neuromuscular degeneration, as well as burns and other injuries. Horses have been one of the first species to benefit from veterinary stem cell therapy because they are prone to many of the injuries that can be successfully treated with such therapy.

“The marvelous thing about stem cell therapy is that it holds the promise of a cure,” said Sean Owens, a veterinary professor and director of the new Regenerative Medicine Laboratory. “We can use pharmacological medicine to alleviate the pain associated with orthopedic injuries in horses, but only with biological medicine such as stem cell therapy can we actually repair the damage that has already been done.”

The research-driven laboratory is expected to yield new knowledge that also will benefit other animal species.

New laboratory
The new Regenerative Medicine Laboratory, located on the first floor of the UC Davis William R. Pritchard Veterinary Medical Teaching Hospital, will support the clinical arm of the veterinary stem cell program. Lab personnel will process, culture and store stem cells that have been collected from the hospital’s equine patients to treat injuries.

The laboratory also will provide stem cell collection kits to private veterinarians so that they can harvest stem cells from their equine patients and return the cells to the UC Davis lab for processing or storage. Processed stem cells then will be returned so that the veterinarians can treat their patients. Some horses also will be referred to the teaching hospital for stem cell treatments.

While the costs associated with stem cell processing and treatment will vary from case to case, the fee for processing and expansion of one bone marrow sample will be approximately $1,800. Each sample will be expanded into four therapeutic stem cell doses. One dose will be returned to the submitting veterinarian, while the other three will be stored for future use. The fee for stem cell injections at the Veterinary Medical Teaching Hospital will vary according to the number and frequency of doses administered. For most patients, the fee will be approximately $1,500.

Stem cells and regenerative medicine
Regenerative medicine is the field of human and veterinary medicine that involves creating living, functional tissues to repair or replace tissues or organs that have been damaged by injury, disease, aging or birth defects.

One way to do this is by collecting stem cells, which are unspecialized cells that can be induced in the laboratory to become specialized cell types such as muscle, blood and nerves.

The use of embryonic stem cells has raised much debate in human medicine. It is important to note that the new regenerative medicine program at the UC Davis Veterinary Medical Teaching Hospital does not use embryonic stem cells, but rather stem cells that have been collected from the horse’s own blood or bone marrow.

“The stem cell, with its ability to recreate, repair or revitalize damaged organs or tissues, is rapidly changing all of medicine," said Gregory Ferraro, a veterinary professor and director of UC Davis’ Center for Equine Health. “The application of stem cell science to treating horses is advancing so quickly that within three to five years, the treatments that are currently being provided for orthopedic repair in athletic horses will seem crude in hindsight.”

Veterinary stem cell team
The Center for Equine Health is coordinating a five-year collaborative research study, now in its second year.

The study is being carried out by a team of 11 UC Davis veterinary researchers, who are working to develop methods for collecting, processing, storing and administering stem cells to repair bone, tendon and ligament injuries in horses. These types of injuries are common problems, especially for race horses and other performance horses. The team’s early findings indicate that stem cell treatments may reduce the recurrence of certain tendon and ligament injuries and lessen the progression of arthritis associated with traumatic joint diseases in horses.

This veterinary team, under the direction of professor and equine surgeon Larry Galuppo, also has established a working partnership with the UC Davis Health System’s Stem Cell Program in human medicine, directed by Jan Nolta, a medical school professor and one of the nation’s leading stem cell researchers.


TUESDAY May 19, 2009---------------------------News Archive

How Embryo Movement Stimulates Joint Formation
A new study uncovers a molecular mechanism that explains why joints fail to develop in embryos with paralyzed limbs. The research, published by Cell Press in the May issue of the journal Developmental Cell, answers a longstanding question about the influence of muscle activity on developing joints and underscores the critical contribution of movement to regulation of a signaling pathway that is important during development and beyond.

Joint development requires changes in gene expression that "commit" cells to becoming part of the developing joint and distinguish them from the surrounding cartilage tissue. Previous research has shown that the Wnt/β-catenin signaling pathway plays a key role in maintaining this joint cell fate and preventing joint cells from differentiating into cartilage.

It is also clear that muscle contraction is involved in proper formation of the skeleton. "We have known for over a century that embryonic movement is intimately involved in development of the joints. However, the precise mechanism by which active musculature regulates joint formation has remained elusive," explains senior study author Dr. Elazar Zelzer from the Department of Molecular Genetics at the Weizmann Institute of Science in Israel.

Dr. Zelzer and colleagues confirmed that the normal process of joint formation was disrupted in mouse models that lacked limb musculature or muscle contractility. They then noted that cells at the presumptive joint sites ceased to express classical joint markers and instead followed a pathway for developing cartilage. Local loss of β-catenin activity explained why the joints failed to form.

"Prior to the current study, the mechanisms that underlie the contribution of movement to the process of joint development were mostly missing," says Dr. Zelzer. "Our findings show that muscle contraction is necessary to maintain joint progenitor cell fate and explain how and why movement-induced mechanical stimuli play a key role during development."

Importantly, the current results also establish joint formation as a context in which to study mechanical regulation of Wnt/β-catenin signaling more generally. The ability to respond to mechanical stimuli may also affect β-catenin-related tumorigenesis in disorders such as colon cancer.



Novel Vaccine Approach Offers Hope in Fight Against HIV
A research team may have broken the stubborn impasse that has frustrated the invention of an effective HIV vaccine, by using an approach that bypasses the usual path followed by vaccine developers. By using gene transfer technology that produces molecules that block infection, the scientists protected monkeys from infection by a virus closely related to HIV -- the simian immunodeficiency virus, or SIV -- that causes AIDS in rhesus monkeys.

"We used a leapfrog strategy, bypassing the natural immune system response that was the target of all previous HIV and SIV vaccine candidates," said study leader Philip R. Johnson, M.D., chief scientific officer at The Children's Hospital of Philadelphia. Johnson developed the novel approach over a ten-year period, collaborating with K. Reed Clark, Ph.D., a molecular virologist at Nationwide Children's Hospital in Columbus, Ohio.

The study appeared today in the online version of Nature Medicine.

Johnson cautioned that many hurdles remain before the technique used in this animal study might be translated into an HIV vaccine for humans. If the technique leads to an effective HIV vaccine, such a vaccine may be years away from realization.

Most attempts at developing an HIV vaccine have used substances aimed at stimulating the body's immune system to produce antibodies or killer cells that would eliminate the virus before or after it infected cells in the body. However, clinical trials have been disappointing. HIV vaccines have not elicited protective immune responses, just as the body fails on its own to produce an effective response against HIV during natural HIV infection.

The approach taken in the current study was divided into two phases. In the first phase, the research team created antibody-like proteins (called immunoadhesins) that were specifically designed to bind to SIV and block it from infecting cells. Once the immunoadhesins were shown to work against SIV in the laboratory, DNA encoding them was engineered into a carrier virus designed to deliver the DNA to monkeys. The researchers chose adeno-associated virus (AAV) as the carrier virus because it is a very effective way to insert DNA into the cells of a monkey or human.

In the second part of the study, the team injected AAV carriers into the muscles of monkeys, where the imported DNA produced immunoadhesins that entered the blood circulation. One month after administration of the AAV carriers, the immunized monkeys were injected with live, AIDS-causing SIV. The majority of the immunized monkeys were completely protected from SIV infection, and all were protected from AIDS. In contrast, a group of unimmunized monkeys were all infected by SIV, and two-thirds died of AIDS complications. High concentrations of the SIV-specific immunoadhesins remained in the blood for over a year.

Further studies need to be conducted if this technique is to become an actual preventive measure against HIV infection in people, Johnson said. "To ultimately succeed, more and better molecules that work against HIV, including human monoclonal antibodies, will be needed," he and his co-authors conclude. Finally, added Johnson, their approach may also have potential use in preventing other infectious diseases, such as malaria.



Environmental Exposures May Damage DNA in Three Days
Exposure to particulate matter has been recognized as a contributing factor to lung cancer development for some time, but a new study indicates inhalation of certain particulates can actually cause some genes to become reprogrammed, affecting both the development and the outcome of cancers and other diseases.

The research will be presented on Sunday, May 17, at the 105th International Conference of the American Thoracic Society in San Diego.

"Recently, changes in gene programming due to a chemical transformation called methylation have been found in the blood and tissues of lung cancer patients," said investigator Andrea Baccarelli, M.D., Ph.D., assistant professor of applied biotechnology at the University of Milan. "We aimed at investigating whether exposure to particulate matter induced changes in DNA methylation in blood from healthy subjects who were exposed to high levels of particulate matter in a foundry facility."

Researchers enrolled 63 healthy subjects who worked in a foundry near Milan, Italy. Blood DNA samples were collected on the morning of the first day of the work week, and again after three days of work. Comparing these samples revealed that significant changes had occurred in four genes associated with tumor suppression.

"The changes were detectable after only three days of exposure to particulate matter, indicating that environmental factors need little time to cause gene reprogramming which is potentially associated with disease outcomes," Dr. Baccarelli said.

"As several of the effects of particulate matter in foundries are similar to those found after exposure to ambient air pollution, our results open new hypotheses about how air pollutants modify human health," he added. "The changes in DNA methylation we observed are reversible and some of them are currently being used as targets of cancer drugs."

Dr. Baccarelli said the study results indicate that early interventions might be designed which would reverse gene programming to normal levels, reducing the health risks of exposure.

"We need to evaluate how the changes in gene reprogramming we observed are related to cancer risk," he said. "Down the road, it will be particularly important not only to show that these changes are associated with increased risk of cancer or other environmentally-induced diseases, but that, if we were able to prevent or revert them, these risks could be eliminated."


100 Reasons to Change the Way We Think About Genetics
Article reviews evidence for epigenetic inheritance in wide range of species

For years, genes have been considered the one and only way biological traits could be passed down through generations of organisms.

Not anymore.

Increasingly, biologists are finding that non-genetic variation acquired during the life of an organism can sometimes be passed on to offspring—a phenomenon known as epigenetic inheritance. An article forthcoming in the July issue of The Quarterly Review of Biology lists over 100 well-documented cases of epigenetic inheritance between generations of organisms, and suggests that non-DNA inheritance happens much more often than scientists previously thought.

Biologists have suspected for years that some kind of epigenetic inheritance occurs at the cellular level. The different kinds of cells in our bodies provide an example. Skin cells and brain cells have different forms and functions, despite having exactly the same DNA. There must be mechanisms—other than DNA—that make sure skin cells stay skin cells when they divide.

Only recently, however, have researchers begun to find molecular evidence of non-DNA inheritance between organisms as well as between cells. The main question now is: How often does it happen?

"The analysis of these data shows that epigenetic inheritance is ubiquitous …," write Eva Jablonka and Gal Raz, both of Tel-Aviv University in Israel. Their article outlines inherited epigenetic variation in bacteria, protists, fungi, plants, and animals.

These findings "represent the tip of a very large iceberg," the authors say.

For example, Jablonka and Raz cite a study finding that when fruit flies are exposed to certain chemicals, at least 13 generations of their descendants are born with bristly outgrowths on their eyes. Another study found that exposing a pregnant rat to a chemical that alters reproductive hormones leads to generations of sick offspring. Yet another study shows higher rates of heart disease and diabetes in the children and grandchildren of people who were malnourished in adolescence.

In these cases, as well as the rest of the cases Jablonka and Raz cite, the source of the variation in subsequent generations was not DNA. Rather, the new traits were carried on through epigenetic means.

There are four known mechanisms for epigenetic inheritance. According to Jablonka and Raz, the best understood of these is "DNA methylation." Methyls, small chemical groups within cells, latch on to certain areas along the DNA strand. The methyls serve as a kind of switch that renders genes active or inactive.

By turning genes on and off, methyls can have a profound impact on the form and function of cells and organisms, without changing the underlying DNA. If the normal pattern of methyls is altered—by a chemical agent, for example—that new pattern can be passed to future generations.

The result, as in the case of the pregnant rats, can be dramatic and stick around for generations, despite the fact that underlying DNA remains unchanged.

LAMARCK REVISITED

New evidence for epigenetic inheritance has profound implications for the study of evolution, Jablonka and Raz say.

"Incorporating epigenetic inheritance into evolutionary theory extends the scope of evolutionary thinking and leads to notions of heredity and evolution that incorporate development," they write.

This is a vindication of sorts for the French naturalist Jean-Baptiste Lamarck (1744-1829). Lamarck, whose writings on evolution predated Charles Darwin's, believed that evolution was driven in part by the inheritance of acquired traits. His classic example was the giraffe. Giraffe ancestors, Lamarck surmised, reached with their necks to munch leaves high in trees. The reaching caused their necks to become slightly longer—a trait that was passed on to descendants. Generation after generation inherited slightly longer necks, and the result is what we see in giraffes today.

With the advent of Mendelian genetics and the later discovery of DNA, Lamarck's ideas fell out of favor entirely. Research on epigenetics, while yet to uncover anything as dramatic as Lamarck's giraffes, does suggest that acquired traits can be heritable, and that Lamarck was not so wrong after all.



New Vaccine Strategy Might Offer Protection Against Pandemic Influenza Strains
A novel vaccine strategy using virus-like particles (VLPs) could provide stronger and longer-lasting influenza vaccines with a significantly shorter development and production time than current ones, allowing public health authorities to react more quickly in the event of a potential pandemic.

Ted Ross, Ph.D., an assistant professor at the University of Pittsburgh's Center for Vaccine Research, will present his laboratory's latest data on the efficacy of VLP vaccines for potential pandemic strains, such as H5N1 and 1918 influenza, today at the 109th General Meeting of the American Society for Microbiology in Philadelphia.

"Virus-like particles look just like a live virus, but they are hollow shells without a genome inside and they cannot reproduce," Ross explained. "Because they look like the virus, they evoke a more robust immune response against the real thing."

Ross and his colleagues have already made VLP vaccines that have been tested in early clinical trials and appear to provide complete protection against both the H5N1 avian influenza virus and the 1918 Spanish influenza virus.

"There is a debate in the influenza community about priming the human population for potential pandemic strains such as H5N1 or 1918," Ross said. "Some researchers advocate adding these strains to the annual flu vaccine. They might not match the next pandemic flu strain exactly, but could provide some of protection."

Others contend that it might be premature, as well as costly, to vaccinate people against a virus that may never emerge, he said.

The current injectable vaccine for seasonal influenza is a trivalent, inactivated vaccine. It consists of three different influenza strains that are grown in eggs and then inactivated, or killed, by chemicals that break them into tiny pieces. Because they no longer look like the circulating virus, conventionally made vaccine strains do not elicit as strong an immune response as VLP vaccines. Because it is made with live, attenuated virus, the inhaled, mist-based vaccine can elicit a strong immune response but can also increase the risk of side effects.

VLPs can be quickly and easily produced in several ways, including growing them in cell cultures or in plants. Also, if the genes in the disease virus are identified, then researchers can generate particles for a vaccine without an actual sample of the agent.

"The sequence for the recent H1N1 'swine flu' virus was online and available to scientists long before physical samples could be delivered," Dr. Ross noted. "It would have been possible to produce VLPs in quantity in as little as 12 weeks while conventional vaccines require physical samples of the virus and production can take approximately nine months."

One VLP-based vaccine already is on the market, namely the human papilloma virus (HPV) vaccine.


Week Ending FRIDAY May 15, 2009---------------------------News Archive

Vitamin D Insufficiency Linked to Bacterial Vaginosis in Pregnant Women
Bacterial vaginosis (BV) is the most common vaginal infection in US women of childbearing age, and is common in pregnant women. BV occurs when the normal balance of bacteria in the vagina is disrupted and replaced by an overgrowth of certain bacteria

Because having BV puts a woman at increased risk for a variety of complications, such as preterm delivery, there is great interest in understanding how it can be prevented.

Vitamin D may play a role in BV because it exerts influence over a number of aspects of the immune system. This hypothesis is circumstantially supported by the fact that BV is far more common in black than white women, and vitamin D status is substantially lower in black than white women.

This relation, however, has not been rigorously studied. To assess whether poor vitamin D status may play a role in predisposing a woman to BV, Bodnar and coworkers at the University of Pittsburgh and the Magee-Womens Research Institute studied 469 pregnant women. The results of their investigation are published in the June 2009 issue of the Journal of Nutrition.

This prospective epidemiologic study investigated the relation between vitamin D status and BV in 209 white and 260 black women at <16 wk of pregnancy with singleton gestations. Blood samples were taken, and serum analyzed for 25-hydroxyvitamin D [25(OH)D], a marker of vitamin D status. 25(OH)D levels below 80 nmol/L are typically considered insufficient. Pelvic examinations were performed, and Gram-stained vaginal smears were assessed to diagnose BV.

The data indicate that 41% of all enrolled women had BV, and that 93% had 25(OH)D levels indicative of vitamin D insufficiency.

Overall, women with BV had lower serum 25(OH)D concentrations than those without BV (P < 0.01). The prevalence of BV decreased as vitamin D concentration increased to 80 nmol/L (P < 0.001). Compared with 75 nmol/L, serum 25(OH)D concentrations of 20 nmol/L and 50 nmol/L were associated with 65% and 26% increases, respectively, in the likelihood of BV.
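To put those relative figures in concrete terms, the short sketch below applies the quoted increases to an assumed starting point. Reading the reported increases as prevalence ratios, and starting from a hypothetical 30% prevalence at 75 nmol/L, are our assumptions for illustration only, not numbers taken from the paper:

    # Hypothetical illustration of the quoted increases in BV likelihood.
    baseline_prevalence = 0.30                 # assumed BV prevalence at 25(OH)D = 75 nmol/L (illustrative)
    relative_increase = {20: 0.65, 50: 0.26}   # +65% at 20 nmol/L, +26% at 50 nmol/L (from the study)

    for concentration, increase in relative_increase.items():
        implied_prevalence = baseline_prevalence * (1 + increase)
        print(f"25(OH)D {concentration} nmol/L -> implied BV prevalence about {implied_prevalence:.0%}")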

In summary, these findings suggest that vitamin D insufficiency is associated with BV in the first 4 months of pregnancy. Further, poor vitamin D status may contribute to the strong racial disparity in the prevalence of BV in US women. Controlled intervention trials will be needed to confirm this hypothesis.



ACLU versus Myriad Genetics - Landmark Case
The American Civil Liberties Union's lawsuit, filed yesterday against Myriad Genetics, is going to lead to one of the most important legal battles in the history of biotechnology, asserts Genetic Engineering & Biotechnology News (GEN). (www.genengnews.com)

The ACLU charged that the patenting of two human genes linked to breast and ovarian cancer will inhibit medical research. The organization also claims that the patents are invalid and unconstitutional.

“This is going to turn into one of the watershed events in the evolution of the bioindustry,” says John Sterling, Editor in Chief of GEN. “The pros and cons of patenting genes have been an ongoing, and often acrimonious series of debates, since the in re Chakrabarty decision in 1980.

“But this particular case seems to have taken on a life of its own with over fifteen plaintiffs. For while the lawsuit specifically centers on the patentability of two cancer-related genes, the ACLU says it plans to challenge the entire concept of patenting genes. What we have here is one group, the ACLU and its allies, contending that gene patents stifle life science research and potentially harm the health of thousands of patients. On the other side are biotech companies who maintain that without gene patents research incentives are seriously diminished and innovation is smothered.”

Kenneth I. Berns, M.D., Ph.D., Editor in Chief of the peer-reviewed journal Genetic Testing and Molecular Biomarkers (http://www.liebertpub.com/gtmb), which is the official journal of the Genetic Alliance, says the “patenting of human genes is a bad idea and that healthcare in the U.S. would be enhanced if the ACLU suit prevails.” Dr. Berns is also Director of the University of Florida Genetics Institute in Gainesville.

William Warren, partner at the Sutherland law firm, thinks the ACLU, in this case, is barking up the wrong tree. “The ACLU unexpectedly based its invalidity challenge on claims to unpatentable subject matter,” he says. “The ACLU might have instead considered challenging the Myriad patents for obviousness.” Warren and Sutherland colleague, Lei Fang, Ph.D., M.D., have authored a legal article, which will be published in the June 1 issue of GEN entitled “Patentability of Genetic Sequences Limited.” It is now available online. (http://www.genengnews.com/news/bnitem.aspx?name=54504126&source=genwire)

For the specific details surrounding the lawsuit please see the article on the GEN website entitled “Myriad Genetics Comes under Legal Fire for Gene Patents” (www.genengnews.com/news/bnitem.aspx?name=54425046&source=genwire), which includes pertinent comments from both research and legal professionals.


Day care and Insensitive Parenting May Have Lasting Effects
Drawing on the large, longitudinal NICHD-supported study of early child care and youth development in the United States

A growing number of American children are enrolled in child care and questions remain about how these settings may affect them in both positive and negative ways.

A new study published in the May/June 2009 issue of the journal Child Development finds that early interpersonal experiences—center-based child care and parenting—may have independent and lasting developmental effects.

The study draws on the large, longitudinal Study of Early Child Care and Youth Development in the United States, which was carried out in collaboration with the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD).

The NICHD study has followed about 1,000 children from 1 month through mid-adolescence to examine the effects of child care in children's first few years of life on later development. The researchers observed children in and out of their homes, and when the children were 15, they measured their levels of awakening cortisol—a stress-responsive hormone that follows a daily cycle (cortisol levels are usually high in the morning and decrease throughout the day).

Children who, during their first three years, (a) had mothers who were more insensitive and/or (b) spent more time in center-based child care - whether of high or low quality - were more likely to have the atypical pattern of lower levels of cortisol just after awakening when they were 15 years of age, which could indicate higher levels of early stress.

These findings held even after taking into consideration a number of background variables (including family income, the mothers' education, the child's gender, and the child's ethnicity), as well as observed parenting sensitivity at age 15. The associations were small in magnitude, and were not stronger for either boys or girls.

The study was supported by the National Institute of Child Health and Human Development, NICHD.

How UV Radiation Causes Cells to Die to Avoid Cancer Damage
Ultraviolet radiation from the sun can zap DNA, damage cells, and set the stage for the subsequent development of cancer. Scientists have now identified the built-in safety mechanism that forces some cells damaged by UV radiation to commit suicide so they do not perpetuate harmful mutations

Alberto R. Kornblihtt, a Howard Hughes Medical Institute international research scholar at the University of Buenos Aires and the National Research Council of Argentina, has found that UV radiation causes human cells to create proteins that trigger cell death. It’s a built-in safety pathway whose precise mechanism had never been seen before.

“It's better for the cell to die than to spread the mutations,” Kornblihtt says. The findings were published in the May 15, 2009 issue of the journal Cell.

All cells in the body rely on the same set of approximately 25,000 genes as the blueprint for the proteins they need to carry out their activities. They expand this limited repertoire through a mechanism called alternative splicing, which allows a cell to produce an assortment of different proteins from the same gene. They achieve this diversity by modifying messenger RNA (mRNA) molecules—the intermediary in the conversion from a gene to a protein.

In their experiments, Kornblihtt and his colleagues—an international team of laboratories from the U.S., France, and Spain—bombarded human cells with a highly energetic form of UV radiation that is typically blocked by the ozone layer, called UV-C. They then looked inside the damaged cells for mRNA, which ferries the genetic message from gene to protein. By examining the sequence of nucleotide letters in the mRNA, Kornblihtt could see which genes or parts of genes were used to make proteins in the damaged cells—and if they had been alternatively spliced.

They compared mRNA sequences from the damaged cells to the mRNA in healthy cells to see which genes were alternatively spliced. Using special chips that analyzed the mRNA of about 500 genes, Kornblihtt found that 14 percent of the genes switched forms in response to UV-C. “We found that UV radiation causes changes in alternative splicing, but only in a certain subset of genes,” Kornblihtt says.

Manuel Muñoz, a graduate student in Kornblihtt’s lab who is first author of the Cell paper, decided to see if any of the genes that switched forms were important in apoptosis, the process that causes cells to commit suicide. Muñoz identified two genes, Bcl-X and caspase 9, that are known to be involved with apoptosis, or programmed cell death. Apoptosis culls unneeded cells during development and growth and protects organisms by killing defective cells. Defects in apoptosis can be harmful—leading to extended cell survival and the potential for the uncontrolled growth characteristic of cancer.

The Bcl-X and caspase-9 genes can produce two different proteins via alternative splicing. For each gene, one version prevents cell death, while the other version encourages it. Kornblihtt and Muñoz found that, in both cases, UV radiation triggered production of the protein that encourages cell death. “This finding was really striking,” Kornblihtt says.

The researchers then repeated the experiments in cells missing a key protein called p53. Normally, p53 triggers the cascade of events that lead to apoptosis in response to cellular damage. But even in cells lacking p53, UV radiation still caused apoptosis, with Bcl-X and caspase 9 helping the process along. “We demonstrated that the cell death mechanisms we found are independent of p53,” Kornblihtt says. “That’s an important finding because p53 is usually needed to cause apoptosis.”

To find out how UV damage induces cell death, Kornblihtt turned to his previous work studying alternative splicing, specifically a key enzyme called polymerase II. Polymerase II is like the Xerox machine of the cell. It reads DNA then makes mRNA copies, which are later processed to make proteins. Kornblihtt had previously shown that the speed at which polymerase II moves along a strand of DNA determines whether an alternative splice of mRNA is made. If it moves quickly, the enzyme will skip over some segments of the DNA. But if it moves slowly, it will include those segments, leading to an alternative splice.

Kornblihtt and his colleagues looked to see if there were any obstacles in cells damaged by UV-C that might slow down polymerase II—and thereby induce alternative splicing. They fluorescently tagged the newly formed messenger RNA to measure polymerase II speed, and found that the enzyme slowed in response to UV radiation. This decrease in speed produced the alternative forms of Bcl-X and caspase 9 that then caused the cells to commit suicide.

Now the group plans to repeat the experiments with UV-A and UV-B, which are less energetic than UV-C but are more common causes of skin cell damage in people. Kornblihtt also wants to find out how UV-C causes polymerase II to slow down. “It’s clear that UV radiation indirectly affects the speed of polymerase II,” Kornblihtt says. “Although we don’t know exactly how this happens yet.”



THURSDAY May 14, 2009---------------------------News Archive

Embryo’s Heartbeat Drives Generation of New Blood Cells
During the early days of an embryo’s development, the heart begins to beat. It turns out that the beating heart does more than circulate the embryo’s small existing blood supply. Howard Hughes Medical Institute investigators have found that the blood’s movement through the aorta triggers the production of new blood stem cells, which will give rise to all the red and white blood cells the organism needs to survive

The researchers have also discovered that this essential biomechanical signal can be mimicked with drugs. The findings could help clinicians expand the supply of blood stem cells needed to treat leukemia, autoimmune disorders, and other diseases.

“The biomechanical stress of early blood flow is needed for an organism to grow its initial supply of blood cells,” says George Daley, an HHMI investigator at Children’s Hospital Boston and senior author on one of the reports, published May 13, 2009, in Nature. The second report, with HHMI investigator Leonard Zon as senior author, was published May 13, 2009, in the journal Cell.

The two investigators homed in on the importance of flow for blood development from different angles.

“For a long time, I’ve had the idea that the initiation of the heartbeat in an embryo is crucial for the creation of blood stem cells,” says Daley, who hopes to grow blood stem cells from pluripotent stem cells in the laboratory so that they can be infused into patients to treat a range of diseases. He began investigating the idea with bioengineers at the Massachusetts Institute of Technology in 2001, and in early experiments Daley’s team noticed that streaming a fluid across embryonic stem cells growing in a bioreactor did spur the development of new blood cells.

Daley shelved that research for a time, but it took off again when he began collaborating with Guillermo García-Cardeña, from Brigham and Women’s Hospital in Boston. García-Cardeña invented a miniaturized cell-culturing system that can impose different degrees of fluid flow on cells. The system grows cells on a surface beneath a shallow inverted cone, which spins at different speeds to create different rates of fluid flow.

García-Cardeña’s team seeded the system with embryonic stem cells, and found that spinning fluid at a specific rate increased the production of blood stem cells. The system produced the most blood stem cells when the fluid force was equivalent to the force of blood flow in a developing mouse aorta when the heart begins beating, at about day ten and a half of embryonic development. “The cells are tuned to sense the right force,” says Daley.
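As background on how such a "fluid force" is quantified: for steady laminar flow through a cylindrical vessel, the wall shear stress cells experience is commonly estimated with the textbook Poiseuille relation, wall shear stress = 4μQ/(πR³), where μ is the fluid viscosity, Q the volumetric flow rate and R the vessel radius. The sketch below applies that general formula with placeholder values chosen purely for illustration; it is not the calibration the researchers describe:

    import math

    def wall_shear_stress(viscosity_pa_s, flow_m3_per_s, radius_m):
        # Poiseuille estimate of wall shear stress (pascals) for laminar flow in a tube.
        return 4.0 * viscosity_pa_s * flow_m3_per_s / (math.pi * radius_m ** 3)

    # Placeholder values for illustration only (not taken from the study):
    mu = 3.5e-3     # blood viscosity, Pa*s
    q = 1.0e-11     # volumetric flow rate, m^3/s
    r = 100e-6      # vessel radius, m

    tau = wall_shear_stress(mu, q, r)
    print(f"Estimated wall shear stress: {tau:.3f} Pa (about {tau * 10:.1f} dyn/cm^2)")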

Researchers had previously established that blood arises in two waves within mouse embryos. Early blood is produced in small quantities outside the embryo, in the yolk sac, while the later blood stem cells bud from the walls of the developing aorta. Daley’s work shows that when the embryo’s heart starts to beat, the frictional forces against the walls of the aorta trigger the production of blood stem cells.

Zon, also at Children’s Hospital Boston, approached the problem in a different way. Zon has been working to identify compounds that boost production of blood stem cells, with the ultimate goal of increasing the number of blood stem cells in bone marrow and umbilical cord blood, which are transfused into patients to rebuild their immune systems after cancer therapy.

To this end, Zon developed a system to quickly test thousands of drugs in zebrafish. This approach tags zebrafish embryos with a purple dye that appears only in new blood stem cells. Since zebrafish embryos are translucent, laboratory workers can watch new blood stem cells as they are generated. “You could never do this screen in any other animal, you have to do this in zebrafish,” says Zon. “We're literally looking at the aorta as blood stem cells are being born.”

In 2007, Zon and colleagues identified a compound, called prostaglandin E2, that increases the production of new blood stem cells. The drug screens also highlighted a class of compounds that increased blood flow, and showed that these compounds increased the production of blood stem cells. Until then, Zon says, “it was not known at all that blood flow is a signal that produces blood stem cells in embryos.”

Zon then worked with mutant zebrafish embryos missing a key heart protein. The hearts in these embryos never begin beating. “You never get circulation in these fish, and if you look in their aorta, you see very few blood stem cells. That confirmed to us that blood flow is truly required to make the blood stem cells,” Zon says.

With further experiments, Zon’s team found a group of drugs that enabled the fish without beating hearts to produce blood stem cells. These drugs all had something in common: they generated nitric oxide, a well-known molecule used by cells to talk to each other. Normal blood flow enhances the production of nitric oxide. “That's at least one of the critical signals that blood flow is triggering,” Zon says. But their experiments demonstrate that nitric oxide can actually supplant the need for flow.

Daley, too, found that nitric oxide is crucial for development of blood stem cells. He used a drug to block nitric oxide production in pregnant mice, and found a marked decrease in blood stem cells in the embryos they carried.

“The lesson here may be that as we try to grow blood stem cells in the laboratory, any number of drugs that produce nitric oxide may be valuable,” Daley says.

Common Treatment to Delay Labor Decreases Preterm Infants’ Risk for Cerebral Palsy
Preterm infants born to mothers receiving intravenous magnesium sulfate, a common treatment to delay labor, are less likely to develop cerebral palsy (CP) than are preterm infants whose mothers do not receive it, reported researchers in a large National Institutes of Health (NIH) research network.1

The research was conducted by investigators in 20 participating research centers of the Maternal Fetal Medicine Units Network of NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). The 2,241 women in the study were at risk for preterm delivery between 24 and 31 weeks of gestation.

The researchers theorized that magnesium sulfate protects against CP because it can stabilize blood vessels, protect against damage from oxygen depletion, and protect against injury from swelling and inflammation.

CP refers to any one of a group of neurological disorders affecting control of body movement and muscle coordination. Although muscle movements are affected, CP is not caused by problems in the muscles or nerves themselves; rather, it results from abnormalities in the parts of the brain that control movement. Many people with CP suffer additional neurological disabilities, including mental retardation and epilepsy. In CP, the brain may be injured or develop abnormally during pregnancy, birth, or in early childhood; however, the causes of CP are not well understood.

For their primary calculation, the researchers grouped the proportions of infants with moderate and severe CP together with the proportion of infants who died. The researchers included the death rate in this primary calculation, because mortality among preterm infants is very high. The researchers found that a total of 11.3 percent of infants in the magnesium sulfate group had either moderate or severe CP, or had died at birth or were stillborn. In contrast, a total of 11.7 percent of the infants in the placebo group had moderate to severe CP or had died.

The results indicate that the risk of death occurring in the magnesium sulfate group (9.5 percent) did not differ significantly from those in the placebo group (8.5 percent). However, among the babies that did survive preterm births, moderate or severe CP occurred significantly less frequently in the magnesium sulfate group (1.9 percent vs. 3.5 percent). The study authors did not include mild CP in their analyses, as mild CP will often disappear with time.
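Recast as relative risks, the percentages quoted above work out roughly as follows. This is a back-of-the-envelope recalculation from the published figures, not an analysis the investigators report in this form:

    # Crude relative-risk arithmetic from the percentages quoted above.
    cp_magnesium = 0.019     # moderate/severe CP among surviving infants, magnesium sulfate group
    cp_placebo = 0.035       # moderate/severe CP among surviving infants, placebo group
    death_magnesium = 0.095  # death rate, magnesium sulfate group
    death_placebo = 0.085    # death rate, placebo group

    rr_cp = cp_magnesium / cp_placebo
    rr_death = death_magnesium / death_placebo
    print(f"Relative risk of moderate/severe CP: {rr_cp:.2f} (about a {1 - rr_cp:.0%} relative reduction)")
    print(f"Relative risk of death: {rr_death:.2f} (reported as not statistically significant)")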

These study results support the findings of an earlier National Institute of Neurological Disorders and Stroke study published in Pediatrics that reported that mothers of preterm infants who did not have CP were more likely to have received magnesium sulfate than were mothers of infants who had CP.2

The NIH study is the largest, most comprehensive study to date to analyze this inexpensive and commonly used treatment to reduce the occurrence of CP after preterm birth, according to the researchers. “This is a major advance,” said Catherine Y. Spong, MD, chief of NICHD’s Pregnancy and Perinatology Branch and an author of the study. “Our results show that obstetricians can use magnesium sulfate, which they have experience prescribing, to reduce the risk of a devastating condition - CP - in preterm infants.”

Researchers are continuing to examine the roles of genetics, environment, and traumatic events early in brain development that may lead to brain malformations and abnormalities that result in CP. The National Children’s Study, which is investigating how genetics and environmental influences before birth and in childhood impact health, will contribute to the body of research in CP and other health conditions.

Babies Increasingly Born to Unwed Mothers
By Rob Stein and Donna St. George, of the Washington Post
The number of babies being born out of wedlock has increased sharply in the United States, driven primarily by significant jumps in women in their 20s and 30s having children without getting married, according to a federal report released today


More than 1.7 million babies were born to unmarried women in 2007, a 26 percent rise from 2002 and more than double the number in 1980, according to the report from the National Center for Health Statistics. The increase reflected a 21 percent jump in the rates of unmarried women giving birth, which rose from 43.7 per 1,000 women in 2002 to 52.9 per 1,000 women.

That means that unmarried women accounted for 39.7 percent of all U.S. births in 2007 -- nearly four out of every 10 newborns -- up from 34 percent in 2002 and more than double the percentage in 1980.
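The percentage changes quoted here follow directly from the rates in the report; the brief check below simply reproduces that arithmetic and is ours, not the report's:

    # Reproducing the quoted changes from the reported rates.
    rate_2002 = 43.7                    # births per 1,000 unmarried women, 2002
    rate_2007 = 52.9                    # births per 1,000 unmarried women, 2007
    share_of_all_births_2007 = 0.397    # unmarried women's share of all U.S. births in 2007

    rate_increase = (rate_2007 - rate_2002) / rate_2002
    print(f"Increase in the birth rate among unmarried women: {rate_increase:.0%}")   # about 21%
    print(f"Roughly {share_of_all_births_2007 * 10:.0f} of every 10 newborns had unmarried mothers")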

"If you see 10 babies in the room, four them were born to women who were not married," said Stephanie J. Ventura, who led the analysis of birth certificate data nationwide. "It's been a huge increase -- a dramatic increase. It's quite striking."

Although the report did not examine the reasons for the increase, Ventura and other experts said the trend has been driven by a combination of factors, including the lessening of the social stigma associated with unmarried motherhood, an increase in couples delaying or forgoing marriage, and growing numbers of financially independent women and older and single women who decide to have children on their own after delaying childbearing.

"It's many factors," Ventura said. "Certainly the social disapproval factor has diminished. That's just not a factor that unmarried women once faced. And a lot of women are postponing marriage."

Some experts said the trend represents many positive changes for some women -- women are less likely to be shunned if they have children by themselves or to be forced to give their children up for adoption.

"We've seen a transformation of social norms," said Rosanna Hertz, a professor of sociology at Wellesley College. "Women can have children on their own and it's not going to destroy your employment and it's not going to mean that you'll be made a pariah by the community."

But others said that while the shift may represent some positive changes, the trend is disturbing because studies have shown that children generally tend to fare better when they grow up in stable households with two parents.

"We know that babies and children do best with committed, stable adult parents -- preferably married," said Sarah Brown of the National Campaign to Prevent Teen and Unplanned Pregnancy. "That tends to be the arrangement that produces the best outcome for children. I look at this and say, 'Maybe this trend is what young adults want or stumble into, but it's not in the best interest of children.'"

The trend has been indicated in previous reports, but the new analysis is the first to examine the dramatic social shift in detail, exploring differences in age and ethnicity as well as comparing the United States to other countries.

Although experts have been concerned about a recent uptick in births to older teens after years of decline, that is not the driving force in the overall trend but more likely a reflection of it, Ventura said.

Instead, much of the increase is due to significant increases in births among unmarried women in their 20s and 30s. Between 2002 and 2006, the rate at which unmarried women were having babies increased by 13 percent among women ages 20 to 24, by 21 percent for those ages 25 to 29, by 34 percent for women 30 to 34 and by 29 percent for those 35 to 39, the report found.

"Those are really big increases," Ventura said, noting that the increase among women in their 20s was the most important factor because they have the highest birth rate. "It's really what's happening for women in their 20s that is the dominant factor."

Compared to 1980, the rate of births among unmarried women more than doubled from 41 per 1,000 among women ages 20 to 24 to 80 per 1,000 in 2006, and nearly tripled for women ages 35 to 39 -- from 10 per 1,000 in 1980 to 27 per 1,000 in 2006, the report showed.

In 2007, 45 percent of women who gave birth in their 20s were unmarried. Sixty percent of those who had babies between 20 and 24 were single, up from 52 percent in 2002, and nearly one-third of those giving birth at ages 25 to 29 were unmarried, up from one-fourth in 2002. Nearly one in five women who gave birth in their 30s were unmarried, compared with one in seven in 2002.

The rates increased for all races, but they remained highest and rose fastest for Hispanics and blacks. There were 106 births to every 1,000 unmarried Hispanic women, 72 per 1,000 blacks, 32 per 1,000 whites and 26 per 1,000 Asians, the report showed.

The rate of babies being born to unmarried women in the United States is starting to look more like that of some European countries, the report showed. For example, the percentage of babies born to unmarried women is about 66 percent in Iceland, about 55 percent in Sweden, about 50 percent in France and about 44 percent in the United Kingdom.

In many of those countries couples are living together instead of getting married, which is also the case in the United States, Ventura noted. Previous research indicates about 40 percent of births to unmarried women occur in households where couples are cohabitating, she said.

"We're seeing a big drop in emphasis on marriage," Hertz said. "There are more people living together without being married -- look at Brad and Angelina."


Brain Chemical Reduces Anxiety, Increases Survival of New Cells
Animal study suggests potential new treatment for anxiety disorders and depression

New research on a brain chemical involved in development sheds light on why some individuals may be predisposed to anxiety. It also strengthens understanding of cellular processes that may be common to anxiety and depression, and suggests how lifestyle changes may help overcome both.

The animal study, in the May 13 issue of The Journal of Neuroscience, shows an important role for fibroblast growth factor 2 (FGF2), a chemical important in brain development, in anxiety. The findings advance understanding of cellular mechanisms involved in anxiety and illuminate the role of neurogenesis, or cell birth and integration in the adult brain, in this process. Together, these findings may offer new drug targets for the treatment of anxiety and potentially for depression as well.

According to the National Institute of Mental Health, approximately 40 million American adults have anxiety disorders, and 14.8 million suffer from major depression. These disorders often co-occur: people with anxiety frequently also have depression, and research suggests that the two disorders may share common causes. Previous human studies led by the senior author, Huda Akil, PhD, at the University of Michigan and her collaborators in the Pritzker Consortium, showed that people with severe depression had low levels of FGF2 and other related chemicals. However, it was unclear whether reductions in FGF2 were the cause or effect of the disease.

This new study, led by Javier Perez, PhD, also at the University of Michigan, examined FGF2 levels in rats selectively bred for high or low anxiety for over 19 generations. Consistent with the human depression studies, the researchers found lower FGF2 levels in rats bred for high anxiety compared to those bred for low anxiety.

The study also suggests that environmental enrichment reduces anxiety by altering FGF2. Other researchers have shown that anxiety behaviors in rats can be modified by making changes to their environment, perhaps akin to lifestyle changes for people. Perez and colleagues found that giving the high-anxiety rats a series of new toys reduced anxiety behaviors and increased their levels of FGF2. Furthermore, they found that FGF2 treatment alone reduced anxiety behaviors in the high-anxiety rats.

“We have discovered that FGF2 has two important new roles: it’s a genetic vulnerability factor for anxiety and a mediator for how the environment affects different individuals. This is surprising, as FGF2 and related molecules are known primarily for organizing the brain during development and repairing it after injury,” Perez said.

Finally, the findings suggest that part of FGF2’s role in reducing anxiety may be due to its ability to increase the survival of new cells in a brain region called the hippocampus. Previous research has suggested that depression decreases the production and incorporation of new brain cells, a process called neurogenesis. Although the researchers found that high-anxiety rats produced the same number of new brain cells as low-anxiety rats, they found decreased survival of new brain cells in high-anxiety rats compared to low-anxiety rats. However, FGF2 treatment and environmental enrichment each restored brain cell survival.

“This discovery may pave the way for new, more specific treatments for anxiety that will not be based on sedation — like currently prescribed drugs — but will instead fight the real cause of the disease,” said Pier Vincenzo Piazza, MD, PhD, Director of the Neurocentre Magendie, an INSERM/University of Bordeaux institution in France, and an expert on the role of neurogenesis in addiction and anxiety who was not involved in the current study.

The research was supported by the National Institute of Mental Health, National Institute on Drug Abuse, Office of Naval Research, and The Pritzker Neuropsychiatric Disorders Research Fund.



WEDNESDAY May 13, 2009---------------------------News Archive

Folic Acid Supplements Before Birth Reduce Preemie Risk
Taking folic acid supplements for at least a year before conception is associated with reduction in the risk of premature birth, according to a study by Radek Bukowski (from the University of Texas Medical Branch, United States of America) and colleagues, published in this week's PLoS Medicine

Although most pregnancies last about 40 weeks, many babies (for example around 12% in the United States) are born before 37 completed weeks of pregnancy. Babies born prematurely are less likely to survive than full-term babies and are more likely to have breathing difficulties and learning or developmental disabilities.

Currently, there are no effective methods of prevention or treatment of premature (preterm) birth, but previous studies have suggested that lower concentrations of folate (folic acid) are associated with shorter duration of pregnancy. Bukowski and colleagues therefore tested this idea, by analyzing data collected from a cohort of nearly 35,000 pregnant women.

The results of this study showed that taking folate supplements for at least one year before conception was associated with a 70% reduction in spontaneous premature birth between 20 and 28 weeks (a reduction from 0.27% to 0.04%), and a 50% reduction between 28 and 32 weeks (reduction from 0.38% to 0.18%), as compared to the rate of preterm birth when mothers did not take additional folate supplementation.
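Because the baseline risks are small, the absolute differences behind those percentages are also worth seeing. The rough calculation below, for the earliest preterm window, is our own arithmetic from the quoted rates and is not a figure the authors report:

    # Crude absolute-risk arithmetic for spontaneous preterm birth at 20-28 weeks.
    risk_without_folate = 0.0027   # 0.27% without at least 1 year of preconception folate
    risk_with_folate = 0.0004      # 0.04% with at least 1 year of preconception folate

    absolute_reduction = risk_without_folate - risk_with_folate
    women_per_birth_avoided = 1 / absolute_reduction
    print(f"Absolute risk reduction: {absolute_reduction:.2%}")
    print(f"Roughly {women_per_birth_avoided:.0f} women supplemented per very early preterm birth avoided (crude estimate)")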

Folate supplementation for less than a year before conception was not linked to a reduction in the risk of premature birth in this study, and folate supplementation was not associated with any other complications of pregnancy.

In a related commentary also published in this week's PLoS Medicine, Nicholas Fisk from the University of Queensland in Brisbane, Australia, and colleagues (who were not involved in the original study) say "Methodologically, the study has several strengths... It is based on a huge dataset, with prospective recording of dietary supplements and potential confounders, and gestational age determined accurately on first trimester ultrasound. Those born preterm because of intervention were appropriately censored."

Nevertheless, Nicholas Fisk and colleagues also point out limitations to the study – for example, this was a secondary analysis of a Down syndrome screening study, so information on folic acid dose, formulation (with or without other supplements), and daily compliance is incomplete.

The study design was observational, so the presence of other factors, such as healthier behaviors on the part of women who take folate supplements, may explain the findings. Further evidence as to whether folic acid prevents spontaneous preterm birth will require a randomized controlled trial.

Molecular Structure Behind the Color of Hair and Eyes
Scientists have long known that members of the phenoloxidase family are involved in skin and hair coloring. When they are mutated, they can cause albinism – the loss of coloring in skin and hair. Produced overabundantly, they are associated with the deadly skin cancer melanoma

In an elegant structural study, a team of Baylor College of Medicine (www.bcm.edu) and German researchers explain how hemocyanin, a large oxygen-carrying protein complex that can be turned into phenoloxidase, is activated – a finding that could lead to a better understanding of both ends of the skin and hair color spectrum. A report of their work appears in the current issue of the journal Structure.

When Dr. Yao Cong, a postdoctoral researcher in the laboratory of Dr. Wah Chiu (http://www.bcm.edu/biochem/?PMID=3715), displays the computer representation of hemocyanin, it glows like a four-part jewel on the computer screen (see Figure 1). Chiu is professor of biochemistry and molecular biology at BCM and director of the National Center for Macromolecular Imaging (http://ncmi.bcm.tmc.edu/ncmi/).

"It is very large and composed of 24 molecules," Cong said. In fact, it consists of four hexamers, each with six monomers (Movie 1 and Figure).

Just getting this far required using single particle electron cryomicroscopy (cryo-EM) to produce three dimensional density maps of the molecule at sub-nanometer resolution.

"Cryo-EM is becoming a structural tool that can be used for understanding the structural mechanisms of large proteins, which has translational and biotechnological applications, as demonstrated in this study," said Chiu, a senior author.

"There are some critical structural features are very well resolved in our maps," said Cong. "which could not be captured using other techniques."

She and her colleagues used the detergent SDS, which is usually used as a denaturant to degrade proteins, to activate hemocyanin. At certain high concentrations, instead of destroying the complex, it turns hemocyanin into an enzymatically active phenoloxidase.

Each monomer of the protein particle has three domains.

"It is very interesting," said Cong. "One domain is more flexible than the other two domains because it has much less interaction with neighboring subunits as compared with the other two domains."

Upon activation, there is an overall conformational change of the complex (Movie 2). The most obvious is formation of two bridges in the previously vacant middle of the protein, which strengthens the interaction between the two halves of the complex.

"Zoom into the active site," said Cong. The intrinsically flexible domain twists away from the other two domains, dragging away a blocking residue and exposes the entrance to the active site (Movie 3). This movement is then stabilized by enhanced interhexamer interactions."

"This is all about interaction," said Cong. "A single change in the local domain of a subunit can result in conformation changes in the entire complex and make it work cooperatively. This is really a molecular machine."

Using hemocyanin as a model system, scientists can learn about the activation mechanism of other phenoloxidase enzymes in the same family, opening the door to new understanding of both melanoma and albinism, she said.

"If you know the mechanism of activating the protein, you could mutate it to enhance the interaction or inhibit it – depending on what you want to accomplish," she said.

Not only does this research have implications for human disease, it could also play a role in agriculture, where enzymes in this protein family are responsible for fruit and vegetables turning brown as they age.

Imaging Study Finds Enlarged Amygdala in Toddlers with Autism
Toddlers with autism appear more likely to have an enlarged amygdala, a brain area associated with functions such as the processing of faces and emotion, a study by University of North Carolina at Chapel Hill School of Medicine researchers has found

In addition, this brain abnormality appears to be associated with the ability to share attention with others, a fundamental ability thought to predict later social and language function in children with autism.

These findings are published in the May 2009 issue of Archives of General Psychiatry. Lead author of the article is Matthew W. Mosconi, Ph.D. of the UNC Neurodevelopmental Disorders Research Center. Joseph Piven, M.D., director of both the Neurodevelopmental Disorders Research Center and the Carolina Institute for Developmental Disabilities at UNC, is the study’s senior and corresponding author.

“Autism is a complex neurodevelopmental disorder likely involving multiple brain systems,” Piven said. “Converging evidence from magnetic resonance imaging, head circumference and postmortem studies suggests that brain volume enlargement is a characteristic feature of autism, with its onset most likely occurring in the latter part of the first year of life.” Based both on its function and studies of changes in its structure, the amygdala has been identified as a brain area potentially associated with autism.

Mosconi, Piven and colleagues conducted a magnetic resonance imaging study involving 50 autistic children and 33 control children. Participating children underwent brain scans along with testing of certain behavioral features of autism at ages 2 and 4. This included a measure of joint attention, which involves following another person’s gaze to initiate a shared experience.

Compared to control children, those children with autism were more likely to have amygdala enlargement both at age 2 and age 4. “These findings suggest that, consistent with a previous report of head circumference growth rates in autism and studies of amygdala volume in childhood, amygdala growth trajectories are accelerated before age 2 years in autism and remain enlarged during early childhood,” the authors write. “Moreover, amygdala enlargement in 2-year-old children with autism is disproportionate to overall brain enlargement and remains disproportionate at age 4 years.”

Among children with autism, amygdala volume was associated with an increase in joint attention ability at age 4. This suggests that alterations to this brain structure may be associated with a core deficit of autism, the authors note.

“The amygdala plays a critical role in early-stage processing of facial expression and in alerting cortical areas to the emotional significance of an event,” the authors write. “Amygdala disturbances early in development, therefore, disrupt the appropriate assignment of emotional significance to faces and social interaction.” Continued follow-up of research participants, now under way, will help determine whether amygdala growth rates continue at the same rate or undergo another period of accelerated growth or a period of decelerated growth in autistic children after age 4.

The study was funded by grants from the National Institutes of Health.


Neutralizing Tumor Growth in Embryonic Stem Cell Therapy
Researchers at the Hebrew University of Jerusalem have discovered a method that could potentially eliminate the risk of tumor formation when human embryonic stem cells are used. Their work paves the way for further progress in the promising field of stem cell therapy

Prof. Nissim Benvenisty (seated, center) with students and team members of the Stem Cell Unit and the Embryonic Stem Cell Bank, established at the Faculty of Science with the support of the Legacy Heritage Fund

Human embryonic stem cells are theoretically capable of differentiating into all cells of the mature human body (and are hence defined as "pluripotent"). This ability, along with their capacity to remain undifferentiated indefinitely in culture, makes regenerative medicine using human embryonic stem cells a potentially unprecedented tool for the treatment of various diseases, including diabetes, Parkinson’s disease and heart failure.

A major drawback to the use of stem cells, however, remains the demonstrated tendency of such cells to grow into a specific kind of tumor, called a teratoma, when they are implanted into mice in laboratory experiments. It is assumed that this tumorigenic feature will also be manifested upon transplantation into human patients. The development of tumors from embryonic stem cells is especially puzzling given that these cells start out as completely normal cells.

A team of researchers at the Stem Cell Unit in the Department of Genetics at the Silberman Institute of Life Sciences at the Hebrew University has been working on various approaches to deal with this problem.

In their latest project, the researchers analyzed the genetic basis of tumor formation from human embryonic stem cells and identified a key gene that is involved in this unique tumorigenicity. This gene, called survivin, is expressed in most cancers and in early stage embryos, but it is almost completely absent from mature normal tissues.

The survivin gene is especially highly expressed in undifferentiated human embryonic stem cells and in their derived tumors. By neutralizing the activity of survivin in the undifferentiated cells as well as in the tumors, the researchers were able to initiate programmed cell death (apoptosis) in those cells.

Inhibiting this gene just before or after transplantation of the cells could minimize the chances of tumor formation, but the researchers caution that a combination of strategies may be needed to address the major safety concerns regarding tumor formation by human embryonic stem cells.

A report on this latest project of the Hebrew University stem cell researchers appeared in the online edition of Nature Biotechnology. The researchers are headed by Nissim Benvenisty, who is the Herbert Cohn Professor of Cancer Research, and Ph.D. student Barak Blum. Others working on the project are Ph.D. student Ori Bar-Nur and laboratory technician Tamar Golan-Lev.


TUESDAY May 12, 2009---------------------------News Archive

Equality of the Sexes? Not Always When it Comes to Biology
MUHC researchers demonstrate that estrogen renders the innate immune system of women more powerful than that of men.

When it comes to immunity, men may not have been dealt an equal hand. The latest study by Dr. Maya Saleh, of the Research Institute of the McGill University Health Centre and McGill University, shows that women have a more powerful immune system than men. In fact, the production of estrogen by females could have a beneficial effect on the innate inflammatory response against bacterial pathogens. These surprising results will be published today in the Proceedings of the National Academy of Sciences.

More specifically, estrogen naturally produced in women seems to block the production of an enzyme called Caspase-12, which itself blocks the inflammatory process. The presence of estrogen would therefore have a beneficial effect on innate immunity, which represents the body's first line of defence against pathogenic organisms. "These results demonstrate that women have a more powerful inflammatory response than men," said Dr. Saleh.

This study was conducted on mice that lack the Caspase-12 gene, meaning that the mice were extremely resistant to infection. The human Caspase-12 gene was implanted in a group of male and female mice, yet only the males became more prone to infection. "We were very surprised by these results, and we determined that the estrogen produced by the female mice blocked the expression of the human Caspase-12 gene," explained Dr. Saleh. "We were also able to locate where the estrogen receptor binds on the gene in order to block its expression, which indicates that the hormone exerts direct action in this case."

Since these experiments were conducted using a human gene, the researchers consider these results to be applicable to humans. This feature of the female innate immune system might have evolved to better protect women's reproductive role.

The positive effect of natural estrogen on our resistance to infection is also exhibited with synthetic hormones such as 17-beta-estradiol. This finding might therefore open the door to new therapeutic applications that reinforce the immune system, but a question remains: will men be amenable to the idea of being treated with an exclusively female hormone?

Study Reveals Conflict between Doctors, Midwives over Homebirth
Two Oregon State University researchers have uncovered a pattern of distrust – and sometimes outright antagonism – among physicians at hospitals and midwives who are transporting their homebirth clients to the hospital because of complications

Oregon State University assistant professor Melissa Cheyney and doctoral student Courtney Everson said their work revealed an ongoing conflict between physicians and midwives, similar to that found in other studies of the dynamics between the two groups across the country.

The pair recently examined birth records in Oregon’s Jackson County from 1998 through 2003, a period when that county saw higher-than-expected rates of prematurity and low birth weight in some populations. The researchers wanted to assess whether those rates were linked to midwife-attended homebirths.

The findings revealed that assisted homebirths did not appear to be contributing to the lower-than-average health outcomes and, in fact, that the homebirths documented all had successful outcomes. But even more importantly to Cheyney, discussions with doctors and midwives uncovered a deep mistrust between the two groups of birthing providers, with doctors expressing the firm belief that only hospital births are safe, while midwives felt marginalized, mocked and put on the defensive when in contact with physicians.

“We’ve been getting insight into their world view, and it’s been quite illuminating,” Cheyney said.
Cheyney, who is a practicing midwife in addition to being an assistant professor of medical anthropology and reproductive biology, said she was surprised that physicians, when presented with scientifically conducted research that indicates homebirths do not increase infant mortality rates, still refuse to believe that births outside of the hospital are safe.

“Medicine is a social construct, and it’s heavily politicized,” she said.

She is working with Lane County obstetrician Dr. Paul Qualtere-Burcher to draft two sets of guidelines: one, based in large part on Cheyney’s research, that would help midwives and their clients decide when they need to seek medical help, and another that would ask physicians to recognize midwives as legitimate caregivers.

Qualtere-Burcher said creating an open channel of communication isn’t easy.

“I do get some pushback from physician friends who say that I’m too open and too supportive,” he said. “My answer, to quote (President) Obama, is that dialogue is always a good idea.”

Qualtere-Burcher said he believes that if midwives felt more comfortable contacting physicians with medical questions or concerns, there would be a greater chance that women would get medical help when they needed it.

“Treat (midwives) with respect, as colleagues, and they’ll not be afraid to call,” he said.

Qualtere-Burcher believes it would be wonderful, though utopian, for all midwives to agree to seek medical assistance under the guidelines they’re proposing, and for all physicians to learn to deal more collegially with midwives. Even so, he hopes that if a small group on each side agrees to the plan, it will provide further evidence that a stronger relationship between physicians and midwives leads to better outcomes for mothers and infants.

Last year the American Medical Association passed Resolution 205, which states: “the safest setting for labor, delivery and the immediate post-partum period is in the hospital, or a birthing center within a hospital complex…” The resolution was passed in direct response to media attention on home births, the AMA stated.

What is interesting, Cheyney points out, is that 99 percent of American births occur in the hospital, but the United States has one of the highest infant mortality rates of any developed country, with 6.3 deaths per 1,000 babies born. Meanwhile, the Netherlands, where a third of deliveries occur in the home with the assistance of midwives, has a lower rate of 4.73 deaths per 1,000.

One of the biggest problems Cheyney sees is that physicians only come into contact with midwives when something has gone wrong with the homebirth, and the patient has been transported to the hospital for care. There are a number of reasons why this interaction often is tension-filled and unpleasant for both sides, she says.

First is the assumption that homebirth must be dangerous, because the patient they’re seeing has had to be transported to the hospital. Secondly, the physician is now taking on the risk of caring for a patient who is unknown to them, and who has a medical chart provided by a midwife which may not include the kind of information the physician is used to receiving.

And because the midwife is often feeling defensive and upset, Cheyney said, the contact between her and the physician can often be tense and unproductive. Meanwhile, the patient, whose intention was not to have a hospital birth, is already feeling upset at the change in birth plan, and is now watching her care provider come into conflict with the stranger who is about to deliver her baby.

“It’s an extremely tension-fraught encounter,” Cheyney said, “and something needs to be done to address it.” As homebirths increase in popularity, she added, these encounters are bound to increase and a plan needs to be in place so that doctors and midwives know what protocol to follow.

“We’re having a meeting in early May to propose a draft for a model of collaborative care that might be the first of its kind in the United States,” Cheyney said.

Cheyney is also pushing to get hospitals and the state records division to better track homebirths. The department of vital records had no way to indicate whether a birth occurred at home until 2008, and without being able to pull data, Cheyney said it’s hard to explore the nature of home birth in Oregon. She’s also working on education programs for midwives in rural areas, including a cultural competency piece as demographics in Oregon continue to change.
The research was funded by Oregon State University's Department of Anthropology Summer Writing Fellowship, the Center for the Study of Women and Society, and the Stanton Women’s Health Fellowship.

Toddlers with Autism More Likely to have Enlarged Amygdala
Toddlers with autism appear more likely to have an enlarged amygdala, a brain area associated with functions such as the processing of faces and emotion, a study by University of North Carolina at Chapel Hill School of Medicine researchers has found

In addition, this brain abnormality appears to be associated with the ability to share attention with others, a fundamental ability thought to predict later social and language function in children with autism.

These findings are published in the May 2009 issue of Archives of General Psychiatry. Lead author of the article is Matthew W. Mosconi, Ph.D. of the UNC Neurodevelopmental Disorders Research Center. Joseph Piven, M.D., director of both the Neurodevelopmental Disorders Research Center and the Carolina Institute for Developmental Disabilities at UNC, is the study’s senior and corresponding author.

“Autism is a complex neurodevelopmental disorder likely involving multiple brain systems,” Piven said. “Converging evidence from magnetic resonance imaging, head circumference and postmortem studies suggests that brain volume enlargement is a characteristic feature of autism, with its onset most likely occurring in the latter part of the first year of life.” Based both on its function and studies of changes in its structure, the amygdala has been identified as a brain area potentially associated with autism.

Mosconi, Piven and colleagues conducted a magnetic resonance imaging study involving 50 autistic children and 33 control children. Participating children underwent brain scans along with testing of certain behavioral features of autism at ages 2 and 4. This included a measure of joint attention, which involves following another person’s gaze to initiate a shared experience.

Compared to control children, those children with autism were more likely to have amygdala enlargement both at age 2 and age 4. “These findings suggest that, consistent with a previous report of head circumference growth rates in autism and studies of amygdala volume in childhood, amygdala growth trajectories are accelerated before age 2 years in autism and remain enlarged during early childhood,” the authors write. “Moreover, amygdala enlargement in 2-year-old children with autism is disproportionate to overall brain enlargement and remains disproportionate at age 4 years.”
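
To illustrate what "disproportionate to overall brain enlargement" means in practice, the sketch below adjusts amygdala volume for total brain volume before comparing groups. Everything in it -- the simulated volumes and the simple residual-based comparison -- is a hypothetical illustration (only the group sizes of 50 and 33 come from the study); it is not the statistical model used in the published analysis.

# Hypothetical illustration of "disproportionate" amygdala enlargement:
# adjust amygdala volume for total brain volume, then compare groups.
# All volumes are simulated; this is not the study's actual analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated total brain and amygdala volumes (cm^3); values are illustrative only
brain_autism = rng.normal(1150, 80, 50)    # 50 children with autism
brain_control = rng.normal(1100, 80, 33)   # 33 control children
amyg_autism = 0.002 * brain_autism + rng.normal(0.30, 0.05, 50)
amyg_control = 0.002 * brain_control + rng.normal(0.20, 0.05, 33)

# Regress amygdala volume on total brain volume across all children
brain = np.concatenate([brain_autism, brain_control])
amyg = np.concatenate([amyg_autism, amyg_control])
slope, intercept, *_ = stats.linregress(brain, amyg)

# Residuals capture amygdala volume not explained by overall brain size;
# a group difference here corresponds to "disproportionate" enlargement
resid = amyg - (intercept + slope * brain)
t, p = stats.ttest_ind(resid[:50], resid[50:])
print(f"adjusted amygdala volume difference: t = {t:.2f}, p = {p:.4f}")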

Among children with autism, amygdala volume was associated with an increase in joint attention ability at age 4. This suggests that alterations to this brain structure may be associated with a core deficit of autism, the authors note.

“The amygdala plays a critical role in early-stage processing of facial expression and in alerting cortical areas to the emotional significance of an event,” the authors write. “Amygdala disturbances early in development, therefore, disrupt the appropriate assignment of emotional significance to faces and social interaction.” Continued follow-up of research participants, now under way, will help determine whether amygdala growth rates continue at the same rate or undergo another period of accelerated growth or a period of decelerated growth in autistic children after age 4.

The study was funded by grants from the National Institutes of Health.


New Tissue Scaffold Regrows Cartilage and Bone
MIT engineers have built a new tissue scaffold that can stimulate bone and cartilage growth when transplanted into the knees and other joints

The scaffold could offer a potential new treatment for sports injuries and other cartilage damage, such as arthritis, says Lorna Gibson, the Matoula S. Salapatas Professor of Materials Science and Engineering and co-leader of the research team with Professor William Bonfield of Cambridge University.

"If someone had a damaged region in the cartilage, you could remove the cartilage and the bone below it and put our scaffold in the hole," said Gibson. The researchers describe their scaffold in a recent series of articles in the Journal of Biomedical Materials Research.

The technology has been licensed to Orthomimetics, a British company launched by one of Gibson's collaborators, Andrew Lynn of Cambridge University. The company recently received approval to start clinical trials in Europe.

The scaffold has two layers, one that mimics bone and one that mimics cartilage. When implanted into a joint, the scaffold can stimulate mesenchymal stem cells in the bone marrow to produce new bone and cartilage. The technology is currently limited to small defects, using scaffolds roughly 8 mm in diameter.

The researchers demonstrated the scaffold's effectiveness in a 16-week study involving goats. In that study, the scaffold successfully stimulated bone and cartilage growth after being implanted in the goats' knees.

The project, a collaboration enabled by the Cambridge-MIT Institute, began when the team decided to build a scaffold for bone growth. They started with an existing method to produce a skin scaffold, made of collagen (from bovine tendon) and glycosaminoglycan, a long polysaccharide chain. To mimic the structure of bone, they developed a technique to mineralize the collagen scaffold by adding sources of calcium and phosphate.

Once that was done, the team decided to try to create a two-layer scaffold to regenerate both bone and cartilage (known as an osteochondral scaffold). Their method produces two layers with a gradual transition between the bone and cartilage layers.

"We tried to design it so it's similar to the transition in the body. That's one of the unique things about it," said Gibson.

There are currently a few different ways to treat cartilage injuries, including stimulating the bone marrow to release stem cells by drilling a hole through the cartilage into the bone; transplanting cartilage and the underlying bone from another, less highly loaded part of the joint; or removing cartilage cells from the body, stimulating them to grow in the lab and re-implanting them.

The new scaffold could offer a more effective, less expensive, easier and less painful substitute for those therapies, said Gibson.

MIT collaborators on the project are Professor Ioannis Yannas, of mechanical engineering and biological engineering; Myron Spector of the Harvard-MIT Division of Health Sciences and Technology (HST); Biraja Kanungo, a graduate student in materials science and engineering; recent MIT PhD recipients Brendan Harley (now at the University of Illinois) and Scott Vickers; and Zachary Wissner-Gross, a graduate student in HST. Dr. Hu-Ping Hsu of Harvard Medical School also worked on the project.

Cambridge University researchers involved in the project are Professor William Bonfield, Andrew Lynn, now CEO of Orthomimetics, Dr. Neil Rushton, Serena Best and Ruth Cameron.

The research was funded by the Cambridge-MIT Institute, the Whitaker-MIT Health Science Fund, Universities UK, Cambridge Commonwealth Trust and St. John's College Cambridge.


MONDAY May 11, 2009---------------------------News Archive

Estrogen Controls How the Brain Processes Sound
Scientists at the University of Rochester have discovered that the hormone estrogen plays a pivotal role in how the brain processes sounds.

The findings, published in today's issue of The Journal of Neuroscience, show for the first time that a sex hormone can directly affect auditory function, and point toward the possibility that estrogen controls other types of sensory processing as well. Understanding how estrogen changes the brain's response to sound, say the authors, might open the door to new ways of treating hearing deficiencies.

"We've discovered estrogen doing something totally unexpected," says Raphael Pinaud, assistant professor of brain and cognitive sciences at the University of Rochester and lead author of the study. "We show that estrogen plays a central role in how the brain extracts and interprets auditory information. It does this on a scale of milliseconds in neurons, as opposed to days, months or even years in which estrogen is more commonly known to affect an organism."

Previous studies have hinted at a connection between estrogen and hearing in women who have low estrogen, such as often occurs after menopause, says Pinaud. No one understood, however, that estrogen was playing such a direct role in determining auditory functions in the brain, he says. "Now it is clear that estrogen is a key molecule carrying brain signals, and that the right balance of hormone levels in men and women is important for reasons beyond its role as a sex hormone," says Pinaud.

Pinaud, along with Liisa Tremere, a research assistant professor of brain and cognitive sciences, and Jin Jeong, a postdoctoral fellow in Pinaud's laboratory, demonstrated that increasing estrogen levels in brain regions that process auditory information caused heightened sensitivity of sound-processing neurons, which encoded more complex and subtle features of the sound stimulus. Perhaps more surprising, says Pinaud, is that by blocking either the actions of estrogen directly, or preventing brain cells from producing estrogen within auditory centers, the signaling that is necessary for the brain to process sounds essentially shuts down. Pinaud's team also shows that estrogen is required to activate genes that instruct the brain to lay down memories of those sounds.

"It turns out that estrogen plays a dual role," says Pinaud. "It modulates the gain of auditory neurons instantaneously, and it initiates cellular processes that activate genes that are involved in learning and memory formation."

Pinaud and his group stumbled upon these findings while investigating how estrogen may help change neuronal circuits to form memories of familiar songs in a type of bird typically used to understand the biology of vocal communication. "Based on our findings we must now see estrogen as a central regulator of hearing," he says. "It both determines how carefully a sound must be processed, and activates intracellular processes that occur deep within the cell to form memories of sound experiences."

Pinaud and his team will continue their work investigating how neurons adapt their functionality when encountering new sensory information and how these changes may ultimately enable the formation of memories. They also will continue exploring the specific mechanisms by which estrogen might impact these processes.

"While we are currently conducting further experiments to confirm it, we believe that our findings extrapolate to other sensory systems and vertebrate species," says Pinaud. "If this is the case, we are on the way to showing that estrogen is a key molecule for processing information from all the senses."

When Your Brain Doesn't Know What Your Body Is Doing
As anyone with a busy schedule can attest, intending to do something and actually doing it are two different things

But your brain doesn't make such neat distinctions, according to a new study. Researchers have found that when you wave at someone, for example, the intention to move your hand creates the feeling of it having moved, not the physical motion itself. The discovery sheds new light on how the brain tracks what the body does.

Although neuroscience has revealed much about how the brain processes experiences, the origin of intention has remained a mystery. Past studies linked it to the posterior parietal cortex and the premotor cortex, two regions of the brain also associated with motion and awareness of movement, but each region's role and how they work together remained unclear.

Neuroscientist Angela Sirigu of the Centre de Neuroscience Cognitive in Bron, France, became intrigued by the posterior parietal's role in willed actions when working with patients who had injured that part of their brains. The patients couldn't define when they began to want to move, says Sirigu, because they couldn't monitor their own intention.

Sirigu joined researchers at the University of Lyon in France and neurosurgeon Carmine Mottolese of Lyon's Hôpital Pierre Wertheimer to take advantage of a common operating room practice. As part of their preparation for surgery, neurosurgeons sometimes electrically stimulate the brains of their patients, who are awake under local anesthetic, to map the brain and minimize surgical complications. During brain tumor surgery on seven patients, Mottolese stimulated their frontal, parietal, and temporal brain regions, and Sirigu's team asked the patients to describe what they felt.

After stimulation of the parietal cortex, patients reported "wanting" to move their arms, legs, lips, or chest but didn't actually move them. When Mottolese stimulated the same region more intensely, patients believed that they had moved the body parts they'd intended to move even though they hadn't. Stimulating the premotor cortex, on the other hand, resulted in real movements, but the patients were never conscious of their motions.

The results, reported in the May 8 issue of Science, suggest that "we need intention to be aware of what we are doing," says Sirigu. The brain's intention and its prediction of what will result from carrying out that intention create our experience of having moved, she says.

"I think this study is extremely exciting," says Patrick Haggard, a cognitive neuroscientist at University College London in the United Kingdom. "It's quite encouraging to think that there could be a neuroscience of volition," he says. "And this idea of volition is about as central to our nature as it gets."

Iron Deficiency in Womb May Delay Auditory Nervous System in Preemies
Iron plays a large role in brain development in the womb, and new University of Rochester Medical Center research shows an iron deficiency may delay the development of the auditory nervous system in preemies

This delay could affect babies' ability to process sound, which is critical for later language development in early childhood.

The study evaluated 80 infants over 18 months, testing their cord blood for iron levels and using a non-invasive tool -- auditory brainstem-evoked response (ABR) -- to measure the maturity of the brain's auditory nervous system soon after birth. The study found that infants with low iron levels in their cord blood showed abnormal maturation of the auditory system compared to infants with normal cord iron levels.
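
As a rough sketch of this kind of two-group comparison, one might contrast an ABR latency measure between infants with low and normal cord-blood iron. The latency values, group sizes and choice of a simple t-test below are hypothetical and are not taken from the study:

# Hypothetical comparison of auditory brainstem response (ABR) maturity between
# infants grouped by cord-blood iron status; all numbers are illustrative only
from scipy import stats

# ABR wave V latency in milliseconds (longer latency ~ less mature auditory pathway)
latency_low_iron = [7.4, 7.6, 7.1, 7.8, 7.5, 7.3]
latency_normal_iron = [6.8, 6.9, 7.0, 6.7, 7.1, 6.9]

t, p = stats.ttest_ind(latency_low_iron, latency_normal_iron)
print(f"t = {t:.2f}, p = {p:.4f}")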

"Sound isn't transmitted as well through the immature auditory pathway in the brains of premature babies who are deficient in iron as compared to premature babies who have enough iron," said Sanjiv Amin, M.D., associate professor of Pediatrics at the University of Rochester Medical Center and author of the abstract presented today at the Pediatric Academic Society meeting in Baltimore. "We suspect that if the auditory neural system is affected during developmental phase, then other parts of the brain could also be affected in the presence of iron deficiency."

As many as 20 to 30 percent of pregnant women of lower socio-economic status are iron deficient. Iron deficiency in a pregnant woman can cause anemia, a condition in which there are not enough red blood cells to carry oxygen around the body. Anemia can cause a range of problems in pregnancy, from exhaustion to preterm labor and low birth weight. But physicians didn't know that an iron deficiency in a fetus may also delay auditory neural maturation, which could lead to language problems.

"We are concerned by these findings because of its potential implications for language development," Amin said. "More study is needed to fully understand what this delay in maturation means. This finding at least underscores an already understood need to monitor iron levels in pregnant women."

Does Mom Know When Enough is Enough?
As the childhood obesity epidemic in the United States continues, researchers are examining whether early parent and child behaviors contribute to the problem

A study from the Department of Nutritional Sciences, Rutgers University, published in the May/June 2009 issue of the Journal of Nutrition Education and Behavior reports that mothers who miss signs of satiety in their infants tend to overfeed them, leading to excess weight gains during the 6 month to 1 year period.

Ninety-six low-income black and Hispanic mothers, who chose to formula feed exclusively, were enrolled in the study. Data was collected during an initial interview and three home visits at 3, 6, and 12 months. During the home visits, feedings were observed, the mothers were interviewed, and the child's weight was measured. Feeding diaries were also checked for omissions or clarifications.

A number of characteristics that predicted infant weight gain from birth to 3 months were included in the analysis. These were birth weight, gender, race/ethnicity, maternal age, education, country of origin, body mass index (BMI) before pregnancy, and weight gain during pregnancy. For the 3 to 6 month period, birth weight, maternal BMI, infant weight gain from birth to 3 months, infant length gain from birth to 3 months, the estimated number of feeds per day, the month that solid food was introduced, and the mothers' sensitivity to the infants' signals at 3 months were included. And, finally, for the 6 to 12 month period, birth weight, maternal BMI, infant weight gain from 3 to 6 months, infant length gain from 3 to 6 months, maternal sensitivity to infant signals at 6 months, and the estimated number of feeds/day at 6 months were entered as the independent variables.

None of these variables served to predict infant weight gain over the first 3 months, or similarly, from 3 to 6 months. However, the number of feeds per day at 6 months approached significance in predicting weight gain from 6 to 12 months, and maternal sensitivity to the infants' signals reached predictive significance, but in a negative direction—indicating that mothers who were less sensitive to satiety cues had infants who gained more weight.
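
A minimal sketch of the kind of multivariable model described above, for the 6 to 12 month period, might look like the following. The simulated data, column names and ordinary least squares model are assumptions made for illustration; they do not reproduce the published analysis:

# Hypothetical sketch of regressing 6-to-12-month weight gain on the predictors
# described above. The simulated data, column names and model choice are
# assumptions for illustration; they do not reproduce the published analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 96  # the study enrolled 96 mother-infant pairs

df = pd.DataFrame({
    "birth_weight": rng.normal(3.3, 0.5, n),        # kg
    "maternal_bmi": rng.normal(27, 5, n),
    "weight_gain_3_6m": rng.normal(2.0, 0.4, n),     # kg
    "length_gain_3_6m": rng.normal(6.0, 1.0, n),     # cm
    "sensitivity_6m": rng.normal(3.0, 0.8, n),       # Feeding Scale score (simulated)
    "feeds_per_day_6m": rng.poisson(5, n),
})
# Simulated outcome: less sensitive mothers -> larger weight gain (negative coefficient)
df["weight_gain_6_12m"] = (
    2.5 - 0.3 * df["sensitivity_6m"] + 0.05 * df["feeds_per_day_6m"]
    + rng.normal(0, 0.3, n)
)

model = smf.ols(
    "weight_gain_6_12m ~ birth_weight + maternal_bmi + weight_gain_3_6m"
    " + length_gain_3_6m + sensitivity_6m + feeds_per_day_6m",
    data=df,
).fit()
print(model.summary())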

Writing in the article, John Worobey, PhD; Maria Islas Lopez, MA; and Daniel J. Hoffman, PhD, state, "More frequent feedings, particularly with formula, are an easy culprit on which to assign blame. But maternal sensitivity to the infant's feeding state, as reflected by the Feeding Scale scores, suggests that an unwillingness to slow the pace of feeding or terminate the feeding when the infant shows satiation cues may be overriding the infant's ability to self-regulate its intake."

However, the researchers warn that, "To use this knowledge to better inform low-income/educated mothers, indeed, mothers of any background who have settled on a feeding method, could pose a daunting challenge. Feeding an infant is a primal behavior, and to suggest to a new mother that she is feeding her infant too often, too much, or worse yet, is not very good at reading her infant's signals, would require an extremely skilled nurse or social worker. Giving counsel after watching a mother feed her infant might be seen as threatening, or at the very least meddling, and just pointing it out could be construed as an accusation of 'poor mothering.'"














