ScienceDaily (Nov. 18, 2011) — Scientists at the University of California, San Diego School of Medicine and UC San Diego Moores Cancer Center, in collaboration with colleagues in Boston and South Korea, say they have identified a novel gene mutation that causes at least one form of glioblastoma (GBM), the most common type of malignant brain tumor.


The findings are reported in the online edition of the journal Cancer Research.


Perhaps more importantly, the researchers found that two drugs already being used to treat other forms of cancer effectively prolonged the survival of mice modeling this particular form of GBM. That could be good news for at least some GBM patients. More than 9,000 new cases of the disease are diagnosed each year in the United States, and effective treatments are limited. The tumors are aggressive and resistant to current therapies, such as surgery, radiation and chemotherapy. Median survival for newly diagnosed GBM patients is just 14 months.


Past studies have identified epidermal growth factor receptor (EGFR) as one of the genes most commonly altered in GBM, though the cause or causes of the alteration are not known. The research team, led by scientists at the Dana-Farber Cancer Institute in Boston, analyzed the GBM genomic database, ultimately identifying and characterizing an exon 27 deletion mutation within the EGFR carboxyl-terminus domain (CTD). An exon is a segment of a DNA or RNA molecule containing information coding for a protein or peptide sequence.


"The deletion mutant seems to possess a novel mechanism for inducing cellular transformation," said Frank Furnari, PhD, associate professor of medicine at the UC San Diego School of Medicine and an associate investigator at the San Diego branch of the Ludwig Institute for Cancer Research.


The researchers determined that the previously unknown EGFR CTD deletion mutant induced cellular transformation both in vitro and in vivo, resulting in GBM in the animals. They then turned to testing a pair of approved drugs that target EGFR: a monoclonal antibody called cetuximab and a small-molecule inhibitor called erlotinib.


Cetuximab, marketed under the name Erbitux, is currently approved for use in treating metastatic colorectal cancer and squamous cell carcinoma of the head and neck. Erlotinib, marketed under the name Tarceva, is used to treat lung and pancreatic cancers.


Both drugs were found to effectively impair the tumor-forming abilities of oncogenic EGFR CTD deletion mutants. Cetuximab, in particular, prolonged survival of mice with the deletion mutants when compared to untreated control mice.


However, neither cetuximab nor erlotinib is an unabashed success story. The drugs work by binding to sites on the EGFR protein and inhibiting activation, but they are not effective in all cancer patients and produce some adverse side effects, such as rashes and diarrhea.


But Santosh Kesari, MD, PhD, Director of Neuro-Oncology at UC San Diego Moores Cancer Center and the UCSD Department of Neurosciences, and co-corresponding author of the study, said the new findings point to a more selective, effective use of the drugs in some patients with GBM.


"In the past when we treated brain cancer patients with these drugs, the response rate was very small," Kesari said. "What we now show is that the tumors with CTD mutations respond best to these EGFR targeted agents. If we knew this beforehand, we might have been able to select patients most likely to respond to these agents. We are now trying to put together a prospective clinical trial to prove this. We would select only patients with these tumor mutations and treat them. This kind of research gets us closer to identifying genetic subtypes, to doing better biomarker-based clinical trials, and to personalizing treatments in brain cancers."


"This is a great example of personalized medicine in action," said Webster Cavenee, PhD, director of the Ludwig Institute at UC San Diego. "UCSD has made a concerted effort in the past few years to develop a first-class brain tumor research and therapy group that includes adult neuro-oncology, neurosurgery, neuropathology and their pediatric equivalents to join with internationally-renowned brain tumor research. This is making UCSD a destination for the very best in brain tumor management."


Co-authors of the study are Jeonghee Cho, Department of Medical Oncology, Dana-Farber Cancer Institute, Center for Cancer Genome Discovery, Boston, MA, Genomic Analysis Center, Samsung Cancer Research Institute, Seoul, Republic of Korea; Sandra Pastorino and Ying S. Chao, Department of Neurosciences, Moores Cancer Center, UC San Diego; Qing Zeng and Xiaoyin Xu, Department of Radiology, Brigham and Women's Hospital, Boston; William Johnson, Dana-Farber Cancer Institute, Center for Cancer Genome Discovery, Boston; Scott Vandenberg, Department of Pathology, UC San Diego; Roel Verhaak, Amit Dutt, Derek Chiang and Yuki Yuza, Department of Medical Oncology, Dana-Farber Cancer Institute and Broad Institute of MIT and Harvard; Andrew Cherniack and Robert C. Onofrio, Broad Institute of MIT and Harvard; Hideo Watanabe and Matthew Meyerson, Department of Medical Oncology, Dana-Farber Cancer Institute, Center for Cancer Genome Discovery and Broad Institute of MIT and Harvard; Jihyun Kwon, Genomic Analysis Center, Samsung Cancer Research Institute.


Funding for this research came, in part, from the National Institutes of Health, the Sontag Foundation Distinguished Scientist Award, the James S. McDonnell Foundation and the Samsung Cancer Research Institute.


ScienceDaily (Nov. 18, 2011) — Researchers at the Naval Research Laboratory Marine Meteorology Division (MMD), Monterey, Calif., have developed the Coupled Ocean/Atmosphere Mesoscale Prediction System Tropical Cyclone (COAMPS-TC™) model, achieving a significant research milestone in predictions of tropical cyclone intensity and structure.


While predictions of the paths, or tracks, of hurricanes, more generally referred to as tropical cyclones (TC), have steadily improved over the last few decades, improvements in predictions of storm intensity have proven much more difficult.


"Over the past two years, the COAMPS-TC model has shown to be the most accurate emerging research model for predicting tropical cyclone intensity," said Dr. Jim Doyle, research meteorologist, NRL Monterey. "There is no better example of these difficult challenges than the intensity predictions for Hurricane Irene this past August."


During a real-time experimental demonstration for Hurricane Irene, COAMPS-TC produced very accurate intensity predictions: its errors averaged six knots over a series of three-day forecasts, a clear improvement over the official National Hurricane Center (NHC) forecast and other operational models, whose errors ranged from 20 to 30 knots.
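To make the six-knot figure concrete, here is a minimal sketch of how an average intensity error over a forecast series is computed. The forecast and observed intensities below are invented placeholders, not Hurricane Irene data or COAMPS-TC output.

```python
# Mean absolute intensity error for a series of forecasts, in knots.
# All values are hypothetical placeholders for illustration only.
forecast_kt = [105, 100, 95, 90, 85]   # forecast peak winds (kt)
observed_kt = [100, 108, 90, 97, 80]   # verifying observations (kt)

errors = [abs(f - o) for f, o in zip(forecast_kt, observed_kt)]
mae = sum(errors) / len(errors)
print(f"Mean absolute intensity error: {mae:.1f} kt")  # 6.0 kt here
```

For scale, an error of 20 to 30 knots is on the order of a full Saffir-Simpson category, which is why reducing average errors to around six knots matters for forecasting.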


The successful predictions have demonstrated that Numerical Weather Prediction (NWP) models can outperform operational statistical-dynamic models that are based on climatology and previous storm behavior. It is further believed that NWP models, which explicitly predict the location, dynamics and intensity of a storm, will eventually provide the most promising approach to accurate TC intensity and structure prediction.


With further advances in the methodologies used for vortex initialization, data assimilation and the representation of physical processes, COAMPS-TC is expected to become fully operational in 2013 at the Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC) in Monterey. Considerable advancements have already been made to several components of the modeling system, including the assimilation of conventional and special TC synthetic observations, vortex initialization, and the representation of key TC physical processes such as air-sea fluxes, clouds and convection.


The COAMPS-TC project will potentially signal a paradigm shift in TC forecasting and is already making a strong impression on the forecasting community. Focusing on the development and transition of a fully coupled air-ocean-wave prediction system, the COAMPS-TC model includes nonhydrostatic atmospheric dynamics, multiple nested moving grids that follow the center of the storm and improved boundary layer and cloud physical parameterizations.


COAMPS-TC was first tested in real time in support of two field campaigns sponsored by the Office of Naval Research (ONR): Tropical Cyclone Structure-08 (TCS-08), conducted as part of The Observing System Research and Predictability Experiment (THORPEX) Pacific Asian Regional Campaign (T-PARC) in 2008, and the Impact of Typhoons on the Ocean in the Pacific (ITOP) campaign in 2010, both of which took place in the Western Pacific. Additionally, COAMPS-TC advancements and real-time demonstrations in the Eastern Pacific and Western Atlantic have taken place through collaboration with the National Oceanic and Atmospheric Administration (NOAA) as part of the Hurricane Forecast Improvement Project (HFIP), a community-wide effort focused on improving operational hurricane prediction.


In June 2011, COAMPS-TC was one of nine worldwide winners of the inaugural High Performance Computing (HPC) Excellence Award presented at the ISC-11 International Supercomputing Conference in Hamburg, Germany, an award presented annually to recognize noteworthy achievements by users of HPC technologies. COAMPS-TC was cited for achieving 'a significantly improved model for tropical cyclone forecasting.' Its development benefited significantly from Department of Defense HPC Modernization Program Office (HPCMO) computational assets at the Navy Defense Supercomputing Resource Center (DSRC) at Mississippi's Stennis Space Center.


Increasingly sophisticated developmental versions of COAMPS-TC will continue to be demonstrated in real time in support of the Joint Typhoon Warning Center and the National Hurricane Center. A key additional enhancement will be a fully coupled ocean-atmosphere version in which the NRL Coastal Ocean Model (NCOM) and Wave Watch III (WWIII) will provide the ocean circulation and wave components, respectively.


ScienceDaily (Nov. 18, 2011) — Why do people with a hereditary mutation of the red blood pigment hemoglobin (as in sickle-cell anemia, which is prevalent in Africa) not contract severe malaria? Scientists in the group headed by Prof. Michael Lanzer of the Department of Infectious Diseases at Heidelberg University Hospital have now solved this mystery.


A degradation product of the altered hemoglobin provides protection from severe malaria. Within the red blood cells infected by the malaria parasite, it blocks the establishment of a trafficking system used by the parasite's special adhesive proteins -- adhesins -- to access the exterior of the blood cells. As a result, the infected blood cells do not adhere to the vessel walls, as is usually the case for this type of malaria. This means that no dangerous circulatory disorders or neurological complications occur.


The research study has been published in the journal Science, appearing initially online.


Researchers discovered as early as the 1940s that sickle-cell anemia, with its characteristic blood mutation, was particularly prevalent in certain population groups in Africa. These groups also survived malaria tropica, whose course is usually especially virulent. In malaria tropica, the malaria parasites (Plasmodia) enter a person through the bite of an infected Anopheles mosquito. The parasites first multiply in the person's liver cells and then infect the red blood cells (erythrocytes). Once inside the erythrocytes, they divide again and ultimately destroy them. The nearly simultaneous bursting of all infected blood cells causes the characteristic symptoms, which include bouts of fever and anemia.


Adhesins on red blood cells cause circulatory disorders


In patients with malaria tropica, neurological complications such as paralysis, seizures, coma and severe brain damage also frequently occur. This is caused by a peculiarity of the parasite Plasmodium falciparum: it forms special adhesins that reach the cell surface of the infected blood cell. Once there, the adhesins cause the erythrocytes to adhere to the vessel walls, preventing them from being recognized in the spleen as damaged and removed from circulation. This protective mechanism of the parasite causes smaller vessels to close and become inflamed, and prevents, for example, parts of the nervous system from being adequately supplied with oxygen.


In humans with mutated hemoglobin, these complications occur in a weakened form or not at all. "At the cell surface of infected erythrocytes with mutated hemoglobin, there are significantly fewer adhesins of the parasite than in normal red blood cells," explained Prof. Lanzer, Director of the Dept. of Infectious Diseases, Parasitology. "For this reason, we had a closer look at the trafficking system within the host cell." To this end, the team compared the blood cells with normal hemoglobin and two hemoglobin variants (hemoglobin S and hemoglobin C), which occur in around one-fifth of the African population in malaria-infected areas.


Trafficking system of the malaria parasite visualized for the first time


In so doing, the scientists used high-resolution microscopy techniques such as cryoelectron tomography to discover a new transport mechanism. The parasite uses a certain protein (actin) from the cytoskeleton (cellular skeleton) of the erythrocytes for its own trafficking network. "It forms a completely new structure that has nothing in common with the rest of the cytoskeleton," explained Dr. Marek Cyrklaff, group leader at the Dept. of Infectious Diseases, Parasitology and first author of the article. "The vesicles with the adhesins reach the cell surface of the red blood cells directly via these actin filaments."


In erythrocytes with the two hemoglobin variants, by contrast, only short pieces of actin filaments are found, and targeted transport to the surface is not possible. "The entire transport system of the malaria parasite is degenerated in these blood cells," Cyrklaff added. Laboratory tests showed that the hemoglobins themselves were not responsible for this, but rather a degradation product, ferryl hemoglobin. This is an irreversibly damaged, chemically altered hemoglobin that is no longer able to bind oxygen. Hemoglobins S and C are considerably more unstable than normal hemoglobin; as a result, blood cells with these variants contain ten times more ferryl hemoglobin than other erythrocytes. This high concentration destabilizes the actin structure, and it disintegrates.


"With these results, we have now described a molecular mechanism for the first time that explains this hemoglobin variant's protective effect against malaria," Lanzer said.


ScienceDaily (Nov. 18, 2011) — Scientists at Chalmers have succeeded in creating light from vacuum -- observing an effect first predicted over 40 years ago. In an innovative experiment, the scientists have managed to capture some of the photons that are constantly appearing and disappearing in the vacuum.


The results have been published in the journal Nature.


The experiment is based on one of the most counterintuitive, yet most important, principles in quantum mechanics: that a vacuum is by no means empty nothingness. In fact, the vacuum is full of various particles that are continuously fluctuating in and out of existence. They appear, exist for a brief moment and then disappear again. Since their existence is so fleeting, they are usually referred to as virtual particles.


Chalmers scientist Christopher Wilson and his co-workers have succeeded in getting photons to leave their virtual state and become real photons, i.e. measurable light. The physicist Gerald Moore predicted back in 1970 that this should happen if virtual photons are allowed to bounce off a mirror moving at nearly the speed of light. The phenomenon, known as the dynamical Casimir effect, has now been observed for the first time in a brilliant experiment conducted by the Chalmers scientists.


"Since it's not possible to get a mirror to move fast enough, we've developed another method for achieving the same effect," explains Per Delsing, Professor of Experimental Physics at Chalmers. "Instead of varying the physical distance to a mirror, we've varied the electrical distance to an electrical short circuit that acts as a mirror for microwaves."


The "mirror" consists of a quantum electronic component referred to as a SQUID (Superconducting quantum interference device), which is extremely sensitive to magnetic fields. By changing the direction of the magnetic field several billions of times a second the scientists were able to make the "mirror" vibrate at a speed of up to 25 percent of the speed of light.


"The result was that photons appeared in pairs from the vacuum, which we were able to measure in the form of microwave radiation," says Per Delsing. "We were also able to establish that the radiation had precisely the same properties that quantum theory says it should have when photons appear in pairs in this way."


What happens during the experiment is that the "mirror" transfers some of its kinetic energy to virtual photons, which helps them to materialise. According to quantum mechanics, there are many different types of virtual particles in vacuum, as mentioned earlier. Göran Johansson, Associate Professor of Theoretical Physics, explains that the reason why photons appear in the experiment is that they lack mass.


"Relatively little energy is therefore required in order to excite them out of their virtual state. In principle, one could also create other particles from vacuum, such as electrons or protons, but that would require a lot more energy."


The scientists plan to study the photon pairs produced in the experiment in closer detail. The pairs may prove useful in the research field of quantum information, which includes the development of quantum computers.


However, the main value of the experiment is that it increases our understanding of basic physical concepts, such as vacuum fluctuations -- the constant appearance and disappearance of virtual particles in vacuum. It is believed that vacuum fluctuations may have a connection with "dark energy" which drives the accelerated expansion of the universe. The discovery of this acceleration was recognised this year with the awarding of the Nobel Prize in Physics.


ScienceDaily (Nov. 18, 2011) — A team of researchers belonging to the Universitat Politècnica de València's CUINA group has achieved a 50% reduction in the amount of salt in already desalted cod, thus obtaining a final product that preserves all its sensory properties and is particularly suitable for persons with hypertension.


This research has been published in the Journal of Food Engineering.


The key to reducing the amount of salt in cod is to partially replace sodium with potassium after the desalting process. "Once we have desalted the cod, we introduce a piece of it in a solution containing potassium chloride. During this process, a partial exchange of sodium for potassium takes place; it is like a second desalting. Thus, we get a piece of cod containing 50% less sodium than standard desalted cod," says José Manuel Barat, a researcher at the UPV's CUINA group. The fish also retains all its properties of flavour, texture, etc., as shown by the results of several sensory studies conducted in the UPV's laboratories. It also contains enough salt to be stored under refrigeration for as long as needed. So far, this new technique has been applied, and validated, in laboratory tests.
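As a purely illustrative sketch of the idea (a first-order exchange model with invented parameters, not the kinetics reported in the Journal of Food Engineering paper), the sodium remaining in the piece can be modeled as relaxing exponentially toward the level of the surrounding potassium chloride bath:

```python
import math

# Illustrative first-order model of the second "desalting" step:
# sodium in the cod relaxes toward the bath level at rate k while
# potassium takes its place. The rate constant and bath level are
# invented for illustration, not the paper's measured values.

def sodium_remaining(t_hours: float, k_per_hour: float = 0.3,
                     bath_level: float = 0.0) -> float:
    """Fraction of the original sodium left after t hours of soaking."""
    return bath_level + (1.0 - bath_level) * math.exp(-k_per_hour * t_hours)

print(f"After 2 h: {sodium_remaining(2.0):.0%} of original sodium")  # ~55%
t_half = math.log(2) / 0.3
print(f"50% sodium after about {t_half:.1f} h of soaking")           # ~2.3 h
```

In practice the soak would be stopped at about half the original sodium, which, as the researchers note, still leaves enough salt for refrigerated storage.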


This new method proposed by researchers at the UPV's CUINA group responds to an increasingly important demand from the food industry for low-salt products. "With this technique, we open the door to offering a new product both to those consumers who, for medical reasons, must have little salt in their diet, and to the general public, who are advised to reduce their sodium intake. Furthermore, by replacing sodium chloride with potassium chloride we get an even healthier product," says José Manuel Barat.


Researchers at the UPV's CUINA group have extensive experience in the processes of salting and desalting food. They also have several patents, including a method for desalting and preserving fish.


This experience and knowledge were applied to a collaborative project with the fishing industry company Conservas Ubago, which resulted in the commercialization of ready-to-cook refrigerated desalted cod. "Even though it was desalted cod, it still had a certain amount of salt, as some salt is necessary for storing refrigerated cod. Now we have gone a step further and reduced even that sodium content. We have thus laid the ground for the development of a new product, with less sodium and more potassium and with all its properties unaltered, particularly suitable for low-sodium diets," said José Manuel Barat.


ScienceDaily (Nov. 18, 2011) — In the fast-paced world of health care, doctors are often pressed for time during patient visits. Researchers at the University of Missouri developed a tool that allows doctors to view electronic information about patients' health conditions related to diabetes on a single computer screen. A new study shows that this tool, the diabetes dashboard, saves time, improves accuracy and enhances patient care.


The diabetes dashboard provides information about patients' vital signs, health conditions, current medications, and laboratory tests that may need to be performed. The study showed that physicians who used the dashboard were able to correctly identify data they were searching for 100 percent of the time, compared with 94 percent using traditional electronic medical records. Further, the number of mouse clicks needed to find the information was reduced from 60 to three when using the diabetes dashboard.
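The following is a schematic sketch of the "single screen" idea, not the MU team's actual implementation; all names and fields are hypothetical. It shows how gathering vitals, conditions, medications and outstanding labs into one structure makes the information available at a glance instead of requiring dozens of navigation clicks.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a dashboard view: one structure aggregates
# everything a clinician needs for a diabetes visit. Field names are
# invented for illustration and are not the MU dashboard's schema.

@dataclass
class DiabetesDashboard:
    vitals: dict                 # e.g. {"bp": "128/82", "a1c": 7.4}
    conditions: list             # diagnoses related to diabetes
    medications: list            # current prescriptions
    labs_due: list = field(default_factory=list)  # standard tests not yet done

    def summary(self) -> str:
        """Render the whole picture on one screen, with no drill-down."""
        return (f"Vitals: {self.vitals}\n"
                f"Conditions: {', '.join(self.conditions)}\n"
                f"Medications: {', '.join(self.medications)}\n"
                f"Labs due: {', '.join(self.labs_due) or 'none'}")

view = DiabetesDashboard(
    vitals={"bp": "128/82", "a1c": 7.4},
    conditions=["type 2 diabetes", "hypertension"],
    medications=["metformin", "lisinopril"],
    labs_due=["annual lipid panel"],
)
print(view.summary())
```

A conventional electronic record scatters these items across separate chart screens, consistent with the 60-click search the study describes; an aggregated view reduces navigation to a handful of actions.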


Richelle Koopman, associate professor of family and community medicine in the School of Medicine, says diabetes care is complex because so many other health conditions are associated with the disease, which requires coordinating multiple treatments. The goal of the diabetes dashboard is to make it easier for doctors to make the right decisions about treatment.


"The diabetes dashboard is so intuitive that it makes it hard for physicians not to do the right thing," Koopman said. "Doctors can see, at a glance, everything that might affect their decision. This frees up their minds and helps them make better decisions about patients' care."


According to Koopman, the research has important implications for patient safety and costs. For example, the dashboard shows doctors a list of tests that are standard for diabetes patients and indicates whether patients have recently had the tests or need to have them. This eliminates the potential for physicians to order costly tests that are not necessary.


"It is difficult to quantify how much money the dashboard saves, but in terms of time and accuracy, the savings are substantial," Koopman said. "Doctors are still going to spend 15 minutes with each patient, but instead of using a large portion of that time to search through charts for information, they can have interactive conversations with patients about lifestyle and diet changes that are important for diabetes care."


The researchers say the dashboard was well received by doctors who tested it because it was designed by physicians familiar with their needs. The study, published in Annals of Family Medicine, was a collaboration among the MU School of Medicine, The Informatics Institute, the School of Information Science and Learning Technologies in the College of Education, the Center for Health Care Quality and the Sinclair School of Nursing.


ScienceDaily (Nov. 18, 2011) — Many experts believe that advanced biofuels made from cellulosic biomass are the most promising alternative to petroleum-based liquid fuels for a renewable, clean, green, domestic source of transportation energy. Nature, however, does not make it easy. Unlike the starch sugars in grains, the complex polysaccharides in the cellulose of plant cell walls are locked within a tough woody material called lignin. For advanced biofuels to be economically competitive, scientists must find inexpensive ways to release these polysaccharides from their bindings and reduce them to fermentable sugars that can be synthesized into fuels.


An important step towards achieving this goal has been taken by researchers with the U.S. Department of Energy (DOE)'s Joint BioEnergy Institute (JBEI), a DOE Bioenergy Research Center led by the Lawrence Berkeley National Laboratory (Berkeley Lab).


A team of JBEI researchers, working with researchers at the U.S. Department of Agriculture's Agricultural Research Service (ARS), has demonstrated that introducing a maize (corn) gene into switchgrass, a highly touted potential feedstock for advanced biofuels, more than doubles the amount of starch in the plant's cell walls (to roughly 250 percent of wild-type levels) and makes it much easier to extract polysaccharides and convert them into fermentable sugars. The gene, a variant of the maize gene known as Corngrass1 (Cg1), holds the switchgrass in the juvenile phase of development, preventing it from advancing to the adult phase.


"We show that Cg1 switchgrass biomass is easier for enzymes to break down and also releases more glucose during saccharification," says Blake Simmons, a chemical engineer who heads JBEI's Deconstruction Division and was one of the principal investigators for this research. "Cg1 switchgrass contains decreased amounts of lignin and increased levels of glucose and other sugars compared with wild switchgrass, which enhances the plant's potential as a feedstock for advanced biofuels."


The results of this research are described in a paper published in the Proceedings of the National Academy of Sciences (PNAS) titled "Overexpression of the maize Corngrass1 microRNA prevents flowering, improves digestibility, and increases starch content of switchgrass."


Lignocellulosic biomass is the most abundant organic material on Earth. Studies have consistently shown that biofuels derived from lignocellulosic biomass could be produced in the United States in a sustainable fashion and could replace today's gasoline, diesel and jet fuels on a gallon-for-gallon basis. Unlike ethanol made from grains, such fuels could be used in today's engines and infrastructures and would be carbon-neutral, meaning the use of these fuels would not exacerbate global climate change. Among potential crop feedstocks for advanced biofuels, switchgrass offers a number of advantages. As a perennial grass that is both salt- and drought-tolerant, switchgrass can flourish on marginal cropland, does not compete with food crops, and requires little fertilization. A key to its use in biofuels is making it more digestible to fermentation microbes.


"The original Cg1 was isolated in maize about 80 years ago. We cloned the gene in 2007 and engineered it into other plants, including switchgrass, so that these plants would replicate what was found in maize," says George Chuck, lead author of the PNAS paper and a plant molecular geneticist who holds joint appointments at the Plant Gene Expression Center with ARS and the University of California (UC) Berkeley. "The natural function of Cg1 is to hold pants in the juvenile phase of development for a short time to induce more branching. Our Cg1 variant is special because it is always turned on, which means the plants always think they are juveniles."


Chuck and his colleague Sarah Hake, another co-author of the PNAS paper and director of the Plant Gene Expression Center, proposed that since juvenile biomass is less lignified, it should be easier to break down into fermentable sugars. Also, since juvenile plants don't make seed, more starch should be available for making biofuels. To test this hypothesis, they collaborated with Simmons and his colleagues at JBEI to determine the impact of introducing the Cg1 gene into switchgrass.


In addition to reducing the lignin and boosting the amount of starch in the switchgrass, the introduction and overexpression of the maize Cg1 gene also prevented the switchgrass from flowering even after more than two years of growth, an unexpected but advantageous result.


"The lack of flowering limits the risk of the genetically modified switchgrass from spreading genes into the wild population," says Chuck.


The results of this research offer a promising new approach for the improvement of dedicated bioenergy crops, but there are questions to be answered. For example, the Cg1 switchgrass biomass still required a pre-treatment to efficiently liberate fermentable sugars.


"The alteration of the switchgrass does allow us to use less energy in our pre-treatments to achieve high sugar yields as compared to the energy required to convert the wild type plants," Simmons says. "The results of this research set the stage for an expanded suite of pretreatment and saccharification approaches at JBEI and elsewhere that will be used to generate hydrolysates for characterization and fuel production."


Another question to be answered pertains to the mechanism by which Cg1 is able to keep switchgrass and other plants in the juvenile phase.


"We know that Cg1 is controlling an entire family of transcription factor genes," Chuck says, "but we have no idea how these genes function in the context of plant aging. It will probably take a few years to figure this out."


Co-authoring the PNAS paper with Chuck and Simmons were Christian Tobias, Lan Sun, Florian Kraemer, Chenlin Li, Dean Dibble, Rohit Arora, Jennifer Bragg, John Vogel, Seema Singh, Markus Pauly and Sarah Hake.

