ScienceDaily (Nov. 30, 2011) — Over the past year, researchers at the California Institute of Technology (Caltech), and around the world, have been studying a group of potent antibodies that have the ability to neutralize HIV in the lab; their hope is that they may learn how to create a vaccine that makes antibodies with similar properties. Now, biologists at Caltech led by Nobel Laureate David Baltimore, president emeritus and Robert Andrews Millikan Professor of Biology, have taken one step closer to that goal: they have developed a way to deliver these antibodies to mice and, in so doing, have effectively protected them from HIV infection.

This new approach to HIV prevention -- called Vectored ImmunoProphylaxis, or VIP -- is outlined in the November 30 advance online publication of the journal Nature.

Traditional efforts to develop a vaccine against HIV have been centered on designing substances that provoke an effective immune response -- either in the form of antibodies to block infection or T cells that attack infected cells. With VIP, protective antibodies are being provided up front.

"VIP has a similar effect to a vaccine, but without ever calling on the immune system to do any of the work," says Alejandro Balazs, lead author of the study and a postdoctoral scholar in Baltimore's lab. "Normally, you put an antigen or killed bacteria or something into the body, and the immune system figures out how to make an antibody against it. We've taken that whole part out of the equation."

Because ordinary mice cannot be infected with HIV, the researchers used specialized mice carrying human immune cells in which the virus can replicate. They utilized an adeno-associated virus (AAV) -- a small, harmless virus that has been useful in gene-therapy trials -- as a carrier to deliver genes that direct antibody production. The AAV was injected into the leg muscle of the mice, and the muscle cells then secreted broadly neutralizing antibodies into the animals' bloodstream. After just a single AAV injection, the mice produced high concentrations of these antibodies for the rest of their lives, as shown by periodic sampling of their blood. Remarkably, these antibodies protected the mice from infection when the researchers exposed them to HIV intravenously.

The team points out that the leap from mice to humans is large -- the fact that the approach works in mice does not necessarily mean it will be successful in humans. Still, the researchers believe that the large amounts of antibodies that the mice were able to produce -- coupled with the finding that a relatively small amount of antibody has proved protective in the mice -- may translate into human protection against HIV infection.

"We're not promising that we've actually solved the human problem," says Baltimore. "But the evidence for prevention in these mice is very clear."

The paper also notes that in the mouse model, VIP worked even in the face of increased exposure to HIV. To test the efficacy of the antibody, the researchers started with a virus dose of one nanogram, enough to infect the majority of the mice that received it. When they saw that the mice given VIP could withstand that dose, they kept raising it until they were challenging the animals with 125 nanograms of virus.

"We expected that at some dose, the antibodies would fail to protect the mice, but it never did -- even when we gave mice 100 times more HIV than would be needed to infect 7 out of 8 mice," says Balazs. "All of the exposures in this work were significantly larger than a human being would be likely to encounter."

He points out that this outcome likely had more to do with the properties of the antibody being tested than with the delivery method, but adds that VIP is what enabled large amounts of this powerful antibody to circulate through the mice and fight the virus. Furthermore, VIP is a platform technique: as more potent neutralizing antibodies are isolated or developed against HIV or other infectious agents, they too can be delivered using this method.

"If humans are like mice, then we have devised a way to protect against the transmission of HIV from person to person," says Baltimore. "But that is a huge if, and so the next step is to try to find out whether humans behave like mice."

He says the team is currently developing a plan to test the method in human clinical trials. The initial tests will ask whether the AAV vector can program human muscle to produce antibody at levels that would be expected to protect against HIV.

"In typical vaccine studies, those inoculated usually mount an immune response -- you just don't know if it's going to work to fight the virus," explains Balazs. "In this case, because we already know that the antibodies work, my opinion is that if we can induce production of sufficient antibody in people, then the odds that VIP will be successful are actually pretty high."

The study, "Antibody-based Protection Against HIV Infection by Vectored ImmunoProphylaxis," was funded by the Bill and Melinda Gates Foundation, the National Institutes of Health, and the Caltech-UCLA Joint Center for Translational Medicine. Caltech biology researchers Joyce Chen, Christin M. Hong, and Lili Yang also contributed to the paper, as well as Dinesh Rao, a hematologist from the University of California, Los Angeles.


Story Source:

The above story is reprinted from materials provided by California Institute of Technology.


Journal Reference:

Alejandro B. Balazs, Joyce Chen, Christin M. Hong, Dinesh S. Rao, Lili Yang, David Baltimore. Antibody-based protection against HIV infection by vectored immunoprophylaxis. Nature, 2011; DOI: 10.1038/nature10660


ScienceDaily (Nov. 17, 2011) — Galaxies learned to "go green" early in the history of the universe, continuously recycling immense volumes of hydrogen gas and heavy elements to build successive generations of stars stretching over billions of years.

This ongoing recycling keeps galaxies from emptying their "fuel tanks" and therefore stretches out their star-forming epoch to over 10 billion years. However, galaxies that ignite a rapid firestorm of star birth can blow away their remaining fuel, essentially turning off further star-birth activity.

This conclusion is based on a series of Hubble Space Telescope observations that flexed the special capabilities of its comparatively new Cosmic Origins Spectrograph (COS) to detect otherwise invisible mass in the halo of our Milky Way and a sample of more than 40 other galaxies. Data from large ground-based telescopes in Hawaii, Arizona, and Chile also contributed to the studies by measuring the properties of the galaxies.

This invisible mass is made up of normal matter -- hydrogen, helium, and heavier elements such as carbon, oxygen, nitrogen, and neon -- as opposed to dark matter, which is thought to consist of an unknown, exotic particle pervading space.

The results are being published in three papers in the November 18 issue of Science magazine. The leaders of the three studies are Nicolas Lehner of the University of Notre Dame in South Bend, Ind.; Jason Tumlinson of the Space Telescope Science Institute in Baltimore, Md.; and Todd Tripp of the University of Massachusetts at Amherst.

The Key Findings

The color and shape of a galaxy are largely controlled by gas flowing through an extended halo around it. All modern simulations of galaxy formation find that they cannot explain the observed properties of galaxies without modeling the complex accretion and "feedback" processes by which galaxies acquire gas and then later expel it after processing by stars. The three studies investigated different aspects of the gas-recycling phenomenon.

"Our results confirm a theoretical suspicion that galaxies expel and can recycle their gas, but they also present a fresh challenge to theoretical models to understand these gas flows and integrate them with the overall picture of galaxy formation," Tumlinson says.

The team used COS observations of distant stars to demonstrate that a large mass of clouds is falling through the giant gaseous halo, or corona, of our Milky Way, fueling its ongoing star formation. These clouds of ionized hydrogen reside within 20,000 light-years of the Milky Way disk and contain enough material to make 100 million suns. Some of this gas is recycled material that is continually being replenished by star formation and the explosive energy of novae and supernovae, which kicks chemically enriched gas back into the halo; the remainder is gas being accreted for the first time. The infalling gas from this vast reservoir fuels the Milky Way with the equivalent of about a solar mass per year, which is comparable to the rate at which our galaxy makes stars. At this rate the Milky Way will continue making stars for another billion years by recycling gas into the halo and back onto the galaxy. "We now know where the missing fuel for galactic star formation is," Lehner concludes. "We now have to find out its birthplace."

One goal of the studies was to investigate how other galaxies like our Milky Way accrete mass for star formation. But instead of widespread accretion, the team found nearly ubiquitous halos of hot gas surrounding vigorous star-forming galaxies. These galaxy halos, rich in heavy elements, extend as much as 450,000 light-years beyond the visible portions of their galactic disks. The surprise was discovering how much mass in heavy elements lies far outside a galaxy. COS measured 10 million solar masses of oxygen in a galaxy's halo, which corresponds to about 1 billion solar masses of gas -- as much as in the entire interstellar medium between stars in a galaxy's disk. The team also found that this gas is nearly absent from galaxies that have stopped forming stars, evidence that widespread outflows, rather than accretion, determine a galaxy's fate. "We didn't know how much mass was there in these gas halos, because we couldn't do these observations until we had COS," Tumlinson says. "This stuff is a huge component of galaxies but can't be seen in any images."
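To see how the oxygen measurement translates into a total gas mass, here is a rough back-of-the-envelope check, assuming the halo gas carries an oxygen mass fraction of order the solar value of about one percent (an assumption added here, not stated in the release):

$$M_{\mathrm{gas}} \approx \frac{M_{\mathrm{O}}}{Z_{\mathrm{O}}} \approx \frac{10^{7}\,M_{\odot}}{\sim 0.01} \approx 10^{9}\,M_{\odot}$$

That is, roughly 10 million solar masses of oxygen implies on the order of a billion solar masses of gas in total, consistent with the figure quoted above.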

He points out that because so much of a galaxy's heavy-element content has been ejected into the halo instead of remaining within the galaxy, the formation of planets, life, and other things requiring heavy elements could have been delayed in these galaxies.

The COS data also demonstrate that galaxies forming stars at a very rapid rate, perhaps a hundred solar masses per year, can drive 2-million-degree gas far out into intergalactic space at speeds of up to 2 million miles per hour. That is fast enough for the gas to escape forever and never refuel the parent galaxy. While hot plasma "winds" from galaxies have been known for some time, the new COS observations reveal that hot outflows extend to much greater distances than previously thought and can carry a tremendous amount of mass out of a galaxy. Some of the hot gas is moving more slowly and could eventually be recycled. The Hubble observations show how gas-rich star-forming spiral galaxies can evolve into quiescent elliptical galaxies that no longer form stars. "So not only have we found that star-forming galaxies are pervasively surrounded by large halos of hot gas," says Tripp, "we have also observed this hot gas in transit -- we have caught the stuff in the process of moving out of a galaxy and into intergalactic space."
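For a sense of scale (a unit conversion added here, not taken from the release), 2 million miles per hour corresponds to

$$2\times10^{6}\ \mathrm{mph} \times \frac{1.609\ \mathrm{km}}{1\ \mathrm{mi}} \times \frac{1\ \mathrm{h}}{3600\ \mathrm{s}} \approx 900\ \mathrm{km\ s^{-1}},$$

well above the few hundred kilometers per second typically needed to escape the gravity of a large galaxy, which is why gas moving that fast can leave for good.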

The light emitted by this hot plasma is invisible, so the researchers used COS to detect the presence of the gas by the way it absorbs certain colors of light from background quasars. The brightest objects in the universe, quasars are the brilliant cores of active galaxies that contain rapidly accreting supermassive black holes. The quasars serve as distant lighthouse beacons that shine through the gas-rich "fog" of hot plasma encircling galaxies. At ultraviolet wavelengths, COS is sensitive to absorption from many ionized heavy elements, such as nitrogen, oxygen, and neon. COS's high sensitivity allows many galaxies that happen to lie in front of the much more distant quasars to be studied. The ionized heavy elements serve as proxies for estimating how much mass is in a galaxy's halo.

"Only with COS can we now address some of the most crucial questions that are at the forefront of extragalactic astrophysics," Tumlinson says.


Story Source:

The above story is reprinted from materials provided by NASA/Goddard Space Flight Center.


Journal References:

T. M. Tripp, J. D. Meiring, J. X. Prochaska, C. N. A. Willmer, J. C. Howk, J. K. Werk, E. B. Jenkins, D. V. Bowen, N. Lehner, K. R. Sembach, C. Thom, J. Tumlinson. The Hidden Mass and Large Spatial Extent of a Post-Starburst Galaxy Outflow. Science, 2011; 334 (6058): 952 DOI: 10.1126/science.1209850

N. Lehner, J. C. Howk. A Reservoir of Ionized Gas in the Galactic Halo to Sustain Star Formation in the Milky Way. Science, 2011; 334 (6058): 955 DOI: 10.1126/science.1209069

J. Tumlinson, C. Thom, J. K. Werk, J. X. Prochaska, T. M. Tripp, D. H. Weinberg, M. S. Peeples, J. M. O'Meara, B. D. Oppenheimer, J. D. Meiring, N. S. Katz, R. Dave, A. B. Ford, K. R. Sembach. The Large, Oxygen-Rich Halos of Star-Forming Galaxies Are a Major Reservoir of Galactic Metals. Science, 2011; 334 (6058): 948 DOI: 10.1126/science.1209840


ScienceDaily (Nov. 17, 2011) — New research provides the first evidence that depression can be treated by targeting only an individual's style of thinking, through repeated mental exercises in an approach called cognitive bias modification.

The study suggests that an innovative psychological treatment called 'concreteness training' can reduce depression in just two months and could work as a self-help therapy for depression in primary care.

Led by the University of Exeter and funded by the Medical Research Council, the research shows how this new treatment could help some of the 3.5 million people in the UK living with depression.

People suffering from depression have a tendency towards unhelpful abstract thinking and over-general negative thoughts, such as viewing a single mistake as evidence that they are useless at everything. Concreteness training (CNT) is a novel treatment approach that attempts to directly target this tendency. Repeated practice of CNT exercises can help people to shift their thinking style.

CNT teaches people how to be more specific when reflecting on problems. This can help them to keep difficulties in perspective, improve problem-solving and reduce worry, brooding, and depressed mood. This study provided the first formal test of this treatment for depression in the NHS.

A total of 121 individuals who were currently experiencing an episode of depression were recruited from GP practices. They took part in the clinical trial and were randomly allocated to one of three groups: one group received their usual treatment from their GP plus CNT, another was offered relaxation training in addition to their usual treatment, and the remainder simply continued their usual treatment. All participants were assessed by the research team after two months, and then three and six months later, to see what progress they had made.

The CNT involved the participants undertaking a daily exercise in which they focused on a recent event that they had found mildly to moderately upsetting. They did this initially with a therapist and then alone using an audio CD that provided guided instructions. They worked through standardised steps and a series of exercises to focus on the specific details of that event and to identify how they might have influenced the outcome.

CNT significantly reduced symptoms of depression and anxiety, on average taking participants from severe to mild depression during the first two months, an improvement that was maintained at the three- and six-month follow-ups. On average, those individuals who simply continued with their usual treatment remained severely depressed.

Although concreteness training and relaxation training both significantly reduced depression and anxiety, only concreteness training reduced the negative thinking typically found in depression. Moreover, for those participants who practised it enough to ensure it became a habit, CNT reduced symptoms of depression more than relaxation training.

Professor Edward Watkins of the University of Exeter said: "This is the first demonstration that just targeting thinking style can be an effective means of tackling depression. Concreteness training can be delivered with minimal face-to-face contact with a therapist and training could be accessed online, through CDs or through smartphone apps. This has the advantage of making it a relatively cheap form of treatment that could be accessed by large numbers of people. This is a major priority in depression treatment and research, because of the high prevalence and global burden of depression, for which we need widely available cost-effective interventions."

The researchers are now calling for larger effectiveness clinical trials so that the feasibility of CNT as part of the NHS's treatment for depression can be assessed.

Published in the journal Psychological Medicine, this study was carried out by a team from the Mood Disorders Centre, which is a partnership between the NHS and the University of Exeter and the Peninsula College of Medicine and Dentistry, a joint entity of the Universities of Exeter and Plymouth and the NHS in the South West.


Story Source:

The above story is reprinted from materials provided by University of Exeter.


ScienceDaily (Nov. 18, 2011) — Suitable habitat for native fishes in many Great Plains streams has been significantly reduced by the pumping of groundwater from the High Plains aquifer -- and scientists analyzing the water loss say ecological futures for these fishes are "bleak."


Results of their study have been published in the journal Ecohydrology.


Unlike alluvial aquifers, which can be replenished seasonally with rain and snow, these regional aquifers were filled by melting glaciers during the last Ice Age, the researchers say. When that water is gone, it won't come back -- at least, until another Ice Age comes along.


"It is a finite resource that is not being recharged," said Jeffrey Falke, a post-doctoral researcher at Oregon State University and lead author on the study. "That water has been there for thousands of years, and it is rapidly being depleted. Already, streams that used to run year-round are becoming seasonal, and refuge habitats for native fishes are drying up and becoming increasingly fragmented."


Falke and his colleagues, all scientists from Colorado State University, where he earned his Ph.D., spent three years studying the Arikaree River in eastern Colorado. They conducted monthly low-altitude flights over the river to map refuge pool habitats and connectivity, and compared the results to historical data.


They conclude that during the next 35 years -- under the most optimistic of circumstances -- only 57 percent of the current refuge pools would remain, and almost all of those would be isolated in a single mile-long stretch of the Arikaree River. Water levels today are already significantly lower than they were 40 to 50 years ago.


Though their study focused on the Arikaree, other dryland streams in the western Great Plains -- a region spanning eastern Colorado, western Nebraska and western Kansas -- face the same fate, the researchers say.


Falke said the draining of the regional aquifers lowers the groundwater input to the alluvial aquifers through which the rivers flow, causing the reduction in streamflow. He and his colleagues estimate that it would require a 75 percent reduction in the rate of groundwater pumping to maintain current water table levels and refuge pools, which is "not economically or politically feasible," the authors note in the study.


Dryland streams in the Great Plains host several warm-water native fish species that have adapted over time to harsh conditions, according to Falke, who is with the Department of Fisheries and Wildlife at Oregon State University. Brassy minnows, orange-throat darters and other species can withstand water temperatures reaching 90 degrees Fahrenheit, as well as low levels of dissolved oxygen, but the increasing fragmentation of their habitats may disrupt their life cycles, limiting the fishes' ability to recolonize.


"The Arikaree River and most dryland streams are shallow, with a sandy bottom, and often silty," Falke said. "The water can be waist-deep, and when parts of the river dry up from the pumping of groundwater, it is these deeper areas that become refuge pools. But they are becoming scarcer, and farther apart each year."


Falke said the changing hydrology of the system has implications beyond the native fishes. The aquifer-fed stream influences the entire riparian area, where cottonwood trees form their own ecosystem and groundwater-dependent grasses support the grazing of livestock and other animals.


Pumping of the regional aquifers is done almost entirely for agriculture, Falke said, with about 90 percent of the irrigation supporting corn production, along with some alfalfa and wheat.


"The impact goes well beyond the Arikaree River," Falke said. "Declines in streamflow are widespread across the western Great Plains, including all 11 headwaters of the Republican River. Ultimately, the species inhabiting these drainages will decline in range and abundance, and become more imperiled as groundwater levels decline and climate changes continue."


Other authors on the study include Kurt Fausch, Robin Magelky, Angela Aldred, Deanna Durnford, Linda Riley and Ramchand Oad, all of Colorado State University. The study was supported by the Colorado Division of Wildlife and the Colorado Agricultural Experiment Station.



ScienceDaily (Nov. 18, 2011) — Scientists at the University of California, San Diego School of Medicine and UC San Diego Moores Cancer Center, in collaboration with colleagues in Boston and South Korea, say they have identified a novel gene mutation that causes at least one form of glioblastoma (GBM), the most common type of malignant brain tumor.


The findings are reported in the online edition of the journal Cancer Research.


Perhaps more importantly, the researchers found that two drugs already being used to treat other forms of cancer effectively prolonged the survival of mice modeling this particular form of GBM. That could be good news for at least some GBM patients. More than 9,000 new cases of the disease are diagnosed each year in the United States, and effective treatments are limited. The tumors are aggressive and resistant to current therapies, such as surgery, radiation and chemotherapy. The median survival for newly diagnosed GBM patients is just 14 months.


Past studies have identified epidermal growth factor receptor (EGFR) as a commonly altered gene in GBM, though the cause or causes of the alteration are not known. The research team, led by scientists at the Dana-Farber Cancer Institute in Boston, analyzed the GBM genomic database, ultimately identifying and characterizing an exon 27 deletion mutation within the EGFR carboxyl-terminus domain (CTD). An exon is a segment of a DNA or RNA molecule containing information coding for a protein or peptide sequence.


"The deletion mutant seems to possess a novel mechanism for inducing cellular transformation," said Frank Furnari, PhD, associate professor of medicine at the UC San Diego School of Medicine and an associate investigator at the San Diego branch of the Ludwig Institute for Cancer Research.


The study researchers determined that cellular transformation was induced by the previously unknown EGFR CTD deletion mutant, both in vitro and in vivo, and resulted in GBM in the animals. The researchers then turned to testing a pair of approved drugs that target EGFR: a monoclonal antibody called cetuximab and a small molecule inhibitor called erlotinib.


Cetuximab, marketed under the name Erbitux, is currently approved for use in treating metastatic colorectal cancer and squamous cell carcinoma of the head and neck. Erlotinib, marketed under the name Tarceva, is used to treat lung and pancreatic cancers.


Both drugs were found to effectively impair the tumor-forming abilities of oncogenic EGFR CTD deletion mutants. Cetuximab, in particular, prolonged survival of mice with the deletion mutants when compared to untreated control mice.


However, neither cetuximab nor erlotinib is an unabashed success story. The drugs work by binding to sites on the EGFR protein and inhibiting activation, but they are not effective in all cancer patients and produce some adverse side effects, such as rashes and diarrhea.


But Santosh Kesari, MD, PhD, Director of Neuro-Oncology at UC San Diego Moores Cancer Center and the UCSD Department of Neurosciences, and co-corresponding author of the study, said the new study points to a more selective, effective use of the drugs for some patients with GBM.


"In the past when we treated brain cancer patients with these drugs, the response rate was very small," Kesari said. "What we now show is that the tumors with CTD mutations respond best to these EGFR targeted agents. If we knew this beforehand, we might have been able to select patients most likely to respond to these agents. We are now trying to put together a prospective clinical trial to prove this. We would select only patients with these tumor mutations and treat them. This kind of research gets us closer to identifying genetic subtypes, to doing better biomarker-based clinical trials, and to personalizing treatments in brain cancers."


"This is a great example of personalized medicine in action," said Webster Cavenee, PhD, director of the Ludwig Institute at UC San Diego. "UCSD has made a concerted effort in the past few years to develop a first-class brain tumor research and therapy group that includes adult neuro-oncology, neurosurgery, neuropathology and their pediatric equivalents to join with internationally-renowned brain tumor research. This is making UCSD a destination for the very best in brain tumor management."


Co-authors of the study are Jeonghee Cho, Department of Medical Oncology, Dana-Farber Cancer Institute, Center for Cancer Genome Discovery, Boston, MA, Genomic Analysis Center, Samsung Cancer Research Institute, Seoul, Republic of Korea; Sandra Pastorino and Ying S. Chao, Department of Neurosciences, Moores Cancer Center, UC San Diego; Qing Zeng and Xiaoyin Xu, Department of Radiology, Brigham and Women's Hospital, Boston; William Johnson, Dana-Farber Cancer Institute, Center for Cancer Genome Discovery, Boston; Scott Vandenberg, Department of Pathology, UC San Diego; Roel Verhaak, Amit Dutt, Derek Chiang and Yuki Yuza, Department of Medical Oncology, Dana-Farber Cancer Institute and Broad Institute of MIT and Harvard; Andrew Cherniack and Robert C. Onofrio, Broad Institute of MIT and Harvard; Hideo Watanabe and Matthew Meyerson, Department of Medical Oncology, Dana-Farber Cancer Institute, Center for Cancer Genome Discovery and Broad Institute of MIT and Harvard; Jihyun Kwon, Genomic Analysis Center, Samsung Cancer Research Institute.


Funding for this research came, in part, from the National Institutes of Health, the Sontag Foundation Distinguished Scientist Award, the James S. McDonnell Foundation and the Samsung Cancer Research Institute.



ScienceDaily (Nov. 18, 2011) — Researchers at the Naval Research Laboratory Marine Meteorology Division (MMD), Monterey, Calif., have developed the Coupled Ocean/Atmosphere Mesoscale Prediction System Tropical Cyclone (COAMPS-TC™) model, achieving a significant research milestone in predictions of tropical cyclone intensity and structure.


While predictions of the paths, or tracks, of hurricanes -- more generally referred to as tropical cyclones (TC) -- have steadily improved over the last few decades, predictions of storm intensity have proven much more difficult to improve.


"Over the past two years, the COAMPS-TC model has shown to be the most accurate emerging research model for predicting tropical cyclone intensity," said Dr. Jim Doyle, research meteorologist, NRL Monterey. "There is no better example of these difficult challenges than the intensity predictions for Hurricane Irene this past August."


During a real-time experimental demonstration for Hurricane Irene, COAMPS-TC produced very accurate intensity predictions: its intensity errors averaged six knots over a series of three-day forecasts, a clear improvement over the official National Hurricane Center (NHC) forecast and other operational models, whose errors ranged from 20 to 30 knots.


The successful predictions have demonstrated that Numerical Weather Prediction (NWP) models can outperform operational statistical-dynamical models that are based on climatology and previous storm behavior. It is further believed that NWP models, which explicitly predict the location, dynamics and intensity of a storm, will eventually provide the most promising approach to achieving accurate TC intensity and structure prediction.


With further advances in the methodologies used for vortex initialization, data assimilation and the representation of physical processes, COAMPS-TC is expected to become fully operational in 2013 at the Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC) in Monterey. Considerable advancements have already been made to several components of the modeling system, including the data assimilation of conventional and special TC synthetic observations, vortex initialization, and the representation of key TC physical processes such as air-sea fluxes, clouds and convection.


The COAMPS-TC project could signal a paradigm shift in TC forecasting and is already making a strong impression on the forecasting community. The project focuses on the development and transition of a fully coupled air-ocean-wave prediction system; the COAMPS-TC model includes nonhydrostatic atmospheric dynamics, multiple nested moving grids that follow the center of the storm, and improved boundary-layer and cloud physical parameterizations.
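To make the "moving nest" idea concrete, here is a minimal, self-contained sketch -- not COAMPS-TC code, whose internals are not described in this article; the field, function and parameters are invented for illustration -- in which a high-resolution nest window is recentered on the storm center, taken here to be the minimum of a sea-level-pressure field.

```python
# Toy sketch of a storm-following ("moving") nest, for illustration only.
import numpy as np

def recenter_nest(slp, half_width):
    """Return (row, col) slices of a nest window centered on the SLP minimum."""
    i, j = np.unravel_index(np.argmin(slp), slp.shape)  # storm center = pressure minimum
    i0 = int(np.clip(i - half_width, 0, slp.shape[0] - 2 * half_width))
    j0 = int(np.clip(j - half_width, 0, slp.shape[1] - 2 * half_width))
    return slice(i0, i0 + 2 * half_width), slice(j0, j0 + 2 * half_width)

# Synthetic sea-level-pressure field (hPa) with a low centered near row 60, col 140.
y, x = np.mgrid[0:200, 0:200]
slp = 1010.0 - 40.0 * np.exp(-((x - 140.0) ** 2 + (y - 60.0) ** 2) / (2.0 * 15.0 ** 2))

rows, cols = recenter_nest(slp, half_width=25)
print(rows, cols)  # a 50x50 nest window tracking the low, roughly rows 35:85, cols 115:165
```

In an actual forecast system the nest would be relocated repeatedly as the vortex moves between time steps and would exchange information with its parent grid; the sketch shows only the recentering step.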


COAMPS-TC was first tested in real time in support of two field campaigns sponsored by the Office of Naval Research (ONR): the Tropical Cyclone Structure-08 (TCS-08) experiment, conducted as part of The Observing System Research and Predictability Experiment (THORPEX) Pacific Asian Regional Campaign (T-PARC) in 2008, and the Impact of Typhoons on the Ocean in the Pacific (ITOP) campaign in 2010, both of which took place in the Western Pacific. Additionally, COAMPS-TC advancements and real-time demonstrations in the Eastern Pacific and Western Atlantic have taken place through collaboration with the National Oceanic and Atmospheric Administration (NOAA) as part of the Hurricane Forecast Improvement Project (HFIP) -- a community-wide effort focused on improving operational hurricane prediction.


In June 2011, COAMPS-TC was one of nine worldwide winners of the inaugural High Performance Computing (HPC) Excellence Award, presented at the ISC-11 International Supercomputing Conference in Hamburg, Germany -- an award given annually to recognize noteworthy achievements by users of HPC technologies. COAMPS-TC was recognized for achieving 'a significantly improved model for tropical cyclone forecasting.' Its development benefited significantly from the Department of Defense HPC Modernization Program Office (HPCMO) computational assets at the Navy Defense Supercomputing Resource Center (DSRC) at Mississippi's Stennis Space Center.


Increasingly sophisticated developmental versions of COAMPS-TC will continue to be demonstrated in real time in support of the Joint Typhoon Warning Center and the National Hurricane Center. A key additional enhancement will be a fully coupled ocean-atmosphere version in which the NRL Coastal Ocean Model (NCOM) and the WaveWatch III (WWIII) model will provide the ocean circulation and wave components, respectively.

