ScienceDaily (Oct. 27, 2011) — Official assessments of a doctor's professionalism should be interpreted carefully before being accepted, because some groups of doctors tend to receive lower scores than others and some groups of patient or colleague assessors tend to give lower scores, claims new research published online in the British Medical Journal.

Researchers from the Peninsula College of Medicine and Dentistry in Exeter investigated whether there were any potential patient, colleague and doctor-related sources of bias evident in the assessment of doctors' professionalism.

The doctors' regulator, the General Medical Council (GMC), is working on a new system of revalidation for all UK doctors that could be introduced next year as a way of ensuring doctors are fit to continue to practise. This is likely to involve multi-source feedback from patients, peers and supervisors as part of the evidence used to judge a clinician's performance.

The researchers used data from two questionnaires completed by patients and colleagues. A group of 1,065 doctors from 11 different settings, mostly NHS sites plus one independent-sector organisation, took part in the study.

Each doctor was asked to nominate up to 20 colleagues, both medical and non-medical, to complete a secure online survey about his or her professionalism, and to hand a post-consultation questionnaire to 45 patients. Collectively, the doctors returned completed questionnaires from 17,031 colleagues and 30,333 patients.
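
As a rough arithmetic check on those figures (a back-of-the-envelope sketch; it assumes every doctor nominated the full 20 colleagues and handed out all 45 patient forms, which the study does not state):

    # Back-of-the-envelope response rates from the figures reported above.
    doctors = 1065
    colleague_returns = 17031
    patient_returns = 30333

    print(round(colleague_returns / doctors, 1))   # ~16.0 colleague forms per doctor
    print(round(patient_returns / doctors, 1))     # ~28.5 patient forms per doctor

    # Against the maximums requested (20 colleagues, 45 patients per doctor):
    print(colleague_returns / (doctors * 20))      # ~0.80, i.e. roughly 80% returned
    print(patient_returns / (doctors * 45))        # ~0.63, i.e. roughly 63% returned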

Analysis that took characteristics of the doctor and the patient into account showed that doctors were less likely to receive favourable patient feedback if their primary medical degree was from any non-European country.

Several other factors also predicted less positive feedback from patients: the doctor practised as a psychiatrist, the responding patient was not white, or the responding patient reported that they were not seeing their "usual doctor."

From colleagues, there was likely to be less positive feedback if the doctor in question had received their degree from any country other than the UK or South Asia. Other factors that predicted a less favourable review from colleagues included that the doctor was working in a locum capacity, the doctor was working as a GP or psychiatrist, or the colleague did not have daily or weekly professional contact with the doctor.
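
The BMJ paper reports these as adjusted associations from regression modelling. Purely as an illustration of that kind of analysis -- a minimal sketch, not the authors' actual model, with entirely hypothetical column and file names -- one might fit a mixed model with a random intercept per doctor, since responses are clustered within doctors:

    # Illustrative only: a linear mixed model of the general kind used to adjust
    # feedback scores for doctor- and assessor-level characteristics. Column
    # names, file name, and model form are hypothetical, not from the BMJ paper.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("patient_feedback.csv")  # hypothetical response-level data

    model = smf.mixedlm(
        "score ~ degree_region + specialty + assessor_ethnicity + usual_doctor",
        data=df,
        groups=df["doctor_id"],  # random intercept: responses cluster within doctors
    )
    print(model.fit().summary())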

The researchers say they have identified possible "systematic bias" in the assessment of doctors' professionalism.

They conclude: "Systematic bias may exist in the assessment of doctors' professionalism arising from the characteristics of the assessors giving feedback, and from the personal characteristics of the doctor being assessed. In the absence of a standardised measure of professionalism, doctors' assessment scores from multisource feedback should be interpreted carefully, and, at least initially, be used primarily to help inform doctors' professional development."

Story Source:

The above story is reprinted from materials provided by BMJ-British Medical Journal.


Journal Reference:

John L. Campbell, Martin Roberts, Christine Wright, Jacqueline Hill, Michael Greco, Matthew Taylor, Suzanne Richards. Factors associated with variability in the assessment of UK doctors’ professionalism: analysis of survey results. BMJ, 2011; 343: d6212 DOI: 10.1136/bmj.d6212


ScienceDaily (Oct. 28, 2011) — A team of researchers at MIT has found one of the most effective catalysts ever discovered for splitting oxygen atoms from water molecules -- a key reaction for advanced energy-storage systems, including electrolyzers that produce hydrogen fuel and certain rechargeable batteries. This new catalyst liberates oxygen at more than 10 times the rate of the best previously known catalyst of its type.

The new compound, composed of cobalt, iron and oxygen with other metals, splits oxygen from water (called the Oxygen Evolution Reaction, or OER) at a rate at least an order of magnitude higher than the compound currently considered the gold standard for such reactions, the team says. The compound's high level of activity was predicted from a systematic experimental study that looked at the catalytic activity of 10 known compounds.

The team, which includes materials science and engineering graduate student Jin Suntivich, mechanical engineering graduate student Kevin J. May and professor Yang Shao-Horn, published their results in Science on Oct. 28.

The scientists found that reactivity depended on a specific characteristic: the configuration of the outermost electron of transition metal ions. They were able to use this information to predict the high reactivity of the new compound -- which they then confirmed in lab tests.
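
The Science paper frames this as a "volcano" relationship: activity peaks when the occupancy of the relevant outer (e_g) orbital of the transition-metal ion is close to one. As a toy illustration of screening candidates by such a descriptor (the compound names and occupancy values below are made-up placeholders, not data from the paper):

    # Rank hypothetical oxides by how far their e_g occupancy sits from the
    # reported optimum of ~1. Names and values are illustrative placeholders.
    candidates = {"oxide_A": 0.3, "oxide_B": 0.9, "oxide_C": 1.2, "oxide_D": 1.8}

    ranked = sorted(candidates, key=lambda name: abs(candidates[name] - 1.0))
    print(ranked)  # ['oxide_B', 'oxide_C', 'oxide_A', 'oxide_D'] -> test oxide_B first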

"We not only identified a fundamental principle" that governs the OER activity of different compounds, "but also we actually found this new compound" based on that principle, says Shao-Horn, the Gail E. Kendall (1978) Associate Professor of Mechanical Engineering and Materials Science and Engineering.

Many other groups have been searching for more efficient catalysts to speed the splitting of water into hydrogen and oxygen. This reaction is key to the production of hydrogen as a fuel for cars; to the operation of some rechargeable batteries, including zinc-air batteries; and to the generation of electricity in devices called fuel cells. Two catalysts are needed for such a reaction -- one that liberates the hydrogen atoms, and another for the oxygen atoms -- but the oxygen reaction has been the limiting factor in such systems.
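
For reference, the two half-reactions in alkaline water splitting are:

    Cathode (hydrogen evolution):  4 H2O + 4 e-  ->  2 H2 + 4 OH-
    Anode (oxygen evolution):      4 OH-  ->  O2 + 2 H2O + 4 e-
    Overall:                       2 H2O  ->  2 H2 + O2

The oxygen side must shuffle four electrons per O2 molecule, which is why it is the sluggish half and the one better catalysts like this are targeting.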

Other groups, including one led by MIT's Daniel Nocera, have focused on similar catalysts that can operate -- in a so-called "artificial leaf" -- at low cost in ordinary water. But such reactions can occur with higher efficiency in alkaline solutions, which are required for the best previously known catalyst, iridium oxide, as well as for this new compound.

Shao-Horn and her collaborators are now working with Nocera, integrating their catalyst with his artificial leaf to produce a self-contained system to generate hydrogen and oxygen when placed in an alkaline solution. They will also be exploring different configurations of the catalyst material to better understand the mechanisms involved. Their initial tests used a powder form of the catalyst; now they plan to try thin films to better understand the reactions.

In addition, even though they have already found the highest rate of activity yet seen, they plan to continue searching for even more efficient catalyst materials. "It's our belief that there may be others with even higher activity," Shao-Horn says.

Jens Norskov, a professor of chemical engineering at Stanford University and director of the Suncat Center for Interface Science and Catalysis there, who was not involved in this work, says, "I find this an extremely interesting 'rational design' approach to finding new catalysts for a very important and demanding problem."

The research, which was done in collaboration with visiting professor Hubert A. Gasteiger (currently a professor at the Technische Universität München in Germany) and professor John B. Goodenough from the University of Texas at Austin, was supported by the U.S. Department of Energy's Hydrogen Initiative, the National Science Foundation, the Toyota Motor Corporation and the Chesonis Foundation.


Story Source:

The above story is reprinted from materials provided by Massachusetts Institute of Technology. The original article was written by David L. Chandler, MIT News Office.


Journal Reference:

J. Suntivich, K. J. May, H. A. Gasteiger, J. B. Goodenough, Y. Shao-Horn. A Perovskite Oxide Optimized for Oxygen Evolution Catalysis from Molecular Orbital Principles. Science, 2011; DOI: 10.1126/science.1212858


ScienceDaily (Oct. 27, 2011) — A team of computer scientists, physicists, and physicians at Harvard has developed a simple yet powerful method of visualizing human arteries that may result in more accurate diagnoses of atherosclerosis and heart disease.

The prototype tool, called "HemoVis," creates a 2D diagram of arteries that performs better than the traditional 3D, rainbow-colored model. In a clinical setting, the tool has been shown to increase diagnostic accuracy from 39% to 91%.

Presented Oct. 27 at the IEEE Information Visualization Conference (InfoVis 2011), the new visualization method offers insight to clinicians, imaging specialists, engineers, and others in a wide range of fields who need to explore and evaluate complex, branching structures.

"Our goal was to design a visual representation of the data that was as accurate and efficient for patient diagnosis as possible," says lead author Michelle Borkin, a doctoral candidate at the Harvard School of Engineering and Applied Sciences (SEAS). "What we found is that the prettiest, most popular visualization is not always the most effective."

HemoVis takes data from patient-specific blood flow simulations, combined with traditional imaging data, and displays a tree diagram of the arteries with areas of disease highlighted to assist in diagnosis.

Tools for artery visualization in both clinical and research settings commonly use 3D models that portray the shape and spatial arrangement of vessels of interest. These complex tools require users to rotate the models to get a complete perspective of spatial orientation.

By contrast, the new visualization requires no such rotation or interaction. The tool utilizes 2D, circumference-adjusted cylindrical cross sections arranged in tree diagrams.

Though this visualization method may seem less high-tech, the team demonstrated through quantitative evaluation with medical experts that the 2D model is actually more accurate and efficient for patient diagnosis.

"In the 3D case, the more complex and branched the arteries were, the longer it took to complete the patient diagnosis, and the lower the accuracy was," Borkin reflects. "In the 2D representation, it didn't matter how many branches we had or how complex they were -- we got consistently fast, accurate results. We weren't expecting that."

Tree diagrams are hardly new, as evolutionary biologists will attest, but scientists in many fields are using them to solve a range of very modern and complex problems. In fact, Borkin applied her own experience in astronomy and physics to transform the concept of visualization for SEAS' Multiscale Hemodynamics research group. In prior work, she had used a very similar type of tree diagram to determine the structure of nebulae in outer space.

"With the consultation and cooperation of clinicians, we were able to draw on fairly well known visualization techniques and principles from computer science to solve a practical clinical problem," says Hanspeter Pfister, Gordon McKay Professor of the Practice of Computer Science at SEAS.

Borkin, Pfister, and their colleagues relied on the input of physicians and others with clinical or laboratory imaging experience throughout the process. Through extensive surveys and interviews, they identified the most popular options for display, accurate layout, and coloring of these arterial projections.

However, Borkin also drew on well-supported research that is less well known outside the visualization community:

"For years, visualization, computer science, and psychology researchers have identified that color is critical for conveying the value of data, but that the rainbow coloring is not well-attuned to the human visual system."

Accordingly, HemoVis departs from the traditional practice of rainbow color-coding in favor of a graded single-color scheme (red to black) that can represent placement along a continuum.
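
As a rough illustration of this kind of single-hue scale (a generic sketch using matplotlib, not HemoVis's actual palette or code):

    # A red-to-black color scale of the general kind described above: one hue,
    # monotonically darker, so ordering along the continuum is easy to read.
    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.colors import LinearSegmentedColormap

    red_black = LinearSegmentedColormap.from_list("red_black", ["red", "black"])

    values = np.linspace(0, 1, 256).reshape(1, -1)  # a continuum of data values
    plt.imshow(values, aspect="auto", cmap=red_black)
    plt.title("red -> black: one hue, monotonically darker")
    plt.show()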

In tests, diagnostic accuracy, as measured by the proportion of diseased areas identified, increased dramatically with the new color scheme.

Widespread adoption of visual representations like those in HemoVis could have the effect of not only optimizing tasks that are critical for doctors, but also changing long-entrenched mindsets and making scientists "think twice" about their assumptions in data visualization, Borkin says.

"This approach to visualization design and validation is broadly applicable in medicine, engineering, and science," notes Pfister. "We hope that people will use this process as a template for transforming their own visualizations."

Borkin and Pfister acknowledged that while HemoVis represents an important step forward, traditional 3D artery models still play a role, particularly in providing a spatially intuitive tool for surgical planning.

With this in mind, the next steps for this research include further development and optimization of the 2D tool and investigation into how it might complement, rather than replace, its 3D counterpart.

A paper about this work will be published later this year in the journal IEEE Transactions on Visualization and Computer Graphics.


Story Source:

The above story is reprinted from materials provided by Harvard University. The original article was written by Mureji Fatunde.


ScienceDaily (Oct. 27, 2011) — Robert Linhardt is working to forever change the way some of the most widely used drugs in the world are manufactured. In a new study appearing in the journal Science, he and his partner in the research, Jian Liu, have announced an important step toward making this a reality.

Linhardt, the Ann and John H. Broadbent Jr. '59 Senior Constellation Professor of Biocatalysis and Metabolic Engineering at Rensselaer Polytechnic Institute, and Jian Liu, a professor in the Eshelman School of Pharmacy at the University of North Carolina at Chapel Hill, have discovered an entirely new process to manufacture ultra-low molecular weight heparin.

The research shows that the drug is identical in performance and safety to the current and successful anticoagulant fondaparinux, but is purer, faster, and less expensive to produce.

"This research represents an entirely new paradigm in drug manufacturing," Linhardt said. "With this discovery, we have successfully demonstrated that replacing the current model of drug production with a chemoenzymatic approach can greatly reduce the cost of drug development and manufacturing, while also increasing drug performance and safety, and reduce the possibility of outside drug contamination. It is our hope that this is the first step in the adoption of this method for the manufacture of many other drugs."

The new process uses chemicals and enzymes to cut the number of steps in the production of fondaparinux from approximately 50 down to just 10 to 12. In addition, it increases the yield 500-fold compared with the current fondaparinux process, and could decrease the cost of manufacture by a similar factor, according to Linhardt.
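
A toy calculation shows why cutting steps compounds so strongly; the 80 percent per-step yield below is an illustrative assumption, not a figure from the study:

    # Overall yield is the product of per-step yields, so removing steps
    # helps multiplicatively. The 80% per-step yield is an assumed example.
    per_step = 0.80
    yield_50 = per_step ** 50          # ~1.4e-05 overall
    yield_12 = per_step ** 12          # ~0.069 overall
    print(yield_12 / yield_50)         # roughly 4800x in this toy case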

Fondaparinux, which is sold as a name-brand drug and was also recently approved by the FDA as a generic, is a synthetic anticoagulant used to treat deep vein thrombosis, with over $500 million in annual sales. It is part of a much larger family of anticoagulant drugs known as heparins. But, unlike most heparin products, it is chemically synthesized from non-animal materials. All other heparin-based drugs currently on the market are derived from the intestines of pigs and the lungs of cattle, and such animal source materials are more likely to become contaminated, according to Linhardt.

"When we rely on animals, we open ourselves up for spreading viruses and prion diseases like mad cow disease through the use of these heparins," Linhardt said. "And because most of the raw material is imported, we often can't be sure of exactly what we are getting."

But, fondaparinux is extremely costly to produce, according to Linhardt. "The process to produce the drug involves many steps to purify the material and creates tons and tons of hazardous waste to dispose of," Linhardt said.

The new process developed by Linhardt and Liu greatly reduces the number of steps involved in the production of the drug. This reduces the amount of waste produced and the overall cost of producing the drug.

"Cost should no longer be a major factor in the use or production of this drug," Linhardt said.

The process uses sugars and enzymes that are identical to those found in the human body to build the drug piece by piece. The backbone of the material is first built sugar by sugar and then decorated with sulfate groups through the use of enzymes to control its structure and function in the body.

Linhardt and Liu have already begun testing the drug in animal models with successful results and think the drug could be quickly transferred to the market.

"Because the new drug is biologically identical in its performance to the already approved fondaparinux, the approval process for this new drug should work very similar to the approval process used for fondaparinux," Linhardt said. He also thinks that this combined chemical and enzymatic synthesis can be quickly brought to patients in need and adapted for the production of many other improved carbohydrate-containing drugs.

"During this study, we were able to quickly build multiple doses in a simple laboratory setting and feel that this is something than can be quickly and easy commercialized to reduce the cost of this drug and help to shift how pharmaceutical companies approach the synthesis of carbohydrate-containing drugs."

The finding is part of a much larger body of work occurring in the Linhardt lab to completely replace all types of heparin-based or other glycoprotein-based drugs with safer, low-cost, synthetic versions that do not rely on foreign, potentially contaminated animal sources.

The research is funded by the National Institutes of Health.

Linhardt and Liu were joined in the research by Yongmei Xu, Haoming Xu, Renpeng Liu, and Juliana Jing of the University of North Carolina, Chapel Hill; Sayaka Masuko of Rensselaer Polytechnic Institute; and Majde Takieddin and Shaker Mousa of the Albany College of Pharmacy and Health Sciences.


Story Source:

The above story is reprinted from materials provided by Rensselaer Polytechnic Institute.


Journal Reference:

Yongmei Xu, Sayaka Masuko, Majde Takieddin, Haoming Xu, Renpeng Liu, Juliana Jing, Shaker A. Mousa, Robert J. Linhardt, Jian Liu. Chemoenzymatic Synthesis of Homogeneous Ultralow Molecular Weight Heparins. Science, 2011; 334 (6055): 498-501 DOI: 10.1126/science.1207478


ScienceDaily (Oct. 27, 2011) — People who use over-the-counter "thyroid support'' supplements may be putting their health at risk, according to a study being presented at the annual meeting of the American Thyroid Association. The supplements contain varying amounts of two different kinds of thyroid hormones apparently derived in large part from chopped up animal thyroid glands, says the study's senior investigator, Victor Bernet, M.D., an endocrinologist at Mayo Clinic in Florida.

The hormones are known as T3, or triiodothyronine, and T4, or thyroxine. They are regulated by the U.S. Food and Drug Administration and intended for use only in prescription medication because they can cause significant health issues, such as an increase in heart rate, heart irregularities and palpitations, nervousness, and diarrhea, Dr. Bernet says.

"These hormones have effects throughout the body, which is why they are controlled," he says.

Not only did nine of the 10 supplements studied contain animal-derived hormone, but the amount of hormone varied significantly from product to product, sometimes exceeding the doses prescribed for individual patients and reaching levels comparable to those found in prescription thyroid medication, Dr. Bernet says.

The supplements likely do not give most people the results they are seeking, such as weight loss or less fatigue, he says.

"The amount of thyroid hormone a normal person would have to take to lose weight would be dangerously high and there is no evidence that use of thyroid hormone effectively treats fatigue when used in people without actual hypothyroidism," he says.

Dr. Bernet became interested in the issue as chairman of the American Thyroid Association's public health committee, after hearing reports from physicians of abnormal thyroid test results in patients using over-the-counter supplements. He worked with researchers including endocrinologists at Walter Reed Army Medical Center, where he practiced at the time.

The researchers bought 10 commercially available thyroid supplements from stores or websites and used high-pressure liquid chromatography to separate out and measure any T3 and T4 in each product. Nine of the 10 contained T3, and five of them would deliver as much as, or more than, 50 percent of the total amount of T3 the body produces daily.

Four of the 10 supplements contained T4, and some of those contained a dose that could be twice as much as what an adult needs each day. Only one supplement had no detectable T3 or T4.
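
To put those fractions in absolute terms, using approximate textbook values rather than figures from the study (the body produces on the order of 30 micrograms of T3 per day, and a typical full adult T4 replacement dose is roughly 100 to 125 micrograms per day):

    # Approximate textbook values, not data from the study itself.
    daily_t3_production_mcg = 30          # rough daily T3 output of the body
    print(0.5 * daily_t3_production_mcg)  # 15.0 mcg: the ">= 50% of daily T3"
                                          # level that five supplements reached

    typical_t4_dose_mcg = 112             # mid-range adult replacement dose
    print(2 * typical_t4_dose_mcg)        # 224 mcg: "twice what an adult needs"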

The results show there is a need for more effective monitoring of the contents of over-the-counter thyroid support products and more patient education about the products' potential health risks, Dr. Bernet says.

The study was funded by the Department of Clinical Investigation at Walter Reed Army Medical Center, which, in August 2011, became the Walter Reed National Military Medical Center.


Story Source:

The above story is reprinted from materials provided by Mayo Clinic.


ScienceDaily (Oct. 27, 2011) — Using highly potent antibodies isolated from HIV-positive people, researchers have recently begun to identify ways to broadly neutralize the many possible subtypes of HIV. Now, a team led by biologists at the California Institute of Technology (Caltech) has built upon one of these naturally occurring antibodies to create a stronger version they believe is a better candidate for clinical applications.

Recent advances in isolating antibodies from HIV-infected individuals have enabled the discovery of a large number of new, broadly neutralizing anti-HIV antibodies directed against the binding site for the host receptor CD4 -- a functional site on the surface of the virus that enables cell entry and infection. Using a technique known as structure-based rational design, the team modified one already-known and particularly potent antibody -- NIH45-46 -- so that it can target the binding site in a different and more powerful way. A study outlining their process was published in the Oct. 27 issue of Science Express.

"NIH45-46 was already one of the most broad and potent of the known anti-HIV antibodies," says Pamela Bjorkman, Max Delbrück Professor of Biology at Caltech and senior author on the study. "Our new antibody is now arguably the best of the currently available, broadly neutralizing anti-HIV antibodies."

By conducting structural studies, the researchers were able to identify how NIH45-46 interacted with gp120 -- a protein on the surface of the virus that's required for the successful entry of HIV into cells -- to neutralize the virus. Using this information, they were able to create a new antibody (dubbed NIH45-46G54W) that is better able to grab onto and interfere with gp120. This improves the antibody's breadth -- or extent to which it effectively targets many subtypes of HIV -- and potency by an order of magnitude, according to Ron Diskin, a postdoctoral scholar in Bjorkman's lab at Caltech and the paper's lead author.

"Not only did we design an improved version of NIH45-46, our structural data are calling into question previous assumptions about how to make a vaccine in order to elicit such antibodies," says Diskin. "We hope that these observations will help to guide and improve future immunogen design."

By improving the efficacy of antibodies that can neutralize HIV, the researchers point to the possibility of clinical testing for NIH45-46G54W and other antibodies as therapeutic agents. It's also plausible that understanding effective neutralization by powerful antibodies may be useful in vaccine development.

"The results uncover the structural underpinnings of anti-HIV antibody breadth and potency, offer a new view of neutralization by CD4-binding site anti-HIV antibodies, and establish principles that may enable the creation of a new group of HIV therapeutics," says Bjorkman, who is also a Howard Hughes Medical Institute investigator.

Other Caltech authors on the study, "Increasing the Potency and Breadth of an HIV Antibody by Using Structure-Based Rational Design," include Paola M. Marcovecchio, Anthony P. West, Jr., Han Gao, and Priyanthi N. P. Gnanapragasam. Johannes Scheid, Florian Klein, Alexander Abadir, and Michel Nussenzweig from Rockefeller University, and Michael Seaman from Beth Israel Deaconess Medical Center in Boston, also contributed to the paper. The research was funded by the Bill & Melinda Gates Foundation, the National Institutes of Health, the Gordon and Betty Moore Foundation, and the German Research Foundation.


Story Source:

The above story is reprinted from materials provided by California Institute of Technology. The original article was written by Katie Neith.


Journal Reference:

Ron Diskin, Johannes F. Scheid, Paola M. Marcovecchio, Anthony P. West, Jr., Florian Klein, Han Gao, Priyanthi N. P. Gnanapragasam, Alexander Abadir, Michael S. Seaman, Michel C. Nussenzweig, Pamela J. Bjorkman. Increasing the Potency and Breadth of an HIV Antibody by Using Structure-Based Rational Design. Science, Published online Oct. 27, 2011 DOI: 10.1126/science.1213782


ScienceDaily (Oct. 27, 2011) — Today's multicore processors are not being utilized in a sufficiently intelligent way. They get too hot and run slowly because they are used inefficiently. At the same time, transistors are becoming so small that they will ultimately become unreliable. Major European research organizations are now attempting to create a revolution in computer architecture.

Are you disappointed with the performance of your new computer or phone? Then perhaps rightly so: clock frequencies have not risen significantly since 2000. Overheating problems and limited battery capacity are forcing manufacturers to cap power usage substantially, typically at 100 W per chip.
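
The underlying constraint is the dynamic-power relation for CMOS logic, roughly P = a * C * V^2 * f (switching activity a, capacitance C, supply voltage V, clock frequency f). Because voltage has to rise along with frequency, power grows much faster than clock speed. The numbers below are illustrative, not data for any real chip:

    # Toy CMOS dynamic-power model: P ~ a * C * V^2 * f.
    def power(v, f, a_c=1.0):
        return a_c * v**2 * f

    # A 1.5x clock boost that needs a 1.2x voltage bump costs ~2.16x the power,
    # which is how a fixed 100 W budget ends up capping the clock.
    print(power(1.2, 1.5) / power(1.0, 1.0))  # 2.16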

"You can't sell a mobile device with bulky sophisticated cooling. So we can't increase the clock frequency, a problem that multicore processors were designed to avoid," says Per Stenström, Professor of Computer Engineering at Chalmers University of Technology and project coordinator for a new center called Eurecca.

"So far, multicore systems have merely yielded marginal performance improvements. It will be increasingly difficult to keep all the processors busy without exceeding the power limit."

Modern microprocessors typically have six cores. The problem is that no one yet knows how to program even a two-core computer (two parallel processors on the same chip) optimally. The same problem has baffled giants such as Intel and Microsoft. What is needed, basically, is a move from sequential to parallel programming -- the art of getting several processors to operate together like an efficient, well-trained sports team rather than a set of poorly coordinated superstars. This second technological challenge goes by the name of concurrency.
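
To make the sequential-versus-parallel contrast concrete, here is a minimal sketch (the function work is a made-up stand-in for a CPU-bound task; the speedup actually achieved depends on core count, overhead, and how evenly the work divides):

    # Same computation twice: once on a single core, once spread across all
    # available cores with a process pool. Minimal illustrative sketch.
    from multiprocessing import Pool

    def work(n):
        return sum(i * i for i in range(n))  # stand-in CPU-bound task

    if __name__ == "__main__":
        jobs = [2_000_000] * 8

        sequential = [work(n) for n in jobs]  # one core does everything

        with Pool() as pool:                  # one worker per CPU core by default
            parallel = pool.map(work, jobs)   # the cores share the jobs

        assert sequential == parallel         # same answer, less wall-clock time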

"The industry is not yet mature for this step, and we lag behind in terms of education and research," says Per Stenström.

That's why three leading universities in computer architecture are now founding the European Research Center on Computer Architecture (EuReCCA). The planned center so far comprises fifty research scientists, but computer technology institutions from all parts of Europe are joining at a rapid rate, all coordinated by Chalmers. The aim is to make pioneering progress in the development of systems involving multicore processors and paradigms for their productive programming, which should ultimately result in new generations of products and innovative company start-ups.

Eurecca should above all boost the competitiveness of the European computer industry, which is a world leader in computers for embedded applications such as cars and mobile phones. A modern car contains some fifty processors, and British ARM processors are found in mobile phones from manufacturers such as Nokia, Sony Ericsson and Apple. European research projects normally run for three years, but Eurecca will be set up as a longer-term Eurolab structure.

"We want to have a permanent structure in order to sustain a long-term focus on bringing innovations out to industry. This has not worked in a satisfactory way in Europe so far. To build up functional systems of knowledge transfer is not something that can be done in three years," says Per Stenström.

The Eurolab structure is also needed to build up streamlined education in computer technology at master's and doctoral level throughout Europe. The aim is to jointly nurture a new generation of computer architects ready to lead system innovations based on the parallel philosophy that is now in such high demand.

The third technical challenge is miniaturization. Transistor dimensions are halved roughly every 18 months, and soon transistors will be only a few tens of atoms across. Along the way they become increasingly unreliable, which will present an enormous challenge for computer system designers. No one currently knows how to tackle this.

"We quite simply have to create a revolution in computer architecture," says Per Stenström, who expects national and European funding for Eurecca, as well as investments from industry.


Story Source:

The above story is reprinted from materials provided by Chalmers University of Technology.
