Web marketing is one of the biggest sources of income, whether you earn online as a profession or as part-time work. The internet is a great way to earn, but surviving online is not so easy, because there are too many ways to make money online; the difficult part is choosing one that is easy to understand and also pays well.
So today I decided to discuss the different online methods. I have already discussed ways to make money from ads, but here are 10 combined ways for everyone, including those with little knowledge of web design or other technical skills.

1. Cost Per Click ( CPC )

This method pays you for every click on ads published on your site, web pages, forum, or apps. I have already discussed CPC in my article about ads; if you want to learn about CPC in more detail, read my article on ways to make money from ads.
The following cost-per-click networks offer the best rates for every click:

Google AdSense:- It is the biggest and most popular network, trusted by everyone. Google AdSense offers the highest CPC and CPM rates, but your earnings depend entirely on your visitors' locations and your CTR. It is not so easy to get AdSense approval.

Media.net:- This is a new network formed by a partnership of Yahoo and Bing with Media.net. It is a good competitor to AdSense and offers good CPC rates compared to others.

Chitika:- It is also a very popular CPC network. It offers good revenue from ads, though slightly less than AdSense.

Infolinks:- It is one of the best programs for text-based ads. It offers good revenue for in-text link ads as well as other banner ads.

Bidvertiser:- Its rates are lower compared to the programs above, but it is still a good option.
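Whichever network you choose, CPC earnings are simple arithmetic: clicks times the per-click rate, where clicks come from impressions and CTR. A minimal sketch in Python (all figures are made-up examples, not any network's actual rates):

```python
def cpc_earnings(impressions, ctr, cpc_rate):
    """Estimate CPC revenue: clicks = impressions * CTR, earnings = clicks * rate."""
    clicks = impressions * ctr
    return clicks * cpc_rate

# Example: 50,000 impressions, 2% CTR, $0.25 per click (hypothetical figures)
print(cpc_earnings(50_000, 0.02, 0.25))  # 1,000 clicks -> 250.0
```

This is why both traffic volume and CTR matter: doubling either one doubles the payout at the same rate.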
  

2. Cost Per Mille ( CPM )

This method pays you for every impression your ads receive on your website or blog: for every 1,000 unique impressions, you earn money. The difficult task is choosing the network that offers the best rates.
Some of the best CPM networks are


  • Tribalfusion
  • BurstMedia
  • BuysellAds
  • Valueclickmedia
  • VibrantMedia
  • Adpepper
  • Cpxinteractive
  • MadadsMedia
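Whichever network you pick, the CPM math is the same: impressions divided by 1,000, times the rate. A minimal Python sketch (the $2.00 rate is a made-up example, not any network's quote):

```python
def cpm_earnings(impressions, cpm_rate):
    """Estimate CPM revenue: the rate is paid per 1,000 impressions."""
    return (impressions / 1000) * cpm_rate

# Example: 250,000 impressions at a hypothetical $2.00 CPM
print(cpm_earnings(250_000, 2.00))  # -> 500.0
```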


3. Sell Affiliate Products

Affiliate marketing is a commission-based program: for every successful sale made through you, you earn a commission. In a recent article I discussed the Bigrock affiliate program, which offers a commission for every sale of domains and hosting plans. If you want to learn more, I suggest reading about affiliate programs in more detail.
Here are some of the best affiliate programs that offer good commissions:


  • Google Affiliates
  • ClickBank
  • Commission Junction
  • E-junkie
  • Amazon Affiliates
  • eBay Affiliates
  • DigiResult 
  • FreeLancer
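Affiliate earnings are just each sale price times the commission rate, summed over your sales. A minimal Python sketch (the prices and the 10% rate are hypothetical, not any program's actual terms):

```python
def affiliate_commission(sale_prices, rate):
    """Total commission: each sale price times the commission rate, summed."""
    return round(sum(price * rate for price in sale_prices), 2)

# Three hypothetical sales at a 10% commission rate
print(affiliate_commission([99.0, 49.0, 12.0], 0.10))  # -> 16.0
```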


4. Cost Per Action ( CPA )

In this method you earn money for every successful action a user completes on the spot. It is similar to affiliate marketing, but here the ads ask users to fill in details, subscribe to an email service, or take some other action. If someone completes what the ad requires, you earn for every successful action.
Some popular CPA networks are


  • ClickBooth
  • Peerfly
  • CPAWay
  • MaxBounty
  • CPALead
  • NeverBlue
  • Mgcash


5. Earn From Uploading

There are many programs based on paid downloads: for every successful download of your file, you earn money. Some programs are premium-based, while others offer a free service. In the last few years, however, there have been many reports of increasing spam in these types of programs.
Some trusted download networks are


  • ShareCash.org
  • CashFile.org
  • Uploadables


6. Write For Other Sites

If you are interested in writing online, this is a great way to make money. If you have knowledge of any specific topic or field, the best way is to write articles for other services, or you can hire yourself out to blogs that want an author.
Below are some networks that pay you to write for them:


  • PayPerPost
  • Social Spark 
  • Sponsored Review
  • ReviewMe
  • Payu2Blog 


7. Complete Online Surveys

This is one of the most effective and easiest ways for anyone who wants to make money online, because it doesn't require any web knowledge or technical skills. In these programs you fill in accurate answers to every question asked in the survey. It is a good way to make a side income by working two or three hours a day on the internet.
Some of the best survey programs are


  • MySurvey
  • Dollersurvey
  • SurveyMonkey
  • SurveySpot
  • Myview
  • ClearvoiceSurvey
  • Toluna
  • GlobalTestMarket


8. Online Projects / Freelance

This is freelance work, such as data entry jobs. There are many programs to which you can apply; you then complete a specific project or task, and for every successfully completed project you get paid. Many programs are based on freelance services.
Some freelancing networks are listed below


  • Microworkers
  • Elance 
  • Fiverr
  • Odesk
  • BreakStudio
  • ConstantContent


9. Sell Products You Own

This is also an effective way to make money online if you have a product or service you feel people would like to buy. You can put your product up for online auction, which is a good way to earn more; you can design your own product; or you can sell services or plans online.
Some marketplaces are


  • Amazon
  • eBay
  • Quikr
  • OLX
  • Sell.com
  • Gazelle.com


10. Sell Products You Don't Own

This is just like affiliate marketing: you sell products you don't own, either for a direct advertiser or through a network. You can also sell space on your website or blog to display ads for a product, which earns you money for every sale.


  • Amazon Associates
  • ebay
  • Shirtcity
  • CafePress


The above are 10 different ways to start working online. I have tried to cover as many topics as possible. I hope all the above methods are clear and helpful for everyone; to make a bright future online, you have to work hard. There are many other networks in the above categories, but I have published the ones I feel are the simplest and best.
1. Go to the website from here.
2. Click on the Connect button, then allow the app to connect to your Facebook account.

3. The Wall Machine box will appear. This box is customizable: edit the names and photos.

4. After all the steps are completed, click on the Save button.

5. Open your saved file in a new tab and take a screenshot using Print Screen.






  • Type *#61# and press call to check redirection status. To cancel all redirections, dial ##002#.

  • Dial *43# to activate call waiting; dial #43# to deactivate it.

  • If your phone doesn't have incoming and outgoing call barring, you can try this: for outgoing call barring, dial *33*barcode*# and press OK. To deactivate it, dial #33*barcode*#.

  • On any phone on any network, type **43# to enable conference calls. You can make up to 7 calls at once.

  • If you need to block SMS reception (if someone is spamming you), press this code: *35*xxxx*16#, where xxxx is your call barring code (the default is 0000). To remove this barring, type #35*xxxx#.

  • If you want to hide or show your phone number when calling, put one of these codes in front of the number you are going to call: *#30# / *#31#, or *31# / #31#. Works on some networks.

  • Typing *0# or *nm# at the beginning of a text message gives you a detailed delivery report on some networks, but turn off reports in message settings first.

  • When the SIM card PIN2 is blocked, type **042*old PIN2*new PIN2*new PIN2#.
Hey friends! Today I am telling you how you can use free internet on your mobile with a ZONG connection.
It takes 2 simple steps:

Zong Free Internet

STEP (1): Go to the Write Message option and type your mobile company name, mobile model, and "Internet". Then send the message to 131.

Save your settings with the code 1234.

STEP (2): Go to your internet settings profile and edit the APN. Change your APN setting to this:

APN: Wapgw: 10.81.6.33

Use free internet and enjoy!




Today I will show you how to publish a single-line or multi-line blank status on Facebook. You can also use this method in Facebook comments. So let's get started.

   For a Single-Line Blank Status
  • Copy the code below into your status:
  • @[0:0: ]
  • Note:- the text is like @[0:0:space]
  • Don't type the word "space" where I have written it; press the spacebar instead.

    For a Multi-Line Blank Status:

    • Paste the following into your status:
    • @[0:0: ]
      @[0:0: ]
      @[0:0: ]
      @[0:0: ]
      @[0:0: ]
      @[0:0: ]

    Done! Subscribe below for more posts like this.
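If you want to build the multi-line version programmatically, it is just the same token repeated once per blank line. A quick Python sketch (the token itself comes from the trick above):

```python
def blank_status(lines=1):
    """Build a Facebook blank-status string: one '@[0:0: ]' token per blank line."""
    return "\n".join("@[0:0: ]" for _ in range(lines))

print(blank_status(3))  # three blank lines' worth of tokens, newline-separated
```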
ScienceDaily (Nov. 30, 2011) — The effectiveness of using specific fungi as mycoherbicides to combat illicit drug crops remains questionable due to the lack of quality, in-depth research, says a new report from the National Research Council. Questions about the degree of control that could be achieved with such mycoherbicides, as well as uncertainties about their potential effects on nontarget plants, microorganisms, animals, humans, and the environment must be addressed before considering deployment. The report states that additional research is needed to assess the safety and effectiveness of proposed strains of mycoherbicides.

Mycoherbicides, created from plant pathogenic fungi, have been proposed as one tool to eradicate illicit drug crops. Congress requested an independent examination of the scientific issues associated with the feasibility of developing and implementing naturally occurring strains of these fungi to control the illicit cultivation of cannabis, coca, and opium poppy crops.

As an initial step, the report recommends research to study several candidate strains of each fungus in order to identify the most efficacious under a broad array of environmental conditions. The resulting information would guide decisions regarding product formulation, the appropriate delivery method, and the scale required to generate enough mycoherbicide product to achieve significant control. However, conducting the research does not guarantee that a feasible mycoherbicide product will result. Furthermore, countermeasures can be developed against mycoherbicides, and there are unavoidable risks from releasing substantial numbers of living organisms into an ecosystem.

Multiple regulatory requirements would also have to be met before a mycoherbicide could be deployed. Additional regulations and agreements might also be needed before these tools could be used internationally, as approval to conduct tests in countries where mycoherbicides might be used has been difficult or impossible to obtain in the past.

The study was sponsored by the Office of National Drug Control Policy. The National Academy of Sciences, National Academy of Engineering, Institute of Medicine, and National Research Council make up the National Academies. They are independent, nonprofit institutions that provide science, technology, and health policy advice under an 1863 congressional charter. Panel members, who serve pro bono as volunteers, are chosen by the Academies for each study based on their expertise and experience and must satisfy the Academies' conflict-of-interest standards. The resulting consensus reports undergo external peer review before completion.



Story Source:

The above story is reprinted from materials provided by National Academy of Sciences.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Note: If no author is given, the source is cited instead.

Disclaimer: Views expressed in this article do not necessarily reflect those of ScienceDaily or its staff.



ScienceDaily (Nov. 30, 2011) — Research conducted by a pair of physicians at Boston University School of Medicine (BUSM) and Boston Medical Center (BMC) has led to the development of a test that can help diagnose membranous nephropathy in its early stages. The test, which is currently only offered in the research setting and is awaiting commercial development, could have significant implications in the diagnosis and treatment of the disease. Currently, the only way to diagnose the disease is through a biopsy.

The pioneering work is being led by Laurence Beck, MD, PhD, assistant professor of medicine at BUSM and a nephrologist at BMC, and David Salant, MD, professor of medicine at BUSM and chief of the renal section at BMC.

Over the past four years, the Halpin Foundation has contributed more than $350,000 to Beck to investigate the genetics and molecular mechanisms behind membranous nephropathy. Most recently, Beck was awarded a $50,000 grant from the Foundation to further his efforts.

Membranous nephropathy is an autoimmune disease caused by the immune system attacking the kidneys, resulting in the thickening and dysfunction of the kidney's filters, called glomeruli. When antibodies attack the glomeruli, large amounts of protein in the urine are released. In 2009, Beck and Salant identified that the antibodies were binding to a protein in the glomeruli. They determined that the target was a protein called PLA2R, or phospholipase A2 receptor, and these findings were published in the New England Journal of Medicine.

"For the first time, a specific biomarker has been identified for this relatively common kidney disease," said Beck, who is part of an international collaboration that has demonstrated that these antibodies are present in patients from many different ethnicities.

With the antigen protein identified, Beck and Salant have developed a blood test to detect and measure the amount of the specific antibodies in a sample.

Approximately one third of patients with membranous nephropathy eventually develop kidney failure, requiring dialysis or a kidney transplant. According to the University of North Carolina's Kidney Center, the disease affects people over the age of 40, is rare in children and affects more men than women. This disease is treated with high-powered chemotherapy, and if treatment is successful, the antibodies go away.

"Being able to detect the presence of these antibodies using a blood test has tremendous implications about who is treated, and for how long, with the often toxic immunosuppressive drugs," said Beck.

Beck continues his research focus on the treatment of the disease by targeting the antibodies and stopping them from attacking the glomeruli.


Story Source:

The above story is reprinted from materials provided by Boston University Medical Center.


ScienceDaily (Nov. 30, 2011) — A comparison of home-birth trends of the 1970s finds many similarities -- and some differences -- related to current trends in home births.

For instance, in the 1970s -- as now -- women opting to engage in home births tended to have higher levels of education. That's according to a 1978 survey by Home Oriented Maternity Experience (HOME) that was recently found by University of Cincinnati historian Wendy Kline in the archives of the American Congress of Obstetricians and Gynecologists (ACOG).

That survey showed that in the late 1970s, one third of the group's members participating in home births had a bachelor's, master's or doctoral degree. Fewer than one percent did not have a high school education.

Also, according to the 2,000 respondents to HOME's 1978 survey, 36 percent of women engaging in home births at the time were attended by physicians. That is a much higher percentage than is the case currently for mothers participating in home births. (In research by Eugene Declerq, Boston University School of Public Health, and Mairi Breen Rothman, Metro Area Midwives and Allied Services, it was found that about five percent of homebirths were attended by a physician in 2008.)

These comparisons are possible because of historical information found by UC's Kline, including "A Survey of Current Trends in Home Birth," written by the founders of HOME and published in 1979.

Kline is also conducting interviews with and has obtained historical documents from the founders of and the midwives first associated with HOME, a grass roots organization founded in 1974, to provide information and education related to home births.

Kline will present this research and related historical information as one of only nine international presenters invited to the "Communicating Reproduction" conference at Cambridge University Dec. 6-7.

The debate surrounding health, safety and home births rose to national prominence as recently as October 2011 during the Home Birth Consensus Summit in Virginia, held because of increasing interest in home births as an option for expectant mothers.

Overall, Kline's research of HOME and of ACOG counters the stereotypical view of the 1970s home-birth movement as countercultural and peopled by "hippies." In fact, the founders of HOME deliberately reached out to a broad cross section of women across the political and religious spectrum, including religious conservatives as well as those on the left of the political spectrum.

Said Kline, "In looking through the historical record, we find that many women involved in home births in the 1970s signed their names 'Mrs. Robert Smith' or 'Mrs. William Hoffman.' The movement included professionals, business people, farmers, laborers and artists. It defies simplistic categorization."


Story Source:

The above story is reprinted from materials provided by University of Cincinnati.


ScienceDaily (Nov. 30, 2011) — In 2008, according to the National Highway Traffic Safety Administration, 2.3 million automobile crashes occurred at intersections across the United States, resulting in some 7,000 deaths. More than 700 of those fatalities were due to drivers running red lights. But, according to the Insurance Institute for Highway Safety, half of the people killed in such accidents are not the drivers who ran the light, but other drivers, passengers and pedestrians.

In order to reduce the number of accidents at intersections, researchers at MIT have devised an algorithm that predicts when an oncoming car is likely to run a red light. Based on parameters such as the vehicle's deceleration and its distance from a light, the group was able to determine which cars were potential "violators" -- those likely to cross into an intersection after a light has turned red -- and which were "compliant."
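The team's actual classifier is far more sophisticated, but the physical intuition behind those parameters can be sketched as a toy threshold check (the "comfortable braking" threshold and all numbers below are illustrative assumptions, not values from the paper):

```python
def likely_violator(speed, distance, current_decel, comfortable_decel=3.0):
    """Toy red-light-runner check: flag a car if stopping before the stop line
    would require braking harder than a 'comfortable' deceleration, and the
    driver is not already braking that hard.

    speed in m/s, distance to the stop line in m, decelerations in m/s^2.
    Required deceleration to stop within `distance`: v^2 / (2 * d).
    """
    if distance <= 0:
        return True  # already past the stop line
    required_decel = speed ** 2 / (2 * distance)
    return required_decel > comfortable_decel and current_decel < required_decel

# A car at 20 m/s, 30 m out, barely braking (0.5 m/s^2): needs ~6.7 m/s^2 to stop
print(likely_violator(20.0, 30.0, 0.5))  # True
# A car at 10 m/s, 40 m out: needs only 1.25 m/s^2, an easy stop
print(likely_violator(10.0, 40.0, 0.5))  # False
```

A real system would of course fuse many more signals over time, which is how the MIT approach keeps its false-alarm rate low.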

The researchers tested the algorithm on data collected from an intersection in Virginia, finding that it accurately identified potential violators within a couple of seconds of reaching a red light -- enough time, according to the researchers, for other drivers at an intersection to be able to react to the threat if alerted. Compared to other efforts to model driving behavior, the MIT algorithm generated fewer false alarms, an important advantage for systems providing guidance to human drivers. The researchers report their findings in a paper that will appear in the journal IEEE Transactions on Intelligent Transportation Systems.

Jonathan How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics at MIT, says "smart" cars of the future may use such algorithms to help drivers anticipate and avoid potential accidents.

Video: See the team's algorithm in action as robots are able to negotiate a busy intersection and avoid potential accidents.

"If you had some type of heads-up display for the driver, it might be something where the algorithms are analyzing and saying, 'We're concerned,'" says How, who is one of the paper's authors. "Even though your light might be green, it may recommend you not go, because there are people behaving badly that you may not be aware of."

How says that in order to implement such warning systems, vehicles would need to be able to "talk" with each other, wirelessly sending and receiving information such as a car's speed and position data. Such vehicle-to-vehicle (V2V) communication, he says, can potentially improve safety and avoid traffic congestion. Today, the U.S. Department of Transportation (DOT) is exploring V2V technology, along with several major car manufacturers -- including Ford Motor Company, which this year has been road-testing prototypes with advanced Wi-Fi and collision-avoidance systems.

"You might have a situation where you get a snowball effect where, much more rapidly than people envisioned, this [V2V] technology may be accepted," How says.

In the meantime, researchers including How are developing algorithms to analyze vehicle data that would be broadcast via such V2V systems. Georges Aoude SM '07, PhD '11, a former student of How's, designed an algorithm based on a technique that has been successfully applied in many artificial intelligence domains, but is relatively new to the transportation field. This algorithm is able to capture a vehicle's motion in multiple dimensions using a highly accurate and efficient classifier that can be executed in less than five milliseconds.

Along with colleagues Vishnu Desaraju SM '10 and Lauren Stephens, an MIT undergraduate, How and Aoude tested the algorithm using an extensive set of traffic data collected at a busy intersection in Christianburg, Va. The intersection was heavily monitored as part of a safety-prediction project sponsored by the DOT. The DOT outfitted the intersection with a number of instruments that tracked vehicle speed and location, as well as when lights turned red.

Aoude and colleagues applied their algorithm to data from more than 15,000 approaching vehicles at the intersection, and found that it was able to correctly identify red-light violators 85 percent of the time -- an improvement of 15 to 20 percent over existing algorithms.

The researchers were able to predict, within a couple of seconds, whether a car would run a red light. The researchers actually found a "sweet spot" -- one to two seconds in advance of a potential collision -- when the algorithm has the highest accuracy and when a driver may still have enough time to react.

Compared to similar safety-prediction technologies, the group found that its algorithm generated fewer false positives. How says this may be due to the algorithm's ability to analyze multiple parameters. He adds that other algorithms tend to be "skittish," erring on the side of caution in flagging potential problems, which may itself be a problem when cars are outfitted with such technology.

"The challenge is, you don't want to be overly pessimistic," How says. "If you're too pessimistic, you start reporting there's a problem when there really isn't, and then very rapidly, the human's going to push a button that turns this thing off."

The researchers are now investigating ways to design a closed-loop system -- to give drivers a recommendation of what to do in response to a potential accident -- and are also planning to adapt the existing algorithm to air traffic control, to predict the behavior of aircraft.


Story Source:

The above story is reprinted from materials provided by Massachusetts Institute of Technology.


ScienceDaily (Nov. 30, 2011) — With the December holidays a peak season for indulging in marzipan, scientists are reporting development of a new test that can tell the difference between the real thing -- a pricey but luscious paste made from ground almonds and sugar -- and cheap fakes made from ground soy, peas and other ingredients. The report appears in ACS' Journal of Agricultural and Food Chemistry.

Ilka Haase and colleagues explain that marzipan is a popular treat in some countries, especially at Christmas and New Year's, when displays of marzipan sculpted into fruit, Santa and tree shapes pop up in stores. And cakes like marzipan stollen (a rich combo of raisins, nuts and cherries with a marzipan filling) are a holiday tradition. But the cost of almonds leads some unscrupulous manufacturers to use cheap substitutes like ground-up peach seeds, soybeans or peas.

Current methods for detecting that trickery have drawbacks, allowing counterfeit marzipan to slip onto the market to unsuspecting consumers. To improve the detection of contaminants in marzipan, the researchers became food detectives and adapted a method called the polymerase chain reaction (PCR) -- the same test famed for use in crime scene investigations.

They tested various marzipan concoctions with different amounts of apricot seeds, peach seeds, peas, beans, soy, lupine, chickpeas, cashews and pistachios. PCR enabled them to easily finger the doctored pastes. They could even detect small amounts -- as little as 0.1% -- of an almond substitute. The researchers say that the PCR method could serve as a perfect tool for the routine screening of marzipan pastes for small amounts of contaminants.


Story Source:

The above story is reprinted from materials provided by American Chemical Society.


Journal Reference:

Philipp Brüning, Ilka Haase, Reinhard Matissek, Markus Fischer. Marzipan: Polymerase Chain Reaction-Driven Methods for Authenticity Control. Journal of Agricultural and Food Chemistry, 2011; 59 (22): 11910 DOI: 10.1021/jf202484a


ScienceDaily (Nov. 30, 2011) — Distrust is the central motivating factor behind why religious people dislike atheists, according to a new study led by University of British Columbia psychologists.

"Where there are religious majorities -- that is, in most of the world -- atheists are among the least trusted people," says lead author Will Gervais, a doctoral student in UBC's Dept. of Psychology. "With more than half a billion atheists worldwide, this prejudice has the potential to affect a substantial number of people."

While reasons behind antagonism towards atheists have not been fully explored, the study -- published in the current online issue of Journal of Personality and Social Psychology -- is among the first explorations of the social psychological processes underlying anti-atheist sentiments.

"This antipathy is striking, as atheists are not a coherent, visible or powerful social group," says Gervais, who co-authored the study with UBC Associate Prof. Ara Norenzayan and Azim Shariff of the University of Oregon. The study is titled, Do You Believe in Atheists? Distrust is Central to Anti-Atheist Prejudice.

The researchers conducted a series of six studies with 350 American adults and nearly 420 university students in Canada, posing a number of hypothetical questions and scenarios to the groups. In one study, participants found a description of an untrustworthy person to be more representative of atheists than of Christians, Muslims, gay men, feminists or Jewish people. Only rapists were distrusted to a comparable degree.

The researchers concluded that religious believers' distrust -- rather than dislike or disgust -- was the central motivator of prejudice against atheists, adding that these studies offer important clues on how to combat this prejudice.

One motivation for the research was a Gallup poll that found that only 45 per cent of American respondents would vote for a qualified atheist president, says Norenzayan. The figure was the lowest among several hypothetical minority candidates. Poll respondents rated atheists as the group that least agrees with their vision of America, and that they would most disapprove of their children marrying.

The religious behaviors of others may provide believers with important social cues, the researchers say. "Outward displays of belief in God may be viewed as a proxy for trustworthiness, particularly by religious believers who think that people behave better if they feel that God is watching them," says Norenzayan. "While atheists may see their disbelief as a private matter on a metaphysical issue, believers may consider atheists' absence of belief as a public threat to cooperation and honesty."


Story Source:

The above story is reprinted from materials provided by University of British Columbia.


ScienceDaily (Nov. 30, 2011) — Surgeons can learn their skills more quickly if they are taught how to control their eye movements. Research led by the University of Exeter shows that trainee surgeons learn technical surgical skills much more quickly and deal better with the stress of the operating theatre if they are taught to mimic the eye movements of experts.

This research, published in the journal Surgical Endoscopy, could transform the way in which surgeons are trained to be ready for the operating theatre.

Working in collaboration with the University of Hong Kong, the Royal Devon and Exeter NHS Foundation Trust and the Horizon training centre Torbay, the University of Exeter team identified differences in the eye movements of expert and novice surgeons. They devised a gaze training programme, which taught the novices the 'expert' visual control patterns. This enabled them to learn technical skills more quickly than their fellow students and perform these skills in distracting conditions similar to the operating room.

Thirty medical students were divided into three groups, each undertaking a different type of training. The 'gaze trained' group of students was shown a video, captured by an eye tracker, displaying the visual control of an experienced surgeon. The footage highlighted exactly where and when the surgeon's eyes were fixed during a simulated surgical task. The students then conducted the task themselves, wearing the same eye-tracking device. During the task they were encouraged to adopt the same eye movements as those of the expert surgeon.

Students learned that successful surgeons 'lock' their eyes to a critical location while performing complex movements using surgical instruments. This prevents them from tracking the tip of the surgical tool, helping them to be accurate and avoid being distracted.

After repeating the task a number of times, the students' eye movements soon mimicked those of a far more experienced surgeon. Members of the other groups, who were either taught how to move the surgical instruments or were left to their own devices, did not learn as quickly. Those students' performance broke down when they were put into conditions that simulated the environment of the operating theatre and they needed to multi-task.

Dr Samuel Vine of the University of Exeter explained: "It appears that teaching novices the eye movements of expert surgeons allows them to attain high levels of motor control much quicker than novices taught in a traditional way. This highlights the important link between the eye and hand in the performance of motor skills. These individuals were also able to successfully multi-task without their technical skills breaking down, something that we know experienced surgeons are capable of doing in the operating theatre.

"Teaching eye movements rather than the motor skills may have reduced the working memory required to complete the task. This may be why they were able to multi-task whilst the other groups were not."

Dr Samuel Vine and Dr Mark Wilson from Sport and Health Sciences at the University of Exeter have previously worked with athletes to help them improve their performance through gaze training, but this is the first study to examine the benefits of gaze training in surgical skills training.

Dr Vine added: "The findings from our research highlight the potential for surgical educators to 'speed up' the initial phase of technical skill learning, getting trainees ready for the operating room earlier and therefore enabling them to gain more 'hands on' experience. This is important against a backdrop of reduced government budgets and new EU working time directives, meaning that in the UK we have less money and less time to deliver specialist surgical training."

The research team is now analysing the eye movements of surgeons performing 'real life' operations and is working to develop a software training package that will automatically guide trainees to adopt surgeons' eye movements.

Mr John McGrath, Consultant Surgeon at the Royal Devon and Exeter Hospital, said: "The use of simulators has become increasingly common during surgical training to ensure that trainee surgeons have reached a safe level of competency before performing procedures in the real-life operating theatre. Up to now, there has been fairly limited research to understand how these simulators can be used to their maximum potential.

"This exciting collaboration with the Universities of Exeter and Hong Kong has allowed us to trial a very novel approach to surgical education, applying the team's international expertise in the field of high performance athletes. Focussing on surgeons' eye movements has resulted in a reduction in the time taken to learn specific procedures and, more importantly, demonstrated that their skills are less likely to break down under pressure. Our current work has now moved into the operating theatre to ensure that patients will benefit from the advances in surgical training and surgical safety."


Story Source:

The above story is reprinted from materials provided by University of Exeter.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Note: If no author is given, the source is cited instead.

ScienceDaily (Nov. 30, 2011) — Scientists investigating the interactions, or binding patterns, of a major tumor-suppressor protein known as p53 with the entire genome in normal human cells have turned up key differences from those observed in cancer cells. The distinct binding patterns reflect differences in the chromatin (the way DNA is packed with proteins), which may be important for understanding the function of the tumor suppressor protein in cancer cells.

The study was conducted by scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory and collaborators at Cold Spring Harbor Laboratory, and is published in the December 15 issue of the journal Cell Cycle.

"No other study has shown such a dramatic difference in a tumor suppressor protein binding to DNA between normal and cancer-derived cells," said Brookhaven biologist Krassimira Botcheva, lead author on the paper. "This research makes it clear that it is essential to study p53 functions in both types of cells in the context of chromatin to gain a correct understanding of how p53 tumor suppression is affected by global epigenetic changes -- modifications to DNA or chromatin -- associated with cancer development."

Because of its key role in tumor suppression, p53 is the most studied human protein. It modulates a cell's response to a variety of stresses (nutrient starvation, oxygen level changes, DNA damage caused by chemicals or radiation) by binding to DNA and regulating the expression of an extensive network of genes. Depending on the level of DNA damage, it can activate DNA repair, stop the cells from multiplying, or cause them to self-destruct -- all of which can potentially prevent or stop tumor development. Malfunctioning p53 is a hallmark of human cancers.

Most early studies of p53 binding explored its interactions with isolated individual genes, and all whole-genome studies to date have been conducted in cancer-derived cells. This is the first study to present a high-resolution genome-wide p53-binding map for normal human cells, and to correlate those findings with the "epigenetic landscape" of the genome.

"We analyzed the p53 binding in the context of the human epigenome, by correlating the p53 binding profile we obtained in normal human cells with a published high-resolution map of DNA methylation -- a type of chemical modification that is one of the most important epigenetic modifications to DNA -- that had been generated for the same cells," Botcheva said.

Key findings

In the normal human cells, the scientists found p53 binding sites located in close proximity to genes, particularly at sites in the genome known as transcription start sites, which serve as "start" signals for transcribing genes. Though this association of binding sites with genes and transcription start sites was previously observed in studies of functional, individually analyzed binding sites, it was not seen in high-throughput whole-genome studies of cancer-derived cell lines. In those earlier studies, the identified p53 binding sites were generally not close to genes or to the sites in the human genome where transcription starts.

Additionally, nearly half of the newly identified p53 binding sites in the normal cells (in contrast to about five percent of the sites reported in cancer cells) reside in so-called CpG islands. These are short DNA sequences with unusually high numbers of cytosine and guanine bases (the C and G of the four-letter genetic code alphabet, consisting of A, T, C, and G). CpG islands tend to be hypo- (or under-) methylated relative to the heavily methylated mammalian genome.

"This association of binding sites with CpG islands in the normal cells is what prompted us to investigate a possible genome-wide correlation between the identified sites and the CpG methylation status," Botcheva said.

The scientists found that p53 binding sites were enriched at hypomethylated regions of the human genome, both in and outside CpG islands.

"This is an important finding because, during cancer development, many CpG islands are subjected to extensive methylation while the bulk of the genomic DNA becomes hypomethylated," Botcheva said. "These major epigenetic changes may contribute to the differences observed in the p53-binding-sites' distribution in normal and cancer cells."

The scientists say this study clearly illustrates that the genomic landscape -- the DNA modifications and the associated chromatin changes -- have a significant effect on p53 binding. Furthermore, it greatly extends the list of experimentally defined p53 binding sites and provides a general framework for investigating the interplay between transcription factor binding, tumor suppression, and epigenetic changes associated with cancer development.

This research, which was funded by the DOE Office of Science, lays groundwork for further advancing the detailed understanding of radiation effects, including low-dose radiation effects, on the human genome.

The research team also includes John Dunn and Carl Anderson of Brookhaven Lab, and Richard McCombie of Cold Spring Harbor Laboratory, where the high-throughput Illumina sequencing was done.

Methodology

The p53 binding sites were identified by a method called ChIP-seq: chromatin immunoprecipitation (ChIP), which uses immunochemistry tools to produce a library of DNA fragments bound by a protein of interest, followed by massively parallel DNA sequencing (seq), which simultaneously determines millions of sequences (the order of the nucleotide bases A, T, C and G in DNA) for these fragments.

"The experiment is challenging, the data require independent experimental validation and extensive bioinformatics analysis, but it is indispensable for high-throughput genomic analyses," Botcheva said. Establishing this capability at Brookhaven directly supports efforts to develop profiling technologies for evaluating how epigenetic modifications modulate responses to low-dose ionizing radiation; it is also applicable to plant epigenetic studies.

The analysis required custom-designed software developed by Brookhaven bioinformatics specialist Sean McCorkle.

"Mapping the locations of nearly 20 million sequences in the 3-billion-base human genome, identifying binding sites, and performing comparative analysis with other data sets required new programming approaches as well as parallel processing on many CPUs," McCorkle said. "The sheer volume of this data required extensive computing, a situation expected to become increasingly commonplace in biology. While this work was a sequence data-processing milestone for Brookhaven, we expect data volumes only to increase in the future, and the computing challenges to continue."


Story Source:

The above story is reprinted from materials provided by DOE/Brookhaven National Laboratory.

Journal References:

Krassimira Botcheva, Sean R. McCorkle, W.R. McCombie, John J. Dunn, Carl W. Anderson. Distinct p53 genomic binding patterns in normal and cancer-derived human cells. Cell Cycle, 2011; 10 (24) [link]

William A. Freed-Pastor, Carol Prives. Dissimilar DNA binding by p53 in normal and tumor-derived cells. Cell Cycle, 2011; 10 (24) [link]

ScienceDaily (Nov. 30, 2011) — Ultra-tiny zinc oxide (ZnO) particles with dimensions less than one-ten-millionth of a meter are among the ingredients list of some commercially available sunscreen products, raising concerns about whether the particles may be absorbed beneath the outer layer of skin. To help answer these safety questions, an international team of scientists from Australia and Switzerland have developed a way to optically test the concentration of ZnO nanoparticles at different skin depths. They found that the nanoparticles did not penetrate beneath the outermost layer of cells when applied to patches of excised skin.

The results, which were published this month in the Optical Society's (OSA) open-access journal Biomedical Optics Express, lay the groundwork for future studies in live patients.

The high optical absorption of ZnO nanoparticles in the UVA and UVB range, along with their transparency in the visible spectrum when mixed into lotions, makes them appealing candidates for inclusion in sunscreen cosmetics. However, the particles have been shown to be toxic to certain types of cells within the body, making it important to study the nanoparticles' fate after being applied to the skin. By characterizing the optical properties of ZnO nanoparticles, the Australian and Swiss research team found a way to quantitatively assess how far the nanoparticles might migrate into skin.

The team used a technique called nonlinear optical microscopy, which illuminates the sample with short pulses of laser light and measures a return signal. Initial results show that ZnO nanoparticles from a formulation that had been rubbed into skin patches for 5 minutes, incubated at body temperature for 8 hours, and then washed off, did not penetrate beneath the stratum corneum, or topmost layer of the skin. The new optical characterization should be a useful tool for future non-invasive in vivo studies, the researchers write.


Story Source:

The above story is reprinted from materials provided by Optical Society of America.

Journal Reference:

Zhen Song, Timothy A. Kelf, Washington H. Sanchez, Michael S. Roberts, Jaro Ricka, Martin Frenz, Andrei V. Zvyagin. Characterization of optical properties of ZnO nanoparticles for quantitative imaging of transdermal transport. Biomedical Optics Express, 2011; 2 (12): 3321 DOI: 10.1364/BOE.2.003321

ScienceDaily (Nov. 30, 2011) — Imagine someone inventing a "super-toner," a revolutionary new "dry ink" for copiers and laser printers that produces higher-quality, sharper color images more economically, cutting electricity use by up to 30 percent. One that also reduces emissions of carbon dioxide -- the main greenhouse gas -- in the production of the tens of thousands of tons of toner made each year. One that reduces the cost of laser printing, making it more affordable for more offices, schools and homes.

Sound like a toner that is too good to be true? Well, a team of scientists at the Xerox Corporation actually invented it. A new episode in the 2011 edition of a video series from the American Chemical Society (ACS), the world's largest scientific society, focuses on the research and the teamwork that led to this advance.

Titled Prized Science: How the Science Behind ACS Awards Impacts Your Life, the videos are available without charge at the Prized Science website and on DVD.

ACS encourages educators, schools, museums, science centers, news organizations and others to embed links to Prized Science on their websites. The videos discuss scientific research in non-technical language for general audiences. New episodes in the series, which focuses on ACS' 2011 award recipients, will be issued in November and December.

"Science awards shine light on individuals who have made impressive achievements in research," noted ACS President Nancy B. Jackson, Ph.D. "Often, the focus is on the recipients, with the public not fully grasping how the award-winning research improves the everyday lives of people around the world. The Prized Science videos strive to give people with no special scientific knowledge the chance to discover the chemistry behind the American Chemical Society's national awards and see how it improves and transforms our daily lives."

A Revolutionary New "Dry Ink" for Laser Printers & Photocopy Machines features the research of Patricia Burns, Ph.D., Grazyna Kmiecik-Lawrynowicz, Ph.D., Chieh-Min Cheng, Ph.D., and Tie Hwee Ng, Ph.D., winners of the 2011 ACS Award for Team Innovation sponsored by the ACS Corporation Associates. Toner is the fine powder used instead of ink in photocopy machines, laser printers and multifunction devices -- machines that print, copy and fax. The researchers at Xerox developed a new toner called "EA Toner," which stands for "emulsion aggregation." They start with a liquid material that looks like house paint. That's the "emulsion" part. Then, they throw in pigments for color, waxes and other useful things and let everything "aggregate," or stick together. Then, it all dries out, and what's left is a fine powder that they can put into a toner cartridge. That worked fine in the lab, but scaling it up to produce millions of toner cartridges to meet consumers' demands was difficult -- all of the scientists had to work together to make the new toner a commercial reality.


Story Source:

The above story is reprinted from materials provided by American Chemical Society.

ScienceDaily (Nov. 30, 2011) — A team of researchers from the University of Utah and the University of Massachusetts has identified the first gene associated with frequent herpes-related cold sores.

The findings were published in the Dec. 1, 2011, issue of the Journal of Infectious Diseases.

Herpes simplex labialis (HSL) is an infection caused by herpes simplex virus type 1 (HSV-1) that affects more than 70 percent of the U.S. population. Once HSV-1 has infected the body, it is never removed by the immune system. Instead, it is transported to nerve cell bodies, where it lies dormant until it is reactivated. The most common visible symptom of HSV-1 reactivation is a cold sore on or around the mouth. Although a majority of people are infected by HSV-1, the frequency of cold sore outbreaks is extremely variable and the causes of reactivation are uncertain.

"Researchers believe that three factors contribute to HSV-1 reactivation -- the virus itself, exposure to environmental factors, and genetic susceptibility," says John D. Kriesel, M.D., research associate professor of infectious diseases at the University of Utah School of Medicine and first author on the study. "The goal of our investigation was to define genes linked to cold sore frequency."

Kriesel and his colleagues had previously identified a region of chromosome 21 containing six genes significantly linked to HSL disease, using DNA collected from 43 large families originally used to map the human genome. In the current study, Kriesel and his colleagues performed intensive analysis of this chromosome region using single nucleotide polymorphism (SNP) genotyping, a test that identifies differences in genetic make-up between individuals.

"Using SNP genotyping, we were able to identify 45 DNA sequence variations among 618 study participants, 355 of whom were known to be infected with HSV-1," says Kriesel. "We then used two methods called linkage analysis and transmission disequilibrium testing to determine if there was a genetic association between particular DNA sequence variations and the likelihood of having frequent cold sore outbreaks."

Kriesel and his colleagues discovered that an obscure gene called C21orf91 was associated with susceptibility to HSL. They identified five major variations of C21orf91, two of which seemed to protect against HSV-1 reactivation and two of which seemed to increase the likelihood of having frequent cold sore outbreaks.

"There is no cure for HSV-1 and, at this time, there is no way for us to predict or prevent cold sore outbreaks," says Kriesel. "The C21orf91 gene seems to play a role in cold sore susceptibility, and if this data is confirmed among a larger, unrelated population, this discovery could have important implications for the development of drugs that affect cold sore frequency."

Kriesel's University of Utah collaborators include Maurine R. Hobbs, Ph.D., research assistant professor of internal medicine and adjunct assistant professor of human genetics, and Mark F. Leppert, Ph.D., distinguished professor and former chair of human genetics.


Story Source:

The above story is reprinted from materials provided by University of Utah Health Sciences.

Journal Reference:

J. D. Kriesel, B. B. Jones, N. Matsunami, M. K. Patel, C. A. St. Pierre, E. A. Kurt-Jones, R. W. Finberg, M. Leppert, M. R. Hobbs. C21orf91 Genotypes Correlate With Herpes Simplex Labialis (Cold Sore) Frequency: Description of a Cold Sore Susceptibility Gene. Journal of Infectious Diseases, 2011; 204 (11): 1654 DOI: 10.1093/infdis/jir633

ScienceDaily (Nov. 30, 2011) — Take a Petri dish containing crude petroleum and it will release a strong odor distinctive of the toxins that make up the fossil fuel. Sprinkle mushroom spores over the Petri dish and let it sit for two weeks in an incubator, and surprise, the petroleum and its smell will disappear. "The mushrooms consumed the petroleum!" says Mohamed Hijri, a professor of biological sciences and researcher at the University of Montreal's Institut de recherche en biologie végétale (IRBV).

Hijri co-directs a project with B. Franz Lang promoting nature as the number one ally in the fight against contamination. Lang holds the Canada Research Chair on Comparative and Evolutionary Genomics and is a professor at the university's Department of Biochemistry. By using bacteria to stimulate the exceptional growth capacity of certain plants and microscopic mushrooms, Hijri and Lang believe they can create in situ decontamination units capable of tackling the most contaminated sites on the planet.

The recipe is simple. In the spring, willow cuttings are planted at 25-centimeter intervals; their roots dive into the ground and soak up contaminants, which bacteria help degrade in the wood. At the end of the season, the stems and leaves are burned, leaving a handful of ashes imprisoning all of the heavy metals that accumulated in the plant cells. Highly contaminated soil will be cleansed after just a few cycles. "In addition, it's beautiful," says Hijri, pointing to a picture of dense vegetation covering the ground of an old refinery after just three weeks.

Thanks to the collaboration of an oil company from the Montreal area, the researchers had access to a microbiological paradise: an area where practically nothing can grow and where no one ventures without protective gear worthy of a space traveler. This is where Hijri collected microorganisms specialized in the ingestion of fossil fuels. "If we leave nature to itself, even the most contaminated sites will find some sort of balance thanks to the colonization by bacteria and mushrooms. But by isolating the most efficient species in this biological battle, we can gain a lot of time."

Natural and artificial selection

This is the visible part of the project, which could lead to a breakthrough in soil decontamination. The project, called Improving Bioremediation of Polluted Soils Through Environmental Genomics, requires time-consuming sampling and fieldwork as well as DNA sequencing of the species in question. It involves 16 researchers from the University of Montreal and McGill University, many of whom are affiliated with the IRBV. The team also includes four researchers (lawyers and political scientists) specializing in the ethical, environmental, economic, legal and social aspects of genomics.

The principle is based on a well-known process in the sector called phytoremediation, which consists of using plants for decontamination. "However, in contaminated soils, it isn't the plant doing most of the work," says Lang. "It's the microorganisms, i.e., the mushrooms and bacteria accompanying the roots. There are thousands of species of microorganisms, and our job is to find the best plant-mushroom-bacteria combinations."

Botanist Michel Labrecque is overseeing the plant portion of the project. The willow seems to be one of the leading candidate species at this point, given its rapid growth and early foliation. In addition, its stems grow back even stronger once they have been cut, so there is no need to plant new trees every year. However, the best willow species still needs to be determined.


Story Source:

The above story is reprinted from materials provided by Université de Montréal.

ScienceDaily (Nov. 30, 2011) — The most poisonous substance on Earth -- already used medically in small doses to treat certain nerve disorders and facial wrinkles -- could be re-engineered for an expanded role in helping millions of people with rheumatoid arthritis, asthma, psoriasis and other diseases, scientists are reporting. Their study appears in ACS' journal Biochemistry.

Edwin Chapman and colleagues explain that the toxins, or poisons, produced by Clostridium botulinum bacteria, the cause of a rare but severe form of food poisoning, are the most powerful toxins known to science. Doctors can inject small doses, however, to block the release of neurotransmitters, the chemical messengers that transmit signals from one nerve cell to another. The toxins break down a protein in nerve cells that mediates the release of neurotransmitters, disrupting nerve signals that cause pain, muscle spasms and other symptoms in certain diseases. That protein exists not just in nerve cells but in other cells in the human body. However, these non-nerve cells lack the receptors needed for the botulinum toxins to enter and work. Chapman's group sought to expand the potential use of the botulinum toxins by hooking them to a molecule that can attach to receptors on other cells.

Their laboratory experiments showed that these engineered botulinum toxins do work in non-nerve cells, blocking the release of a protein from immune cells linked to inflammation, which is the underlying driving force behind a range of diseases. Such botulinum toxin therapy holds potential in a range of chronic inflammatory diseases and perhaps other conditions, which could expand the role of these materials in medicine.


Story Source:

The above story is reprinted from materials provided by American Chemical Society.

Journal Reference:

Felix L. Yeh, Yiming Zhu, William H. Tepp, Eric A. Johnson, Paul J. Bertics, Edwin R. Chapman. Retargeted Clostridial Neurotoxins as Novel Agents for Treating Chronic Diseases. Biochemistry, 2011; 50 (48): 10419 DOI: 10.1021/bi201490t

ScienceDaily (Dec. 1, 2011) — Only 21 percent of surveyed medical students could identify five true and two false indications of when and when not to wash their hands in the clinical setting, according to a study published in the December issue of the American Journal of Infection Control, the official publication of APIC -- the Association for Professionals in Infection Control and Epidemiology.

Three researchers from the Institute for Medical Microbiology and Hospital Epidemiology at Hannover Medical School in Hannover, Germany, collected surveys from 85 medical students in their third year of study during a lecture class that all students must pass before bedside training and contact with patients commences. Students were given seven scenarios, of which five ("before contact to a patient," "before preparation of intravenous fluids," "after removal of gloves," "after contact to the patient's bed," and "after contact to vomit") were correct hand hygiene (HH) indications. Only 33 percent of the students correctly identified all five true indications, and only 21 percent correctly identified all true and false indications.

Additionally, the students expected that their own HH compliance would be "good" while that of nurses would be lower, despite other published data that show a significantly higher rate of HH compliance among nursing students than among medical students. The surveyed students further believed that HH compliance rates would be inversely proportional to the level of training and career attainment of the physician, which confirms a previously discovered bias among medical students that is of particular concern, as these higher-level physicians are often the ones training the medical students at the bedside.

"There is no doubt that we need to improve the overall attitude toward the use of alcohol-based hand rub in hospitals," conclude the authors. "To achieve this goal, the adequate behavior of so-called 'role models' is of particular importance."


Story Source:

The above story is reprinted from materials provided by Elsevier, via AlphaGalileo.

Journal Reference:

K. Graf, I.F. Chaberny, R.-P. Vonberg. Beliefs about hand hygiene: A survey in medical students in their first clinical year. American Journal of Infection Control, 2011; 39 (10)
