Peers encourage risky behaviour just by being there

Worrying findings in a study reported at ScienceDaily this week.

Teenagers are generally regarded as being at a stage where risk-taking behaviour is more common. We now have an indication of why this behaviour occurs at this point in a person’s development.

A scenario was devised to test whether teenagers were more likely to engage in risky actions in the presence of their peers. The answer, sadly, was yes.

One of the study’s authors, Lawrence Steinberg, says “We know that when one is rewarded by one thing then other rewards become more salient. Because adolescents find socializing so rewarding, we postulate that being with friends primes the reward system and makes teens pay more attention to the potential pay offs of a risky decision.”

Having friends is a risky business when you’re a teen.


Top 10 scientific breakthroughs of 2010

Just what it says up there.

Here.


Gender effects on disease?

Here is an interesting theory: noting that certain classes of diseases, including allergies and autoimmune problems, are more prevalent amongst women, researchers in the US are proposing that traditional gender roles have a part to play in limiting the exposure of girls to immune-system ‘challenges’ early in life.

Noting that boys are more likely to be encouraged to play actively while girls tend to be supervised during indoor play and prevented from getting dirty, they suggest that the variation in numbers and kinds of micro-organisms that children encounter is significant.

We’ve heard these ideas before, and the study’s authors are not suggesting that girls should be eating a spoonful of dirt in the backyard. However, they argue that the trend is important enough, and notable enough elsewhere in the world where rapid social change has been observed, to deserve further consideration in epidemiological studies.

More here. (Oregon State University)


Reading is changing

A link to a very interesting article was emailed to me today by a colleague.

The irony doesn’t escape me – in fact, I’m still smiling at this – but it’s a rather lengthy piece at TheAtlantic.com about how the proliferation of text-based communication technologies has changed the way we read.

The provocative title – “Is Google Making Us Stupid?” – doesn’t do the article justice. It’s not only about the way we search for information, but also about how we interact with, digest and utilise it.

The author makes a number of interesting points, but you should read it for yourself. If you can concentrate long enough!


Be a cat person, early

Being raised as a child in a household with a cat may have a protective effect against asthma.

It was reported today that the Columbia Center for Children’s Environmental Health (CCCEH) has described the development of an immune response to cat allergens as early as 2 or 3 years of age. The researchers go on to show that by the age of 5 this translates into a lower risk of those children showing symptoms of asthma, such as wheezing.

In the centre’s own statement: “The presence of cats in the home at a very early age seems to help reduce the risk of developing asthma.”

I wonder if this might be the case in households with dogs or other common domestic pets?


Popular image of Biotechnology = BAD

I came across this report by the Australian Institute for Biotechnology on attitudes to biotech arising from its portrayal in films, and it made for some pretty interesting reading.

We all like a good yarn, and in the last 10 years or so we’ve seen some films that were absolute crackers where science – and particularly biotechnology – has been central to the plot. I’m thinking of Jurassic Park and Gattaca and The Island and you can probably think of more.

A problem arises, though, when creating an exciting story leads writers to take liberties with the scientific principles: an audience unfamiliar with the technologies then assumes a degree of accuracy that simply isn’t there.

Cloning is a perfect example: the processes involved are never accurately shown and the outcomes are almost universally bad or evil, which leads most people to hold at least a subconscious belief that the same technology in the real world is bad.

I wonder if schools, at least here in Australia, are doing enough to counter this trend? Or are we so afraid of being seen to adopt a position that we, as educators, become complicit in this widespread ignorance?


Now, look here!

The Boston Herald (via Crunchgear) has published a story about the creation of a bionic eye by researchers at The Boston Retinal Implant Project.

Designed to be implanted behind the retina, it transmits signals along a fine wire directly into the optic nerve. While this bionic eye won’t provide a great deal of visual resolution and works only for people who were able to see at some point in the past, it holds great promise for restoring sight to those who have lost it to degenerative diseases of the eye.

Human trials will begin in the next couple of years.