Drug company funded events for health professionals: the state of play in Australia

The links between the pharmaceutical industry and doctors are many and tangled. Drug companies are keen to schmooze doctors and, directly or not, persuade clinicians to prescribe their drug instead of a similar one by a competitor. One way that drug companies try to influence doctors is by sponsoring events, such as conferences or [...]...

Robertson, J., Moynihan, R., Walkom, E., Bero, L., & Henry, D. (2009). Mandatory disclosure of pharmaceutical industry-funded events for health professionals. PLoS Medicine, 6(11). DOI: 10.1371/journal.pmed.1000128



Alvin: World Usability Day | Making Life Easy!

Alvin
World Usability Day | Making Life Easy! - http://www.worldusabilityday.org/


Dark Energy: Where did the Light go? (Part 3) [Starts With A Bang]

Though the Sun is gone, I have a light. -Kurt Cobain

Last time we visited dark energy, we discussed its initial discovery. This came about from the observation that supernovae at a given redshift (i.e., moving away from us) appear systematically fainter than we could explain.

[Image: supernova.jpg]

But we weren't satisfied with simply saying that there must be dark energy. We asked a lot of critical questions about why these supernovae might appear so faint.

First off, we asked the question, "Could these supernovae from far away be different from the type Ia supernovae we have today?"

[Image: phot-31b-07-preview.jpg]

Unfortunately, the answer is a resounding no. So long as atoms work exactly the same way, a white dwarf collapses at the same critical mass at all places and times in the Universe. The process of forming a Type Ia Supernova -- having a white dwarf accrete mass until the core collapses and it explodes -- should be independent of location and time.

Well, if the supernovae themselves are constant, could the environments they form in be different from the environments today? Of course they could. So, is there any way to make the supernovae appear fainter without them actually being fainter, and without having to resort to dark energy? Sure, you might say: block some of that light! All you need is some dust, like so.

[Image: the dark dust cloud Barnard 68, imaged by the VLT (barnard68v2_vlt.jpg)]

What a simple idea, right? Problem solved?

Not so fast. Dust, in real life, is made up of real particles (atoms, molecules, grains, etc.) with real sizes. This means it affects light differently at different wavelengths: not just red, green, and blue, but X-rays, ultraviolet, infrared, and more. But the light from these supernovae isn't dimmed more in one spectral band than any other; it's dimmed equally at all wavelengths, which real dust simply can't do!

So, real dust is out. But what if we invented some new type of dust that absorbed light the same at all wavelengths? We can give it a name: grey dust. We have no idea what would cause it, but it's a lot more believable that there's some new kind of dust out there than that there's a whole new type of energy pervading the Universe.

Well, if this grey dust were out there, the light from distant supernovae would simply keep appearing dimmer and dimmer the farther away they were. But if the Universe has dark energy, the supernovae should start to appear relatively brighter beyond a certain distance. Take a look at the graph below to compare some different theories with the data.

[Graph: predicted dimming vs. distance for different models compared with the supernova data (img44.gif)]

As you can see, grey dust (the top line) is as inconsistent with the data as a Universe containing only normal matter (the bottom line).
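If you want to see where curves like these come from, here's a minimal sketch (my own illustration, not the models actually plotted above; the Hubble constant, density parameters, and grey-dust term are all assumed, illustrative values) that computes the distance modulus, i.e. how dim a standard candle looks, as a function of redshift:

```python
# A minimal sketch: distance modulus vs. redshift for a matter-only
# universe, a flat dark-energy universe, and matter plus hypothetical
# "grey dust". All parameter values are illustrative assumptions.
import math
from scipy.integrate import quad

H0 = 70.0                # Hubble constant in km/s/Mpc (assumed)
C = 299792.458           # speed of light in km/s

def luminosity_distance(z, omega_m, omega_lambda):
    """Luminosity distance in Mpc for a spatially flat FRW universe."""
    inv_E = lambda zp: 1.0 / math.sqrt(omega_m * (1 + zp) ** 3 + omega_lambda)
    comoving, _ = quad(inv_E, 0.0, z)
    return (1 + z) * (C / H0) * comoving

def distance_modulus(z, omega_m, omega_lambda, grey_dust_mag_per_z=0.0):
    """Apparent minus absolute magnitude, plus a crude grey-dust term."""
    d_l_pc = luminosity_distance(z, omega_m, omega_lambda) * 1.0e6
    return 5.0 * math.log10(d_l_pc / 10.0) + grey_dust_mag_per_z * z

for z in (0.25, 0.5, 1.0, 1.5):
    matter = distance_modulus(z, 1.0, 0.0)
    lcdm = distance_modulus(z, 0.3, 0.7)
    dusty = distance_modulus(z, 1.0, 0.0, grey_dust_mag_per_z=0.4)
    print(f"z={z:4.2f}  matter-only={matter:.2f}  "
          f"dark energy={lcdm:.2f}  grey dust={dusty:.2f}")
```

The grey-dust curve just keeps adding dimming on top of the matter-only curve at every redshift, while the dark-energy curve's extra dimming levels off at high redshift: that's the turn-over the data demand.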

So you can't simply blame it on a trick of the light. In fact, if we look at the most modern supernova data, it significantly favors dark energy over even a flat, low-density Universe.

[Graph: residual distance modulus vs. redshift for the Union 2008 supernova compilation (dDM-vs-z-Union-2008-75.gif)]

Other "light-blocking" schemes, such as photon-axion oscillations, suffer from the same problem; they don't give the right turn-over as shown above. If we've got the right laws of gravity, there's pretty much no way around dark energy.

But we don't like relying on only one source of data. Supernovae are nice, but what happens when we look at all the other evidence? Does that tell us there must be dark energy too, or could it be that the supernova data just cannot be trusted? Seems like a job for part 4, and so I'll see you then!



The dual-tasking meditation master

I recently read an article in the latest Scientific American Mind magazine discussing the cell mechanisms underlying meditative states. The author briefly mentioned that expert meditators are able to avoid the attentional blink that laypeople are prone to experiencing when barraged with rapidly presented visual stimuli.

This brought up a question for me: would expert meditators perform better on dual-tasks than age-matched subjects?

I believe the answer is in the affirmative. My reasoning is that meditation not only strengthens attentional abilities but fosters neural efficiency as well. Dual-tasking is not really about doing two things simultaneously; it is about doing one thing at a time at an extremely fast pace, creating the illusion of doing two things at once. A 2007 study by Farb et al. has shown that meditation activates the anterior cingulate cortex, a region central to switching attention. With the development of my dual-task paradigm underway, I hope to show that daily meditation practice can have beneficial effects when it comes to multi-tasking. If my prediction proves true, this ancient practice, developed thousands of years ago, will not only better our physical and emotional well-being but also help us keep afloat in the fast-paced era of technology.
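To make the serial-switching picture concrete, here's a toy calculation (my own illustration; the step times and switch costs are invented numbers, not measurements from the literature):

```python
# Toy model of dual-tasking as rapid serial switching (illustrative only).
# Each task step takes a fixed time; every switch between the two tasks
# adds a switch cost. A lower switch cost (hypothetically, a trained
# meditator) brings interleaved performance closer to true parallelism.
def interleaved_time(steps_per_task, step_time, switch_cost):
    """Total time to alternate between two tasks, one step at a time."""
    total_steps = 2 * steps_per_task
    switches = total_steps - 1        # a switch before every step but the first
    return total_steps * step_time + switches * switch_cost

for cost in (0.00, 0.05, 0.20):      # switch cost in seconds (assumed)
    t = interleaved_time(steps_per_task=10, step_time=0.5, switch_cost=cost)
    print(f"switch cost {cost:.2f}s -> total {t:.1f}s")
```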

Farb, N., Segal, Z., Mayberg, H., Bean, J., McKeon, D., Fatima, Z., & Anderson, A. (2007). Attending to the present: mindfulness meditation reveals distinct neural modes of self-reference. Social Cognitive and Affective Neuroscience, 2(4), 313-322. DOI: 10.1093/scan/nsm030



Guess the Dow, Win Chow! [The Quantum Pontiff]

Last month a local restaurant group, Chow Foods---among whose restaurants is one of our favorite Sunday breakfast spots, The Five Spot---ran a contest/charity event: "Chow Dow." The game: guess the value of the Dow Jones Industrial Average at the close of the market on October 29th, 2009. The closest guess that did not go over the closing value would be the winner. The prize was the value of the Dow in gift certificates to the Chow restaurants: i.e. approximately $10K in food (or as we would say in Ruddock House at Caltech: "Eerf Doof!" We said that because it fit nicely with another favorite expression, "Eerf Lohocla!", this latter phrase originating in certain now obscure rules enforced by administrative teetotalers.)

I love games like this, and I especially love games where the rules are set up in an odd way. Indeed, what I found amusing about this game was that, as a quick check of the rules on the Chow website showed, you could enter your guesses at any time up until October 28th. Relevant also: a maximum of 21 bets per person, with a suggested donation of $1 per guess. So what would your strategy be to optimize your probability of winning, assuming that you are going to enter 21 times?
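One way to explore the question numerically is a quick Monte Carlo (my own toy sketch, not the strategy below the fold; the closing-value and opponent-guess distributions are pure assumptions):

```python
# Toy Monte Carlo for the "closest without going over" game.
# The closing-price model and the opponent model are assumptions.
import random

CURRENT_DOW = 9900.0     # assumed index level when bets are placed
N_GUESSES = 21

def simulate_close():
    # assume roughly a 2% standard deviation move by the close
    return CURRENT_DOW * (1.0 + random.gauss(0.0, 0.02))

def simulate_opponents(n=200):
    # assume opponents cluster tightly around the current value
    return [CURRENT_DOW * (1.0 + random.gauss(0.0, 0.015)) for _ in range(n)]

def best_under(guesses, close):
    """Best guess at or under the closing value, or None if all went over."""
    under = [g for g in guesses if g <= close]
    return max(under) if under else None

def win_probability(my_guesses, trials=20000):
    wins = 0
    for _ in range(trials):
        close = simulate_close()
        mine = best_under(my_guesses, close)
        best_rival = best_under(simulate_opponents(), close)
        if mine is not None and (best_rival is None or mine > best_rival):
            wins += 1
    return wins / trials

# Strategy A: 21 guesses packed tightly just under the current value.
tight = [CURRENT_DOW - 10.0 * k for k in range(N_GUESSES)]
# Strategy B: 21 guesses spread wide to also cover a big down day.
wide = [CURRENT_DOW * (1.0 - 0.005 * k) for k in range(N_GUESSES)]

print("tight spacing:", win_probability(tight))
print("wide spacing: ", win_probability(wide))
```

Under these assumptions the trade-off is clear: tight spacing wins big when the market barely moves, while wide spacing hedges against a large drop that would put every tightly packed guess over the line.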

Below the fold: my strategy, the amazing power of the X-22 computer, and....chaos!



Digging Deeper Into p66Shc and Enhanced Longevity

Mitochondria, you will recall, are the power plants of our cells, churning out stored energy in the form of ATP molecules, and pollution in the form of damaging free radicals or reactive oxygen species (ROS). Mitochondria have their own DNA, separate from the DNA in the nucleus of our cells, a legacy of their origin as free-roaming bacteria. Free radicals are very reactive, which means they can tear apart the biochemical machinery of cells by reacting with crucial components.

This free radical pollution is at the heart of the mitochondrial free radical theory of aging, which presents a large component of the aging process as essentially a runaway feedback loop: mitochondria damage themselves via their own free radicals, which makes them produce even more free radicals. This in turn leads to cells overtaken by that pollution, which throw free radicals out into the body to cause widespread harm as the years pass.

Thus mitochondria are considered to be important: changes in genes that alter the operation of mitochondria can cause dramatic shifts in life span in mice, differences in mitochondrial biochemistry are correlated with differences in life span between similar species, and mitochondria are involved with cellular programmed death mechanisms...
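The runaway loop described above is easy to caricature in a few lines of code (my own toy model; the rates are arbitrary and illustrative, not from the paper):

```python
# Toy positive-feedback model of mitochondrial free radical damage:
# ROS output grows with accumulated damage, and damage grows with ROS.
# All rates are arbitrary illustrative assumptions.
def simulate(years=80, base_ros=1.0, damage_rate=0.02, feedback=0.05):
    damage = 0.0
    for year in range(1, years + 1):
        ros = base_ros * (1.0 + feedback * damage)  # more damage -> more ROS
        damage += damage_rate * ros                  # more ROS -> more damage
        if year % 20 == 0:
            print(f"year {year:3d}: damage index {damage:.2f}")

simulate()
```

Because each quantity feeds the other, the damage index grows faster than linearly with age, which is the qualitative signature of the theory.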

Tomilov, A., Bicocca, V., Schoenfeld, R., Giorgio, M., Migliaccio, E., Ramsey, J., Hagopian, K., Pelicci, P., & Cortopassi, G. (2009). Decreased superoxide production in macrophages of long-lived p66Shc-knockout mice. Journal of Biological Chemistry. DOI: 10.1074/jbc.M109.017491



Two Articles on Predictions & Hype in Science [Framing Science]

Earlier this year, in an article at Nature Biotechnology, I joined several colleagues in warning that the biggest risk to public trust in science is not the usual culprits of religious fundamentalism or "politicization," but rather the increasing tendency toward the stretching of scientific claims and predictions by scientists, university press offices, scientific journals, industry, and journalists. As I detail with Dietram Scheufele in a separate article at the American Journal of Botany (PDF), each time a scientific prediction or claim goes beyond the available evidence and proves to be false, it serves as a vivid negative heuristic for the public.

This past week, two important articles describing the perils of prediction in the life sciences and climate sciences appeared at The Scientist magazine and Nature Reports Climate Change, respectively. In a cover article for The Scientist, Stuart Blackman identifies several factors driving the tendency toward hype. As Blackman notes, scientists are under increasing pressure to publish in ever more competitive flagship journals, meaning that the conclusions of a paper have to be that much more provocative. Granting agencies are also putting stronger emphasis on the public-impacts portion of funding proposals, again creating an incentive to sometimes promise too much. A third and major factor is the increasing privatization of university-based science, with strong incentives and rewards for commercialization, a route that usually involves a heavy dose of promotion. In his article, Blackman draws on the insights of some of the top social scientists studying these trends, including Brian Wynne, Christine Hauskeller, and Daniel Sarewitz. A useful sidebar summarizes advice on how researchers can avoid hype in communicating with the public, policymakers, and the media.

In a commentary at Nature Reports Climate Change, Mike Hulme, Roger Pielke, Jr and Suraje Dessai warn against promising that climate science can "supply on-demand climate predictions to governments, businesses and individuals," estimating impacts on certain regions and sectors.

"Scientists and decision-makers alike should treat climate models not as truth machines to be relied upon for making adaptation decisions, but instead as one of a range of tools to explore future possibilities," they write. And as they aptly observe, it's not just a matter of technical certainty. Even in cases where forecasts might be accurate in a formal statistical sense, effectively communicating the complexity of these findings to the public and decision-makers will prove a difficult task. Here's the key take away from their commentary:

For scientists, the lesson here is clear. Caution is warranted when promising decision-makers a clarified view of the future. Guaranteeing precision and accuracy over and above what science can credibly deliver risks contributing to flawed decisions. We are not suggesting that scientists abandon efforts to model the behaviour of the climate system. Far from it. Models as exploratory tools can help identify physically implausible outcomes and illuminate the boundaries where uncertain knowledge meets fundamental ignorance. But using models in this way will require a significant rethink on the role of predictive climate science in decision-making. In some cases the prudent course of action will be to let policymakers know the very real limitations of predictive science. For decision-makers, the lesson is to plan for a range of possible alternatives. Instead of seeking certainty, decision-makers need to ask questions of scientists such as 'What physically could not happen?' or 'What is the worst that could happen?'

The authors' warning matters in the U.S. context. Perhaps the most effective way to convey the significance of climate change is to communicate to Americans how it is affecting the region in which they live. Yet as effective as this strategy might be, these communication efforts need to proceed cautiously; otherwise, they risk opening the door to counter-claims that scientists and government agencies are going beyond the available scientific evidence.



23andMe gets scooped on hair curl genes [Genetic Future]

Medland et al. (2009). Common Variants in the Trichohyalin Gene Are Associated with Straight Hair in Europeans. The American Journal of Human Genetics. DOI: 10.1016/j.ajhg.2009.10.009


A couple of weeks ago I reported on a presentation by 23andMe's Nick Eriksson at the American Society of Human Genetics meeting in Honolulu, in which Eriksson described a series of genome-wide association studies performed by the company using genetic and trait data from its customers.
Along with genetic analysis of a variety of other traits (such as asparagus anosmia and photic sneeze), Eriksson presented data on two novel regions significantly associated with hair curl: one close to the TCHH gene and a second near WNT10A (see the abstract for details). I noted at the time that 23andMe appears to be doing a pretty good job of running genome-wide association studies, although of course the real test is independent replication.
Well, now we have replication (of a sort) for at least two of 23andMe's novel findings - but unfortunately for the 23andMe crew the "replication" study has beaten them into print.
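For readers who haven't run one, the core of a genome-wide association study is just a single-marker association test repeated across the genome. Here's a minimal sketch with made-up data (my own illustration; this is not 23andMe's or Medland et al.'s actual pipeline):

```python
# Minimal single-SNP association test: regress a quantitative trait
# (e.g., a hair-curl score) on genotype dosage (0, 1, or 2 copies of
# an allele). A real GWAS repeats this across hundreds of thousands
# of SNPs. All data below are invented for illustration.
from scipy import stats

genotypes = [0, 0, 1, 1, 1, 2, 2, 2, 2, 0, 1, 2]   # allele dosages
curl_score = [1.0, 1.2, 2.1, 1.8, 2.3, 3.0, 2.7,
              3.2, 2.9, 0.9, 2.0, 3.1]              # trait values

slope, intercept, r, p_value, stderr = stats.linregress(genotypes, curl_score)
print(f"effect per allele: {slope:.2f}, p = {p_value:.2e}")
# Genome-wide significance is conventionally p < 5e-8, to survive
# testing so many markers at once.
```

Independent replication, as in the Medland et al. paper, means seeing the same signal at the same locus in a different sample, which is far stronger evidence than a single small p-value.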



Duncan Hull: Free, immediate and permanently available research results for all - that's what the open-access campaigners want. Unsurprisingly, the subscription publishers disagree.

Duncan Hull
Free, immediate and permanently available research results for all - that's what the open-access campaigners want. Unsurprisingly, the subscription publishers disagree. - http://www.timeshighereducation.co.uk/story...
Free, immediate and permanently available research results for all - that's what the open-access campaigners want. Unsurprisingly, the subscription publishers disagree. Zoe Corbyn weighs up the ramifications for journals, while Matthew Reisz asks how books will fare. Stephen Hicks, a reader in health and social care at the University of Salford, has just uploaded nine of his journal articles to his university's online open-access repository of institutional papers, and has another ten in the pipeline. Doing so had not crossed his mind before, and it won't be compulsory until January 2010 (last month, Salford mandated so-called "self-archiving", becoming the 100th organisation worldwide to do so). But he was turned on to the idea after hearing Martin Hall, Salford's vice-chancellor and an open-access advocate, speak. - Duncan Hull
It's not surprising they disagree, but it won't be surprising to find them, a few years from now, saying it was their idea all along... - Mr. Gunn
@Mr Gunn Heh! - Duncan Hull


A call for new technological minds for the genome sequencing instrument fields

There's a great article in the current Nature Biotechnology (alas, you'll need a subscription to read the full text) titled "The challenges of sequencing by synthesis", detailing the challenges around the current crop of sequencing-by-synthesis instruments. The paper was written by a number of the PIs on grants for $1K genome technology.

While there is one short section on the problem of sample preparation, the heart of the paper can be found in the other headings:

- surface chemistry
- fluorescent labels
- the enzyme-substrate system
- optics
- throughput versus accuracy
- read-length and phasing limitations

Each section is tightly written and well-balanced, with no obvious playing of favorites or bashing of anti-favorites. Trade-offs are explored, and the dreaded term (at least amongst scientists) "cost models" shows up; indeed there is more than a little bit of a nod to accounting -- but if sequencing is really going to be $1K per person on an ongoing basis, the beans must be counted correctly!

I won't try to summarize much in detail; it really is hard to distill such a concentrated draught any further. Most of the ideas presented as possible solutions can be viewed as evolutionary relative to the current platforms, though a few exotic concepts are floated as well (such as synthetic aperture optics). It is noteworthy that an explicit goal of the paper is to summarize the problem areas so that new minds can approach them; as implied by the section list above, this is clearly a multi-discipline problem. It does raise the question of whether Nature Biotechnology, a journal I am quite fond of, was the best place for this. If new minds are desired, perhaps Physical Review Letters would have been better. But that's a very minor quibble.
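To give a feel for the "read-length and phasing limitations" heading, here's a toy calculation (my own sketch; the lag and lead rates are invented for illustration, not taken from the paper) of how out-of-phase strands erode the usable signal:

```python
# Toy model of the phasing problem in sequencing-by-synthesis: each
# cycle, a small fraction of strands in a cluster fails to extend (lag)
# or extends twice (lead), so the in-phase signal decays with cycle
# number and limits read length. Rates are illustrative assumptions.
def in_phase_fraction(cycles, p_lag=0.005, p_lead=0.002):
    """Fraction of strands still synchronized after a number of cycles."""
    return (1.0 - p_lag - p_lead) ** cycles

for n in (25, 50, 100, 200, 400):
    f = in_phase_fraction(n)
    print(f"after {n:3d} cycles: {f:.1%} of strands in phase")
```

Even sub-percent per-cycle error rates compound into a badly mixed signal after a few hundred cycles, which is one reason read length is so hard to extend on these platforms.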

Fuller CW, Middendorf LR, Benner SA, Church GM, Harris T, Huang X, Jovanovich SB, Nelson JR, Schloss JA, Schwartz DC, & Vezenov DV (2009). The challenges of sequencing by synthesis. Nature Biotechnology, 27(11), 1013-23. PMID: 19898456

