Saturday, April 30, 2011

 

drugs simvastatin

April 20, 2011 — The 10 most prescribed drugs in the U.S. aren't the drugs on which we spend the most, according to a report from the IMS Institute for Healthcare Informatics.

The institute is the public face of IMS, a pharmaceutical market intelligence firm. Its latest report provides a wealth of data on U.S. prescription drug use.

Continuing a major trend, IMS finds that 78% of the nearly 4 billion U.S. prescriptions written in 2010 were for generic drugs (both unbranded and those still sold under a brand name). In order of number of prescriptions written in 2010, the 10 most-prescribed drugs in the U.S. are:

  • Hydrocodone (combined with acetaminophen) -- 131.2 million prescriptions
  • Generic Zocor (simvastatin), a cholesterol-lowering statin drug -- 94.1 million prescriptions
  • Lisinopril (brand names include Prinivil and Zestril), a blood pressure drug -- 87.4 million prescriptions
  • Generic Synthroid (levothyroxine sodium), synthetic thyroid hormone -- 70.5 million prescriptions
  • Generic Norvasc (amlodipine besylate), an angina/blood pressure drug -- 57.2 million prescriptions
  • Generic Prilosec (omeprazole), an acid-reducing drug (proton pump inhibitor) -- 53.4 million prescriptions (does not include over-the-counter sales)
  • Azithromycin (brand names include Z-Pak and Zithromax), an antibiotic -- 52.6 million prescriptions
  • Amoxicillin (various brand names), an antibiotic -- 52.3 million prescriptions
  • Generic Glucophage (metformin), a diabetes drug -- 48.3 million prescriptions
  • Hydrochlorothiazide (various brand names), a water pill used to lower blood pressure -- 47.8 million prescriptions.

The 10 Best-Selling Drugs

It shouldn't be a surprise that these generic drugs are not the ones bringing in the big bucks for pharmaceutical companies. The drugs on which we spend the most money are those that are still new enough to be protected against generic competition.

IMS reports that Americans spent $307 billion on prescription drugs in 2010. The 10 drugs on which we spent the most were:

  • Lipitor, a cholesterol-lowering statin drug -- $7.2 billion
  • Nexium, an acid-reducing drug (proton pump inhibitor) -- $6.3 billion
  • Plavix, a blood thinner -- $6.1 billion
  • Advair Diskus, an asthma inhaler -- $4.7 billion
  • Abilify, an antipsychotic drug -- $4.6 billion
  • Seroquel, an antipsychotic drug -- $4.4 billion
  • Singulair, an oral asthma drug -- $4.1 billion
  • Crestor, a cholesterol-lowering statin drug -- $3.8 billion
  • Actos, a diabetes drug -- $3.5 billion
  • Epogen, an injectable anemia drug -- $3.3 billion

U.S. Prescription Drug Use: 2010 Factoids

Who's paying for all these drugs? Commercial insurance helped pay for 63% of prescriptions, down from 66% five years ago. Federal government spending through Medicare Part D covered 22% of prescriptions.

For Americans covered by insurance, Medicare, or Medicaid, the average co-payment for a prescription was $10.73 -- down a bit from 2009 due to increased use of generic drugs. The average co-payment for branded drugs for which generic alternatives were available jumped 6% to $22.73.

Other facts from the 2010 IMS report:

  • Doctor visits were down 4.2% since 2009.
  • Patients filled more than half of their prescriptions -- 54% -- at chain drugstores, possibly because of discounts on generic drugs.
  • Brand-name drugs that lost their protection from generic competition accounted for $12.6 billion less spending in 2010 than in 2009.
  • Price increases for drugs facing no generic competition added $16.6 billion to spending in 2010 compared with 2009.
  • Drug companies offered $4.5 billion in rebates to assist patients with the high cost of brand name drugs for which there was no generic alternative.

SOURCE:

IMS Institute for Healthcare Informatics: "The Use of Medicines in the United States: Review of 2010," April 2011.


 

lithium dementia

April 28, 2011 — Low-dose lithium appears to slow the progression of memory loss and cognitive decline in individuals with amnestic mild cognitive impairment (aMCI), a significant risk factor for Alzheimer's disease (AD), a new study suggests.

A randomized, placebo-controlled trial showed treatment with lithium, an old drug historically used to treat bipolar disorder (BD) and major depression, was associated with a significant decrease in cerebrospinal fluid (CSF) concentrations of phosphorylated tau (P-tau) and better cognitive performance in individuals with aMCI. Furthermore, the drug was well tolerated and had a side effect profile similar to placebo.

"This study supports the idea that giving lithium to a person who is at risk for AD may have a protective effect and slow down the progression of memory loss to dementia. Although our study has a relatively small sample size, we believe our results are promising and point to a need for further trials with larger numbers of participants," study investigator Orestes V. Forlenza, MD, PhD, University of Sao Paulo, Brazil, said in a statement.

The study is published in the May issue of the British Journal of Psychiatry.

Growing Body of Evidence

According to the study authors, a large body of experimental research suggests that by inhibiting glycogen synthase kinase 3 beta (GSK3B) lithium may modify AD-specific pathological processes, including hyperphosphorylation of tau, overproduction of the amyloid-β (Aβ-42) peptide, and Aβ neurotoxicity.

"Thus, lithium, via the inhibition of GSK3B, may hamper mechanisms that lead to the formation of amyloid plaques and neurofibrillary tangles and, consequently, having neuroprotective effects against AD," the investigators write.

They also note that several clinical trials suggest the drug may help prevent AD from developing, including 1 study that showed a lower prevalence of AD in BD patients receiving long-term lithium therapy.

To assess the effect of long-term lithium therapy on the progression of cognitive deficits in people at high risk for AD but without the disease, the investigators conducted a study that included 41 subjects with aMCI, all of whom were older than 60 years and had no evidence of ongoing psychiatric disorders.

Subjects were randomized to receive lithium (n = 21) with starting daily doses of 150 mg, titrated to target serum levels of 0.25 to 0.5 mmol/L through weekly visits, controlling for tolerability, or placebo (n = 20). The researchers note that the lithium dose was lower than that commonly used for the treatment of affective disorders.

The study's primary outcome measures were the modification of cognitive or functional status and concentrations of Aβ-42, total tau (T-tau), and P-tau in the CSF.

The Clinical Dementia Rating (CDR) scale, including the sum of boxes (SoB) score, and the cognitive subscale of the Alzheimer's Disease Assessment Scale (ADAS-cog) were used to assess global functional and cognitive state.

Memory, attention, and executive function were evaluated using the Consortium to Establish a Registry for Alzheimer's Disease (CERAD) delayed recall test, Sequence of Letters and Numbers (SLN), and the Trail Making Test.

Secondary outcomes included conversion from aMCI to AD, as well as safety and tolerability analysis.

Protective Effect

The investigators report that 11 participants (24%) of the total sample progressed to AD after 12 months. "As expected, these individuals displayed at baseline the typical 'Alzheimer's disease signature' in the CSF, ie, higher concentrations of T-tau and P-tau and lower concentrations of Aβ-42, as compared with nonconverters."

The number of conversions was higher in the placebo group (7/20) compared with the lithium group (4/21), but this difference was not significant.

After 12 months, the researchers found that all participants experienced a decline in memory and cognitive functioning, as indicated by mean CDR-SoB scores (P < .04). However, they note the decline was significantly smaller in the lithium group than in the placebo group, as indicated by ADAS-cog and SLN test scores.

Lithium treatment was also associated with a significant decrease in CSF concentrations of P-tau. In contrast, participants receiving placebo showed a slight but significant increase in P-tau (P = .02). The researchers found no effect of lithium on T-tau.

Adverse effects occurred in similar numbers in both groups; most were mild and short-lived. In addition, the study authors note that adherence was high, at 91%.

"In conclusion, the present findings reinforce the notion that in an individual at risk for Alzheimer's disease, lithium may have a protective effect on the progression of cognitive impairment to dementia.

"We acknowledge that the relatively small sample size of this single-center study is a limitation to the generalization of the current findings. Therefore, we think that the present results warrant replication in multicentric trials with a larger sample," the study authors write.

Plea for Research

In an accompanying editorial, Professor Allan Young, MB ChB, MPhil, PhD, director of the Centre for Mental Health, Imperial College London, United Kingdom, said the study is encouraging and noted the findings are "suggestive of likely benefit.

"This trial adds to the increasing evidence that lithium may have beneficial effects on the brain and begs to be replicated in further randomized trials," he writes.

Professor Young appealed to governments and charitable organizations to take the lead in funding such trials.

"The pharmaceutical industry is clearly very much focused on developing treatments for dementia...If such treatments were to be given to large numbers of at-risk individuals for prolonged periods of time, the commercial rewards to those owning the patents for such treatments are likely to be very considerable.

"Lithium, of course, is under no patent and will not attract industry funds for further development as a treatment except perhaps as a comparator to commercial compounds or possibly in combination with another agent. The onus is therefore on governmental and charitable funding agencies. Such trials will not be cheap, but, were they to prove positive, the possible benefits in health to our ever aging population would be beyond any such price," writes Professor Young.

The authors and Professor Young have disclosed no relevant financial relationships.

Br J Psychiatry. 2011;198:351-356.


 

brain volume cognition

April 26, 2011 — Many studies have linked brain volume abnormalities to a variety of mental health conditions, including major depressive disorder (MDD), bipolar disorder, and schizophrenia.

But the author of an analysis published online April 4 in Archives of General Psychiatry concludes that the number of statistically significant results in the brain volume literature is "way too large to be true."

[Photo: Dr. John P. A. Ioannidis]

"This pattern suggests strong biases in the literature," writes John P. A. Ioannidis, MD, DSc, of Stanford University's Prevention Research Center in California. "Selective outcome reporting and selective analyses reporting" are 2 possible explanations for the "excess significance bias" uncovered in the literature on brain volume abnormalities.

Reached for comment on the analysis, John D. Port, MD, PhD, associate professor of radiology and assistant professor of psychiatry at Mayo Clinic, Rochester, Minnesota, called Dr. Ioannidis' article and analysis "pretty darn good and not surprising at all." The excess of statistically significant results, Dr. Port said, "is a side effect of statistics and publication bias, in my opinion."

Many Links 'Likely Spurious'

Dr. Ioannidis has published numerous papers on a variety of medical topics that question the methods and interpretations of clinical trials and the ideas they generate. One of his most widely read publications may be the 2005 essay published in PLoS Medicine entitled "Why Most Published Research Findings Are False."

In his latest analysis, he used an "excess significance test" to evaluate whether there are too many reported studies in the brain volume literature that have statistically significant results.

"In a nutshell...there are too many studies done in the field showing too many significant results with brain volume abnormalities [and] many of these significant associations are likely to be spurious," Dr. Ioannidis told Medscape Medical News.

From 8 articles, he evaluated 41 meta-analyses with 461 data sets pertaining to brain volume abnormalities in 7 conditions: major depressive disorder, bipolar disorder, obsessive-compulsive disorder, posttraumatic stress disorder, autism, first-episode schizophrenia, and relatives of patients with schizophrenia.

Of the 41 meta-analyses, roughly half (n = 21) found "statistically significant" associations, and 142 of the 461 data sets (31%) found a "positive" association.

"Even if the effect sizes observed in the meta-analyses are accurate, the number of positive results (n = 142) is almost double than what would have been expected (n = 78) based on power calculations for the included samples," said Dr. Iaonnidis.

And no condition is spared. On the basis of his research, "bias may be present in meta-analyses of all 7 examined conditions and in most of the examined brain structures," Dr. Ioannidis writes.

"The whole field needs transparent design and reporting of its results and careful reappraisal of putative associations," he said.

'Statistical Noise'

Dr. Port, who was not involved in the analysis, said scientists "need to supply the full data set (positive and negative analyses) and publishers need to be willing to publish the full data set so meta-analyses have the full data to look at."

Publication bias toward the positive findings, he added, "creates, in my opinion, what I like to call statistical noise.

"We know if you set a P value of .05, 1 out of every 20 results is going to be abnormally positive, by definition. So if you are only going to publish positive results, you're going to be publishing some by mistake [noise results], and these show up in these meta-analyses as a bias toward excess significance," Dr. Port explained.

To tackle this issue, "we can start by using more rigorous statistical limits, so instead of using a P value of .05 — use .01 — which means a lot less stuff will be significant, which, unfortunately, means a lot less papers will be published," he added.

The Bonferroni correction can also help. "We hate to do Bonferroni correction because you take a whole bunch of positive results and you get rid of them, but it's a very rigorous test, and if something is really positive, it will survive a Bonferroni correction."
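As an illustration of the correction Dr. Port describes, here is a minimal sketch; the P values are made up, and the helper function is ours, not from any particular statistics package:

    def bonferroni_survivors(p_values, alpha=0.05):
        """Keep only results whose P value clears alpha divided by the
        number of comparisons -- the Bonferroni-corrected threshold."""
        threshold = alpha / len(p_values)
        return [p for p in p_values if p < threshold]

    # Hypothetical: 20 comparisons, two nominally significant at P < .05.
    p_values = [0.04, 0.002] + [0.30] * 18
    print(bonferroni_survivors(p_values))  # [0.002] -- only the strongest survives

With 20 comparisons the corrected threshold is .05/20 = .0025, so the nominally significant P = .04 no longer counts.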

The analysis had no funding. Dr. Ioannidis and Dr. Port have disclosed no relevant financial relationships.

Arch Gen Psychiatry. Published online April 4, 2011.


 

dementia

Abstract and Introduction

Abstract

Dementias and related cognitive disorders of the brain are strongly age-associated and prevalence is expected to rise dramatically with a rapidly aging population. As a result, there has been increasing attention on the prevention and treatment of cognitive decline associated with these conditions. A number of approaches have been designed to maintain and strengthen the cognitive capacity of the healthy, as well as the pathologically damaged brain. Evidence suggests that despite advancing age, our brains, and thus our cognitive functions, retain the ability to be maintained and strengthened through the biological process of neuroplasticity. With this opportunity, a new commercial field of 'brain fitness' has been launched to bring to the market training exercises and games that maintain and strengthen cognitive abilities in adulthood. However, the majority of brain fitness methods and products now marketed and sold to consumers have scant scientific evidence to support their effectiveness.



Friday, April 29, 2011

 

statins

Abstract and Introduction

Abstract

As the primary target of therapy in the management of dyslipidemia, LDL-C has been a central focus for practicing clinicians for more than a decade. National Cholesterol Education Program guidelines encourage physicians to lower LDL-C levels to outlined therapeutic targets on the basis of ongoing randomized controlled trials demonstrating significant benefit in cardiovascular outcomes among primary and secondary prevention individuals. Relevant epidemiological analysis of cardiovascular outcomes in the USA reports that although statin therapy provides a relative risk reduction of 30%, many coronary heart disease patients at the LDL-C target level are still having major events, of which more than half are recurrent. Although statins – the mainstay of therapy – are able to decrease LDL-C by a range of approximately 30–50% depending on the potency and dose of the statin administered, they remain underused in the clinical setting by practicing physicians. There also remains controversy as to whether more intensive lowering of LDL-C provides additional cardiovascular benefit or not. Intensive lowering of LDL-C as it pertains to the incidence of cardiovascular outcomes (including myocardial infarction, coronary revascularization and ischemic stroke) is assessed in this meta-analysis of 170,000 individuals from 26 large, randomized controlled trials. The implications of the Cholesterol Treatment Trialists' Collaboration for practicing physicians are discussed here.



 

statins

Results: Among users and non-users of statins with comparable propensity scores, 95/942 users and 686/3615 non-users died on the day that pneumonia was diagnosed. In the following six-month period, 109/847 statin users died compared with 578/2927 non-users, giving an adjusted hazard ratio of 0.67 (95% CI 0.49 to 0.91). If these observed benefits translated into clinical practice, 15 patients would need to be treated with a statin for six months after pneumonia to prevent one death.

Conclusions: Compared with people who were not taking statins, the risk of dying in the six-month period after pneumonia was substantially lower among people who were already established on long-term statin treatment when the pneumonia occurred. Whether some or all of this protective effect would be obtained if statin treatment begins when a patient first develops pneumonia is not known. However, given that statins are cheap, safe, and well tolerated, a clinical trial in which people with pneumonia are randomised to a short period of statin treatment is warranted.
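The number needed to treat quoted in the results follows directly from the absolute difference between the two six-month death rates. A minimal sketch of that arithmetic, using the crude counts from the abstract (the published figure of 15 reflects the adjusted analysis, but the crude rates land in the same place):

    # Six-month deaths reported in the abstract
    deaths_statin, n_statin = 109, 847
    deaths_control, n_control = 578, 2927

    risk_statin = deaths_statin / n_statin       # ~0.129
    risk_control = deaths_control / n_control    # ~0.197

    arr = risk_control - risk_statin             # absolute risk reduction, ~0.069
    nnt = 1 / arr                                # ~14.5, rounded up to 15

    print(f"ARR = {arr:.3f}, NNT = {nnt:.1f} -> treat ~15 patients to prevent 1 death")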


 

statins

SUMMARY AND COMMENT

Statin Use and Risk for Death After Pneumonia

April 28, 2011 | Paul S. Mueller, MD, MPH, FACP

Statin users were significantly less likely to die in the 6 months after pneumonia diagnosis.

Reviewing: Douglas I et al. BMJ 2011 Apr 6; 342:d1642


Thursday, April 28, 2011

 

statins dementia

Abstract

Alzheimer's disease (AD), the most common cause of dementia, is a progressive neurodegenerative disorder affecting millions of people worldwide. AD has a multifactorial origin, resulting from an interaction between genetic susceptibility and environmental risk factors. Genetic, epidemiological, experimental and clinical data strongly suggest that the metabolism of cholesterol has an important role in AD pathogenesis. Several studies have demonstrated that high concentrations of serum cholesterol increase the risk of AD. Statins, drugs that reduce cholesterol levels, have been investigated as a possible treatment for AD. However, the literature is not free of contradictory results. In this article, we review a recent article by Reitz et al. demonstrating that higher levels of high-density lipoprotein cholesterol, total cholesterol and non-high-density lipoprotein cholesterol are associated with lower risk of AD. In addition, we discuss the current state of knowledge regarding the relationship between plasma cholesterol and AD, stressing the need for understanding the molecular mechanisms behind this association.

Introduction

Alzheimer's disease (AD) is a progressive neurodegenerative disorder that constitutes the main cause of dementia in the elderly. Two well-established risk factors for AD are age and the presence of the ε4 isoform of apolipoprotein E (apoE). In the past decade, several risk factors have been associated with AD, including hypertension, diabetes mellitus, obesity, metabolic syndrome and hypercholesterolemia.[1] In addition, cholesterol metabolism has been previously implicated in AD pathogenesis.[2–4] However, epidemiological studies have reported contradictory results concerning the role of cholesterol as a risk factor for AD. Overall, a positive association with AD was found when cholesterol levels were measured in middle-aged individuals,[5] with the exception of the Honolulu-Asia Aging Study, in which a cluster of cardiovascular factors increased the risk of vascular dementia (VaD) but not of AD.[6]

Epidemiological studies demonstrated that apoE4 carriers have a higher risk for both AD and an early onset of the disease.[7] Genetic studies have identified several cholesterol-related genes that are associated with AD.[8] It has been suggested that both genetic and environmental risk factors could act in synergy to confer the risk for AD.[9]

Based on evidence supporting hypercholesterolemia as a risk factor for AD, cholesterol synthesis inhibitors, such as statins, were proposed as valuable tools for therapeutic intervention. Although most retrospective studies of statins demonstrated a positive effect, others were unable to find an association between the use of these drugs and cognitive decline, dementia, or AD risk. It has been proposed that differences in the capacity of statins to cross the blood-brain barrier (BBB), their different pleiotropic effects, or the need to start treatment in the very early phases of the disease may explain these conflicting results.[10–12]

Vascular dysfunction in the brain has long been recognized as a contributing factor to the development of VaD. More recently, accumulating evidence suggests that the development of AD may also be strongly related to underlying vascular problems.[13] It is therefore reasonable to infer that a disturbance in plasma cholesterol levels or metabolism could contribute to vascular dysfunction, and thus to the development of VaD.

The molecular mechanisms linking cholesterol with AD pathology are still unknown. Amyloid-β (Aβ) accumulation in the brain is thought to play a key role in the neurodegenerative process and results from both brain overproduction and aberrant clearance of these peptides across the BBB.[14] In vitro studies have demonstrated that intracellular cholesterol levels can modulate the processing of amyloid precursor protein (APP) to Aβ, and that the cleavage of APP by β- and γ-secretases occurs in cholesterol-rich microdomains within the neuronal membranes known as 'lipid rafts'. Cholesterol metabolites have also been shown to modulate APP processing[15] and Aβ production.

Since the brain is separated from plasma cholesterol by the BBB, it is intriguing how high serum cholesterol levels influence the prevalence of AD. Oxysterols are capable of passing through the BBB, and influx of 27-hydroxycholesterol from the blood into the brain has been proposed as a possible mechanism.[16] Recent data demonstrate that high cholesterol and 27-hydroxycholesterol levels affect memory consolidation in experimental studies[17,18] and are intimately linked to the overactivation of the renin–angiotensin system observed in the brains of AD patients.[19]

Although midlife studies consistently associate high plasma cholesterol levels with dementia, this association is less clear in older people. Reitz et al. reported that high levels of high-density lipoprotein cholesterol (HDL-C) in elderly individuals are associated with a decreased risk of late-onset AD.[20] This result, which may seem controversial, has also been reported in other previous studies.[21]

In this article, we discuss possible explanations for these apparently controversial findings, and emphasize the need to understand how blood lipid levels participate in the pathogenetic mechanisms leading to different dementia-causing disorders.



 



Sunday, April 24, 2011

 

dabigatran AF

April 13, 2011 (Silver Spring, Maryland) — Few saw it coming: the FDA's surprise decision not to approve the 110-mg dose of dabigatran (Pradaxa, Boehringer Ingelheim) when it approved the 150-mg dose last fall. At the time, an FDA spokesperson told heartwire: "It could be argued that it wouldn't even be ethical to use the lower dose. . . . The data in favor of a 110-mg dose were suggestive but not entirely convincing."

Now, in a perspective published online April 13, 2011 in the New England Journal of Medicine, three FDA officers offer a full explanation and some numbers for that decision [1].

"There were certainly reasons why we might have approved both doses," Drs B Nhi Beasley, Ellis F Unger, and Robert Temple (FDA Center for Drug Evaluation and Research) concede. "Ultimately, the FDA's decision to approve only the 150-mg strength was based on our inability to identify any subgroup in which use of the lower dose would not represent a substantial disadvantage."

Speaking with heartwire, RE-LY principal investigator Dr Stuart J Connolly (McMaster University, Hamilton, ON) disagreed, calling the FDA perspective an "apology" for a bad decision.

"I just think that they're wrong; they should have approved the lower dose. I don't think what they're saying is incorrect--they couldn't find a subgroup in the RE-LY study--but I do think that there are patients where the 110-mg dose makes perfect sense,"

An After-the-Fact Explanation

As previously reported by heartwire , the pivotal RE-LY trial showed that both the 150-mg and the 110-mg doses were noninferior to warfarin. The higher dose was significantly better than both the lower dose and warfarin for the primary end point of stroke or systemic embolism but caused more bleeding than the lower dose (at a rate similar to warfarin). By contrast, the lower dose was superior to warfarin for major bleeding.

As such, write Beasley et al, both doses met the "evidentiary standards for safety and efficacy," and, they acknowledge, patients and doctors "value choices that allow treatment to be individualized."

Indeed, these two issues were hashed out at length during the FDA advisory committee discussions of dabigatran's approvability.

Would Benefits Outweigh Risks in High-Risk Patients?

But the question the FDA zeroed in on was whether the 110-mg dose provided a meaningful option for the kinds of patients who, in theory, could benefit from less bleeding: namely, patients at high risk of bleeding, elderly patients, or patients with impaired renal function.

The high-risk-for-bleeding group was the simplest: according to the FDA's analysis, 57% of patients in RE-LY who had a major bleed during the study resumed taking their study drug or never stopped taking it in the first place. And rates of additional bleeds were no different among the three groups. While "exploratory," these numbers "do not support the strategy of transitioning patients to the lower dose," Beasley et al conclude.

Older patients made up 40% of the RE-LY population: in this group, the rate of stroke was slightly lower in the higher-dabigatran-dose group than in the 110-mg group, but the rate of bleeding was higher.

Major Events in Patients >75 Years

End point            110-mg dose (rate/100 patient-years)   150-mg dose (rate/100 patient-years)
Stroke/embolism      1.9                                     1.4
Major hemorrhage     4.4                                     5.1

"If stroke or systemic embolism and major hemorrhage were considered equally undesirable, these rates would indicate similar benefit/risk assessments for the two doses," they write. But since most people "would agree" that stroke/systemic embolism is a worse outcome than bleeding, the risk/benefit balance falls in favor of the higher dose in this group, they conclude.

Impaired-Renal-Function Group

The impaired-renal-function group was identified as the most likely to derive benefit from a lower dose, because dabigatran is cleared primarily by the kidney. According to the analysis of RE-LY patients with moderate renal impairment (creatinine clearance >30 to 50 mL/min) cited by Beasley et al, the rate of stroke in the higher-dose group was actually half that of the lower-dose group, while bleeding rates were no different.

Major Events in Patients With Moderate Renal Impairment

End point            110-mg dose (rate/100 patient-years)   150-mg dose (rate/100 patient-years)
Stroke/embolism      2.4                                     1.3
Major hemorrhage     5.7                                     5.3

Where the FDA made an exception, however, was for patients with severe renal impairment--a group excluded from RE-LY. It was for these patients that the agency opted to approve the surprise, untested, 75-mg dose. This decision, they explain, "was based not on efficacy and safety data, but on pharmacokinetic and pharmacodynamic modeling."

But that 75-mg dose is only for this renal subgroup. "What's needed is [a reduced dose for] those patients in whom the bleeding risk is considered to be sufficiently high that they wouldn't want to use the 150-mg dose," Connolly said. "Without the 110-mg dose available, there are, I think, a lot of patients who now will not receive dabigatran at all--these are patients for whom warfarin was not being used because the bleeding risk was considered to be high, or patients who have some bleeding on the dabigatran 150-mg dose for whom there is no alternative available."

Connolly acknowledged the concern raised by the FDA authors--that if physicians had access to two doses, they might try to "play it safe" by giving the lower dose more than absolutely necessary.

"What's clear is that they don't trust physicians to make rational choices about the use of the two doses," he said. "The point that they're missing is that physicians have a lot more information at their disposal when they're encountering a patient than what is in the RE-LY database, including an appreciation of the patient's own values, a lot more information about the patient's bleeding risk, and other information about the patient's history that just isn't in the RE-LY database."

Of note, Health Canada approved both the 110-mg dose and the 150-mg dose for the prevention of stroke and systemic embolism in AF patients last October. A 75-mg dose (as well as a 110-mg dose) is already on the market in the European Union, where dabigatran was approved in 2008 for the prevention of venous thromboembolism in patients undergoing hip or knee replacement. On Thursday, a spokesperson for Boehringer Ingelheim told heartwire that the company "continues to believe that there is a place for the 110-mg dose in reducing the risk of stroke and systemic embolism in patients with nonvalvular atrial fibrillation" and that it "will continue to discuss the 110-mg dose with the FDA as we remain committed to providing clinicians with appropriate options for the treatment of patients."

Last week, a RE-LY substudy presented at the American College of Cardiology 2011 Scientific Sessions suggested that the 110-mg dose was less effective than the higher dose in patients with permanent atrial fibrillation, although investigators cautioned against overinterpreting this finding.


Tuesday, April 19, 2011

 

blood pressure ACE inhibitors



Antihypertensive treatment adherence varies with drug class


13 April 2011

MedWire News: Meta-analysis findings show that patient adherence to antihypertensive medications differs with drug class.

“There was a remarkable degree of consistency in the pattern of our results showing superior adherence to angiotensin II receptor blockers (ARBs) and ACE inhibitors, and inferior adherence to diuretics and beta blockers,” comment the authors.

They highlight, however, that drug adherence was suboptimal for all drug classes, and therefore advocate that “it is important for clinicians to pay attention to adherence regardless of antihypertensive drug class.”

The 17-study meta-analysis involved 935,920 patients with a mean age of 61.7 years, who were prescribed ARBs, ACE inhibitors, beta blockers, calcium channel blockers (CCBs), and/or diuretics to treat pre-existing hypertension.

Writing in the journal Circulation, Ian Kronish (Mount Sinai School of Medicine, New York, USA) and team report that the mean adherence to prescribed antihypertensives varied from 28% for beta blockers to 65% for ARBs.

Patient adherence was greatest for ARBs, with rates 33% and 57% higher for this class than for ACE inhibitors and CCBs, respectively.

Adherence to ARBs was around twice that for diuretics and beta blockers, and, “overall, ACE inhibitors appeared to have the second-best level of adherence, followed by CCBs,” the authors note.

They add, however, that “insufficient data were available for definitive rankings.”

In a related commentary, Niteesh Choudhry (Brigham and Women’s Hospital, Boston, Massachusetts, USA) hypothesized that unexpected side effects, complicated dosing regimens, and poor clinician follow-up may underlie the poor rates of adherence observed in the current study.

He encouraged clinicians to take measures to overcome these obstacles, such as routinely asking patients about therapeutic adherence and replacing complex dosing schedules with simpler ones where possible.

MedWire (www.medwire-news.md) is an independent clinical news service provided by Springer Healthcare Limited. © Springer
Circulation 2011; 123: 1584–1586, 1611–1621




Monday, April 18, 2011

 

dabigatran AF


Dabigatran 150 mg more effective than warfarin for stroke prevention in AF


14 April 2011

MedWire News: Findings from two subanalyses of the RE-LY study indicate that dabigatran 150 mg is more effective than warfarin at reducing the risk for stroke in patients with atrial fibrillation (AF).

Both analyses, presented at the American College of Cardiology Annual Scientific Sessions in New Orleans, Louisiana, USA, confirmed results from the main RE-LY (Randomized Evaluation of Long-Term Anticoagulation Therapy) trial, which showed that at a dose of 110 mg dabigatran has similar efficacy to warfarin, but has higher efficacy at 150 mg.

In the first analysis, Jonas Oldgren (Uppsala Clinical Research Center, Sweden) and team looked specifically at the effect of the CHA2DS2-VASc stroke risk score on the outcomes of 18,113 AF patients receiving dabigatran 110 mg (D110), dabigatran 150 mg (D150), or warfarin.

The CHA2DS2-VASc score predicts stroke risk in AF patients by assigning a score of 1 point for the presence of heart failure, hypertension, age 65–74 years, diabetes, female gender, or vascular disease, and a score of 2 points for a history of stroke or transient ischemic attack (TIA) and an age of 75 years or older.
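As a concrete rendering of that scoring rule, here is a minimal Python sketch; the function name and parameters are ours, for illustration only:

    def cha2ds2_vasc(age, female, heart_failure, hypertension,
                     diabetes, vascular_disease, prior_stroke_or_tia):
        """Score the CHA2DS2-VASc rule described above: 1 point each for
        heart failure, hypertension, diabetes, female sex, vascular disease,
        and age 65-74; 2 points each for age >= 75 and prior stroke/TIA."""
        score = sum([heart_failure, hypertension, diabetes,
                     female, vascular_disease])
        score += 2 if prior_stroke_or_tia else 0
        if age >= 75:
            score += 2
        elif age >= 65:
            score += 1
        return score

    # Example: a 78-year-old woman with hypertension scores 2 + 1 + 1 = 4.
    print(cha2ds2_vasc(age=78, female=True, heart_failure=False,
                       hypertension=True, diabetes=False,
                       vascular_disease=False, prior_stroke_or_tia=False))  # 4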

The researchers found that both doses of dabigatran were superior to warfarin at reducing stroke risk in patients with CHA2DS2-VASc scores of 2 or less (n=4042). However, among patients with scores of 3 or more, only D150 was superior, while D110 was non-inferior to warfarin at preventing stroke.

The second analysis involved 18,107 patients with permanent (n=6375), paroxysmal (n=5943), or persistent (n=5789) AF, and was designed to determine the efficacy of dabigatran in patients with different types of AF.

Led by Greg Flaker (McMaster University, Hamilton, Ontario, Canada), the team found that, irrespective of AF type, D150 was more effective than warfarin at preventing stroke, and D110 was equally as effective as warfarin.

Specifically, compared with warfarin treatment, the risk for stroke among patients taking D150 was significantly reduced by 30%, 39%, and 36% for persistent, paroxysmal, and permanent AF, respectively.

But among D110 patients there was no significant difference in stroke risk compared with warfarin treatment.

In addition, no significant difference was observed in the risk for major bleeding episodes, which occurred at an approximate rate of 3% among all patients, irrespective of AF type and treatment.

Flaker and team therefore conclude that dabigatran is as safe as and more effective than standard warfarin treatment at preventing stroke in AF patients.

MedWire (www.medwire-news.md) is an independent clinical news service provided by Springer Healthcare Limited. © Springer Healthcare Ltd; 2011

American College of Cardiology 60th Annual Scientific Sessions; New Orleans, Louisiana, USA: 2–5 April 2011




Tuesday, April 12, 2011

 

AF ablation


Cryoballoon ablation more effective for paroxysmal than persistent AF


12 April 2011

MedWire News: A single cryoballoon ablation procedure is more effective at preventing 1-year recurrence of atrial fibrillation (AF) in patients with paroxysmal rather than persistent forms of the arrhythmia, show findings of a systematic review.

Researchers led by Marc Dubuc, from Université de Montréal in Canada, reviewed 23 studies involving cryoballoon ablation in 1221 patients with paroxysmal and 87 patients with persistent AF.

As reported in the journal Heart Rhythm, the patients had a mean age of 57.5 years, and a mean AF duration of 4.7 years.

Irrespective of AF type, more than 90% of patients had acute procedural success, defined as restoration of sinus rhythm immediately after ablation, and approximately 95% of all pulmonary veins targeted during ablation were successfully isolated.

Discounting the 3-month blanking period occurring immediately after ablation, 1-year freedom from AF recurrence occurred at a rate of 73% among patients with paroxysmal AF.

In patients with persistent AF, however, this rate of freedom from AF recurrence was only 45%.

Dubuc and team report that cryoballoon ablation produced a low rate of adverse events, with the commonest complication, transient phrenic nerve palsy, occurring at a rate of 6.3%. Other typical ablation-associated complications, namely left atrial-esophageal fistulae and pulmonary vein stenosis, occurred at very low rates of less than 1.0%.

Stroke or transient ischemic attack also occurred at a low rate of 0.3%.

“Further studies, including direct comparison to conventional radiofrequency ablation, are ongoing and will provide important insight into long-term efficacy and safety,” conclude Dubuc and team.

MedWire (www.medwire-news.md) is an independent clinical news service provided by Springer Healthcare Limited. © Springer Healthcare Ltd; 2011

Heart Rhythm 2011; Advance online publication

Friday, April 08, 2011

 

AF

1. What is atrial fibrillation?

It is a heart rhythm disorder in which the atria of the heart contract rapidly and in an uncontrolled way. The heart rate can rise to 150 to 200 beats per minute, twice the normal rate.


The heart consists of four cavities: two atria and two ventricles (chambers). At the top of the heart sits the sinus node, the heart's electrical power plant. It sends electrical impulses through the heart that make first the atria and, a fraction of a second later, the ventricles contract. This pumps the blood through the heart and around the rest of the body.

In atrial fibrillation there is not one electrical impulse but dozens, traveling independently of one another through the atria. Instead of contracting together in a neat rhythm, the atria merely quiver. As a result of this chaos, the ventricles become disordered as well.

2. What do you notice?

Many patients, though not all, suffer from severe palpitations and a restless, racing feeling. Dizziness, fatigue, and shortness of breath also occur. This can be frightening, especially because an attack of atrial fibrillation often comes unexpectedly. That fear is sometimes what patients suffer from most: they no longer dare to exercise or go on vacation, for fear that something will go wrong with their heart and help will not arrive in time.

3. Are the symptoms continuous?

It usually starts in episodes: the symptoms come on and disappear by themselves. Sometimes such an attack lasts a few minutes, sometimes an hour or a day. Over time the attacks recur more often and/or last longer. Eventually the atrial fibrillation can become chronic, leaving the heart rhythm permanently disturbed and accelerated.

4. Who is most affected?

Three quarters of patients are over 65. Among people in their forties, 1 in 100 is affected. At age 65 it is 1 in 20, and at age 80 as many as 1 in 10. In total, some 300,000 Dutch people suffer from atrial fibrillation.

5. Is it dangerous?

No and yes. However fast and irregular the heartbeat, the ventricles keep doing their work; in that sense it is not dangerous. But because blood flows less smoothly through the heart during atrial fibrillation, small clots can form. These can travel to the brain and cause a stroke. Clots can also cause problems elsewhere, for example in the intestines or the kidneys. To prevent this, many patients take anticoagulants, so-called coumarins or vitamin K antagonists. These lower the risk of stroke by 60 to 80 percent.

The difficulty is that blood levels of these drugs can fluctuate considerably, for example after eating foods rich in vitamin K, such as green vegetables and cheese. Because the medication dose must be adjusted accordingly, patients have to have their blood values checked every two to three weeks at the thrombosis service. The risk of clots differs from patient to patient, however; not everyone needs anticoagulants.

6. When is it wise to see a doctor?

If your heart has raced out of control more than twice, or if a first attack lasts longer than two days. That applies even if the symptoms have resolved on their own. Because of the increased risk of stroke, it is important that atrial fibrillation is detected early; medication can then be given to prevent strokes. Moreover, with the right treatment, quality of life often improves considerably. Older people tend to assume the symptoms simply come with age and that seeing a doctor is pointless, but they too can often be helped effectively.

7. How does it arise?

The most common causes are underlying heart problems, atherosclerosis, high blood pressure, and an overactive thyroid. Often it is also a matter of wear and tear on the heart. Dedicated athletes are at greater risk of atrial fibrillation because of the strain on their heart. Serious overweight, smoking, and heavy alcohol use also increase the risk.

8. Can coffee or alcohol give you atrial fibrillation?

They are almost never the sole cause, but alcohol and coffee can trigger an attack of atrial fibrillation or worsen symptoms. The same goes for a substance in Chinese food: monosodium glutamate (MSG).

9. Is it hereditary?

There are rare hereditary variants of atrial fibrillation, which usually manifest at a young age. Atrial fibrillation after age 60 is usually a consequence of aging or of other physical problems.

10. How do you find out whether you have atrial fibrillation?

With an electrocardiogram (ECG), which must be recorded while the atrial fibrillation is occurring. If the symptoms occur only in episodes, the patient is often sent home with a portable recorder for 24 or 48 hours that measures the heart rate continuously. This is called Holter monitoring.

11. What can be done about it?

The first approach is to try to prevent the atrial fibrillation with a combination of drugs. There are agents that lower the heart rate, such as beta blockers and digoxin, and agents that restore a regular heart rhythm (antiarrhythmics), such as Tambocor (flecainide) and amiodarone. Amiodarone in particular can have unpleasant side effects, such as sun allergy and thyroid problems.

Because atrial fibrillation changes the structure of the heart tissue, drugs often stop working after a while. A procedure is then possible using a thin, flexible tube, a catheter, inserted through the groin; this is called ablation. Through the catheter, the patches of tissue in the heart wall that cause the rhythm problems are disabled by making small scars in them. The symptoms almost always disappear completely.

Ablation is a complicated technique, which is why only fourteen hospitals in the Netherlands are allowed to perform the procedure. People who become very anxious about their atrial fibrillation can turn to a medical psychologist for help.

12. If you have atrial fibrillation, will you have it for the rest of your life?

Not if the underlying cause is removed. Suppose the atrial fibrillation is caused by an overactive thyroid: if that is corrected, the atrial fibrillation usually disappears too. It appears that ablation can cure the atrial fibrillation itself, but because the technique is relatively new, it is not yet clear whether the symptoms can return in the long run.

13. What are the latest developments?

Researchers are working hard on new antiarrhythmics with fewer side effects, such as dronedarone and vernakalant. A new class of anticoagulants is also in the making. These further reduce the risk of stroke and are less likely to cause bleeding. The most important news is that patients using the new anticoagulants no longer have to have their blood checked at the thrombosis service.

The clotting value stays in order by itself and can no longer be thrown off by food. Dabigatran is expected to be the first of the new anticoagulants to reach the market (next year). Ablation is also advancing: doctors can work with ever greater precision, and a wider range of patients can be helped with it.

14. What can and can't you do if you have atrial fibrillation?

Work, sports, sex: in principle everything is allowed, as long as you feel comfortable doing it. Anyone taking anticoagulants should avoid certain activities, such as contact sports, because of an increased risk of bleeding.

15. How can you prevent atrial fibrillation or reduce the symptoms yourself?

Don't smoke, don't drink too much, exercise regularly, and don't become overweight.


Already possible today:

checking your own blood values

Meters are on the market that let someone on anticoagulants check their own blood values at home. In a training course at the thrombosis service, patients learn how to adjust their medication dose themselves when needed. Someone who monitors the clotting value of their blood at home needs to visit the thrombosis service for a check only four times a year, instead of every two or three weeks.

With the cooperation of Lukas Dekker, cardiologist at the Catharina Hospital in Eindhoven, specialized in heart rhythm disorders.

By: Marte van Santen
Source: Plus Magazine, 01-11-2010

 

estrogen Odette

Lengthened WHI Follow-Up: Postmenopausal Estrogen Therapy

Women's Health Initiative follow-up of postintervention phase showed estrogen had no substantial adverse effects on most health outcomes; reduction in relative risk for breast cancer persisted.

In the Women's Health Initiative (WHI) Estrogen-Alone Trial, 11,000 postmenopausal women with hysterectomies (age range at baseline, 50–79) received conjugated equine estrogen or placebo for a median of 5.9 years; the trial was stopped at mean follow-up of 7.1 years when initial findings showed excess risk for stroke in the estrogen group. Subsequent analysis showed that, in younger WHI participants (age range at baseline, 50–59), estrogen use was associated with lower risk for coronary heart disease (CHD) and overall mortality during the intervention period. Now, investigators have assessed postintervention health outcomes in 7645 WHI participants.

Overall, at a mean 10.7 years after baseline, estrogen use was not associated with excess risk for stroke or other adverse outcomes (CHD, deep venous thrombosis, hip fracture, colorectal cancer, and mortality during follow-up). As had been noted in previous WHI analyses of the intervention phase, risk for invasive breast cancer was lower in the estrogen group than in the placebo group (hazard ratio, 0.77; 95% confidence interval, 0.62–0.95); moreover, among younger women, estrogen use was associated with lower risk for CHD (HR, 0.59; 95% CI, 0.38–0.90) and marginally lower risk for mortality during follow-up (HR, 0.73; 95% CI, 0.53–1.00).

Comment: Although these new findings from the WHI are intriguing, they do not imply that estrogen-only therapy should be recommended at this time for cardioprotection or breast cancer chemoprophylaxis. What these results do provide is reassurance to young menopausal women who are posthysterectomy and who present with bothersome vasomotor symptoms that use of estrogen therapy for as long as 6 years is safe.

Andrew M. Kaunitz, MD

Published in Journal Watch Women's Health April 5, 2011


 

AF

SUMMARY AND COMMENT

Long-Term Statin Therapy Does Not Lower Risk for Atrial Fibrillation

April 7, 2011 | Paul S. Mueller, MD, MPH, FACP

Several factors might explain discrepant results between short-term and long-term trials.

Reviewing: Rahimi K et al. BMJ 2011 Mar 16; 342:d1250


Thursday, April 07, 2011

 

Is the Radial Approach for Everyone?
There are a few prerequisites for patients to be a candidate for the transradial approach. The first is confirmation of a dual, or "protected", blood supply to the hand. The radial artery loops around the hand and joins the ulnar artery. Both arteries supply blood to the hand and fingers. It is precisely this dual blood supply that makes the radial technique safe. Should the radial artery close up (a complication seen in a small percentage of cases) the clinical result tends to be benign, because the ulnar artery continues to function.

The first step a cardiologist takes in deciding on the radial approach is an Allen test to confirm that both the radial and ulnar arteries are functioning normally -- a simple test that can be done by compressing the arteries by hand at bedside or in the doctor's office. If they are not normal, then the femoral approach is preferred. Some other contraindications exist, such as the need to use larger devices during the angioplasty, pre-existing bypass grafts in certain areas, or tortuous vessels that may prevent the catheter from navigating to the coronaries from the arm. About 30-40% of patients are not candidates for radial access.

[Diagram: radial and ulnar arteries]

While the complication rate with the radial approach is extremely low, there is always some risk with any medical procedure. It is important for patients to discuss the risks and benefits of the femoral vs. radial approaches, as these can vary for each individual.


Wednesday, April 06, 2011

 

AF Rocket AF trial

Another Contender in the Race to Unseat Warfarin

Mark S. Link, MD

Posted: 03/25/2011; Journal Watch © 2011 Massachusetts Medical Society









Abstract and Introduction

Abstract

Compared with aspirin, apixaban reduced the risk for embolic events in patients with atrial fibrillation.

Introduction

A fierce competition is under way to develop a replacement for warfarin in the treatment of atrial fibrillation (AF). Dabigatran, a direct thrombin inhibitor, is approved for use in the U.S., and rivaroxaban, a factor Xa inhibitor, was noninferior to warfarin in the preliminary results of the ROCKET AF trial. Now, apixaban, another factor Xa inhibitor, has been compared with aspirin in an industry-sponsored trial.

The AVERROES investigators randomized 5599 patients with AF and at least one additional risk factor for stroke to apixaban (5 mg twice daily) or aspirin (81–324 mg daily). All patients were considered unsuitable for warfarin treatment, 40% because of prior problems with the drug.

The study was terminated early because of demonstrated superiority of apixaban; mean follow-up was 1.1 years. The rate of the primary outcome — stroke or systemic embolism — was 1.6% per year in the apixaban group versus 3.7% per year in the aspirin group (hazard ratio, 0.45; P<0.001). Major bleeding rates were similar in the two groups (apixaban, 1.4% per year; aspirin, 1.2% per year). The rate of the composite of stroke, systemic embolism, myocardial infarction, death from vascular causes, and major bleeding was 5.3% per year in the apixaban group versus 7.2% per year in the aspirin group (HR, 0.74; P=0.003).


 

AF dabigatran

From Future Neurology

Dabigatran for Stroke Prevention in Patients with Atrial Fibrillation and Previous Stroke or Transient Ischemic Attack








Abstract and Introduction

Abstract

This study aimed to assess the efficacy and safety of dabigatran at two doses (110 and 150 mg) compared with warfarin in a prespecified subgroup analysis of patients with previous stroke or transient ischemic attack in the Randomized Evaluation of Long-Term Anticoagulation Therapy (RE-LY) trial. There were nonsignificant risk reductions for the primary outcome (stroke and systemic embolism) for both doses of dabigatran compared with warfarin in this subgroup of patients. However, the 110-mg dose of dabigatran provided significantly greater reductions in mortality and a higher net clinical benefit compared with warfarin; this was not seen with the 150-mg dose. The bleeding complication rates in this subgroup were consistent with the main RE-LY trial. In the warfarin group, patients with a previous history of stroke or transient ischemic attack developed more intracranial bleeding than patients without this history, but this was not the case in the dabigatran treatment groups.


 

AF

Abstract

Atrial fibrillation (AF) is associated with significant morbidity and mortality. It is also a progressive disease secondary to continuous structural remodelling of the atria due to AF itself, to changes associated with ageing, and to deterioration of underlying heart disease. Current management aims at preventing the recurrence of AF and its consequences (secondary prevention) and includes risk assessment and prevention of stroke, ventricular rate control, and rhythm control therapies including antiarrhythmic drugs and catheter or surgical ablation. The concept of primary prevention of AF with interventions targeting the development of substrate and modifying risk factors for AF has emerged as a result of recent experiments that suggested novel targets for mechanism-based therapies. Upstream therapy refers to the use of non-antiarrhythmic drugs that modify the atrial substrate- or target-specific mechanisms of AF to prevent the occurrence or recurrence of the arrhythmia. Such agents include angiotensin-converting enzyme inhibitors (ACEIs), angiotensin receptor blockers (ARBs), statins, n-3 (ω-3) polyunsaturated fatty acids, and possibly corticosteroids. Animal experiments have compellingly demonstrated the protective effect of these agents against electrical and structural atrial remodelling in association with AF. The key targets of upstream therapy are structural changes in the atria, such as fibrosis and hypertrophy, together with inflammation and oxidative stress, but direct and indirect effects on atrial ion channels, gap junctions, and calcium handling may also contribute. Although there have been no formal randomized controlled studies (RCTs) in the primary prevention setting, retrospective analyses and reports from studies in which AF was a pre-specified secondary endpoint have shown a sustained reduction in new-onset AF with ACEIs and ARBs in patients with significant underlying heart disease (e.g. left ventricular dysfunction and hypertrophy), and in the incidence of AF after cardiac surgery in patients treated with statins. In the secondary prevention setting, the results with upstream therapies are significantly less encouraging. Although the results of hypothesis-generating small clinical studies or retrospective analyses in selected patient categories have been positive, larger prospective RCTs have yielded controversial, mostly negative, results. Notably, controversy persists over whether upstream therapy may affect mortality and major non-fatal cardiovascular events in patients with AF. This has been addressed in retrospective analyses and large prospective RCTs, but the results remain inconclusive pending further reports. This review provides a contemporary evidence-based insight into the role of upstream therapies in primary (Part I) and secondary (Part II) prevention of AF.


Tuesday, April 05, 2011

 

homo sapiens neanderthal

BMC Biology 2011, 9:20. doi:10.1186/1741-7007-9-20

The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1741-7007/9/20

Received: 29 March 2011
Accepted: 31 March 2011
Published: 31 March 2011

© 2011 Liang and Nielsen; licensee BioMed Central Ltd.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Is it true that modern humans have Neanderthals and other archaic species in their direct ancestry?

According to two recently published papers, by Green et al. and Reich et al., the answer to this question is yes. Human genomes are in part composed of DNA from archaic hominin species that traditionally have not been counted among our ancestors, although the proportion of archaic DNA in a genome depends on ethnicity. On the basis of analyses of ancient DNA, Green et al. report that, on average, 1 to 4% of the genomes of non-African individuals are derived from Neanderthals, and Reich et al. report that 4 to 6% of the genomes of Melanesians are derived from a newly discovered archaic hominin population dubbed the Denisovans. Denisovans and Neanderthals are the only archaic species investigated so far, but future investigations may reveal contributions of DNA from other species, perhaps even from species that have never been well characterized morphologically.

What is an archaic hominin, exactly?

Hominins comprise modern humans and their close extinct relatives. Denisovans and Neanderthals were hominins that last lived approximately 30,000 years ago. Neanderthal fossils were first found in 1856 in the Neander Valley, which lends its name to the species. Since then, specimens have been found across a wide geographical range, including the Middle East, Central Asia, and Western and Central Europe. To date, the only Denisovan remains discovered are the finger bone and two teeth found in Denisova Cave in Siberia. On the basis of genetic analysis of the finger bone, Reich et al. conclude that the Denisovans represent a deeply diverged population distinct from Neanderthals. Whether Neanderthals and Denisovans constitute separate species is probably mainly an issue of semantics and, in any case, cannot be settled without additional Denisovan samples.

How does this fit with current theories of human origins?

The question of human origins has intrigued scientists ever since Darwin first proposed the theory of evolution. Historically, most of the debate has focused on two competing hypotheses: the out of Africa (OOA) theory (Figure 1a) and the multi-regional theory (Figure 1b). The OOA hypothesis posits that anatomically modern humans first evolved in Africa 200,000 to 150,000 years ago and then migrated out of Africa 100,000 to 60,000 years ago, displacing other archaic hominins and giving rise to all current human populations. The multi-regional theory suggests that archaic hominins spread out of Africa much earlier and that modern humans evolved from this Eurasia-wide population, with some degree of interbreeding, and thus gene flow, among regional populations accounting for the genetic differentiation we observe between populations today. Mitochondrial (mt) DNA data first reported in 1987, and subsequent analyses of autosomal DNA, seemed to support the OOA hypothesis.

Figure 1. Human origins. Each panel shows a hypothesis for the evolutionary history of humans. The colored bars show the phylogenetic relationships between species, with each color representing a species and blue representing the ancestral hominin species. Arrows represent gene flow, or admixture, with question marks to indicate possible admixture from as yet undiscovered hominins. (a) The Out of Africa (OOA) hypothesis; (b) the multiregional hypothesis; (c) a modification of the OOA hypothesis to include the archaic admixture inferred from recent work.

However, even before the publication of the Neanderthal genome, analyses of modern human DNA from different geographic sources by Jeffrey Wall and others had suggested that, contrary to the earlier consensus, anatomically modern humans evolved recently in Africa but admixed with endemic archaic hominins - Neanderthals, Denisovans, or even Homo erectus - as they spread throughout the world (Figure 1c), and that ancestral admixture may be much more common than previously thought.

Wall et al. based their analysis on the pattern of haplotype lengths. After controlling for other confounding factors, such as demographic history and recombination rate variation, they concluded that the observed lengths of these regions could only be accounted for by archaic admixture on the order of 5%. The evidence of admixture from Neanderthal and Denisovan nuclear DNA lends credence to these claims.

Isn't it extremely difficult to get authentic undamaged DNA from individuals dead for over 30,000 years?

Yes. It was necessary to locate samples that had been buried in cool and dry conditions, under which DNA is degraded relatively slowly. Even so, for the Neanderthal samples, most DNA fragments were very short, and approximately 95 to 99% of the DNA in the samples belonged to bacteria. To reduce the amount of sequencing needed, the relative proportion of hominin DNA was increased by treating the DNA extract with a concoction of restriction enzymes chosen to cut bacterial DNA preferentially. This increased the relative proportion of Neanderthal DNA to over 10%. This enriched extract was analyzed using next-generation sequencing machines, which produced a draft sequence with approximately 1.3× coverage - that is, on average, each base pair in the genome was sequenced 1.3 times. Because of the random nature of next-generation sequencing, this means that certain parts of the genome will not have been sequenced at all, while other parts will have been sequenced many more times.
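To see what such low coverage implies, a convenient back-of-the-envelope approximation (the classical Lander-Waterman model, used here purely for illustration and not taken from the papers themselves) treats the number of reads covering each position as Poisson distributed with mean equal to the average depth:

```python
import math

def coverage_profile(mean_depth, max_depth=5):
    """Expected fraction of genome positions covered exactly k times,
    assuming Poisson-distributed coverage (Lander-Waterman model)."""
    return {k: math.exp(-mean_depth) * mean_depth ** k / math.factorial(k)
            for k in range(max_depth + 1)}

profile = coverage_profile(1.3)  # Neanderthal draft: ~1.3x average depth
print(f"never sequenced:  {profile[0]:.0%}")  # ~27% of positions
print(f"sequenced once:   {profile[1]:.0%}")  # ~35%
print(f"sequenced twice+: {1 - profile[0] - profile[1]:.0%}")
```

Under this simple model, roughly a quarter of the genome is missed entirely at 1.3× depth, which is why these drafts cover only part of each genome.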

The genetic material from the Denisovan individual was extracted from a finger bone. Because of the cooler climate in Siberia, there was less environmental degradation of the DNA. However, the small volume of the finger bone yielded only enough DNA to sequence the Denisovan genome to 1.9× coverage.

We must have a lot of DNA in common with archaic hominins because of our shared ancestry - How can we infer interbreeding?

If we have two human populations, one of which has undergone more archaic admixture than the other, then we expect the more admixed population to be more genetically similar to the archaic hominin. This intuition is formalized by the ABBA-BABA test. In this statistical test, alleles at the same sites are compared across a chimpanzee sequence, an archaic hominin sequence, and sequences from a pair of modern human populations, such as Han and Yoruban or Japanese and French, designated H1 and H2. Only sites with two alleles, A and B, are considered, and the chimpanzee is assumed to carry A, the ancestral allele. Two numbers are then computed: nABBA, the number of sites where the chimpanzee and one of the pair of modern humans (H2) have allele A while the archaic hominin and the other modern human (H1) have allele B (ABBA); and nBABA, the number of sites where the chimpanzee and H1 have allele A while the archaic hominin and H2 have allele B (BABA). Finally, nABBA and nBABA are summed over all pairings of H1/H2 samples from the two human populations being analyzed. If there has been no archaic admixture, the difference between these sums is expected to be 0; if the difference is significantly different from 0, the null hypothesis of no admixture is rejected. Using population genetic models, the admixture fraction can also be estimated from the magnitude of the difference. The ABBA-BABA test can then be applied to each pair of human populations to determine the differences in admixture rates between them.
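As a concrete illustration, here is a minimal sketch of the counting just described, written in Python with an invented input format (the published analyses of course involve genotype calling and many additional filters):

```python
def abba_baba(sites):
    """sites: iterable of (h1, h2, archaic, chimp) alleles at individual
    positions. Follows the convention above: the chimpanzee carries the
    ancestral allele, and only biallelic sites are informative."""
    n_abba = n_baba = 0
    for h1, h2, archaic, chimp in sites:
        alleles = {h1, h2, archaic, chimp}
        if len(alleles) != 2:
            continue  # skip non-biallelic sites
        a = chimp                      # ancestral allele (A)
        b = (alleles - {chimp}).pop()  # derived allele (B)
        if (h1, h2, archaic) == (b, a, b):
            n_abba += 1  # H1 shares the derived allele with the archaic
        elif (h1, h2, archaic) == (a, b, b):
            n_baba += 1  # H2 shares the derived allele with the archaic
    total = n_abba + n_baba
    d = (n_abba - n_baba) / total if total else 0.0
    return n_abba, n_baba, d
```

Under the null hypothesis of no admixture, ABBA and BABA patterns arise only through incomplete lineage sorting and are equally likely, so the normalized difference D hovers around 0; a significant excess of one pattern indicates that one modern population shares more derived alleles with the archaic genome than the other.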

The results of the ABBA-BABA test showed that non-African human populations are more closely related to Neanderthals than are African populations. When the test was applied to the Denisovan genome, Reich et al. found that only Melanesians showed evidence of admixture.

This seems quite a subtle test - Might these results be explained by human contamination?

Probably not. Contamination is a serious problem in any sequencing project. A recent paper by Longo et al. reports significant human contamination in non-primate genome databases, and previous analyses of Neanderthal genetic material have also been plagued by human DNA contamination.

In the light of this earlier experience, researchers took several precautions to guard against contamination. The initial sample preparation and DNA extraction were done in a clean room, using several procedures to reduce the chances of modern human DNA contamination. As an additional step in the sample preparation, special primer sequences were ligated onto both ends of each fragment to mark them as having been processed in the clean room. During sequencing, only reads carrying this clean room tag were used to assemble the draft genome, minimizing the effect of post-clean room contamination.

The efficacy of these methods was validated using three different procedures: by looking at mtDNA; by looking for Y chromosome sequences; and by using statistical analyses of autosomes. mtDNA is much easier to sequence because it occurs at much higher copy number than nuclear DNA. As a result, the Neanderthal mtDNA sequence can be determined very accurately, and several fixed differences between humans and Neanderthals have been identified. These differences can be used to estimate the ratio of human mtDNA to Neanderthal mtDNA in the sample. Likewise, because all the sampled individuals were female, the amount of Y chromosomal DNA can be used to estimate the level of contamination from human males. Finally, researchers used human heterozygosity and allele frequency data to estimate contamination in the autosomal DNA directly. All three methods estimated the human contamination to be 1% or less.
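The mtDNA check, for instance, reduces to a simple proportion: at positions where the human and Neanderthal mtDNAs carry fixed differences, count the reads showing the modern human base. A hypothetical sketch (the read counts are invented for illustration):

```python
def mtdna_contamination(read_counts):
    """read_counts: list of (n_human_base, n_neanderthal_base) read counts,
    one tuple per diagnostic position where the two mtDNAs are fixed for
    different bases. Returns the pooled fraction of modern human reads."""
    human = sum(h for h, _ in read_counts)
    total = sum(h + n for h, n in read_counts)
    return human / total

# Invented counts at three diagnostic positions:
print(f"{mtdna_contamination([(1, 210), (0, 185), (2, 240)]):.2%}")  # ~0.47%
```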

This is consistent with the results of a blind test in which Green et al. examined present-day human genetic variation without knowledge of the Neanderthal sequence, and were able to locate regions of the human genome that appeared admixed. Comparison of their predictions with the Neanderthal data showed that these candidate regions matched the Neanderthal sequence at a higher frequency than could be explained by any level of contamination.

What about DNA damage?

The main problem in dealing with ancient DNA is the dearth of genetic material. The Neanderthal and Denisovan genomes could not be sequenced to a higher coverage not because of a lack of money or time, but because of a lack of DNA extract; the three bones from Vindija Cave and the one from Denisova Cave have been completely hollowed out to produce the genomes reported.

Ancient DNA sequencing typically shows a much higher error rate than is observed with modern DNA. Errors in the reported genome can be caused by environmental degradation of the DNA or by sequencing error. In ancient DNA samples, deamination converts cytosine (C) residues to uracil, which is read as thymine (T); on the complementary strand, the same lesions appear as G-to-A changes. As a result, the Neanderthal draft genome shows an abnormally large number of C->T and G->A substitutions, the vast majority of which are errors. In sequencing the Denisovan samples, this deamination damage was removed during sample preparation, allowing the C and G residues to be sequenced correctly. This, together with the drier and cooler climate at Denisova Cave, resulted in DNA samples that were about ten times less damaged.
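A standard defensive measure in ancient-DNA analyses (described here as a generic sketch, not as the papers' exact filter) is simply to distrust the two damage-prone substitution classes when comparing an ancient read with a modern reference:

```python
# Apparent changes that cytosine deamination can mimic, written as
# (modern_reference_base, ancient_read_base) pairs.
DAMAGE_PRONE = {("C", "T"), ("G", "A")}

def trustworthy_mismatch(modern_base, ancient_base):
    """True if a mismatch between an ancient read and a modern reference
    cannot be explained by deamination damage alone."""
    if modern_base == ancient_base:
        return False  # not a mismatch at all
    return (modern_base, ancient_base) not in DAMAGE_PRONE

print(trustworthy_mismatch("C", "T"))  # False: could be damage
print(trustworthy_mismatch("A", "G"))  # True: not a deamination signature
```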

Sequencing error can also be a problem, as the error rate of next-generation sequencing is only slightly lower than the divergence between humans and Neanderthals. However, this problem should diminish as next-generation sequencing technology becomes more accurate and the discovery of new samples allows for deeper coverage.

That sounds serious - How confident can we be of any interpretation if the sequencing error rate and the divergence are that close?

The statistical analysis of the Neanderthal and Denisovan genomes was designed with the limitations of the data in mind. A paper by Durand et al. argues that the ABBA-BABA test for admixture is not sensitive to confounding factors, such as human or Neanderthal demographic history, sequencing error, or damage to the DNA, as long as the H1 and H2 samples were processed in the same way. However, one source of concern is the possibility of a shared error structure caused by DNA sequencing methods. Current sequencing technology is highly temperamental, and the frequency and type of sequencing errors in the final data depend on many factors, such as sample preparation, the type of sequencing machine, contamination from local conditions and reagents, and sequencing coverage. If the error structures of the archaic DNA and one of the modern human DNA samples are similar to each other for any of these reasons, the ABBA-BABA test could report admixture where none occurred. Even a very small proportion of shared errors could strongly bias the ABBA-BABA statistic; for example, effects we typically ignore, such as shared contamination of reagents between samples, could create artifactual evidence of admixture. Green et al. and Reich et al. made great efforts to control for these effects, and appear to have succeeded. Nevertheless, errors in next-generation sequencing data, particularly for ancient DNA, and their consequences for current and future inference of low levels of admixture remain a critical issue that is likely to be the focus of much future research.
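How fragile the statistic is to correlated errors is easy to demonstrate in simulation. The sketch below uses invented parameter values (a 0.5% shared error rate, not a figure from the papers): sites are generated with no admixture at all, yet an error process that hits H1 and the archaic sample together pushes D well away from zero.

```python
import random

def simulate_sites(n_sites=200_000, derived_freq=0.3, shared_err=0.005, seed=1):
    """Biallelic sites with NO admixture: H1, H2, and the archaic carry the
    derived allele 'B' independently. With probability shared_err a site is
    hit by a correlated error that writes 'B' into both H1 and the archaic,
    mimicking, say, a contaminant common to both sequencing libraries."""
    random.seed(seed)
    sites = []
    for _ in range(n_sites):
        h1, h2, arch = ("B" if random.random() < derived_freq else "A"
                        for _ in range(3))
        if random.random() < shared_err:
            h1 = arch = "B"
        sites.append((h1, h2, arch))  # the chimpanzee is fixed for 'A'
    return sites

sites = simulate_sites()
n_abba = sum((h1, h2, a) == ("B", "A", "B") for h1, h2, a in sites)
n_baba = sum((h1, h2, a) == ("A", "B", "B") for h1, h2, a in sites)
print(f"D = {(n_abba - n_baba) / (n_abba + n_baba):.3f} with zero admixture")
```

With these invented numbers D comes out around 0.03, comfortably distinguishable from 0 over this many sites - exactly the kind of artifact the processing controls described above are designed to rule out.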

Assuming that we can be confident of the conclusions of these studies, how much of our genomes comes from other hominins?

These two papers only investigated the possibility of admixture from Neanderthals and Denisovans into humans. It is possible that other archaic hominins, perhaps as yet undiscovered, also contributed to the human genome. In fact, Plagnol and Wall report that there is evidence for significant admixture into African populations as well, although no candidate species has been proposed.

On the basis of the data and analyses presented by Green et al. and Reich et al., it appears that a simple out of Africa hypothesis with no admixture does not give the full picture of human origins. As sequencing technology improves and additional archaeological discoveries are made, we should be able to gain a more detailed understanding of what now seems to be the mosaic ancestry of the human genome.

Where can I find out more?

1. Green RE, Krause J, Briggs AW, Maricic T, Stenzel U, Kircher M, Patterson N, Li H, Zhai W, Fritz MH, Hansen NF, Durand EY, Malaspinas AS, Jensen JD, Marques-Bonet T, Alkan C, Prüfer K, Meyer M, Burbano HA, Good JM, Schultz R, Aximu-Petri A, Butthof A, Höber B, Höffner B, Siegemund M, Weihmann A, Nusbaum C, Lander ES, Russ C, et al.: A draft sequence of the Neandertal genome. Science 2010, 328:710-722.

2. Reich D, Green RE, Kircher M, Krause J, Patterson N, Durand EY, Viola B, Briggs AW, Stenzel U, Johnson PL, Maricic T, Good JM, Marques-Bonet T, Alkan C, Fu Q, Mallick S, Li H, Meyer M, Eichler EE, Stoneking M, Richards M, Talamo S, Shunkov MV, Derevianko AP, Hublin JJ, Kelso J, Slatkin M, Pääbo S: Genetic history of an archaic hominin group from Denisova Cave in Siberia. Nature 2010, 468:1053-1060.

3. Durand E, Patterson N, Reich D, Slatkin M: Testing for ancient admixture between closely related species. Genetics 2011, in press.

4. Wall J, Hammer M: Archaic admixture in the human genome. Curr Opin Genet Dev 2006, 16:606-610.

5. Plagnol V, Wall J: Possible ancestral structure in human populations. PLoS Genet 2006, 2:972-979.

