Background
Dosing algorithms for warfarin incorporate clinical and genetic factors, but human intervention to overrule algorithm-based dosing may occasionally be required. The frequency and reasons for varying from algorithm-based warfarin management have not been well studied.
Methods
We analyzed a prospective cohort of 1015 participants from the Clarification of Optimal Anticoagulation through Genetics trial who were randomized to either pharmacogenetic- or clinically-guided warfarin dosing algorithms. Clinicians and participants were blinded to dose, but not to the international normalized ratio (INR), during the first 28 days. If an issue arose that raised concern for clinicians but might not be adequately accounted for by the protocol, clinicians contacted the unblinded medical monitor, who could approve exceptions if clinically justified. All granted exceptions were logged and categorized. We analyzed the relationship of dosing exceptions to baseline characteristics and to the percentage of time in the therapeutic INR range during the first 4 weeks.
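The outcome above, percentage of time in the therapeutic INR range, is commonly computed from intermittent INR measurements by Rosendaal-style linear interpolation. The Python sketch below is a minimal illustration of that general technique, not necessarily the trial's exact statistical procedure; the function name, dates, and INR values are hypothetical.

```python
from datetime import date

def time_in_therapeutic_range(measurements, low=2.0, high=3.0):
    """Percentage of days with interpolated INR in [low, high].

    measurements: list of (date, INR) tuples sorted by date.
    Uses Rosendaal-style linear interpolation between successive INRs.
    """
    in_range_days = 0.0
    total_days = 0.0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        if span <= 0:
            continue  # skip same-day or out-of-order measurements
        total_days += span
        for step in range(span):
            # Linearly interpolate the INR for each day in the interval.
            inr = inr0 + (step / span) * (inr1 - inr0)
            if low <= inr <= high:
                in_range_days += 1
    return 100.0 * in_range_days / total_days if total_days else 0.0

# Hypothetical INR checks during the first 4 weeks of warfarin therapy.
inrs = [(date(2016, 4, 1), 1.2), (date(2016, 4, 5), 1.9),
        (date(2016, 4, 12), 2.6), (date(2016, 4, 28), 3.4)]
print(f"TTR = {time_in_therapeutic_range(inrs):.1f}%")  # TTR = 55.6%
```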
Results
Sixteen percent of participants required at least one exception to the protocol-defined warfarin dose (15% in the genotype arm and 17% in the clinical arm). Ninety percent of dose exceptions occurred after the first 5 days of dosing. The only baseline characteristic associated with dose exceptions was congestive heart failure (odds ratio 2.12; 95% confidence interval, 1.49-3.02; P < .001). Neither study arm nor genotype was associated with dose exceptions.
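For readers less familiar with the statistics reported above, the sketch below shows how an unadjusted odds ratio and a Woolf-type 95% confidence interval are computed from a 2x2 table. The counts are hypothetical, and the paper's estimate of 2.12 may come from a multivariable model rather than a raw table; this only illustrates the quantity being reported.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) confidence interval.

    a: exposed, outcome present      b: exposed, outcome absent
    c: unexposed, outcome present    d: unexposed, outcome absent
    """
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of ln(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts: dose exceptions among patients with vs without
# congestive heart failure.
print(odds_ratio_ci(a=40, b=110, c=120, d=745))
```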
Conclusion
Despite rigorous algorithms, human intervention is frequently employed in the early management of warfarin dosing. Congestive heart failure at baseline appears to predict early exceptions to standardized protocol management.
Warfarin is one of the most commonly prescribed medications but is difficult to manage because of substantial variability in dose requirements within and across individuals. Despite the advent of newer oral anticoagulants for atrial fibrillation and deep venous thrombosis, warfarin continues to be widely used for these and many other clinical indications. While some clinicians use an empiric approach to adjust the dose of warfarin, computer-assisted algorithms have been shown to improve time in the therapeutic international normalized ratio (INR) range compared with empiric dosing. Widely available algorithms for choosing the initial dose of warfarin incorporate clinical factors including age; race; body surface area; smoking status; history of diabetes; history of stroke; deep vein thrombosis or pulmonary embolism as the primary indication for warfarin therapy; target INR; and major interacting medications (ie, amiodarone or fluvastatin). When available, the addition of pharmacogenetic data, including genotypes for the cytochrome P-450 family 2 subfamily C polypeptide 9 enzyme (CYP2C9) and vitamin K epoxide reductase complex 1 (VKORC1), appeared to further improve warfarin dose prediction in some models, but not in a randomized clinical trial.
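To make the structure of such algorithms concrete, the sketch below shows the general shape many published warfarin-initiation algorithms take: a linear model on the square root of the weekly dose, built from clinical covariates and, when genotype is available, additional pharmacogenetic terms. Every coefficient here is a placeholder invented for illustration; the real values come from the published algorithms (eg, those used in the COAG trial), and the function name and covariate set are ours.

```python
def predicted_weekly_dose_mg(age_decades, bsa_m2, smoker, amiodarone,
                             vkorc1_variant_alleles=None,
                             cyp2c9_variant_alleles=None):
    """Illustrative warfarin initiation-dose model.

    Linear predictor on the square root of the weekly dose (mg), with
    PLACEHOLDER coefficients; published algorithms supply the real values.
    """
    sqrt_dose = (6.0                      # intercept (placeholder)
                 - 0.3 * age_decades      # older patients need lower doses
                 + 0.5 * bsa_m2           # larger body size, higher doses
                 + 0.2 * smoker           # smoking induces warfarin clearance
                 - 0.6 * amiodarone)      # amiodarone inhibits metabolism
    # Pharmacogenetic algorithms add genotype terms when genotype is known.
    if vkorc1_variant_alleles is not None:
        sqrt_dose -= 0.4 * vkorc1_variant_alleles  # placeholder
    if cyp2c9_variant_alleles is not None:
        sqrt_dose -= 0.3 * cyp2c9_variant_alleles  # placeholder
    return max(sqrt_dose, 0.0) ** 2

# Hypothetical patient: age 65 (6.5 decades), BSA 1.9 m^2, nonsmoker,
# no amiodarone; then the same patient with one VKORC1 variant allele.
print(predicted_weekly_dose_mg(6.5, 1.9, smoker=0, amiodarone=0))           # 25.0
print(predicted_weekly_dose_mg(6.5, 1.9, 0, 0, vkorc1_variant_alleles=1))   # ~21.2
```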
The key components of dosing algorithms cannot account for every circumstance affecting each individual, and human intervention to overrule algorithm-based dosing may occasionally be required.2 The frequency and reasons for varying from algorithm-based warfarin management have not been well studied.
The Clarification of Optimal Anticoagulation through Genetics (COAG) trial was a randomized clinical trial that aimed to determine whether initiation of warfarin therapy using algorithms based on genotype and clinical information (ie, pharmacogenetic-guided dosing) improved the time in the therapeutic INR range compared with algorithms based on clinical information alone (ie, clinically-guided dosing). The trial found no significant difference between study arms, but provided a rare opportunity to study the applicability of warfarin dosing algorithms. During the first 28 days after enrollment in COAG, the actual dose of warfarin was blinded to both clinicians and patients but was directed by a series of standardized computerized algorithms. Clinicians were aware of INRs. If an issue arose that raised concern for clinicians but might not be adequately accounted for by the algorithm, clinicians contacted an unblinded COAG medical monitor, who could approve exceptions to the protocol algorithm if clinically justified.
For these algorithms to be relied upon in clinical practice, providers should know before using them whether there are specific patients or circumstances in which the algorithms might fail, how often such failures occur, and whether the addition of genetic data reduces the need for exceptions. We hypothesized that the baseline characteristics predicting which patients require exceptions to algorithm-based dosing would be medical comorbidities or indications for warfarin therapy not included in current algorithms, as well as the location of the patient (inpatient vs outpatient) on the day of enrollment. If confirmed, these findings could lead to refinements of existing algorithms that would improve warfarin dosing in the future. Moreover, patients predicted to require frequent overruling of the standard algorithms might be better served by an alternative anticoagulant.
To read this article in its entirety, please visit our website.
-Scott E. Kasner, MD, MSCE, Le Wang, MS, Benjamin French, PhD, Steven R. Messé, MD, Jonas Ellenberg, PhD, Stephen E. Kimmel, MD, MSCE, COAG Trial Steering Committee
This article originally appeared in the April 2016 issue of The American Journal of Medicine.