Computerized interpretation of the electrocardiogram (ECG) began in the 1950s, when conversion of its analog signal to digital form became available. Since then, automatic computer interpretation of the ECG has become routine, even at the point of care, through the addition of interpretive algorithms to portable ECG carts. Now, more than 100 million computerized ECG interpretations are recorded yearly in the United States. These interpretations have contributed to medical care by reducing physician reading time and accurately interpreting most normal ECGs. But errors do occur. The computer cannot be held responsible for misinterpretations due to recording errors, such as muscle artifact or lead reversal. In many abnormal ECGs, however, the computer makes its own errors, sometimes critical ones, in detecting arrhythmias, pacemakers, and myocardial infarctions. These errors require that all computerized statements be over-read by trained physicians, who have the advantage of clinical context unavailable to the computer. Together, the computer and over-readers now provide the most accurate ECG interpretations available.
Introduction and Background
In the 1950s it became possible to convert the analog electrocardiogram (ECG) signal into digital form, and in the 1960s these digitized signals permitted the development of algorithms that interpret the ECG by computer.1 Since then, the algorithms have been continuously updated and improved. By 1988, 52 million ECGs were being recorded in the United States each year, and the prospects for computerized interpretation were bright.1 Automatic, accurate, and rapid interpretations had the potential to reduce interpretation errors, especially in places where trained readers were not available, and to save physician time where they were. Optimism was so high that Medicare decided that computerized ECG interpretations were sufficiently accurate that it was no longer necessary to pay physicians to read them.2 This decision was quickly overturned, but some of the perceived advantages were confirmed. Reading time was reduced by an estimated 24% to 28%,1 thanks to reliable, automatic, and accurate measurements of the heart rate, QRS axis, and durations of the PR and QRS intervals. Measurement of the QT interval, however, remained a problem (see below). The later development of the microprocessor enabled manufacturers to incorporate computer analysis into portable ECG carts.1 By 2006, it was estimated that 100 million ECGs were being interpreted by computer annually in the United States alone, with a similar number in Europe.1 The computer could identify abnormalities inadvertently overlooked by a physician reader; however, computerized interpretation was most accurate at correctly identifying sinus rhythm and normal waveforms, and disturbingly less accurate with abnormalities of rhythm, conduction, and waveform. A considerable literature has accumulated on all aspects of computerized interpretation, including an excellent detailed review in 20061 and another in 2017.3
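To make the interval measurements mentioned above concrete, the following is a minimal illustrative sketch, not drawn from the article, of the underlying arithmetic. It assumes the fiducial points (R-peak times and a measured QT interval) have already been located by an upstream wave-detection step, and it uses the Bazett rate correction, one standard formula, as a reminder of why automated QT assessment is harder than simple interval timing.

# Illustrative Python sketch (assumptions noted above; not the article's method).
# All times are in seconds and are assumed to come from an upstream detector.

def heart_rate_bpm(rr_s: float) -> float:
    """Heart rate in beats per minute from one R-R interval (seconds)."""
    return 60.0 / rr_s

def mean_rr(r_peak_times_s: list[float]) -> float:
    """Mean R-R interval from successive R-peak times (seconds)."""
    diffs = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    return sum(diffs) / len(diffs)

def qtc_bazett(qt_s: float, rr_s: float) -> float:
    """Bazett rate-corrected QT: QTc = QT / sqrt(RR). Rate correction is one
    reason the QT interval is harder to automate than PR or QRS duration."""
    return qt_s / (rr_s ** 0.5)

if __name__ == "__main__":
    r_peaks = [0.00, 0.80, 1.61, 2.40]   # hypothetical R-peak times, seconds
    rr = mean_rr(r_peaks)                 # about 0.80 s
    print(f"Heart rate: {heart_rate_bpm(rr):.0f} bpm")      # about 75 bpm
    print(f"QTc (Bazett): {qtc_bazett(0.40, rr)*1000:.0f} ms")  # about 447 ms

Even this toy version shows the asymmetry the article alludes to: heart rate and the PR and QRS durations are direct subtractions, while the QT value must also be rate-corrected, and the choice of correction formula itself remains debated.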
-Harold Smulyan, MD
This article originally appeared in the February issue of The American Journal of Medicine.