VAERS, Underreporting, and the Mysterious 1%
A CDC database that was mostly unknown except to a small segment of the population is now regularly in the news. It is the Vaccine Adverse Event Reporting System, commonly known as VAERS. My topic here is not VAERS generally, but a number that recurs often in discussions about the vaccine injury reports VAERS contains.
For instance, in today’s Mercola.com newsletter there is an article summarizing Dr. Mercola’s interview with Dr. Vladimir Zelenko. In that interview Dr. Zelenko states a common refrain about VAERS reporting:
“If you look at the 2009 Harvard study on the VAERS system, they said only 1% of events are actually reported. So, OK … whatever the number is, it’s not 6,000. Maybe only 10% are being reported. I don’t know. But definitely it’s being underreported.”
Mercola, just a few paragraphs below that, emphasizes the point:
“Studies have indicated it could be as low as 1%.6,7”
The Harvard study mentioned by Dr. Zelenko is the same one Mercola cites as reference #6. Mercola’s reference #7 is superfluous, since it simply points back to reference #6. Clearly that Harvard study is where the gold is.
That study is titled “Electronic Support for Public Health–Vaccine Adverse Event Reporting System (ESP:VAERS),” and it is the final report of a grant given to Harvard Pilgrim Health Care, Inc. It is worth pointing out that Harvard Pilgrim Health Care, Inc., has no affiliation with Harvard University. Reference to “the Harvard study” regarding the 1% figure adds legitimacy it doesn’t deserve.
The point of the grant was to develop a more accurate system for capturing vaccine injuries: one that mined electronic medical records in real time, flagging potential adverse events and generating reports that physicians could approve for automated submission to VAERS. It would have been a dramatic improvement over the current system of health-care-worker-initiated reports.
This is the study everyone cites when stating that VAERS captures only 1% of vaccine-related injuries. So where is that information in this study? Here is the statement, with a bit of context:
“Adverse events from drugs and vaccines are common, but underreported. Although 25% of ambulatory patients experience an adverse drug event, less than 0.3% of all adverse drug events and 1–13% of serious events are reported to the Food and Drug Administration (FDA). Likewise, fewer than 1% of vaccine adverse events are reported.”
There is no citation for that sentence. It just states it as fact. Nothing coming before or after even suggests what that number is based upon.
At the end of the document there is no list of supporting citations, only a list of the organization’s own publications on automated adverse event reporting. In fact, I am unable to find any study of VAERS reporting efficiency that would establish whether the 1% estimate overstates or understates the percentage of injuries actually being reported.
As such, I think it is somewhat sloppy science to cite the “Harvard study” as support for claims of VAERS underreporting. If anyone can find one or several citations that actually seek to quantify VAERS underreporting, please send me a note.
There is another story being missed in that Harvard Pilgrim study, though. The point of the grant to Harvard Pilgrim Health Care, Inc., was to develop a far more efficient system for capturing vaccine injuries, one built into electronic health records systems so that it could flag temporal relationships between vaccinations dispensed and symptoms subsequently reported. So were there any problems meeting the objectives of the grant? What did the final report have to say about developing and implementing such a system?
Restructuring at CDC and consequent delays in terms of decision making have made it challenging despite best efforts to move forward with discussions regarding the evaluation of ESP:VAERS [the new system] performance in a randomized trial and comparison of ESP:VAERS performance to existing VAERS and Vaccine Safety Datalink data.
Synthetic and real test data has been generated and transmitted between Harvard and Constella. However, real data transmissions of non-physician approved reports to the CDC were unable to commence, as by the end of this project, the CDC had yet to respond to multiple requests to partner for this activity.
We had initially planned to evaluate the system by comparing adverse event findings to those in the Vaccine Safety Datalink project … however, due to restructuring at CDC and consequent delays in terms of decision making, it became impossible to move forward with discussions regarding the evaluation of ESP:VAERS performance in a randomized trial, and compare ESP:VAERS performance to existing VAERS and Vaccine Safety Datalink data.
Unfortunately, there was never an opportunity to perform system performance assessments because the necessary CDC contacts were no longer available and the CDC consultants responsible for receiving data were no longer responsive to our multiple requests to proceed with testing and evaluation.
Keep in mind, this system would have extracted probable vaccine injury information from patient charts and fed it directly into the VAERS system. In other words, it would have dramatically increased both the accuracy and the number of vaccine injuries reported to VAERS.
The CDC blocked both the data collection by the Harvard Pilgrim group and any implementation that could have followed from its findings. In fact, the lack of engagement by the CDC rendered the entire project essentially null.
The Harvard Pilgrim study is not a reference that substantiates any claims about VAERS underreporting. It is a reference to substantiate claims that the CDC has no interest in making reports of vaccine injury either more accurate or more comprehensive.
The Harvard Pilgrim study is here for all to see: