How point-of-care diagnostic lab tests inform clinical decisions
Point-of-care (POC) diagnostic laboratory testing is not common in eye care. This is not due to any lack of clinical need; rather, it reflects a shortage of specific tests known to demonstrate diagnostic and/or treatment relevance to the optometrist, along with a general resistance to adopting new diagnostic technologies. Yet there is evidence of a rapidly growing acceptance of POC testing as clinicians begin to better understand the diagnostic and treatment value of currently available tests and how they correlate to various ocular surface disorders.1
Quantitative and qualitative ophthalmic lab tests can be highly precise diagnostic tools, and they may also be used to evaluate treatment efficacy. Whether they are based on traditional lateral flow immunoassay systems, electrochemical sensors, or other platforms under development, continued advances in technology have steadily produced more sophisticated devices capable of accurately measuring an increasing number of target ocular analytes.
POC lab tests have the real potential to streamline health care in general and improve clinical outcomes. In the ophthalmic clinic, for example, shifting work from physician diagnostic chair time to lab technician time is itself a significant improvement in patient flow. The more testing the technician is able to perform, the better for overall office flow: the clinician can spend more time with the patient instead of performing tests. Imagine the improvement in patient flow if a patient’s lab test results are in the chart before the doctor enters the lane.
In my experience, the use of POC lab testing within optometry is growing, and it will continue to expand as more tests are developed and more clinicians adopt them in their day-to-day diagnostic regimens. Over the past 20 years, there has been a slow but steady shift from these tests being regarded as interesting gadgets to their acceptance as critical diagnostic systems.
Medical necessity as guiding principle
We must begin by stating that the use of lab tests, particularly those that are reimbursable, is governed by one basic premise: they must be judged to be medically necessary.2 As health maintenance organizations (HMOs) and government agencies seek to provide quality, cost-effective medicine, reducing the ordering of “unnecessary” laboratory tests is one of their favorite objects of pursuit.
In this regard, the critical question facing physicians is: What constitutes a necessary laboratory test? Medical necessity is defined by the need to diagnose and treat an illness or injury; the patient’s documented signs, symptoms, or diagnosis must support the services or treatment in order for them to be considered medically necessary.3 It is incumbent upon physicians to understand which laboratory tests are accurate enough to be clinically useful and appropriate to order in the diagnosis and follow-up of a patient’s medical condition.4