
Qualitative and Semi-quantitative Module (QMC)

By Carol Lee

Copyright 2002-2010, Data Innovations, LLC. All rights reserved world-wide. This document may not be reproduced without express written permission.

Method Comparison Qual/SemiQuant


Purpose
To evaluate the degree of concordance between two qualitative or semi-quantitative methods.

Principle
Qualitative and semi-quantitative methods lack sufficient analytical sensitivity to demonstrate accuracy across a wide range of observed results. Therefore, results are categorized into diagnostic states, e.g., positive / negative, equivocal, gray zone.
Qualitative: 2 states (i.e., positive and negative)
Semi-quantitative: a limited number of states (up to 6 are allowed in EP Evaluator)

QMC
Evaluate the degree of concordance between two qualitative or
semi-quantitative methods.
Colorful bubble chart: the size of each bubble is proportional to the number of points. Concordance (truth) tables.

Qualitative 2 states (i.e. positive and negative)


% positive agreement, % negative agreement. False positives and false negatives are shown in red. If one method is the gold standard, sensitivity and specificity are calculated.

Semi-quantitative: up to 6 states allowed in EP Evaluator


Click Custom Codes to set up the states. Alphanumeric labels, e.g., Equivocal, gray zone. Numeric cutoff values (e.g., below the cutoff vs. above the cutoff).

Experimental Design
Qualitative:
At least 20 specimens. Equal numbers of positives and negatives

Semi-quantitative:
Same number of states for both methods. User-defined results or cutoffs and labels (e.g., 0 to 10, few). Try for a similar N in each state.

Method Comparison Qualitative


Enter values as 2 states
User defined labels

Calculations
If one method is specified as a gold standard (i.e., absolutely correct), specificity and sensitivity are calculated. If neither method is a gold standard, only the relationship between the two methods is calculated, namely the degree of agreement and whether one method is more or less sensitive relative to the other.
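Where a gold standard is named, sensitivity and specificity follow the usual 2x2 definitions; otherwise only agreement between the methods can be reported. A minimal sketch of these calculations (hypothetical counts, not EP Evaluator's code):

```python
# Minimal sketch of the 2-state QMC calculations with hypothetical counts:
# tp = positive by both methods, tn = negative by both,
# fn = test negative / reference positive, fp = test positive / reference negative.

def qualitative_stats(tp, fp, fn, tn, gold_standard=True):
    total = tp + fp + fn + tn
    stats = {"agreement": (tp + tn) / total}
    if gold_standard:
        # Meaningful only if the reference method is treated as absolutely correct.
        stats["sensitivity"] = tp / (tp + fn)  # reference positives found positive
        stats["specificity"] = tn / (tn + fp)  # reference negatives found negative
    return stats

print(qualitative_stats(tp=90, fp=5, fn=7, tn=98))
```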

Qualitative

Default is 2 states: P = Positive, N = Negative. The allowed results show in the drop-down box in the data entry area.

Gold Standard Reference

Enter 2 state results

Gold standard

Not Gold Standard

Report Excerpt
Prepared for: Chemistry Dept -- Holy Name Hospital By: Clinical Laboratory -- Community Hospital

Qualitative Method Comparison


Ref. Method: Immunochromatic
Test Method: NCCLSEx2

Statistical Analysis
(Comparison of two Laboratory Methods)

[Bubble chart: Test results (neg to pos) versus Reference results (neg to pos); bubble size proportional to the number of results]

Agreement: 94.6% (92.3 to 96.2%)
Positive Agreement: 95.3%
Negative Agreement: 93.7%

95% confidence intervals calculated by the "Score" method.
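The "Score" method is not spelled out in the report; a common score interval for a proportion is the Wilson score interval, so the sketch below assumes that is what is meant (hypothetical counts):

```python
import math

# Minimal sketch of a Wilson score interval for a proportion (z = 1.96 for 95%).
# Hypothetical data: 180 concordant results out of 200 specimens.

def wilson_interval(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half_width = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half_width, center + half_width

low, high = wilson_interval(180, 200)
print(f"{180/200:.1%} ({low:.1%} to {high:.1%})")
```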

McNemar Test for Symmetry:
Test < Reference: 14 (2.6%)
Test > Reference: 15 (2.8%)
Symmetry test PASSES: p = 0.853 (ChiSq = 0.034, 1 df)
A value of p<0.05 suggests that one method is consistently "larger".
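The symmetry test needs only the two discordant counts. A minimal sketch of the uncorrected McNemar chi-square with 1 degree of freedom, which appears consistent with the ChiSq and p shown above:

```python
import math

# Minimal sketch of the McNemar symmetry test from the two discordant counts
# (Test < Reference and Test > Reference). Uncorrected chi-square with 1 df.

def mcnemar(test_lt_ref, test_gt_ref):
    b, c = test_lt_ref, test_gt_ref
    chisq = (b - c) ** 2 / (b + c)
    p = math.erfc(math.sqrt(chisq / 2))  # upper tail of chi-square with 1 df
    return chisq, p

chisq, p = mcnemar(14, 15)  # discordant counts from the qualitative example above
print(f"ChiSq = {chisq:.3f}, p = {p:.3f}")
```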

Cohen's Kappa

89.0% (85.1 to 92.9%)


Kappa is the proportion of agreement above what's expected by chance. Rule of thumb is Kappa>75% indicates "high" agreement. We would like to see VERY high (close to 100%) agreement.
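A minimal sketch of Cohen's Kappa computed from a concordance table (hypothetical counts; the same formula extends to semi-quantitative tables with more than two states):

```python
# Minimal sketch of Cohen's Kappa from a concordance table.
# Rows are reference states, columns are test states; counts are hypothetical.

def cohens_kappa(table):
    n = sum(sum(row) for row in table)
    k = len(table)
    p_observed = sum(table[i][i] for i in range(k)) / n
    row_totals = [sum(row) for row in table]
    col_totals = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_chance = sum(row_totals[i] * col_totals[i] for i in range(k)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

table = [[98, 7],   # reference negative: 98 test negative, 7 test positive
         [5, 90]]   # reference positive: 5 test negative, 90 test positive
print(f"Kappa = {cohens_kappa(table):.1%}")
```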

Statistical Summary (excerpt)

Test result      Negative Reference   Positive Reference   Total
Negative (N)            222                   14             236

Legend: 1 = Negative (N), 2 = Positive (P), for both Reference and Test.

Concordance table

Legend: TP = positive by both methods; TN = negative by both methods; FN = negative by the test method but positive by the reference method; FP = positive by the test method but negative by the reference method (the reference method is assumed to be correct).

Agreement: The percent of total cases in which the two methods give the same result. Related statistics for qualitative tests: Positive Agreement is the percent of cases that match when the reference method is positive: TP/(TP+FN). Negative Agreement is the percent of cases that match when the reference method is negative: TN/(TN+FP).

Cohen's Kappa: Similar to Agreement, but adjusted for the probability that the two methods agree by chance. Kappa ranges from -100% to 100%. A value of zero indicates random agreement. A value of 100% indicates perfect agreement. It is desirable for Kappa to be well above 75%.

McNemar Test for Symmetry: A test for bias -- whether one method is consistently larger than the other. If the number of cases where X>Y is equal (within random error) to the number of cases where X<Y, the method is unbiased and the symmetry test passes. If most of the differences between X and Y occur when X>Y (or when X<Y), the symmetry test fails.


Preliminary Report
The word PRELIMINARY printed diagonally across the report indicates that the data is incomplete, and the report is not acceptable as a final report. Some or all of the statistics may be missing. This report is preliminary if there are less than 20 unexcluded results.


Semi-quantitative
Prepared for: Chemistry Dept -- Holy Name Hospital By: Clinical Laboratory -- Community Hospital

Ref. Method: Chem Assay

Test Method: Analyzer

Statistical Analysis
[Bubble chart: Test states 1-6 versus Reference states 1-6; bubble size proportional to the number of results]
(Comparison of two Laboratory Methods)

Agreement

71.9% (61.8 to 80.2%)

95% confidence intervals calculated by the "Score" method.

McNemar Test for Symmetry:
Test < Reference: 23 (25.8%)
Test > Reference: 2 (2.2%)
Symmetry test FAILS: p < 0.001 (ChiSq = 17.640, 1 df)
A value of p<0.05 suggests that one method is consistently "larger".

Cohen's Kappa

60.5% (47.4 to 73.6%)

Kappa is the proportion of agreement above what's expected by chance. Rule of thumb is Kappa>75% indicates "high" agreement. We would like to see VERY high (close to 100%) agreement.

[Concordance table: Test states 1-6 versus Reference states 1-6, with row and column totals; the legend lists the Reference and Test state labels]

Allow 1 step difference
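A minimal sketch of exact agreement versus agreement allowing a one-step difference, using hypothetical paired state codes (not EP Evaluator's code):

```python
# Exact agreement and agreement allowing a one-step difference for
# semi-quantitative results. ref and test are hypothetical state codes (1-6)
# for the same specimens.

def agreement(ref, test, allowed_steps=0):
    matches = sum(abs(r - t) <= allowed_steps for r, t in zip(ref, test))
    return matches / len(ref)

ref = [1, 2, 2, 3, 4, 5, 6, 6]
test = [1, 1, 2, 3, 5, 5, 6, 5]
print(f"Exact agreement: {agreement(ref, test):.1%}")
print(f"Within one step: {agreement(ref, test, allowed_steps=1):.1%}")
```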

Allow 1 step
Prepared for: Chemistry Dept -- Holy Name Hospital By: Clinical Laboratory -- Community Hospital

Ref. Method: Chem Assay

Test Method: Analyzer

Statistical Analysis
[Bubble chart: Test states 1-6 versus Reference states 1-6; bubble size proportional to the number of results]
(Comparison of two Laboratory Methods)

Agreement: 71.9% (61.8 to 80.2%)
Agreement within two: 98.9% (93.9 to 99.8%)

95% confidence intervals calculated by the "Score" method.

McNemar Test for Symmetry:
Test < Reference: 23 (25.8%)
Test > Reference: 2 (2.2%)
Symmetry test FAILS: p < 0.001 (ChiSq = 17.640, 1 df)
A value of p<0.05 suggests that one method is consistently "larger".

Cohen's Kappa

60.5% (47.4 to 73.6%)

Kappa is the proportion of agreement above what's expected by chance. Rule of thumb is Kappa>75% indicates "high" agreement. We would like to see VERY high (close to 100%) agreement.

Concordance table (excerpt), Test state 1 row:

Reference state:    1    2    3    4    5    6    Total
Count:             10    5   --   --   --   --      15

Legend (excerpt): Reference: Very Negative; Test: Very Negative

Custom result codes: cutoff values. A result code with a cutoff value applies to results greater than the number entered (see the sketch below).
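A minimal sketch of mapping a numeric result to a state with cutoff values, assuming each state covers results greater than the cutoff entered; the cutoffs and labels below are hypothetical, not from EP Evaluator:

```python
import bisect

# Map a numeric result to a semi-quantitative state via cutoff values,
# where each state applies to results greater than the cutoff entered.

cutoffs = [5, 20, 100]                       # hypothetical cutoff values
labels = ["Negative", "Trace", "1+", "2+"]   # hypothetical state labels

def to_state(result):
    # The number of cutoffs the result exceeds selects the state label.
    return labels[bisect.bisect_left(cutoffs, result)]

print(to_state(3), to_state(50), to_state(250))  # Negative 1+ 2+
```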

Policy Definitions: Editing Analyte Settings

Your choices in Modules and Options define which headers are visible in the analyte settings. If you don't select an SD goal in SP, the column for the random error budget will not be present.

If you check QMC, a separate tab appears for QMC.


QMC entering analyte parameters


To edit: paste the headers into Excel, make your changes, then copy/paste from Excel back into EE9.

Spreadsheet columns: Analyte, Custom?, ResultsType, LvlName1, LvlValue1, LvlName2, LvlValue2, ... , LvlName6, LvlValue6
Example row (excerpt): Analyte BACT, Custom? = y, with level names such as Norm and Abn and their level values (e.g., 1 and 1200).

QMC analyte settings policy definitions

0 = alpha; 1 = numeric, large is positive; 2 = numeric, large is negative


Questions and Discussion


