Using health outcomes data to compare plans, networks and providers. Palmer, R. H. International Journal for Quality in Health Care, 10(6):477–483, December 1998.
PURPOSE: To analyze the challenge of using health outcomes data to compare plans, networks and providers.
ANALYSIS: Different questions require different designs for collecting and interpreting health outcomes data. When evaluating the effectiveness of treatments, tests or other technologies, the question is 'what processes improve health outcomes?' For this purpose, the strongest evidence comes from a double-blind randomized controlled trial. In program evaluations, the question is 'what is the impact of this policy and related programs on health outcomes?' For this purpose, we may be able to randomize subjects, but are more likely to have a quasi-experimental or an epidemiological design. When we compare plans, networks and providers for quality improvement purposes, the question is 'do these specific plans perform differently from one another?' or 'are these specific plans improving their performance over time?' We want to isolate for study the effects attributable to specific plans. Designs that yield strong evidence cannot be applied because we lack experimental control.
CONCLUSIONS: When we already have strong evidence linking specific processes of care with specific outcomes, comparing process data may reveal more about the performance of plans, networks and providers than comparing outcomes data. Comparisons of process data are easier to interpret and more sensitive to small differences than comparisons of outcomes data. Outcomes data are most useful for tracking care given by high-volume providers over long periods of time, for targeting areas for quality improvement, and for detecting problems in the implementation of processes of care.
@article{palmer_using_1998,
	title = {Using health outcomes data to compare plans, networks and providers},
	volume = {10},
	issn = {1353-4505},
	abstract = {PURPOSE: To analyze the challenge of using health outcomes data to compare plans, networks and providers.
ANALYSIS: Different questions require different designs for collecting and interpreting health outcomes data. When evaluating effectiveness of treatments, tests or other technologies, the question is what processes improve health outcomes? For this purpose, the strongest evidence comes from a double-blind randomized controlled trial. In program evaluations, the question is 'what is the impact of this policy and related programs on health outcomes?' For this purpose, we may be able to randomize subjects, but are more likely to have a quasi-experimental or an epidemiological design. When we compare plans, networks and providers for quality improvement purposes the question is 'do these specific plans perform differently from one another?', or, 'are these specific plans improving their performance over time?' We want to isolate for study the effects attributable to specific plans. Designs that yield strong evidence cannot be applied because we lack experimental control.
CONCLUSIONS: When we already have strong evidence linking specific processes of care with specific outcomes, comparing process data may reveal more about performance of plans, networks and providers than comparing outcomes data. Comparisons of process data are easier to interpret and more sensitive to small differences than comparisons of outcomes data. Outcomes data are most useful for tracking care given by high volume providers over long periods of time, targeting areas for quality improvement and for detecting problems in implementation of processes of care.},
	language = {eng},
	number = {6},
	journal = {International journal for quality in health care: journal of the International Society for Quality in Health Care / ISQua},
	author = {Palmer, R. H.},
	month = dec,
	year = {1998},
	pmid = {9928586},
	keywords = {Benchmarking, Health Care Reform, Humans, Managed Care Programs, Managed Competition, Outcome Assessment (Health Care), Process Assessment (Health Care), Randomized Controlled Trials as Topic, Research Design, United States},
	pages = {477--483}
}