==[[SMHS| Scientific Methods for Health Sciences]] - Study and Research Critiques ==
===Overview===
The scientific rigor of published literature, grant proposals and reports needs to be assessed and scrutinized to minimize errors in data extraction and meta-analysis. Reporting biases present significant obstacles to the collection of relevant information on the effectiveness of an intervention, the strength of relations between variables, or causal associations. Research findings are important to individual decision making because they reveal the benefits, harms, or lack of effectiveness of new and established health interventions. Thus, a systematic review can flag studies, projects or data that may not have been appropriately acquired, analyzed or interpreted.
===Motivation===
Identifying relevant research data is challenging. There is evidence that published data may not represent all of the findings on an intervention's effectiveness, as there is an intrinsic bias toward publishing positive findings rather than null or negative results. In some cases, stakeholders may not be able to determine the quality of a study because the presented details are poorly documented, or because the authors and reviewers fundamentally disagree on core principles.
===General Approach for Research Critique===
Before initiating the rigorous scientific review of a study, we need to summarize the study meta-data (a structured template sketch follows this list), including:
* Authors: complete information about the study authors
* Affiliations: institutions, credentials, etc.
* Title: informative title of the study
* Venue: including who solicited the study (if applicable), the funding mechanisms, potential COIs, and prior peer/external reviews
* Date(s): RFA dates, study dates, and other time-relevant information about the study
* DOI/URL: digital object identifiers, RFAs/RFPs, and other appropriate web references
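This meta-data summary can also be kept in a structured, machine-readable form so that critiques remain comparable across studies. The following is a minimal Python sketch of such a record; the <code>StudyMetadata</code> class and its field names are a hypothetical template mirroring the checklist above, not a prescribed schema, and all field values shown are placeholders.

<syntaxhighlight lang="python">
from dataclasses import dataclass, field

@dataclass
class StudyMetadata:
    """Hypothetical template mirroring the meta-data checklist above."""
    authors: list[str]       # complete author information
    affiliations: list[str]  # institutions, credentials, etc.
    title: str               # informative study title
    venue: str               # who solicited the study, funding mechanism, prior reviews
    conflicts_of_interest: str                            # declared or potential COIs
    dates: dict[str, str] = field(default_factory=dict)  # e.g., {"RFA": "...", "study": "..."}
    doi_or_url: str = ""     # DOI, RFA/RFP, or other web reference

# Placeholder record (all values are illustrative, not real study data)
record = StudyMetadata(
    authors=["A. Investigator"],
    affiliations=["Example University"],
    title="Example intervention study",
    venue="Example Journal; funded by Example Agency; externally peer reviewed",
    conflicts_of_interest="none declared",
    dates={"study": "2013-2014"},
)
print(record.title)
</syntaxhighlight>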
+ | |||
====Research Critique====
* Introduction - Background and Rationale
* Hypothesis/Key Questions
* Objectives/Goals
* Data (preliminary, prospective, retrospective, selection of participants, sources, numbers, PHI/confidentiality/consent, access, etc.)
* Study Methods
* Risk-Benefit Analysis (if appropriate)
* Statistical Considerations (power/sample-sizes, strategy/period of data collection, assumptions, blinding, techniques, etc.; see the power-analysis sketch after this list)
* Strengths and weaknesses
* Suggestions for improvements
* Significance, innovation and broader impacts
* Appendix (potential COI, open-science principles, policy implications, ethics, confidentiality, potential for misuse, etc.)
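To make the ''Statistical Considerations'' item concrete, the sketch below shows one common power/sample-size computation, for a two-sample t-test, using the <code>statsmodels</code> Python library. The effect size, significance level, and power values are illustrative assumptions only; a reviewer would check whether the study justifies its own choices for these quantities.

<syntaxhighlight lang="python">
from statsmodels.stats.power import TTestIndPower

# Illustrative assumptions (not recommendations)
effect_size = 0.5   # standardized mean difference (Cohen's d) the study aims to detect
alpha = 0.05        # two-sided significance level
power = 0.80        # desired probability of detecting the effect if it is real

# Solve for the per-group sample size of a two-sample t-test
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=effect_size, alpha=alpha, power=power)
print(f"required sample size per group: {n_per_group:.1f}")  # about 64 per group
</syntaxhighlight>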
+ | |||
===Applications===
Caution should be exercised when collecting, analyzing, interpreting and critiquing large and complex datasets.
* Avoid confusing ''research rigor'' with statistical computing concepts like ''measurement precision'', quantification, and generalizability. The latter are choices each investigator makes in determining how best to meet the project's research objectives; they are not inherently desirable in and of themselves.
* Remember that claims that some data collection or analysis techniques are better (e.g., ''more'' rigorous) than others are always relative. All techniques are tools in a researcher's toolbox, and ''a saw is better than a hammer because it is sharper'' does not make sense outside the scope of a specific study.
* Avoid over-prescribing ''standards'' for specific methodological techniques that share a common set of core properties but may include a range of variations and nuances. Often, the power of a research technique lies in its adaptability to multiple research situations; truncating the variability around a technique only makes the tool less useful.
* Avoid linking specific techniques to specific research goals. As tools, methods are a means to an end, and such means can be adapted to serve many different goals. For example, paired comparisons could be used to explore, describe, compare, or test hypotheses (a sketch follows this list).
* Avoid associating standards and rigor only with confirmatory and hypothesis-driven research. Standards of rigor can also be set for exploratory and descriptive research; some criteria will vary based on specific research objectives, while others will cut across all types of research.
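As a toy illustration of the last two points, the Python sketch below uses the same paired-comparison design both descriptively (summarizing and standardizing the paired differences) and confirmatorily (a paired t-test). The data are simulated purely for illustration; no real measurements are implied.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Simulate pre/post measurements for 20 participants (illustration only)
rng = np.random.default_rng(42)
pre = rng.normal(loc=120, scale=10, size=20)      # e.g., a baseline measurement
post = pre - rng.normal(loc=5, scale=4, size=20)  # hypothetical change after intervention
diff = pre - post

# Descriptive/exploratory use: summarize the paired differences
print(f"mean difference = {diff.mean():.2f}, SD = {diff.std(ddof=1):.2f}")
print(f"Cohen's d (paired) = {diff.mean() / diff.std(ddof=1):.2f}")

# Confirmatory use: paired t-test of H0 that the mean difference is zero
t_stat, p_val = stats.ttest_rel(pre, post)
print(f"paired t = {t_stat:.2f}, p = {p_val:.4f}")
</syntaxhighlight>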
+ | |||
===Problems===
Select a research article published in a major journal in the past 18 months, a recently submitted research proposal, or a study design protocol, and generate a detailed critique of the study.
+ | |||
===References===
* [http://www.iom.edu/Reports/2011/Finding-What-Works-in-Health-Care-Standards-for-Systematic-Reviews/Standards.aspx IOM Standards for Systematic Reviews]
* [http://www.pcori.org/assets/PCORI-National-Priorities-and-Research-Agenda-2012-05-21-FINAL1.pdf Patient-Centered Outcomes Research Institute (PCORI) National Priorities and Research Agenda] (PCORI was created by the Patient Protection and Affordable Care Act of 2010)
* SOCR Home page: http://www.socr.umich.edu