Scientific Methods for Health Sciences - Study and Research Critiques
Overview
The scientific rigor of published literature, grant proposals, and reports needs to be assessed and scrutinized to minimize errors in data extraction and meta-analysis. Reporting biases present significant obstacles to collecting relevant information on the effectiveness of an intervention, the strength of relations between variables, or causal associations. Research findings are important to individual decision making because they reveal the benefits, harms, or lack of effectiveness of new and established health interventions. Thus, a systematic review can flag studies, projects, or data that may not have been appropriately acquired, analyzed, or interpreted.
Motivation
Identifying relevant research data is challenging. There is evidence that published data may not represent all of the findings on an intervention’s effectiveness. There is an intrinsic bias toward publishing positive findings rather than null or negative results. In some cases, stakeholders may not be able to determine the quality of a study because the presented details are poorly documented, or because the authors and reviewers fundamentally disagree on core principles.
General Approach for Research Critique
Before initiating the rigorous scientific review of a study, we need to summarize the study meta-data (a minimal sketch of such a record follows this list), including:
- Authors: complete information about the study authors
- Affiliations: institutions, credentials, etc.
- Title: informative title of the study
- Venue: including who solicited the study (if applicable), the funding mechanisms, potential conflicts of interest (COIs), and prior peer/external reviews
- Date(s): RFA dates, study dates, other time-relevant information about the study
- DOI/URL: digital object identifiers, RFAs/RFPs, and other appropriate web references
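To make this meta-data summary concrete, the following is a minimal sketch of how the fields might be recorded programmatically; the StudyMetadata class and all field names and values are illustrative assumptions, not a prescribed format.

    # Illustrative container for the meta-data collected before a critique (hypothetical field names).
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class StudyMetadata:
        authors: List[str]                # complete information about the study authors
        affiliations: List[str]           # institutions, credentials, etc.
        title: str                        # informative title of the study
        venue: str                        # journal, funder, or soliciting body; funding/COI notes
        dates: Optional[str] = None       # RFA dates, study dates, other time-relevant information
        doi_or_url: Optional[str] = None  # DOI, RFA/RFP, or other web reference

    # Example record with made-up values
    example = StudyMetadata(
        authors=["A. Author", "B. Author"],
        affiliations=["University of Michigan"],
        title="Hypothetical intervention effectiveness study",
        venue="Hypothetical Journal of Health Sciences",
        doi_or_url="https://doi.org/10.0000/example",
    )
    print(example.title)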
Research Critique
- Introduction - Background and Rationale
- Hypothesis/Key Questions
- Objectives/Goals
- Data (preliminary, prospective, or retrospective data; selection of participants, sources, numbers, PHI/confidentiality/consent, access, etc.)
- Study Methods
- Risk-Benefit Analysis (if appropriate)
- Statistical Considerations (power/sample-sizes, strategy/period of data collection, assumptions, blinding, techniques, etc.; see the power-analysis sketch after this list)
- Strengths and weaknesses
- Suggestions for improvements
- Significance, innovation and broader impacts
- Appendix (potential COI, open-science principles, policy implications, ethics, confidentiality, potential for misuse, etc.)
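As one concrete illustration of the Statistical Considerations item above, the sketch below shows how a reviewer might check whether a study's reported sample size is consistent with its stated power. The effect size, significance level, and power targets are hypothetical assumptions, not values taken from any particular study.

    # Minimal power/sample-size check for a two-group comparison (hypothetical inputs),
    # assuming a standardized (Cohen's d) effect size of 0.5, a 5% two-sided alpha,
    # and 80% target power.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=0.5,  # hypothetical Cohen's d
                                       alpha=0.05,       # two-sided significance level
                                       power=0.80)       # target power
    print(f"Required sample size per group: {n_per_group:.1f}")

    # Conversely, the power achieved with the sample size the study actually reports (here, 40 per group):
    achieved = analysis.power(effect_size=0.5, nobs1=40, alpha=0.05)
    print(f"Power with 40 subjects per group: {achieved:.2f}")

A large mismatch between the reported sample size and the size implied by the stated power assumptions is one of the weaknesses a critique should flag.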
Applications
Caution should be exercised when collecting, analyzing, interpreting, and critiquing large and complex datasets.
- Avoid confusing research rigor with statistical and computational concepts such as measurement precision, quantification, and generalizability. The latter are choices each investigator makes in determining how best to meet the project's research objectives, not qualities to be pursued as ends in themselves.
- Claims that some data collection or analysis techniques are better (e.g., more rigorous) than others are always relative. All techniques are tools in a researcher’s toolbox, and a statement like "a saw is better than a hammer because it is sharper" does not really make sense outside the scope of a particular study.
- Avoid over-prescribing standards for specific methodological techniques that share a common set of core properties but may include a range of variations and nuances. Often, the power of a research technique lies in its ability to be adapted to multiple research applications; truncating the variability around a technique may make the tool less useful.
- As tools and methods are a means to an end, avoid trying to link specific techniques to specific research goals. For example, paired comparisons could be used to explore, describe, compare, or test a wide variety of hypotheses (see the paired-comparison sketch after this list).
- Standards and rigorous methods are not only associated with confirmatory and hypothesis-driven research; they can also be employed in exploratory and descriptive research studies.
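To illustrate the paired-comparison example from the list above, here is a minimal sketch applying a paired t-test and its nonparametric counterpart to hypothetical pre/post measurements; all data values are made up for illustration.

    # Paired comparison of hypothetical pre- and post-intervention measurements.
    # Depending on how the question is framed, the same technique can serve
    # descriptive, exploratory, or confirmatory goals.
    import numpy as np
    from scipy import stats

    pre  = np.array([120, 135, 128, 141, 130, 125, 138, 133])  # hypothetical baseline values
    post = np.array([115, 131, 126, 135, 127, 118, 130, 132])  # hypothetical follow-up values

    # Descriptive summary of the paired differences
    diff = post - pre
    print(f"Mean change: {diff.mean():.2f} (SD {diff.std(ddof=1):.2f})")

    # Confirmatory test: paired t-test
    t_stat, p_val = stats.ttest_rel(post, pre)
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_val:.3f}")

    # Nonparametric alternative if normality of the differences is doubtful
    w_stat, w_p = stats.wilcoxon(post, pre)
    print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.3f}")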
Problems
Select a research article published in a major journal in the past 18 months (e.g., located via Google Scholar, http://scholar.google.com/), a recently submitted research proposal, or a study design protocol, and generate a detailed critique of the study.