Original Investigation

Web-Based Education in Systems-Based Practice: A Randomized Trial

B. Price Kerfoot, MD, EdM; Paul R. Conlin, MD; Thomas Travison, PhD; Graham T. McMahon, MD, MMSc

Author Affiliations: Urology Section (Dr Kerfoot) and Endocrinology Section (Drs Conlin and McMahon), Veterans Affairs Boston Healthcare System, Harvard Medical School (Drs Kerfoot, Conlin, and McMahon), and Endocrinology, Diabetes, and Hypertension Division (Drs Conlin and McMahon), Brigham & Women's Hospital, Boston, Mass; and New England Research Institutes, Watertown, Mass (Dr Travison).


Arch Intern Med. 2007;167(4):361-366. doi:10.1001/archinte.167.4.361.

Background  All accredited US residency programs are expected to offer curricula and evaluate their residents in 6 general competencies. Medical schools are now adopting similar competency frameworks. We investigated whether a Web-based program could effectively teach and assess elements of systems-based practice.

Methods  We enrolled 276 medical students and 417 residents in the fields of surgery, medicine, obstetrics-gynecology, and emergency medicine in a 9-week randomized, controlled, crossover educational trial. Participants were asked to sequentially complete validated Web-based modules on patient safety and the US health care system. The primary outcome measure was performance on a 26-item validated online test administered before, between, and after the participants completed the modules.

Results  Six hundred forty (92.4%) of the 693 enrollees participated in the study; 512 (80.0%) of the participants completed all 3 tests. Participants' test scores improved significantly after completion of the first module (P<.001). Overall learning from the 9-week Web-based program, as measured by the increase in scores (posttest scores minus pretest scores), was 16 percentage points (95% confidence interval, 14-17 percentage points; P<.001) in patient safety topics and 22 percentage points (95% confidence interval, 20-23 percentage points; P<.001) in US health care system topics.

Conclusion  A Web-based educational program on systems-based practice competencies generated significant and durable learning across a broad range of medical students and residents.


The Accreditation Council for Graduate Medical Education (ACGME) now requires that all US residency programs teach and assess their residents in each of 6 general competencies.1,2 Many medical schools are now adopting similar competency frameworks.3-7 The competency of systems-based practice requires awareness of and responsiveness to the larger context and system of health care and the ability to effectively call on system resources to provide care that is of optimal value.1 Topics that fall under the domain of systems-based practice include patient safety, health policy, structure of the health care system, health care access, and health care quality. This competency is one of the most challenging to teach and assess because few faculty members have specific expertise in this area, it has traditionally received little attention in medical education, and few valid and reliable assessment tools on this topic have been developed.8,9

Evidence is accumulating that Web-based teaching can be an effective pedagogical tool for delivering and evaluating curricular content across multiple institutions and levels of training.10-12 As computers with high-speed Internet access have become ubiquitous in the clinic and at home, dispersed residents and students can easily access Web-based materials regardless of their location and at times that do not conflict with their clinical responsibilities or duty-hour restrictions.13,14 In this study, we conducted a multi-institutional randomized, controlled trial to investigate whether a Web-based program could effectively teach and assess elements of systems-based practice in medical students and residents.

STUDY PARTICIPANTS

The institutional review board at Harvard Medical School reviewed and approved the protocol. Residents and students (417 and 276, respectively) from 7 Harvard-affiliated residencies and 2 Harvard Medical School courses were enrolled in the Web-based program between August and October 2005 (Table). All program directors agreed to have their residents and students participate. Participants were free to withhold their data from the research data set.

Table. Characteristics of the 693 Randomized Enrollees*
DEVELOPMENT OF A VALIDATED CURRICULUM

Web-based educational materials developed by the Risk Management Foundation (RMF) and the Kaiser Family Foundation (KFF) were selected on the basis of their curricular relevance to the competency of systems-based practice and their perceived educational value for students and residents; each granted permission for the use of their content. Three Web-based educational modules from the RMF were selected to cover topics in patient safety, error prevention, and systems theory. This material is delivered using interactive Web pages that include multiple-choice questions (with answers and explanations), short audio and video clips, and simple animations. Each module takes an average of 25 to 35 minutes to complete. Four Web-based educational modules from the KFF were selected to address topics pertaining to the structure of the US health care system: Medicare, Medicaid, women's health policy, and the new prescription drug benefit. These modules are delivered online as narrated slide presentations and take 12 to 17 minutes each to complete.15 The content validity of the materials was established by 2 RMF and 4 KFF content experts.

DEVELOPMENT OF A VALIDATED TEST INSTRUMENT

Two investigators (B.P.K. and G.T.M.) developed a provisional set of 33 multiple-choice questions (16 on patient safety and 17 on the US health care system) based on the curricular content in the Web-based modules. A panel of content experts at RMF and KFF established content validity of the test items. To determine the psychometric properties of the validated test questions, the 33 items were pilot tested with a group of 18 medical students (years 2-4) and 16 medical residents (postgraduate years 1-3). Point-biserial correlation and Kuder-Richardson 20 calculations were performed for each test item (Integrity Software; Castle Rock Research Corporation, Edmonton, Alberta), and 7 poorly performing items were eliminated to optimize the reliability of the instrument. The resulting validated test contained 26 items: 14 questions on patient safety (PS) and 12 questions on the structure of the US health care system (HS).
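For illustration, the item statistics named above can be computed as in the minimal sketch below, which assumes the pilot responses are arranged as a binary examinee-by-item matrix. The study itself used commercial item-analysis software; the function and variable names here are ours, and the use of a rest-of-test (corrected item-total) score for the point-biserial correlation is one common convention.

```python
import numpy as np

def item_analysis(responses):
    """Item statistics for a 0/1 response matrix (rows = examinees, columns = items):
    the point-biserial correlation of each item with the rest-of-test score and
    the Kuder-Richardson 20 (KR-20) reliability of the full test."""
    n_examinees, n_items = responses.shape
    totals = responses.sum(axis=1)

    # Corrected item-total (point-biserial) correlation for each item.
    point_biserial = np.array([
        np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
        for j in range(n_items)
    ])

    # KR-20 = k/(k-1) * (1 - sum(p_j * q_j) / variance of total scores).
    p = responses.mean(axis=0)
    kr20 = (n_items / (n_items - 1)) * (1.0 - (p * (1.0 - p)).sum() / totals.var(ddof=1))
    return point_biserial, kr20

# Hypothetical pilot data: 34 examinees x 33 provisional items, simulated at random.
rng = np.random.default_rng(0)
pilot = (rng.random((34, 33)) < 0.6).astype(int)
rpb, reliability = item_analysis(pilot)
print(rpb.round(2), round(reliability, 2))
```

In practice, items with low or negative point-biserial correlations would be candidates for removal, and KR-20 would be recomputed on the shortened instrument.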

STUDY OBJECTIVES AND DESIGN

The objectives of this study were to determine whether a Web-based program could effectively teach and assess the competency of systems-based practice, and whether medical students and residents would perceive this online educational program as acceptable and appropriate. The program was constructed with a randomized crossover design to provide an effective control group for measuring initial and overall learning, to estimate the test-retest reliability of the test instrument, and to measure the medium-term retention of the educational content (Figure 1). The curriculum was offered during a 9-week period and comprised 2 sets of topic-specific modules surrounded by a pretest, midtest, and posttest. A pretest covering both topic areas was administered during week 1. During weeks 2 through 4, the first set of Web-based educational materials was distributed (PS materials to group 1 and HS materials to group 2). At week 5, participants completed a midtest covering both topics. During weeks 6 through 8, the remaining set of modules was distributed. At week 9, participants completed the posttest covering both topics. Participants were randomized to the order in which they received the educational modules. Completion of the entire Web-based program of tests and educational modules required 3 to 4 hours.

Figure 1. Structure of the 9-week randomized, controlled trial (crossover design). Each test assessed knowledge of both topic areas.

Hyperlinks to the online tests and educational modules were distributed to the residents and students via weekly e-mail messages. The test questions were administered online and the test responses were collected online using the SurveyMonkey platform (SurveyMonkey.com, Portland, Ore). The test items and their order on the pretest, midtest, and posttest were identical to increase the reliability of the instrument. Explanations for the test answers were provided after submission of the posttest. Participants self-reported their time spent with the educational materials.

OUTCOME MEASURES

Primary outcome measures were the change in the test scores in each of the 2 topic areas from the pretest to the midtest (initial learning), the stability of the topic-specific test scores between the midtest and the posttest (retention), and the overall educational efficacy as judged by the change in score from the pretest to the posttest (overall learning). The mean differences in scores of those in the control group were subtracted from those in the intervention group to calculate the improvement in scores attributable to the Web-based curriculum on that topic.
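Expressed symbolically (notation is ours, not the authors'), the control adjustment described above is a difference of mean changes:

\[
\Delta_{\text{attributable}} \;=\; \left(\bar{S}^{\,\text{mid}}_{\text{int}} - \bar{S}^{\,\text{pre}}_{\text{int}}\right) \;-\; \left(\bar{S}^{\,\text{mid}}_{\text{ctrl}} - \bar{S}^{\,\text{pre}}_{\text{ctrl}}\right),
\]

where \(\bar{S}\) denotes the mean topic-specific test score, "int" the group that received the modules on that topic during weeks 2 through 4, and "ctrl" the group that had not yet received them.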

Secondary outcome measures included the subjective perceptions of the residents and students of the acceptability of the Web-based educational program and the appropriateness of the educational content for their level of training.

STATISTICAL METHODS

The reliability of the validated test instrument was determined with Cronbach α, a measure of internal consistency.16 In addition, the averaged 4-week test-retest reliability (indicating stability of the measurement over time) was estimated with a Spearman-Brown adjustment.17
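For reference, the standard definitions of these two statistics are shown below; the precise way the two 4-week correlations were combined is our assumption (averaging the pretest-midtest and midtest-posttest correlations before applying the adjustment).

\[
\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{i}}{\sigma^{2}_{\text{total}}}\right),
\qquad
r_{\text{SB}} \;=\; \frac{2\bar{r}}{1 + \bar{r}},
\qquad
\bar{r} \;=\; \tfrac{1}{2}\left(r_{\text{pre,mid}} + r_{\text{mid,post}}\right),
\]

where \(k\) is the number of test items, \(\sigma^{2}_{i}\) is the variance of item \(i\), \(\sigma^{2}_{\text{total}}\) is the variance of total scores, and \(\bar{r}\) is the averaged 4-week test-retest correlation.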

Although only 206 subjects would have been required for a 0.9 power to detect a 10% difference in learning, we recruited a larger sample to assess the generalizability of the Web-based program across different institutions, specialties, and levels of training. Enrollees were stratified by program/course and year of training and then block randomized at a single time point by 1 investigator (B.P.K.) between 2 groups. Program directors were blinded to group assignment. Participation in the study was defined as submission of baseline data and completion of the pretest. To allow a conservative intention-to-treat analysis, these baseline data on participants were carried forward, if needed, to impute any missing midtest and/or posttest data; this fixed gains in knowledge at zero for those subjects who did not complete the midtest or posttest. Completion of the Web-based program was defined as submission of all 3 tests.
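As a rough sketch of the allocation and imputation scheme described above (the block size, data structures, and field names are our assumptions; the actual randomization was performed by a single investigator and may have differed in its details):

```python
import random
from collections import defaultdict

def block_randomize(enrollees, block_size=4, seed=2005):
    """Stratify enrollees by (program, training year), then assign them to
    group 1 or group 2 within fixed-size blocks so the two groups stay
    balanced inside every stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for person in enrollees:
        strata[(person["program"], person["year"])].append(person)

    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        for start in range(0, len(members), block_size):
            block = members[start:start + block_size]
            labels = ([1, 2] * ((len(block) // 2) + 1))[:len(block)]
            rng.shuffle(labels)
            for person, group in zip(block, labels):
                assignment[person["id"]] = group
    return assignment

def carry_baseline_forward(scores):
    """Intention-to-treat imputation: a missing midtest or posttest score is
    replaced by the pretest score, which fixes that participant's measured
    knowledge gain at zero."""
    pre = scores["pretest"]
    mid = scores.get("midtest")
    post = scores.get("posttest")
    return {
        "pretest": pre,
        "midtest": mid if mid is not None else pre,
        "posttest": post if post is not None else pre,
    }

# Hypothetical usage:
enrollees = [
    {"id": 1, "program": "medicine", "year": "PGY-1"},
    {"id": 2, "program": "medicine", "year": "PGY-1"},
    {"id": 3, "program": "surgery", "year": "PGY-2"},
    {"id": 4, "program": "surgery", "year": "PGY-2"},
]
print(block_randomize(enrollees))
print(carry_baseline_forward({"pretest": 58.0, "midtest": None}))
```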

Overall and topic-specific test scores were normalized to a percentage scale, with a minimum score of 0% and a maximum of 100%. Two-tailed t tests and Wilcoxon rank sum tests were used to determine the statistical significance of changes in learning. Intervention effect sizes for learning were measured by means of Cohen d, which was calculated by dividing mean scores or score increases by pooled standard deviations.16 Cohen d expresses the difference between the means in terms of standard deviation units, with 0.2 generally considered a small effect, 0.4 considered a moderate effect, and 0.8 considered a large effect.18 Because the intention-to-treat structure might bias in favor of improved retention by including participants with imputed midtest scores, a secondary analysis of retention was performed in the subset of participants who completed both the midtest and the posttest. Potential associations between treatment effects and subject characteristics were examined by graphical and tabular exploration and formally assessed by multiple linear regression analyses. Statistical calculations were performed with Stata 9.0 (StataCorp, College Station, Tex) and SPSS for Windows 13.0 (SPSS Inc, Chicago, Ill) statistical software.
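The effect-size calculation follows the standard pooled-standard-deviation form of Cohen d; the exact pooling used in the study is not specified, so the conventional definition is shown here:

\[
d \;=\; \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} \;=\; \sqrt{\frac{(n_1 - 1)s_1^{2} + (n_2 - 1)s_2^{2}}{n_1 + n_2 - 2}},
\]

where \(\bar{x}_1\) and \(\bar{x}_2\) are the group means (or mean score increases) being compared and \(s_1\), \(s_2\) their standard deviations.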

The 693 enrolled residents and students who were randomized to groups 1 and 2 (344 and 349 enrollees, respectively) were similar with respect to a wide range of characteristics (Table). Six hundred forty (92.4%) of the enrollees participated in the study, and 512 (80.0%) of the participants completed all 3 tests. Of the participants, 128 (20.0%) did not complete the midtest (99 [15.5%]) and/or the posttest (70 [10.9%]). Dropout rates for the different groups were comparable (Figure 2). No enrollees in group 1 and two enrollees in group 2 elected to remove their data from the research data set and were included in those designated as having declined to participate. Under an intention-to-treat analysis, data for all 640 initial participants were included in the test-score analyses.

Figure 2. CONSORT flowchart of the randomized controlled trial. SBP indicates systems-based practice.

The Cronbach α for the 26-item online test instrument was 0.76 (posttest). Four-week test-retest reliability was 0.63.

Of 539 responding participants, 453 (84.0%) completed at least 1 of the PS modules, and 372 (69.0%) completed all 3. Of 553 responding participants, 473 (85.5%) completed at least 1 of the HS modules, and 357 (64.6%) completed all 4. Minimal crossover between groups was reported: 52 (9.6%) of 541 responding participants reported completing 1 or more of the alternate modules before taking the midtest.

INITIAL LEARNING

Mean (SD) pretest scores were 58% (16%) and 47% (14%) in the PS and HS topics, respectively. Pretest scores were comparable between groups (Figure 3). Topic-specific midtest scores were significantly higher for those who received the educational intervention on that topic (P<.001 in each topic, 2-tailed t test). Wilcoxon rank sum tests, which are less susceptible to departures from normality (found to be mild in the data), produced identical results. Increases in initial learning attributable to the Web-based curricula were 14 (95% confidence interval [CI], 11-16) and 20 (95% CI, 17-23) percentage points in the PS and HS topic areas, representing relative increases of 24% and 43% over PS and HS pretest scores, respectively. These changes correspond to Cohen d effect sizes of 0.75 (95% CI, 0.59-0.91) and 0.95 (95% CI, 0.79-1.11). A multivariate regression model controlling for the effects of program type, year of training, degree, sex, specialty, and age produced no significant changes in intervention effects.
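As a worked check of the relative increases reported above, each attributable gain is divided by the corresponding mean pretest score:

\[
\frac{14}{58} \approx 24\% \ (\text{PS}),
\qquad
\frac{20}{47} \approx 43\% \ (\text{HS}).
\]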

Figure 3. Test results for each topic area during the Web-based program. Participants in group 1 received the educational materials on patient safety, followed by those on the structure of the US health care system; participants in group 2 received the materials in the reverse order. The statistical analyses represented in the plots are 2-tailed t tests between groups, with the error bars representing 95% confidence intervals.
RETENTION

Group 1 participants displayed strong retention of the PS curriculum, with no significant change in their topic-specific scores from midtest to posttest (mean change, 1% [95% CI, 0%-2%]; P = .10). A small but statistically significant decay was seen in the HS score in group 2 participants from midtest to posttest (mean change, −3% [95% CI, −5% to −2%]; P<.001), which represented a 4% relative decline in HS test scores. A secondary analysis of retention in the subset of participants who completed both the midtest and the posttest demonstrated similar results. Adjusting both models for participant covariates did not alter these retention findings.

OVERALL LEARNING

The overall mean increases in scores (posttest scores minus pretest scores) during the entire 9-week program were 16 (95% CI, 14-17; P<.001) and 22 (95% CI, 20-23; P<.001) percentage points in the PS and HS topic areas, representing relative increases of 28% and 47% over pretest scores, respectively. These changes correspond to effect sizes of 1.00 (95% CI, 0.84-1.16) and 1.22 (95% CI, 1.08-1.32), respectively. These results did not change in models that controlled for participant covariates.

ACCEPTABILITY AND APPROPRIATENESS OF THE WEB-BASED PROGRAM

Participants were asked to rate on a 5-point Likert-type scale the acceptability of the online program as a means of fulfilling their competency education requirements in systems-based practice (in the scale, 1 indicated not at all acceptable; 5, very acceptable). The participants' median rating was 3. When asked to similarly rate the degree to which the content in the online modules was appropriate to their level of training (on a 5-point Likert-type scale in which 1 indicated not at all appropriate; 5, very appropriate), the participants' median rating was 4 for both the PS and HS modules.

This multi-institutional randomized, controlled trial demonstrates that a Web-based program generated significant and durable learning in the competency of systems-based practice, one of the more challenging competency areas to teach and assess. Our results indicate that Web-based teaching can be an effective method for delivering and assessing curricular material of this type across a wide range of medical specialties, institutions, and levels of training.

Previous efforts to teach elements of systems-based practice have included in-house workshops,19 interdisciplinary learning groups,20 independent study projects,21 large-group collaborative projects,22 Web-broadcast workshops,23 large-group lectures,24 community projects,25 outcomes cards,26 simulations,27-29 and team competitions.29 The reports cited have been small or specialty-focused; they often fail to validate or assess the reliability of their evaluation instruments, and they infrequently use control groups to assess programmatic efficacy. In contrast, we tested the efficacy of a Web-based program across a wide spectrum of trainees, used a carefully designed and tested evaluation instrument, and set up a controlled crossover design to assess the initial learning attributable to each module and the retention of that learning. In this setting we showed significant and durable increases in learning.

No definitive answer is available as to whether a Web-based program to teach and assess systems-based practice is “good enough” to fulfill the ACGME competency requirements because valid standards for competency in this domain have yet to be established. Ideally, these competency standards would be defined and assessed on the basis of actual trainee practice within relevant systems. We have not set a competency standard in our knowledge-based test, nor do we suggest that performance in such a test should be sufficient for competency. Nevertheless, this study demonstrates that knowledge deemed important by content experts in the field can be effectively learned by using Web-based education. This model could be readily implemented across a broad range of programs and institutions.

Participants rated the acceptability of the Web-based program as neutral and rated the content as appropriate. This neutral level of acceptance may reflect a cultural rift between the competency-based educational agenda promoted by the ACGME and the pressing educational needs as perceived by the trainees. Until trainees appreciate the clinical relevance of systems-based practice competencies, educational programs in this domain may be perceived as unwelcome training requirements.

Several factors should be considered in the interpretation of our findings. Learning outcomes assessed by multiple-choice questions cannot supplant practice-based measures of trainee performance. While the test's reliability precludes making high-stakes decisions based on trainees' scores, this reliability level is not unexpected given the brevity of the test (26 items) and the disparate nature of the topics covered. Although the participant dropout rate of 20% is well within acceptable standards for educational studies, the possibility of some degree of dropout bias cannot be excluded. Although the identical test was used at 3 time points to maintain assessment reliability, the inclusion of a control group enabled us to confirm that test score increases in the intervention groups were directly due to the educational program, not merely due to priming from prior knowledge of the test questions. Strengths of this study include the randomized, controlled study design; the methodologic rigor of the test development; and the inclusion of trainees from a wide range of specialties, institutions, and levels of training.

In summary, this multi-institutional randomized, controlled trial establishes the principle that Web-based programs can be effectively used across a wide range of medical specialties, institutions, and levels of training to generate substantial learning and retention in the competency of systems-based practice. Further work is needed to establish valid standards for competency in systems-based practice and to promote acceptance of competency education in this domain at the trainee level.

Correspondence: B. Price Kerfoot, MD, EdM, Veterans Affairs Boston Healthcare System, 150 S Huntington Ave, 151DIA, Jamaica Plain, MA 02130 (price.kerfoot@gmail.com).

Accepted for Publication: November 15, 2006.

Author Contributions: Dr Kerfoot had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Study concept and design: Kerfoot, Conlin, Travison, and McMahon. Acquisition of data: Kerfoot and McMahon. Analysis and interpretation of data: Kerfoot, Conlin, Travison, and McMahon. Drafting of the manuscript: Kerfoot and Travison. Critical revision of the manuscript for important intellectual content: Conlin and McMahon. Statistical analysis: Kerfoot and Travison. Obtained funding: Kerfoot and Conlin. Administrative, technical, and material support: Conlin, Travison, and McMahon. Study supervision: Conlin and McMahon.

Financial Disclosure: None reported.

Funding/Support: This study was supported by a grant from the RMF. Additional support was obtained from the Research Career Development Award Program and by research grants TEL-02-100 and IIR-04-045 from the Veterans Affairs Health Services Research & Development Service, grants from the American Urological Association Foundation and Astellas Pharma US, Inc, grants K24 DK63214 and R01 HL77234 from the National Institutes of Health, and grants from the Academy at Harvard Medical School.

Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the position and policy of the US government or the Department of Veterans Affairs. No official endorsement should be inferred.

Acknowledgment: We thank the RMF and the KFF for use of their Web-based educational materials; Robert B. Hanscom, JD, and Elizabeth G. Armstrong, PhD, for their support of the program; Lucean L. Leape, MD, and Saul N. Weingart, MD, for editing and content validation of the PS test items; Alina Salganicoff, PhD, Juliette Cubanski, MPP, MPH, Caya Lewis, MPH, Tricia Neuman, ScD, and Usha Ranji, MS, of the Kaiser Family Foundation for editing and content validation of the HS test items; Ronald A. Arky, MD, Stanley W. Ashley, MD, Christopher C. Baker, MD, Eugene Beresin, MD, Lori R. Berkowitz, MD, Charlie M. Fergusen, MD, Joel T. Katz, MD, Hope A. Riccotti, MD, William Taylor, MD, and Carrie D. Tibbles, MD, for including their programs and courses in the Web-based program; Daniel D. Federman, MD, for support in the conception of the program and assistance in its financial administration; and Susan Herlihy, Jessica E. Hyde, and Colleen E. Graham for administrative support.

References

1. Accreditation Council for Graduate Medical Education. Outcome Project. http://www.acgme.org/Outcome/. Accessed June 22, 2006.
2. Leach DC. A model for GME: shifting from process to outcomes: a progress report from the Accreditation Council for Graduate Medical Education. Med Educ. 2004;38:12-14.
3. Indiana University School of Medicine. Competency curriculum. http://meded.iusm.iu.edu/Programs/ComptCurriculum.htm. Accessed March 13, 2006.
4. Wayne State University School of Medicine. Medical school competencies and institutional learning objectives. http://www.med.wayne.edu/educational_programs/form.asp. Accessed March 13, 2006.
5. Mercer University School of Medicine. Medical student competencies. http://medicine.mercer.edu/news?news_id=87. Accessed March 13, 2006.
6. Dartmouth Medical School. Essential standards for matriculation, promotion, and graduation. http://dms.dartmouth.edu/admin/olads/esmpg.shtml. Accessed March 13, 2006.
7. University of California, San Francisco, School of Medicine. Competencies and outcome learning objectives for the doctor of medicine program. http://medschool.ucsf.edu/curriculum/outcome_objs.aspx. Accessed March 13, 2006.
8. Kerfoot BP, Conlin PR, McMahon GT. Health systems knowledge and its determinants in medical trainees. Med Educ. 2006;40:1132.
9. Heard JK, Allen RM, Clardy J. Assessing the needs of residency program directors to meet the ACGME general competencies. Acad Med. 2002;77:750.
10. Kerfoot BP, Baker H, Jackson TL, et al. A multi-institutional randomized controlled trial of adjuvant Web-based teaching to medical students. Acad Med. 2006;81:224-230.
11. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of E-learning in medical education. Acad Med. 2006;81:207-212.
12. Kerfoot BP, Conlin PR, McMahon GT. Comparison of delivery modes for online medical education. Med Educ. 2006;40:1137-1138.
13. Kerfoot BP, Nabha KS, Hafler JP. Designing a surgical "resident-as-teacher" programme. Med Educ. 2004;38:1190.
14. Accreditation Council for Graduate Medical Education. Outcome Project: frequently asked questions. http://www.acgme.org/outcome/about/faq.asp. Accessed March 7, 2006.
15. The Henry J. Kaiser Family Foundation. Tutorials. http://www.kaiseredu.org/tutorials_index.asp. Accessed March 13, 2006.
16. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988.
17. Aiken LR. Psychological Testing and Assessment. 10th ed. Boston, Mass: Allyn & Bacon; 2000.
18. Maxwell SE, Delaney HD. Designing Experiments and Analyzing Data: A Model Comparison Approach. Belmont, Calif: Wadsworth; 1990.
19. David RA, Reich LM. The creation and evaluation of a systems-based practice/managed care curriculum in a primary care internal medicine residency program. Mt Sinai J Med. 2005;72:296-299.
20. Dickey J, Girard DE, Geheb MA, Cassel CK. Using systems-based practice to integrate education and clinical services. Med Teach. 2004;26:428-434.
21. Allen E, Zerzan J, Choo C, Shenson D, Saha S. Teaching systems-based practice to residents by using independent study projects. Acad Med. 2005;80:125-128.
22. Frey K, Edwards F, Altman K, Spahr N, Gorman RS. The "Collaborative Care" curriculum: an educational model addressing key ACGME core competencies in primary care residency training. Med Educ. 2003;37:786-789.
23. Bortot AT, Culberson JW, Modak I, Becan-McBride K, Nieman LZ. Assessment of competencies in physicians-in-training through the delivery of a community-based health curriculum using distance learning. Med Teach. 2004;26:615-620.
24. Boltri JM, Vogel RL, Tindall WN, Tedders SH. Improving medical students' knowledge of systems-based practice. Manag Care Q. 2002;10:29-32.
25. Doezema D, McLaughlin S, Sklar DP. An approach to fulfilling the systems-based practice competency requirement. Acad Emerg Med. 2002;9:1355-1359.
26. Tomolo A, Caron A, Perz ML, Fultz T, Aron DC. The outcomes card: development of a systems-based practice educational tool. J Gen Intern Med. 2005;20:769-771.
27. Voss JD, Nadkarni MM, Schectman JM. The Clinical Health Economics System Simulation (CHESS): a teaching tool for systems- and practice-based learning. Acad Med. 2005;80:129-134.
28. Wang EE, Vozenilek JA. Addressing the systems-based practice core competency: a simulation-based curriculum. Acad Emerg Med. 2005;12:1191-1194.
29. Johnson AW, Potthoff SJ, Carranza L, Swenson HM, Platt CR, Rathbun JR. CLARION: a novel interprofessional approach to health care education. Acad Med. 2006;81:252-256.
