Original Investigation

Multicenter Implementation of a Shared Graduate Medical Education Resource

Stephen D. Sisson, MD; Darius A. Rastegar, MD; Tasha N. Rice; Mark T. Hughes, MD

Author Affiliations: Division of General Internal Medicine, Department of Medicine, The Johns Hopkins University School of Medicine, Baltimore, Maryland.


Arch Intern Med. 2007;167(22):2476-2480. doi:10.1001/archinte.167.22.2476.

Background  The Accreditation Council for Graduate Medical Education (ACGME) is changing residency program assessment to include education outcomes assessment, challenging the resources of residency training programs. The Internet is a means of sharing education resources among training programs.

Methods  A multicenter survey was distributed to leaders of 80 internal medicine residency training programs that shared an online medical knowledge curriculum that included education outcomes assessment. Program characteristics, curriculum implementation methods, and use of educational outcome assessment were analyzed to determine how implementation differed among programs.

Results  Seventy-four programs (92%) completed the survey. The programs vary in medical school affiliation, number of house staff, and proportion of house staff who specialize on graduation. They most commonly use the curriculum to augment a preexisting curriculum (37 programs [50%]); 41 programs (56%) use the curriculum to comply with ACGME requirements. The programs differ in how they adapt the curriculum to their needs, most commonly by discussing modules with house staff (47 programs [63%]). In 61 programs (82%), module completion is mandatory. Thirty-five programs (47%) use penalties to encourage module completion, most commonly poor evaluation scores (15 programs [20%]) or withholding of promotion (12 programs [16%]). Nearly all programs (71 [97%]) track module completion; 34 programs (47%) track group performance on learning objectives; and 8 programs (11%) alter their educational curriculum based on group performance.

Conclusions  A medical knowledge curriculum that includes education outcome assessment can be adapted at a range of residency training programs, helping them to comply with ACGME requirements. However, most residency training programs are not using outcomes data to their full potential.

Graduate medical education is increasingly focused on the assessment and measurement of educational outcomes. The Accreditation Council for Graduate Medical Education (ACGME) is changing residency training program assessment from process evaluation to education outcomes assessment, known as the ACGME Outcome Project.1 Because of the effort required to enact educational outcome assessment envisioned by the ACGME, sharing of resources across institutions is needed.2 Anticipating this need, the ACGME has created an evaluation “toolbox” as a resource for residency training programs, along with guidelines for selecting and implementing assessment systems.3 Residency training programs can contribute to this process by sharing educational resources, including curricula and assessment tools.4

Implementation of the ACGME Outcome Project began in July 2006 and is expected to be complete by June 2011. Currently, the only educational outcome measure used by the ACGME in internal medicine program evaluation is aggregate pass rate data of program graduates on the American Board of Internal Medicine certification examination.2 When fully implemented, all residency training programs will be required to demonstrate educational outcome assessments of the 6 core competencies.1 Measurement of educational outcomes is challenging and will require residency training programs to commit significant resources.2,5-8 A combination of assessment methods will be needed to provide valid and reliable results to residency training programs and to accreditation organizations such as the ACGME.9 The rising demands on graduate medical education are occurring at a time when academic health centers are challenged to find the resources to support the faculty time that is required to develop educational curricula and to implement educational outcome assessment projects.10

The Division of General Internal Medicine at The Johns Hopkins University, Baltimore, Maryland, has developed an online curriculum in ambulatory care that includes educational content as well as outcome measures.11 This online curriculum has been shown to improve knowledge outcomes for a number of clinical topics.12-15 The purpose of the present study is to describe how this curriculum has been implemented at other residency training programs, how sharing of educational content has been effectively achieved across programs, and how educational outcome measures have been used to improve training at multiple sites.

METHODS

In 1997 and 1998, a curriculum in ambulatory care medicine was developed using a 6-step approach to curriculum development.16 An average of 6 medical knowledge learning objectives was developed for each topic (“module”) in the curriculum. A case-based didactic was then written using best medical evidence as defined by national guidelines, position papers, and seminal studies. Pretests and posttests consisting of multiple-choice questions were written for each module by medical education experts using established principles of medical question writing to test baseline knowledge and change in knowledge after completion of didactics.17 Specialty experts were used to validate questions on several modules, and questions were piloted with medical house staff for clarity and equivalence of results between pretests and posttests. An additional set of questions on each module topic was written for an end-of-year posttest. Modules were updated annually based on evolving medical evidence and practice and feedback from learners, as well as on performance characteristics on the pretests and the posttests.

In 1999, a Web site was developed to distribute the curriculum (www.hopkinsilc.org). The Web site was marketed to internal medicine residency training programs in 2002 via a mailed brochure for an annual subscription fee of $1500. Content is password protected and requires registration and approval for access. Learners select modules from a menu and proceed by completing the pretest, followed by case-based didactics and then the posttest. Web site structure does not allow access to the didactics until the pretest is completed, and the posttest cannot be accessed until the didactics are completed. Credit for module completion occurs when the final posttest question is answered. While completing modules, learners are informed whether answer selections are correct or incorrect and, if incorrect, which answer choice is correct. Each learner can view his or her individual module scores and compare them with the aggregate scores of all other learners. The didactic section of a module can be printed on completion of the module.
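
The gating just described (pretest, then didactics, then posttest, with credit granted when the final posttest question is answered) amounts to a simple state machine. The following Python sketch illustrates that flow under stated assumptions; the class and method names are hypothetical and are not taken from the actual Web site code.

    from enum import Enum, auto

    class Stage(Enum):
        PRETEST = auto()
        DIDACTIC = auto()
        POSTTEST = auto()
        COMPLETE = auto()

    class ModuleSession:
        """Hypothetical sketch of the pretest -> didactic -> posttest gating."""

        def __init__(self, posttest_questions: int):
            self.stage = Stage.PRETEST
            self.remaining = posttest_questions

        def finish_pretest(self) -> None:
            if self.stage is Stage.PRETEST:
                self.stage = Stage.DIDACTIC  # didactics unlock only now

        def view_didactic(self) -> str:
            # Didactic content is locked until the pretest is completed.
            if self.stage is not Stage.DIDACTIC:
                raise PermissionError("complete the pretest first")
            return "case-based didactic content"

        def finish_didactic(self) -> None:
            if self.stage is Stage.DIDACTIC:
                self.stage = Stage.POSTTEST  # posttest unlocks only now

        def answer_posttest_question(self) -> None:
            if self.stage is not Stage.POSTTEST:
                raise PermissionError("complete the didactic first")
            self.remaining -= 1
            if self.remaining == 0:
                # Credit for module completion occurs when the final
                # posttest question is answered.
                self.stage = Stage.COMPLETE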

An enhanced level of access is available for program administrators and faculty (hereafter called administrators). Administrators can schedule the order of modules, view content without completing modules, and view aggregate performance data on pretests and posttests. They can also track which modules each learner has completed but cannot view each learner's individual module scores; they can, however, view individual scores on the end-of-year tests. Web site structure accommodates groups of individuals that function independently of one another. Module selection and scheduling can be customized by administrators in each group, who can track module completion by learners in their group, monitor their group's aggregate performance, and compare it with that of other groups.
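
As a rough sketch of this two-tier access model, the fragment below separates what a group administrator may see (module completion, aggregate performance, end-of-year scores) from what remains learner-only (individual module scores). All names are hypothetical; the actual data model is not described at this level of detail.

    from __future__ import annotations

    from dataclasses import dataclass, field

    @dataclass
    class LearnerRecord:
        name: str
        modules_completed: list[str] = field(default_factory=list)
        module_scores: dict[str, float] = field(default_factory=dict)  # learner-only
        end_of_year_score: float | None = None

    def administrator_view(record: LearnerRecord) -> dict:
        # Administrators can track which modules each learner completed and
        # view end-of-year test scores, but not individual module scores.
        return {
            "name": record.name,
            "modules_completed": record.modules_completed,
            "end_of_year_score": record.end_of_year_score,
        }

    def group_aggregate(records: list[LearnerRecord], module: str) -> float | None:
        # Aggregate (not individual) module performance is visible per group.
        scores = [r.module_scores[module] for r in records if module in r.module_scores]
        return sum(scores) / len(scores) if scores else None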

To understand the characteristics of residency programs that use an external Web-based curriculum and their reasons for doing so, residency programs that were current or past users of the ambulatory curriculum Web site were surveyed between January 2, 2007, and February 2, 2007. The Johns Hopkins Hospital and its affiliates were excluded from the study. One individual from each residency program who had acted as an administrator of the ambulatory curriculum was identified to answer questions on program characteristics, methods of curriculum implementation, and tracking and use of educational outcomes. A combination of quantitative and qualitative questions was used in the survey. The survey was distributed using an online survey service; individual responses to survey questions were anonymous. Statistical analyses were performed with Stata software, version 8.2 (Stata Corp, College Station, Texas). Relationships between survey results were evaluated using the χ² test. All P values were 2-sided, and P < .05 was considered significant. Univariate and multivariate regression analyses were performed to examine the factors that would predict whether a program would make a curriculum mandatory and whether the curriculum would be used to satisfy ACGME requirements. Qualitative responses were independently coded by 2 of us (S.D.S. and M.T.H.); coding was then reviewed to reach consensus on major themes. This study was granted exemption from institutional review board approval, as survey participation was voluntary and anonymous and posed no more than minimal harm to participants.
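
As an illustration of the kind of χ² comparison reported in the results (for example, program director satisfaction by mandatory vs voluntary module completion), a minimal analysis might look like the following. The counts shown are purely illustrative and are not the study's data.

    from scipy.stats import chi2_contingency

    # Illustrative 2x2 contingency table (NOT the study's data):
    # rows = mandatory vs voluntary module completion,
    # columns = program director satisfied vs not satisfied.
    observed = [
        [30, 25],  # mandatory programs (hypothetical counts)
        [0, 12],   # voluntary programs (hypothetical counts)
    ]

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, P = {p:.4f}")  # 2-sided; P < .05 significant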

RESULTS

PROGRAM CHARACTERISTICS

The ambulatory curriculum has been used by 80 internal medicine residency training programs in 28 states and Washington, District of Columbia. Sixty-seven of these programs (84%) are currently using the curriculum, while 13 programs (16%) no longer use it. Seventy-four programs (64 current subscribers and 10 past subscribers) completed the survey on program characteristics and curriculum implementation (response rate, 93%; 96% of current subscribers and 77% of past subscribers).

The surveyed programs vary in medical school affiliation and program size (Figure 1). Thirty-three of the 74 programs (45%) are primary affiliates of medical schools, while 26 (35%) are secondary affiliates. Fifteen programs (20%) are loose affiliates of medical schools. Fifteen programs (20%) have fewer than 30 house officers (postgraduate years 1 through 3); 34 programs (46%) have between 30 and 60 house officers; 12 programs (16%) have 61 to 90 house officers; and 13 programs (18%) have more than 90 house officers. The programs also vary in the degree of specialization by house staff after residency: in 44 programs (60%), the majority of house officers specialize. Program characteristics do not differ between current and past subscribers.

Figure 1. Characteristics of internal medicine residency training programs that use the online curriculum.

CURRICULUM IMPLEMENTATION

The programs differ in how they use the online ambulatory curriculum (Figure 2). Twenty-two programs (30%) had no ambulatory curriculum and use the online curriculum to provide one; 15 programs (20%) use the online curriculum to replace their preexisting ambulatory curriculum; and 37 programs (50%) use the online curriculum to augment their current ambulatory curriculum. Forty-one programs (56%) use the curriculum to comply with ACGME requirements.

Figure 2. Role of the online curriculum in internal medicine residency training programs that use it.

LOCAL ADAPTATION OF CURRICULUM

The programs differ in how they adapt the curriculum to their own settings, as described below.

The majority of the programs (47 [64%]) discuss modules with house staff in some way, either formally or informally. Thirty-four programs (46%) discuss modules in clinic-related didactics. Ten programs (14%) formally review modules during noon conference lectures or as part of other nonclinic-associated lectures. Ten programs (14%) discuss modules during resident report or ambulatory block experiences. A small number of programs (5 [7%]) review modules with house staff during chart audits, in the computer laboratory, or as part of competency testing. Among current subscribers, whether or not modules were discussed did not influence satisfaction with the curriculum.

The programs also differ in whether module completion is mandatory, whether specific time is allocated for it, and how completion is enforced, as shown below.

Sixty-one programs (82%) require module completion by house staff as a mandatory component of training, with an average of 15 modules required annually (range, 3-36). Nineteen programs (26%) require module completion during a specific training experience (usually ambulatory block). Twenty-four programs (32%) block time from other duties specifically for module completion, a practice they use less frequently for other curricula. Thirteen programs (18%) use incentives to encourage module completion, most commonly awards or prizes (9 programs [12%]). Nearly half of all programs (35 [47%]) use penalties to encourage module completion, most commonly poor marks on evaluations (15 programs [20%]) and withholding of promotion or graduation (12 programs [16%]).

Program directors believe that the best method to encourage module completion is to make completion mandatory (27 programs [36%]), followed by reminders (12 programs [16%]). Other suggested methods to encourage module completion include incorporating modules into other teaching activities (11 programs [15%]), penalties (8 programs [11%]), incentives (6 programs [8%]), and protected time to complete modules (6 programs [8%]). Some programs use a combination of methods to encourage module completion. Program directors are significantly more likely to be satisfied with module completion if completion is mandatory rather than voluntary (54.7% vs 0%; P < .001).

EDUCATIONAL OUTCOMES TRACKING

Nearly all programs (71 [97%]) use the curricular Web site's ability to track the number of modules completed by house staff (Figure 3). Thirty-four programs (47%) use the curricular Web site's ability to track group performance on learning objectives to evaluate the educational performance of their house staff. However, only 8 programs (11%) alter their educational program based on these results. Logistic regression of program characteristics, curriculum adaptation, and educational outcomes tracking did not show any statistically significant trends but may have been limited by sample size.

Figure 3. Proportion of internal medicine residency training programs tracking education outcome measures and/or changing their educational program in response to results.

DISENROLLED PROGRAMS

The 10 programs that no longer use the curriculum were asked why they had discontinued use. Four programs (40%) cited poor utilization by house officers; 4 programs (40%) stated that the curriculum was inadequate for their needs; and 2 programs (20%) stated that implementation of the curriculum was too demanding of faculty time.

PAYMENT ISSUES

The programs were asked their opinion on the annual fee charged to use the curricular Web site. Fifty-three programs (72%) commented that the fee was reasonable or appropriate, while 6 programs (8%) objected to the fee. Of the 53 programs that did not object to the fee, 8 (15%) acknowledged the effort that is required to keep educational content current, while 4 programs (8%) acknowledged the expense and value of the Web site.

COMMENT

A medical knowledge curriculum in ambulatory care delivered online can be shared across a wide range of internal medicine residency training programs. Residency programs use such a curriculum to satisfy different needs, including providing a curriculum where none existed before or augmenting a curriculum still in use. Programs differ in how they implement the curriculum. Although the curriculum can function solely as a self-directed learning tool, most programs devote additional resources to implementation by dedicating faculty time to lectures, discussions, or reviews.

Nearly all programs (97%) use Web site features to track utilization, and most make module completion mandatory (often with penalties for those who do not comply). Those that do not make module utilization mandatory are uniformly dissatisfied with resident use of the curriculum. Less than half of the programs using this curriculum (47%) track medical knowledge outcomes, and only 8 programs (11%) use these results to change their educational program. There are several possible explanations as to why programs track module utilization much more than educational outcomes. The majority of programs using this curriculum do so in part to satisfy ACGME requirements, which have focused on process rather than on educational outcomes. Also, tracking educational outcomes and altering the educational program requires additional faculty resources. Half of the programs that use this curriculum either have no ambulatory curriculum or have one that merits replacement, perhaps suggesting some limitation in educational or faculty resources. Tracking of educational outcomes may improve as the ACGME Outcome Project becomes fully implemented.

Medical knowledge may be the easiest of the ACGME's 6 competencies to measure. Multiple-choice questions are well established as a valid and reliable tool for assessing medical knowledge.6,18-20 Even so, developing valid and reliable multiple-choice questions requires knowledge of principles of clinical question writing as well as demonstration of measures of validity (eg, content validity and construct validity) and reliability (eg, stability and equivalence).16,17 Multiple-choice questions may not be an appropriate method of evaluation for the other ACGME competencies, some of which have few validated methods of assessment.21 Assessment of educational outcomes for the other ACGME competencies may require significantly more resources. It is unlikely that residency programs will be able to develop these resources on their own, demonstrating the need for shared resources. Well-studied assessment tools such as observed structured clinical encounters and standardized patients will require still more resources.

Most programs that use the Hopkins ambulatory curriculum do not block time for its use, perhaps because of clinical demands on house staff, other educational priorities, and a desire to comply with the 80-hour work week. Although this curriculum provides assessment of knowledge outcomes, residents' time is required to complete the pretests and posttests (let alone the didactics) that are needed to assess the educational outcome. Assessment methods used to measure other competencies (eg, observed structured clinical encounters and standardized patients) also require significant time out of the residents' busy schedules, further straining residency training program resources. As outcome assessment becomes more widespread in residency training, programs will need to devote additional resources merely to track the progress of residents and to encourage participation in defined educational exercises and outcome assessments.

We found that the best way to ensure resident participation in educational exercises and outcome assessment is to make them mandatory. In our study, program directors who made completion of ambulatory curriculum modules voluntary were uniformly dissatisfied with resident participation. Even in programs where completion was mandatory, just over half of program directors (55%) were satisfied with module completion. Nearly half (47%) resorted to penalties to encourage module completion; a smaller number (32%) blocked time specifically for it. As outcome assessment becomes more widespread, program directors will likely need to devote significant resources to tracking and encouraging resident participation, and potentially to disciplining house staff who fail to comply.

Most programs using the Hopkins curriculum do not object to the annual subscription fee, acknowledging the resources that are required to develop and maintain medical curricula, as well as the expense of the Web site. We use the funds generated by this curriculum to pay faculty to write modules, to keep the modules current, and to edit the content, as well as to pay commercial fees for Web site upgrades, maintenance, and support. While the curriculum and Web site are currently self-sufficient, their initial development required considerable financial resources and faculty time.11

This study has several limitations. First, the effect of the curriculum on long-term knowledge and patient care has not been assessed. Second, other domains of competence such as attitudes and skills were not addressed by this intervention. Finally, the model of a shared educational resource that includes both educational content and outcomes assessment and is adaptable to the needs of a wide range of medical residency training programs may not work for the other ACGME competencies.

Our experience with delivering the curriculum over the past 5 years and the survey data from users of the curriculum provide several lessons for those interested in sharing educational resources. First, residency programs that adapt an external curriculum locally will likely need to commit additional resources and faculty time to implement it. Second, residency program directors implementing an external curriculum are more likely to be satisfied with its impact if they make curriculum participation mandatory. Third, house officer participation in curriculum didactics and education outcomes assessment requires commitment of administrative and resident resources and may need to involve incentives or punitive measures to ensure participation. Finally, tracking utilization of a curriculum is a widely accepted practice, but residency programs are currently underutilizing educational performance outcomes.

In summary, we have developed and distributed a curricular product that includes educational content and tools for outcomes assessment and has been adapted for use at a wide range of internal medicine residency training programs. Educational outcome assessment that has been mandated to comply with the ACGME Outcome Project will place additional demands on internal medicine residency training programs. Shared education resources such as those described herein have the potential to help programs educate residents and to assess education outcomes.

Correspondence: Stephen D. Sisson, MD, Department of Medicine, Johns Hopkins University School of Medicine, 601 N Caroline St, Room 7150G, Baltimore, MD 21287 (ssisson@jhmi.edu).

Accepted for Publication: July 14, 2007.

Author Contributions: Study concept and design: Sisson, Rastegar, and Hughes. Acquisition of data: Sisson and Hughes. Analysis and interpretation of data: Sisson, Rastegar, Rice, and Hughes. Drafting of the manuscript: Sisson and Hughes. Critical revision of the manuscript for important intellectual content: Sisson, Rastegar, Rice, and Hughes. Statistical analysis: Rice.

Financial Disclosure: Drs Sisson, Rastegar, and Hughes are paid an editorial fee to oversee educational content of the curriculum described in this article, for which residency programs pay an annual fee.

References

1. ACGME Outcome Project. Accreditation Council for Graduate Medical Education Web site. http://www.acgme.org/acWebsite/home/home.asp. Accessed February 6, 2007.
2. Goroll AH, Sirio C, Duffy D, et al. A new model for accreditation of residency programs in internal medicine. Ann Intern Med. 2004;140(11):902-909.
3. ACGME Outcome Project: toolbox of assessment methods. Accreditation Council for Graduate Medical Education Web site. http://www.acgme.org/outcome/assess/toolbox.asp. Accessed February 6, 2007.
4. Green ML. Identifying, appraising, and implementing medical education curricula: a guide for medical educators. Ann Intern Med. 2001;135(10):889-896.
5. Ludmerer KM. Learner-centered medical education. N Engl J Med. 2004;351(12):1163-1164.
6. Holmboe ES, Hawkins RE. Methods for evaluating the clinical competence of residents in internal medicine: a review. Ann Intern Med. 1998;129(1):42-48.
7. Weinberger SE, Smith LG, Collier VU; Education Committee of the American College of Physicians. Redesigning training for internal medicine. Ann Intern Med. 2006;144(12):927-932.
8. Fitzgibbons JP, Bordley DR, Berkowitz LR, Miller BW, Henderson MC. Redesigning residency education in internal medicine: a position paper from the Association of Program Directors in Internal Medicine. Ann Intern Med. 2006;144(12):920-926.
9. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9)(suppl):S63-S67.
10. Henderson MC, Ibrahim T, Tierney LM. Confronting the brutal facts in health care. Am J Med. 2005;118(10):1061-1063.
11. Sisson SD, Hughes MT, Levine D, Brancati FL. Effect of an Internet-based curriculum on postgraduate education: a multicenter intervention. J Gen Intern Med. 2004;19(5, pt 2):505-509.
12. Cosgrove SE, Perl TM, Song X, Sisson SD. Ability of physicians to diagnose and manage illness due to category A bioterrorism agents. Arch Intern Med. 2005;165(17):2002-2006.
13. Sisson SD, Rastegar D, Rice TN, Prokopowicz G, Hughes MT. Physician familiarity with diagnosis and management of hypertension according to JNC 7 guidelines. J Clin Hypertens (Greenwich). 2006;8(5):344-350.
14. Sisson SD, Rice T, Hughes MT. Physician knowledge of national cholesterol guidelines before and after an interactive curriculum. Am J Cardiol. 2007;99(9):1234-1235.
15. Ashar BH, Rice TN, Sisson SD. Physicians' understanding of the regulation of dietary supplements. Arch Intern Med. 2007;167(9):966-969.
16. Kern DE, Thomas PA, Howard DM, Bass EB. Curriculum Development for Medical Education: A Six-Step Approach. Baltimore, MD: Johns Hopkins University Press; 1998.
17. Case SM, Swanson DB. Constructing Written Test Questions for the Basic and Clinical Sciences. 3rd ed. Philadelphia, PA: National Board of Medical Examiners; 2002.
18. Norcini JJ, Swanson DB, Grosso LJ, Webster GD. Reliability, validity and efficiency of multiple choice question and patient management problem item formats in assessment of clinical competence. Med Educ. 1985;19(3):238-247.
19. Case SM, Ripkey DR, Swanson DB. The relationship between clinical science performance in 20 medical schools and performance on Step 2 of the USMLE licensing examination: 1994-95 Validity Study Group for USMLE Step 1 and 2 Pass/Fail Standards. Acad Med. 1996;71(1)(suppl):S31-S33.
20. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226-235.
21. Frohna JG, Kalet A, Kachur E, et al. Assessing residents' competency in care management: report of a consensus conference. Teach Learn Med. 2004;16(1):77-84.
