Measuring Competency of Pharmacy Residents: A Survey of Residency Programs’ Methods for Assessment and Evaluation


Steven J Kary, Zack Dumont, Kirsten Tangedal, Jennifer Bolt, William M Semchuk

ABSTRACT

Background

The Canadian Pharmacy Residency Board (CPRB) specifies the competencies that pharmacy residents must attain and the need for assessment and evaluation. Methods of assessment and evaluation are left to the discretion of individual programs. There is a scarcity of published literature compiling and comparing the strategies used by Canadian residency programs.

Objectives

To determine curricular components used for assessment and evaluation; to describe the tools used by programs; to characterize the scheduling, frequency, and repetition of curricular components; and to determine the individuals or groups involved.

Methods

Coordinators of hospital pharmacy residency programs with CPRB accreditation or accreditation pending were surveyed to collect information about the assessment and evaluation of select CPRB standards.

Results

From the 37 eligible residency programs, 20 unique responses (54%) were received. All respondents were general practice programs (100%) in predominantly multicentre organizations (70%). Programs were similar in terms of assessment components used, with all respondents citing care plan review, direct observation of patient care, journal clubs, creation of project timelines, and ethics submission. The predominant evaluation components were within-department presentations (100%), written manuscripts (95%), drug information rotations (85%), and longitudinal evaluations (75%). Standardized forms (70%–100%) defined by Bloom’s taxonomy (65%) and the CPRB “levels and ranges” document (60%) were the principal means used. Assessments for patient care and for provision of education were generally carried out immediately (80% and 95%, respectively), whereas project management skills were assessed predominantly at final evaluation (75%). Self-assessment and assessment by pharmacy team members occurred for every competency, whereas patients (0%–10%) and allied health professionals (5%) were less frequently involved.

Conclusions

The assessment and evaluation strategies reported by programs were congruent. The results provide a summary of national practices and will allow existing and developing programs to examine their approach to assessment and evaluation for alignment with national standards.

KEYWORDS: assessment, evaluation, competency, pharmacy residency, training, professional development


INTRODUCTION

In 2010, the Canadian Pharmacy Residency Board (CPRB) published accreditation standards that implemented the change to a competency-based educational approach for pharmacy residency programs in Canada.1 The release and adoption of these standards aligned with the evolution of professional education—notably within medicine, social work, and chiropractic care—from a focus on pathways and process to a focus on outcomes or competencies of graduates.2,3 The CPRB standards have defined competencies that align with current pharmacy practice, and CPRB-accredited programs have adjusted their respective frameworks to ensure these competencies are being achieved.

Within the CPRB accreditation standards for year 1 residencies, as updated in 2018,4 the core resident competencies describe (3.1) provision of patient care as a member of an interprofessional team, (3.2) management and improvement of medication systems, (3.3) leadership, (3.4) self-management of one’s own practice, (3.5) provision of medication- and practice-related education, and (3.6) project management. These competencies align with those described by the National Association of Pharmacy Regulatory Authorities for pharmacists at entry to practice,5 and they build upon the educational outcomes to be achieved in the first professional degree, as defined by the Association of Faculties of Pharmacy of Canada.6 Furthermore, they parallel the 4 competencies described by the American Society of Health-System Pharmacists, which reflect the ongoing progression of health-system pharmacy practice.7

Despite these definitions of pharmacist competencies, developing the educational processes required to achieve them is complex. The specific activities or curricular components set the course for a resident’s progression through a program and form the basis for meaningful assessment.2 Each competency defined by the CPRB is further delineated to describe the skills, attitudes, and values required to demonstrate success. For example, standard 3.1 defines the components to demonstrate proficiency in evidence-based pharmacy practice in conjunction with other health care professionals. Residents are required to place high priority on selecting and providing appropriate pharmacy services, to demonstrate proficiency in navigating resources, and to establish inter- and intra-professional relationships.4 The components described within each competency ensure that residency programs can provide relevant assessment and evaluation of residents.

Demonstration of competency through these components is no less complex, and multiple methods are therefore recommended.8,9 Programs require both appropriate assessment—estimation of a learner’s ability, performed longitudinally to guide learning—and evaluation—the summative judgment of an amount or value of competency, occurring at the midpoint or end of an educational or training program.4,9 CPRB-accredited programs require ongoing formative assessments to aid learning and development of competencies, as well as final evaluations to describe the competencies achieved.3,4 The CPRB also requires ongoing resident self-assessment of activities, which is to be reviewed with a preceptor.4 The requirements for “what” but not “how” allow for varied interpretation and implementation by individual programs, and may result in discordance of assessment and evaluation from one program to another.

Appropriate assessment and evaluation are paramount in optimizing learners’ capabilities, protecting the public, and providing a basis for learners to progress.9 Although tools exist to aid in competency assessment,10 programs must determine their individual needs and implement assessment methods appropriate to those needs. The CPRB has published a “levels and ranges” document to guide programs in their definition of expected levels of competency.11 Given this flexibility and freedom in the choice of methods to assess and evaluate these competencies, the onus for developing suitable methods lies with individual programs.

There is a lack of published literature pertaining to how CPRB-accredited and accreditation-pending year 1 pharmacy residency programs are assessing the competency of hospital pharmacy residents. Although some programs have shared examples of their assessment tools online,12 a compilation and comparison of the assessment strategies used by Canadian programs is not currently available.

This study was undertaken to determine the curricular components used for assessment and evaluation of residents’ competencies; to describe the tools used for assessment and evaluation; to characterize the scheduling, frequency, and repetition of assessments; and to determine the individuals or groups involved in assessment.

METHODS

A survey focused on assessment and evaluation of specified resident competencies was designed by the authors to determine the methods used by CPRB-accredited and accreditation-pending hospital pharmacy residency programs. The survey questions and response options were determined through analysis of the components of the Regina Qu’Appelle Health Region pharmacy residency program and literature available on the assessment and evaluation of competency. Although the survey was not validated, the content and design were revised before distribution, on the basis of feedback from 2 former residency coordinators and a pharmacist with a background in survey design and implementation, none of whom were otherwise involved in the study.

Survey content reflected the scope of CPRB year 1 standards 3.1, 3.5, and 3.6. These competencies were selected to represent the complete set of standards and were hypothesized to cover a broad range of resident skills and activities and to illustrate perceived similarities (standard 3.1) and differences (standards 3.5 and 3.6) among programs. Standards 3.2, 3.3, and 3.4 were excluded to ensure that the project remained within the scope of a year 1 pharmacy residency project; these standards were not anticipated to provide better examples of congruence or disparity among programs. Respondents were allowed, but not required, to select all applicable answers to each question, and an open response option was made available to ensure that any unlisted responses could be captured. Respondents were able to provide comments if they wished to elaborate on their response to any question. The survey questions are available in Appendix 1 (https://www.cjhp-online.ca/index.php/cjhp/issue/view/192/showToc).

The survey was distributed electronically via the Research Electronic Data Capture (REDCap) system.13 REDCap is a secure, web-based application, hosted on a local server, designed to support data capture for research studies (https://www.projectredcap.org/). Potential participants were identified through the CPRB website, which is updated at least annually and provides contact information for all accredited and accreditation-pending residency programs. The coordinators of the identified residency programs were invited to participate in the survey. If multiple coordinators were involved in an individual program, they were asked to submit a single unified response. Individuals responsible for coordination of multiple programs were asked to submit a separate response for each program. The survey was open from February 19 to March 16, 2018, inclusive, and reminders were sent by e-mail to prospective participants at 2 weeks and 1 week before the survey closing date.

In cases where the number of respondents from each province exceeded the total number of programs known to exist, the data were reviewed independently by 2 of the authors to assess for any duplication of response from individual programs. Where multiple responses were clearly noted for a single program, the most thorough response was used for the analysis.
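The duplicate-screening rule lends itself to a short illustration. The sketch below is a minimal Python rendering under stated assumptions: responses are dictionaries keyed by a hypothetical "program" field, and the count of answered fields stands in for "thoroughness". The actual screening was a manual review by 2 authors, so this is an analogy, not the authors' procedure.

```python
def deduplicate(responses):
    """Keep one response per program, preferring the most thorough one.

    Thoroughness is approximated as the number of answered (non-empty)
    fields; the "program" key identifies duplicate submissions.
    """
    best = {}
    for resp in responses:
        completeness = sum(1 for k, v in resp.items() if k != "program" and v)
        key = resp["program"]
        if key not in best or completeness > best[key][0]:
            best[key] = (completeness, resp)
    return [resp for _, resp in best.values()]

# Hypothetical example: two submissions arrive from "Program A";
# the more complete one is retained for analysis.
responses = [
    {"program": "Program A", "q1": "care plan review", "q2": ""},
    {"program": "Program A", "q1": "care plan review", "q2": "journal club"},
    {"program": "Program B", "q1": "direct observation", "q2": "preceptorship"},
]
print(len(deduplicate(responses)))  # 2 unique program responses
```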

The data were analyzed descriptively, because of their categorical nature. Results are reported in terms of frequency distributions and medians with interquartile ranges (IQRs). Results from the open responses are reported as “other” and are summarized. Comments from each section of the survey were reviewed for applicability to the results and are highlighted within the Discussion.
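As a minimal sketch of this style of descriptive summary, the following Python fragment computes a frequency distribution for a categorical item and a median with IQR for a count variable. The data are hypothetical, and quartile conventions (here, the "inclusive" method of the standard library) differ across statistical packages.

```python
from collections import Counter
from statistics import median, quantiles

def frequency_distribution(values):
    """Counts and percentages for a categorical survey item."""
    total = len(values)
    return {category: (n, round(100 * n / total))
            for category, n in Counter(values).items()}

def median_iqr(values):
    """Median and interquartile range (Q1, Q3) of a numeric variable."""
    q1, _, q3 = quantiles(values, n=4, method="inclusive")
    return median(values), (q1, q3)

# Hypothetical data: residents accepted annually by each responding program.
residents_per_program = [2, 2, 3, 4, 4, 4, 4, 6, 12]
print(median_iqr(residents_per_program))
print(frequency_distribution(["multicentre", "multicentre", "single site"]))
```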

RESULTS

Overview of Respondent Programs

A total of 22 responses were received. Following screening for and elimination of duplicates, information from 20 (54%) of the 37 Canadian year 1 programs was included in the analysis. Respondents represented programs, both large (more than 10 residents accepted annually) and small (10 or fewer residents accepted annually), in 7 of the 9 provinces with residency programs. During the 2017/2018 residency year, 196 residency positions were offered nationally,14 including the 85 potential positions reported by Quebec Clinical Master’s programs. The responses received accounted for 139 (71%) of available positions. All respondents were from general practice residency programs (100%) in predominantly multicentre organizations (70%) (Table 1).

Table 1 Demographic Characteristics of Survey Respondents

The programs accepted a median of 4 residents (IQR 2–4), with a median of 12 rotations (IQR 9–14) undertaken by each resident annually. Respondents described programs as being predominantly focused on provision of direct patient care, with 17 (85%) of the respondents stating that this aspect constituted 60% or more of the curriculum. The other competencies addressed in the survey made up lesser portions of the programs: medication- and practice-related education and project management skills composed 20% or less of the curriculum in 12 (60%) and 18 (90%) of the programs, respectively. Three respondents (15%) defined medication- and practice-related education as accounting for 80% or more of the curriculum.

Curricular Components Used for Assessment

The curricular components used to assess direct patient care included care plan review (100% of respondents), direct observation of patient care activities (100%), and written documentation of patient care (95%) (Table 2). Respondents also reported discussions with the health care team and the use of a teaching rotation as “other” components of assessment. The components used to assess provision of medication- and practice-related education were journal clubs (100%), response to drug information requests (95%), individual patient education (85%), and preceptorship (85%). Group teaching sessions were used by fewer programs (35%). Assessment of competency in project management was consistently based on creating project timelines, communication with project members, ethics submission, and data collection (100% for all). Additional components cited by 5 (25%) of the programs included presentations to leadership groups, poster preparation, protocol write-up, and background research.

Table 2 Curricular Components Used for Assessment of Competencies 3.1, 3.5, and 3.6

Curricular Components Used for Evaluation

The curricular components used for evaluation of direct patient care were longitudinal evaluations (75% of respondents), comprehensive final rotations (45%), and practical skills examinations (25%) (Table 3). Four programs (20%) cited other evaluation components, including a comprehensive oral evaluation, which was used by one of the programs. Provision of medication- and practice-related education was evaluated via departmental or staff presentations in all programs, and additionally through specific rotations: 17 programs (85%) had a drug information rotation and 13 (65%) had a preceptorship rotation. Fewer programs used written education (60%) or other components (25%), such as learning portfolios, presentation slides, and seminars or academic teaching, as curricular components for evaluation. All respondents indicated that a research project was used for demonstration of project management skills; however, 12 programs (60%) additionally used non–research-based projects, such as mini-projects (e.g., audits). Evaluation of competency was predominantly based on written manuscripts (95%), completion of a research project (90%), a separate management project (85%), or a poster presentation of results (75%). Few programs (20%) used an additional publication for evaluation of project management.

Table 3 Curricular Components Used for Evaluation of Competencies 3.1, 3.5, and 3.6

Tools Used by Programs for Evaluation

Thirteen programs (65%) reported sharing assessment and evaluation tools with another program (Table 4). The majority of programs (75%) used more than 1 tool to define competency, the most common tools being Bloom’s taxonomy16 (65%) and the CPRB “levels and ranges”11 document (60%). Most programs used standard evaluation forms for assessment of the 3 competencies; however, the median number of questions on each evaluation form ranged from 10 (IQR 3–13) to 20 (IQR 15.5–20), depending on the competency.
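As an illustration of how such a standardized form might be structured, the sketch below pairs each item with a level from the revised Bloom's taxonomy16 and a qualitative rating. The item wording, level names, and rating anchors are assumptions for illustration; they are not drawn from any respondent's form or from the CPRB "levels and ranges" document.

```python
# Illustrative scales; the rating anchors are placeholders, not the
# CPRB "levels and ranges" wording.
BLOOM_LEVELS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]
RATINGS = ["needs development", "meets expectations", "exceeds expectations"]

# A hypothetical two-item evaluation form for standard 3.1.
form = [
    {"question": "Prioritizes patients for pharmacy services",
     "bloom": "evaluate", "rating": None},
    {"question": "Documents care plans in the patient record",
     "bloom": "apply", "rating": None},
]

def complete(form, ratings):
    """Attach a rating to each item, validating against the scale."""
    for item, rating in zip(form, ratings):
        assert rating in RATINGS, f"unknown rating: {rating}"
        item["rating"] = rating
    return form

complete(form, ["meets expectations", "exceeds expectations"])
```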

Table 4 Characteristics of Tools Used for Evaluation of Competency

Scheduling and Frequency of Assessments

Assessments of standard 3.1 were predominantly carried out immediately (80% of respondents reporting “often” or “always”) or within 24 to 48 h (70% “often” or “always”) after the provision of patient care. Similarly, assessments of standard 3.5 were carried out immediately or within 24 to 48 h for provision of medication-related education (95% “often” or “always” and 70% “often” or “always”, respectively) and practice-related education (85% “often” or “always” and 60% “often” or “always”, respectively). Assessments were typically recorded “often” or “always” for midpoint and final evaluations (at least 70% of programs) within these competencies (Figure 1). Assessment of project management skills (standard 3.6) was less frequently undertaken immediately (25% “often” or “always”) or within 24 to 48 h (35% “often” or “always”), and these skills were predominantly assessed at the final evaluation (75% “often” or “always”) or at additional times specified by the program, including month-end and quarterly, periodically according to the activity schedule, or as set by the project preceptor.

Figure 1 Scheduling of assessment carried out for competencies 3.1, 3.5, and 3.6. The frequency of occurrence (from “never” to “always”) for each of the timing domains is represented as a percentage of all respondents. The respondents who chose not to reply to a given question are represented as “blank”. Respondents who chose “other” were given the opportunity to provide an open (free-text) response, with the following responses received: “ongoing as components of the project are completed”, “month-end and quarterly”, “periodic according to activity schedule”, “as set by preceptor”, “assessment of written work may take 1–2 weeks”.
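The percentages above and in Figure 1 collapse a never-to-always scale into the share of programs answering "often" or "always". A minimal sketch of that computation follows, assuming blank responses remain in the denominator (as the figure's "blank" category implies).

```python
def pct_often_or_always(ratings):
    """Percentage of respondents answering "often" or "always".

    Blank responses stay in the denominator, mirroring the "blank"
    category shown in Figure 1; this convention is an assumption.
    """
    hits = sum(1 for r in ratings if r in ("often", "always"))
    return round(100 * hits / len(ratings))

# Hypothetical timing responses for one competency and one timing domain.
immediate = ["always", "often", "often", "sometimes", "never", ""]
print(pct_often_or_always(immediate))  # 50
```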

Individuals or Groups Involved in Assessments

Assessment by pharmacy team members, as well as self-assessment by the resident, was used for all 3 competencies (Figure 2). Allied health professionals and patients were rarely involved in assessments of patient care (5% “often” or “always” and 0% “often” or “always”, respectively) or assessment of medication-related education (5% “often” or “always” and 10% “often” or “always”, respectively). Fourteen respondents (70%) indicated that residency coordinators or directors were “always” or “often” involved in assessment of practice-related education. Coordinators and directors were also reported as being “always” or “often” involved in assessment of project management skills in 12 programs (60%). Programs additionally identified project members (85% “often” or “always”) and faculty liaisons as being involved in the assessment of project management skills.

Figure 2 Individuals involved in assessment of competencies 3.1, 3.5, and 3.6. The frequency of occurrence (from “never” to “always”) for each individual or group is represented as a percentage of all respondents. The respondents who chose not to reply to a given question are reported as “blank”. Respondents who chose “other” were given the opportunity to provide an open (free-text) response, with the following responses received: rotation preceptor, pharmacy technicians during distribution, physician feedback (1 respondent); rotation preceptor (2 respondents); project management assigned to another individual, faculty liaison (1 respondent).

DISCUSSION

This study sought to describe how CPRB-accredited and accreditation-pending year 1 pharmacy residency programs assess and evaluate residents in accordance with 3 CPRB competencies: provision of direct patient care, provision of medication- and practice-related education, and demonstration of project management skills.4 These competencies were selected as it was anticipated that they would highlight similarities and differences among programs. Patient care was found to be a primary focus (greater than 60% of curriculum) in 85% of respondent programs. Education and project management were anticipated to compose a smaller portion of programs, and the results bore out this assumption, although education was noted to represent 80% or greater of the curriculum in 3 programs. These outliers may reflect a difference among programs, exemplified by the structure of the Quebec Clinical Master’s programs as distinct from the structure of residency programs in other provinces. However, there may also be overlap of patient education and provision of patient care as the primary focus of certain programs. The programs were largely congruent in terms of the curricular components used for assessment and the timing of assessment of these 3 competencies. More variability was noted in terms of the curricular components used for evaluation and the individuals involved in assessment of competency.

Identification of the components used for assessment is critical within competency-based education, as these components facilitate progression of competency development.17 Many of the curricular components used for assessment of competency in the provision of patient care and the provision of medication- and practice-related education—including care plan review, written documentation, direct observation of patient care, and response to drug information requests—are activities specified within the CPRB accreditation standards.4 Additionally, these components align with required patient care services specified in US pharmacy residency programs.7 Case-based lectures, journal clubs, and preceptorship are assessed in many of the Canadian programs, although they are not required. This similarity among components for assessment suggests a common approach to pharmacy practice, and thus to competency development.

All of the respondent programs used a research project for the assessment of project management skills, and the surveyed curricular components were used by at least 90% of respondents (Table 2). Although the CPRB accreditation standards do not mandate a research project for year 1 residencies, a resident must be involved in project development and in data collection, analysis, and interpretation, and must prepare a report suitable for publication in a peer-reviewed journal, activities that together are analogous to the research process.4 The components for assessment that were reported by survey respondents largely align with the standard for project management, and variability among programs seemed to be reflected only in respondents’ free-text comments, which mentioned presentations to leadership groups, poster preparation, protocol write-up, and background research. However, these components were not listed within the options presented to respondents, and it is possible that they were used but not mentioned by other programs; hence, their true frequency within the sample cannot be verified.

Competency-based education places less emphasis on evaluation than on assessment3,18; however, the curricular components used for evaluation continue to provide a valuable summation of competency achieved.9 Longitudinal evaluations, used by 75% of programs, represented one of the consistent evaluation processes for provision of patient care. The 2018 CPRB accreditation standards require the ongoing use of longitudinal assessments,4 which can serve as a foundation for monitoring professional development,9 but there are few other evaluation requirements in the standards. Correspondingly, an apparent lack of standardization was found among programs in terms of the components used for evaluation. More than half of respondent programs used written education (60%), and just under half used a comprehensive final rotation (45%), whereas a quarter or less used a practical skills exam (25%) or additional publications (20%) for evaluation of competency. However, these components varied substantially across respondents. Although a best practice cannot be defined, the use of multiple methods can help to validate the findings of an evaluation,9 and programs may benefit from incorporation of additional evaluation components within these competencies.

Descriptive assessment tools provide increasing detail and specificity to guide residents in gaining competency.17 The use of multiple methods to define competency, including Bloom’s taxonomy and the “levels and ranges” document (as reported by 75% of respondent programs), suggests that qualitative assessment is favoured over quantitative assessment. National standardization and validation of qualitative assessment tools have been suggested in competency-based medical education,9,18 although these approaches have not been formally implemented in Canadian pharmacy programs. For the competencies surveyed, the forms that programs use to assess competency are predominantly standardized within each program (70% to 100%), and 65% of programs share their tools with another program. These results reflect the existence of standardized forms at the provincial level in British Columbia and Ontario.12 It appears that CPRB-accredited pharmacy residency programs are standardizing their approach, although this may not be a consistent national trend. Currently, the CPRB has not implemented formal sharing of assessment tools or standardization of these tools.

To have the desired effect, assessments should be performed frequently,3,18 although the benefit associated with timing of feedback may vary depending on the focus of the assessment.19 Assessments done immediately may aid in faster acquisition of knowledge and skill related to specific tasks, whereas a delay in feedback may allow increased automaticity in development of learning strategies and process.19 Most respondents reported an expectation that assessments would be conducted immediately or within 24 to 48 h for the provision of patient care and for the provision of medication- and practice-related education. Fewer programs assessed these competencies at the midpoint and end of the residency, which may reflect a focus on reinforcing knowledge and skill at the time of acquisition. However, the CPRB accreditation standards continue to recommend midpoint assessments, and they require final assessments.4 Variation in the timing of assessment of residents’ project management skills was noted, which may reflect the time required for review of written documents. Two respondents commented that the timing of assessment for this competency varied according to completion of activities or the schedule of activities that had been laid out.

Involvement of multiple individuals, specifically patients and allied health professionals, has been suggested as essential for successful assessment within competency-based medical education,9,18 and pharmacy residency programs may benefit from adopting this practice. Assessment for the various competencies largely relied on pharmacy team members, which may reflect the 2010 standards requiring qualified pharmacists or pharmacy technicians to act as preceptors.1 Only one program reported regular use of allied health professionals in the assessment of patient care. Few programs used allied health professionals (5% “often” or “always”) or patients (10% “often” or “always”) in the assessment of medication-related education (representing 1% and 2.5% of the total residency spots, respectively). The more recent CPRB standards, released in 2018, identify the need to include patients and health care providers and require their input in the assessment of residents.4 Patient feedback surveys with a Likert-scale rating, along with review of findings with a preceptor, could help to ensure that feedback is specific and meaningful.20 Additionally, self-assessment remains an essential component of competency-based training, and all of the surveyed programs included self-assessment “often” or “always” for each competency. It has been suggested that self-assessment in isolation is ineffective and potentially dangerous,18 so it is important to note that the respondent programs did not rely on resident self-assessment in isolation—all of the programs made use of at least one other individual in assessment of each competency.
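As one way to act on this suggestion, the sketch below structures a short patient-feedback form with 5-point Likert items and averages the responses for review with a preceptor. The items and scale are hypothetical and are not drawn from the cited study20 or from the CPRB standards.

```python
from statistics import mean

# Hypothetical feedback items rated from 1 (strongly disagree)
# to 5 (strongly agree); the wording is illustrative only.
ITEMS = [
    "The resident explained my medications clearly.",
    "The resident answered my questions.",
    "I know what to do if I experience a side effect.",
]

def summarize_feedback(surveys):
    """Average each item across patient surveys for preceptor review."""
    return {item: round(mean(s[item] for s in surveys), 1) for item in ITEMS}

surveys = [
    {ITEMS[0]: 5, ITEMS[1]: 4, ITEMS[2]: 4},
    {ITEMS[0]: 4, ITEMS[1]: 5, ITEMS[2]: 3},
]
print(summarize_feedback(surveys))  # per-item means, e.g., 4.5, 4.5, 3.5
```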

Implications

This study has provided insight into the assessment and evaluation practices of CPRB-accredited and accreditation-pending pharmacy residency programs in Canada. The alignment and variation among programs may help in identifying areas on which to focus for growth in the tools and methods used, as well as guiding programs currently in development. The results of this study may help inform processes for continuous quality improvement, which are required of all programs.4 Currently, the CPRB recognizes and publishes information on leading practices, and also provides webinars to aid in program development.21 Taken together, the results of this survey may help in identifying methods suitable for incorporation into individual programs.

The programs represented in the survey responses were predominantly congruent with respect to assessment and evaluation of competencies, and many of their practices adhered to the accreditation standards. The 2018 CPRB accreditation standards require ongoing incorporation of assessments from multiple individuals, including patients and allied health professionals4; these methods were infrequently used by the respondent programs. Most programs will require changes to incorporate assessments by these individuals, and the few programs that have an existing standard in this area may be able to provide guidance as to how this might be achieved.

This study focused on describing how competency assessments and evaluations were being performed at the time of the survey, but did not assess the quality of the assessments by individual programs or the ramifications if competence was not demonstrated through these measures. Additionally, measurement of the outcomes of competency assessments and evaluations and how these outcomes translate into future practice were outside the scope of this study. A framework for the evaluation of competency-based programs was previously described by Baartman and others3 and has been applied to a pharmacy residency program in North Carolina for purposes of improvement.22 The same framework could be applied to Canadian programs to determine the effectiveness of the current approach. Further research should focus on determining best practice and how it might be implemented within CPRB-accredited pharmacy residency programs.

Limitations

The use of a survey for this study allowed ease of distribution to participants and ease of response, which likely contributed to the relatively high response rate. However, the survey design limited the nature of the responses collected and may have contributed to the high degree of congruence observed among the programs. A semistructured interview or more open-ended questions might have yielded greater depth and detail about the actual practices of individual programs. Respondents were not required to answer every question, which may have affected interpretation of the overall response for questions with components that were infrequently used. Additionally, the interpretation of response options was subjective and may have varied among individual respondents.

Three of the CPRB-defined competencies were selected to represent assessment and evaluation practices across programs. However, this selection may not be truly representative of practice, as programs may have alignment and variation in the emphasis placed on the other 3 competencies. Although the survey responses were intended to apply to all programs, there may have been bias in the survey design, such that responses may have reflected practices within the Regina Qu’Appelle Health Region pharmacy residency program. Finally, participation in the survey was voluntary, and the findings may reflect programs most interested in the topic rather than being truly representative of all programs.

CONCLUSION

This study showed that the materials and methods used by individual Canadian pharmacy residency programs to assess and evaluate residents’ competencies are largely congruent, although some variation exists, particularly with respect to evaluation. These results help to describe the practice landscape among CPRB-accredited pharmacy residency programs with regard to assessment and evaluation. Although a specific best practice was not sought and thus cannot be defined from these findings, the results reported here can help to reinforce current practices. Furthermore, these results help to identify the extent of variability among programs, indicating where efforts could be concentrated if and when it is determined that national alignment is appropriate.

Supplementary Material

References

1 Canadian hospital pharmacy residency board accreditation standards. Ottawa (ON): Canadian Society of Hospital Pharmacists; 2010 Jan.

2 Frank JR, Snell LS, Ten Cate O, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45.

3 Baartman LKJ, Prins FJ, Kirschner PA, van der Vleuten CPM. Determining the quality of competence assessment programs: a self-evaluation procedure. Stud Educ Eval. 2007;33(3–4):258–81.

4 Canadian Pharmacy Residency Board accreditation standards for pharmacy (year 1) residencies. Ottawa (ON): Canadian Society of Hospital Pharmacists; 2018.

5 Professional competencies for Canadian pharmacists at entry to practice (2014). Ottawa (ON): National Association of Pharmacy Regulatory Authorities; 2014 [cited 2019 Sep 11]. Available from: https://napra.ca/pharmacists/professional-competencies-canadian-pharmacists-entry-practice-2014

6 AFPC educational outcomes for first professional degree programs in pharmacy in Canada 2017. Association of Faculties of Pharmacy of Canada; 2017.

7 ASHP accreditation standard for postgraduate year one (PGY1) pharmacy residency programs. Bethesda (MD): American Society of Health-System Pharmacists; 2016 [cited 2018 Nov 25]. Available from: https://www.ashp.org/-/media/assets/professional-development/residencies/docs/pgy1-residency-accreditation-standard-2016.ashx?la=en&hash=9FF7C76962C10562D567F73184FAA45BA7E186CB

8 Baartman LKJ, Bastiaens TJ, Kirschner PA, van der Vleuten CPM. The wheel of competency assessment: presenting quality criteria for competency assessment programs. Stud Educ Eval. 2006;32(1):153–70.

9 Epstein RM. Medical education: assessment in medical education. N Engl J Med. 2007;356(4):387–96.

10 Murdaugh LB. Competence assessment tools for health-system pharmacies. 5th ed. Bethesda (MD): American Society of Health-System Pharmacists; 2015.

11 Canadian Hospital Pharmacy Residency Board 2010 accreditation standards workshop proceedings: levels and ranges document August 2009. Ottawa (ON): Canadian Society of Hospital Pharmacists; 2009 Aug.

12 Interior Health pharmacy practice residency program direct patient care rotation ITER (in-training evaluation of resident): competency-based evaluation. Kelowna (BC): Interior Health; 2017 [cited 2018 May 1]. Available from: http://static1.1.sqspcdn.com/static/f/920943/27587643/1496871693503/IH+DPC+ITER+FINAL+2017.pdf?token=Em47LwiLmbCmKEYgeGk138A0BF8%3D

13 Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–81.

14 Canadian Pharmacy Residency Board. Residency board news. Ottawa (ON): Canadian Society of Hospital Pharmacists; 2019 Spring.

15 Goel S. An overview of selected theories about student learning. Indo-US workshop on effective teaching and learning at college/university level; 2011 Feb 10–12; Delhi, India [cited 2018 Nov 25]. Available from: https://files.eric.ed.gov/fulltext/ED523206.pdf

16 Anderson L, Krathwohl D, editors. A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives. New York (NY): Longman; 2000.

17 Carraccio C, Wolfsthal S, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361–7.

18 Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676–82.

19 Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81–112.

20 Bogetz AL, Orlov N, Blankenburg R, Bhavaraju V, McQueen A, Rassbach C. How residents learn from patient feedback: a multi-institutional qualitative study of pediatrics residents’ perspectives. J Grad Med Educ. 2018;10(2):176–84.

21 Canadian Pharmacy Residency Board: leading practices. Ottawa (ON): Canadian Society of Hospital Pharmacists; [cited 2018 May 28]. Available from: https://www.cshp.ca/leading-practices

22 Shah S, McLaughlin J, Eckel S, Mangun J, Hawes E. Evaluating the quality of competency assessment in pharmacy: a framework for workplace learning. Pharmacy. 2016;4(1):4.


Steven J Kary, BSP, ACPR, is with Oncology Pharmacy Services, Saskatoon Cancer Centre, Saskatoon, Saskatchewan
Zack Dumont, BSP, ACPR, MS(Pharm), is with Pharmacy Services, Saskatchewan Health Authority Regina Area, Regina, Saskatchewan
Kirsten Tangedal, BSP, ACPR, is with Pharmacy Services, Saskatchewan Health Authority Regina Area, Regina, Saskatchewan
Jennifer Bolt, BScPharm, ACPR, PharmD, is with Clinical Support Services, Central Okanagan Seniors’ Health and Wellness Centre, Kelowna, British Columbia
William M Semchuk, BSP, MSc, PharmD, FCSHP, is with Pharmacy Services, Saskatchewan Health Authority Regina Area, Regina, Saskatchewan

Competing interests: Jennifer Bolt serves as a member of the Canadian Pharmacy Residency Board. No other competing interests were declared.


Address correspondence to: Steven J Kary, Oncology Pharmacy Services, Saskatoon Cancer Centre, 20 Campus Drive, Saskatoon SK S7N 4H4, e-mail: steven.kary@saskcancer.ca



Funding: None received.


Acknowledgements

The authors would like to thank Lynette Kosar and Lisa Ruda, who each pilot-tested the survey.


Canadian Journal of Hospital Pharmacy, VOLUME 72, NUMBER 5, September-October 2019