A Fresh Look at the Faculty Evaluation Process
By Linda Merillat, Jane Carpenter, Bobbe Mansfield, and Debbie Isaacson, Washburn University, School of Nursing
Introduction
In the ever-evolving landscape of higher education, the quality of teaching and learning experiences is paramount to program success. It is within this context that faculty annual evaluations emerge as a critical mechanism for assessing and improving the performance of educators in universities and colleges. These evaluations should offer a comprehensive view of faculty members' personal strengths and areas in need of development that impact the quality of the learning experience for students. They should serve as a cornerstone of academic excellence and a driving force behind continuous program improvement. This article provides an overview of how a medium-sized Midwestern university stepped back, took a fresh look at the faculty annual evaluation process, and transformed it from an archaic exercise into one that encourages and supports faculty in their professional development.
Problem
The problems with the faculty annual evaluation process within the School of Nursing (SON) were long-standing. The process was based on a form developed over twenty years ago. The evaluation consisted of two parts. First, each year the faculty were required to complete a narrative activities report (Figure 1, Figure 2). Then, they were asked to evaluate themselves using a Likert scale-type form (Figure 3, Figure 4). When done, a meeting was scheduled with the Associate Dean. During the meeting, evaluation scores were often altered by the evaluator. The reasons for these changes seemed subjective, since no rubric or details were provided about what was expected at each level of performance. At the end of the process, faculty were given an overall score of 1 to 4, with 1 being Unsatisfactory and 4 being Exemplary. Faculty often felt demeaned by a process that had no clear rules, and the experience felt capricious and punitive.
From a management perspective, there was no way to track faculty performance over time. The written form provided no means for collecting metadata for longitudinal analysis. During this time, the SON was tasked with collecting data for a Center of Excellence application, and faculty performance data was found to be lacking.
Alt text for Figures 1 & 2 - Outdated Dean's Annual Evaluation of Faculty MS Word doc - provided by Linda Merillat.
Alt text for Figures 3 & 4 - Outdated Faculty Annual Report Template MS Word doc - provided by Linda Merillat.
Solution
The solution was to develop a comprehensive, rubric-based form that focused on continual improvement rather than a summative score. The criteria were primarily based on requirements established in the SON’s promotion and tenure guidelines. A draft of the proposed rubric was presented to the deans and program directors. Over several meetings, the language of the rubrics was refined. The current criteria and associated rubrics are provided in an appendix (Appendix- Faculty Annual Evaluation Rubric).
As the project unfolded, there were discussions about the best way for faculty to complete the forms. Using a PDF form filler was suggested. Ultimately, it was decided to implement the form as a series of tabs in Excel. The benefits of using Excel were that faculty could download the form at the beginning of the academic year and update it throughout the year; then, after meeting with their Associate Dean, the data from the form could be imported into the SON’s database to track metadata and longitudinal trends.
The first page of the Excel workbook was a summary page (Figure 5 - Summary Page of New Faculty Annual Evaluation Form). On the summary page, faculty entered their name. The form listed all the categories under evaluation organized by a widely accepted model of Teaching, Scholarship, Service, and Other. Steps in the overall process with due dates were provided. Hyperlinks were provided to make it easier to navigate the form.
Each tab in the form followed a consistent format (Figure 6 - Sample tab in New Faculty Annual Evaluation form). The rubric for the criteria was developed and clearly provided. Under the rubric were radio buttons for selection. In the process, faculty assessed themselves for each criterion. Next, the faculty members provided support for their selection with relevant examples of their performance. During their meeting with the Associate Dean, changes were collaboratively made to these determinations.
Alt text for Figures 5, 6, 7, 8, 9 - New SON Faculty Evaluation Form Template - 2023-2024 MS Excel spreadsheet - provided by Linda Merillat.
Benefits
The introduction of this new faculty annual evaluation form and process has provided the SON with several benefits.
First, it shifted the atmosphere of faculty annual evaluations from punitive and subjective to a more flexible, formative format focused on continual improvement. Faculty no longer felt that they were being assigned an arbitrary score.
Next, faculty annual evaluations became more consistent. Evaluations were administered by two different associate deans, and the new form with its embedded rubrics ensured that each criterion was treated the same way regardless of the evaluator.
Faculty members also saw benefits. The form was under the control of each faculty member which made it easier for faculty to track and document accomplishments throughout the academic year. The raw data from each form was incorporated into the SON’s database which allowed detailed performance reports to be provided to each faculty member in support of their tenure and promotion applications.
From a managerial perspective, the incorporation of the raw data into the SON’s database allowed reports to be provided that showed faculty performance and trends over time (Figure 7) and a scholarship summary that was used for other departmental required reporting (Figure 8).
Lessons Learned
The general construction of the form has remained stable, but several improvements to the form and the process were made. Most importantly, it was critical to provide a clear delineation of which accomplishments should appear on which tab and to communicate these expectations to faculty. Examples of the types of activities to be included within each criterion area were added. To facilitate ease of use, the headings of each tab were color-coded to designate the major areas of teaching, scholarship, service, and an ‘other’ category. An unexpected issue arose when faculty were required to sign the form: the process was primarily digital, and a formal signature added an unnecessary layer of complexity. Instead, faculty members now type in their initials electronically during the review meeting with their Associate Dean.
The process of incorporating the raw data into the SON’s database was problematic at times. Faculty had to be advised to limit lines to 250 characters so that their comments didn’t get truncated during the import process. Special attention also had to be given when the final spreadsheets were submitted for import into the database, requiring a manual review for correctness and accuracy. To minimize these potential problems, faculty were given several tips for completing the form:
- Don't insert new rows.
- Keep all your accomplishments in one row.
- Don't use spaces to force a new line.
- When editing in Excel (not in the browser), type ALT+ENTER to force a new line.
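A simple pre-import check can catch most of these issues before a spreadsheet is submitted. The sketch below is a hypothetical illustration (not part of the SON's actual tooling) of how staged rows might be screened for over-long cells and space-forced line breaks, using only the Python standard library:

```python
# Hypothetical pre-import check for submitted form rows.
# Mirrors the tips above: flags cells longer than 250 characters
# (which would be truncated on import) and cells that appear to use
# runs of spaces to fake a line break instead of ALT+ENTER ("\n").

MAX_LEN = 250

def check_rows(rows):
    """Return a list of (row_number, problem) tuples for manual review."""
    problems = []
    for i, row in enumerate(rows, start=1):
        for cell in row:
            if len(cell) > MAX_LEN:
                problems.append((i, "cell longer than 250 characters"))
            if "   " in cell:  # long runs of spaces suggest a forced line break
                problems.append((i, "spaces used to force a new line"))
    return problems

rows = [
    ["Taught NU-301 and NU-412.", "Average evaluation 4.2"],
    ["x" * 300, "ok"],  # too long: would be truncated on import
]
print(check_rows(rows))  # -> [(2, 'cell longer than 250 characters')]
```

A check like this could run before the manual review step, so the reviewer only inspects rows that were actually flagged.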
Technical Details for Creating the Form
The next section of the article discusses some of the technical details used to create the form.
Developing the Rubric
The rubric development was a crucial aspect of the creation of the form. The rubric was developed and reviewed over several months. The general process was:
- Criterion areas were decided based on the SON’s Promotion and Tenure Guidelines.
- A rubric for each criterion was drafted. A 4-point scale was adopted (Exemplary, Professional, Improvement Required, Unsatisfactory Performance).
- Most criteria included a Not Applicable option. This allowed for flexibility in how the form was used based on tenure-track vs. non-tenure-track faculty and undergraduate vs. graduate faculty.
- The criteria were grouped into the general areas of teaching, scholarship, service, and other.
- The rubric was reviewed several times before it was finalized.
Creating a Faculty Annual Evaluation Form in Excel
The form uses some advanced features of Excel, and the general process for creating the form is outlined below.
The form used radio buttons. To use radio buttons in Excel, the Developer tab must be enabled; it is not displayed by default, but it can be added to the ribbon:
1. On the File tab, go to Options: Customize Ribbon.
2. Under Customize the Ribbon and under Main Tabs, select the Developer check box.
Next:
1. Opened a blank Excel sheet.
2. Created a summary sheet.
3. Created tabs for Goal Status, New Goals, and Review Comments.
4. For each criteria area:
a. Added the rubric at the top.
b. Created a set of radio buttons under each criterion in the rubric.
c. Added lines for Support for Evaluation.
5. Added tabs for additional requirements or other
a. Sending current CV to SON Coordinator
b. Providing APA references for scholarship
c. Providing details for Grants
d. Completing assigned Office 365 Planner tasks
e. Any other faculty accomplishments
6. Added hyperlinks to ease navigation
a. At the top of each tab, added a Return to Summary link
b. On the Summary page, added hyperlinks from the criteria category to each spreadsheet tab.
7. Added a hidden spreadsheet to associate button values with descriptions (Figure 9). Added logic and formulas to display descriptions on the Summary sheet corresponding to the button value selected for each criteria area.
8. Added a hidden spreadsheet to facilitate drop-down lists for faculty name and academic year.
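The hidden lookup sheet in step 7 behaves much like a VLOOKUP table: each radio-button group resolves to a numeric value, which a formula maps to the rubric label displayed on the Summary sheet. A minimal Python sketch of the equivalent logic (the value-to-label mapping is inferred from the rubric's 4-point scale; treating 0 as "no selection" is a hypothetical convention, not the form's documented behavior):

```python
# Plain-Python sketch of the lookup performed by the hidden sheet:
# a radio-button group yields a numeric value, and a VLOOKUP-style
# formula maps that value to the rubric label on the Summary sheet.

RATING_LABELS = {
    4: "Exemplary Performance",
    3: "Professional Performance",
    2: "Improvement Required",
    1: "Unsatisfactory Performance",
    0: "Not Applicable",  # hypothetical: stands in for "no selection"
}

def summary_label(button_value):
    """Map a radio-button group value to its rubric description."""
    return RATING_LABELS.get(button_value, "Not Applicable")

# Example: selections made on three criteria tabs
selections = {"Course Delivery": 3, "Course Design": 4, "Grants": 0}
for criterion, value in selections.items():
    print(f"{criterion}: {summary_label(value)}")
```

Keeping this mapping on a single hidden sheet means the labels can be revised in one place without touching the formulas on the Summary sheet.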
Integrating the Form with Our Database
This form was designed to integrate with an Access database.
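As a rough sketch of what this integration involves, the example below stages a few rows (as the hidden import sheets do) and inserts them into a database table. Python's built-in sqlite3 is used as a stand-in, since connecting to Access itself requires an ODBC driver; the table and column names are hypothetical, not the SON's actual schema:

```python
# Sketch of the import step: rows staged on a hidden import sheet are
# inserted into the database. sqlite3 stands in for Access here, and
# the FacultyEvaluation schema below is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE FacultyEvaluation (
           faculty_name TEXT,
           academic_year TEXT,
           criterion TEXT,
           rating INTEGER,
           support TEXT
       )"""
)

# Each tuple mirrors one transposed row from a hidden import sheet.
staged_rows = [
    ("J. Doe", "2023-2024", "Course Delivery", 3, "Weekly announcements"),
    ("J. Doe", "2023-2024", "Course Design", 4, "Redesigned NU-301"),
]
conn.executemany(
    "INSERT INTO FacultyEvaluation VALUES (?, ?, ?, ?, ?)", staged_rows
)
conn.commit()

# Once the rows are in the database, longitudinal reporting reduces to
# ordinary SQL aggregation, e.g. average rating per criterion:
for row in conn.execute(
    "SELECT criterion, AVG(rating) FROM FacultyEvaluation GROUP BY criterion"
):
    print(row)
```

With the data in tables like this, trend reports of the kind shown in Figures 7 and 8 become straightforward queries rather than manual tabulation.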
1. The database was designed (Figure 10 - Database Design in Access).
2. Values were determined.
3. Several hidden sheets were created. Each corresponded to one of the tables in the database design:
a. Report Summary
b. Faculty Evaluation Import
c. Faculty Accomplishment Import
d. Goal Status Import
e. New Goals Import
4. Formulas were used to transpose data from each sheet into the hidden sheets for easy import into the database.
Establishing an Annual Process for Faculty to Follow
The annual process used by each faculty member evolved. The final process for faculty was documented in the form:
- Enter your name – use the drop-down.
- For each criteria area, click on the link or the tab at the bottom of the sheet.
- Review the rubric.
- Provide support for your evaluation of the criteria.
- Select a self-evaluation score. (Don't forget the CV and other tabs at the end.)
- Click on the link to return to this Summary page. Continue for each criteria area.
- Update the status of previous goals.
- Establish new goals for the upcoming academic year.
- Send CV to SON Coordinator.
- Send the completed form to your immediate supervisor - DUE BY MAY 31
- Your supervisor will review and add comments.
- Meet with your supervisor. During your review meeting with your supervisor, updates can be made.
- Add final response comment, if desired.
- Send the completed form to the database administrator - DUE BY AUGUST 15
- A copy of the final summary report will be put in your official faculty folder.
- A complete final report will be sent to you by the database administrator for your records.
Faculty Annual Evaluation Form Template
The Faculty Annual Evaluation template is available for download. Download the form before using it; the radio buttons do not work in the Office 365 web view.
Appendix- Faculty Annual Evaluation Rubric
The Faculty Annual Evaluation Rubric is designed to promote formative professional growth. The rubric below is flexible. There may be some criteria that do not apply to all faculty positions or an individual faculty member’s scholarship plan. The rubric is integrated into the SON Faculty Annual Evaluation Form. In the SON Faculty Annual Evaluation form, each faculty member self-evaluates their performance for each criterion and documents support for the selected evaluation rating. If a criterion does not apply, the faculty member selects ‘Not Applicable.’
* Select only those areas that reflect your scholarship plan – it is not expected that all areas will be addressed during each evaluation period.

Teaching

Academic Program Planning/Curriculum Development
- Exemplary Performance: Takes a leadership role in ongoing program planning and curriculum development; ensures courses meet objectives for the overall program plan; and develops student learning activities to achieve course outcomes.
- Professional Performance: Actively participates in ongoing program planning and curriculum development; ensures courses meet objectives for the overall program plan; and develops student learning activities to achieve course outcomes.
- Improvement Required: Participates minimally in ongoing program planning and curriculum development; courses need improvement to meet the objectives for the overall program plan; or student learning activities are not linked clearly with course outcomes.
- Unsatisfactory Performance: Does not participate in ongoing program planning and curriculum development and course/program outcomes are not achieved.

Content Expertise (Requires peer review to confirm – if no peer review was done, select “Improvement Required”)
- Exemplary Performance: Demonstrates personal scholarship/expert knowledge in content areas by incorporating key topics, supporting instruction with empirical evidence, presenting the latest advances, connecting relevant information to expected outcomes/roles, and grounding learning activities in real-world experience; this level of performance is confirmed by peer review evaluation.
- Professional Performance: Demonstrates knowledge in content areas by incorporating key topics, supporting instruction with empirical evidence, presenting the latest advances, connecting relevant information to expected outcomes/roles, and grounding learning activities in real-world experience; this level of performance is confirmed by peer review evaluation ratings.
- Improvement Required: Demonstrates knowledge in content areas that is deficient in one of the following areas: focus on key topics, empirical evidence support, latest advances related to topics, relevance to expected outcomes/roles, or grounding of learning activities in real-world experience; this level of performance is supported by peer review evaluation ratings, or no peer review was done.
- Unsatisfactory Performance: Demonstrates knowledge in content areas that is deficient in multiple areas, and this level of performance is confirmed by peer evaluation.

Course Delivery
- Exemplary Performance: Initiates active engagement with students throughout each course; provides constructive, relevant, and frequent feedback to students within published time frames; maintains a positive learning environment.
- Professional Performance: Engages and maintains availability with students throughout each course; provides constructive and relevant feedback to students within published time frames; maintains a positive learning environment.
- Improvement Required: Complaints from two or more students in a course concerning instructor availability and/or engagement; feedback is not considered to be constructive, relevant, or timely; or the learning environment is not considered to be positive.
- Unsatisfactory Performance: Multiple complaints from students concerning instructor responsiveness/engagement; or significant concerns about the content or timeliness in receiving feedback.

Course Design
- Exemplary Performance: Looks for opportunities to make continual course improvements; and finds/uses resources to make changes and incorporate innovative learning strategies. Maintains an average score of 4.0 or above across all Course categories: Course Design, Clinical Course Design, and Overall Course Experience.
- Professional Performance: Recognizes the need for continual course improvements; and uses resources to make changes or to pilot new learning strategies. Maintains an average score of 3.5 to 3.9 across all Cours