Outcomes Assessment Plan

Radiologic Technology Program

WAYNE STATE UNIVERSITY in partnership with HENRY FORD HOSPITAL

Spring 2015 – Winter 2016

Our mission is to prepare graduates who are competent in the art and science of radiography. Our ultimate goal is to enhance the quality of cost-effective health care for all persons. We strive to instill in our graduates the conviction that patients come first, providing each patient, in a multicultural environment, the quality of care and comfort that we want for our families and ourselves.

Goal 1: The student/graduate will demonstrate problem-solving and critical thinking skills in the clinical arena.
SLO 1: The student/graduate will manipulate technical factors for non-routine examinations.

Tool 1: Clinical Rotation Evaluation Form (CREF), Question 20 – Emergency Room rotation
Benchmark: ≥ 4.0 (5.0 scale)
Time Period: Second year of program, during the Emergency Room rotation (semesters 4-6)
Person Responsible: Clinical Coordinator; Student Contact Technologists
Results: 2016: 4.6 ave.; 2015: 4.4 ave.; 2014: 4.3 ave.; 2013: 4.2 ave.; 2012: 4.2 ave.

Tool 2: Clinical Rotation Evaluation Form (CREF), Question 20 – Surgical C-Arm rotation
Benchmark: ≥ 3.5 (5.0 scale)
Time Period: First year of program, during the C-Arm rotation (semesters 1-3)
Person Responsible: Clinical Coordinator; Student Contact Technologists
Results: 2016: 4.2 ave.; 2015: no data returned; 2014: no data returned; 2013: no data; 2012: no data

SLO 2: The student/graduate will be able to adapt positioning for trauma patients.

Tool 1: Clinical Rotation Evaluation Form (CREF), Question 18 – Emergency Room rotation
Benchmark: ≥ 4.0 (5.0 scale)
Time Period: Second year of program, during the Emergency Room rotation (semesters 4-6 of 6)
Person Responsible: Clinical Coordinator; Student Contact Technologists
Results: 2016: 4.6 ave.; 2015: 4.4 ave.; 2014: 4.4 ave.; 2013: 4.2 ave.; 2012: 4.3 ave.

Tool 2: Clinical Education Final Exams – questions related to the Emergency Room rotation
Benchmark: 80% or higher
Time Period: First year of program, during the Emergency Room rotation (semesters 1-3 of 6)
Person Responsible: Clinical Coordinator; Student Contact Technologists
Results: 2016: 65%*; 2015: no data; 2014: no data; 2013: no data; 2012: no data

Action/Analysis:

SLO 1 – Tool 1: Benchmark met. This measurement tool seems to be a good measure of the students’ ability to set technical factors for non-routine exams. The benchmark was raised during this cycle.

SLO 1 – Tool 2: *New* tool implemented during current assessment cycle. Benchmark met. The benchmark was set lower than for Tool 1 because first-year students have less technical experience than second-year students. We will monitor scores in the next assessment cycle to determine whether changes need to be made.

Note for SLO 1: While Tools 1 and 2 appear identical, they are applied to different rotations in different years; Program Officials want to measure the growth in skills over the course of the program.

SLO 2 – Tool 1: Benchmark met. This tool appears to be a good measure of the students’ ability to adapt positioning for trauma patients during non-routine exams. The benchmark was raised during this cycle.

SLO 2 – Tool 2: *New* tool implemented during current assessment cycle. Benchmark not met. At the end of each clinical semester, each student takes a 30-question final exam that is worth 20% of the semester grade for the clinical course. Each exam is tailored to the individual student, based on the competencies completed and the rotations visited. It is more didactic in nature, since the material is tied to the ARRT standards for positioning rather than to specific protocols. *Some students struggle with the wording of the questions, especially in the first semester when they are new to the program. Additionally, the cohort is smaller than recent cohorts (only 11 students), so fewer answered questions were available to measure. Program Officials are confident that progress can be made in the next assessment cycle.

Note for SLO 2: The measurement tools assess didactic progress (Tool 2) in the first year and clinical progress (Tool 1) in the second year. Students’ clinical skills in the Emergency Room (at our Level One Trauma Center) develop to proficiency in the second year: during the first year students focus on achieving competency, and with time and practice they become proficient during the second year, prior to program completion.

Program Officials continue to be concerned by the negligible return rate for graduate surveys, which also provide written permission to contact the graduate’s current employer (and would thereby supply data for the Employer Survey). The program recognizes the importance of surveying employers to gauge their satisfaction with graduates’ work performance; however, it does not want to rely on data that has proven difficult to obtain when assessing student learning outcomes. The program is investigating a more general employer survey that does not require graduate permission. While we will continue to address the lack of graduate and employer survey returns, we do not want to keep this currently ineffective measurement tool in the Outcomes Assessment Plan. We will continue to meet with the College’s Alumni Engagement officers while we investigate other survey methods, and when survey data can again be collected reliably, we will consider adding the tool back into the plan. In the meantime, the program will actively pursue methods to capture this important feedback from graduates and their employers.

Goal 2: The student/graduate will demonstrate clinical competence, producing images of diagnostic quality and enhancing the delivery of health care to the community.
SLO 1: The student/graduate will provide quality patient care.

Tool 1: Clinical Rotation Evaluation Form (CREF), Question 4
Benchmark: ≥ 4.0 (5.0 scale)
Time Period: Second year of program (semesters 4-6)
Person Responsible: Clinical Coordinator; Student Contact Technologists
Results: 2016: 4.6 ave.; 2015: 4.5 ave.; 2014: 4.4 ave.; 2013: 4.4 ave.; 2012: 4.5 ave.

Tool 2: ARRT Registry Exam mean scaled section score (Section E: Patient Care and Education)
Benchmark: ≥ 8.5 (9.9 scale)
Time Period: Annually, 1 year post-program completion
Person Responsible: Program Director
Results: 2016: 8.8; 2015: no data returned; 2014: no data returned; 2013: no data returned; 2012: no data returned

SLO 2: The student/graduate will recognize errors and appropriately define necessary corrections.

Tool 1: Completion and critical evaluation of three radiographic procedure simulations
Benchmark: 85% or higher
Time Period: Fall Semester, First Year (semester 2 of 6)
Person Responsible: Procedures 1 Instructor
Results: 2016: 96% ave.; 2015: 96% ave.; 2014: 92.5% ave.; 2013: 95% ave.; 2012: 92.5% ave.; 2011: 98% ave.

Tool 2: Image Critique section of the Seminar Course – final exam score
Benchmark: 85% or higher
Time Period: Winter Semester, Second Year (semester 6 of 6)
Person Responsible: Image Critique Instructor
Results: 2016: 92% ave.; 2015: no data returned; 2014: no data; 2013: no data; 2012: no data

Action/Analysis:

SLO 1 – Tool 1: Benchmark met. The benchmark was raised for this cycle. With the launch of our E*Value online evaluation system (Spring 2009), the Advisory Committee further standardized questions across clinical areas; consequently, this question is now part of the Clinical Rotation Evaluation Form (CREF) for every clinical area, and data have been available for all clinical areas since 2010.

SLO 1 – Tool 2: *New* tool implemented during current assessment cycle. Benchmark met. Once the ARRT Registry Exam is updated, the section letter referenced here will be updated as well (during the 2017-2018 cycle).

SLO 2 – Tool 1: Benchmark met. Program Officials are satisfied with students’ clinical competence, problem solving and creative thinking abilities. Students are able to recognize errors in positioning and formulate the appropriate corrective action to modify patient and technical considerations to improve the resultant diagnostic image. The Procedures 1 instructor is pleased with the students’ progress during the semester.

SLO 2 – Tool 2: *New* tool implemented during current assessment cycle. Benchmark met. Image Critique is part of a semester-long course that meets during the last semester of the program (semester 6 of 6). The final exam for the Image Critique section consists of radiographic images that students must evaluate for errors in position and technique (among others), and specify corrections.

Goal 3: The graduate will be able to demonstrate effective written and oral communication skills.
SLO 1: The student will demonstrate effective written communication skills.

Tool 1: Rotational Summary / Journal Submission
Benchmark: ≥ 4.0 (5.0 scale)
Time Period: Upon completion of each clinical rotation (24 total); every 5 weeks during the first year and every 3 weeks during the second year (semesters 1-6)
Person Responsible: Program Director
Results: 2016: 4.8 ave.; 2015: 4.8 ave.; 2014: 4.6 ave.; 2013: 4.5 ave.; 2012: 4.4 ave.

Tool 2: Research paper – 3,000 words, APA style; Independent Study (RDT 4800)
Benchmark: 82% or higher
Time Period: Annually, during the 5th semester (semester 5 of 6)
Person Responsible: Course Faculty; Program Director; Medical Advisor
Results: 2016: 90% ave.; 2015: 93% ave.; 2014: 87% ave.; 2013: 88% ave.; 2012: 84% ave.

Tool 3: Research paper – APA style; Radiation Biology (RDT 3200)
Benchmark: 80% or higher
Time Period: Annually, during the 1st semester (semester 1 of 6)
Person Responsible: Program Director
Results: 2016: 82% ave.; 2015: 88% ave.; 2014: 85% ave.; 2013: 86.5% ave.; 2012: 83% ave.

SLO 2: The student/graduate will demonstrate effective oral communication skills.

Tool 1: Pathophysiology reports/presentations (to program faculty and classmates)
Benchmark: 80% or higher
Time Period: Winter Semester, First Year (semester 3 of 6)
Person Responsible: Course Faculty; Program Director
Results: 2016: 90% ave.; 2015: 92% ave.; 2014: 88% ave.; 2013: 90% ave.; 2012: 87% ave.

Tool 2: Clinical Rotation Evaluation Form (CREF), Questions 1-2, all rotations
Benchmark: ≥ 3.5 (5.0 scale)
Time Period: Annually, all rotations (semesters 1-6)
Person Responsible: Clinical Coordinator; Student Contact Technologists
Results: 2016: 4.4 ave.; 2015: 4.4 ave.; 2014: 4.3 ave.; 2013: 93% ave.; 2012: 96% ave.

Tool 3: Radiation Biology presentations (to program faculty and classmates)
Benchmark: 80% or higher
Time Period: Spring/Summer Semester, First Year (semester 1 of 6)
Person Responsible: Course Faculty; Program Director
Results: 2016: 91%; 2015: no data returned; 2014: no data; 2013: no data; 2012: no data

Action/Analysis:

SLO 1 – Tool 1: Benchmark met. The benchmark was raised during this cycle. Students continue to show progress in informal writing. Incoming students are generally better prepared and more successful academically than in previous years, in part due to Wayne State University’s increased entrance requirements.

SLO 1 – Tool 2: Benchmark met. The benchmark was raised during this cycle, and the goal is consistently met. Program Officials would like to see continued improvement in writing skills from current students. A workshop is provided to students by the University Writing Center, which has also expanded its hours for student assistance. This course is the program’s university-designated Writing Intensive course.

SLO 1 – Tool 3: Benchmark met. First-year students are instructed in database searches for relevant materials by Henry Ford Hospital Sladen Library personnel. A workshop addressing the skills necessary for college-level writing is also provided by the Wayne State University Writing Center; it has been presented to the Class of 2012 and subsequent cohorts. The current scores are lower than those of previous cohorts; the program will discuss any additional writing assistance the students feel is necessary as they prepare for the Independent Study course (Tool 2).

SLO 2 – Tool 1: Benchmark met. See the response for Goal 2 – SLO 2 – Tool 1.

SLO 2 – Tool 2: Benchmark met. Program Officials are satisfied with the effectiveness of the measurement tool as discussed during the Advisory Committee meeting (June 2016). The program will also monitor this tool to see if it should be split into two tools (one for first year students and one for second year students).

SLO 2 – Tool 3: *New* tool implemented during current assessment cycle. Benchmark met. Because the return rate of graduate and employer surveys has been negligible, this tool replaced the survey-based measure during the current assessment cycle. The program will continue to investigate methods to improve the survey return rate; this was discussed at both the Advisory Committee meeting and the JRCERT Outcomes Assessment Seminar in July.

Goal 4: The student/graduate will exhibit professionalism.
SLO 1: Students/graduates will demonstrate ethical professional behavior and sound professional judgment.

Tool 1: Assignment – reflection paper on a current medical ethics dilemma that requires sound professional judgment
Benchmark: 85% or higher
Time Period: Annually, Winter Semester, Second Year
Person Responsible: Jurisprudence (RDT 4900) Faculty
Results: 2016: 100% ave.; 2015: 93% ave.; 2014: 76%*; 2013: no data returned; 2012: no data returned

Tool 2: Clinical Rotation Evaluation Form (CREF), Question 11
Benchmark: ≥ 4.0 (5.0 scale)
Time Period: Second year of program
Person Responsible: Clinical Coordinator; Student Contact Technologists
Results: 2016: 4.5 ave.; 2015: 4.3 ave.; 2014: 4.3 ave.; 2013: 4.3 ave.; 2012: 4.4 ave.

SLO 2: Students/graduates will participate in professional activities which promote professional development and lifelong learning.

Tool 1: Program Records & Verification Forms – students will attend at least one professionally oriented seminar/conference
Benchmark: 50% or higher
Time Period: Annually
Person Responsible: Program Director
Results: 2016: 100%; 2015: 100%; 2014: 62.5%; 2013: 93%; 2012: 62.5%

Tool 2: Program Records – students will complete additional advanced certification in either mammography or computed tomography in conjunction with the general diagnostic certificate in radiography
Benchmark: 35% or higher
Time Period: Annually
Person Responsible: Program Director
Results: 2016: 62.5%; 2015: 33%*; 2014: 62.5%; 2013: 53%; 2012: 50%

Tool 3: Directed Reading Completion – each student will complete a minimum of two radiology-related directed readings
Benchmark: 80% or higher
Time Period: Annually; Spring Semester, First Year and Fall Semester, Second Year
Person Responsible: Program Director
Results: 2016: 99% / 98% ave.; 2015: 96% / 0% ave.*; 2014: 100% / 98% ave.; 2013: 100% / 98% ave.; 2012: 93% / 96% ave.

Action/Analysis:

SLO 1 – Tool 1: Benchmark met. The past two cohorts have been very engaged in both in-class and online discussions and responsive to course faculty. For 2014 (flagged with an asterisk above), the benchmark was not met because one student of the nine failed to complete the assignment, which brought down the average; the average for the remaining eight students was 86%.
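
As a quick arithmetic check of that explanation (assuming, as a simplification on our part, that the incomplete assignment counted as a zero), the cohort average works out to

\[
\frac{8 \times 86\% + 1 \times 0\%}{9} \approx 76\%,
\]

which matches the 76% reported in the table above.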

SLO 1 – Tool 2: Benchmark met. The benchmark was raised during this cycle.

SLO 2 – Tool 1: Benchmark met. During the professional program, students attend a variety of professional offerings, including a Registry Review Seminar presented by Dr. C. William Mulkey and the MSRT, the Henry Ford Hospital Allied Health Interdisciplinary Education Series (AHIES), and student sessions at the Radiological Society of North America (RSNA) Annual Meeting. The Advisory Committee felt that attending seminars as a student may increase the likelihood that the graduate will remain engaged in professional development as a technologist.

SLO 2 – Tool 2: Benchmark met. Program Officials increased the benchmark. Students are interested in pursuing advanced certification, and the advanced modality areas have been supportive.

SLO 2 – Tool 3: Benchmark met. During the last assessment cycle, a replacement directed reading was added to a course, since the second reading had been dropped in the previous cycle.

Program Effectiveness Measures (PEM)
PEM 1: Students will pass the national certification examination on the 1st attempt.
Measurement Tool: National Certification Exam 1st-time pass rates within six (6) months
Benchmark: 80% or higher each year; 5-year average of 80% or higher
Timeframe: 6 months post-graduation (or upon completion by all)
Person Responsible: Program Director
Results: 2015: 93%; 2014: 100%; 2013: 100%; 2012: 100%; 2011: 100%; 5-yr avg = 99%

PEM 2: Students seeking employment will be gainfully employed within 6 months post-graduation.
Measurement Tool: Graduate Survey (demographic information, Question 3) or “word of mouth”
Benchmark: 80% or higher yearly
Timeframe: 6 months post-graduation (or upon completion by all)
Person Responsible: Program Director
Results: 2015: 100%; 2014: 100%; 2013: 100%; 2012: 100%; 2011: 100%; 5-yr avg = 100%

PEM 3: Students will complete the program.
Measurement Tool: Graduation roster
Benchmark: 75%
Timeframe: End of program
Person Responsible: Program Director
Results: 2015: 100%; 2014: 75%; 2013: 100%; 2012: 82%; 2011: 93%; 5-yr avg = 92% (66/72 students over 5 years)

PEM 4: Graduates consider themselves adequately prepared.
Measurement Tool: Graduate Survey, Question 17
Benchmark: ≥ 3.5 (4-point scale)
Timeframe: 6-8 months post-graduation
Person Responsible: Program Director
Results: 2014: no data returned; 2013: 3.6; 2012: 3.7; 2011: 3.9; 2010: 3.8; 5-yr avg = 3.78

PEM 5: Employers will be satisfied with the patient care skills of newly hired technologists.
Measurement Tool: Employer Survey of Graduate Radiographers, Question 9
Benchmark: ≥ 3.5 (5-point scale)
Timeframe: 12 months post-graduation (or upon completion by all)
Person Responsible: Program Director
Results: 2014: no data; 2013: no data; 2012: no data; 2011: no data; 2010: 4.5; 5-yr avg = 4.5

Action/Analysis:

PEM 1: Benchmark met. The benchmark was increased during the current assessment cycle. Students are encouraged to sit for the national registry exam without delay after program completion. Additionally, the college has invested in an online radiography review program (NorthStar Learning) that faculty and students can access for one year, beginning with the start of the second year of the professional program. Program Officials monitored data on the effectiveness of the online review program and continue its use. The program has also made use of a HESI exit exam (Elsevier) during the last semester of the program. Students have responded well to the online exam, as it closely simulates the registry exam.
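
For reference, the five-year average first-attempt pass rate reported in the table works out to

\[
\frac{93 + 100 + 100 + 100 + 100}{5} = 98.6\% \approx 99\%.
\]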

PEM 2: Benchmark met. Of the program graduates, all were either employed or “not actively seeking employment” (JRCERT guidelines) within one year of graduation. The economy in metro Detroit has improved, leading to more open positions for entry-level radiographers.

PEM 3: Benchmark met. Entrance requirements were amended so that only students who have completed all prerequisite courses may enter the program. Unfortunately, even with the amended entrance requirements, students are not always adequately prepared financially or academically, and these issues cannot be reliably predicted during an interview. For the Class of 2014, twelve students started the program: eight graduated on time, one additional student graduated within 150% of the program length (one additional year), and three did not graduate. Of those three, one moved back to his home country due to family pressure, and two withdrew for personal reasons and/or academic performance issues.
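
As a worked check of the completion figures reported in the table above:

\[
\text{Class of 2014: } \frac{8 + 1}{12} = 75\%, \qquad
\text{five-year: } \frac{66}{72} \approx 92\%.
\]

Both values match the graduation-roster results.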

PEM 4: Benchmark met. The program continues to investigate more effective ways of surveying graduates to increase data collection.

PEM 5: Benchmark met. The program continues to work with the College, and to gather feedback from other programs, on more effective methods of survey return and data collection. We are also investigating a more general employer survey (covering a group of employees who are graduates of the program) rather than the formal method of one survey per graduate.