Annual Department Assessment Plan Summary

Department: Newman Division of Nursing
Date: March 5, 2008

Part 1: Outcomes

1. Synthesize empirical and theoretical knowledge from nursing and the arts, sciences and humanities to practice professional nursing.

A. Develop and use higher-order problem-solving and critical thinking skills in developing, implementing, and evaluating nursing interventions.

B. Synthesize evidence-based, theoretical, and empirical findings from nursing, the arts, sciences, and humanities as appropriate in identifying, framing, and making nursing-related decisions across a variety of health care situations.

2. Demonstrate values central to professional nursing practice within the frameworks of legal, ethical, and professional standards.

A. Understand and apply the concept of caring in the practice of nursing across the health care continuum and in all health care settings.

B. Demonstrate knowledge of legal, ethical, and professional standards when identifying, framing, and making nursing-related decisions.

C. Demonstrate the ability to collaborate with Person and with other health care professionals in meeting the diverse health care needs of Person.

D. Advocate for Person within the health care delivery system.

E. Assume responsibility and accountability for own actions in the practice of nursing.

F. Assume responsibility for life-long learning and plan for professional career development.

3. Use leadership skills and knowledge to develop in the role of a professional nurse.

A. Demonstrate knowledge of health care systems and the factors that shape organizations and environments in which nursing and health care are delivered when identifying, framing, and making nursing-related decisions with Person.

B. Understand the profession of nursing and participate in the political and regulatory processes that shape the health care delivery system.

C. Demonstrate knowledge of the professional nurse’s role in designing, coordinating, and managing nursing care, applying outcome-based practice and the skills of communication, collaboration, negotiation, delegation, coordination, and evaluation of interdisciplinary work.

D. Demonstrate the knowledge and skills needed to be a member of interdisciplinary health care teams and to support agendas that enhance high-quality, cost-effective health care.

4. Provide professional nursing care to promote health, reduce risk, prevent disease, and manage illness and disease.

A. Demonstrate knowledge of factors that promote, protect, and predict the health of Persons when delivering and evaluating nursing care across the lifespan and health care continuum.

B. Synthesize knowledge of pharmacology, pathophysiology, health assessment, and nursing interventions in the identification, management, and evaluation of signs and symptoms of Person when delivering nursing care across the lifespan and health care continuum and in a variety of health care settings.

C. Synthesize knowledge of the biopsychosocial, cognitive, and spiritual aspects of Person when delivering nursing care across the lifespan and health care continuum.

5. Demonstrate technical skills and communication methods necessary to deliver professional nursing care.

A. Demonstrate appropriate, effective use of communication skills (i.e., nonverbal, listening, oral, written, electronic) when delivering nursing care to Person across the lifespan and health care continuum and in a variety of health care environments.

B. Demonstrate the ability to modify communication methods in response to cultural or special needs of Person when delivering nursing care.

C. Demonstrate the ability to access and document health-related information in a timely manner.

D. Perform a wholistic assessment of Person across the lifespan and health care continuum and in a variety of health care environments, and use assessment findings to diagnose, plan, deliver, and evaluate quality care.

E. Demonstrate appropriate, safe, and efficient technical skills and use of technology when delivering nursing care to patients across the lifespan and health care continuum and in a variety of health care environments.

6. Demonstrate knowledge of human diversity in a global community while in the role of a professional nurse.

A. Synthesize knowledge of human diversity, cultural values, and global health practices when providing culturally sensitive care to Person.

B. Demonstrate knowledge of the impact of globalization on health care systems, policy, modalities, and practices when in the role of a professional nurse.

Part 2: Assessment Planning Charts

1. Direct Measures: Evidence, based on student performance, that demonstrates actual learning (as opposed to surveys of “perceived” learning or program effectiveness). See the “Assessment Type” chart at the end of this document for a list of potential assessment types and their definitions. Note that one objective may be covered by more than one assessment, and one assessment may cover more than one objective.

The following are examples of the assessment data collected in the NDN. Complete data are on file in the NDN and can be provided as needed. (An illustrative sketch of how benchmark figures such as those below can be trended appears after this chart.)

Each entry below gives: Objective(s) #; Assessment(s); Type # (see chart); Data/Results; and Action Taken/Recommendations (if necessary).
Objective(s): 1
Assessment (Type 3): ATI Comprehensive Exam (total scale scores, individual-level data)
Data/Results: Percent of students meeting the NDN benchmark (.99 predictability of passing the NCLEX-RN): 2006 (N = 32), 48%; 2007 (N = 32), 55%. The number of students achieving the benchmark improved from 2006 to 2007.
Action/Recommendations: Recommend continuing ATI Comprehensive testing and trending the data over the next 2 years, with a goal of 75% of students meeting the benchmark by 2009. Also recommend ongoing assessment and review by course faculty and the curriculum committee to identify additional strategies to strengthen student learning outcomes, and strengthening of content areas and content delivery methods based on the ATI Comprehensive outcomes.
Objective(s): 1
Assessment (Type 3): ATI Comprehensive Exam (critical thinking subscale scores, individual-level data)
Data/Results: 2006 to 2007 results: Interpretation, 40% to 42%; Analysis, 56% to 74%; Evaluation, 56% to 81%; Inference, 78% to 71%; Explanation, 28% to 19%. Improvement was seen in analysis and evaluation; however, student performance in interpretation and explanation remains below the benchmark (.99 probability). Inference remained above the benchmark for both testing periods.
Action/Recommendations: Explore use of the ATI Critical Thinking Assessment. Also recommend ongoing assessment and review by course faculty and the curriculum committee to identify additional strategies to strengthen student learning outcomes.
Note: The ATI Critical Thinking Assessment will begin with the August 2008 admission class. This will provide more critical thinking data early in a student’s progression through the curriculum and will allow more comparison data between first-semester sophomore and second-semester senior students.
Objective(s): 1
Assessment (Types 1 & 5): NU 490 Leadership Paper (NU 490 Writing Rubric)
Data/Results: Benchmark: 100% of students at 70% or above. 2006: 32/32 students; 2007: 31/32 students.
Action/Recommendations: Recommend the development/implementation of an NDN-wide rubric for written papers.
Objective(s): 1, 2, 3, 4, 5, & 6
Assessment (Types 1 & 5): NU 491 Clinical Practicum (NU 491 Clinical Evaluation Tool)
Data/Results: Benchmark: 100% of students successfully complete the NU 491 practicum and meet all course objectives. 2006: 100%; 2007: 97%.
Action/Recommendations: To facilitate measurement of outcome data, recommend exploring the development of a clinical evaluation tool, used across the curriculum, that utilizes a multidimensional scale.
Objective(s): 2 & 3
Assessment (Type 3): ATI Comprehensive Exam (Leadership subscale scores, individual-level data)
Objective(s): 2
Assessment (Type 3): ATI Leadership Content Mastery Exam (Quality and Legal/Ethical Issues subscale scores, individual-level data)

Objective(s): 3
Assessment (Type 3): ATI Leadership Content Mastery Exam (total scale scores, individual-level data)
Objective(s): 4
Assessments (Type 3): ATI Comprehensive Exam subscale scores (individual-level data) for: Nursing Process; Health Promotion; Management of Care; Pharmacology; Physiological Adaptation; Reduction of Risk; Basic Care and Comfort; Psychosocial Integrity; and Safety and Infection Control.
Objective(s): 5
Assessment (Type 3): ATI Comprehensive Exam (Therapeutic Nursing Interventions subscale scores, individual-level data)
Data/Results: Benchmark: .99 predictability of passing the NCLEX-RN. 2006: 53% of senior students; 2007: 52% of senior students.
Action/Recommendations: Recommend evaluation of current TNIs by the curriculum committee, and explore competency evaluation in key courses as determined through curriculum review. In 2007, adult and pediatric simulators were purchased and faculty were educated on the use of simulation. Use of simulation should increase student learning in TNIs and critical thinking.
Objective(s): 5
Assessment (Type 3): ATI Comprehensive Exam (Communication subscale scores, individual-level data)
Objective(s): 1, 4, 5, & 6
Assessment (Type 3): NCLEX-RN pass rate
Data/Results: Benchmark: at least 80%, or the state/national average. 2001: 81%; 2002: 83.90%; 2003: 84%; 2004: 65.52%; 2005: 82.14%; 2006: 76.67%; 2007: 76.67%. Some improvement since the drop in 2004; however, still not at benchmark.
Action/Recommendations: See the explanation given in Part 4: Summary, Level B, Program Improvement.
Objective(s): 1, 2, & 4
Assessments (Type 3): ATI Content Mastery Exams in Fundamentals of Nursing; Medical-Surgical Nursing Care; Pharmacology in Nursing Practice; Maternal/Newborn Nursing Care; Nursing Care of Children; Mental Health Nursing Care; Community Health Nursing Care; and Nursing Leadership.
Action/Recommendations: See additional information included in other sections of Part 4: Summary regarding the use of formative measures and the development of “thumb prints.”
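The benchmark comparisons reported in the chart above reduce to simple arithmetic: each year’s result is compared against a fixed threshold and flagged. For illustration only, the following minimal Python sketch shows one way to trend the NCLEX-RN pass-rate figures. The rates are copied from the chart; the variable names, the numeric encoding of the 80% benchmark, and the output format are assumptions made for this sketch and are not part of the NDN’s data management system.

```python
# Illustrative sketch only: trend yearly NCLEX-RN pass rates against the
# NDN benchmark. Rates are copied from the chart above; names and output
# format are assumptions, not the NDN's actual data system.

BENCHMARK = 0.80  # benchmark: at least 80% (or the state/national average)

pass_rates = {
    2001: 0.8100, 2002: 0.8390, 2003: 0.8400,
    2004: 0.6552, 2005: 0.8214, 2006: 0.7667, 2007: 0.7667,
}

for year in sorted(pass_rates):
    rate = pass_rates[year]
    status = "meets benchmark" if rate >= BENCHMARK else "below benchmark"
    print(f"{year}: {rate:.2%} ({status})")

# Summary line: how many reported years met the benchmark.
met = sorted(y for y, r in pass_rates.items() if r >= BENCHMARK)
print(f"Benchmark met in {len(met)} of {len(pass_rates)} years: {met}")
```

Run as written, the sketch flags 2001, 2002, 2003, and 2005 as meeting the benchmark (4 of 7 years), which matches the narrative in the chart.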
2. Indirect Measures: Reflection about the learning gained, or secondary evidence of its existence. Please refer to the “Assessment Type” chart at the end of this document.

Each entry below gives: Objective(s) #; Assessment(s); Type # (see chart); Data/Results; and Action Taken/Recommendations (if necessary).
Objective(s): 1, 2, 3, & 5
Assessment (Type 11): ESU/NDN Employer Satisfaction Survey
Data/Results: Overall, the data indicate employer satisfaction with graduates.
Action/Recommendations: Even though the data indicate satisfaction, the return rate is so low that there are questions about the validity of the data. Steps have been taken to increase the return rate; however, they are time intensive for a relatively small increase. Once the curriculum review is completed, the employer and graduate satisfaction surveys will be revised to reflect changes in the curriculum and the educational outcomes.
Objective(s): 1, 2, 3, & 5
Assessment (Type 11): ESU/NDN Graduate Satisfaction Survey
Data/Results: Overall, the data indicate graduate satisfaction with the program.
Action/Recommendations: See the note regarding the Employer Satisfaction Survey above.

Part 3: Evaluation Rubric for Assessment System

Rating scale: 1 = Beginning; 2 = Developing; 3 = At Standard; 4 = Above Standard. Each factor below lists the descriptors for ratings 1 through 4, separated by slashes.
Level A: Beginning Implementation
Professional standards and student learning outcomes / Development of the assessment system does not reflect professional standards/outcomes nor are the standards established by faculty and/or outside consultants. / Development of the assessment system is based on professional standards/outcomes, but the faculty and the professional community were not involved. / Development of the assessment system is based on professional standards/outcomes, and the faculty AND the professional community were involved. / Development of the assessment system is based on professional standards/outcomes, and the faculty AND professional community are engaged in continuous improvement through systematic (e.g., yearly) activities.
Faculty involvement / No faculty involvement is evidenced in department assessment activities. / Faculty involvement consists of one or two individuals who work on program assessment needs and activities. Little or no communication is established with other faculty or professionals. / Faculty involvement consists of a small core within the department, but input from other faculty and professionals about assessment issues is evidenced. / Faculty involvement is widespread throughout the program or department. All faculty within the department have contributed (and continue to contribute) to the use and maintenance of an assessment plan.
Assessment alignment / No alignment between faculty-identified learning outcomes and assessments is evidenced. / Alignment exists with some outcomes and assessments, but not others, OR the alignment is weak/unclear. / Alignment between outcomes and assessments is complete and clear. / Alignment between outcomes and assessments is complete, and courses are identified that address each outcome.
Level B: Making Progress in Implementation
Assessment structure / The assessment plan has only one of the following attributes:
1) multiple direct and indirect assessments are used.
2) assessments are used on a regular basis (i.e., not just given once to get initial data).
3) assessments provide comprehensive information on student performance at each stage of the program. / The assessment plan has only two of the following attributes: multiple; regular; and comprehensive at each stage. / The assessment plan has all of the following attributes: multiple; regular; and comprehensive at each stage. / The assessment plan has all necessary attributes and is embedded in the program (versus “added on”).
Data management / No data management system exists. / A data management system is in place to collect and store data but it does not have the capacity to store and analyze data from all students over time. / A data management system is in place that can store and process most student performance data over time. / A data management system is in place that can store and process all student performance data over time. Data are regularly collected and stored for all students and analyzed and reported in user-friendly formats.
Data collection points / Data are not collected across multiple points and do not predict student success. / Data are collected at multiple points, but there is no rationale regarding their relationship to student success. / Data are systematically collected at multiple points, and there is strong rationale (e.g., research, best practice) regarding their relationship to student success. / Data are systematically collected at multiple points and demonstrate a strong relationship between assessments and student success.
Data collection sources / Data are collected from applicants, students, and faculty, but not graduates or other professionals. / Data are collected from applicants, students, faculty, and graduates, but not other professionals. / Data are collected from applicants, students, recent graduates, faculty, and other professionals. / Multiple types of data are collected on/from applicants, students, recent graduates, faculty, and other professionals.
Program improvement / Data are generated only for external accountability reports (e.g., accreditation), are not used for program improvement, and are available only to administrators. / Some generated data are based on internal standards and used for program improvement, but are available only to administrators “as needed.” / An ongoing, systematic, objectives-based process is in place for reporting and using data to make decisions and improve programs within the department. / An ongoing, systematic, objectives-based process is in place for reporting and using data to make decisions and improve programs both within the department and university-wide.
Level C: Maturing Stages of Implementation
Comprehensive and integrated measures / The assessment system consists of measures that are neither comprehensive nor integrated. / The assessment system includes multiple measures, but they are not integrated or they lack scoring/cut-off criteria. / The assessment system includes comprehensive and integrated measures with scoring/cut-off criteria. / The assessment system includes comprehensive and integrated measures with scoring/cut-off criteria that are examined for validity and utility, resulting in program modifications as necessary.
Monitoring student progress, & managing & improving operations & programs / Measures are used to monitor student progress, but are not used to manage and improve operations and programs. / Measures are used to monitor student progress and manage operations and programs, but are not used for improvement. / Measures are used to monitor student progress and manage and improve operations and programs. / Measures are used to monitor student progress and manage and improve operations and programs. Changes based on data are evident.
Assessment data usage by faculty / Assessment data are not shared with faculty. / Assessment data are shared with faculty, but with no guidance for reflection and improvement. / Assessment data are shared with faculty with guidance for reflection and improvement. / Assessment data are shared with faculty with guidance for reflection and improvement. Remediation opportunities are made available.
Assessment data shared with students / Assessment data are not shared with students. / Assessment data are shared with students, but with no guidance for reflection and improvement. / Assessment data are shared with students with guidance for reflection and improvement. / Assessment data are shared with students with guidance for reflection and improvement. Remediation opportunities are made available.
Fairness, accuracy, and consistency of assessments / No steps have been taken to establish fairness, accuracy, and consistency of assessments. / Assessments have “face validity” regarding fairness, accuracy, and consistency. / Preliminary steps have been taken to establish fairness, accuracy, and consistency of assessments. / Assessments have been established as fair, accurate, and consistent through data analysis.

Part 4: Summary

Factors / Rubric Score / Evidence/Rationale
Level A
Professional standards and student learning outcomes / 1 2 3 4 / The educational outcomes and sub-outcomes for the Newman Division of Nursing (NDN) were developed by the nursing faculty following a review of professional standards and educational/program requirements from professional organizations including the American Association of Colleges of Nursing (AACN), the National League for Nursing Accrediting Commission (NLNAC), and the Kansas State Board of Nursing (KSBN). In addition, the detailed test plan for the national licensure exam (NCLEX-RN) developed by the National Council of State Boards of Nursing was reviewed. A Web-based search for educational outcomes from other baccalaureate programs (in Kansas and in other states) also provided more information that was considered when the NDN outcomes were developed. The outcomes were finalized in August 2007.
In accordance with requirements from the NLNAC and the KSBN, the NDN has an established Systematic Evaluation Plan (SEP) that provides for the systematic evaluation of the entire program. As part of the SEP, the NDN engages in assessment activities at least annually. Some data originally collected as part of the SEP can now also be used as data for the NDN’s recently developed assessment plan for student learning, for example, pass rates on the NCLEX-RN and results from the graduate and employer surveys. In addition, the NDN had already implemented the administration of required achievement tests linked to specific courses in the nursing curriculum. Data from these achievement tests can also be used with the assessment plan for student learning.