Annual Program Assessment Report – 2010-2011

1. Please summarize your department or program(s)’s assessment activities during the past academic year.

Other than GPA, what data/evidence was used to determine that graduates have achieved the stated outcomes for the degree? (e.g., capstone course, portfolio review, licensure examination)

Biology and Chemistry: We used a two-pronged approach to assessing our Biology majors. We employed embedded assessments in four core courses (General Biology I and II, Ecology, and Genetics) of outcomes pertaining both to content knowledge of key concepts in biology and to skills such as problem solving/critical thinking and laboratory techniques. We also applied evaluation rubrics to further assess critical thinking and laboratory skills in student poster presentations of their capstone projects (in either Developmental Biology or Molecular Biology). Curriculum mapping of departmental learning outcomes with individual course outcomes determined which concepts and skills were assessed in each course.

Business Department: Department assessment activities during the 2010-2011 academic year included:

·  Finalized curriculum mapping to ensure all departmental learning outcomes are covered throughout the curriculum. Additionally, where necessary, the instruments used to determine the extent to which each outcome is met were developed and/or refined.

·  During the Fall 2010 and Spring 2011 semesters, individual course faculty members assessed student learning for the majority of Business Department learning outcomes against identified thresholds for student performance. Embedded assignments, exam questions, and pre-assessment tests and surveys were used to assess student learning outcomes related to the departmental learning outcomes. For example, a general computer literacy pre-test and a Microsoft skills self-assessment survey are administered at the beginning of each semester in Introduction to Computer Information Systems to determine students’ knowledge prior to taking the course. Pre-tests and self-assessments are also used in Business Fluctuations and Forecasting to assess students’ knowledge of the departmental and course learning outcomes that correlate to this course. Throughout the semester, results of pre-determined assignments and exam questions are used to assess student progress and achievement with regard to Business Department as well as course learning outcomes. In addition, in courses such as Introduction to Computer Information Systems, post-course self-assessment surveys are also administered and assessed. (See Attachment A, which lists the department learning outcomes, the data used to determine the extent to which each outcome is met, the threshold at which students are considered successful in meeting each outcome, and summarized Fall 2010 and Spring 2011 assessment results.)

·  Student internship experiences are assessed at the midterm and endpoint of each semester. Also, job offers generated by the internship are tracked to provide further evidence of student professional competency.

·  Student satisfaction surveys, graduating student surveys, and focus groups are utilized to assess perceptions of student competencies.

·  During Summer 2010, a Business Administration Advisory Board was formed. The advisory board is composed of business leaders with specializations that cover all key business areas. Two board meetings were held during the 2010-2011 academic year, one each semester. During these meetings, the Business Administration Advisory Board members analyzed the current business curriculum, including the internship program and concentrations. Advisory Board discussions included analysis of whether student learning outcomes appear to be met by the curriculum. (See Attachments B and C for Business Department Advisory Board meeting summaries.)

·  Business faculty representatives attended the NEEAN (New England Educational Assessment Network) Fall Forum 2010. The Forum’s theme was Assessment: Tensions, Opportunities, and Outcomes.

Communications Media: Our Internship Evaluation Form is the sole assessment applied in our department.

Success on the internship, as demonstrated by the internship site’s rating of the student, provides partial evidence of the effectiveness of our program.

Computer Science: No assessment report has been submitted.

Criminal Justice: For AY10/11, the Criminal Justice Program used a rubric via Tk20 to assess papers from Senior Colloquium and Data Analysis. Findings and observations in this report are currently limited to FA10 Colloquium data (14 students and 28 papers total, as assessed by two CJ faculty members); the SP11 data, including 29 additional Colloquium papers, are in the process of being assessed by CJ faculty and analyzed by Chris Cratsley, Interim Director of Assessment, using the new juried assessment process in Tk20.

Economics: Over the course of the last year, the faculty in the economics program have conducted a survey of current students; entered data for selected courses in economics at both the introductory and advanced levels; and engaged in continuing discussions about the state of the economics program and the concentration in international business and economics, including the changing composition of students in the major and its possible implications.

Education: A) We conducted all our assessments as prescribed in the Education Unit Assessment System. B) We analyzed the data from the 2010-2011 instruments and discussed the results.

English Studies: During the 2010-2011 academic year, the English Department’s Assessment Committee completed our department-wide assessment of all senior portfolios received in the 2009-2010 academic year (winter and spring graduates). We assessed all 16 portfolios (each assessed separately by two committee members) using a department-wide objective and rubric, applying a three-point scale to evaluate students’ ability to distinguish between and analyze multiple literary genres in the papers included in those portfolios.

Exercise and Sports Science: During the past academic year, the EXSS Department chose to assess Program Goals 3, 5, and 6: competence in health-related fitness testing, competence in exercise programming for healthy populations, and competence in exercise programming for higher level athletic performance. The assessment activities for each of these goals are summarized below.

·  Goal 3: Demonstrate competence in health-related fitness testing.

o  The previous year’s assessment showed that students in the Introduction to Exercise Science class were meeting the standards for health-related fitness testing at the “Basic knowledge and skills” level. As a result, we decided to move on to another class and another artifact to assess this goal at the “Demonstrated Competence” level.

o  During the spring semester, the faculty observed and evaluated practical exams in the Exercise Testing and Prescription class. Each student had to perform a battery of standard health-related fitness tests on a client.

·  Goal 5: Demonstrate competence in exercise programming for healthy populations

o  For this goal, we evaluated case studies from Exercise Testing and Prescription.

·  Goal 6: Demonstrate competence in exercise programming for higher level athletic performance

o  For this goal, we identified the final projects for Strength and Conditioning as the artifact for assessment this year.

o  We had previously used this assignment to assess Goal 5. However, after some discussion, the faculty agreed this would be a more appropriate artifact to assess Goal 6, as all but one of the projects involved developing a training program for an athlete.

Geo/Physical Science: Consistent with our plan submitted last year, we have begun to acquire baseline data from proficiency quizzes and pre-tests conducted in introductory courses. There are no licensure examinations in the earth sciences. While students have the option to pursue a major project (i.e., a capstone), at present this is not required.

History: We draw data using the following assessments.

Assessments:

HIST 4500 Research Paper Rubric

HIST 4500 Exit Survey

Our intended student benchmark is that at least 85% of students completing the HIST 4500 Senior Seminar research paper will perform at an acceptable (3) or exemplary (4) level on each of the nine outcomes included in the research paper rubric, and that at least 85% of respondents on the HIST 4500 Exit Survey will respond with “Strongly Agree” (4) or “Agree” (3) to General Impression About the History Major and The Goals of the History Major.

Human Services: No assessment report has been submitted.

Industrial Technology: The entire Industrial Technology program conducted a review of internship experiences. Within the Architecture concentration, student work was assessed in the capstone course ITEC 3460 Arch Des. II, and a portfolio review was begun in ITEC 4470 Arch. Prof. Prac. Within the Technology Education concentration, student learning outcomes were assessed based on MTEL examination results for Communication and Literacy and Technology/Engineering.

Interdisciplinary Studies: In the academic year 2010-2011, the Humanities department worked consistently on assessment for the four areas that offer Minors: Art, Music, Language, and Philosophy. The results of this work are substantial. In September, an Assessment Committee of four people, one from each area, was elected. The members of this committee were Jessica Robey, Robin Dinda, Rala Diakite, and Walter Jeffko; Jessica Robey acted as Chair. The committee met on a regular basis throughout the year. Each member also met, communicated, and collaborated on assessment issues with the faculty in their area. Assessment experts Stephen Wall-Smith and Chris Cratsley were consulted at various times during the year.

The goals of the committee were:

1) to develop assessment rubrics for each of the areas

2) to choose one or two elements to be assessed

3) to determine and collect materials to be assessed,

4) to assess the materials for the chosen goal/s, using the rubrics, and

5) to begin using TK20 to archive our assessment materials and data

Thus far, goals 1-3 have been achieved. Since materials to be assessed were collected during the finals period, and faculty were not available after the semester’s end to assess them, an appropriate time will be set in September to complete this process, at which point goal #4 will be achieved. Goal #5 will be achieved during the summer of 2011, as department faculty Rala Diakite and Paul Beaudoin learn how to use TK20 and begin to upload collected materials there.

In addition, the committee had as a secondary goal to begin discussion of IDIS assessment. Jane Fiske (Program Advisor for the IDIS major) and Rala Diakite (Department Chair) met and communicated with Chris Cratsley to receive guidance on the assessment of the IDIS major. There was discussion of the IDIS capstone and how it may be central to the assessment of this major. We collected all of the capstones completed at the end of the Spring 2011 semester, and these will be archived on TK20 for future use. Assessment of the IDIS major will be the primary goal for academic year 2011-2012, since assessment of the Minors will be well underway.

Leadership Academy: The Thesis Advisor and the Director of the Leadership Academy assessed students’ Honors Theses on the basis of the quality of the research, quality of the sources, quality of the written communication, quality of the oral communication, initiative, and creativity.

Liberal Arts and Sciences Curriculum: The Liberal Arts & Sciences Council worked to apply, and revise as necessary, the rubrics for our five desired student objectives (Art Appreciation, Citizenship, Communication, Ethical Reasoning, and Problem Solving and Synthesis). In Fall 2010, the Committee worked as a whole to assess samples of student work from assignments that were collected haphazardly for each of the five objectives, and produced data for each objective.

Mathematics: In AY11, the Mathematics Department Assessment Committee continued to roll out our Assessment Plan. As stated in last year's summary of assessment activities, we have collected data on student presentations, developed a rubric for assessing students’ abilities to write proofs, and implemented an instrument to assess students’ skills in reading proofs. In addition, we developed a survey for students who are changing their major (out of the Mathematics Department) to complete, in the hope of collecting data we can use to help increase retention in the major. We also continued to collect and assess student work related to the technology goal; since this is the second year of this type of data collection, we were able to compare work from Fall 2009 with that collected in Fall 2010.

Other activities

In addition, this year several Mathematics Department members contributed to campus-wide assessment activities, including attending the Fall NEEAN conference, participating on NEASC self-study subcommittees, attending LA&S Assessment Afternoons in the Center for Teaching and Learning, participating in Assessment Day workshop activities, attending the TK20 User's Conference in Austin, TX, and attending the NEEAN spring workshop in Keene, NH.

Nursing: The following assessment activities were used to determine how well students met our outcomes:

a.  During the past year, the Department of Nursing assessed all components of the program, in preparation for writing our interim report to CCNE, our accrediting agency. The Program Evaluation Map calls for assessment of Mission and Goals, Institutional Resources and Commitment, Curriculum and Teaching-Learning Practices and Program Effectiveness on a rotating basis. During AY2010-2011, all outcomes in the plan were evaluated.

b.  Curriculum changes were recommended, based on a comparison to the new Essentials of Baccalaureate Education in Nursing (American Association of Colleges of Nursing). These changes were approved by college governance.

c.  The curriculum was assessed for inclusion of all components of the 2010 NCLEX-RN Detailed Test Plan. Any deficiencies in curriculum content were addressed.

d.  The curriculum committee and the faculty continued to process the results of course, clinical and other evaluations.

The department uses a variety of methods to ensure students have met the outcomes of the program. Initial pass rates on the NCLEX-RN licensure exam are tracked, and the NCLEX Program Reports are examined to evaluate how our graduates performed in various areas of the licensure exam. In addition, students complete a practicum portfolio, which demonstrates how they have met the program outcomes; the portfolio is compiled weekly during the final semester, while students complete their practicum. Finally, we ask students to self-rate their level of fulfillment of the program outcomes with our Senior Exit survey.

Political Science: The main assessment vehicle used by the Political Science program is a reflective portfolio completed as part of our senior seminar course. Because all majors must complete this requirement prior to graduation, and because the senior seminar comes at the end of the student’s undergraduate career, the portfolio allows students to reflect on their undergraduate careers. The Political Science faculty have set four criteria of which all graduates should be able to demonstrate mastery. In the portfolio, students must reflect on the work they have done and provide evidence of meeting these criteria. These criteria are as follows: