Metropolitan State University of Denver
Draft 11 SEPT 14

Assessment Report Template

2013-14 annual report

Program Name Meteorology

Program Description – include a brief description of the program including a list of majors, minors, concentrations as applicable and the number of students in each, and the number of faculty by category.
The Meteorology program is housed in the Department of Earth and Atmospheric Sciences. The program offers a major and a minor in meteorology (no concentrations in either). It is difficult to get an accurate count of the number of majors and minors, but we have approximately 60 majors and 30 minors. There are 3 Category 1 faculty members, two tenured and one tenure-track. One Category 2 faculty member teaches upper- and lower-division meteorology courses but also contributes to the Land Use program. Five Category 3 faculty members taught MTR courses during the 2013-14 academic year.
Student Learning Outcomes (SLOs)
1. Describe the general characteristics of the atmosphere, including physical processes and weather systems.
2. Select and interpret appropriate weather and climate data, including in-situ and remotely sensed information, for different situations.
3. Synthesize multiple types of weather and climate data to formulate short, medium, and long-range weather forecasts.
4. Organize, analyze, and prepare written scientific reports.
5. Create and deliver scientific presentations using multimedia techniques.
6. Apply mathematical and statistical techniques to the analysis and interpretation of atmospheric dynamics, thermodynamics, and radiation processes.
7. Apply scientific computing skills using appropriate software and structured programming.
8. Evaluate social, economic, cultural, and global aspects of the impacts of weather and climate phenomena.
Data collection / Findings / Target/Expectation / Action and Rationale
For each SLO, three types of data were collected.
  1. Artifacts were collected and rated according to rubrics for each of the SLOs. The ratings were on a scale of 1 to 4.
  2. Faculty (3 Category 1 and 1 Category 2) rated graduating seniors on their preparation for each SLO. The ratings were on a scale of 1 to 4, with no rating if a faculty member did not have sufficient information to rate a particular student. For each student, an average faculty rating was determined for each SLO.
  3. Students were asked to rate their own preparation in the areas delineated by the eight SLOs.
/ See findings for individual SLOs / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared / We plan to change the labels to:
4 = well prepared
3 = adequately prepared
2 = moderately prepared
1 = poorly prepared
SLO #1: “Describe the general characteristics of the atmosphere, including physical processes and weather systems.”
A 100-question multiple-choice exam was implemented in the MTR 4400 Advanced Synoptic Meteorology course in Spring 2014. This exam has been given to graduating seniors for over twenty years and therefore provides valuable comparative data. See comments in the Action and Rationale column. / Students scored above the 19-year average for this exam, with the highest overall average since 2004. The students scored particularly well on the general characteristics of the atmosphere. This cohort was six students, so it is difficult to interpret whether this represents a meaningful change in the program.
See Appendices A and B for more details. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / We are strongly considering replacing the current knowledge exam and would appreciate input from the assessment peer reviewers. In particular, it is difficult to assess the students’ ability to “describe characteristics” using multiple-choice exams.
The exam, which has been used for 19 years, does provide some measure of comparative information. However, some of the questions seem somewhat arcane, and in general the exam emphasizes recall and vocabulary rather than higher-order thinking.
SLO #2: “Select and interpret appropriate weather and climate data, including in-situ and remotely sensed information, for different situations.”
Eight data analysis laboratory explorations in the Spring 2014 MTR 3330 Climatology course, requiring students to select and analyze climate data, were used to assess this outcome. / Students were well prepared or adequately prepared based on sampled work products. The average score on this SLO (3.38/4) continues to be one of the highest. Both faculty and student perceptions were quite high. The inconsistency between the different rating measures calls for more specific rubrics for the faculty and student ratings. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / Because MTR 3330 moved from the 4000 level to the 3000 level, most students taking the course are at the sophomore or junior level. While this artifact can still be used as a formative assessment tool, it would be more appropriate to develop a summative assessment in another course such as MTR 4400 Advanced Synoptic Meteorology or MTR 4500 Mesometeorology.
SLO #3: “Synthesize multiple types of weather and climate data to formulate short, medium, and long-range weather forecasts.” The case study portfolio produced for the MTR 3410 Weather Analysis Techniques course in Spring 2013 required synthesizing various types of weather data for a comprehensive analysis. No data was collected in 2013-14. / Because no data was collected this year, here is a reminder of the findings from Spring 2013. Students were mostly adequately prepared based on sampled work products. The overall rating was 2.94/4, but that was biased by two students receiving a poorly prepared (1) rating. Furthermore, the course from which this portfolio was drawn enrolls many minors in addition to majors. The minors tend to be less prepared, although the results have not been separated. It may make sense to include only the results from the majors.
Both faculty and student perceptions were quite high. The inconsistency between the different rating measures calls for more specific rubrics for the faculty and student ratings. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / Most students in this course are at the sophomore or junior level. Therefore, we need to develop a summative assessment in a 4000-level course, as well as a more specific rubric. One possibility would be to use a rubric to analyze forecast discussions presented by students in MTR 4400 Advanced Synoptic Meteorology. This way we would assess students not in their first course encountering the material in this SLO, but in their final course demonstrating it.
SLO #4: “Organize, analyze, and prepare written scientific reports.”
Evaluation of final papers for the MTR 4600 Senior Research Seminar course in Fall 2013. / Students were mostly adequately prepared based on sampled work products, with one student outlier. The majority of students chose creative, original topics; carried out research using appropriate methodology and data; grounded their research in previous literature; created effective figures; and wrote up their work in a logical and scientifically sound manner. Not all students achieved this level of science or writing. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / Maintain current emphasis on this SLO. Try to find a way to identify low achievers, who are in the minority, early in this course through early benchmarks such as a methodology plan, proposal, outline, or literature review, and then guide them more rigorously.
SLO #5: “Create and deliver scientific presentations using multimedia techniques.” Evaluation of final oral reports utilizing PowerPoint for MTR 4600 Senior Research Seminar course in Fall 2013. / Students were mostly adequately prepared based on sampled work products. See Appendix A. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / Maintain current emphasis on this SLO.
SLO #6: “Apply mathematical and statistical techniques to the analysis and interpretation of atmospheric dynamics, thermodynamics, and radiation processes.”
Evaluation of four problem-solving exams for MTR 3450 Dynamic Meteorology in Fall 2013 requiring mathematical techniques and creative problem solving. Exams were chosen for analysis over problem sets because they test individual students’ ability to solve problems on their own. / Most of the students in the course analyzed were unusually sharp and helped bring the struggling students in the class up to their level. These artifacts cover only mathematical techniques applied to the analysis and interpretation of atmospheric dynamics; they do not cover the whole SLO, and adequate data was not available to assess students’ use of statistical techniques in this course. Still, nearly every student was adequately prepared, which shows an increase in achievement from previous years. However, this may be a function of changing the course and professor from which the data was collected this year. Students are required to complete a 24-credit-hour mathematics minor including 3 semesters of calculus and 2 additional courses with calculus prerequisites. Several of our theoretical courses include considerable application of calculus and other mathematical methods. Student and faculty ratings are lower on this SLO than on previous SLOs. Perhaps students do not feel confident in this skill, and this is something to strive for by integrating more practice into the program. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / Maintain current emphasis on this SLO. Incorporate problem solving in more courses so that practice builds further confidence and demonstration of problem-solving skills. Return to collecting assessment findings from MTR 3430 Atmospheric Thermodynamics instead of MTR 3450 Dynamic Meteorology this year.
SLO #7: “Apply scientific computing skills using appropriate software and structured programming.” Students in MTR 4400 (Advanced Synoptic Meteorology) in Spring 2014 were given a series of exploration labs that required them to teach themselves IDV, a data visualization package commonly used in atmospheric science. Students found data, ingested it into the program, and created complex weather maps they used to answer general questions in lab assignments that connected live weather to the equations covered in the course. Some maps required coding simple calculations, such as advection of vorticity. Students, through both teamwork and individual effort, created maps with several layers of information and adjusted contour intervals and shading thresholds to produce visually pleasing, information-rich weather maps. Students used the maps for lab reports as well as case studies; one lab required them to examine the terms of the frontogenesis equation for a cyclone event. / While students were adequately prepared with respect to the student work assessed, the assignment did not include any structured programming. Our current program does not include structured programming, with the exception of the one computer course students are required to take. Senior exit surveys are highly critical of the required Computer Science I course, both for its computer language (Java) and for its emphasis on software engineering rather than problem solving through structured programming.
The single required computer science course is not adequate. This fall, many students have been advised to take a new R programming course instead, which may improve programming confidence among future graduates.
The program is undertaking a multi-year plan to incorporate computer programming throughout the curriculum, starting with the lower-division courses, in order to take full advantage of computer facilities.
This is clearly the weakest component of the program and needs the most improvement. See Appendix A. Right now the emphasis is on software that does not require programming, and while this software is a great tool for learning meteorology concepts and is great practice for future forecasters, it does not teach them the depth of computer skills (programming) we strive for. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / The results confirm those from the past three years, which found that this is the area needing the most work. Our vision is to design new computer programming assignments and experiences for virtually every course in the major. This process will probably take 5 years to fully implement, with the assistance of our UNIX administrator and possibly computer science faculty.
There is an urgent short-term mandate to develop a detailed curriculum map incorporating finer-grained objectives within SLO #7, followed by formal curricular revision.
We are hoping to find a way for Computer Science to offer a “Programming for Scientists and Engineers” course. Ongoing discussions with the Mathematics and Computer Science Department are planned.
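To illustrate the kind of "simple calculation" referenced for SLO #7 (coding advection of vorticity), the sketch below shows one possible exercise in Python with NumPy. This is an assumed example, not drawn from the actual labs, which used the IDV package; the grid spacing and wind fields are made-up values chosen only to demonstrate the finite-difference computation.

```python
import numpy as np

# Hypothetical uniform grid: 5 x 5 points, 100 km spacing (illustrative values).
dx = dy = 100_000.0  # meters
y = np.arange(5) * dy
x = np.arange(5) * dx
yy, xx = np.meshgrid(y, x, indexing="ij")  # axis 0 = y, axis 1 = x

# Made-up wind components (m/s): sheared westerly flow plus a weak wave in v.
u = 10.0 + 1.0e-4 * yy
v = 5.0 * np.sin(xx / 4.0e5)

# Relative vorticity: zeta = dv/dx - du/dy (centered differences via np.gradient;
# np.gradient returns derivatives along axis 0 (y) then axis 1 (x)).
dudy, dudx = np.gradient(u, dy, dx)
dvdy, dvdx = np.gradient(v, dy, dx)
zeta = dvdx - dudy

# Horizontal advection of vorticity: -(u * dzeta/dx + v * dzeta/dy).
dzdy, dzdx = np.gradient(zeta, dy, dx)
vort_adv = -(u * dzdx + v * dzdy)

print(vort_adv.shape)  # one advection value per grid point
```

In IDV, the equivalent calculation is entered as a derived-field formula rather than a script, but the underlying finite-difference idea is the same.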
SLO #8: “Evaluate social, economic, cultural, and global aspects of the impacts of weather and climate phenomena.” Evaluation of essay exam questions concerning climate change impacts taken from the MTR 3330 Climatology course in Spring 2013. No data was collected in 2013-14. / See Appendix A. The data collected suggests that while some students are well prepared, some are merely adequately prepared. / The target is for each SLO to achieve an average rating of greater than 3.0 where:
4 = well prepared
3 = adequately prepared
2 = not well prepared
1 = poorly prepared
In the future, the expectation is for 85% of students to achieve a rating of 3 or 4. / Because MTR 3330 moved from the 4000 level to the 3000 level, most students take the course at the sophomore or junior level. While this artifact can still be used as a formative assessment tool, it would be more appropriate to develop a summative assessment (and appropriate course content and activities) in another course such as MTR 4500 Mesometeorology.
Process for interpretation of findings – Describe the structure of responsibilities for program assessment. Specify processes undertaken for faculty review of findings prior to submission of this report. How did the program come to the decisions it made?
Because the SLOs are only four years old, and the number of students each year is small, any trends in the data would be impossible to discern from noise. The plan is to compare 3-year time periods (i.e., 2013-2016 vs. 2010-2013) once sufficient data is available. The exam used to assess SLO #1 has been used for more than 15 years, with no decipherable trends. The current set of SLOs reflects the knowledge and skills program faculty expect of students, which now emphasize various critical thinking and process skills in addition to “content knowledge”.
Response to prior peer review report(s) - Describe the ways in which the program has responded to any prior year’s peer review report. What kind of continuous improvement cycle are you using?
The 2012-13 MTR program assessment review report was entirely positive, but we continue to challenge ourselves to improve annually, adding new rubrics and adjustments each year.
Plans for the program assessment process –To what extent does the program assessment process need modifications? Describe any plans for modifying the program assessment process. Is it the program’s intent to gather data about every outcome every year? If not, what is the proposed data collection cycle?
Because the number of students in each senior class is small, the program plans to collect data about every outcome every year and aggregate results into three-year periods. One area of improvement in the current assessment protocol is to improve the rubrics used for rating individual students, which then are aggregated into average ratings.
Implementation plan for applicable program changes – Summarize the changes described in the Action and Rationale columns above and specify the implementation plan, including a timeline.
One change coming out of the assessment reviews over the past few years was a revision of the laboratory components of several courses, in the form of a major curriculum revision in 2012-13. This involved conversion of lecture hours to lab hours for three courses. In addition, almost all of the courses are incorporating more computer work, using weather visualization and analysis software; in meteorology, most of the “lab” work is on the computer. In 2013-14, we launched a new introductory lab course, which includes an introduction to using specialized computer software. We had planned to incorporate the rudiments of computer programming and UNIX in this course, but this may be best implemented once the program has thoroughly mapped out specific benchmarks for improving computing proficiency, in terms of activities and assignments in several courses, in addition to the computer programming course requirement within the mathematics minor.
Some results, in conjunction with an exit exam, indicate measures that will require collaboration with other departments on potential changes to their service courses. One example is the computer science requirement, for which most students now take the Computer Science I course, which is better suited to computer scientists than to other scientists and engineers. The meteorology program is incorporating more computer applications, including coding, into many courses to complement the one-course formal computer science requirement. We plan to continue discussions with the Computer Science faculty about the shortcomings of the Computer Science I course for the needs of meteorology majors. We are pleased that the Mathematics and Computer Science Department has developed a new course in R programming, which seems very well suited to analysis of large data sets.
We have initiated discussions with the Physics Department concerning dissatisfaction with the content of the General Physics II course (students and faculty are very satisfied with the General Physics I course content). The entire course is devoted to electricity and magnetism; it should include some relevant material on thermodynamics, waves, and fluid flow. Under a curriculum proposal currently under review, the laboratory portion of General Physics II will no longer be required, although in many instances we will encourage students to take the lab if they feel that hands-on exploration of electricity and magnetism topics will aid them in mastering the material. There is some concern about the very large sections of the physics courses (close to 100 students, Metro State and UCD students combined). Students already know to take these courses from Metro State rather than UCD professors when possible.

Appendix A: Summary Statistics