As part of our ongoing outcomes/assessment analysis of the Biology major, we have continuously refined and reflected on our process. In our assessment, we have pursued a two-pronged approach based on both direct and indirect assessment methods. Our indirect assessment gathers data on students' opinions on many aspects of their educational experience. All graduating seniors are invited to complete the web-based survey. The questions on the survey have all been developed in conjunction with the OSU Survey Research Center, which participates in both the administration and the analysis of the survey results. Thus, this is a scientific survey with statistically valid analyses and margins of error. Representative questions and responses will be presented. Our direct assessment method is based on administering the Educational Testing Service's Major Field Test in Biology to all graduating seniors. This standardized exam is administered to over 50,000 students annually. Results of the examination to date will likewise be shared with participants. Conclusions and changes to the Biology major based on the information from these two forms of assessment will be discussed.
This presentation will introduce a framework for a continuous improvement process for programs driven by program outcome assessment. A case study illustrating the application of this framework to programs in the College of Engineering will be presented. Program outcome data collected from different sources (courses, students, and alumni) are being used to provide feedback to evaluate whether changes are needed in programs. A set of processes has been implemented to create an annual review cycle with a focus on continuous improvement of programs. Specific examples and templates for program outcome assessment and reporting will be provided.
This presentation will consider uses of writing and writing rubrics to assess holistic performances of learning. While assessment is changing the landscape of higher education, it can be challenging to assess what is often thought to be the "ineffable." Both informal and formal writing assignments can be used, and combined with quantitative methods, to support curricular inquiry.
In recent years, the Food Science and Technology Department (FST) has utilized various forms of classroom assessment of student learning. These class-based assessments have been found to be valuable tools for gaining immediate, actionable feedback during a given term's class. They are also an indispensable component of our programmatic assessment. Student learning outcomes in (1) core knowledge, (2) critical thinking, and (3) communication skills have all been assessed in multiple FST classes, and the results used to inform both individual class and broader curricular improvements. A course can be designed to facilitate the collection of relevant assessment data. The use of rubrics has proven particularly helpful, as they provide guidance to students as well as concrete means of evaluation. Rubrics are arranged as two-dimensional matrices. On one axis, several dimensions, or attributes, of a given skill or body of knowledge are represented. The other axis categorizes several levels of mastery. Brief descriptions of evidence of a given level of performance are included in the boxes associated with a given dimension/mastery level combination. Rubrics used in common across several classes help tie classroom assessment to programmatic assessment goals. Examples of FST's implementation of classroom assessment, and the incorporation of rubrics in this process, will be provided.
Developing processes and organizing data for program assessment purposes can be a daunting task. Important things to consider include how to develop mechanisms to generate relevant qualitative and quantitative data to measure achievement of program objectives, as well as data management methods that balance faculty workload with reporting requirements. This session will highlight different approaches and considerations for program assessment processes. The presentation will share lessons learned as well as technologies and best practices to consider for further improvement.
The Department of Food Science and Technology began the process of identifying and assessing programmatic outcomes in 2005, when it was faced with reporting requirements from within OSU and from an external accrediting body. In response to these dual pressures, the department began educating itself about developing and assessing learning outcomes at the individual course level and across the entire program. The approach FST ultimately used involved generating programmatic outcomes built from the skills the faculty felt were most important in their graduates and using these to drive the individual course-level assessment. Added to this were data gathered from exit interviews, program growth and job placement data, and employer surveys. Dr. Shellhammer will discuss this journey, highlighting the successes and difficulties they encountered.
In the self-study process for becoming an accredited College of Public Health and Human Sciences, we are required to have a list of competencies for every degree offered in the College. In addition, the University requires monitoring and reporting of learning outcomes. We are working to combine the two processes into an efficient approach that satisfies both of these needs but, more importantly, clearly indicates student progress and learning.
The physics department at OSU has a long history of innovative teaching, with award-winning upper-division courses. I was hired in 2007 to lead reform of the large-lecture lower-division courses, including changing the curriculum, the structure, and the pedagogical strategies, and assessing these changes. Classroom assessment must be tied to goals, so the first step is to be clear about the goals of the course. Widely used assessments exist; it is best to start with something that can be easily administered and directly interpreted against a collection of nationwide data. Deeper assessments take time, and strategies should reflect the goals to maximize their benefit. I will share our assessment choices and findings for the large-lecture introductory calculus-based physics courses.
All the time and effort you put into teaching does not guarantee that your students are learning. This workshop is intended to give faculty practical tools they can use to ensure that what they are teaching is indeed being learned. Dr. Pappas will introduce key concepts regarding classroom-level assessment and then demonstrate several tools faculty across all disciplines can use to ensure that their students are learning what is being taught.