Faculty members and administrators have been engaged in outcomes assessment work at the college level since the early 1990s. Because this work is faculty driven, its focus and accomplishments shifted as leadership of the college-based committees changed. College-level outcomes committee chairs served on a university committee that met to coordinate the work and share progress. Feedback from the college-based committees indicated a need for more support and communication from academic administration, particularly the deans. This need, combined with the launch of FOCUS 2011, which specifically charged the academic community to re-evaluate the rigor, relevance and quality of academic programs, led to the creation of the Faculty Outcomes & Assessment Committee (FOAC) and the Deans Steering Committee on Outcomes Assessment (DSCOA), enhancing the effectiveness and productivity of the college and university committees' outcomes assessment work.
In 2005, after analyzing the current state of outcomes at the institution, DSCOA implemented an aggressive but viable timeline for improving outcomes and assessment at the university. A strategy was devised that outlined key goals for each academic year between 2006 and 2012. The plan was reviewed for alignment with FOCUS 2011 and, where alignment was weak, it was tightened.
What do we expect students to learn?

During the 2006-2007 academic year, FOAC succeeded in auditing outcomes in all undergraduate degree programs, schools and colleges (see 2006-2007 University Outcomes Year End Report 10, 2007). The results of this audit suggested a need to revise and update program and, in some cases, college outcomes to clarify language and reshape the structure of each outcome to assure it was assessable. A need for faculty training in outcomes and assessment was also identified after consultation with faculty members serving on FOAC and members of the University Deans Committee.
By the end of the 2006-2007 academic year, the university had evidence that student learning outcomes existed for all programs. The university, through the work of FOAC and DSCOA, revisited the four university outcomes to which all schools and colleges subscribe. Discussions ensued about governance of the university outcomes, since they fall outside of, and theoretically above, each of the institution's schools and colleges. It was determined, through discussion and debate, that these outcomes fall under the authority of the University Provost's Office and are maintained by the University Deans Committee.
Improvements Made to Outcomes

During the 2007-2008 academic year, program and/or college outcomes were updated in the College of Culinary Arts, The Hospitality College, the School of Technology and the College of Business. Faculty and administrators completed an unprecedented upgrade of outcomes during the year. FOAC and DSCOA also transitioned toward faculty training and development while continuing the process of improving outcomes. Training and development in contemporary perspectives on outcomes and assessment was pursued to maximize ongoing improvement efforts. To this end, major investments were made to train members of FOAC.
Committee members participated in conferences and seminars related to outcomes and assessment, including Performance Assessment in Higher Education at Harvard University, the New England Association of Schools and Colleges Annual Conference, and the New England Educational Assessment Network 4th Annual Academic Assessment Institute.
Using knowledge gained from conference attendance and from prior experience, FOAC developed a standard set of training goals, a training plan and a presentation for all schools and colleges. Each school and college allocated one academic in-service training session to outcomes and assessment. By the start of the 2008-2009 academic year, 347 full-time faculty members (73% of all full-time faculty) had participated in the training, and all five undergraduate schools and colleges had completed a secondary audit of their outcomes. Once it was confirmed that all baccalaureate programs had formally written outcomes, members of FOAC were prepared to move to the next stage in their plan: improving assessment.
Are We Satisfied We Can Measure SLOs?

After investigating this question, members of FOAC and DSCOA determined that SLOs can be measured, and often are measured, in each degree program. In addition to traditional forms of assessment of student learning, the university has employed a rubric-based Performance Transcript (PT) across all schools and colleges to rate student performance in relation to competencies within their program of study. PT data have been collected for nearly a decade. However, through the work of FOAC and DSCOA, interest in the PT has been renewed, and weaknesses in how PT results are used have been discovered. Improving traditional as well as PT-based assessment of student learning is a priority for the 2008-2009 and 2009-2010 academic years.
Plans for the 2008-2009 academic year focus on launching the assessment portion of the five-year plan. Members of FOAC have been formally trained in assessment and approach the coming year with the belief that assessment is already occurring within each degree program offered. Their work centers on organizing and summarizing the direct and indirect measures of student learning already being collected, as a first step toward using data about student learning for continual improvement. Much of the effort is thus devoted to formalizing data collection rather than creating new methods of assessment.
Are We Satisfied We Are Using the Results for Program Improvement?

The primary opportunity for DSCOA and FOAC lies in collecting assessment data for use in improving student learning and academic programs. The university has rich, varied and unique (in the form of the PT) assessment data that have yet to be analyzed in all but a few cases. Although a weakness has been exposed in the form of a need for enhanced data analysis, both FOAC and DSCOA are confident in their capacity to address this weakness successfully as part of their multi-year plan. The paragraphs that follow provide more detail about both direct and indirect forms of assessment underway at the university.
Assessment of SLOs at Johnson & Wales University is as varied as the schools and colleges the university operates. In many cases, both direct and indirect evidence of student learning exists. For example, in the College of Culinary Arts laboratories, the positive effect of educational programs can be observed directly in the steady improvement in the quality of food students produce during laboratory courses. This improvement is not a random outcome but the effect of a highly developed curriculum with explicit learning objectives for each course, learning outcomes for each degree program and assessment methodologies, including highly refined grading rubrics, that provide quantitative and qualitative feedback to students as they progress from one laboratory class to the next.
Indirect measures include data collected on alumni employment, starting salaries and general satisfaction with the university experience. These examples show how direct and indirect measures of student learning are used to assure that students benefit from their academic experiences at the university. Each school and college can provide such examples and has written college/school outcomes, program outcomes, forms of assessment and methods for using assessment data to improve student learning. Such accountability for student learning is not new to the university. Appropriate systems of administration, governance and support exist at the university to sustain ongoing efforts toward continual improvement.