Fall 2016 Report to the HLC Assessment Academy
What projects have you been following on the Collaboration Network? What have you learned from the experiences of other schools that is useful to your project? To learn more about the progress and development of other projects, get alerts by following other projects.
We have been following Logan University, University of Northern Colorado, Minnesota State University Moorhead, and Kettering College, most of them in connection with the tag “institutional outcomes.” Apart from seeing our current challenges echoed in a number of other institutions, we have also encountered ideas in other institutions’ posts that we are either exploring or that we will consider this year.
Examples of ideas we are exploring for implementation:
Examples of activities or goals that we have not yet adopted but want to look at more closely:
Update Questions
How has your project developed and changed in the months since the Roundtable?
We accomplished our first two objectives for year one:
1) aligning our University mission, vision, and core values with the ELOs, and
2) aligning individual courses with the ELOs in individual Student Learning Charts (which we previously had called Suskie Charts).
By the end of last academic year, 86% of the faculty had drafted such a chart for at least one of their courses. We also did another revision of the ELO language itself, based on a trip to Alverno College and feedback from our mentor (see below).
In the first year we were not able to start on our third goal, relating the program outcomes to the ELOs. We realized that our original plan was overly ambitious and that institutional change takes place more slowly than we had hoped. This goal is now a major focus of year two.
We now anticipate that we will not be able to cover as much as we had hoped in the four years. Apart from scaling back the pace of the project, the direction and goals of the project remain the same. On the whole we’re satisfied with what we have learned and with the faculty’s response to our initiative.
In March we had an hour-and-a-half conference call with Alan Hodder from Hampshire College and Jeana Abromeit from Alverno College on narrative grading, an approach we are exploring in the context of our assessment work.
In June five members of our assessment team visited Alverno for its annual assessment workshop. We gained many insights from their comprehensive, ungraded, competency-based approach. We generated a list of strategies from Alverno that could be implemented here. We will look for ways to include these strategies in our assessment planning for our project.
We presented updates on the work of our team to the whole faculty in May and again in August 2016. In August we also reported on lessons from Alverno and received faculty input on the Development of Consciousness ELO, one of the two we will be focusing on this year.
We added a new member to our assessment team to replace one who has changed status from full-time to adjunct.
How did you incorporate the feedback that you received on your previous posting?
We received excellent written feedback on our posting, as well as additional in-depth feedback in a personal meeting and subsequent conference call between our assessment project team and our Mentor, Peggy Bloom. We note below the changes we have made based on this feedback.
A. Using the written feedback to our posted report on the Assessment Academy Network
B. Using oral feedback from our personal meeting and a subsequent conference call between our assessment team and our Mentor:
Our mentor confirmed that we were on the right track with our sequence of steps in aligning institutional, program, and course levels of goals and assessments.
What are the plans for the next six months? How will this work advance your project?
1) Revise the timeline for the whole assessment project based on a revised conception of what we can accomplish in three more years, now that we have more experience with student learning-focused institutional change.
2) Meet with all faculty on September 30 to introduce initial terminology, purposes, and strategies for the one-and-a-half-hour meetings with individual departments this year.
3) Schedule and conduct initial visits for all nine of our academic departments, starting with the smaller departments (which also happen to be more advanced in their assessment know-how). Based on the experience with the smaller departments, we will refine our methodology, presentation, and discussion questions and then move on to the larger departments. In the department meetings we plan to:
a) Focus each department on simplifying its outcomes to around five.
b) Help each department refine the wording of its program outcomes.
c) Analyze program outcomes in terms of the ELOs to determine how many of the ELOs are covered in the stated program outcomes.
4) Develop assessment plans for the undergraduate and graduate colleges (including supporting technology) that draw on our previous years of experience in assessment, combined with our newer work on the ELOs and our renewed focus on student learning.
5) Build two teams of faculty around the two ELOs we will measure this year. These teams will take charge of assessing their ELO and will develop, or adapt, existing rubrics for it.
6) Conduct initial measurements of upper-division students on the two ELOs through assignments embedded in existing courses.
7) Summarize the initial data from the first measurement of the two ELOs and present these data to faculty for their evaluation and response in the August 2017 faculty development seminar.
8) Create a process by which faculty teams can review the new Student Learning Charts, so that the burden of reviewing the charts no longer rests with the Assessment Team alone.
|
|
What challenges do you anticipate? How will you address them?
1) Getting departments to accept the responsibility to revise the learning outcomes for their programs (with adequate metrics) is always challenging, because departments may not yet have fully embraced the learning paradigm we are promoting. They may still be in the delivery mindset, which views a program as a set of courses delivered by an institution to qualify someone for a degree. Assessments in this older model are gatekeeping mechanisms, not necessarily evidence of learning that yields useful data for decision-making.
Remedy: In both the faculty meeting and in the subsequent meetings with departments we will frame the discussion of program outcomes in terms of what students will gain from their academic program that will stick with them long after they have graduated. Improved assessment of the ELOs and other program outcomes will be presented as a means of determining whether their students are learning what they need to learn to be effective in their lives and professions after college. We will also present program assessment in terms of what can be used to help students see the value added from their academic program, thereby building student satisfaction.
2) We will need to start strong and stay focused this year, as we have essentially three initiatives that must run simultaneously:
I) Create a three-year assessment plan for undergraduate and graduate education.
II) Align the ELOs with undergraduate program outcomes and ensure that all graduate programs have stated outcomes.
III) Begin measuring two ELOs, probably Development of Consciousness and Communication.
Remedy: We have chosen leaders who will monitor the progress of each initiative and keep it on schedule.
3) Deciding how to obtain appropriate samples of student work for the two ELOs that we will begin measuring this year will be challenging.
Remedy: We plan to address this issue by forming a faculty assessment user group for each ELO, built around at least one faculty member per department (larger departments will have more than one) who could pilot the assessments in their courses. Faculty who agree to participate will select an assignment in their course to include in the first round of assessments of the ELO. Assessment of these assignments will generate the first round of data for instructional improvement in our Student Learning Improvement Project.