Fall 2016 Report to the HLC Assessment Academy

What projects have you been following on the Collaboration Network? What have you learned from the experiences of other schools that is useful to your project?

We have been following Logan University, University of Northern Colorado, Minnesota State University Moorhead, and Kettering College, most of them in connection with the tag “institutional outcomes.” Apart from seeing our current challenges echoed at a number of other institutions, we have encountered ideas in their posts that we are either already exploring or will consider this year.

Examples of ideas we’re exploring how to implement:

  • We’re looking at the extent to which our current learning management system (Sakai) can support our assessment goals, and we’re evaluating whether other software may be needed. We have noted some of the software systems that other schools are using.
  • We are incorporating assessment of our Essential Learning Outcomes (ELOs) into processes for program review and approval of new programs.

Examples of activities or goals that we have not yet adopted but want to look at more closely:

  • Joining the Campus Action Network of AAC&U
  • Creating Assessment Fellows to help carry out the work of the assessment team after our project is over
  • Developing a dedicated budget for our project which, once approved, can be drawn upon by the Assessment Team leader

Update Questions

How has your project developed and changed in the months since the Roundtable?

We accomplished our first two objectives for year one:

1) aligning our University mission, vision, and core values with the ELOs

2) aligning individual courses with the ELOs in individual Student Learning Charts (which we had previously called Suskie Charts). By the end of last academic year, 86% of the faculty had drafted such a chart for at least one of their courses. We also revised the ELO language itself again, based on a visit to Alverno College and feedback from our mentor (see below).

In the first year we were not able to start on our third goal, relating the program outcomes to the ELOs. We realized that our original plan was overly ambitious and that institutional change takes place more slowly than we had hoped. This goal is now a major focus of year two.

We now anticipate that we will not be able to cover as much as we had hoped in the project’s four years. Apart from scaling back the pace, the direction and goals of the project remain the same. On the whole we’re satisfied with what we have learned and with the faculty’s response to our initiative.

In March we had an hour-and-a-half conference call with Alan Hodder from Hampshire College and Jeana Abromeit from Alverno College on narrative grading, an approach we’re exploring in the context of our assessment work.

In June five members of our assessment team visited Alverno for its annual assessment workshop. We gained many insights from their comprehensive, ungraded, competency-based approach and generated a list of Alverno strategies that could be implemented here. We will look for ways to include these strategies in the assessment planning for our project.

We presented updates on the work of our team to the whole faculty in May and again in August 2016. In August we also reported on lessons from Alverno and received faculty input on the Development of Consciousness ELO, one of the two we will be focusing on this year.

We added a new member to our assessment team to replace one who has changed status from full-time to adjunct.

How did you incorporate the feedback that you received on your previous posting?

We received excellent written feedback on our posting, along with additional in-depth feedback in a personal meeting and a subsequent conference call between our assessment project team and our mentor, Peggy Bloom. The changes we have made based on this feedback are described below.

A. Using the written feedback on our report posted to the Assessment Academy Network

  1. Our mentor and scholar both noted that we needed a shared language and shared concepts surrounding assessment work before we could make much progress toward our goals. They recommended a “pilot project” that would help us engage the faculty around such a language and concepts, and we subsequently used the “Suskie Chart Project” in that role. The project helped us discuss writing objectives and aligning them with assessments and activities in a course, made us aware of how unfamiliar some faculty are with basic assessment concepts, focused the faculty on student learning, and introduced language and concepts for talking about student learning.
  2. Our mentor and scholar also noted that it will be important to develop a shared vision of what each of our nine ELOs means, a point we had anticipated but which helped us further focus our efforts. For most of the outcomes, we have convened small groups of expert faculty to discuss the meaning of each outcome. For example, to clarify the intent of our “Teamwork” ELO, we assembled a group of experts in the study of collaboration to help us refine the language for that outcome.

B. Using the oral feedback from our personal meeting and the subsequent conference call between our assessment team and our mentor

  1. Our mentor recommended that we distinguish graduate- and undergraduate-level outcomes when we discuss the ELOs and that we hold only the undergraduate college accountable for developing the ELOs. Though we will focus first on the undergraduate programs, we will also examine the relevance of the ELOs to graduate programs. Each graduate program will be expected to have a set of five outcomes unique to that program.
  2. Our mentor confirmed our thinking that, rather than trying to deal with the faculty as a whole in our effort to align the ELOs with program outcomes, two members of the assessment project team should visit each department, one by one, to discuss outcomes for each of its programs. We will be looking for around five outcomes per program. Since that conversation, we have also decided to hold an all-faculty meeting to launch and introduce the meetings with individual departments.
  3. Our mentor recommended that we view our ELOs as essential (as the name implies), not aspirational. We will ask every program to conduct one round of assessment of its outcomes and report the findings in its annual report.
  4. Our mentor also confirmed our thinking that we should develop a formal assessment plan for the ELOs and start with an outcome on which we think we can collect good data immediately, for “an early win.” The plan will then include a rotation whereby we measure one or two outcomes each year until all of the outcomes have been measured.
  5. Our mentor gave us specific feedback on each of our ELOs, including recommendations for simplifying the wording in several. We incorporated most of this feedback and did a final revision of the document that defines all of our ELOs at a general level.

Our mentor confirmed that we were on the right track with our sequence of steps in aligning institutional, program, and course levels of goals and assessments.

What are the plans for the next six months? How will this work advance your project?

1)    Revise the timeline for the whole assessment project to reflect what we can realistically accomplish in three more years, now that we have more experience with student-learning-focused institutional change.

2)    Meet with all faculty on September 30 to introduce initial terminology, purposes, and strategies for the one-and-a-half-hour meetings with individual departments this year.

3)    Schedule and conduct initial visits to all nine of our academic departments, starting with the smaller ones (which also happen to be more advanced in their assessment know-how). Based on the experience with the smaller departments, we will refine our methodology, presentation, and discussion questions, and then move on to the larger departments. In the department meetings we plan to:

a)     Focus each department on simplifying its outcomes to around five.

b)    Help each department refine the wording of its program outcomes.

c)     Analyze program outcomes in terms of the ELOs to determine how many of the ELOs are covered in the stated program outcomes.

4)    Develop assessment plans for the undergraduate and graduate colleges (including supporting technology) that use our previous years of assessment experience, combined with our more recent work on the ELOs and our renewed focus on student learning.

5)    Build two teams of faculty around the two ELOs we will measure this year. These teams will take charge of assessing those ELOs and will develop new rubrics, or adapt existing ones, for their ELO.

6)    Conduct initial measurements of upper-division students on the two ELOs through assignments embedded in existing courses.

7)    Summarize the data from the first measurement of the two ELOs and present it to the faculty at the August 2017 faculty development seminar for their evaluation and response.

8)    Create a process by which faculty teams can review the new Student Learning Charts, so that the burden of reviewing the charts no longer rests with the Assessment Team alone.

What challenges do you anticipate? How will you address them?

1)    Getting departments to accept responsibility for revising the learning outcomes for their programs, with adequate metrics, is always challenging, because departments may not yet have fully embraced the learning paradigm we’re promoting. They may still be in the delivery mindset, which views a program as a set of courses delivered by an institution to qualify someone for a degree. In this older model, assessments are gatekeeping mechanisms, not necessarily evidence of learning that yields useful data for decision-making.

Remedy: In both the faculty meeting and the subsequent meetings with departments, we will frame the discussion of program outcomes in terms of what students will gain from their academic program that will stay with them long after they have graduated. Improved assessment of the ELOs and other program outcomes will be presented as a means of determining whether students are learning what they need in order to be effective in their lives and professions after college. We will also present program assessment as a way to help students see the value added by their academic program, thereby building student satisfaction.

2)    We will need to start strong and stay focused this year, as we have essentially three initiatives that must run simultaneously:

I)    Create a three-year assessment plan for undergraduate and graduate education

II)    Align the ELOs with undergraduate program outcomes and ensure that all graduate programs have stated outcomes

III)    Begin measuring two ELOs, probably Development of Consciousness and Communication

Remedy: We have chosen leaders who will monitor the progress of each initiative and keep it on schedule.

3)    Deciding how to obtain appropriate samples of student work for the two ELOs that we will begin measuring this year will be challenging.

Remedy: We plan to address this issue by forming a faculty assessment user group for each ELO, built around at least one faculty member per department (larger departments will have more than one) who can pilot the assessments in their courses. Faculty who agree to participate will select an assignment in their course to include in the first round of assessments of the ELO. Assessing these assignments will generate the first round of data for instructional improvement in our Student Learning Improvement Project.

Response to the Report from the Academy Mentor and Scholar