Response to the Fall 2015 Report

What are some strengths of this project/Academy work? Why are these strengths?
Reviewed by Margaret Bloom (Primary Mentor)

Your team has developed the outline of a very ambitious project: to develop and implement a three-level assessment process for the “Essential Learning Outcomes” (ELOs), the MIU institutional learning outcomes for both undergraduate and graduate students. By the end of the four-year project you intend to be using the assessment data to guide “planning, reporting, and budgeting.” I understand your sense of pressure to have a fully operating assessment process in place by the 2020 reaccreditation visit, and I admire the amount of planning you all accomplished at the meeting. Strengths in version 1 of the project include:

  1. your work to conceptualize which model of assessment you will use: Barbara Walvoord’s approach to assessment at the institutional, program, and course levels. Having a model guides you as you proceed and helps the team communicate consistently with faculty, students, and administrators about how assessment works at MIU.
  2. selecting only one level of assessment for the project, the MIU ELOs (institutional learning outcomes). This continues work already done and captures recent cross-university energy for identifying and drafting the ELOs. It also has strong relevance for MIU: highly mission-driven institutions such as MIU need to examine in what ways, and how effectively, students achieve the goals of the mission. This project can produce important student learning data to support MIU’s unique mission and aims, better tell the story of the advantages of an MIU education, and lead to improvements in student outcomes.
  3. using a project planning approach to identify specific actions and deadlines.

Reviewed by Jan Smith (HLC Scholar)

I fully agree with Peggy’s evaluation of the strengths of your project. This has the potential to be a great project, as you have a good appreciation that your Academy experience is ultimately about improvement of student learning. In addition to the strengths identified by Peggy, I commend you on taking the time to lay a foundation for effective progression through later phases of the project, as this will help ensure that you have both adequate infrastructure and an informed, engaged faculty, two key elements of success. I also really like that you are including the student perspective, which has the potential to enrich the project in critical ways, since the purpose of the project is to positively impact your students.

What remains unclear or what questions do you still have about this work to assess and improve student learning?
Reviewed by Margaret Bloom (Primary Mentor)

I have several questions about how this project fits with past assessment work at MIU and the team’s thinking.

1. Have team members reflected on what worked and what did not work in gaining faculty and administrator commitment to and engagement in learning assessment activities at MIU prior to this year?

2. What has been done already with respect to program-level student learning assessment for the majors and for graduate programs? The plan does not make clear how program outcome assessment will be part of institutional ELO assessment.

3. The university web pages describe a vibrant community and student life outside the classroom. What role should units such as the residence halls, the library, and student life activities have in assessment of the ELOs?

Reviewed by Jan Smith (HLC Scholar)

Peggy asks some great questions. Although you are new to the Academy, you are not new to assessment of student learning, so starting from a good understanding of your prior successes and challenges will add to your project foundation. What is the current culture of your campus, and how strong is the role of shared governance? How easy will it be to get faculty to adopt the “Suskie Chart”? Having ample training opportunities and enlisting the assistance of chairs will be very helpful. Building assessment activities into annual reviews also has the potential to help, provided this is accomplished in a way that is appropriate to the culture of the institution and the current skill level of faculty.

In addition, Peggy’s comment about assessment of student learning outside the classroom (co-curricular assessment) is on target. Your project is already ambitious, so it would probably not be wise to extend the scope at this time. However, as you engage in longer-term planning, it may be very helpful to consider how you can build a process that will allow for integration of co-curricular assessment down the road.

I also have questions about your ELOs. I think it was very wise to reduce your original 30 institutional outcomes to a manageable number. Your ELOs appear to be very much on target for your mission, although some will be more challenging to assess than others. Does everyone have a shared understanding of what each of the ELOs means? That will be critical when actually assessing these outcomes and will shape the approaches you need to take in year 2.


What are some critical things to which the institution should pay attention as it plans its work for the next six months?
Reviewed by Margaret Bloom (Primary Mentor)

Ironically, your strength of commitment and desire to accomplish could become an impediment if you rush to implement the first part of the project this month. Several aspects of the project may need more reflection and planning before you launch. A bit more time spent in planning will help you avoid dead ends and the risk of antagonizing faculty and department heads, who may disengage if they perceive that the team rushed ahead before the plan was clear. Several aspects need attention now and, if addressed, could enhance the project’s success.

A. Systematic assessment extends across the whole university and will require faculty, students, and administrators to learn new approaches, change attitudes about their roles, and practice different behaviors. The team listed faculty and administrator time and energy, as well as general faculty resistance to change, as potential barriers to project success. Currently your first-year plan focuses on instrumental assessment “tasks”, e.g., “require all faculty to add a chart to their syllabus” and “have major programs and general education faculty map the ELOs to their courses”. Before you launch the first year of the project, I encourage the team to discuss and add to the plan process goals, strategies, and actions that will build faculty and administrator readiness and capacity for assessment and their engagement in institutional assessment.

Some NILOA papers about addressing resistance and faculty involvement are attached. Process-focused strategies frequently mentioned in the literature include building a common language for assessment, establishing a shared baseline of knowledge about assessment (perhaps by adopting the Walvoord text you have used), recruiting and cultivating faculty opinion leaders, starting with a pilot project that has a high probability of success, and communicating frequently about the project across the institution in multiple forms.

B. In assessment work, keeping the plan as simple as possible increases your chances of success. I may not have accurately understood your plan for the first year, but you seem to be doubling the work of the team and the faculty in the first couple of years. You first use a top-down, deductive approach built on the current 8 ELOs and their 23 specific outcomes: (a) align university documents to the current ELOs, and (b) ask faculty to determine how the given ELOs are included in each course and prepare a “Suskie Chart” (course assessment plan). Then you plan to use a bottom-up, inductive approach: (a) each faculty member reports what learning outcomes are achieved in each course, and then (b) the essential learning outcomes are refined and reduced. I suspect these two steps will catch the faculty and program chairs in the double work of developing the “Suskie Charts” (assessment plans) for each course to assess the current essential outcomes, and then having to redo everything for both course and program learning outcomes one to two years later when the ELOs are narrowed down. I can hear the shouts of “you wasted my time” already!

One way to simplify would be to make your first step the refinement of the essential learning outcomes, involving a broad cross section of faculty and administrators to reduce the number of ELOs and state each as a measurable outcome that can be assessed. Broad participation in this process would also build capacity for writing measurable learning outcomes and reinforce the understanding that student learning is central to the university. By waiting to have faculty develop course and program assessment “Suskie Charts” until there are fewer and more clearly stated ELOs, you improve the odds that they will know the basics of assessment and be able to accomplish the project goals.

Reviewed by Jan Smith (HLC Scholar)

Again, Peggy makes some great points for you to consider. I too wonder if it would be helpful to build in some sort of pilot mechanism to address these issues. Many institutions find it more manageable to start with one or two outcomes and/or begin with a limited group before expanding to the remainder of their ELOs and the rest of campus. No matter how you decide to proceed, don’t be afraid to adjust your timeline as needed. Although some Academy teams adhere to their original timeline, it is more common to revisit and adapt the timeline based on the evolving needs of the project.

What are some other possibilities or resources that might contribute to the success of this project? For instance, can you suggest resources such as books, benchmarks, instruments, models, and processes?
Reviewed by Margaret Bloom (Primary Mentor)

Some of the MIU ELOs concern holistic development and are not easily measured. You may find a book about assessing values, attitudes, and commitments useful:

Beyond the Big Test: Noncognitive Assessment in Higher Education by William E. Sedlacek (Jossey-Bass, 1st edition, February 27, 2004).

Reviewed by Jan Smith (HLC Scholar)

In addition, the Collaboration Network has a wealth of practical information, with many schools working on projects relevant to what you are doing. It might be helpful to look at other relevant projects to get ideas you can evaluate for your own institution, rather than trying to come up with everything yourself. If you click on “Browse the Network” in the left-hand navigation pane, you will see a list with several topics you can select.