Fall 2018 Report to the HLC Assessment Academy

What projects have you been following on the Collaboration Network? What have you learned from the experiences of other schools that is useful to your project?

To learn more about the progress and development of other projects, we receive alerts by following those projects on the Collaboration Network.

We discovered a new learning outcome management system in reviewing the Park University report; a systematic process for data collection and review in the Newman University final report; and some of the strengths and pitfalls of graduate assessment in the University of Northern Iowa’s report on its graduate assessment focus:

“the approach taken to moving graduate assessment forward has been one of quiet persistence, with an emphasis on building relationships, providing resources, and focusing on the benefits of assessment to academic program areas. In fact, the overall approach to assessment might be described by what the director likes to describe as the R2S3 model: Relationships, Resources, Systems, Strategies, and Support.”

Update Questions

What were the most significant results from the Third Year Consultation?

The consultations with our mentor, Dr. Bloom, helped us focus on the most important outcomes for our four-year project as it enters its final year. With this sharper focus we were also able to plan our activities for the year.

Dr. Bloom was also able to distill our thinking into a few clear goals and to give us an experience-based perspective on what is realistic given our time and resources.

We also valued the appreciation she expressed for the work we had done and the confidence it gave us that we were on the right track.

Looking back at the tasks that you had outlined for your project following the Midpoint Roundtable, what progress has been made and what tasks remain? What is your plan to address the remaining tasks in the next six months?


The main steps of progress since the Midpoint Roundtable have been that the Team has:

  • Created a new peer-review process for annual reports centered in the Assessment Team
  • Collected and organized a decade or more of data on the “development of consciousness” Essential Learning Outcome
  • Developed an operational definition of holistic thinking that can be reliably assessed
  • Developed a new computerized diagnostic measure of students’ writing abilities at entry
  • Collected baseline data on holistic thinking and critical thinking for entering students
  • Conducted more training for the faculty as a whole on assessment of holistic thinking, critical thinking, and writing
  • Created the shell of our revised assessment resources website, which now needs to be filled in with information and materials
  • Awarded three more assessment awards to faculty and programs demonstrating excellence in assessment

In general terms, the plan for this year is to

  1. Collect data on undergraduate writing, critical thinking, development of consciousness, and holistic thinking, but fully score only the writing samples and take those results through a process that can be used the following year with the other three data sets. Establishing a process for collecting, interpreting, reporting, and planning with the writing data is a sufficiently ambitious goal for this year.
  2. Complete, by the end of the year, the assessment resource website that we have largely designed.
  3. Create a set of school-wide goals for the Graduate School, developed inductively with all of the graduate program leaders.
  4. Work with the Distance Education Office to make sure that all of the online programs are collecting data on their students that feeds into the same program reporting process used by the face-to-face (F2F) programs.
  5. Support the development of a new computer science assessment plan focused on alumni feedback.
  6. Formalize the assessment function at the University by
    1. the creation of an assessment committee that reads the annual reports and provides feedback to programs
    2. the description of the new institutional roles in the Faculty Handbook
    3. the preparation of an annual report to the Trustees on the state of assessment at the University

How is the Academy project contributing to creating a culture of learning? How is the team engaging institutional stakeholders in the Academy work?

Creating a culture of assessment

We have done all of the following on campus:

  • Created a cadre of people on campus who now have a great deal of expertise in assessment (10 faculty, or 14% of the faculty).
  • Repeatedly emphasized in training sessions the importance of student learning outcomes at the course, program, and institutional levels; as of this report we have received annual reports from 18 of 23 programs (78%), with the remaining programs requiring further support.
  • Achieved approximately eighty percent faculty compliance with the required “student learning charts,” which show the relationship between course objectives, activities, and assessments for each course.
  • Created a new Assessment Team which will become the Assessment Committee of the Faculty Senate at the end of our project.
  • Introduced all of the faculty to the skills of writing course and program learning outcomes, creating student learning charts, and assessing writing, critical thinking, and holistic thinking.
  • Given end-of-year assessment awards to deserving faculty and programs.


These are the beginnings of a culture of assessment at our University, yet much remains to be done if we are to sustain this culture and help it blossom. In the year ahead we plan to do this sustaining work as outlined above under plans for the next six months.


Engaging institutional stakeholders in the Academy work

We have ten semi-autonomous units on campus that run educational programs (essentially departments or schools). The goal of all our activities is to help these units use outcomes assessment as a means of improvement and growth. Four of the ten have reached the stage where they achieve this goal annually with relatively little coaxing. Another four, while compliant, are not yet self-sustaining. One is still dragging its feet but will ultimately be compliant.


All five of the non-self-sustaining units require coaxing and substantial support to achieve this goal. Our intent this year is to find allies within these units who can help them adopt and carry out the habits of assessment that will make them self-sustaining, to the point where they see the value themselves and will do the necessary analyses and reporting without being coaxed or cudgeled.

What are your plans and goals for the next six months? What challenges do you anticipate?

Our plans and goals are spelled out in the previous sections. To summarize, in the next six months we will:

  1. Develop a data collection process for our three essential learning outcomes that require another round of collection, with special attention to the writing outcome.
  2. Complete the assessment website design.
  3. Set in motion a process for synthesizing the graduate program outcomes into a short list of shared outcomes for the Graduate School as a whole (we should be able to complete this process in six months).
  4. Audit the distance education (DE) programs to verify that they are collecting samples of writing, critical thinking, and holistic thinking that can feed into the campus-wide outcomes assessment process.
  5. Develop a new assessment plan for the Computer Science Professionals program.
  6. Find, and if necessary train, allies within the units that are still developing their assessment capacity.

The process of formalizing the assessment roles and responsibilities and of generating reports on our Essential Learning Outcomes for the faculty and trustees will be done this year, but not in the next six months.