December 2017 Report to the HLC Assessment Academy

What projects have you been following on the Collaboration Network? What have you learned from the experiences of other schools that is useful to your project?


We have been following Hocking College and Kettering University. We spoke with Hocking College at the midpoint roundtable and learned about their satisfaction with the Livetext assessment management software. That conversation further encouraged us in our pursuit of this software.

Update Questions

Identify and explain any specific changes to your project scope and design since the last Project Update.

Our team spent its time in Oakbrook 1) envisioning the end goal of our assessment work, 2) narrowing what we could actually accomplish in the final two years of the project, 3) creating work plans for three sub-groups that will undertake specific parts of the project, and 4) planning for the sustainability of the project beyond the four-year project scope initially planned. We are still optimistically planning to collect a second round of data on the three initial outcomes we measured, in order to demonstrate improvement resulting from a planned change. For example, with respect to writing, we would like to show that our first round of data suggested specific improvements in writing-intensive courses, and that a second round of data collection indicates that those improvements in fact enhanced student learning in those courses. We believe we should be able to collect this data with respect to writing, critical thinking, and possibly development of consciousness.

Describe your short term plan for measuring student learning. What specific tasks do you plan to accomplish in the next six months?

As mentioned above, we have divided our project into three sets of tasks, one for each sub-group; by April, we will have accomplished the following:

1) at the institutional level, we will have identified performance indicators for our nine Essential Learning Outcomes and defined and collected data on two more of them (teamwork and holistic thinking);

2) at the program level, we will have piloted with one department new templates for a curriculum map, an assessment plan, and a document aligning program learning outcomes with Essential Learning Outcomes; and

3) at the course level, we will have further inspired and instructed the faculty in the use of “student learning charts,” which were created at the beginning of this project. In addition, we will be monitoring the number of high-quality charts so that we can raise the standards for their use. We will also update our “Closing the Loop” reports to align with the new student learning charts and program annual reports.

We are also aware of the need to keep these three sets of tasks aligned, even as we work on different levels and aspects of the whole project.

How well are you positioned to complete the project in the final two years of the Academy? What additional tools, resources, and engagement do you need?

We believe we are well positioned to accomplish our project. We have a core of highly placed faculty and administrators driving this project, and we need to continually think of ways to broaden that engagement and help faculty rise above their day-to-day teaching responsibilities to work “on,” in addition to “in,” the business of higher education.

Our recent achievements in this challenging area are 1) we have received faculty support for a monthly faculty development meeting, which will provide the venue for more training in assessment; 2) we have recently held a faculty training session in the use of our learning management system (LMS), Sakai; and 3) last year we presented five faculty awards for excellence in assessment, bringing the appreciation for assessment to the level of the trustees.

Additional resources and actions needed to increase faculty engagement and expertise are 1) acquiring dependable assessment management software (e.g., Livetext), together with the resources to administer it fully; 2) getting most, if not all, faculty to put all of their courses on Sakai; 3) creating a website that gathers all of the templates, reports, flowcharts, and contact information that faculty need to become more effective in their assessment practices; 4) continuing the faculty awards and announcing them earlier; and 5) doing a better job of asking the faculty how our efforts are working for them; that is, listening more to their suggestions.

What changes do you anticipate as you move into the second half of the Academy term? What have you learned from the first two years of the Academy to mitigate these challenges?

The challenges we anticipate moving forward are

1) finding the resources (time and money, primarily) to support the evaluation activities we envision,

2) maintaining faculty participation (as described above),

3) creating reporting processes that give us the greatest evidence with the least demand on the faculty,

4) maintaining the vision of the goal throughout the details of implementation,

5) communicating continuously about the benefits of assessment to faculty and students,

6) keeping the ELO working groups focused on their jobs, and

7) making sure that assessment is embedded fully within the online and hard-copy fabric of our institution (manuals, job descriptions, web promotions, internal reports, etc.).

From the first two years of our project, we have learned a vast amount about institutional change. We have always aspired to change, but we have never so fully held ourselves accountable for change as we have in this Pathway initiative. Top among those lessons are

1) institutional improvement is a process as much as an outcome, so we need to build a system that continues to improve itself based on evidence and insight;

2) human nature dictates that change is difficult, and thus we must tap into that spark in everyone that seeks improvement, even in the face of discomfort;

3) in addition to inspiration, faculty need instruction in how to conduct assessment in a reliable and useful manner, and this instruction comes with its own conceptual frameworks, terminology, and principles that need to be taught consistently to all of the faculty; and

4) assessment change by necessity has to proceed on three levels simultaneously (course, program, and institutional), with all three kept in alignment.

How have you used what you have learned about student learning to improve your educational strategies (curricular and co-curricular)? What evidence do you have that your work thus far has improved student learning? What more do you need to know?

The first achievement so far has been demonstrating to the faculty the critical step of aligning assessment with instruction in every class. As we have developed our Essential Learning Outcomes, which define what we seek in all our classes, the faculty have realized that they need to re-engineer their student activities and assignments to address as many of the performance indicators in each outcome as possible. For example, if we define holistic thinking as primarily 1) connecting course content to one’s own experience, 2) connecting two previously unconnected topics in a course to one another, and 3) connecting content across disciplines or courses both to an underlying principle and to one’s own experience, then every activity or assignment that seeks to develop holistic thinking has to ask explicitly for these features in its instructions to the students, and the faculty have to assess the degree to which students accomplish each of these elements in their assignments. This may seem like a trivial point, but it was one of the insights that came out of the meeting where we presented our ELO data to the faculty.

A second discovery that emerged when looking at student writing was that correctness of writing (grammar, punctuation, and the like) was not as great a challenge to our students as more stylistic and structural issues. Therefore, additional instruction needs to be given on these more abstract features of student writing.

A third discovery that we anticipated, but still re-discovered, is that one of the great benefits of a focus on student learning is not the outcome but the process of engaging faculty in discussions of what certain kinds of outcomes look like. In other words, meeting in small groups to discuss student work around outcomes such as critical thinking or writing ability is itself a kind of faculty development, even as we try to consolidate institutional agreement about standards for these outcomes.

As for what we still need, these three improvements in educational strategy do not yet constitute a complete cycle of assessment, in which a change is recommended by assessment evidence, then implemented, and then confirmed to be an improvement. We are still a ways from this achievement, but at least with regard to writing and critical thinking, and possibly development of consciousness, we are planning to achieve all three of these steps, from discovery to implementation to confirmation. At the program level, we would also like to see three or four programs that have gone through this complete cycle. This is what we hope to report in our final documents to the Academy.

 

Response to the December 2017 Report