Spring 2017 Report to the HLC Assessment Academy
What projects have you been following on the Collaboration Network? What have you learned from the experiences of other schools that is useful to your project? To learn more about the progress and development of other projects, get alerts by following other projects.
We have been following two institutions, Wartburg College and the University of Northern Colorado, for examples of how to package the resources we have developed in-house for student learning outcomes assessment and how to spread the culture of assessment to which we are committed. Wartburg has a “Student Learning Assessment Resource Guide” for curricular and co-curricular programs that is a good model for something we might create ourselves. The University of Northern Colorado has a nicely designed assessment plan template that could be of great use to us. They also have an Assessment Leadership Institute that is similar to the process we put all of our faculty through, but that adds a focus on research and publication in assessment scholarship that we have not yet captured. They also hold an annual Assessment Fair, simplified to two poster sessions and a lunch, that serves as a venue for showcasing and sharing faculty achievements in assessment. In addition, they have created a mini-grant program awarding $15,000 per year (up to $1,500 per proposal) for assessment research and development. We are inspired by all of these initiatives and are exploring them with our Provost.
Update Questions
Describe your team’s initial implementation of the project you have designed.
Since our last report, we have seen the introduction of a new President and Provost of the University, with a continuing Vice President of Academic Affairs who has been intimately involved with our Pathway Initiative from the beginning. The new Provost also has a strong background in assessment, and thus our assessment efforts continue with strong support from all senior academic administrators, including the President.
In the last six months, under the leadership of our Assessment Team and the Vice President of Academic Affairs, and with the support of our Mentor, Peggy Bloom, we have focused on two major components of our Student Learning Improvement Project: 1) defining and beginning to gather data on three of our nine Essential Learning Outcomes, and 2) focusing, updating, measuring, and analyzing Program Learning Outcomes in preparation for departments’ Annual Reports in June. More detailed discussion of these initiatives follows.

1) Defining and beginning to gather data on three of our nine Essential Learning Outcomes

After discussing our options with Dr. Bloom, we settled on gathering initial data on two of our Essential Learning Outcomes. The first represents a core value of the institution, Development of Consciousness; the second represents a fairly standard higher education goal, Communication (with a focus on written communication). A third outcome, Critical Thinking, emerged fortuitously out of a Ph.D. dissertation that one member of the Assessment Team was planning. Of the three, Development of Consciousness and Critical Thinking are furthest along. Both have full “profiles,” including an elaborated definition, a checklist or rubric, and suggested teaching methodologies. The definition and measurement of student writing levels is proceeding more inductively: we are looking first at data based on current student writing and then using that initial data to create and refine the written communication profile. Our goal remains to have, by the end of this semester, baseline levels of student proficiency in each of these three areas: Development of Consciousness, Critical Thinking, and Writing.

2) Focusing, updating, measuring, and analyzing Program Learning Outcomes

In previous years, all of the University’s academic programs posted their learning outcomes online. This phase of the Student Learning Improvement Project sought to consolidate each program’s outcomes to five in number, to improve the quality of the writing of these outcomes, and to develop faculty’s capacity to measure and analyze data on these outcomes. Three full Faculty Senate meetings addressed the skills of program assessment step by step: 1) writing student learning outcomes, 2) choosing measures for outcomes, and 3) collecting and analyzing data on outcomes to draw conclusions about student learning. All sessions were recorded on video and subsequently posted on the faculty site for teaching and learning resources. Over the four months when these meetings were held, members of the Assessment Team visited each academic department in pairs to review and provide feedback on their outcome statements and measurement plans. In the final session this March, we introduced a template for data collection with a couple of simulated data sets so that faculty could practice drawing conclusions about a program from data. Based on these training sessions, each department is poised this spring to collect its own data and write its Annual Report from two sets of direct-assessment data on one of its stated program outcomes. The aim of this process is to assist all departments in looking at their programs from the standpoint of demonstrated student learning.

Another outcome of departments’ revisions of their student learning outcomes is that we can now create a master grid matching Program Learning Outcomes to Essential Learning Outcomes (ELOs) in order to see how well embedded the ELOs are in each academic program.
We will be able to see where there are opportunities to measure the ELOs at advanced levels in the academic majors, and which ELOs may not be getting sufficient attention after the general education courses.
How has your project developed and changed since the last posting?
Our Student Learning Improvement Project has not changed in its aims. Some components, however, have been pushed back in the implementation timeline. Specific changes are given below:
How have you incorporated the feedback from the Consolidated Response to your previous Project Update?
We have incorporated much of this feedback but are still working through all of the proposed suggestions. Our progress on each is listed below:
1. Formalizing the official policies on assessment. We have incorporated the articulation of learning outcomes and their assessment into some important University documents, such as the New Program Approval forms and processes. We have not yet formalized the roles and responsibilities for assessment in the Faculty Handbook or in a stand-alone “Assessment Policies” document such as the one Dr. Bloom shared with us from Marquette University. Several members of the Assessment Team have committed to creating, this coming summer, a chapter of the Faculty Handbook dedicated to assessment policies and responsibilities; at the same time, the existing sections describing the responsibilities of Program Directors and Department Chairs will be updated to incorporate the latest concepts from our Pathway Initiative, the Student Learning Improvement Project.

2. Fleshing out the assessment cycle. The assessment cycle for academic programs is set and clear. Each program completes its Annual Report by the end of July, and we have recently reviewed this cycle with all the departments. The same cycle applies to measuring and analyzing the ELOs, though the details of this process, with timelines, responsible parties, and intended learning changes, still need fleshing out. This will be the focus of our next Assessment Team meetings.

3. Developing Communication Processes. We have been communicating clearly with the majority of faculty through the three recent all-faculty meetings focused on the steps of program assessment. We have also posted each of the two previous assessment reports online for faculty to read, and shared by email the updates to the Essential Learning Outcomes and the new “Profile” of the Development of Consciousness ELO. In addition, we have a faculty resources site, well maintained by our Dean of Teaching and Learning, that includes all the resources we have developed under the auspices of this project. That said, we believe we need to do more. We are still exploring ways to reward excellence in assessment as part of our pre-graduation faculty-trustees celebration of achievements. Some kind of recognition or award would help both to raise the profile of faculty assessment and to remind faculty, at a crucial time of year, that assessment is a priority for the campus.

4. Coordination Efforts. So far our efforts to include faculty in the three ELO working groups have been effective at spreading involvement and expertise around campus, and we have always had a member of the Assessment Team on each working group to ensure the coordination and connectedness of the groups to the Team. The Chair of the Assessment Team has worked to make sure that the work of each group remains focused and practically oriented. So far this has worked. The writing assessment group has been the most innovative, so it will need the most oversight going forward, but it has good communication with the Chair.

5. Monitoring of timelines. Dr. Smith (our HLC Scholar) has encouraged us to include the monitoring of timelines as part of our coordination efforts. Although we have discussed timelines numerous times, writing this report reminds us that these timelines have not been formally written out and that we should create a Gantt chart to commit ourselves to deadlines and deliverables.
Thus far, what have you discovered about student learning at your institution?
We have discovered that the face of student learning hides behind a number of veils that must be lifted one by one, starting with a clear definition of what that face might look like. Subsequent veils are layers of performances that hide patterns. The final veil is itself only a pattern that arises from reflecting on and discussing student work. The pattern suggests strong and weak areas of performance. Lift this veil and underneath one finds the face of student learning in one domain, and by comparing the face one sees with the face one is seeking, one can begin to imagine what can be done to improve teaching and learning.

The great relief in this process, however, is that at every step, as another veil is lifted, the face beneath begins to take form, and one becomes more sure of what one is seeking, first in broad lines and then in increasing detail. As a result, even before the final veil is lifted, intuition goes to work and begins to guide teaching in more productive directions. Most programs and most faculty have lifted only a couple of veils so far, but they find the discovery process, and the increasing clarity of the face beneath, quite rewarding. And with every veil removed, we are more determined to finish the unveiling. Speaking less metaphorically, the Program Directors have yet to see the full value of assessment to teaching and learning, but they are seeing the value of crafting clear program outcomes, i.e., arriving collectively at a clear statement of the learning they are seeking. We have had several wonderful discussions with departments about what they are trying to achieve with their students. One long-time hold-out, the computer science department, has had an early success with a simple exam it created to check on student competence at a crucial point in its program. The art department has also seen new directions its assessments can take, based on one of the workshops we have hosted. Slowly but surely, departments are beginning to put the assessment pieces together (to lift more of the veils, in the earlier analogy) and to see benefits to the process, mostly at the level of conceiving their outcomes.
How will you continue to advance your project in the next six months?
We have set the following five goals:
What challenges do you anticipate? How will you address them?
Challenge 1. Department chairs and program directors may measure something simply to comply with the requirement that they conduct assessments of one of their Program Learning Outcomes, while the faculty as a whole in each department remain relatively uninvolved in analyzing the data and creating a proposal for improving teaching in the department.

Addressing Challenge 1. Check in with Department Chairs by phone to ask when they will meet with faculty to review student learning outcome data.

Challenge 2. There will be a logistical challenge in getting the three groups who are looking into the first three ELOs to meet and review the data they have gathered, so that they can report to the faculty on current levels among students.

Addressing Challenge 2. Schedule the meetings early, even this month, so that the three groups have a deadline for their data gathering and commit to a data review session.

Challenge 3. Some academic programs are undergoing changes in leadership or curriculum, making it difficult for them to focus on business as usual, including the business of assessing program learning outcomes.

Addressing Challenge 3. For these programs and their leadership, we may have to adapt the process of program learning outcomes assessment to their immediate needs. For example, for a new program with no outcomes yet to assess, faculty can focus on making their learning outcomes and the measurement process clear. For a program undergoing curricular change, faculty can still use assessment of their prior curriculum to gain insights about their future curriculum.
Response to the Spring 2017 Report