To continue a summary of our presentation at the NC State Assessment Symposium…
Our ‘closing the loop’ example was the most detailed of the best practices that we presented because I was personally involved in the project.
Closing the loop refers to not just collecting data, but using it to inform decision-making. It seems to be the focus of most accrediting agencies. In fact, one engineering school I visited recently was criticized by ABET for collecting too much data and not doing very much with it. I don’t think that’s just an engineering issue, whatever the stereotype of engineers might suggest, but a reality of outcomes assessment: we can collect all kinds of data, but using it to change what we do is often the most challenging part of the process.
I was impressed, at several different sessions at NC State, to see educators excited at the prospect of even slender amounts of data. As Dr. Ken Bain and his colleagues at Montclair State University’s Research Academy argue, repositioning the whole accreditation/outcomes/teaching debate as a question of academic inquiry rather than external requirements can be quite powerful. Educators are researchers, whether they teach economics, English, or third grade. So it makes sense that, presented with data, educators begin to see their own teaching as an area worthy of research.
We presented the changes made to a first-year engineering program, particularly those having to do with research skills.
The challenge is one familiar to most educators: how do we teach students to value the library databases and scholarly resources available to them, and understand the differences between Wikipedia, Google searches, and corporate websites?
The consensus amongst a team of first-year humanities instructors, who taught an interdisciplinary first-year course covering composition, design and research, and literature, was that research skills could be taught more effectively. The existing process saw 30 sections of first-year engineers marched to the library in January for a 50 minute presentation by librarians on the library databases.
This approach was a classic catch-22: since the students were just beginning their research project, they didn’t have any vested interest in learning the ins and outs of ProQuest and LexisNexis. But by the time they did need the information, later in the term as they finished up lengthy design proposals, it was too late to teach them. So these 50 minute sessions were often dry and difficult for students and teachers alike, even when the librarians tried to engage the students with examples of design projects past.
Using Waypoint, the team of ten to twelve humanities instructors had collected two years of detailed data on student achievement across three major written projects. These projects were completed by teams of students and consisted of:
- Problem Definition Statement: This two-page document, due in early February, began to define the problem that the team of engineering students would work to solve. Too often students wanted to leap to solutions (“nuclear powered microwave, cars that can parallel park themselves”) without considering whether anyone needed their great idea. Even at this stage quality research was stressed in lectures and document requirements.
- Proposal: This five-page design report developed the Problem Definition into a full-blown engineering proposal, complete with detailed criteria and constraints. The documents were due in mid-March. Scholarly sources were required, as was a survey of the literature.
- Final Report: This ten-page design report documented a solution to the identified problem. Often these solutions were sketches or feasibility studies, but five scholarly sources were required and the document structure was quite specific.
Data representing the skill of “applying research” for each of these three documents, across three academic years, is shown in Figure 1.
Figure 1: Student Achievement – Average Score for “Application of Research”
Given that we were working in the relatively unscientific world of higher education, we had fantastic data:
- 100+ teams of students (5 students per team) were evaluated each year
- The team of instructors was experienced (most had been teaching the course for four or more years)
- Detailed information on students was available: class rank, SAT scores, high school GPA
- The curriculum was proven and stable
- Rubrics developed in 2004 had been used hundreds of times to assess students, and faculty were ‘normed’. Click here to see a sample rubric.
- Two readers were used on every document, with a proven record of inter-rater reliability.
The first two years (2004 and 2005) showed a reproducible process: students were weak in this area, as seen in the data from the Problem Definition stage, and steadily improved over the 20-week design project. However, the average score on the rubric hovered at 0.8 on a zero-to-1.0 scale, which is not ideal. This means, roughly, that half the teams were scoring at the equivalent of “Research is evident, but could be used more effectively…”
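For readers curious about the arithmetic behind a chart like Figure 1, the sketch below shows one way such an average could be computed. It is a minimal illustration in Python with invented numbers; the function name and the sample scores are hypothetical, and Waypoint handled this aggregation for us in practice.

```python
# A minimal sketch, with invented scores, of the arithmetic behind averages
# like those in Figure 1: each team's "application of research" score is the
# mean of two readers' rubric scores (on a 0.0 to 1.0 scale), and the
# milestone average is the mean across all teams. Waypoint did this
# bookkeeping for us; the numbers below are purely illustrative.

def milestone_average(reader_scores):
    """reader_scores: list of (reader_1, reader_2) tuples, one per team."""
    team_means = [(r1 + r2) / 2 for r1, r2 in reader_scores]
    return sum(team_means) / len(team_means)

# Hypothetical scores for three teams on one milestone (e.g., the Problem
# Definition Statement in a given year).
problem_definition = [(0.75, 0.80), (0.60, 0.70), (0.85, 0.90)]
print(f"Average score: {milestone_average(problem_definition):.2f}")  # 0.77
```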
In the fall of 2005 the interdisciplinary team met with the engineering librarians to devise a change in the curriculum based on the data from these formal assessments and the anecdotal experiences of multiple faculty and students. The team decided that the 50 minute, lecture-style presentation in January would be replaced with an interactive tour delivered through WebCT (Blackboard Vista), face-to-face conferences with the engineering librarians later in the process, and discussion boards in WebCT monitored by the engineering librarians.
Clearly the librarians were spared the mechanical challenge of scheduling and presenting 30+ lectures on library skills, but they would need to manage 50+ face-to-face conferences later in the term (two design teams would meet together to talk through their challenges and goals).
Without the formal, if seemingly unsuccessful, ‘library skills exercise’ in January, the Problem Definition scores fell dramatically in 2006, the first year with the new process. So the dry 50 minute presentation clearly had been making an impact! Removing it significantly reduced the quality of the Problem Definition Statements delivered in February 2006.
The overall average returned to historic levels by the Final Report, but only through the brute-force efforts of the library staff. The additional effort required of staff and students (scheduling meetings around busy calendars, the sheer paper chase of managing such an undertaking with 500+ freshman engineering students) was judged to be too much, and the curriculum reverted to the previous model.
Presenting this data at NC State was exciting, because the audience of 60+ people clearly engaged with the data. Excellent questions were asked, including whether the academic team had been able to survey students to discover which approach they preferred. One attendee asked whether there had been an opportunity to design a control group that would do things the ‘old’ way.
We had great confidence in our data, and did not want to punish a cohort of students by using them as a control. So we felt comfortable, at the gut and statistical levels, changing the curriculum for the 05-06 academic year.
Unfortunately, surveys of student opinion were not available to us, or at least the existing surveys could not be changed to target these specific issues, so this kind of approach to validating and investigating the data was not open to the team.
The data was very useful, however, in bringing about change in the library orientation process. The library went back to doing January overviews for the sections, but made the sessions much shorter, limiting the formal presentation of material to 20 minutes. Since the training continued to take place in January, the librarians worked with instructors to build homework for students to complete in advance. This homework would give the students some background information on an issue, like trash incinerators in New Jersey, and the librarian would then walk students through the process of researching the issue. The data, both the two years of benchmark data and the data resulting from the large change in curriculum, was very helpful for staff and faculty looking to understand their process in greater detail.