Assessing Undergraduate Student Learning: Does it Work?
"In recent years, national conversations about the value of higher education have led to renewed attention on the topic of general education and those educational outcomes that should be expected for any student who completes a college degree. Further, from an accountability perspective, there has been considerable interest in evidence that postsecondary students are attaining the knowledge, skills, and attitudes expected of them upon completion of a degree in a specific field. This has led to discussion of the importance of identifying effective ways of assessing student learning, both at the course and program level."
—Belle S. Wheelan
President, Southern Association of Colleges and Schools Commission on Colleges
Assessment: it comes in various forms, with various ends. In higher education, assessment may be of faculty performance, student learning, academic programs, colleges, and universities.
This article is about standardized approaches to assessing undergraduates' learning across the measures of critical thinking, complex reasoning, and communication skills. Our purpose is to review what is known about the standardized assessment instruments, why the assessments are important to Texas Tech's future, and how the assessments might be used not only to evaluate students' knowledge and skills, but also to add value to undergraduates' experiences and to the recognition they receive from the university. But first, we should put the assessment of student learning into perspective.
The Big Picture
Great undergraduate education begins with general education in the liberal arts and continues with focused disciplinary education and training—all woven with pedagogical and sociocultural interventions that are intended to foster the development of students' life-long learning skills. Let's consider these components in turn.
At Texas Tech University (TTU), general education requires 47 semester hours in communications, natural sciences, humanities, social and behavioral sciences, visual and performing arts, and mathematics. Texas Tech also has a multicultural requirement, and one or two years of foreign language are required depending upon the degree pursued. These requirements provide the sound foundation in arts, sciences, and culture necessary for more advanced study in the disciplines and professional areas, from architecture to business to communications to engineering and more. Beyond the general education core courses, learning is supported through first-year experience offerings, writing-intensive requirements, and a host of curricular and co-curricular experiences aimed at improving critical thinking, complex reasoning, and communication skills. An additional area of focus, "ethical development," was adopted with the new mission statement in 2005; it is commonly referred to as the "strive for honor" initiative. As examples of the curricular and co-curricular experiences, we offer the following:
- Recently increased enrollments in ethics courses across the humanities, sciences, and technology disciplines, coupled with ethics initiatives led by the Student Government Association, Athletics, the Student Affairs Division, and the Texas Tech Ethics Center.
- Introductory courses in composition that, according to Texas Tech's Core Curriculum Committee (2011), infuse the use of technology while developing students' abilities "to specify audience and purpose and to make appropriate communication choices."
- An institutional requirement, implemented in 1994, that focuses on increasing understanding of multiculturalism and diversity in society, with co-curricular learning facilitated through residence hall programming, student activities, international affairs, study abroad, and the Cross-Cultural Academic Advancement Center.
With general education requirements completed, the majority of students go on to majors that require 63 to 82 hours of coursework. These requirements may be fulfilled through didactic offerings as well as service-learning courses, undergraduate research, study abroad options, and internships, all of which provide opportunities for training and experience in problem-solving, critical thinking, complex reasoning, and communication. These skills can then be assessed by a variety of methods, which is the primary subject of this article.
Assessing Student Learning
Few would dispute that critical thinking, complex reasoning, and communication skills are essential to personal and professional development along a path of life-long learning. Some policy makers and state officials posit that these skills should be the focus of student learning assessment and, in turn, the benchmarks for college and university performance. We believe, as perhaps do most academics, that a variety of measurements is necessary to assess students' learning during their pursuit of undergraduate degrees. Nevertheless, many commercially developed instruments have been used to assess critical thinking, complex reasoning, and communication skills, and it is important for all academics, students and faculty alike, to understand these instruments and their application.
The Voluntary System of Accountability (VSA) was formed by the National Association of State Universities and Land-grant Colleges (now the Association of Public and Land-grant Universities) to provide an institutional "portrait" utilizing a common set of information and learning assessments. The VSA utilizes a pre- and post-test methodology to assess learning while in college and has endorsed three instruments. These include:
- Collegiate Learning Assessment (CLA): Developed by the Council for Aid to Education (CAE), a non-profit organization founded in 1952 and based in New York City, the CLA is probably the most widely used instrument of its type. According to the CAE website (http://www.collegiatelearningassessment.org), "The CLA presents realistic problems that require students to analyze complex materials and determine the relevance to the task and credibility. Students' written responses to the tasks are evaluated to assess their abilities to think critically, reason analytically, solve problems, and communicate clearly and cogently." Texas Tech has utilized the CLA over a three-year period to assess cohorts of 100 freshmen and seniors.
- Collegiate Assessment of Academic Proficiency (CAAP): American College Testing produces the CAAP (http://www.act.org/caap). The CAAP includes six modules, each administered in a separate 40-minute session:
  - Reading: 36 multiple-choice items based on readings in prose fiction, the humanities, the social sciences, and the natural sciences.
  - Writing skills: 72 multiple-choice items testing comprehension of written English, including "punctuation, grammar and usage, sentence structure (usage/mechanics), strategy, organization, and style in writing," but not "spelling, vocabulary, and rote recall of rules of grammar."
  - Writing essay: two 20-minute writing assignments addressing hypothetical situations and written to hypothetical audiences, requiring the student to take a position and provide evidence appropriate to the issues.
  - Mathematics: 35 items testing mathematical reasoning using elements from pre-algebra; elementary, intermediate, and college algebra; coordinate geometry; and trigonometry.
  - Science: 45 items testing scientific reasoning across the biological and physical sciences.
  - Critical thinking: 32 multiple-choice items intended to assess abilities to analyze, evaluate, and extend arguments contained in four passages.

  In the fall of 2010, Texas Tech administered the CAAP science assessment, and in the spring of 2011, the CAAP mathematics assessment will be administered in selected upper-division classes in most colleges.
- Educational Testing Service (ETS) Proficiency Profile (also known as the Measure of Academic Proficiency and Progress [MAPP]): The ETS is an international non-profit corporation founded in 1947 with headquarters in Princeton, New Jersey. It is best known as the developer of the Scholastic Aptitude Test (SAT; now known as the SAT Reasoning Test and administered by the College Board), the Graduate Record Exam, and the Test of English as a Foreign Language. The ETS Proficiency Profile is a single multiple-choice examination designed to measure proficiency in critical thinking, reading, writing, and mathematics in the context of the humanities, social sciences, and natural sciences. According to the ETS website (http://www.ets.org/proficiencyprofile/about), the Proficiency Profile is designed to test academic skills developed, as opposed to subject knowledge taught, in general education courses.
As the descriptions above indicate, the assessment instruments endorsed by the VSA are similar though not identical. All are recognized through the VSA, which is discussed further in a companion article in this issue of All Things Texas Tech (Smith, 2011). The CLA is the instrument most commonly used by Texas public universities and colleges, and it has been considered for adoption by the Texas Higher Education Coordinating Board as a possible measure of the value added to student learning by the university experience. Indeed, the State of Texas is currently attempting to identify a methodology for measuring value added as an indication of institutional performance.
Whatever our views on the assessment of student learning, each instrument mentioned above, when administered with a large enough sample, can provide some information about student learning in general education content areas and skills. Furthermore, the role of institutions in developing critical thinking, complex reasoning, and communication skills has been the subject of debate and controversy. Some would say, for instance, that standardized assessment is an insufficient measure of the effectiveness of an institution's educational experiences and their impact on student learning. Others would cite evidence that significant numbers of students are not acquiring even these basic skills during their collegiate experience.
Arum and Roksa (2011) recently published a study reporting results of standardized tests of critical thinking, complex reasoning, and communication skills for students at 24 diverse higher education institutions. The longitudinal study began in 2005, with 2,341 students taking the CLA; in 2009, 1,666 students from the same cohort were re-tested. Sixty-four percent of this group gained an average of 18% in CLA scores; 36% showed no measurable gain in critical thinking, complex reasoning, and communication skills. As strikingly, 50% of the students took no course in a given semester that required writing a paper of 20 pages or more, and 30% took no course requiring more than 40 pages of reading per week. Perhaps as notable were Arum and Roksa's findings about how students spend their time: of the cohort's weekly hours, students spent on average only 9% attending class and 7% studying, but 51% socializing. Indeed, study time was observed to have dropped from 25 hours per week in 1960 to 12.5 hours per week in 2005. The students who performed best on the CLA generally were those who had taken courses from instructors with high expectations who required considerable reading and writing. Thus, Arum and Roksa's research supports the conclusion that, while the results of the CLA or similar assessments should not serve as a proxy for an institution's total worth, these instruments offer a potentially important measure of instructional quality and student learning in the important areas of critical thinking, complex reasoning, and communication. However, the robustness and validity of such testing are challenged by at least two factors.
First, longitudinal studies of the type described by Arum and Roksa (2011) are difficult and expensive to conduct, especially if one wishes to derive collective as well as individual assessments. The alternative approach, a single cross-sectional testing of students (typically in their junior or senior years), can provide measurements that mirror longitudinal results (Lumina studies), so long as students are motivated to contribute their best efforts when taking the examinations. But here's the rub.
Few students like taking exams or participating in assessment activities. Thus, unless an assessment has a significant impact on a student's academic program, motivation for high performance is low. There is some evidence that a modest payment, perhaps $50 to $100, makes a difference in student performance on assessments, but there are skeptics.
So here we arrive at a critical juncture in this paper. We can see value in assessing critical thinking, complex reasoning, and communication skills. And there is no doubt that high performance in these skills is valuable to students in their subsequent careers, not to mention the positive impact of such skills on life-long learning and functioning as good citizens. Such thinking is consistent not only with the work of Arum and Roksa (2011) but also with that of Richard Light (2001), who studied more than 1,600 Harvard students over several years. A key factor in accurate and reliable assessment, however, is student motivation to perform well.
Here's where we offer a new proposal.
Finding Mutual Benefit in Standardized Test Taking
If the results of assessments benefit an institution and its oversight agencies and provide useful markers of instructional effectiveness and student skills, but are undermined by weak student motivation to perform at a high level, why not find a mutually beneficial solution? The solution could come in the form of a straightforward voluntary program.
One reason for the paucity of student motivation in assessment activities, as suggested above, is that performing well carries no apparent benefit. However, if demonstrable value can be vested in notable assessment scores, including perhaps value assigned by future employers, why not offer university recognition of high performance in critical thinking, complex reasoning, and communication skills? Here's what we have in mind.
Imagine a system in which students are offered the opportunity, beginning as freshmen, to develop an academic portfolio of their work, including assessment results drawn from classroom as well as institution-wide assessments. Students could be assigned electronic portfolios in which samples of freshman work would be stored; faculty in disciplines that emphasize critical thinking, complex reasoning, problem solving, communication skills, and other core competencies would encourage students to upload work samples as benchmarks of their freshman performance. Each year thereafter, students would be encouraged to continue archiving samples that exhibit their best performance or learning to date. In the senior year, results from institution-wide assessments, capstone courses, and senior-level writing courses could be added, along with a resume developed with the assistance of the Career Center's services. At graduation, then, students would be recognized for completing a Texas Tech portfolio containing samples of their best academic work and assessment results to complement their diploma. Students could subsequently submit this documentation to prospective employers or bring it to a job interview. Thus, Texas Tech students would be equipped with writing samples, videos of speeches, and senior projects, along with their transcript and diploma.
We think the above idea and its voluntary program linkage are worthy of consideration at Texas Tech and nationally. But we would appreciate hearing the thoughts of our academic community, faculty members and students alike. Feedback could help in formulating plans for a pilot of the voluntary program offered herein. So, let's hear from you. An e-mail to Bob Smith or Valerie Paton is all it takes.
References
Arum, Richard and Josipa Roksa. 2011. Academically Adrift: Limited Learning on College Campuses. Chicago: University of Chicago Press.
Collegiate Assessment of Academic Proficiency. 2011; http://www.act.org/caap/index.html
Council for Aid to Education Collegiate Learning Assessment. 2011; http://www.collegiatelearningassessment.org
ETS Proficiency Profile Overview. 2011; http://www.ets.org/proficiencyprofile/about
Light, Richard. 2001. Making the Most of College: Students Speak Their Minds. Cambridge, MA: Harvard University Press.
Smith, Bob. 2011. "Are Students Customers? Many Factors Should Inform our Judgment," All Things Texas Tech 3(1); http://www.depts.ttu.edu/provost/attt/2011/02/students-customers.php
Texas Tech University Core Curriculum Committee. 2011; http://www.depts.ttu.edu/provost/councilscmtes/ccc/corecurriculum.php#comm
About the Authors
Valerie Paton is the vice provost for planning and assessment, Texas Tech University, Lubbock, Texas.
Bob Smith is the provost and senior vice president at Texas Tech University, Lubbock, Texas.