Wednesday, September 30, 2015

IHE: Admissions Revolution

Article: Admissions Revolution
Author: Scott Jaschik
Publication: Inside Higher Ed
Date: September 29, 2015

In this article, Jaschik describes a revolutionary approach to college admissions adopted by more than 80 institutions of higher learning. Developed and adopted by the newly formed Coalition for Access, Affordability, and Success, the new protocol seeks to establish a more holistic application process for prospective students.

While the majority of the article is not directly related to assessment in higher education, I found it notable that one major component of the initiative is an online platform on which high school students will build electronic admissions portfolios, beginning in their ninth grade year:
The high school student's portfolio: This would be offered to all high school students, free, and they would be encouraged to add to it, starting in ninth grade, examples of their best work, short essays on what they are most proud of, descriptions of their extracurricular activities and so forth. Students could opt to share or not share all or part of their portfolios, but college admissions leaders would provide regular prompts, appropriate for grades nine and up, and questions students should ask about how they are preparing for college.
Not only does this initiative reinforce the importance of portfolio assessment in education, it may also, in time, provide a commonly accepted framework for what a "successful" ePortfolio looks like. This would be of great use to higher education institutions looking to create or expand upon portfolio or capstone forms of assessment. Furthermore, beginning in the 2019-2020 school year, institutions of higher learning will begin enrolling students who arrive at college with a ready-made four-year portfolio. How can colleges build upon these efforts?

Read the original article for more details on the other aspects of the new admissions process.

Tuesday, September 29, 2015

IHE: Are They Learning?
Article: Are They Learning?
Author: Doug Lederman
Publication: Inside Higher Ed
Date: September 25, 2015

In this article from IHE, Doug Lederman examines the recently completed first year of a faculty-driven pilot study by the Multi-State Collaborative and the AAC&U. The pilot hopes to establish a set of commonly adopted student learning outcomes and rubrics (based on the VALUE rubrics adopted by MCLA and institutions nationwide) for use in authentic assessment of a wide variety of undergraduate work. Lederman asserts that the pilot may be a good way to begin the process of allowing federal higher education policy to focus more on student learning:
The question of student learning outcomes has been largely relegated to the back burner of public policy in the last few years, displaced by recession-driven concerns over whether students are emerging from college prepared for jobs.... That's not entirely by choice, though; administration officials noted in a policy paper accompanying the Scorecard that while learning outcomes are "an important way to understand the results and quality of any educational experience … there are few recognized and comprehensive measures of learning across higher education, and no data sources exist that provide consistent, measurable descriptions across all schools or disciplines of the extent to which students are learning, even where frameworks for measuring skills are being developed."
Lederman's article asserts that the MSC pilot (in which Massachusetts institutions have participated as part of the Vision Project) may be a good way to begin to bridge the gap between college faculty (who are given the power to select student work they view as "important" and to perform the grading) and academic policy makers (who want to see student progress assessed in common ways across institutions). Additionally, it provides an opportunity for institutions to demonstrate growth over time.

Read the original article for (largely positive) reactions from faculty and policy-makers involved in the study, as well as a summary of the results of the assessment.

For another look at the pilot, you can also refer to the article Faculty Measures to See Promise in Unified Way to Measure Student Learning from the Chronicle of Higher Education.
But it was the subcategories within each broad skill area that were often more revealing, several faculty members said.... Such detailed feedback is particularly useful because it directly relates to actual course work, said Jeanne P. Mullaney, assessment coordinator for the Community College of Rhode Island. The results can help faculty members change their assignments, guided by a shared conception of a particular skill area. "The great thing with rubrics," she said, "is the strengths and weaknesses are readily apparent."