Quantitative Assessment: Highlights of Results
To convey the range and depth of the ePDP First-Year Seminar student survey, the Fall 2011 report is attached below. Here we focus on a multi-year overview of the quantitative assessment since Fall 2010 with attention to our conclusions about what we see happening, or not happening, as we have expanded use and fine-tuned integration of the ePDP within the First-Year Seminar.
- Linear regression results suggested that students participating in Fall 2010 FYS ePDP sections had significantly higher Fall GPAs (2.95) compared to nonparticipants (2.79), even after High School GPAs, SAT scores, Gender, Income Level, and Admit Date (proxy for student motivation) were entered as covariates. Students who participated in ePDP sections earned Fall GPAs 0.14 points higher than nonparticipants. Results are shown in Tables 1 and 2 in the attached document.
- Logistic regression results suggested that students participating in Fall 2010 FYS ePDP sections had significantly higher one-year retention rates (80 percent) compared to nonparticipants (74 percent), even after High School GPAs, SAT scores, Gender, Income Level, and Admit Date (proxy for student motivation) were entered as covariates. Students who participated in ePDP sections had 48 percent greater odds of being retained compared to students not participating in ePDP sections. Results are shown in Tables 1 and 3.
- Once the number of FYS instructional teams assigning the ePDP increased in subsequent fall semesters, the positive effects of the ePDP on students’ academic success and persistence rates were not sustained. It is possible that more extensive faculty professional development is necessary to ensure that the ePDP is used effectively as a pedagogical tool (one that enhances students’ meaning-making, self-awareness, reflective thinking, and writing). Results are displayed in Tables 4 and 5.
- Fall 2012 students participating in University College FYS sections who completed five or more sections of the ePDP had significantly higher mean scores on student success and self-reported learning outcomes in the following areas: using reflective writing to understand their experiences, adjusting to college, deciding on a major or future career, and understanding self and motivations for attending college. Results are shown in Table 6.
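For readers interpreting the retention finding above, note that the 48 percent figure describes greater odds, not a 48-point difference in retention rates. As a rough plausibility check only, the unadjusted odds ratio implied by the raw rates (80 percent vs. 74 percent) can be computed directly; it will not match the reported covariate-adjusted estimate exactly:

```python
# Unadjusted odds ratio implied by the raw one-year retention rates
# reported above (80% for ePDP participants, 74% for nonparticipants).
# The report's 48%-greater-odds figure is covariate-adjusted, so the
# raw value here is expected to differ somewhat.

def odds(p):
    """Convert a probability to odds (p / (1 - p))."""
    return p / (1 - p)

or_raw = odds(0.80) / odds(0.74)
print(round(or_raw, 2))  # prints 1.41, i.e. ~41% greater unadjusted odds
```

The gap between the unadjusted 1.41 and the adjusted 1.48 reflects the covariates (High School GPA, SAT, Gender, Income Level, Admit Date) entered into the logistic model.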
Discussion of Findings
Our findings suggest that, overall, something was lost as we scaled up use of the ePDP. We saw very positive outcomes in GPA, retention, and course outcomes in the pilot year (2010), when ten faculty used the ePDP with approximately 350 students. These ten faculty went through a week-long seminar focused primarily on reflection and on scaffolding the portfolio into the course curriculum. In 2011 and 2012, approximately 40 faculty used the ePDP with over 1,000 students each semester, and faculty development was condensed into two three-hour workshops. Content in these workshops was similar, but faculty had less hands-on time to think through integrating the ePDP into their course syllabi. As we scaled the implementation, the gains in GPA and retention disappeared. This suggests we need to spend significant time and effort developing a robust, yet manageable, faculty development program for the ePDP.
On the positive side, as we scaled implementation of the ePDP, we still saw gains in self-reported survey data on course and ePDP outcomes. Students completing ePDPs continued to report statistically significantly greater outcomes than students who did not. Many of these outcomes are factors related to retention, so it is our hope that we will see increases in retention over time, even if they did not appear after the first year. In 2013-14, we will develop tracking and year-to-year reporting of GPA and retention for students who have completed an ePDP, particularly since the 2010 cohort, if on track for on-time graduation, will be eligible to graduate in spring 2014.
Finally, the development of the conceptual model will allow us to begin developing learning outcomes and associated assessment methods that are more robust and able to address the complexity of the learning that occurs through use of the ePDP.
Qualitative Assessment Highlights
Content analysis of both student ePDPs and student ePDP survey responses each year has spurred ongoing refinement of ePDP use. For example:
- Analysis of the qualitative questions on the Fall 2011 ePDP evaluation shows that, when students were asked to list three things they learned:
- 19% identified “career/major exploration and development”
- 18% identified “understanding/self-awareness”
- 17% identified “goal setting”
- When asked what three things were most valuable about completing an ePDP:
- 18% responded “goal setting”
- 15% indicated “understanding self/self-awareness”
- 14% indicated “career discovery and planning”
- When asked what could be improved:
- 19% indicated “improve technology”
- 16% responded “nothing”
- 12% suggested “more instruction, direction and support”
Responses to all three questions were very similar to the Fall 2010 pilot results. The one major difference was that “more organization and improved process,” the most common suggestion for improvement in 2010, did not emerge in Fall 2011, suggesting that our faculty development for first-time users improved and that faculty became more adept at scaffolding the ePDP into their courses the second time they used it.
In addition, in the first year of ePDP piloting, we found that students were responding to reflective prompts with one- or two-sentence answers, rather than reflecting deeply. As a result, we revised reflective prompts to eliminate redundancy and elicit more extended reflective essays.
Our initial approaches to ePDP assessment were piecemeal, using rubrics focused on responses to individual prompts. We found that this strategy yielded little insight into student learning and thinking. We quickly moved first to assessment of sections of the ePDP and now to holistic assessment of ePDPs with a rubric intended to capture progress on intended FYS outcomes like increased self-awareness, planning and goal-setting, and integration of learning.
Analysis of ePDPs also helped us to recognize the need for an over-arching conceptual model for the ePDP across students’ undergraduate experience. An interdisciplinary group formed in late spring 2012 and worked steadily through 2012-13 to develop such a model (see our Professional Development practice for further information). The group continues its work in 2013-14, clarifying and extending the model’s principles and structuring professional development for the faculty, advisors, and other instructional staff who will use the ePDP as it rolls out through the undergraduate experience.
Diversity Enrichment and Achievement Program/Student African American Sisterhood
Some of our learning has been disappointing, though valuable. The Diversity Enrichment and Achievement Program (DEAP, which now encompasses the former Student African American Sisterhood), a support program for students of color, has had difficulty implementing the ePDP with its students. Not all DEAP students were engaged with the ePDP in their sections of the FYS, no credit was attached to ePDP work done in DEAP, and even when there was apparent convergence, students were often confused about the relationship between what their FYS instructors asked them to do and what the DEAP leader asked. Thus, the program leader reluctantly paused implementation in 2013 until most incoming students are developing an ePDP and relationships can be more clearly articulated.
Life-Health Sciences Internship Program
The LHSI program, one of our C2L partner programs, used the ePDP as a foundation for discussions among the interns, challenging them to reflect more deeply on the meaning of their year-long experiences as research interns. Following the first full year of ePDP use, the program director and 2012-13 Student Ambassadors (peer mentors) reviewed observations about what worked well and where improvements might be made. Their review drew on an informal reading of both the interns’ ePDPs and the comments captured from the online discussion sessions.
Knowing they would be discussing their experiences, and sometimes sharing their own thoughts with their faculty mentors, encouraged interns to think critically, as they completed the prompts, about what they were doing and learning beyond gaining basic laboratory experience. In online and on-site discussions with the Student Ambassadors, students were able to voice their concerns about what to include in their ePDPs and what to emphasize. They were also able to share ideas with each other about how to integrate their internship with their other learning experiences and their career aspirations, and to articulate these relationships within the framework of the reflective prompts. As one student put it, “it was useful to listen to multiple perspectives about how my internship was integrated with my current coursework as well as a future career. I was able to use the questions to create an ongoing conversation about my performance at work.”
Other discussions centered on practical applications of creating an ePortfolio. “One thing I really like about the ePDP is how you can make multiple submissions so at the end of our internship when we have to do the poster we will be able to look back and see how we have grown and improved throughout the experience, kind of like a notebook,” as one student said. Another noted that “the questions have really helped me to take time and think about what I am learning from the internship instead of just go through the motions.”
The program director shared these lessons with the new 2013-14 Ambassadors (who were interns themselves in 2012-13) to strengthen guidance to help the new group of interns make the most of this important experience.