
Are Those Rose-Colored Glasses You Are Wearing? Student and Alumni Survey Responses

Research & Practice in Assessment, Volume Ten | Winter 2015

AUTHORS
Amber D. Dumford, Ph.D., Indiana University
Angie L. Miller, Ph.D., Indiana University

CORRESPONDENCE
Email: anglmill@indiana.edu

Abstract

Student surveys are often important elements of assessment in higher education, but alumni surveys can play a substantial role as well. However, little is known about how responses from these two groups compare to one another. Combining data from the Strategic National Arts Alumni Project (SNAAP) and the National Survey of Student Engagement (NSSE), this study examines self-reported college experiences and skill development of seniors and alumni who majored in the arts. Results suggest that alumni rate their overall experience higher, while students judge specific aspects of their institutional experience and their skill development more positively. Given these differences, it is recommended that institutions survey both students and alumni to achieve a more complete picture of the educational experience.

As the economy slowly emerges from recession and funding for higher education institutions continues to be cut, colleges and universities are increasingly required to demonstrate measures of their effectiveness (Kuh & Ewell, 2010). Using surveys to assess skill development and the quality of collegiate experiences has become commonplace (Kuh & Ikenberry, 2009; Porter, 2004), but much of that research collects information from current or graduating students. In fact, the vast majority (85%) of U.S. colleges and universities use some type of national student survey in their assessment plan (Kuh, Jankowski, Ikenberry, & Kinzie, 2014). Yet students are not the only source of information that could be of use for institutions determined to provide evidence of their value and success. Other stakeholders can contribute relevant assessment information as well. An increasing number of institutions are turning to alumni surveys, focus groups, and interviews to gain a unique perspective on learning and other outcomes (Borden & Kernel, 2013; Kuh et al., 2014).

One important measure of institutional effectiveness is alumni success in the workplace (Cabrera, Weerts, & Zulick, 2005). Not only must higher education institutions show evidence of their effectiveness to state funding and accreditation agencies, but students are also aware that in the current economy their employment prospects may be constrained, and they are concerned with getting the best return on their academic investment in the form of employability. With these things in mind, perhaps the viewpoints of alumni who are already in the field, or struggling to enter it, would be even more enlightening than those of students still in their programs. However, little is known about how undergraduate student responses compare with those of alumni. Does the passage of time change the capacity of people to reflect on their learning experiences during college?
Literature Review

In addition to the pure content knowledge gained in a student's chosen major, administrators, faculty, and staff at institutions of higher education claim to prepare their students with a multitude of skills, ranging from effective communication practices to analytical and creative thinking skills (Tait & Godfrey, 1999). Although not all skills learned in higher education settings may transfer directly to the workplace (Stasz, 2001), those at institutions must make every effort to prepare students to be suitable employees. A major function of higher education is to help students develop skills that will lead them to success in the workplace (Evers, Rush, & Berdrow, 1998; Stasz, 2001). While some acquired skills are considered discipline-specific, many "transferable skills" that lead to workplace success, such as problem solving and effective communication, are applicable to a broad range of fields (Bradshaw, 1985; Stasz, 1997). Generic skills are needed across multiple types of jobs, and students possessing them appear more marketable to potential employers. The Association of American Colleges and Universities has recently identified many of these skills, including critical and creative thinking, inquiry and analysis, and written and oral communication, as essential learning outcomes for higher education, hoping to encourage deliberate progress in their development. If curriculum and programming at institutions are lacking in these areas, the employability of their graduates will decrease (Evers et al., 1998).

Alumni surveys can provide direct information on career attainment, as alumni can report back to the institution not only their current job(s) and income, but also how useful the skills they learned at their institution are to their current occupation and how their educational experiences may have shaped the development of these skills and competencies. Because of the need to develop such a range of different skills, many higher education institutions have begun to scrutinize whether or not they are effectively teaching these skills in their curriculum, and alumni surveys can provide this type of information. As there is increasing pressure for colleges and universities to shorten the time it takes students to earn their degrees, some aspects of the curriculum must be cut. Multiple perspectives on the importance of a variety of skills can help departments prioritize their required course content.

Although alumni can provide an abundance of important information, surveying them involves logistical issues. While student populations are a comparatively captive audience whose email addresses can be trusted to be accurate, alumni populations are less well defined. Alumni surveys also often have lower response rates than student surveys (Smith & Bers, 1987), for a variety of reasons including outdated contact information, suspicion of money solicitation, and decreased institutional loyalty after graduation. Indeed, response rates across a variety of groups have been falling over the past decade (Atrostic, Bates, Burt, & Silberstein, 2001; Baruch, 1999; Porter, 2004). One must also be aware of the increasing demands of technology when it comes to survey research.
Individuals are often encumbered with endless requests to complete online surveys, and while their internet access is virtually unlimited and allows flexibility in where surveys are completed, taking surveys on smartphones and tablets can be additionally burdensome (Buskirk & Andrus, 2012; Lambert & Miller, 2015; Mavletova, 2013). These new issues further add to the complexity of surveying alumni. Nevertheless, it is imperative that administrators at higher education institutions acquire knowledge from their alumni.

Arts programs are one disciplinary area that has been under fire for a lack of preparation in skills needed for the "real world" of work, and it is often difficult to align some of the arts curriculum with rigid accountability standards that may not take into account the unique skills and experiences of arts students (Johnson, 2002). One study found that practical business and management-related skills were greatly underemphasized within arts curricula (Bauer, Viola, & Strauss, 2011), and artists themselves recognize the need for "learning on the fly" and the power of networking and similar smart career mindsets (Smilde, 2008). Conversely, there is also research to suggest that students in the arts are especially adept at certain types of skills, including incorporating verbal studio feedback into revisions of their work (Edstrom, 2008) and critical thinking and interpersonal understanding (Badcock, Pattison, & Harris, 2010). If arts programs are to address these criticisms concerning skill development, collecting information from current students as well as alumni is an instrumental aspect of curricular modification.
Furthermore, arts programs in particular have recently been under scrutiny for the career outcomes of their graduates. Data indicate that those majoring in the arts have some of the lowest income levels, especially among recent college graduates (Carnevale, Cheah, & Strohl, 2012), and arts majors are widely considered in the popular press to be "worthless" in terms of income and employment (Cantor, 2012). Institutions can combat this accusation with alumni data. In addition to simply reporting income and employment status, it may be helpful to use alumni data in expanding the definition of what a "successful" graduate looks like. Research suggests that other aspects of one's career, such as opportunities to be creative or contribute to the greater good, can provide just as much, if not more, of a rewarding experience as can the traditional measures of income and prestige (Lambert & Miller, 2013). This may be particularly pertinent in fields such as the arts or education, which are not generally associated with higher career earnings. Thus, especially when looking at the arts, alumni views of their educational experiences might shed some light on the true value of their time at their institutions. The current study compares information from an arts alumni survey and a survey of graduating seniors to explore how the two groups' views of their experiences may differ from and strengthen one another.

Research Questions

Given the need for student and alumni surveys in higher education assessment, the purpose of this study is to explore the relationship between student and alumni views. The following general research questions guided this study:
1. Are there differences in how students and alumni perceive aspects of their institutional experiences and the skills and competencies that they acquire at their institutions?

2. What are the implications of interpreting alumni reports as unbiased assessments of the strengths and weaknesses of a program? Do alumni evaluate their institutions with "rose-colored glasses" and cast things in a positive light, or do they evaluate their education more harshly once they gain a more practical knowledge of the working world?

3. Finally, if differences between students and alumni do exist, whose report should be given precedence in making curricular or programming assessments and changes? Should institutions give more weight to student reports, which have the accuracy of closeness in time to the experience, or to the reports of alumni, which have the advantage of pragmatic perspective and hindsight?

Methodology

To address these questions, this study used data from the Strategic National Arts Alumni Project (SNAAP) and the National Survey of Student Engagement (NSSE). SNAAP is an annual online survey of arts graduates from a broad spectrum of institutions, including independent colleges of art and design, music conservatories, and arts schools, departments, or programs at comprehensive colleges and universities. The arts are defined broadly to include a range of fields such as music, theatre, dance, design, architecture, creative writing, film, media arts, illustration, and fine arts. SNAAP surveys alumni on a wide range of content, including formal education and degrees, institutional experiences, postgraduate resources for artists, past and current career information, avocational arts engagement, income and debt, and demographic information. The 2011 SNAAP administration included over 36,000 total respondents at 66 participating institutions. Participants were sent an invitation email including a link to the survey with a unique identification number. Participants could log in to their unique link multiple times, so they were not constrained to respond to all survey questions during a single sitting. However, the unique link tracking system ensured that participants could submit their completed survey only once. The median completion time was 22 minutes.

NSSE is an annual online survey of first-year and senior students that gives a snapshot of college student experiences inside and outside of the classroom. The items on NSSE gather information on the extent to which students engage in and are exposed to educational experiences that represent good practices related to desirable college outcomes. The 2012 NSSE administration included over 285,000 respondents at 546 institutions. The median completion time for the core NSSE survey was 13 minutes. Each year, experimental item sets are appended to the end of the core NSSE survey. As part of the 2012 NSSE administration, a set of experimental items asked first-year and senior students at selected institutions about skills and experiences that matched questions on the SNAAP questionnaire.

Sample

For the purposes of this study, only data from those institutions that participated in both the 2011 SNAAP administration and the additional item set on the 2012 NSSE administration were used. SNAAP is administered in the fall, while NSSE has a spring administration.
Therefore, these two data sources were collected at the closest points in time to one another, compared to other years of survey data from either project. The sample consisted of 222 seniors and 593 recent undergraduate alumni (graduating between 2001 and 2010) at six different four-year institutions. The seniors were selected based on reporting an arts major in one of the corresponding SNAAP arts programs of participation. The alumni cohorts of 2001 to 2010 were chosen because their experiences were closer to those of the graduating seniors, and no major curricular changes had occurred in those years at these six participating institutions.

As with most survey research, females responded at a higher rate than their male counterparts: a majority of both the graduating senior and alumni respondents were female (72% and 61%, respectively). The racial composition of respondents was similar to the population of these six institutions (73% white for NSSE and 70% for SNAAP), with the only exception being that Asian respondents were slightly over-represented among SNAAP respondents (5%). About one-third of the respondents were first-generation students (37% and 30%), and nearly all respondents were U.S. citizens (98% for both surveys). The response rates for the six institutions ranged from 14% to 25% for SNAAP and 27% to 51% for NSSE, with an average institutional response rate of 19% for SNAAP and 34% for NSSE.

Measures

The measures that are the focus of this study are taken from one individual item and two additional item sets. The first question asked students and alumni to give an overall rating of their institutional experience on a 4-point Likert scale ranging from Poor to Excellent. This question is on the core survey for both NSSE and SNAAP. In contrast, the next two sets were developed for SNAAP and are on the SNAAP core survey, but were added to NSSE as additional questions appearing at the end of the core NSSE survey. The second set of questions asked participants to rate their satisfaction with nine aspects of their time at the institution, including academic advising, opportunities for degree-related internships or work, instructors, sense of belonging and attachment, and opportunities to network with alumni and others. This set used a 4-point Likert scale from Very dissatisfied to Very satisfied with an additional Not relevant option. For the purposes of this study, the Not relevant responses were removed from the data to create ordinal variables. Finally, the third question set asked about 16 different skills and competencies developed at the institution. Participants were asked, "How much did [your institution] help you acquire or develop each of the following skills and abilities?" and provided responses using a 4-point Likert scale with the end points of Not at all and Very much. The skills and competencies included critical thinking, broad knowledge and education, creative thinking, research skills, persuasive speaking, project management skills, technological skills, artistic technique, financial and business management skills, leadership skills, networking and relationship building, and teaching skills. All skills and aspects of time at institution included in the question sets are listed in Table 1.
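To make the treatment of the Not relevant option concrete, the sketch below shows one way the recoding step could be implemented; the data frame, column names, and numeric codes are illustrative assumptions, not the actual SNAAP or NSSE data layout.

    # A minimal sketch of dropping "Not relevant" responses, assuming hypothetical
    # codes: 1 (Very dissatisfied) through 4 (Very satisfied), with 9 = Not relevant.
    import pandas as pd

    NOT_RELEVANT = 9
    SATISFACTION_ITEMS = ["academic_advising", "career_advising", "internships"]  # illustrative subset

    def to_ordinal(responses: pd.DataFrame) -> pd.DataFrame:
        """Return a copy in which Not relevant codes become missing values,
        leaving each satisfaction item as a clean 1-4 ordinal variable."""
        out = responses.copy()
        for item in SATISFACTION_ITEMS:
            out[item] = out[item].where(out[item] != NOT_RELEVANT)  # NaN where not relevant
        return out

Treating Not relevant as missing, rather than as a fifth scale point, keeps the remaining responses ordinal, as described above.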
The demographic variables of gender, race, citizenship status, and parent education were included on both survey instruments as well.

Table 1
Comparison of Graduating Seniors and Alumni on Institutional Experiences and Development of Skills

                                                               Adjusted Means(a)
                                                               Student  Alumni  Sig.  Effect Size (d)
Overall Experience                                               3.27    3.39    *         .17

Aspects of Time at Institution
  Opportunities to present, perform, or exhibit your work        3.30    3.29
  Opportunities to take classes outside of your
    major/discipline                                             3.12    3.25
  Instructors in classrooms, labs, and studios                   3.37    3.37
  Academic advising                                              2.99    2.84    *        -.16
  Advising about career or further education                     2.80    2.44    ***      -.35
  Opportunities for degree-related internships or work           2.68    2.41    **       -.27
  Opportunities to network with alumni and others                2.71    2.61
  Sense of belonging and attachment                              3.09    3.19
  Freedom and encouragement to take risks                        3.17    3.16

Skills and Abilities
  Critical thinking and analysis of arguments and information    3.41    3.34
  Broad knowledge and education                                  3.30    3.30
  Listening and revising                                         3.49    3.44
  Creative thinking and problem solving                          3.59    3.53
  Research skills                                                3.30    3.11    **       -.23
  Clear writing                                                  3.21    2.96    ***      -.30
  Persuasive speaking                                            2.96    2.78    *        -.21
  Project management skills                                      3.21    3.02    *        -.21
  Technological skills                                           3.23    3.12
  Artistic technique                                             3.71    3.63
  Financial and business management skills                       2.24    1.92    ***      -.38
  Entrepreneurial skills                                         2.23    1.99    **       -.27
  Interpersonal relations and working collaboratively            3.18    3.17
  Leadership skills                                              3.05    2.88    *        -.21
  Networking and relationship building                           3.07    2.83    ***      -.28
  Teaching skills                                                2.86    2.73

(a) Adjusted for gender, race, U.S. citizenship status, and first-generation status.
* p < .05, ** p < .01, *** p < .001

Data Analysis

Analysis of covariance (ANCOVA) was conducted to determine whether differences in reported satisfaction and skill development exist between graduating seniors and alumni. Prior to the estimation of the models, exploratory analyses were conducted testing the assumptions underlying the application of ANCOVA, and all were met (Glantz & Slinker, 2001). Using SNAAP data from the previous fall also guaranteed that no NSSE respondents would be eligible for participation in SNAAP after their graduation, which would have violated the independent-samples assumption of the statistical analyses. The adjusted means are reported for each of the groups, along with the statistical significance of the difference between the two groups. The statistical software used (Statistical Package for the Social Sciences v20.0) automatically implements a corrected formula to account for unequal sample sizes. Next, effect sizes (standardized mean differences using Cohen's d for ANCOVAs, calculated by dividing the adjusted mean difference by the square root of the mean square error) were calculated to determine the magnitude of the differences between graduating seniors and alumni. The effect size with controls represents how much of the raw difference is left unexplained after adjusting the means for student and alumni characteristics. Control variables included gender, race, U.S. citizenship status, and first-generation status, as previous research (Pascarella & Terenzini, 2005) suggests that there are differences in student engagement and educational experiences based on these characteristics. Because these variables are categorical, they were dummy-coded prior to inclusion in the analyses.
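As a concrete illustration of the analysis just described, the sketch below fits the equivalent OLS model (group plus dummy-coded controls, the standard regression formulation of a one-way ANCOVA) and computes Cohen's d as the adjusted mean difference divided by the square root of the mean square error. The study itself used SPSS v20.0; this Python version, with its hypothetical frame and column names (rating, group, female, white, us_citizen, first_gen), only mirrors the reported procedure.

    # A hedged sketch of the ANCOVA and effect-size computation; column names are
    # assumptions. Each outcome item in Table 1 would be run through this in turn.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    CONTROLS = ["female", "white", "us_citizen", "first_gen"]  # dummy-coded (0/1)

    def ancova_cohens_d(df: pd.DataFrame, outcome: str = "rating") -> dict:
        """Fit outcome ~ group + controls and return adjusted means and Cohen's d."""
        formula = f"{outcome} ~ C(group) + " + " + ".join(CONTROLS)
        model = smf.ols(formula, data=df).fit()

        # Adjusted means: predicted outcome for each group with every control
        # held at its overall sample mean.
        grid = pd.DataFrame({"group": ["senior", "alumnus"]})
        for c in CONTROLS:
            grid[c] = df[c].mean()
        senior_adj, alumnus_adj = np.asarray(model.predict(grid))

        # Effect size as described above: adjusted mean difference (alumni minus
        # seniors) divided by the square root of the mean square error.
        d = (alumnus_adj - senior_adj) / np.sqrt(model.mse_resid)
        return {"senior_adj": senior_adj, "alumnus_adj": alumnus_adj, "cohens_d": d}

Under this sign convention, a negative d indicates that alumni rated the item lower than seniors, matching Table 1.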
Results

Comparison of the ratings of overall institutional experience suggests that alumni give higher general appraisals than their graduating senior counterparts when evaluating their educational experience as a whole (p < .05, Cohen's d = .17). Using the adjusted means, significant differences were found for three of the nine specific aspects of time at institution (academic advising, career advising, and opportunities for internships). In contrast with the overall institutional experience evaluation, these results suggest that alumni give lower appraisals for these particular aspects (Cohen's d = -.16, -.35, and -.27, respectively). Adjusted means comparisons for the amount of institutional contribution to acquired skills and competencies show a similar pattern, with alumni giving significantly lower appraisals for 8 of the 16 skills (Cohen's d ranging from -.21 to -.38). The skills with significantly lower ratings were clear writing, persuasive speaking, networking and relationship building, leadership skills, research skills, project management, financial and business skills, and entrepreneurial skills. All ANCOVA results are shown in Table 1.

Discussion

When thinking back to their institutional experience as a whole, it may be that alumni are viewing it through rose-colored glasses. The arts alumni included in this study tended to rate their institutions slightly more favorably than the senior students graduating with arts majors when making universal assessments. However, when considering more nuanced aspects of their educational experiences, alumni perceptions may have a more lackluster pallor. In terms of their satisfaction with aspects of their time at the institution, post-graduation experiences in the workplace may better enable alumni to reflect on certain aspects and realize where improvements could have helped them in their current careers. In particular, alumni were less satisfied than graduating seniors in the areas of academic advising, career advising, and opportunities for internships or degree-related work. Applying the old adage that "hindsight is 20/20," it may be the case that as students, respondents do not realize that they need better advising or an internship until they enter the workforce and gain a more realistic perception. This highlights the importance of surveying both students and alumni as part of an institutional assessment plan. While student surveys may be easier to conduct, given a readily available population, they may not always provide the most insightful or reflective information. In addition to this more complex understanding of satisfaction with certain aspects of their time, alumni may also learn that they needed better-developed skills only once they have gained work experience.
Alumni were less satisfied than graduating seniors with their institution's contribution to their development of clear writing, persuasive speaking, networking and relationship building, leadership skills, research skills, project management, financial and business skills, and entrepreneurial skills. These results could be interpreted to mean that upon leaving the institution and entering the workforce, alumni perceptions shift in terms of some communicative and procedural skills. Writing, speaking, networking, and leadership are important aspects of communication that may be experienced differently in an applied setting, such as the workplace, than in a classroom situation. Likewise, some task-based procedural skills like research, project management, finance, and entrepreneurship may also be more completely comprehended and valued once an individual transitions from student to employee. When current senior students answer that their institution has contributed "very much" to the development of a certain skill, it may be that they are referencing their development since their first year at the institution and believe that they have made great strides. There is also the possibility that once alumni enter the workforce, they are referencing their skill levels in comparison with colleagues whose skills are quite advanced as a result of years, or perhaps even decades, of actual use.

It may also be informative to borrow some concepts from cognitive psychology in a further discussion of how students and alumni rely on memory searches to respond to survey items. In responding to an item about overall satisfaction with their institution, people may use a heuristic recall strategy, which quickly scans through all associated memories, seeking the most relevant cases (Reisberg, 2012). This strategy is substantially different from an algorithmic one, which systematically evaluates all possible steps of a procedure (Davis & Palladino, 2012). The items concerning satisfaction with aspects of time and acquisition of skills appeared in longer lists containing all of the items in the set, grouped under a common stem. For these items, respondents could work through the list a single item at a time, focusing on recall for each one before moving on to the next. This type of format may lend itself to an algorithmic approach, as opposed to the more heuristic strategy that allows an efficient recall of a more general topic area. Although the heuristic approach is more efficient, it also risks error; thus, the memories available for recall may differ between students and alumni, partially explaining the different direction of patterns for alumni and students on the different types of survey items.

Taken together, the general pattern suggested in these results is that alumni provide more positive evaluations of their institutions overall, yet more critical judgments where certain specific aspects are concerned. However, it should also be noted that in terms of the magnitude of the differences between the alumni and student responses, the effect sizes were all in the small to moderate range (Cohen, 1992). Although this is common for social science and educational research (Gonyea & Sarraf, 2009; Hayek, Gonyea, & Zhao, 2001), it is still a consideration in the interpretation of the results.
The statistical significance of the comparisons is certainly important, but the practical significance of the comparisons, most of which were small to moderate, is an essential component of a complete understanding of the results as well. When institutions with limited resources are considering which potential curricular and programming changes they should prioritize, those aspects with the larger effects might be the more practical areas on which to focus.

Limitations

Although this study has strengths, some limitations should be noted. Given the data collection procedures and response rates, the sample may not be representative of all arts alumni and students, and caution should be exercised when making generalizations. It may also be the case that respondents to student surveys differ from respondents to alumni surveys, but there is evidence to suggest that despite their lower response rates, respondents to alumni surveys are just as representative as respondents to student surveys (Lambert & Miller, 2014). Furthermore, this study relied on self-reported data, which may not always be completely objective. However, most studies looking at self-reports in higher education suggest that self-reports and actual measures of constructs such as abilities are positively related (Anaya, 1999; Converse & Presser, 1989; Hayek, Carini, O'Day, & Kuh, 2002; Laing, Sawyer, & Noble, 1987; Pace, 1985; Pike, 1995) and that social desirability bias is not a substantive concern for reports of basic cognitive and academic behaviors (Miller, 2012). It should also be noted that this study design was cross-sectional rather than longitudinal, and although the students and alumni were matched for major and institution, different individuals responded from each group. Additionally, the quantitative nature of the data may have missed some of the nuance and tone of student and alumni perceptions of their institutions and skill development.

Conclusion

While it is hard to determine which group has a more accurate report of the experience, important institutional information can be gained through surveying both students and alumni. Students may be better able to provide information about affective components of their experience, while alumni may be better judges of the specific things needed in the workplace. Being closer in time to the experience may have the advantage in terms of memory accuracy, but temporal distance may have the advantage of reflective insight. Thus, if administrators and faculty want the complete picture of what can help create the optimal institutional experiences for students and also prepare them for the workforce, gathering information from both students and alumni may be the best assessment practice. Future research should not only expand the topics on which student and alumni comparisons can be made, but also incorporate a longitudinal design that matches data at the respondent level. Moreover, it may be useful to incorporate matched assessment data that are not self-reported. For instance, employer feedback on the skill development of alumni or summative rubrics from faculty in required major courses may supplement the findings from alumni and student surveys.
Furthermore, qualitative approaches such as focus groups and one-on-one interviews could provide an additional source of information for assessment purposes. The SNAAP survey instrument includes several different open-ended questions for alumni to elaborate on various topics, and institutional users often report that these quotes are very powerful in conveying the survey findings to numerous audiences. For example, when asked how the institution could have better prepared them for their career, one alumnus in this study included a specific curricular suggestion, noting, "One thing that I really enjoyed at [my institution] was the push to pursue your own ideas, but the design program could also incorporate projects that focus on the designer/client relationship." This type of qualitative information can further enhance the value and application of the quantitative data when making program updates.

Alumni surveys may be especially important as part of assessment cycles. The responses of alumni may be used to make curricular changes, which then affect current students, who can be assessed as students and later as alumni to determine whether or not the changes were effective. This process can also be interpreted as a means of institutional transparency: alumni already have their degree, so they have a different focus and less at stake, and institutions are willing to share their feedback, both positive and negative, in order to make improvements. The perspectives of both students and alumni are important sources of data for improvement in higher education; therefore, surveys of both populations should be administered to obtain the best information possible.

References

Anaya, G. (1999). College impact on student learning: Comparing the use of self-reported gains, standardized test scores, and college grades. Research in Higher Education, 40, 499-526.

Atrostic, B. K., Bates, N., Burt, G., & Silberstein, A. (2001). Nonresponse in U.S. government household surveys: Consistent measures, recent trends, and new insights. Journal of Official Statistics, 17(2), 209-226.

Badcock, P. B. T., Pattison, P. E., & Harris, K. L. (2010). Developing generic skills through university study: A study of arts, science and engineering in Australia. Higher Education, 60, 441-458. doi: 10.1007/s10734-010-9308-8

Baruch, Y. (1999). Response rates in academic studies – A comparative analysis. Human Relations, 52, 421-434.

Bauer, C., Viola, K., & Strauss, C. (2011). Management skills for artists: 'Learning by doing'? International Journal of Cultural Policy, 17(5), 626-644.

Borden, V. M. H., & Kernel, B. (2013). Measuring quality in higher education: An inventory of instruments, tools, and resources. Retrieved from http://apps.airweb.org/surveys/Default.aspx

Bradshaw, D. (1985). Transferable intellectual and personal skills. Oxford Review of Education, 11(2), 201-216.

Buskirk, T. D., & Andrus, C. (2012). Smart surveys for smart phones: Exploring various approaches for conducting online mobile surveys via smartphones. Survey Practice, 5(1).

Cabrera, A. F., Weerts, D. J., & Zulick, B. J. (2005). Making an impact with alumni surveys. New Directions for Institutional Research, 2005, 5-17. doi: 10.1002/ir.144

Cantor, M. (2012, April 23). The 13 most worthless majors. Newser. Retrieved from www.newser.com

Carnevale, A. P., Cheah, B., & Strohl, J. (2012). College majors, unemployment, and earnings: Not all college degrees are created equal. Washington, DC: Center on Education and the Workforce, Georgetown University.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112, 155-159.

Converse, J. M., & Presser, S. (1989). Survey questions: Handcrafting the standardized questionnaire. Newbury Park, CA: Sage.

Davis, S. F., & Palladino, J. J. (2012). Psychology (7th ed.). Upper Saddle River, NJ: Prentice Hall.

Edstrom, A. M. (2008). Art students making use of studio conversations. Art, Design & Communication in Higher Education, 7(1), 31-44. doi: 10.1386/adche.7.1.31/1

Evers, F. T., Rush, J. C., & Berdrow, I. (1998). The bases of competence: Skills for lifelong learning and employability. San Francisco: Jossey-Bass.

Glantz, S. A., & Slinker, B. K. (2001). Primer of applied regression and analysis of variance (2nd ed.). New York: McGraw-Hill.

Gonyea, R., & Sarraf, S. (2009, June). Contextualizing NSSE effect sizes: Empirical analysis and interpretation of benchmark comparisons. Paper presented at the Annual Forum of the Association for Institutional Research, Atlanta, GA.

Hayek, J. C., Carini, R. M., O'Day, P. T., & Kuh, G. D. (2002). Triumph or tragedy: Comparing student engagement levels of members of Greek-letter organizations and other students. Journal of College Student Development, 43(5), 643-663.

Hayek, J., Gonyea, R., & Zhao, C. (2001, November). Reporting and interpreting effect sizes in higher education research. Paper presented at the Annual Conference of the Association for the Study of Higher Education, Richmond, VA.

Johnson, C. (2002). The position of the fine art student in the context of public accountability of higher education, with specific reference to the Quality Assurance Agency experience. Teaching in Higher Education, 7(2), 215-224. doi: 10.1080/13562510220124277

Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1), 1-20.

Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Kuh, G. D., Jankowski, N., Ikenberry, S. O., & Kinzie, J. (2014). Knowing what students know and can do: The current state of student learning outcomes assessment in U.S. colleges and universities. Urbana, IL: University of Illinois, National Institute for Learning Outcomes Assessment.

Laing, J., Sawyer, R., & Noble, J. (1987). Accuracy of self-reported activities and accomplishments of college-bound seniors. Journal of College Student Development, 29(4), 362-368.

Lambert, A. D., & Miller, A. L. (2013, April). Assessing alumni success: Income is NOT the only outcome! Paper presented at the Annual Meeting of the American Educational Research Association, San Francisco, CA.

Lambert, A. D., & Miller, A. L. (2014). Lower response rates on alumni surveys might not mean lower response representativeness. Educational Research Quarterly, 37(3), 38-51.

Lambert, A. D., & Miller, A. L. (2015). Living with smartphones: Does completion device affect survey responses? Research in Higher Education, 56(2), 166-177.

Mavletova, A. (2013). Data quality in PC and mobile web surveys. Social Science Computer Review, 31(6), 725-743. doi: 10.1177/0894439313485201

Miller, A. L. (2012). Investigating social desirability bias in student self-report surveys. Educational Research Quarterly, 36(1), 30-47.
Pace, C. R. (1985). The credibility of student self-reports. Los Angeles: Center for the Study of Evaluation, Graduate School of Education, University of California at Los Angeles.

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students (Vol. 2). San Francisco: Jossey-Bass.

Pike, G. R. (1995). The relationship between self-reports of college experiences and achievement test scores. Research in Higher Education, 36(1), 1-22.

Porter, S. R. (2004). Raising response rates: What works? New Directions for Institutional Research, 121, 5-21.

Reisberg, D. (2012). Cognition: Exploring the science of the mind (5th ed.). New York: W. W. Norton & Company.

Smilde, R. (2008). Lifelong learners in music; research into musicians' biographical learning. International Journal of Community Music, 1(2), 243-252.

Smith, K., & Bers, T. (1987). Improving alumni survey response rates: An experiment and cost-benefit analysis. Research in Higher Education, 27(3), 218-225. doi: 10.1007/BF00991999

Stasz, C. (1997). Do employers need the skills they want? Evidence from technical work. Journal of Education and Work, 10(3), 205-223.

Stasz, C. (2001). Assessing skills for work: Two perspectives. Oxford Economic Papers, 3, 385-405.

Tait, H., & Godfrey, H. (1999). Defining and assessing competence in generic skills. Quality in Higher Education, 5(3), 245-253.