This mixed methods study explored the professional competencies that administrators expect from entry-, mid-, and senior-level professionals as reflected in 1,759 job openings posted in 2008. Knowledge, skill, and dispositional competencies were identified during the qualitative phase of the study. Statistical analysis of the prevalence of competencies revealed significant differences between major functional areas and requirements for educational and work experience. Implications for institutional leaders, graduate faculty, and professional development planning as well as for mixed methods research are discussed.
"When situations change on a dime, as is frequently the case in today's economy, people are what make a difference" (Whitely, as cited in Grund, 2009, p. 12). The student affairs profession values people, and in an era of assessment and accountability, it must also be a profession that values the development and demonstration of competence by those people. Palomba and Banta (1999) defined assessment as "the systematic collection, review, and use of information about education for the purpose of [emphasis added] improving student learning and development" (p. 4). Closing the assessment loop from data collection and analysis to improving learning and development often entails changes in the design and delivery of educational programs, but this effort should also consider the knowledge, skills, and dispositions of the educators who enact those programs. For this reason, assessment-minded scholar-practitioners have recently afforded increased attention to the question of what specific competencies successful student affairs professionals need. The research and professional literature addressing this question has largely emphasized new professional competencies and relied on self-reports from senior student affairs officers, mid- and entry-level professionals, and graduate preparation faculty. Though Pace (1985) suggested that self-report data in well-designed studies are valid, Brener, Billy, and Grady (2003) have shown that cognitive and situational factors may influence participant self-report responses. Thus, it is important to triangulate self-report data with additional measures of the phenomenon in question.
The purpose of this exploratory mixed methods study is to extend current literature addressing the self-reported competencies required of entry-, mid-, and senior-level student affairs professionals by examining which competencies college and university administrators include in formal job postings. While we acknowledge that the content of job postings is often politically negotiated between competing campus interests, influenced by campus culture, or tempered by human resource professionals who wish to avoid potential litigation, we also assert that job postings are a meaningful reflection of the competencies college and university administrators desire from the professionals they hire. Further, we stress both the functional and the symbolic importance of the content of job postings. Functionally, the content of job postings provides an initial indicator to prospective candidates of the outcomes that institutions will expect of them as well as of the competencies that administrators believe will be necessary to achieve those outcomes. Symbolically, job postings present a first impression of the underlying institutional values that will guide the desired means of accomplishing outcomes. Thus, the findings of this study should inform not only the work of the preparation programs and professional development organizations that help student affairs professionals to develop the competencies necessary to be successful but also the work of the leaders who design job postings and their associated job descriptions.
To address the study's purpose, the researchers posed these research questions: (a) Which competencies do college and university administrators most frequently include in formally advertised job postings? (b) Which competencies are more or less often required of student affairs professionals within various functional areas? (c) What differences exist in competency requirements between major functional areas, different types and sizes of institutions, and positions requiring different levels of education and work experience?
Our framework for this study considered both the scope and development of competencies through education and professional experience. Standards developed by the American Psychological Association (APA), the Council for Accreditation of Counseling and Related Educational Programs (CACREP), and the National Association of Social Workers (NASW) emphasize knowledge and skills. Because the scope of student affairs work includes leadership and educational functions as well as advocacy and helping roles, we chose to also consider dispositional competencies. The literature addressing dispositions is grounded in works such as Goleman's (1995) model of emotional intelligence and Perkins's (1993) work addressing the interconnections between neurological, experiential, and reflective intelligence and spans both education and leadership studies (e.g. Avolio, 2010; Bass & Riggio, 2005; Leithwood, Seashore Louis, Anderson, & Wahlstrom, 2004). For the purpose of this study, we defined dispositional competencies as encompassing "attitudes, values, and beliefs" (NCATE, 2008, p. 80) and "habits of the mind ... that filter one's knowledge, skills, and beliefs and impact the action one takes in professional settings" (Thornton, 2005, p. 62).
To address the evaluation and development of competencies, we drew upon the five-stage skill acquisition model developed by Dreyfus and Dreyfus (1980). Movement through the stages is marked first by a shift from following concrete rules (stage one, novice) to identifying recurrent patterns or aspects (stage two, competence). In the third stage, proficiency, professionals move beyond aspect recognition to evaluating aspects in the context of various situations. Professionals next demonstrate expertise (stage four) when intuition replaces aspect recognition and evaluation. Finally, the fifth stage, mastery, is marked by a degree of transcendence of expertise.
American College Personnel Association (ACPA) and National Association of Student Personnel Administrators (NASPA; 2010) recently published a set of 10 professional competency areas for student affairs professionals. Their work built on prior sets of competencies identified by the Council for the Advancement of Standards in Higher Education (CAS, 2006) and ACPA (2008), as well as numerous empirical studies. This work, officially adopted by ACPA and NASPA governing boards, extends prior discussions of professional competence by providing outcomes and descriptions of each competency area that were "divided into basic, intermediate, and advanced levels that delineate the increasing complexity and ability that should be demonstrated by practitioners as they grow in their professional development" (ACPA & NASPA, 2010, p. 4). The 10 competency areas are (a) advising and helping, (b) assessment, evaluation, and research, (c) equity, diversity, and inclusion, (d) ethical professional practice, (e) history, philosophy, and values, (f) human and organizational resources, (g) law, policy, and governance, (h) leadership, (i) personal foundations, and (j) student learning and development.
Several recent research studies have also aimed to identify sets of professional competencies that entry-level professionals should possess. Lovell and Kosten (2000) conducted a meta-analysis of 30 years of research in order to identify 16 broad knowledge, skill, and personal trait characteristics that were vital to success in the student affairs profession. Their competencies were similar to those recently generated by ACPA and NASPA, though they did not include competencies in the areas of equity, diversity, and inclusion; ethical professional practice; or history, philosophy, and values. Burkard, Cole, Ott, and Stoflet (2005) employed a Delphi design involving multiple iterations of surveys with a panel of 104 mid- and senior-level student affairs administrators. The 32 competencies identified by Burkard et al. aligned well with the ACPA and NASPA competencies, though they did not include any competencies in the areas of ethical professional practice or history, philosophy, and values. These competency areas did materialize in a recent study by Hickmott and Bresciani (in press), who classified the 26 competencies that emerged from their study as knowledge, skills, or dispositions. In their study, ethical practice was included with the legal knowledge competency. The Hickmott and Bresciani study differed from those conducted by Lovell and Kosten and Burkard et al. in that they employed a grounded theory approach to analyze formal documents from 54 graduate preparation programs. Thus, competencies related to ethical professional practice and history, philosophy, and values emerged from the study that examined graduate program documents but not from those that examined prior research (Lovell & Kosten, 2000) or the self-reports of mid- and senior-level practitioners (Burkard et al., 2005).
Development and Evaluation of Competencies
Additional research has endeavored to assess faculty and administrator perceptions regarding the degree to which entry-level professionals have mastered essential competencies. Herdlein (2004) administered a mixed methods survey to a sample of 48 senior student affairs officers (SSAOs) who worked at colleges and universities with student affairs graduate preparation programs. Herdlein found that SSAOs were generally satisfied with the level of new professional competence, and that they rated new professionals highest in the areas of overall knowledge of higher education, knowledge of student development theory, and skills in leadership, technology, and counseling. These SSAOs rated new professionals lowest in the skill areas of budgeting, strategic planning, and research and assessment, as well as in the knowledge areas of campus politics and legal matters. More recently, Herdlein, Kline, Boquard, and Haddad (2010) studied faculty perceptions of the importance of various learning outcomes for their programs. In faculty responses to survey items, the most highly rated learning outcomes were in the areas of (a) knowledge of student characteristics and the effects of college, (b) student development theory, (c) how values inform practice, and (d) multicultural perspectives. The lowest rated outcomes were for (a) teaching methods, (b) international education, (c) governance and public policy, and (d) research methods. When asked via an open-ended question to identify the course that was most important to professional practice, faculty listed student development and learning courses more than twice as often as any other course.
Waple (2006) studied entry-level professionals themselves rather than SSAOs or faculty. Waple's findings largely mirrored Herdlein's (2004), though Waple found that new professionals rated themselves lower in several technology-related competency areas. Cuyjet, Longwell-Grice, and Molina (2009) studied recent preparation program graduates and their supervisors and found that graduates rated their knowledge acquisition higher than did their supervisors. Renn and Jessup-Anger (2008), however, found that assessing their own levels of competence and proving themselves were significant challenges for new professionals.
Two additional studies sought to explore differences in the perceptions of SSAOs and faculty regarding entry-level competencies. Kuk, Cobb, and Forrest (2007) analyzed survey responses from 60 SSAOs, 60 mid-level managers, and 60 faculty regarding the importance of 50 competencies that aligned with four broad clusters of knowledge and skill competencies. They found that faculty rated the importance of (a) individual practices and administration, (b) goal setting and the ability to manage change, and (c) managing organizations and groups significantly lower than did either SSAOs or mid-level managers; they found no differences for professional knowledge and content. Faculty were also more likely to expect entry-level professionals to master professional knowledge and content through coursework, though they expected them to learn how to manage organizations and groups in professional settings. Renn and Jessup-Anger (2008), in their grounded theory study of the experiences of entry-level professionals, found that new professionals desired greater support in managing the cultural dynamics of work environments.
Dickerson et al. (2011) compared ratings by 125 faculty and 275 SSAOs of 51 discrete knowledge, skill, and dispositional competencies. They found no differences between SSAOs and faculty in the perceived importance of 49 competencies and no differences in assessments of the degree to which new professionals possessed 42 of the 51 competencies. Dickerson et al. further examined differences between the degree to which the entire sample rated the competencies as "desired for" and "currently possessed by" new professionals. They found significant gaps in the areas of fiscal management, assessment, and knowledge of legal standards, findings that mirror those of Herdlein (2004) and Waple (2006). However, Dickerson et al. also found significant gaps for collaboration, conflict management, the application of theory to practice, and written communication, areas identified as strengths in the Herdlein and Waple studies.
To summarize, there appears to be emerging consensus within current research and professional literature regarding the scope of knowledge, skill, and dispositional competencies for entry-level professionals. However, this consensus largely reflects analyses of the self-reports by SSAOs, faculty, and other practitioners regarding these competencies, but not which competencies administrators include in job postings. Further, current competency research is largely limited to expectations for new professionals, but not those for mid- or senior-level professionals who should be able to demonstrate skill acquisition at a more advanced level.
We drew from a pragmatic orientation to adapt what Creswell and Plano Clark (2011) described as an exploratory mixed methods research design. This design involves an initial qualitative data collection and analysis phase that informs subsequent quantitative data collection and analyses. In keeping with what Patton (1990) identified as a "mixed form design" and Tashakkori and Teddlie (1998) described as a "mixed model design," our study involved a single data set that we initially analyzed using a qualitative approach. The results of the initial qualitative analysis were then analyzed using quantitative methods. This mixing of data analyses allowed us to extend the identification of competencies from the data set to an exploration of the prevalence of these competencies both within and between groups inside the larger data set. The following provides an overview of the data utilized in this study. Because of the sequential nature of the study, we present the research design and results for each phase of the study separately.
Data Collection and Sample
The data for this study consisted of all 1,759 job descriptions posted through The Placement Exchange (TPE) in 2008. TPE is a partnership between NASPA, the Association of College and University Housing Officers-International (ACUHO-I), the National Association for Campus Activities (NACA), the Association for Student Judicial Affairs (ASJA), the
National Orientation Directors Association (NODA), the Association of Fraternity/Sorority Advisors (AFA), and HigherEdJobs.com (www.theplacementexchange.org). TPE holds an annual placement conference prior to the national meeting of NASPA and serves as a centralized online source for student affairs job postings. Data collected for this set of job postings included the institutional type and size, job category, education and work experience required of applicants, and the full text of the job postings (see Table 1).
Assumptions and Limitations
This single data set served as our source and therefore we assumed that the sample was representative of student affairs positions throughout the United States. In qualitative terms, these job postings served as a large data set that should contribute to reasonable external or "ecological" validity (see Gall, Borg, & Gall, 1996), meaning that we anticipated that a similar set of competencies would emerge from a grounded theory analysis of an alternate comprehensive set of student affairs job postings. Since there are no existing data that accurately break down the number of student affairs professionals employed nationally at various types and sizes of institutions or in various functional areas, it was impossible to compare this sample to the full population of student affairs jobs.
There are several limitations to our assumptions regarding the representativeness of the sample and to the external validity of findings. The data were collected in 2008, just prior to a significant economic downturn and shortly following the publication of the Spellings Report (U.S. Department of Education, 2006). This, along with other socio-historical factors, likely influenced the content of some job postings; one should use some caution, for example, in assuming that job postings advertised during the economic downturn would reflect the findings of this study. Further, a visual review of Table 1 reveals that positions within community colleges were underrepresented, as were positions within the functional areas of admissions and enrollment, academic advising, outreach, and financial aid. Additionally, there was significant variation in the content, length, and detail of information included in job postings and descriptions. We assumed that these variations, which are a form of measurement error, were randomly distributed across the large sample of data.
For the initial phase of the study, we employed an adaptation of open and substantive coding to identify categories of competencies that were emergent within the data set (see Morse, 2009). Because our first research question aimed to identify competencies but not to explore the interrelationships between them, we utilized only open and substantive coding processes. We delimited the open coding process to the first 100 job postings in the data set and used these data as the basis for identifying competency areas.
Twenty-three job competency categories emerged from our initial analysis and clustering of codes. Drawing from the job postings associated with each category, we generated definitions for each competency area and then used these definitions to re-code the entire data set of 1,759 job postings. For this final re-coding, we used whole job postings as the unit of analysis; in effect, we assigned a yes or no dummy code for each of the 23 competency areas to every job posting. We then reviewed frequencies to ensure discriminant validity between the various competency areas. As there was more than 90% overlap among the job postings coded as "assessment," "program evaluation," and "research," we collapsed these three into a single competency category. Table 2 summarizes the emergent definitions and frequency counts for each of the final 21 competencies.
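The dummy-coding and collapsing steps described above can be sketched in a few lines of code. This is an illustrative reconstruction, not the authors' actual procedure: the mini corpus, the use of Jaccard overlap as the overlap measure, and the 90% threshold rule are all assumptions for the sake of the example.

```python
# Hypothetical sketch: each posting is a set of competency codes (dummy
# coding yes/no per competency); heavily overlapping codes are collapsed.
MERGE_SET = {"assessment", "program evaluation", "research"}
MERGED_NAME = "assessment, evaluation, and research"

def jaccard(postings, code_a, code_b):
    """Share of postings carrying either code that carry both (Jaccard overlap).

    The study reports >90% overlap among the three merged codes; the exact
    overlap measure used is an assumption here.
    """
    a = {i for i, codes in enumerate(postings) if code_a in codes}
    b = {i for i, codes in enumerate(postings) if code_b in codes}
    return len(a & b) / len(a | b) if a | b else 0.0

def collapse(postings):
    """Replace any of the three overlapping codes with the single merged code."""
    out = []
    for codes in postings:
        if codes & MERGE_SET:
            codes = (codes - MERGE_SET) | {MERGED_NAME}
        out.append(codes)
    return out
```

In use, a posting coded for both "assessment" and "program evaluation" would end up carrying only the merged "assessment, evaluation, and research" code, reducing the 23 emergent categories to the 21 reported in Table 2.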
Testing for Differences
For the second phase of the study, we constructed a series of cross-tabulation (crosstab) tables to compare the frequencies within various functional areas as well as between functional areas, institutional types and sizes, and sets of required education and work experience. In what follows, we review the specifics of our research design followed by the results for each research question sequentially.
Differences within Functional Areas
To examine differences within the various functional areas, we first delimited the sample to the 1,540 job postings categorized in only one functional area. Given 21 competency areas (each coded yes or no) and 11 functional areas (coded yes or no), this yielded 231 2x2 crosstab tables. We used Fisher's exact test to check for differences and Phi to measure effect size. The null hypothesis for Fisher's exact test assumes that the prevalence of yes and no values for each competency area will be divided proportionally across each of the 11 functional areas. When the test was statistically significant, we rejected the null hypothesis and assumed that there were differences in the prevalence of the competency for the given functional area. In those instances, we further calculated Phi to examine the effect size of the differences. Phi is a symmetric measure that determines the effect size of differences in 2x2 crosstab tables. Cohen (1988) placed Phi (along with the other symmetric measures used in this study) into the same family of statistics as the more common Pearson's r measure of correlation; thus, one should interpret the Phi statistic in a similar manner as one would interpret a Pearson's r.
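For readers unfamiliar with these 2x2 statistics, the following pure-Python sketch shows how a two-sided Fisher's exact p-value and a Phi coefficient are computed for a single competency-by-functional-area table. The counts in the test cases are invented for illustration and do not come from the study's data.

```python
# Minimal sketch of the 2x2 tests described above, for a table [[a, b], [c, d]]
# (rows: in/out of a functional area; columns: competency yes/no).
import math

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table.

    Sums the hypergeometric probabilities of every table with the same
    margins whose probability is no greater than the observed table's.
    """
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    def prob(x):  # P(top-left cell = x) under fixed margins
        return math.comb(r1, x) * math.comb(r2, c1 - x) / math.comb(n, c1)
    p_obs = prob(a)
    lo, hi = max(0, c1 - r2), min(c1, r1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-12))

def phi_coefficient(a, b, c, d):
    """Phi effect size for a 2x2 table, interpretable like Pearson's r."""
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0
```

A perfectly balanced table yields Phi of 0 and a p-value of 1, while a table with all counts on the diagonal yields Phi of 1 and a very small p-value.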
Table 3 reviews the results of tests of differences within each of the functional areas. Because non-parametric measures are sensitive to sample size, one should not compare Phi values between two different functional areas. The Phi values are an accurate measure of the effect size for differences within each of the functional areas. While there were many statistically significant differences in competency prevalence within the various functional areas, the effect size of these differences was generally small or quite small. According to Cohen (1988), effect sizes for Phi that are less than .10 are much smaller than typical for the social sciences; those between .10 and .30 are small.
Differences between Groups
Our third research question addressed differences between various functional areas, between different types and sizes of institutions, and between positions requiring different levels of education and work experience. To compare the effect size for differences between various functional areas using non-parametric statistics, one needs to have sufficient numbers within each of the comparison groups, and there should not be any large fluctuation in the sample sizes of the groups. Given the large sample size differences for each of the functional areas, it was impossible to run comparative data without heavily weighting the data, which would significantly increase the likelihood of Type I error. For this reason, we limited comparative analyses to institutional type, institutional size, and the levels of education and work experience required.
Differences by institutional type. In order to test for differences between institutional types, we first delimited our sample to 2-year, 4-year private, and 4-year public institutions (n = 1,641) and then weighted the data for the purpose of comparison. The result was 21 2x3 tables, one for each competency; each table analyzed the competency (yes or no) against the three institution types (2-year, 4-year private, or 4-year public). For 2x3 tables with nominal data, the chi-square is the appropriate non-parametric test of difference and Cramer's V is the preferred symmetric measure.
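The chi-square statistic and Cramer's V for one of these 2x3 tables can be sketched as follows. This is a generic illustration of the formulas, not the study's analysis; the table values in the test cases are invented.

```python
# Hedged sketch: Pearson chi-square and Cramer's V for an r x c contingency
# table, e.g. competency (yes/no) by institution type (2-year, 4-year private,
# 4-year public).
import math

def chi_square(table):
    """Pearson chi-square statistic: sum of (observed - expected)^2 / expected."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

def cramers_v(table):
    """Cramer's V effect size: sqrt(chi2 / (n * (min(r, c) - 1)))."""
    n = sum(sum(row) for row in table)
    k = min(len(table), len(table[0])) - 1
    return math.sqrt(chi_square(table) / (n * k))
```

When the competency is distributed proportionally across institution types, both the chi-square statistic and Cramer's V are zero; larger departures from proportionality raise both values.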
A few statistically significant differences emerged in the comparisons by institutional type. Two-year institutions were more likely to include requirements for collaboration with other professionals, [chi square](2) = 49.36, p < .001; V = .13, p < .001; but less likely to include competencies related to crisis management, [chi square](2) = 57.67, p < .001; V = .15, p < .001; teaching and training, [chi square](2) = 53.69, p < .001; V = .14, p < .001; group advising, [chi square](2) = 89.54, p < .001; V = .18, p < .001; and individual advising, [chi square](2) = 55.34, p < .001; V = .14, p < .001. Private 4-year institutions were more likely than public 4-year institutions to include requirements within these latter two advising competency areas, as well as in conflict mediation, [chi square](2) = 30.86, p < .001; V = .11, p < .001.
Differences by institutional size. For institutional size, we did not weight the data because each group had at least 200 cases and there were minimal sample size differences between the groups. The result was 21 2x4 tables, one for each competency; each table analyzed the competency (yes or no) against the four institution size groups (less than 5,000; 5,000-9,999; 10,000-20,000; or more than 20,000). Given 2x4 crosstab tables with ordinal data, we used the chi-square to test for differences and Kendall's tau-b to examine effect sizes. Few statistically significant results emerged from these analyses. The most significant difference was for the attitudes and dispositions competency, which was slightly more prevalent at smaller institutions, [chi square](3) = 17.58, p < .01; tau-b = -.09, p < .001.
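Kendall's tau-b, used here and in the education and experience analyses that follow, can be sketched in pure Python. The helper that expands a crosstab into paired lists and the counts in the test cases are assumptions for illustration only.

```python
# Minimal sketch of Kendall's tau-b with tie corrections, plus a helper that
# expands a crosstab of (ordinal group, competency yes/no) counts into the
# paired observations the statistic operates on.
import math
from collections import Counter

def kendall_tau_b(x, y):
    """Kendall's tau-b rank correlation, correcting for ties in x and y."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    n0 = n * (n - 1) // 2
    t_x = sum(t * (t - 1) // 2 for t in Counter(x).values())  # ties in x
    t_y = sum(t * (t - 1) // 2 for t in Counter(y).values())  # ties in y
    return (concordant - discordant) / math.sqrt((n0 - t_x) * (n0 - t_y))

def expand(crosstab):
    """Turn {(group, yes_or_no): count} into parallel ordinal lists."""
    xs, ys = [], []
    for (group, flag), count in crosstab.items():
        xs.extend([group] * count)
        ys.extend([flag] * count)
    return xs, ys
```

A negative tau-b, as reported for the attitudes and dispositions competency, indicates that the competency becomes less prevalent as the ordinal group (here, institutional size) increases.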
Differences by level of education. Though we were unable to classify the various job postings as entry-level, mid-level, or senior-level, the requirements for education and work experience afforded us a proxy to examine differences along a range of positions extending from entry-level to senior-level. For education requirements, we organized the 1,648 positions that included education requirements into 21 2x4 crosstab tables, one for each competency; each table analyzed the competency (yes or no) against the four levels of required education (bachelor's only, master's preferred, master's required, or doctorate preferred/required). We used the chi-square to test for differences and Kendall's tau-b to test the effect size. Table 4 presents the results of these analyses. Not all tau-b values were statistically significant; this reflects instances where there were differences among the four groups that did not follow the ordinal progression of the four educational levels (e.g. when a competency was more prevalent among the master's required and preferred groups than either the bachelor's only or the doctorate preferred/required group). We listed the competency areas in Table 4 in rank order from those positions requiring the most education to those requiring the least.
Differences by level of work experience. We organized the 1,422 postings that included work experience requirements into 21 2x3 tables, one for each competency; each table compared the competency (yes or no) against the three levels of work experience required (0-3 years, 4-6 years, or 7 or more years). We used the chi-square to test for differences and Kendall's tau-b to test the effect size. Table 5 presents the results. We listed the competency areas in Table 5 in rank order from those positions requiring the most work experience to those requiring the least.
The 21 competencies that emerged from this study aligned well with those generated by ACPA and NASPA (2010), as well as with those identified by the empirical studies that informed the ACPA/NASPA publication (e.g. Burkard et al., 2005; Cuyjet et al., 2009; Herdlein, 2004). Interestingly, one could align three of the four most prevalent competencies in this study (programming, communication, and teaching and training) with ACPA/NASPA's "advising and helping" competency. As noted previously in the literature review, ethical practice--one of the ACPA and NASPA competencies--emerged only from the Hickmott and Bresciani (in press) study of graduate preparation curricula, and then only when integrated with legal knowledge. It did not emerge from either the Lovell and Kosten (2000) or the Burkard et al. (2005) study, and it did not emerge as a stand-alone competency in this study. This underscores that the findings of this study are descriptive rather than prescriptive. The fact that ethical practice did not emerge as a prevalent competency area does not mean that it is unimportant for professionals or the profession.
ACPA and NASPA's history, philosophy, and values competency was also missing from the Burkard et al. (2005) study of mid- and senior-level perceptions and the Lovell and Kosten (2000) meta-analysis of prior practitioner research, though it did emerge in the Hickmott and Bresciani (in press) study of graduate preparation program documents. One might have interpreted this to mean that history and philosophy are more important to faculty than to practitioners. However, knowledge of the profession's history and philosophy emerged as a stand-alone competency in this study, a finding that suggests that this competency is important to more than just faculty. Further, this competency was most commonly included among positions in student affairs administration that required higher levels of education and experience, which may reflect how knowledge of history and philosophy may contribute to practitioners' capacities in aspect recognition and evaluation, processes associated with higher levels of skill acquisition in the Dreyfus and Dreyfus (1980) model.
Developing and Evaluating Competencies
When comparing the prevalence of these competencies within this sample to studies that have aimed to rank the importance of competencies, several interesting differences emerged. The studies by Burkard et al. (2005), Herdlein (2004), and Waple (2006) each identified sets of attitudes and dispositions, practical skills, and critical thinking skills among the most important or highly ranked competencies in their studies. However, critical thinking (3.2%), attitudes and dispositions (14.3%), and practical competencies (24.0%) were among those least frequently included among the 2008 job postings. This may reflect hesitancy by human resource departments to include in job postings those competencies that are difficult to measure in selection processes or it may mean that these skill sets are assumed. Yet, if these skills are important and serve as criteria for future performance evaluations, administrators may be wise to negotiate for their inclusion in formal job postings. Future research could address this issue by triangulating job posting analyses with performance evaluation criteria or qualitative interviews with the administrators and human resource professionals who craft job postings, job descriptions, and performance evaluation protocols.
On the other end of the spectrum, colleges and universities included assessment, evaluation, and research competencies in 48.1% of 2008 job postings, but the related knowledge and skill competencies were ranked in the middle of the competency sets generated by Waple (2006) and Herdlein et al. (2010) and near the bottom of the 32 competencies generated in the Burkard et al. (2005) study; they were not included at all among the 34 traits identified as critical for success in the Herdlein (2004) study. This may reflect the growing importance of outcomes-based assessment and program review in student affairs and higher education, particularly in light of the growing economic challenges and increased calls for accountability. Regardless of the reason, we find the increased prevalence in this study encouraging. We also suggest that competency in assessment, evaluation, and research is germane to all functional areas and should be the work of all professional educators on campus. Thus, we encourage leaders who design job postings and their associated job descriptions to more intentionally and systematically include assessment-related competencies in these important documents.
Functional Area Differences
Most of the differences within the various functional areas seemed intuitive, though there were a few surprises. As noted previously, the fact that there were statistically significant differences for a greater number of competencies within residence life, student activities, and student affairs administration may reflect a broader set of desired competencies for these positions, but it may also reflect the sensitivity of non-parametric measures to sample size. Among the functional areas with smaller sample sizes, the fact that fundraising emerged as a more prevalent competency within multicultural services was noteworthy. This may reflect the reality that multicultural services is often neither self-supporting, as is the case with residence life, nor supported by student fees, as is often the case with student activities. It may also reflect the growing availability of grants to support those services and programs and the understanding that these programs are desirable philanthropic venues for many donors. However, it may also be that some institutions or divisions of student affairs continue to view the work of multicultural services as more peripheral than central to their mission. In any case, we suggest that future studies explore why higher percentages of positions in multicultural services require fundraising competencies.
In residence life, fundraising competencies were among the least prevalent, along with assessment, evaluation, and research competencies and the two collaboration-related competency areas. This is of interest because residence life has served as a common training ground for advancement in the student affairs profession, yet competencies in collaboration; assessment, evaluation, and research; and fundraising were among those most frequently required of student affairs administrators. Also common for student affairs administration positions were the diversity and social justice competencies, a pattern matched only by the multicultural services functional area. We certainly do not question the value of residence life experience for advancement in the student affairs profession. That said, we note that many of the competencies most frequently required of student affairs administrators were also common within career services and multicultural services, which suggests that these areas may also serve well as training grounds for senior-level leadership.
Competencies for Entry-, Mid-, and Senior-Level Professionals
Several interesting differences emerged between job postings requiring different levels of education and work experience, our proxy for examining differences between entry-, mid-, and senior-level positions. It is important to note that Dreyfus and Dreyfus (1980) described advancements in "skill acquisition" in terms of shifts to more ambiguous, situational, holistic, and intuitive means of functioning. In light of this study's findings, it appears that the development of competence in some areas is cumulative, an assumption consistent with the Dreyfus and Dreyfus model and upheld by the ACPA/NASPA (2010) professional competency areas document. This seemed to be the case, for example, with the leadership and fiscal management competencies, which were most frequently included in job postings that required higher levels of education and experience. Other competencies, such as those related to the history and philosophy of the profession, may have greater utility as professionals advance to positions that require more situational, holistic, and intuitive ways of knowing (knowledge), functioning (skill), and being (disposition).
Competency areas such as technology, practical competence, and advising were more prevalent among entry-level postings. In the case of the practical and technological competency areas, this may suggest that these are "gateway" competencies, or it may be that these are competencies that SSAOs delegate to their staff. Practitioners who do not master outcomes for these competencies at what ACPA/NASPA have described as the basic, intermediate, and advanced levels may have limited capacity to advance to mid- and senior-level positions. For advising and training competencies, areas that are also more prevalent among entry-level positions, cumulative learning may work somewhat differently. One could argue that the skills developed in these competency areas are transferable to the areas of collaboration and leadership. Experience and professional development in advising and training may serve as a precursor to later development within the collaboration and leadership competency areas, as well as a prerequisite to career advancement. In light of the attention that ACPA and NASPA (2010) afforded to delineating "the increasing complexity and ability that should be demonstrated by practitioners as they grow in their professional development" (p. 6), we suggest that this progression and development of professional competence is a topic worthy of further investigation.
Assuming that the scope and prevalence of competencies within advertised job postings reflect the values of administrators in terms of professional education, training, and development, there are important implications for graduate preparation programs and professional organizations, as well as for employers. We invite readers to question whether this listing accurately represents what the profession values as a whole, whether in 2008 or beyond. In addition, we encourage practitioners to consider where and when these competencies should be mastered.
Graduate preparation programs are important training grounds for new professionals, and they are most effective when informed by quality standards (see Young & Janosik, 2007). The results of this study, along with the findings of related studies, should inform graduate preparation faculty of the competencies that are most relevant to entry-level and mid-level professionals. For example, the importance of assessment, evaluation, and research and of student learning and development was evident in this study. Bresciani (2010) found that training for outcomes-based assessment is most effective when paired with training in student development theory. The findings here further suggest that the competency areas of individual and group advising, conflict mediation, and teaching and training may be of particular importance for master's-level programs. Future studies should build on the work of Bresciani and the findings of this study to explore synergistic opportunities in training for multiple competency areas.
The integration of training for multiple competency areas is likely even more important for doctoral preparation programs and for those individuals who design and deliver professional development programs for mid-level and senior-level professionals. In these settings, educators should emphasize the development of more ambiguous, situational, holistic, and intuitive competency in the areas of leadership; budgeting and fiscal management; assessment, evaluation, and research; collaboration; and diversity and social justice. Hoffman and Bresciani (2010), for example, found a high co-occurrence of competency requirements in leadership, decision-making, collaboration, and teaching and training for assessment professionals working in student affairs. Paired with the findings of this study, the implication is that best practices for leadership training in student affairs are integrative and sequential. If we assume that expertise within each competency area builds over time, then employers may also want to consider which competencies are required of the profession holistically, as well as how, and at what level, they represent expected competencies in their position advertisements. For guidance in this area, employers may consider professional literature that is more definitive (e.g., ACPA & NASPA, 2010) than descriptive, as is the case for this study.
We believe that student affairs is a profession that values people, and a profession that values competency within its people. The best of assessment and accountability efforts emphasize systematic self-study for the purpose of improving practices that result in greater levels of student learning and success. Comprehensive assessment efforts that aim to close the loop between self-study and improving practice must consider the knowledge, skill, and dispositional competencies of the educators who design learning interventions for students, both within and outside the classroom. Ongoing research and scholarly discourse regarding the scope and content of competencies will continue to be critical as the student affairs profession intentionally designs and implements professional preparation programs and professional development to educate the people who work so diligently to promote access, equity, and overall student success within higher education.
American College Personnel Association. (2008). Professional competencies: A report of the steering committee on professional competencies. Retrieved October 1, 2008, from http://www.myacpa.org/au/governance/docs/ACPA_Competencies.pdf
American College Personnel Association & National Association of Student Personnel Administrators. (2010). ACPA/ NASPA professional competency areas for student affairs practitioners. Washington, DC: Authors.
Avolio, B. J. (2010). Full range leadership development (2nd ed.). Thousand Oaks, CA: Sage.
Bass, B. M., & Riggio, R. E. (2005). Transformational leadership. Mahwah, NJ: Lawrence Erlbaum Associates.
Brener, N.D., Billy, J.O., & Grady, W.R. (2003). Assessment of factors affecting the validity of self-reported health-risk behavior among adolescents: Evidence from the scientific literature. Journal of Adolescent Health, 33, 436-457.
Bresciani, M.J. (2010). Understanding barriers to student affairs professionals' engagement in outcomes-based assessment of student learning and development. Journal of Student Affairs, 14, 81-90.
Burkard, A., Cole, D.C., Ott, M., & Stoflet, T. (2005). Entry-level competencies of new student affairs professionals: A Delphi study. NASPA Journal, 42(3), 283-306.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
Creswell, J.W., & Plano Clark, V.L. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA: Sage.
Council for the Advancement of Standards in Higher Education. (2006). CAS professional standards for higher education (6th ed.). Washington, DC: Author.
Cuyjet, M.J., Longwell-Grice, R., & Molina, E. (2009). Perceptions of new student affairs professionals and their supervisors regarding the application of competencies learned in preparation programs. Journal of College Student Development, 50(1), 104-119.
Dickerson, A.M., Hoffman, J.L., Anan, B.P., Brown, K.F., Vong, L.K., Bresciani, M.J., Monzon, R., & Hickmott, J. (2011). A comparison of senior student affairs officers and student affairs preparatory faculty expectations of entry-level professionals' competencies. The Journal of Student Affairs Research and Practice, 48(4), 463-479. doi: 10.2202/1949-6605.627
Dreyfus, S.E., & Dreyfus, H.L. (1980). A five-stage model of the mental activities involved in directed skill acquisition. Unpublished report, University of California, Berkeley.
Gall, M.D., Borg, W.R., & Gall, J.P. (1996). Educational research: An introduction (6th ed.). New York, NY: Longman.
Goleman, D. (1995). Emotional intelligence. New York, NY: Bantam Books.
Grund, N. (2009). Student affairs and the new economy. Leadership Exchange, 7(3), 11-13.
Herdlein, R. J., III. (2004). Survey of chief student affairs officers regarding relevance of graduate preparation of new professionals. NASPA Journal, 42(1), 51-70.
Herdlein, R.J., III, Kline, K., Boquard, B., & Haddad, V. (2010). A survey of faculty perceptions of learning outcomes in master's level programs in higher education and student affairs. College Student Affairs Journal, 30(1), 33-45.
Hickmott, J., & Bresciani, M.J. (in press). Examining learning outcomes in student personnel preparation programs. Journal of College Student Development.
Hoffman, J.L., & Bresciani, M.J. (2010). Assessment work: Examining the prevalence and nature of learning assessment competencies and skills in student affairs job postings. The Journal of Student Affairs Research and Practice, 47(4), 495-512. doi: 10.2202/1949-6605.6082
Kuk, L., Cobb, B., & Forrest, C. (2007). Perceptions of competencies of entry-level practitioners in student affairs. NASPA Journal, 44(4), 664-691.
Leithwood, K.K., Seashore Louis, K., Anderson, S., & Wahlstrom, K. (2004). How leadership influences student learning. New York, NY: Wallace Foundation. Retrieved June, 2005, from http://www.wallacefoundation.org/NR/rdonlyres/E3BCCFA5-A88B-45D3-8E27-B973732283C9/0/ReviewofResearchLearningFromLeadership.pdf
Lovell, C.D., & Kosten, L.A. (2000). Skills, knowledge, and personal traits necessary for success as a student affairs administrator: A meta-analysis of thirty years of research. NASPA Journal, 37(4), 353-369.
Morse, J. M. (2009). Tussles, tensions, and resolutions. In J. M. Morse, P. N. Stern, & J. Corbin (Eds.), Developing grounded theory: The second generation (pp. 13-19). Walnut Creek, CA: Left Coast Press.
National Council for the Accreditation of Teacher Education. (2008, February). Professional standards for the accreditation of teacher preparation institutions. Washington, DC: Author. Retrieved from http://www.ncate.org/documents/standards/NCATE%20Standards%202008.pdf
Pace, C.R. (1985). The credibility of student self-reports. Los Angeles, CA: University of California, The Center for the Study of Evaluation, Graduate School of Education.
Palomba, C.A., & Banta, T.W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.
Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA: Sage.
Perkins, D. (1993). New conceptions of thinking: From ontology to education. Educational Psychologist, 28(1), 67-85.
Renn, K.A., & Jessup-Anger, E.R. (2008). Lessons for graduate preparation programs from the National Study of New Professionals in Student Affairs. Journal of College Student Development, 49(4), 319-335. doi: 10.1353/csd.0.0022
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.
Thornton, H. (2006). Dispositions in action: Do dispositions make a difference in practice? Teacher Education Quarterly, 33(2), 53-68.
United States Department of Education. (2006). A test of leadership: Charting the future of higher education. Retrieved from http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/final-report.pdf
Waple, J.N. (2006). An assessment of skills and competencies necessary for entry-level student affairs work. NASPA Journal, 43(1), 1-18.
Young, D.G., & Janosik, S.M. (2007). Using CAS standards to measure learning outcomes of student affairs preparation programs. NASPA Journal, 44(2), 341-366.
John L. Hoffman, Ph.D.
California State University, Fullerton
Marilee J. Bresciani, Ph.D.
San Diego State University
The authors wish to thank The Placement Exchange and its supporting professional associations for providing the data for this study. The authors additionally thank Chieh-Hsing Chou and Michelle Cuellar for their assistance with coding the data.
Table 1
Institution and Job Characteristics

Category | N | %

Institution Type (n = 1,759)
  2-Year Institutions | 116 | 6.6%
  4-Year Public Institutions | 600 | 34.1%
  4-Year Private Institutions | 925 | 52.6%
  Professional Organization/Other | 118 | 6.7%

Institution Size (n = 1,759)
  Under 5,000 | 658 | 37.4%
  5,000-9,999 | 287 | 16.3%
  10,000-20,000 | 372 | 21.1%
  More than 20,000 | 442 | 25.1%

Job Category (n = 1,729)*
  Admissions and Enrollment Management | 45 | 2.6%
  Career Services | 70 | 4.0%
  Counseling, Health, and Wellness | 65 | 3.7%
  Greek Life | 64 | 3.6%
  Judicial Affairs | 50 | 2.8%
  Multicultural Services (including lesbian, gay, bisexual, and transgender services and women's centers) | 105 | 6.0%
  Orientation and New Student Programs | 63 | 3.6%
  Residence Life | 797 | 45.3%
  Student Activities (including leadership programs and student unions) | 232 | 13.2%
  Student Affairs Administration | 405 | 23.0%
  Other Student Affairs Positions | 150 | 8.5%

Highest Level of Education Preferred or Required (n = 1,668)**
  Bachelor's Only | 129 | 7.8%
  Master's Preferred | 611 | 37.1%
  Master's Required | 700 | 42.5%
  Doctorate Preferred or Required | 208 | 12.6%

Years of Professional Experience Required (n = 1,422)
  0-3 Years | 779 | 54.8%
  4-6 Years | 512 | 36.0%
  7 or More Years | 131 | 9.2%

Note. * 219 job postings were categorized under two or more functional areas; thus, the cumulative percentages for this section exceed 100%. ** The data for some job postings did not include the highest level of education preferred or the years of professional experience required.

Table 2
Emergent Competency Areas

Programming: Requirements to design, develop, implement, or manage programs, activities, and other forms of learning interventions.
Communication: Written and oral forms of communication as well as interpersonal communication.
Assessment, Evaluation, and Research: All references to outcomes-based assessment, program review and evaluation, research, and data analysis.
Teaching and Training: Teaching credit-bearing and non-credit courses as well as providing various forms of training.
Leadership: Efforts to provide long-term direction to divisions or departments, including planning, forecasting, and visioning.
Budgeting and Fiscal Management: All references to understanding and managing budgets, financial plans, and fiscal resources; this did not include fundraising.
Collaboration with Non-Faculty Professionals: Requirements for teamwork, partnership, or collaboration that specifically cited non-faculty professionals on the campus. (Collaboration with faculty was also cited in 480, or 69.3%, of these postings.)
Collaboration with Faculty: Requirements for teamwork, partnership, or collaboration specifically with faculty. (Collaboration with non-faculty professionals was also cited in 479, or 73.8%, of these postings.)
Law and Policy: Knowledge of higher education law, oversight of disciplinary procedures, development and administration of policy, and understanding of governance structures.
Diversity and Social Justice: Required knowledge, skill, or dispositional competencies necessary to work with diverse students or to create inclusive learning environments; this extended beyond mere descriptions of working in a diverse environment or identification of the institution as an equal opportunity employer.
Technology: All requirements to use computers and other forms of technology.
Student Learning and Development: Required knowledge of student development or learning theory and skill-based expectations to apply these in practice.
Practical Competence: An array of skills related to time management, organizational skills, managing multiple responsibilities, working autonomously, and meeting deadlines.
Advising Groups: Requirements for any form of advising student clubs, organizations, or groups of students.
Crisis Management: All requirements to respond to emergencies, assess risks, or manage crisis situations.
Attitudes and Dispositions: Broad requirements for dispositional competencies; the most common were creativity, enthusiasm, flexibility, and a positive attitude.
Advising Individuals: Requirements for providing any form of advising or counseling to individual students.
Conflict Mediation: Requirements for mediating conflicts between individuals or groups of students.
Foundations of the Profession: All requirements for knowledge of the history, philosophy, or ethical standards of student affairs or higher education.
Fundraising: Requirements to engage in grant-writing, organize fundraising events, and solicit donations.
Critical Thinking: Requirements for problem-solving, decision-making, critical thinking, and reflective practice.

Competency | Frequency | Percentage
  Programming | 1,139 | 64.8%
  Communication | 1,038 | 59.0%
  Assessment, Evaluation, and Research | 846 | 48.1%
  Teaching and Training | 756 | 43.0%
  Leadership | 753 | 42.8%
  Budgeting and Fiscal Management | 736 | 41.8%
  Collaboration with Non-Faculty Professionals | 693 | 39.4%
  Collaboration with Faculty | 649 | 36.9%
  Law and Policy | 615 | 35.0%
  Diversity and Social Justice | 600 | 34.1%
  Technology | 589 | 33.5%
  Student Learning and Development | 552 | 31.4%
  Practical Competence | 423 | 24.0%
  Advising Groups | 318 | 18.1%
  Crisis Management | 318 | 18.1%
  Attitudes and Dispositions | 252 | 14.3%
  Advising Individuals | 210 | 11.9%
  Conflict Mediation | 174 | 9.9%
  Foundations of the Profession | 100 | 5.7%
  Fundraising | 60 | 3.4%
  Critical Thinking | 56 | 3.2%

Table 3
Differences Within Functional Areas

Competency | Fisher's Exact Test Sig. | Phi Value | Phi Sig.

Admissions/Enrollment Management (n = 30)
  Technology | .00 | .08 | .00
  Advising Groups | .00 | -.07 | .01
  Crisis Management | .00 | -.07 | .01
  Student Learning and Development | .00 | -.09 | .00
  Teaching and Training | .00 | -.12 | .00

Career Services (n = 55)
  Advising Individuals | .00 | .16 | .00
  Technology | .00 | .14 | .00
  Collaboration with Faculty | .00 | .08 | .00
  Communication | .00 | .07 | .00
  Crisis Management | .00 | -.09 | .00
  Law and Policy | .00 | -.12 | .00

Counseling/Health/Wellness (n = 40)
  Advising Groups | .00 | -.07 | .01

Greek Life (n = 47)
  Advising Groups | .00 | .18 | .00

Judicial Affairs (n = 26)
  Conflict Mediation | .00 | .09 | .00
  Law and Policy | .00 | .09 | .00
  Teaching and Training | .00 | .09 | .00
  Assessment, Evaluation, and Research | .01 | .07 | .01
  Budgeting and Fiscal Management | .00 | -.07 | .01

Multicultural Services (n = 63)
  Diversity and Social Justice | .00 | .12 | .00
  Fundraising | .00 | .11 | .00
  Crisis Management | .00 | -.08 | .00
  Law and Policy | .00 | -.09 | .00

Residence Life/Housing (n = 720)
  Crisis Management | .00 | .24 | .00
  Conflict Mediation | .00 | .14 | .00
  Law and Policy | .00 | .09 | .00
  Advising Groups | .00 | .08 | .00
  Foundations of the Profession | .00 | -.07 | .00
  Programming | .00 | -.08 | .00
  Leadership | .00 | -.10 | .00
  Teaching and Training | .00 | .13 | .00
  Collaboration with Faculty | .00 | -.11 | .00
  Communication | .00 | -.13 | .00
  Technology | .00 | -.13 | .00
  Collaboration with Other Professionals | .00 | -.14 | .00
  Budgeting and Fiscal Management | .00 | -.15 | .00
  Assessment, Evaluation, and Research | .00 | -.16 | .00
  Fundraising | .00 | -.16 | .00

Student Activities (n = 158)
  Advising Groups | .00 | .15 | .00
  Budgeting and Fiscal Management | .00 | .09 | .00
  Fundraising | .00 | .08 | .00
  Communication | .01 | -.07 | .00
  Conflict Mediation | .00 | -.09 | .00
  Diversity and Social Justice | .00 | -.10 | .00
  Crisis Management | .00 | -.11 | .00
  Collaboration with Faculty | .00 | -.12 | .00
  Collaboration with Other Professionals | .00 | -.14 | .00

Student Affairs Administration (n = 287)
  Leadership | .00 | .18 | .00
  Budgeting and Fiscal Management | .00 | .16 | .00
  Collaboration with Faculty | .00 | .16 | .00
  Collaboration with Other Professionals | .00 | .15 | .00
  Diversity and Social Justice | .00 | .11 | .00
  Communication | .00 | .10 | .00
  Foundations of the Profession | .00 | .09 | .00
  Fundraising | .01 | .08 | .00
  Assessment, Evaluation, and Research | .01 | .07 | .01
  Advising Individuals | .01 | -.07 | .01
  Advising Groups | .00 | -.16 | .00
  Teaching and Training | .00 | -.17 | .00

Other Student Affairs Positions (n = 77)
  Fundraising | .00 | .12 | .00
  Technology | .00 | .10 | .00
  Communication | .01 | .07 | .01
  Advising Groups | .00 | -.07 | .01
  Crisis Management | .00 | -.08 | .00

Note. Table 3 lists competency areas that were statistically significant at p < .01, listed from most common to least common according to phi values (positive phi values indicate that the competency was more likely to occur within the functional area; negative phi values indicate the opposite).

Table 4
Differences Between Positions Requiring Different Levels of Education

Competency | Chi-Square Value (df = 3) | Chi-Square Sig. | Kendall's tau-b Value | Kendall's tau-b Sig.
  Leadership | 202.7 | .00 | .22 | .00
  Budgeting and Fiscal Management | 172.7 | .00 | .20 | .00
  Collaboration with Faculty | 100.8 | .00 | .15 | .00
  Assessment, Evaluation, and Research | 74.3 | .00 | .14 | .00
  Diversity and Social Justice | 47.1 | .00 | .09 | .00
  Collaboration with Other Professionals | 46.9 | .00 | .09 | .00
  Student Learning and Development | 28.6 | .00 | .08 | .00
  Law and Policy | 33.8 | .00 | .07 | .00
  Communication | 20.1 | .00 | .06 | .00
  Crisis Management | 41.3 | .00 | .05 | .00
  Foundations of the Profession | 33.6 | .00 | .05 | .01
  Fundraising | 32.2 | .00 | .05 | .02
  Conflict Mediation | 11.9 | .01 | .03 | .06
  Teaching and Training | 39.8 | .00 | -.01 | .53
  Advising Groups | 77.4 | .00 | -.06 | .00
  Advising Individuals | 24.1 | .00 | -.07 | .00
  Practical Competence | 39.0 | .00 | -.10 | .00
  Technology | 50.5 | .00 | -.11 | .00

Note. Positive Kendall's tau-b values indicate that the competency area was more prevalent for positions requiring higher levels of education; negative values indicate the opposite.

Table 5
Differences Between Positions Requiring Different Levels of Work Experience

Competency | Chi-Square Value (df = 2) | Chi-Square Sig. | Kendall's tau-b Value | Kendall's tau-b Sig.
  Leadership | 240.3 | .00 | .30 | .00
  Budgeting and Fiscal Management | 190.6 | .00 | .27 | .00
  Collaboration with Other Professionals | 107.0 | .00 | .20 | .00
  Collaboration with Faculty | 79.6 | .00 | .17 | .00
  Diversity and Social Justice | 49.3 | .00 | .13 | .00
  Assessment, Evaluation, and Research | 44.4 | .00 | .12 | .00
  Communication | 34.2 | .00 | .11 | .00
  Fundraising | 21.9 | .00 | .09 | .00
  Programming | 15.9 | .00 | .08 | .00
  Foundations of the Profession | 15.2 | .00 | .07 | .00
  Law and Policy | 23.4 | .00 | .07 | .00
  Technology | 8.4 | .02 | -.01 | .74
  Crisis Management | 15.9 | .00 | -.07 | .00
  Conflict Mediation | 17.7 | .00 | -.08 | .00
  Advising Individuals | 44.9 | .00 | -.12 | .00
  Teaching and Training | 109.6 | .00 | -.19 | .00
  Advising Groups | 151.8 | .00 | -.24 | .00

Note. A positive Kendall's tau-b value indicates that the competency area was more prevalent for positions requiring higher levels of work experience; a negative value indicates the opposite.
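For readers interested in replicating the effect-size measures reported in the tables above, the phi coefficient (for 2 x 2 competency-by-functional-area tables) and Kendall's tau-b (for ordinal education or experience levels paired with competency presence) can be computed directly from the counts. The following sketch is our illustration only; the helper functions and the toy numbers are hypothetical and are not the study's data.

```python
import math

def phi_coefficient(a, b, c, d):
    """Phi for a 2x2 table [[a, b], [c, d]]: (ad - bc) over the root of the margin product."""
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0

def kendall_tau_b(x, y):
    """Kendall's tau-b for paired ordinal data, with the standard correction for ties."""
    n = len(x)
    n0 = n * (n - 1) // 2  # total number of pairs
    conc = disc = ties_x = ties_y = 0
    for i in range(n):
        for j in range(i + 1, n):
            dx, dy = x[i] - x[j], y[i] - y[j]
            if dx == 0:
                ties_x += 1
            if dy == 0:
                ties_y += 1
            if dx != 0 and dy != 0:
                if dx * dy > 0:
                    conc += 1  # pair ordered the same way on both variables
                else:
                    disc += 1  # pair ordered oppositely
    denom = math.sqrt((n0 - ties_x) * (n0 - ties_y))
    return (conc - disc) / denom if denom else 0.0

# Toy 2x2 table (invented counts): competency present/absent by in/out of a functional area.
# A positive phi indicates the competency is more likely within the area, as in Table 3.
print(round(phi_coefficient(20, 10, 10, 20), 3))

# Toy paired ordinal data (invented): education level (1-4) vs. an ordinal competency rating.
print(round(kendall_tau_b([1, 2, 3, 4, 4], [0, 1, 1, 2, 3]), 3))
```

A positive tau-b here mirrors the pattern in Tables 4 and 5: the competency grows more prevalent as required education or experience increases.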