Part 2 - PCF for Student Success Coaches: Improving Student Success through the Performance Management of Success Coaches
by Dr. Andre P. Davis Jr.


Part 2 (Continuation of the article)

Black and Taylor (2021) state that the COVID-19 crisis forced colleges to adopt case management processes to distribute much-needed resources to students, primarily CARES Act funding from the US government. SSCs understand that student engagement usually happens from one crisis to the next. Weissman and Schmidt (2020) suggest that ineffective and reduced delivery of support services negatively impacted students in need. The literature suggests that students respond to a one-on-one case management model, and institutions often see increases in key performance indicators (Schiemann & Molnar, 2019). A one-on-one case management model could allow for personal connections with students, a more thorough investigation of a student's situation, and a potential solution to a student's unique needs (Black & Taylor, 2021). Institutional leadership at a community college in the Texas Community College System introduced a model of poverty-informed retention strategies through a department called the Student Advocacy Center (SAC), which utilized one-on-one case management as a retention tool; increased focus on program pathways, graduation, and university transfer through active learning; and provided training to increase faculty awareness of the impact of poverty on student retention (Black & Taylor, 2021).

The Stanley Workflow Chart was created by the author as a proactive way to monitor the student engagement process by SSCs, with the goal of preventing students from stopping out between support services. The Stanley Workflow Chart came into existence after the author engaged a student who had attended a North Carolina Community College for several years but had yet to progress beyond the first year of courses in the curriculum of the student's chosen program. The student had engaged with Disability Services at the college but did not utilize the accommodations provided. Moreover, because the student had been academically suspended, could no longer receive financial aid, and was paying for classes out of pocket, the student's academic advisor continued to register the student for classes without a thorough investigation of the student's records, and eventually the student stopped out. As stated earlier in this paper, SSCs generally see students from one crisis to the next. This student was engaged during the final crisis; however, the student could not be retained even with interventions such as tutoring and one-on-one coaching sessions.

Di Tommaso (2011) found that students with other cognitive abilities neglected to seek support services because of time constraints, past adverse experiences with student support personnel, and a lack of perceived need for support services. The developmental and educational literature defines situational and socio-affective variables as non-cognitive variables, which could be barriers for students with other cognitive abilities (Di Tommaso, 2011). Federaro (2010) found that the need for developmental courses in math, reading, and writing has risen to over 65 percent of students entering college, including students with other cognitive abilities. Other research has shown that students with other cognitive abilities could experience issues with navigating financial decisions, managing schedules, self-efficacy, study skills, finding their bearings, and managing stress. Black and Taylor (2021) suggest that a one-on-one model provides a student with a more robust experience with support service personnel. A one-on-one case management model is a process that SSCs can use to manage student interactions effectively. The Stanley Workflow Chart and the Success Coach Evaluation Rubric help SSCs develop an equitable and consistent student service delivery standard. When coupled with a tool like an early alert system (EAS), which utilizes a predictive analytics model to manage student information effectively, the student experience becomes even more robust.

Figure 1

Stanley Workflow Chart for Student Case Management (Please visit Dr. Davis' profile for the workflow chart).

Note. Davis, A. (2021). Stanley Workflow Chart. Manuscript submitted for publication.
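Because the workflow chart itself is not reproduced here, the sketch below is only a hypothetical illustration of the kind of monitoring the chart supports: recording a student's touchpoints across support services and flagging cases at risk of stopping out between services. The service names, data structures, and 21-day gap threshold are assumptions for illustration, not the author's actual chart.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Touchpoint:
    service: str          # e.g., "intake", "disability_services", "tutoring" (hypothetical stage names)
    occurred_on: date

@dataclass
class StudentCase:
    student_id: str
    touchpoints: list = field(default_factory=list)

    def last_contact(self):
        """Date of the most recent support-service touchpoint, or None if never engaged."""
        return max((t.occurred_on for t in self.touchpoints), default=None)

def flag_stop_out_risk(cases, today, max_gap_days=21):
    """Return IDs of students whose last touchpoint is older than the allowed gap."""
    at_risk = []
    for case in cases:
        last = case.last_contact()
        if last is None or (today - last) > timedelta(days=max_gap_days):
            at_risk.append(case.student_id)
    return at_risk

# Example: a student last seen five weeks ago would be flagged for proactive outreach.
cases = [StudentCase("A0001", [Touchpoint("intake", date(2023, 1, 10))])]
print(flag_stop_out_risk(cases, today=date(2023, 2, 20)))   # ['A0001']
```

In practice, an SSC would review flagged cases against the workflow chart to determine which hand-off between services broke down and follow up before the student stops out.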

Predictive Analytics

The beginnings of predictive analytics have their roots in computer science, where early explorations were through data mining, online learning, and community network analysis (Siemens, 2013). Higher education began using data mining much as product marketers did, using data to anticipate consumer behavior and identify market needs, with the idea of using data to inform decision-making about organizational needs and performance (Campbell et al., 2007; Siemens, 2013). The use of data mining and online learning evolved into learning management systems (LMS) and sophisticated data collection methods, with algorithms created to predict student outcomes and inform decision-making about student interventions (Hall et al., 2021). The utilization of predictive analytic systems has increased in higher education, primarily in student development, advising, and enrollment management. Purdue University's Signals system was an early use of predictive analytics; it predicted with 87 percent accuracy whether students would pass a course with a grade of C or better (Campbell et al., 2007). Notre Dame's engineering program also utilizes predictive analytics to predict retention with 80% to 90% accuracy. Student success coaches have benefitted from using predictive analytic systems like AVISO and EAB-Navigate to identify students needing intervention based on various student performance predictors, with some success (Hall et al., 2021).
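For illustration only (this is not Purdue's Signals, AVISO, or EAB-Navigate, and the feature names and data below are hypothetical), a pass/fail prediction of the kind described above is commonly framed as a classification problem over LMS activity and student records:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical training data: prior LMS activity and records features, with a label
# for whether the student earned a C or better in the course.
df = pd.DataFrame({
    "lms_logins_per_week": [1, 5, 0, 7, 3, 6, 2, 8],
    "assignments_submitted_pct": [40, 90, 10, 95, 60, 85, 35, 100],
    "prior_gpa": [1.8, 3.2, 1.2, 3.6, 2.4, 3.0, 2.0, 3.8],
    "earned_c_or_better": [0, 1, 0, 1, 0, 1, 0, 1],
})

X = df.drop(columns="earned_c_or_better")
y = df["earned_c_or_better"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# A simple logistic regression stands in for the proprietary models vendors use.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Scores like these could feed an early alert system so SSCs can prioritize outreach.
```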

Professional Development

Professional development (PD) is another critical component of the PCF, helping SSCs identify potential opportunities for improvement and enhance their skills and knowledge. SSCs need to utilize the professional development opportunities offered by their college, the state college system, and other in-state or out-of-state organizations that provide professional development for higher education employees. At a North Carolina Community College, employees have a metric in the performance review evaluation requiring at least 5 hours of PD per year. The North Carolina Community College's SSC evaluation rubric has a metric for monthly PD that focuses on skills pertinent to the SSC role, and the author encourages SSCs to engage in monthly in-person or virtual PD as part of that rubric. In the wake of the COVID-19 pandemic, learning and development departments within organizations will need to move to a hybrid training model to meet the professional development needs of employees (Mikolajczyk, 2021). Fenwick (2003) describes the struggle between organizations controlling professional development and employees' desire for autonomy to determine their own course of professional development. Blau et al. (2008) suggest that employees continue their professional development to stay relevant in the job market. In addition, employees invest in themselves when they feel their organization is not sufficiently committed to their further development (Blau et al., 2008). Patton (2017) stated that internal mentorship programs, pairing new employees with organizational leaders, offer a valuable professional development opportunity. Fenwick (2003) found that professional growth plans effectively increase employee obligation, professional learning, and a sense of self-affirmation. Dachner et al. (2021) suggest that employee-driven development is the new trend in human capital development practices. Conversations about professional development opportunities can be facilitated during performance reviews or employee-to-manager meetings.

One-On-One Meetings (O3s)

O3s are an essential component of the PCF, providing opportunities to improve morale, build better teamwork, enhance employee relationships, communicate effectively with direct reports, and resolve conflicts. Rogelberg et al. (2006) define a meeting as a gathering of more than one individual, on or off the job site, to discuss work-related matters. The purpose of O3s is to facilitate communication, give and receive direct feedback on performance, coach direct reports on areas of improvement, motivate employees, build relationships with direct reports, and, as a supervisor, ask employees for feedback about one's own performance as a supervisor (Schumacher, 2018). According to Flinchum et al. (2022), 1:1 meetings may occur between an employee and a client or between peers, with the most common form being between a manager and a direct report, used primarily for performance appraisals and, in some instances, for disciplinary actions. Research shows that nearly 47% of all meetings are 1:1, and the lack of a 1:1 meeting could be a missed opportunity for managers with direct reports (Flinchum et al., 2022). According to Rogelberg et al. (2007), executive leadership spends about 23 hours weekly in meetings.

The average manager spends more time preparing to lead meetings than on any other job-related function (Allen et al., 2015). Recent research at Microsoft and Cisco showed that employees with structured and frequent 1:1 meetings were more engaged than their counterparts (Flinchum et al., 2022). More research is needed on 1:1 meetings, especially those between managers and direct reports, to guide promising practices as the use of 1:1 meetings increases within organizations (Flinchum et al., 2022). The distinguishing feature of a 1:1 meeting is the communicative commitment to staying present and interacting, compared with the potential mental disconnection and lack of presence in group meetings (Poole & Billingsley, 1989).

Determining Key Performance Indicators for Data Collection and Analysis

For-profit businesses, especially those linked to social media and with an online presence, monitor KPIs to determine the success of their businesses. Though using KPIs is still a relatively new concept in higher education, institutions are businesses and could find that KPIs help them reach their goals. Measuring KPIs could serve an organization in performance improvement and new strategy implementation. Determining whether SSCs or students are successful outside of completion and persistence rates could be challenging. Grave (2019) challenged the use of standard quantitative metrics alone as an evaluative process, suggesting the combination of secondary data to determine success. Using varied measures could better capture the effectiveness of the processes that drive success. What, then, are key performance indicators (KPIs)? A KPI quantifies work that could otherwise be described only qualitatively and can be utilized to determine how an SSC is performing in the role and how successful a student is in their coursework (Qlik Tech International AB., 2020). Some KPIs could include average GPA, hours attempted versus hours completed, completion and persistence rates, the average number of terms students engage with a student success coach, SAP completion percentage, and cumulative GPAs of students engaged with a success coach.

The two types of KPIs are lagging indicators and leading indicators. Lagging indicators show long-term results (Qlik Tech International AB., 2020). In the PCF, a lagging indicator could be the average GPA of students in a particular program pathway from the fall to the spring semester (Qlik Tech International AB., 2020). Lagging indicators show how we did. Leading indicators capture data influencing outcomes (Qlik Tech International AB., 2020). A leading indicator could be determining the success of a student activities event by the number of students who attended the event. Leading indicators show how we are doing (Qlik Tech International AB., 2020). Both indicators work together in organizational goal achievement.

In the author's role at a North Carolina Community College as the Director of Student Enrichment and Engagement, there was an intentionality in collecting quantitative data to determine student outcomes. Working with institutional research (IR) at the North Carolina Community College provided the support needed to determine what to measure. IR helped to determine the focus on additional academic indicators (student outcomes) like average GPA, hours attempted versus hours completed, the average number of terms students engage with an SSC, SAP completion percentage, and cumulative GPAs of students engaged with an SSC. The author facilitated a monthly data extraction from CareConnect, exported as an Excel spreadsheet containing every student's identification number associated with an SSC intervention that month. IR then imported the Excel spreadsheet into Informer 5 (data analysis software), and the author requested KPIs on a pivot table for further analysis and potential implementation.
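A minimal sketch of what such a monthly roll-up could look like is shown below, using pandas in place of Informer 5; the column names are hypothetical and are not the actual CareConnect export fields.

```python
import pandas as pd

# Hypothetical monthly export: one row per student engaged by an SSC that month.
# The real CareConnect columns and the Informer 5 workflow will differ.
export = pd.DataFrame({
    "student_id": ["A01", "A02", "A03", "A04"],
    "program_pathway": ["Nursing", "Welding", "Nursing", "IT"],
    "term_gpa": [2.1, 1.6, 3.0, 2.4],
    "credits_attempted": [12, 9, 15, 12],
    "credits_completed": [9, 3, 15, 12],
    "sap_met": [True, False, True, True],
    "terms_with_ssc": [2, 4, 1, 3],
})

# Pivot-table-style KPI summary by program pathway.
kpis = export.groupby("program_pathway").agg(
    avg_gpa=("term_gpa", "mean"),
    credits_attempted=("credits_attempted", "sum"),
    credits_completed=("credits_completed", "sum"),
    sap_completion_rate=("sap_met", "mean"),
    avg_terms_with_ssc=("terms_with_ssc", "mean"),
)
kpis["course_completion_rate"] = kpis["credits_completed"] / kpis["credits_attempted"]
print(kpis.round(2))
```

A summary like this gives an SSC supervisor the same KPIs named above (average GPA, hours attempted versus completed, SAP completion, terms of engagement) for monthly review.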

As more institutions focus on data-driven decision-making, it is essential to choose KPIs based on the needs of an organization, which could increase data literacy across an organization and promote goal achievement (Qlik Tech International AB., 2020). If KPIs no longer meet an organization's needs, it may be necessary to adjust what is being measured (Qlik Tech International AB., 2020). When KPIs are understood, used effectively, and selected based on organizational needs, they could support leaders in making informed decisions and could be an effective tool (Qlik Tech International AB., 2020). Data collection and analysis are essential components of the PCF, which help identify performance improvement opportunities.

Performance Improvement Process

The performance reporting process is a critical component of the PCF, using a scoring rubric to evaluate SSC performance and student performance based on KPIs such as average GPA, completion rate, and persistence rate. The rubric emphasizes the importance of equitable and consistent student service delivery rather than simply achieving high scores. One-on-one meetings, case management, key performance indicators, and data collection and analysis support SSCs in identifying areas for improvement and strategies for implementation. Kristie (2009) states that success is challenged when new knowledge requires a change in practice, and that good execution is no guarantee of success.

After identifying opportunities for process improvement, the next steps typically involve developing and implementing an improvement plan. Some suggestions might be:

1. Prioritize the Opportunities: Review and prioritize opportunities based on impact, implementation effort, and resources. Focus on the opportunities that have the most significant impact; for example, the author prioritized the Early Alert System because of its advantage to SSCs.

2. Develop a Plan: Based on the prioritization, create an improvement plan that includes specific objectives, timelines, and measures of success. Always consider the available resources, personnel, and budget.

3. Implement the Plan: Use a pilot approach where possible. Communicate changes to stakeholders and employees, as their support is needed.

4. Monitor and Evaluate: Monitor the plan regularly to ensure it is working. Work with IR to collect data on the KPIs, and look for trends or issues to address.

5. Adjust the Plan: Keep an eye on the data to ensure the plan continues to meet the organization's needs. Communicate results regularly with all stakeholders to maintain their support and engagement, and adjust the plan as needed.

Following these suggestions could help identify and implement process improvements that support Student Success Coaches, organizational goal achievement, and overall performance.

Conclusion

A Stanford research study showed that students who were engaged by Student Success Coaches were 15 percent more likely to stay in school (Marcus, 2013). Chapman University saw a return on its success coach investment with InsideTrack of a 9 percent increase in retention, a $1.1 million revenue increase, and more actively engaged students, especially with campus resources like tutoring and career counseling (Enrollment Report, 2006). Marcus (2013) stated that a coaching program at Wallace State Community College showed coached students performing eight percentage points higher than uncoached students from the fall to the spring semester. The literature is filled with data supporting the effectiveness of SSCs. The author developed and implemented a performance coaching framework to manage SSCs at a North Carolina Community College. The PCF is a new conceptual framework that could be utilized to guide monthly performance evaluations and determine the effectiveness of SSCs. The goal is to positively impact student outcomes.

The PCF emphasizes the importance of establishing an equitable, consistent student service delivery standard and a method for measuring the performance of Student Success Coaches. The author implemented the PCF in the Fall of 2022. From Fall 2022 to Spring 2023, the data on key performance indicators showed that students engaged by SSCs across program pathways had an average GPA of 1.75, with 11,779 credits attempted versus 8,342 credits completed, for a course completion rate of 71%. The average number of terms a student engaged with an SSC was 3, the average SAP completion rate was 60%, and 8 credentials were completed. Though the average GPA declined from 1.89 to 1.75 during the months of implementation, the course completion rate increased from 59% to 71%, the average number of semesters students engaged with an SSC stayed consistent at 3, the SAP completion rate increased from 50% to 60%, and credentials completed increased from 4 to 8. The KPI goals by the end of Spring 2024 are an average GPA of 2.2, a course completion rate of 80%, and an average SAP completion rate of 70%.

Some key takeaways from this paper are:

• The success coaches' goals should align with the mission and vision of the institution.

• Decide which KPIs to use to determine how Student Success Coaches are performing.

• Discuss data collection and analysis with the Institutional Research Department.

• The use of a scoring rubric for performance reporting promotes accountability, transparency, and continuous improvement.

• Use performance results to identify areas to improve and create a strategy for improvement.

• Review the effectiveness of the performance coaching framework and adjust as necessary.

• The use of a case management process could establish a student service delivery standard.

• Maintain a training, support, and professional development schedule for Student Success Coaches.

• Utilize O3s for collaboration, communication, feedback, and relationship building.

About the Author

Dr. Andre P. Davis Jr. is the Director of Student Enrichment and Engagement at a mid-size North Carolina Community College. Dr. Davis' purview includes Success Coaches, Minority Male Programming, Student Recruitment, Student Activities, Counseling, Accessibility Services, Emergency Financial Assistance Programming, Food Pantry, Student Wellness and Basic Needs, Community Engagement Programming, Veterans Recruitment, and Esports. Dr. Davis was an adjunct instructor in Continuing Education and worked for several years for Federal TRIO programs. He holds a Doctorate in Business Administration, focusing on "Social Impact Management."

References: Available upon request.

Contact Info: [email protected]
