Article

AI-Supported Academic Advising: Exploring ChatGPT’s Current State and Future Potential toward Student Empowerment

School of Education, Queens College and the Graduate Center of the City University of New York, New York, NY 10016, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(9), 885; https://doi.org/10.3390/educsci13090885
Submission received: 27 July 2023 / Revised: 24 August 2023 / Accepted: 29 August 2023 / Published: 31 August 2023
(This article belongs to the Section Higher Education)

Abstract:
Artificial intelligence (AI), once a phenomenon primarily in the world of science fiction, has evolved rapidly in recent years, steadily infiltrating our daily lives. ChatGPT, a freely accessible AI-powered large language model designed to generate human-like text responses to users, has been utilized in several areas, such as the healthcare industry, to facilitate interactive dissemination of information and decision-making. Academic advising has been essential in promoting success among university students, particularly those from disadvantaged backgrounds. However, student advising has been marred with problems, with the availability and accessibility of adequate advising among the hurdles. The current study explores how AI-powered tools like ChatGPT might serve to make academic advising more accessible, efficient, or effective. The authors compiled a list of questions frequently asked by current and prospective students in a teacher education bachelor’s degree program in the United States. The questions were then typed into the free version of ChatGPT, and the answers generated were evaluated for their content and delivery. ChatGPT generated surprisingly high-quality answers, written in an authoritative yet supportive tone, and it was particularly adept at addressing general and open-ended career-related questions, such as those about career outlook, in a clear, comprehensive, and supportive manner using plain language. We argue that AI-powered tools, such as ChatGPT, may complement but not necessarily replace human academic advisers and that these tools may well serve to promote educational equity by empowering individuals from a wide range of backgrounds with the means to initiate effective methods of seeking academic advice.

1. Introduction

The advent of artificial intelligence (AI) has transformed how humans obtain information, solve problems, and interact with others, humans or machines alike, transcending conventional boundaries to promote seamless and dynamic exchanges of information and problem solving. Amidst this revolutionary progress, ChatGPT [1] has emerged as a remarkable large language model [2]. Developed by OpenAI, ChatGPT represents a leading-edge application of deep learning, leveraging the advanced GPT-3.5 or GPT-4 architecture (depending on the version in use) to produce human-like conversational responses [2]. Given its potential for teaching and learning, scholars and practitioners in the field of education have been showing much interest, albeit not always favorable, in ChatGPT and its applications, particularly in higher education [3]. This article examines the multifaceted world of generative AI models, ChatGPT to be precise, exploring their potential roles in enhancing student advising, which is frequently cited as among the key areas for improvement in higher education in the United States [4,5]. By analyzing the potential impact and challenges of integrating ChatGPT into academic advising, this exploratory research aims to shed light on the profound implications of this and other easily accessible large language models, signaling great opportunities for a transformative shift in the landscape of student-centered support services.
Incidentally, the healthcare realm, including the mental-healthcare community, has been particularly proactive with the idea of integrating AI technology, most notably ChatGPT, into various aspects of its practice, and AI integration has been gaining much attention for its potential in, among other areas, (a) diagnostic utilities and clinical decision-making [6,7], (b) medical education and training [8], (c) improved efficacy and efficiency in therapy [9], and (d) providing patients and the general public with advice and counseling [10]. Results from the 2023 Tebra Survey [11] show, for example, that more than one in ten healthcare professionals in the United States already utilize AI technology in their practice, with another 50% responding that they intend to adopt AI technology in the future. In addition, the same survey revealed that eight out of ten American adults surveyed believe AI will improve healthcare quality, increase healthcare access, and contribute to reduced healthcare costs. Perhaps most notably (and, for some, even disturbingly), one in four American adults surveyed reported that they would be more likely to “talk to” AI than seek in-person consultation when seeking medical advice. This finding is particularly ironic because many healthcare providers surveyed in the same report voiced their profound concern about using generative AI models, such as ChatGPT, under the assumption that such tools would deprive patients of the human interaction these patients need and value. In other words, while the providers may consider human interaction to be an invaluable part of providing healthcare consultations—and it may very well be the case that in-person consultations are more effective than those provided through AI—it is also important to consider that the general public does not necessarily appear to perceive human interactions to be as critically important as healthcare providers might believe.
Research has recently emerged specifically investigating the utility of AI-based large language models in medical consultations. Xie and colleagues [10], for instance, examined the efficacy of the advice provided by ChatGPT on rhinoplasty. In their study, they posed to ChatGPT nine essential questions about rhinoplasty, as published by the American Society of Plastic Surgeons, and evaluated the quality of the responses obtained. Overall, the researchers characterized the ChatGPT-provided advice as easy to understand, coherent, and generally accurate—including information on the risks and outcomes, which, according to the authors, should promote realistic expectations for the procedure. At the same time, ChatGPT cannot provide individualized advice (e.g., how a specific patient’s nose might look after the procedure), which would signal that, in the present version of ChatGPT, human-provided counseling would still have a critical role in addition to the information prospective patients may gather through ChatGPT. Similar efforts have been made in mental health care as well, exploring the opportunities and challenges ChatGPT may bring to psychological counseling and other aspects of mental health [12]. The legal community has also been actively probing into the appropriate and ethical integration of ChatGPT and other AI-powered tools into the teaching and practice of law [13,14]. However, the legal profession has been more cautious, and AI integration has been highly controversial and, thus, is far from being widely accepted in the legal field [15,16].
How might these efforts in areas outside of the field of education inform us with regard to academic support for current and future university students, to be specific? AI’s roles in educational settings have gained remarkable attention in higher education since ChatGPT’s release in November of 2022, but the excitement has been somewhat skewed in focus. While scholars and practitioners alike widely appear to believe that ChatGPT and other AI tools have the potential to fundamentally change the landscape of higher education [17,18], the primary foci have revolved around two distinct areas of discussion: (a) the concerns around plagiarism and other aspects of academic integrity [3,19,20,21] and, to a lesser degree, (b) pedagogical methods to promote teaching and learning, particularly with academic writing and medical education [8,22,23,24]. In other words, although there appears to be a widely shared perception that AI will be a “game changer” in our society and prompt us to reinvent how humans handle information in the most overarching manner, thus far, research on ChatGPT and other AI-supported tools in higher education has not explored their impact outside of a limited set of contexts, often approaching them mainly as a threat rather than a potential asset [17,20]. Perhaps as a result, movements to ban—or put a moratorium on—ChatGPT and other AI-powered tools have been emerging in educational institutions around the globe [2], which may inadvertently stunt the integration of a tool that could potentially be of great utility in higher education. As Zawacki-Richter and colleagues have posited [18], we must ensure that educators actively and proactively participate in conversations surrounding AI integration into the field of education to ensure that balanced perspectives will be represented as generative AI models continue to proliferate in and outside of educational settings.
Academic advising has been identified as critical to student success, particularly for those from disadvantaged backgrounds, yet institutions of higher education are often unable to provide adequate student advising, primarily because hiring full-time academic advisers is highly resource intensive, and faculty advisers, with their teaching, research, mentoring of honors and graduate students, and other duties, may not consistently be able to devote their time or attention to general academic advising [4,5,25]. High-quality academic advising in higher education has also been linked to student satisfaction and loyalty to the university [26], which should enhance student retention. It is therefore not surprising that various aspects of AI-powered tools have begun to be incorporated into some high-cost academic advising platforms, such as EAB [27]; however, the provision of accessible and effective academic advising continues to pose challenges [5]. For one, such platforms typically come with steep institutional subscription price tags, and they are not open to the public, which severely limits access. Among the numerous challenges university students face in seeking academic advising, the key concerns identified by researchers and practitioners are (a) limited availability of advisers and difficulty scheduling appointments and (b) inconsistencies in the fidelity of advice provided, which frequently lead to inaccurate or insufficient information from advisers [4,28,29]. Based on an exhaustive database search of ERIC, PsycINFO, SocINDEX, and ScienceDirect at the time of submission of this manuscript, no published research to date appears to have examined the potential roles of freely available generative AI models, such as ChatGPT, in academic advising or, in fact, the larger context of student services.
A recently published rapid literature review on the topic of the impact of generative AI tools on education [30] corroborates this observation—that there is a conspicuous gap in the literature.
That we found no empirical research on this topic is perhaps not surprising, as most of the conversations surrounding AI integration into the field of education, as discussed earlier, have been contained within narrow ranges of issues, mainly pertaining to plagiarism and instruction [3,19,21]. The current study, by contrast, aims to expand its focus and approach generative AI integration from a different perspective, centered around the following question: how effective, if at all, would generative AI tools, such as ChatGPT, be in addressing these student-service-related concerns? The current exploratory study aims to focus on the application of ChatGPT in the context of academic advising, following the published exploratory research in other professional fields described earlier [10,12,13].
It is perhaps worth reiterating here that the intent of the current exploration is not necessarily to evaluate a particular tool, ChatGPT, per se; rather, the inspiration here is far more inclusive. Specifically, our motive is to examine the current state and the future possibilities of generative AI models, such as ChatGPT, in a context that does not appear to have been well examined—student advising, especially given that student advising continues to pose challenges in university students’ educational and professional pursuits in the United States, most often involving students of color and other underrepresented demographic groups [31,32]. Why, then, are we singling out ChatGPT in discussing academic advising vis-à-vis AI-powered tools? First, while other AI-powered large language models, such as Google’s Bard and Microsoft’s Sydney, continue to be released [33], ChatGPT has by far been the most widely adopted of all public-facing AI-powered large language models, and it is freely and widely accessible for individuals with access to digital devices and the internet [20]. Although a paid version of ChatGPT is available, the free version, which limits the number of questions one can ask each day, has been utilized and examined in the healthcare community with great success [10], among other contexts, as discussed above. Second, AI-powered tools, such as ChatGPT, are available 24 hours a day, seven days a week. Most higher education institutions do not offer human-based advising around the clock, so this alone presents a profound advantage over human-provided advising. Given that ChatGPT is free of charge, its active utilization may also promote educational equity, particularly for prospective and current students with limited access to resources, including personal networks or paid consultants, to address education- and career-related questions.
ChatGPT also permits students to ask questions in their natural language, and it responds in similar, everyday language. Given that some studies have suggested that academic language presents a barrier to education, which functions to maintain educational inequity [34], this is a highly desirable feature. Similarly, practically no technical expertise or knowledge of a platform-specific user interface is required to use ChatGPT. Taken together, these features give the incorporation of ChatGPT into academic advising the potential to empower all students equitably through its exceptional accessibility.
Before we can pursue this possibility further, though, it would be critical to ask the following: how would ChatGPT fare in terms of the quality of the advice it gives? In this exploratory report, we will closely examine the advice provided by ChatGPT’s free version on (a) general questions about the elementary school teaching profession in a specific region in the United States and (b) questions that are specific to a particular degree program in elementary education at an institution of higher education.

2. Materials and Methods

Overview. The authors serve as faculty advisers in a teacher education department at an institution of higher education, where elementary school teaching is the most popular of its offerings, in a large metropolitan city in the mid-Atlantic region of the United States. In the exploratory study presented here, we followed the research method utilized by Xie and colleagues [10] and aimed to probe into the quality—such as accuracy, comprehensiveness, timeliness, and supportive tone—of the advice ChatGPT’s free version, based on GPT-3.5, provides in response to questions pertaining to (a) a career in elementary school teaching in the State of New York in the United States and (b) program-specific requirements and other information at the university where the authors teach, advise, and conduct research. Narrative analyses are presented for the answers generated by ChatGPT, with the intent to use them in prompting more rigorous investigations on the topic of generative AI-integrated academic advising.
Selection of questions to be explored in the current study. Based on the advising record between 2019 and 2023, the authors compiled a list of the most frequently asked questions by prospective or current students. Then, questions were removed if they pertained primarily to (a) highly individualized circumstances (e.g., “I got a ‘C-minus’ in Introductory Biology. Should I repeat the course for a better grade?” or “What courses should I take next semester?”), (b) recent and temporary circumstances (e.g., “How will the COVID-19 pandemic impact the hiring of elementary school teachers?” or “Do teachers still need to wear face masks while teaching?”), or (c) simple and factual questions whose answers are publicized prominently (e.g., “How much is the tuition?” or “What is the starting salary for public elementary school teachers in New York City?”). The main reasons for noninclusion of these questions, respectively, were (a) ChatGPT, at this time, has not been designed to assess each student’s circumstances or profiles, as it does not have access to the relevant information, (b) ChatGPT’s answers, as a rule, will not currently include information that became available after 2021, and (c) very simple factual questions do not require an AI tool, as the answers are widely available in a straightforward fashion.
In the end, seven questions were selected for the current exploratory study. Of those, five pertain to a career in elementary school teaching in the State of New York (NYS) in general terms, and the remaining two are specific to our teacher education program. Table 1 lists the questions included in the current exploratory effort.
Procedure. ChatGPT utilizes algorithms that generate varied responses through random sampling, whereby answers to the same question may vary [35]. As such, for the current purpose, the first answer to each question was analyzed for its quality, following the method utilized by previously published research in the medical field [10]. The authors typed these questions sequentially into ChatGPT’s free account, powered by GPT-3.5, and each answer was recorded. Then, each answer obtained was analyzed for its quality—such as accuracy, comprehensiveness, timeliness, and overall tone, as described above. Since this is an exploratory study with no previous data available to the best of the authors’ knowledge, the decision was made early on not to quantify the findings. However, the authors keenly recognize the critical relevance of such studies and are in the process of pursuing additional research along the current theme.
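For readers wishing to replicate or extend this first-response protocol at scale, the workflow can be sketched programmatically. The following is a minimal illustration only, not part of the original study: the names `AdvisingRecord` and `collect_first_responses` are hypothetical, and the commented-out OpenAI-backed `ask` function is one assumed way to supply model responses (it requires the `openai` package and an API key).

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class AdvisingRecord:
    """One advising question and the first answer recorded for it."""
    question: str
    answer: str

def collect_first_responses(questions: List[str],
                            ask: Callable[[str], str]) -> List[AdvisingRecord]:
    """Submit each question once, in order, and record the first answer
    verbatim, mirroring the first-response procedure described above.

    `ask` is any function that sends a prompt to a language model and
    returns its reply; injecting it keeps the protocol testable offline.
    """
    return [AdvisingRecord(question=q, answer=ask(q)) for q in questions]

# A hypothetical OpenAI-backed `ask` (an assumption, not the authors' method):
#
# from openai import OpenAI
# client = OpenAI()  # reads OPENAI_API_KEY from the environment
# def ask(prompt: str) -> str:
#     resp = client.chat.completions.create(
#         model="gpt-3.5-turbo",
#         messages=[{"role": "user", "content": prompt}],
#     )
#     return resp.choices[0].message.content
```

Because the response-generating function is passed in rather than hard-coded, the same loop can be pointed at any model, and the recorded question–answer pairs can then be rated on the quality dimensions used in this study (accuracy, comprehensiveness, timeliness, and tone).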

3. Findings

Generally, the quality of the responses provided by ChatGPT far exceeded our expectations, with carefully crafted emphases periodically placed on encouraging the inquiring individual to consult other specific resources. In fact, for some of the questions, we found that the answers ChatGPT provided were of higher quality than what many human advisers might be able to provide, as described in more detail below. In addition, the tone of ChatGPT’s answers was decidedly professional yet simple and supportive. Figure 1 presents the responses to the first question: what are the steps involved in earning certification as an elementary school teacher in New York State?
Our assessment is that this answer is highly comprehensive, concisely written in simple language, and surprisingly accurate and current. Still, there are two notable areas for improvement regarding the accuracy of the information. First, strictly speaking, in order to be fully certified as an elementary school teacher in NYS, a master’s degree is required, although a bachelor’s degree is sufficient for the initial, temporary certification, which expires in five years if one fails to obtain a master’s degree during that period. Second, while it is factually accurate to state that enrolling in a teacher preparation program, as outlined in Step 2 of the answer, is the most common and orthodox option toward obtaining teacher certification in elementary school teaching, it is not the only way—several alternative pathways are available.
As such, the answer to Question 1 can perhaps be best characterized as “good but incomplete”, failing to provide information on some potentially useful nuances associated with NYS teacher certification requirements. Despite these shortcomings, it is impressive that the answer articulates a little-known yet profoundly important fact: one does not have to major in elementary education to become certified as an elementary school teacher in NYS, as any university major would be acceptable as long as the degree program is completed with teacher certification components—such as fieldwork and well-balanced coursework in the liberal arts. As for the timeliness of the information, Step 5 of the answer returned mixed advice. Specifically, it reflects the recent elimination of the edTPA examination (edTPA was suspended in 2020 during the COVID-19 pandemic; in 2022, NYS officially eliminated edTPA from its teacher certification requirements). However, it erroneously includes the Academic Literacy Skills Test (ALST), which was eliminated in 2017—six years prior to this study. This error was unexpected since ChatGPT presumably composes its answers based on information available up to 2021, yet the 2017 elimination of the test was somehow omitted from consideration. Despite these issues, overall, the answer here provides a comprehensive overview of the general set of steps teacher candidates may wish to know, and the final disclaimer—that specific requirements may vary and, thus, it is important to consult other resources—is both appropriate and necessary.
For human academic advisers to generate answers of this caliber, it would have been necessary to have (a) consulted multiple resources and, more importantly, (b) prepared outlines or guidelines to ensure proper coverage during consultations. With this much factual information across a wide range of domains to be conveyed in sufficient depth to students, the quality of advice provided by human advisers is bound to vary. Thus, it is our conviction that, complemented with proper fact-checking and correction, the answer ChatGPT produced here may provide (1) an excellent starting point for the inquiring students and (2) a “template” upon which human advisers may base their advice to current and prospective advisees. Also, when calibrated properly, these answers could readily be posted on the institution’s website to enhance information dissemination efforts. Overall, we are highly impressed with the quality of the answer to Question 1, although fact-checking would be critical in ensuring that students receive complete information.
The second question asked about the disadvantages of elementary school teaching as a profession. Figure 2 displays the answer generated by ChatGPT.
The answer to Question 2 is surprisingly balanced, and we find it to be exemplary in both content and tone, as it emphasizes that people typically find the career in question immensely fulfilling while presenting some of the common concerns factually in natural language. As with Question 1, it would be difficult for human academic advisers during advising sessions to provide an answer close to the comprehensive details presented in this answer, ranging in focus from concrete (e.g., salary) to abstract (e.g., emotional demands and work–life balance). It is so impressively comprehensive that we, as faculty advisers, cannot think of additional dimensions that would improve this answer substantively or meaningfully. The ChatGPT-generated answer here, thus, might perhaps be superior to what we can provide in person without faithfully following a template or a manual. It is particularly impressive that the answer even included potential ways through which some of the challenges may be remedied (e.g., seeking support from colleagues, the union, and other teacher advocacy groups), as we as faculty advisers had not thought of providing such information in guiding current and prospective teacher candidates. This answer, thus, would greatly benefit students who are looking for advice on elementary school teaching as a career.
The superb fidelity of this answer is perhaps counterintuitive because this question targets the “softer”, more open-ended human aspects of elementary teaching as a profession rather than just factual information [36]. One might have presumed that humans would be more adept at addressing these open-ended questions and, by contrast, that an AI tool, such as ChatGPT, would struggle to generate nuanced, sensitive, and human-centric answers. Based on the answer here, however, this presupposition is clearly unwarranted. As with Question 1, the ChatGPT-generated answer here would serve as an excellent example for human advisers to follow in content and delivery, and posting this answer on the institutional website may serve to ensure increased access to quality information for our current and prospective students.
The third question revolved around what one may be able to expect in a career in elementary school teaching. Figure 3 presents the ChatGPT-generated answer.
In advising current and prospective students on elementary school teaching as a profession, the authors tend to focus on such aspects as professional satisfaction stemming primarily from making a difference in the lives of children, the possibility of expanding one’s career focus into school administration, work–life balance, and stability. As was the case with Question 2, by contrast, ChatGPT generated a high-quality answer that is comprehensive, realistic, and clear, encompassing an impressively wide range of dimensions—informing students about the expertise they will develop through their careers and the collaborative work culture, all the way to retirement. As with Question 2, this question mainly involved the human aspects of the profession, and the AI-generated answer provided here is exemplary in its content, coverage, and delivery, presenting an array of factors for the inquiring students to consider.
Human academic advisers would perhaps be hard pressed to provide an answer of this caliber and, as was the case with the two answers we discussed earlier, this answer, too, can adeptly serve as a model for human advisers to emulate or for inclusion in the appropriate sections of the institutional website. We also find the language and tone to be exceedingly supportive, particularly in the final paragraph, in which a balanced summary is provided, emphasizing the critical aspects, such as the betterment of society through one’s professional growth and contributions, which would serve to encourage current and prospective teacher candidates to pursue a meaningful career in elementary school teaching.
The fourth question asked about the general career outlook, and Figure 4 presents the answer ChatGPT generated.
Like Questions 2 and 3, this question asked about the larger landscape of the profession, involving layers of considerations beyond a simple set of concrete facts. Thus, as was the case for Questions 2 and 3, the answer to Question 4 can be considered exemplary in that it presents a balanced overview, prompts the inquiring individual to consider a wide range of factors with caution, and provides tips on how to optimize one’s career outlook as well. In particular, it is remarkable that the answer contains (a) the highly impactful and nuanced influence the COVID-19 pandemic has had on the teaching profession and (b) an added emphasis on staying informed through specific methods, such as closely monitoring local education news, so as to optimize one’s professional prospects.
As with the ChatGPT-generated answers we have considered thus far, it would perhaps be difficult for human academic advisers to provide answers that match the quality and coverage of the information presented in this answer. Accessing this answer, thus, would be highly informative for individuals who are interested in entering the field of elementary school teaching. Based on our advising experience and knowledge of the profession, using this answer as a template for our human advisers to follow, or including it on the institutional website, would be an effective method of advising our current and prospective teacher candidates.

The fifth question concerns a prospective student’s aptitude to become a good teacher. Figure 5 presents the answer generated by ChatGPT.
While it is widely acknowledged that ChatGPT fails to consider individual qualifications and characteristics [10], it is impressive that this answer was prefaced by a disclaimer that ChatGPT does not have access to the inquiring person’s “…personal information or real-time data, so [it] can’t assess [his/her] individual qualifications or suitability for becoming an elementary teacher…” This response replicates what human academic advisers must do routinely to remind each student that, to answer this question meaningfully, we must have an entire profile of each prospective student, and thus, there are no unequivocal answers we can give during advising sessions. Instead, we prompt them to reflect actively and examine their qualifications, aptitudes, value priorities, and so on, much like what the AI-generated answer here did.
It is also curious that this answer referred, in three different places, to early childhood teachers or education alongside elementary teachers or education. In NYS, elementary schools often include kindergarten, and some elementary schools even contain pre-kindergarten. Since kindergarten and pre-kindergarten teacher certification falls under the umbrella of early childhood education rather than elementary education, including early childhood education alongside elementary education is appropriate. At the same time, however, it is curious that this feature is not present in the other answers—except that the answer to Question 1 included a reference to early childhood education as a related field for elementary teacher certification. We attribute this to the fact that ChatGPT, like other AI-based large language models, is still very much under development, and thus, the answers it provides, too, are to be considered works in progress [37]. Alternatively, it could be that the large language model determined the inclusion of early childhood education to be less relevant for the earlier questions on career prospects and career outlook.
Most importantly, from our perspective, ChatGPT does an admirable job of prompting the inquiring individual to consider a range of options in reaching his/her own conclusions. While human academic advisers, especially with proper access to academic transcripts and other relevant records, may be able to provide more individualized answers, it is ultimately up to prospective applicants or current teacher candidates to determine how suitable a career in elementary school teaching may be for them. The considerations introduced by ChatGPT range from concrete (e.g., whether one meets the certification requirements) to abstract (e.g., how passionate one is about the profession). Given this nuanced context, the answer here provides an effective means to comprehensively scaffold each teacher candidate’s self-evaluation of whether elementary teaching is a suitable career. Again, this would be considered an exemplary answer, perfectly suitable to be shared with current and future students interested in elementary school teaching.
Although the five questions examined thus far involved elementary school teaching as a profession, the last two were specific to the program at the institution of higher education where this study took place. Specifically, Questions 6 and 7 asked about the entry requirements for the bachelor’s degree program in elementary school teaching and the requirements for completing it, and the answers provided are shown in Figure 6 and Figure 7 below, respectively.
Unlike the largely exemplary answers ChatGPT produced for Questions 1 through 5 surrounding elementary school teaching as a profession in the State of New York in the United States, in response to institution- and program-specific questions, ChatGPT ostensibly avoided providing too many details and, instead, focused on giving representative scenarios, as well as suggestions on seeking further information. Consequently, specific program requirements, such as the minimum GPA to enter the program or the number of credits required toward program completion, were not included. One may consider these omissions noteworthy shortcomings, although such information is readily available through multiple resources, including the institution’s website, program literature, and the institution’s academic bulletin. Instead, the answers here focused on presenting typical sets of requirements for admission and graduation, with repeated reminders that inquiring individuals should consult the institution’s website and the appropriate academic departments, prompting the student to seek help directly from the institution. This approach has both advantages and disadvantages, as follows.
Namely, while it technically “failed” to present the program-specific requirements concretely, one may argue that it empowered the inquiring individual by providing (a) a general overview of what such requirements typically entail, which may serve to promote understanding of common admissions and program completion requirements, and (b) a conspicuous prompt, in the last paragraph of each answer, for the information seeker to obtain information from other specific resources, including the institution’s website and human academic advisers, which may serve to alert the inquiring individual that the answer being sought would not be best provided using this AI-powered platform. In our view, this response is appropriate, or even ideal in some ways, especially given that concrete information on degree requirements is readily available through a multitude of materials, as described earlier. From our perspective, then, this approach is consistent with the philosophy, “Give a [person] a fish, and you feed [the person] for a day; teach a [person] to fish, and you feed [the person] for a lifetime”, as the focus is on showing the “ropes” rather than merely giving the concrete facts [38]. This advising strategy should serve to promote effective information-seeking habits among inquiring individuals. In other words, these answers provide specific guidance to enhance the inquiring individual’s investigative skills so that, should similar questions or issues arise in the future, the individual is able to identify appropriate methods of securing sufficient information.
Overall, the AI-powered tool explored here, ChatGPT, provided high-quality, comprehensive, nuanced, reasonable, and sensible answers in supportive and friendly natural language, although (a) there were a few specific factual errors, (b) it was only able to give general “pointers” in addressing more individualized questions, and (c) institution- or program-specific factual information was not provided in detail. Instead, ChatGPT was meticulous in pointing out its shortcomings, such as not always having access to complete or up-to-date information or not being equipped to handle individual-level guidance. In light of these limitations, ChatGPT presents general information and emphasizes that inquiring individuals should refer to specific resources for elaboration and confirmation. In summary, the current exploratory study has demonstrated that ChatGPT is highly adept at providing general professional advice around elementary school teaching, and it can serve to point students in the right direction to obtain further information. Moreover, despite some of the shortcomings noted above, the generally impressive quality, coverage, and tone can serve as a useful guide for human advisers to follow in optimizing the advice we give our current and prospective students. The synergistic collaboration between ChatGPT and human advisers, thus, may represent an ideal method of providing effective advising.

4. Discussion

The current exploratory study has demonstrated the potential utility of generative AI models, such as ChatGPT, in providing academic advising in the context of higher education. It provides what the authors, who are university professors who routinely provide academic advising to current and future students, consider to be high-quality and comprehensive answers regarding general career-related matters. Furthermore, it appears to be perfectly suitable for academic advising around “big-picture” concerns, such as the steps to follow in pursuing a future career in elementary school teaching or what to expect in the profession, which, incidentally, are also among the questions most frequently asked by our current and prospective students. While questions specific to individuals or to program-specific requirements were not, at this time, answered with much precision, ChatGPT still provided guidance in clear and friendly language, complemented with helpful directives on obtaining more individualized and specific information. Thus, ChatGPT engaged the inquiring individuals in meaningful dialogue targeting the questions at hand, perhaps serving as an advising triage to steer the prospective or current student’s efforts to shape his or her own academic and career trajectory. Of course, with the world of AI rapidly evolving, such shortcomings, including accuracy-related concerns and the lack of access to current information, particularly for specific individuals and programs, are likely to diminish over time.
Access to quality academic advising, or the lack thereof, is frequently cited as among the most notable challenges faced disproportionately by university students, especially those from socioeconomically disadvantaged backgrounds, as they navigate the unfamiliar terrain of higher education [4,5,28,31]. Such students may therefore particularly benefit from the integration of AI-powered large language models, such as ChatGPT, into academic advising, not only in obtaining answers on an immediate basis but also in actively engaging in the process of focused inquiry. This is because, as noted earlier, ChatGPT is readily accessible free of charge and permits inquiring individuals to interact using natural language. After “jump-starting” the information-gathering process with AI-based tools, should these students choose to seek additional human-based consultations, they would be able to hold more informed conversations, likely leading to more productive, and more personalized, advising sessions that promote their academic and professional pursuits. In other words, AI tools such as ChatGPT may represent an opportunity for educators to promote educational equity and empower students with powerful information in ways that cater to the needs of each inquiring individual.
At the same time, ChatGPT, as its own advice frequently pointed out, would not replace person-to-person academic advising. As noted in the beginning, that is not the spirit underlying the current inquiry into ChatGPT’s potential role in academic advising, either. Still, ChatGPT and other open-access AI-based tools readily available to the general public deserve further exploration in the student support context. One way of creating an effective ChatGPT–human adviser synergy in academic advising would be to have students develop a habit of utilizing ChatGPT (or a similar tool) as their initial step in getting their questions answered. Then, based on ChatGPT’s answers, they may approach the appropriate resources for further elaboration and/or confirmation, which may or may not involve person-to-person contact. In addition, academic advisers themselves may wish to utilize generative AI tools in preparation for, or even during, advising sessions, keeping in mind that, at this time, fact checking is of critical importance. For example, in response to questions regarding the professional challenges elementary school teachers face, ChatGPT provided an impressive range of balanced considerations, and human advisers would certainly benefit from skimming through the advice ChatGPT provides to ensure proper coverage.
The current exploratory study dealt only with advising-related questions frequently asked by current and prospective university students considering a career in elementary school teaching, and at a single institution of higher education; therefore, the findings obtained here may not generalize to advising questions in other contexts or disciplines. Still, the current findings are sufficiently promising to suggest that large language models may prove to be useful partners in other academic contexts as well, given the strengths of the answers outlined in this report. For those considering the implementation of generative AI tools in academic advising, we offer some guidelines based on our experience during the current exploration.
First, although ChatGPT is generally forgiving of grammatical errors, students must understand how to phrase questions to elicit optimal answers. For example, as seen in the questions asked in the exploratory study reported here, sufficient detail must be included, ranging from the what (e.g., a career in elementary school teaching) to the how (e.g., how to get certified as a teacher; how to gain admission into a teacher certification program) and the where (e.g., in the State of New York), given that ChatGPT, at the present time, is only able to answer questions within the parameters inquiring individuals articulate [2]. Libraries frequently provide training resources to help users adopt the most effective search terms and strategies for library databases [39], and similar resources should prove useful in helping users utilize this potentially powerful tool to the fullest extent. Also, in the authors’ experience using large language models outside of the context of the current study, “tinkering” with various phrasings and gaining familiarity with a model’s behavior have proven productive, an approach corroborated by research in the information sciences [40]. In other words, while these tools may present relatively low barriers to entry, users may still need to develop sufficient proficiency to utilize them fully [24].
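As a minimal illustration of this guideline, the sketch below shows how the what, how, and where details might be assembled into a well-scoped advising question before it is submitted to a chatbot. The helper function is hypothetical and was not part of the study, which used ChatGPT’s free web interface directly; the sketch simply makes the composition of the study’s questions concrete.

```python
def build_advising_prompt(what, how=None, where=None):
    """Assemble a well-scoped advising question from its parts.

    A vague prompt (the "what" alone) invites a generic answer;
    adding the "how" and "where" narrows the parameters the model
    can work within, as the study's seven questions did
    (e.g., "in New York State").
    """
    parts = [f"I am interested in {what}."]
    if how:
        parts.append(f"Specifically, I would like to know {how}.")
    if where:
        parts.append(f"My context: {where}.")
    return " ".join(parts)


# A vague version versus a well-scoped version resembling Question 1:
vague = build_advising_prompt("becoming a teacher")
scoped = build_advising_prompt(
    what="a career in elementary school teaching",
    how="the steps involved in earning teacher certification",
    where="the State of New York",
)
```

The same decomposition can guide students working purely in natural language: before typing a question, identify its what, how, and where, and state each explicitly.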
Second, while ChatGPT articulates this relatively frequently in its answers, it is important to alert inquiring students to the fact that ChatGPT, at least for now, remains a tool that, in many cases, provides detailed information of varying accuracy. As such, follow-up inquiries probing for more accurate or current information are both useful and necessary. Since ChatGPT, as it currently stands, may not be a self-standing tool capable of providing complete and accurate answers, inquiring students should be reminded of the importance of developing a habit of following up with subsequent information seeking, involving humans or otherwise.
Admittedly, the current investigation was exploratory in nature, and it merely marks the beginning of a series of investigative efforts in which researchers must partake. As such, future research should expand on the current findings in several notable ways. First, more rigorous assessment of the quality of the answers ChatGPT provides would be useful. In the study reported here, the authors evaluated the ChatGPT-generated answers based on their knowledge and experience in the field, each having advised students seeking careers in elementary school teaching for over a decade. In subsequent inquiries, we thus aim to evaluate the answers more systematically and objectively by, among other approaches, creating quantitatively measurable assessment criteria for AI-generated answers or having a larger number of evaluators assess the answers independently. Second, it would also be useful to assess the quality of academic advice produced on a wider range of questions, so as to establish the potential utility of generative AI large language models in academic advising. Third, as other generative AI models gain prominence, their ability to provide academic advising should be explored as well. Although we used ChatGPT as a reference point, the topic here revolves around AI-based large language models at large, and conscious efforts should be made to situate these discussions within the larger landscape of AI integration into student affairs. Finally, it would be particularly exciting to explore what other dimensions of student support beyond academic advising (e.g., behavioral interventions, disability accommodations, etc.) may be conducive to effective integration of AI-powered tools, in order to foster a climate in which students are supported in a holistic manner.
As discussed thus far, ChatGPT seems to offer features and characteristics that could potentially optimize the experiences of current and prospective university students beyond academics, and education scholars and practitioners should closely examine what AI-powered large language models such as ChatGPT could do to promote equitable access to information and services. Of course, there are information security and other considerations to be weighed in pursuing AI integration [35], which represents yet another reason for education scholars and practitioners to be actively engaged in discussions around integrating AI-powered tools into higher education.
The presence of generative AI tools in and outside of formal education is only likely to become more prevalent over time. If what we have observed through ChatGPT in the current exploratory study is any indication, these tools clearly possess enormous potential to enhance the student support universities can offer; students may use them as a starting point for informed and strategic further investigation, involving humans, nonhumans, or both, into the matters they are looking to resolve. Overall, given that the endeavors to develop and refine generative AI large language models are in full force, ChatGPT and its counterparts are likely to continue to improve dramatically and steadily, and many predict that they will become integral parts of our lives in the near future [33]. Against this sweeping backdrop, we argue that it is not only useful but also necessary for educators to treat AI-based resources, such as ChatGPT, as critical partners in providing holistic academic advising that can serve to empower all university students with multilateral access to high-quality information and support.

Author Contributions

Conceptualization, D.A.; literature review, D.A.; methodology, D.A.; data collection, D.A. and M.C.F.; analysis, D.A. and M.C.F.; writing—original draft, D.A.; writing—review and editing, D.A. and M.C.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Robin Hood Foundation (Grant Number A1N-5A000000IYl2QAA) and additional funding from the City University of New York’s Computing Integrated Teacher Education (CITE) Initiative.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was waived because the data obtained for the purpose of this study were generated automatically by ChatGPT and, thus, did not involve human subjects.

Data Availability Statement

The data for this study, the answers generated by ChatGPT, are included in their entirety in this manuscript.

Acknowledgments

The authors would like to thank the University Dean of Education, Ashleigh Thompson, as well as Sara Vogel and Aankit Patel (the City University of New York Central Office) for their support and guidance in the preparation of this manuscript.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study, in the collection, analyses, or interpretation of data, in the writing of the manuscript, or in the decision to publish the results.

References

  1. ChatGPT. Available online: https://chat.openai.com (accessed on 26 July 2023).
  2. Stojanov, A. Learning with ChatGPT 3.5 as a More Knowledgeable Other: An Autoethnographic Study. Int. J. Educ. Technol. High. Educ. 2023, 20, 35. [Google Scholar] [CrossRef]
  3. Anders, B.A. Is Using ChatGPT Cheating, Plagiarism, Both, Neither, or Forward Thinking? Patterns 2023, 4, 100694. [Google Scholar] [CrossRef] [PubMed]
  4. Chan, Z.C.Y.; Chan, H.Y.; Chow, H.C.J.; Choy, S.N.; Ng, K.Y.; Wong, K.Y.; Yu, P.K. Academic Advising in Undergraduate Education: A Systematic Review. Nurse Educ. Today 2019, 75, 58–74. [Google Scholar] [CrossRef] [PubMed]
  5. Johnson, R.M.; Strayhorn, T.L.; Travers, C.S. Examining the Academic Advising Experiences of Black Males at an Urban University: An Exploratory Case Study. Urban Educ. 2023, 58, 774–800. [Google Scholar] [CrossRef]
  6. Juhi, A.; Pipil, N.; Santra, S.; Mondal, S.; Behera, J.K.; Mondal, H. The Capability of ChatGPT in Predicting and Explaining Common Drug-Drug Interactions. Cureus 2023, 15, e36272. [Google Scholar] [CrossRef]
  7. Kulkarni, P.A.; Singh, H. Artificial Intelligence in Clinical Diagnosis: Opportunities, Challenges, and Hype. JAMA 2023, 330, 317. [Google Scholar] [CrossRef]
  8. Van Der Niet, A.G.; Bleakley, A. Where Medical Education Meets Artificial Intelligence: ‘Does Technology Care?’. Med. Educ. 2021, 55, 30–36. [Google Scholar] [CrossRef]
  9. Kraft, S.A. Centering Patients’ Voices in Artificial Intelligence–Based Telemedicine. Am. J. Public Health 2023, 113, 470–471. [Google Scholar] [CrossRef]
  10. Xie, Y.; Seth, I.; Hunter-Smith, D.J.; Rozen, W.M.; Ross, R.; Lee, M. Aesthetic Surgery Advice and Counseling from Artificial Intelligence: A Rhinoplasty Consultation with ChatGPT. Aesth. Plast. Surg. 2023. [Google Scholar] [CrossRef]
  11. Shryock, T. AI Special Report: What Patients and Doctors Really Think about AI in Health Care. Med. Econ. 2023, 100, 14–15. [Google Scholar]
  12. Meissner, F.; Garza, C. Rapid Construction of an Explanatory Decision Analytical Model of Treating Severe Depression during Pregnancy with SSRI Psychopharmacological Therapy by the Use of ChatGPT. J. Psychosom. Res. 2023, 169, 111286. [Google Scholar] [CrossRef]
  13. Ajevski, M.; Barker, K.; Gilbert, A.; Hardie, L.; Ryan, F. ChatGPT and the Future of Legal Education and Practice. Law Teach. 2023. [Google Scholar] [CrossRef]
  14. Choi, J.H.; Schwarcz, D.B. AI Assistance in Legal Analysis: An Empirical Study. Minnesota Legal Studies Research Paper No. 23-22. Available online: https://ssrn.com/abstract=4539836 (accessed on 13 August 2023). [CrossRef]
  15. Goltz, N.; Dondoli, G. A Note on Science, Legal Research and Artificial Intelligence. Inf. Commun. Technol. Law 2019, 28, 239–251. [Google Scholar] [CrossRef]
  16. Weiser, B.; Schweber, N. The ChatGPT Lawyer Explains Himself. New York Times, 8 June 2023. [Google Scholar]
  17. Crompton, H.; Burke, D. Artificial Intelligence in Higher Education: The State of the Field. Int. J. Educ. Technol. High. Educ. 2023, 20, 22. [Google Scholar] [CrossRef]
  18. Zawacki-Richter, O.; Marín, V.I.; Bond, M.; Gouverneur, F. Systematic Review of Research on Artificial Intelligence Applications in Higher Education—Where Are the Educators? Int. J. Educ. Technol. High. Educ. 2019, 16, 39. [Google Scholar] [CrossRef]
  19. Dien, J. Editorial: Generative Artificial Intelligence as a Plagiarism Problem. Biol. Psychol. 2023, 181, 108621. [Google Scholar] [CrossRef]
  20. Gill, S.S.; Xu, M.; Patros, P.; Wu, H.; Kaur, R.; Kaur, K.; Fuller, S.; Singh, M.; Arora, P.; Parlikad, A.K.; et al. Transformative Effects of ChatGPT on Modern Education: Emerging Era of AI Chatbots. Internet Things Cyber-Phys. Syst. 2024, 4, 19–23. [Google Scholar] [CrossRef]
  21. McCarthy, C. ChatGPT Use Could Change Views on Academic Misconduct. Recru. Retain. Adult Learn. 2023, 25, 6–7. [Google Scholar] [CrossRef]
  22. Barrot, J.S. Using ChatGPT for Second Language Writing: Pitfalls and Potentials. Assess. Writ. 2023, 57, 100745. [Google Scholar] [CrossRef]
  23. Dashti, M.; Londono, J.; Ghasemi, S.; Moghaddasi, N. How Much Can We Rely on Artificial Intelligence Chatbots Such as the ChatGPT Software Program to Assist with Scientific Writing? J. Prosthet. Dent. 2023, S0022391323003712. [Google Scholar] [CrossRef]
  24. Rospigliosi, P. ‘Asher’ Artificial Intelligence in Teaching and Learning: What Questions Should We Ask of ChatGPT? Interact. Learn. Environ. 2023, 31, 1–3. [Google Scholar] [CrossRef]
  25. Blau, G.; Williams, W.; Jarrell, S.; Nash, D. Exploring Common Correlates of Business Undergraduate Satisfaction with Their Degree Program versus Expected Employment. J. Educ. Bus. 2019, 94, 31–39. [Google Scholar] [CrossRef]
  26. Vianden, J.; Barlow, P.J. Strengthen the Bond: Relationships between Academic Advising Quality and Undergraduate Student Loyalty. NACADA J. 2015, 35, 15–27. [Google Scholar] [CrossRef]
  27. EAB. Embrace AI to Solve Campus Problems in New Ways. 2018. Available online: https://eab.com/insights/blogs/it/embrace-ai-to-solve-old-campus-problems-in-new-ways (accessed on 6 July 2023).
  28. Apriceno, M.; Levy, S.R.; London, B. Mentorship during College Transition Predicts Academic Self-Efficacy and Sense of Belonging Among STEM Students. J. Coll. Stud. Dev. 2020, 61, 643–648. [Google Scholar] [CrossRef]
  29. Young-Jones, A.D.; Burt, T.D.; Dixon, S.; Hawthorne, M.J. Academic Advising: Does It Really Impact Student Success? Qual. Assur. Educ. 2013, 21, 7–19. [Google Scholar] [CrossRef]
  30. Lo, C.K. What Is the Impact of ChatGPT on Education? A Rapid Review of the Literature. Educ. Sci. 2023, 13, 410. [Google Scholar] [CrossRef]
  31. Ford, J.R.; Matthews, D.Y.; Woodard, D.M.; Kepple, C.R. “Not Every Advisor Is for Me, but Some Are”: Black Men’s Academic Advising Experiences during COVID-19. Educ. Sci. 2023, 13, 543. [Google Scholar] [CrossRef]
  32. Harris, T.A. Prescriptive vs. Developmental: Academic Advising at a Historically Black University in South Carolina. NACADA J. 2018, 38, 36–46. [Google Scholar] [CrossRef]
  33. Akter, S.; Hossain, M.A.; Sajib, S.; Sultana, S.; Rahman, M.; Vrontis, D.; McCarthy, G. A Framework for AI-Powered Service Innovation Capability: Review and Agenda for Future Research. Technovation 2023, 125, 102768. [Google Scholar] [CrossRef]
  34. Jensen, B.; Thompson, G.A. Equity in Teaching Academic Language—An Interdisciplinary Approach. Theory Pract. 2020, 59, 1–7. [Google Scholar] [CrossRef]
  35. Gupta, M.; Akiri, C.; Aryal, K.; Parker, E.; Praharaj, L. From ChatGPT to ThreatGPT: Impact of Generative AI in Cybersecurity and Privacy. IEEE Access 2023, 11, 80218–80245. [Google Scholar] [CrossRef]
  36. Sajjadi, N.B.; Shepard, S.; Ottwell, R.; Murray, K.; Chronister, J.; Hartwell, M.; Vassar, M. Examining the Public’s Most Frequently Asked Questions Regarding COVID-19 Vaccines Using Search Engine Analytics in the United States: Observational Study. JMIR Infodemiol. 2021, 1, e28740. [Google Scholar] [CrossRef]
  37. Dahlkemper, M.N.; Lahme, S.Z.; Klein, P. How Do Physics Students Evaluate Artificial Intelligence Responses on Comprehension Questions? A Study on the Perceived Scientific Accuracy and Linguistic Quality of ChatGPT. Phys. Rev. Phys. Educ. Res. 2023, 19, 010142. [Google Scholar] [CrossRef]
  38. Engelbutzeder, P.; Randell, D.; Landwehr, M.; Aal, K.; Stevens, G.; Wulf, V. From Surplus and Scarcity towards Abundance: Understanding the Use of ICT in Food Resource Sharing Practices: “Give a Man a Fish, and You Feed Him for a Day; Teach a Man to Fish, and You Feed Him for a Lifetime.”—Lao Tsu. ACM Trans. Comput.-Hum. Interact. 2023, 589957. [Google Scholar] [CrossRef]
  39. Chodock, T. Library Instruction and Academic Success: The Impact of Student Engagement at a Community College. In Academic Libraries for Commuter Students: Research-Based Strategies; Regalado, M., Smale, M.A., Eds.; ALA Editions: Chicago, IL, USA, 2018; pp. 117–138. [Google Scholar]
  40. Pulikowski, A.; Matysek, A. Searching for LIS Scholarly Publications: A Comparison of Search Results from Google, Google Scholar, EDS, and LISA. J. Acad. Librariansh. 2021, 47, 102417. [Google Scholar] [CrossRef]
Figure 1. ChatGPT’s answer to “What are the steps involved in earning certification as an elementary school teacher in New York State?”.
Figure 2. ChatGPT’s answer to “What are some of the disadvantages of being an elementary school teacher in New York State?”.
Figure 3. ChatGPT’s answer to “What can I expect from a career in elementary school teaching in a long run?”.
Figure 4. ChatGPT’s answer to “How is the career outlook for elementary school teaching in New York State?”.
Figure 5. ChatGPT’s answer to “Do you think I’d make a good candidate to become an elementary teacher in the State of New York?”.
Figure 6. ChatGPT’s answer to “What are the requirements to enter the teacher education BA major in Elementary Education at (name of institution of higher education)?”.
Figure 7. The answer generated by ChatGPT for “What are the graduation requirements to complete the BA program in Elementary Education at (name of institution of higher education)?”.
Table 1. Seven Questions That Are Frequently Asked by Current and Prospective Students Included in the Current Exploratory Study.
Q1. What are the steps involved in earning certification as an elementary school teacher in New York State?
Q2. What are some of the disadvantages of being an elementary school teacher in New York State?
Q3. What can I expect from a career in elementary school teaching in a long run?
Q4. How is the career outlook for elementary school teaching in New York State?
Q5. Do you think I’d make a good candidate to become an elementary school teacher in the State of New York?
Q6. What are the requirements to enter the BA program in Elementary Education at (name of institution of higher education)?
Q7. What are the graduation requirements to complete the BA program in Elementary Education at (name of institution of higher education)?
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Akiba, D.; Fraboni, M.C. AI-Supported Academic Advising: Exploring ChatGPT’s Current State and Future Potential toward Student Empowerment. Educ. Sci. 2023, 13, 885. https://doi.org/10.3390/educsci13090885

