
AI chatbots can cushion the high school counselor shortage — but are they bad for students?


CareerVillage is another California nonprofit focused on scaling good college and career advice. CareerVillage.org has been aggregating crowd-sourced questions and expert answers since 2011 to help people navigate the path to a good career.

When ChatGPT came out, co-founder and executive director Jared Chung saw the potential immediately. By the summer of 2023, his team had a full version of their AI Career Coach to pilot, thanks to help from 20 other nonprofits and educational institutions. Now “Coach” is available to individuals for free online, and high schools and colleges around the country are starting to embed it into their own advising.

At the University of Florida College of Nursing, a more specialized version of Coach, “Coach for Nurses,” gives users round-the-clock career exploration support. Shakira Henderson, dean of the college, said Coach is “a valuable supplement” to the college’s other career advising.

Coach for Nurses personalizes its conversation and advice based on a user’s career stage, interests and goals. It is loaded with current, geographically specific labor market information, so people can ask, for example, about earnings for a particular job in a particular county. Coach can also talk people through simulated nursing scenarios, and it offers chat-based activities and quizzes to help them explore different career paths.

Henderson is clear on the tool’s limitations, though: “AI cannot fully replace the nuanced, empathetic guidance provided by human mentors and career advisors,” she said. Humans can assess an aspiring nurse’s soft skills and help them think through the type of hospital they’d like most or the work environment in which they’d thrive. “A human advisor working with that student will be able to identify and connect more than an AI tool,” she said.

Of course, that requires students to have human advisors available to them. Marcus Strother, executive director of MENTOR California, a nonprofit supporting mentoring programs across the state, said Coach is worlds better than nothing.

“Most of our young people, particularly young people of color in low-income areas,” Strother said, “they don’t get the opportunities to meet those folks who are going to be able to give them the connection anyway.”

By contrast, Coach, he said, is “like having a mentor in your pocket.”

‘A regulatory desert’

Last month, California state Sen. Steve Padilla, a San Diego Democrat, introduced legislation to protect children from chatbots. Senate Bill 243 would, among other things, restrict companies from designing chatbots that encourage users to engage more often, respond more quickly or chat longer. Such design elements rely on psychological tricks to keep users on the platform, which research indicates can create an addiction that crowds out other healthy activities or leads people to form unhealthy emotional attachments to the bots.

The addictive nature of certain apps has long been a critique of social media, especially for young people. In her research for the Clayton Christensen Institute, Freeland Fisher included a comment from Vinay Bhaskara, co-founder of CollegeVine, which released Ivy, a free AI counselor for high schoolers, in 2023.

“I’ve seen chat logs where students say, ‘Ivy, thank you so much. You’re like my best friend,’ which is both heartwarming, but also kind of scary. It’s a little bit of both,” the report quotes him as saying.

Reached by phone, Bhaskara said his company’s tool is designed to be friendly and conversational so students feel comfortable using it. Millions of students have used the chatbot for free on CollegeVine’s website and more than 150 colleges in California and around the country have offered the technology to their own students. After seeing how many millions of emails, text messages and online chat sessions have happened outside of working hours, Bhaskara now argues the insight and support students have gotten from the chatbot outweigh the risks.

In announcing Padilla’s bill, his office referenced a number of cases in which chatbots directed children who had become attached to them to do dangerous things. At the most extreme, a Florida teen took his own life after a Character.AI chatbot he had become romantically involved with reportedly encouraged him to “come home to me.” Padilla said his bill wouldn’t keep young people from getting the benefits of college and career advising from chatbots; it would offer reasonable guidelines to address a serious need.
