The Urban Legend

The School Newspaper of Urban School of San Francisco


Opinion: Urban should implement AI into the curriculum

Understanding the assets of ChatGPT
Illustration credit: Kyle Young.

At this point, it seems likely that most, if not all, Urban students have prompted the essay-crafting, code-writing and problem-solving large language model (LLM) known as ChatGPT. Through exploring its capabilities, whether by having a conversation about secrets or showing off its writing skills in iambic pentameter, ChatGPT has proven to be a versatile tool.

According to the student handbook, Urban considers student use of ChatGPT to be plagiarism. However, an increasing number of conversations have been sparked about students using generative artificial intelligence (AI) to cheat on school assignments.

Teachers have used a variety of methods to catch students, from asking them directly to running their work through GPTZero, a tool for detecting writing produced by LLMs. However, the reality persists that students are tempted by how quickly they can finish their work, especially after a busy afternoon of homework, sports and activities. So the question remains: how does generative AI fit into the Urban curriculum?

“Urban’s approach to technology is purposely holistic: technology skills, multimedia design and digital literacy and citizenship are integrated throughout the daily curriculum to foster active, student-centered learning,” according to the Urban website.

So naturally, it surprised me that, according to the handbook, Urban is taking such an absolute approach to a tool that could be used to foster student learning. The solution is to integrate ChatGPT into the curriculum as soon as possible. Students need a clearer structure, one that strikes a balance between learning and using the tools available to them.

Different departments are approaching generative AI in ways unique to their subject. Science is taking an experimental route. “We’ve talked about this as a science department. We’ve put our lab prompts into ChatGPT and we know which ones are ChatGPT-able,” said Matthew Casey, science department chair. “We modified those to make them more open-ended so that students are thinking more about the prompts and not just generating ideas.”

On the other hand, the English department has prohibited ChatGPT from being used in its classrooms, but has not fully ruled out using other forms of generative AI. “Your brains are developing, and part of how you get better at critical reading and critical thinking is to actually do it yourself,” said Cathleen Sheehan, English department chair. “But things are always evolving, right? So I’m sure we’re going to continue to come back to this conversation. But at the moment, we want to go back to the basics of your own writing and thinking independently.”

I agree with Sheehan and Casey that Urban should constantly reevaluate how it approaches generative AI, since the technology has given digital literacy a new meaning. However, that isn’t going to stop Urban students from using it to cheat. Removing it entirely, or working around the issue by making prompts more open-ended, are stopgap measures that won’t resolve the root of the problem: students using generative AI to think for them. As such, Urban must build a class, or classes, around pushing students in the right direction with generative AI, rather than removing a learning opportunity.

However, with something as capable as ChatGPT, this would have to be a mutual agreement between students and faculty. “There’s give or take,” said Casey. “I can imagine it being difficult for a student who needs help on a lab report to distinguish what suggestions from ChatGPT are beneficial and accurate [versus] what suggestions might be taking a student in the wrong direction.” 

Teaching students to use ChatGPT and other generative AI effectively would allow them to take full advantage of its power and produce better work. However, it is then up to the student to use it as an enhancement to their original ideas, rather than as a shortcut to relieve stress.

Models for teaching prompt engineering already exist. David Malan, who teaches Computer Science 50 (CS50) at Harvard University, created a challenge called Ready Player 50, in which contestants must converse with LLMs to obtain a password. The levels grow harder as players progress, with the password encoded or the topics the player can prompt about partially censored. Through this, contestants experiment and learn how to prompt LLMs intentionally to extract the right information.

With such a policy or class, students would have an enhanced learning experience powered by generative AI as a tool, such as a tutor or a Google Docs add-on to help polish their writing. Khan Academy, for example, has implemented Khanmigo, an AI-powered tutor to help students interactively engage with a chatbot for math, science and humanities.

Urban should not discourage students from utilizing all tools available to them, but instead encourage them to use these tools to enhance and better their writing and understanding. However, in an ever-evolving world of AI, students must hold up their end of the agreement by learning how to become independent thinkers.


About the Contributor
Kyle Young, Editor in Chief, Print