AI in McKelvey: balancing new teaching methods with maintaining academic integrity
Faculty in the McKelvey School of Engineering are grappling with how best to prepare computer science students for careers that will be fundamentally changed by generative artificial intelligence (AI).
AI has altered the landscape of every field in which it can create content, including writing and art, and coding is no different. Between 15 and 20% of the Washington University student body is majoring or minoring in a program associated with computer science, and McKelvey faculty are working to address the ways AI is changing the workforce and the pedagogical landscape.
McKelvey professors are deciding how to teach students not only to use AI but also to understand how it works on the back end. As they navigate these questions, faculty are also working to ensure that students do not rely entirely on chatbots to complete their coding assignments.
Aaron Bobick, Dean of the McKelvey School of Engineering, said the school’s upcoming strategic plan will outline its goals regarding the use of AI for research and educational purposes.
These research goals include investigating how AI can work in tandem with other domains like the social and environmental sciences to improve research and increase the efficiency of work in engineering fields.
The plan’s two educational pillars for AI are ensuring that all engineering graduates are competent with AI and that faculty develop ways to assess students’ knowledge effectively given the presence of chatbots.
“McKelvey shouldn’t be graduating anyone who doesn’t understand the role of AI within the domains in which they were getting educated, whatever engineering discipline that may be,” Bobick said.
For people who know how to engage with these tools, AI will undoubtedly be a time saver, said Bobick.
“Practicing software engineers use these tools, so it would be unreasonable to say we’re not going to have you use these tools,” Bobick said.
Yevgeniy Vorobeychik, a McKelvey Computer Science & Engineering professor who primarily teaches about AI, echoed Bobick’s sentiment about the need to teach students how to use AI successfully.
“AI education should be an important component of essentially all engineering disciplines,” Vorobeychik said. He added that a key part of what courses will have to teach is how to avoid misusing these technologies from an ethical standpoint.
Vorobeychik also said that McKelvey should be preparing students for a workforce changed by AI.
“[The advancement of AI tools] is a bit worrisome, I suppose, for lower-level engineers,” Vorobeychik said. “But on the flip side, it might just change their job description…[to] higher level design things and less lower-level coding.”
Similarly, Bobick said that AI can help people write pieces of code, but that coders will still be needed for the foreseeable future to build coding systems. He added that a chatbot can produce relatively complex code through ongoing prompting, but that understanding the fundamentals of coding is important for carrying out those exchanges effectively.
William Sepesi, a senior majoring in computer science and mathematics, predicted how AI might alter the landscape for coders over the next couple of decades.
“If I had to make a bet on whether or not AI will be able to take everyone’s job eventually, I would say that it could,” Sepesi said. He noted that AI can perform extremely advanced tasks if prompted correctly, but that its usefulness varies widely depending on how competently a user can prompt the bot.
Sepesi said that McKelvey is preparing students to be the ones to code the AI itself.
To prepare students for this world of coding, Bobick said it is important that they understand the fundamentals of the material, which means contending with how students can use chatbots in their classes.
Bobick suggested professors could test in a way that reveals a student’s intentions behind their code, ensuring that students are engaging with the process of coding.
Jay Turner, Vice Dean for Education at McKelvey, said that the process of assessing students poses particular challenges in larger classes.
“One aspirational goal in the very near term will be more extensively leveraging [our Assistants to Instructors to] understand what the students know and more importantly help the students learn how to learn in this new environment,” Turner said.
While there is currently a lot of consternation among educators about how best to handle AI, there is also a sense that generative AI in coursework might quickly become the new normal.
Bobick compared the use of generative AI for coding assignments to using spell checking tools for writing papers.
“Asking ChatGPT to make sure the syntax is correct in your code… is not stunningly different from having Microsoft Word tell you you just misspelled ‘committee’ for the 47th time, or suggesting better grammar,” Bobick said. “It can, of course, do much more than mechanical editing, but we’re embracing it as a tool that everyone will leverage. We do know that we’re going to have to change a bit how we do certain types of assessment of capabilities because producing an artifact will no longer be evidence that you know how to do that.”
Regardless of how individual professors decide to implement AI into their own coursework, it is clear that those in McKelvey will lean into the technology in the years to come.
“Here at McKelvey, we’re embracing this big time,” Bobick said. “We’re going to work so that our students can take advantage of AI in every way possible. It is no longer the future — it is now.”