On Campus

SU’s College of Engineering, Computer Science strives to incorporate AI with integrity


Professors at Syracuse University's College of Engineering and Computer Science are taking steps to address generative AI’s role in academic curricula as it becomes more prevalent among students in the engineering field.


Syracuse University students enrolled in Computer and Information Science 151: Fundamentals of Computing and Programming this semester will soon have 24/7 access to a personal teaching assistant: ChatGPT.

Nadeem Ghani, an assistant teaching professor of electrical engineering and computer science, is looking to incorporate an AI-based teaching assistant into his class in the coming weeks. The tool will provide academic assistance by explaining how to find an answer rather than directly providing a solution.

To keep the model from providing actual code to his students, Ghani plans to use a wrapper – a layer of code built around another program to control how it behaves – that would give ChatGPT specific instructions to provide coding guidance rather than an answer.

“The plan is that we are going to give them a ChatGPT-based TA,” Ghani said. “So we wrote a version that’s not going to show you the code, but it’s going to describe what the answer should do in natural language.”
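
A minimal sketch of how a wrapper like this could work, assuming the OpenAI Python SDK; the model name, the prompt wording and the ask_ta helper are illustrative assumptions, not details of Ghani’s actual system:

```python
# Hypothetical sketch of a "wrapper" teaching assistant built on the OpenAI
# Python SDK. The system prompt forbids the model from emitting code and
# asks it to describe the approach in natural language instead.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

SYSTEM_PROMPT = (
    "You are a teaching assistant for an introductory programming course. "
    "Never write or complete code for the student. Instead, describe in plain "
    "language what the solution should do and which course concepts apply."
)

def ask_ta(student_question: str) -> str:
    """Route a student's question through the wrapper and return the reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice, for illustration only
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": student_question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_ta("How do I reverse a list in Python?"))
```

Because the rules live in the system message rather than in the student’s prompt, every exchange passes through the same restrictions before the model responds.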



Ghani is one of several professors in SU’s College of Engineering and Computer Science taking steps to address generative AI’s role in the classroom. Professors at universities across the country — including Yale University, Harvard University and Georgia State University — have similarly begun exploring the use of AI as a teaching assistant. While some professors are embracing AI usage in their classes, others are more hesitant to rely on the emerging technology.

Priyantha Kumarawadu, an associate teaching professor of electrical engineering and computer science, said that while he encourages his students to use AI tools to assist their learning, he sets clear boundaries for its appropriate usage.

“One of the important things is that we have to clearly inform the students how to maintain a culture of academic integrity and ethics even when using AI tools,” Kumarawadu said.


Recent advancements in AI technology have allowed engineers to quickly and efficiently tackle complicated issues and automate processes. Chatbots such as ChatGPT can also write code in various programming languages for engineers, though their answers aren’t always correct.

ChatGPT answers software programming questions incorrectly more than half the time, according to a study from Purdue University. A separate study by researchers at Stanford University and UC Berkeley found that ChatGPT’s accuracy has gotten worse over time.

With AI’s unreliability, Garrett Katz, an assistant professor of electrical engineering and computer science, said he limits the use of AI in his class.

“ChatGPT is known for sometimes saying non-factual information,” Katz said. “It can generate very realistic-looking content in whatever medium.”

Kumarawadu, on the other hand, permits the use of generative AI in his classes so that students can understand the impact of emerging AI applications on the engineering field, he said. He aims to shift his teaching styles to align with how people in the field are currently using these emerging tools, including automating repetitive tasks, streamlining complex procedures and assisting in problem-solving.

“The actual industry is being supported by AI tools, so therefore I believe that the students should be exposed to these tools in their classes too,” Kumarawadu said.

Katz, who is teaching Computer and Information Science 467: Introduction to Artificial Intelligence this semester, said students need to learn the fundamentals and ethical ramifications of AI so they are prepared for the workforce.

Although Katz educates students on the fundamentals of AI and how to build AI systems, he describes his feelings toward AI usage in the classroom as “conservative.” He does not permit students to use AI to complete their assignments because, he said, excessive reliance on AI tools may hinder students’ comprehension of important topics.

“Learning isn’t supposed to be easy, necessarily,” Katz said. “AI sort of supplants thinking and just allows students to get to the answer without really understanding what they think.”

Hamid Ekbia, director of SU’s Autonomous Systems Policy Institute, said he sees the application of AI in engineering as essential, with many engineers using AI tools to tackle the problems they face. He believes that under SU’s current strategic plan, the university’s engineering professors are taking major strides in their use of AI for problem-solving.

“From material design to software engineering and from structural analysis to space missions, AI techniques provide unique ways to deal with complex, uncertain and unpredictable environments, allowing engineers to build robust, resilient and scalable systems,” Ekbia wrote in an email to The Daily Orange.

AI and its applications are also increasingly discussed by experts in the engineering field. The Accreditation Board for Engineering and Technology has begun to look at the professional and ethical implications of AI in the industry, Kumarawadu said.

Kumarawadu said AI tools can uniquely contribute to a more interactive academic environment through their assistance in pair programming — when two developers write code cooperatively — and project-based learning.

In his syllabus for Computer Engineering 283: Introduction to Object-Oriented Design, Kumarawadu outlines that generative AI can be used for brainstorming and organizing but not for completing graded work. Kumarawadu also notes that all AI use must be properly documented and cited to stay within university policies on academic honesty.

Qinru Qiu, a professor of electrical engineering and computer science, does not include any specific AI policies in her classes, believing the tool can be used more like a search engine. As long as students ensure their answers are correct and in their own words, she sees AI tools as beneficial to the learning process.

“One of the required skills for engineers is the ability to find references and use tools to solve the engineering problem,” Qiu, the Electrical Engineering and Computer Science graduate program director, wrote in an email to The D.O. “Personally, I’m okay if students use AI as a tool to get answers.”

Personalized learning can also be enhanced by the use of AI, Kumarawadu said. He believes AI tools can help identify a student’s learning style and customize assessments or assignments accordingly.

Despite the potential for AI to strengthen learning for students, some professors are worried about the negative implications regarding academic integrity and individual thinking.

To mitigate the potential for students to use AI to cheat, Katz said he randomizes test questions for each student and administers exams primarily in person. Although ChatGPT can help students understand a certain topic in principle or boost productivity, he said, students should not simply rely on AI-produced answers.

Kumarawadu sees academic integrity as the most important factor in a student’s learning experience, he said, which is why he requires students to disclose any use of AI in class or on assignments.

Ghani said artificial intelligence as a whole is transforming the skill set required to be an engineer, forcing engineering professors worldwide to rethink their curricula.

“It was unthinkable that a software engineer could be useful if they couldn’t code, but now it’s not,” Ghani said. “In five years writing code could be completely irrelevant.”
