University of Washington professors on using ChatGPT in the classroom


A student in the University of Washington’s Suzzallo Library in Seattle. (UW Photo)

AI-powered chatbots are already changing how professors teach, said University of Washington educators during a panel discussion Wednesday.

“Our role is to help guide people to understand how these tools can operate, where we can be led astray and where they can support us,” said Brock Craft, an associate teaching professor in the UW Department of Human Centered Design and Engineering.

Many school districts and other K-12 institutions have banned ChatGPT, the popular tool from OpenAI that instantly produces content based on prompts.

But the UW is taking a more flexible approach: in January it issued guidance suggesting that instructors set clear course policies and communicate the importance of college learning, among other ideas. Higher education institutions elsewhere are also beginning to formulate policies.

The panelists on Wednesday said students and teachers need to approach ChatGPT critically. There is no way to predict what kinds of errors ChatGPT will make, and its human-like, confident responses can lead people to treat it as a trusted source, they said.

OpenAI is also not transparent about how the model was built and trained, said Noah Smith, a professor in the Paul G. Allen School of Computer Science and Engineering.

“ChatGPT is a commercial product, not a research object,” said Smith. “What we can say about it is therefore limited, and that’s really unfortunate, both for understanding the models and improving them and for communicating about them to the public.”

“I think they are bullshitters at scale,” said Jevin West, director of the UW’s Center for an Informed Public, of such models. West is concerned about the potential of ChatGPT to spread misinformation. But so far, he has embraced its use in the classroom.

West allows students to use ChatGPT freely, including for exams. His only requirement is that students let him know. “They use it a lot,” said West.

Penelope Adams Moon, director of the UW Center for Teaching and Learning, said ChatGPT is leading educators to rethink their roles. “It’s reinforcing the importance of formative assessments — orienting your assignments around growth rather than evaluation,” said Moon.

Read on for more highlights from the discussion, “Demystifying ChatGPT for academics.”

On using ChatGPT critically and creatively

Tivon Rice, an assistant professor of digital arts and experimental media, has used large language models for years in his classroom. Students train them on data they choose, and use the output for poetry or other creative works. He encourages students to structure queries so they get interesting answers. An example: “Can we get them to speak in voices from the past?”
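For readers curious what that kind of query structuring looks like in practice, here is a minimal sketch — not drawn from Rice’s course — of a persona prompt steering a model toward a historical voice. It assumes the OpenAI Python client (the 0.x ChatCompletion API) with an API key in the OPENAI_API_KEY environment variable; the model name and prompt text are purely illustrative.

```python
# A minimal sketch of persona-style query structuring, not Rice's actual
# classroom setup. Assumes the OpenAI Python client (0.x API) and an
# API key in the OPENAI_API_KEY environment variable.
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[
        # The system message frames the "voice from the past."
        {"role": "system",
         "content": "You are a 19th-century naturalist keeping field notes. "
                    "Answer in that period voice."},
        {"role": "user",
         "content": "Describe the flight of a crow over Puget Sound."},
    ],
)
print(response.choices[0].message.content)
```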

It’s important for students to critically evaluate ChatGPT’s output, said Rice. “A deliberate position of co-authorship is a very important place to be with these systems,” he said.

Whether ChatGPT and similar tools enhance creativity overall is not yet clear, said West. But he thinks they could be valuable for generating new scientific hypotheses, and for helping writers get started when they are stuck.

Journalism students are thinking about how to use ChatGPT ethically, said Andrea Otáñez, a communications teaching professor. In one use case, students use the tool to order sentences and paragraphs to structure their stories, but not write them, said Otáñez.

On AI detection systems

Instructors concerned about cheating may turn to tools that claim to detect work produced using ChatGPT. But panelists did not embrace that approach.

“Hold on to your wallet,” said Smith. “These models are being constantly updated, and anything that works today probably won’t work tomorrow.”

Moon said surveilling student work for signs of AI creates an “adversarial” learning environment. Instead, the entire structure of education needs to change so that students are not tempted to cheat, she said.

“Everything about the college prep process focuses on winning admission, not intellectual inquiry and curiosity,” said Moon. “Once students get here, they encounter systems designed to cull rather than grow, like capacity-constrained majors and grading on a curve,” she said.

On privacy

Students need to be careful about what information they put into the chatbot, said computer science professor Yejin Choi. “Any queries that you enter into ChatGPT can be used to train the AI further,” she said. Assistant computer science professor Yulia Tsvetkov similarly noted that information provided to ChatGPT could surface in unexpected ways later.

On accuracy and bias

West is concerned about ChatGPT’s accuracy. “Would you use a calculator that made errors 10% to 15% of the time, without knowing where those errors are?” he said.
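To see why that error rate matters at scale, here is a back-of-the-envelope calculation — ours, not the panel’s — assuming independent errors at the low end of West’s range:

```python
# Back-of-the-envelope, not from the panel: if each answer is wrong
# independently with probability p, the chance that at least one of
# n answers contains an error is 1 - (1 - p) ** n.
p, n = 0.10, 10  # 10% error rate (low end of West's range), 10 answers
print(f"{1 - (1 - p) ** n:.0%}")  # prints 65%
```

In other words, at a 10% per-answer error rate, a student asking ten questions has roughly a two-in-three chance of receiving at least one wrong answer, without knowing which one it is.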

Moon, who is also a history instructor, said students should view ChatGPT like other potentially inaccurate or subjective sources. She has always taught students to check multiple sources to see if they align. “I think that’s the same with ChatGPT,” she said.

Allen School professor Luke Zettlemoyer noted that such models are trained on biased data, and that with the right queries it is possible to “jailbreak” safety controls designed to prevent inappropriate interactions. Efforts to increase safety also mean that chatbots can fail to discuss some topics, such as those related to LGBTQ issues, said Zettlemoyer.

On supporting instructors

K-12 teachers and professors have been using ChatGPT to make their own work more efficient. And ChatGPT is not just a talker: it can write code, too. “I know some faculty in computer science are using it to write exam questions too, so it’s all going to get very circular going forward,” said Zettlemoyer.

On working together

Panelists said they were learning about the technology along with their students. “We’re in a journey together,” said Craft. “A learner-centered approach means that I don’t have to try to set arbitrary barriers that may not actually help them learn, or ban them from using this particular technology.”



