Photo illustration by Justin Morrison/Inside Higher Ed | ASU | Joan Cros/NurPhoto via Getty Images
Arizona State University’s relationships with Silicon Valley giants date back to the early 2000s and its work with Google.
When the opportunity arose in January to partner with ChatGPT creator OpenAI, it felt like a natural fit, according to Lev Gonick, ASU’s chief information officer.
The university hit the ground running in a first-of-its-kind partnership with the artificial intelligence (AI) pioneer: Hundreds of project proposals poured in, with ideas ranging from tutoring bots to streamlining administrative tasks.
Gonick said the university is just getting started.
“I know I don’t have another 20 years to go through the journey of [scaling] online to this moment; this has to happen in three or four years,” he told Inside Higher Ed at the Digital Universities conference earlier this month.
The ASU-OpenAI relationship emerged long before the launch of ChatGPT in November 2022, from a “technical conversation” about what the company was seeing in the marketplace. Deeper talks on a partnership began in the summer of 2023, with the hope of kicking things off at the start of the school year.
“That turned out to be January,” Gonick said. “And now we’re off to the races.”
ASU is OpenAI’s sole educational partner, though the University of Michigan wrote a $75 million check to OpenAI co-founder Sam Altman’s venture firm, Hydrazine Capital, in December 2023.
While activity from the partnership continues to grow, not everyone is happy about it. Some ASU faculty and students have voiced concerns about the seemingly all-in partnership with the tech behemoth.
“The university has thrown all this at us and pushed,” said Laurie Stoff, a professor at Arizona State University. She is a member of United Campus Workers of Arizona, which represents the state’s three public universities.
AI Projects and Proposals
Gonick said ASU is tackling the partnership in a two-pronged, “laser-focused” approach—versus what he calls a “spray painting method to see what sticks.”
The first prong seeks out proposals from faculty on the best ways to utilize the technology. The proposals fall into three buckets: teaching and learning outcomes; research and the public interest; and improving the university experience, such as better customer service and streamlining work between departments.
The university received more than 400 proposals, almost half of them in the first two weeks. The university approved 104 projects for the spring and 114 for this summer.
The projects run the gamut from an AI “language buddy” that supports asynchronous learning environments to a “FacilitatorBot” that provides faculty with AI-powered support.
A second proposal round that included ideas from students closed earlier this month.
“The intent here is to use [the proposals] in the service of student success,” Gonick said. “It makes sense to start with the teachers, the faculty, then the students, and then engage further. There’s a ton of work that is underway to engage students in an intentional fashion.”
Enterprise Beyond the University
The university also acts a bit like a consultant to OpenAI for its “enterprise” large language model, giving feedback when necessary and flagging potential solutions. Gonick pointed toward researchers requesting access to GPT’s application programming interface, better known as an API, to tinker with the foundational models.
“The API is going to allow us to figure out what's technically the good, bad or ugly about these large language models that we’re all working on,” Gonick said. “OpenAI said, ‘Well, we haven’t thought about making the API library available to researchers. What do you all need?’ So again, we work through what that looks like.”
Faculty and student proposals, once completed, could potentially go to market as education technology products for OpenAI, Gonick said. He gave few details on what they could look like.
“It’s so early days; this is a company that has effectively got a product for our education that’s six months old,” he said. “It’s a bit presumptuous for me to sort of say, ‘How are they going to go to market?’”
Faculty React to OpenAI Partnership
When asked about any faculty pushback on the partnership, Gonick said there has been “lots of good, open dialogue.” He pointed to multiple lines of communication, citing three AI task forces: immediate input, long-term strategic planning and AI ethics.
“When faculty members say, ‘I’ve got an issue,’ it’s like, ‘Great, here’s a whole bunch of ways that you can have input,’” he said. “ASU is doing the right thing, just in terms of getting the guardrails [in place] and we’re totally open to constructive criticism as well.”
But not all faculty are satisfied with the AI partnership.
Stoff, the ASU professor, also sits on the union’s “AI Concerns” subcommittee, which formed earlier this year and is collecting feedback to present to university leadership.
“It was always why it’s so great, ‘Here’s why you should engage,’” she said. “It wasn’t ever a discussion of, ‘We should be thinking about this collectively, what are the downsides.’”
Stoff joins a number of faculty unions, from local state associations to national groups, that have concerns about potential AI overreach. Many are discussing how to ensure concerns are addressed by institution administrators. The National Education Association represents nearly 200,000 members and expects to have language finalized in July that members can use as a framework for bargaining.
The American Federation of Teachers recommended that faculty and staff members working at institutions or in states without strong unions—such as Arizona, where collective bargaining agreements are illegal—push college administrators to codify AI policy language in an employee handbook.
Both Stoff and Alex Young, an associate professor in ASU’s Barrett Honors College, noted previous statements from ASU about the potential use of AI as tutors in the classroom. Both raised worries about potential labor consequences.
“I certainly have not been presented with any convincing, peer-reviewed evidence that suggests the hard-to-quantify outcomes we’re looking for—which is not just producing essays but developing critical thinking—are being achieved through artificial intelligence,” Young said. “That’s my real fear, in terms of a labor issue: it can both detract from the student experience and push writing instruction into the purview of a machine rather than hard-working instructors.”
Gonick said the university does not plan to replace professors, and that the use of OpenAI’s large language model could help supplement courses. He pointed to the “Comp 101” course that is required at most institutions, noting that ASU had more than 530 sections.
“So there’s like, no writing instructor left in the Valley, who can actually help students with this small pod 20-person engagement,” he said. “So where it makes sense—and only where it makes sense—this is not about replacing humans, it’s about augmenting.”
But Stoff said that focuses on the wrong issue.
“The argument of helping faculty use their time better doesn’t get to the heart of why faculty and students are overworked,” she said. “Providing feedback is essential to teaching. The answer is not cutting the role of someone who provides feedback to students; the answer is to reduce the number of classes and students.”
Katrina Michalak, a student at ASU, wrote about her hesitancy with the technology and the subsequent OpenAI partnership in the ASU student paper The State Press, pointing toward Gonick’s specific Comp 101 example.
“Even if ChatGPT can help students brainstorm, researching ideas for a school assignment is something students can succeed at on their own,” she said in the article. “Students are learning how to become AI literate instead of practicing brainstorming and researching independently.”
Future Plans
While ASU is the sole educational partner of OpenAI, OpenAI is far from ASU’s only partner. Gonick said the university uses about 25 other large language models and works with several AI products, including Google’s Gemini and Anthropic’s Claude.
“[OpenAI] is just the one right now where—with so much attention on them as the leader in this space—we are, like everyone else, trying to keep up with all the new things that keep on getting shared,” he said. “Trying to figure out how to make them available to our faculty, students and staff.”
But there are no plans to slow the growth with OpenAI or others. The OpenAI contract is on a year-to-year basis, up for renewal this January, and ASU is always looking to grow its roster.
“Every month or so there’s a new entrant that’s looking to disrupt; we’re involved in conversations with all of them,” Gonick said. “This is about being able to experience—and then be able to amplify—the ways in which generative AI is helping to change the trajectory of students’ lives in their learning journey.”