Indiana University launches a new artificial intelligence tool for use by faculty, students


This fall, Indiana University began rolling out a new feature across its campuses to encourage the responsible use of artificial intelligence in the classroom.

Called ChatGPT Edu, the technology is specifically designed to assist users in higher education.

In a press release announcing the rollout, leaders at IU said the introduction of the technology is important to prepare students for the workforce they’ll graduate into.

Some faculty at IU agree, saying that institutions of higher education should be offering students opportunities to work with emerging technology.

“Law school is the place to practice and learn. If we don’t teach them, where are they supposed to get it?” said Frank Emmert, a professor at the IU McKinney School of Law.

OpenAI announced the launch of ChatGPT Edu last May, saying the program is an affordable way for universities to “responsibly deploy” artificial intelligence in higher education.

The company advertises ChatGPT Edu as a more accessible option for universities, boasting features like improved language capabilities and robust data security. The program doesn’t use student and faculty conversations and data to train OpenAI models, a security feature that Asaf Lubin emphasized.

“It ensures that those students and faculty, when we interact with it, all of the data remains confidential…it’s all subject to IU internal policies and controls,” said Lubin, an associate professor of law at the IU Maurer School of Law.

Beginning this school year, the program will roll out to 120,000 people across IU’s campuses around the state. Faculty could request access starting this month, while students must wait until next semester.

The rollout is OpenAI’s second largest for the AI program.

Through the program, faculty and staff can explore AI-teaching strategies and streamline administrative tasks, while students can use the program to develop their AI knowledge and build skills to prepare them for the workforce, Indiana University said in its press release.

Using AI

In practice, the use of the program will look different based on individual preferences.

In addition to his teaching role at Maurer, Lubin coaches the school’s Jessup International Law Moot Court team. He’s considering ways that students can use the program to prepare for the competition.

Last year, the International Law Students Association, the organization that administers the Jessup competition, ran an experiment in which it generated 10 approaches to a fictional case using different AI models. The models were placed in competition and judged alongside the students’ work, and they earned scores on par with the historical average of live competitors, Lubin said.

Now, the association is allowing the use of AI to prepare for the competition.

“If your team doesn’t use it, they’re at a disadvantage compared to other teams,” Lubin said.

He added that he’s asked for early access to ChatGPT Edu for students in preparation for next year’s competition.

It’s unknown exactly how many Maurer professionals have requested access to ChatGPT Edu so far, but Lubin believes several are interested in using it. Some, in fact, are probably already subscribed to their own AI tools, he said.

And for faculty who are skeptical of the technology, he sees the rollout as a way to earn their trust.

“So many faculty, like a year ago or two years ago, tried ChatGPT, the free version, saw that it produced not good quality, and therefore assumed that AI stayed at that level,” he said. “Giving them their access to advanced models will demonstrate to them the value of it to all aspects of their work life. That, I think, will be the incentive for doing it.”

Developing AI literacy

The legal profession is increasingly implementing AI strategies to support its work. The American Bar Association’s 2024 Artificial Intelligence TechReport found that 47.8% of surveyed attorneys who work at firms with 500 or more employees use AI.

Of the attorneys surveyed, 52.1% said they use ChatGPT specifically.

Because of this, it’s important for law students to understand how to appropriately interact with the technology, Lubin said.

“Providing students early familiarity with how to interact with these tools, how to use them to support one’s research and writing, not to replace it, is just the kind of preparation any law school in 2025 must offer their students, because that’s the marketplace they’re about to enter,” Lubin said. “And that certainly will be the marketplace that we’ll all function in 10 years from now.”

Regardless of their views on AI, Lubin and his colleagues agree that in order to work with it, law students must first acquire the hard skills needed for the profession.

“You cannot assess the quality of the product you’re receiving from the tool without having an understanding of what a good product looks like and how to get there,” Lubin said.

To use AI effectively, students must be able to weigh its efficiency against their studied knowledge of the law, educators said. Without that grounding, students are more likely to run into issues like AI hallucinations or outdated information.

And if they’re going to mess up, it’s better to do it in school.

“Making mistakes, that’s what law school is there for,” Emmert said. “Because our mistakes, they’re in a closed environment. You’re not going to be sued for malpractice or lose a client.”

A growing emphasis on AI literacy echoes across IU’s campuses and degree tracks.

In August, IU launched a GenAI 101 course to help students, staff, and faculty establish a foundation for understanding artificial intelligence, its uses and its limitations.

The free course is self-paced and teaches users 20 key GenAI skills in the areas of foundational prompt engineering, using AI as a thought partner, and AI as a productivity amplifier.

Off campus, law firms are increasingly using AI for everyday tasks.

In 2023, Taft Stettinius & Hollister formalized its GenAI strategy in response to increased interest in the technology by both clients and attorneys following the release of ChatGPT in 2022, Lyndsay Capeder, Taft’s chief client and innovation officer, said in an email.

The firm’s approach to the technology rests on six core pillars that support the needs of its clients, attorneys, and professional staff, and adheres to the firm’s professional responsibilities and the realities of the marketplace.

Taft’s professional staff and attorneys receive the firm’s AI policy along with a core set of training modules and specific training for individual AI tools. The firm also offers continuing AI education to reinforce the responsible and effective use of the technology, Capeder said.

Like the faculty at IU, Capeder encourages students’ efforts to familiarize themselves with AI technology prior to entering any profession but emphasizes that AI can’t have the final word.

“Gen AI is not a silver bullet—it is good to be aware of the benefits and risks of these tools. But at the end of the day, your critical thinking skills and good judgment is what is going to make you most successful when utilizing these tools,” she said in an email.•
