By Claire Kowalchik P’22
Illustration by Katie Thomas
Photos by Matt Lester

Committed to advancing teaching and learning, Moravian University is exploring the applications of generative AI, and several faculty members across disciplines are bringing it into their classrooms.

You encounter artificial intelligence (AI) every day—when you open your smartphone with Face ID, ask Alexa or Siri what the weather will be, or follow your car’s navigation system for directions.

AI personalizes feeds on your social media, the ads you see on the internet, and Netflix recommendations. It sends questionable emails to your spam folder and checks for fraud when you make purchases. And what would we do without Grammarly and spell-check?

When we’re talking about AI’s place in education, we’re specifically referring to generative AI (a.k.a. gen AI)—ChatGPT, Gemini, and Claude, among others—and those discussions have rocketed to the forefront of our examination of technology and its impact on learning. Many educators support its value; others express concern.

So, what is gen AI?

“Generative AI is a type of artificial intelligence that can create new content, like text, images, or music. It works by learning patterns from vast amounts of existing data [a.k.a. machine learning] and then using that knowledge to produce original outputs,” explains Claude. “Think of it as a highly advanced autocomplete but instead of just finishing your sentence, it can write entire articles, compose songs, or even generate artwork. These AI systems . . . can engage in conversations, answer questions, and assist with various tasks.

“Generative AI has numerous applications across industries, including content creation, product design, drug discovery, and more,” adds Claude. “However, it also raises ethical concerns regarding copyright, misinformation, and the potential displacement of human creativity.”

It seems even generative AI has some concerns about itself.

Seventy-nine Moravian faculty members responded to a recent survey asking about their use of generative AI and their thoughts on its impact on learning. Of the respondents—and taking into account comments from the “other” category—70.8% allow generative AI in their classes in some form, whether that’s permitting students to use it freely but transparently, as a supportive tool for assignments or as a muse, or at the very least teaching them how to use it in preparation for their careers.

Essays, Papers, and Fantasy Fiction

Imagine your assignment is to write about a fantastical world of your own invention. You stare at the blank Google Doc on your laptop. Where to begin?

“Generative AI can generate ideas and help break through writer’s block,” says Chris Hassay, writing instructor and associate director of the Center for Inclusive Excellence, who is currently teaching a world-building class in which students describe new fictional places. “ChatGPT is amazing at throwing out ideas. So if we’re trying to create an interesting fantasy setting, and you’re stuck trying to name your characters, ChatGPT can give you dozens of characters in a millisecond.” Then it’s up to the student to consider them and make choices to suit the narrative. Or perhaps seeing those options sparks something new in the writer’s mind.

Hassay has created his own character, student Hal 9000, whose true identity is ChatGPT. Hal assists Hassay in teaching students about how generative AI works. Hassay gives Hal a typical writing assignment, and projected on the board at the front of the classroom, lines of text rapidly scroll upward. 

“When we glance at an AI-generated piece, it might look amazing, but once we start to investigate beneath the surface, we can see the cracks,” says Hassay. “And those cracks illustrate the difference between humans and this thing being a large language model.”

The class has a deep discussion about the successes and failures of Hal’s piece and how they might improve it. “And I hope that students can keep some of that in mind for their own work,” Hassay says.

“Generative AI can generate ideas and help break through writer’s block. ChatGPT is amazing at just throwing out ideas.”

—Chris Hassay, writing instructor and associate director of the Center for Inclusive Excellence

A rich writing experience can occur if a student thinks of generative AI as a writing partner. When ChatGPT or Claude delivers on a prompt, it will ask if it can answer any other questions. “If we break up a writing assignment or writing experience into its constituent pieces and then give gen AI more manageable chunks, we can have a conversation and iterate on those pieces,” says Hassay.
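Hassay’s chunk-and-iterate approach can be sketched in a few lines of code. This is a toy illustration only—the section names and prompt wording below are invented, not Moravian’s actual workflow:

```python
# A toy illustration of "chunking" a writing task into pieces that can each
# be discussed with a gen AI assistant, rather than handing over the whole
# assignment at once. Section names and prompt phrasing are hypothetical.

def chunk_assignment(topic, sections):
    """Build one focused prompt per section of a writing assignment."""
    prompts = []
    for section in sections:
        prompts.append(
            f"I'm drafting the '{section}' part of an essay on {topic}. "
            "Give me three angles I could take, then ask me one question "
            "to push my thinking further."
        )
    return prompts

prompts = chunk_assignment(
    "a fantastical world of my own invention",
    ["setting", "characters", "conflict"],
)
for p in prompts:
    print(p)
```

The point of the structure, as Hassay describes it, is that each small prompt invites a back-and-forth conversation on one piece of the work rather than a single finished product.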

“If a student is using AI in my class, I hope they’re using it as a component of their writing process, not as a replacement—not just using it to get an assignment done in a quarter of the time but actively having a conversation with this resource or using it to generate ideas. And then when they embed AI into some of their work, the students are expected to provide notation to reference not only that they used AI but the prompts that they used to get that particular output.”

Gen AI can help students become better writers. Randy Ziegenfuss, professor of practice in education and director of Moravian’s EdD program, asks his students to use generative AI to comment on their writing. First, he explains the assignment, which his students must complete on their own, and then he instructs them to submit their work to gen AI with the assignment directions and ask for feedback. 

“You don’t want to have AI do the assignment for the students. You want the students to do the assignment so they build the skills they need. You’re not using it to circumvent, and that’s a critical piece. Whether we’re professors or students, we have to know if we’re circumventing some cognitive dissonance that we actually want to be there so that learning takes place.”

Data and Ideas

Generative AI can quickly gather facts and figures, perform data analysis, conduct a literature review, summarize research, explain difficult passages in a technical paper, and translate a journal article for a student whose first language is not English.

Lorraine Marchand, assistant professor of practice and business program director in Moravian’s School of Professional Studies and Innovation, recommends her students use generative AI as a research tool.

“For business classes where we need to do market research, market sizing, and some financial analysis, I may give students prompts to help them gather financial or market share information on a company; determine the components of cost of goods sold for a product and estimated costs; or find market trends for an industry, company, or market,” she says. “The purpose is to make their background research more efficient. Publicly available data can be hard to come by, and while I want the students to know where to look (company websites, financial statements, investor reports, S&P reports), I don’t want them spending all their time looking for data, because a business professional’s value is in deriving insights and making decisions based on the data.”

Marchand does not allow students to use ChatGPT to get answers to questions on their homework. “They need to study the data, make their own assessment of it, and write their own translation of what that data means regarding the assignment.

“For example, in my digital transformation course, students are assigned a company for which they need to develop a digital strategy playbook. Prompting ChatGPT on the company’s market share, financials, competitive set, industry trends, and areas for innovation helps them gather information quickly. They still must corroborate the findings with other sources, but at least it gives them a starter set of information and ideas. I require at least five other sources of data to confirm what they find with ChatGPT.”

Mark Koscinski, associate professor of practice in accounting, also sees generative AI as a tool for research if used correctly, so he teaches his students how to use it.

“One of the first exercises I do with students is show them how unreliable gen AI can be. For example, in tax class, I’ll ask them to research what will happen if someone wins a $30 million lawsuit. What are the tax implications? And the answer invariably comes back incorrect. So the first demonstration shows that you cannot just take an accounting or finance issue, run it through AI, and think that the answer will be correct.”

After students see that gen AI doesn’t have the right answers to broad prompts, Koscinski works with them to learn how to develop specific and probing questions. “I will have the students conduct research solely using AI, and that gives them practice in question architecture to learn to ask the same question in three or four or five or six different ways to make sure they’re covering all the bases. And indeed, if you look at some of the answers, even if they’re wrong, they give you a first start on the research and where to look for the correct answer.”
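The “question architecture” Koscinski describes—posing the same research question several ways so the answers can be cross-checked—might be sketched like this. The fact pattern and question templates are invented examples, not his actual exercises:

```python
# Hypothetical illustration of "question architecture": asking one research
# question from several angles so the AI's answers can be compared and
# cross-checked rather than trusted on the first pass.

BASE_FACTS = "a client wins a $30 million lawsuit settlement"

TEMPLATES = [
    "What are the federal income tax implications if {facts}?",
    "Is any portion excluded from gross income when {facts}?",
    "Which authorities govern the taxation of damages when {facts}?",
    "How does the answer change if the award is for physical injury, given that {facts}?",
]

def build_variants(facts, templates):
    """Fill each question template with the same fact pattern."""
    return [t.format(facts=facts) for t in templates]

for question in build_variants(BASE_FACTS, TEMPLATES):
    print(question)
```

Disagreement among the answers is itself useful: it flags exactly where the student needs to verify against authoritative sources.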

Artists do research, too. A painter might use gen AI to call up several images of a specific tree to use as reference for a painting, saving hours of time locating the species nearby and photographing it in multiple views.

MaryJo Rosania-Harvie, professor of practice in art and art education, encourages her students to use gen AI for research and brainstorming.

“It throws ideas out, and the human part is, How do we interact with that? So, if it gives you six ideas, you don’t say, ‘I’m done.’ Those six ideas spark something else. And then you create something new.”

—Randy Ziegenfuss, professor of practice in education and director of Moravian’s EdD program

For one assignment, her class created an arts-based field trip guide for elementary students. When given the parameters of the trip—grade level, subject, learning goals—generative AI listed several places, including some that neither Rosania-Harvie nor her students knew of—a bonus. “ChatGPT doesn’t tell them how to write up the field trip; it simply gives them ideas.”

In Rosania-Harvie’s Art and Childhood Development course, students must create a final project—a children’s book based on the course material. The course doesn’t teach students how to write a children’s book, so this is an opportunity to use generative AI to augment an assignment. Students need to know the core information for the story, such as the age range of their audience, developmental milestone(s), and problems or concerns based on the course content. They use ChatGPT to research possible story plotlines, choose an idea, and shape it into a unique story.

“It throws ideas out, and the human part is, How do we interact with that?” Ziegenfuss says. “So, if it gives you six ideas, you don’t say, ‘I’m done.’ Those six ideas spark something else. And then you create something new.”

Coding, of Course

Moravian’s computer science program has been using generative AI for two years already—no surprise there. The tool of choice? GitHub Copilot. Instead of being trained on natural-language text, it has been trained on lots and lots of code. To use GitHub Copilot, students give it some starting code along with a word prompt.

“When introducing generative AI, the first thing I ask students to do is prompt it to make a program that can do X,” says Jeff Bush, associate professor of computer science. “The program fails for every single person. It is a great example of what happens when you give AI too big of a problem.”

Bush works with his students to figure out how to break problems down into small segments. One by one, they input those segments into GitHub Copilot, which outputs the code. Students then need to test the code to confirm that it works.

“While we go over some basics of programming in class, AI will generate things the students have never seen before,” Bush says. “So it’s a requirement that they understand everything they submit.” If Bush sees something that hasn’t been discussed in class, he gives the student an opportunity to explain it and provide a short example using it.

“Making sure students process the output of generative AI is a very important step,” Bush says. “No one should ever be submitting AI-generated work that they have not read and approved or corrected.”
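The discipline Bush describes—decompose the problem, generate code for each small segment, then verify it yourself—can be sketched as follows. The word-statistics problem and the tests are invented examples, not his course material:

```python
# Hypothetical example: suppose an AI assistant was prompted, segment by
# segment, to write helpers for a word-statistics program. The student's
# job is to read each helper and confirm it works with tests of their own.

import string

def tokenize(text):
    """Split text into lowercase words, stripping punctuation."""
    cleaned = text.translate(str.maketrans("", "", string.punctuation))
    return cleaned.lower().split()

def word_counts(words):
    """Count occurrences of each word."""
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

# Verify each AI-generated segment before trusting it.
words = tokenize("The cat saw the cat.")
assert words == ["the", "cat", "saw", "the", "cat"]
assert word_counts(words) == {"the": 2, "cat": 2, "saw": 1}
print("all segments verified")
```

The verification step, not the generation step, is where the learning happens—which is exactly the requirement Bush sets for his students.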

A significant advantage to using generative AI, says Bush, is that it can take care of mundane programming tasks, freeing up time for higher-learning work. “They don’t have to worry about programming language minutiae such as ‘Do I use square brackets or do I use parentheses in this situation?’ There’s less burden around memorization, which I’m loving because now we get to focus on the higher level of critical-thinking skills. We’re hoping that then propagates to the rest of our curriculum and we can move to some of the skills that we wouldn’t teach until second or third course, which in turn means we get to teach more advanced stuff in the higher-level courses.”

Generative AI tools are being used in industry, and Bush says he’s seen various sources that report it saves about 25 percent of a person’s time by handling the little things. “That tells you that 75 percent of the time you’re using critical-thinking skills, and that’s what we really care about,” says Bush. “We get to focus on human skills.”

Teaching and Tools in Rehab Sciences

Moravian’s School of Rehabilitation Sciences is on the leading edge among health sciences programs when it comes to using generative AI in its classes. When Jay Scifers, associate provost and dean of the College of Health, and David Wilkenfeld, assistant professor of athletic training, have given presentations on AI to educators in athletic training or health sciences, they’ve discovered that most educators haven’t begun to explore the possibilities of this technology. At Moravian, faculty have developed many ways to use it.

Students in rehab sciences analyze lots of case studies, and faculty will ask ChatGPT to develop cases that they then modify according to their teaching goals. They will also prompt ChatGPT with patient symptoms to get a differential diagnosis. “It comes up with things I wouldn’t have thought about,” says Scifers. In the classroom, students think through the case and develop their own differential diagnosis.

“From a patient-care standpoint, I used to have students journal therapeutic exercise examples for a specific condition,” says Wilkenfeld. “I’d ask them to brain-dump for five minutes on modifications for a single-leg hop, for example, so that they start to broaden their scope beyond the cookie-cutter rehabilitation plan. ChatGPT or Gemini or any of the other gen AI tools can be very helpful in that task.”

In rehabilitation classes, the conditions students are most likely to see are covered in depth, which means not every situation gets class time. When they’re out in the field, students may see a patient who’s had a surgery they’ve never seen before. “They can go to AI and ask it to generate a rehabilitation program for that condition,” says Scifers. “But they need to be able to vet it—to go through that program and say, this makes sense, this seems inaccurate, or I need to expand upon this.

“We know generative AI is being used by clinicians,” adds Scifers. “So how do you use it responsibly? How do you vet the information it’s giving you as being accurate and up to date? In medicine, things change very quickly. So we want to show students how they can use it, while emphasizing that they don’t just use it blindly.”

“An example of that is clinical documentation,” says Wilkenfeld. “It’s important for students to understand how to properly document what they’re doing on a daily basis, especially for professions that get reimbursed through third-party payers like medical insurance. Often insurance is denied because of improper documentation. It’s important for clinicians to know how to properly document and know what third-party payers are looking for, and from a litigation standpoint, how to protect yourself as a provider. So what we do with our students is provide them generative-AI-created clinical notes and have them go through them as though they were an auditor and identify where AI messed up.” Students must use what they’ve learned to determine what’s accurate and what’s not; they also see that generative AI makes mistakes.

“Again, because of the speed at which medical knowledge is changing, almost within five years of leaving Moravian what they’ve learned in a lot of areas is obsolete,” Scifers emphasizes. “So they need to be careful—we’re teaching them to be critical thinkers and problem solvers and to be able to apply basic knowledge to create new knowledge and advanced practice.”

Students work on an assignment in Chris Hassay’s writing class.

Humans vs. Generative AI

How do humans match up with generative AI? Claude, a generative AI tool like ChatGPT, provided the following lists.

Here are 10 tasks that humans can generally perform better than current generative AI:

1. Empathizing and providing emotional support
2. Making nuanced ethical judgments in complex situations
3. Interpreting subtle social cues and context
4. Engaging in original scientific research and theorizing
5. Creating genuinely novel art concepts and movements
6. Adapting quickly to unexpected real-world physical scenarios
7. Understanding and applying commonsense reasoning
8. Forming deep, meaningful personal relationships
9. Making decisions that require integrating varied life experiences
10. Engaging in open-ended strategic planning for complex organizations

Here are 10 things generative AI can potentially do better than humans:

1. Process and analyze vast amounts of data quickly
2. Generate large volumes of content in a short time
3. Perform repetitive tasks without fatigue
4. Identify subtle patterns in complex datasets
5. Produce variations on a theme or style
6. Translate between multiple languages simultaneously
7. Generate realistic images based on text descriptions
8. Compose music in various styles and genres
9. Provide 24/7 customer service at scale
10. Optimize complex systems with many variables

“It’s important to note that while AI excels in these areas, human creativity, emotional intelligence, and complex reasoning are still unmatched in many domains. The ideal approach is often to combine AI capabilities with human insight and oversight,” adds Claude.

A student analyzes the code that GitHub Copilot has produced.

Generative AI can fulfill many tasks and goals for students and faculty. It can spark creativity, help improve writing, strengthen critical thinking, and accelerate learning through efficiencies. But its use raises some thorny issues.

Academic Integrity and Plagiarism

Among the biggest concerns is that students will ask generative AI to write an essay or paper based on their assignment, cut and paste the entire AI piece into a Google Doc, and hand it in as original. Several faculty, including those who allow students to use gen AI, have seen work that appears not to be student-authored.

The usual giveaway is writing that is glaringly more advanced than anything the student has previously submitted. Detection software exists, but it is notorious for false positives.

In general, faculty who suspect that a paper has been written by ChatGPT or another gen AI tool will privately meet with the student author to review the work and its origin. 

“I emphasize to students that there is an MLA and University of Chicago method for citing generative AI. So if you’re using ChatGPT in your paper, you have to cite it just like any other source. I encourage students to go to primary sources, but if they’re citing a gen AI and doing so correctly, I don’t have a problem with that,” says Koscinski. “If I suspect that a student has turned in a fully ChatGPT-generated assignment as their own, I give them a chance to out themselves,” he says. “I then remind them of the academic honesty policies.”

Hassay doesn’t agree that faculty can be certain a student has turned in an AI-generated paper based on a dramatic change in voice and writing skill. “If everything a student has submitted was generated through ChatGPT, you wouldn’t know that student’s voice,” he says. “Also, it’s difficult for us to make those sorts of judgment calls because one part of this entire experience is that we’re hoping the writer’s voice is growing and pushing and changing and evolving. Where do we draw the line between authentic growth versus AI?”

Jane Berger, associate professor of history, finds it challenging to address a student who may or may not have turned in a ChatGPT-generated paper. “I have gone to a bunch of workshops because I want to know what to do. We’ve been advised to have the student come in and talk through the paper—their thinking and argument—to see if this student can articulate responses. But you can run into a second-language learner or someone who’s intimidated and isn’t as adept at describing things. And there’s room for bias in terms of professors thinking this student is probably a good writer versus another whom they assume is not very good. It’s really complicated. You have to be really cautious if you want to avoid accusing someone of something that they have not done.”

Misinformation and Bias

At the bottom of a reply from any one of the generative AI tools is a disclaimer:

“ChatGPT can make mistakes. Check important info.”

“Claude can make mistakes. Please double-check responses.”

And from Google’s Gemini: “As you try Gemini, please remember: Gemini will not always get it right. Gemini may give inaccurate or offensive responses.”

Perhaps the most important lesson students can apply as they work with any generative AI tool is that it’s not to be wholly trusted. Moravian faculty who allow the use of gen AI stress this point right from the start.

That sounds like a negative, but viewed from another perspective, it compels students to use and grow their critical-thinking skills as they analyze what ChatGPT gives them, searching for mistakes, statements that don’t make sense, or code that doesn’t work. And every source that gen AI provides must be checked to confirm that it exists.

What about bias? Skeptics of generative AI say it may have acquired biases during training. “It’s using data that has been created by humans, and we all naturally have a bias toward and against certain things, so its output could potentially be biased,” Ziegenfuss says. But he argues, “Everything’s biased. I think it all comes down to the idea that whatever it puts out, we have to interact with it and be critical about what it gives us.”

Supplanting Learning

Berger doesn’t encourage the use of generative AI in any of her classes. “There might be good uses for it, and I’m interested in learning about those possibilities, but I don’t know them yet. I’ve attended all the workshops we’ve had, and I understand why there’s an enthusiasm for us to think about how to use it. I also think it’s important for us to think about where not to use it. And for me, in the 100-level history classroom, I think that’s a good place—at least right now—not to use it.”

Berger explains that the objective of all 100-level courses in history is to help students develop strategies for analyzing various types of primary sources: photographs, political cartoons, letters, newspaper articles, and so forth. “We are trying to help them develop their skills of reading between the lines and interpreting and thinking about context and asking the sources questions, and then taking that analysis and putting it into an essay that is based on a thesis that they come up with. And then they use the evidence from the sources to substantiate their thesis. 

“They need to analyze the sources themselves,” Berger continues. “They need to come up with their own thesis statement, and then organize their ideas to prove their argument by referencing the data, the evidence, their analysis of their sources. And regardless of how fantastic AI becomes, I still will always want every Moravian student to be able to do that,” Berger says. “If they can’t do that, then they will be, I fear, consumers of ideas but not able to effectively compete as producers of ideas.”

Kara Mosovsky, associate professor of biology, who’s just begun delving into gen AI, wonders if it’s a shortcut. “Learning is hard,” she says. “It’s a challenge. It hurts the brain. Can you learn as effectively through AI?” She points out, too, that good writing comes from practicing the skill, not having a machine do it for you.

Marchand, who avidly supports using generative AI in her classrooms, is also concerned that students will give ChatGPT prompts that enable it to write the answers to their assignments, or they will become so dependent on AI that they don’t use other sources of information to do their work.

Job Erasure

In December 2023, Forrester, a leading global research and advisory firm, released a forecast of the impact generative AI would have on jobs and workers in the United States. The firm projected that by 2030, 1.5 percent of jobs (2.4 million) will be lost to gen AI, while 6.9 percent (11.08 million) will be influenced by it, meaning they will change as the technology is integrated into the work. “Workers should be more focused on how to leverage the technology than how to compete with it,” Forrester concludes.

In a podcast featuring J. P. Gownder, vice president and principal analyst at Forrester, and Michael O’Grady, principal analyst, Gownder says that gen AI will most likely influence the jobs of college-educated workers because the type of work they do is most closely related to the capabilities of a gen AI tool. “If you’re doing anything written, if you’re doing anything mathematical or [requiring] memorization . . . science, critical thinking, then you’re probably going to find these tools very useful,” says O’Grady.

About creative work, Gownder says human creativity far surpasses the quality of creative content produced by AI tools. “You can use them to mock up some ideas, but ultimately it will be a human that will turn it into something creative.”

He projects that the greatest impact of generative AI will be felt in the legal profession, IT, and other professional service industries. Several Moravian faculty members agree with the Forrester experts. 

“I tell my accounting students, AI will not replace you. A person that knows AI and how to use it will replace you.”

—Mark Koscinski, associate professor of practice in accounting

“I tell my accounting students, AI will not replace you,” says Koscinski. “A person that knows AI and how to use it will replace you. I have been in accounting for 40 years, and every year there is the new technology that will put accountants out of business. First it was the calculator, then computers, then laptops. And what happens is technology becomes a force multiplier—technology efficiently used generates jobs, different jobs.

“Also, companies do not hire computers, they hire people. At the end of the day, you cannot connect your client to a computer. Your client wishes to speak to someone to understand the nuance of their business, to be an advisor to their business. But the job is going to require different skills than what we have, and we are changing the curriculum content here to make sure that students get these skills.”

“I think that everyone in the workplace will need to understand what AI does and how to use it so they can apply it appropriately in a professional setting and based on a company’s policies,” says Marchand. “It can be very good at scheduling, administrative work, scanning documents, and looking for themes or trends. Early use cases are in sales administration, where a salesperson may not want to enter data on a customer but where they need to look at trends in customer purchasing and determine if they should continue to call on a customer. In the legal profession, AI can be helpful in reviewing documents and identifying answers to a question or a pattern, so paralegals may use it. AI has many uses in healthcare, supply chain, and where patterns and trends are helpful for faster decision-making.”

“I think that AIs will take the place of many tasks, and this will overall be a good thing,” says Bush. “We need to make sure students are learning the ‘human skills’ part of their discipline.”

As for creative jobs, generative AI can produce images and illustrations, and around the world, people fear that it will eliminate jobs for graphic designers, illustrators, photographers, and fine artists.

“In photography, the only constant is innovation,” says adjunct professor Luke Wynne. “From daguerreotypes to wet-plate collodion to paper negatives to film and now digital, photography has been adaptable. AI will have a very impactful change; how it affects the landscape of jobs will probably be profound. But when the horse and buggy gave way to the automobile, jobs multiplied, and I imagine the same will happen in the AI universe of photography.”

“AI does pose a threat, but mainly to those who refuse to see it as a tool,” says Professor of Art Angela Fraleigh. “Those in the commercial realm have always employed new technologies to further their creative output. In the fine art realm, maybe it’s naive, but I believe there will always be a hunger for images, objects, and art created by human hands.”

In Ben Coleman’s classes, students use GitHub Copilot to build computer programs.

Seventy-nine of Moravian’s 366 faculty members (152 full-time and 214 part-time) responded to a recent brief survey about their use of generative AI in their courses and their thoughts on its impact on learning.

Which of the following best describes your position on AI use for your students?

24.4% I permit AI to be used freely, but students must acknowledge its use.
16.7% I permit AI to be used only as a supportive tool for assignments.
14.1% I do not permit use of AI.
8.0% I don’t permit use of AI for most assignments.
28.8% Other: Individual comments

Do you think long-term use of AI will impact students’ critical thinking skills?

27.8% It will improve critical thinking.
17.7% It will impair critical thinking.
13.9% No, I think it will be the dumbing down of students and will make it easier for those who cannot write to turn in good papers.
6.0% It won’t have an effect on critical thinking.
35.6% Other: Individual comments

Do you think long-term use of AI will impact students’ creativity?

34.2% It will improve creativity.
27.9% It will impair creativity.
13.9% It won’t have an effect on creativity.
24.0% Other: Individual comments

Mary Jo Rosania-Harvie encourages her students to use ChatGPT for research and idea generation.

For many, it feels as though this technology has suddenly and unexpectedly burst through the gates of higher education. So, while it has been adopted by numerous faculty, others are working toward a fuller understanding of the ways gen AI enhances or perhaps detracts from their students’ learning. Over this next year, they will get lots of help from a group of colleagues.

Moravian University has a history of being on the leading edge of education. The institution was born out of that drive. And it’s why the university has assembled a team to participate in the inaugural Institute on AI, Pedagogy, and the Curriculum, sponsored by the American Association of Colleges and Universities (AAC&U). Moravian’s AI team comprises Scifers, Wilkenfeld, Hassay, Professor of Computer Science Ben Coleman, Provost Carol Traupman-Carr, and Public Services and Information Literacy Librarian Rayah Levy.

I think that AI will take the place of many tasks, and this will overall be a good thing. We need to make sure students are learning the ‘human skills’ part of their discipline.”

—Jeff Bush, associate professor of computer science

The team submitted a proposal in the spring of 2024 and was accepted into the institute, joining more than 130 groups across the country. Led by experts in generative AI, organizational change, pedagogical practice, and curricular redesign, the institute launched in September. Teams will engage in webinars, virtual events, mentorship, and interactions with groups from other institutions. “We will create an action plan during the early part of the institute,” says Coleman, “while also beginning conversations with various groups on campus: academic affairs, student support, the library, and so forth. We hope to partner with Writing Across Moravian and the Teaching and Learning Center to provide May workshops, and then work with all our partners on campus to deliver a variety of programming during the 2025–26 academic year.

“We are not aiming to dictate a policy or process around using generative AI,” Coleman adds. “There will always be people who don’t want to touch it. We very much respect that. You should still understand AI and what’s happening around it so that you can effectively manage it. Helping faculty understand and think through all of the issues—that’s our broad goal.”

And that goal is welcome.

Mosovsky may have questions about the role generative AI can play in education, but she wants to explore possible applications. As chair of the biology department, she plans to suggest that her colleagues all try it out in some way—incorporate it into an assignment, create an in-class activity, design an assignment. “Then we come back together and share what we did and how it worked,” she says. “We need to stay on top of it, and I want us to stay at the forefront of how it’s being used in the classroom.”

Berger may not yet use generative AI in her classes, but she is eager to learn what her colleagues will share from the institute. “I am open to hearing how other people are using it. And I will use it if I feel it enhances student learning,” Berger says. “But I want to feel that Moravian is both identifying itself as poised to make the most of generative AI and ready to say, nope, we’re not going to use it in this instance because we still want students to learn these skills that make them marketable employees down the road. We need to have the flexibility to understand that in some places in the institution it’s an asset, and in other places it’s a threat to what we do.”

Questions, opinions, and curiosity surround the role of generative AI at Moravian. What we know for certain is that it has arrived, and the educational landscape will continue to evolve as it has at Moravian for centuries.

Students use GitHub Copilot to assist them in writing a computer program.