Courage in the Classroom: Why Vulnerability Is Key to Ethical AI Integration
Supporting faculty through the unknown with trust, not fear
“Fear,” Parker Palmer writes, “is what distances us from our colleagues, our students, ourselves” (Palmer, 2007, p. 36).
The rise of AI has brought with it both hope and hesitancy, curiosity and caution. At Lipscomb, we have heard it in faculty voices. Will I lose what makes me a good teacher? Will students still learn deeply? Will I know how to use these tools well, or at all? Most recently, a faculty member told me about developing an AI bot to engage with her students on course content instead of giving them a quiz. As she and the students went through the exercise with the bot, she began to ask herself, Will the students remember that I created this bot? Will they understand that I’m still a vital part of the learning process?
These are real questions, and they require real courage.
This year, our approach to supporting faculty in AI adoption has not focused on eliminating fear. Those of us leading these efforts understand that these feelings are real and valid. Instead, we are creating spaces where fear does not paralyze. Valuing vulnerability and courage is at the heart of this work. In our AI and Academic Integrity badge series, faculty explore complex issues like authorship, collaboration, and student agency. We do not pretend to have all the answers. Instead, we provide language, frameworks, and the trust that growth is a shared endeavor. We share ideas with one another about how to support students as they learn to use AI with integrity, focusing on our Guiding Principles for the use of AI.
In Creativity, Inc., Ed Catmull says, “There is a sweet spot between the known and the unknown where originality happens; the key is to be able to linger there without panicking” (Catmull & Wallace, 2014, p. 224). I’ve been thinking about this concept a lot as we have walked through our journey with AI this past year.
This idea also resonates with broader findings about the future of work. The World Economic Forum’s Future of Jobs Report 2025 highlights that two of the most critical skills for the next decade are resilience and agility (World Economic Forum, 2024). Both of these skills flourish not by avoiding uncertainty or discomfort, but by embracing it with courage. Or, as Catmull says, being able to linger in the space of the unknown without panicking.
At Lipscomb, our faculty are modeling this courage in remarkable ways. We started the AI Design Lab thinking it would be a time for experimenting with creating assignments and assessments. But once we began, many faculty members realized that they didn’t know enough to create the product on their own. They needed a space to be vulnerable, to say, “I don’t know how to do this,” and to struggle together. So faculty members worked as a team, sharing assignment and assessment ideas with one another, and found that their willingness to try opened up new engagement with one another and with students.
For example, the instructor who created the bot instead of giving a quiz asked her students, “When you do activities like this, do you still think you need a professor?” The students affirmed that they view faculty as essential, but what we’re seeing is that our place in the learning process may look different than it has in the past. Mentoring, guiding, and thinking together all become even more critical in this new environment.
This is what it means to linger together in the space between the known and the unknown. We do not need to panic. Instead, we need to be open with one another about what we don’t know and about what we are learning. In The Courage to Teach, Parker Palmer writes, “I remind myself that to teach is to create a space in which the community of truth is practiced—that I need to spend less time filling the space with data and my own thoughts and more time opening a space where students can have a conversation with the subject and with each other” (Palmer, 2007, p. 107). When we view ourselves as part of the learning community alongside our students and our colleagues, as the person whose role it is to create a space for truth seeking, we are able to stand together in that space with courage and curiosity.
Our journey with AI is, of course, a matter of technological adaptation, but it is more fundamentally a matter of human flourishing. In Lipscomb’s Fear to Flourishing framework, thriving in a time of AI requires creating learning environments marked by belonging, purpose, and growth. It means moving beyond technical skill-building to fostering communities where vulnerability and curiosity are understood as catalysts for transformation.
Faculty hospitality, an intentional willingness to create spaces where questions are welcomed, plays a crucial role here. Inviting students and faculty alike into shared risk-taking transforms fear into creativity. In classrooms where hospitality and vulnerability are practiced, uncertainty becomes a shared journey rather than an individual burden.
Ethical AI use does not begin with policy, although policy is important. It begins with people who have the courage to ask hard questions and the vulnerability to grow together.
References
Catmull, E., & Wallace, A. (2014). Creativity, Inc.: Overcoming the unseen forces that stand in the way of true inspiration. Random House.
Palmer, P. J. (2007). The courage to teach: Exploring the inner landscape of a teacher’s life (10th anniversary ed.). Jossey-Bass. (Original work published 1998)
World Economic Forum. (2024). Future of Jobs Report 2025. https://www.weforum.org/publications/the-future-of-jobs-report-2025/