Season 12 | Episode 4

AI on the Front Lines: Creativity, Industry, and the Classroom

Mar 3, 2026 | Season 12

“The next time you go on a plane, have a look around you: all those overhead compartments on an Airbus or a Boeing have been made with human and machine creative collaboration working on those designs. So when I think about product, really what I’m talking about in the book is going beyond what just we as humans and teams can do on our own, to bring in a new collaborator, a new partner with AI.”
James Taylor

Episode Transcription

AI on the Front Lines: Creativity, Industry, and the Classroom with James Taylor

Matthew Worwood: In this episode, we step into the world of industry to explore AI and creativity from a professional perspective. In our conversation with James Taylor, we examine how AI is reshaping creative work in real time, how value is being defined, and what challenges and opportunities are emerging for those on the front lines.

Matthew Worwood: But then we turn back to the classroom and ask, what does this mean for education? If industry is rethinking process and product in the age of AI, do we need to reconsider how closely we try to mimic real-world experiences, especially in how much weight we place on the final outcome of an assignment?

Hello everyone. My name is Dr. Matthew Worwood, and my name is Dr. Cyndi Burnett. This is the Fueling Creativity in Education Podcast. On this podcast, we’ll be talking about various creativity topics and how they relate to the field of education. We’ll be talking with scholars, educators, and resident experts about their work and the challenges they face, and exploring new perspectives on creativity, all with the goal of helping fuel a richer and more informed discussion that provides teachers, administrators, and emerging scholars with the information they need to infuse creativity into teaching and learning.

So let’s begin.

Cyndi Burnett: James is an award-winning keynote speaker and internationally recognized authority on creativity, innovation, and artificial intelligence.

Cyndi Burnett: He started his career managing high-profile rock stars and has since become a global thought leader in business creativity and AI-driven innovation. James is on a mission to help individuals and organizations unlock their creative potential, accelerate innovation, and build a sustainable future.

Cyndi Burnett: Believing that the greatest competitive advantage comes from creative collaboration between humans and technology, he champions strategies to future-proof businesses in this age of disruption. Now, James, I want to let our audience know that this isn’t our first conversation. Several years ago you had me on your podcast, which is called the Super Creativity Podcast, and we had a great talk on creativity and education, which we will link in the show notes.

Cyndi Burnett: But over the years, you have interviewed over 750 of the world’s creative minds and just released a book based on your findings called Super Creativity: Accelerating Innovation in the Age of AI. So today we want to couch this perspective on the future of the business world within our educational system.

Cyndi Burnett: So first of all, let’s start with welcome to the show.

James Taylor: Thank you for having me as a guest.

Cyndi Burnett: and we’d love for you to start by telling us how you define super creativity.

James Taylor: Yeah, so super creativity is really the augmentation of our creativity by collaborating more deeply with other people and machines.

James Taylor: So in the book I talk about different types of this way of working, this super creativity. I talk about human-plus-human creativity, which is working with other people. But increasingly, and the thing I speak about a lot, there’s this idea of human-plus-machine, or human-plus-AI, creativity, although it’s not just AI; we have robotics, and then we have other things as well.

James Taylor: So the book is really an exploration of this blend of creativity and collaboration, whoever you’re actually collaborating with.

Matthew Worwood: And just to pick this up, in case any of our listeners are making this connection, is there a relationship with Ray Kurzweil’s Singularity, where we’re actually connecting?

Matthew Worwood: Are you thinking that far into the future, or are you thinking more about, primarily, dialogue or communicating via text?

James Taylor: Yeah, so most of the clients that bring me in, probably the vast majority, are in the corporate world. So the large tech companies, the large brands that you would know.

James Taylor: But actually a part of what I do as well is speaking to administrators within the education world. And what I find, regardless of who those audiences are that I’m speaking to, is that I’m not a futurist. I don’t wanna go way out there. I don’t find that’s so useful for my audience anyway.

James Taylor: So I tend to focus a little bit more on the here and now, maybe just the next two to three years out. Different industries are going at different speeds in terms of their adoption of things like artificial intelligence, for example, so we can all learn from other industries. But I wouldn’t class myself as a tech bro.

James Taylor: Let’s put it that way. I mean, I’m in Austin, Texas today, the land of the tech bros. And yesterday I traveled by autonomous car to a restaurant. So I love all that; that’s all fascinating. But really what I’m very keen to get back to is the human part within this.

James Taylor: And that human creativity. So I love the idea that technology can actually augment that and help us find the deepest creative potential within each of us.

Cyndi Burnett: So you frame your thinking around eight Ps, and many of these Ps we have talked about on the show in depth, such as people, process, and personality.

Cyndi Burnett: But there are two that I really wanna cover today, and the first one is around creative products. Because one of the things that Matt and I talk a lot about on the show is, with AI, how much should AI be helping us with our creative products? What is the best use of it, and what is the most ethical use of it?

Cyndi Burnett: So I’d just like to start this conversation there and see where it takes us.

James Taylor: Yeah, let’s maybe ground it in a real-world example. We were actually talking about Italy before we started recording, and they just discovered in Pompeii, in Italy, a 2,000-year-old fresco of what may be one of the first pizzas in the world.

James Taylor: It was a fresco of a pizza, and it didn’t have tomato and mozzarella on it. It did have something that suspiciously looked like pineapple. So maybe the first pizza was Hawaiian, but don’t tell the Italians; they won’t be very happy about that. Just two days ago I was speaking at a thing called Baked Tech, which is one of the biggest conferences for the production baking industry. So when you go to a store and buy baked products, those are all the people that were at this event. And I was telling this story about this 2,000-year-old fresco, and then I was contrasting it with where we’re seeing this idea of creative product come in today.

James Taylor: You know, back then that was a creative product. It was a new product being created in the market. So today, to bring the ethics back in, you look at something like cheese. You know, cheese takes 700 liters of water, I think it is, to produce one pound.

James Taylor: It’s a very inefficient practice to produce dairy-based cheeses. It takes 500 times less water to produce plant-based cheeses. But the problem with plant-based cheeses is they lack the stretchiness, the stringiness, you know, when you take pizza out and it stretches like that, and that great browning capacity.

James Taylor: And that’s because they lack functional proteins like casein. So, to give you an example of this idea of product and super creativity in product, one really interesting team came along, based in California, called Climax Foods. And I talk about this in the book. Their approach to cheese is more about data than it is about dairy.

James Taylor: So what they did is they focused on how they could use human creativity plus artificial intelligence and machine learning to create new types of plant-based cheeses. And so what they would do is take the 300,000 different varieties of plants that exist out there and look at all the combinations and permutations in order to make these new plant-based ingredients, these new types of products.

James Taylor: Now if you think about that, it’s a perfect example. It would take a human innovation lab within food manufacturing decades, hundreds of years, to go through all those different permutations.

James Taylor: So as humans, we are creators and makers of things: products and services, companies and countries. In the past, it was always the case that ideas had to come from us, the humans. But today we can go to generative AI, for example, and say, this is the idea I have for this product, and it will do the first pass.

James Taylor: It will create thousands of different designs based upon the criteria that you select. But then, importantly, the human steps back in; we review all of these and decide which ones we wanna prototype and test. Now the final product could be more creative, more innovative, more sustainable than what a human could have created on their own.

James Taylor: So human-plus-machine creative collaboration, super collaboration. That’s an example. And today that company, Climax Foods, is developing all these new products coming out into the market, which have this combination of human and machine creativity in how they come together.

James Taylor: And so a lot of products you buy, like if you go to France and buy Babybel products, many of these have been created using this. In fact, the next time you go on a plane, have a look around you: all those overhead compartments on an Airbus or a Boeing have been made with human and machine creative collaboration working on those designs.

James Taylor: So when I think about product, really what I’m talking about in the book is going beyond what just we as humans and teams can do on our own, to bring in a new collaborator, a new partner with AI.

Matthew Worwood: I’ve got a couple of follow-ups. I’d love to talk a little bit about some of the research that has concerns about the limitations of AI to generate truly original ideas. But before we do that, and maybe we won’t get too much into that, putting this into a classroom: one of the things that we’ve been talking a lot about on the show, and I think this speaks to what you were talking about, is that we could, in essence, automate assignments.

Matthew Worwood: If I’m a student, a learner, I could automate some of my thinking. I’ve been given a writing assignment, for example; I have to write 12 pages, and it has to be on this topic, this theme. And I’m gonna go to AI and automate my thinking of what I may write about, and AI is gonna generate the outline for me.

Matthew Worwood: And then what I’m gonna do is go about and, hopefully not using AI, write my essay based on that outline. And I certainly feel that some students see that as kind of acceptable behavior, ’cause, hey, I’m writing the essay.

Matthew Worwood: What I would say and argue is that augmentation is more appropriate: you come up with the idea, you come up with the theme you take, and obviously that’s gonna be informed very much by your existing knowledge of the world, the connections that you are making to the topic. But AI can assist you in perhaps expanding those connections, providing you with some additional information.

Matthew Worwood: And so maybe you create the outline and then have AI review it. And I think you referenced an example of creating a prototype: the outline could be an initial prototype, and now there’s the back-and-forth exchange to polish that outline further.

Matthew Worwood: What’s really important is that capacity to problem-find and then be in that position to validate those ideas, to make those choices. Do you think that there is a danger, both in... you know what, maybe I’ll just ask about industry. Do you think there is a danger over the next few years, as we’re in transition, that we could actually do a little bit too much automation, which is like that first example, as opposed to doing augmentation, where we’re actually driving the ship?

James Taylor: Yeah, so this is a key challenge that’s coming up in lots of industries today, and I’ll give you one example. I have a client, Medtronic; they’re in the robotic surgery world. And you probably have people that listen to this show who are teaching in healthcare and those areas.

James Taylor: And what they’re finding is, as amazing as the robotic surgery tools can be, and they allow you to do much more complex, much more detailed procedures, some surgeons are getting a little bit of atrophy in their thinking. And what starts to not go so well, and you probably see this in students as well, is the critical thinking skills.

James Taylor: So I’m a huge proponent of the idea that the four most important skills for this new age are gonna be creativity, collaboration, communication, and critical thinking. Creativity, I know you talk about a lot within the show.

James Taylor: But as AI is starting to come into things, I think the critical thinking skill is gonna become even more important as we go on. I got involved in online learning and e-learning when I first moved to California, in the early days of the MOOCs, the Courseras and all of these, and there was so much optimism about how these technologies could unlock potential in students and also help teachers as well.

James Taylor: And I think some of that has not been borne out to be true. I think there’s a danger there. So for us, in your case as educators, and I come from a family of educators, of teachers, even though I speak mostly to a corporate audience, I’m kind of secretly an educator, although I don’t use that word with them.

James Taylor: What I often think about is trying to find that blend you were talking about, the augmentation there, Matthew. So an interesting experiment for educators, I think, is one that’s happening in London, where there’s a school where, in the morning, the students are being taught by artificial intelligence, using AI in their lesson plans.

James Taylor: And in the afternoon they’re being taught by a human teacher. And that teacher is really digging into what some of the things you’ve learned that day mean, working with that material to develop confidence, to develop those collaboration, critical thinking, and creativity skills.

James Taylor: And so I think the jury is out at the moment in terms of where that balance is gonna be, and it’s gonna be different for different things. If I’m speaking to someone in the military, for example, it’s very different, because they already have what they call loyal wingmen, which is a way that they work with artificial intelligence. In other industries it’s different again.

James Taylor: I just spoke last week for Gucci, the clothing and luxury brand company, and those luxury brands are much more concerned with how to elevate the role of the person working in that store to help you choose the perfect makeup or the perfect clothing for you.

James Taylor: So there it’s really much more about the augmentation. But I think it’s just gonna depend on the industry; I don’t think there’s gonna be one right way to do it. And you’ll know much more about this world than I do, but I’ve been seeing some research recently. You know, five or ten years ago schools were really pushing iPads, getting everyone those devices in the school.

James Taylor: And I believe there’s some research coming out just now asking whether that is necessarily the best way of teaching. Are we really getting the value from using some of these tools? And I think AI will just be another version of that. So it’s gonna be about trying to find that balance.

Matthew Worwood: And I do think that’s a critical point. Just to pause for a moment, because we spoke about this on the show, but that’s one of the challenges. And I know I’m being a little bit cynical, but we’ve continued to face these technological disruptions in education, where sometimes we have technology that isn’t necessarily built for the classroom but in essence penetrates those four walls, and people are using it, both teachers and students.

Matthew Worwood: One of the challenges that I see right now, as we’re transitioning, is that people have to understand that there is a difference between augmentation and automation. And we’re gonna have to start making decisions about which parts of our work we feel we can automate, whether we can trust it enough to automate, and what we might augment.

Matthew Worwood: I think you’re bringing up a really good point that it might vary by domain and context. But in the classroom environment, as we navigate this wave, I think teachers need to keep thinking about those learning objectives. What are those learning objectives? We never want to be in a situation where we are automating learning.

Matthew Worwood: And we’ve also gotta be very careful about what we are augmenting within our learning experience, to make sure that it is not taking away the capacity for students to reach that learning objective. And I think so long as we do that and put it at the front end of our work, then we can continue to move forward.

Matthew Worwood: As you referenced with the iPad, we’re in an experiment of how AI works, not just in the classroom, but in our lives.

Cyndi Burnett: So I have a question about original work based on what you were just discussing. If students use AI to generate essays, art, and code, we don’t want students doing that, right? We don’t want them to automate their essays or automate their artwork. But what does original work mean now, in the age of AI? How do we look at something and say, this is original? We had a guest on the show, Edward Clapp, who said we should be looking at the biography of an idea.

Cyndi Burnett: So is that true with AI? First there’s this idea, and now it evolves over time into something else. What does that look like, the originality of an idea?

James Taylor: Okay, so my wife’s a lawyer, an intellectual property lawyer, so this is her world as well.

James Taylor: I know that certainly in the UK, and I’m not sure in terms of the US and state laws there, for an idea to be patentable, let’s say, a human has to have been involved in the element of making that thing. So you couldn’t have something where, let’s say with the kind of AI people are hearing about today, the AI purely creates something on its own, just using its training data; that is not patentable in some places.

James Taylor: So you can go into the legal side of it as well, but let’s come back to the human side, to that student. I think it’s an extremely nuanced thing. I think there should be an openness in sharing what tools you’re using to come up with things.

James Taylor: I don’t have a problem if, in my world, someone came to me and said, I’ve used an AI to write this pitch or this presentation, and then we have a conversation about that: well, why did that tool come up with this, as opposed to this other AI tool coming up with something else?

James Taylor: But you can even have fun with it. There are ways now you can say, I’ve got this thing I have to write: I go to five different AIs, to ChatGPT, to Gemini, to Copilot, and come up with five different versions of how to do it. And then what I would do, as the human, is synthesize that information and say, okay, now that I know these things, what is the story that I want to tell, and why do I want to tell it?

James Taylor: I think that’s using it as a tool. I’m a little bit less concerned about the originality because, from my work in music, we steal from other musicians, we learn from other things. And as long as the human comes back in, because I am imperfect as a human, I’ll put my imperfections into that thing, and that puts my elements on it.

James Taylor: So I am the originality. On the legal side, there have to be things in place so that people are recompensed for their work, their IP, and the originality of what they do. But if someone is coming to me with an idea for something, I’m almost not so concerned with how they’ve arrived at the idea.

James Taylor: Initially I wanna talk about the idea, and then maybe I would get into the workings to see if it’s been done before. This is the other thing that we have with AI: the hallucination side. In law cases happening all across America, lawyers have been given a hard time, rightly, because they’re quoting statutes that don’t exist.

James Taylor: Cases that don’t exist, because the AI is hallucinating. So I would worry, if I was a teacher, about a student coming to me and saying, this thing here is based upon this study here, and that study doesn’t exist; it’s been hallucinated, made up. That tracks back to what we were speaking about earlier, this idea of developing those critical thinking skills.

James Taylor: And I’ll give you a really simple example of where AI can actually help us with critical thinking skills. If we think about it, our brain is an amazing thing, but it is always looking for ways to save energy. So it does this through the use of heuristics, or rules of thumb.

James Taylor: So it says, I’ve seen this before, therefore the answer must be this. It’s, you know, Daniel Kahneman’s thinking-fast-and-slow type of work. Where we can use AI is to help us see through the weaknesses in our ideas, to come at things from a different perspective, and to stress-test our ideas as well.

James Taylor: So one of the concepts I talk about in the book is this idea of virtual or imaginary masterminds. I would say, put together five or six individuals that you would love to have as your ideal board of advisors, your ideal board of mentors. Anytime you work on something or write something, it could be a plan, it could be whatever the thing is, give it to your virtual board of advisors and see what questions they come back with about it.

James Taylor: Once again, the AI’s not gonna come up with the solution, but what it’s hopefully gonna do is help you see through some of your own cognitive biases, which we all have, that’s just part of being a human being, and it hopefully then makes our work better as a result.

James Taylor: And this is great, especially if you are working on your own and maybe don’t have the support of a great peer group around you. This is a brilliant thing to be able to do.

Matthew Worwood: You know, Cyndi, I’m seeing an immediate connection there to our recent interview with Marette Kovski, who talks a lot about AI use in education as a feedback tool. And I love the idea of it being able to catch our biases, but I think there is an opportunity for students to get in the habit of producing some content and being in a position to share that content with AI.

Matthew Worwood: And I think it’s a wonderful idea, James, that maybe they can generate five personalities and have their own review board respond to their work. Those five personalities could be built around ideas like, hey, I’m particularly weak at spelling.

Matthew Worwood: Or: here’s my background; make sure that you’re challenging me to expand beyond what I currently know. There are all these different ways we could assist students in using their creativity to determine the best ways to deploy AI within their learning experience.

Matthew Worwood: But I do wanna come back a little bit to the originality piece, and I actually have, if you don’t mind, a question for Cyndi, because you were talking a little bit about originality. I’m wondering your immediate thoughts on this, and then maybe, James, if you’ve got a consideration.

Matthew Worwood: But I’ve been wondering: so often in education we talk about the importance of giving students real-world experiences, and why they’ve got to do this because it’s gonna be like that in the real world. We’re so often trying to mimic the real world in education, but I’m just wondering if certain parts of AI are now requiring us to say, no, our priority is not the outcome.

Matthew Worwood: Our priority is the process. And the reason why I bring that up, James, is that in industry it would make sense to say, I don’t care where the idea comes from. But when you’re looking at it from a learning process, actually having a conversation about where the idea comes from is potentially where we uncover the learning that’s taken place.

Matthew Worwood: And I’m just starting to think we need to be very cautious about how much blending occurs between conversations around AI in industry and AI in education. So Cyndi, I throw it back at you with the originality, and then James, I dunno if you’ve got some thoughts about that.

James Taylor: I’ve got a thought on that, yeah.

Cyndi Burnett: It’s something I’m struggling with a lot, because even talking with teachers about student work and how they might use it, it varies, I think, like James said, based on discipline, based on what the assignment is, based on what the learning outcome is.

Cyndi Burnett: And I like this idea of augmentation versus automation, because it can help augment our thinking in a way, like when we’re writing a paper. And James had mentioned the research piece. When we were working on our book, which is coming out this summer,

Cyndi Burnett: I know for myself, we had these little research sections, and I would start with a deep dive on a large language model that looked at a specific area, and then I would use that as a springboard to find other articles. And that just kept branching out and out.

Cyndi Burnett: Something like that probably would’ve taken a month, but instead it took me a couple of days, and I used it as a springboard. I think that is an organic way to use it, without it doing the thinking for me. Actually, it probably required more thinking, because I had to look at each article and say, is this what I’m looking for?

Cyndi Burnett: Yes, go here. No, okay, what else? What is it that I’m looking for? So using it as a thought partner as you work through these tough challenges in education, and having students document that, I think that’s really an area we need to be looking at. Because in my experience working with teachers, they’re either like, no, I don’t wanna know anything about it.

Cyndi Burnett: I’m not bringing that in. Or they’re on the other side, which is like, yeah, this sounds great, so how do I actually do it? I think there’s a middle ground that we need to be playing with, one that really asks: how do we help our students maximize their work in the super creativity sort of way, so that it produces the most creative products while still representing the student’s creative potential?

James Taylor: Yeah, I love that. And as you were saying that, I was also thinking about the role of teachers, and I guess it’s gonna be different depending on whether it’s elementary or university or whatever level of teaching it is. But if I think about healthcare: at, let’s say, the local hospital where I live, 40% of the time that the nurses are spending on those wards is spent filling in forms. 40%.

James Taylor: And no nurse ever got into that profession to sit there filling in forms. So let AI do that stuff; get as much of it off your plate as possible. Going back to this idea of attribution, I think it’s an interesting one.

James Taylor: Should something be tagged as partly AI-generated when someone, say a teacher, has used AI in their work? There’s a little bit of a thing you have to watch for here, and this is stuff that came out from Harvard recently; I think you can read about it in maybe last month’s Harvard Business Review.

James Taylor: It’s called the competency penalty. The competency penalty says that professionals will often actively avoid using AI so that they don’t look less competent in their jobs. And this is borne out by data now. What they’re finding is that if someone uses artificial intelligence, and they say to a coworker or their boss that they’re using artificial intelligence, and that coworker is not a big user of AI themselves, they will be judged to be 18% less competent than someone who didn’t say that, even if it’s exactly the same piece of work.

James Taylor: Importantly, though, this is even more the case for female workers and workers aged over 40. This is a stat that blows your mind: if a woman uses artificial intelligence in her job, whatever that job is, and she takes that work to her boss, who is maybe a male non-adopting user of artificial intelligence.

James Taylor: She will be judged to be 39% less competent than a male doing exactly the same thing. So there is still the human part of this, the culture bit; we think it’s the tech, but a large part of it is the people and the culture. In a couple of months’ time I’m speaking for college and university business offices in the US,

James Taylor: and one of the things I’m trying to get across to them is that 80% of the reason AI initiatives in universities and organizations fail is not the technology; it’s the people, the process, the playbooks that they’re running in their heads. That kind of culture piece is what we have to deal with.

James Taylor: And that’s definitely the case within teaching.

Matthew Worwood: So persuasion has come up a little bit on the podcast before; it’s one of your Ps. Could you talk us through why persuasion is part of a creative skillset? And perhaps, do you feel it’s undervalued in schools, based on what you see in the workforce?

James Taylor: Yeah, the reason I put persuasion in there is ’cause I’ve seen it, and we’ve probably all seen this.

James Taylor: Someone with a great idea, maybe it’s you, with what you think is a great idea, and it falls down because you’re unable to persuade other people of its value. And this is the interesting thing that’s happening even with AI and other tools just now: they allow you to generate lots of ideas, and then you decide, okay, which of these ideas do I want to proceed with and take to other people for their feedback?

James Taylor: And if you don’t have a basic understanding of how to persuade other people of your idea’s value, then that is a problem. I don’t know about the US, but certainly in the UK we have stopped teaching some communication skills training, and we’ve definitely stopped teaching public speaking in many schools today.

James Taylor: And I think this is a failure, because so much of what gets an idea over the wall and makes an idea actually happen is your ability to go out and persuade others of its value. So in the book, I leave it right to the end, and I talk about how you can use artificial intelligence to help persuade other people of your idea.

James Taylor: Some of it is a little bit more on the future side of where we’re going just now. And I talked about that example earlier: before I ever step into the room with someone I have to persuade, I will use artificial intelligence to analyze that person, how they think, how they feel, how they make decisions, because that helps me be a better presenter of my ideas to them as well.

James Taylor: So that’s something very simple that you can do. Here’s another example. I was in Atlanta the other day, talking to someone who explained that he has created what’s called a digital twin, an AI version of his boss. Anytime he has to give a presentation to his boss, maybe to ask for additional resources, he runs the presentation through that digital twin first. He then gets feedback that says, okay, your boss really wants a little bit more data here, wants a little bit more supporting evidence, or is going to ask these questions.

James Taylor: You need to be prepared to answer those questions. So those are examples of using AI for persuasion. Now, the ethics side gets really interesting here, and I’ll give you an example of that. I was just speaking at a conference in Florida for the eyecare industry, for all the eyecare companies across America.

James Taylor: And they were talking about something that I thought, oh, this has a potential impact on the work of educators: the new generation of smart glasses, which look like Ray-Bans. They essentially have a heads-up display unit, like a fighter pilot would have. And if you connect that with your customer relationship management software, with facial recognition switched on, it means that

James Taylor: whoever I’m looking at, whatever I’m looking at, I’m immediately getting information about it. Now, think about that for a moment. Say I walk into a car showroom, and the salesperson is wearing these glasses, with facial recognition switched on and connected to the CRM.

James Taylor: It’s immediately telling them, oh, this person has bought this vehicle from us before. This person has been looking at our website, at these vehicles with this specification. This person has a higher credit rating than that person, so I’m going to speak to this person first. You can already start to see the ethical implications

James Taylor: of that. Now, in the US, at the federal level, you don’t have an AI act, you don’t have anything to protect against that. You have individual states like California pushing a little bit on this. In Europe, we have an AI Act.

James Taylor: So as these tools start to get deployed, I do think we need to think a bit more deeply about the ethical implications. Within some institutions, I think you almost need to have an AI board of advisors. It needs to get to that level, so that they’re thinking through not just the technical implications of deploying AI, but also the ethical implications.

James Taylor: What does this mean for us?

Cyndi Burnett: Well, James, this was a really interesting episode, and it’s certainly given us a lot to think about, which we will talk about on our debrief episode. For those of you listening, if you’d like to learn more about James’s work, you can explore his book, Super Creativity: Accelerating Innovation in the Age of AI, and visit the Super Creativity Podcast.

Cyndi Burnett: And if today’s episode sparked something for you, especially around rethinking product, persuasion, and what original work means in an AI-driven world, please share it with a colleague who is thinking seriously about the future of education. These are conversations all schools need to be having. So make sure you subscribe to the Fueling Creativity in Education Podcast and join our newsletter, where we continue to explore research, practical strategies, and tools you can bring directly into your classroom, your leadership team, or your research.

Cyndi Burnett: My name is Dr. Cyndi Burnett,

Matthew Worwood: and my name is Dr. Matthew Worwood.


How is AI changing creative work right now? And what does that mean for the way we teach and assess students?


In this episode, Dr. Matthew Worwood and Dr. Cyndi Burnett talk with James Taylor about what he is seeing on the front lines of business and innovation.

James shares his idea of “super creativity”: that humans and machines working together can create better ideas than either could alone. He offers real examples from industry and then helps us think through what those changes mean for schools.


Together, they discuss:


– The difference between using AI to automate work and using it to support thinking
– Whether students should use AI in the early stages of idea development
– What original work means when AI tools are widely available
– Why critical thinking may matter more than ever
– The role of persuasion and communication in bringing ideas to life
– The ethical questions educators cannot ignore


The conversation also explores an important tension: in business, the focus is often on the final product, while in education, the focus must remain on the learning process. As AI becomes more common, teachers may need to rethink what they assess and how they assess it.


If you are trying to make sense of AI without swinging to either extreme, this episode offers a thoughtful and balanced perspective.

About the Guest

James Taylor is an award-winning keynote speaker and internationally recognized expert on creativity, innovation, and artificial intelligence. He has interviewed more than 750 leading creative thinkers on his Super Creativity Podcast and works with global organizations to help them unlock innovation through human and machine collaboration.

His latest book, Super Creativity: Accelerating Innovation in the Age of AI, explores how individuals and organizations can thrive in a rapidly changing technological landscape.

Episode Debrief
