Brent Warner 0:00
This is the HigherEdTech Podcast, Season Five, Episode Five: a look at AI tools in Fall '23.

Tim Van Norman 0:20
Welcome to today's HigherEdTech Podcast. I'm Tim Van Norman, the Instructional Technologist at Irvine Valley College.

Brent Warner 0:27
And I'm Brent Warner, Professor of ESL here at IVC. We both enjoy integrating technology into the classroom, which is what this show is all about.

Tim Van Norman 0:34
Welcome. We're glad you're here with us. So, anything new going on from the other side of the world?

Brent Warner 0:41
Oh yeah, I got a car, so that's exciting. It's temporary, because I'm only here for a short amount of time, but we really wanted to have our own car and do some traveling while we're working, so that's pretty fun. I also did a presentation for TESOL, the big ESL organization — their first presentation on artificial intelligence. That was really cool, and it was really well received. It was a pretty big crowd for a Tuesday night, something like 150 people, and it was recorded for later. A lot of people were opening up their thinking and trying to figure out how to do things. And I know you're dealing with a lot of that too, Tim, with the course you're building on campus. How's that going, by the way? Since we're talking AI.

Tim Van Norman 1:36
Yes, since we're talking AI — we've released a course for faculty, and I have a feeling some of the stuff we're going to talk about today is going to wind up getting added to the course. Some of it is from the course, and some of it is going to be added, which is really good. I was also just asked to do a presentation to classified staff in November on AI as well — using AI in the workplace. Basically, every time I turn around there's something going on with AI, and it's a great time to be in this business.

Brent Warner 2:19
Yeah, it's wild. Every time I've done a presentation, they're like, can you also add in some stuff for our classified staff — or however they label the non-teaching staff; there's different terminology around that. But it's a big part of it too, so it's pretty exciting to see all the different parts of the educational institution really trying to ask, okay, how do we deal with this? So today, Tim, we're breaking down some of the tools and some of the things we're looking at. It would be impossible to cover every tool — I think probably about 350 come out every single day, if not more. So we're just going to talk about some of the ones that are going to affect us in education in particular, in the classroom. But I think we should jump in and start looking at these categories.

Tim Van Norman 3:15
Excellent. So as you mentioned, we've got a lot of different categories, and we're not going to try to answer every question about every one — this is broad strokes. Some of this stuff, like the first one, large language generators — that's what really started all of this. A lot of people didn't even realize AI existed until a year ago, and then ChatGPT came out and, oh no, the world is changing. And there's a whole bunch of large language generators, not just ChatGPT.
Microsoft has come out with Bing, with it built in; there's Google Bard; and one of my favorites is Perplexity AI. The reason I like that one is that it will actually give you references for where it found the information — or where it couldn't find the information.

Brent Warner 4:05
So let's talk about that for a minute, because there's a few things going on here. One: Perplexity — I've seen a couple of these different ones where it's like, hey, ChatGPT is not citing its sources, and Bing kind of does, right? Sometimes. And again, all of these are changing, so this whole conversation has to be a snapshot of early October — probably even of this particular day, really. But a lot of those tools are not necessarily giving references, and a lot of teachers say, oh, well then we obviously cannot use it because it's not giving references. It's like, well, hold on a second, that's a different conversation — because, by the way, a lot of our students fail to give us references as well. But I think it's really important to look at these things and ask, okay, are they actually pulling resources? So here's what I did today. ChatGPT just recently, if you're using the paid version, announced a connection with Bing, so you can now ask it to search the web for today's information as well. So I started asking it to do research on something that's in the news in Japan — there's a big scandal going on here. I said, go search the Japanese websites, find the main information, then translate it and tell me what's going on in English. And it did an okay job — short summaries, when I was really looking for something more in depth, for it to give me a lot of information about what's going on. Then, because the scandal also involves the Japanese media basically covering up a sexual harassment scandal that's been massive in Japan for many, many years, I asked: how has the media been reflecting on itself? Has there been any conversation about whether the media is taking on any responsibility? And it said, oh, well, a little bit — there's some information here and there, there's an article that talks about this and an article that talks about that. So I said, okay, please give me the links to those articles. It said, here are the links to the articles, and gave me four. I'm like, okay, cool. I clicked on the first one — it's the Asahi Shimbun, a big newspaper here in Japan — and it took me to a 404 page. Then I clicked the next one: 404 page. Different newspaper, NHK: 404. So it's telling me there are links to these things, and not a single one of them actually worked. It's still doing that hallucination — that stuff about fake links and fake information — and confidently. I mean, can you imagine how confidently it says, here are four links, and you're like, you'd better not click on them, teach. But it was fascinating, because it's still kind of stuck in that. That's been an early conversation around ChatGPT and these tools: they hallucinate.
There's that false confidence — it will just give you an answer no matter what happens, right? It can't say, "I don't know," or "We couldn't find anything." It says, "Here it is," and there's nothing there — and I only know because I took the couple of extra steps to figure that out. So we want to be aware of this as we're using these tools, and we want to make sure our students are aware of it too, because it's a problem. It should be able to just say, hey, I don't have enough information, I can't find anything about that — it shouldn't hand you information that simply doesn't exist.

Tim Van Norman 7:37
Right, and that's what I found as well with Bard and Perplexity. I always felt that about 90% of the information was accurate, but the 10% sometimes completely changed everything. So it was a neat way to get started looking for things, but I could never rely on it for the final product. Yeah, I think that's fair, that makes sense. And I think that's really where we are in the world: with everything you do, you're going to have to pay attention to exactly what you're doing and exactly how it works. As I've tried some of the others out, you have to tweak it to make it sound the way you want it to.

Brent Warner 8:27
Yeah, there is this whole concept, and we'll get into it a little bit here. But as we wrap up this big one — the large language generators — I do want to clarify something as we step forward. A lot of the tools we're going to talk about today, Tim, the way that I see them — and I'll just use ChatGPT as an example — are just a paper-thin mask covering up a large language generator. A lot of them are basically, hey, we figured out this prompting, and we're going to launder your prompt right through to ChatGPT and then shoot the answer right back to you — and we're charging you a fee for doing that. And it's like, well, okay, but if you learn how to do it, you could just do it directly. Again, I'm using ChatGPT as an example; there are multiple ones of these. But a lot of these are not really their own services — they're just fine-tuning it and acting like it's its own thing. So just be aware of that. For some people that's really useful — they go, hey, I don't care, because I just want this one service out of it and that's easier for me. But other people might go, no, hold on a second, I can learn how to do all this myself and really manipulate it to my specific needs.

Tim Van Norman 9:36
Right, exactly.
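To make that "paper-thin mask" idea concrete, here is a minimal sketch of the wrapper pattern Brent is describing, assuming the OpenAI Python SDK (openai>=1.0) and a generic chat model. The "writing coach" prompt, the function name, and the scenario are hypothetical illustrations rather than any particular product — the point is simply that the entire "tool" can be a fixed system prompt plus a pass-through API call.

```python
# Hypothetical sketch of a wrapper "product": a canned prompt in front of a
# general-purpose chat model. Assumes the OpenAI Python SDK (openai>=1.0)
# and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CANNED_SYSTEM_PROMPT = (
    "You are a writing coach for college ESL students. "
    "Point out grammar issues and suggest clearer phrasing, "
    "but do not rewrite the essay for the student."
)

def essay_feedback_tool(student_text: str) -> str:
    """The entire 'tool': forward the user's text to the model behind a
    fixed system prompt and return whatever the model says."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": CANNED_SYSTEM_PROMPT},
            {"role": "user", "content": student_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(essay_feedback_tool("Yesterday I go to the library for study."))
```

Once you see the pattern, you can decide whether a paid wrapper is worth the convenience or whether writing your own prompt gives you more control.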
Brent Warner 9:38
Okay, so what's the next category?

Tim Van Norman 9:39
The next one we wanted to talk about is voice generation. A long time ago there was Cisco prompts, which is still very useful — I believe it's been around for, I don't even know how long, a long time. Basically, you type in something and a voice reads that text back to you. It's really useful if you're creating call trees or something like that: you generate all of the voice prompts, save them as little MP3s or whatever, provide them, and you're done. It's a really neat tool, and it sounds okay. I believe it's gotten better over the last couple of years — it does sound a lot better than it has in the past. But those are the types of things I would call the low end, because there are some really cool things you can do with voice generation.

Brent Warner 10:40
Yeah, so that's kind of like — you type in a few words, the things you want it to say, and it gives you choices: here's a bunch of default Google voices, or a bunch of default Microsoft voices or whatever, and it'll say it in different ways. Some of those sound better than others, so it's worth playing around with — but it's free and quick and easy, and that's the part a lot of people are going to like. I want to talk about the other end here, Tim. We were doing a little pre-show talking about this: ChatGPT just announced that starting in October they're opening up this voice functionality where you can talk to the app on the phone. You could always do it with the little voice microphone, but now it's really talking back and forth with you, quite clearly, and it's pretty amazing. I showed you a very short sample, and I'm going to try to record it a little more properly and maybe put it up in the show notes so people can see what it looks like — it didn't come through very well on Zoom. But it's amazing. You can just start having a natural conversation with it, it responds back to you, and it sounds like a person, even with hesitations — "or, like, I'm... okay" — so it's really fascinating to see how powerful this is and how quickly the technology is shifting.

Tim Van Norman 11:59
I loved using that as an example. I loved your concept of: I'm an ESL student, I want to act like I'm talking to a salesperson at a car dealership — and then saying, oh, that was too complicated, can you make it easier? It was nice to have it rephrase and come back with questions you could answer at a different level. And just that concept of being able to chat with somebody at any time, actually get immediate feedback, immediately hear what they're saying — and, by the way, read it. Read the transcript. That was the other part that was absolutely amazing: you saw a transcript on your phone of exactly what you were dealing with.

Brent Warner 12:49
Yeah, super cool. And then also — the part that is actually hard for me to get my head around, because I'm so trained on search engines — is that you can adjust course in the middle of working with these things. It's like, hey, make this easier — okay, let's do that right now — and all of a sudden you're on a different path. A lot of times with traditional search engines, you search for something, it doesn't come up with what you wanted, and you have to figure out a different way to search; you're kind of starting over every time. But this remembers what you've been doing, and it just shifts the way it's talking, the way a person would. If you say, hey, I don't understand what you're saying, can you make it easier? it'll go, oh yeah, sorry — I shouldn't have asked whether you want a compact sedan or a monster truck; it's, do you want a big car or a small car? And you're like, okay, that's great.
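A rough sketch of what that "remembering" looks like under the hood, again assuming the OpenAI Python SDK and a generic chat model: the app keeps the whole message history and resends it on every turn, so a follow-up like "say it more simply" is answered in context instead of starting a brand-new search. The car-dealership roleplay just mirrors the episode's example; everything specific here is an assumption for illustration.

```python
# Hypothetical sketch of multi-turn "course correction": each new request is
# appended to the running history, so the model answers in context.
# Assumes openai>=1.0 and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

messages = [
    {"role": "system",
     "content": "Roleplay as a car salesperson talking with an ESL student."},
]

def next_turn(user_text: str) -> str:
    """Send the full history plus the new user turn, then remember the reply."""
    messages.append({"role": "user", "content": user_text})
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
    ).choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    return reply

print(next_turn("Hi, I'm looking for a small car for commuting to college."))
print(next_turn("That was too complicated. Can you explain it more simply?"))
```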
So, really cool to be able to make those adjustments now. Tim, there's a couple more in here: HeyGen and ElevenLabs — have you looked at either of these? I've played with them, but you haven't seen them yet, so — okay, HeyGen is the one I showed you before, the one that translated all of my speaking into Chinese.

Tim Van Norman 13:59
Wow, that was amazing. When we recorded last time — that's part of what brought us to this idea for today — you said you had recorded it, and there you were, speaking in Chinese. Now, I don't know Chinese, so I can't tell you about the accuracy, but your lips were moving, and it was you speaking in Chinese — it sounded like it to me — with your voice and everything. It was amazing.

Brent Warner 14:31
Well, I can also verify it was great Chinese, because I sent it to some of my Chinese students, and they're like, oh my god — is this real? What's going on with this? One of them said, "I showed it to my husband and he can't believe you don't speak Chinese." So it was actually very strong. There were one or two little issues, they said — small things that were a bit confusing — but they said it was over 90% accurate: clean grammar, easy to understand, correct pronunciation. For those of you who have studied Chinese, you know pronunciation is a real difficult one. So anyway, with that one I can upload a video of myself, click a button, and for a couple of dollars it spins it around and outputs that same exact video of me — but instead of speaking English, I'm speaking Chinese. It was really cool.

And then this other one is ElevenLabs. This is the one I showed in earlier presentations back in spring — remember, Tim, when it was generating my voice and I was talking about the rules of past tense or something like that? I'll put links to videos of all of these in the show notes so people can see examples. With that one, I could upload a sample of my voice, and it could generate whatever I wanted it to say, speaking basically in my voice. And those are getting better too, with intonation. My English was a little bit flat five or six months ago when I first recorded that — it's like, "Hey folks, I'm going to talk a little bit about this," and it's fine, it's not bad, but if you listen closely there's a little bit of bounce in my voice that isn't quite there in the recording. I think it's getting better and better, though. So these voice generators are really cool. And if you start thinking about it in terms of, hey, I could be talking to my students about this thing, or sharing some of this information — I've already got it written out, but I don't really have time to sit down and do all this recording — then click, boom, get yourself a voice-generated version of it. You could put it up as a little podcast for your class or whatever else you've got going on. So there's lots of cool things going on with voice generation.

Tim Van Norman 16:32
Absolutely.
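For the simple "type in text, get an MP3" end of voice generation that Tim described at the top of this section, here is a minimal sketch. It uses the free gTTS library (Google's text-to-speech voices) as a stand-in, since it is not one of the specific tools named in the episode; the announcement text is just an example.

```python
# Minimal text-to-speech sketch: turn a short announcement into an MP3 you
# could drop into an LMS page, a call tree, or a mini class podcast.
# Uses the gTTS package (pip install gTTS) as a generic stand-in.
from gtts import gTTS

announcement = (
    "Welcome to week five. This week's draft is due Friday at five p.m. "
    "Please check the course page for the updated rubric."
)

tts = gTTS(text=announcement, lang="en", slow=False)
tts.save("week5_announcement.mp3")  # ready to upload or embed
```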
Brent Warner 16:36
Next up: image generation. This is another world where wild stuff is going on. So again, OpenAI — I'm just going to start with them, because they're leading the pack. You can say there's other stuff going on, and it's true, but every time they announce something it's like, whoa, whoa, hold on, we're not ready to go that far yet. They just announced the new DALL-E 3, which is their image generator. Here's the thing: I have a Midjourney subscription. Midjourney is the one where you type in a prompt — you kind of have to learn how to prompt it — and it'll make images similar to what you're looking for. It's cool and it makes amazing images, but Midjourney doesn't necessarily recognize how you want things positioned in relation to one another. If you say, a man standing two feet away from a woman, and they're talking about whatever — before, it didn't really understand that; it might put the man and woman right next to each other, or on opposite sides of the image. Now, with DALL-E 3, it's doing spatial recognition inside the images, so you can put in tons of detailed information, and you can adjust things after you've given the prompt. You can say, hey, I made this — no, I want his glasses to be blue, or whatever else. So it's really making major shifts, and you don't need to know that prompt engineering as much anymore: you just say what you really want, and it's going to get better and better at creating that output. It hasn't come out for me quite yet, but it's coming soon.

Tim Van Norman 18:19
Nice. By the way, I'm loving how you can start using these in your class. Any time you just want a picture to demonstrate something, try it — because you can use these images, and nobody else is going to have them.

Brent Warner 18:44
Yeah, it'll be unique to your class. They also said — at least with OpenAI, which does ChatGPT and DALL-E and these ones — that anything you generate is yours, proprietarily. They don't own it, so you can use those images in your classes. You can create example pictures of things — I don't know about you, Tim, but I've spent so many hours searching for just the perfect picture to go with one slide in a presentation or whatever else — so that alone is so great and so worth it. But you can also customize these things, and you can start sending your students in and saying, why don't you start creating some images that help us understand this concept? So it's not just about you being able to make one picture; it's really the fact that you can incorporate it into your pedagogy and start using multimodal learning. And I'm not trying to prioritize one mode over another — I'm just saying there are different ways to approach these things and look at these ideas. Sometimes you'll see a chart and go, oh, now it makes sense, where it didn't make sense just reading it off the paper. So all of this is becoming more and more possible, and I'm convinced that, come the turn of the year, it's going to be a full-on revolution in the way we start thinking about things. So these image generators are huge, and really, really interesting.
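For anyone who would rather script this than use a chat interface, here is a minimal sketch of generating a one-off classroom image, assuming the OpenAI Python SDK and API access to the DALL-E 3 model (which, as Brent notes, was still rolling out at recording time). The prompt is just an example.

```python
# Hypothetical sketch: generate a one-off illustration for a slide instead of
# hunting for stock photos. Assumes openai>=1.0, OPENAI_API_KEY in the
# environment, and API access to the dall-e-3 model.
from openai import OpenAI

client = OpenAI()

result = client.images.generate(
    model="dall-e-3",
    prompt=(
        "A simple, friendly illustration of two people standing about two "
        "feet apart at a car dealership, one wearing blue glasses, "
        "flat colors, suitable for a classroom slide"
    ),
    size="1024x1024",
    n=1,
)

print(result.data[0].url)  # temporary URL for the generated image
```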
Tim Van Norman 20:19
Nice. Now, you mentioned Midjourney before as you were talking, and then you've got Hugging Face. I'm not familiar with that one at all.

Brent Warner 20:27
Yeah, Hugging Face is really interesting. You don't need to be a programmer of any kind, but it looks kind of like a GitHub type of setup, where people are building all of these things, and it allows for a lot of exploration — hey, I want to figure out this different way of doing things. One thing that's really popular on social media right now, that people are playing with, is what's called image diffusion. If you have a pattern, like a spiral, you can say, hey, make this into a Greek villa — and you get this cool-looking Greek villa that also has that hidden pattern in it, kind of like an M.C. Escher drawing or something like that. So there are all these experimental things you might want to play with. And I was thinking, same idea: you could do a spiral, but you could also do, say, a Fibonacci spiral, and then ask your students, okay, what do you find in here? What's going on with this picture? You could do all sorts of really cool things — it could just be for fun, but it could also be for quizzing students, for figuring out how things work. So Hugging Face is one that sets you up with lots of different types of images, or purposes for images, that you might want to generate, instead of just saying, make me a picture, or make me a realistic photo that looks like a movie set. This one's a little more experimental.
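For the curious, here is a minimal sketch of the kind of image diffusion running underneath those Hugging Face demos, using the diffusers library with a public Stable Diffusion checkpoint. The hidden-pattern effect Brent describes typically layers a ControlNet conditioning image (for example, a spiral) on top of a pipeline like this; that part is omitted to keep the sketch short.

```python
# Minimal image-diffusion sketch with Hugging Face's diffusers library
# (pip install diffusers transformers torch). Downloads a public Stable
# Diffusion checkpoint and generates one image from a text prompt; the
# "spiral hidden in a Greek villa" trick adds ControlNet conditioning
# on top of this basic pipeline.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to("cuda")  # comment out to run (slowly) on CPU

image = pipe("a sunlit Greek villa on a hillside, detailed architecture").images[0]
image.save("greek_villa.png")
```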
Tim Van Norman 22:01
Excellent. So as we look at the next topic: academic guides. This one was more of a general topic for us. And by the way, for those listening, we have a whole lot more topics, so I don't know how much further we're going to get today before we break it into another episode. But academic guides: Grammarly is probably the most well known of these — everybody's heard of Grammarly. It helps you identify things in your writing and so on. And by the way, Turnitin has Draft Coach, and there's a whole bunch of other ones that do some of those same things. But there are some other really cool ones coming out too. You mentioned Ethically when we were talking before, Brent.

Brent Warner 22:52
Yeah, Ethically. It's still early days for Ethically, but it's pretty cool, because what it does is help students through the writing process, and then it helps teachers work back through the writing with them. They work back and forth — it's kind of like Google Classroom with AI integrated into it, something like that. Basically, you say, hey, this assignment is in three or four drafts, or whatever number you want, and students have to turn in the first draft at this point. They turn in their first draft, and then it goes to the teacher. On the teacher's side, it actually shows you the AI's recommendations: here are the things we would have noted on this — do you agree with them or not, or do you want to adjust them in any way? You can click yes, yes, no, no, adjust, whatever. And when you're all good, you respond — and you can also add your own direct notes onto the document, just like you would with a Google Doc, for example. Then you send that back to the student, and they can work with it and ask, hey, did I achieve those things in my next draft? Did I fix the things you were talking about, or add the information, or whatever else it was? Super cool, right? And so it goes back and forth. Right now it's kind of K-12 based, but I'm actually secretly reaching out to them to see if we can trial it at IVC, because it's cool, and it seems like a really useful way to balance out a little bit of the, hey, what are we doing with AI? How are we doing this in ways that are productive and positive for student growth and student learning? I think there are a lot of ways, and this points in a direction where it feels to the students like, hey, I'm working together with you to figure this stuff out and build your skills — not, I'm using these tools to catch you cheating or whatever else. And that's really important, I think.

Tim Van Norman 24:47
Yeah, and notice that in this whole conversation we have not talked about cheating. We have not talked about identifying AI work, anything like that. That's not what this is about. This is about tools for actually creating with AI. So, you had one last one here: Khan Academy's Khanmigo.

Brent Warner 25:08
Yeah. Again, all of these things are so aimed at K-12 — which, well, we're all part of the education system. But Khan Academy: back in spring, when I was at the CUE conference, Sal Khan came and talked, and he introduced this and showed it for, like, the first time. It's super cool, because it's the same type of thing where you can set up parameters for students to learn about things, and then it works through them with AI. A lot of the stuff people have talked about — hey, I'm a historical figure, I'm Benjamin Franklin, come talk to me — you can do that with these AI tools, but now it's also tracked and traced through a learning program inside Khan Academy. And they really are way ahead of the curve: when you start seeing things going on, you can almost guarantee Khan Academy has been working on it for a year and a half before we heard about it, because they're just so tied in. So anyway, Khanmigo — unfortunately it's not free like all the Khan Academy videos were, at least it wasn't when they first announced it. I'm not sure what they're doing now, or if they're making variations, but it might be something students individually want to look at and say, hey, this is going to be a study guide for me. Maybe students can pay for their own individual subscriptions and treat it as a tutor. It's worth taking a look at and seeing how it goes — there's just so much cool stuff in there.

Tim Van Norman 26:41
Absolutely.
This is amazing — I love all of this stuff. And by the way, for those of you listening, I think we're going to do the next episode as part two. We did not get through half of the stuff that we had. So just for perspective: I'm already thinking of things to add to the next one, so it might be a little bit longer, which is good. That's what this is all about, right? Helping to educate and helping to learn.

Brent Warner 27:09
Absolutely, absolutely. So we'll come back next time — this will be part one, and next time around will be part two.

Tim Van Norman 27:19
Thank you for listening today. In this episode, we looked at part one of AI tools for Fall '23. For more information about this show, please visit our website at thehigheredtechpodcast.com. There you'll find our podcasts and links to the information we've covered.

Brent Warner 27:35
As always, we do want your feedback, so please go to thehigheredtechpodcast.com and let us know your thoughts. And of course, if you have ideas for future shows, there's a link over there where you can give us your topic ideas.

Tim Van Norman 27:45
For everyone at IVC that's listening: if you need help with technology questions, please contact IVC Technical Support. If you have questions about technology in your classroom, AI in the classroom, anything like that, shoot me an email and I'll be more than happy to guide you, get you set up in the course we talked about earlier, things like that. I'm at tvannorman@ivc.edu.

Brent Warner 28:12
And if you want to reach out to me about the show — I've actually been getting pretty active on LinkedIn recently, Tim — please feel free to connect and say that you listen to the show, or drop me a little note there. I'm on LinkedIn at @BrentGWarner.

Tim Van Norman 28:28
I'm Tim Van Norman.

Brent Warner 28:30
And I'm Brent Warner, and we hope this episode has helped you on the road from possibility to actuality. Take care, everybody.