This informative webinar discussed strategies for leveraging AI to support students with disabilities and special needs. Discover how this advanced technology is reshaping educational support, offering new avenues for engagement, understanding, and growth.
Educators will gain an understanding of:
How AI tools can support the implementation of student modifications and accommodations
How to use AI tools to assist students with building their executive functioning skills
The ethical use of AI tools to streamline the IEP process
How to use AI to create learning content for students with disabilities, including decodable texts
Presented as part of our AI Launchpad: Webinar Series for Educators.
AI and Supporting Students with Disabilities
Amanda Bickerstaff
Amanda is the Founder and CEO of AI for Education. A former high school science teacher and EdTech executive with over 20 years of experience in the education sector, she has a deep understanding of the challenges and opportunities that AI can offer. She is a frequent consultant, speaker, and writer on the topic of AI in education, leading workshops and professional learning across both K12 and Higher Ed. Amanda is committed to helping schools and teachers maximize their potential through the ethical and equitable adoption of AI.
Julie Tarasi
Julie is a special educator with 10+ years of experience in both self-contained and inclusion settings, supporting learners of varied abilities in grades 3-8. In addition to earning a Master’s Degree in Learning and Technology from Western Governors University and completing IBM’s AI for Educators certificate/credential, Julie has facilitated district-wide Professional Development sessions regarding AI implementation to support instruction and administrative tasks. Julie recognizes that successful integration of technology can maximize student learning through equitable access and allow teachers to more effectively focus on what they do best.
Amy Oswalt
Amy is an impassioned educator of children with learning differences. Her professional journey has led her to several impactful educational roles, including special education educator (both in the United States and abroad), Head of School at Discovery School in rural Ohio, and Director of Innovation at the Lab School of Washington in Washington DC. Amy believes that all children deserve the opportunity to engage in education that speaks to students' strengths and interests. To this end, Amy believes in harnessing the power of technology and AI to improve the educational outcomes of all children, particularly those with learning differences. Amy's quest for innovation is mirrored through professional development endeavors in strategic foresight and futurist thinking. With a wide international footprint, ranging from living and working in diverse cultures to engaging in community conservation projects, Amy's journey reflects a synthesis of academic dedication, innovative leadership, and a global educational perspective.
Bridgette Leslie
Bridgette has been a special educator for the past decade, focusing her teaching on creating truly inclusive classrooms and fostering self-awareness and advocacy with her students. Her experience spans public district and charter school settings across the country, all K-12 age groups, and all types of demographics. She has triggered innovation at several school settings, implementing best practices by initiating student-led IEPs and training fellow teachers and administrators. While Bridgette stumbled into the world of educating students with disabilities, her work has since led her to reflect on how disability and learning differences were ever present in her life growing up. She wishes to be a part of a culture shift in education, and society at large, to accept and expect individuality in all students.
Amanda Bickerstaff: Hello! Hi, everyone! Welcome to our webinar. I'm very excited to have you here with such an amazing panel on a really important topic. This might actually be one of our most important topics that we've done across the entire webinar series. It takes just a little bit of time to get everybody in, so we'll give everyone just about a minute. But if you have been here before, which I already see some
7
00:01:12.900 --> 00:01:25.520
Amanda Bickerstaff: familiar names and faces, please put a hello in the chat with your name, where you're from, and what you do. And the first one is my mom. Hi, Mom,
8
00:01:25.520 --> 00:01:26.870
Amanda Bickerstaff: from Georgia.
9
00:01:27.120 --> 00:01:50.259
Amanda Bickerstaff: She's ready to go. We've got Bruce in Oregon. We'll give everyone just a couple more minutes. I love how Zoom wants to let you in one by one. We've got Lisbon. Hi, Miguel. We've got Shannon from Illinois, Stacey's in North Carolina, Brisbane, early there, the Bay Area. We've got New Jersey, Croatia.
10
00:01:50.260 --> 00:02:05.409
Amanda Bickerstaff: That's awesome. So, a lovely group of people from all over the world. Thank you so much for being here with us today. Whether this is your first webinar or you are an old, old friend of AI for Education, we're really excited to have you here today.
11
00:02:05.530 --> 00:02:29.639
Amanda Bickerstaff: And, as I said at the very beginning, I think this might be one of the most important topics we've ever had: this idea of being able to support students with disabilities with AI, and also support the teachers who work with students with disabilities, because being a special education teacher is no joke. It is really, really hard, but it is also so deeply important, and so I'm very excited to have this amazing panel here.
12
00:02:29.640 --> 00:02:53.889
Amanda Bickerstaff: I'm so excited we've got Amy, Julie, and Bridgette. We've got two practitioners and an EdTech builder and CEO, I love that, so we're gonna get some distinct perspectives. As always, we want you to get involved with us. The community in our chat is one of our most important things, so share resources, ideas, and questions. We're gonna make sure that there will be
13
00:02:53.890 --> 00:02:59.880
time for questions at the end, so if you have questions, put them in the Q&A early on so we make sure that we get to them.
14
00:02:59.880 --> 00:03:13.060
Amanda Bickerstaff: We also want you to prompt along. We're actually gonna do some live prompting, and then Bridgette's gonna show us an extra special sneak peek at a tool that she's working on, which is pretty great.
15
00:03:13.160 --> 00:03:30.629
Amanda Bickerstaff: And this is part of our webinar series. We're actually at number 22, which is really exciting, and we have some really great webinars coming up. We have one on social-emotional learning and the mental health crisis. There was just a release of the Girls' Index that showed a real decline in girls' happiness
16
00:03:30.630 --> 00:03:55.510
Amanda Bickerstaff: between 2017 and 2023, so we will definitely be digging into that, and also just social-emotional learning in general. We'll then be looking at a very practical piece around differentiated instruction with Brian Eldridge. Then we have some district-level people talking about how to do this from the level of your district, so if you're interested in actually seeing what's happening in districts across the country, that's the one for you. And then finally, we're talking to Tom Vander Ark
17
00:03:55.510 --> 00:04:20.219
Amanda Bickerstaff: right before the end of the year about the durable skills of the future. So we have a lot of great webinars coming up, but today this is the main event. I'm very, very excited to be with three amazing women who are all making a difference in the lives of students with disabilities. I'm just so excited to have you here. And as we go along, I really want us to take advantage of their experience
18
00:04:20.220 --> 00:04:49.100
Amanda Bickerstaff: and knowledge, and also realize that they're coming on here and sharing their practice with us. So let's give them some grace as well, because this is not always the easiest thing. They're all gonna do some live demoing in the middle, so I just wanna give everyone a really good shot of positivity for that. Our first question, like always, is going to be a short introduction and then your first time either with generative AI in general or ChatGPT. So I'm gonna start with you, Julie. Do you mind introducing yourself?
19
00:04:49.340 --> 00:04:57.310
Julie Tarasi: Yeah. Hi, my name's Julie Tarasi, and I currently live in Kansas City, Missouri, but I grew up in Iowa, home of OpenAI servers.
20
00:04:57.360 --> 00:05:05.549
Julie Tarasi: I have been a special education teacher for over 10 years. I have a degree in special education as well as in learning and technology.
21
00:05:05.600 --> 00:05:20.840
Julie Tarasi: I've served in a variety of different special education settings, both inclusion and resource, self-contained, with a variety of different disabilities, from students who are in second grade all the way to seventh grade.
22
00:05:20.870 --> 00:05:45.169
Julie Tarasi: I don't remember my exact first prompt with ChatGPT. I think it was something related to meal planning, because I was just trying to figure out what it does and how it does it. It's really been interesting to watch how much it's grown in the past year, even just in the meal planning prompt, because I sometimes put it in just to see how it's growing. My first prompt for education, I think, was just to see how it did in developing IEP goals.
23
00:05:45.410 --> 00:06:01.170
Amanda Bickerstaff: That's awesome. I love it, we all have our way in, and I love that you have your way in with the meal planning, but you're then tracking how much better it's getting, which I think is really great. Thank you so much. Okay, now we're gonna go to Bridgette. Do you mind answering the same question?
24
00:06:01.240 --> 00:06:24.420
Bridgette Leslie: Sure. So I was a special education teacher for 10 years, kind of moved up the ranks from para to teaching resident, and taught across the gamut. Most of my time was in middle and high school, with a lot of it in inclusion, working with English and co-teaching, but the last two and a half years or so more in a life skills high school setting. And
25
00:06:24.420 --> 00:06:53.189
Bridgette Leslie: I switched out of the classroom partially because I got so frustrated with the technology that I was using as a special educator getting in the way of being efficient, knowing that in every other part of my life I had technology that made my life easier. So I decided, you know, if no one else is gonna do it, I guess I'm gonna do it, and I'm gonna figure out how to start a company and work towards making special education teachers' lives easier so that they can actually do the work with their students instead of
26
00:06:53.190 --> 00:07:00.369
Bridgette Leslie: focusing on some of those paperwork, legal necessities that you need to do but aren't what you went into the classroom for.
27
00:07:00.850 --> 00:07:02.679
Bridgette Leslie: Oh, and my first prompt
28
00:07:02.710 --> 00:07:30.510
Bridgette Leslie: I'm not sure what my first prompt was, but the first set of prompts that I played around with was trying to create content. I have all of these thoughts in my brain, expert knowledge around accommodations and modifications, but, as we all know, it's really hard going from your brain to a blank piece of paper on your computer, represented in a Google Doc, and actually getting something out. But
29
00:07:30.510 --> 00:07:41.399
Bridgette Leslie: that helped me get some words started. So I would prompt it for some blog posts, essentially, and it was cool to see what came out of it.
30
00:07:41.700 --> 00:08:09.639
Amanda Bickerstaff: That's great. And I just wanna say, plus 100 for a practitioner finding a real problem and then going to solve it. As a former EdTech CEO and a former educator, I absolutely love that. You are a shining light, and we're just really excited. And anyone in the audience who wants a little push, I think Bridgette can be that example of what you can do. So that's really awesome. Thank you so much. And then finally, Amy, same question.
31
00:08:10.480 --> 00:08:28.839
Amy Oswalt: Well, thank you. Thanks for having this webinar tonight. My background is in teaching special education. I started off teaching in public schools, have worked in a variety of settings around the US, and have done some teaching internationally as well.
32
00:08:29.010 --> 00:08:53.219
Amy Oswalt: Through teaching and experience I moved into some administrative roles and ended up at the Lab School of Washington, which is a school designed for kids with language-based learning differences, and that's really where my passion has been throughout my teaching career. I have a background in linguistics, and so I'm really interested in how language, reading, all of those pieces come together and work.
33
00:08:53.510 --> 00:09:11.929
Amy Oswalt: At the Lab School I am currently the Director of Innovation and head of our online programs. We started an online school for students with dyslexia and other language-based learning differences who may not have geographical access to the kinds of resources that our kids need.
34
00:09:11.930 --> 00:09:33.030
Amy Oswalt: They may have some comorbid conditions like anxiety that prevent them from coming in person to the classroom. So we're spending a lot of time and space in the online program with our students. It's been super fun and really amazing to work in this space, go around time zones, and have people jump into Zoom calls who you would never get to have come visit with you in person. So,
35
00:09:33.500 --> 00:09:46.870
Amy Oswalt: I was thinking about my first prompt for this. I have to say, I don't think any of my first prompts had much to do with my career. They were just random things, because I wanted to see how
36
00:09:47.350 --> 00:10:12.230
Amy Oswalt: the chat would answer the question, you know, what kind of logical thinking would be behind it. And then also I recall, and I think my stepmom is in the audience today, sitting on the couch with my stepmom putting really silly things into ChatGPT to see, hmm, is my husband a poopy head? What will it say? And then it's spitting that information back out to me. So it won't respond to those kinds of questions.
37
00:10:12.230 --> 00:10:30.820
Amanda Bickerstaff: Well, because your husband is not a poopy head, I think that's why, Jenny. I knew it. But no, I think it's really great, and I think it's really interesting to think about the different modalities of supporting students with disabilities that we are really starting to see, kind of moving out of just that traditional
38
00:10:30.820 --> 00:10:55.679
Amanda Bickerstaff: either inclusion or self-contained classroom, into that online model. So I think that's really interesting, and we'll definitely come back to how you're supporting those students. Our first question is, we're all here because we are excited about generative AI and how it can support special education teachers and our students. So, Julie, what do you think? What is the thing you're most excited about in terms of what something like ChatGPT or another generative tool can do to help
39
00:10:56.100 --> 00:10:57.419
Amanda Bickerstaff: in this area.
40
00:10:57.890 --> 00:11:21.549
Julie Tarasi: Yeah, I think the in-the-moment ability to modify and accommodate for students. And it can be both educator directed as well as student directed. You know, if a student who's a little older is more comfortable with it, they're able to self-reflect and realize, I don't really understand this, and type into ChatGPT or their platform of preference
41
00:11:21.550 --> 00:11:47.600
Julie Tarasi: and just say, explain this to me in a different way. Or if there's a reading passage they're not quite understanding, they can copy and paste it and say, hey, modify this to whatever level is more reasonable for them. And then, I think, too, the ability to work on their executive functioning skills, to help them prioritize or estimate how long it's going to take them to complete tasks, or creating task analyses
42
00:11:47.650 --> 00:11:55.779
Julie Tarasi: of anything that they need to complete. Just the ability to do it so quickly, I think, is really beneficial for all of our students.
43
00:11:56.170 --> 00:12:21.069
Amanda Bickerstaff: That's great. And I think it's also a safe space for that, too, in the sense that you can ask the same question in slightly different ways, and no one gets frustrated in that moment. You might be frustrated with the output not being quite right, but it's kind of, well, okay, I need it in this way, right? And so we can do that, and the better we get at creating spaces in which those are safe, I think we're gonna see that real ability to accommodate
44
00:12:21.070 --> 00:12:39.099
Amanda Bickerstaff: on the actual fly, which we always want to do. I think we always want to do that, but then, when we get there, it becomes really, really hard to manage that and everything else that needs to happen. So over to you, Bridgette. From the perspective of the work that you're doing on the EdTech building side, what really is most exciting for you?
45
00:12:39.770 --> 00:12:47.840
Bridgette Leslie: I think what I'm most excited about is coming from the practitioner side, but also the builder side. Because
46
00:12:48.490 --> 00:12:51.669
Bridgette Leslie: with what we're doing with IEP and Me,
47
00:12:52.060 --> 00:13:10.690
Bridgette Leslie: we are able to use what we have on our platform, all of the information from the parents and the students and the teachers, to make the IEP writing process significantly faster and also more data-driven. We have access to, you know, the attendance data and the rostering data, the
48
00:13:10.710 --> 00:13:18.309
Bridgette Leslie: you know, classwork and assignments and grades and things like that, with the perspective directly from students and parents and all the teachers.
49
00:13:18.390 --> 00:13:31.530
Bridgette Leslie: And that IEP writing process, I'm sure if anyone else in the audience has been a special education teacher, especially the first couple of years, it could take hours to write an update or an annual IEP.
50
00:13:31.530 --> 00:13:53.129
Bridgette Leslie: And if you have a tool to help you take all of this data across all the different areas for the student and summarize it for you, then, like I mentioned earlier, instead of a blank page that you're looking at, asking yourself, where should I look for this data? What do I need to include? How do I know I'm gonna do it right?
51
00:13:53.130 --> 00:14:03.680
Bridgette Leslie: You have that first draft in front of you. So we're really hoping to take that IEP writing process and bring the time that it takes down to, you know, 30 to 45 minutes
52
00:14:03.680 --> 00:14:14.810
Bridgette Leslie: for your first draft, putting together and summarizing all this data and then having that really solid foundation to move forward, get input from everyone else, and go through the rest of the IEP process.
53
00:14:14.810 --> 00:14:34.210
Bridgette Leslie: So, from the practitioner side, I wish I would have had this, and I'm really excited about how we can support veteran teachers who have been doing it for a while, but also brand new teachers, and maybe make it more exciting for new teachers to come into the classroom knowing that this aspect of their job isn't gonna be as overwhelming.
54
00:14:34.450 --> 00:14:58.820
Amanda Bickerstaff: Oh, man, there's so much good in there. We know that teacher shortages can really deeply impact special education. So many times we're throwing new teachers into these spaces unprepared. And the idea that we would create a space in which you could actually create a better foundation for an IEP, to be able to be data-driven, but also get it to action more quickly.
55
00:14:59.100 --> 00:15:04.360
Bridgette Leslie: I think that if we could shift the time from building it, which doesn't actually help the kid,
56
00:15:04.420 --> 00:15:30.250
Amanda Bickerstaff: to acting on it and then measuring it, that's where we wanna put our effort. Again and again through these webinars, what we've seen is, how do we have augmentation where we take away time out of the classroom and put it back into the classroom so that teachers are able to do that in an even better way? So I think that's gonna be pretty amazing, and we're actually gonna see an example a little bit later, for those who are really interested in seeing the tool. And then, Amy, to you, coming from more of the leadership
57
00:15:30.250 --> 00:15:36.579
Amanda Bickerstaff: component, what are you really thinking about that's exciting you about generative AI and supporting students with special needs?
58
00:15:37.390 --> 00:16:06.630
Amy Oswalt: I think, for where we are and thinking about the students that we serve, the idea that we can really level the playing field for them in a way that hasn't been possible before, to think about their specific needs that can be augmented with AI so that they can be at the table with everybody else, part of those discussions, part of the learning. I feel like so often our students with special needs are removed from essential conversations, discussions,
59
00:16:06.700 --> 00:16:11.549
Amy Oswalt: part of the classroom community. No matter how well we want to do inclusion,
60
00:16:11.870 --> 00:16:30.310
Amy Oswalt: it still creates some barriers for our students. So I'm excited to see us really remove some of those. It's gonna take some flipping of our thinking, I think, to really explode the model. And just as Julie and Bridgette were saying, thinking about those teachable moments and how we can really maximize them,
61
00:16:30.330 --> 00:16:45.989
Amy Oswalt: we have a much better opportunity to do that with our kids. And then looking at the data and, rather than spending all of our time collecting it and analyzing it, now really using it to move us forward in our practice.
62
00:16:46.960 --> 00:17:12.559
Amanda Bickerstaff: And I love how we keep coming back to two themes, all three of you, I'd say. One is the idea of collapsing that time that isn't maybe the best use of time. The second is what's happening with the student, whether that's giving them a space and agency to be part of this conversation, or getting the accommodation they need more quickly, or getting to that action of their IEP more quickly. The third thing is data,
63
00:17:12.589 --> 00:17:32.269
Amanda Bickerstaff: which I think we just don't give enough attention to in this world. I think it is really, really fascinating, with good data about your students, how much better you can support them, especially when every student has an individualized plan. Maybe what we'll get to is a world in which everyone has an individualized plan,
64
00:17:32.270 --> 00:17:56.970
Amanda Bickerstaff: but we can start with the kids who need it most. Today, as I was telling everyone before this, I got to spend the day with 25 eleventh graders from South Brooklyn, and we talked about equity and access. One of the groups who was building a policy said, well, actually, we want to have accessibility for everybody. We want to have more accessibility for students with disabilities and students with language needs. We wanna give them the access first and prioritize them
65
00:17:56.970 --> 00:18:05.139
over everyone else. And I thought that was such an interesting way to put it. But then the idea was, of course, once we take those learnings and do that, then we bring it to everybody.
66
00:18:05.160 --> 00:18:28.970
Amanda Bickerstaff: But I do love that that's a theme we can continue to think about. I know that we're gonna see that in Amy's presentation as well. So I'm actually gonna shift us to those practical moments more quickly. Julie, do you mind sharing your screen? Julie is actually our next prompt library author, so she built a couple of prompts for us today. Thank you, Kelly, for helping out. But we're gonna go into the first one. My question to everybody was, if I gave you five minutes
67
00:18:29.040 --> 00:18:39.669
Amanda Bickerstaff: to show your best thing in terms of what you wanted to show, and a practical way to support teachers who are working with students with disabilities, what would you do? So we're starting with Julie. Go ahead.
68
00:18:41.660 --> 00:18:52.539
Julie Tarasi: Yeah. So when you navigate to the AI for Education website, just aiforeducation.io, you can hover over the educator resources and go to the prompt library.
69
00:18:52.960 --> 00:19:07.040
Julie Tarasi: They have all of the prompts organized by topic. We'll be heading to the special needs area, and the two that I provided are the behavior intervention strategies and the accommodations ones.
70
00:19:07.100 --> 00:19:34.379
Julie Tarasi: Within the accommodations one, you wanna make sure that you're telling whatever platform you're using who you are and why you're asking, and providing that background: you are an expert in whatever you teach, and you are very skilled in differentiating instruction and modifying lessons in order to personalize learning experiences. And then you just go on to tell it what grade level, subject area, and lesson you're teaching, including the standard or your learning goal,
71
00:19:34.380 --> 00:19:43.859
Julie Tarasi: and then making sure that you're providing the learning activities that you're doing. You don't have to provide every single learning activity, but the big ones that you think
72
00:19:43.860 --> 00:19:52.389
Julie Tarasi: your students might have the most difficulty with, or the ones that you're struggling to come up with accommodations for. And then you can tell it to
73
00:19:52.410 --> 00:20:07.380
Julie Tarasi: provide a set of accommodations for students with the specific disability areas that they have. For the example one, we did autism, vision impairment, and then specific learning disabilities in reading comprehension and written expression.
74
00:20:07.450 --> 00:20:24.339
Julie Tarasi: And so we have this example prompt that's from a tenth grade social studies standard, but you can take the prompt, paste it straight into ChatGPT, and then go in and edit the bracketed pieces. I'll go ahead and take this part and paste it,
75
00:20:24.560 --> 00:20:46.120
Julie Tarasi: and it does a really great job of separating it out by disability area, to help you piece it together and apply it to your future practice. It goes through different reasoning, different ways, and it groups them by specific topic.
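For readers who want to adapt this outside the prompt library, here is a minimal, hypothetical Python sketch of the accommodations prompt structure Julie walks through: a role and background, the grade, subject, and standard, the key learning activities, and the disability areas to accommodate. The wording and placeholder values are illustrative, not the exact prompt-library text; the printed prompt can be pasted into ChatGPT or another chatbot and edited from there.

```python
# Sketch of the accommodations prompt structure described in the webinar.
# All placeholder values below are hypothetical examples.

def build_accommodations_prompt(grade, subject, standard, activities, disability_areas):
    activity_list = "\n".join(f"- {a}" for a in activities)
    disability_list = ", ".join(disability_areas)
    return (
        f"You are an expert {grade} {subject} teacher who is very skilled in "
        "differentiating instruction and modifying lessons to personalize learning.\n"
        f"I am teaching a lesson aligned to this standard or learning goal: {standard}\n"
        "The main learning activities are:\n"
        f"{activity_list}\n"
        "Provide a set of accommodations, grouped by disability area, for students with: "
        f"{disability_list}."
    )

if __name__ == "__main__":
    # Hypothetical values loosely following the 10th-grade social studies example.
    print(build_accommodations_prompt(
        grade="10th grade",
        subject="social studies",
        standard="Analyze primary sources to explain the causes of a historical event",
        activities=["Read a primary source excerpt", "Write a short analysis paragraph"],
        disability_areas=[
            "autism",
            "vision impairment",
            "specific learning disability in reading comprehension",
            "specific learning disability in written expression",
        ],
    ))
```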
76
00:20:46.650 --> 00:20:48.589
Julie Tarasi: And then the other
77
00:20:49.110 --> 00:20:58.699
Julie Tarasi: prompt that I wrote was for developing behavioral intervention strategies. Again, you're going to give it the background of who you are and what you're very skilled at.
78
00:20:58.700 --> 00:21:25.339
Julie Tarasi: And then you want it to develop a specific behavior response process for whatever age and grade your classroom is, as well as what age your student is, and then what the specific behaviors are, whether those are physically aggressive, disruptive, verbally disruptive, any of those types of things, and when they happen, whether it's, you know, they're denied attention, they're denied access to something, they're presented with a non-preferred task.
79
00:21:25.470 --> 00:21:36.000
Julie Tarasi: So the more information you give it, and the more specific you are, the better response you're going to get. And so we can copy and paste this one,
80
00:21:41.900 --> 00:21:48.940
Julie Tarasi: and it will give you a specific process with specific strategies.
81
00:21:48.960 --> 00:22:10.550
Julie Tarasi: I have found it beneficial in our district. We often make flowcharts for our behavior intervention plans, just to provide a more simplified visual, especially for some of our general education teachers. And there are different tools available that you could copy and paste this process into, and it could create a flowchart,
82
00:22:10.550 --> 00:22:20.469
Julie Tarasi: just making sure that you've taken out any identifying information about any of your students, or your location, or anything like that. So those are the two prompts that I helped generate.
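As a companion to the flowchart idea Julie mentions, here is a rough sketch of rendering a generated behavior-response process as a simple flowchart with the Python `graphviz` package (assumed installed, along with the Graphviz binaries). The steps are placeholders, not a real plan, and any real process should have identifying information removed first.

```python
# Sketch: turn an ordered list of behavior-response steps into a flowchart image.
# Requires: pip install graphviz, plus the Graphviz system binaries.

from graphviz import Digraph

# Placeholder steps; in practice, paste in the de-identified process the chatbot produced.
steps = [
    "Student is presented with a non-preferred task",
    "Offer a choice or a short break (de-escalation)",
    "Behavior continues: prompt the agreed-upon coping strategy",
    "Behavior stops: provide specific positive reinforcement",
    "Document the incident and share with the team",
]

flow = Digraph("behavior_response", graph_attr={"rankdir": "TB"})
for i, step in enumerate(steps):
    flow.node(str(i), step, shape="box")
    if i > 0:
        flow.edge(str(i - 1), str(i))

# Writes behavior_response.png next to the script.
flow.render("behavior_response", format="png", cleanup=True)
```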
83
00:22:21.220 --> 00:22:46.020
Amanda Bickerstaff: Oh, man! First of all, Julie, okay, anybody who's afraid to prompt, come on, this is amazing. And Julie, we put her in, I did kinda give her a challenge, right? But with a little bit of support, this is really great. And what we're gonna see is that the way it's structured is giving us back something that I see really closely as where we're starting, or a brainstorm, and where you go from this. I think this is such a great example of
84
00:22:46.230 --> 00:23:06.309
Amanda Bickerstaff: We talked about how teachers are natural prompters. You are naturally a good prompter, because you are good at asking questions, and you're also very patient people, usually. I mean, we all have our days, but it requires that. I think this is a really good example, and Jordan in the audience says these are great because it's a place to get started.
85
00:23:06.310 --> 00:23:22.040
Amanda Bickerstaff: And I think this is where it might not be the best thing that ever happened, but if it gets you 80% of the way in 10 seconds, because Julie spent half an hour or an hour giving us this beautiful prompt, that's where we wanna be. And I think what we're gonna see now is that as you keep going,
86
00:23:22.070 --> 00:23:47.040
Amanda Bickerstaff: you can do more and more interesting things. We're actually gonna take something somewhat similar, and then Amy's gonna show at the end how you can actually make essentially a chart of this through the paid version of GPT. But this is a great start, so thank you, Julie, so much for sharing. We'll have a third one up tomorrow around executive functioning. We didn't have quite enough time to get it up, but there will be more prompts built by Julie on the site. So thank you so much, Julie.
87
00:23:47.040 --> 00:23:57.909
Amanda Bickerstaff: Okay, now going to Bridgette, who's going to show us an extra special preview of her tool and her work. And I just want to maybe give a little bit of context.
88
00:23:57.910 --> 00:24:19.629
Amanda Bickerstaff: ChatGPT is going to be a place to start, and it's going to be general knowledge, so it's going to require a lot of how you prompt it. But Bridgette's actually gonna show you how you could take generative AI and move into a specialized model, doing specialized tasks with specialized databases of information, and how that can actually help you get to something even more targeted more quickly.
89
00:24:20.990 --> 00:24:36.080
Bridgette Leslie: Thank you. I'm gonna show a couple of different things, because there are so many ways to use AI and machine learning right now. So one of the ways that we use AI and machine learning in this moment is,
90
00:24:36.080 --> 00:24:51.250
Bridgette Leslie: so this is IEP and Me, and we are free for parents. Parents can upload their students' IEPs, and we use an AI machine learning tool that we've built to pull out the content of that information and make it more accessible.
91
00:24:51.250 --> 00:25:14.990
Bridgette Leslie: So when a parent uploads their kid's IEP, their accommodations could be on page 15 of that 30-page document, and we'll pull them out and align them with the accommodations that they can see here, along with pulling out their goals, their services, the present levels, so they can see all of that information in an easier-to-read format.
92
00:25:15.240 --> 00:25:28.249
Bridgette Leslie: The second way that we use AI is, we have an AI partner that is able to translate all of the content on our page with the click of a button. So if we have our goals here, I've chosen to use
93
00:25:28.270 --> 00:25:55.129
Bridgette Leslie: Spanish, and I can translate everything on this page into Spanish, from the content of the IEP to comments on these goals. So it increases, as you were talking about earlier, Amanda, with kids with learning needs and language needs having more access. This also opens it up to students who need that access and parents who need that access.
94
00:25:55.180 --> 00:26:04.000
Bridgette Leslie: So those are two ways that we are using AI before getting into an AI IEP first draft.
95
00:26:04.010 --> 00:26:19.549
Bridgette Leslie: But the sneak peek that she mentioned: we also have a partnership with Understood.org, which, if you're not familiar with it, is a nonprofit that's been around for about 15 years and is really dedicated to supporting, understanding, and working with kids who have learning and language needs.
96
00:26:19.610 --> 00:26:35.210
Bridgette Leslie: And so this is their sneak peek. Their CTO, their chief technology officer, has given me access to what they're building that they're gonna launch in the beginning of 2024, which takes this AI assistant,
97
00:26:35.210 --> 00:26:53.329
Bridgette Leslie: the chatbot kind of version that we see on ChatGPT, but it doesn't build on the entire internet, partially, and we're gonna get into this a little later, because there's tons of information on the internet, so there are bias questions and ethical questions.
98
00:26:53.330 --> 00:27:21.609
Bridgette Leslie: And so what Understood.org is saying is, we have 15 years of content from our library of expert content and our podcast, and so they're building their own AI assistant that you can ask really any questions, even a little bit of spicy questions. You can see I've asked some here: I have a student who struggles with organizing their backpack, how can I help? And they also link it back to some of their existing content. So,
99
00:27:21.620 --> 00:27:39.419
Bridgette Leslie: you know, eight tips to organize your kid's backpack, and you can just click it and it'll open up there. Or a little spicy question: how can I cure ADHD? And it'll give you some more context around ADHD and be able to give you some links to get more explanations.
100
00:27:39.430 --> 00:27:49.439
Bridgette Leslie: I don't know if anyone wants to paste a question in the chat real quick, if we have time, Amanda, for one that they want to see in action.
101
00:27:49.700 --> 00:27:54.950
Bridgette Leslie: Yeah, does anybody have a question that you want to ask the new Understood bot?
102
00:27:55.150 --> 00:28:19.960
Amanda Bickerstaff: You've got some spicy ones up there. I think you've got some, you know, new teachers who might not know exactly what's going on with the difference between an IEP and a 504. It's actually pretty funny, because maybe you wouldn't want to ask that as a teacher, like a new teacher, it might make you look like you don't know what you're talking about, instead of getting that support, and it makes a big difference. Yeah? Or, you know, this question that I definitely have heard as a special ed teacher:
103
00:28:19.960 --> 00:28:29.339
Bridgette Leslie: you know, my kids with ADHD, I've tried everything, I've given them all the accommodations, they don't want it, I think they're just lazy. What can I do?
104
00:28:29.720 --> 00:28:36.180
Bridgette Leslie: That's one. Yeah, so we have a question from Erin and from Jeff. Thank you, guys.
105
00:28:36.480 --> 00:28:43.240
Bridgette Leslie: How can I work... let's put this in there. How can I work with my child's school to create some strategies
106
00:28:43.510 --> 00:28:46.490
Bridgette Leslie: that don't involve medication?
107
00:28:47.590 --> 00:29:12.990
Bridgette Leslie: So it'll give you, right away, you know, here are some accommodations for anxiety disorders, or behavioral strategies. And something that their CTO wanted me to share is that this is their demo, right? They're not done building it. I think Dan, who is also on the panel here supporting with AI for Education, has a form if you want to be informed of when this alpha and beta is going to be launched.
108
00:29:12.990 --> 00:29:39.600
Bridgette Leslie: Dan can send out that form either in the chat or in the follow-up. I'm going to give all of those names and emails to Understood.org. There is a question in there that says, you know, do you want more marketing and materials from them or not, so they're not going to spam you. But if you're interested in the beta or alpha versions of this before they release it to the actual public, he's more than happy to share this with
109
00:29:39.760 --> 00:29:42.950
Bridgette Leslie: some people who are in the field and actually gonna be using it.
110
00:29:43.060 --> 00:29:56.310
Amanda Bickerstaff: So let's do one more question. But also, the form didn't work, so we'll make sure that it works for tomorrow. I think we might have to open up access, but we can do that. Yeah, so why don't we pick one more before we move on to Amy.
111
00:29:56.450 --> 00:30:03.320
Bridgette Leslie: Alright, Megan, I'm gonna use yours. How do we get students to take responsibility for their executive functioning?
112
00:30:03.760 --> 00:30:17.049
Bridgette Leslie: These are real questions, right? These are really the conversations that come up when you are working with students, and it's tough day to day.
113
00:30:17.830 --> 00:30:32.879
Bridgette Leslie: So we have executive functioning issues in high schoolers, which is probably a middle and high school question, essentially. You can see that there's some intro, and also some links to different articles
114
00:30:32.930 --> 00:30:44.209
Bridgette Leslie: that you can open up. They have a list of their experts on their website, but, you know, they're neuroscientists, PhDs, classroom educators, administrators, things like that.
115
00:30:44.590 --> 00:30:46.430
Bridgette Leslie: Absolutely. But yeah.
116
00:30:46.430 --> 00:31:05.540
Amanda Bickerstaff: Yeah, this is great, and this is gonna be a really great example. So we're gonna talk about the ethical considerations right now, because this technology is the newest, and I say it every time: it's the worst it will ever be today, and it was yesterday, and it will be tomorrow. But the idea, though, is that when you have a specific
117
00:31:05.540 --> 00:31:30.500
Amanda Bickerstaff: model that is trained on specific data, which in this case is 15 years' worth of work and research, and I'm sure there's lots of observational data, etc., the outputs you're gonna get are going to be higher quality. And also, potentially, the way they design the tools will have lower hallucinations, which is when the model will mix something up. So I think this is just a really great example of where I think
118
00:31:30.500 --> 00:31:49.780
Amanda Bickerstaff: we will get to in the next six months to a year, which gets us beyond these kind of open models that people are using right now. So thank you so much for this. At this point it's so good to see people getting so engaged, but we're gonna get you this access to try on your own as well. So,
119
00:31:49.780 --> 00:32:10.630
Amanda Bickerstaff: Amy, Amy's our data person. I love it. As a data person, I believe very strongly that I'm a data nerd, and so I love that we're gonna go into this. So we're actually gonna look now at, well, we've looked at the free GPT-3, we've looked at a couple of bespoke tools, and now we're gonna look at the paid version and what we can do with Code Interpreter. So take it away, Amy.
120
00:32:10.950 --> 00:32:27.629
Amy Oswalt: Okay, thank you, Amanda. So yes, we collect a lot of data, certainly working at a school that is specifically for children with learning differences. They all come to us with a psych-ed evaluation, and one of the challenges I've found is
121
00:32:27.920 --> 00:32:50.140
Amy Oswalt: taking that information, which is very rich and useful, and making it actually usable for the teaching team. So I was thinking about our conferences coming up here soon, and one of the things that we try to do is, you know, do a look back on the psych-ed profile and make sure that we're paying attention to that. But it is a lot of pieces and parts to amass together. So
122
00:32:50.220 --> 00:33:02.000
Amy Oswalt: one thing I've been interested in doing is thinking about, what's the spread in my classroom? How does working memory look in my classroom? Where are the highs, where are the lows? For that,
123
00:33:02.200 --> 00:33:03.850
Amy Oswalt: I took
124
00:33:04.120 --> 00:33:23.069
Amy Oswalt: 23 students' psych-ed eval scores, which I made up for this case because I wanted to be careful about using any student data, and asked ChatGPT-4. I'm using some of the plugins, which you can see here, so this one is using the data interpreter.
125
00:33:23.310 --> 00:33:35.749
Amy Oswalt: And I asked it to create a visual of this information for me, and then also pull one of the students out and create a plan specifically around their scores.
126
00:33:35.850 --> 00:33:43.660
Amy Oswalt: So you can see how this looks. I just put all the data on a spreadsheet, copied and pasted it, and
127
00:33:43.770 --> 00:33:46.600
Amy Oswalt: dumped it in here after I wrote the prompt.
128
00:33:47.340 --> 00:34:15.749
Amy Oswalt: And then this is what it produced. I'll show you the picture first, and then we can dig down into the words and writing, but I'm very visual, and so being able just to see it is always exciting. I think things like this are really helpful, especially when you're busy with lots of pieces and parts. It really does a good job of showing the range in the classroom. It shows me I have a couple of outliers in a couple of areas,
129
00:34:15.750 --> 00:34:26.609
Amy Oswalt: so I can really be thoughtful about what's happening with my student profiles and how to work through different interventions with them.
130
00:34:26.739 --> 00:34:37.269
Amy Oswalt: And then, specifically, I can have it develop a plan for that student's particular profile. So looking at student S7, you can see their
131
00:34:37.760 --> 00:34:39.919
Amy Oswalt: psych-ed eval scores there,
132
00:34:40.170 --> 00:34:46.480
Amy Oswalt: and then it went through the strengths and areas for growth in looking at that.
133
00:34:47.130 --> 00:34:49.239
Amy Oswalt: some of the other kind of
134
00:34:49.710 --> 00:35:13.710
Amy Oswalt: tangential prompts that I've done with this are having it pull maybe the top 5% of visual-spatial scores in my classroom and create a specific intervention for them, or take the lower 10% in working memory and develop a six-week-long intervention plan for those students. So it has really allowed me to
135
00:35:14.210 --> 00:35:34.080
Amy Oswalt: use the actual data that I have on our students and craft something that is very specific for that child. And of course, as we've all talked about, and you'll hear in any kind of AI conversation, this is a starting place for me. This isn't necessarily the end place, but it does give me
136
00:35:34.160 --> 00:35:45.849
Amy Oswalt: some talking points if I'm in a conference with families, and it also gives me a roadmap of how I can continue this work and really finesse it and get more fine-tuned and detailed with it.
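For context, here is a minimal sketch of the kind of classroom-spread visual Amy describes asking the data interpreter to produce, built locally with matplotlib from made-up, de-identified index scores. The domains and numbers are illustrative only, not her actual data or ChatGPT's output.

```python
# Sketch: box plot of the classroom spread by psych-ed domain, using synthetic scores.
import random
import matplotlib.pyplot as plt

random.seed(0)
domains = ["Verbal Comprehension", "Visual Spatial", "Working Memory", "Processing Speed"]

# Fabricated standard scores (mean 100, SD 15) for 23 anonymous students.
scores = {d: [round(random.gauss(100, 15)) for _ in range(23)] for d in domains}

fig, ax = plt.subplots(figsize=(8, 4))
ax.boxplot([scores[d] for d in domains], labels=domains, showfliers=True)
ax.set_ylabel("Standard score")
ax.set_title("Classroom spread by psych-ed domain (synthetic data)")
plt.tight_layout()
plt.savefig("classroom_spread.png")
```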
137
00:35:46.780 --> 00:36:12.870
Amanda Bickerstaff: Okay, I think we have to look at your GPT. So we have a GPT that was created by Amy. GPTs are the newest announcement from OpenAI, which actually means that right now they had so many people sign up for GPT-4 that you cannot sign up for GPT-4, and you can only use these if you have GPT-4, unfortunately. But I love how Amy immediately was like, I've got this GPT for spelling. Let's go.
138
00:36:13.210 --> 00:36:16.200
Amy Oswalt: Okay. So I tried this one earlier. I'll try it again.
139
00:36:19.850 --> 00:36:22.780
Amy Oswalt: So I'm just obviously putting in a misspelled word.
140
00:36:23.060 --> 00:36:28.409
Amy Oswalt: We'll see. If it doesn't pick it up correctly, then I'll pull up the one that I previously did.
141
00:36:30.070 --> 00:36:32.720
Amanda Bickerstaff: I don't know, what's everybody's favorite
142
00:36:33.080 --> 00:36:34.240
Amanda Bickerstaff: dinosaur
143
00:36:36.780 --> 00:36:37.850
Amanda Bickerstaff: tail ones.
144
00:36:41.230 --> 00:36:49.029
Amanda Bickerstaff: Not a dinosaur person, that's okay. Oh, Barney, great!
145
00:36:49.560 --> 00:36:56.270
Amanda Bickerstaff: Yeah, I think mine is definitely Brontosaurus. I'm into, like,
146
00:36:56.450 --> 00:37:04.660
Amanda Bickerstaff: very large herbivores. I think that's my jam. But awesome, and the T. rex, someone had to do it.
147
00:37:04.700 --> 00:37:10.510
Amanda Bickerstaff: And so this is so great, and I love it. Yeah, I love this one where you can actually have it
148
00:37:10.670 --> 00:37:11.940
Amanda Bickerstaff: take that.
149
00:37:12.020 --> 00:37:29.460
Amy Oswalt: So, the information: when I built this, I asked it to produce a lot of visuals for what we're doing for the spelling pieces. And so then, when I put the first mistake in, it asked me if I would like a drawing, and then it created this drawing. What I like about using something like this is,
150
00:37:29.470 --> 00:37:44.619
Amy Oswalt: for students, they don't have to put in a lot of other words, they don't have to ask it to do a specific thing. They're just putting in their spelling of that word, and then it knows that this is just for spelling, and that's what it's gonna spit back out to them.
151
00:37:45.470 --> 00:38:10.299
Amanda Bickerstaff: Yeah. And I think this is actually a good segue, right? Because Claire just identified that in the image the writing is wrong. But this is an example of hallucination that's very, very common in image generators. Text-to-image generators have just started being able to have somewhat correctly spelled things inside of them, but it's gonna present it as if it's
152
00:38:10.300 --> 00:38:35.259
Amanda Bickerstaff: correct. And so we do wanna be careful. If you were just giving this to a student, it actually might make them spell worse. So thinking about the balance: what's fun for us as experts, who are expected to understand hallucinations, can actually be damaging to students, for whom this might just be a confusing piece. So it's a really important conversation for us, how we use these tools with students and how much we actually
153
00:38:35.260 --> 00:38:58.609
Amanda Bickerstaff: open up to them, and how much we don't, or how much we are there to guide them in that usage until we have these better fit-for-purpose tools. So why don't we come off of share? And we're gonna go to our, you know, AI for Education, it's all about responsible adoption. So we cannot talk about any of this work without talking about, as we think about this, what are the ethical
154
00:38:58.610 --> 00:39:13.339
Amanda Bickerstaff: considerations, what are the considerations that a special education teacher, or a teacher working with students with disabilities, needs to understand about the technology right now? So, from your point of view, Julie, what do you think is most important to understand?
155
00:39:13.800 --> 00:39:35.440
Julie Tarasi: I think it's kind of twofold: the big piece about confidentiality of our students, and then also the implementation of it within our practice. You know, we've all been trained on confidentiality and those types of things, and so I'm kind of like, if I wouldn't put it on a billboard in front of the school, then I'm not gonna put it into any sort of AI platform.
156
00:39:35.440 --> 00:39:51.349
Julie Tarasi: And so, you know, removing anything that could potentially be identifying. Sometimes that might mean that I'm not gonna be able to use ChatGPT, because it is such a specific case that I can't confidently remove everything,
157
00:39:51.350 --> 00:40:00.009
Julie Tarasi: so I'm just gonna have to use other resources, you know, my co-workers around me, my colleagues, and other resources. In the
158
00:40:00.010 --> 00:40:18.109
Julie Tarasi: implementation of it within our practice, it's kind of that trust-but-verify principle. The 80/20 rule has kind of been a guiding principle, and so, you know, you take about 80% of what generative AI gives you, and then you're gonna have to tweak it, you know, the other 20%,
159
00:40:18.110 --> 00:40:37.810
Julie Tarasi: and so you can't just copy and paste it and call it done. You're gonna have to make sure that it truly is accurate. You have to make sure that you have the data to support everything that generative AI has given you, so that way you can justify why you are choosing or not choosing certain strategies within your practice with your students.
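In the spirit of Julie's billboard rule, here is a rough, hypothetical sketch of a quick de-identification pass before pasting text into a chatbot. A simple find-and-replace will not catch every identifier, so when in doubt, follow her advice and keep the text out of the tool entirely.

```python
# Sketch: replace known identifiers (names, school, city) with placeholders
# before pasting a note into an AI tool. This is NOT a complete safeguard.
import re

def redact(text: str, identifiers: list[str]) -> str:
    """Replace each known identifier with a numbered placeholder."""
    for i, term in enumerate(identifiers, start=1):
        text = re.sub(re.escape(term), f"[REDACTED_{i}]", text, flags=re.IGNORECASE)
    return text

# Hypothetical example note and identifiers.
note = "Jordan struggled with the reading passage at Lincoln Middle School today."
print(redact(note, ["Jordan", "Lincoln Middle School"]))
# -> "[REDACTED_1] struggled with the reading passage at [REDACTED_2] today."
```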
160
00:40:38.740 --> 00:41:03.720
Amanda Bickerstaff: We talk about the human in the loop a lot in terms of this process, but I think what you're talking about is, you are the expert here, and you are protecting privacy number one. Even if you have it set to not train on your data, you know, turn off everything, I think being as cautious as possible matters, because there is research that shows that ChatGPT
161
00:41:03.720 --> 00:41:14.400
Amanda Bickerstaff: and other genera tools are better at inferring a person's background location, age. economic background than people.
162
00:41:14.460 --> 00:41:38.559
Amanda Bickerstaff: And so this is something to understand: it can potentially start to act differently because of that over time, so I think it's just really important for privacy. But also, that human in the loop, you have to be the expert. This is not an input-output machine. This is not a magic bullet, in the sense that it is actually not magic. These are computing and predicting, probabilistic models
163
00:41:38.600 --> 00:42:04.389
Amanda Bickerstaff: that are made to look like they're right, but in actuality they're not going to be right all the time. And I would even say to Bridgette, I'm sure there are going to be examples in your tool or in the generators where you can't get rid of hallucinations, they keep happening. It may not always be the case, but I think that's just something we have to deal with right now. So thank you, Julie. Going to you, Bridgette, I know you are probably steeped in this and thinking about it all the time. I'd love to understand what you think is most important to understand.
164
00:42:05.910 --> 00:42:29.550
Bridgette Leslie: I think it's really important to understand that we need to be careful around the use of technology and perpetuating the bias that exists in the world, essentially, but also throughout the entire internet. And I think that's one of the reasons why, when we're thinking about building an AI first-draft IEP writer,
165
00:42:29.550 --> 00:42:52.580
Bridgette Leslie: we're not thinking about, are we gonna build it on OpenAI? We're not thinking about, is it going to be trained on everything that's on the internet? Because there's a lot of misinformation on the internet. And it's a similar mission for Understood.org, that's why they're building their own. They're gonna start with content that they know, from the first lens of,
166
00:42:52.580 --> 00:43:07.169
Bridgette Leslie: we are gonna make this tool as unbiased and ethical as possible, knowing that nothing's ever gonna be perfect the very first time. They wanna put these checks in place, and we will, too, at IEP and Me, put those checks in place of,
167
00:43:07.170 --> 00:43:22.370
Bridgette Leslie: you know, does this look right? Does this have any bias, and things like that? And we also know that privacy needs to come first. We are building a product that is
168
00:43:22.570 --> 00:43:35.299
Bridgette Leslie: FERPA compliant, HIPAA compliant, COPPA compliant, however you want to say those acronyms, all the acronyms, right? So we're building with that security in mind, knowing that
169
00:43:35.400 --> 00:43:54.119
Bridgette Leslie: we need to have a first-draft opportunity, and AI is never gonna be the end-all be-all. You have to have people come in and say, does that make sense for this particular student? Is this goal associated with the right standards? Is it progressing?
170
00:43:54.210 --> 00:44:19.900
Bridgette Leslie: Is it something that is relevant for this year, this student, right now? And as we're building, it's continually keeping ourselves in check and bringing in third parties, you know, from situations like this, where we're talking to other people, showing what we have, and getting that feedback, knowing that feedback is great and it's just there to help one another and hold each other accountable.
171
00:44:21.090 --> 00:44:37.650
Amanda Bickerstaff: I mean, so much good there. And, you know, we're working on a responsible GenAI tech tool framework, and everything that you just said would put you on that framework in terms of the approach. And I think that while there's no silver bullet right now
172
00:44:37.770 --> 00:44:46.610
Amanda Bickerstaff: to stopping hallucinations, and no one ever said the internet was unbiased, so that's just not a thing that goes on anybody's, you know,
173
00:44:46.610 --> 00:45:10.839
Amanda Bickerstaff: pillow with the needlework. But it is something where I think more and more tools like your own and Understood.org and others are trying to mitigate that bias in ways that, at least if there is bias, it can be reported, identified, and worked around. And I think that's very, very important, because, you know, students with disabilities are going to experience bias in implicit and explicit ways every day,
174
00:45:10.990 --> 00:45:21.499
Amanda Bickerstaff: every day there are going to be those experiences, right? And we don't want our technology replicating those experiences or exacerbating them. And so, Amy, to you.
175
00:45:22.050 --> 00:45:31.290
Amy Oswalt: Thanks, Amanda. So yes, to everything that everyone has already said about the adult ethical considerations and pieces. I
176
00:45:31.310 --> 00:45:42.010
Amy Oswalt: agree 100%. And building on that, I think we are also required, I feel like, to be curious about this technology and really
177
00:45:42.790 --> 00:45:59.909
Amy Oswalt: start playing around with it, start becoming playful with it, so that we feel comfortable with it. And we've talked a lot about the adult ethics; I think there are student ethics that we should consider when it comes to teaching. You know, when we are worried about
178
00:46:00.290 --> 00:46:13.669
Amy Oswalt: what we're going to teach, because AI can do this and do that, well, ethics is a great place to start really leaning into. Because if we're talking about all the ethical considerations and the decision-making that we're going to have to do with this,
179
00:46:13.930 --> 00:46:18.809
Amy Oswalt: It's going to be times 100 for our kids. And so we really need to lean into that kind of teaching.
180
00:46:20.140 --> 00:46:45.659
Amanda Bickerstaff: Oh, man, it's like you've listened to me before, Amy. I think I say "AI literacy" maybe a hundred times a day, and I have gotten very boring. But no, I mean, it is so important that we take time to give everyone the same opportunity to understand the ethics, the limitations, and the capabilities, because students are going to be at so much more risk of not being able to identify those two things and taking them as truth,
181
00:46:45.660 --> 00:46:57.429
Amanda Bickerstaff: and of taking bias as truth. Because we trust technology more than we trust people; it shows over and over again. Technology looks unbiased, it looks reliable, it looks right.
182
00:46:57.430 --> 00:47:21.390
Amanda Bickerstaff: We're more willing to tell it our problems sometimes than we are to tell a teacher that cares about us, because the teacher that cares about us might wanna give us a hug or a pat on the back, or try to make it better instead of just listening. And so I think this is where it gets really interesting. So what I'm gonna do is, if anybody has any questions, we have a couple already, please put them in the Q&A. For those of you that have to leave, thank you so much for joining. I'm just gonna pull up on the screen really quickly
183
00:47:21.390 --> 00:47:30.909
Amanda Bickerstaff: the ways to keep in touch with us. And so all of these lovely women are actually part of our Women in AI and Education community. This is so cool. This is actually the first
184
00:47:30.910 --> 00:47:47.930
Amanda Bickerstaff: webinar that is an actual group community webinar. I think it was a couple of us, actually, that were like, this is really interesting. So I just really appreciate you all being here, and Dan will put that in if you wanna join. But if you wanna get in touch with us, you've got Amy here, you've got Bridgette, and you've got Julie.
185
00:47:47.930 --> 00:48:03.480
Amanda Bickerstaff: But please connect with us there, and if you do have questions, please put them in the Q&A. And while you're putting your questions in the Q&A, we have a little bit of a speed round. And so the speed round is: Julie, what is your number one tip around AI and students with special needs or disabilities?
186
00:48:03.700 --> 00:48:19.240
Julie Tarasi: I think my number one tip, or my favorite thing that I've been able to use it for, is creating step-by-step checklists and processes for students, whether that be a mathematical process, whether that be for writing a blog, completing an assignment, filling out their plan, or anything like that.
187
00:48:19.870 --> 00:48:22.620
Amanda Bickerstaff: Awesome, love that. Okay, Bridgette.
188
00:48:23.970 --> 00:48:50.170
Bridgette Leslie: I would say mine is for kind of like newer teachers. I feel like it's a really awesome opportunity to ask those kinds of simple questions that you don't wanna ask the other teachers, 'cause you feel like you might, you know, look bad. Like, what's the difference between an IEP and a 504, or what's the difference between accommodations and modifications? A lot of those more simple questions, or what seem to be simple when you know the answers to them already,
189
00:48:50.170 --> 00:49:04.320
Bridgette Leslie: but could be kind of getting in the way of them implementing IEPs well, or working with their students well. Even something like, you know, what is dyslexia and how does it show up? You're not necessarily given all of that information in a teacher prep course,
190
00:49:04.320 --> 00:49:19.119
Bridgette Leslie: but you can definitely find it. And if you can use ChatGPT to find some of those quick, easy, simple answers, it saves you a lot of time and a lot of worry about whether you're supporting your students well.
191
00:49:19.500 --> 00:49:25.689
Amanda Bickerstaff: Absolutely. And we do. I mean, as early new teachers, we wanna do so well, right? And we carry that
192
00:49:25.770 --> 00:49:46.549
Amanda Bickerstaff: nervousness and that care so deeply that it can stop us from the things that matter, because we're so worried about making a mistake, or not knowing an answer, or not looking like we know an answer in front of a group of experienced teachers. So I think that's such a great way to kind of collapse that. What we see over and over again in the research is that a chatbot can actually collapse the skill gap;
193
00:49:46.550 --> 00:50:03.740
Amanda Bickerstaff: it can actually take a new teacher and, not make them the best teacher, but with the things around, like, maybe understanding, or some of the lesson planning, or some of those pieces that are mundane, it can actually get you much, much closer to being a proficient teacher more quickly. Okay, to you, Amy, your number one tip.
194
00:50:03.790 --> 00:50:18.709
Amy Oswalt: Oh, my number one tip is to be curious and play around with it, because the only way you're going to figure out how to best use it is to try out lots of different things. I don't necessarily recommend putting in my first prompt, but you know, I think there are other interesting prompts you can try out.
195
00:50:19.120 --> 00:50:26.479
Amanda Bickerstaff: Yeah. And I think, honestly, the cool thing about this is that it is all about us being
196
00:50:26.510 --> 00:50:50.040
Amanda Bickerstaff: willing to try hard things and new things, to fail, to find the value, to laugh a bit, too. I mean, sometimes this stuff is really weird, and, you know, it's quite funny. We were trying to build a logo for the Women in AI and Education community, and even trying, it didn't make it better, it just made it weirder. And I think that's kind of the place we're in. Sometimes you just have no idea. But I think that this is where
197
00:50:50.040 --> 00:51:14.990
Amanda Bickerstaff: we really want to get you all really comfortable. And so, we don't have any big burning questions, so I actually think we're gonna give some people a little time back; consider, it's 3 a.m. in Kuwait, it is very late. But what I wanna do is give you all a challenge. I think that we have shown you, from really three distinct perspectives, whether that's a school leader, a practitioner that's doing this work, or a builder who comes out of that tradition of, like,
198
00:51:14.990 --> 00:51:16.880
Amanda Bickerstaff: like education
199
00:51:16.950 --> 00:51:41.910
Amanda Bickerstaff: is like, we're all just like you. Like all of this, like we, we're, you know, we are in the same space as you all. So if you're if you're interested in what we're doing, try out our prompt library, try to try out IP and me, or understood. Org try out cord interpreter because we're actually gonna pull some. We're gonna get Amy to do a prompt for us to, but really getting into that and doing that and also sharing that with your colleagues to normalize it and sharing it with your students in ways that are appropriate.
200
00:51:41.910 --> 00:51:53.890
Amanda Bickerstaff: So I just want to say thank you guys so much. First of all, can we just say thank you to our wonderful panel? I mean, I am just so blown away by their insightfulness, their grace, and their just absolutely
201
00:51:53.890 --> 00:52:16.829
Amanda Bickerstaff: amazing perspectives. So thank you guys so much for the effort you put in and for being with us today. And I just wanna also say thank you to everybody that hung out with us; I know you're joining from all over the world, and we just really appreciate everybody. We'll be sending the recording tomorrow with the resources. But please keep in touch with us. We hope you enjoyed this. Thank you to everyone, and good morning, good night, go to sleep. We'll see you again soon. Thanks, everybody.