AI & Children's Wellbeing

As AI continues to advance and become more prevalent in various aspects of our lives, it is crucial to understand its potential impact on the mental, emotional, and physical health of students. Join us as our expert panel discusses new research findings from the School Social Work + AI study and explores the role of school-based health staff in guiding the responsible adoption of AI in schools.

Attendees gained valuable insights into:

  • The latest research on AI and children's wellbeing

  • Best practices for the responsible adoption of AI in schools

  • Strategies for health staff to contribute to AI policy development

  • Opportunities for collaboration to create AI-enhanced learning environments that prioritize student wellbeing

  • Amanda Bickerstaff

    Amanda is the Founder and CEO of AI for Education. A former high school science teacher and EdTech executive with over 20 years of experience in the education sector, she has a deep understanding of the challenges and opportunities that AI can offer. She is a frequent consultant, speaker, and writer on the topic of AI in education, leading workshops and professional learning across both K12 and Higher Ed. Amanda is committed to helping schools and teachers maximize their potential through the ethical and equitable adoption of AI.

    Rebecca K. Oliver

    Ms. Oliver is the Executive Director of the School Social Work Association of America. Prior to becoming the Executive Director, Ms. Oliver served on the SSWAA Board of Directors and has over 20 years' experience working as a school social worker. In her current role with SSWAA, Rebecca is able to support school social workers across the nation and advocate for the profession about which she is so passionate. When not working, Rebecca enjoys traveling with her husband Jon, singing, running, reading, doing home improvements, and outdoor activities, including walks with her two dogs, Abby & Buddy.

    Andrew Buher

    Mr. Buher is the Founder and Managing Director of Opportunity Labs, a national non-profit research, policy, and consulting lab. Andrew is a visiting lecturer at Princeton University's School of Public and International Affairs. Andrew was appointed by President Obama to serve as a White House Fellow at the U.S. Department of Housing and Urban Development. Prior to his work at HUD, Andrew was the Chief Operating Officer of the New York City Department of Education and formerly Chief of Staff to Chancellor Dennis Walcott. Andrew is a White House Fellow, Forbes 30 under 30 honoree, French American Foundation Young Leader, Atlantic Council Millennium Fellow, and an Education Pioneer. He received an MPA from Columbia University and a BA from Rider University. His writing has been featured in The Hill, CNN, Politico, and the NY Daily News.

    Dr. Marina Badillo-Diaz

    Dr. Badillo-Diaz is an experienced school administrator and counseling director with a demonstrated history of working in community mental health and in education as a social worker. Currently, Dr. Badillo-Diaz is a consultant focusing on the training of educators and social workers. She is also an adjunct professor, a board member of the School Social Work Association of America, and the author of the blog "The AI Social Worker," a 21st-century skills and AI practice guide for social workers. Her areas of interest include 21st-century skills, social-emotional learning programming, school social work practice, education, youth mental health, clinical supervision for social workers, ethical AI, data management, and career development.

  • Amanda Bickerstaff: Hello! Hi, everyone! Welcome to our first webinar of what feels like summer here in New York City. Really excited to have you all here with us today. It takes us a little bit for us all to get in, with Zoom letting everyone in one at a time, but we're really excited to have you here. We're gonna get started in just about a minute. And if you've been with us before, you know we love for you to say hello. So please

    00:00:34.700 --> 00:00:59.519

    Amanda Bickerstaff: put in the chat who you are, what you do, and where you're coming from. I know we have people from all over the world, and so it's really great. We're so excited to have this conversation today. And you know, I think this is a really important conversation that we, as a collective, along with our wonderful panelists today, don't believe is being talked about enough,

    00:00:59.520 --> 00:01:25.389

    Amanda Bickerstaff: and so we're really excited to have you all here as we dive into what AI means for children, for child wellbeing and student wellbeing. This is something we know about: there has already been an enormous impact of technology on student wellbeing with the advent of devices, the Internet, and of course social media. And now we have this newly arrived technology, and it's not just

    00:01:25.390 --> 00:01:49.090

    Amanda Bickerstaff: impacting us as educators and leaders and industry people, but also really students. There's a recent piece of research that we're talking about today, and Andrew's gonna come on screen in a moment to talk about it. But we also know from research that students have been interacting with things like Snapchat's AI for a full year, all the way down to 7 and 8 years old. And so we're really gonna start focusing on what I think will be the first of many conversations

    00:01:49.090 --> 00:02:12.919

    Amanda Bickerstaff: on the impact of AI on student wellbeing. I'm actually gonna just remind everyone, whether you've been here before or this is your first webinar with us: it is meant to be a community of practice. So I really appreciate everyone that's already said hello in the chat. Use the chat to share resources and chat with everybody, but if you have a specific question for the panel, please put that into the Q&A.

    00:02:12.920 --> 00:02:36.701

    Amanda Bickerstaff: That would be great, and it makes it a little bit easier, because we do have a really active chat, so we wanna make sure we don't miss any of those great questions. But if you do have a resource, say a piece or a strategy around student wellbeing and AI that you really love, please put that into the chat, and we'll be dropping some of those resources in as we go as well. So what I'm gonna do is come off share, and I'm actually gonna invite up Andrew Buher. And I have to say, Andrew

    00:02:37.110 --> 00:03:03.769

    Amanda Bickerstaff: is a kindred spirit in a lot of ways. When I first talked to Andrew about generative AI, the first thing he wanted to talk about was this idea of what the impact will be on wellbeing and on students themselves. And so there's a lot of great research that has come out of Opportunity Labs. If you want to come on screen, Andrew, that would be great. What we're gonna do is I'm gonna hand it over to Andrew to introduce himself, but also to run us through a quick 7-minute overview of the research.

    00:03:03.770 --> 00:03:19.839

    Amanda Bickerstaff: Evidence is the most important thing that we have today, especially in this crazy new time, so being able to situate that together is really important. And then we have a couple of really awesome panelists that are gonna come on, and we're gonna have a bit of a discussion. So I'm gonna turn it over to you, Andrew. Take it away.

    00:03:20.820 --> 00:03:34.381

    Andrew Buher: Great, thanks so much, Amanda, for the warm welcome, and to the AI for Education team for coordinating. And thanks to everyone for joining. I guess I should say good evening, good afternoon, good morning, since it seems like there's folks from

    00:03:34.730 --> 00:03:58.959

    Andrew Buher: all over the world on this call. Thanks for being here and taking time out of your busy schedules, and more importantly, thanks for the work that you do on behalf of kids every day. So, as Amanda mentioned, I am Andrew Buher. I am from Opportunity Labs. We are a national nonprofit organization, and we've been around for about 5 years. And I also teach at Princeton University

    00:03:59.343 --> 00:04:10.459

    Andrew Buher: here in New Jersey. So what I am hoping to do in the next 5 minutes is build a foundation for what we hope is a very

    00:04:10.590 --> 00:04:13.460

    Andrew Buher: practical and pragmatic conversation.

    00:04:13.600 --> 00:04:20.969

    Andrew Buher: I want to spend those 5 minutes basically talking about 3 things: why we undertook

    00:04:20.990 --> 00:04:23.299

    Andrew Buher: this research that we're going to discuss

    00:04:23.360 --> 00:04:29.899

    Andrew Buher: what we learned, and then some policy considerations. So I'm gonna go ahead and share my screen.

    00:04:31.240 --> 00:04:33.750

    Andrew Buher: hopefully, everybody can see this

    00:04:35.040 --> 00:04:36.520

    Andrew Buher: alright. So

    00:04:37.015 --> 00:04:50.130

    Andrew Buher: we put a national survey in the field in partnership with Rebecca Oliver and her team, who you will meet momentarily, and had a really great response: almost 600 respondents,

    00:04:50.762 --> 00:04:58.809

    Andrew Buher: and so I wanna start fundamentally by talking about why we undertook this research.

    00:04:59.630 --> 00:05:04.669

    Andrew Buher: There's basically 2 reasons. First, I think we can

    00:05:04.730 --> 00:05:08.850

    Andrew Buher: disagree about the scale of

    00:05:08.910 --> 00:05:14.770

    Andrew Buher: harm to kids caused by increased screen time and social media use

    00:05:14.780 --> 00:05:29.390

    Andrew Buher: over the last 14-ish years. But to argue that those concerns are baseless or too minimal to matter just isn't supported by the evidence anymore at this point. So

    00:05:29.500 --> 00:05:32.450

    Andrew Buher: and I'm sure so many of you on this call

    00:05:33.012 --> 00:05:44.329

    Andrew Buher: have experienced the suffering that children and families and schools and communities have had to navigate since the early 2010s because of

    00:05:44.430 --> 00:05:46.530

    Andrew Buher: kids' online lives.

    00:05:46.830 --> 00:05:52.650

    Andrew Buher: And of course, within schools, school-based health staff, including school social workers,

    00:05:53.034 --> 00:06:03.400

    Andrew Buher: have acutely felt this impact. You know, when we talk to district superintendents and school leaders and school-based health staff, we hear again and again the stories about

    00:06:04.000 --> 00:06:10.569

    Andrew Buher: these really troubling instances of impact that they have had to navigate,

    00:06:11.315 --> 00:06:23.659

    Andrew Buher: and so, if generative AI has the potential to exacerbate these challenges even a little bit, we thought it was really important to understand the perspectives of

    00:06:24.147 --> 00:06:35.260

    Andrew Buher: folks that are closest to kids. And one of those roles, obviously, within schools is school social workers. And so we really tried to

    00:06:35.681 --> 00:06:46.910

    Andrew Buher: get a sense of their perspective across a range of issues related to generative AI. That's the first reason we undertook this research. The second reason

    00:06:47.070 --> 00:07:09.009

    Andrew Buher: is that school social workers are super unique, right? They have a unique set of skills. Their role is really unique. They are incredibly proximate to students, staff, and families. And we think that they should be at the table when policy and programmatic decisions are being made

    00:07:09.510 --> 00:07:13.720

    Andrew Buher: that have the potential to influence their work. And obviously,

    00:07:14.165 --> 00:07:19.269

    Andrew Buher: the adoption of new technologies has the potential to do exactly that.

    00:07:19.800 --> 00:07:22.769

    Andrew Buher: Okay, so what did we learn?

    00:07:23.159 --> 00:07:40.310

    Andrew Buher: Everybody, hopefully, has seen this research. If you haven't, you can check it out; I think the AI for Ed team shared it with everyone. But I wanna highlight maybe 3 non-obvious takeaways that we have been thinking about as a result of this research.

    00:07:40.530 --> 00:07:41.860

    Andrew Buher: So first.

    00:07:42.202 --> 00:07:45.529

    Andrew Buher: we call this the don't-knows and the neutrals:

    00:07:46.109 --> 00:07:52.220

    Andrew Buher: there were lots of don't-know and neutral answers across questions.

    00:07:53.450 --> 00:07:55.749

    Andrew Buher: we think that's actually pretty meaningful

    00:07:56.216 --> 00:08:12.030

    Andrew Buher: because the don't-knows and neutrals represent knowledge gaps in what school social workers know about generative AI, through no fault of their own, of course. We actually think this is directly attributable to the fact that

    00:08:13.610 --> 00:08:21.240

    Andrew Buher: access to AI literacy programming for school-based health staff, including school social workers, is

    00:08:21.360 --> 00:08:25.930

    Andrew Buher: uneven at best and nonexistent in a lot of cases.

    00:08:26.650 --> 00:08:33.149

    Andrew Buher: So that's the first non-obvious takeaway. Second, we call this the what and the how.

    00:08:34.819 --> 00:08:35.980

    Andrew Buher: respondents

    00:08:36.059 --> 00:09:03.609

    Andrew Buher: overwhelmingly agreed about potential risks associated with the adoption of generative AI. And I wanna emphasize the word potential, right? There are risks related to safety and data privacy and cognitive development, and certainly mental and social wellbeing. And I think we can call these the what, right?

    00:09:04.440 --> 00:09:07.999

    Andrew Buher: But respondents were also deeply unsure about how

    00:09:08.651 --> 00:09:11.139

    Andrew Buher: these risks would manifest,

    00:09:11.737 --> 00:09:25.309

    Andrew Buher: and, you know, I think if we're being intellectually honest with ourselves, no one really knows the answer to how just yet. But there are starting to be some really smart projections about

    00:09:25.410 --> 00:09:30.530

    Andrew Buher: how these risks might (and might is bold and capitalized here)

    00:09:30.944 --> 00:09:34.100

    Andrew Buher: manifest, which we think is really meaningful.

    00:09:35.700 --> 00:09:49.499

    Andrew Buher: The third important non-obvious takeaway we're thinking about is out of the loop. And maybe this isn't as non-obvious as the first 2

    00:09:49.640 --> 00:09:55.490

    Andrew Buher: takeaways that I discussed, but I think it might be the most eye-opening.

    00:09:56.040 --> 00:10:04.389

    Andrew Buher: School social workers are just out of the loop too much when it comes to the adoption of new technologies in schools.

    00:10:04.550 --> 00:10:10.710

    Andrew Buher: They aren't consulted as frequently and consistently as they should be.

    00:10:10.760 --> 00:10:16.129

    Andrew Buher: And as a result they aren't talking to important stakeholders, including families

    00:10:16.210 --> 00:10:20.640

    Andrew Buher: as often as they should be about these issues.

    00:10:21.240 --> 00:10:45.639

    Andrew Buher: So, the 3 big non-obvious takeaways: the don't-knows and the neutrals; the whats but not the hows; and then the fact that school social workers, and honestly most school-based health staff, including school nurses and school counselors, are often out of the loop when it comes to conversations about the adoption of emerging technologies in schools, and in the classroom in particular.

    00:10:45.980 --> 00:10:53.979

    Andrew Buher: Okay. Again, you have the research. This is the full list of policy considerations that we

    00:10:54.490 --> 00:10:57.789

    Andrew Buher: codified after reading the research.

    00:10:58.030 --> 00:11:03.099

    Andrew Buher: But if I could summarize, I would effectively say 3 things:

    00:11:03.580 --> 00:11:19.009

    Andrew Buher: leaders at the district level and at the school level should be asking as many questions and seeking as much input as possible from folks that are closest to kids, particularly folks that are dealing with

    00:11:19.621 --> 00:11:23.389

    Andrew Buher: the manifestation of the youth mental health crisis,

    00:11:23.630 --> 00:11:25.400

    Andrew Buher: and of course, that

    00:11:25.460 --> 00:11:30.899

    Andrew Buher: includes school social workers. But it really includes all school-based health staff, and of course, educators as well.

    00:11:31.367 --> 00:11:37.639

    Andrew Buher: They should be working to democratize the flow of information, and that's inclusive of

    00:11:37.690 --> 00:11:50.040

    Andrew Buher: access to knowledge-building opportunities, like the AI literacy work AI for Education provides, so that all stakeholders are informed enough to make thoughtful decisions.

    00:11:50.360 --> 00:11:51.829

    Andrew Buher: And then, finally, I think.

    00:11:51.860 --> 00:11:58.500

    Andrew Buher: putting in place a system of systems that helps all stakeholders prepare for both

    00:11:58.650 --> 00:12:10.609

    Andrew Buher: potential opportunities, which are many, and potential risks associated with this new technology is gonna be really, really important. So

    00:12:11.246 --> 00:12:15.220

    Andrew Buher: hopefully, this gives folks a good sense of

    00:12:15.270 --> 00:12:28.920

    Andrew Buher: why we undertook this research, what we learned, and some ideas for action. But most importantly, hopefully this is a foundation for a really interactive conversation that is both

    00:12:28.980 --> 00:12:35.049

    Andrew Buher: pragmatic and practical. And so I'm going to pause there and pass it back to Amanda.

    00:12:35.270 --> 00:12:41.400

    Amanda Bickerstaff: Thank you so much, Andrew. I just want to put a pin in a couple of things. First of all, we see a lot of research where people

    00:12:41.420 --> 00:13:06.359

    Amanda Bickerstaff: are kind of pushed to make a decision. But I think it's actually really important to have that neutral "we don't know yet" as something that social workers are identifying. And when you talk about a gap in terms of training, and also in terms of being a part of those bigger conversations about technology, that's just really important. And Carolyn in the chat talks about being a librarian and not being

    00:13:06.638 --> 00:13:31.159

    Amanda Bickerstaff: brought in. What we see a lot is that when schools and systems actually bring in school psychologists, counselors, media specialists, and other people that are in and around students for these generative AI trainings, you see a much richer discussion that goes way, way deeper, because we're looking at the whole child now, and the whole child experience in schools. And so I hope that this is a galvanizing moment to have that focus, which I think is really important.

    00:13:31.160 --> 00:13:47.699

    Amanda Bickerstaff: But before we bring up the panel, there were a couple of questions about this being a first step, right? So this focused on teachers. But do you have a plan, or know of a plan, to bring students into the fold in terms of following up with similar questions to them about the potential harms and benefits?

    00:13:49.560 --> 00:14:08.413

    Andrew Buher: Yeah, we've seen quite a bit of research already that has surveyed students. We've seen some from Pew, and we've seen some from the Walton Family Foundation, or at least that was commissioned by the Walton Family Foundation. And, you know, I would say most of that research is focused on

    00:14:08.750 --> 00:14:24.359

    Andrew Buher: usage practices, right? How young folks are interacting with the technology, in what ways they are utilizing it, and the extent to which they're utilizing it. I haven't seen anything that has gotten really deep on potential impacts to

    00:14:25.145 --> 00:14:47.054

    Andrew Buher: wellbeing. And so there is certainly an opportunity, I think, to broaden the conversation and bring youth voice into a deeper dialogue about the potential impacts on their wellbeing, which is gonna be really, really critical. I think in the United Kingdom you are starting to see some

    00:14:47.998 --> 00:14:53.480

    Andrew Buher: youth voice coalesce around concerns around screen time and social media

    00:14:53.929 --> 00:15:05.220

    Andrew Buher: which is really, really interesting. But I will try to find and drop the links for the research that I've seen from Pew and Walton about

    00:15:05.650 --> 00:15:09.380

    Andrew Buher: youth interacting with these technologies as this conversation goes on.

    00:15:09.650 --> 00:15:33.710

    Amanda Bickerstaff: That's great. And if my team could also find the Ofcom online survey, that would be great as well. It's a UK study. But as a little bit of a teaser, Andrew, and for people here: we're actually going to be starting a Students in AI Education community that's gonna be a partner to our Women in AI Education community, for exactly that reason of having more opportunity for students that are concerned about the impact, or just want

    00:15:33.710 --> 00:15:58.659

    Amanda Bickerstaff: to make sure that student voice is part of the conversation, 'cause I think that's gonna be so important moving forward. So really excited to now bring up our panel. We actually have 2 amazing panelists: Rebecca, who was supportive of this research, and also Marina, who is still doing the work as well. So I'm gonna call up both, and Marina can come on video as well. And what we're gonna do is have a bit of a discussion.

    00:15:58.660 --> 00:16:26.900

    Amanda Bickerstaff: And I know there was a question in the Q&A about potential risks and harms, and going deeper into that. So we're gonna take a pragmatic approach, a balanced approach, which I know all of us here love to have. We are gonna talk about concerns, but also potential benefits, because we always wanna have that balanced approach. But every single webinar starts the same way with our panelists, which is: do you mind introducing yourself and then talking about the first time that you interacted with generative AI? We're gonna start with Marina, if you want to go first. That would be amazing.

    00:16:27.660 --> 00:16:38.586

    Dr. Marina A. Badillo-Diaz: Hi, everybody! Thank you again for putting together this webinar and panel. My name is Dr. Marina Badillo-Diaz. I am a school social worker who has been working in education for the past 10 years.

    00:16:38.890 --> 00:17:03.759

    Dr. Marina A. Badillo-Diaz: I started in K through 8 education as a school social worker, moved into a leadership role and position as a youth development director at an alternative transfer high school setting here in New York City. But most recently I've been teaching school social work at the master's level, so teaching future social workers, and also doing consulting in K through 12 schools. And so a bit about my entry into using generative

    00:17:03.760 --> 00:17:24.150

    Dr. Marina A. Badillo-Diaz: AI: it really came about a year ago, as I'm scrolling on my consultant Instagram account and seeing these reels on generating content with AI, saying put these prompts into ChatGPT. And I'm a social worker; I didn't go to school for marketing, and it's a really big challenge of mine to put together content for my social media pages.

    00:17:24.150 --> 00:17:37.509

    Dr. Marina A. Badillo-Diaz: And I just found it so useful and practical to try ChatGPT, and it really just grew from there, as I'm doing multiple projects at the same time. I just found it so useful, not just in my business, but also as a practitioner as well.

    00:17:38.020 --> 00:17:58.369

    Amanda Bickerstaff: That's great. And I love that feeling of, what is this thing that everyone's talking about? But I love that you actually use it for productivity as well, 'cause for someone that's as entrepreneurial as you are, and also still doing the work in schools, I think it's a great way to find some balance, which is amazing. So, Rebecca, same question to you. Would love to know more about you.

    00:17:58.790 --> 00:18:21.160

    Rebecca Oliver: Sure. Hi, everybody! My name is Rebecca Oliver. I also wanna thank Amanda and her team for pulling this together, and Andrew for looping school social workers in on this really important research. I started as a school social worker and worked in school settings for over 20 years, and then I've been with the School Social Work Association of America for almost 10,

    00:18:21.170 --> 00:18:25.910

    Rebecca Oliver: and so we're just really excited to be able to

    00:18:26.030 --> 00:18:32.960

    Rebecca Oliver: really get in on these conversations, as we just see, at least I have seen, AI just

    00:18:33.030 --> 00:18:45.840

    Rebecca Oliver: kind of explode over the last couple of years, and just seeing its impact. I'd say maybe my first experience with it was some chatbots that I realized really weren't humans and

    00:18:46.090 --> 00:19:00.479

    Rebecca Oliver: or just regurgitating some information to me, and then did become familiar with Chat Gpt last year through my husband and his work and some things. Some ways he was stepping into that area.

    00:19:01.090 --> 00:19:23.310

    Amanda Bickerstaff: Yes, and I think that chatbots were pretty bad for a really long time, and then they suddenly got pretty good, although some of them are still really, really bad. I think we can all agree on that. And then, Andrew, you've already introduced yourself, but what was your first experience? I know you kind of went all in a bit. For those that don't know, we met at a think tank around this. I know you've been thinking about it a lot.

    00:19:24.310 --> 00:19:36.958

    Andrew Buher: Yeah, I have. But it's funny, Amanda. My wife and I were talking at dinner maybe a week ago about sort of the inverse relationship between age and first experiences.

    00:19:37.758 --> 00:19:57.079

    Andrew Buher: You really have to try hard to create new experiences as you get older. I don't remember interacting with generative AI for the first time, so I certainly missed that first experience. But, you know, shortly after this sort of boom a couple of winters ago,

    00:19:57.791 --> 00:20:03.159

    Andrew Buher: There was someone that effectively rewrote

    00:20:04.406 --> 00:20:11.190

    Andrew Buher: a national education law using ChatGPT, and so I remember that sticking out very prominently.

    00:20:12.540 --> 00:20:35.055

    Amanda Bickerstaff: Oh, yeah, I think there was a lot of that, very much in the beginning days, where we started to see some content that probably shouldn't have been created with ChatGPT, or at least should have been disclosed. So all very interesting. We use this kind of as our workshop test, because we all come to it in different ways, and I think this has become part of our lives in such a unique and intentional and unintentional way, right? Because it's something that's

    00:20:35.290 --> 00:20:47.079

    Amanda Bickerstaff: been what we talk about. If you're on LinkedIn, you hear this all the time, right? It feels like it's ever-present. But it is really interesting, because, just to ground us back in the research, there is still an enormous knowledge gap

    00:20:47.080 --> 00:21:12.050

    Amanda Bickerstaff: in schools around these technologies and their potential impact, and I think that's a big part of what we need to think about going forward. But I think one of the most interesting things is that AI chatbots, especially, are designed to feel like there is a human or human-like intelligence on the other end. And that's what I really want us to focus on with this first question about student wellbeing. We're gonna talk specifically around

    00:21:12.050 --> 00:21:29.180

    Amanda Bickerstaff: AI, not just AI as a larger field, but a lot of these use cases like Character.AI or Snapchat AI, or other tools that students are using. I would really like to understand what you think the actual impact on student wellbeing could be with these generative tools, starting with Marina.

    00:21:31.730 --> 00:21:55.199

    Dr. Marina A. Badillo-Diaz: So, yeah, I mean, I definitely think there's gonna be a wide sort of range of social-emotional impact from the use of these generative AI tools. And again, I wanna acknowledge that our youth are our most vulnerable. They are in periods of life where their brains are still developing, and they're still developing socially. And so as they're using these tools as young people, I think there's a lot of

    00:21:55.630 --> 00:22:20.489

    Dr. Marina A. Badillo-Diaz: unintentional impact this could have. And we're gonna talk about this in the webinar, both the positive uses, but also some ethical and, again, social-emotional concerns. So for me, what I think the impact can be on the negative end is increasing isolation, increasing anxiety, increasing depression. I think about the potential impact

    00:22:20.550 --> 00:22:27.850

    Dr. Marina A. Badillo-Diaz: of bullying, and how that could really be upped and amt in in cyber bullying. Through these uses and tools of generative AI.

    00:22:28.820 --> 00:22:53.589

    Amanda Bickerstaff: Yeah. And I mean, I think we see that, right? Deepfakes, non-consensual fake nudes, the ability to bully not just kids but adults as well: that's a really big, meaty potential impact that we have already seen, right? We don't have to wait for that to happen. And it seems to have just fallen into patterns that already existed in our schools before, but now they're being hypercharged, because

    00:22:53.590 --> 00:23:10.299

    Amanda Bickerstaff: that ability to create things that feel real is is so much at our hands now. So I think, thank you for calling that out, and we'll actually, I think Amanda, from our team can drop in the deep fax piece that we did recently to to that point as well. Rebecca, do you like? What do you think the impacts gonna be and suit well, being, or has already been.

    00:23:11.160 --> 00:23:24.116

    Rebecca Oliver: Yeah. You know, Andrew touched on this in his intro: student mental health issues have been increasing, and at alarming levels, to where we're calling it a youth mental health crisis.

    00:23:24.440 --> 00:23:45.399

    Rebecca Oliver: And so I think, as Andrew said, if there's going to be any impact, even if it might be slight, we need to look at it very carefully. And a lot of the things Marina mentioned as well, you know, we're working with our most vulnerable. We as adults have to be the ones that step into that space and get the information and help youth.

    115

    00:23:45.895 --> 00:24:09.200

    Rebecca Oliver: You know, I know I've seen some things that I thought were real, and my husband was like, that's not real, that was made up with AI. And if, as an adult, I'm not able to discriminate and use other knowledge I have to determine what's true and what's not true, how can we expect young people to do that? So I think we have to step into that space to make sure our children are safe,

    116

    00:24:09.260 --> 00:24:22.630

    Rebecca Oliver: and that we need to ask the questions and be really critical in thinking about the impact or the potential impact. And I think, as we've already said, a lot of it's unknown. We, you know, we really don't know.

    117

    00:24:23.380 --> 00:24:27.049

    Amanda Bickerstaff: Yeah. And I love that idea, though, about, like,

    118

    00:24:27.110 --> 00:24:50.540

    Amanda Bickerstaff: starting to think through that now. It's like one of those pieces where it doesn't have to be negative, but should we be looking at it as if it is? And actually — instead of, like, social media — what if we had actually taken a moment and said: new technology in kids' hands, what would that mean for their mental health and wellbeing? How much different the world would be right now, potentially, for these young people. So it's not even a deficit mindset. If this is going to be in the hands of students that are using it for

    119

    00:24:50.970 --> 00:25:08.810

    Amanda Bickerstaff: like AI girlfriends, or chatbots like Pi — which were brought up by someone in the audience — that are designed to be emotional support, these are going to have impacts, because it's never happened before. We've never had synthetic relationships or, you know, artificial intimacy before. Right? So, like,

    120

    00:25:08.810 --> 00:25:25.370

    Amanda Bickerstaff: let's pay attention to it from the perspective of student mental health and wellbeing — and, I think, adult as well. So to you, Andrew: what do you think in terms of that impact, based on both the research, but also — I know you have some pretty strong statements about this moment in time, that we need to stop,

    121

    00:25:25.460 --> 00:25:28.990

    Amanda Bickerstaff: reflect, and be intentional. So if you talk a little bit about that, that would be great.

    122

    00:25:29.270 --> 00:25:40.729

    Andrew Buher: Yeah, sure. So a couple of points, and some of these are exclamation points on what Rebecca and Marina just said. But you know, I think obviously there is

    123

    00:25:40.750 --> 00:26:09.969

    Andrew Buher: significant potential upside from a teaching and learning perspective, from a school operations perspective, and potentially even from a social connection perspective, if these tools — and when I say these tools, I'm specifically talking about tools that are powered by generative AI — are used within a set of really thoughtful guardrails that are optimizing for kids' wellbeing.

    124

    00:26:10.331 --> 00:26:22.258

    Andrew Buher: But I think the knotty thing that we have to untangle is that there are also risks associated with these technologies, and they're hard to qualify and quantify right now.

    125

    00:26:23.063 --> 00:26:29.929

    Andrew Buher: But I think we are increasingly convinced that if we don't attempt to mitigate those risks upfront.

    126

    00:26:30.040 --> 00:26:52.449

    Andrew Buher: we never actually get to the full benefits of these technologies, because folks will begin to lose enthusiasm around them, right? And they'll be harder to adopt at scale, in schools and outside of schools, frankly. So I sort of think about, like, 2 questions here. One is extremely

    127

    00:26:52.560 --> 00:27:01.749

    Andrew Buher: practical, or practical-ish, and the other is pretty theoretical, and maybe bordering on the philosophical. So the first

    128

    00:27:01.910 --> 00:27:11.379

    Andrew Buher: question is, really, how can we leverage generative AI tools to enhance social connection, not erode it?

    129

    00:27:12.175 --> 00:27:28.729

    Andrew Buher: And I think there are some early indications that there are some use cases where these tools can be used to have students working not independently but in small groups, right? And reflecting on each other's work,

    130

    00:27:29.269 --> 00:27:42.060

    Andrew Buher: and so on. The philosophical question — one that I've been thinking about, and I wouldn't say it keeps me up at night yet, but it's sort of trending in that direction — is:

    131

    00:27:42.648 --> 00:27:44.989

    Andrew Buher: what does it mean for

    132

    00:27:45.030 --> 00:27:52.489

    Andrew Buher: children to live in a world where it is very hard to distinguish what is true and what is not,

    133

    00:27:52.680 --> 00:27:57.579

    Andrew Buher: right? What does that do to kids' experience? And how does that inform

    134

    00:27:58.100 --> 00:28:18.120

    Andrew Buher: their mental and social and behavioral wellbeing? And so that is a question that we are trying to parse. But I think, Amanda, it goes right back to your point about deepfakes and misinformation as the most prominent negative use case that's popped up so far.

    135

    00:28:19.660 --> 00:28:43.630

    Amanda Bickerstaff: Yeah. And this is a really interesting one — Character AI. So there are lots of student-facing tools that are either intentional or not intentional. You've got your Magic Schools, your SchoolAIs, or missus, that are chatbots that are supposed to be pedagogically focused. And then you've got the ones that are more like students find on their own, right — the consumer technology. So the Snapchat AIs, etc.

    136

    00:28:43.660 --> 00:29:07.590

    Amanda Bickerstaff: But I think the most interesting one is Character AI. So Character AI is a tool that has 18 million avatars that you can talk to, and you can actually create your own. So you could talk to Marie Curie or Naruto, or, you know, you could make it up. And what's really interesting is that they're very popular with 16-to-24-year-olds, and even somewhat younger students, with average use of around 2 hours.

    137

    00:29:07.710 --> 00:29:25.919

    Amanda Bickerstaff: And then what they see is that, you know, they have this new feature called group chat, which is a combination of real people and bots. But the thing that is so fascinating to me is that the only way you know what is real versus an avatar run by a generative tool is if you go into the group settings.

    138

    00:29:26.110 --> 00:29:39.589

    Amanda Bickerstaff: So it is indistinguishable. And it is actually something in which the group itself is starting to blur the lines between real and not real. And that is what's popular — that's 20% of the site traffic of ChatGPT —

    139

    00:29:39.590 --> 00:30:04.570

    Amanda Bickerstaff: so it actually is something that I think is going to be really interesting. But I do think, as we start thinking about these tools — we might think of them as generative tools that we bring into schools, but really our focus right now is what kids are bringing with them that we may not know that they use. And so our next question is gonna be a little bit of a combo. We're gonna look at this idea of: what do you actually think is your biggest fear or concern? But then, what can we do as social workers,

    140

    00:30:04.570 --> 00:30:13.729

    Amanda Bickerstaff: or those that are in and around students to start mitigating that. And the answer may be, we don't know exactly yet, but I would love to get your perspective on what you think we can do.

    141

    00:30:14.270 --> 00:30:15.500

    Amanda Bickerstaff: Sorry — we'll start with Marina.

    142

    00:30:16.020 --> 00:30:39.678

    Dr. Marina A. Badillo-Diaz: Yeah. So again, thank you for raising this really important question, because we do have to acknowledge the fear and have these conversations more deeply. So definitely my biggest concern is that we know mental health issues are on the rise — adolescent suicidality rates are on the rise. And now I think about generative AI and bullying, and how this could continue to exacerbate that. So that's definitely my biggest fear: the cyberbullying and the impact on

    143

    00:30:39.960 --> 00:30:48.449

    Dr. Marina A. Badillo-Diaz: depression and suicidality. So I think a real response that we have to talk about deeply now, at this point in time, with

    144

    00:30:48.791 --> 00:31:13.389

    Dr. Marina A. Badillo-Diaz: our school administrators, school psychologists, school counselors, school social workers, and our educators is thinking about how we're gonna be addressing and providing crisis intervention supports for those students who are impacted by these deepfakes. You know, this is gonna be a real source of concern. And so I think about, again, the intervention aspect as a school social worker, of really

    145

    00:31:13.390 --> 00:31:36.140

    Dr. Marina A. Badillo-Diaz: supporting their mental health and connecting them to resources, but also addressing this as a school community, so that we can create safe learning communities for all. And so I think about that more preventative stance as well — that us school counselors and school social workers, we need to have these conversations with our students about responsible and safe use, and also about addressing cyberbullying and the impacts that it has on mental health.

    146

    00:31:37.060 --> 00:31:52.770

    Amanda Bickerstaff: Oh, man, I love that. I mean, sometimes we wait till something happens, and then we create the plan. It's very reactive and not proactive. But if we already know — again, it's already happening, this is how kids interact already, this is something that could exacerbate this —

    147

    00:31:52.770 --> 00:32:17.720

    Amanda Bickerstaff: like, let's be really, really intentional about doing that first, and being anticipatory. So, say the school year is about to end — do you have a plan for this before the next school year? Do you have a training for students? Is there a decision tree of next steps? Are there examples? I think that is such a unique and positive conversation to have. And it kind of goes next to guidance, but it's even different than that, right? And I think that could be — it's super, super

    148

    00:32:17.720 --> 00:32:29.289

    Amanda Bickerstaff: tactical, which I love, Marina. But I do think that could be something that would really, really be helpful for next year and going forward. So that's really awesome. So, Rebecca, what do you think? What is your biggest concern, and then how can we mitigate that?

    149

    00:32:30.000 --> 00:32:59.420

    Rebecca Oliver: Yeah. So it's hard for me to come up with just one. You know, I wanna elevate a little bit of what school social workers who were included in this data raised — some of those issues and fears and concerns that came up, things that Andrew touched on. You know, concern about the impact on mental health, as Marina mentioned, and also how it interacts with the cognitive development of these young people; potential exposure to inappropriate content, to inaccurate content, to biased content.

    150

    00:32:59.862 --> 00:33:17.569

    Rebecca Oliver: You know, those concerns — potential cyberbullying and confidentiality risks. So there are a lot of them. And I think that, coupled with, again, indications from the data that many school staff are concerned that families don't have the information they need to help

    151

    00:33:17.963 --> 00:33:45.929

    Rebecca Oliver: put those guardrails around their young people. And also, again, that school social workers and other school professionals aren't being included as much as needed in those conversations. I think those are my biggest takeaways. Just that there is such an immense amount of information in this area, and many parents and families are just trying to get food on the table and work and get their children to appointments,

    152

    00:33:45.930 --> 00:33:58.900

    Rebecca Oliver: and having the time to wade through all this information is challenging. And so we just need to bring as many people, like Andrew said, that are close to children into these conversations.

    153

    00:33:59.250 --> 00:34:28.960

    Amanda Bickerstaff: That's great. And I just wanna call out Patty in the chat, who's also talking about doing an AI literacy event — not just for, like, teachers and students, but with families. That could be a wonderful way to start to build those relationships too, 'cause one of the things that is really unique about this is that we can help parents actually help their kids more with these tools. The same productivity gains to be had for teachers using these tools effectively could be the same for students, and I think equipping parents with that, too, could help with some of those

    154

    00:34:28.960 --> 00:34:45.610

    Amanda Bickerstaff: deeper connections that we sometimes miss as social workers, right? We have trouble getting that cycle going and really getting that engagement with parents for a long period of time. So I think there are some other opportunities there as well, which is super, super great. So, Andrew — yeah, go ahead, go ahead.

    155

    00:34:45.610 --> 00:35:14.799

    Rebecca Oliver: Sorry, just because I didn't really swing around to the last part of that question, of how school social workers can assist with this: I think school social workers very much are a link between home, school, and community. And so school social workers are uniquely positioned to help draw parents into those conversations, to put together resources, to maybe offer trainings, or find those trainings that can be offered. And so I think they can be a really impactful part of the conversation.

    156

    00:35:15.000 --> 00:35:16.020

    Amanda Bickerstaff: Absolutely.

    157

    00:35:16.290 --> 00:35:18.599

    Amanda Bickerstaff: And so, Andrew, to you. Same question.

    158

    00:35:19.130 --> 00:35:25.769

    Andrew Buher: Yeah, sure. I might filibuster a little bit here, Amanda. First, just on the question itself:

    159

    00:35:25.820 --> 00:35:34.930

    Andrew Buher: I think we should avoid using the term fear, right? It's probably a disservice for kids, just like treating generative AI

    160

    00:35:35.477 --> 00:35:42.982

    Andrew Buher: powered products and tools as a silver bullet, which we know just doesn't exist in education. So

    161

    00:35:43.790 --> 00:35:53.099

    Andrew Buher: I am unclear about the impacts. So I'm gonna talk about what I'm concerned about more broadly, which is speed

    162

    00:35:54.540 --> 00:35:55.470

    Andrew Buher: and

    163

    00:35:56.236 --> 00:35:57.529

    Andrew Buher: I think

    164

    00:35:57.600 --> 00:36:02.019

    Andrew Buher: we have a tendency across the education ecosystem

    165

    00:36:02.660 --> 00:36:07.699

    Andrew Buher: to operate under the adopt-or-be-left-behind

    166

    00:36:07.750 --> 00:36:10.510

    Andrew Buher: paradigm, which

    167

    00:36:11.110 --> 00:36:18.969

    Andrew Buher: I think the evidence would indicate is probably a fallacy, particularly as it pertains to education technology tools.

    168

    00:36:19.714 --> 00:36:24.859

    Andrew Buher: But what that paradigm does is it forces folks to

    169

    00:36:24.930 --> 00:36:30.347

    Andrew Buher: act to adopt emerging technologies at scale too quickly.

    170

    00:36:31.110 --> 00:36:35.610

    Andrew Buher: And so I think a better approach, a more practical approach,

    171

    00:36:35.690 --> 00:36:39.200

    Andrew Buher: would be small, contained,

    172

    00:36:39.280 --> 00:36:43.480

    Andrew Buher: carefully evaluated classroom-based pilots.

    173

    00:36:44.390 --> 00:36:45.869

    Andrew Buher: That are

    174

    00:36:46.576 --> 00:36:54.449

    Andrew Buher: staffed by cross-functional, holistic teams, right? Educators, school-based health staff, administrators,

    175

    00:36:55.750 --> 00:36:57.010

    Andrew Buher: and

    176

    00:36:57.440 --> 00:36:58.549

    Andrew Buher: and I think

    177

    00:36:58.640 --> 00:37:01.649

    Andrew Buher: all the folks participating in those pilots, by the way,

    178

    00:37:02.165 --> 00:37:19.620

    Andrew Buher: should have really deep, robust, rigorous AI literacy training, so that they are well informed about what they are going to encounter as they begin to engage with these tools and technologies. I think the real challenge here, right, is

    179

    00:37:19.720 --> 00:37:30.360

    Andrew Buher: there isn't a lot of evidence around efficacy related to outcomes for any of these emerging technologies yet,

    180

    00:37:30.830 --> 00:37:38.579

    Andrew Buher: and so I think it would be really compelling and potentially, really helpful for the technology itself

    181

    00:37:38.900 --> 00:37:55.190

    Andrew Buher: to get some proof of concept that these tools can really move the needle on student outputs and student outcomes. And I think the best way to do that is by these small, contained pilots that are really carefully evaluated and staffed by cross-functional teams.

    182

    00:37:55.430 --> 00:38:10.950

    Amanda Bickerstaff: Yeah. And I think what's really fascinating — so Microsoft right now is setting up a pilot, or they call it a private preview, because it can't be a Copilot pilot, apparently, for 13-to-17-year-olds using Copilot.

    183

    00:38:11.020 --> 00:38:39.590

    Amanda Bickerstaff: But what's really interesting is that it's so tech-led. In the same vein, I'm not sure what happened with Magic School or SchoolAI or others — we don't have any evidence or any structure or support to actually know how these pilots are run before they're released widely. And we had a question about the tools that are pedagogical, and I'm gonna use this: gen AI is not pedagogical, everybody. Even with the best fine-tuning and support, it's not a thinking machine. It's not a human. It's not a teacher.

    184

    00:38:39.590 --> 00:38:50.030

    Amanda Bickerstaff: But I do think there's a question of how we actually start identifying those tools that we know are already impacting students, and building space for those cross-functional teams, because

    185

    00:38:50.030 --> 00:39:15.030

    Amanda Bickerstaff: we have this bad habit of, like, not coming together when something new happens and actually creating evidence in a non-siloed way. It's happening in tech, it's happening in schools and districts, but in very uneven ways. So maybe this is a call to action to actually have something that is like: we're gonna test the top 5 student-facing products and actually see what this means. And we're going to include a mental health and wellbeing component — not just a pedagogy, not just an engagement, not just

    186

    00:39:15.030 --> 00:39:37.350

    Amanda Bickerstaff: a student outcome, learning outcome piece. I think that would be really unique. And because we're all interested in this, maybe there's an actual opportunity for it to happen instead of just being talked about. So that's really interesting. The next question — this is our last big question — is: what do you actually think is the positive that could happen around student wellbeing? There's been research from a typical and others that shows that

    187

    00:39:37.350 --> 00:39:52.719

    Amanda Bickerstaff: people are using these tools for mental health and wellbeing — at least I typical said that there was a lowering of suicidal ideation for some adults using these tools. We know they're using them. So, what do you think is the opportunity for student wellbeing supported by gen AI?

    188

    00:39:54.640 --> 00:40:19.569

    Dr. Marina A. Badillo-Diaz: So I think about this in 2 ways, the 2 opportunities. One is actually the practitioner using these tools to support student wellbeing. So I think about my role as a school social worker: we know we have high caseloads. It might be only one school social worker for several grades or a whole school setting. So I have multiple, multiple students that I'm supporting for their mental health. So to have these time-saving tools to help me be more efficient at my job —

    189

    00:40:19.570 --> 00:40:42.619

    Dr. Marina A. Badillo-Diaz: I could use generative AI to help me create intervention plan templates. I could have it help me generate ideas for counseling activities in my office. I could also use it to help create, you know, suicide awareness and anti-bullying campaigns. I think there's just immense potential opportunity for these tools to really support and enhance the mental health services that our practitioners, like school social workers

    190

    00:40:42.620 --> 00:40:57.879

    Dr. Marina A. Badillo-Diaz: and school counselors, have in schools. And then the other piece is something else that we've been talking about, which is using AI as a companion to really support youth mental health. So I think about an app like Alongside, which is a

    191

    00:40:57.950 --> 00:41:22.060

    Dr. Marina A. Badillo-Diaz: chatbot created specifically for adolescents to use alongside with schools. And it's created and developed by clinicians. And this chatbot is available in partnership with schools for students to use. And what I also really appreciate about this model is that students are accessing it to ask questions, to talk about the problems that they're experiencing.

    192

    00:41:22.374 --> 00:41:46.940

    Dr. Marina A. Badillo-Diaz: Sometimes they just need someone as, like, a sounding board to validate their feelings. But once it crosses the threshold where a student is reporting ideation — that they're suicidal, that they're in crisis — that's when they contact the school. So there is that human crisis component of that partnership, so the school staff can then support that child who's in need and who's in crisis. So I think about how this won't be replacing our

    193

    00:41:46.940 --> 00:42:06.320

    Dr. Marina A. Badillo-Diaz: school social work role, but it really will enhance the opportunity for us to support. And I think about equity and access: this can be a real potential to support students who may not have access to a school social worker or counselor, but then also doesn't eliminate the human element of the crisis piece. So I think there's a real opportunity there.

    194

    00:42:06.650 --> 00:42:27.219

    Amanda Bickerstaff: Yeah. And I mean, we are so understaffed, right? Especially in high schools, around counselors and social workers. So is this a place to be able to do more with what we have and uniquely put in the human component? We built a wellbeing tool in Australia — that was my last job — and did research, and we found exactly what Alongside finds: that students having the ability to ask for help,

    195

    00:42:27.220 --> 00:42:41.780

    Amanda Bickerstaff: or to identify it even on their own — and then also, even if they don't, having that come and be lifted up and prioritized and triaged to a human — is a really, really great way to actually identify those students that have problems that you do not see.

    196

    00:42:41.780 --> 00:43:02.860

    Amanda Bickerstaff: And I think this is where the ability to uncover what's below the surface, which we know is happening, is a really unique opportunity with a generative AI tool like Alongside — and we love Alongside. So that's really great. So, Rebecca, to you: what do you think is the best use case for gen AI and how it could potentially support student wellbeing?

    197

    00:43:03.250 --> 00:43:24.474

    Rebecca Oliver: So I have to say, it's funny — Marina and I have been in multiple meetings even this week together, and we did not talk about our answers to this, but she took a lot of the things that I wanted to say. But again, I think a big thing is we do know that there are a lot of time-consuming tasks and that schools are understaffed.

    198

    00:43:24.750 --> 00:43:36.609

    Rebecca Oliver: And so there's the potential for these tools to help us be able to reach more students and connect with more students. So again, I think the barrier piece of that question is

    199

    00:43:36.610 --> 00:43:45.619

    Rebecca Oliver: just ensuring that we're at the table to evaluate the various tools, to ask the questions, to determine which ones can be used in which ways.

    200

    00:43:45.620 --> 00:44:01.650

    Rebecca Oliver: and ensuring that we still have, like Marina said, that human touch — a real human that evaluates some of that critical information and is that human point of contact with the students that need the support.

    201

    00:44:02.340 --> 00:44:27.319

    Amanda Bickerstaff: That's great. And first of all, I'm so glad that you're having a lot of conversations about this, Rebecca and Marina — it makes me so happy that people are asking and listening. And the fact that we have over 200 people hanging out with us at, you know, the last weeks of school is a good indication. But it is interesting that you have so much overlap there, because I think that's where we can do a focus, right? It's something that we could start,

    202

    00:44:27.320 --> 00:44:46.290

    Amanda Bickerstaff: and then we could start to move from there. So — one of the things I love about Andrew is that he's always willing to be bracing, and also filibuster, and all these things. So I would be really interested in knowing what you think are the actual most positive potential impacts of gen AI on student wellbeing.

    203

    00:44:46.520 --> 00:44:50.341

    Andrew Buher: Yeah, so maybe this is gonna be counterintuitive coming from me, Amanda. But

    204

    00:44:51.220 --> 00:45:02.370

    Andrew Buher: you know, I think like, if you broadly agree that one of the primary purposes of school and schooling is to build both social skills and social connection for kids.

    205

    00:45:02.961 --> 00:45:07.540

    Andrew Buher: which we know obviously has significant bearing on overall wellbeing,

    206

    00:45:07.966 --> 00:45:12.729

    Andrew Buher: Then I think there's a real opportunity for generative AI to help

    207

    00:45:12.830 --> 00:45:14.699

    Andrew Buher: right? You know, I think most

    208

    00:45:14.750 --> 00:45:19.000

    Andrew Buher: social media is actually antisocial media.

    209

    00:45:19.160 --> 00:45:35.410

    Andrew Buher: But there are corners of those ecosystems that actually have created really rich opportunities for kids to find other kids, right? And so I think if we can sort of replicate

    210

    00:45:35.560 --> 00:45:43.849

    Andrew Buher: the good that is happening in pockets and corners of the social media ecosystem

    211

    00:45:44.000 --> 00:45:53.979

    Andrew Buher: and figure out what analogs exist that potentially can be amplified, expanded, leaned into

    212

    00:45:54.490 --> 00:46:06.540

    Andrew Buher: with generative-AI-powered tools and products, then there's a real opportunity not only to help kids build social skills and awareness, but social connection also.

    213

    00:46:08.270 --> 00:46:26.000

    Amanda Bickerstaff: I think that's great. And yeah, I think you're right. I mean, even things that we think will be fully bad for kids — like Covid was going to be — there were kids that flew with online education, and they'll continue to fly with something different. So I think it's absolutely

    214

    00:46:26.000 --> 00:46:50.880

    Amanda Bickerstaff: true. Of all things, I'm just gonna insert this one thing: I think there's an opportunity for neurodivergent support — like, supporting kids with neurodivergence — that's really, really high. Whether that's students that might not feel comfortable with written communication because they're dyslexic, or that have unique needs around sensory issues, or have not been able to communicate at all because they haven't.

    215

    00:46:50.880 --> 00:47:15.870

    Amanda Bickerstaff: They're considered nonverbal. To be able to now create images, videos — the ability to have an avatar of yourself that could express things you've never been able to express — I think is something that could significantly impact student wellbeing. I would love us to see more in those spaces first, as a use case, because I think we could really do pilots on that. But also the impact

    216

    00:47:15.870 --> 00:47:40.689

    Amanda Bickerstaff: that could be really, really large, really, really quickly. I mean, just giving someone a voice who never had one is a unique place where we could do this work together. So, we're coming up on time, so we have our last question. I know, we fixed it all, everybody — student wellbeing's fixed, we got it. But I do wanna stop with — we always like to be as practical as possible — what is your most practical strategy

    217

    00:47:40.690 --> 00:47:52.040

    Amanda Bickerstaff: that you want people to take away in terms of supporting student wellbeing at this time, or kind of starting to gather evidence, or whatever you think is most important. So, Marina, what's your idea?

    218

    00:47:52.540 --> 00:48:17.509

    Dr. Marina A. Badillo-Diaz: So my idea really stems from creating a multi-tiered support plan for your schools — really thinking about student wellbeing and AI adoption. And in that multi-tiered support plan, really thinking about having a cross-disciplinary conversation with administration, the IT tech person, the librarian, your school support staff like your school counselors,

    219

    00:48:17.510 --> 00:48:41.250

    Dr. Marina A. Badillo-Diaz: social workers, your educators — and really coming up with tiered interventions, from the crisis support to thinking about more of the preventive place of bringing families into this conversation: family workshops, but also student lessons and SEL lessons on digital citizenship and also cyberbullying. So I think there's a huge opportunity to really create this more thoughtfully,

    220

    00:48:41.250 --> 00:48:44.850

    Dr. Marina A. Badillo-Diaz: as a sort of multi-tiered intervention plan.

    221

    00:48:45.360 --> 00:48:59.530

    Amanda Bickerstaff: That's great — and if you're writing it, we'll support it; if you want to write it together, we're ready. I think, you know, having those resources in hand for the start of the school year that people can use is an amazing idea. Rebecca?

    222

    00:49:00.260 --> 00:49:28.989

    Rebecca Oliver: So mine's kind of two-fold. One is that piece of what Marina just touched on: making sure people are coming together to ask the questions, to have conversations with families. Amanda, something you had said in our pre-discussion about this webinar is that I think it's also important that we create a safe space where parents can say, "I don't know. What are my kids interacting with? What are these different things?" Because if we don't give that space,

    223

    00:49:29.370 --> 00:49:54.070

    Rebecca Oliver: you know, then they're not going to come to the table, and and we're not going to be able to have a full discussion. So that's one part of it. And then I think my other piece is balance is going back to what we know is good for kids. Wellbeing in general. Right? All those basic things of making sure students are having balance, rest and sleep, that they're having time outdoors

    224

    00:49:54.070 --> 00:50:05.500

    Rebecca Oliver: that they're in green and blue spaces, that they're drinking water and staying hydrated, the basic things: connecting with their family and friends, disconnecting from screen time,

    225

    00:50:05.570 --> 00:50:16.529

    Rebecca Oliver: and then learning all sorts of different coping skills, mindfulness and breathing and different things that help them manage and keep balance in their lives.

    226

    00:50:18.000 --> 00:50:19.690

    Amanda Bickerstaff: First of all, like.

    227

    00:50:19.990 --> 00:50:30.890

    Amanda Bickerstaff: we see this in the AI literacy and adoption work we do. It seems like we need to throw out the rule book because it's happening so quickly, but I think we need to lean into the rule book more.

    228

    00:50:30.890 --> 00:50:55.869

    Amanda Bickerstaff: And you know, those spaces are gonna give us the opportunity to do the foundations and then support them in the ways we know, because it's moving so quickly that we could try to keep up with it and change everything, but let's not. Let's do the things that we know work, and provide people spaces to be able to learn together. I know you didn't like the word fear, but some people are afraid, and maybe it's okay to be afraid in ways that are

    229

    00:50:55.870 --> 00:51:08.890

    Amanda Bickerstaff: intentional, not spiraling: intentional safe spaces that lead to dialogue and next steps. I think that's a way to get our hands wrapped around it, and the same thing with Marina: let's do the things we know really work,

    230

    00:51:08.960 --> 00:51:20.559

    Amanda Bickerstaff: and then talk about how we support this movement with generative AI, instead of the other way around, which I think is a really good piece of advice. So, Andrew, you're gonna bring us home. What is your most practical strategy?

    231

    00:51:21.157 --> 00:51:25.672

    Andrew Buher: I'm gonna try to bring us home, too. I would offer

    232

    00:51:26.280 --> 00:51:28.450

    Andrew Buher: two things. One is, I think,

    233

    00:51:28.550 --> 00:51:35.670

    Andrew Buher: following the evidence becomes incredibly critical here, right? With the technology advancing so rapidly,

    234

    00:51:37.463 --> 00:51:43.099

    Andrew Buher: understanding what the evidence says and asking very specific questions

    235

    00:51:43.130 --> 00:51:51.010

    Andrew Buher: about any technology that is being adopted and will be student-facing is super critical, right? And so those are questions like:

    236

    00:51:51.634 --> 00:51:58.749

    Andrew Buher: what evidence exists that the technology is successfully doing what it is intended to do?

    237

    00:51:59.339 --> 00:52:03.129

    Andrew Buher: What evidence exists that the technology will do no harm?

    238

    00:52:03.360 --> 00:52:06.200

    Andrew Buher: What evidence exists that

    239

    00:52:06.220 --> 00:52:13.176

    Andrew Buher: the builder of these technologies has optimized for student well-being. Okay? And then my second

    240

    00:52:13.660 --> 00:52:18.113

    Andrew Buher: suggestion is maybe a sort of

    241

    00:52:18.840 --> 00:52:23.439

    Andrew Buher: very, very non-sexy one. But I think

    242

    00:52:23.720 --> 00:52:30.220

    Andrew Buher: if we are trusting big tech, great. You shouldn't have any concerns.

    243

    00:52:30.280 --> 00:52:39.540

    Andrew Buher: If you don't trust big tech, then we should be thinking about how we get builders of these products

    244

    00:52:39.700 --> 00:52:45.220

    Andrew Buher: to optimize for student wellbeing, and I think the best way to do that is through procurement.

    245

    00:52:45.530 --> 00:52:47.900

    Andrew Buher: So I would be encouraging folks

    246

    00:52:47.920 --> 00:52:53.440

    Andrew Buher: to get their procurement officers and their districts to think about building

    247

    00:52:53.730 --> 00:53:07.239

    Andrew Buher: fundamental benchmarks across efficacy, data privacy and security, interoperability, and access, among other things, that

    248

    00:53:07.540 --> 00:53:13.130

    Andrew Buher: technology providers would need to meet to sell into schools. Right? I think the only way we shift

    249

    00:53:13.659 --> 00:53:22.740

    Andrew Buher: movement towards child wellbeing when we're building these products is by changing incentives, and the biggest opportunity we have to change incentives in K-12

    250

    00:53:22.930 --> 00:53:24.460

    Andrew Buher: is our purchasing power.

    251

    00:53:24.750 --> 00:53:41.790

    Amanda Bickerstaff: Okay, yeah. And I agree. And I think that what you just said goes way beyond generative AI. It goes to, like, we need to do better with our technology in schools, and I think that's just, across the board, a really important piece. But this might be what's really cool about this moment in time. This could be the lever

    252

    00:53:41.790 --> 00:54:06.780

    Amanda Bickerstaff: for getting that for more than just generative AI, because people are paying attention in ways they've never paid attention before. And so I think that's a really great point. I just wanna say thank you. I know we can't do hearts, but I'm gonna do a heart for everyone that's in the audience today. Thank you so much to our amazing panelists: Andrew and the team at Opportunity Labs, Rebecca and everyone at SSWAA, and Marina. We just really appreciate you putting this time and effort and evidence into the world

    253

    00:54:06.780 --> 00:54:31.739

    Amanda Bickerstaff: at this important time. It's just really, really important, and we want to hear more of this, so I love that we're able to have this discussion. And also, just thank you everyone for coming. I mean, we kept almost the whole group the whole time, with a great conversation in the chat, and we just wanna continue this work. So please, if you have good ideas and resources, we're always happy to hear them. I just wanna say thank you to everyone for taking a moment out of your time to think about students in a

    254

    00:54:31.740 --> 00:54:51.210

    Amanda Bickerstaff: meaningful way. That means a lot, and I think it shows a lot for what we can do to bring this back to our schools and systems worldwide at this point. So thanks, everybody. I hope you have a good morning, night, or lunch, or go to bed, whatever time it is. We appreciate you all, and we look forward to having you all for our next webinar in a couple of weeks. Thanks, everybody.