Data Swap: From State Data Manager to IDC Technical Assistance Provider

24:22
 
Content provided by IDEA Data and IDEA Data Center (IDC). All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by IDEA Data and IDEA Data Center (IDC) or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described at https://fa.player.fm/legal.
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you!
You can contact us via the Podcast page on the IDC website at https://ideadata.org/.
### Episode Transcript ###
00:00:01.52 >> You're listening to "A Date With Data" with your host, Amy Bitterman.
00:00:07.34 >> Hey. It's Amy, and I'm so excited to be hosting "A Date With Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day.
00:00:19.50 >> "A Date With Data" is brought to you by the IDEA Data Center.
00:00:24.61 >> Welcome to another episode of "A Date With Data." I am joined by several former data managers who are now IDC technical assistance providers. Austin Ferrier, Kelley Blas and Kristen DeSalvatore were all data managers up until a few months ago, and they're going to share with us their unique experience of going from being a data manager to now working to build the capacity of data managers and other state staff to improve the quality of their IDEA data. Welcome to all three of you. And I think, to get things going, it would be great if each of you could just briefly introduce yourselves, say a little bit about how long you were in the role as a data manager and what you're doing now. And, Kelley, do you want to start us off?
00:01:09.05 >> Sure. Thank you, Amy. Again, my name is Kelley Blas, and I was a data manager at the Department of Public Instruction in North Carolina. I was there for 17 years as the data manager and the last few years as the SPP/APR coordinator, and I'm very happy to have joined IDC as a state liaison and technical assistance provider, just love working with states.
00:01:35.64 >> Great. Thanks, Kelley. Kristen, do you want to say hello?
00:01:39.35 >> Hello, everyone. I am Kristen DeSalvatore, and I was the data manager at the New York State Education Department for about 11 years. In that role, I also served as [Indistinct] coordinator. In my role at IDC now, I'm working to provide technical assistance and support to states and LEAs related to the collection and reporting of education data and especially the special ed IDEA data.
00:02:08.19 >> Wonderful. Austin?
00:02:10.19 >> Yes, thank you for that introduction, Amy. Afternoon, everybody. My name is Austin Ferrier. I am the former IDEA Part B data manager for the Florida Department of Education. I was actually in that role for about a year and 6 or 7 months, and as I transitioned to IDC, I have been focusing on CA specialists and 618 data.
00:02:36.26 >> Great. Thank you. So it's wonderful to have all three of you, and I'm so excited to now be able to work with you on the IDC and technical-assistance side of things. I was fortunate enough to get to know all of you when you were data managers so really thrilled to have you now onboard with IDC. And to get things going, I'm wondering if you all could reflect a little bit on your time as a data manager, if there is advice that you might want to provide, if there are newer data managers. Austin, you were in that role much more recently, so it's a newer kind of experience for you, just thinking about things you would've maybe wanted to know as a new data manager. Kelley, do you want to kick things off?
00:03:21.54 >> Sure. So when I was reflecting on advice for new data managers, one of the first things that came to me was to utilize all of the technical-assistance resources that are available to you. I know that when I started 17 years ago, I wasn't really aware of what technical-assistance resources were available to me. Felt like there were times where I couldn't find exactly what I needed, but now in this day and age and especially with having IDC, there's a website that's super filled with resources. There are technical-assistance resources, such as state liaisons. I would definitely encourage folks, especially newer folks, to invite their state liaison to walk them through all of the resources that are available on the website. Also, the other thing is just having a plan in mind for what your year is going to look like around data collection, when it's being collected, when analysis needs to happen, when submissions are due and just having that calendar planned out that's kind of unique to your state. Not every state collects things at the same time, but we know that it's all kind of due at the ... It is due at the same time to OSEP, so having that calendar planned out for the year really helped me know when I was supposed to take leave if I wanted to take a vacation or make plans and when I was going to need to be really, really focused on my data-analysis activities.
00:04:50.06 >> Yeah. Kristen, do you want to give us some advice?
00:04:53.72 >> Sure, so I'm going to echo what Kelley said about the IDC resources and really advise folks to make the time to really investigate the IDC resources that are available to you and to reach out to your state liaison on a regular basis. Like Kelley, I didn't really realize the depth of support that is available and really wish that I had. I would also tell people to network and collaborate with others in your agency and really try to find allies in the Office of Special Ed or data shop that you can work with and count on to help you get the work done that you need to get done. And lastly, I would say don't be afraid of OSEP. They're not out to get you, that a lot of times folks feel like, "We're going to get dinged. We're going to have a big problem with OSEP." So keep in mind really that OSEP has rules and regulations that they must follow, right? So they are pretty scripted by what's in the IDEA law. And also keep in mind that we all have the same goals in mind, right? We want to improve the landscape for students with disabilities. That should really be the underpinning of all of the work that you're doing.
00:06:21.45 >> That's great to keep in mind and especially for newer folks who might be intimidated by OSEP, but know that they're there to support states in any way really that they can, so they're a resource and a partner. Austin, what advice do you have?
00:06:38.69 >> I do have to echo some of the sentiments that both Kelley and Kristen expressed, especially what Kristen mentioned about, regardless of what we do, it does all come back to the students. We might be looking at data timelines inundated with messages from OSEP and emails, but one of the biggest things that really clicked with me as a new data manager is that this all comes back to FAPE, providing a free appropriate public education for kids and those supports. So each time you see a number in an Excel sheet, any type of visualization you make, just try to keep in mind that that's a real student, a real person and that we are trying to do our best to support those groups. Another thing I would really advise new data managers to do, if you get the chance, please, please, I highly advise to attend one of the IDC Interactive Institutes, summits, or any type of project or presentation that IDC puts on. I know when I first made my initial trip for the SPP/APR summit, it opened up a whole new world outside the lens of Florida in terms of the level of collaboration, the level of support available, and it just really helped encapsulate what we are trying to do in our positions and in our positions as Part B data managers in support of LEAs. If your state allows you, I highly advise taking the trip to any of those Interactive Institutes or summits. And then just finally, just as Kelley was saying, IDC has data manager connection groups, data quality peer groups. The level of collaboration you can achieve in those specific safe spaces is amazing. I've had conversations with Kristen herself when we were both in that role in meetings where we were talking about some very sensitive topics, but that collaboration really allowed us to build our own state's capacity and just kind of build that knowledge base.
00:08:56.86 >> Yeah, so it sounds like what I'm hearing as a theme from all of you is really the collaboration, the relationship building because often in the state you might be a little isolated, the only one really doing a lot of the work that's kind of in your head and the importance of working across your SEA, and then also just reaching out and tapping other data managers is such an important piece of the role. So now that you've transitioned out of the role of being a data manager, Kristen, what's something that you're really going to miss about being a data manager?
00:09:33.63 >> I'm really going to miss the collaboration with my immediate colleagues in the data shop at NYSED as well as with the staff in the Office of Special Education. As a data manager, I really was the bridge between the two separate offices in the department, and I worked very hard to bring the data and build the data literacy in the Office of Special Education and did reap a lot of rewards from that and made some good connections with colleagues, so I will definitely miss that part of it. I'll also miss working with the data, and as Austin said, it's numbers on a spreadsheet, but the work that a data manager does, does have the potential to influence directly things for kids, right? So you do have the ability to influence practices and policies that do have that potential to make a difference really even in an individual kid's life, so that to me is something I will miss.
00:10:50.73 >> Thanks. Austin, what about you?
00:10:53.98 >> I just have to echo kind of what Kristen was saying in terms of that level of collaboration, but I will say probably one of the biggest aspects I'll miss, especially at the SEA state level, was the interaction I had with LEA exceptional education directors. I was a phone call away with almost every special director in every district in Florida, and having that connection and having that bridge really felt like the distance between the state and the LEA was lessened, and the trust was strengthened, so knowing that they could call me at any moment if they had a question regarding their data specifically, even at the school level, just building that trust and those relationships and having that direct assistance and seeing that have a direct effect in real time, one of the biggest things I'll miss. Second biggest thing I'll miss, I worked with some amazing individuals at the state level who, even outside of their career and job, were still focused on community outreach programs, were still going to school-board meetings after work. The passion was there, and it was evident, and I fed off of that, and I'll truly miss some of the individuals I ... They were superstars, rock stars.
00:12:21.34 >> Mm-hmm. Yeah. And, Kelley, what things are you going to miss?
00:12:26.55 >> It's funny. We're all kind of missing the same thing, but definitely for me, I'll definitely miss the friends that I made at the Department of Public Instruction across the agency, but the main relationship that I'll miss is, after 17 years and being involved in building our state special education data system, you develop some really strong connections and supportive relationships with our LEAs, and again, they knew that they could contact me, and I would know a little trick to get their data to go in just right or what the workaround was for our system, and so those kinds of phone calls, they're always so gratifying because they know that they can call you, and you'll figure it out for them. So those things I'll miss and definitely, definitely the data. I am a data geek at heart, I think, and so being able to look at those big data sets and knowing the rules of our state and how everything's supposed to fit together and how to present that visually where it makes sense when we're talking to our districts about their data, those are the top three things, I think, for all three of us that I'll miss.
00:13:41.01 >> Mm-hmm. Yeah, lot of similarities. I think as you get more into the role, the states that you're working with will sort of become like your districts were, and they'll be calling you, and you'll be calling them and build up that same type of relationship, so I think you'll still see that in somewhat different ways. And we touched on this a little bit already, but are there things that you want to mention that you as a data manager wished you had known more specifically about IDC and what IDC does and the services and the other TA centers as well? Austin, is there anything you want to mention?
00:14:18.98 >> Yes, yes, definitely I want to re-emphasize the area of safe space that IDC creates. I really do want to just express and emphasize IDC is not a punitive organization. They are not looking to ding you on any of your data pieces. It's a holistic examination of state processes as a whole in a ... It's pure assistance, so always keep that in mind. Come at it with a positive attitude and come at it as if we know where you're coming from, and Kelley and Kristen, they will agree. We've been in their shoes. So just having that empathy and knowing that we do try to create a safe space, and I really do hope states understand that.
00:15:16.08 >> Yeah, absolutely. Kelley, what are some things that you want to mention about IDC or other TA centers?
00:15:25.34 >> I think one of the things that I didn't realize about IDC that I know now just from my own experience as a state liaison is the multitude of ways that IDC and other TA centers can really come in and support states when it comes to the work that they're doing around data. For example, I'll be going to a state next month and helping facilitate a stakeholder meeting around some changes that they're making to their Indicator 4 methodology, and I wish that I had known that those kinds of supports were available to me as a data manager and to our team in general at the Department of Public Instruction because we could've utilized those supports in a way that would've made those meetings just richer and so just understanding the true depth of support. And I'll say it again because I didn't ... I definitely didn't understand this, but safe space, safe space, safe space. I think for a long time I felt like if I shared too much information with our TA centers that maybe I was airing my state's dirty laundry, and I didn't want it to be known that we were doing things wrong. I just wanted to help correct it. But knowing that these TA centers are here specifically to support us in improvement efforts and making things ... making our data stronger, it changes the whole understanding of what technical assistance really is, and I just ... I had some misconceptions for sure.
00:17:06.18 >> Great. And, Kristen, anything you want to add?
00:17:11.04 >> So, Kelley and Austin, you did a great job of taking the words right out of my mouth. I do ... One thing that I really didn't understand was how important it was or is to get to the conferences and institutes that IDC offers and pays for folks to attend. In New York, we had a hard time with travel, and we were finally allowed to go to ii23, Interactive Institute 23, and it was just mind-blowing for us. We were like, "Oh, my goodness, we have really missed out on a lot of in-person technical assistance and information and the collaboration and the networking and that piece of it," so that is one thing that I would highly, highly recommend is that all data managers but especially new data managers really work hard to be able to attend the in-person events that are offered.
00:18:19.61 >> Yeah, between all the data centers, all the TA centers, there really is such a wealth of expertise and knowledge and resources that we need to make sure all the states and staff and especially newer staff are aware of and can utilize. So kind of looking forward now, we talk a lot at IDC about what it means to be a data quality influencer and how everybody is a data quality influencer in different ways. You all as data managers were definitely, of course, data quality influencers in that role. Can you talk a little bit about some things you're excited about in your new role in terms of being a data quality influencer? And, Kelley, do you want to start us off?
00:19:09.72 >> So my mind is always whirling on what I could've used during my time as a data manager, and now that I'm with IDC and I know that part of the focus at IDC is creating tools and resources to help data managers and states have better tools to analyze and display their data and make meaningful change in their daily work and efforts, it's really just exciting to me to be able to think about what I could've used and how I can present those ideas and potentially create these tools to assist LEAs, for example, significant disproportionality or the indicators, just tools that will help them analyze and display data better.
00:19:53.13 >> Thanks, and really who better than the three of you and other former data managers to come on and help put those ideas and dreams you might have had as a data manager like, "It would be great if this existed, but I just don't have the time or the capacity to create it," and now can really be devoted to that and helping other data managers create those for them. Kristen, tell me a little bit about what you're envisioning in your role as a data quality influencer with IDC.
00:20:25.65 >> Well, Amy, as you said, the state will kind of become our LEAs or district, so I am looking forward to working with the individual states to provide one-on-one technical assistance as well as being thoughtful about updating existing resources to reflect new guidance, new practices and new perspectives and create new resources, knowing, as Kelley said, what I thought I might ... would be super helpful when I was in that role as a data manager. I am really excited to now be on the other side of the aisle with the deep understanding that I have of what data managers are going through, what their workload is like, how much information there is for them to process and get right, right ... There is very high-stakes data, this IDEA data ... and how important the work is. So if there is one data role, I think, in a state agency that you want to be a data quality influencer, it is around the IDEA Special Education data, right? It's super high-stakes, and I'm just very excited to be on the train.
00:21:50.25 >> Great. Yeah, we need a whole army of data quality influencers for this IDEA data. So, Austin, what ... How do you see yourself being a data quality influencer now?
00:22:02.51 >> Yes, Amy. When I think about this question, the word empathy keeps popping in my head. We ... Kristen and Kelley, we've been into EMAPS. We've had to submit our SPP/APRs on February. We've seen those emails from our bosses' bosses asking for updates or asking for some type of information that you have to put together quickly and translate it into something they can digest quickly. Having that empathy and having gone through the data-submission process, it really helps me understand what state SEA data managers are going through, and just knowing those feelings that they have, those crunch-time deadlines and just navigating your OSEP guidelines and then your state, federal ... your state guidelines and state legislation, just being able to ... Having that in the back of my head when I have conversations with states, just making sure that they know that we've been in those positions before. We know how you feel, and we're going to try to the best of our ability to help you in your data submissions.
00:23:26.18 >> Yeah, that makes so much sense. It is such value that you bring having had those experiences and gone through everything they're going through, and, like you said, having the empathy, that really adds so much to what you all bring. So thank you all so much for sharing with me and all of us your experiences as a data manager and now transitioning into this new role, what that looks like and what you're looking forward to and so happy and thankful to have you on.
00:23:59.55 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at ideadata.org.

Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date with Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey. It's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.58 >> Welcome to another episode of "A Date with Data." As part of a series of episodes about IDC's Data Quality Peer Groups, today we are featuring the EDFacts Coordinator Data Manager Data Quality Peer Group. The Data Quality Peer Groups are facilitated by IDC TA providers to bring state role groups together to discuss and collaborate on the data quality issues of greatest importance in states. To tell us about this group, I am joined by the group's facilitators, Kristen DeSalvatore and Audrey Rudick. Thank you both so much for being here. 00:00:59.80 >> Thanks, Amy. 00:01:01.51 >> Thanks for having us, Amy. 00:01:04.00 >> Great. To begin, for those who aren't part of the group or don't know much about the data quality peer groups, can you talk about what this group is like, who tends to participate? What is the structure and the format of the group, and maybe what are some of the topics that you tend to cover? 00:01:23.20 >> Sure. I'd be happy to tackle that. So this peer group was really started last year in 2023 to start facilitating conversations between part-B data managers and their EDFacts counterparts in response to the EDFacts modernization efforts that we're having. It's a little bit different in format as we send out invitations to all EDFacts coordinators and all part-B data managers each time we meet, which is monthly, sometimes every other month just depending on how much is going on in the world of EDFacts and in IDEA. During our meetings, we usually start with some sort of icebreaker and then present on topics, have conversations related to those topics. Those topics might include information about how data are reported to the US Department of Education and how they're used by OSEP for determinations in the SPP/APR. Kristen, I don't know if you had anything to add there. 00:02:24.07 >> Sure, so the 618 data is ... Really, the bulk of it is reported through EDFacts, and so that includes child count, LRE, personnel, exiting, discipline and assessment. So of course, those are topics that are going to be covered. We also do talk about things like data governance, data processes and data verification. And the way that the two roles, the EDFacts coordinator and the data manager, collaborate is different in every state. So we try and structure the topics and the meetings to provide best practices as well as that timely information that Audrey already mentioned. 00:03:10.58 >> Yeah, and I'll say, Kristen, it's been really great, working with you on this since you had both roles as EDFacts coordinator and data manager prior to joining IDC. 00:03:21.25 >> I do find that I undervalued that information when I was in a state. 00:03:30.60 >> But having that perspective I think definitely adds value to the group and as facilitator because you have been in their shoes. 00:03:38.81 >> Yes, all things EDFacts and most things ID. 
00:03:46.23 >> So now that we've gone through all of the EDFacts, EDPass data collections at least once, once, I think, for everything at this point, I'm curious to know recently, especially, what are some of the themes that you have heard emerging from states during these calls? 00:04:07.59 >> So I would say a big theme that we heard in January was around the assessment data being available in EMAPS. It is now not populated in EMAPS until right before the submission deadline, and it does not allow the states a lot of time to see it in EMAPS and react to it. So it's really critical that the EDFacts coordinators and the data managers are working together outside of EMAPS ahead of that population of the data to make sure that eyes are getting on the data from the IDEA special ed folks working with the EDFacts coordinators to make sure that there are changes that need to be made or data notes that need to be written, et cetera. 00:05:03.57 >> Yeah, I think that thinking through some of that collaboration and coordination around the SPP/APR because a lot of EDFacts coordinators I think understood that the data is used but not particularly how it's used. And I think that providing that information to multiple people within the state teams who are working with the data just really helps improve that data quality and helps to ensure that every bit of data that is being reported is accurate. 00:05:38.26 >> Yes. 00:05:39.43 >> Mm-hmm. 00:05:40.20 >> So there has been lots of discussion about how the data is pulled, how it's analyzed, how it's built into the EDFacts file, which speaks to data governance. 00:05:52.41 >> Mm-hmm. 00:05:53.16 >> And so all of that has been discussed widely and deeply. 00:05:59.51 >> What would metadata? 00:06:01.41 >> Oh, yes, and metadata ... 00:06:02.71 >> Everybody's favorite piece of the collection. 00:06:07.60 >> The way that data notes are handled now in EDPass is ... It's different now, so it used to be that your data was submitted, and then OSEP reviewed it, and months later you got a spreadsheet with data [Indistinct] issues that they wanted you to discuss or talk about, provide a data note for. But now all of that is done in EDPass prior to your final submission of the data, so that's a big change. 00:06:37.89 >> Yeah. How did it sound from the participants, I guess, as they got more used to using EDPass? Did it seem like there was more coordination and collaboration between the coordinators and data managers kind of as the time has gone on? Are there still challenges, I'm sure, that pop up? 00:06:57.53 >> Well, I think with staff turnover that we're seeing so regularly within states, it's great to have these ongoing conversations. So once you've been through it and you understand what's happening and you understand the data and where those data are coming from and how they're being reported within the new EDPass system, I think it makes a lot of sense, and it seems a lot more seamless, at least on the reporting side. Sometimes the EDPass side has had a few data hiccups as they're getting the new system up and running. So there's always ... is something to tweak. So it seems like everything is becoming more smooth on the state side behind the scenes as they're learning, as state people have learned their roles, their particular roles and how they're working together. 00:07:57.06 >> Mm-hmm. 00:07:58.64 >> But I think ... So this will be just our second year of Child Count, which, it's ... That submission starts in July. So we will see how that goes. 
00:08:10.68 >> Probably ... 00:08:11.47 >> Oh, and we might have year-to-year edits. 00:08:13.63 >> Oh, yeah. 00:08:14.13 >> Great. 00:08:14.56 >> That will be exciting. 00:08:15.66 >> Yeah. 00:08:16.96 >> Less anxiety this time hopefully than the first time around. 00:08:21.40 >> I think so. 00:08:22.37 >> For sure. 00:08:24.48 >> I know this group was created in part because of that need to try to improve and increase that communication and collaboration, coming up with the new EDPass system between EDFacts coordinators and data managers. What tips, recommendations, strategies do you all have to share that are related to maybe improving that coordination between the different role groups that might help other states to know about? 00:08:55.29 >> So as Audrey mentioned, that knowledge of how the data is used is very, very important. It's really ... The EDFacts coordinator needs to understand the high-stakes nature of the IDEA data. It's submitted through EDFacts, and there's lots of data that is submitted through EDFacts, but the IDEA data is some of the most scrutinized data that an SEA submits to the US Department of Education, and it has monitoring and fiscal implications for a state that just don't really exist in most other data sets. So for an EDFacts coordinator to really understand that, and even though there's a Child Count certification form that has to be signed by the state director or other authorized official, I'm not aware of any other data that has an actual certification that needs to be signed and sent in after it is submitted. Or actually it has to be submitted on the day that a state submits it. 00:10:09.37 >> Not for EDFacts files, for some of the other reports, I think there's certifications, but, yeah, certainly that. 00:10:16.18 [ Chatter ] 00:10:16.40 >> Yeah. 00:10:18.30 >> And I would just say communication is key. You have to ... those lines of communication open. 00:10:26.42 >> Yeah. Our favorite, I think, phrase is just to become besties. If you're a part-B data manager, schedule some regular check-ins with your EDFacts coordinator, particularly leading up to those deadlines so that you can work together and understand the data that are going into those EDFacts files and maybe some of your state's business rules that are embedded in that file creation process. 00:10:53.83 >> Yeah. A formal data governance structure is important, but Audrey just said, that more informal besties relationship between the EDFacts coordinator and the data manager is just super important and helpful, as well. 00:11:09.49 >> And hopefully this group provided an opportunity maybe for some states that didn't have those lines of communication entirely open, and this was a good jumping-off point to improve some of that. 00:11:23.86 >> I think so. 00:11:25.25 >> Yeah. I think we've had a lot of good engagement from states, and it's fun to see when both the part-B data manager and EDFacts coordinator come to the meetings together. 00:11:37.05 >> Mm-hmm. And maybe have some of those ah-ha moments, like, "Oh, I didn't know that you did this or that was what was happening." 00:11:43.71 >> Exactly, yeah. 00:11:44.90 >> Great. Well, thank you both so much for sharing about this great group, and if anyone wants more information, has questions, is interested in learning more about the group, please reach out to IDC at the ideadata@westat.com e-mail, and thank you both so much. 00:12:06.27 >> Thank you, Amy. 00:12:07.65 >> Thanks. This was great, to be here. 
00:12:10.59 >> To access podcast resources, submit questions related to today's episode, or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content. Or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date with Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey. It's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.59 >> Hello. Welcome to "A Date with Data." This is a very special episode that is part of a series that we're going to be doing to learn more about IDC's Data Quality Peer Groups. These are groups that are facilitated by IDCTA providers to bring state role groups together to discuss and collaborate around different data quality issues that are important in states. For this episode, I am joined by the facilitators of the SSIP Data Quality Peer Group, Jennifer Schaaf and Beckie Davis. Welcome to both of you. Thank you so much for being on. 00:00:58.19 >> Hey. 00:00:59.43 >> Hi, Amy. Thanks for having us. 00:01:01.17 >> Of course. So for those who may not be familiar with the IDC Peer Groups, what they're like, who can participate, can you start off just by telling us a little bit about the peer group? What are the roles that generally are involved in the SSIP peer group, for example? 00:01:19.06 >> Yeah. Thanks, Amy. Most of our peer group participants are SSIP coordinators, usually working out of the state departments of education. But some of the SSIP coordinators are consultants, and they may work out of another site like the university. 00:01:31.70 >> Okay. 00:01:32.57 >> Some of our participants may have other duties such as being in charge of the State Personnel Development Grant, or SPDG. 00:01:41.58 >> Mm-hmm. 00:01:42.07 >> But anyone who has any involvement in the SSIP work is always welcome. 00:01:47.18 >> Great, so it's not just if you're an SSIP Coordinator but if you're somehow involved in the SSIP. This is open for you, as well. 00:01:53.90 >> Yes. 00:01:54.95 >> Yes. 00:01:55.93 >> Great. So tell us about the structure or kind of the format and some of the topics that you cover in this group. 00:02:04.29 >> So our structure is very informal. The group is designed specifically to meet the needs of the participants. Our calls take place every other month from February to September, and then we start monthly calls in October through January because that's when the SSIP work really picks up. The calls are on the third Thursday of the month from 2 o'clock to 3 o'clock Eastern time. 00:02:32.42 >> The way we select our topics is really based on requests that we receive from the Peer Group members or topics that are timely at that point in the year or in the SSIP cycle. And what we do in each meeting is really focus on group sharing and problem solving, so our meetings usually start off with a brief presentation from us, the facilitators, and in that presentation we'll share information that may be new or pertinent. But then we segue fairly quickly into state discussion so that the bulk of the time is spent in peer sharing. 
States are often facing similar issues to each other, and they can share solutions and successes on those calls, and we provide just a safe place to talk about all things SSIP and ask questions, share accomplishments or frustrations, share resources and network with each other. 00:03:26.58 >> Now that the cycle has sort of been wrapped up, more or less, I guess, for 2024 with the submission of the SPP/APRs and the clarification period, I'm curious to know what are some of the common themes during the SPP/APR cycle and process and now kind of after that you have been noticing coming in from states during these calls. 00:03:49.96 >> Well, that's a good question. One issue that keeps coming up is how to effectively recruit SSIP participants, particularly for those states that are using cohort models for implementing the SSIP. It can be challenging to encourage districts or schools to buy into the process because educators, while they're dedicated and they want good outcomes for their students, they are also overloaded, and often just the idea of adding one more thing to their plates can seem overwhelming. So our peer group participants share ideas, and they have some helpful recruitment strategies like presenting data from participating schools that shows positive outcomes for students. Seeing data on this and results like that can help motivate the other schools or districts to join because they can really see how that participation can benefit their staff and their students. Another issue is once we get close to reporting season, we usually have a good deal of meeting time that's devoted to talking about the specifics of reporting. 00:04:57.00 >> Mm-hmm. 00:04:57.67 >> These include topics like exactly what information belongs with each prompt, when and how to report changes in the SSIP and also how to create responses that are complete enough to let the reader know the full story but also brief enough to fit within the character limit. Sometimes the phrase Word Ninja is thrown about. 00:05:20.52 >> It is a balancing act. 00:05:22.93 >> And, Amy, some of the additional issues center around data collection. One of the things that we frequently talk about with data collection is interim data collection. Most of the states are using outcome data such as their statewide assessment data or their graduation rate for their state-identified measurable result, or SIMR. 00:05:48.02 >> Mm-hmm. 00:05:48.56 >> And because these data are only collected once a year, it provides for a challenge in how to judge progress toward the SIMR. 00:06:00.63 >> Mm-hmm. 00:06:01.04 >> And just like measuring progress toward an annual goal in the IEP, you can't wait until that annual review date to collect data to figure out if you're making progress toward your annual goal. 00:06:14.03 >> Yeah. 00:06:14.31 >> So very similarly, you can't wait until you get your statewide assessment data or your graduation data to figure out if your strategies and activities are working. Of course, by then, it's too late to make any changes. 00:06:28.45 >> Mm-hmm. 00:06:28.87 >> So states talk about how they're tracking progress along the way, or again, that interim data. 00:06:35.97 >> Hmm. 00:06:36.44 >> Another topic that we often discuss is how to scale up or scale out the work. And it's a challenge that states face in how to do this, how to expand the work. 
So how do you achieve the second S in SSIP, which is "Systemic," Statewide Systemic Improvement Plan, and we talk about building capacity at both the state and local levels so that the work can not only be expanded but also sustained. That's a critical issue, is that sustaining of the work. 00:07:13.63 >> Mm-hmm. Yeah, a lot of great discussion and topics, it sounds like, that you have going on. Are there any tips or recommendations related to the SSIP that you can share with states that would be beneficial for them to know about as they're working on the SSIP and having some of these same types of questions and considerations? 00:07:39.63 >> So we've got a couple of relatively new tools that are just coming online, and one of them is the Indicator 17 Data Process Protocol. And just like the other Data Process Protocols, this one is designed to help states document the process for collection, validation, analysis, submission and communication of data and results that are related to the SSIP work. 00:08:08.68 >> Another new tool we've also created is a Template Reporting Guide, and that gives tips on how to respond to each prompt in the template. The template can be a bit confusing as to which information goes with which prompt, so this guide will help the SSIP writer know where to put their information. 00:08:27.12 >> And of course, the best resource of all is our SSIP Data Quality Peer Group. 00:08:33.21 >> Mm-hmm. Yeah. 00:08:35.44 >> So we hope you'll consider joining us if you are working on the SSIP. It's a big task, and it does help to get that input from your peers. 00:08:43.78 >> Yes. It helps to know that you're not alone and that there are others out there who are experiencing the same thing, and we have our IDC TA providers there to help guide the conversation and talk about resources and tips and tricks. So thank you both so much for all this great information, and if you are interested in learning more about the Peer Groups, please reach out to IDC at IDEAdata@westat.com. Thank you, Beckie and Jenny, so much for your time and joining me on the podcast. You shared so much wonderful information and so much exciting activity going on. Thank you, again, for being on. 00:09:23.55 >> Thanks so much for having us, Amy. 00:09:25.88 >> We really appreciate the opportunity to talk about the good work that this group does. 00:09:32.59 >> To access podcast resources, submit questions related to today's episode, or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content. Or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date with Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey. It's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.50 >> Hello. Welcome to "A Date with Data." On this episode, which is part of a series that we are doing on IDC Data Quality Peer Groups, we are focusing on the SPP/APR Data Quality Peer Group. This peer group is one of a number of groups that are facilitated by IDC TA providers to bring state role groups together to discuss and collaborate around data-quality issues that are of greatest importance in the states. In this episode, I am joined by the group's facilitators, Nancy Johnson and Chris Thacker. Thank you both so much for being here. 00:01:00.93 >> My pleasure. 00:01:01.90 >> Our pleasure. 00:01:03.67 >> Great, so to get started, for those who might not be familiar with Data Quality Peer Groups, and in particular, the SPP/APR group, can you tell me about this group? Who tends to participate in it? What is the structure or the format like, and what are some of the topics that you tend to cover? 00:01:24.06 >> Well, as the name implies, we are the state performance plan, annual performance report, Data Quality Peer Group, so naturally our topics of conversation are around the SPP/APR and the various indicators that are addressed within that document. There are also discussions around the SSIP or the SiMR that are part of Indicator 17. We talk about many different things. The group consists largely of Part B data managers, the SPP/APR lead, indicator leads, the SSIP lead or SiMR leader, depending on what the particular topic is on a call for that particular month. We meet on a monthly basis every ... 00:02:06.71 >> Second Wednesday of the month. 00:02:07.78 >> ... second Wednesday of each month at 3 p.m. Eastern Time through a Zoom link, and we try to let them know ahead of time what that month's topic of conversation is going to be. And really it's a lot of dialogue between and among the state participants who attend. As the name implies, we're trying to share that information among one another, talk about things that have been of concern that particular time of year. For example, we just had a conversation around the SPP/APR feedback that's occurring, so depending on the time of year as to what the topics might address. 00:02:43.16 >> And what are some of the common topics that have been bubbling up recently that you're hearing from states on these calls? 00:02:50.80 >> Currently, some of the common themes that we're hearing are questions about stakeholder engagement, and in fact, that is our topic for tomorrow's meeting. And we're really going to focus on including efforts to build the capacity of a diverse group of parents to support implementation activities designed to improve outcomes as part of that stakeholder engagement. We also get a lot of questions about representativeness, non-response bias and sampling plans as they relate to Indicators 8 and 14. 
And of course, this year we're getting questions and issues around collecting and documenting data for the new Indicator 18, which states have to report on in the next SPP/APR in February of 2025 along with that expanded general supervision section in the introduction that includes eight new elements that states are expected to address and then the continuing concerns questions about methodology that's reasonably designed for Indicators 4A and 4B. 00:03:57.92 >> Mm-hmm. 00:03:58.38 >> Those are the things that come to the top of my mind. Chris may have some other things to add. 00:04:02.97 >> I think you got the major highlights there, the topics that come up. There might be some specific things that will happen that are unique to a state or circumstances that exist there or circumstances that a state may think are unique in what to bring to the group to see if other states are experiencing, and that could just be a wide variety of different topics, and that's largely dependent upon what we're talking about. If we happen to be addressing Indicator 4, then we might have something around discipline and how data around that are collected or things of that nature, but it's kind of topic-driven mostly. 00:04:36.64 >> Yeah. One of the nice things about these groups I have found is that it's a great platform for states to come and just be able to ask other states, "We're experiencing X, Y or Z. Is this something you've also dealt with, and if so, how have you addressed it?" or just to kind of feel like they're not alone in some of the challenges and things that are going on. 00:04:58.20 >> That definitely seems to be the way our groups flow. We try to keep it as informal as possible so that we can encourage those open conversations and dialogues among the states. 00:05:08.05 >> Mm-hmm. 00:05:08.36 >> And we get a really good participation, it seems, each time we have a call. Generally, we're talking about anywhere from 20 to two dozen or so, maybe as many as 30, 35 sometimes, participants on board to have these conversations. We try to start it out with providing a little information or a little content, if you will, about whatever the topic of that month's call is going to be. And of course, they know ahead of time, generally speaking, what the conversation or the topic is going to be as well. 00:05:35.96 >> Mm-hmm. 00:05:36.28 >> So that has them bringing in their thoughts and ideas to the meeting so they're not caught by surprise, and it turns out pretty well. 00:05:44.27 >> Well, I was just going to say, I agree that one of the best things about these meetings is states' opportunities to talk with each other and get ideas about issues that they're having related to the topic or just questions that they have and talking back and forth. It always sparks something that states can take back and use. 00:06:04.19 >> And one of the conversations that I've found or topics that have come up a lot is, "What did OSEP say to you about this particular situation, and have you gotten feedback from OSEP on doing it this way or doing it that way?" So those types of conversations really seem to be of the most interest to the group. 00:06:21.49 >> Yeah. You mean states will say, "I had a conversation with OSEP and discussed this particular topic, and this is what they said." 00:06:27.62 >> Yes. 00:06:27.74 >> Yeah, I think definitely that's a lot of states ... 00:06:30.08 >> Yes. 
00:06:30.17 >> And I think about that Indicator 4 and some issues that were going around with how you're supposed to collect data and interpret the data for Indicator 4 was a good conversation that wanted to know what OSEP's perspective was on a lot of those circumstances. 00:06:45.12 >> That's definitely one of the benefits of those calls for states. What are some tips or recommendations that you can share related to the SPP/APR that states that maybe aren't always part of these calls might benefit from knowing about? 00:07:00.89 >> Well, one of the things that comes to mind, and Chris is certainly the expert in this area, is getting states to use IDC's protocols for documenting data processes for the 618 data collections and the SPP/APR indicators and contacting their state liaison to get assistance with that from TA providers facilitating that process. Chris certainly does a lot more of that than I do, but it really does turn out well when states document their processes. 00:07:33.42 >> I think that's a good point, Nancy, in bringing up the documentation of state protocols. That is one of the big key TA things that IDC does with states. We do a lot of different types of TA, but that one seems to be a hit just about everywhere we go because it brings different people from within the state together at the same table to have conversations about specific topics, whether we're looking at an indicator or specific types of data, whether we're talking child count or discipline or exiting or what have you. Having those people at the table really opens up those dialogues and those conversations and helps people within the SEA to understand what other staff are doing so they have a better sense of the overall picture and not just maybe their particular corner of the world, if you will. 00:08:19.95 >> Just about the data processes, one thing is, I know a lot of states are interested in them, especially as DMS is coming up, because part of what happens during those DMS visits is really kind of sharing a lot of detail, right, with OSEP around processes. And if you have it all ready to go and documented ahead of time, then you know you have a lot of that already there at your fingertips. And also, given what Chris was saying, it's great to get down on paper, electronically, all of those detailed steps. But a big benefit of doing the documentation, to Chris' point, is that it brings together a number of folks who maybe don't often come together and have these discussions. So it has sort of that other added benefit of really having that dedicated time to sit down and hear what everybody is doing and really be able to understand at a higher level, from start to finish, everyone that's involved from the collection, analysis, reporting, every step along the way. 00:09:25.70 >> And I'm glad that you brought up the DMS 2.0 and the impact that that might have with regards to documenting processes. You may not be documenting your processes for the purposes of doing DMS 2.0. Ideally you're doing it to enhance the quality of the data that you're capturing or to have consistency in how you might fill out your SPP/APR on particular indicators or to break in new employees to give them some information. 00:09:51.13 >> Mm-hmm. 00:09:51.70 >> But if you've got processes documented, it's going to be a very big help when you go through the DMS 2.0 interviews and conversations that you would have. 
But those that you have a place to reference to, somewhere to go back to so you don't get lost in those types of conversations, if you're familiar with your protocols, it's going to become more second-nature to you in answering those questions. So I think it's very, very helpful for that as a side benefit. But to me, the primary reason for doing it is to enhance your own activities and your own processes to have that consistency. 00:10:27.42 >> Mm-hmm. 00:10:28.10 >> One other benefit with the data processes, it's a way for more than just one staff member to understand what those processes are because it is documented on paper, and so staff have a better understanding of the processes that go forward rather than one person just having all that information in their head or to themselves so that it really becomes a team effort, as we've discussed earlier. 00:10:58.05 >> Another benefit that hits me, and it's not something that's discussed a great deal, at least in my mind, state directors come and go from states on just out of normal transitioning. And if they come into this role and can see all the processes that are going on within the office, it can give them a better appreciation and understanding of what roles and responsibilities are of individual staff as well as the office at large. It also can help to develop an appreciation for data managers. Therefore, the state director have an appreciation for that data manager and to know what it is that they're providing and kind of gives them a go-to person, if you will, for different topics. 00:11:39.86 >> Those are all very true. Any other tips or recommendations? 00:11:45.49 >> Something else that comes to mind for me is developing an annual plan for when you're sharing your data, when you collect and share your data for use with specific indicators for the SPP/APR so that you should, in my opinion, be scheduling regular meetings between your SPP/APR coordinator, your data manager, your indicator leads and any data analysis tied to certain indicators as appropriate so that throughout the year, working on the development of your SPP/APR rather than waiting until the last couple of months of the year soon before it's due. Because some of this data that feeds some of the indicators you have early in the year, and then some of it you get later in the year during the summer. So it gives you a time, if you develop that annual plan, to figure out how you're going to plan out your work and then meet regularly to make sure everybody is on the same page about what the data is saying, what you're going to be using related to the SPP/APR. So I just think that annual plan helps you with organizing all of that information. 00:12:58.25 >> I would also like to add that being part of the community through the Data Quality Peer Groups gives you the ability to network with your peers, and oftentimes the peers that are doing the same type of work you're doing in a different state. So that can broaden your knowledge base of places to go. Obviously being that Nancy and I both work with the IDC, the Data Center, we are state liaisons for particular states. You can come to us, your state liaison, ask questions or concerns. It might not be that particular liaison has the answer to your question, but we can work with our colleagues at IDC to find that answer, and sometimes we might even reach out to other TA centers funded by OSEP to be able to get answers for you. So don't be bashful about asking your state liaison for assistance around any of this. 
Don't be bashful when you're attending your Data Quality Peer Groups to ask questions. Yep. That's how we learn, is sharing the information and experiences that we each have because they're all going to be different, but they might be close enough to give you an idea that you might not have thought of on your own, and seeing how another state is doing something could give you an idea to go back and and do something new and different back home. 00:14:08.51 >> On all these groups, we encourage any and all questions because it might be something, like Chris said, you haven't thought of before that kind of jogs an idea in your head, and we try to make these very ... 00:14:20.34 >> And I would add there, too, we do not record these. 00:14:24.55 >> Mm-hmm. 00:14:24.95 >> We do not have people from our staff from OSEP attending. So it's kind of an anonymous group, if you will, anonymous in that there's no one there who is a monitor over your programs that you can come in, and from the information you shared, be able to do something that ... in a monitoring capacity because that's not our role as a TA center. Our key role as TA center staff is to provide you that technical assistance that you need to be able to be in compliance and operate your programs appropriately and to improve results for children with disabilities. It's not a gotcha place. It's a safe environment, and it is your colleagues for the most part that you're talking with. We just kind of provide the space, the opportunity, for it to get you all together and have those conversations. 00:15:10.66 >> I'd like to put in a plug for one other IDC tool that I found helpful when I was an SPP/APR coordinator in a state, and that's IDC's Data Meeting Toolkit. You have, as an SPP/APR coordinator, the data manager. You have a lot of meetings around your data and around developing your SPP and in terms of analyzing your data for that, those kinds of thing, particularly when you have meetings with more external groups, stakeholder groups. That meeting toolkit was very helpful to me, and I hope it's helpful to states in helping plan for those meetings. The tools within the toolkit were just very beneficial, so I wanted to put in a quick plug about that because this is ... It's about the SPP/APR, but it's all about the data that you use and how you analyze your data to make decisions for the SPP/APR. 00:16:13.90 >> Thank you both so much for sharing all this information and letting us know what this group is like, what are you discussing on these groups, and your tips and recommendations. 00:16:24.13 >> Thank you for having us. 00:16:26.77 >> To access podcast resources, submit questions related to today's episode, or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date with Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey, it's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.65 >> Hello, welcome to "A Date with Data." We are kicking off a special series of episodes that are focusing on IDC's data quality peer groups. These groups are facilitated by IDC TA providers to bring state role groups to discuss and collaborate on the data-quality issues that are of greatest importance in states. For this episode, I am joined by the facilitators of the Data Manager Data Quality Peer Group, Kelley Blas and Kristen DeSalvatore. Thank you so much for being here and welcome. 00:00:56.74 >> Thanks, Amy. Thanks for having us. 00:00:59.78 >> Yes, thanks, Amy. It's a pleasure to be here. 00:01:03.13 >> So to start things off for those who might not be familiar with these peer groups, can you tell us what these peer groups are like? Who tends to participate in this particular group? What is the structure and the format like? And what are some of the topics that you cover? 00:01:19.76 >> The Data Manager Data Quality Peer Group is a space for folks to come together to discuss and learn all things IDEA data. So as IDC facilitators, we provide timely information and resources, but there is also sharing by the states and learning from each other. 00:01:41.45 >> I agree with that, Kristen. I feel like though we do discuss topics and come in with data topics each month, the bulk of the learning really comes from the discussion and the information that states are able to bring forward, so we'll set the stage, and then they really take it from there and support each other. So it's also an opportunity for real networking. 00:02:10.15 >> The name probably says it's data managers that are a part of this group. 00:02:15.25 >> Yeah, so all IDEA Part B data managers are welcome. 00:02:18.92 >> Mm-hmm. 00:02:19.56 >> But they can also invite and bring other staff in the SEA that they think will benefit from participation. The meetings are held virtually on Zoom once a month, and as Kelley mentioned, we set the stage, so IDC comes prepared with a structured PowerPoint, but as we also already said, there is ample time and encouragement for interaction and dialogue with us, IDC and with the other states that are on the calls. So IDC is really there to support the states, so discussion and collaboration on all IDEA data subjects are open to discussion whether they are on the agenda or not. 00:03:13.49 >> But we also give states an opportunity to share their resources, so, for example, if we're talking about one of the 618 data collections, and they have a state-developed resource, we love to give them an opportunity to share their resource, which then could also turn into another opportunity for states to network together and collaborate on resources. 
And states that use IDC-developed resources are also often invited to share how they use it, demonstrate the use of it and share how they may use it either for pre-edit checks, so if they're using an edit check tool, they may share how they use that for their edit checks and for their data quality, or they may share how they use it for their public reporting. But we really like for states to come in and demonstrate the use of their own tools as well as IDC-developed tools. 00:04:09.88 >> I know when I was involved a while ago with those groups, that was one of my favorite parts and I think the states' too was getting a chance to see how other states were tackling some of the same challenges, what they were coming up with in terms of solutions and resources. They can kind of piggyback off each other. I know in some states they might even have reached out after they saw a particular resource, reached back out to that state and said, "Hey, can we hear more about this? We'd really love to maybe adapt something like it for our state." One of the best parts of these groups. 00:04:44.44 >> Yeah, that's always ... That's wonderful when states ask for contact information of each other and plan to follow up offline to take advantage of knowledge in one state that could be useful in another state. 00:04:59.92 >> And thinking about that, another really positive aspect of these groups is that it does feel like a safe space for data managers to share that information and talk about any of the concerns that they may be having with their data, whether they're having errors or even just sharing their resources because those calls are not recorded, and it is a smaller group. They may be more apt to share and discuss openly than the might be if we were trying to invite them to present at a nationwide conference or something, so we're finding that, that safety and that small-group feeling tends to open up more discussion. 00:05:41.63 >> Yeah, definitely. 00:05:43.13 >> In terms of some of the topics that we cover ... 00:05:46.02 >> Mm-hmm. 00:05:47.15 >> ... We really do cover IDEA data collection and reporting from A to Z., right, from the collection and requirements of the collections to the actual submission of the data. In particular, reporting requirements, of course, the SPP/APR is always a big one. EDFacts Data Files timelines and processes are some of our frequent fliers, and something that Kelley and I do is, we work really hard to make sure that folks are aware of all of the resources and services that IDC has available to states. We find that sometimes there are states that just aren't really aware of everything that is available, and the resources and services can really be amazingly helpful to states, so we find that it's really important that states realize what is out there and do our best to share things. 00:06:54.32 >> And one of the things that we've noticed from that sharing, and so in our just common template that we use every month, we touch on the data selection calendar that's posted on IDC website, and we also make sure that we bring up the comprehensive list of resources, and we've found that each month, that there are a-ha moments or discussions about some of the resources that are available. So that's been very helpful to just keep that as a forefront, and one of the first things that we discuss is what's upcoming and then touch on those, that comprehensive list. 00:07:34.44 >> So being very ... 00:07:35.01 >> Yeah. 00:07:35.38 >> ... 
relevant about what's going on at that time of year for the states and making sure that, that gets covered or at least brought up as a topic that ... 00:07:45.55 >> Right. 00:07:45.67 >> ... folks would have questions about or want to get into. 00:07:48.31 >> Absolutely, and really to provide a service to the states to keep them focused and kind of take a tiny bit of the burden off them to keep them on track. It can be very difficult as a Part B data manager to keep all the balls in the air. 00:08:06.32 >> Yeah, and know what's ... everything that's out there, like you said. IDC has a lot of resources. All of the centers do, and it seems like these groups are able to sort of elevate and point out the ones that are most relevant at that particular moment in time, which, like you said, kind of takes away some of the burden of the data manager having to pore through websites and be looking for things themselves. 00:08:30.60 >> Yes, they don't know what they don't know. 00:08:34.28 >> Very true, so you mentioned a number of topics more generally that you cover, but what are some of the common themes that have been emerging in the last couple months of calls that you've held with this group? 00:08:52.39 >> Well, certainly in the last couple of months, SPP/APR-related issues. 00:08:57.30 >> Mm-hmm. 00:08:58.83 >> There has been a lot of discussion and focus on gathering the data, the process that's used to write the APR, if there's any kind of submission issues, responding to OSEP-required actions and then of course clarification, which has just finished. Some other things, frustration and confusion with the differences between significant discrepancy and significant disproportionality. 00:09:32.28 >> Mm-hmm. 00:09:33.56 >> And the reasonableness of Indicator 4 thresholds, those have really been a topic of discussion lately as well as we've really dove pretty deeply into documentation and processes in general. 00:09:54.52 >> Right, and particularly around the documentation and processes, we've had states talk extensively on how helpful it has been for them to access the resources at IDC for their liaison to come out and document their data processes when they're thinking about their DMS 2.0 work, and in addition to what Kristen stated around SPP/APR and Indicator 4, those questions seem to come up pretty much monthly. There's also often discussion around Indicator 18 and general supervision, thinking about the new things that have come out over the past year and how they're going to collect information. Everybody is ready for the Indicator 18 calculator to come out, so they're really looking forward to ii24, where they're going to be able to see some of the new things that are available to support them and their data. 00:11:00.41 >> A lot going on around the SPP/APR and changes that are coming soon. That's a hot topic, I'm sure. 00:11:10.30 >> Yes. 00:11:11.72 >> Well, and then the other thing that we have found to be just a really, really big topic is that a lot of states are in the process of either developing or redesigning their public reporting dashboards, and so we've had states come on and share what they've developed. We've had states come on and share kind of partial development and ask questions about, how should we present a certain indicator? For example, Indicator 7 is a really rough one, and because it has so many components and categories to it, how should they present it within the BI tool that they're using? 
And so there's been a lot of collaboration between states talking about those tools and talking about how they want to present their data publicly, so that has been very rich discussions, and we've found, like we said before, a lot of states are finding opportunities to engage with each other after the calls and say, "Oh, I really like what Georgia is doing, so I need to have discussions later on with their data manager, so that's been a very rich theme that's been coming up. 00:12:19.16 >> And using those calls, too, it sounds like for states to kind of get feedback from their peers who are engaged in the same thing in their state, kind of ideas and what we started on the right track. What ideas do you have for how to continue the work? Those sorts of things, that's kind of another side benefit. 00:12:38.34 >> Absolutely, we had a rich discussion at our last meeting about how to make the data more digestible for stakeholders. 00:12:50.20 >> Hmm. 00:12:51.01 >> Right? And it was a really great discussion with different states chiming in on what they do and struggles that they have had with presenting data to stakeholders. 00:13:04.36 >> Mm-hmm, yeah, that's definitely something that comes up quite a bit, especially around SPP/APR time. 00:13:10.64 >> Yep, that is going to be part of our topical discussion at ii24. 00:13:16.35 >> Great. 00:13:16.90 >> That's what I was going to say. Yeah, that was such a rich discussion. That that's what our data managers chose as their topic to continue that discussion. 00:13:27.45 >> Great, looking forward to hearing that. What are some tips or recommendations that maybe you've shared during these groups, resources, things that maybe you all as former data managers have found to be helpful that other states might benefit from knowing about kind of related to the 618 SPP/APR data, the data-quality topics that come up in this group? 00:13:53.83 >> So as Kelley already mentioned, we have had several states share Data Dashboards and visualizations, and that has really been amazing and helpful. It's been really cool for states to see what their colleagues in other states are doing and for states to be able to ask questions and have a discussion around how states are making it happen, as Kelley said. What ... software, who does the work? What is their role? Et cetera. 00:14:27.61 >> Mm-hmm. 00:14:28.33 >> So that has been ... It's not a specific tip or recommendation, but it's been just great knowledge-sharing in general. And then of course tips on APR language and submission are always welcomed and found useful. At our last meeting, we had a discussion, as I mentioned, about making the data more digestible, and one of the states shared information on a product or app that's called the Hemingway Editor, and it helps you tell a story and how you can look at data in real time, edits in real time and tells you what's hard to understand and I believe even what grade level things are written ... that it's a very specific tip ... 00:15:27.38 >> Yep. 00:15:28.04 >> ... [Indistinct] and the result of it, of being in the group. And then I wouldn't say another huge bonus in the, quote, tip category is, as we've already said, learning about resources and services that are available through IDC. There's so many offered that I was not fully aware of when I was a data manager, and I do really regret that I didn't participate in the Data Manager Data Quality Peer Group more often when I was a data manager in a state. 
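The grade-level feature described above is essentially a readability score. As an illustration of the concept only, and not the Hemingway Editor's actual algorithm, the widely used Flesch-Kincaid grade formula can be computed in a few lines of Python; the syllable count here is a crude vowel-group estimate.

```python
# Illustrative only: estimate a grade level for a passage of report text.
# Not the Hemingway Editor's algorithm; syllables are roughly estimated.
import re

def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59

# e.g., flesch_kincaid_grade("The LEA submitted its child count data on time.")
```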
00:16:08.82 >> I totally agree with that, and I feel like the value in being able to align those resources with the specific topic that we're talking about is very helpful to data managers. We also really try to listen for any changes that are coming up with 618 data, 616 data submissions, if there's going to be any changes in the EDFacts file specification. For example, we heard of a change in language in FS009, and so our first meeting after that change, we kind of highlighted it, and then of course the data managers had some questions, so we went back to OSEP and to some of our TA providers at IDC and asked some questions, and so now at our next meeting we'll be able to provide some more clarification. So we really try to listen to what's happening in the moment and real time so that we are addressing questions and concerns that they're having right now. 00:17:10.48 >> And sometimes things come up not expecting. Again, I'm going to reference our last call that we just had last week, and we had a participant from one of our entities who stayed on after everyone else had left to ask a question about EDFacts reporting that turned out it was not IDEA-related. 00:17:35.80 >> Mm-hmm. 00:17:36.44 >> But I was able to point him in the right direction and give him enough information to get going, so it really is ... It's just a great place to come to for support. 00:17:50.55 >> Thank you, Kelley and Kristen, so much for being on. You shared such great information about this wonderful group, really appreciate your time. 00:17:59.11 >> Thank you. We appreciate the opportunity to share. 00:18:02.86 >> Yes, thank you so much. It's been great. 00:18:07.93 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You are listening to A Date with Data with your host, Amy Bitterman. 00:00:07.34 AMY BITTERMAN: Hey, it's Amy and I'm so excited to be hosting A Date with Data. I'll be chatting with state and district special education staff who just like you are dealing with IDEA data every day. 00:00:19.50 >> A Date with Data is brought to you by the IDEA data center. 00:00:24.60 AMY BITTERMAN: Hello. Welcome to A Date with Data. Now that states have used EDPass to submit all of their 618 data files, we want to hear how those submissions went. And in particular, highlight how the data manager and EDFacts coordinator in one state have collaborated to successfully submit the files using the new system. On this episode, I am joined by Dominique Donaldson, who is the Part B Data Manager, and Adam Churney, who is the EDFacts Coordinator, and both are with the Georgia Department of Education. They're going to share about their experiences and provide some strategies that worked for them to partner effectively in this process. Welcome Dominique and Adam. So glad to have you on. 00:01:05.27 DOMINIQUE DONALDSON: Hi Amy. Thank you so much for having us. 00:01:08.88 ADAM CHURNEY: Yes, thank you for having us here. 00:01:11.10 AMY BITTERMAN: So I want to start off, if you each can just very briefly introduce yourself, say a little bit about your role at the Department of Education. Dominique, do you want to go first? 00:01:20.62 DOMINIQUE DONALDSON: Hi, my name is Dominique Donaldson and I am the Part B Data Manager for the Georgia Department of Education. And I've been with the Georgia Department of Education for a little over one and a half years. 00:01:33.88 AMY BITTERMAN: Great, thank you. Adam? 00:01:36.80 ADAM CHURNEY: Hi, I'm Adam Churney. I am the Georgia Department of Education State Data Analysis and Reporting Manager, which includes being the EDFacts Coordinator. I also handle Power BI Reports and many other data-related stuff. 00:01:50.53 AMY BITTERMAN: All right, so going back a few years now, when EDFacts Modernization kind of first came up and knowing that it was happening, why was it important for you all to approach those changes that were coming as a team and how did you create that team? 00:02:08.41 DOMINIQUE DONALDSON: Well, you know, for the longest time we had been hearing about EDFacts Modernization and there was so much talk about what was coming. It was discussed and shared in emails from PSC and IDC, you know, in all of the conferences like the Interactive Institute and OSEP. But there were very few tangible resources about EDFacts Modernization. So, you know, it kind of made us a little nervous. So in preparation for the EDFacts Modernization, the EDFacts coordinators and Part B Data Managers participated in all of the webinars and working groups. And not only was this process changing, but there was an entirely new platform for submitting the data files. So it was very, very clear that collaboration was going to be the key to a successful transition to the EDFacts modernization using the EDPass system. So, you know, we knew that we needed to work together. 00:03:09.40 AMY BITTERMAN: Mm-hmm. 00:03:09.92 DOMINIQUE DONALDSON: And we needed to first identify the necessary people required for the task. 
So the EDFacts coordinators identified Part B Data Managers as integral partners in the submission of special education data. So the Part B Data Managers were provisioned for access to the relevant collections in EDPass. Our EDPass coordinators were crucial in helping the Part B Data Managers acquire the correct provisioning through the ed.gov and EDPass platforms. Not only did they grant us the access, but there was also a strategic walkthrough by our EDPass coordinators of the system to orient us to the inner workings of the system. In that walkthrough, we were able to navigate throughout EDPass and ask questions. Being able to access the system, interact with the system components such as the file download, and to be able to see the options within the platform allowed the Part B Data Managers to be a relevant part of the data submission process. You know, looking at the system together really helped us to visualize the shared goal that we were working to complete together and to ensure that we submit accurate and timely data for students with disabilities. 00:04:39.63 AMY BITTERMAN: Thanks, Dominique. So Adam, tell me how you as the EDFacts coordinator, and Dominique as the Data Manager, how do you share information back and forth? 00:04:50.52 ADAM CHURNEY: Sharing data, there's a lot of places where we actually share data because it really starts when we collect the data from the LEAs. We have several data collections, and in each one of them we've got reports that basically are already out there that we can do our first check on the data with to make sure that it's coming in correctly. From that, most of the data that we send to EDFacts is actually aggregated from these data collections. So in that process, my team has created a portal within our application that we have all our data collections in that allows the data managers for IDEA and all the other ones to basically go in and start seeing these aggregated reports, aggregated counts, not just for the current year, but also a comparison with previous years to do that first check of data quality. 00:05:35.44 AMY BITTERMAN: Mm-hmm. 00:05:36.00 ADAM CHURNEY: And within that application, we've also embedded our ability to sign off on the data for the data managers. So not only are we getting the data in, but then the data managers are really going through looking at the data and signing off on it. 00:05:49.93 AMY BITTERMAN: Got it. That's great that you have that shared application and can do all of those checks, like the year-to-year comparisons it sounds like, as well as some of the other validations I'm assuming that get done through EDPass as well. How do you, between, again, data managers and the EDFacts team, how are you collaborating and working together to really make sure that the data that you are submitting are accurate and of high quality? 00:06:15.78 ADAM CHURNEY: So one of the benefits of the new EDPass system is the ability to check your data quickly. 00:06:21.70 AMY BITTERMAN: Mm-hmm. 00:06:22.06 ADAM CHURNEY: So one of the priorities as the EDFacts coordinator is we really want to get the data into the new system as soon as possible. There's that first round of data checks. First is, is the format correct, and there are format changes every year. 00:06:33.81 AMY BITTERMAN: Mm-hmm. 00:06:34.15 ADAM CHURNEY: And most of those are just a simple fix here or there. Sometimes when there's new data that was collected that we're having to pull in as a new element, we go through and do that first check. 
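The year-over-year comparison Adam describes, aggregating records to LEA-level counts and checking them against the prior year before anything goes to EDPass, might look roughly like the sketch below. This is a minimal illustration in Python with pandas; the lea_id column name and the 20 percent threshold are hypothetical, not taken from Georgia's actual portal.

```python
# Minimal sketch: roll student-level records up to LEA counts, join the
# current and prior year, and flag large swings for review before submission.
import pandas as pd

def flag_big_swings(current: pd.DataFrame, prior: pd.DataFrame,
                    threshold: float = 0.20) -> pd.DataFrame:
    """Return LEAs whose count changed more than `threshold` from the prior year."""
    cur = current.groupby("lea_id").size().rename("count_current")
    pre = prior.groupby("lea_id").size().rename("count_prior")
    merged = pd.concat([cur, pre], axis=1).fillna(0)
    denom = merged["count_prior"].replace(0, float("nan"))  # avoid divide-by-zero
    merged["pct_change"] = (merged["count_current"] - merged["count_prior"]) / denom
    return merged[merged["pct_change"].abs() > threshold].sort_values("pct_change")

# e.g., review = flag_big_swings(child_count_2324, child_count_2223)
```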
But once that first round of uploads happens, me, and we also have a programmer [INAUDIBLE], kind of go through those first errors, the data quality edits that we get, and we go through and determine quickly whether it is a data issue, as in the data that we collected is showing numbers that are out of spec, or is it the code that we use to aggregate the data causing that. And last year, being the first year EDPass was used for a lot of the files, there was a lot of data quality flagged that was on the code part, where we didn't include zeros. And as a team, we really didn't want to show this to the IDEA data managers and have them go, "Oh my God, there's so many errors." We said, "Okay, just wait. Let us do our first checks." And we did those first checks. We'd identify this data quality check is a code error, or this one is an actual data error. So once we did that first cleaning, we had a lot fewer error messages to then take to the IDEA data managers to basically look at and say, "Okay, here are the data errors that are actually being caused by our data." And then we do that first round of checks where we'd go, is this a unique case to that system, that LEA, that school, or is this a bigger systemic thing with how we collected the data? 00:07:58.63 And then we would kind of turn through those and make sure that okay, it is a unique case for the school. And from that collaboration, I'm not a subject matter expert when it comes to IDEA data particularly, but I know enough that I'm able to sit there, go through, say, okay, let's figure this out, get with the IDEA data managers and really look at the data, question the data, and then be able to report it. And then one of the nice things in EDPass is if we have a problem where we're looking at a data quality issue and it's like maybe we should have coded it as this instead of that. What's really nice about EDPass is we would take our data file, make a couple of edits to the schools or the LEAs that were throwing these data quality issues, submit it with this altered data to see does this cause other issues or does this clear the air? And being able to do that iteration process, we really were able to ask more specific questions in the data to say what happens if we coded it this way and see, uh, oh, something was triggered over there. And then go like, maybe we shouldn't go that way, or oh, no other errors triggered. 00:08:59.99 And then we could come back to our IDEA team and say, Hey, if we'd make this change, is this better data quality? And we could kind of iterate. And as you got through the big errors and got to the smaller and smaller errors, you could really start getting into good conversations where we're really talking about how is this data being interpreted, how is it being gathered, is it accurate, is it showing the intention of what we're supposed to be showing? And it's really helped a lot with data quality. 00:09:25.36 AMY BITTERMAN: So it sounds like for you all EDPass has been a good thing, it seems like. Has it helped, you think, with the data quality? 00:09:30.99 ADAM CHURNEY: It has helped a lot. 00:09:31.97 AMY BITTERMAN: Great. 00:09:32.23 ADAM CHURNEY: My first years here I was under the old EDEN system and it was great, you know, format errors were checked off early and then, you know, several months later I was like, oh, we've got these data quality issues. 00:09:43.13 AMY BITTERMAN: Yes. There isn't that lag anymore. 
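The missing-zeros code error Adam mentions is a common aggregation pitfall: code that only emits the category combinations actually observed in the data drops the zero-count rows a file layout may still expect. A rough sketch of the fix is below, assuming a hypothetical counts table with lea_id, category columns, and a count column; the category names and values are made up for the example and are not taken from a real EDFacts file specification.

```python
# Rough illustration: re-index aggregated counts over every required
# LEA x category permutation so missing combinations appear with a zero count.
import pandas as pd

def fill_missing_zero_rows(agg: pd.DataFrame, leas: list,
                           categories: dict) -> pd.DataFrame:
    """Ensure every LEA x category permutation appears, with a zero count if absent."""
    keys = ["lea_id"] + list(categories)
    full_index = pd.MultiIndex.from_product([leas] + list(categories.values()),
                                            names=keys)
    return (agg.set_index(keys)
               .reindex(full_index, fill_value=0)
               .reset_index())

# e.g., filled = fill_missing_zero_rows(
#     counts, lea_list,
#     {"disability": ["AUT", "SLD", "SLI"], "age_group": ["3-5", "6-21"]})
```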
You can kind of deal with it all at one time and get in, like you said, to sort of even some of those more nuanced, you know, issues to really ensure that the data are absolutely, you know, as clean and high quality as you would hope they would be. 00:10:01.24 ADAM CHURNEY: Yes. Over the past year with all the files that we submitted last year, we had a lot of great conversations with individual program managers where we would sit down after we did our first upload or even after those second uploads and just kind of go like, "Let's talk about the data. Are these numbers correct?" It really got more into the specifics where being on the outside of a lot of the data, I could ask the simple questions that they hadn't thought about it and well, and they would go like, "Maybe we should try this." And then being able to iterate, we could upload a file with a small change, see if that caused any problems or anything else would pop up. And it really did help create good conversations. And even with this charter upload that was just happening being the second year, all the errors that we had from the previous years about the data building and the data code are gone. So now we're really down into these specifics and we've had a lot of great conversations where should we be in some cases with the charter data we have multiple management organizations and just sitting there talking through that with the team asking the PSC if we did this, what would happen in trying that. It really helped us kind of get together and communicate and collaborate. Just really focusing on these small little nuances that we never had time to think about in the past because it was, everything's good check and then three months later uh, oh let's fix the data. Now it's like, let's talk about it and fix it and kind of iterate from there. 00:11:26.43 AMY BITTERMAN: Mm-hmm. And you feel like you're maybe not getting dinged as much either as obviously, I guess, because you're seeing those issues come up before the data get officially submitted, which is probably-- 00:11:41.68 ADAM CHURNEY: Exactly. 00:11:42.12 AMY BITTERMAN: Nice to kind of deal with those early on before it gets too far down the road. 00:11:47.30 ADAM CHURNEY: And I think just the whole collaboration process while we're actually able to re-upload files is really helping data quality. It really sits there just iterating and making sure that is this data right. We have the time now to really ask that question and dig into it to make sure that what we are sending to EDPass is the best data that we can give. 00:12:07.15 AMY BITTERMAN: And you've already spoken to this, but I don't know if you or Dominique have anything to add in terms of what have you seen as the impact of this collaboration and working so closely together. 00:12:17.22 DOMINIQUE DONALDSON: Well, you know, Adam really touched on a lot of the positive impact in the collaboration. And as a part B Data Manager, it really has made, you know, my life easier because of the consistent communication between the two departments. You know, if we had not formed that strong collaborative nature then we might not have identified issues in enough time, which could have possibly resulted in submission of inaccurate data or late data submissions. And the collaboration has really made it where we are able to identify any issues and concerns prior to. And it allows us to be able to submit all of the files in a timely manner and have accurate data submissions and like Adam said, improve data quality. 
00:13:06.21 AMY BITTERMAN: Yes. Which is the ultimate goal of all this. 00:13:08.50 DOMINIQUE DONALDSON: Right. 00:13:09.57 AMY BITTERMAN: So I'm sure there are states listening who wish they had the kind of collaboration and strong partnership that it seems like you all have in Georgia. I'm sure it wasn't necessarily easy to get there and there was a lot of work involved. Can you share with us what are some steps that maybe others states out there can take to try to build that collaboration that you all have? 00:13:31.25 ADAM CHURNEY: Yes, I think one of the first things that we need to do is really identify the necessary people needed. 00:13:36.27 AMY BITTERMAN: Mm-hmm. 00:13:36.89 ADAM CHURNEY: Of course, there's the EDPass coordinator. And one of the things that I think is really important with the EDPass coordinator is instead of programs sitting there going through individual business rules and the BRSI or BSRI, I always get that acronym [LAUGHTER]. Being able to be the translator for those program managers so they don't have to live in that document at all, since throughout the year I'm living in it every single cycle. Just going through what errors are popping up, let's understand them. So that really helps having that center of focus person. The other interesting thing that Georgia has is we have our data collections which collect the majority of all the data that we do submit to EDPass. And then we have the programmer and myself that will sit there and create the files to then upload. So my team managed the uploads. So really the program managers don't have to get their hands dirty with data too much. They have to understand the data and understand their data, particularly with how it relates to each other. So with my team, we'll build the files, we'll upload it as soon as possible. And then once we have that we can then go to the Part B managers and any other program managers with the data and say like, Hey, we've got the data up there early. This is what we found. These are the data errors that are code-related where we need to sit there and kind of recode it. Don't worry about anything else until we fix that. And then once we fix that saying, okay, Part B managers, here's what's truly errors. Let's talk about this. Let's see, is it just a issue where we've got some systems that have weird data or is it something else? 00:15:06.86 But really once we found that team and figured out who's responsible for the data, but it's all kind of that center spoke around the EDPass coordinator making sure the data's getting uploaded, being built in time, and then having the managers bringing them in at the time where their time is used to the fullest where they can sit there and say, this data is correct. I know you know, X School has X number of students what they know. And relay that back to the thought. So that's really the key people that we kind of brought into this. 00:15:36.56 AMY BITTERMAN: Yes, that makes a lot of sense. 00:15:38.30 DOMINIQUE DONALDSON: And, you know, after identifying those necessary people, which Adam just touched on. 00:15:43.34 AMY BITTERMAN: Mm-hmm. 00:15:43.67 DOMINIQUE DONALDSON: Then we, you know, it's going to be important for teams to set up meetings to discuss roles, access, and goals. IDC has a very useful resource called the EDFacts Modernization Planning Questions to Consider. 
And that resource is going to be helpful for teams when we're looking at understanding who's responsible for what and what processes you need to complete the task and how you will collaborate as a team. You can search for this file right on the IDC website and it'll help you answer some of those questions like when will we submit files? You know, reviewing the EDFacts submission calendar organizer is also so very important. Using backwards planning to make sure your data is reviewed and submitted on time. You can also look at other questions and discuss where data comes from and what are the audit checks for accuracy and verification of data. Also thinking about what access will be given to special education. Special education needs to have access to all of the EDFacts files containing special education data. 00:16:48.71 AMY BITTERMAN: Yes. [LAUGHTER] You would think that would be a given. Right? 00:16:51.34 DOMINIQUE DONALDSON: Right. 00:16:51.70 AMY BITTERMAN: Yes. 00:16:51.98 DOMINIQUE DONALDSON: You know, all of that special education data, it really should be reviewed by the people that you've previously identified in special education as crucial to the project and the data submission process. And so it's going to be important that no special education data is submitted without special education teams getting a chance to look at the data first and verifying the data. 00:17:16.21 AMY BITTERMAN: Mm-hmm. 00:17:16.61 DOMINIQUE DONALDSON: You'll also want to know when and how you'll review those files. So thinking about processes and then what are the procedures if a concern arises? You'll need to know who you need to alert first when you see something that's of concern. When an issue comes up in our state, our first contact is our infotech architect, and we bring concerns to him and we work through the issues together to resolve them. Another thing that you might want to do is also make sure that you're reading the current file specs and developing that common language. That way everybody is understanding the same information. And it's going to be also important to set up regular meetings to discuss upcoming collections to ensure that your data is in alignment with the file specifications. And then you also want to use that data submission organizer to set tentative dates for submissions to make sure that you are timely and accurate. 00:18:20.67 ADAM CHURNEY: Another thing that I think really helped us was attending office hours together whenever we could. 00:18:26.58 AMY BITTERMAN: Mm-hmm. 00:18:27.24 ADAM CHURNEY: The office hours that PSC puts together are immensely useful. Not just sitting there, you know, asking the questions you have, but listening to other SEAs and the issues they're having with their data. A lot of times I would find myself listening to an SEA that had an issue and they were talking through it, then I would go, "Do we have that same issue?" [LAUGHTER] And then I would go to the program manager saying like, "Hey, this was an issue in this state. Is this something that we have, how are we accounting for this?" And it really helped seeing not just your state, but seeing everybody else's problems. And having the IDEA data manager there too also helps because they're seeing what the other program managers are having problems with. And I can't tell you how valuable they were because I learned so much about other states. 
And then from what I learned from them, I learned more about our data because again, now I had more questions I could ask our teams about our data to see how the data was actually being formed. And it really helped. 00:19:23.67 DOMINIQUE DONALDSON: And the last thing, we should always celebrate the wins. It's important to make sure that your team knows how valuable they are and they need to know that the work that they do has a powerful positive impact on the data that we submit. So working together as a team is so important and making sure that we say thank you. We are very grateful for Adam and his team because [LAUGHTER] without them, we wouldn't be as successful as we are. 00:19:59.38 AMY BITTERMAN: Yes. I've worked with Dawn going back many years, one of the former data managers, and Dominique, and I've heard over the years Adam's name come up many, many times, [LAUGHTER] so. 00:20:09.08 DOMINIQUE DONALDSON: Mm-hmm. 00:20:09.41 AMY BITTERMAN: I know [LAUGHTER] how wonderful that relationship is, and I think it's something that a lot of states will kind of envy and hope some of the information you all shared can kind of help get them more towards that type of relationship too. 00:20:24.75 DOMINIQUE DONALDSON: Yes. I agree. 00:20:26.28 AMY BITTERMAN: So what are your plans for the future? What do you have coming up regarding your working relationship, EDPass, EDFacts Modernization? What's going to be coming up next for you? 00:20:38.57 ADAM CHURNEY: So now that we have a year of EDPass behind us. 00:20:41.30 AMY BITTERMAN: Mm-hmm. 00:20:41.57 ADAM CHURNEY: And we've gone through most of those code errors that, you know, consumed some time. The plan this year and going forward is, and you can search for whatever infographic you want, but there's that data, information, knowledge, wisdom pyramid, where data's down at the bottom, then it gets a little smaller: information, knowledge, and wisdom. SEAs are really good about collecting data. 00:21:03.04 AMY BITTERMAN: Yes. 00:21:03.40 ADAM CHURNEY: Some of them are getting better at making it into information where you organize that data into a fashion that makes logical sense. It's the last two tiers, knowledge and wisdom, where some SEAs have better ability to do that. And one of the things that we're moving forward with is we're creating-- we're using Power BI to make informed data decisions. And we're using Power BI as the tool to express that and communicate that. Not just building the reports. What we had in our EDEN portal that we talked about earlier within our application were just canned static reports, here are the numbers, here are the LEAs, et cetera, which are useful. But when you have 2,200 or 2,300 schools, 230-something LEAs, it gets awash in noise. So if there's anything that stands out, you just don't see it in a normal Excel report. So using Power BI and other visualization techniques, what we're hoping to do is highlight the changes, highlight stuff that we should sit there and talk about where it's not just, oh, does everything look good? It's, Hey, this system's doing something different. What are they doing? And finding out what actual systems are doing. And we've started creating some Power BI reports that really highlight those and save us tons of time on the back end because instead of sitting there filtering through a report of 2,300 schools, here are the 10 schools that we identified with this crazy change. 00:22:26.43 AMY BITTERMAN: Mm-hmm. 00:22:26.65 ADAM CHURNEY: Let's talk about them.
Is this something that we need to fix? Is this something that's going to happen more in the future? And just using visualization and the abilities with the data dashboard to help communicate not just to my teams as they're making sure the data's correct going into EDPass, but also having reports for Dominique and other teams around our state to say, Hey, here's your data in a nice pretty fashion with colors. And then highlight the stuff, here's stuff that you might want to know, because a lot of data, 95% of it in, you know, most cases is the basic case scenario. I always like to say with my enrollment scenarios, most students come in day one in school A and leave in school A at the end of the year. Once you start getting to the next level of students that transfer and all this other stuff, those small 5%, last 1%, et cetera. 00:23:17.12 AMY BITTERMAN: Mm-hmm. 00:23:17.41 ADAM CHURNEY: Are really where the interesting stuff happens. And the quicker you can get to those and start talking about those, that's where you get the final little bits of data quality resolved. 00:23:27.52 AMY BITTERMAN: Mm-hmm. 00:23:27.80 ADAM CHURNEY: And that's where, with EDPass being able to iterate through stuff quickly, that's really helping there. Our hope is with our Power BI visualizations, we're going to get that before the end of the window and really understand our data to the next level and create knowledge and wisdom of the data that we're submitting to EDPass. 00:23:45.86 AMY BITTERMAN: Yes. And really use the data that you're collecting, be able to-- in Dominique's group, and then with the schools and the districts. 00:23:53.79 DOMINIQUE DONALDSON: Right. 00:23:54.17 AMY BITTERMAN: It sounds like providing them the data so they can really use it to think about what are we doing that's working well, what maybe isn't working as well, what changes do we need to make? So all of the stuff that you hope finally happens after you kind of feel secure in your data quality. 00:24:10.91 DOMINIQUE DONALDSON: Yes. I absolutely love that because the Power BI visualizations will provide the insights that LEAs need to really identify the positive impacts on what they are doing and identify opportunities for growth, all for student improvement. 00:24:32.55 AMY BITTERMAN: Great. Well, looking forward to hearing more. Maybe you both can come back and talk more about those reports once you've gotten them going and are using them with the state and districts; we'd love to hear more about them. 00:24:43.40 ADAM CHURNEY: Thank you for having us. 00:24:45.09 AMY BITTERMAN: Thank you both so much. Really appreciate it. 00:24:47.66 DOMINIQUE DONALDSON: Thank you so much. 00:24:50.47 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content or connect with us via the podcast page on the IDC website at ideadata.org…
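The change-flagging idea Adam describes above (surfacing the handful of schools whose counts moved sharply instead of scanning a 2,300-school report) can be pictured with a short sketch. Georgia's actual reports are built in Power BI, so the Python below is only an illustration; the column names and the 25 percent threshold are hypothetical assumptions, not the state's real rules.

```python
import pandas as pd

# Hypothetical table: one row per school with prior- and current-year counts.
def flag_large_changes(df: pd.DataFrame, pct_threshold: float = 0.25,
                       min_count: int = 10) -> pd.DataFrame:
    """Return only the schools whose counts changed sharply year over year."""
    df = df.copy()
    df["change"] = df["current_year_count"] - df["prior_year_count"]
    # Guard against dividing by zero for schools with no prior-year students.
    denom = df["prior_year_count"].where(df["prior_year_count"] > 0)
    df["pct_change"] = df["change"] / denom
    flagged = df[
        (df["pct_change"].abs() >= pct_threshold)
        & (df[["prior_year_count", "current_year_count"]].max(axis=1) >= min_count)
    ]
    # Largest swings first, so reviewers start with the most unusual schools.
    return flagged.sort_values("pct_change", key=lambda s: s.abs(), ascending=False)

# Illustrative data only: school B jumps from 45 to 70 and gets flagged.
data = pd.DataFrame({
    "school_id": ["A", "B", "C"],
    "prior_year_count": [120, 45, 200],
    "current_year_count": [118, 70, 195],
})
print(flag_large_changes(data))
```

The same filter-and-sort logic translates directly into a Power BI measure or visual filter; the point is simply to rank schools by the size of the change and review the outliers first.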
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date With Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey, it's Amy, and I am so excited to be hosting "A Date With Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date With Data" is brought to you by the IDEA Data Center. 00:00:24.76 >> Hello, and welcome to "A Date With Data." On this episode, we're going to be throwing a spotlight on 619 data with Candice Taylor, who is the Early Childhood Special Education 619 Supervisor, and Becky Palculict, who is the 619 Coordinator, both with the Mississippi Department of Education. Thank you both so much for being on the podcast. And I would love to get things started by having you each say just a little bit about your role at the Department. Candice, do you want to kick us off? 00:00:59.81 >> Absolutely. Thanks for having us, Amy. So I am Candice Taylor. I have been with the Mississippi Department of Education for 9 years. This is my 28th year in education. I've done all kinds of special education work in public schools over the years, and I've been ... I was 619 Coordinator for several years and then finally was able to grow our team, and now I have Becky. 00:01:23.44 >> Yes, Becky, what about you? Tell us about your role. 00:01:27.40 >> Well, again, thank you for having us. I am Dr. Becky Palculict. I am the 619 Coordinator. I took Candice's place, and she has been a great mentor to me. This is my 32nd year in education, all in special ed. I have enjoyed learning everything there is to learn about preschool education. And I ... My primarily ... My job is to provide technical assistance, answer questions or whatever for the districts within our state and to help them with whatever it is they need help with. 00:02:09.19 >> Okay, so all things 619. That is ... 00:02:12.00 >> Yes ... 00:02:12.57 >> That is you. 00:02:13.05 >> ... all things 619. 00:02:15.39 >> Great, so we have the right people on the call on this podcast episode, because that's what we're going to be digging into, is 619 world. And to kind of set the stage, can you start off by just telling us kind of the data story of your 619 indicators? 00:02:33.84 >> So I've always done this work in what I feel like are three buckets: indicator six, indicator seven and indicator 12. 00:02:40.66 >> Mm-hmm. 00:02:41.18 >> And we ... I've tried to do work on each one of those as we've gone along, and as far as our indicator six, we really ... We don't have universal pre-K in our state, so that's one of the things that we've really worked on a lot over the years, is trying to increase the number of seats for preschoolers, particularly in general education programs. That's what we really want, is to really help that LRE data. 00:03:07.84 >> Yep. 00:03:08.07 >> And one of the ways we've done that is offering what we call the blended program where we provide funding to school districts to have them start new classes that are gen ed classes but to make sure they include children with disabilities so that we can increase those opportunities for our children. 00:03:30.51 >> Mm-hmm. 
00:03:31.04 >> As far as our indicator seven data, we have changed our process for how we collect that data over the last few years. We have gone from a single screener to the child outcome summary process, and so that's kind of been where that work has moved and is moving toward now. 00:03:50.87 >> Mm-hmm. 00:03:52.05 >> And as far as our indicator 12 work, we do joint trainings with our part C folks, and we really try to keep ongoing communication with them. And we're actually looking to increase that connection with them. We are about to hire a coordinator for our team that is specific to the CSPD, the Comprehensive System of Personnel Development. 00:04:16.02 >> Mm-hmm. 00:04:16.30 >> So we'll be able to have that person really work on those connections with part C, as well. Get that data up. 00:04:22.54 >> Yeah, that's wonderful, having especially someone that part of their position is focusing on that, because I think that tends to sometimes fall by the wayside, or it is kind of a little piece of what one person is responsible for and doesn't always get the attention that it needs. So that's exciting. Can you talk about some data quality challenges that you've encountered over the years? And what are some ways that you've tried to address them? 00:04:50.26 >> So one of the big things that has happened during my time doing this work is historically our state used a single screener to collect data for indicator seven. And when the time came for our contract to end with that provider, we decided to move to the Child Outcome Summary Process because so many states and territories use it, both in part B and part C. 00:05:15.67 >> Mm-hmm. 00:05:16.46 >> And what we really ... We really felt like that single screener was only giving us a snapshot of what the child could do. 00:05:25.02 >> Mm-hmm. 00:05:25.35 >> And we really felt like if we moved to that COS process, that would give us more of a whole picture of the child. So we're still a little early yet in the process to see what that data looks like, but we anticipate a real increase in the quality of our data since it's not just that single screener that's determining the child's outcomes but really letting the IEP committee decide and talk about the progress that the child has made. So that's been a big one for us. And, Becky, you've also worked on some data things for indicator 12, as well. Do you want to talk about that? 00:06:04.84 >> With indicator 12, the data team and I have met, and we meet twice a month about whatever data it is that we need to be speaking about. 00:06:14.93 >> Mm-hmm. 00:06:15.70 >> We have revamped how the districts and part C ... part B and part C talk. We require the districts to contact part C monthly and then provide us with a report to help us know that there is communication between them now, and we take that data and look at it to see what it is that's needed. And I believe it's really helped with knowing what the transitioning of children from part C to part B looks like. 00:06:52.06 >> We're also using technology to try to increase our communication, as well. We changed our notification system, and so now in addition to our technology folks, who are in a different office than us, Becky and I also receive a notification when we receive data from the part C side because our systems are not connected. We hope someday they will be, but historically and right now they've not been. And so receiving that notification lets us know that that communication is still flowing.
00:07:21.75 >> Great, and that's a step in the right direction, it seems, to getting it all fully connected. 00:07:26.11 >> Absolutely. 00:07:26.45 >> And then once you get the notification, does that go to the district, as well, or do you then have to pass it off to the district? 00:07:32.70 >> So our technology folks do that; they're able to put that information in the student information system that we use for our state. 00:07:39.66 >> Okay. 00:07:39.75 >> And that way, districts get notified. But that lets us know that things are still moving along. 00:07:45.46 >> Yes, yes, and that ... Yeah. That seems probably not completely seamless at this point, but at least there is that process that is being followed, and it seems like things can happen fairly timely, hopefully, doing it that way. So we talk a lot about data culture and trying to build the capacity of the districts and SEA staff around data culture. What are some ways in Mississippi that you're really trying to strengthen the data culture? 00:08:17.47 >> Well, as Candice had mentioned, we have been meeting with the part B friends that we have. We provided joint training across the state and regional training with different districts and those that might need the training and provided assistance to them that way. Then we also provide technical assistance and monitoring to the different districts as they are trying to understand this new COS process that we're doing. 00:08:52.89 >> Mm-hmm. Mm-hmm. 00:08:53.78 >> Many of the districts are still ... Even though this is, I believe, our third year into the COS process, they're still trying to determine how they do it because they've had turnover in the people who were trained in it to begin with. 00:09:09.90 >> Yeah. Mm-hmm. 00:09:10.78 >> So we're retraining people, and it's getting there. They're learning, and they're understanding the process of how it's all supposed to work and come together cohesively. 00:09:22.40 >> Yeah, that's probably a challenge that goes without saying, is turnover. 00:09:25.68 >> Mm-hmm. 00:09:26.03 >> I know we're not the only ones in our state with that problem. We ... Other states are experiencing that, as well. It is hard to retain people in our school districts, and so even though we train and we train, we feel like every year we kind of start over with some folks, having to provide additional training to people who are new. 00:09:46.02 >> We actually have been out to many districts that have come together and gotten neighboring districts together, and I've provided a small regional training and been there for them to ask those questions so that there's a lot more understanding, I believe. And then knowing that they can contact me ... 00:10:10.27 >> Mm-hmm. 00:10:11.01 >> ... and ask any question that they have has gone a long way because pretty much daily, I get an e-mail or a phone call that says, "Hey, can you help me with this?" 00:10:20.75 >> Mm-hmm. 00:10:21.05 >> And if I don't know, I make sure I find the answer, and I get right back with them. So that technical assistance has really helped throughout the state with the ... 00:10:32.78 >> One of the things about being able to build our team that has been so great is that when it was just me, I had a very limited capacity to stay in touch with the data. And so one of the things that Becky and I have talked about from her very first day is that I really wanted that. I wanted her to get connected because we, Becky and I, work in the Office of Early Childhood.
Our data folks, our data manager and all of our data team, they work in the Office of Special Education, and so we don't live together. 00:11:05.70 >> Mm-hmm. 00:11:05.91 >> We're not even in the same building. 00:11:07.34 >> Oh, boy. 00:11:07.78 >> And so one of the things I really wanted Becky to be able to do when she came on was to be able to coordinate more often. And so she's been able to have meetings with the data team and herself, just really trying to keep up with what's going on instead of waiting until data gets done yearly and then find out after the fact. 00:11:30.47 >> Yeah, that's one thing, just having worked with you all in Mississippi for a while. There definitely is a very strong relationship and connection, it seems like, between you all with 619 and part B data and the program side of things there, too, so that's a real plus, I think, when you have those tighter relationships, especially around the data. What are some plans you have for moving forward and continuing to improve the quality of your 619 data? 00:12:01.02 >> We have actually got more training throughout the districts planned. Also, I have ... Mississippi was chosen among four states to be part of a cohort to ... It's basically the trainer cohort, but to be able to use the COS process to make the IEPs better, being able to use those outcomes to write the goals so that they are more appropriate and to write the IEPs as smart IEPs for the children as they move into kindergarten and first grade. And so the plan is once the cohort is over, the team that has been picked to go and do the training, they come back and do regional trainings across the state with the districts so that they can learn how to use those exit outcome summaries and scores to write better IEPs that are more appropriate for the children as they move through their kindergarten and first-grade years and further education. 00:13:24.68 >> That's really exciting, because that's a great example of we collect all this data. Well, how are we using it? Sometimes we don't use it for a whole lot beyond kind of our compliance and the standard reporting that you need to do. But this is a great way to show how you're really going to use that data. To make changes and improvements, so ... 00:13:46.11 >> Right. Make it cohesive. Move through the pre-K, kindergarten, first-grade years and beyond. I'm really excited about it. We all are. 00:13:57.32 >> And that's one of the things that we continue to struggle with, we feel like our districts kind of struggle with, is it's not about collecting data to check off a box to submit to the state so that we can submit to The Feds. It's about seeing where our children are and how to provide better services for them, more appropriate services, to increase their outcomes. 00:14:20.73 >> Right. 00:14:20.83 >> And that's the big thing that we're trying to help districts understand. It's not just about those numbers. It's about boys and girls and helping them have the best outcomes they possibly can. 00:14:31.57 >> Yeah, absolutely. Well, thank you both so much. Learned a lot about what's going on around 619 data, and sounds like some really wonderful work happening and coming up soon, so thank you both so much for sharing your story. 00:14:49.96 >> Thank you for having us. 00:14:51.09 >> Yes, thank you. 00:14:53.84 >> To access podcast resources, submit questions related to today's episode, or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content. 
Or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/. ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date With Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey, it's Amy, and I am so excited to be hosting "A Date With Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date With Data" is brought to you by the IDEA Data Center. 00:00:24.68 >> Welcome to "A Date With Data." Kim Murray, who is the Director of Special Education, and Jennifer Nicosia, who is the 619 Coordinator with the New Jersey Department of Education, are here to share with us New Jersey's 619 data journey. So glad to have both of you here today. Can you tell us a little bit about your background and your role with New Jersey? 00:00:48.21 >> I've been the Director of the Office of Special Education for 2 years in February, so it's a rather new role for me, and it's exciting to be in it. But I have been with the Department in the Office of Special Ed for about 15 years, mostly on the compliance side, monitoring, complaint investigation, and everything that comes with policy and procedure. 00:01:16.41 >> Great. Thank you. Jennifer? 00:01:18.77 >> Good morning. My name is Jennifer Nicosia, and I'm the 619 Coordinator for New Jersey, and I work with Kim in the Office of Special Education preschool. And what my position really means is that I support preschoolers ages 3 to 5 across the state who have individualized education programs. And on a day-to-day basis, that entails providing technical assistance and professional development and working with different organizations and agencies to support those students. I also have a background. I won't say how long, but let's say 20-plus years I was a special education teacher, and I worked for many years in the public school system in special ed and preschool. 00:02:08.59 >> Great. Thank you both so much. So let's start off with the story of your 619 data, and, Kim, since you've been with the Special Education Office in the New Jersey Department of Education for such a long time, I'm wondering if you can just talk about some of the data quality challenges that have come up over the years and ways that you've all gone about addressing them. 00:02:32.28 >> Sure. I'd be happy to. With our 619 data, for the longest time, each program office had one data person who was in charge solely of collecting the data, verifying the data as valid and reliable and reporting the data. And that led to some really siloed work because that person was the only one who really had an in-depth understanding of the data and our data collection processes. So it was very hard to improve that when it was held with one person. In 2016, the Department created the Office of Fiscal and Data Services in our division, and now we have a director who understands data, who loves data. She has a team working on the data, so we were able to pull together some more robust processes around data collection, data reporting, our data dashboards. So that piece has been great, and we're building on that work and the collaboration between the offices in terms of how we use the data and how they are collecting and analyzing the data.
One of the issues that we have had with our LEAs is that they were struggling with submitting accurate data, whether it was a lack of training. The staff member who was inputting the data maybe wasn't someone familiar with the data or the data system, so that was an issue. So we were seeing a lot of gaps in the data, incomplete data sets. Not every student would have a complete ... All the fields were completed, which means our data is not valid and reliable. So we've done a lot of training with specific districts. We've identified those who have the most issues with accurate data and making sure that they're getting some personal training and technical assistance to improve the quality of their data. And in terms of performance data, when we in general talk about Indicator 7, we had used the Battelle as our basis ... 00:04:55.00 >> Mm-hmm. 00:04:55.58 >> ... for our performance data. And what we were seeing is not many districts were using it, so the data set was getting smaller and smaller, and it wasn't really capturing what was happening in the preschool world for us. So now we've transferred over to the COS, and we're building that up, which is great. And Jen has done an amazing job with ramping that up and improving the quality of data that we're collecting. Then we've also seen that the LEAs don't really have a connection to the data. They don't see the story the data is telling. So they submit it, and it's kind of, "We're done here. We've submitted it. We're not really analyzing it." So we've been doing a lot with building out data dashboards, making the data more accessible, more user-friendly, and then really explicitly stating the connections that the data has to program improvement and program development. And that's our goal, is to have the data really drive what's happening in terms of decision making and program improvement. 00:06:07.84 >> Yeah. That's a great strategy because if data are being collected for no real purpose or the LEAs can't see the purpose, then why are they going to put in so much effort to make sure it's of high quality? But if they see what it's being used for, how they can use it, how it can benefit them and their own students and families and teachers, then that just gives them more incentive, obviously, to make sure their data are accurate and of high quality. 00:06:35.14 >> Yeah, exactly. That's our goal. 00:06:37.62 >> What are some ways that you're working on strengthening New Jersey's 619 data culture? 00:06:43.40 >> So I think Kim told such an accurate story, since I'm here a little over a year, if you can believe that. 00:06:49.99 >> Mm-hmm. 00:06:50.75 >> So she definitely told the story, and so by the time I came along, a lot of these things were in place. And so I was able to work, under Kim's leadership, with the new monitors and the fiscal office and some of our colleagues who, we can attest, absolutely love data and help to explain the why. So once we did that and we worked closely with the LEAs, what we were able to do was, through coaching, through a lot of technical assistance and professional development, we were able to really start to strengthen New Jersey's 619 data culture. And what I like to say is, so we have Alex Pensiero who works with us, and I would be remiss not to mention her because what she did was she actually makes data not scary. 00:07:43.68 >> Mm-hmm.
00:07:44.10 >> And so she's enabling us to work with all the LEAs across New Jersey and to change it from sort of being fearful of data, and once you understand the why and you start to understand the importance of the connections between everything that we're doing for preschool environment and least restrictive environment and then outcomes, it really starts to change the mindset, and it starts to really improve the data culture to be much more positive. I think, similar to what Kim said, talking about the why, tailoring the professional development, and having relationships that are really very positive with the local education agencies, and especially the leadership of each district that's submitting data, has allowed us to really significantly impact the work that we're getting. So clearly our processes are becoming stronger, but just the nature of the mindset, being comfortable talking about the data, and what does it really mean? And being honest, having those honest, open conversations, I think. And so some of the very specific ways that we've been able to change the data culture is by really sharing it and having things similar to what Kim mentioned. We have a dashboard where schools can go ahead and look at their data. We can provide technical assistance where we go ahead together with an LEA and look at their data and sort of analyze it and then figure out what it means. We have a few projects going on with some external stakeholders based off of the 619 data. And so I think what this does is it changes the culture by sharing our vision about meeting the needs of all children in early childhood. 00:09:42.64 >> Getting that data out there, getting the exposure, the transparency, and I like what you said, making it so it's not as scary or scary at all by kind of holding their hands and really walking through it with them so that they can understand it and be able to explain it themselves, is so, so important to building that strong data culture. Great, and what about moving forward? Do you have plans, any changes, new initiatives that you'll be putting in place to really continue on this journey of improving the quality of your 619 data? 00:10:19.93 >> Data quality is a priority for us, and it's ongoing, right? We're always building, seeing how we can do better, seeing how we can be more accessible. So in our office, everything we do is built on two foundational beliefs: equity and inclusion. And we don't make decisions without the data to back it up and to drive our initiatives forward. So when we're talking about inclusion in preschool, which is something that Jen and I hold very dear, and how we can improve our outcomes, we have external partners that we're working with around this, but the data is driving that. 00:11:02.09 >> Mm-hmm. 00:11:02.85 >> And that starts the conversation. When we meet with an LEA, here is your data. Let's talk about it. This is what it means. This is what we're seeing. These are your trends. So everything we do is around ... starts with a data review. 00:11:20.39 >> Mm-hmm. 00:11:21.02 >> And we imagine that we'll continue to do the same, right? That will continue to grow. We continue to work on data quality, whether it's Jennifer working with the system where we collect the Indicator 7 data. She's constantly refining that as we see where an LEA is struggling or they've identified sort of a work-around that we didn't know existed. We go in, and we fix that. So we make that system more robust.
We're working on business rules to make sure that LEAs are submitting quality data and complete data sets. We work with our Office of Fiscal and Data Services to see how we can reimagine this data, how we can tell the story with the data and not just make it accessible for the LEAs. That's just one piece of our audience. But we have families who are interested in this and our external stakeholders and our advocacy groups. We see that sort of collective impact, everyone being on the same page and speaking the same story about the same data ... the data in the same way. That's how we can leverage our work, and that's how we can maximize our outcome. So we really are trying to develop this common language and common understanding around our data. 00:12:47.60 >> Yeah. I love that so much, just how you're being so thoughtful and intentional of thinking even just beyond the SEA and the districts but how critical it is that families and community and other stakeholders need to be at the table and understanding this data, too, and really part of the conversation. 00:13:09.26 >> And I think I would be remiss if I didn't say that we continue to engage in really nice conversations and professional development and training from IDC. We always ask for help on how we can improve our data, how we can make it better. We're very grateful for that because it has helped drive a lot of the improvements that we have seen in our data. 00:13:32.64 >> Great. Well, thank you, really appreciate that. Well, thank you both so much for spending some time chatting with me, doing some really great work in New Jersey, and love to hear what's happening next and what changes and how this is all going moving forward. So thank you for your time. 00:13:53.53 >> Thank you so much for having us. 00:13:55.74 >> Thank you. 00:13:58.30 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content. Or connect with us via the Podcast page on the IDC website at IDEAdata.org.…
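As a rough illustration of the completeness business rules Kim mentions, the sketch below checks whether each student record has every required field filled in and reports a completeness rate per LEA. The field names are hypothetical placeholders, not New Jersey's actual collection layout.

```python
from collections import defaultdict

# Hypothetical required fields for a preschool special education record.
REQUIRED_FIELDS = ["student_id", "lea_id", "age", "educational_environment", "disability_category"]

def completeness_by_lea(records):
    """Return the share of records per LEA with every required field filled in."""
    complete = defaultdict(int)
    total = defaultdict(int)
    for rec in records:
        lea = rec.get("lea_id", "UNKNOWN")
        total[lea] += 1
        if all(rec.get(field) not in (None, "") for field in REQUIRED_FIELDS):
            complete[lea] += 1
    return {lea: complete[lea] / total[lea] for lea in total}

# Illustrative records: the second one is missing its environment code.
records = [
    {"student_id": "1", "lea_id": "0010", "age": 4,
     "educational_environment": "regular_ec_program", "disability_category": "DD"},
    {"student_id": "2", "lea_id": "0010", "age": 3,
     "educational_environment": "", "disability_category": "SLI"},
]
print(completeness_by_lea(records))  # {'0010': 0.5}
```

A rule like this can run before submission so an LEA sees which records would count as incomplete while there is still time to fix them.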
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:00.50 [MUSIC] 00:00:01.52 >> You're listening to "A Date with Data," with your host, Amy Bitterman. 00:00:07.34 >> Hey, it's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.78 >> Welcome to "A Date with Data." Authentic and broad stakeholder engagement is required as part of the state performance plan annual performance report, or SPP-APR. And on this episode, I am joined by Susan Bineham, who is a manager in the Special Population Strategic Supports and Reporting Division of the Texas Education Agency. Susan is going to share with us how in Texas they've been able to continuously and meaningfully engage stakeholders in their IDEA data. Welcome, Susan. Thank you so much for joining us. 00:00:59.17 >> Well, thank you so much for having me. 00:01:01.70 >> Great. To get us started, can you just say a little bit about your role, and how long you've been with TEA? 00:01:08.86 >> I started with TEA on August the 1st of 2018. So Texas rolled out a brand new, big monitoring division and program, and I was originally part of that group that created our new monitoring program. And then in 2020, I became the SPP-APR coordinator. 00:01:32.55 >> Great. And you're still the SPP-APR coordinator currently? 00:01:36.25 >> Yes. Yeah. 00:01:36.45 >> Okay. Wonderful. All right. So we're talking about stakeholders, and I know in Texas, you have several different groups that you convene. Can you just start off by telling us who is part of these groups, and what is the purpose for each of the different groups that you have? 00:01:56.17 >> Sure. So first we have our continuing advisory committee, or our CAC -- that's the required committee we have; 17 governor-appointed members from around the state. They represent parents, general education teachers, special education teachers, consumers, special education liaisons. Most of them are individuals with disabilities, or parents of children with disabilities. They meet quarterly at a minimum. Agendas are posted publicly, and public comment is encouraged for this group. Their biographies and meeting minutes and recordings are all posted on the Texas Education Agency website. 00:02:45.48 They advise TEA on standards related to significant disproportionality, and they are required by statute to submit a report to the legislature bi-annually, which recommends changes to state law and agency rules related to special education, and the meeting date, agenda and minutes are all published on the TEA website. We also have education service centers. So Texas is divided into 20 regional ESCs, we call them. 00:03:15.35 >> Mm-hmm? 00:03:16.08 >> They're intended to be the first point of contact for the local education agencies, and they are non-regulatory; they provide technical assistance and professional development for the LEAs. So we meet with -- each one of those centers has a special education director; they're not just special education centers. They service LEAs in pretty much any capacity. 00:03:40.61 >> Mm-hmm?
00:03:41.87 >> We meet with their special education directors monthly to discuss new initiatives. But their role might be in providing training, how to support LEAs. They're really the first point of contact for LEAs, and because they're regional, if it's a real rural, small area, it has a different perspective than maybe people from the Houston, Dallas or San Antonio areas. So they provide very meaningful, good feedback to us on how we can support them better, and then they can support schools. 00:04:18.98 We also have a special education director's panel. So each of those 20 service centers, they recommend a special education director from a local education agency. So that is a select group of special education directors who work with TEA throughout the year. As I said, they're nominated to participate by their regional education service center. Their purpose -- it's a closed group -- their purpose is to provide feedback and input on initiatives and projects related to special education, including the SPP-APR. This panel provides an opportunity to capture the current needs in the field from the perspective of a special education director. 00:05:12.10 Additionally, this panel allows us the opportunity to gather stakeholder input and the time to collaborate with LEA special education directors that are currently in the field. We also have our Texas Continuous Improvement Steering Committee, what we call "TCISC." This is the group that really works on the SPP-APR with us. They are an external work group tasked with advising on topics such as SPP-APR indicators, areas of slippage, Indicators 8 and 14 results and outreach, sampling plans, potential legal rule changes, legislative updates, state assessment participation, and our Indicator 17, our SSIP. 00:06:02.34 This group has about 15 people representing key roles across diverse perspectives. The nature of this group represents parents, teachers, service providers, evaluation personnel, special education directors, campus administrators, districts, ESCs, higher education institutions, advocacy and professional groups, other related state agencies, and other stakeholder groups whose missions include education of students with disabilities. So this group really is our in-the-weeds workgroup. We do target-setting with them, and they -- we ask them to commit to about a three-year period so that we have people that are really informed on the process. This group is also, like I said, closed; we do not record these meetings. It's a real work group. We do trainings on SPP-APR each year and discuss potential changes. We discussed our sampling plan this past year, made some tweaks to it for SPP-8, and they really helped us with that. In fact, they're going to help us get the word out on 8 and 14. Some of those advocacy groups are going to help explain why not everyone will be participating in 8, because there's a sampling plan, but we do a census for 14. They're talking to us about putting things on their web pages to help the validity of these surveys. We contract out for those surveys, so some people want to make sure, hey, is this spam, is this okay? So those advocacy groups, which sometimes, you know, people, they can be our worst critics. 00:07:58.45 >> Mm-hmm. 00:07:59.98 >> But that's why they're in the room, because if we had them in on the front end, that's the best place to have them. They can advise us, they can talk to us about potential pitfalls, how to improve things.
And they even initiated, hey, can we meet with you separately to help you with these two surveys? So some of these groups, they're really great work groups. People that participate in them want to be there, because their voice is really heard, and we take what they say and integrate it. And if we can't, we tell them why. 00:08:36.03 >> Mm-hmm. 00:08:36.33 >> Like, okay, "Well, this, we can't really do this, but how about we look at it from this angle?" So it's really productive, great work groups. We don't waste their time. We send them information prior to let them know what we're going to talk about, so they can come in with some questions already ready for us, also, for each group. 00:08:54.49 >> Yeah. Wow. So it sounds like you are doing so much in the state, bringing in so many diverse stakeholders, such a broad group. And you're doing just a wonderful job, that's so exciting. And I know other states out there in particular would really like to hear if there are certain strategies you're using that have built stakeholders' capacity, so that you're not every year having to start over again with explaining the indicators, and building that base for them, that they have a lot of that information going in, and being able to sustain the groups, like you said, asking for that three-year commitment. Yeah, what are just some ways that you've really been able to build that capacity and keep them engaged? 00:09:40.58 >> Well, some of it is what you'd typically expect. We present with the audience in mind. So we might have the same presentation for each group, but we're going to tweak it, or adjust it, given who's in the room, or who's in the Zoom. 00:09:56.77 >> Mm-hmm. 00:09:57.45 >> Our special education directors may or may not know some of this information. Our advocacy groups may know more than our special education directors sometimes, because they've been doing it a long time. And parents are coming from the parent perspective. So when we're presenting to a diverse group, we really try to keep all of those points in mind so we can -- people can understand and provide good feedback. 00:10:21.79 >> Yeah, you could tailor it. 00:10:23.59 >> Yeah. With that TCISC group I was talking about, the one that really works with our SPP-APR, we do presentations and training about each indicator, especially when we're setting targets. Even if they've been in the group for a while, because sometimes there's a little tweak to it. But they don't do this every day. 00:10:45.68 >> Yeah. 00:10:46.10 >> So each year, we'll do a little refresh reminder, go through the indicators, where the data comes from, how it's used, why we set the targets where we set them. And they help with the target-setting. So they're thinking about things from their -- advocacy groups are thinking about things from the lens of families. So they, again, as I said earlier, they provide input and outreach information for SPPI 8 and 14 surveys. 00:11:14.81 >> Mm-hmm. 00:11:15.22 >> And they've had some really great input on some changes we've made with the digital world and people using their phones, they've provided some really great information for us on getting that response rate to move up a little bit. 00:11:32.23 >> Mm-hmm. 00:11:33.01 >> We also have our continuing advisory committee. They go beyond target setting; that's the regulatory group. Those meetings are recorded. They're posted on our website, the agendas are posted. We encourage public comment for those meetings. So that's the real regulatory meeting, I guess I should call it.
The ESC directors, the service center directors, we talk to them about how their data from each of their regions impacts the SPP-APR. 00:12:10.43 >> Mmm. 00:12:10.89 >> So as I was saying earlier, some are real rural, and some are very urban. So we parse out the data to look at the difference between rural and urban, what kind of supports we need to help provide for them. We do presentations on individual indicators, the entire SPP process. And beyond the SPP-APR, we present on our RDA websites any new initiatives that we have coming out. So they don't advise us only on SPP-APR, we talk to them about everything: "Hey, we want to roll this out, what do you think?" And then people from the rural community can tell us, "Hey, we need this," or from urban areas say, "We could do it like that." But that's -- we put pretty much everything in front of our ESC directors, because they support statewide. 00:13:04.13 Our special education director's panel, we went through this year our theory of action for our SSIP, and how the data from each LEA impacts the SPP-APR. So we really went through kind of the flow of what we provide, how it goes to ESCs, how it goes to LEAs, how it goes to the teachers, the classroom, all the way down to the student level, because the SPP-APR seems kind of nebulous, it's this data-gathering, and we're writing this report, so when we really connected the dots for them, they can see, oh, I understand a little bit better now how this can actually support the students. 00:13:47.23 And there is a high turnover in special education directors, I'm sure all states are dealing with some of that. So this panel is our specifically-selected directors, but there's still some turnover there, so we do repeat trainings. But again, since they don't do this every day, they don't live and breathe the SPP-APR, so for any training, they're actually very appreciative. And they always say, "Oh, I learned something new this time" -- 00:14:16.33 >> Yeah. 00:14:16.43 >> -- or something got tweaked a little bit so they can learn a little bit more about what we mean by significant disproportionality, and those kinds of things. 00:14:26.33 Our TCISC group, we do ask those members to commit to a three-year term. Those meetings are quarterly, but we frequently will give them, like, homework before the next meeting. 00:14:39.54 >> Mm-hmm? 00:14:40.25 >> So we'll send an email out saying, hey, meeting's coming up, these are going to be some of our topics, and give them some information so that when they get to the meeting, we can get right to work. 00:14:52.49 >> Yeah. I was going to ask, like, what's an example of homework that you might have them do? 00:14:57.75 >> We did -- when we were going through our sampling plan, we gave them kind of a draft of our sampling plan for 8 to look at. And it was pretty comprehensive, so it was a lot. We wanted them to at least look at it. When we were doing target setting for this new six-year cycle, we had them kind of fill out a questionnaire so we could divide them into some groups. So we divided the SPP-APR kind of by indicators that would go together naturally -- 00:15:30.65 >> Yeah. 00:15:31.25 >> -- and then sent the information. And they met outside of our group in between our quarterly meetings to work on target setting. They called and met with some of us separately, but we only had the four quarterly meetings scheduled, and then they had separate work groups when we did all that target setting work.
We also worked with them -- we'd do trainings and presentations for each individual indicator. We review the targets with them each year to say, these are the targets we're going through. And we also call on them for our slippage reasons. So when we got all of our assessment data in January this year, we had a meeting already on the books with them to talk about potential slippage, and talked about what we felt the reasons for slippage were. And then they provided additional reasons. And we included all of that in the SPP-APR, because as I said, this is a very diverse group, working with parents and advocacy groups. So their input is really -- it really has a good pulse of what's going on out in the community. 00:16:41.70 >> Yeah. It's very valued, and you show that. It's not a matter of just checking a box, but you're actually incorporating in very transparent ways what you're getting back from them. 00:16:52.65 >> Yes. And they ask us about it. And they can see it in the SPP-APR. 00:16:58.16 >> Yep. 00:16:58.60 >> We're, like, "Here's the comment you guys provided us." So if -- when we do that, it's also very helpful, and it validates their work. 00:17:06.58 >> Great. Is there a highlight or a story, or something that you're really proud of in particular related to the stakeholder engagement you have going on? 00:17:18.24 >> We have some other stuff that goes on that I'm happy to talk about, because it's the most exciting thing. 00:17:24.76 >> Mm-hmm? 00:17:27.05 >> What I'm most proud of goes beyond what Texas does for the SPP-APR and target setting. We have a family engagement and outreach coordinator who manages our SPEDTex website. This website is available to anyone across the state. Parents can participate in trainings and focus groups. It has many resources provided in English and Spanish. Parents can create an account if they choose to. They can enter their child's IEP, the date of their last IEP, and it will send them reminders saying, "Hey, this is coming up, here's some things you might want to do." It has information on procedural safeguards, any of that. So that website is very interactive. And it's specifically targeted at parents and families -- 00:18:22.22 >> Mm-hmm? 00:18:22.69 >> -- to help them understand processes. And they do webinars -- whenever they do a webinar, it's also done simultaneously in Spanish, so that both groups get the same information at the same time; it's not just translated into Spanish. They present in Spanish. 00:18:40.38 >> Hmm. 00:18:41.05 >> So I think that's a really positive effort to include our Spanish speakers across Texas. We're a large state with a lot of different languages. 00:18:52.31 >> Yeah. 00:18:53.34 >> Our Dispute Resolution Process webpage has information on special education complaints, but also resources for special education dispute resolution. And again, this website is available in English, Spanish, Chinese, Vietnamese, and Arabic. So we work really hard to not just provide things in English, we want to hit those top languages across the state as best we can. 00:19:23.28 Also, when they're participating in the cyclical monitoring, we have a stakeholder survey for families that have students with disabilities. So if your school district, or LEA, is engaging with us for their cyclical monitoring, we send out this survey. The questions follow three constructs: engagement, like opportunities for staff to collaborate, or parents to collaborate, or how schools provide information out.
So there's engagement, and then understanding how data is used when creating IEPs, importance of inclusion, and also competency -- so, areas they can improve. So it's engagement, understanding and competency. So the questions are just general questions, but the answers give a good understanding of this: do you meet regularly on this, so we can measure a little bit, the best we can. It's kind of like a consumer survey that you would do. 00:20:28.65 >> Mm-hmm. 00:20:29.05 >> And again, these surveys are offered in Spanish, Vietnamese, Filipino, Chinese, Burmese and Arabic. And the way we set them up along constructs, we can run the data mathematically to see how they scored. We don't have to worry about translating all the information back, we can have each answer numbered and run the engagement, understanding, and competency scores through that. 00:20:57.19 So all that's pretty exciting. I really like that we work hard to provide different languages, some groups, some supports for parents. We're a big state, we do a lot with Zoom. So some of these stakeholder groups, we're able to reach the far rural areas, because they can participate by Zoom, travel isn't required, because that really hampers people's ability to participate if they have to come to Austin from El Paso. That's a couple of days. They're going to have to fly in, spend the night, go to your meeting -- it's a big ask, asking people to participate in person. 00:21:38.34 >> Yep. Yeah. As much as, I think we, yeah, are all kind of Zoomed out. But it's been a great tool for sure in terms of being able to involve a lot more folks to participate than otherwise might have been able to. 00:21:56.55 >> Absolutely. We definitely have more participation, more regular participation. 00:22:02.06 >> Yeah. 00:22:02.73 >> People can take a couple of hours out of their day pretty easily, rather than having to commit a whole weekend or something, to come into town. 00:22:12.94 >> Yeah. Well, you have so many wonderful activities and initiatives going on. Do you have plans for anything coming up? Any changes you're going to be making? Any new areas you'll be focusing on? 00:22:29.47 >> There are some things that our technical assistance and professional development groups are working on, and some contracts, so they will be touching base with our ESCs and our special education director groups. And we also talked to our TCISC about it, because we want their input on how we can message out to parents, hey, this is stuff that's going on, we want your input also. Or does your student engage in any of these activities? So any time we push out new initiatives, or things we want to change or tweak, we really talk to these groups, because they can also point out to us any unintended consequences, like "Oh, this project's working really great, why are you changing it?" Or, "Just do this little tweak, and this will be great," or, "We can really get a lot of teachers trained if we had this or that." So it's a lot of open back and forth conversations with our professional development groups, our technical assistance groups. And usually, I'm included in that because it affects the SPP-APR. 00:23:35.05 >> Yeah. 00:23:35.16 >> So we work really hard to show those connections, too. 00:23:39.07 >> Absolutely. 00:23:39.70 >> Those LEAs, with all our new special education directors that aren't familiar with the SPP-APR, they can see how, when they look at our theory of action, you can see the direct line from the state all the way down to the student, so it's very helpful.
And like I said earlier, we do periodic trainings or reminders about that process also. 00:24:02.62 >> And that's so important to be able to tell the story so that those at the local level see how it directly impacts them and their students, and how their data feed into the larger region and state. But it all matters, and they can kind of see where they fall in it. 00:24:21.27 >> Exactly. 00:24:22.22 >> Well, thank you, Susan, so much. This was such a wonderful conversation. Learned a lot about what strategies you have, and the work you have going on with stakeholders. And hopefully other states will maybe pick up some tips. And we know this is such a big area that a lot of states struggle with in terms of sustaining groups, and finding diverse participants. But it sounds like you all are doing such a fantastic job, and it was so wonderful to hear all about it. 00:24:50.46 >> Well, I really thank you so much for inviting me. And I appreciate the time. 00:24:55.48 [MUSIC] 00:24:56.68 >> To access podcast resources, submit questions related to today's episode, or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content. Or, connect with us via the podcast page on the IDC website, at ideadata.org.…
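Susan's point about numbering survey answers so construct scores can be computed without translating responses back can be sketched as follows. The item-to-construct mapping and the 1-to-5 scale are assumptions for illustration only, not Texas's actual survey design.

```python
from statistics import mean

# Hypothetical mapping of survey items to the three constructs.
CONSTRUCT_ITEMS = {
    "engagement": ["q1", "q2", "q3"],
    "understanding": ["q4", "q5"],
    "competency": ["q6", "q7"],
}

def construct_scores(response):
    """Average the numbered answers (e.g., 1-5) within each construct.
    Because answers are numeric, a response in any language scores the same way."""
    scores = {}
    for construct, items in CONSTRUCT_ITEMS.items():
        answered = [response[q] for q in items if q in response]
        scores[construct] = mean(answered) if answered else None
    return scores

# One hypothetical family response on a 1-5 agreement scale.
print(construct_scores({"q1": 4, "q2": 5, "q3": 3, "q4": 2, "q5": 3, "q6": 4, "q7": 4}))
# {'engagement': 4, 'understanding': 2.5, 'competency': 4}
```

Averaging within constructs also makes it easy to compare results across the different language versions of the survey, since only the numeric codes are analyzed.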
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date with Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey, it's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.73 >> Hello, welcome to an episode of "A Date with Data." Now that states have submitted their federal fiscal year '22 SPP/APRs and hopefully have taken a much needed break, it is time to start thinking about the next SPP/APR cycle, especially because there are changes coming. On this episode we have two of IDC's TA providers, Rachel Wilkinson and Nancy Johnson, who are going to talk to us about what those changes are, what are some implications that states might want to be thinking about and some ways that states can get ready for those changes. Hello, Rachel and Nancy. Thanks for joining. So a lot of states might be somewhat familiar with what these changes are, but some folks listening may not really be aware, so maybe start off, if that's okay, Rachel, talking about what these changes are to the FFY 2023 submission that will be due in February 2025. 00:01:23.20 >> Sure, so the approved changes that are going to take effect in the FFY 2023 submission, there's really ... I think of it as three buckets of changes, so the first is a change to the introduction, and this will also be a change to the instructions for this upcoming submission for 2023. So OSEP has added language to the introduction instructions to include eight elements, and they provide them in detail, but they're part of the state's general supervision system. Some of this information might already be something that states are including in their introduction currently. Some of the items look similar to the additional language that was added to the template and the instructions for the FFY 2022 submission. But there are some key differences that we did want to highlight so people ... It's good to get this kind of percolating in the mind to think about now because these are things you'll want to be able to easily address when you do get ready to start the submission process for 2023, which is not as far off as we always want it to be. Yeah, so the things that stood out to Nancy and me when we were looking at the changes, particularly to the introduction, were the requirement to include a description of how student files are selected for monitoring for indicators and for the verification of correction of noncompliance. Nancy and I review lots and lots and lots of drafts of the SPP/APR and kind of eat, drink, sleep, dream all things SPP/APR for a good chunk of the year. And we do notice that this is something we're not seeing often in the SPP/APRs we're reviewing, so it's something states are probably going to have to start adding in, particularly that verification of correction of noncompliance section. So sometimes there may just be comments that people have that say something to the effect of, "We verified through additional records that the noncompliance identified for X indicator has been corrected," but this is really asking for more detail of how those files are selected.
And so being thoughtful and mindful of how you're going to respond to this as a state so that you have kind of clear processes in place, I think would be a really helpful thing to prepare for now rather than later right before the deadline is caving in. And then the other item was a description of how the state makes LEA determinations, and then that includes the criteria the state uses and the schedule for communicating the determinations to LEAs. I don't know that every state has that information in an accessible, quick way to write down. A lot of us use our institutional knowledge or people who maybe have notes documented here and there but maybe not in a cohesive description. So that's something to also be thinking about and keeping in mind as you prepare for the revisions to the introduction and these new components. So that really gets at the introduction piece, so that's sort of bucket one for the changes, so the second changes, the bucket that we're looking at, so to speak, is for Indicator 4. And these aren't massive changes, so these aren't things that may rock people's world, so to speak, or be super challenging, but they still are things that we're not seeing in our reviews called out consistently. So for Indicator 4, the requirement is going to be to now actually provide the numbers used in calculations, and that's based on LEAs that met minimum n and cell size requirements. So the state is going to have to define what those minimum n and cell size requirements are as well as the rationale used for those minimum n and cell sizes. That's explicitly called out in the new instructions. And then for option one, so if a state is conducting analyses of long-term suspension and expulsion data based on the state rates or the comparison of LEAs to each other across the state, then the state would be expected to provide the actual state long-term suspension and expulsion rate. And then if they use option two, which is comparing students with and without disabilities within the same LEA who are experiencing those long-term suspensions and expulsions, if states are using a rate-difference methodology, like the difference in suspension and expulsion rate for students with disabilities compared to those without disabilities, then the expectation from OSEP is that the state will provide the actual rate difference used, or their rate ratio used if that's how the state is conducting the analysis. So that information now explicitly needs to be called out and spelled out rather than maybe a more general description of what a methodology is without those nuances. And I think Nancy is going to tackle the big, third, and final change that is on everyone's mind. 00:07:13.59 >> This is the one that I refer to as the biggie bucket, Indicator 18, and it is on everyone's mind. We get a lot of questions about that. People want to talk about Indicator 18 so they ... Most states, I believe, are aware of this change, and it focuses on the state's exercise of its general supervision responsibilities to monitor LEAs, and it must include findings from data collected through all components of the state's general-supervision system that are used to identify noncompliance. So I'm going to kind of go through these in short spurts here for a second.
It can include information from the state-monitoring system or from the state's data system, dispute resolution, fiscal monitoring, for example, so really looking at all your different systems, your general-supervision systems, you would be pulling data about findings of noncompliance and then the verification of the correction of those findings ... 00:08:18.93 >> And, Nancy, that goes ... 00:08:19.64 >> ... for ... 00:08:19.79 >> That goes beyond the SPP/APR Indicators. Right? So it's not just ... 00:08:23.90 >> Yes. 00:08:24.13 >> ... thinking about ... 00:08:24.47 >> It is not just the indicators. It's looking at the indicators, findings within the indicators, but also findings in the other places related to the indicators. Now for FFY 2023, this will just pertain to the compliance indicators. However, if, for example, you have findings for, let's say, Indicator 11, then you would also need to look at findings related to the evaluation of students to determine eligibility that you might have found in some other system within your state. Maybe through complaints, you had findings of noncompliance related to that same thing. There is also the related-requirements document that states would want to look at and ensure that they're addressing any of those things related to Indicator 11, as an example. So while it pertains just to the compliance indicators, it is more broadly looking at all components of your general supervision where you might have identified a finding of noncompliance. So along with this, the baseline for FFY 2023 is expected to be FFY 2023, and states will have to provide rationale if they use a different baseline year for this indicator. And because this is a compliance indicator, targets are expected or required to be 100 percent. Again, you're going to use your related-requirements document to ensure that all of the IDEA regulations or citations associated with each of the compliance indicators for this FFY 2023 year are being reported, anything related to that, any findings you make. And it is based on, and this ... Sometimes states get confused about dates, but it is based on findings that were made in FFY 2022 and the status of the verification of the correction of the individual and systemic compliance within 1 year of when those findings were made or when LEAs were notified of those findings. So it is data that states are looking at right now in terms of the verification of correction of findings that they're making currently in this reporting year that were based on findings from the FFY 2022 year. 00:11:06.22 >> Okay, so it's noncompliance from the previous year that currently or already states should have issued findings already but should get districts in the process of verifying and correcting. 00:11:22.07 >> Correcting, and the state ... 00:11:22.93 >> And the state verifying. 00:11:23.70 >> ... and I'm going to emphasize, again, that word verification because it's not just the districts correcting the noncompliance. It's that the state has verified, used some methodology to verify that those corrections were actually completed and are acceptable corrections, that they actually corrected that individual findings and any systemic compliance. So it's important that they understand that they have to do that verification process within that year's timeline. 00:12:01.26 >> All right. So now that we've talked through the changes, what are some recommendations for how states can really prepare for these changes? Nancy, do you want to mention a few? 
00:12:12.35 >> Sure. I'll be happy to start. First of all, I would recommend based on Indicator 18, which is the biggie bucket, to dig into the related-requirements document and make sure that all of their staff, offices, and departments, and any staff responsible, really understand those requirements and are aware of them as they relate to the components, and that they need to provide this information to whoever is completing Indicator 18. They need to be able to provide any information about the findings of noncompliance for FFY 2022 and then the verification of the correction of noncompliance, because there will be a table in the tool that has certain requirements for completing all of this information. And it's got to be a joint effort. One person cannot do all of this, and it's different departments. Different people are working on different components that all may impact what those findings are. 00:13:24.76 >> Mm-hmm. 00:13:25.22 >> So digging into those related requirements, I would say, is the first thing I would recommend people do. And then I would also make sure that you establish with your staff a common understanding of the verification of correction of noncompliance processes. That it is not just the district correcting the noncompliance, it's the state verifying and how they verified that the district corrected the noncompliance. I would also recommend that districts, if they haven't already, create some kind of a tool or process for how they track all of their information for Indicator 18 because it is a lot of data they're going to have to submit. And the sooner they start collecting that information and continue to add to it throughout the year until they're ready to submit would be helpful. And then maybe meeting with your team just to ensure that all the components for general supervision are clearly articulated and reflective of the work that is going on in a state. I know oftentimes when I meet with states, they'll be talking about work they've done with stakeholder engagement or other general-supervision requirements, but they haven't documented it or haven't included it in their SPP/APR, and they really didn't think about including it. But yet they're doing it, so they really need a way to document that and report it in their FFY 2023. 00:15:02.06 >> Yeah. You want to get credit for all that good work that you're doing. 00:15:04.87 >> Yes. You do want credit for that good work you're doing, absolutely. 00:15:09.60 >> We know states are getting into this and trying to figure out what needs they might have, as is IDC, how we can best support states with these changes, so what are some tools, resources, technical-assistance services that are out there and available or maybe things that we're working on that are going to be in place to help states related to implementing these changes? 00:15:34.44 >> When you mentioned this question, what came to mind was resources that exist currently, so the data-processes documentation is really valuable for making sure that states are capturing all the relevant information for the indicators, so thinking about the introduction as well as the changes for Indicator 4. This might be a good time for states who've done the data processes to revisit those and make sure that the information that's now required is reflected there for those data processes, and so everything is consistent.
What's in their process documents matches what's in the SPP/APR because we know OSEP really wants that consistency across different reporting avenues, especially with the DMS visits that they're doing and looking at all that documentation. And for those of you who haven't done data-process documentation or have but maybe are thinking you could use some additional support, just a reminder that IDC will support these efforts and come and meet with your team, document information, facilitate conversation. So definitely reach out to your state liaison if this is something you're interested in doing, especially as you prepare for these changes or for monitoring visits or all that good stuff coming down the pike, so ... And then the last thing I can think of is that the Interactive Institutes that will be in Atlanta this year in June, so II 2024 will have some sessions that will be addressing these changes in the SPP/APR, so just a plug to take advantage of that opportunity to go to II '24 and learn about all sorts of things related to IDEA data, but in particular there will be a focus on Indicator 18 as well as some of these other changes, so ... 00:17:38.58 >> Thanks, and everyone will get a chance to meet Rachel and Nancy probably in-person and ask questions that you might have thought about during this podcast during the Interactive Institute. So definitely much more to come related to these changes, and do please reach out to your IDC state liaison with any questions or support that you might need. And thank you so much, Rachel and Nancy, for being on and giving us some insight into what's happening and what states can be thinking about, getting ready for it. 00:18:12.44 >> Thank you for having us, Amy. It was my pleasure to be a part of this, and we look forward to seeing everyone at II '24 and having the opportunity to discuss more of this with our state colleagues and each other, and thank you. 00:18:29.65 >> Thank you, everybody. This was great to talk more about this, and as Nancy said, we'll certainly be in touch with more information and hope to see you in-person at the different events that are going on across the country, including II '24. 00:18:45.63 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at ideadata.org.…
 
### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date With Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey. It's Amy, and I'm so excited to be hosting "A Date With Data." I'll be chatting with state and district special-education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date With Data" is brought to you by the IDEA Data Center. 00:00:24.56 >> Hi. Welcome to "A Date With Data." In early February, states submitted their federal fiscal year 2020-2022 SPP/APRs. IDC reviewed and provided feedback for many states' SPP/APRs, so we have a really unique perspective on overall patterns and trends, areas where states maybe are really doing well, others that seem to be more challenging, and maybe there is some room for improvement. So on this episode, I am joined by the two IDC TA providers who led the SPP/APR review effort, Rachel Wilkinson and Nancy Johnson. Thank you both so much for joining me. So let's start off by hearing about, what are some things that states are doing really well in their SPP/APRs? Rachel, do you want to go first? 00:01:14.98 >> Sure. So we were lucky enough this year ... I think we saw about 34 states, some version of their SPP/APR drafts, so it was great to really get a feel for how different states are tackling the SPP/APR, and for a lot of them, we saw improvements in the development of slippage statements. We know that that's been a challenging area because it really requires that digging into the data, and so it's less, "We need to fix this problem, and we're going to fix it in this way," but more in saying, "These are the reasons that the slippage occurred," so we've seen a lot of effort being made to really do that deep dive into the data, so that was really encouraging. Yeah, and then let's see here. I think some states were really good in describing their general supervision systems, so they provided more details about what those systems entailed. That might include things like their monitoring processes. That was a new example of different items to include in the general-supervision field in the instructions and then in the template this go-round, so we saw a lot more rich discussion about what states' general supervisions are. And then I think one thing with all the states we reviewed drafts for ... One thing that we were encouraged by is that they were getting them in really early, so people were submitting things in early December and some even earlier than that, which was great because that meant that they could take the feedback and really apply it and make sure that their drafts were high quality and compliant, and that was really encouraging to see, and I think the SPP/APR Summit has helped sort of spotlight the importance of working on this sooner rather than later. 00:03:15.54 >> Yeah, definitely. It's nice to have that summit occur when it does to kind of kick off in some ways. States probably have been working, and should be working, on their SPP/APRs well before that, but the time where it really comes to putting pen to paper and really getting down into it and reminding states, of course, at that point too that we're available to review their SPP/APRs, and hearing that so many states took us up on that is ... It's really exciting.
00:03:41.89 >> Yeah, it was wonderful. We were thrilled. 00:03:44.03 >> Great. It's really promising to hear about all the ways that states are improving and doing well, and we know there are still some areas where states could use some improvement, and I'm wondering, Nancy, if you want to talk a little bit about what some of those are. 00:03:59.82 >> Sure, I'd be happy to, Amy. There is always room for improvement no matter how well we do things, and there are some ... a couple of areas that jump out at us that could use some improvement. One in particular was in the area of Indicator 4. OSEP had some very specific comments for states to address some requirements related to reasonable methodology with Indicator 4, and the state ... Some states really did not address those requirements with the comments from OSEP. They may be waiting for more direction from OSEP with regard to Indicator 4, and also, some states have let us know they had already issued their requirements regarding Indicator 4 earlier before they ever got their clarification or the OSEP letter in June, so they just kind of waited until maybe this year to take a look at it. We would suggest though that it would be beneficial for states to be thinking about their existing processes alongside stakeholders, including their stakeholders to ensure that children with disabilities are not being subjected to inappropriate removals that could prevent them from accessing instruction and which then would impact long-term outcomes for children into their adult lives, so that Indicator 4 area is a real ... not a concern to us but ... And we're waiting to also see how OSEP is going to respond to those requirements. Another area is in the area of stakeholder engagement. Some states did a nice job in that area, but many states provided more generic information regarding stakeholder engagement or referenced work they had done in prior years, particularly with that first submission of FFY 2020, which is 4 years removed from that now, but it's worthwhile for states to be connecting with stakeholders regularly on an ongoing basis, and they should be addressing more than just setting new targets but also looking at their data analysis, their progress toward their targets, their improvement activities and their evaluations, and it was sometimes challenging for us to find information in there about what currently occurred this past reporting year related to stakeholder engagement. And then a third area that ... And this is one that's been kind of ongoing ... is in the area of corrections of noncompliance. Correction of noncompliance is really about the state verifying [Indistinct] LEAs corrected any findings of noncompliance, and that verification of correction should be consistent with the OSEP QA 23-01 that came out last July and the requirements within. In particular related to correction of noncompliance, we saw and continue to see challenges with the correction of systemic noncompliance where it is unclear whether or not states are really looking at additional records in addition to the findings of noncompliance that they initially made, and then which additional records they were reviewing and when those reviews took place. We also saw that states sometimes still combine their individual child-specific noncompliance and their systemic noncompliance all together and just repeat it in both prompts when there really are two separate prompts for that information.
And in some instances, we just saw some general boilerplate-type language about their monitoring and review process rather than specific language about what the state actually did to verify the correction of noncompliance, which is the OSEP requirement. So those are three kind of main areas that come to mind when we think about ways states can continue to improve their SPP/APR. 00:08:17.06 >> All right. Thanks, Nancy. And coming up in April is what's called the clarification period, so if there are things that come to OSEP's attention when they're reviewing the SPP/APRs, it's an opportunity to come back and ask questions. I think I kind of got that right, but I might be missing pieces of it. So maybe for newer state staff, Rachel, do you want to say any more about what the clarification period is for those who might not be familiar with it? 00:08:44.89 >> Sure, so this is an opportunity that OSEP offers. It's usually about 2 weeks in length where they'll have comments, feedback, sometimes clarifying questions, thus the name of the clarifications to ask about particular indicators, components of indicators and the introduction as well as other areas that they might look at, so it's important when you get those clarification notices from OSEP to dig into the different comments they've made, and all of those will be in that SPP/APR tool in the EMAPS data system, and then states develop responses, provide those to OSEP, and then based on those final responses, OSEP will provide any comments or questions, feedback that they might have as a result of their reviews of the SPP/APR, and then they'll review states' responses because each state will need to respond to those comments or questions, and then they will issue either additional required actions based on that feedback or any other comments that they think are appropriate, and then that information is what's finalized and reported in the final SPP/APR. 00:10:05.14 >> Okay. Thank you. And IDC is able and willing to review how states are addressing and responding to anything that comes back from OSEP during that clarification period as well, so just putting that out there for states. 00:10:21.25 >> Absolutely, and we would encourage you to make sure that you have at least someone there who can take notes or be an extra set of ears, so IDC state liaisons are great to be that resource, and we really do encourage during this time frame that states take up the offer from OSEP to review the clarifications as a group and to have a call to do so because then you can ask questions and get feedback on things. Maybe you have a question about a comment that was provided that wasn't clear. These are the calls that you can get that clarification, and some states have asked for clarification and found that issues that had been flagged really weren't issues, and they didn't have to address them, so that meeting with OSEP and then having someone there to help you with taking notes or listening in could be really valuable. 00:11:16.80 >> Great, so that's, I think, a helpful tip for states when it comes to addressing OSEP's questions and feedback. Are there other tips that you all have to share that might be helpful for states? 00:11:28.13 >> Well, I can think of one more, but I think Nancy's got a couple as well because she's worked with a lot of states on this. 00:11:34.40 >> Yes. 00:11:35.72 >> But one of my additional thoughts was just to come up with a strategic plan of how your group as a state is going to respond to the questions and clarifications. 
That helps everyone understand what the expectations are and how they're responsible for certain portions, so if you have an indicator lead who needs to answer a programmatic question, they're aware of what that clarification question is so that you can be strategic in your response as a state, or if there's a data-specific question, make sure you have your data manager, whomever put the data together for the indicator available and aware of the question so that they can give feedback, so again, kind of coming up with that plan internally so that the responses you as a state provide are coherent and reflective of all the expertise that might be needed. 00:12:31.26 >> Yes, and I do also have a couple of tips, and I would like to go back to the importance of the call with OSEP because sometimes states think or interpret the information that OSEP asked for and then learn during the call as they're talking about it that OSEP really meant something different than what the state thought they meant, so those clarification calls are very important. Along with that, it's important as you're getting ready for that call that you review all of OSEP's feedback carefully before you have the call, and I want to emphasize including the comments for the introduction section. I worked with a couple of states that had overlooked the introduction section and only focused on the indicators, but there are comments also in the introduction section. And then lastly, for the required ... Think about, for those required actions that OSEP is asking you to address next year, you want to ensure that you're documenting these and beginning to develop plans on how to address those actions, and if you thought about that before your clarification period, you could perhaps consider even ... or during the clarification period running those past OSEP during the call or asking them about them during the call as you're thinking about them so that that will help you know how to address those required actions in the next SPP/APR and just documenting any of that information. 00:14:14.01 >> Yeah, while it's still kind of fresh in everyone's minds and you have that chance with OSEP, yeah, might as well use it, and that's a great, great tip. 00:14:23.47 >> And then I would also emphasize having ... inviting your IDC state liaison to be on the call with you because they can be that extra set of ears and might hear something you didn't hear or also taking notes for you so you do have things documented. 00:14:40.05 >> Yeah. Wonderful. Well, thank you both so much. I picked up some great tips. Hopefully others listening did as well and really appreciate you being on the podcast. 00:14:54.11 >> Thanks, Amy. This was a pleasure. 00:14:56.45 >> Thank you, Amy. It was our pleasure to do this. 00:15:00.32 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at ideadata.org.…
 
### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date With Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey. It's Amy, and I'm so excited to be hosting "A Date With Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date With Data" is brought to you by the IDEA Data Center. 00:00:24.61 >> Welcome to another episode of "A Date With Data." I am joined by several former data managers who are now IDC technical assistance providers. Austin Ferrier, Kelley Blas and Kristen DeSalvatore were all data managers up until a few months ago, and they're going to share with us their unique experience of going from being a data manager to now working to build the capacity of data managers and other state staff to improve the quality of their IDEA data. Welcome to all three of you. And I think, to get things going, it would be great if each of you could just briefly introduce yourselves, say a little bit about how long you were in the role as a data manager and what you're doing now. And, Kelley, do you want to start us off? 00:01:09.05 >> Sure. Thank you, Amy. Again, my name is Kelley Blas, and I was a data manager at the Department of Public Instruction in North Carolina. I was there for 17 years as the data manager and the last few years as the SPP/APR coordinator, and I'm very happy to have joined IDC as a state liaison and technical assistance provider, just love working with states. 00:01:35.64 >> Great. Thanks, Kelley. Kristen, do you want to say hello? 00:01:39.35 >> Hello, everyone. I am Kristen DeSalvatore, and I was the data manager at the New York State Education Department for about 11 years. In that role, I also served as [Indistinct] coordinator. In my role at IDC now, I'm working to provide technical assistance and support to states and LEAs related to the collection and reporting of education data and especially the special ed IDEA data. 00:02:08.19 >> Wonderful. Austin? 00:02:10.19 >> Yes, thank you for that introduction, Amy. Afternoon, everybody. My name is Austin Ferrier. I am the former IDEA Part B data manager for the Florida Department of Education. I was actually in that role for about a year and 6 or 7 months, and as I transitioned to IDC, I have been focusing on CA specialists and 618 data. 00:02:36.26 >> Great. Thank you. So it's wonderful to have all three of you, and I'm so excited to now be able to work with you on the IDC and technical-assistance side of things. I was fortunate enough to get to know all of you when you were data managers so really thrilled to have you now on board with IDC. And to get things going, I'm wondering if you all could reflect a little bit on your time as a data manager, if there is advice that you might want to provide for newer data managers. Austin, you were in that role much more recently, so it's a newer kind of experience for you, just thinking about things you would've maybe wanted to know as a new data manager. Kelley, do you want to kick things off? 00:03:21.54 >> Sure. So when I was reflecting on advice for new data managers, one of the first things that came to me was to utilize all of the technical-assistance resources that are available to you.
I know that when I started 17 years ago, I wasn't really aware of what technical-assistance resources were available to me. It felt like there were times where I couldn't find exactly what I needed, but now in this day and age and especially with having IDC, there's a website that's super filled with resources. There are technical-assistance resources, such as state liaisons. I would definitely encourage folks, especially newer folks, to invite their state liaison to walk them through all of the resources that are available on the website. Also, the other thing is just having a plan in mind for what your year is going to look like around data collection, when it's being collected, when analysis needs to happen, when submissions are due and just having that calendar planned out that's kind of unique to your state. Not every state collects things at the same time, but we know that it's all kind of due at the ... It is due at the same time to OSEP, so having that calendar planned out for the year really helped me know when I was supposed to take leave if I wanted to take a vacation or make plans and when I was going to need to be really, really focused on my data-analysis activities. 00:04:50.06 >> Yeah. Kristen, do you want to give us some advice? 00:04:53.72 >> Sure, so I'm going to echo what Kelley said about the IDC resources and really advise folks to make the time to really investigate the IDC resources that are available to you and to reach out to your state liaison on a regular basis. Like Kelley, I didn't really realize the depth of support that is available and really wish that I had. I would also tell people to network and collaborate with others in your agency and really try to find allies in the Office of Special Ed or data shop that you can work with and count on to help you get the work done that you need to get done. And lastly, I would say don't be afraid of OSEP. They're not out to get you. A lot of times folks feel like, "We're going to get dinged. We're going to have a big problem with OSEP." So keep in mind really that OSEP has rules and regulations that they must follow, right? So they are pretty scripted by what's in the IDEA law. And also keep in mind that we all have the same goals in mind, right? We want to improve the landscape for students with disabilities. That should really be the underpinning of all of the work that you're doing. 00:06:21.45 >> That's great to keep in mind and especially for newer folks who might be intimidated by OSEP, but know that they're there to support states in any way really that they can, so they're a resource and a partner. Austin, what advice do you have? 00:06:38.69 >> I do have to echo some of the sentiments that both Kelley and Kristen expressed, especially what Kristen mentioned about, regardless of what we do, it does all come back to the students. We might be looking at data timelines inundated with messages from OSEP and emails, but one of the biggest things that really clicked with me as a new data manager is that this all comes back to FAPE, providing a free appropriate public education for kids and those supports. So each time you see a number in an Excel sheet, any type of visualization you make, just try to keep in mind that that's a real student, a real person and that we are trying to do our best to support those groups.
Another thing I would really advise new data managers to do, if you get the chance, please, please, I highly advise attending one of the IDC Interactive Institutes or summits or any type of project or presentation that IDC puts on. I know when I first made my initial trip for the SPP/APR summit, it opened up a whole new world outside the lens of Florida in terms of the level of collaboration, the level of support available, and it just really helped encapsulate what we are trying to do in our positions as Part B data managers in support of LEAs. If your state allows you, I highly advise taking the trip to any of those Interactive Institutes or summits. And then just finally, just as Kelley was saying, IDC has data manager connection groups, data quality peer groups. The level of collaboration you can achieve in those specific safe spaces is amazing. I know the conversation ... I've had conversations with Kristen herself when we were both in that role in meetings where we were talking about some very sensitive topics, but that collaboration really allowed us to build our own states' capacity and just kind of build that knowledge base. 00:08:56.86 >> Yeah, so it sounds like what I'm hearing as a theme from all of you is really the collaboration, the relationship building because often in the state you might be a little isolated, the only one really doing a lot of the work that's kind of in your head and the importance of working across your SEA, and then also just reaching out and tapping other data managers is such an important piece of the role. So now that you've transitioned out of the role of being a data manager, Kristen, what's something that you're really going to miss about being a data manager? 00:09:33.63 >> I'm really going to miss the collaboration with my immediate colleagues in the data shop at NYSED as well as with the staff in the Office of Special Education. As a data manager, I really was the bridge between the two separate offices in the department, and I worked very hard to bring the data and build the data literacy in the Office of Special Education and did reap a lot of rewards from that and made some good connections with colleagues, so I will definitely miss that part of it. I'll also miss working with the data, and as Austin said, it's numbers on a spreadsheet, but the work that a data manager does, does have the potential to directly influence things for kids, right? So you do have the ability to influence practices and policies that do have that potential to make a difference really even in an individual kid's life, so that to me is something I will miss. 00:10:50.73 >> Thanks. Austin, what about you? 00:10:53.98 >> I just have to echo kind of what Kristen was saying in terms of that level of collaboration, but I will say probably one of the biggest aspects I'll miss, especially at the SEA state level, was the interaction I had with LEA exceptional education directors. I was a phone call away from almost every special education director in every district in Florida, and having that connection and having that bridge really felt like the distance between the state and the LEA was lessened, and the trust was strengthened, so knowing that they could call me at any moment if they had a question regarding their data specifically, even at the school level, just building that trust and those relationships and having that direct assistance and seeing that have a direct effect in real time, one of the biggest things I'll miss.
Second biggest thing I'll miss, I worked with some amazing individuals at the state level who, even outside of their career and job, were still focused on community outreach programs, were still going to school-board meetings after work. The passion was there, and it was evident, and I fed off of that, and I'll truly miss some of the individuals I ... They were superstars, rock stars. 00:12:21.34 >> Mm-hmm. Yeah. And, Kelley, what things are you going to miss? 00:12:26.55 >> It's funny. We're all kind of missing the same thing, but definitely for me, I'll definitely miss the friends that I made at the Department of Public Instruction across the agency, but the main relationship that I'll miss is, after 17 years and being involved in building our state special education data system, you develop some really strong connections and supportive relationships with our LEAs, and again, they knew that they could contact me, and I would know a little trick to get their data to go in just right or what the workaround was for our system, and so those kinds of phone calls, they're always so gratifying because they know that they can call you, and you'll figure it out for them. So those things I'll miss and definitely, definitely the data. I am a data geek at heart, I think, and so being able to look at those big data sets and knowing the rules of our state and how everything's supposed to fit together and how to present that visually where it makes sense when we're talking to our districts about their data, those are the top three things, I think, for all three of us that I'll miss. 00:13:41.01 >> Mm-hmm. Yeah, lot of similarities. I think as you get more into the role, the states that you're working with will sort of become like your districts were, and they'll be calling you, and you'll be calling them and build up that same type of relationship, so I think you'll still see that in somewhat different ways. And we touched on this a little bit already, but are there things that you want to mention that you as a data manager wished you had known more specifically about IDC and what IDC does and the services and the other TA centers as well? Austin, is there anything you want to mention? 00:14:18.98 >> Yes, yes, definitely I want to re-emphasize the area of safe space that IDC creates. I really do want to just express and emphasize IDC is not a punitive organization. They are not looking to ding you on any of your data pieces. It's a holistic examination of state processes as a whole in a ... It's pure assistance, so always keep that in mind. Come at it with a positive attitude and come at it as if we know where you're coming from, and Kelley and Kristen, they will agree. We've been in their shoes. So just having that empathy and knowing that we do try to create a safe space, and I really do hope states understand that. 00:15:16.08 >> Yeah, absolutely. Kelley, what are some things that you want to mention about IDC or other TA centers? 00:15:25.34 >> I think one of the things that I didn't realize about IDC that I know now just from my own experience as a state liaison is the multitude of ways that IDC and other TA centers can really come in and support states when it comes to the work that they're doing around data. 
For example, I'll be going to a state next month and helping facilitate a stakeholder meeting around some changes that they're making to their Indicator 4 methodology, and I wish that I had known that those kinds of supports were available to me as a data manager and to our team in general at the Department of Public Instruction because we could've utilized those supports in a way that would've made those meetings just richer and so just understanding the true depth of support. And I'll say it again because I didn't ... I definitely didn't understand this, but safe space, safe space, safe space. I think for a long time I felt like if I shared too much information with our TA centers that maybe I was airing my state's dirty laundry, and I didn't want it to be known that we were doing things wrong. I just wanted to help correct it. But knowing that these TA centers are here specifically to support us in improvement efforts and making things ... making our data stronger, it changes the whole understanding of what technical assistance really is, and I just ... I had some misconceptions for sure. 00:17:06.18 >> Great. And, Kristen, anything you want to add? 00:17:11.04 >> So, Kelley and Austin, you did a great job of taking the words right out of my mouth. I do ... One thing that I really didn't understand was how important it was or is to get to the conferences and institutes that IDC offers and pays for folks to attend. In New York, we had a hard time with travel, and we were finally allowed to go to II '23, Interactive Institute 2023, and it was just mind-blowing for us. We were like, "Oh, my goodness, we have really missed out on a lot of in-person technical assistance and information and the collaboration and the networking and that piece of it," so that is one thing that I would highly, highly recommend, is that all data managers, but especially new data managers, really work hard to be able to attend the in-person events that are offered. 00:18:19.61 >> Yeah, between all the data centers, all the TA centers, there really is such a wealth of expertise and knowledge and resources that we need to make sure all the states and staff and especially newer staff are aware of and can utilize. So kind of looking forward now, we talk a lot at IDC about what it means to be a data quality influencer and how everybody is a data quality influencer in different ways. You all as data managers were definitely, of course, data quality influencers in that role. Can you talk a little bit about some things you're excited about in your new role in terms of being a data quality influencer? And, Kelley, do you want to start us off? 00:19:09.72 >> So my mind is always whirling on what I could've used during my time as a data manager, and now that I'm with IDC and I know that part of the focus at IDC is creating tools and resources to help data managers and states have better tools to analyze and display their data and make meaningful change in their daily work and efforts, it's really just exciting to me to be able to think about what I could've used and how I can present those ideas and potentially create these tools to assist LEAs with, for example, significant disproportionality or the indicators, just tools that will help them analyze and display data better.
00:19:53.13 >> Thanks, and really who better than the three of you and other former data managers to come on and help put those ideas and dreams you might have had as a data manager like, "It would be great if this existed, but I just don't have the time or the capacity to create it," and now can really be devoted to that and helping other data managers create those for them. Kristen, tell me a little bit about what you're envisioning in your role as a data quality influencer with IDC. 00:20:25.65 >> Well, Amy, as you said, the states will kind of become our LEAs or districts, so I am looking forward to working with the individual states to provide one-on-one technical assistance as well as being thoughtful about updating existing resources to reflect new guidance, new practices and new perspectives and create new resources, knowing, as Kelley said, what I thought I might ... would be super helpful when I was in that role as a data manager. I am really excited to now be on the other side of the aisle with the deep understanding that I have of what data managers are going through, what their workload is like, how much information there is for them to process and get right, right ... There is very high-stakes data, this IDEA data ... and how important the work is. So if there is one data role, I think, in a state agency where you want to be a data quality influencer, it is around the IDEA Special Education data, right? It's super high-stakes, and I'm just very excited to be on the train. 00:21:50.25 >> Great. Yeah, we need a whole army of data quality influencers for this IDEA data. So, Austin, what ... How do you see yourself being a data quality influencer now? 00:22:02.51 >> Yes, Amy. When I think about this question, the word empathy keeps popping in my head. We ... Kristen and Kelley, we've been into EMAPS. We've had to submit our SPP/APRs in February. We've seen those emails from our bosses' bosses asking for updates or asking for some type of information that you have to put together quickly and translate it into something they can digest quickly. Having that empathy and having gone through the data-submission process, it really helps me understand what state SEA data managers are going through, and just knowing those feelings that they have, those crunch-time deadlines and just navigating your OSEP guidelines and then your state, federal ... your state guidelines and state legislation, just being able to ... Having that in the back of my head when I have conversations with states, just making sure that they know that we've been in those positions before. We know how you feel, and we're going to try to the best of our ability to help you in your data submissions. 00:23:26.18 >> Yeah, that makes so much sense. It is such value that you bring having had those experiences and gone through everything they're going through, and, like you said, having the empathy, that really adds so much to what you all bring. So thank you all so much for sharing with me and all of us your experiences as a data manager and now transitioning into this new role, what that looks like and what you're looking forward to and so happy and thankful to have you on. 00:23:59.55 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at ideadata.org.…
 
### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date With Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey. It's Amy, and I'm so excited to be hosting "A Date With Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date With Data" is brought to you by the IDEA Data Center. 00:00:24.62 >> Hello. Welcome to "A Date With Data." On this episode, we are continuing our conversation with Leah Voorhies, who is the State Director of Special Education, and LauraLee Gillespie, who's the Special Education Coordinator of the Utah Program Improvement Planning System, and they're both with the Utah State Board of Education. On the last episode, they started their story about their state's general supervision system, and on this episode, we are going to hear more about some of the challenges that they've experienced, how they've addressed them and some of the areas they're really proud of, so thanks for joining us. So what are some of the challenges that you've encountered, and how have you tackled them to make these improvements and put these great processes in place? 00:01:12.60 >> Well, I'll speak to a few challenges that I see. Bridging the gap between compliance and program improvement, again, we know from the Supreme Court that we look at benefit. We look at procedure, and we ... That's been the requirement in special ed for decades. But sometimes procedure gets a little more focus, and although it's important, we want to ensure that those procedures are actually leading to outcomes for students and program improvement for schools, and that, I think, is a huge challenge that I personally feel it's not ... It's easy to just look at a file and say, "Here's the issue," but it's much more difficult to look at a file and, again, look at it over time and really see what we're doing to improve outcomes for students, so I would say that, in talking to staff in LEAs, in talking to individuals here at the state and talking to parents, that that is probably the largest challenge, and so trying to bridge that gap between the importance of compliance, and I do have a place for compliance in my heart because of my legal background, but I ... But it is important to really understand the why behind it, why it's important and why I will say honestly there's lots of conversations I have with local education agencies in particular, helping them to see why it matters from the parent point of view, from the stakeholder point of view because that's kind of where my background is, and so that piece, I think, is one of the most challenging obstacles that I think we face in our processes. There's lots of logistical issues that are involved, but I would say if I was going to pick one thing, that would be it. 00:03:17.16 >> Mm, yeah, that is a big challenge for sure. 00:03:20.74 >> Leah, would you ... 00:03:24.01 >> I think another challenge we identified last year and the way we addressed it is possibly unique ...
We're still not sure if it made a difference or not, but when the monitoring team sits and reviews files, they give the teacher of that student's file an opportunity to sit with them, and so they talk through the process if the teacher is available and really listen to the teacher talk about what they were trying to accomplish for the student and help the teacher understand the compliance issues. And we recognized that, because we don't have a statewide IEP system in Utah ... There are 16 different IEP systems, and then lots of LEAs use our model forms, just paper-pencil model forms, and we recognized that teachers struggle with understanding how to fill out the forms. Teachers struggle with understanding the why of filling out the forms. They struggle to understand the why related to the rule, but they also struggle, like LauraLee was just talking about, to understand the why of what the relationship to student outcomes is, and sometimes it doesn't really look like there is an outcome, so then they're just frustrated. 00:04:44.57 >> Yeah. 00:04:44.76 >> Right? So we have a statewide institute on special education law, basically a law conference, every year. We've been doing it for 35 years or something, and every summer we spend 2 days. We bring in attorneys from around the country to talk about important issues related to our state and to our general supervision system, and we did something different for the first time in our, however many, 35 years. LauraLee and her team did a session with almost 2,000 people who attended our conference. They did a 3-hour session just going through a file with the whole state line by line by line. They made up a student. And they just had 2,000 people in the room and online that did a file review together so that she could have that conversation that she likes to have and her team likes to have with individual teachers with the whole state. 00:05:55.69 >> Wow. 00:05:56.55 >> And so that's one of the ways, one of the challenges that we found and that we've addressed, and the other big challenge, which we try the same way every other state tries, is that we have staff shortages. Our LEA directors are overwhelmed. Our teachers are overwhelmed. Our parent educators are coming and going. It's like a rotating door. 00:06:19.52 >> Mm-hmm. 00:06:20.49 >> And so there's always a need for professional learning. There's always a need for technical assistance. There's a need to provide emotional support to special-education providers, and so we have ... We've created synchronous, on-demand and asynchronous available at 2 o'clock in the morning professional learning and technical assistance so that our seriously overstretched educators and administrators can access information whenever, wherever, from us with a consistent message. We all say exactly the same thing, and to LauraLee's point about compliance supporting practice and practice supporting compliance, we have one message. We all say exactly the same thing, and you could get it from our mouths. You could get it from recordings from us and just to try to address that particular barrier. 00:07:30.21 >> Sounds like you've come up with a lot of unique and creative strategies to try to handle these challenges, and that's really fascinating and good for you all. And one last question, you mentioned the 2301 memo, the OSEP General Supervision Q and A that came out. Can you talk about how that has impacted, if at all, Utah, your system, kind of things you might be changing potentially, kind of what your reaction was to it?
00:08:03.34 >> Yeah, so my reaction was probably not as dramatic or concerned as some other state directors because we had already moved to a mostly cyclical monitoring cycle. We had already put in place a lot of the requirements in the memo. A couple that we needed to address, one, the transfer issue, ensuring correction once a student transfers across LEAs and across states, and then another one. We have a statewide education complaint hotline, and so anyone, anywhere, anytime can submit a complaint to the state board of education about anything that's happening in their school, and we have a system internally to be able to route the complaints related to special education to our Special Education Dispute Resolution team so that people that submit hotline complaints can get the information they need about IDEA dispute resolution and submit IDEA complaints that way or request mediation or a due-process hearing. So one of the things that we had to put in place was a process to move from that complaint-hotline procedure that we already have in place in our state to address the credible-allegation issue in the 2301 memo. So I asked directors, and we have about 160 LEAs in our state, so I asked the directors to nominate one another to participate on a work group to be able to address the issues and come up with policy ... recommended policy that I could take to the state board of education related to those two issues, the credible-allegation issue and the transfer issue. About 17 directors nominated themselves or someone else to participate with me, and we had several meetings. We all reviewed the guidance together front to back. LauraLee and I went through and made a PowerPoint that hit all of the key issues in the guidance, so we reviewed all of the guidance, and then I had the two questions that this work group needed to address at the end, and in the almost 14 years that I have been at the state board of education, that work group did, I think, the best, the quickest work that I have experienced at the state board. They all agreed to just drop their defensiveness about the fact that this ... The whole credible-allegation issue was going to be more work for them because we have a statewide hotline-complaint system, so I get allegations all the time. 00:11:18.79 >> Yeah. 00:11:19.57 >> And so we just got to work. We probably spent 3 hours together over two and a half different meeting times, and they came up with a decision tree that I then took to the state board of education as a recommendation for how we would proceed, and they all agreed. They asked great questions. I was just so proud of them. And the state board accepted it, and we are moving forward with putting together a database for tracking the procedure that they outlined and that the state board accepted, so we are ... we're beginning the implementation of that system now. 00:12:11.04 >> Wow, that is fast. That's amazing, and, yeah, what you went through to come to that and bringing in the directors to really formulate your plan makes so much sense since they're going to be the ones involved as well and their buy-in, so that's really smart. Well ... 00:12:31.02 >> It was amazingly smooth. I'm really proud of them. 00:12:34.37 >> Yeah, that's just a testament, I think, to the relationship like we've been talking about that you've built with them and other stakeholders over the years that you can tap them in that way and that it was such a successful activity. Great.
Well, I want to just thank you both so much for your time and talking to us about all of the great work that you're doing and the general supervision elements that you already have in place and what you're working on. It's so good to hear from both of you. Thank you so much. 00:13:08.87 >> Thank you, Amy. 00:13:10.03 >> Thanks. 00:13:12.23 >> To access podcast resources, submit questions related to today's episode, or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date with Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey, it's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.61 >> Hello, welcome to "A Date with Data" with Differentiated Monitoring and Support, or DMS 2.0 in full swing, and the general supervision Q and A that came out from OSEP back in July of 2023. General supervision is a high priority area for all states. And on this episode, we have joining me Leah Voorhies, who is the Assistant Superintendent of Student Support and also the State Director of Special Education, and also LauraLee Gillespie, who is the Special Education Coordinator of the Utah Program Improvement Planning System, and they are both with the Utah State Board of Education. They are going to be sharing with us the story of their state's general supervision system, how they use it to monitor and support compliance with federal and state requirements in local education agencies across the state of Utah. Welcome, Leah and LauraLee. Thank you so much for being here. 00:01:24.09 >> Thank you. 00:01:25.10 >> Thank you for having us. 00:01:26.52 >> Of course! Can you start out just really briefly introducing yourselves and saying a little bit about your role? 00:01:33.38 >> I can begin. I'm Leah Voorhies. I've been the Assistant Superintendent and State Director for going on 8 years now. Before that, I was the Special Education Coordinator, so like the assistant director in Utah, for 6 years. And I have been working on trying to improve and implement general supervision systems in either LEAs or the state for ... This is my 22nd year. 00:02:08.98 >> Mmm, you've a long history of it. 00:02:13.10 >> And I'm LauraLee Gillespie. I've been at the State 6 years now. My work has been around program improvement and compliance for the most part, the entire time I've been at the State. Previous to the State, I was at the Disability Law Center, which is the protection advocacy for the state of Utah, and representing parents of students with disabilities in special education. 00:02:39.38 >> Great. Well, I guess I want to start out just, if you could, talk about your more data-focused areas of your general supervision system, like the SPP/APR, data on results and processes, LEA determinations, just describe those areas to me and what you're doing in terms of monitoring and compliance. 00:03:00.75 >> I will start this part off. So we focus on data a lot ... 00:03:05.28 >> Mm-hmm. 00:03:05.55 >> ... through lots of different avenues. We focus on the indicator data as outlined in the annual performance report. We also look at state-level data and local education agency-level data as we work with local education agencies to improve outcomes and ensure compliance. Just for clarification, in Utah, charter schools and districts are all public schools, so we just refer to all of it as local education agencies. Some areas that we prioritized in Utah have included post-secondary transition planning. That's our ... 
It's just been an area of focus that we've needed to really focus in on, really looking closely at the data and trying to improve the outcomes for students with disabilities. We've done a lot with inclusive practices and effective instruction. We're also starting to do some educational benefit reviews and looking at the data of what we're finding from LEAs across 3 years. We're just starting into that. We're in our baby phase of educational benefit, and we just want to look ... We want to educate our local education agencies on the importance of monitoring progress over time as well as developing meaningful and rigorous individualized education programs. So again, looking at indicator data across the board, looking at specific local data as well as state data to try to really get a whole picture. Currently, our LEAs annual performance determinations are based on the 16 indicators, so 1 through 16 as they are placed on the annual performance report. 00:04:51.99 >> Mm-hmm. 00:04:52.21 >> We go through and bring those together and develop a score, and they get a determination, but we also do a results-driven accountability or a risk score as well as that APR determination. And the risk score includes 1 through 14 of the indicators, but we also take into account fiscal data, timely and accurate submissions of data, the scoring on the completeness of the program improvement plan, if they've had corrective actions plans. 00:05:26.08 >> Mm-hmm. 00:05:26.37 >> And part of the reason for that is to help them to ... to help us to see how we can best support them. So the higher the risk level is, the more support the LEA is offered through the state board of education. I can jump into a little bit further about our monitoring, but I want to know are there any questions at this ... 00:05:49.08 >> Yeah, I was just ... I was going to ask that, yeah. 00:05:51.34 >> Yeah. 00:05:51.46 >> In terms of the risk score and how that feeds into the monitoring and how you decide which districts would ... or LEAs will get monitored and when? 00:06:01.73 >> Yeah, so we've gone through some overhaul in the last 3 years which has been a really great change for us. We switched to ... I want to call it a 6-year cycle, but it's not a cycle as in, I'm going to come to your LEA, and you can just say, "Oh, I'm off the hook now for 6 years." 00:06:21.70 >> Mm-hmm. 00:06:21.95 >> The way we look at it is, we tell the local education agencies that we are going to come at least once every 6 years. We may come more. We may come less, and this is where the risk score comes in. So the risk determination is considered as well as several other factors. So the higher the risk, the more likely we are going to be coming to them. 00:06:44.42 >> Mm-hmm. 00:06:44.78 >> But we also look at other pieces as well. We are really focusing in on our targeted support and improvement schools, or our TSI schools. We look at how much dispute resolution they've had. 00:06:57.33 >> Mm-hmm. 00:06:57.88 >> We have a hotline here at the State Board of Education, so we pull that into the determination of who we're going to be monitoring during the year. If there's fiscal concerns, we take that into consideration, and that's sometimes a specific population. So our youth in custody, our youth in care student populations are always a focused part of ... 00:07:22.77 >> Hmm. 00:07:22.89 >> ... who we do full monitoring with, but it may increase the likelihood that we're going to come a little sooner rather than later. 00:07:30.25 >> Hmm. 
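For readers who want to see what a composite risk score like the one described above could look like mechanically, here is a deliberately simplified R sketch. The components, weights, and scales below are invented for illustration only; they are not Utah's actual determination or risk-score formula.

```r
# Hypothetical composite LEA risk score: indicator results plus fiscal, timeliness,
# program improvement plan, and corrective action information. All weights and
# scales are made up for illustration.

set.seed(1)
leas <- data.frame(
  lea_id            = paste0("LEA", 1:5),
  indicator_score   = runif(5, 0, 100),                # roll-up of Indicators 1-14 (higher = better)
  fiscal_flags      = sample(0:3, 5, replace = TRUE),  # count of fiscal concerns
  late_submissions  = sample(0:2, 5, replace = TRUE),  # timely/accurate data submission issues
  pip_completeness  = runif(5, 0, 100),                # program improvement plan completeness
  corrective_action = sample(0:1, 5, replace = TRUE)   # 1 = had a corrective action plan
)

# Put every component on a 0-1 "risk" scale (higher = more risk) and combine with weights.
leas$risk_score <- round(with(leas,
  0.40 * (1 - indicator_score / 100) +
  0.20 * (fiscal_flags / 3) +
  0.15 * (late_submissions / 2) +
  0.15 * (1 - pip_completeness / 100) +
  0.10 * corrective_action
), 2)

# Higher risk score = more support offered and a higher likelihood of an earlier visit.
leas[order(-leas$risk_score), c("lea_id", "risk_score")]
```

In practice a state would anchor each component in its documented determination criteria; the point of the sketch is only that disparate elements get rescaled and combined into one number that can be rank-ordered to prioritize support and monitoring.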
00:07:30.35 >> So again, it's a 6-year cycle, but the cycle is, you can be moved up quicker depending ... 00:07:37.06 >> Mm-hmm. 00:07:37.36 >> ... on the risks and needs that we see based on your data and other factors. 00:07:43.71 >> A very holistic approach, it sounds like ... 00:07:46.39 >> Mm-hmm. 00:07:46.60 >> ... looking at comprehensively all factors to make those types of decisions. 00:07:52.58 >> Absolutely, and that's our full monitoring. And, of course, we do additional monitoring for fiscal ... 00:07:58.57 >> Mm-hmm. 00:07:58.79 >> ... Indicators 4, 9, 10, 11, 12, 13, and our significant cognitive disabilities are 1 percent. And those aren't based as much on risk. They're based more on data and information that we have on file regarding those specific indicators, determines who we look at. 00:08:18.61 >> Great. And through all these different areas that you're focusing on, can you talk about how stakeholders are involved in either making some of the decisions around what these different processes may look like, reviewing results, giving feedback? How do you bring them in? 00:08:38.01 >> So this is Leah, and stakeholder feedback is important to us because it's required. 00:08:46.44 >> Mm-hmm. 00:08:46.65 >> It's also important to us because we're a small enough state that we actually know all of our major stakeholders. We participate with them in all sorts of activities, not just in gathering feedback related to the activities that we do. 00:09:06.44 >> Mm-hmm. 00:09:07.17 >> So we have established as an entire agency, not just in special education but as an entire education agency, feedback gathering standard operating procedures. So ... 00:09:21.53 >> Hmm. 00:09:21.66 >> ... For all of the decisions that our elected State Board of Education makes, we have a process to gather feedback. And then in special education, we ... Because we know all of the frequent players when it comes to students with disabilities and individuals with disabilities, we interact with them frequently. So I meet monthly with the director of our parent training and information center. 00:09:55.51 >> Mm-hmm. 00:09:56.76 >> My team members meet monthly with the PTA, with the board of the parent training and information system. We meet regularly with our Disability Law Center. We have a legislative coalition for people with disabilities that we meet with. I serve as the chair of a statutorily required committee on policy, state policy related to students and individuals with disabilities. All the state agencies in this state are required by statute to be ... participate in that committee, as well as all the nonprofit organizations that serve individuals with disabilities that receive any state funding. 00:10:41.83 >> Hmm. 00:10:42.30 >> And then we have a robust focus group and follow-up survey system that we use. So our first strategy is that we go to our stakeholders. We don't ... 00:10:56.50 >> Mm-hmm. 00:10:57.23 >> ... require that they come to us. And so we go to their meetings. We participate in their meetings as regular members, even voting members most of the time, and then we hold focus groups with them when there is something specific that we need them to comment on, and then we have a follow-up survey. And the focus group, the follow-up survey process is an agency process. That's not just special ed. 00:11:27.92 >> Yeah, I like that idea a lot. 
I think states do seem to see more success sometimes going to the stakeholders rather than adding something else, another meeting or time on them, but hitting them when they're already gathering or meeting for other purposes. 00:11:47.06 >> And if I tried to roll something out that they hadn't had ... that they hadn't provided feedback for, it wouldn't go very well. 00:11:58.15 >> You'd hear about it? 00:11:59.59 >> We have created ... They're my friends. They're my team members, friends. We have ... Luckily, I've been the State Director long enough, and my team has been ... LauraLee has been with us for long enough that we have a longstanding team. We have a cohesive team. They know us. They know us personally. We know them personally. It's not just that we respect each other. It's that we're friends, and it would damage our friendships, and that is not okay with us or with them ... 00:12:36.08 >> Mm-hmm. 00:12:36.70 >> ... for that matter. 00:12:37.75 >> Yeah, so building those relationships ... 00:12:39.94 >> Yeah. 00:12:40.64 >> ... is very key. What are some of the aspects of your general supervision work that you're especially proud of, that you really want to highlight? 00:12:51.33 >> There's a lot of things. It's been a ... As Leah said, the stability has led to some opportunities in terms of general supervision that have worked well. In no particular order, when this question was asked, I started to think about these things. One thing that we do in terms of monitoring that I'm really proud of is, we've really worked hard to, as one of our colleagues says, "Practice supports compliance, and compliance supports practice." And we've really tried to ensure that that is evidenced by the work that we do at the State. So in other words, when we go out and do monitoring, our specialists, our content specialists in effective instruction and inclusion and behavior, they come right along with us on those monitoring visits. They conduct the interviews, and what I see almost every time we go on a monitoring visit is that we start out with this very nervous feeling ... 00:13:57.23 >> Mm-hmm. 00:13:58.42 >> ... amongst the staff, and that doesn't always go away. But a lot of staff build relationships with our specialists during those monitoring visits and actually follow up with them and have conversations with them about their concerns. And so that ... Again, it's that building a relationship, and our monitoring is all on-site, and that's very ... 00:14:21.90 >> Okay. 00:14:22.07 >> ... intentional because we want ... We really want to build those relationships with those teams. Just as an example, our assessment specialist, she had a very in-depth conversation with one of our rural school districts, and they've continued to communicate with each other based on the monitoring visit. 00:14:42.35 >> Hmm. 00:14:42.47 >> And Indicator 13 is another really great example. We have this amazing coach for Indicator 13. She does just an incredible job, and as they go through and do the file reviews, she's right there. And they're doing the file reviews and they're checking compliance, but she is right there giving them tips and ... 00:15:04.20 >> Mmm. 00:15:04.38 >> ... things that they can use as they move forward. And again, that, to me, is powerful in terms of trying to really improve outcomes for students and think about it through a different lens. We've been doing a big focus on inclusion here ... 00:15:24.33 >> Mm-hmm. 00:15:24.55 >> ... 
in the state of Utah, and I think that's starting to meld into some of our monitoring and our general supervision. A lot of the technical assistance documents, we can immediately pull those as we go out and are monitoring or helping an at-risk school to ... and again connecting them to folks at the state who are specialists in these fields. Another piece, again, I'm going to go back to educational benefit. We're newbies to this. There's other states out there that have tried it, but our educational benefit specialist recently was with a school and went through that process, and the power that was instilled in the actual LEA, not by the state, not us telling you what to do, but the LEA, the staff there realizing and recognizing what they needed to do to really improve progress for students with disabilities was ... It's just phenomenal. And the last thing, I think, is the internal thing that we do as a state. We, for the last 4 years, have done general supervision sessions. So this isn't so much for the LEAs on the outside but really for our staff on the inside, and this was ... Again, I have to give Leah credit because she had this wonderful idea that we need to ensure that all of our staff understand general supervision ... 00:16:50.03 >> Mm-hmm. 00:16:50.69 >> ... that golden requirement and why we exist as a state agency. And it's led to ... We just meet once a quarter, and we train out on different components of the general supervision pieces. We talk about rules. 2301 has been a big focus this year [Indistinct] and will be next year again ... 00:17:13.24 >> Mm-hmm. 00:17:13.47 >> ... because it's the new thing that is related to general supervision. So I kind of rambled on, but I have a lot of things that I'm very proud of that have been happening. Leah, I don't know if you have any additional thoughts. 00:17:28.68 >> No, that's a good overview of things that we are proud of. I think where you started, LauraLee, with the fact that we have a team that has been together for a long time and that, as a result of that, we've been able to be very intentional about our continuous improvement process. 00:17:56.80 >> Mm-hmm. 00:17:57.03 >> And we've been able to be intentional with our LEAs and with our stakeholders about our continuous improvement process and support each other, and with the IHEs that are training and preparing new special education providers. So that's ... I think that's definitely one of the things that we're proud of is, because we have been around a while, we have established relationships that are making it easier for us to improve our processes, and hopefully that leads to improved outcomes for students. 00:18:33.94 >> Yes. Well, it sounds like you have a lot in place to be proud of, for sure, and that sustainability seems like a major part of that because of building those relationships and maintaining those relationships and that trust, so kudos. 00:18:49.69 >> I think Leah brings up a really good point about the institutes of higher education and efforts to really prepare teachers. It's looking at it again from a whole picture, right, of ... It's not just, what can we do with the LEAs? It's, what can we do with those who are coming into this field to improve that as well? 00:19:09.41 >> Thank you so much, Leah and LauraLee, for sharing with us all of the great things you have going on in Utah around the data elements of your general supervision system.
Please make sure to listen to our next episode of "A Date with Data," when we will be hearing more about your general supervision system and all the wonderful things you have going on. 00:19:36.65 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content, or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Resources Response Rate Representativeness and Nonresponse Bias Nonresponse Bias Analysis Application Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date With Data" with your host Amy Bitterman. 00:00:07.34 >> Hey, it's Amy, and I'm so excited to be hosting "A Date With Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day. 00:00:19.50 >> "A Date With Data" is brought to you by the IDEA Data Center. 00:00:24.70 >> Welcome to "A Date With Data." On this episode, I am joined by Heather Dunphy, who is the lead education programs specialist from the Arizona Department of Education, and we also have with us one of my wonderful colleagues from IDC, Tamara Nimkoff. Heather is going to be talking to us about how Arizona has been conducting their nonresponse bias analysis that all states are required to complete for SPP/APR Indicators 8 and 14, and Tamara is also going to be here to highlight an IDC tool and talk about some other support that IDC can provide states related to the nonresponse bias analysis so welcome to both of you. 00:01:08.06 >> Thank you. 00:01:09.19 >> Thank you. 00:01:10.50 >> So first off, I was hoping if each of you could just introduce yourselves briefly, say a little bit about your role and what you do. Heather, do you want to go first? 00:01:20.13 >> Sure, my name is Heather Dunphy, and I'm a lead education programs specialist at the Department of Education in Arizona. I work a lot with significant disproportionality and LEA determinations, and I also coordinate most of our federal submissions, including the state performance plan annual performance report, and I'm really happy to be here today. 00:01:42.19 >> Great, happy to have you. Tamara, do you want to talk a little about what you do on IDC? 00:01:47.26 >> Sure, thanks, Amy. So I've been with IDC as a state liaison for mostly a [Indistinct] specialist for several years now. My work has been really focused around the analysis and use of data, previously on the state systemic improvement plans, on using data on structured data meeting and over the recent years more focused on supporting states around their data collections or sampling as well as in the areas of representativeness and nonresponse bias, on the topic of today's chat. 00:02:26.75 >> Great, thank you both. So, Heather, can you start us off by kind of walking us through Arizona's data-quality journey related to nonresponse bias analysis? What does that look like? 00:02:39.97 >> Sure, so I began with the agency about 2 1/2 years ago. The FFY2020 SPP/APR was the first federal document that I was responsible for coordinating, and that was the first year that the words nonresponse bias appeared in the APR, and the question in the APR, it asked, describe the analysis of the response rate including any nonresponse bias that was identified. So I was familiar with how to analyze response rates, but I was unclear what exactly nonresponse bias was and certainly how we were going to analyze it. 
So in 2021 we did our best to analyze the nonresponse bias to the extent that we could, so for Indicator 8, for example, parent involvement survey, what we did was, we divided our survey window into three periods: the beginning period, the middle and the end, and the idea was that the responses might differ from people who answered this survey early compared to those who answered the survey late, and then we examined those responses that came in from parents at the end of the data-collection period as a proxy for nonresponders. And then we then compared those responders to the ones that came in during the beginning and the middle of the data collection period, so this method gave us some insight into whether or not the results might be biased. And then the other strategy we used at that time for Indicator 8, we were looking at our responses by subgroup and to see if they were representative in respect to certain demographic areas such as race and ethnicity, and then we looked at the rate of agreeableness with Indicator 8 by race and ethnicity, and we were just kind of visually trying to see if there was any nonresponse bias. If we received more survey responses from one particular race, ethnicity, and their level of agreement was different than the others, then there might be nonresponse bias. So this provided a good estimate for measuring nonresponse bias at that time. Those are kind of the tools that we had at that time. 00:04:58.87 >> Gotcha, okay, what about for 14? Were you doing something similar?? 00:05:02.10 >> Yes, we were doing the same for 14 that we were doing for, or similar, to Indicator 8, yes. 00:05:09.91 >> Great, so you had kind of your start and had a sense of what kind of makes sense. And then what led you to engage with IDC in terms of supporting you around the nonresponse bias analysis? I know you attended the Hands-On Learning Academy that IDC hosted on the nonresponse bias analysis tool that IDC had developed so tell me kind of what made you kind of shift from what you had been doing initially to wanting to do something different, something more. 00:05:40.62 >> Mm-hmm, right. Well, yes, in the spring of 2023, it was last spring, I had heard about a new tool for analyzing nonresponse bias that was in the testing stage, and they were looking for states to try it out. 00:05:56.98 >> Mm-hmm. 00:05:57.72 >> I had heard that the tool was built to assist states in addressing the requirements related to response rates, representativeness and nonresponse bias and to kind of ease that burden of analyzing survey data for Indicators 8 and 14, and anything that's going to ease the burden of any work, I'm all for it. 00:06:18.89 >> Mm-hmm. 00:06:19.85 >> So myself and one of our Indicator 14 specialists in April went to Rockville, Maryland, for 2 full days of learning about the NRBA App. 00:06:31.23 >> Great, and you said you did attend that? 00:06:35.21 >> Yes, I did. 00:06:35.90 >> The OLA. Okay. 00:06:36.71 >> Mm-hmm. 00:06:37.15 >> Do you want to talk a little bit about your experience at the OLA, what you learned? 00:06:43.00 >> Oh, absolutely, yeah, at the workshop it was Tamara and Ben. They walked us through the various ways to use the app. There are several tests that we can run. They showed us how to set up our data set in the appropriate columns. 
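For readers who want to see the mechanics behind the two strategies Heather describes, comparing late responders (as a proxy for nonrespondents) with earlier responders and comparing response and agreement rates by race/ethnicity, here is a minimal R sketch on invented data. The column names and toy values are hypothetical; this is not Arizona's data and not the NRBA App's code.

```r
# Toy parent survey data: who was sampled, whether they responded, when, and
# whether they agreed with the Indicator 8 item. Everything here is simulated.

set.seed(42)
n <- 300
survey <- data.frame(
  race_ethnicity = sample(c("Hispanic", "White", "Black", "Other"), n, replace = TRUE),
  responded      = rbinom(n, 1, 0.3) == 1
)
survey$period <- ifelse(survey$responded,
                        sample(c("early", "middle", "late"), n, replace = TRUE), NA)
survey$agree  <- ifelse(survey$responded, rbinom(n, 1, 0.9), NA)

resp <- subset(survey, responded)

# Strategy 1: do late responders (a proxy for nonrespondents) answer differently
# than early and middle responders?
round(tapply(resp$agree, resp$period, mean), 2)

# Strategy 2: response rates and agreement rates by race/ethnicity.
round(tapply(survey$responded, survey$race_ethnicity, mean), 2)  # response rates
round(tapply(resp$agree, resp$race_ethnicity, mean), 2)          # agreement among respondents
```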
It needs to be set up in a certain way, and as soon as you have your data set up in a certain way, really the app does most of the work, and I think Tamara can talk more about all the things that this app does, but it really does take the burden off of the user and puts it onto the computer to do the calculations for you. 00:07:20.76 >> Yeah, that's always a good thing in lots of ways. We can hope there's not the user error that we might experience, and it's just kind of pushing a button. 00:07:29.59 >> Right, and the nice thing about the workshop was, it was still a little bit in that testing phase, so we had a small group of representatives from several states, and we could try it out, and we saw that some things were not quite right, and then that gave them time on the developer side to work out the kinks before it went live to all of the states. 00:07:50.98 >> Great, that was, yeah, a good kind of dual opportunity there for IDC to be able to have some testers, too. 00:07:58.39 >> Mm-hmm. 00:07:59.61 >> So, Tamara, can you talk more about the NRBA tool, how it works, why we developed it, how states can use it? 00:08:08.48 >> Yeah, absolutely, and building off of what Heather has shared, really emphasizing that the development of this resource has really been directly informed by input from our colleagues and in the state agencies. We were at the beta version stage in that spring 2023 that Heather mentioned attending. We had other states, representatives from Georgia, Indiana, North Carolina and West Virginia there, and we also had earlier input on an alpha version from our friends in Montana that gave really valuable feedback to inform the development. The impetus of the tool was really ... It's really aligned with our direct support of states' capacity to meet those SPP/APR data quality requirements for Indicators 8 and 14, as Heather was sharing about the work that they were doing prior to engaging with this particular tool, and it was triggered really by those requirements of the 2020 package where it caused us to think about at that time what tools were already in the field and where the gap might be. We felt that the field could really benefit from a tool that was both powerful but also flexible and user-friendly to the extent possible, so we wanted something that would allow a user to choose among many different ways of analyzing their survey data using those best practices, but that gave them some flexibility to have as much guidance along the way as was needed. So it was also informed by some of the kind of common issues that we at IDC had observed over years of supporting states in writing their SPP/APR responses to those prompts, and we knew that we wanted something that could both sort of support states in getting that conceptual understanding of the differences between data representativeness and nonresponse bias, which Heather alluded to, as well as gave them options for digging into their data in deeper ways beyond the submission of an APR report each year if they wanted to. So a little bit about the tool itself just to give people a really high level: It's a browser-based application, online and application, and the first time that people use it, they'll install a free statistical program along with the package itself. It's the program R, which is an open-source program that's widely used across many fields, including education. 
A couple of things that were really important to us were to make it really flexible but secure, so users access the app within their preferred Web browser like Google Chrome or whatever while the statistical program runs the computations in the background. 00:11:44.73 >> Mm-hmm. 00:11:45.35 >> When users upload their data into the app for a session in order to use it, it's done via the Web browser, so no data are actually passing across the Web, so their data remains secure within their local computer. 00:12:04.58 >> Yeah, that's important. 00:12:06.17 >> Absolutely, absolutely. The application is kind of built on a couple of stages. It guides the user through setting up the session, which is importing their data set and then indicating how they collected those data, telling the application about their data. For example, was it from an attempted census? Or was it from a sample? Then they can choose from a whole series of analysis options for their questions about response rate, representativeness and nonresponse bias. So it's a tool that is not just focused on nonresponse bias but allows the user to look at all of those areas, guided by questions such as, what are our response rates? Do they differ across subgroups? Are some subgroups in our population overrepresented or underrepresented in our data? That is looking at data representativeness. But then also, how do our survey outcomes differ across subgroups? And understanding how those survey outcomes vary across subgroups, combined with the information about the representativeness of those subgroups, is what informs the user about the presence of nonresponse bias in their data. 00:13:36.13 >> Mm-hmm. 00:13:37.16 >> And the tool is also powerful enough that it gives the user options for looking at whether we can use some statistical adjustments to reduce nonresponse bias in the data. 00:13:47.94 >> Hmm. 00:13:48.24 >> So it's not a requirement of the APR but an added kind of best practice: one way of assessing whether there's nonresponse bias and seeing how it might be adjusted is by using a weighting adjustment. 00:14:05.42 >> Mm-hmm. 00:14:06.14 >> So that's an option in the tool. There are many analysis options, so if a user wanted to dig more deeply into their data, they can do so. 00:14:17.62 >> So, Tamara, would the tool actually weight the data for you or [Indistinct]? 00:14:22.64 >> That's right. That's right. All of the analyses, whether it's calculating a response rate or whether it's looking at the proportional difference in representation or whether it is providing a comparison of weighted and unweighted data, all of that is done through these preprogrammed analyses that are part of the application. They're running in the background, and the user is choosing what variables to look at as well as other kind of parameters of the analysis that may need to be decided depending upon the specific statistical task that's being done. 00:15:13.13 >> Wow, that's ... And just having that all in one package, all of those pieces, because I think so much of the confusion that we heard from states, especially when the nonresponse bias analysis requirement was added, was not understanding the difference between them, how they are connected, how they work together potentially, and so just having it all together like that I think is so powerful for understanding the different requirements. 00:15:41.52 >> That's right.
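As a rough illustration of the analysis stages Tamara outlines, response rates by subgroup, representativeness, and a simple weighting adjustment, here is a small R sketch on invented data. It only sketches the general technique; the NRBA App itself walks users through these steps with its own preprogrammed analyses.

```r
# Toy data: a target population with subgroup membership, response status, and a
# survey outcome observed only for respondents. All values are simulated.

set.seed(7)
n <- 400
pop <- data.frame(
  subgroup  = factor(sample(c("A", "B", "C"), n, replace = TRUE, prob = c(0.5, 0.3, 0.2)),
                     levels = c("A", "B", "C")),
  responded = rbinom(n, 1, 0.4) == 1
)
pop$outcome <- ifelse(pop$responded, rbinom(n, 1, 0.7), NA)

# Response rate by subgroup.
rr <- tapply(pop$responded, pop$subgroup, mean)

# Representativeness: subgroup shares in the population vs. among respondents.
pop_share  <- prop.table(table(pop$subgroup))
resp_share <- prop.table(table(pop$subgroup[pop$responded]))
round(resp_share - pop_share, 3)   # positive = overrepresented among respondents

# Simple nonresponse weighting adjustment: weight each respondent by the inverse of
# their subgroup's response rate, then compare weighted and unweighted estimates.
resp <- subset(pop, responded)
resp$weight <- 1 / rr[as.character(resp$subgroup)]
round(c(unweighted = mean(resp$outcome),
        weighted   = weighted.mean(resp$outcome, resp$weight)), 3)
```

If the weighted and unweighted estimates come out close, that is some reassurance that nonresponse along the subgroups examined is not moving the result much; a large gap is a flag worth digging into.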
Yeah, that was really a goal, and one of the things Heather mentioned, the input that was provided at the OLA, which, again, I'll just say was so valuable for our development process, one of the things that is also really a part of the application is the supporting resources around it, so having resources that provide detailed instructions about not just how to install the app but how to set up the data set, because really, honestly, whether you're using the application or not, certain elements of the data set are valuable to have for the particular analyses that need to be done, as well as kind of a pretty comprehensive reference guide that gets to the conceptual pieces along with using the app, what it means. How do you interpret the analyses? 00:16:54.75 >> Yeah. 00:16:54.84 >> So all of those pieces we want to be a part of the resources available, and we're continuing to think about and get input from folks about what resources might be useful moving forward. 00:17:13.25 >> Yeah, and this is something that you don't have to have any type of statistical background to use. What would you say ... A data manager could pick this up and do it with the resources ... 00:17:25.51 >> Yeah, I ... 00:17:25.79 >> ... and support that goes along. 00:17:28.33 >> ... Yeah, absolutely, and Heather can certainly speak to this as well from the great engagement that we've had in collaboration around them using this tool with their data. We really do encourage users to leverage IDC's technical assistance to make the most of the application, but the support is really intended to be flexible. There may be folks who are very experienced in statistics who might choose to use the application independently, just using the guide for reference. Others may really benefit from engaging one-on-one with an IDC TA specialist like myself to work with the data collaboratively, to get input on which analyses might be most useful to them or to discuss together how the results might be interpreted. Folks might want to kind of gain proficiency with the tools themselves with that range of IDC support, and it really is ... The support around the tool is really quite flexible as well. 00:18:43.18 >> So, Heather, tell us about your experience using the tool and what that's been like and how maybe that's changed your analysis, your interpretation, your results in terms of the nonresponse bias and representativeness as well. 00:19:00.24 >> Sure, sure, so like I mentioned, at the workshop, Tamara and Ben, they walked us through various ways to use the app, and just like anything we learn that is new, I felt a little outside of my comfort zone at the workshop, but with their leadership, the process wasn't too stressful. It seemed fairly straightforward, but then when I returned to Arizona, I tried using the tool on my own. I found it far more difficult than I anticipated. I felt clumsy. I was making mistakes. I tried to follow the written directions I had gotten at the workshop, but I was stumbling with the steps, so I reached out to Tamara. I felt a little embarrassed because I needed additional support after they had spent 2 full days with me, but you know what? She made me feel totally comfortable. We went through some of the exercises together. She helped me understand how to interpret those results. She made sure that my data set was organized correctly, and then after that the process was fairly simple.
Just like Tamara said, the RStudio app will run all of the calculations, so it can do some of the basic calculations like calculating the response rates by subgroups and comparing subgroup percentages in the respondent data to data from both respondents and nonrespondents, but what I found really helpful is learning, do we have that nonresponse bias? And so, for example, in Arizona the Indicator 8 data we have is about 92 percent agreeableness, but we're only receiving 14 percent of responses, so that's a lot of unknown. That's 86 percent that we don't really know how they're going to respond, and so the tool was really helpful. We found that, although we only got 14 percent, if we were to extrapolate that and get everyone to respond in respect to race and ethnicity, it was extremely close to that 92 percent that we calculated. So that gave us ... 00:21:08.09 >> Good. 00:21:08.25 >> That gave us a lot of confidence that we didn't have nonresponse bias in respect to race and ethnicity for Indicator 8, and we also looked at Indicator 14, and we received quite a few responses. We have about a 74 percent response rate, but you still don't know if there's nonresponse bias going on even though we have quite a few respondents, so we ran the test there, and what we found is, we're getting about 80 percent of our responses from graduates, okay? So we get quite a few of the graduates, but we're getting fewer responses from dropouts. It's harder to get responses from the dropouts, and so we wanted to know if there's some nonresponse bias going on. And the areas of engagement differ between graduates and dropouts, and especially when we look at, for example, who's going into higher ed, what we found is that for the percentage of youths with IEPs that go into higher ed, it's about 19 percent in Arizona. But we were wondering, if we received responses from everyone, would it still be 19 percent? So that's where the tool really came in handy, so in a perfect world, if we had gotten all of the responses from graduates and all the responses from dropouts, the app showed us that instead of 19 percent, it would be lower. It would be about 17 percent. 00:22:31.25 >> Mm-hmm. 00:22:31.88 >> And so my next step in this process would be to learn about, is that significant? What's the level of significance, that 2 percent difference? And that's something that I think the tool can help with. I would like to learn more about that in the future. 00:22:47.34 >> Mm-hmm. 00:22:48.91 >> So I'm still kind of at the beginning stage of learning about the tool, but from what I've seen it is really, really neat and saves a lot of work. 00:22:57.37 >> Yeah, did you use it for the SPP/APR you're working on that's due in February? 00:23:02.25 >> Yes. 00:23:02.46 >> Great. 00:23:02.76 >> Yes, we did, mm-hmm. 00:23:04.11 >> That's very exciting. Well, it sounds like, yeah, you've gotten a lot out of the tool, and there's a lot it can do and a lot more still to even explore with it, so kudos to you all. 00:23:17.68 >> Yeah, thank you. I will add that any time you're working with something complicated, you try not to make it so complicated. You try to simplify it, so one thing that did help me to understand the tool better is, I made a smaller data set of fake data. So instead of looking at 10,000 responses, I just made a false data set of 100 responses to help me just understand the tool better.
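To make the graduate/dropout extrapolation Heather walks through concrete, here is a back-of-the-envelope R sketch. All of the shares and rates below are invented to mirror the idea; they are not Arizona's actual Indicator 14 numbers.

```r
# If graduates are overrepresented among respondents and enroll in higher ed at a
# higher rate than dropouts, the raw survey estimate overstates the population rate.
# The figures below are assumptions for illustration only.

pop_share  <- c(graduate = 0.70, dropout = 0.30)  # assumed share of all exiters
resp_share <- c(graduate = 0.80, dropout = 0.20)  # assumed share of survey responses
higher_ed  <- c(graduate = 0.22, dropout = 0.05)  # assumed higher-ed rate among respondents

unweighted <- sum(resp_share * higher_ed)  # what the raw responses suggest
weighted   <- sum(pop_share  * higher_ed)  # reweighted to the exiter population

round(c(unweighted = unweighted, weighted = weighted), 3)
# The weighted estimate comes out lower because dropouts, who enroll at a lower rate,
# get their full population weight, the same direction as the 19 vs. 17 percent shift
# Heather describes.
```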
And so in respect to race, ethnicity, I just made 50 were Hispanic, and 50 were white, and I ran different scenarios to see how the tool worked and what if more Hispanics answered? What if more ... 00:23:57.06 >> Yeah, how would that change? 00:23:57.95 >> ... white people answered? What if the Hispanic level of agreements was very high, and the white level of agreement was really low? How does that change each of these calculations. So I would suggest to anyone learning about this tool to try that, to create a smaller data set and just look at the different statistical calculations that the app provides and what states might find valuable. 00:24:19.09 >> That's a great tip. Do you have anything else for states that are just interested in possibly using this or have starting using it that might be helpful for them? 00:24:28.87 >> Sure, I would say in addition to trying a small data set, to reach out to IDC. Like I said, I was kind of unsure, kind of feeling clumsy about it, and they are just so helpful to make sure that your data set is set up correctly. If it's not, the app is not going to work at all, and so that's the first step. And then when you finally get the results, you might need some help interpreting them, and that's where IDC can really come in handy as well. 00:24:54.46 >> Great, so, Tamara, for states that are interested in learning more about this and getting their hands on it, can you tell us how they can get the tool and maybe a little bit more about the support that IDC is able to provide? 00:25:09.45 >> Yeah, absolutely, well, the app itself, information about it, if they want to kind of self-explore is available on the IDC website along with those supporting documents. Also on the website in various places are various kind of presentations that we've done either at the SPP/APR summit, for example, that speak to both the topic as well as a bit about the application. If a state user doesn't want to kind of engage in it, then contact your IDC state liaison, and they will be able to connect you with a TA specialist. In terms of what's next, we are continually improving and looking for those opportunities to provide TA, really, that help inform the hearing from the field on what other kind of supporting resources would be useful. It's been immensely helpful to collaborate and rewarding to collaborate with Heather in Arizona and, for example, mentioning the smaller fake data set, we have small data sets that we have developed that are, with fake data, that are for Indicators 8 and for Indicators 14 that we use for demonstrations with the app, and we certainly could make those available to folks so that they could then ... 00:26:43.24 >> They ran themselves. 00:26:44.27 >> ... use those, absolutely, so we're constantly learning from each other in this process, so that's great, and as she mentioned and I mentioned, really our support is very flexible and geared towards what's going to be the most useful and meaningful for the state user in whatever phase they are in, in this process. 00:27:09.43 >> Great, and we'll put links to the resource application itself as well as some of the other presentations that might be helpful in the notes for the episode so folks can easily get to them. 00:27:21.77 >> Great. 00:27:22.56 >> Great, well, thank you both so much. 
I know that I learned a lot more about the app than I knew before, and it seems like something that's so useful and helpful and so appreciative that you came on and have shared your story so other states can hear about this, too. 00:27:39.92 >> Oh, you're welcome. Thank you for having us. 00:27:41.84 >> Yeah, thank you so much. 00:27:42.96 >> Of course. 00:27:44.71 >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content or connect with us via the podcast page on the IDC website at ideadata.org.…
 
Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you! You can contact us via the Podcast page on the IDC website at https://ideadata.org/ . ### Episode Transcript ### 00:00:01.52 >> You're listening to "A Date with Data" with your host, Amy Bitterman. 00:00:07.34 >> Hey. It's Amy, and I am so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who just like you are dealing with IDEA data every day. 00:00:19.50 >> "A Date with Data" is brought to you by the IDEA Data Center. 00:00:24.87 >> Hello. Welcome to "A Date with Data." On this episode, I am joined by two data managers. We have Amy Patterson from the Kentucky Department of Education and Alisa Fewkes from the Idaho State Department of Education. Both Amy and Alisa have been in their role for a number of years, and they're going to be reflecting on their experiences and also sharing some lessons that they've learned over time and insights that they have for newer state staff. Can each of you tell me how long you've been your state's data manager and maybe also a little bit about your responsibilities since we know that a lot of different tasks and initiatives fall under being a data manager, and that can really vary a lot from state to state, what you're responsible for? So I'd love to hear what you do in each of your states. 00:01:14.39 >> Sure. So my name is Amy Patterson, and I am the Part-B Data Manager in Kentucky, and I've been in this role for about 8 1/2 years. Some of my responsibilities include pulling data from our statewide student information system as well as getting some data from the districts directly. And then I review the data, format the data, and then I give it to our EDFacts coordinator, and she's the one that actually submits the data. I also do a lot of training for LEAs on how to enter data and how to review their data so that we get quality data reported back to us. And I do the data for the SPP/APR, but I am not responsible for writing the SPP/APR or the SSIP. 00:02:05.67 >> Got you. Okay. Thanks, Amy. How about you, Alisa? 00:02:08.78 >> Hi. I'm Alisa Fewkes. I am the Data and Reporting Coordinator for Idaho, and I have a lot of similarities to Amy. I've been in my position for right about 8-ish years. I do work directly with LEAs to provide support on their data quality, especially for child count and then those bigger reporting areas, program exit, all the way through. So I can be contacted by our LEAs to kind of go down those rabbit holes of data and reporting coding. I also provide support through doing trainings, working with teams on data quality issues, figuring out data and reporting plans that they can consider, and those root causes, to go through and look at all of their data and information. We always provide a regional training every year that I'm part of, which goes around to like five different locations around the state to go through root cause analysis. And then I am also the lead for SPP/APR, both on data, writing, submission. Thankfully, as Amy said, I don't do the SSIP ... 00:03:41.57 >> Mm-hmm. 00:03:41.78 >> ... so very glad about that. We have other folks working on that information. And then I work directly with our IT team for pulling data for our EDFacts reporting, so I do a lot of work with them.
If we're seeing any reporting errors as we go through the submission, I troubleshoot with them trying to figure out what's going on. 00:04:05.47 >> Mm-hmm. 00:04:06.11 >> And so a lot of direct interaction, and most of the reason for that is actually because I've pushed for it ... 00:04:14.62 >> Hmm. 00:04:15.32 >> ... for my position. I really want to be more hands-on as opposed to just having my data come in and not knowing what's going on with it. So I really appreciate that team members have been willing to let me be involved and that I have a good relationship with our IT team who's involved in all of that data collection and processes. 00:04:40.51 >> Hmm. Thank you both, and what you were just talking about, Alisa, I think kind of folds into the first question I had, which is, how has your role evolved over time? And it sounds like maybe things have changed somewhat. I'm sure they have for both of you both in terms of what's going on in your state and the context and other staff that are there as well as SPP/APR changes, EDFacts changes. So, Alisa, do you want to kick us off just saying a little bit how things have changed since you started about 8 years ago and until now? 00:05:13.99 >> Yeah, sure. So when I first came in, there was a lot of processes being run without interaction of the Part-B Data Manager, which, again, I just wanted to be a lot more hands-on. I like knowing. 00:05:31.76 >> Mm-hmm ... 00:05:32.26 >> So ... 00:05:33.23 >> ... what's going on. 00:05:34.60 >> Yeah, so I really work to build up those relationships and team interactions across our different divisions, so whether it was with IT or assessment, those different folks to make sure that when data processes were being considered for changes that they were really, "Oh. We need to bring Alisa in on this conversation." So it wasn't like that special ed was outside of that conversation. It was automatically, "Oh. We need to make sure that we're including those folks, and I'm right in there with them." And that was a big change when I came in. That was very much a, "Oh. Oh, yeah. We didn't talk to special ed. maybe we should do that." 00:06:31.80 >> Mm-hmm. 00:06:32.67 >> It's just a regular part of the process, which I think is great. And again, I just like knowing, so to be in the know, you have to be part of those problematizations ... 00:06:46.79 >> At the table. 00:06:47.62 >> ... yeah, and at the table. The other thing I've seen change over time, I see kind of a pendulum swing back and forth, and sometimes it depends on the time of the year of how much direct support I'm providing to LEA teams. Sometimes it really is more I'm on SPP/APR. I am not having those one-on-one conversations with teams. And sometimes it's really majority of my day is taken up providing feedback on calls to LEA teams. 00:07:29.80 >> Hmm. 00:07:30.59 >> So it's kind of a swing back and forth, but a lot of my work also now is around the public reporting, making sure that that information is out in more understandable ways, which sometimes can take a lot more time. 00:07:48.90 >> Yeah, absolutely. Amy, what about you? How have things changed for you? 00:07:53.78 >> A lot of what Alisa said is true for me, as well. I do spend more time now on public reporting and communicating data and data visualizations. And I'm working with a contractor for dashboards, which, that evolved probably 3 or 4 years ago. And a lot of my time has been spent on that. 
When I first came in, my role was basically just to collect data and report it, and I was a data person but not a special ed person. 00:08:24.87 >> Mmm. 00:08:25.32 >> So the more I learned and because data is in everything special ed, I have become more involved in sort of programmatic discussions ... 00:08:35.89 >> Mm-hmm. 00:08:36.12 >> ... behavior discussions. I am sort of the face of our special ed office because I do most of the trainings, although they're around our data system. People know they can reach out to me, so I'm doing a lot of fielding calls and questions. And also, as I've become more familiar with our system and with special education, I've made some changes to sort of update how we do our data collection and reporting. And we're constantly sort of revising the data collection process and trying to document the data cleanup so that we're consistent and we have the same data. We have better data every year based on what we're getting from the LEAs and reporting back. 00:09:27.96 >> Sounds like both of you have really kind of taken this position and made it your own and really run with it and tried to make improvements and expand like both going more into the public reporting, for example. So that's really exciting and something for those who are newer in the data manager role to kind of think about, too, as you're in your role for more time. And what are other areas that you can get into and improve? Talking about new data managers and new staff, Amy, what advice do you have for those who are just getting off the ground as a data manager? 00:10:07.24 >> First, I would say give yourself a little grace. Don't panic because there is a lot to this, and it's ... It can be a little intimidating, knowing you have federal due dates and all this data coming in. So do the best you can, and reach out to other data managers [Indistinct] ... 00:10:31.42 >> Mm-hmm. 00:10:32.58 >> ... or any one you feel comfortable talking to, IDC, if you have questions. Don't hesitate to ask question. The second thing I would say is to document everything, the data process protocols. I'm always talking about the data process protocols because they kind of saved me. We had a data manager in our state that was here for a long time, and when he left, we were kind of lost. And so it's been very helpful. He came back and helped us document these protocols, and then I've kept them going over the last few years, and it was really helpful. And then, third, I would say relationships are key. And Alisa sort of alluded to that earlier. My relationships that I try to build, I have really good relationships with our LEAs, and I'm building relationships with other offices within our department, our tech team and assessment because those relationships really do go a long way in helping you get your job done and ensuring that you have the best data possible reported. 00:11:47.52 >> Great advice. Alisa, what advise do you have? 00:11:52.24 >> My first one was very much going to echo Amy. Be kind to yourself. It's probably going to take you about 3 years before you're going to feel stronger in your position and that you're going to be able to make changes to your processes. Yes, probably after that first year you're going to make tweaks to what you are doing and going, "Oh, this was a headache. I need to address this." 
But then after that 3 years, you're going to likely be able to a lot better manage your calendar, getting that calendar set up so that you're working with special ed teams and IT teams, whoever you need to work with so you can really chunk out your processes so they're more manageable because there can be a lot of stuff that mounds up on you all at once if you don't do that. So develop a calendar. Figure out what works with your team, whether you do a lot of EDFacts reporting activities, public reporting activities, whether you're on more of those program monitoring or SPP/APR. Figure out so that you don't have too much on your plate all at once because inevitably something else is going to pop up during that same time frame, and it's going to make it stressful for you. So be kind to yourself. Also, work on developing your team's ability to look at data because if you're the only person who's looking at the data and the only access point, which, to a great degree I am, don't do that to yourself. Make it so that your team knows where the information is, how to use that information, and that may take a lot of time up front, developing that, and may take more time at different points in the year. But that's going to save you time and energy throughout the year if your team members know how to access that and feel confident. And looking that information, I think, is, I don't know, a great dream to have. 00:14:26.86 >> And I think that ties into what Amy was saying, too, about documenting everything, so making sure everything, all your processes, procedures, are documented, well documented, and that others on your team are aware of it, have seen it, have looked at it, understand it, too, so kind of building the internal capacity piece. 00:14:46.95 >> And that also builds sustainability ... 00:14:49.78 >> Mm-hmm. 00:14:50.01 >> ... within the team of anybody at any point leaves. 00:14:54.33 >> Right. 00:14:55.17 >> Then, you have those backups. There was something else I was going to say. Oh, yes. Around documenting those processes, there's lots of activities that you're going to be only doing once a year. 00:15:08.15 >> Mm-hmm. 00:15:08.95 >> So having those processes documents and so you can make changes, updates throughout and take the opportunity to make those updates when they're happening because you may forget about it later on, and you don't want to have to go back and do a lot of work on a lot of different files later on. 00:15:32.76 >> Yeah. 00:15:32.91 >> But, yes, document your processes because one year later, you've had a lot of things you've done. 00:15:39.71 >> Mm-hmm. Yep. Yeah. Having ... You're kind of going through and having your processes open and available kind of while you're going through it and updating it along the way, too, as you as you make changes. Like you said, instead of waiting until after, when you feel like maybe I have a little more time but then you're going to forget things. So that makes sense. So, Alisa, I'm really curious to hear, what is your favorite part about being a data manager? 00:16:07.78 >> Favorite part about being a data manager is really that collaboration that you can have across team, those relationships you can build, those opportunities to ... I'm a person who really likes troubleshooting stuff. 00:16:24.78 >> Mm-hmm. 
00:16:25.86 >> So going through working with others to make the process better, whether it's more efficient or better quality or whichever, that collaboration and bouncing ideas off of other folks, that's one of my favorite things. The other is when I actually get to do more of that direct support to LEAs, and they have an ah-ha moment. 00:16:52.88 >> Mm-hmm. 00:16:53.39 >> "Oh. You mean I don't have to do all of these things, and I can get my data to look better, and it's going to be easier for me?" That's another one of my favorite things. 00:17:05.26 >> Mm-hmm. Yeah. Amy, what is your favorite thing or things about the job? 00:17:11.37 >> Somebody asked me the other day what my favorite part of my job was, and I said that I get to use all these nerdy interest databases and Excel spreadsheets and data, and I get to use those to help people. So similar to what Elisa said, I love providing support to districts, and I love the data, as well. So it's kind of a nice mix. And the fact that one day never looks like the next ... 00:17:43.71 >> Mm-hmm. 00:17:44.45 >> ... where it's constantly changing, even ... It might just be the data set that I'm working on. It might be that one day, I'm doing a training. The next day, I'm collecting data. The next day, I'm working on public reporting. But it never gets boring. 00:18:00.40 >> Mm-hmm. Yeah. 00:18:01.72 >> Can I pop in another thing? 00:18:04.34 >> Yeah. Yes. 00:18:05.40 >> I really enjoy that I'm always working to make it better, make the ... 00:18:10.75 >> Mm-hmm. 00:18:11.17 >> ... data better, the process better. And so like Amy said, it's never exactly the same because you and others that you're working with are always trying to advance it forward for your state. 00:18:28.39 >> Right. It seems ... Yeah.it would be impossible to have everything be 100-percent perfect. There's always room for improvement and especially as there's new requirements and changes that come through. It's always going to be evolving. And, Amy, what would you say that you're grappling with right now in terms of data quality challenges, and how are you working to address them? 00:18:55.54 >> I think with the data system that we have, they just made updates to the data system. And any time there's an update, there is a change that causes the directors of special education in the state to sort of panic. 00:19:15.09 >> Mm-hmm. 00:19:15.96 >> Because they're not data people. They're special ed people. But they're having to review data and report data, and we have such a turnover ... 00:19:27.28 >> Mm-hmm. 00:19:28.19 >> ... in directors of special education every year, even in the middle of the year, that it's hard to keep the training up. We record the trainings, but sometimes people don't even know that the trainings are out there because they're that new. 00:19:44.48 >> Mm-hmm. 00:19:45.35 >> So I think it's keeping up with the updates in the system and communicating to everyone, especially the new people that you don't ... It's not as hard as it seems. 00:19:58.97 >> Right. 00:19:59.11 >> It's not as scary as it seems. Please don't panic, because I'm here to help. 00:20:03.37 >> Mm-hmm. Alisa, what about you? 00:20:06.12 >> I think also the turnover really is the greatest challenge right now because we've got turnover not only in special ed directors every year. We've got turnover in data enterers. 00:20:20.92 >> Mm-hmm. 00:20:21.22 >> We've got turnover in those registrar folks. And they all contribute to either the high quality or not-so-high quality of data. 
And they're not just learning how to submit information to the state. They're also learning how to enter that information into their student information systems, because LEAs in Idaho have access to use whatever student information system they want ...
00:21:00.96 >> Mm-hmm.
00:21:01.18 >> ... use whatever platform or IEP software they want. And while we now have a lot of folks on the optional IEP software, it all interacts with those student information systems differently. So there is a lot of learning going on, and at the state level, we don't necessarily have the capacity or ability to address all of those learning situations with that student information system. So we have lots of calls come in of, "Could you tell me how I enter this piece of information into my student information system?" It's like, "Well, I can tell you how to submit it to the state and what that's going to look like. But I can't tell you these other pieces because I don't know your software system."
00:21:59.22 >> Mm-hmm.
00:21:59.51 >> I don't know what version you have, and so there is just a lot of learning going on in each district, and the level of turnover plays a huge, huge amount into that piece. And while you're going through and doing all this training annually on data quality, you have to make sure that you're meeting all of those folks at the level they're at, so you're addressing those who are new to the data recording and then those who have been in for 15-plus years. So ...
00:22:38.54 >> Yeah.
00:22:38.68 >> That's probably the highest level.
00:22:41.80 >> Hmm. So it's similar, I think, in both states in terms of the turnover, your data systems' complexities, training new folks, all of that. So finally, I just wanted to know if there's something you want to talk about that you're working on now or something that you're planning for in the future related to improving the quality of your state's IDEA data that you'd like to share with us. Alisa, do you want to kick us off?
00:23:09.15 >> Yeah. I'll jump in. So one of the things that we're trying to figure out and get better at is all of that public reporting and how that relates back to data quality. Really, if people understand what it's about, they're more likely to put a higher level of importance onto recording good-quality data.
00:23:37.76 >> Mm-hmm.
00:23:38.40 >> So we're trying to make sure that not only are we addressing those 508 accessibility requirements and those basic federal reporting requirements but that we're also making that information more understandable and usable for our public. And negotiating those criteria can be pretty tricky, because when you try to put it out in more infographic layouts, then you're potentially running into those 508 accessibility issues.
00:24:15.08 >> Mm-hmm.
00:24:15.44 >> And so we're just trying to get better, trying to make it more, like I said, understandable, usable and valuable ...
00:24:24.87 >> Yeah.
00:24:24.97 >> ... to not just our teams at the state but at the local level and to families as well.
00:24:32.40 >> Mm-hmm. Amy, what about in Kentucky?
00:24:34.99 >> So we're working on updating how we collect data and data reports from districts, just trying to make it more efficient and ensure that we're getting accurate data for indicators 11, 12 and 13, for example. We collect those directly from the LEAs, and training around that has been difficult, again, because of turnover. And it's ... It can be confusing to them. So we're working on that.
And something else I would like to do is, for districts who don't make determinations, do a data retreat.
00:25:09.97 >> Mmm.
00:25:10.47 >> It's something I've been thinking about for a while, and I feel like it would help people understand their data and be able to make changes accordingly, instead of just trying ... looking at the overall data, or even not looking at data at all ...
00:25:26.42 >> Mm-hmm.
00:25:26.66 >> ... to make changes. I feel like if we could do sort of a data dive with them and let them disaggregate it and see what's going on within the numbers, that would help districts a lot.
00:25:39.20 >> Yeah, definitely. That sounds like a great idea, and I would love to hear more about it and hope that happens.
00:25:45.43 >> Me, too.
00:25:47.18 >> Well, thank you both so much. Really appreciate you being on and sharing your experience, knowledge and expertise in this just ever-changing world and role. And it sounds like you have a lot of great stuff coming up.
00:26:04.37 >> To access podcast resources, submit questions related to today's episode, or share ideas for future topics, we'd love to hear from you. The links are in the episode content. Or connect with us via the podcast page on the IDC website at IDEAdata.org.
 