Transcript of Sold a Story E13: The List
This podcast is designed to be heard. We strongly encourage you to listen to the audio if you are able.

I’ve gotten a lot of emails from listeners since Sold a Story first came out. I have a fat file folder full of actual letters, too. Sent in the mail!
One of these letters came from Matt Huffman. At the time, he was president of the Ohio state Senate. The letter is three handwritten pages. Huffman said he was — quote — “invigorated” after listening to the podcast. He could see there was a problem with how reading was taught, and he wanted to fix it. He wasn’t the only one.
Christopher Peak: Ohio had a lot of people who listened to our podcast.
This is my co-reporter, Christopher Peak.
Peak: I got a call just a couple months after Sold a Story came out from one of the top education officials saying all the executives in the department were listening to Sold a Story and they want to do something about it.
(Music)
A few weeks after Chris got that call, the governor gave his State of the State address.
Gov. Mike DeWine: I’m calling for a renewed focus on literacy.
Peak: He's saying a big proposal is coming, we're going to make changes to how reading is taught in Ohio.
Two weeks later, legislators introduced a bill.
Peak: And this bill says the department has to come up with a list of programs that are aligned with the science of reading.
The bill passed in June. The governor signed it into law on the Fourth of July.
Now, it was up to the Ohio Department of Education to make a list of approved reading programs.
I’m Emily Hanford and this is Sold a Story, a podcast from APM Reports.
In this episode, we’re going to tell you how state education officials in Ohio came up with their list. Why the Success for All program wasn’t on it — at first. And the influential organization that Ohio and other states are looking to for help when they’re figuring out what programs count as the science of reading. An organization that wasn’t set up to do that. We're also going to hear why even an evidence-based program doesn’t always work. Teaching kids to read is about more than just a program.
(Music ends)
So Ohio’s new reading law passes in 2023. It directs the state government to come up with a list of approved reading programs.
Melissa Weber-Mayrer: My name is Dr. Melissa Weber-Mayrer.
And it’s this person’s job to figure out how to do that.
Weber-Mayrer: I work for the Ohio Department of Education and Workforce.
She and her colleagues have to come up with this list quickly. The law says schools in Ohio must be using a state-approved reading program by the end of the following school year.
Weber-Mayrer: We had a very short window to get things in place.
Melissa Weber-Mayrer and her colleagues knew what wasn’t going to be on the list.
Weber-Mayrer: If there was any indication of any part of a three-cueing method being used, they didn't move forward.
That’s because the Ohio law included a ban on cueing — the flawed strategies we focused on in this podcast. At least 16 other states now have similar bans. So, programs that included cueing were out in Ohio. But what was in? The law said programs had to be — quote — “aligned with the science of reading.”
Melissa Weber-Mayrer and her team decided it wasn’t feasible for them to do their own analysis of research on reading programs.
Weber-Mayrer: We actually did not review efficacy studies.
They had to come up with a way to do this quickly. So one thing they did ...
Weber-Mayrer: We looked at what our other state colleagues who already had similar laws had done.
They looked at other state lists. A program could make a case to get approved in Ohio if it had already been approved by another state. At least nine states have recently created new science of reading lists.
And there was another way to make it onto Ohio’s list.
Weber-Mayrer: Have you been reviewed by EdReports?
EdReports.
(Music)
EdReports is the organization I mentioned earlier that’s having a big influence on whether a program makes it onto a state’s list.
My co-reporter Christopher Peak has been digging into EdReports for several months.
(Music ends)
Emily Hanford: Hi, Chris.
Peak: Hi, Emily.
Hanford: So let’s start with some basics: What is EdReports?
Peak: It’s a pretty new organization. It’s a nonprofit, and it’s only 10 years old. And it’s already built up a lot of clout, by billing itself as a kind of “Consumer Reports” for curriculum.
Hanford: So what exactly does EdReports do?
Peak: They review curriculum. Teams of teachers actually do the reviews. They review not just reading curriculum but math and science curriculum too. And they rate it. It’s a red, yellow and green system. So if you’re a publisher you want an “all green” rating from EdReports. Nearly 2,000 school districts have used its reviews to make their purchasing decisions. And the organization says 40 publishers have actually adjusted their products in response to an EdReports review. This is bigger than just the new state lists. EdReports was having a big influence on the publishing industry before Sold a Story and the current conversation about the science of reading.
Hanford: And it turns out there’s a bit of a disconnect here, right? EdReports wasn’t set up with the science of reading in mind.
Peak: No, it was set up with something else in mind. Something called the Common Core State Standards.
Barack Obama: Forty-eight states have now joined a nationwide partnership to develop a common set of rigorous, career-ready standards in reading and math.
Peak: Common Core was a thing during the Obama administration. It was an effort to raise education standards across the country. The goal was to make sure students in different states were learning the same core skills. But it ran into the same kind of problem that George W. Bush’s big education effort ran into. Publishers were saying their programs were aligned to the Common Core. Just like publishers were saying their programs were “scientifically based” during Reading First.
Hanford: And there was no one really policing that.
Peak: And that’s why EdReports was established. To review curriculum and say, yes, this curriculum really was designed with the Common Core standards in mind. Or, no, this curriculum wasn’t. It’s not aligned with the new standards.
Hanford: So EdReports released its first reviews in 2015, and it becomes very influential, very fast. But then along comes the science of reading. And people are starting to ask a different question — not, is your curriculum aligned with the Common Core? But — is your curriculum aligned with the science of reading?
Peak: Exactly. And what I found in my reporting is that EdReports has given high marks to some programs that include the cueing strategies, which as you know is the opposite of what science has taught us about how kids become good readers.
Hanford: So say more about that — do you have an example?
Peak: So I talked to Kari Kurto. She was a literacy specialist at the state Department of Education in Rhode Island, which was one of the first states to really try to push for better reading curriculum. Rhode Island had looked to EdReports to come up with a list of programs that districts should be using. And Kari had been on the job for just a couple of weeks when she had a “jaw-dropping” moment.
Kurto: I was in my cube on the fourth floor of the Department of Ed. And I began to go through the materials on the approved list, and some of them had some great evidence-aligned instruction and others, I started flipping through and said, “Uh oh.”
Peak: She was seeing programs telling teachers to say things like, “Read the pictures,” and to use cues other than the sounds of the letters.
Kurto: They had a lot of the strategies and guidance that we know runs counter to the science of reading and yet they were on this list that said — go ahead and adopt these programs. This is what the Rhode Island Department of Education stands behind.
Hanford: I think to understand how this happened it helps to know a bit about what the Common Core standards are.
Peak: Yep. The Common Core standards basically lay out what kids should know and be able to do at each grade level. I have a copy of the English Language Arts Standards right here. It’s 66 pages long. And here’s an example of one of the standards for first grade. It says that a first grader should be able to “ask and answer questions about key details in a text.” But the Common Core standards don’t say anything about how to do that. They don’t say anything about how to teach. They just say what to teach.
(Music)
Hanford: And you can see how this could be in conflict with the science of reading. Because one of the big things the science of reading has revealed is that how you teach kids matters. But EdReports was basically agnostic on how things were taught. What EdReports essentially wanted to see was that a curriculum was covering everything in that 66-page standards document you’ve got there.
Peak: Right. Even some of the people who were once supporters of EdReports are recognizing this conflict now between the science of reading and the Common Core standards.
(Music ends)
Peak: I talked to David Liben. He’s an educator with more than 50 years of experience.
Liben: I've been involved in education since shortly after the Civil War.
Peak: As you can tell, he likes to joke around a bit, too. David Liben worked with EdReports when it was first set up. He thought the organization was needed because of that problem we mentioned earlier — publishers slapping Common Core stickers on their products and no one checking to see, is this program really living up to that label? But David Liben now says EdReports’ methodology is flawed.
Liben: Success is dependent upon how we align with standards, as opposed to how we align with the science of reading.
Peak: He says one of the biggest problems with EdReports is that some programs that are backed by rigorous research are not getting those coveted “all green” ratings. They’ve got good studies that show they’re effective. But EdReports doesn’t factor studies into their ratings. That’s not part of their review process.
Hanford: So EdReports was designed to look at — does your program cover all of the standards? Not — does your program deliver on the science of reading?
Peak: Right. And I should note, too, that both David Liben and Kari Kurto — the woman from Rhode Island — they’re both now associated with organizations that do their own curriculum reviews.
Hanford: I want to ask about Success for All — the program they use in Steubenville. Success for All has never been reviewed by EdReports. Why not?
Peak: Because of what we learned in the previous episode. Success for All is not just a reading curriculum. It’s a whole school reform program. So I asked an EdReports spokesperson about this. And she told me that reviewing just the reading curriculum wouldn’t have provided a complete picture of Success for All. So EdReports decided not to review it.
Hanford: Interesting. Nancy Madden — the co-creator of Success for All — told me that she didn’t want her program to be rated by EdReports.
Nancy Madden: I don't want to validate that approach to reviewing what instruction should be. It's the wrong approach. We need to judge — what's the outcome? We need to look at — what is the evidence of effectiveness?
Hanford: She and her late husband, Bob Slavin, spent their careers trying to get schools to use evidence.
Madden: What we wanted to do was show that the evidence could matter.
Hanford: I was surprised when Nancy told me they left the country for a while because they were so frustrated by what they saw as a lack of interest in evidence here in the U.S. But she told me that when our podcast came out, she was feeling hopeful again. The CEO of Success for All said the same thing.
Wible: This idea of science of reading coming to schools across the country, we were thrilled.
Hanford: Her name is Julie Wible.
Wible: Finally, you know, we’re going to look at the evidence. We’re going to look at the science. And kids are going to get what they need.
Hanford: But she and Nancy told me it was kind of déjà vu when states started making lists. And Success for All wasn’t getting on those lists.
Peak: Success for All was actually on one state list: Arizona. But Melissa Weber-Mayrer — the education official in Ohio — she told me that her team didn’t think Arizona’s review process was rigorous enough.
Hanford: That seems kind of ironic to me. Success for All is on Arizona’s list in part because Arizona doesn’t look at EdReports. You get on Arizona’s list if you have evidence for your program.
Peak: And we know Success for All has that evidence.
(Music)
Peak: But most states are not looking at evidence to decide what belongs on their lists. Some of them are looking at EdReports instead. And that’s why when Ohio’s list first came out, Success for All wasn’t on it: The program has never been reviewed by EdReports.
Hanford: I’m going to have you come back later to tell us what the CEO of EdReports had to say about all of this in your interview with him.
Peak: Alright, see you soon.
First, I’m going to finish the story of what happened in Ohio.
(Music ends)
When the superintendent in Steubenville first heard about Ohio’s new science of reading law, she wasn’t worried.
Melinda Young: Oh, no big deal. SFA is the science of reading.
This is Melinda Young.
Young: As naïve as I guess I was, I really just never gave it a second thought.
When I first visited Steubenville, the news was still kind of sinking in. They were hopeful that Success for All might eventually make the list; state officials said a second review process would be coming. But they were already looking at new reading programs.
Tricia Saccoccia: We are proactive here.
This is Tricia Saccoccia, the principal of East Elementary.
Saccoccia: We’re not just sitting here waiting. We’re getting ready just to be prepared.
They were looking at the programs on the state’s initial list.
Lynnett Gorman: And there are a lot of school districts who are using approved curriculum already.
That’s Lynnett Gorman, another principal in Steubenville. She and her colleagues were looking up test scores in the school districts that were using an approved program. Close to a third of districts in Ohio were already using something on the state’s initial list. But only one of those districts was doing better in reading than Steubenville. It’s a tiny district with a very low poverty rate. The teachers in Steubenville were having a hard time understanding why they might have to stop using Success for All.
Teacher: I don’t want a new program.
Teacher: Why get rid of something that is proven to work?
Teacher: I would be upset about it.
They were upset. But they weren’t panicking.
Teacher: Either way, we’ll be fine, we’re a strong district. We’ll get through it if we have to.
Nicolette Hill: I feel in good hands. So I don’t worry.
This is Nicolette Hill, an eighth grade English teacher.
Hill: We have a wonderful board of education and higher-up staff, and they put a lot of thought into everything we do. And I think that they’ll make sure that we stay where we need to be and keep excelling and doing what’s right for the kids.
(Music)
It really struck me the way teachers here trust their administrators. I don’t sense that same kind of trust in a lot of school districts I visit. I think it has something to do with the frequent turnover in leadership in many districts. The average superintendent in a poor school district in the United States lasts only about 5 years. The superintendent in Steubenville has been on the job for 10 – and before that she was a principal and a teacher here.
Stability is a feature of this place. Steubenville has low principal turnover and low teacher turnover, too. And according to the school district, 48% of the people who work in Steubenville schools went to Steubenville schools. I think this stability — the commitment to this place — is one of the reasons Success for All has worked here. Why it’s lasted for 25 years.
But it doesn’t work everywhere. Often, it doesn’t even last very long.
More on that — and how Success for All finally got on Ohio’s list — after a break.
** BREAK **
I talked to William Corrin. He’s been overseeing evaluations of education programs for decades and knows a lot about Success for All. I told him that Steubenville had been using the program for 25 years.
Hanford: Does it surprise you that a district has been using SFA for that long?
William Corrin: Yes.
He said it would surprise him to hear that a district used any program for 25 years.
Corrin: The practicality is often that priorities shift over time in districts and, you know, their new administration comes in and they say just — here’s the new stuff we want to do.
Program churn is kind of a defining characteristic of American education. And that churn has not been favorable to Success for All. We identified more than 150 schools that had adopted Success for All at some point. But then dropped it. We wanted to know why.
Olivia Chilkoti: I started with emails.
Our research fellow Olivia Chilkoti reached out to those schools.
Chilkoti: Almost nobody got back to me. So, I just started cold calling.
She made close to 100 phone calls. Eventually, she got some interviews. Here’s what she learned about why schools dropped Success for All.
Chilkoti: So a lot of the time, it came down to administrative turnover. It was common for Success for All to be shepherded in with a new superintendent. But when the superintendent left, Success for All was out, too. There was one district where people who didn’t like Success for All used the change in leadership to lobby for something else.
Hanford: That’s something we heard in the last episode — there tends to be resistance to Success for All. Some people just don’t like it.
Chilkoti: For sure, that is one dynamic. I talked to Ryan Mariouw. When he got a teaching job at a charter school in Detroit, the school had recently started using Success for All.
Ryan Mariouw: You know, I was enthusiastic about the program. I don't know if it was necessarily as welcomed by everybody. You know, one of the biggest downfalls that a lot of teachers would talk about was the scripted nature of it. You know they didn't necessarily love having to be on a certain page on a certain day at a certain time. They almost felt like it was robotic.
Chilkoti: But it didn’t feel robotic to him. He says Success for All helped him become a better teacher.
Hanford: So why did the school stop using it? Was it resistance from teachers?
Chilkoti: No, actually. In this case it came down to money. The school had gotten a grant to adopt Success for All and when the grant money ran out, they dropped it.
Hanford: So cost is a factor here. Leadership change is a factor. What else did you learn about why schools stop using Success for All?
Chilkoti: What emerged during my phone calls was a portrait of how complicated and delicate implementing a new program can be. I talked to Jennifer Hansen. She’s the English Language Arts Specialist for Geary County Schools in Kansas. She says Success for All worked better for some schools than it did for others.
Jennifer Hansen: They weren’t always seeing the same results.
Hanford: What was going on?
Chilkoti: So, this district includes a military base, and the teaching staff turns over a lot. Jennifer Hansen told me they get about 100 new teachers a year.
Hanford: Wow, that’s a lot.
Chilkoti: Yeah, that’s like 15% of their teachers. She says there was inconsistency in how different schools and different teachers were using Success for All. Eventually, a new superintendent came in and decided it was time for a new program. And they looked to EdReports to decide what that should be.
Hansen: On EdReports, they had to be all green. If there was an area that they were not green in, we didn't even look at them or have them come and talk to us.
Hanford: Another reminder of how influential EdReports has become.
Chilkoti: Yeah, and something else that came up was how Success for All groups kids for reading instruction. Remember: Kids get grouped by ability instead of grade level. Several people I talked to said they had a tough time making that work. They said kids who were behind weren’t catching up. And the schools ultimately gave up on Success for All because they couldn’t get enough kids up to grade level.
Hanford: I talked to the folks in Steubenville about that. They said that was a challenge for them at first, too — that it took a couple of years for them to really figure out how to group kids and monitor them and get the tutoring right. But now, it’s a rare exception when a child is still behind by the end of third grade.
Chilkoti: It does sound like Success for All is something you have to stick with for a while to see the full results. It may be that some schools are giving up too quickly. And often it’s because a new leader comes in who wants to take things in a new direction.
Hanford: I was struck by something else you learned in your calls — you told me not all schools that adopted Success for All did the whole program.
Chilkoti: That’s right. Some schools were using it just as a reading curriculum. They weren’t doing all the elements — like the tutoring and the attendance. In some cases, it was because they didn’t have the staff to do all that. Those schools didn’t see great results with Success for All. And maybe that’s not surprising since they weren’t doing the whole program.
Hanford: Did you come across any schools that had dropped Success for All because it wasn’t on a new state list?
Chilkoti: Yes. Franklin County Public Schools in Virginia had been using Success for All in several of its schools — some schools had been using it for 17 years. But then Virginia passed a science of reading law and created a list of approved programs.
Brenda Muse: SFA was not on the list of approvals from the state.
Chilkoti: Brenda Muse is the director of curriculum and instruction.
Muse: We all were really a little bit shocked that they didn't make the cut.
Chilkoti: She said the district could have tried to get a waiver to keep using Success for All. But district leaders decided it made more sense to adopt a state-approved program for the entire school district.
Hanford: There are other schools that we know of that have dropped the program because of a state list. The Success for All organization sent us the names of 42 schools in seven states that — according to their records — had recently dropped the program because it wasn’t on their state’s list of approved programs.
Chilkoti: I think there’s a lot of confusion about the research base for Success for All. Some people I spoke with said they dropped Success for All in favor of a program that was backed by evidence. One person said explicitly that they dropped Success for All because it wasn’t backed by evidence — which was probably the most puzzling thing I heard in all my calls. There seems to be a perception out there that Success for All is not the science of reading. Maybe because the science of reading is new to so many people. And Success for All has been around for a long time. It seems like it’s not current or something.
Hanford: Really interesting insights! Thanks for making all of those phone calls, Olivia.
Chilkoti: No problem.
(Music)
So now I’m going to turn back to Ohio. As we were wrapping up the reporting for this episode, the state updated its list. Just over a month ago — a year after the initial list was published — the Ohio Department of Education added Success for All. And some other programs too. I emailed the education official you heard earlier to find out what happened. She said programs that failed in the first round were allowed to reapply last fall. This time, the state didn’t rely on EdReports. They did their own review of Success for All.
(Music ends)
And the program was approved.
Young: As soon as I got the news, I sent it out to all of the principals.
This is Melinda Young, the Steubenville superintendent.
Young: It was on a Friday evening, and it was crazy ‘cause they all responded back within, I would say, five minutes. It was like, relief. (Laughs) Yes. Relief.
Ohio’s list was updated in time to save Success for All in Steubenville. But we know of two charter schools in Ohio that had already been told by their parent organizations to drop Success for All because it wasn’t state approved. And as hundreds of Ohio school districts were looking for new programs over the past year, not a single one reached out to the Success for All organization about adopting their program.
Peak: The decisions schools and districts are making now will affect how reading is taught for the next 5, 10 years — maybe more.
This is my co-reporter Christopher Peak.
Peak: And a lot of money is being spent. Ohio gave out more than $50 million to help districts pay for new reading programs. And most of that money is going to programs that got good ratings from EdReports.
Peak: So, could you just start off by introducing yourself?
Hirsch: Sure. I'm Eric Hirsch and I'm the Chief Executive Officer of EdReports.
Peak: Eric Hirsch started off our interview by talking about the history of the organization.
Hirsch: It’ll be our decade anniversary in March, and it’s been fairly amazing.
Peak: But he hesitated a bit when I asked him about the influence his organization is having right now.
Peak: I’ve seen EdReports come up a lot in state regulations or state laws about, you know, you should be looking to EdReports to figure out — is this a good program or not? And I was wondering what you make of that. Is that a good thing to have EdReports in state regulations? How do you feel about that personally?
Hirsch: We say EdReports is a place to start.
Peak: He repeated this several times in our interview.
Hirsch: EdReports is a place to start.
Hirsch: EdReports is a place to start.
Hirsch: We believe curriculum is a place to start, right? And EdReports is a place to start.
Peak: He told me EdReports shouldn’t be the final say on what the best reading programs are.
Hirsch: EdReports provides information from the lens of our educator reviewers, and we believe it's helpful to districts and states in understanding what's in the materials. Before EdReports, there was not a lot out there, not much to go on.
Hanford: But the thing is: A lot of states and school districts have been treating EdReports as more than a starting point. They’ve been treating it as a gatekeeper — a place that can tell them which programs are compatible with the science of reading and which ones aren’t.
Peak: And EdReports has been telling teachers its reviews were based on that science. I found a blog post they published in 2023 that said, “EdReports has always reviewed instructional materials for the science of reading.”
Hanford: But then critics started pointing to curricula with the cueing strategies that were getting good reviews from EdReports. And curricula that were not getting good reviews but had evidence showing they were effective.
Peak: Right. And recently, EdReports has made some changes. They now include a “science of reading” summary with their reviews that highlights how well programs teach foundational skills. And just a few months ago they changed their review tool. Programs that teach the cueing strategies will now automatically fail.
Hanford: So is EdReports going to go back and re-review all the reading programs they have already rated?
Peak: No. They’ve already released ratings for 86 reading programs, and they are not going to go back and do those reviews again.
(Music)
Hanford: So let’s talk about what this all means. Like, what to make of it all?
Peak: Well, it’s clearly a problem when a program that has lots of evidence behind it — a program like Success for All — when a program like that is having such a hard time getting on state lists.
Hanford: Yeah. States could be doing the opposite — they could be saying schools should only use programs that have research evidence.
Peak: But that could be a problem too. Rigorous studies are expensive and complex and take a long time. Lots of programs never get studied.
Hanford: Right. They might be good programs — programs that include effective practices. But they haven’t actually been tested. One of the reasons Success for All has so many studies is that Bob Slavin and Nancy Madden were researchers before they created a program.
(Music ends)
Hanford: I asked the education research guy we heard earlier if he thought schools should use only programs that have been rigorously studied and proven to work. And William Corrin said — no.
Corrin: We can't hamstring ourselves by saying you can only do these things that reach, you know, this very high standard.
Hanford: He says there wouldn’t be enough programs, not enough choices. He thinks choice is important. That there isn’t a program that will work well everywhere.
Peak: But there is a real risk here that schools and districts are committing time and money to programs that aren’t effective. That’s the potential downside when programs that haven’t been proven get popular.
Hanford: Right.
(Music)
Hanford: I’m thinking about why we made this podcast in the first place. We made this podcast because there is something else to consider here. And that is — what’s the idea about how reading works that a program is based on? If you recall, scientists behind the government’s Reading First initiative were trying to get rid of the cueing idea. They were trying to get rid of that disproven theory that beginning readers don’t need to sound out written words. But then Reading First got caught up in arguments about programs and the whole thing fell apart.
Peak: What we wanted to do was focus attention on the idea again. To show people that there was an idea about reading that wasn’t right that was still in popular curriculum materials.
Hanford: Yeah, lots of people — lots of teachers — didn’t know there was anything wrong with that idea. And I think that’s because many of them didn’t actually know how kids learn to read. I thought teachers needed to know that. I thought they needed to understand what cognitive scientists had figured out about how reading works — that this was one of the missing links here.
(Music ends)
Hanford: But Steubenville challenged my thinking about that in an interesting way. When I was there, some of the teachers were taking a new state-mandated science of reading course. And they told me they were learning a lot! Some of them hadn’t really known how kids learn to read — they didn’t know the science behind it. But they didn’t need to know that to teach reading well. They were given an effective program. And they did it. And it worked.
Peak: But as Steubenville clearly shows, it is more complex than just handing teachers a program. They got all that training. There were resistant teachers who needed to be convinced.
Hanford: And the community has things going for it that many others don’t. The consistent leadership, the low teacher turnover. All the people working there who grew up there. It’s clear that improving reading achievement is about more than just a program.
Peak: Right. It’s not like the answer here is that every school should be doing Success for All.
Hanford: Even Nancy Madden says that. And she was one of the people who created the program.
Madden: We don’t want Success for All to be the thing that everybody uses.
What she wants is for states and schools to consider evidence.
(Music)
And she’s worried that all the talk these days about the science of reading won’t actually result in better outcomes for kids. That we’ll look back in a few years and say — that didn’t work. And after everyone is done blaming each other, we’ll be left with the same narrative that took hold after that big report by James Coleman back in the 1960s — the report that seemed to indicate schools don’t matter that much.
Madden: We have to maintain the expectation that kids really can succeed.
And the expectation that schools can make a difference.
Madden: We have to remember that kids can learn, we can do better. There’s a way to do it. You could be Steubenville.
(Music ends)
Before we go, I want to say one more thing. A main theme of this podcast is — research matters. And the body of research known as the “science of reading” — a lot of that research was funded by federal grants. I’m recording this in early March of 2025. The Trump administration recently announced it is terminating hundreds of millions of dollars in federal contracts related to education research — including research on reading. If you have information you’d like to share with us about that or anything else, we want to hear from you. You can call us, send us a voice memo, or write us an email. Our address is soldastory@apmreports.org. The number is (612) 888-7323. That’s (612) 888-READ. All those ways to reach us are in the show notes. Let us know if we need to keep your name or other identifying details confidential.
(Music)
This podcast is not over. We’re going to keep following events. And looking for more stories about schools and districts that are succeeding. You can sign up for our newsletter so you’ll be notified when we have new episodes. You can do that on our website, sold a story dot org.
You can also find a story there by my co-reporter Christopher Peak about EdReports. He has more on the history of that organization, how they became so influential, and how they’re responding to the science of reading. It’s a great read.
This episode of Sold a Story was produced by me with reporting from Christopher Peak, Olivia Chilkoti, Kate Martin and Carmela Guaglianone. Our editor is Curtis Gilbert. Our digital editor is Andy Kruse. Fact checking by Betsy Towner-Levine. Mixing, sound design and original music by Chris Julin. Final mastering by Derek Ramirez. Our theme music was written by Jim Brunberg and Ben Landsverk of Wonderly. Special thanks to Margaret Goldberg.
Tom Scheck is the deputy managing editor of APM Reports. And our executive editor is Jane Helmke.
Leadership support for Sold a Story comes from Hollyhock Foundation and Oak Foundation.
Support also comes from Ibis Group, Esther A. & Joseph Klingenstein Fund, Kenneth Rainin Foundation, and the listeners of American Public Media.