5 Tools Teachers Can Use to Find Research-Backed Edtech That Works

This article was written by Steven Yonder on October 26, 2023 and published by Edutopia. 

When Jake Miller was looking for a car or refrigerator, he would borrow his father’s Consumer Reports to compare what was out there. But Miller, who spent 14 years as a science and math teacher, noticed there was nothing like it that teachers could use in their work lives—a resource that would let them evaluate the many edtech products that claim to help students learn.

Few edtech tools have been subjected to well-designed studies of whether they work. One study suggests that about 60 percent of edtech doesn’t meet the goals for which it was purchased. Other surveys routinely find that most edtech software licenses never get used. “Programs claim that they’re the best thing in the world. Most of them aren’t,” says Clarence Ames, of the Utah-based STEM Action Center, who runs a project that evaluates the effectiveness of math education software.

Someone searching for a list of the best products in a particular category—say “best edtech products for Algebra I”—will pull up dozens of sites of dubious value that tout particular companies, provide generic lists of “best edtech products,” share the product preferences of a few educators, or don’t mention Algebra I at all.

It’s near-impossible for teachers putting on a five-hour show each day, plus grading and planning, to find time to sort through the noise. But several free websites are making an attempt. There’s not yet a Consumer Reports of edtech, but here are five tools that offer the closest approximation: impartial guidance on which products might actually be useful in the classroom.

  1. EdTech Evidence Exchange 

Launched in 2018 and run by the nonprofit InnovateEDU, the EdTech Evidence Exchange lets teachers share their experiences implementing math products to help each other make better decisions. 

Teachers create a free account and complete two 20-to-30-minute surveys: one about their school district and a second about the math tools they’ve used. “We’re trying to make this usable in one planning period,” says InnovateEDU Executive Director Erin Mote. 

After finishing the second survey, users get instant access to an “implementation report” with information about school districts similar to theirs, which math edtech products those districts used, and how well they worked. That product advice is specific and even actionable. Teachers from one district wrote that a particular product “seemed overwhelming because there are a lot of moving parts, but it’s really easy to make your own materials. Also, use the ‘clone’ tool to save yourself some time instead of rewriting all your problems every time.”

Equally useful: teachers can look up a specific tool and see the experiences of all educators who have used it, not just those from similar districts.

Anthony Kingston is chief technology officer for Alabaster City Schools in Alabama and responsible for the edtech in his district. But even he learned some things by having teachers in his district complete the Evidence Exchange surveys: They were using some tools that “we didn’t have a clue about,” says Kingston, and their surveys let him know of others that teachers wanted to use but had never asked for.

So far, about 2,000 teachers nationwide have completed the surveys, Mote says.

  2. Evidence for ESSA

The 2015 federal Every Student Succeeds Act, or ESSA, encourages the use of programs and strategies with solid evidence of effectiveness. Evidence for ESSA, a free database developed in 2017 at Johns Hopkins University, aims to help educators find tools that meet those evidence standards.

Teachers can view edtech tools in math and reading and can use the sidebar to narrow what they’re looking for: tools with strong, moderate, or promising evidence; those targeting specific grade levels; those appropriate for rural, suburban, or urban districts; and so forth. To pull up edtech tools—as opposed to, say, textbooks or tutoring programs—make sure to check the box for “technology” in the sidebar.

Tools with a “strong” ESSA evidence rating have been the subject of at least one well-conducted randomized study showing significant positive student outcomes. A “moderate” rating requires at least one well-conducted quasi-experimental study with such outcomes, and a “promising” rating requires at least one well-conducted correlational study.

A few minutes on the Evidence for ESSA database indicates how little solid research is out there on the 11,000 or so edtech products being sold to schools. On the reading side, only 103 tools get any rating at all, 68 of which get a “strong” evidence rating. In math, only 36 get ratings, and just 19 of these get the “strong” rating. And there are not yet any ratings for science and writing products or for products aiming to improve attendance or support social and emotional learning.

  3. EdSurge Product Index

This free database, a joint product of the digital news and research publication EdSurge and the International Society for Technology in Education, is intended as a first stop for teachers looking for edtech.

The index is the teacher-facing side of the Learning Technology Directory, an ISTE database where edtech companies list their products. The index includes product profiles and validations from third-party education and technology organizations, with the intent of offering educators reliable and up-to-date information on what’s being sold.

For example, an initial search for tools to support students in Algebra I brings up 119 products. Users can employ the sidebar filters to narrow that by parameters like grade level, tech specifications, privacy standards, and pricing structure. Applying all of those filters whittles the Algebra I products down to just one, and putting “Algebra I” in quotes cuts the original 119 results to just three, eliminating products that aren’t specific to math.

Users also can filter for products that have gotten various third-party seals of approval (called “badges”). One badge that’s important to whether a product might be effective is the “Digital Promise Research-Based Design Product Certification.” It indicates that the product was created with an understanding of and commitment to “rigorous research on how people learn” and that the creator shared the research that informed the product’s design and development. (That’s a less rigorous standard than the Evidence for ESSA database’s ratings, which are based on formal research studies of student outcomes.)

  4. Triple E Framework

This free assessment platform was developed in 2011 by Elizabeth Keren-Kolb at the University of Michigan’s School of Education. It’s based on research-grounded guidelines on effective practices for using edtech tools, starting from the assumption that “effective technology integration begins with good instructional strategies and not fancy tools,” as the Triple E Framework website puts it. Those strategies are “Engage in learning goals,” “Enhance learning goals,” and “Extend learning goals” (the Triple E in the title). 

Unlike the EdSurge index, the Triple E Framework is not a search engine—teachers can’t use it to narrow down 100 edtech products to 10. Instead, users must have at least a demo copy of the product. “You do need to actually play (or sandbox) with the specific tool to complete the evaluation,” Keren-Kolb tells Edutopia. “You cannot just guess based on the product description.”

Users complete a 15-question evaluation of the product, covering whether the product has been the subject of reliable studies of effectiveness (a link to the Evidence for ESSA database is embedded in this question so that users can look up the answer), whether it meets accepted privacy standards, how well it keeps students’ minds focused on the learning task, and more.

On the basis of those responses, the tool gives the product a green-, yellow-, or red-light score, which corresponds to the level of support that a teacher will have to provide to make the product an effective learning tool. Green means it will require less support because there’s a strong connection between the intended learning goals and the tool; yellow means the teacher will have to offer more instructional support; and red means the tool might not be the best choice because it requires lots of instructional support to work.

“It doesn’t say do or don’t use a particular tool,” says Keren-Kolb. But if the light is flashing red, “it says you’re going to have to do a lot of work because the product doesn’t have good science of learning behind it.”

Jessica Stage, a school district technology integration specialist in Michigan, says her district started using the framework about 18 months ago when their teachers were getting “bombarded” with offers of free edtech products. Now they employ it to evaluate new edtech that teachers express interest in. It’s helping educators across the district focus not on picking tools but on choosing their learning goals and then matching tools to those.

  5. LearnPlatform

Unlike the other tools described so far, LearnPlatform is a private company whose products are designed to be used by whole school systems. But individual teachers who aren’t in a participating system might find useful a free tool called the LearnPlatform Community Library (an account is needed to access the library).

Much like the EdSurge index, the library lets users search for products in specific categories. A search for “reading” brings up dozens of products, and numerous filters allow narrowing by product type (writing tools versus reading tools, for example), reviews and certifications (“research and evidence” and “privacy,” for example), education level, tech specifications, and more. Perhaps unsurprisingly, selecting options under the “research and evidence” filter significantly narrows that list.

Clicking “details” under an individual product brings up its description and, when available, the grade it received from other educators who have used it—on factors like quality of features, impact on teaching effectiveness, impact on student learning, and others. LearnPlatform senior vice president Karl Rectanus described the community library as a “research-based Tripadvisor for all your edtech.” (A tour of the library is available without signing up.)

SEARCHING FOR THE BEST

Before jumping into any of these tools, make sure to narrowly define the problem you’re trying to solve, says Jake Miller, who authored the book Educational Duct Tape: An Edtech Integration Mindset. “The first and most important thing to me is, does it meet the goal or solve the problem that you started off to solve? It might do some wonderful things, but if those wonderful things aren’t what you’re trying to do, those are just kind of fluff.”

Once you’ve started looking, you also don’t need to check out the entire universe of products, he says—when you find one that meets your needs, stop and try it yourself.

And choosing the right product is only the first step in good edtech integration, says Tod Johnston, who taught elementary school for 10 years in Oregon and now curates and creates content to support STEM teachers at the edtech company Sphero. Collaboration is important: It’s better if a group of teachers pilots a product together so they can compare notes. 

Maybe most important is testing the product, especially with a small group of students, he says: “Getting the perspective of students about what they find engaging or helpful in learning is something that teachers need to keep in mind always.”