
8 steps to pilot smarter K-12 edtech product tests

What Digital Promise learned from its League of Innovative Schools about evaluating and purchasing edtech tools more effectively.

Mike Nagler has seen his share of edtech product demonstrations as superintendent at Mineola Public Schools on Long Island, New York. And if there’s one thing he’s learned, it’s the value of conducting bona fide pilot tests.

Pilot programs aren’t the same as product demonstrations, or even beta tests, as Nagler and others see it. Conducted properly, pilot programs can help school systems weigh the potential value and impact of new education tools in ways that beta or other tests typically can’t.

A majority of schools across the country, however, lack the resources, know-how, or tools to conduct full-fledged edtech pilots.

That’s why school administrators and technology directors are likely to welcome a new resource called the “Ed-Tech Pilot Framework” released last week by Digital Promise, an independent nonprofit specializing in innovative education practices.


The new framework provides an eight-step game plan, along with case studies, research reports and other resources, aimed at helping education leaders and technology developers run more effective education technology pilot programs – and ultimately support smarter purchasing decisions.

The framework is based on three years of work by Digital Promise, which studied how 14 districts in its League of Innovative Schools went about evaluating and ultimately purchasing 15 edtech products, according to Aubrey Francisco, director of research at Digital Promise.

“We found districts were relying heavily on edtech test projects. But they lacked the help they needed” to find what works at other schools and to test new and improved education applications in their own schools, she said.

That led Digital Promise, in partnership with the University of California-Davis School of Education, to identify districts demonstrating success with edtech pilots – and the common practices that worked for them. Digital Promise then developed a recommended model for piloting product tests – and test-piloted the model in selected school districts.

“We had some ideas on what the best practices might look like, then we conducted pilots ourselves,” she said.


Nagler’s school district was one of the participating school systems, where 200 fifth-graders in 10 classes took part in a pilot test with Mathspace, an online adaptive learning application. The application identifies whether students have taken the correct steps to solve a problem and offers feedback when they haven’t.
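Mathspace’s actual implementation isn’t public, but the step-checking pattern described above is simple to illustrate. The following toy Python sketch, with an invented problem, solution path and matching rule, flags the first step a student enters that diverges from an expected path:

```python
# Toy sketch of step-level answer checking, in the spirit of adaptive tools
# like Mathspace. The problem, solution path and matching logic are invented
# for illustration; this is not the product's actual approach.

EXPECTED_STEPS = ["2x + 4 = 10", "2x = 6", "x = 3"]  # one valid solution path

def check_steps(student_steps):
    """Return feedback on the first step that diverges from the expected path."""
    for i, step in enumerate(student_steps):
        expected = EXPECTED_STEPS[i] if i < len(EXPECTED_STEPS) else None
        if expected is None or step.replace(" ", "") != expected.replace(" ", ""):
            return f"Step {i + 1} looks off: '{step}'. Check your last operation."
    if len(student_steps) < len(EXPECTED_STEPS):
        return "Correct so far, keep going."
    return "Correct! Every step checks out."

print(check_steps(["2x + 4 = 10", "2x = 14"]))  # feedback points to step 2
```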

“I’m a big believer in doing pilot tests,” Nagler said in an interview with EdScoop. He contrasted them with beta tests, where developers are often using schools to perfect their product. “A lot of edtech startups, they’ll say they’re doing a beta, but the product hasn’t been fully vetted. A pilot is more of, ‘this is our product and let’s see how it works for you.’”

Pilots have the added benefit of revealing how willing a provider is to work with a school and tweak its product “based on our feedback,” Nagler said. “That’s important. We’re really about trying to modify content and not just buy the product the way it is,” added Nagler, a 30-year education veteran whose district is one of 86 recognized nationally as members of the League of Innovative Schools.

But Nagler also emphasized the importance of working with an independent third party like Digital Promise, which helped the district develop a benchmark of students’ knowledge, guided it through the three-month pilot test and completed a post-test assessment, which took another two months.

“When people hear ‘pilot,’ it’s often not a concrete concept. You can pilot something, but without a benchmark and post-pilot review, it is kind of useless,” he said. “In the absence of a third party doing that, I’d need to make our own benchmark and objective measures.”


Nagler said he is also leery of pilots that involve only one or two classes, or a cross section of students, rather than every student for whom the pilot is appropriate. Gathering data from all students, rather than a sampling, provides a more complete picture of how a product will perform, he said.

In Mineola’s pilot test of Mathspace, Nagler said his teachers were able to see a very clear correlation between the amount of time kids were using the product and their proficiency improvements.
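That usage-to-outcome relationship is straightforward to quantify once pilot data is in hand. As a minimal sketch, with invented numbers rather than Mineola’s actual data, Pearson’s correlation between time-on-product and benchmark-to-post-test gains can be computed with Python’s standard library (3.10 or later):

```python
# Minimal sketch: correlating product usage time with proficiency gains.
# The figures below are invented; a real pilot would pull usage minutes from
# the product's logs and gains from benchmark and post-test scores.
from statistics import correlation  # Pearson's r; Python 3.10+

minutes_used = [120, 340, 90, 410, 260, 180]    # per-student time on product
score_gains  = [3.0, 9.5, 1.5, 11.0, 7.0, 4.5]  # post-test minus benchmark

r = correlation(minutes_used, score_gains)
print(f"Pearson r = {r:.2f}")  # values near 1.0 indicate a strong positive link
```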

Francisco said the framework’s eight steps are a way to fully define what a pilot should mean and how to conduct one. Those steps, as spelled out on Digital Promise’s website, include:

1. Identify Need – Clearly articulate the specific need or challenge your district is trying to address so you’ll be able to determine whether or not the product meets that need.

2. Discover & Select – Identify and evaluate the various products in the market. Choose a product that matches your defined need and consider other factors such as student privacy features, fit with school IT system, and the skills required to implement it.


3. Planning – Clearly articulate specific pilot goals to ensure a shared vision, and identify data that will be used to determine success. Set agreements with edtech providers and researchers that outline roles and responsibilities, timelines, and how results will be used.

4. Train & Implement – Ensure teachers have district- and/or company-provided training, technology support, and instructional coaching to enable strong implementation of the new tool.

5. Collect Data – Collect quantitative and qualitative data to determine whether the pilot goals are met. Create formal opportunities (e.g., surveys, interviews, focus groups, and team meetings) for teachers and students to give feedback about the tools.

6. Analyze & Decide – Analyze collected data to evaluate whether the edtech tool met the pilot goal(s). Consider both qualitative and quantitative data when deciding whether to purchase, continue piloting, or discontinue using the tool (a minimal decision sketch follows this list).

7. Negotiate & Purchase – Work with the edtech provider to understand and negotiate the total cost of implementing the edtech tool. Consider ongoing costs for licensing, installation, training, and IT support.


8. Summarize & Share – Summarize and share results with pilot participants in order to foster transparency and trust. Consider sharing the results externally to support other schools and districts in their edtech decision-making.
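Steps 5 and 6 are the most mechanical part of the framework, and a short sketch makes them concrete. Every threshold, score and rating below is invented for illustration; in practice a district would set its own success criteria during planning (step 3):

```python
# Hypothetical sketch of steps 5-6: collect pilot data, then decide.
# All records and thresholds are invented for illustration only.
from statistics import mean

# Step 5: quantitative data (benchmark vs. post-test scores) and qualitative
# data (teacher survey ratings on a 1-5 scale) gathered during the pilot.
pre_scores = [62, 55, 70, 48, 66]
post_scores = [71, 63, 74, 60, 72]
teacher_ratings = [4, 5, 3, 4]

# Step 6: compare results against goals defined during planning.
GOAL_MEAN_GAIN = 5.0   # points of improvement the pilot must show
GOAL_MIN_RATING = 3.5  # average teacher satisfaction required

mean_gain = mean(post - pre for post, pre in zip(post_scores, pre_scores))
avg_rating = mean(teacher_ratings)

if mean_gain >= GOAL_MEAN_GAIN and avg_rating >= GOAL_MIN_RATING:
    decision = "negotiate & purchase"  # proceed to step 7
elif mean_gain >= GOAL_MEAN_GAIN / 2:
    decision = "continue piloting"
else:
    decision = "discontinue"

print(f"mean gain = {mean_gain:.1f}, avg rating = {avg_rating:.1f} -> {decision}")
```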

Francisco suggested that the greater the cost and risk involved in trying a new product, the more rigorous a pilot should be before schools move forward with a product purchase.

She said one of the next steps for Digital Promise is to apply the eight-step pilot framework at more schools and to fine-tune the coaching that helps schools work through the process more effectively.


Written by Wyatt Kash

Wyatt Kash is an award-winning editor and journalist who has been following government IT trends for the past decade. He joined Scoop News Group in June 2014 as Vice President of Content Strategy, where he heads up the company's content strategy and editorial product development. Prior to joining SNG, Mr. Kash served as Editor of , where he developed content and community relations for the government technology market, covering big data, cloud computing, cybersecurity, enterprise architecture, mobile technology, open government and leadership trends. Previously, he co-led an AOL startup team, where he helped create, launch, manage and market an online news platform, featuring advanced social media strategies, aimed at government, defense and technology industry executives. Mr. Kash has also held positions with The Washington Post Co. and subsequently 1105 Media, as Editor-in-Chief of and , where he directed editorial strategy and content operations for print, online, and mobile products and industry events. Contact the writer at wyatt.kash@fedscoop.com or on Twitter at @wyattkash.
