ETIN issues new guidelines for maximizing edtech research

The report, prepared by Empirical Education, offers guidance on how to navigate the fast-moving world of educational research under ESSA.

A new report released by the Educational Technology Industry Network (ETIN), a division of the Software & Information Industry Association (SIIA), explains how educators, edtech developers and researchers can evaluate the potential impact of educational technology in K-12 classrooms.

The report, titled “Guidelines for Conducting and Reporting EdTech Impact Research in U.S. K-12 Schools,” is a revision of a similar 2011 report by SIIA. The 16 guidelines are broken down into four stages of educational research: “Getting Started,” “Designing the Research,” “Implementing the Design” and “Reporting Your Results,” with recommendations for developers and researchers throughout.

“The current technology and policy environment provides an opportunity to transform how research is done,” said Denis Newman, CEO of Empirical Education Inc. and lead author of the guidelines, in an official statement. “Our goal in developing the new guidelines was to clarify current requirements in a way that will help edtech companies provide school districts with the evidence they need to consistently quantify the value of software tools.”

Empirical Education, a member of ETIN, conducted the report, and the guidelines were shaped by three factors: the pace of educational technology development, the growth of cloud-based data collection and the new evidence standards in the Every Student Succeeds Act (ESSA).

“The need for the revision of the 2011 guidelines stems from how quickly the industry is evolving, the technology that is currently available for research and the change in requirements from the Department of Education,” a spokesperson from Empirical Education told EdScoop.

The influx of cutting-edge edtech since 2011 has left educators with myriad choices but little guidance in selecting an effective technology for the classroom. For researchers and developers, it has become crucial to demonstrate, through evidence-based research, that a product will have an impact on students.

“In light of the ESSA evidence standards and the larger movement toward evidence-based reform, publishers and software developers are increasingly being called upon to show evidence that their products make a difference with children,” said guidelines peer reviewer Robert Slavin, director of the Center for Research and Reform in Education at Johns Hopkins University. “The ETIN Guidelines provide practical, sensible guidance to those who are ready to meet these demands.”

ESSA, which was passed in 2015, made strides in clarifying and defining how the success of edtech should be measured, giving researchers a baseline for the field.

Newman and the researchers behind this report, however, stress that ESSA’s language requires further clarification before it can be fully understood. Until the law’s requirements are clear, the report says, the guidelines should be interpreted as just what they are labeled: guiding, not definitive, suggestions for stakeholders.

ETIN’s guidelines “are not written as a definitive set of standards with a level of specificity to which research reports should adhere; however, they are meant to serve more as a comprehensive reference for what should be included in the planning, design, implementation and reporting of research,” the report reads.

The first section of the report provides a basic outline for researchers to reference in the early stages of their product and study design. Titled “Getting Started,” it suggests documenting a working model of the product, methods for securing funding, strategies for identifying a target audience, and considerations for and against building research studies around the product.

The second section of the report, “Designing the Research,” offers guidelines for researchers to design their study in compliance with ESSA while maintaining a reasonable hold on finite resources, like cost and time.

The third section, “Implementing the Design,” cautions stakeholders against mishandling personally identifiable information and against choosing researchers who lack objectivity, while “Reporting Your Results” encourages detailed, understandable final impact reports published where stakeholders can easily access them.
