Impact Research and Optimization

In partnership with institutions, instructors, and students around the country, we conduct carefully designed studies to explore how students, instructors, and institutions use a product and the differences in outcomes they achieve as a result. These findings continuously refine the product design and guide the support and training that help students and instructors achieve their best outcomes. We publish our methodology and provide detailed research briefs for all our studies to ensure full transparency.


How we do impact research

We partner with instructors and institutions, providing expert consulting to help them rigorously research outcomes, contexts, and product use cases.

Once studies are approved by an Institutional Review Board, we help instructors establish essential baseline data for their course and student population, such as previous academic performance, demographics, perceptions, motivation, and confidence.

We meticulously document each classroom environment, course structure, and use of the product through classroom observations, interviews, focus groups, surveys, and product data.

We analyze relationships between these findings and student outcomes (including course and exam grades, retention, and completion) and provide instructors with insights, including suggestions for how to further improve results.

We perform meta-analyses over multiple courses, instructors, and institutions to identify trends and best practices for how to improve outcomes.

We use a variety of established and cutting-edge methodologies, including cluster analysis to identify common characteristics, rapid-cycle evaluations to explore use cases and outcomes for continuously evolving digital solutions, and efficacy studies to control for different contributors to and drivers of outcomes.

We also monitor instructor and course outcomes nationally to identify where we can provide additional or better support and training.

The Impact Research Advisory Council

A panel of leading experts critique our methods, results, and claims

Measuring impact on learning provides essential information for institutions, instructors, and students when making decisions about which digital learning solutions to adopt and how to use them to achieve the best outcomes. However, impact research is notoriously challenging. At Macmillan Learning, we don't have all the answers, but our research endeavors to embrace the diversity of students, instructors, and their courses, the local educational environment, and continuously evolving digital solutions. We therefore use a portfolio of coordinated research that includes implementation science, rapid-cycle evaluations, and summative impact research.

To challenge our research methods, practices, results, and claims, we have formed the Impact Research Advisory Council. This Council is made up of a diverse panel of external experts in designing and measuring the impact of educational technology, methods for measuring effectiveness, modeling and evaluating learning performance, standards for measurement in education, and respecting current and evolving data privacy standards and laws.

Learn More

Dr. Christopher Dede

Timothy E. Wirth Professor in Learning Technologies, Technology, Innovation, and Education Program at the Harvard Graduate School of Education.

Dr. Dede’s research focuses on developing new types of educational systems to meet the opportunities and challenges of the 21st century. His work spans emerging technologies for learning, infusing technology into large-scale educational improvement initiatives, developing policies that support educational transformation, and providing leadership in educational innovation. He has conducted funded studies to develop and assess learning environments based on virtual worlds, augmented realities, transformed social interaction, and online teacher professional development. He is a leader in mobile learning initiatives and has developed a widely used framework for scaling up educational innovations.

Dr. Dede sits on our Learning and Impact Research Advisory Councils to ensure continuity.

Michael Feldstein

Partner at MindWires Consulting, Co-Publisher of e-Literate, and Co-Producer of e-Literate TV.

Feldstein is a prominent figure in the educational technology space who regularly provides strategic planning and program management consulting for universities, publishers, educational technology companies, and financial services companies interested in educational technology. His research focuses on the development and provision of e-learning and knowledge management products and services, with a special emphasis on software simulations. Feldstein is a frequent invited speaker on a range of e-learning topics, including usability, the future of the LMS, ePortfolios, and edupatents, for organizations ranging from the eLearning Guild to the Postsecondary Electronic Standards Council. The Chronicle of Higher Education, the Associated Press, and U.S. News & World Report often interview Feldstein as an e-learning expert.

Dr. Sara Finney

Professor, Department of Graduate Psychology and Associate Director in the Center for Assessment and Research Studies.

Dr. Finney’s research focuses on issues and techniques broadly related to measurement and statistics in psychology and education. Her scholarship addresses less-than-ideal conditions for research, the quality of measures, the measurement of academic entitlement, and students’ test-taking motivation. As part of her work at the Center for Assessment and Research Studies, Finney designs and leads impact research around campus initiatives. Focused on actionable insights, this program of research gathers a body of evidence of effectiveness and impact using novel approaches to research and evaluation.

Dr. Suzanne Lane

Chair, Department of Research Methodology, University of Pittsburgh.

Dr. Lane is a recognized measurement methodologist who has published extensively on technical and validity issues associated with educational measurement. Her work has appeared in journals such as the Journal of Educational Measurement, Applied Measurement in Education, and Educational Measurement: Issues and Practice. She has served as President of NCME, Vice President of Division D of AERA, a member of the AERA, APA, and NCME Joint Committee for the Revision of the Standards for Educational and Psychological Testing, and a member of the Management Committee for the next revision of the Standards. She has also served on the editorial boards of the Journal of Educational Measurement, Applied Measurement in Education, Educational Assessment, Educational Researcher, and Educational Measurement: Issues and Practice.

Dr. Thanos Patelis

Research Scholar at Fordham University and Principal Scientist at the Human Resources Research Organization.

Dr. Patelis is a noted expert in applied research and evaluation. His research focuses on developing growth models to measure learning progressions, constructing valid metacognitive measurements, multivariate statistical analysis, program evaluation, structural equation modeling, item and assessment validity, and applied psychometrics. Dr. Patelis has been a reviewer for Educational and Psychological Measurement, Educational Measurement: Issues and Practice, the International Journal of Testing, the Mental Measurements Yearbook, and Psychological Methods, and has served on the editorial board of Applied Measurement in Education. He has published and presented over 50 papers and has served on an APA Task Force for the Recruitment of Quantitative Psychology Professionals.

Dr. Elana Zeide

Visiting Fellow, Information Society Project, Yale Law School; Associate Research Scholar, Center for Information Technology Policy, Princeton University.

Dr. Zeide is an attorney, scholar, and consultant focusing on student privacy, predictive analytics, and the proverbial permanent record in the age of big data. Zeide examines the law, policies, and cultural norms emerging as education and human evaluation become increasingly data-driven. This includes exploring how innovation alters the assumptions underlying traditional and new approaches to data protection, and creating cross-disciplinary conversations to better align privacy conceptualization and regulation with today's technology.

Educational Results

Science + Empathy + Data Insights

The result of this empathetic and scientific approach is learning solutions that are practical, deeply intuitive, and highly impactful, helping students, instructors, and institutions achieve their very best outcomes.

See Our Latest Educational Results