Research Evidence Scorecard | LXD Research

You have a study.
Will it actually pass?

Michigan is scoring your evidence. Arizona is vetting your research by category. Mississippi just expanded requirements to middle school. Find out exactly where your studies stand before you submit.

Get Your Evidence Scorecard
Up to 3 studies per product · Expert review + written scorecard + 60-minute strategy call · $750

States aren’t just asking whether you have research anymore. They’re grading it.

You invested in a study — maybe you ran it internally, maybe you hired a university partner or a research vendor. You have a report with positive results, and your marketing team is ready to put it on the website.

But here’s what most companies don’t realize until it’s too late: states are no longer treating evidence as a checkbox. They’re building scored frameworks where your research quality directly determines your competitive position. A weak study doesn’t just fail to help — it actively costs you points against competitors who invested in stronger designs.

“We just lost this RFP because our evidence didn’t align with the criteria.” That quote came from a real sales call. Having evidence isn’t the same as having evidence that passes review.

The gap between “we have a study” and “our study meets the standard” is where deals are lost. A small design flaw, a weak comparison group, an inappropriate measure, or an overclaimed finding can mean the difference between approval and rejection — and most companies don’t find out which one they have until a state reviewer tells them.

Three states. Three different systems. All of them are evaluating your research.

These aren’t hypothetical requirements. They’re active right now, with real submission deadlines and real consequences for your market access.

Michigan: Points-Based Rankings

Michigan’s landmark literacy legislation, signed in October 2024 and backed by $87 million in funding, introduced a rankings-based model for literacy materials. Programs earn points based on ESSA tier level — Tier 1 earns the most, but even Tier 4 earns credit; no evidence earns nothing. Evidence is weighted alongside instructional quality and Science of Reading (SoR) alignment, which means your research score directly affects your product placement.

Districts will apply for funding based on product placement on the rankings list. Submissions continue through September 2026, creating a rolling opportunity to improve your standing — but also a rolling risk if competitors are earning more points than you.

Arizona: Vetted Program Lists

Arizona’s Move On When Reading law separates literacy products into three distinct categories — Core, Supplemental, and Intervention — and each category must independently meet ESSA Tier 1, 2, or 3 evidence requirements. A single study cannot qualify a product across multiple categories. Evidence must match the intended instructional use.

Vendors submit research documentation during a defined window (typically July 1–November 1). Programs that aren’t approved don’t appear on the vetted lists, and districts face compliance risks if they select an unlisted program. Arizona also applies stricter-than-federal standards: even at Tier 3, the state requires a control group and a large sample size.

Mississippi: 8th-Grade Reading Gate

Senate Bill 2487 extends Mississippi’s pioneering Science of Reading requirements from early elementary into grades 4–8, with an eighth-grade reading gate beginning in the 2027–2028 school year. Students who cannot read at grade level will not be promoted to high school.

The law mandates evidence-based instruction across six components, explicitly bans balanced literacy and three-cueing approaches, and requires all content-area teachers to complete training in evidence-based reading instruction. This opens a new market for intervention and supplemental programs — but only those that can demonstrate evidence alignment.

A written scorecard with state-specific verdicts — plus a strategy call to act on it.

We evaluate up to three studies for a single product — each scored individually against the exact criteria these states and national bodies use. You receive a written scorecard you can share with your team, and a 60-minute call to walk through every finding and plan your next move.

Evidence Quality Scorecard (sample, illustrative only)

Criteria evaluated:
Study design & comparison group
Sample size & statistical power
Outcome measures & assessments
Analysis approach & reporting
Claims aligned to findings
Implementation fidelity documentation

Sample verdicts:
Arizona MOWR · Supplemental: likely approved · Core: not yet
Michigan Rankings · Est. ESSA Tier 3 · Moderate points
JHU / Digital Promise · Tier 3 ready · Tier 2 with revisions

Expert review, written scorecard, and a strategy call — one fixed-price package.

Submit up to three studies for one product. Each study is evaluated individually against state and national evidence standards. You get a written scorecard you can share with your leadership, plus a 60-minute call to discuss findings, implications, and next steps. Need a review for additional products? Each product requires a separate session.

01

Written Evidence Scorecard

A structured evaluation of each study across the criteria that states and certification bodies use — study design, sample, measures, analysis, and claims. Clear ratings with explanations.

02

State-Specific Verdicts

How your studies would fare under Arizona’s MOWR vetting process (by category), Michigan’s points-based rankings, and national certifications like Johns Hopkins Evidence for ESSA and Digital Promise.

03

60-Minute Strategy Call

Walk through every finding with Dr. Schechter. Understand which weaknesses are fixable without a new study, which need additional data, and which are fine as-is despite looking imperfect.

04

Prioritized Fix List

A ranked list of what to address first — from quick report revisions that improve how your research reads to methodological gaps that require follow-up. Mapped to the specific approvals you’re targeting.

You keep the scorecard. It’s a document you can share with your leadership, your board, your research vendor, or your product team. Many clients use it to hold vendors accountable for research quality or to scope their next study investment.

Companies come to us at different moments. All of them wish they’d come sooner.

Scenario One

“We hired a vendor to run our study. The results look good, but we’re not sure about the methodology.”

Not all research vendors produce work that meets state review standards. We’ll tell you whether your vendor’s design, measures, and analysis will hold up — and what to ask them to fix before you publish.

Scenario Two

“We’re preparing for Arizona’s submission window and need to know if our evidence qualifies.”

Arizona’s vetting is category-specific and stricter than federal ESSA standards. We’ll evaluate each of your studies and tell you which categories it supports, whether it meets ADE’s requirements, and what you’d need to qualify for additional categories.

Scenario Three

“Michigan just ranked us lower than our competitor. We need to understand why.”

Michigan’s framework scores evidence alongside instructional quality. We’ll diagnose exactly where you’re losing points and what investments would move you up the rankings before the next evaluation cycle.

Scenario Four

“We have old research and we’re not sure if it still meets current standards.”

Evidence standards have shifted significantly since 2020. Studies that would have passed five years ago may not meet today’s requirements. We’ll tell you whether your existing portfolio still works — or whether it’s time to invest in new evidence.

Scenario Five

“Someone did their dissertation on our product. Can we actually use it?”

Dissertations can be valuable evidence — but they vary wildly in design quality, sample size, and whether the measures align with what reviewers look for. We’ll tell you if it’s usable as-is, whether it could support a specific ESSA tier, and what (if anything) would need to change before you cite it in a state submission.

The researcher whose firm has navigated more ESSA approvals than any other independent research organization.

Dr. Rachel Schechter

Dr. Rachel Schechter founded LXD Research after serving as Director of Research at Lexia Learning, where she built a 12-member research department and a portfolio of What Works Clearinghouse-aligned studies. Since 2020, LXD Research has produced more studies approved by Johns Hopkins’ Evidence for ESSA than any other independent research organization, including multiple Strong (Level 1) and Moderate (Level 2) ratings.

She has supported state board approval processes in Arizona, Utah, and Arkansas, earned certifications from Digital Promise across dozens of products, and designed studies accepted by ERIC, the International Dyslexia Association, and state education agencies in seven states. When she reviews your research, she’s applying the same standards that gatekeepers use — because she’s been on both sides of the review process.

7+
State education agency approvals
15+
ESSA-approved studies since 2020
40+
Digital Promise ESSA certifications earned for clients
10K+
Views across 100+ research reports on ResearchGate

Find out before the reviewers do.

1. Submit your studies through the intake form
2. Receive your invoice and pay online
3. Schedule your session — times start the following week

Know exactly where your research stands, what needs to change, and how to get to approval — all in one fixed-price session with no proposal process and no contract negotiation.

Get Your Evidence Scorecard · $750
60-minute session with written scorecard · Sessions begin within one week of payment