Early learning programmes (ELPs) help children learn, grow, and prepare for school. But access alone is not enough – quality matters.
That is why measuring programme quality in a way that is clear, consistent, and trustworthy is so important. It is also central to Sustainable Development Goal (SDG) 4.2, which calls for all children to have access to quality early childhood development, care and pre-primary education so that they are ready for primary education.
In 2020, DataDrive2030 introduced the LPQA (v1) as a practical tool for measuring programme quality. It has been used in studies of pre-Grade R programmes, alongside ELOM 4&5 Years child assessments, as part of quality rating and improvement systems, and to generate reliable programme quality scores for local, provincial, and national samples.
While psychometric testing confirmed that LPQA (v1) was reliable, the tool relied on assessors assigning ratings such as Good, Basic, or Inadequate, which allowed for some subjective interpretation. Ahead of the 2024 Thrive by Five Index, DataDrive2030 refined the scoring approach and retested the tool using a nationally representative sample.
What’s new in LPQA (v2)?
The biggest change is the scoring system.
Instead of relying on a single overall rating for each item, LPQA (v2) breaks each item into clear sub-questions, scores them numerically, and combines them to generate the final rating: Inadequate, Basic, or Good. This makes scoring more consistent, objective, and reliable, while keeping results easy to interpret.
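The banding logic described above can be sketched in a few lines of code. This is purely illustrative: the sub-question scores, the equal weighting, and the cut-points used here are hypothetical placeholders, not the actual LPQA (v2) scoring rules.

```python
def rate_item(sub_scores, basic_cut=0.40, good_cut=0.75):
    """Combine numeric sub-question scores for one item into a rating.

    sub_scores: list of numeric scores (e.g. 0/1 for each sub-question).
    Cut-points are hypothetical, for illustration only.
    """
    if not sub_scores:
        raise ValueError("an item needs at least one sub-question score")
    proportion = sum(sub_scores) / len(sub_scores)
    if proportion >= good_cut:
        return "Good"
    if proportion >= basic_cut:
        return "Basic"
    return "Inadequate"

# Example: an item with four sub-questions, three of them met
print(rate_item([1, 1, 1, 0]))  # → Good
```

Because every assessor applies the same arithmetic to the same sub-questions, two assessors observing the same programme should arrive at the same rating, which is the consistency gain the new scoring system is designed to deliver.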
Using 2024 Thrive by Five Index data, DataDrive2030 applied rigorous psychometric analyses to further refine the tool. The 22 core items remain, now organised into five slightly updated domains informed by both statistical findings and expert input.
The reporting system has also been strengthened. In addition to organisation-level reports, the automated system now generates a dedicated programme- or ELP-level report, giving each ELP a tailored, shareable report with clear guidance and practical, item-specific recommendations for improvement.
LPQA (v2) in brief
- Improved and more objective scoring system
- Refined quality domains
- Rigorous psychometric testing on a nationally representative sample
- Mandatory two-day, in-person training for assessors
- Data captured, cleaned, securely stored, and analysed through DataDrive2030 systems
- Automated reports at both programme and organisation level
Supporting resources
A comprehensive LPQA (v2) Technical Manual is available on the DataDrive2030 website. It covers the tool's theoretical foundations, methodological refinements, and psychometric properties, including internal consistency, construct validity, and a preliminary investigation into criterion validity.
LPQA (v1) remains available as an open-access download, although it does not integrate with the DataDrive2030 data system or include automated reporting.
For accredited LPQA (v1) assessors, DataDrive2030 has also developed a free online bridging course. On completion, assessors are accredited to administer LPQA (v2).