Test identification

Name of test Progress in Reading Assessment Primary
Version New PiRA/PiRA
Previous version(s) New PiRA Primary (paper or digital); PiRA (paper or digital) Key Stage 1, Key Stage 2 and Key Stage 3; PiRA Scotland
Subjects Literacy
Summary A suite of standardised termly tests to measure and predict progress in reading and compare performance to national averages. Written to the 2014 National Curriculum, PiRA is designed to be used towards the end of each term in each school year of Key Stages 1, 2 and 3 in order to measure and monitor pupils' progress and to provide predictive and diagnostic information.

Assessment screening

Subscales n/a
Authors Colin McCarty, Kate Ruttle
Publisher RS Assessment from Hodder Education
Test source https://www.risingstars-uk.com/series/assessment/rising-stars-pira-tests
Guidelines available? Yes
Norm-referenced scores? Yes
Age range 5-11 Years
Key Stage(s) applicable to KS1, KS2
UK standardisation sample Yes
Publication date 2016
Re-norming date n/a

Eligibility

Validity measures available? Yes
Reliability measures available? Yes
Reason for exclusion from shortlist n/a (shortlisted)

Evaluation and Appraisal

Additional information about what the test measures Reading attainment
Are additional versions available? Termly tests (Reception: Spring, Summer; Years 1-9: Autumn, Spring, Summer). Manual Stage 1: Reception - Year 2. Manual Stage 2: Year 3-6. Manual Stage 3: Year 7-9. Pencil and paper or online interactive (PiRA Interactive). PiRA Scotland is tagged to the Scottish curriculum. The original PiRA (standardised 2009) is now out of print; questions were updated for the new national curriculum. Note that New PiRA will be published 2020/2021.
Can subtests be administered in isolation? No subtests, but Key Stage 1 PiRA questions are categorised into strands according to their focus: phonics, comprehension (literal understanding and retrieval) and becoming a reader (inference and prediction from text, along with appreciation of language, structure and presentation). The Reception and Year 1 tests allow differentiation between scores on phonics/decoding and reading in context. Key Stage 2 PiRA questions are categorised into strands according to their focus: comprehension (literal understanding and retrieval) and inference (prediction from text), along with appreciation of language, structure and presentation.
Administration group size small group, whole class
Administration duration Key Stage 1: 30-40 minutes. Reception and Year 1 can take two short breaks. Key Stage 2: 30-50 minutes.
Description of materials needed to administer test Manual. Paper version: each pupil will require a test booklet, pencil/pen and eraser. Digital version: each pupil will require an interactive test credit and a computer/tablet.
Any special testing conditions? No

Response format

Response mode Electronic or paper and pencil
What device is required Computer, tablet
Question format Mixed
Progress through questions Flat

Assessor requirements

Is any prior knowledge/training/professional accreditation required for administration? Not stated (no)
Is administration scripted? Yes

Scoring

Description of materials needed to score test Paper version: manual (mark scheme); standardisation tables are in the manual, or use the MARK (My Assessment and Reporting Kit) online analysis and reporting tool. Digital version: automatically marked.
Types and range of available scores Raw scores (by question /25, by strand, and overall); age-standardised scores (<70 to >130); standardised score (cohort, not age standardised); percentile (<2 to >98); reading age (varies depending on the test; full range <4;2 to >12;6); performance indicator bands (based on the national curriculum: working towards, working at, working at greater depth); Hodder scale (0-6+, for comparison against other measures, e.g. PUMA or GAPS); national test estimated scaled score on KS1/KS2 test (85-115).
Score transformation for standard score Age standardised
Age bands used for norming Unclear
Scoring procedures Computer scoring with direct entry by test taker; computer scoring with manual entry of responses from paper form; simple manual scoring key (clerical skills required)
Automatised norming computerised/online

Construct Validity

Does it adequately measure literacy, mathematics or science?
Does it reflect the multidimensionality of the subject? Generic literacy
Construct validity comments (and reference for source) Tests have good face validity: they were developed to align with national curriculum guidance, and the 2nd edition was developed to align with the 2014 national curriculum and 2015 test frameworks. However, there is little information about the theory of reading or theoretical constructs, nor are any statistical analyses presented to support this assessment of validity. The Key Stage 1 Manual states that validity was assessed by checking correlations between pupils' test scores and age, and by examining children in Year 2 and Year 6 who took the same Summer PiRA tests in subsequent years; however, no statistics are presented. The Key Stage 2 Manual states that validity was assessed by checking correlations between pupils' test scores and age: 7 cohorts (10,000 pupils) were tracked termly over an academic year, and data from 1,800 children in Year 2 and Year 6 who took the same Summer PiRA tests in subsequent years were compared. However, no statistics are presented.

Criterion Validity

Does test performance adequately correlate with later, current or past performance?
Summarise available comparisons The manuals note the correlation between original PiRA raw scores and either national test scores or teacher-assessed levels supplied by schools during the standardisation process: Pearson coefficients of .32-.83 for Key Stage 1 (see p. 63 of the Stage 1 manual for detail) and .64-.79 for Key Stage 2 (see p. 80 of the Stage 2 manual for detail). This is reported as evidence of reliability but in fact supports criterion validity. Note, however, that it is based on the original PiRA standardisation, not on the 2nd edition restandardisation or equating studies.

Reliability

Is test performance reliable?
Summarise available comparisons Internal consistency was excellent in the initial PiRA standardisation, but is not reported for the second edition restandardisation or equating studies. Note that 20% of questions were changed for the second edition; however, the 2nd edition equating study showed excellent correlations between the original PiRA and the 2nd edition: Pearson coefficients of .94-.99 for Key Stage 1 and .94-.99 for Key Stage 2 (see p. 63 of the Stage 1 manual and p. 80 of the Stage 2 manual for further detail). Key Stage 1: Cronbach's alpha .87-.94; SEM 1.86-2.08; 90% confidence bands +/-2.97-3.32; 95% confidence bands +/-3.72-4.1. Key Stage 2: Cronbach's alpha .89-.94; SEM 2.26-2.73; 90% confidence bands +/-3.62-4.37; 95% confidence bands +/-4.52-5.46. It is unclear whether paper or digital versions were used in standardisation. Although large-scale studies were conducted tracking pupil performance over 4 terms (in order to calculate the Hodder scale), no information is given about the equivalence reliability of the digital/paper versions or of the different forms.

Is the norm-derived population appropriate and free from bias?

Is population appropriate and free from bias? Yes
If any biases are noted in sampling, these will be indicated here. Sample sizes for the norming of the original PiRA and for the restandardisation and equating studies are large. However, it is difficult to evaluate whether there are any biases in sampling because limited information is given about sampling methodology and pupil demographics. Gender differences are noted, with girls consistently outperforming boys on all tests (which the authors note is consistent with national patterns in reading and English tests in general). Mean raw score by gender is provided in the manual.

Sources

Sources McCarty, C., & Ruttle, K. (2016). Progress in Reading: Manual Stage 1 (Reception, Year 1, Year 2) (2nd ed.). London, UK: RS Assessment from Hodder Education. McCarty, C., & Ruttle, K. (2016). Progress in Reading Assessment: Manual Stage 2 (Year 3 to Year 6) (2nd ed.). London, UK: RS Assessment from Hodder Education.