Test identification

Name of test Progress in Understanding Mathematics Assessment (Primary)
Version PUMA primary
Previous version(s) PUMA Years 1-9, each with Autumn, Spring and Summer tests
Subjects Maths
Summary A suite of standardised, paper-based termly tests to measure and predict student progress in maths and compare their performance against national averages.

Assessment screening

Subscales Strands rather than subtests: Number, place value and rounding; Addition, subtraction, multiplication and division; Fractions, decimals and percentages; Measures; Geometry (shapes, position, direction, motion); Statistics and data handling
Authors McCarty, Colin; Cooke, Carolin
Publisher RS Assessment from Hodder Education
Test source https://www.risingstars-uk.com/subjects/assessment/rising-stars-puma
Guidelines available? Yes
Norm-referenced scores? Yes
Age range 5-11 years
Key Stage(s) applicable to KS1, KS2
UK standardisation sample Yes
Publication date 2016
Re-norming date n/a

Eligibility

Validity measures available? Yes
Reliability measures available? Yes
Reason for exclusion from shortlist n/a (shortlisted)

Evaluation and Appraisal

Additional information about what the test measures Mathematics attainment, plus a profile of mathematics skills
Are additional versions available? Termly tests (Reception: Summer; Years 1-6: Autumn, Spring, Summer). Manual Stage 1: Reception - Year 2. Manual Stage 2: Years 3-6. Pencil and paper or online interactive (PUMA Interactive). The original PUMA is now out of print. Note that New PUMA will be published in 2020/2021.
Can subtests be administered in isolation? No, but strand scores can be differentiated
Administration group size Small group or whole class
Administration duration Key Stage 1: most pupils finish in under 30 minutes; maximum 40 minutes. Key Stage 2: 30-60 minutes, depending on the year; maximum 50 minutes for Years 3 and 4, maximum 60 minutes for Years 5 and 6.
Description of materials needed to administer test Manual. Paper version: each pupil requires a test booklet, pencil/pen and eraser. Digital version: each pupil requires an interactive test credit and a computer/tablet.
Any special testing conditions? No

Response format

Response mode Electronic or paper and pencil
What device is required Computer or tablet
Question format Mixed
Progress through questions Flat

Assessor requirements

Is any prior knowledge/training/profession accreditation required for administration? Not stated (no)
Is administration scripted? Yes

Scoring

Description of materials needed to score test Paper version: manual (mark scheme); standardisation tables are in the manual, or use the MARK (My Assessment and Reporting Kit) online analysis and reporting tool. Digital version: automatically marked.
Types and range of available scores Raw scores (by question, strand and overall); age standardised scores (<70 to >130); standardised score (cohort, not age standardised); percentiles (1 to 98); mathematics age (varies depending on the test; full range <4;10 to >12;07); performance indicator bands (based on the national curriculum); Hodder scale (0 to 6+, for comparison against other measures, e.g. PiRA or GAPS); estimated scaled score on the KS2 national test (85-115).
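Assuming the usual convention for age standardised scores (mean 100, SD 15; not stated explicitly above), the reported percentile range of 1 to 98 lines up with the standardised-score range of roughly 70 to 130. A minimal sketch of that correspondence under a normal model:

```python
from statistics import NormalDist

def percentile_from_standardised(score, mean=100.0, sd=15.0):
    """Percentile rank implied by a standardised score under a normal model.

    mean=100 and sd=15 are the conventional standardised-score parameters,
    assumed here rather than taken from the PUMA manuals.
    """
    return NormalDist(mean, sd).cdf(score) * 100

# A score of 70 sits near the 2nd percentile; 130 near the 98th.
print(round(percentile_from_standardised(70)))
print(round(percentile_from_standardised(130)))
```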
Score transformation for standard score Age standardised and year cohort standardised
Age bands used for norming 1 month
Scoring procedures Computer scoring with direct entry by test taker; computer scoring with manual entry of responses from paper form; simple manual scoring key (clerical skills required)
Automatised norming computerised/online

Construct Validity

Does it adequately measure literacy, mathematics or science?
Does it reflect the multidimensionality of the subject? Generic Maths
Construct validity comments (and reference for source) Test guidance describes good face validity, as the tests were developed to align with national curriculum guidance and curriculum maps. The second edition was developed to align with the 2014 national curriculum and 2015 test frameworks. However, there is no information about the theory of mathematics development or theoretical constructs. For Key Stage 1, Pearson correlations with age of .35-.56 (see page 70 of the Stage 1 manual) are reported as a measure of validity. For Key Stage 2, Pearson correlations with age are .3-.4, except in Year 6 where the correlations range from -.24 to .21 (see page 75 of the Stage 2 manual). Correlations with age should not be expected to be particularly high for termly standardised tests.

Criterion Validity

Does test performance adequately correlate with later, current or past performance?
Summarise available comparisons Criterion validity is reported in the Key Stage 1 manual as assessed through concurrent correlations between teacher assessments and PUMA scores for 200 pupils: .94 at Key Stage 1 (excellent) and .83 at Key Stage 2 (excellent). The authors note that teacher assessments tended to be higher than obtained National Curriculum test levels, but do not report the correlations with obtained National Curriculum test levels. The Key Stage 2 manual reports that criterion validity was assessed by correlating PUMA 6 test raw scores with scaled scores on Year 6 national tests for 1,400 pupils from 56 schools. This is used to provide KS2 predicted scores; however, the correlation itself is not reported.

Reliability

Is test performance reliable?
Summarise available comparisons Internal consistency of all tests was good to excellent. Key Stage 1: Cronbach's alpha .86-.90; 90% confidence bands +/- 3.2-3.5; 95% confidence bands +/- 4.10-4.42; SEM 2.05-2.21 (see p. 70 of the Stage 1 manual for detail by test). Key Stage 2: Cronbach's alpha .90-.94; 90% confidence bands +/- 3.9-4.6; 95% confidence bands +/- 4.86-5.8; SEM 2.43-2.9 (see p. 75 of the Stage 2 manual for detail by test). It is unclear whether paper or digital versions were used in standardisation and reliability testing. Although large-scale studies were conducted tracking pupil performance over four terms (in order to calculate the Hodder scale), no information is given about the equivalence reliability of the digital and paper versions, or of the different forms.
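The reported SEMs and confidence bands are consistent with the standard classical test theory relationships SEM = SD x sqrt(1 - alpha) and band half-width = z x SEM. A minimal sketch with illustrative values (the raw-score SD of 6.5 is hypothetical, chosen only to show the arithmetic; it is not taken from the manuals):

```python
import math

def sem(sd, alpha):
    """Standard error of measurement: SEM = SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - alpha)

def confidence_band(sem_value, z):
    """Half-width of the confidence band around an obtained score."""
    return z * sem_value

# Hypothetical raw-score SD of 6.5 with alpha = .90
s = sem(6.5, 0.90)
print(round(s, 2))                          # SEM, ~2.06
print(round(confidence_band(s, 1.645), 2))  # 90% band, ~3.38
print(round(confidence_band(s, 1.96), 2))   # 95% band, ~4.03
```

These illustrative values fall within the SEM and band ranges reported above, which suggests the manuals use the same classical formulas.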

Is the norm-derived population appropriate and free from bias?

Is population appropriate and free from bias? Yes
If any biases are noted in sampling, these will be indicated here. Sample sizes for the norming of the PUMA are large. However, it is difficult to evaluate whether there are any biases in sampling due to limited information about sampling methodology and pupil demographics. Gender differences are noted, with girls consistently outperforming boys on all tests (which the authors note is consistent with national patterns for mathematics tests in general). Mean raw scores by gender are provided in the manual.

Sources

Sources McCarty, C., & Cooke, C. (2016). Progress in Understanding Mathematics Assessment (Manual Stage 1: Reception, Year 1, Year 2) (2nd ed.). London, UK: RS Assessment from Hodder Education.
McCarty, C., & Cooke, C. (2016). Progress in Understanding Mathematics Assessment (Manual Stage 2: Years 3-6) (2nd ed.). London, UK: RS Assessment from Hodder Education.