Stuart Gentle, Publisher at Onrec

Mobile assessment: how to make it work for you

We may barely remember it now, but more than a decade ago assessment went through a transition from paper and pencil to online. At the time, the industry raised many concerns about equivalence between the two approaches.

However, the fears turned out to be overblown. Online testing is now ubiquitous and paper and pencil has largely fallen by the wayside. We are now at a similar early stage with the move to mobile-friendly assessment.

In the world of recruitment, suppliers of applicant tracking systems are seeing a rapid shift towards candidates making applications via mobile. Page Up, a leading provider, is seeing nearly 40% of applications via mobile already and predicts this could reach 80-90% by 2020. A recent survey by Glassdoor, the site where candidates share their recruitment experiences, showed 89% of respondents see their mobile device as a key tool for finding their next job.

With data suggesting that 20% of millennials only use a mobile to access the web, mobile has become the predominant means of accessing online content. A survey of 16,000 candidates by IBM in 2016 also indicated that a significantly higher proportion of high-potential candidates use mobile in their application process, and that 70% of high potentials see organisations offering a mobile experience as more attractive.

In short, it has become essential to go mobile to reach the full range of talent available.

Challenges and opportunities

A 2013 paper by CEB explored the challenges in this area, concluding that the future of assessment is mobile. It made a number of common-sense recommendations, including:

  • Offering mobile access to candidates to ensure equal access to assessments
  • Minimising assessment length, and hence the opportunity for distraction
  • Minimising reliance on scrolling in assessment design, so that no device type makes the test significantly harder for a candidate to complete

These issues are key to ensuring mobile assessment works. Content should be made available on mobile, but questions need to be designed so that they can be completed effectively on the smaller screen of a mobile device.

Research in this area has provided initial insights in line with these points. Nguyen and Strazzulla (2012) found no significant differences in reading comprehension across device types. Doverspike, Arthur, Taylor and Carr (2012) found differences on cognitive tests across device types. Conversely, subsequent research by Morelli (2013), involving 13,023 applicants who completed a suite of selection assessments on a mobile device and 375,054 who completed the same content by desktop, found no significant differences on the cognitive component. Morelli also warned against making overly simplistic assumptions about the cause of differences, which may sometimes be due in part to the make-up of the different groups rather than the device used.

The published academic research in this area has, in part, looked at content that was not designed to work on mobile. The conclusions from these papers suggest that format factors, such as the need for scrolling and the level of mobile optimisation, are likely to be the main drivers of differences in test performance when assessments are completed via mobile.

The other issue, just as with any remote testing, is ensuring candidates take responsibility for being in a suitable environment where they can concentrate effectively. When testing moved online many years ago, that responsibility passed from the provider to the candidate. As with that transition from proctored, paper-and-pencil testing to remote online assessment, strong communication and expectation setting are required to ensure candidates take ownership of their conditions and perform at their best.

It is not surprising, then, that non-mobile-optimised, legacy cognitive tests may present some challenges. Using assessments that have been built ‘mobile first’ and can then scale responsively to larger screens appears to be the logical approach for delivering a fair and accessible assessment experience.
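To make the ‘mobile first, scale up’ idea concrete, the sketch below shows one way a hypothetical test player might choose a question layout, starting from the small-screen default and only widening when the viewport allows. The breakpoint, layout names and matchMedia wiring are illustrative assumptions, not any provider's actual implementation.

```typescript
// A minimal mobile-first layout sketch (illustrative assumptions only).
// The default layout targets a small screen; larger viewports enhance it.
type QuestionLayout = "stacked" | "side-by-side";

const TABLET_MIN_WIDTH_PX = 768; // assumed breakpoint, not a standard

function layoutFor(viewportWidthPx: number): QuestionLayout {
  // Mobile-first: start from the stacked, minimal-scrolling layout and only
  // place stimulus and answer options side by side when the screen is wide
  // enough to show both without horizontal scrolling.
  return viewportWidthPx >= TABLET_MIN_WIDTH_PX ? "side-by-side" : "stacked";
}

// In a browser, the choice could be re-evaluated whenever the viewport changes:
// window.matchMedia(`(min-width: ${TABLET_MIN_WIDTH_PX}px)`)
//   .addEventListener("change", () => render(layoutFor(window.innerWidth)));
```

The point of starting from the small screen is that anything which fits there without scrolling will also fit comfortably on a desktop, whereas the reverse is rarely true.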

Implications for choosing the right assessment

The need to go mobile to reach the whole talent pool makes many existing ‘non-mobile’ question banks redundant, as their tests will not squeeze easily onto smaller screens. Creating fresh content designed for mobile is therefore essential.

Research in the US has also indicated that minority ethnic candidates, who may score lower on average on some cognitive assessments, are already more likely to be completing assessments on a mobile device, even where the tests are not optimised for mobile. This compounds the challenges from a diversity perspective, but also highlights why it is essential to redesign content to be fit for the mobile world.

As a result, redesigning questions with a ‘mobile first’ approach, so that they work equally well on smaller and larger screens, is essential. For more difficult questions, this means content that is demanding yet concise, avoiding unnecessary scrolling or navigation problems.

The timing of cognitive ability tests has also come under increased research scrutiny. Moving away from a heavily speed-based testing approach towards a more power-based one, using untimed methods of testing (for instance, adaptive testing techniques), may help make the process fairer to different groups as well as mitigate any cross-platform differences.
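As a rough illustration of the adaptive idea mentioned above, the sketch below shows how a test might pick each next question to match the candidate's current ability estimate rather than racing a clock. The item scale, step size and update rule are simplified assumptions for illustration, not any published test's algorithm; operational adaptive tests typically estimate ability using item response theory.

```typescript
// Simplified adaptive item selection (illustrative assumptions throughout).
interface Item {
  id: string;
  difficulty: number; // on the same scale as ability, e.g. roughly -3 to +3
}

function nextItem(ability: number, remaining: Item[]): Item {
  // Pick the unanswered item whose difficulty is closest to the current
  // ability estimate, so each candidate sees appropriately challenging content.
  // Assumes at least one item remains.
  return remaining.reduce((best, item) =>
    Math.abs(item.difficulty - ability) < Math.abs(best.difficulty - ability)
      ? item
      : best
  );
}

function updateAbility(ability: number, item: Item, correct: boolean): number {
  // Crude fixed-step update: move up after a correct answer, down after an
  // incorrect one. Real tests estimate ability statistically instead.
  const step = 0.5;
  return correct ? ability + step : ability - step;
}
```

Because each candidate is routed to items near their own level, the test can be shorter and less dependent on raw speed, which is part of why adaptive approaches are attractive for mobile delivery.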

The internet’s gone mobile, already

The fundamental driver of change is that the internet has already gone mobile, and user expectations have followed. So the question is perhaps less whether assessment should go mobile than how to make it work effectively: it simply has to in order to remain relevant.

Using assessments which are not mobile-optimised clearly carries risks for both the candidate experience and fairness. When moving assessments to the mobile world, the primary consideration should be whether the content has been designed specifically for mobile devices, to ensure it is fair and accessible.