Meet The Oxford English Assessment Team

If you thought that Oxford University Press (OUP) only supported English language teachers and learners with learning materials, think again. Since publishing its first book in 1478, OUP has been a trusted source for publications, notably the Oxford English Dictionary, the definitive record of the English language, first published in 1884. By 1926, OUP had established its English Language Teaching (ELT) department, publishing materials for teachers and students worldwide to advance knowledge and learning. Milestones in the ELT department include the first advanced learner’s dictionary in 1948, the Headway English course in 1985, and Oxford graded readers in 1988, to name just a few.

Another milestone came in 2009 with the launch of the Oxford Placement Test, the world’s first global computer-adaptive placement test designed specifically for English language learners across all CEFR levels. The test was designed by the newly established Test Development Unit, now the Oxford English Assessment team, which has gone on to produce a range of innovative English language tests, including the Oxford Placement Test for Young Learners (2013), the Oxford Test of English (2017), the Oxford Test of English for Schools (2020), and the Oxford Test of English Advanced (2024).

The key to this success has been our commitment to quality. As a department of the University of Oxford, our mission is to further the university's objective of excellence in research, scholarship, and education. To achieve this, we have a team of experts dedicated to providing tests that are trusted by students, teachers, schools, parents, test centres, employers and admissions offices the world over.

Meet the team

The Oxford English Assessment team is divided into three areas: Research, Content, and Delivery. Together we have created and delivered tests to millions of test takers across the globe, providing valid and reliable results to help place students in the right class, and to help admissions officers and employers select candidates with the required level of English proficiency. The teams work closely together but have distinct areas of responsibility.

The Research team

The Research team brings a wealth of knowledge, experience and expertise to the design and validation of our tests, ensuring that they measure reliably and accurately. The team’s duties encompass a whole host of activities, covering test design, item analysis, CEFR alignment, item bank calibration, assessor quality assurance, live test monitoring, malpractice identification, conference presentations, research publication, and public engagement, among others. One of the most interesting aspects of this work is psychometrics, which involves defining a construct (such as reading) and measuring it statistically. Psychometrics is foundational to computer-adaptive testing, which employs the Rasch model: it places test questions and test-taker ability on the same scale, so the adaptive algorithm can compare them directly.
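To illustrate the idea, the core equation of the Rasch model can be sketched in a few lines of Python. This is a simplified, illustrative sketch, not OUP's implementation: the probability of a correct answer depends only on the gap between ability and item difficulty, both expressed on the same logit scale.

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Rasch model: the probability that a test taker of the given
    ability answers an item of the given difficulty correctly.
    Both parameters sit on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A test taker whose ability exactly matches an item's difficulty
# has a 50% chance of answering it correctly.
print(rasch_probability(0.0, 0.0))   # → 0.5
```

Because ability and difficulty share one scale, a single subtraction tells the algorithm how well matched a question is to the person sitting the test.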

“Psychometrics is an integral part of what we do,” says Dr Nathaniel Owen, Senior Research Manager. “It means we can have a high level of confidence in the results that we provide.”

Of course, we don’t work in a vacuum, and over the years of developing tests we have worked with a who’s who of leading assessment thinkers and practitioners, including Dr Alastair Pollitt at Cambridge Exam Research, Professor Anthony Green at the Centre for Research in English Language Learning and Assessment (CRELLA), Professor Charles Alderson, Emeritus Professor of Linguistics at Lancaster University, Professor Claudia Harsch, University of Bremen, Dr Jon De Jong, Professor Emeritus of Language Testing at VU University Amsterdam, Dr Phillida Schellekens at Schellekens Consultancy, and Professor James Purpura, Teachers College, Columbia, to name but a few.

In whatever it does, the team is always looking for opportunities to innovate. For example, the design of the Oxford Test of English secured a finalist place in the e-assessment awards for “innovative use of e-assessment to provide excellence in English language testing”. Another example of innovation is automating the statistical analysis that derives Rasch difficulty values for test questions, an otherwise labour-intensive activity.
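As a rough illustration of what such automated calibration might involve (the function and method here are illustrative assumptions, not OUP's actual procedure), a crude Rasch-style difficulty estimate can be derived from the log-odds of an item's pretest failure rate; production calibration would use iterative maximum-likelihood methods instead.

```python
import math

def logit_difficulty(responses: list[int]) -> float:
    """Rough difficulty estimate for one item from 1/0 pretest
    responses: the log-odds of answering incorrectly. Harder items
    (lower proportion correct) get higher logit values. Real Rasch
    calibration uses iterative estimation; this shows the idea only."""
    p = sum(responses) / len(responses)   # facility: proportion correct
    p = min(max(p, 0.01), 0.99)           # clamp to avoid infinite logits
    return math.log((1 - p) / p)

easy_item = [1, 1, 1, 1, 0, 1, 1, 1]      # most students answer correctly
hard_item = [0, 0, 1, 0, 0, 1, 0, 0]      # few students answer correctly
print(logit_difficulty(easy_item) < logit_difficulty(hard_item))  # → True
```

Automating this kind of analysis across thousands of pretested items is exactly the sort of repetitive statistical work that benefits from being scripted.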

The Content team

Computer-adaptive tests, such as the Oxford Test of English, are able to give more precise results than equivalent paper-based tests. How do they do this? Well, unlike traditional tests, where all test takers answer the same questions, in adaptive testing each time a test taker answers a question, an algorithm recalculates the test taker’s ability based on whether the answer was correct or incorrect, and chooses the next question at just the right level of difficulty. This requires large banks of test questions for the algorithm to select from, and as you might imagine, creating large banks of test questions is no easy matter. This is where the Content team comes in.
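The loop described above can be sketched minimally in Python. The item names are hypothetical and the fixed-step ability update is a deliberate simplification (real adaptive engines re-estimate ability by maximum likelihood under the Rasch model), but the shape of the algorithm is the same: answer, update, pick the best-matched next question.

```python
def next_item(ability: float, bank: dict[str, float], used: set[str]) -> str:
    """Pick the unused item whose difficulty is closest to the
    current ability estimate -- the simplest item-selection rule."""
    candidates = {item: d for item, d in bank.items() if item not in used}
    return min(candidates, key=lambda item: abs(candidates[item] - ability))

def update_ability(ability: float, correct: bool, step: float = 0.5) -> float:
    """Nudge the estimate up after a correct answer, down after an
    incorrect one (simplified stand-in for a likelihood-based update)."""
    return ability + step if correct else ability - step

# Hypothetical item bank: name -> Rasch difficulty (logits).
bank = {"Q1": -1.0, "Q2": 0.0, "Q3": 1.0, "Q4": 2.0}
ability, used = 0.0, set()

item = next_item(ability, bank, used)          # "Q2": closest to ability 0.0
used.add(item)
ability = update_ability(ability, correct=True)  # correct answer → 0.5
print(next_item(ability, bank, used))            # → Q3, next-closest unused item
```

Each correct answer steers the test toward harder questions and each incorrect answer toward easier ones, which is why the result converges on the test taker's level more precisely than a fixed paper.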

The Content team has targets for the number and type of questions they need to produce over the coming year to ensure that item banks hold sufficient questions. To do this, they recruit freelance ELT professionals with appropriate experience to become ‘item writers’. Item writers go through a rigorous item-writing training programme covering everything from how to write questions that work, to understanding the CEFR, to copyright law. Those who pass are added to an item-writing team for a module of a test, receive module-specific item-writer training, and are then commissioned to write a number of items.

In a process which aligns with ALTE best practice (as set out in the Manual for Test Development and Examining), each test question submitted by an item writer undergoes a rigorous process of pre-editing, re-writing, editing, vetting, and proofreading, each stage improving the question until it meets strict specifications. “It’s a lot more complicated than you might imagine,” says Anoushka Üzüm, Product Development Team Manager, “but it’s hugely rewarding to see a blank page turn into a really robust test question.”

But the team aren’t standing still, and to complement this traditional approach to content creation, they are working with technology partners to explore the possibility of AI-generated test questions. The challenge will be to create questions that match the high quality of the current approach, but the Content team have never shied away from a challenge.

The Delivery team

From Austria to Argentina, Korea to Kuwait, Mexico to Malaysia, the Delivery team works with our approved test centres and research partners across the globe, covering everything from test-centre inspections, to requests for reasonable adjustments, to malpractice investigations, to managing examiners, to pretesting.

All this has to be done to the highest standards. Take pretesting, for example. Once the Content team have created sufficient test questions, these need to be validated to ensure that they work properly before they can be put into a test. To do this, the questions are ‘tried out’, or pretested, with students internationally. Their answers are then statistically analysed by the Research team, and only those questions meeting strict standards are used in a live test. The Delivery team have curated a global network of ‘research partner’ schools and other educational institutions with thousands of students happy to try out pretest questions. It’s an enormous task, but one the Delivery team continue to meet.

“We have a fantastic team, fuelled with passion, energy and delivery commitment!” says Niamh Power, Head of Assessment Delivery. “We put quality and efficiency at the heart of everything we do.” And this commitment has paid dividends: by focusing on efficiency, the Delivery team reduced the time that Oxford Test of English test takers wait for their Speaking and Writing results from 14 days to just five.
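To give a flavour of the kind of screening involved (the function, field names, and thresholds below are illustrative assumptions, not OUP's actual criteria), one simple check is whether an item's facility value, the proportion of pretest students who answered it correctly, falls inside a usable band:

```python
def passes_screening(responses: list[int],
                     min_facility: float = 0.2,
                     max_facility: float = 0.9) -> bool:
    """Keep a pretested item only if its facility (proportion correct)
    sits inside an acceptable band: items that almost everyone, or
    almost no one, answers correctly tell us little about ability.
    Thresholds here are illustrative, not real acceptance criteria."""
    facility = sum(responses) / len(responses)
    return min_facility <= facility <= max_facility

print(passes_screening([1, 0, 1, 1, 0, 1]))   # facility 0.67 → True
print(passes_screening([1, 1, 1, 1, 1, 1]))   # facility 1.0  → False
```

Real pretest analysis looks at more than facility (discrimination and model fit, for instance), but filters of this shape are what make it possible to process large volumes of pretest data consistently.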

Of course, the Oxford English Assessment team is far from the whole story: a legion of support comes from teams such as Customer Services, Marketing, Sales, and Technology, who all play their part in bringing customers tests they can trust.

Colin Finnerty is Head of Assessment Research at Oxford University Press. He has worked in language assessment at OUP for eight years, heading a team which created the Oxford Young Learner’s Placement Test and the Oxford Test of English. His interests include learner corpora, washback, and machine learning.

