Digital Assessment Design - Principles and Directions

After many years working on digital assessment, it is time to take stock of the fundamental principles that guide this approach. I propose to update these design principles, particularly for the era of artificial intelligence, and I call on your experience and contributions to enrich the approach and better define the scope of the field.

Before anything else, I would like to anchor this reflection in the project of a forthcoming versatile, free, and open-source platform, as envisioned for years. Consider the immense revolution brought by AI, which now makes it possible to carry out projects that once would have required unrealistic development budgets (cf. article 14/01/2025).

Here is a first draft of the fundamental principles guiding such a project. Some elements of this list are open to debate, and so is their order:

1. Versatility of Uses

Digital assessment is a vast field, and the multiplication of small applications specialized in narrow uses is a hindrance to its expansion. A modern platform must offer extensive possibilities: beyond a secure exam mode, it should also include a learning mode focused on students and self-assessment, and a live mode similar to classroom engagement tools such as Kahoot. The platform must also be able to provide adaptive or conditional assessments, and even AI-driven assessment agents.
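
To make this concrete, here is a minimal sketch, in plain JavaScript, of how a single test definition could carry its delivery mode so that the same content serves exam, learning, live, and adaptive scenarios. All names are hypothetical and only illustrate the idea, not an existing API.

```js
// Hypothetical sketch: one test definition, several delivery modes.
// None of these names come from an existing product; they only illustrate the idea.
const DELIVERY_MODES = Object.freeze({
  EXAM: 'exam',          // secure, proctored, time-limited
  LEARNING: 'learning',  // self-assessment with immediate feedback
  LIVE: 'live',          // Kahoot-like, teacher-paced sessions
  ADAPTIVE: 'adaptive',  // item selection driven by previous answers
});

// A single test definition reused across modes; only the delivery options change.
const sampleDelivery = {
  testId: 'demo-test-001',
  mode: DELIVERY_MODES.LEARNING,
  options: {
    showFeedback: true,     // relevant in learning and live modes
    shuffleItems: true,
    timeLimitMinutes: null, // enforced only in exam mode
  },
};

function allowsImmediateFeedback(delivery) {
  return delivery.mode === DELIVERY_MODES.LEARNING || delivery.mode === DELIVERY_MODES.LIVE;
}

console.log(allowsImmediateFeedback(sampleDelivery)); // true
```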

2. Scalability

The platform must be able to integrate new functionalities through object interfaces and to support themes (CSS-based appearances) and plugins (new interactions, additional features). Such scalability is essential to ensure that the tool adapts to its users, rather than forcing users to adapt to the tool.
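
As an illustration of this plugin-based extensibility, here is a hypothetical sketch of a small, stable contract against which a new interaction type could register itself; none of these names come from an existing API.

```js
// Hypothetical plugin contract: a new interaction type registers itself
// against a small, stable interface exposed by the platform core.
const interactionRegistry = new Map();

function registerInteraction(plugin) {
  // Minimal validation of the expected contract.
  for (const key of ['type', 'render', 'score']) {
    if (!(key in plugin)) throw new Error(`Plugin is missing "${key}"`);
  }
  interactionRegistry.set(plugin.type, plugin);
}

// Example plugin: a simple numeric-entry interaction.
registerInteraction({
  type: 'numeric-entry',
  render(item) {
    return `<input type="number" name="${item.id}" />`;
  },
  score(item, response) {
    return Number(response) === item.expected ? 1 : 0;
  },
});

const plugin = interactionRegistry.get('numeric-entry');
console.log(plugin.score({ id: 'q1', expected: 42 }, '42')); // 1
```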

3. Security

Security must cover tests, items, user data, and results. It is essential to establish a web exposure policy for high-stakes assessments and to offer local (on-premises) test deployments. Access to content, from creation to test-taking, must be protected, and the platform must ensure compliance with the GDPR.
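
Here is a minimal sketch of what protected access to test content could look like, assuming Express and the jsonwebtoken package; the secret handling, routes, and token contents are illustrative only.

```js
// Minimal sketch of protecting assessment content behind token-based access,
// assuming Express and the jsonwebtoken package; routes and claims are illustrative.
import express from 'express';
import jwt from 'jsonwebtoken';

const app = express();
const JWT_SECRET = process.env.JWT_SECRET ?? 'change-me'; // never hard-code in production

// Middleware: only requests carrying a valid token may reach assessment content.
function requireAuth(req, res, next) {
  const header = req.headers.authorization ?? '';
  const token = header.startsWith('Bearer ') ? header.slice(7) : null;
  if (!token) return res.status(401).json({ error: 'Missing token' });
  try {
    req.user = jwt.verify(token, JWT_SECRET); // throws if invalid or expired
    return next();
  } catch {
    return res.status(401).json({ error: 'Invalid or expired token' });
  }
}

app.get('/api/tests/:id', requireAuth, (req, res) => {
  res.json({ id: req.params.id, requestedBy: req.user?.sub });
});

app.listen(3000);
```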

4. Efficiency

Efficiency covers streamlined deployment (Docker-based solutions for easy installation and management), a smooth user interface for designing assessments, optimized test execution with minimal response times and reactive servers, and a cloud-native infrastructure with auto-scaling capabilities (e.g., Kubernetes, serverless computing).

A Node.js-based backend unifies backend and frontend development around JavaScript.
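
As a minimal sketch of such a backend (route names and port are illustrative), a Node/Express service can expose the liveness and readiness endpoints that Docker healthchecks or Kubernetes probes rely on for automated deployment and scaling:

```js
// Minimal Node/Express sketch: liveness and readiness endpoints that
// Docker healthchecks or Kubernetes probes can poll. Names are illustrative.
import express from 'express';

const app = express();
let ready = false;

// Liveness: the process is up and the event loop is responding.
app.get('/health/live', (_req, res) => res.status(200).send('ok'));

// Readiness: dependencies (e.g., the MongoDB connection) are available.
app.get('/health/ready', (_req, res) =>
  ready ? res.status(200).send('ready') : res.status(503).send('starting'),
);

// Simulate finishing startup work (e.g., connecting to the database).
setTimeout(() => { ready = true; }, 1000);

app.listen(process.env.PORT ?? 3000, () => {
  console.log('Assessment backend listening');
});
```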

5. Modularity

A modular design makes it easier to maintain the code and develop it as sub-projects. The chosen stack—Node + Express + MongoDB + Vue3 + Vite + Pinia—follows a highly efficient modular approach, which I have personally tested and validated.
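
For illustration, here is a hypothetical sketch of what this modularity looks like in practice: each sub-domain ships its own Express router, and a thin application shell only wires the modules together. The file layout and names are invented.

```js
// Sketch of a modular layout (shown in one file for brevity; in a real project,
// each router would live in its own module, e.g. routes/items.js).
import express from 'express';

// --- routes/items.js: a self-contained module for item management -------
export const itemsRouter = express.Router();
itemsRouter.get('/', (_req, res) => res.json([{ id: 'item-1', type: 'choice' }]));
itemsRouter.post('/', (req, res) => res.status(201).json({ received: req.body }));

// --- app.js: the shell only mounts the modules ---------------------------
const app = express();
app.use(express.json());
app.use('/api/items', itemsRouter);
app.listen(3000);
```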

6. Compatibility

Adopt JSON as the core compatibility format, with conversion from legacy formats such as XML, QTI, and PCI. AI can now easily generate conversion routines to ensure seamless integration.
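
As a sketch of such a routine, the function below maps a QTI-style choice item onto an entirely hypothetical internal JSON schema; the XML parsing step itself is assumed to be handled by any standard parser.

```js
// Hypothetical conversion routine: QTI-style choice item -> internal JSON format.
// The XML has already been parsed into a plain object by any standard XML parser;
// only the mapping onto the (invented) internal schema is shown here.
function qtiChoiceToInternal(qtiItem) {
  return {
    id: qtiItem.identifier,
    type: 'choice',
    prompt: qtiItem.itemBody.prompt,
    options: qtiItem.itemBody.choices.map((c) => ({
      id: c.identifier,
      label: c.text,
    })),
    correctResponses: qtiItem.responseDeclaration.correctValues,
  };
}

// Example input, mirroring the shape a parser would produce for a QTI choice item.
const parsedQti = {
  identifier: 'Q1',
  itemBody: {
    prompt: 'Which planet is closest to the Sun?',
    choices: [
      { identifier: 'A', text: 'Venus' },
      { identifier: 'B', text: 'Mercury' },
    ],
  },
  responseDeclaration: { correctValues: ['B'] },
};

console.log(JSON.stringify(qtiChoiceToInternal(parsedQti), null, 2));
```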

7. Portability

A modern platform should be easy to integrate into existing technical environments (e.g., LTI, SCORM). For LTI integration, the building blocks are already available in the Node/Express ecosystem: the ltijs module, JSON Web Tokens (JWT), and OpenID Connect authentication.
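
As a sketch following the pattern shown in the ltijs documentation, an LTI 1.3 tool can be set up in a few lines; the key, MongoDB URL, and routes below are placeholders to adapt, and the current ltijs API should be checked before use.

```js
// Sketch of an LTI 1.3 tool built with ltijs, following the pattern from its
// documentation; the key, MongoDB URL, and routes are placeholders to adapt.
import ltijs from 'ltijs';

const lti = ltijs.Provider;

// ltijs persists platform registrations and sessions in MongoDB.
lti.setup(
  process.env.LTI_KEY ?? 'replace-with-a-long-random-key',
  { url: process.env.MONGO_URL ?? 'mongodb://localhost/lti' },
  { appRoute: '/', loginRoute: '/login', devMode: true }, // devMode only for local testing
);

// Called after a successful OpenID Connect launch; `token` carries the LTI claims.
lti.onConnect((token, req, res) => {
  return res.send('LTI launch successful');
});

await lti.deploy({ port: 3000 });
```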

8. Community-Driven Development

The project must rely on user communities at several levels: teachers and students, to share best practices and improve the toolset; and developers, to ensure widespread adoption of technologies based on open-source standards and long-term sustainability.

9. Accessibility & Inclusion

Ensure compatibility with screen readers. Allow customization of display settings (contrast, font size, speech synthesis, etc.). Adapt to specific learning needs (extra time, alternative formats). Use accessibility evaluation tools to assess compliance with standards such as WCAG.
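
As a small illustration (the store name and fields are hypothetical), such display preferences can live in a dedicated Pinia store so that every Vue 3 component applies them consistently:

```js
// Hypothetical Pinia store for accessibility preferences, shared by all Vue 3
// components so contrast, font size, and speech settings are applied consistently.
import { defineStore } from 'pinia';

export const useAccessibilityStore = defineStore('accessibility', {
  state: () => ({
    highContrast: false,
    fontScale: 1.0,         // multiplier applied to the base font size
    speechSynthesis: false, // read prompts aloud when supported
    extraTimeFactor: 1.0,   // e.g., 1.33 grants one-third extra time
  }),
  actions: {
    increaseFont() {
      this.fontScale = Math.min(this.fontScale + 0.1, 2.0);
    },
    toggleContrast() {
      this.highContrast = !this.highContrast;
    },
  },
});
```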

10. Integration of Artificial Intelligence

AI can automate grading and evaluation of open-ended questions, oral responses, handwritten answers, and even graphical items (e.g., modified diagrams). Until now, dedicated manual-grading platforms were required for this kind of evaluation (I personally developed one for the DEPP a few years ago).
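
Here is a hedged sketch of what automated grading of an open-ended answer might look like; it assumes, purely for illustration, the OpenAI Node SDK and a JSON-capable model, and any other provider, rubric format, or human-review step could be substituted or added.

```js
// Illustrative sketch of AI-assisted grading of an open-ended answer.
// Assumes the OpenAI Node SDK with OPENAI_API_KEY set; any LLM provider could
// be substituted, and a human-review step should remain in the loop.
import OpenAI from 'openai';

const client = new OpenAI();

async function gradeOpenAnswer({ question, rubric, answer }) {
  const completion = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [
      {
        role: 'system',
        content: 'You grade student answers. Reply with JSON: {"score": 0-3, "feedback": "..."}.',
      },
      {
        role: 'user',
        content: `Question: ${question}\nRubric: ${rubric}\nStudent answer: ${answer}`,
      },
    ],
  });
  // In practice, validate the model output defensively before trusting it.
  return JSON.parse(completion.choices[0].message.content);
}

const result = await gradeOpenAnswer({
  question: 'Explain why the sky appears blue.',
  rubric: 'Mentions scattering of shorter wavelengths by the atmosphere.',
  answer: 'Because air molecules scatter blue light more than red light.',
});
console.log(result.score, result.feedback);
```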

AI can also assist in test generation, analyze student performance, and generate reports such as student reports, teacher reports (for one or several classes), and psychometric studies.
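
To hint at the reporting and psychometric side (collection and field names are hypothetical), a classical indicator such as item facility, the proportion of correct responses per item, can be computed directly with a MongoDB aggregation:

```js
// Sketch of a basic psychometric indicator computed in MongoDB: item facility,
// i.e., the proportion of correct responses per item (score assumed to be 0/1).
// Collection and field names ('responses', 'itemId', 'score') are hypothetical.
import { MongoClient } from 'mongodb';

const client = new MongoClient(process.env.MONGO_URL ?? 'mongodb://localhost:27017');
await client.connect();

const facility = await client
  .db('assessment')
  .collection('responses')
  .aggregate([
    { $group: { _id: '$itemId', facility: { $avg: '$score' }, n: { $sum: 1 } } },
    { $sort: { facility: 1 } }, // hardest items first
  ])
  .toArray();

console.table(facility);
await client.close();
```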

11. Open Research & Open Data

Encourage scientific and collaborative approaches: provide APIs for anonymized data export so that assessment methodologies can be studied and improved.
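
As a sketch (route and field names are invented), an export endpoint can strip or pseudonymize identifying fields before response data leaves the platform:

```js
// Hypothetical export endpoint for open research: identifying fields are removed
// or replaced by a one-way pseudonym before release. Names are illustrative, and
// stronger guarantees (salting, k-anonymity) would be needed in production.
import express from 'express';
import { createHash } from 'node:crypto';

const app = express();

// Replace the student identifier with a truncated hash; drop the name entirely.
function anonymize(record) {
  const { studentId, studentName, ...rest } = record;
  const pseudoId = createHash('sha256').update(String(studentId)).digest('hex').slice(0, 12);
  return { ...rest, pseudoId };
}

app.get('/api/export/responses', (_req, res) => {
  // In the real platform these records would come from MongoDB.
  const sample = [
    { studentId: 'S-123', studentName: 'Alice', itemId: 'Q1', score: 1 },
    { studentId: 'S-456', studentName: 'Badr', itemId: 'Q1', score: 0 },
  ];
  res.json(sample.map(anonymize));
});

app.listen(3000);
```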

12. Comprehensive and Dynamic Documentation

With AI-assisted documentation generation, producing detailed, up-to-date technical documentation is no longer a barrier to adoption.

13. Sustainability

The project should be maintained within an institutional framework, ensuring community-driven decision-making through a democratic process.

14. Transparent Governance

A project of this scope should be built on trustworthy individuals and institutions that support free and open education. The licensing model should promote free and unrestricted use for public institutions, while requiring a fair contribution from private-sector partners and for-profit organizations.

Voilà!

This is just an initial draft, and I hope this approach sparks interest and further discussion on how to integrate modern technologies into the educational ecosystem effectively.
