How Logrus IT’s Quality Evaluation Portal Enhances Game Localization Quality: A Case Study
Ensuring high-quality game localization is crucial for global success, particularly for AAA and AA games where players invest heavily in their setups and game licenses. These players are often demanding and meticulous. Despite the best efforts of client companies to maintain high standards across all languages—through strict control over voice talent casting, recording quality, and more—traditional measures alone cannot entirely prevent subpar localizations. Various factors can contribute to this issue:
Any of these issues can easily ruin the gaming experience for one or more markets. However, they are challenging to detect because traditional quality control focuses heavily on technical aspects like casting, recording, formatting, and tag errors, which are easier to check. Translating standalone fragments with limited or no contextual information further complicates the matter.
While it’s impractical to review the entire game screen by screen in live mode, a well-organized spot-check that focuses on the right criteria and reflects user experience and sentiment can greatly enhance localization quality. This approach can prevent potential disasters at a fraction of the cost and within a reasonable timeframe.
Logrus IT’s Quality Evaluation Portal enables a comprehensive evaluation of game localization quality by emphasizing the overall user experience. Structured feedback allows both publishers and localizers to identify systemic issues, implement corrective measures, improve quality before the game release, and achieve higher user satisfaction across global markets.
Logrus IT’s Quality Evaluation Portal: Enhancing Game Localization Quality - Case Details
Task: The client aimed to evaluate the quality of their game localized into multiple target languages, focusing on player sentiment. They also sought to identify and summarize systemic issues and enhance future localizations through discussions with localizers. The languages managed by our team included Chinese (Simplified and Traditional), German, French, Italian, Japanese, Korean, Spanish (ES), and Portuguese (BR).
Challenge: Due to budget constraints, Logrus IT could only evaluate string translations, with a volume limited to 2000 words per language.
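For illustration only: in this project the client supplied the sample itself (see below), but a word-budgeted random selection of strings could be drawn along these lines. Every name and field in this sketch is hypothetical and not part of the portal.

```python
import random

def sample_strings(strings, word_budget=2000, seed=42):
    """Draw a random selection of localized strings whose combined
    source word count stays within the given budget.

    `strings` is assumed to be a list of dicts with a "source" field;
    the structure is purely illustrative."""
    rng = random.Random(seed)          # fixed seed makes the sample reproducible
    pool = list(strings)
    rng.shuffle(pool)

    sample, words_used = [], 0
    for s in pool:
        n = len(s["source"].split())   # rough word count of the source text
        if words_used + n > word_budget:
            continue                   # skip strings that would exceed the budget
        sample.append(s)
        words_used += n
    return sample, words_used
```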
Solution: This task was ideally suited for the Logrus IT Quality Evaluation Portal. The portal already provided the necessary functionality for running evaluations, including creating customized metrics and combining holistic evaluations with logging specific issues at a more granular level.
Most importantly, the Logrus IT Quality Evaluation Portal seamlessly integrates complete arbitration functionality. Content creators or localizers can provide feedback on the reviewer’s suggestions or logged issues. Reviewers can then address these comments by either resolving the issues (modifying their evaluation, providing explanations, etc.) or escalating them to the project manager if a particular disagreement cannot be resolved at the localizer/reviewer level.
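To make the arbitration flow concrete, here is a minimal sketch of the state transitions described above. The class and state names are my own shorthand, not the portal's actual data model.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class IssueState(Enum):
    LOGGED = auto()       # reviewer logged an issue
    COMMENTED = auto()    # localizer pushed back with a comment
    RESOLVED = auto()     # reviewer adjusted the grade or explained the decision
    ESCALATED = auto()    # disagreement passed to the project manager

@dataclass
class LoggedIssue:
    segment_id: str
    description: str
    state: IssueState = IssueState.LOGGED
    history: list = field(default_factory=list)

    def localizer_comment(self, text: str):
        self.history.append(("localizer", text))
        self.state = IssueState.COMMENTED

    def reviewer_resolve(self, text: str):
        self.history.append(("reviewer", text))
        self.state = IssueState.RESOLVED

    def escalate_to_pm(self, reason: str):
        self.history.append(("pm", reason))
        self.state = IssueState.ESCALATED

# Typical flow: the reviewer logs an issue, the localizer comments,
# and the reviewer either resolves it or escalates it to the PM.
issue = LoggedIssue("STR_0042", "Tone too formal for casual dialogue")
issue.localizer_comment("Formal register was requested in the style guide")
issue.reviewer_resolve("Agreed; grade adjusted")
```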
The client provided a randomized, representative selection of localized product strings for each language. Our objectives were to:
For this project, we selected a relatively “standard” 3D hybrid quality metric that combined two holistic criteria (Informativeness/Relevance and Consistency) with atomistic evaluation and a simple error typology.
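As a rough sketch of how a hybrid score of this kind can be assembled: the two holistic criteria are assumed to use a 1-5 scale, and the error weights and the 50/50 blend below are made-up values for illustration, not the portal's actual formula.

```python
# Illustrative only: combines two holistic scores (1-5 scale) with an
# atomistic penalty derived from logged errors per 1,000 words.
ERROR_WEIGHTS = {"minor": 1, "major": 5, "critical": 25}   # assumed typology weights

def hybrid_score(informativeness, consistency, errors, word_count):
    # Holistic part: average of the two criteria, normalized to 0..1.
    holistic = ((informativeness + consistency) / 2 - 1) / 4

    # Atomistic part: weighted error points per 1,000 words, capped at 1.
    penalty_points = sum(ERROR_WEIGHTS[e] for e in errors)
    atomistic_penalty = min(penalty_points / max(word_count, 1) * 1000 / 100, 1.0)

    # Final score blends both dimensions; the 0.5/0.5 split is an assumption.
    return round(100 * (0.5 * holistic + 0.5 * (1 - atomistic_penalty)), 1)

# Example: holistic grades 4 and 5, three minor and one major error in 2,000 words.
print(hybrid_score(4, 5, ["minor", "minor", "minor", "major"], 2000))
```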
It’s important to emphasize that the holistic quality criteria we chose covered areas often overlooked during regular quality checks. These aspects are crucial for user sentiment and perception. We aimed to answer the following questions:
The atomistic evaluation addressed other critical issues, such as incorrect or unintelligible translations, as well as more traditional topics like language, style, locale conventions, technical issues, tone/voice, and terminology.
After finalizing the metric, the Logrus IT project manager created customized guidelines for reviewers. These guidelines explained the project goals and priorities, the metric (including the holistic quality scales required for objective evaluation), and detailed steps, rules, and recommendations for using the Logrus IT Quality Evaluation Portal.
After defining the scope, metric, and reviewer guidelines, we initiated quality evaluation projects for each language. We maintained close contact with the client and localization provider representatives, who had access to the project on the portal.
Client representatives had full, PM-level access, while localizers could only access their respective languages and were limited to providing comments. (The portal supports multiple roles, each with specific permissions. The PM can also restrict reviewer and/or localizer access to a particular date range to prevent changes after project completion and unauthorized access.)
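A simplified mock-up of how such role scoping and date-range restrictions could be expressed; the role names echo the paragraph above, but the permission sets and configuration format are my own invention, not the portal's.

```python
from datetime import date

# Hypothetical permission map echoing the roles described above.
ROLE_PERMISSIONS = {
    "pm":        {"view_all_languages", "edit_metric", "manage_access", "arbitrate"},
    "client":    {"view_all_languages", "view_reports", "arbitrate"},
    "reviewer":  {"view_assigned_language", "log_issues", "grade"},
    "localizer": {"view_assigned_language", "comment"},
}

def is_allowed(role, action, today, access_window=None):
    """Check a role's permission, optionally restricted to a date range
    so nothing can change after project completion."""
    if access_window is not None:
        start, end = access_window
        if not (start <= today <= end):
            return False
    return action in ROLE_PERMISSIONS.get(role, set())

# A localizer may comment during the review window but not afterwards.
window = (date(2024, 8, 1), date(2024, 8, 31))
print(is_allowed("localizer", "comment", date(2024, 8, 15), window))  # True
print(is_allowed("localizer", "comment", date(2024, 9, 5), window))   # False
```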
During the arbitration stage, localizers could access reviewer evaluations, comments, and logged errors, clarify or explain particular decisions, and request grade changes.
Results: Upon project completion, the Logrus IT Team provided the client (and the localizers) with the following:
General Feedback:
Wish List: Even more impressive results could be achieved with an in-context review, which would require a slightly larger budget. We have multiple scenarios for this, including script-based evaluation on live localized game builds or screenshot-based evaluation.
You are welcome to try this with your games or localized materials. I will be happy to discuss details with colleagues in the game publishing and localization industry, many of whom I expect to meet at the External Development Summit (XDS) #XDS2024 event that starts in less than two weeks...
Please drop us a note…
#QualityEvaluation #GameLocalization #QualityArbitration #HolisticQuality #HybridQualityEvaluation #MultidimensionalQuality #LQA
I wrote this article myself. I have used Microsoft Copilot to improve style, and then edited the article extensively to make sure it says exactly what I meant :-).