Comments about OARS and CSM age ratings

I’ve had quite a few comments from people stating that using age rating classification values based on American culture is wrong. So far I’ve been using the Common Sense Media research (and various other psychology textbooks) to essentially clean-room implement an algorithm that maps content ratings to appropriate ages.

Whilst I do agree that other cultures have different sensitivities (e.g. smoking in Uganda, references to Nazis in Germany) there doesn’t appear to be much research on the suggested age ratings for the different categories in those specific countries. Lots of things are outright banned from sale for various reasons (which the populace may completely ignore), but there don’t seem to be many statistics that back up the various anecdotal statements. For instance, are there any country-specific guidelines that say that the age rating for playing a game that involves taking illegal drugs should be 18, rather than the 14 inferred from CSM? Or that the age rating should be 25+ for any game that features drinking alcohol in Saudi Arabia?
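
To make the current approach a bit more concrete, here is a minimal sketch of the kind of lookup I mean; the attribute IDs, intensity levels and ages below are purely illustrative assumptions, not the real OARS identifiers or the shipped CSM-derived data:

    /* Hypothetical sketch, not the actual OARS/AppStream code: map one
     * content attribute plus an intensity level to a minimum age. */
    #include <stdio.h>
    #include <string.h>

    typedef enum { INTENSITY_NONE, INTENSITY_MILD, INTENSITY_MODERATE, INTENSITY_INTENSE } Intensity;

    typedef struct {
        const char *id;   /* OARS-style attribute ID (made up for this example) */
        int ages[4];      /* minimum age for none / mild / moderate / intense */
    } RatingEntry;

    static const RatingEntry rating_table[] = {
        { "violence-cartoon",   { 0,  3,  4,  6 } },
        { "drugs-narcotics",    { 0, 12, 14, 17 } },
        { "language-profanity", { 0,  8, 11, 14 } },
    };

    /* Return the minimum appropriate age for one attribute, or -1 if unknown. */
    static int
    csm_age_for_attribute (const char *id, Intensity intensity)
    {
        for (size_t i = 0; i < sizeof rating_table / sizeof rating_table[0]; i++) {
            if (strcmp (rating_table[i].id, id) == 0)
                return rating_table[i].ages[intensity];
        }
        return -1;
    }

    int
    main (void)
    {
        /* the overall rating for an app is just the maximum over all its declared attributes */
        printf ("minimum age: %d\n", csm_age_for_attribute ("drugs-narcotics", INTENSITY_MODERATE));
        return 0;
    }

A per-country override table could in principle be layered on top of the same lookup, but only if there is actual research to populate it with, which is exactly what seems to be missing.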

Suggestions (especially references) welcome. Thanks!

Published by

hughsie

Richard has over 10 years of experience developing open source software. He is the maintainer of GNOME Software, PackageKit, GNOME PackageKit, GNOME Power Manager, GNOME Color Manager, colord, and UPower, and also contributes to many other projects and open source standards. Richard has three main areas of interest on the free desktop: color management, package management, and power management. Richard graduated a few years ago from the University of Surrey with a Master’s in Electronics Engineering. He now works for Red Hat in the desktop group, and also manages a company selling open source calibration equipment. Richard’s outside interests include taking photos and eating good food.

6 thoughts on “Comments about OARS and CSM age ratings”

  1. I think using the CSM as a basis for age ratings is the easiest and best way to do it. In my opinion, globalisation is the way to go; for the same reasons you can access the same OS in the US, Japan, Saudi Arabia, etc., a user should be presented with the exact same software, and thus the same age ratings, especially if they are based on objective research.

  2. What about different age ratings based on locale settings?

    If package maintainers do not care about ratings for some country (which most likely is the case), you can at least show something like “unrated”. In the end it makes the feature quite complex for maintainers and quite useless for (many) users, though.

  3. My concern would be that even within a particular country, people’s opinions differ.

    Some people would be happy for their child to see content that involves sex, but not violence; or swearing but not violence; or sex, but not misogyny; or…

    So I’d suggest not attempting age ratings, because they’re very subjective. Instead, give guidance warnings individually for each category of potentially-sensitive material, which would involve much less of a value judgement.

    Perhaps something similar to PEGI’s approach (a simple image search will give you a flavour), but without the summary age rating.

  4. I agree with Greg. Content ratings are entirely subjective, shaped by the cultures in which they evolve, and you will never get consensus on what makes an acceptable content rating policy. There simply can’t be an objective rating system. Furthermore, merely having a rating system can open up possibilities for abuse and unwarranted pressure from advocacy and industry groups, as recently detailed by the EFF.

  5. I agree with Greg and Bruce. Age ratings are not objective, and I think it’s best to acknowledge that. Even if there were a scientific, objective way to tell which kinds of content are appropriate for a given age, parents and teachers have their own ideas about such things and would not necessarily appreciate the computer impartially and scientifically giving a green light to things they themselves find inappropriate.

    It is very easy for a “universal” system to cause surprises. E.g. American parents and educators might assume that content rated for 13-year-olds does not feature nudity or swearing, and would be unpleasantly surprised if it does. (Since I’m not American, I might be wrong here – but that’s the point.) If the system tries to avoid giving anyone in the world unpleasant surprises (which would be a reasonable default), a lot of content that most of the world considers unobjectionable is going to end up rated 18+, just to be sure.

    What Greg suggests works much better across cultures and individual preferences: let the users make informed decisions based on specific categories of content. This kind of system would not be completely value-neutral either, but it would give people a much better idea of what to expect.
