Sites that incorporate user-generated content often need a way to moderate the incoming stream before publishing it. Typically this is accomplished by putting the content in a queue and letting moderators explicitly accept or reject it. I needed such functionality for a site I was working on, so I wrote the “ActsAsModerated” plugin, which allows specific columns of a model to be audited by a moderator at some later point.
ActsAsModerated is good for:
- spot-checking user generated content
- being notified when new content is created
- tracking changes within a record
- spam checking
Moreover, this plugin allows for custom callbacks and validations around the moderation event. This flexibility lets developers augment the moderation flow by stacking custom rules as needed. For example, a developer could write code to assign a reputation score to each user and increment that score for every non-spam contribution they make. You could then auto-approve any content created by a user above a certain score, thereby reducing the workload on moderators. I have been using this plugin for over a year on a very high-volume site and it has held up quite well, so I thought I’d share it with the rest of the Ruby on Rails community.
Download & Install
For those with ADD, you can find acts_as_moderated here: https://github.com/heavysixer/acts_as_moderated
Create the moderated_records table like so:
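The original listing was lost here, but a plausible migration looks like the following. The column names are inferred from the plugin’s behavior described below (the polymorphic link, moderator reference, and approved/rejected/flagged states are assumptions, not the plugin’s exact schema):

```ruby
class CreateModeratedRecords < ActiveRecord::Migration
  def self.up
    create_table :moderated_records do |t|
      t.integer :recordable_id                # polymorphic link to the moderated model
      t.string  :recordable_type
      t.integer :moderator_id                 # the user/account that made the decision
      t.text    :recorded_changes             # serialized snapshot of the audited columns
      t.boolean :approved, :default => false
      t.boolean :rejected, :default => false
      t.boolean :flagged,  :default => false
      t.string  :rejection_reason
      t.timestamps
    end
    add_index :moderated_records, [:recordable_id, :recordable_type]
  end

  def self.down
    drop_table :moderated_records
  end
end
```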
ActsAsModerated has two integration points. The first is within the model(s) to be moderated, which can be done like so:
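The call might look something like this; the model and column names are placeholders, but the pattern is to hand acts_as_moderated the columns you want audited:

```ruby
class Comment < ActiveRecord::Base
  # Audit changes to :body and :title; saves queue up a moderation record
  acts_as_moderated :body, :title
end
```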
The second integration point is the class that acts as the moderator. Typically this is some user or account class. The idea behind this integration point is to create an audit trail for the decisions made by the moderator, in case you ever need to watch the watcher.
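A minimal sketch, assuming the moderator is your User model:

```ruby
class User < ActiveRecord::Base
  # Lets this class approve or reject moderation records,
  # linking each decision back to the user who made it
  acts_as_moderator
end
```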
The plugin also supports an after_moderation callback on the record being moderated, which you can use to take action based on what the moderator did. For example:
- Delete the record if it is inappropriate or spam
- Email the content creator that their content has been approved / denied
There are several dynamically created methods added to every acts_as_moderated class, which moderators can use as shortcuts for making decisions. For example, where @moderator is an object with acts_as_moderator applied:
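The original example was lost with the listing; this is only a sketch of the kind of shortcut the plugin provides, and the approve!/reject! method names are assumptions:

```ruby
# Pull the next record from the moderation queue
record = ModerationRecord.queue.first

# Hypothetical shortcut methods; the real names may differ
@moderator.approve!(record)            # mark the content as acceptable
@moderator.reject!(record, 'spam')     # reject it, noting a reason
```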
Default Ordering, Callbacks & Flagging
The ModerationRecord class has a named_scope called queue, which returns records sorted oldest to newest. However, if you flag a record it will be returned first, regardless of its age relative to unflagged records. This is useful if you want to ensure that moderators see potentially dangerous records first. A good way to flag a record is in the after_moderation callback, for example:
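A sketch of that idea; the spam heuristic and the flag! helper on ModerationRecord are illustrative, not part of the documented API:

```ruby
class Comment < ActiveRecord::Base
  acts_as_moderated :body

  # Push suspicious records to the front of the moderation queue.
  def after_moderation(moderation_record)
    # Illustrative heuristic: comments containing links get flagged
    moderation_record.flag! if body =~ %r{https?://}i
  end
end
```

Moderators can then pull work with ModerationRecord.queue, which returns flagged records ahead of everything else.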
The moderation record will attempt to make callbacks on the model being moderated: after_moderation when a record is first created, and after_rejection when a moderator rejects a record. Here is an example of what they might do:
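Something along these lines; the mailer is hypothetical, but it shows one sensible use for each callback:

```ruby
class Comment < ActiveRecord::Base
  acts_as_moderated :body

  # Fired when the moderation record is first created
  def after_moderation(moderation_record)
    # e.g. let the moderators know there is new content to review
    AdminMailer.deliver_new_content_notice(self)
  end

  # Fired when a moderator rejects this record
  def after_rejection(moderation_record)
    destroy  # e.g. remove spam outright
  end
end
```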
The moderation plugin adds an attr_accessor called skip_moderation, which, when set to true, prevents a moderation record from being created for that save. This is useful if you need to create records programmatically that don’t need to be moderated initially, but will need to be moderated at some later point. For example:
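Assuming a Comment model like the ones above, usage would look like:

```ruby
comment = Comment.new(:body => 'Seeded during a data import')
comment.skip_moderation = true
comment.save  # no moderation record is created for this save
```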
A word to the wise, however: since this attr_accessor prevents records from being moderated, you will want to protect it from mass assignment in your model.
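In a Rails 2.x model that protection is one line:

```ruby
class Comment < ActiveRecord::Base
  acts_as_moderated :body
  attr_protected :skip_moderation  # keep users from sneaking past the queue
end
```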
Presently, if you use the :always_moderate flag on an STI model, it will produce a never-ending series of record updates. I’ll keep working on this bug; in the meantime, please do investigate!