Guidance for Using MTurk

What is MTurk?

MTurk is a crowd-sourcing service run by Amazon Web Services that links “workers” with “requesters.” Initially established to help businesses find workers for small tasks, MTurk has become quite popular with social science researchers. Workers browse open jobs and contract to complete tasks for a price set by the requester. The attraction of MTurk for researchers is that it provides a large pool of comparatively diverse participants (at least compared to a college campus) at a relatively low cost.

What to Consider When Using MTurk

  • Be clear about compensation and bonuses, including how long it will take workers to be paid.
  • Clearly and accurately state the time required to complete the task.
  • Tell participants if any screening is applied to qualify workers; if so, the project description should list the necessary qualifications.
  • Be clear about whether participants are paid for the time it takes to complete the screener.
  • Be clear about the type of task participants are being asked to do. For instance, if the task involves writing or watching videos, this should be stated in the description. Also be aware that certain types of tasks, such as writing tasks, warrant higher compensation.
  • The researcher’s name and/or school affiliation should be listed either as the Requester or in the project description.
  • If applicable, a link to the online survey should be included.

Consent: The first page of the online survey should be the consent document. The online consent will have all the elements of a regular consent form, but it will not require a signature. Participants will click either an “I Agree” or an “I Do Not Agree” box. The “I Agree” option should include a statement attesting that they are over 18 years of age, and clicking it will take them into the survey. The “I Do Not Agree” option will thank them for their time and exit the survey. For a sample online consent, please see our one-page, online survey consent template.

Debrief: If the researchers are using deception or incomplete disclosure (i.e., not stating exactly what the study is about so as not to bias participants’ responses), then it is important to include a debriefing form at the end of the survey. This debriefing form can be embedded in the last page of the survey and should require participants to answer a final question allowing researchers to use (or not use) their data now that they know the true purpose of the study.

Confidentiality: While MTurk workers may have been intended to remain anonymous to academic researchers, the reality is that anonymity cannot be guaranteed in any online environment where data are being collected. Recent research shows that MTurk worker IDs can easily be linked to individuals’ Amazon profiles, including their wish lists and previous product reviews. This means that researchers must be careful in deciding what information to collect from participants. The default should be that participants’ MTurk worker IDs are not collected. If it is necessary to collect worker IDs, then researchers should ensure that the IDs are kept confidential and secure, are not linked back to survey data, and are deleted after use.
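When worker IDs must be collected (for example, to pay bonuses), one way to follow this guidance is to separate the IDs from the survey responses immediately, keep the ID list only until payment, and then delete it. The sketch below illustrates the idea in Python; the "WorkerId" column name is an assumption about the survey export format, not a requirement of any particular platform.

```python
import secrets

def deidentify(rows, id_field="WorkerId"):
    """Split survey rows into de-identified responses and a separate
    list of worker IDs (kept only until bonuses are paid, then deleted).
    The 'WorkerId' field name is a hypothetical export column."""
    ids = []
    cleaned = []
    for row in rows:
        row = dict(row)  # copy so the original export is untouched
        worker_id = row.pop(id_field, None)
        if worker_id is not None:
            ids.append(worker_id)
        # Replace the ID with a random code so individual rows stay
        # distinct but can no longer be linked to an Amazon profile.
        row["respondent_code"] = secrets.token_hex(4)
        cleaned.append(row)
    return cleaned, ids

# Example with two fabricated rows:
rows = [
    {"WorkerId": "A1XYZ", "q1": "5"},
    {"WorkerId": "A2ABC", "q1": "3"},
]
responses, worker_ids = deidentify(rows)
```

After payment, the `worker_ids` list (and any file it was written to) should be deleted; only the de-identified `responses` are retained for analysis.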

Reasons for Failure or Rejection: Please remember that MTurk participants are workers; many depend on MTurk for their livelihood. A worker’s rejection rate dictates which projects they can accept, which in turn determines how much they can earn, and workers have no way to remove rejections from their accounts.

Given this, researchers should include a statement in the informed consent about instances in which participants may fail to qualify or might be rejected for a task (e.g., failed attention checks). You should never reject a participant for a task because of your errors.

Adapted from:

https://www.umass.edu/research/guidance/mturk-guidance

https://irb.upenn.edu/mission-institutional-review-board-irb/guidance/types-research

Resources:

https://www.cloudresearch.com/resources/blog/irb-template-for-mechanical-turk-and-turk-prime/

https://cornellsun.com/2020/02/21/popular-among-cornell-professors-mechanical-turk-continues-to-face-ethical-questions/

http://ccss.djnavarro.net/onlinestudies/session2_ethics.pdf

https://www.brookings.edu/blog/techtank/2016/02/03/can-crowdsourcing-be-ethical-2/