{"id":110,"date":"2022-08-26T19:24:36","date_gmt":"2022-08-26T19:24:36","guid":{"rendered":"https:\/\/inside.wooster.edu\/hsrc\/?page_id=110"},"modified":"2022-09-26T00:33:35","modified_gmt":"2022-09-26T00:33:35","slug":"guidance-for-using-mturk","status":"publish","type":"page","link":"https:\/\/inside.wooster.edu\/hsrc\/guidance-for-using-mturk\/","title":{"rendered":"Guidance for Using MTurk"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\" style=\"font-style:italic;font-weight:600\">What is MTurk?<\/h2>\n\n\n\n<p>MTurk is a crowd-sourcing service run by Amazon Web Services that links \u201cworkers\u201d with \u201crequestors.\u201d Initially established to help businesses find workers to do small tasks, MTurk has become quite popular with social science researchers. \u201cWorkers\u201d browse open jobs and contract to complete tasks for a price determined by the requestor. The attraction of MTurk for researchers is that it provides a large pool of comparatively diverse participants (at least compared to a college campus) for a relatively inexpensive cost.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">&nbsp;<strong><em>What to Consider When Using MTurk<\/em><\/strong><\/h2>\n\n\n\n<ul class=\"wp-block-list\"><li>Be clear about compensation and bonuses, including how long it will take workers to be paid<\/li><li>Clearly and accurately state the time required to complete the task.<\/li><li>Participants should be told if there is any screening applied for workers to qualify.\u00a0 If so, the description of the project should include the necessary qualifications.<\/li><li>Be clear about whether participants are being paid for the time it takes to complete the screener.\u00a0 \u00a0<\/li><li>Researchers should be clear about the type of task participants are being asked to do.\u00a0 For instance, if the task involves writing, or watching videos, this should be stated in the description.\u00a0 Also be aware that certain types of tasks, such as writing tasks, 
elicit higher compensation.<\/li><li>The researcher\u2019s name and\/or school affiliation should be listed either as the Requester or in the project description.<\/li><li>If applicable the to the online survey should be included.<\/li><\/ul>\n\n\n\n<p><strong>Consent:<\/strong> The first page of the online survey should be the consent document. The online consent will have all the elements of a regular consent, but it will not require a signature.&nbsp; Participants will either click an \u201cI Agree\u201d or an \u201cI do not Agree\u201d box.&nbsp; The \u201cI Agree\u201d should include a statement attesting that they are over 18 years of age. The \u201cI Agree\u201d box will take them into the survey.&nbsp; The \u201cI do not agree\u201d box will thank them for their time and take them away from the survey.&nbsp; For a sample of an online consent form, please see our one page, <a href=\"https:\/\/livewooster.sharepoint.com\/:w:\/s\/HumanSubjectResearchCommittee\/EaZUhsiPg6BIiUBwTLgPeqEB0dJ9cMtaDSLmNlq7l-mQ4g?e=e2WnTc\" target=\"_blank\" rel=\"noreferrer noopener\">online survey consent template<\/a>.<\/p>\n\n\n\n<p><strong>Debrief:<\/strong> If the researchers are using deception or incomplete disclosure (i.e. 
not stating exactly what the study is about so as not to bias participants\u2019 responses), then it is important to include a debriefing form at the end of the survey.&nbsp; This debriefing form could be embedded in the last page of the survey and would require participants to answer a final question indicating whether researchers may use their data now that they know the true purpose of the study.&nbsp;<\/p>\n\n\n\n<p><strong>Confidentiality:<\/strong> While MTurk workers may have been intended to remain anonymous to academic researchers, the reality is that anonymity cannot be guaranteed in any online environment where data are being collected.&nbsp; Recent research shows that MTurk worker IDs can easily be linked to individuals\u2019 Amazon profiles, including their wish lists and previous product reviews.&nbsp; This means that researchers must be careful in deciding what information to collect from participants.&nbsp; The default should be that participants\u2019 MTurk worker IDs are not collected.&nbsp; If it is necessary to collect worker IDs, then researchers should ensure that the IDs are kept confidential and secure, are not linked back to survey data, and are deleted after use.<\/p>\n\n\n\n<p><strong>Reasons for Failure or Rejection<\/strong>: Please remember that MTurk participants are workers\u2014many depend on MTurk for their livelihood. A worker\u2019s rejection rate dictates which projects they can take and, in turn, how much money they can make, and workers have no way to remove rejections from their accounts.<\/p>\n\n\n\n<p>Given this, researchers should include a statement in the informed consent about instances in which participants may fail to qualify or might be rejected for a task (e.g., failed attention checks). 
You should never reject a participant for a task because of your own errors.<\/p>\n\n\n\n<p>Amended from:<\/p>\n\n\n\n<p><a href=\"https:\/\/www.umass.edu\/research\/guidance\/mturk-guidance\">https:\/\/www.umass.edu\/research\/guidance\/mturk-guidance<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/irb.upenn.edu\/mission-institutional-review-board-irb\/guidance\/types-research\">https:\/\/irb.upenn.edu\/mission-institutional-review-board-irb\/guidance\/types-research<\/a><\/p>\n\n\n\n<p><strong><em>Resources:<\/em><\/strong><\/p>\n\n\n\n<p><a href=\"https:\/\/www.cloudresearch.com\/resources\/blog\/irb-template-for-mechanical-turk-and-turk-prime\/\">https:\/\/www.cloudresearch.com\/resources\/blog\/irb-template-for-mechanical-turk-and-turk-prime\/<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/cornellsun.com\/2020\/02\/21\/popular-among-cornell-professors-mechanical-turk-continues-to-face-ethical-questions\/\">https:\/\/cornellsun.com\/2020\/02\/21\/popular-among-cornell-professors-mechanical-turk-continues-to-face-ethical-questions\/<\/a><\/p>\n\n\n\n<p><a href=\"http:\/\/ccss.djnavarro.net\/onlinestudies\/session2_ethics.pdf\">http:\/\/ccss.djnavarro.net\/onlinestudies\/session2_ethics.pdf<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/www.brookings.edu\/blog\/techtank\/2016\/02\/03\/can-crowdsourcing-be-ethical-2\/\">https:\/\/www.brookings.edu\/blog\/techtank\/2016\/02\/03\/can-crowdsourcing-be-ethical-2\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>What is MTurk? MTurk is a crowd-sourcing service run by Amazon Web Services that links \u201cworkers\u201d with \u201crequestors.\u201d Initially established to help businesses find workers to do small tasks, MTurk has become quite popular with social science researchers. \u201cWorkers\u201d browse open jobs and contract to complete tasks for a price determined by the requestor. 
The [&hellip;]<\/p>\n","protected":false},"author":161,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-110","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/pages\/110","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/users\/161"}],"replies":[{"embeddable":true,"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/comments?post=110"}],"version-history":[{"count":6,"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/pages\/110\/revisions"}],"predecessor-version":[{"id":127,"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/pages\/110\/revisions\/127"}],"wp:attachment":[{"href":"https:\/\/inside.wooster.edu\/hsrc\/wp-json\/wp\/v2\/media?parent=110"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}