Assignments

You'll ANALYZE and IMPLEMENT existing crowdsourcing systems and techniques on your own.

Tips & Info

What do I do?

There will be a set of assignments that will be announced as the course progresses, in which you'll:

  • Become a crowd worker.
  • Survey & analyze existing crowdsourcing systems.
  • Replicate results from an academic paper.
  • Implement a crowdsourcing technique.

Why do this?

Hands-on exercises and implementations are a fun and effective way to learn.

How do I submit?

We'll create a submission page in KLMS for each assignment.

Late Policy

You'll lose 10% of your score for each late day. Submissions will be accepted until three days after the deadline; after that, you'll receive a 0 on that assignment.
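The policy above can be sketched as a small calculation. This is an illustrative sketch only (the function name and scoring scale are our own, not an official grading script):

```python
def late_penalty(score: float, days_late: int) -> float:
    """Illustrative sketch of the late policy: 10% off per late day,
    no credit beyond three days. Not an official grading script."""
    if days_late <= 0:
        return score                                 # on time: full score
    if days_late > 3:
        return 0.0                                   # beyond three days: no credit
    return score * (100 - 10 * days_late) / 100      # 10% off per late day

# e.g., a 90-point assignment submitted two days late → 72.0 points
```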

Assignment 1

Assignment 1: Be a crowd worker

Due: Before class on 9/13 (Tue)

What do I do?

In this assignment, you'll experience what it's like to be a crowd worker. In class, we walked through the MTurk user interface, both as a worker and as a requester, and discussed its strengths and limitations. Now it's your turn to pick a crowdsourcing platform of your choice (sorry, MTurk doesn't count!) and be a worker yourself. We'll focus on voluntary crowdsourcing platforms, which differ in many ways from platforms that offer monetary rewards.

Pick one of the following platforms. You need to spend at least one hour as a crowd worker on that platform. Show us some visual proof of what you've done with screenshots (e.g., an example task, how much work you've done, your score in a game).

Your report

In your report, please reflect on your experience by answering the following questions:

  1. What is the name of your platform?
  2. Why is this crowdsourcing? What is it crowdsourcing?
  3. Walk through your task finding, performing, and learning experience with screenshots.
  4. What did you like about the platform? Why do people come to this platform?
  5. What problems did you run into? How would you improve the platform and the tasks to attract more volunteers?
  6. On a scale of 1 (not really) to 10 (very much), respond to the following: "I think I contributed meaningful work that helps the goal of the platform."
  7. On a scale of 1 (not really) to 10 (very much), respond to the following: "I personally enjoyed working on the tasks."

How do I submit?

KLMS

Assignment 2

Assignment 2: Analyze Crowdsourcing Platforms

Due: Before class on 9/20 (Tue)

What do I do?

In this assignment, you'll analyze FIVE existing crowdsourcing platforms and applications, using the dimensions identified through learnersourcing.

Here are some notes:

  1. Examples that have been previously discussed in class cannot be used.
  2. The five platforms you pick should differ from one another in a significant way (e.g., you cannot pick five Zooniverse projects).
  3. You can define crowdsourcing broadly, as long as you can explain why it can be seen as crowdsourcing (e.g., Wikipedia? Uber? Crowdfunding? Recommender systems?).
  4. To help you find good examples, Wikipedia has a crowdsourced collection of crowdsourcing projects. But you should also explore on your own!

Dimensions

We'll use the following seven dimensions, which were top-rated in our own crowdsourcing effort to rank the dimensions that matter most in analyzing a crowdsourcing system.

  • Motivation - why would a crowd worker do this?
  • Aggregation - how are results from multiple workers combined?
  • Crowd pool - undefined, paid, voluntary, community, location-based, etc.?
  • Quality control - how to ensure valid results?
  • Human skill - what kind of human skill is required to complete a task?
  • Process order - in what order is the work processed between computer, worker, and requester?
  • Goal visibility - how much of the overall goal of the system is visible to an individual crowd worker?

We understand that it might be difficult to precisely extract some of the dimensions as a crowd worker on a platform. The underlying mechanism might be hidden from the crowd. In such a case, please (1) indicate that the information is not publicly available, and (2) write your thoughts on what might be happening under the hood.

Your report

In your report, please analyze each platform by answering the following questions:

  1. What is the name of the platform? URL?
  2. Why is this crowdsourcing? What is it crowdsourcing?
  3. How did you find out about the platform?
  4. Analyze the platform using the seven dimensions above.
  5. Discuss one strength and one weakness of the platform.
  6. Include a screenshot.

How do I submit?

KLMS

Assignment 3

Assignment 3: Heuristic Evaluation Part 1

Due: 11:59pm on 11/25 (Fri)

What do I do?

In this assignment, you'll test a prototype by one peer student group and provide useful feedback. Yes, it's a learnersourcing approach: you learn and practice how to evaluate a user interface while helping peer students improve their prototype! We will use an evaluation technique called heuristic evaluation.

Simply put, heuristic evaluation uses guidelines that capture the principles of effective user interfaces. While there are plenty of options to choose from, we'll use Nielsen's 10 usability heuristics. As you run the assigned prototype, note any striking problem or success you find. Then relate it to one of the heuristics, and explain how that heuristic is violated or met.

NOTE: Since most prototypes are still not complete, please do not focus on aesthetics or feature completeness, but rather focus on the usability of the interface components that are related to the main task.

Here are some useful examples from MIT's 6.813 class notes:

  • Include positive comments as well as criticisms:
    • "Good: Toolbar icons are simple, with good contrast and few colors (minimalist design)"
  • Be tactful:
    • NOT: "the menu organization is a complete mess"
    • Better: "menus are not organized by function"
  • Be specific:
    • NOT: "text is unreadable"
    • Better: "text is too small, and has poor contrast (black text on dark green background)"

Which prototype should I test?

Refer to the assignment sheet.

Your report

In your report, please include results of your heuristic evaluation:

  1. What is the name of the system you tested?
  2. Make a numbered list of 10 usability problems and successes you find. For each problem or positive comment:
    • Describe the problem or positive feature. You may use screenshots if they help. If not, please make sure your comment is descriptive enough to point to a particular feature or UI element.
    • Identify one relevant usability heuristic from Nielsen's 10 usability heuristics, and explain how it is violated or met.

How do I submit?

The report should be written in Markdown (please use the .md extension). Submit using KLMS.