A research project focused on conducting different research methods with strangers.

The end product is a crowdsourced bias scoring system in which users can annotate ads to explain why certain aspects of them are biased.


Everyday users find it difficult to accurately recognize and report incidents of algorithmic bias on Instagram


- Think Aloud

- Semi-structured Interview

- Contextual Inquiry

- Affinity Mapping

- Speed Dating


Senior fall - 2021

Project Length: 

7 weeks

Team Members: 

Yael Canaan

Sidra Manzoor

Kristian Pham

Nikhil Saravan

Yufei Wang

My Role on the team:

- Interviewer for:

    - Think Aloud

    - Semi-structured Interview

    - Contextual Inquiry

    - Speed Dating

- Insight analysis

- Participated in the Affinity Mapping session

- Final poster design

    - Overall layout & structure

    - Individual member portraits

- "Royal Doodler" on the collaboration space

We propose a crowdsourced bias scoring system in which users can annotate ads to explain why certain aspects of them are biased. When people scroll through their main feed, an ad that is potentially biased and has been reported by multiple people is tagged with a "bias score". For example, a 49% bias score means that 49% of the people who interacted with the ad thought it was biased. We hope that people who see these scores will be encouraged, knowing they are not the only ones who find the ad biased, and will be more likely to report it or look closer into why it is biased. Users can also write their own explanations of why an ad is biased, and this information is shared with other users, so people learn about algorithmic bias implicitly. After contributing reports to a certain degree, users earn a "Bias Buster" badge on their page to show other users that they care about the Instagram community.
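The scoring logic described above can be sketched in a few lines of code. This is a hypothetical illustration, not our actual implementation: the function names, the reveal threshold, and the badge threshold are all assumptions made for the example.

```python
# Hypothetical sketch of the proposed crowdsourced bias score.
# The score is the percentage of users who interacted with an ad
# and flagged it as biased.

def bias_score(bias_reports: int, interactions: int) -> float:
    """Return the bias score as a percentage (0-100)."""
    if interactions == 0:
        return 0.0
    return 100.0 * bias_reports / interactions

def show_score(bias_reports: int, min_reports: int = 5) -> bool:
    """Assumed rule: only tag the ad once multiple people have reported it."""
    return bias_reports >= min_reports

def earns_bias_buster_badge(total_reports: int, threshold: int = 10) -> bool:
    """Assumed badge rule: award after a set number of contributed reports."""
    return total_reports >= threshold

# Example from the text: 49 of 100 users who interacted reported the ad.
print(bias_score(49, 100))           # 49.0
print(show_score(49))                # True
print(earns_bias_buster_badge(12))   # True
```

The thresholds (5 reports before a score is shown, 10 reports for a badge) are placeholders; in a real system they would be tuned to balance early visibility against noise from one-off reports.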

UI/UX & product designer

Four Words to describe me: 




And here is my portfolio ;p
Enjoy my work!



We are surrounded by information these days. Even an ad that pops up on social media may be personalized to the user based on data the platform has collected about them. Ads can sometimes be algorithmically biased, and these biases can harm people.

For example, one algorithmic bias I experienced was receiving ads for men's shampoo because of what I constantly search for on YouTube (while I am female in real life). I think what went wrong is that I majored in product design and constantly search for industrial- or woodwork-related content. I also frequently search for video game content on YouTube, since I like to play games in my leisure time. So the algorithm simply assumed I was male. Before this research, I myself wasn't sure what algorithmic bias was, and I was shocked and confused about why so many male-targeted ads suddenly appeared on my YouTube.

It appears that I am not alone. Many other social media users have experienced algorithmic bias and were confused or even upset by what the platforms showed them. Our group therefore wanted to help everyday users accurately recognize and report incidents of algorithmic bias on social media, starting with Instagram.


To inform our work and better understand everyday users of Instagram, we used a variety of research methods which included semi-structured interviews, generative think aloud, contextual inquiry, speed dating, stakeholder mapping, empathy mapping, and affinity diagramming.


Through these research methods, we tested everyday users' familiarity with Instagram's existing reporting system, observed the different ways people use Instagram, asked questions about their usual interactions on the platform, and finally presented 15 quick draft solutions to the algorithmic bias problem and listened to participants' comments on each. Synthesizing and finding patterns in our research data allowed us to uncover key insights that led us to our final solution.

" When I’m on Instagram, I just want to catch up with my friends, I don’t want to feel like I’m working. "

" It’s not that I don’t want to learn about bias, it’s just that I don’t have time to read through a lot of information. "





Our gathered data led us to three key insights. Users want a low-effort way to learn about AI bias. While using Instagram, people don't want to be repeatedly interrupted. And finally, users don't like overtly educational content and instead prefer more engaging or fun experiences.

