Define “computer bias” in your own words and explain how it can result from intentional or unintentional factors in software development. Give a brief example of this. Explain how programmers can actively work to reduce bias in their algorithms.

Answer: Computer bias is when a computer system behaves differently toward different groups, either because of the inputs (from humans) it is given or because of the way its makers coded it.

Intentional: The student lesson showed a good example of intentional bias: Netflix prominently shows its exclusive content so that users subscribe and pay money. Another intentional factor is the data used in studies. For example, the people leading a study might select participants who skew the data, such as people they know play more video games, so they can change how the results come out.

Unintentional: An example of unintentional bias is inaccurate facial recognition. The facial recognition algorithm in your phone is trained on data, and it may recognize a white person better than a Black person because the data that trained it contained more white faces. Programmers can reduce bias in their algorithms by broadening their datasets, for example by including people of all races in their facial recognition training data.
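The idea of "broadening the dataset" can be made concrete with a small audit step before training: count how often each demographic group appears and flag any group that is underrepresented. This is only a minimal sketch; the group labels and the 10% threshold here are hypothetical, not from any real facial recognition system.

```python
from collections import Counter

def audit_representation(labels, threshold=0.1):
    """Return the share of each group that falls below `threshold`
    of the training data, so developers know where to add data."""
    counts = Counter(labels)
    total = len(labels)
    return {group: count / total
            for group, count in counts.items()
            if count / total < threshold}

# Hypothetical training-set labels, heavily skewed toward one group.
labels = ["group_a"] * 90 + ["group_b"] * 5 + ["group_c"] * 5
print(audit_representation(labels))  # → {'group_b': 0.05, 'group_c': 0.05}
```

A check like this would run before training: any flagged group signals that more examples of that group should be collected before the model is trusted.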

Briefly describe the two types of bias in software development and provide examples from the gaming industry and social media platforms. How might biases in software design affect user engagement and experiences?

Answer: The two types of bias are intentional and unintentional. Intentional bias is when software is deliberately programmed to act a certain way based on its input. Unintentional bias arises when software ends up favoring one group without the developers meaning it to, such as a model trained on unrepresentative data. The student lesson showed how Talking Tom (funny characters, nice music) targets younger kids, while violent video games with shooting and bombing target teens and adults. YouTubers use bias to target a certain audience by changing their editing style, music, and personality; Cocomelon videos, for example, are edited to maximize engagement from kids, with cartoony thumbnails to match. Biases can discriminate against users by favoring one group over another, which decreases engagement and worsens the experience for the disfavored group. The discrimination can be between groups of any kind, whether age, race, or something else.
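The point that a biased design favors one group over another can be sketched with a toy recommender. Everything here is hypothetical (the catalog, the "audience" field, the hard-coded preference); it only illustrates how a design choice, not the user's input, drives what each group sees.

```python
def biased_recommend(catalog, user_group):
    """Toy recommender with a hard-coded design bias: it surfaces
    only items aimed at the 'kids' audience, no matter who asks."""
    return [item for item in catalog if item["audience"] == "kids"]

catalog = [
    {"title": "Cartoon Fun", "audience": "kids"},
    {"title": "News Hour", "audience": "adults"},
]

# An adult user still sees only kids' content, so relevance
# (and likely engagement) drops for that group.
print(biased_recommend(catalog, "adults"))
```

Because the favored group is baked into the code rather than derived from the user, every group except "kids" gets less relevant results, which is exactly the engagement drop described above.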