June 7: Computational Thinking
What do ghosts that can code say?
BOOlean!!
Coding Blog
Background
- SCRATCH in elementary school
- AP Computer Science Principles in 10th grade
Python
Because of my inexperience with coding, I decided to complete the Codecademy modules to truly learn Python. I completed 19% of the expected 30% of the course, learning about lists, dates and times, strings, integers, floats, booleans, and basic Python syntax. Lists were challenging for me because I kept mixing up whether to use quotation marks or parentheses, and when integers and strings were introduced, creating lists became even more challenging. I still need more practice to get comfortable creating my own lists, but Codecademy provided a beneficial introduction to Python.
My Work
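To keep the rules straight, here is a small sketch of the list syntax from those modules (the variable names are just examples, not from the course):

```python
# Square brackets [] create the list; quotation marks are only for strings.
grades = [90, 85, 77]             # a list of integers: no quotation marks
names = ["Ada", "Grace", "Alan"]  # a list of strings: each item in quotes
mixed = ["score", 42, 3.5, True]  # a string, an integer, a float, and a boolean in one list

# Indexing starts at 0, so names[0] is the first item.
print(names[0])     # Ada
print(len(grades))  # 3
```

The trick that helped me: quotation marks belong to the item (only if it is a string), while the brackets belong to the list itself.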
ChatGPT
I agree that ChatGPT has racial and gender bias. Before I read these articles, a friend of mine presented his research on machine learning incorrectly identifying cancer on darker skin. The articles only strengthened the evidence that machine learning contains bias.
Artificial Intelligence Has a Racial and Gender Bias Problem | TIME
- Generally, facial analysis systems are trained on images of “predominantly light-skinned men.”
- Though technology is assumed to be unbiased, research has uncovered racial AND gender bias in “tech giants like IBM, Microsoft, and Amazon.”
- Technology even misrepresents famous people of color.
- Lack of representation in machine learning could be a result of the lack of people of color and women working in the technology industry.
Why algorithms can be racist and sexist
- Difficult to identify “exactly how systems might be susceptible to algorithmic bias”