Junue Jang South Torrance High / 9th Grade
It’s inevitable that AI will end the world, whether it be through an army of robots or activating nuclear launch codes. But for now, it’ll be the sole reason I have an A in English.
If she happens to be reading, I’m kidding!
With the surge of ChatGPT and other neural networks, concerns about AI’s rapid development have been raised, many of which can be addressed by understanding how AI and machine learning work.
Unfortunately, we don’t know how machine learning works.
In short, AI receives a set of data with inputs and outputs and molds an algorithm that maps the former to the latter. For example, I would feed a color-recognition tool inputs of colors and outputs of their corresponding words. The algorithm the AI forms is strengthened by more and more data, to the point where it can tell apart different shades of red or identify multiple colors. The issue lies in the fact that no one, not even its developers, can decipher the algorithms that AI forms. It has thus earned the name of a “black box model,” as the processes that spit out its outputs are not visible to the human eye.
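To make the input/output idea concrete, here is a minimal sketch of a color-recognition tool. It is not a neural network; it simply memorizes a handful of made-up (color, word) training pairs and labels a new color by whichever stored example it is closest to. All of the data and names below are invented for illustration.

```python
# Hypothetical training data: RGB inputs paired with their word outputs.
TRAINING_DATA = [
    ((255, 0, 0), "red"),
    ((200, 30, 30), "red"),
    ((0, 0, 255), "blue"),
    ((30, 30, 200), "blue"),
    ((0, 255, 0), "green"),
]

def classify(rgb):
    """Label a color with the word of its nearest training example."""
    def distance(a, b):
        # Squared distance between two RGB triples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(TRAINING_DATA, key=lambda pair: distance(pair[0], rgb))
    return label

print(classify((220, 40, 40)))  # a darker shade of red -> "red"
print(classify((10, 20, 240)))  # -> "blue"
```

Unlike this toy lookup, a real neural network compresses its training data into millions of numeric weights, which is exactly why no one can read the algorithm back out of it.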
If we have no way to visualize the algorithm, then it becomes increasingly difficult to point out the flaws that come with it, especially when biased data is fed to the model. VICE notes that “the data they are trained on are often inherently biased, mimicking the racial and gender biases that exist within our society.” Of course, this doesn’t apply to our humble color-recognition tool, but it does apply to processes that are more complex and crucial to the lives of many.
For example, the American Civil Liberties Union finds that tenants are evaluated by AI tools that draw on “court records and other datasets that have their own built-in biases” of racism, misogyny, and ableism. This can limit housing opportunities for marginalized people by inflating the prices they are offered or rejecting their applications altogether. As we grow more reliant on AI, we may be reversing our progress toward a more equitable society.
There’s not a lot we can do to combat this issue, however. Unless we can find a way to crack open the imaginary black box, the most we can do is get the road-sign CAPTCHAs right so that the self-driving cars of the future don’t crash.