

The problem with technology, many declare, is its quantitative bent: its “hard” mathematics imposed on the soft human world. Tech is Mark Zuckerberg: reducing beautiful girls to numbers, gushing about the social wonders of his metaverse, and yet so awkward in every human interaction that he instantly becomes a meme. The human world is where he has failed most spectacularly. That failure, that lack of social and moral chops, is one many believe he shares with the industry he is so closely associated with.

And so, as Big Tech fails to understand people, we often hear that its workforce needs more people who do understand. Headlines like “Liberal arts majors are the future of the tech industry” and “Why computer science needs the humanities” have become a recurring feature of tech and business articles over the past few years. It has been suggested that social workers and librarians could help the tech industry blunt social media’s harm to Black youth and its spread of misinformation, respectively. Many anthropologists, sociologists, and philosophers, especially those with advanced degrees who feel the financial pressure of academia’s bias toward STEM, are rushing to demonstrate their usefulness to tech companies whose starting salaries would make the average anthropology professor blush.

I have spent the past several years studying nontechnical workers in the technology and media industries. Arguments to “bring in” sociocultural experts elide the fact that these roles and workers already exist in the technology industry and, in various ways, always have. For example, many current UX researchers hold advanced degrees in sociology, anthropology, and library and information science. And educators and EDI (equity, diversity, and inclusion) specialists often play a role in tech HR departments.

Recently, however, the technology industry has been exploring where nontechnical expertise might address some of the social problems associated with its products. Increasingly, tech companies look to law and philosophy professors to help them through the legal and ethical complexities of platform governance, to activists and critical scholars to help protect marginalized users, and to other specialists to assist with algorithmic harassment, misinformation, content moderation, user wellness, and digital activism. These data-driven industries are working hard to augment their technical knowledge and data with social, cultural, and ethical expertise, or what I often refer to as “soft” data.

But you can add all the soft-data workers you want, and little will change unless the industry values that kind of data and expertise. In fact, many academics, policy specialists, and other sociocultural experts in the AI and tech policy space are noticing a troubling trend: technology companies seek out their expertise and then disregard it in favor of more technical work and workers.

Such experiences underscore the fraught nature of this moment, particularly within the growing context of AI ethics, where the tech industry may claim to be incorporating nontechnical roles while actually adding ethical and sociocultural framing to job titles that are ultimately held by the “same old” technologists. More important, in our enthusiasm for these often underappreciated “soft” professions, we should not overlook their limitations in achieving the lofty goals set for them.

As important as it is to champion the critical work performed by these underappreciated and underpaid professions, there is no reason to believe that their members are inherently better equipped to serve as moral arbiters. These individuals have very real and important social and cultural expertise, but their fields are reckoning with structural dilemmas and areas of weakness of their own.

Take anthropology, a discipline that emerged as part and parcel of the Western colonial project. Although cultural anthropology now often espouses the goals of social justice, there is no guarantee that an anthropologist (a profession that is 85 percent white in the United States) will orient or deploy algorithms in a less biased way than a computer scientist. Perhaps the most infamous example is PredPol, the multimillion-dollar predictive policing company that Ruha Benjamin counts as part of the New Jim Code. PredPol was created by Jeff Brantingham, an anthropology professor at UCLA.

Other academic disciplines championed by those pushing for softer data are similarly compromised. Sociology’s early role in monitoring and quantifying Black populations played a part in today’s surveillance technologies, which overwhelmingly monitor Black communities. My own research area, critical internet studies, skews very white and has failed to center concerns around race and racism. Indeed, I am often one of only a handful of Black and brown researchers in attendance at our field’s conferences. I have been surrounded by more diversity in technology-industry spaces than in the academic spaces from which the foundational critiques of Big Tech emerged.




