Community, Leadership, Experimentation, Diversity, & Education
Pittsburgh Arts, Regional Theatre, New Work, Producing, Copyright, Labor Unions,
New Products, Coping Skills, J-O-Bs...
Theatre industry news, University & School of Drama Announcements, plus occasional course support for
Carnegie Mellon School of Drama Faculty, Staff, Students, and Alumni.
CMU School of Drama
Monday, October 07, 2019
Geena Davis Unveils Partnership With Disney to "Spellcheck" Scripts for Gender Bias
Hollywood Reporter: During a closing keynote speech at the New Zealand Power of Inclusion Summit, the actress explained how a new digital tool will prevent film and television works from perpetuating underrepresentation and stereotypes — and their pernicious real-world effects.
7 comments:
This software sounds amazing. It provides an easy reference and a clear-cut measurement for gender bias, something that can feel so big and elusive. I’m curious to know how it’s implemented and how Disney reacts to the “spellcheck” score. Do they ensure that each movie meets a certain threshold, or is it more macro and based on a calendar year? My one concern with this technology is that it enables people to tell stories that they shouldn’t. For example, if a movie has an all-male creative team telling a story that should be told from a non-male perspective, but the software shows that there is gender equality in the number of lines spoken, has the software achieved its goal? Though it’s wonderful to have a metric, I hope this doesn’t negatively affect writers’ rooms if they use this software to combat bias rather than hiring people with diverse perspectives.
This is really such an interesting concept. I liked how the article specifies that the tool is not intended to shame the writers of a script, but rather to help point out the unconscious bias that humans inherently have. Sometimes it takes a completely unbiased source, like this software, to look through a work and point out what should be fixed to make it more inclusive and representative of the population as a whole. I also think it is great that they are piloting this software with a company as large as Walt Disney Studios. An example set by a company this huge helps us acknowledge the bias in our everyday lives. The conscious effort to be more inclusive will trickle down to the viewers of Disney works and, slowly, I think more people will become aware of the unconscious biases they carry. No person is perfect, and it is very helpful to have digital software to check our work.
This is a wonderful use of the latest technology. I have had doubts about activists condemning Disney’s past works as gender biased. Certainly, they have a point that those works have had a huge influence on society and that new works must be more diverse. However, figuring out how to realize that ideal, not just by celebrating princesses of color, seems more practical. Visualizing the level of gender bias with actual numbers has a clarity and absoluteness that can solve the problem of the judgment itself being biased. On the other hand, an absolute measurement can be dangerous when it becomes the only subject of consideration. If the “Spellcheck” becomes common, I believe there will be a group of people who find a way to just get the numbers right without examining what must be reconsidered. Still, there is no doubt that this tool will prompt people in the industry to be more careful about their works perpetuating underrepresentation and stereotypes.
This is not how I thought this article was going to go. It doesn’t surprise me that Disney, and likely others shortly after, will be checking scripts to become more inclusive. However, I did not see it coming in the form of a computer. While I think this is a cool idea, I am not all that sure about it. Computers inherently lack the understanding and perspective that are important to storytelling, which may end up working against inclusivity, because the software will be looking for numbers, percentages, and phrases or words. It won’t be able to tell if something is worth doing even when the numbers are slightly off. That is to say, a film like Get Out would likely not pass this screening device because it does not accurately reflect our world. That would be a huge detriment, and no one would ever know. This is why I expect this kind of checking to happen, but through people, because computers just can’t make up the difference, at least not yet.
It is super cool to see machine learning and data mining used to check for gender bias. However, this raises the question of how GD-IQ: Spellcheck for Bias is not biased itself. The software was created by a certain group of people who carry their own biases about what to look out for. That said, from the article it seems that this software mostly looks at casting numbers, which is something the technology doesn’t really insert any bias into. This raises another question: how would these gathered numbers lead to more representation? Would shows only be approved if the score met certain standards? I completely agree with Rebecca’s question of what would be done with the scores. One concern that I have with all checkers of gender bias, or of any bias, is how much they help solve a problem versus how much they just enforce a stereotype. This article talks about getting more female representation through the software, but this could easily erase transgender and non-binary voices, who won’t use the same gendered language that a man or woman uses. I think it is cool to see technology combat bias, but I don’t know if it is the most effective tool.
I am apprehensive about this development, though I appreciate its focus on the concept of unconscious bias. Recently, in Production Personnel Management we discussed the hiring process. Through this discussion, we got onto the topic of the different websites and software that exist to help companies edit their job postings, advertisements, and descriptions to eliminate gender, economic, and racial bias. I took away a lot about how much bias is really rooted in how we phrase requirements, positions, desires, etc., not just in the more apparent forms, such as pronoun usage. While this software is a step in the right direction toward ensuring we tell diverse and authentic stories, I will be interested to see not only how effective it actually is, but whether or not it becomes sort of a crutch. We should aspire to do the work ourselves, to check our own biases before they make it into our art. This could become a way to avoid that personal growth.
I'm intrigued, but skeptical. I think we all know of the similar YouTube algorithms that "learn as they go" that aren't as effective as they are intended to be. I think that this software is a great step in the right direction toward gender equality in film, but I think it definitely has a lot to prove before it can become any sort of industry standard.
I appreciate that this article mentions unintentional bias. Even the most well-meaning people can have an unconscious bias that is reflected in their work. In fact, this reminds me of the really interesting "Project Implicit" from Harvard University, which tests you on that bias regarding different topics. It's truly interesting, and I think that this new technology, if programmed and implemented correctly, has the chance to expose these hidden biases, not just to improve on-screen representation, but also to educate writers on what they can try to avoid in the future.