Peer-reviewed publications

Investigating the Effects of Gender Bias on GitHub

Sentiment and politeness analysis tools on developer discussions are unreliable, but so are people

Overview

In the first year of my PhD, I worked with Dr. Emerson Murphy-Hill, who is now a research scientist at Google. Emerson was already working on a project investigating gender bias in the software industry, specifically in open source software. He and his colleagues studied pull request acceptance for men vs. women on GitHub and found that women overall get their pull requests accepted more often than men do, but that when the gender of the pull request author is visible, the trend reverses. At the time, the work was heavily talked about in the tech industry (just Google the paper title!).

My lab mates and I continued this line of work, investigating the much-debated question of gender bias in the tech industry. We followed Williams and Dempsey's framework, which describes four patterns of gender bias that women face in the workplace. The framework primarily assumes conventional workplaces, like physical offices, whereas we conducted our study on GitHub, a social coding platform. So we formulated proxy hypotheses tailored to GitHub based on the original framework. For example, "women face more pushback than men on their pull requests" is a proxy hypothesis for the original Prove-It-Again pattern of bias, which holds that women need to provide more evidence of competence than men.

We formulated 12 hypotheses and tested them on the GitHub platform. Our results show that the effects of gender bias are largely invisible on GitHub itself. However, there are still signals of women concentrating their work in fewer places and being more restrained in communication than men. Our work was accepted at the International Conference on Software Engineering 2019 (ICSE), the top conference in the field of software engineering. (Woohoo!)

This project also resulted in another workshop paper, in which we evaluate sentiment and politeness measurement tools on developer discussions on GitHub. We refer to this evaluation in the ICSE paper when discussing our use of natural language processing (NLP) tools to measure the communication patterns of women vs. men.
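To give a flavor of what these tools do, here is a minimal sketch that scores a few invented GitHub-style comments with NLTK's VADER sentiment analyzer. VADER is just a convenient stand-in here, not necessarily one of the tools from our evaluation, and the comments are made up.

    # Toy illustration: score the sentiment of a few made-up developer comments
    # with NLTK's VADER analyzer (a stand-in, not the exact tools we evaluated).
    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

    comments = [
        "Thanks for the quick fix, this looks great!",
        "This breaks the build again. Please actually run the tests.",
        "Could you add a test case for the empty-input path?",
    ]

    analyzer = SentimentIntensityAnalyzer()
    for comment in comments:
        scores = analyzer.polarity_scores(comment)
        # 'compound' ranges from -1 (most negative) to +1 (most positive)
        print(f"{scores['compound']:+.2f}  {comment}")

Off-the-shelf scores like these often disagree with how developers actually read a comment, which is exactly why we evaluated the tools before relying on them.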

What I learned

This was the first project of my PhD marathon, and it gave me my first peer-reviewed publications. It introduced me to research, and for the first time I could use and sharpen my coding and scripting skills for a real outcome rather than just school projects. A few important points that come to mind at this moment:

  • writing, writing, and writing: God! I never knew academic writing was this hard. Emerson held a pair-writing session (just like pair programming) with me for my first workshop paper, which I believe helped me immensely (a note for future professors!).

  • scripting: The core part of our work was data analysis, that is, querying a database and generating statistics. While my SQL skills improved substantially during this project, I also learned the value of writing good scripts that automate the process from start to end, so that a re-analysis is just a click away (see the sketch after this list).

  • Navigating and tinkering with others' code:

  • Qualitative coding: No, it’s not any kind of programming.

  • Some statistics:

  • Review and rebuttal
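As promised in the scripting bullet, here is a minimal sketch of what "re-analysis is just a click away" looks like: a single script that runs the query and prints the statistic. The database file, table, and column names are invented for illustration; the real analysis ran over a much larger GitHub dataset with more involved statistics.

    # One-command re-analysis sketch. The SQLite file, table, and column names
    # are hypothetical; the real pipeline queried a full GitHub dataset.
    import sqlite3

    QUERY = """
    SELECT author_gender,
           COUNT(*)                                    AS total_prs,
           AVG(CASE WHEN merged THEN 1.0 ELSE 0.0 END) AS acceptance_rate
    FROM pull_requests
    GROUP BY author_gender;
    """

    def main(db_path="github_sample.db"):
        with sqlite3.connect(db_path) as conn:
            for gender, total, rate in conn.execute(QUERY):
                print(f"{gender}: {total} PRs, {rate:.1%} accepted")

    if __name__ == "__main__":
        main()

Keeping the query, the computation, and the output in one script meant that whenever the data or a hypothesis changed, rerunning everything was a single command rather than a manual checklist.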