📌 Key facts
Artificial intelligence (AI) increasingly influences people's opinions and behavior, not only in everyday life but also in business decisions. However, the over-representation of men in the design and application of these technologies could quietly undo decades of advances in diversity and gender equality, because AI-based systems sometimes make problematic or discriminatory decisions, e.g., in employee selection or internet search engines. Focusing on the role of gender in AI, this thesis will discuss the imbalanced power structures in AI processes and their consequences.
🦾 Who We Are
The Chair for Strategy and Organization is focused on research with impact. This means we do not want to repeat old ideas or base our research solely on work done ten years ago. Instead, we research topics that will shape the future, such as Agile Organizations and Digital Disruption, Blockchain Technology, Creativity and Innovation, Digital Transformation and Business Model Innovation, Diversity, Education Technology and Performance Management, HRTech, Leadership, and Teams. We are always early in noticing trends, technologies, strategies, and organizations that shape the future, which has its ups and downs.
The goal of this thesis is to present the current state of research on diversity and gender bias arising from the use of AI: in particular, to highlight research and application areas in which diversity and gender bias in the use of AI is a problem, and to elaborate possible solutions, challenges, and opportunities (e.g., through diversity tech). Of particular interest to the chair is the field of stereotypes and female managers.
🧠 Topics of Interest
- Diversity and gender stereotypes and bias
- Influence of new technology on business and society
🙋 Your Profile
- Reliable and self-driven
- Enthusiasm for diversity and career management as well as for new technology and its impact on society and business
- Ability to conduct sophisticated internet and desk research and to connect with practitioners
- Passion to learn more about the future and do research with impact
📚 Further Reading
Beck, S. et al. (2019), Künstliche Intelligenz und Diskriminierung – Whitepaper aus der Plattform Lernende Systeme, München.
Leavy, S. (2018), Gender bias in artificial intelligence: the need for diversity and gender theory in machine learning, Proceedings of the 1st International Workshop on Gender Equality in Software Engineering, 14-16.
Parsheera, S. (2018), A gendered perspective on artificial intelligence, ITU Kaleidoscope: Machine Learning for a 5G Future (ITU K), IEEE, New York.
Peus, C. and Welpe, I.M. (2011), Frauen in Führungspositionen: Was Unternehmen wissen sollten, OrganisationsEntwicklung 2(2), 47-55.
Heilman, M.E. (2012), Gender stereotypes and workplace bias, Research in Organizational Behavior 32, 113-135.
Webster, J. and Watson, R.T. (2002), Analyzing the past to prepare for the future: writing a literature review, MIS Quarterly 26(2), xiii-xxiii.
📄 Requirements for Any Work
We do not want your research to gather dust in some corner of a bookshelf; we want to make it accessible to the world. Thus, we warmly encourage you to create some or all of the following:
- Infographic - visually represent some of your work
- Slide Deck - summarize your research and possibly present it
- Extract the most important sequences from podcasts, videos, and other media
- 3-4 tweets about the most important findings, summarizing the topic
- Optional: Medium article - let people outside the university know about your research and start building your personal brand
📬 How to Apply
If you are interested, please send your application with a short motivational statement, your current grade sheet, CV, and possible starting date in one .pdf file to Ilse Hagerer (email@example.com). You can also briefly outline your tentative research idea (research question, data and methods, and possible outcomes with a tentative outline) in a Word file (*.docx).
We're very much looking forward to hearing more about you!