Christine Bauer

[1] Christine Bauer & Markus Schedl (2019). Global and country-specific mainstreaminess measures: Definitions, analysis, and usage for improving personalized music recommendation systems. PLOS ONE, 14(6), Art no. e0217389. arXiv: 1912.06933. DOI: 10.1371/journal.pone.0217389

[2] Andrés Ferraro, Christine Bauer, & Xavier Serra (2020). Last.fm Artists Gender Information [Data set]. Zenodo. DOI: 10.5281/zenodo.3748787

[3] Andrés Ferraro, Xavier Serra, & Christine Bauer (2021). Break the Loop: Gender Imbalance in Music Recommenders. Proceedings of the 6th ACM SIGIR Conference on Human Information Interaction and Retrieval (CHIIR ‘21). Canberra, ACT, Australia, 14-19 March, pp 249-254. DOI: 10.1145/3406522.3446033

[4] Andrés Ferraro, Xavier Serra, & Christine Bauer (2021). What is fair? Exploring the artists’ perspective on the fairness of music streaming platforms. In Carmelo Ardito, Rosa Lanzilotti, Alessio Malizia, Helen Petrie, Antonio Piccinno, Giuseppe Desolda et al. (Eds.), Human-Computer Interaction – INTERACT 2021. Volume 12933, pp 562-584. Cham, Switzerland: Springer International Publishing. DOI: 10.1007/978-3-030-85616-8_33

[5] Dominik Kowald, Peter Müllner, Eva Zangerle, Christine Bauer, Markus Schedl, & Elisabeth Lex (2021). Support the underground: characteristics of beyond-mainstream music listeners. EPJ Data Science, 10(1). DOI: 10.1140/epjds/s13688-021-00268-9

[6] Vaughn Schmutz & Alison Faupel (2010). Gender and Cultural Consecration in Popular Music. Social Forces, 89(2), pp 685-707. 

[7] Stacy L. Smith, Marc Choueiti, & Katherine Pieper (2018). Inclusion in the Recording Studio?: Gender and Race/Ethnicity of Artists, Songwriters & Producers across 600 Popular Songs from 2012-2017. Annenberg Inclusion Initiative. URL: http://assets.uscannenberg.org/docs/inclusion-in-the-recording-studio.pdf (last accessed 17 November 2021).

Mark Berger

Daphne Lenders & Co.

[1] Jeffrey Dastin. Amazon scraps secret AI recruiting tool that showed bias against women. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G, 2018. Accessed: 03-11-2021.

[2] Jon Kleinberg, Sendhil Mullainathan, and Manish Raghavan. Inherent trade-offs in the fair determination of risk scores. arXiv preprint arXiv:1609.05807, 2016. 

[3] Tolga Bolukbasi, Kai-Wei Chang, James Zou, Venkatesh Saligrama, and Adam Kalai. Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. In Advances in Neural Information Processing Systems 29 (NIPS 2016). Curran Associates Inc., 2016.

[4] Tesla. Tesla vehicle safety report. https://www.tesla.com/VehicleSafetyReport, 2021. Accessed: 03-11-2021. 

[5] Paul R Daugherty, H James Wilson, and Rumman Chowdhury. Using artificial intelligence to promote diversity. MIT Sloan Management Review, 60(2):1, 2019.

Mónica Fernández Peñalver

[1] Malik, Y. S., Sircar, S., Bhat, S., Ansari, M. I., Pande, T., Kumar, P., … & Dhama, K. (2021). How artificial intelligence may help the COVID-19 pandemic: Pitfalls and lessons for the future. Reviews in Medical Virology, 31(5), 1-11.

[2] Belfiore, M. P., Urraro, F., Grassi, R., Giacobbe, G., Patelli, G., Cappabianca, S., & Reginelli, A. (2020). Artificial intelligence to codify lung CT in Covid-19 patients. La radiologia medica, 125(5), 500-504.

[3] Keshavarzi Arshadi, A., Webb, J., Salem, M., Cruz, E., Calad-Thomson, S., Ghadirian, N., … & Yuan, J. S. (2020). Artificial intelligence for COVID-19 drug discovery and vaccine development. Frontiers in Artificial Intelligence, 3, 65.

[4] Röösli, E., Rice, B., & Hernandez-Boussard, T. (2021). Bias at warp speed: how AI may contribute to the disparities gap in the time of COVID-19. Journal of the American Medical Informatics Association, 28(1), 190-192.

[5] Leslie, D., Mazumder, A., Peppin, A., Wolters, M. K., & Hagerty, A. (2021). Does “AI” stand for augmenting inequality in the era of COVID-19 healthcare? BMJ, 372.

[6] Partnership on AI. Why PATTERN should not be used: The perils of using algorithmic risk assessment tools during COVID-19. URL: https://partnershiponai.org/why-pattern-should-not-be-used-the-perils-of-using-algorithmic-risk-assessment-tools-during-covid-19/

[7] West, S. M., Whittaker, M., & Crawford, K. (2019). Discriminating systems: Gender, race, and power in AI. AI Now Institute.

Aashutosh Ganesh

Puzzles, hints & solutions

Click here to reveal the hints and solutions to this edition’s puzzles.