Dibner Discussions is a monthly series for NYU students to talk about current events and social issues in STEM through an interdisciplinary lens. All participants must agree to our Guidelines for Respectful Discussion prior to participating. If you need support after a discussion, please see this resource guide for services available to NYU students. Scroll down to view themes, recordings, and resources from previous years. (Note: registration opens 4 weeks prior to each event date.)
Topics & Dates:
[Running resources & additional info document]
Spring 2025
You might be familiar with the sentiment: "if a service is free, you are the product." This month's discussion will be held during Data Services' Love Data Week and will explore the impacts of current-day data ownership norms and practices. It may seem fair to allow Google to collect and sell data about your usage patterns on its platforms, but how does this translate to a company like 23andMe owning and using your genetic data? Who does your data belong to?
Generative AI and other types of artificial intelligence often seem like magic when presented to end users. These products appear to work relatively seamlessly, giving the impression that teams of software developers and engineers are solely responsible for their operation and maintenance. But what happens when ChatGPT or DALL-E produces concerning content? Who steps in to moderate posts on Facebook, TikTok, or Instagram? "Ghost work" is a term often used to refer to the unseen labor of people tasked with moderating content, labeling data for training datasets, and performing other work necessary for AI to function. In this month's discussion we'll learn more about ghost workers and discuss the role they play in AI.
The name of this month's discussion might be somewhat misleading: after all, don't algorithms perform the exact task they are programmed to perform? While a machine learning model may perfectly execute the algorithm(s) it is programmed to execute, what assumptions and norms are baked into those algorithms? How does one translate something as subjective as emotion into a rigid, measurable category that a computer can process? What are the consequences of these translations? This month's discussion will ask you to try out this translation process yourself.
Previous Topics:
Fall 2024
Do scientists, engineers, software developers, and other STEM researchers or professionals have a responsibility to make sure their work is used ethically? Should they consider how their work may be used in the future? This discussion will tackle these questions while examining Vietnam War-era materials from the Polytechnic Archives to compare how students and faculty of that time grappled with these same issues.
OpenAI, the company behind ChatGPT and DALL-E, recently released Sora, which generates video from text inputs. What are the implications of generative AI tools such as Sora and DALL-E? What impact will they have on misinformation and disinformation? Are these tools ethical? How would we answer that question?
Spring 2024
Fall 2023
Spring 2023
Mapping Social Justice with Michelle Thompson Gumbs
The Past, Present, and Future of Tech Discrimination with Dr. Joy Rankin & Dr. Sarah Myers West
Contact Tracing and Marginalized Communities with Prof. Kadija Ferryman