Sometimes, the best advice comes from your friends. That’s what Emma Bates learned in 2018, when she was 25 years old, working a corporate job, and found herself in need of the morning-after pill for the first time. After buying the pill at CVS on her lunch break, she did what many do when trying to get quick, clear information: She turned to the internet. “All I really wanted was to find answers and validation,” she says. Instead, she wound up “in one of those classic internet spirals,” wasting a ton of time and feeling increasingly lost and confused.
The search results included lots of things that Bates didn’t need (like the side effects of the medication, which are listed on the package) and very little by way of information she did need: how people actually felt when taking it and whether those who’d experienced side effects from birth-control pills (like she had) had any issues with it. So, she instinctively turned to the women in her life, both in person and online, for help. The magnitude of the response awed her: People sprang into action, offering their firsthand experiences and advice in spades. These responses left Bates feeling far more comfortable buying and taking emergency contraception than when she was armed with just Google results.
Now, Bates is bottling the unique benefits of social crowdsourcing into the search engine she wishes she’d had then—powered by many, many more friends. Enter: Diem, the “social search engine” that draws on the collective knowledge of women and non-binary folks, as well as artificial intelligence (AI), to create a friendly, informative knowledge hub. Bates and Diem co-founder Diya Singh launched Diem in public beta in October 2021, and after refining through beta testing with thousands of users, they just launched the final product.
How Diem blends AI with community-generated content to offer accurate, applicable info
Designed for women-identifying and non-binary folks in their mid-twenties to early thirties, Diem aims to take all the knowledge, wisdom, and relatable humor that lives in so many individual group chats, organize it in one easily searchable spot, and supercharge that database with AI. The goal is to return simple answers to questions folks pose.
Think of Diem like a mix of Quora, Reddit, and ChatGPT: Users can ask their questions to the community and answer others’ questions, and they’re encouraged to interact to make answers as detailed, funny, personal, and insightful as possible. At the same time, the AI component collects data from those interactions among Diem users, as well as the broader internet, and uses it all to spin out comprehensive, yet super-relevant answers to questions.
For example, let’s say you want to know how much it costs to freeze your eggs. You’d type that question into Diem, and the AI component would draw on both its broader internet database and Diem user-contributed content to generate a one-sentence answer and a longer one. You’d receive both, alongside all the community-generated posts on the topic, in case you were interested in scrolling through them.
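For readers curious about the mechanics, the flow described above resembles what engineers call retrieval-augmented answering: find relevant community posts and background material, then summarize them into a short and a long answer. Here’s a minimal, purely illustrative Python sketch of that pattern—the data, function names, and keyword matching are all assumptions for demonstration, not Diem’s actual system or API, which would use a trained language model rather than simple string stitching.

```python
# Illustrative sketch of a retrieval-augmented answer flow (NOT Diem's real code).
# All posts, corpora, and function names here are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Answer:
    short: str                                # one-sentence summary
    long: str                                 # fuller explanation
    community_posts: list = field(default_factory=list)  # related user posts

# Hypothetical community contributions and background documents.
COMMUNITY_POSTS = [
    "I froze my eggs last year; the clinic quoted about $10k per cycle.",
    "Storage fees added a few hundred dollars a year for me.",
]
BROADER_CORPUS = [
    "Egg freezing costs vary widely; ask clinics how fees break down.",
]

def retrieve(query: str, docs: list) -> list:
    """Toy keyword retrieval: keep docs sharing any word with the query."""
    terms = set(query.lower().split())
    return [d for d in docs if terms & set(d.lower().split())]

def answer(query: str) -> Answer:
    posts = retrieve(query, COMMUNITY_POSTS)
    background = retrieve(query, BROADER_CORPUS)
    # A real system would hand these snippets to a language model to
    # summarize in the community's voice; here we just stitch them together.
    short = posts[0] if posts else "No community answers yet."
    return Answer(short=short, long=" ".join(posts + background),
                  community_posts=posts)

result = answer("how much does it cost to freeze my eggs")
print(result.short)
```

The design point this sketch captures is the one Bates describes: the community posts aren’t just training data in the background—they’re surfaced alongside the generated answer, so the reader can check the summary against the firsthand accounts it came from.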
In this way, Diem combines the ease of AI with the personalization of knowledge from actual women and non-binary folks—the very kind of anecdotal information, life experiences, and gossip often deprioritized by search engines due to bias (more on that below). It’s the combination, Bates says, that makes the information on Diem both useful and directly applicable to people from these communities. “The way we’ve always thought about building is, ‘How can we create a space where you could search for the collective insights of your communities?’” she says. “If you can build collectively with your community, it shapes the voice, data, and experience, and it ends up feeling different.”
The issue of gender bias on search engines and AI platforms
Because gender bias is prevalent in society as a whole, it also exists within internet search-engine algorithms, and fuels the gender information gap. It’s the same gap that Bates confronted when trying to find applicable, relatable information on the morning-after pill in 2018.
As evidence, consider the results of a 2022 study, for which researchers set out to determine whether Google Images searches in countries with higher gender-inequality scores (assessed with the Global Gender Gap Index) would also demonstrate a gender bias. To do so, they searched for the word “person” in the local language of 37 countries in their first study and 52 countries in their second. In both studies, the Google search results in countries with greater levels of societal gender inequality also revealed greater gender disparity (meaning that a higher percentage of the first 100 image results showed a man versus a woman).
New AI bots have also been found to perpetuate gender bias. For example, Bloomberg reported that a computational scientist got ChatGPT to spit out code that said only Asian or white men would make good scientists; the company’s CEO responded to the scientist on Twitter, encouraging users to “thumbs down” such answers in order to help the program learn not to propagate bias.
When I asked ChatGPT a similar question, it said it’s not appropriate to use someone’s gender or race to determine whether they’d excel in a profession. But while the AI appears to be learning to be less biased, it’s still relying on objective, compassionate users to train it to be that way. (Diem’s AI, by contrast, is largely trained on information created by and for women and non-binary people, and thus isn’t subject to the same anti-women bias from the jump.)
“AI is only as good as the data it’s trained on, and because all humans have bias and prejudice, when their data is used, that bias and prejudice becomes a part of the AI.” —Nina Vasan, MD, psychiatrist
These findings illustrate how the biases we maintain as people naturally transfer to the products we create and the datasets to which we contribute. “AI is only as good as the data it’s trained on, and because all humans have bias and prejudice, when their data is used, that bias and prejudice becomes a part of the AI,” says psychiatrist Nina Vasan, MD, chief medical officer of digital therapy platform Real and founder and executive director of Brainstorm: The Stanford Lab for Mental Health Innovation.
How Diem aims to become a fun, informative search engine specifically for women and non-binary people
Given Diem’s intended user demographic, Bates says she thinks of the information shared on the platform and generated by the chatbot as answers to the questions people have when they’re in the “friend stage” of their lives; people absorb lots of information from friends when they’re in their mid-twenties and early thirties, she says. The idea with Diem is that you’re gaining perspectives from a variety of people of your age and gender identity with whom you may not have otherwise interacted, and that collective understanding benefits everyone on the platform.
Because Diem’s AI is fed with information directly from people of this demographic, it’s a representative platform by design. As of now, thousands of women and non-binary people have asked questions and shared answers on Diem about topics relevant to them, especially in the categories of personal health, money, and relationships—which is where Diem’s AI is best trained.
While the AI is still learning, it pulls from a broader proprietary dataset (similar to the one that powers ChatGPT) to answer questions that people in the community haven’t yet discussed. A point of differentiation, though, is that it still spins out a “summarized Diem-y answer” for these questions, says Bates, using actual Diem convos to inform the answer’s style and tone.
However, because Diem’s AI relies on both user-generated content and that broader dataset, it is certainly not immune to the problems faced by other search engines and AI chatbots, including false, misleading, and, yes, even biased content. To that end, there’s a button within the app and on the desktop version to report abuse, and the platform has published a set of community guidelines—which promote things like “kindness and curiosity” and “empathetic engagement”—to ensure its users’ safety and the quality of the information.
Diem launches during a time when government officials in both the United States and European Union are exploring ways to regulate AI, largely because of the above concerns. In the U.S. specifically, the White House released its Blueprint for an AI Bill of Rights in October 2022, a bipartisan group of U.S. senators led by Senator Chuck Schumer is working on legislation to establish guardrails around the industry, and four government agencies released a joint statement last month outlining the powers they hold to regulate the industry more robustly, particularly against discrimination resulting from bias.
But Bates isn’t deterred by entering this landscape. She predicts that there will be an “interesting shift in the future of the internet,” where people will seek out information that caters specifically to where they are in their lives. Rather than asking Google for dating advice, for instance, she hopes you’ll soon go to a platform like Diem, which will give you answers derived from people with a similar lived experience to your own.
“Right now, we exist in monopolies, and that impacts how we use the internet and what we expect of platforms,” says Bates. After all, those monopolies often reflect bias in favor of men. “I think the future of the internet is going to offer more niche experiences,” she says. And who’s to say a niche is necessarily small? As Bates points out, women-identifying folks make up a near-equal share of the people on the planet. And now, there’s a giant group chat waiting for their input.