Flipping AI Bias on Its Head

A Path to Advancing Inclusive Marketing and Improving Health Equity with Generative AI

Created by Amy Gómez, PhD

February 23, 2024
WHAT TO KNOW: AI-GENERATED HIGHLIGHTS
01. GenAI in healthcare risks deepening health inequities due to biases introduced by the non-diverse development teams that create the tools and by non-inclusive training data
02. Klick’s Strategy team advocates for training GenAI with diverse inputs and values to combat bias and support equity
03. Cross-cultural marketing via GenAI can help reduce disparities by providing inclusive strategies and marketing content
04. Ethical GenAI requires diverse data sets, targeted collection, and culturally aware teams to truly enhance health equity

EXECUTIVE SUMMARY

The increasing use of generative AI (GenAI) in healthcare exacerbates the risk of perpetuating biases, which can intensify existing health inequities. The biases in GenAI are often linked to a lack of diversity in the development teams and training data. Recognizing this challenge, Klick’s Strategy team emphasizes the importance of “good parenting” in the development of GenAI tools, promoting exposure to diverse cultures and instilling values of equity and inclusion.

Cross-cultural marketing is essential for addressing healthcare disparities, but it remains underutilized in pharmaceutical marketing due to awareness gaps and budgetary concerns. GenAI tools can help bridge the gaps and lower the barriers, leading to more culturally informed marketing strategies.

The future success of GenAI in promoting health equity depends on concerted efforts to diversify training data, conduct targeted data collection, and build diverse development teams. With ethical development and application, GenAI tools hold the potential to significantly advance inclusive marketing and improve health equity. However, this positive outcome is conditional on overcoming substantial challenges in the development process, requiring careful optimism and proactive measures.

INTRODUCTION

As GenAI technologies become deeply integrated into every aspect of life, the biases they perpetuate in healthcare risk exacerbating an already unequal healthcare landscape. For example, an algorithm widely used by hospital networks relied on healthcare spending as a proxy for healthcare needs and, in doing so, grossly underestimated the needs of Black patients compared with white patients.

The well-documented bias in GenAI tools has sparked widespread concern over the negative effects of these technologies. For example, an examination of AI-generated images of “professionals” illustrated how they reinforce pernicious stereotypes: they tend to depict high-paying occupations as white males and low-paying occupations as people of color, and they underrepresent women in both categories versus actual labor statistics.

Many attribute this bias to a lack of diversity in the technology companies responsible for creating these tools and in the data sets the tools are trained on. Today’s GenAI tools are in their infancy, and the companies that build them are akin to parents. How they “raise” these growing children will determine whether they become adults who replicate and reinforce bias or become champions for equity.

The Strategy team at Klick is deeply committed to health equity and believes the question is more than just an ethical challenge. We will explore what “good parenting” would look like in the context of training GenAI tools. Are we exposing our “children” to a wide variety of cultures, beliefs, and practices? Are we instilling the importance of key values like fairness, equity, and inclusion? How could GenAI be used to help mitigate the ingrained biases in AI-based systems and serve as an instrument for advancing health equity and cultural inclusivity?

Bias in GenAI and healthcare-system inequity are examples of the unintentional harm that results when systems are developed without centering inclusivity. GenAI bias and healthcare bias are pernicious enough independently, but their combined effect (biased algorithms operating on biased data sets) amplifies the damage multiplicatively.

However, what if these documented problems presented an opportunity to show exactly where the healthcare industry needs to focus? Author, feminist, and civil rights activist Audre Lorde famously said, “...the master’s tools will never dismantle the master’s house,” but what if they actually could? This paper explores how we might flip AI bias on its head.

In the realm of life sciences commercialization, cross-cultural marketing (as the market-facing expression of healthcare companies’ commitment to health equity) can play an essential role in addressing inequities within healthcare. Effective cross-cultural marketing initiatives can help historically underserved patient populations learn about the health issues that impact them, accelerate their path to diagnosis, and promote greater adherence to health-preserving treatments.

To execute authentically, marketers must start by deeply understanding the attitudes, beliefs, behaviors, and unmet needs of their entire patient population, especially those who are underdiagnosed and undertreated. A comprehensive strategic approach to cross-cultural marketing considers all the facets of identity that shape individuals’ interactions with the healthcare system and determine their health outcomes, such as gender, race, ethnicity, socio-economic status, and sexual identity. Ultimately, successful cross-cultural marketing involves translating this knowledge into insight-driven efforts like inclusive and accessible creative, culturally tailored patient journeys and activation plans, and cross-cultural media initiatives.

Why Isn’t Cross-Cultural Marketing More Prevalent in Pharma?

Although cross-cultural initiatives have become increasingly common in consumer marketing, they are still the exception rather than the norm in pharmaceutical marketing. There are several reasons for the gap. One factor is a lack of awareness about the depth of healthcare inequity and a lack of knowledge about the best practices associated with cross-cultural marketing. Even within therapeutic areas that are known to disproportionately affect communities of color, such as heart failure and kidney disease, marketing efforts often rely on executional elements like casting choices and direct translations.

Perhaps more importantly, finite budgets and the perceived complexity of cross-cultural efforts act as significant barriers. A marketer might have the desire to address the needs of underserved patient groups but lack the necessary knowledge and skill set to drive such initiatives. Hiring a separate team with specialized expertise is also a challenge: it adds cost and risks creating a distinct, unintegrated workstream. These perceived challenges keep marketers from acting on their desire to address health equity issues, so they continue with business as usual.

 

How Can GenAI Tools Help?

GenAI tools could play a pivotal role in overcoming these barriers by providing better access to the insights required to guide more authentically inclusive marketing efforts. When “well parented,” that is, trained on data sets that either include representative samples of historically underrepresented groups or have been fine-tuned to adjust for bias, advanced tools could provide marketers with the cultural understanding and quantitative data to better understand diverse patient populations and their specific needs.

By leveraging GenAI tools that have been “well parented,” marketers could be automatically exposed to market research they may not have thought to consult, including relevant insights and practical strategies for cross-cultural marketing that they can apply to their work. It is critical to note that human oversight and ethical considerations are imperative to ensure the responsible and unbiased use of these technologies. With those in place, GenAI tools could help marketers bridge the knowledge and skill gap, enabling them to navigate this complex terrain more efficiently and effectively. As a result, pharmaceutical companies could be better equipped to serve underserved patient groups and achieve more inclusive and impactful marketing campaigns.

 

USE CASES FOR GENAI TOOLS TO IMPROVE INCLUSIVITY IN MARKETING

NOW: Text Analysis of Patient Materials—The Current State

Making patient materials easily understandable empowers patients to make well-informed decisions about their health and helps them actively participate in their healthcare journey. It also plays a crucial role in promoting health literacy by adjusting complex medical terminology to the patient’s comprehension level. Clear and accessible materials contribute to patient safety by ensuring that vital information is comprehensible, and they promote inclusivity in healthcare by catering to individuals with diverse backgrounds, education levels, and language proficiency.

Unfortunately, overcomplicated language and confusing explanations in patient materials often flummox even native speakers. Here is where GenAI text analysis tools have the potential to deliver: they can identify and analyze complex medical terminology, jargon, and technical language and suggest simpler alternative explanations. They can assess the overall reading level of patient materials, provide feedback on reading difficulty to help authors simplify the text to match the audience’s health literacy level, and offer recommendations for making the content more empathetic and patient-centered. Healthcare marketers who leverage these tools can create materials that are easier for patients to understand, contributing to better health outcomes, improved adherence, and increased patient empowerment.
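
As a simple illustration of the kind of automated readability check such tools could build on, the sketch below scores draft patient copy against a target reading grade level and flags sentences that exceed it. It assumes the open-source Python library textstat; the target grade, sample text, and helper function are illustrative assumptions rather than any specific product’s method.

```python
# Minimal readability check for patient materials, assuming the open-source
# `textstat` package (pip install textstat). Target grade and sample copy
# below are illustrative assumptions only.
import textstat

TARGET_GRADE = 6  # many health-literacy guidelines suggest ~6th-grade reading level

def flag_complex_sentences(text: str, target_grade: int = TARGET_GRADE):
    """Return (grade, sentence) pairs whose estimated grade level exceeds the target."""
    # Naive sentence split on periods; good enough for a sketch
    sentences = [s.strip() for s in text.replace("\n", " ").split(".") if s.strip()]
    flagged = []
    for sentence in sentences:
        grade = textstat.flesch_kincaid_grade(sentence)
        if grade > target_grade:
            flagged.append((round(grade, 1), sentence))
    return flagged

draft = (
    "Patients exhibiting symptomatology consistent with decompensated heart "
    "failure should seek immediate evaluation. Take your medicine every day."
)

print("Overall grade level:", textstat.flesch_kincaid_grade(draft))
for grade, sentence in flag_complex_sentences(draft):
    print(f"Consider simplifying (grade {grade}): {sentence}")
```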

NOW: Image Analysis

With startling regularity, ads appear on the market that are offensive for their insensitive and stereotypical representation of historically underserved groups. (As long-time cross-cultural marketers, we have a digital folder full of them!) Since the topics addressed in healthcare marketing are often a matter of life and death, avoiding such faux pas is even more critical. Can GenAI tools provide a solution? Have their “parents” raised them to do so?

The potential to use GenAI tools for image analysis to assess campaign and asset imagery for underrepresentation and misrepresentation of historically underserved groups remains untapped. As highlighted by the Bloomberg article cited in the introduction, the lack of diversity in the data sets used to train text-to-image AI tools like Stable Diffusion leads to the generation of images that exemplify a world of amplified bias, i.e., even more biased in terms of gender, race, and ethnicity than reality itself. To return to our analogy, GenAI “infants” demonstrate the (noxious) impact of nurture over nature: it’s not how they inherently are, it’s how they’ve been “raised.” As computer scientist and AI expert Dr. Joy Buolamwini aptly notes, “With the adoption of AI systems, at first I thought we were looking at a mirror, but now I believe we’re looking into a kaleidoscope of distortion. Because the technologies we believe to be bringing us into the future are actually taking us back from the progress already made.”

Unless these “children” are exposed broadly and deeply to data about historically underrepresented and underserved populations and trained to utilize them to improve equity, this will continue to be the case.

For healthcare marketers to be able to utilize GenAI tools as a safeguard against their own unconscious biases and ingrained stereotypes, developers who produce them would need to become more thoughtful and inclusive “parents,” collecting a diverse dataset of images that encompasses individuals from different ethnicities, races, genders, and body types. A large multimodal model like GPT-4V could be used to analyze a wide and diverse array of images to create descriptions, then categorize them based on those descriptions. To create robust AI models, human experts in the loop could verify the accuracy of those descriptions and categorizations.

Automated Checks for Potential Bias

The model could be utilized to analyze images, detecting instances of underrepresentation or misrepresentation of certain groups. This analysis could identify if the images reflect biases such as improper labeling, tokenism, or harmful stereotypes, safeguarding marketers from cultural missteps. With increasing scrutiny on GenAI tools and the data sets they are trained on, one could optimistically predict that such “well-raised” tools will be available soon.
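
As a rough sketch of what such an automated representation check might look like with today’s multimodal models, the example below asks a vision-capable model (via the OpenAI Python SDK) to describe who is depicted in a piece of campaign imagery so a human reviewer can audit representation across a campaign’s assets. The model name, prompt wording, and audit rubric are assumptions for illustration; as emphasized above, human oversight remains essential.

```python
# Illustrative sketch: asking a vision-capable model to describe who is
# depicted in campaign imagery so a human reviewer can audit representation.
# The model name, prompt, and rubric are assumptions, not a specific product.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
MODEL = "gpt-4o"   # substitute whichever vision-capable model is available

AUDIT_PROMPT = (
    "Describe the people shown in this image: apparent gender, age range, "
    "race/ethnicity, and any roles implied (e.g., clinician vs. patient). "
    "Note anything that could read as tokenism or a harmful stereotype. "
    "Answer in neutral, descriptive language for a human reviewer."
)

def audit_image(image_url: str) -> str:
    """Return a model-generated representation summary for one image."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": AUDIT_PROMPT},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    )
    return response.choices[0].message.content

# Example usage: summaries across a campaign's assets can then be tallied
# by a human team to spot underrepresentation before launch.
# print(audit_image("https://example.com/campaign-hero.jpg"))
```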

FUTURE: Video and Multimodal Experiences

All of the potential described above for GenAI tools to act as bias interrupters will be available for video in the foreseeable future, and will be able to analyze the video, the on-screen text, and the accompanying voiceover. Given the primacy of video communication in health, politics, the arts, etc., this will be a powerful development.

 

CRITICAL WORK TO ENABLE GENAI-POWERED CROSS-CULTURAL STRATEGY AND MARKETING

As mentioned above, cross-cultural marketing is a way for healthcare brands to begin addressing inequities in their therapeutic areas. At Klick, we want to see a future where GenAI tools, no longer in their infancy but thoughtfully and deliberately evolved, could make cross-cultural approaches more widely accessible to the healthcare industry and accelerate the advancement of health equity.

Diverse Training Data

AI models learn from the data they are trained on. Healthcare companies and marketers need to advocate for collecting and utilizing large and diverse training data for GenAI tools and only employ tools that prioritize ethical considerations, including data privacy, fairness, and transparency. With those protocols securely in place—and potentially monitored or regulated—GenAI tools could provide healthcare marketers with critical insights about diverse patient populations. Using diverse training data would expose AI models to different experiences and perspectives related to healthcare, allowing the models to generate more accurate insights and predictions about various patient populations. They could help healthcare marketers understand the unique social determinants, preferences, and culturally based attitudes of different communities, equipping them to tailor their marketing strategies and messaging to better resonate with historically underserved patient populations.

AI tools could also help identify healthcare access and delivery gaps for these populations. Unlike the infamous algorithm previously cited that used spending on healthcare as a proxy for patients’ healthcare needs, thus significantly underestimating the unmet medical needs in the Black community, AI tools could be trained to do just the opposite. Through data analysis, AI models could detect disparities in healthcare outcomes and potentially shed light on potential solutions. This would empower healthcare marketers to develop targeted campaigns and initiatives to bridge those gaps and improve healthcare access and outcomes.
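
To make the idea of disparity detection more concrete, the minimal sketch below compares an outcome rate (here, a hypothetical diagnosis rate among high-risk patients) across demographic groups and flags groups that fall below the overall average. The column names and data are invented for illustration; real analyses would require rigorous, privacy-preserving data governance and clinical input.

```python
# Hypothetical sketch of disparity detection: compare an outcome rate across
# demographic groups. Column names and data are invented for illustration.
import pandas as pd

records = pd.DataFrame({
    "group":     ["Black", "Black", "White", "White", "Hispanic", "Hispanic"],
    "diagnosed": [1, 0, 1, 1, 0, 1],
    "high_risk": [1, 1, 1, 1, 1, 1],  # all rows represent high-risk patients
})

# Diagnosis rate among high-risk patients, by group
rates = (
    records[records["high_risk"] == 1]
    .groupby("group")["diagnosed"]
    .mean()
    .rename("diagnosis_rate")
)

overall = records["diagnosed"].mean()
gaps = (rates - overall).rename("gap_vs_overall")

print(pd.concat([rates, gaps], axis=1))
# Groups with large negative gaps flag potential underdiagnosis worth
# investigating with clinical and community partners.
```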

Targeted Data Collection

Targeted data collection is another crucial component of “good parenting” by tech developers: a proactive approach of consciously gathering information from historically underrepresented groups. It will be an important way to correct for the biases currently ingrained in AI tools. Developers can ensure that their AI models learn directly from a diverse range of individuals by conducting surveys, interviews, or user studies specifically designed for these populations. This approach matters because it will begin to capture the unique insights, experiences, and preferences of underrepresented groups, thus avoiding biases and improving AI systems’ overall fairness and accuracy. Developers can create more inclusive and representative models by actively seeking out diverse data.

Additionally, this approach could help correct the current gaps in marketers’ understanding of historically underserved populations and provide valuable insights into their needs and perspectives.

An example of targeted data collection is Latimer, otherwise known as “The Black ChatGPT,” which builds on current AI technologies like Meta’s Llama 2 model and GPT-4, the large language model that powers ChatGPT, but adds in books, oral histories, and local archives from Black and brown communities that are not well-represented in the data sets those models were trained on. Another example is Mozilla’s Common Voice project, which was created to address the lack of representation of non-English speakers, people of color, disabled people, women, and LGBTQIA+ people in voice recognition data sets. For GenAI tools to become enablers of health equity-focused marketing, initiatives like these must become the norm, not the exception.

Augmenting Data with External Sources

Another “good parenting” approach to data acquisition that will be important in training GenAI tools is augmenting data with external sources. In this approach, developers enhance existing data sets by incorporating additional information. This can mean integrating data from different data sets or including data obtained from publicly available sources, including government databases, social media platforms, or third-party data providers.

This can enrich the existing dataset, provide additional context, or fill in critical missing information (e.g., data about historically underrepresented and underserved populations).
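
As a minimal sketch of what augmentation can look like in practice, the example below joins a first-party engagement data set with a hypothetical public social-determinants-of-health index keyed on zip code, keeping visible any records that lack external context. All column names and values are invented for illustration.

```python
# Illustrative sketch of augmenting first-party data with an external,
# publicly available source. All column names and values are invented.
import pandas as pd

# First-party data: anonymized patient engagement records with a zip code
patients = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "zip_code":   ["10027", "60629", "94110"],
    "engaged":    [True, False, True],
})

# External source: a public social-determinants-of-health index by zip code
sdoh = pd.DataFrame({
    "zip_code":   ["10027", "60629"],
    "sdoh_index": [0.72, 0.81],
})

# Left join keeps every patient record and adds community-level context
augmented = patients.merge(sdoh, on="zip_code", how="left")

# Flag records lacking external context so gaps stay visible rather than
# being silently dropped
missing = augmented["sdoh_index"].isna().mean()
print(augmented)
print(f"{missing:.0%} of records lack SDOH context")
```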

Some beneficial augmentation sources from a health equity perspective:

Academic research, particularly studies addressing the healthcare attitudes, beliefs, behaviors, and unmet needs of underserved populations, with a wide lens across the impacts of gender, race, ethnicity, socio-economic status, sexual identity, geography, etc.

Sociological studies on the socio-cultural factors that influence under-resourced communities and their health outcomes, to equip AI systems with a more nuanced understanding of diverse populations’ behaviors and preferences

Community-driven platforms, such as forums, social media groups, and online communities, allow individuals to express their unfiltered thoughts, experiences, and concerns. Analyzing data from these platforms can give AI systems real-time information and perspectives from the communities that healthcare marketers need to better understand

Reports and studies conducted by organizations focused on diverse populations (examples include leading advocacy organizations like UnidosUS, the National Urban League, and the NIH, which collects large, diverse healthcare data sets in its All of Us research initiative) could also contribute to augmenting AI’s knowledge. These organizations often conduct research in their communities and collect data to better advocate for their rights. By leveraging their findings, AI algorithms could access up-to-date information and develop a more comprehensive understanding of the challenges and opportunities faced by underrepresented groups.

 

Building and Collaborating with Diverse Teams

At the beginning of this paper, we referenced the lack of diversity within tech companies as a frequently cited cause of the biases within AI tools. The reverse is also true: diverse teams can help counteract those biases. By bringing together individuals from diverse backgrounds, especially those with personal knowledge and lived experience, teams can benefit from their distinct perspectives and insights. Diverse team members play a crucial role in identifying blind spots and potential biases that arise during the design and development of AI systems. They can provide valuable human input on potential biases and the cultural and social contexts that AI systems cannot fully understand on their own. These insights could help ensure that AI systems are built to be more inclusive and consider the needs (especially those that are unmet), values, and perspectives of all individuals. In short, diverse teams are the “parents” that AI tools, now in their infancy, require to develop into mature tools that can serve as powerful equity drivers.

 

CAUTIOUS OPTIMISM AND CAVEATS

The effects of bias in GenAI tools are pernicious and wide-reaching, as some examples in this paper have illustrated. If AI tools are in their infancy, these “infants,” with all their blind spots and lack of perspective, wield enormous clout. We’ve expressed optimism about the opportunity for the tech industry to be thoughtful and responsible “parents” to these “infants” and evolve them into “well-educated adults” that advance inclusion and health equity.

But that outcome is not a given and faces headwinds such as the tremendous and potentially prohibitive computing resources required for data transparency and the dependency that creates on well-funded tech giants; the opaque and confusing paths through which algorithms can entrench historical inequities; and the need for a potentially non-scalable amount of highly contextual human judgment to reliably interrupt bias within GenAI tools. However, the stakes are high, and the potential is great.


We welcome your questions and feedback. Please contact Michael Chambers, SVP, Opportunity Creation, at mchambers@klick.com.




Author

Amy Gómez, PhD
Senior Vice President, Diversity Strategy

Amy is a cross-cultural marketing specialist with more than 20 years of experience helping Fortune 500 companies and leading nonprofit organizations communicate effectively with diverse consumers. She spearheads the creation of relevant, impactful communications for the segments driving growth in the U.S. today: Hispanic, Black, Asian, and LGBTQ. She holds a master’s degree from the University of Pennsylvania and a PhD from Stanford and is fluent in English, Spanish, and Italian.
