Skills AI Can't Steal

Let's talk about Algorithmic Bias with Melissa Santiago (Brand Strategist)

June 13, 2023 Josh Kidwell Season 1 Episode 6

Today we're going to talk about something that AI can and does steal from us – our human biases. Yes, algorithms can be biased just like me and you. 

We talk with Melissa Santiago about the problems and pitfalls of using AI for your brand's marketing and communications needs, especially when it comes to the topic of Diversity, Equality, and Inclusion.

TOPICS

  • Where does algorithmic bias (AI bias) come from?
  • Levi's AI-driven diversity campaign fail.
  • How to be a more inclusive social media manager.
  • Being transparent when using AI for marketing.

Melissa is a brand strategist, creative thinker, and community builder. She is energetic and empathetic and brings a creative spark to her work. She is passionate about equity, inclusion, and belonging for all people and works to educate herself on new ways to include new perspectives and expand her cultural experiences.

Melissa lives on the occupied land of the Ohlone people, currently known as San Francisco.

LINKS
Melissa's Instagram
Melissa's Linkedin
Coded Bias Documentary

Bloomberg article: "Humans Are Biased. Generative AI Is Even Worse."


Thanks for listening. Please rate us in the App Store.
Remember, "You are a human being with intrinsic worth!"



Josh:

Today, we're going to do something a little bit different. We're going to talk about something that AI can and does steal from us, and it's something that people don't even realize. Any guesses? It's our human bias. Yes, algorithms can be biased, just like me and you. Today, we'll talk with Melissa Santiago about the problems and pitfalls of using AI for your brand marketing and communication needs, especially when it comes to the topic of diversity, equality, and inclusion. Melissa has worked in the marketing and communications field for over 15 years as a communications and brand strategist. She's passionate about equity, inclusion, and belonging for all people. We'll cover topics such as: where does algorithmic bias come from, Levi's AI-driven diversity campaign, and how to be more inclusive as a social media manager.

Josh:

But before we get started, let's get grounded with three deep breaths. Here we go: one, two, three. Okay, as always, remember: you are a human being with intrinsic worth. On with the show. This is Skills AI Can't Steal. Melissa, thank you for coming on the show. It's good to see you. Yes, welcome to Skills AI Can't Steal. You talked to me a few weeks ago about wanting to come on the show and talk about AI and algorithmic bias and all the things that you had mentioned. You mentioned the documentary Coded Bias, and I went and watched it last week. Coded Bias is fascinating. For those of you who haven't seen it, it's a documentary featuring Dr. Joy Buolamwini.

Joy Buolamwini:

So, for example, if I want a machine to see a face, I'm going to provide many examples of faces and also things that aren't faces. I started looking at the data sets themselves, and what I discovered is many of these data sets contain majority men and majority lighter-skinned individuals, so the systems weren't as familiar with faces like mine.
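The mechanism Dr. Buolamwini describes can be sketched in a few lines of code. This is a toy illustration, not anything from the episode or her research: the group labels, numbers, and the one-dimensional "feature" are all invented for the demo. The point is just that a model fit to data dominated by one group can score well on that group while failing on the underrepresented one.

```python
# Toy sketch of dataset imbalance (all numbers invented for illustration).
import random

random.seed(0)

def sample(mean, n):
    # One "feature" per example; each group's faces cluster around a mean.
    return [random.gauss(mean, 1.0) for _ in range(n)]

# Training set: 95 examples from group A, only 5 from group B (imbalanced).
train = sample(0.0, 95) + sample(3.0, 5)

# "Model": a single learned center; classify as a face if close to it.
center = sum(train) / len(train)   # dominated by group A's examples
THRESHOLD = 2.0

def detects(x):
    return abs(x - center) <= THRESHOLD

# Evaluate on balanced test sets for each group.
test_a, test_b = sample(0.0, 1000), sample(3.0, 1000)
acc_a = sum(map(detects, test_a)) / len(test_a)
acc_b = sum(map(detects, test_b)) / len(test_b)
print(f"detection rate, group A: {acc_a:.0%}")   # high
print(f"detection rate, group B: {acc_b:.0%}")   # much lower
```

Nothing in the model is written to treat the groups differently; the gap falls out of the training mix alone, which is why auditing the data, not just the code, matters.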

Josh:

She's the founder of the Algorithmic Justice League. She's a computer scientist and award-winning researcher, and she's an advocate for minorities and people who are being marginalized by AI.

Joy Buolamwini:

And so that's when I started looking into issues of bias that can creep into technology.

Josh:

The documentary, I mean, people will point out, oh, it's a few years old, it's from 2020, right, and it's 2023 now, these things have been solved. And I think perhaps they've gotten better at scanning faces, no doubt, but have the problems been fixed? It's like AI is fixing problems and also creating problems at the same time. We're never really catching up, and a lot of the problems are unforeseen. It's not like anybody set out to do wrong by people, to do bad. They thought, oh, we will scan faces, it'll be so easy for you to register or be recognized, and we can save you time, save you energy. But, like in the documentary, they're pointing out how it doesn't recognize different skin tones. They're using it for, you know, profiling people in public with cameras, and it's just like, oh, that person could have been on this watch list, let's send some people to go pick them up, just walking down the street. So it happens that people, minorities, are more targeted, right? Or, I don't know.

Melissa:

I'll chime in for a minute here and just pause to say: I know a lot of people want to say minorities, but globally speaking, right, from a global perspective, people with brown skin are the global majority.

Melissa:

It's just in Eurocentric countries, like here and in Europe, that we kind of pride ourselves on, you know. We say, oh, white is the normal, right, and then everyone else is diverse, right, everyone else is the, you know, quote-unquote minority. Like, no one's setting out to create tools that oppress people, right? Like, that's never the intention. However, right, when we code something and we code it, you know, to be Eurocentric, just because that's the way that we were brought up doing it, right...

Melissa:

Yeah, and that's just the way it is. You know, when white is the default, you know, problems can arise. And then I think part of it is, when you set a bit of code and then it starts to, like, morph and change and show up here and get integrated into this thing and get integrated into that thing, it's like amplifying these problems, right? It's like repeating them on and on and on. You know, race is a social construct, right? There was a time when it was not an issue. And then, you know, there's the origins of racial hierarchy.

Melissa:

You know, that began as a worldview in the 18th century. You know, it's a set of culturally created attitudes and beliefs about human differences, and it's, you know, linked to the rise of capitalism in Western Europe, right? So all of these things are hardwired into us. And then it's like, hey, the internet, right? MySpace, Facebook, Twitter, like, all these cool things, right? We had the Arab Spring in 2012. Some really amazing things and great movements have happened, and the movements have been rooted in these social networks, social platforms. But also they've caused a lot of problems, right, and they've amplified a lot of hate speech and other messages. So it's a magical place, but also, you know, kind of full of inherent bias.

Josh:

You were saying, um, unconscious bias. You know, I have it, everybody has it. If you don't think you have it, it's unconscious; that's why it's called unconscious bias. It's in our history, right? And then all of history has been encoded on the internet, right?

Josh:

Everything that was written has been uploaded to the internet, and then the internet was scraped to train the AI models. So it wasn't as if they were created in, like, this pure worldview where bias didn't exist. It's a reflection of who we are, right? So it's like, we point the finger at ourselves, but we can't pretend that it's not a reflection of ourselves. There's a good quote that came from the movie: the past dwells in our algorithms. Basically, algorithms are using historical information to make predictions about the future. It's going to just keep making the same mistakes that we make, unless we actively try to change it.

Melissa:

Yeah, and I mean, that just kind of brings up another topic that I've looked into: the phrase technological redlining, as coined by Dr. Safiya Noble. And yeah, she starts to talk about all of these things that exclude people, whether on purpose or not. Right, like we talked about facial ID algorithms not being able to detect melanated skin. I'm sure you've seen, and if people haven't, we can leave some links for them to discover, but Google's autocomplete used to be wildly and incredibly racist. Someone did a study on it. If you type in "why do Black girls...", like, if you want to learn about how people view groups that they consider othered from themselves, and, honestly, kind of dehumanize, just type in, like, a certain group. You can type in, like, "why do Jews" or "why do Chinese people" or "why do women", you know, whatever. I haven't done it in a while because I don't want to be depressed, but there are some, like, horrible, horrible, you know, responses that you get.

Melissa:

So I think another issue that falls to the responsibility of the people creating this code and, like, creating AI and all these different search models and engines: you can't just set it and forget it. You can't just, like, set up the code and intend that everything is going to be fine and that there's no bias in it, and it's, you know, all great and good, and then we're just going to let it run and scrape the historically biased internet, right, put everything together, and then we're going to back away and say, well, the computer did that. Like, I think we need to really interrogate ourselves and the systems, right, to check it and double-check it, because we know now that if you just set it up and let it go, it's not good.

Josh:

Yeah. On top of that, I don't think people realize, I don't think I realized: AI and social media, like, it's not trained not to do bad things. It doesn't know not to do bad things. It's trained on the data. But then we have to have content moderators. Human beings have to go in and say, well, that's disturbing, that's illegal, you know, that's a terrible thing to say. It's not like the algorithm knows the difference. People have to tell it the difference. People have to be subjected to the worst of the internet so that other people can enjoy the sterilized version of the internet, or at least as sterile as it can be, so that it can at least project some good-enough version of it for us to enjoy as the general public and think, oh, it's fine. But it didn't know. It had to be told, and we have to keep telling it.

Melissa:

Yeah, I think it needs input and it needs moderation; it needs people to check it. I feel like for far too long we've had people like Mark Zuckerberg who set up this thing, rake in all the money, and then kind of take a step back, and they're like, oh oops, we didn't mean to break democracy, we didn't mean to interfere in the election, that wasn't our intention, it just kind of happened. But it's on your platform, it's on your watch, it's with the thing that you built. Take some responsibility. And now, I feel like there is the tiniest bit, with Twitter and Jack Dorsey, and with Facebook and Mark Zuckerberg, where it's like, oh no, this is out of control, we're not in charge of what's happening anymore. But I feel like they also can't just actively deny it and say, oh, I don't want to hear about that.

Melissa:

They need to take some ownership. So I would love to see people working within those organizations that are already huge really take proactive steps, and not just reactive PR steps, to kind of right their wrongs. And I think they're trying. I think some of the platforms are trying, maybe not as hard as they're trying to add revenue, but in some way they're kind of, sort of trying.

Josh:

Yeah. Sadly, the business model has a conflict of interest; that's part of the problem. And ideally we would have regulation that would help get us past that. Melissa, you told me about the Levi's brand AI thing. Do you mind sharing that story again for our listeners?

Melissa:

Yeah. So Levi's decided, hey, we are going to lean into the fact that most of the world, before too long, is going to be ethnically ambiguous, right? Like, mixed race, whatever. Like, there are going to be very few people who are, like, you know, just white. Right, everyone's going to be a mixture, and especially in the Bay Area. Right, like, you know, my child is mixed race, and it's like, you know, all of her friends, everyone in preschool, like, so many people, right. We do live in the Bay Area.

Melissa:

Yeah, we do, the Bay Area. So, yeah, also, I'll just side note to plug the documentary on HBO by Bay Area native W. Kamau Bell. It is called 1000% Me, and it is just delightful, and it interviews a lot of Bay Area kids and a lot of Bay Area parents that are multiracial, mixed race, and it's just really interesting, and it just makes you proud to be from the Bay Area. You're like, oh my gosh, you're producing all of these self-reflective, like, global citizens who have, you know, one parent from Ghana and one parent from Pakistan. And, like, Levi's decided, you know, the face of America, if you will, is changing, right? The future of America. People can't get enough of these racially ambiguous models, right? Are they Asian? Are they Black? Are they Hispanic? Like, you don't know, right? They're just beautiful people.

Melissa:

And so they decided to partner with an AI company, a digital fashion studio that creates lifelike fashion models, and to put out a catalog that was sustainable, in hopes of increasing the number and diversity of Levi's models. And they would, you know, use these AI-generated models and put the work out there. And, of course, everyone was like, why don't you actually just use real, actual, diverse people? Right? It's like, okay, we've been using too many, you know, white people, people in the global minority, we've been using too many of those people. So, instead of looking for, like, different casting and interrogating why that's our default, we'll just make some computer-generated AI people and bring them in. It was just very bizarre. And, like, how did that get this far?

Melissa:

Like, a press release actually went out. This wasn't, like, an idea that someone had that then got suppressed and only came out later. Like, they went as far as doing a press release to announce it and kind of pat themselves on the back. And yeah, there was a lot of backlash for that. Like, can you just hire diverse models? If it's a casting issue, can you get a different casting agency? Like, I think there's a meme that has been going around for a while, ever since Beyoncé did Coachella, and it's like, you know, people say they can't find diverse talent, talent that is, you know, Black and brown people. And it's like, Beyoncé found 20 Black trombone players for one weekend. Like, you can't find any different people than what you're used to looking for in your casting and talent?

Josh:

I don't know how they missed it, but I don't think they set out to do a bad thing. But again, it's, like, unforeseen problems. They thought, oh well, we can project this multicultural version of ourselves and this will help, it'll promote people looking more diverse and being more inclusive, right? All while we save money, or something like that. Or while we, what did they say?

Josh:

It was more sustainable, right, a sustainable way to show how inclusive we are. And before you even shared that with me, I had already started thinking about, like, how this would work. Like, I'm a, you know, video production art director guy. I'm thinking about, like, stock media, stock photo catalogs. You know, you go to Getty Images. You know, I'm half Chinese. So if I, like, you know, type in "Asian man", you know, it's like, oh okay, someone that looks like my uncle: Asian man doing this, doing that, barbecuing, you know, driving a car, working at a laptop, holding his phone, right? And there's a zillion different things you could look up, but in every single one of them there was an actual human being who stood in front of a camera. A photographer was there, took the picture, they edited it. It was a real point in time. So there was, you know, a real person, a model, who got paid.

Josh:

There was a real photographer that got paid. They had to license the rights for that image to be used. And then, you know, they're in a particular pose. But now, with AI, there doesn't have to be a camera, there doesn't have to be a person, but it's drawing off of all that work.

Josh:

It knows what people look like, and it's taking it. It looked at all of those giant databases of photos and just took it all and decided to make its own versions. So, Melissa, you're, like, a brand strategist. You are, you know, a social media strategist. People in your line of work would probably think, oh, this is fantastic, right? Like, what's the problem with trying to project being inclusive, saying it's all about diversity, equality, and inclusion? Like, that's very in, and anyone who wants to look like they're doing the right thing says so on social media. Any thoughts?

Melissa:

Yeah, I mean, the first thing is, right: be brave enough to have a conversation with someone who's underrepresented in your organization or on your team and ask what their experience has been like. You know, I've done that at a few places where I've worked, and it takes some courage. Even at the most, you know, striving-to-do-the-best places, whatever, there are still individual people who make mistakes. So, you know, you hear stories, or you, you know, talk to someone about, hey, what's it like to be the person in the room who comes to the DEI, the diversity, equity, inclusion, or whatever you call it, meeting, and then someone mentions something and two thirds of the people in the room kind of look to you to see what your reaction is or what your response is as, you know, a person of color. Or kind of look to you and, like, wait for you to answer. Or, you know, I talked to someone that was like, you know, sometimes I don't want to go to those meetings because I don't want people to look at me. And, like, I've talked to a few people that are like, I dread, like, Martin Luther King Day and Black History Month, because I feel like someone's going to come to me and ask me to essentially produce some sort of content, or make some sort of statement, or kind of be that, like, token representation. I guarantee you, if you're one of two or three people with a darker skin tone working at a company and they do social media posts and they want to shoot something and have something in-house, like, you're going to get hit up, right? Like, it's going to happen. I have done that. You know, it's like, hey, we know that we don't want to just use a bunch of white people's hands in this video that we're shooting, so, you know, can we get different people? I think, you know, that's something that's always at the forefront. But I also feel like we've gotten off topic.
So, anyway, I will steer us back to what I wanted to say, which is, I think one big area is in the content creator, like, influencer marketing space.

Melissa:

I think a lot of the people who get the big dollars, who get the big bucks to make content and do these, like, product endorsements, are usually white people, and there's a few people of color. But when you think of an influencer, if someone says "an influencer", what's the immediate thing to come to your mind? It's usually a white person first and foremost, and they're the ones that have the big representation and get the big money and get all the brand deals. I have seen way too many presentations, both on the brand side and the agency side, that are like, hey, we're going to present to you some optional influencers, right? Here's the topic, here's the brief, this is what we want the content to be, and we're going to work with influencers or contract creators, right? And it's, like, a slide, I'm sure we can all envision it. There's a white or black slide with five circles, and each of those circles is a person's face, and almost always three or four of them are white men, and then there's, like, two, right, maybe three. Usually there's a white woman, and then there's some other male that's, like, ethnically ambiguous or potentially a person of color, right? And when people look at this, they're like, oh yeah, we threw in a few options, a few diverse options.

Melissa:

My theory on that is, people working in social media and brands: your influencer balance of, you know, minorities or, you know, people of color, diversity, should go from zero to 100. So when you're looking for people to, like, represent your brand and talk about your brand... I feel strongly that marketing, and most marketing teams and advertising in general, has been dominated by white men since its inception. So just look for a different perspective. If you're looking at content creators, if you're looking at influencers, you're already looking for a different perspective on your company, you know, an outside perspective looking in. So, you know, don't make your creator mix 50-50, don't make it 75-25. Like, no, aim to make it, you know, 100% underrepresented voices in the space, especially if you're, like, an auto brand, a luxury brand. Like, just look for people who are underrepresented, and I guarantee you they are being their most authentic selves, and their follower base is probably incredibly loyal to them, right? Because there's something about this creator that they identify with, right, that they, like, see themselves in. And so, yeah, that's my soapbox about influencers and content creators.

Melissa:

You know, before, I was saying, if there's, like, two or three people of color that work at a company, they're going to be approached all the time to be in different pieces of content, especially if you're making things in-house. I think that's okay. But reach out to them and ask them about fashion week, or the Super Bowl, or productivity week, or what they're reading that they like. Like, just reach out to them and ask them normal questions. Right? Don't say, oh, we're going to ask, you know, all these other people, and then when it's Black History Month or Asian Pacific Islander Heritage Month, or, you know, one of these, like, specific months of the year, then we're going to go to that person from that group and we're going to ask them these questions about, like, oh, how do you feel about that? Like, no. You know, if it's a mom, like, ask her how it feels to be someone, you know, with a mom who is an immigrant, on Mother's Day. Right? Don't make it necessarily about their ethnicity.

Melissa:

I think for a lot of people that will come out in their story. You know, it doesn't have to be front and center. Go to these people, reach out to these people, champion these people 12 months out of the year. Then you're kind of doing the right thing, and don't make a big deal out of it. I feel like a lot of companies and marketers want to, like, pat themselves on the back, right, for, like, ceding some space, like, backing off and saying, hey, I usually have all the reins and I have all the control, I'm your white savior here, like, I'm your white knight, and I'm going to invite you, like, poor person of color, to be seen and, like, you know, contribute to this, and I'm stepping aside to let you. It's like, if you are kind of making yourself the hero in this, especially for, like, PR or whatever other reason, it's still not doing the right thing, and it's still, like, incredibly awkward and complicated and gross for the person that you're essentially manipulating. And all of this to say: can you go into the, like, TikTok creator studio thing and say, hey, I want to spend X amount of money on ads and I want it to come from creators, not from, like, my brand?

Melissa:

Can you just kind of go in and set it and forget it and then just, like, get some content sent back to you? Yeah, totally, right, you can do that. But, like, can you sort of interrogate the process a little bit? If the TikTok creator studio comes back to you and just has, you know, white men as the predominant option, say, like, hey, can we look at some different people? Can we look at some additional creators? Like, do you have anyone that's not the typical...? And we think, man, that's really a disappointment to see.

Melissa:

But no one stops and says, like, why is this our default? Why is it so special and so "other" to invite someone else to be part of this, or to purposefully look for them? I think that's another thing: a lot of people don't want to say, we're looking for Black female creators to be part of this. It's like, oh, we have to pretend that we're all colorblind and no one sees anything, and whatever, we can't mention race, we can't mention this. But that's part of the problem.

Melissa:

If, just like with the AI stuff, we set it and forget it and kind of walk away, then the robots are going to do what the robots have always done. You know, input and output. It's not going to just fix itself. I don't know that much about coding, but I do know, and I think we both know this from working in, like, product marketing, that when there's an error in the code, or there's an issue, or there's a bug, you don't just keep going and say, like, oh, you know, it'll probably fix itself, it'll probably iron itself out. Like, no, someone has to go back and pinpoint: what's the issue?

Melissa:

What's the problem? What's going wrong, right? And, like, how is it causing this issue? We would never just expect the code to kind of straighten itself out or fix itself. A real person has to, like, test it and look at it and make sure it runs and put it in the right place. People should probably see themselves more as, like, question-askers and curators. Those are two things that AI will never be able to replace. You know, the other side is the responsibility, right? To use it responsibly and use it thoughtfully and interrogate your own bias. Like, you know, I think we had talked about the social media front of, you know, the faces and the skin tones that are representing your brand, or the influencers that you choose to work with.

Melissa:

But, you know, in the coding, in the writing of the email, right, in the market research, I think all of those areas need to sort of expand their thinking. Bring in people from the outside, at least if you can't hire; I know a lot of teams are running very thin right now and, like, running on very small margins. At least with your testing, right, be incredibly intentional about who you're bringing in to test your product or who you invite into your betas. I think there's a lot of ways that that can come to life beyond just the front lines of visible communications that go up.

Josh:

Melissa, thank you for your time. I just want to have, like, one kind of parting question or thought that I usually ask my guests. You know, you being a brand strategist, and all of our talk about inclusion and algorithmic bias, we talked a lot about, like, being transparent. You know, what are the moral and legal ramifications? I don't know where you want to take that, but there's, like, a question there.

Melissa:

Yeah, I think that, you know, if you are using AI for your work, if you do use AI-generated images, I think that, you know, it's your responsibility to say, like, you know, as a marketer, hey, we've been playing around with this, right, and this is what we produced, right, this is what came out of it. Or, hey, putting a call out to your audience, you know, depending on what your brand is, like, the brand personality, brand tone, et cetera, let's say, like, hey, you know, what should we generate? What are some prompts we could use? I think it always needs to tie back to a human, right? A human perspective, a human story. That's my closer.

Josh:

Thank you for your time and for sharing your experience.

Melissa:

It was great. Thanks for having me.

Josh:

Well, that wraps up the interview. Melissa, thank you so much for sharing your experience and perspective with us. You can follow her on Instagram at shortformelissa, and on LinkedIn, that's linkedin.com/in/melsantiago. There will be links in the show notes. If you found this episode to be encouraging or informative, please share it with a friend, and don't forget to rate us in the App Store. It helps a lot. All right, and with that, I hope we can all remember that you are a human being with intrinsic worth. Until next time.

AI Bias in Brand Marketing
Levi's AI Model Backlash
Bias in Influencer Marketing