Why Salesforce’s Kathy Baxter says diversity and inclusion efforts aren’t enough

At this year’s Transform we’re stepping up our efforts to build a roster of speakers that reflects the diversity in the industry and highlights the work of leaders who are making a difference.

Among them is Kathy Baxter, Principal Architect, Ethical AI Practice at Salesforce. Baxter began working on AI ethics at the company in 2016, and in 2018 she pitched the role of AI Ethicist to its chief scientist, who pitched it to the CEO; six days later, it was official. We were excited for the opportunity to speak with her about what the role entails, her thoughts on how the industry is changing, and why focusing on diversity, equity, and inclusion (DE&I) efforts isn't enough.

See the first two in the series: Intel's Huma Abidi and Redfin's Bridget Frey. More to follow.

VB: Could you tell us about your background, and your current role at your company?

I received a BS [Bachelor of Science] in Applied Psychology and an MS [Master of Science] in Engineering Psychology/Human Factors Engineering from GA [Georgia] Tech. The degrees combine social science with technology, with a strong foundation in research ethics.

I started working on AI ethics “on the side” at Salesforce in 2016, and by 2018, I was working the equivalent of two full-time jobs. I pitched a full-time role of AI Ethicist to our Chief Scientist at the time, Richard Socher, in August of 2018. He agreed this was needed and pitched it to our CEO, Marc Benioff, who also agreed, and six days later, it was official.

My colleague, Yoav Schlesinger, and I partner with research scientists and product teams to identify potential unintended consequences of the AI research and features they create. We work with them to ensure that the development is responsible, accountable, transparent, and inclusive. We also work to ensure that our solutions empower our customers and society. It’s not about AI replacing humans but helping us create better solutions where it makes sense. That means we also want to avoid techno-solutionism, so we always ask not just “Can we do this?” but “Should we?”

We also work with our partners and customers to ensure that they are using our AI technology responsibly and with our government affairs team to participate in the creation of regulations that will ensure everyone is creating and using AI responsibly.

VB: Any woman in the tech industry, or adjacent to it, is already forced to think about DE&I just by virtue of being “a woman in tech” — how has that influenced your career?

I have always participated in DE&I events at the places I have worked, whether educational or recruiting events. I’ve also facilitated courses focused on skills to help URMs [underrepresented minorities] advance to higher levels in companies where we’ve seen a large drop-off.

The last few years, though, I have stepped away from those efforts because I don’t believe they actually address the root cause of the lack of diversity and inclusion. Holding recruiting events or teaching people in underrepresented groups skills to deal with systemic bias frames this as a pipeline problem, or implies that the people facing bias are responsible for fixing it.

In my experience, both of these premises fail to address the most serious cause of the lack of diversity: the inherent bias of those with the power to decide who is hired, how people are treated once they are hired, and who gets promoted.

For every role I have any contact with, I look for opportunities to ensure that we reach out to as wide a field of candidates as possible and that we are aware of our biases during hiring and promotion discussions, and I always try to be the person who speaks up when I hear or see non-inclusive behavior happening. It’s about calling people in, not out.

That means reminding people when we talk about things as simple as project names: “That’s another male scientist’s name. How about a female scientist’s name, or avoiding gendered names altogether?” Or looking around the room in important meetings and observing out loud, “Wow. This is a pretty homogenous group we have here. How can we get some other voices involved?”

I also believe in the importance of mentoring and sponsoring others. When I know of brilliant folks with relevant expertise who aren’t in the room, in a document, or on an email thread — perhaps because they are junior or aren’t connected with the particular project at hand — I make sure to mention their names and bring them in. It takes work to make sure that hierarchy or organizational charts don’t prevent us from having the best people in discussions, but it is worth it for everyone.

VB: How do you see the industry changing in response to the work that women, especially Black and BIPOC women, are doing on the ground? What will the industry look like for the next generation?

The ethics-in-tech work, especially ethics in AI, is largely driven by women and BIPOC, since they are the ones harmed by non-inclusive practices and products. It’s taken a long time, but it’s gratifying to see the work of Joy Buolamwini and Timnit Gebru on bias in facial recognition technology [FRT] being broadly consumed by regulators, technology creators, and even consumers, thanks to the “Coded Bias” documentary on Netflix.

We still have a long way to go, as FRT is increasingly being used in harmful ways, and there is no transparency or accountability when harm is found.

I’m also excited to see more and more students graduating from technology programs with a better understanding of ethics and responsibility. As they become a larger part of tech companies, my hope is that we will see a demise of dark design patterns and a greater focus on helping society, not just making money off of it.

This won’t be sufficient on its own, so we need meaningful regulation to stop irresponsible companies from racing to the ethical bottom in the pursuit of profits by any means necessary. We need more women, LGBTQ+, Black, and BIPOC members in government, civil society, and leadership positions at all companies to make significant changes.

[Baxter’s talk is just one of many conversations around diversity and inclusion at Transform 2021 (July 12-16). On July 12, we’ll kick off with our third Women in AI breakfast gathering. On Wednesday, we will have a session on BIPOC in AI. On Friday, we’ll host the Women in AI awards. Throughout the agenda, we’ll have numerous other talks on inclusion and bias, including with Margaret Mitchell, a leading AI researcher on responsible AI, as well as with executives from Pinterest, Redfin, and more.]