
What is Data Equity?

And why should we care about it?



Meghan Wenzel


In the wake of racial unrest in the last year and a half, there have been renewed conversations around race, bias, and equity in all facets of society.

As companies roll out new technology in facial recognition, policing, housing, immigration, healthcare, and education, we often hide behind proprietary algorithms and their soothing “objectivity.” However, this is a false sense of security: the algorithms themselves encode bias, from the people who designed them to the historical data they’re trained on.

LA Tech4Good brought a panel on Elevating Data Equity in Practice to Data Con LA 2021. Rachel Whaley, LA Tech4Good’s Data Equity Program Manager, moderated the discussion with panelists Dr. Eboni Dotson, Maria Khan, Kathryn Wolterman, and Eva Pereira.

All are leading data equity work in their sectors and have participated in LA Tech4Good’s data equity workshops. Below is a summary of the topics covered and the insightful perspectives the panelists shared.

Defining data equity

To frame the discussion, Whaley started with a definition of data equity: “Data is a thing we make and put to use. We can make and use it differently.” She noted that data is not inherently objective: at each phase someone made decisions about whom to sample, how to collect the data, how to store it, how to analyze it, and how to use it.

Additionally, the people analyzing the data have their own inherent biases, which color their interpretation and conclusions. We need to recognize this subjectivity and center equity in each of these decisions in order to reduce and prevent harm.
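
To make that subjectivity concrete, here is a minimal sketch (my own illustration, not something discussed on the panel) of how a single collection decision, whom to sample, changes the number that comes out the other end. The groups, rates, and sample sizes below are entirely made up.

```python
import random

random.seed(0)

# Hypothetical population: 70% of people belong to group A and 30% to group B,
# and the two groups have different true rates of some outcome we care about.
# All names and numbers here are invented purely for illustration.
population = (
    [("A", random.random() < 0.20) for _ in range(7_000)]
    + [("B", random.random() < 0.60) for _ in range(3_000)]
)

def outcome_rate(sample):
    """Share of sampled people with the outcome."""
    return sum(outcome for _, outcome in sample) / len(sample)

# Decision 1: sample at random across the whole population (roughly representative).
representative = random.sample(population, 1_000)

# Decision 2: sample where it's convenient, which here means mostly group A.
convenient = (
    random.sample([p for p in population if p[0] == "A"], 900)
    + random.sample([p for p in population if p[0] == "B"], 100)
)

print(f"representative sample rate: {outcome_rate(representative):.2f}")  # ~0.32
print(f"convenience sample rate:    {outcome_rate(convenient):.2f}")      # ~0.24
# Same population, same analysis code: the second estimate understates the
# outcome because group B was barely sampled.
```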

Understanding data equity in practice

The discussion was organized around the following questions:

What area do you see as the highest priority for the data community to focus on regarding data equity?

Dr. Dotson explained that we first need to understand what equitable data practices actually mean. She said they’re complex and multifaceted, but they ultimately refer to the ways in which data is collected, analyzed, and interpreted. She noted that marginalized communities present themselves through data, and we need to home in on this and change it.

She called out that data isn’t objective; it can create or perpetuate power dynamics. We need to ask: Who are we collecting data for? Why are we seeking it? What’s the dynamic between the question asker and the question answerer? Equity needs to be addressed at each stage of the data lifecycle.

Pereira elaborated on data ethics, noting that it encompasses the behaviors and norms that guide how we collect, manage, and use data. She argued that our goal should be to uphold and protect civil liberties, minimize risk to individuals, and maximize the public good.

She recommended being aware of relevant regulations, acting with integrity, and upholding professional standards when working with data. She also suggested establishing a clear process for reporting data ethics violations and concerns.

She shared the federal data ethics framework, arguing we all need to be accountable and transparent. Who are the data stakeholders? How will the data be used? Who will be impacted by the data? Having all stakeholders sign data use agreements can help ensure everyone is clear on how the data will be used.

Where do you see the biggest opportunity in your field to implement better practices around data equity?

Wolterman drew on her experience at a large global corporation with roughly 80,000 employees around the world. At that scale, she argued, promoting education around data ethics and equity can be extremely impactful.

During the LA Tech4Good workshop, she created a customized four-week workshop with a mix of lectures and hands-on activities to help employees learn about data ethics and apply it to their daily work.

Khan asserted that we need to find a balance between equity and trust. She argued that we need to present ourselves not just as data holders but as data sharers. We need to engage with the community, bring them into the data world, and give them equal voice and representation.

Whaley agreed, saying democratizing data can help build transparency. She noted that we need to make sure all stakeholders have some level of data literacy and access to the relevant tools and data.

What are the biggest barriers you’ve seen or anticipate as you implement ethical data practices?

Dr. Dotson explained how emerging tech has the potential to reinforce existing inequalities. AI, for example, can reinforce racial and ethnic disparities because models are trained on historical data that reflects those disparities.
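
Her point about historical training data is easy to demonstrate with a small sketch. Everything below is hypothetical: the groups, the scores, and the approval rule are invented solely to show how a model trained on biased past decisions learns to repeat them.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical applicants: a "qualification" score that is unrelated to group
# membership. The groups, thresholds, and feature names are invented to show
# the mechanism only; they do not describe any real system.
rng = np.random.default_rng(42)
n = 10_000
group = rng.integers(0, 2, size=n)               # group 0 or group 1
score = rng.normal(loc=0.0, scale=1.0, size=n)   # same distribution for both groups

# Historical decisions were biased: group 1 was held to a higher bar.
historical_approval = (score > np.where(group == 1, 0.8, 0.0)).astype(int)

# Train a model on that biased history.
X = np.column_stack([score, group])
model = LogisticRegression().fit(X, historical_approval)

# The model reproduces the historical gap, because the gap is exactly
# what it was trained to predict.
pred = model.predict(X)
for g in (0, 1):
    print(f"predicted approval rate, group {g}: {pred[group == g].mean():.2f}")
```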

She also noted how lack of trust impedes our ability to get solid data, which is one of the biggest barriers to implementing equitable practices within healthcare.

She shared that within healthcare they often say “what gets measured gets improved,” but what we’re measuring is often an incomplete picture. While the data gap is complex, she said we need to advocate for strategies to collect more representative data.
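
A simple, concrete version of that advocacy is to routinely check who is actually in a dataset against an external benchmark, such as census shares. The sketch below is my own illustration, with made-up groups and counts, showing the shape of such a check.

```python
# Compare the demographic mix of a dataset against an external benchmark.
# All group names and numbers below are made up for illustration.
benchmark = {"Group A": 0.45, "Group B": 0.30, "Group C": 0.25}  # e.g. census shares
collected = {"Group A": 620, "Group B": 290, "Group C": 90}      # records in our dataset

total = sum(collected.values())
print(f"{'group':<10}{'dataset':>10}{'benchmark':>12}{'gap':>8}")
for name, expected_share in benchmark.items():
    observed_share = collected.get(name, 0) / total
    gap = observed_share - expected_share
    print(f"{name:<10}{observed_share:>10.2f}{expected_share:>12.2f}{gap:>+8.2f}")
# Group C is 25% of the benchmark but only ~9% of the data collected,
# a gap worth flagging before drawing any conclusions.
```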

Wolterman argued that you can’t simply approach data equity as a “bolt-on” effort. You have to build it into an organization’s core business strategy and company values in order to make authentic change and lasting progress. At CGI, she started with the educational element.

From there, she developed steering committees to get key stakeholders involved and attract executive sponsors. She shared that having leadership on board was fundamental to success.

Community activists often struggle to get access to data, while big companies have massive amounts of data. How do we address this?

Pereira shared that her team helps manage California’s open data portal. She acknowledged that not everyone has the technical skills to review data, so her team developed a new role — the data liaison.

They created a whole education program around this, meeting with community members to understand what questions they have. Pereira’s team helps them find the data, plug it into a tool, and analyze it, in order to build the data capacity of neighborhood councils.

On the other hand, they’ve developed another program, Know Your Community, to provide insight-ready information via an app. They conducted research to find out people’s most common questions and how to deliver insights in an accessible way. By coming at the problem from both sides, they’ve provided workshops and education to build knowledge around data, as well as accessible, insight-ready tools.

What are actionable next steps that we can take?

Khan said to start by asking who your data is about. From there, learn about your community. Who’s behind the data set you’re looking at? Learn about those communities on all different levels.

Dr. Dotson urged individuals to educate themselves on what data equity is and what it means, especially in the space they operate in most often. She suggested checking out The Ethical Algorithm as well as Chicago Beyond’s equity series including Why am I being researched?

Pereira suggested you create a data ethics committee at your organization. This can help you start to check yourself — question your assumptions, review your methodology, and discuss if it’s the right approach. This also creates a space to discuss other data ethics issues with colleagues.

Wolterman recommended watching Coded Bias for a great look at algorithmic injustice present in our world today. She noted that the documentary includes interviews with several experts in the field, and you can explore their work as well. Additionally, she recommended LA Tech4Good’s data equity workshop as a great hands-on and supported approach.

Overall it was a great discussion. Data equity is a complex and important topic, and we need to work together to drive it forward. Education is the first step. We need to recognize that data isn’t objective: algorithms reflect the historical data they were trained on, which is rife with inequalities, and the data itself carries the inherent biases and perspectives of the people who collected and analyzed it.

From there, we need to continue the discussion and implement change. Developing data ethics committees, advocating for data regulations, and implementing standards and processes at our organizations are just the beginning.



Created by

Meghan Wenzel

As a User Researcher and Strategist, I help companies solve the right problems and build more relevant, efficient, and intuitive products. I started my UX career at a Fortune 500 company, and I've since helped establish the research practice at three B2B startups. I'm currently a Senior User Researcher at Unqork, the leading enterprise no-code platform.

