Content provided by the Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by the Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons, or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://zh.player.fm/legal

Natalie Evans Harris | Creating A Shared Code Of Ethics To Guide Ethical and Responsible Use of Data

30:31
 
 

Manage episode 264308732 series 2706384

During her career at the National Security Agency, on Capitol Hill, and at the White House, Natalie Evans Harris saw that while we collected troves of data, we didn't have strong frameworks and governance in place to protect people in a data-driven world. “Data has been used to intrude in our lives. Things are happening based upon data that nobody communicated to the public was actually happening,” she explained during a conversation with Margot Gerritsen, Stanford professor and host of the Women in Data Science podcast.

Data ethics and responsible use of data are essentially about building trust. There's a gap in understanding what sharing data means. Two things have to happen if we're going to build a relationship where people allow their data to be used by a company. Individuals have to trust that what the company is doing with that data is something they're okay with. And the company has to be able to prove that it's being responsible with the use of the data. A company could have the best products out there, but if people don't trust it or understand what it's doing with their data, they're not going to trust it to use that data. And then innovation stops.

She believes the biggest problem is that we do not have a shared vision of what ethical practices mean. We don't want to put broad-impact laws in place to govern responsible use of data while we're still trying to define that vision. To change business practices, we have to change company expectations so that companies are not only incentivized to be ethical and responsible in their business models, but also penalized when they violate those expectations.

Harris has been advocating for a data science “code of ethics” to create a shared vision that guides our behaviors and around which best practices can develop. Companies are now taking this code of ethics and personalizing it to their businesses around principles like informed consent, transparency, fairness, and diversity. They then publicize the practices they're putting in place to align with those principles. That's how you start to create that shared vision.

She sees a transformation happening in the relationship between technology and people. For so long, technology has been a very passive thing in our lives, and now, with AI, machine learning, and all of these uses of data and technology, there's a tension around what technology can do and what humans should do. Until people know and understand what is happening with their data, and until companies can thoughtfully express what they're doing with the data in a transparent fashion, we will continue to have this tension. She hopes that this code of ethics can start to ease it.

RELATED LINKS
Connect with Natalie Evans Harris on Twitter (@QuietStormnat) and LinkedIn
Find out more about Natalie on her personal website
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Find out more about Margot on her personal website
Read more about BrightHive and Beeck Center

Listen and Subscribe to the WiDS Podcast on Apple Podcasts, Google Podcasts, Spotify, Stitcher


