Kate Crawford Quotes
Most Famous Kate Crawford Quotes of All Time!
Nov 07, 2020 | Last Updated on May 30, 2021
We have created a collection of some of the best Kate Crawford quotes so you can read and share them anytime with your friends and family. Share our Top 10 Kate Crawford Quotes on Facebook, Twitter, and Pinterest.
It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science.
Surveillant anxiety is always a conjoined twin: The anxiety of those surveilled is deeply connected to the anxiety of the surveillers. But the anxiety of the surveillers is generally hard to see; it's hidden in classified documents and delivered in highly coded languages in front of Senate committees.
We urgently need more due process with the algorithmic systems influencing our lives. If you are given a score that jeopardizes your ability to get a job, housing, or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision.
Big Data is neither color-blind nor gender-blind. We can see how it is used in marketing to segment people.
Sexism, racism, and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many 'intelligent' systems that shape how we are categorized and advertised to.
Like all technologies before it, artificial intelligence will reflect the values of its creators. So inclusivity matters - from who designs it to who sits on the company boards and which ethical perspectives are included.
Histories of discrimination can live on in digital platforms, and if they go unquestioned, they become part of the logic of everyday algorithmic systems.
We need to be vigilant about how we design and train these machine-learning systems, or we will see ingrained forms of bias built into the artificial intelligence of the future.
Many of us now expect our online activities to be recorded and analyzed, but we assume the physical spaces we inhabit are different. The data broker industry doesn't see it that way. To them, even the act of walking down the street is a legitimate data set to be captured, catalogued, and exploited.
We need a sweeping debate about ethics, boundaries, and regulation for location data technologies.
The promoters of big data would like us to believe that behind the lines of code and vast databases lie objective and universal insights into patterns of human behavior, be it consumer spending, criminal or terrorist acts, healthy habits, or employee productivity. But many big-data evangelists avoid taking a hard look at the weaknesses.
Numbers can't speak for themselves, and data sets - no matter their scale - are still objects of human design.
Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.
While many big-data providers do their best to de-identify individuals from human-subject data sets, the risk of re-identification is very real.
If you're not thinking about the way systemic bias can be propagated through the criminal justice system or predictive policing, then it's very likely that, if you're designing a system based on historical data, you're going to be perpetuating those biases.
Data will always bear the marks of its history. That is human history held in those data sets.
If you have rooms that are very homogeneous, that have all had the same life experiences and educational backgrounds, and they're all relatively wealthy, their perspective on the world is going to mirror what they already know. That can be dangerous when we're making systems that will affect so many diverse populations.
We should have equivalent due-process protections for algorithmic decisions as for human decisions.
Data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations.
Hidden biases in both the collection and analysis stages present considerable risks and are as important to the big-data equation as the numbers themselves.
While massive datasets may feel very abstract, they are intricately linked to physical place and human culture. And places, like people, have their own individual character and grain.
As we move into an era in which personal devices are seen as proxies for public needs, we run the risk that already-existing inequities will be further entrenched. Thus, with every big data set, we need to ask which people are excluded. Which places are less visible? What happens if you live in the shadow of big data sets?
When dealing with data, scientists have often struggled to account for the risks and harms using it might inflict. One primary concern has been privacy - the disclosure of sensitive data about individuals, either directly to the public or indirectly from anonymised data sets through computational processes of re-identification.
Only by developing a deeper understanding of AI systems as they act in the world can we ensure that this new infrastructure never turns toxic.
Error-prone or biased artificial-intelligence systems have the potential to taint our social ecosystem in ways that are initially hard to detect, harmful in the long term, and expensive - or even impossible - to reverse.
As AI becomes the new infrastructure, flowing invisibly through our daily lives like the water in our faucets, we must understand its short- and long-term effects and know that it is safe for all to use.
The fear isn't that big data discriminates. We already know that it does. It's that you don't know if you've been discriminated against.
We should always be suspicious when machine-learning systems are described as free from bias if they have been trained on human-generated data. Our biases are built into that training data.
There is no quick technical fix for a social problem.
With big data comes big responsibilities.
There's been the emergence of a philosophy that big data is all you need. We would suggest that, actually, numbers don't speak for themselves.
Big data sets are never complete.