As AI becomes the new infrastructure, flowing invisibly through our daily lives like the water in our faucets, we must understand its short- and long-term effects and know that it is safe for all to use.
Data will always bear the marks of its history. That is human history held in those data sets.
We should always be suspicious when machine-learning systems are described as free from bias if they have been trained on human-generated data. Our biases are built into that training data.
Data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations.
Self-tracking using a wearable device can be fascinating.
Big data sets are never complete.
When dealing with data, scientists have often struggled to account for the risks and harms that its use might inflict. One primary concern has been privacy: the disclosure of sensitive data about individuals, either directly to the public or indirectly from anonymized data sets through computational processes of re-identification.
Many of us now expect our online activities to be recorded and analyzed, but we assume the physical spaces we inhabit are different. The data broker industry doesn't see it that way. To them, even the act of walking down the street is a legitimate data set to be captured, catalogued, and exploited.
Vivametrica isn't the only company vying for control of the fitness data space. There is considerable power in becoming the default standard-setter for health metrics. Any company that becomes the go-to data analysis group for brands like Fitbit and Jawbone stands to make a lot of money.
Rather than assuming Terms of Service are equivalent to informed consent, platforms should offer opt-in settings where users can choose to join experimental panels. If they don't opt in, they aren't forced to participate.