Jennifer Chandler's Post

Privacy of Brain Data

There has been a lot of noise this month about California's amendment to consumer privacy laws to make it clear that data about brain activity will be protected as "sensitive personal information." I've had the chance to speak about this and other legal aspects of brain data privacy (structural information as well as information about brain activity) in the English- and French-language media over the last month (see below).

There is a range of views on whether existing and near-term neurotechnologies pose a threat to privacy - these views run from "nothing to see here" all the way to "this can read your mind." Ethicists have an incentive to find and point out interesting things to be worried about, and perhaps sometimes this gets overheated. At the same time, note an old warning from Plato in the Phaedrus (Socrates is recounting a discussion on whether the human faculty of memory would be destroyed by the innovation of writing): "the parent or inventor of an art is not always the best judge of the utility or non-utility of his own inventions to the users of them." The developers of new tools may not see - or have an incentive to see - the full path that their new tools will take and how they will be repurposed.

As is usually the case, we are at present somewhere in the middle, and whether there's a serious issue depends upon which kind of brain data, what type of inference is to be drawn, and for what reason. Also - relying upon consumer knowledge and consent to protect privacy is demonstrably not very effective. Keep in mind as well that in some contexts, the freedom to choose whether or not to share data is greatly constrained.

In my view from the mushy middle - it is wise to try to anticipate in a reasonable manner. We don't want to throw the baby out with the bathwater by raising problems that will damage a promising field, and we don't want to sleepwalk into problems that could be headed off ahead of time. Do we need laws on this? In some places, existing privacy laws already capture brain data as personal information. We will likely see more legislative change in the coming months and years.

Goodyear S. Why your brain could be the next frontier of data privacy. As It Happens. CBC. (11 October 2024) https://lnkd.in/eBCMZfeW
Mullin E. I tried these brain-hacking headphones that claim to improve focus. (24 September 2024) https://lnkd.in/epn56yC5
Robitaille-Grou P. Neurotechnologies commerciales : un far west à l'abri des normes médicales [Commercial neurotechnologies: a Wild West beyond the reach of medical standards]. Radio-Canada (29 August 2024) https://lnkd.in/emWfBC4b
California law here: https://lnkd.in/eHitYyre
More Relevant Posts
-
This morning, I published an article for Tech Policy Press exploring California's groundbreaking legislation to classify neural data as sensitive personal information. This amendment to the California Consumer Privacy Act (CCPA) is a significant step forward in addressing the #ethical and #legal challenges posed by #neurotechnology, a rapidly growing field that bridges the gap between human cognition and technology.

With companies like Neuralink, EMOTIV, and Kernel advancing brain-computer interfaces and consumer neurodevices, the collection of brain-generated #data is no longer science fiction. This data offers incredible promise but raises pressing questions about #privacy, #autonomy, and the future of human dignity in a tech-driven world.

In my article, I delve into:
✔️ The implications of California's SB 1223 for businesses and consumers
✔️ How neural data privacy compares globally, including initiatives in Chile and under the GDPR
✔️ The ethical concerns of tracking and analyzing brain activity in real time
✔️ Potential future directions for neural data legislation and the neurorights movement

This legislation not only strengthens consumer protections but also sets the stage for a global conversation on how we balance innovation with safeguarding mental privacy.

📖 Read the full article here: https://lnkd.in/gb-zQtnR

Let's discuss: What do you think the future of neurotechnology and privacy should look like?

#DataPrivacy #Neurotechnology #NeuralData #CCPA #TechPolicy #Innovation #PrivacyRights
Neural Data and Consumer Privacy: California’s New Frontier in Data Protection and Neurorights | TechPolicy.Press
techpolicy.press
-
🧠 A Step Forward in Brain Data Protection 🧠

✨ Exciting news from Colorado! Governor Jared Polis just signed a landmark bill into law, extending privacy rights to neural data: data derived from our brains, which technology companies have been increasingly coveting. This is a major step forward in protecting our most intimate biological and neural information.

𝗛𝗲𝗿𝗲'𝘀 𝘄𝗵𝗮𝘁 𝘆𝗼𝘂 𝗻𝗲𝗲𝗱 𝘁𝗼 𝗸𝗻𝗼𝘄:

𝙉𝙚𝙬 𝙋𝙧𝙤𝙩𝙚𝙘𝙩𝙞𝙤𝙣𝙨: Colorado's new law treats brainwave data with the same stringency as biometric data, such as fingerprints and facial images, under the Colorado Privacy Act. 💪

𝘾𝙤𝙣𝙨𝙪𝙢𝙚𝙧 𝙋𝙧𝙤𝙩𝙚𝙘𝙩𝙞𝙤𝙣: This law specifically targets consumer neurotechnology companies (recent advancements by Neuralink no doubt played a role), setting strict regulations on how they handle neural data, including banning undisclosed data sharing. Additionally, consumers have the right to access, delete, and correct their brain data, and to opt out of its sale or use for targeted advertising.

𝙂𝙡𝙤𝙗𝙖𝙡 𝙏𝙧𝙚𝙣𝙙: This initiative isn't isolated. California and Minnesota are pushing for similar legislation. Internationally, countries like Chile, Brazil, Spain, Mexico, and Uruguay have either passed similar protections or are making moves to do so.

This development resonates deeply with me. I was first drawn to privacy because I was intrigued by the potential for companies to use our personal data to subtly influence our behavior and compromise our autonomy. My friends thought I was paranoid, but my concerns were validated by Shoshana Zuboff in her insightful book 'The Age of Surveillance Capitalism' (a book I highly recommend reading). 📚 I also highly recommend reading Nita Farahany's book, 'The Battle for Your Brain', which highlights how neurotechnologies (technology that can interpret your mental and emotional state from your brain activity) can seriously threaten our fundamental human rights to privacy, freedom of thought, and self-determination if left unchecked.

𝗟𝗶𝗺𝗶𝘁𝗮𝘁𝗶𝗼𝗻𝘀 𝗮𝗻𝗱 𝘁𝗵𝗲 𝗣𝗮𝘁𝗵 𝗙𝗼𝗿𝘄𝗮𝗿𝗱

Despite the positives, the law has its limitations. It primarily targets neurotechnology companies collecting data to identify individuals, which is just the tip of the iceberg. As Nita Farahany pointed out in a recent New York Times article (which I'll link in the comments), most neurotech applications aim to infer thoughts or feelings rather than identify individuals, suggesting a need to broaden these protections.

𝗟𝗲𝘁'𝘀 𝗱𝗶𝘀𝗰𝘂𝘀𝘀: This law is a step toward respecting the privacy of our innermost selves. As technology evolves, so too should our laws to guard our personal and sensitive information. What are your thoughts on this groundbreaking legislation?

#DataPrivacy #NeuroRights #TechnologyEthics #Privacy #DigitalPrivacy
-
Attendees of the Privacy Symposium 2024 can hear from partner Patrick Van Eecke on June 13 as he speaks on #AI, data ownership and intellectual property rights. Cooley partner Travis LeBlanc will also be leading a session on privacy in democratic societies on June 10 and will share insights during a panel on internet governance on June 12. 📆 More information at the link below.

For more AI resources and to see the team's full capabilities in this evolving space, visit our #CooleyAI page here: https://bit.ly/CooleyAI

#AI #artificialintelligence
Privacy Symposium 2024 // Cooley
cooley.com
-
3 things privacy professionals should consider at the intersection of AI and data privacy

The Checks team shares three things that privacy professionals should consider when thinking about AI and privacy in 2024.
3 things privacy professionals should consider at the intersection of AI and data privacy
blog.google
-
As organisations, public and private, make more and more use of #generativeai, that doesn't mean they can sidestep their responsibilities to collect information with consent and to use it for the purpose it was collected for!
In Internet New Zealand's research, almost half of respondents were more concerned than excited about AI. Privacy Commissioner Michael Webster says while AI may be new and unknown, it’s not an unregulated area. Read our full media statement: https://lnkd.in/g-thbu5m
Office of the Privacy Commissioner | New Zealanders more concerned than excited by AI
privacy.org.nz
-
Findings from the Office of the Privacy Commissioner, New Zealand highlight the need to develop and deploy artificial intelligence (AI) with greater transparency, fairness, and accountability in AI algorithms and decision-making processes.

#responsibleai #artificialintelligence #artificialintelligenceforbusiness #aiforbusiness #aialgorithms #datamodeling #machinelearning #aiml #privacylaw #dataprivacy #informationgovernance #datagovernance
In Internet New Zealand's research, almost half of respondents were more concerned than excited about AI. Privacy Commissioner Michael Webster says while AI may be new and unknown, it’s not an unregulated area. Read our full media statement: https://lnkd.in/g-thbu5m
Office of the Privacy Commissioner | New Zealanders more concerned than excited by AI
privacy.org.nz
-
For Your Consideration

"Our market choices—what we see, choose, and click—are scripted and arranged in advance," said Dr. Ryan Calo, University of Washington School of Law and Co-Director of the University's Technology Lab, in his testimony. "Many AI techniques boil down to recognizing patterns in large data sets. Even so-called generative AI works by guessing the next word, pixel, or sound in order to produce new text, art, or music. Companies are increasingly able to use this capability to derive sensitive insights about individual consumers from public or seemingly innocuous information… The ability of AI to derive sensitive information such as pregnancy or mental health based on seemingly non-sensitive information creates a serious gap in privacy protection."

https://lnkd.in/e-KgxFpW
Experts Warn Senators AI is Upping the Urgency for Federal Privacy Law
commerce.senate.gov
-
Isha Marathe reports:

On Monday, California Gov. Gavin Newsom signed into law SB 1223, amending the California Consumer Privacy Act (CCPA) to include neural data as personal sensitive information. The provision, authored by California State Sen. Josh Becker, comes into effect immediately as a part of the CCPA.

The law marks the second such legal protection for data produced from invasive neurotechnology, following Colorado, which incorporated neural data into its state data privacy statute, the Colorado Privacy Act (CPA), in April. Neurotechnology tools can measure and visualize brain activity, and in some cases, alter it as well. A report released by the nonprofit Neurorights Foundation earlier this year found 30 consumer-grade tools currently on the market that collect neural data.

The CCPA applies to for-profit businesses that have an annual gross revenue of $25 million or more, that buy, sell, or share the personal information of at least 100,000 California residents or households, or that derive at least 50% of their annual revenue from selling or sharing California residents' personal information.

Jared Genser, a former DLA Piper partner who founded law firm Perseus Strategies and now serves as the general counsel of the nonprofit Neurorights Foundation, told Legaltech News that the California law is similar to its Colorado counterpart, albeit with a little more specificity. "Specifically, neural data in California matches the scientific definition of data that can only be captured by medical-grade neurotechnologies and it excludes non-neural inferential data captured from outside the body, which is much less sensitive," he said.

Not everyone agrees with that approach. Nita Farahany, professor of law and philosophy at Duke Science and Society, wrote in a post that the California and Colorado definitions are too ambiguous and don't focus enough on what she calls "cognitive" or "mental" privacy. "Data such as heart rate, eye-tracking, and even fitness wearables can similarly reveal mental states like stress or emotional reactions, but they fall outside the bill's scope," Farahany wrote. "This creates a significant gap, leaving non-neural cognitive biometric data unprotected and open to exploitation."

𝗥𝗲𝗮𝗱 𝘁𝗵𝗲 𝘄𝗵𝗼𝗹𝗲 𝘀𝘁𝗼𝗿𝘆: https://lnkd.in/gKpqFhJe
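To make those applicability thresholds concrete, here is a minimal sketch in Python (not from the article or the statute) of the disjunctive test as described above; the `Business` fields and the `ccpa_applies` function are illustrative assumptions, not official terminology.

```python
# Illustrative sketch only: the CCPA's applicability thresholds as summarized
# in the post above. A business is covered if it meets ANY ONE of the three.
# Field and function names are hypothetical.
from dataclasses import dataclass


@dataclass
class Business:
    annual_gross_revenue_usd: float         # total annual gross revenue
    ca_consumers_data_handled: int          # CA residents/households whose data is bought, sold, or shared
    revenue_share_from_selling_data: float  # fraction of revenue from selling/sharing personal info (0.0-1.0)


def ccpa_applies(b: Business) -> bool:
    """Return True if the business meets at least one CCPA threshold."""
    return (
        b.annual_gross_revenue_usd >= 25_000_000
        or b.ca_consumers_data_handled >= 100_000
        or b.revenue_share_from_selling_data >= 0.50
    )


# Example: a small neurotech startup below the revenue threshold but handling
# data from 150,000 California households would still be covered.
print(ccpa_applies(Business(5_000_000, 150_000, 0.10)))  # True
```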
-
I made several AI images (in a Kafkaesque style) with key quotes from my new short essay, Kafka in the Age of AI and the Futility of Privacy as Control (with Woodrow H.) https://lnkd.in/eA2RpR44
Kafka in the Age of AI and the Futility of Privacy as Control
https://teachprivacy.com