California's new data and privacy rights in effect as of today - Microsoft pledges US-wide application

Delicieuxz · [H]ard|Gawd · Joined May 11, 2016 · Messages: 1,664
Today is a great day for data ownership and privacy rights, because, as of today, California's Consumer Privacy Act (CCPA) comes into force in California, greatly strengthening data and data-privacy rights for Californians and beyond.


Microsoft has posted that they will apply California's new CCPA rules nationwide in the US: Microsoft will honor California’s new privacy rights throughout the United States

But I wouldn't be too quick to assume that Microsoft is doing this out of the goodness of their greedy, corporate, data-coveting hearts, and Microsoft conspicuously doesn't mention anything about applying the same treatment to the rest of the world in their post. Microsoft, which makes huge money stealing your personal data from your computer through Windows 10 and various Microsoft services, reportedly was one of the tech companies that originally tried to prevent the CCPA from becoming a California ballot box initiative. See ***** below for more information about California ballot box initiatives.

So, why might Microsoft be openly supporting CCPA rules and applying them US-wide? It could be a case of 'if you can't beat them, join them - and then twist them in your favour'. After all, Microsoft is the company which coined the phrase "Embrace, extend, and extinguish".


And it isn't just Microsoft that is trying to get in front of any new data rights protections: Tech industry titans suddenly love internet privacy rules. Wanna know why? We'll tell you

Companies that make millions and billions of dollars each year by pilfering your data, whether applying it to their own technology operations or selling it in raw or aggregated form to 3rd parties (including researchers, governments, advertisers, police, and pretty much whoever else has the money to pay for it), see newer data rights protection laws as a threat to their profits, but also as inevitable. So they want to get in front of their implementation and shape the laws' design in a way that will favour their companies' continued profiteering on your personal data. They want new data laws to get in the way of their unfettered data-theft practices as little as possible.


The CCPA is already stronger legislation than Europe's GDPR, which, while a positive step and an important catalyst for data and privacy rights awareness, I've thought to be largely toothless: the minimum possible to be done while still not restricting data-harvesting and not giving data-owners solid control over their data property.

But the sponsors behind California's new law are going even further than the CCPA, with their announced California Privacy Rights Act (CPRA), which is an updated, fortified, and extended version of the CCPA that will address more recent data privacy exploit tactics and grant a bunch more important rights to data-owners that were missing in the CCPA, such as denying the collection of anything beyond what is absolutely necessary for a user-requested service to function.

The CPRA will also establish a California Consumer Privacy Protection Agency to protect data-owners.

The CPRA is intended to be put to a ballot initiative, circumventing the possibility of its proposals being watered down in California's legislature, as happened with the CCPA.


***** Ballot box initiatives: California has a wonderful law that enables average, non-politician California residents to create new laws in California by securing a certain number of signatures in support of a legislation proposal. If the required number of signatures is received, the proposed legislation goes to a ballot where all the residents of California get to vote on it. If it passes the California-wide vote, it becomes law in California.

Likely to ward off a ballot box initiative regarding the proposed CCPA data-protection legislation, the California legislature unanimously passed a watered-down version. If California's legislature had not done that, then those sponsoring the CCPA, who had the required number of signatures to put their proposal to a ballot initiative, would have put the matter to a California-wide referendum, and their stricter proposed legislation would almost certainly have passed, because, well, people logically value and want their privacy and want to control their own data.



------------------------------ newsletter from Californians for Consumer Privacy ------------------------------


The California Consumer Privacy Act (CCPA) goes into effect today!

The most sweeping consumer privacy legislation in the nation grants new data privacy rights for every Californian.

Here are your new rights:

  • The right to know what personal information a business has collected about you.
  • The right to say no to the sale of your information.
  • The right to delete your information.
  • The right to access your information in a portable format.
Here's what's next:

We are circulating a new initiative called the California Privacy Rights Act (CPRA).

Here's why:

Since we passed CCPA, two things have happened: First, some of the world’s largest companies have actively and explicitly prioritized weakening the law. Second, technological tools have evolved in ways that exploit a consumer’s data with potentially dangerous consequences. I believe using a consumer’s data in these ways is not only immoral, but it also threatens our democracy.

CPRA would:

  • Create new rights around the use and sale of sensitive personal information, such as health and financial information, racial or ethnic origin, and precise geolocation.

  • Provide enhanced protection for violations of children’s privacy by tripling CCPA’s fines for breaking the law governing collection and sale of children’s private information and would require opt-in consent to collect data from consumers under the age of 16.

  • Require much-needed transparency around automated decision-making and profiling, so consumers can know when their information is used to make adverse decisions that impact lives in critical ways, including employment, housing, credit, and even politics.

  • Establish a new authority to protect these rights, the California Privacy Protection Agency, which will simultaneously enforce the law and provide necessary guidance to industry and consumers, many of whom are struggling to protect themselves in an increasingly complex digital ecosystem, where hacking and identity theft remain a terrible problem.

  • Most importantly, it would enshrine these rights by requiring that future amendments be in furtherance of the law, even though I am only setting the threshold to amend at a simple majority in the legislature. While amendments will be necessary given how technically complex and fast-moving this area is, this approach respects the role of the legislature while still providing substantial protections for Californians from attempts to weaken the law and their new human rights.
Here's how they compare:

[Image: CCPA version comparison chart]
 
Not to be a pessimist, but this isn't going to prevent anything.
Companies will continue to do this, get fined a few million, and then sell the data they have wrongfully collected for tens to hundreds of millions; profit.

Microsoft is being smart about getting on board with this, especially since having opposed it previously; good for them, bad for us, and meanwhile those politicians backing this will become slowly, but surely, bought and paid for.
The only way to truly stop this is by voting with your wallets.

Megacorporations ruling over a weakened centralized government passing support of such laws for the total control of the masses... and thus, 2020 is truly the beginning of the dark cyberpunk future. :borg:


[Image: quote from Gmork]


-Gmork, The Neverending Story (1984)
 
And California DMV will continue to sell your information.
 
This:
[screenshot attachment]

means this ability is built in from the start. If it's there, it will be used on the thinnest of pretexts, and then used whether this condition is met or not.

This seems odd. Is state constitutional law broken in CA? If you need something to be robust against the legislature, it should just be dropped into the state constitution. This is a red flag, imho.
[screenshot attachment]
 
This:
[screenshot attachment]
means this ability is built in from the start. If it's there, it will be used on the thinnest of pretexts, and then used whether this condition is met or not.

This seems odd. Is state constitutional law broken in CA? If you need something to be robust against the legislature, it should just be dropped into the state constitution. This is a red flag, imho.
[screenshot attachment]

That doesn't seem legal to me. I mean, why not just make a bunch of laws and stipulate that future legislation can't do anything about them?
 
Makes sense that the DMV sells this information. I've always wondered why they hang up on me when I tell them I have an '83 Suzuki and I'm very interested in extending the warranty. It's cause they know I don't have one!
 
I personally think legislation like this is a dumpster fire. It's supposed to protect the consumers from corporations, but the internet as a whole is basically bulldozed as collateral damage and nobody cares.

Legislation like this threatens the history of the internet, and has already caused the loss of important data that may have value in the future. It also makes retaining the history of the internet a whole lot more complex and expensive, because suddenly you need an army of lawyers and staff to respond to millions of people who may want their data deleted. So 20 years down the line, someone can go to an archivist and say they're not allowed to have what they produced because they changed their mind, when that information could have historical significance.

To put it into perspective: if the internet had existed ~245 years ago and the Declaration of Independence had been made and signed via people across the internet, and one person out of the group regretted it 50 years later, they could order it deleted, and wherever it existed on the internet would be required by law to comply. Hell, as poorly as this law is written, a descendant of the person could do the same. The GDPR and this are terrifying pieces of legislation. What's more terrifying is that this is yet another gross overreach of power by the government in California. If they want laws passed, they need to stay within their own borders.

Vintage technology is a hobby of mine and I've seen many things lost to time because of legislation like this; the GDPR killed off a whole lot of stuff. But the damage done by it has only just started: it will make preserving the history of the internet a basically impossible task in the future. Whether or not the data is historical or has significance to anyone, it's all treated the same and can be deleted at any time.
 
According to this source, Microsoft already regards California's new data rights law as de facto national law:

Microsoft to apply California privacy rules to all users nationwide

The move isn't a shocker. Microsoft president Brad Smith told me during a Churchill Club interview that California's law would become the "de facto national privacy law."

If that's because Microsoft sees as inevitable future nationwide legislation that will be at least as strong, then willingly applying California's law nationwide right now has the possibility to make Microsoft look responsible with privacy, which could generate goodwill from state lawmakers down the road.


I personally think legislation like this is a dumpster fire. It's supposed to protect the consumers from corporations, but the internet as a whole is basically bulldozed as collateral damage and nobody cares.

Legislation like this threatens the history of the internet, and has already caused the loss of important data that may have value in the future. It also makes retaining the history of the internet a whole lot more complex and expensive, because suddenly you need an army of lawyers and staff to respond to millions of people who may want their data deleted. So 20 years down the line, someone can go to an archivist and say they're not allowed to have what they produced because they changed their mind, when that information could have historical significance.

To put it into perspective: if the internet had existed ~245 years ago and the Declaration of Independence had been made and signed via people across the internet, and one person out of the group regretted it 50 years later, they could order it deleted, and wherever it existed on the internet would be required by law to comply. Hell, as poorly as this law is written, a descendant of the person could do the same. The GDPR and this are terrifying pieces of legislation. What's more terrifying is that this is yet another gross overreach of power by the government in California. If they want laws passed, they need to stay within their own borders.

Vintage technology is a hobby of mine and I've seen many things lost to time because of legislation like this; the GDPR killed off a whole lot of stuff. But the damage done by it has only just started: it will make preserving the history of the internet a basically impossible task in the future. Whether or not the data is historical or has significance to anyone, it's all treated the same and can be deleted at any time.

I don't know if this would hit archivist activities. That's a different type of data, isn't it? I think this only hits the behind-the-scenes personal data logging, and not stuff like saving a webpage that might feature personal information to archive.org - which I think is Right to be forgotten territory.

Which archivist activities have you seen affected by GDPR?
 
I don't know if this would hit archivist activities. That's a different type of data, isn't it?

It most definitely impacts internet archival; there is no stipulation on who is exempt from collecting what kinds of data. The first three points in the left column are killers by themselves. And even if exemptions are made, it's going to be on a case-by-case basis, and you'll probably need to be some NGO with an army of lawyers and a perpetual pile of money.

"Right to know what data businesses collect about you" - this doesn't specify what kind of data is being collected. Phone number? Home address? Facebook account name? Some post you made on some forum about an IBM System/360 20-30 years ago? It's all treated the same. This law is retroactive, you know; the 1995 internet is not exempt, and neither is pre-internet material like BBS posts that have been archived on the internet.

"Right to say no to sale of your info" - this is a BIG killer for projects like oocities, which is an archive of GeoCities pages from before they went offline. Data is changing hands from business A to business B, and one way you could look at it is that business B must contact all users about its intent to store their data before it is put online elsewhere, or be liable to being sued for breaching this law. This is the primary reason that companies which store public or private user data nuke everything from orbit: they don't want the liability. But this creates a problem, because in the case of GeoCities there were people who had VERY important information stored there, didn't get the memo when GeoCities was shutting down, and subsequently lost that data. One thing that pops to mind: I think a mother had baby pictures of a child she lost that were only on GeoCities, which oocities miraculously saved. This kind of thing is not possible under this law.

"Right to delete your info" - Again, this doesn't specify what information, so technically everything is covered. So let's say some indeterminate amount of time ago you made a major contribution to XYZ by writing diagrams, instructions, debugging information, etc., and now you change your mind. But the stuff you wrote is one of the only sources of info on XYZ and has been spread around the web; now everyone else must comply with your request for deletion or face legal action. Stuff similar to this has happened in vintage computer territory, where people release something for free and then, if it is popular, turn around, revoke the free release, and start scalping people for huge amounts of money, doing a disservice to everyone in the community out of greed. Now they'll have even more power to shake the community down.

Then there's this gem:

"Storage limitation: right to prevent data from being stored longer than necessary" - OK, how long is too long? Who gets to decide the length of time data is allowed to be stored? Are people who are the owners of the information allowed to months/years/decades later demand their information be removed when it is historically significant?

I think this only hits the behind-the-scenes personal data logging, and not stuff like saving a webpage that might feature personal information to archive.org - which I think is Right to be forgotten territory.

Right to be forgotten is certainly a thing, but the scope of the term "data" is not defined, and as stated earlier can be anything.

Which archivist activities have you seen affected by GDPR?

Right now? The prevention of archiving. Data is being deleted every day under this law, all of which we'll never know the significance of since we were never able to see it. If data is deleted on the internet before crawlers are able to grab a snapshot of it, nobody can look at it at a later date.


I do not like this law, nor do I like the GDPR. They're massive pieces of needlessly complex legislation that are broadly defined and cause collateral damage to everything on the internet. Of course consumers need to be protected from mega corporations, but this is not the answer.
 
It most definitely impacts internet archival; there is no stipulation on who is exempt from collecting what kinds of data. The first three points in the left column are killers by themselves. And even if exemptions are made, it's going to be on a case-by-case basis, and you'll probably need to be some NGO with an army of lawyers and a perpetual pile of money.

"Right to know what data businesses collect about you" - this doesn't specify what kind of data is being collected. Phone number? Home address? Facebook account name? Some post you made on some forum about an IBM System/360 20-30 years ago? It's all treated the same. This law is retroactive, you know; the 1995 internet is not exempt, and neither is pre-internet material like BBS posts that have been archived on the internet.

"Right to say no to sale of your info" - this is a BIG killer for projects like oocities, which is an archive of GeoCities pages from before they went offline. Data is changing hands from business A to business B, and one way you could look at it is that business B must contact all users about its intent to store their data before it is put online elsewhere, or be liable to being sued for breaching this law. This is the primary reason that companies which store public or private user data nuke everything from orbit: they don't want the liability. But this creates a problem, because in the case of GeoCities there were people who had VERY important information stored there, didn't get the memo when GeoCities was shutting down, and subsequently lost that data. One thing that pops to mind: I think a mother had baby pictures of a child she lost that were only on GeoCities, which oocities miraculously saved. This kind of thing is not possible under this law.

"Right to delete your info" - Again, this doesn't specify what information, so technically everything is covered. So let's say some indeterminate amount of time ago you made a major contribution to XYZ by writing diagrams, instructions, debugging information, etc., and now you change your mind. But the stuff you wrote is one of the only sources of info on XYZ and has been spread around the web; now everyone else must comply with your request for deletion or face legal action. Stuff similar to this has happened in vintage computer territory, where people release something for free and then, if it is popular, turn around, revoke the free release, and start scalping people for huge amounts of money, doing a disservice to everyone in the community out of greed. Now they'll have even more power to shake the community down.

The bill refers specifically to personal user information that is collected by businesses. I don't see it as applying to website archives.


Then there's this gem:

"Storage limitation: right to prevent data from being stored longer than necessary" - OK, how long is too long? Who gets to decide the length of time data is allowed to be stored? Are people who are the owners of the information allowed to months/years/decades later demand their information be removed when it is historically significant?

"Longer than necessary" in the context of the CCPA means longer than is needed to supply a user with the service they requested. In other words, the business can keep the data long enough to fulfill the customer's needs and the functioning of the requested service, but not longer.
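As a minimal sketch of what that retention rule could look like in practice (all names and window lengths below are invented for illustration; the CCPA doesn't prescribe fixed durations), a business might tie each stored record to the purpose it was collected for and sweep away whatever has outlived that purpose:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical purpose-specific retention windows; these durations are
# assumptions for illustration, not taken from the bill.
RETENTION = {
    "order_fulfillment": timedelta(days=90),
    "support_ticket": timedelta(days=365),
}

def is_expired(record: dict, now: datetime) -> bool:
    """True once a record has been kept longer than the service it was
    collected for plausibly requires."""
    return now - record["collected_at"] > RETENTION[record["purpose"]]

now = datetime(2020, 1, 1, tzinfo=timezone.utc)
records = [
    {"purpose": "order_fulfillment",
     "collected_at": datetime(2019, 1, 1, tzinfo=timezone.utc)},
    {"purpose": "support_ticket",
     "collected_at": datetime(2019, 12, 1, tzinfo=timezone.utc)},
]

# A periodic cleanup job would delete whatever has outlived its purpose:
# here the year-old order record is expired, the month-old ticket is not.
to_delete = [r for r in records if is_expired(r, now)]
```

The hard part in reality is the policy table, not the code: someone has to justify each window as "necessary" for the requested service.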


Right to be forgotten is certainly a thing, but the scope of the term "data" is not defined, and as stated earlier can be anything.

The bill doesn't use the word "data" but instead "personal information". And the bill also defines what counts as personal information in the view of the law.

https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180AB375

"The bill would prescribe various definitions for its purposes and would define “personal information” with reference to a broad list of characteristics and behaviors, personal and commercial, as well as inferences drawn from this information. The bill would prohibit the provisions described above from restricting the ability of the business to comply with federal, state, or local laws, among other things."

https://en.wikipedia.org/wiki/California_Consumer_Privacy_Act#Definition_of_personal_data

"CCPA defines personal information as information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household such as a real name, alias, postal address, unique personal identifier, online identifier, Internet Protocol address, email address, account name, social security number, driver's license number, passport number, or other similar identifiers.

An additional caveat identifies, relates to, describes, or is capable of being associated with, a particular individual, including, but not limited to, their name, signature, Social Security number, physical characteristics or description, address, telephone number, passport number, driver's license or state identification card number, insurance policy number, education, employment, employment history, bank account number, credit card number, debit card number, or any other financial information, medical information, or health insurance information.

It does not consider Publicly Available Information as personal.

Key differences between CCPA and the European Union's GDPR include the scope and territorial reach of each, definitions related to protected information, levels of specificity, and an opt-out right for sales of personal information. CCPA differs in definition of personal information from GDPR as in some cases the CCPA only considers data that was provided by a consumer and excludes personal data that was purchased by, or acquired through, third parties. The GDPR does not make that distinction and covers all personal data regardless of source (even in the event of sensitive personal information, this doesn't apply if the information was manifestly made public by the data subject themselves, following the exception under Art.9(2),e). As such the definition in GDPR is much broader than defined in the CCPA."


Right now? The prevention of archiving. Data is being deleted every day under this law, all of which we'll never know the significance of since we were never able to see it. If data is deleted on the internet before crawlers are able to grab a snapshot of it, nobody can look at it at a later date.

I haven't noticed any archiving taking a hit after GDPR, or anybody talking about negative impacts on archiving as a result of GDPR. Can you point to any cases where archives were removed as a result?
 
Here's the bill's definition of "personal information" and related terms that establish what data the bill regulates.

This information is in section 1798.140 of the bill.
https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180AB375

(o) (1) “Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following:
(A) Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier Internet Protocol address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers.
(B) Any categories of personal information described in subdivision (e) of Section 1798.80.
(C) Characteristics of protected classifications under California or federal law.
(D) Commercial information, including records of personal property, products or services purchased, obtained, or considered, or other purchasing or consuming histories or tendencies.
(E) Biometric information.
(F) Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an Internet Web site, application, or advertisement.
(G) Geolocation data.
(H) Audio, electronic, visual, thermal, olfactory, or similar information.
(I) Professional or employment-related information.
(J) Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. section 1232g, 34 C.F.R. Part 99).
(K) Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, preferences, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
(2) “Personal information” does not include publicly available information. For these purposes, “publicly available” means information that is lawfully made available from federal, state, or local government records, if any conditions associated with such information. “Publicly available” does not mean biometric information collected by a business about a consumer without the consumer’s knowledge. Information is not “publicly available” if that data is used for a purpose that is not compatible with the purpose for which the data is maintained and made available in the government records or for which it is publicly maintained. “Publicly available” does not include consumer information that is deidentified or aggregate consumer information.
(p) “Probabilistic identifier” means the identification of a consumer or a device to a degree of certainty of more probable than not based on any categories of personal information included in, or similar to, the categories enumerated in the definition of personal information.
(q) “Processing” means any operation or set of operations that are performed on personal data or on sets of personal data, whether or not by automated means.
(r) “Pseudonymize” or “Pseudonymization” means the processing of personal information in a manner that renders the personal information no longer attributable to a specific consumer without the use of additional information, provided that the additional information is kept separately and is subject to technical and organizational measures to ensure that the personal information is not attributed to an identified or identifiable consumer.
(s) “Research” means scientific, systematic study and observation, including basic research or applied research that is in the public interest and that adheres to all other applicable ethics and privacy laws or studies conducted in the public interest in the area of public health. Research with personal information that may have been collected from a consumer in the course of the consumer’s interactions with a business’ service or device for other purposes shall be:
(1) Compatible with the business purpose for which the personal information was collected.
(2) Subsequently pseudonymized and deidentified, or deidentified and in the aggregate, such that the information cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer.
(3) Made subject to technical safeguards that prohibit reidentification of the consumer to whom the information may pertain.
(4) Subject to business processes that specifically prohibit reidentification of the information.
(5) Made subject to business processes to prevent inadvertent release of deidentified information.
(6) Protected from any reidentification attempts.
(7) Used solely for research purposes that are compatible with the context in which the personal information was collected.
(8) Not be used for any commercial purpose.
(9) Subjected by the business conducting the research to additional security controls that limit access to the research data to only those individuals in a business as are necessary to carry out the research purpose.
(t) (1) “Sell,” “selling,” “sale,” or “sold,” means selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.
(2) For purposes of this title, a business does not sell personal information when:
(A) A consumer uses or directs the business to intentionally disclose personal information or uses the business to intentionally interact with a third party, provided the third party does not also sell the personal information, unless that disclosure would be consistent with the provisions of this title. An intentional interaction occurs when the consumer intends to interact with the third party, via one or more deliberate interactions. Hovering over, muting, pausing, or closing a given piece of content does not constitute a consumer’s intent to interact with a third party.
(B) The business uses or shares an identifier for a consumer who has opted out of the sale of the consumer’s personal information for the purposes of alerting third parties that the consumer has opted out of the sale of the consumer’s personal information.
(C) The business uses or shares with a service provider personal information of a consumer that is necessary to perform a business purpose if both of the following conditions are met:
(i) The business has provided notice that information being used or shared in its terms and conditions consistent with Section 1798.135.
(ii) The service provider does not further collect, sell, or use the personal information of the consumer except as necessary to perform the business purpose.
(D) The business transfers to a third party the personal information of a consumer as an asset that is part of a merger, acquisition, bankruptcy, or other transaction in which the third party assumes control of all or part of the business provided that information is used or shared consistently with Sections 1798.110 and 1798.115. If a third party materially alters how it uses or shares the personal information of a consumer in a manner that is materially inconsistent with the promises made at the time of collection, it shall provide prior notice of the new or changed practice to the consumer. The notice shall be sufficiently prominent and robust to ensure that existing consumers can easily exercise their choices consistently with Section 1798.120. This subparagraph does not authorize a business to make material, retroactive privacy policy changes or make other changes in their privacy policy in a manner that would violate the Unfair and Deceptive Practices Act (Chapter 5 (commencing with Section 17200) of Part 2 of Division 7 of the Business and Professions Code).
(u) “Service” or “services” means work, labor, and services, including services furnished in connection with the sale or repair of goods.
(v) “Service provider” means a sole proprietorship, partnership, limited liability company, corporation, association, or other legal entity that is organized or operated for the profit or financial benefit of its shareholders or other owners, that processes information on behalf of a business and to which the business discloses a consumer’s personal information for a business purpose pursuant to a written contract, provided that the contract prohibits the entity receiving the information from retaining, using, or disclosing the personal information for any purpose other than for the specific purpose of performing the services specified in the contract for the business, or as otherwise permitted by this title, including retaining, using, or disclosing the personal information for a commercial purpose other than providing the services specified in the contract with the business.
(w) “Third party” means a person who is not any of the following:
(1) The business that collects personal information from consumers under this title.
(2) A person to whom the business discloses a consumer’s personal information for a business purpose pursuant to a written contract, provided that the contract:
(A) Prohibits the person receiving the personal information from:
(i) Selling the personal information.
(ii) Retaining, using, or disclosing the personal information for any purpose other than for the specific purpose of performing the services specified in the contract, including retaining, using, or disclosing the personal information for a commercial purpose other than providing the services specified in the contract.
(iii) Retaining, using, or disclosing the information outside of the direct business relationship between the person and the business.
(B) Includes a certification made by the person receiving the personal information that the person understands the restrictions in subparagraph (A) and will comply with them.
A person covered by paragraph (2) that violates any of the restrictions set forth in this title shall be liable for the violations. A business that discloses personal information to a person covered by paragraph (2) in compliance with paragraph (2) shall not be liable under this title if the person receiving the personal information uses it in violation of the restrictions set forth in this title, provided that, at the time of disclosing the personal information, the business does not have actual knowledge, or reason to believe, that the person intends to commit such a violation.
(x) “Unique identifier” or “Unique personal identifier” means a persistent identifier that can be used to recognize a consumer, a family, or a device that is linked to a consumer or family, over time and across different services, including, but not limited to, a device identifier; an Internet Protocol address; cookies, beacons, pixel tags, mobile ad identifiers, or similar technology; customer number, unique pseudonym, or user alias; telephone numbers, or other forms of persistent or probabilistic identifiers that can be used to identify a particular consumer or device. For purposes of this subdivision, “family” means a custodial parent or guardian and any minor children over which the parent or guardian has custody.
(y) “Verifiable consumer request” means a request that is made by a consumer, by a consumer on behalf of the consumer’s minor child, or by a natural person or a person registered with the Secretary of State, authorized by the consumer to act on the consumer’s behalf, and that the business can reasonably verify, pursuant to regulations adopted by the Attorney General pursuant to paragraph (7) of subdivision (a) of Section 1798.185 to be the consumer about whom the business has collected personal information. A business is not obligated to provide information to the consumer pursuant to Sections 1798.110 and 1798.115 if the business cannot verify, pursuant to this subdivision and regulations adopted by the Attorney General pursuant to paragraph (7) of subdivision (a) of Section 1798.185, that the consumer making the request is the consumer about whom the business has collected information or is a person authorized by the consumer to act on such consumer’s behalf.
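The “unique personal identifier” definition in subdivision (x) covers not just explicit IDs like cookies and customer numbers but also “probabilistic identifiers.” As a rough illustration of what that means in practice (this is a hypothetical sketch, not anything from the statute or from any specific tracker), a handful of individually innocuous device attributes can be combined into a fingerprint that is often unique enough to recognize a device across different services:

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine quasi-identifying attributes into one probabilistic identifier.

    No single field here identifies anyone, but hashing the combination
    yields a value that is frequently stable and unique per device.
    """
    # Canonical ordering so the same attributes always hash the same way
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical attribute sets a tracker might observe
device_a = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "America/Los_Angeles",
    "language": "en-US",
}
# Same device profile except for one attribute
device_b = dict(device_a, timezone="America/New_York")

print(fingerprint(device_a))  # stable across repeat visits
print(fingerprint(device_a) == fingerprint(device_b))  # distinct profiles hash differently
```

Because the statute's definition turns on whether the identifier "can be used to recognize a consumer ... or a device ... over time and across different services," a derived hash like this plausibly falls under it even though no raw personal detail is stored.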
 
The way I read it, they're allowed access to your "personal information" for as long as you use Windows.
There is an option to delete the data that Microsoft collects already built into Windows. Of course, there is no transparency to confirm either that they really delete it or what they collected in the first place. Google already does the same thing, but without a way to audit them this is all just for feels at this point.
 
So the pessimist in me simply sees it as: no company in California will be able to sell your data... that said, I'd like to announce that our billing will now be handled by a separate company just across the state line in Nevada.
 