16 comments

  • DIVx0 1621 days ago
    I work for a large US based corp that focuses on technology and data services for the healthcare field. We have massive amounts of PHI for the majority of people who have visited a provider within the country.

    We apply all sorts of stuff to this data: ML, AI, or whatever other buzzy tech you can think of.

    Most of this work happens within our own data centers but there is significant work done within public clouds.

    We have BAAs (business associate agreements) with every cloud vendor we work with. We also have gone to extreme lengths to be confident that our cloud deployments are as secure as (or more secure than) our on-premises stuff.

    However, none of that is unique. We're no industry trailblazers in adopting public clouds. Just about every other major player is doing this in a way fairly similar to ours.

    So, what I _really_ don't understand with this story is: did Ascension simply give up their data to Google without boundaries? Their BAA should be very clear that Ascension intends to use Google's cloud services but is not giving Google rights to their data.

    It would not be unusual to engage a vendor or form some other partnership with another firm to work on problems or generate new products. I assumed that this is what Ascension and Google were doing, but this whistleblower and other stories make it seem like Google just has free and clear access to this data outside of their relationship with Ascension.

    Is that true? If so, that's crazy! Otherwise, business as usual?

    • CPLX 1621 days ago
      I've seen a ton of responses from people in the industry along the lines of "this is normal" or similar. People who work on this stuff are incredulous that there's even an issue, since everyone's health info is already being uploaded to AWS or something. This is business as usual, they say.

      The uproar is taken by people actually working in the business as a sign that the public are ignorant and misinformed. Things are actually HIPAA compliant, they say. This isn't a big deal, they say.

      But perhaps the education should be going in the other direction, and the people in the industry should realize they are the ignorant ones for not realizing that this is totally not OK for a huge number of people.

      Realize that we are absolutely horrified that this data is being shared, that a reasonable response is to say that if HIPAA is OK with this then we need stronger laws, that we don't want faceless algorithms studying our most intimate personal and medical issues at companies we never had a relationship with.

      And, especially, we are absolutely fucking certain that we want literally none of it to be seen by employees of the sociopathic tech companies that are surveilling every aspect of our life in order to better manipulate politics, markets, and our society.

      The thing to take away here is that people are shocked and horrified at what's apparently business as usual.

      • UncleMeat 1621 days ago
        One of the challenges is: how do you distinguish the kinds of outrage? It seems to me like the majority of the outrage is based on a falsehood (that Ascension is selling data to Google to use for ads). Then there are some people for whom the very idea of medical data being managed by cloud providers is unacceptable. The former overshadows the latter and makes the whole thing come across as misinformed.
      • inetknght 1621 days ago
        > Realize that we are absolutely horrified that this data is being shared, that a reasonable response is to say that if HIPAA is OK with this then we need stronger laws, that we don't want faceless algorithms studying our most intimate personal and medical issues at companies we never had a relationship with.

        I absolutely 150% agree with you and I know others off of HN who do as well.

      • raxxorrax 1621 days ago
        I share that perception. For me, this is a sign that the people working with this data probably should not be working with it. They don't really make a good case for themselves.

        And saying this is business as usual is extremely disingenuous, since the distribution of information was completely different just a few years ago.

      • trebligdivad 1621 days ago
        But if they're just using Google as cloud storage/compute, it's not being shared. Only very few Google employees would have access, they'd have very careful limitations on who accessed what - that's not the same as giving it to Google for some big AI experiment.
        • inetknght 1621 days ago
          > Only very few Google employees would have access

          That's "very few" Google employees more than zero. I expect zero Google employees to have access to my medical data. Any number above zero is absolutely not acceptable to me.

          > they'd have very careful limitations on who accessed what

          Yeah, just like Equifax, right?

          • kazen44 1621 days ago
            > That's "very few" Google employees more than zero. I expect zero Google employees to have access to my medical data. Any number above zero is absolutely not acceptable to me.

            Heck, people owning my medical data who are not my doctor/GP and related medical professionals is a big no go in my opinion.

            Medical data is rather private.

            • inetknght 1621 days ago
              > Heck, people owning my medical data who are not my doctor/GP and related medical professionals is a big no go in my opinion.

              Not even "owning", but having.

              Having even "anonymized" data is not acceptable to me.

      • peterwwillis 1621 days ago
        > we don't want faceless algorithms studying our most intimate personal and medical issues

        So you don't want an advanced algorithm to analyze your vitals and detect your cancer at an early stage?

        Technology has more than doubled our life expectancy. More technology could do even more. The fact that it's a faceless algorithm, and not a creepy biased human, should be a comfort. But you're reacting like this is all going to harm you, when it's way more likely to do the opposite.

        > we are absolutely horrified that this data is being shared

        People always fear the things they don't understand. That doesn't mean that fear is justified, by a long shot.

        • CPLX 1621 days ago
          > So you don't want an advanced algorithm to analyze your vitals and detect your cancer at an early stage?

          That’s correct, I don’t want that.

          And if one day I change my mind and make the choice to opt-in to a system that works like that, I still won’t ever want Google to be involved in it.

          • blacksmith_tb 1621 days ago
            Given that early detection considerably increases chances of survival[1], you may not have a chance to opt in later. Which raises interesting questions about whether society might force you to participate in having your data analyzed, similar to, say, vaccination.

            1: https://www.canaryfoundation.org/wp-content/uploads/EarlyDet...

          • peterwwillis 1620 days ago
            Oh, I see. A brand's values and reputation, as well as your own personal choices, are more important to you than your health.
    • mfer 1621 days ago
      From the original WSJ article...

      > Staffers across Alphabet Inc., Google’s parent, have access to the patient information, documents show, including some employees of Google Brain, a research science division credited with some of the company’s biggest breakthroughs.

      I'm struck by how wide the access is within Alphabet to health information that's not anonymized.

      Is this how other healthcare companies are doing it?

      • Kalium 1621 days ago
        One reading of this is that it's a cross-departmental collaboration, rather than just a single division.
        • bilbo0s 1621 days ago
          That's the difference everyone here is talking about though. Normally in the healthcare industry, you engage a cloud provider, and it's "you store this data for us." Full Stop. You don't look at it. You don't analyze it. You don't share it. You don't touch it. It's our data, not yours.

          If what you're saying is true, Ascension, for some reason, has a deal with its cloud provider that allows Google to search through, analyze, etc etc etc. It sounds like all sorts of rights were given to Google. That's an irregular agreement. It's not normally how things are done.

          • Spooky23 1621 days ago
            That's a very naive view of what happens. It's exactly how things are done, except that it's a one-stop shop.

            The reality is that there is a fig leaf of privacy. HIPAA protects you from the office staff gossiping about your medical conditions. When you are admitted to the hospital, your prescriptions are sent to data aggregators in near real-time, your claims are sent to your insurer and subrogation vendors in near real-time, etc.

            Each one of these downstream providers performs their own analysis on the data. The prescription data is sold to pharmaceutical companies and wholesalers to provide KPIs for the sales organization. The insurance and subrogation people sell de-personalized data to marketing companies. The marketing companies can trivially figure out who most people are.

            The end result is that you can easily get a list of every person in a zipcode who is pregnant (with estimated due date), has diabetes, had a stroke, etc.

            • bilbo0s 1621 days ago
              We're talking about cloud providers here, not pharmaceutical or insurance providers. There are different levels of interaction required to accomplish different operational objectives. Giving a cloud provider this sort of access is, as I said initially, highly irregular. There is just not that same level of collaboration needed with your cloud provider.

              Just as a for instance, there are very good operational reasons on your side why pharmaceutical partners need the data: to be reasonably certain that the 1,000 doses of highly controlled substance X they previously sent to you, or distributed on your behalf, were administered to people who both required the medication and actually exist. That pharmaceutical partner needs to be able to verify this before sending you, or distributing for you, the 10,000 additional doses you are suddenly requesting.

              By contrast, there is no operational reason on your side that a cloud provider needs to know the names and addresses of the patients in your database.

              • Spooky23 1621 days ago
                Cloud providers can provide all sorts of services. Who says they are just selling dumb storage services?

                If you use Office 365, they process your data to do analytics. Some they expose to you (Delve, MyAnalytics), others they don’t.

                In government, they do even more. Medicaid systems are usually run by third parties, who also sell services to providers to optimize billing.

        • mfer 1621 days ago
          Maybe and sort of.

          Alphabet is a conglomerate, right? Alphabet is the parent company that owns Google, LLC. I'm personally surprised to see talk of the data crossing the Google / Alphabet company boundary. That makes it cross-company collaboration, right?

    • thatfrenchguy 1621 days ago
      > I work for a large US based corp that focuses on technology and data services for the healthcare field. We have massive amounts of PHI for the majority of people who have visited a provider within the country.

      The real question here is: why can't I opt out? I want an easy button when I do anything medical to say "no, don't use my health data for any of this stuff". And the current CA privacy law does not provide this, unlike the GDPR, which sucks.

      • igetspam 1621 days ago
        This is exactly it. In Texas, there is always a check box about sharing. I opt out 100% of the time. This only covers a specific use case though, and there are still tons of sharing agreements in place that I can't do anything about. I'd opt out of all of them if I could, but at least knowing how many places my data exists would be a start.
      • chopin 1621 days ago
        GDPR does not provide this unfortunately. Just last week a bill in Germany was enacted which allows the medical data of all insured people to be shared with medical companies. There is no opt-out.
        • Hamuko 1621 days ago
          I believe GDPR does provide that, but consent is not required to process data if that processing is "necessary for compliance with a legal obligation to which the controller is subject". So if an EU member state makes a law that requires insurance companies to export your medical data wholesale to medical companies, GDPR does not give you an option to opt out. Really the only solution there is either to a) move elsewhere b) vote for people who don't want to enact such laws.
    • vl 1621 days ago
      The irony, of course (as anyone with a failed ML launch at Google knows, haha), is that internally Google has extremely strong privacy practices and safeguards and is probably the best organization to actually handle this data correctly.

      As for de-identification: of course, ideally the raw data would be anonymized before handover, but anonymization is a major source of mistakes in the data, often rendering it useless, and it needs to be done by people who know how to do it correctly.
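
      To make that failure mode concrete, here is a minimal, purely illustrative sketch of a naive Safe Harbor-style scrub in Python. The field names and the specific rules (3-digit zip prefix, year-only date of birth) are assumptions made up for the example, not anything from Ascension's or Google's pipeline; free-text notes, rare diagnoses, and small populations are exactly where a crude pass like this either leaks identity or destroys clinical value:

          # Illustrative only: naive Safe Harbor-style scrub of a hypothetical
          # record dict. Field names and rules are assumptions for the example.
          from datetime import date

          DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn", "mrn"}

          def deidentify(record: dict) -> dict:
              out = {}
              for key, value in record.items():
                  if key in DIRECT_IDENTIFIERS:
                      continue                     # drop direct identifiers outright
                  elif key == "zip":
                      out[key] = value[:3] + "00"  # coarsen to a 3-digit zip prefix
                  elif key == "dob":
                      out[key] = value.year        # keep only the year of birth
                  else:
                      out[key] = value
              return out

          print(deidentify({
              "name": "Jane Doe", "ssn": "000-00-0000", "zip": "10012",
              "dob": date(1960, 4, 1), "dx_codes": ["E11.9"],
          }))
          # -> {'zip': '10000', 'dob': 1960, 'dx_codes': ['E11.9']}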

    • temac 1621 days ago
      Just because "everybody" is doing crazy shit does not make it OK to do it.
      • summerlight 1621 days ago
        Then you should blame the industry rather than a newcomer to the industry who tries to follow the industry standard?
        • smt88 1621 days ago
          No one should be excused for unethical behavior just because it’s standard industry practice.
          • kazen44 1621 days ago
            Also, this shows the massive disconnect between those who make money from said data and those who provide that data.
          • lsaferite 1621 days ago
            Who is being accused of being unethical?
    • ocdtrekkie 1621 days ago
      I think the key point, the new information (AFAIK) in this article, is that 150 Google employees are working on this project. So this isn't "Ascension is hosting their data in the cloud", it's "Google is working with health data".
      • summerlight 1621 days ago
        IIUC, this is also an industry standard practice as long as it's covered by their BAA. The question would be whether the actual content in the BAA makes sense or not, but I think the requirement for a BAA is pretty specific.

        https://www.hhs.gov/hipaa/for-professionals/covered-entities...

        • JohnFen 1621 days ago
          To the best of my knowledge, the BAA has not been published, so we have no idea what it says.

          But what seems clear is that a ton of complete medical data, including names and other identifying data, has been handed to Google for its use in what sounds like training an ML system.

          That we don't know what restrictions are in place (if any) is a large part of why this is so alarming.

  • atonse 1621 days ago
    We can only rely on whistleblowers to tell us truly what's going on behind the scenes with these kinds of things.

    Kudos to the whistleblower for coming forward.

    Look at how this is done vs. how Apple is doing their health program. There's tons of transparency, they're openly talking about doing studies with Medical Schools, and the medical parties are publishing results.

  • janlin1999 1621 days ago
    The real story here might be about how many exceptions are allowed within HIPAA, and how often health data gets transferred.

    My understanding is that patient consent is not necessary for things like research, which Google's AI efforts could be argued to fall under. This makes sense in that requiring researchers to get consent for every piece of patient data would quickly become cost-prohibitive for many types of studies or introduce sampling issues (e.g. selection bias) or some combination of the two. Patient health data is frequently handed over to researchers, but HIPAA probably did not anticipate the case in which a powerful consumer-facing entity that is notorious for using personal data also could do legitimate research on health data.

    Additionally, given that Google adheres to the HIPAA business associate agreement, healthcare institutions are allowed to give non-anonymized patient information without patient consent. A typical example might be a medical group that outsources its billing procedures. My understanding is that the medical group does not need to get patient consent in order to hand over medical data to the medical billing coders.

    Someone who is more knowledgeable about HIPAA might be able to add to the discussion, but from the few details that have been publicized, it could be that Google is following the law, but people are surprised by what is allowed by the law.

    • marcinzm 1621 days ago
      >My understanding is that patient consent is not necessary for things like research, which Google's AI efforts could be argued to fall under.

      HIPAA allows for entities to share data with other entities which provide a service to the original entity. For example, Hospitals may share data to a billing provider or a claims analytics provider. It’s almost certain that Google is following the letter of the law.

    • owlninja 1621 days ago
      Yes, can someone point out what they violated? I know GOOGLE BAD! around here, but surely they aren't this reckless.
  • tokeepmyjob 1621 days ago
    I created this anon account because I don't want to lose my job.

    I work at a major hospital/university as a research engineer and, 100%, the whole system is completely broken. Using our hospital and the 10 or so other major hospitals we work with as my source, I cannot come to any other conclusion.

    HIPAA is constantly touted as the reason to push more and more CYA hurdles onto the staff's day-to-day interactions. One of the hospitals I work with has an email system where you receive a notification (via email) that you have a message from xyz@majorhospital.com; in the body is a link you have to click through to reach a "secure" portal, then you 2FA, just to read an email someone sent _you_ (with no PII).

    Meanwhile, the systems that actually handle all the PII are basically rubber-stamped with no real security reviews. Prototypes I've built that should never pass a security review regularly do, and get let out into the wild. I've been screaming from the top of a mountain for years, and all I ever hear back is that it would cost too much, or prolong the dev cycle, or <enter reason here>. If I counter with "I can't believe this will pass HIPAA muster," the answer is always the same: "It passed the review and that is what matters."

    When something eventually happens, we can say we went through the proper vetting with the review team, and that's it: next to no liability for building crap infrastructure. Of course the security review team will say they used analysis software that cleared everything, so it's not their fault. Finally, the company that made that software will say it's not perfect, but it got certified by XYZ, and just like that the whole thing blows over.

    The hospital itself is what facilitates this shifting of responsibility. It pays millions upon millions of dollars for any software claiming to securely retain and protect PII, when all too often that software isn't even in beta, let alone vetted and hardened. But hey, they got some certification the hospital can point to and say "It wasn't us." Every year there is some new thing we are supposed to do that feigns interest in security, but the weakest links just keep getting weaker. No one cares about your personal information; hospitals only care that they aren't liable when your personal information gets leaked.

    I literally lie to my doctor, not because I don't trust her, but because I know that it's just a matter of time before my information is out there.

    • neoburkian 1621 days ago
      Stuff like this is everywhere. I used to work for a company that had millions of credit profiles. We had the chief of IT spending enormous amounts of time to make things "secure," but in the meantime junior PMs (and pretty much everyone else too) had access to a web portal where they could decrypt and inspect anyone's credit information, address, name, etc. Inspecting the personal financial details of people in our system was something employees would do on a lark. We had a horrible password manager that nobody used, so the passwords that were employed to validate logins to this portal were literally the names of the employees with one or two extra characters. No 2FA, no IP address restriction, nothing.

      We would deploy enormous resources to protect something if IT believed there was a legal requirement to do so (and make a lot of noise about how "secure" we were), but we would leave treasure chests of information sitting around in the open if there wasn't a box they needed to check saying "don't leave unattended treasure chests of private data in the open."

      If anything this convinced me that the regulations surrounding this sort of thing are a joke. We don't need rules about security or how to build X - IT will just see a list of boxes, check them, and then ignore everything not specifically enumerated. We need a white hat law for certifying hacking teams that can legally try to crack corporations with sensitive data. If they succeed, the company has to pay enormous fines _to the team that hacked them_ and solve the problem or get their certifications/contracts revoked.

    • cwarzel 1621 days ago
      hey — i'm a journalist at the New York Times. I'd like to chat more about this with you if you're interested. you can reach me at charlie.warzel@nytimes.com or via protonmail cwarzel@protonmail.com
  • blissofbeing 1621 days ago
    What is the risk of exposing our health data? To me it is not so obvious, other than maybe embarrassment? Is it like how in our culture we don't like to talk about how much money we make? Why are all these things supposed to be secret in the first place?
    • reaperducer 1621 days ago
      Because your medical information can be used against you.

      Want a new job? Nope! We don't want someone with your condition on our team.

      Want to buy a house? Nope! An AI bot says you may not live long enough to pay the loan.

      Want to get some ice cream? Can't haz. When you swipe your electronic payment method the database says you're at a risk for diabetes.

      There are thousands of other scenarios.

      • SamuelAdams 1621 days ago
        Also, consider the poor security track record of implantable medical devices [1]. From the source [2]:

        > The vulnerabilities could allow an unauthorized individual (i.e. someone other than a health care professional) to access and potentially change the settings of an implantable device, home monitor or clinic programmer.

        Ok cool, so there are millions of implantable cardioverter defibrillators whose voltages can be changed by someone who sits nearby at a restaurant. If they up the amperage high enough, the patient has a heart attack.

        Now, what if I found out that a sitting president or other elected politician had a small electronic defibrillator implanted in their chest? We could see targeted assassinations look like medical device malfunctions when in reality someone learned about this device, built an exploit for it, and hired someone to sit within bluetooth range to deliver the payload.

        It's not impossible to do. Unlikely, sure, but this is a result of the high-tech world we are living in.

        [1]: https://nakedsecurity.sophos.com/2019/03/25/medtronic-cardia...

        [2]: https://global.medtronic.com/xg-en/product-security/security...

      • antpls 1620 days ago
        To me, your examples are not about "using information against" someone.

        > Want a new job? Nope! We don't want someone with your condition on our team.

        There are laws in some countries that could forbid such discrimination. Even if the employer lies about the true reasons, lies don't go unpunished forever. The parent is right about it, they are cultural biases.

        > Want to buy a house? Nope! An AI bot says you may not live long enough to pay the loan.

        As said by another reply, your example is purely fictional. Even if it was true, there could be a culture change and new laws to support those people in need.

        > Want to get some ice cream? Can't haz. When you swipe your electronic payment method the database says you're at a risk for diabetes.

        Again, fictional example.

        > There are thousands of other scenarios.

        Sure, if you use all the words in the English dictionary, you can create billions of sentences, but very few will be actually close to the truth...

      • kazinator 1621 days ago
        > Want to buy a house? Nope! An AI bot says you may not live long enough to pay the loan.

        Nobody cares whether you pay or not, as long as the property value isn't under water. They want you to die. If you die without an heir who can take over the payments (perhaps with the help of insurance money), they get everything you paid so far, and the property.

      • blissofbeing 1621 days ago
        The can't-get-ice-cream one is a bit far-fetched, eh? I'm sure that even if they knew my health history they would still sell me some ice cream :)
        • reaperducer 1621 days ago
          Not if the Point of Sale system won't let them.
      • UncleMeat 1621 days ago
        These are independent of any sharing. A medical provider could directly provide these features as a service without sharing any of the underlying data.
    • Altimos 1621 days ago
      It's not the same as the income situation. In that situation it's about whoever pays you not wanting you to know that you're being undervalued compared to your coworkers. With healthcare it's about not wanting your community to know things and have biases against you and treat you differently because of your medical history.

      Imagine you used to have cancer and beat it, and independent of that fact are looking for a new job that offers health insurance.

      If your condition ever resurfaces, you're going to be a really expensive hire, both for your employer in terms of time off (for treatments and recovery) and for your new health insurance provider.

      You might be so expensive that they decide not to hire you, or decide not to cover you.

      If you want to think of your own cases, think about medical conditions that have been stigmatized over the course of history and how people's actual lives are affected by them.

    • triceratops 1621 days ago
      Isn't embarrassment enough? Why do you close the door when you go to the bathroom?

      In addition, many diseases like STD/STIs or mental illness carry a stigma. Information like that, if inadequately protected, can have a social or even financial cost like losing your job.

      • blissofbeing 1621 days ago
        Maybe if it was public these diseases wouldn't be so stigmatized. It might have a social impact now, but I see that as the issue.
        • cortesoft 1621 days ago
          I don't think it should be up to someone else what I am allowed to be embarrassed about, or what I want to keep private.

          I don't think changing society to not be embarrassed about things being made public is a good solution to privacy concerns.

          Also, we might have a million, non-embarrassment reasons to want to keep things private. I might not want people to know about my cancer because I want to be able to live my life without having to answer questions about it, or have people treat me as delicate. I might not want people to know I am pregnant because I don't want to have to tell them when I have a miscarriage.

          Most importantly, I want to be able to have whatever arbitrary reasons I want to keep things private.

        • smolder 1621 days ago
          I think you're underestimating people's ability or inclination to abuse information for personal gain.
    • caseysoftware 1621 days ago
      > Why are all these things supposed to be secret in the first place?

      Because it's detailed, personal, intimate information about you and your immediate family.

      Even if there may be some upsides from releasing portions of it, it must be up to you to control and opt into.

    • smt88 1621 days ago
      Would you want future employers, romantic partners, loan officers, and/or family members to know your medical history?

      What if you were HIV+ or taking Viagra? What if you were just sick and didn’t want to be discriminated against?

      There are life and death implications in medical privacy in most countries. Stigma is real, as are insurance risk algorithms.

      • kazen44 1621 days ago
        > There are life and death implications in medical privacy in most countries. Stigma is real, as are insurance risk algorithms.

        Hell, in some countries people have been killed because of their medical history. Also, centralizing this data is very dangerous if this data falls into the wrong hands at a later date.

        A prime example would be the registration of religion in the Netherlands prior to World War 2 (for taxation reasons), which resulted in a large number of victims during the Holocaust in part because of this record keeping.

      • cloverich 1621 days ago
        So... You are advocating that your HIV status should be hidden from your romantic partner? ;) More seriously, unified health record sharing is inevitable, especially if we are to move to universal healthcare. Would you be against a different entity hosting this data, or just Google?
        • smt88 1621 days ago
          > You are advocating that your HIV status should be hidden from your romantic partner?

          No, I'm advocating that a corporation or data leak does not get to decide when/how someone's HIV status is revealed.

          > Would you be against a different entity hosting this data, or just Google?

          I would be against most entities hosting this data, but Google has proved to be untrustworthy.

          My ideal situation would be for my medical records to be encrypted and only usable with my explicit permission.

    • raxxorrax 1621 days ago
      I see you skipped a lot of work because of health issues lately. That aside and completely unrelated, I think you are just not a good fit in our team.

      I may not be a good person. Or I can rationalize it enough because some people pay me to minimize any risk.

    • foolrush 1621 days ago
      Because the right to your private medical issues is at stake. And that ignores the predatory insurance industry and other discriminatory practices as an aside.
    • dfalfndfk 1621 days ago
      Imagine even more sophisticated advertising from Big Pharma to the old and vulnerable. That is reason enough.

      As a society we are getting better at talking about income. We hide _that_ because of a lack of class consciousness. We should absolutely be transparent with income (except when negotiating for a new salary).

    • blissofbeing 1621 days ago
      So essentially the answer I'm hearing is that there is a fear of what other people will do with the information. Mostly around discrimination and shame.
    • PavlovsCat 1621 days ago
      What is the risk of talking about this? Why is this article pushed to page 2?

          32.  How we built Uber Engineering's highest query-per-second service using Go (2016) (162 points, 11 hours ago, 102 comments)
          33.  Google whistleblower: the medical data of millions of Americans is at risk (160 points, 2 hours ago, 45 comments)
      
      Do you join me in my request to make all flags and votes on HN public?
  • tylerl 1621 days ago
    This strikes me as a whistleblower fail.

    The story isn't, "I know that something bad happened", as whistleblowing is supposed to mean. Instead they complain that essentially, "I am not personally privy to the consent requirements and privacy protections required, nor do I know what has been implemented." Which... congratulations on not being important, I guess?

    But in particular, this person gives an emotional plea that people must be given the explicit opportunity to "opt-out", not from the medical research in general, but from any subset of research conducted in collaboration with Google, specifically because it's Google, rather than another research contractor.

    I dunno. Should Internet users be given the opportunity to opt out from having their IP packets transit fiber owned by AT&T? I mean, sure, I agreed to a bunch of stuff when I signed up with my ISP, but my provider is Comcast. They never mentioned AT&T. And some of those packets might not be encrypted....

  • tmaly 1621 days ago
    A single HIPAA violation for a single record alone is costly. There was an Ask HN post a few weeks back where someone posted a medical-record horror story. They later deleted the post after it was pointed out that even programmers can be criminally liable.
  • vkaku 1621 days ago
    I salute you, O Whistleblower!
  • marmot777 1621 days ago
    There are HIPAA regulations that are supposed to protect the data, but I take it people don't trust that there's sufficient oversight.
  • sys_64738 1621 days ago
    Why does an ad company need my medical data?
  • altgoogler 1621 days ago
    Disclaimer: Googler here, my opinions are my own.

    I have no non-public knowledge of this topic besides these two Guardian reports, and the Google public blog post on the same subject[1].

    I do, however, have nearly a decade in non-Google work experience working in clinical documentation technologies for a company who had BAAs with literally dozens of health companies.

    I simply do not understand the objection this whistle-blower is raising. As far as I understand it, the controversy is simply because Google is involved.

    > Above all: why was the information being handed over in a form that had not been “de-identified"

    DeID is typically used at the edges of an IT system, and is tailored to the rights of certain users accessing the system. If you have a system that says, "A ha! There's a patient with a 5mm AAA with no evidence of follow-up! They need a procedure STAT!", you obviously need to have the original documentation to know who to contact.

    There are ways to keep PHI (identifying info) separated from documentation, but if Google is both the cloud storage provider and doing R&D, both sides of that system would fall on the Google side of the fence.

    The bulk transfer of documents was almost certainly done via HL7v2 messages which, IIRC, don't have any built-in mechanism for redacting PHI, and even if they did, health systems usually lack the expertise to do this consistently across all the BAAs they contract with.
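
    For anyone who hasn't handled these feeds, here is a rough, illustrative sketch of what ad-hoc HL7v2 redaction would even involve: splitting the pipe-delimited segments and blanking the identifying PID fields. The sample message and the fixed field list below are invented for the example; real interfaces (field repeats, escape sequences, Z-segments, free-text OBX notes) are exactly where one-off scripts like this break down, which is the consistency problem I mean:

        # Illustrative only: blank the identifying fields of the PID segment in a
        # pipe-delimited HL7v2 message. The sample message is made up; the field
        # positions follow the standard PID layout.
        PID_PHI_FIELDS = {3, 5, 7, 11, 13, 14, 19}  # IDs, name, DOB, address, phones, SSN

        def redact_hl7(message: str) -> str:
            redacted = []
            for segment in message.split("\r"):
                fields = segment.split("|")
                if fields and fields[0] == "PID":
                    for i in PID_PHI_FIELDS:
                        if i < len(fields):
                            fields[i] = ""
                redacted.append("|".join(fields))
            return "\r".join(redacted)

        msg = ("MSH|^~\\&|SENDER|HOSP|RCVR|CLOUD|202001010000||ADT^A01|1|P|2.3\r"
               "PID|1||12345^^^HOSP^MR||DOE^JANE||19600401|F|||1 MAIN ST^^NYC^NY^10012")
        print(redact_hl7(msg))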

    > I was worried too about the security aspect of placing vast amounts of medical data in the digital cloud.

    I mean, yeah, this is an important set of data. In previous years the issue was people walking out of the health system with their laptop and it getting stolen with thousands of records on it. Cloudification has certainly reduced the vectors for stealing large quantities of data.

    > data potentially being handed on to third parties;

    Their BAA specifically prevents this.

    > adverts one day being targeted at patients according to their medical histories.

    Again, the law specifically prevents this use. These are all hypothetical scenarios.

    > Full HIPAA compliance must be enforced, and boundaries must be put in place to prevent third parties gaining access to the data without public consent.

    There simply is no evidence that full HIPAA compliance isn't being followed.

    > Employees at big tech companies having access to personal information

    Thousands of employees at Ascension have this level of access. I personally had access to millions of health care records. BUT! There were auditing systems in place. If you attempted to use that access outside of the scope of your job, you're fired, no second chance.
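
    As a rough sketch of what that auditing amounts to in practice (everything below is invented for illustration, not how Google or my former employer actually built it): every read of a record gets logged with who, what, and a stated reason, so out-of-scope access can be reviewed after the fact.

        # Illustrative only: log who touched which record and why, so out-of-scope
        # access can be reviewed later. All names here are invented.
        import logging
        from datetime import datetime, timezone

        logging.basicConfig(level=logging.INFO)
        audit_log = logging.getLogger("phi_access")

        def fetch_record(store: dict, record_id: str, user: str, reason: str) -> dict:
            audit_log.info("user=%s record=%s reason=%r at=%s",
                           user, record_id, reason,
                           datetime.now(timezone.utc).isoformat())
            return store[record_id]

        records = {"mrn-12345": {"dx_codes": ["E11.9"]}}
        fetch_record(records, "mrn-12345", user="analyst7", reason="billing reconciliation")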

    > To quote one of my role models, Luke Skywalker: “May the force be with you”.

    Are you kidding me? Is this satire?

    > In short, patients and the public have a right to know what’s happening to their personal health information at every step along the way.

    In short, the concerns here are all hypothetical. There's no evidence of any wrongdoing. There is no proposal to actually address these concerns in a practical way.

    [1] https://cloud.google.com/blog/topics/inside-google-cloud/our...

  • peterwwillis 1621 days ago
    > Two simple questions kept hounding me: did patients know about the transfer of their data to the tech giant? Should they be informed and given a chance to opt in or out?

    That's literally what HIPAA was created to address. The user doesn't have to opt-in to every single solitary business that touches their data, because there is a chain of business contracts that explicitly dictate what they can do, that originates with the care provider.

    This is like "blowing the whistle" because your dentist sent your dental impression to a company to create a crown for you. You didn't "opt-in" to that dental company getting your impressions, because your dentist had a contract with them to cover what they would do with it in the first place.

    Jesus christ.

    • JohnFen 1621 days ago
      > This is like "blowing the whistle" because your dentist sent your dental impression to a company to create a crown for you.

      I see a huge difference here, though. The company making my crown doesn't get my complete medical records. They get what is necessary to make the crown.

      Google is getting everything, to use for purposes beyond the patient's immediate medical needs.

      > That's literally what HIPAA was created to address.

      That HIPAA allows this sort of thing to happen is a great reason to pressure lawmakers to improve HIPAA.

      • peterwwillis 1621 days ago
        > Google is getting everything, to use for purposes beyond the patient's immediate medical needs.

        That claim was never made anywhere in this whistleblower account. Furthermore, it's illegal, and anyone working with health care records knows that. You can't use the records for anything other than what was covered by the business associate contract.

        It's clear from the letter that the whistleblower literally has no idea at all what's going on, so they got frightened and yelled fire, in hopes that somebody who knows more than them will come and look for a fire.

        If they did about 30 minutes of research, they'd know that any person whose records are covered by this agreement has the legal right to request from Google all information about how their medical records are being used. All they'd need to do is ask 8 random people to request records from Google, and one of them might be covered. They could then find out the answer, or at least have some kind of evidence of a lack of process or oversight. They have provided none.

        Multiple times in the letter they mention that the public must consent to this. There is no law requiring this, so really the whistleblower is trying to assert their own opinions about public policy, and using the veiled threat of malfeasance to get press time.

        I'd expect this anonymous source to out themselves soon so they can lead some kind of public demand for more visibility (hence more regulation). Which might be good, except the HHS (which actually regulates HIPAA claims) already doesn't enforce it much at all. So it's probably just going to be used as a political tool to gain votes without actually improving people's lives.

        • JohnFen 1621 days ago
          > That claim was never made anywhere in this whistleblower account.

          I may have misread the various reports and press releases, of course. If so, please do correct me. But I believe that everyone, including Google and the medical group, has mentioned this data is to be used to train a ML engine.

          > You can't use the records for anything other than what was covered by the business associate contract.

          My understanding is that the ML use is covered by the BAA. Of course, I haven't read it, so I don't know, but it seems likely.

          I seriously doubt that anyone is actually overtly and intentionally breaking any laws here.

  • uoaei 1621 days ago
    My housemate is not careful with their digital privacy so their devices are all unambiguously theirs and the profile assembled by advertisers is likely cohesive across platforms.

    Two days ago they said they've been getting a lot more ads about prescription vaginal creams lately. This wasn't happening a week ago. There is reason to expect that someone with access to their medical history would segment them as a likely target for such ads. The topic hasn't come up in their life recently so it would have had to have been from medical records.

    To be perfectly clear: this probably wouldn't be happening if Google was telling the truth about not using the data for ads.

    It's time to stop believing ad networks when they say they're "not going to use this data for ads" and "are doing everything they can to keep the data safe".

    Singular anecdatum, I know, but if we find enough examples of similar occurrences we may be able to build a strong case that they're simply lying about the security of our data in their hands.

    • markstos 1621 days ago
      As a man, I had ads about custom-fit bras follow me around the internet for a while. This had nothing to do with my medical records. I read an article about a related company.

      Sorry, getting ads about medical products is not evidence that the advertising platform has your medical records.

      • earthboundkid 1621 days ago
        Facebook has been giving me ads for a quit smoking drug for a month or two. (Chantrix? Shows how well the ad worked. I remember there's a picture of a cold turkey.) I don't smoke now, and I've never smoked regularly. Machine learning!