
Reflection - Part 4: Security

  • Xiruo Ding
  • May 2, 2020
  • 3 min read

Updated: May 31, 2020

From Xiruo:

It was interesting to start from that 2019 NYT story and move on to more and more similar reports. What is more interesting is that they all seem to be connected to one company, which has already raised plenty of privacy issues and been involved in numerous scandals over monitoring people outside the healthcare world. I think it is a quite natural reaction to be scared when people connect Google with their health records, which most of us assume are kept only by our physicians.


The second interesting part is that many research hospitals are actually doing research similar to Google's, yet very little news of them being sued has appeared. I personally think that is a benefit of the IRB, which protects both us and patients; and because we are under the same parent entity as the hospital, researchers at the university seem more trustworthy to patients. Not to mention that it is impossible for us to obtain data beyond patients' health records, unlike the huge amount of data Google has collected.

The third interesting part is that, most of the time, universities are not good at putting ideas into broad, general use for the public. Big companies, with sufficient funding, adequate incentives, and mature engineering teams, have great potential for turning ideas into applications. Setting the "selling data" part aside, I think Google has some advantages in mining health data. That being said, there should also be adequate laws and regulations, and more importantly, parties that enforce those rules routinely. Making the process public and monitored is one way to eliminate the public's underlying distrust.


From Yue:

The reason I am interested in HIPAA is that I once noticed a medical student download PHI data onto his laptop without encryption. That is very dangerous. All of us are required to complete HIPAA training first, but some people may not take it seriously. He might have done it for convenience, since connecting to the VPN and logging into the secure workstation takes several extra minutes. He might have thought that the laptop's login password would also protect that information. However, a Windows password only keeps honest people honest, protecting your computer from casual unauthorized access. If an attacker gains physical access to your computer, all bets are off, and a Windows password won't help much. Encryption is not mandatory under the HIPAA Rules, but it cannot be ignored. Any disclosure of protected health information that is not permitted under the HIPAA Privacy Rule can attract a financial penalty and have an adverse impact on future scientific cooperation.


From Jake:

In my readings, I tried to tie what we are doing with natural language processing to security and privacy. Enter the i2b2/UTHealth challenge. The challenge provides a conduit between NLP and privacy: it focuses on using NLP for de-identification of data, which allows researchers to conduct research safely. I read three papers for this analysis, which provided an overview of the task and the specific methods used. It was interesting to see quintessential machine learning rubrics applied in a privacy and security setting. It appears that removing HIPAA-related information is almost a science, although the authors note it cannot be done with computers alone. Precision, recall, and F-measure were used to evaluate the output. All in all, the i2b2 challenge provides insight into one of the main ways NLP can be used to protect privacy.
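To make the evaluation concrete, here is a minimal sketch of how precision, recall, and F-measure score a de-identification system. The PHI spans below are made-up examples (character offsets in a hypothetical note), not data from the i2b2 challenge itself:

```python
# Hypothetical PHI annotations, each a (start, end) character-offset span.
gold = {(0, 4), (25, 35), (60, 70)}   # spans a human annotator marked as PHI
pred = {(0, 4), (25, 35), (80, 90)}   # spans the system flagged as PHI

tp = len(gold & pred)                  # true positives: PHI correctly found
precision = tp / len(pred)             # fraction of system output that is real PHI
recall = tp / len(gold)                # fraction of real PHI the system found
f1 = 2 * precision * recall / (precision + recall)

print(f"precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
# prints: precision=0.67 recall=0.67 f1=0.67
```

For de-identification, recall is usually the metric that matters most: a missed PHI span is a privacy leak, while a false positive merely redacts too much.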



©2020 by nlppeopleprimer. Proudly created with Wix.com
