“From round-the-clock patient support to improving the speed and accuracy of care outcomes, ChatGPT has already found its place in the sector.” – Will Long, MIS, CHISL, CISSP, CPHIMS, Chief Security Officer, Enterprise Security & Technology, First Health Advisory
SC Media just released an article, "There's no need for providers to ban ChatGPT use in healthcare," written by First Health Advisory's Chief Security Officer of Enterprise Security & Technology, Will Long, MIS, CHISL, CISSP, CPHIMS. The article highlights compliance concerns in the healthcare industry amid ChatGPT's growing popularity. Long notes that while there are legitimate reasons for concern, since data breaches are plausible, ChatGPT remains a powerful tool for the healthcare industry. He suggests that healthcare entities work ChatGPT into their risk assessments and compliance measures rather than banning its use altogether. By taking the appropriate precautions, healthcare entities can securely derive benefits from the tool without inadvertently exposing data.
For more information on how First Health Advisory can help your digital health system assess your security practices and implement proven security measures, contact us at [email protected] to schedule with a Chief Security Officer.