Compliance and AI in the dental practice: Where do we go from here?


Artificial intelligence (AI) innovations certainly make this an exciting time to be in dentistry, yet these opportunities must be balanced with an awareness of potential and unknown risks. In this third installment of a three-part series on AI in dentistry, we’ll explore potential risks and future regulatory oversight, as well as provide practical tips for implementing AI in your practice or organization.

Legal experts and insurance carriers are beginning to identify areas of concern in the evolving use of AI. Consider these three areas identified by experts.

Patient privacy laws

According to the Electronic Privacy Information Center (EPIC), a nonprofit research and advocacy center, as of April 2023, 11 states have enacted AI legislation and approximately 14 states have proposed legislation. Many of these new state laws allow consumers to opt out of targeted advertising and profiling, and they also require data protection assessments.

To date, these state privacy laws are consumer-focused, yet they may overlap with federal laws such as HIPAA. HIPAA may also come into play when nonhealthcare AI platforms are used or when a healthcare AI platform experiences a security incident.

Consider the following scenarios. What if your office manager uses ChatGPT to create a dismissal letter that includes the patient’s name, address, and treatment information? Doing so would constitute an unauthorized disclosure of protected health information.

Or what if a patient’s medical history includes information about a substance-use disorder or participation in a maintenance treatment program, and due to an AI malfunction or security incident, that data is compromised? Such an incident may trigger the reporting requirements of the Health Breach Notification Rule. Above all, patient information should not be entered into nonhealthcare AI platforms that are not HIPAA compliant or shared with vendors that have not signed a business associate agreement.

Unlicensed practice of dentistry

AI has the potential to significantly improve healthcare outcomes. However, we must bear in mind that it is ultimately the dentist’s role to make decisions about diagnosis and treatment, even though clinical team members help gather diagnostic information (charting, probing, etc.).

Will AI make it easy or tempting for staff to make or present a diagnosis to a patient that is beyond their scope of practice? In addition, legal experts caution providers to use AI tools to support their clinical judgment and diagnosis rather than as a substitute for them.

Malpractice and related risk

It’s conceivable that a malpractice claim could arise when a patient experiences an adverse outcome and the practitioner used AI as part of the decision-making process. This is an evolving area of risk that currently presents more questions than answers.

Could a plaintiff’s attorney allege a deviation from the standard of care based on reliance on an AI tool? Only time will tell. In the meantime, continue using all the customary diagnostic tools in addition to AI to ensure you have a complete picture of the patient’s oral health, and take care to thoroughly document your findings, analysis, and diagnosis.

What about future regulatory oversight?

According to the National Conference of State Legislatures (NCSL), state-level legislation related to AI is pending in multiple states, a trend that mirrors the EPIC findings mentioned above.

NCSL reports that some of the pending legislation would apply not only to developers but also to deployers and users of the technology. These trends are worth following as AI evolves in healthcare.

Strategies to mitigate risk in your practice

As you embrace AI in your practice or organization, consider the following two risk-prevention activities to help ensure the best outcomes for all: you, your patients, and your practice.

First, develop a workforce policy on the use of generative AI. Larger group practices and dental service organizations (DSOs) may consider organizing a committee of stakeholders to set policy on AI use. This policy could easily dovetail with the corporate compliance plan.

Second, private practices and DSOs alike should review the scope of their existing insurance policies to determine whether they provide any coverage related to the use of AI solutions. Start by having a conversation with your malpractice carrier.

Given the rapid growth of AI discussed in this series, additional developments and efforts to regulate AI in healthcare are highly likely. In the meantime, align your practice with AI programs that understand dental-specific needs and fulfill all the requirements of a HIPAA business associate.

Linda Harvey, MS, RDH, began her career in healthcare as a dental hygienist. Since that time, she’s become a nationally recognized dental risk management and corporate compliance expert. Linda is the founder of the Dental Compliance Institute, an exclusive compliance membership group for dental service organizations and private practices. Harvey has been recognized as a distinguished fellow by the American Society for Healthcare Risk Management.

Ann-Marie C. DePalma, CDA, RDH, MEd, is a graduate of the Forsyth School for Dental Hygienists, Northeastern University, and the University of Massachusetts Boston. She is a fellow of the Association of Dental Implant Auxiliaries, a fellow of the American Academy of Dental Hygiene, a continuous member and fellow of the American Dental Hygienists’ Association as well as a lifetime member of the American Association of Dental Office Management. She is the 2017 Esther Wilkins Distinguished Alumni of Forsyth Award recipient.

The comments and observations expressed herein do not necessarily reflect the opinions of DrBicuspid.com, nor should they be construed as an endorsement or admonishment of any particular idea, vendor, or organization.
