
When a patient enters a clinic suffering from debilitating pain that disrupts their ability to work, sleep, engage with family and friends, or pursue a healthy lifestyle, ‘compliance’ is likely not the foremost concern of the physical therapist. The therapist’s primary goal is to address the patient’s condition, alleviate their discomfort, celebrate their recovery, and move on to help the next person in need. However, numerous challenges often arise that complicate this straightforward approach.

Many therapists harbor a legitimate fear of non-compliance, having been rigorously trained in physical therapy school on fraud, waste, and abuse, HIPAA regulations, and Medicare payer rules, and on the consequences of violating them. While this fear is instilled early, its gravity is often fully realized only once a compliance issue arises. This can be compared to a teenager learning to drive: despite passing the test and being warned not to speed, the reality of those warnings only hits when they see flashing blue lights in the rearview mirror.

Many early-career therapists are driven to become the best in their field, often pursuing additional courses or certifications in the latest treatment techniques. While this dedication to professional development is commendable, compliance is a different matter: there, therapists depend heavily on external systems. They rely on the accuracy of the information provided during their education, trusting that clinical instructors demonstrate compliant practices. They also depend on electronic medical record (EMR) vendors to correctly interpret federal regulations and billing and coding requirements, and to update systems as the field evolves. When questions arise, therapists often turn to supervisors for guidance, while practice owners may need to consult a paid expert or resort to online searches for direction. Unfortunately, this reliance on external sources can sometimes lead to incomplete information and, ultimately, errors.

There is no question that this profession has a payment problem. Undeniably, many parts of this country need more therapists. The American Physical Therapy Association (APTA) reported in 2019 that although 10,000 therapists graduate annually, a shortage of approximately 26,000 physical therapists is expected by 2025.1 These are significant barriers to private practices and their earning potential. Unfortunately, some of these issues are impossible for a clinic owner to address directly. Private practice owners are forced to address tangible complications to their business. The focus is profitability. To achieve it, the owner must retain employees, maximize payment, and minimize denials and audits. Two hot topics in physical therapy can help in these circumstances: reducing administrative burden and incorporating artificial intelligence (AI) into the practice. The underlying question remains, as with any issue in physical therapy: how does one change and adapt yet stay compliant?


Administrative Burden


When looking at global healthcare costs, nonclinical activities such as documentation, billing, and compliance are estimated to account for between 8% and 34% of overall healthcare spending.2 This share is significantly higher than in other wealthy countries and contributes substantially to the United States’ world-leading healthcare spending.2

When the physical therapist shortage is combined with mounting nonclinical tasks, rising productivity standards, and a growing number of patients needing treatment (think baby boomers), it is no wonder that physical therapists are showing increased emotional exhaustion.2 Emotional exhaustion, as defined in a study by Lee and Chelladurai, is when “one feels overextended in their job, experiences extreme fatigue, and feels their emotional resources have been drained.”3 According to the APTA, 82.4% of physical therapists experience burnout.2 Other studies have shown that younger practicing therapists with 10 or fewer years of experience have a higher rate of emotional exhaustion than older therapists with 30-plus years of experience.4

APTA has also described other nonclinical drivers of administrative burden, including prior authorization complications, improper payments, claim denials, and denial appeals.5 Medicare reports similar data, asserting in its 2022 Medicare Fee-for-Service Supplemental Improper Payments Data report that physical therapists in private practice billing Part B services demonstrate a 15.8% improper payment rate, equaling approximately $449.5 million.6 Studies show a correlation between burnout in medical professionals and patient dissatisfaction, high costs, poor outcomes, and increased medical errors.4

A frequent response therapists give when asked about the most significant contributors to administrative burden in physical therapy practice is documentation. While electronic medical records (EMRs) aim to alleviate some of this burden, outpatient physical therapy’s documentation requirements and productivity standards remain intense. Consider a typical day for a therapist in private practice, where productivity standards often require seeing an average of 12 patients daily. For this scenario, two of the 12 appointments are new patient evaluations. To make the case more realistic, one patient is covered by Medicare and needs a progress report, while another is post-operative and requires an update note to send to their surgeon before a follow-up visit.

Breaking down the time allocation, each patient session is typically 45 minutes, with some overlap between patients. Outside of treatment, documentation time quickly adds up. Even for experienced therapists using efficient EMR systems, documentation often spills into lunch breaks or after hours. In this example, daily treatment notes take 3 to 5 minutes each, evaluations require 15 to 20 minutes, and progress reports or surgeon updates add another 10 minutes. This results in 80 to 120 minutes spent solely on documentation—on top of the 45-minute treatment sessions for each of the 12 patients.
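As a rough check on those numbers, the arithmetic can be laid out explicitly. The sketch below uses only the illustrative time estimates above (all assumptions, not measured data); the low end assumes evaluation visits do not generate a separate daily note, while the high end assumes every visit does.

```python
# Back-of-the-envelope documentation-time estimate for the example day above.
# All figures are the illustrative assumptions from the text, not measured data.

PATIENTS = 12          # total visits in the day
EVALS = 2              # new-patient evaluations among those visits
REPORTS = 2            # one Medicare progress report + one surgeon update

NOTE_MIN, NOTE_MAX = 3, 5      # minutes per daily treatment note
EVAL_MIN, EVAL_MAX = 15, 20    # minutes per evaluation write-up
REPORT_MINUTES = 10            # minutes per progress report / surgeon update

# Low end: evaluation visits do not need a separate daily note.
low = (PATIENTS - EVALS) * NOTE_MIN + EVALS * EVAL_MIN + REPORTS * REPORT_MINUTES
# High end: every visit gets a daily note on top of any evaluation write-up.
high = PATIENTS * NOTE_MAX + EVALS * EVAL_MAX + REPORTS * REPORT_MINUTES

print(f"Documentation time: {low} to {high} minutes per day")
# -> Documentation time: 80 to 120 minutes per day
```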

Unsurprisingly, private practice owners frequently turn to EMR vendors for solutions to this paradox: the desire for increased patient volume and reimbursement is at odds with the risk of therapist burnout. One emerging solution that EMR vendors are exploring to reduce this burden is artificial intelligence (AI).


Artificial Intelligence


Without question, AI has become a significant part of our daily lives. One need only look at Amazon’s Alexa or at how Netflix personalizes its recommendations.7 With the evolution of resources like ChatGPT and similar services, AI has made a significant leap into everyday practice across many professions. ChatGPT is an advanced language-model AI that performs tasks such as providing information and answering questions.8 It can also translate languages, generate code, and produce original writing samples.8

These are just a few examples of AI in everyday use. Many articles delve deeply into what AI is and the many different types of AI. The science behind AI, machine learning, cognitive computing, or natural language processing can be challenging to follow.9, 10 Given the numerous resources that provide detailed definitions and examples, this article will not revisit that information. Instead, it will focus on exploring the current applications and, ultimately, the associated risks.

In healthcare specifically, it is well known that advances in AI are improving diagnosis and treatment, especially in imaging services, new drug development, clinical decision support, personalized care, monitoring changes in chronic disease, predictive analytics, research, and surgical accuracy.7, 11 The goals of AI in medicine mirror those across healthcare and other industries: improving cost efficiency, increasing access, augmenting human skills, enhancing decision-making, increasing security and safety, personalizing services, and increasing productivity.7

Keith Loria’s article “What’s the Impact of AI on Physical Therapy?” in APTA Magazine describes the medical uses of AI in the realm of PT. He states that there are clinical applications in which AI helps predict injuries and match the best treatments to specific presentations.12 As with AI goals throughout medicine, the same is true in physical therapy: reducing documentation time enables the therapist to focus more on the patient.12 Every therapist is looking for ways to decrease the documentation burden, whether through AI that records the therapist’s voice and transcribes it into the appropriate place within the note or sensors that capture accurate range-of-motion measurements. While these needs are evident, Loria reports that AI utilization in administrative tasks might increase quickly in the PT profession.12 Technology is making great strides in all aspects of medical practice, with many companies and organizations striving to meet these needs through technology and AI. AI will revolutionize the healthcare industry, including physical therapy, with goals of “…improving diagnostic accuracy, personalizing treatment plans, enabling remote monitoring, enhancing therapy experiences, and optimizing administrative processes.”12

While these goals and advancements sound great to the therapist facing emotional exhaustion bordering on burnout, the truth is that AI is still new and comes with real drawbacks and pitfalls. Many of these are common across professions and industries. Specific to healthcare, AI’s drawbacks include bias, which can produce misdiagnosis or underdiagnosis within specific demographic populations. Another drawback is privacy and security concerns, especially when dealing with large amounts of patient data. Interoperability, technology dependence, and ethical concerns are other drawbacks of AI.7, 11, 13

Another concern when dealing with AI is its algorithmic nature. One algorithmic complication is overfitting, which occurs when “too many variables influence the results, leading the algorithm to make inaccurate predictions.”13 While there are several issues with AI, one crucial category that should not be minimized is the social aspect. There is a genuine fear that AI will eliminate or re-engineer jobs.13
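To make overfitting concrete, here is a minimal sketch on synthetic, non-clinical data (the linear “true” relationship and noise level are illustrative assumptions): a model with far too many free variables fits its training points almost perfectly yet predicts new points poorly.

```python
# A minimal sketch of overfitting: a model with too many free variables
# fits its training data almost perfectly but predicts new data poorly.
import numpy as np

rng = np.random.default_rng(0)

def noisy_line(x):
    # The true relationship is linear; the noise simulates measurement error.
    return 2.0 * x + 1.0 + rng.normal(scale=1.0, size=x.shape)

x_train = np.linspace(0, 10, 12)
y_train = noisy_line(x_train)
x_test = np.linspace(0.5, 9.5, 12)   # new, unseen inputs
y_test = noisy_line(x_test)

for degree in (1, 11):               # 1 = simple model, 11 = far too many variables
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train error {train_err:8.3f}, test error {test_err:10.3f}")

# Typical output: the degree-11 fit drives training error toward zero while its
# test error explodes -- inaccurate predictions on anything it has not seen.
```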

When addressing concerns surrounding AI in physical therapy, the risk of AI-generated hallucinations presents a significant and tangible threat. A hallucination occurs when an AI presents incorrect information as fact.13 With AI being the cutting-edge technology for vendors in the healthcare space, the physical therapy profession must be aware that many vendors may not employ experts from this industry, and these services may undergo minimal to no validation. Physical therapists must ensure that AI technologies are held accountable by AI specialists and qualified specialists within the PT community. It takes more than a single staff therapist providing feedback to create standards and hold them to healthcare’s already defined and deeply held commitments to safety and awareness.13

One of the fundamental goals of practice owners utilizing AI in healthcare is to help lower work demands while improving payment. As outlined above, numerous risks are associated with using AI in healthcare. Lowering risk is one of the hallmark duties of a competent compliance program. The question now is whether a current compliance program is robust enough to cover AI usage entirely, or whether this is an opportunity to modify current laws, regulations, policies, and procedures to protect all parties from AI’s risks.


Compliance


It is well documented that AI will help decrease administrative burden. Yet applying it to patient data, diagnostic assistance, and treatment specificity can undeniably lead to ethical and legal issues. Already, there are lawsuits against insurance companies regarding their AI usage: Cigna, Humana, and UnitedHealthcare (UHC). In UHC’s case, the plaintiffs claimed that UHC used an AI algorithm to improperly deny patient claims in post-acute care. Interestingly, approximately 90% of these denials were ultimately paid when appealed.14 Humana was sued early this year for using the same AI algorithm that UHC had been sued over several months prior. This algorithm examines patients’ skilled nursing stays and attempts to predict the length of these rehabilitation stays; based on those predictions, Medicare Advantage beneficiaries are approved or denied admittance to skilled care. According to the plaintiffs, the algorithm demonstrated a “high error rate and often contradicted doctors’ recommendations.”15

At the end of 2023, President Biden signed an Executive Order on “Safe, Secure, and Trustworthy Artificial Intelligence (AI).”16 This executive order addresses several key compliance elements, including creating new standards for AI safety and security and protecting privacy.16 It is a good starting point for compliance teams to begin coordinating with and around AI to minimize the elevated risks of the current AI environment. While compliance efforts continuously focus on reviewing, monitoring, and optimizing processes for maximum effectiveness and benefit, there are valid reasons for utilizing AI to support compliance initiatives. For example, AI could automate a compliance task such as autogenerating an incident report and delivering it to the appropriate person whenever a criterion monitored by a compliance professional falls below or rises above a given limit, as sketched below.9
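A minimal sketch of that kind of automation follows, assuming hypothetical metric names, limits, and recipients; a real system would pull live data and route alerts through approved channels.

```python
# A minimal sketch of the automated incident report described above.
# The metrics, thresholds, and recipients are hypothetical placeholders.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ComplianceMetric:
    name: str
    value: float
    lower_limit: float
    upper_limit: float
    owner_email: str  # the compliance professional responsible for this metric

def deliver(report: str, recipient: str):
    # Placeholder: a real implementation would email or file a ticket.
    print(report, "\n")

def check_metrics(metrics):
    """Auto-generate an incident report for any metric outside its limits."""
    for m in metrics:
        if not (m.lower_limit <= m.value <= m.upper_limit):
            report = (
                f"COMPLIANCE INCIDENT {datetime.now(timezone.utc).isoformat()}\n"
                f"Metric: {m.name}\n"
                f"Observed: {m.value} (allowed range {m.lower_limit}-{m.upper_limit})\n"
                f"Routed to: {m.owner_email}"
            )
            deliver(report, m.owner_email)

check_metrics([
    ComplianceMetric("claim_denial_rate_pct", 18.2, 0.0, 10.0, "compliance@example.com"),
    ComplianceMetric("notes_cosigned_within_48h_pct", 97.0, 95.0, 100.0, "compliance@example.com"),
])
```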

Even though everyone could benefit from AI resources, incorporating AI into work processes elevates risk. In healthcare, those elevated risks include patient privacy laws, billing compliance, and bias and unlawful discrimination, to name a few.17 The risks are numerous, and the fear is that they become systemic. Significant oversight and regulation could be inevitable if the government perceives a systemic issue. Systemic risk is a matter of error volume: when one human billing coder makes a mistake, the errors are minor and contained; when AI performs the work of hundreds or thousands of humans, the potential errors could number in the tens of thousands.18 Data privacy is a widely reported risk for AI in healthcare. Because AI tasks typically collect and store significant amounts of data containing patient and personal information, it is incumbent on the business or organization to evaluate whether this violates patients’ privacy rights, state or federal laws and regulations, or HIPAA.18

Some current uses of AI in healthcare border on these violations. An example returns to the ever-popular ChatGPT, mentioned earlier as an illustration of the benefits AI’s power can bring. It has limitations as well: lack of critical thinking, vulnerability to bias, sensitivity to input phrasing, and limited common sense and real-world understanding, to name a few.8 With limitations such as these, there are inevitably compliance risks, including patient privacy, informed consent, and medical liability and malpractice.8 Leslie Boles, certified in healthcare compliance, presented on AI and its healthcare risks at a Regional Healthcare Compliance Conference in Chicago. She offered an example of non-compliance with ChatGPT: a healthcare provider copies symptom and diagnosis information from the medical record and pastes it into ChatGPT. Boles reports that “this action automatically violates privacy laws.”8 There have also been instances of lawyers fined thousands of dollars for citing case law that ChatGPT had fabricated. Lastly, ChatGPT is prone to cyber-attacks; it has been reported that the data of over 100,000 ChatGPT accounts were stolen and sold on the dark web.8
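Boles’s example suggests a practical safeguard: screen text for identifiers before it ever leaves the organization. The sketch below is only illustrative; its regular-expression patterns are simplistic placeholders, and genuine HIPAA de-identification requires a vetted tool and policy, not a handful of patterns.

```python
# An illustrative pre-submission screen for PHI: block text containing obvious
# identifiers before it is sent to an external AI service.
# NOTE: these regex patterns are simplistic placeholders, not a HIPAA-grade tool.
import re

PHI_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-. ]\d{3}[-. ]\d{4}\b"),
    "mrn":   re.compile(r"\bMRN[:#]?\s*\d{6,}\b", re.IGNORECASE),
    "dob":   re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def screen_for_phi(text: str) -> str:
    """Raise if likely PHI is found, so the text never leaves the organization."""
    hits = [name for name, pattern in PHI_PATTERNS.items() if pattern.search(text)]
    if hits:
        raise ValueError(f"Possible PHI detected ({', '.join(hits)}); submission blocked.")
    return text

note = "Pt reports R knee pain post TKA. MRN: 00482913. Follow up 4/15/2024."
try:
    screen_for_phi(note)
except ValueError as err:
    print(err)   # -> Possible PHI detected (mrn, dob); submission blocked.
```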


The Future


There are authentic and valid reasons to worry about AI in healthcare. As mentioned earlier, healthcare providers face a significant burden. AI can help with this burden, but at what risk? The drawbacks are known. Compliance should not be minimized in pursuit of improved finances and employee retention. With these issues known, some simple steps can be implemented to mitigate the risks.

The APTA has taken positive steps to address the administrative burden and the pitfalls of AI in several ways. APTA has published survey summary data on administrative burden, calling for standardization of documentation requirements, coverage policies across all payers, new prior authorization processes, and unrestricted direct access under payer policies.5 As for artificial intelligence, APTA’s House of Delegates approved three motions in the 2024 session that address AI. One acknowledges the importance of AI in physical therapy and its potential for “expanding access, enhancing care delivery models, promoting safety, reducing administrative burden, and improving outcomes.”19 In another, the APTA states its opposition to payers using artificial intelligence to restrict access or reduce payment.20 The final 2024 House of Delegates motion concerning AI deals with the ethical considerations and integration of AI in PT practice, education, and research.21 While these are significant steps in the profession’s recognition of AI’s risks, further compliance safeguards are needed.

There is a consensus that businesses and organizations need to “self-regulate” to guarantee that AI is used responsibly, ethically, and in accordance with the law.8, 12, 17, 18 A compliance program has seven main components, and a program that adds AI concepts should retain all of the standard components, including:

  • AI oversight responsibilities by a compliance team
  • Written policies and procedures specifically for AI
  • AI auditing and monitoring
  • Processes created to address AI-specific issues
  • Open lines of communication18

When discussing ChatGPT and similar AI systems specifically, organizations should provide extensive training to the team members who will use them. Clear guidelines, policies, and procedures should be implemented and accessible to all. Other suggestions for managing AI usage include collecting and retaining detailed logs of ChatGPT interactions, conducting periodic audits, reviewing all third-party vendors’ usage terms and disclaimers, and addressing the HIPAA requirements of patient consent, data access controls, and data breach notification protocols.8 A sketch of such an interaction log follows.
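As one illustration of the logging suggestion, a thin wrapper can record every prompt and response for later audit. The send_to_model() function below is a hypothetical stand-in for whatever vendor API an organization actually uses, and the JSON Lines file stands in for properly secured storage.

```python
# An illustrative audit log for AI interactions: every prompt/response pair is
# recorded with user, timestamp, and model so periodic audits have a paper trail.
import json
from datetime import datetime, timezone
from pathlib import Path

AUDIT_LOG = Path("ai_interaction_log.jsonl")

def send_to_model(prompt: str) -> str:
    # Placeholder for the actual vendor API call (e.g., a chat-completion request).
    return "model response goes here"

def logged_ai_call(user_id: str, model: str, prompt: str) -> str:
    response = send_to_model(prompt)
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user_id,
        "model": model,
        "prompt": prompt,
        "response": response,
    }
    # Append-only JSON Lines file; a real deployment would use secured storage.
    with AUDIT_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return response

logged_ai_call("therapist_042", "example-model", "Draft a patient-education handout on ACL rehab.")
```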

When organizing compliance oversight of data gained, created, or edited by AI, organizations should ask multiple pertinent questions to help build a compliance program that thoroughly addresses the risks of utilizing AI in the healthcare system. Matt Schwartz, presenting on “Artificial Intelligence in Healthcare: Compliance and Legal Considerations” at a conference, listed several such questions (a sketch addressing the first appears after the list):

  • “How will the accuracy of the model be evaluated? Along what metrics? What is the target accuracy rate/error rate?
  • To what laws, regulations, rules, and requirements must the AI solution adhere?
  • How will sensitive data be protected and secured?
  • Was PHI used in the AI training? If so, how? Was the data de-identified?
  • Did the data used to train the model include errors or bias? How do we know?”18
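Schwartz’s first question can be made concrete with a simple held-out evaluation. In the sketch below, the billing-code labels, the 95% target, and the idea of comparing model-suggested codes against human-verified ones are all hypothetical illustrations.

```python
# A minimal sketch of answering "How will the accuracy of the model be
# evaluated? Along what metrics? What is the target accuracy rate/error rate?"
# The validation labels and the 95% target below are hypothetical assumptions.

def evaluate(predictions, ground_truth, target_accuracy=0.95):
    """Compare model output to clinician-verified labels on a held-out set."""
    assert len(predictions) == len(ground_truth)
    correct = sum(p == g for p, g in zip(predictions, ground_truth))
    accuracy = correct / len(predictions)
    error_rate = 1.0 - accuracy
    verdict = "PASS" if accuracy >= target_accuracy else "FAIL"
    print(f"accuracy={accuracy:.1%} error_rate={error_rate:.1%} "
          f"target={target_accuracy:.0%} -> {verdict}")
    return accuracy

# Example: model-suggested billing codes vs. codes verified by a human coder.
model_codes =    ["97110", "97140", "97530", "97110", "97112", "97530"]
verified_codes = ["97110", "97140", "97110", "97110", "97112", "97530"]
evaluate(model_codes, verified_codes)
# -> accuracy=83.3% error_rate=16.7% target=95% -> FAIL
```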


Beyond these questions, Schwartz describes what he feels is needed to build a robust compliance program that addresses AI in all its aspects. Organizations should create a steering committee to oversee the implementation of AI, and this group should develop quality standards. Businesses and organizations should take further actions for AI compliance, including building AI-specific policies and procedures, tools and processes for monitoring contractual and privacy risks, and tools and techniques for testing and monitoring AI solution quality and accuracy.18 Answering these questions and completing these actions will help should there be any claims of AI mistakes or systemic errors.

Dan Hurley serves as the Chief Innovation, Practice, and Growth Officer at AutoMynd, a company leveraging artificial intelligence (AI) to help home health clinicians streamline documentation processes and mitigate the risk of burnout. According to Mr. Hurley, AutoMynd Copilot—an AI-driven, point-of-care application—employs generative AI and ambient listening technology to enhance documentation accuracy and clinical efficiency. The AI system listens to conversations between clinicians, patients, caregivers, or family members during treatment, capturing critical data. Additional features include auto-population of documentation via photographs of medications and advanced wound imaging that helps measure and stage wounds for optimal care. AutoMynd’s focus on home health enables the automated generation of OASIS forms in real time based on data collected during treatment sessions. While automation streamlines much of the process, the system still requires clinicians to review and verify all information before final submission to the electronic medical record.

Regarding AutoMynd Copilot and how it addresses compliance concerns such as those raised by Matt Schwartz, Mr. Hurley highlights its robust security framework. AutoMynd’s home health-specific generative AI operates securely within the Microsoft Azure ecosystem, eliminating the need for external data sources. In conjunction with other industry-standard security protocols, such as database security, System and Organization Controls (SOC) 2 Type 2 internal controls, Health Insurance Portability and Accountability Act (HIPAA) compliance, and HITECH security, AutoMynd is committed to maintaining the highest standards of security and accuracy. Through ongoing internal testing to ensure reliability and eliminate bias, Mr. Hurley emphasizes that their AI solutions prioritize compliance and patient data protection.

Integrating AI into healthcare and physical therapy promises significant advancements in clinical, administrative, and support functions, leading to enhanced efficiency and financial sustainability for clinics. However, it is essential to recognize that while AI can streamline operations, a physical therapist’s expertise and human touch in identifying and addressing patient concerns remain irreplaceable. To ensure AI’s effectiveness while upholding ethical and compliance standards, it must operate under the oversight of both compliance professionals and clinicians. A comprehensive compliance program that includes annual in-depth evaluations of internal AI policies and periodic reviews of evolving regulatory guidelines is crucial for mitigating risk and staying current with legal requirements. Additionally, vendors providing AI tools to improve efficiency must establish and maintain policies that ensure ethical conduct and compliance, fostering trust among healthcare providers and the patients they serve.


References:


  1. Link K, Kupczynski L, Panesar-Aguilar S. A Correlation Study on Physical Therapy and Burnout. Int J Soc Sci Educ. 2021;11:63-78.
  2. Kyle MA, Frakt AB. Patient administrative burden in the US health care system. Health Serv Res. 2021 Oct;56(5):755-765.
  3. Raintree Systems. The High Cost of Burnout in Physical Therapy Practices. Raintree Blog. Published April 14, 2023. Updated September 20, 2024. Accessed September 25, 2024. https://www.raintreeinc.com/blog/cost-of-burnout-in-physical-therapy/
  4. Richter R. Battling Burnout: Programs that address the stresses of being a physician begin to show results. Published September 5, 2019. Accessed September 15, 2024. https://stanmed.stanford.edu/programs-addressing-doctor-burnout/
  5. American Physical Therapy Association. The Impact of Administrative Burden on Physical Therapist Services. Published March 7, 2023. Accessed September 15, 2024. https://www.apta.org/advocacy/issues/administrative-burden/infographic.
  6. Centers for Medicare & Medicaid Services. Medicare Provider Compliance Tips: Physical Therapist in Private Practice. Published December 2023. Accessed September 15, 2024. https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNProducts/medicare-provider-compliance-tips/medicare-provider-compliance-tips.html#PhysicalTherapy.
  7. Vodanovic M, Subasic M, Milosevic D, Pavicin IS. Artificial Intelligence in Medicine and Dentistry. Acta Stomatol Croat. 2023 Mar;57(1):70-84.
  8. Boles LV; The Bots are Coming: Navigating Healthcare Compliance Risks in the Age of AI. Presented at: HCCA Chicago & Kansas City Regional Healthcare Compliance Conference; September 14, 2023.
  9. Brill A; Artificial Intelligence and Compliance Programs. COSMOS Compliance Universe. 2024: 1-5.
  10. Vanderhoff M. AI: Huge Potential, or an Impenetrable Black Box? Published Monday, June 1, 2020. Accessed September 15, 2024. https://www.apta.org/apta-magazine/2020/06/01/ai-huge-potential-or-an-impenetrable-black-box
  11. Hitrust Services Corp. The Pros and Cons of AI in Healthcare. Published November 20, 2023. Accessed September 15, 2024. https://hitrustalliance.net/blog/the-pros-and-cons-of-ai-in-healthcare
  12. Loria K. What’s the Impact of AI on Physical Therapy? How artificial intelligence can enhance physical therapist services — and when PTs should use caution. APTA Magazine. Published December 1, 2023. Accessed September 15, 2024. https://www.apta.org/apta-magazine/2023/12/01/impact-ai-physical-therapy.
  13. Khan B, Fatima H, Qureshi A, et al. Drawbacks of Artificial Intelligence and Their Potential Solutions in the Healthcare Sector. Biomed Mater Devices. 2023 Feb 8: 1–8.
  14. Hall AT. Lawsuit Claims UnitedHealthcare Uses AI to Deny Majority of Medicare Advantage Extended-Care Facility Claims. Published January 18, 2024. Accessed September 15, 2024. https://www.jdsupra.com/legalnews/lawsuit-claims-unitedhealthcare-uses-ai-8036102/#:~:text=In%20a%20recent%20class%20action,facilities%20and%20in%2Dhome%20care.
  15. Vogel S. Humana used algorithm to deny care to Medicare Advantage patients, lawsuit claims. Published Dec. 13, 2023. Accessed September 15, 2024. https://www.healthcaredive.com/news/humana-lawsuit-algorithm-medicare-advantage-deny-claims/702403/?utm_source=Sailthru
  16. McSwain D, Kennedy P; Responsible and compliant implementation of Artificial Intelligence (AI) technologies in EHRs and clinical practice. Presented at: HCCA Regional Conference – Charlotte, NC; January 19, 2024.
  17. Joseph AM, Sherer JD. Artificial intelligence: Compliance considerations for provider organizations. COSMOS Compliance Universe. September 2024.
  18. Schwartz M; Artificial Intelligence in Healthcare: Compliance and Legal Considerations. Presented at: HCCA Northeast Regional Conference; May 7, 2021.
  19. Luo P, Jordan J, Student Council. RC 13-24 Amend: Digital Health Technologies, Digital Therapeutics, and Artificial Intelligence in Physical Therapist Practice. 2024:1-3. Accessed September 20, 2024. https://www.apta.org/siteassets/pdfs/2024-motions/rc_13_24_amend_digital_health_tech_therapeutics_pt_240712.pdf
  20. Saladin LK, Stickley LA, Ohio Chapter, South Carolina Chapter, Cardiovascular and Pulmonary Physical Therapy Section, Student Council. RC 12-24 Adopt: Inappropriate Use of Artificial Intelligence by Payers. 2024:1-3. Accessed September 20, 2024. https://www.apta.org/siteassets/pdfs/2024-motions/rc_12_24_adopt_inappropriate_use_ai_payers-240517.pdf
  21. DiFilippo A, Duijn AV, Ohio Chapter, South Carolina Chapter, Cardio Pulmonary Section, Innovation and Leadership, Student Council. RC 11-24 Adopt: Ethical and Effective Integration of Artificial Intelligence Across Physical Therapy Practice, Education, and Research. 2024: 1-3. Accessed September 20, 2024. https://www.apta.org/siteassets/pdfs/2024-motions/rc_11_24_adopt_ethical_effective_integration_across_pt-240702.pdf