Metapress

    When Algorithms Prescribe: Ojinga Harrison MD on the Line Between Clinical Tools and Clinical Judgment

By Lakisha Davis | March 25, 2026 (updated April 27, 2026)
[Image: Algorithm-driven clinical decision tools intersecting with medical judgment in patient care]

Featuring Myleme Ojinga Harrison, MD, President of The Carter Clinic, P.A.

The pitch is familiar by now. A digital platform uses a validated screening tool to identify likely depression, an algorithm to match the patient to a first-line medication, and a brief telehealth visit to confirm the prescription. The promise is efficiency, scale, and reach, and in a country with a severe psychiatric workforce shortage, those are real goods.

    Ojinga Harrison MD does not dismiss the technology. Telepsychiatry, electronic health records, and symptom tracking are all in active use at The Carter Clinic, P.A., the twelve-location practice he leads as president and board-certified psychiatrist. What he rejects is a specific drift in how these tools are being deployed: the quiet transfer of clinical judgment from the clinician to the system.

    What Algorithms Do Well

    Risk flagging is a reasonable use case. A suicide risk screening tool administered routinely at intake catches presentations that clinicians, under time pressure, sometimes miss. Symptom rating scales provide a quantitative reference point that can be tracked over time. Electronic records, properly designed, make continuity possible across providers and settings.

    Dr. Myleme Ojinga Harrison uses all of these. In profiles of his practice, observers have noted that his telepsychiatry infrastructure supports regional consistency across twelve clinical sites without compromising individualized treatment planning. The tools, in his framing, are the kind of infrastructure that should exist in any modern psychiatric practice.

    What They Cannot Do

    The failure mode appears at the next step. Algorithms can identify risk. They cannot infer meaning. A rating scale can measure distress. It cannot contextualize that distress within a life story. A pattern-matching system can suggest a likely diagnosis. It cannot recognize when the presentation does not fit the pattern, which in psychiatry is exactly where the most consequential clinical work happens.

    “Data doesn’t carry responsibility. Clinicians do.”

    Harrison’s concern is not theoretical. It shows up in practice whenever a system is designed so that the algorithm’s recommendation becomes the default action and the clinician’s role shrinks to either confirming it or documenting why they deviated. When the cost of deviation, in time and administrative friction, is high enough, the clinician stops deviating. The algorithm, effectively, is now prescribing.

    The Problem With Protocolized Care

    Protocols themselves are not the problem. Standardized approaches to suicide risk assessment, trauma screening, and medication titration produce more consistent care than their absence. The problem arises when protocols are used as substitutes for judgment rather than as scaffolding for it.

    In adolescent care specifically, Ojinga Harrison MD has argued against protocolized treatment on clinical grounds. Adolescent presentations involve neurological maturation, environmental stability, psychosocial stressors, and shifting identity in ways that rarely map cleanly to DSM-5 criteria. A treatment plan generated by pattern-matching against a database of past cases will often be defensible. It will often also be wrong for this specific patient, in a way the algorithm has no mechanism to detect.

    Who Carries the Responsibility

    The deeper question is about responsibility. When a psychiatric decision turns out badly, a clinician is accountable. Their license, their standing, their sense of professional obligation are all on the line. Algorithms carry none of this. They produce recommendations, the recommendations are acted on or not, and the accountability remains with the human who signed the prescription.

    Harrison’s position is that this asymmetry should shape how the tools are deployed. A clinician whose judgment has been progressively displaced by a system still carries the legal and ethical weight of the outcome, but has been stripped of the authority to exercise the judgment the outcome required. That is not efficiency. It is a structural setup for harm.

    A Different Model for Integration

    At The Carter Clinic, the technology is structured around the clinician rather than the other way around. Screening tools surface information the clinician then interprets. Electronic records support continuity rather than dictate visit structure. Telepsychiatry extends geographic reach but does not change the expectation that treatment planning is integrative, family-informed, and attuned to context.

    Harrison’s framing is that technology should aid clinical thinking, not replace it. When systems begin to prescribe, the lines of responsibility become unclear, and in psychiatry, unclear responsibility is a precondition for the kind of slow, silent harm that does not show up in any quarterly dashboard.

    What the Next Wave Requires

    The next generation of digital psychiatric tools, including those incorporating large language models and predictive analytics, will be more capable than the last. That makes the design question more urgent, not less. The relevant test is not whether the tool can generate a plausible recommendation. The relevant test is whether the clinical environment built around the tool preserves the clinician’s capacity and authority to override it when the specific patient in front of them requires something different.

    For Ojinga Harrison MD, that test is the line between useful technology and a quiet surrender of the profession. Psychiatry, he would argue, cannot afford to cross it.
