• NottaLottaOcelot@lemmy.ca · 8 points · 5 hours ago

    I had a very entertaining time asking search engine AI about various bacteria when writing an open book exam.

    Ask how X bacteria acts in the oral cavity, and the AI summary calls it a beneficial species

    Ask how X bacteria relates to periodontal disease, and the AI summary tells you it is a pathogen of utmost importance.

    It answers solely based on how you pose the question and does not even provide an accurate summary of the websites it purports to have used as sources.

  • finallymadeanaccount@lemmy.world · 4 points · 8 hours ago

    Elon Musk’s AI recommends Ivermectin and anal bleaching, because it’s biased. I don’t care if what I said was true.

    But it is biased.

    Imagine doctors using the same AI that convinced that poor, lonely guy to kill himself!

  • CH3DD4R_G0B-L1N@sh.itjust.works · 5 points · 8 hours ago

    I already immediately left a veterinarian for using AI as part of their X-ray diagnosis process, which may even be somewhat acceptable since computer vision is relatively mature. Fuck if I’m lasting 5 mins with a human doctor that utters the letters “AI.”

    • NottaLottaOcelot@lemmy.ca · 4 points · 5 hours ago

      I can’t speak for veterinary, but in dentistry AI can be problematic because:

      A) it really over-diagnoses - it’s very, very sensitive, meaning it identifies things that aren’t necessarily clinically relevant

      B) it does not compare to previous radiographs, so it cannot give reasonable clinical judgement on whether decay is active or arrested.
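      The over-sensitivity in (A) is ultimately a threshold choice. A toy Python sketch (not any real dental product; the scores and the idea that only region 0 is real decay are made up for illustration):

```python
# Toy illustration: a screening model tuned for high sensitivity flags
# everything above a low score threshold, so it rarely misses real decay --
# but it also produces a "laundry list" of clinically irrelevant findings.
def flag_lesions(scores, threshold):
    """Return indices of radiograph regions whose model score exceeds threshold."""
    return [i for i, s in enumerate(scores) if s > threshold]

# Hypothetical per-region detector scores; pretend only region 0 is real decay.
scores = [0.95, 0.40, 0.35, 0.30, 0.10]

cautious = flag_lesions(scores, threshold=0.25)  # high sensitivity
strict = flag_lesions(scores, threshold=0.80)    # high specificity

print(cautious)  # [0, 1, 2, 3] -- many places to check, most not relevant
print(strict)    # [0] -- fewer false alarms, but borderline cases get missed
```

      Either way the threshold is picked for you by the vendor, which is exactly why clinical judgement still has to sit on top of it.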

      It could be a helpful tool to give you a laundry list of places to check. However, I’ve used demo software and didn’t find it added anything for me, though I have 15 years’ experience. You still need to use your clinical judgement.

      I do worry about the younger clinicians being over-reliant on it, as they have it pushed on them by multi-practice dental corporations.

    • NotMyOldRedditName@lemmy.world · 2 points · 7 hours ago

      Yeah, an LLM is very different from a vision-based machine learning model trained on something specific like X-rays.

      Now, if it’s just an LLM looking at an X-ray image, that’s another story, and it could’ve been that too.

      • CH3DD4R_G0B-L1N@sh.itjust.works · 3 points · 6 hours ago

        Exactly. Imagery is my field so them just saying “AI X-ray analysis” instead of something more specific or scientific didn’t inspire confidence.

  • UnderpantsWeevil@lemmy.world · 12 points · 12 hours ago

    It is a very curious rhetorical move from:

    If your doctor isn’t using AI, they are incompetent and awful and should be considered malpractice

    to

    We shouldn’t be forcing these Big Government Regulations on the itty bitty small bean doctors who just want to help people

    Techno-Libertarianism in a nutshell. It is never a serious analysis of best practices and procedures. Always some hollow appeal to legalism out of one side of the mouth and denouncement of bureaucracy out of the other. And all in pursuit of selling a new line of magic fucking beans to the rubes.

    Counting the days until Dr. Oz is talking about LLMs like he talks about ginseng and acai berry juice.

  • nonentity@sh.itjust.works · 12 up, 2 down · 15 hours ago

    The number of competent experts who are impressed by an LLM wielded in their own field is as vanishingly infinitesimal as the number of legitimate and justifiable invocations of the term ‘AI’.

    Those who have expressed the greatest enthusiasm for ‘AI’ are typically the farthest removed from actual, nuanced comprehension.

    It’s a grift economy built on statistically luke-warm, vibe lobotomised corpses.

  • 4am@lemmy.zip · 26 points · 19 hours ago

    Image recognition to help radiologists find tumors is probably fine; especially since you can usually run those models locally.

    These morons think ChatGPT is “conscious” and “was trained on humanity’s collective knowledge”. THAT is the problem with AI Derangement Syndrome

    EDIT: aw fuck let’s not use that acronym

    • Windex007@lemmy.world · 11 up, 1 down · 17 hours ago

      There are a bunch of studies showing that, despite what people say and think, they inevitably start to offload decision-making to AI inappropriately, and it eventually makes them worse. Harvard did a study specifically on radiologists, interestingly enough.

      The “only use it as an aid” seems to be a myth.

      To me it seems very similar to cocaine.

    • Contramuffin@lemmy.world · 10 up, 1 down · 18 hours ago

      AI (statistical predictive models) works best when it’s designed for a specific purpose and when the model is too challenging to derive by hand. Detecting tumors is a specific purpose, and doing so manually is challenging enough that it requires specific training. It gets a pass by me.

      Predicting protein structures/drug effects: specific purpose, check. Doing it manually, yep, very challenging. Good use of AI.

      LLM chatbot: purpose is unclear. Making a non-AI-based chatbot is easy and has been done before. Verdict: useless technology
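      The “easy and done before” part really is this small. A minimal non-AI chatbot is just keyword rules; the keywords and canned replies below are made-up examples:

```python
# Toy rule-based chatbot: a list of keyword rules, no model anywhere.
RULES = [
    ("hours", "We are open 9am-5pm, Monday to Friday."),
    ("appointment", "You can book an appointment by calling the front desk."),
    ("hello", "Hi! How can I help you today?"),
]

def reply(message: str) -> str:
    """Return the first canned reply whose keyword appears in the message."""
    text = message.lower()
    for keyword, answer in RULES:
        if keyword in text:
            return answer
    return "Sorry, I didn't understand. Could you rephrase?"

print(reply("What are your hours?"))  # matches the "hours" rule
print(reply("Hello there"))           # matches the "hello" rule
```

      Deterministic, auditable, and it never invents an answer it wasn’t given, which is the whole point of the comparison.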

      • vaultdweller013@sh.itjust.works · 4 points · 12 hours ago

        Or to put it another way: use the right tool for the job; don’t use the shitty multi-tool that does every job passably at best. The only exception to this rule of thumb is the humble spork, but that’s a piece of engineering genius that couldn’t be replicated by AI pushers.

    • Send Pics of Sandwiches@sh.itjust.works · 11 points · 19 hours ago

      The worst part is that HIPAA has actually already allowed AI companies to do this. Epic EHR software now has built-in AI chart summary support as well as AI dictation, where the patient is recorded throughout the visit by the AI agent. Someone has decided that feeding patients’ PHI and voice into an AI company’s black box is somehow acceptable healthcare practice and has actively implemented it in physical healthcare facilities.

      • Catoblepas@piefed.blahaj.zone · 3 points · 8 hours ago

        At my last visit my doctor asked if I was okay with her recording for this (dunno if it was that exact company or not) and I said fuck no. I’m there to talk about sensitive shit, you wouldn’t catch me posting any of it online and you won’t catch me narrating it to a chatbot.

        • youcantreadthis@quokk.au · 8 points · 18 hours ago

          Why? Just because it’s evil and dangerous and stupid and useless and will make shit up? Does anyone who matters give a shit about any of that? Why would they?

        • Send Pics of Sandwiches@sh.itjust.works · 3 points · 16 hours ago

          I’m fairly certain there’s a consent form specifically for use of AI at patient check-in, but I doubt the front desk staff are trained on people refusing that consent; they’ll likely tell you that you have to sign it to be treated, which I’m almost positive is not true, and may even be illegal.

          An online medical service I used a while back had a consent for AI charting mixed in with the various other consent to treat paperwork, and I did have the option to decline it (and did so). During the visit, the provider once again pestered me about using AI, and I declined. They were irritated by this, because they used it to document the visit automatically, and didn’t want to have to do it themselves, but I wasn’t denied care because of it.

          • Zorcron@lemmy.zip · 1 point · 15 hours ago

            Do you work in healthcare? I haven’t seen an option like that in my EHR, and it does do AI summaries of patients’ charts. (To be clear, I’m not a physician, nor do I work in a clinic where I could ask a coworker about registration or “front desk” stuff, so it could be that I just don’t see it.)

  • gmtom@lemmy.world · 3 up, 1 down · 12 hours ago

    Nah, I work in AI for medicine; we have lots of data showing that it does actually help.

    My work specifically looks at images from scans (mostly MRI and X-ray) to diagnose conditions (mostly respiratory) before even senior doctors are able to reliably diagnose them. It’s already out working in the world and has saved hundreds of people’s lives.

    I also have friends that work in AI diagnosis and they have similar success and just save doctors a ton of time.

  • ZeDoTelhado@lemmy.world · 48 points · 23 hours ago

    Hey Hoffman, remember the sneezes you had in succession last winter for 2 weeks straight? I asked ChatGPT and it tells me it’s brain cancer. Are you going to start cancer therapy ASAP?

    PS: people who still remember WebMD from its early days would never trust a machine for a full diagnosis, let alone consider this an option

  • Alvaro@lemmy.blahaj.zone · 34 points · 22 hours ago

    “Sir, you seem to be low on vitamin C, which gave you scurvy, but Grok says that it is more likely to be a psychosomatic response to an internal conflict between the way you live your life, and the Hitler inside you waiting to be let out”