cross-posted from: https://sh.itjust.works/post/1062067

In a similar case, the US National Eating Disorder Association laid off its entire helpline staff. Soon after, its chatbot was disabled for giving out harmful information.

  • LostCause@kbin.social · 1 year ago

    Yeah, the Bing chatbot especially is too cute for a job like this; it also added a 😅 later on in our little chat. Though maybe the CEO should have taken some advice from it in this case.