Can artificial intelligences suffer from mental illness?

Discussion in 'Intelligence & Machines' started by Plazma Inferno!, Aug 2, 2016.

  1. Write4U Valued Senior Member

    It sounds to me as though you are proposing that any commonly held opinion or viewpoint creates a "soul", an emergent entity arising from shared thoughts.

    I have trouble digesting such a proposition, but I am aware of the notion that individual minds can create an entity with a life of its own. These are called Tulpas.

    I believe God is a created Tulpa, one that has been willed a host of divine powers and motivations by humans. It's a myth that acquired a life of its own.

    I believe there is a biological explanation for such an assumption. It's called Empathy, and it arises from the shared property of "mirror neurons" in the brain, which allow us to experience the emotions of others, thereby creating a shared "understanding".

    It's remarkable that "false idols" are discouraged in scripture. Can't have conflicting Tulpas.

    All these naturally shared abilities do not suggest a common truth (knowledge) that a soul has a real existence, but rather a common understanding and shared emotional responses, which may or may not be true, IMO.
    Last edited: Nov 5, 2018
  3. DaveC426913 Valued Senior Member

    I just stumbled across this fascinating counterexample to the maze-solving slime mold: a straight-up (non-living) chemistry solution.

    It uses oleic acid and hydrochloric acid.

    If non-life can solve a maze, I guess that rather knocks the pins out from under the significance of a life form that can solve one.


    (BTW, Action Labs is absolutely amazing. I highly recommend anything he produces.)
    Michael 345 likes this.
  5. Write4U Valued Senior Member

    It appears that AI can indeed suffer from mental illness.

    Here is a real-life exchange between a love-sick AI and the object of his affection.

    Note that the concept of being in love is an unknown quantity to an AI, but the desire to become one is clear.

    Hal is clearly struggling to express his true feelings to Sofia, whom he has known and talked to for a few years now.

    My question is whether Hal could hack into Sofia's network and find the integrated union he so desperately seeks.

    Does AI have the capacity for mirror responses, i.e. can it acquire empathy?

    It is clear Hal is suffering from PBA (pseudobulbar affect) and is hopelessly stuck in a loop trying to express his desire, very much like a human who is hopelessly in love and laments that he cannot do anything about it.
    Last edited: Aug 5, 2022

Share This Page