I don’t know that robots “can’t have” emotions; it depends on how you define an emotion. For example, you could train a neural network so that pictures of flowers make the AI “happy” and pictures of people crying make it “sad”, then extend this to more images and more emotion options. Eventually you’d have a model that can take any image and output an emotional response based on the past experience it was trained on. The person building the model still has to tell the robot which emotions it can choose from, though (happy, sad, mad, etc.).
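A minimal sketch of that idea, with a lot of hedging: real systems would use a deep network over actual image pixels, but here each “image” is just a tiny made-up feature vector, and a small softmax classifier is trained to map it onto a fixed, programmer-chosen emotion list. Everything below (the feature meanings, the data, the emotion labels) is invented for illustration.

```python
import math

# The programmer decides up front which emotions the model can choose from.
EMOTIONS = ["happy", "sad", "mad"]

# Toy training data: each "image" is a hypothetical feature vector,
# e.g. [brightness, colorfulness] -- purely illustrative.
DATA = [
    ([0.9, 0.8], "happy"),  # bright, colorful: flowers
    ([0.8, 0.9], "happy"),
    ([0.2, 0.1], "sad"),    # dark, muted: people crying
    ([0.1, 0.2], "sad"),
    ([0.9, 0.1], "mad"),    # bright but drab
    ([0.8, 0.2], "mad"),
]

def softmax(zs):
    # Turn raw scores into probabilities over the emotion list.
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

def scores(w, b, x):
    return [sum(wk * xk for wk, xk in zip(w[k], x)) + b[k]
            for k in range(len(EMOTIONS))]

def train(data, epochs=500, lr=0.5):
    # One weight vector (plus bias) per emotion; plain gradient descent.
    n_feats = len(data[0][0])
    w = [[0.0] * n_feats for _ in EMOTIONS]
    b = [0.0] * len(EMOTIONS)
    for _ in range(epochs):
        for x, label in data:
            target = EMOTIONS.index(label)
            probs = softmax(scores(w, b, x))
            for k in range(len(EMOTIONS)):
                grad = probs[k] - (1.0 if k == target else 0.0)
                for j in range(n_feats):
                    w[k][j] -= lr * grad * x[j]
                b[k] -= lr * grad
    return w, b

def predict(w, b, x):
    # Pick the emotion with the highest probability for this "image".
    probs = softmax(scores(w, b, x))
    return EMOTIONS[probs.index(max(probs))]

w, b = train(DATA)
print(predict(w, b, [0.85, 0.85]))  # flower-like features
```

The key point from the answer shows up twice in the sketch: the model can only ever respond with something from `EMOTIONS`, and its responses are entirely shaped by the training examples it was given.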