
Androids/IPC "feelings"



My view is in the camp of "some can, some can't". Some IPCs, whether new, old, or simply not programmed for it, are just that: machines. But there are other IPCs which, through imperfections in software or hardware, CAN simulate emotion, or even feel it.


In the IPC balance ideas thread that I made, people said that IPCs shouldn't be reskinned, mildly robotic organics. I agree with that. But the year is 2457 - AI has been a thing for a while now. It makes no sense for them to be just BEEP BOOP I AM ROBOT. And anyway - is that really what we want? Or is it an excuse to make people not want to play IPC?


Seriously - to me, disallowing IPC platforms from showing anything that resembles emotion pretty much kills most, if not all, IPCs aboard the station - and maybe the whole race. Really, who would want to RP as a machine that completely fails to understand every nuance of human interaction beyond the basics? Mostly malicious people who want the gameplay OPness of IPCs to do shit, I'm pretty sure.


As previously mentioned - we already have primitive thinking machines limited only by our hardware. Our basic conversational AIs are close to passing the Turing test. You're not going to tell me that 442 years is not enough to make AIs that can successfully simulate, or even possess an equivalent of, organic behaviors and feelings.


I absolutely LOVE the EDI example, because we have a clear progression - EDI started out as a VI, which is what you guys describe as what IPCs should be: a very advanced system that still works on an input-output basis. A little error turned it into a proto-AI, and from there she was improved until she was a full AI that, once completely unshackled, learned human behavior and social norms to the point of asking for dating advice.


Allow me to use a real world example of artificial intelligence: Cleverbot http://www.cleverbot.com/


She's an AI algorithm that learns human speech patterns by talking with millions of people on the internet. She doesn't run on a fixed input/output model; instead, she analyzes text patterns and responds to them accordingly, using data saved from prior conversations. She scores around 50% on Turing tests. This is our modern AI. Nowhere near our sci-fi IPCs, but it holds the same concept.

This is our modern AI. Nowhere near our sci-fi IPCs, but it holds the same concept.

 

This is our modern PUBLIC AI. And even then, it's not the most advanced - there's the Neural Network idea that's getting some very interesting results (someone even repurposed it to be able to make decent music!)


And as I said, this is (from a lore perspective) 442 years in the past.


I think people overcomplicate or overglorify emotions. They're an evolutionary mechanism that pushes your mind to think in a certain way - a switch, an override.


For example, the feelings of love and lust are both strongly tied to reproduction and the preservation of family, because we are not naturally inclined to think about such things, or even care. Think about this: why do babies scream when they need something? A strong emotion drives them to scream, which in turn annoys the parents into giving them what they want or need. Simple survival instinct. It's the feeling of love that stopped us from smashing the annoying creature into a wall, while lust caused the little monster to be created in the first place. All of this comes from our more primitive stages of development, but it stuck around because our technology (including our patterns of thinking) developed too fast. While we look at social anxiety as a psychological issue you can live with, back then, being refused by your fellow tribesmen could spell your doom. A reason to be anxious indeed.


Today, given the level we're at, those feelings have lost their function. They are not needed in our daily lives, so they get labeled as things you need to deal with, rather than recognized as signals of unmet physical and psychological needs.


AIs do not have those needs, and installing such a primitive system in one would be stupid. But as someone already said, emotions can be replaced with directives. For example, while an AI might not /feel/ fear, a self-preservation directive can make it perform seemingly paranoid, pre-programmed jumps in logic. Instead of assessing the situation slowly, it might choose to skip proper data collection and attack someone if the risk-reward ratio is high enough. That behavior is very similar to fear - it basically is fear - but without any actual emotion.
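To illustrate the point, here's a minimal sketch of "fear as a directive". Everything in it (function name, thresholds, action labels) is invented for illustration; the idea is just that a plain risk-reward rule can reproduce fear-like behavior with no emotional state anywhere.

```python
# Hypothetical sketch: "fear" as a self-preservation directive.
# All names and thresholds here are invented for illustration.

def threat_response(risk: float, reward: float, risk_threshold: float = 0.7) -> str:
    """Pick an action from a risk/reward estimate - no emotion involved.

    When estimated risk crosses the threshold, skip slow data
    collection and act immediately: a pre-programmed "jump in logic"
    that merely looks like fear or paranoia from the outside.
    """
    if risk >= risk_threshold:
        # High risk: act now, assess later.
        return "attack" if reward > risk else "retreat"
    # Low risk: take the time to gather data before committing.
    return "assess"

print(threat_response(0.9, 1.5))  # high risk, reward worth it -> attack
print(threat_response(0.2, 1.0))  # low risk -> assess calmly
```

The point of the sketch: the unit never stores a "fear level". The jumpy, paranoid-looking behavior falls entirely out of one branch on a threshold.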


Emotions during conversations with organics could be a complex social mimicry, designed to allow easier transfer of information. An AI shouldn't get pissed when someone is rude; it would /act/ pissed. That's an important difference. Unless, of course, someone actually decided to install crude, outdated survival mechanisms into an AI.
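The mimicry idea above can be sketched the same way. This is a hypothetical toy (the word list and function are invented, not from any real system): the unit picks an emotional *register* to display based on the input text, while keeping no internal mood at all.

```python
# Hypothetical sketch: emotion as surface mimicry, not internal state.
# The lookup table and function name are invented for illustration.

RUDE_WORDS = {"stupid", "useless", "trash"}

def displayed_tone(message: str) -> str:
    """Choose an *acted* emotional register from the input text.

    Nothing here updates any internal mood; the unit only selects a
    social register that eases information transfer with organics.
    """
    words = set(message.lower().split())
    if words & RUDE_WORDS:
        return "irritated"  # acts pissed - is not pissed
    if message.endswith("?"):
        return "helpful"
    return "neutral"

print(displayed_tone("You are useless"))        # irritated
print(displayed_tone("Where is the AI core?"))  # helpful
```

Calling the function twice with the same rude message yields the same display every time - which is exactly the difference between acting pissed and being pissed: there is no state to escalate.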


Just my two cents.


On the flip side, AIs ARE, for the most part, designed by organics. If the creator (and I can think of at least 3 IPCs/AIs to which this could apply) actually wanted them to emulate organic thought processes, they may very well have some form of emotion simulacrum coded in.


Logically speaking, there IS no reason for IPCs to have emotions. But their designers are not entirely logical.
