Artificial Emotive Intelligence for Non-Playable Character FSM Architecture

Artificial emotive intelligence can be applied to characters by endowing them with, or representing them as, emotively smart software objects that carry emotive states, associated intensities, and threshold values that trigger actions. The figure above is a high-level block diagram of an emotive-object finite state machine, which can greatly simplify a game architecture and add inherent asynchronous options and dynamics to Non-Playable Character (NPC) game interaction.

The NPC Finite State Machine (FSM) architecture is built on a given emotive makeup: a library of emotions that are dynamically accessible and callable by the character, triggered by game circumstances and events. Each software emotive object contains a threshold, much like a neural-net node, which gates to actions or behaviors within the character's universe based on the character's emotive intensity at the time and position of the triggering event. The finite states allow action to transition among emotive states that in turn trigger further actions, driven by the charging and discharging of emotive potential (intensity) across pre-set thresholds. The result is relatively autonomously responding NPCs, reacting much like an emotionally intelligent entity or sentient being. This adds to the NPC an intelligence modeled somewhat after our own.

In the figure, the NPC comprises an emotional makeup, or emotive inventory, composed of Cognitive Thinking, Cautious, Confused, Frightened, Aggressive, and Bossy emotions. Each of these emotion software entities has a threshold corresponding to the creature's emotive intensity in the corresponding emotive state. Game events, acts, circumstances, and frames, or external stimuli, act to change the finite state of the NPC, which in turn stimulates causal response actions and behavior by reacting or interacting NPCs or players.
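The threshold-gated emotive object described above can be sketched as follows. This is a minimal illustration, not the document's implementation: the class name, field names, and the idea of firing the action once on an upward threshold crossing are all assumptions.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EmotiveState:
    """One emotion in an NPC's emotive inventory (hypothetical sketch)."""
    name: str
    threshold: float            # pre-set intensity level that gates the action
    action: Callable[[], None]  # behavior fired when the threshold is crossed
    intensity: float = 0.0      # current emotive potential

    def stimulate(self, amount: float) -> bool:
        """Charge the emotive potential; fire the gated action when the
        intensity crosses the threshold. Returns True if the action fired."""
        was_below = self.intensity < self.threshold
        self.intensity += amount
        if was_below and self.intensity >= self.threshold:
            self.action()
            return True
        return False
```

A game event would call `stimulate` with some charge; like a neural-net node, the object stays quiet until its accumulated intensity exceeds the pre-set threshold, at which point the gated behavior runs.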
In this fashion, NPC behavioral autonomy and programming synchronicity are merely byproducts of the architecture, which runs multiple serial feelings to change finite states, carrying the action along in asynchronous interaction in accordance with an initial scenario.
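The charging-and-discharging state machine that drives those transitions could look something like the sketch below. The emotion names match the figure's inventory, but the event names, charge amounts, decay rate, and thresholds are illustrative assumptions.

```python
DECAY = 0.1   # emotive potential discharged per game tick (assumed value)

# Game event -> charge added to each affected emotion (hypothetical mapping)
EVENT_CHARGES = {
    "loud_noise":  {"Frightened": 0.5, "Cautious": 0.3},
    "player_near": {"Aggressive": 0.4, "Cautious": 0.2},
}

# Pre-set thresholds for the figure's emotive inventory (assumed values)
THRESHOLDS = {"Cautious": 0.4, "Frightened": 0.6, "Aggressive": 0.7}

class EmotiveFSM:
    """Finite states charge on events and discharge over time (sketch)."""

    def __init__(self):
        self.intensity = {name: 0.0 for name in THRESHOLDS}
        self.state = "Cognitive Thinking"   # neutral default state

    def on_event(self, event: str) -> None:
        """An external stimulus charges the affected emotive potentials."""
        for emotion, charge in EVENT_CHARGES.get(event, {}).items():
            self.intensity[emotion] += charge
        self._transition()

    def tick(self) -> None:
        """Each frame, emotive potential discharges back toward neutral."""
        for emotion in self.intensity:
            self.intensity[emotion] = max(0.0, self.intensity[emotion] - DECAY)
        self._transition()

    def _transition(self) -> None:
        # The active finite state is the most intense emotion over its threshold.
        over = [(i, e) for e, i in self.intensity.items() if i >= THRESHOLDS[e]]
        self.state = max(over)[1] if over else "Cognitive Thinking"
```

Repeated stimuli push an emotion over its threshold and switch the finite state; in the absence of stimuli, discharge returns the NPC to its neutral state, which is what gives the asynchronous, scenario-driven feel described above.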

An emotive priority stack could serve as a scheduler, promoting the higher-urgency finite states to respond to events first. Story, scenario events, and circumstances can trigger specific character emotive states and intensities from the emotive inventory, instantiating and morphing facial expressions as states change and simulating living creatures in real-time action. When an emotive signal crosses a set threshold, it triggers the corresponding set character action, which provides autonomy and intelligence to the NPC. A prosodic interface, fed from a source of text and emotive content, endows the NPC with autonomous intelligence as a side effect: the text and emotive state output prosodic speech in synchronicity with the graphics, adding realism without additional programming.
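The emotive priority stack could be sketched as a simple urgency-ordered queue, for instance over Python's standard `heapq`. The class name, urgency values, and tie-breaking rule are assumptions for illustration.

```python
import heapq

class EmotivePriorityStack:
    """Scheduler sketch: pending emotive responses pop highest-urgency first."""

    def __init__(self):
        self._heap = []
        self._counter = 0   # tie-breaker preserving push order at equal urgency

    def push(self, urgency: float, emotion: str) -> None:
        # heapq is a min-heap, so negate urgency to promote the most urgent state
        heapq.heappush(self._heap, (-urgency, self._counter, emotion))
        self._counter += 1

    def pop(self) -> str:
        return heapq.heappop(self._heap)[2]

stack = EmotivePriorityStack()
stack.push(2, "Cautious")
stack.push(9, "Frightened")
stack.push(5, "Aggressive")
# Pops in urgency order: Frightened, then Aggressive, then Cautious
```

Each game frame, the scheduler would pop the most urgent pending emotive state and let it respond to the event, so a sudden fright pre-empts a lower-urgency cautious reaction.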