US20040029625A1 - Group behavioral modification using external stimuli - Google Patents

Group behavioral modification using external stimuli

Info

Publication number
US20040029625A1
Authority
US
United States
Prior art keywords
character
attributes
external stimulus
characters
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/364,951
Inventor
Ed Annunziata
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment America LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment America LLC filed Critical Sony Computer Entertainment America LLC
Priority to US10/364,951 priority Critical patent/US20040029625A1/en
Assigned to SONY COMPUTER ENTERTAINMENT AMERICA INC. reassignment SONY COMPUTER ENTERTAINMENT AMERICA INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ANNUNZIATA, ED
Priority to EP03254168A priority patent/EP1388357A3/en
Priority to JP2003288128A priority patent/JP3865721B2/en
Publication of US20040029625A1 publication Critical patent/US20040029625A1/en
Assigned to SONY INTERACTIVE ENTERTAINMENT AMERICA LLC reassignment SONY INTERACTIVE ENTERTAINMENT AMERICA LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: SONY COMPUTER ENTERTAINMENT AMERICA LLC
Abandoned legal-status Critical Current

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/424 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A63F13/10
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/58 Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1081 Input via voice recognition
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/65 Methods for processing data by generating or executing the game program for computing the condition of a game character
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6623 Methods for processing data by generating or executing the game program for rendering three dimensional images for animating a group of characters

Definitions

  • This invention relates generally to electronic entertainment systems and more particularly to a system and method for modifying group behavior via external stimuli.
  • In electronic systems, particularly entertainment and gaming systems, a user typically controls the behavior or actions of at least one character in a game program using some type of manually activated controller device.
  • controller devices include joysticks, switches, buttons, and keyboards.
  • some gaming systems use specifically designed control devices, such as a steering wheel and pedals for driving simulations, or a stick and pedals for flight simulations.
  • Yet more advanced gaming systems may use voice controls or human movements in a virtual reality game.
  • In gaming systems using manually activated controller devices, a controller device typically utilizes buttons and keystrokes assigned different meanings according to the requirements of the particular game. As an example, one game may have a particular button corresponding to a punch, while in another game the same button may correspond to firing a gun. In many games, a user can only control the actions of a single character. Although games may allow the user to control a group of characters, the characters typically act as a unit, so the group of characters effectively acts as a single character. Virtually all conventional games allow for manual user control of at least one character.
  • a system and method for behavioral modification of a group of characters via external stimuli is disclosed.
  • character behavior in a video game is driven by a character's attributes.
  • the character's attributes are classified as static attributes, dynamic attributes, and meta attributes.
  • a subset of the dynamic attributes include the character's emotional attributes expressed as numerical hate/love (H/L) values stored in H/L tables.
  • a user issues an external stimulus directed towards a group of characters associated with the user.
  • the external stimulus is a voice command.
  • the video game module processes the external stimulus and adjusts one or more attributes of each character based upon the external stimulus. Then, each character of the group of characters responds to the external stimulus based on each character's adjusted attributes.
  • an electronic entertainment system includes a microphone for receiving voice commands and converting the received voice commands to electrical signals, a processor coupled to the microphone, and a memory coupled to the processor and configured to store a game module.
  • the game module includes a data storage for storing each character's attributes, voice recognition software executable by the processor for processing the electrical signals, and a data table adjuster executable by the processor for adjusting each character's attributes based upon the processed electrical signals.
  • the game module further includes an action generator executable by the processor for processing each character's adjusted and non-adjusted attributes to respond to the voice commands.
  • a user issues a “run away” command to a group of characters.
  • the video game processes the “run away” command, and adjusts each character's fight/flight attribute based upon the processed voice command and a predefined “flee” adjustment.
  • the user issues a voice command to instantaneously and temporarily adjust each character's fight/flight attribute, causing each character of the group of characters to run away from enemy characters.
  • the group is running across the terrain at some average group speed, although each character runs at a speed identified by the character's speed attribute.
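As an illustrative sketch only (not part of the patent disclosure), the "run away" example above can be modeled as a temporary adjustment to each character's fight/flight attribute that reverts after a time delay T. The class name, the adjustment magnitude, and the tick-based delay are all assumptions:

```python
# Hypothetical sketch of the "run away" stimulus: the command temporarily
# shifts each character's fight/flight value toward "flee", then reverts
# after a game-defined delay T. All names and values are illustrative.

FLEE_ADJUSTMENT = -500   # assumed predefined "flee" adjustment
REVERT_DELAY_T = 10      # assumed delay T, in game ticks

class Character:
    def __init__(self, name, fight_flight):
        self.name = name
        self.fight_flight = fight_flight   # large -> fight, small -> flee
        self._pending_revert = None        # (revert tick, original value)

    def apply_flee_stimulus(self, now_tick):
        # Store the unmodified value so it can be restored later.
        self._pending_revert = (now_tick + REVERT_DELAY_T, self.fight_flight)
        self.fight_flight += FLEE_ADJUSTMENT

    def tick(self, now_tick):
        # Revert the temporary adjustment once the delay T has elapsed.
        if self._pending_revert and now_tick >= self._pending_revert[0]:
            self.fight_flight = self._pending_revert[1]
            self._pending_revert = None

group = [Character("A", 300), Character("B", 150)]
for c in group:
    c.apply_flee_stimulus(now_tick=0)   # user says "run away"
assert all(c.fight_flight < 0 for c in group)  # every character now prefers flight
for c in group:
    c.tick(now_tick=10)                 # delay T elapses; values revert
```

Note that each character starts from its own attribute value, so the group responds collectively while each member still behaves individually.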
  • FIG. 1 is a block diagram of an exemplary electronic entertainment system, according to the present invention.
  • FIG. 2 is a block diagram of one embodiment of the main memory of FIG. 1, according to the present invention.
  • FIG. 3A is a block diagram of an exemplary embodiment of data storage of FIG. 2;
  • FIG. 3B is a block diagram of an exemplary embodiment of the character A data storage module of FIG. 3A;
  • FIG. 4 is a block diagram of an exemplary embodiment of the static parameter table of FIG. 3B;
  • FIG. 5 is a block diagram of an exemplary embodiment of the dynamic parameter table of FIG. 3B;
  • FIG. 6 is a block diagram of an exemplary embodiment of the meta parameter table of FIG. 3B;
  • FIG. 7 is a block diagram of an exemplary embodiment of the emotion tables of FIG. 3B.
  • FIG. 8 is a flowchart of method steps for group behavioral modification using external stimuli, according to one embodiment of the present invention.
  • FIG. 1 is a block diagram of an exemplary electronic entertainment system 100 according to the present invention.
  • the entertainment system 100 includes a main memory 102 , a central processing unit (CPU) 104 , at least one vector unit 106 , a graphics processing unit 108 , an input/output (I/O) processor 110 , an I/O processor memory 112 , a controller interface 114 , a memory card 116 , a Universal Serial Bus (USB) interface 118 , and an IEEE 1394 interface 120 , although other bus standards and interfaces may be utilized.
  • the entertainment system 100 further includes an operating system read-only memory (OS ROM) 122 , a sound processing unit 124 , an optical disc control unit 126 , and a hard disc drive 128 , which are connected via a bus 130 to the I/O processor 110 .
  • the entertainment system 100 is an electronic gaming console.
  • the entertainment system 100 may be implemented as a general-purpose computer, a set-top box, or a hand-held gaming device. Further, similar entertainment systems may contain more or fewer operating components.
  • the CPU 104 executes programs stored in the OS ROM 122 and the main memory 102 .
  • the main memory 102 may contain prestored programs and programs transferred through the I/O processor 110 from a CD-ROM, DVD-ROM, or other optical disc (not shown) using the optical disc control unit 126 .
  • the I/O processor 110 primarily controls data exchanges between the various devices of the entertainment system 100 including the CPU 104 , the vector unit 106 , the graphics processing unit 108 , and the controller interface 114 .
  • the graphics processing unit 108 executes graphics instructions received from the CPU 104 and the vector unit 106 to produce images for display on a display device (not shown).
  • the vector unit 106 may transform objects from three-dimensional coordinates to two-dimensional coordinates, and send the two-dimensional coordinates to the graphics processing unit 108 .
  • the sound processing unit 124 executes instructions to produce sound signals that are outputted to an audio device such as speakers (not shown).
  • a user of the entertainment system 100 provides instructions via the controller interface 114 to the CPU 104 .
  • the user may instruct the CPU 104 to store certain game information on the memory card 116 or instruct a character in a game to perform some specified action.
  • the controller interface 114 may include a transducer (not shown) to convert acoustic signals to electrical signals.
  • the transducer may be a separate element of the electronic entertainment system 100 .
  • voice recognition software stored in the main memory 102 is executable by the CPU 104 to process the electrical signals.
  • other devices may be connected to the entertainment system 100 via the USB interface 118 and the IEEE 1394 interface 120 .
  • FIG. 2 is a block diagram of one embodiment of the main memory 102 of FIG. 1 according to the present invention.
  • the main memory 102 is shown containing a game module 200 which is loaded into the main memory 102 from an optical disc in the optical disc control unit 126 (FIG. 1).
  • the game module 200 contains instructions executable by the CPU 104 , the vector unit 106 , and the sound processing unit 124 of FIG. 1 that allow a user of the entertainment system 100 (FIG. 1) to play a game.
  • the game module 200 includes data storage 202 , an action generator 204 , a characteristic generator 206 , a data table adjuster 208 , and a voice recognition module 210 .
  • the action generator 204 , the characteristic generator 206 , and the data table adjuster 208 may be modules executable by the CPU 104 .
  • the action generator 204 is executable by the CPU 104 to produce game play, including character motion and character response;
  • the characteristic generator 206 is executable by the CPU 104 to generate a character's expressions as displayed on a monitor (not shown);
  • the data table adjuster 208 is executable by the CPU 104 to update data in data storage 202 during game play;
  • the voice recognition module 210 is executable by the CPU 104 to convert the electrical signals received via the controller interface 114 (FIG. 1) to game instructions.
  • the CPU 104 accesses data in data storage 202 as instructed by the action generator 204 , the characteristic generator 206 , the data table adjuster 208 , and the voice recognition module 210 .
  • the game module 200 is a tribal simulation game in which a player creates and trains tribes of characters.
  • a tribe of characters is preferably a group (or team) of characters associated with a given game user.
  • the tribal simulation game includes a plurality of character species, and each team of characters may include any combination of characters from any of the character species.
  • a character reacts to internal stimuli and external stimuli based upon the character's genetic makeup as expressed by gene attributes.
  • internal stimuli are internal to the gaming environment, such as character/character interactions, character/group interactions, character/environment interactions, and character/game situation interactions. Typically, a user does not have direct control over internal stimuli.
  • the external stimuli, which are external to the gaming environment, may be voice commands issued by a user, received by the controller interface 114 , and directed to modify the behavior of a group of characters.
  • Other forms of external stimuli are also contemplated by the present invention, such as keyboard or mouse initiated stimuli.
  • each character's behavior depends upon one or more gene attributes.
  • Gene attributes that typically remain constant throughout a character's life are called static attributes; gene attributes that change during game play in response to internal and external stimuli are called dynamic attributes; and gene attributes that are functions of the static and/or dynamic attributes are called meta attributes.
  • Those meta attributes that are based in part, or in whole, on the dynamic attributes are also dynamic in nature.
  • a character's dynamic and meta attributes may be modified by emotional attributes as quantified by hate/love (H/L) values.
  • a character's emotional attributes are dynamic and change during game play in response to internal and external stimuli and correspond to other species, teams, and characters.
  • a character's static attributes, dynamic attributes, meta attributes, and H/L values are described further below in conjunction with FIGS. 3 - 7 .
  • FIG. 3A is a block diagram of an exemplary embodiment of data storage 202 of FIG. 2 according to the present invention.
  • the data storage 202 includes a character A database 302 a , a character B database 302 b , and a character C database 302 c .
  • although the FIG. 3A embodiment of data storage 202 shows three character databases 302 a , 302 b , and 302 c , the scope of the present invention includes any number of character databases 302 .
  • FIG. 3B is a block diagram of an exemplary embodiment of the character A database 302 a of FIG. 3A.
  • the character A database 302 a includes a static parameter table 308 , a dynamic parameter table 310 , a meta parameter table 312 , and emotion tables 314 .
  • Character A's static attributes are stored in the static parameter table 308 ;
  • character A's dynamic attributes (preferably not including H/L values) are stored in the dynamic parameter table 310 ;
  • character A's meta attributes are stored in the meta parameter table 312 ; and character A's H/L values are stored in the emotion tables 314 .
  • attributes are also referred to as parameters.
  • the static attributes stored in the static parameter table 308 typically remain constant throughout character A's life
  • the static attributes may be changed through character training or user generated external stimuli.
  • a user may have a character train on a treadmill to increase the character's speed.
  • the user may have the character train with weights to increase the character's strength.
  • a user-generated external stimulus may temporarily modify one or more static attributes.
  • a temporarily modified static attribute reverts back to its unmodified value after a time delay T, dependent upon the video game and the static attribute.
  • the character B database 302 b and the character C database 302 c are similar to the character A database 302 a.
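The per-character storage described in FIG. 3B can be sketched as follows. This is a hypothetical layout, not the patent's implementation; the field names follow the figures, while the types and defaults are assumptions:

```python
# Illustrative layout of one character database (FIG. 3B): separate tables
# for static, dynamic, and meta parameters, plus emotion (H/L) tables.
from dataclasses import dataclass, field

@dataclass
class StaticParameters:          # typically constant through a character's life
    strength: int = 50
    speed: int = 50
    sight: int = 50
    hearing: int = 50
    max_hit_points: int = 100
    hunger_point: int = 40
    healing_urge: int = 30
    self_healing_rate: int = 5
    aggressive_base: int = 20

@dataclass
class DynamicParameters:         # change during game play
    energy: int = 60
    health: int = 100
    irritation: int = 0
    game_experience: int = 0

@dataclass
class EmotionTables:             # hate/love values, -1000..1000
    individuals: dict = field(default_factory=dict)  # character ID -> H/L value
    species: dict = field(default_factory=dict)      # species name -> H/L value
    teams: dict = field(default_factory=dict)        # team ID -> H/L value

@dataclass
class CharacterDatabase:
    static: StaticParameters = field(default_factory=StaticParameters)
    dynamic: DynamicParameters = field(default_factory=DynamicParameters)
    emotions: EmotionTables = field(default_factory=EmotionTables)

char_a = CharacterDatabase()
char_a.emotions.individuals[192993293] = -900   # the example value from FIG. 7
```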
  • FIG. 4 is an illustration of an exemplary embodiment of the static parameter table 308 of FIG. 3B.
  • the static parameter table 308 includes a plurality of static parameters, such as, but not limited to, a strength parameter 402 , a speed parameter 404 , a sight parameter 406 , a hearing parameter 408 , a maximum hit point parameter 410 , a hunger point parameter 412 , a healing urge parameter 414 , a self-healing rate parameter 416 , and an aggressive base parameter 418 .
  • the scope of the invention may include other static parameters as well.
  • the strength parameter 402 corresponds to a character's strength
  • the speed parameter 404 corresponds to how fast a character walks or runs across terrain
  • the sight parameter 406 corresponds to a character's viewing distance
  • the hearing parameter 408 corresponds to a character's hearing distance.
  • the maximum hit point parameter 410 is, preferably, a health parameter threshold value, which is discussed further below in conjunction with FIG. 5.
  • the hunger point parameter 412 is a reference value to which a character's energy is measured to compute a character's hunger parameter, as will be described further below in conjunction with FIG. 6.
  • the healing urge parameter 414 corresponds to a character's desire to heal another character
  • the self-healing rate parameter 416 corresponds to a time rate at which a character heals itself.
  • the aggressive base parameter 418 is a reference value that represents a character's base aggression level, and is described further below in conjunction with FIG. 6. As previously indicated, not all of these parameters are required, and other parameters may be contemplated for use in the present invention.
  • FIG. 5 is an illustration of an exemplary embodiment of the dynamic parameter table 310 of FIG. 3B.
  • the dynamic parameter table 310 includes a plurality of dynamic parameters, such as an energy parameter 502 , a health parameter 504 , an irritation parameter 506 , and a game experience parameter 508 .
  • the scope of the present invention may not include all of the above-listed parameters and/or may include other dynamic parameters.
  • These dynamic parameters change during game play.
  • the character's energy parameter 502 is a function of the character's consumption of food and the rate at which the character uses energy. When the character eats, the character's energy parameter 502 increases. However, the character is continuously using energy as defined by the character's metabolic rate. The metabolic rate is a meta parameter dependent upon several static parameters and is further discussed below in conjunction with FIG. 6.
  • the health parameter 504 is less than or equal to the maximum hit point parameter 410 (FIG. 4), and is a function of the character's energy parameter 502 , the character's self-healing rate parameter 416 (FIG. 4), and a number of character hits.
  • a character is assigned a health parameter 504 equal to the maximum hit point parameter 410 upon game initialization.
  • the character's health parameter 504 decreases.
  • the character's health parameter 504 increases at the character's self-healing rate 416 .
  • the health parameter 504 is based in part on the self-healing rate parameter 416 , which is a static parameter.
  • the character's irritation parameter 506 increases if the character is exposed to irritating stimuli, such as the presence of enemies or weapons fire within the character's range of sight, specified by the sight parameter 406 (FIG. 4).
  • the irritation parameter 506 decreases over time at a predefined rate.
  • the character's game experience parameter 508 quantifies a character's game experiences, particularly in association with character participation in tribal games and fighting. For example, an experienced character has accumulated wisdom, and is less likely to be surprised by game situations and more adept at making game decisions.
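The dynamic-parameter behavior above can be sketched as a per-tick update. The update rules (hit damage, irritation increments, decay rate) are assumed values consistent with the text, not figures from the patent:

```python
# Minimal sketch of per-tick updates to the dynamic parameters: energy
# drains at the metabolic rate, health self-heals up to the maximum hit
# point parameter 410, and irritation rises with irritating stimuli and
# decays at a predefined rate. All numeric constants are assumptions.
def update_dynamics(d, s, took_hit=False, sees_enemy=False):
    """d: dynamic parameters, s: static/meta parameters (plain dicts here)."""
    d["energy"] -= s["metabolic_rate"]          # energy is continuously used
    if took_hit:
        d["health"] -= 10                       # assumed damage per hit
    else:
        # self-healing, capped at the maximum hit point parameter 410
        d["health"] = min(d["health"] + s["self_healing_rate"],
                          s["max_hit_points"])
    if sees_enemy:
        d["irritation"] += 5                    # irritating stimulus within sight
    else:
        d["irritation"] = max(d["irritation"] - 1, 0)  # decays over time
    return d

d = {"energy": 60, "health": 95, "irritation": 0}
s = {"metabolic_rate": 2, "self_healing_rate": 5, "max_hit_points": 100}
d = update_dynamics(d, s)
assert d["health"] == 100   # healed, clamped to the maximum hit point parameter
```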
  • FIG. 6 is an illustration of an exemplary embodiment of the meta parameter table 312 of FIG. 3B.
  • the meta parameter table 312 includes a plurality of meta parameters, such as, but not limited to, a hunger parameter 602 , a metabolic rate parameter 604 , an aggression parameter 606 , and a fight/flight parameter 608 .
  • the meta parameters are typically changeable, and are based upon the static and dynamic parameters. For example, a character's desire to eat is dependent upon the hunger parameter 602 .
  • the hunger parameter 602 is a signed value defined by the energy parameter 502 (FIG. 5) less the hunger point parameter 412 (FIG. 4).
  • if the character's hunger parameter 602 is greater than zero, then the character is not hungry. However, if the character's hunger parameter 602 is less than zero, then the character is hungry. As the negative hunger parameter 602 decreases (i.e., becomes more negative), the character's desire to eat increases. This desire to eat may then be balanced with other desires, such as a desire to attack an enemy or to search for a weapons cache. The weighting of these parameters may determine a character's behaviors and actions.
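The hunger computation stated above is simply a signed difference, which can be written directly (the function name is illustrative):

```python
# hunger parameter 602 = energy parameter 502 - hunger point parameter 412
def hunger(energy, hunger_point):
    return energy - hunger_point

assert hunger(energy=70, hunger_point=40) == 30    # positive: not hungry
assert hunger(energy=25, hunger_point=40) == -15   # negative: hungry
# The more negative the value, the stronger the desire to eat, which is
# then weighed against competing desires (attack, search for weapons, ...).
```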
  • the metabolic rate parameter 604 is directly proportional to the character's speed parameter 404 (FIG. 4), the strength parameter 402 (FIG. 4), and the maximum hit point parameter 410 (FIG. 4), while inversely proportional to the character's hunger point parameter 412 (FIG. 4) and healing urge parameter 414 (FIG. 4).
  • if the character's healing urge parameter 414 is large, the character is likely a calm, non-excitable individual. Therefore, the character's metabolic rate parameter 604 would be small.
  • if the character's healing urge parameter 414 is small, the character is likely a highly-strung, excitable individual. Consequently, the character's metabolic rate parameter 604 would be large.
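One way to read those proportionalities is the ratio below. The patent only states direct and inverse proportionality, so the constant k and the exact functional form are assumptions:

```python
# metabolic rate parameter 604: directly proportional to speed, strength,
# and maximum hit points; inversely proportional to hunger point and
# healing urge. The product/quotient form and k are illustrative only.
def metabolic_rate(speed, strength, max_hit_points, hunger_point, healing_urge, k=1.0):
    return k * (speed * strength * max_hit_points) / (hunger_point * healing_urge)

calm = metabolic_rate(50, 50, 100, 40, healing_urge=60)      # large healing urge
excitable = metabolic_rate(50, 50, 100, 40, healing_urge=10) # small healing urge
assert excitable > calm   # a small healing urge implies a larger metabolic rate
```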
  • the aggression parameter 606 is defined as the aggressive base parameter 418 (FIG. 4) plus the irritation parameter 506 (FIG. 5). As the aggression parameter 606 increases, the character becomes more aggressive and is more likely to be engaged in fights.
  • a character uses the fight/flight parameter 608 to determine whether, when faced with an enemy or other dangerous situations, to fight or flee the enemy.
  • the fight/flight parameter 608 is preferably based upon the hunger parameter 602 , the aggression parameter 606 , the game experience parameter 508 (FIG. 5), and the energy parameter 502 (FIG. 5).
  • a large value for the fight/flight parameter 608 corresponds to a character's desire to fight
  • a small value for the fight/flight parameter 608 corresponds to a character's desire to flee. For example, as the character's hunger or aggression increases, as measured by the character's hunger parameter 602 and aggression parameter 606 , respectively, the character is more likely to engage in fights.
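The two meta parameters just described compose as follows. The aggression sum is given explicitly in the text; the fight/flight combination names its inputs but not its formula, so the weights below are assumptions:

```python
# aggression parameter 606 = aggressive base parameter 418 + irritation parameter 506
def aggression(aggressive_base, irritation):
    return aggressive_base + irritation

# fight/flight parameter 608: large -> fight, small -> flee. Based on
# hunger, aggression, game experience, and energy; weighting is illustrative.
def fight_flight(hunger, aggr, game_experience, energy):
    # A hungrier character (more negative hunger) is more likely to fight.
    return aggr + game_experience + energy // 10 - min(0, hunger)

calm = fight_flight(hunger=30, aggr=aggression(20, 0), game_experience=5, energy=80)
irritated = fight_flight(hunger=30, aggr=aggression(20, 40), game_experience=5, energy=80)
assert irritated > calm   # more irritation -> more aggression -> more likely to fight
```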
  • FIG. 7 is an illustration of one embodiment of the emotion tables 314 of FIG. 3B, according to the present invention.
  • the emotion tables 314 include an individual's hate/love (H/L) table 702 , a species H/L table 704 , and a team H/L table 706 .
  • the individual's H/L table 702 includes one or more character identification (ID) numbers and one or more character H/L values, wherein each character ID number is associated with a character H/L value.
  • character A has a -900 character H/L value corresponding to a character identified by character ID number 192993293.
  • character A has high hate for the individual having character ID number 192993293.
  • character A has a 100 character H/L value for character ID number 339399928.
  • This positive H/L value corresponds to a general liking of the individual having ID number 339399928.
  • the individual's H/L table 702 may also include individual character names corresponding to the character ID numbers.
  • the species H/L table 704 includes one or more species names and one or more species H/L values. Each species name is associated with a species H/L value which represents character A's relationship with each species. Similar to the individual's H/L table 702 , the more negative or positive the H/L value, the more the particular species is hated or loved, respectively. For example, character A has a 100 species H/L value corresponding to the Nids species which implies a general like of the Nids species. Conversely, character A has a -500 species H/L value corresponding to the Antenids species. Therefore, character A has a strong dislike (i.e., hate) for the Antenids species.
  • the team H/L table 706 includes one or more team ID numbers, one or more team H/L values, and one or more team names. Each team ID number is associated with a team H/L value and a team name.
  • character A has a 1000 team H/L value corresponding to the Frosties team represented by ID number 139000. Because the H/L value is so high, character A has a deep love for the Frosties team. However, character A has a -500 H/L value corresponding to the Slashers team represented by ID number 939992, thereby representing a hate for this team.
  • the character, species, and team H/L values range from -1000 to 1000.
  • a character, species, or team H/L value of 1000 represents unconditional love directed towards the character, species, or team, respectively, while a character, species, or team H/L value of -1000 represents extreme hatred directed towards the character, species, or team, respectively.
  • a H/L value of zero represents a neutral feeling.
  • the H/L value ranges may be larger or smaller, and may include other maximum and minimum values.
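Bookkeeping for H/L values with the stated range can be sketched with a small helper. Clamping at the -1000/1000 extremes is an assumed behavior; the patent only specifies the range:

```python
# Adjust a hate/love value while keeping it within the stated range.
HL_MIN, HL_MAX = -1000, 1000

def adjust_hl(table, key, delta):
    """Apply a delta to an H/L value, clamped to [-1000, 1000] (assumed)."""
    table[key] = max(HL_MIN, min(HL_MAX, table.get(key, 0) + delta))
    return table[key]

# Using the species example from FIG. 7:
species_hl = {"Nids": 100, "Antenids": -500}
assert adjust_hl(species_hl, "Nids", 50) == 150
assert adjust_hl(species_hl, "Antenids", -700) == -1000  # clamped at extreme hate
```

Unknown keys default to zero, matching the neutral-feeling initialization described below.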
  • Any type of character interaction may cause changes to H/L values.
  • character A initially has an 800 character H/L value corresponding to character B and a -50 character H/L value corresponding to character C.
  • character A sees character C hit character B, and thus character A's character H/L values are adjusted accordingly.
  • character A's character H/L value corresponding to character B increases to 850 because of feelings of sympathy towards character B, and character A's character H/L value corresponding to character C may decrease to -200 due to increased hatred for character C.
  • if character C attacks character A, character A develops more hatred towards character C, and character A's character H/L value corresponding to character C may further decrease to -275.
  • if character C communicates to character A useful information on the operation of a weapon, then character A's character H/L value corresponding to character C may increase to -150.
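The worked example above (character A's shifting feelings toward B and C) can be traced as event-driven updates. The per-event deltas follow directly from the numbers in the text; the dispatch mechanism is illustrative:

```python
# Character A's H/L values toward B and C, updated per observed interaction.
hl = {"B": 800, "C": -50}   # initial values from the example

def on_event(hl, event):
    if event == "C hits B":
        hl["B"] += 50     # sympathy toward the victim: 800 -> 850
        hl["C"] -= 150    # increased hatred for the attacker: -50 -> -200
    elif event == "C attacks A":
        hl["C"] -= 75     # further hatred: -200 -> -275
    elif event == "C shares weapon info":
        hl["C"] += 125    # useful information improves the value: -275 -> -150
    return hl

for e in ("C hits B", "C attacks A", "C shares weapon info"):
    hl = on_event(hl, e)
assert hl == {"B": 850, "C": -150}
```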
  • the data table adjuster 208 (FIG. 2) initializes all character and team H/L values to zero upon initiation of a new game. Furthermore, the data table adjuster 208 initializes all species H/L values to zero or to non-zero predefined values dependent upon game-defined species compatibility. In an alternate embodiment, the data table adjuster 208 initializes all species, character, and team H/L values to zero upon initiation of a new game. In a further embodiment, the data table adjuster 208 initializes some or all character, species, and team H/L values to non-zero predefined values dependent upon game-defined character, species, and team compatibility.
  • a user may save all the H/L values to the memory card 116 (FIG. 1) or the hard disc drive 128 (FIG. 1) for future game play. If the user has instructed the game module 200 (FIG. 2) to save the H/L values, the data table adjuster 208 may use the saved H/L values to initialize all game H/L values upon continuation of game play.
  • FIG. 8 is an exemplary flowchart 800 of method steps for dynamic behavioral modification based upon game interactions, according to one embodiment of the present invention.
  • H/L values are initialized.
  • a user instructs the electronic entertainment system 100 (FIG. 1) to execute the game module 200 (FIG. 2) via user commands and the controller interface 114 (FIG. 1).
  • the CPU 104 receives the user commands and executes the data table adjuster 208 (FIG. 2).
  • the data table adjuster 208 accesses the character, species, and team H/L values and stores the character, species, and team H/L values in the emotion tables 314 (FIG. 3).
  • the data table adjuster 208 preferably, initializes all character and team H/L values to zero, where a H/L value of zero represents a neutral emotion.
  • in step 804, the CPU 104 executes the action generator 204 (FIG. 2) and the characteristic generator 206 (FIG. 2) to generate game play and game interactions.
  • Game interactions typically include information exchange between characters, as well as communication, observation, detection of sound, direct physical contact, and indirect physical contact.
  • character A and character B may interact and exchange information via a conversation.
  • character A may receive information via observations. For instance, character A may observe character B engaged in direct physical contact with character C via a fist fight, or character A may observe character B engage character C in indirect physical contact via an exchange of weapons fire.
  • character A may observe character B interact with an “inanimate” object.
  • character B moves a rock and discovers a weapons cache.
  • character A may hear a communication between character B and character C.
  • character A may engage in direct physical contact with character B.
  • character A may engage in indirect physical contact with character B.
  • character A may discharge a weapon aimed at character B, or character A may receive fire from character B's weapon.
  • the user issues an external stimulus to a group of characters in a step 806 .
  • the user may issue a voice command via the controller interface 114 (FIG. 1), via a microphone (not shown) integrated with the controller interface 114 , or via a microphone electrically connected to the I/O processor 110 (FIG. 1).
  • the user may issue a variety of voice commands, such as “run away,” “fight,” “search for weapons,” “eat,” “spread out,” “group together,” “follow character A,” “hate species D,” and “love species D.”
  • other commands may be issued by the user, and alternative forms of external stimuli may be utilized (e.g., keyboard or mouse commands).
  • the external stimulus issued by the user modifies character attributes. More specifically, one or more of each character's static, dynamic, meta, and/or emotional attributes may be modified according to the issued stimulus.
  • the game entertainment system 100 uses the issued stimulus to temporarily modify a given attribute associated with each character of the group of characters by a predefined amount. After a predefined time delay T, dependent upon the video game and the given attribute, the given attribute reverts back to its pre-modified value.
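As a minimal sketch of this temporary-adjustment mechanism (class names, the tick-based timer, and the flee amount of 20 are assumptions; the patent specifies only a predefined delay T after which the attribute reverts to its pre-modified value):

```python
class Character:
    def __init__(self, name, fight_flight):
        self.name = name
        self.fight_flight = fight_flight

class TemporaryModifier:
    """Applies a delta to one attribute of every character in a group,
    then reverts each attribute to its pre-modified value after
    delay_ticks game ticks (modeling the predefined time delay T)."""
    def __init__(self, characters, attribute, delta, delay_ticks):
        self.characters = characters
        self.attribute = attribute
        self.ticks_left = delay_ticks
        self.saved = {id(c): getattr(c, attribute) for c in characters}
        for c in characters:
            setattr(c, attribute, getattr(c, attribute) + delta)

    def tick(self):
        """Call once per game tick; restores the saved values when T expires."""
        self.ticks_left -= 1
        if self.ticks_left == 0:
            for c in self.characters:
                setattr(c, self.attribute, self.saved[id(c)])

# "run away": decrease each fight/flight parameter by a predefined flee amount
group = [Character("A", 63), Character("B", 86), Character("C", 35)]
mod = TemporaryModifier(group, "fight_flight", -20, delay_ticks=3)
modified = [c.fight_flight for c in group]   # adjusted values
for _ in range(3):
    mod.tick()
reverted = [c.fight_flight for c in group]   # pre-modified values restored
```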
  • the microphone converts the voice command to electrical signals and sends the electrical signals to the CPU 104 via the I/O processor 110 .
  • the voice recognition module 210 (FIG. 2), executable by the CPU 104 , processes the electrical signals and instructs the data table adjuster 208 to increment each character's fight/flight parameter 608 (FIG. 6) by a predefined “flee” amount.
  • the user-associated group of characters comprises character A with a fight/flight parameter of 63, character B with a fight/flight parameter of 86, and character C with a fight/flight parameter of 35, where the fight/flight parameter ranges from 0 (100% desire to flee) to 100 (100% desire to fight).
  • the data table adjuster 208 decreases character A's fight/flight parameter to 43, character B's fight/flight parameter to 66, and character C's fight/flight parameter to 15.
  • each character (A, B, and C) of the group of characters has more of a desire to “run away” from the enemy characters than before the external command was issued, although each character's individual desire to flee depends upon each character's modified fight/flight parameter. Since each character's behavior depends upon each character's attributes, and since many of the attributes are interdependent, an external stimulus issued to a group of characters may have many behavioral manifestations.
  • the data table adjuster 208 changes the characters' fight/flight parameters to their pre-modified values. That is, the data table adjuster changes character A's fight/flight parameter back to 63, character B's fight/flight parameter to 86, and character C's fight/flight parameter to 35.
  • the modification of the fight/flight parameter is exemplary, not exhaustive, and the data table adjuster 208 may modify alternate character attributes to elicit the desired character response to the external stimulus.
  • the data table adjuster 208 adjusts each character's fight/flight parameter by an amount dependent upon the strength of the voice command issued by the user. As the strength of the voice command increases, the adjustment increases. For example, a softly spoken “run away” command may decrease each character's fight/flight parameter by 10, whereas a loudly spoken “run away” command may decrease each character's fight/flight parameter by 35.
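A linear mapping from voice-command strength to adjustment size would reproduce the numbers above. The 0.0–1.0 volume scale and the linear interpolation are assumptions; the patent states only that the adjustment grows with the strength of the command:

```python
def flee_adjustment(volume):
    """Map voice-command strength (0.0 = softest, 1.0 = loudest) to a
    fight/flight decrement: a soft "run away" decreases the parameter
    by 10, a loud one by 35."""
    soft, loud = 10, 35
    volume = max(0.0, min(1.0, volume))   # clamp to the assumed scale
    return -(soft + (loud - soft) * volume)
```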
  • the user issues an external stimulus, “fight,” to a user-associated group of characters.
  • the data table adjuster 208 temporarily increases each character's fight/flight parameter by a predefined “fight” amount.
  • the predefined “fight” amount is 10
  • the data table adjuster 208 temporarily increases character A's fight/flight parameter to 73, character B's fight/flight parameter to 96, and character C's fight/flight parameter to 45.
  • each character (A, B, and C) of the group of characters has more of a desire to “fight” the enemy characters than before the external command was issued.
  • upon issuance of the “fight” command, the data table adjuster 208 adds a predefined aggression adjustment to each character's aggressive base parameter 418 (FIG. 4). Since the fight/flight parameter 608 is based in part on the aggression parameter 606 (FIG. 6), and since the aggression parameter 606 is related to the aggressive base parameter 418 and the irritation parameter 506 (FIG. 5), as the aggressive base parameter 418 increases, the fight/flight parameter 608 increases and the character is more likely to be engaged in fights.
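The propagation from the aggressive base parameter 418 through the aggression parameter 606 can be sketched as follows; the dictionary representation and the starting values are illustrative:

```python
def aggression(aggressive_base, irritation):
    """Aggression parameter 606 = aggressive base 418 + irritation 506."""
    return aggressive_base + irritation

def issue_fight_command(character, fight_adjustment):
    """'fight' command: add a predefined aggression adjustment to the
    aggressive base parameter 418; the increase propagates to the
    aggression parameter 606 and from there to fight/flight 608."""
    character["aggressive_base"] += fight_adjustment

char = {"aggressive_base": 40, "irritation": 12}
before = aggression(char["aggressive_base"], char["irritation"])
issue_fight_command(char, 10)
after = aggression(char["aggressive_base"], char["irritation"])
```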
  • the user issues an external stimulus, “eat,” to a user-associated group of characters.
  • the data table adjuster 208 temporarily increases each character's hunger point parameter 412 (FIG. 4) by a predefined hunger point adjustment. Since each character's hunger parameter 602 (FIG. 6) decreases as each character's hunger point parameter 412 increases, each character has a greater desire to eat after the user issues the “eat” command.
  • the scope of the present invention covers all character actions and behaviors modified by a user-issued external stimulus, such as the “eat” command. For example, as each character eats, each character's energy parameter 502 (FIG. 5) increases, and each character is more likely to be healthy as characterized by a large character health parameter 504 (FIG. 5). Consequently, each character will be more likely to fight enemy characters, since in one embodiment of the invention, each character's fight/flight parameter 608 (FIG. 6) increases as each character becomes more healthy.
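The cascade triggered by the “eat” command might look like the following sketch. The adjustment amounts, the health-gain factor, and the dictionary layout are assumptions, and the revert of the hunger point parameter after delay T is omitted for brevity:

```python
def issue_eat_command(char, hunger_point_adjustment=30):
    """'eat' command: temporarily raise hunger point parameter 412, which
    lowers the hunger parameter 602 (energy - hunger point) and so
    increases the desire to eat. (Revert after delay T omitted.)"""
    char["hunger_point"] += hunger_point_adjustment

def eat(char, food_energy=40):
    """Eating raises energy 502; a better-fed character tends toward a
    larger health parameter 504, capped at the maximum hit points 410."""
    char["energy"] += food_energy
    char["health"] = min(char["max_hit_points"], char["health"] + food_energy // 4)

char = {"energy": 50, "hunger_point": 40, "health": 80, "max_hit_points": 100}
hungry_before = char["energy"] - char["hunger_point"] < 0   # not hungry yet
issue_eat_command(char)
hungry_after = char["energy"] - char["hunger_point"] < 0    # now seeks food
eat(char)
```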
  • the user issues an external stimulus, “hate species D,” to a user-associated group of characters.
  • the data table adjuster 208 subtracts a predefined hate adjustment value from each character's species H/L value corresponding to species D.
  • character A has a 100 species H/L value (stored in the species H/L table 704 (FIG. 7)) corresponding to the Nids species.
  • character B has a −235 species H/L value corresponding to the Nids species
  • character C has a −800 species H/L value corresponding to the Nids species.
  • the predefined hate adjustment value is 100.
  • the data table adjuster 208 subtracts the predefined hate adjustment value from each character's species H/L value corresponding to the Nids species, resulting in character A having a 0 species H/L value corresponding to the Nids species, character B having a −335 species H/L value corresponding to the Nids species, and character C having a −900 species H/L value corresponding to the Nids species.
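The arithmetic above can be sketched directly; the dictionary layout for the species H/L table 704 is an assumption:

```python
def issue_hate_command(group, species, hate_adjustment=100):
    """'hate species D' command: subtract the predefined hate adjustment
    from each character's species H/L value for that species (a more
    negative H/L value means more hate)."""
    for char in group:
        char["species_hl"][species] = char["species_hl"].get(species, 0) - hate_adjustment

group = [
    {"name": "A", "species_hl": {"Nids": 100}},
    {"name": "B", "species_hl": {"Nids": -235}},
    {"name": "C", "species_hl": {"Nids": -800}},
]
issue_hate_command(group, "Nids")   # A becomes neutral; B and C hate the Nids more
```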
  • in step 810, the user-associated group of characters responds to the user-issued external stimulus.
  • character A, character B, and character C run across the terrain in response to the “run away” voice command issued by the user.
  • character A, character B, and character C may not respond immediately to the “hate Nids” voice command issued by the user; however, the adjustment to characters A, B, and C's species H/L values corresponding to the Nids species will influence their behavior upon an encounter with the Nids.
  • for example, character A, B, or C may attack the Nids characters should the adjusted H/L values modify other attributes of character A, B, or C, causing that character to be more aggressive during a Nids encounter.
  • characters' flocking parameters are adjusted such that the characters have a greater desire to follow character A.
  • the characters may not follow character A step by step, but dependent upon each character's adjusted flocking parameter, each of the characters will move in the same general direction as character A.
  • in step 812, the CPU 104 determines whether the game user(s) have completed the game. If the CPU 104 determines that the game is completed, then the method ends. However, if in step 812 the CPU 104 determines that the game is not completed, then the method continues at step 804.
  • the scope of the invention includes method steps in which step 804 is optional. That is, a user may issue two or more external stimuli without an occurrence of any intervening game interaction.
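The flow of FIG. 8 can be summarized as a sketch that returns the order in which the steps execute. The number 808 for the attribute-adjustment step is an assumption (the text names steps 802–806 and 810–812); in the variant noted above, step 804 may be skipped so that two stimuli arrive without an intervening game interaction:

```python
def flowchart_800(num_rounds):
    """Order in which the steps of the FIG. 8 flowchart execute for
    num_rounds passes through the game loop."""
    trace = [802]                      # initialize H/L values
    for _ in range(num_rounds):
        trace.append(804)              # generate game play and interactions
        trace.extend([806, 808, 810])  # stimulus, attribute adjustment, response
        trace.append(812)              # completion check; loop back if not done
    return trace
```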

Abstract

Character behavior in a video game is driven by a character's attributes. The character's attributes are classified as static attributes, dynamic attributes, meta attributes, and emotional attributes. A user issues external stimuli, such as voice commands, to a group of characters. One or more attributes associated with each character are adjusted based upon the external stimuli. Thus, the user may issue external stimuli to affect the behavior and actions of a group of characters.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of Provisional Patent Application Ser. No. 60/401,940, filed Aug. 7, 2002, entitled “Group Coaching with External Stimuli,” which is incorporated herein by reference.[0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • This invention relates generally to electronic entertainment systems and more particularly to a system and method for modifying group behavior via external stimuli. [0003]
  • 2. Description of the Background Art [0004]
  • In electronic systems, particularly entertainment and gaming systems, a user typically controls the behavior or actions of at least one character in a game program using some type of manually activated controller device. Conventional controller devices include joysticks, switches, buttons, and keyboards. Further, some gaming systems use specifically designed control devices, such as a steering wheel and pedals for driving simulations, or a stick and pedals for flight simulations. Yet more advanced gaming systems may use voice controls or human movements in a virtual reality game. [0005]
  • In gaming systems using manually activated controller devices, a controller device typically utilizes buttons and keystrokes assigned with different meanings according to the requirements of the particular game. As an example, a game may have a particular button corresponding to a punch, while in another game the same button may correspond to firing a gun. In many games, a user can only control the actions of a single character. Although games may allow the user to control a group of characters, the characters typically act as a unit, so the group of characters effectively acts as a single character. Virtually all conventional games allow for manual user control of at least one character. [0006]
  • As game players become more sophisticated, the players are demanding more advanced forms of gaming. Early forms of electronic games consisted of simple blocks and moving targets (e.g., Breakout, Space Invaders, Centipede). Over time, the games became graphically more detailed and intricate. More recently, virtual reality games have become popular. Virtual reality games allow players to immerse themselves into the gaming environment and interact with various elements of the environment. However, all of these types of games require a large amount of manual control over character actions during the game play. [0007]
  • Furthermore, conventional games do not normally create any form of attachment or emotion between game characters and a human player. The game character is considered just an element of the game used for entertainment value. This lack of attachment or caring for the character is partly due to the perception that the character is not “alive.” However, if the character projects life-like features and human characteristics, such as having feelings, the player is more likely to form an emotional attachment to the character. [0008]
  • In addition, users of conventional games do not typically utilize game character emotions as strategic game elements that may train or affect groups of characters via game play interactions. Emotions add a level of complexity and unpredictability to character behavior, and further add to a user's arsenal of strategic weapons to enhance game play enjoyment. [0009]
  • Therefore, there is a need for a system and method for dynamic behavioral modification of a group of characters via user participation. [0010]
  • SUMMARY OF THE INVENTION
  • A system and method for behavioral modification of a group of characters via external stimuli is disclosed. According to the present invention, character behavior in a video game is driven by a character's attributes. The character's attributes are classified as static attributes, dynamic attributes, and meta attributes. A subset of the dynamic attributes includes the character's emotional attributes expressed as numerical hate/love (H/L) values stored in H/L tables. In one embodiment of the present invention, a user issues an external stimulus directed towards a group of characters associated with the user. Preferably, the external stimulus is a voice command. The video game module processes the external stimulus and adjusts one or more attributes of each character based upon the external stimulus. Then, each character of the group of characters responds to the external stimulus based on each character's adjusted attributes. [0011]
  • In one embodiment of the invention, an electronic entertainment system includes a microphone for receiving voice commands and converting the received voice commands to electrical signals, a processor coupled to the microphone, and a memory coupled to the processor and configured to store a game module. The game module includes a data storage for storing each character's attributes, voice recognition software executable by the processor for processing the electrical signals, and a data table adjuster executable by the processor for adjusting each character's attributes based upon the processed electrical signals. In addition, the game module further includes an action generator executable by the processor for processing each character's adjusted and non-adjusted attributes to respond to the voice commands. [0012]
  • For example, a user issues a “run away” command to a group of characters. The video game processes the “run away” command, and adjusts each character's fight/flight attribute based upon the processed voice command and a predefined “flee” adjustment. In other words, the user issues a voice command to instantaneously and temporarily adjust each character's fight/flight attribute, causing each character of the group of characters to run away from enemy characters. In effect, the group is running across the terrain at some average group speed, although each character runs at a speed identified by the character's speed attribute. [0013]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an exemplary electronic entertainment system, according to the present invention; [0014]
  • FIG. 2 is a block diagram of one embodiment of the main memory of FIG. 1, according to the present invention; [0015]
  • FIG. 3A is a block diagram of an exemplary embodiment of data storage of FIG. 2; [0016]
  • FIG. 3B is a block diagram of an exemplary embodiment of the character A data storage module of FIG. 3A; [0017]
  • FIG. 4 is a block diagram of an exemplary embodiment of the static parameter table of FIG. 3B; [0018]
  • FIG. 5 is a block diagram of an exemplary embodiment of the dynamic parameter table of FIG. 3B; [0019]
  • FIG. 6 is a block diagram of an exemplary embodiment of the meta parameter table of FIG. 3B; [0020]
  • FIG. 7 is a block diagram of an exemplary embodiment of the emotion tables of FIG. 3B; and [0021]
  • FIG. 8 is a flowchart of method steps for group behavioral modification using external stimuli, according to one embodiment of the present invention. [0022]
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an exemplary [0023] electronic entertainment system 100 according to the present invention. The entertainment system 100 includes a main memory 102, a central processing unit (CPU) 104, at least one vector unit 106, a graphics processing unit 108, an input/output (I/O) processor 110, an I/O processor memory 112, a controller interface 114, a memory card 116, a Universal Serial Bus (USB) interface 118, and an IEEE 1394 interface 120, although other bus standards and interfaces may be utilized. The entertainment system 100 further includes an operating system read-only memory (OS ROM) 122, a sound processing unit 124, an optical disc control unit 126, and a hard disc drive 128, which are connected via a bus 130 to the I/O processor 110. Preferably, the entertainment system 100 is an electronic gaming console. Alternatively, the entertainment system 100 may be implemented as a general-purpose computer, a set-top box, or a hand-held gaming device. Further, similar entertainment systems may contain more or fewer operating components.
  • The [0024] CPU 104, the vector unit 106, the graphics processing unit 108, and the I/O processor 110 communicate via a system bus 132. Further, the CPU 104 communicates with the main memory 102 via a dedicated bus 134, while the vector unit 106 and the graphics processing unit 108 may communicate through a dedicated bus 136. The CPU 104 executes programs stored in the OS ROM 122 and the main memory 102. The main memory 102 may contain prestored programs and programs transferred through the I/O processor 110 from a CD-ROM, DVD-ROM, or other optical disc (not shown) using the optical disc control unit 126. The I/O processor 110 primarily controls data exchanges between the various devices of the entertainment system 100 including the CPU 104, the vector unit 106, the graphics processing unit 108, and the controller interface 114.
  • The [0025] graphics processing unit 108 executes graphics instructions received from the CPU 104 and the vector unit 106 to produce images for display on a display device (not shown). For example, the vector unit 106 may transform objects from three-dimensional coordinates to two-dimensional coordinates, and send the two-dimensional coordinates to the graphics processing unit 108. Furthermore, the sound processing unit 124 executes instructions to produce sound signals that are outputted to an audio device such as speakers (not shown).
  • A user of the [0026] entertainment system 100 provides instructions via the controller interface 114 to the CPU 104. For example, the user may instruct the CPU 104 to store certain game information on the memory card 116 or instruct a character in a game to perform some specified action. The controller interface 114 may include a transducer (not shown) to convert acoustic signals to electrical signals. In other embodiments, the transducer may be a separate element of the electronic entertainment system 100. As discussed below in conjunction with FIG. 2, voice recognition software stored in the main memory 102 is executable by the CPU 104 to process the electrical signals. Further, other devices may be connected to the entertainment system 100 via the USB interface 118 and the IEEE 1394 interface 120.
  • FIG. 2 is a block diagram of one embodiment of the [0027] main memory 102 of FIG. 1 according to the present invention. The main memory 102 is shown containing a game module 200 which is loaded into the main memory 102 from an optical disc in the optical disc control unit 126 (FIG. 1). The game module 200 contains instructions executable by the CPU 104, the vector unit 106, and the sound processing unit 124 of FIG. 1 that allows a user of the entertainment system 100 (FIG. 1) to play a game. In the exemplary embodiment of FIG. 2, the game module 200 includes data storage 202, an action generator 204, a characteristic generator 206, a data table adjuster 208, and a voice recognition module 210.
  • In one embodiment, the [0028] action generator 204, the characteristic generator 206, and the data table adjuster 208 may be modules executable by the CPU 104. For example, the action generator 204 is executable by the CPU 104 to produce game play, including character motion and character response; the characteristic generator 206 is executable by the CPU 104 to generate a character's expressions as displayed on a monitor (not shown); the data table adjuster 208 is executable by the CPU 104 to update data in data storage 202 during game play; and the voice recognition module 210 is executable by the CPU 104 to convert the electrical signals received via the controller interface 114 (FIG. 1) to game instructions. In addition, the CPU 104 accesses data in data storage 202 as instructed by the action generator 204, the characteristic generator 206, the data table adjuster 208, and the voice recognition module 210.
  • For the purposes of this exemplary embodiment, the [0029] game module 200 is a tribal simulation game in which a player creates and trains tribes of characters. A tribe of characters is preferably a group (or team) of characters associated with a given game user. Preferably, the tribal simulation game includes a plurality of character species, and each team of characters may include any combination of characters from any of the character species. A character reacts to internal stimuli and external stimuli based upon the character's genetic makeup as expressed by gene attributes. According to the present invention, internal stimuli are internal to the gaming environment, such as character/character interactions, character/group interactions, character/environment interactions, and character/game situation interactions. Typically, a user does not have direct control over internal stimuli. In contrast, a user has direct control over external stimuli. For example, the external stimuli, which are external to the gaming environment, may be voice commands issued by a user, received by the controller interface 114, and directed to modify the behavior of a group of characters. Other forms of external stimuli are also contemplated by the present invention, such as keyboard or mouse initiated stimuli.
  • Typically, each character's behavior depends upon one or more gene attributes. Gene attributes that typically remain constant throughout a character's life are called static attributes; gene attributes that change during game play in response to internal and external stimuli are called dynamic attributes; and gene attributes that are functions of the static and/or dynamic attributes are called meta attributes. Those meta attributes that are based in part, or in whole, on the dynamic attributes are also dynamic in nature. In addition, a character's dynamic and meta attributes may be modified by emotional attributes as quantified by hate/love (H/L) values. A character's emotional attributes are dynamic and change during game play in response to internal and external stimuli and correspond to other species, teams, and characters. A character's static attributes, dynamic attributes, meta attributes, and H/L values are described further below in conjunction with FIGS. [0030] 3-7.
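As a rough sketch of this classification (the field names, default values, and metabolic-rate formula are illustrative assumptions, not the patent's definitions):

```python
from dataclasses import dataclass, field

@dataclass
class CharacterAttributes:
    # static attributes (FIG. 4): constant throughout the character's life
    strength: int = 50
    speed: int = 50
    # dynamic attributes (FIG. 5): change in response to internal/external stimuli
    energy: int = 100
    irritation: int = 0
    # emotional attributes: dynamic H/L values keyed by character ID
    character_hl: dict = field(default_factory=dict)

    # meta attribute (FIG. 6): a function of static and/or dynamic attributes;
    # any meta attribute built on dynamic attributes is itself dynamic
    @property
    def metabolic_rate(self) -> float:
        return (self.speed + self.strength) / 2

attrs = CharacterAttributes()
attrs.character_hl[339399928] = 100   # a positive H/L value: general liking
```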
  • FIG. 3A is a block diagram of an exemplary embodiment of [0031] data storage 202 of FIG. 2 according to the present invention. The data storage 202 includes a character A database 302 a, a character B database 302 b, and a character C database 302 c. Although the FIG. 3A embodiment of data storage 202 shows three character databases 302 a, 302 b, and 302 c, the scope of the present invention includes any number of character databases 302.
  • FIG. 3B is a block diagram of an exemplary embodiment of the [0032] character A database 302 a of FIG. 3A. The character A database 302 a includes a static parameter table 308, a dynamic parameter table 310, a meta parameter table 312, and emotion tables 314. Character A's static attributes are stored in the static parameter table 308; character A's dynamic attributes (preferably not including H/L values) are stored in the dynamic parameter table 310; character A's meta attributes are stored in the meta parameter table 312; and character A's H/L values are stored in the emotion tables 314. Herein, attributes are also referred to as parameters. Although the static attributes stored in the static parameter table 308 typically remain constant throughout character A's life, in an alternate embodiment of the invention, the static attributes may be changed through character training or user-generated external stimuli. For example, a user may have a character train on a treadmill to increase the character's speed. Likewise, the user may have the character train with weights to increase the character's strength. Further, a user-generated external stimulus may temporarily modify one or more static attributes. In one embodiment of the invention, a temporarily modified static attribute reverts back to its unmodified value after a time delay T, dependent upon the video game and the static attribute. Referring back to FIG. 3A, the character B database 302 b and the character C database 302 c are similar to the character A database 302 a.
  • FIG. 4 is an illustration of an exemplary embodiment of the static parameter table [0033] 308 of FIG. 3B. The static parameter table 308 includes a plurality of static parameters, such as, but not entirely inclusive of or limited to, a strength parameter 402, a speed parameter 404, a sight parameter 406, a hearing parameter 408, a maximum hit point parameter 410, a hunger point parameter 412, a healing urge parameter 414, a self-healing rate parameter 416, and an aggressive base parameter 418. The scope of the invention may include other static parameters as well. Typically, the strength parameter 402 corresponds to a character's strength; the speed parameter 404 corresponds to how fast a character walks or runs across terrain; the sight parameter 406 corresponds to a character's viewing distance; and the hearing parameter 408 corresponds to a character's hearing distance. The maximum hit point parameter 410 is, preferably, a health parameter threshold value, which is discussed further below in conjunction with FIG. 5. The hunger point parameter 412 is a reference value to which a character's energy is measured to compute a character's hunger parameter, as will be described further below in conjunction with FIG. 6. Further, the healing urge parameter 414 corresponds to a character's desire to heal another character, while the self-healing rate parameter 416 corresponds to a time rate at which a character heals itself. Finally, the aggressive base parameter 418 is a reference value that represents a character's base aggression level, and is described further below in conjunction with FIG. 6. As previously indicated, not all of these parameters are required, and other parameters may be contemplated for use in the present invention.
  • FIG. 5 is an illustration of an exemplary embodiment of the dynamic parameter table [0034] 310 of FIG. 3B. The dynamic parameter table 310 includes a plurality of dynamic parameters, such as an energy parameter 502, a health parameter 504, an irritation parameter 506, and a game experience parameter 508. However, the scope of the present invention may not include all of the above-listed parameters and/or may include other dynamic parameters. These dynamic parameters change during game play. For example, the character's energy parameter 502 is a function of the character's consumption of food and the rate at which the character uses energy. When the character eats, the character's energy parameter 502 increases. However, the character is continuously using energy as defined by the character's metabolic rate. The metabolic rate is a meta parameter dependent upon several static parameters and is further discussed below in conjunction with FIG. 6.
  • In the present embodiment, the health parameter [0035] 504 is less than or equal to the maximum hit point parameter 410 (FIG. 4), and is a function of the character's energy parameter 502, the character's self-healing rate parameter 416 (FIG. 4), and a number of character hits. For example, a character is assigned a health parameter 504 equal to the maximum hit point parameter 410 upon game initialization. Each time the character is hit by another character via a physical blow or weapons fire, the character's health parameter 504 decreases. In addition, whenever a character's energy parameter 502 falls below a predefined threshold value, the character's health parameter 504 decreases. Furthermore, the character's health parameter 504 increases at the character's self-healing rate 416. Thus, although static and dynamic parameters are stored in separate tables, these parameters are closely related. For example, the health parameter 504 is based in part on the self-healing rate parameter 416, which is a static parameter.
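One way to combine these effects into a single update function; the damage amounts, energy threshold, and penalty value are illustrative constants, since the patent gives only the qualitative dependencies:

```python
def update_health(health, max_hit_points, self_healing_rate,
                  hits=0, energy=100, energy_threshold=20,
                  hit_damage=10, low_energy_penalty=5):
    """One update of health parameter 504: each hit lowers health, energy
    below the predefined threshold lowers health, and the character heals
    at its self-healing rate 416, capped at max hit point parameter 410."""
    health -= hits * hit_damage
    if energy < energy_threshold:
        health -= low_energy_penalty
    health += self_healing_rate
    return max(0, min(health, max_hit_points))
```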
  • Preferably, the character's [0036] irritation parameter 506 increases if the character is exposed to irritating stimuli, such as the presence of enemies or weapons fire within the character's range of sight, specified by the sight parameter 406 (FIG. 4). The irritation parameter 506 decreases over time at a predefined rate.
  • Finally, the character's [0037] game experience parameter 508 quantifies a character's game experiences, particularly in association with character participation in tribal games and fighting. For example, an experienced character has accumulated wisdom, and is less likely to be surprised by game situations and more adept at making game decisions.
  • FIG. 6 is an illustration of an exemplary embodiment of the meta parameter table [0038] 312 of FIG. 3B. The meta parameter table 312 includes a plurality of meta parameters, such as, but not necessarily completely inclusive of or limited to, a hunger parameter 602, a metabolic rate parameter 604, an aggression parameter 606, and a fight/flight parameter 608. The meta parameters are typically changeable, and are based upon the static and dynamic parameters. For example, a character's desire to eat is dependent upon the hunger parameter 602. In one embodiment of the invention, the hunger parameter 602 is a signed value defined by the energy parameter 502 (FIG. 5) less the hunger point parameter 412 (FIG. 4). If the character's hunger parameter 602 is greater than zero, then the character is not hungry. However, if the character's hunger parameter 602 is less than zero, then the character is hungry. As the negative hunger parameter 602 decreases (i.e., becomes more negative), the character's desire to eat increases. This desire to eat may then be balanced with other desires, such as a desire to attack an enemy or to search for a weapons cache. The weighting of these parameters may determine a character's behaviors and actions.
  • Typically, the metabolic rate parameter [0039] 604 is directly proportional to the character's speed parameter 404 (FIG. 4), the strength parameter 402 (FIG. 4), and the maximum hit point parameter 410 (FIG. 4), while indirectly proportional to the character's hunger point parameter 412 (FIG. 4) and healing urge parameter 414 (FIG. 4). For example, if the character's healing urge parameter 414 is large, the character is likely a calm, non-excitable individual. Therefore the character's metabolic rate parameter 604 would be small. Alternatively, if the character's healing urge parameter 414 is small, the character is likely a highly-strung, excitable individual. Consequently, the character's metabolic rate parameter 604 would be large.
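One way to realize the stated proportionalities is a simple ratio. The multiplicative form and the constant k below are assumptions, since the patent names only which parameters the metabolic rate varies directly and inversely with:

```python
def metabolic_rate(speed: float, strength: float, max_hit_points: float,
                   hunger_point: float, healing_urge: float,
                   k: float = 1.0) -> float:
    # Directly proportional to speed, strength, and maximum hit points;
    # indirectly (inversely) proportional to the hunger point and healing
    # urge parameters. A large healing urge (a calm character) yields a
    # small metabolic rate, and vice versa.
    return k * speed * strength * max_hit_points / (hunger_point * healing_urge)
```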
  • Finally, the aggression parameter [0040] 606 is defined as the aggressive base parameter 418 (FIG. 4) plus the irritation parameter 506 (FIG. 5). As the aggression parameter 606 increases, the character becomes more aggressive and is more likely to be engaged in fights.
  • A character uses the fight/flight parameter [0041] 608 to determine whether, when faced with an enemy or other dangerous situations, to fight or flee the enemy. The fight/flight parameter 608 is preferably based upon the hunger parameter 602, the aggression parameter 606, the game experience parameter 508 (FIG. 5), and the energy parameter 502 (FIG. 5). In one embodiment of the invention, a large value for the fight/flight parameter 608 corresponds to a character's desire to fight, whereas a small value for the fight/flight parameter 608 corresponds to a character's desire to flee. For example, as the character's hunger or aggression increases, as measured by the character's hunger parameter 602 and aggression parameter 606, respectively, the character is more likely to engage in fights.
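The aggression definition of the preceding paragraph and the fight/flight composition described here might be combined as follows. The weighted-sum form, the equal weights, and the 0–100 clamp (matching the range used in the later examples) are assumptions; the patent names only the contributing parameters:

```python
def aggression(aggressive_base: int, irritation: int) -> int:
    # Paragraph [0040]: aggression = aggressive base + irritation.
    return aggressive_base + irritation


def fight_flight(hunger: int, aggression_value: int,
                 experience: int, energy: int) -> int:
    # Illustrative score on the 0 (flee) .. 100 (fight) scale. A more
    # negative hunger parameter (hungrier), and larger aggression,
    # experience, and energy values, all push the score toward "fight".
    raw = (50 + (-hunger) // 4 + aggression_value // 4
           + experience // 4 + energy // 4)
    return max(0, min(100, raw))
```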
  • FIG. 7 is an illustration of one embodiment of the emotion tables [0042] 314 of FIG. 3B, according to the present invention. The emotion tables 314 include an individual's hate/love (H/L) table 702, a species H/L table 704, and a team H/L table 706. The individual's H/L table 702 includes one or more character identification (ID) numbers and one or more character H/L values, wherein each character ID number is associated with a character H/L value. For example, character A has a −900 character H/L value corresponding to a character identified by character ID number 192993293. Thus, character A has a strong hatred for the individual having character ID number 192993293. Conversely, character A has a 100 character H/L value for character ID number 339399928. This positive H/L value corresponds to a general liking of the individual having ID number 339399928. The more negative or positive the H/L value is, the more the particular individual is hated or loved, respectively. In a further embodiment, the individual's H/L table 702 may also include individual character names corresponding to the character ID numbers.
  • The species H/L table [0043] 704 includes one or more species names and one or more species H/L values. Each species name is associated with a species H/L value which represents character A's relationship with each species. Similar to the individual's H/L table 702, the more negative or positive the H/L value, the more the particular species is hated or loved, respectively. For example, character A has a 100 species H/L value corresponding to the Nids species, which implies a general liking of the Nids species. Conversely, character A has a −500 species H/L value corresponding to the Antenids species. Therefore, character A has a strong dislike (i.e., hate) for the Antenids species.
  • Similarly, the team H/L table [0044] 706 includes one or more team ID numbers, one or more team H/L values, and one or more team names. Each team ID number is associated with a team H/L value and a team name. For example, the character A has a 1000 team H/L value corresponding to the Frosties team represented by ID number 139000. Because the H/L value is so high, character A has a deep love for the Frosties team. However, character A has a −500 H/L value corresponding to the Slashers team represented by ID number 939992, thereby representing a hate for this team.
  • In one embodiment of the invention, the character, species, and team H/L values range from −1000 to 1000. A character, species, or team H/L value of 1000 represents unconditional love directed towards the character, species, or team, respectively, while a character, species, or team H/L value of −1000 represents extreme hatred directed towards the character, species, or team, respectively. An H/L value of zero represents a neutral feeling. In alternate embodiments, the H/L value ranges may be larger or smaller, and may include other maximum and minimum values. [0045]
  • Any type of character interaction may cause changes to H/L values. For example, character A initially has an 800 character H/L value corresponding to character B and a −50 character H/L value corresponding to character C. However, character A sees character C hit character B, and thus character A's character H/L values are adjusted accordingly. In this example, character A's character H/L value corresponding to character B increases to 850 because of feelings of sympathy towards character B, and character A's character H/L value corresponding to character C may decrease to −200 due to an increased hatred for character C. In addition, if character C then attacks character A, character A develops more hatred towards character C, and character A's character H/L value corresponding to character C may further decrease to −275. However, at some later time in the game, if character C communicates to character A useful information on the operation of a weapon, then character A's character H/L value corresponding to character C may increase to −150. [0046]
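A minimal sketch of interaction-driven H/L adjustment, clamped to the −1000 to 1000 range of the preceding paragraph. The dict-based table layout and the specific deltas (chosen to reproduce the worked numbers above) are illustrative assumptions:

```python
HL_MIN, HL_MAX = -1000, 1000  # paragraph [0045]: extreme hatred .. unconditional love


def adjust_hl(table: dict, key: str, delta: int) -> None:
    # Apply an interaction-driven adjustment to an H/L value, keeping the
    # result inside the defined range; absent entries default to neutral (0).
    table[key] = max(HL_MIN, min(HL_MAX, table.get(key, 0) + delta))


# Character A's H/L values from the example above:
hl = {"B": 800, "C": -50}
adjust_hl(hl, "B", +50)   # sympathy after seeing C hit B  -> 850
adjust_hl(hl, "C", -150)  # increased hatred for C         -> -200
adjust_hl(hl, "C", -75)   # C then attacks A               -> -275
adjust_hl(hl, "C", +125)  # C later shares weapon info     -> -150
```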
  • According to one embodiment of the present invention, the data table adjuster [0047] 208 (FIG. 2) initializes all character and team H/L values to zero upon initiation of a new game. Furthermore, the data table adjuster 208 initializes all species H/L values to zero or to non-zero predefined values dependent upon game-defined species compatibility. In an alternate embodiment, the data table adjuster 208 initializes all species, character, and team H/L values to zero upon initiation of a new game. In a further embodiment, the data table adjuster 208 initializes some or all character, species, and team H/L values to non-zero predefined values dependent upon game-defined character, species, and team compatibility. Upon game completion, a user may save all the H/L values to the memory card 116 (FIG. 1) or the hard disc drive 128 (FIG. 1) for future game play. If the user has instructed the game module 200 (FIG. 2) to save the H/L values, the data table adjuster 208 may use the saved H/L values to initialize all game H/L values upon continuation of game play.
  • FIG. 8 is an [0048] exemplary flowchart 800 of method steps for dynamic behavioral modification based upon game interactions, according to one embodiment of the present invention. In step 802, H/L values are initialized. Initially, a user (not shown) instructs the game entertainment system 100 (FIG. 1) to execute the game module 200 (FIG. 2) via user commands and the controller interface 114 (FIG. 1). The CPU 104 (FIG. 1) receives the user commands and executes the data table adjuster 208 (FIG. 2). The data table adjuster 208 accesses the character, species, and team H/L values and stores the character, species, and team H/L values in the emotion tables 314 (FIG. 3B). These species H/L values are set to predefined values dependent upon game-defined species compatibility or results of a previous playing of the game. In an initial game play, since characters and teams have not yet interacted via game play, the data table adjuster 208, preferably, initializes all character and team H/L values to zero, where an H/L value of zero represents a neutral emotion.
  • Next in [0049] step 804, the CPU 104 executes the action generator 204 (FIG. 2) and the characteristic generator 206 (FIG. 2) to generate game play and game interactions. Game interactions typically include information exchange between characters, as well as communication, observation, detection of sound, direct physical contact, and indirect physical contact. For example, in one embodiment of the invention, character A and character B may interact and exchange information via a conversation. In another embodiment of the invention, character A may receive information via observations. For instance, character A may observe character B engaged in direct physical contact with character C via a fist fight, or character A may observe character B engage character C in indirect physical contact via an exchange of weapons fire. Alternatively, in another example, character A may observe character B interact with an “inanimate” object. For example, character B moves a rock and discovers a weapons cache. In a further embodiment of the invention, character A may hear a communication between character B and character C. In yet another embodiment of the invention, character A may engage in direct physical contact with character B. Finally, in another embodiment of the invention, character A may engage in indirect physical contact with character B. For example, character A may discharge a weapon aimed at character B, or character A may receive fire from character B's weapon. The above-described game interactions are merely exemplary; the scope of the present invention covers all types of interactions.
  • Subsequently, the user issues an external stimulus to a group of characters in a [0050] step 806. For example, the user may issue a voice command via the controller interface 114 (FIG. 1), via a microphone (not shown) integrated with the controller interface 114, or via a microphone electrically connected to the I/O processor 110 (FIG. 1). Further, the user may issue a variety of voice commands, such as “run away,” “fight,” “search for weapons,” “eat,” “spread out,” “group together,” “follow character A,” “hate species D,” and “love species D.” Depending on the design of the game, other commands may be issued by the user, and alternative forms of external stimuli may be utilized (e.g., keyboard or mouse commands).
  • In [0051] step 808, the external stimulus issued by the user modifies character attributes. More specifically, one or more of each character's static, dynamic, meta, and/or emotional attributes may be modified according to the issued stimulus. In one embodiment of the invention, the game entertainment system 100 uses the issued stimulus to temporarily modify a given attribute associated with each character of the group of characters by a predefined amount. After a predefined time delay T, dependent upon the video game and the given attribute, the given attribute reverts back to its pre-modified value.
  • In a first example, if the user issues the external stimulus, “run away,” via a voice command to a user-associated group of characters facing enemy characters, the microphone (not shown) converts the voice command to electrical signals and sends the electrical signals to the [0052] CPU 104 via the I/O processor 110. The voice recognition module 210 (FIG. 2), executable by the CPU 104, processes the electrical signals and instructs the data table adjuster 208 to increment each character's fight/flight parameter 608 (FIG. 6) by a predefined “flee” amount.
  • For example, suppose the user-associated group of characters comprises character A with a fight/flight parameter of 63, character B with a fight/flight parameter of 86, and character C with a fight/flight parameter of 35, where the fight/flight parameter ranges from 0 (100% desire to flee) to 100 (100% desire to fight). In addition, if the predefined “flee” amount is −20, then the data table adjuster [0053] 208 decreases character A's fight/flight parameter to 43, character B's fight/flight parameter to 66, and character C's fight/flight parameter to 15. Subsequently, each character (A, B, and C) of the group of characters has more of a desire to “run away” from the enemy characters than before the external command was issued, although each character's individual desire to flee depends upon each character's modified fight/flight parameter. Since each character's behavior depends upon each character's attributes, and since many of the attributes are interdependent, an external stimulus issued to a group of characters may have many behavioral manifestations.
  • In another embodiment of the present invention, after a predefined time delay T has elapsed from issuance of the external stimulus, the data table adjuster [0054] 208 changes the characters' fight/flight parameters to their pre-modified values. That is, the data table adjuster changes character A's fight/flight parameter back to 63, character B's fight/flight parameter to 86, and character C's fight/flight parameter to 35.
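The temporary adjustment and timed reversion described in this and the preceding paragraphs can be sketched as follows. The injectable clock and explicit tick() call are implementation assumptions suited to a game loop; the numbers reproduce the “run away” example:

```python
class TemporaryAdjuster:
    """Sketch of the data table adjuster's temporary modification (steps
    [0051]-[0054]): apply a predefined delta to a character attribute,
    then revert the attribute to its pre-modified value once the
    predefined time delay T has elapsed."""

    def __init__(self, clock):
        self.clock = clock    # callable returning the current game time
        self.pending = []     # (revert_time, attrs, name, original_value)

    def apply(self, attrs, name, delta, delay_t):
        # Record the pre-modified value, then apply the adjustment.
        self.pending.append((self.clock() + delay_t, attrs, name, attrs[name]))
        attrs[name] += delta

    def tick(self):
        # Called each game-loop iteration: revert any expired adjustments.
        now = self.clock()
        expired = [p for p in self.pending if now >= p[0]]
        self.pending = [p for p in self.pending if now < p[0]]
        for _, attrs, name, original in expired:
            attrs[name] = original


# The "run away" example: flee amount -20, reverted after T = 5 time units.
now = [0.0]
adjuster = TemporaryAdjuster(clock=lambda: now[0])
group = [{"fight_flight": 63}, {"fight_flight": 86}, {"fight_flight": 35}]
for character in group:
    adjuster.apply(character, "fight_flight", -20, delay_t=5.0)
# The parameters are now 43, 66, and 15; once the game clock passes the
# delay and tick() runs, they revert to 63, 86, and 35.
```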
  • Although the present example illustrates a modification of the characters' fight/flight parameters upon issuance of the “run away” external stimulus, the fight/flight parameter is not the only attribute that may be modified; the data table adjuster [0055] 208 may modify alternate character attributes to elicit the desired character response to the external stimulus.
  • In an alternate embodiment, the data table adjuster [0056] 208 adjusts each character's fight/flight parameter by an amount dependent upon the strength of the voice command issued by the user. As the strength of the voice command increases, the adjustment increases. For example, a softly spoken “run away” command may decrease each character's fight/flight parameter by 10, whereas a loudly spoken “run away” command may decrease each character's fight/flight parameter by 35.
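A possible mapping from voice-command strength to adjustment size, using the soft (−10) and loud (−35) endpoints given above; the normalized loudness scale and the linear interpolation between the endpoints are assumptions:

```python
def flee_adjustment(loudness: float,
                    soft_delta: int = -10, loud_delta: int = -35) -> int:
    # loudness: 0.0 = softest spoken command, 1.0 = loudest. A soft
    # "run away" decreases the fight/flight parameter by 10, a loud one
    # by 35; values in between interpolate linearly (an assumption).
    loudness = max(0.0, min(1.0, loudness))
    return round(soft_delta + (loud_delta - soft_delta) * loudness)
```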
  • In a second example, the user issues an external stimulus, “fight,” to a user-associated group of characters. Subsequently, in one embodiment of the invention, the data table adjuster [0057] 208 temporarily increases each character's fight/flight parameter by a predefined “fight” amount. Referring to the first example, if the predefined “fight” amount is 10, then the data table adjuster 208 temporarily increases character A's fight/flight parameter to 73, character B's fight/flight parameter to 96, and character C's fight/flight parameter to 45. Subsequently, each character (A, B, and C) of the group of characters has more of a desire to “fight” the enemy characters than before the external command was issued.
  • In another embodiment of the invention, upon issuance of the “fight” command, the data table adjuster [0058] 208 adds a predefined aggression adjustment to each character's aggressive base parameter 418 (FIG. 4). Since the fight/flight parameter 608 is based in part on the aggression parameter 606 (FIG. 6), and since the aggression parameter 606 is related to the aggressive base parameter 418 and the irritation parameter 506 (FIG. 5), then as the aggressive base parameter 418 increases, the fight/flight parameter 608 increases and the character is more likely to be engaged in fights.
  • In a third example, the user issues an external stimulus, “eat,” to a user-associated group of characters. Subsequently, in one embodiment of the invention, the data table adjuster [0059] 208 temporarily increases each character's hunger point parameter 412 (FIG. 4) by a predefined hunger point adjustment. Since each character's hunger parameter 602 (FIG. 6) decreases as each character's hunger point parameter 412 increases, each character has a greater desire to eat after the user issues the “eat” command. The scope of the present invention covers all character actions and behaviors modified by a user-issued external stimulus, such as the “eat” command. For example, as each character eats, each character's energy parameter 502 (FIG. 5) increases, and each character is more likely to be healthy as characterized by a large character health parameter 504 (FIG. 5). Consequently, each character will be more likely to fight enemy characters, since in one embodiment of the invention, each character's fight/flight parameter 608 (FIG. 6) increases as each character becomes more healthy.
  • In a fourth example, the user issues an external stimulus, “hate species D,” to a user-associated group of characters. Subsequently, the data table adjuster [0060] 208 subtracts a predefined hate adjustment value from each character's species H/L value corresponding to species D. For example, character A has a 100 species H/L value (stored in the species H/L table 704 (FIG. 7)) corresponding to the Nids species. In addition, suppose character B has a −235 species H/L value corresponding to the Nids species, and character C has a −800 species H/L value corresponding to the Nids species. Furthermore, suppose that the predefined hate adjustment value is 100. After the user issues a “hate Nids” voice command, the data table adjuster 208 subtracts the predefined hate adjustment value from each character's species H/L value corresponding to the Nids species, resulting in character A having a 0 species H/L value corresponding to the Nids species, character B having a −335 species H/L value corresponding to the Nids species, and character C having a −900 species H/L value corresponding to the Nids species.
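The fourth example can be reproduced directly; the data layout and the clamping to the −1000 to 1000 range of paragraph [0045] are illustrative choices, not specified by the patent:

```python
HL_MIN, HL_MAX = -1000, 1000


def apply_hate_command(characters: dict, species: str,
                       hate_adjustment: int = 100) -> None:
    # Subtract the predefined hate adjustment from each character's
    # species H/L value for the named species, clamped to the H/L range.
    for attrs in characters.values():
        hl = attrs["species_hl"]
        hl[species] = max(HL_MIN,
                          min(HL_MAX, hl.get(species, 0) - hate_adjustment))


party = {
    "A": {"species_hl": {"Nids": 100}},
    "B": {"species_hl": {"Nids": -235}},
    "C": {"species_hl": {"Nids": -800}},
}
apply_hate_command(party, "Nids")  # the "hate Nids" voice command
# A: 100 -> 0, B: -235 -> -335, C: -800 -> -900
```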
  • These above-described modifications to the character's species H/L values corresponding to the Nids species may affect the character's subsequent game actions, since the character's behavior depends upon the character's static, dynamic, meta, and emotional attributes. The above-described external stimuli and adjustments to character attributes are merely exemplary; the scope of the present invention covers all types of external stimuli and all types of adjustments to a character's attributes. [0061]
  • In [0062] step 810, the user-associated group of characters responds to the user-issued external stimulus. Referring back to the first example of step 808, character A, character B, and character C run across the terrain in response to the “run away” voice command issued by the user. Alternatively, in the fourth example of step 808, character A, character B, and character C may not respond immediately to the “hate Nids” voice command issued by the user, however the adjustment to characters A, B, and C's species H/L values corresponding to the Nids species will influence their behavior upon an encounter with the Nids. In this example, if Nids characters are located near character A, B, or C, that character may attack the Nids characters if its modified species H/L values in turn modify its other attributes, causing it to be more aggressive during a Nids encounter.
  • In another exemplary embodiment, if the external stimulus issued by the user in [0063] step 808 is “follow character A,” then the characters' flocking parameters (not shown) are adjusted such that the characters have a greater desire to follow character A. Not every character will necessarily follow character A step by step; rather, dependent upon each character's adjusted flocking parameter, each character will move in the same general direction as character A.
  • Next in [0064] step 812, the CPU 104 determines if the game user(s) have completed the game. If the CPU 104 determines that the game is completed, then the method ends. However if in step 812 the CPU 104 determines that the game is not completed, then the method continues at step 804. The scope of the invention includes method steps in which step 804 is optional. That is, a user may issue two or more external stimuli without an occurrence of any intervening game interaction.
  • The invention has been described above with reference to specific embodiments. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The foregoing description and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. [0065]

Claims (29)

What is claimed is:
1. A method for behavioral modification of a group of characters using external stimuli, comprising the steps of:
receiving an external stimulus;
temporarily adjusting at least one attribute of each character from the group of characters based upon the external stimulus; and
responding to the external stimulus based upon the at least one attribute of each character.
2. The method of claim 1, wherein the external stimulus is a voice command.
3. The method of claim 1, wherein the at least one attribute of each character affects behavior of each character.
4. The method of claim 1, wherein at least one of the temporarily adjusted attributes reverts to a pre-adjusted value after a predefined time delay T.
5. The method of claim 1, wherein the at least one attribute includes hate/love (H/L) values.
6. The method of claim 1, wherein the at least one attribute includes static attributes.
7. The method of claim 1, wherein the at least one attribute includes dynamic attributes.
8. The method of claim 1, wherein the at least one attribute includes meta attributes.
9. An electronic-readable medium having embodied thereon a program, the program being executable by a machine to perform method steps for behavioral modification of a group of characters, the method steps comprising:
receiving an external stimulus;
temporarily adjusting at least one attribute of each character from the group of characters based upon the external stimulus; and
responding to the external stimulus based upon the at least one attribute of each character.
10. The electronic-readable medium of claim 9, wherein the external stimulus is a voice command.
11. The electronic-readable medium of claim 9, wherein the at least one attribute of each character affects behavior of each character.
12. The electronic-readable medium of claim 9, wherein at least one of the temporarily adjusted attributes reverts to a pre-adjusted value after a predefined time delay T.
13. A method for behavioral modification of a group of characters using external stimuli, comprising the steps of:
representing behavior of each character from the group of characters via attributes associated with each character from the group of characters;
receiving an external stimulus;
temporarily adjusting one or more attributes associated with each character based upon the external stimulus; and
having each character respond to the external stimulus based upon the one or more adjusted attributes.
14. The method of claim 13, wherein the external stimulus is a voice command.
15. The method of claim 13, wherein the step of adjusting one or more attributes further comprises the step of adjusting one or more emotional attributes associated with each character based upon the external stimulus.
16. The method of claim 15, wherein the one or more emotional attributes are hate/love (H/L) values.
17. The method of claim 15, further comprising the step of having each character respond to the external stimulus based upon the one or more adjusted emotional attributes.
18. An electronic-readable medium having embodied thereon a program, the program being executable by a machine to perform method steps for behavioral modification of a group of characters using external stimuli, the method steps comprising:
representing behavior of each character from the group of characters via attributes associated with each character from the group of characters;
receiving an external stimulus;
temporarily adjusting one or more attributes associated with each character based upon the external stimulus; and
having each character respond to the external stimulus based upon the one or more adjusted attributes.
19. The electronic-readable medium of claim 18, wherein the step of adjusting one or more attributes further comprises the step of adjusting one or more emotional attributes associated with each character based upon the external stimulus.
20. The electronic-readable medium of claim 19, further comprising the step of having each character respond to the external stimulus based upon the one or more adjusted emotional attributes.
21. A system for behavioral modification of a group of characters, comprising:
means for receiving an external stimulus;
means for temporarily adjusting at least one attribute of each character from the group of characters based upon the external stimulus; and
means for responding to the external stimulus based upon the at least one attribute of each character.
22. An electronic system for behavioral modification of a group of characters, comprising:
a processor;
an input device coupled to the processor for receiving commands and converting the received commands into electrical signals;
a data storage coupled to the processor for storing attributes of each character from the group of characters; and
a data table adjuster executable by the processor for adjusting each character's attributes based upon the electrical signals.
23. The electronic system of claim 22, wherein the input device is a microphone and the commands are voice commands.
24. The electronic system of claim 22, wherein the commands are external stimuli.
25. The electronic system of claim 22, further comprising an action generator executable by the processor for processing the attributes in response to the commands.
26. The electronic system of claim 22, wherein the attributes include emotional attributes.
27. The electronic system of claim 22, wherein the attributes include static attributes.
28. The electronic system of claim 22, wherein the attributes include dynamic attributes.
29. The electronic system of claim 22, wherein the attributes include meta attributes.
US10/364,951 2002-08-07 2003-02-11 Group behavioral modification using external stimuli Abandoned US20040029625A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/364,951 US20040029625A1 (en) 2002-08-07 2003-02-11 Group behavioral modification using external stimuli
EP03254168A EP1388357A3 (en) 2002-08-07 2003-06-30 Group behavioural modification using external stimuli
JP2003288128A JP3865721B2 (en) 2002-08-07 2003-08-06 Method and electronic device for modifying group behavior using external stimuli

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US40194002P 2002-08-07 2002-08-07
US10/364,951 US20040029625A1 (en) 2002-08-07 2003-02-11 Group behavioral modification using external stimuli

Publications (1)

Publication Number Publication Date
US20040029625A1 true US20040029625A1 (en) 2004-02-12

Family

ID=30448236

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/364,951 Abandoned US20040029625A1 (en) 2002-08-07 2003-02-11 Group behavioral modification using external stimuli

Country Status (3)

Country Link
US (1) US20040029625A1 (en)
EP (1) EP1388357A3 (en)
JP (1) JP3865721B2 (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050234993A1 (en) * 2004-03-18 2005-10-20 Ordille Joann J Method and apparatus for subscribing to dynamic attributes
US20070218993A1 (en) * 2004-09-22 2007-09-20 Konami Digital Entertainment Co., Ltd. Game Machine, Game Machine Control Method, Information Recording Medium, and Program
US20080146343A1 (en) * 2006-12-14 2008-06-19 Sullivan C Bart Wireless video game system and method
US20080248847A1 (en) * 2005-11-22 2008-10-09 Konami Digital Entertainment Co., Ltd. Game Result Evaluation Method and Device
US20080275680A1 (en) * 2007-05-04 2008-11-06 Dato Gregory G Systems and methods for simulating vehicle operation
US20090031258A1 (en) * 2007-07-26 2009-01-29 Nokia Corporation Gesture activated close-proximity communication
US20090037548A1 (en) * 2002-05-14 2009-02-05 Avaya Inc. Method and Apparatus for Automatic Notification and Response
US20090082076A1 (en) * 2002-08-07 2009-03-26 Sony Computer Entertainment America Inc. Emotion-based game character Manipulation
US20090254859A1 (en) * 2008-04-03 2009-10-08 Nokia Corporation Automated selection of avatar characteristics for groups
US20100041475A1 (en) * 2007-09-05 2010-02-18 Zalewski Gary M Real-Time, Contextual Display of Ranked, User-Generated Game Play Advice
US8790178B1 (en) * 2012-11-15 2014-07-29 Kabam, Inc. Metric based conformance by character units to specified formations
US8904298B2 (en) 2013-03-15 2014-12-02 Disney Enterprises, Inc. Effectuating modifications within an instance of a virtual space presented via multiple disparate client computing platforms responsive to detection of a token associated with a single client computing platform
US8909920B2 (en) 2012-12-26 2014-12-09 Disney Enterprises, Inc. Linking token detection at a single computing platform with a user identification to effectuate modifications in virtual space instances presented via multiple computing platforms
US8972369B2 (en) 2012-12-26 2015-03-03 Disney Enterprises, Inc. Providing a common virtual item repository in a virtual space
US8968099B1 (en) 2012-11-01 2015-03-03 Google Inc. System and method for transporting virtual objects in a parallel reality game
US8986115B2 (en) 2012-12-26 2015-03-24 Disney Enterprises, Inc. Facilitating customization of a virtual space based on accessible virtual items
US9126116B2 (en) 2007-09-05 2015-09-08 Sony Computer Entertainment America Llc Ranking of user-generated game play advice
US9128789B1 (en) 2012-07-31 2015-09-08 Google Inc. Executing cross-cutting concerns for client-server remote procedure calls
US9226106B1 (en) 2012-07-31 2015-12-29 Niantic, Inc. Systems and methods for filtering communication within a location-based game
USD748200S1 (en) 2013-01-15 2016-01-26 Disney Enterprises, Inc. Power disk reader

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010533006A (en) * 2007-03-01 2010-10-21 Sony Computer Entertainment America LLC System and method for communicating with a virtual world

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4704696A (en) * 1984-01-26 1987-11-03 Texas Instruments Incorporated Method and apparatus for voice control of a computer
USRE35314E (en) * 1986-05-20 1996-08-20 Atari Games Corporation Multi-player, multi-character cooperative play video game with independent player entry and departure
US6273818B1 (en) * 1999-10-25 2001-08-14 Square Co., Ltd. Video game apparatus and method and storage medium
US20020082065A1 (en) * 2000-12-26 2002-06-27 Fogel David B. Video game characters having evolving traits
US6508706B2 (en) * 2001-06-21 2003-01-21 David Howard Sitrick Electronic interactive gaming apparatus, system and methodology
US6684127B2 (en) * 2000-02-14 2004-01-27 Sony Corporation Method of controlling behaviors of pet robots
US6729954B2 (en) * 2000-08-29 2004-05-04 Koei Co., Ltd. Battle method with attack power based on character group density
US6935954B2 (en) * 2000-02-24 2005-08-30 Nintendo Of America Inc. Sanity system for video game

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11126017A (en) * 1997-08-22 1999-05-11 Sony Corp Storage medium, robot, information processing device and electronic pet system

Cited By (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090037548A1 (en) * 2002-05-14 2009-02-05 Avaya Inc. Method and Apparatus for Automatic Notification and Response
US8510392B2 (en) 2002-05-14 2013-08-13 Avaya Inc. Method and apparatus for automatic notification and response
US9216354B2 (en) 2002-08-07 2015-12-22 Sony Computer Entertainment America Llc Attribute-driven gameplay
US20090082076A1 (en) * 2002-08-07 2009-03-26 Sony Computer Entertainment America Inc. Emotion-based game character manipulation
US8172656B2 (en) 2002-08-07 2012-05-08 Sony Computer Entertainment America Llc Attribute-driven gameplay
US8727845B2 (en) 2002-08-07 2014-05-20 Sony Computer Entertainment America Llc Attribute-driven gameplay
US8096863B2 (en) 2002-08-07 2012-01-17 Sony Computer Entertainment America Llc Emotion-based game character manipulation
US8516045B2 (en) 2004-03-18 2013-08-20 Avaya Inc. Method and apparatus for automatic notification and response based on communication flow expressions having dynamic context
US20050234993A1 (en) * 2004-03-18 2005-10-20 Ordille Joann J Method and apparatus for subscribing to dynamic attributes
US8566311B2 (en) * 2004-03-18 2013-10-22 Avaya, Inc. Method and apparatus for notifying a user of predefined changes to dynamic attributes
US8495163B2 (en) 2004-03-18 2013-07-23 Avaya, Inc. Method and apparatus for a publish-subscribe system with templates for role-based view of subscriptions
US8128497B2 (en) * 2004-09-22 2012-03-06 Konami Digital Entertainment Co., Ltd. Game machine, game machine control method, information recording medium, and program
US20070218993A1 (en) * 2004-09-22 2007-09-20 Konami Digital Entertainment Co., Ltd. Game Machine, Game Machine Control Method, Information Recording Medium, and Program
US8137171B2 (en) * 2005-11-22 2012-03-20 Konami Digital Entertainment Co., Ltd. Game result evaluation method and device
US20080248847A1 (en) * 2005-11-22 2008-10-09 Konami Digital Entertainment Co., Ltd. Game Result Evaluation Method and Device
US20080146343A1 (en) * 2006-12-14 2008-06-19 Sullivan C Bart Wireless video game system and method
US8333641B2 (en) * 2006-12-14 2012-12-18 Sullivan C Bart Wireless video game system and method
US7783461B2 (en) * 2007-05-04 2010-08-24 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for simulating vehicle operation
US20080275680A1 (en) * 2007-05-04 2008-11-06 Dato Gregory G Systems and methods for simulating vehicle operation
US20090031258A1 (en) * 2007-07-26 2009-01-29 Nokia Corporation Gesture activated close-proximity communication
US9843351B2 (en) 2007-07-26 2017-12-12 Nokia Technologies Oy Gesture activated close-proximity communication
US10271197B2 (en) 2007-07-27 2019-04-23 Intertrust Technologies Corporation Content publishing systems and methods
US10051457B2 (en) 2007-07-27 2018-08-14 Intertrust Technologies Corporation Content publishing systems and methods
US11218866B2 (en) 2007-07-27 2022-01-04 Intertrust Technologies Corporation Content publishing systems and methods
US20100041475A1 (en) * 2007-09-05 2010-02-18 Zalewski Gary M Real-Time, Contextual Display of Ranked, User-Generated Game Play Advice
US9108108B2 (en) 2007-09-05 2015-08-18 Sony Computer Entertainment America Llc Real-time, contextual display of ranked, user-generated game play advice
US10486069B2 (en) 2007-09-05 2019-11-26 Sony Interactive Entertainment America Llc Ranking of user-generated game play advice
US9126116B2 (en) 2007-09-05 2015-09-08 Sony Computer Entertainment America Llc Ranking of user-generated game play advice
US20090254859A1 (en) * 2008-04-03 2009-10-08 Nokia Corporation Automated selection of avatar characteristics for groups
US8832552B2 (en) * 2008-04-03 2014-09-09 Nokia Corporation Automated selection of avatar characteristics for groups
US9604131B1 (en) 2012-07-31 2017-03-28 Niantic, Inc. Systems and methods for verifying player proximity within a location-based game
US10806998B1 (en) 2012-07-31 2020-10-20 Niantic, Inc. Using side channels in remote procedure calls to return information in an interactive environment
US9128789B1 (en) 2012-07-31 2015-09-08 Google Inc. Executing cross-cutting concerns for client-server remote procedure calls
US10646783B1 (en) 2012-07-31 2020-05-12 Niantic, Inc. Linking real world activities with a parallel reality game
US9226106B1 (en) 2012-07-31 2015-12-29 Niantic, Inc. Systems and methods for filtering communication within a location-based game
US11167205B2 (en) 2012-07-31 2021-11-09 Niantic, Inc. Placement of virtual elements in a virtual world associated with a location-based parallel reality game
US10300395B1 (en) * 2012-07-31 2019-05-28 Niantic, Inc. Systems and methods for filtering communication within a location-based game
US10130888B1 (en) 2012-07-31 2018-11-20 Niantic, Inc. Game data validation
US9782668B1 (en) 2012-07-31 2017-10-10 Niantic, Inc. Placement of virtual elements in a virtual world associated with a location-based parallel reality game
US9723107B1 (en) 2012-07-31 2017-08-01 Niantic, Inc. Executing cross-cutting concerns for client-server remote procedure calls
US9669293B1 (en) 2012-07-31 2017-06-06 Niantic, Inc. Game data validation
US9539498B1 (en) 2012-07-31 2017-01-10 Niantic, Inc. Mapping real world actions to a virtual world associated with a location-based game
US9669296B1 (en) 2012-07-31 2017-06-06 Niantic, Inc. Linking real world activities with a parallel reality game
US9621635B1 (en) 2012-07-31 2017-04-11 Niantic, Inc. Using side channels in remote procedure calls to return information in an interactive environment
US9950259B2 (en) 2012-10-29 2018-04-24 Sony Interactive Entertainment Inc. Ambient light control and calibration via a console
US9833707B2 (en) 2012-10-29 2017-12-05 Sony Interactive Entertainment Inc. Ambient light control and calibration via a console
US8968099B1 (en) 2012-11-01 2015-03-03 Google Inc. System and method for transporting virtual objects in a parallel reality game
US8974299B1 (en) 2012-11-15 2015-03-10 Kabam, Inc. Metric based conformance by character units to specified formations
US8790178B1 (en) * 2012-11-15 2014-07-29 Kabam, Inc. Metric based conformance by character units to specified formations
US8986115B2 (en) 2012-12-26 2015-03-24 Disney Enterprises, Inc. Facilitating customization of a virtual space based on accessible virtual items
US9667624B2 (en) 2012-12-26 2017-05-30 Disney Enterprises, Inc. Managing an environment of a virtual space based on characters made accessible responsive to corresponding tokens being detected
US9387407B2 (en) 2012-12-26 2016-07-12 Disney Enterprises, Inc. Managing objectives associated with a virtual space based on characters made accessible responsive to corresponding tokens being detected
US9704336B2 (en) 2012-12-26 2017-07-11 Disney Enterprises, Inc. Managing a theme of a virtual space based on characters made accessible responsive to corresponding tokens being detected
US9327200B2 (en) 2012-12-26 2016-05-03 Disney Enterprises, Inc. Managing a theme of a virtual space based on characters made accessible responsive to corresponding tokens being detected
US9922185B2 (en) 2012-12-26 2018-03-20 Disney Enterprises, Inc. Linking token detection at a single computing platform with a user identification to effectuate modifications in virtual space instances presented via multiple computing platforms
US9517404B2 (en) 2012-12-26 2016-12-13 Disney Enterprises, Inc. Apparatus, system, and method for effectuating modifications to a virtual space responsive to token detection
US9457263B2 (en) 2012-12-26 2016-10-04 Disney Enterprises, Inc. Unlocking virtual items in a virtual space responsive to physical token detection
US8972369B2 (en) 2012-12-26 2015-03-03 Disney Enterprises, Inc. Providing a common virtual item repository in a virtual space
US8909920B2 (en) 2012-12-26 2014-12-09 Disney Enterprises, Inc. Linking token detection at a single computing platform with a user identification to effectuate modifications in virtual space instances presented via multiple computing platforms
US9552434B2 (en) 2012-12-26 2017-01-24 Disney Enterprises, Inc. Providing a common virtual item repository in a virtual space
USD748199S1 (en) 2013-01-15 2016-01-26 Disney Enterprises, Inc. Multi-sided power disk
USD748200S1 (en) 2013-01-15 2016-01-26 Disney Enterprises, Inc. Power disk reader
US9092114B2 (en) 2013-03-15 2015-07-28 Disney Enterprises, Inc. Effectuating modifications within an instance of a virtual space presented via multiple disparate client computing platforms responsive to detection of a token associated with a single client computing platform
US8904298B2 (en) 2013-03-15 2014-12-02 Disney Enterprises, Inc. Effectuating modifications within an instance of a virtual space presented via multiple disparate client computing platforms responsive to detection of a token associated with a single client computing platform
US10463953B1 (en) 2013-07-22 2019-11-05 Niantic, Inc. Detecting and preventing cheating in a location-based game
US10912989B2 (en) 2013-07-22 2021-02-09 Niantic, Inc. Detecting and preventing cheating in a location-based game
US10471358B1 (en) 2013-10-31 2019-11-12 Niantic, Inc. Regulating and scoring player interactions within a virtual world associated with a location-based parallel reality game
US9545565B1 (en) 2013-10-31 2017-01-17 Niantic, Inc. Regulating and scoring player interactions within a virtual world associated with a location-based parallel reality game
US10561942B2 (en) 2017-05-15 2020-02-18 Sony Interactive Entertainment America Llc Metronome for competitive gaming headset
US11069258B2 (en) * 2017-06-14 2021-07-20 IfWizard Corporation System for behavioral conditioning through gamification
US20180366028A1 (en) * 2017-06-14 2018-12-20 IfWizard Corporation System for behavioral conditioning through gamification
US10717005B2 (en) 2017-07-22 2020-07-21 Niantic, Inc. Validating a player's real-world location using activity within a parallel reality game
US11541315B2 (en) 2017-07-22 2023-01-03 Niantic, Inc. Validating a player's real-world location using activity within a parallel-reality game
US10541731B2 (en) 2017-09-06 2020-01-21 Sony Interactive Entertainment LLC Smart tags with multiple interactions
US10128914B1 (en) 2017-09-06 2018-11-13 Sony Interactive Entertainment LLC Smart tags with multiple interactions

Also Published As

Publication number Publication date
JP3865721B2 (en) 2007-01-10
JP2004065988A (en) 2004-03-04
EP1388357A2 (en) 2004-02-11
EP1388357A3 (en) 2004-06-09

Similar Documents

Publication Publication Date Title
US20040029625A1 (en) Group behavioral modification using external stimuli
US8096863B2 (en) Emotion-based game character manipulation
US11944903B2 (en) Using playstyle patterns to generate virtual representations of game players
JP6306442B2 (en) Program and game system
JP6515053B2 (en) Program and game system
KR20090003337A (en) Method for automatically adapting virtual equipment model
JP7447296B2 (en) Interactive processing method, device, electronic device and computer program for virtual tools
Joselli et al. An architecture for game interaction using mobile
Schouten et al. Human behavior analysis in ambient gaming and playful interaction
US20230124014A1 (en) Image display method and apparatus, device and storage medium
WO2023202252A1 (en) Virtual scene control method and apparatus, virtual scene release method and apparatus, and electronic device and storage medium
JP6622832B2 (en) Program and game system
CN114618157A (en) Data compensation method and device in game, electronic equipment and storage medium
KR102096856B1 (en) Apparatus and method for matching game
US20220305389A1 (en) Multi-player game
JP2023528574A (en) WIDGET DISPLAY METHOD, APPARATUS, DEVICE AND COMPUTER PROGRAM
KR20030075084A (en) Network game system using robot
JP2023040548A (en) Computer system, program, server and game control method
CN116943204A (en) Virtual object control method and device, storage medium and electronic equipment
CN116920402A (en) Virtual object control method, device, equipment, storage medium and program product
CN116549974A (en) Information communication method, device and product in virtual fight

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY COMPUTER ENTERTAINMENT AMERICA INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ANNUNZIATA, ED;REEL/FRAME:013768/0244

Effective date: 20021219

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: SONY INTERACTIVE ENTERTAINMENT AMERICA LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT AMERICA LLC;REEL/FRAME:038626/0637

Effective date: 20160331