US20020068500A1 - Adaptive toy system and functionality

Adaptive toy system and functionality

Info

Publication number
US20020068500A1
Authority
US
United States
Prior art keywords
toy
user
script
talk
toys
Prior art date
Legal status
Abandoned
Application number
US09/730,154
Inventor
Oz Gabai
Jacob Gabai
Nimrod Sandlerman
Current Assignee
Creator Ltd
Original Assignee
Creator Ltd
Priority date
Filing date
Publication date
Application filed by Creator Ltd filed Critical Creator Ltd
Priority to US09/730,154
Assigned to CREATOR LTD. Assignment of assignors interest (see document for details). Assignors: GABAI, JACOB; GABAI, OZ; SANDLERMAN, NIMROD
Publication of US20020068500A1

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/31 - Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • A63F13/12
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/004 - Artificial life, i.e. computing arrangements simulating life
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/131 - Protocols for games, networked simulations or virtual reality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/50 - Network services
    • H04L67/535 - Tracking the activity of the user
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/50 - Network services
    • H04L67/75 - Indicating network or usage conditions on the user display
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40 - Network security protocols
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/406 - Transmission via wireless network, e.g. pager or GSM
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/407 - Data transfer via internet
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 - Computerized interactive toys, e.g. dolls
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L69/00 - Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L69/30 - Definitions, standards or architectural aspects of layered protocol stacks
    • H04L69/32 - Architecture of open systems interconnection [OSI] 7-layer type protocol stacks, e.g. the interfaces between the data link level and the physical level
    • H04L69/322 - Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions
    • H04L69/329 - Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions in the application layer [OSI layer 7]

Definitions

  • the present invention relates to networked electronic devices.
  • the present invention seeks to provide toy apparatus for electronic shopping.
  • a multiplicity of interactive toys each of which is connected to a computer network and adaptive toy operation software which is supplied to the multiplicity of interactive toys via the computer network, the adaptive toy operation software being operative to provide feedback, based on play experience with at least some of the multiplicity of interactive toys, via the computer network and to employ the feedback in adapting itself so as to change the play experience provided thereby.
  • a multiplicity of interactive entertainment units each of which is connected to a computer network and adaptive entertainment software which is supplied to the multiplicity of interactive entertainment units via the computer network, the adaptive entertainment software being operative to provide feedback, based on user experience with at least some of the multiplicity of interactive entertainment units via the computer network and to employ the feedback in adapting itself so as to change the entertainment experience provided thereby.
  • a method for establishing a network of toys including providing a plurality of scripts for at least some of the network of toys and sending at least one of the plurality of scripts to at least one of the toys in the network, over the network.
  • a toy system including an electronic toy content shop providing users with an option to pre-purchase accounts for users and a plurality of networked toys directly connected via a network to the electronic toy content store, the toys being operative to load themselves with at least one script sold at the electronic toy content store, wherein only a subset of the scripts sold at the electronic toy content shop are displayed to a user, depending on at least one personal characteristic of the user.
  • a toy system providing multi-level interaction between a population of users and a population of toys, the system including at least one script operative to pose at least one question to at least one user about a topic other than the user's characteristics, each of the scripts being operative to analyze the user's answer and act upon its content and to derive knowledge about the user's characteristics from his answer.
  • FIG. 1 is a simplified semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention
  • FIG. 2 is a more detailed, semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention
  • FIG. 3 is a semi-pictorial semi-block-diagram illustration of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention
  • FIG. 4 is a flowchart that describes an example of a sample group procedure of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention
  • FIG. 5 is a simplified flowchart illustration of a suitable procedure of building a task for a user based on results of a sample group as provided by an adaptive pattern learning system in accordance with a preferred embodiment of the present invention
  • FIG. 6A is a simplified script diagram illustration of a main artificial life script in accordance with a preferred embodiment of the present invention.
  • FIG. 6B is a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 6A which is provided in order to explain the script diagram notation of FIG. 6A;
  • FIG. 7A is a simplified script diagram illustration of an example of an artificial life script that provides a level 1 (i.e. simple) game in accordance with a preferred embodiment of the present invention
  • FIGS. 7 B- 7 C taken together, form a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 7A;
  • FIG. 8 is a simplified script diagram illustration of an example of an artificial life script that provides a level 2 (i.e. normal) game in accordance with a preferred embodiment of the present invention
  • FIG. 9 is a simplified script diagram illustration of an example of an artificial life script that provides a level 3 (i.e. harder) game in accordance with a preferred embodiment of the present invention.
  • FIG. 10 is an example of a simplified screen display for a “play” function in the player
  • FIG. 11 is a simplified illustration of a “textbox” screen display of the programming function of the player, constructed and operative in accordance with a preferred embodiment of the present invention
  • FIG. 12 is a table that describes the programming feature of the player in accordance with a preferred embodiment of the present invention.
  • FIG. 13 is a semi-block diagram semi-flowchart illustration of the “personal” function of a player in accordance with a preferred embodiment of the present invention.
  • FIG. 14 is a semi-block diagram semi-flowchart illustration of the “club” function of a player in accordance with a preferred embodiment of the present invention.
  • FIG. 15 is a semi-block diagram semi-flowchart illustration of the “shop” function of a player in accordance with a preferred embodiment of the present invention.
  • FIG. 16 is a simplified flowchart illustration of a registration procedure provided in accordance with a preferred embodiment of the present invention.
  • FIG. 17 is a simplified flowchart illustration of an example procedure of sending a request message from a user to a server in accordance with a preferred embodiment of the present invention
  • FIG. 18 is a simplified flowchart illustration of a suitable procedure of creating a new group for the users' club in accordance with a preferred embodiment of the present invention
  • FIG. 19 is a simplified flowchart illustration of a suitable procedure of a leaving a group of the users' club in accordance with a preferred embodiment of the present invention.
  • FIG. 20 is a simplified flowchart illustration of a suitable procedure of viewing group members provided by the users' club in accordance with a preferred embodiment of the present invention
  • FIGS. 21 to 24 are simplified flowchart illustrations which, taken together, describe an example of a search procedure provided by the club in accordance with a preferred embodiment of the present invention.
  • FIG. 25 is a simplified flowchart illustration of a suitable procedure for sending a message to another user in accordance with a preferred embodiment of the present invention.
  • FIG. 26 is a simplified flowchart illustration of a suitable procedure for sending a script to another user in accordance with a preferred embodiment of the present invention.
  • FIG. 27 is a simplified flowchart illustration of a suitable implementation of send procedure D in FIGS. 25 and 26, constructed and operative in accordance with a preferred embodiment of the present invention
  • FIG. 28A is a simplified flowchart illustration of a suitable procedure for adding a user to a contact list in accordance with a preferred embodiment of the present invention
  • FIG. 28B is a simplified flowchart illustration of a suitable procedure for removing a user from a contact list in accordance with a preferred embodiment of the present invention.
  • FIG. 29 is a simplified flowchart illustration of a suitable account update procedure provided in accordance with a preferred embodiment of the present invention.
  • FIG. 30 is a simplified flowchart illustration of a suitable subject update procedure provided in accordance with a preferred embodiment of the present invention.
  • FIG. 31 is a simplified flowchart illustration of a suitable procedure for ignoring a message and/or a user in accordance with a preferred embodiment of the present invention
  • FIG. 32 is a simplified flowchart illustration of a suitable procedure for adding fields to a user's contact list in accordance with a preferred embodiment of the present invention
  • FIGS. 33 to 35 are simplified flowchart illustrations of three respective procedures for “adding to basket”, which procedures are preferably provided by the electronic shop of the present invention.
  • FIG. 36 is a simplified flowchart illustration of a suitable implementation of a “remove from basket” procedure preferably provided by the electronic shop of the present invention.
  • FIG. 37 is a simplified flowchart illustration of a suitable search procedure preferably provided by the electronic shop of the present invention.
  • FIG. 38 is a simplified flowchart illustration of a suitable procedure for “winning credit points” preferably provided by the electronic shop of the present invention.
  • FIG. 39 is a simplified flowchart illustration of a suitable procedure of paying with credit points preferably provided by the electronic shop of the present invention.
  • FIG. 40 is a simplified illustration of a screen display of the “My account” function of the club provided in accordance with a preferred embodiment of the present invention.
  • FIG. 41 is a simplified illustration of a screen display of the “Contact list” function of the club in accordance with a preferred embodiment of the present invention.
  • FIG. 42 is a simplified pictorial illustration of a screen display of the “Group” function of the club provided in accordance with a preferred embodiment of the present invention.
  • FIG. 43 is a simplified pictorial illustration of a screen display of the “Search user” function of the club in accordance with a preferred embodiment of the present invention.
  • FIG. 44 is a simplified pictorial illustration of a screen display of the “Send message/scripts” function of the club in accordance with a preferred embodiment of the present invention.
  • FIG. 45 is a simplified pictorial illustration of a screen display of the “Interests” form provided in accordance with a preferred embodiment of the present invention.
  • FIG. 46 is a simplified pictorial illustration of a screen display of the “Registration” form provided in accordance with a preferred embodiment of the present invention.
  • FIG. 47 is a simplified pictorial illustration of a screen display of the “Select content” function preferably provided by the electronic shop of the present invention.
  • FIG. 48 is a simplified pictorial illustration of a screen display of the “Select packages” function preferably provided by the electronic shop of the present invention.
  • FIG. 49 is a simplified pictorial illustration of a screen display of the “View Account” function preferably provided by the electronic shop of the present invention.
  • FIG. 50 is an illustration of a Living Object base station
  • FIG. 51 is an illustration of a Living Object Toy
  • FIG. 52 is a screen display of a Scriptwriter icon on desktop
  • FIG. 53 is a screen display of a Living Object Scriptwriter main screen
  • FIG. 54 is a “select tools—options” screen window display
  • FIG. 55 is a “toy” screen window display
  • FIG. 56 is a “hardware” screen window display
  • FIG. 57 is a “talk icon” screen display
  • FIG. 58 is a Scriptwriter system's main screen display with the added talk object
  • FIG. 59 is an illustration of the Scriptwriter main screen display with the added talk object connected by a line to the start object;
  • FIG. 60 is an illustration of the screen display of the action toolbar with the save icon
  • FIG. 61 illustrates the screen display for naming and saving the script
  • FIG. 62 illustrates a screen window display of a combo box for typing the toy's speech
  • FIG. 63 is a screen window display for recording sound to be played by the toy, wherein the toy's speech can be recorded through the toy or through the computer's microphone;
  • FIG. 64 is a screen window display for saving a recording
  • FIG. 65 is a screen window display for selecting a “wave” file to be played by the toy
  • FIG. 66 illustrates a Listen icon
  • FIG. 67 is a screen display of a part of the Scriptwriter main window with the Listen object added;
  • FIG. 68 is an example of the “Listen and Sense” screen window display
  • FIG. 69 illustrates the “Keyword link box” in the “Choose Link” screen display
  • FIG. 70 shows the Scriptwriter main screen display with a Listen object linked to corresponding Talk objects
  • FIG. 71 shows a Run-Run screen window display
  • FIG. 72 shows the Sample error message screen window display
  • FIGS. 73A and 73B show a table of the functions provided by the Scriptwriter with their icons as presented on the Scriptwriter main screen display;
  • FIGS. 74 - 99 are simplified illustrations of examples of screen displays which may be generated by the scriptwriter system shown and described herein;
  • FIG. 100 is a dependence table useful in building an artificial life toy and environment constructed and operative in accordance with a preferred embodiment of the present invention
  • FIG. 101 is a formula table useful in building an artificial life toy and environment constructed and operative in accordance with a preferred embodiment of the present invention
  • FIGS. 102 - 124 are simplified illustrations of examples of screen displays which may be generated by the scriptwriter system shown and described herein;
  • FIG. 125 is a table that presents an interactive script between a toy and a player where the toy determines the characteristics of the player (namely, age range) to suggest the appropriate level of game content;
  • FIG. 126 is a table that stores an example of a list of multilevel questions
  • FIG. 127 is a script diagram of an interactive script where a toy determines the characteristics of a player (e.g. age range) to suggest the appropriate level of game content;
  • FIG. 128 is a simplified flowchart illustration of a learning process for a toy system constructed and operative in accordance with a preferred embodiment of the present invention
  • FIG. 129 is a simplified flowchart illustration of a process for analyzing a user's game results constructed and operative in accordance with a preferred embodiment of the present invention
  • FIG. 130 is a simplified flowchart illustration of a method for analyzing game results of a group of users, constructed and operative in accordance with a preferred embodiment of the present invention.
  • FIG. 131 is a simplified flowchart illustration of a process for customizing toy content to a user of a toy system, based on results of a learning procedure.
  • the electronic shop typically provides a credit account for a child user which the parent opens for him.
  • the software enables persons, typically adults, to buy other persons, typically children, a present in a store, and the present may comprise a fixed sum account in that store and the software includes book-keeping capability.
  • an electronic voucher is provided.
  • the child can be granted, by the parent, 100 credit points which serve as money (tender) issued by a particular electronic store.
  • the child may be granted a gift certificate.
  • a particular advantage of a preferred embodiment of the present invention is that an individual, such as a child, who does not own a credit card can have an independent electronic shopping experience.
  • the child may be entitled to buy only toys for his age-bracket, not computer toys, or only computer toys.
  • the child is prompted to earn credit points e.g. by agreeing to hear/view, and actually hearing and/or viewing, advertising.
  • filtering parameters imposed by the parents are also applied to the advertising which is presented to the child.
  • the child's response to advertising is monitored to verify exposure to the advertising message and the monitoring information is provided to the advertisement provider.
  • certain operations on the part of the child require parental approval which is given e.g. by the parent supplying his credit card number at the appropriate juncture in response to a prompt.
  • the child can operate as an independent consumer within the limitations imposed by the voucher or gift certificate.
  • the vehicle with which the child interacts comprises inter alia or exclusively a toy figure such as a teddy bear.
  • the teddy bear is typically purchased in conjunction with a CD-ROM or other software vehicle which the parent installs on the computer.
  • the software preferably is operable even by a very small child, either by means of a script which actuates the toy figure to interact with the child, and/or by means of suitable simple on-screen input devices, such as buttons, with which the child can perform operations such as opening an Internet shop.
  • the Internet shop preferably allows a child or parent to enter a site and buy a desired number of credit points e.g. with a credit card.
  • the shop preferably has an option whereby the user can check how many credit points s/he has, and what filters or restrictions, if any, apply to these credit points.
  • Filters typically include content filters. For example, a child is entitled to purchase only educational toys, or only games which pertain to history. Examples of subfilters are “only USA history”, “only the Napoleonic period”, and so on.
  • Sold goods typically comprise products, typically paid for by a lump sum, or content services, including games and books which are periodically updated, typically paid for by a monthly charge.
  • a parent may purchase, for his child, 5 credit points' worth of U.S. history games per month.
  • learning motivation may be generated by allowing a child to earn up to 5 credit points per month toward toy purchases by playing 5 credit points' worth of mathematical games per month, at his level of skill.
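By way of illustration only, the following Python sketch shows how a parental content filter and a monthly credit-point allowance of the kind described above might be checked before a purchase is allowed; the class, field names and rules are assumptions and are not taken from the present specification.

```python
# Hypothetical sketch: checking a child's purchase against a parental content
# filter and a monthly credit-point allowance. All names and rules are
# illustrative only.

from dataclasses import dataclass, field

@dataclass
class ChildAccount:
    credit_points: int                                   # points remaining this month
    allowed_subjects: set = field(default_factory=set)   # e.g. {"US history"}

def can_purchase(account: ChildAccount, item_subject: str, item_cost: int) -> bool:
    """Return True only if the item passes the content filter and fits the balance."""
    if account.allowed_subjects and item_subject not in account.allowed_subjects:
        return False                                     # blocked by parental filter
    return item_cost <= account.credit_points

account = ChildAccount(credit_points=5, allowed_subjects={"US history"})
print(can_purchase(account, "US history", 3))            # True
print(can_purchase(account, "Napoleonic period", 3))     # False: subject not allowed
```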
  • a toy such as a teddy bear may serve as a “tutor” or “governess” and teach the child the games which the child selects.
  • the games may be initiated at the child's request or the toy may initiate these sessions, at times or in response to prompts which are system determined or selected by a user such as an adult.
  • the brain of the toy is updatable via Internet and therefore an adult can buy pieces of the toy's brain for a child.
  • the controlling computer comprises a scheduler such that each of a plurality of content elements may be set to be executed at different child- or adult-selected times or in response to different prompts.
  • the toy typically initiates these sessions. For example, the toy may call for the child at 4 pm and if the child answers the toy initiates a learning session on the topic of Napoleon. At 5:30 pm the toy tries to initiate a learning session on multiplication.
  • a particular advantage of a preferred embodiment of the present invention is that it enables efficient processing of “nickel and dime” purchases and/or micropayments, typically via the Internet and without resorting to use of smart cards.
  • a user club or affinity group is provided which may be similar to conventional ICQ systems.
  • messages are preferably delivered orally by a toy such as a teddy bear, the teddy bear therefore acting like a secretary/ messenger.
  • a child can write a joke or generate a script or game, using a suitable development environment, and can post the product of his efforts for a friend in the same user group.
  • a portal is provided which sells gift certificates from any of a multiplicity of sites.
  • the value of the gift certificate which the user can buy exceeds the purchase price.
  • a post-to-web feature is provided, allowing vendor access to the Internet shop of the present invention.
  • the post-to-web is a tool allowing vendors to post toy and game items for sale in an Internet toy and game shop.
  • filtering parametrization can also be provided by the vendor such that customers, typically adults, can filter purchases for the intended recipients, typically children.
  • a personification feature is provided.
  • Personification is a feature of a toy, a doll or another interactive entertainment unit having a defined persona, such as a known comic figure, action figure or human celebrity.
  • the personified unit presents the user with voice, intonation and mimics typical of the personified figure.
  • the personification mechanism enables a content developer to develop generic content such as a song or a story or educational material (e.g. biology) or an information item (e.g. news) and post it to the web.
  • the personification mechanism also enables each personified unit to download a personified version of the generic content to present to the user with the voice, intonation, gestures and other characteristics typical of a personified figure, such as a known comic figure, action figure or human celebrity.
  • an artificial life feature is provided.
  • the system of the present invention comprises a script actuation database operative to receive output from at least one operating script and to actuate at least one additional script when certain conditions are fulfilled by the output, in combination, of the operating scripts.
  • An impression of artificial life is generated due to the accumulation of conditions from various scripts, imbuing activation with a live quality which is substantially not anticipatable.
  • scripts may be scheduled.
  • the system also provides time-based selection of single actions.
  • a script for at least one doll includes in it activation of another script for another doll when certain conditions are fulfilled.
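The script actuation database described above can be sketched in Python; the class, rule format and script names below are assumptions used only to show how outputs accumulated from several operating scripts may, in combination, actuate an additional script.

```python
# Illustrative sketch (not the patent's implementation) of a script actuation
# database: outputs from running scripts accumulate, and an additional script
# is actuated once a combined condition over those outputs is satisfied.

class ScriptActuationDB:
    def __init__(self):
        self.outputs = {}      # latest output value reported by each script
        self.rules = []        # (condition over outputs, script to actuate)

    def add_rule(self, condition, script_name):
        self.rules.append((condition, script_name))

    def report(self, script_name, value):
        """A running script reports an output; return any scripts now triggered."""
        self.outputs[script_name] = value
        return [name for cond, name in self.rules if cond(self.outputs)]

db = ScriptActuationDB()
# Actuate a "birthday_song" script only when two other scripts agree.
db.add_rule(lambda o: o.get("calendar") == "birthday" and o.get("mood") == "happy",
            "birthday_song.script")
db.report("calendar", "birthday")        # -> []
print(db.report("mood", "happy"))        # -> ['birthday_song.script']
```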
  • the system of the present invention resides on a CD-ROM storing an IDE player, scripts therefor and, optionally, a shop for buying more scripts.
  • a “cyberbrain” feature is provided whereby each toy grows up on the basis of its own unique experiences, much as a child does.
  • the toy preferably adapts its contents to a user thereof as the toy learns more about that user.
  • learning input for each toy is provided not only by the child owner of that particular toy but also by the total population of toy owners linked over the Internet.
  • the toy uses its experiences or impressions of its own child-owner, and optionally of a group of children to which its child-owner belongs, in order to become a better teacher or companion. Therefore, a particular toy, when bought by a first child, is not identical to the same toy bought subsequently by a second child, because the toy is preferably constantly learning, as each child's experience is transmitted to a server which shares that experience with the entire virtual community of toys.
  • a toy's content software may comprise 3 jokes.
  • the toy may learn, for example, that children prefer one joke over the remaining two, in which case the toy is fed new parameters and begins using that joke in preference over the other jokes.
  • the toy may also learn that none of the 3 jokes available pass a satisfaction threshold of, say, 20%, in which case a content developer develops other jokes and downloads these replacement jokes to all toys in the virtual community.
  • When the toy calls its server, it may receive new parameters and/or new contents.
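A minimal sketch of the joke example above, assuming a per-joke satisfaction score is maintained across the toy population: the toy prefers the best-liked joke and flags the whole set for replacement when none passes the stated 20% threshold. The field names are assumptions.

```python
# Hedged sketch of the "cyberbrain" joke example. satisfaction maps each joke
# id to the fraction of positive reactions gathered across the toy community.

SATISFACTION_THRESHOLD = 0.20

def choose_joke(satisfaction):
    """Return the preferred joke, or None when new content should be requested."""
    if max(satisfaction.values()) < SATISFACTION_THRESHOLD:
        return None          # signal the server: content developer should supply new jokes
    return max(satisfaction, key=satisfaction.get)   # prefer the best-liked joke

print(choose_joke({"joke1": 0.55, "joke2": 0.10, "joke3": 0.08}))  # 'joke1'
print(choose_joke({"joke1": 0.12, "joke2": 0.10, "joke3": 0.08}))  # None -> request new content
```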
  • a micro marketing feature is provided.
  • the micro marketing feature enables the online service provider to identify communities, or affinity groups, of users that have something in common, such as sharing a similar interest or preference or need.
  • the online service can provide, or suggest the provisioning of, selected appropriate content, such as educational, informational or promotional content, to the appropriate community or affinity group.
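Purely as an illustration of the micro-marketing idea, the short sketch below groups registered users into affinity groups by their declared interests so that selected content can be suggested to each group; the data and grouping key are hypothetical.

```python
# Illustrative grouping of users into affinity groups by shared interests.

from collections import defaultdict

users = [
    {"name": "A", "interests": {"Sports", "Pets"}},
    {"name": "B", "interests": {"Pets", "News"}},
    {"name": "C", "interests": {"Sports"}},
]

affinity = defaultdict(list)
for user in users:
    for interest in user["interests"]:
        affinity[interest].append(user["name"])

print(dict(affinity))   # e.g. {'Sports': ['A', 'C'], 'Pets': ['A', 'B'], 'News': ['B']}
```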
  • FIG. 1 is a simplified semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention.
  • FIG. 2 is a more detailed, semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention.
  • FIG. 3 is a semi-pictorial semi-block-diagram illustration of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention.
  • FIG. 4 is a flowchart that describes an example of a sample group procedure of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention.
  • FIG. 5 is a simplified flowchart illustration of a suitable procedure of building a task for a user based on results of a sample group as provided by an adaptive pattern learning system in accordance with a preferred embodiment of the present invention.
  • a web system comprising a plurality of networked entertainment units such as networked toys, and a learning machine allowing the web system to learn from the interaction/s between at least one user and at least one entertainment unit.
  • the learning machine is implemented by a system of scripts, termed herein “artificial life” scripts, which breathe artificial life into at least one of the networked entertainment units.
  • artificial life is intended to refer to scripts which cause at least one entertainment unit to respond not only to a current user interaction situation but also to take into account the past. These scripts typically operate in the background during interaction of a toy with a user. Each “artificial life” script is parametrized, each parameter not being fixed but rather being determined as a function of learned situational or user characteristics. Therefore, each “artificial life” script typically comprises an endless family of scripts having potentially endless variation contained therewithin.
  • a system of artificial life scripts builds on itself by learning from the interaction of users with the scripts such that the toy's functioning develops over time as a result of the variation between users in their interactions with their toys, which in turn causes differential parametrization of scripts depending on who has been playing with them.
  • an interactive toy web system with a learning machine comprises a feature of artificial life.
  • individual users on a system interact with artificial life scripts, e.g. in the form of games.
  • Results from interaction of all or some of the users on a system are sent to a server of an interactive toy web system. The results are processed and used in order to modify the scripts sent to individual users.
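A minimal sketch of this learning loop, under assumed data shapes: per-user game results are aggregated on the server and an updated script parameter (here, the suggested difficulty level) is returned to individual clients. The thresholds and field names are illustrative, not taken from the specification.

```python
# Sketch of the server-side aggregation step of the learning loop.

def aggregate_results(all_results):
    """all_results: list of per-user dicts such as {'S1Level': 2, 'correct': 2, 'asked': 3}."""
    by_level = {}
    for r in all_results:
        stats = by_level.setdefault(r["S1Level"], {"correct": 0, "asked": 0})
        stats["correct"] += r["correct"]
        stats["asked"] += r["asked"]
    return by_level

def updated_parameters(by_level, current_level):
    """Raise or lower the suggested level from the population-wide success rate."""
    stats = by_level.get(current_level)
    if not stats or stats["asked"] == 0:
        return {"S1Level": current_level}
    rate = stats["correct"] / stats["asked"]
    if rate > 0.8 and current_level < 3:
        return {"S1Level": current_level + 1}
    if rate < 0.3 and current_level > 1:
        return {"S1Level": current_level - 1}
    return {"S1Level": current_level}

by_level = aggregate_results([{"S1Level": 1, "correct": 3, "asked": 3},
                              {"S1Level": 1, "correct": 3, "asked": 3}])
print(updated_parameters(by_level, 1))   # {'S1Level': 2}
```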
  • the Client side preferably is operative to perform at least the following functions:
  • the Server side preferably is operative to perform at least the following functions:
  • Analyzing information e.g. the server may update a formula in a server database every 1000 clients.
  • a Server database may include the following tables, inter alia:
  • a. Personal table which includes the following fields: ID, Password, Name, Gender, Birthday, City, Country and Address.
  • b. History table which includes the following fields: ID, Company, Product, Script, LastRun (Date), ParamList.
  • c. Script table which includes the following fields: Script, Company, Product, ParamList, Formula, ScheduleData.
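The three tables listed above can be illustrated with the following sqlite3 sketch; the column types are assumptions, since the text gives only the field names.

```python
# Sketch of the server database tables (Personal, History, Script) for
# illustration only; types and key choices are assumed.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Personal (
    ID TEXT PRIMARY KEY, Password TEXT, Name TEXT, Gender TEXT,
    Birthday TEXT, City TEXT, Country TEXT, Address TEXT
);
CREATE TABLE History (
    ID TEXT, Company TEXT, Product TEXT, Script TEXT,
    LastRun TEXT,          -- date of last run
    ParamList TEXT         -- serialized parameter list
);
CREATE TABLE Script (
    Script TEXT, Company TEXT, Product TEXT,
    ParamList TEXT, Formula TEXT, ScheduleData TEXT
);
""")
conn.execute("INSERT INTO Personal (ID, Name) VALUES (?, ?)", ("u1", "Alice"))
print(conn.execute("SELECT Name FROM Personal").fetchall())   # [('Alice',)]
```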
  • a preferred Session Process which is presented to a client at a desired time set by the client, typically comprises the following stages:
  • The client computer sends to the server the game results (parameters) and the associated AL formula.
  • a Trivia game, comprising a script which can call any of 3 other scripts, is now described.
  • the game described herein includes 9 questions, three of which are illustrated in each of FIGS. 7-9 respectively.
  • the questions have the following 3 difficulty levels, each associated with 3 questions in the illustrated example: level 1 (simple), level 2 (normal), level 3 (harder).
  • the trivia game therefore typically comprises the following parameter structure:
  • a level parameter is defined as: S1Level {values: 1, 2 or 3}.
  • Results parameters are S1L1, S1L2, S1L3 {S - script, L - level} and S1Boring.
  • Typical values of the results parameters at the end of a game comprise the following: 0-no answer, 1-only answered question #1 from among the three (in the illustrated embodiment) questions within the current level, 2-only answered #2, 4-only answered #3, 3-answered #1 and #2, 5-answered #1 and #3, 6-answered #2 and #3, 7-answered all 3 questions within the current level.
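The result values listed above behave like a 3-bit mask (question #1 maps to bit 0, #2 to bit 1, #3 to bit 2). The following short sketch is an interpretation of that encoding rather than the patent's own code.

```python
# Encode/decode the per-level result value described above as a bit mask.

def encode_answers(answered):
    """answered: iterable of question numbers (1..3) answered correctly."""
    value = 0
    for q in answered:
        value |= 1 << (q - 1)
    return value

def decode_answers(value):
    return [q for q in (1, 2, 3) if value & (1 << (q - 1))]

print(encode_answers([1, 3]))   # 5, i.e. answered #1 and #3
print(decode_answers(7))        # [1, 2, 3], all questions in the current level
```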
  • FIGS. 6A to 9 illustrate a set of artificial life scripts in accordance with a preferred embodiment of the present invention.
  • FIG. 6A is a simplified script diagram illustration of a main artificial life script in accordance with a preferred embodiment of the present invention.
  • the script can call any of the three levels described in FIGS. 7 to 9 respectively.
  • FIG. 6B is a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 6A which is provided in order to explain the script diagram notation of FIG. 6A.
  • FIG. 7A is a simplified script diagram illustration of an example of an artificial life script that provides a level 1 (i.e. simple) game in accordance with a preferred embodiment of the present invention.
  • FIGS. 7 B- 7 C taken together, form a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 7A.
  • FIGS. 7 B- 7 C are provided to explain the script diagram notation of FIG. 7A.
  • FIG. 8 is a simplified script diagram illustration of an example of an artificial life script that provides a level 2 (i.e. normal) game in accordance with a preferred embodiment of the present invention.
  • FIG. 9 is a simplified script diagram illustration of an example of an artificial life script that provides a level 3 (i.e. harder) game in accordance with a preferred embodiment of the present invention.
  • FIGS. 8 and 9 are similar to the script diagram notation of FIGS. 6A and 7A.
  • a “Main” Artificial Life Script illustrated in FIGS. 6A-6B, which sends the toy to one of the script portions of FIGS. 7-9, each script portion representing a level, is now described.
  • a preferred Objects Description for the “main” artificial life script of FIGS. 6 A- 6 B is as follows:
  • Start 1 (Start): Starting point for execution.
  • Memory 1 (Memory): Sets memory cell <S1L1> to “0”.
  • Memory 2 (Memory): Sets memory cell <S1L2> to “0”.
  • Memory 3 (Memory): Sets memory cell <S1L3> to “0”.
  • Memory 4 (Memory): Sets memory cell <S1Boring> to “0”.
  • Condition 1 (Condition): Follow the true branch if the value of memory cell <S1Level> is equal to “1”, or the false branch otherwise.
  • Condition 2 (Condition): Follow the true branch if the value of memory cell <S1Level> is equal to “2”, or the false branch otherwise.
  • Script 1 (Script): Runs the “C:/CreatorIDE/S1_Level3.script” script of FIG. 9.
  • Script 2 (Script): Runs the “C:/CreatorIDE/S1_Level2.script” script of FIG. 8.
  • Script 3 (Script): Runs the “C:/CreatorIDE/S1_Level1.script” script of FIG. 7.
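A hedged Python rendering of the “main” script's logic follows: clear the result memory cells, then run the level script selected by the S1Level memory cell, mirroring the Condition objects above. The mapping of level value to level script is an interpretation, and run_script is a stand-in for the Scriptwriter's Script object.

```python
# Sketch of the "main" artificial life script's dispatch logic.

def run_script(path):
    print(f"running {path}")          # placeholder for actual script execution

def main_script(memory):
    for cell in ("S1L1", "S1L2", "S1L3", "S1Boring"):
        memory[cell] = 0              # Memory 1..4: reset result cells
    if memory.get("S1Level") == 1:    # Condition 1
        run_script("C:/CreatorIDE/S1_Level1.script")
    elif memory.get("S1Level") == 2:  # Condition 2
        run_script("C:/CreatorIDE/S1_Level2.script")
    else:
        run_script("C:/CreatorIDE/S1_Level3.script")

main_script({"S1Level": 2})           # -> running C:/CreatorIDE/S1_Level2.script
```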
  • Talk1(Talk) Say “table or cup?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense1(ListenAndSense) Listens for one of the keywords (table, cup) for 5 seconds.
  • Talk2(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk3(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk4(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Calculation 2 (Calculation): Set the value of memory cell <S1L1> to the sum of the value of memory cell <S1L1> and “1” (if the operation is invalid, the cell is cleared).
  • Talk5(Talk) Say “TV or car?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense2 (ListenAndSense): Listens for one of the keywords (TV, car) for 5 seconds.
  • Talk6(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk7(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk8(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk9(Talk) Say “piano or book?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense3 (ListenAndSense): Listens for one of the keywords (piano, book) for 5 seconds.
  • Talk10(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk11(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk12(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • End1(End) Execution ends here.
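The level-1 script above follows a Talk / ListenAndSense / Calculation pattern for each question. The sketch below is an illustrative Python rendering of that pattern, not the Scriptwriter's implementation: listen_for simulates the toy's 5-second keyword listening so the example runs as-is, and correct answers are folded into the level's result cell using the bit encoding described earlier.

```python
# Ask each question, listen briefly for a keyword, and record correct answers.

def listen_for(keywords, timeout=5, simulated="cup"):
    # Stand-in for ListenAndSense: the real toy listens for one of the keywords
    # for `timeout` seconds; here the heard answer is simulated.
    return simulated if simulated in keywords else None

def play_level(questions, memory, result_cell):
    memory[result_cell] = 0
    for number, (prompt, keywords, correct) in enumerate(questions, start=1):
        print(f"Toy says: {prompt}")                       # Talk object
        heard = listen_for(keywords)                       # ListenAndSense object
        if heard is None:
            print("Toy says: not understood")
        elif heard == correct:
            print("Toy says: good answer")
            memory[result_cell] |= 1 << (number - 1)       # Calculation object
        else:
            print("Toy says: wrong answer")

memory = {}
play_level([("table or cup?", ("table", "cup"), "cup")], memory, "S1L1")
print(memory)   # {'S1L1': 1}
```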
  • Talk1(Talk) Say “turtle or dog?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense1 Listens for one of the keywords (turtle, dog) for 5 seconds.
  • Talk2(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk3(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk4(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk5(Talk) Say “car or bike?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense2(ListenAndSense) Listens for one of the keywords (car, bike) for 5 seconds.
  • Talk6(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk7(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk8(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk9(Talk) Say “space ship or car?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense3 (ListenAndSense): Listens for one of the keywords (spaceship, car) for 5 seconds.
  • Talk10(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk11(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk12(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • End1(End) Execution ends here.
  • Talk1(Talk) Say “elephant or mouse?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense1 Listens for one of the keywords (elephant, mouse) for 5 seconds.
  • Talk2(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk3(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk4(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk5(Talk) Say “big TV or pencil?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense2(ListenAndSense) Listens for one of the keywords (big TV, pencil) for 5 seconds.
  • Talk6(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk7(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk8(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk9(Talk) Say “bike or car?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • ListenAndSense3(ListenAndSense) Listens for one of the keywords (bike, car) for 5 seconds.
  • Talk10(Talk) Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk11(Talk) Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • Talk12(Talk) Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller.
  • End1(End) Execution ends here.
  • the system described herein includes an electronic shop, a club of networked toy users, a user registration system and a user interests registration system registering each user's interests.
  • a profile of the user may be used. This profile may comprise parameters such as age, gender, and subjects of interest.
  • the user typically has to register and to fill-in a list of personal interests.
  • the user has to fill in the following personal data: full name, gender, birthday, phone number, full address, E-mail, and the language the user speaks.
  • the user typically selects his/her own subject(s) of interest from a given list.
  • the list may for example include the following main subjects: Education, Information, Services, TV guide, News, Entertainment, Freaky stuff, Free time, Famous people, Reading and writing, Sports, Pets.
  • the system shown and described herein preferably includes a shop in which the user buys different packages according to his/her personal preferences and relevant toy(s). The user is exposed first to a sorted list of packages, which include different categories of content adapted to user characteristics such as the user's age, personal interest, and language.
  • Examples of package content categories may include: Education, Information, Services, TV guide, News, Entertainment, Freaky stuff, Free time, Famous people, Reading and writing, Sports, Pets.
  • the screen display typically also presents the number of packages in each category, as well as information about packages on sale.
  • the user finds different packages for the chosen category, e.g. using the “select packages” screen display of FIG. 48.
  • the user finds a variety of subjects such as: Arts, Biology, History, Languages, Logic, Arithmetic, Geometry, Medicine, Physics.
  • the user digitally signs the package choice he has made, indicating that he wishes this to be his purchase.
  • the user may browse around the shop, looking for other packages from other categories or for different toys. While browsing, the function of “search”, e.g. as illustrated in FIG. 37, may be employed.
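As a minimal sketch of the shop behavior described above, with assumed package fields, the following code sorts a package list so that items matching the user's age, interests and language appear first.

```python
# Illustrative sorting of shop packages by relevance to the user profile.

packages = [
    {"title": "US History Quiz", "subject": "History", "ages": (8, 12),  "language": "English"},
    {"title": "Biology Basics",  "subject": "Biology", "ages": (10, 14), "language": "English"},
    {"title": "Cuentos",         "subject": "Reading", "ages": (6, 10),  "language": "Spanish"},
]

def relevance(package, user):
    score = 0
    if package["ages"][0] <= user["age"] <= package["ages"][1]:
        score += 1
    if package["subject"] in user["interests"]:
        score += 1
    if package["language"] == user["language"]:
        score += 1
    return score

user = {"age": 9, "interests": {"History"}, "language": "English"}
for p in sorted(packages, key=lambda p: relevance(p, user), reverse=True):
    print(p["title"])   # "US History Quiz" is listed first
```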
  • Examples of screen displays suitable for implementing a preferred club feature of the system of the present invention are illustrated in FIGS. 40-44.
  • the user can preferably do one or all of the following:
  • the user typically has several options, which may for example include the following options or functions illustrated in FIG. 40: My account, Group, Search, Send message/script.
  • the user defines his own personal information. For example, the user may fill in the following personal data to log-in to the club: nickname, full name, full address, phone number, E-mail, comments.
  • the user typically decides if the information is public or confidential.
  • the public information is available to other users while they are looking for new colleagues via the above-mentioned SEARCH function.
  • the user typically also marks in his/her own interests from among a given list.
  • the list contains subjects such as: Education, Information, Services, TV guide, News, Entertainment, Freaky stuff, Free time, Famous people, Reading and writing, Sports, Pets.
  • the user may join any of the existing group(s), which were created and defined by the administrator.
  • the user finds here a list of all the groups with a full description of each. Typically, groups that fit the user's interests, as listed by him, appear first.
  • the user can search for new friends characterized according to parameters such as age, interests, country, gender, and toy type. He can also locate old friends according to a nickname or country for example. New friends approved by the user are added to the contact list.
  • the Contact List lists the user's colleagues with their personal data, derived from their public personal information.
  • the users' colleague data may for example comprise: nickname, full name, full address, phone number, e-mail and comments.
  • the user can send either messages or scripts to his/her group's members or to colleagues listed in his contact list, which is typically accessible from the screen display of the send messages/scripts option.
  • the scripts to be sent may originate as free downloads from the web or as home-made scripts composed by the user or by his or her colleagues. Delivery of messages and scripts can be immediate or at any future date.
  • the system provides an incoming messages/scripts icon with which the user can elect to keep all new messages or to delete incoming messages.
  • the system of the present invention includes a “player” which plays scripts selected by a user in accordance with a schedule also typically selected by the user.
  • FIG. 10 is an example of a simplified screen display for a “play” function in the player. Using the play function, the user programs schedule and content. Following is a description of the “play” function in accordance with a preferred embodiment of the present invention.
  • the User can get to the play function either from an existing PLAY icon in the player, or via a new icon that the user adds.
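The player's scheduling idea described above can be sketched as follows: each content element is paired with a user-selected time, and the player runs whichever scripts are currently due. The schedule format and script names are illustrative assumptions.

```python
# Hedged sketch of the player's "play" schedule check.

import datetime

schedule = [
    {"time": datetime.time(16, 0),  "script": "napoleon_lesson.script"},
    {"time": datetime.time(17, 30), "script": "multiplication_game.script"},
]

def scripts_due(now, window_minutes=5):
    """Return scripts whose scheduled time fell within the last few minutes."""
    due = []
    for entry in schedule:
        scheduled = datetime.datetime.combine(now.date(), entry["time"])
        if 0 <= (now - scheduled).total_seconds() < window_minutes * 60:
            due.append(entry["script"])
    return due

now = datetime.datetime.combine(datetime.date.today(), datetime.time(16, 2))
print(scripts_due(now))   # ['napoleon_lesson.script']
```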
  • FIG. 11 is a simplified illustration of a “textbox” screen display of the programming function of the player, constructed and operative in accordance with a preferred embodiment of the present invention.
  • FIG. 12 is a table that describes the programming feature of the player in accordance with a preferred embodiment of the present invention.
  • FIG. 13 is a semi-block diagram semi-flowchart illustration of the “personal” function of a player in accordance with a preferred embodiment of the present invention.
  • FIG. 14 is a semi-block diagram semi-flowchart illustration of the “club” function of a player in accordance with a preferred embodiment of the present invention.
  • FIG. 15 is a semi-block diagram semi-flowchart illustration of the “shop” function of a player in accordance with a preferred embodiment of the present invention.
  • FIG. 16 is a simplified flowchart illustration of a registration procedure provided in accordance with a preferred embodiment of the present invention.
  • FIG. 17 is a simplified flowchart illustration of an example procedure of sending a request message from a user to a server in accordance with a preferred embodiment of the present invention.
  • FIG. 18 is a simplified flowchart illustration of a suitable procedure of creating a new group for the users' club in accordance with a preferred embodiment of the present invention.
  • FIG. 19 is a simplified flowchart illustration of a suitable procedure of a leaving a group of the users' club in accordance with a preferred embodiment of the present invention.
  • FIG. 20 is a simplified flowchart illustration of a suitable procedure of viewing group members provided by the users' club in accordance with a preferred embodiment of the present invention.
  • FIGS. 21 to 24 are simplified flowchart illustrations which, taken together, describe an example of a search procedure provided by the club in accordance with a preferred embodiment of the present invention.
  • FIG. 25 is a simplified flowchart illustration of a suitable procedure for sending a message to another user in accordance with a preferred embodiment of the present invention.
  • FIG. 26 is a simplified flowchart illustration of a suitable procedure for sending a script to another user in accordance with a preferred embodiment of the present invention.
  • FIG. 27 is a simplified flowchart illustration of a suitable implementation of send procedure D in FIGS. 25 and 26, constructed and operative in accordance with a preferred embodiment of the present invention.
  • FIG. 28A is a simplified flowchart illustration of a suitable procedure for adding a user to a contact list in accordance with a preferred embodiment of the present invention.
  • FIG. 28B is a simplified flowchart illustration of a suitable procedure for removing a user from a contact list in accordance with a preferred embodiment of the present invention.
  • FIG. 29 is a simplified flowchart illustration of a suitable account update procedure provided in accordance with a preferred embodiment of the present invention.
  • FIG. 30 is a simplified flowchart illustration of a suitable subject update procedure provided in accordance with a preferred embodiment of the present invention.
  • FIG. 31 is a simplified flowchart illustration of a suitable procedure for ignoring a message and/or a user in accordance with a preferred embodiment of the present invention.
  • FIG. 32 is a simplified flowchart illustration of a suitable procedure for adding fields to a user's contact list in accordance with a preferred embodiment of the present invention.
  • FIGS. 33 to 35 are simplified flowchart illustrations of three respective procedures for “adding to basket”, which procedures are preferably provided by the electronic shop of the present invention.
  • FIG. 36 is a simplified flowchart illustration of a suitable implementation of a “remove from basket” procedure preferably provided by the electronic shop of the present invention.
  • FIG. 37 is a simplified flowchart illustration of a suitable search procedure preferably provided by the electronic shop of the present invention.
  • FIG. 38 is a simplified flowchart illustration of a suitable procedure for “winning credit points” preferably provided by the electronic shop of the present invention.
  • FIG. 39 is a simplified flowchart illustration of a suitable procedure of paying with credit points preferably provided by the electronic shop of the present invention.
  • FIG. 40 is a simplified illustration of a screen display of the “My account” function of the club provided in accordance with a preferred embodiment of the present invention.
  • FIG. 41 is a simplified illustration of a screen display of the “Contact list” function of the club in accordance with a preferred embodiment of the present invention.
  • FIG. 42 is a simplified pictorial illustration of a screen display of the “Group” function of the club provided in accordance with a preferred embodiment of the present invention.
  • FIG. 43 is a simplified pictorial illustration of a screen display of the “Search user” function of the club in accordance with a preferred embodiment of the present invention.
  • FIG. 44 is a simplified pictorial illustration of a screen display of the “Send message/scripts” function of the club in accordance with a preferred embodiment of the present invention.
  • FIG. 45 is a simplified pictorial illustration of a screen display of the “Interests” form provided in accordance with a preferred embodiment of the present invention.
  • FIG. 46 is a simplified pictorial illustration of a screen display of the “Registration” form provided in accordance with a preferred embodiment of the present invention.
  • FIG. 47 is a simplified pictorial illustration of a screen display of the “Select content” function preferably provided by the electronic shop of the present invention.
  • FIG. 48 is a simplified pictorial illustration of a screen display of the “Select packages” function preferably provided by the electronic shop of the present invention.
  • FIG. 49 is a simplified pictorial illustration of a screen display of the “View Account” function preferably provided by the electronic shop of the present invention.
  • Described herein is a software tool for generating verbal content and for controlling toys and other manipulable objects, particularly suited for toys operated by a PC in wireless communication, by means of a wireless (e.g. radio) base station connected to the PC, with a toy controller embedded inside the toy.
  • Living Object: Hardware and software technology for building computer controlled toys and other manipulable objects, and for the generation of verbal content for their control.
  • Scriptwriter: A software program for the generation of verbal content for the control of toys based on Living Object technology.
  • Base Station: A radio or other wireless transceiver connected to the PC, providing wireless communication between the PC and the toy controller embedded in the computer controlled toy.
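  • To make the division of roles between the PC, the base station and the toy controller concrete, the following minimal sketch models the path a single command takes from the script running on the PC to the toy. All class and method names (BaseStation, ToyController, register, send) are hypothetical illustrations, not part of the Living Object software itself.

```python
# Hypothetical sketch of the PC -> base station -> toy controller path.
# Names and interfaces are illustrative only, not the actual Living Object API.

class ToyController:
    """Runs inside the toy; receives commands arriving over the radio link."""
    def handle(self, command: dict) -> None:
        if command["type"] == "talk":
            print(f"[toy] speaking: {command['text']}")
        elif command["type"] == "move":
            print(f"[toy] moving: {command['movement']}")

class BaseStation:
    """Wireless transceiver plugged into the PC; forwards commands to toys."""
    def __init__(self) -> None:
        self.toys: dict[str, ToyController] = {}

    def register(self, name: str, toy: ToyController) -> None:
        self.toys[name] = toy

    def send(self, toy_name: str, command: dict) -> None:
        # In the real system this step would be a wireless transmission.
        self.toys[toy_name].handle(command)

# The script running on the PC drives the toy through the base station.
station = BaseStation()
station.register("teddy", ToyController())
station.send("teddy", {"type": "talk", "text": "What do you feel like doing now?"})
```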
  • Toys and other objects based on Living Object technology use the computer, wireless communications, and voice recognition software to speak with their users in a human voice, with human-like personality and intelligence.
  • the toys hold entertaining, personalized dialogs with the child, demonstrating knowledge of the child and his/her likes and recalling past interactive sessions.
  • the Living Object Scriptwriter is a tool useful in creating interactive scripts that give the toys speech and personality. These scripts typically feature content that includes:
  • the Scriptwriter's computer is turned on.
  • the Living Toy is turned on and close by and the Living Object base station is plugged into the computer.
  • FIG. 50 showing the Living Object base station
  • FIG. 51 showing the Living Object Toy
  • FIG. 52 shows a screen display of the Scriptwriter icon on desktop
  • FIG. 53 showing a screen display of Living Object Scriptwriter main screen:
  • the Living Object Scriptwriter program opens to its main screen.
  • the screen on the computer may look like the screen display illustrated in FIG. 53.
  • FIG. 57 showing the “talk icon” screen display
  • FIG. 58 showing the Scriptwriter main screen display with the added talk object
  • FIG. 59 showing the Scriptwriter main screen display with the added talk object connected by a line to the start object.
  • FIG. 60 showing the screen display of the action toolbar with the save icon and to FIG. 61 showing the screen display for naming and saving the script:
  • FIG. 62 illustrates a screen window display of a combo box for typing the toy's speech:
  • FIG. 63 is a screen window display for recording sound to be played by the toy.
  • the toy's speech can be recorded through the toy or through the computer's microphone. Logistically, it may be easier to use a conventional microphone. If recording through the toy, make sure it is on and awake. If using the computer's microphone, make sure it is plugged into the microphone jack in the back of the computer and that the speakers and sound software are on. In the Talk window, click on the Record button. The Sound Recorder window opens.
  • the user clicks on the microphone icon. For example, the user may record the line, "What do you feel like doing now, wise guy? When my eyes light up, say: A joke, a song, or a game." When done, the user clicks again on the microphone icon.
  • to play back the recording and make sure it recorded well, the user clicks on the speaker icon. If the user is not satisfied with the recording, he or she repeats it. If it is desired to increase or decrease the volume of the recording, the Volume dial is adjusted by twisting it to the left or right with the mouse. Then the user records the line again.
  • FIG. 64 is a screen window display for saving a recording.
  • the recording is preferably saved. Click on the Save icon. The Record into window appears. The recording is saved in the same directory as the Scriptwriter script.
  • FIG. 65 is a screen window display for selecting a “wave” file to be played by the toy. Now a line that the user created and recorded is played through the user's toy.
  • the toy may have gone to sleep while the user was occupied with scriptwriting and recording. To wake the toy up, squeeze its hand or another body part that contains a sensor. If the toy responds with movement and/or a beep, then the user has switched it back to Alert mode. At this point, the user clicks on the play button and the system plays the wave file through the toy.
  • FIG. 66 illustrates a Listen icon
  • FIG. 67 is a screen display of a part of the Scriptwriter main window with the Listen object added.
  • FIG. 68 is the “Listen and Sense” screen window display.
  • the Listen and Sense window opens. In the Listen and Sense window, the user defines what words the toy listens for or what sensors are in input mode during the current listen and sense segment.
  • Type the keywords, following the same spacing and punctuation pattern seen in parentheses. Type: a joke, a song, or a game.
  • Keyword phrases typically comprise exactly two words.
  • FIG. 69 illustrates the "Keyword link box" in the "Choose Link" screen display and FIG. 70 shows the Scriptwriter main screen display with a Listen object linked to corresponding Talk objects.
  • the toy gives a different answer to each keyword it hears. This process of building questions, keywords, and responses to keywords gives the toy its intelligent conversational ability—at the most basic level.
  • the system offers many different features that enable the user to give the dialog a highly intelligent aspect, such as random answers, answers based on memory, and answers based on collected personal data.
  • the fourth keyword may automatically display a link called "Not-Found." This link allows the user to create a verbal response to a situation in which the toy did not hear or understand any of the keywords it was listening for (or, if the toy was awaiting sensor input, did not sense input to the sensor it was waiting for). A description of how to create a "Not-Found" reaction by the toy is provided below.
  • the fourth Talk object created typically needs to contain speech that tells the user what to do if the toy did not understand or hear the keyword spoken by the user. Typically, it should ask the user to repeat the keyword, or make a comment to the effect that the toy did not get one of the expected answers and is therefore moving on to the next point in the script. If the user is asked to repeat the keyword, the user should be reminded what the keywords are, in case she or he has forgotten them.
  • type text that tells the user that the toy did not hear the response, but is moving on to the next point in the script.
  • type “Hmmm, you want me to choose? Ok, I'm in the mood for a joke!”
  • the user's toy voices the text defined in Talk1, listens for one of the three keywords defined in Listen1, and responds accordingly by voicing the text from Talk2, Talk3, or Talk4.
  • Talk1 is a wave file, whereas the other Talk objects are played through the toy as synthesized speech.
  • the Talk window of each of the Talk objects is opened and the text recorded, as described above.
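  • As an illustration of the Talk / Listen / keyword-link mechanism just described, the following minimal sketch shows a question, three keyword-linked responses, and a "Not-Found" fallback. The recognize() stub and all variable names are hypothetical; in the actual system the recognition step is performed by the speech recognition engine and the responses are voiced through the toy.

```python
# Minimal sketch of the Talk -> Listen -> keyword-linked Talk flow described above.
# recognize() is a stand-in for the real speech recognition step.

def recognize(keywords: list[str]) -> str | None:
    """Placeholder: return the keyword the child said, or None if nothing was understood."""
    answer = input("child says: ").strip().lower()
    return answer if answer in keywords else None

script = {
    "Talk1": "What do you feel like doing now? Say: a joke, a song, or a game.",
    "links": {
        "a joke": "Hmmm, here comes a joke!",                    # Talk2
        "a song": "La la la, let's sing together!",              # Talk3
        "a game": "Great, let's play a game!",                   # Talk4
        # The "Not-Found" link handles the case where no keyword was heard.
        "Not-Found": "I didn't hear you, so I'm in the mood for a joke!",
    },
}

print("[toy]", script["Talk1"])                                  # Talk1
heard = recognize(["a joke", "a song", "a game"])                # Listen1
print("[toy]", script["links"].get(heard, script["links"]["Not-Found"]))
```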
  • FIGS. 73A and 73B showing a table of the functions provided by the Scriptwriter with their icons as presented on the Scriptwriter main screen display.
  • FIG. 74 showing the Talk object screen window display.
  • To enter the Talk options window, double-click on the Talk icon in the script. The user clicks on the Advanced button for the following additional options:
  • TTS: Text to Speech
  • a change can be made in the type of voice used by clicking on the different options available to the right of the TTS field, e.g. man, woman, boy, girl.
  • a wav file can be inserted by choosing the wav option field and selecting a wav file either from the computer or from a recorded wav file.
  • a message can be recorded by selecting the record button. This brings the user to the Sound recorder window, as described herein in the description of how to record speech.
  • the wav file can be played back from this window by clicking on the play button.
  • Movement Options allows the user to select the type of movement for the talk segment.
  • the Mood and Stage fields are used for additional comment information.
  • Listen & Sense Object Reference is now made to FIG. 75 showing the Listen and Sense screen window display.
  • the keywords field is where the user defines the options available for speech recognition. With the say keywords button which is located at the end of the keywords field the user can hear the words chosen.
  • the Sensors field allows the user to define the sensor areas located on the toy for non-verbal response.
  • Listen time allows the user to define the maximum time given to listen or wait for activation of sensors.
  • the Memory field allows the user to save the results of the recognition process.
  • Move Object Reference is now made to FIG. 76 showing the Move screen window display.
  • the Movement field allows the user to pick the type of activity the toy is to perform.
  • the Moving Time field defines the length of time the movement is to take place.
  • Memory Object Reference is now made to FIG. 78 showing the Memory screen window display, which allows the user to put a value into a certain compartment of the computer's memory and give the compartment a name.
  • Condition Object Reference is now made to FIG. 79 showing the Condition screen window display.
  • the computer can add, subtract, multiply and divide.
  • Random Object Reference is now made to FIG. 81 showing the Random screen window display, which allows a user to create a list of values that the computer chooses from on a random basis and to tell the computer in which memory compartment to put the values.
  • Time Marker Object Reference is now made to FIG. 82 showing the Date and Time screen window display which allows a user to put a certain time or date in a compartment in the computer's memory.
  • Wait Object Reference is now made to FIG. 83 showing the Wait screen window display. This display instructs the toy to wait for a certain amount of time before proceeding with the script.
  • Jump Object Reference is now made to FIG. 84 showing the Jump screen window display, allowing a user to skip to a different point in the script.
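  • The Memory, Condition, Random, Time Marker, Wait and Jump objects described above can be thought of as a very small script virtual machine operating on named memory compartments. The sketch below is a hypothetical illustration of that idea; the compartment names and branch labels are invented for the example and are not part of the Scriptwriter itself.

```python
# Hypothetical sketch of the Memory / Condition / Random / Time Marker / Wait / Jump
# objects acting on named memory compartments. Names are illustrative only.

import random
import time
from datetime import datetime

memory: dict[str, object] = {}                       # named "compartments"

# Random object: pick one value from a list and store it in a compartment.
memory["next_activity"] = random.choice(["joke", "song", "game"])

# Time Marker object: put the current date and time in a compartment.
memory["session_start"] = datetime.now()

# Condition object: simple arithmetic and a comparison on stored values.
memory["score"] = int(memory.get("score", 0)) + 1
if int(memory["score"]) >= 3:
    branch = "praise_script"                         # Jump object: skip elsewhere
else:
    branch = "encourage_script"

# Wait object: pause before proceeding with the script.
time.sleep(0.5)
print("jumping to:", branch, "with activity:", memory["next_activity"])
```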
  • Script Object Reference is now made to FIG. 86 showing the Run Script screen window display which enables the user to run any other Scriptwriter Script.
  • FIG. 87 showing the Internet screen window display which opens a defined web page.
  • FIG. 88 showing the Graphics screen window display which shows a picture or video file on the computer's screen.
  • Display time is the length of time the image/video is to be shown.
  • Size field allows the user to determine the height and width of the image chosen.
  • Video (Advanced Options): Choosing the "Wait until finish" command instructs the toy to wait until the video is completed before continuing with the script.
  • FIGS. 89, 90, 91, 92 and 93 showing "End", "Script Properties", "Choose Link", "Pop-up Menu" and "Options" screen window displays, respectively.
  • End Object The end object stops the script and allows the users to define the exit names. When the script is opened from a different window and single output mode is not defined, the user is able to view all the available script exits.
  • the IDE lets the user activate it in a way that gives life-like behavior to the toy.
  • the IDE comprises algorithms and a strong compiler that integrate time, pattern, and minimal interval and apply them to the script or a collection of scripts.
  • the resulting artificially created life for the toy is typically authentic in the sense that users can easily forget they are speaking and interacting with a toy. A description of how to use the IDE to create artificial life is now provided.
  • FIG. 94 showing the Artificial Life Algorithm Editor screen window display.
  • the Artificial Life Professional Editor allows the user to define formulas and assign values to local and system parameters that later act on a given script.
  • the Editor is used to write a user's own formulas or edit pre-written formulas provided by the function library.
  • the Editor then allows the user to create an algorithm from the formula the user has defined, and associate the algorithm with the current script.
  • the user typically needs to assign the formula a temporary value in the Formula parameter box.
  • the formula on the sample screen has been assigned a value of 1. This value could represent successful completion of, say, a lesson in science. If the script has never been completed successfully, it could have formula parameter value of 2.
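  • The following sketch is a hypothetical illustration of how such a formula parameter value could be compared against a threshold to decide whether a script is activated by the Artificial Life engine. The field names, threshold and values are invented for the example and do not reflect the editor's actual internal representation.

```python
# Hypothetical sketch: an Artificial Life formula value gating script activation.
# Parameter names, values and the threshold are invented for illustration.

scripts = {
    # formula_value 1 = the lesson was completed successfully, 2 = never completed
    "science_lesson": {"formula_value": 2, "activation_threshold": 2},
    "laugh_script":   {"formula_value": 1, "activation_threshold": 2},
}

def should_activate(name: str) -> bool:
    entry = scripts[name]
    return entry["formula_value"] >= entry["activation_threshold"]

for name in scripts:
    print(name, "->", "activate" if should_activate(name) else "skip")
```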
  • FIG. 95 showing the Artificial Life Editor screen window display.
  • the editor enables the user to build a formula for a specific script.
  • steps to add an AL formula to a script: First, the user chooses the script by pressing the Load button. Then the user fills in the formula by double-clicking on a cell; at least one cell must typically be filled in. Finally, the user saves the AL formula by pressing the Save button.
  • FIG. 96 showing the Artificial Life Editor screen window display with the Cell Management pop-up window. By right-clicking with the mouse, the user gets a pop-up with cell functions.
  • FIG. 97 showing the Artificial Life Editor screen window display with the Function Library pop-up window.
  • FIG. 98 showing the Artificial Life Manager screen window display.
  • the Artificial Life Manager gives the user an overview of all scripts that the Artificial Life engine is to check and the formulas, parameters, and values assigned to them. It is possible to work from the Manager to make changes to the definitions.
  • the Manager contains functions for adding, removing, and viewing the history of the last 10 executions of each script. Highlighting a script name with the highlight bar displays all the relevant details of that script.
  • FIG. 99 showing the Artificial Life Editor Viewer window display.
  • the Artificial Life Viewer presents a real-time and historical graphical depiction of the status of up to five scripts at any one time.
  • the Viewer can be used to track the behavior of different scripts, as determined by the value stored for each script in “Memory.”
  • the “Show activation level” item can be selected to view the threshold of the selected scripts, and thereby determine when the last time was that each script executed.
  • the Viewer displays the last 10 minutes of Artificial Life activity on an ongoing basis, scrolling to the right as additional charted activity takes place.
  • a preferred method for building an AL toy typically comprises the following steps:
  • FIG. 101 is an example of a Formula table.
  • FIG. 100 is an example of a Dependence table.
  • FIGS. 102 and 103 showing the Scriptwriter main screen display with corresponding AL scripts, specifically a game script and a laugh script respectively.
  • the system provides a variety of optional commands, which may be implemented as Function Bar Commands.
  • An example of a set of optional commands which may be provided is now described.
  • File Menu Reference is now made to FIG. 104 showing the Scriptwriter main screen display with the File menu open.
  • the file menu allows the user to create a new script, open an existing script and other operations that are found in a normal file menu.
  • the file menu preferably includes menu options such as the following:
  • New Script In order to begin writing a script, click on new script in the file menu and a new window appears on the screen. Now the user can begin working on his/her script.
  • Open Script To open an already saved script, click on open script in the file menu. A window opens up containing a list of the existing scripts which the user can search from. When the user finds the script s/he is looking for, click on its name, for example script1.script, and the script file opens.
  • the Save Script As Image command saves the script in the format of a picture image.
  • the script is saved as a Metafile Image (WMF).
  • WMF is especially compatible with the Word program.
  • the user can make corrections and changes, outside the IDE Scriptwriter program, in Word itself.
  • FIG. 105 showing the Scriptwriter main screen display with the Create Report Window.
  • the Create Report command creates a chart in the Excel program which documents which objects appear in the script created.
  • When the user clicks on Create Report, the user can choose to chart all properties of all existing objects by pressing Print All.
  • the user can limit the chart to a specific object, for example talk, by selecting Create Selected in the window that opened when Create Report was clicked on.
  • Print Description When clicking on the Print Description command, a detailed text file, and not a chart, appears. The same information which appears in the Create Report chart appears in Print Description in textual form.
  • Print Preview When clicking on the Print Preview command, the user receives a print preview of the script s/he has just created.
  • Print The Print command prints a visual picture of the script as well as a verbal description of the stages of the script. Below the print command in the file menu appear the Last Opened Scripts; a maximum of the last three files that have been worked on can be displayed. The last command in the File Menu is the Exit command. When clicking on the exit command, the user exits the IDE Scriptwriter Program.
  • FIG. 106 showing the Scriptwriter main screen display with the Edit menu open.
  • the Edit Menu allows the user to amend and change the script s/he has already created. It includes commands such as the following:
  • Paste The paste command is used together with the cut and copy commands. When cutting a part of his script, the user must then click on the paste command in order to place that part in another part of his script.
  • FIG. 107 showing the Scriptwriter main screen display with the Find Window.
  • the user can search for a specific word or object in his whole script, making his search easier.
  • FIG. 108 showing the Scriptwriter main screen display with the Replace Window. When clicking on Replace, a window appears. This window is split into two sections: Target and Object.
  • Target the target defines where the desired replacement should take place. It can take place in a selected part, or in the whole script.
  • Object the object defines in which objects the replacement should take place. It can take place in All objects that have shared properties, or the Replace command can be executed according to Object Type. A replacement is made in a specific object according to its unique properties.
  • Clipboard Copy the image or description (Copy Image to Clipboard, Copy Description to Clipboard) of the script onto Windows' clipboard. All Windows applications can now use the image or description.
  • View Menu Reference is now made to FIG. 109 showing the Scriptwriter main screen display with the View menu open.
  • the View Menu offers different forms of viewing the script created, such as zoom in/out and volume.
  • Zoom in The Zoom in lets the user view his script in magnified size.
  • Zoom out The Zoom out lets the user view his script in decreased size.
  • Normal Size The normal Size lets the user view his script in its original size.
  • Volume Reference is now made to FIG. 110 showing the Scriptwriter main screen display with the Volume and Speech Recognition Windows. Clicking on Volume shows the volume of all that is spoken or heard in the script. This can help the user understand why, for example, words are not being recognized by the program because the microphone level is too low.
  • SR Result The system uses a Speech Recognition (SR) window to show the speech recognition results while running the script. The accuracy helps the user determine if the sensitivity in identifying certain parts in the program should be lowered. The higher the accuracy, the closer the user's enunciation is to what the computer expects.
  • SR: Speech Recognition
  • the Rec. Wav button allows the user to hear the last saved recordings during the listen process.
  • FIG. 111 showing the Scriptwriter main screen display with the Watch List and the Add Memory windows
  • Watches Using the Watches command, the user can follow the different values of Memory, that have been saved, during or after running the script.
  • Execute log Reference is now made to FIG. 112 showing the Scriptwriter main screen display with the Execute Log and the Messages windows.
  • the Execute Log is a logger of all operations that have been identified and executed. This can be extremely helpful in identifying errors that have been made.
  • FIG. 113 showing the Scriptwriter main screen display with the Sensor Selection window. This is a simulation for the sensors of the specific object in the user's script. The sensors that are active in different parts during the script are identified by name during this Sensor Simulation.
  • Link Style This refers to the different styles of links that can be made between two objects in the script (e.g. between talk & move). There are six different styles of links, e.g. vertical-horizontal and horizontal-vertical. These different styles help the user to better organize his script writing form.
  • Simulator When clicking on the Simulator, a window opens up on the user's screen. A simulator doll is displayed that actually acts out the script, but only if the script is running in simulation mode.
  • Scheduler Reference is now made to FIG. 114 showing the Scheduler screen window display.
  • the Scheduler can determine at what set time the user's script is executed.
  • a user can schedule his script to appear once an hour, once a day, on an event like a birthday, or every time a doll's hand is pressed.
  • Not only scripts can be scheduled; the user can also schedule a Message to be delivered on a set time or event.
  • a user can also receive a List of the last scripts to be run and on what dates their running had occurred.
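  • The sketch below is a hypothetical illustration of the kind of record the Scheduler could keep for each task: a script or message, a trigger (a set time, an interval, or a sensor event), and a history of past runs. Field names and trigger strings are invented for the example and are not taken from the Scheduler's actual implementation.

```python
# Hypothetical sketch of scheduled tasks: a script or message, a trigger,
# and a run history. Field names and trigger strings are illustrative only.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ScheduledTask:
    name: str
    trigger: str                       # e.g. "hourly", "daily 16:00", "event: hand_sensor"
    history: list = field(default_factory=list)

    def run(self) -> None:
        self.history.append(datetime.now())
        print(f"running {self.name} (trigger: {self.trigger})")

tasks = [
    ScheduledTask("napoleon_lesson", "daily 16:00"),
    ScheduledTask("birthday_message", "event: birthday"),
    ScheduledTask("greeting_script", "event: hand_sensor"),
]

# A real scheduler would check triggers in a loop; here the runs are only recorded.
for task in tasks:
    task.run()
print("last runs:", [(t.name, t.history[-1].isoformat()) for t in tasks])
```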
  • Scheduler adding a task: Reference is now made to FIGS. 115 and 116 showing the Scheduler screen window display with the Add Task pop-up window and the Scheduler List pop-up window, respectively.
  • Run Menu Reference is now made to FIG. 118 showing the Scriptwriter main screen display with the Run menu open. This menu allows the user to Run his finished script, pause and so on.
  • the menu typically offers options such as the following:
  • Run from Selected Allows the user to begin playing his script from a specific chosen point.
  • Pause Pause the script midway at a chosen place.
  • Stop Bring the running script to a complete stop.
  • Check Script Check the script for any errors or hints (if there are any errors or hints in the script the Message window appears).
  • Tools Menu The Tools menu controls the environment of the IDE Scriptwriter program. A user can control the toys, volume, and sensors.
  • Options The Options commands are split into the pages or screen windows illustrated in FIGS. 119-124, including Toys, Hardware, Environment, Volume Setting, Living Toys, Script and Report.
  • Toys Page Reference is now made to FIG. 119 showing the Scriptwriter main screen display with the Option window at the Toys Page.
  • Toys page In the Toys page the user can define the toys shown in the list. A toy is typically defined by name (toy and name), type (which defines the toy according to the operations it can perform), number, and channel. This page also allows a user to remove toys from the list.
  • FIG. 120 showing the Script-writer main screen display with the Option window at the Hardware Page.
  • Check Base Station checks the communication between the base station and the program and resets the base station.
  • FIG. 121 showing the Scriptwriter main screen display with the Option window at the Environment Page.
  • View simulator awakens the simulator doll inside the program.
  • Advanced Properties shows every object (e.g. talking, moving) with its advanced properties.
  • Toy Identity Changes the illustration in the script itself to an illustration of the chosen toy and not the generic illustration. This helps to clarify which toy is talking or operating at different points in a multi-toy script.
  • Default Toy the toy in the script is the default toy.
  • Volume Setting Page Reference is now made to FIG. 122 showing the Scriptwriter main screen display with the Option window at the Volume Setting Page. These Volume Settings are for the speaker as well as the microphone in the doll. A doll is selected, and the reload button is clicked on. This asks the doll to send the program its internal volume settings. An update can be made to these settings, saving the update and changing the original settings. After the update, another check of the volume settings is made.
  • Living Toy Page Reference is now made to FIG. 123 showing the Scriptwriter main screen display with the Option window at the Living Toy Page.
  • the user can activate all toys that are programmed for artificial life.
  • the toy, like a live person, can be set to "sleep", "eat", and "wake up" at a set time.
  • a user can choose to activate only certain toys for artificial life; once the user has chosen them, they "wake up" if they are sleeping.
  • Script page Reference is now made to FIG. 124 showing the Scriptwriter main screen display with the Option window at the Script page.
  • Report Page The Report page typically comprises the following elements:
  • Save Listen File can save any listen made in a script (it always saves the last listen heard).
  • SR-Database is in use by a recogniser.
  • SR-Agc can not be active during user word training.
  • TTS-Specified parameter is out of range.
  • SR is an abbreviation for "speech recognition" and TTS is an abbreviation for "text to speech".
  • the present invention relates to machines' interaction with humans, where the machines use the indirect approach to identify the characteristics, preferences and interests of the person.
  • the proper identification of the characteristics, preferences and interests enables the machine to suggest appropriate content, information, entertainment, games, stories, riddles and jokes as well as TV programs, movies and theater shows, and various articles on sale.
  • a multilevel question is a question that has two or more correct answers.
  • the answer selected by the questioned person provides information about him or her. Such information may be: the age range, gender, culture, breadth of knowledge, preferences and inclinations, possible areas of interest and associations.
  • the question is a trivia question with one “obvious” or “superfluous” answer and another “concealed” or “deep” answer that requires somewhat “increased” level of knowledge.
  • the question contains a clue that orients the "knowledgeable" person to select the "concealed" answer. The "unknowledgeable" person disregards the clue and selects the "obvious" answer.
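  • As a hypothetical illustration of such a multilevel question, the sketch below records both correct answers together with the level of each answer and the characteristics that choosing it implies. The question text, the clue and the implied traits are invented examples and are not taken from the figures.

```python
# Hypothetical multilevel question: both answers are acceptable, but the one the
# player chooses hints at his or her characteristics. All content is invented.

question = {
    "text": "Which animal is the fastest: the cheetah or the falcon?",
    "clue": "Remember, some animals move through the air...",
    "answers": {
        "cheetah": {"level": "obvious",   "implies": {"knowledge": "general"}},
        "falcon":  {"level": "concealed", "implies": {"knowledge": "deep", "age": "older"}},
    },
}

def infer_traits(answer: str) -> dict:
    """Return the characteristics implied by the selected answer, if it is recognized."""
    entry = question["answers"].get(answer.strip().lower())
    return entry["implies"] if entry else {}

print(infer_traits("falcon"))    # {'knowledge': 'deep', 'age': 'older'}
print(infer_traits("cheetah"))   # {'knowledge': 'general'}
```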
  • the preferred embodiment of the current invention relates to computer games played by kids and, more specifically, to games played with computer controlled toys.
  • the present invention employs multilevel questions to identify the player's age and level of knowledge to select the level of content (game or story) to play with the player.
  • the interrogation is performed by verbally asking the questions and performing speech recognition on the recorded response.
  • Such forms can be visual and tactile, employing screen display, keyboard, pointing device such as mouse, switches and sensors.
  • Concealed information can also be provided by moving or lighting limbs of a doll or a graphic character on screen, or other parts of the scene.
  • FIG. 125 is a table that presents an interactive script between a toy and a player where the toy determines the characteristics of the player (namely, age range) to suggest the appropriate level of game content.
  • a list of multilevel questions is maintained with a set of characteristics associated with each question.
  • the characteristics define the types of persons that the question can help identify according to characteristics such as gender and age group, as shown in the table of FIG. 126.
  • the machine selects the appropriate question according to its associated characteristics. For example, if the player selects a male avatar, the program may ask a multilevel question with an obvious answer of general knowledge and a concealed answer regarding, for example, football. If the program is about to present an advertisement, it may first identify the age group by presenting a multilevel question that is associated with age group identification.
  • FIG. 126 is a table that stores an example of a list of multilevel questions.
  • FIG. 127 is a script diagram of an interactive script where a toy determines the characteristics of a player (e.g. age range) to suggest the appropriate level of game content.
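  • The following sketch is a hypothetical illustration of selecting a multilevel question from such a list according to the characteristic the program currently wants to identify, in the spirit of the table of FIG. 126. The question texts and characteristic tags are invented for the example.

```python
# Hypothetical sketch: choose a multilevel question by the characteristic it helps identify.

questions = [
    {"id": 1, "identifies": "age group", "text": "Which is bigger, a cat or a horse?"},
    {"id": 2, "identifies": "gender",    "text": "Who won the football cup last year?"},
    {"id": 3, "identifies": "age group", "text": "Which animal is the fastest?"},
]

def pick_question(target_characteristic: str) -> dict | None:
    """Return the first question associated with the requested characteristic."""
    for q in questions:
        if q["identifies"] == target_characteristic:
            return q
    return None

# Before presenting an advertisement, first identify the player's age group:
chosen = pick_question("age group")
if chosen is not None:
    print(chosen["text"])
```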
  • the adaptive toy system of FIGS. 1 - 5 comprises a learning machine that uses trivia game scripts.
  • An example of such trivia game scripts is described in FIGS. 6 - 9 .
  • a detailed example of an adaptive toy system that uses trivia game scripts in accordance with a preferred embodiment of the present invention is now described.
  • FIG. 128 describes by means of a flowchart the basic stages of a learning procedure for a toy system in accordance with a preferred embodiment of the present invention.
  • trivia game scripts are sent to users of a toy system.
  • a sample-group of users of a given age group, from among the entire community of users on a toy system, receives the said scripts over a network.
  • the age of a user is typically registered by a toy system.
  • An example of a registration procedure in accordance with a preferred embodiment of the present invention is described in FIG. 16.
  • results of game scripts played by individual users are retrieved by a toy system.
  • the results of each user are initially processed on a personal computer controlling the toy of the said user, and the results are then sent over a network to a central computer on a server of a toy system.
  • the learning process of FIG. 128 typically comprises a stage of checking whether game results are “meaningful”. For example, game results are typically considered not meaningful if the game scripts turned out to be too difficult for users participating in a sample group. In such a case, one or more parts of a learning process are typically modified before the whole process is repeated. If game results are considered meaningful, such results are typically used in order to change the content provided by a toy system, possibly for the entire community of users on such a system. Examples of this as well as the previous stages of the procedure of FIG. 128 are detailed below. Thus, the procedure of FIG. 128 preferably provides the toy system with one or both of the following two adaptation capabilities: 1) change of toy content; and 2) modification of the learning procedure itself.
  • FIG. 129 describes by means of a flowchart an example of a procedure of analyzing game results for a single user in accordance with a preferred embodiment of the present invention.
  • the game in the present example includes three trivia game scripts that are described in FIGS. 7 - 9 .
  • the following four parameters are used in the course of the present procedure:
  • S1Boring shows the number of questions to which a user did not give a recognizable answer. Since a game in the present example comprises 9 questions, the value of this parameter is to be between 0 and 9.
  • S1L1 shows which and how many questions from among the 3 questions of level 1 were answered correctly by a user. Typically, S1L1 is incremented by 2^(n-1) if the n'th question is answered correctly, so that for each value of S1L1 it is possible to discern which questions were answered correctly. For example, if S1L1 equals 1, 2 or 4, that means that only a single question (questions 1, 2 and 3, respectively) was answered correctly by a user.
  • S1L2 shows which and how many questions of level 2 were answered correctly by a user.
  • S1L3 shows which and how many questions of level 3 were answered correctly by a user.
  • Game too easy is defined in the present example as a case where a user has answered correctly two or more of the questions in each of the three game scripts.
  • Game too difficult is defined in the present example as a case where a user has only answered correctly up to a single question in each of the three game scripts.
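  • A minimal sketch of the per-user scoring just described is given below: S1L1 to S1L3 are treated as bit masks in which answering the n'th question of a level correctly adds 2^(n-1), and the "too easy" and "too difficult" tests follow the definitions above. The helper function names are invented for the illustration.

```python
# Sketch of the per-user scoring described above. S1L1..S1L3 are bit masks: each
# correctly answered question n of a level adds 2**(n-1), so the value also records
# which questions were answered. Helper names are illustrative only.

def score_level(correct: list) -> int:
    """correct[n] is True if the (n+1)'th question of this level was answered correctly."""
    return sum(2 ** n for n, ok in enumerate(correct) if ok)

def count_correct(mask: int) -> int:
    return bin(mask).count("1")

# Example: the user answered questions 1 and 3 of level 1 correctly -> 1 + 4 = 5.
S1L1 = score_level([True, False, True])
S1L2 = score_level([True, True, False])
S1L3 = score_level([False, False, False])
S1Boring = 2      # number of the 9 questions with no recognizable answer

too_easy = all(count_correct(m) >= 2 for m in (S1L1, S1L2, S1L3))
too_difficult = all(count_correct(m) <= 1 for m in (S1L1, S1L2, S1L3))
print(S1L1, S1L2, S1L3, "too easy:", too_easy, "too difficult:", too_difficult)
```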
  • results of such game scripts may be repeatedly used for a single user, for example, at different times of day, until reliable results are accumulated.
  • results of such game scripts are processed on a personal computer that controls the toy of the user in question, and then sent over a network to a computer on a server of a toy system.
  • Results sent to a computer on a server may or may not comprise information regarding a user's identity.
  • game results of a multiplicity of users are used in order to derive statistical data related to a sample group as a whole as described below. In such a case, no information related to users' identities is typically required.
  • FIG. 130 describes by means of a flowchart an example of a procedure of handling game results of a plurality of users, for example, members of a sample group, in accordance with a preferred embodiment of the present invention.
  • the same categories of game result are used that are described in FIG. 129.
  • the previously described results of individual users are now treated at the sample group level.
  • a learning procedure is modified in case the game scripts turn out to be either too easy or too difficult for a large number of users from among a sample group (30% or more in the present example).
  • the procedure of FIG. 128 is repeated for a sample group of users of either a higher or a lower age group, depending on whether the game scripts were too difficult or too easy as shown in FIG. 130.
  • this example also illustrates a method of handling a case, described in FIG. 128, where sample group results are initially not meaningful enough to serve as learning data for the entire community of users on a toy system.
  • the final stage of the procedure described in FIG. 130 comprises using sample group results in order to change the content that is sent to toys on a toy system, possibly affecting the entire community of users on such a system.
  • a possible result of the procedure of FIG. 130 is that a large number of users from among the sample group members (e.g. 50% or more) are found to have succeeded in game script 1 but not in game scripts 2 and 3 .
  • such a result possibly implies that many users of the age group in question have a well developed understanding of the concept of “bigger”, yet have not yet acquired sufficient understanding of the concepts of “faster” and “heavier”.
  • a toy system preferably generates a toy content package where knowledge of “what is bigger” is assumed, and which is intended to facilitate acquisition of the concepts of “faster” and “heavier”. Such a content package is then made available for sending to users of the age group concerned from among the entire community of users on a toy system or any subgroup thereof.
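  • The sketch below is a hypothetical illustration of the sample-group analysis described above: if 30% or more of the group found the game too easy or too difficult, the learning procedure is repeated with a different age group; otherwise, if 50% or more succeeded only in game script 1, a content package teaching "faster" and "heavier" is prepared. The data structure and function names are invented; only the thresholds and the decision logic follow the example in the text.

```python
# Hypothetical sketch of the group-level analysis described above.
# Thresholds (30%, 50%) follow the example in the text; everything else is invented.

def analyze_group(results: list) -> str:
    n = len(results)
    too_easy_share = sum(r["too_easy"] for r in results) / n
    too_hard_share = sum(r["too_difficult"] for r in results) / n
    if too_easy_share >= 0.30:
        return "repeat learning procedure with an older age group"
    if too_hard_share >= 0.30:
        return "repeat learning procedure with a younger age group"
    # e.g. 50% or more succeeded in script 1 ("bigger") but not in scripts 2 and 3
    only_script1 = sum(r["passed"] == {1} for r in results) / n
    if only_script1 >= 0.50:
        return "send content package teaching 'faster' and 'heavier'"
    return "no content change"

sample = [
    {"too_easy": False, "too_difficult": False, "passed": {1}},
    {"too_easy": False, "too_difficult": False, "passed": {1}},
    {"too_easy": False, "too_difficult": True,  "passed": set()},
    {"too_easy": False, "too_difficult": False, "passed": {1, 2}},
]
print(analyze_group(sample))   # -> send content package teaching 'faster' and 'heavier'
```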
  • FIG. 131 is a simplified flowchart illustration of a procedure whereby results of a previously performed learning procedure are used in order to customize toy content to a user of a toy system.
  • a user receives a game script that is intended to allow determination of one or more characteristics of that user. For example, a trivia game script such as that described in FIGS. 6 - 9 is used.
  • a process of analyzing game results for a single user is provided, such as, for example, the procedure described in FIG. 129. Game results of that user are then matched with results of a previously performed learning procedure at a sample group level, such as, for example, the procedure described in FIG. 130.
  • content is customized to the said user depending on the results of the said matching step. For example, if a user has succeeded in game script 1 but not in game scripts 2 and 3, then a previously generated content package that is especially intended for users with such game results, as described above, is customized to the user in the present example.
  • the software components of the present invention may, if desired, be implemented in ROM (read-only memory) form.
  • the software components may, generally, be implemented in hardware, if desired, using conventional techniques.

Abstract

An adaptive toy system including a multiplicity of interactive toys, each of which is connected to a computer network and adaptive toy operation software which is supplied to the multiplicity of interactive toys via the computer network, the adaptive toy operation software being operative to provide feedback, based on play experience with at least some of the multiplicity of interactive toys, via the computer network and to employ the feedback in adapting itself so as to change the play experience provided thereby.

Description

    FIELD OF THE INVENTION
  • The present invention relates to networked electronic devices. [0001]
  • BACKGROUND OF THE INVENTION
  • Networked electronic devices are known. [0002]
  • Wireless toys are also known. [0003]
  • The disclosures of all publications mentioned in the specification and of the publications cited therein are hereby incorporated by reference. [0004]
  • SUMMARY OF THE INVENTION
  • The present invention seeks to provide toy apparatus for electronic shopping. [0005]
  • There is thus provided in accordance with a preferred embodiment of the present invention, a multiplicity of interactive toys, each of which is connected to a computer network and adaptive toy operation software which is supplied to the multiplicity of interactive toys via the computer network, the adaptive toy operation software being operative to provide feedback, based on play experience with at least some of the multiplicity of interactive toys, via the computer network and to employ the feedback in adapting itself so as to change the play experience provided thereby. [0006]
  • There is also provided in accordance with another preferred embodiment of the present invention, a multiplicity of interactive entertainment units, each of which is connected to a computer network and adaptive entertainment software which is supplied to the multiplicity of interactive entertainment units via the computer network, the adaptive entertainment software being operative to provide feedback, based on user experience with at least some of the multiplicity of interactive entertainment units via the computer network and to employ the feedback in adapting itself so as to change the entertainment experience provided thereby. [0007]
  • Also provided, in accordance with another preferred embodiment of the present invention, is a method for establishing a network of toys, the method including providing a plurality of scripts for at least some of the network of toys and sending at least one of the plurality of scripts to at least one of the toys in the network, over the network. [0008]
  • Also provided, in accordance with another preferred embodiment of the present invention, is a toy system including an electronic toy content shop providing users with an option to pre-purchase accounts for users and a plurality of networked toys directly connected via a network to the electronic toy content store, the toys being operative to load themselves with at least one script sold at the electronic toy content store, wherein only a subset of the scripts sold at the electronic toy content shop are displayed to a user, depending on at least one personal characteristic of the user. [0009]
  • Also provided, in accordance with another preferred embodiment of the present invention, is a toy system providing multi-level interaction between a population of users and a population of toys, the system including at least one script operative to pose at least one question to at least one user about a topic other than the user's characteristics, each of the scripts operative to analyze the user's answer and act upon its content and to derive knowledge about the user's characteristics from his answer. [0010]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will be understood and appreciated from the following detailed description, taken in conjunction with the drawings in which: [0011]
  • FIG. 1 is a simplified semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention; [0012]
  • FIG. 2 is a more detailed, semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention; [0013]
  • FIG. 3 is a semi-pictorial semi-block-diagram illustration of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention; [0014]
  • FIG. 4 is a flowchart that describes an example of a sample group procedure of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention; [0015]
  • FIG. 5 is a simplified flowchart illustration of a suitable procedure of building a task for a user based on results of a sample group as provided by an adaptive pattern learning system in accordance with a preferred embodiment of the present invention; [0016]
  • FIG. 6A is a simplified script diagram illustration of a main artificial life script in accordance with a preferred embodiment of the present invention; [0017]
  • FIG. 6B is a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 6A which is provided in order to explain the script diagram notation of FIG. 6A; [0018]
  • FIG. 7A is a simplified script diagram illustration of an example of an artificial life script that provides a level 1 (i.e. simple) game in accordance with a preferred embodiment of the present invention; [0019]
  • FIGS. 7B-7C, taken together, form a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 7A; [0020]
  • FIG. 8 is a simplified script diagram illustration of an example of an artificial life script that provides a level 2 (i.e. normal) game in accordance with a preferred embodiment of the present invention; [0021]
  • FIG. 9 is a simplified script diagram illustration of an example of an artificial life script that provides a level 3 (i.e. harder) game in accordance with a preferred embodiment of the present invention; [0022]
  • FIG. 10 is an example of a simplified screen display for a “play” function in the player; [0023]
  • FIG. 11 is a simplified illustration of a “textbox” screen display of the programming function of the player, constructed and operative in accordance with a preferred embodiment of the present invention; [0024]
  • FIG. 12 is a table that describes the programming feature of the player in accordance with a preferred embodiment of the present invention; [0025]
  • FIG. 13 is a semi-block diagram semi-flowchart illustration of the “personal” function of a player in accordance with a preferred embodiment of the present invention; [0026]
  • FIG. 14 is a semi-block diagram semi-flowchart illustration of the “club” function of a player in accordance with a preferred embodiment of the present invention; [0027]
  • FIG. 15 is a semi-block diagram semi-flowchart illustration of the “shop” function of a player in accordance with a preferred embodiment of the present invention; [0028]
  • FIG. 16 is a simplified flowchart illustration of a registration procedure provided in accordance with a preferred embodiment of the present invention; [0029]
  • FIG. 17 is a simplified flowchart illustration of an example procedure of sending a request message from a user to a server in accordance with a preferred embodiment of the present invention; [0030]
  • FIG. 18 is a simplified flowchart illustration of a suitable procedure of creating a new group for the users' club in accordance with a preferred embodiment of the present invention; [0031]
  • FIG. 19 is a simplified flowchart illustration of a suitable procedure of leaving a group of the users' club in accordance with a preferred embodiment of the present invention; [0032]
  • FIG. 20 is a simplified flowchart illustration of a suitable procedure of viewing group members provided by the users' club in accordance with a preferred embodiment of the present invention; [0033]
  • FIGS. 21 to 24 are simplified flowchart illustrations which, taken together, describe an example of a search procedure provided by the club in accordance with a preferred embodiment of the present invention; [0034]
  • FIG. 25 is a simplified flowchart illustration of a suitable procedure for sending a message to another user in accordance with a preferred embodiment of the present invention; [0035]
  • FIG. 26 is a simplified flowchart illustration of a suitable procedure for sending a script to another user in accordance with a preferred embodiment of the present invention; [0036]
  • FIG. 27 is a simplified flowchart illustration of a suitable implementation of send procedure D in FIGS. 25 and 26, constructed and operative in accordance with a preferred embodiment of the present invention; [0037]
  • FIG. 28A is a simplified flowchart illustration of a suitable procedure for adding a user to a contact list in accordance with a preferred embodiment of the present invention; [0038]
  • FIG. 28B is a simplified flowchart illustration of a suitable procedure for removing a user from a contact list in accordance with a preferred embodiment of the present invention; [0039]
  • FIG. 29 is a simplified flowchart illustration of a suitable account update procedure provided in accordance with a preferred embodiment of the present invention; [0040]
  • FIG. 30 is a simplified flowchart illustration of a suitable subject update procedure provided in accordance with a preferred embodiment of the present invention; [0041]
  • FIG. 31 is a simplified flowchart illustration of a suitable procedure for ignoring a message and/or a user in accordance with a preferred embodiment of the present invention; [0042]
  • FIG. 32 is a simplified flowchart illustration of a suitable procedure for adding fields to a user's contact list in accordance with a preferred embodiment of the present invention; [0043]
  • FIGS. 33 to 35 are simplified flowchart illustrations of three respective procedures for "adding to basket", which procedures are preferably provided by the electronic shop of the present invention; [0044]
  • FIG. 36 is a simplified flowchart illustration of a suitable implementation of a “remove from basket” procedure preferably provided by the electronic shop of the present invention; [0045]
  • FIG. 37 is a simplified flowchart illustration of a suitable search procedure preferably provided by the electronic shop of the present invention; [0046]
  • FIG. 38 is a simplified flowchart illustration of a suitable procedure for “winning credit points” preferably provided by the electronic shop of the present invention; [0047]
  • FIG. 39 is a simplified flowchart illustration of a suitable procedure of paying with credit points preferably provided by the electronic shop of the present invention; [0048]
  • FIG. 40 is a simplified illustration of a screen display of the “My account” function of the club provided in accordance with a preferred embodiment of the present invention; [0049]
  • FIG. 41 is a simplified illustration of a screen display of the “Contact list” function of the club in accordance with a preferred embodiment of the present invention; [0050]
  • FIG. 42 is a simplified pictorial illustration of a screen display of the “Group” function of the club provided in accordance with a preferred embodiment of the present invention; [0051]
  • FIG. 43 is a simplified pictorial illustration of a screen display of the “Search user” function of the club in accordance with a preferred embodiment of the present invention; [0052]
  • FIG. 44 is a simplified pictorial illustration of a screen display of the “Send message/scripts” function of the club in accordance with a preferred embodiment of the present invention; [0053]
  • FIG. 45 is a simplified pictorial illustration of a screen display of the “Interests” form provided in accordance with a preferred embodiment of the present invention; [0054]
  • FIG. 46 is a simplified pictorial illustration of a screen display of the “Registration” form provided in accordance with a preferred embodiment of the present invention; [0055]
  • FIG. 47 is a simplified pictorial illustration of a screen display of the “Select content” function preferably provided by the electronic shop of the present invention; [0056]
  • FIG. 48 is a simplified pictorial illustration of a screen display of the “Select packages” function preferably provided by the electronic shop of the present invention; [0057]
  • FIG. 49 is a simplified pictorial illustration of a screen display of the “View Account” function preferably provided by the electronic shop of the present invention; [0058]
  • FIG. 50 is an illustration of a Living Object base station; [0059]
  • FIG. 51 is an illustration of a Living Object Toy; [0060]
  • FIG. 52 is a screen display of a Scriptwriter icon on desktop; [0061]
  • FIG. 53 is a screen display of a Living Object Scriptwriter main screen; [0062]
  • FIG. 54 is a “select tools—options” screen window display; [0063]
  • FIG. 55 is a “toy” screen window display; [0064]
  • FIG. 56 is a “hardware” screen window display; [0065]
  • FIG. 57 is a “talk icon” screen display; [0066]
  • FIG. 58 is a Scriptwriter system's main screen display with the added talk object; [0067]
  • FIG. 59 is an illustration of the Scriptwriter main screen display with the added talk object connected by a line to the start object; [0068]
  • FIG. 60 is an illustration of the screen display of the action toolbar with the save icon; [0069]
  • FIG. 61 illustrates the screen display for naming and saving the script; [0070]
  • FIG. 62 illustrates a screen window display of a combo box for typing the toy's speech; [0071]
  • FIG. 63 is a screen window display for recording sound to be played by the toy, wherein the toy's speech can be recorded through the toy or through the computer's microphone; [0072]
  • FIG. 64 is a screen window display for saving a recording; [0073]
  • FIG. 65 is a screen window display for selecting a “wave” file to be played by the toy; [0074]
  • FIG. 66 illustrates a Listen icon; [0075]
  • FIG. 67 is a screen display of a part of the Scriptwriter main window with the Listen object added; [0076]
  • FIG. 68 is an example of the “Listen and Sense” screen window display; [0077]
  • FIG. 69 illustrates the “Keyword link box” in the “Choose Link” screen display; [0078]
  • FIG. 70 shows the Scriptwriter main screen display with a Listen object linked to corresponding Talk objects; [0079]
  • FIG. 71 shows a Run-Run screen window display; [0080]
  • FIG. 72 shows the Sample error message screen window display; [0081]
  • FIGS. 73A and 73B show a table of the functions provided by the Scriptwriter with their icons as presented on the Scriptwriter main screen display; [0082]
  • FIGS. 74-99 are simplified illustrations of examples of screen displays which may be generated by the scriptwriter system shown and described herein; [0083]
  • FIG. 100 is a dependence table useful in building an artificial life toy and environment constructed and operative in accordance with a preferred embodiment of the present invention; [0084]
  • FIG. 101 is a formula table useful in building an artificial life toy and environment constructed and operative in accordance with a preferred embodiment of the present invention; [0085]
  • FIGS. 102-124 are simplified illustrations of examples of screen displays which may be generated by the scriptwriter system shown and described herein; [0086]
  • FIG. 125 is a table that presents an interactive script between a toy and a player where the toy determines the characteristics of the player (namely, age range) to suggest the appropriate level of game content; [0087]
  • FIG. 126 is a table that stores an example of a list of multilevel questions; [0088]
  • FIG. 127 is a script diagram of an interactive script where a toy determines the characteristics of a player (e.g. age range) to suggest the appropriate level of game content; [0089]
  • FIG. 128 is a simplified flowchart illustration of a learning process for a toy system constructed and operative in accordance with a preferred embodiment of the present invention; [0090]
  • FIG. 129 is a simplified flowchart illustration of a process for analyzing a user's game results constructed and operative in accordance with a preferred embodiment of the present invention; [0091]
  • FIG. 130 is a simplified flowchart illustration of a method for analyzing game results of a group of users, constructed and operative in accordance with a preferred embodiment of the present invention; and [0092]
  • FIG. 131 is a simplified flowchart illustration of a process for customizing toy content to a user of a toy system, based on results of a learning procedure. [0093]
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • According to a preferred embodiment of the present invention, some or all of the following electronic elements are provided in combination: [0094]
  • a. An interactive development environment (IDE) which allows a user to "play god" by generating scripts, [0095]
  • b. an IDE player to run the scripts [0096]
  • c. an electronic shop on the Internet [0097]
  • d. an electronic on-line club. [0098]
  • The electronic shop typically provides a credit account for a child user which the parent opens for him. Generally, the software enables persons, typically adults, to buy other persons, typically children, a present in a store; the present may comprise a fixed-sum account in that store, and the software includes book-keeping capability. [0099]
  • Typically, an electronic voucher is provided. The child can be granted, by the parent, 100 credit points which serve as money (tender) issued by a particular electronic store. The child may be granted a gift certificate. A particular advantage of a preferred embodiment of the present invention is that an individual, such as a child, who does not own a credit card can have an independent electronic shopping experience. [0100]
  • Preferably, there is parametrization or filtering of child purchases as part of the conditions of the voucher or gift certificate. For example, the child may be entitled to buy only toys for his age-bracket, not computer toys, or only computer toys. [0101]
  • Preferably, the child is prompted to earn credit points e.g. by agreeing to hear/view, and actually hearing and/or viewing, advertising. Preferably, filtering parameters imposed by the parents are also applied to the advertising which is presented to the child. Preferably, the child's response to advertising is monitored to verify exposure to the advertising message and the monitoring information is provided to the advertisement provider. [0102]
  • Preferably, certain operations on the part of the child require parental approval which is given e.g. by the parent supplying his credit card number at the appropriate juncture in response to a prompt. Other than at these junctures, the child can operate as an independent consumer within the limitations imposed by the voucher or gift certificate. [0103]
  • Preferably, the vehicle with which the child interacts comprises inter alia or exclusively a toy figure such as a teddy bear. The teddy bear is typically purchased in conjunction with a CD-ROM or other software vehicle which the parent installs on the computer. Once the software is installed, the software preferably is operable even by a very small child, either by means of a script which actuates the toy figure to interact with the child, and/or by means of suitable simple on-screen input devices, such as buttons, with which the child can perform operations such as opening an Internet shop. [0104]
  • The Internet shop preferably allows a child or parent to enter a site and buy a desired number of credit points, e.g. with a credit card. The shop preferably has an option whereby the user can check how many credit points s/he has, and what filters or restrictions, if any, apply to these credit points. [0105]
  • Filters typically include content filters. For example, a child is entitled to purchase only educational toys, or only games which pertain to history. Examples of subfilters are “only USA history”, “only the Napoleonic period”, and so on. [0106]
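  • Where helpful, the voucher or gift-certificate conditions described above can be thought of as a small rule set checked at purchase time. The following Python sketch is purely illustrative; all field names, categories and values in it are assumptions rather than anything defined by the present specification.

      # Hypothetical sketch of voucher filtering at purchase time; all names are illustrative.
      from dataclasses import dataclass, field

      @dataclass
      class Voucher:
          credit_points: int
          allowed_age_brackets: set = field(default_factory=set)   # e.g. {"6-8"}
          allowed_categories: set = field(default_factory=set)     # e.g. {"history"}
          allowed_subcategories: set = field(default_factory=set)  # e.g. {"USA history"}

      def purchase_allowed(voucher, item):
          """Return True only if the item passes every filter attached to the voucher."""
          if item["price"] > voucher.credit_points:
              return False
          if voucher.allowed_age_brackets and item["age_bracket"] not in voucher.allowed_age_brackets:
              return False
          if voucher.allowed_categories and item["category"] not in voucher.allowed_categories:
              return False
          if voucher.allowed_subcategories and item["subcategory"] not in voucher.allowed_subcategories:
              return False
          return True

      # Example: a child limited to USA-history games within the 6-8 age bracket.
      v = Voucher(100, {"6-8"}, {"history"}, {"USA history"})
      game = {"price": 5, "age_bracket": "6-8", "category": "history", "subcategory": "USA history"}
      assert purchase_allowed(v, game)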
  • Sold goods typically comprise products, typically paid for by a lump sum, or content services, including games and books which are periodically updated, typically paid for by a monthly charge. For example, a parent may purchase, for his child, 5 credit points' worth of U.S. history games per month. Alternatively or in addition, learning motivation may be generated by allowing a child to earn up to 5 credit points per month toward toy purchases by playing 5 credit points' worth of mathematical games per month, at his level of skill. [0107]
  • A toy such as a teddy bear may serve as a “tutor” or “governess” and teach the child the games which the child selects. The games may be initiated at the child's request or the toy may initiate these sessions, at times or in response to prompts which are system determined or selected by a user such as an adult. In effect, the brain of the toy is updatable via Internet and therefore an adult can buy pieces of the toy's brain for a child. [0108]
  • Preferably, the controlling computer comprises a scheduler such that each of a plurality of content elements may be set to be executed at different child- or adult-selected times or in response to different prompts. The toy typically initiates these sessions. For example, the toy may call for the child at 4 pm and if the child answers the toy initiates a learning session on the topic of Napoleon. At 5:30 pm the toy tries to initiate a learning session on multiplication. [0109]
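  • The scheduler just described can be pictured as a list of (time, content element) entries that the controlling computer services. The Python sketch below is only an illustration, and the session topics and times are taken from the example in the preceding paragraph rather than mandated anywhere in the specification.

      # Illustrative content scheduler; structure and names are assumptions.
      import datetime
      import sched
      import time

      def start_session(topic):
          # In the real system the toy would call for the child and, if the child answers,
          # run the corresponding script; here we only print the intent.
          print(f"Toy initiates a learning session on: {topic}")

      scheduler = sched.scheduler(time.time, time.sleep)

      def schedule_daily(hour, minute, topic):
          """Schedule one session at the next occurrence of hour:minute."""
          now = datetime.datetime.now()
          run_at = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
          if run_at <= now:
              run_at += datetime.timedelta(days=1)
          scheduler.enterabs(run_at.timestamp(), 1, start_session, argument=(topic,))

      schedule_daily(16, 0, "Napoleon")         # 4:00 pm history session
      schedule_daily(17, 30, "multiplication")  # 5:30 pm arithmetic session
      # scheduler.run()  # would block until both sessions have been initiated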
  • A particular advantage of a preferred embodiment of the present invention is that it enables efficient processing of “nickel and dime” purchases and/or micro-payments, typically via the Internet and without resorting to use of smart cards. [0110]
  • Preferably, a user club or affinity group is provided which may be similar to conventional ICQ systems. However, unlike conventional ICQ systems, messages are preferably delivered orally by a toy such as a teddy bear, the teddy bear therefore acting like a secretary/messenger. Through the affinity group, a child can write a joke or generate a script or game, using a suitable development environment, and can post the product of his efforts for a friend in the same user group. [0111]
  • Preferably a portal is provided which sells gift certificates from any of a multiplicity of sites. Preferably, the value of the gift certificate which the user can buy exceeds its purchase price. [0112]
  • Preferably, a post-to-web feature is provided, allowing vendor access to the Internet shop of the present invention. The post-to-web is a tool allowing vendors to post toy and game items for sale in an Internet toy and game shop. Preferably, filtering parametrization can also be provided by the vendor such that customers, typically adults, can filter purchases for the intended recipients, typically children. [0113]
  • Preferably, a personification feature is provided. Personification is a feature of a toy, a doll or another interactive entertainment unit having a defined persona, such as a known comic figure, action figure or human celebrity. The personified unit presents the user with voice, intonation and mimics typical of the personified figure. The personification mechanism enables a content developer to develop generic content such as a song or a story or educational material (e.g. biology) or an information item (e.g. news) and post it to the web. The personification mechanism also enables each personified unit to download a personified version of the generic content to present to the user with the voice, intonation, gestures and other characteristics typical of a personified figure, such as a known comic figure, action figure or human celebrity. [0114]
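  • The division of labor described above (generic content authored once, personified delivery per unit) can be pictured as a small transformation applied when the content is downloaded. The Python sketch below is purely illustrative; the persona names, fields and rendering function are assumptions, since the specification does not define a data format for personas.

      # Illustrative personification step; persona names and fields are assumptions.
      PERSONAS = {
          "storyteller": {"voice": "man", "intonation": "warm", "catchphrase": "Gather round!"},
          "robot_hero":  {"voice": "boy", "intonation": "staccato", "catchphrase": "Systems online."},
      }

      def personify(generic_content, persona_name):
          """Wrap generic content (e.g. a news item or song text) with a persona's delivery traits."""
          persona = PERSONAS[persona_name]
          return {
              "text": f'{persona["catchphrase"]} {generic_content}',
              "voice": persona["voice"],
              "intonation": persona["intonation"],
          }

      print(personify("Today we will learn about frogs.", "storyteller"))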
  • Preferably, an artificial life feature is provided. The system of the present invention comprises a script actuation database operative to receive output from at least one operating script and to actuate at least one additional script when certain conditions are fulfilled by the combined output of the operating scripts. An impression of artificial life is generated by the accumulation of conditions from various scripts, imbuing activation with a live quality which is substantially not anticipatable. [0116]
  • Preferably, scripts may be scheduled. Preferably, the system also provides time-based selection of single actions. [0117]
  • Preferably, a script for at least one doll includes in it activation of another script for another doll when certain conditions are fulfilled. [0118]
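  • One way to read the script actuation database described above is as a table of conditions over the accumulated outputs of running scripts, with a further script actuated whenever one of the conditions becomes true. The Python sketch below is a minimal illustration under assumed names; the actual conditions and outputs would be defined per toy and per script.

      # Minimal sketch of a script actuation database; rules and names are assumptions.
      class ScriptActuationDB:
          def __init__(self):
              self.outputs = {}   # outputs accumulated from operating scripts
              self.rules = []     # (condition over outputs, script to actuate)

          def add_rule(self, condition, script_name):
              self.rules.append((condition, script_name))

          def report(self, key, value):
              """Called by an operating script to report one output value."""
              self.outputs[key] = value
              return [script for condition, script in self.rules if condition(self.outputs)]

      db = ScriptActuationDB()
      # Actuate a further script only when two different scripts have both reported
      # suitable outputs, so behaviour depends on accumulated, hard-to-anticipate history.
      db.add_rule(lambda o: o.get("S1Boring", 0) >= 2 and o.get("time_of_day") == "evening",
                  "bedtime_story.script")

      db.report("S1Boring", 2)                     # no rule fires yet
      print(db.report("time_of_day", "evening"))   # -> ['bedtime_story.script']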
  • Preferably, the system of the present invention resides on a CD-ROM storing an IDE player plus scripts therefor plus, optionally, a shop for buying more scripts. [0119]
  • Preferably, a “cyberbrain” feature is provided whereby each toy grows up on the basis of its own unique experiences, much as a child does. The toy preferably adapts its contents to a user thereof as the toy learns more about that user. [0120]
  • Optionally, the education of each toy is provided not only by the child owner of that particular toy but by a total population of toy owners linked over the Internet. The toy uses its experiences or impressions of its own child-owner and optionally of a group of children to which his child-owner belongs in order to become a better teacher or companion. Therefore, a particular toy, when bought by a first child, is not identical to the same toy bought subsequently by a second child because the toy is preferably constantly learning, as each child's experience is transmitted to a server which shares that experience with the entire virtual community of toys. Each day, or periodically, each toy in the community of toys develops, becomes smarter, and becomes a better companion. [0121]
  • For example, a toy's content software may comprise 3 jokes. The toy may learn, for example, that children prefer one joke over the remaining two, in which case the toy is fed new parameters and begins using that joke in preference over the other jokes. The toy may also learn that none of the 3 jokes available pass a satisfaction threshold of, say, 20%, in which case a content developer develops other jokes and downloads these replacement jokes to all toys in the virtual community. [0122]
  • When the toy calls its server, it may receive new parameters and/or new contents. [0123]
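  • The joke example above amounts to tracking a per-joke satisfaction rate across the toy community and either re-weighting the existing jokes or requesting new content when none passes the threshold. The Python sketch below makes the arithmetic explicit; the 20% threshold comes from the text, while the data layout and function names are assumptions.

      # Sketch of community-level joke selection; data layout is assumed, threshold from the text.
      SATISFACTION_THRESHOLD = 0.20

      def analyse_jokes(reactions):
          """reactions maps joke id -> list of 0/1 satisfaction flags gathered from all toys."""
          rates = {joke: sum(flags) / len(flags) for joke, flags in reactions.items() if flags}
          if not rates or max(rates.values()) < SATISFACTION_THRESHOLD:
              return {"action": "request_new_content"}   # a content developer writes replacement jokes
          preferred = max(rates, key=rates.get)
          return {"action": "update_parameters", "preferred_joke": preferred, "rates": rates}

      reactions = {"joke1": [1, 1, 0, 1], "joke2": [0, 0, 1, 0], "joke3": [0, 0, 0, 0]}
      print(analyse_jokes(reactions))   # joke1 is preferred, so new parameters favour it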
  • Preferably, a micro marketing feature is provided. The micro marketing feature enables the online service provider to identify communities, or affinity groups, of users that have something in common, such as sharing a similar interest or preference or need. The online service can provide, or suggest the provisioning of, selected appropriate content, such as educational, informational or promotional content, to the appropriate community or affinity group. [0124]
  • FIG. 1 is a simplified semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention. [0125]
  • FIG. 2 is a more detailed, semi-pictorial semi-block-diagram illustration of an interactive toy web system comprising a learning machine in accordance with a preferred embodiment of the present invention. [0126]
  • FIG. 3 is a semi-pictorial semi-block-diagram illustration of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention. [0127]
  • FIG. 4 is a flowchart that describes an example of a sample group procedure of an adaptive pattern learning system in accordance with a preferred embodiment of the present invention. [0128]
  • FIG. 5 is a simplified flowchart illustration of a suitable procedure of building a task for a user based on results of a sample group as provided by an adaptive pattern learning system in accordance with a preferred embodiment of the present invention. [0129]
  • According to a preferred embodiment of the present invention, there is provided a web system, comprising a plurality of networked entertainment units such as networked toys, and a learning machine allowing the web system to learn from the interaction/s between at least one user and at least one entertainment unit. Preferably, the learning machine is implemented by a system of scripts, termed herein “artificial life” scripts, which breathe artificial life into at least one of the networked entertainment units. [0130]
  • The term “artificial life” is intended to refer to scripts which cause at least one entertainment unit to respond not only to a current user interaction situation but also to take the past into account. These scripts typically operate in the background during interaction of a toy with a user. Each “artificial life” script is parametrized, each parameter not being fixed but rather being determined as a function of learned situational or user characteristics. Therefore, each “artificial life” script typically comprises an endless family of scripts having potentially endless variation contained therewithin. Typically, a system of artificial life scripts builds on itself by learning from the interaction of users with the scripts, such that the toy's functioning develops over time as a result of the variation between users in their interactions with their toys, which in turn causes differential parametrization of scripts depending on who has been playing with them. [0131]
  • In accordance with a preferred embodiment of the present invention, an interactive toy web system with a learning machine comprises a feature of artificial life. Typically, individual users on a system interact with artificial life scripts, e.g. in the form of games. Results from interaction of all or some of the users on a system are sent to a server of an interactive toy web system. The results are processed and used in order to modify the scripts sent to individual users. [0132]
  • Following is a description of a learning machine web system based on an artificial life feature of interactive toys in accordance with a preferred embodiment of the present invention. The system described herein has a client side and a server side. [0133]
  • The Client side preferably is operative to perform at least the following functions: [0134]
  • a. Collect game result information. [0135]
  • b. Send information and AL behavior (formulas or parameters) to server. [0136]
  • c. Update AL behavior from server. [0137]
  • The Server side preferably is operative to perform at least the following functions: [0138]
  • a. Get clients' information results and AL behavior (formulas or parameters). [0139]
  • b. Analyze the information; e.g., the server may update a formula in the server database every 1000 clients. [0140]
  • c. Send updated AL behavior to the client. Received and sent data are saved in a history database; e.g., the last 20 client results may be saved to the history database. [0141]
  • For example, a Server database may include the following tables, inter alia: [0142]
  • a. Personal table which includes the following fields: ID, Password, Name, Gender, Birthday, City, Country and Address. [0143]
  • b. History table which includes the following fields: ID, Company, Product, Script, LastRun(Date), ParamList. [0144]
  • c. Script Table which includes the following fields: Script, Company, Product, ParamList, Formula, ScheduleData. [0145]
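  • For concreteness, the three tables above could be created as follows. This is only a sketch (Python with SQLite); the column types are assumptions, since the specification lists only the field names.

      # Sketch of the server database; field names from the text, column types assumed.
      import sqlite3

      conn = sqlite3.connect("al_server.db")
      conn.executescript("""
      CREATE TABLE IF NOT EXISTS Personal (
          ID TEXT PRIMARY KEY, Password TEXT, Name TEXT, Gender TEXT,
          Birthday TEXT, City TEXT, Country TEXT, Address TEXT
      );
      CREATE TABLE IF NOT EXISTS History (
          ID TEXT, Company TEXT, Product TEXT, Script TEXT,
          LastRun TEXT,      -- date of the last run
          ParamList TEXT     -- e.g. serialized game-result parameters
      );
      CREATE TABLE IF NOT EXISTS Script (
          Script TEXT, Company TEXT, Product TEXT,
          ParamList TEXT, Formula TEXT, ScheduleData TEXT
      );
      """)
      conn.commit()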
  • A preferred Session Process which is presented to a client at a desired time set by the client, typically comprises the following stages: [0146]
  • a. Client plays a game. [0147]
  • b. Client computer sends to server the game results (parameters) and the associated AL formula. [0148]
  • c. Server gets all client results and saves them to the database, finds the associated formula in the database, and sends it to the client. [0149]
  • d. Client gets new formula and parameters from server and updates them if necessary. [0150]
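  • The four stages above reduce to a small request/response exchange between the client and the server. The Python sketch below is illustrative only; the in-memory "database", the formula string and all function names are assumptions, since the specification does not fix a wire format.

      # Illustrative client/server session; names and the in-memory "database" are assumptions.
      SERVER_DB = {"history": [], "formulas": {"trivia_v1": "S1Level = f(S1L1, S1L2, S1L3, S1Boring)"}}

      def server_handle(request):
          """c. Server saves the client's results and returns the associated formula."""
          SERVER_DB["history"].append(request["params"])
          SERVER_DB["history"] = SERVER_DB["history"][-20:]   # e.g. keep only the last 20 results
          return {"formula": SERVER_DB["formulas"].get(request["formula_id"])}

      def client_session(game_results, current_formula):
          """a/b/d. Client plays, sends its results, and updates its formula if a new one arrives."""
          reply = server_handle({"params": game_results, "formula_id": "trivia_v1"})
          return reply["formula"] if reply["formula"] else current_formula

      new_formula = client_session({"S1L1": 7, "S1Boring": 0}, current_formula=None)
      print(new_formula)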
  • A Trivia game, comprising a script which can call any of 3 other scripts, is now described. The game described herein includes 9 questions, in groups of three illustrated in FIGS. 7, 8 and 9 respectively. The questions have the following 3 difficulty levels, each associated with 3 questions in the illustrated example: level 1: simple, level 2: normal, level 3: harder. [0151]
  • The trivia game therefore typically comprises the following parameter structure: [0152]
  • A level parameter is defined as: S1Level {values: 1 or 2 or 3}. Results parameters are S1L1, S1L2, S1L3 {S-script, L-level}, S1Boring. [0153]
  • Typical values of the results parameters at the end of a game comprise the following: 0 - no answer, 1 - only answered question #1 from among the three (in the illustrated embodiment) questions within the current level, 2 - only answered #2, 4 - only answered #3, 3 - answered #1 and #2, 5 - answered #1 and #3, 6 - answered #2 and #3, 7 - answered all 3 questions within the current level. [0154]
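  • In other words, each level-result cell is a 3-bit mask: bit 0 for question #1, bit 1 for question #2 and bit 2 for question #3, which is why the Calculation objects described below add 1, 2 or 4 to the cell. A short Python illustration of this encoding (the helper names are ours, not the specification's):

      # The level-result cell as a 3-bit mask: bit 0 = question 1, bit 1 = question 2, bit 2 = question 3.
      def record_correct_answer(result_cell, question_number):
          """Add the weight 1, 2 or 4 for the question (1..3) that was answered correctly."""
          return result_cell + (1 << (question_number - 1))

      def answered(result_cell, question_number):
          return bool(result_cell & (1 << (question_number - 1)))

      cell = 0
      cell = record_correct_answer(cell, 1)   # answered question #1 -> 1
      cell = record_correct_answer(cell, 3)   # answered questions #1 and #3 -> 5
      print(cell, answered(cell, 2))          # 5 False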
  • FIGS. 6A to 9 illustrate a set of artificial life scripts in accordance with a preferred embodiment of the present invention. [0155]
  • FIG. 6A is a simplified script diagram illustration of a main artificial life script in accordance with a preferred embodiment of the present invention. The script can call any of the three levels described in FIGS. 7 to 9 respectively. [0156]
  • FIG. 6B is a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 6A which is provided in order to explain the script diagram notation of FIG. 6A. [0157]
  • FIG. 7A is a simplified script diagram illustration of an example of an artificial life script that provides a level 1 (i.e. simple) game in accordance with a preferred embodiment of the present invention. [0158]
  • FIGS. 7B-7C, taken together, form a simplified flowchart illustration corresponding to the script diagram illustration of FIG. 7A. FIGS. 7B-7C are provided to explain the script diagram notation of FIG. 7A. [0159]
  • FIG. 8 is a simplified script diagram illustration of an example of an artificial life script that provides a level 2 (i.e. normal) game in accordance with a preferred embodiment of the present invention. [0160]
  • FIG. 9 is a simplified script diagram illustration of an example of an artificial life script that provides a level 3 (i.e. harder) game in accordance with a preferred embodiment of the present invention. [0161]
  • The script diagram notation of FIGS. 8 and 9 is similar to the script diagram notation of FIGS. 6A and 7A. [0162]
  • A “Main” Artificial Life Script, illustrated in FIG. 6, which sends the toy to one of the script portions of FIGS. 7-9, each script portion representing a level, is now described. [0163]
  • A preferred Objects Description for the “main” artificial life script of FIGS. 6A-6B is as follows: [0164]
  • Start1(Start): Starting point for execution. [0165]
  • Memory1(Memory): Sets memory cell <S1L1> to “0”. [0166]
  • Memory2(Memory): Sets memory cell <S1L2> to “0”. [0167]
  • Memory3(Memory): Sets memory cell <S1L3> to “0”. [0168]
  • Memory4(Memory): Sets memory cell <S1Boring> to “0”. [0169]
  • Condition1(Condition): Follow the true branch if the value of memory cell <S1Level> is equal to “1”, or the false branch otherwise. [0170]
  • Condition2(Condition): Follow the true branch if the value of memory cell <S1Level> is equal to “2”, or the false branch otherwise. [0171]
  • Script1(Script): Runs the “C:/CreatorIDE/S1_Level3.script” script of FIG. 9. [0172]
  • Script2(Script): Runs the “C:/CreatorIDE/S1_Level2.script” script of FIG. 8. [0173]
  • Script3(Script): Runs the “C:/CreatorIDE/S1_Level1.script” script of FIG. 7. [0174]
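  • Read together, the objects above amount to: clear the four result cells, test S1Level, and run the level-1, level-2 or level-3 script accordingly. The Python sketch below assumes that the true branch of Condition1 leads to the level-1 script and the true branch of Condition2 to the level-2 script; the actual branch wiring is shown only in FIG. 6A.

      # Sketch of the "main" artificial life script of FIGS. 6A-6B; branch wiring is assumed.
      memory = {"S1Level": 1, "S1L1": 0, "S1L2": 0, "S1L3": 0, "S1Boring": 0}

      def run_script(path):
          print("running", path)   # stand-in for the IDE player executing a .script file

      def main_script(memory):
          for cell in ("S1L1", "S1L2", "S1L3", "S1Boring"):
              memory[cell] = 0                                   # Memory1..Memory4
          if memory["S1Level"] == 1:                             # Condition1
              run_script("C:/CreatorIDE/S1_Level1.script")       # Script3
          elif memory["S1Level"] == 2:                           # Condition2
              run_script("C:/CreatorIDE/S1_Level2.script")       # Script2
          else:
              run_script("C:/CreatorIDE/S1_Level3.script")       # Script1

      main_script(memory)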
  • A preferred Objects Description for The Level 1 Artificial Life Script of FIG. 7 is now described: [0175]
  • Start1(Start): Starting point for execution. [0176]
  • Talk1(Talk): Say “table or cup?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0177]
  • ListenAndSense1(ListenAndSense): Listens for one of the keywords (table, cup) for 5 seconds. [0178]
  • Talk2(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0179]
  • Talk3(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0180]
  • Talk4(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0181]
  • Calculation1(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0182]
  • Calculation2(Calculation): Set the value of memory cell <S1L1> to the sum of the value of memory cell <S1L1> and “1” (if the operation is invalid, the cell is cleared). [0183]
  • Talk5(Talk): Say “TV or car?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0184]
  • ListenAndSense2(ListenAndSense): Listens for one of the keywords (TV, car) for 5 seconds. [0185]
  • Talk6(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0186]
  • Talk7(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0187]
  • Talk8(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0188]
  • Calculation3(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0189]
  • Calculation4(Calculation): Set the value of memory cell <S1L1> to the sum of the value of memory cell <S1L1> and “2” (if the operation is invalid, the cell is cleared). [0190]
  • Talk9(Talk): Say “piano or book?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0191]
  • ListenAndSense3(ListenAndSense): Listens for one of the keywords (piano, book) for 5 seconds. [0192]
  • Talk10(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0193]
  • Talk11(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0194]
  • Talk12(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0195]
  • Calculation5(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0196]
  • Calculation6(Calculation): Set the value of memory cell <S1L1> to the sum of the value of memory cell <S1L1> and “4” (if the operation is invalid, the cell is cleared). [0197]
  • End1(End): Execution ends here. [0198]
  • Text1(Text) [0199]
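  • Each level script is three repetitions of the same pattern: ask a two-choice question, listen for a keyword for 5 seconds, acknowledge the answer, and update either the level-result cell or S1Boring. The Python sketch below renders one such round; which branch feeds which Calculation object, and which choice counts as "correct", are assumptions here, since that wiring appears only in the figure.

      # Sketch of one Level-1 question round (FIG. 7A); branch assignments are assumptions.
      def question_round(memory, prompt, keywords, correct, heard, result_cell, weight):
          print(prompt)                          # Talk: ask the two-choice question
          if heard not in keywords:              # ListenAndSense: no keyword within 5 seconds
              print("not understood")            # Talk: fallback
              memory["S1Boring"] += 1            # Calculation: count a non-answer
          elif heard == correct:
              print("good answer")               # Talk: acknowledge
              memory[result_cell] += weight      # Calculation: add this question's weight (1, 2 or 4)
          else:
              print("wrong answer")              # Talk: acknowledge

      memory = {"S1L1": 0, "S1Boring": 0}
      # Simulate the child answering "cup" (assumed here to be the correct choice) to question #1.
      question_round(memory, "table or cup?", ("table", "cup"), "cup", "cup", "S1L1", 1)
      print(memory)   # -> {'S1L1': 1, 'S1Boring': 0}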
  • A preferred Objects Description for The Level 2 Artificial Life Script of FIG. 8 is now described: [0200]
  • Start1(Start): Starting point for execution. [0201]
  • Talk1(Talk): Say “turtle or dog?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0202]
  • ListenAndSense1(ListenAndSense): Listens for one of the keywords (turtle, dog) for 5 seconds. [0203]
  • Talk2(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0204]
  • Talk3(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0205]
  • Talk4(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0206]
  • Calculation1(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0207]
  • Calculation2(Calculation): Set the value of memory cell <S1L2> to the sum of the value of memory cell <S1L2> and “1” (if the operation is invalid, the cell is cleared). [0208]
  • Talk5(Talk): Say “car or bike?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0209]
  • ListenAndSense2(ListenAndSense): Listens for one of the keywords (car, bike) for 5 seconds. [0210]
  • Talk6(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0211]
  • Talk7(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0212]
  • Talk8(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0213]
  • Calculation3(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0214]
  • Calculation4(Calculation): Set the value of memory cell <S1L2> to the sum of the value of memory cell <S1L2> and “2” (if the operation is invalid, the cell is cleared). [0215]
  • Talk9(Talk): Say “space ship or car?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0216]
  • ListenAndSense3(ListenAndSense): Listens for one of the keywords (spaceship, car) for 5 seconds. [0217]
  • Talk10(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0218]
  • Talk11(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0219]
  • Talk12(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0220]
  • Calculation5(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0221]
  • Calculation6(Calculation): Set the value of memory cell <S1L2> to the sum of the value of memory cell <S1L2> and “4” (if the operation is invalid, the cell is cleared). [0222]
  • End1(End): Execution ends here. [0223]
  • Text1(Text): [0224]
  • A preferred Objects Description for The Level 3 Artificial Life Script of FIG. 9 is now described: [0225]
  • Start1(Start): Starting point for execution. [0226]
  • Talk1(Talk): Say “elephant or mouse?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0227]
  • ListenAndSense1(ListenAndSense): Listens for one of the keywords (elephant, mouse) for 5 seconds. [0228]
  • Talk2(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0229]
  • Talk3(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0230]
  • Talk4(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0231]
  • Calculation1(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0232]
  • Calculation2(Calculation): Set the value of memory cell <S1L3> to the sum of the value of memory cell <S1L3> and “1” (if the operation is invalid, the cell is cleared). [0233]
  • Talk5(Talk): Say “big TV or pencil?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0234]
  • ListenAndSense2(ListenAndSense): Listens for one of the keywords (big TV, pencil) for 5 seconds. [0235]
  • Talk6(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0236]
  • Talk7(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0237]
  • Talk8(Talk): Say “not understood” in a Man's voice (duration: 127 seconds) while performing the “Talk” move using a toy named Storyteller. [0238]
  • Calculation3(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0239]
  • Calculation4(Calculation): Set the value of memory cell <S1L3> to the sum of the value of memory cell <S1L3> and “2” (if the operation is invalid, the cell is cleared). [0240]
  • Talk9(Talk): Say “bike or car?” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0241]
  • ListenAndSense3(ListenAndSense): Listens for one of the keywords (bike, car) for 5 seconds. [0242]
  • Talk10(Talk): Say “good answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0243]
  • Talk11(Talk): Say “wrong answer” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0244]
  • Talk12(Talk): Say “not understood” in a Man's voice (duration: 1277.637 seconds) while performing the “Talk” move using the toy named Storyteller. [0245]
  • Calculation5(Calculation): Set the value of memory cell <S1Boring> to the sum of the value of memory cell <S1Boring> and “1” (if the operation is invalid, the cell is cleared). [0246]
  • Calculation6(Calculation): Set the value of memory cell <S1L3> to the sum of the value of memory cell <S1L3> and “4” (if the operation is invalid, the cell is cleared). [0247]
  • End1(End): Execution ends here. [0248]
  • Text1(Text) [0249]
  • An interactive toy system constructed and operative in accordance with a preferred embodiment of the present invention is now described. The system described herein includes an electronic shop, a club of networked toy users, a user registration system and a user-interests registration system registering each user's interests. [0250]
  • Following is an example of a list of web services provided by the interactive toy system described above in accordance with a preferred embodiment of the present invention: [0251]
  • a. a shop [0252]
  • b. a members club [0253]
  • c. automatic download of charge-free new scripts [0254]
  • d. updating parts of the player [0255]
  • e. messages of different types [0256]
  • f. new services, which evolve on-line. [0257]
  • In order to adapt the system's services to the unique demands of each user (e.g. the type of new scripts), a profile of the user may be used. This profile may comprise parameters such as age, gender, and subjects of interest. [0258]
  • To get any of the services above, the user typically has to register and to fill in a list of personal interests. [0259]
  • Preferably, while registering, the user has to fill in the following personal data: full name, gender, birthday, phone number, full address, E-mail, and the language the user speaks. [0260]
  • In the user interests' registration system, the user typically selects his/her own subject(s) of interest from a given list. The list may for example include the following main subjects: Education, Information, Services, TV guide, News, Entertainment, Freaky stuff, Free time, Famous people, Reading and writing, Sports, Pets. [0261]
  • The system shown and described herein preferably includes a shop in which the user buys different packages according to his/her personal preferences and relevant toy(s). The user is exposed first to a sorted list of packages, which include different categories of content adapted to user characteristics such as the user's age, personal interest, and language. [0262]
  • Examples of package content categories, as shown in FIG. 40, may include: Education, Information, Services, TV guide, News, Entertainment, Freaky stuff, Free time, Famous people, Reading and writing, Sports, Pets. [0263]
  • These categories may be chosen using the “select content” screen of FIG. 47. [0264]
  • The screen display typically also presents the number of packages in each category, as well as information about packages on sale. [0265]
  • For each selected package content category, the user finds different packages for the chosen category, e.g. using the “select packages” screen display of FIG. 48. In the category of education, for example, the user finds a variety of subjects such as: Arts, Biology, History, Languages, Logic, Arithmetic, Geometry, Medicine, Physics. [0266]
  • For each package the user finds the package price, the name(s) of the toy the package is aimed for, and a comment, if the user has bought this package already. [0267]
  • Here, the user digitally signs the package choice he has made, indicating that he wishes this to be his purchase. [0268]
  • The user may browse around the shop, looking for other packages from other categories or for different toys. While browsing, the function of “search”, e.g. as illustrated in FIG. 37, may be employed. [0269]
  • When the “basket” is full of packages, the user is typically prompted to pay. The user can see the number of credit points he has accumulated. The user can go from where he is to a browser to buy more credit points. The user can go from where he is to a playing station where he can EARN/WIN more credit points. [0270]
  • When the user enters his or her account, he typically views his account e.g. using the “view account” display screen of FIG. 49. The user finds here all the information about each of the packages s/he has chosen according to name, description, price, section (commercial and other), category (e.g. art or news) and the relevant toy(s). [0271]
  • Here the user confirms his/her shopping. [0272]
  • Examples of screen displays suitable for implementing a preferred club feature of the system of the present invention are illustrated in FIGS. 40-44. When in the club the user can preferably do one or all of the following: [0273]
  • a. Join any of the group(s) [0274]
  • b. Meet old or new colleagues and list their data [0275]
  • c. Send messages to the club members [0276]
  • In the club the user typically has several options, which may for example include the following options or functions illustrated in FIG. 40: My account, Group, Search, Send message/script. [0277]
  • In the My Account option, the user defines his own personal information. For example, the user may fill in the following personal data to log-in to the club: nickname, full name, full address, phone number, E-mail, comments. [0278]
  • For each item the user typically decides if the information is public or confidential. The public information is available to other users while they are looking for new colleagues via the above-mentioned SEARCH function. [0279]
  • Under the My Account option, the user typically also marks his/her own interests from among a given list. The list contains subjects such as: Education, Information, Services, TV guide, News, Entertainment, Freaky stuff, Free time, Famous people, Reading and writing, Sports, Pets. [0280]
  • Under the Group option, the user may join any of the existing group(s), which were created and defined by the administrator. The user finds here a list of all the groups with a full description of each. Typically, groups that fit the user's interests, as listed by him, appear first. [0281]
  • Under the Search option, the user can search for new friends characterized according to parameters such as age, interests, country, gender, and toy type. He can also locate old friends according to a nickname or country for example. New friends approved by the user are added to the contact list. [0282]
  • The Contact List lists the user's colleagues with their personal data, derived from their public personal information. The users' colleague data may for example comprise: nickname, full name, full address, phone number, e-mail and comments. [0283]
  • Under the Send Messages/Scripts option, the user can send either messages or scripts to his/her group's members or to colleagues listed in his contact list, which is typically accessible from the screen display of the send messages/scripts option. [0284]
  • Scripts to send may originate as free downloads from the web or as home-made scripts composed by the user or by his or her colleagues. Delivery of messages and scripts can be immediate or at any future date. [0285]
  • Typically, the system provides an incoming messages/scripts icon with which the user can elect to keep all new messages or to delete incoming messages. [0286]
  • Typically, within the My Account option there is defined a Message Filtration suboption with which the user can list authors whose messages should be ignored. From here the user can also access his full contact list. [0287]
  • Typically, the system of the present invention includes a “player” which plays scripts selected by a user in accordance with a schedule also typically selected by the user. [0288]
  • FIG. 10 is an example of a simplified screen display for a “play” function in the player. Using the play function, the user programs schedule and content. Following is a description of the “play” function in accordance with a preferred embodiment of the present invention. [0289]
  • The User can get to the play function either from an existing PLAY icon in the player, or via a new icon that the user adds. [0290]
  • When the user clicks on “PROGRAMMING” at the window of FIG. 10, s/he typically gets the interface of FIG. 11. [0291]
  • FIG. 11 is a simplified illustration of a “textbox” screen display of the programming function of the player, constructed and operative in accordance with a preferred embodiment of the present invention. [0292]
  • FIG. 12 is a table that describes the programming feature of the player in accordance with a preferred embodiment of the present invention. [0293]
  • FIG. 13 is a semi-block diagram semi-flowchart illustration of the “personal” function of a player in accordance with a preferred embodiment of the present invention. [0294]
  • FIG. 14 is a semi-block diagram semi-flowchart illustration of the “club” function of a player in accordance with a preferred embodiment of the present invention. [0295]
  • FIG. 15 is a semi-block diagram semi-flowchart illustration of the “shop” function of a player in accordance with a preferred embodiment of the present invention. [0296]
  • FIG. 16 is a simplified flowchart illustration of a registration procedure provided in accordance with a preferred embodiment of the present invention; [0297]
  • FIG. 17 is a simplified flowchart illustration of an example procedure of sending a request message from a user to a server in accordance with a preferred embodiment of the present invention. [0298]
  • FIG. 18 is a simplified flowchart illustration of a suitable procedure of creating a new group for the users' club in accordance with a preferred embodiment of the present invention. [0299]
  • FIG. 19 is a simplified flowchart illustration of a suitable procedure of a leaving a group of the users' club in accordance with a preferred embodiment of the present invention. [0300]
  • FIG. 20 is a simplified flowchart illustration of a suitable procedure of viewing group members provided by the users' club in accordance with a preferred embodiment of the present invention. [0301]
  • FIGS. 21 to 24 are simplified flowchart illustrations which, taken together, describe an example of a search procedure provided by the club in accordance with a preferred embodiment of the present invention. [0302]
  • FIG. 25 is a simplified flowchart illustration of a suitable procedure for sending a message to another user in accordance with a preferred embodiment of the present invention. [0303]
  • FIG. 26 is a simplified flowchart illustration of a suitable procedure for sending a script to another user in accordance with a preferred embodiment of the present invention. [0304]
  • FIG. 27 is a simplified flowchart illustration of a suitable implementation of send procedure D in FIGS. 25 and 26, constructed and operative in accordance with a preferred embodiment of the present invention. [0305]
  • FIG. 28A is a simplified flowchart illustration of a suitable procedure for adding a user to a contact list in accordance with a preferred embodiment of the present invention. [0306]
  • FIG. 28B is a simplified flowchart illustration of a suitable procedure for removing a user from a contact list in accordance with a preferred embodiment of the present invention. [0307]
  • FIG. 29 is a simplified flowchart illustration of a suitable account update procedure provided in accordance with a preferred embodiment of the present invention. [0308]
  • FIG. 30 is a simplified flowchart illustration of a suitable subject update procedure provided in accordance with a preferred embodiment of the present invention. [0309]
  • FIG. 31 is a simplified flowchart illustration of a suitable procedure for ignoring a message and/or a user in accordance with a preferred embodiment of the present invention. [0310]
  • FIG. 32 is a simplified flowchart illustration of a suitable procedure for adding fields to a user's contact list in accordance with a preferred embodiment of the present invention. [0311]
  • FIGS. 33 to 35 are simplified flowchart illustrations of three respective procedures for “adding to basket”, which procedures are preferably provided by the electronic shop of the present invention. [0312]
  • FIG. 36 is a simplified flowchart illustration of a suitable implementation of a “remove from basket” procedure preferably provided by the electronic shop of the present invention. [0313]
  • FIG. 37 is a simplified flowchart illustration of a suitable search procedure preferably provided by the electronic shop of the present invention. [0314]
  • FIG. 38 is a simplified flowchart illustration of a suitable procedure for “winning credit points” preferably provided by the electronic shop of the present invention. [0315]
  • FIG. 39 is a simplified flowchart illustration of a suitable procedure of paying with credit points preferably provided by the electronic shop of the present invention. [0316]
  • FIG. 40 is a simplified illustration of a screen display of the “My account” function of the club provided in accordance with a preferred embodiment of the present invention. [0317]
  • FIG. 41 is a simplified illustration of a screen display of the “Contact list” function of the club in accordance with a preferred embodiment of the present invention. [0318]
  • FIG. 42 is a simplified pictorial illustration of a screen display of the “Group” function of the club provided in accordance with a preferred embodiment of the present invention. [0319]
  • FIG. 43 is a simplified pictorial illustration of a screen display of the “Search user” function of the club in accordance with a preferred embodiment of the present invention. [0320]
  • FIG. 44 is a simplified pictorial illustration of a screen display of the “Send message/scripts” function of the club in accordance with a preferred embodiment of the present invention. [0321]
  • FIG. 45 is a simplified pictorial illustration of a screen display of the “Interests” form provided in accordance with a preferred embodiment of the present invention. [0322]
  • FIG. 46 is a simplified pictorial illustration of a screen display of the “Registration” form provided in accordance with a preferred embodiment of the present invention. [0323]
  • FIG. 47 is a simplified pictorial illustration of a screen display of the “Select content” function preferably provided by the electronic shop of the present invention. [0324]
  • FIG. 48 is a simplified pictorial illustration of a screen display of the “Select packages” function preferably provided by the electronic shop of the present invention. [0325]
  • FIG. 49 is a simplified pictorial illustration of a screen display of the “View Account” function preferably provided by the electronic shop of the present invention. [0326]
  • Described herein is a software tool for generating verbal content and for controlling toys and other manipulable objects, particularly suited for toys operated by a PC in wireless communication, by means of a wireless (e.g. radio) base station connected to the PC, with a toy controller embedded inside the toy. [0327]
  • The present specification uses the following terminology: [0328]
  • Living Object: Hardware and software technology for building computer controlled toys and other manipulable objects, and for the generation of verbal content for their control. [0329]
  • Scriptwriter: A software program for the generation of verbal content for the control of toys based on Living Object technology. [0330]
  • Base Station: A radio or other wireless transceiver connected to the PC providing wireless communication between the PC and the toy controller embedded in the computer controlled toy. [0331]
  • Toys and other objects based on Living Object technology use the computer, wireless communications, and voice recognition software to speak with their users in a human voice, with human-like personality and intelligence. The toys hold entertaining, personalized dialogs with the child, demonstrating knowledge of the child and his/her likes and recalling past interactive sessions. [0332]
  • The Living Object Scriptwriter is a tool useful in creating interactive scripts that give the toys speech and personality. These scripts typically feature content that includes: [0333]
  • a. Interactive dialog focused on a variety of content and activities, [0334]
  • b. Personalized data, [0335]
  • c. Historical data with same user, and [0336]
  • d. Time-linked content. [0337]
  • The following description explains how to use the Living Object Scriptwriter to write interactive scripts for a Living Object toy. [0338]
  • To start working with the Living Object Scriptwriter (hereafter, Scriptwriter), the Scriptwriter's computer is turned on. The Living Toy is turned on and close by and the Living Object base station is plugged into the computer. [0339]
  • A preferred method for setting up the Base Station and Toy is now described with reference to FIG. 50 showing the Living Object base station and to FIG. 51 showing the Living Object Toy: [0340]
  • a. Plug the base station's computer cable into the serial port in the back of the computer. [0341]
  • b. Plug the base station's electrical transformer into a nearby electrical socket. [0342]
  • c. Turn the computer on and wait until the computer is fully operational and the desktop is displayed. [0343]
  • d. Turn the toy's on/off switch to ON. The toy emits a few beeps and any moving facial parts move briefly. The system is now ready for preparation of a script in order to provide dialog between user and toy. When the toy is on but not active for a while, it automatically switches to Sleep mode. A description of how to wake the toy up is provided herein. [0344]
  • A preferred method for opening the Living Object Scriptwriter software is now described with reference to FIG. 52 which shows a screen display of the Scriptwriter icon on desktop and to FIG. 53 showing a screen display of Living Object Scriptwriter main screen: [0345]
  • a. Click on the Scriptwriter icon on the desktop. [0346]
  • b. The Living Object Scriptwriter program opens to its main screen. The screen on the computer may look like the screen display illustrated in FIG. 53. [0347]
  • A preferred method for telling the system which toy is being used is now described with reference to FIG. 54 showing the “select tools—options” screen window display and to FIG. 55 showing the “toy” screen window display: [0348]
  • a. Click on Tools-Options. The Options window opens. This window typically has “tabs” such as the following: Toys, Hardware, Environment, Volume Settings, Smart Toy, Scripts, and Reports. [0349]
  • b. Click on the Toys tab to open the Toy window. This is done only if the Toy window is not already displayed. [0350]
  • c. Check that, in the Toy List, a check mark appears next to the name of the toy. For example, if the toy is “Monster”, check that a check mark appears next to Monster. Information related to the Toy Description for Monster appears to the left. [0351]
  • d. Click on Insert/Edit to update the system with the details of a particular toy. [0352]
  • A preferred method for telling the system to recognize the base station and the toy is now described with reference to FIG. 56 showing the “hardware” screen window display: [0353]
  • a. Click on the Hardware tab to display the Hardware window. [0354]
  • b. Click on the Check button in the “Check base station” section. The system begins looking for the base station, which is plugged into the computer and the wall socket. When it finds the base station, the phrase “Base connected.” appears in the Report window. [0355]
  • c. Click on the Check button in the “Check toys” section. The system begins looking for the toy defined as above. When it finds the toy, the phrase “Toy connected.” appears in the Report window and details about the toy appear in the Search for toy section of the window. [0356]
  • Making Sure the Toy is Awake: When turned on but not actively engaged in talking, listening, or another function, the user's Living Object is programmed to slip into “Sleep” mode. Upon doing so, the toy emits a single beep. Even with the beep, during scriptwriting a user may not notice the toy has switched to Sleep mode until the user tries to make the toy talk or record and gets no response. [0357]
  • To switch a sleeping toy back to Alert mode, press one of its sensors, by squeezing its hand, foot, or other body part. The toy emits a beep and is back in Alert Mode, ready for action. [0358]
  • Typically, only a few minutes are required to write a simple interactive script. [0359]
  • Reference is now made to FIG. 57 showing the “talk icon” screen display, FIG. 58 showing the Scriptwriter main screen display with the added talk object and to FIG. 59 showing the Scriptwriter main screen display with the added talk object connected by a line to the start object. [0360]
  • A preferred method for adding a Talk Object is now described: [0361]
  • a. Click and hold down the Talk icon which is the first icon in the group to the left of the Scriptwriter desktop. [0362]
  • b. Drag the Talk icon onto the Scriptwriter desktop, underneath the Start object. The Talk icon now appears in a white box. When on the desktop, the icon is called a Talk object. [0363]
  • c. Move the cursor so that it is on top of the Start triangle. The cursor changes to a pencil. Drag the pencil from the Start triangle to the Talk object. A line appears that connects both objects. [0364]
  • A preferred method for saving a script is now described with reference to FIG. 60 showing the screen display of the action toolbar with the save icon and to FIG. 61 showing the screen display for naming and saving the script: [0365]
  • a. Save the work done so far. Click on the Save icon on the Actions toolbar or select Save from the File menu. The Save window appears. [0366]
  • b. For a user's Living Object script to run correctly, the user typically must save all related script and wave files in a suitable directory. If, for example, the user creates a script comprising 3 script files and 26 wave files, the user typically must keep all 29 files in the same directory. The directory does not have to be inside the Scriptwriter directory: it can be in any directory on the hard drive of the computer. The user may click on the down (browse) arrow to get to the directory in which s/he wants to save the file. If appropriate, create a new directory in which to save all files related to the particular script being worked on. It is advisable to name the directory after the toy, such as “Monster script.” If not done yet, double-click on the desired directory (whether new or old) so that the file is saved to it. [0367]
  • c. In the File name field, enter a name for the script, such as “script1.” The software automatically adds the extension .script. [0368]
  • d. Click Save. [0369]
  • A preferred method for adding speech is now described with reference to FIG. 62 which illustrates a screen window display of a combo box for typing the toy's speech: [0370]
  • a. Double click on the Talk object. The Talk window opens. [0371]
  • b. In the first field, marked Toy Name, the name of the toy should appear. If it does not, click on the down arrow to the right of the field and choose the name of the toy from the list. [0372]
  • c. Click in the TTS box and type the words that the toy is to say, e.g. a question which creates a script sequence that demonstrates the toy's voice recognition ability. Type the question: “What do you feel like doing now, wise guy? When my eyes light up, say: A joke, a song, or a game.” [0373]
  • d. The toy now has a line of speech that the user has created. To hear the voice segment right now, through the toy, click the “Play” button on the screen. The toy vocalizes the typed text, speaking in computer synthesized speech called “text-to-speech”. Note that the user can select text-to-speech that sounds like a man, woman, boy, or girl. [0374]
  • e. To hear the toy vocalize the text in a human voice, the user still typically needs to record the line. Preferred methods for recording speech are described below. [0375]
  • A preferred method for recording speech is now described with reference to FIG. 63 which is a screen window display for recording sound to be played by the toy. The toy's speech can be recorded through the toy or through the computer's microphone. Logistically, it may be easier to use a conventional microphone. If recording through the toy, make sure it is on and awake. If using the computer's microphone, make sure it is plugged into the microphone jack in the back of the computer and that the speakers and sound software are on. In the Talk window, click on the Record button. The Sound Recorder window opens. [0376]
  • As soon as the user clicks on the microphone icon, s/he is recording. For example, the user may record the line, “What do you feel like doing now, wise guy? When my eyes light up, say: A joke, a song, or a game.” When done, the user clicks again on the microphone icon. [0377]
  • The user plays back the recording to make sure it recorded well. The user then clicks on the speaker icon. If the user is not satisfied with the recording, he repeats. If it is desired to increase or decrease the volume of the recording, the Volume dial is adjusted by twisting it to the left or right with the mouse. Then the user records the line again. [0378]
  • A preferred method for saving a recording is now described with reference to FIG. 64 which is a screen window display for saving a recording. [0379]
  • The recording is preferably saved. Click on the Save icon. The Record into window appears. The recording is saved in the same directory as the Scriptwriter script. [0380]
  • Click on the down arrow to get to the appropriate directory, which is the directory in which the script file was saved earlier, in the above-referenced description of how to save a script. Then save the recorded file under any suitable name, such as “wav1.” The software automatically adds the extension .wav. Once saved, the recording becomes a sound file, also known in the art as a “wave” file. [0381]
  • A preferred method for playing back a recording through the toy is now described with reference to FIG. 65 which is a screen window display for selecting a “wave” file to be played by the toy. Now a line that the user created and recorded is played through the user's toy. [0382]
  • If the Talk window is not already open, double-click on the Talk object on the desktop. [0383]
  • In the Talk window, click on the circle next to WAV. This tells the system that the user wants to play a pre-recorded wave file rather than the text-to-speech version of the text that appears in the TTS box. [0384]
  • Click on the open file icon (to the right of the WAV field) and browse until the wave file just recorded is found and saved. Click on the Open button to select the file. The file and its directory path now appear in the WAV field. [0385]
  • Click on the Play button. The system plays the wave file through the toy. [0386]
  • If no sound comes out of the toy, the toy may have gone to sleep while the user was occupied with scriptwriting and recording. To wake the toy up, squeeze its hand or another body part that contains a sensor. If the toy responds with movement and/or a beep, then the user has switched it back to Alert mode. At this point, the user clicks on the play button and the system plays the wave file through the toy. [0387]
  • If problems occur, perform one or more of the following checks: [0388]
  • Make sure that the name in the Toy Name field at the top of the window is that of the toy. If “Computer” appears, then change the name to the toy's name, as explained above in the description of how to add speech. [0389]
  • Make sure the WAV circle is selected, rather than the TTS circle. This time the toy is to vocalize the wave file rather than the computer synthesized (TTS) version of the text. [0390]
  • The system set-up steps for the base station and toy, as described above, are now followed. [0391]
  • Reference is now made to FIG. 66 that illustrates a Listen icon and to FIG. 67 that is a screen display of a part of the Scriptwriter main window with the Listen object added. To add a Listen object to the script: [0392]
  • a. Click and hold down the Listen icon. Drag the Listen icon onto the Scriptwriter desktop, underneath the Talk object. The Listen icon now appears inside a white box. When in this form, on the Scriptwriter desktop, the icon is a Listen object. [0393]
  • b. Move the cursor over the Talk object until it changes to a pencil. Then drag a line from the Talk object to the Listen object. The script now flows from the Start object to the Talk object to the Listen object. Now the toy is told what to listen for. [0394]
  • A preferred method for defining keywords is now described with reference to FIG. 68 which is the “Listen and Sense” screen window display. [0395]
  • a. Previously, an object was added that tells the toy to listen. Now the user tells the toy what words to listen for. In defining the Talk object, the toy was told to say to the user: “Say: a joke, a song, or a game.” Each of these phrases is a keyword phrase which is now defined. [0396]
  • b. Double-click on the Listen object. The Listen and Sense window opens. In the Listen and Sense window, the user defines what words the toy listens for or what sensors are in input mode during the current listen and sense segment. [0397]
  • c. Double check that the correct name appears in the Toy Name field. Click in the Keywords field. [0398]
  • d. Type the keywords, following the same spacing and punctuation pattern seen in parentheses. Type: a joke, a song, or a game. [0399]
  • e. If it is desired to make one or more of the toy's sensors active at this point, the user should click the sensor number that corresponds to each of the sensors. [0400]
  • f. Click OK. Part of the list of keywords appears on the listen icon, as a point of reference. [0401]
  • To improve the accuracy of the keyword recognition, try to use keywords that have at least two syllables and make sure that the keywords in a particular group of keywords sound different from each other. Keyword phrases that may be used typically comprise exactly two words. [0402]
  • Sometimes the system does not know how to pronounce a keyword. This typically happens when the user uses special names or made up words. Click on the play button to hear how the system reads out the word. Then adjust the spelling of the word and play the word again, and repeat this process as necessary, until the system pronounces the word correctly. [0403]
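As a rough illustration of these keyword guidelines, the following sketch (not part of the Scriptwriter product; all names are illustrative) flags keyword phrases that are too short or that sound too much alike, using a crude syllable count and a string-similarity check:

```python
from difflib import SequenceMatcher
from typing import List

def syllable_count(word: str) -> int:
    # Very rough heuristic: count groups of consecutive vowels.
    vowels = "aeiouy"
    count, prev_was_vowel = 0, False
    for ch in word.lower():
        is_vowel = ch in vowels
        if is_vowel and not prev_was_vowel:
            count += 1
        prev_was_vowel = is_vowel
    return max(count, 1)

def check_keywords(phrases: List[str]) -> List[str]:
    """Return warnings for keyword phrases likely to hurt recognition."""
    warnings = []
    for phrase in phrases:
        if sum(syllable_count(w) for w in phrase.split()) < 2:
            warnings.append(f"'{phrase}' has fewer than two syllables")
    for i, a in enumerate(phrases):
        for b in phrases[i + 1:]:
            if SequenceMatcher(None, a.lower(), b.lower()).ratio() > 0.8:
                warnings.append(f"'{a}' and '{b}' sound too much alike")
    return warnings

print(check_keywords(["a joke", "a song", "a game"]))  # expect no warnings
```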
  • A preferred method for creating a response for each keyword is now described with reference to FIG. 69 which illustrates the “Keyword link box” in the “Choose Link” screen display and to FIG. 70 showing the Scriptwriter main screen display with a Listen object linked to corresponding Talk objects. [0404]
  • The toy gives a different answer to each keyword it hears. This process of building questions, keywords, and responses to keywords gives the toy its intelligent conversational ability—at the most basic level. The system offers many different features that enable the user to give the dialog a highly intelligent aspect, such as random answers, answers based on memory, and answers based on collected personal data. [0405]
  • To create a response, simply add a Talk object for each keyword as described herein with reference to Adding Speech. [0406]
  • To add a response to each of the keywords that have already been created: [0407]
  • a. Drag the Talk icon over to the Scriptwriter desktop and under the Listen object. Connect the Listen object to the Talk object. The keyword link box appears, with the first keyword in the list that was entered in the Listen window. [0408]
  • b. Click on OK. If this is not the right keyword, click the down arrow and scroll down until the correct keyword appears. Then click OK. [0409]
  • c. Drag the Talk icon over to the desktop four separate times, until there are four Talk objects beneath the Listen object. Connect a keyword link to each Talk object, as in the previous step. [0410]
  • d. To type some kind of verbal answer for each Talk object: Double-click on the first Talk object, which links to the “a joke” keyword. In the TTS box, type: “You want to hear a joke? You must be a very funny person.” [0411]
  • It is advisable to repeat the keyword at the very beginning of the toy's response. This tells the user that the toy indeed understood the spoken keyword. [0412]
  • e. For each keyword, type an appropriate response in the TTS box of the corresponding Talk object. [0413]
  • The fourth keyword may automatically display a link called “Not-Found.” This link allows the user to create a verbal response to a situation in which the toy did not hear or understand any of the keywords it was listening for (or, if the toy was awaiting sensor input, did not receive input on the sensor that was active). A description of how to create a “Not-Found” reaction by the toy is provided below. [0414]
  • Creating an Answer to Not-Found: Sometimes the system does not understand the user's response (or the user did not provide a response at all). The fourth Talk object created typically needs to contain speech that tells the user what to do if the toy did not understand or hear the keyword spoken by the user. Typically, the speech should ask the user to repeat the keyword, or make a comment to the effect that the toy did not get one of the expected answers and is therefore moving on to the next point in the script. If the user is asked to repeat the keyword, the user should be reminded what the keywords are, in case she or he has forgotten them. [0415]
  • In the fourth Talk object, which is linked to a “not found” situation, do as follows: [0416]
  • a. Type text that tells the user to repeat the keyword. Double-click on the Talk object. In the TTS box, the user might type: “I'm sorry, but I didn't quite hear what you said. Please tell me again. Say: a joke, a song, or a game.”[0417]
  • Alternatively, type text that tells the user that the toy did not hear the response, but is moving on to the next point in the script. In the TTS box, type: “Hmmm, you want me to choose? Ok, I'm in the mood for a joke!”[0418]
  • The objects are now linked accordingly. If the user typed the text in Step 1, then the user typically needs to draw a link from the fourth Talk object back to the Listen object. If the user typed the text in Step 2, then the user typically needs to draw a link from the fourth Talk object to the Talk object that provides a response to the keyword “joke.” [0419]
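The talk-listen-respond flow built above can be pictured as a small directed graph of script objects. The sketch below is only an illustrative model of that flow; the class and field names are assumptions and do not reflect the Scriptwriter's actual file format.

```python
# Minimal model of the Start -> Talk -> Listen -> keyword-linked Talk flow.
class Talk:
    def __init__(self, name, tts_text):
        self.name, self.tts_text, self.next = name, tts_text, None

class Listen:
    def __init__(self, name, keywords):
        self.name, self.keywords = name, keywords
        self.links = {}          # keyword -> next object
        self.not_found = None    # fallback when no keyword is recognized

    def route(self, heard: str):
        return self.links.get(heard, self.not_found)

talk1 = Talk("Talk1", "Do you want to hear a joke, a song, or a game?")
listen1 = Listen("Listen1", ["a joke", "a song", "a game"])
talk2 = Talk("Talk2", "You want to hear a joke? You must be a very funny person.")
talk3 = Talk("Talk3", "A song it is!")
talk4 = Talk("Talk4", "Let's play a game.")
talk5 = Talk("Talk5", "I'm sorry, I didn't quite hear you. Say: a joke, a song, or a game.")

talk1.next = listen1
listen1.links = {"a joke": talk2, "a song": talk3, "a game": talk4}
listen1.not_found = talk5
talk5.next = listen1   # link the Not-Found response back to the Listen object

print(listen1.route("a song").tts_text)   # -> "A song it is!"
print(listen1.route("mumble").tts_text)   # -> the Not-Found response
```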
  • A preferred method of running a script is now described with reference to FIG. 71 showing the Run-Run screen window display and to FIG. 72 showing the Sample error message screen window display. [0420]
  • At this point, there is enough of a script to run a talk-listen-respond sequence through the toy, which may be performed as follows: [0421]
  • Make sure the toy is awake by squeezing one of its sensors. [0422]
  • Select Run-Run: To run the script from a certain point rather than the beginning, simply click on the object from which it is desired to run the script and select Run-Run from Selected. The Start icon on the desktop is highlighted and the Living Object software runs through the script, highlighting each icon as that part of the script is activated. If there are any problems with the script, a window with error messages appears, like that illustrated in FIG. 72. [0423]
  • The errors listed indicate a problem with Talk1 and Talk5. These errors were generated when the Run-Run option was selected and the toy was still in Sleep mode. The system found an error with Talk1 because it was the first segment of the script that the system could not execute. The error in Talk5 reflects the inability of the sleeping toy to listen at all. [0424]
  • Ideally, as the user runs the script, the user's toy voices the text defined in Talk1, listens for one of the three keywords defined in Listen1, and responds accordingly by voicing the text from Talk2, Talk3, or Talk4. [0425]
  • Talk1 is a wave file, whereas the other Talk objects are played through the toy as synthesized speech. To run the entire script as wave files, i.e. in a natural voice, the Talk window of each of the Talk objects is opened and the text is recorded, as described above. [0426]
  • An explanation of each of the functions that appears on the Scriptwriter main screen is now provided with reference to FIGS. 73A and 73B showing a table of the functions provided by the Scriptwriter with their icons as presented on the Scriptwriter main screen display. [0427]
  • Reference is now made to FIG. 74 showing the Talk object screen window display. To enter the Talk options window, double-click on the icon in the script. The user clicks on the Advanced button for the following additional options: [0428]
  • a. Toy Name: Determined according to toy available (appears in all of the motion group options) [0429]
  • b. Name: Name of the object (appears in all of the motion group options). The Toy Name and Name options appear in all of the motion group objects. [0430]
  • TTS (Text to Speech) field: enter the text to be spoken by the toy. [0431]
  • A change can be made in the type of voice used by clicking on the different options available to the right of the TTS field, e.g. man, woman, boy, girl. [0432]
  • A wav file can be inserted by choosing the wav option field and allocating a wav file either from the computer or from a recorded wav file. [0433]
  • A message can be recorded by selecting the record button. This brings the user to the Sound recorder window, as described herein in the description of how to record speech. [0434]
  • The wav file can be played back from this window by clicking on the play button. [0435]
  • Movement Options allows the user to select the type of movement for the talk segment. [0436]
  • The Mood and Stage fields are used for additional comment information. [0437]
  • Listen & Sense Object: Reference is now made to FIG. 75 showing the Listen and Sense screen window display. [0438]
  • Toy Name: Determined according to toy available [0439]
  • The keywords field is where the user defines the options available for speech recognition. With the Say Keywords button, which is located at the end of the keywords field, the user can hear the words chosen. [0440]
  • The Sensors field allows the user to define the sensor areas located on the toy for non-verbal response. [0441]
  • Listen time allows the user to define the maximum time given to listen or wait for activation of sensors. [0442]
  • The Memory field allows the user to save the results of the recognition process. [0443]
  • In order to change the accuracy level, the user clicks on the “Active” field and then “OK”. This brings the user to the Script Properties window. Here the user can change the speech recognition accuracy level. The lower the level of accuracy, the more sensitive the recognition is. [0444]
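One way to picture the Listen & Sense parameters just described is as a configuration record. The sketch below is an assumption for illustration only; the field names are not the product's own, and the accuracy convention (a lower value means more sensitive recognition) follows the text above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ListenAndSense:
    toy_name: str                                      # toy performing this segment
    keywords: List[str]                                # phrases the toy listens for
    sensors: List[int] = field(default_factory=list)   # sensor numbers active in this segment
    listen_time: float = 5.0                           # max seconds to listen or wait for a sensor
    memory: Optional[str] = None                       # compartment storing the recognition result
    accuracy: int = 50                                 # lower value = more sensitive recognition

segment = ListenAndSense(
    toy_name="Storyteller",
    keywords=["a joke", "a song", "a game"],
    sensors=[1],                                       # also accept a squeeze of sensor 1
    memory="last_answer",
)
```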
  • Move Object: Reference is now made to FIG. 76 showing the Move screen window display. The Movement field allows the user to pick the type of activity the Toy is to make. The Moving Time field defines the length of time the movement is to take place. When the user chooses the “Run in the Background” option, the user is instructing the toy to continue with the script once receiving the movement command. [0445]
  • Record Options: Reference is now made to FIG. 77 showing the Record Options screen window display. The Record option allows the user to record his/her voice, or anyone else's voice. Wav File Name—Insert name of file that is to be recorded in the script. [0446]
  • Memory Object: Reference is now made to FIG. 78 showing the Memory screen window display which allows the user to put a value into a certain compartment of the computer's memory and give the compartment a name. [0447]
  • Condition Object: Reference is now made to FIG. 79 showing the Condition screen window display. [0448]
  • Compare two different values or check if one value is greater than, less than, equal to, or not equal to a certain value. [0449]
  • Calculation Object: Reference is now made to FIG. 80 showing the Calculation screen window display. [0450]
  • Do some math on the values that are stored in the computer's memory compartments. The computer can add, subtract, multiply and divide. [0451]
  • Random Object: Reference is now made to FIG. 81 showing the Random screen window display which allows a user to create a list of values from which the computer chooses on a random basis, and to tell the computer in which memory compartment to put the values. [0452]
  • Time Marker Object: Reference is now made to FIG. 82 showing the Date and Time screen window display which allows a user to put a certain time or date in a compartment in the computer's memory. [0453]
  • Wait Object: Reference is now made to FIG. 83 showing the Wait screen window display. This display instructs the toy to wait for a certain amount of time before proceeding with the script. [0454]
  • Jump Object: Reference is now made to FIG. 84 showing the Jump screen window display, allowing a user to skip to a different point in the script. [0455]
  • Execute Object: Reference is now made to FIG. 85 showing the Execute screen window display which allows the user to run any software program on the computer. [0456]
  • Script Object: Reference is now made to FIG. 86 showing the Run Script screen window display which enables the user to run any other Scriptwriter Script. [0457]
  • Internet Object: Reference is now made to FIG. 87 showing the Internet screen window display which opens a defined web page. [0458]
  • Graphics Object: Reference is now made to FIG. 88 showing the Graphics screen window display which shows a picture or video file on the computer's screen. [0459]
  • Preferably, some or all of the following options are provided: [0460]
  • a. Display time is the length of time the image/video is to be shown. [0461]
  • b. Size field allows the user to determine the height and width of the image chosen. [0462]
  • c. Choose Display is a function used for limiting and controlling the display panels. [0463]
  • Video (Advanced Options): When the “Wait until finish” command is chosen, the toy is instructed to wait until the video is completed before continuing with the script. [0464]
  • Reference is now made to FIGS. 89, 90, 91, 92 and 93 showing “End”, “Script Properties”, “Choose Link”, “Pop-up Menu” and “Options” screen window displays, respectively. [0465]
  • End Object: The End object stops the script and allows the user to define the exit names. When the script is opened from a different window and the single output mode is not defined, the user is able to view all the available script exits. [0466]
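Taken together, the Memory, Condition, Calculation, Random and Wait objects described above amount to a small flow-chart programming model over named memory compartments. The following sketch is only one hedged reading of that model; the class names and operation keywords are assumptions, not the Scriptwriter's internal representation.

```python
import random
import time

memory = {}   # named "compartments" of the computer's memory

class MemoryObject:
    """Store a value in a named compartment (the Memory object)."""
    def __init__(self, name, value):
        self.name, self.value = name, value
    def run(self):
        memory[self.name] = self.value

class RandomObject:
    """Pick one value from a list at random and store it (the Random object)."""
    def __init__(self, name, values):
        self.name, self.values = name, values
    def run(self):
        memory[self.name] = random.choice(self.values)

class CalculationObject:
    """Do some math on stored values (the Calculation object)."""
    def __init__(self, name, op, left, right):
        self.name, self.op, self.left, self.right = name, op, left, right
    def run(self):
        a, b = memory[self.left], memory[self.right]
        memory[self.name] = {"add": a + b, "subtract": a - b,
                             "multiply": a * b, "divide": a / b}[self.op]

class ConditionObject:
    """Compare two stored values to choose the next branch (the Condition object)."""
    def __init__(self, left, op, right):
        self.left, self.op, self.right = left, op, right
    def test(self):
        a, b = memory[self.left], memory[self.right]
        return {"greater": a > b, "less": a < b,
                "equal": a == b, "not equal": a != b}[self.op]

class WaitObject:
    """Pause the script for a fixed time before proceeding (the Wait object)."""
    def __init__(self, seconds):
        self.seconds = seconds
    def run(self):
        time.sleep(self.seconds)
```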
  • Once a script has been written by the user, the IDE lets the user activate it in a way that gives life-like behavior to the toy. The IDE comprises algorithms and a strong compiler that integrate time, pattern, and minimal interval and apply them to the script or a collection of scripts. The resulting artificially created life for the toy is typically authentic in the sense that users can easily forget they are speaking and interacting with a toy. A description of how to use the IDE to create artificial life is now provided. [0467]
  • Artificial life is divided into three main screens, the Editor, the Manager, and the Viewer. Each of these screens is described herein. [0468]
  • Artificial Life (AL) Editor: There are two kinds of AL editors, professional and non-professional. [0469]
  • Reference is now made to FIG. 94 showing the Artificial Life Algorithm Editor screen window display. [0470]
  • The Artificial Life Professional Editor allows the user to define formulas and assign values to local and system parameters that later act on a given script. The Editor is used to write a user's own formulas or edit pre-written formulas provided by the function library. The Editor then allows the user to create an algorithm from the formula the user has defined, and associate the algorithm with the current script. [0471]
  • In the current example, a formula and parameters are being defined to determine how often the script entitled Games.script is executed. [0472]
  • In the Behavior parameter box, four parameters must be assigned values: memory, initial Priority, threshold Priority, and minimum Interval. [0473]
  • To do a test run on the algorithm, the user typically needs to assign the formula a temporary value in the Formula parameter box. For example, the formula on the sample screen has been assigned a value of 1. This value could represent successful completion of, say, a lesson in science. If the script has never been completed successfully, it could have formula parameter value of 2. [0474]
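Read as a simple activation model, these behavior parameters suggest that a script's priority starts at an initial value, grows at a rate scaled by the formula value, and the script fires only when the priority crosses the threshold and the minimum interval since the last run has elapsed. The sketch below is a guess at such a model under those assumptions, not the engine's actual algorithm.

```python
class ScriptBehavior:
    def __init__(self, name, initial_priority, threshold_priority,
                 minimum_interval, formula_value):
        self.name = name
        self.priority = initial_priority          # "initial Priority"
        self.threshold = threshold_priority       # "threshold Priority"
        self.minimum_interval = minimum_interval  # seconds required between runs
        self.formula_value = formula_value        # e.g. 1 if the lesson was completed, 2 if not
        self.last_run = 0.0
        self.memory = 0.0                         # the "Memory" value tracked by the Viewer

    def tick(self, dt):
        # Priority accumulates faster for scripts with a higher formula value.
        self.priority += dt * self.formula_value
        self.memory = self.priority

    def should_run(self, now):
        return (self.priority >= self.threshold and
                now - self.last_run >= self.minimum_interval)

    def run(self, now):
        print(f"running {self.name}")
        self.priority = 0.0
        self.last_run = now

games = ScriptBehavior("Games.script", initial_priority=0, threshold_priority=10,
                       minimum_interval=600, formula_value=2)
```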
  • Reference is now made to FIG. 95 showing the Artificial Life Editor screen window display. The editor enables the user to build a formula for a specific script. The steps to add an AL formula to a script are as follows: first, the user chooses the script by pressing the Load button; then, the user fills in the formula by double-clicking on a cell (at least one cell must typically be filled in); finally, the user saves the AL formula by pressing the Save button. [0475]
  • Reference is now made to FIG. 96 showing the Artificial Life Editor screen window display with the Cell Management pop-up window. By right-clicking with the mouse, the user gets a pop-up with cell functions. [0476]
  • Reference is now made to FIG. 97 showing the Artificial Life Editor screen window display with the Function Library pop-up window. [0477]
  • To add a specific function, select the function and then fill in its properties. Press OK in the properties section or double-click on the selected function to add the function, or press the Esc button to cancel. [0478]
  • Reference is now made to FIG. 98 showing the Artificial Life Manager screen window display. The Artificial Life Manager gives the user an overview of all scripts that the Artificial Life engine is to check and the formulas, parameters, and values assigned to them. It is possible to work from the Manager to make changes to the definitions. The Manager contains functions for adding, removing, and viewing the history of the last 10 executions of each script. Highlighting a script name with the highlight bar displays all the relevant details of that script. [0479]
  • Reference is now made to FIG. 99 showing the Artificial Life Editor Viewer window display. The Artificial Life Viewer presents a real-time and historical graphical depiction of the status of up to five scripts at any one time. The Viewer can be used to track the behavior of different scripts, as determined by the value stored for each script in “Memory.” The “Show activation level” item can be selected to view the threshold of the selected scripts, and thereby determine the last time each script executed. The Viewer displays the last 10 minutes of Artificial Life activity on an ongoing basis, scrolling to the right as additional charted activity takes place. [0480]
  • Building Artificial Life Environment: A preferred method for building an AL toy typically comprises the following steps: [0481]
  • a. Make list of scripts. [0482]
  • b. Make list of parameters. [0483]
  • c. Make Tables, e.g. Dependence table and formula table. [0484]
  • d. Fill in formula table. FIG. 101 is an example of a Formula table. [0485]
  • e. Fill in dependence table. FIG. 100 is an example of a Dependence table. [0486]
  • f. Build the scripts. [0487]
  • g. Register/Add the scripts. [0488]
  • Reference is now made to FIGS. 102 and 103 showing the Scriptwriter main screen display with corresponding AL scripts, specifically a game script and a laugh script respectively. [0489]
  • Preferably the system provides a variety of optional commands, which may be implemented as Function Bar Commands. An example of a set of optional commands which may be provided is now described. [0490]
  • File Menu: Reference is now made to FIG. 104 showing the Scriptwriter main screen display with the File menu open. The file menu allows the user to create a new script, open an existing script and other operations that are found in a normal file menu. The file menu preferably includes menu options such as the following: [0491]
  • a. New Script: In order to begin writing a script, click on new script in the file menu and a new window appears on the screen. Now the user can begin working on his/her script. [0492]
  • b. Open Script: To open an already saved script, click on open script in the file menu. A window opens up containing a list of the existing scripts which the user can search from. When the user finds the script s/he is looking for, click on its name, for example script1.script, and the script file opens. [0493]
  • c. Download Script: The download script command in the File Menu opens a collection of existing scripts typically residing on an internet site such as (WHAT IS THE CREATOR SITE??) An existing script can be downloaded from the web to the IDE Scriptwriter program. [0494]
  • d. Save Script: To save a script created on the program, click on the Save Script command in the File Menu. [0495]
  • e. Save Script As: To save the new script under a certain file name, click the Save Script As command on the File Menu. A window opens up asking for the script's name; the user names the script and presses the save command, and the file is then saved in the directory assigned to it. [0496]
  • f. Save Script As Image: The Save Script As Image command saves the script in the format of a picture image. The script is saved as a Metafile Image (WMF). WMF is especially compatible with the Word program. When the user saves the script in the form of WMF, the user can make corrections and changes, outside the IDE Scriptwriter program, in Word itself. [0497]
  • g. Create Report: Reference is now made to FIG. 105 showing the Scriptwriter main screen display with the Create Report Window. The Create Report command creates a chart in the Excel program which documents which objects appear in the script created. In the window that opens, when the user clicks on Create Report, the user can choose to chart all properties of all existing objects by pressing Print All. The user can limit the chart to a specific object, for example talk, by selecting Create Selected in the window that opened when Create Report was clicked on. [0498]
  • h. Print Description: When clicking on the Print Description command a detailed text file and NOT a chart appear. The same information, which appears in the Create Report chart, appears in Print Description in textual form. [0499]
  • i. Print Preview: When clicking on the Print Preview command, the user receives a print preview of the script s/he has just created. [0500]
  • j. Print: The Print command prints a visual picture of the script as well as a verbal description of the stages of the script. Below the print command in the file menu appear the Last Opened Scripts; a maximum of the last three files that have been worked on can be displayed. The last command in the File Menu is the Exit command. When clicking on the exit command, the user exits the IDE Scriptwriter Program. [0501]
  • Reference is now made to FIG. 106 showing the Scriptwriter main screen display with the Edit menu open. The Edit Menu allows the user to amend and change the script s/he has already created. It includes commands such as the following: [0502]
  • a. Undo: This function allows the user to undo the last operation that was made on the script that s/he has been working on. [0503]
  • b. Redo: Allows the user to redo an undo operation that s/he has made. [0504]
  • c. Cut: Allows the user to cut a part of his script and to paste it in another place, or to cut a part of the script in order to remove it. [0505]
  • d. Copy: Allows the user to copy a part of his script and to place the same action copied into another part of the script, thus having two operations repeat themselves in two separate parts of the script. [0506]
  • e. Paste: The Paste command is interconnected with the Cut and Copy commands. When cutting or copying a part of his script, the user must then click on the Paste command in order to place that operation in another part of his script. [0507]
  • f. Select All: Allows the user to select all parts in his script so that changes and corrections he wishes to make can take place in the whole script itself. [0508]
  • Reference is now made to FIG. 107 showing the Scriptwriter main screen display with the Find Window. Using the Find command the user can search for a specific word or object in his whole script, making his search easier. [0509]
  • Reference is now made to FIG. 108 showing the Scriptwriter main screen display with the Replace Window. When clicking on the Replace a window appears. This window is split into two sections—Target and Object. [0510]
  • Target—the target defines where the desired replacement should take place. It can take place in a selected part, or in the whole script. [0511]
  • Object—the object defines in which objects the replacement should take place. It can take place in All objects that have shared properties, or the Replace command can be executed according to Object Type. A replacement is made in a specific object according to its unique properties. [0512]
  • Clipboard: Copy the image or description (Copy Image to Clipboard, Copy Description to Clipboard) of the script onto Windows' clipboard. All Windows applications can now use the image or description. [0513]
  • View Menu: Reference is now made to FIG. 109 showing the Scriptwriter main screen display with the View menu open. The View Menu offers different forms of viewing the script created, such as zoom in/out and volume. [0514]
  • Zoom in: The Zoom in lets the user view his script in magnified size. [0515]
  • Zoom out: The Zoom out lets the user view his script in decreased size. [0516]
  • Normal Size: The normal Size lets the user view his script in its original size. [0517]
  • Volume: Reference is now made to FIG. 110 showing the Scriptwriter main screen display with the Volume and Speech Recognition Windows. Clicking on the Volume shows the volume of all that is spoken or heard in the script. This can help the user understand why, for example, words are not being recognized by the program when the microphone level is too low. [0518]
  • SR Result: The system uses a Speech Recognition (SR) window to show the speech recognition results while the script is running. The accuracy helps the user determine if the sensitivity in identifying certain parts in the program should be lowered. The higher the accuracy, the closer the enunciation is to the computer's. [0519]
  • The Rec. Wav button allows the user to hear the last saved recordings during the listen process. [0520]
  • Reference is now made to FIG. 111 showing the Scriptwriter main screen display with the Watch List and the Add Memory windows. [0521]
  • Watches: Using the Watches command, the user can follow the different values of Memory, that have been saved, during or after running the script. [0522]
  • Execute log: Reference is now made to FIG. 112 showing the Scriptwriter main screen display with the Execute Log and the Messages windows. The Execute Log is a logger of all operations that have been identified and executed. This can be extremely helpful in identifying errors that have been made. [0523]
  • Messages: When clicking on the Messages, a box comes up on screen identifying any errors that might have been made or any hints the program has for the user. If nothing appears in the box, no error was found and no hint is offered. [0524]
  • Sensor Simulation: Reference is now made to FIG. 113 showing the Scriptwriter main screen display with the Sensor Selection window. This is a simulation for the sensors of the specific object in the user's script. The sensors that are active in different parts during the script are identified by name during this Sensor Simulation. [0525]
  • Link Style: This refers to the different styles of links that can be made between two objects in the script (e.g. between talk & move). There are six different styles of links, e.g. vertical-horizontal and horizontal-vertical. These different styles help the user to better organize his script writing form. [0526]
  • The user can also change link style by double clicking on the link line itself, in his script. [0527]
  • Simulator: When clicking on the Simulator, a window opens up on the user's screen. A simulator doll is displayed that actually acts out the script, provided the script is running in simulation mode. [0528]
  • Scheduler: Reference is now made to FIG. 114 showing the Scheduler screen window display. The Scheduler can determine at what set time the user's script is executed. A user can schedule his script to run once an hour, once a day, on an event like a birthday, or every time a doll's hand is pressed. Not only scripts can be scheduled; the user can also schedule a Message to be delivered at a set time or event. A user can also receive a List of the last scripts to be run and the dates on which they ran. [0529]
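As a hedged illustration, the Scheduler's task list could be modeled as entries pairing a trigger (a one-shot time, a recurring interval, or a sensor event) with a script or message to deliver. The structure and field names below are assumptions made for illustration only.

```python
from datetime import datetime, timedelta

class ScheduledTask:
    def __init__(self, action, run_at=None, every=None, on_event=None):
        self.action = action          # script name or message text to deliver
        self.run_at = run_at          # one-shot time, e.g. a birthday
        self.every = every            # recurring interval, e.g. timedelta(hours=1)
        self.on_event = on_event      # e.g. "hand_pressed"
        self.last_run = None
        self.history = []             # dates on which the task ran

    def due(self, now, event=None):
        if self.on_event is not None:
            return event == self.on_event
        if self.every is not None:
            return self.last_run is None or now - self.last_run >= self.every
        return self.run_at is not None and now >= self.run_at and self.last_run is None

tasks = [
    ScheduledTask("birthday.script", run_at=datetime(2001, 5, 14, 9, 0)),
    ScheduledTask("joke.script", every=timedelta(hours=1)),
    ScheduledTask("hello.script", on_event="hand_pressed"),
]
```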
  • Scheduler—adding a task: Reference is now made to FIGS. 115 and 116 showing the Scheduler screen window display with the Add Task pop-up window and the Scheduler List pop-up window, respectively. [0530]
  • Find Toys: Reference is now made to FIG. 117 showing the Scriptwriter main screen display with the Find Toys and Assign Channels window. This command searches for the marked toys. The toys that are defined are identified. It can also tell the user which toys are awake, which are sleeping, and which do not exist. [0531]
  • Run Menu: Reference is now made to FIG. 118 showing the Scriptwriter main screen display with the Run menu open. This menu allows the user to Run his finished script, pause and so on. The menu typically offers options such as the following: [0532]
  • Run: Play the script. [0533]
  • Run from Selected: Allows the user to begin playing his script from a specific chosen point. [0534]
  • Pause: Pause the script midway at a chosen place. [0535]
  • Stop: Bring the running script to a complete stop. [0536]
  • Check Script: Check the script for any errors or hints (if there are any errors or hints in the script the Message window appears). [0537]
  • Tools Menu: The Tools Menu controls the environment of the IDE Scriptwriter program. A user can control the toys, volume, and sensors. [0538]
  • Options: The Options commands, in the illustrated embodiment, are split into the pages or screen windows illustrated in FIGS. 119-124, including Toys, Hardware, Environment, Volume Setting, Living Toys, Script and Report. [0539]
  • Toys Page: Reference is now made to FIG. 119 showing the Scriptwriter main screen display with the Option window at the Toys Page. In the Toys page the user can define the toys shown in the list. A toy is typically defined with a name (toy and name), a type (which defines the toy according to the operations it can perform), a number, and a channel. This page also allows a user to remove toys from the list. [0540]
  • Reference is now made to FIG. 120 showing the Script-writer main screen display with the Option window at the Hardware Page. [0541]
  • The Hardware page is split into the following three subsections: [0542]
  • a. Check Base Station—checks the communication between the base station and the program; it resets the base station. [0543]
  • b. Check Toys—the toys which the user has chosen to work with are checked. [0544]
  • c. Search for Toys—searches for a toy according to its number or channel; when the toy is found, the program activates the toy, which in return makes a sound. [0545]
  • There is also a Report box that reports what has happened, which toy was identified and which was not. [0546]
  • Environment Page: Reference is now made to FIG. 121 showing the Scriptwriter main screen display with the Option window at the Environment Page. [0547]
  • Simulation through the PC—simulation of the script run in the computer. [0548]
  • View Simulator—awakens the simulator doll inside the program. Advanced Properties—shows every object (e.g. talking, moving) with advanced properties. [0549]
  • Toy Identity—Changes the illustration in the script itself to an illustration of the chosen toy and not the generic illustration. This helps to clarify which toy is talking or operating at different points in a multi-toy script. [0550]
  • Default Toy—the toy in the script is the default toy. [0551]
  • Volume Setting Page: Reference is now made to FIG. 122 showing the Scriptwriter main screen display with the Option window at the Volume Setting Page. These Volume Settings are for the speaker as well as the microphone in the doll. A doll is selected, and the reload button is clicked on. This asks the doll to send the program its internal volume settings. An update can be made to these settings, saving the update and changing the original settings. After the update, another check of the volume settings is made. [0552]
  • Living Toy Page: Reference is now made to FIG. 123 showing the Scriptwriter main screen display with the Option window at the Living Toy Page. [0553]
  • Using the Living Toy page the user can activate all toys that are programmed for artificial life. The toy, like a live person, is set to “sleep”, “eat”, and “wake up” at a set time. A user can choose to activate only certain toys for artificial life; once the user has chosen them, they “wake up” if they are sleeping. [0554]
  • Script page: Reference is now made to FIG. 124 showing the Scriptwriter main screen display with the Option window at the Script page. [0555]
  • When selecting “activate automatic downloading”, the scripts from the internet are downloaded directly to a chosen place on the user's hard disk. This option is only available to those who register. A user can choose to receive only scripts that match criteria selected. [0556]
  • Report Page: The Report page typically comprises the following elements: [0557]
  • Save Logger—every script that has run can be saved along with the date of the running. This can help to keep better track of the scripts. [0558]
  • Save Listen File—can save any listen made in a script (it always saves the last listen heard). [0559]
  • Memory—this allows the user to add or remove a memory from the compilation of memories. [0560]
  • A suitable error list for the system described herein is the following: [0561]
  • 1—SR-Catch all error. Probably an internal error or a subtle corruption of the database. [0562]
  • 2—SR-User not found in database. [0563]
  • 3—SR-Language not found in database. [0564]
  • 4—SR-Syntax not found in database. [0565]
  • 5—SR-Context not found in database. [0566]
  • 6—SR-Database not found. [0567]
  • 7—SR-Dictionary not found in database. [0568]
  • 8—SR-Context with this name already exists in database. [0569]
  • 9—SR-Language with this name already exists in database. [0570]
  • 10—SR-Syntax with this name already exists in database. [0571]
  • 11—SR-User with this name already exists in database. [0572]
  • 12—SR-Database with this name already exists. [0573]
  • 13—SR-Error occurred while trying to activate context on recogniser. [0574]
  • 14—SR-Error occurred while trying to activate language on recogniser. [0575]
  • 15—SR-Error occurred while trying to activate syntax on recogniser. [0576]
  • 16—SR-Error occurred while trying to load user. [0577]
  • 17—SR-Grammar load failure. [0578]
  • 18—SR-No context defined. [0579]
  • 19—SR-No database defined. [0580]
  • 20—SR-No algorithm running. [0581]
  • 21—SR-No active context. [0582]
  • 22—SR-Invalid pointer (in a parameter). [0583]
  • 23—SR-Wrong inifile. [0584]
  • 24—SR-Access denied. [0585]
  • 25—SR-Buffer too small (in a parameter). [0586]
  • 26—SR-You cannot perform this action in this state. [0587]
  • 27—SR-Could not activate. [0588]
  • 28—SR-Out of heap memory. [0589]
  • 29—SR-No word recognized. [0590]
  • 30—SR-Invalid syntax. [0591]
  • 31—SR-Cannot merge given contexts. [0592]
  • 34—SR-WORDNOTFOUND-Cannot find or delete word. [0593]
  • 35—SR-Word already exists. [0594]
  • 36—SR-Class not found in context. [0595]
  • 37—SR-Cannot convert BNF file to context. [0596]
  • 38—SR-Cannot merge active words. [0597]
  • 39—SR-The active context is closed. [0598]
  • 40—SR-Cannot open file. [0599]
  • 41—SR-Cannot load library. [0600]
  • 42—SR-Cannot merge. [0601]
  • 43—SR-Wrong type (in a parameter). [0602]
  • 44—SR-Unsupported wave format. [0603]
  • 45—SR-Already active. [0604]
  • 46—SR-Context is still installed. [0605]
  • 47—SR-Cannot load context. [0606]
  • 48—SR-Context is not active. [0607]
  • 49—SR-Cannot load language. [0608]
  • 50—SR-Cannot load user. [0609]
  • 51—SR-Different languages cannot be active at the same time or trying to compile to a context with a different language. [0610]
  • 52—SR-Different users cannot be active at the same time. [0611]
  • 53—SR-No wave format specified. [0612]
  • 54—SR-Context is active. [0613]
  • 55—SR-Language is in use. [0614]
  • 56—SR-Language is in use. [0615]
  • 57—SR-Cannot create directory. [0616]
  • 58—SR-No valid database. [0617]
  • 59—SR-Database is opened. [0618]
  • 60—SR-Language is already registered. [0619]
  • 61—SR-Language is not registered. [0620]
  • 62—SR-Context is already registered. [0621]
  • 63—SR-Context is not registered. [0622]
  • 64—SR-Environment already exists. [0623]
  • 65—SR-Environment not found. [0624]
  • 66—SR-Cannot delete directory. [0625]
  • 67—SR-No dictionary specified. [0626]
  • 68—SR-Dictionary already exists. [0627]
  • 69—SR-DLL not found. [0628]
  • 70—SR-Corrupt DLL. [0629]
  • 71—SR-Database is corrupted. [0630]
  • 72—SR-Feature is not yet implemented. [0631]
  • 73—SR-Invalid input (of a parameter, or input signal). [0632]
  • 74—SR-Conversion failed. [0633]
  • 75—SR-Unable to copy a file. [0634]
  • 76—SR-Unable to delete a file. [0635]
  • 77—SR-Context is opened. [0636]
  • 78—SR-Bad name. [0637]
  • 79—SR-Incompatibility problem. [0638]
  • 80—SR-Disk full. [0639]
  • 81—SR-Dictionary is opened. [0640]
  • 82—SR-Format not found. [0641]
  • 83—SR-Symbol already exists in library. [0642]
  • 84—SR-Symbol not found in library. [0643]
  • 85—SR-Database is in use by a recogniser. [0644]
  • 86—SR-Dictionary is in use. [0645]
  • 87—SR-Syntax is in use. [0646]
  • 88—SR-Error creating file. [0647]
  • 89—SR-License Number in asrapi is invalid. [0648]
  • 90—SR-No training set found. [0649]
  • 91—SR-Property not found. [0650]
  • 92—SR-Export not found. [0651]
  • 93—SR-Value out of range. [0652]
  • 94—SR-No context library defined. [0653]
  • 95—SR-Different database used. [0654]
  • 96—SR-Error when generating transcription of a word. [0655]
  • 97—SR-Agc can not be active during user word training. [0656]
  • -1—TTS-File not found. [0657]
  • -2—TTS-File creation error. [0658]
  • -3—TTS-File writing error. [0659]
  • -4—TTS-Memory allocation error. [0660]
  • -5—TTS-Memory locking error. [0661]
  • -6—TTS-Memory unlocking error. [0662]
  • -7—TTS-Memory free error. [0663]
  • -8—TTS-Wave Device open error. [0664]
  • -9—TTS-Wave device closing error. [0665]
  • -10—TTS-Specified waveformat not supported. [0666]
  • -11—TTS-No wave devices available. [0667]
  • -12—TTS-TTS has not been initialized. [0668]
  • -13—TTS-Specified frequency not available. [0669]
  • -14—TTS-Specified parameter is out of range. [0670]
  • -15—TTS-Specified output PCM format not available. [0671]
  • -16—TTS-TTS system is busy. [0672]
  • -17—TTS-Not authorized TTS DLL is used. [0673]
  • -18—TTS-Dictionary loading error. [0674]
  • -19—TTS-wrong dictionary handle. [0675]
  • -20—TTS-Wave device writing error. [0676]
  • -21—TTS-No input text. [0677]
  • -22—TTS-Bad command for current state. [0678]
  • -23—TTS-Grapheme to phoneme conversion fail. [0679]
  • -24—TTS-Unknown dictionary format has been found. [0680]
  • -25—TTS-Creating instance error. [0681]
  • -26—TTS-No more TTS instance available. [0682]
  • -27—TTS-Invalid TTS instance has been specified. [0683]
  • -28—TTS-Invalid TTS engine has been specified. [0684]
  • -29—TTS-TTS instance is busy. [0685]
  • -30—TTS-TTS engine loading error. [0686]
  • -31—TTS-No engine has been selected. [0687]
  • -32—TTS-Internal system error. [0688]
  • -33—TTS-Specified wave device is busy. [0689]
  • -34—TTS-Invalid dictionary entry has been specified. [0690]
  • -35—TTS-Too long source or destination text has been used. [0691]
  • -36—TTS-Max. dictionary entries are reached. [0692]
  • -37—TTS-Specified entry exists already. [0693]
  • -38—TTS-Not enough space. [0694]
  • -39—TTS-Invalid argument. [0695]
  • -40—TTS-Invalid voice id. [0696]
  • -41—TTS-No engine has been specified. [0697]
  • x01—Invalid Handle. [0698]
  • x02—Device already opened. [0699]
  • x03—Device can't setup. [0700]
  • x04—Memory allocation. [0701]
  • x05—No communication. [0702]
  • x06—System. [0703]
  • x07—Base not connected [0704]
  • x08—Timeout. [0705]
  • x09—Invalid register number. [0706]
  • x10—Invalid channel. [0707]
  • x11—Invalid DeviceID. [0708]
  • x12—Wrong state. [0709]
  • x13—Invalid parameter. [0710]
  • x14—Sound card IN opened. [0711]
  • x15—Sound card OUT opened. [0712]
  • x16—File open. [0713]
  • x17—File create. [0714]
  • x18—File read. [0715]
  • x19—File write. [0716]
  • x20—Format not supported. [0717]
  • x21—TTS speech generation. [0718]
  • x22—SR engine not active [0719]
  • x23—Buffer is too small. [0720]
  • x24—SR no active context. [0721]
  • x25—TTS engine not active. [0722]
  • In the above error list, SR is an abbreviation for “speech recognition” and TTS is an abbreviation for “text to speech”. [0723]
  • It is appreciated that the apparatus of the present invention is useful for generating scripts not only for toys but also for any computer-controllable object. [0724]
  • A preferred method for multilevel interaction between a user and a toy or other entertainment element is now described. Effective interaction is based on a continuous effort to cultivate the interest of the other party. To achieve this goal, each party struggles to identify the possible interests of the other party. This is done in daily small talk as well as in business negotiations. [0725]
  • In some situations, a direct question can be used to identify a possible interest or even a detailed form. However, in many situations an indirect approach is more useful. The indirect approach is more appropriate when conversing with kids and especially for very young children. [0726]
  • The present invention relates to machines' interaction with humans, where the machines use the indirect approach to identify the characteristics, preferences and interests of the person. The proper identification of the characteristics, preferences and interests enables the machine to suggest appropriate content, information, entertainment, games, stories, riddles and jokes as well as TV programs, movies and theater shows, and various articles on sale. [0727]
  • A multilevel question is a question that has two or more correct answers. The answer selected by the questioned person provides information about him or her. Such information may be: the age range, gender, culture, breadth of knowledge, preferences and inclinations, possible areas of interest and associations. [0728]
  • Typically the question is a trivia question with one “obvious” or “superfluous” answer and another “concealed” or “deep” answer that requires a somewhat “increased” level of knowledge. Typically the question contains a clue that orients the “knowledgeable” person to select the “concealed” answer. The “unknowledgeable” person disregards the clue and selects the “obvious” answer. [0729]
  • There can be several clues for the “concealed answer” and there can be more than two “concealed answers” identifying, respectively, more than two “layers of knowledge.”[0730]
  • Any suitable clues can be provided such as: [0731]
  • a. Complex syntax, even somewhat “wrong” syntax. [0732]
  • b. Soundex, using words that sound alike but have different meanings. [0733]
  • c. Additional information that the “knowledgeable person” can identify as redundant. [0734]
  • The “closed” type of “multi layered questions” provides the answers while the “open” type does not. For example (“open” version): [0735]
  • Question: Who were the beetles? [0736]
  • Answer 1: Insects. [0737]
  • Answer 2: A pop music group. [0738]
  • “Concealment”: the soundex beetles and Beatles [0739]
  • “Clue”: the use of “were” instead of “are.” [0740]
  • The knowledge of the Beatles reveals the age group of the person. [0741]
  • The “closed” version of this question may be: [0742]
  • Question: What did the beetles do? [0743]
  • Answer 1: Crawl [0744]
  • Answer 2: Sing. [0745]
  • It is possible to identify the knowledge of the Beatles (to identify the age group) by asking, for example, “Who sailed the yellow submarine?” The disadvantage of this method is that the “unknowledgeable person” does not know the answer and is therefore discouraged. If a “closed form” question is used, such as “Who sailed the yellow submarine, the beetles or the crocodiles?” the “unknowledgeable person” can still guess and with a 50% probability select the correct answer. The multilevel question provides two (or more) correct answers and therefore, even the “unknowledgeable person” can knowingly select a correct answer. [0746]
  • The preferred embodiment of the current invention relates to computer games played by kids and, more specifically, to games played with computer controlled toys. The present invention employs multilevel questions to identify the player's age and level of knowledge to select the level of content (game or story) to play with the player. The interrogation is performed by verbally asking the questions and performing speech recognition on the recorded response. [0747]
  • It is clear that other forms of interaction can also be employed. Such forms can be visual and tactile, employing screen display, keyboard, pointing device such as mouse, switches and sensors. Concealed information can also be provided by moving or lighting limbs of a doll or a graphic character on screen, or other parts of the scene. [0748]
  • FIG. 125 is a table that presents an interactive script between a toy and a player where the toy determines the characteristics of the player (namely, age range) to suggest the appropriate level of game content. [0749]
  • A list of multilevel questions is maintained with a set of characteristics associated with each question. The characteristics define the types of persons that the question can help identify according to characteristics such as gender and age group, as shown in the table of FIG. 126. When the machine is required to characterize the person who uses it, the machine selects the appropriate question according to its associated characteristics. For example, if the player selects a male avatar, the program may ask a multilevel question with an obvious answer of general knowledge and a concealed answer regarding, for example, football. If the program is about to present an advertisement, it may first identify the age group by presenting a multilevel question that is associated with age group identification. [0750]
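A sketch of how such a question list and its per-question characteristics might be stored and queried is given below. The field names, the sample football question, and the selection rule are illustrative assumptions, not the actual table layout of FIG. 126.

```python
from dataclasses import dataclass

@dataclass
class MultilevelQuestion:
    text: str
    answers: dict            # answer -> knowledge layer it reveals
    discriminates: set       # characteristics this question helps identify

QUESTIONS = [
    MultilevelQuestion(
        text="What did the beetles do?",
        answers={"crawl": "younger / no Beatles knowledge",
                 "sing": "older / knows the Beatles"},
        discriminates={"age_group"},
    ),
    MultilevelQuestion(
        text="Which player won the game?",          # hypothetical example only
        answers={"the tallest one": "general knowledge",
                 "the quarterback": "football knowledge"},
        discriminates={"interest:football"},
    ),
]

def pick_question(target_characteristic: str):
    """Select a question whose associated characteristics match what we want to learn."""
    for q in QUESTIONS:
        if target_characteristic in q.discriminates:
            return q
    return None

q = pick_question("age_group")
print(q.text)
```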
  • FIG. 126 is a table that stores an example of a list of multilevel questions. [0751]
  • FIG. 127 is a script diagram of an interactive script where a toy determines the characteristics of a player (e.g. age range) to suggest the appropriate level of game content. [0752]
  • In accordance with a preferred embodiment of the present invention, the adaptive toy system of FIGS. 1-5 comprises a learning machine that uses trivia game scripts. An example of such trivia game scripts is described in FIGS. 6-9. A detailed example of an adaptive toy system that uses trivia game scripts in accordance with a preferred embodiment of the present invention is now described. [0753]
  • Reference is now made to FIG. 128 which describes by means of a flowchart the basic stages of a learning procedure for a toy system in accordance with a preferred embodiment of the present invention. As shown in FIG. 128, trivia game scripts are sent to users of a toy system. For example, a sample-group of users of a given age group from among the entire community of users on a toy system receive the said scripts over a network. It is appreciated that the age of a user is typically registered by a toy system. An example of a registration procedure in accordance with a preferred embodiment of the present invention is described in FIG. 16. [0754]
  • The results of game scripts played by individual users are retrieved by a toy system. For example, the results of each user are initially processed on a personal computer controlling the toy of the said user, and the results are then sent over a network to a central computer on a server of a toy system. [0755]
  • The learning process of FIG. 128 typically comprises a stage of checking whether game results are “meaningful”. For example, game results are typically considered not meaningful if the game scripts turned out to be too difficult for users participating in a sample group. In such a case, one or more parts of a learning process are typically modified before the whole process is repeated. If game results are considered meaningful, such results are typically used in order to change the content provided by a toy system, possibly for the entire community of users on such a system. Examples of this as well as the previous stages of the procedure of FIG. 128 are detailed below. Thus, the procedure of FIG. 128 preferably provides the toy system with one or both of the following two adaptation capabilities: 1) change of toy content; and 2) modification of the learning procedure itself. [0756]
  • Reference is now made to FIG. 129 that describes by means of a flowchart an example of a procedure of analyzing game results for a single user in accordance with a preferred embodiment of the present invention. The game in the present example includes three trivia game scripts that are described in FIGS. 7-9. The following four parameters are used in the course of the present procedure: [0757]
  • 1. S1Boring: shows the number of questions to which a user did not give a recognizable answer. Since a game in the present example comprises 9 questions, the value of this parameter is between 0 and 9. [0758]
  • 2. S1L1: shows which and how many questions from among the 3 questions of level 1 were answered correctly by a user. Typically, S1L1 is incremented by 2^(n−1) if the n'th question is answered correctly, so that for each value of S1L1 it is possible to discern which questions were answered correctly. For example, if S1L1 equals 1, 2 or 4, that means that only a single question (questions 1, 2 and 3, respectively) was answered correctly by a user. [0759]
  • 3. S1L2: shows which and how many questions of level 2 were answered correctly by a user. [0760]
  • 4. S1L3: shows which and how many questions of level 3 were answered correctly by a user. [0761]
  • The following are possible results of the procedure described in FIG. 129 (a sketch of this classification appears after the list): [0762]
  • 1. User bored: is defined in the present example as a case where a user has ignored or not given a recognizable answer to more than three of the nine questions of the whole trivia game. [0763]
  • 2. Game too easy: is defined in the present example as a case where a user has answered correctly two or more of the questions in each of the three game scripts. [0764]
  • 3. Game too difficult: is defined in the present example as a case where a user has only answered correctly up to a single question in each of the three game scripts. [0765]
  • 4. The user has succeeded in some yet not all of the three trivia game scripts. As shown in FIG. 129, there are six cases in the present example that correspond to this result. For example, a user has answered correctly at least two questions in each of the first two game scripts (“what is bigger?” and “what is faster?”) but has correctly answered no more than a single question in the third game script (“what is heavier?”). [0766]
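Under the bitmask convention described for S1L1 (2^(n−1) for the n'th question), counting correct answers reduces to counting set bits, and the four result categories reduce to the thresholds listed above. The following is a minimal sketch of that per-user classification; names and return values are illustrative only.

```python
def answered_correctly(mask: int) -> int:
    """Number of correctly answered questions encoded in a 2**(n-1) bitmask."""
    return bin(mask).count("1")

def classify_user(s1_boring, s1_l1, s1_l2, s1_l3):
    """Classify one user's results; returns a label or the tuple of scripts passed."""
    counts = [answered_correctly(m) for m in (s1_l1, s1_l2, s1_l3)]
    if s1_boring > 3:
        return "user bored"
    if all(c >= 2 for c in counts):
        return "game too easy"
    if all(c <= 1 for c in counts):
        return "game too difficult"
    return tuple(i + 1 for i, c in enumerate(counts) if c >= 2)   # one of the six mixed cases

# Example: questions 1 and 3 of level 1 answered correctly -> mask 1 + 4 = 5.
print(classify_user(s1_boring=1, s1_l1=5, s1_l2=3, s1_l3=0))      # -> (1, 2)
```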
  • It is appreciated that such game scripts, possibly comprising new sets of questions for each game script, may be repeatedly used for a single user, for example, at different times of day, until reliable results are accumulated. Preferably, results of such game scripts are processed on a personal computer that controls the toy of the user in question, and then sent over a network to a computer on a server of a toy system. This allows utilization of the multiplicity of computing power resources available on a networked system. Results sent to a computer on a server may or may not comprise information regarding a user's identity. For example, game results of a multiplicity of users are used in order to derive statistical data related to a sample group as a whole as described below. In such a case, no information related to users' identities is typically required. [0767]
  • Reference is now made to FIG. 130 that describes by means of a flowchart an example of a procedure of handling game results of a plurality of users, for example, members of a sample group, in accordance with a preferred embodiment of the present invention. In this example, the same categories of game results are used as are described in FIG. 129. The previously described results of individual users are now treated at the sample group level. [0768]
  • As shown in FIG. 130, if a large number of users from among the sample group members (30% or more, in the present example) were bored with the game scripts, the scripts are modified. This is performed either by automatic selection of different database scripts, or, in case the same has already previously occurred, by generation of new scripts, possibly by means of direct human intervention. [0769]
  • Also as shown in FIG. 130, a learning procedure is modified in case the game scripts turn out to be either too easy or too difficult for a large number of users from among a sample group (30% or more in the present example). In such a case, the procedure of FIG. 128 is repeated for a sample group of users of either a higher or a lower age group, depending on whether the game scripts were too difficult or too easy as shown in FIG. 130. Thus, this example also illustrates a method of handling a case described in FIG. 128 where sample group results are initially not meaningful in order to serve as learning data for the entire community of users on a toy system. [0770]
  • The final stage of the procedure described in FIG. 130 comprises using sample group results in order to change the content that is sent to toys on a toy system, possibly affecting the entire community of users on such a system. For example, a possible result of the procedure of FIG. 130 is that a large number of users from among the sample group members (e.g. 50% or more) are found to have succeeded in game script 1 but not in game scripts 2 and 3. In the context of the illustrated example, such a result possibly implies that many users of the age group in question have a well developed understanding of the concept of “bigger”, yet have not yet acquired sufficient understanding of the concepts of “faster” and “heavier”. In such a case, a toy system preferably generates a toy content package where knowledge of “what is bigger” is assumed, and which is intended to facilitate acquisition of the concepts of “faster” and “heavier”. Such a content package is then made available for sending to users of the age group concerned from among the entire community of users on a toy system or any subgroup thereof. [0771]
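At the sample-group level, the decisions described above come down to a few percentage thresholds. The sketch below is a schematic of that logic only, reusing the per-user classifications from the previous sketch; the 30% and 50% figures are taken from the example in the text.

```python
def sample_group_decision(results: list) -> str:
    """results: one classification per sample-group user, e.g. 'user bored',
    'game too easy', 'game too difficult', or a tuple of the script numbers
    the user succeeded in, such as (1,) or (1, 2)."""
    n = len(results)
    frac = lambda label: sum(r == label for r in results) / n
    if frac("user bored") >= 0.30:
        return "modify scripts: select different database scripts or author new ones"
    if frac("game too easy") >= 0.30:
        return "repeat the learning procedure with an older sample group"
    if frac("game too difficult") >= 0.30:
        return "repeat the learning procedure with a younger sample group"
    if frac((1,)) >= 0.50:
        return "generate content that assumes 'bigger' and teaches 'faster' and 'heavier'"
    return "use the results to adjust content for the community of users"
```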
  • FIG. 131 is a simplified flowchart illustration of a procedure whereby results of a previously performed learning procedure are used in order to customize toy content to a user of a toy system. A user receives a game script that is intended to allow determination of one or more characteristics of that user. For example, a trivia game script such as that described in FIGS. 6-9 is used. A process of analyzing game results for a single user is provided, such as, for example, the procedure described in FIG. 129. Game results of that user are then matched with results of a previously performed learning procedure at a sample group level, such as, for example, the procedure described in FIG. 130. Finally, content is customized to the said user depending on the results of the said matching step. For example, a user has succeeded in game script 1 but not in game scripts 2 and 3. Then, a previously generated content package that is especially intended for users with such game results as described above is customized to the user in the present example. [0772]
  • It is appreciated that the software components of the present invention may, if desired, be implemented in ROM (read-only memory) form. The software components may, generally, be implemented in hardware, if desired, using conventional techniques. [0773]
  • It is appreciated that various features of the invention which are, for clarity, described in the contexts of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable subcombination. [0774]
  • It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention is defined only by the claims that follow: [0775]

Claims (5)

1. An adaptive toy system comprising:
a multiplicity of interactive toys, each of which is connected to a computer network; and
adaptive toy operation software which is supplied to said multiplicity of interactive toys via said computer network, said adaptive toy operation software being operative to provide feedback, based on play experience with at least some of said multiplicity of interactive toys, via said computer network and to employ said feedback in adapting itself so as to change the play experience provided thereby.
2. An adaptive entertainment system comprising:
a multiplicity of interactive entertainment units, each of which is connected to a computer network; and
adaptive entertainment software which is supplied to said multiplicity of interactive entertainment units via said computer network, said adaptive entertainment software being operative to provide feedback, based on play experience with at least some of said multiplicity of interactive entertainment units, via said computer network and to employ said feedback in adapting itself so as to change the entertainment experience provided thereby.
3. A method for establishing a network of toys, the method comprising:
providing a plurality of scripts for at least some of the network of toys; and
sending at least one of the plurality of scripts to at least one of the toys in the network, over the network.
4. A toy system comprising:
an electronic toy content shop providing users with an option to pre-purchase accounts for users; and
a plurality of networked toys directly connected via a network to the electronic toy content shop, the toys being operative to load themselves with at least one script sold at the electronic toy content shop; wherein
only a subset of the scripts sold at the electronic toy content shop are displayed to a user, depending on at least one personal characteristic of the user.
5. A toy system providing multi-level interaction between a population of users and a population of toys, the system comprising:
at least one script operative to pose at least one question to at least one user about a topic other than the user's characteristics, each of the scripts being operative to analyze the user's answer, to act upon its content, and to derive knowledge about the user's characteristics from that answer.
US09/730,154 1999-12-29 2000-12-05 Adaptive toy system and functionality Abandoned US20020068500A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/730,154 US20020068500A1 (en) 1999-12-29 2000-12-05 Adaptive toy system and functionality

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US17360399P 1999-12-29 1999-12-29
US18307000P 2000-02-16 2000-02-16
US09/730,154 US20020068500A1 (en) 1999-12-29 2000-12-05 Adaptive toy system and functionality

Publications (1)

Publication Number Publication Date
US20020068500A1 true US20020068500A1 (en) 2002-06-06

Family

ID=26869335

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/730,154 Abandoned US20020068500A1 (en) 1999-12-29 2000-12-05 Adaptive toy system and functionality

Country Status (3)

Country Link
US (1) US20020068500A1 (en)
AU (1) AU2216201A (en)
WO (1) WO2001050342A1 (en)

Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020059386A1 (en) * 2000-08-18 2002-05-16 Lg Electronics Inc. Apparatus and method for operating toys through computer communication
US20030005067A1 (en) * 2001-06-29 2003-01-02 Martin Anthony G. System and method for using continuous messaging units in a network architecture
WO2003085545A1 (en) * 2001-08-31 2003-10-16 Intuition Intelligence, Inc. Processing device with intuitive learning capability
US20040033833A1 (en) * 2002-03-25 2004-02-19 Briggs Rick A. Interactive redemption game
US20040072498A1 (en) * 2002-10-15 2004-04-15 Yeon Ku Beom System and method for controlling toy using web
US20040077265A1 (en) * 1999-07-10 2004-04-22 Ghaly Nabil N. Interactive paly device and method
US20040077423A1 (en) * 2001-11-16 2004-04-22 Weston Denise Chapman Interactive quest game
US20050059483A1 (en) * 2003-07-02 2005-03-17 Borge Michael D. Interactive action figures for gaming schemes
US20050143173A1 (en) * 2000-02-22 2005-06-30 Barney Jonathan A. Magical wand and interactive play experience
US20050177428A1 (en) * 2003-12-31 2005-08-11 Ganz System and method for toy adoption and marketing
US20050192864A1 (en) * 2003-12-31 2005-09-01 Ganz System and method for toy adoption and marketing
US20050266907A1 (en) * 2002-04-05 2005-12-01 Weston Denise C Systems and methods for providing an interactive game
US20060100018A1 (en) * 2003-12-31 2006-05-11 Ganz System and method for toy adoption and marketing
US20060143202A1 (en) * 2002-11-27 2006-06-29 Parker Eric G Efficient data structure
US20060234601A1 (en) * 2000-10-20 2006-10-19 Weston Denise C Children's toy with wireless tag/transponder
US20070066396A1 (en) * 2002-04-05 2007-03-22 Denise Chapman Weston Retail methods for providing an interactive product to a consumer
US20080183678A1 (en) * 2006-12-29 2008-07-31 Denise Chapman Weston Systems and methods for personalizing responses to user requests
US20080290987A1 (en) * 2007-04-22 2008-11-27 Lehmann Li Methods and apparatus related to content sharing between devices
US20090204908A1 (en) * 2008-02-11 2009-08-13 Ganz Friends list management
US20090228557A1 (en) * 2008-03-04 2009-09-10 Ganz, An Ontario Partnership Consisting Of 2121200 Ontario Inc. And 2121812 Ontario Inc. Multiple-layer chat filter system and method
US20090248544A1 (en) * 2008-04-01 2009-10-01 Ganz, an Ontario partnership consisting of 212100 Ontario Inc. and 2121812 Ontario Inc. Reverse product purchase in a virtual environment
WO2010093995A1 (en) * 2009-02-13 2010-08-19 Social Gaming Network Apparatuses, methods and systems for an interworld feedback platform bridge
US20100257157A1 (en) * 2007-08-13 2010-10-07 Yuusuke Tomita Communication device, communication analysis method, and communication analysis program
US7827072B1 (en) 2008-02-18 2010-11-02 United Services Automobile Association (Usaa) Method and system for interface presentation
US7850527B2 (en) 2000-02-22 2010-12-14 Creative Kingdoms, Llc Magic-themed adventure game
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US20110078030A1 (en) * 2009-09-29 2011-03-31 Ganz Website with activities triggered by clickable ads
US8042061B1 (en) 2008-02-18 2011-10-18 United Services Automobile Association Method and system for interface presentation
US8073866B2 (en) 2005-03-17 2011-12-06 Claria Innovations, Llc Method for providing content to an internet user based on the user's demonstrated content preferences
US8078602B2 (en) 2004-12-17 2011-12-13 Claria Innovations, Llc Search engine for a computer network
US8086697B2 (en) 2005-06-28 2011-12-27 Claria Innovations, Llc Techniques for displaying impressions in documents delivered over a computer network
US8170912B2 (en) 2003-11-25 2012-05-01 Carhamm Ltd., Llc Database structure and front end
US8205158B2 (en) 2006-12-06 2012-06-19 Ganz Feature codes and bonuses in virtual worlds
US8226493B2 (en) 2002-08-01 2012-07-24 Creative Kingdoms, Llc Interactive play devices for water play attractions
US8255413B2 (en) 2004-08-19 2012-08-28 Carhamm Ltd., Llc Method and apparatus for responding to request for information-personalization
US20120246585A9 (en) * 2008-07-14 2012-09-27 Microsoft Corporation System for editing an avatar
US8316003B2 (en) 2002-11-05 2012-11-20 Carhamm Ltd., Llc Updating content of presentation vehicle in a computer network
US8380725B2 (en) 2010-08-03 2013-02-19 Ganz Message filter with replacement text
USRE44054E1 (en) 2000-12-08 2013-03-05 Ganz Graphic chatting with organizational avatars
US20130344770A1 (en) * 2011-12-07 2013-12-26 Ubooly Inc. Interactive toy
US8620952B2 (en) 2007-01-03 2013-12-31 Carhamm Ltd., Llc System for database reporting
US8645941B2 (en) 2005-03-07 2014-02-04 Carhamm Ltd., Llc Method for attributing and allocating revenue related to embedded software
US8689238B2 (en) 2000-05-18 2014-04-01 Carhamm Ltd., Llc Techniques for displaying impressions in documents delivered over a computer network
US8758136B2 (en) 1999-02-26 2014-06-24 Mq Gaming, Llc Multi-platform gaming systems and methods
US20160034042A1 (en) * 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US9443515B1 (en) * 2012-09-05 2016-09-13 Paul G. Boyce Personality designer system for a detachably attachable remote audio object
US9446316B2 (en) * 2012-12-11 2016-09-20 Activision Publishing, Inc. Interactive video game system comprising toys with rewritable memories
US9446319B2 (en) 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
US9495446B2 (en) 2004-12-20 2016-11-15 Gula Consulting Limited Liability Company Method and device for publishing cross-network user behavioral data
US9659011B1 (en) 2008-02-18 2017-05-23 United Services Automobile Association (Usaa) Method and system for interface presentation
US10043412B2 (en) 2013-05-26 2018-08-07 Dean Joseph Lore System for promoting travel education
US20190027132A1 (en) * 2016-03-31 2019-01-24 Shenzhen Kuang-Chi Hezhong Technology Ltd. Cloud-based device and operating method therefor
USD894948S1 (en) * 2019-09-05 2020-09-01 Canopy Growth Corporation Display screen or portion thereof with transitional graphical user interface
US11358059B2 (en) 2020-05-27 2022-06-14 Ganz Live toy system
US11389735B2 (en) 2019-10-23 2022-07-19 Ganz Virtual pet system
US11455679B2 (en) 2005-03-30 2022-09-27 Ebay Inc. Methods and systems to browse data items
US20230018066A1 (en) * 2020-11-20 2023-01-19 Aurora World Corporation Apparatus and system for growth type smart toy

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5213510A (en) * 1991-07-09 1993-05-25 Freeman Michael J Real-time interactive conversational toy
US5752160A (en) * 1995-05-05 1998-05-12 Dunn; Matthew W. Interactive entertainment network system and method with analog video startup loop for video-on-demand
US5795228A (en) * 1996-07-03 1998-08-18 Ridefilm Corporation Interactive computer-based entertainment system
US5947478A (en) * 1997-05-21 1999-09-07 Kwan; David Chu Ki Toy electronic game with flexible interactive play section
US5964660A (en) * 1997-06-18 1999-10-12 Vr-1, Inc. Network multiplayer game
US6196920B1 (en) * 1998-03-31 2001-03-06 Masque Publishing, Inc. On-line game playing with advertising

Cited By (185)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10300374B2 (en) 1999-02-26 2019-05-28 Mq Gaming, Llc Multi-platform gaming systems and methods
US8758136B2 (en) 1999-02-26 2014-06-24 Mq Gaming, Llc Multi-platform gaming systems and methods
US8888576B2 (en) 1999-02-26 2014-11-18 Mq Gaming, Llc Multi-media interactive play system
US9186585B2 (en) 1999-02-26 2015-11-17 Mq Gaming, Llc Multi-platform gaming systems and methods
US9468854B2 (en) 1999-02-26 2016-10-18 Mq Gaming, Llc Multi-platform gaming systems and methods
US9731194B2 (en) 1999-02-26 2017-08-15 Mq Gaming, Llc Multi-platform gaming systems and methods
US9861887B1 (en) 1999-02-26 2018-01-09 Mq Gaming, Llc Multi-platform gaming systems and methods
US20040077265A1 (en) * 1999-07-10 2004-04-22 Ghaly Nabil N. Interactive paly device and method
US7850527B2 (en) 2000-02-22 2010-12-14 Creative Kingdoms, Llc Magic-themed adventure game
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US20050143173A1 (en) * 2000-02-22 2005-06-30 Barney Jonathan A. Magical wand and interactive play experience
US10188953B2 (en) 2000-02-22 2019-01-29 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US8164567B1 (en) 2000-02-22 2012-04-24 Creative Kingdoms, Llc Motion-sensitive game controller with optional display screen
US8169406B2 (en) 2000-02-22 2012-05-01 Creative Kingdoms, Llc Motion-sensitive wand controller for a game
US8184097B1 (en) 2000-02-22 2012-05-22 Creative Kingdoms, Llc Interactive gaming system and method using motion-sensitive input device
US10307671B2 (en) 2000-02-22 2019-06-04 Mq Gaming, Llc Interactive entertainment system
US8368648B2 (en) 2000-02-22 2013-02-05 Creative Kingdoms, Llc Portable interactive toy with radio frequency tracking device
US9814973B2 (en) 2000-02-22 2017-11-14 Mq Gaming, Llc Interactive entertainment system
US8475275B2 (en) 2000-02-22 2013-07-02 Creative Kingdoms, Llc Interactive toys and games connecting physical and virtual play environments
US8491389B2 (en) 2000-02-22 2013-07-23 Creative Kingdoms, Llc. Motion-sensitive input device and interactive gaming system
US9713766B2 (en) 2000-02-22 2017-07-25 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US7896742B2 (en) 2000-02-22 2011-03-01 Creative Kingdoms, Llc Apparatus and methods for providing interactive entertainment
US9579568B2 (en) 2000-02-22 2017-02-28 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US8686579B2 (en) 2000-02-22 2014-04-01 Creative Kingdoms, Llc Dual-range wireless controller
US9474962B2 (en) 2000-02-22 2016-10-25 Mq Gaming, Llc Interactive entertainment system
US8089458B2 (en) 2000-02-22 2012-01-03 Creative Kingdoms, Llc Toy devices and methods for providing an interactive play experience
US8708821B2 (en) 2000-02-22 2014-04-29 Creative Kingdoms, Llc Systems and methods for providing interactive game play
US20090051653A1 (en) * 2000-02-22 2009-02-26 Creative Kingdoms, Llc Toy devices and methods for providing an interactive play experience
US8790180B2 (en) 2000-02-22 2014-07-29 Creative Kingdoms, Llc Interactive game and associated wireless toy
US8814688B2 (en) 2000-02-22 2014-08-26 Creative Kingdoms, Llc Customizable toy for playing a wireless interactive game having both physical and virtual elements
US8915785B2 (en) 2000-02-22 2014-12-23 Creative Kingdoms, Llc Interactive entertainment system
US9149717B2 (en) 2000-02-22 2015-10-06 Mq Gaming, Llc Dual-range wireless interactive entertainment device
US8689238B2 (en) 2000-05-18 2014-04-01 Carhamm Ltd., Llc Techniques for displaying impressions in documents delivered over a computer network
US20020059386A1 (en) * 2000-08-18 2002-05-16 Lg Electronics Inc. Apparatus and method for operating toys through computer communication
US9480929B2 (en) 2000-10-20 2016-11-01 Mq Gaming, Llc Toy incorporating RFID tag
US20060234601A1 (en) * 2000-10-20 2006-10-19 Weston Denise C Children's toy with wireless tag/transponder
US8961260B2 (en) 2000-10-20 2015-02-24 Mq Gaming, Llc Toy incorporating RFID tracking device
US10307683B2 (en) 2000-10-20 2019-06-04 Mq Gaming, Llc Toy incorporating RFID tag
US9320976B2 (en) 2000-10-20 2016-04-26 Mq Gaming, Llc Wireless toy systems and methods for interactive entertainment
US9931578B2 (en) 2000-10-20 2018-04-03 Mq Gaming, Llc Toy incorporating RFID tag
US8753165B2 (en) 2000-10-20 2014-06-17 Mq Gaming, Llc Wireless toy systems and methods for interactive entertainment
USRE44054E1 (en) 2000-12-08 2013-03-05 Ganz Graphic chatting with organizational avatars
US8384668B2 (en) 2001-02-22 2013-02-26 Creative Kingdoms, Llc Portable gaming device and gaming system combining both physical and virtual play elements
US8711094B2 (en) 2001-02-22 2014-04-29 Creative Kingdoms, Llc Portable gaming device and gaming system combining both physical and virtual play elements
US9393491B2 (en) 2001-02-22 2016-07-19 Mq Gaming, Llc Wireless entertainment device, system, and method
US10179283B2 (en) 2001-02-22 2019-01-15 Mq Gaming, Llc Wireless entertainment device, system, and method
US10758818B2 (en) 2001-02-22 2020-09-01 Mq Gaming, Llc Wireless entertainment device, system, and method
US9162148B2 (en) 2001-02-22 2015-10-20 Mq Gaming, Llc Wireless entertainment device, system, and method
US8913011B2 (en) 2001-02-22 2014-12-16 Creative Kingdoms, Llc Wireless entertainment device, system, and method
US8248367B1 (en) 2001-02-22 2012-08-21 Creative Kingdoms, Llc Wireless gaming system combining both physical and virtual play elements
US9737797B2 (en) 2001-02-22 2017-08-22 Mq Gaming, Llc Wireless entertainment device, system, and method
US7219139B2 (en) * 2001-06-29 2007-05-15 Claria Corporation System and method for using continuous messaging units in a network architecture
US20030005067A1 (en) * 2001-06-29 2003-01-02 Martin Anthony G. System and method for using continuous messaging units in a network architecture
WO2003085545A1 (en) * 2001-08-31 2003-10-16 Intuition Intelligence, Inc. Processing device with intuitive learning capability
US20040077423A1 (en) * 2001-11-16 2004-04-22 Weston Denise Chapman Interactive quest game
US20100056285A1 (en) * 2001-11-16 2010-03-04 Creative Kingdoms, Llc Systems and methods for interactive game play using a plurality of consoles
US20040033833A1 (en) * 2002-03-25 2004-02-19 Briggs Rick A. Interactive redemption game
US9616334B2 (en) 2002-04-05 2017-04-11 Mq Gaming, Llc Multi-platform gaming system using RFID-tagged toys
US9463380B2 (en) 2002-04-05 2016-10-11 Mq Gaming, Llc System and method for playing an interactive game
US11278796B2 (en) 2002-04-05 2022-03-22 Mq Gaming, Llc Methods and systems for providing personalized interactive entertainment
US8702515B2 (en) 2002-04-05 2014-04-22 Mq Gaming, Llc Multi-platform gaming system using RFID-tagged toys
US10010790B2 (en) 2002-04-05 2018-07-03 Mq Gaming, Llc System and method for playing an interactive game
US8827810B2 (en) 2002-04-05 2014-09-09 Mq Gaming, Llc Methods for providing interactive entertainment
US10507387B2 (en) 2002-04-05 2019-12-17 Mq Gaming, Llc System and method for playing an interactive game
US10478719B2 (en) 2002-04-05 2019-11-19 Mq Gaming, Llc Methods and systems for providing personalized interactive entertainment
US8608535B2 (en) 2002-04-05 2013-12-17 Mq Gaming, Llc Systems and methods for providing an interactive game
US20070066396A1 (en) * 2002-04-05 2007-03-22 Denise Chapman Weston Retail methods for providing an interactive product to a consumer
US9272206B2 (en) 2002-04-05 2016-03-01 Mq Gaming, Llc System and method for playing an interactive game
US20050266907A1 (en) * 2002-04-05 2005-12-01 Weston Denise C Systems and methods for providing an interactive game
US8226493B2 (en) 2002-08-01 2012-07-24 Creative Kingdoms, Llc Interactive play devices for water play attractions
US20040072498A1 (en) * 2002-10-15 2004-04-15 Yeon Ku Beom System and method for controlling toy using web
US8316003B2 (en) 2002-11-05 2012-11-20 Carhamm Ltd., Llc Updating content of presentation vehicle in a computer network
US20060143202A1 (en) * 2002-11-27 2006-06-29 Parker Eric G Efficient data structure
US7519603B2 (en) * 2002-11-27 2009-04-14 Zyvex Labs, Llc Efficient data structure
US8373659B2 (en) 2003-03-25 2013-02-12 Creative Kingdoms, Llc Wirelessly-powered toy for gaming
US10369463B2 (en) 2003-03-25 2019-08-06 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US10022624B2 (en) 2003-03-25 2018-07-17 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US9770652B2 (en) 2003-03-25 2017-09-26 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US11052309B2 (en) 2003-03-25 2021-07-06 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US9707478B2 (en) 2003-03-25 2017-07-18 Mq Gaming, Llc Motion-sensitive controller and associated gaming applications
US10583357B2 (en) 2003-03-25 2020-03-10 Mq Gaming, Llc Interactive gaming toy
US9446319B2 (en) 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
US8961312B2 (en) 2003-03-25 2015-02-24 Creative Kingdoms, Llc Motion-sensitive controller and associated gaming applications
US9039533B2 (en) 2003-03-25 2015-05-26 Creative Kingdoms, Llc Wireless interactive game having both physical and virtual elements
US9393500B2 (en) 2003-03-25 2016-07-19 Mq Gaming, Llc Wireless interactive game having both physical and virtual elements
US9993724B2 (en) 2003-03-25 2018-06-12 Mq Gaming, Llc Interactive gaming toy
US20090054155A1 (en) * 2003-07-02 2009-02-26 Ganz Interactive action figures for gaming systems
US20050059483A1 (en) * 2003-07-02 2005-03-17 Borge Michael D. Interactive action figures for gaming schemes
US9132344B2 (en) 2003-07-02 2015-09-15 Ganz Interactive action figures for gaming system
US20090053970A1 (en) * 2003-07-02 2009-02-26 Ganz Interactive action figures for gaming schemes
US8585497B2 (en) 2003-07-02 2013-11-19 Ganz Interactive action figures for gaming systems
US9427658B2 (en) 2003-07-02 2016-08-30 Ganz Interactive action figures for gaming systems
US10112114B2 (en) 2003-07-02 2018-10-30 Ganz Interactive action figures for gaming systems
US8636588B2 (en) 2003-07-02 2014-01-28 Ganz Interactive action figures for gaming systems
US20100151940A1 (en) * 2003-07-02 2010-06-17 Ganz Interactive action figures for gaming systems
US8734242B2 (en) 2003-07-02 2014-05-27 Ganz Interactive action figures for gaming systems
US7862428B2 (en) 2003-07-02 2011-01-04 Ganz Interactive action figures for gaming systems
US8170912B2 (en) 2003-11-25 2012-05-01 Carhamm Ltd., Llc Database structure and front end
US8500511B2 (en) 2003-12-31 2013-08-06 Ganz System and method for toy adoption and marketing
US20050177428A1 (en) * 2003-12-31 2005-08-11 Ganz System and method for toy adoption and marketing
US11443339B2 (en) 2003-12-31 2022-09-13 Ganz System and method for toy adoption and marketing
US10657551B2 (en) 2003-12-31 2020-05-19 Ganz System and method for toy adoption and marketing
US8002605B2 (en) 2003-12-31 2011-08-23 Ganz System and method for toy adoption and marketing
US7789726B2 (en) 2003-12-31 2010-09-07 Ganz System and method for toy adoption and marketing
US20110190047A1 (en) * 2003-12-31 2011-08-04 Ganz System and method for toy adoption and marketing
US8777687B2 (en) 2003-12-31 2014-07-15 Ganz System and method for toy adoption and marketing
US8641471B2 (en) 2003-12-31 2014-02-04 Ganz System and method for toy adoption and marketing
US8808053B2 (en) 2003-12-31 2014-08-19 Ganz System and method for toy adoption and marketing
US8814624B2 (en) 2003-12-31 2014-08-26 Ganz System and method for toy adoption and marketing
US7677948B2 (en) 2003-12-31 2010-03-16 Ganz System and method for toy adoption and marketing
US20110184797A1 (en) * 2003-12-31 2011-07-28 Ganz System and method for toy adoption and marketing
US20110167481A1 (en) * 2003-12-31 2011-07-07 Ganz System and method for toy adoption and marketing
US8900030B2 (en) 2003-12-31 2014-12-02 Ganz System and method for toy adoption and marketing
US20050192864A1 (en) * 2003-12-31 2005-09-01 Ganz System and method for toy adoption and marketing
US20090204420A1 (en) * 2003-12-31 2009-08-13 Ganz System and method for toy adoption and marketing
US20110167267A1 (en) * 2003-12-31 2011-07-07 Ganz System and method for toy adoption and marketing
US20060100018A1 (en) * 2003-12-31 2006-05-11 Ganz System and method for toy adoption and marketing
US8549440B2 (en) 2003-12-31 2013-10-01 Ganz System and method for toy adoption and marketing
US8292688B2 (en) 2003-12-31 2012-10-23 Ganz System and method for toy adoption and marketing
US7534157B2 (en) 2003-12-31 2009-05-19 Ganz System and method for toy adoption and marketing
US20110092128A1 (en) * 2003-12-31 2011-04-21 Ganz System and method for toy adoption and marketing
US20090063282A1 (en) * 2003-12-31 2009-03-05 Ganz System and method for toy adoption and marketing
US9238171B2 (en) 2003-12-31 2016-01-19 Howard Ganz System and method for toy adoption and marketing
US9947023B2 (en) 2003-12-31 2018-04-17 Ganz System and method for toy adoption and marketing
US8317566B2 (en) 2003-12-31 2012-11-27 Ganz System and method for toy adoption and marketing
US8465338B2 (en) 2003-12-31 2013-06-18 Ganz System and method for toy adoption and marketing
US20080009350A1 (en) * 2003-12-31 2008-01-10 Ganz System and method for toy adoption marketing
US8460052B2 (en) 2003-12-31 2013-06-11 Ganz System and method for toy adoption and marketing
US8408963B2 (en) 2003-12-31 2013-04-02 Ganz System and method for toy adoption and marketing
US20110167485A1 (en) * 2003-12-31 2011-07-07 Ganz System and method for toy adoption and marketing
US20080009351A1 (en) * 2003-12-31 2008-01-10 Ganz System and method for toy adoption marketing
US7967657B2 (en) 2003-12-31 2011-06-28 Ganz System and method for toy adoption and marketing
US20110161093A1 (en) * 2003-12-31 2011-06-30 Ganz System and method for toy adoption and marketing
US7465212B2 (en) 2003-12-31 2008-12-16 Ganz System and method for toy adoption and marketing
US20080040230A1 (en) * 2003-12-31 2008-02-14 Ganz System and method for toy adoption marketing
US7425169B2 (en) * 2003-12-31 2008-09-16 Ganz System and method for toy adoption marketing
US9721269B2 (en) 2003-12-31 2017-08-01 Ganz System and method for toy adoption and marketing
US20080109313A1 (en) * 2003-12-31 2008-05-08 Ganz System and method for toy adoption and marketing
US9610513B2 (en) 2003-12-31 2017-04-04 Ganz System and method for toy adoption and marketing
US20080134099A1 (en) * 2003-12-31 2008-06-05 Ganz System and method for toy adoption and marketing
US8255413B2 (en) 2004-08-19 2012-08-28 Carhamm Ltd., Llc Method and apparatus for responding to request for information-personalization
US9675878B2 (en) 2004-09-29 2017-06-13 Mq Gaming, Llc System and method for playing a virtual game by sensing physical movements
US8078602B2 (en) 2004-12-17 2011-12-13 Claria Innovations, Llc Search engine for a computer network
US9495446B2 (en) 2004-12-20 2016-11-15 Gula Consulting Limited Liability Company Method and device for publishing cross-network user behavioral data
US8645941B2 (en) 2005-03-07 2014-02-04 Carhamm Ltd., Llc Method for attributing and allocating revenue related to embedded software
US8073866B2 (en) 2005-03-17 2011-12-06 Claria Innovations, Llc Method for providing content to an internet user based on the user's demonstrated content preferences
US11455680B2 (en) 2005-03-30 2022-09-27 Ebay Inc. Methods and systems to process a selection of a browser back button
US11455679B2 (en) 2005-03-30 2022-09-27 Ebay Inc. Methods and systems to browse data items
US11461835B2 (en) * 2005-03-30 2022-10-04 Ebay Inc. Method and system to dynamically browse data items
US8086697B2 (en) 2005-06-28 2011-12-27 Claria Innovations, Llc Techniques for displaying impressions in documents delivered over a computer network
US8205158B2 (en) 2006-12-06 2012-06-19 Ganz Feature codes and bonuses in virtual worlds
US20080183678A1 (en) * 2006-12-29 2008-07-31 Denise Chapman Weston Systems and methods for personalizing responses to user requests
US8620952B2 (en) 2007-01-03 2013-12-31 Carhamm Ltd., Llc System for database reporting
US20080290987A1 (en) * 2007-04-22 2008-11-27 Lehmann Li Methods and apparatus related to content sharing between devices
US20100257157A1 (en) * 2007-08-13 2010-10-07 Yuusuke Tomita Communication device, communication analysis method, and communication analysis program
US20090204908A1 (en) * 2008-02-11 2009-08-13 Ganz Friends list management
US8464166B2 (en) 2008-02-11 2013-06-11 Ganz Friends list management
US8042061B1 (en) 2008-02-18 2011-10-18 United Services Automobile Association Method and system for interface presentation
US7827072B1 (en) 2008-02-18 2010-11-02 United Services Automobile Association (Usaa) Method and system for interface presentation
US9659011B1 (en) 2008-02-18 2017-05-23 United Services Automobile Association (Usaa) Method and system for interface presentation
US8316097B2 (en) 2008-03-04 2012-11-20 Ganz Multiple-layer chat filter system and method
US20090228557A1 (en) * 2008-03-04 2009-09-10 Ganz, An Ontario Partnership Consisting Of 2121200 Ontario Inc. And 2121812 Ontario Inc. Multiple-layer chat filter system and method
US20110113112A1 (en) * 2008-03-04 2011-05-12 Ganz Multiple-layer chat filter system and method
US8321513B2 (en) 2008-03-04 2012-11-27 Ganz Multiple-layer chat filter system and method
US20090248544A1 (en) * 2008-04-01 2009-10-01 Ganz, an Ontario partnership consisting of 212100 Ontario Inc. and 2121812 Ontario Inc. Reverse product purchase in a virtual environment
US20120246585A9 (en) * 2008-07-14 2012-09-27 Microsoft Corporation System for editing an avatar
WO2010093995A1 (en) * 2009-02-13 2010-08-19 Social Gaming Network Apparatuses, methods and systems for an interworld feedback platform bridge
US20110078030A1 (en) * 2009-09-29 2011-03-31 Ganz Website with activities triggered by clickable ads
US8380725B2 (en) 2010-08-03 2013-02-19 Ganz Message filter with replacement text
US20130344770A1 (en) * 2011-12-07 2013-12-26 Ubooly Inc. Interactive toy
US9443515B1 (en) * 2012-09-05 2016-09-13 Paul G. Boyce Personality designer system for a detachably attachable remote audio object
US9446316B2 (en) * 2012-12-11 2016-09-20 Activision Publishing, Inc. Interactive video game system comprising toys with rewritable memories
US10043412B2 (en) 2013-05-26 2018-08-07 Dean Joseph Lore System for promoting travel education
US10037084B2 (en) * 2014-07-31 2018-07-31 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US11150738B2 (en) 2014-07-31 2021-10-19 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US20200042098A1 (en) * 2014-07-31 2020-02-06 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US20160034042A1 (en) * 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US10725556B2 (en) 2014-07-31 2020-07-28 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US10452152B2 (en) 2014-07-31 2019-10-22 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US20190027132A1 (en) * 2016-03-31 2019-01-24 Shenzhen Kuang-Chi Hezhong Technology Ltd. Cloud-based device and operating method therefor
USD894948S1 (en) * 2019-09-05 2020-09-01 Canopy Growth Corporation Display screen or portion thereof with transitional graphical user interface
US11389735B2 (en) 2019-10-23 2022-07-19 Ganz Virtual pet system
US11872498B2 (en) 2019-10-23 2024-01-16 Ganz Virtual pet system
US11358059B2 (en) 2020-05-27 2022-06-14 Ganz Live toy system
US20230018066A1 (en) * 2020-11-20 2023-01-19 Aurora World Corporation Apparatus and system for growth type smart toy

Also Published As

Publication number Publication date
AU2216201A (en) 2001-07-16
WO2001050342A1 (en) 2001-07-12

Similar Documents

Publication Publication Date Title
US20020068500A1 (en) Adaptive toy system and functionality
US20240054118A1 (en) Artificial intelligence platform with improved conversational ability and personality development
Cohen et al. Voice user interface design
Ball et al. Emotion and personality in a conversational agent
Batish Voicebot and Chatbot Design: Flexible Conversational Interfaces with Amazon Alexa, Google Home, and Facebook Messenger
WO2000031613A1 (en) Script development systems and methods useful therefor
JP3936749B2 (en) Interactive toys
US20030028498A1 (en) Customizable expert agent
US20020010584A1 (en) Interactive voice communication method and system for information and entertainment
US20020005865A1 (en) System, method, and device for authoring content for interactive agents
Osada et al. The scenario and design process of childcare robot, PaPeRo
Wolf Learning to act/acting to learn: Children as actors, critics, and characters in classroom theatre
WO2001069830A2 (en) Networked interactive toy system
KR20020067592A (en) User interface/entertainment device that simulates personal interaction and responds to user's mental state and/or personality
JP2000511304A (en) Agent-based instruction system and method
US8503665B1 (en) System and method of writing and using scripts in automated, speech-based caller interactions
JP2006061632A (en) Emotion data supplying apparatus, psychology analyzer, and method for psychological analysis of telephone user
JP2003114692A (en) Providing system, terminal, toy, providing method, program, and medium for sound source data
US20050288820A1 (en) Novel method to enhance the computer using and online surfing/shopping experience and methods to implement it
Coates Voice applications for Alexa and Google assistant
Ahmed Automatic generation and detection of motivational interviewing-style reflections for smoking cessation therapeutic conversations using transformer-based language models
Lea et al. Rhyme as resonance in poetry comprehension: An expert–novice study
CN114048299A (en) Dialogue method, apparatus, device, computer-readable storage medium, and program product
Longoria Designing software for the mobile context: a practitioner’s guide
JP2006308815A (en) Electronic learning system and electronic system

Legal Events

Date Code Title Description
AS Assignment

Owner name: CREATOR LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GABAI, OZ;GABAI, JACOB;SANDLERMAN, NIMROD;REEL/FRAME:011341/0059

Effective date: 20001126

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION