WO2009017764A1 - Interactive educational tool


Info

Publication number
WO2009017764A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject matter
region
computer system
user
text
Prior art date
Application number
PCT/US2008/009216
Other languages
French (fr)
Inventor
Victoria Ann Tucci
Original Assignee
Victoria Ann Tucci
Priority date
Filing date
Publication date
Priority claimed from US11/900,953 (US20090075247A1)
Application filed by Victoria Ann Tucci
Publication of WO2009017764A1


Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 7/00 - Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B 7/02 - Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student

Definitions

  • In conventional solutions, questions presented in the text are answered by uploading answers that were composed using separate, off-line word-processing software. Additionally, students are required to navigate through complex menu structures to view a list of content uploaded by the professor, thereby diverting the student's attention and detracting from the student's efforts to learn the material.
  • Embodiments are directed to graphical user interfaces, systems and methods of implementing an interactive educational tool. More specifically, embodiments provide a graphical user interface (GUI) for presenting text and related content (e.g., videos, still images, etc.) associated with a predetermined subject matter. Users may interact with various graphical elements of the GUI for selecting and viewing portions of the subject matter (e.g., facts, rules, glossary terms/definitions, etc. associated with the text and/or content), viewing additional content (e.g., handouts, worksheets, etc.) related to the subject matter, displaying interactive electronic flashcards for studying the subject matter, and entering/reviewing notes taken related to the subject matter. Users may also enter answers to questions and compare the entered answer with the correct answer (e.g., displayed in another region or area of the GUI). As such, embodiments enable users to more conveniently, effectively, and efficiently learn subject matter presented using the GUI.
  • Embodiments also enable students to initiate a session with coaches or professors for enabling the coach or professor to observe and evaluate actions of the student related to the predetermined subject matter. For example, where the student is learning to perform a task presented using the GUI, the professor may observe the student performing the task and evaluate the student based upon the performance using a remote observation and evaluation interface.
  • a scheduling interface may be presented to the user for scheduling the session with the professor.
  • a timing interface may be presented to the professor for enabling the professor to time the student and/or automatically bill the student based upon the duration of the session.
  • a graphical user interface for implementing an interactive educational tool includes a first region for displaying text, wherein the text comprises educational information associated with a predetermined subject matter.
  • the graphical user interface also includes a second region for presenting media simultaneously with the display of the text, wherein the media is related to the text and comprises educational information associated with the predetermined subject matter.
  • Data for generating the text and the media is stored on a first computer system and accessed by a second computer system presenting the text and the media, wherein the first computer system is located remotely from the second computer system.
  • the media may be selected from a group consisting of video and still images, and wherein the media is for visually depicting the predetermined subject matter associated with the text.
  • the text may comprise a question for testing a user
  • the graphical user interface may further include a third region for accepting a user-input response to the question and a fourth region for selectively displaying an answer to the question and for enabling comparison of the user-input response and the answer.
  • a method of implementing an interactive educational tool includes accessing data from a first computer system.
  • the method also includes displaying text comprising educational information associated with a predetermined subject matter, wherein the text is generated from the data and displayed on a second computer system located remotely from the first computer system.
  • Media is also presented related to the text and comprising educational information associated with the predetermined subject matter, wherein the media is generated from the data and presented on the second computer system, and wherein the text is displayed simultaneously with the presentation of the media.
  • the media may be selected from a group consisting of video and still images, and wherein the media is for visually depicting the predetermined subject matter associated with the text.
  • the method may also include, in response to a user interaction with the second computer system, initiating a communication channel with a third computer system for enabling a user of the third computer system to observe and evaluate in real-time a performance of a user of the second computer system, wherein the performance is associated with the predetermined subject matter, and wherein the third computer system is located remotely from the second computer system.
  • In yet another embodiment, a system includes a first computer system for storing data used to generate educational information.
  • the system also includes a second computer system communicatively coupled to the first computer system, where the second computer system is for accessing the data and generating a graphical user interface using the data.
  • the graphical user interface includes a first region for displaying text, wherein the text includes educational information associated with a predetermined subject matter.
  • the graphical user interface also includes a second region for presenting media simultaneously with the display of the text, wherein the media is related to the text and comprises educational information associated with the predetermined subject matter.
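  • As an illustrative sketch (not part of the disclosure) of the arrangement described above, the lesson data stored on the first computer system could be fetched by the second computer system and rendered into separate text and media regions. In the TypeScript below, the record fields, the URL, and the element identifiers are hypothetical assumptions made only for illustration.
```typescript
// Hypothetical sketch: a lesson record served by a remote "first computer system"
// and rendered by a "second computer system" into two simultaneous regions.
interface LessonData {
  subjectMatter: string;   // the predetermined subject matter (e.g., a lesson topic)
  text: string;            // educational text for the first region
  mediaUrl: string;        // video or still image for the second region
}

// Fetch the data from the remote server (URL is illustrative only).
async function loadLesson(lessonId: string): Promise<LessonData> {
  const response = await fetch(`https://example.com/lessons/${lessonId}`);
  return (await response.json()) as LessonData;
}

// Render the text and the related media at the same time, in separate regions.
function renderLesson(lesson: LessonData): void {
  const textRegion = document.getElementById("region-text")!;
  const mediaRegion = document.getElementById("region-media") as HTMLVideoElement;
  textRegion.textContent = lesson.text;   // first region: text
  mediaRegion.src = lesson.mediaUrl;      // second region: related media
}
```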
  • FIG. 1 shows an exemplary system for presenting a graphical user interface (GUI) in accordance with one embodiment of the present invention.
  • Figure 2 shows an exemplary computer system platform upon which embodiments of the present invention may be implemented.
  • Figure 3 shows an exemplary on-screen GUI for implementing an exemplary interactive educational tool in accordance with one embodiment of the present invention.
  • Figure 4 shows display of exemplary information in response to interaction with a GUI in accordance with one embodiment of the present invention.
  • Figure 5 shows an exemplary user prompt for encouraging interaction with a GUI and viewing of information in accordance with one embodiment of the present invention.
  • Figure 6 shows an exemplary user prompt for encouraging interaction with a GUI and viewing of electronic flashcards in accordance with one embodiment of the present invention.
  • Figure 7 shows an exemplary GUI comprising a list of terms with respective definitions in accordance with one embodiment of the present invention.
  • Figure 8A shows an exemplary GUI for enabling users to enter and save notes in accordance with one embodiment of the present invention.
  • Figure 8B shows several exemplary notes in accordance with one embodiment of the present invention.
  • Figure 9A shows an exemplary GUI for entering answers to questions in accordance with one embodiment of the present invention.
  • Figure 9B shows an exemplary GUI for comparing a user-input response with a predetermined answer in accordance with one embodiment of the present invention.
  • Figure 10 shows an exemplary GUI for displaying saved answers in accordance with one embodiment of the present invention.
  • Figure 11 shows an exemplary GUI for selecting a unit or task in accordance with one embodiment of the present invention.
  • Figure 12 shows an exemplary GUI for tracking student progress in accordance with one embodiment of the present invention.
  • Figure 13 shows an exemplary computer-implemented process for implementing an interactive education tool in accordance with one embodiment of the present invention.
  • Figure 14 shows an exemplary computer-implemented process for presenting questions and accepting user-input responses to the question in accordance with one embodiment of the present invention.
  • Figure 15 shows an exemplary GUI for listing coaches or professors in accordance with one embodiment of the present invention.
  • Figure 16 shows an exemplary GUI for enabling a student to request a session with a coach or professor in accordance with one embodiment of the present invention.
  • Figure 17 shows an exemplary GUI for enabling a coach or professor to view students and set up a session with a student in accordance with one embodiment of the present invention.
  • Figure 18A shows an exemplary GUI for initiating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention.
  • Figure 18B shows an exemplary GUI for terminating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention.
  • Figure 19 shows an exemplary computer-implemented process for initiating a GUI for enabling observation and evaluation of a user in accordance with one embodiment of the present invention.
  • Figure 20A shows an exemplary GUI for implementing electronic flashcards in accordance with one embodiment of the present invention.
  • Figure 20B shows an exemplary flipping of an electronic flashcard in accordance with one embodiment of the present invention.
  • Figure 20C shows an exemplary swapping of front and back sides of an electronic flashcard in accordance with one embodiment of the present invention.
  • Figure 20D shows an exemplary flipping of an electronic flashcard with front and back sides swapped in accordance with one embodiment of the present invention.
  • Figure 20E shows an exemplary stack of electronic flashcards which have reached a predetermined limit on the number of electronic flashcards in the stack in accordance with one embodiment of the present invention.
  • Figure 21 shows an exemplary computer-implemented process for implementing electronic flashcards in accordance with one embodiment of the present invention.
  • FIG. 1 shows exemplary system 100 for presenting a graphical user interface (GUI) in accordance with one embodiment of the present invention.
  • computer systems 110a-110c are communicatively coupled by interface 120.
  • Interface 120 may comprise the internet, a network, or some other device/component for communicatively coupling computer systems 110a-110c.
  • computer system 110a may present GUI 130
  • computer system 110c may present GUI 150.
  • Data 140 may be accessed from computer system 110b (e.g., via interface 120) for generating GUI 130 and/or GUI 150, where computer system 110b may comprise a remote server in one embodiment.
  • data for generating GUI 130 and/or GUI 150 may be accessed locally from a respective computer system (e.g., 110a, 110c, etc.), remotely from a computer system other than computer system 110b, etc.
  • GUI 130 may comprise an interactive educational tool (e.g., as discussed with respect to Figures 3-14 below) for enabling users to interact with content associated with a predetermined subject matter (e.g., a topic or subject of a lesson plan selected by a user).
  • the content may comprise media (e.g., video, still images, sound, etc.) which is simultaneously displayed with text or other information associated with the predetermined subject matter.
  • GUI 130 and/or the content presented using GUI 130 may be generated from data (e.g., 140) accessed from a remote computer system (e.g., 110b).
  • the interactive educational tool implemented using GUI 130 may comprise an online interactive educational tool (e.g., presented using a web browser of computer system 110a).
  • remote access of information may enable GUI 130 to present additional and/or different content compared to conventional solutions, while presentation of multiple forms of content using GUI 130 may enable users to more conveniently, effectively, and efficiently learn subject matter presented using GUI 130.
  • system 100 may implement a remote observation and evaluation interface (e.g., as discussed with respect to Figures 15-20 below) for observing and evaluating (e.g., in real-time) a student's performance when performing an action or task associated with the predetermined subject matter (e.g., presented using the interactive educational tool implemented by GUI 130 as discussed above).
  • Computer system 110c may present video and/or audio information (e.g., using GUI 150) of the student's performance accessed or captured using at least one interface device (e.g., a camera, microphone, etc.) coupled to computer system 110a.
  • the video and/or audio information may be communicated via a communication channel (e.g., implemented using interface 120 and/or other networking components) formed between computer systems 110a and 110c.
  • GUI 130 may comprise an interface for enabling information on a selected side of an electronic flashcard (e.g., SAFMEDS card) to be displayed and hidden.
  • the information on a first side of an electronic flashcard may be displayed while the information on the second side is hidden, thereby enabling a user to test himself or herself before revealing the information on the second side of the electronic flashcard (e.g., comprising a definition of a term on the first side, additional information about the information on the first side, etc.).
  • GUI 130 may also enable placement or storage of inactive electronic flashcards (e.g., which are not currently being viewed or used) in multiple decks or piles (e.g., based upon user-confidence level with the subject matter of the flashcards, differences in the subject matter of the flashcards, etc.) to improve learning of the material (e.g., by enabling users to focus study efforts on more troublesome material of flashcards placed in a given pile). Further, GUI 130 may enable automated shuffling of the electronic flashcards, thereby providing more randomized and improved shuffling over manual shuffling of conventional flashcards.
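  • One possible way to realize the multiple piles and automated shuffling described above is sketched below in TypeScript; the pile names and card fields are hypothetical, and the Fisher-Yates shuffle is merely one suitable randomization technique, not the one prescribed by the disclosure.
```typescript
// Hypothetical sketch: inactive electronic flashcards kept in multiple piles,
// plus an automated (Fisher-Yates) shuffle of a pile.
interface Flashcard {
  front: string;   // e.g., a term
  back: string;    // e.g., its definition or additional information
}

type Pile = Flashcard[];

// Piles keyed by the user's confidence level with the material (names are illustrative).
const piles: Record<"needsWork" | "almostKnown" | "mastered", Pile> = {
  needsWork: [],
  almostKnown: [],
  mastered: [],
};

// Move a card into the pile matching the user's confidence level.
function fileCard(card: Flashcard, confidence: keyof typeof piles): void {
  piles[confidence].push(card);
}

// Automated shuffle of a pile (more randomized than manual shuffling).
function shuffle(pile: Pile): Pile {
  const shuffled = [...pile];
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  return shuffled;
}
```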
  • Although Figure 1 shows three computer systems (e.g., 110a-110c) coupled via interface 120, it should be appreciated that a larger or smaller number of computer systems may be coupled via interface 120 in other embodiments. Additionally, it should be appreciated that interface 120 may comprise more than one component in other embodiments.
  • Figure 2 shows exemplary computer system platform 200 upon which embodiments of the present invention may be implemented.
  • portions of the present invention are comprised of computer-readable and computer-executable instructions that reside, for example, in computer system platform 200 and which may be used as a part of a general purpose computer network (not shown).
  • computer system platform 200 of Figure 2 is merely exemplary.
  • the present invention can operate within a number of different systems including, but not limited to, general-purpose computer systems, embedded computer systems, laptop computer systems, hand-held computer systems, portable computer systems, stand-alone computer systems, or game consoles.
  • computer system platform 200 may comprise at least one processor 210 and at least one memory 220.
  • Processor 210 may comprise a central processing unit (CPU) or other type of processor.
  • memory 220 may comprise volatile memory (e.g., RAM), non-volatile memory (e.g., ROM, flash memory, etc.), or some combination of the two. Additionally, memory 220 may be removable, non-removable, etc.
  • computer system platform 200 may comprise additional storage (e.g., removable storage 240, non-removable storage 245, etc.).
  • Removable storage 240 and/or non-removable storage 245 may comprise volatile memory, nonvolatile memory, or any combination thereof.
  • removable storage 240 and/or non-removable storage 245 may comprise CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information for access by computer system platform 200.
  • computer system platform 200 may communicate with other systems, components, or devices via communication interface 270.
  • Communication interface 270 may embody computer readable instructions, data structures, program modules or other data in a modulated data signal (e.g., a carrier wave) or other transport mechanism.
  • communication interface 270 may couple to wired media (e.g., a wired network, direct-wired connection, etc.) and/or wireless media (e.g., a wireless network, a wireless connection utilizing acoustic, RF, infrared, or other wireless signaling, etc.).
  • Communication interface 270 may also couple computer system platform 200 to one or more input devices (e.g., a keyboard, mouse, pen, voice input device, touch input device, etc.) and/or output devices (e.g., a display, speaker, printer, etc.).
  • graphics processor 250 may perform graphics processing operations on graphical data stored in frame buffer 260 or another memory (e.g., 220, 240, 245, etc.) of computer system platform 200.
  • Graphical data stored in frame buffer 260 may be accessed, processed, and/or modified by components (e.g., graphics processor 250, processor 210, etc.) of computer system platform 200 and/or components of other systems/devices. Additionally, the graphical data may be accessed (e.g., by graphics processor 250) and displayed on an output device coupled to computer system platform 200.
  • computer system platform 200 may be used to implement computer system 110a, computer system 110b, computer system 110c, interface component 120, or some combination thereof.
  • communication interface 270 may communicatively couple computer system 200 to one or more other computer systems (e.g., 110a, 110b, 110c, etc. via interface 120).
  • memory 220, removable storage 240, non-removable storage 245, frame buffer 260, or a combination thereof may comprise instructions that when executed on a processor (e.g., 210, 250, etc.) perform a method of implementing an interactive educational tool (e.g., using GUI 130, GUI 150, etc.), implementing a remote observation and evaluation interface (e.g., using GUI 130, GUI 150, etc.), implementing electronic flashcards (e.g., using GUI 130, GUI 150, etc.), or some combination thereof.
  • the graphical data used to display the GUI may be accessed from frame buffer 260 and displayed on an output device coupled to computer system platform 200.
  • FIG. 3 shows exemplary on-screen GUI 300 for implementing an exemplary interactive educational tool in accordance with one embodiment of the present invention.
  • GUI 300 comprises window or region 310 for displaying text or other information about a predetermined subject matter.
  • the text or other information displayed in region 310 may be related to a lesson plan or region of a lesson plan identified in region 320.
  • Window or region 330 may present media (e.g., video, still images, etc.) related to the predetermined subject matter and/or the information presented in region 310.
  • the media presented in region 330 may also be related to audio (e.g., an audio track for a video displayed in region 330, etc.) and/or other visual information (e.g., presented using GUI 300, presented by other light emitting devices, presented by other displays, etc.).
  • GUI 300 may be used to simultaneously present various forms of information (e.g., text in region 310 and video in region 330, etc.) related to a predetermined subject matter, thereby providing an educational tool which more conveniently and effectively presents educational information to users.
  • For example, region 320 may identify a unit associated with unclogging a drain, region 310 may present information about clogged drains (e.g., the cause of most clogs, the steps to be taken to unclog the drain, etc.), and region 330 may show a video of a plumber unclogging a drain.
  • Given the repetition of information in various forms (e.g., text in region 310 which may explain the steps of unclogging a drain, the video in region 330 which may show the performance of the steps explained in region 310, etc.), the simultaneous display of this information (e.g., enabling students to quickly and conveniently move from one form of information to another), and the ability for users to interact with this information using GUI 300 as discussed below, embodiments enable users to more conveniently, effectively, and efficiently learn the predetermined subject matter presented using GUI 300.
  • region 310 may comprise interactive regions 312-316 for enabling users to interact with text and/or other information presented in region 310.
  • regions 312-316 may comprise hyperlinked text for initiating display (e.g., within regions of GUI 300, in another displayed window, etc.) of additional content (e.g., related to the predetermined subject matter).
  • Region 330 may comprise interactive regions 332 and 334 for enabling a user to interact with media presented in region 330.
  • GUI 300 may comprise graphical objects 340 for enabling users to control and/or interact with media presented in region 330. For example, a user may play, stop, alter playback (e.g., fast forward, rewind, zoom, etc.), etc. media presented in region 330 by interacting with graphical objects 340. Additionally, a user may control the volume level (e.g., of the sound accompanying the media displayed in region 330) using one or more of graphical objects 340.
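  • Assuming the media in region 330 is presented with an HTML5 video element (an assumption, since the disclosure does not specify an implementation), graphical objects 340 could be wired to playback and volume controls roughly as sketched below; the element identifier and the skip interval are illustrative only.
```typescript
// Hypothetical sketch: wiring playback/volume controls (graphical objects 340)
// to an HTML5 video element presenting the media of region 330.
const video = document.getElementById("region-330-video") as HTMLVideoElement;

function onPlay(): void { void video.play(); }
function onStop(): void { video.pause(); video.currentTime = 0; }
function onRewind(seconds = 10): void {
  video.currentTime = Math.max(0, video.currentTime - seconds);
}
function onFastForward(seconds = 10): void {
  video.currentTime += seconds;
}

// Volume control (0.0-1.0), e.g., driven by a slider among graphical objects 340.
function onVolumeChange(level: number): void {
  video.volume = Math.min(1, Math.max(0, level));
}
```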
  • interaction with graphical object 352 may initiate presentation of facts, rules, or similar information associated with the predetermined subject matter.
  • Figure 4 shows display of exemplary information 410 in response to interaction with GUI 300 (e.g., graphical object 352) in accordance with one embodiment of the present invention.
  • Information 410 may comprise an excerpt, paraphrase, summary, etc. of information presented in region 310 and/or region 330 of GUI 300.
  • embodiments enable presentation of information about the predetermined subject matter to users in yet another form, thereby improving learning and retention of the information.
  • presentation of information 410 enables users to repeat and/or study the information (e.g., comprising important points that the user may have previously overlooked) to further improve learning of the information.
  • Figure 5 shows exemplary user prompt 510 for encouraging interaction with GUI 300 and viewing of information 410 in accordance with one embodiment of the present invention.
  • Graphical object 520 may be used to close or hide user prompt 510.
  • a user may be required to interact with graphical object 352 and view information 410 before the user will be allowed to progress in the lesson and learn new/different subject matter (e.g., presented in region 310, 330, etc.). As such, embodiments reduce the likelihood that a user will overlook or miss an important point (e.g., presented using information 410).
  • user prompt 510 may be displayed periodically as a user progresses through a lesson (e.g., each time an important point is presented using GUI 300), thereby increasing interaction with GUI 300 and improving learning of the predetermined subject matter presented using GUI 300.
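  • A minimal sketch of the gating behavior described above follows; the flag and callback names are hypothetical, and the logic simply withholds lesson advancement until information 410 has been viewed, showing user prompt 510 otherwise.
```typescript
// Hypothetical sketch: requiring the user to view an important point
// (information 410) before the lesson is allowed to advance.
let importantPointViewed = false;

// Called when the user interacts with graphical object 352 and information 410 is shown.
function onImportantPointViewed(): void {
  importantPointViewed = true;
}

// Called when the user tries to advance the lesson (e.g., via graphical object 382).
function tryAdvanceLesson(advance: () => void, showPrompt: () => void): void {
  if (importantPointViewed) {
    advance();                  // proceed to new/different subject matter
    importantPointViewed = false;
  } else {
    showPrompt();               // display user prompt 510 encouraging viewing of information 410
  }
}
```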
  • interaction with graphical object 372 may initiate display of handouts, worksheets, or the like, associated with the predetermined subject matter.
  • For example, where a handout comprises a still image (e.g., a chart, table, etc.), graphical object 372 may initiate the display of the still image in a larger form (e.g., in a separate window, etc.) for easier viewing and/or enable printing of the image.
  • graphical object 372 may initiate display of a list of worksheets for a unit, task, lesson, etc., thereby enabling a user to select, view, print, etc. materials associated with the predetermined subject matter.
  • Interaction with graphical object 374 may initiate display of answer sheets associated with the predetermined subject matter. For example, where questions are presented in region 310 and/or region 330, graphical object 374 may initiate display of answers to those questions. In this manner, a user may periodically test and/or check his or her knowledge of the presented subject matter to further reinforce and reiterate the material. As such, embodiments provide further mechanisms for improving learning of the predetermined subject matter (e.g., presented using GUI 300 and/or other related GUIs).
  • Interaction with graphical object 354 may initiate presentation of a GUI which implements electronic flashcards associated with the predetermined subject matter (e.g., as discussed with respect to Figures 21A-22).
  • the electronic flashcards may enable the user to further learn and/or study information about the predetermined subject matter.
  • the electronic flashcards may comprise information 410, thereby enabling users to learn the information (e.g., presented in regions 310 and/or 330 of GUI 300) in yet another form.
  • Figure 6 shows exemplary user prompt 610 for encouraging interaction with GUI 300 and viewing of electronic flashcards in accordance with one embodiment of the present invention.
  • Graphical object 620 may be used to close or hide user prompt 610.
  • a user may be required to interact with graphical object 354 and view electronic flashcards before the user will be allowed to progress in the lesson and learn new/different subject matter (e.g., presented in region 310, 330, etc.).
  • embodiments reduce the likelihood that a user will overlook or miss an important point (e.g., presented using a GUI for implementing the electronic flashcards).
  • user prompt 610 may be displayed periodically as a user progresses through a lesson (e.g., each time an important point is presented using GUI 300), thereby increasing interaction with GUI 300 and improving learning of the predetermined subject matter presented using GUI 300.
  • FIG. 7 shows exemplary GUI 700 comprising a list of terms (e.g., in column 710) with respective definitions (e.g., in column 720) in accordance with one embodiment of the present invention.
  • At least one respective keyword for each term is listed in column 730, while at least one respective task (e.g., unit number, lesson number, etc.) is listed in column 740.
  • Information within GUI 700 may be sorted based upon column characteristic (e.g., in ascending order, descending order, etc.) by interacting with column headers in row 760 of GUI 700.
  • the information within at least one of columns 710-740 may comprise the same or similar information as information 410 of Figure 4, thereby improving learning by enabling users to quickly peruse the listing of important terms within GUI 700.
  • a term and/or its corresponding information may be selected and viewed by interacting with the graphical object in column 750 corresponding to the row comprising the selected term and/or corresponding information.
  • the term and/or its corresponding information may be displayed in a separate window from GUI 700.
  • the term and/or its corresponding information may be displayed within GUI 700 (e.g., by hiding all rows except for the row with the selected term, by graying out information in rows with non-selected terms, etc.).
  • Information within GUI 700 may be searched by interacting with graphical object 770.
  • graphical object 770 may bring up a separate GUI or window enabling a user to specify search criteria for the search.
  • the search criteria may comprise one or more terms within one or more of columns 710-740.
  • the GUI for specifying the search criteria may comprise one or more graphical objects for initiating the search and causing the search results to be displayed (e.g., within GUI 700).
  • The display of information within GUI 700 may be reset to an initial state (e.g., originally displayed in response to interaction with graphical object 356 of Figure 3), to a state preceding the search (e.g., initiated using graphical object 770), etc. Further, information within GUI 700 may be printed by interacting with graphical object 790.
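  • The column sorting and searching described for GUI 700 could be implemented along the lines of the sketch below; the row type and the simple substring-matching criterion are assumptions made for illustration only.
```typescript
// Hypothetical sketch: sorting and searching the glossary rows of GUI 700.
interface GlossaryRow {
  term: string;        // column 710
  definition: string;  // column 720
  keywords: string[];  // column 730
  task: string;        // column 740 (e.g., unit/lesson number)
}

// Sort by a column, ascending or descending (toggled by clicking a header in row 760).
function sortRows(rows: GlossaryRow[], column: "term" | "definition" | "task",
                  ascending = true): GlossaryRow[] {
  return [...rows].sort((a, b) =>
    ascending ? a[column].localeCompare(b[column]) : b[column].localeCompare(a[column]));
}

// Search across columns 710-740 using a simple substring criterion (graphical object 770).
function searchRows(rows: GlossaryRow[], criteria: string): GlossaryRow[] {
  const needle = criteria.toLowerCase();
  return rows.filter(r =>
    [r.term, r.definition, r.task, ...r.keywords].some(v => v.toLowerCase().includes(needle)));
}
```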
  • FIG. 8A shows exemplary GUI 800A for enabling users to enter and save notes in accordance with one embodiment of the present invention.
  • terms may be entered in user-modifiable field 810
  • definitions may be entered in user-modifiable field 812
  • keywords may be entered in user-modifiable field 814
  • a unit number may be entered in user-modifiable field 816
  • a task may be entered in user-modifiable field 818.
  • the entries in fields 810-818 and/or changes to the entries in fields 810-818 may be saved using graphical object 820.
  • GUI 800A comprises graphical object 830 for deleting a recalled note.
  • interaction with graphical object 842 may create a new note.
  • fields 810-818 may be cleared to enable a user to enter new information.
  • subsequent interaction with graphical object 820 may save changes to the new note.
  • subsequent interaction with graphical object 830 may delete the new note.
  • information within GUI 800A may be printed by interacting with graphical object 848.
  • Interaction with graphical object 846 may display a list of saved notes within GUI 800B of Figure 8B.
  • Figure 8B shows several exemplary notes in accordance with one embodiment of the present invention, where each note comprises a term (e.g., within column 860), a respective definition (e.g., within column 862), at least one respective keyword (e.g., within column 864), a respective unit (e.g., within column 866), a respective task (e.g., within column 868), or some combination thereof.
  • Information within GUI 800B may be sorted based upon column characteristic (e.g., in ascending order, descending order, etc.) by interacting with column headers in row 880 of GUI 800B.
  • a saved note may be selected and viewed by interacting with the graphical object in column 870 corresponding to the row comprising the selected note.
  • Information within GUI 800B may be searched by interacting with graphical object 844.
  • graphical object 844 may bring up a separate GUI or window enabling a user to specify search criteria for the search.
  • the search criteria may comprise one or more terms within one or more of columns 860-868.
  • the GUI for specifying the search criteria may comprise one or more graphical objects for initiating the search and causing the search results to be displayed (e.g., within GUI 800B).
  • The display of information within GUI 800B may be reset to the state preceding the search (e.g., initiated using graphical object 844). Further, information within GUI 800B may be printed by interacting with graphical object 848.
  • interaction with graphical objects 850 may navigate or cycle through saved notes (e.g., all saved notes, a subset of saved notes determined by a search, etc.).
  • Graphical object 852 may indicate a current note for which information is displayed (e.g., in GUI 800A). Additionally, graphical object 854 may indicate a number of saved notes through which graphical objects 850 may be used to navigate.
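  • A simple sketch of saving notes (fields 810-818) and cycling through them with graphical objects 850, while tracking the current note index (852) and the total count (854), is shown below; the field names and wrap-around navigation are illustrative assumptions.
```typescript
// Hypothetical sketch: saving notes and navigating through the saved notes.
interface Note {
  term: string;        // field 810
  definition: string;  // field 812
  keywords: string;    // field 814
  unit: string;        // field 816
  task: string;        // field 818
}

const savedNotes: Note[] = [];
let currentIndex = 0;            // indicated by graphical object 852

function saveNote(note: Note): void {        // graphical object 820
  savedNotes.push(note);
  currentIndex = savedNotes.length - 1;
}

function nextNote(): Note | undefined {      // graphical objects 850
  if (savedNotes.length === 0) return undefined;
  currentIndex = (currentIndex + 1) % savedNotes.length;
  return savedNotes[currentIndex];
}

function previousNote(): Note | undefined {  // graphical objects 850
  if (savedNotes.length === 0) return undefined;
  currentIndex = (currentIndex - 1 + savedNotes.length) % savedNotes.length;
  return savedNotes[currentIndex];
}
```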
  • FIG. 9A shows exemplary GUI 900 for entering answers to questions in accordance with one embodiment of the present invention.
  • a question may be selected by interacting with graphical objects 910.
  • the selected question may be displayed in region 920.
  • user-modifiable field 930 may accept a user-input response to the question.
  • Figure 9B shows exemplary GUI 900 for comparing a user-input response with a predetermined answer in accordance with one embodiment of the present invention.
  • interaction with graphical object 940 may initiate display of a predetermined answer (e.g., in region 950 of GUI 900) to the question (e.g., selected using graphical objects 910 and displayed in region 920), where the predetermined answer may remain hidden until graphical object 940 is activated.
  • the predetermined answer may remain visible in region 950 for a predetermined period of time until it automatically returns to the hidden state.
  • the predetermined answer may remain visible until a user input (e.g., via graphical object 940, another graphical object of GUI 900, etc.) is detected requesting that the predetermined answer be hidden. Accordingly, the user-input response entered in field 930 may be compared with the predetermined answer, thereby improving learning by encouraging the user to review the predetermined subject matter, find the correct answer, think about why the user- input response may not match the predetermined answer, etc.
  • Interaction with graphical object 960 may save the user-input response in field 930.
  • graphical object 960 may indicate a final answer to the question presented in region 920.
  • interaction with graphical object 970 may initiate display of a GUI which displays saved answers.
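  • The selective reveal of the predetermined answer in region 950, with an automatic return to the hidden state after a predetermined period or upon a user request, could be sketched as follows; the 15-second duration and the element handling are assumptions for illustration only.
```typescript
// Hypothetical sketch: revealing the predetermined answer (region 950) when
// graphical object 940 is activated, then automatically hiding it again after
// a predetermined period (duration is illustrative).
const AUTO_HIDE_MS = 15_000;
let hideTimer: ReturnType<typeof setTimeout> | undefined;

function revealAnswer(answerRegion: HTMLElement, answerText: string): void {
  answerRegion.textContent = answerText;
  answerRegion.hidden = false;
  if (hideTimer !== undefined) clearTimeout(hideTimer);
  hideTimer = setTimeout(() => hideAnswer(answerRegion), AUTO_HIDE_MS);
}

// Also callable directly in response to a user input requesting the answer be hidden.
function hideAnswer(answerRegion: HTMLElement): void {
  answerRegion.hidden = true;
  if (hideTimer !== undefined) clearTimeout(hideTimer);
}
```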
  • Figure 10 shows exemplary GUI 1000 for displaying saved answers in accordance with one embodiment of the present invention.
  • the answers displayed in GUI 1000 may be those entered into field 930 of GUI 900 and saved by interacting with graphical object 970 of GUI 900.
  • column 1010 may comprise a task number related to the answer
  • column 1020 may comprise the question (e.g., displayed in region 920 of GUI 900)
  • column 1030 may comprise a user-input response (e.g., entered into field 930 of GUI 900) to the question
  • column 1040 may comprise a predetermined answer (e.g., displayed in region 950 of GUI 900 as shown in Figure 9B) to the question.
  • users can easily and quickly review their responses and compare them to the correct or predetermined answers.
  • interaction with graphical object 1050 may initiate printing of the information in GUI 1000 (e.g., to enable users to study and review a hardcopy of the information).
  • The user-input responses (e.g., shown in column 1030 of GUI 1000) may be accessed by another person and/or computer system for analyzing the student's performance, determining whether a student is taking the lesson seriously, analyzing the latency of the student's responses, performing error analysis, or the like.
  • Information entered using GUI 900 and/or GUI 1000 may be stored locally on computer system 110a for subsequent analysis (e.g., automatically by a program run on computer system 110a, by a professor or other person using computer system 110a after the user, etc.), or exported to another computer system for remote analysis (e.g., automatically by a program run on computer system 110b and/or 110c, by a professor or other person using computer system 110b and/or 110c, etc.).
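  • To support the latency and error analysis mentioned above, each saved response could carry timestamps and be exported for remote analysis, as in the sketch below; the record fields and the endpoint URL are hypothetical and not part of the disclosure.
```typescript
// Hypothetical sketch: recording each saved user-input response with timestamps
// so that response latency and errors can later be analyzed locally or remotely.
interface SavedResponse {
  task: string;            // column 1010
  question: string;        // column 1020
  userResponse: string;    // column 1030
  correctAnswer: string;   // column 1040
  presentedAt: number;     // when the question was displayed (ms since epoch)
  answeredAt: number;      // when the response was saved
}

function responseLatencyMs(r: SavedResponse): number {
  return r.answeredAt - r.presentedAt;
}

// Export the saved responses to another computer system for remote analysis
// (endpoint is illustrative only).
async function exportResponses(responses: SavedResponse[]): Promise<void> {
  await fetch("https://example.com/analysis/responses", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(responses),
  });
}
```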
  • interaction with graphical object 382 may advance the lesson (e.g., advance to a new task, new unit, etc.) and present new information (e.g., using GUI 300, another GUI, etc.) related to the predetermined subject matter.
  • Interaction with graphical object 384 may return the user to previously-accessed subject matter.
  • interaction with graphical object 386 may initiate display of a GUI for selecting a unit or task (e.g., to present using GUI 300).
  • Figure 11 shows exemplary GUI 1100 for selecting a unit or task in accordance with one embodiment of the present invention.
  • column 1110 lists units
  • column 1120 lists the last task accessed (e.g., using GUI 300 of Figure 3)
  • column 1130 provides an exemplary description of each unit.
  • some or all of the information in columns 1110-1130 may be interactive such that users may initiate display of information (e.g., using GUI 300 of Figure 3) related to the information interacted with. For example, if a user interacts with "Unit 4" in column 1110, then GUI 300 may be displayed to present information about the fourth unit. As another example, if a user interacts with "4.01a1" in column 1120, then GUI 300 may be displayed to present information about task 4.01a1.
  • GUI 1100 may also comprise graphical object 1140 for initiating display of a GUI for entering or changing account information (e.g., username, password, personal details, etc.). Interaction with graphical object 1150 may initiate display of a GUI for enabling observation and/or evaluation of a student by a professor (e.g., as discussed below with respect to Figures 15-20).
  • interaction with graphical object 388 may initiate display of a GUI for tracking student progress.
  • Figure 12 shows exemplary GUI 1200 for tracking student progress in accordance with one embodiment of the present invention.
  • interaction with graphical objects 1210 may initiate display (e.g., within a region of GUI 1200, within a separate window or GUI, etc.) of information (e.g., amount of the unit or lesson completed, completion date if completed, etc.) about the student's progress through the lesson. For example, if a user positions an on-screen cursor over the number "5" of graphical objects 1210, then information about the student's progress through unit 5 (e.g., a completion date for unit 5) may be displayed.
  • region 1220 of GUI 1200 may comprise unit numbers and descriptions of each unit.
  • the descriptions of the units in region 1220 may match the descriptions in column 1130 of Figure 11.
  • the information in region 1220 may comprise a summary or title for each unit.
  • Region 1230 comprises headings and subheadings for grouping the listing of units (e.g., represented by graphical objects 1210 and the numbers within region 1220).
  • heading 1240 has four subheadings 1250-1280.
  • Subheading 1250 comprises unit 13, unit 1, and unit 4.
  • GUI 1200 provides a listing of units as well as grouping and/or characterization of the units (e.g., represented by headings and/or subheadings within column 1230) to provide additional information (e.g., contextual information for a selected unit with respect to other units) about the predetermined subject matter.
  • FIG 13 shows exemplary computer-implemented process 1300 for implementing an interactive education tool in accordance with one embodiment of the present invention.
  • step 1310 involves accessing data (e.g., 140 of Figure 1) from a first computer system (e.g., 110b of Figure 1).
  • the first computer system may be communicatively coupled to an interface (e.g., 120 of Figure 1), and the data (e.g., 140) may be accessed using that interface (e.g., 120).
  • Step 1320 involves displaying text comprising educational information associated with a predetermined subject matter.
  • the text may be generated from the data (e.g., 140) accessed in step 1310. Additionally, the text may be displayed (e.g., using GUI 130 of Figure 1, in region 310 of GUI 300 of Figure 3, etc.) on a second computer system (e.g., 110a of Figure 1) located remotely from the first computer system.
  • step 1330 involves presenting media related to the text (e.g., displayed in step 1320) and comprising educational information associated with the predetermined subject matter.
  • the media may be generated from the data (e.g., 140) accessed in step 1310. Additionally, the media may be presented (e.g., using GUI 130 of Figure 1, in region 330 of GUI 300 of Figure 3, etc.) on a second computer system (e.g., 110a of Figure 1) located remotely from the first computer system. Further, the media may be presented simultaneously with the display of the text in step 1320 in one embodiment.
  • the media may comprise video, still images, sound, etc. Additionally, the media may visually depict the predetermined subject matter associated with the text. For example, where the predetermined subject matter is plumbing related to drain unclogging, the text (e.g., displayed in step 1320) may explain how to unclog a drain and the media (e.g., presented in step 1330) may show a plumber unclogging a drain.
  • step 1340 involves presenting other information associated with the predetermined subject matter.
  • the additional information presented in step 1340 may comprise a listing of answers to questions associated with the predetermined subject matter, user-input responses to questions associated with the predetermined subject matter, user-input notes associated with the predetermined subject matter, a glossary of terms used in the text, and educational worksheets associated with the predetermined subject matter.
  • FIG 14 shows exemplary computer-implemented process 1400 for presenting questions and accepting user-input responses to the question in accordance with one embodiment of the present invention.
  • step 1410 involves displaying a question associated with a predetermined subject matter.
  • the question may be displayed (e.g., using GUI 130, in region 920 of GUI 900, etc.) using data (e.g., 140) accessed from a remote computer system (e.g., 110b).
  • Step 1420 involves accessing a user-input response (e.g., answer) to the question.
  • the user-input response may be input to and/or displayed in a user-modifiable field (e.g., 930) of a GUI (e.g., 900).
  • step 1430 involves displaying an answer to the question for comparison with the user-input response.
  • the answer may be displayed (e.g., in region 950 of GUI 900) in response to a user input (e.g., an interaction with graphical object 940 of GUI 900). Additionally, the answer may remain hidden until it is displayed in step 1430. The answer may remain visible for a predetermined period of time after display in step 1430 until it automatically returns to the hidden state. Alternatively, the answer may remain visible until a user input (e.g., via graphical object 940, another graphical object of GUI 900, etc.) is detected requesting that the predetermined answer be hidden.
  • the user-input response (e.g., accessed in step 1420) may be compared with the predetermined answer (e.g., displayed in step 1430), thereby improving learning by encouraging the user to review the predetermined subject matter, find the correct answer, think about why the user-input response may not match the predetermined answer, etc.
  • Step 1440 involves determining whether a request was detected for the user-input response (e.g., accessed in step 1420) to be saved (e.g., by interacting with graphical object 960). If it is determined that a request was not detected for the user-input response to be saved, then steps 1420-1440 may be repeated. Alternatively, if it is determined that a request was detected for the user-input response to be saved, then the user-input response may be accessed and stored (e.g., in a memory of the computer system presenting the GUI for displaying the question, in a memory of a remote computer system, etc.). As shown in Figure 14, step 1460 involves accessing the stored user-input response for review and/or analysis.
  • the stored user-input response may be displayed (e.g., in GUI 1000 for review by a user, on a remote system for review by a professor or an individual performing student analysis, etc.).
  • the stored user-input response may be accessed (e.g., by a local computer system, by a remote computer system, etc.) for automated analysis of the student's performance in another embodiment.
  • FIG. 15 shows exemplary GUI 1500 for listing coaches or professors in accordance with one embodiment of the present invention.
  • region 1510 of GUI 1500 comprises a listing of coaches or professors, information (e.g., area of expertise, years in a given industry, etc.) about one or more of the coaches or professors, and times when the coach or professor is available to observe and/or evaluate a student. For example, "Coach 1" specializes in plumbing, has been a plumber for over 15 years, and is available on Tuesdays and Thursdays for observation and/or evaluation sessions with students.
  • GUI 1500 comprises graphical objects 1520 for requesting sessions with a coach or professor (e.g., listed in region 1510).
  • interaction with one of graphical objects 1520 may initiate display of a GUI for enabling a student to request a session with a coach or professor (e.g., corresponding to the activated one of graphical objects 1520).
  • interaction with graphical object 1530 may initiate display of a forum GUI for enabling users (e.g., students, coaches or professors, etc.) to share information, exchange content (e.g., videos, pictures, etc.), etc.
  • Figure 16 shows exemplary GUI 1600 for enabling a student to request a session with a coach or professor in accordance with one embodiment of the present invention.
  • region 1610 of GUI 1600 comprises information about the availability of the coach or professor to aid the student in requesting a session.
  • Region 1620 comprises a plurality of user-modifiable fields for requesting a day and/or time for a session with a coach or professor. Further, students can enter or suggest multiple days/times for the session in order of preference. Additionally, region 1630 comprises a user-modifiable field for entering a message (e.g., to accompany the session request to the coach or professor). Further, interaction with graphical object 1640 may send the requested time and/or message to the coach or professor.
  • FIG 17 shows exemplary GUI 1700 for enabling a coach or professor to view students and set up a session with a student in accordance with one embodiment of the present invention.
  • GUI 1700 comprises a list of students (e.g., in column 1710), a respective highest unit number to which each student has access (e.g., in column 1720), and respective minutes of credit remaining (e.g., in column 1730) for use toward observation and/or evaluation sessions with a coach or professor.
  • column 1740 comprises a plurality of respective graphical objects corresponding to each student and for enabling a coach or professor to initiate a session with a student.
  • For example, interaction with a graphical object in column 1740 may initiate display of a GUI (e.g., GUI 1800 of Figures 18A and 18B) for enabling the coach or professor to observe and/or evaluate the student.
  • Figure 18A shows exemplary GUI 1800 for initiating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention.
  • Figure 18B shows exemplary GUI 1800 for terminating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention.
  • GUI 1800 comprises region 1810 for presenting media (e.g., video, still images, etc.).
  • video of a student performing an action or task related to a predetermined subject matter may be displayed in region 1810.
  • Audio related to the video presented in region 1810 may be played simultaneously with the video to implement an audio/visual presentation.
  • the media presented using regions of GUI 1800 (e.g., region 1810) may be prerecorded, streamed, live, etc. Accordingly, a coach or professor may observe and/or evaluate the student's performance using GUI 1800.
  • Video displayed in region 1810 may be generated using video and/or audio conferencing software such as Skype™, iChat from Apple Inc. of Cupertino, California, or the like.
  • a student may use a camera (e.g., web camera, etc.) coupled to a computer system (e.g., 110c of Figure 1) to record or otherwise capture an action or performance.
  • the video data of the performance may be accessed by a computer system (e.g., 110a) of a coach or professor and used to present the student's performance to the coach or professor (e.g., in region 1810 of GUI 1800).
  • GUI 1800 may also provide the ability to record the duration of an observation and/or evaluation session.
  • a coach or professor may interact with graphical object 1820 (e.g., shown in Figure 18A) to start a timer (e.g., displayed in region 1840 of GUI 1800 showing an elapsed time of the session).
  • the timer may be stopped by interacting with graphical object 1830 (e.g., shown in Figure 18B).
  • the timer may be started when the student begins a performance, and may be stopped when the student completes the performance.
  • graphical object 1820 and graphical object 1830 may be simultaneously displayed in GUI 1800 in other embodiments.
  • GUI 1800 may implement an automated billing system for the sessions conducted by the coach or professor. For example, a user or student may purchase a predetermined amount of time of observation/evaluation by a coach or professor.
  • Region 1850 may indicate an amount of purchased time (e.g., displayed in column 1730 of Figure 17) for use toward observation and/or evaluation by a coach.
  • region 1850 may display an amount of purchased time remaining before the current session was initiated (e.g., using graphical object 1820).
  • Region 1860 may indicate the remaining time for use toward observation and/or evaluation by a coach, where the amount of time displayed in region 1860 may decrement as the elapsed time displayed in region 1840 increments.
  • GUI 1800 may enable the student to be automatically charged or billed (e.g., in response to interaction with graphical object 1870) for observation/evaluation time used (e.g., displayed in region 1840).
  • interaction with graphical object 1880 may enable the session to be reset (e.g., to reset the timer displayed in region 1840 and the remaining minutes displayed in region 1860).
  • Where a student's performance is presented using audio alone (e.g., audio associated with the video presented in region 1810 of GUI 1800, audio presented simultaneously with the display of GUI 1800, etc.), regions of GUI 1800 may enable timing (e.g., using graphical objects 1820 and 1830 to time the session whose duration is displayed in region 1840) and/or automated billing of the audio performance (e.g., based upon the elapsed time displayed in region 1840, based upon the remaining amount of purchased time displayed in region 1860, etc.).
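  • The session timing and automated billing behavior described above (starting/stopping a timer, decrementing purchased minutes, charging for time used, and resetting) could be modeled roughly as sketched below; the class and method names are illustrative assumptions, not part of the disclosure.
```typescript
// Hypothetical sketch: timing an observation/evaluation session (graphical
// objects 1820/1830, elapsed time in region 1840) and decrementing the
// student's purchased minutes (regions 1850/1860), with billing on request.
class SessionBilling {
  private startedAt?: number;
  private elapsedMs = 0;

  constructor(private purchasedMinutes: number) {}

  start(): void {                       // graphical object 1820
    this.startedAt = Date.now();
  }

  stop(): void {                        // graphical object 1830
    if (this.startedAt !== undefined) {
      this.elapsedMs += Date.now() - this.startedAt;
      this.startedAt = undefined;
    }
  }

  elapsedMinutes(): number {            // region 1840
    const running = this.startedAt !== undefined ? Date.now() - this.startedAt : 0;
    return (this.elapsedMs + running) / 60_000;
  }

  remainingMinutes(): number {          // region 1860
    return Math.max(0, this.purchasedMinutes - this.elapsedMinutes());
  }

  bill(): number {                      // graphical object 1870: charge for time used
    const charged = Math.min(this.purchasedMinutes, this.elapsedMinutes());
    this.purchasedMinutes -= charged;
    return charged;
  }

  reset(): void {                       // graphical object 1880
    this.startedAt = undefined;
    this.elapsedMs = 0;
  }
}
```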
  • FIG 19 shows exemplary computer-implemented process 1900 for initiating a GUI for enabling observation and evaluation of a user in accordance with one embodiment of the present invention.
  • step 1910 involves capturing content of a student's performance. Capturing may comprise storing the content (e.g., video, still images, audio, video and audio, etc.), generating a live feed of the content, digitizing the content, transforming the content (e.g., transforming light and/or sound into a signal or data used to reproduce the light and/or sound, etc.), etc.
  • the performance may be related to a predetermined subject matter (e.g., taught using an interactive educational tool implemented using GUI 300).
  • Video content and/or still image content may be captured by a camera (e.g., a web camera, other still-image camera, other video camera, etc.) and accessed by a computer system (e.g., 110a) of the student.
  • Audio content may be captured by a microphone or the like and accessed by a computer system (e.g., 110a) of the student.
  • Step 1920 involves accessing the captured content.
  • The captured content (e.g., captured in step 1910) may be accessed by a computer system (e.g., 110b) of a coach or professor.
  • the captured content may be accessed by an interface component (e.g., 120) coupled to the student's computer system (e.g., 110a) and/or a computer system of a coach or professor (e.g., 110b).
  • step 1930 involves presenting the accessed content using a GUI for enabling observation and/or evaluation of the student's performance.
  • the GUI may enable a coach or professor to observe and/or evaluate a student's performance (e.g., by displaying video or pictures of the performance captured in step 1910, playing audio of the performance captured in step 1910, etc.) related to the predetermined subject matter.
  • the content may be presented in real-time, thereby enabling the observation and/or evaluation in real-time.
  • the content may be communicated via a communication channel (e.g., implemented using interface 120 and/or other networking components) formed between the computer system presenting GUI 1800 (e.g., computer system 110a) and the coach's or professor's computer system (e.g., 110c).
  • Step 1940 involves automatically billing the student for the observation and/or evaluation session.
  • automated billing may be implemented by deducting a determined length of the session from units of time purchased by the student.
  • the duration of the session may be determined using a timer (e.g., presented to a coach or professor using GUI 1800) which may be started and stopped based upon user inputs (e.g., by a coach or professor interacting with graphical objects 1820 and 1830 of GUI 1800).
  • the duration of the session may be automatically determined (e.g., based upon the duration of the video and/or audio captured).
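By way of example, and not limitation, the timing and automated billing of step 1940, together with the elapsed and remaining time of regions 1840 and 1860, may be modeled with the following minimal Python sketch. The class name ObservationSession and its methods are illustrative assumptions, not elements recited above.

import time

class ObservationSession:
    """Minimal model of the timing and billing of an observation/evaluation
    session (cf. regions 1840 and 1860 and graphical objects 1820/1830 of GUI 1800)."""

    def __init__(self, purchased_minutes: float):
        self.purchased_minutes = purchased_minutes   # remaining time (region 1860)
        self.elapsed_seconds = 0.0                   # elapsed time (region 1840)
        self._started_at = None

    def start(self) -> None:                         # e.g., graphical object 1820
        self._started_at = time.monotonic()

    def stop(self) -> None:                          # e.g., graphical object 1830
        if self._started_at is not None:
            self.elapsed_seconds += time.monotonic() - self._started_at
            self._started_at = None

    def reset(self) -> None:                         # e.g., graphical object 1880
        self._started_at = None
        self.elapsed_seconds = 0.0

    def bill(self) -> float:                         # e.g., graphical object 1870
        """Deduct the session length from the student's purchased time and
        return the number of minutes billed."""
        used_minutes = self.elapsed_seconds / 60.0
        self.purchased_minutes = max(0.0, self.purchased_minutes - used_minutes)
        return used_minutes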
Electronic Flashcards
  • Figure 20A shows exemplary GUI 2000 for implementing electronic flashcards in accordance with one embodiment of the present invention.
  • the term “electronic flashcard” can mean a visual representation of one or both sides of a flashcard, where the visual representation may be displayed on a display device coupled to a computer system (e.g., 110a, 110b, 110c, 200, etc.). One side of the electronic flashcard may remain hidden until it is selectively revealed or displayed (e.g., in response to a user input or interaction with GUI 2000), where the selective revealing or displaying may comprise a "flipping" of the electronic flashcard in one embodiment.
  • the term “hidden” as used herein can mean not displayed, displayed so that it is less visible, etc.
  • GUI 2000 comprises region 2010 for displaying one or both sides of an electronic flashcard (e.g., an "active" electronic flashcard). Additionally, region 2020 comprises multiple piles or stacks (e.g., 2022-2026) for storing electronic flashcards which are not currently being viewed (e.g., "inactive" electronic flashcards). In one embodiment, the electronic flashcards implemented using GUI 2000 may comprise SAFMEDS cards.
  • Electronic flashcards may be transferred between regions 2010 and 2020 by interacting with one or more regions of GUI 2000.
  • graphical object 2030 may be used to automatically transfer an active electronic flashcard from region 2010 to region 2020 in one embodiment.
  • Interaction with one or both sides of the active electronic flashcard displayed in region 2010 (e.g., by moving an on-screen cursor over one or both sides and clicking a mouse button, by moving an on-screen cursor over one or both sides and double-clicking a mouse button, etc.) may also automatically transfer the active electronic flashcard from region 2010 to region 2020.
  • interaction with an inactive electronic flashcard in one of the piles of region 2020 and/or interaction with a graphical object may automatically transfer one or more selected electronic flashcards from region 2020 to region 2010.
  • limitations may be placed on the transferring of electronic flashcards between regions 2010 and 2020.
  • a user may be required to view or otherwise interact with an electronic flashcard in region 2010 (e.g., one time, multiple times, etc.) before transferring it to region 2020.
  • embodiments may improve learning of the material presented using GUI 2000 by increasing user interaction with the material presented using the electronic flashcards of GUI 2000.
  • region 2010 comprises region 2040 (e.g., a first side of an active electronic flashcard) for displaying first set of information 2045.
  • Information 2045 displayed in region 2040 may remain visible while a second set of information to be displayed in region 2050 (e.g., a second side of the active electronic flashcard) is hidden.
  • Interaction with graphical object 2060 may "flip" the flashcard and display the second set of information (e.g., 2055) as depicted in Figure 20B.
  • Information 2045 and information 2055 may both be associated with a predetermined subject matter (e.g., taught using an interactive educational tool implemented using GUI 300).
  • a user may learn the first and/or second sets of information by viewing the information displayed in region 2040 (e.g., information 2045), attempting to recite the information hidden in region 2050 (e.g., information 2055), interacting with graphical object 2060 to display information 2055 in region 2050, and then checking the recited information against information 2055.
  • Information 2045 and/or information 2055 may comprise text (e.g., a word, phrase, term, definition of the term, etc.), colors, patterns, etc.
  • information 2045 and information 2055 may be related (e.g., to one another and a predetermined subject matter) such that a user may view one set of information and test his or her knowledge of the other (e.g., by trying to recite the hidden information).
  • Information 2045 displayed in region 2040 may comprise a term (e.g., the words "pipe wrench"), while information 2055 to be selectively displayed in region 2050 may comprise a definition of the term displayed in region 2040, where information 2045 and information 2055 are related to a predetermined subject matter (e.g., plumbing).
  • a user may attempt to recite information 2055 (e.g., a definition of "pipe wrench") after looking at information 2045 (e.g., the term "pipe wrench") but before the display of information 2055 in region 2050, thereby using the electronic flashcards implemented using GUI 2000 to learn about the predetermined subject matter (e.g., plumbing).
  • embodiments improve learning (e.g., of a predetermined subject matter) by increasing the amount of information or content which may be displayed on the electronic flashcards compared with conventional, handwritten flashcards.
  • electronic information or content to be displayed on the electronic flashcards may be relatively small (e.g., occupy a relatively small amount of storage space) and/or be accessed from one or more sources (e.g., local hard drives, remote computer systems, etc.).
  • the electronic flashcards are less likely to be damaged, lost, or stolen given that they are in electronic form.
  • information on each side of the electronic flashcards may be automatically generated (e.g., based upon a lesson plan of another module, based upon a user-defined subject matter, etc.) in one embodiment, thereby reducing the time and effort to create the flashcards.
  • information 2045 may comprise a picture (e.g., of a pipe wrench) or video (e.g., of a plumber using a pipe wrench).
  • a user may attempt to recite information 2055 (e.g., a definition of a "pipe wrench,” the term “pipe wrench,” etc.) after looking at information 2045 (e.g., a picture or video showing a pipe wrench) but before the display of information 2055 in region 2050, thereby using the electronic flashcards implemented using GUI 2000 to learn about the predetermined subject matter (e.g., plumbing).
  • embodiments may further improve learning (e.g., of a predetermined subject matter) by further increasing the amount of information or content which may be displayed on the electronic flashcards compared with conventional, handwritten flashcards.
  • video and/or audio content may be presented to a user, thereby improving learning by presenting information in different forms to stimulate more senses of a user (e.g., catering to visual learners, audio learners, etc.).
  • the presentation of information in different forms can increase repetition of information to improve information absorption/retention.
  • interaction with graphical object 2070 may initiate display of second set of information 2055 in region 2040 (e.g., while first set of information 2045 is hidden). Additionally, interaction with graphical object 2070 may associate first set of information 2045 with region 2050 such that a subsequent interaction with graphical object 2060 may initiate display of information 2045 in region 2050 (e.g., as depicted in Figure 20D). As such, interaction with graphical object 2070 may effectively switch the front and back sides of the electronic flashcard in one embodiment.
  • embodiments further improve learning by enabling both sets of information (e.g., 2045 and 2055) to be selectively hidden and revealed (e.g., in region 2050 using graphical object 2060), thereby enabling users to test their memorization, learning, understanding, etc. of both sets of information (e.g., 2045 and 2055).
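By way of example, and not limitation, the flipping and side-swapping behavior described above may be modeled with the following Python sketch, which simplifies regions 2040 and 2050 to a single visible side; the class and method names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ElectronicFlashcard:
    """Two-sided card in which one side may remain hidden (cf. regions 2040 and 2050)."""
    front: str                    # first set of information, e.g., 2045
    back: str                     # second set of information, e.g., 2055
    showing_front: bool = True

    def visible(self) -> str:
        return self.front if self.showing_front else self.back

    def flip(self) -> str:
        """Reveal the hidden side (cf. graphical object 2060)."""
        self.showing_front = not self.showing_front
        return self.visible()

    def swap_sides(self) -> None:
        """Exchange the front and back sides (cf. graphical object 2070)."""
        self.front, self.back = self.back, self.front
        self.showing_front = True

card = ElectronicFlashcard(front="pipe wrench",
                           back="an adjustable wrench used to grip and turn threaded pipe")
print(card.visible())             # study the term and attempt to recite the definition...
print(card.flip())                # ...then reveal the definition to check the recitation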
  • region 2020 comprises stacks 2022-2026 for storing electronic flashcards which are not currently being viewed (e.g., "inactive" electronic flashcards).
  • Stack 2022 may comprise electronic flashcards which have not yet been accessed or viewed (e.g., transferred to region 2010).
  • electronic flashcards may automatically accumulate in stack 2022 as a user progresses through a lesson and encounters or accesses new subject matter (e.g., to be placed on one or more electronic flashcards).
  • Stack 2024 and/or stack 2026 may comprise electronic flashcards which have been previously accessed or viewed (e.g., transferred from region 2010 to region 2020).
  • stacks 2024 and 2026 may comprise electronic flashcards sorted based upon user-confidence level with the subject matter of the electronic flashcards.
  • Stack 2024 may comprise electronic flashcards with subject matter which a user is less comfortable with (e.g., has not memorized, etc.), while stack 2026 may comprise electronic flashcards with subject matter which a user is more confident with (e.g., has memorized, etc.).
  • stacks 2024 and 2026 may comprise electronic flashcards sorted by subject matter (e.g., electronic flashcards with different types of plumbing tools in stack 2024, electronic flashcards with different plumbing techniques in stack 2026, etc.) and/or grouped based upon other characteristics.
  • embodiments enable users to sort, group, or otherwise place electronic flashcards in one or more stacks (e.g., 2024, 2026, etc.), thereby improving learning (e.g., of the predetermined subject matter) by enabling users to separate out and focus on the more troublesome material (e.g., displayed or stored in stack 2024) while devoting less attention to the material which the user is more comfortable with (e.g., displayed or stored in stack 2026).
  • the number of electronic flashcards in a given stack may also be reduced by placing the electronic flashcards into a larger number of stacks.
  • the material presented using the electronic flashcards may be more easily learned by enabling users to focus on a smaller amount of material at a given time.
  • the number of cards placed on at least one stack (e.g., 2022, 2024, 2026, etc.) of region 2020 may be limited. For example, once a stack (e.g., 2022, 2024, 2026, etc.) reaches its predetermined limit, one or more electronic flashcards may be removed before allowing additional electronic flashcards to be placed on the stack. As a further example, stack 2022 may have a limit of one card and stack 2024 may have a limit of six cards, while stack 2026 may have no limit or a user-defined limit. In this manner, learning may be improved by encouraging and/or forcing a user to learn the previously-accessed material before moving on to additional material.
  • stack 2024 may have reached a predetermined limit (e.g., as indicated by a visual attribute, e.g., the darker color or shade of stack 2024) of six electronic flashcards, thereby requiring removal of an electronic flashcard from stack 2024 (e.g., by moving an electronic flashcard from stack 2024 to stack 2026, by moving an electronic flashcard from stack 2024 to region 2010 for viewing, etc.) before another electronic flashcard may be located on stack 2024.
  • embodiments reduce the review time for each stack and enable users to more effectively, quickly, and easily learn the material (e.g., using the SAFMEDS method of periodically performing short review sessions).
  • GUI 2000 may enable automated shuffling of the electronic flashcards in one or more stacks (e.g., 2022, 2024, 2026, etc.) of region 2020.
  • the shuffling may be initiated by interacting with one or more graphical objects of GUI 2000 (not shown in Figure 20A), by interacting with a region of GUI 2000 (e.g., activating or selecting an electronic flashcard from region 2020, clicking or double-clicking a stack in region 2020, etc.), etc.
  • embodiments enable more randomized and improved shuffling over manual shuffling of conventional flashcards, thereby improving learning of the material.
  • the automatic shuffling of GUI 2000 may enable users to use the electronic flashcards in accordance with the SAFMEDS method.
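By way of example, and not limitation, the per-stack limits and automated shuffling may be modeled as follows; the limits shown (one card for stack 2022, six cards for stack 2024) follow the example above, and the class name FlashcardStack is an illustrative assumption.

import random
from typing import Optional

class FlashcardStack:
    """A pile of inactive electronic flashcards with an optional size limit
    (cf. stacks 2022-2026 of region 2020)."""

    def __init__(self, name: str, limit: Optional[int] = None):
        self.name = name
        self.limit = limit        # e.g., 1 for stack 2022, 6 for stack 2024, None for stack 2026
        self.cards: list = []

    def add(self, card) -> bool:
        """Refuse additional cards once the predetermined limit is reached."""
        if self.limit is not None and len(self.cards) >= self.limit:
            return False          # a card must first be removed (e.g., moved to another stack)
        self.cards.append(card)
        return True

    def remove(self):
        return self.cards.pop() if self.cards else None

    def shuffle(self) -> None:
        """Automated shuffling of the stack (e.g., for SAFMEDS-style review sessions)."""
        random.shuffle(self.cards)

new_cards = FlashcardStack("not yet viewed", limit=1)      # stack 2022
working = FlashcardStack("less comfortable", limit=6)      # stack 2024
mastered = FlashcardStack("memorized")                     # stack 2026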
  • GUI 2000 may enable one or more electronic flashcards (e.g., displayed within region 2010 and/or 2020) to be printed.
  • users may conveniently and quickly generate hard-copy flashcards (e.g., printed on paper, cardstock, index cards, etc.) for reviewing the material of the electronic flashcards in hard-copy form.
  • one or more of the flashcards may be conveniently and quickly re-printed.
  • Although Figures 20A-20E display only one active electronic flashcard in region 2010, it should be appreciated that more than one active electronic flashcard may be displayed in region 2010 in other embodiments. Additionally, although region 2020 comprises three stacks (e.g., 2022, 2024, and 2026) in Figures 20A-20E, it should be appreciated that region 2020 may comprise a larger or smaller number of stacks in other embodiments.
  • Figure 21 shows exemplary computer-implemented process 2100 for implementing electronic flashcards in accordance with one embodiment of the present invention.
  • step 2110 involves displaying a first set of information (e.g., 2045) associated with a predetermined subject matter in a first area (e.g., 2040) of a GUI (e.g., 2000 as depicted in Figures 20A-20E) while a second set of information (e.g. 2055) associated with the predetermined subject matter remains hidden.
  • the first set of information (e.g., 2045) may comprise text (e.g., a word, phrase, term, etc.), colors, patterns, graphical information (e.g., still images, video, etc.), or the like.
  • Step 2120 involves determining whether a request to display the second set of information (e.g., 2055) in the first area (e.g., 2040) has been detected. If a request to display the second set of information (e.g., 2055) in the first area (e.g., 2040) has not been detected, then step 2130 may be performed.
  • step 2130 involves detecting a user input requesting display of the second set of information (e.g., 2055).
  • the user input may comprise an interaction with a graphical object (e.g., 2060 of Figures 20A-20E) of the GUI (e.g., 2000) displaying the first set of information (e.g., 2045) in step 2110.
  • Step 2140 involves displaying the second set of information (e.g., 2055) in a second area (e.g., 2050) of the GUI (e.g., 2000).
  • step 2150 involves automatically hiding the second set of information (e.g., 2055) after a predetermined period of time.
  • Alternatively, the second set of information (e.g., 2055) may be hidden in response to a user input (e.g., to GUI 2000).
  • If it is determined in step 2120 that a request to display the second set of information (e.g., 2055) in the first area (e.g., 2040) has been detected, then the second set of information (e.g., 2055) may be displayed in the first area (e.g., 2040) of the GUI (e.g., 2000) in step 2160.
  • the second set of information (e.g., 2055) may be displayed in the first area (e.g., 2040) while the first set of information (e.g. 2045) remains hidden.
  • step 2170 involves detecting a user input requesting display of the first set of information (e.g., 2045).
  • the user input may comprise an interaction with a graphical object (e.g., 2060) of the GUI (e.g., 2000) displaying the second set of information (e.g., 2055) in step 2160.
  • Step 2180 involves displaying the first set of information (e.g., 2045) in the second area (e.g., 2050) of the GUI (e.g., 2000).
  • the first set of information may comprise text (e.g., a word, phrase, definition of a term, etc.), colors, patterns, graphical information (e.g., still images, video, etc.), or the like.
  • step 2190 involves automatically hiding the first set of information (e.g., 2045) after a predetermined period of time.
  • Alternatively, the first set of information (e.g., 2045) may be hidden in response to a user input (e.g., to GUI 2000).
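By way of example, and not limitation, the flow of process 2100 may be sketched as a console simulation in which print() and input() stand in for the GUI display and the interaction with graphical object 2060; the duration of the automatic hide is an assumed value.

import time

def run_flashcard_process(first_info: str, second_info: str,
                          sides_swapped: bool = False,
                          hide_after_s: float = 3.0) -> None:
    """Console simulation of process 2100 (print()/input() stand in for the GUI)."""
    if not sides_swapped:
        # Steps 2110-2150: display the first set, await a reveal request,
        # display the second set in the second area, then hide it again.
        print(f"area 2040: {first_info}")                    # step 2110 (second set hidden)
        input("recite the hidden side, then press Enter ")   # step 2130 (e.g., object 2060)
        print(f"area 2050: {second_info}")                   # step 2140
        time.sleep(hide_after_s)                             # step 2150: automatic hide
        print("area 2050: <hidden>")
    else:
        # Steps 2160-2190: the sides are swapped (cf. graphical object 2070).
        print(f"area 2040: {second_info}")                   # step 2160 (first set hidden)
        input("recite the hidden side, then press Enter ")   # step 2170
        print(f"area 2050: {first_info}")                    # step 2180
        time.sleep(hide_after_s)                             # step 2190: automatic hide
        print("area 2050: <hidden>")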

Abstract

Graphical user interfaces, systems and methods of implementing an interactive educational tool are disclosed. More specifically, embodiments provide a graphical user interface (GUI) for presenting text and related content associated with a predetermined subject matter. Users may interact with various graphical elements of the GUI for selecting and viewing regions of the subject matter, viewing additional content related to the subject matter, displaying interactive electronic flashcards for studying the subject matter, and entering/reviewing notes taken related to the subject matter. Users may also enter answers to questions and compare the entered answer with the correct answer. Embodiments also enable students to initiate a session with coaches or professors for enabling the coach or professor to observe and evaluate actions of the student related to the predetermined subject matter.

Description

INTERACTIVE EDUCATIONAL TOOL
RELATED APPLICATIONS
The present application is related to and claims the benefit of United States Provisional Patent Application Number 60/963,342, filed August 2, 2007, entitled "INTERACTIVE LEARNING TOOL AND ELECTRONIC FLASH CARDS," naming Vicci Tucci as the inventor, assigned to the assignee of the present invention, and having attorney docket number TUCI-P001.PRO. That application is incorporated herein by reference in its entirety and for all purposes.
The present application is related to United States Patent Application Number 11/900,989, filed September 14, 2007, entitled "ELECTRONIC FLASHCARDS," naming Victoria A. Tucci as the inventor, assigned to the assignee of the present invention, and having attorney docket number TUCI-P002. That application is incorporated herein by reference in its entirety and for all purposes.
BACKGROUND OF THE INVENTION
Computer-based learning is becoming more common as time goes on. For example, Ecollege.com and other websites offer e-learning products to colleges and other institutions to help educate students. Conventional e-learning products are often used to supplement the classroom instruction by providing information and resources associated with the course material presented during class. In other situations, the e-learning products may replace classroom activities and provide an online course with little interaction from the professor or person in charge of overseeing the online course. Despite the increasing use of computer-based learning solutions, most conventional e-learning products provide little user interaction for helping the student learn the material. For example, many conventional online educational tools merely present the subject matter being taught in text form, thereby providing little student interaction for reinforcing, repeating, or otherwise aiding the learning of the material. Questions presented in the text are answered by uploading answers that were composed using separate, off-line word processing software. Additionally, students are required to navigate through complex menu structures to view a list of content uploaded by the professor, thereby diverting the student's attention and detracting from the student's efforts to learn the material.
Similar to conventional online e-learning products, many conventional off-line educational tools, provided on a CD-ROM or similar media, fail to interact with and engage the student. Further, conventional off-line educational tools offer limited content given the space requirements of the media. As such, users are required to remove and insert media throughout the lesson which causes inconvenience and disrupts learning. Additionally, the student must purchase additional content, thereby increasing the price of the learning experience and further deterring learning.
SUMMARY OF THE INVENTION
Accordingly, a need exists for an educational tool with increased user interaction. A need also exists for an educational tool which more seamlessly and conveniently presents content associated with the lesson plan. Additionally, a need exists for an educational tool with increased content. Further, a need exists for an educational tool which more conveniently presents a lesson plan at a reduced cost. Embodiments of the present invention provide novel solutions to these needs and others as described below.
Embodiments are directed to graphical user interfaces, systems and methods of implementing an interactive educational tool. More specifically, embodiments provide a graphical user interface (GUI) for presenting text and related content (e.g., videos, still images, etc.) associated with a predetermined subject matter. Users may interact with various graphical elements of the GUI for selecting and viewing portions of the subject matter (e.g., facts, rules, glossary terms/definitions, etc. associated with the text and/or content), viewing additional content (e.g., handouts, worksheets, etc.) related to the subject matter, displaying interactive electronic flashcards for studying the subject matter, and entering/reviewing notes taken related to the subject matter. Users may also enter answers to questions and compare the entered answer with the correct answer (e.g., displayed in another region or area of the GUI). As such, embodiments enable users to more conveniently, effectively, and efficiently learn subject matter presented using the GUI.
Embodiments also enable students to initiate a session with coaches or professors for enabling the coach or professor to observe and evaluate actions of the student related to the predetermined subject matter. For example, where the student is learning to perform a task presented using the GUI, the professor may observe the student performing the task and evaluate the student based upon the performance using a remote observation and evaluation interface. In addition to enabling observation and/or evaluation of the student, a scheduling interface may be presented to the user for scheduling the session with the professor. Further, a timing interface may be presented to the professor for enabling the professor to time the student and/or automatically bill the student based upon the duration of the session.
In one embodiment, a graphical user interface for implementing an interactive educational tool includes a first region for displaying text, wherein the text comprises educational information associated with a predetermined subject matter. The graphical user interface also includes a second region for presenting media simultaneously with the display of the text, wherein the media is related to the text and comprises educational information associated with the predetermined subject matter. Data for generating the text and the media is stored on a first computer system and accessed by a second computer system presenting the text and the media, wherein the first computer system is located remotely from the second computer system. The media may be selected from a group consisting of video and still images, and wherein the media is for visually depicting the predetermined subject matter associated with the text. Additionally, the text may comprise a question for testing a user, and the graphical user interface may further include a third region for accepting a user-input response to the question and a fourth region for selectively displaying an answer to the question and for enabling comparison of the user-input response and the answer.
In another embodiment, a method of implementing an interactive educational tool includes accessing data from a first computer system. The method also includes displaying text comprising educational information associated with a predetermined subject matter, wherein the text is generated from the data and displayed on a second computer system located remotely from the first computer system. Media is also presented related to the text and comprising educational information associated with the predetermined subject matter, wherein the media is generated from the data and presented on the second computer system, and wherein the text is displayed simultaneously with the presentation of the media. The media may be selected from a group consisting of video and still images, and wherein the media is for visually depicting the predetermined subject matter associated with the text. Additionally, the method may also include, in response to a user interaction with the second computer system, initiating a communication channel with a third computer system for enabling a user of the third computer system to observe and evaluate in real-time a performance of a user of the second computer system, wherein the performance is associated with the predetermined subject matter, and wherein the third computer system is located remotely from the second computer system.
In yet another embodiment, a system includes a first computer system for storing data used to generate educational information. The system also includes a second computer system communicatively coupled to the first computer system, where the second computer system is for accessing the data and generating a graphical user interface using the data. The graphical user interface includes a first region for displaying text, wherein the text includes educational information associated with a predetermined subject matter. The graphical user interface also includes a second region for presenting media simultaneously with the display of the text, wherein the media is related to the text and comprises educational information associated with the predetermined subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements. Figure 1 shows an exemplary system for presenting a graphical user interface (GUI) in accordance with one embodiment of the present invention.
Figure 2 shows an exemplary computer system platform upon which embodiments of the present invention may be implemented.
Figure 3 shows an exemplary on-screen GUI for implementing an exemplary interactive educational tool in accordance with one embodiment of the present invention.
Figure 4 shows display of exemplary information in response to interaction with a GUI in accordance with one embodiment of the present invention.
Figure 5 shows an exemplary user prompt for encouraging interaction with a GUI and viewing of information in accordance with one embodiment of the present invention.
Figure 6 shows an exemplary user prompt for encouraging interaction with a GUI and viewing of electronic flashcards in accordance with one embodiment of the present invention.
Figure 7 shows an exemplary GUI comprising a list of terms with respective definitions in accordance with one embodiment of the present invention.
Figure 8A shows an exemplary GUI for enabling users to enter and save notes in accordance with one embodiment of the present invention. Figure 8B shows several exemplary notes in accordance with one embodiment of the present invention.
Figure 9A shows an exemplary GUI for entering answers to questions in accordance with one embodiment of the present invention.
Figure 9B shows an exemplary GUI for comparing a user-input response with a predetermined answer in accordance with one embodiment of the present invention.
Figure 10 shows an exemplary GUI for displaying saved answers in accordance with one embodiment of the present invention.
Figure 11 shows an exemplary GUI for selecting a unit or task in accordance with one embodiment of the present invention.
Figure 12 shows an exemplary GUI for tracking student progress in accordance with one embodiment of the present invention.
Figure 13 shows an exemplary computer-implemented process for implementing an interactive education tool in accordance with one embodiment of the present invention.
Figure 14 shows an exemplary computer-implemented process for presenting questions and accepting user-input responses to the question in accordance with one embodiment of the present invention. Figure 15 shows an exemplary GUI for listing coaches or professors in accordance with one embodiment of the present invention.
Figure 16 shows an exemplary GUI for enabling a student to request a session with a coach or professor in accordance with one embodiment of the present invention.
Figure 17 shows an exemplary GUI for enabling a coach or professor to view students and setup a session with a student in accordance with one embodiment of the present invention.
Figure 18A shows an exemplary GUI for initiating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention.
Figure 18B shows an exemplary GUI for terminating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention.
Figure 19 shows an exemplary computer-implemented process for initiating a GUI for enabling observation and evaluation of a user in accordance with one embodiment of the present invention.
Figure 20A shows an exemplary GUI for implementing electronic flashcards in accordance with one embodiment of the present invention.
Figure 20B shows an exemplary flipping of an electronic flashcard in accordance with one embodiment of the present invention.
Figure 20C shows an exemplary swapping of front and back sides of an electronic flashcard in accordance with one embodiment of the present invention.
Figure 20D shows an exemplary flipping of an electronic flashcard with front and back sides swapped in accordance with one embodiment of the present invention.
Figure 20E shows an exemplary stack of electronic flashcards which have reached a predetermined limit on the number of electronic flashcards in the stack in accordance with one embodiment of the present invention.
Figure 21 shows an exemplary computer-implemented process for implementing electronic flashcards in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings. While the present invention will be discussed in conjunction with the following embodiments, it will be understood that they are not intended to limit the present invention to these embodiments alone. On the contrary, the present invention is intended to cover alternatives, modifications, and equivalents which may be included with the spirit and scope of the present invention as defined by the appended claims. Furthermore, in the following detailed description of the present invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, embodiments of the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present invention.
Notation and Nomenclature
Some regions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing the terms such as "accepting," "accessing," "adding," "adjusting," "analyzing," "applying," "assembling," "assigning," "calculating," "capturing," "combining," "comparing," "collecting," "creating," "defining," "depicting," "detecting," "determining," "displaying," "establishing," "executing," "flipping," "generating," "grouping," "hiding," "identifying," "initiating," "interacting," "modifying," "monitoring," "moving," "outputting," "performing," "placing," "presenting," "processing," "programming," "querying," "removing," "repeating," "sampling," "sorting," "storing," "subtracting," "tracking," "transforming," "using," or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
System For Implementing A Graphical User Interface
Figure 1 shows exemplary system 100 for presenting a graphical user interface (GUI) in accordance with one embodiment of the present invention. As shown in Figure 1, computer systems 110a-110c are communicatively coupled by interface 120. Interface 120 may comprise the internet, a network, or some other device/component for communicatively coupling computer systems 110a-110c. Additionally, computer system 110a may present GUI 130, whereas computer system 110c may present GUI 150. Data 140 may be accessed from computer system 110b (e.g., via interface 120) for generating GUI 130 and/or GUI 150, where computer system 110b may comprise a remote server in one embodiment. Alternatively, data for generating GUI 130 and/or GUI 150 may be accessed locally from a respective computer system (e.g., 110a, 110c, etc.), remotely from a computer system other than computer system 110b, etc.
GUI 130 may comprise an interactive educational tool (e.g., as discussed with respect to Figures 3-14 below) for enabling users to interact with content associated with a predetermined subject matter (e.g., a topic or subject of a lesson plan selected by a user). In one embodiment, the content may comprise media (e.g., video, still images, sound, etc.) which is simultaneously displayed with text or other information associated with the predetermined subject matter. GUI 130 and/or the content presented using GUI 130 may be generated from data (e.g., 140) accessed from a remote computer system (e.g., 110b). Where interface 120 comprises the internet, the interactive educational tool implemented using GUI 130 may comprise an online interactive educational tool (e.g., presented using a web browser of computer system 110a). As such, remote access of information may enable GUI 130 to present additional and/or different content compared to conventional solutions, while presentation of multiple forms of content using GUI 130 may enable users to more conveniently, effectively, and efficiently learn subject matter presented using GUI 130.
In another embodiment, system 100 may implement a remote observation and evaluation interface (e.g., as discussed with respect to Figures 15-20 below) for observing and evaluating (e.g., in real-time) a student's performance when performing an action or task associated with the predetermined subject matter (e.g., presented using the interactive educational tool implemented by GUI 130 as discussed above). Computer system 110c may present video and/or audio information (e.g., using GUI 150) of the student's performance accessed or captured using at least one interface device (e.g., a camera, microphone, etc.) coupled to computer system 110a. The video and/or audio information may be communicated via a communication channel (e.g., implemented using interface 120 and/or other networking components) formed between computer systems 110a and 110c. In addition to enabling observation and/or evaluation of the student, a scheduling interface may be presented to the user for scheduling the session with the professor. Further, a timing interface may be presented to the professor for enabling the professor to time the student and/or automatically bill the student based upon the duration of the session. Alternatively, GUI 130 may comprise an interface for enabling information on a selected side of an electronic flashcard (e.g., SAFMEDS card) to be displayed and hidden. For example, the information on a first side of an electronic flashcard (e.g., comprising a term, phrase, picture, etc.) may be displayed while the information on the second side is hidden, thereby enabling a user to test himself or herself before revealing the information on the second side of the electronic flashcard (e.g., comprising a definition of a term on the first side, additional information about the information on the first side, etc.). GUI 130 may also enable placement or storage of inactive electronic flashcards (e.g., which are not currently being viewed or used) in multiple decks or piles (e.g., based upon user-confidence level with the subject matter of the flashcards, differences in the subject matter of the flashcards, etc.) to improve learning of the material (e.g., by enabling users to focus study efforts on more troublesome material of flashcards placed in a given pile). Further, GUI 130 may enable automated shuffling of the electronic flashcards, thereby providing more randomized and improved shuffling over manual shuffling of conventional flashcards.
Although Figure 1 shows three computer systems (e.g., 110a-110c) coupled via interface 120, it should be appreciated that a larger or smaller number of computer systems may be coupled via interface 120 in other embodiments. Additionally, it should be appreciated that interface 120 may comprise more than one component in other embodiments.
Figure 2 shows exemplary computer system platform 200 upon which embodiments of the present invention may be implemented. As shown in Figure 2, portions of the present invention are comprised of computer-readable and computer-executable instructions that reside, for example, in computer system platform 200 and which may be used as a part of a general purpose computer network (not shown). It is appreciated that computer system platform 200 of Figure 2 is merely exemplary. As such, the present invention can operate within a number of different systems including, but not limited to, general-purpose computer systems, embedded computer systems, laptop computer systems, hand-held computer systems, portable computer systems, stand-alone computer systems, or game consoles.
In one embodiment, depicted by dashed lines 230, computer system platform 200 may comprise at least one processor 210 and at least one memory 220. Processor 210 may comprise a central processing unit (CPU) or other type of processor. Depending on the configuration and/or type of computer system environment, memory 220 may comprise volatile memory (e.g., RAM), non-volatile memory (e.g., ROM, flash memory, etc.), or some combination of the two. Additionally, memory 220 may be removable, non-removable, etc.
In other embodiments, computer system platform 200 may comprise additional storage (e.g., removable storage 240, non-removable storage 245, etc.). Removable storage 240 and/or non-removable storage 245 may comprise volatile memory, nonvolatile memory, or any combination thereof. Additionally, removable storage 240 and/or non-removable storage 245 may comprise CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information for access by computer system platform 200. As shown in Figure 2, computer system platform 200 may communicate with other systems, components, or devices via communication interface 270. Communication interface 270 may embody computer readable instructions, data structures, program modules or other data in a modulated data signal (e.g., a carrier wave) or other transport mechanism. By way of example, and not limitation, communication interface 270 may couple to wired media (e.g., a wired network, direct-wired connection, etc.) and/or wireless media (e.g., a wireless network, a wireless connection utilizing acoustic, RF, infrared, or other wireless signaling, etc.).
Communication interface 270 may also couple computer system platform 200 to one or more input devices (e.g., a keyboard, mouse, pen, voice input device, touch input device, etc.) and/or output devices (e.g., a display, speaker, printer, etc.).
As shown in Figure 2, graphics processor 250 may perform graphics processing operations on graphical data stored in frame buffer 260 or another memory (e.g., 220, 240, 245, etc.) of computer system platform 200. Graphical data stored in frame buffer 260 may be accessed, processed, and/or modified by components (e.g., graphics processor 250, processor 210, etc.) of computer system platform 200 and/or components of other systems/devices. Additionally, the graphical data may be accessed (e.g., by graphics processor 250) and displayed on an output device coupled to computer system platform 200.
In one embodiment, computer system platform 200 may be used to implement computer system 110a, computer system 110b, computer system 110c, interface component 120, or some combination thereof. For example, communication interface 270 may communicatively couple computer system 200 to one or more other computer systems (e.g., 110a, 110b, 110c, etc. via interface 120). Additionally, memory 220, removable storage 240, non-removable storage 245, frame buffer 260, or a combination thereof, may comprise instructions that when executed on a processor (e.g., 210, 250, etc.) perform a method of implementing an interactive educational tool (e.g., using GUI 130, GUI 150, etc.), implementing a remote observation and evaluation interface (e.g., using GUI 130, GUI 150, etc.), implementing electronic flashcards (e.g., using GUI 130, GUI 150, etc.), or some combination thereof. The graphical data used to display the GUI (e.g., 130, 150, etc.) may be accessed from frame buffer 260 and displayed on an output device coupled to computer system platform 200.
Interactive Educational Tool
Figure 3 shows exemplary on-screen GUI 300 for implementing an exemplary interactive educational tool in accordance with one embodiment of the present invention. As shown in Figure 3, GUI 300 comprises window or region 310 for displaying text or other information about a predetermined subject matter. The text or other information displayed in region 310 may be related to a lesson plan or region of a lesson plan identified in region 320. Window or region 330 may present media (e.g., video, still images, etc.) related to the predetermined subject matter and/or the information presented in region 310. The media presented in region 330 may also be related to audio (e.g., an audio track for a video displayed in region 330, etc.) and/or other visual information (e.g., presented using GUI 300, presented by other light emitting devices, presented by other displays, etc.).
Accordingly, GUI 300 may be used to simultaneously present various forms of information (e.g., text in region 310 and video in region 330, etc.) related to a predetermined subject matter, thereby providing an educational tool which more conveniently and effectively presents educational information to users. For example, where GUI 300 is used to present information about or otherwise teach plumbing, region 320 may identify a unit associated with unclogging a drain, region 310 may present information about clogged drains (e.g., the cause of most clogs, the steps to be taken to unclog the drain, etc.), and region 330 may show a video of a plumber unclogging a drain. As such, learning may be enhanced using GUI 300 given the repetition of information in various forms (e.g., text in region 310 which may explain the steps of unclogging a drain, the video in region 330 which may show the performance of the steps explained in region 310, etc.), the simultaneous display of this information (e.g., enabling students to quickly and conveniently move from one form of information to another), and the ability for users to interact with this information using GUI 300 as discussed below.
As shown in Figure 3, a user may interact (e.g., by moving an on-screen cursor over a region of GUI 300 and clicking a mouse button, by moving an on-screen cursor over a region of GUI 300, by pressing a touchscreen disposed over GUI 300, etc.) with various regions of GUI 300. For example, region 310 may comprise interactive regions 312-316 for enabling users to interact with text and/or other information presented in region 310. In one embodiment, regions 312-316 may comprise hyperlinked text for initiating display (e.g., within regions of GUI 300, in another displayed window, etc.) of additional content (e.g., related to the predetermined subject matter). Region 330 may comprise interactive regions 332 and 334 for enabling a user to interact with media presented in region 330.
Additionally, GUI 300 may comprise graphical objects 340 for enabling users to control and/or interact with media presented in region 330. For example, a user may play, stop, alter playback (e.g., fast forward, rewind, zoom, etc.), etc. media presented in region 330 by interacting with graphical objects 340. Additionally, a user may control the volume level (e.g., of the sound accompanying the media displayed in region 330) using one or more of graphical objects 340.
As shown in Figure 3, interaction with graphical object 352 may initiate presentation of facts, rules, or similar information associated with the predetermined subject matter. For example, Figure 4 shows display of exemplary information 410 in response to interaction with GUI 300 (e.g., graphical object 352) in accordance with one embodiment of the present invention. Information 410 may comprise an excerpt, paraphrase, summary, etc. of information presented in region 310 and/or region 330 of GUI 300. As such, embodiments enable presentation of information about the predetermined subject matter to users in yet another form, thereby improving learning and retention of the information. Additionally, presentation of information 410 enables users to repeat and/or study the information (e.g., comprising important points that the user may have previously overlooked) to further improve learning of the information.
Figure 5 shows exemplary user prompt 510 for encouraging interaction with GUI 300 and viewing of information 410 in accordance with one embodiment of the present invention. Graphical object 520 may be used to close or hide user prompt 510. In one embodiment, a user may be required to interact with graphical object 352 and view information 410 before the user will be allowed to progress in the lesson and learn new/different subject matter (e.g., presented in region 310, 330, etc.). As such, embodiments reduce the likelihood that a user will overlook or miss an important point (e.g., presented using information 410). Additionally, user prompt 510 may be displayed periodically as a user progresses through a lesson (e.g., each time an important point is presented using GUI 300), thereby increasing interaction with GUI 300 and improving learning of the predetermined subject matter presented using GUI 300.
Turning back to Figure 3, interaction with graphical object 372 may initiate display of handouts, worksheets, or the like, associated with the predetermined subject matter. In one embodiment where a still image (e.g., a chart, table, etc.) is displayed in region 330, graphical object 372 may initiate the display of the still image in a larger form (e.g., in a separate window, etc.) for easier viewing and/or enable printing of the image. Alternatively, graphical object 372 may initiate display of a list of worksheets for a unit, task, lesson, etc., thereby enabling a user to select, view, print, etc. materials associated with the predetermined subject matter.
Interaction with graphical object 374 may initiate display of answer sheets associated with the predetermined subject matter. For example, where questions are presented in region 310 and/or region 330, graphical object 374 may initiate display of answers to those questions. In this manner, a user may periodically test and/or check his or her knowledge of the presented subject matter to further reinforce and reiterate the material. As such, embodiments provide further mechanisms for improving learning of the predetermined subject matter (e.g., presented using GUI 300 and/or other related GUIs).
As shown in Figure 3, interaction with graphical object 354 may initiate presentation of a GUI which implements electronic flashcards associated with the predetermined subject matter (e.g., as discussed with respect to Figures 20A-21). The electronic flashcards may enable the user to further learn and/or study information about the predetermined subject matter. In one embodiment, the electronic flashcards may comprise information 410, thereby enabling users to learn the information (e.g., presented in regions 310 and/or 330 of GUI 300) in yet another form.
Figure 6 shows exemplary user prompt 610 for encouraging interaction with GUI 300 and viewing of electronic flashcards in accordance with one embodiment of the present invention. Graphical object 620 may be used to close or hide user prompt 610. In one embodiment, a user may be required to interact with graphical object 354 and view electronic flashcards before the user will be allowed to progress in the lesson and learn new/different subject matter (e.g., presented in region 310, 330, etc.). As such, embodiments reduce the likelihood that a user will overlook or miss an important point (e.g., presented using a GUI for implementing the electronic flashcards). Additionally, user prompt 610 may be displayed periodically as a user progresses through a lesson (e.g., each time an important point is presented using GUI 300), thereby increasing interaction with GUI 300 and improving learning of the predetermined subject matter presented using GUI 300.
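By way of example, and not limitation, the requirement that a user view certain items (e.g., information 410, the electronic flashcards) before being allowed to progress in the lesson may be modeled as a simple gate; the class name LessonGate and the item identifiers are illustrative assumptions.

class LessonGate:
    """Tracks which required items (e.g., information 410, the electronic flashcards)
    have been viewed before the next task is unlocked (cf. prompts 510 and 610)."""

    def __init__(self, required_items):
        self.required = set(required_items)
        self.viewed = set()

    def mark_viewed(self, item) -> None:
        if item in self.required:
            self.viewed.add(item)

    def may_progress(self) -> bool:
        return self.required <= self.viewed

gate = LessonGate({"fact_410", "flashcards_task_3"})
gate.mark_viewed("fact_410")
print(gate.may_progress())        # False until the flashcards have also been viewed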
Turning back to Figure 3, interaction with graphical object 356 may initiate presentation of a GUI which implements a glossary of terms associated with the predetermined subject matter. For example, Figure 7 shows exemplary GUI 700 comprising a list of terms (e.g., in column 710) with respective definitions (e.g., in column 720) in accordance with one embodiment of the present invention. At least one respective keyword for each term is listed in column 730, while at least one respective task (e.g., unit number, lesson number, etc.) is listed in column 740. Information within GUI 700 may be sorted based upon column characteristic (e.g., in ascending order, descending order, etc.) by interacting with column headers in row 760 of GUI 700. Additionally, in one embodiment, the information within at least one of columns 710-740 may comprise the same or similar information as information 410 of Figure 4, thereby improving learning by enabling users to quickly peruse the listing of important terms within GUI 700.
As shown in Figure 7, a term and/or its corresponding information (e.g., definition, keywords, task, etc.) may be selected and viewed by interacting with the graphical object in column 750 corresponding to the row comprising the selected term and/or corresponding information. In one embodiment, the term and/or its corresponding information may be displayed in a separate window from GUI 700. Alternatively, the term and/or its corresponding information may be displayed within GUI 700 (e.g., by hiding all rows except for the row with the selected term, by graying out information in rows with non-selected terms, etc.).
Additionally, information within GUI 700 may be searched by interacting with graphical object 770. In one embodiment, graphical object 770 may bring up a separate GUI or window enabling a user to specify a search criteria for the search. The search criteria may comprise one or more terms within one or more of columns 710-740. Additionally, the GUI for specifying the search criteria may comprise one or more graphical objects for initiating the search and causing the search results to be displayed (e.g., within GUI 700). Where a search (e.g., initiated using graphical object 770) limits the displayed information within GUI 700 (e.g., to only the rows of information in GUI 700 meeting the search criteria), graphical object 780 may reset the display of information within GUI 700 to an initial state (e.g., originally displayed in response to interaction with graphical object 356 of Figure 3), to a state preceding the search (e.g., initiated using graphical object 770), etc. Further, information within GUI 700 may be printed by interacting with graphical object 790.
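By way of example, and not limitation, the sorting and searching of glossary rows (cf. row 760 and graphical objects 770 and 780 of GUI 700) may be modeled as follows; the sample rows and the function names are illustrative assumptions.

GLOSSARY = [
    {"term": "pipe wrench", "definition": "adjustable wrench used to grip and turn threaded pipe",
     "keywords": ["wrench", "tool"], "task": 3},
    {"term": "trap", "definition": "curved pipe section that holds water to block sewer gas",
     "keywords": ["drain"], "task": 5},
]

def sort_rows(rows, column: str, descending: bool = False):
    """Sort glossary rows by a column header (cf. row 760 of GUI 700)."""
    return sorted(rows, key=lambda row: row[column], reverse=descending)

def search_rows(rows, text: str):
    """Return rows whose term, definition, or keywords contain the search text
    (cf. graphical object 770); graphical object 780 would simply restore the full list."""
    needle = text.lower()
    return [row for row in rows
            if needle in row["term"].lower()
            or needle in row["definition"].lower()
            or any(needle in keyword.lower() for keyword in row["keywords"])]

print([row["term"] for row in search_rows(GLOSSARY, "wrench")])   # ['pipe wrench']
print([row["term"] for row in sort_rows(GLOSSARY, "term")])       # alphabetical by term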
Turning back to Figure 3, interaction with graphical object 358 may initiate presentation of a GUI which implements a note-taking tool. For example, Figure 8A shows exemplary GUI 800A for enabling users to enter and save notes in accordance with one embodiment of the present invention. In one embodiment, terms may be entered in user-modifiable field 810, definitions may be entered in user-modifiable field 812, keywords may be entered in user-modifiable field 814, a unit number may be entered in user-modifiable field 816, and a task may be entered in user-modifiable field 818. The entries in fields 810-818 and/or changes to the entries in fields 810-818 may be saved using graphical object 820. Additionally, GUI 800A comprises graphical object 830 for deleting a recalled note.
As shown in Figure 8A, interaction with graphical object 842 may create a new note. In one embodiment, fields 810-818 may be cleared to enable a user to enter new information. Once graphical object 842 has been activated, subsequent interaction with graphical object 820 may save changes to the new note. Similarly, once graphical object 842 has been activated, subsequent interaction with graphical object 830 may delete the new note. Further, information within GUI 800A may be printed by interacting with graphical object 848.
Interaction with graphical object 846 may display a list of saved notes within GUI 800B of Figure 8B. Figure 8B shows several exemplary notes in accordance with one embodiment of the present invention, where each note comprises a term (e.g., within column 860), a respective definition (e.g., within column 862), at least one respective keyword (e.g., within column 864), a respective unit (e.g., within column 866), a respective task (e.g., within column 868), or some combination thereof. Information within GUI 800B may be sorted based upon column characteristic (e.g., in ascending order, descending order, etc.) by interacting with column headers in row 880 of GUI 800B.
As shown in Figure 8B, a saved note may be selected and viewed by interacting with the graphical object in column 870 corresponding to the row comprising the selected note. In one embodiment, a term and/or its corresponding information (e.g., definition, keywords, unit, task, etc.) of the selected note may be displayed in a separate window from GUI 800B. Alternatively, a term and/or its corresponding information of the selected note may be displayed within GUI 800B (e.g., by hiding all rows except for the row with the selected note, by graying out information in rows with non-selected notes, etc.).
Additionally, information within GUI 800B may be searched by interacting with graphical object 844. In one embodiment, graphical object 844 may bring up a separate GUI or window enabling a user to specify a search criteria for the search. The search criteria may comprise one or more terms within one or more of columns 860-868. Additionally, the GUI for specifying the search criteria may comprise one or more graphical objects for initiating the search and causing the search results to be displayed (e.g., within GUI 800B). Where a search (e.g., initiated using graphical object 844) limits the displayed information within GUI 800B (e.g., to only the rows of information in GUI 800B meeting the search criteria), graphical object 846 may reset the display of information within GUI 800B to the state preceding the search (e.g., initiated using graphical object 844). Further, information within GUI 800B may be printed by interacting with graphical object 848.
Turning back to Figure 8A, interaction with graphical objects 850 may navigate or cycle through saved notes (e.g., all saved notes, a subset of saved notes determined by a search, etc.). Graphical object 852 may indicate a current note for which information is displayed (e.g., in GUI 800A). Additionally, graphical object 854 may indicate a number of saved notes through which graphical objects 850 may be used to navigate.
Turning back to Figure 3, interaction with graphical object 362 may initiate presentation of a GUI for entering answers to questions and comparing the entered answer with the correct answer. For example, Figure 9A shows exemplary GUI 900 for entering answers to questions in accordance with one embodiment of the present invention. As shown in Figure 9A, a question may be selected by interacting with graphical objects 910. The selected question may be displayed in region 920. Additionally, user-modifiable field 930 may accept a user-input response to the question.
Figure 9B shows exemplary GUI 900 for comparing a user-input response with a predetermined answer in accordance with one embodiment of the present invention. As shown in Figure 9B, interaction with graphical object 940 may initiate display of a predetermined answer (e.g., in region 950 of GUI 900) to the question (e.g., selected using graphical objects 910 and displayed in region 920), where the predetermined answer may remain hidden until graphical object 940 is activated. The predetermined answer may remain visible in region 950 for a predetermined period of time until it automatically returns to the hidden state. Alternatively, the predetermined answer may remain visible until a user input (e.g., via graphical object 940, another graphical object of GUI 900, etc.) is detected requesting that the predetermined answer be hidden. Accordingly, the user-input response entered in field 930 may be compared with the predetermined answer, thereby improving learning by encouraging the user to review the predetermined subject matter, find the correct answer, think about why the user-input response may not match the predetermined answer, etc.
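As a non-authoritative sketch of the reveal-then-auto-hide behavior described above, the following Python example models an answer region that becomes visible on request and reverts to hidden after a fixed interval. The class name and the timing value are assumptions for illustration only.

import time

class AnswerPanel:
    def __init__(self, answer: str, visible_seconds: float = 10.0):
        self.answer = answer
        self.visible_seconds = visible_seconds
        self._revealed_at = None  # None means the answer is hidden

    def reveal(self) -> None:
        """Called when the 'show answer' graphical object is activated."""
        self._revealed_at = time.monotonic()

    def hide(self) -> None:
        """Called when the user explicitly asks to hide the answer again."""
        self._revealed_at = None

    def displayed_text(self) -> str:
        """Text the GUI should render for the answer region right now."""
        if self._revealed_at is None:
            return ""  # hidden state
        if time.monotonic() - self._revealed_at > self.visible_seconds:
            self._revealed_at = None  # automatic return to the hidden state
            return ""
        return self.answer

panel = AnswerPanel("Shut off the supply valve before opening the trap.")
panel.reveal()
print(panel.displayed_text())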
Interaction with graphical object 960 may save the user-input response in field 930. In one embodiment, graphical object 960 may indicate a final answer to the question presented in region 920. Additionally, interaction with graphical object 970 may initiate display of a GUI which displays saved answers.
Figure 10 shows exemplary GUI 1000 for displaying saved answers in accordance with one embodiment of the present invention. The answers displayed in GUI 1000 may be those entered into field 930 of GUI 900 and saved by interacting with graphical object 960 of GUI 900. As shown in Figure 10, column 1010 may comprise a task number related to the answer, column 1020 may comprise the question (e.g., displayed in region 920 of GUI 900), column 1030 may comprise a user-input response (e.g., entered into field 930 of GUI 900) to the question, and column 1040 may comprise a predetermined answer (e.g., displayed in region 950 of GUI 900 as shown in Figure 9B) to the question. As such, users can easily and quickly review their responses and compare them to the correct or predetermined answers. Further, interaction with graphical object 1050 may initiate printing of the information in GUI 1000 (e.g., to enable users to study and review a hardcopy of the information). In one embodiment, user-input responses (e.g., shown in column 1030 of GUI 1000) may be accessed by another person and/or computer system for analyzing the student's performance, determining whether the student is taking the lesson seriously, measuring the latency of the user's responses, performing error analysis, or the like. For example, where GUI 130 of computer system 110a is used to implement GUI 900 and/or GUI 1000, the information may be stored locally on computer system 110a for subsequent analysis (e.g., automatically by a program run on computer system 110a, by a professor or other person using computer system 110a after the user, etc.) or exported to another computer system for remote analysis (e.g., automatically by a program run on computer system 110b and/or 110c, by a professor or other person using computer system 110b and/or 110c, etc.).
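The following Python sketch is one hedged illustration of exporting saved responses for later review or automated analysis as contemplated above. The record fields, output file name, and the crude scoring rule are assumptions of the sketch, not the disclosed implementation.

import csv
from dataclasses import dataclass, asdict

@dataclass
class SavedAnswer:
    task: str
    question: str
    user_response: str
    predetermined_answer: str
    response_seconds: float  # latency from question display to save (assumed metric)

def export_answers(records, path="saved_answers.csv"):
    """Write the saved answers to a CSV file that another system or person can review."""
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=list(asdict(records[0]).keys()))
        writer.writeheader()
        writer.writerows(asdict(r) for r in records)

def crude_error_rate(records) -> float:
    """Fraction of responses that do not match the stored answer exactly."""
    misses = sum(r.user_response.strip().lower() != r.predetermined_answer.strip().lower()
                 for r in records)
    return misses / len(records)

records = [SavedAnswer("4.01a1", "Name the tool for threaded pipe.",
                       "pipe wrench", "pipe wrench", 12.5)]
export_answers(records)
print(crude_error_rate(records))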
Turning back to Figure 3, interaction with graphical object 382 may advance the lesson (e.g., advance to a new task, new unit, etc.) and present new information (e.g., using GUI 300, another GUI, etc.) related to the predetermined subject matter. Graphical object 384 may return to previously-accessed subject matter. Additionally, interaction with graphical object 386 may initiate display of a GUI for selecting a unit or task (e.g., to present using GUI 300).
Figure 11 shows exemplary GUI 1100 for selecting a unit or task in accordance with one embodiment of the present invention. As shown in Figure 11, column 1110 lists units, column 1120 lists the last task accessed (e.g., using GUI 300 of Figure 3), and column 1130 provides an exemplary description of each unit. In one embodiment, some or all of the information in columns 1110-1130 may be interactive such that users may initiate display of information (e.g., using GUI 300 of Figure 3) related to the information interacted with. For example, if a user interacts with "Unit 4" in column 1110, then GUI 300 may be displayed to present information about the fourth unit. As another example, if a user interacts with "4.01a1" in column 1120, then GUI 300 may be displayed to present information about task 4.01a1.
GUI 1100 may also comprise graphical object 1140 for initiating display of a GUI for entering or changing account information (e.g., username, password, personal details, etc.). Interaction with graphical object 1150 may initiate display of a GUI for enabling observation and/or evaluation of a student by a professor (e.g., as discussed below with respect to Figures 15-20).
Turning back to Figure 3, interaction with graphical object 388 may initiate display of a GUI for tracking student progress. Figure 12 shows exemplary GUI 1200 for tracking student progress in accordance with one embodiment of the present invention. As shown in Figure 12, interaction with graphical objects 1210 may initiate display (e.g., within a region of GUI 1200, within a separate window or GUI, etc.) of information (e.g., amount of the unit or lesson completed, completion date if completed, etc.) about the student's progress through the lesson. For example, if a user positions an on-screen cursor over the number "5" of graphical objects 1210, then information about the student's progress through unit 5 (e.g., a completion date for unit 5) may be displayed.
As shown in Figure 12, region 1220 of GUI 1200 may comprise unit numbers and descriptions of each unit. In one embodiment, the descriptions of the units in region 1220 may match the descriptions in column 1130 of Figure 11. In other embodiments, the information in region 1220 may comprise a summary or title for each unit. Region 1230 comprises headings and subheadings for grouping the listing of units (e.g., represented by graphical objects 1210 and the numbers within region 1220). For example, heading 1240 has four subheadings 1250-1280. Subheading 1250 comprises unit 13, unit 1, and unit 4. As such, GUI 1200 provides a listing of units as well as grouping and/or characterization of the units (e.g., represented by headings and/or subheadings within region 1230) to provide additional information (e.g., contextual information for a selected unit with respect to other units) about the predetermined subject matter.
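As an illustrative sketch only, the following Python example models the heading/subheading grouping of units described for GUI 1200. The heading and subheading labels, and the unit lists other than those of subheading 1250, are assumptions of the sketch.

curriculum = {
    "Heading 1240": {
        "Subheading 1250": [13, 1, 4],   # per the description above
        "Subheading 1260": [7, 2],       # assumed contents
        "Subheading 1270": [9],          # assumed contents
        "Subheading 1280": [5, 6],       # assumed contents
    },
}

def units_under(heading: str, groups: dict) -> list:
    """Flatten every unit listed beneath one heading, preserving order."""
    return [unit
            for subheading in groups.get(heading, {})
            for unit in groups[heading][subheading]]

print(units_under("Heading 1240", curriculum))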
Figure 13 shows exemplary computer-implemented process 1300 for implementing an interactive education tool in accordance with one embodiment of the present invention. As shown in Figure 13, step 1310 involves accessing data (e.g., 140 of Figure 1) from a first computer system (e.g., 110b of Figure 1). The first computer system (e.g., 110b) may be communicatively coupled to an interface (e.g., 120 of Figure 1), and the data (e.g., 140) may be accessed using that interface (e.g., 120).
Step 1320 involves displaying text comprising educational information associated with a predetermined subject matter. The text may be generated from the data (e.g., 140) accessed in step 1310. Additionally, the text may be displayed (e.g., using GUI 130 of Figure 1, in region 310 of GUI 300 of Figure 3, etc.) on a second computer system (e.g., 110a of Figure 1) located remotely from the first computer system.
As shown in Figure 13, step 1330 involves presenting media related to the text (e.g., displayed in step 1320) and comprising educational information associated with the predetermined subject matter. The media may be generated from the data (e.g., 140) accessed in step 1310. Additionally, the media may be presented (e.g., using GUI 130 of Figure 1, in region 330 of GUI 300 of Figure 3, etc.) on a second computer system (e.g., 110a of Figure 1) located remotely from the first computer system. Further, the media may be presented simultaneously with the display of the text in step 1320 in one embodiment.
The media may comprise video, still images, sound, etc. Additionally, the media may visually depict the predetermined subject matter associated with the text. For example, where the predetermined subject matter is plumbing (e.g., drain unclogging), the text (e.g., displayed in step 1320) may explain how to unclog a drain and the media (e.g., presented in step 1330) may show a plumber unclogging a drain.
As shown in Figure 13, step 1340 involves presenting other information associated with the predetermined subject matter. For example, the additional information presented in step 1340 may comprise a listing of answers to questions associated with the predetermined subject matter, user-input responses to questions associated with the predetermined subject matter, user-input notes associated with the predetermined subject matter, a glossary of terms used in the text, and educational worksheets associated with the predetermined subject matter.
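For illustration, the following Python sketch loosely mirrors the overall flow of process 1300: access lesson data from a remote system, then hand the pieces to a local display layer. The data layout, URL, and render functions are assumptions of the sketch and are not part of the disclosed method.

import json
import urllib.request

def access_lesson_data(url: str) -> dict:
    """Step 1310 analogue: access data from the first (remote) computer system."""
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))

def run_lesson(data: dict) -> None:
    display_text(data["text"])          # step 1320 analogue: educational text
    present_media(data["media"])        # step 1330 analogue: related video/images/sound
    present_other(data.get("extras"))   # step 1340 analogue: glossary, worksheets, notes

def display_text(text): print("TEXT:", text)
def present_media(media): print("MEDIA:", media)
def present_other(extras): print("EXTRAS:", extras)

run_lesson({"text": "How to unclog a drain",
            "media": "unclog_demo.mp4",
            "extras": ["glossary", "worksheet"]})
# With a reachable endpoint serving JSON lesson data (hypothetical URL):
# run_lesson(access_lesson_data("https://example.com/lessons/unit4.json"))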
Figure 14 shows exemplary computer-implemented process 1400 for presenting questions and accepting user-input responses to the questions in accordance with one embodiment of the present invention. As shown in Figure 14, step 1410 involves displaying a question associated with a predetermined subject matter. The question may be displayed (e.g., using GUI 130, in region 920 of GUI 900, etc.) using data (e.g., 140) accessed from a remote computer system (e.g., 110b). Step 1420 involves accessing a user-input response (e.g., answer) to the question. The user-input response may be input to and/or displayed in a user-modifiable field (e.g., 930) of a GUI (e.g., 900).
As shown in Figure 14, step 1430 involves displaying an answer to the question for comparison with the user-input response. The answer may be displayed (e.g., in region 950 of GUI 900) in response to a user input (e.g., an interaction with graphical object 940 of GUI 900). Additionally, the answer may remain hidden until it is displayed in step 1430. The answer may remain visible for a predetermined period of time after display in step 1430 until it automatically returns to the hidden state. Alternatively, the answer may remain visible until a user input (e.g., via graphical object 940, another graphical object of GUI 900, etc.) is detected requesting that the predetermined answer be hidden. Accordingly, the user-input response (e.g., accessed in step 1420) may be compared with the predetermined answer (e.g., displayed in step 1430), thereby improving learning by encouraging the user to review the predetermined subject matter, find the correct answer, think about why the user-input response may not match the predetermined answer, etc.
Step 1440 involves determining whether a request was detected for the user-input response (e.g., accessed in step 1420) to be saved (e.g., by interacting with graphical object 960). If it is determined that a request was not detected for the user-input response to be saved, then steps 1420-1440 may be repeated. Alternatively, if it is determined that a request was detected for the user-input response to be saved, then the user-input response may be accessed and stored (e.g., in a memory of the computer system presenting the GUI for displaying the question, in a memory of a remote computer system, etc.). As shown in Figure 14, step 1460 involves accessing the stored user-input response for review and/or analysis. The stored user-input response may be displayed (e.g., in GUI 1000 for review by a user, on a remote system for review by a professor or an individual performing student analysis, etc.). The stored user-input response may be accessed (e.g., by a local computer system, by a remote computer system, etc.) for automated analysis of the student's performance in another embodiment.
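A minimal Python sketch of the save decision in process 1400 follows: responses are stored only when the user explicitly requests it. The function name and in-memory storage are assumptions for illustration only.

saved_responses = []

def handle_response(question: str, response: str, save_requested: bool) -> None:
    """Steps 1420-1440 analogue: accept a response; store it only on an explicit request."""
    if not save_requested:
        return  # keep cycling through the accept/compare steps
    saved_responses.append({"question": question, "response": response})

handle_response("Name the tool for threaded pipe.", "pipe wrench", save_requested=False)
handle_response("Name the tool for threaded pipe.", "pipe wrench", save_requested=True)
print(saved_responses)  # later accessed for review or analysis, as in step 1460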
Remote Observation and Evaluation Interface
Figure 15 shows exemplary GUI 1500 for listing coaches or professors in accordance with one embodiment of the present invention. As shown in Figure 15, region 1510 of GUI 1500 comprises a listing of coaches or professors, information (e.g., area of expertise, years in a given industry, etc.) about one or more of the coaches or professors, and times when the coach or professor is available to observe and/or evaluate a student. For example, "Coach 1" specializes in plumbing, has been a plumber for over 15 years, and is available on Tuesdays and Thursdays for observation and/or evaluation sessions with students. Additionally, GUI 1500 comprises graphical objects 1520 for requesting sessions with a coach or professor (e.g., listed in region 1510). In one embodiment, interaction with one of graphical objects 1520 may initiate display of a GUI for enabling a student to request a session with a coach or professor (e.g., corresponding to the activated one of graphical objects 1520). Additionally, interaction with graphical object 1530 may initiate display of a forum GUI for enabling users (e.g., students, coaches or professors, etc.) to share information, exchange content (e.g., videos, pictures, etc.), etc.

Figure 16 shows exemplary GUI 1600 for enabling a student to request a session with a coach or professor in accordance with one embodiment of the present invention. As shown in Figure 16, region 1610 of GUI 1600 comprises information about the availability of a coach or professor to aid the student in requesting a session. For example, "Coach 1" is available from 11 am to 3 pm on Tuesday, and from 2 pm to 5 pm on Thursday. Region 1620 comprises a plurality of user-modifiable fields for requesting a day and/or time for a session with a coach or professor. Further, students can enter or suggest multiple days/times for the session in order of preference. Additionally, region 1630 comprises a user-modifiable field for entering a message (e.g., to accompany the session request to the coach or professor). Further, interaction with graphical object 1640 may send the requested time and/or message to the coach or professor.
Figure 17 shows exemplary GUI 1700 for enabling a coach or professor to view students and set up a session with a student in accordance with one embodiment of the present invention. As shown in Figure 17, GUI 1700 comprises a list of students (e.g., in column 1710), a respective highest unit number to which each student has access (e.g., in column 1720), and respective minutes of credit remaining (e.g., in column 1730) for use toward observation and/or evaluation sessions with a coach or professor. Additionally, column 1740 comprises a plurality of respective graphical objects corresponding to each student and for enabling a coach or professor to initiate a session with a student. For example, interaction with a graphical object in column 1740 may initiate display of a GUI for enabling the coach or professor to observe and/or evaluate the student.

Figure 18A shows exemplary GUI 1800 for initiating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention, whereas Figure 18B shows exemplary GUI 1800 for terminating an observation and/or evaluation session with a student in accordance with one embodiment of the present invention. As shown in Figure 18A, GUI 1800 comprises region 1810 for presenting media (e.g., video, still images, etc.). In one embodiment, video of a student performing an action or task related to a predetermined subject matter (e.g., taught using an interactive educational tool implemented using GUI 300) may be displayed in region 1810. Audio related to the video presented in region 1810 may be played simultaneously with the video to implement an audio/visual presentation. The media presented using regions of GUI 1800 (e.g., region 1810) may be prerecorded, streamed, live, etc. Accordingly, a coach or professor may observe and/or evaluate the student's performance using GUI 1800.
Video displayed in region 1810 may be generated using video and/or audio conferencing software such as Skype™, iChat from Apple Inc. of Cupertino, California, or the like. As such, a student may use a camera (e.g., web camera, etc.) coupled to a computer system (e.g., 110c of Figure 1) to record or otherwise capture an action or performance. The video data of the performance may be accessed by a computer system (e.g., 110a) of a coach or professor and used to present the student's performance to the coach or professor (e.g., in region 1810 of GUI 1800). Thus, a coach or professor may observe and/or evaluate the student's performance using video (e.g., presented using GUI 1800) and/or audio (e.g., associated with the video presented in region 1810 of GUI 1800, presented simultaneously with the display of GUI 1800, etc.).

GUI 1800 may also provide the ability to record the duration of an observation and/or evaluation session. For example, a coach or professor may interact with graphical object 1820 (e.g., shown in Figure 18A) to start a timer (e.g., displayed in region 1840 of GUI 1800 showing an elapsed time of the session). The timer may be stopped by interacting with graphical object 1830 (e.g., shown in Figure 18B). In one embodiment, the timer may be started when the student begins a performance, and may be stopped when the student completes the performance. Additionally, it should be appreciated that graphical object 1820 and graphical object 1830 may be simultaneously displayed in GUI 1800 in other embodiments.
Additionally, GUI 1800 may implement an automated billing system for the sessions conducted by the coach or professor. For example, a user or student may purchase a predetermined amount of time of observation/evaluation by a coach or professor. Region 1850 may indicate an amount of purchased time (e.g., displayed in column 1730 of Figure 17) for use toward observation and/or evaluation by a coach. In one embodiment, region 1850 may display an amount of purchased time remaining before the current session was initiated (e.g., using graphical object 1820). Region 1860 may indicate the remaining time for use toward observation and/or evaluation by a coach, where the amount of time displayed in region 1860 may decrement as the elapsed time displayed in region 1840 increments. As such, GUI 1800 may enable the student to be automatically charged or billed (e.g., in response to interaction with graphical object 1870) for observation/evaluation time used (e.g., displayed in region 1840). Alternatively, interaction with graphical object 1880 may enable the session to be reset (e.g., to reset the timer displayed in region 1840 and the remaining minutes displayed in region 1860).

It should be appreciated that audio alone (e.g., associated with the video presented in region 1810 of GUI 1800, presented simultaneously with the display of GUI 1800, etc.) may be used to observe and/or evaluate a student in one embodiment. For example, a coach or professor may listen to a vocal performance (e.g., by accessing a pre-recorded performance of the student, using audio conferencing software, etc.) of a student and evaluate the student's performance based upon the presented audio. Further, regions of GUI 1800 may enable timing (e.g., using graphical objects 1820 and 1830 to time the session whose duration is displayed in region 1840) and/or automated billing of the audio performance (e.g., based upon the elapsed time displayed in region 1840, based upon the remaining amount of purchased time displayed in region 1860, etc.).
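The following Python sketch is a hedged illustration of the session timing and automated billing behavior described above, in which elapsed minutes are deducted from the student's purchased balance. The class, method names, and rounding choices are assumptions of the sketch, not the disclosed implementation.

import time

class ObservationSession:
    def __init__(self, purchased_minutes: float):
        self.purchased_minutes = purchased_minutes   # cf. region 1850 / column 1730
        self._started_at = None
        self.elapsed_seconds = 0.0                   # cf. region 1840

    def start(self) -> None:                         # cf. graphical object 1820
        self._started_at = time.monotonic()

    def stop(self) -> None:                          # cf. graphical object 1830
        if self._started_at is not None:
            self.elapsed_seconds += time.monotonic() - self._started_at
            self._started_at = None

    def remaining_minutes(self) -> float:            # cf. region 1860
        return self.purchased_minutes - self.elapsed_seconds / 60.0

    def bill(self) -> float:                         # cf. graphical object 1870
        """Deduct the elapsed time from the purchased balance and return minutes used."""
        used = self.elapsed_seconds / 60.0
        self.purchased_minutes = max(0.0, self.purchased_minutes - used)
        self.elapsed_seconds = 0.0
        return used

session = ObservationSession(purchased_minutes=60.0)
session.start()
session.stop()
print(session.remaining_minutes(), session.bill())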
Figure 19 shows exemplary computer-implemented process 1900 for initiating a GUI for enabling observation and evaluation of a user in accordance with one embodiment of the present invention. As shown in Figure 19, step 1910 involves capturing content of a student's performance. Capturing may comprise storing the content (e.g., video, still images, audio, video and audio, etc.), generating a live feed of the content, digitizing the content, transforming the content (e.g., transforming light and/or sound into a signal or data used to reproduce the light and/or sound, etc.), etc. The performance may be related to a predetermined subject matter (e.g., taught using an interactive educational tool implemented using GUI 300). Video content and/or still image content may be captured by a camera (e.g., a web camera, other still-image camera, other video camera, etc.) and accessed by a computer system (e.g., 110a) of the student. Audio content may be captured by a microphone or the like and accessed by a computer system (e.g., 110a) of the student. Step 1920 involves accessing the captured content. The captured content (e.g., captured in step 1910) may be accessed by a computer system (e.g., 110b) of a coach or professor. In another embodiment, the captured content may be accessed by an interface component (e.g., 120) coupled to the student's computer system (e.g., 110a) and/or a computer system of a coach or professor (e.g., 110b).
As shown in Figure 19, step 1930 involves presenting the accessed content using a GUI for enabling observation and/or evaluation of the student's performance. The GUI (e.g., 1800) may enable a coach or professor to observe and/or evaluate a student's performance (e.g., by displaying video or pictures of the performance captured in step 1910, playing audio of the performance captured in step 1910, etc.) related to the predetermined subject matter. In one embodiment, the content may be presented in real-time, thereby enabling the observation and/or evaluation in real-time. Additionally, the content may be communicated via a communication channel (e.g., implemented using interface 120 and/or other networking components) formed between the computer system presenting GUI 1800 (e.g., computer system 110a) and the coach's or professor's computer system (e.g., 110c).
Step 1940 involves automatically billing the student for the observation and/or evaluation session. In one embodiment, automated billing may be implemented by deducting a determined length of the session from units of time purchased by the student. The duration of the session may be determined using a timer (e.g., presented to a coach or professor using GUI 1800) which may be started and stopped based upon user inputs (e.g., by a coach or professor interacting with graphical objects 1820 and 1830 of GUI 1800). Alternatively, the duration of the session may be automatically determined (e.g., based upon the duration of the video and/or audio captured).

Electronic Flashcards
Figure 20A shows exemplary GUI 2000 for implementing electronic flashcards in accordance with one embodiment of the present invention. As used herein, the term "electronic flashcard" can mean a visual representation of one or both sides of a flashcard, where the visual representation may be displayed on a display device coupled to a computer system (e.g., 110a, 110b, 110c, 200, etc.). One side of the electronic flashcard may remain hidden until it is selectively revealed or displayed (e.g., in response to a user input or interaction with GUI 2000), where the selective revealing or displaying may comprise a "flipping" of the electronic flashcard in one embodiment. Additionally, the term "hidden" as used herein can mean not displayed, displayed so that it is less visible, etc.
As shown in Figure 20A, GUI 2000 comprises region 2010 for displaying one or both sides of an electronic flashcard (e.g., an "active" electronic flashcard). Additionally, region 2020 comprises multiple piles or stacks (e.g., 2022-2026) for storing electronic flashcards which are not currently being viewed (e.g., "inactive" electronic flashcards). In one embodiment, the electronic flashcards implemented using GUI 2000 may comprise SAFMEDS cards.
Electronic flashcards may be transferred between regions 2010 and 2020 by interacting with one or more regions of GUI 2000. For example, graphical object 2030 may be used to automatically transfer an active electronic flashcard from region 2010 to region 2020 in one embodiment. Alternatively, interaction with one or both sides of the active flashcard displayed in region 2010 (e.g., by moving an on-screen cursor over one or both sides and clicking a mouse button, by moving an on-screen cursor over one or both sides and double-clicking a mouse button, etc.) may transfer an active electronic flashcard from region 2010 to region 2020. Additionally, interaction with an inactive electronic flashcard in one of the piles of region 2020 and/or interaction with a graphical object (e.g., of GUI 2000) may automatically transfer one or more selected electronic flashcards from region 2020 to region 2010.
In one embodiment, limitations may be placed on the transferring of electronic flashcards between regions 2010 and 2020. For example, a user may be required to view or otherwise interact with an electronic flashcard in region 2010 (e.g., one time, multiple times, etc.) before transferring it to region 2020. As such, embodiments may improve learning of the material presented using GUI 2000 by increasing user interaction with the material presented using the electronic flashcards of GUI 2000.
As shown in Figure 20A, region 2010 comprises region 2040 (e.g., a first side of an active electronic flashcard) for displaying first set of information 2045. Information 2045 displayed in region 2040 may remain visible while a second set of information to be displayed in region 2050 (e.g., a second side of the active electronic flashcard) is hidden. Interaction with graphical object 2060 may "flip" the flashcard and display the second set of information (e.g., 2055) as depicted in Figure 20B. Information 2045 and information 2055 may both be associated with a predetermined subject matter (e.g., taught using an interactive educational tool implemented using GUI 300). As such, in one embodiment, a user may learn the first and/or second sets of information by viewing the information displayed in region 2040 (e.g., information 2045), attempting to recite the information hidden in region 2050 (e.g., information 2055), interacting with graphical object 2060 to display information 2055 in region 2050, and then checking the recited information against information 2055. Information 2045 and/or information 2055 may comprise text (e.g., a word, phrase, term, definition of the term, etc.), colors, patterns, etc. Additionally, information 2045 and information 2055 may be related (e.g., to one another and a predetermined subject matter) such that a user may view one set of information and test his or her knowledge of the other (e.g., by trying to recite the hidden information). For example, information 2045 displayed in region 2040 may comprise a term (e.g., the words "pipe wrench"), while information 2055 to be selectively displayed in region 2050 may comprise a definition of the term displayed in region 2040, where information 2045 and information 2055 are related to a predetermined subject matter (e.g., plumbing). As such, a user may attempt to recite information 2055 (e.g., a definition of "pipe wrench") after looking at information 2045 (e.g., the term "pipe wrench") but before the display of information 2055 in region 2050, thereby using the electronic flashcards implemented using GUI 2000 to learn about the predetermined subject matter (e.g., plumbing).
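As a hedged illustration of the flip and side-swap behavior described above, the following Python sketch models an electronic flashcard with a shown side and a hidden side; the class and method names are assumptions for illustration, not elements of GUI 2000.

class ElectronicFlashcard:
    def __init__(self, front: str, back: str):
        self.front = front            # cf. information 2045 shown in region 2040
        self.back = back              # cf. information 2055 hidden in region 2050
        self.back_revealed = False

    def flip(self) -> str:
        """Cf. graphical object 2060: reveal the hidden side and return its content."""
        self.back_revealed = True
        return self.back

    def hide_back(self) -> None:
        self.back_revealed = False

    def swap_sides(self) -> None:
        """Cf. graphical object 2070: exchange which side is shown and which is hidden."""
        self.front, self.back = self.back, self.front
        self.back_revealed = False

card = ElectronicFlashcard("pipe wrench",
                           "adjustable wrench used to grip and turn threaded pipe")
print(card.front)        # the user recites the hidden definition from memory first
print(card.flip())       # then checks it against the revealed side
card.swap_sides()        # now the definition is shown and the term is hidden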
Thus, embodiments improve learning (e.g., of a predetermined subject matter) by increasing the amount of information or content which may be displayed on the electronic flashcards compared with conventional, handwritten flashcards. For example, electronic information or content to be displayed on the electronic flashcards may be relatively small (e.g., occupy a relatively small amount of storage space) and/or be accessed from one or more sources (e.g., local hard drives, remote computer systems, etc.). Additionally, the electronic flashcards are less likely to be damaged, lost, or stolen given that they are in electronic form. Further, the information on each side of the electronic flashcards may be automatically generated (e.g., based upon a lesson plan of another module, based upon a user-defined subject matter, etc.) in one embodiment, thereby reducing the time and effort to create the flashcards. In another embodiment, information 2045 may comprise a picture (e.g., of a pipe wrench) or video (e.g., of a plumber using a pipe wrench). As such, a user may attempt to recite information 2055 (e.g., a definition of a "pipe wrench," the term "pipe wrench," etc.) after looking at information 2045 (e.g., a picture or video showing a pipe wrench) but before the display of information 2055 in region 2050, thereby using the electronic flashcards implemented using GUI 2000 to learn about the predetermined subject matter (e.g., plumbing).
Thus, embodiments may further improve learning (e.g., of a predetermined subject matter) by further increasing the amount of information or content which may be displayed on the electronic flashcards compared with conventional, handwritten flashcards. For example, video and/or audio content may be presented to a user, thereby improving learning by presenting information in different forms to stimulate more senses of a user (e.g., catering to visual learners, audio learners, etc.). Additionally, the presentation of information in different forms can increase repetition of information to improve information absorption/retention.
As shown in Figure 20C, interaction with graphical object 2070 may initiate display of second set of information 2055 in region 2040 (e.g., while first set of information 2045 is hidden). Additionally, interaction with graphical object 2070 may associate first set of information 2045 with region 2050 such that a subsequent interaction with graphical object 2060 may initiate display of information 2045 in region 2050 (e.g., as depicted in Figure 20D). As such, interaction with graphical object 2070 may effectively switch the front and back sides of the electronic flashcard in one embodiment. Thus, embodiments further improve learning by enabling both sets of information (e.g., 2045 and 2055) to be selectively hidden and revealed (e.g., in region 2050 using graphical object 2060), thereby enabling users to test their memorization, learning, understanding, etc. of both sets of information (e.g., 2045 and 2055).
As shown in Figure 20A, region 2020 comprises stacks 2022-2026 for storing electronic flashcards which are not currently being viewed (e.g., "inactive" electronic flashcards). Stack 2022 may comprise electronic flashcards which have not yet been accessed or viewed (e.g., transferred to region 2010). In one embodiment, electronic flashcards may automatically accumulate in stack 2022 as a user progresses through a lesson and encounters or accesses new subject matter (e.g., to be placed on one or more electronic flashcards).
Stack 2024 and/or stack 2026 may comprise electronic flashcards which have been previously accessed or viewed (e.g., transferred from region 2010 to region 2020). In one embodiment, stacks 2024 and 2026 may comprise electronic flashcards sorted based upon user-confidence level with the subject matter of the electronic flashcards. For example, stack 2024 may comprise electronic flashcards with subject matter which a user is less comfortable with (e.g., has not memorized, etc.), while stack 2026 may comprise electronic flashcards with subject matter which a user is more confident with (e.g., has memorized, etc.). Alternatively, stacks 2024 and 2026 may comprise electronic flashcards sorted by subject matter (e.g., electronic flashcards with different types of plumbing tools in stack 2024, electronic flashcards with different plumbing techniques in stack 2026, etc.) and/or grouped based upon other characteristics. As such, embodiments enable users to sort, group, or otherwise place electronic flashcards in one or more stacks (e.g., 2024, 2026, etc.), thereby improving learning (e.g., of the predetermined subject matter) by enabling users to separate out and focus on the more troublesome material (e.g., displayed or stored in stack 2024) while devoting less attention to the material which the user is more comfortable with (e.g., displayed or stored in stack 2026). The number of electronic flashcards in a given stack (e.g., 2022, 2024, 2026, etc.) may also be reduced by placing the electronic flashcards into a larger number of stacks. Thus, the material presented using the electronic flashcards may be more easily learned by enabling users to focus on a smaller amount of material at a given time.
In one embodiment, the number of cards placed on at least one stack (e.g., 2022, 2024, 2026, etc.) of region 2020 may be limited. For example, once a stack (e.g., 2022, 2024, 2026, etc.) reaches its predetermined limit, one or more electronic flashcards may be removed before allowing additional electronic flashcards to be placed on the stack. As a further example, stack 2022 may have a limit of one card and stack 2024 may have a limit of six cards, while stack 2026 may have no limit or a user-defined limit. In this manner, learning may be improved by encouraging and/or forcing a user to learn the previously-accessed material before moving on to additional material.
As shown in Figure 20E, stack 2024 may have reached a predetermined limit (e.g., as indicated by a visual attribute, e.g., the darker color or shade of stack 2024) of six electronic flashcards, thereby requiring removal of an electronic flashcard from stack 2024 (e.g., by moving an electronic flashcard from stack 2024 to stack 2026, by moving an electronic flashcard from stack 2024 to region 2010 for viewing, etc.) before another electronic flashcard may be located on stack 2024. Thus, embodiments reduce the review time for each stack and enable users to more effectively, quickly, and easily learn the material (e.g., using the SAFMEDS method of periodically performing short review sessions).
Additionally, GUI 2000 may enable automated shuffling of the electronic flashcards in one or more stacks (e.g., 2022, 2024, 2026, etc.) of region 2020. The shuffling may be initiated by interacting with one or more graphical objects of GUI 2000 (not shown in Figure 20A), by interacting with a region of GUI 2000 (e.g., activating or selecting an electronic flashcard from region 2020, clicking or double-clicking a stack in region 2020, etc.), etc. As such, embodiments enable more randomized and improved shuffling over manual shuffling of conventional flashcards, thereby improving learning of the material. Additionally, in one embodiment, the automatic shuffling of GUI 2000 may enable users to use the electronic flashcards in accordance with the SAFMEDS method.
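The following Python sketch illustrates, under stated assumptions, the stack handling described above: a stack may carry a card limit, a card is accepted only when room exists, and a stack can be shuffled automatically. The stack names, limits, and method names are illustrative assumptions rather than the disclosed implementation.

import random

class FlashcardStack:
    def __init__(self, name: str, limit: int = None):
        self.name = name
        self.limit = limit            # None means no limit (e.g., as described for stack 2026)
        self.cards = []

    def add(self, card) -> bool:
        """Place a card on the stack unless its predetermined limit has been reached."""
        if self.limit is not None and len(self.cards) >= self.limit:
            return False              # a card must be removed first
        self.cards.append(card)
        return True

    def shuffle(self) -> None:
        """Automated shuffling, e.g., to support SAFMEDS-style review sessions."""
        random.shuffle(self.cards)

unseen = FlashcardStack("stack 2022", limit=1)
less_confident = FlashcardStack("stack 2024", limit=6)
confident = FlashcardStack("stack 2026")            # no limit

print(unseen.add("card A"), unseen.add("card B"))   # the second add is refused
less_confident.add("card A")
less_confident.shuffle()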
Further, in one embodiment, GUI 2000 may enable one or more electronic flashcards (e.g., displayed within region 2010 and/or 2020) to be printed. As such, users may conveniently and quickly generate hard-copy flashcards (e.g., printed on paper, cardstock, index cards, etc.) for reviewing the material of the electronic flashcards in hard-copy form. As such, if any of the cards are damaged, lost, stolen, etc., one or more of the flashcards may be conveniently and quickly re-printed.
Although Figures 20A-20E display only one active electronic flashcard in region 2010, it should be appreciated that more than one active electronic flashcard may be displayed in region 2010 in other embodiments. Additionally, although region 2020 comprises three stacks (e.g., 2022, 2024, and 2026) in Figures 20A-20E, it should be appreciated that region 2020 may comprise a larger or smaller number of stacks in other embodiments.
Figure 21 shows exemplary computer-implemented process 2100 for implementing electronic flashcards in accordance with one embodiment of the present invention. As shown in Figure 21, step 2110 involves displaying a first set of information (e.g., 2045) associated with a predetermined subject matter in a first area (e.g., 2040) of a GUI (e.g., 2000 as depicted in Figures 20A-20E) while a second set of information (e.g., 2055) associated with the predetermined subject matter remains hidden. In one embodiment, the first set of information (e.g., 2045) may comprise text (e.g., a word, phrase, term, etc.), colors, patterns, graphical information (e.g., still images, video, etc.), or the like.
Step 2120 involves determining whether a request to display the second set of information (e.g., 2055) in the first area (e.g., 2040) has been detected. If a request to display the second set of information (e.g., 2055) in the first area (e.g., 2040) has not been detected, then step 2130 may be performed.
As shown in Figure 21, step 2130 involves detecting a user input requesting display of the second set of information (e.g., 2055). The user input may comprise an interaction with a graphical object (e.g., 2060 of Figures 20A-20E) of the GUI (e.g., 2000) displaying the first set of information (e.g., 2045) in step 2110.
Step 2140 involves displaying the second set of information (e.g., 2055) in a second area (e.g., 2050) of the GUI (e.g., 2000). In one embodiment, the second set of information (e.g., 2055) may comprise text (e.g., a word, phrase, definition of a term, etc.), colors, patterns, graphical information (e.g., still images, video, etc.), or the like.
As shown in Figure 21, step 2150 involves automatically hiding the second set of information (e.g., 2055) after a predetermined period of time. Alternatively, the second set of information (e.g., 2055) may be hidden in response to a user input (e.g., to GUI 2000).
If it is determined in step 2120 that a request to display the second set of information (e.g., 2055) in the first area (e.g., 2040) has been detected, then the second set of information (e.g., 2055) may be displayed in the first area (e.g., 2040) of the GUI (e.g., 2000) in step 2160. The second set of information (e.g., 2055) may be displayed in the first area (e.g., 2040) while the first set of information (e.g. 2045) remains hidden.
As shown in Figure 21, step 2170 involves detecting a user input requesting display of the first set of information (e.g., 2045). The user input may comprise an interaction with a graphical object (e.g., 2060) of the GUI (e.g., 2000) displaying the second set of information (e.g., 2055) in step 2160.
Step 2180 involves displaying the first set of information (e.g., 2045) in the second area (e.g., 2050) of the GUI (e.g., 2000). In one embodiment, the first set of information (e.g., 2045) may comprise text (e.g., a word, phrase, definition of a term, etc.), colors, patterns, graphical information (e.g., still images, video, etc.), or the like.
As shown in Figure 21, step 2190 involves automatically hiding the first set of information (e.g., 2045) after a predetermined period of time. Alternatively, the first set of information (e.g., 2045) may be hidden in response to a user input (e.g., to GUI 2000).
In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is, and is intended by the applicant to be, the invention is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Hence, no limitation, element, property, feature, advantage, or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims

What is claimed is:
1. A graphical user interface for implementing an interactive educational tool, said graphical user interface comprising: a first region for displaying text, wherein said text comprises educational information associated with a predetermined subject matter; a second region for presenting media simultaneously with said display of said text, wherein said media is related to said text and comprises educational information associated with said predetermined subject matter; and wherein data for generating said text and said media is stored on a first computer system and accessed by a second computer system presenting said text and said media, wherein said first computer system is located remotely from said second computer system.
2. The graphical user interface of Claim 1, wherein said first region comprises at least one interactive region for enabling user interaction.
3. The graphical user interface of Claim 1, wherein said media is selected from a group consisting of video and still images, and wherein said media is for visually depicting said predetermined subject matter associated with said text.
4. The graphical user interface of Claim 1, wherein said text comprises a question for testing a user, and further comprising: a third region for accepting a user-input response to said question; and a fourth region for selectively displaying an answer to said question and for enabling comparison of said user-input response and said answer.
5. The graphical user interface of Claim 4, wherein said answer displayed in said fourth region is displayed in response to a user input.
6. The graphical user interface of Claim 1 further comprising: a fifth region for enabling users to create and view notes associated with said predetermined subject matter.
7. The graphical user interface of Claim 1 further comprising: a sixth region for presenting information selected from a group consisting of a listing of answers to questions associated with said predetermined subject matter, user-input responses to questions associated with said predetermined subject matter, user-input notes associated with said predetermined subject matter, a glossary of terms used in said text, and educational worksheets associated with said predetermined subject matter.
8. The graphical user interface of Claim 1 further comprising: a seventh region for enabling a user of a third computer system to observe and evaluate in real-time a performance of a user of said second computer system, wherein said performance is associated with said predetermined subject matter, and wherein said third computer system is located remotely from said second computer system.
9. A method of implementing an interactive educational tool, said method comprising: accessing data from a first computer system; displaying text comprising educational information associated with a predetermined subject matter, wherein said text is generated from said data and displayed on a second computer system located remotely from said first computer system; and presenting media related to said text and comprising educational information associated with said predetermined subject matter, wherein said media is generated from said data and presented on said second computer system, and wherein said text is displayed simultaneously with said presentation of said media.
10. The method of Claim 9, wherein said media is selected from a group consisting of video and still images, and wherein said media is for visually depicting said predetermined subject matter associated with said text.
11. The method of Claim 9, wherein said text comprises a question for testing a user, and further comprising: accessing and storing a user-input response to said question; and accessing and displaying said stored user-input response for review.
12. The method of Claim 11 further comprising: in response to a user input, displaying an answer to said question for enabling comparison of said user-input response and said answer.
13. The method of Claim 9 further comprising: presenting information selected from a group consisting of a listing of answers to questions associated with said predetermined subject matter, user-input responses to questions associated with said predetermined subject matter, user-input notes associated with said predetermined subject matter, a glossary of terms used in said text, and educational worksheets associated with said predetermined subject matter.
14. The method of Claim 9 further comprising: in response to a user interaction with said second computer system, initiating a communicative channel with a third computer system for enabling a user of said third computer system to observe and evaluate in real-time a performance of a user of said second computer system, wherein said performance is associated with said predetermined subject matter, and wherein said third computer system is located remotely from said second computer system.
15. A system comprising: a first computer system for storing data used to generate educational information; and a second computer system communicatively coupled to said first computer system, said second computer system for accessing said data and generating a graphical user interface using said data, said graphical user interface comprising: a first region for displaying text, wherein said text comprises educational information associated with a predetermined subject matter; and a second region for presenting media simultaneously with said display of said text, wherein said media is related to said text and comprises educational information associated with said predetermined subject matter.
16. The system of Claim 15, wherein said first region comprises at least one interactive region for enabling user interaction with said graphical user interface.
17. The system of Claim 15, wherein said media is selected from a group consisting of video and still images, and wherein said media is for visually depicting said predetermined subject matter associated with said text.
18. The system of Claim 15, wherein said text comprises a question for testing a user, and wherein said graphical user interface further comprises: a third region for accepting a user-input response to said question; and a fourth region for selectively displaying an answer to said question and for enabling comparison of said user-input response and said answer.
19. The system of Claim 18, wherein said answer displayed in said fourth region is displayed in response to activation of a graphical element of said graphical user interface.
20. The system of Claim 15, wherein said graphical user interface further comprises: a fifth region for enabling users to create and view notes associated with said predetermined subject matter.
21. The system of Claim 15, wherein said graphical user interface further comprises: a sixth region for presenting information selected from a group consisting of a listing of answers to questions associated with said predetermined subject matter, user-input responses to questions associated with said predetermined subject matter, user-input notes associated with said predetermined subject matter, a glossary of terms used in said text, and educational worksheets associated with said predetermined subject matter.
PCT/US2008/009216 2007-08-02 2008-07-30 Interactive educational tool WO2009017764A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US96334207P 2007-08-02 2007-08-02
US60/963,342 2007-08-02
US11/900,953 2007-09-14
US11/900,953 US20090075247A1 (en) 2007-09-14 2007-09-14 Interactive educational tool

Publications (1)

Publication Number Publication Date
WO2009017764A1 true WO2009017764A1 (en) 2009-02-05

Family

ID=40304673

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/009216 WO2009017764A1 (en) 2007-08-02 2008-07-30 Interactive educational tool

Country Status (1)

Country Link
WO (1) WO2009017764A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100366953B1 (en) * 2002-07-03 2003-01-14 Yejinet Co Ltd Method for synchronizing multi-moving picture by on-line education
US20050154992A1 (en) * 2004-01-12 2005-07-14 International Business Machines Corporation Online learning monitor
US20050181348A1 (en) * 2004-02-17 2005-08-18 Carey Tadhg M. E-learning system and method
US7149788B1 (en) * 2002-01-28 2006-12-12 Witness Systems, Inc. Method and system for providing access to captured multimedia data from a multimedia player



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 08794885; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 08794885; Country of ref document: EP; Kind code of ref document: A1)