WO2015025274A1 - Video recording system and methods - Google Patents

Video recording system and methods

Info

Publication number: WO2015025274A1
Application number: PCT/IB2014/063978
Authority: WO (WIPO (PCT))
Prior art keywords: cast, computer, data, notification, readable medium
Other languages: French (fr)
Inventor: Paul Robert BAILEY
Original Assignee: Navico Holding As
Priority date: The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.
Application filed by Navico Holding As
Publication of WO2015025274A1


Classifications

    • A01K97/00 Accessories for angling
    • A01K79/00 Methods or means of catching fish in bulk not provided for in groups A01K69/00 - A01K77/00, e.g. fish pumps; Detection of fish; Whale fishery
    • A01K99/00 Methods or apparatus for fishing not provided for in groups A01K69/00 - A01K97/00
    • A61B5/1118 Determining activity level
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • B63B49/00 Arrangements of nautical instruments or navigational aids
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/203 Instruments for performing navigational calculations specially adapted for sailing ships
    • G01G9/00 Methods of, or apparatus for, the determination of weight, not provided for in groups G01G1/00 - G01G7/00
    • G01G17/00 Apparatus for or methods of weighing material of special form or property
    • G01G19/415 Weighing apparatus or methods adapted for special purposes, with provisions for indicating, recording, or computing price or other quantities dependent on the weight, using electronic computing means only, combined with recording means
    • G01G23/3728 Indicating the weight by electrical means, e.g. using photoelectric cells, involving digital counting, with wireless means
    • G01S15/96 Sonar systems specially adapted for specific applications, for locating fish
    • G01S7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0231 Cordless keyboards
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F11/3013 Monitoring arrangements specially adapted to the computing system or component being monitored, where the computing system is an embedded system, i.e. a combination of hardware and software dedicated to perform a certain function in mobile devices, printers, automotive or aircraft systems
    • G06F11/3058 Monitoring arrangements for monitoring environmental properties or parameters of the computing system or of the computing system component, e.g. monitoring of power, currents, temperature, humidity, position, vibrations
    • G06F11/3438 Recording or statistical evaluation of computer or user activity, e.g. usability assessment; monitoring of user actions
    • G06F11/3476 Data logging (performance evaluation by tracing or monitoring)
    • G06F15/0225 User interface arrangements, e.g. keyboard, display; Interfaces to other computer systems
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G06F2201/835 Timestamp
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06Q10/00 Administration; Management
    • G06Q50/01 Social networking
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/292 Multi-camera tracking
    • G06T7/60 Analysis of geometric attributes
    • G06T11/206 Drawing of charts or graphs
    • G06T2207/10016 Video; Image sequence
    • G06T2207/30196 Human being; Person
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G08C2201/32 Remote control based on movements, attitude of remote control device
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/17 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier, the information being derived from movement of the record carrier, using electrical sensing means
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information signals recorded on the record carrier by the same method as the main recording
    • G11B27/34 Indicating arrangements
    • G11B31/006 Arrangements for the associated working of recording or reproducing apparatus with a video camera or receiver
    • H04N5/91 Television signal recording; Television signal processing therefor
    • H04N21/4335 Housekeeping operations, e.g. prioritizing content for deletion because of storage space restrictions
    • H04Q9/00 Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • H04Q2209/43 Arrangements in telecontrol or telemetry systems using a wireless architecture, using wireless personal area networks [WPAN], e.g. 802.15, 802.15.1, 802.15.4, Bluetooth or ZigBee
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • a non-transitory computer-readable medium having stored thereon computer-executable instructions which, when executed by a computer, cause the computer to perform various actions.
  • the actions may include receiving a first notification that a first cast has been made.
  • the actions may include receiving data regarding a video input.
  • the actions may include receiving a second notification that a second cast has been made.
  • the actions may include deleting data regarding the video input that is associated with the first cast in response to receiving the second notification.
  • the data regarding the video input may be deleted without deleting data regarding the video input that is associated with the second cast.
  • the actions may include receiving a first time stamp corresponding to the first cast.
  • the actions may include receiving a second time stamp corresponding to the second cast.
  • the data between the first time stamp and the second time stamp may be the data associated with the first cast that is deleted.
  • the data regarding the video input may be associated with the first cast by indexing the data regarding the video input to data received after the first notification and before the second notification.
  • the data regarding the video input may be data recorded by a video camera.
  • the first notification may be received from a wearable device.
  • the first notification may be received over a wireless connection.
  • the first notification may include motion data corresponding to a fishing cast.
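The bullets above describe a rolling, per-cast retention scheme. A minimal sketch of that behaviour follows; the class and method names (CastVideoBuffer, on_cast_notification, on_video_data) are illustrative assumptions, not the claimed implementation.

```python
import time
from collections import defaultdict


class CastVideoBuffer:
    """Buffers video data per cast and drops the previous cast's data on a new cast."""

    def __init__(self):
        self.frames = defaultdict(list)   # cast_id -> list of (timestamp, frame)
        self.current_cast = None

    def on_cast_notification(self, cast_id):
        """Handle a notification (e.g. from a wearable device) that a cast was made."""
        previous_cast, self.current_cast = self.current_cast, cast_id
        if previous_cast is not None:
            # Delete data associated with the previous cast without touching
            # data already associated with the new cast.
            self.frames.pop(previous_cast, None)

    def on_video_data(self, frame, timestamp=None):
        """Receive data regarding a video input and index it to the current cast."""
        if self.current_cast is not None:
            ts = timestamp if timestamp is not None else time.time()
            self.frames[self.current_cast].append((ts, frame))
```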
  • a non-transitory computer-readable medium having stored thereon computer-executable instructions which, when executed by a computer, cause the computer to perform various actions.
  • the actions may include receiving a notification that a cast has been made.
  • the actions may include receiving data regarding a video input associated with the cast.
  • the actions may include determining whether a catch has been detected that corresponds to the cast.
  • the actions may include storing the data regarding the video input in response to a determination that a catch has been detected.
  • the data regarding the video input may be stored in association with the cast or the catch.
  • the actions may include receiving a notification that a catch has been made.
  • the notification is received from a wearable device.
  • the notification is received over a wireless connection.
  • the notification may include motion data corresponding to a fishing cast.
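The catch-gated variant above can be sketched in a few lines; frames_by_cast and storage are assumed dict-like stores, not structures named in the disclosure.

```python
def store_if_catch(frames_by_cast, cast_id, catch_detected, storage):
    """Keep a cast's buffered video only when a catch was detected for that cast."""
    frames = frames_by_cast.pop(cast_id, [])
    if catch_detected:
        # Store the data in association with the cast (or the catch).
        storage[cast_id] = frames
    # Otherwise the popped frames are simply discarded.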
  • a non-transitory computer-readable medium having stored thereon computer-executable instructions which, when executed by a computer, cause the computer to perform various actions.
  • the actions may include receiving a notification that a first cast has been made.
  • the actions may include receiving data regarding a video input.
  • the actions may include receiving a notification that a button has been pressed, wherein the button is associated with a second cast or a catch being made.
  • the actions may include deleting the data regarding the video input that is associated with the catch being made, if the button is associated with the first cast.
  • the actions may include storing a portion of the data regarding the video input if the button is associated with the catch being made.
  • the button may be a virtual button located in a graphical user display.
  • the button may be located on a wearable device.
  • a portion of the data regarding the video input may be deleted without deleting the data regarding the video input associated with the second cast.
  • the actions may include receiving a time stamp corresponding to the first cast.
  • the actions may also include receiving a second time stamp corresponding to the button being pressed.
  • the portion of the data being deleted may be the data between the first time stamp and the second time stamp.
  • Figure 1 illustrates a block diagram of a video recording system in accordance with implementations of various techniques described herein.
  • Figure 2 is a flow diagram of a video recording method in accordance with implementations of various techniques described herein.
  • Figure 3 illustrates a wearable device in accordance with implementations of various techniques described herein.
  • FIG. 4 is a block diagram of a wearable device in accordance with implementations of various techniques described herein.
  • Figure 5 is a flow diagram describing the operation of a fishing motion detecting software loaded in a wearable device in accordance with implementations of various techniques described herein.
  • Figure 6 is an illustration of a wearable device wirelessly transmitting data to a marine electronics device and receiving data from the device in order to begin recording data in accordance with implementations of various techniques described herein.
  • Figure 7 illustrates a schematic diagram of a video recording system having a computing system in which the various technologies described herein may be incorporated and practiced.
  • Figure 8 illustrates a schematic of a marine electronics device in accordance with implementations of various techniques described herein.
  • first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another.
  • a first object or step could be termed a second object or step, and, similarly, a second object or step could be termed a first object or step, without departing from the scope of the invention.
  • the first object or step, and the second object or step are both objects or steps, respectively, but they are not to be considered the same object or step.
  • the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context.
  • the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
  • FIG. 1 illustrates a block diagram of a video recording system 100 in accordance with implementations of various techniques described herein.
  • the video recording system 100 may include several components, such as a wearable device 300, a marine electronics device 600 and a video recorder 110 having a video camera 120, a processor 130, memory 140 and a clock 150.
  • for more information regarding a wearable device 300, see the section titled WEARABLE DEVICE FOR FISHING below.
  • for more information regarding a marine electronics device 600, see the section titled MARINE ELECTRONICS DEVICE below.
  • the video recording system 100 may communicate over wireless or wired network connections 105. The various components of the video recording system 100 are described in more detail with reference to the computing system diagram in Figure 7.
  • the video recorder 110 may be a stand-alone unit or embedded in a marine vessel. While one video camera is shown in Figure 1, more than one video camera may be included in the video recording system 100. In one implementation, the video recorder 110 may record continuously throughout a fishing trip. Data not associated with catches may be deleted afterwards using method 200 below.
  • the video recording system 100 may record a fisherman throughout a cast. As such, the video recording system 100 may record the casting of a line into the water and when a fisherman catches a fish or anything else resulting from the cast.
  • the video recording system 100 may be configured to detect when a cast, catch or other predetermined event occurs, and limit video recording to time periods associated with those events. For more information regarding cast or catch detection, see the section titled FISHING MOTION DETECTION below.
  • the video recording system 100 may also determine when particular events take place and the video recording system 100 may begin recording or stop recording, accordingly. When nothing is being recorded by the video camera 120, the video recording system may enter a standby mode waiting for a predetermined event to occur to trigger a recording session.
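A sketch of the standby/recording behaviour described above appears below; the state and event names are assumptions used only to illustrate event-triggered recording.

```python
from enum import Enum, auto


class RecorderState(Enum):
    STANDBY = auto()
    RECORDING = auto()


def next_state(state, event):
    """Advance the recorder on a predetermined event such as 'cast' or 'catch'."""
    if state is RecorderState.STANDBY and event in ("cast", "catch"):
        return RecorderState.RECORDING    # a predetermined event triggers a recording session
    if state is RecorderState.RECORDING and event == "session_end":
        return RecorderState.STANDBY      # nothing left to record; wait for the next event
    return state
```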
  • FIG. 2 illustrates a flow diagram of a video recording method 200 in accordance with implementations of various techniques described herein.
  • method 200 may be performed by one or more components of the video recording system 100, such as the wearable device 300, the marine electronics device 600 and/or the video recorder 110. It should be understood that while method 200 indicates a particular order of execution of operations, in some implementations, certain portions of the operations might be executed in a different order. Further, in some implementations, additional operations or steps may be added to the method 200. Likewise, some operations or steps may be omitted.
  • the video recording system 100 may receive a notification that a first cast has been made.
  • the notification may be a message sent by the wearable device 300, the marine electronics device 600 or another computer device capable of determining whether a cast has been made.
  • the notification may also be motion data or other sensor data that the video recording system 100 may determine to represent a cast. For more information on using motion data or other sensor data to detect a cast, see the section titled FISHING MOTION DETECTION.
  • the video recording system 100 may receive a time stamp corresponding to the first cast from block 210.
  • the time stamp may be determined by a clock 150 or a timer in the video recording system 100. As such, the time stamp may be based on when the video recording system 100 received the notification at block 210.
  • the time stamp may be part of the notification at block 210 or a message sent separate from the notification.
  • the time stamp may also be a particular time (e.g., 5 seconds before or after a cast is detected) designated in relation to the cast.
  • the video recording system 100 may receive data regarding a video input (i.e., "the received data").
  • the video input may be an Audio/Video (AV) input, High Definition Multimedia Interface (HDMI) input, S-Video input, Component (YPbPr) video input, Digital Video Interface (DVI) input or the like.
  • the received data may be data received from the video camera 120 or other audio or video capturing device.
  • the received data may include video as well as related data such as audio recordings, the time, date, metadata, video camera settings, or the like.
  • the received data may be associated with the notification of the first cast from block 210, such as through indexing the received data to the notification or first cast.
  • the received data may be stored in memory 140 or to a hard disk.
  • the video recording system 100 may begin receiving data regarding a video input upon receiving the notification at block 210. In one implementation, the video recording system 100 may receive data regarding a video input continuously throughout a fishing trip, such as before the notification at block 210 is received. The video recording system 100 may also record data at predetermined time periods. For instance, the video recording system 100 may receive data regarding a video input whenever the video recording system 100 leaves a standby mode, even if no notification has been received that a cast has been made.
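One way to realise the timestamp-based association described above is sketched here, under the assumption that video arrives as discrete, timestamped chunks; ContinuousRecording and its methods are hypothetical names.

```python
class ContinuousRecording:
    """Continuously received video chunks, indexed to casts by notification timestamps."""

    def __init__(self):
        self.chunks = []        # time-ordered list of (timestamp, chunk)
        self.cast_marks = []    # time-ordered list of (timestamp, cast_id)

    def add_chunk(self, timestamp, chunk):
        self.chunks.append((timestamp, chunk))

    def mark_cast(self, timestamp, cast_id):
        self.cast_marks.append((timestamp, cast_id))

    def chunks_for_cast(self, index):
        """Chunks received after cast `index` and before the next cast notification."""
        start = self.cast_marks[index][0]
        end = (self.cast_marks[index + 1][0]
               if index + 1 < len(self.cast_marks) else float("inf"))
        return [chunk for t, chunk in self.chunks if start <= t < end]
```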
  • the video recording system 100 may determine whether a catch has been made.
  • the video recording system 100 may receive a notification regarding a catch similar to a notification that a cast has been made as described at block 210.
  • the video recording system 100 may also receive motion or other sensor data from the wearable device 300 for detecting a catch. If a catch is detected, the process may proceed to block 235. Otherwise, the process may proceed to block 240.
  • the video recording system 100 may store the received data regarding the video input in association with the cast or the catch.
  • the received data may be converted into a video file using a video format, such as MP4, WMV, AVI, etc.
  • a video file may be stored in association with the same identifier as the cast.
  • if the catch has a particular identifier, the video file may be stored using the same identifier as the catch.
  • the process may return to block 210 where the method 200 may restart the notification count (i.e., the next received notification that a cast has been made may then be the first cast at block 210) or enter a standby mode until another notification is received showing that a new cast has been made.
  • the video recording system 100 may determine whether a notification has been received that a second cast has been made at block 240. This notification may be similar to the notification received at block 210. If a second cast is detected, the process may proceed to block 250. If a second cast has not been detected, the process may return to block 225, where the video recording system 100 may continue to receive data regarding the video input.
  • the video recording system 100 may receive a time stamp corresponding to the second cast from block 240. This time stamp may be similar to the time stamp received at block 220.
  • the video recording system 100 may delete a portion or all of the received data that is associated with the first cast (i.e., "the associated data").
  • the received data may be associated with a particular cast using several methods.
  • the video recording system 100 may use the time stamps received at blocks 220 and 250 to determine the associated data, e.g., the period of time between the time stamp at block 220 and the time stamp at block 250.
  • portions of the received data at block 225 may be indexed to a particular cast, such as through metadata.
  • the video recording system 100 may index the data received after the notification to the corresponding cast until the next notification is received by the video recording system 100.
  • the process may index the received data following the next notification to the new cast.
  • the video recording system 100 may delete the received data at a staggered point in time. For example, in order to keep received data associated with the second cast, the video recording system 100 may delete the associated data up until a few seconds before the time stamp corresponding to the second cast. This may ensure that none of the received data associated with the second cast is deleted, such as due to a time delay between making the second cast and the video recording system 100 receiving a notification that the second cast was made.
  • the associated data may be deleted in response to the notification regarding the second cast at block 240. As such, the video recording system 100 may delete a portion of the received data associated with the first cast without deleting data associated with the second cast.
  • the process may end, enter a standby mode or return to block 225 to receive data regarding a video input that is associated with the second cast.
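The staggered deletion between the two time stamps might look like the following sketch; the guard interval of a few seconds is an illustrative assumption, not a value given in the disclosure.

```python
def delete_first_cast_data(chunks, first_ts, second_ts, guard_seconds=3.0):
    """Remove chunks between the first cast's time stamp and a point slightly
    before the second cast's time stamp, so that footage belonging to the second
    cast (including any notification delay) is preserved.

    chunks: list of (timestamp, chunk) tuples.
    """
    cutoff = second_ts - guard_seconds
    return [(t, c) for t, c in chunks if not (first_ts <= t < cutoff)]
```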
  • the notification at block 210 or block 240 may be a notification that a button has been pressed on the marine electronics device 600, the video recorder 110, or the wearable device 300.
  • the button may be a physical button or a virtual button, such as an icon on the screen 605.
  • the method 200 may be performed in a manner similar to that described at blocks 210-260, but when a cast or catch is made, a user may press a button to notify the video recording system 100 of the cast or catch, respectively.
  • the received data associated with the first cast may also be deleted in response to a notification that a button was pressed.
  • the wearable device 300 may have a button designated for deleting the received data that is associated with the previous cast.
  • one or more virtual buttons may be created on the marine electronics device 600 corresponding to the associated data for one or more previous casts on a trip. A user may delete or store the associated data by pressing the virtual button corresponding to the desired cast.
  • FIG. 3 illustrates a wearable device 300 in accordance with various implementations described herein.
  • the wearable device 300 may be worn around the fisherman's arm or wrist.
  • the wearable device 300 could also be attached to a fishing rod.
  • the wearable device 300 may include a housing 320.
  • the housing 320 may be in the shape of a band.
  • the housing 320 may be made of a combination of plastics and rubbers, or of any other synthetic material.
  • the housing 320 may also be waterproof.
  • the housing 320 may include a clasp, or another mechanism to aid in removal of the housing 320 from a user's arm.
  • FIG. 4 illustrates a block diagram of the wearable device 300 in accordance with various implementations described herein.
  • the housing 320 may include a computer 330 and at least one motion sensor 340.
  • the at least one motion sensor 340 may include one or more accelerometers, gyroscopes, muscle activity sensors, any other motion sensor, or any combination of motion sensors.
  • the at least one motion sensor 340 is configured to capture motion data.
  • the computer 330 is described in more detail in Figure 7.
  • the computer 330 may be loaded with software to analyze motion data from the at least one motion sensor 340.
  • the software may analyze motion data to determine when a fishing cast motion has been made.
  • the software may also record that a cast has been made and the time of the cast, e.g., a timestamp.
  • the software is described in more detail in Figure 5.
  • the wearable device 300 may include one or more buttons 310.
  • the one or more buttons 310 may be used for user input.
  • the one or more buttons 310 may be used to input the occurrence of a cast.
  • the one or more buttons 310 may be used to input the occurrence of a catch. The catch may then be recorded.
  • the one or more buttons 310 may be used to input the weight of a caught fish. The weight may then be recorded.
  • a user may press a button 310 to input the occurrence of a catch, and then may press the same or different button 310 to input the weight of the caught fish. The occurrence of the catch and the weight may then be recorded.
  • the one or more buttons 310 may be used to input the occurrence of a bite.
  • the wearable device may contain a display 350.
  • the display may be a series of Light Emitting Diodes (LED).
  • the display may be a Liquid Crystal Display (LCD).
  • the wearable device 300 may include wireless technology, such as Bluetooth, Wi-Fi, cellular technology such as GSM or CDMA, satellite communication, or any other wireless technology.
  • the wearable device 300 may be connected wirelessly to a marine electronics device 600, which is described in more detail in Figure 8.
  • while the wearable device 300 is described as being wirelessly connected to a marine electronics device 600, it should be understood that the wearable device 300 may be connected to any computer system, including a portable computer system, a smart phone device, a remote server, the video recorder 110, a cloud server and the like. It should also be understood that the wearable device 300 may be connected to any other device able to store fishing data, e.g., a data logging device.
  • the marine electronics device 600 or a computer system, including a smart phone, may record additional data, such as location, weather, or other data.
  • the data from the marine electronics device 600 or computer system and the wearable device 300 may then be combined to provide comprehensive data regarding a fishing trip.
  • the combined data may then be transmitted directly to a remote server or cloud.
  • the combined data may be transmitted to a smart phone device, which then transmits the data to a remote server or cloud.
  • the combined data may be transmitted to the data logging device, which may then transmit the combined data at a later time.
  • the data from the wearable device 300 may be transmitted to the remote server or cloud via the smart phone without using the marine electronics device 600.
  • the data from the wearable device may be transmitted to a data logging device prior to being transmitted to a remote server or cloud via the smart phone.
  • the data from the wearable device 300 may be transmitted to the remote server or cloud without using the marine electronics device 600 or the smartphone.
  • Figure 5 illustrates a flow diagram of a cast detection method 500 in accordance with implementations of various techniques described herein.
  • method 500 may be performed by the computer 330. It should be understood that while method 500 indicates a particular order of execution of operations, in some implementations, certain portions of the operations might be executed in a different order. Further, in some implementations, additional operations or steps may be added to the method 500. Likewise, some operations or steps may be omitted.
  • the computer 330 contained in a wearable device 300 may be loaded with a set of instructions (software) to analyze data received from one or more sensors.
  • the software may receive motion data from the at least one motion sensor 340 in the wearable device.
  • the software may analyze the motion data and determine when a cast has been made.
  • the software may record the occurrence of the cast and time of the cast, e.g., a timestamp in memory, e.g., inside the computer 330.
  • the record may be a database, a log, or any other method of recording the fishing data.
  • the record may be a number representing the number of casts that have occurred, with the number being incremented after each cast. The number of casts may be shown on the display 350.
  • the computer 330 may be synchronized to a marine electronics device or a portable computer device, such as a smart phone. This step is optional. In one implementation, the computer 330 may be wirelessly synchronized to the marine electronics device 600. Figure 6 illustrates the wearable device 300 being synchronized to the marine electronics device.
  • the software may enter a standby mode in which data may be received from the at least one motion sensor 340 and analyzed.
  • the software may continuously monitor for a cast. Once a cast is detected, the cast and the timestamp corresponding to the detected cast may be recorded (block 550).
  • the software may determine the type of cast used using motion sensor data (block 550). The software may determine whether the cast made is a basic cast, roll cast, side cast, or any other type of cast. The software may then record the type of cast made (block 550). Then, the software returns to the standby mode (block 530).
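As a rough illustration only, a cast detector over accelerometer samples could look like the sketch below; the magnitude threshold and the intensity-based cast-type mapping are toy assumptions, not the detection logic of method 500.

```python
import math


def detect_cast(samples, threshold=25.0):
    """Return (timestamp, peak magnitude) of a cast-like swing, or None.

    samples: iterable of (t, ax, ay, az) accelerometer readings in m/s^2.
    """
    for t, ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold:          # sharp swing of the rod or arm
            return t, magnitude
    return None


def classify_cast(peak_magnitude):
    """Toy mapping from swing intensity to a cast type."""
    if peak_magnitude > 40.0:
        return "basic cast"
    if peak_magnitude > 30.0:
        return "roll cast"
    return "side cast"
```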
  • the software may detect a catch or a bite.
  • the software may detect a catch or a bite based on the motion sensor data. Once a bite or a catch is detected, the occurrence of a bite or a catch and their corresponding timestamp may be recorded (block 540/550).
  • the record may be a database, a log, or any other method of recording the fishing data.
  • the record may be a number representing the number of bites or catches that have occurred, with the number being incremented after each bite or catch. The number of bites or catches may be shown on the display 350. Then, the software returns to the standby mode (block 520).
  • casts, bites and/or catches may be detected using one or more buttons 310.
  • a user may press a first button 310.
  • a user may press a second, different button 310.
  • a user may press a third, different button 310.
  • a user may press a button 310 and then quickly release the button 310 to indicate the occurrence of a bite. The user may also press the same button 310 and hold the button 310 down for a longer time to indicate a catch.
  • the software may receive further user input corresponding to the weight of the caught fish (block 560). If the software receives further user input, the software may then record the weight of the caught fish (block 560). The record may be a database, a log, or any other method of recording the fishing data. The inputted weight may be shown on a display 350. Then, the software returns to the standby mode (block 520).
  • the weight is entered using one or more buttons 310.
  • a weight may be entered by pushing the one or more buttons 310 a number of times to correspond to the weight of the caught fish. For example, to enter a three pound fish, a button 310 may be pressed three times.
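A sketch of the button-input conventions described above (the press-duration threshold and the presses-to-pounds encoding are assumptions): a short press marks a bite, a long press marks a catch, and the number of presses encodes the weight in pounds.

```python
def classify_press(duration_seconds, long_press_threshold=1.0):
    """Short press -> bite, long press and hold -> catch."""
    return "catch" if duration_seconds >= long_press_threshold else "bite"


def weight_from_presses(press_count):
    """Number of presses encodes the weight, e.g. three presses -> a three pound fish."""
    return press_count
```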
  • the software may transmit the recorded data wirelessly to the connected device, e.g., the marine electronics device 600 (block 570).
  • the software may transmit the recorded data after each new entry, or at any other interval.
  • the transmission may be made after each cast.
  • the transmission may be to a remote server or to any computer system, including a smart phone or a marine electronics device.
  • FIG. 6 illustrates a schematic diagram of a marine electronics device 600 in accordance with various implementations described herein.
  • the components of the marine electronics device 600 are described in more detail with reference to the computing system 700 in Figure 7.
  • the marine electronics device 600 includes a screen 605.
  • the screen 605 may be sensitive to touching by a finger.
  • the screen 605 may be sensitive to the body heat from the finger, a stylus, or responsive to a mouse.
  • the marine electronics device 600 may display marine electronic data 615.
  • the marine electronic data types 615 may include chart data, radar data, sonar data, steering data, dashboard data, navigation data, fishing statistics, and the like.
  • the marine electronics device 600 may also include a plurality of buttons 620, which may be either physical buttons or virtual buttons, or a combination thereof.
  • Implementations of various technologies described herein may be operational with numerous general purpose or special purpose computing system environments or configurations.
  • Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the various technologies described herein include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, smart phones, and the like.
  • the various technologies described herein may be implemented in the context of marine electronics, such as devices found in marine vessels and/or navigation systems.
  • Ship instruments and equipment may be connected to the computing systems described herein for executing one or more navigation technologies.
  • the computing systems may be configured to operate using sonar, radar, the global positioning system (GPS) and like technologies.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Further, each program module may be implemented in its own way, and all need not be implemented the same way. While program modules may all execute on a single computing system, it should be appreciated that, in some implementations, program modules may be implemented on separate computing systems or devices adapted to communicate with one another. A program module may also be some combination of hardware and software where particular tasks performed by the program module may be done either through hardware, software, or both.
  • FIG. 7 illustrates a schematic diagram of the video recording system 100 having a computing system 700 in accordance with implementations of various techniques described herein.
  • the computing system 700 may describe the video recorder 110, the marine electronics device 600, the wearable device 300, or components spanning multiple devices.
  • the computing system 700 may be a conventional desktop, a handheld device, a controller, a personal digital assistant, a server computer, an electronics device/instrument, a laptop, a tablet, or part of a navigation or sonar system. It should be noted, however, that other computer system configurations may be used.
  • the computing system 700 may include a central processing unit (CPU) 730, a system memory 726, a graphics processing unit (GPU) 731 and a system bus 728 that couples various system components including the system memory 726 to the CPU 730. Although only one CPU 730 is illustrated in Figure 7, it should be understood that in some implementations the computing system 700 may include more than one CPU 730.
  • the CPU 730 can include a microprocessor, a microcontroller, a processor, a programmable integrated circuit, or a combination thereof.
  • the CPU 730 can comprise an off-the-shelf processor such as a Reduced Instruction Set Computer (RISC), or a Microprocessor without Interlocked Pipeline Stages (MIPS) processor, or a combination thereof.
  • the CPU 730 may also include a proprietary processor.
  • the GPU 731 may be a microprocessor specifically designed to manipulate and implement computer graphics.
  • the CPU 730 may offload work to the GPU 731.
  • the GPU 731 may have its own graphics memory, and/or may have access to a portion of the system memory 726.
  • the GPU 731 may include one or more processing units, and each processing unit may include one or more cores.
  • the CPU 730 may provide output data to a GPU 731.
  • the GPU 731 may generate graphical user interfaces that present the output data.
  • the GPU 731 may also provide objects, such as menus, in the graphical user interface. A user may provide inputs by interacting with the objects.
  • the GPU 731 may receive the inputs from interaction with the objects and provide the inputs to the CPU 730.
  • a video adapter 732 may be provided to convert graphical data into signals for a monitor 734.
  • the monitor 734 includes a screen 705.
  • the screen 705 can be sensitive to heat or touching (now collectively referred to as a "touch screen").
  • the host computer 799 may not include a monitor 734.
  • the system bus 728 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • bus architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus.
  • the system memory 726 may include a read only memory (ROM) 712 and a random access memory (RAM) 716.
  • a basic input/output system (BIOS), containing the basic routines that help transfer information between elements within the computing system 700, such as during start-up, may be stored in the ROM 712.
  • the computing system 700 may further include a hard disk drive interface 736 for reading from and writing to a hard disk 750, a memory card reader 752 for reading from and writing to a removable memory card 756, and an optical disk drive 754 for reading from and writing to a removable optical disk 758, such as a CD ROM or other optical media.
  • the hard disk 750, the memory card reader 752, and the optical disk drive 754 may be connected to the system bus 728 by a hard disk drive interface 736, a memory card reader interface 738, and an optical drive interface 740, respectively.
  • the drives and their associated computer-readable media may provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computing system 700.
  • although the computing system 700 is described herein as having a hard disk, a removable memory card 756 and a removable optical disk 758, it should be appreciated by those skilled in the art that the computing system 700 may also include other types of computer-readable media that may be accessed by a computer.
  • computer-readable media may include computer storage media and communication media.
  • Computer storage media may include volatile and non-volatile, and removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules or other data.
  • Computer storage media may further include RAM, ROM, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other solid state memory technology, CD-ROM, digital versatile disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computing system 700.
  • Communication media may embody computer readable instructions, data structures, program modules or other data in a modulated data signal, such as a carrier wave or other transport mechanism and may include any information delivery media.
  • modulated data signal may mean a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
  • the computing system 700 may also include a host adapter 733 that connects to a storage device 735 via a small computer system interface (SCSI) bus, a Fiber Channel bus, an eSATA bus, or using any other applicable computer bus interface.
  • the computing system 700 can also be connected to a router 764 to establish a wide area network (WAN) 766 with one or more remote computers 774.
  • the router 764 may be connected to the system bus 728 via a network interface 744.
  • the remote computers 774 can also include hard disks 772 that store application programs 770.
  • the computing system 700 may also connect to one or more remote computers 774 via local area network (LAN) 776 or the WAN 766.
  • LAN local area network
  • the computing system 700 may be connected to the LAN 776 through the network interface or adapter 744.
  • the LAN 776 may be implemented via a wired connection or a wireless connection.
  • the LAN 776 may be implemented using Wi-Fi technology, cellular technology, or any other implementation known to those skilled in the art.
  • the network interface 744 may also utilize remote access technologies (e.g., Remote Access Service (RAS), Virtual Private Networking (VPN), Secure Socket Layer (SSL), Layer 2 Tunneling (L2T), or any other suitable protocol).
  • RAS Remote Access Service
  • VPN Virtual Private Networking
  • SSL Secure Socket Layer
  • L2T Layer 2 Tunneling
  • a number of program modules may be stored on the hard disk 750, memory card 756, optical disk 758, ROM 712 or RAM 716, including an operating system 718, one or more application programs 720, and program data 724.
  • the hard disk 750 may store a database system.
  • the database system could include, for example, recorded points.
  • the application programs 720 may include various mobile applications ("apps") and other applications configured to perform various methods and techniques described herein.
  • the operating system 718 may be any suitable operating system that may control the operation of a networked personal or server computer.
  • a user may enter commands and information into the computing system 700 through input devices such as a keyboard 762 and pointing device.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, user input button, or the like.
  • These and other input devices may be connected to the CPU 730 through a serial port interface 742 coupled to system bus 523, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB).
  • a monitor 105 or other type of display device may also be connected to system bus 728 via an interface, such as a video adapter 732.
  • the computing system 700 may further include other peripheral output devices such as speakers and printers.

Abstract

A non-transitory computer-readable medium has stored thereon computer-executable instructions which, when executed by a computer, cause the computer to receive a first notification that a first cast has been made. The computer-executable instructions may further include instructions that cause the computer to receive data regarding a video input. The computer-executable instructions may further include instructions that cause the computer to receive a second notification that a second cast has been made. The computer-executable instructions may further include instructions that cause the computer to delete a portion of the data regarding the video input that is associated with the first cast in response to receiving the second notification.

Description

VIDEO RECORDING SYSTEM AND METHODS

CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application is a PCT filing of US Patent Application Number 14/259,052, filed April 22, 2014 and titled VIDEO RECORDING SYSTEM AND METHODS, and US Provisional Patent Application Number 61/868,444, filed August 21, 2013 and titled FISHING DATA COLLECTION AND USE. Each of the aforementioned applications is incorporated herein by reference.
BACKGROUND
[0002] This section is intended to provide background information to facilitate a better understanding of various technologies described herein. As the section's title implies, this is a discussion of related art. That such art is related in no way implies that it is prior art. The related art may or may not be prior art. It should therefore be understood that the statements in this section are to be read in this light, and not as admissions of prior art.
[0003] Being able to record a catch during a fishing trip is very useful for memorializing the experience. However, during any fishing trip, there may be time periods when fish do not bite and many casts do not result in caught fish. Recording the entire fishing trip to capture each catch may result in a large amount of video that is of little personal value.
SUMMARY
[0004] Described herein are implementations of various technologies for a method. In one implementation, a non-transitory computer-readable medium has stored thereon computer-executable instructions which, when executed by a computer, cause the computer to perform various actions. The actions may include receiving a first notification that a first cast has been made. The actions may include receiving data regarding a video input. The actions may include receiving a second notification that a second cast has been made. The actions may include deleting data regarding the video input that is associated with the first cast in response to receiving the second notification.

[0005] In one implementation, the data regarding the video input may be deleted without deleting data regarding the video input that is associated with the second cast. In another implementation, the actions may include receiving a first time stamp corresponding to the first cast. The actions may include receiving a second time stamp corresponding to the second cast. The data between the first time stamp and the second time stamp may be the data associated with the first cast that is deleted. In another implementation, the data regarding the video input may be associated with the first cast by indexing the data regarding the video input to data received after the first notification and before the second notification. In another implementation, the data regarding the video input may be data recorded by a video camera. In another implementation, the first notification may be received from a wearable device. In another implementation, the first notification may be received over a wireless connection. In another implementation, the first notification may include motion data corresponding to a fishing cast.
[0006] Described herein are implementations of various technologies for a method. In one implementation, a non-transitory computer-readable medium has stored thereon computer-executable instructions which, when executed by a computer, cause the computer to perform various actions. The actions may include receiving a notification that a cast has been made. The actions may include receiving data regarding a video input associated with the cast. The actions may include determining whether a catch has been detected that corresponds to the cast. The actions may include storing the data regarding the video input in response to a determination that a catch has been detected.
[0007] In one implementation, the data regarding the video input may be stored in association with the cast or the catch. In another implementation, the actions may include receiving a notification that a catch has been made. In another implementation, the notification is received from a wearable device. In another implementation, the notification is received over a wireless connection. In another implementation, the notification may include motion data corresponding to a fishing cast.
[0008] Described herein are implementations of various technologies for a method. In one implementation, a non-transitory computer-readable medium has stored thereon computer-executable instructions which, when executed by a computer, cause the computer to perform various actions. The actions may include receiving a notification that a first cast has been made. The actions may include receiving data regarding a video input. The actions may include receiving a notification that a button has been pressed, wherein the button is associated with a second cast or a catch being made. The actions may include deleting the data regarding the video input that is associated with the first cast, if the button is associated with the second cast.
[0009] In one implementation, the actions may include storing a portion of the data regarding the video input if the button is associated with the catch being made. In another implementation, the button may be a virtual button located in a graphical user display. In another implementation, the button may be located on a wearable device. In another implementation, a portion of the data regarding the video input may be deleted without deleting the data regarding the video input associated with the second cast. In another implementation, the actions may include receiving a first time stamp corresponding to the first cast. The actions may also include receiving a second time stamp corresponding to the button being pressed. The portion of the data being deleted may be the data between the first time stamp and the second time stamp.
[0010] The above referenced summary section is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description section. The summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Implementations of various techniques will hereafter be described with reference to the accompanying drawings. It should be understood, however, that the accompanying drawings illustrate only the various implementations described herein and are not meant to limit the scope of various techniques described herein.
[0012] Figure 1 illustrates a block diagram of a video recording system in accordance with implementations of various techniques described herein.
[0013] Figure 2 is a flow diagram of a video recording method in accordance with implementations of various techniques described herein.

[0014] Figure 3 illustrates a wearable device in accordance with implementations of various techniques described herein.
[0015] Figure 4 is a block diagram of a wearable device in accordance with implementations of various techniques described herein.
[0016] Figure 5 is a flow diagram describing the operation of fishing motion detection software loaded in a wearable device in accordance with implementations of various techniques described herein.
[0017] Figure 6 is an illustration of a wearable device wirelessly transmitting data to a marine electronics device and receiving data from the device in order to begin recording data in accordance with implementations of various techniques described herein.
[0018] Figure 7 illustrates a schematic diagram of a video recording system having a computing system in which the various technologies described herein may be incorporated and practiced.
[0019] Figure 8 illustrates a schematic of a marine electronics device in accordance with implementations of various techniques described herein.
DETAILED DESCRIPTION
[0020] The discussion below is directed to certain specific implementations. It is to be understood that the discussion below is only for the purpose of enabling a person with ordinary skill in the art to make and use any subject matter defined now or later by the patent "claims" found in any issued patent herein.
[0021] It is specifically intended that the claimed invention not be limited to the implementations and illustrations contained herein, but include modified forms of those implementations including portions of the implementations and combinations of elements of different implementations as come within the scope of the following claims. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure. Nothing in this application is considered critical or essential to the claimed invention unless explicitly indicated as being "critical" or "essential."
[0022] Reference will now be made in detail to various implementations, examples of which are illustrated in the accompanying drawings and figures. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be apparent to one of ordinary skill in the art that the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0023] It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first object or step could be termed a second object or step, and, similarly, a second object or step could be termed a first object or step, without departing from the scope of the invention. The first object or step, and the second object or step, are both objects or steps, respectively, but they are not to be considered the same object or step.
[0024] The terminology used in the description of the present disclosure herein is for the purpose of describing particular implementations only and is not intended to be limiting of the present disclosure. As used in the description of the present disclosure and the appended claims, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "includes," "including," "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
[0025] As used herein, the term "if" may be construed to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context. As used herein, the terms "up" and "down"; "upper" and "lower"; "upwardly" and "downwardly"; "below" and "above"; and other similar terms indicating relative positions above or below a given point or element may be used in connection with some implementations of various technologies described herein.
[0026] Various implementations of a video recording system described herein will now be described in more detail with reference to Figures 1-8.
[0027] Figure 1 illustrates a block diagram of a video recording system 100 in accordance with implementations of various techniques described herein. The video recording system 100 may include several components, such as a wearable device 300, a marine electronics device 600 and a video recorder 110 having a video camera 120, a processor 130, memory 140 and a clock 150. For more information regarding the wearable device 300, see the section titled WEARABLE DEVICE FOR FISHING below. For more information regarding the marine electronics device 600, see the section titled MARINE ELECTRONICS DEVICE below. The video recording system 100 may communicate over wireless or wired network connections 105. The various components of the video recording system 100 are described in more detail with reference to the computing system diagram in Figure 7.
[0028] The video recorder 110 may be a stand-alone unit or embedded in a marine vessel. While one video camera is shown in Figure 1, more than one video camera may be included in the video recording system 100. In one implementation, the video recorder 110 may record continuously throughout a fishing trip. Data not associated with catches may be deleted afterwards using method 200 below.
[0029] The video recording system 100 may record a fisherman throughout a cast. As such, the video recording system 100 may record the casting of a line into the water and when a fisherman catches a fish or anything else resulting from the cast. The video recording system 100 may be configured to detect when a cast, catch or other predetermined event occurs, and limit video recording to time periods associated with those events. For more information regarding cast or catch detection, see the section titled FISHING MOTION DETECTION below. Using the wearable device 300 or the marine electronics device 600, the video recording system 100 may also determine when particular events take place and the video recording system 100 may begin recording or stop recording, accordingly. When nothing is being recorded by the video camera 120, the video recording system may enter a standby mode waiting for a predetermined event to occur to trigger a recording session.
[0030] Figure 2 illustrates a flow diagram of a video recording method 200 in accordance with implementations of various techniques described herein. In one implementation, method 200 may be performed by one or more components of the video recording system 100, such as the wearable device 300, the marine electronics device 600 and/or the video recorder 110. It should be understood that while method 200 indicates a particular order of execution of operations, in some implementations, certain portions of the operations might be executed in a different order. Further, in some implementations, additional operations or steps may be added to the method 200. Likewise, some operations or steps may be omitted.
[0031] At block 210, the video recording system 100 may receive a notification that a first cast has been made. The notification may be a message sent by the wearable device 300, the marine electronics device 600 or other computer device capable of determining whether a cast has been made. The notification may also be motion data or other sensor data that the video recording system 100 may determine to represent a cast. For more information on using motion data or other sensor data to detect a cast, see the section titled FISHING MOTION DETECTION.
[0032] At block 220, the video recording system 100 may receive a time stamp corresponding to the first cast from block 210. The time stamp may be determined by a clock 150 or a timer in the video recording system 100. As such, the time stamp may be based on when the video recording system 100 received the notification at block 210. The time stamp may be part of the notification at block 210 or a message sent separate from the notification. The time stamp may also be a particular time (e.g., 5 seconds before or after a cast is detected) designated in relation to the cast.
[0033] At block 225, the video recording system 100 may receive data regarding a video input (i.e., "the received data"). For instance, the video input may be an Audio/Video (AV) input, High Definition Multimedia Interface (HDMI) input, S-Video input, Component (YPbPr) video input, Digital Video Interface (DVI) input or the like. The received data may be data received from the video camera 120 or other audio or video capturing device. The received data may include video as well as related data such as audio recordings, the time, date, metadata, video camera settings, or the like. The received data may be associated with the notification of the first cast from block 210, such as through indexing the received data to the notification or first cast. The received data may be stored in memory 140 or to a hard disk.
[0034] While the video recording system 100 may begin receiving data regarding a video input upon receiving the notification at block 210, in one implementation, the video recording system 100 may receive data regarding a video input continuously throughout a fishing trip, such as before the notification at block 210 is received. The video recording system 100 may also record data at predetermined time periods. For instance, the video recording system 100 may receive data regarding a video input whenever the video recording system 100 leaves a standby mode even if no notification has been received that a cast has been made.
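As a rough illustration of how received video data might be indexed to cast notifications, the following Python sketch buffers incoming frames in memory and tags each frame with the identifier of the most recent cast. The class and method names (FrameBuffer, on_cast_notification, on_frame) are hypothetical and are not part of the disclosed system; they only show one way the indexing described in this section could be organized.

```python
import time
from collections import defaultdict

class FrameBuffer:
    """Minimal sketch: index received video data to the most recent cast."""

    def __init__(self):
        self.current_cast_id = None            # no cast notification seen yet
        self.frames_by_cast = defaultdict(list)

    def on_cast_notification(self, cast_id):
        # Data received after this notification is indexed to this cast
        # until the next notification arrives (see paragraph [0040]).
        self.current_cast_id = cast_id

    def on_frame(self, frame_bytes):
        # Frames received before any cast notification are kept under None;
        # a real system might instead discard them or keep a short window.
        self.frames_by_cast[self.current_cast_id].append((time.time(), frame_bytes))
```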
[0035] At block 230, the video recording system 100 may determine whether a catch has been made. The video recording system 100 may receive a notification regarding a catch similar to a notification that a cast has been made as described at block 210. The video recording system 100 may also receive motion or other sensor data from a wearable device 300 for detecting a catch. If a catch is detected, the process may proceed to block 235. Otherwise, the process may proceed to block 240.
[0036] At block 235, the video recording system 100 may store the received data regarding the video input in association with the cast or the catch. For instance, the received data may be converted into a video file using a video format, such as MP4, WMV, AVI, etc. If the first cast from block 210 has a particular identifier, such as an identification number based on the date or time, a video file may be stored in association with the same identifier as the cast. If the catch has a particular identifier, the video file may be stored using the same identifier as the catch. The process may return to block 210 where the method 200 may restart the notification count (i.e., the next received notification that a cast has been made may then be the first cast at block 210) or enter a standby mode until another notification is received showing that a new cast has been made.

[0037] As mentioned above, if no catch is detected, then the video recording system 100 may determine whether a notification has been received that a second cast has been made at block 240. This notification may be similar to the notification received at block 210. If a second cast is detected, the process may proceed to block 250. If a second cast has not been detected, the process may return to block 225, where the video recording system 100 may continue to receive data regarding the video input.
[0038] At block 250, the video recording system 100 may receive a time stamp corresponding to the second cast from block 240. This time stamp may be similar to the time stamp received at block 220.
[0039] At block 260, the video recording system 100 may delete a portion or all of the received data that is associated with the first cast (i.e., "the associated data"). The received data may be associated with a particular cast using several methods. In one implementation, the video recording system 100 may use the time stamps received at blocks 220 and 250 to determine the associated data, e.g., the period of time between the time stamp at block 220 and the time stamp at block 250.
[0040] In another implementation, portions of the received data at block 225 may be indexed to a particular cast, such as through metadata. When the video recording system 100 receives a notification that a cast has been made, the video recording system 100 may index the data received after the notification to the corresponding cast until the next notification is received by the video recording system 100. When the next notification is received, the process may index the received data following the next notification to the new cast.
[0041] In yet another implementation, the video recording system 100 may delete the received data at a staggered point in time. For example, in order to keep received data associated with the second cast, the video recording system 100 may delete the associated data up until a few seconds before the time stamp corresponding to the second cast. This may ensure that none of the received data associated with the second cast is deleted, such as due to a time delay between making the second cast and the video recording system 100 receiving a notification that the second cast was made.

[0042] In still another implementation, the associated data may be deleted in response to the notification regarding the second cast at block 240. As such, the video recording system 100 may delete a portion of the received data associated with the first cast without deleting data associated with the second cast.
[0043] After block 260, the process may end, enter a standby mode or return to block 225 to receive data regarding a video input that is associated with the second cast.
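As one possible realization of blocks 230 through 260, the sketch below decides whether to keep or delete the footage associated with the first cast when a second cast notification arrives, using the two time stamps and a fixed two-second stagger before the second cast. It assumes the hypothetical FrameBuffer from the earlier sketch; the stagger value and the save_clip placeholder are illustrative assumptions, not details taken from the disclosure.

```python
STAGGER_SECONDS = 2.0   # assumed guard interval kept before the second cast

def handle_second_cast(buffer, first_cast_id, first_ts, second_ts, catch_detected):
    """Keep or delete the data associated with the first cast (blocks 230-260)."""
    frames = buffer.frames_by_cast.get(first_cast_id, [])
    if catch_detected:
        # Block 235: keep the clip, stored under the cast identifier.
        save_clip(first_cast_id, frames)
        return
    # Blocks 250-260: drop data between the two time stamps, stopping a few
    # seconds short of the second cast so its footage is never touched.
    cutoff = second_ts - STAGGER_SECONDS
    buffer.frames_by_cast[first_cast_id] = [
        (ts, frame) for ts, frame in frames if not (first_ts <= ts < cutoff)]

def save_clip(cast_id, frames):
    # Placeholder: a real implementation would encode the frames into a
    # video file format such as MP4, WMV or AVI.
    with open(f"cast_{cast_id}.bin", "wb") as out:
        for _, frame in frames:
            out.write(frame)
```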
[0044] In another implementation of method 200, the notification at block 210 or block 240 may be a notification that a button has been pressed on the marine electronics device 600, the video recorder 110, or the wearable device 300. The button may be a physical button or a virtual button, such as an icon on the screen 605. As such, the method 200 may be performed in a manner similar to that described at blocks 210-260, but when a cast or catch is made, a user may press a button to notify the video recording system 100 of the cast or catch, respectively. In another implementation, at block 250, the received data associated with the first cast may also be deleted in response to a notification that a button was pressed. For instance, the wearable device 300 may have a button designated for deleting the received data that is associated with the previous cast.
[0045] In another implementation, one or more virtual buttons may be created on the marine electronics device 600 corresponding to the associated data for one or more previous casts on a trip. A user may delete or store the associated data by pressing the virtual button corresponding to the desired cast.
WEARABLE DEVICE FOR FISHING
[0046] Fishermen often record details of their fishing trips so that the information can be referenced at a later time, and so that the trip can be analyzed. Using a wearable device that captures motion and determines when a cast has been made, fishing data could easily be recorded by a computer system without the need for significant user input. Accordingly, Figure 3 illustrates a wearable device 300 in accordance with various implementations described herein. The wearable device 300 may be worn around the fisherman's arm or wrist. The wearable device 300 could also be attached to a fishing rod.

[0047] The wearable device 300 may include a housing 320. In one implementation, the housing 320 may be in the shape of a band. The housing 320 may be made of a combination of plastics and rubbers, or of any other synthetic material. The housing 320 may also be waterproof. The housing 320 may include a clasp, or another mechanism to aid in removal of the housing 320 from a user's arm.
[0048] Figure 4 illustrates a block diagram of the wearable device 300 in accordance with various implementations described herein. As shown in Figure 4, the housing 320 may include a computer 330 and at least one motion sensor 340. The at least one motion sensor 340 may include one or more accelerometers, gyroscopes, muscle activity sensors, any other motion sensor, or any combination of motion sensors. The at least one motion sensor 340 is configured to capture motion data.
[0049] The computer 330 is described in more detail in Figure 7. In one implementation, the computer 330 may be loaded with software to analyze motion data from the at least one motion sensor 340. For instance, the software may analyze motion data to determine when a fishing cast motion has been made. The software may also record that a cast has been made and the time of the cast, e.g., a timestamp. The software is described in more detail in Figure 5.
[0050] The wearable device 300 may include one or more buttons 310. The one or more buttons 310 may be used for user input. In one implementation, the one or more buttons 310 may be used to input the occurrence of a cast. In another implementation, the one or more buttons 310 may be used to input the occurrence of a catch. The catch may then be recorded. In another implementation, the one or more buttons 310 may be used to input the weight of a caught fish. The weight may then be recorded. In yet another implementation, a user may press a button 310 to input the occurrence of a catch, and then may press the same or different button 310 to input the weight of the caught fish. The occurrence of the catch and the weight may then be recorded. In still another implementation, the one or more buttons 310 may be used to input the occurrence of a bite.
[0051] The wearable device 300 may contain a display 350. The display may be a series of Light Emitting Diodes (LED). The display may be a Liquid Crystal Display (LCD).

[0052] The wearable device 300 may include wireless technology, such as Bluetooth, Wi-Fi, cellular technology such as GSM or CDMA, satellite communication, or any other wireless technology. In one implementation, the wearable device 300 may be connected wirelessly to a marine electronics device 600, which is described in more detail in Figure 8. Although the wearable device 300 is described as being wirelessly connected to a marine electronics device 600, it should be understood that the wearable device 300 may be connected to any computer system, including a portable computer system, a smart phone device, a remote server, the video recorder 110, a cloud server and the like. It should also be understood that the wearable device 300 may be connected to any other device able to store fishing data, e.g., a data logging device.
[0053] The marine electronics device 600 or a computer system, including a smart phone, may record additional data, such as location, weather, or other data. The data from the marine electronics device 600 or computer system and the wearable device 300 may then be combined to provide comprehensive data regarding a fishing trip. The combined data may then be transmitted directly to a remote server or cloud. In one implementation, the combined data may be transmitted to a smart phone device, which then transmits the data to a remote server or cloud. In another implementation, the combined data may be transmitted to the data logging device, which may then transmit the combined data at a later time. In yet another implementation, the data from the wearable device 300 may be transmitted to the remote server or cloud via the smart phone without using the marine electronics device 600. In still another implementation, the data from the wearable device may be transmitted to a data logging device prior to being transmitted to a remote server or cloud via the smart phone. In still another implementation, the data from the wearable device 300 may be transmitted to the remote server or cloud without using the marine electronics device 600 or the smartphone.
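A minimal sketch of how the combined trip record might be assembled before upload, assuming the wearable events and the marine electronics (or smart phone) records each carry a time stamp. The field names (ts, position, weather), the matching-by-nearest-time-stamp approach, and the upload URL are all illustrative assumptions rather than details from the disclosure.

```python
import json
from urllib import request

def merge_trip_data(wearable_events, device_records):
    """Pair each wearable event with the closest device record by time stamp."""
    if not device_records:
        return [dict(event) for event in wearable_events]
    merged = []
    for event in wearable_events:       # e.g. {"type": "cast", "ts": 1408600000.0}
        closest = min(device_records, key=lambda rec: abs(rec["ts"] - event["ts"]))
        merged.append({**event,
                       "position": closest.get("position"),
                       "weather": closest.get("weather")})
    return merged

def upload_trip(merged_records, url="https://example.com/api/trips"):
    # Hypothetical endpoint: the disclosure only says the combined data may be
    # sent to a remote server or cloud, directly or via a smart phone.
    body = json.dumps(merged_records).encode("utf-8")
    req = request.Request(url, data=body, headers={"Content-Type": "application/json"})
    request.urlopen(req)
```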
FISHING MOTION DETECTION
[0054] Figure 5 illustrates a flow diagram of a cast detection method 500 in accordance with implementations of various techniques described herein. In one implementation, method 500 may be performed by the computer 330. It should be understood that while method 500 indicates a particular order of execution of operations, in some implementations, certain portions of the operations might be executed in a different order. Further, in some implementations, additional operations or steps may be added to the method 500. Likewise, some operations or steps may be omitted.
[0055] As mentioned above, the computer 330 contained in a wearable device 300 may be loaded with a set of instructions (software) to analyze data received from one or more sensors. The software may receive motion data from the at least one motion sensor 340 in the wearable device. The software may analyze the motion data and determine when a cast has been made. The software may record the occurrence of the cast and the time of the cast, e.g., a timestamp, in memory, e.g., inside the computer 330. The record may be a database, a log, or any other method of recording the fishing data. The record may be a number representing the number of casts that have occurred, with the number being incremented after each cast. The number of casts may be shown on the display 350.
[0056] At block 510, the computer 330 may be synchronized to a marine electronics device or a portable computer device, such as a smart phone. This step is optional. In one implementation, the computer 330 may be wirelessly synchronized to the marine electronics device 600. Figure 6 illustrates the wearable device 300 being synchronized to the marine electronics device.
[0057] At block 520, the software may enter a standby mode in which data may be received from the at least one motion sensor 340 and analyzed. At this step, the software may continuously monitor for a cast. Once a cast is detected, the cast and the timestamp corresponding to the detected cast may be recorded (block 550). In one implementation, the software may determine the type of cast used based on the motion sensor data (block 550). The software may determine whether the cast made is a basic cast, roll cast, side cast, or any other type of cast. The software may then record the type of cast made (block 550). Then, the software returns to the standby mode (block 530).
[0058] While in standby mode (block 530), the software may detect a catch or a bite. The software may detect a catch or a bite based on the motion sensor data. Once a bite or a catch is detected, the occurrence of a bite or a catch and their corresponding timestamp may be recorded (block 540/550). The record may be a database, a log, or any other method of recording the fishing data. The record may be a number representing the number of bites or catches that have occurred, with the number being incremented after each bite or catch. The number of bites or catches may be shown on the display 350. Then, the software returns to the standby mode (block 520).
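The disclosure does not specify how the software recognizes a casting motion from the sensor data. As a hedged illustration only, one simple approach is a threshold detector that flags a cast when the accelerometer magnitude spikes and then settles, as sketched below; the threshold values and the (timestamp, ax, ay, az) sample format are arbitrary assumptions.

```python
import math

CAST_THRESHOLD_G = 2.5     # assumed peak acceleration for a casting motion
QUIET_THRESHOLD_G = 1.2    # assumed level indicating the motion has ended

def detect_cast(samples):
    """Return the time stamp of a detected cast, or None.

    `samples` is an iterable of (timestamp, ax, ay, az) tuples in g units.
    """
    peak_seen = False
    for ts, ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude >= CAST_THRESHOLD_G:
            peak_seen = True          # candidate casting motion in progress
        elif peak_seen and magnitude <= QUIET_THRESHOLD_G:
            return ts                 # motion finished: treat as a cast
    return None
```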
[0059] In one implementation, casts, bites and/or catches may be detected using one or more buttons 310. To indicate a bite, a user may press a first button 310. To indicate a catch, a user may press a second, different button 310. To indicate a cast, a user may press a third, different button 310. Alternatively, a user may press a button 310 and then quickly release the button 310 to indicate the occurrence of a bite. The user may also press the same button 310 and hold the button 310 down for a longer time to indicate a catch.
[0060] Once a catch is detected, the software may receive further user input corresponding to the weight of the caught fish (block 560). If the software receives further user input, the software may then record the weight of the caught fish (block 560). The record may be a database, a log, or any other method of recording the fishing data. The inputted weight may be shown on a display 350. Then, the software returns to the standby mode (block 520).
[0061] In one implementation, the weight may be entered using one or more buttons 310. A weight may be entered by pushing the one or more buttons 310 a number of times corresponding to the weight of the caught fish. For example, to enter a three pound fish, a button 310 may be pressed three times.
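As a small illustration of this press-counting scheme, the sketch below converts a series of button-press time stamps into a weight in pounds, treating a pause longer than an assumed three-second window as the end of the entry. The window length is an assumption; the disclosure only states that the number of presses corresponds to the weight.

```python
PRESS_WINDOW_SECONDS = 3.0   # assumed pause that ends the weight entry

def weight_from_presses(press_timestamps):
    """Count button presses within the entry window (one press = one pound)."""
    if not press_timestamps:
        return 0
    presses = sorted(press_timestamps)
    count = 1
    for previous, current in zip(presses, presses[1:]):
        if current - previous > PRESS_WINDOW_SECONDS:
            break                 # a long pause ends the weight entry
        count += 1
    return count
```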
[0062] When the trip is over, the software may transmit the recorded data wirelessly to the connected device, e.g., the marine electronics device 600 (block 570). In one implementation, the software may transmit the recorded data after each new entry, or at any other interval. For example, the transmission may be made after each cast. The transmission may be to a remote server or to any computer system, including a smart phone or a marine electronics device.
MARINE ELECTRONICS DEVICE
[0063] Figure 8 illustrates a schematic diagram of a marine electronics device 600 in accordance with various implementations described herein. The components of the marine electronics device 600 are described in more detail with reference to the computing system 700 in Figure 7. The marine electronics device 600 includes a screen 605. In certain implementations, the screen 605 may be sensitive to touching by a finger. In other implementations, the screen 605 may be sensitive to the body heat from the finger or a stylus, or responsive to a mouse. The marine electronics device 600 may display marine electronic data 615. The marine electronic data types 615 may include chart data, radar data, sonar data, steering data, dashboard data, navigation data, fishing statistics, and the like. The marine electronics device 600 may also include a plurality of buttons 620, which may be either physical buttons or virtual buttons, or a combination thereof.
COMPUTING SYSTEM
[0064] Implementations of various technologies described herein may be operational with numerous general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the various technologies described herein include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, smart phones, and the like.
[0065] The various technologies described herein may be implemented in the context of marine electronics, such as devices found in marine vessels and/or navigation systems. Ship instruments and equipment may be connected to the computing systems described herein for executing one or more navigation technologies. As such, the computing systems may be configured to operate using sonar, radar, the global positioning system (GPS) and like technologies.
[0066] The various technologies described herein may be implemented in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Further, each program module may be implemented in its own way, and all need not be implemented the same way. While program modules may all execute on a single computing system, it should be appreciated that, in some implementations, program modules may be implemented on separate computing systems or devices adapted to communicate with one another. A program module may also be some combination of hardware and software where particular tasks performed by the program module may be done either through hardware, software, or both.

[0067] The various technologies described herein may also be implemented in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network, e.g., by hardwired links, wireless links, or combinations thereof. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
[0068] Figure 7 illustrates a schematic diagram of the video recording system 100 having a computing system 700 in accordance with implementations of various techniques described herein. The computing system 700 may represent the video recorder 110, the marine electronics device 600, the wearable device 300, or components spanning multiple devices. The computing system 700 may be a conventional desktop, a handheld device, a controller, a personal digital assistant, a server computer, an electronics device/instrument, a laptop, a tablet, or part of a navigation system or sonar system. It should be noted, however, that other computer system configurations may be used.
[0069] The computing system 700 may include a central processing unit (CPU) 730, a system memory 726, a graphics processing unit (GPU) 731 and a system bus 728 that couples various system components including the system memory 726 to the CPU 730. Although only one CPU 730 is illustrated in Figure 7, it should be understood that in some implementations the computing system 700 may include more than one CPU 730.
[0070] The CPU 730 can include a microprocessor, a microcontroller, a processor, a programmable integrated circuit, or a combination thereof. The CPU 730 can comprise an off-the-shelf processor such as a Reduced Instruction Set Computer (RISC), or a Microprocessor without Interlocked Pipeline Stages (MIPS) processor, or a combination thereof. The CPU 730 may also include a proprietary processor.
[0071] The GPU 731 may be a microprocessor specifically designed to manipulate and implement computer graphics. The CPU 730 may offload work to the GPU 731. The GPU 731 may have its own graphics memory, and/or may have access to a portion of the system memory 726. As with the CPU 730, the GPU 731 may include one or more processing units, and each processing unit may include one or more cores.

[0072] The CPU 730 may provide output data to a GPU 731. The GPU 731 may generate graphical user interfaces that present the output data. The GPU 731 may also provide objects, such as menus, in the graphical user interface. A user may provide inputs by interacting with the objects. The GPU 731 may receive the inputs from interaction with the objects and provide the inputs to the CPU 730. A video adapter 732 may be provided to convert graphical data into signals for a monitor 734. The monitor 734 includes a screen 705. The screen 705 can be sensitive to heat or touching (now collectively referred to as a "touch screen"). In one implementation, the host computer 799 may not include a monitor 734.
[0073] The system bus 728 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus. The system memory 726 may include a read only memory (ROM) 712 and a random access memory (RAM) 716. A basic input/output system (BIOS) 714, containing the basic routines that help transfer information between elements within the computing system 700, such as during start-up, may be stored in the ROM 712.
[0074] The computing system 700 may further include a hard disk drive interface 736 for reading from and writing to a hard disk 750, a memory card reader 752 for reading from and writing to a removable memory card 756, and an optical disk drive 754 for reading from and writing to a removable optical disk 758, such as a CD ROM or other optical media. The hard disk 750, the memory card reader 752, and the optical disk drive 754 may be connected to the system bus 728 by a hard disk drive interface 736, a memory card reader interface 738, and an optical drive interface 740, respectively. The drives and their associated computer-readable media may provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computing system 700.
[0075] Although the computing system 700 is described herein as having a hard disk, a removable memory card 756 and a removable optical disk 758, it should be appreciated by those skilled in the art that the computing system 700 may also include other types of computer-readable media that may be accessed by a computer. For example, such computer-readable media may include computer storage media and communication media. Computer storage media may include volatile and non-volatile, and removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules or other data. Computer storage media may further include RAM, ROM, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other solid state memory technology, CD-ROM, digital versatile disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computing system 700. Communication media may embody computer readable instructions, data structures, program modules or other data in a modulated data signal, such as a carrier wave or other transport mechanism and may include any information delivery media. The term "modulated data signal" may mean a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The computing system 700 may also include a host adapter 733 that connects to a storage device 735 via a small computer system interface (SCSI) bus, a Fiber Channel bus, an eSATA bus, or using any other applicable computer bus interface. The computing system 700 can also be connected to a router 764 to establish a wide area network (WAN) 766 with one or more remote computers 774. The router 764 may be connected to the system bus 728 via a network interface 744. The remote computers 774 can also include hard disks 772 that store application programs 770.
[0076] In another implementation, as discussed in more detail with respect to Figure 2, the computing system 700 may also connect to one or more remote computers 774 via local area network (LAN) 776 or the WAN 766. When using a LAN networking environment, the computing system 700 may be connected to the LAN 776 through the network interface or adapter 744. The LAN 776 may be implemented via a wired connection or a wireless connection. The LAN 776 may be implemented using Wi-Fi technology, cellular technology, or any other implementation known to those skilled in the art. The network interface 744 may also utilize remote access technologies (e.g., Remote Access Service (RAS), Virtual Private Networking (VPN), Secure Socket Layer (SSL), Layer 2 Tunneling (L2T), or any other suitable protocol). These remote access technologies may be implemented in connection with the remote computers 774. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computer systems may be used.
[0077] A number of program modules may be stored on the hard disk 750, memory card 756, optical disk 758, ROM 712 or RAM 716, including an operating system 718, one or more application programs 720, and program data 724. In certain implementations, the hard disk 750 may store a database system. The database system could include, for example, recorded points. The application programs 720 may include various mobile applications ("apps") and other applications configured to perform various methods and techniques described herein. The operating system 718 may be any suitable operating system that may control the operation of a networked personal or server computer.
[0078] A user may enter commands and information into the computing system 700 through input devices such as a keyboard 762 and pointing device. Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, user input button, or the like. These and other input devices may be connected to the CPU 730 through a serial port interface 742 coupled to the system bus 728, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB). A monitor 734 or other type of display device may also be connected to the system bus 728 via an interface, such as a video adapter 732. In addition to the monitor 734, the computing system 700 may further include other peripheral output devices such as speakers and printers.
[0079] While the foregoing is directed to implementations of various techniques described herein, other and further implementations may be devised without departing from the basic scope thereof, which may be determined by the claims that follow.
[0080] Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims

What Is Claimed Is:
1. A non-transitory computer-readable medium having stored thereon a plurality of computer-executable instructions which, when executed by a computer, cause the computer to:
receive a first notification that a first cast has been made;
receive data regarding a video input;
receive a second notification that a second cast has been made; and
delete at least a portion of the data regarding the video input that is associated with the first cast in response to receiving the second notification.
2. The non-transitory computer-readable medium of claim 1, wherein the at least a portion of the data is deleted without deleting the data regarding the video input associated with the second cast.
3. The non-transitory computer-readable medium of claim 1, wherein the computer-executable instructions further cause the computer to:
receive a first time stamp corresponding to the first cast; and
receive a second time stamp corresponding to the second cast; and
wherein the at least a portion of the data is between the first time stamp and the second time stamp.
4. The non-transitory computer-readable medium of claim 1, wherein the data regarding the video input is associated with the first cast by indexing the data regarding the video input to data received after the first notification and before the second notification.
5. The non-transitory computer-readable medium of claim 1, wherein the data regarding a video input comprises data recorded by a video camera.
6. The non-transitory computer-readable medium of claim 1, wherein the first notification is received from a wearable device.
7. The non-transitory computer-readable medium of claim 1, wherein the first notification is received over a wireless connection.
8. The non-transitory computer-readable medium of claim 1, wherein the first notification comprises motion data corresponding to a fishing cast.
9. A non-transitory computer-readable medium having stored thereon a plurality of computer-executable instructions which, when executed by a computer, cause the computer to:
receive a notification that a cast has been made;
receive data regarding a video input associated with the cast;
determine whether a catch has been detected that corresponds to the cast; and
store the data regarding the video input in response to a determination that a catch has been detected.
10. The non-transitory computer-readable medium of claim 9, wherein the data is stored in association with the cast or the catch.
11. The non-transitory computer-readable medium of claim 9, wherein the computer-executable instructions to determine whether a catch has been detected further comprise computer-executable instructions to cause the computer to receive a notification that a catch has been made.
12. The non-transitory computer-readable medium of claim 9, wherein the notification is received from a wearable device.
13. The non-transitory computer-readable medium of claim 9, wherein the notification is received over a wireless connection.
14. The non-transitory computer-readable medium of claim 9, wherein the notification comprises motion data corresponding to a fishing cast.
15. A non-transitory computer-readable medium having stored thereon a plurality of computer-executable instructions which, when executed by a computer, cause the computer to:
receive a notification that a first cast has been made;
receive data regarding a video input;
receive a notification that a button has been pressed, wherein the button is associated with a second cast or a catch being made; and
if the button is associated with the second cast, delete at least a portion of the data regarding the video input that is associated with the first cast.
16. The non-transitory computer-readable medium of claim 15, wherein the computer-executable instructions further cause the computer to store the at least a portion of the data regarding the video input if the button is associated with the catch being made.
17. The non-transitory computer-readable medium of claim 15, wherein the button is a virtual button located in a graphical user display.
18. The non-transitory computer-readable medium of claim 15, wherein the button is located on a wearable device.
19. The non-transitory computer-readable medium of claim 15, wherein the at least a portion of the data is deleted without deleting the data regarding the video input associated with the second cast.
20. The non-transitory computer-readable medium of claim 15, wherein the computer-executable instructions further cause the computer to:
receive a first time stamp corresponding to the first cast; and
receive a second time stamp corresponding to the button being pressed; and
wherein the at least a portion of the data is between the first time stamp and the second time stamp.
PCT/IB2014/063978 2013-08-21 2014-08-20 Video recording system and methods WO2015025274A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361868444P 2013-08-21 2013-08-21
US61/868,444 2013-08-21
US14/259,052 US9572335B2 (en) 2013-08-21 2014-04-22 Video recording system and methods
US14/259,052 2014-04-22

Publications (1)

Publication Number Publication Date
WO2015025274A1 true WO2015025274A1 (en) 2015-02-26

Family

ID=52479853

Family Applications (11)

Application Number Title Priority Date Filing Date
PCT/IB2014/063982 WO2015025278A1 (en) 2013-08-21 2014-08-20 Fishing suggestions
PCT/IB2014/063979 WO2015025275A1 (en) 2013-08-21 2014-08-20 Fishing data sharing and display
PCT/IB2014/063973 WO2015025270A1 (en) 2013-08-21 2014-08-20 Wearable device for fishing
PCT/IB2014/063980 WO2015025276A1 (en) 2013-08-21 2014-08-20 Analyzing marine trip data
PCT/IB2014/063975 WO2015025272A1 (en) 2013-08-21 2014-08-20 Display of automatically recorded fishing statistics
PCT/IB2014/063977 WO2015028918A1 (en) 2013-08-21 2014-08-20 Fishing and sailing activity detection
PCT/IB2014/063983 WO2015025279A2 (en) 2013-08-21 2014-08-20 Motion capture while fishing
PCT/IB2014/063978 WO2015025274A1 (en) 2013-08-21 2014-08-20 Video recording system and methods
PCT/IB2014/063976 WO2015025273A1 (en) 2013-08-21 2014-08-20 Usage data for marine electronics device
PCT/IB2014/063981 WO2015025277A1 (en) 2013-08-21 2014-08-20 Controlling marine electronics device
PCT/IB2014/063974 WO2015025271A1 (en) 2013-08-21 2014-08-20 Fishing statistics display

Country Status (5)

Country Link
US (11) US10251382B2 (en)
EP (1) EP3035793B1 (en)
AU (1) AU2014310326B2 (en)
CA (1) CA2921317C (en)
WO (11) WO2015025278A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI761900B (en) * 2019-08-07 2022-04-21 日商鐘化股份有限公司 Large-scale film-forming substrate and manufacturing method thereof, segmented film-forming substrate and manufacturing method thereof, production management method and production management system of segmented film-forming substrate

Families Citing this family (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160278360A1 (en) * 2012-06-18 2016-09-29 Spfm, L.P. Systems and methods for monitoring and communicating fishing data
US20150316409A1 (en) * 2012-06-18 2015-11-05 Spfm, L.P. Systems and methods for obtaining and transmitting a fish weight
CN102866402B (en) * 2012-08-22 2014-12-10 深圳市福锐达科技有限公司 Wireless fidelity (WIFI)-based wireless hydrological regime detection system and method
US9507562B2 (en) 2013-08-21 2016-11-29 Navico Holding As Using voice recognition for recording events
US10251382B2 (en) 2013-08-21 2019-04-09 Navico Holding As Wearable device for fishing
US10417900B2 (en) * 2013-12-26 2019-09-17 Intel Corporation Techniques for detecting sensor inputs on a wearable wireless device
US10360907B2 (en) 2014-01-14 2019-07-23 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10248856B2 (en) 2014-01-14 2019-04-02 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10024679B2 (en) 2014-01-14 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9915545B2 (en) * 2014-01-14 2018-03-13 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9476758B2 (en) * 2014-04-14 2016-10-25 Robert A. Jones Handheld devices and methods for acquiring object data
US10970352B1 (en) * 2014-07-02 2021-04-06 Google Llc Selecting content for co-located devices
US20160012401A1 (en) * 2014-07-08 2016-01-14 Navico Holding As Methods for Discovering and Purchasing Content for Marine Electronics Device
US9922236B2 (en) 2014-09-17 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
US10024678B2 (en) 2014-09-17 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable clip for providing social and environmental awareness
US9267804B1 (en) 2014-09-24 2016-02-23 Navico Holding As Forward depth display
US10257651B1 (en) * 2015-01-27 2019-04-09 Riley THOMPSON Mobile electronic device for identifying and recording an animal harvest
US10490102B2 (en) 2015-02-10 2019-11-26 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for braille assistance
EP3056953A1 (en) * 2015-02-11 2016-08-17 Siemens Aktiengesellschaft Self-contained field device used in automation technology for remote monitoring
US10025312B2 (en) 2015-02-20 2018-07-17 Navico Holding As Multiple autopilot interface
US9594374B2 (en) * 2015-02-26 2017-03-14 Navico Holding As Operating multiple autopilots
US9972216B2 (en) 2015-03-20 2018-05-15 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for storing and playback of information for blind users
CN104748745B (en) * 2015-04-16 2018-01-19 无锡职业技术学院 Intelligence obtains optimal fishing position system
US9594375B2 (en) 2015-05-14 2017-03-14 Navico Holding As Heading control using multiple autopilots
US9836129B2 (en) 2015-08-06 2017-12-05 Navico Holding As Using motion sensing for controlling a display
WO2017042803A1 (en) * 2015-09-10 2017-03-16 Agt International Gmbh Method and device for identifying and analyzing spectator sentiment
US20170095199A1 (en) * 2015-10-01 2017-04-06 V1bes, Inc. Biosignal measurement, analysis and neurostimulation
CN105338399A (en) * 2015-10-29 2016-02-17 小米科技有限责任公司 Image acquisition method and device
US20170168800A1 (en) * 2015-12-10 2017-06-15 Navico Holding As Reporting Marine Electronics Data and Performing Software Updates on Marine Electronic Peripheral Devices
CA3009487A1 (en) 2015-12-21 2017-06-29 Angler Labs Inc. An on-line multiplayer gaming system for real world angling events
US10024680B2 (en) 2016-03-11 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Step based guidance system
US10709955B2 (en) * 2016-03-15 2020-07-14 Nike, Inc. Athletic data aggregation for online communities
US9928611B2 (en) * 2016-05-10 2018-03-27 Navico Holding As Systems and associated methods for measuring the length of a fish
US10275901B2 (en) 2016-05-10 2019-04-30 Navico Holding As Systems and associated methods for measuring the length of a fish
US9958275B2 (en) 2016-05-31 2018-05-01 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for wearable smart device communications
US10365366B2 (en) 2016-06-21 2019-07-30 Navico Holding As Adjustable range viewing of sonar imagery during trip replay
US10561519B2 (en) 2016-07-20 2020-02-18 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device having a curved back to reduce pressure on vertebrae
US11262850B2 (en) * 2016-07-20 2022-03-01 Autodesk, Inc. No-handed smartwatch interaction techniques
JP6425691B2 (en) * 2016-07-22 2018-11-21 株式会社わびすけ Fish fishing direct sales system
US9986197B2 (en) * 2016-08-24 2018-05-29 Navico Holding As Trip replay experience
US10948577B2 (en) 2016-08-25 2021-03-16 Navico Holding As Systems and associated methods for generating a fish activity report based on aggregated marine data
CN106372020A (en) * 2016-08-27 2017-02-01 天津大学 High-speed long-distance transmission system of marine seismic data
US10223937B2 (en) * 2016-10-27 2019-03-05 Jesse Pacchione Wearable sport fishing system and method
US10432851B2 (en) 2016-10-28 2019-10-01 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device for detecting photography
USD827143S1 (en) 2016-11-07 2018-08-28 Toyota Motor Engineering & Manufacturing North America, Inc. Blind aid device
US10012505B2 (en) 2016-11-11 2018-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable system for providing walking directions
US10521669B2 (en) 2016-11-14 2019-12-31 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing guidance or feedback to a user
CN106689081B (en) * 2016-11-28 2020-11-10 南京暴走团电子商务有限公司 Night fishing hand ring system
CN108984553B (en) * 2017-06-01 2022-02-01 北京京东尚科信息技术有限公司 Caching method and device
US10990622B2 (en) * 2017-06-20 2021-04-27 Navico Holding As Livewell operation and control for a vessel
EP3655913A4 (en) * 2017-07-15 2021-03-03 Fishing Chaos, Inc. System and method for measuring and sharing marine activity information
US10599922B2 (en) * 2018-01-25 2020-03-24 X Development Llc Fish biomass, shape, and size determination
US10395114B1 (en) * 2018-04-20 2019-08-27 Surfline\Wavetrak, Inc. Automated detection of features and/or parameters within an ocean environment using image data
US10534967B2 (en) 2018-05-03 2020-01-14 X Development Llc Fish measurement station keeping
US11229194B2 (en) * 2018-07-06 2022-01-25 Happy Medium, Llc Fly rod including cast sensors
US20210325679A1 (en) * 2018-08-22 2021-10-21 Robert Layne System for augmenting fishing data and method
US11475689B2 (en) 2020-01-06 2022-10-18 X Development Llc Fish biomass, shape, size, or health determination
US11703866B2 (en) 2020-02-14 2023-07-18 Navico, Inc. Systems and methods for controlling operations of marine vessels
JP7386141B2 (en) * 2020-08-27 2023-11-24 グローブライド株式会社 Fishing information management system
US11796661B2 (en) 2021-05-21 2023-10-24 Navico, Inc. Orientation device for marine sonar systems
US11760457B2 (en) 2021-07-09 2023-09-19 Navico, Inc. Trolling motor foot pedal controlled sonar device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5546695A (en) * 1993-07-13 1996-08-20 Langer; Alexander G. Fishing line and reel
US6263147B1 (en) * 1996-02-28 2001-07-17 Sun Microsystems, Inc. Delayed decision recording device
US6584722B1 (en) * 2001-04-18 2003-07-01 Peter L. Walls Fishing information device and method of using same
JP2004207812A (en) * 2002-12-24 2004-07-22 Sanyo Electric Co Ltd Video camera apparatus
JP2010193284A (en) * 2009-02-19 2010-09-02 Nippon Logics Kk Underwater imaging apparatus
US20120144723A1 (en) * 2006-03-21 2012-06-14 Davidson Kent G Fishing Lure For Implementing A Fishing Contest

Family Cites Families (169)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4829493A (en) 1985-06-14 1989-05-09 Techsonic Industries, Inc. Sonar fish and bottom finder and display
IS3079A7 (en) 1986-02-19 1986-10-30 ævarr Jonsson Sigurbjörn Automatic fishing line shredder. Line angler, angular seam, trajectory and cane for sharp and unlined line.
US5191341A (en) 1987-12-01 1993-03-02 Federation Francaise De Voile System for sea navigation or traffic control/assistance
US4879697A (en) 1988-08-05 1989-11-07 Lowrance Electronics, Inc. Sonar fish finder apparatus providing split-screen display
US5025423A (en) 1989-12-21 1991-06-18 At&T Bell Laboratories Enhanced bottom sonar system
GB2244195A (en) * 1990-05-01 1991-11-27 Peter John Forward Bite detectors
GB9210262D0 (en) * 1992-05-13 1992-07-01 Fox Design Int Fish-bite indicators
US5228228A (en) * 1992-06-19 1993-07-20 Meissner Garry D Bite detector for fishing
US5446775A (en) * 1993-12-20 1995-08-29 Wright; Larry A. Motion detector and counter
US6321158B1 (en) 1994-06-24 2001-11-20 Delorme Publishing Company Integrated routing/mapping information
GB9512887D0 (en) * 1995-06-23 1995-08-23 Fox Design Int A fish-bite indicator
US5537380A (en) 1995-07-14 1996-07-16 Lowrance Electronics, Inc. Sonar system having an interactive sonar viewing apparatus and method of configuring same
US6045076A (en) 1995-08-03 2000-04-04 Daniels; John J. Fishing reel with electronic antibacklashing features dependent on a sensed line condition
US6708441B2 (en) 1996-04-29 2004-03-23 Anthony Richard Dirito Fish bite detector
JP2001503610A (en) 1996-07-12 2001-03-21 ランス スティーア,スティーブン Indicator device for fishing
US20040124297A1 (en) * 1996-07-12 2004-07-01 Steer Steven Lance Fishing indicator device
US6222449B1 (en) 1997-07-21 2001-04-24 Ronald F. Twining Remote fish logging unit
US6252544B1 (en) * 1998-01-27 2001-06-26 Steven M. Hoffberg Mobile communication device
US6225984B1 (en) 1998-05-01 2001-05-01 Hitachi Micro Systems, Inc. Remote computer interface
US8266266B2 (en) 1998-12-08 2012-09-11 Nomadix, Inc. Systems and methods for providing dynamic network authorization, authentication and accounting
US6567792B1 (en) * 1999-03-02 2003-05-20 Thristle Marine, Llc Method of data collection for fisheries management
US6388688B1 (en) 1999-04-06 2002-05-14 Vergics Corporation Graph-based visual navigation through spatial environments
US6411283B1 (en) 1999-05-20 2002-06-25 Micron Technology, Inc. Computer touch screen adapted to facilitate selection of features at edge of screen
US6125571A (en) * 1999-10-04 2000-10-03 Sigwald; Darren H. Fishing rod holder and bite detector
AUPQ453299A0 (en) 1999-12-08 2000-01-06 Advanced Marine Technologies Pty Ltd A system for fishing
JP4124944B2 (en) 2000-03-31 2008-07-23 古野電気株式会社 Underwater detector
US20030147981A1 (en) 2000-05-23 2003-08-07 Jan Gillam Method of determining a dosage of anti-oxidant for an individual
US6421299B1 (en) 2000-06-28 2002-07-16 Techsonic Industries, Inc. Single-transmit, dual-receive sonar
FR2813684B1 (en) 2000-09-04 2003-01-10 Jean Baptiste Dumas DEVICE FOR EXPORTING DATA FROM A DATABASE TO A COMPUTER TERMINAL AND ASSOCIATED METHOD
US6768942B1 (en) 2000-09-18 2004-07-27 Navigation Technologies Corp. Navigation system with decryption functions and secure geographic database
US7319992B2 (en) 2000-09-25 2008-01-15 The Mission Corporation Method and apparatus for delivering a virtual reality environment
US20020099457A1 (en) 2001-01-25 2002-07-25 Fredlund John R. System and method for representing an activity involving a route along a plurality of locations
US20020116421A1 (en) 2001-02-17 2002-08-22 Fox Harold L. Method and system for page-like display, formating and processing of computer generated information on networked computers
US6751626B2 (en) 2001-05-03 2004-06-15 International Business Machines Corporation Method, system, and program for mining data in a personal information manager database
US7002579B2 (en) 2001-05-09 2006-02-21 Cadec Corporation Split screen GPS and electronic tachograph
US6459372B1 (en) 2001-05-29 2002-10-01 Devin Branham Hand-held computer for identifying hunting and fishing areas and displaying controlling regulations pertaining thereto
US6761692B2 (en) 2001-06-25 2004-07-13 Eagle Ultrasound As High frequency and multi frequency band ultrasound transducers based on ceramic films
US6938367B2 (en) * 2001-08-25 2005-09-06 Michael James Cameron Tension measured fishing line bite detector alarm
US6587740B2 (en) 2001-10-10 2003-07-01 Anglerschannel.Com, Llc System, method and computer program product for determining an angler rating
US6574554B1 (en) 2001-12-11 2003-06-03 Garmin Ltd. System and method for calculating a navigation route based on non-contiguous cartographic map databases
US7243457B1 (en) 2002-09-16 2007-07-17 W. C. Bradley/Zebco Holdings, Inc. Method and system for selecting optimal fishing equipment
US6816782B1 (en) 2002-10-10 2004-11-09 Garmin Ltd. Apparatus, systems and methods for navigation data transfer between portable devices
ZA200308052B (en) 2002-10-17 2004-07-08 Heerden Jaco Van Fishing data management method and system.
US6798378B1 (en) 2002-11-22 2004-09-28 Garmin Ltd. Device and method for displaying track characteristics
US20040198554A1 (en) 2002-12-20 2004-10-07 Orr Joseph A. Encoded metabolic data for transfer and subsequent use, methods of use, and apparatus for using the encoded metabolic data
US7321824B1 (en) 2002-12-30 2008-01-22 Aol Llc Presenting a travel route using more than one presentation style
US20040162830A1 (en) 2003-02-18 2004-08-19 Sanika Shirwadkar Method and system for searching location based information on a mobile device
US20040249860A1 (en) 2003-03-26 2004-12-09 Stechschulte Theodore J. Apparatus for collecting, storing and transmitting fishing information
US20070011334A1 (en) 2003-11-03 2007-01-11 Steven Higgins Methods and apparatuses to provide composite applications
JP4535742B2 (en) 2004-02-03 2010-09-01 株式会社シマノ Fishing reel, fishing information display device and fishing information display system
US8063540B2 (en) 2004-03-08 2011-11-22 Emantec As High frequency ultrasound transducers based on ceramic films
US7289390B2 (en) 2004-07-19 2007-10-30 Furuno Electric Company, Limited Ultrasonic transmitting/receiving apparatus and scanning sonar employing same
US7652952B2 (en) 2004-08-02 2010-01-26 Johnson Outdoors Inc. Sonar imaging system for mounting to watercraft
US20060053028A1 (en) 2004-09-03 2006-03-09 Congel Robert J Methods and systems for generating and collecting real-time experiential feedback on the operation of fishing equipment
US20060048434A1 (en) 2004-09-03 2006-03-09 Congel Robert J Methods and systems for developing and deploying a realistic, virtual fishing experience which provides opportunities for learning to fish and instantaneous experience measurement
US7236426B2 (en) 2004-09-08 2007-06-26 Lowrance Electronics, Inc. Integrated mapping and audio systems
US7430461B1 (en) 2004-10-18 2008-09-30 Navico International Limited Networking method and network for marine navigation devices
US20060095393A1 (en) 2004-10-24 2006-05-04 Vinsant Christopher M Pattern Build Software System
JP4849797B2 (en) 2004-12-03 2012-01-11 喜代志 伊藤 Fishing ground prediction device
US20060119585A1 (en) 2004-12-07 2006-06-08 Skinner David N Remote control with touchpad and method
DE102004059619A1 (en) 2004-12-10 2006-06-14 Siemens Ag Electronic device e.g. movement sensor unit, displaying and operating system for e.g. sports boat, has output interface over which information of engine data and sensor units is transferred to indicating device for graphical display
WO2006096773A2 (en) 2005-03-07 2006-09-14 Networks In Motion, Inc. Method and system for identifying and defining geofences
US7818280B2 (en) * 2005-03-28 2010-10-19 National University Corporation Tokyo University Of Marine Science And Technology Method for predicting depth distribution of predetermined water temperature zone, method and system for delivering fishing ground prediction information of migratory fish
US7434155B2 (en) 2005-04-04 2008-10-07 Leitch Technology, Inc. Icon bar display for video editing system
US20060265931A1 (en) 2005-05-24 2006-11-30 Steve Mcfadden Fish bite/strike alarm rod holder attachment
GB2426680A (en) * 2005-06-01 2006-12-06 Fox Int Group Ltd A fish-bite detector
US7809175B2 (en) 2005-07-01 2010-10-05 Hologic, Inc. Displaying and navigating computer-aided detection results on a review workstation
WO2007021747A2 (en) 2005-08-09 2007-02-22 Medexec Software Corporation A workflow oriented multi-screen healthcare information management system
US7173197B1 (en) 2005-08-30 2007-02-06 Adstracts, Inc. Handheld fish measuring apparatus with integral camera
US7447116B2 (en) 2005-09-09 2008-11-04 Ionetrics, Inc Dynamic underwater position tracking of small objects
FI20055544L (en) 2005-10-07 2007-04-08 Polar Electro Oy Procedures, performance meters and computer programs for determining performance
GB2416976B (en) 2005-11-04 2006-09-06 Mark Brown Fish bite alarm
US7669360B2 (en) * 2006-03-21 2010-03-02 Davidson Kent G Fishing system
US7930921B2 (en) 2006-03-31 2011-04-26 Ssd Company Limited Impact detector and controller for pseudoexperience device
WO2007115411A1 (en) 2006-04-12 2007-10-18 Craig Summers Navigational planning and display method for the sailor's dilemma when heading upwind
US7890867B1 (en) 2006-06-07 2011-02-15 Adobe Systems Incorporated Video editing functions displayed on or near video sequences
US20080032666A1 (en) 2006-08-07 2008-02-07 Microsoft Corporation Location based notification services
GB2441802A (en) 2006-09-13 2008-03-19 Marine & Remote Sensing Soluti Safety system for a vehicle
US20080126935A1 (en) 2006-11-08 2008-05-29 Ross James Blomgren Audio Visual Entertainment System and Method of Operation
US8428583B2 (en) 2006-12-21 2013-04-23 Nokia Corporation Managing subscriber information
US7671756B2 (en) 2007-01-07 2010-03-02 Apple Inc. Portable electronic device with alert silencing
KR101450584B1 (en) 2007-02-22 2014-10-14 삼성전자주식회사 Method for displaying screen in terminal
WO2008109778A2 (en) 2007-03-06 2008-09-12 Sailormade Tecnologia Ltd Marine telemetry and two way communication system
US8040758B1 (en) 2007-05-01 2011-10-18 Physi-Cal Enterprises Lp Golf watch having heart rate monitoring for improved golf game
US7722218B2 (en) 2007-06-14 2010-05-25 Wing Fai Leung Method of and device for attracting aquatic life forms using an electromagnetic field generation
WO2009024971A2 (en) 2007-08-19 2009-02-26 Saar Shai Finger-worn devices and related methods of use
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US8082100B2 (en) 2007-10-19 2011-12-20 Grace Ted V Watercraft automation and aquatic effort data utilization
WO2009055918A1 (en) 2007-11-02 2009-05-07 Marport Canada Inc. System and method for underwater seismic data acquisition
US7869608B2 (en) 2008-01-14 2011-01-11 Apple Inc. Electronic device accessory
EP3432656B1 (en) 2008-01-30 2021-04-14 Google LLC Notification of mobile device events
US7812667B2 (en) 2008-03-10 2010-10-12 Qualcomm Incorporated System and method of enabling a signal processing device in a relatively fast manner to process a low duty cycle signal
US20090231190A1 (en) 2008-03-12 2009-09-17 Grumbles Ernest W Electronic Tracking of Land Use Activities
US20090258710A1 (en) 2008-04-09 2009-10-15 Nike, Inc. System and method for athletic performance race
DE102008024770B4 (en) * 2008-05-23 2010-09-23 Miškatović, Željko Electronic bite indicator for fishing fish
FI123258B (en) 2008-05-28 2013-01-15 Liquid Zone Oy Electronic fishing gear and associated system, method and use
TW200949751A (en) 2008-05-30 2009-12-01 Hsin-Chi Su Maritime climate information system and method for collecting and processing maritime climate information
US8305844B2 (en) 2008-08-07 2012-11-06 Depasqua Louis Sonar navigation system and method
US8589114B2 (en) * 2008-08-19 2013-11-19 Angelo Gregory Papadourakis Motion capture and analysis
US20100121716A1 (en) 2008-11-12 2010-05-13 Jonathan Golan Activity-based targeted advertising
US8503932B2 (en) 2008-11-14 2013-08-06 Sony Mobile Communications AB Portable communication device and remote motion input device
US8682576B2 (en) 2008-12-04 2014-03-25 Verizon Patent And Licensing Inc. Navigation based on user-defined points and paths
US20100198650A1 (en) 2009-01-23 2010-08-05 Mark Shaw Method of providing game tracking data
US7870496B1 (en) 2009-01-29 2011-01-11 Jahanzeb Ahmed Sherwani System using touchscreen user interface of a mobile device to remotely control a host computer
US8601401B2 (en) 2009-01-30 2013-12-03 Navico Holding As Method, apparatus and computer program product for synchronizing cursor events
JP5356874B2 (en) 2009-03-26 2013-12-04 古野電気株式会社 Sailing assistance device
US8619029B2 (en) 2009-05-22 2013-12-31 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting consecutive gestures
GB2470904B (en) 2009-06-08 2012-03-21 Matthew Hazley Angling data gathering and processing
US20100319235A1 (en) 2009-06-18 2010-12-23 Panaro Miles R Remote fishing system
US8300499B2 (en) 2009-07-14 2012-10-30 Navico, Inc. Linear and circular downscan imaging sonar
US8305840B2 (en) 2009-07-14 2012-11-06 Navico, Inc. Downscan imaging sonar
US9439736B2 (en) 2009-07-22 2016-09-13 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for controlling a remote medical device guidance system in three-dimensions using gestures
US8643508B2 (en) 2009-07-26 2014-02-04 Aspen Avionics, Inc. Avionics device, systems and methods of display
KR101451999B1 (en) 2009-07-28 2014-10-21 삼성전자주식회사 Data scroll method and apparatus
JP5419622B2 (en) 2009-10-01 2014-02-19 古野電気株式会社 Ship display device
WO2011069131A1 (en) 2009-12-04 2011-06-09 Google Inc. Presenting real time search results
JP5566700B2 (en) 2010-01-06 2014-08-06 株式会社シマノ Fishing information display device
GB2477933A (en) * 2010-02-17 2011-08-24 Catchum 88 Ltd Bite Indicator
US20110208479A1 (en) 2010-02-23 2011-08-25 Chaves Antonio E Angling data logging apparatus with a horizontal measuring weighing platform
US8731748B2 (en) 2010-02-26 2014-05-20 Strategic Fishing Systems, Llc Predictive mapping system for anglers
CA2792165C (en) 2010-03-05 2015-01-27 Ysi Incorporated Underwater sensor apparatus
US8634975B2 (en) 2010-04-16 2014-01-21 The Boeing Company Vessel performance optimization reporting tool
US8972903B2 (en) 2010-07-08 2015-03-03 Apple Inc. Using gesture to navigate hierarchically ordered user interface screens
FI20105796A0 (en) 2010-07-12 2010-07-12 Polar Electro Oy Analysis of a physiological condition for a cardio exercise
US20120047790A1 (en) 2010-08-24 2012-03-01 Videooptx, Llc Fishing lure with video camera
US8665668B2 (en) 2010-09-17 2014-03-04 Vivid Engineering, Inc. Ultrasonic distance measurement controller
US20120095978A1 (en) 2010-10-14 2012-04-19 Iac Search & Media, Inc. Related item usage for matching questions to experts
US8814754B2 (en) 2010-11-01 2014-08-26 Nike, Inc. Wearable device having athletic functionality
GB201018444D0 (en) * 2010-11-02 2010-12-15 Carroll John E G3 trio/twin
KR101164999B1 (en) 2010-12-07 2012-07-13 주식회사에이메일 System for offering service information respond of mobile application analysis and method therefor
CA2824465C (en) 2011-01-18 2018-08-21 Savant Systems, Llc Remote control interface providing head-up operation and visual feedback
US8468164B1 (en) 2011-03-09 2013-06-18 Amazon Technologies, Inc. Personalized recommendations based on related users
US8452797B1 (en) 2011-03-09 2013-05-28 Amazon Technologies, Inc. Personalized recommendations based on item usage
WO2012135059A2 (en) 2011-03-25 2012-10-04 Zoll Medical Corporation System and method for adapting alarms in a wearable medical device
CN107506249B (en) 2011-06-05 2021-02-12 苹果公司 System and method for displaying notifications received from multiple applications
US20120316932A1 (en) 2011-06-10 2012-12-13 Aliphcom Wellness application for data-capable band
US20120316456A1 (en) 2011-06-10 2012-12-13 Aliphcom Sensory user interface
US20120316458A1 (en) 2011-06-11 2012-12-13 Aliphcom, Inc. Data-capable band for medical diagnosis, monitoring, and treatment
WO2012170163A1 (en) 2011-06-10 2012-12-13 Aliphcom Media device, application, and content management using sensory input
US8721453B2 (en) * 2011-08-09 2014-05-13 G-Tracking, Llc Virtual activities that incorporate a physical activity
US20130074051A1 (en) 2011-09-20 2013-03-21 National Ict Australia Limited Tracking and analysis of usage of a software product
US20130097037A1 (en) 2011-10-17 2013-04-18 Johnson Controls Technology Company Battery selection and feedback system and method
US9182482B2 (en) 2011-10-25 2015-11-10 Navico Holding As Radar beam sharpening system and method
US20130107031A1 (en) 2011-11-01 2013-05-02 Robert Emmett Atkinson Underwater Digital Video Camera Recorder for Fishing
US20130109997A1 (en) 2011-11-02 2013-05-02 Peter Linke System for monitoring biological data
EP2613223A1 (en) 2012-01-09 2013-07-10 Softkinetic Software System and method for enhanced gesture-based interaction
US9718530B2 (en) 2012-04-17 2017-08-01 Garmin Switzerland Gmbh Marine vessel display system
US20150316409A1 (en) 2012-06-18 2015-11-05 Spfm, L.P. Systems and methods for obtaining and transmitting a fish weight
JP6014382B2 (en) 2012-06-20 2016-10-25 古野電気株式会社 Underwater detection device, underwater display system, program, and underwater display method
US9042971B2 (en) 2012-06-22 2015-05-26 Fitbit, Inc. Biometric monitoring device with heart rate measurement activated by a single user-gesture
KR101972955B1 (en) 2012-07-03 2019-04-26 삼성전자 주식회사 Method and apparatus for connecting service between user devices using voice
US8934318B2 (en) * 2012-07-18 2015-01-13 Reelsonar, Inc. System and method for fish finding using a sonar device and a remote computing device
US8930294B2 (en) 2012-07-30 2015-01-06 Hewlett-Packard Development Company, L.P. Predicting user activity based on usage data received from client devices
US8884903B2 (en) 2012-09-10 2014-11-11 Furuno Electric Co., Ltd. Remote controller for multiple navigation devices
WO2014063160A1 (en) 2012-10-19 2014-04-24 Basis Science, Inc. Detection of emotional states
US9334030B2 (en) 2012-10-22 2016-05-10 Electronics And Telecommunications Research Institute Method and system for managing traffic considering GPS jamming
US20140135592A1 (en) 2012-11-13 2014-05-15 Dacadoo Ag Health band
US9524515B2 (en) * 2012-12-06 2016-12-20 Fishbrain AB Method and system for logging and processing data relating to an activity
US9135826B2 (en) 2012-12-26 2015-09-15 Sap Se Complex event processing for moving objects
US20140195297A1 (en) 2013-01-04 2014-07-10 International Business Machines Corporation Analysis of usage patterns and upgrade recommendations
US20140221854A1 (en) 2013-01-08 2014-08-07 National Electronics and Watch Company Measuring device, including a heart rate sensor, configured to be worn on the wrist of a user
US20140358483A1 (en) 2013-05-31 2014-12-04 Pure Fishing, Inc. Fishing Data System
EP3003149A4 (en) 2013-06-03 2017-06-14 Kacyvenski, Isaiah Motion sensor and analysis
US10723487B2 (en) 2013-08-15 2020-07-28 The Boeing Company Systems and methods for detecting aircraft maintenance events and maintenance intervals
US9507562B2 (en) * 2013-08-21 2016-11-29 Navico Holding As Using voice recognition for recording events
US10251382B2 (en) 2013-08-21 2019-04-09 Navico Holding As Wearable device for fishing
US20150342481A1 (en) 2014-05-30 2015-12-03 Microsoft Corporation Motion compensation for optical heart rate sensors
TWI531330B (en) 2014-10-03 2016-05-01 中傳企業股份有限公司 Zipper head assembly structure for increasing frictional resistance and sliding member thereof
WO2016073363A1 (en) 2014-11-03 2016-05-12 Motion Insight LLC Motion tracking wearable element and system

Also Published As

Publication number Publication date
US20150054655A1 (en) 2015-02-26
US20150058237A1 (en) 2015-02-26
US20150058323A1 (en) 2015-02-26
WO2015025279A2 (en) 2015-02-26
US9615562B2 (en) 2017-04-11
WO2015025277A1 (en) 2015-02-26
WO2015025272A1 (en) 2015-02-26
EP3035793B1 (en) 2020-09-23
US20150057929A1 (en) 2015-02-26
US10251382B2 (en) 2019-04-09
US20150054828A1 (en) 2015-02-26
US20150055827A1 (en) 2015-02-26
US20150055930A1 (en) 2015-02-26
WO2015028918A1 (en) 2015-03-05
AU2014310326B2 (en) 2017-03-30
EP3035793A1 (en) 2016-06-29
AU2014310326A1 (en) 2016-03-10
WO2015025270A1 (en) 2015-02-26
WO2015025279A3 (en) 2015-05-28
US10383322B2 (en) 2019-08-20
US9596839B2 (en) 2017-03-21
WO2015025278A1 (en) 2015-02-26
WO2015025276A1 (en) 2015-02-26
US20150057965A1 (en) 2015-02-26
WO2015025275A1 (en) 2015-02-26
US9572335B2 (en) 2017-02-21
WO2015025273A1 (en) 2015-02-26
WO2015025271A1 (en) 2015-02-26
US9992987B2 (en) 2018-06-12
CA2921317C (en) 2018-07-10
US9439411B2 (en) 2016-09-13
US20150057968A1 (en) 2015-02-26
US10952420B2 (en) 2021-03-23
CA2921317A1 (en) 2015-02-26
US20150054732A1 (en) 2015-02-26
US20150054829A1 (en) 2015-02-26

Similar Documents

Publication Title
US9572335B2 (en) Video recording system and methods
US9507562B2 (en) Using voice recognition for recording events
US20160013998A1 (en) Collecting and Uploading Data from Marine Electronics Device
KR20170097414A (en) Electronic device and operating method thereof
US9442956B2 (en) Waypoints generation systems and methods
EP3090533A1 (en) Domain aware camera system
US10460743B2 (en) Low-power convenient system for capturing a sound
CN111078523A (en) Log obtaining method and device, storage medium and electronic equipment
US9728013B2 (en) Engine detection
US9916418B2 (en) Uploading measurement data of non-connected medical measuring devices

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14787265

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 14787265

Country of ref document: EP

Kind code of ref document: A1