US20050061976A1 - Method and system for displaying an image - Google Patents

Method and system for displaying an image

Info

Publication number
US20050061976A1
US20050061976A1 (application US10/980,513)
Authority
US
United States
Prior art keywords
image
display
camera unit
fov
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/980,513
Inventor
Alexander Kormos
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
L3 Technologies Inc
Original Assignee
Raytheon Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Raytheon Co filed Critical Raytheon Co
Priority to US10/980,513 priority Critical patent/US20050061976A1/en
Publication of US20050061976A1 publication Critical patent/US20050061976A1/en
Assigned to L-3 COMMUNICATIONS CORPORATION reassignment L-3 COMMUNICATIONS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RAYTHEON COMPANY
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera


Abstract

There is disclosed a method and apparatus for displaying an image that include selecting a camera unit horizontal field of view (FOV) of about eighteen degrees and selecting a system magnification of less than 1.0. The method and apparatus also include determining an aspect ratio for the image based on the selected camera unit horizontal FOV and the selected system magnification, receiving energy from a scene for forming the image, and displaying the image on a display.

Description

    RELATED APPLICATIONS
  • The present application is a continuation of U.S. Ser. No. 10/163,343 filed Jun. 5, 2002, entitled Method and System for Displaying an Image, now U.S. Pat. No. 6,815,680.
  • TECHNICAL FIELD OF THE INVENTION
  • This invention relates generally to vision systems and, more particularly, to a method and system for displaying an image.
  • BACKGROUND OF THE INVENTION
  • During daylight hours, the driver of a vehicle is able to detect and recognize objects that would be difficult, if not impossible, to detect or recognize at night. For example, on a sunny day, a deer approximately 500 meters ahead of a vehicle should be readily detectable and recognizable. At night, however, particularly when the headlights provide the only illumination, the deer will not be detectable, much less recognizable, at that distance because it will be beyond the range of the headlights. Moreover, by the time the driver detects the deer, and well before recognizing what it is, the vehicle will be much closer to the deer than during daylight. Accordingly, the risk of a resulting accident is much higher at night than during the day.
  • Consequently, in order to reduce the risk of accidents, night vision systems have been developed to supplement the driver's vision. One example of a night vision system is described in U.S. Pat. No. 5,781,243, entitled “Display Optimization for Night Vision Enhancement Systems.” Some night vision systems include an infrared camera unit mounted in the grill of the vehicle and an image source mounted in the vehicle's dashboard. The camera unit gathers information regarding the scene in front of the vehicle, and the image source projects an image derived from the information onto the windshield for display.
  • Using the windshield for image display, however, has several drawbacks. For example, the illumination of the image may be poor because a large amount of light is lost due to refraction. As another example, the image may be distorted because of the windshield's varying curvature. To address these drawbacks, several night vision systems have been proposed that use a magnifying optical element mounted to the dashboard as a display device for the driver. Because of vision and aesthetic considerations, there is a continuing demand to reduce the size of the display device. Typical displays provide excess information, which may confuse the driver. For example, the excess information may distort the depth perception of the driver, particularly when the image displayed for the driver has been minified.
  • SUMMARY OF THE INVENTION
  • The present invention provides a method and system for displaying an image that substantially eliminates or reduces at least some of the disadvantages and problems associated with previous methods and systems.
  • In accordance with a particular embodiment of the present invention, a method for displaying an image includes selecting a camera unit horizontal field of view (FOV) of about eighteen degrees and selecting a system magnification of between 0.4 and 1.0. The method also includes determining an aspect ratio for the image based on the selected camera unit horizontal FOV and the selected system magnification, receiving energy from a scene for forming the image and displaying the image on a display.
  • The selected horizontal FOV may comprise eighteen degrees, and the system magnification may comprise approximately 0.55. The determined aspect ratio for the image may comprise approximately 10:3.3, or about 3:1. The method may also include converting the energy received into information representative of the received energy and forming the image using the information representative of the received energy. Displaying the image on a display may comprise projecting the image onto a fold mirror and reflecting the visible image to an imaging mirror using the fold mirror.
  • In accordance with another embodiment, a system for displaying an image includes a camera unit having a horizontal FOV selected to be about eighteen degrees and a system magnification selected to be between 0.4 and 1.0. The system includes a display coupled to the camera unit. The display is operable to display the image. The image has an aspect ratio determined based on the selected camera unit horizontal FOV and the selected system magnification.
  • The system may further include a lens system operable to direct energy from a scene toward a detector and a display unit comprising the display. The display unit may be coupled to the detector and may be operable to form the image using information received from the detector. The detector may include an array of detector elements, each detector element operable to receive energy from a portion of the scene and to convert the received energy into information representative of the received energy and to send the information associated with at least some of the detector elements to the display unit. The display unit may comprise a liquid crystal display (LCD) operable to project the image onto a fold mirror. The fold mirror may be configured to reflect the visible image to an imaging mirror.
  • Technical advantages of particular embodiments of the present invention include an auxiliary vision system having a camera unit with a horizontal FOV of about eighteen degrees, a selected system magnification of approximately 0.4 to 1.0 and an aspect ratio determined based on the system magnification and the horizontal FOV of the camera unit. Such a system is particularly suited to present an auxiliary image that better enables a driver to properly perceive depth in the image. Furthermore, the horizontal FOV of the camera unit of about eighteen degrees presents a beneficial amount of horizontal information to the driver to effectively see potential hazards in the roadway in front of the vehicle, especially in combination with a system magnification selected between 0.4 and 1.0 and a displayed image aspect ratio based on such camera unit horizontal FOV and selected system magnification. Moreover, this horizontal FOV of the camera unit coupled with a selected magnification of between 0.4 and 1.0 can more effectively be utilized and packaged in an auxiliary vehicle system.
  • Other technical advantages will be readily apparent to one skilled in the art from the following figures, descriptions and claims. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some or none of the enumerated advantages.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a more complete understanding of particular embodiments of the invention and their advantages, reference is now made to the following descriptions, taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a diagrammatic view of a vehicle that includes one embodiment of an auxiliary vision system in accordance with the present invention;
  • FIG. 2 is a diagrammatic view of the auxiliary vision system of FIG. 1, showing in more detail the internal structure of a camera unit and a display unit of the auxiliary vision system;
  • FIG. 3 is a diagrammatic view of a camera unit coupled to a display unit in accordance with an embodiment of the present invention;
  • FIG. 4 is a graph illustrating the effect on depth perception of displaying information that is proximate to a camera, in accordance with an embodiment of the present invention; and
  • FIG. 5 is a flowchart illustrating a method for displaying an image, in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 1 is a diagrammatic view of a vehicle 10 incorporating one embodiment of an auxiliary vision system 20 in accordance with an embodiment of the present invention. The auxiliary vision system 20 includes a camera unit 30, which in the illustrated embodiment is mounted at the front of vehicle 10, in the middle of a front grill 12. The camera unit 30 is electrically coupled at 39 to a display unit 40, which is also a part of the auxiliary vision system 20. The display unit 40 is of a type that is commonly known as a head-up display (HUD). The display unit 40 is mounted within a recess of a dashboard 14 of the vehicle 10, and can project a visible image for reflection by a fold mirror of display unit 40 onto a display 17 for viewing by the driver. Display 17 is recessed within dashboard 14 when auxiliary vision system 20 is not in use.
  • The camera unit 30 is also electrically coupled to a computer 60 at 69. The computer 60 is also part of the auxiliary vision system 20 and provides instructions to camera unit 30 based on heading information it receives from an angle encoder 70, which is coupled to a steering column 16 of vehicle 10 and electrically coupled to computer 60 at 79, and/or an inclinometer 80, which is coupled to the frame of vehicle 10 and electrically coupled to computer 60 at 89. Angle encoder 70 and inclinometer 80, which are two types of sensors, are also a part of auxiliary vision system 20. In general, any type of sensor that can provide information regarding the heading of vehicle 10, such as, for example, steering rate, inclination rate, and/or orientation, may be used in auxiliary vision system 20. Additionally, one, two, or even several sensors may be used in different embodiments. Particular embodiments may not include an angle encoder or an inclinometer. The auxiliary vision system 20 of FIG. 1 is discussed in more detail later.
  • When a driver is operating a vehicle at night, the driver's ability to see the road ahead is substantially more limited than would be the case for the same section of road during daylight hours. This is particularly true in a rural area under conditions where there is little moonlight, there are no street lights, and there are no headlights of other vehicles. If an animal such as a deer happens to wander into the road at a location 500 meters ahead of the vehicle, the driver would readily notice and recognize the deer during daylight hours, whereas at night the deer may initially be beyond the effective reach of the illumination from the vehicle's headlights. Moreover, even when the headlights begin to illuminate the deer, the driver may not initially notice the deer, because the deer may be a brownish color that is difficult to distinguish from the surrounding darkness. Consequently, at the point in time when the driver first realizes that there is a deer in the road, the vehicle will be far closer to the deer in a nighttime situation than would be the case during daylight hours. There are many other similar high-risk situations, for example, where a pedestrian is walking along the road.
  • One purpose of auxiliary vision system 20 of FIG. 1 is to provide the driver of the vehicle 10 with information above and beyond that which the driver can discern at night with the naked eye. In this regard, the camera unit 30 can detect infrared information at a distance well beyond the effective reach of the headlights of the vehicle 10. In the case of a life form such as an animal or a human, the heat signature of the life form, when presented in an infrared image derived from the camera unit 30, will usually have a significant contrast in comparison to the relatively hotter or cooler surrounding natural environment. As discussed above, this is not necessarily the case in a comparable nighttime image based on visible light.
  • Thus, in addition to the image that is directly observed by the driver through the windshield of the vehicle based on headlight illumination and any other available light, the auxiliary vision system 20 provides a separate and auxiliary image, based on infrared radiation, that is reflected onto display 17. This auxiliary image can provide a detectable representation of lifeforms or objects ahead that are not yet visible to the naked eye. Further, the auxiliary image can provide a much more striking contrast than a visible image between the lifeforms or objects and the surrounding scene. Note that the auxiliary vision system 20 may also be useful during daylight hours to supplement the view of objects seen with natural light.
  • Camera unit 30 has particular horizontal and vertical fields of view through which it detects an image. At least a portion of this image is ultimately displayed as the auxiliary image to the driver using display 17. This auxiliary image may include substantially all of the horizontal portion of the image detected by the camera unit 30. However, a vertical portion of the image detected by the camera unit 30 may not be displayed to the driver in the auxiliary image on display 17 so that the driver is better able to properly perceive depth in the auxiliary image displayed.
  • FIG. 2 is a diagrammatic view of the auxiliary vision system 20 of FIG. 1, showing in greater detail the internal structure of both the camera unit 30 and the display unit 40, in accordance with an embodiment of the present invention. More specifically, thermal radiation from a scene 50 enters the camera unit 30 and passes through a lens system 32 and a chopper 34 to a detector 36. The lens system 32 directs the incoming radiation onto an image plane of the detector 36.
  • In the disclosed embodiment, the chopper 34 is a rotating disk of a known type. As the chopper 34 is rotated, it modulates the incoming infrared radiation to the detector 36.
  • Also in the disclosed embodiment, the detector 36 is a commercially available focal plane array or staring array detector, which has a two-dimensional matrix of detector elements, where each detector element produces a respective pixel of a resulting image. In particular, detector 36 is an uncooled pyroelectric barium strontium titanate (BST) detector, although numerous other types of detectors would also be useful in auxiliary vision system 20. Other such types may include vanadium oxide, thin-film ferroelectric or alpha-silicon bolometers.
  • The circuitry 38 is provided to control the detector 36 and read out the images that it detects, and also to synchronize the chopper 34 to operation of the detector 36. Further, based on information from computer 60, the circuitry 38 sends the information obtained from detector 36 through the electrical coupling 39 to the circuitry 42 within the display unit 40.
  • The circuitry 42 controls a liquid crystal display (LCD) 44, which in the disclosed embodiment has a two-dimensional array of pixel elements. In this embodiment, the display unit 40 displays an image having a horizontal to vertical aspect ratio of 10:3.3 or 3:1. The circuitry 42 takes successive images obtained from the detector 36 through circuitry 38, and presents these on the LCD 44. The LCD 44 may include backlighting that makes the auxiliary image on LCD 44 visible at night.
  • This auxiliary image is projected onto a fold mirror 48 that reflects the image so as to be directed onto display 17, creating a virtual image for the driver. In the illustrated embodiment, display 17 comprises an imaging mirror. Although fold mirror 48 and display 17 are shown diagrammatically in FIG. 2 as planar components, each may have a relatively complex curvature that is known in the art. The curvature may also provide some optical power. Display 17 is movably supported, and its position at any given time is determined by a drive mechanism 46. Using the drive mechanism 46, the driver may adjust the display 17 so that it is in a viewing position comfortable for that particular driver. Once the driver has finished adjusting the display 17 to a suitable position, it remains in that position during normal operation of the auxiliary vision system 20.
  • It should be understood that even though in the illustrated embodiment display 17 comprises an imaging mirror, in other embodiments the auxiliary image may be displayed directly for view by the driver, without reflection off a mirror or other component. For example, in some embodiments a driver may view the image directly on an LCD, cathode ray tube (CRT) display or other type of direct view display.
  • FIG. 3 illustrates a camera unit 150 for use with the auxiliary vision system 20 and employing an infrared sensor and an optical system 152 that focuses equally-spaced points 154 along a roadway 156 onto a focal plane array 158. The focal plane array 158 may be an array of uncooled infrared detectors from which an image is ultimately supplied via suitable cables 160 to a display 162.
  • Optical system 152 is a wide-angle optical system, i.e., one that images both near-field and far-field points in front of the vehicle. In such a system, equally-spaced points in object space are nonlinearly distributed on the focal plane array 158. This nonlinear distribution of optical rays may cause the driver to misjudge distance. More specifically, the points 154 that are closer to the vehicle are distributed more nonlinearly on the focal plane array 158. Thus, in accordance with the present invention, a driver's depth perception is improved by reducing the amount of near-field information displayed in the auxiliary image. The amount of such information displayed is determined by the amount of vertical information presented to the driver in the auxiliary image; reducing the vertical information displayed therefore reduces the displayed information closer to the vehicle and improves the driver's depth perception. In particular embodiments, the pointing angle of the camera unit 30 may be elevated to further reduce the amount of near-field information displayed to the driver.
  • For further explanation, FIG. 4 is a graph 180 illustrating three curves representing the relationship between the distance (in meters) of various points from a camera, plotted on the x-axis, and the tangent of the angle formed by the horizontal and a line from the camera to each point (for example, angles 164 of FIG. 3), plotted on the y-axis. Each curve represents a relationship with the camera at a particular height. Curve 182 represents a relationship when the camera is at a height of 0.5 meters, curve 184 represents a relationship when the camera is at a height of 1.0 meter and curve 186 represents a relationship when the camera is at a height of 2.0 meters.
  • From graph 180, one can observe that as the distance from the camera increases, each curve becomes more linear. However, when the distance from the camera is closer to zero (especially, for example, when the distance is less than approximately sixty meters), each curve is non-linear. This is indicative of the distortion in depth perception that can occur when viewing on a display an object that is relatively closer to the vehicle.
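  • For illustration, the shape of the curves in graph 180 can be reproduced numerically under a simple flat-road assumption (the h/d relation below is inferred from the geometry rather than stated in the text): for a camera mounted at height h above the road, the tangent of the angle between the horizontal and the line of sight to a road point at distance d is h/d. A minimal Python sketch:

```python
def tan_depression_angle(camera_height_m: float, distance_m: float) -> float:
    """Tangent of the angle between the horizontal and the camera-to-point ray,
    assuming a flat road and a point lying on the road surface."""
    return camera_height_m / distance_m

if __name__ == "__main__":
    distances = [10, 30, 60, 100, 300, 500]      # meters ahead of the camera
    for h in (0.5, 1.0, 2.0):                    # camera heights of curves 182, 184 and 186
        row = ", ".join(f"{d} m: {tan_depression_angle(h, d):.4f}" for d in distances)
        print(f"camera height {h} m -> {row}")
    # The tangent changes rapidly below roughly sixty meters and flattens at long
    # range, matching the non-linearity the text associates with distorted depth
    # perception for near-field information.
```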
  • The importance of the overall system magnification between an object in the real world and an object as viewed on the display should also be noted. The system magnification may be computed as follows:
      • system magnification = θD/θO
        where
      • θO = angular subtense of the feature in object space
      • θD = angular subtense of the feature as viewed on the display at the driver's eye position.
  • The angular subtense θD may be computed as follows:
      • θD = 2·tan⁻¹((A/2)/B)
        where
      • A = linear dimension of the displayed feature
      • B = distance from the driver's eye to the display.
  • It should be noted that the angular subtense θD for a head-up display is defined by the field of view of the device, which is defined by the magnifying power of the projection mirror.
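  • As a worked numerical example of the two relations above (the eighteen-degree FOV and the 0.55 magnification come from the text; the 0.75 meter eye-to-display distance is an assumed, illustrative value):

```python
import math

def display_subtense_deg(a_m: float, b_m: float) -> float:
    """theta_D = 2 * atan((A/2) / B), returned in degrees."""
    return math.degrees(2.0 * math.atan((a_m / 2.0) / b_m))

theta_o = 18.0                         # camera horizontal FOV used as the object-space subtense
target_magnification = 0.55            # selected system magnification
theta_d = target_magnification * theta_o              # required display subtense, ~9.9 degrees
b = 0.75                               # assumed driver's-eye-to-display distance, meters
a = 2.0 * b * math.tan(math.radians(theta_d) / 2.0)   # invert theta_D = 2*atan((A/2)/B)

print(f"required horizontal image width: {a * 100:.1f} cm")
print(f"check: system magnification = {display_subtense_deg(a, b) / theta_o:.2f}")
```
  • Under these assumptions, the full eighteen-degree horizontal FOV can be presented on an image only about thirteen centimeters wide, which illustrates why a sub-unity magnification eases packaging of the display in a dashboard.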
  • Given the system magnification relationship, it is noted that a system magnification which is less than 1.0 creates a problem in judging depth in the displayed auxiliary image. If the system magnification were always held to the value 1, the field of view (FOV) of the camera unit 30 would have little effect on the displayed information. This, however, requires a very large display 17 for wide field angles in the camera unit 30. Since a large display is impractical to package in many vehicles, the relationship described above with respect to system magnification becomes very useful in determining the amount of information to provide in the auxiliary image when a wide field angle is desired for the horizontal FOV of the camera unit 30.
  • Given the explanation above, it is noted that a driver's depth perception of an auxiliary image is affected by the amount of information closer to the vehicle that is displayed and the magnification of the image. As stated above, one can reduce the amount of information closer to the vehicle that is displayed by reducing the amount of vertical information. It is desired to display in the auxiliary image substantially all of the information in the horizontal FOV of the camera unit 30. Thus, to change the amount of vertical information displayed, the aspect ratio (the ratio of horizontal to vertical) of the image displayed on the display 17 may be modified. The magnification of an auxiliary vision system 20 may be changed by changing the horizontal FOV of the camera unit 30, the horizontal dimension of the image on display 17 or the distance between the driver's eye and the display 17.
  • Therefore, optimizing an auxiliary image displayed by display 17 is achievable by selecting a horizontal FOV for the camera unit 30, selecting a magnification for auxiliary vision system 20 and determining an aspect ratio for the displayed image based on such selections in order to reduce distortion in a driver's depth perception. In particular embodiments, the horizontal FOV of the camera unit 30 is about eighteen degrees (for example, between fifteen and twenty-one degrees or between fifteen and twenty-five degrees). In such embodiments, the system magnification may range from approximately 0.4 to 1.0. An aspect ratio for the image displayed on display 17 is selected in order to better enable the driver to properly perceive depth in the auxiliary image. In one embodiment, the horizontal FOV of camera unit 30 is eighteen degrees, the system magnification is approximately 0.55 and the aspect ratio of the displayed image is approximately 10:3.3.
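  • As a sketch of how the aspect ratio trades off vertical coverage (an inference from the text rather than a stated formula, using a small-angle approximation): if the displayed image spans the full horizontal FOV, the vertical angular coverage scales with the vertical term of the aspect ratio.

```python
def displayed_vertical_fov_deg(horizontal_fov_deg: float,
                               aspect_h: float, aspect_v: float) -> float:
    """Approximate vertical angular coverage for a given horizontal FOV and
    horizontal:vertical aspect ratio (small-angle approximation)."""
    return horizontal_fov_deg * (aspect_v / aspect_h)

for aspect_h, aspect_v in ((10.0, 3.3), (3.0, 1.0), (4.0, 3.0)):
    v = displayed_vertical_fov_deg(18.0, aspect_h, aspect_v)
    print(f"aspect {aspect_h:g}:{aspect_v:g} -> about {v:.1f} degrees of vertical coverage")
# The 10:3.3 (roughly 3:1) ratio yields about six degrees of vertical coverage,
# versus about 13.5 degrees for a conventional 4:3 image with the same horizontal
# FOV, which is the mechanism by which displayed near-field information is reduced.
```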
  • It has been determined that an auxiliary vision system 20 having a camera unit 30 with a horizontal FOV of about eighteen degrees, a selected system magnification of approximately 0.4 to 1.0 and an aspect ratio determined based on the system magnification and the horizontal FOV of the camera unit 30 is particularly suited to present an improved auxiliary image that better enables a driver to properly perceive depth in the image. For example, a driver of a vehicle that includes an auxiliary vision system 20 as described herein is better able to judge the size of objects in front of the vehicle that are shown on display 17. Furthermore, the horizontal FOV of the camera unit 30 of about eighteen degrees presents a beneficial amount of horizontal information to the driver to effectively see potential hazards in the roadway in front of the vehicle, especially in combination with a system magnification selected between 0.4 and 1.0 and a displayed image aspect ratio based on such camera unit horizontal FOV and selected system magnification. Moreover, this horizontal FOV of camera unit 30 coupled with a selected magnification of between 0.4 and 1.0 can more effectively be utilized and packaged in an auxiliary vehicle system 20.
  • A displayed image aspect ratio determined based on a horizontal FOV of about eighteen degrees and a system magnification of between 0.4 and 1.0 also optimally minimizes the number of eye fixations required to view the displayed image. The number of eye fixations required to assimilate information from a display is directly proportional to angular area. Thus, minimizing the number of eye fixations is desirable for safety and efficiency in a display for aircraft, automobiles, trucks, recreational vehicles, or any other form of moving vehicle. A displayed image having an aspect ratio determined as discussed above minimizes the number of eye fixations by minimizing the amount of displayed information for the viewer to observe.
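  • To make the angular-area point concrete (an illustrative calculation: the display subtenses are derived from the eighteen-degree FOV, 0.55 magnification and 10:3.3 aspect ratio rather than taken directly from the text):

```python
def angular_area_deg2(horizontal_fov_deg: float, magnification: float,
                      aspect_h: float, aspect_v: float) -> float:
    """Approximate angular footprint of the displayed image, in square degrees."""
    display_h = horizontal_fov_deg * magnification      # horizontal subtense at the eye
    display_v = display_h * (aspect_v / aspect_h)       # vertical subtense from the aspect ratio
    return display_h * display_v

selected = angular_area_deg2(18.0, 0.55, 10.0, 3.3)     # the design point discussed in the text
conventional = angular_area_deg2(18.0, 0.55, 4.0, 3.0)  # same magnification, 4:3 image
print(f"10:3.3 display: ~{selected:.0f} square degrees")
print(f"4:3 display:    ~{conventional:.0f} square degrees")
# With eye fixations taken as proportional to angular area, the wider aspect
# ratio cuts the fixation load by more than half for the same horizontal coverage.
```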
  • FIG. 5 is a flowchart illustrating a method for displaying an image, in accordance with an embodiment of the present invention. The method begins at step 200, where a camera unit 30 horizontal FOV of about eighteen degrees is selected. In particular embodiments, the selected camera unit 30 horizontal FOV may be between approximately fifteen and twenty-five degrees. At step 202, a system magnification of between 0.4 and 1.0 is selected. At step 204, an aspect ratio for the image is determined based on the selected camera unit 30 horizontal FOV and the selected system magnification, such that an observer properly and effectively perceives depth in the displayed image.
  • The method continues at step 206 where energy from a scene 50 is received at each of a plurality of detector elements. At step 208, the energy received at each detector element is converted into information representative of the energy received at step 206. At step 210, an image is formed using the information representative of the received energy. At step 212, the image is displayed by projection by an LCD 44 onto a fold mirror 48 for reflection onto an imaging mirror 17 for view by the driver of a vehicle. Through the image, the driver may detect lifeforms or objects ahead that are not yet visible to the naked eye.
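  • The flow of FIG. 5 can be summarized in a short end-to-end sketch (function and data-structure names are illustrative and not taken from the patent; the aspect-ratio mapping is a placeholder, since the text gives only the eighteen-degree / 0.55 / 10:3.3 design point):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DisplayConfig:
    horizontal_fov_deg: float   # step 200: about eighteen degrees
    magnification: float        # step 202: between 0.4 and 1.0
    aspect_ratio: float         # step 204: horizontal / vertical, e.g. 10 / 3.3

def determine_aspect_ratio(horizontal_fov_deg: float, magnification: float) -> float:
    # Placeholder: the patent states the ratio is based on the FOV and the
    # magnification but only discloses the single design point used here.
    return 10.0 / 3.3

def frame_for_display(config: DisplayConfig,
                      detector_rows: List[List[float]]) -> List[List[float]]:
    # Steps 206-212, abstracted: each detector element's reading becomes a pixel,
    # the frame is cropped vertically to the determined aspect ratio (keeping the
    # upper, far-field rows and dropping near-field rows at the bottom), and the
    # result is handed to the LCD / fold-mirror / imaging-mirror chain.
    height, width = len(detector_rows), len(detector_rows[0])
    kept_rows = min(height, max(1, round(width / config.aspect_ratio)))
    return [row[:] for row in detector_rows[:kept_rows]]

config = DisplayConfig(18.0, 0.55, determine_aspect_ratio(18.0, 0.55))
readout = [[float(r + c) for c in range(320)] for r in range(240)]   # stand-in 320x240 detector
print(len(frame_for_display(config, readout)), "of", len(readout), "detector rows displayed")
```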
  • Although the present invention has been described in detail, various changes and modifications may be suggested to one skilled in the art. It is intended that the present invention encompass such changes and modifications as falling within the scope of the appended claims.

Claims (19)

1. A method for displaying an image on a display, comprising:
selecting a camera unit horizontal field of view (FOV) of about eighteen degrees;
selecting a system magnification of less than 1.0;
determining an aspect ratio for the image based on the selected camera unit horizontal FOV and the selected system magnification;
receiving energy from a scene for forming the image; and
displaying the image on a display of an automobile.
2. The method of claim 1, wherein selecting a camera unit horizontal field of view (FOV) of about eighteen degrees comprises selecting a camera unit horizontal FOV of between fifteen and twenty-one degrees.
3. The method of claim 1, wherein selecting a system magnification of less than 1.0 comprises selecting a system magnification greater than or equal to 0.4.
4. The method of claim 1, wherein:
selecting a camera unit horizontal field of view (FOV) of about eighteen degrees comprises selecting a camera unit horizontal FOV of eighteen degrees;
selecting a system magnification of less than 1.0 comprises selecting a system magnification of approximately 0.55; and
determining an aspect ratio for the image based on the selected camera unit horizontal FOV and the selected system magnification comprises determining an aspect ratio for the image of approximately 10:3.3.
5. The method of claim 1, wherein displaying the image on a display comprises reflecting the image onto an imaging mirror.
6. The method of claim 1, wherein displaying the image on a display comprises displaying the image on a liquid crystal display (LCD).
7. A method for displaying an image on a display, comprising:
selecting a camera unit horizontal field of view (FOV) of about eighteen degrees;
selecting a system magnification of less than 1.0;
determining an aspect ratio for the image based on the selected camera unit horizontal FOV and the selected system magnification;
receiving energy from a scene for forming the image at each of a plurality of detector elements;
converting the energy received at each detector element into information representative of the received energy;
forming the image using the information representative of the received energy; and
displaying the image on a display.
8. The method of claim 7, wherein displaying the image on a display comprises projecting the image onto a fold mirror and reflecting the visible image to an imaging mirror using the fold mirror.
9. A system for displaying an image, comprising:
a camera unit having a horizontal field of view (FOV) selected to be about eighteen degrees;
a system magnification selected to be less than 1.0; and
a display coupled to the camera unit, the display operable to display the image, the image having an aspect ratio determined based on the selected camera unit horizontal FOV and the selected system magnification.
10. The system of claim 9, wherein a camera unit having a horizontal field of view (FOV) selected to be about eighteen degrees comprises a camera unit having a horizontal FOV selected to be between fifteen and twenty-one degrees.
11. The system of claim 9, wherein a system magnification selected to be less than 1.0 comprises a system magnification selected to be greater than or equal to 0.4.
12. The system of claim 9, wherein:
a camera unit having a horizontal field of view (FOV) selected to be about eighteen degrees comprises a camera unit having a horizontal FOV selected to be eighteen degrees;
a system magnification selected to be less than 1.0 comprises a system magnification selected to be approximately 0.55; and
the image having an aspect ratio determined based on the selected camera unit horizontal FOV and the selected system magnification comprises the image having an aspect ratio of approximately 10:3.3.
13. The system of claim 9, wherein the display comprises an imaging mirror, and further comprising a fold mirror operable to reflect the image onto the imaging mirror.
14. The system of claim 9, wherein the display comprises a liquid crystal display (LCD).
15. A system for displaying an image, comprising:
a camera unit having a horizontal field of view (FOV) selected to be about eighteen degrees;
a system magnification selected to be less than 1.0;
a display coupled to the camera unit, the display operable to display the image, the image having an aspect ratio determined based on the selected camera unit horizontal FOV and the selected system magnification;
a lens system operable to direct energy from a scene toward a detector;
a display unit comprising the display, the display unit coupled to the detector, the display unit operable to form the image using information received from the detector; and
wherein the detector includes an array of detector elements, each detector element operable to receive energy from a portion of the scene and to convert the received energy into information representative of the received energy and to send the information associated with at least some of the detector elements to the display unit.
16. The system of claim 15, wherein the display unit comprises a liquid crystal display (LCD) operable to project the image onto a fold mirror, the fold mirror configured to reflect the visible image to an imaging mirror.
17. The system of claim 15, wherein the detector comprises a vanadium oxide bolometer.
18. The system of claim 15, wherein the detector comprises a thin-film ferroelectric bolometer.
19. The system of claim 15, wherein the detector comprises an alpha-silicon bolometer.
US10/980,513 2002-06-05 2004-11-02 Method and system for displaying an image Abandoned US20050061976A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/980,513 US20050061976A1 (en) 2002-06-05 2004-11-02 Method and system for displaying an image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/163,343 US6815680B2 (en) 2002-06-05 2002-06-05 Method and system for displaying an image
US10/980,513 US20050061976A1 (en) 2002-06-05 2004-11-02 Method and system for displaying an image

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/163,343 Continuation US6815680B2 (en) 2002-06-05 2002-06-05 Method and system for displaying an image

Publications (1)

Publication Number Publication Date
US20050061976A1 true US20050061976A1 (en) 2005-03-24

Family

ID=29709952

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/163,343 Expired - Fee Related US6815680B2 (en) 2002-06-05 2002-06-05 Method and system for displaying an image
US10/980,513 Abandoned US20050061976A1 (en) 2002-06-05 2004-11-02 Method and system for displaying an image

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/163,343 Expired - Fee Related US6815680B2 (en) 2002-06-05 2002-06-05 Method and system for displaying an image

Country Status (5)

Country Link
US (2) US6815680B2 (en)
EP (1) EP1510074A1 (en)
JP (1) JP2005534549A (en)
AU (1) AU2003232465A1 (en)
WO (1) WO2003105481A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080150709A1 (en) * 2004-02-20 2008-06-26 Sharp Kabushiki Kaisha Onboard Display Device, Onboard Display System and Vehicle

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003295318A1 (en) * 2002-06-14 2004-04-19 Honda Giken Kogyo Kabushiki Kaisha Pedestrian detection and tracking with night vision
FR2857463B1 (en) * 2003-07-11 2005-09-02 Valeo Vision INFRARED NIGHT VISION SYSTEM IN COLOR.
US7366325B2 (en) * 2003-10-09 2008-04-29 Honda Motor Co., Ltd. Moving object detection using low illumination depth capable computer vision
US6967569B2 (en) * 2003-10-27 2005-11-22 Ford Global Technologies Llc Active night vision with adaptive imaging
US20060103590A1 (en) * 2004-10-21 2006-05-18 Avner Divon Augmented display system and methods
US7786898B2 (en) * 2006-05-31 2010-08-31 Mobileye Technologies Ltd. Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications
US8194920B2 (en) * 2007-02-16 2012-06-05 Ford Global Technologies, Llc Method and system for detecting objects using far infrared images
US20180061008A1 (en) * 2016-08-31 2018-03-01 Autoliv Asp, Inc. Imaging system and method
US20200353865A1 (en) * 2017-07-06 2020-11-12 Mazda Motor Corporation Passenger imaging device
US11164404B2 (en) * 2018-03-02 2021-11-02 Ford Global Technologies, Llc Methods and systems for diagnosing an active grille shutter system

Citations (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2716193A (en) * 1951-06-08 1955-08-23 Siemens Ag Albis Poor-visibility scope for road vehicles
US3508723A (en) * 1967-12-26 1970-04-28 Nasa Method and apparatus for securing to a spacecraft
US3803407A (en) * 1972-08-18 1974-04-09 Us Army Night viewing pocket scope
US3887273A (en) * 1973-07-27 1975-06-03 Friedemann Conrad J Speedometer optical projection system
US4052123A (en) * 1971-11-29 1977-10-04 Hitachi, Ltd. Correcting lenses utilized in the manufacture of fluorescent screen of color picture tubes
US4131818A (en) * 1967-10-12 1978-12-26 Varian Associates, Inc. Night vision system
US4527861A (en) * 1983-11-23 1985-07-09 General Motors Corporation Antiglare rear view mirror
US4588150A (en) * 1982-04-23 1986-05-13 Erno-Raumfahrttechnik Gmbh Docking device for space vehicle
US4664475A (en) * 1985-08-14 1987-05-12 Hughes Aircraft Company Combiner mounting and stowage mechanism
US4664344A (en) * 1985-11-07 1987-05-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus and method of capturing an orbiting spacecraft
US4740780A (en) * 1985-06-24 1988-04-26 Gec Avionics, Inc. Head-up display for automobile
US4868652A (en) * 1980-08-19 1989-09-19 Elliott Brothers (London) Limited Head of display arrangements
US4919517A (en) * 1987-10-16 1990-04-24 Bayerische Motoren Werke Aktiengesellschaft Image-reproducing arrangement for motor vehicles including lens means forming a real image adjacent an upper edge of a windshield to which an instrument panel mirror reflects to form a virtual image
US4934771A (en) * 1985-05-11 1990-06-19 Pilkington P.E. Limited Optical apparatus
US4961625A (en) * 1987-09-18 1990-10-09 Flight Dynamics, Inc. Automobile head-up display system with reflective aspheric surface
US4970653A (en) * 1989-04-06 1990-11-13 General Motors Corporation Vision method of detecting lane boundaries and obstacles
US5001558A (en) * 1985-06-11 1991-03-19 General Motors Corporation Night vision system with color video camera
US5005786A (en) * 1989-03-01 1991-04-09 National Aerospace Laboratory Of Science & Technology Agency Docking and active damping device for space structures
US5013135A (en) * 1989-07-10 1991-05-07 Matsushita Electric Industrial Co., Ltd. Head-up display with two fresnel lenses
US5023451A (en) * 1990-02-20 1991-06-11 General Motors Corporation Indicia contrast enhancement for vehicle borne infrared vision system
US5028119A (en) * 1989-04-07 1991-07-02 Hughes Aircraft Company Aircraft head-up display
US5056890A (en) * 1987-04-16 1991-10-15 Yazaki Corporation Displaying apparatus for a vehicle having a projector on the ceiling of the vehicle
US5237455A (en) * 1991-12-06 1993-08-17 Delco Electronics Corporation Optical combiner with integral support arm
US5289312A (en) * 1991-09-28 1994-02-22 Nikon Corporation Catadioptric reduction projection optical system
US5289315A (en) * 1991-05-29 1994-02-22 Central Glass Company, Limited Head-up display system including a uniformly reflecting layer and a selectively reflecting layer
US5299062A (en) * 1990-05-11 1994-03-29 Omron Corporation Optical lens
US5361165A (en) * 1992-01-03 1994-11-01 General Motors Corporation Reflective cluster display with stowable viewing screen
US5364046A (en) * 1992-02-24 1994-11-15 Environmental Research Institute Of Michigan Automatic compliant capture and docking mechanism for spacecraft
US5414439A (en) * 1994-06-09 1995-05-09 Delco Electronics Corporation Head up display with night vision enhancement
US5497271A (en) * 1993-09-07 1996-03-05 Jaguar Cars Limited Head up displays for motor vehicles
US5504622A (en) * 1993-03-18 1996-04-02 Kansei Corporation Apparatus for head up display comprising a parabolic reflective mirror
US5506595A (en) * 1985-02-18 1996-04-09 Nissan Motor Co., Ltd. Vehicular display system forming display image on front windshield
US5511748A (en) * 1993-11-12 1996-04-30 Scott; David R. Method for extending the useful life of a space satellite
US5657163A (en) * 1995-05-31 1997-08-12 Delco Electronics Corporation Fiber optic illumination of HUD image source
US5686957A (en) * 1994-07-27 1997-11-11 International Business Machines Corporation Teleconferencing imaging system with automatic camera steering
US5729016A (en) * 1994-04-12 1998-03-17 Hughes Aircraft Company Low cost night vision system for nonmilitary surface vehicles
US5731903A (en) * 1992-11-05 1998-03-24 Hughes Electronics Virtual image instrument panel display
US5734357A (en) * 1994-12-02 1998-03-31 Fujitsu Limited Vehicular display device adjusting to driver's positions
US5735488A (en) * 1996-05-28 1998-04-07 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for coupling space vehicles
US5739848A (en) * 1993-09-08 1998-04-14 Sumitomo Electric Industries, Ltd. Object recognition apparatus and method
US5748377A (en) * 1995-10-26 1998-05-05 Fujitsu Limited Headup display
US5781243A (en) * 1995-05-08 1998-07-14 Hughes Electronics Display optimization for night vision enhancement systems
US5805119A (en) * 1992-10-13 1998-09-08 General Motors Corporation Vehicle projected display using deformable mirror device
US5803407A (en) * 1993-11-12 1998-09-08 Scott; David R. Apparatus and methods for in-space satellite operations
US5806802A (en) * 1993-11-12 1998-09-15 Scott; David D. Apparatus and methods for in-space satellite operations
US5859714A (en) * 1993-11-16 1999-01-12 Asahi Glass Company, Ltd. Head-up display, a combiner used for the head-up display and a method of designing the head-up display
US5864432A (en) * 1995-10-27 1999-01-26 Ldt Gmbh Co. Laser-Display-Technologie Kg Device for showing a first image in a second image which is visible through a transparent sheet
US5867133A (en) * 1996-12-13 1999-02-02 Ut Automotive Dearborn, Inc. Dual use display
US5973827A (en) * 1997-03-27 1999-10-26 Raytheon Company Refractive/diffractive infrared imager and optics
US6014259A (en) * 1995-06-07 2000-01-11 Wohlstadter; Jacob N. Three dimensional imaging system
US6017000A (en) * 1998-08-02 2000-01-25 Scott; David R. Apparatus and methods for in-space satellite operations
US6072444A (en) * 1998-02-02 2000-06-06 The United States Of America As Represented By The Secretary Of The Air Force Adaptable hud mount
US6100943A (en) * 1996-07-09 2000-08-08 Harness System Technologies Research, Inc. Vehicular display device for directly and indirectly displaying information
US6262848B1 (en) * 1999-04-29 2001-07-17 Raytheon Company Head-up display
US6299107B1 (en) * 1998-12-04 2001-10-09 Honeybee Robotics, Ltd. Spacecraft capture and docking system
US20020005999A1 (en) * 2000-07-06 2002-01-17 Hutzel Barry W. Rearview mirror assembly with information display
US6359737B1 (en) * 2000-07-28 2002-03-19 General Motors Corporation Combined head-up display
US6392812B1 (en) * 1999-09-29 2002-05-21 Bae Systems Electronics Limited Head up displays
US20020063778A1 (en) * 2000-10-13 2002-05-30 Kormos Alexander L. System and method for forming images for display in a vehicle
US6731435B1 (en) * 2001-08-15 2004-05-04 Raytheon Company Method and apparatus for displaying information with a head-up display
US6789901B1 (en) * 2002-01-04 2004-09-14 Raytheon Company System and method for providing images for an operator of a vehicle
US6808274B2 (en) * 2002-06-04 2004-10-26 Raytheon Company Method and system for deploying a mirror assembly from a recessed position

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US502345A (en) * 1893-08-01 Court-house
GB1131293A (en) 1966-09-16 1968-10-23 Sperry Rand Ltd Optical sighting apparatus
DE2633067C2 (en) 1976-07-22 1986-07-17 Bayerische Motoren Werke Ag, 8000 Muenchen Device for the visual display of a variable safety distance of a vehicle
GB1584573A (en) 1976-11-29 1981-02-11 Sfena System for projecting images
EP0178373B1 (en) * 1984-10-12 1989-11-15 Siemens Aktiengesellschaft Protective coating for at least one optical fibre
GB2179716B (en) 1985-08-27 1989-11-01 Gec Avionics Head-up displays
EP0393098B1 (en) 1987-09-18 1996-07-10 Hughes Flight Dynamics, Inc. Automobile head-up display system
GB2213951B (en) 1987-12-17 1991-09-18 Gec Avionics Head-up displays
JP3141081B2 (en) 1990-08-10 2001-03-05 矢崎総業株式会社 Display device for vehicles
IT1245555B (en) 1991-05-23 1994-09-29 Fiat Auto Spa VIRTUAL IMAGE VIEWER, IN PARTICULAR FOR THE RETURN OF IMAGES ON BOARD VEHICLES
FR2693807B1 (en) 1992-07-17 1994-09-16 Renault Device for displaying information on the dashboard of a motor vehicle.
GB9420758D0 (en) 1994-10-14 1994-11-30 Pilkington Plc Projection unit for automotive head up display
JPH09185012A (en) 1995-12-27 1997-07-15 Honda Motor Co Ltd Headup display for vehicle
US5912652A (en) 1996-07-10 1999-06-15 Jong-Seng Won Magnetic fluid display panel and method of making the panel
US6034371A (en) 1996-12-20 2000-03-07 Raytheon Company Semi-opaque chopper for thermal imaging system and method
JP3040356B2 (en) 1997-01-27 2000-05-15 三菱電機株式会社 Infrared solid-state imaging device
WO1999033684A2 (en) 1997-12-31 1999-07-08 Gentex Corporation Vehicle vision system
JP4096445B2 (en) 1999-03-31 2008-06-04 アイシン精機株式会社 Parking assistance device
US6900440B2 (en) 2000-02-24 2005-05-31 University Of Virginia Patent Foundation High sensitivity infrared sensing apparatus and related method thereof

Patent Citations (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2716193A (en) * 1951-06-08 1955-08-23 Siemens Ag Albis Poor-visibility scope for road vehicles
US4131818A (en) * 1967-10-12 1978-12-26 Varian Associates, Inc. Night vision system
US3508723A (en) * 1967-12-26 1970-04-28 Nasa Method and apparatus for securing to a spacecraft
US4052123A (en) * 1971-11-29 1977-10-04 Hitachi, Ltd. Correcting lenses utilized in the manufacture of fluorescent screen of color picture tubes
US3803407A (en) * 1972-08-18 1974-04-09 Us Army Night viewing pocket scope
US3887273A (en) * 1973-07-27 1975-06-03 Friedemann Conrad J Speedometer optical projection system
US4868652A (en) * 1980-08-19 1989-09-19 Elliott Brothers (London) Limited Head of display arrangements
US4588150A (en) * 1982-04-23 1986-05-13 Erno-Raumfahrttechnik Gmbh Docking device for space vehicle
US4527861A (en) * 1983-11-23 1985-07-09 General Motors Corporation Antiglare rear view mirror
US5506595A (en) * 1985-02-18 1996-04-09 Nissan Motor Co., Ltd. Vehicular display system forming display image on front windshield
US4934771A (en) * 1985-05-11 1990-06-19 Pilkington P.E. Limited Optical apparatus
US5001558A (en) * 1985-06-11 1991-03-19 General Motors Corporation Night vision system with color video camera
US4740780A (en) * 1985-06-24 1988-04-26 Gec Avionics, Inc. Head-up display for automobile
US4664475A (en) * 1985-08-14 1987-05-12 Hughes Aircraft Company Combiner mounting and stowage mechanism
US4664344A (en) * 1985-11-07 1987-05-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus and method of capturing an orbiting spacecraft
US5056890A (en) * 1987-04-16 1991-10-15 Yazaki Corporation Displaying apparatus for a vehicle having a projector on the ceiling of the vehicle
US4961625A (en) * 1987-09-18 1990-10-09 Flight Dynamics, Inc. Automobile head-up display system with reflective aspheric surface
US4919517A (en) * 1987-10-16 1990-04-24 Bayerische Motoren Werke Aktiengesellschaft Image-reproducing arrangement for motor vehicles including lens means forming a real image adjacent an upper edge of a windshield to which an instrument panel mirror reflects to form a virtual image
US5005786A (en) * 1989-03-01 1991-04-09 National Aerospace Laboratory Of Science & Technology Agency Docking and active damping device for space structures
US4970653A (en) * 1989-04-06 1990-11-13 General Motors Corporation Vision method of detecting lane boundaries and obstacles
US5028119A (en) * 1989-04-07 1991-07-02 Hughes Aircraft Company Aircraft head-up display
US5013135A (en) * 1989-07-10 1991-05-07 Matsushita Electric Industrial Co., Ltd. Head-up display with two fresnel lenses
US5023451A (en) * 1990-02-20 1991-06-11 General Motors Corporation Indicia contrast enhancement for vehicle borne infrared vision system
US5299062A (en) * 1990-05-11 1994-03-29 Omron Corporation Optical lens
US5289315A (en) * 1991-05-29 1994-02-22 Central Glass Company, Limited Head-up display system including a uniformly reflecting layer and a selectively reflecting layer
US5289312A (en) * 1991-09-28 1994-02-22 Nikon Corporation Catadioptric reduction projection optical system
US5237455A (en) * 1991-12-06 1993-08-17 Delco Electronics Corporation Optical combiner with integral support arm
US5361165A (en) * 1992-01-03 1994-11-01 General Motors Corporation Reflective cluster display with stowable viewing screen
US5364046A (en) * 1992-02-24 1994-11-15 Environmental Research Institute Of Michigan Automatic compliant capture and docking mechanism for spacecraft
US5805119A (en) * 1992-10-13 1998-09-08 General Motors Corporation Vehicle projected display using deformable mirror device
US5731903A (en) * 1992-11-05 1998-03-24 Hughes Electronics Virtual image instrument panel display
US5504622A (en) * 1993-03-18 1996-04-02 Kansei Corporation Apparatus for head up display comprising a parabolic reflective mirror
US5497271A (en) * 1993-09-07 1996-03-05 Jaguar Cars Limited Head up displays for motor vehicles
US5739848A (en) * 1993-09-08 1998-04-14 Sumitomo Electric Industries, Ltd. Object recognition apparatus and method
US5511748A (en) * 1993-11-12 1996-04-30 Scott; David R. Method for extending the useful life of a space satellite
US6330987B1 (en) * 1993-11-12 2001-12-18 David R. Scott Apparatus and methods for in-space satellite operations
US5803407A (en) * 1993-11-12 1998-09-08 Scott; David R. Apparatus and methods for in-space satellite operations
US5806802A (en) * 1993-11-12 1998-09-15 Scott; David D. Apparatus and methods for in-space satellite operations
US5859714A (en) * 1993-11-16 1999-01-12 Asahi Glass Company, Ltd. Head-up display, a combiner used for the head-up display and a method of designing the head-up display
US5729016A (en) * 1994-04-12 1998-03-17 Hughes Aircraft Company Low cost night vision system for nonmilitary surface vehicles
US5414439A (en) * 1994-06-09 1995-05-09 Delco Electronics Corporation Head up display with night vision enhancement
US5686957A (en) * 1994-07-27 1997-11-11 International Business Machines Corporation Teleconferencing imaging system with automatic camera steering
US6484973B1 (en) * 1994-11-14 2002-11-26 David R. Scott Apparatus and methods for in-space satellite operations
US5734357A (en) * 1994-12-02 1998-03-31 Fujitsu Limited Vehicular display device adjusting to driver's positions
US5781243A (en) * 1995-05-08 1998-07-14 Hughes Electronics Display optimization for night vision enhancement systems
US5657163A (en) * 1995-05-31 1997-08-12 Delco Electronics Corporation Fiber optic illumination of HUD image source
US6014259A (en) * 1995-06-07 2000-01-11 Wohlstadter; Jacob N. Three dimensional imaging system
US5748377A (en) * 1995-10-26 1998-05-05 Fujitsu Limited Headup display
US5864432A (en) * 1995-10-27 1999-01-26 Ldt Gmbh Co. Laser-Display-Technologie Kg Device for showing a first image in a second image which is visible through a transparent sheet
US5735488A (en) * 1996-05-28 1998-04-07 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for coupling space vehicles
US6100943A (en) * 1996-07-09 2000-08-08 Harness System Technologies Research, Inc. Vehicular display device for directly and indirectly displaying information
US5867133A (en) * 1996-12-13 1999-02-02 Ut Automotive Dearborn, Inc. Dual use display
US5973827A (en) * 1997-03-27 1999-10-26 Raytheon Company Refractive/diffractive infrared imager and optics
US6072444A (en) * 1998-02-02 2000-06-06 The United States Of America As Represented By The Secretary Of The Air Force Adaptable hud mount
US6017000A (en) * 1998-08-02 2000-01-25 Scott; David R. Apparatus and methods for in-space satellite operations
US6299107B1 (en) * 1998-12-04 2001-10-09 Honeybee Robotics, Ltd. Spacecraft capture and docking system
US6262848B1 (en) * 1999-04-29 2001-07-17 Raytheon Company Head-up display
US6392812B1 (en) * 1999-09-29 2002-05-21 Bae Systems Electronics Limited Head up displays
US20020005999A1 (en) * 2000-07-06 2002-01-17 Hutzel Barry W. Rearview mirror assembly with information display
US6648477B2 (en) * 2000-07-06 2003-11-18 Donnelly Corporation Rearview mirror assembly with information display
US6359737B1 (en) * 2000-07-28 2002-03-19 General Motors Corporation Combined head-up display
US20020063778A1 (en) * 2000-10-13 2002-05-30 Kormos Alexander L. System and method for forming images for display in a vehicle
US6731435B1 (en) * 2001-08-15 2004-05-04 Raytheon Company Method and apparatus for displaying information with a head-up display
US6789901B1 (en) * 2002-01-04 2004-09-14 Raytheon Company System and method for providing images for an operator of a vehicle
US6808274B2 (en) * 2002-06-04 2004-10-26 Raytheon Company Method and system for deploying a mirror assembly from a recessed position

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080150709A1 (en) * 2004-02-20 2008-06-26 Sharp Kabushiki Kaisha Onboard Display Device, Onboard Display System and Vehicle
US20080204205A1 (en) * 2004-02-20 2008-08-28 Sharp Kabushiki Kaisha Onboard display device, onboard display system and vehicle
US20080252439A1 (en) * 2004-02-20 2008-10-16 Sharp Kabushiki Kaisha Onboard display device, onboard display system and vehicle

Also Published As

Publication number Publication date
US20030226966A1 (en) 2003-12-11
AU2003232465A1 (en) 2003-12-22
WO2003105481A1 (en) 2003-12-18
EP1510074A1 (en) 2005-03-02
JP2005534549A (en) 2005-11-17
US6815680B2 (en) 2004-11-09

Similar Documents

Publication Publication Date Title
US7227515B2 (en) System and method for forming images for display in a vehicle
EP0686865B1 (en) Night vision system for motor vehicles
US9131120B2 (en) Multi-camera vision system for a vehicle
US6498620B2 (en) Vision system for a vehicle including an image capture device and a display system having a long focal length
EP0830267B1 (en) Rearview vision system for vehicle including panoramic view
WO2017072841A1 (en) Information display device
US6789901B1 (en) System and method for providing images for an operator of a vehicle
US20050270784A1 (en) Method and device for visualizing a motor vehicle environment with environment-dependent fusion of an infrared image and a visual image
US6815680B2 (en) Method and system for displaying an image
US6808274B2 (en) Method and system for deploying a mirror assembly from a recessed position
EP1410419B1 (en) Gathering image data using multiple sensors
WO2001081972A2 (en) Method and apparatus for obtaining infrared images in a night vision system
JP2020149063A (en) Head-up display device
EP3771937A1 (en) Systems and methods for displaying image on windscreen
Schwalm et al. Design solutions for thermal imaging devices in military vehicles
WO2003083526A1 (en) A method and a device for co-ordinated displaying of a direct image and an electro-optical image

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: L-3 COMMUNICATIONS CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RAYTHEON COMPANY;REEL/FRAME:017647/0010

Effective date: 20060207